
Most of my links are from indexing services. Why?

Hi, most of the links my GSA SER has built are from indexing services (I do have SER linked with GSA SEO Indexer). This can't be healthy, right? How do I get a more evenly distributed link profile where the other backlink types account for a larger percentage?

Comments

  • Sven www.GSA-Online.de
    These are easy-to-find links. You should not mix them with contextual links anyway. Use them on the last tier only.
  • sickseo London, UK
    The distribution of link sources is based on what you have in your site list.

    If you want more of the other types such as articles, forums, social networks and wiki links (best for T1 and T2), you have to scrape for them, then test them so that they get added to your site list and used on future projects.
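    A minimal sketch of that scrape-then-test step, in Python (the footprints, helper names and example URL below are illustrative assumptions, not GSA SER internals - SER does its own testing when you import targets):

        # Filter scraped candidate URLs down to live targets whose pages match
        # a platform footprint, before importing them into a site list.
        import urllib.request

        FOOTPRINTS = {
            "wiki": "Powered by MediaWiki",   # common real-world footprint
            "forum": "Powered by SMF",
        }

        def fetch(url: str, timeout: float = 10.0) -> str:
            """Download a page body, returning '' on any network error."""
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read(200_000).decode("utf-8", errors="replace")
            except Exception:
                return ""

        def classify(url: str):
            """Return the engine whose footprint appears in the page, else None."""
            html = fetch(url)
            return next((name for name, fp in FOOTPRINTS.items() if fp in html), None)

        candidates = ["https://example.com/wiki/index.php"]  # hypothetical scraped URLs
        verified = {u: e for u in candidates if (e := classify(u))}
        print(verified)  # only URLs that responded and matched a footprint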
  • Is this a sitelist you purchased? 
    Hi, these are all Tier 2 links (backlinks to "natural" backlinks to my money site). And no, this is not a site list but au naturel targets (i.e., ones that GSA SER scrapes itself).

    A question: Should I uncheck the indexer option in GSA SER? Of course, I still use GSA SEO Indexer to index the backlinks built here (but those are Tier 3s, right? GSA SEO Indexer backlinks to GSA SER backlinks to "natural" backlinks to my money site).
  • Sven said:
    These are easy to find links. You should not mix them anyway with contextual links. Use them on the last tier only.
    So you're saying I should untick the "indexer" option on GSA SER's "Where to Submit" menu, correct?
  • Most people turn off the search function in SER and scrape their own site lists (or buy one). Especially for contextual link sources.
  • Most people turn off the search function in SER and scrape their own site lists (or buy one). Especially for contextual link sources. 
    Thank you for your comment.  A couple of questions:

    1)  What are some good vendors for these site lists?  Recommendations and links would be totally appreciated!

    2)  Would using site lists from vendors cause any issues, since everyone will be using the same site lists and therefore the same URLs for backlinks?  (Sorry if I sound ridiculous.  I'm totally new to this.)
  • I have not purchased a site list in a long time. Probably 10+ years. I don't know who's good or which one is worth it, or if any are. There is a buy/sell/trade forum here. Take a look at the vendors that sell there. See who's active and who doesn't have pissed-off customers. I can't and won't recommend a service to you, sorry.

  • sickseo London, UK
    You'll get much better results by scraping/building your own site list. I wouldn't recommend buying a site list - not if you want to get results and move your rankings up. Sure it's the quick and easy way, but it won't produce results.

    If you are serious about getting results then you really should look into setting up a dedicated scraping system for all the major search engines. This is how you tap into the real ranking power of the software. You'll never achieve that through paid site lists.
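    In practice, one common reading of a "dedicated scraping system" is crossing platform footprints with your keyword list to generate search queries, then feeding those to whatever scraper or search API you use. A sketch of that query-generation step (the footprints and keywords are placeholders):

        # Cross engine footprints with keywords to produce search queries.
        from itertools import product

        footprints = [
            '"Powered by MediaWiki"',
            'inurl:"/posting.php?mode=post"',
        ]
        keywords = ["gardening", "fitness", "recipes"]

        queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
        for q in queries:
            print(q)  # e.g. '"Powered by MediaWiki" gardening'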

    - Using lists shared by hundreds of other users has much less value as they are flooded with posts and external links. They are literally leaking link juice.

    - Sites with hundreds of thousands of low quality posts on them make getting new posts indexed that much harder. The site has already been flagged as low quality by Google. Your single high quality post won't change this.

    - Plus the site numbers per engine from list providers are embarrassingly low. I'd be embarrassed to sell those site lists.

    Those new SERNuke engines are the best link sources to use for any tier. I'm using 3 tiers of them and get very good natural indexing on them - no indexing service, just tiers.
  • londonseo London, UK
    sickseo said:
    You'll get much better results by scraping/building your own site list. [...]
    What engines are in each tier, or how did you set up the tiers?
  • sickseo London, UK
    I'm running 3 tiers with articles, forums, social network, wiki and SERNuke engines on all 3 tiers. Pure contextuals and profiles. I've abandoned all the lower-quality link sources such as redirects/indexers and blog comments/guestbooks in favour of engines that are more likely to get indexed and add value.

    As for the tiered structure, lots of options here but currently running with the following structure:


    A single T1 project powered by 5 T2 projects, each of which is powered by an additional T3 project. This is a 1-5-5 structure. You can go bigger for more competitive projects, even a 1-10-100, which I used to run in the past, but you'll probably need 1 server to run something that big continuously for several months.

    I also have the tier option set to build links only to do-follow links. Links are re-verified automatically every 3 days to remove dead ones. The projects use both do-follow and no-follow link sources across all 3 tiers. Each T2/T3 project can easily go as high as 200,000 links, so it's quite a big structure that's being built.
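    Back-of-envelope math for these structures, reading "1-5-5" as total projects per tier and using the 200k figure above as a rough per-project ceiling (both are assumptions taken from this post, not SER settings):

        PER_PROJECT_CAP = 200_000  # rough T2/T3 ceiling quoted above

        def describe(t1: int, t2: int, t3: int) -> None:
            total_projects = t1 + t2 + t3
            max_links = (t2 + t3) * PER_PROJECT_CAP  # T1 stays small/clean
            print(f"{t1}-{t2}-{t3}: {total_projects} projects, "
                  f"up to {max_links:,} T2/T3 links")

        describe(1, 5, 5)     # 1-5-5: 11 projects, up to 2,000,000 T2/T3 links
        describe(1, 10, 100)  # 1-10-100: 111 projects, up to 22,000,000 T2/T3 links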

    It's a very nice automated way to build the tiers. The authority of the link sources is quite varied. Although GSA shows PR0 on the vast majority of sites, there are still a fair number of PR1-PR8 link sources in there too. This is quite meaningless though, as when checking DR and DA stats the metrics vary again: some have DR and no DA, and vice versa.

    I have other strategies in play that boost the authority on all the link sources anyway, so using a DA0 site today could well become a DA30+ site in the future once the tiers have been built. It takes several months running this project before you start seeing any impact on rankings.
  • londonseo London, UK
    sickseo said:
    I'm running 3 tiers with articles, forums, social network, wiki and SERNuke engines on all 3 tiers. [...]

    Thanks for this detailed explanation and thanks for all your contributions to this forum  B)
  • londonseo London, UK
    sickseo said:
    I'm running 3 tiers with articles, forums, social network, wiki and SERNuke engines on all 3 tiers. [...]

    Why do you have 5 T2 projects when you could just run one T2 project with the same number of articles as the 5 projects combined?
  • sickseo London, UK
    It makes more links with 5 projects. If 1 project goes up to 200k links then 5 of them will make 1 million links.
  • londonseo London, UK
    sickseo said:
    It makes more links with 5 projects. If 1 project goes up to 200k links then 5 of them will make 1 million links.

    Is there any advantage to having more than one link from the same domain?
  • sickseo London, UK
    I'd expect it to help with crawling and boosting page authority. I'm using the random URL option and pointing them at 6k+ T1 links. The odds of the same domain being used repeatedly on the same T1 link are pretty slim.

    The goal is to use unique domains pointing at tiers below.

    But if a domain is used more than once to point to the same link in the tier below, only the keyword anchors will have benefit, as it will send a signal for different keyword/branding anchors; you won't gain any extra authority, as the domain has already been used once.
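    The "slim odds" intuition checks out as a birthday-style calculation: if one domain ends up hosting k posts and each is pointed at a uniformly random URL out of n T1 links, the chance that any two land on the same T1 URL stays small (n = 6,000 and the k values are assumptions matching the numbers above):

        def p_any_repeat(n: int, k: int) -> float:
            """Probability that k random picks from n targets contain a duplicate."""
            p_distinct = 1.0
            for i in range(k):
                p_distinct *= (n - i) / n
            return 1.0 - p_distinct

        n = 6_000  # T1 links in rotation (the "6k+" above)
        for k in (2, 5, 20):
            print(f"{k} posts from one domain -> "
                  f"{p_any_repeat(n, k):.2%} chance of a repeat")
        # 2 -> 0.02%, 5 -> 0.17%, 20 -> 3.12%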
  • sickseo said:
    It makes more links with 5 projects. If 1 project goes up to 200k links then 5 of them will make 1 million links.
    Hi, your responses are a great wealth of information, especially for someone like me who's a total neophyte.  Some of your comments are above my head, however, so I was wondering if you could clarify a bit:

    1)  When you have 5 Tier 2 projects, how much do you differentiate the Settings (i.e., the "Where to Submit" fill-ins)?  If I were to do multiple Tier 2 projects, I don't want SER to be searching for and posting to the same links across all 5. How do I prevent this?

    2)  You mentioned SERNuke's engines.  I took your advice and looked at them.  Did you buy all the engines or only a couple?  Which engine would you recommend for sites that review products on Amazon?  

    3)  Your Tier 3 projects build far more links than Tier 2 and especially Tier 1, correct?  The only way I know how to do this right now (remember, I'm a beginner) is to click on indexer services, which I'd imagine you're not doing.  How do you get Tier 3 to have a lot more links than Tier 2?  (I don't build Tier 1 links directly; I just boost existing "natural" Tier 1 backlinks.)

    Any help would be appreciated!  You're an absolute GSA SER guru!
  • sickseo London, UK
    1. You can disable the searching of links in project settings so that only one project has scraping enabled. Personally I have a different setup for scraping new targets with ZennoPoster. My T1 and T2/3 projects use the same engines - contextuals and profiles.



    The traditional setup is to use T3 (last tier) projects with engines like blog comments, guestbooks, indexers, redirects and even wikis. These engines tend to be good for getting links in T2 crawled, which could lead to indexing. These engines should never be placed in the middle of your tiered structure, as they will either have high OBLs (outbound link counts: blog comments/guestbooks) or be mostly no-follow (blog comments/wikis). No good for passing link juice.

    The redirect and indexer links will be mixed no-follow and do-follow with low OBLs, so they could be placed in the middle tier, but their crawling/indexing rates are incredibly low with Google. Google did a spam update last year to combat these types of links being indexed.

    In my tests these link types are mostly ignored by Google - hence why I don't use them.
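    The OBL (outbound links) check behind those engine choices is easy to reproduce: count how many distinct external domains a page links out to. A stdlib-only sketch (the guestbook URL is a placeholder):

        from html.parser import HTMLParser
        from urllib.parse import urlparse
        import urllib.request

        class LinkCollector(HTMLParser):
            """Collect the hosts of every <a href> on a page."""
            def __init__(self):
                super().__init__()
                self.hosts = set()
            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    host = urlparse(dict(attrs).get("href") or "").netloc
                    if host:
                        self.hosts.add(host.lower())

        def outbound_domains(url: str) -> int:
            """Number of distinct external domains a page links to."""
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            parser = LinkCollector()
            parser.feed(html)
            return len(parser.hosts - {urlparse(url).netloc.lower()})

        print(outbound_domains("https://example.com/guestbook"))  # placeholder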

    If you don't want the software to be posting to the same sites repeatedly, then disable reposting of links.



    You'll need to test this as you won't get many links if each project uses each site only once. If you only have 100 sites in your site list, then you will only get 100 links created per project.
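    The arithmetic behind that warning, with reposting folded in (the account and post counts are illustrative, not SER defaults):

        def max_links(sites: int, accounts_per_site: int = 1,
                      posts_per_account: int = 1) -> int:
            """Upper bound on links one project can build from its site list."""
            return sites * accounts_per_site * posts_per_account

        print(max_links(100))        # reposting off: 100 links, one per site
        print(max_links(100, 2, 5))  # 2 accounts/site, 5 posts each: 1000 links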

    2. For Amazon product review pages, I'd suggest targeting brand name and product name variations - these will usually have lower competition and be easier to rank for.

    The best engines for contextual do-follows are the real estate and job packages. The bio links package is OK for this too. These allow for keyword anchors to be placed within the content - the most effective type of link for rankings. Other built-in engines are also good for this, such as moodle, wordpress, smf, xpress article, dwqa, gnuboard - although the sites will be mixed do-follow/no-follow.

    The Gitea-likes package will be best in terms of link numbers - Gitea has thousands of working sites. But the do-follow links are profile URL anchors, not keyword anchors. The wiki links it makes are no-follow links - so no good for pushing link juice through them.

    I've bought all of the engines. My strategy is based on acquiring links from as many unique domains as possible. My site list for SERNuke engines has over 4,000 sites in it.

    3. Think pyramids when building 3 tiers. As you have T1 links already made, you only need to build T2/3 projects to them. A good strategy would be to use contextuals/profiles on T2 and the junk links on T3 - blog comments, guestbooks, redirects, indexers, as well as wikis. Also make sure on your T3 project you set up the tier settings and enable "do follow" links only. This setup is designed to push link juice to your T1s.



    That's one way. Or you could use contextuals and profiles across both T2/T3 and avoid the junk links - this is what I do, as these link sources are mainly content-based and will have better indexing rates than the junk links.
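    The do-follow filter mentioned in point 3 comes down to checking the anchor's rel attribute: the usual test is that a link passes juice only if no rel token is "nofollow". A stdlib sketch (the target URL and HTML are placeholders):

        from html.parser import HTMLParser

        class FollowCheck(HTMLParser):
            """Is there a do-follow <a> pointing at target_url in the page?"""
            def __init__(self, target_url: str):
                super().__init__()
                self.target = target_url
                self.dofollow = False
            def handle_starttag(self, tag, attrs):
                if tag != "a":
                    return
                a = dict(attrs)
                if a.get("href") == self.target:
                    if "nofollow" not in (a.get("rel") or "").lower():
                        self.dofollow = True

        page = '<a href="https://mysite.example/" rel="nofollow ugc">x</a>'
        checker = FollowCheck("https://mysite.example/")
        checker.feed(page)
        print(checker.dofollow)  # False: rel contains "nofollow"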

    All I can say is play with the software and run different tests. There are lots of strategies that can be deployed once you understand the basics.

    To control the amount of links made in each tier, there are 2 ways: either duplicate a project x number of times to have more links made in that tier, or use the reposting settings to have a project make more accounts and more posts. But make sure you use delay settings.
  • londonseo London, UK
    sickseo said:
    1. You can disable the searching of links in project settings so that only one project has scraping enabled. [...]

    Is there a reason why you don't use Pingbacks & Trackbacks on T3?
  • sickseo London, UK
    I don't see them having any impact on indexing or rankings. I have hardly any of those targets in my site list either. I've tried scraping them and building the list but never had any success with it. These are old-school tactics, similar to the blog comment links I'm moving away from.

    These days I'm more focused on higher-quality link sources - ones that are content-based with low OBLs and good indexing rates.

    For T3, the lower-quality link sources have traditionally been the way to go. But I'm using contextuals and profiles on T3 instead, which is giving me better results.
  • londonseo London, UK
    sickseo said:
    I don't see them having any impact on indexing or rankings. [...]

    So are you saying the contextuals and profiles you use on T3 all get indexed naturally? Is it just MediaWiki links that index naturally?
  • sickseo London, UK
    Not all - you'll never get 100% indexing. But yes, wikis, articles, and SERNuke engines mostly get indexed naturally. Even if a T3 link doesn't get indexed but is crawled, it may still lead to the T2 being indexed in most cases.

    The new engines work like this right now. But once those sites have been spammed to death I'd expect things to change.