Link Indexing

Which indexing services are available within GSA SER and which ones are the vast majority of people using?

Also, I have been reading quite a few threads in the forum and a few people seem to be building upwards of 250K links per day.  Checking some of the indexing services, passing your links through them gets expensive very quickly.  Some charge $120 and upwards for 500K links, which works out to $120 every couple of days for people churning out 250K links per day.

Would love to hear from anyone who is actually using an indexing service, to get an idea of how they use it, the associated costs, and whether they feel indexing offers value for money.



  • shaun - The Ultimate Resource For Free GSA Tool Tutorials!
    I published all of this content about indexing last year. I haven't really used any services since June/July last year though, so things could have changed. The thing with indexing is that many services will only get Google to crawl your link; Google then decides whether to index it based on things like duplicate content percentage and readability, so you can pay $xxx per month and still get nothing indexed. Additionally, you might have 50% or more of your links die within a month or so, meaning half of the credits are wasted.

  • rsharpe75 United Kingdom
    Hi @shaun

    Your review of Index Inject was very interesting because it was one of the services I was looking at.

    With regards to dead links, I actually read another article you wrote that addressed this.  At the time, I thought perhaps a staggered approach would work quite nicely.  For example:

    First 4 weeks spent building x number of Tier 1 links per day.
    After the 4 weeks have passed, I check which links from day 1 are still active and only build Tier 2 links to those.  This step will obviously also take 4 weeks.
    When this is done in week 8, I'll then start checking Tiers 1 and 2 to make sure they are alive, and use this output to build Tier 3 links over the course of the next 4 weeks.
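    The staggered schedule above could be sketched roughly as follows. This is only an illustration of the timing logic, not anything from GSA SER itself; the function name, the fixed 4-week windows and the start date are assumptions taken from the post:

    ```python
    from datetime import date, timedelta

    def tier_to_build(start: date, today: date) -> int:
        """Return which tier should be built on a given day under the
        staggered plan: Tier 1 for weeks 0-3, Tier 2 (to surviving Tier 1
        links) for weeks 4-7, Tier 3 (to survivors of Tiers 1-2) after that."""
        weeks = (today - start).days // 7
        if weeks < 4:
            return 1
        elif weeks < 8:
            return 2
        else:
            return 3

    start = date(2018, 1, 1)  # hypothetical campaign start
    print(tier_to_build(start, start + timedelta(weeks=5)))  # -> 2
    ```

    Before each Tier 2 or Tier 3 run you would still alive-check the lower tiers and feed only the surviving URLs in as targets, as described above.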

    I'm hoping this will allow me to maximise the effectiveness of the link building and, ultimately, of the indexing when the links are put through a service.

    I'm going to have to work out a way to organise this approach because, if done wrong, I'll confuse myself.  I'll also have to run multiple projects simultaneously, otherwise the velocity of links built using this staggered approach will be quite low.

    Would love to hear your thoughts.

  • shaun - The Ultimate Resource For Free GSA Tool Tutorials!
    The staggering thing is similar to something I used before stopping SER on my money sites. Basically an alive check, organised by date: on the 1st of every month, select everything from the previous month and then push it to the indexer via the right-click options.
  • Tim89
    My indexer is great for mass link indexing. @shaun, I just skimmed through your last indexing post and it seems to favour Index Inject, but that isn't a viable solution if you're mass-building tiered links using SER or most automated tools.

    In terms of organisation, I have used Backlink Monitor for a long time now and it's very good at doing just that: organising links, viewing tiered links, and keeping track of URL age. So what I do is build my links, import them into BLM projects and run a check. Then, 14 days later, I re-run the project and highlight links that are 14 days old (you can use whatever age you want, 30+ days etc.) and submit only the live URLs older than 14 days; this weeds out all the dead links. BLM supports Express Indexer as well, so it's a simple case of highlighting all those links, right-clicking and submitting.
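    The filtering step Tim89 describes, keeping only links that are still alive and at least 14 days old before submitting them to an indexer, could be sketched like this. The record layout and the `is_alive` flag are assumptions for illustration; in practice BLM does the alive checking for you:

    ```python
    from datetime import datetime, timedelta

    MIN_AGE = timedelta(days=14)  # assumed threshold from the post; use 30+ if you prefer

    def links_to_submit(links, today):
        """links: list of dicts with 'url', 'found' (datetime), 'is_alive'.
        Returns only live URLs at least MIN_AGE old, i.e. the ones worth
        spending indexer credits on."""
        return [l["url"] for l in links
                if l["is_alive"] and today - l["found"] >= MIN_AGE]

    links = [
        {"url": "http://a.example/1", "found": datetime(2018, 3, 1), "is_alive": True},
        {"url": "http://a.example/2", "found": datetime(2018, 3, 1), "is_alive": False},
        {"url": "http://a.example/3", "found": datetime(2018, 3, 10), "is_alive": True},
    ]
    print(links_to_submit(links, datetime(2018, 3, 20)))  # -> ['http://a.example/1']
    ```

    The point of the age gate is the same one shaun makes above: links that die within the first couple of weeks would otherwise burn indexer credits for nothing.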