Link Indexing
rsharpe75
United Kingdom
Which indexing services are available within GSA SER, and which ones do the vast majority of people use?
Also, I have been reading quite a few threads on the forum, and a few people seem to be building upwards of 250K links per day. Looking at some of the indexing services, passing that many links through them gets expensive very quickly. Some charge $120 and upwards for 500K links, which works out to $120 every couple of days for anyone churning out 250K links per day.
Would love to hear from anyone who is actually using an indexing service: how they use it, the associated costs, and whether they feel indexing offers value for money.
Thanks
Comments
Your review of inject was very interesting because it was one of the services I was looking at.
Regarding dead links, I actually read another article you wrote that addresses this. At the time, I thought a staggered approach might work quite nicely. For example:
First 4 weeks spent building x number of Tier 1 links per day.
After the 4 weeks have passed, I check which links from day 1 are still active and only build Tier 2 links to those. This step will obviously also take 4 weeks.
When this is done in week 8, I'll then start checking Tier 1s and 2s to make sure they are still alive and use that output to build Tier 3 links over the course of the next 4 weeks.
I'm hoping this will let me maximise the effectiveness of the link building and, ultimately, the indexing once the links are put through a service.
I'm going to have to work out a way to organise this approach, because if done wrong I'll confuse myself. I'll also have to run multiple projects simultaneously, otherwise the velocity of links built with this staggered approach will be quite low.
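Just to make the timing side of it concrete, here's a rough Python sketch of how I'm thinking of tracking which links are old enough to receive the next tier. The batch data is obviously made up, and it doesn't do the live check — it's only the scheduling idea:

```python
from datetime import date, timedelta

# Hypothetical record of link batches: when each batch was built and what tier it sits on.
# In practice this would come from an export of whatever tool is tracking the links.
batches = [
    {"urls": ["http://example.com/a", "http://example.com/b"], "built": date(2014, 1, 1), "tier": 1},
    {"urls": ["http://example.com/c"], "built": date(2014, 1, 15), "tier": 1},
]

MATURE_AFTER = timedelta(weeks=4)  # only build the next tier on links at least 4 weeks old


def targets_for_next_tier(batches, tier, today=None):
    """Return URLs on the given tier that are old enough to receive the next tier."""
    today = today or date.today()
    eligible = []
    for batch in batches:
        if batch["tier"] == tier and today - batch["built"] >= MATURE_AFTER:
            eligible.extend(batch["urls"])
    return eligible


# URLs I would point Tier 2 links at this week (still need a live-check before use).
print(targets_for_next_tier(batches, tier=1))
```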
Would love to hear your thoughts.
In terms of organisation, I have used Backlink Monitor (BLM) for a long time now and it's very good at doing just that: organising links, viewing tiered links, and keeping track of URL age. What I do is build my links, import them into BLM projects and run a check. Then 14 days later I re-run the project and highlight links that are 14 days old (you can use whatever age you want, 30+ if you like) and submit only the live URLs older than 14 days, which weeds out all the dead links. BLM supports Express Indexer as well, so it's a simple case of highlighting all those links, right-clicking and submitting.
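If you ever wanted to script that same filter outside BLM, a rough Python sketch of the idea would look something like this. The links.csv file, its url,date_built format, and the simple liveness check are just placeholders (a proper check would also confirm your backlink is still on the page):

```python
import csv
from datetime import datetime, timedelta
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

MIN_AGE = timedelta(days=14)  # only submit links older than 14 days


def is_live(url, timeout=10):
    """Rough liveness check: the URL opens without an HTTP or network error."""
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        urlopen(req, timeout=timeout).close()
        return True
    except (HTTPError, URLError, OSError):
        return False


# links.csv is assumed to hold "url,date_built" rows exported from whatever tracks the links.
to_submit = []
with open("links.csv", newline="") as f:
    for url, built in csv.reader(f):
        age = datetime.now() - datetime.strptime(built, "%Y-%m-%d")
        if age >= MIN_AGE and is_live(url):
            to_submit.append(url)

# Write the survivors out for submission to the indexing service.
with open("to_index.txt", "w") as f:
    f.write("\n".join(to_submit))
```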