Not much; your tier 1 link will get de-indexed, and the rankings you 'once' had for that anchor will plummet.
That's of course assuming you manage to get a 100% index ratio on your 5,000 links.
Rapid indexing is fine, guys, but if you start to index mega amounts of links all pointing at a single URL, that URL can be de-indexed because of it. Just saying.
Tim89, do you have a study on that, or are you assuming?
My guess is that if Majestic/Ahrefs knows about a link, G does as well, even if it is not indexed. So why would rapid indexing matter more than rapid link building?
@Tim89 you could definitely be right, but I still like to experiment.
I don't want to start a big debate, but honestly, I think the quality of the links plays a big role. These are contextual links, so let's see what happens anyway.
I've tried it myself at a higher scale, indexing 35,000+ links to a single URL. My theory is this:
you can safely build a million backlinks in one day;
you cannot safely index a million links in one day.
Majestic and Ahrefs aren't using the Google index. I once had a site that I had built around 50,000 links to; Majestic said I had around 3,000 links and Ahrefs indicated I had 10,000. Now you tell me, how many links do I have?
I have around 50,000 links, because I built them myself and I am monitoring those links with BLM.
This is why I cringe when people look at Majestic SEO and Ahrefs for their link analysis; it's pointless. Fair enough, you may be able to spot a trend in link loss to counteract any possible ranking drops, but that is not a great indicator when it comes to actual link volume.
Tim89, that is kind of interesting. I don't doubt your test, but I think more testing would be in order to better understand what's really happening and what its limits are. Maybe the G crawler is aware of all links (approximately), but the ranking/penalty algorithm only concerns itself with indexed ones. Actually, that does make sense.
So then maybe we should be setting our limits in indexed links per day rather than submitted/verified.
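To make that concrete, here is a minimal sketch of throttling by indexed links per day rather than by submissions. It assumes you keep a list of the links pushed today and have some way to check index status; check_indexed is a hypothetical stand-in, not a real API.

    # Hypothetical sketch: cap daily pushes by how many links actually got
    # *indexed*, not by how many were submitted/verified.
    def queue_for_indexing(pending_links, pushed_today, check_indexed, cap=150):
        # Count how many of today's pushes have been indexed so far.
        indexed_today = sum(1 for url in pushed_today if check_indexed(url))
        budget = max(0, cap - indexed_today)
        # Send only as many fresh links as the remaining daily budget allows.
        return pending_links[:budget]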
@Samx Indeed. Ask yourself: how can a non-indexed link carry any weight in rankings?
The crawler can possibly crawl the link and know where that link is, but I doubt it contributes to rankings, or to the algo as you mentioned, until it is indexed.
There were a few members on this very forum complaining that their links were not being indexed the way they used to be before the May update earlier this year. That just supports this theory even more: links were not being indexed as easily, and therefore rankings were not increasing as they normally would.
It's common sense that a non-indexed link has no effect on rankings. This is why SEO is so random. I'll give you an example:
Let's say we begin on a site and we've created 50 exact-anchor backlinks with our one keyword, and then we create 150 raw URLs and generic anchors for our anchor text diversity. Say that by day 7, 10 of our exact anchors have been indexed and 10 of our generics have been indexed; this means we currently have a 50/50 anchor text split, which in effect gives us mediocre rankings.
However, say that by day 14 all 50 of our exact-anchor backlinks have been indexed and none of our generics have been indexed. At that point we would in fact get penalised, not because we've built too many links, but simply because none of our generic-anchor backlinks have been indexed. Many people at this point shit a brick and start building more and more and more links. This is bad.
When, to put it simply, you just needed to concentrate on indexing those generics you already built, which in return 'should' get you out of the gutter and to where you want to be.
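To make the arithmetic in that example concrete, here is a tiny illustrative calculation; the numbers come straight from the day-7 scenario above, nothing else is assumed.

    # Illustrative only: the ratio that matters is among *indexed* links,
    # not among links built. Day-7 numbers from the example above.
    exact_built, generic_built = 50, 150
    exact_indexed, generic_indexed = 10, 10

    built_ratio = exact_built / (exact_built + generic_built)          # 25% exact
    indexed_ratio = exact_indexed / (exact_indexed + generic_indexed)  # 50% exact
    print(f"exact-anchor share built:   {built_ratio:.0%}")
    print(f"exact-anchor share indexed: {indexed_ratio:.0%}")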
The algo is dynamic. Sometimes when you fluctuate it could be anything, from link loss to not having enough generics indexed, due again to some link loss.
Personally, I wouldn't start a campaign without knowing how many tier 1 links I WANT TO BUILD, so I can keep track of the numbers. Normally this is 3x the number of base exact-anchor backlinks I need, so if I needed around 50 exact-anchor backlinks I would build 200 tier 1 links in total, then start tiering them out from there.
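A quick sketch of that sizing rule, purely illustrative and assuming only the 3x multiplier described above:

    # Purely illustrative: diversity links are three times the exact-anchor
    # base, so tier 1 totals 4x the base.
    def plan_tier1(exact_needed, multiplier=3):
        generics = exact_needed * multiplier
        return {"exact": exact_needed, "generic": generics,
                "tier1_total": exact_needed + generics}

    print(plan_tier1(50))  # {'exact': 50, 'generic': 150, 'tier1_total': 200}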
Anyway, I'm rambling and I don't think I'm making sense now.
Have you tried GSA SEO Indexer for deep links? It does nothing in the short term. Maybe, if some of the links it builds are picked up by Analytics, they might do something in the long term, but the difference is huge.
@komakino The indexing service does not use any links or RSS feeds, and does not build any anchor text. It does not leave any footprints for submitted links, and it has not harmed any of our networks.
The method works, and it takes a lot of resources to get such a high volume indexed at 50-75% rates.
I'll probably wait a few days to see whether the links you guys index actually stick after a few days; I'd like feedback then. After that, I'll definitely be interested in trying this service out.
@glennf The API won't be ready for at least a few weeks, so when that does happen, you'll be hearing about us quite a bit.
The results you'll see would be a 50-60% indexing rate on typical T1 and T2 lists. If you want to test it out on a small list, just add me on Skype: cloudlinkbuilding
@jiggsaw please check the email that admin@theincredibleindexer.com sent; it contains full instructions on how to submit links.
Update:
We are now using Dropbox only for submissions; it is faster for everyone to simply copy and paste into a shared Excel file. Emails were sent to update everyone who has joined recently.
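If you are assembling that list before pasting it in, a small helper like this keeps the submission clean and deduplicated, one URL per line; the file names here are made up for illustration.

    # Hypothetical helper (file names invented): dedupe and trim a verified-URL
    # list so the paste into the shared sheet is one clean URL per line.
    seen = set()
    with open("verified_urls.txt") as f, open("to_submit.txt", "w") as out:
        for line in f:
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                out.write(url + "\n")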
OK, I'll wait until the API is integrated into GSA SER by Sven, assuming he will be doing this.
Forgive me. When you say "the results you'll find would be 50-60% indexing rate on typically T1 and T2 lists", does this ultimately mean I will see faster and better rankings in the SERPS, assuming I used good content for my GSA SER projects?
For those asking where to purchase: go to http://theincredibleindexer.com/
Just curious, because for example I wouldn't want my 100k links to be plus-ed by the same 1k accounts.
So if you can shed some light on the technique without revealing too much, or just assure us that there's no footprint, that would be nice.
This question has probably been asked before, but how does this service integrate with GSA SER projects?
What results can I expect when I sign up for a package?