Not much; your tier 1 link will get de-indexed, and the rankings you once had for that anchor will plummet.
That's of course assuming you manage to get a 100% index ratio on your 5,000 links.
Rapid indexing is fine, guys, but if you start indexing mega amounts of links all pointing to a single URL, that URL can be de-indexed because of it. Just saying.
Tim89, do you have a study on that, or are you assuming?
My guess is that if Majestic/Ahrefs knows about a link, G does as well, even if it is not indexed. So why would rapid indexing matter more than rapid link building?
@Tim89 you could definitely be right, but I still like to experiment.
I don't want to start a big debate, but honestly, I think the quality of the links plays a big role. These are contextual links, so let's see what happens anyway.
I've tried it myself at a higher scale, indexing 35,000+ links to a single URL. My theory is this:
you can safely build a million backlinks in one day;
you cannot safely index a million links in one day.
Majestic and Ahrefs aren't using the Google index. I once had a site I had built around 50,000 links to; Majestic said I had around 3,000 links and Ahrefs indicated I had 10,000. Now you tell me, how many links do I have?
I have around 50,000 links, because I built them myself and I am monitoring those links with BLM.
This is why I cringe when people rely on Majestic SEO and Ahrefs for their link analysis; it's pointless. Fair enough, you may be able to spot a trend in link loss to counteract a possible ranking drop, but they are not a great indicator when it comes to actual link volume.
Tim89, that is kind of interesting. I don't doubt your test, but I think more testing would be in order to better understand what's really happening and what its limits are. Maybe the G crawler is aware of (approximately) all links, but the ranking/penalty algorithm only concerns itself with indexed ones. Actually, that does make sense.
So then maybe we should be setting our limits on indexed links per day rather than submitted/verified.
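That pacing idea is easy to make concrete. The sketch below caps a day's work by how many links have actually been indexed rather than how many were submitted; the function name and the `is_indexed` callback are hypothetical, just to illustrate the bookkeeping (the actual index check would be done elsewhere, e.g. with ScrapeBox).

```python
def remaining_budget(urls, is_indexed, daily_indexed_cap):
    """Cap a campaign by indexed links per day rather than submitted links.

    urls: links built so far today; is_indexed: a callback that reports
    whether Google currently has the URL in its index.
    Returns how many more indexed links today's quota allows.
    """
    indexed_today = sum(1 for u in urls if is_indexed(u))
    # Once the indexed quota is met, stop pushing links to the indexer.
    return max(0, daily_indexed_cap - indexed_today)
```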
@Samx Indeed. Ask yourself: how can a non-indexed link carry any weight in rankings?
The crawler can possibly crawl the link and know where it is, but I doubt it contributes to rankings, or to the algo as you mentioned, until it is indexed.
There were a few members on this very forum complaining that their links were not being indexed the way they used to be before the May update earlier this year. That supports this theory even more: links were not being indexed as easily, and therefore rankings were not increasing as they normally would.
It's common sense that a non-indexed link has no effect on rankings; this is why SEO seems so random. I'll give you an example:
Let's say we begin on a site and we've created 50 exact-anchor backlinks with our one keyword, and then we create 150 raw-URL and generic anchors for anchor text diversity. Say by day 7, 10 of our exact anchors have been indexed and 10 of our generics have been indexed; that means we currently have a 50/50 anchor text split, which in effect gives us mediocre rankings.
However, say by day 14 all 50 of our exact-anchor backlinks have been indexed and none of our generics have been indexed. At that point we would in fact get penalised, not because we've built too many links, but simply because none of our generic-anchor backlinks have been indexed. Many people at this point shit a brick and start building more and more and more links. This is bad.
When, to put it simply, you just needed to concentrate on indexing those generics you already built, which in return 'should' get you out of the gutter and to where you want to be.
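The scenario above is just ratio arithmetic over the indexed subset of your links. A minimal sketch (the function name and data layout are made up for illustration), reproducing the day 7 and day 14 situations described:

```python
def indexed_anchor_ratio(links):
    """links: list of (anchor_type, is_indexed) tuples, where anchor_type
    is 'exact' or 'generic'. Returns the share of indexed links that are
    exact-match, which is the ratio the algorithm would actually 'see'."""
    indexed = [t for t, idx in links if idx]
    if not indexed:
        return 0.0
    return indexed.count("exact") / len(indexed)

# Day 7: 10 exact and 10 generic links indexed -> a 50/50 split.
day7 = [("exact", True)] * 10 + [("exact", False)] * 40 \
     + [("generic", True)] * 10 + [("generic", False)] * 140
# Day 14: all 50 exact indexed, no generics indexed -> 100% exact.
day14 = [("exact", True)] * 50 + [("generic", False)] * 150
```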
The algo is dynamic; when you fluctuate, it could be anything, from link loss to not having enough generics, again due to link loss.
Personally, I wouldn't start a campaign without knowing how many tier 1 links I want to build, so I can keep track of the numbers. Normally that's 3x the number of base exact-anchor backlinks in diversity links on top of the base itself, so if I needed around 50 exact-anchor backlinks I would build 200 tier 1 links in total, then start tiering them out from there.
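That 3x rule of thumb reduces to a one-line calculation. This is a sketch of the poster's own heuristic, not a standard formula; the function name is invented:

```python
def plan_tier1(exact_needed, diversity_multiplier=3):
    """Total tier 1 links = exact-anchor links plus diversity links at
    `diversity_multiplier` times the exact count (the 3x rule above)."""
    diversity = exact_needed * diversity_multiplier
    return {"exact": exact_needed,
            "diversity": diversity,
            "total": exact_needed + diversity}
```

So needing 50 exact anchors yields 150 diversity links and 200 tier 1 links in total, matching the numbers in the post.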
Anyway, I'm rambling and I don't think I'm making sense now.
Have you tried GSA SEO Indexer for deep links? It does nothing in the short term. Maybe, if some of the links it builds get picked up by Analytics, they might do something in the long term, but the difference is huge.
@komakino The indexing service does not use any links or RSS feeds, or build any anchor text. It does not leave any footprints on submitted links, and it has not harmed any of our networks.
The method works, and it takes a lot of resources to get such a high volume indexed at 50-75% rates.
I'll probably wait a few days to see if you guys can hold the indexing even after a few days. I'd like feedback then; after that I'll definitely be interested in trying this service out.
@glennf The API won't be ready for at least a few weeks, so when that does happen, you'll be hearing about us quite a bit.
The results you'll find would be a 50-60% indexing rate on typical T1 and T2 lists. If you want to test it out on a small list, just add me on Skype: cloudlinkbuilding
@jiggsaw please check your email that admin@theincredibleindexer.com sent, it contains full instructions on how to submit links.
Update:
We are now using Dropbox only for submissions; it is faster for everyone to simply copy and paste into a shared Excel file. Emails were sent to update everyone who has joined recently.
OK. I'll wait until the API is integrated into GSA SER by Sven assuming he will be doing this.
Forgive me. When you say "the results you'll find would be 50-60% indexing rate on typically T1 and T2 lists", does this ultimately mean I will see faster and better rankings in the SERPS, assuming I used good content for my GSA SER projects?
Would it be OK to submit thousands of backlinks at once for indexing if they all point to completely different inner pages of the same website, or will that get that site's main URL penalized?
@Glennf Yes sir, when used right this indexer will give you faster and better rankings. Indexing your links is important, but like everything, you need to know what you are doing to get it done right!
@Mike it would definitely be wise to drip them over a period of time, as link velocity is important; too fast and it can hurt your site, from my own experience.
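Dripping a batch over several days is trivial to script. This sketch just slices a link list into fixed-size daily chunks; the function name and batch size are illustrative, not part of the service:

```python
def drip_batches(urls, per_day):
    """Split a link list into daily submission batches of at most `per_day`
    links, so the indexer sees a steady velocity instead of one big dump."""
    return [urls[i:i + per_day] for i in range(0, len(urls), per_day)]
```

For example, 1,000 links at 250 per day become four daily batches.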
I have a question. I submit links daily (250 per day), but how will I know when they are processed? Also, will there be some status (indexed or not), or do I need to verify link indexation via ScrapeBox/GSA?
88/154 (indexed/not indexed) and 130/120 for the first batches. Less than 50% overall, but still good. Later I will try to resubmit the non-indexed links to check what the ratio will be in that case.
So it looks like a really good service. But we will need a normal interface, an API, and updates in the future ;-)
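The overall rate across batches like those above is just a weighted sum. A small sketch (function name invented), fed the two reported batches:

```python
def overall_index_rate(batches):
    """batches: list of (indexed, not_indexed) pairs.
    Returns the fraction of all submitted links that got indexed."""
    indexed = sum(i for i, n in batches)
    total = sum(i + n for i, n in batches)
    return indexed / total if total else 0.0

# The two batches reported above: 88/154 and 130/120.
rate = overall_index_rate([(88, 154), (130, 120)])  # 218/492, about 0.443
```

That works out to roughly 44%, consistent with the "less than 50% overall" observation.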
I went from 0/57 to 4/57 more than a day after processing. It's all wiki links with "ok" content. Can it take a bit more time, or did something go wrong?
tim89: I constantly see how Google indexes new articles (for example, unique content indexed via HQ bookmarks). It puts an article into the index almost immediately, then removes it from the index for a few days (up to 1 week), after which it returns to the index. I have seen this a few times on T1 unique articles which I monitored manually.
I have a question for cloudattack. What method (without the secret sauce) is used to index our backlinks? Is this indexer based on behavioural factors, links/social, or something different? I just want to be sure it is not a spammy or easily recognizable method.
How many links was this case study you speak of focusing on?
If you've previously kept track of, let's say, 10 links and you index them with this indexing service, then 2-3 days later all 10 of these links become de-indexed, you're stating that this is normal behaviour?
I've witnessed articles get indexed and then de-indexed for a short period of time myself, but I'm talking 1 page out of 100 pages.
tim89: it's not a study. I index tier 1 unique-content links (manually, for important sites) using my own system, so I see G's reaction to all T1 unique articles. I don't use any indexing services for T1.
Usual article indexation looks like this for me: in the index for 1-2 days, out of the index, and after 3-8 days back in the index again.
@Tim89 we are not doing what you think we are doing to index links. So feel free to test all you want; it really is not doing anyone justice.
@mafcra our proprietary indexer is developed to be safe and effective (no links, social media, or anchor texts). How it works is a secret in itself; as much as I don't mind helping others and giving back to the community, the fewer who know, the better it is for everyone (Google could easily put 2+2 together!).
I'd simply like to know, before I switch from my own indexing method to yours, whether this indexing service actually works.
Forgive me for asking questions; I like to do that before paying for a service. If there are individuals already paying for your service who want to help promote it, it would cost them nothing to release the additional information I asked for. No one has to, but it would be nice to simply know whether these indexed links stay indexed after a few days, that's all.
From my previous experience, I too have a working method which typically gets the same indexing rates as yours, within the same timeframe, but I am simply too lazy to add that task to my already busy schedule, even though it doesn't take that much time to do.
I'm not attempting to put your service down; I just would like to know the facts before switching, as I normally stay with a service provider for a long time.
We are running projects every day; it might be slower on weekends, but we try to keep them running then. Eventually we'll stop submitting them manually and get a system in place to automate it completely.
For those who had issues and need re-runs, do let us know through the admin email. We are constantly working out ways to keep this indexer in prime condition, but so far so good!
For those who don't know yet, this indexer is meant for new links only. Anyone submitting old lists may see less than 50-60% indexing rates. This is normal, and we also believe you can achieve over 60% if your links are fresh. So for those submitting old links: don't do it, it is not worth it.
The API is coming soon, so this will not be an issue for those who will benefit from this service tremendously.
@cloudattack I submitted a bunch of older backlinks to your service (that are still "verified" in SER) just to make sure my links from the past will finally get indexed. They took so long to create that it's worth the money to have you try to improve my index rate on almost a year's worth of link-building work. I'm happy with any index-rate improvement; in my eyes, 50-60% would still be awesome for old links.
Can it hurt anything, or get my sites penalized, to submit older links?
@sonic81 you are set up and ready to go! Sorry, I just had quite a few emails to go through and must have missed yours.
@Samx links around 3-5 days old would be the ideal age to submit to our indexer. With an API set up, you will not have to worry and can start dripping links daily.
@Mike submitting older links will never hurt your rankings in any way, shape or form; from our tests, you may just see a lower indexing rate.
You may still submit those, though we recommend only new links (3 days old is best), as those index at a more reliable rate than links built a few weeks or months ago.
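If fresh links index more reliably, a pre-filter on link age before submission is easy to add. A sketch under that assumption; the function name and data layout are illustrative, and the 3-day cutoff follows the recommendation above:

```python
from datetime import date, timedelta

def fresh_links(links, max_age_days=3, today=None):
    """links: list of (url, created) pairs with `created` as a date.
    Keep only links no older than `max_age_days`, per the advice that
    roughly 3-day-old links index at the most reliable rate."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [url for url, created in links if created >= cutoff]
```

Older links aren't harmful to submit per the reply above; they would just be queued separately with lower expectations rather than mixed into the fresh batch.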
@Glennf It's perfect for Tier 1 and Tier 2 links, and can easily index Tier 3 links.
The main question is whether these links stay indexed if you aren't building backlinks to them? That's especially an issue for lower-tier spammy links, which are hard to index.
Apologies if it's already been answered; I didn't fancy going through the whole 4 pages just to find the answer to that.
@peterparker sure, after testing it out for 6 months on Tier 1 and Tier 2, they have not shown an issue with de-indexing.
Those who build huge Tier 3 lists to index are having great success with our indexer and will continue to increase their monthly profits by incorporating this sweet service.
For those asking about where to purchase, go to http://theincredibleindexer.com/
Just curious, because for example I wouldn't want my 100k links to be plus-ed by the same 1k accounts.
So if you could shed some light on the technique without revealing too much, or just assure us that there's no footprint, that would be nice.
This question has probably been asked before, but how does this service integrate with GSA SER projects?
What results can I expect when I sign up for a package?
Could you possibly keep track of 20 URLs, put them through the indexing process, wait a few days, then recheck their index status? Report back. Thanks...
Tim89, you can track them yourself. PM me if you like.