Get your links indexed - [Some Ideas]
Hey,
I know a lot of people are struggling with not getting their links indexed, especially after the recent updates. Call it Penguin 2.0 or whatever, but what most people noticed is the following:
- fewer links get indexed
- indexing is slower
- proxies get banned faster, so scraping for new targets gets harder
None of my pages really got "hit" by the recent updates in the sense of dropping 500 places in the rankings, but what I did notice soon after was a big loss of links in total.
In automated link building it is pretty normal to build links and lose them. If you're doing it the lazy way and build your T1 with SER (which most people here do for the sake of convenience), there is no such thing as a "safe" T1. You will lose links after a while: your useful content gets deleted, the page goes offline, gets deindexed, etc.
What counts is that you have a steady stream of fresh links to replace these and also gain links in total. Lose 90, build 100...
Most people here count on tiered linkbuilding (TLB) to do this, and so do I.
The problem with TLB is (and has always been) that the top of your pyramid gets extremely big at some point and it becomes almost impossible to build multiple links to each of the top-tier properties. If you do the math (for the 10-10-10 blueprint most people use), you will see that you need millions of BL per project after a while, which is almost impossible.
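Just to illustrate that math - a rough sketch, assuming a plain 10-10-10 ratio and a T1 that keeps growing (the T1 counts are made-up examples):

```python
# Rough illustration of why a 10-10-10 tier blueprint explodes once T1 keeps growing.
# Assumption: every T1 link gets ~10 T2 links, every T2 link gets ~10 T3 links.
LINKS_PER_PARENT = 10

def links_needed(t1_count):
    """Return (t2, t3, total) link counts required for a given number of T1 links."""
    t2 = t1_count * LINKS_PER_PARENT
    t3 = t2 * LINKS_PER_PARENT
    return t2, t3, t1_count + t2 + t3

for t1 in (100, 1000, 10000):
    t2, t3, total = links_needed(t1)
    print(f"T1={t1:>6} -> T2={t2:>7}  T3={t3:>9}  total={total:>9}")

# T1=   100 -> T2=   1000  T3=    10000  total=    11100
# T1=  1000 -> T2=  10000  T3=   100000  total=   111000
# T1= 10000 -> T2= 100000  T3=  1000000  total=  1110000
```

Once a project has accumulated a T1 in the tens of thousands, the lower tiers alone are already in the millions - and that's before replacing anything that gets lost.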
The other problem is that while you're building a bigger T1, you constantly lose T2, T3, etc. links too.
Imagine a pyramid where the top gets bigger all the time, while a little goblin in a Google t-shirt constantly takes stones away from the base. While the top grows, you're constantly trying to repair the base to keep the damn thing from collapsing...
After some time you reach a critical point where your hands will bleed, your back will hurt, and you can't keep up anymore.
Some of my older projects suffered this fate last month, and since they had become real cash cows over time I didn't want to lose them. They had already started dropping in the SERPs.
Since I didn't find many new targets that I hadn't already parsed to repair the base of the pyramid, I first ticked that "continuously try to post to a page even..." option in my contextual T2 and created a catch-all email for these (so that I can create unlimited accounts).
The outcome was that SER built a LOT of new contextual links on platforms where it had already posted. But who cares? 30 articles on the same site, but all pointing to different T1 articles. I don't think G will filter this. So far fewer targets are needed.
But obviously this wasn't enough; the T1 links still got indexed extremely slowly...
I did an index check on my T1 after a few weeks and saw something like 200/800 indexed. A lot of those were extremely old, like 3-4 months. The new ones built after Penguin 2.0 had an indexing rate of maybe 5-10%.
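For anyone who wants to script the index check themselves instead of using SER or Scrapebox, here's a minimal sketch of the usual "site:" query approach (illustrative only - the file name is made up, and Google throttles or captchas automated queries quickly, so in practice you'd need proxy rotation and longer delays):

```python
import time
import urllib.parse
import requests

# Crude "site:" index check. The not-indexed detection is just a heuristic on the
# classic "did not match any documents" message in the result page.
HEADERS = {"User-Agent": "Mozilla/5.0"}

def is_indexed(url):
    q = urllib.parse.quote(f"site:{url}")
    resp = requests.get(f"https://www.google.com/search?q={q}", headers=HEADERS, timeout=15)
    return "did not match any documents" not in resp.text

with open("t1_urls.txt") as f:  # hypothetical export of your verified T1 URLs
    urls = [line.strip() for line in f if line.strip()]

indexed = 0
for url in urls:
    if is_indexed(url):
        indexed += 1
    time.sleep(10)  # be gentle, or Google blocks you almost immediately

print(f"{indexed}/{len(urls)} indexed")
```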
I decided to delete all indexed T1s at that point. I know this is counter-intuitive, since the indexed ones at least pass link juice. But I checked the number of BL for some of these properties and they already had around 50-100 each.
The idea was that this way more links go to the non-indexed ones, which should finally get them indexed.
I had to decide between the concept of link juice and indexing. I think it's true that contextual properties pass link juice and over time maybe PR to your money site. But is it normal for a 700-word article on a Drupal blog to have 300 backlinks? I don't think so. The amount of link juice these properties can pass is limited.
Imho it is enough to build links to them until they get indexed (they should have a bunch at that point) and then forget about them (which means deleting them from the project). Instead of building hundreds and thousands of links to some BuddyPress property, those resources should be used to get new links indexed.
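In practice the decision boils down to splitting the verified T1 list in two. A minimal sketch, assuming you already have index-check results in a dict (URLs and file names here are made up):

```python
# Split a verified T1 list by index status (sketch - "index_status" would come from
# whatever index checker you use: SER, Scrapebox, or the snippet above).
index_status = {"http://example-blog.com/post-1": True,
                "http://example-blog.com/post-2": False}  # url -> indexed?

retire = []          # indexed: they did their job, stop spending T2/T1A links on them
keep_building = []   # not indexed yet: these keep receiving links

for url, indexed in index_status.items():
    (retire if indexed else keep_building).append(url)

with open("t1_retire.txt", "w") as f:
    f.write("\n".join(retire))
with open("t1_keep_building.txt", "w") as f:
    f.write("\n".join(keep_building))
```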
And guess what, this really worked out for me. I didn't change anything else (apart from these two things) and now I see many new indexed links popping up - and I'm gaining links in total instead of losing them.
I think these tips could be of value for some of you.
Today I even considered deleting every verified link in my T1 that is older than 30 days and still not indexed. If they don't get indexed after receiving a bunch of T2 and T1A links, they most likely never will. Or maybe they will over time, but you don't need to build additional links for that.
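As a sketch, that 30-day rule could look like this (assuming you export the verified list with verification dates - the file and column names here are hypothetical, the actual export format of your tool may differ):

```python
import csv
from datetime import datetime, timedelta

# Prune rule sketch: flag verified T1 links that are older than 30 days and still not indexed.
# Assumes a CSV export with hypothetical columns "url", "verified_date" (YYYY-MM-DD), "indexed".
MAX_AGE = timedelta(days=30)
now = datetime.now()

to_delete = []
with open("verified_t1.csv", newline="") as f:
    for row in csv.DictReader(f):
        age = now - datetime.strptime(row["verified_date"], "%Y-%m-%d")
        if age > MAX_AGE and row["indexed"].lower() != "yes":
            to_delete.append(row["url"])

print(f"{len(to_delete)} stale, non-indexed T1 links to remove from the project")
```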
I hope I can help some people with this, and I'd appreciate it if you shared your own ideas and thoughts about it.
Forgot to mention that I also disabled Lindexed a few weeks ago (not because they are bad, but because I think it's too obvious to the SEs), and I'm not using any other indexing service anymore either, pinging etc. included.
Best Regards
Comments
As you said, just because it has a tier doesn't mean anything, and I'm pretty sure G is looking at how to neutralise tiering if they haven't done so already.
How are you both checking the indexing? In SER or Scrapebox, or something else?