SER-Made Contextual Indexing Rates Since 1st July 16
shaun
https://www.youtube.com/ShaunMarrs
Afternoon guys,
Has anyone else noticed a change in their indexing rates over the past few weeks? I'm not 100% sure of the date it started, but everything seemed fine at the end of last month. I have gone from getting around 75% of SER-made contextual links indexed, and holding that number, to getting around 25% indexed, which then drops off over time to 5% at best.
I have been testing a crap ton of indexing services to try to find a solution, but none are working.
I'm not sure if Google have released another indexing algo patch like they did around this time last year, but it seems both phases of indexing links are now much stricter: the initial phase, where Google does its checks and decides whether the link gets indexed or not, and then the constant ongoing check on links already in their index, which seems to be kicking new SER contextuals out of the index like there's no tomorrow.
The strange thing is, existing pages on the exact same domains are indexed and holding steady, but new pages on those same domains show the effect I'm describing.
My initial thoughts are that Google have updated their index algo and are now able to kick spammed sites out of their index more easily. As I have said, I have been testing a fair few indexing services along with auto-spun, human-spun and manually spun content. I plan to use ReCaptcha solvers once Sven has investigated a few things I have reported with CB, to try to get a set of less-spammed domains to run my systems on, but other than that I'm at a loss.
Is anyone out there tracking the indexing rates of contextuals made by SER this month and able to share them, and shed some light on the methods they use to index?
Comments
I hadn't thought of "building a list of verified domains that are protected by ReCaptcha, kCaptcha and Mollom", but it makes sense based on what I think is probably causing these indexing problems: restrictions on how many pages of a domain Google will index based on some quality metric, and/or restrictions that kick in if a bunch of new pages start appearing all at once.
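Outside of SER itself, something like this rough Python sketch could take a first pass at filtering a verified list down to captcha-protected domains. The HTML markers and file names are just assumptions for illustration, not how SER's engines actually detect captchas:

```python
# Rough sketch: sort a verified URL list by the captcha protecting each page,
# to build a sub-list of less-spammed targets. The HTML markers and file names
# below are assumptions for illustration only.
import requests

CAPTCHA_MARKERS = {
    "recaptcha": ("google.com/recaptcha", "g-recaptcha"),
    "kcaptcha": ("kcaptcha",),
    "mollom": ("mollom",),
}

def detect_captcha(url):
    """Fetch the page and return the name of the first captcha marker found."""
    try:
        html = requests.get(url, timeout=15).text.lower()
    except requests.RequestException:
        return None
    for name, markers in CAPTCHA_MARKERS.items():
        if any(marker in html for marker in markers):
            return name
    return None

with open("verified_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Keep only URLs protected by one of the harder captchas.
protected = [u for u in urls if detect_captcha(u)]
with open("protected_targets.txt", "w") as f:
    f.write("\n".join(protected))
```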
As for creating human-readable content, I bet you could get away with doing 1,000-word articles where about 200-400 words are heavily spun and the remaining 600-800 are duplicate content. Stick 100-200 spun words at the beginning and end, the duplicate stuff in the middle, and I think you'll be good.
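Roughly what I mean, as a toy sketch (the spintax resolver is deliberately minimal and the strings are placeholders you'd swap for real content):

```python
# Toy sketch of the structure above: spun intro + duplicate middle + spun outro.
# resolve_spintax() is deliberately minimal; the strings are placeholders.
import random
import re

def resolve_spintax(text):
    """Replace each {a|b|c} group with one randomly chosen option, innermost first."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while pattern.search(text):
        text = pattern.sub(lambda m: random.choice(m.group(1).split("|")), text)
    return text

def build_article(spun_intro, duplicate_body, spun_outro):
    """100-200 spun words at each end, 600-800 duplicate words in the middle."""
    return "\n\n".join([
        resolve_spintax(spun_intro),
        duplicate_body,
        resolve_spintax(spun_outro),
    ])

# Each call yields a different intro/outro around the same reused body.
print(build_article(
    "{Lately|Recently|Over the past month}, indexing has been {rough|unpredictable}.",
    "...600-800 words of reused body content...",
    "{Thanks for reading|Hope this helps}. {More soon|Back with results later}.",
))
```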
So building tier 2 links no longer keeps the tier 1 contextuals indexed?
You're saying they initially get indexed and Google later de-indexes those pages?
I'm thinking of posting to all my tier 1 contextual target links and checking their index status after a week.
Then I'd keep only the domains still indexed and delete the rest, repeating until I build up a decent list.
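Something like this is what I have in mind for the check-and-prune step. Just a rough sketch: is_indexed() is a naive site: query that Google will throttle at volume (you'd swap in proxies or a proper index checker), and the file name is a placeholder:

```python
# Rough sketch of the check-and-prune step. is_indexed() is a naive site:
# query that Google will throttle at volume; swap in proxies or a proper
# index checker for real runs. The file name is a placeholder.
import time
import requests
from urllib.parse import urlparse

def is_indexed(url):
    """Return True if a site:URL search brings the URL back in the results."""
    r = requests.get(
        "https://www.google.com/search",
        params={"q": f"site:{url}"},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
    return url in r.text  # naive match; good enough for a small sample

def prune_targets(urls):
    keep, drop = set(), set()
    for url in urls:
        domain = urlparse(url).netloc
        (keep if is_indexed(url) else drop).add(domain)
        time.sleep(5)  # be gentle or Google starts serving captchas
    # A domain survives only if none of its test posts were de-indexed.
    return keep - drop

with open("tier1_verified.txt") as f:  # one verified URL per line
    urls = [line.strip() for line in f if line.strip()]

for domain in sorted(prune_targets(urls)):
    print(domain)
```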
@710fla - I like this idea. Might be a good idea to make multiple posts per domain as well.
Interesting stuff @shaun and @JudderMan. Appreciate the sharing.