Why Contextual Verification Keeps Getting Worse - by @ron
I suspected something was wrong with the state of the union on Contextual targets. I wasn't able to put my finger on any one issue. However, a number of issues kept running through my head:
- Contextual Targets are getting harder to post to because of script changes
- Site owners were approving fewer links
- Captcha was getting tougher
- Or just blame @sven (because that is everybody's favorite)
I had a theory, so I ran a very simple test. I took 2 websites:
Website 1 is heavily spammed and penalized - I dropped a ton (thousands and thousands) of contextual links with its URLs (including inner pages) on various contextual sites.
In Website 1 (the penalized and for all intents and purposes dead website) I cleaned everything - history, submitted, verified, URL cache - and stuck in 10 new emails in all projects. It was completely penalized, so I didn't give a rat's ass that I was deleting the history. I literally started with Zeros in all columns - just like a brand new website.
Website 2 is a new virgin website with no GSA-SER links or links of any kind. I stuck in 10 emails, and started with all Zeros as no links were built to this website ever.
Note: For all intents and purposes, I had two seemingly identical websites with no link history in GSA-SER, and I was starting from scratch. The only difference was that Website 1 had a track record, while Website 2 was as clean as snow.
Important: Then I fed all T1 Contextual projects - across both Website 1 and Website 2 - with the exact same URLs in which to make links. These websites have no tiered structures, just one Contextual T1 project per page of the website.
And guess what I found? The brand new website had 5 X the number of verified contextual links as the old battered website. You will see below the contextual verified links circled for each page of the respective website:
This tells me, beyond a reasonable doubt, that as a website makes more links, more contextual links will get rejected. I am not an authority on exactly why, however, I do have some opinions based on being in this business for 16 years (yes, since 1998, argggh):
- On a lot of websites, you only get one chance to make a link. Not all websites, but many websites. Just like that bug called a cicada, you only get one chance to mate, and then you die.
- Once your website goes on the spam list, that list gets passed to other websites, and your target acquisition goes down exponentially.
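A minimal sketch of what that hypothesized blacklist mechanism might look like from the receiving site's side (everything here - the blacklist variable, the function name, the example domains - is made up for illustration; no real platform's code is being quoted):

```python
# Hypothetical sketch: a contextual site rejecting submissions whose
# outbound links point at a domain on a shared spam blacklist.
from urllib.parse import urlparse

# Imagined as a list passed around between sites or pulled from a
# shared anti-spam service. Domains are illustrative.
SHARED_BLACKLIST = {"battered-site.com", "spammy-example.net"}

def accept_submission(outbound_links):
    """Reject the whole submission if any outbound link's domain is
    on the shared blacklist; otherwise accept it."""
    for url in outbound_links:
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        if domain in SHARED_BLACKLIST:
            return False  # one flagged domain kills the submission
    return True

print(accept_submission(["http://fresh-site.com/page"]))         # True
print(accept_submission(["http://www.battered-site.com/inner"]))  # False
```

If this is roughly how the sites behave, it would explain why inner pages get hit too: the check is against the domain, not the individual URL.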
If you feel very strongly about the value of contextual targets (like I do and most others here), quit using tiered structures where you pummel the living snot out of your T1 with a ton of T2 and T3 contextual tiers.
That way, if you are smart, you save the prize (the contextuals) for an orderly drip of new contextual links on your T1, instead of spamming the living crap **with the same domain** across all the other contextual tiers. In my opinion, doing so hurts and limits your ability to use them on your T1 - which is where you really need them.
I hope you all understand what I am saying. If you truly value your contextual targets - and you buy a list - or you use SER to scrape them - Use them wisely.
I believe we have now reached a level of anti-spam measures that will drastically erode your ability to use contextual targets like they are free and grow on trees. The more you spam contextual targets, the less you will be able to use them ever again - for that domain where you are making links - including any and all inner pages.
I'm not telling you how to set up tiers, or how to do SEO. But there is one thing I will tell you. The SERLists.com team and I test a lot of stuff. When we see something this black and white, where we actually control all the variables except one, it is difficult to dispute the evidence.
I hope this all helps you. And if you haven't signed up yet for Advanced SER Tips (I know, shameless plug), you might want to give it a shot. When we update the next issue, we will have some cool reading material in there - and all back issues with links.
I have to get back to the lab. You guys discuss and critique as you feel appropriate.
p.s. Please don't PM me on this - that is not playing fair - keep it here in the thread. Thank you!
So, from that point of view, what about using URL shorteners on T1 and contextuals on T2?
So, let me ask you something: You came to the conclusion that niche related/contextual sites seem to share some sort of spam list. Wouldn't it be possible to approach this problem by doing either one of the following:
1. Build a high quality T1 and blast it with non-contextual links
2. Build a high quality campaign on all tiers (just slightly lower the filter once you go down 1 tier) and spam those tiers with secondary links from the side?
I'm quite new to working with SER and tiered links, so my ideas may sound foolish to you, but I thought I'd ask anyway.
If people were to only use contextuals for tier 1 and use additional platforms for tiers 2 and 3, that would suffice - but only until the next update, when all that "fake spam juice" gets deindexed. Those links have no value; Google simply shits out the crap on its updates, and people think their sites are penalised when in reality they just built their links with crap.
The only solution I personally see here is to carry on finding new link sources (contextuals), which means scraping... or of course, purchasing your lists and subscribing for your updates, which would be easier to do.
@XXXX - I haven't thought through the ramifications for strategy yet. I wrote that up pretty late last night, and need time to digest it myself. I just provided my initial gut reactions, not necessarily any battle plan.
@spammasta - Thank you mate!
@Tixxpff - Like I was saying, I really haven't reached any conclusions yet. I just know intuitively that pounding contextuals on the same project URL diminishes your ability to post well on that project going forward. I'm still thinking about it all, lol.
@Tim89 - I agree, it is pretty hard to just abandon contextuals on lower tiers. I think if we just understand that when you pound these things, you become exponentially less able to get them verified. So instead of everyone complaining (and trying to pin it on someone like Sven, lol), just understand that the game is changing quickly.
Back around 5 years ago I was spamming blog comments at a ridiculous rate with Scrapebox. One of the things I noticed is that my domain was being rejected at many targets. There was definitely a blacklist going around. I couldn't even get manual submissions accepted. It was very frustrating. I honestly believe this is in the same camp.
And I believe you are right. Lists are becoming more important in being able to play the game.
1) Did you create new fresh GSA projects for the penalized website, or did you just clear the history of the existing projects? I would run this test again (penalized site vs new fresh domain) using fresh GSA projects on both; otherwise you have a variable that is not the same between them.
2) What about when you automatically insert co-citation links? Does this mean that anytime you use a co-citation link to one of the big authority websites (Wikipedia, YouTube, etc.) you are more likely to be rejected by the contextual target, since chances are those same authority domains have been linked to already?
something doesn't quite make sense here...
Also, I would duplicate the test a few times to confirm it; in the past I have seen oddball GSA projects that get a very low contextual verified count, then I create a new project for the same domain and everything works better and verification is higher. Just something I have seen a few times.
Keep us updated.
@dr0ne - 1) Lol, you read my mind. I won't get into the specifics right now, but if your idea makes a difference, then that means some bigger problems for everybody. I have a theory that goes back to something that happened 5 weeks ago in SER...
2) Never thought about that. That is interesting.
Anyway, I like how you think.
Thanks for buying the list.
Btw, after reading all this, does that mean that your initial concerns (at the beginning of the post) were unfounded? I'm not sure if I read you correctly?