
Why Contextual Verification Keeps Getting Worse - by @ron

ron SERLists.com
edited April 2014 in GSA Search Engine Ranker

I suspected something was wrong with the state of the union on Contextual targets. I wasn't able to put my finger on any one issue. However, a number of issues kept running through my head:

  • Contextual Targets are getting harder to post to because of script changes
  • Site owners were approving fewer links
  • Captcha was getting tougher
  • Or just blame @sven (because that is everybody's favorite)

I had a theory, so I ran a very simple test. I took 2 websites:

Website 1

Website 1 is heavily spammed and penalized - I dropped a ton (thousands and thousands) of contextual links pointing at its URLs (including inner pages) on various contextual sites.

In Website 1 (the penalized and, for all intents and purposes, dead website) I cleaned everything - history, submitted, verified, URL cache - and stuck 10 new emails into all projects. It was completely penalized, so I didn't give a rat's ass about deleting the history. I literally started with zeros in all columns - just like a brand new website.

Website 2

Website 2 is a new, virgin website with no GSA-SER links or links of any kind. I stuck in 10 emails and started with all zeros, as no links had ever been built to this website.

The Analysis

Note: For all intents and purposes, I had two seemingly identical websites with no link history in GSA-SER, and I was starting from scratch. The only difference was that Website 1 had a track record, while Website 2 was as clean as snow.

Important: Then I fed all T1 Contextual projects - across both Website 1 and Website 2 - with the exact same target URLs on which to make links. These websites have no tiered structures, just one Contextual T1 project per page of the website.

And guess what I found? The brand new website had 5X the number of verified contextual links of the old battered website. Below you can see the verified contextual links circled for each page of the respective websites:

[image: verified contextual link counts for each page of the two websites]

This tells me, beyond a reasonable doubt, that the more links a website builds, the more of its contextual links will get rejected. I am not an authority on exactly why; however, I do have some opinions based on being in this business for 16 years (yes, since 1998, argggh):

  1. On a lot of websites, you only get one chance to make a link. Not all websites, but many. Just like that bug called a cicada: you only get one chance to mate, and then you die.
  2. Your website goes on the spam list, it gets passed to other websites, and then your target acquisition goes down exponentially.

Possible Solutions:

If you feel very strongly about the value of contextual targets (like I do and most others here), quit using tiered structures where you pummel the living snot out of your T1 with a ton of T2 and T3 contextual tiers.

That way, if you are smart, you are saving the prize (the contextuals) for an orderly drip of new contextual links on your T1, instead of spamming the living crap out of contextual targets **with the same domain** on all the other tiers. In my opinion, doing so hurts and limits your ability to use them on your T1 - which is where you really need them.

I hope you all understand what I am saying. If you truly value your contextual targets - whether you buy a list or use SER to scrape them - use them wisely.

I believe we have now reached a level of anti-spam measures that will drastically erode your ability to use contextual targets like they are free and grow on trees. The more you spam contextual targets, the less you will be able to use them ever again - for that domain where you are making links - including any and all inner pages.

I'm not telling you how to set up tiers, or how to do SEO. But there is one thing I will tell you. The SERLists.com team and I test a lot of stuff. When we see something that is this black and white, where we actually control all the variables except one, it is difficult to dispute the evidence.

I hope this all helps you. And if you haven't signed up yet for Advanced SER Tips (I know, shameless plug), you might want to give it a shot. The next issue will have some cool reading material in there - and all back issues with links.

I have to get back to the lab. You guys discuss and critique as you feel appropriate.

Ron

p.s. Please don't PM me on this - that is not playing fair - keep it here in the thread. Thank you!

:)


Comments

  • Hi @Ron, excellent - always excellent =D>!

    So, from that point of view, what about using URL shorteners on T1 and contextuals on T2?
  • Post of the century.
  • @ron That was a great read. I simply love the effort and time you put into stuff like this to provide us, the community, with such valuable information.

    So, let me ask you something: You came to the conclusion that niche-related/contextual sites seem to share some sort of spam list. Wouldn't it be possible to approach this problem by doing either one of the following:

    1. Build a high quality T1 and blast it with non-contextual links
    2. Build a high quality campaign on all tiers (just slightly lower the filter once you go down 1 tier) and spam those tiers with secondary links from the side?

    I'm quite new to working with SER  and tiered links, so my ideas may sound foolish to you, but I thought I'd ask anyway.
  • Tim89 www.expressindexer.solutions
    edited April 2014
    Good analysis. However, most of us who do use contextual platforms use them to avoid being penalised; contextual sources are going to yield 10x the value of, for instance, a forum profile link. So if you're saying to avoid recycling your contextual link types for tiers 2 and 3, what do you suggest we use to power up these contextual tier 1s that would hopefully be future-proof - and by that I mean avoid massive drops or penalties?

    If people were to use contextuals only for tier 1 and then use additional platforms for tiers 2 and 3, that would suffice - but only until the next update, when all that "fake spam juice" gets deindexed. Those links are not of value, and Google simply shits out the crap in its updates... and people think their sites are penalised when in reality they just built their links using crap.

    The only solution I personally see here is to carry on finding new link sources (contextuals), which means scraping... or of course, purchasing your lists and subscribing to your updates, which would be easier to do. :)
  • @Ron showed some real insight. Could qualify for post of the year.

    I saw @Ron's post, and it really validated some things that I was thinking for a long time (but did absolutely nothing about). As everyone knows, we sell verified lists, and we sometimes get to hear from folks that are struggling building links. We test our lists extensively before we release them, and the vast majority of our customers report great results.

    So why do some people struggle? 

    @Ron's discovery explains an awful lot. The site that struggled building links was likely blacklisted by webspam services.

    The SERLists team (@Ron, @2Take2, @Gooner, and @Satans_Apprentice) will be convening an emergency pow-wow to study the data, and create a new best practices guide for avoiding these landmines.
  • ron SERLists.com

    @XXXX - I haven't thought through the ramifications for strategy yet. I wrote that up pretty late last night, and need time to digest it myself. I just provided my initial gut reactions, not necessarily any battle plan.

    @spammasta - Thank you mate!

    @Tixxpff - Like I was saying, I really haven't reached any conclusions yet. I just know intuitively that pounding contextuals on the same project URL diminishes your ability to post well on that project going forward. I'm still thinking about it all, lol.

    @Tim89 - I agree, it is pretty hard to just abandon contextuals on lower tiers. I think we just need to understand that when you pound these things, you become exponentially less able to get them verified. So instead of everyone complaining (and trying to pin it on someone like Sven, lol), just understand that the game is changing quickly.

    Back around 5 years ago I was spamming blog comments at a ridiculous rate with Scrapebox. One of the things I noticed is that my domain was being rejected at many targets. There was definitely a blacklist going around. I couldn't even get manual submissions accepted. It was very frustrating. I honestly believe this is in the same camp.

    And I believe you are right. Lists are becoming more important in being able to play the game. 

  • @ron - if there are any problems, then buy a new domain for $2-3 and 301 it to your target URL. Problem solved, at least temporarily.
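
    If you go that route, a quick sanity check that the cheap domain really answers with a 301 might look like this (a rough Python sketch - the domain below is just a placeholder; http.client is used because it does not follow redirects, so the 301 status and its Location header stay visible):

        import http.client
        from urllib.parse import urlparse

        def check_redirect(old_url: str) -> None:
            """Print the status code and Location header the throwaway domain answers with."""
            parts = urlparse(old_url)
            conn = http.client.HTTPConnection(parts.netloc, timeout=10)
            conn.request("HEAD", parts.path or "/")
            resp = conn.getresponse()
            # Expect 301 and a Location header pointing at the money site.
            print(old_url, "->", resp.status, resp.getheader("Location"))
            conn.close()

        check_redirect("http://cheap-throwaway-domain.example/")  # hypothetical domain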
  • ron SERLists.com
    @RayBan - Yeah, I know you can get the link juice funneling with a nice new domain, especially a nice cheap .info, right?

    But it's just like a dog that keeps chasing its tail. It bothers me that they are getting smarter about it. It just creates more hoops to jump through. Why does effective spamming just keep getting harder... 
    :-?
  • edited April 2014
    A couple of thoughts:

    1) Did you create new, fresh GSA projects for the penalized website, or did you just clear the history of the existing projects? I would run this test again (penalized site vs. new fresh domain) using fresh GSA projects on both; otherwise you have a variable that is not the same between them.

    2) What about when you automatically insert co-citation links? Does this mean that any time you use a co-citation link to one of the authority websites (Wikipedia, YouTube, etc.), you are more likely to be rejected by the contextual target, since chances are those same authority domains have been linked to already?

    Something doesn't quite make sense here...

    Also, I would repeat the test a few times to confirm it. In the past I have seen oddball GSA projects that get a very low contextual verification count; then I create a new project for the same domain and everything works better and verification is higher. Just something I have seen a few times.

    Keep us updated.
  • Personally, I stopped using article engines in lower tiers like a hundred years ago. Great thread.
  • ron SERLists.com

    @dr0ne - 1) Lol, you read my mind. I won't get into the specifics right now, but if your idea makes a difference, then that means some bigger problems for everybody. I have a theory that goes back to something that happened 5 weeks ago in SER... 

    2) Never thought about that. That is interesting.

    Anyway, I like how you think  ;)

  • edited April 2014
    @dr0ne @ron - re 1): you wouldn't even have to run the project; you can just compare the text files the project is saved in - they are in AppData/Roaming/GSA SER/projects (see the rough sketch below). If the files are identical, there is not much SER will do differently. As far as I understand, SER has no memory except for the text files, right @sven?
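
    A minimal sketch of such a comparison (Python; it assumes the folder path mentioned above and that each project is a set of plain-text files sharing the project name - the project names below are placeholders, so adjust both to your install):

        import difflib
        from pathlib import Path

        # Assumed location of the project files (exact folder name can differ per install).
        PROJECTS = Path.home() / "AppData" / "Roaming" / "GSA SER" / "projects"

        def diff_projects(old_name: str, new_name: str) -> None:
            """Diff every file extension the two projects have in common."""
            old_files = {p.suffix: p for p in PROJECTS.glob(old_name + ".*")}
            new_files = {p.suffix: p for p in PROJECTS.glob(new_name + ".*")}
            for ext in sorted(old_files.keys() & new_files.keys()):
                old_lines = old_files[ext].read_text(errors="ignore").splitlines()
                new_lines = new_files[ext].read_text(errors="ignore").splitlines()
                diff = list(difflib.unified_diff(old_lines, new_lines,
                                                 fromfile=old_files[ext].name,
                                                 tofile=new_files[ext].name,
                                                 lineterm=""))
                status = "identical" if not diff else f"{len(diff)} differing lines"
                print(f"*{ext}: {status}")

        diff_projects("OldProject", "FreshProject")  # hypothetical project names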
  • Sven www.GSA-Online.de
    yes
  • ron SERLists.com
    edited April 2014
    Guys, this is a huge deal. Huge.

    I saw a number of projects weren't making many links. I will get into the theory in a minute. But the results are very obvious.

    I took a contextual project that you see at the very top post - the one that says "Contextual" in blue next to it. It was hardly making any contextual links. Not to rehash, but I thought the problem was over-spamming the URL. I was dead wrong. Something got corrupted with the project files.

    Test On Old Contextual Project vs. New Contextual Project
      
    1) I ran the "problem contextual project" for 3 days trying to make contextual links using our SERLists Blue List. You will see the results below. P.S. I hadn't even touched this project in a year except to change emails.

    2) Next, I made a brand new project (no clone - brand new). I used the same URL, a new spin, 10 new emails, same exact anchors - and I ran our SERLists Blue List for only 30 minutes.

    3) The third project is brand new, fake URL, a new spin, same contextual engines, 10 new emails - and I ran the SERLists Blue List for 30 minutes.

    Look at the results:

    [image: verified link counts for the three test projects]

    In 30 minutes I made almost 3X the links that I did in 3 days!!!

    Here's my theory - take it or leave it. 6 weeks ago or so, I updated SER. I received an error message from SER that said my "html files may have been corrupted" or something to that effect.

    The SER Version History log even says:

    [image: excerpt from the SER version history log]

    Guys - as far as I can tell, this is 100% the issue. There is no other explanation. It happened exactly at this time. Verifications plunged on one of my servers. LPM tanked. Nothing was ever the same.

    I am now in the ugly process of remaking a lot of projects. It may take a day, but as you can clearly see, the results are worth it!

    UPDATE: All I did was update the spin article (everything: title, body) on the old project and run the SERLists Blue List on it for just 5 minutes - check out the results from running just that one contextual project:

    [image: verified link count after updating the article on the old project]

    In 10 minutes I made 2X more links than in 3 days!

    Moral of the story: change the article - in all projects where there is an article - if the project is older than 6 weeks.

    Ron
  • ron SERLists.com
    edited April 2014
    NEW UPDATE:

    Please update to v8.23. I think @Sven just changed the entire ballgame. I didn't even think about the fact that I had upgraded to the new version. So many variables...

    225 LPM. One contextual project running. Blue List. 90% verification rate. Ridiculous!! hahaha

    Good times are back again! :D

    p.s. - I tested leaving the project articles alone. No issues. Just one big-ass major improvement. I'm saving both SER folders and this version in a warehouse with armed guards.
  • It sure is a great find, Ron, and I highly appreciate you sharing this with all of us. Once again, a great contribution to this community through good testing. I'll be sure to try it on all my active projects.

    However, I have to wonder whether this would only have affected those who use HTML variations in their articles?
  • ron SERLists.com
    I don't think you have to change anything. Just update. Sven changed something.

    I just found another crazy thing.

    If I run all projects at once, I have great performance. If I use the scheduler, my speed cuts by 70%. Absolutely no idea why.

    I never thought I could run 150 projects at once without using a scheduler - never used to be able to do that in the old days.

    And it's a lot faster if you just run 1-20 projects by themselves as a group (no scheduler) as opposed to 150 at once.

    Damn, you just have to test everything. You think one thing is isolated, and then another thing pops up. Well anyway, I think Sven made a strong improvement. Verification is now 90% on contextuals, and very fast.

    If you want to wet your pants, just run like 5 contextual projects with a list. Mind blowing.
  • edited April 2014
    @ron do you get memory errors? 
    With that amount of projects I always get memory issues, even though SER shows memory usage below 2GB and CPU below 50%.

    How many threads are you now running?
  • edited April 2014
    @ron

    Lol. Upgraded based on your advice. Trying contextuals as a test without any scheduler. 64% verified ratio (that's huge for me too, never had above 10% verified!). Running through the identified list. Btw, I also bought your list today, it's nice. :)

    XpressEngine is the major topper, though, when running through the identified list. One thing I was always confused about: what exactly are contextual engines? I guess all Articles, Social Bookmarks and Social Networks?

    And @Sven, you rock - like Ron, I too will be safeguarding this version very well.

    Cheers.
  • gooner SERLists.com
    @pratik - People generally mean articles, social networks and wikis when they say contextuals.
    Thanks for buying the list :)
  • ron SERLists.com
    edited April 2014
    @meph - Zero memory errors. 300 threads. I'm only running tests on the VPS - our servers are way more powerful.

    @Pratik - You will be loving the list, that's all I can say. Hang on to your underwear, lol. Yeah, my verified is 90+%, just like it was 5 weeks ago. Some servers were affected, others weren't - hard to figure this stuff out. I personally never bother with bookmarks anymore as they have gotten too tough. I'll leave them checked when running a list, but I would never try to use SER to scrape for them - way too hard doing it that way + loss of link production when SER could be doing more efficient things. 
  • @gooner Thanks. And my pleasure haha.

    @ron Are you just running your own verified list, or the identified list? Because I didn't have much success with just the identified list (although it was bigger than I had ever seen in my life, even with my verified list, lol - somewhere around 30-40%).

    Thanks.
  • Nice post, I'm running your Blue List for 2 projects only, started today and so far the success rate is very good :) I'll let it run for a while to see what happens.
  • ron SERLists.com
    @Pratik - Dude, where have you been? Of course I'm using our list - the one we sell here, lol. I think you were gone on a long vacation my good man.

    ;)
  • ron SERLists.com
    edited April 2014
    @muxmkt - Thanks man! The lists are off the hook. Now that Sven has fixed things up, even more links. I'm getting over 90% verification. Just make sure you have v8.23.

    I'm just glad that things are back to where we started :D
  • Yes, I'm liking it :) around 90% here too.

    Actually I'm running the Red List and using a ReCaptcha solver here, but it isn't solving too much. Oh well, at least I'm getting links :D
  • ron SERLists.com
    edited April 2014
    :P

    You bet! Suck up those verified like a vacuum, haha
  • @ron Yeah, haha - I was caught up in work for the past couple of months. I mentioned I already bought it today. :)
  • Ron, thanks for your PM to me.

    Btw, after reading all this, does that mean that your initial concerns (at the beginning of the post) were unfounded? I'm not sure if I read you correctly?
  • Yep, BIG changes with today's update, haven't seen SER fly like this for a good while.