Do more challenging CAPTCHA types correlate with "better" targets?

Deeeeeeee the Americas
I love CAPTCHAs. Why? They're interesting, and often interesting to behold.

But do more challenging CAPTCHAs correlate with better sites? I would guess yes, since if someone really cares about their site, they'd change the default settings in their captcha plug-in or use something they've found or created themselves.

So is it true? I don't have nearly enough knowledge about which captchas go with which specific target types and engines, or about which targets statistically tend to have better trust, rank, higher SERPs, etc.

Comments

  • shaun http://shaunmarrs.com/ - The Ultimate Resource For Free GSA Tool Tutorials!
    I published this a few months back when the manual captcha settings were added to the tool and planned to do a case study on it, but it just costs too much.

    IMO, if an OCR captcha tool can break the captcha, then there are going to be a ton of submissions to the domain, increasing the chance the domain will exceed its bandwidth and go offline, or get rolled back by the admin, so Google sees your link and then sees it disappear along with a whole bunch of other links.

    Using a manually solved captcha gets you through stuff like ReCaptcha V2, meaning there will be fewer submissions to the domain, but the problem shifts from your submission getting verified and then going offline for the reasons above to getting your submission verified at all.

    Although I don't remember the exact details of what I did when I was testing this, I am pretty sure I had a bunch of projects, and within those, one had auto-generated content and one had a human-written article, and I just used general scraping/link extraction rather than niche-based keywords.

    I don't remember if the auto-generated content managed to get any links verified at all, but if it did, it was definitely fewer than the human-written article got. That led me to think that submissions are held for human moderation before being published and giving you your verified link.

    I can remember duplicating the human-article project at a few stages to reverify the links while leaving the original project to post, and the project still had link loss. I'm guessing it was down to the admin changing their mind and deleting the post, or a follow-up verification email being sent, or something.

    If the domains are being human-moderated, then your content would also probably have to match the niche of the website. I never tried niche-based scrapes, so I don't know if these would get you better verified counts, but IMO it wasn't worth it when you can just build a bridge network and have 100% control over your links, or go out and bribe a webmaster for a guest post or paid link.
  • Deeeeeeee the Americas
    @shaun: "published this a few months back when the manual captcha settings were added" to GSA SER.

    I absolutely am thankful for having these additional project-level captcha solving options. Great update, @Sven!

    "...increasing the chance the domain will exceed its bandwidth and go offline, get rolled back by the admin so Google see your link then see it disappear along with a whole bunch of other links..."

    That's the mechanism behind the process so many refer to when they say that links will "go up and down" on a site. I think many people know OF this phenomenon, but don't know why it happens.

    "...problem shifts from your submission getting verified and then going offline..."

    Interesting and insightful observation.  I guess re-verifying links a few rounds is a great idea for all projects, then?

    As always, thanks for your research at this level. Really, on here and BHW, I find nothing that compares to your studies, man!~ Keep up the cool experiments at the lab!!

    "... you can just build a bridge network and have 100% control..."

    True! I hate writing or paying for new content, just to place it somewhere I have zero control over.

  • Deeeeeeee the Americas
    Had to post this; I'm sure most have seen it already, but it was new to me, tho:



  • shaun http://shaunmarrs.com/ - The Ultimate Resource For Free GSA Tool Tutorials!
    Cheers man, I plan to start putting out content again in 2018, but probably on my YouTube channel, with a summarised version of the videos being put on my blog.

    " I guess re-verifying links a few rounds is a great idea for all projects, then?"

    Without a doubt, especially if you are building links to them. There's no point not reverifying links while you are wasting resources building links to them, only to find out the link is dead and everything you built to it is wasted.

    I really think bridge networks are the way forward for automation; they are quick, easy and cheap to set up and definitely have an effect. As you control the domain, you can use something like SER or Scrapebox to post to it for you while using human-written articles, as you know they won't be deleted.

    The below money page is targeting a low-competition keyword on a domain that is eight months old now, but I decided to post content on every bridge network domain I have and link to it to see what happened, and it got to the first page of Google after around two months. This is the quickest I have managed to get something to the first page in a long time.


  • Deeeeeeee the Americas
    "...it got to the first page of Google after around two months..."

    Well, results don't lie. :o I know you see what's needed to adapt in this new environment.  (I have to read your many articles again! I'm sure I've missed many. Valuable, definitely)

    The result of AI is that SEO is (or will be) more difficult and requires much more precision. No longer throwing darts at a wall-sized bullseye. :/

    From all I see myself, and from all you've written as well, we will be putting out more valuable content where it matters most, so content value and placement are at a premium now.

    So with your Bridge Network concept, you achieve control over both, no?

    "Without a doubt, especially if you are building links to them. There's no point not reverifying links while you are wasting resources building links to them, only to find out the link is dead and everything you built to it is wasted."

    I can see that as I scale up, dead links are going to waste resources: processor time, captcha solver fees, tied-up proxies, links built that serve zero purpose, and finally time I could spend doing productive SER runs.
  • redrays Las Vegas
    "I never tried niche-based scrapes, so I don't know if these would get you better verified counts, but IMO it wasn't worth it when you can just build a bridge network and have 100% control over your links, or go out and bribe a webmaster for a guest post or paid link."

    This. A friend of mine tested @shaun's theory about posting to better sites with SER and human captcha solving. His cost per link was just outrageously high, and the quality of the targets was nothing special. It was cheaper and much more effective to just go buy links.