
Comments

  • HinkysHinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    Awesome read! Didn't think anyone managed to build something that breaks those image captchas automatically, let alone with that accuracy. Too bad they told Google instead of releasing a product. ;)

    Now the 70+% method is impressive, but considering they were using the Clarifai API, it would get crazy expensive at scale. However, that offline method they were using seems much more interesting. 40% using only an open-source library, without any APIs, paid or free? Hell yeah.

    If you're thinking about starting a new product, you might want to explore this option further, because if you build it, you're going to have a metric shit ton of customers. (That is, if SER can even work with those images; not to mention it would first have to be able to click that checkbox reCAPTCHA to even get to them.)
  • SvenSven www.GSA-Online.de
    I will have to study this more next work week. However, I just quickly read the abstract and it sounds interesting, so I posted it here. Glad you liked it.
  • HinkysHinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    Yeah, you should. If nothing else, just go over it once; I found it fascinating. Not the easiest read, but fascinating nonetheless.
  • spunko2010spunko2010 Isle of Man
    This part was interesting to me:

    "The same happens if the browser and engine versions are up-to-date, but don't correspond to the actual environment of the experiment (e.g., if we use Firefox but report Chrome). Fallback captchas are also returned when the User-Agent is mis-formated"

    @Sven does SER spoof any user-agents? If so, which and can they be changed?
  • @spunko2010

    Good question. I wonder if certain user agents hold higher value in the search engines. Maybe having a nice, averaged-out ratio of user agents would show more authenticity than having, say, 100% of your links coming from a specific user agent. Not sure though.


  • SvenSven www.GSA-Online.de
    @spunko2010 SER uses different user agents, but makes sure not to mix different agents per outgoing IP, so that any one site sees just one user agent per session.

    I also think this doesn't make much of a difference. I use Firefox with a user-agent spoofing plugin and I don't have issues with Google reCAPTCHA... it almost always shows me the easy one to solve in the browser.
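
Sven's "one user agent per outgoing IP per session" behavior can be sketched roughly like this. This is a hypothetical illustration, not SER's actual code; the class name, pool, and IPs are all made up:

```python
import random

# Hypothetical pool of user-agent strings; SER ships its own list.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Example/1.0",
    "Mozilla/5.0 (X11; Linux x86_64) Example/2.0",
]

class StickyUserAgents:
    """Assign a random user agent to each outgoing IP and keep it for the
    whole session, so a target site only ever sees one agent per IP."""

    def __init__(self, pool):
        self.pool = pool
        self.by_ip = {}

    def agent_for(self, ip):
        # Reuse the agent already assigned to this IP, or pick a new one.
        if ip not in self.by_ip:
            self.by_ip[ip] = random.choice(self.pool)
        return self.by_ip[ip]

ua = StickyUserAgents(USER_AGENTS)
first = ua.agent_for("203.0.113.7")
```

Repeated calls with the same IP always return `first`, while a different IP may get a different agent.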
  • spunko2010spunko2010 Isle of Man
    @Sven ok. And do you update them often, or can we update/edit the UAs? If not perhaps you can allow such a function in future? Reason I ask is because I would like to test out a few things that they referred to (kind of) with 'canvas rendering'. Search engines must be using quite extensive footprints, and I want to test out faking different ones.
  • SvenSven www.GSA-Online.de
    You can update them; they are in the file user_agents.dat in the installation folder.
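
If you want to script your own edits to that file, something like the following works, assuming the format is plain text with one user-agent string per line (an assumption — verify against your own copy; the path and agent strings here are placeholders):

```python
from pathlib import Path

# Placeholder path; on a real install this sits in the SER folder.
ua_file = Path("user_agents.dat")

# Assumed format: one user-agent string per line.
ua_file.write_text("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Example/1.0\n")

# Append a custom agent, then reload the list.
with ua_file.open("a") as f:
    f.write("Mozilla/5.0 (X11; Linux x86_64) Example/2.0\n")

agents = [line for line in ua_file.read_text().splitlines() if line.strip()]
print(len(agents))  # 2
```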
  • andrzejekandrzejek Polska
    edited April 2016
    https://panopticlick.eff.org/

    Your browser fingerprint appears to be unique among the 136,192 tested so far.

  • SvenSven www.GSA-Online.de
    That's just because you allow JavaScript, which usually submits a lot more than you want, like screen size, desktop size, language, region...
    I myself have JavaScript off for all sites except those I trust.

    https://noscript.net/
  • spunko2010spunko2010 Isle of Man
    Anyone know if there is a plaintext list of UAs available (free or $$$)?

    Like this but more extensive: http://www.useragentstring.com/pages/Browserlist/
  • SvenSven www.GSA-Online.de
    Yes, but you don't have any statistics on how often a certain agent is used and in what time frame. Of course, an update to Firefox will mean that a currently common agent won't be used tomorrow as often as it is today.
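
If you did have usage statistics, e.g. from your own analytics, you could sample agents in proportion to their observed share instead of uniformly. A minimal sketch, with entirely made-up shares (real ones shift every time a browser auto-updates):

```python
import random

# Hypothetical usage shares; replace with figures from your own traffic.
UA_SHARES = {
    "Mozilla/5.0 ... Chrome/49.0 ...": 0.55,
    "Mozilla/5.0 ... Firefox/45.0 ...": 0.30,
    "Mozilla/5.0 ... Safari/9.1 ...": 0.15,
}

def pick_agent(shares, rng=random):
    """Sample one user agent with probability proportional to its share."""
    agents = list(shares)
    weights = [shares[a] for a in agents]
    return rng.choices(agents, weights=weights, k=1)[0]

counts = {a: 0 for a in UA_SHARES}
for _ in range(10_000):
    counts[pick_agent(UA_SHARES)] += 1
# The sampled distribution roughly tracks the configured shares.
```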
  • spunko2010spunko2010 Isle of Man
    Well, Clicky tells me this information for my actual site visitors, so I could use that, I guess.