
Why Am I Getting Google Search Captcha?

edited September 2012 in Need Help
I use the PC that SER runs on for a lot of other things during the day. I have set SER to use proxies "Everywhere", but several times over the past couple of days, when I do a search on Google, I am presented with a captcha. To me, this indicates that Google thinks I am sending too many search queries. Nothing else running on this PC (other than SER) would be doing queries.

This leads me to several questions...

1.) Is there a bug in SER where proxies are not being used for search queries?

2.) If this captcha is coming up when I am doing manual searches, should I assume that it is also probably happening inside SER and these searches are failing (thus reducing my results)?

3.) If it is happening because of searches done in SER, I don't understand why. I am running six projects, 70 threads, 30 seconds between S/E queries, 20 proxies, and about 40 search engines (US, Canada, Australia, United Kingdom). Even if I were not using proxies for searches, it doesn't seem like SER would be sending Google enough requests, fast enough, for me to get blocked (see the rough math below).
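For what it's worth, here is the rough math behind that intuition. It is only a back-of-envelope sketch: the per-project scheduling and the guess that four of the 40 engines are Google variants are assumptions, not confirmed SER behavior.

```python
# Back-of-envelope estimate of the Google query rate described above.
projects = 6
wait_between_queries_s = 30    # "30 seconds between S/E queries"
engines_total = 40
google_engines = 4             # assumed: US/CA/AU/UK Google among the 40
proxies = 20

# Worst case: every project fires one search every 30 seconds.
queries_per_hour = projects * (3600 / wait_between_queries_s)                   # 720

# If queries rotate evenly across engines, only a fraction hit Google.
google_queries_per_hour = queries_per_hour * (google_engines / engines_total)  # 72

# Spread across 20 proxies, each proxy hits Google only a few times an hour.
per_proxy_per_hour = google_queries_per_hour / proxies                         # 3.6

print(f"total queries/hour:  {queries_per_hour:.0f}")
print(f"Google queries/hour: {google_queries_per_hour:.0f}")
print(f"per proxy per hour:  {per_proxy_per_hour:.1f}")
```

If those assumptions are anywhere near right, the per-proxy rate seems too low to trigger a block on its own, which is why a proxy leak (or proxies not being used at all) is worth ruling out first.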

Comments

  • OzzOzz
    edited September 2012
    Can't tell you if this is a bug. Maybe it's because you don't have "google-able" proxies anymore and GSA takes your IP as a fallback. Only Sven can answer that.

    But I have an easy solution for you: right-click the SEs -> uncheck all Google SEs by mask. You should do fine without Google and get better results anyway, as many proxies are banned by Google only and still work for the other SEs. Since GSA won't use Google, you should use your proxies for posting ONLY, imo. Your results should be better even though GSA only searches every 60 seconds.

    An even better solution is to buy a couple of shared/private proxies.
  • Ozz...thanks for your suggestions. I will look into taking the Google SEs out.

    I did not think to check whether the proxies were having problems with Google. I just went to check them with Scrapebox and cannot run SB (I posted another question thread about that, if you have any experience with it and want to reply there). So, at this point, I do not know whether the proxies are OK or not.

    Regarding your last statement, the 20 proxies that I have are private (and fast), and I have not had any previous problems with them. They check out OK in SER. When I originally tested them with SB, they were OK with both Google and SB.
  • I got my answer to the Scrapebox question, so you can disregard that. It seems the problem is on their end (server problems from a DDoS attack).
  • Check in GSA that the option "automatically remove bad proxies on use" is NOT checked.
    Otherwise, if (when) your private proxies hiccup and don't respond, they will be removed, and eventually you will have no active ones left, and SER will be using your real IP.

    This happened to me. That option is fine with public and auto-scraped proxies, because new ones are always being added. Don't use it with private proxies (the sketch below illustrates the failure mode).
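    A minimal sketch of that failure mode, with hypothetical names rather than SER's actual code: a pool that permanently drops a proxy on any failure will eventually empty out and silently fall back to the direct connection.

    ```python
    # Hypothetical illustration -- not SER's actual implementation.
    class ProxyPool:
        def __init__(self, proxies, remove_bad_on_use=True):
            self.proxies = list(proxies)
            self.remove_bad_on_use = remove_bad_on_use

        def pick(self):
            # Empty pool -> direct connection, i.e. the real IP is exposed.
            if not self.proxies:
                return None
            return self.proxies[0]

        def report_failure(self, proxy):
            # One hiccup permanently removes a private proxy; with public or
            # auto-scraped proxies this matters less, as new ones keep arriving.
            if self.remove_bad_on_use and proxy in self.proxies:
                self.proxies.remove(proxy)

    pool = ProxyPool(["p1:8080", "p2:8080"])
    for _ in range(3):
        proxy = pool.pick()
        if proxy is None:
            print("pool empty -- requests now go out on the real IP")
            break
        pool.report_failure(proxy)  # pretend every request times out
    ```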

    Also, Google has all sorts of ban thresholds, so an SB test alone is not conclusive. For example, it can ban an IP for advanced searches only, so the same proxy will work for the search term "cat" but be blocked for "inurl:cat". A quick way to check both cases is sketched below.
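    A quick check of that per-operator ban, assuming Python with the requests library; the proxy address is hypothetical, and the "unusual traffic" string is only a heuristic, since Google's block pages vary.

    ```python
    import requests

    PROXY = "http://user:pass@1.2.3.4:8080"   # hypothetical proxy
    HEADERS = {"User-Agent": "Mozilla/5.0"}

    def blocked(query):
        r = requests.get(
            "https://www.google.com/search",
            params={"q": query},
            proxies={"http": PROXY, "https": PROXY},
            headers=HEADERS,
            timeout=15,
        )
        # 429/503 or the "unusual traffic" interstitial indicate a block.
        return r.status_code in (429, 503) or "unusual traffic" in r.text.lower()

    # Same proxy, plain vs. advanced query.
    for q in ("cat", "inurl:cat"):
        print(q, "->", "blocked" if blocked(q) else "ok")
    ```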