@Sven I'm having this problem, so I've set all my projects to use only global site lists and I don't filter by PR. I'm using 10 semi-private proxies from BurProxies, but I still get this message. If SER isn't scraping, how is it sending queries to Google? Thanks
@kiosh You get this message because you only use 10 semi-private proxies. The reCAPTCHA service sees your proxies too many times in a short period and shows you this message. More proxies or fewer threads will fix the problem.
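To see why "more proxies or fewer threads" helps, here is a rough back-of-the-envelope sketch. The thread count, request interval, and function name are all illustrative assumptions, not numbers from SER itself; the point is only that the per-proxy hit rate scales with threads divided by proxies.

```python
# Rough, illustrative arithmetic (not taken from SER): estimate how often
# each proxy in the pool hits Google's reCAPTCHA endpoints.
def requests_per_proxy_per_minute(threads, proxies, seconds_per_request=10):
    """Assume every thread fires one request every `seconds_per_request`
    seconds and the requests are spread evenly across the proxy pool."""
    total_per_minute = threads * (60 / seconds_per_request)
    return total_per_minute / proxies

# 100 threads over 10 proxies: 60 hits per proxy per minute,
# far more than a single IP can sustain without being flagged.
print(requests_per_proxy_per_minute(100, 10))   # 60.0

# The same 100 threads over 50 proxies: only 12 hits per proxy per minute.
print(requests_per_proxy_per_minute(100, 50))   # 12.0
```

Either lever works: raising the proxy count or lowering the thread count both shrink the per-IP rate that Google sees.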
Thanks @satyr85, but since I'm not scraping, why is SER using Google? As I understand it, this message is shown when you make too many queries in a short period (hence the advice to get more proxies, lower threads, or set a delay between queries)... but I thought that when only posting, Google would "never" be used. That's my doubt.
Take this registration page as an example: http://www.cokoyes.com/user/register/. Towards the end of the registration form, you can see a captcha challenge (to prevent bots from signing up automatically).
Google owns this service, so every time you hit a sign-up or registration page that uses reCAPTCHA, Google logs your IP. When Google detects excessive use of the same IP (proxy), it serves harder blob reCAPTCHAs or blocks the IP with "We're sorry, but your computer or network may be sending automated queries..."
So, to avoid or reduce the occurrence of this problem, do what @satyr85 said: "more proxies or lower your SER threads"...
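The mechanism described above can be sketched in code. This is a hypothetical throttle, not anything from SER: the class name, the proxy labels, and the limits are all made up for illustration. It models a pool where each proxy may only be used a fixed number of times per time window, which is exactly the constraint Google's per-IP detection imposes.

```python
import time
from collections import defaultdict, deque

# Hypothetical sketch (not SER's actual code): cap how often any single
# proxy is used, so that no one IP exceeds Google's tolerated rate.
class ProxyThrottle:
    def __init__(self, proxies, max_hits, per_seconds):
        self.proxies = list(proxies)
        self.max_hits = max_hits          # allowed uses per proxy per window
        self.per_seconds = per_seconds    # length of the sliding window
        self.history = defaultdict(deque) # proxy -> timestamps of recent uses

    def acquire(self, now=None):
        """Return a proxy that is still under its rate limit, or None."""
        now = time.monotonic() if now is None else now
        for proxy in self.proxies:
            hits = self.history[proxy]
            while hits and now - hits[0] > self.per_seconds:
                hits.popleft()            # forget uses outside the window
            if len(hits) < self.max_hits:
                hits.append(now)
                return proxy
        return None                       # every proxy is saturated: wait

throttle = ProxyThrottle(["proxy1", "proxy2"], max_hits=2, per_seconds=60)
```

With few proxies and many threads, `acquire` returns `None` constantly, which is the software equivalent of Google's "automated queries" page; adding proxies or lowering the thread count keeps the pool under the limit.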