
IP/Proxy Block?

What is this? I heard G changed something. Anyone else having this issue? I'm getting a lot of these now.


  • Yeah, me too. I decided it is easier to scrape with scrapebox and then import.
  • My proxies are working fine according to scrapebox, but I'm still getting many of these blocks from Google. My settings:
    I'm on 10 semi-private proxies (buyproxies). I'm wondering if there's an issue with buyproxies after reading what @LeeG says. Any suggestions? Please help.
  • I'm not really sure what the cause is. I even tried increasing the search engine query time to 60 seconds, and that didn't help. I have 10 dedicated proxies from buyproxies as well. Let's see if anyone else has any clues...
  • Here are some queries that are getting blocked:

    "Powered by Jisko"
    "Powered By phpFox Version 3.2.0"
    "Driven by DokuWiki"
  • Ozz
    edited June 2013
    are you really sure this didn't happen all the time and you just couldn't see it?

    the reason i ask is that those log warning messages were only introduced in the past few versions, so it may have been happening all along without you knowing it.

    however, i've noticed nothing special the past few days and my proxies are working as always. i'm not saying that google didn't change anything, though.
  • edited June 2013
    It started recently. My proxies were getting banned quite often but came back within an hour if I stopped using them. I unchecked custom time between queries and have gone 14 hours so far without a ban. But I'm only using one SE, though. Experiments   :D
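Unchecking the fixed custom time between queries fits the usual advice that a perfectly regular interval is easy to flag. A jittered delay, as a minimal sketch (the base interval and jitter range here are arbitrary choices, not SER's internals):

```python
import random
import time

def wait_between_queries(base_seconds=60, jitter=0.5):
    """Sleep for a randomized interval before the next search query.

    A fixed interval is an obvious bot signature; varying it by +/-50%
    around the base makes the timing less uniform. Returns the delay
    actually used so the caller can log it.
    """
    delay = base_seconds * random.uniform(1 - jitter, 1 + jitter)
    time.sleep(delay)
    return delay

print(wait_between_queries(base_seconds=0.01))  # tiny base just for the demo
```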
  • @Ozz the reason i ask is because those log-warning messages were just introduced the past few versions so it may have happened all the time, but you just didn't know it.

    I think you are correct. I need to increase the number of proxies I use for the amount of scraping SER is doing.
  • maybe add "Yahoo" as well, as a fallback solution.
  • Ya, I'm using yahoo too. I don't know if I've always been getting this, but I know I'm getting it a lot now, and I'm also seeing my LPM drop. Is it just me?
  • OK, since unchecking Google RND I'm not seeing any more blocks. Maybe it was just a fluke. @Sven or @Ozz, what is Google RND exactly? Thanks
  • Ozz
    edited June 2013
    Google RND queries random IPs of Google's search servers directly. What seems like a very good idea at first glance apparently causes IP bans quickly (the opposite of what it was introduced for).
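Based on the description above, the option can be pictured like this; a conceptual sketch only (the IP list is a hypothetical stand-in for whatever SER actually resolves):

```python
import random

def pick_random_google_ip(resolved_ips):
    """Pick one of Google's search-server IPs at random ("Google RND").

    The intent is to spread queries over many front-end servers. But if
    rate limiting is keyed to the *client* (proxy) IP, as this thread
    suggests, spreading the server side gains nothing, and the unusual
    direct-to-IP requests may even stand out more.
    """
    return random.choice(resolved_ips)

ips = ["ip-1", "ip-2", "ip-3"]  # hypothetical stand-ins for resolved server IPs
print(pick_random_google_ip(ips) in ips)  # True
```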
  • edited June 2013
    @seagul there is a thread (somewhere) where some users describe having this problem now. im one of them. no one had a real solution.

    what i've tried so far (without success):
    - removing advanced operators like inurl: etc. my best bet was that these get you banned extremely fast (because almost no ordinary user uses them, nearly only scrapers)
    - changing google countries (sometimes you only have a country-specific ban)

    what seems to help / what you can try:
    - disabling google RND (recommendation by sven; i'll leave this unchecked because enabling it seems to get your proxies banned faster). it now took 12 hours instead of 2 to get all my private proxies banned again (...)
    - using yahoo for at least 50% of your projects
    - increasing the search engine query time (maybe even back to the 60 seconds default)

    what definitely helps:
    - go through all your proxies 1 by 1: check one proxy, let a sample project run, and see if it's blocked; if so, uncheck it. rinse and repeat, and keep only the working proxies.
    - do this daily.

    This is extremely painful and i only did it once, just to see my proxies get banned again...

    I'm only using the global site list now and will most likely switch to another software (not because SER isn't a good tool, but because i can't solve this problem).

    I bought 3 new sets of proxies from different providers and used different options to avoid getting banned. I even installed SER on my home PC to see if i have a VPS problem or anything. Same thing over and over again. The funny thing is that my proxies are working in scrapebox and gscraper, so i could scrape lists and feed GSA, but i really don't like babysitting a software like that.

    Best Regards
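The manual proxy-by-proxy check described above is easy to automate in principle. A minimal sketch, assuming Google's block page contains a marker like "unusual traffic" (both the marker and the injected `fetch` callback are assumptions made for testability, not SER's API):

```python
def looks_blocked(html):
    """Heuristic: does this response look like Google's block page?

    Assumed markers - the real block page shows a CAPTCHA under /sorry/
    and mentions "unusual traffic".
    """
    lowered = html.lower()
    return "unusual traffic" in lowered or "/sorry/" in lowered

def filter_working_proxies(proxies, fetch):
    """Keep only proxies whose test search is not blocked.

    `fetch(proxy)` is caller-supplied: it should run one real search
    through the proxy and return the response body. Injecting it keeps
    this sketch runnable without network access.
    """
    return [p for p in proxies if not looks_blocked(fetch(p))]

# demo with canned responses instead of real requests
responses = {
    "proxy-a": "<html>About 10 results for the query</html>",
    "proxy-b": "<html>Our systems have detected unusual traffic</html>",
}
print(filter_working_proxies(list(responses), responses.get))  # ['proxy-a']
```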

  • Ok, thank you for your replies. I am now seeing more blocked proxies again, but not as many as when I had RND enabled. Funny, because as you say the proxies check out OK in scrapebox. I see lots of threads about proxy problems, but I've never had a problem until now. Makes me wonder if Google has specifically launched a campaign against SER LOL
  • LeeG  Eating your first bourne

    It happens to us all

    Shared proxies means just that: others are using and sharing them.

    Editing how you run SER doesn't mean the others using them are doing anything different to stop Google bans.

    They might even be the cause.

  • @LeeG well, either the other party/parties on the shared proxies are hammering them, or SER is triggering Google's blocking filter. I hope I can get to the bottom of it. If I thought private proxies would help I would buy them, but somehow I don't think they will.
  • Well seagul, if you're using shared proxies you can never be sure who got them banned :D That's the first thing i would fix.

    The proxy check in scrapebox you mention doesn't really mean much because you don't know how it tests. Maybe it just checks whether the proxy can connect to google. But what makes me a bit curious is the fact that scrapebox can scrape with these proxies, no matter what. SER can't.

    Best Regards
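The difference between those two kinds of checks can be made concrete. A sketch under the assumption that a soft-blocked proxy still completes the HTTP request but gets a block page back (the status codes and markers are assumptions, not how scrapebox actually tests):

```python
def connectivity_check(status_code):
    """What a naive proxy checker may verify: the request completed."""
    return status_code == 200

def scrape_check(status_code, body):
    """What matters for scraping: real results, not a block page."""
    return status_code == 200 and "unusual traffic" not in body.lower()

# A soft-blocked proxy can pass the first check and fail the second:
status, body = 200, "Our systems have detected unusual traffic from your network"
print(connectivity_check(status), scrape_check(status, body))  # True False
```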
  • LeeG  Eating your first bourne

    How do you know they are even using SER? They might be hammering the proxies with scrapebox or some other software.


    One of the reasons I have been telling people to use a selection of random engines

    Try the method of using random Googles that Ozz shared today

  • edited June 2013
    @leeg the problem with this method, imho, is that you somehow have all the Googles in there, and some, like google malta and the like, yield extremely poor results. sometimes one shows 300 results while another shows 2 million.

    there are ~30 google engines that are useful (from my observation, correct me if i'm wrong), and mixing them with the poor ones may reduce the ban risk, but at the same time it lowers the number of new targets.

    I'm desperately trying to make GSA work for me, but at the moment i don't see a "real" solution. I don't know why some people are lucky with their proxies and some are not. While most advanced users here use almost the same settings, it is alarming that some just lose all of their proxies while others don't.

    Even with only the site list i've harvested so far i can rank pages, but it's not a good feeling to be limited in your maximum amount of links.

    Best Regards & a good evening.

  • i don't think you've fully understood the SE mod LeeG mentioned. just modify it to the 30 SEs you are happy with.
  • LeeG  Eating your first bourne

    Ozz, don't expect people to do anything that involves actual work and isn't provided on a plate for them


    Using their brains, seems way beyond the comprehension of some in all honesty

    Served on a plate or they moan

    Some results are better than no results

    No results means more calls on google.

    More calls on google can result in your proxy getting banned for a period

    So even though those engines you don't like might give poor results, they buy time between searches.

    End result: fewer proxy bans
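LeeG's point is really just arithmetic: rotating over more engines stretches the interval between hits on any single one. A minimal sketch (round-robin rotation is my assumption about how SER cycles engines, not documented behavior):

```python
def seconds_between_hits_per_engine(num_engines, query_interval_s):
    """With round-robin rotation over N engines and one query every
    T seconds overall, each individual engine is hit only every N*T
    seconds - so even low-yield engines buy cooldown time for the rest."""
    return num_engines * query_interval_s

# 1 engine vs 30 mixed Googles at the same overall query pace:
print(seconds_between_hits_per_engine(1, 10))   # 10
print(seconds_between_hits_per_engine(30, 10))  # 300
```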

  • edited June 2013
    I am going to try @ozz's method and see if it helps. Mine are dedicated private proxies and I am having the same problem, so I can't blame someone else for hammering my proxies!

    I think it is the nature of the game: if you are going to scrape with advanced operators all day long, there is a good chance that some or all of your proxies will get banned.

    The best solution is to simply buy some more.....
  • I had the same problem. Mine are shared proxies, and even when i tried GSA's public proxies they got the same IP blocks too, or maybe it's a firewall issue...

    Does GSA SER require dedicated private proxies to get past this?