Proxy block on Google - what can I do?


I'm constantly getting an error warning in GSA Search Engine Ranker that says: "IP/Proxy block on google".
I'm using public proxies and have three questions about this topic.

1) Do public proxies just lower the submission speed (decrease LPM), or are there other disadvantages to using public proxies?
2) What happens if I use the "test against" feature?
3) Is there a workaround for this IP block?

Thank you :-)


  • MrXMrX Germany
    Accepted Answer
    1. Because they are public, they get banned by Google much faster than private ones.
    2. "Test against" tests whether your proxy is alive, like the other testing features (AFAIK).
    3. Get private proxies, lower your thread count, and raise the custom wait time between queries.
  • edited July 2013
    I worked around the proxy ban by:

    a) removing most advanced operators (like site:, inurl:, etc.) from the footprints
    b) adding some new footprints without advanced operators
    c) using a 60-second wait between queries with 20 private proxies, which works out to one query per proxy every 20 minutes. I have maybe one proxy banned per day, and it normally recovers.
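Steps (a) and (c) above can be sketched in a few lines of Python. This is a minimal illustration, not anything SER actually does internally: the operator list and the footprint strings are my own assumptions, and the arithmetic just restates the 20-proxies-at-60-seconds example from the post.

```python
# Sketch: drop footprints that use advanced search operators, and work out
# how often each proxy hits Google under round-robin rotation.
# Operator list and sample footprints are assumptions for illustration.

ADVANCED_OPERATORS = ("site:", "inurl:", "intitle:", "intext:")

def strip_operators(footprints):
    """Keep only footprints that contain no advanced operator."""
    return [fp for fp in footprints
            if not any(op in fp.lower() for op in ADVANCED_OPERATORS)]

def seconds_per_proxy(num_proxies, wait_seconds):
    """With round-robin rotation, each proxy is reused once every
    num_proxies * wait_seconds seconds."""
    return num_proxies * wait_seconds

footprints = ['powered by wordpress', 'inurl:guestbook.php']
print(strip_operators(footprints))          # the inurl: footprint is dropped
print(seconds_per_proxy(20, 60) // 60)      # 20 proxies * 60 s = 20 minutes
```

The point of the 20-minute interval is that each individual proxy queries Google rarely enough to stay under its ban threshold, even though the tool as a whole queries once a minute.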

    Of course you get far fewer new targets that way, and that's where Scrapebox plus public proxies enter the game. If you scrape 1000+ proxies with either SER or Scrapebox (I personally use Scrapebox), you can easily run it with advanced operators and 100 threads and scrape 1-5 million URLs overnight, depending on the footprints of course.

    At some point I tested GScraper, but I must say I like Scrapebox more. The built-in GScraper proxies are the biggest joke I've ever seen; if I use GScraper I normally pre-scrape the proxies with Scrapebox (I burned through 1.5 million proxies when I used their proxy service because they get banned so fast). Scrapebox is king for scraping.

    Do that big scrape 5-10 times a month and your LPM will be fine. At the moment my home installation runs at 300 LPM almost constantly, which of course depends a lot on the size of your global list and your overall settings.

    But to make a long story short: don't worry too much about SER's scraping. It is a posting tool, IMHO, and the built-in scraper is just a nice feature if you want to leave it alone for a few weeks. For scraping, use an external tool to get the most out of SER.

    Best Regards
  • edited July 2013

    Curious: how do you remove footprints from SER's searches?

    I'm now using Scrapebox to scrape with 2,000 public proxies. Yahoo is the biggest winner each time I've scraped, Google yields little, and Bing is a joke: it gets stuck after a while. Is it the same for you?

    I run around 175 threads for those three engines in total.

    >> You also wrote "Do that big scrape 5-10 times a month"

    Won't those scrapes yield largely the same results each time?

  • edited July 2013

    Do you mind telling us your Scrapebox settings for scraping, i.e. the number of connections and the timeout? Perhaps I should also ask about your Internet bandwidth and number of CPUs. I have similar proxy ban issues here and am looking to apply your tips. Cheers.
  • @startrip searching for 'remove inurl' shows only this thread. Can you please share how to remove it?
  • edited August 2013
    @spunko2010 I don't know exactly what you mean, but if you want to remove the advanced operator "inurl" you have to go into the engine files and remove it (look for search term=a|b|c|inurl:d and delete the inurl part)
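A minimal sketch of that edit, assuming the engine file is plain text with a `search term=a|b|c|inurl:d` line as described in the post above (the line format is taken from the post; everything else here is an assumption for illustration):

```python
# Sketch: drop every |-separated alternative that uses the inurl: operator
# from a "search term=" line, leaving other lines untouched.

def remove_inurl_terms(line):
    """'search term=a|b|c|inurl:d' -> 'search term=a|b|c'."""
    if not line.startswith("search term="):
        return line
    key, _, value = line.partition("=")
    kept = [term for term in value.split("|") if "inurl:" not in term]
    return key + "=" + "|".join(kept)

print(remove_inurl_terms("search term=a|b|c|inurl:d"))  # search term=a|b|c
```

You would run something like this over each engine file (or just do the same edit by hand in a text editor, as suggested). Back up the engine files first, since SER reads its footprints from them.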

  • edited August 2013
    Sorry guys, I didn't see all the questions.

    @bluejacket For Scrapebox I use 20 connections and only Google. I have huge bandwidth because I'm a good friend of a guy working at my ISP and I paid for special cables, but I think the proxies are normally the bottleneck. Try to find good lists of public proxies.

    @pratik Go into the engine files in the installation folder and look for "search term=". There you will see all the footprints; you can modify them with a text editor.
  • Friends in high places huh @Startrip ;)