
Google proxy block

spunko2010 Isle of Man
I have just done an analysis of my submissions and Google blocks all my proxies, as does Startpage, for every country. What settings do you guys have for the search engine delay in seconds? Is 30s sufficient? Mine was previously 9s.


  • Ozz
    edited June 2013
    well, it depends on your proxies (dedicated or semi-dedicated) and how many of them you've got.
    it is what it is, you need to test those things for yourself.

    btw, startpage was really crap when i tested it a couple of months ago. back then it only gave me the 10 results of the first page and wasn't scraping the following pages. maybe that has changed, but you should test those things with the "search online for URLs" tool for example.
  • spunko2010 Isle of Man
    PS: I have 50 semi-private proxies.
  • spunko2010 Isle of Man
    I have ordered 20 fully dedicated proxies now for testing. Will report back.
  • spunko2010 Isle of Man
    I have a question for @sven, also related. If I set the search engine time delay to 30 seconds and have 10 projects running with the same search engines selected, is the delay per project? Or will the delay apply to the search engine across ALL projects regardless? Hopefully the latter....
  • it's a global delay from my observations: 60 seconds between every query, regardless of the number of proxies or projects.
  • yes, all options in the general options menu are global, if i'm not wrong.
  • Disregarding the # of proxies?

    That wouldn't make sense, in my humble opinion. It should be 60 seconds per proxy.
    @Sven, can you clarify that?
  • Sven
    that time is per search engine and is not counted per proxy.
  • spunko2010 Isle of Man
    edited June 2013
    OK, thanks. But say we have 60 proxies and only 1 project; a delay of 60s in this instance is silly, since there are many proxies that can query the search engines... Right? If I understand correctly, I could set my delay to 1s here, and with the proxies rotating, no proxy would query the same search engine more than once every 60s...???

    Also if only trackbacks and BC use the keywords to target sites, does this make any difference to my campaigns if I don't use either of those? Or are the search engines still used to find other sites via non-keyword methods?
  • >Also if only trackbacks and BC use the keywords...
    those are the only ones that ALWAYS use keywords for search. most engines use keywords for search only SOMETIMES.
  • Ok, now I can see why my scrapes were pretty slow. I had the delay set to 90s, assuming it was per proxy.
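Sven's answer (the delay is keyed on the search engine, not on the proxy) can be sketched roughly as follows. This is a hypothetical model in Python; the class and method names are my own invention, since SER's internals aren't public:

```python
class GlobalEngineDelay:
    """Hypothetical model of a global per-engine delay: the wait is keyed
    on the search engine alone, so adding proxies does not reduce it."""

    def __init__(self, delay_seconds):
        self.delay = delay_seconds
        self.last_query = {}  # engine name -> timestamp of last query

    def wait_time(self, engine, now):
        """Seconds still to wait before this engine may be queried again."""
        last = self.last_query.get(engine)
        if last is None:
            return 0.0
        return max(0.0, self.delay - (now - last))

    def record_query(self, engine, now):
        self.last_query[engine] = now

# With a 60s delay, the second Google query still waits ~60s
# no matter how many proxies sit idle in the pool.
limiter = GlobalEngineDelay(60)
limiter.record_query("google", now=0)
print(limiter.wait_time("google", now=1))  # 59
print(limiter.wait_time("bing", now=1))    # 0.0 (other engines unaffected)
```

A per-proxy variant, which is what several posters expected the setting to mean, would simply key `last_query` on the (engine, proxy) pair instead.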
  • AlexR Cape Town
    Maybe someone can assist with this as it's related. 

    When Google blocks a proxy, I noticed in my browser that the Google search page still appears, but when you do a search you get a blocked page. Yet our Google proxy tester in SER checks the homepage for a string. (The homepage loads even when a proxy is blocked, so it will find the string and pass.)

    Are there 2 types of Google proxy blocks?
    a) One that blocks the Google homepage from showing?
    b) One that only blocks when you try to search?
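AlexR's observation suggests a health check should fetch an actual results page rather than the homepage. A minimal classifier sketch in Python; the block markers here are illustrative guesses, not a definitive list of Google's block-page strings:

```python
# A proxy can serve the Google homepage yet still be blocked from search,
# so a checker should fetch a real results page and classify its body.
BLOCK_MARKERS = ("unusual traffic", "/sorry/", "type the characters")

def search_url(query):
    """URL the checker should fetch: a real query, not the homepage."""
    return "https://www.google.com/search?q=" + query.replace(" ", "+")

def looks_blocked(page_body):
    """True if the fetched page shows one of the block signatures."""
    body = page_body.lower()
    return any(marker in body for marker in BLOCK_MARKERS)

print(search_url("blue widget"))
print(looks_blocked("Our systems have detected unusual traffic ..."))  # True
print(looks_blocked("<div>About 1,000,000 results</div>"))             # False
```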
  • Ozz
    edited June 2013
    there are at least a few different steps in the google system before your proxies are blocked. at first you have to enter a captcha (= the site with the string loads), and i believe the next step is that you just get a message that your IP is blocked from searching (= the site string doesn't load???).

    i'm not 100% sure though if that's the way the ban process works (all the time)
  • I found that proxies are easily blocked by Google when using public proxies!
  • spunko2010 Isle of Man
    edited June 2013
    @ozz you can see the steps Google uses here:

    When Google detects scraping activity, this is what happens:
    1. When accessing Google, you can be warned about something "dangerous" going on.
    You will see a warning about a possible virus or Trojan on your computer.
    2. If you continue scraping, Google will now throw in their first block.
    You will again see the virus message; this time you need to enter a captcha to continue.
    The captcha will create an authentication cookie that allows you to continue.
    3. Now Google uses larger weapons: they will block your IP temporarily. ("Google blocked your ip temporarily")
    It can last from minutes to hours; you immediately need to stop your current scraping and change code/add IPs.
    4. If you scrape Google again you will be banned for a longer time.
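The escalation above lends itself to the "penalty box" idea raised later in the thread: bench a proxy for progressively longer each time it trips a block signal. A sketch with made-up penalty durations; none of this is SER's actual behaviour:

```python
# Hypothetical back-off matching the staged escalation: each block signal
# raises the proxy one level and benches it for a longer period.
PENALTIES = [0, 60, 600, 3600, 86400]  # seconds per escalation level

class ProxyState:
    def __init__(self):
        self.level = 0
        self.benched_until = 0.0

    def report_block(self, now):
        """Escalate one level and bench the proxy accordingly."""
        self.level = min(self.level + 1, len(PENALTIES) - 1)
        self.benched_until = now + PENALTIES[self.level]

    def report_success(self):
        """A clean query resets the escalation."""
        self.level = 0

    def usable(self, now):
        return now >= self.benched_until

p = ProxyState()
p.report_block(now=0)      # first block: short bench
print(p.usable(now=30))    # False
print(p.usable(now=61))    # True
p.report_block(now=61)     # second block: longer bench
print(p.benched_until)     # 661
```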

  • Ozz
    edited June 2013
    thanks @spunko2010. that's good information, and maybe @Sven can make use of it and send the proxies to a penalty box for a while when they get the "Google blocked your ip temporarily" message.

    i noticed something interesting though. to me it seems that if SER is searching for competitor links then the proxies work well, even if i saw some "IP block" messages before. maybe someone else can confirm that.

    however, i'm using yahoo as a backup and i believe it's doing a pretty good job from what i've witnessed.

    @yorktownfashion: did you expect anything else from public proxies when they are shared among thousands of users?
  • Good suggestion by @Ozz, to move the blocked IPs to a penalty box with a time setting we can define ourselves.

    Also, I'm getting my proxies blocked pretty fast. I'm not sure if I'm right about this one, but I believe the block may happen faster depending on the nature of your footprints and queries.
    E.g. when just scraping for a string such as blue widget or red widget review, the IP doesn't get the 302 block that fast.
    But when scraping for
    "powered by hurrdurr engine 3.2" "add your spammy comment" inurl:"display=all" -"comments are closed"
    the IP gets the block hammer way faster.

    Has anybody else noticed something like that?
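One rough way to quantify a "footprint-heavy" query like the example above is to count advanced operators, exact-phrase quotes, and exclusions. The scoring below is purely illustrative, not anything Google documents:

```python
import re

# Advanced operators that plain keyword queries don't use.
OPERATORS = ("inurl:", "intitle:", "intext:", "site:")

def aggressiveness(query):
    """Hypothetical score: one point per operator, quoted phrase, or exclusion."""
    score = sum(query.count(op) for op in OPERATORS)
    score += len(re.findall(r'"[^"]+"', query))       # exact-phrase quotes
    score += len(re.findall(r'(?:^|\s)-\S', query))   # exclusions like -"..."
    return score

print(aggressiveness('blue widget review'))  # 0
print(aggressiveness('"powered by hurrdurr engine 3.2" "add your spammy comment" '
                     'inurl:"display=all" -"comments are closed"'))  # 6
```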
  • I'm using proxies from
  • I'm using shared proxies from buyproxies and having my proxies blocked by Google frequently since last week. It looks like some change has been made on Google, or it's just a GSA bug. I've been using GSA for about 4 months and never had this problem before.
  • spunko2010 Isle of Man
    It would be helpful if @komakino and others could say whether they are using:

    1. Analyse competitor backlinks, and
    2. Search engine delay setting

    Mine are YES and 61s. And I have not seen any proxy blocks for Google today. Only for Startpage....!
  • spunko2010 Isle of Man
    edited June 2013
    Also, mine are a mix of shared and fully dedicated proxies.

    Just a warning if you are using Proxy-Hub: COUNT YOUR PROXIES. I bought 20, they sent me 17.... Also, for another non-GSA project I have a company in India scrape Google Shopping every day; they have over 1000 proxies... So if you are using a very small number of proxies that are also shared with others, I don't think you will have much success with Google.
  • @benny, can you get me an invitation code for Rightproxy?

    @spunko2010, I stopped using the internal scraper of SER. I started analyzing and building custom footprints for each engine. Then I scrape with Scrapebox and import the results into SER.

    But yeah, even with very high delays I'm getting plenty of blocks. I used private proxies before, but they got blocked fast. Now I'm giving 2-3 different services a try.
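The footprint-to-Scrapebox workflow boils down to crossing each engine footprint with each keyword to produce the query list for the external scraper. A sketch with made-up footprints; real SER footprints differ per engine:

```python
import itertools

def build_queries(footprints, keywords):
    """One query per (footprint, keyword) pair."""
    return ['{} "{}"'.format(fp, kw)
            for fp, kw in itertools.product(footprints, keywords)]

# Illustrative footprints and keywords, not taken from SER.
footprints = ['"Powered by ExampleCMS"', 'inurl:guestbook']
keywords = ["blue widgets", "red widgets"]
for q in build_queries(footprints, keywords):
    print(q)
```

The resulting list (4 queries here) is what you would export, run through Scrapebox, and then import as a target URL list.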
  • spunko2010 Isle of Man
    What delay do you have set?
  • edited June 2013
    Delay: RND
    RND delay range: Max 60s, Min 59s

    Running Google harvester with 1 thread and 100 private proxies.
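The RND setting above amounts to drawing each wait from the configured [min, max] range. A one-line sketch with parameter names of my own invention:

```python
import random

def next_delay(min_s=59, max_s=60, rng=random):
    """Wait before the next search query, drawn uniformly from [min_s, max_s]."""
    return rng.uniform(min_s, max_s)

d = next_delay()
print(59 <= d <= 60)  # True
```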
  • There is something going on with the automatic test proxies option. I usually have it set to test every 20 mins if I have fewer than 50 active proxies.

    What I noticed is that when I go to the proxy panel and click test proxies myself before starting SER, it shows most of my proxies tested OK and are TRANS. But after running SER for a little while, the number of my active proxies dropped. I took a look at the list and saw that those proxies, which work at the moment, show WEB :-/ wtf
  • edited June 2013
    I'm another one getting the google IP proxy block msg constantly now, with all or most Google engines. I use semi-private proxies which I change every month. I've been running gsa ser for almost a year and this is a new one for me. I haven't changed my procedures or settings, so I'm clueless.
  • spunko2010 Isle of Man
    It would help if you could post your settings and number of proxies in this thread.
  • The Google proxy block is something that started this week for me.

    Been using GSA SER for 3 months and did not notice it before.

    I guess it might have occurred earlier and I never noticed it.

    I was using 20 private proxies before and they expired yesterday and ALL seemed to be flagged by Google.

    I've got 30 new private proxies and I'm not using them until we understand what's going on.

    Until then I started scraping proxies with the program and have 1,000 in use now, and I really don't see any change in the number of submissions; I actually see fewer Google warnings.

    Maybe I can just forget private proxies and the $50/mo they now cost me????????

    In 3 months of 24/7 usage I went from a 2,000+ ranking to 48th place and am tickled pink.

    I'm going to start automatic postings to see if I can get to page 1..........
  • i use for pr checking and scraping, invitation code 502206932 if someone wants it
  • This has always been confusing for me. I often get the IP/proxy block on Google message. But I think they're just for specific proxies? I use about 50+ semi-dedicated proxies, so I think it should still be fine? I seem to care much less about it now, as I'm getting 35+ LPM since a day or two ago. I just let my boy blast and don't pay much heed. I have around 18-20 SEs checked and my search interval is set to 14 or 15, I think.