
Proxies Dead in 5 Minutes?

Just got started with GSA and I had 28 private proxies banned in about 5 minutes. Here are the settings I think are relevant:

- USA Search Engines (Google, Yahoo, Bing)
- Articles, Web 2.0, Wikis only
- Threads to Use: 300
- HTML Timeout: 30
- Search engine query wait time: 60 seconds

What did I do wrong here?


  • Sven
    Try disabling "google RND" if you use it, and make sure the time between search engine queries is high enough (the default is 60 seconds).
  • Thank you. I was reading through another thread where someone pointed out that using private proxies for checking PR can get them banned really fast. Luckily I was able to load the proxies again about 6 hours later and they were working fine (I set PR checking to use public proxies).
  • edited June 2013
    Finally somebody else having this issue. I used Google RND too and my proxies were banned extremely fast (in less than an hour).

    They still haven't recovered when used in GSA (I'm getting error messages), so I'm using Yahoo and Yandex now.

    The funny thing is, when I use the same proxies in ScrapeBox they work. Strange stuff.
  • Same here, mine still worked in Content Foundry right after. Luckily I was able to recover them for GSA.
  • @sven, do you have any idea why the proxies work for something else while not working in GSA?
  • I've noticed something similar. If you get a 000/000 error for a search engine but find there are results in the web browser, you'll probably find a country-specific proxy is banned.

    i.e. when you run the ScrapeBox or GSA proxy checker, make sure you run it against the country-specific engine, and you'll find that one is banned whereas another isn't.
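    That country-specific check can be sketched in Python. This is only a hypothetical illustration, not anything GSA or ScrapeBox exposes: the 403/429 codes and the "unusual traffic" marker are my assumptions about Google's block page, and `fetch` is a stand-in for whatever HTTP client you route through the proxy.

```python
# Hypothetical sketch: detect whether one proxy is banned on some Google
# country domains but not others. looks_banned() is a guess at what a
# Google block response looks like, not an official API.

def looks_banned(status_code: int, body: str) -> bool:
    """Treat 403/429 or an 'unusual traffic' sorry-page as a ban."""
    if status_code in (403, 429):
        return True
    body = body.lower()
    return "unusual traffic" in body or "/sorry/" in body

def check_domains(fetch, domains):
    """Run the same query through one proxy against several country domains.

    `fetch(domain)` must return (status_code, body). A proxy can be banned
    on one country domain while another still answers normally.
    """
    return {domain: looks_banned(*fetch(domain)) for domain in domains}
```

    With a real HTTP client, `fetch` would issue the same search query through the proxy against each domain and return the status code and page body, so you can see exactly which country engines have blocked that IP.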
  • Sven
    @Startrip sorry I have no idea. But what @sonic81 says might apply to you.
  • Shit, all my proxies were marked dead overnight.

    300 Thread count
    HTML timeout: 90 seconds
    Google and Yahoo US
    5 seconds between search engine queries
    Search engines, submissions, and verification running on Private Proxies (verification on Public and Private)

    Multiple programs showing them as dead.  Will try again in a few hours.  
  • edited June 2013
    I have noticed the same thing. Proxies are getting banned on the Google engines really quickly. I purchased private proxies, and I even tried Errsy cloud proxies (which are supposed to access thousands of proxies), but after about 10 minutes they started getting banned too. I have been able to mostly work around the issue by disabling all PR checking and using a lot of proxies with a lower thread count (1 thread or less per proxy). Don't forget to disable the PR options in each project as well as in the main program options: both the captcha PR check and the PR check before sending to the indexer.

    This is only a suggestion that has been working for me; if anyone else has ideas or comments, please feel free to share. Also, my settings are pretty conservative (only 30-50 threads), so I'm surprised my proxies were getting banned at all.

    To my knowledge the proxy issue has nothing to do with GSA. My hosting company has also experienced the same issue with their proxies across multiple servers using many different programs (Hrefer, XRumer, ScrapeBox, etc.). They only started noticing it a week or two ago (after the new G update). They think "G" has enhanced its detection methods and is now better able to detect proxies.

    Thoughts anyone?

  • edited June 2013
    @Laubster I think your thread count is too high and your delay between searches too low for the number of proxies you have (28, right?). Look at the number next to "T:" at the bottom left of GSA; that's your thread count. I doubt you are actually running at 300 threads anyway.

    Try dropping to fewer than 100 threads and unchecking "Custom time to wait between search engine queries." I remember one of Santos's videos where he said not to mess with that unless you were running a lot of proxies. See here:
    ...he had 100 proxies and set it to 100 threads with a 5 second delay in the video.
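    The arithmetic behind that advice is easy to sketch. The formula below is my own back-of-the-envelope estimate, not something GSA computes: assume each thread fires at most one search query per delay interval, spread evenly across the proxy pool.

```python
def queries_per_proxy_per_minute(threads: int, delay_s: float, proxies: int) -> float:
    """Rough upper bound on how hard each proxy hits the search engines.

    Assumes every thread issues one query, then waits `delay_s` seconds,
    and that queries are spread evenly across the proxy pool.
    """
    return threads * (60.0 / delay_s) / proxies

# 300 threads, 5 s delay, 28 proxies: ~129 queries per proxy per minute
# 100 threads, 5 s delay, 100 proxies: 12 queries per proxy per minute
```

    By this estimate the 300-thread / 28-proxy setup hammers each proxy more than ten times harder than the 100/100 setup from the video, which is consistent with the fast bans people are reporting.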
    Yeah, I remember that video too. I'm almost positive it's because I've been doing PR checking, although I had the global settings set to use public proxies. Oh well, lesson learned. It would make sense if Google is cracking down on proxy usage - they can't fight programs posting links to sites in their index, but they can crack down on proxies as much as possible.

    @Sven - does unchecking the "PR Checking" box in global settings turn PR checking completely off, including checking the PR of verified links?
  • Got some useful info from here. My proxies have been temporarily banned by G since this morning. I stopped GSA for a while and some of them came back, but after running GSA for an hour the proxies were banned again. I had Buyproxies replace them all. I was running 200 threads, 50 proxies, and a 5 second custom wait time. I only have 30 proxies now, so I will implement some of @gsarver's suggestions to see if the proxies last.
  • edited June 2013
    Whoops, I meant to add these to the post above. Here is where you disable PR checking in the main program options.

    ....make sure these are UNchecked if you want to minimize your proxies getting banned. Also be sure to turn off PR filtering in each project. By the way, this is how some of the power users here run so many threads without getting their proxies banned.

  • I turned off G completely a few days ago, turned off PR checking (useless anyway), and went with Yahoo/MSN. The proxies still haven't recovered; I've checked them daily. G seems to be getting extremely strict... (which basically means they can't stop us) ;)
    Yeah, I think you are right. They are trying to stop us, but a lot of what they say is just scare tactics. Have you tried LeeG's suggestion of only selecting a few engines (4-8)? I'm using "G" international and one country-specific engine as well as Bing. It seems to be working well and I have not been blocked.

    Also, I have GSA set to test my proxies every 30 minutes, disabling the down ones to give them a break. It then retests after another 30 minutes and re-enables the ones that were down if they are up again. Make sure you disable the lists it scrapes for public proxies (uncheck everything) or you will have public proxies mixed in with your private ones.
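    That retest-and-re-enable cycle can be sketched as a small pool manager. This is a hypothetical illustration of the idea, not GSA's implementation; `ProxyPool` and `check` are names I made up, and `check` stands in for whatever proxy test you actually run.

```python
import time

class ProxyPool:
    """Disable proxies that fail a health check; retest after a cooldown
    and re-enable the ones that respond again (mirrors the 30-minute cycle
    described above)."""

    def __init__(self, proxies, cooldown_s=30 * 60):
        self.cooldown_s = cooldown_s
        self.alive = set(proxies)
        self.down = {}  # proxy -> timestamp when it was disabled

    def retest(self, check, now=None):
        """Run `check(proxy)` on live proxies, benching failures; then
        re-check benched proxies whose cooldown has elapsed."""
        now = time.time() if now is None else now
        for proxy in list(self.alive):
            if not check(proxy):
                self.alive.discard(proxy)
                self.down[proxy] = now
        for proxy, since in list(self.down.items()):
            if now - since >= self.cooldown_s and check(proxy):
                del self.down[proxy]
                self.alive.add(proxy)
```

    Calling `retest` on a timer gives the same effect as GSA's periodic check: banned proxies get a break instead of being hammered until they are permanently blocked.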
  • I've turned Google and PR checking off as well. Only using Privates for Submissions and Verifications, and I'm not even sure if they're needed on the Verifs.  Long story short I'm harvesting mass amounts of proxies automatically and using publics to power my searches.  Not the fastest method, but it works.

    Luckily my proxies have come back to life (I haven't tested them against Google, but they're working everywhere else).
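    Filtering a harvested public-proxy list down to the working ones is the part worth parallelising, since public lists are mostly dead. A minimal sketch, assuming `check` is a stand-in for your actual proxy test (a real check would be a network request, which is why threads help):

```python
from concurrent.futures import ThreadPoolExecutor

def filter_working(proxies, check, workers=20):
    """Test a harvested public-proxy list concurrently, keeping input order.

    `check(proxy)` should return True when the proxy responds; real checks
    are network-bound, so a thread pool is much faster than a serial loop.
    """
    with ThreadPoolExecutor(max_workers=workers) as executor:
        results = list(executor.map(check, proxies))
    return [proxy for proxy, ok in zip(proxies, results) if ok]
```

    `executor.map` preserves input order, so the surviving proxies come back in the same order they were harvested.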
  • @gsarver That's an interesting method.  I'm almost too scared of using my privates on G to even test it! 
  • AlexR Cape Town
    @sven - am I correct in saying that changing these PR settings won't make a difference if you have a PR filter in your projects, since the program only checks PR once when it finds the URL and then applies these global filters AFTER the project PR filter?
  • Sven
    @AlexR I can't follow you (as usual). Your sentences are all so long that I lose the meaning by the end. I also don't understand what context this refers to.
  • AlexR Cape Town
    @sven - if you change the PR settings as per gsarver's post (under global settings, like the captcha or indexer PR thresholds), it won't make a difference if you have a PR filter within the project options
    (i.e. only submit to sites with PR3+, etc.).

    Since the program FIRST checks PR at the project level, stores the result once it has it, and only THEN applies the captcha PR test or the indexer PR test, it doesn't re-check the PR for these global PR checks.

    Anyway - not too important; users who don't want to use PR just need to make sure that ALL PR settings are disabled, both global and project settings.