IP getting slapped when using proxies?
Hi @Sven. On my VPS I'm finding that sometimes my IP is getting slapped by Google, even though I'm using proxies.
If I stop SER and then try to open up my web browser and search for something I get the captcha request.
I only run SER on it (24/7) and always use at least 20 semi dedicated proxies.
Edit: I also have the proxy settings to stop on no active proxies, and restart on active proxies (but I never get them all blocked anyway).
Any ideas?
Comments
That is of course if you've correctly set up your proxy settings in GSA.
It only happens occasionally, but these are the settings that I'm using:
Edit: AFAIK my VPS is not shared and neither is the IP.
They suggested the proxies must have been "leaking". Maybe that could be the problem for you?
How was your situation resolved, did they just swap the proxies and all was well?
To be honest I would doubt it was true, because I've never come across anything like that, but they did send me copies of the spam complaints, and now your IP banning confirms it, I guess.
I'm not too worried about it at the moment, but it could become a problem as I'm considering getting shot of the VPS and running SER off of a home computer instead.
Hopefully buyproxies will shed some light on it mate.
I wish I saw this earlier.
You do not want that box checked below:
Private proxies may go 'off' for a second or two, but they come back within seconds (every couple of months you may end up with a bad proxy that needs to be replaced, but that's not what I am talking about here). Never disable private proxies if down. If one by one your proxies are detected as dead (when they really aren't), you will end up with no proxies. My very first thread on this forum was exactly about my IP leaking, which led to that setting, and the one below:
You have that one covered, so I don't think that is the cause. Just uncheck the one that disables private proxies. In theory that has nothing to do with using your IP, but my problems went away when I stopped disabling private proxies.
Thanks for the explanation @ron
No guarantees here, but I think that will help. One thing for sure is that you will be firing on all cylinders with your proxies, and your LPM will be at the maximum for your settings.
If you want to see what I mean, go into your proxy area and test all proxies against google. See if one or two fail. Then retry those failed ones right away. Almost without exception they will be successful. It could be line noise, whatever...but if you kill those proxies the instant there is a hiccup, you will be without proxies very quickly. And who knows what can happen in a multithreaded application with hundreds of threads if you have no proxies. Ideally, that second setting should save your rear end.
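Outside of SER you can reproduce that retry experiment with a few lines of Python. A minimal sketch, assuming placeholder proxy addresses and a generous timeout (what ron describes above is SER's built-in tester; this only illustrates the retry behaviour):

```python
import requests

# Placeholder proxies (host:port) - substitute your own list.
PROXIES = ["203.0.113.10:8080", "203.0.113.11:8080"]

TEST_URL = "https://www.google.com/search?q=test"
TIMEOUT = 45  # be generous; see the timeout discussion further down

def test_proxy(proxy: str) -> bool:
    """True if a Google query through the proxy comes back clean."""
    try:
        r = requests.get(
            TEST_URL,
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=TIMEOUT,
        )
        # A 200 without a captcha page counts as a pass.
        return r.status_code == 200 and "captcha" not in r.text.lower()
    except requests.RequestException:
        return False

for proxy in PROXIES:
    if test_proxy(proxy):
        print(proxy, "OK")
    elif test_proxy(proxy):  # retry the failure right away
        print(proxy, "OK on retry - just a hiccup, not a dead proxy")
    else:
        print(proxy, "failed twice - candidate for replacement")
```

The point of the second attempt is exactly what's described above: a single failed test says very little, and only a proxy that fails twice in a row is worth treating as dead.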
Actually, it makes perfect sense; thinking about it, it normally happens when I'm running at highish LPM.
I will give it a try, and let you all know how I get on.
Thanks again
Especially with semi-private proxies. I've been on private proxies for a week now, and the Google slaps seem to be temporary in nature; after a few hours they test OK again.
What about your top picture:
x select if anonymous
x remove non-anonymous proxies
??
The anonymous test is very slow and needs a much longer timeout when testing proxies, or else they all fail the anonymous test.
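For what it's worth, you can check outside of SER what an anonymity test boils down to. A minimal Python sketch, assuming a placeholder proxy and the public ipify echo service; a full anonymity test would also inspect forwarded headers, this only compares the visible IP:

```python
import requests

TIMEOUT = 60  # generous, since the anonymity check can be very slow
PROXY = "203.0.113.10:8080"  # placeholder proxy

# Your real IP, fetched with no proxy.
real_ip = requests.get("https://api.ipify.org", timeout=TIMEOUT).text.strip()

# The IP a target site sees when you connect through the proxy.
seen_ip = requests.get(
    "https://api.ipify.org",
    proxies={"http": f"http://{PROXY}", "https": f"http://{PROXY}"},
    timeout=TIMEOUT,
).text.strip()

if seen_ip == real_ip:
    print("LEAK: the proxy exposes your real IP")
else:
    print(f"looks anonymous: target sees {seen_ip}, not {real_ip}")
```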
All of my proxies are showing as anonymous in the list, but out of interest, what would you advise that I set the timeout to?
After contacting the proxy company and them kindly swapping my proxies for a new list, the problem seems to have gone away.
I contacted the VPS provider and they were confident that if it wasn't my settings then it must be the proxies.
Interestingly, and I don't know how true this is, they also said that if your proxies are prone to leaking then the problem will be worse if they are shared, as you will also get your IP slapped quicker and more often.
I then contacted the proxy provider and asked them to test them for me, and they just sent me a new list which seems to be working fine.
For proxy testing from a home PC, I set it to 45-60 seconds. A test takes anywhere from 1 second to many dozens of seconds, depending on how busy your own line is; on a VPS it is shorter, of course. If a proxy test times out, the proxy is by definition counted as dead, hence it is better to give enough time so that you always have active proxies.
Your waiting time of 1 second between search-engine queries, with the small number of proxies you have, looks like extremely heavy hammering to me, but it seems to be normal here for VPS users to hammer their proxies. My proxy provider offered a recommendation: use 1/3 of the proxy count as your thread count. For SB (ScrapeBox) I use a 45-60 second wait between search-engine queries with 50 private (dedicated) proxies. Even then, if I use precise, high-matching footprints and exotic Google country domains, I sometimes get hammered after just a few queries.
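To put that rule of thumb into concrete numbers, a tiny sketch using just the figures from this post (nothing SER- or SB-specific):

```python
import random
import time

NUM_PROXIES = 50            # the 50 dedicated proxies mentioned above
THREADS = NUM_PROXIES // 3  # provider's rule of thumb: 1/3 of proxy count

def wait_between_queries() -> None:
    # 45-60 s between search-engine queries, per the SB setting above
    time.sleep(random.uniform(45, 60))

print(f"{NUM_PROXIES} proxies -> run at most {THREADS} threads")
```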
Maybe some queries with globally known, accurate footprints are blacklisted? Whatever you do is personal choice and experience. It also depends on the keywords: with ordinary keywords I can query for hours without any problems, but the results are less useful. I have noticed that sometimes doing everything slower yields better results and more submissions/verifications than going faster. To avoid being hammered by Google, I do almost all target searching without search engines, using SB creatively, and only submission/PR checking with SER.
@startrip, when scraping I set my connections in scrapebox to be around 20% of the total number of (semi dedicated) proxies that I've got and never have any problems with them getting banned, even using footprints like "inurl:YourSpammyKeyword" - I only ever seem to run into problems if I forget to load up some public ones for PR and index checking.
I no longer scrape with search engines; instead I use SB based on an idea found on another forum:
http://www.blackhatworld.com/blackhat-seo/black-hat-seo-tools/605958-tut-how-easily-build-huge-sites-lists-gsa-ser.html
I use a modified version of the above and filter offline, using a bash script and offline footprints on my Linux workstation, then import the target files into SER on a Win7 machine.
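My real filter is a bash script; as a rough illustration of the idea (the file names are made up for the example), a Python version could look like this:

```python
# Rough Python illustration of the offline filtering step: keep only the
# harvested URLs that match a known platform footprint, no scraping needed.

with open("footprints.txt") as f:  # one URL-level footprint per line
    footprints = [line.strip().lower() for line in f if line.strip()]

with open("harvested_urls.txt") as src, open("targets_for_ser.txt", "w") as dst:
    for line in src:
        url = line.strip()
        # Keep the URL only if it matches one of the platform footprints,
        # e.g. "/member/register" or "wp-login.php".
        if url and any(fp in url.lower() for fp in footprints):
            dst.write(url + "\n")
```

In bash, the equivalent filter is roughly: grep -i -F -f footprints.txt harvested_urls.txt > targets_for_ser.txt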
When running SER on quality targets, my best experience is/was at LOW thread counts, because lots of successful registrations means more email verification (very time-consuming in my situation) and more uploading for the successfully registered sites. I do almost only articles, so there is more upload volume, which requires more time and a longer HTML timeout (90 seconds minimum to 120 seconds max).
SB without scraping easily gives me 1 million or more URLs per day on a slow connection shared with SER, which runs 24/7 with interruptions for maintenance only. It works fine and, even on a very slow broadband connection, brings in enough target URLs to keep SER busy. Final result: the last run I did yielded 1.0 million harvested URLs, converted after filtering into some 8,000 potential submission targets, and the submitted-to-verified conversion ratio is much higher than when scraping; usually about 25-40% are verified immediately.
FYI: working in KH (Cambodia), I have a connection with a total max of 2 Mbit/s shared with multiple users here, so my own share is usually 256 kbit/s, 512 kbit/s at most. That is very different from the guys in the EU or US, or anyone working on a VPS. There is no fast 3.75G or 4G here, and no ISP offers more than 2 Mbit/s. But there are always solutions for success, even with limited resources.