Since you set search engine queries to every 90 seconds and you have 50 proxies, each of your proxies only searches once per 90 seconds. This results in a low number of searched sites and threads in the search engines.
I think there may be a problem with the proxy checker in GSA SER: it says all my buyproxies.org proxies are dead, but I checked a few manually using the Scrapebox check (actually harvesting on several Googles like google.com.au, google.com.ph, google.com, and google.co.uk) and they work fine.
I haven't found a way to easily test a proxy; the Scrapebox proxy tester sometimes passes a proxy that is actually dead.
Well, the ban can happen at any time and it's just temporary. It also depends on the query interval you are using, and I've read somewhere that specific search terms can speed up the ban.
By specific search terms I mean special operators like url:"/register/", for example. When one of your proxies has searched with such a term more often than the other proxies, the chances are higher that it will be banned from searching. I can't prove this, though.
Edit: I think I'm confusing something and missed the context of your issue. You say that your proxies are shown as "not working" by the SER proxy tester for Google, but Scrapebox shows them as "working" for Google, right?
Also, I'd like to know what your timeout settings for proxy testing are. Maybe the timeout is lower in SER than in Scrapebox?
Edit 2: did you test them against google.com.au and the other Googles in SER as well?
That's correct. I find the tests in GSA and Scrapebox confusing: Scrapebox says a proxy is working, then GSA says it's not working.
In Scrapebox I dig a bit deeper and try to harvest from certain Googles, i.e. google.com.au, google.com.ph, google.co.uk, and google.com.kw; some pass and some fail. I think Scrapebox just tests against google.com.
The problem with GSA is that when things are running slowly, it's hard to know whether it's a settings issue or a proxy issue. More often than not it's a proxy issue, but it's NOT obvious, nor is there an easy way to test this.
That matters because in my GSA settings I select 5 random Googles, i.e. Google Korea, Google Kazakhstan, Google Ireland, etc., to spread the Googles across my projects and minimize bans.
But the problem is that the ban can be for a SPECIFIC country's Google, e.g. a ban on google.com.kw. The only way I find this out is by searching the log for 000/000 and looking at which Google is failing.
Then I test that Google in Scrapebox to see if my proxy is banned, and if so I replace that Google search engine in my project settings.
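For anyone who wants to script that manual check instead of going through Scrapebox each time, here is a rough sketch in Python. To be clear, this is my own assumption of how to do it, not anything built into SER or Scrapebox: the proxy address, the query, and the "banned" markers (HTTP 403/429/503 or a captcha page) are placeholders and guesses, and Google may well respond differently.

```python
import urllib.error
import urllib.request

# Test the same proxy against several country Googles, since a ban can be
# per-TLD (e.g. banned on google.com.kw but fine on google.com).
GOOGLE_TLDS = ["google.com", "google.com.au", "google.com.ph", "google.co.uk"]

def classify(status_code, body):
    """Guess whether a Google reply means the proxy is banned or still usable."""
    if status_code in (403, 429, 503):
        return "banned"
    lowered = body.lower()
    if "unusual traffic" in lowered or "captcha" in lowered:
        return "banned"
    return "working" if status_code == 200 else "unknown"

def check_proxy(proxy, tld, query="test", timeout=30):
    """Send one search through the proxy against a specific Google TLD."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    url = "https://www.%s/search?q=%s" % (tld, query)
    try:
        with opener.open(url, timeout=timeout) as resp:
            return classify(resp.status, resp.read(4096).decode("utf-8", "replace"))
    except urllib.error.HTTPError as err:
        # A 403/429/503 arrives as an HTTPError; classify it like any reply.
        return classify(err.code, err.read(4096).decode("utf-8", "replace"))
    except OSError:
        return "dead"  # timeout, connection refused, DNS failure, etc.

if __name__ == "__main__":
    proxy = "http://user:pass@127.0.0.1:8080"  # placeholder credentials
    for tld in GOOGLE_TLDS:
        print(tld, check_proxy(proxy, tld))
```

The point is just to separate "dead" (no connection at all) from "banned on this particular Google", which is the distinction the SER tester doesn't surface.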
Maybe @Sven is able to add a "Search Engine Rotation" feature? We select a couple of SEs and SER uses those SEs until they are banned, i.e. until they only return 000/000. Every 24 hours, or if the pool falls below a certain number (say 3 working SEs), the SE counter would be reset and start from the beginning. A number in the status bar could show us how many SEs are in use right now.
However, the main problem I see with this method is that sometimes the footprints just don't return any results, which will also produce 000/000s. This is especially true when "always use keywords" is activated. If bans and empty results could be distinguished from each other, this would be a great feature, I think.
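To make the suggestion concrete, here is a minimal sketch of the rotation logic I have in mind. The class name is made up and the "three empty rounds in a row = ban" heuristic is my own assumption to address the footprint problem above; it's not SER code.

```python
import time

class SearchEngineRotation:
    """Sketch of the proposed feature: use the selected SEs until they look
    banned, reset the pool every 24 h or when too few remain."""

    def __init__(self, engines, min_alive=3, reset_after=24 * 3600):
        self.engines = list(engines)
        self.min_alive = min_alive      # reset once fewer SEs than this remain
        self.reset_after = reset_after  # reset every 24 h regardless
        self._reset()

    def _reset(self):
        self.alive = set(self.engines)
        self.zero_streak = {e: 0 for e in self.engines}
        self.started = time.time()

    def report(self, engine, results):
        """Feed back how many results a query on `engine` returned."""
        if results == 0:
            # A single 000/000 may just be a footprint with no hits, so only
            # a streak of empty rounds is treated as a ban.
            self.zero_streak[engine] += 1
            if self.zero_streak[engine] >= 3:
                self.alive.discard(engine)
        else:
            self.zero_streak[engine] = 0
        if (len(self.alive) < self.min_alive
                or time.time() - self.started > self.reset_after):
            self._reset()

    def in_use(self):
        """The count a status bar could display: SEs currently in use."""
        return len(self.alive)
```

With something like this, an SE that returns results occasionally never gets dropped, while one that is silent for three rounds straight is parked until the next reset.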
Comments
I was wrong to trust proxy-hub and buy 50 dedicated proxies from them. After I switched back to buyproxies.org, the problem was solved.
When I checked my dedicated proxies, almost all of them were dead. I sent them an email yesterday but still haven't gotten any response.
Never buy proxy-hub proxies... their support was also really SLOW...
@Sven, please remove the promotion code that you put in the GSA SER options and switch it to @buyproxies.org.
Now, is it safe to say that's okay? Why do I feel 50 seconds is way too much, and is my 3 seconds fine?
I have learned to ignore it, because every time I check my proxies with Scrapebox or some other SEO software, the proxies are just fine.
I agree the method could use some improvement, but I honestly think you should test them once to make sure they work and then forget about it.