
2.7 mill Proxies not working

Carl71 United Kingdom
Hi there all, any tips would be appreciated...

Did a GSA proxy search on all lists. It came back with 2.78 million proxies. But when I run a test (I've tried Google, Bing and Whatsmyip), if I'm lucky I get about 2-10 of them working, and then the system burns them out in under 10 minutes.

Please help. Not sure if I should be using a different test setup or if my settings are wrong.

I have tried searching YouTube for tutorials but can't find anything set up differently to what I have at the moment.

Any help would be appreciated.

Comments

  • sickseo London, UK
    You do realise these are public proxy sources? Just because you found over 2 million proxies does not mean they will all work.

    By definition, they are public proxies. 
  • Carl71 United Kingdom
    Yes, for sure, totally get that they are public, but even by the law of averages there should be more than 10 working, and even then they should work for more than 10 minutes. (At least I would have thought so)

    :(
  • Sven www.GSA-Online.de
    10 working proxies is indeed very few. If you test against Bing, it should give you a lot more results.
    Do you use any special settings? Please show some screenshots.
  • Carl71 United Kingdom
    Here are the proxy test settings I currently have. (If there are any other settings screenshots you would like, let me know which ones.) Any other suggested tweaks would be appreciated :)


  • sickseo London, UK
    edited October 14
    Testing against Google search is likely why most are showing as dead or not working. Google is extremely sensitive and blocks IPs very quickly. Many of those proxies likely do work when used for link submission, as the sites in your site list won't have the same IP detection that Google uses.

    If you want to scrape Google with these proxies, you will need other types of proxies, such as residential/rotating proxies. Very expensive. The cheap alternative is to scrape other search engines, such as Google clones and Bing/Yahoo, which do not block IPs as aggressively as Google does.

    You're also testing at 1000 threads, which is very high and needs a very good internet line to support that many connections. I normally run it at 100 threads with a 5-second timeout to ensure only fast proxies are saved. Using a 120-second timeout will also save slow proxies that happen to pass the test. You don't want to be using slow proxies.
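    To illustrate the idea (not GSA's actual implementation), here is a minimal Python sketch of that testing strategy: check candidate proxies against Bing with a short timeout and modest concurrency, so only fast, working proxies survive. The proxy strings and the 100-thread/5-second figures follow the advice above; everything else is assumed.

    ```python
    # Hedged sketch: filter a proxy list by testing against Bing with a
    # short timeout, at 100 threads rather than 1000. Slow or dead
    # proxies fail the timeout and are dropped.
    import concurrent.futures
    import urllib.request

    TEST_URL = "https://www.bing.com/"  # Bing blocks IPs less aggressively than Google

    def check_proxy(proxy: str, timeout: float = 5.0) -> bool:
        """Return True if `proxy` ("ip:port") fetches TEST_URL within `timeout` seconds."""
        handler = urllib.request.ProxyHandler(
            {"http": f"http://{proxy}", "https": f"http://{proxy}"}
        )
        opener = urllib.request.build_opener(handler)
        try:
            with opener.open(TEST_URL, timeout=timeout) as resp:
                return resp.status == 200
        except Exception:  # timeout, refused connection, bad response, etc.
            return False

    def filter_working(proxies, threads: int = 100, timeout: float = 5.0):
        """Check proxies concurrently and keep only the ones that respond in time."""
        with concurrent.futures.ThreadPoolExecutor(max_workers=threads) as pool:
            results = pool.map(lambda p: (p, check_proxy(p, timeout)), proxies)
        return [p for p, ok in results if ok]
    ```

    With a scraped list you would call `filter_working(candidates)` and expect only a small fraction of public proxies to come back, which is normal for public sources.
    
    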
  • Carl71 United Kingdom
    Thanks for the tips :)