A new Tool is born - GSA Proxy Scraper
Sven
www.GSA-Online.de
GSA Proxy Scraper
You can find details here: http://www.proxy-scraper.com/
I hope this will help everyone find more reliable proxy sources, test them, and keep them up to date.
- comes with over 800 sources for finding proxies
- many ways to find and locate new sources
- proxy/port scanner
- many options for testing against different websites (you can create custom tests, of course)
- automatically checks the anonymity level
- reports suspicious proxies (ones controlled by a spying company)
- many filter options, such as Google-passing proxies
- automatic upload of proxy reports (CSV, text, HTML)
- internal proxy server that lets any tool use the proxies directly
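For the anonymity-level check mentioned above, the usual approach is to fetch a "judge" page through the proxy and look at which headers (and which IP) came through. A minimal classification sketch; the function name, level names, and header list are my own assumptions, not the tool's actual logic:

```python
def anonymity_level(headers: dict, real_ip: str) -> str:
    """Classify a proxy from the headers a judge script saw.
    - transparent: your real IP leaks through
    - anonymous:   proxy-revealing headers present, but IP hidden
    - elite:       no trace of a proxy at all
    (Hypothetical helper for illustration only.)"""
    values = " ".join(headers.values())
    if real_ip in values:
        return "transparent"
    if any(h.lower() in ("via", "x-forwarded-for", "proxy-connection")
           for h in headers):
        return "anonymous"
    return "elite"

print(anonymity_level({"Via": "1.1 squid"}, "203.0.113.7"))
# -> anonymous
```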
Any ideas, suggestions, and bug reports (I hope there are no big bugs) are welcome!
Comments
Also, generate IP ranges based on countries or on many proxies (so we can import them, not only use existing ones).
Here is the ProxyFire format (right now I have no idea how to import these ranges here):
115.184.100.60 115.184.106.190 #1666 hosts
would become...
115.184.100.60-115.184.106.190
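Converting that format is easy to script; a quick sketch (the function name and validation are illustrative, not part of GSA Proxy Scraper):

```python
import ipaddress

def proxyfire_to_dash(line: str) -> str:
    """Turn a ProxyFire range line like
    '115.184.100.60 115.184.106.190 #1666 hosts'
    into '115.184.100.60-115.184.106.190'."""
    parts = line.split("#", 1)[0].split()  # drop the '#... hosts' comment
    start, end = parts[0], parts[1]
    # Validate both ends so garbage lines fail loudly.
    ipaddress.ip_address(start)
    ipaddress.ip_address(end)
    return f"{start}-{end}"

print(proxyfire_to_dash("115.184.100.60 115.184.106.190 #1666 hosts"))
# -> 115.184.100.60-115.184.106.190
```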
Is there any randomization when testing proxies, so we can avoid netscan detection?
- get IP ranges that generate tons of Google-passed proxies (myself)
- loop the testing only for Google-passed proxies
- and export them to my FTP (I know it's possible)
Is all of that already possible at this stage?
Yes, all of that is possible.
- automatic export to FTP works, including a filter for Google-passed proxies only
- re-testing proxies can be limited to Google-passed ones as well, but I would use the filter on export only
There are many services where people sell port-scanned, Google-passed proxies.
Some ranges can generate loads of Google-passed proxies; the key is to find these ranges, scan them 24/7, then export the Google-passed ones to your scraping tool.
Some proxies are up only at certain hours, etc.
So project a) scans IP ranges for open ports (proxies),
project b) tests the proxies from project a) for Google pass (say, every 60-180 minutes), and
project a) runs only if project b) holds fewer than 3,000 Google-passed proxies.
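The a)/b) setup above can be sketched as a simple loop; `scan_ranges` and `test_google_pass` are placeholder callbacks, not real GSA Proxy Scraper APIs:

```python
GOAL = 3000  # minimum stock of Google-passed proxies

def should_scan(google_passed_count: int) -> bool:
    """Project a) (port scanning) runs only while project b)
    holds fewer than GOAL Google-passed proxies."""
    return google_passed_count < GOAL

def run_cycle(stock, scan_ranges, test_google_pass):
    """One re-test cycle (run every 60-180 minutes).
    scan_ranges() yields newly found proxies; test_google_pass(p)
    says whether proxy p still passes Google."""
    if should_scan(len(stock)):
        stock |= set(scan_ranges())              # project a): find candidates
    return {p for p in stock if test_google_pass(p)}  # project b): keep passers
```

Exporting the surviving set to FTP after each cycle would give the "export Google-passed only" workflow described above.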
By the way, will the internal proxy server randomize the proxy each time?
- loop function added for the next update
- the internal proxy server uses a new proxy on each request
- using a proxy for proxy scanning/testing is something I can add as well, but maybe not in the next update
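The "new proxy on each request" behavior amounts to simple rotation; a round-robin sketch (illustrative only, not the tool's actual implementation):

```python
import itertools

class ProxyRotator:
    """Hand out a different upstream proxy for every request."""
    def __init__(self, proxies):
        self._cycle = itertools.cycle(proxies)

    def next_proxy(self) -> str:
        return next(self._cycle)

rotator = ProxyRotator(["1.1.1.1:8080", "2.2.2.2:3128", "3.3.3.3:80"])
print([rotator.next_proxy() for _ in range(4)])
# -> ['1.1.1.1:8080', '2.2.2.2:3128', '3.3.3.3:80', '1.1.1.1:8080']
```

Swapping `itertools.cycle` for `random.choice` would give randomized rather than round-robin selection.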
Thank you for the feature. Actually, we can use only one proxy; would it be too much hassle to import many proxies and randomize them? I mean, too much hassle in coding?
- nah, of course using many proxies is not a problem; I can add that as well
- the changelog menu is indeed missing
- I will recheck the ProxyFire format (coded blindly and never tested, as I thought it would work)
Did you test proxy scanning with 5k-10k-20k threads? It looks like it gets stuck and slow; maybe there's a limit?
@useruser1 the Proxy Scraper is a complete rewrite. It is way faster, offers many more functions, and has many new sources and ways to get proxies.
---
Checking proxies via a proxy... what's the use of it? It would cause a lot of problems if that check proxy is down.
---
It re-checks proxies at intervals. You can configure whether this happens, or limit it to certain tagged proxies.
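The tag-limited interval re-check could be sketched like this; the `Proxy` record and function names are my own assumptions, not the tool's configuration options:

```python
from dataclasses import dataclass, field

@dataclass
class Proxy:
    host: str
    tags: set = field(default_factory=set)

def select_for_recheck(proxies, enabled=True, only_tags=None):
    """Pick which proxies an interval re-check should test.
    only_tags=None means re-check everything; a set of tags
    restricts the re-check to proxies carrying any of them."""
    if not enabled:
        return []
    if only_tags is None:
        return list(proxies)
    return [p for p in proxies if p.tags & only_tags]

pool = [Proxy("1.1.1.1:80", {"google"}), Proxy("2.2.2.2:80")]
print([p.host for p in select_for_recheck(pool, only_tags={"google"})])
# -> ['1.1.1.1:80']
```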
---
Proxy providers can also get content from local files in the next update (I must have forgotten about this).
Count me in if there's an early-bird discount code,
pretty please @sven
Come on, this tool is fresh and you want a discount already? It's better than any other proxy tool you will find... that should be enough to get your attention.