...After Proxy Multiply scrapes proxies from all of those websites, you now have at least 10,000 proxies in the program. Once you have proxies in the program, Proxy Multiply goes into 'search mode'. Proxy Multiply will then use those 10,000 proxies to search Google for more fresh lists of proxies!...
Instead of doing only port searches, IP searches from fresh proxy IPs could be nice.
I also have No Hands Proxies, which is really useless: instead of adding backlinks, NHSEO spends its time scanning and removing bad proxies. I wonder if this tool will be the same. Will this save the $$ paid for private proxies, or will I still have to pay for them?
@TOPtActics that's already happening. It does certain tests with known proxy IPs in search engines. All replies from them are stored and parsed for new proxies.
@Marc_L how should that work? I mean, it finds hundreds of new proxies each time... scanning all of them would be too much in my eyes.
I did the same with No Hands Proxies: very tempting on the trial, but after I bought it there was very limited long-term effect... I went back to private proxies. I will try the demo; hopefully it will be more powerful.
What I need to know from your tests is, for example: with one website project at max 200 backlinks/day, what percentage of links would your tool post compared with private proxies? Using No Hands Proxies I would get only around 10-20%, resulting in a very low and slow rank improvement.
"@Marc_L how should that work? I mean, it finds hundreds of new proxies each time... scanning all of them would be too much in my eyes."
Hi Sven,
Maybe just have it run once every X minutes/hours/days on the currently working proxies. I really like the port-scanned proxies as they last much longer and fewer people have access to them.
Private proxies are always better; you can't compare them to public proxies. That being said, I use public proxies on lots of my GSA installs and get good results. With this program I "pre-filter" hundreds of thousands of proxies down to a thousand or so, so that GSA doesn't have to check 100k proxies and has more resources left to make backlinks.
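That pre-filtering idea can be sketched in a few lines of Python. This is a minimal illustration of the workflow, not Proxy Scraper's actual code; the test URL, timeout, and thread count are arbitrary assumptions:

```python
import concurrent.futures
import urllib.request

def check_proxy(proxy, url="http://example.com", timeout=5):
    """Return True if an HTTP request routed through `proxy` succeeds in time."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": "http://" + proxy})
    )
    try:
        opener.open(url, timeout=timeout)
        return True
    except Exception:
        return False

def prefilter(proxies, workers=50):
    """Check a large proxy list concurrently and keep only the live ones,
    so the consuming tool (e.g. SER) never wastes threads on dead proxies."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        alive = list(pool.map(check_proxy, proxies))
    return [p for p, ok in zip(proxies, alive) if ok]
```

The point of doing this in a separate process is exactly what the post describes: the checking cost is paid up front, outside the link-building tool.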
300 threads - everything works fine for the first X minutes, then the remote desktop freezes.
It's not possible to connect to the server via remote desktop.
After a server reset (via IPMI) everything is back to normal.
After updating to 1.04 the software freezes and doesn't start.
I stopped using the demo, but I still had problems with remote desktop and with downloading files from Google Drive. After uninstalling Proxy Scraper everything was back to normal. There is some problem, I believe. I use the same servers for port scanning at XXX thousand port scans per second, so I don't think there is any problem on the server side.
The connection is 1 Gbps; the proxy checker was using less than 5-10 Mbps. I can use 100-200 Mbps on this server with no problems. Can I find any logs after uninstalling Proxy Scraper to send you?
I bought Proxy Scraper and CB today. Can you add a main option to Proxy Scraper that automatically deletes proxies slower than a given number? I see the option exists, but it's manual; it would be great if the program could delete the slower proxies automatically.
(and automatically delete the non-anonymous proxies as a main option too)
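For what it's worth, the requested automation boils down to two simple filters. A sketch with hypothetical names, assuming latencies have already been measured in milliseconds and anonymity has already been tested:

```python
def drop_slow(timed_proxies, max_ms=2000):
    """Keep only proxies whose measured latency is below max_ms.
    `timed_proxies` is a list of (proxy, latency_ms) pairs."""
    return [proxy for proxy, ms in timed_proxies if ms < max_ms]

def keep_anonymous(flagged_proxies):
    """Keep only proxies flagged as anonymous.
    `flagged_proxies` is a list of (proxy, is_anonymous) pairs."""
    return [proxy for proxy, anon in flagged_proxies if anon]
```

Running these after each check cycle would give the "automatic delete" behaviour the post asks for.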
There is the automatic export option in Proxy Scraper. How can I set up the other GSA products (SER + Redirect + Indexer) to automatically read and import the exported proxy file?
I have run it for 3 days now. I changed the interval to 30 minutes, switched on the anonymity filter, and deleted non-working proxies, but it's not scraping that many proxies: only 400-500 on average in total (and I don't use any other limitations). Maximum threads: 35, connect timeout: 15000, other timeout: 5000.
If I switch on the built-in proxy scraper in any of my other GSA products, I get thousands of anonymous proxies very fast. What is the problem with the scraper?
There is no problem! The proxies in Proxy Scraper are tested. If you add them with other GSA tools, you get them all, but most of them will turn out to be dead.
The next version will test the proxies for you.
By the way, to get a random proxy from the list, you can also do the following:
1. Enable the internal proxy server in Options.
2. In the proxy scanner, simply choose that internal proxy server (by default 127.0.0.1:8080) and it will use a random one.
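The same trick works for any HTTP client, not just the proxy scanner: point it at the internal proxy server and each request goes out through a random upstream proxy. A minimal Python sketch, assuming the default 127.0.0.1:8080 address from step 2:

```python
import urllib.request

# Default address of the internal proxy server (configurable in Options).
INTERNAL_PROXY = "http://127.0.0.1:8080"

def build_random_proxy_opener(proxy=INTERNAL_PROXY):
    """Build an opener that sends HTTP/HTTPS traffic through the internal
    proxy server, which in turn relays via a random proxy from its list."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    return urllib.request.build_opener(handler)

# Usage (requires the internal proxy server to be running):
# html = build_random_proxy_opener().open("http://example.com", timeout=10).read()
```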
@andrew there is a demo version. Just try it.
I am spending about $600 a month on proxies, so if this will lower the cost I'd definitely try it, but my experience with public proxies isn't good.
I guess it doesn't matter since we can filter them out in the automatic export section, right?
Just lower the timeout and the proxies you get are faster.
A filter to accept only anonymous proxies is there as well.
Question: what format do I use to enter multiple ports here: http://prntscr.com/77wogf
Is it port,port,port or port port port or something else?
1. Export in Proxy Scraper to a location of your choice.
2. In SER (or any other GSA product), add a new proxy provider and, instead of using it to read from a URL, point it to the file from step 1.
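On the consuming side, the exported file is just a plain proxy list. A minimal reader sketch; the one-ip:port-per-line layout is an assumption about the export format, and the function names are hypothetical:

```python
import random

def load_proxies(path):
    """Read an exported proxy list: one ip:port per line, blank lines skipped."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def random_proxy(proxies):
    """Pick one proxy at random from the loaded list."""
    return random.choice(proxies)
```

Re-reading the file on a schedule keeps the consumer in sync with Proxy Scraper's automatic export.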