GSA Proxy Scraper – Finding Fast Stable Proxies

1. To start off with – don't have anything using bandwidth in the background, as it will affect the results.
2. First you need to configure GSA.
3. Find a good source – in this case let's use https://www.blackhatsem.com/Forum-Proxy-Lists
4. To find this, go to GSA Proxy Scraper, right click on a source and choose Copy Source URL.
5. Right click and go to the source URL.
6. This will open up your web browser.
7. Now go back to GSA Proxy Scraper.
8. Click the blue Add button.
9. Search Online.
10. From URL(s).
11. Enter – the bottom option.
12. This will open up the Parse URLs box and you will see the URL you copied, so it will automatically paste it.
13. Parse Sub links – drop that down to All Links (a rough sketch of what this step does is shown after this list).
14. How to add all proxies:
15. I use ADD ALL and Test.
16. I will explain later why this seems to work better.
17. The results box will open, BUT DON'T CLOSE IT.
18. You will see the working proxies it has found.
19. Go to the Save button that you can see in the opened box.
20. Just call it Blackhatsem for future use, as it will automatically name the file blackhatsem.dat in this case (or whatever you named it).
21. You can close the box once you have the important file you need.
22. Now the new proxies will be added.
23. Go to Settings in GSA now.
24. Go to Automatic Search.
25. The first time you use it, highlight everything.
26. Press Test.
27. If any sites give errors, don't use them until you find out why.
28. The first test will take a while.
29. Don't close the box yet.
30. Go to Settings.
31. Just follow the settings I have used and I will explain why later!!
32. Close the box.
33. Go to Test.
34. All.
35. Make sure every box is ticked.
36. You can maximise this box, but untick Custom Test in this case (it isn't needed).
37. Test the proxies now.
38. Go away, have a holiday and come back, as it will take a while. "Be patient."
39. With some of the proxies you will always get one that is stubborn and just sits there.
40. Don't touch it.
41. Basically, the more colours on a proxy line or in the TAG column, the better.
42. Done!!
43. Now they have been tested.
44. Go to the Reliability column and delete anything below 80%.
45. The important thing is that you have stable, fast, reliable proxies.
46. You don't need to always perform the long test, especially as you now have some stable proxies.
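(For anyone curious what is going on under the hood in steps 12-15, here is a very rough sketch in Python. It is not GSA's actual code; the requests library, the regex and the source URL are just assumptions to illustrate the idea of fetching a source page, following its sub links one level deep, and pulling out anything that looks like ip:port.)

```python
# Hypothetical sketch of "Parse Sub links -> All Links": fetch the source page,
# follow every link it contains one level deep, and collect anything that
# looks like ip:port. NOT GSA Proxy Scraper's code, just an illustration.
import re
import requests
from urllib.parse import urljoin

PROXY_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}:\d{2,5}\b")
LINK_RE = re.compile(r'href="([^"]+)"')

def scrape_source(url):
    found = set()
    page = requests.get(url, timeout=10).text
    found.update(PROXY_RE.findall(page))
    # Follow sub links one level deep, like the "All Links" option.
    for link in set(LINK_RE.findall(page)):
        try:
            sub = requests.get(urljoin(url, link), timeout=10).text
            found.update(PROXY_RE.findall(sub))
        except requests.RequestException:
            continue
    return sorted(found)

if __name__ == "__main__":
    proxies = scrape_source("https://www.blackhatsem.com/Forum-Proxy-Lists")
    print(f"{len(proxies)} raw (untested) proxies found")
```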
When you test again, lower the timeout from 6000 to 1000 - 2000. I sometimes use 800 - 1000.
The lower the number the better. 1000 ms = 1 second, i.e. a ms (millisecond) is a 1000th of a second.
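(Rough illustration of the timeout idea, not GSA's own test – the requests library and api.ipify.org as a judge endpoint are my assumptions. A proxy that can't answer inside your millisecond budget gets dropped.)

```python
# Hypothetical timeout-based proxy check; not GSA's internal test.
# api.ipify.org is just one example of a simple "judge" endpoint.
import requests

def responds_in_time(proxy, timeout_ms=2000):
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        # requests takes the timeout in seconds, hence the /1000 conversion.
        r = requests.get("https://api.ipify.org",
                         proxies=proxies, timeout=timeout_ms / 1000)
        return r.ok
    except requests.RequestException:
        return False

# Tightening the timeout from 6000 ms to 1000-2000 ms keeps only the fast ones.
fast = [p for p in ["1.2.3.4:8080", "5.6.7.8:3128"] if responds_in_time(p, 1500)]
```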
If Google blocks it or IP Chicken blocks it, then chances are it will fail. Also, the "Alive" column is important.
Some of my reliable ones have been alive for close to 30 days. But be careful, as some could be a "honey pot", so never use unknown proxies for any important browsing that is unencrypted.
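(A simple sanity check you can run yourself, again just a hedged sketch with api.ipify.org as an assumed judge: if the judge still sees your real IP through the proxy, the proxy is transparent and definitely not one to trust.)

```python
# Hypothetical anonymity check, not GSA's test: if the judge still sees your
# real IP through the proxy, the proxy is transparent (or worse).
import requests

JUDGE = "https://api.ipify.org"  # assumed judge endpoint

def is_anonymous(proxy):
    via = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        my_ip = requests.get(JUDGE, timeout=10).text.strip()
        seen = requests.get(JUDGE, proxies=via, timeout=10).text.strip()
    except requests.RequestException:
        return False  # unreachable: treat as not usable
    return seen != my_ip

# Even an "anonymous" result is no guarantee against a honeypot logging your
# traffic, so keep unencrypted or important browsing off unknown proxies.
```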
Get a portable browser that you don't need to install.
You will need to configure the browser to use proxies, or set up GSA to run as a local proxy on 127.0.0.1:8080.
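(If you go the 127.0.0.1:8080 route, anything that honours an HTTP proxy setting can be pointed at it. A minimal sketch, assuming Python's requests library and that the port matches whatever you configured in GSA:)

```python
# Point an ordinary HTTP client at the local proxy GSA exposes.
# 127.0.0.1:8080 is assumed to match your GSA configuration.
import requests

local_proxy = {"http": "http://127.0.0.1:8080",
               "https": "http://127.0.0.1:8080"}

r = requests.get("https://api.ipify.org", proxies=local_proxy, timeout=15)
print("IP seen by the site:", r.text)  # should be the proxy's IP, not yours
```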
Now the opposite – loading from the .dat file you saved:
1. Open GSA Proxy Scraper.
2. ADD.
3. From File.
4. Find where your .dat file is.
5. If you can't see it, change the file type from .txt to All (*.*).
6. Now double click the file you saved in the past.
7. A box will pop up.
8. Middle option – ADD and Test.
9. If it doesn't find any, the link has died!!
10. If the .dat file is large, it will take a while.
11. Now test on a high timeout – 6000 to 8000 is OK to start off with.
.dat files go stale for many reasons – just like bread, you need to check them or update them. I have many other tricks that can get exactly what you need.
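(A hedged sketch of that refresh idea. Assumptions: the saved file is a plain one-ip:port-per-line list – which may not match GSA's real .dat format – and the responds_in_time() helper from the earlier sketch is available.)

```python
# Hypothetical "refresh" of a stale proxy list: re-test every entry in the
# saved file and write back only the ones that still answer.
# Assumes one ip:port per line and the responds_in_time() helper from above.
from pathlib import Path

def refresh_list(path="blackhatsem.dat", timeout_ms=8000):
    entries = {line.strip() for line in Path(path).read_text().splitlines()
               if line.strip()}
    alive = [p for p in sorted(entries) if responds_in_time(p, timeout_ms)]
    Path(path).write_text("\n".join(alive) + "\n")
    return alive

# Start with a generous timeout (6000-8000 ms) on a stale list, then tighten.
```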
Enjoy !!
Tips – when performing a large test, don't use many TAGS to start off with, because you want to eliminate the IPs that are stubborn. Use Anonymous, which you have to use anyway, and maybe one other – I use Ticketmaster or Google Search. Google is very strict and bans, blocks or presents a CAPTCHA – though not every proxy that fails the Google test is truly dead.
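(Rough sketch of how you might spot a Google block through a proxy yourself – not GSA's tag test. The HTTP 429 status and the /sorry/ CAPTCHA redirect are assumptions about how Google typically signals a block.)

```python
# Hypothetical Google-block check, not GSA's tag test. Google commonly answers
# blocked clients with HTTP 429 or a redirect to a /sorry/ CAPTCHA page, but
# treat both signals as assumptions rather than a definitive rule.
import requests

def google_blocked(proxy):
    via = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get("https://www.google.com/search?q=test",
                         proxies=via, timeout=10, allow_redirects=True)
    except requests.RequestException:
        return True  # no answer at all
    return r.status_code == 429 or "/sorry/" in r.url

# A proxy that fails here may still be fine for less strict targets.
```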
Wow, that looks like a lot of work. I'll stick to paying $30 p/m to webshare until such time as I need something else. And thank you for the share and instructions - bookmarked.
Hi Rastarr - it's not really a lot of work once you automate it.
It means that if webshare went down, for example, I still have an unlimited source of fast, reliable proxies.
Besides, that's $360.00 a year.
If you don't mind me asking, why is it important to you to pay?
Well, I just bought GSA PS based on the great results I'm seeing you get. Testing, testing, testing. Do you add this URL the same way as https://www.blackhatsem.com/Forum-Proxy-Lists, mentioned in your first post?
Actually, I discovered that this URL is already in the list of providers, so I guess there's no need to add anything - the same as https://www.blackhatsem.com/Forum-Proxy-Lists, once you edit the entry and adjust the URL.
Put that link into the Google search bar to see how I searched.
Within 3 seconds I got close to 1,000 raw proxies.
Raw means untested.
Do the same with Yandex or DuckDuckGo, for example, and you will get thousands.
Can't you simply add these strings as new Provider types though? Or is the Parse Links to follow option something that @sven needs to add to make it more automated?
Anyway, I'm still a total newb with GSA PS, so there's a lot I must learn, including how to chain proxies and its value to me. At least Proxy Scraper gives me one less point of failure as far as 3rd party providers go, plus cost savings and more proxies once I get a process working properly.
I guess go to an existing one, press Edit and see how it's created. I can see you click on Add and a single box pops up. I am not sure if there is a way of testing whether existing ones are dead or alive?
There is a file here that looks interesting: C:\Users\Username\AppData\Roaming\GSA Proxy Scraper
We seem to have these in config.ini (I'm not sure what the number 1 means):
0058.net=1
0dayproxies.blogspot.com=1
0phreakandroid.blogspot.com=1
1000freeanonymousproxy.blogspot.com=1
100filmovru.blog.fc2.com=1
100proxygratis.blogspot.com=1
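(If you want to poke at that file from a script, here is a minimal sketch. The layout of config.ini and the meaning of the =1 flag are assumptions, not documented behaviour.)

```python
# Hypothetical peek at the provider entries in config.ini; the file layout and
# the meaning of the "=1" flag are assumptions, not documented behaviour.
from pathlib import Path

cfg = Path(r"C:\Users\Username\AppData\Roaming\GSA Proxy Scraper\config.ini")

providers = {}
for line in cfg.read_text(errors="ignore").splitlines():
    if "=" in line and not line.startswith("["):
        key, _, value = line.partition("=")
        providers[key.strip()] = value.strip()

# If "1" really is an enabled flag, this lists the switched-on sources.
enabled = [k for k, v in providers.items() if v == "1"]
print(len(enabled), "entries with =1")
```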
@Ritchievalens well, I just added the string with default parameters, as a test. I didn't see any way to test it, so I just added it anyway. Seems like it is working in some way though lol. I have also noted that the depth it searches within a website can be changed?
At the moment, I'm a newbie who is armed and dangerous. Just getting the lay of the land and collecting whatever comes my way, testing with a view to dropping a proxy provider that is a point of failure. Seems to be heading in the right direction though, which makes me a happy camper.