How To Use SER To Get Public Proxies For Scrapebox

Hey guys,

Somebody just asked me about this, so I thought I would spell out what I do.

The issue is that SB harvesting of public proxies sucks - that guy could learn a thing or two from @sven. But when I need to check indexing or PR on Scrapebox, this is what I do to get a bunch of Public Proxies FAST:

First, here's where it all happens:

[Screenshot: SER's proxy options tab, with the scrape/test/export-to-file settings checked]

  1. Shut down SER
  2. Create a duplicate project, and stick a different URL in it just in case it tries to post
  3. Set all real projects to inactive; the dummy project is the only active one
  4. Check all the boxes shown above, and make sure the path the proxies are written to is somewhere convenient, like your desktop
  5. Save and start the dummy project
  6. It takes about 3-10 minutes, then the text file appears with about 150 proxies. Hit Scrapebox quickly after that, because public proxies burn out lightning fast, especially as you use them in Scrapebox (see the sketch below for one way to catch the file the moment it appears)
  7. You will get about 2-4 iterations of checking done in SB before the proxies die; as you know, they burn out very fast, hence the repeat iterations
  8. You reach a point where the public proxies are no longer good and you only have a small number of index or PR checks remaining
  9. When I get the unchecked list in SB down to about 30, I substitute my real proxies (or my own IP) to finish those final 30 or whatever
I have run into a situation where the proxy list didn't get written to my desktop; I went back into the tab above, toggled a few options off and on, and that triggered the export of the text file.
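Since speed is everything here, a minimal sketch like the one below will poll until the export shows up and read the list the instant SER writes it. This is just an illustration, not anything built into SER, and the desktop path and file name are assumptions; point them at whatever path you set in the options above:

```python
import os
import time

# Hypothetical path: point this at wherever you told SER to export the proxies.
PROXY_FILE = os.path.expanduser("~/Desktop/proxies.txt")

# Poll until SER writes the export file.
while not os.path.exists(PROXY_FILE):
    time.sleep(5)

# Give SER a moment to finish writing, then read the list.
time.sleep(2)
with open(PROXY_FILE) as f:
    proxies = [line.strip() for line in f if line.strip()]

print(f"{len(proxies)} proxies ready -- import them into Scrapebox now!")
```

From there you can paste the list straight into Scrapebox before the proxies start dying.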

So this is how I do big indexing checks and PR checks. Everybody talks about this and that source for public proxies. You don't need it. This is 1000% faster than running SB to find proxies, which is a complete waste of time: 5 minutes versus an hour. Plus the public proxies are way better than what SB finds.

So +1 to SER and @sven on this. It kicks ass.

Ron

Comments

  • SvenSven www.GSA-Online.de
    thanks for the flowers :)
  • Very nice Ron.

    I'll have to try this out next time I've got some hardcore index checking action going on!

    Cheers.
  • ronron SERLists.com
    I know most people missed this, but when you need to run indexing or PR checks, this is a great way to get public proxies without any additional effort.
  • Going to give this a try, thanks for that Ron.
  • Nice one Ron
  • I was going to test this but you have done it already :D ! Well, GSA SER does have so many features!!
  • I've been trying this technique for hours and for the love of god I can't get it working at all. I set GSA to scrape and test public proxies and export to a file, but whenever I load the proxies into Scrapebox and start harvesting, I can't get a single successful search.

    GSA SER tests the proxies and many of them come back fine (like 1,000 from 15k scraped), but Scrapebox can't use a single one of them. Not sure why, but this doesn't work at all for me. Not even close to anything worthwhile.

    I'm wondering if this still works or if I have to tweak something..
  • ronron SERLists.com
    @spiritfly - I don't think anything has changed. I will give it a go later this week when I have time. The key is not to waste any time testing them yourself: let SER write them to your desktop, and as soon as that happens, import them into Scrapebox and let it rip.
  • @ron yup, I'm sure I did exactly that. Faster than a lightning strike. :) I tried scraping Google immediately after the list was built. From 5-6 tries (different proxy scrapes from GSA) I only managed to get 1 successful search. I tried 100 and 500 threads in Scrapebox and played with every setting I could find, but never managed to get anything.

    It's strange that it doesn't work. After GSA tested around 13k proxies, around 1,000 were successful on Google. I'll play around again today and see how it goes.
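    One way to settle whether the exported proxies can reach Google at all, independent of both tools, is a quick check like the minimal sketch below (Python standard library only; the proxies.txt name, timeout, and thread count are all assumptions). Note that connecting to Google and getting un-captcha'd search results are different things, which may explain proxies passing SER's test but failing Scrapebox searches:

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical file name: the list SER exported, one ip:port per line.
with open("proxies.txt") as f:
    proxies = [line.strip() for line in f if line.strip()]

def reaches_google(proxy):
    """Return True if a request to Google succeeds through this proxy."""
    handler = urllib.request.ProxyHandler({
        "http": "http://" + proxy,
        "https": "http://" + proxy,
    })
    opener = urllib.request.build_opener(handler)
    try:
        # A 200 only proves connectivity; Google may still captcha
        # or block actual search queries coming from this IP.
        return opener.open("https://www.google.com/", timeout=10).status == 200
    except Exception:
        return False

# Public proxies are slow and flaky, so test them in parallel.
with ThreadPoolExecutor(max_workers=50) as pool:
    alive = sum(pool.map(reaches_google, proxies))

print(f"{alive} of {len(proxies)} proxies reached Google")
```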
  • ronron SERLists.com
    @spiritfly, well then, that is a bummer. It was my secret method, lol. I am still going to try it, maybe this weekend. I am hoping I can get it to work, because it is a brilliant way to get Scrapebox working faster.