
Feeding proxies to GSA SER

I thought I had this working before, but can someone walk me through the process? I have PS saving a text file to a folder. Then under Add/Edit ProxySites in SER, I have added the location of this folder. I have SER set to Automatically Search For New Proxies every 5 minutes (I will increase this once I get it working properly). The SER log says "Searching and testing for new proxies..."
"Testing 0 Proxies..."
"Proxy testing finished"

I know that the location I have saved the proxy TXT file in has data in it. But SER doesn't seem to be picking it up. Any ideas why?
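For anyone hitting the same "Testing 0 Proxies" message, one way to rule out file problems is to check the export outside of SER. This is only a rough sanity-check sketch, not anything SER or PS provides; the path below is a placeholder for wherever PS actually saves its list:

    import os

    # Placeholder path: point this at the text file GSA Proxy Scraper exports.
    PROXY_FILE = r"C:\proxies\ps_export.txt"

    if not os.path.isfile(PROXY_FILE):
        print("File not found - check the folder given to SER under Add/Edit ProxySites.")
    elif not os.access(PROXY_FILE, os.R_OK):
        print("File exists but is not readable by this account - check its permissions.")
    else:
        with open(PROXY_FILE, encoding="utf-8", errors="ignore") as f:
            proxies = [line.strip() for line in f if line.strip()]
        print(f"{len(proxies)} proxy lines found.")
        if not proxies:
            print("File is empty right now - PS may have been rewriting it when SER polled it.")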

Comments

  • SvenSven www.GSA-Online.de
    is that location accessible for everyone, or do you maybe save to some location that is restricted to admins?
  • It's on Dropbox. Should I make it local?
  • SvenSven www.GSA-Online.de
    should work on dropbox as well but yes, try it local.
  • Didn't work on local either. I'm stumped. I can always import them by hand, just trying to automate everything here.
  • I got it sorted. Thanks.
  • Tins1960Tins1960 United States
    I have my proxies saved locally and I have checked the properties; each file is accessible to the system, not just admins. For some reason I am only getting about 200 Google-passed proxies with the proxy scraper. When SER pulls the file that the Google-passed proxies are saved to, it only gets about 9 of them. I am not sure what I am doing wrong. I have followed the setup vid and am getting little results.

    Can anyone help?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Google-passed proxies burn out crazy fast, so it could just be down to the delay in loading them into SER, even if it is only a couple of minutes.

    When I'm using the PS proxies in Scrapebox I have its harvester running at 3,500 threads, and I am confident there are a few people doing the exact same thing.
  • Tins1960Tins1960 United States
    That makes sense for the burn through. Thanks mate!
  • Tins1960Tins1960 United States
    How do you fix the delay? Should I pull every 5 minutes from the save file?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    You can't fix the delay, you can only minimise it. Pretty sure 5 mins is the lowest option available, so set it to that.

    Essentially you want the proxies active in your scraping projects ASAP before someone else burns them out.
  • Tins1960Tins1960 United States
    OK, Shaun, I love the info you post. I have followed several threads with your tips for maximizing LPM. Do you use an automated file to pull these proxies into Scrapebox to find targets?
  • Tins1960Tins1960 United States
    TY
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Nah, when you open the Scrapebox harvester there is a built-in option for refreshing proxies when you click the proxy drop-down.

    I used to have one instance of Scrapebox that just scraped proxies and dumped them into a file, but GSA PS is much better for this.

    If your list is still small you can have two instances of Scrapebox up at once: one running off GSA PS proxies and one using Scrapebox's internal cloud proxies. I used to recommend link extraction as the quickest method of growing a list, but it returns so many unusable links, or links on platforms with problems, that I have moved over to footprint scraping for SER stuff.
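
The "one instance dumping proxies into a file" hand-off mentioned above can be approximated with a small script. The sketch below is only an illustration under assumed paths (both file locations are hypothetical, and neither SER nor PS ships anything like this); it keeps a deduplicated copy of the PS export fresh for whichever tool reads it, on roughly the same 5-minute cycle discussed in this thread:

    import time

    # Hypothetical paths - adjust to wherever PS exports and wherever SER/Scrapebox reads.
    SOURCE = r"C:\proxies\ps_export.txt"       # file GSA Proxy Scraper keeps updated
    TARGET = r"C:\proxies\live_proxies.txt"    # file the scraping tool is pointed at

    POLL_SECONDS = 300  # roughly SER's 5-minute minimum; lower it if your tool allows

    while True:
        try:
            with open(SOURCE, encoding="utf-8", errors="ignore") as f:
                # Deduplicate while preserving order so the freshest list stays intact.
                proxies = list(dict.fromkeys(line.strip() for line in f if line.strip()))
            with open(TARGET, "w", encoding="utf-8") as f:
                f.write("\n".join(proxies))
            print(f"Copied {len(proxies)} proxies to {TARGET}")
        except FileNotFoundError:
            print("Export file not found yet - waiting for PS to write it.")
        time.sleep(POLL_SECONDS)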