
Proxy Advice Needed

gooner SERLists.com
Hi guys,

It seems like whatever I do with private proxies, they get burned out in 2-3 days.

I've tried proxyhub and buyproxies and have gone through 50 already. They are still partly working, but I'm seeing constant banned messages in the logs and a lot of "loading from site files", since I have the verified lists selected as a backup.

On the first day with the new proxies I was at 100+ LPM; now I'm lucky to hit 50 as it's falling back to the verified lists, and I'm seeing lots of "Already Parsed" messages.

Currently I have all English search engines selected with a 30-second delay, running 30 proxies and 300 threads.
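For a rough sense of what those settings imply (a back-of-the-envelope sketch, assuming every thread waits the full delay between queries; SER's real scheduler may behave differently):

```python
# Rough model of the query load the settings above imply.
proxies = 30
threads = 300
delay_s = 30  # seconds each thread waits between search queries

total_per_min = threads * (60 / delay_s)      # 600 queries/min overall
per_proxy_per_min = total_per_min / proxies   # 20 queries/min per proxy

print(f"~{total_per_min:.0f} queries/min total, "
      f"~{per_proxy_per_min:.0f} per proxy "
      f"(one every {60 / per_proxy_per_min:.0f} s)")
# One Google query every ~3 seconds from a single IP is aggressive enough
# to explain proxies burning out within a few days.
```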

Have read through the forum and have tried a variety of combinations of search engines, threads and delays (including the method of selecting 5 random Google search engines).

Any advice please?

Thanks

Comments

  • AlexR Cape Town
    What's your time between search queries set to, and how many threads are you running?
  • startrip (edited August 2013)
    To be honest, I'm using Scrapebox solely to scrape sites.

    The funny thing is, when I use the same proxies in GSA SER I get a lot of proxy-ban messages, but Scrapebox is still scraping at 80 URLs/second (quick ban check in the P.S. below).

    The only real way to avoid it is to have a shit-ton of proxies. I tried it with 100 semi-dedicated proxies and still got some of those error messages (fewer), but now with Scrapebox it's fine.

    Grow your global list through importing and the program will run on its own...

    I think scraping new sites is one of the most often discussed problems here, and I tell people all the time to use a scraper (Scrapebox, GScraper or whatever, I'm not affiliated with them) and use SER for posting (it is NOT made for scraping, it is made for posting...).

    It's a bit of additional work (about 5 min/day), but it seems to solve a lot of problems, and you can adjust quickly if you need one special platform etc.

    I haven't imported any lists for 5 days now because I don't want to interrupt the (giant) scrape I'm running, and I still get a stable LPM and every project gets enough links.

    Regards
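
    P.S. A quick way to check whether the proxies themselves are banned on Google (a minimal sketch: HTTP 429 and the "/sorry/" redirect are common ban heuristics rather than anything official, and the proxy addresses are placeholders):

    ```python
    import requests

    # Placeholder proxies; use a real list in ip:port (or user:pass@ip:port) form.
    PROXIES = ["1.2.3.4:8080", "5.6.7.8:8080"]

    def looks_banned(proxy: str) -> bool:
        """Send one Google query through the proxy and look for ban signals.
        HTTP 429 and a redirect to Google's /sorry/ page are common signs of
        an IP ban; this is a heuristic, not an official API."""
        try:
            r = requests.get(
                "https://www.google.com/search",
                params={"q": "test"},
                proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
                timeout=15,
            )
        except requests.RequestException:
            return True  # dead or unreachable counts as unusable
        return r.status_code == 429 or "/sorry/" in r.url

    for p in PROXIES:
        print(p, "banned?" if looks_banned(p) else "ok")
    ```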
  • gooner SERLists.com
    @alexr - 30 proxies, 300 threads, 30-second delay... Is the delay too short?

    @startrip - I've also tried public proxies from Scrapebox in SER, with the same result as you. Actually, I've partly been doing what you said: harvesting proxies with Scrapebox, importing them into GScraper and letting it run for 24 hours, then importing the lists into SER... But I've only been doing that on a few projects at a time. So maybe I'll try importing lists into all projects and also growing the verified lists as you suggested (dedupe sketch in the P.S. below). Thanks mate.

    Just out of interest - why do you use Scrapebox instead of GScraper? I find GScraper much faster.
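
    P.S. For the imports, deduping the scraped list by domain first should cut down on the "Already Parsed" messages (a minimal sketch; the filenames are placeholders, assuming a plain-text list with one URL per line):

    ```python
    from urllib.parse import urlsplit

    SCRAPED = "scraped_urls.txt"   # placeholder: GScraper/Scrapebox export
    CLEANED = "import_ready.txt"   # placeholder: file to import into SER

    seen = set()
    kept = []
    with open(SCRAPED, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            domain = urlsplit(url).netloc.lower()
            # Keep one URL per domain; feeding SER the same site repeatedly
            # is what fills the log with "Already Parsed" messages.
            if domain and domain not in seen:
                seen.add(domain)
                kept.append(url)

    with open(CLEANED, "w", encoding="utf-8") as f:
        f.write("\n".join(kept) + "\n")

    print(f"kept {len(kept)} of the scraped URLs (one per domain)")
    ```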
  • AlexR Cape Town
    @gooner - Too many threads! It seems Google has tightened up on this. I'd try 100 threads, see how that works, and increase slowly.
  • gooner SERLists.com
    @AlexR - Thanks, I'll give it a try.