
Separate Thread Settings For Scraping & Posting.

goonergooner SERLists.com
edited November 2013 in Feature Requests
Hi @Sven,

With all the problems people are having with proxy blocks etc., would it be possible to set a certain number of threads for scraping and a separate number for posting?

Or, if that is too much of a coding nightmare, maybe you could add an option so that all URLs identified by SER can be auto-imported into all projects as targets (or into projects selected by mask, etc.).

This would mean SER spends far fewer resources on scraping and therefore cuts down on proxy bans, because at the moment each project scrapes individually, which is probably unnecessary for the majority of users.
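(Just to illustrate the idea, not how SER actually works internally: below is a minimal Python sketch of what separate caps for scraping threads and posting threads could look like. The thread counts, function names and dummy scrape/post bodies are all assumptions for the example.)

```python
import time
from concurrent.futures import ThreadPoolExecutor

SCRAPE_THREADS = 10    # small pool: gentler on search engines, fewer proxy bans
POST_THREADS = 250     # large pool: posting can still run at full speed

def scrape_targets(query):
    # Stand-in for one search-engine query; a real scraper would rotate proxies here.
    time.sleep(0.1)
    return ["http://example.com/%s/%d" % (query, i) for i in range(5)]

def post_to_target(url):
    # Stand-in for submitting to one target URL.
    time.sleep(0.1)

with ThreadPoolExecutor(max_workers=SCRAPE_THREADS) as scrape_pool, \
     ThreadPoolExecutor(max_workers=POST_THREADS) as post_pool:
    # Scraped URLs feed the posting pool; the two pools are throttled independently.
    scrape_jobs = [scrape_pool.submit(scrape_targets, q) for q in ("keyword one", "keyword two")]
    for job in scrape_jobs:
        for url in job.result():
            post_pool.submit(post_to_target, url)
```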

I'm not sure if either of the options above is workable, but I'm sure there must be a more efficient way to scrape. As an example, I have 100 private proxies which I can use in Scrapebox with 25 threads and a 5-second delay on only one Google site, and it scrapes all day long with no issues.

I then import those lists into projects in SER with 250 threads, and again I have enough targets in SER to run all day.

If SER could somehow replicate that process, it would save many hours of manual work.
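(For comparison, here is a rough Python sketch of the Scrapebox-style pacing described above: a small fixed number of scraping threads, a pool of proxies rotated per query, and a fixed delay between queries. The proxy addresses, queries and helper functions are placeholders, not anything taken from Scrapebox or SER.)

```python
import itertools
import threading
import time
from concurrent.futures import ThreadPoolExecutor

PROXIES = ["127.0.0.1:%d" % (8000 + i) for i in range(100)]  # placeholders for 100 private proxies
SCRAPE_THREADS = 25   # few scraping threads...
QUERY_DELAY = 5       # ...with a fixed delay between queries, as in the Scrapebox setup

proxy_cycle = itertools.cycle(PROXIES)
proxy_lock = threading.Lock()

def next_proxy():
    # Hand out proxies round-robin so queries are spread across the whole pool.
    with proxy_lock:
        return next(proxy_cycle)

def scrape_query(query):
    proxy = next_proxy()
    # A real implementation would send the search request through `proxy` here.
    time.sleep(QUERY_DELAY)   # pacing, like Scrapebox's delay setting
    return "scraped '%s' via %s" % (query, proxy)

with ThreadPoolExecutor(max_workers=SCRAPE_THREADS) as pool:
    for result in pool.map(scrape_query, ["keyword one", "keyword two", "keyword three"]):
        print(result)
```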

Please let me know what you think. Thanks
Comments

  • SvenSven www.GSA-Online.de
    What waiting time do you have in SER?
  • goonergooner SERLists.com
    120
  • Or, instead of different settings, maybe different proxies for scraping altogether. Or a combination of both. Just a thought.
  • Actually, looking at it more, I guess there is already a way to use different proxies for searches: just set my 2nd set of proxies as public (even though they aren't) and use them only in searches. Nice.
  • goonergooner SERLists.com
    I don't think that will help; in fact, it will make it worse.

    You are keeping the same number of threads scraping but lowering the number of proxies used. We need the reverse: limit the number of threads scraping but keep all the proxies doing it.