Separate Thread Settings For Scraping & Posting
gooner
SERLists.com
Hi @Sven,
With all the problems people are having with proxy blocks etc., would it be possible to set one number of threads for scraping and a separate number for posting?
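Just to illustrate the idea (a rough Python sketch, not how SER actually works internally; all names and numbers here are made up):

```python
from concurrent.futures import ThreadPoolExecutor

SCRAPE_THREADS = 25    # small pool: keep search-engine queries slow and rare
POST_THREADS = 250     # large pool: posting can safely run much hotter

def scrape(query):
    """Placeholder: query a search engine through a proxy, return target URLs."""
    return [f"http://example.com/{query}"]

def post(url):
    """Placeholder: submit a link to one target URL."""
    print("posting to", url)

queries = ["inurl:guestbook", "powered by vbulletin"]

# Scraped targets feed the much larger posting pool as they arrive.
with ThreadPoolExecutor(SCRAPE_THREADS) as scrapers, \
     ThreadPoolExecutor(POST_THREADS) as posters:
    for found in scrapers.map(scrape, queries):
        for url in found:
            posters.submit(post, url)
```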
Or, if that is too much of a coding nightmare, maybe you could add an option so that all URLs identified by SER are auto-imported into all projects as targets (or maybe into projects matched by a mask, etc.).
This would mean SER spends far fewer resources on scraping and therefore suffers fewer proxy bans, because at the moment each project scrapes individually, which is probably unnecessary for the majority of users.
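Again just to sketch what I mean by "auto-import with a mask" (invented names, nothing to do with SER's real code):

```python
import fnmatch

# One shared, deduplicated scrape feeds every matching project,
# instead of each project scraping on its own.
scraped_pool = {"http://example.com/a", "http://example.com/b"}

projects = {"client-blog": set(), "client-forum": set(), "test-run": set()}
mask = "client-*"   # only import into projects whose name matches the mask

for name, targets in projects.items():
    if fnmatch.fnmatch(name, mask):
        targets.update(scraped_pool)   # auto-import the shared targets
        print(f"{name}: now holds {len(targets)} targets")
```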
I'm not sure if either of the options above is workable, but I'm sure there must be a more efficient way to scrape. As an example, I have 100 private proxies which I can use in Scrapebox with 25 threads and a 5-second delay on only one Google site, and it scrapes all day long with no issues.
I then import those lists into projects in SER running 250 threads, and again I have enough targets in SER to run all day.
If SER could somehow replicate that process, it would save many hours of manual work.
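The rough math behind those Scrapebox settings shows why the proxies survive (assuming queries rotate evenly across all the proxies, which real timing only approximates):

```python
threads = 25
delay_s = 5      # each thread waits 5 seconds between queries
proxies = 100

queries_per_sec = threads / delay_s          # ~5 queries/s in total
per_proxy_gap = proxies / queries_per_sec    # seconds between hits on one proxy

print(f"{queries_per_sec:.1f} Google queries/s overall")
print(f"each proxy gets queried only about every {per_proxy_gap:.0f} s")
```

One Google query every ~20 seconds per proxy is far gentler than what each proxy sees when every SER project scrapes on its own, which would explain why the bans stop.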
Please let me know what you think. Thanks
Comments
You are keeping the same number of threads scraping but lowering the number of proxies used. We need the reverse... limit the number of scraping threads but keep all of the proxies in rotation.
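Something like this is what I mean (made-up names; just the shape of it):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import cycle
import threading

SCRAPE_THREADS = 10                                  # few threads scraping...
proxies = cycle([f"proxy-{i}" for i in range(100)])  # ...but ALL 100 proxies rotating
proxy_lock = threading.Lock()

def scrape(query):
    with proxy_lock:               # cycle() is not thread-safe on its own
        proxy = next(proxies)
    print(f"querying {query!r} through {proxy}")

# Low concurrency, full proxy rotation: each proxy is hit only rarely.
with ThreadPoolExecutor(SCRAPE_THREADS) as pool:
    list(pool.map(scrape, [f"keyword-{n}" for n in range(30)]))
```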