Only scrape if target urls < x
Right now I'm scraping far more URLs than I post to. The target URL list just keeps growing, and it's not submitting as much as I want it to.
It would be great to have more control over when it scrapes/posts — maybe an option to only scrape when the target URL count is below a certain threshold.
Comments
It would be great to have a setting so I don't have to manually switch scraping on and off.
You set up SER to find targets and post to them. It was originally designed that way and it still works. You need Google-passed proxies and it will do its job.
Another feature that would be sweet: a status like "Search only", to go alongside the existing "Submit only".