Are your proxies passing Google/Bing (or whatever search engine you're scraping) when you test them? If you don't have many proxies, you can lower your thread count and/or increase the time between search engine queries so they aren't hammered so hard.
The best thing to do is to import your own list if you can.
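The "more delay per proxy" idea above can be sketched in a few lines. This is a minimal illustration, not SER's actual internals: the proxy URLs and the 10-second minimum delay are made-up placeholders, and the point is just that each proxy waits out its cooldown before it gets reused.

```python
import itertools
import time

# Hypothetical proxy list and cooldown; tune these to your own setup.
PROXIES = ["http://proxy1:8080", "http://proxy2:8080"]
MIN_DELAY = 10.0  # seconds between queries through the same proxy

def throttled_queries(queries, proxies=PROXIES, min_delay=MIN_DELAY):
    """Yield (query, proxy) pairs, rotating proxies and sleeping so that
    no single proxy is reused more often than once per min_delay seconds."""
    last_used = {p: 0.0 for p in proxies}
    rotation = itertools.cycle(proxies)
    for query in queries:
        proxy = next(rotation)
        # Wait out the remainder of this proxy's cooldown, if any.
        wait = min_delay - (time.monotonic() - last_used[proxy])
        if wait > 0:
            time.sleep(wait)
        last_used[proxy] = time.monotonic()
        yield query, proxy
```

Fewer proxies simply means each one comes around in the rotation more often, so either a longer delay or fewer concurrent threads keeps the per-proxy query rate down.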
Like @s4nt0s said, at some point you're going to need to bring in outside scraping. SER's built-in scraper is good, but it has a very serious limit of scale.
The short-term measure you can take is to add new footprints to SER's engines, as well as more keywords and additional search engines to scrape from.
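To see why adding footprints and keywords helps, note that the scraper effectively crosses every footprint with every keyword, so the number of distinct search queries grows multiplicatively. A rough sketch (the footprints and keywords below are made-up examples, not SER's actual lists):

```python
import itertools

def build_queries(footprints, keywords):
    """Combine every footprint with every keyword into one search query each."""
    return [f"{fp} {kw}" for fp, kw in itertools.product(footprints, keywords)]

# Example: 2 footprints x 2 keywords = 4 unique queries to spread across engines.
footprints = ['"powered by wordpress"', 'inurl:guestbook']
keywords = ["gardening", "fishing"]
queries = build_queries(footprints, keywords)
```

Doubling either list doubles the query pool, which spreads the load across more searches and surfaces more target URLs before the engines run dry.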
Comments
I see now that I need public proxies that have passed the search engines if I'm using GSA's internal on-the-fly scrape function for projects.
I thought the message meant there were no more matching sites that could be found to post to using those KWs.