No targets to post to (maybe blocked by search engines, no scheduled posting)

KenKen Singapore
How can I resolve the problem above? It comes up very frequently. Do I have to buy more proxies to solve it? Can anyone give me a clue?


  • 0
    s4nt0s Houston, Texas
    Accepted Answer
    Are your proxies passing Google/Bing (or whatever search engine you're scraping from) when you test them? If you don't have many proxies, you can lower your thread count and/or increase the time between search engine queries so they aren't hammered so hard.

    The best thing to do is import your own list if you can ;)
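The "increase the time between queries" advice above can be sketched outside SER itself. A minimal Python illustration of pacing search queries (the interval value and the query strings are made-up placeholders, not actual SER settings):

```python
import time

def throttled(queries, min_interval=5.0, sleep=time.sleep, now=time.monotonic):
    """Yield search queries no faster than one per min_interval seconds,
    so the search engine isn't hammered with back-to-back requests."""
    last = None
    for query in queries:
        if last is not None:
            wait = min_interval - (now() - last)
            if wait > 0:
                sleep(wait)  # pause until the interval has elapsed
        last = now()
        yield query
```

The `sleep` and `now` parameters are injectable only to make the sketch testable; in SER itself this is just the "time to wait between search queries" option.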
  • 0
    1linklist FREE TRIAL Linklists - VPM of 150+ -
    Accepted Answer

    Like @s4nt0s said, at some point you're going to need to bring in outside scraping. SER's inbuilt scraper is good, but it has a very serious limit of scale.

    The short-term measure you can take is to add new footprints to SER's engines, as well as more keywords and additional search engines to scrape from.
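As an illustration of the footprint-plus-keyword idea (the footprint strings below are generic examples, not taken from SER's engine files): an external scraper typically pairs every footprint with every keyword to build its query list, so each addition multiplies the number of queries you can run.

```python
from itertools import product

def build_queries(footprints, keywords):
    """Pair every engine footprint with every keyword to form
    the search queries an external scraper would submit."""
    return [f'{footprint} "{keyword}"'
            for footprint, keyword in product(footprints, keywords)]

# e.g. 2 footprints x 2 keywords -> 4 distinct scrape queries
```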
  • 0
    Deeeeeeee the Americas
    edited October 2017
    I've also encountered this message, but until now I had interpreted it wrongly.

    I see now that I need public proxies that have passed the search engine tests if I'm using GSA's internal on-the-fly scrape function for projects.

    I thought the message meant there were no more matching sites that could be found to post to using those KWs.