
Projects that drastically slow down in verified count over time

Specifically, I'm talking about projects where you let GSA scrape for targets.

I've got a couple of projects that, for whatever reason, went from 6-7 VPM and over time slowed down to nothing; they're now at 0.20 VPM.

I've got them all reading a random line out of a folder of millions of keywords split into 2k files. I've also got all English search engines checked, and I'm using private and fresh public proxies for scraping, with the public ones being swapped out for fresh ones every 15 minutes.

With so many keywords to go through, why are my projects slowing down to almost zero VPM within a few days? There should be plenty of keyword/footprint combos to work through for weeks at least.
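
For reference, here is a minimal sketch (in Python) of the keyword setup described above: splitting a master keyword list into 2,000-line files for GSA to pull random lines from. The file names and paths are placeholders, not anything GSA requires.

    import os

    # Placeholder paths; point these at your own master list and output folder.
    MASTER_LIST = "keywords_master.txt"   # one keyword per line, millions of lines
    OUTPUT_DIR = "keyword_chunks"         # the folder the GSA macro reads from
    LINES_PER_FILE = 2000                 # the "2k" split size mentioned above

    os.makedirs(OUTPUT_DIR, exist_ok=True)

    def write_chunk(index, lines):
        # Write one 2k-line keyword file with a zero-padded name.
        path = os.path.join(OUTPUT_DIR, f"keywords_{index:05d}.txt")
        with open(path, "w", encoding="utf-8") as dst:
            dst.write("\n".join(lines) + "\n")

    chunk, file_index = [], 0
    with open(MASTER_LIST, encoding="utf-8", errors="ignore") as src:
        for line in src:
            keyword = line.strip()
            if not keyword:
                continue
            chunk.append(keyword)
            if len(chunk) == LINES_PER_FILE:
                write_chunk(file_index, chunk)
                chunk, file_index = [], file_index + 1

    # Write any leftover keywords that didn't fill a complete file.
    if chunk:
        write_chunk(file_index, chunk)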

Comments

  • I just did 'delete unused accounts' and cleared all the emails in my catch-alls (even with 'delete after 5 days' set, they filled up really quickly), and the VPM jumped up a ton. Hopefully it sticks.
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    @kijix

    Eventually you exhaust all the easily accessible "Top Level" targets you scrape for. The longer a project runs, the more detailed its History (the record of sites it has already posted to) becomes.

    The quick fix is to expand your scraping. Add new footprints and some new keywords, and you'll see a big difference :) (see the sketch after this comment for what that can look like)

    (Assuming you have not done this already)

    Regards,

    -Jordan
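
Picking up Jordan's suggestion, here is a rough sketch of what expanding your scraping can look like in practice: crossing a fresh set of footprints with new keywords to produce search queries, which is what the keyword/footprint combos above refer to. The footprints and keywords below are illustrative placeholders, not a recommended list, and GSA builds these combinations internally; this only shows the idea.

    import itertools
    import random

    # Illustrative placeholders; substitute your own footprints and keywords.
    footprints = [
        '"powered by wordpress" "leave a comment"',
        'inurl:guestbook.php',
        '"add new comment" inurl:blog',
    ]
    keywords = ["gardening tips", "home brewing", "budget travel"]

    # Every footprint + keyword pair becomes one scrape query.
    queries = [f"{fp} {kw}" for fp, kw in itertools.product(footprints, keywords)]
    random.shuffle(queries)  # avoid sending queries to the engines in a fixed order

    with open("scrape_queries.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(queries) + "\n")

    print(f"{len(queries)} queries from {len(footprints)} footprints "
          f"x {len(keywords)} keywords")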