
Advantages of "Manual List Scraping" vs "Always Use Keywords to Find Targets"

Yesterday it took me 15 hours to scrape 1,000,000 URLs with Scrapebox (mixing 500 SER footprints with 1,000 niche keywords).

Isn't it the case that, with the option "Always Use Keywords to Find Targets", SER will (exactly like Scrapebox) simply combine keywords with footprints into queries and search all specified search engines? What exactly is the advantage of scraping lists manually with e.g. Scrapebox, then?
  • Of course you can run Scrapebox on another computer and use its additional CPU and RAM while SER is doing other things.
  • You can also import the target list into multiple projects if they need the same targets.
  1. But if I had only one project and enough computer power, why would I use Scrapebox?
  2. And what are the disadvantages of using "Always Use Keywords to Find Targets" for Tier 1 projects? Without this option you won't find many niche targets, right?
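The footprint-mixing described above is essentially a cross product: every footprint is paired with every keyword to form one search query. A minimal sketch (the footprints and keywords below are made-up examples, not actual SER data):

```python
# Sketch of how a scraper builds queries: cross every footprint with every keyword.
from itertools import product

footprints = ['"powered by wordpress"', 'inurl:guestbook']  # hypothetical engine footprints
keywords = ["gardening", "fitness"]                          # hypothetical niche keywords

queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

print(len(queries))  # 2 footprints x 2 keywords -> 4 queries
print(queries[0])
```

This is why 500 footprints mixed with 1,000 keywords yields 500,000 distinct queries, regardless of whether Scrapebox or SER's keyword option does the combining.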


  • spunko2010 (Isle of Man)
    edited June 2013
    I believe the "Always Use Keywords" option only works for Trackbacks and BC. Are those 1M URLs all suitable for posting to via SER? If yes, how would one build such a filter?
  • edited June 2013
    The target URLs are not highly filtered. I just took the SER footprints of the engines I like and mixed them with my keywords in SB. Analyzing a 600,000-line SER logfile shows that about 25% of my scraped URLs were identified, but that strongly depends on the chosen engines and the niche.

    24.7%    matches engine
    68.8%    no engine matches
     3.2%    filter "xyz" matches (bad words)
     3.3%    already parsed / already awaiting registration

    Thanks for your answer @spunko2010. You said you think that "Always Use Keywords ..." works mainly for Trackbacks & BC.
    Many experts on this forum use Scrapebox or GScraper to scrape targets, which work in a very similar way to the keyword option in SER. What is the advantage of SB over the "Always Use Keywords ..." option (apart from what I said above)?
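    The percentage breakdown above could come from a simple tally over the logfile. A hedged sketch, assuming the log lines contain the category phrases verbatim (the match strings and sample lines below are assumptions, not SER's exact log format):

```python
# Tally outcome categories from a SER-style logfile.
# The needle strings are assumptions about what the log lines contain.
from collections import Counter

CATEGORIES = [
    "matches engine",
    "no engine matches",
    "filter",
    "already parsed",
]

def tally(lines):
    counts = Counter()
    for line in lines:
        for needle in CATEGORIES:
            if needle in line:
                counts[needle] += 1
                break  # count each line under the first matching category
    return counts

# Hypothetical sample lines for illustration
sample = [
    "http://a.example/ - matches engine General Blogs",
    "http://b.example/ - no engine matches",
    "http://c.example/ - already parsed",
]
print(tally(sample))
```

    Dividing each count by the total line count gives the percentages shown in the reply above.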
  • I'm curious here too, now that I'm on the verge of deciding whether to use a scraped list or let SER scrape.

    Perhaps @ron or @LeeG can give more helpful advice.
  • Use what works best for you and test your options. Some things you need to decide for yourself, IMO.
  • I just tested a scraped list (about 150K URLs, scraped via Scrapebox) and fed it to SER, but IMHO it was slow. Parsing alone took up a lot of time; when I switched back to scrape-and-post, I returned to my normal LPM.

    I guess it is not for me.