Let's take a poll: who scrapes their own lists, and who lets GSA scrape, and why?
I've heard compelling arguments for both and I'm still not quite decided, though I'm leaning towards letting GSA scrape because of the obvious hands-free nature. That said, I'm currently experimenting with adding lists and blasting them.
So I'm interested to hear others' opinions on the matter, one way or the other, and the reasons behind them.
Comments
@gooner - if you merge in some 'stop' words after you've merged the keywords with the footprints, it works even better.
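That merge can be sketched in a few lines. The footprints, keywords and stop words below are made-up examples for illustration, not GSA's actual lists:

```python
# Sketch of building scrape queries: footprints x keywords,
# then merging in extra "stop" words to widen query variety.
# All word lists here are hypothetical examples.
from itertools import product

footprints = ['"powered by wordpress"', 'inurl:guestbook']  # example footprints
keywords   = ["fitness", "gardening"]                       # example niche keywords
stop_words = ["the", "about"]                               # generic 'stop' words

# Step 1: merge keywords with footprints.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

# Step 2: merge the stop words in afterwards, multiplying the query pool.
queries += [f"{q} {sw}" for q, sw in product(queries, stop_words)]

print(len(queries))  # 4 base queries + 8 stop-word variants = 12
```

Each extra stop word roughly multiplies the number of distinct queries you can feed the scraper, which is why doing it after the keyword merge pays off.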
@judderman - that seems a bit low. If I use the link extraction technique I generally get about 10k verified from a list of 1,000,000 targets, but that's just following the steps without sorting the URLs in any way throughout the process.
I'm new to SER, but the list method is working much better for me. I scraped about 1.2 million URLs with the "extraction technique" and already have 9k verified URLs, and it has only processed 600-700k so far. I stopped because Spamvilla is down ((( .
But I haven't had any success with SER's auto scraper. Maybe I'm doing something wrong, but I really don't know what. I use proxies and a 200k keyword list, and I ticked the Google, Bing and Yahoo search engines. SER ran for 2 days and found only 150 places to post (it was all T1 contextual).
For the average Joe, Ahrefs can be a pretty solid source of link lists.