GSA Footprints
Guys, many of the footprints for the different engines are out of date and no longer return a healthy number of results.
Sven is willing to listen to suggestions for adjustments, so for those who are willing, it would be nice to have some help with the task of weeding out the bad footprints.
If you want to help, simply choose a platform (article, blog, social network, wiki, etc.) and pick some engines from that platform. Open Options -> Advanced -> Search Online for URLs -> Add pre-defined footprints -> choose a platform -> choose an engine.
The footprints it populates just need to be put into ScrapeBox and run through the Google competition finder.
imo anything that gives less than 5k results is not at all worth keeping and just slows SER down.
Actually, since many people will be using these footprints, I would say anything under 20k isn't worth keeping either.
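The culling step described above can be sketched as a small script, assuming you have exported footprint-to-result-count pairs from a ScrapeBox competition check (the threshold and the sample data are illustrative, not real results):

```python
# Minimum Google result count to keep a footprint. The thread suggests
# anywhere from 5k to 20k depending on how strict you want to be.
MIN_RESULTS = 20_000

def filter_footprints(rows, min_results=MIN_RESULTS):
    """Keep only footprints whose result count meets the threshold.

    rows: iterable of (footprint, count) pairs, e.g. parsed from a
    ScrapeBox export.
    """
    return [footprint for footprint, count in rows
            if int(count) >= min_results]

if __name__ == "__main__":
    # Hypothetical counts, for illustration only.
    sample = [
        ('"powered by wordpress" "leave a comment"', 1_500_000),
        ('"powered by SomeDeadEngine"', 312),
    ]
    print(filter_footprints(sample))
```

The surviving list is what you would send back for review; the dropped footprints are the ones that slow SER down for little return.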
Many of you may use lists or scrape outside SER, but lately I have been using GSA to do my scraping, and it can work quite well. Many of these footprints have to go, though, and it is a rather big task for one person to do alone.
If you are willing to help, please post here or PM me and send me your results, and I will get everything together to send to Sven.
Comments
How do you judge a 'healthy number of results'?
"imo anything that gives less than 5k results is not at all worth keeping and just slows SER down."
Exactly. It's just your opinion. It doesn't mean Sven should make changes that will affect every user. Those 5k sites might contain gold dust. How do you know you're not sacrificing quality for speed?
I'd prefer to make those decisions myself. The only question that ought to be considered is whether the footprint itself continues to work.
Yes you can. They're in the Footprint Studio. Just select the ones you don't want and hit delete.
I understand the point you're making, I just think it's for individual users to decide what they keep or not.
I've been scraping and posting to Bing targets and wanted to see how many domains are actually in Google's index.
I tried scraping Google with Google passed proxies but the only real efficient method was to pay monthly for rotating proxies.
I already had GSA Proxy Scraper so decided to start scraping Bing.
When scraping for tier 2 links like blog comments, I scrape Frontpage.com, since they get their results from the Google API and have a less strict ban limit for proxies.
So I'm assuming you scrape Bing for blog comments, guest books, and image comments too?
I'm editing my footprints tonight and am going to scrape Bing and post to the raw list tomorrow.