
[FEATURE] Save FAILED URLs Per Project

AlexR Cape Town
edited November 2012 in Feature Requests
I have some strict filters for my money site. The program has been running for weeks, working through millions of target URLs. These are saved to the identified list if the platform is recognised, and to the global failed list if not.

BUT a number of submissions failed on very good sites due to a bad captcha, rejection by a moderator, etc.

I want GSA to save these failed submissions on a per-project basis (rather than global) so that I can clean up the list and re-import it for the project to try again. Currently, I have to clear the target URL history, but then it still has to re-spider those millions of URLs to find the gems.

At least if it's on a per-project basis it wouldn't have to redo all the spidering. I could then find all the PR5 blogs it missed (from the project's failed list) and manually comment on those, and run all the others through a different captcha service, etc. It just gives us so many options.

I really want to focus on quality of links rather than quantity, and if the program has already used up the resources and time to find a link, I want to make sure I actually get it without having to restart everything! Some niches have a limited number of sites, so you have to get every link available, and this would help us do that.

Any thoughts?


  • Why not save those "gems" to a different file before importing? And I think you should consider buying Scrapebox, which will help you sort your URLs a lot.
  • AlexR Cape Town
    I have SB... I'm using it for finding URLs...

    I have too many projects to use SB only for finding targets, and GSA is good at finding targets.

    It's just that GSA is doing so much scraping. What I want is a list of the URLs it identified for a project but that didn't get verified — per project rather than global.

    It would allow us to really target the best URLs and make sure we get the links! We could try again and again until we do get them...
  • AlexR Cape Town
    @Sven - would this be doable (extracting failed submissions on a per-project basis rather than global)? It would just make it so much easier to focus on getting quality links per project.
  • Sven
    I bookmarked another thread about this already and might add it sooner or later.
  • AlexR Cape Town
    Thanks... I'd really, really appreciate it. :) I'm a quality-over-quantity kind of person, so this would help everyone with a similar outlook.