[FEATURE] Save FAILED URLs Per Project
AlexR Cape Town
edited November 2012 in Feature Requests
I have some strict filters for my money site. The program has been running for weeks, working through millions of target URLs. These are saved to the identified list if the platform is recognisable, and to the global failed list if not.
BUT there are a number of submissions on very good sites that failed due to a bad captcha, rejection by a moderator, etc.
I want GSA to save these failed submissions on a per-project basis (rather than globally) so that I can clean up the list and re-enter the URLs for the project to try again. Currently I have to clear the target URL history, but then it still has to re-spider those millions of URLs to find the gems.
At least if the list were kept per project, it wouldn't have to redo all the spidering. I could then find all the PR5 blogs it missed (in the project's failed list) and comment on those manually, run the rest through a different captcha service, etc. It just gives us so many options.
I really want to focus on link quality rather than quantity, and if the program has already spent the time and resources to find a link, I want to make sure I get it without having to restart everything! Some niches have a limited number of sites, so you have to get every link available, and this would help us do that.