Best Practices for Managing Site Lists (Identified / Submitted / Verified / Failed)
Hey guys,
I was wondering how most of you manage your site lists in GSA SER. Specifically, I’m trying to optimize how I use the global site list folders: Identified, Submitted, Verified, and Failed.
Here’s how I currently understand them:
- Identified: GSA SER found a potential link that matches an engine, but it doesn’t guarantee a registration form or successful submission.
- Submitted: The site was identified and successfully submitted to, meaning it had a registration form and accepted content.
- Verified: The most valuable list; the site was identified, submitted, and the link was verified as live.
- Failed: Submission failed due to reasons like offline site, bad proxy, missing registration form, engine mismatch, captcha/email failure, etc.
My current setup:
I’ve always ticked Submitted and Verified in the global site list settings because they seem the most reliable for link building. They’ve passed the full process and are usable by other projects.

My questions:
- Is there any real benefit to ticking Identified and Failed for building site lists?
- I assume processing the Failed list might recover a small % of usable links, but is it worth the time/resources?
- For Identified, if I’ve already processed 1M URLs and they’ve moved to Submitted/Verified, is there any point in appending more to the Identified folder? Wouldn’t that just reprocess old URLs and slow things down?
- Would it be better to untick Identified and Failed, and instead import freshly scraped Identified URLs directly into projects?
- That way I avoid bloating the global Identified folder and wasting time reprocessing known URLs.
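For anyone doing the "import fresh scrapes directly" approach, here's a rough sketch of how you could pre-filter scraped URLs against what's already in your global lists before importing, so you don't feed the projects URLs they've already processed. The folder paths and helper names here are just examples I made up; the only assumption from GSA SER itself is that site lists are plain-text files with one URL per line:

```python
# Sketch: drop freshly scraped URLs that already appear in the global
# Submitted/Verified site list folders, keeping only new targets to import.
# Paths and function names are hypothetical; GSA SER site lists are
# plain-text files, one URL per line.
from pathlib import Path
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Reduce a URL to lowercase host+path so http/https and trailing
    slashes don't count as different targets."""
    parts = urlsplit(url.strip().lower())
    return parts.netloc + parts.path.rstrip("/")

def load_known(folders) -> set:
    """Collect normalized URLs from every .txt site list in the folders."""
    known = set()
    for folder in folders:
        for f in Path(folder).glob("*.txt"):
            for line in f.read_text(errors="ignore").splitlines():
                if line.strip():
                    known.add(normalize(line))
    return known

def filter_new(scraped_file, known: set) -> list:
    """Return only the scraped URLs not already in the known set."""
    fresh = []
    for line in Path(scraped_file).read_text(errors="ignore").splitlines():
        if line.strip() and normalize(line) not in known:
            fresh.append(line.strip())
    return fresh
```

Then you'd import only the `filter_new` output into the project, which keeps the global Identified folder from bloating with URLs you've already burned through.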
Would love to hear how others handle this, especially those running large campaigns or scraping at scale.
Thanks in advance!
