
Clean-up of ALL levels of the global site-list by mask, domains or a self-made blacklist

After running SER for a while (weeks, months or longer) and learning along the way, we may accumulate a number of

- honeypot sites
- spamtrap sites

or any other sites we may want to avoid in the future and permanently REMOVE from the entire global list (identified + successful + verified + failed).

Of course we have the individual filters in each project / tier,
but applying those on every submission certainly consumes MORE resources than a once-and-for-all clean-up of ALL global site lists would.

The best option IMO is to FIRST clean up ALL lists + then filter NEW imports of target URLs using our own blacklist,

as described in the recent post about dead/malfunctioning blog-comment sites:
https://forum.gsa-online.de/discussion/7101/some-types-of-general-blog-comment-sites-not-working-but-instant-verification-shown

There are, among others, known sites / domains with multiple unique URLs that may be harvested and end up on all levels of the global sitelists,

like for example:

mythem.es
mailld.com
www.0l.ro
spamtrap.ro

which could be deleted / cleaned up as soon as they are recognized.
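
Since SER has (as far as I know) no built-in option for this, such a clean-up could also be scripted externally. Below is a minimal sketch in Python; it assumes the global sitelists are the plain one-URL-per-line .txt files SER keeps in its site-list folders, and the folder paths are placeholders. The blacklist entries are just the example domains from this thread. Stop SER and back up the folders before running anything like this.

```python
# Hedged sketch, not a SER feature: remove blacklisted domains from the
# plain-text sitelist files. Folder paths below are placeholders - point
# them at your own identified / success / verified / failed folders
# (see SER's folder settings) and make a backup first.
from pathlib import Path
from urllib.parse import urlparse

# example domains from this thread; extend with your own blacklist
BLACKLIST = {"mythem.es", "mailld.com", "0l.ro", "spamtrap.ro"}

SITELIST_DIRS = [
    Path(r"C:\SER\site_list-identified"),   # placeholder paths
    Path(r"C:\SER\site_list-success"),
    Path(r"C:\SER\site_list-verified"),
    Path(r"C:\SER\site_list-failed"),
]

def blacklisted(url: str) -> bool:
    """True if the URL's host, or any parent domain of it, is blacklisted."""
    if "://" not in url:
        url = "http://" + url
    host = urlparse(url).netloc.split(":")[0].lower()
    if host.startswith("www."):
        host = host[4:]
    parts = host.split(".")
    return any(".".join(parts[i:]) in BLACKLIST for i in range(len(parts)))

for folder in SITELIST_DIRS:
    for txt in folder.glob("*.txt"):
        lines = txt.read_text(encoding="utf-8", errors="ignore").splitlines()
        kept = [u for u in lines if u.strip() and not blacklisted(u)]
        if len(kept) != len(lines):
            txt.write_text("\n".join(kept) + "\n", encoding="utf-8")
            print(f"{txt.name}: removed {len(lines) - len(kept)} URLs")
```

The domain check also matches subdomains, so every unique URL harvested from a blacklisted domain gets removed in one pass.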

The end goal is an ever cleaner = more efficient global sitelist when BUILDING and adding new sites, to maintain / increase submission efficiency.

The very same self-created BLACKLIST could then also be used to filter URL imports from all sources (from online site lists, from files, or from scraping by SER).
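
Reusing the list that way could be as simple as running every new URL file through the same domain check before importing it. Another hedged sketch, with placeholder file names ("blacklist.txt", "scraped_urls.txt", "clean_import.txt"):

```python
# Hedged sketch: filter a scraped / downloaded URL list against the same
# self-made blacklist before importing it into SER.
from urllib.parse import urlparse

def normalize(domain: str) -> str:
    d = domain.strip().lower()
    return d[4:] if d.startswith("www.") else d

# one domain per line in the blacklist file
with open("blacklist.txt", encoding="utf-8") as f:
    blacklist = {normalize(line) for line in f if line.strip()}

def blacklisted(url: str) -> bool:
    if "://" not in url:
        url = "http://" + url
    host = normalize(urlparse(url).netloc.split(":")[0])
    return any(host == d or host.endswith("." + d) for d in blacklist)

# write only the URLs whose domain is not blacklisted
with open("scraped_urls.txt", encoding="utf-8") as src, \
     open("clean_import.txt", "w", encoding="utf-8") as dst:
    dst.writelines(u + "\n" for u in map(str.strip, src) if u and not blacklisted(u))
```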
