I marked the previous thread as solved, but I'm still really looking forward to this functionality. Considering all the features added to GSA in the past, this one seems like no big deal, yet it could improve GSA's work significantly. As I wrote in the previous thread, such filtering (the ability to compare multiple lists) can be helpful in situations like these:
1. In the verified folder we have URLs from an expired domain that we no longer want to link to.
2. In the identified folder we have domains that are not safe to link to - e.g. ones sending StopForumSpam reports. We only found this out after a while and want to remove them from all files. Adding them to the filter option for posting instead makes posting more time-consuming.
3. It can be a great way to simply filter files based on multiple keywords/domains - better than Scrapebox can do it, since Scrapebox only works with lists of up to 1 million results.
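To illustrate what I mean by point 3, here is a minimal sketch (outside GSA, with made-up example URLs and a made-up function name) of filtering a site list against a domain blacklist:

```python
# Sketch: drop from a site list every URL whose host appears in a
# blacklist of domains. All names and URLs here are hypothetical.
from urllib.parse import urlparse

def filter_by_domains(urls, blacklisted_domains):
    """Keep only URLs whose host is not in the blacklist."""
    bad = {d.strip().lower() for d in blacklisted_domains if d.strip()}
    return [u for u in urls if urlparse(u).netloc.lower() not in bad]

urls = [
    "http://goodsite.com/forum/topic1",
    "http://spamdomain.com/guestbook",
]
print(filter_by_domains(urls, ["spamdomain.com"]))
# -> ['http://goodsite.com/forum/topic1']
```

Because the blacklist is held in a set, this scales to lists far larger than the 1 million entries Scrapebox can handle.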
But I've come across one more situation that I think is even more important than all of the above and could give GSA an amazing speed boost: being able to delete from the identified folder all URLs we have already submitted to, so GSA will not spend time posting to URLs that are submittable but never become verified.
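The operation I'm asking for is essentially a set difference between two URL lists. A minimal sketch (outside GSA, with hypothetical file contents and function name):

```python
# Sketch: remove from the identified list every URL that already
# appears in the submitted list. Example data is hypothetical.
def remove_submitted(identified, submitted):
    """Return identified URLs not present in the submitted list."""
    seen = {u.strip() for u in submitted}
    return [u for u in identified if u.strip() not in seen]

identified = ["http://a.com/page1", "http://b.com/page2", "http://c.com/page3"]
submitted = ["http://b.com/page2"]
print(remove_submitted(identified, submitted))
# -> ['http://a.com/page1', 'http://c.com/page3']
```

With the submitted list in a set, each lookup is constant time, so even huge identified folders could be cleaned in one quick pass.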