[Feature Request] - Filter Entries From File (Automatically)
Hinkys
SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
Would it be possible to integrate the "filter entries that are present in another file" feature, the same one from SER (Advanced -> Tools -> Filter Entries From File), so that it works similarly to a delete-duplicates project (automatically checking the file(s) every X minutes)?
For example, this would be useful to automatically remove all targets that are already present in the SER's Identified folder after a platform identification run OR to remove all targets that were identified as part of previous PI runs.
Comments
Sounds like a good idea. The only problem is that not many people use GSA SER for platform identification. I, for instance, use GSA Platform Identifier and then just import the .sl file into GSA SER's global Identified list.
My biggest problem is the project target files: GSA SER just keeps adding to the list without checking whether an entry already exists.
I spent 2 weeks removing duplicates from 10 GSA SER installs; the total number of duplicates was close to 1 trillion, taking up around 400 GB of space.
So I am totally for any "set and forget" automation to remove dups from link lists and target URLs, even if it uses more resources.
Hope @sven can consider it.
Great, looking forward to it! I'm trying to fully automate the list building process and this feature would be invaluable in making the process efficient.
@royalmice
I was talking about the standalone GSA Platform Identifier software (this was posted in the GSA PI forum, although that isn't apparent from the post itself).
As far as duplicates go, it would take a lot of processing power to check each new site against the entire identified list. With GSA Platform Identifier it's not that bad, just set a project to dedup your identified / verified folders every so often. But yeah, those duplicates add up REALLY fast.