How does the posting really work?
I am curious how GSA posts to the various platforms - I guess it uses the same schema for all of them. What interests me is whether we can somehow stop duplicates from growing in our global site lists. Say we have around 200k URLs for some platform; after posting, the list grows to twice that size or more because those URLs get identified again. Wouldn't it be faster to skip that re-identification (which is what makes the duplicates grow) on every project run? We could just run a cleanup or verification on the list once.
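To illustrate what I mean by a one-off cleanup: something like the rough sketch below, run outside of GSA while no projects are active. I'm assuming here that a site list is a plain text file with one URL per line; the filename is just a placeholder.

```python
# Rough sketch of an external, one-off dedup of a site list file.
# Assumption: the list is a plain text file with one URL per line.

def dedup_site_list(path: str) -> None:
    """Remove duplicate URLs in place while keeping the original order."""
    seen = set()
    unique = []
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                unique.append(url)
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(unique) + "\n")

if __name__ == "__main__":
    dedup_site_list("sitelist_Article-WordPress.txt")  # hypothetical filename
```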
Regarding the above, I am also not sure about one thing: if we remove duplicates, doesn't that risk omitting some URLs in currently running projects? Removing duplicates changes the order of the URLs in that file, after all, and as far as I hear, GSA loads only a few hundred URLs at a time in batches when posting, to save memory.
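To make my worry concrete, here is a minimal sketch of what I imagine batched reading looks like (the batch size and file layout are my assumptions, not how GSA necessarily does it). If the file gets rewritten and reordered between batches, whatever position the reader has saved no longer points at the same URLs, so some could be skipped or repeated.

```python
# Minimal illustration of reading a URL list in small batches rather than
# loading the whole file into memory. Batch size and format are assumptions.

from typing import Iterator, List

def read_in_batches(path: str, batch_size: int = 300) -> Iterator[List[str]]:
    """Yield URLs from a one-URL-per-line file in batches of batch_size."""
    batch: List[str] = []
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url:
                batch.append(url)
                if len(batch) == batch_size:
                    yield batch
                    batch = []
    if batch:
        yield batch

for batch in read_in_batches("sitelist_Article-WordPress.txt"):  # hypothetical file
    pass  # post to each URL in the batch here
```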