
Clean up duplicate URLs of identified lists automatically/periodically?

GiorgosK Greece
edited September 2012 in Feature Requests
I run "clean duplicate URLs" from OPTIONS > ADVANCED > TOOLS
and it cleaned more than 19000 urls
after 1 hour I go again and it cleans more than 5000 urls

I was wondering whether the 5000 duplicate URLs that can build up within an hour will slow down the submission process.

I guess each project checks before submitting and skips a URL if it has already been submitted,
but it still seems like wasted time to have duplicate URLs in the lists.

How about a periodic cleanup of duplicate URLs, with a setting for it in the options?
Is that doable?
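
To illustrate what I mean by "periodic cleanup", here is a rough sketch of doing it outside the program. These are my own assumptions, not how GSA SER actually works: I'm assuming an identified list is a plain text file with one URL per line, and the file path and interval below are made-up placeholders.

    # Rough sketch only: periodically remove duplicate URLs from a list file.
    # Assumptions (not GSA SER specifics): plain text file, one URL per line;
    # LIST_FILE and INTERVAL_SECONDS are hypothetical placeholders.
    import time

    LIST_FILE = "identified_urls.txt"   # hypothetical path to an identified list
    INTERVAL_SECONDS = 3600             # run the cleanup once per hour

    def dedupe(path):
        # Keep the first occurrence of each URL, preserving the original order.
        seen = set()
        unique = []
        with open(path, "r", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    unique.append(url)
        # Rewrite the file with only the unique URLs.
        with open(path, "w", encoding="utf-8") as f:
            f.write("\n".join(unique) + "\n")
        return len(unique)

    while True:
        kept = dedupe(LIST_FILE)
        print("kept", kept, "unique URLs")
        time.sleep(INTERVAL_SECONDS)

Of course, rewriting a list file while the program is also writing to it could cause trouble, which is part of why a built-in option would be nicer than an external script.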

Comments

  • Sven www.GSA-Online.de
    The URLs have to be checked anyway before submission, so there is no waste of time here. But cleaning the lists automatically would be. It is also not a real problem to have duplicate URLs in the files, as the program takes random URLs from them rather than going through them one by one.
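
To illustrate the behaviour described above (a simplified model only, not GSA SER's actual code): with a random pick plus an "already submitted" check, a duplicate entry only costs one cheap lookup before it is skipped.

    # Simplified illustration: random selection from a list that may contain
    # duplicates, skipping anything already submitted. The URLs are made up.
    import random

    urls = ["http://a.example/", "http://b.example/", "http://a.example/"]
    submitted = set()

    for _ in range(len(urls)):
        candidate = random.choice(urls)   # random pick from the list, not one by one
        if candidate in submitted:        # already handled (e.g. a duplicate): skip it
            continue
        submitted.add(candidate)          # "submit" it and remember it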