
Remove duplicate domains after Nth occurrence? Or trim to root and duplicate.

edited May 2014 in Feature Requests
Would it be possible for the Remove duplicate domains feature in Tools to remove only those duplicates beyond the Nth occurrence of a domain?

This would give us a safeguard of sorts in our lists in case one link goes bad or is removed. If that happened while we still had 2 links for a domain, for example, one might be bad but the other would still be working.
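Something along these lines is what I have in mind. Just a rough sketch with made-up names to show the idea of keeping the first N links per domain, not how the tool actually does it:

    # Illustrative only: keep at most n links per domain instead of
    # collapsing every domain down to a single entry.
    from urllib.parse import urlparse
    from collections import defaultdict

    def dedupe_after_nth(urls, n=2):
        """Keep the first n occurrences of each domain; drop the rest."""
        seen = defaultdict(int)
        kept = []
        for url in urls:
            domain = urlparse(url).netloc.lower()
            seen[domain] += 1
            if seen[domain] <= n:
                kept.append(url)
        return kept

    links = [
        "http://example.com/page-1",
        "http://example.com/page-2",
        "http://example.com/page-3",  # dropped: third occurrence of example.com
        "http://other.org/post",
    ]
    print(dedupe_after_nth(links, n=2))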

Or maybe it is possible to take all links in a list, trim them to root, and thus have 2 copies of the same target while keeping the "engines assigned" to them? This would achieve the same goal.

What do you guys think? 
:-?

EDIT: Never mind. I just remembered :(|) that it would bloat lists immensely for little gain. Please remove the thread or just ignore my babble.

Comments

  • Trim URLs to root and/or last folder (e.g. xxx.com/blog/) could be useful for contextual platforms.
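    Roughly something like this, just to illustrate the two trims (not the tool's actual code):

        # Illustrative sketch of trimming a URL to root or to its last folder.
        from urllib.parse import urlparse

        def trim_to_root(url):
            """http://xxx.com/blog/post.html -> http://xxx.com/"""
            p = urlparse(url)
            return f"{p.scheme}://{p.netloc}/"

        def trim_to_last_folder(url):
            """http://xxx.com/blog/post.html -> http://xxx.com/blog/"""
            p = urlparse(url)
            folder = p.path.rsplit("/", 1)[0]  # drop the final path segment
            return f"{p.scheme}://{p.netloc}{folder}/"

        print(trim_to_root("http://xxx.com/blog/post.html"))         # http://xxx.com/
        print(trim_to_last_folder("http://xxx.com/blog/post.html"))  # http://xxx.com/blog/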