Remove duplicate domains after 2nd occurrence

edited May 2014 in Feature Requests
Yea I know I mentioned it before, but...
As there is no easy option to extract the sorted-in URLs, fiddle with them as if they were one large file, and import them back again without sorting them in, could a feature be implemented that removes every 3rd and onward occurrence of a domain?

While it may seem useless, it can help clean the lists more thoroughly, but not as aggressively as, for example, "remove dup URLs" + "remove dup domains with all engines checked". If there were 7 duplicates, two would remain, giving more confidence than just one.
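The requested behavior can be sketched roughly like this: walk the list in order and keep only the first two URLs seen per domain. This is just an illustration, not how the tool itself works; the function name and the assumption that "domain" means the URL's host part are mine.

```python
# Sketch of the requested feature: keep at most MAX_COPIES URLs
# per domain, dropping the 3rd occurrence and onward.
from collections import Counter
from urllib.parse import urlparse

MAX_COPIES = 2  # the "2nd occurrence" limit from the request

def trim_duplicates(urls, max_copies=MAX_COPIES):
    """Return urls with only the first `max_copies` per domain kept,
    preserving the original order."""
    seen = Counter()
    kept = []
    for url in urls:
        domain = urlparse(url).netloc.lower()  # assumes domain = host
        seen[domain] += 1
        if seen[domain] <= max_copies:
            kept.append(url)
    return kept

urls = [
    "http://example.com/a",
    "http://example.com/b",
    "http://example.com/c",   # 3rd copy of example.com -> dropped
    "http://other.org/x",
]
print(trim_duplicates(urls))
```

The point of keeping two copies rather than one is exactly the insurance described above: if the single surviving URL happens to be dead, the whole target is lost.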

:-?

Comments

  • Import your list twice?
  • edited May 2014
    fakenickahl, I do not quite follow.

    I am building a list at all times, whether it's dummy projects or not, so there are bound to be duplicates across all engines. If I were to remove all dupe domains from all engines, that would be a little over the top, because only 1 copy of each target would be left.

    However, I think that all duplicate domains should be removed, including blog comments, but with no more than 2 copies left.
    Right now I have lists with, for example, 15 links to the same domain. Removing all except one is less efficient, because there is a risk that the remaining one is non-working, hence 1 wasted target.
