
Link load limit when removing duplicates

alexey Moscow
edited July 2016 in GSA Platform Identifier
I uploaded 400 million links. Is there a limit in the program? Can it remove duplicates from 1 billion links, or not?
What does the speed of duplicate removal depend on?
The speed climbs to 600,000 links a minute, then falls to 30,000 a minute.


  • alexey Moscow
    Have you tested Platform Identifier for the maximum size of txt file it can check, for large link lists?
  • s4nt0s Houston, Texas
    It's hard to say what the program's limit is, but it usually stores those URLs in memory, so the bigger the list and the longer it has to run removing dups, the more your memory use grows and things could slow down.

    There isn't really a set limit though.
  • alexey Moscow
    We will test it, thanks!
  • 1linklist FREE TRIAL Linklists - VPM of 150+ -
    Just to share our experience: it can handle MASSIVE lists. We've run files as big as 1-2 GB through it before, though it DID take a few hours to work through them.

    As s4nt0s said, the real limit is memory: if you have enough RAM it will work; otherwise, chunk the list up and do it in parts.
  • I've been using SER to remove duplicates from big files.
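For anyone who wants to script the chunking approach suggested above rather than rely on a single in-memory pass, here is a minimal sketch in Python. The function name `dedupe_large_file` and its parameters are hypothetical illustrations, not part of Platform Identifier or SER; the idea is simply to hash-partition the file into smaller chunks so that identical URLs always land in the same chunk, then dedupe each chunk independently with a set, keeping only one chunk's URLs in memory at a time.

```python
import os
import tempfile

def dedupe_large_file(src_path, dst_path, num_chunks=16):
    """Remove duplicate lines from a file too big to dedupe in memory.

    Pass 1: hash-partition lines into temporary chunk files; identical
    lines always hash to the same chunk.
    Pass 2: dedupe each chunk with an in-memory set, so peak memory is
    roughly one chunk's unique URLs instead of the whole file's.
    """
    tmp_dir = tempfile.mkdtemp()
    chunk_paths = [os.path.join(tmp_dir, f"chunk_{i}.txt")
                   for i in range(num_chunks)]
    chunks = [open(p, "w", encoding="utf-8") for p in chunk_paths]
    try:
        with open(src_path, encoding="utf-8") as src:
            for line in src:
                url = line.rstrip("\n")
                if url:  # skip blank lines
                    chunks[hash(url) % num_chunks].write(url + "\n")
    finally:
        for f in chunks:
            f.close()

    with open(dst_path, "w", encoding="utf-8") as dst:
        for p in chunk_paths:
            seen = set()  # only this chunk's URLs are held in memory
            with open(p, encoding="utf-8") as chunk:
                for line in chunk:
                    if line not in seen:
                        seen.add(line)
                        dst.write(line)
            os.remove(p)
    os.rmdir(tmp_dir)
```

Note that the output order differs from the input order (lines are grouped by chunk), which usually doesn't matter for link lists; raising `num_chunks` lowers peak memory at the cost of more temporary files.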