
Best way to clean site lists? I keep running out of memory.

I've searched and found one way of cleaning site lists: right-clicking Tools and choosing Clean Site Lists. The issue is that SER always runs out of memory, so I'm never able to finish. Is there a better way to do this? I simply want to remove duplicate domains and dead targets. Also, does SER store duplicate domains for article engines? I don't see a reason for it. For example, if I have a verified Drupal domain in my global site lists and I import a list of scraped URLs that happens to contain that same domain, will SER store the domain again?

Comments

  • Sven www.GSA-Online.de
    Where exactly do you run out of memory? On DeDupe?
  • krushinem
    I have a robust solution for that
  • JamPackedSpam
    @sven I get about 1/4 of the way through running the Clean Site Lists function. I click to resume the list cleaning, but I repeatedly run out of memory. @krushinem - what is your solution?
  • krushinem
    JamPackedSpam the answer to the last question is yes. To my knowledge, SER doesn't filter out duplicate domains on site list import.
  • JamPackedSpam
    Okay. You mentioned having a robust solution for cleaning site lists - would you care to share it? @krushinem thanks.