
Best way to clean site lists? I keep running out of memory.

I've searched and found one way of cleaning site lists: simply right-clicking Tools and choosing "clean site lists". The issue I have is that SER always runs out of memory and I'm never able to finish. Is there a better way to do this? I simply want to remove duplicate domains and remove dead targets.

By the way, does SER store duplicate domains for article engines? I don't see a reason for this. For example, if I have a verified Drupal domain in my global site lists and I import a list of scraped URLs that happens to contain that same domain, will SER store that domain again?
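In the meantime, here's a rough sketch of the kind of domain-level dedupe I'm after, done outside SER so memory isn't an issue. This assumes the site list is just a plain text file with one URL per line; the file names below are placeholders, not actual SER files.

```python
#!/usr/bin/env python3
"""Strip duplicate domains from a site list, keeping the first URL per domain."""
from urllib.parse import urlparse

INPUT_FILE = "sitelist_Articles-Drupal.txt"            # placeholder name
OUTPUT_FILE = "sitelist_Articles-Drupal.deduped.txt"   # placeholder name

seen_domains = set()

# Stream the file line by line instead of loading it all into memory.
with open(INPUT_FILE, "r", encoding="utf-8", errors="ignore") as src, \
     open(OUTPUT_FILE, "w", encoding="utf-8") as dst:
    for line in src:
        url = line.strip()
        if not url:
            continue
        domain = urlparse(url).netloc.lower()
        # Keep only the first URL seen for each domain.
        if domain and domain not in seen_domains:
            seen_domains.add(domain)
            dst.write(url + "\n")

print(f"Kept {len(seen_domains)} unique domains.")
```

Checking for dead targets would still need a separate step (actually requesting each URL), so this only covers the duplicate-domain part.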

