How Long Does it Take to 'Clean' the Lists?
I've been running the cleaner for a few days. It's at 300k and the progress bar along the bottom shows about 0.01% complete. Is it OK to run SER and clean the lists at the same time? I know it slows things down a little, but at this rate it will probably take a few weeks to clean the lists properly. Plus I can't update or it will start over, won't it? That's what it did last week, so I gave up on it and decided to keep running SER.
Comments
The progress bar is not accurate; it will stay on 0.01% or similar until it's pretty much done, then shoot up to 100%.
You can work out roughly how long it will take: see how many URLs are in your global lists, subtract duplicates, then base the estimate on how long it's taken to do your 300k.
It depends on how many URLs you have and how many threads you run. Last time I ran it, it took about 2-3 days to do 1-2 million URLs, if I remember rightly.
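The estimate described above is just a straight extrapolation from the progress so far. A minimal sketch, assuming placeholder numbers (1.5 million URLs after de-duplication, 300k done in 48 hours) rather than anyone's real list sizes:

```python
# Rough clean-up time estimate: extrapolate from how long the
# already-processed URLs took. All numbers below are placeholders.
def estimate_hours(total_urls, done_urls, hours_elapsed):
    """Hours remaining, assuming the clean-up rate stays constant."""
    rate = done_urls / hours_elapsed      # URLs cleaned per hour
    remaining = total_urls - done_urls
    return remaining / rate

# e.g. 1.5 million URLs total, 300k done in 48 hours:
print(estimate_hours(1_500_000, 300_000, 48))  # -> 192.0 hours, i.e. ~8 days
```

The assumption of a constant rate is optimistic; dead hosts that time out slowly can drag the real figure well past the estimate.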
You just have to KNOW how to!
how?
If you go into the folders with your "Identified", "Verified", etc. site lists and look at the "Identified" folder, you can see what SER is doing. When SER is cleaning up a larger file (several dozen or hundreds of KB), it creates a temporary .txt file (in UTF-8 encoding) with the same name as the one it is currently processing. The clean-up works from top to bottom of your file list in "Identified". If you stop at that point, the already-cleaned URLs are in the original file and all the others are in the .txt file, so you know where you aborted and where to restart, by UN-checking everything above that is already done.

When I restarted my clean-up at that point, I copied all the URLs from the .txt file back into the original file and restarted the clean-up from there downwards. Then a de-dup, of course, before running SER submissions again.

After that, check only small groups of URLs at a time; that allows you to do your daily work, as well as your upgrades, as needed.