
Remove Already Successful Domains from imported lists

Just a quick suggestion - is it possible to add another filtering option related to Remove Duplicate Domains/URLs, namely Remove Already Successful URLs? It would make SER compare the Target URL cache with the successful URLs of the given project and clear the ones that were already successful from the target cache.

The reason for this: when managing many different projects with different lists (manual link importing), some links will slip into a project a second (third, etc.) time, so this filter would improve performance by clearing those links out.
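The proposed filter boils down to a set difference: keep only the imported targets that are not already in the project's successful list. A minimal sketch of that idea, with hypothetical names (`targets`, `successful`) that are illustrative only, not SER internals:

```python
def remove_already_successful(targets, successful):
    """Return the target list with already-successful URLs filtered out."""
    # Normalize lightly so trivially different spellings still match.
    seen = {url.strip().lower() for url in successful}
    return [url for url in targets if url.strip().lower() not in seen]

targets = [
    "http://example.com/page1",
    "http://example.org/page2",
    "http://example.com/page1",  # slipped into the import a second time
]
successful = ["http://example.com/page1"]

print(remove_already_successful(targets, successful))
```

With the sample data above, only `http://example.org/page2` survives the filter.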

I may be wrong somewhere (it's pretty late here), but anyway, just a thought :)

Comments

  • Sven www.GSA-Online.de
    But those URLs are filtered out anyway before a thread is started. I don't think it will make much of a difference.
  • Nikodim
    Ah, you mean like when the log shows "already successful" it just kinda skips with no performance impact? :-?
  • Sven www.GSA-Online.de
    yes
  • Nikodim
    Ahhh, I see now. Thanks for the clarification!  :)
  • yourmind
    Sven, what about filtering imported URLs against "already parsed", and bulk-filtering URLs against failed?
  • Sven www.GSA-Online.de
    Wouldn't make much of a difference, as already mentioned in my first post.
  • yourmind
    Ok, what about a bulk clean of imported URLs against failed lists?
  • Sven www.GSA-Online.de
    edited March 2015
    That's something I have added for the next update.
  • yourmind
    Sven, you are the best!