
Clear Target URL Cache

Hi,

What is the best way to clear the target URL cache in GSA?

1. Daily
2. Weekly
3. Monthly

Comments

  • Deeeeeeee the Americas
    4. Never ✔

    haha I know this can get bloated. I have yet to clear it even once. :|

    Does this ever happen automatically?



    Is this bad to never clear? If so, please help me understand why this needs to be reset.

    (besides the obvious one, slowing down the program, which is definitely something to avoid)


    And then, of course, back to rumeshfb's question... if it's a necessity, how often?


  • rumeshfb Sri Lanka
    I ask this question because my hard disk is filling up very fast because of the GSA target URL cache. Normally I clear it monthly. I want to know the best interval for that: daily, weekly, or monthly.

  • Deeeeeeee the Americas
    edited July 2019
    You can easily automate this task in Windows with a batch file you write and have the system run it at set intervals you determine (Task Scheduler can fire it for you).

    You can also make periodic backups this way, as well as sets of timed backups (retaining, say, backups from two days back as well as the current one) with batch files; see the sketch below.
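
    For example, here's a minimal sketch of such a batch file. Treat the paths and the *.new_targets file mask as guesses on my part: check where your install actually keeps its project data and which files hold the target URL cache, and stop SER before running it.

        @echo off
        rem clear_ser_cache.bat -- rough sketch only; stop SER before running it.
        rem ASSUMPTIONS: the projects folder location and the *.new_targets mask
        rem are guesses; point them at wherever your target URL cache really lives.
        rem BACKUP is just a placeholder location; pick any drive with free space.
        set "PROJ=%APPDATA%\GSA Search Engine Ranker\projects"
        set "BACKUP=D:\ser_backups"

        rem Copy the project folder into a dated backup folder first.
        rem (%date% is locale-dependent; the substitutions just strip slashes/spaces.)
        set "TODAY=%date:/=-%"
        set "TODAY=%TODAY: =_%"
        robocopy "%PROJ%" "%BACKUP%\%TODAY%" /e >nul

        rem Remove dated backup folders last modified 2+ days ago (the retention idea above).
        forfiles /p "%BACKUP%" /d -2 /c "cmd /c if @isdir==TRUE rd /s /q @path" 2>nul

        rem Finally, delete the (assumed) cached target files to reclaim disk space.
        del /q "%PROJ%\*.new_targets" 2>nul

    To have Windows run it on a schedule (weekly, in this example), register it once with Task Scheduler:

        schtasks /create /tn "Clear SER target cache" /tr "C:\scripts\clear_ser_cache.bat" /sc weekly /d SUN /st 03:00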

    But again, I wonder, as you probably do, what the other considerations are for deleting (so far we have SPEED and STORAGE SPACE), and thus how frequently it should be done. Is never really bad??? :o
  • rumeshfb Sri Lanka
    So you mean the target URL cache is bad?
  • Deeeeeeee the Americas
    edited July 2019
    Hmm... if the target URL cache were bad, I don't think Sven would have put it in.

    Or it would be possible to disable it. Can it be?

    Maybe it becomes bad *ONLY* when it gets too bloated, because then SER has to read through the file to find a match each time, and a longer list of URLs means a longer scan for matches within that URL cache list. That HAS to affect time and ultimately LPM.

    "so you mean target URL cache is bad ?"

    I meant: is it bad to NEVER, EVER clear out the target URL cache, as I, for example, never do? lol :p





  • rumeshfb Sri Lanka
    But my hard disk is nearly full now; it's filling up faster and faster. I have many projects in GSA for different websites. Normally when I clear the target URL cache I can free up a lot of space.
  • Deeeeeeee the Americas
    edited July 2019
    "But Hard Disk memory is now going to over Its filling up faster. "

    I guess if you are facing disk storage space issues, deleting the target URL cache will keep you up and running, as opposed to not even being able to run SER because your hard drive is full.

    In that case, deleting makes perfect sense. :) You get back the space on your hard disk and can operate GSA products with some storage headroom.

    Maybe there's a way to keep this file at a certain size you choose, deleting only the oldest target URLs (or certain classes of other URLs: country match, TLD match, etc.) first?

    I wonder if we could do this from within SER.
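
    Outside of SER, a crude batch approximation would be to wipe the cache file only once it grows past a size cap you pick. Just a sketch: the path and file name are stand-ins, and it clears the whole file rather than trimming only the oldest entries (which would take more than plain batch).

        @echo off
        rem cap_cache.bat -- sketch: clear the (assumed) cache file only when it
        rem exceeds a size cap. Stop SER before running it.
        rem ASSUMPTION: "myproject.new_targets" is a stand-in; point this at
        rem whichever file actually holds your target URL cache.
        set "FILE=%APPDATA%\GSA Search Engine Ranker\projects\myproject.new_targets"

        rem 50 MB cap, in bytes (cmd compares as 32-bit, so keep it well under 2 GB).
        set /a CAP_BYTES=52428800

        rem %%~zF expands to the file size in bytes; SIZE stays 0 if the file is missing.
        set "SIZE=0"
        if exist "%FILE%" for %%F in ("%FILE%") do set "SIZE=%%~zF"
        if %SIZE% gtr %CAP_BYTES% del /q "%FILE%"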
  • rumeshfb Sri Lanka
    Yes, right.
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    edited July 2019
    The target URL cache is, quite literally, the list of targets a given project is going to attempt to post to.

    If this becomes too big, it can slow SER down. I believe this is mostly related to the throughput limit of your hard drive. It's not about how big it is, but how FAST it is. (And possibly the 32-bit 2 GB memory limit of SER. But I don't think Sven reads the entire file into memory. It could also be a limitation of Windows itself.)

    Personally I run SER on an SSD drive. I seem to get much better performance that way, and most hosts charge a pittance for a small SSD drive (100 GB or so; you should run your OS off that drive too).

    My target URL caches are usually pretty huge (1-2 million URLs per project) and I don't have any problems, but I know I used to, and thinking back, I haven't since switching to SSD drives.

    I've not exactly quantified my results with this, so it's possible it does come down to some read-speed limitation in SER or Windows itself, but that's my two cents.

    -Jordan
    Thanked by: Deeeeeeee
  • andrzejek Polska
    edited July 2019
    Switching to normal SSDs or NVMe gives a performance boost. Some time ago I was running SER on a RAM disk, but that was bad because the data doesn't persist...

    http://www.m-techlaptops.com/ShopOnline/pc/How-to-boost-your-computers-speed-and-longevity-d67.htm

    " In other words, the speed at which you access the data is more important than the rate at which you’re accessing it. A normal hard disk has a response time of about 16ms, a good SSD will respond in 0.05ms, RAM will respond in 50ns. (notice ms vs. ns) To put that in perspective, 0.05ms is equal to 50,000ns. This means that RAM can serve up data in memory 1000 times faster than a NVMe drive even though the file size they can carry is about the same. It is like Speedy Gonzales suddenly being able to carry as much stuff as Bubba. Of course Speedy is going to hands down me the winner. he can carry as much and do it a thousand times quicker."
    Thanked by: Deeeeeeee
  • VMart Natural SEO
    A lot of SER options need to be handled manually; it's not fully automated.
    You need to clear the cache manually. When will Sven make it run automatically on a schedule?
    One year back I sent a feature request to Sven about this.