8.31 Still Frying CPU

Is anyone else having issues with 8.31? 

I'm still getting random spikes of high CPU usage, and I've had to drop to 250 threads, down from 600 earlier today.

Even at 250 threads I'm still getting higher CPU usage than when I was running 600 threads on previous versions.

Comments

  • I have the same...
  • gooner SERLists.com
    Yep, same here. I'm not sure it's unique to 8.31 - it seems to have been like this for me for the last 5 or more versions.
  • Same here, 8.31 is killing my CPU; GSA SER and Captcha Breaker hang within the second minute.
  • 150 threads, 93% CPU, machine is hanging.

  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    edited April 2014
    I'm at 300 threads, 1 GB of memory and 20% CPU.

    Everything is running great for me. I guess all the bug reports I sent in helped @Sven fix the problems I was having. Great job.
  • The latest update is unusable for me; it's literally frying my CPU even with just 25 threads.
  • Just upgraded, so not sure what I should be getting :/
  • edited April 2014
    Running 7500 threads on a D-Wave 512-qubit Vesuvius quantum processor at 12% CPU usage with 1.5GB RAM usage. No problems here. 
  • For me 8.31 made the problem a tiny bit better, in that SER is not completely maxed out right now. It IS, however, at a good 90+% all the time.

    One new problem is that my LpM has just plummeted completely. In both copies of GSA SER I run, it's less than 10 per minute - and this happens with nothing but imported lists, which should run way faster than that. Just a couple of versions back it was at a healthy 70 LpM pretty consistently - and at that point I even used SER for scraping.


  • All mine are running fine except SER is not using all of the threads I assign to each copy, but VpM is as it was over previous versions. 


  • edited May 2014
    @JudderMan a D-Wave 512-qubit Vesuvius quantum processor? I presume you are joking :)
  • Same here, whole system crashing because of SER.
  • edited May 2014
    SER freezes completely as soon as I start projects. Even CTRL+ALT+DEL opens the Task Manager with a big delay, as if a program is stuck in an endless loop and never gives up the CPU.
    I was on 8.28 (or 8.29?) and updated to 8.31 in one step yesterday. This behaviour definitely did not happen with 8.28; it is very evident now and happens every time.

    I have a very simple project structure, running with 25 threads - nothing high performance. Running on my PC at home.

    Looks like this is a good time to create new mail accounts ;-)
  • edited May 2014
    @jjumpm2 nope, 4 dedis, dual Xeons of varying speeds.

    I spent many months optimising the setup of SER (thanks to many people on here: gooner, ron, and many more) and I honestly think that keeping SER 'light' - i.e. using macros where possible, not clogging it up with loads of content and not pushing the threads too hard - is what keeps it fast for me. I let SER scrape, by the way.
  • Same here; I lowered my active projects to 10 and it's still crashing after a couple of minutes. Before, I ran 300 threads with 30-40 active projects without any problems.
  • Guys, are your problems strictly connected to the CPU? I also have high CPU usage, but the worst thing for me is the memory issues and the constant popup alerts...
  • CPU is fine for me on 8.31; memory is still running at about 2 GB.
  • davbel UK
    edited May 2014
    I think @meph and @judderman may have a point here.  Unless @Sven tells us differently :D

    @Sven do the file sizes in the projects folder have an impact on performance? Some of the file sizes are fairly big - many are a few MB, but some, especially the Static and Targets files, are quite big at 80-90 MB.

    Some are obvious, but some not so:

    Articles File - Article content
    Hosts Done File - ???
    New Keywords - New KWs found on sites
    PRJ - Project Settings
    RND_Content - Guessing random content, but what for ???
    Static File - Looks like when the link was posted ???
    Success File - Successful Posts
    Targets File - Target URLS File
    URLS_Done - URLSs posted awaiting verification ???
    Verify File - URLs to verify

    There are also random copies of the above files with four-digit number extensions - I'm assuming these can be deleted, as they all seem to be weeks or months old.

    Assuming file size does have an impact on performance, the big Targets files will be the ones where I've loaded a list into a specific project, so the lesson is to import lists to Identified or split them over a number of projects. But can we trim/reduce/remove the other files?
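    For anyone wanting to audit these files before deleting anything, a small script can flag the oversized ones and the old four-digit-extension copies. This is just a sketch, not official SER tooling - the projects path is an assumption for your install, and the script only reports, it deletes nothing:

    ```python
    import re
    from pathlib import Path

    # Hypothetical projects folder path - adjust to your own install
    PROJECTS_DIR = Path(r"C:\GSA\GSA Search Engine Ranker\projects")

    # Leftover copies end in a four-digit extension, e.g. "myproject.Targets.1234"
    FOUR_DIGIT_EXT = re.compile(r"\.\d{4}$")

    def report(projects_dir: Path, big_mb: float = 50.0):
        """Return (oversized files, four-digit-extension leftovers)."""
        big, leftovers = [], []
        for f in projects_dir.iterdir():
            if not f.is_file():
                continue
            if FOUR_DIGIT_EXT.search(f.name):
                leftovers.append(f.name)
            elif f.stat().st_size > big_mb * 1024 * 1024:
                big.append((f.name, f.stat().st_size // (1024 * 1024)))
        return big, leftovers
    ```

    Running `report(PROJECTS_DIR)` lists anything over 50 MB plus the numbered leftovers, so you can eyeball them before touching anything.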
  • I ran a lot of tests and came to the conclusion that the memory issues (and, to a lesser extent, the CPU) are somehow connected to the list being run. I tested SER with only one project and a small list and it didn't help. But...
    I have several lists (from different sources) and one that is mine - one I scraped myself.
    When I first ran SER on a bought list (just after purchase) I didn't get any problems, but after some time these lists have a huge impact on my memory.
    But when I use my own scraped list I don't have any memory problems at all (5 projects without limits running at 500-600 MB; on the bought list I get almost 3 GB of memory usage even with one project running!)

    Now I am testing the "HTML timeout" and "maximum size of website to download" options for the problematic lists - I changed the values from 160-180 to 30-60 seconds and from 4 MB to 1-2 MB, and memory usage is much lower, as is the CPU a little.
    I think this is the problem with bought lists - after some time they are hammered to death...

    Can somebody confirm that?

    I think this is not the main cause of the memory AND CPU problems, but it is part of it.
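    The idea behind those two settings can be sketched in a few lines. This is just the general technique of capping a download with a timeout and a byte limit, not SER's actual implementation - the function names are mine:

    ```python
    import io
    import urllib.request

    def read_capped(stream, max_bytes: int) -> bytes:
        """Read from a file-like object, stopping once max_bytes is reached,
        so one oversized page can't balloon memory usage."""
        chunks, total = [], 0
        while total < max_bytes:
            chunk = stream.read(min(65536, max_bytes - total))
            if not chunk:  # end of stream before hitting the cap
                break
            chunks.append(chunk)
            total += len(chunk)
        return b"".join(chunks)

    def fetch_capped(url: str, timeout_s: int = 30,
                     max_bytes: int = 2 * 1024 * 1024) -> bytes:
        """Fetch a page with both a connection timeout and a size cap."""
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return read_capped(resp, max_bytes)
    ```

    With a lower timeout, slow (hammered) targets get abandoned sooner, and with the byte cap, a bloated page stops costing memory past the limit - which would explain the drop meph is seeing.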
  • edited May 2014
    I've noticed that my oldest server, which inevitably has the largest verified, failed and identified lists, is slower than the newer servers (even though it is more powerful). That said, I don't want to delete or trim down my verified list in case good targets get deleted. I do de-dupe once a week.

    NB: that server also has more projects than the newer ones...
  • @meph - bought lists:

    1) Occupy more of the target sites' bandwidth on average at any given time, because tons of people have those exact same targets and are hitting them

    2) Serve larger pages on average, from the same sites getting hit over and over again

    Both reasons contribute to higher resource usage.
  • For me, SER takes 95%+ CPU even on 10 threads lol. So no links for me till it's fixed - soon, I hope.
  • After the latest update the CPU is now running very low - the lowest I have seen it run.

    Thanks for the Fix.
  • Same here. 8.32 solved the issues: CPU usage is lower, and on my home PC I now run 100 threads instead of 50-75.
  • Back up to 600 threads and running at an average of 32% CPU.

    Thanks @Sven