
High CPU Load Not Resolved (For Me)

124 Comments

  • My scraped lists are using very little resources compared to my verified lists. When I run a scraped list in SER I can use about 1,500 threads, while anything over 300-400 threads on a verified list gives me out-of-memory errors.
  • Whoa!

    I just checked the e-mails on both SERs and JUST ABOUT ALL OF THEM FAILED.

    What on earth is with that? Did I get crappy e-mails? I haven't checked the actual projects just yet, but I was indeed wondering why they're hardly building any links at all!

    Could trying to use failing e-mails cause problems like this? I'll just change them and see what happens...
  • I'm having the same problem with CPU load. I haven't changed anything and everything was running fine until the last two updates. Now, no matter what I seem to do, the CPU seems all but stuck at 99-100%.

    Can someone help with this?
  • 2Take2 UK
    edited May 2014
    @fakenickahl, I've not noticed it myself, but would it not stand to reason that processing a scrape in a project would use less memory than running a verified list, considering that with a scrape you're probably only posting to about 5% of the target urls (and just sorting/identifying the rest), whereas with a verified list you would be posting to nearer 90% (with all of the extra steps that posting involves)?

    Also, not saying that you do it, but the problem is probably made worse by people who don't set their projects up to separate contextual and non-contextual engines. In my experience, doing this seems to make a big difference to SER's performance when running lists, even if you're doing churn and burn.
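  • To put rough numbers on the point above (purely illustrative, using the 5% and 90% figures from that post): for 100,000 target urls, a scraped list means roughly 5,000 actual posting attempts while the rest are only identified and sorted, whereas the same 100,000 urls from a verified list means roughly 90,000 posting attempts - about 18 times as much posting work, which would go a long way towards explaining the difference in memory and CPU load.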
  • Guys, can you check your emails as well?

    I just pulled up some old projects, and I'm getting this even with e-mails that worked just fine when I was running the projects:
    ERR: authentication failed

    What could be causing that? Is it because I've been changing versions so many times that I'm now missing some file somewhere, or what?
  • I just got some new emails from BHW, and they are failing as well. Must be some problem with hotmail itself. Can someone else check as well and see if your hotmail accounts are working or not? That would definitely explain why nothing is happening with SER right now, as I use exclusively hotmail accounts.
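  • For anyone who wants to test their accounts outside of SER, here is a minimal sketch that simply tries to log in to each mailbox over POP3. It assumes the accounts sit in a file called accounts.txt as one email:password pair per line, and that the provider allows POP3 access on pop3.live.com port 995 (Hotmail's usual POP3 endpoint) - adjust the host and port if yours differ. It needs nothing beyond the Python standard library.

        import poplib

        def check_account(email, password, host="pop3.live.com", port=995):
            """Return True if the mailbox accepts this login over POP3-over-SSL."""
            try:
                conn = poplib.POP3_SSL(host, port, timeout=15)
                conn.user(email)
                conn.pass_(password)
                conn.quit()
                return True
            except Exception as exc:
                print(f"{email}: FAILED ({exc})")
                return False

        # accounts.txt is a placeholder name: one "email:password" pair per line
        with open("accounts.txt") as fh:
            for line in fh:
                line = line.strip()
                if ":" in line:
                    email, password = line.split(":", 1)
                    if check_account(email, password):
                        print(f"{email}: OK")

    If every account fails here too, the problem is with the accounts or the mail provider rather than with SER.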
  • I've just loaded some new hotmail accounts in and they are working. I use BanditIM for hotmail; he's on here too.

    I always verify and reverify then change the emails in older projects periodically.

    I don't use scraped lists so I can't really comment on the rest, sorry.
  • edited May 2014
    @Artsi Yep hotmails are dead for me too. 

  • Okay!

    Yep, I've ordered hotmail accounts from Fiverr and now from BanditIm. Neither of those is working for me right now.

    Wow, that's awesome. I hope hotmail gets their stuff sorted out FAST. Can't quite do anything right now...
  • People still experiencing CPU issues?

    I have now installed and re-installed versions ranging from 7.51 to 7.78 to 7.79 to 8.31 to 8.32.

    I've rebooted the VPSes several times and gone through the settings many, many times over...

    Could you people list some reasons for high CPU usage? I'm seriously running out of ideas here.

    One thing that came to my mind is this: does the character encoding of the url lists play any kind of role in the problem? I mean... Is Unicode harder for the CPU than ANSI, for example? At least the file is a lot bigger...
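  • On the encoding question: the encoding itself probably isn't what is melting the CPU, but a Unicode (UTF-16) file is roughly twice the size of the same list saved as UTF-8/ANSI, so re-saving it before importing can't hurt. A minimal sketch, assuming the list really was saved as UTF-16; the file names are placeholders.

        # Re-save a UTF-16 ("Unicode") URL list as UTF-8 before importing it into SER.
        # Change the source encoding if your file was saved differently.
        with open("scraped_utf16.txt", "r", encoding="utf-16") as src, \
             open("scraped_utf8.txt", "w", encoding="utf-8", newline="\n") as dst:
            for line in src:
                url = line.strip()
                if url:                 # drop blank lines while re-encoding
                    dst.write(url + "\n")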
  • Okay, a small update.

    So I went ahead and created this catchall email for the projects (which, by the way, tends to give me some sort of SCRIPT ERROR in the log, but anyway...)

    When I was running those scraped lists, SER maxed out at around 300 threads... I poured those urls into their own identified, successful and verified folders.

    I then fed those site lists into the 14 projects I had, and even at 1,250 threads SER is moving along pretty nicely (so nicely, in fact, that even the stats can't keep up).

    So, the problem is in running scraped lists, end of story. Now the question is, what causes that? Have I scraped a bunch of crap? I must say that I did see A LOT of "no engine matches" entries in the log. Could that be a reason for high CPU?
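  • If raw scraped targets are what pushes the CPU up, one thing that might take the edge off (a sketch under my own assumptions, not a SER feature) is pre-cleaning the scraped file before importing it: throw away malformed lines and keep only one url per domain, so the identification step has less junk to chew through. The file names and the one-url-per-domain rule are assumptions to adjust.

        from urllib.parse import urlparse

        seen_domains = set()
        with open("scraped.txt", "r", encoding="utf-8", errors="ignore") as src, \
             open("scraped_clean.txt", "w", encoding="utf-8") as dst:
            for line in src:
                url = line.strip()
                parsed = urlparse(url)
                if parsed.scheme not in ("http", "https") or not parsed.netloc:
                    continue                  # skip junk that can't be a posting target
                if parsed.netloc in seen_domains:
                    continue                  # already kept a url for this domain
                seen_domains.add(parsed.netloc)
                dst.write(url + "\n")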
  • edited May 2014
    Of course it is, @Artsi. Verified lists more or less contain only targets you can post to, whereas with scraped lists the majority of targets will be unsuccessful.

    From my observations, the newest SER versions are indeed more CPU hungry overall; where I used to be able to run 900-1,200 threads, I'm now at only about 150-300. But the S/V ratio is more pleasing to me, and there have been too many great changes to step back to 7.9x, 8.0x or even 8.1x. Then again, I still need more tests to decide which works better for me.
  • Well, is it really so then that sorting the lists takes more CPU than actually creating the profiles and submitting and all that?
  • That sounds about right, which is why some people who scrape lists have a separate VPS/dedi just to sort/filter the lists, and other dedis to post from the verified list (sharing the verified list through Dropbox - be warned about doing it this way; check gooners/ron's posts about it, as I'm not 100% sure it doesn't come with its own problems).


  • What emails are you guys using now?
  • I just installed 8.32 on a new workstation running 2 quad-core Xeon 2.26GHz CPUs with 12GB RAM. I was only using 200 threads and over half the time all 8 cores were maxed at 100%. Task Manager shows all the CPU being used by SER. Is anyone else seeing this, and what is a solution?

    UPDATE: - After discussing this with others it seems as though the high CPU load could be associated with maxing out the internet connection. Why this is, I have no idea.

    @Sven - Can you clarify if maxing out the internet connection would cause high CPU loads? If so, why, and what can be done about this?
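  • One way to test the "maxed-out connection" theory instead of guessing is to log CPU load next to network throughput while SER runs and see whether the spikes line up. A minimal, machine-wide sketch (it measures the whole box, not SER alone) using the third-party psutil package; install it with pip install psutil and stop it with Ctrl+C.

        import psutil

        prev = psutil.net_io_counters()
        while True:
            cpu = psutil.cpu_percent(interval=1)   # blocks for about one second
            cur = psutil.net_io_counters()
            down_mbps = (cur.bytes_recv - prev.bytes_recv) * 8 / 1_000_000
            up_mbps = (cur.bytes_sent - prev.bytes_sent) * 8 / 1_000_000
            print(f"CPU {cpu:5.1f}%   down {down_mbps:7.2f} Mbit/s   up {up_mbps:7.2f} Mbit/s")
            prev = cur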
  • @HUBBA, wow that's quite some computing power right there!

    Here's the thing... I have 2 VPSes I run SER on. Both have a 1000 Mbit/s internet connection, and SER is hardly using more than a couple of dozen megabits.

    Although I don't know if the task manager is only showing the incoming traffic or what, I strongly doubt that SER would be able to make use of ALL of the 1 Gbit/s connection; that's just impossible. And yet, my CPUs are melting away as well.

    One thing I don't get, no matter how much I think about it, is that I've tried about half a dozen versions, and this keeps happening on all of them.

    Go figure...
  • One thing which I keep seeing is people having CPU problems.

    @Sven - you said that the CPU usage is as low as it can be at this point. While I do respect your work a lot, let me tell you that this is not true.

    If the CPU usage is as low as it can possibly be, how do you explain the fact that on previous SER versions (7.XX) my CPU was between 3-10% when running one campaign and at most 50-60% when running multiple ones, whereas now it stays at full load when I run only one campaign? How can this be possible? I haven't changed any settings, I haven't touched anything, and in fact I even lowered the number of threads from 300 to around 100.

    With that being said, I have to say that CPU usage is not as low as it can be, as you said. On top of that, have a look at the other people who posted in this thread. They have much better CPUs than mine, yet they face the exact same problem.

    Like I said, @Sven, I do respect your work a lot. I suppose updating this software (and the other GSA software) is a demanding job, but as you can see "the cpu movie" is not finished just yet. I do understand that identifying exactly what is causing this is tricky and you might need more time or data from other users to figure it out, but in the meanwhile please don't say that the CPU is as low as it can be. I'm saying this because each time I posted a question about this issue you kept saying the problem is on my end.

    Thank you very much for understanding  
  • Just to throw my 2 pence in: I am running 8.32, 1,200 threads, 50/20 scheduled projects, memory at 1.42GB and CPU at 40-50% on a 24GB / dual quad-core Intel Xeon 2.26GHz server. LPM is running at about 21, which is pretty good considering 90% of the projects are contextual, including Web 2.0 and SEREngines.

    Have to say I am very happy with the latest version! 
  • Sven www.GSA-Online.de

    Bahh there is one thing I hate...people telling me how to do things and especially how to code.

    You can't really compare a 7.xx version with 8.xx... I don't know how many changes are between them, but just think about it... a lot has changed since then. You come in saying speed is important to you. It's not for everyone. The end result counts. You still do SEO and not just link posting. Of course some added features might have an influence on CPU usage. Yes, I know that, but from what I can see it is very low CPU usage.

    And while for one person the CPU usage is too high, you get another thread of people complaining that the CPU usage is too low. I can't make it right for everyone... still I try! Yes, maybe I can get it down even more... but then again you get more people complaining about it.

    It just gets on my nerves. Send me your project backup + settings backup and I'll try to improve things. But again, I can't make it right for everyone. Just deal with it.


  • @Sven I think you totally misunderstood me. If you read my message again you'll see that I never told you what to do or how to code. All I did was tell you about the problems I have (I didn't say "Sven, I recommend you do this and this and that"), so don't get me wrong.

    And I agree with you that a lot has changed from v7 to the latest version, and I wouldn't complain if my CPU usage had increased by an acceptable percentage. But with one campaign running at 90-99% when previously I had around 10%, there is a big difference, don't you agree?

    I'll send a backup to your email a little later today when I'm at the PC, and hopefully you'll be more relaxed.

    Once again, @Sven, I didn't tell you what to do or how to code. I thought my message was very clear, but if that's what you understood from it, I'm sorry; it wasn't my intention.
  • Kaine thebestindexer.com
    Still, guys, the priority is the final result. I know the CPU behavior must annoy you, but unfortunately this is not a priority right now.
  • Sven, you're doing a great job. Just to let you know that. Good to see that even though you weren't too happy with some of the complaints, you still offered to help. Cheers mate!
  • Same here. v8.31 was perfect but the new version is using too much CPU.
  • Sven www.GSA-Online.de
    Bahh... all I did was bug fixing and that's it. CPU usage should have decreased, if anything. If you see any difference, it is due to something else.
  • Anyone experiencing problems with v8.33?

    I just upgraded to 8.33 and I'm facing a lot of issues.

    I'm getting a lot of "Proxy ........blocked on......" even though I plugged in my new proxies for this month. I went and tested them and they were all good.

    Secondly, a lot of "no engine matches".

    Third, less than 1 LPM, which I've never had before (LPM is around 0.87).

    Fourth, I'm getting a lot of duplicate submissions.

    All this while letting SER scrape.

    Anyone here experiencing the same issues?


  • edited May 2014
    @seo4all, I'm experiencing a ton of "no engine matches" messages as well.

    I'm in the middle of figuring out what causes that. I mean... I scraped a list of 400k urls and put them in SER. I got something like 90 verified links from that list, so it might be that my footprint list was all screwed up. Need to look into this a bit more...

    Oh yeah, and I don't do any scraping with SER.

    I am, however, a little bit doubtful that this is a version issue.
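  • 90 verified out of roughly 400k scraped urls is about 0.02%, so checking whether the scraped pages even match the footprints seems like a reasonable next step. A rough sketch, assuming the scraped urls sit in scraped.txt and that you still have the footprint strings you scraped for (the two below are only placeholders); it needs the third-party requests package.

        import random
        import requests

        FOOTPRINTS = ["powered by wordpress", "powered by vbulletin"]   # placeholders

        with open("scraped.txt", encoding="utf-8", errors="ignore") as fh:
            urls = [u.strip() for u in fh if u.strip()]

        hits = 0
        sample = random.sample(urls, min(50, len(urls)))   # small random sample
        for url in sample:
            try:
                html = requests.get(url, timeout=10).text.lower()
                if any(fp in html for fp in FOOTPRINTS):
                    hits += 1
            except requests.RequestException:
                pass

        print(f"{hits}/{len(sample)} sampled pages matched a footprint")

    If almost nothing in the sample matches, the scrape (or the footprint list behind it) is the likely culprit rather than the SER version.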
  • Another issue with this version.

    I set my threads to 200 but it's only using between 3 and 10, which is way too low.

    So, to summarize, here are the problems I face with the latest version:

    1. "Proxy blocked on....." even though my proxies are fresh (and all OK after testing them) - see the quick proxy check after this post

    2. A lot of either "no engine matches" OR "wrong engine"

    3. SER is using only 3-10 threads even though it is set to 200 (never faced this before)

    4. The lowest LPM I've ever had (less than 1)

    5. Duplicate submissions

    Ping @Sven
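  • On point 1, one quick way to double-check whether the proxies are really blocked (rather than just being flagged by SER) is to request a real page through each of them and look at what comes back. A minimal sketch using the third-party requests package; the proxies.txt file, the ip:port format and the test url are assumptions to adapt to your setup.

        import requests

        TEST_URL = "https://www.bing.com/search?q=test"   # placeholder target

        def check_proxy(proxy):
            """proxy is "ip:port" or "user:pass@ip:port"; returns True on a 200 response."""
            proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
            try:
                resp = requests.get(TEST_URL, proxies=proxies, timeout=15)
                return resp.status_code == 200
            except requests.RequestException:
                return False

        with open("proxies.txt") as fh:
            for line in fh:
                proxy = line.strip()
                if proxy:
                    print(proxy, "OK" if check_proxy(proxy) else "BLOCKED/DEAD")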
  • Sven www.GSA-Online.de

    You forgot to say something about CPU usage. 

    1. Turn on debug mode and double-click on such an entry to check it in the browser

    2. Same as 1.

    3. Sorry, can't say anything about that. Could be anything.

    4. LpM bahhh

    5. What do you mean by that?

    @Sven - I haven't said anything about CPU usage now because I decided not to import target urls. Without doing that my CPU usage is very, very good. It goes to full load if I import a url list. Since I haven't found a solution to that, I decided to let SER scrape and post on its own.

    Can you tell me where the debugger is located and what I should do? I've never done this before...

    Regarding duplicate submissions, I mean just that: SER is submitting to the same site twice (it doesn't duplicate everything, though; for certain sites it posts to them twice).