
Anyone getting out of memory way more now?

Hi,
Since updating SER somewhere between 7.45 and 7.47, I've been getting the out-of-memory popup a lot more than before. It has become insane: I used to be able to run SER just fine at 1,500 threads and I'd only get the message about once a day, which was fine. But now I'm getting it at 900 threads after running just a few hours, which means I've had to babysit my server. I also noticed that SER is now using 2 GB of RAM all the time, whereas until a few days ago I had never seen it above 1 GB. I'll try further reducing my threads, but I'd hate to see this kill my LPM as I'm doing churn and burn. It's not my hardware either, as I've got a dedicated server with 16 GB of RAM, an 8-thread CPU at 2.67 GHz, and a 100 Mbps line.

Has anyone else noticed something similar to what I'm seeing?

And @Sven, is there any more information I can give you about what I'm experiencing to help? I've sent a few bug reports, but I don't think sending massive amounts of those will help further. Forgive my lack of programming knowledge, but is there any reason there isn't a 64-bit version of SER?
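
For what it's worth on the 64-bit question: SER is a 32-bit application, and a 32-bit Windows process is normally limited to 2 GB of user-mode address space (up to 4 GB if the executable is built large-address-aware and runs on 64-bit Windows), so the out-of-memory popup can appear once the process nears that ceiling no matter how much physical RAM the server has. Below is a minimal watcher sketch in Python, assuming the psutil package and a guessed process name of "Search Engine Ranker.exe" (check Task Manager for the real one):

    # ser_memwatch.py - warn when a 32-bit process nears the 2 GB ceiling.
    # Sketch only; assumes `pip install psutil` and that SER's process is
    # named "Search Engine Ranker.exe" (a guess - check Task Manager).
    import time
    import psutil

    PROCESS_NAME = "Search Engine Ranker.exe"  # hypothetical name
    LIMIT = 2 * 1024 ** 3                      # default 32-bit user-mode limit

    def find_proc(name):
        for p in psutil.process_iter(["name"]):
            if p.info["name"] == name:
                return p
        return None

    while True:
        proc = find_proc(PROCESS_NAME)
        if proc:
            # Committed memory (vms) tracks address-space pressure more
            # closely than resident memory does.
            used = proc.memory_info().vms
            print(f"{used / 1024**2:.0f} MB committed "
                  f"({used / LIMIT:.0%} of the 2 GB ceiling)")
            if used > 0.85 * LIMIT:
                print("WARNING: close to the 32-bit limit, expect OOM popups")
        else:
            print("SER process not found")
        time.sleep(60)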

Comments

  • goonergooner SERLists.com
    I have the same problem on a high-spec dedi too. Would love to see a 64-bit version.
  • ronron SERLists.com
    edited January 2014

    Yes, I noticed that the RAM spiked from 1 GB to 2 GB - that's one helluva big jump.

    @sven - Any idea on what changed that could be causing this?

    Edit: I should add that I am not seeing that so far in v7.48.

  • KaineKaine thebestindexer.com
    I have 32 GB of RAM just waiting to serve ^ ^

    RAM consumption isn't a problem for me, except that the program has to accommodate those who don't have much.

    I think 4 GB of RAM is the minimum for a PC now.

    +1 for a 64-bit version, if it doesn't require too much work
  • goonergooner SERLists.com
    edited January 2014
    RAM is not a problem, CPU is not a problem, SER is the problem.
    Sad but true.
  • edited January 2014
    I've sorted my out-of-memory errors (see my thread): I've unchecked the 'continuously post even if failed' option, unchecked identified and failed in the global site list, I have 7 English search engines, and run 20/20 on the scheduler. LPM is now 100-110 all day. This only changed yesterday, so it's been around 30 hours of decent stats.

    My server had to be rebooted as I think it had a problem, and it's been great since, so I can't say whether it was SER, my settings, or my server, as I changed/updated all three around the same time.

    Running 500 threads, 65 proxies, verify every 1440 minutes and re-verify every 5000 minutes.

    I've trimmed down the lists I've been adding to my dummy projects so SER can clear through them. One of them is still at 800k (originally 2 million). I remove duplicates every day, and it's currently cleaning the lists too (slowly). I haven't updated to 7.48 and don't want to yet, as an update usually slows everything down for a couple of days; not sure why, but it's done that for me since 6.1x.
  • SvenSven www.GSA-Online.de
    Sorry, but things like that are hard to debug. If someone can reproduce it in any way, I am more than happy to fix it.
  • Same here.
  • KaineKaine thebestindexer.com
    edited January 2014
    (Google Translate)

    @JudderMan "I remove duplicates every day"

    Do you mean duplicates at posting or at scraping?

    For me, the "out of memory" errors happen when exporting a project.

    As if the swap file isn't being used any more.

    The machine I'm currently running SER on only has 4 GB.

    We ask Sven for a lot of updates; it's normal that rules start to overlap after a while.

    I personally disable 99.9% of the options and have only one project.

    The problem must be deep.

    "Also, I've mentioned this before, but SER says my CPU is running at 99% when Task Manager shows it's actually <20%, sometimes 4%"

    Same here, and this worries me more than the RAM problem. But maybe that 99% isn't CPU activity, only CPU cache?

    I vote for stopping the update service for a while, long enough to optimize hardware resource usage. The more options there are, the more we run into a CPU/RAM bottleneck, which is tied to the restrictions of a 32-bit application. This is not Sven's fault.

    Btw, I wish him good luck; an app like SER must have an impressive codebase...

    Hang in there!
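
    On the 99% CPU discrepancy quoted above: SER's own figure may reflect saturation of its internal thread pool rather than machine-wide CPU load, which would explain Task Manager showing far less. One way to check from outside is a Python sketch like the one below, assuming psutil and the same guessed process name as earlier:

        # ser_cpucheck.py - compare SER's CPU use against the whole machine.
        # Sketch only; assumes psutil and the guessed process name below.
        import time
        import psutil

        PROCESS_NAME = "Search Engine Ranker.exe"  # hypothetical name

        proc = next((p for p in psutil.process_iter(["name"])
                     if p.info["name"] == PROCESS_NAME), None)
        if proc is None:
            raise SystemExit("SER process not found")

        proc.cpu_percent(None)    # prime the per-process counter
        psutil.cpu_percent(None)  # prime the system-wide counter
        while True:
            time.sleep(5)
            # A process can report >100% on multi-core machines, so divide
            # by the core count to match Task Manager's scale.
            ser_cpu = proc.cpu_percent(None) / psutil.cpu_count()
            total = psutil.cpu_percent(None)
            print(f"SER: {ser_cpu:.1f}%  machine: {total:.1f}%")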
  • I have been running at 700 threads for roughly 20 hours now, and I haven't seen the error message yet, which is quite nice. But what makes me wonder is that I used to be able to run a much higher number of threads with no problems, and that was no more than a few days ago. Surprisingly though, I've been at 500 LPM with 700 threads, which I used to get with 1,000-1,200 threads, so that's definitely nice.

    @JudderMan, thanks a lot for your input. I remember reading your thread while running 1,200 threads just fine; at that time I had checked the option to continuously post even to failed sites, unchecked failed and identified, checked around 130 search engines, wasn't using the scheduler, and was running 5 projects, each with an imported file of 3 million URLs (not split).

    @Sven, I guess no one has been able to reproduce this in a suitable way for you to look into yet? It simply happens when I run a shitload of threads, though I've been forced to decrease my threads more and more after every 3-4 updates recently, even though the only setting I've changed during this time is the number of threads. Also, you wouldn't happen to know what has caused SER's higher RAM usage?
  • @Sven Instead of creating a new thread, I just thought I'd ping you here. I'm also getting loads of "no article found, add new" type errors in the logs. Running 7.49.
  • SvenSven www.GSA-Online.de
    edited January 2014

    @Pratik just disable the dupe-checking for articles.

    @fakenickahl sorry, no one gave me anything to play with, so it's hard to reproduce and "fix" something here. There hasn't been much I changed lately that could have caused this. Actually, less memory is used now (site list handling).

  • @Sven Then wouldn't it submit loads of duplicate articles? Hmm.
  • @Kaine I remove duplicate URLs/domains using Options > Advanced > Tools > Remove Duplicate URLs (or domains) - there's a sketch of what that step does after this post.

    @Sven after chatting a lot with Gooner, it looks like these memory errors are caused by having files that are too large in SER, i.e. identified/failed or keyword files, or by not using macros while running lots of projects (300+).

    I think the 'weight' of SER can slow down its processes; the user (us) usually thinks there's a problem with their LPM, so they increase threads, LPM doesn't increase, and so we see a lot of low-LPM and out-of-memory threads popping up on the forum.
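
    For anyone wondering what that duplicate-removal step amounts to, here is a rough Python sketch of deduping a site list file by URL or by whole domain. It assumes the file holds one URL per line, and the file name in the example is made up:

        # dedupe_sitelist.py - drop duplicate URLs (or whole domains).
        # Sketch only; assumes the list file holds one URL per line.
        from urllib.parse import urlparse

        def dedupe(path, by_domain=False):
            seen, kept = set(), []
            with open(path, encoding="utf-8", errors="ignore") as f:
                for line in f:
                    url = line.strip()
                    if not url:
                        continue
                    # Key on the hostname for domain dedupe, else the full URL.
                    key = urlparse(url).netloc.lower() if by_domain else url
                    if key not in seen:
                        seen.add(key)
                        kept.append(url)
            with open(path, "w", encoding="utf-8") as f:
                f.write("\n".join(kept) + "\n")
            return len(kept)

        # Hypothetical file name:
        # print(dedupe("sitelist_Article-WordPress.txt", by_domain=True))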


  • I'm getting those errors a couple of times daily, and I'm only running like 100-200 threads. I get them on lower thread counts too, anyway. And my PC/ISP aren't that bad, I guess.
  • SvenSven www.GSA-Online.de
    edited January 2014
    The next version will improve things here. I just changed a lot of stuff, so memory usage is way less.
  • +1 You are the hardest-working programmer I have ever met. Thanks for making this awesome tool!
  • That sure sounds great, Sven! I'm always excited to update SER and see what new changes have been made, as I'm confident your intention is always to improve the program. I'll report back with what happens after trying it out for a while.
  • goonergooner SERLists.com
    Thanks @sven - Out of memory has been killing me over recent weeks.
  • My input, which MIGHT help some people:

    Only run 10x as many threads as you have active projects, and if possible that figure should be around double the number of proxies you have.

    For instance: I usually run 20/20 on the scheduler with 450 threads and it's been working fine, but today I gave 12 projects a few hours on their own and was getting out-of-memory errors. Lowering the threads to 120 has it running at 135 LPM, 860 MB RAM and 99% CPU (my server is barely noticing any usage, i.e. 4-8% CPU and 1 GB RAM). I have 60 working proxies and SE searches at 12 seconds.

    Example: if you have 20 projects, you could use 10 proxies and run at 200 threads, but the chances of 10 proxies getting burned out are high, so you'd need to increase the time between searches, probably to 100+ seconds, and your LPM is going to be low. More proxies and projects mean more threads can be used, but the time between searches still needs caution. The faster the server, the more projects you can run at any one time (I can run 75 if SER is searching and 150 if I'm verifying).

    AND, where possible, make SER lighter: use macros, don't let it waste resources on the global identified and failed folders, clean your lists, remove duplicate domains and URLs every day, make sure you have enough SEs selected (5-10), don't let SER waste time continuously posting to the same broken/failed URL, use Santos's footprint editor and Google some footprints (there are many huge lists on BHW). Adjust, tweak, test and MANAGE your SER regularly. Don't copy anyone else's strategy; take snippets of information from multiple sources and test, test, test until you know what works.

    I'm sure someone brighter than me can give an exact calculation taking all the variables into consideration.
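
    As a rough stab at that calculation, here is the rule of thumb above as a tiny Python sketch (10 threads per active project, capped around twice the working proxy count). The function is illustrative only, not anything SER itself exposes:

        # thread_estimate.py - rule-of-thumb thread count from this thread:
        # ~10 threads per active project, capped near 2x the proxy count.
        # Illustrative only, not an official SER formula.

        def suggested_threads(active_projects, working_proxies):
            by_projects = 10 * active_projects
            by_proxies = 2 * working_proxies
            return min(by_projects, by_proxies)

        # The example above: 12 projects and 60 proxies -> 120 threads,
        # the setting that stopped the out-of-memory errors.
        print(suggested_threads(12, 60))   # 120
        print(suggested_threads(20, 10))   # 20 - ten proxies burn out fast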


  • goonergooner SERLists.com
    Great job on the update @sven... I'm seeing a big reduction in memory usage. Thank you.
  • edited January 2014
    ...ignore everything I've just said... I just updated to 7.50 and I'm getting 1 LPM. I was constantly getting 135 LPM for the past 24+ hours... I knew I shouldn't have updated, as it always ruins things for me. Nothing else changed, I just updated.
  • goonergooner SERLists.com
    @judderman - SER always goes into mass-verify mode for me when I update/restart... Maybe it will take a while to get up to speed?
  • Hmm, it's not verifying, it's posting. Lots of "download failed" errors, though. It's just hanging; the lower scrolling log is so slow.
  • goonergooner SERLists.com
    That's strange, I don't know what to say really... I'm only seeing good things: same LPM, and memory usage hasn't gone above 800 MB so far.
  • BrandonBrandon Reputation Management Pro
    7.50 fixed the memory errors for me. LPM is normal.
  • edited January 2014
    I can run 1,000 threads now when not using lists, though... hopefully it sorts itself out. I noticed that the active/inactive projects weren't 'remembered' after the update, not sure why - I'll check everything else to make sure it's the same as before.
  • I can't see any difference, still getting errors.
  • BrandonBrandon Reputation Management Pro
    I have had a few errors, even though earlier I stated it was fixed.
  • Wow... I only run 100 threads with 30 proxies. How much is your monthly server bill to run 500+ threads? How many proxies are you using, and what are the specs on your servers? I just queued mine up to 500 for the hell of it, and I don't think the VPS enjoyed it; RAM use seemed reasonable, but CPU bounced around at or near max.
  • edited January 2014
    I'm back to 900 threads after the latest update and haven't seen any problems so far, though I'll let it run for a few days before I know for sure.

    @Royalmouse, I'm paying $50/month for a dedicated server with 16 GB DDR3 RAM, an 8-thread CPU at 2.67 GHz, and a 100/100 Mbps connection. Honestly, SER mostly uses only 30-40% CPU, and SER + Windows 7 + Captcha Breaker together use around 3-4 GB RAM, so it's pretty much overkill. As for proxies, I'm mainly using public proxies from a very good hourly-updated list of Google-passed proxies. So it sure can be done pretty cheaply if that's what you're after.