
Having high CPU and Memory Errors? Try this...

Trevor_Bandura 267,647 NEW GSA SER Verified List
edited January 2014 in Other / Mixed
I've noticed on some of my personal projects, when I run them, that I get out-of-memory errors and also high CPU usage with SER. I figured out how to solve this a while ago, but only today saw it again when running a new project of mine. Sorry I never posted this before; I was going to, but I guess it slipped my mind. That's what happens, I guess, when you put things off till later. :D

If you're getting these errors, here is how to fix it. It works wonders for me and might help you.

1) Open your project options (double-click the project)
2) Go to the Options tab
3) Uncheck the option "Continuously try to post to site even if failed before"

I did lots of testing when I found this bug, and un-checking that option truly helps.

Maybe @Sven can look into this option to see why it uses so much memory and CPU.

Hope this helps. \:D/

If you know of another simple change to projects that helps to reduce memory errors and/or CPU usage, please post it here. I'm sure it will help others.

Comments

  • Sven www.GSA-Online.de
    Sorry, but "Continuously try to post to site even if failed before" can't really drop the memory usage. It must be something else. Is just one project running when this issue appears for you? If so, as always, get me the project backup ;)
  • gooner SERLists.com
    Didn't help me solve my memory issues :(
  • But try this to solve the memory issue: open SER, edit all the options you want, and after this close SER. Open it again and - without changing anything - hit "Start". It worked for me - no memory error. As soon as I change any settings after the start and hit Start again -> the memory error appears again.
  • Sven www.GSA-Online.de
    @magix naw this can't be it :/ Is that for one project only? If so I am (as always) interested in a backup to debug it.
  • No, multiple projects running. I have 2 GB RAM on the VPS. I have 10 projects running, switching every 20 minutes.
  • edited January 2014
    Still getting issues with this. Please see the screenshot - my SER stats are at the bottom of the image under the Task Manager box. I can run 500-600 threads, then get out-of-memory errors after 24+ hours of running, so I reduce it to 400 threads and then get errors after another 24 hours. Currently I'm running at 350 threads, but as you can see from the screenshot I'm hardly using my server's capabilities - I can run 500 threads and various other tools at the same time... I wonder if my server's Internet connection is having issues, or is this primarily an SER issue?


    32 GB RAM, 8-core 2.4 GHz, 60 proxies, 75 projects on a 20-minute schedule, 180s HTML timeout, 5-15 seconds between SE searches (I reduce this when it's running well). If I were running all 450 projects, I guess my server would be running close to 100%... which is the goal and the main reason for buying such a beefy server...
  • gooner SERLists.com
    @judderman - Try this:

    http://www.koshyjohn.com/software/memclean/

    Set it to clean automatically every 5 minutes. Also, you can open a command prompt, type: ipconfig /flushdns
    and hit Enter. You can do that a couple of times per day (there's a small scheduling sketch at the end of this comment). It should help to manage it, if not solve it.

    I solved it by having fewer projects. I removed 75 projects (now at 150) and it runs beautifully; I can run 50 projects at one time instead of 20 and increase threads from 400 to 600.

    Only downside is I need another installation for the removed projects.
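
    Not SER-specific, but if you'd rather automate the DNS flush than run it by hand, here's a minimal sketch in Python, assuming a Windows machine with Python installed; the 12-hour interval is just an example value (Task Scheduler would work just as well):

        import subprocess
        import time

        # Flush the Windows DNS resolver cache a couple of times a day.
        # The interval is an example value; adjust to taste.
        FLUSH_INTERVAL_HOURS = 12

        while True:
            subprocess.run(["ipconfig", "/flushdns"], check=False)
            time.sleep(FLUSH_INTERVAL_HOURS * 60 * 60)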

  • Thanks @gooner, as always - running it now. I'm still only using 9% of my CPU even with Scrapebox running at 1,000 threads (250 x 4 SEs) and SER on too...
  • gooner SERLists.com
    edited January 2014
    @judderman - No worries mate. You have a beast of a server, dude! I'm still waiting on mine... They won't even tell me an expected due date, just that it's in a queue, bla bla.

    All I can suggest is removing projects; it worked for me and I'm getting more links, a higher LpM, and CPU usage is up from roughly 30% to 70%.

    I also deleted the success project files for the lower tiers; they were getting huge. So you could try that, but then you might get duplicate links on those tiers.

    EDIT: I didn't delete the success project files, just deleted the URLs contained within. Need to be clear on that! lol
  • Ah man, that sucks from the server company, but once it's up and running the support is decent.

    Do you mean remove the projects completely from SER? Or lower the number of projects running on the scheduler? I have probably 50 inactive projects that I have set up but haven't finished building the websites for yet.

    Do you mean delete the target URLs or the URL cache? I don't mind getting duplicate links on junk tiers.
  • gooner SERLists.com
    edited January 2014
    @judderman - I completely removed the projects from SER. Changing how many were on the scheduler seemed to make only a little difference. I think each project accumulates various files and data and it all adds up to make the thing run slower.

    The other suggestion: go to AppData/Roaming/GSA Search Engine Ranker/Projects and sort the files in there by size. You will probably see that the "success" files are the largest; in my case they were 10 MB and larger each. You can open those files, delete all the URLs contained within, and then save them again, to save space and make SER run quicker (there's a rough script sketch just after this comment).

    But doing that means those projects can get duplicate links, as SER uses those files to check whether it has already posted there. So it's better to do that for junk tiers only.
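
    A minimal sketch of the same clean-up in Python, for anyone who'd rather not click through the files by hand. It assumes SER is closed, the files are plain text, and the success files keep a ".success" extension - check your own Projects folder first, since the naming may differ; the project name in the commented lines is purely hypothetical:

        import os
        from pathlib import Path

        # SER project folder under %APPDATA% (as mentioned above).
        projects = Path(os.getenv("APPDATA", "")) / "GSA Search Engine Ranker" / "projects"

        # List the 20 largest files so you can see what is eating the space.
        files = sorted((f for f in projects.iterdir() if f.is_file()),
                       key=lambda f: f.stat().st_size, reverse=True)
        for f in files[:20]:
            print(f"{f.stat().st_size / 1024 / 1024:6.1f} MB  {f.name}")

        # To empty a junk tier's success file, back it up first, then recreate it empty:
        # target = projects / "MyJunkTier.success"   # hypothetical project name
        # target.rename(target.with_name(target.name + ".bak"))
        # target.write_text("")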
  • edited January 2014
  • Hmm... thanks @gooner - I'll give that a whirl. The new update 7.39 is running at 0.4 LpM with no changes. Reverted back to 7.38 and got the same 0.4 LpM, so I'd guess something is up with my connection/server/my copy of SER/proxies/or something else.
  • gooner SERLists.com
    OK @judderman, no probs. The new update is running OK for me, so yeah, it could be another reason.
  • Aha, finally found the project files (not in the normal way, had to search for them) and some of my target URL files are 44 MB!!! That's from adding my scraped lists - I'm trimming them down now by leaving only the active ones. I think your setup would work better for me: a VPS for sorting lists and the real projects running on the dedi.

    I'm still hesitant to delete projects and get more copies of SER as that means more copies of CB and cost of VPSs.
  • gooner SERLists.com
    edited January 2014
    @judderman - Yeah, I'm reluctant too, but I had no choice. I was fed up with babysitting SER and, I don't know about you, but with the out-of-memory notification, sometimes I have to click yes or no like 1000 times before it goes away.

    I don't know how you can run 500 projects with 75 active on the scheduler; no server I've had can handle anything like that.

    The setup I have works pretty well. If you want to get really clever with it, you can use Dropbox to share the verified list from the VPS across all your dedis, so there's no manual importing of lists other than on the VPS.
  • Haha yeah I know no server could do all of the projects at the same time (it would be awesome, though ;)).

    Have you tried Dropboxifier? 
  • gooner SERLists.com
    @judderman - No, I mean no server I've had could run SER as you do now - not all projects at the same time.

    I've never tried Dropboxifier - what's the advantage over a standard Dropbox installation? Speed?


  • I am getting "Captcha service GSA Captcha Breaker not responding". Is it due to this?

    I have 8-9 projects with only 3 active at the moment. Earlier I used to run 100 threads; now I get this error even when I'm running 10 threads, and SER still occupies 90% CPU. My VPS is 3+ GHz, 2 GB RAM.
    I recently imported 150k target URLs into one of my projects - could this be the problem?
  • Ahh I see @gooner - wait until you get your dedi and see what you can push it to :) I'll be interested to hear what your settings/setup can do.

    Dropboxifier, I saw it somewhere recently and it sounds like it makes it faster/easier to do what you do with various VPSs pulling data from Dropbox etc. Not sure entirely what the difference is but it might be of some use.

    @DonCorleone - I find importing lists takes up a lot of resources, unless they are AA/verified lists, as SER has to load the page, check it, sign up, submit, and verify (I think). It's basically filtering the crap out of the list, which takes time. I'm doing the same at the moment with a 2-million list... it's taking time but running at 60 LpM, so it's churning through it at an OK rate.
  • @JudderMan So you suggest using keywords with that spinfolder macro method to find targets? Also, what could be the reason for me getting the error "Captcha service GSA CB not responding"?

    You have an awesome LpM; I hover around 5-15 LpM. Do you post to all engines? I post only to engines with a high conversion ratio (FatSteve's method).
  • Yeah, I use the spinfolder macro for keywords, 530k to be exact, split into 100k txt docs.

    I have no idea why CB isn't responding for you. I've only just bought it and it works fine for me, sorry can't help with that.

    Is that a good LpM for running raw lists? I have no idea, but my dedi is pretty good, so maybe it is. I have trimmed my engines down as per FatSteve's method, I'm adding footprints from Santos's tool, and creating lists myself, but I'll be buying Donald Beck's video series about scraping soon as I'm not very good at it.

    Once I get the settings of everything perfect I will write up my findings on here, as others have done (i.e. Ron etc.), but I'm still very much a newbie with SER so I won't pretend I know everything yet (not so much with ranking websites and using other SEO tools, but a newbie with SER definitely).


  • gooner SERLists.com
    @judderman - What you said about the size of the list you are working with was absolutely spot on. By chance I got a new 20k list, so I started using that instead of my old verified list and LpM went through the roof, but memory usage dropped so much that I could increase threads, which increased LpM even more.

    Was running 500 threads, now running 1500 threads at 200+ LPM :O

    Small fresh lists are the way to go, good spot dude.


  • Awesome stats :) Not sure what you use, but you could split your lists with Text Wedge, which lets you choose how many lines of data per txt file it spits out - or a few lines of script can do the same thing (rough sketch below).
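
    For anyone who'd rather not add another tool, here's a minimal sketch of the same split in Python; "keywords.txt" and the 100,000-line chunk size are just example values:

        # Split a big keyword/URL list into fixed-size chunks, one txt file per chunk.
        CHUNK_SIZE = 100_000

        with open("keywords.txt", encoding="utf-8", errors="ignore") as src:
            chunk, part = [], 1
            for line in src:
                chunk.append(line)
                if len(chunk) >= CHUNK_SIZE:
                    with open(f"keywords_part{part}.txt", "w", encoding="utf-8") as out:
                        out.writelines(chunk)
                    chunk, part = [], part + 1
            if chunk:  # write whatever is left over
                with open(f"keywords_part{part}.txt", "w", encoding="utf-8") as out:
                    out.writelines(chunk)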