
"too much memory in use, waiting till more is available"

edited December 2012 in Need Help
Hi there,

I'm getting the message "too much memory in use, waiting till more is available" from GSA, which is installed on a Windows 2008 VPS. I'm only running one other session of Scrapejet and, of course, CaptchaSniper.

I have changed the priority to low but it still gives me the same message, even though I have rebooted the server. I used to run this with 300 threads but now it's only 100. I guess it's my server, but it would be great to get confirmation from other users.

cheers



Comments

  • OzzOzz
    edited December 2012
    maybe your project files got too big. maintain your projects with re-verification and delete duplicate urls/domains, for instance. hope this helps for a start.

    also make sure you have the latest CS with patches installed to get rid of that memory bug.
  • hi there, how do I go about "maintaining the projects"? what's re-verification?

    thanks in advance
  • OzzOzz
    edited December 2012
    re-verify = mark and right click project(s) -> show urls -> verified -> verify

    delete duplicates = options -> advanced -> tools -> delete duplicate urls (for blog comments) || delete domain (for the rest)

    i haven't run out of memory myself yet, but i do those things regularly. hopefully it helps to free up some memory because the projects don't need to cache dispensable urls. this is just an assumption though and i don't know how big the effect will be.

    also get rid of deactivated projects if you don't use them anymore. back up those projects and delete them afterwards.

    you could also save some memory by reducing the keywords for searching, if you added tons of them to your projects.

    if you don't want to delete keywords, it may help to save all your keywords to a file and spin them like this:
    {keyword 1|keyword 2| keyword 3|....|keyword n}

    this spun keyword file can be attached to your project with
    -> right click the "keyword" field -> use contents of a random file

    i'm not sure though if it makes a difference whether they come from a file or are stored directly in your projects.
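The {keyword 1|keyword 2|...} spintax format described above can be generated from a plain keyword list with a few lines of script. This is just a convenience sketch for building the file; the sample keywords and file handling are illustrative, not anything SER requires:

```python
def keywords_to_spintax(lines):
    """Join non-empty keyword lines into one {a|b|c} spintax token."""
    keywords = [line.strip() for line in lines if line.strip()]
    return "{" + "|".join(keywords) + "}"

# Example: a small keyword list (made-up sample keywords).
sample = ["seo tools\n", "link building\n", "", "backlink checker\n"]
spintax = keywords_to_spintax(sample)
# spintax == "{seo tools|link building|backlink checker}"
```

In practice you would read the lines from your keyword file and write the single spintax line to the file you attach via "use contents of a random file".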
  • @ozz

    I do delete duplicate URLs on a regular basis.

    But I have not deleted duplicate domains, since I figured that in the project settings I enabled the option NOT to post on the same domain twice. Is this a valid assumption?

    Also, let's say

    http://www.site.com/webpage.html
    http://www.site.com/webpage.html
    http://www.site.com

    1) If I delete duplicate domains, will all 3 be deleted?
    2) If I delete duplicate URLs, I'm assuming the first 2 will be deleted?

    And do you delete duplicate URLs and duplicate domains before adding the submitted and verified URLs to the global list?
  • SvenSven www.GSA-Online.de
    edited January 2013
    1. http://www.sample1.com/subpage1.html
    2. http://www.sample1.com/subpage1.html
    3. http://www.sample1.com/subpage2.html
    4. http://www.sample2.com/subpage1.html

    delete duplicate URLs will just remove entry 1

    delete duplicate domains will remove entry 2 and 3
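Sven's distinction between the two tools can be shown with a generic sketch (an illustration only, not SER's actual code; note SER removes entry 1 while this sketch keeps the first occurrence of each duplicate, but the surviving set is equivalent):

```python
from urllib.parse import urlparse

# Sven's four example entries.
urls = [
    "http://www.sample1.com/subpage1.html",  # 1
    "http://www.sample1.com/subpage1.html",  # 2: exact duplicate of 1
    "http://www.sample1.com/subpage2.html",  # 3: same domain, different page
    "http://www.sample2.com/subpage1.html",  # 4: different domain
]

def dedupe_urls(urls):
    """Drop exact duplicate URLs, keeping one copy of each."""
    seen, kept = set(), []
    for u in urls:
        if u not in seen:
            seen.add(u)
            kept.append(u)
    return kept

def dedupe_domains(urls):
    """Keep only one URL per domain (the first one seen)."""
    seen, kept = set(), []
    for u in urls:
        host = urlparse(u).netloc
        if host not in seen:
            seen.add(host)
            kept.append(u)
    return kept

# dedupe_urls(urls)    -> one copy of the duplicate URL removed
# dedupe_domains(urls) -> one URL left per domain
```

So after a URL dedupe, a domain dedupe only has entry 3 left to remove, which matches the follow-up comment below.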

  • "delete duplicate URLs will just remove entry 1"

    gotcha

    "delete duplicate domains will remove entry 2 and 3"

    It will keep entry 1, assuming of course that delete duplicate URLs was NOT performed first. But if it was, then in this case delete duplicate domains will ONLY remove entry #3.
  • Is it prudent and preferred to delete duplicate URLs and duplicate DOMAINS before adding the submitted and verified URLs to the global list?

    BTW, happy new year to you @sven and everyone
  • I am also getting that error a lot.

    Also, when I start GSA SER I see in the message box a lot of running projects showing as "project stopped", and then the project goes from green to white but it still shows as active.

    Any thoughts?
  • are you using the scheduler?
  • No. I am going to
  • AlexRAlexR Cape Town
    @sven: as per Ozz's suggestion:

    Delete duplicates = options -> advanced -> tools -> delete duplicate urls (for blog comments) 
    || delete domain (for the rest)

    1) Under delete duplicate domains, could we have it auto-DESELECT all the blog comment platforms? (Since this is how most are using it.)
    2) Delete duplicate urls (for blog comments): can we have the default only have blog comments selected?
    3) Would it be possible to select by platform GROUP, rather than by individual platform? As platforms get added, this becomes tedious. (So maybe similar functionality to the search for URLs, where you can select by platform TYPE.)
    4) Would be nice to automate/schedule this process somehow. I bet many users don't do this regularly. Maybe have an option to neaten lists when user updates, or once per week, etc? 

    Would be neat!
  • 1-2) i second that, but it would be much easier if the settings were just saved for this, imo. plus an option to check/uncheck by mask could be useful.
  • edited January 2013
    @sven

    I got the same message

    "too much memory in use...."

    I did not change any of my configuration: no change to software or hardware, just the latest GSA SER update.

    It happened 2 SER updates ago, BUT I stopped running SER the first time it happened because I got busy doing something else. Then I ran it again this afternoon after installing the new update. I came back home 6 hours later and it had submitted around 2,700 posts before the messages started to appear. Usually my submissions average around 1k per hour.

    is anybody else experiencing this problem?

    screenshot
    http://www.freeimagehosting.net/5stbk
  • OzzOzz
    edited January 2013
    i don't think it's caused by the latest updates.

    maybe you got to a point where your projects have grown to a size that causes memory issues. how much RAM are you using after a fresh start, and once you get the "memory" messages in your log?

    do you automatically collect keywords from other sites? and how many keywords have you collected by this time? this could also be a factor where memory issues appear once you have trillions of keywords in your projects (i believe).
  • got 4gb, plus the only thing that is running is SER. nothing else is running beyond what usually runs since I started using GSA SER.

    well, I don't collect META keywords from other sites. I disabled that. I just use my own keywords that are related to my main keyword.

    How do you check, or rather where do you check, the collected keywords?
  • SER is able to use about 2GB of RAM. If it gets to the point that it's using 2GB of RAM, then you will get those messages.

    If you don't collect keywords, then this can't be the issue, but you can "click to count" your keywords in your project options if you'd like to know.

    Did you maintain your projects like I've mentioned above already?
  • well, I do delete duplicate URLs after every verification, and after reading the above a few days ago I started to delete duplicate domains too.

    I need to check if I spin keywords, but I don't think I am. I'll check tomorrow. got to catch some zzzzzz... it's 2AM and I need to be somewhere in 6 hours.

    later
  • I'm having the same problem. "too much memory in use...."

    @Ozz, I'm using the generic keyword list that I downloaded from BHW. It was about 100K, and after that I didn't use collect keywords.

    Is that a problem?
  • LeeGLeeG Eating your first bourne

    Run SER in scheduler mode and keep an eye on the memory used in the bottom bar

    I reset and reboot my VPS every 12hrs
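LeeG's 12-hour reset-and-reboot routine can be automated on a Windows VPS with a scheduled task rather than done by hand. This is a command fragment, not anything SER ships with; the task name is arbitrary and it assumes an elevated prompt:

```shell
:: Reboot every 12 hours, giving running software 60 seconds to wind down.
schtasks /Create /TN "RebootVPS" /TR "shutdown /r /t 60" /SC HOURLY /MO 12
```

You would still want SER's autostart (or the scheduler) configured so projects resume after the reboot.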

  • @LeeG, what are your settings for the scheduler? Are you using the defaults?
  • LeeGLeeG Eating your first bourne

    I just run it on the default, change every thirty minutes, etc.

    Gives a nice spread of links

    Memory usage has improved recently. I used to get 8 hours out of a run; now I can do about 16 hours

  • edited January 2013
    run 10 projects at the same time and switch to the next project every 30 minutes? OK, I will do that...

    @LeeG, thanks

    OK, one more thing: if I use the scheduler, I can add as many projects as I want in GSA SER, right? Because the projects will run according to my settings. But the downside is that the number of links I get will be small if I have too many projects in GSA SER, because it will switch to another project with a delay.


  • LeeGLeeG Eating your first bourne

    80k links in 12hrs is small? That's what I did from midnight to midday

    It just randomly rotates them, nothing to worry about

  • i use the scheduler: 7 projects, every 5 minutes