Memory Errors Galore

Since one of the more recent versions of GSA SER I have been getting a huge number of memory errors on my projects. It's the "Out of Memory!" popup error.
It seems to be worse when I run projects that have large link lists imported (200k-600k URLs).

Any advice on how to avoid this?

It happens with just a couple of projects. The server is an absolute beast with 2 x 8-core CPUs and 32 GB of RAM, so it can't be to blame.
I've taken to using the scheduler and am now running just 2 projects at once (I used to be able to run dozens), but it's still happening with just 2 projects!

Any advice would be much appreciated.

Comments

  • ronron SERLists.com
    I was just talking to someone who processed a 400,000 link list in 45 minutes. I don't think that is your problem.

    I'm not sure how you set up the project for this list, but I would run it separately as a dummy project making links to Bing or something, remove all search engines on that project, turn off all other projects, and run just that project. You should hit LPMs around 300 and fly through that list. If I am wrong, somebody like @gooner or @Satans_Apprentice can probably chime in, as I saw them talking about it somewhere here.
  • Turn off everything. I ran 8 projects against 8 lists simultaneously. I imported the target list into each project and cut it loose. Of course, my VPS has 4 processors and 8 GB of RAM. If you check and save PageRanks, it will cut your LPM in half.
  • goonergooner SERLists.com
    @hungryim - I noticed a similar problem too. I have a VPS running 40 projects at 400 threads, processing several million URLs per day via imported lists. The VPS's only job is to process scraped lists, and it was running well until 3 or 4 days ago.

    Over the last few days the whole thing has become very laggy. Now running 20 projects at 200 threads, and it's been totally lagged out for the last 3 hours.

    Not sure what the problem is, but I suspect something changed in a recent update.
    Maybe @sven can shed some light.
  • @ron: what kind of machine do you need to get 400k links imported in 45 minutes? (I assume "processed" = imported + identified...or do you mean "submitted to"?)

    Since I started using SB for harvesting and GSA for posting only, I've also thought about getting a 2nd VPS dedicated just to scraping + importing + identifying... it consumes resources like nothing else.
  • ronron SERLists.com
    @johnmiller - I will defer to @gooner and @Satans_Apprentice as those are the guys who said they were processing scraped lists.

    Please remember that you import that list and then submit links. You may only get 5% of that to turn into actual links, or some similarly low number.
  • goonergooner SERLists.com
    @johnmiller - @ron meant that SER processed a 400k list in 45 mins, as in SER went through the list and posted what it could and ran out of targets after 45 mins.

    That is a lot different to 400k submitted links in 45 mins of course.

    The best performance I have ever gotten is 10,000 verified links per hour, consistently hour after hour. That was with a Hetzner dedicated server, which was truly amazing. Too bad they banned me after 1 day over spam abuse reports.

    I use a separate VPS for processing scraped lists and it works well, but I have 3 other VPSes to feed. I wouldn't necessarily recommend 1 VPS to feed only 1 other VPS.

    Unless you are already making decent money of course... I'm more than happy to spend money on something that makes me more money lol
  • I used to have memory errors all the time on a 2 GB VPS; since I moved to 4 GB I haven't had a single one that I can recall.
  • ronron SERLists.com
    edited December 2013
    ^^Usually I speak more clearly. Thanks for clearing that up @gooner.

    @PeterParker - Back in the very beginning of this product, many of us learned that 2 GB of RAM was not enough to do the job. 4 GB is perfect, like you said. The issue is that the Windows operating system needs its own chunk of memory before SER even kicks in. Then you layer on SER, then add captcha solving like CB, and you simply must have more memory if you are going to grow your projects and processing. Not that you can't operate at 2 GB when starting out, but it probably won't work for very long.

    The other thing I learned was that I could manage memory issues by simply using the scheduler. You can get away with running all projects at once - up to a point. I think I started having problems at 40 - 50 projects, and I had to start using the scheduler.

    Lastly, to address the OP's question, running SER non-stop 24x7 seems to accumulate memory usage. I like restarting the VPS at least once a week, if not once every 3-4 days. RAM usage starts lower after a VPS reset, and doing that completely eliminated the issue for me.
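    The periodic restart above can be automated on a Windows VPS with the built-in `schtasks` utility. A minimal sketch, assuming an elevated command prompt; the task name and schedule are arbitrary examples, not something from this thread:

    ```shell
    :: Create a scheduled task that reboots the VPS every Sunday at 04:00
    :: (server local time). The 60-second delay gives SER time to be seen
    :: shutting down; adjust the day/time to a quiet period for your projects.
    schtasks /create /tn "WeeklyRAMClearReboot" /sc weekly /d SUN /st 04:00 /ru SYSTEM /tr "shutdown /r /t 60"
    ```

    If SER is configured to start with Windows and resume active projects, it will pick up again after the reboot; otherwise the projects need to be restarted manually.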
  • I also feel that v7.24 may slow things down. I couldn't say for sure; I just noticed that the submitted count on a new project has dropped quite a lot with the same settings.
  • Switch back to 7.23 - no memory errors, and it's fast...
  • I have to correct myself: the same thing happens in 7.23, just at a later point - though memory usage is only at 20%. Sorry.