Having high CPU usage and memory errors? Try this...
Trevor_Bandura
I've noticed on some of my personal projects that when I run them I get out-of-memory errors and high CPU usage in SER. I figured out how to solve this bug a while ago, but only saw it again today while running a new project of mine. Sorry I never posted this before; I meant to, but I guess it slipped my mind. That's what happens when you put things off till later.
If you're getting these errors, here is how to fix it. It works wonders for me and might help you too.
1) Open your project's options (double-click the project)
2) Go to the Options tab
3) Uncheck the option "Continuously try to post to site even if failed before"
I did lots of testing when I found this bug, and un-checking that option truly helps.
Maybe @Sven can look into this option to see why it uses so much memory and CPU.
Hope this helps. \:D/
If you know of another simple project change that helps to reduce memory errors and/or CPU usage, please post it here. I'm sure it will help others.
Comments
http://www.koshyjohn.com/software/memclean/
Set it to clean automatically every 5 minutes. Also, you can open a command prompt, type ipconfig /flushdns, and hit Enter. You can do that a couple of times per day. It should help to manage the issue, if not solve it.
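If you want to run that flush automatically rather than by hand, here is a minimal sketch. The Windows command is the standard ipconfig /flushdns from the post; the Linux and macOS equivalents are assumptions (the Linux line assumes systemd-resolved), and the dry-run flag just shows what would be executed.

```python
import platform
import subprocess

def flush_dns_command():
    """Return the DNS cache flush command for the current OS, or None."""
    system = platform.system()
    if system == "Windows":
        return ["ipconfig", "/flushdns"]
    if system == "Linux":
        # Assumes systemd-resolved; other resolvers use different commands.
        return ["resolvectl", "flush-caches"]
    if system == "Darwin":
        return ["dscacheutil", "-flushcache"]
    return None

def flush_dns(dry_run=False):
    """Flush the OS DNS cache; with dry_run=True, only print the command."""
    cmd = flush_dns_command()
    if cmd is None:
        return False
    if dry_run:
        print("Would run:", " ".join(cmd))
        return True
    return subprocess.run(cmd).returncode == 0
```

You could call flush_dns() from a scheduled task (Task Scheduler on Windows, cron elsewhere) a couple of times per day, as suggested above.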
I solved it by having fewer projects. I removed 75 projects (now at 150) and it runs beautifully; I can run 50 projects at a time instead of 20, and I've increased threads from 400 to 600.
The only downside is I need another installation for the removed projects.
All I can suggest is removing projects. It worked for me: I'm getting more links, higher LPM, and CPU usage is up from roughly 30% to 70%.
I also deleted the success project files for the lower tiers; they were getting huge. So you could try that, but then you might get duplicate links on those tiers.
EDIT: I didn't delete the success files themselves, just the URLs contained within. Need to be clear on that! lol
The other suggestion: go to AppData/Roaming/GSA Search Engine Ranker/Projects and sort the files in there by size. You will probably see that the "Success Files" are the largest; in my case they were 10 MB and larger each. You can open those files, delete all the URLs contained within, and save them again, to save space and make SER run quicker.
But doing that means those projects can get duplicate links, as SER uses those files to check whether it has already posted somewhere. So it's better to do that for junk tiers only.
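Emptying those files by hand gets tedious, so here is a minimal sketch that automates it. The .success filename pattern and the 10 MB threshold are assumptions based on the description above, not something I've confirmed against SER's internals; close SER before running anything like this, and heed the duplicate-links caveat.

```python
import os

def trim_success_files(projects_dir, min_size_bytes=10 * 1024 * 1024):
    """Empty oversized success files in a SER Projects folder.

    WARNING: SER uses these files to check whether it already posted to a
    site, so only run this on junk tiers. The ".success" extension is an
    assumption; verify the actual filenames on your install first.
    """
    trimmed = []
    for name in os.listdir(projects_dir):
        path = os.path.join(projects_dir, name)
        if not os.path.isfile(path):
            continue
        if not name.endswith(".success"):
            continue
        if os.path.getsize(path) < min_size_bytes:
            continue
        # Truncate to zero bytes rather than deleting the file itself.
        with open(path, "w"):
            pass
        trimmed.append(name)
    return trimmed
```

On a default install you would point it at something like AppData/Roaming/GSA Search Engine Ranker/Projects, the folder mentioned above.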
I don't know how you can run 500 projects with 75 active on the scheduler; no server I've had could handle anything like that.
The setup I have works pretty well. If you want to get really clever with it, you can use Dropbox to share the verified list from the VPS across all your dedis, so there's no manual importing of lists other than on the VPS.
I've never tried Dropboxifier. What's the advantage over a standard Dropbox installation? Speed?
Was running 500 threads, now running 1500 threads at 200+ LPM :O
Small fresh lists is the way to go, good spot dude.