Still Getting Out of Memory Errors
I know I sound like a broken record, but this is getting insane. I have one project running at the moment, 150 threads, running through my verified lists only... and I'm getting out of memory errors. HTML timeout is 180 seconds, 60 proxies (55 working). It's running at 200 LPM.
I know from previous threads that people have suggested deleting projects, clearing the cache, and deleting target URLs, but I'm running global lists only, so what is up? What am I doing wrong? 150 threads isn't a lot; when cleaning lists or running one project at a time, I used to be able to run 800-1,000 threads.
Also, I've mentioned this before, but SER says my CPU is running at 99% when Task Manager shows it's actually under 20%, sometimes 4%, and I'm still getting out of memory errors. When SER's RAM usage hits 1GB, that's usually when it stops working... even though my server has 32GB to play with.
Comments
Example: with a 400k verified list I can run 600 threads before I get "out of memory".
With a 35k verified list I can run 1,500 threads.
Loading these lists must consume a lot of memory. What size is your verified list?
Also, I remember reading a very old thread that said the out of memory issue is triggered at 1.2GB of usage - maybe @sven can confirm that?
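For a rough sense of scale, here's the back-of-envelope math (a sketch with assumed numbers, nothing measured) on why a big list eats that headroom long before the server's RAM matters. If SER really is capped around 1.2GB as a 32-bit process, the loaded list alone can take most of it:

```python
# Back-of-envelope only - every number here is an assumption, not a measurement.
list_size = 1_800_000   # URLs in a big verified list (like the one in this thread)
avg_url_bytes = 100     # assumed average URL length
overhead = 4            # assumed multiplier for SER's parsed in-memory structures

footprint_gib = list_size * avg_url_bytes * overhead / 2**30
print(f"~{footprint_gib:.2f} GiB just to hold the list")  # ~0.67 GiB
```

That would leave very little of a 1.2GB budget for threads, proxies, and page buffers, which would also explain why the 32GB on the server never comes into play.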
Yeah, I'm working with smaller lists only now. A few other users and I have been testing that method and it works really well.
The problem is the 1.8 million list, I reckon. On the server I use for scraping/testing, I import a max of 100k links per project to keep it lightweight and fast.
Get rid of anything that uses memory that you don't need... Even de-duping domains and URLs daily keeps lists to a minimum.
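If you'd rather do the de-duping outside SER for really big files, here's a minimal sketch (file names are made up, and it assumes plain one-URL-per-line text files):

```python
# Minimal sketch: de-dupe a sitelist file by exact URL, or by domain
# so each domain is kept only once. File names below are made up.
from urllib.parse import urlsplit

def dedupe(in_path, out_path, by_domain=False):
    seen = set()
    with open(in_path, encoding="utf-8", errors="ignore") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            url = line.strip()
            if not url:
                continue
            key = urlsplit(url).netloc.lower() if by_domain else url
            if key not in seen:
                seen.add(key)
                dst.write(url + "\n")

dedupe("verified_raw.txt", "verified_urls.txt")                      # exact URLs
dedupe("verified_raw.txt", "verified_domains.txt", by_domain=True)   # one per domain
```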
1) Load 1 million and split it over multiple projects (sketch below).
2) Assuming you are importing into test projects: run more projects so SER processes each project more slowly.
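The splitting step is simple enough to script; a rough sketch, assuming plain one-URL-per-line files (chunk size and output names are arbitrary):

```python
# Rough sketch: split a big target-URL file into 100k-line chunks,
# one file per project import. Chunk size and file names are arbitrary.
def split_list(in_path, chunk_size=100_000):
    with open(in_path, encoding="utf-8", errors="ignore") as src:
        chunk, part = [], 1
        for line in src:
            chunk.append(line)
            if len(chunk) == chunk_size:
                write_part(in_path, part, chunk)
                chunk, part = [], part + 1
        if chunk:
            write_part(in_path, part, chunk)

def write_part(in_path, part, lines):
    with open(f"{in_path}.part{part:02d}.txt", "w", encoding="utf-8") as dst:
        dst.writelines(lines)

split_list("scraped_targets.txt")  # -> scraped_targets.txt.part01.txt, ...
```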
I only need to import lists once per day, so not so bad.
I have a dedi that processes scraped lists, and then I import them in small batches into my main dedi as "submitted". I also check "verified" so it can use verified links across all projects. About once a month I save that verified list, remove it from SER, and start a new one. Keeps the lists small and the links fresh.
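The monthly rotation is just file housekeeping. A hypothetical sketch - the folder path is an assumption, so point it at wherever your verified sitelist actually lives, and stop SER before moving anything:

```python
# Hypothetical sketch of the monthly rotation: archive the current verified
# sitelist folder and let SER start a fresh one. The path is an assumption.
import datetime
import pathlib
import shutil

site_list = pathlib.Path(r"C:\SER\site_list-verified")   # assumed location
archive = site_list.with_name(f"verified-{datetime.date.today():%Y-%m}")
shutil.move(str(site_list), str(archive))  # save this month's links
site_list.mkdir()                          # SER writes fresh links here
```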
For keywords, I put just "1" - no need to waste memory saving keywords that are never used, because SER isn't scraping. I check the option to save PR to verified lists so that SER doesn't have to check PR when it uses those links again.
De-dup all lists regularly; every little memory saver helps. Good lists are really important for speed and low memory usage. If I couldn't scrape and produce my own sitelists, I would buy them every month.
A couple of months ago I was the steady 70-100 LPM guy, day in, day out. With all those modifications my stats look like this now:
That's with 130 projects running 24/7 - no scheduler.