GSA SER memory / Thread issue
Hello guys,
I've recently been experiencing an issue and have had no luck solving it.
I have 10 projects where all the data fields are loaded with heavily spun articles (spintax), and I suspect this is causing the high CPU usage. However, even with just 5 projects, the CPU usage goes up to 100% (I've had no luck with the scheduler either).
I am running on 400 threads with 40 semi-dedicated proxies.
The server is an AMD Ryzen 9 with 16 cores and 12GB RAM (I'm aware that GSA SER is a 32-bit application and won't consume more than 4GB).
I've tried running a few links on the global folder in different ways, but the result is the same.
No matter what, it hits 100% CPU usage. This happens with only one copy of GSA; the other works fine, and both are running the same link list.
Can anyone suggest what might be wrong? The biggest confusion for me is that setting 1 active project results in around 20% CPU usage, yet it never uses the full number of threads.
Can anyone help me?
@sven Can you at least give me some pointers or a guide for this?
@sven What is the best way to process 4 million identified targets? (I believe there is a huge difference between the various importing methods.)
Also, can we add an article via a macro that reads from a folder? I'm guessing that consumes less memory, right?
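To illustrate what I mean about importing methods: this is roughly how I would pre-process such a list outside SER before importing it in smaller batches. This is just a sketch, not an SER feature; the file names are placeholders and it assumes a plain text file with one URL per line.

```python
# Hypothetical pre-processing helper, not part of GSA SER:
# de-duplicate a large "identified" target list and split it into
# smaller chunks so each chunk can be imported separately.
from pathlib import Path

SOURCE = Path("identified_targets.txt")   # assumed input: one URL per line
CHUNK_SIZE = 100_000                      # URLs per output file

def flush(lines, n):
    # write one chunk to its own file, e.g. targets_part_001.txt
    Path(f"targets_part_{n:03d}.txt").write_text("\n".join(lines), encoding="utf-8")

seen = set()
chunk, chunk_no = [], 1

with SOURCE.open(encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url or url in seen:
            continue                      # skip blanks and duplicates
        seen.add(url)
        chunk.append(url)
        if len(chunk) >= CHUNK_SIZE:
            flush(chunk, chunk_no)
            chunk, chunk_no = [], chunk_no + 1

if chunk:
    flush(chunk, chunk_no)
```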
Comments
I have done a post on how to deal with / troubleshoot out-of-memory issues, but take a look at the settings NOT to use that cause high load; maybe it will help with your CPU load issue.
Here is the post:
And if that doesn't work, try some of the maintenance tips here, especially the cleaning of target URLs:
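As part of that cleanup it helps to spot which project caches have ballooned. Here is a rough sketch that just lists the largest files in the SER projects folder so oversized target-URL caches stand out; the AppData path is an assumption based on my own install, so adjust it to wherever your SER data actually lives.

```python
# Rough maintenance sketch, not an official SER tool: show the biggest
# files in the projects folder. The path below is an assumption.
import os
from pathlib import Path

projects_dir = Path(os.environ.get("APPDATA", "")) / "GSA Search Engine Ranker" / "projects"

files = sorted(
    (p for p in projects_dir.glob("*") if p.is_file()),
    key=lambda p: p.stat().st_size,
    reverse=True,
)
for p in files[:20]:
    print(f"{p.stat().st_size / 1_048_576:8.1f} MB  {p.name}")
```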
I don't know what it is, but SER is definitely not the problem. Maybe while SER is working you have some other application running that is eating your CPU. For example, an application that consumes a lot of CPU could be Money Robot, or your antivirus, or something else. But definitely not SER. I work with a Ryzen 5 5600G and I don't go beyond 3% CPU.
Maybe you have too many proxies all working together straining the CPU. Below is a photo of my configuration, and as you can see it is larger than the one you use. I have 64 active projects. I did install 64GB of RAM, but I don't think that is the key factor.
Just out of curiosity, in addition to your paid proxies, have you enabled the proxy search in the SER settings? That also leads to excessive CPU consumption. Check the update interval for the proxy search, or disable some of the proxy sources in the SER list. Obviously these are just hypotheses on my part.
Typically, CPU consumption is highest when searching for, downloading, or submitting backlink URLs to blogs and search engines.
If you have 64 GB of RAM, why are you limiting GSA SER to only 2300 MB (2.3 GB) when it is capable of utilizing around 3.5 GB (being 32-bit software)? With your CPU, a more practical setting would be 75% for CPU and 3000 MB for memory.
I hope you are using a local captcha solver, because at 1000 threads an external captcha solver will cause massive bottlenecks.
Setting it to 3000 MB also does not mean GSA SER will use that much all the time; it is only the threshold at which it starts lowering the thread count once memory usage reaches 3000 MB.
Also, if you set the CPU to 95% like you had in your screenshot, you are going to have a hard time running other applications, because once a system runs at 95% of CPU capacity things get pretty hot on that motherboard and it becomes unstable fairly quickly.
That is why I suggest a more modest setting of 75% for CPU throttling. That leaves plenty of resources for your other apps, and, just like the memory setting, it is only the maximum threshold at which SER starts lowering threads to stay below 75% CPU usage.
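To make the "threshold" idea concrete, conceptually it works something like the sketch below. This is only an illustration of the principle, not SER's actual code; the 75% / 3000 MB numbers just mirror the settings suggested above.

```python
# Conceptual sketch of threshold-based throttling (NOT SER's real code):
# the limits are ceilings at which the thread count gets reduced,
# not a level the program tries to use all the time.
import psutil

CPU_LIMIT = 75.0           # percent, the suggested throttle point
MEM_LIMIT = 3000 * 2**20   # 3000 MB expressed in bytes
threads = 400              # current worker-thread target

def adjust(threads: int) -> int:
    cpu = psutil.cpu_percent(interval=1.0)     # overall system CPU usage
    mem = psutil.Process().memory_info().rss   # this process's RAM usage
    if cpu > CPU_LIMIT or mem > MEM_LIMIT:
        return max(10, threads - 25)           # back off while over a threshold
    return threads                             # otherwise keep the configured count

threads = adjust(threads)
print("new thread target:", threads)
```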
It may be just me, but I don't think running at 1000 threads is a good idea, for a number of reasons:
1. Massive strain on your PC's hardware and internet bandwidth.
2. Massive link velocity - search engines like Google are constantly updating their algorithms to detect and penalize manipulative link-building practices. Using automated tools at such high thread counts increases the risk of triggering these penalties, potentially leading to a drop in rankings or even deindexation.
3. Sometimes going slower gets better results. Do you know the story of the old bull and the young bull standing on a hill, looking at the cows at the bottom of the hill?
P.S.: Fortunately I'm not receiving any negative messages about captchas or lost links on sites at the moment, and it's nice when Google Search Console sends a congratulations message for reaching 1500 clicks in a month.
Best wishes to all of you for your projects too! Thanks a lot. Whatever you wish for me, I wish a hundred times over for you.
Have a look at this post I did today; it might give you better insight into how to scale your work and preserve resources: https://forum.gsa-online.de/discussion/comment/191627/#Comment_191627