GSA SER memory / Thread issue

Hello guys,

I've recently been experiencing an issue and have had no luck solving it.

I have 10 projects with all data fields loaded with heavily spun articles (spintax), and I suspect this is causing the high CPU usage. However, even with just 5 projects the CPU goes up to 100% (I've had no luck with the scheduler either).

I am running on 400 threads with 40 semi-dedicated proxies.

The server is an AMD Ryzen 9 with 16 cores and 12GB RAM (I'm aware that GSA SER is a 32-bit application and won't consume more than 4GB).

I've tried running a few links from the global folder in different ways, but the result is the same.

No matter what, it hits 100% CPU usage.

This happens only with one copy of GSA; the other works fine. Both are running the same link list.

Can anyone suggest what might be wrong? The biggest confusion for me is that setting 1 active project results in around 20% CPU usage, but it never uses the full amount of threads.

Can anyone help me?

@sven Can you at least give me some pointers or a guide for this?

@sven What is the best way to process 4 million identified targets? (I believe there is a huge difference between the importing methods.)

Also, can we add an article macro from a folder? I guess that would consume less memory, right?
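To give a feel for why heavily spun articles get expensive: every nested `{a|b|c}` group multiplies the number of possible variants, and the tool has to parse all of that on every submission. Below is a minimal variant counter as an illustrative sketch only; it is not how GSA SER actually parses spintax internally.

```python
def spintax_variants(text: str) -> int:
    """Count how many distinct texts a spintax string can expand to."""
    start = text.find("{")
    if start == -1:
        return 1  # no spin group left: exactly one variant
    # find the matching closing brace of the first top-level group
    depth, end = 0, -1
    for i in range(start, len(text)):
        if text[i] == "{":
            depth += 1
        elif text[i] == "}":
            depth -= 1
            if depth == 0:
                end = i
                break
    if end == -1:
        return 1  # unbalanced braces: treat as plain text
    # split the group body on top-level '|' only (nested groups stay intact)
    body, options, depth, last = text[start + 1:end], [], 0, 0
    for i, ch in enumerate(body):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
        elif ch == "|" and depth == 0:
            options.append(body[last:i])
            last = i + 1
    options.append(body[last:])
    # total = sum over this group's options, times variants of the rest
    return sum(spintax_variants(opt) for opt in options) * spintax_variants(text[end + 1:])

# two options times three (one of them nested) = 6 variants
print(spintax_variants("{Hello|Hi} {world|{earth|planet}}"))  # 6
```

With ten groups of five options each, that is already 5^10 (nearly 10 million) variants per article, which hints at why several heavily spun projects can load a machine.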

Comments

  • There is a big difference in how you modify and import URLs, for sure, with different methods giving different behavior.

    Something seems way off there either way, though.

    Maybe a freshly started project will help on the machine in question. Sometimes I'll get a project that "hangs" and simply restarting fresh will give me better results.

    Clearing caches etc., then hitting start fresh... could the issue be in the project?

    Have you tried that? Starting over fresh?

    I've never seen my CPU very high with just SER...
  • Yep, tried resetting the project and deleting the spintax articles, but I still can't even run a single project. Since my other copy works fine, I'm going to restore a backup from it and see.

  • Sven www.GSA-Online.de
    please send me the project backup in question.
  • Sven said:
    please send me the project backup in question.

    Hi @Sven

    I created another project and it seems to work fine now (I actually restored the project from my other copy of GSA).

    But I'd still like to know what caused it. Where can I send the backup?

  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    edited April 10
    @OMRON

    I have done a post on how to troubleshoot out-of-memory issues - but take a look at the settings NOT to use that cause high load; maybe it could help with your CPU load issue.

    Here is the post:

    https://asiavirtualsolutions.com/how-to-avoid-the-out-of-memory-error-when-using-gsa-ser/

    And if that doesn't work, try some of the maintenance tips here, especially the cleaning of target URLs:

    https://asiavirtualsolutions.com/maintanance-for-gsa-search-engine-ranker/

  • verdemuschio Italy
    edited April 10
    @OMRON
    I don't know what it is, but SER is definitely not the problem. Maybe while SER is working you have some other application running that is eating your CPU. For example, an application that consumes a lot of CPU could be Money Robot, or the antivirus, or something else. But definitely not SER.

    I work with a Ryzen 5 5600G and I don't go beyond 3% CPU. Maybe you have too many proxies all working together, straining the CPU. Below is a screenshot of my configuration, and as you can see it is larger than the one you use. I have 64 active projects. I did install 64 GB of RAM, but I don't think this is the key factor.





    Just out of curiosity: in addition to your paid proxies, have you activated proxy searching in the SER settings? That also leads to excessive CPU consumption. Check the update interval for the proxy search, or disable some proxies in the SER list. Obviously these are just hypotheses on my part.

    Typically, CPU consumption is highest when searching, downloading, or submitting backlink URLs to blog search engines.
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    verdemuschio said:
    I installed 64 GB of RAM, but I don't think this is the key factor

    If you have 64 GB of RAM, why are you limiting GSA SER to only 2300 MB (2.3 GB) when it is capable of utilizing around 3.5 GB (being 32-bit software)? With your CPU a more practical setting would be 75% for CPU and 3000 MB for memory.

    I hope you are using a local captcha solver, because at 1000 threads an external captcha solver will cause massive bottlenecks.
  • verdemuschio Italy
    edited April 10
    royalmice said:

    If you have 64 GB of RAM, why are you limiting GSA SER to only 2300 MB (2.3 GB) when it is capable of utilizing around 3.5 GB (being 32-bit software)? With your CPU a more practical setting would be 75% for CPU and 3000 MB for memory.
    @royalmice because I have other applications running that require memory.

    However, @OMRON, try lowering the number of last verified URLs.


  • Sven www.GSA-Online.de
    @OMRON please use some free hoster and send me the download link in private message.
  • @royalmice I tried to do as you wrote. I closed a couple of applications and increased the memory limit.
    However, the memory increase didn't make a big difference for the CPU, while the captchas are fewer but the solving has better control over correct solves.
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    @verdemuschio
    @royalmice because I have other applications running that require memory.
    That does not make sense, because you mentioned you have an AMD Ryzen 9 with 16 cores and 12 GB RAM -----> so if you set that memory setting in GSA to 3000 MB, it means you will still have 9000 MB left for your other apps.
    It also does not mean that if you set it to 3000 MB, GSA SER will use that all the time; that is only the threshold you set, at which it lowers the threads once it reaches 3000 MB.

    Also, if you set the CPU to 95% like you had in your screenshot, then you are going to have a hard time running other applications, because once a system is running at 95% of CPU capacity things are going to get pretty hot on that motherboard and it will become unstable pretty fast.
    That is why I suggest a modest setting of 75% for CPU throttling. That should leave plenty of resources for your other apps, and same as the memory, this is only the max threshold at which it will start lowering threads to stay below 75% CPU usage.
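The "threshold, not allocation" behavior described above can be sketched roughly like this. Purely illustrative: the function name, step size, and defaults are made up for the sketch and are not SER's actual internals.

```python
def adjust_threads(current_threads: int, cpu_pct: float, mem_mb: float,
                   cpu_limit: float = 75.0, mem_limit_mb: float = 3000.0,
                   max_threads: int = 400, step: int = 10) -> int:
    """Illustrative throttle: the thread count only drops once a limit is
    exceeded, and climbs back toward the maximum when usage falls again."""
    if cpu_pct > cpu_limit or mem_mb > mem_limit_mb:
        return max(1, current_threads - step)   # back off, never below 1
    return min(max_threads, current_threads + step)  # recover toward the cap

# under both limits, the thread count climbs back toward the maximum:
print(adjust_threads(390, cpu_pct=60.0, mem_mb=2500))   # 400
# over the 75% CPU ceiling, it backs off:
print(adjust_threads(400, cpu_pct=92.0, mem_mb=2500))   # 390
```

The point of the sketch: setting the memory ceiling to 3000 MB does not reserve 3000 MB; it only defines where the back-off kicks in.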


    It must be just me, but I don't think running at 1000 threads is a good thing, for a number of reasons:
    1. Massive strain on your PC's hardware and internet bandwidth.
    2. Massive link velocity - search engines like Google are constantly updating their algorithms to detect and penalize manipulative link building practices. Using automated tools at such high thread counts increases the risk of triggering these penalties, potentially leading to a drop in rankings or even deindexation.
    3. Sometimes going slower gets better results - do you know the story of the old bull and the young bull standing on a hill, looking at the cows at the bottom of the hill?


  • @royalmice I think there was some confusion. I didn't start the discussion. I have a Ryzen 5 5600G and 64 GB of RAM. The one who has the Ryzen 9 is OMRON. :D  :D  :D  :D   In any case, it isn't causing me any problems, not even at 95%. But I'll try setting it to 75% and see how it works. Thank you
  • I don't think running 1000 threads is a great idea either. Possible, but eeeghhh... I took an image recently showing 800 threads, but that was only because someone tried saying it wasn't possible.

    Just to show how silly some things that get said are, I did a test from an older laptop, with the task manager backing up the statistics shown in SER's front GUI.

    @verdemuschio when I use an i5 9th gen CPU with SER, I generally see it bouncing around 3 percent as well, I believe. This is only without any add-ons or anything else, though.

    @royalmice Slow and steady definitely wins the race, especially working with some of these browser scripts and social networks - today "drip feeding" is pretty much mandatory, I think.

    Or should I say what was considered "drip feeding" years ago is likely "normal" speed for a successful campaign today.

    I don't see the point in blasting yourself out of search either, or blasting a bunch of links just to have them deleted anyway.

    I guess how you use tiers, as well as some other factors, could matter also.






  • verdemuschio Italy
    edited April 12
    Yes, exactly what you write, @backlinkaddict. I configured those settings admittedly out of ignorance on my part, but they are settings taken from some SER tutorials around the internet. Regardless of all this, in relation to this discussion opened by @OMRON, I wanted to underline that in my opinion the problem is not the CPU: I only have a Ryzen 5, compared to him working with a Ryzen 9, and even with higher numbers set in SER's parameters I have no CPU blocking problems. So I believe OMRON's problem is to be found elsewhere.
    As for my SER settings, I lowered the numbers as you recommended.
  • settings taken from some SER tutorials around the internet

    @verdemuschio Likely this will cause some issues. Maybe a start, but you still have to test for your own environment and tweak the settings to your liking.

    As @royalmice pointed out, threads that high are likely just going to cause bottlenecks in operation somewhere, and that's even before getting into the SEO effects of it, or the wasted resources.

    Though, everyone has their own process...

    Likely, to run that high 24/7 you will need a very large amount of stable proxies, thousands of emails, plus the money spent on solving captchas and on adding content that is readable and unique (indexable)... I suspect it would be a waste of money and resources, costing a fortune.

    It would be nice to have some sort of "overview" of how many links got indexed, which ones stuck, and how many captchas are continually being wasted on sites you'll never get verified or that were added to the blacklist.

    I think the list-vs-list tool is best for this currently, as well as checking LpM vs VpM.



  • Look, the statistics for solved and incorrect captchas can be seen directly in SER, and I can see the added and deleted backlinks with Ahrefs. Furthermore, Ahrefs also sends an email notification about this with statistics.
    As far as articles and emails are concerned, I already use them normally. It is true that I do not yet know exactly how some parameters need to be set, but nevertheless I am getting results in Google. In 5 days some links were indexed in good positions on Google Search Italia. I can't complain at the moment.
  • edited April 12
    My point was not that you can't see in the statistics which captchas failed and which didn't (that is not so accurate anyway, depending on how you set it up); it was how many solves are being wasted on retrying a site/URL that will never be verified anyway. This is down to the user, settings, list used, etc.

    Especially the waste from a useless list. I bet this is a very high percentage for a good amount of users.

    Basically joking here, but it would be nice to have a prompt at the end of the month: "Congrats on wasting $40 on captchas for links/URLs that would never get verified anyway. And your link loss for built and indexed links this month was 65%."

    @verdemuschio   I totally agree!

    Getting results = what matters at the end of the day; how people use SER in their strategy, they must figure out for themselves.

    but nevertheless I am getting my results in Google. In 5 days some links present in good positions on Google Search Italia were indexed. I can't complain at the moment
    This is good news, I'm hopeful more of these strategy type discussions using what is working will increase and this forum can go back to talking and sharing strategy again B)

    I believe SemRush, Ahrefs, and Majestic, and likely other services of this type, will all email you results from their database.

    Some are even free to a degree for sites you own and can verify, and you can add/connect G Search Console and G Analytics for more helpful data, if you have that set up on your main sites or landing pages.

    G products for main sites, not some PBN-like sites. In case that was not so clear.
  • I use Ahrefs and Semrush, and currently have a 30-day free trial of MOZ. Let's say that more or less all of them agree on the same statistics. Today the MOZ bar extension installed on Google Chrome gave me a clearer idea of what to improve. It provided me with less complex information than Ahrefs, which clarified some things for me, including about backlinks.

    PS: Fortunately I don't receive any negative messages about captchas or lost links on sites at the moment, but it's nice when Google Search sends a congratulations message for having reached 1500 clicks in a month... fortunately.


  • Yes, I agree the achievements from Google are a nice motivational touch!

    For local and organic.

    I do agree, the Moz Bar is nice and simple; the others do offer way more info, but I'm not a fan of how some of them present the data to the user.
  • @backlinkaddict I run 1600 threads and my CPU is around 95-98% with Xevil 6 + GSA SER + GSA CB.

    I post only to GSA's best performing sites, and for me higher thread counts work better than lower ones. No scraping, only posting.

    @s4nt0s going to PM you now... I hope sendspace will work.


  • Not really sure exactly why you're tagging me about this?

    I post only to GSA's best performing sites, and for me higher thread counts work better than lower ones. No scraping, only posting.
    This appears to be a good strategy, though it's not really completely clear. I have mentioned this focus on the engines that are providing true verifieds and results for now. What else can one do?

    I understand that it's possible to run whatever thread count you want, though for most, if they looked into it, it's likely just wasting a ton of resources - that was the point I think I made, if you're referring to that.

    This I don't get: if you're going to PM him, why tag him instead of just sending him a message?
    @s4nt0s going to PM you now... I hope sendspace will work.


  • verdemuschio Italy
    edited April 13
  • @backlinkaddict however, just because we have touched on the topic, I'm posting two images that demonstrate that for me, with the current settings, SER is working perfectly and the results are excellent. Captcha solving, emails, and articles also do the job. For each project I created 200 to 1000 articles. When I receive the results of the added and deleted dofollows from Ahrefs, I will post them. It's clear that I don't have to prove anything to anyone; this is sharing to demonstrate that with the current settings everything works.
      
  • @verdemuschio was I the one disagreeing with you on a specific topic or something?

    I find things generally work fine even when some say they don't.

    Though I have no reason to run that many threads 24/7 currently; my image was for someone saying you couldn't run 200 threads in SER :D

    I see your image is re-verifying links though, yes.

    If you are getting results, that is what matters in the end ;)

    Wish you the best of luck with your projects!
  • verdemuschio Italy
    edited April 13
    @verdemuschio was I the one disagreeing with you on a specific topic or something?
    Absolutely not! It wasn't directed at you; I tagged you because you and I were talking about it, but it's clear that the discussion is general.
    However, at the moment - and I underline at the moment - SER works well. Then tomorrow we'll see how it goes. I confirm that the results are what count. And it is clear that everyone has their own means at their disposal, so what works for me may not work for someone else. The settings are basically the same for everyone; everyone must find their own balance, also depending on their means, such as the proxies they use.

    Best wishes to all of you too for your projects! Thanks a lot. Whatever you wish for me, I wish 100 times for you <3
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    @OMRON

    Have a look at this post I made today; it might give you better insight into how to scale your work and preserve resources: https://forum.gsa-online.de/discussion/comment/191627/#Comment_191627