
Uses a lot of CPU!


Comments

  • AlexR (Cape Town)
    How many threads are you running indexer on? I lowered mine to 10 and it works like a charm. I bet you had a very high number of threads and this is what caused the issue. 
  • True, I have it on 100. Let me try 10.
  • AlexR (Cape Town)
    :-) Will make a world of difference!
  • ... true, but now Indexer needs 10 years to finish, I guess.
  • Apart from lowering the GSA SEO Indexer threads, is there any other way?
  • lol, I'm playing around with this now, and even at only 50 threads I'd guess Indexer is the CPU resource hog; it probably never was SER.
  • ron (SERLists.com)
    @oil - Exactly my point. You have two beasts in the room, SER and Indexer. There isn't room for both of them if you build a lot of links. I give each their own space. 
  • edited February 2013
    I usually have NoHandsSEO on 50-60 threads and GSA SER on the same.
    These are the speedtest.net results I get with everything running, on a 2-core VPS:
    Intel Xeon E5420, 2 cores
    2.61 GHz
    With these programs running at the thread counts listed below, CPU usage stays at 100% the whole time.
    I have 3 GB RAM on my VPS, of which I use about 60% most of the time:

    • 1. NoHandsSEO: 50 threads
    • 2. NoHandsIndexer: 10 threads
    • 3. NoHandsPinger: 10 threads
    • 4. GSA SEO Indexer: 15 threads
    • 5. GSA Search Engine Ranker: 50 threads, 60 s HTML timeout, 20 private proxies from Proxyblaze
    • 6. GSA Captcha Breaker

    I check all programs several times daily to make sure they are updated, and I keep Windows fully updated. I'm running Server 2012 64-bit. Any suggestions to improve performance? That's everything that's running; no antivirus or unneeded services. What do you guys use to back up your VPS? My host wants about $40 a month for R1Soft, which is too much, so I'm trying to figure out the best way to back up my VPS.
  • No CS anymore? And why NHSEO? Is blog and image posting so much better with NHSEO? I don't use NHSEO so I can't tell, but all platforms of NHSEO are supported in SER as well. Did you try focusing on SER only to free up resources?

    A 60 s HTML timeout seems very low to me, but if it works then it's superb, of course.
  • Ya, I've put GSA Indexer to rest. Can't have it killing my computer while SER is running. Hopefully @Sven can find a solution to this.
  • ron (SERLists.com)

    @king818 - I don't think there is a problem. I just think people need to understand that an internet connection can only handle so many threads. If you go full on indexer, you are blasting at 300 threads. It is extremely powerful. (But you can always decrease the threads in Indexer).

    It would be like having two SERs running on the same PC. You probably wouldn't run both at the same time.

  • edited February 2013
    @ron Either way thanks for helping me solve what was going on
  • I never thought GSA SEO Indexer was causing the CPU load... OK, newbie question: how do I see how many resources a program consumes?

  • ron (SERLists.com)
    edited February 2013
    Task Manager. Right-click the taskbar at the bottom of your screen. But that's just CPU; the real issue is what you are throwing at your internet connection.
  • AlexR (Cape Town)
    @Ozz - you said "60s HTML timeout seems very low to me, but if it works than its superb of course."

    This is the time SER waits for a website to load before moving on, right? So with a setting of 60, it will wait up to 60 s before moving on. Why have this so high? Surely 30 is a better number? If a site takes 60 s to load, wouldn't it be better to move on quickly? My understanding is that it's the time for a single site to load on a thread, so more threads wouldn't affect it unless you have a slow machine.
  • I would think so... most sites load in 10 seconds for me, usually like 1 second in a browser.
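The point in the exchange above, that the HTML timeout is per request and extra threads don't stretch it, can be illustrated with a small sketch. Nothing here is SER's actual code; `fetch`, the delays, and the 2-second stand-in timeout are all made up for the demonstration.

```python
# Sketch: a per-request timeout means one hung site on one thread
# does not delay fast sites on other threads. `fetch` and TIMEOUT
# are illustrative stand-ins, not SER internals.
import concurrent.futures
import time

TIMEOUT = 2.0  # stand-in for SER's 60 s HTML timeout

def fetch(delay):
    """Pretend download taking `delay` seconds; give up at TIMEOUT."""
    time.sleep(min(delay, TIMEOUT))
    return "ok" if delay <= TIMEOUT else "timed out"

start = time.monotonic()
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    fast = pool.submit(fetch, 0.1)   # typical site
    slow = pool.submit(fetch, 10.0)  # hung site, hits the timeout
    results = (fast.result(), slow.result())
elapsed = time.monotonic() - start

print(results, round(elapsed, 1))
```

The fast request finishes immediately even while the slow one is still waiting; total wall time is bounded by the timeout, not by the slow site's real load time.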
  • Mine's slowed down a LOT as well after buying Captcha Breaker. Avg. solving time this morning stood at 23 seconds! Sure, I've got accuracy maxed over the speed settings, but this used to be 2-4 seconds, running the same amount of campaigns.

    I started noticing the slower submissions when GSER added the "CPU and Memory Usage" functions, which are pointless in my opinion; you can check that yourself in Task Manager. Constantly monitoring PC/VPS usage uses unnecessary memory, I'd say.

    I've got a fast VPS from a good provider, running GSA Indexer, GSER and CB only: 19 campaigns at 120 threads in GSER, 30 threads on the Indexer. Even though I can sometimes see CB working, the GSER interface is not responding most of the time, so I can't stop it or adjust my settings without restarting it.

    Will a future version reduce memory load? This is becoming a real problem now.
  • Ozz
    edited February 2013
    @seo_addict: So the changes to the CB options I told you about at BHW had no effect at all?
    It sounds to me like you should decrease your threads in SER and Indexer. What could have happened is that thanks to CB you get more verifications => more work for the Indexer => more CPU/RAM usage overall.

    @gg: It's more about how many sites get verified when SER is in verification mode. LeeG made some good points about it. If everything is working for you, then you don't need to change anything.
  • The CPU overload from GSA SEO Indexer happens because @sven made it run so quickly. I think if @sven adds this feature it will reduce the CPU load; correct me if I'm wrong, @sven:

    Don't display the submit status of GSA SEO Indexer, just run it in the background. Would that reduce the CPU load? I know someone will ask how they'd know GSA SEO Indexer is running, since the submit status is how they see it.

    So make it an option: if hiding the submit status really does reduce the CPU load, add an option to show or hide it.
  • I followed Ron's advice and got a Lindexed account, and I just don't run Indexer at the same time as SER; since then everything is very sweet and fine again. Running Indexer with 10 threads didn't help much for me, because the load was still at 50%, and with 10 threads it's basically a never-ending story. :D

    I'm as far as possible from the 50k-per-day limit on Lindexed, but with both running (SER + Indexer) my CPU load was always at 100%, and stopping SER or even moving the mouse took like 5 minutes.

    Nevertheless, the GSA tools are just the best thing I've ever seen. Love it!

  • Is there a way to send to GSA Indexer, but have GSA Indexer on another computer? I want to use my VPS to run SER and CB, and my home computer to run Indexer.
  • ron (SERLists.com)

    I doubt it, but you can export verified files from SER in one second, and import them into GSA Indexer quite easily.

    Trust me, on full indexer, you can load it up with a massive file and let it run for a couple of days if you have thousands or tens of thousands of links.
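Loading Indexer with a massive export and letting it chew for days is easier to manage if the export is split into batches first. A minimal sketch of that idea, assuming a plain list of verified URLs (the batch size and example URLs are made up, not anything SER or Indexer prescribes):

```python
# Sketch: split a big SER verified-URL export into fixed-size batches
# so GSA Indexer can be fed one manageable chunk per run.
BATCH_SIZE = 10_000  # illustrative; pick whatever your box handles

def split_batches(urls, size=BATCH_SIZE):
    """Yield lists of at most `size` URLs, preserving order."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# Stand-in data in place of a real export file.
urls = [f"http://example.com/{n}" for n in range(25_000)]
batches = list(split_batches(urls))
print(len(batches), len(batches[-1]))  # 3 batches: 10k, 10k, 5k
```

Each batch can then be imported into Indexer's queue separately, so a crash partway through only loses the current chunk.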

  • Thought about that as well. :D

    But here is the manual solution: uncheck sending to GSA Indexer; after some days, select all your projects, choose Show URLs / Verified, copy that, and import it into Indexer's queue.

    Indexer will then run forever.

  • edited February 2013
    If someone wants to automate this, it would be very appreciated. :)

    What I just did was copy all verified URLs (from the right-hand side of SER) and clear the list, then paste them into Indexer.
    I'm going to keep doing it that way, so I'm not running the same links through Indexer twice or missing any.

    All in all, not too bad. :)
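That copy-clear-paste routine can be scripted so already-sent links are skipped automatically. A rough sketch, assuming one URL per line in plain text; the function name and example URLs are invented for illustration:

```python
# Sketch: keep a history of links already sent to Indexer and only
# forward URLs not seen before, so nothing runs through twice.
def new_links(verified, history):
    """Return verified URLs absent from history, in order, no dups."""
    seen = set(history)
    out = []
    for url in verified:
        if url not in seen:
            seen.add(url)
            out.append(url)
    return out

history = ["http://a.example/1", "http://b.example/2"]
verified = ["http://b.example/2", "http://c.example/3", "http://c.example/3"]
print(new_links(verified, history))  # only the genuinely new URL remains
```

After each run you would append the returned URLs to the history file, so the next copy from SER is again checked against everything sent so far.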
  • ron (SERLists.com)

    @king818 - That's why I keep a folder per project group with the indexed links as of my last run. I really don't want to run dups through Indexer that are already indexed, so I 'compare' in Scrapebox to weed out the dups.

    Just don't forget that natural indexing occurs over a few weeks, so it's worth checking indexing first. After I weed out the dups, I use public proxies to check indexing, and only move over the ones that aren't indexed.
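The weed-out-then-check workflow described above can be sketched as a two-stage filter. Note the `is_indexed` callable here is a deliberate stand-in for whatever proxy-based site: query Scrapebox performs; it is not a real Google or Scrapebox API:

```python
# Sketch of the workflow: drop candidates already in the master list,
# then keep only links the checker reports as not yet indexed.
# `is_indexed` is an injected stand-in, not a real index-check API.
def links_to_index(candidates, master, is_indexed):
    fresh = [u for u in candidates if u not in master]
    return [u for u in fresh if not is_indexed(u)]

master = {"http://old.example/1"}
candidates = [
    "http://old.example/1",   # dup of the master list
    "http://new.example/2",   # new, but got indexed naturally
    "http://new.example/3",   # new and not indexed -> send to Indexer
]
fake_index = {"http://new.example/2"}
result = links_to_index(candidates, master, lambda u: u in fake_index)
print(result)
```

Only links that are both new and unindexed survive, which is exactly the set worth spending Indexer CPU on.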

  • @ron

    How do you do that so you have just the verified links of the last run? That's exactly what I would need.

  • ron (SERLists.com)

    See here: https://forum.gsa-online.de/discussion/comment/12628#Comment_12628

    You need to maintain a list of your verified/indexed links. Once you have a list to begin with, you can 'compare' that list in scrapebox against all verifieds you pull from SER. It will weed out the ones already in your good list, and leave you with the new ones.

    Or you can just do it by date. You just need to keep a log so you don't get confused. Either way @oil, you want to check indexing in SB before you send them to GSA Indexer. Many links get indexed over time, so I don't want to waste effort trying to index links that are already indexed.

  • @ron, thanks, that makes sense.
  • AlexR (Cape Town)
    Wouldn't it be neat if we had an option to send links to the indexer after X days, and before they get sent, check whether they are already indexed? It would save a lot of resources if we removed URLs that are naturally indexed.
  • Guys, I run 4 tiers for my projects. Is it even worth trying to index the lower 3 tiers, or should I focus solely on getting all my T1 links indexed?