How many threads are you running indexer on? I lowered mine to 10 and it works like a charm. I bet you had a very high number of threads and this is what caused the issue.
@oil - Exactly my point. You have two beasts in the room, SER and Indexer. There isn't room for both of them if you build a lot of links. I give each their own space.
20 private proxies from "Proxyblaze", 50 threads / HTML timeout 60 seconds
6. GSA Captcha Breaker
I check all programs several times daily and make sure they are all updated, and I make sure I have all Windows updates. I'm running Server 2012 64-bit. Any suggestions to improve performance? That is all that's running, no antivirus etc. or any unneeded services. What do you guys use to back up your VPS? My host wants like $40 a month for R1Soft, which is too much, so I'm trying to figure out the best way to back up my VPS.
No CS anymore? And why NHSEO? Is blog and image posting so much better with NHSEO? I don't use NHSEO so I can't tell, but all platforms of NHSEO are supported in SER as well. Did you try to focus on SER only to free up resources?
60s HTML timeout seems very low to me, but if it works then it's superb of course.
@king818 - I don't think there is a problem. I just think people need to understand that an internet connection can only handle so many threads. If you go full on indexer, you are blasting at 300 threads. It is extremely powerful. (But you can always decrease the threads in Indexer).
It would be like having two SER's running on the same PC. You probably wouldn't do it at the same time.
@Ozz - you said "60s HTML timeout seems very low to me, but if it works then it's superb of course."
This is the time SER waits for a website to load before moving on, right? So with a setting of 60, it will wait up to 60s before moving on. Why have this so high? Surely 30 is a better number? If a site takes 60s to load, wouldn't it be better to move on quickly? My understanding is that it's the time allowed for a single site to load on a thread, so more threads wouldn't impact it unless you have a slow machine.
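For what it's worth, the way a per-request timeout usually behaves can be shown with a tiny Python sketch (just a generic illustration, not SER's internals; the URLs and the 30s value are made up): each thread gives its own URL at most the timeout, so adding threads only changes how many fetches run in parallel, not how long one slow site can hold a thread.

```python
# Rough illustration (not SER's actual code): a per-request HTML timeout
# applies to each fetch on its own thread, so raising the thread count
# doesn't change how long any single slow site is allowed to take.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen
from urllib.error import URLError

URLS = ["http://example.com", "http://example.org", "http://example.net"]
HTML_TIMEOUT = 30  # seconds a single request may take before we give up

def fetch(url):
    try:
        with urlopen(url, timeout=HTML_TIMEOUT) as resp:
            return url, len(resp.read())
    except (URLError, TimeoutError) as exc:
        return url, f"skipped ({exc})"

# 50 threads or 5 threads: each URL still gets at most HTML_TIMEOUT seconds.
with ThreadPoolExecutor(max_workers=50) as pool:
    for url, result in pool.map(fetch, URLS):
        print(url, result)
```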
Mine's slowed down a LOT as well after buying Captcha Breaker. Avg. solving time this morning stood at 23 seconds! Sure, I've got accuracy maxed over the speed settings, but this used to be 2-4 seconds, running the same number of campaigns.
I started noticing the slower submissions when GSER added the "CPU and Memory Usage" functions, which are pointless in my opinion. You can check that yourself using Task Manager. Constantly monitoring the PC/VPS usage uses unnecessary memory, I'd say.
I've got a fast VPS from a good provider, running GSA Indexer, CSER and CB only. 19 campaigns at 120 threads on GSER, 30 threads on the Indexer. Even though I can sometimes see CB working, the GSER interface is not responding most of the time, so I can't stop it or make adjustments to my settings without restarting it.
Will a future version reduce memory load? This is becoming a real problem now.
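On the point about checking usage yourself instead of having the tool monitor it constantly: an on-demand check is cheap. Something like this throwaway Python snippet (using the third-party psutil package, my pick, nothing GSA ships) gives the same numbers as a quick glance at Task Manager:

```python
# One-off resource check (requires the third-party psutil package:
# pip install psutil). Polling on demand like this is far cheaper than
# keeping a live usage monitor running all the time.
import psutil

print(f"CPU:    {psutil.cpu_percent(interval=1):.1f}%")   # sampled over 1 second
mem = psutil.virtual_memory()
print(f"Memory: {mem.percent:.1f}% used of {mem.total / 2**30:.1f} GiB")
```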
@seo_addict: so the changes to the CB options that I told you about at BHW had no effect at all??
To me it sounds like you should decrease your threads in SER and Indexer. What could have happened is that due to CB you get more verifications => more usage of the Indexer => more CPU/RAM usage overall.
@gg: it's more about how many sites get verified when SER is in verification mode. LeeG made some good points about it. If everything is working for you then you don't need to change anything.
The CPU overload caused by GSA SEO Indexer is because @sven has made GSA SEO Indexer run so quick. I think if @sven can add this feature to GSA SEO Indexer it will reduce the CPU load. Correct me if I'm wrong, @sven.
Don't display the submit status of GSA SEO Indexer, just make it run in the background. Would this reduce the CPU load? I know someone will ask how they'd know GSA SEO Indexer is running; the submit status is how they see it.
So how about making it an option? If hiding the submit status really helps reduce the CPU load, then create an option to show or hide the submit status.
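Just to illustrate the suggestion (a toy Python sketch, not how GSA SEO Indexer is actually written): the idea is that repainting a status line for every single submission costs CPU, so an option to hide it, or only refresh it every Nth submission, keeps that work off the UI.

```python
# Toy sketch of the proposal (not GSA code): make the submit-status
# display optional, and only repaint it every Nth submission instead of
# on every single one.
import time

SHOW_STATUS = True       # the proposed show/hide option
STATUS_EVERY = 100       # repaint at most once per 100 submissions

def submit(url):
    time.sleep(0.001)    # stand-in for the real submission work

def run(urls):
    for i, url in enumerate(urls, start=1):
        submit(url)
        if SHOW_STATUS and i % STATUS_EVERY == 0:
            print(f"submitted {i}/{len(urls)}")  # cheap, infrequent update

run([f"http://example.com/page{i}" for i in range(1000)])
```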
I followed Ron's advice and got a Lindexed account, and I just don't run the Indexer at the same time as SER, and since then all is very sweet and fine again. Running the Indexer with 10 threads didn't help a lot for me, because the load was still at 50%, and with 10 threads it was basically a never-ending story.
I am as far as possible from the 50K-per-day limit on Lindexed, but with both running (SER / Indexer) my CPU load was always at 100%, and stopping SER or even moving the mouse was a task that took like 5 minutes.
Nevertheless, the GSA tools are just the best thing I've ever seen, love it!
Is there a way to send to GSA Indexer, but have GSA Indexer on another computer? I want to use my VPS to run SER and CB and my home computer to run Indexer.
I doubt it, but you can export verified files from SER in one second, and import them into GSA Indexer quite easily.
Trust me, on full indexer, you can load it up with a massive file and let it run for a couple of days if you have thousands or tens of thousands of links.
@king818 - That's why I keep a folder by project group with the indexed links as of my last run. I really don't want to run any dups through Indexer that are already indexed. So I 'compare' in scrapebox to weed out the dups.
Just don't forget that natural indexing occurs over a few weeks, and it is kind of worth it to check on the indexing first. So after I weed out the dups, I use public proxies to check on indexing, and just move over the ones that aren't indexed.
You need to maintain a list of your verified/indexed links. Once you have a list to begin with, you can 'compare' that list in scrapebox against all verifieds you pull from SER. It will weed out the ones already in your good list, and leave you with the new ones.
Or you can just do it by date. You just need to keep a log so you don't get confused. Either way @oil, you want to check indexing in SB before you send them to GSA Indexer. Many links get indexed over time, so I don't want to waste effort trying to index links that are already indexed.
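If anyone wants that 'compare' step without Scrapebox, a few lines of Python do the same dedup. The file names here are just placeholders for whatever you export (master_list.txt = links already sent/indexed, new_verified.txt = the fresh verified export from SER):

```python
# Same idea as the Scrapebox 'compare' step, as a small script.
# master_list.txt holds links you've already dealt with,
# new_verified.txt is the fresh export from SER.
def load_urls(path):
    with open(path, encoding="utf-8") as fh:
        return {line.strip() for line in fh if line.strip()}

master = load_urls("master_list.txt")
new_verified = load_urls("new_verified.txt")

fresh = sorted(new_verified - master)        # only links not seen before
with open("to_index.txt", "w", encoding="utf-8") as fh:
    fh.write("\n".join(fresh) + "\n")

print(f"{len(fresh)} new links written to to_index.txt")
```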
Wouldn't it be neat if we had an option to send links to the Indexer after X days, and before they get sent, check if they are already indexed? It would save a lot of resources if we removed URLs that get indexed naturally.
Thought about that as well, but here is the manual solution:
uncheck sending to GSA Indexer, then after some days select all your projects,
Show URLs / Verified / copy that,
and import into the Indexer's queue of multiple URLs,
and the Indexer will run forever.
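Roughly the same loop as a script, in case you'd rather not paste by hand. It assumes you export the verified URLs from SER to a text file every few days; sent_to_indexer.txt is a made-up tracking file so a later export doesn't re-queue the same links, and indexer_queue.txt is what you'd import into the Indexer:

```python
# Sketch of the manual export/import loop described above.
# ser_verified_export.txt = the Show URLs / Verified export from SER,
# sent_to_indexer.txt    = links already queued on previous runs,
# indexer_queue.txt      = the file to import into GSA SEO Indexer.
from pathlib import Path

def load(path):
    p = Path(path)
    if not p.exists():
        return set()
    return {line.strip() for line in p.read_text(encoding="utf-8").splitlines() if line.strip()}

verified = load("ser_verified_export.txt")
already_sent = load("sent_to_indexer.txt")

to_queue = sorted(verified - already_sent)
Path("indexer_queue.txt").write_text("\n".join(to_queue) + "\n", encoding="utf-8")

# remember what was queued so the next export doesn't re-send it
with open("sent_to_indexer.txt", "a", encoding="utf-8") as fh:
    fh.write("\n".join(to_queue) + "\n")

print(f"{len(to_queue)} URLs ready to import into the Indexer queue")
```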
@ron
How do you do that, so you have just the verified links of the last run? That's exactly what I would need.
See here: https://forum.gsa-online.de/discussion/comment/12628#Comment_12628