- select use all the engines << doesn't influence memory usage
- didn't collect keywords << influences memory usage a tiny bit
- 100 submissions a day << doesn't influence memory usage
- didn't use global list << doesn't influence memory usage
- didn't use PR or OBL << doesn't influence memory usage
It depends more on the data you have filled in the project (especially keywords).
darman82
- Keywords I use to scrape: 100K generic keywords
- Keywords for anchor text: only 1-2
- LSI keywords: around 50 spun keywords {key1|key2|key3|etc}
- Threads: 100-150
- HTML timeout: 120
- Search engine queries: 10
- Dedicated proxies
- Articles I use: ACW articles
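The {key1|key2|key3|etc} notation above is spintax: each {...} group resolves to one alternative chosen at random, and groups can nest. A minimal resolver in Python, as a sketch only (not SER's actual implementation):

```python
import random
import re

def spin(text: str) -> str:
    """Resolve {a|b|c} spintax, innermost groups first, choosing randomly."""
    pattern = re.compile(r"\{([^{}]*)\}")  # a group with no nested braces
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{key1|key2|key3}"))  # one of key1, key2, key3
```

Resolving innermost groups first is what makes nesting like {a|{b|c}} work.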
If we run GSA SER on 32-bit Windows, its max memory usage is 1GB, but if we run it on 64-bit Windows the max memory usage increases to 2GB.
This is the main reason why only a couple of people here are having the "out of memory" problem with GSA SER.
I have asked my VPS provider to re-install my VPS with 64-bit Windows. I hope this will solve my "out of memory" problem.
For those not having the "out of memory" issue, I assume your Windows is 64-bit. Am I correct?
Ozz
If that's the case then it sounds like a bug to me, as 32-bit should use up to 2GB and 64-bit up to 4GB.
Sven www.GSA-Online.de
@Ozz, it's actually true. A 32-bit OS only provides more than 2GB to the program if it is modified as described here:
http://www.maxi-pedia.com/3GB+switch+Windows+boot.ini+3+GB
A 64-bit OS provides this only if the 32-bit program tells the OS it can actually handle that kind of address allocation.
GSA programs allow that special use of more memory, but on 32-bit systems only if the boot.ini is modified.
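For reference, the modification in question adds the /3GB switch to the OS entry in boot.ini, roughly like this (the entry text and ARC paths vary per machine, so treat this as a sketch and back the file up first; as noted later in the thread, a bad edit can stop Windows from booting):

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```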
AlexR Cape Town
So I am running Win Server 2008 R2.
1) Does this mean the whole system can only use 1GB of RAM, or just a single running program?
2) If so, what's the point of everyone having 8GB of RAM on their VPSs if only 1GB is used?
3) I'm having issues with CPU usage being very, very high. I have 2GB of RAM, but it's normally only using 100MB of it, while CPU is at 98% all the time. It's a 2-core, 6.80GHz CPU. Is anyone else seeing SER use a lot of CPU?
Bytefaker
Be careful with how SER measures CPU usage. Check it via Task Manager or other Windows tools. You'll see that it very seldom uses 99% of the CPU resources.
But CPU is my bottleneck too. I currently use a VPS with 4 cores and 4GB of RAM. More than enough RAM is available, but CPU is the problem.
What you can do with the spare resources: for example, let Scrapebox run and harvest a nice "little" list of new targets you can import later.
AlexR Cape Town
@Bytefaker - I see the issue both in SER's own measurement and in Task Manager. I've even asked my host if they can put together a small VPS package with a bit more CPU, but they won't. The next package up has more disk space, RAM and other things I don't need.
That's why I was wondering if others are finding GSA SER uses a lot of CPU. I know Sven codes a lean program, but I wonder if there is a feature or something that is eating up the CPU.
Hunar
edited January 2013
I'm bottlenecked by the 2GB as well.
On my server I have a lot of projects running. If I load up the scheduler with 40 projects, a switch of 30 mins and threads at 200, within 20 mins SER will say it's at 99% CPU and 2GB of memory. But if I look at my Task Manager, I'm only at about 40% CPU usage and 25% RAM usage. So I am barely using my dedi to its full potential and thus wasting a lot of money per month on the server.
So my workaround is only loading up 20 projects at a time with a switch of 30 mins; then I stay around 1GB of memory and 99% CPU usage according to SER.
AlexR Cape Town
@Hunar - are your threads staying around 200, or is that just your setting?
The reason I ask is that many assume it's running at full threads. Mine swings from 5 to 100 just while I'm watching it.
Hunar
It stays constant at 200 threads because I have the global site list enabled. As soon as I disable it, it will do that swing from 1-100 even though I have 200 threads set.
sonic81
@Hunar, I notice this as well. What's the cause of the drop when you turn off global lists? I'm guessing it's because it doesn't have as many targets to post to?
Hunar
If I remember right, it's because the extra threads have nothing to do, so it only uses the threads that have something to do. I'd imagine that it could use the extra threads to do more searching or whatnot, but Sven, Lee or Ozz would know more about that than I would.
LeeG Eating your first bourne
I notice threads drop when it does searches, and also when it shuts down one project and starts another.
I run 250 threads, 10 projects on a 30-minute rotation.
darman82
OK, I'm using a TrustVM VPS and the OS was Windows 7 32-bit. I asked Carlos about re-installing my VPS with 64-bit Windows, but he answered that he doesn't have a 64-bit Windows OS available. What a bad support system they have. I think I will have to move two of my VPSs from them to another VPS service.
My VPS was Windows 7, and the page does mention that if we edit it wrong it can cause Windows not to start. But it's OK; now I know the problem is not in GSA SER but in my VPS OS. I guess I have to move then.
OK, I think this problem is solved. The solution for the "Out Of Memory" problem is: use 64-bit Windows as your OS and have a minimum of 4GB of memory.
Thanks All...
Sven www.GSA-Online.de
Well I read it's working for XP as well...so I would guess for Win7 as well. Not tried it though.
grind
One install running three new projects with ~5k URLs and 15k keywords (plus Collect Keywords From Target Sites and Use Collected Keywords To Find New Targets selected) would hit the 1GB (32-bit) memory limit in about 3 minutes.
Knowing I'd been running the same project with different URLs and keywords for three weeks without incident, I tried to figure out what I'd done differently. Each URL has specific keywords assigned to it, along with a fair number of raw URLs and random anchors (click here, click this, etc.) set up in a deeply nested spintax configuration to give me the percentages I want. I had to do it this way instead of using the checkboxes below the anchor text area, since the keywords were specific to each URL.
Anyway, for this run I used a bunch of %Website% variables instead of the actual URL list in my setup file, as it made for quicker work in Excel, and as soon as it fired, memory went over the limit. Rebooted, ran one project only, same thing. Tried another project, same thing. Went back and changed my data to use the actual URL instead of the %Website% variable, fired it up, and it's been running like a champ for 20+ hours now; current memory usage is 221MB.
So if you're using a ton of nested spintax and %variables% and running into memory issues, try replacing the variables with the actual data. It takes longer to set up with Excel, a text editor of choice and some regex than using the variables, but it seems to run a lot cleaner. Hope that helps somebody.
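Grind's workaround (substituting the literal value for %Website% before import) can be scripted instead of done by hand in Excel. A sketch, where the %name% token syntax and the values are assumptions about his setup rather than SER's documented behaviour:

```python
import re

def expand_placeholders(template: str, values: dict) -> str:
    """Replace %name% tokens with literal values; unknown tokens are kept as-is."""
    def sub(match):
        key = match.group(1)
        return values.get(key, match.group(0))
    return re.sub(r"%(\w+)%", sub, template)

# Hypothetical nested-spintax anchor line using a %Website% variable:
spintax = "{Visit %Website%|Check out %Website%|click here}"
print(expand_placeholders(spintax, {"Website": "http://example.com"}))
# → {Visit http://example.com|Check out http://example.com|click here}
```

Run over each line of the setup file, this bakes the URLs in up front, so the tool never has to hold and resolve the variable-laden templates at runtime.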
Grind
darman82
@Grind, the articles I use are from ACW and I only use %anchor_text% variables, nothing else.
I will test editing my scrape keywords; right now I'm using 100K generic keywords for scraping. I will cut it down to around 1,000 keywords only.
royalmice
For some reason my GSA SER doesn't want to use more than 1GB of RAM, even when it indicates 99% CPU usage.
I get Out of Memory warnings when I start GSA SER, and then after about 10 warnings it runs fine with no more warnings.
Here is my setup:
WIN 7 - 64 bit
CPU: Core i7-2600K Sandy Bridge, overclocked to 4.2 GHz
RAM: 16 GB DDR3 (usually only using around 50%)
GSA SER: v5.04
Projects: 43 (10 projects, each with about 3 tiers)
Threads: 100
HTML timeout: 130
Use proxies everywhere
Time to wait between SERP queries: 20
Automatically search proxies every 0 min.
Check newly added proxies + retest previous
I use 40 of my own semi-shared proxies (buy proxies) + the scraped ones.
Why doesn't my SER want to use more than 1GB of RAM? I read somewhere that we can get it to use up to 4GB of RAM on a Win 7 64-bit system, but no one seems to be able to tell me how.
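On the "no one seems to be able to tell me how" point: on 64-bit Windows a 32-bit program can address up to 4GB only if its EXE carries the LARGEADDRESSAWARE flag in the PE header (which, per Sven, GSA programs set). A rough way to verify the flag on any .exe by reading the COFF Characteristics field, following the PE/COFF layout (not SER-specific; the file path in the usage comment is an assumption):

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # PE Characteristics bit

def is_large_address_aware(data: bytes) -> bool:
    """Return True if the PE image's COFF Characteristics field has the
    LARGEADDRESSAWARE bit set. `data` is the raw bytes of the .exe."""
    # Offset 0x3C of the DOS header holds e_lfanew, the file offset
    # of the "PE\0\0" signature.
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("not a PE file")
    # Characteristics is 18 bytes into the COFF header, which starts
    # right after the 4-byte signature.
    characteristics = struct.unpack_from("<H", data, pe_offset + 4 + 18)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

# Hypothetical usage (path is an assumption):
# print(is_large_address_aware(open(r"C:\GSA\ser.exe", "rb").read()))
```

If the flag is set, a 1GB or 2GB ceiling points at the OS or the program's own memory manager rather than the address space.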
darman82
hi royalmice,
From my experience, GSA SER will use 3GB max memory if we run it on 64-bit Windows. Also, from my testing, GSA SER can only run 40 campaigns max; for better results just use 30 campaigns.
I kept having the same problem until I stopped some projects and only ran my highest-priority campaigns.
royalmice
@darman82 yes, I did try with both monitor PC resources on and off, but it did not make any difference.
I think I have just managed to get it to use 2GB of RAM.
I stopped all projects and edited each project as follows:
Schedule 50 verified submissions per day.
Don't collect keywords from target sites.
Don't select use domain URL as anchor.
I noticed that when I clicked the main stop, some projects were not stopping, only showing "stopping" for more than an hour. After I edited each project I restarted it, and this time all was OK. So the editing I did may have fixed something that was causing it not to go over 1GB.
Anyway, all OK now and it is cranking along at 2GB. Now I just need to wait for the cows to grow horns; then Sven might do a 64-bit version (with optimised code) so I can make full use of my available RAM.
bretty113
>> Schedule 50 verified submissions per day.
royalmice, how do you schedule 50 verified submissions per day?
@royalmice can you send me these 50 files? I would like to get the same exception here and maybe improve its performance.
@Sven, how about those settings?
I have zipped the Kontent Machine exported files and am busy uploading to Dropbox now; I will send you a message as soon as the upload is done.
@Sven, about this page -> http://www.maxi-pedia.com/3GB+switch+Windows+boot.ini+3+GB
It only supports these OSes:
I have 2 VPSs and both of my VPSs were out of memory.
Thanks for the feedback. My problem is that my GSA SER doesn't want to go over 1GB, even if I run 500 threads.
Make sure you uncheck "monitor PC resources" and "automatically lower threads" in the options. Don't use those options.
Try it.
You said in this thread :
>> I said I will not compile a 64bit version. Though with some tricks on the memory manager I made it to use up to 4GB on a 64Bit system.
My question is: on all my 64-bit servers, once SER hits 1.9-2.1GB of RAM, I get the annoying "Out of memory" message forever!
Any ideas why my SER installations are not using 4GB of RAM?