Improving CPU usage
I am running about 100 campaigns (only Tier 1 at the moment) and could easily use another hundred, and then also Tier 2 or even Tier 3. I am also running Captcha Breaker and SEO Indexer.
My problem is my CPU, which runs at 99%. I run about 40 threads with an HTML timeout of 100. I don't use forums or blog comments.
I use scheduled posting, so each domain only gets about 60 verified links a day.
Is there anything I can do to run another 100 campaigns (200 altogether), and maybe even add a Tier 2 to each campaign, without the program freezing from lack of CPU?
I have thought about getting another licence, but then I would also need another licence for CB, and that starts to hurt my purse.
Comments
CPU usage for CB can hardly be lowered (only if I optimize the code, which I'm doing all the time).
CPU usage for SEO Indexer can only be lowered by using fewer threads.
CPU usage for SER can be decreased in several areas, but for that you need to show us some more of your setup.
I use:
40 threads
HTML timeout 100
10 private proxies from proxy-hub
custom wait between search engine queries of 5 sec
I use proxies for PR checking
I use CB with 4 retries. I don't use anything else besides CB
I use filters with an update interval of 1440 min; max download size is 4 MB
on each campaign I use all platforms except blog comments, document sharing, forums, guestbooks, image comments, pingbacks, referrers, trackbacks and video
the campaigns pause after 60 submissions a day, ±20
I skip sites with more than 50 outgoing links, and also PR below 2 or above 6. I also skip unknown PR.
I try to skip creating nofollow links
I also skip sites with certain bad words
I use 37 international search engines to find target URLs.
Will a few keywords, like 50 or 100, be enough?
@mlk10 - I wanted to add that, as a user of SEO Indexer (which is an outstanding product), I do not run it while I use SER. It creates 1,400 links per URL in about 20 seconds, and it sucks the oxygen out of my PC. SER completely slows down when I run both together. So I run SEO Indexer separately.
What I do is grab all the verifieds, sort by date since my last indexing run, give the file to SEO Indexer, and let it run by itself. The thing is a beast, and in my opinion I get more out of both by giving each its own separate time. Just my experience.
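That "grab the verifieds since my last run" step can be scripted instead of done by hand. Below is a minimal sketch, assuming you export your verified links to a tab-separated file with the URL in the first column and an ISO date (YYYY-MM-DD) in the second; the file name, column layout, and date format are all assumptions you would adjust to match your actual export from SER.

```python
# Sketch: filter an exported verified-links file down to the entries
# verified after the last SEO Indexer run, so only fresh URLs get fed
# to the indexer. File format is assumed: "url<TAB>YYYY-MM-DD" per line.
from datetime import date

LAST_RUN = date(2024, 1, 15)  # date of the previous indexing run (assumption)

def urls_since(path, last_run=LAST_RUN):
    """Return URLs whose verified date is after the last indexing run."""
    fresh = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip("\n").split("\t")
            if len(parts) < 2:
                continue  # skip malformed lines
            url, verified = parts[0], parts[1]
            if date.fromisoformat(verified) > last_run:
                fresh.append(url)
    return fresh
```

You would then write the returned URLs to a plain text file and hand that file to SEO Indexer in its own session, keeping SER and the indexer from competing for CPU at the same time.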