
Improvements on CPU usage

edited January 2013 in Need Help
I am running about 100 campaigns (only tier 1 at the moment) and could easily use another hundred, and then also tier 2 or even tier 3. I am also running Captcha Breaker and SEO Indexer.

My problem is my CPU, which runs at 99%. I run about 40 threads and an HTML timeout of 100. I don't use forums or blog comments.

I use scheduled posting, so each domain only gets about 60 verified links a day.

Is there anything I can do to run another 100 campaigns (200 altogether), and maybe even a tier 2 on each campaign, without the program freezing from lack of CPU?

I have thought about getting another licence, but then I would also need another licence for CB, and that starts to hurt my wallet.

Comments

  • Sven www.GSA-Online.de

    CPU for CB can hardly be lowered (only if I optimize the code, which I'm doing all the time).

    CPU for SEO Indexer can only be lowered by using fewer threads.

    CPU for SER can be decreased in several areas... but for that you need to show us more of your setup.

  • Hi.

    I use:
    40 threads
    HTML timeout 100
    10 private proxies from proxy-hub
    custom time between search engine queries: 5 sec
    I use proxies for PR checking
    I use CB with 4 retries. I don't use anything else besides CB.
    I use filters updated every 1440 min. Max download size is 4 MB.

    On each campaign I use all platforms except blog comments, document sharing, forums, guestbooks, image comments, pingback, referrers, trackbacks and video.

    The campaigns pause after 60 submissions a day ±20.
    I skip sites with more than 50 outgoing links, PR below 2 or above 6, and unknown PR.
    I try to skip creating nofollow links.
    I also skip sites with certain bad words.

    I use 37 international search engines to find target URLs.

  • Sven www.GSA-Online.de
    If you don't use blog comments, make sure your keyword list is small. You don't need to import massive keyword lists, which only take time to parse and memory to keep updated.
  • Hmm, OK. I can see that I use 5 MB of keywords, so that's a lot of keywords.

    Will a few keywords, like 50 or 100, be enough?
  • Sven www.GSA-Online.de
    Yes, as keywords are then only used as tags for some engines. You can even use 10 and it would be enough.
  • This is a great tip btw. Thx Sven!
  • Sven www.GSA-Online.de
    Oh, I can teach Ozz something... strange ;) You're welcome.
  • Thanks a lot
  • ron SERLists.com

    @mlk10 - I wanted to add that, as a user of SEO Indexer (which is an outstanding product), I do not run it while I use SER. It creates 1400 links per URL in about 20 seconds, and it sucks the oxygen out of my PC. SER completely slows down when I run both together, so I run SEO Indexer separately.

    What I do is grab all the verifieds, sort by date since my last indexing run, give the file to SEO Indexer, and let it run by itself. The thing is a beast, and in my opinion, I get more out of both by giving each their own separate time. Just my experience.
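
    If you want to script that "grab the verifieds since the last run" step instead of sorting by hand, here is a minimal sketch. It assumes the verified links were exported to a tab-separated file with the URL in the first column and the verified date in the second; the file names, column layout, and date format are all assumptions, so adjust them to whatever your export actually looks like.

        # filter_verifieds.py - hypothetical helper, not part of SER itself.
        # Assumes a tab-separated export: URL in column 0, verified date
        # (YYYY-MM-DD) in column 1. Adjust to match your real export.
        import csv
        from datetime import datetime

        LAST_RUN = datetime(2013, 1, 15)      # date of the previous indexing run
        EXPORT_FILE = "verified_export.txt"   # assumed export file from SER
        OUTPUT_FILE = "for_seo_indexer.txt"   # plain URL list to feed SEO Indexer

        with open(EXPORT_FILE, newline="", encoding="utf-8") as src, \
             open(OUTPUT_FILE, "w", encoding="utf-8") as dst:
            for row in csv.reader(src, delimiter="\t"):
                url, verified = row[0], datetime.strptime(row[1], "%Y-%m-%d")
                if verified > LAST_RUN:       # keep only links verified since last run
                    dst.write(url + "\n")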

  • Good idea :)
  • AlexR Cape Town
    @sven - that's an excellent tip about keywords! :-)

    I don't have too many keywords, but CPU use is still extremely high, around 99%.

    You said:

    "CPU for SER can be decreased in several areas..."

    What are the other parts?
  • Sven www.GSA-Online.de
    Mainly the content you use, the search engine selection, and lots of filters (bad word filters, URL filters)...
  • Ozz
    edited February 2013
    I've modified my scripts for the tags anyway, but your method should do a good job for anyone who is using articles/SNs or SBs in separate projects.

    In terms of tags, you should use them somewhat like categories or broader keywords. Instead of "German Shepherd Dog Training", use "Dog Training". The reason is that those 'tag' pages sometimes have PR, and if you are lucky, your link is listed on such a page.

    To find out which engines use tags, make use of Notepad++:
    --> Search --> Find in Files --> Find what: %keyword% --> Directory: ...Program Files (x86)\GSA Search Engine Ranker\Engines --> Find all
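
    If you'd rather script that search than click through Notepad++, here is a quick sketch that does the same thing. It assumes the default install path under C:\, so adjust the path to your setup.

        # find_tag_engines.py - lists every engine file under the SER
        # Engines folder that contains the %keyword% placeholder, i.e.
        # the engines that use your keywords as tags.
        import os

        # assumed default install path; change this to your actual folder
        ENGINES_DIR = r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines"

        for name in sorted(os.listdir(ENGINES_DIR)):
            path = os.path.join(ENGINES_DIR, name)
            if not os.path.isfile(path):
                continue
            with open(path, encoding="utf-8", errors="ignore") as f:
                if "%keyword%" in f.read():
                    print(name)   # this engine script uses keywords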


  • AlexRAlexR Cape Town
    @sven - could we not have an option to use "anchor" and "secondary anchor" as tags? They are more applicable than keywords in many instances.
  • SvenSven www.GSA-Online.de
    no option...not every single thing needs an option!
  • So if we had a large KW list, it would make more sense, instead of feeding it into every project, to chop it up into smaller parts so each project has its own unique KWs: less memory usage, no projects scraping the same results, and the submitted ones join the global list that is shared across all projects anyway.
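
    A minimal sketch of that chopping step, assuming one keyword per line in a master file (the file names and project count are made up for illustration):

        # split_keywords.py - split one big keyword list into a small,
        # unique chunk per project so no two projects share keywords.
        PROJECTS = 200                        # number of projects to feed

        with open("master_keywords.txt", encoding="utf-8") as f:
            keywords = [k.strip() for k in f if k.strip()]

        # round-robin split: project i gets every PROJECTS-th keyword
        for i in range(PROJECTS):
            chunk = keywords[i::PROJECTS]
            with open(f"project_{i + 1:03d}_keywords.txt", "w", encoding="utf-8") as out:
                out.write("\n".join(chunk))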