My progress (or lack of it)

Hi all, I am about 7 days into my first project and was wondering whether my results are typical.
I am running articles, web 2.0s, wiki links and social networks as tier 1 for my money site.

I have been reading up on the forum and made some tweaks, so I bought private proxies; I only have 5 running. I also bought some Hotmail accounts and am using 5 at the moment. Should I change them or increase the number of them?

I only have about 20 keywords. Am I restricting myself here, and should I be pulling a whole lot more from Google?
I have been running it on my laptop overnight for the last few nights and I have 385 submitted and 181 verified. Does this seem correct?
I'd appreciate any feedback in order to improve what I have already accomplished. Many thanks.

Comments

  • davbel (UK)
    Accepted Answer
    You might want to post some screenshots of your project settings from your options screen. Remember to blur out any of your URLs / email addresses :D

    20 keywords is not a lot. I'd be aiming to get as many as you can related to your money sites, ideally at least 1,000+. You can use the Google Keyword Tool or Scrapebox if you have it.

    There is a top 100k keyword list floating about on the forums, which lots of people have used (including me), so that might be worth a go just to get those numbers running.
  • Thanks Davbel, I thought the keyword list had to be relevant to the niche I am in.
    I will take some screenshots and post them in this thread when I am home. Many thanks.
  • If I were you I would read the many good posts on this forum which tell you exactly how to set the software up. It's how I learned and far quicker than waiting for responses here. Just check back through recent posts and bookmark the good ones :)
  • Thanks seagal. I have tweaked the project, as I said, by browsing the threads, but there are some things that I can't find out about, as mentioned above.
    My project is set up, but I would like to improve it.
    Thanks
  • Accepted Answer
    I don't know if I'm the only one, but I've always just written out my own keywords, normally 50 or so each time. I haven't had any problems doing that, and it means I can target new sites, maybe. Even if you use the 1k list on here, how many of those are going to be spammed to sh*t already??
  • Thanks for that spunko.
    I have researched my niche, which is weddings, and I have about 400 keywords from the keyword tool that get more than 1,000 searches a month. I agree with you; there's no point spamming, as it defeats the idea of getting relevant links.
    Any more advice on the global options, spunko?
  • I use GScraper to get keywords. It will dig a few levels pulling Google's related keywords. Probably a little expensive to just do that, but it's a good tool to have. It basically does everything Scrapebox does, but a lot faster.
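    For anyone without GScraper, the same kind of related-keyword digging can be sketched against Google's public autocomplete endpoint. This is a rough illustration, not GScraper's actual method; the `suggestqueries.google.com` URL is the widely known suggest endpoint, and the function names and depth-2 default are my own:

```python
import json
import urllib.parse
import urllib.request

SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def fetch_suggestions(query):
    """Fetch autocomplete suggestions for one query from Google's public
    suggest endpoint (it returns a JSON array: [query, [suggestions]])."""
    url = SUGGEST_URL + urllib.parse.quote(query)
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return data[1]

def expand_keywords(seeds, depth=2, fetch=fetch_suggestions):
    """Breadth-first expansion: feed each suggestion back in as a new
    query, `depth` levels deep, de-duplicating along the way."""
    seen = set(s.lower() for s in seeds)
    frontier = list(seeds)
    for _ in range(depth):
        next_frontier = []
        for kw in frontier:
            for suggestion in fetch(kw):
                s = suggestion.lower()
                if s not in seen:
                    seen.add(s)
                    next_frontier.append(s)
        frontier = next_frontier
    return sorted(seen)
```

    Passing a fake `fetch` function makes the expansion logic easy to try offline, and keeps you from hammering the endpoint while experimenting.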
  • Thanks Kreist, I did have a look at that and it's on my to-do list, so thanks for the good advice.
    Any more tips for the global options?
    I am running it on 50 threads with an 80-second HTML timeout; is this OK for getting good results from home?
    Thanks again
  • The optimal HTML timeout, from what I've read on here, is usually between 130 and 140. As for the threads, you should be able to go as high as your CPU lets you, or until it becomes annoying, since you're using your home computer. But you did say you only use 5 proxies, and I'm not sure how many threads you should use per proxy. I have 20 semi-private proxies and use 420 threads, so I would think you should be fine up to 100.

    Understand that most of the info I'm giving you is from what I've read, not experience. I've only been using this for a month or so. :P
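    As a sanity check on the numbers in that post: 420 threads on 20 proxies is 21 threads per proxy, which is where the "fine up to 100" estimate for 5 proxies comes from. A minimal sketch of that arithmetic (the helper name and the 21:1 ratio are just my reading of those figures, not a documented rule):

```python
def max_threads(num_proxies, threads_per_proxy=21):
    """Scale the thread count with the proxy count, using the roughly
    21-threads-per-proxy ratio implied by 420 threads on 20 proxies."""
    return num_proxies * threads_per_proxy

print(max_threads(20))  # -> 420, matching the 20-proxy setup described above
print(max_threads(5))   # -> 105, so ~100 threads on 5 proxies
```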
  • Oh, and any specific global options you're talking about?
  • Thanks Kreist, that's right, I am only using 5 proxies. I will increase that, and I have changed the threads to 100 and the HTML timeout to 120. I am also using about 100 search engines for my only campaign, which is tier 1.
    I'm also using 50 Hotmail accounts.
    I appreciate your advice; some seem to be annoyed by people asking questions here, so I do appreciate it.