Optimizing GSA for a large number of projects - am I doing it right?

Hi, so I've been working with GSA for a while now and have been getting some excellent results and LPM. I'm now scaling up to a much larger set of projects for all of my tier 1 guest posts/blog network posts.

So, below is my first set of tier 1 projects, set up and just started running (figures below are after 15 minutes of running).

[image]

What I've done is set up the first project with all of my desired options, then duplicated it, importing a new set of content from KM (Kontent Machine) each time. I've used Tools > Auto Fill > Use Kontent Machine for that.

I've then imported a list of keywords (50k high-volume search queries) into each project - the exact same keywords. I've also imported a set of very targeted anchor text keywords from GKWT (590 total), and I'm using the same keywords and anchor text for every project (all projects are pointing at tier 1 posts, which point at my money site - all in the same niche/industry).

Settings below:

[image]
I've gone through all of the target sites on the left and deselected anything with a low verification rate (anything under 100, basically), so I should be seeing a really good success rate there.

These are my options below - are they satisfactory?
Selected engines = Google International + a few random countries + Ask.com
[image]

I've also imported 500 email addresses into each project (the same emails for every project).
[image]

So, my results: I'm sitting at about 30 LPM.
I have 30 semi-dedicated proxies from BuyProxies.org.
VPS = Berman Hosting Extreme.

----------------------------------------------------------------------------------------------------

I'd love to hear what any of the pros here have to say, especially on Google scraping, etc. My verified list is about 6 MB (450k URLs).

Thoughts? Ideas?

Thanks!

Comments

  • Just curious - how are you generating your keyword lists for scraping? Just generically, or are you going more niche-relevant?
  • Generically in this case, but if I were going niche-relevant, I'd use GKWT and a few different searches, filter out anything with under xx,xxx monthly searches, and build a highly targeted keyword list with 5-10k keywords in 10 minutes or so.
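For illustration, the volume-filtering step described in that comment could be sketched like this. This assumes a hypothetical GKWT CSV export with "Keyword" and "Avg. monthly searches" columns; the actual export format may differ, and `filter_keywords` is an example helper, not a GSA or GKWT feature:

```python
import csv

def filter_keywords(path, min_searches=10_000):
    """Keep only keywords at or above a monthly-search threshold.

    Assumes a CSV export with 'Keyword' and 'Avg. monthly searches'
    columns; volume values may contain thousands separators.
    """
    keep = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Strip thousands separators like "22,000" before parsing
            searches = int(row["Avg. monthly searches"].replace(",", ""))
            if searches >= min_searches:
                keep.append(row["Keyword"])
    return keep
```

The resulting list can then be saved as a plain text file, one keyword per line, for import into each project.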
  • Okay. I know a lot of people are just using generic lists they found online for scraping, but I assumed it's much better to use a targeted niche list - if nothing else, to avoid scraping the same spammed-out targets that everyone else is scraping.
  • Pretty impressed by the organization of the project and how you presented it here. Lots of choices and switches for me to go over.
    I guess I do it a bit differently, using Account Creator+ for emails. But I did see someone using huge lists of emails, and they were reporting a low level of accounts being blacklisted.
    Can't comment much beyond that; hopefully some of the more experienced users around here can shed some more light.
    Thanks for sharing though!
  • OzzOzz
    edited May 2013 Accepted Answer
    There are two things I think you should change.

    1) Put all your related links in one project. I mean, they are all about "video recorder", so why not use them in one project?

    2) I wouldn't use all the emails, as SER has to scan 500 emails per project each time to find all the messages related to that project. I'm not 100% sure about this, but I believe that's what happens.

    Conclusion: just do one project for each niche, and use ~10 emails per project that you swap out on a regular basis.
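Since the suggestion above boils down to splitting a master email list into small per-project batches to rotate through, here is a minimal sketch of that partitioning step (assuming the emails are in a plain Python list; `batch_emails` is an illustrative helper, not a SER feature):

```python
def batch_emails(emails, batch_size=10):
    """Split a flat email list into consecutive batches of ~batch_size,
    so each project gets a small set that can be swapped out regularly."""
    return [emails[i:i + batch_size] for i in range(0, len(emails), batch_size)]
```

With 500 addresses and a batch size of 10, this yields 50 batches - enough to rotate a fresh set into each project many times before reusing any address.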
  • Thanks @ozz. I have about 20 separate projects for the same client/niche. I've built about 40k verified links and would like to build tier 2 links to them, then tier 3/4, etc.

    Can I use the Merge Projects function, then duplicate the result and, inside the duplicated project, "use verified URLs" from the other project? Will that work seamlessly?

    Cheers