
My Tiered Setup - How Can I Improve It?

edited April 2013 in Other / Mixed
Hey, I've had GSA for close to a week now. I spent the first 2-3 days poring over this forum, reading everything I could find on the software and playing around with it. I've developed the strategy I want to go with. I'd like to share it here; hopefully those just starting their GSA journey can learn from it, and feedback on any part of it is greatly appreciated.

My SETUP:

I followed a tier setup similar to Ron's here: https://forum.gsa-online.de/discussion/2930/ser-tiers#Item_28 and Ozz's here: https://forum.gsa-online.de/discussion/2930/ser-tiers#Item_28. I will be using GSA for everything except press release submissions & creating Web 2.0 properties on sites like Tumblr, Blogger, etc. Here's an example of one of my sites' campaigns:

image

-> Contextual-Tier1, Tier2 = Article, Social Network, Web 2.0, Wikis
-> Crap for both contextuals & directories = Blog comments, Forum, Guest Book, Image comment, Microblog, S Bookmark, S Network, Wiki. (These are the secondary backlinks; Ron calls them the kitchen sink. I just named them crap.)
-> Directories = Just directories
-> PR-Tier2 = These are backlinks to press releases that I have submitted for that site. The PRs are tier 1, making these links tier 2. These include contextual backlinks mixed with the "crap" backlinks from above.
-> Web20-Tier2 = I haven't added this yet, as I'm waiting for the Web 2.0 GSA add-on from Jamese: https://forum.gsa-online.de/discussion/3192/serengines-com-web-2-0-engines#Item_16. If he doesn't launch it soon, I'll have these made by hand or look for something else.

OPTIONS

I set the following options in the main options box:

- 120 Threads
- 120 HTML timeout
- Using 30 shared proxies. $25 from buyproxies.org. They work well.
- 3 second wait time between search queries.
- Using captcha breaker & just signed up for askmebot. $5/mth.
- I have indexing completely off to save resources. I can't find the thread, but Ron posted that he checks his links once a week to see if they're indexed, then runs them through GSA SEO Indexer all at once. I plan to do this.
- I have the filter on
- And on advanced I have everything checked.

With these options and my setup I do about 5k submissions/day at ~15 LPM (links per minute). That includes the PR filter on some of the campaigns. I will be working to improve this number.

TIER 1.

OK, so that's the basic setup. Here's how I set things up:

-> Data

image

-> What I did in Data:

- I have ~10 URLs from the site in there, the homepage only once. [I got this from someone who wrote that if you look at a site's natural backlinks, only about 10% of them go to the homepage; the rest go to other pages.]
- Chose random URL to be more natural
- Use URL variations to be more natural
- I have a massive list of ~100k keywords. [I got that from here: https://forum.gsa-online.de/discussion/comment/18225#Comment_18225]
- Use collected keywords to find new targets so it never runs out
- My anchor texts are hidden, but I've got 20 of the main ones we're targeting (this campaign is for an aged site already ranking in the top 10 pages, BTW. I'll talk about new sites at the end).
- Use secondary 10%. For this I took my main keywords and added the URL at the beginning, plus a bunch of keywords from the Google Keyword Tool. So it looks like {url.com keyword 1|url.com keyword 2|google keyword tool keyword|...} (see the sample after this list). The reason for putting the URL at the beginning of some is that, looking through other sites' backlink profiles, I noticed many have this occurring naturally.
- Generic 20%
- Domain as anchor 25%
- Anchor variation 100% - figure it's more natural
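
To make that secondary-anchor spintax concrete, here's a made-up sample (example.com and the keywords are placeholders, not my real ones):

  {example.com blue widgets|example.com widget store|blue widget reviews|buy blue widgets online}

SER resolves the braces by picking one alternative at random each time it needs an anchor, so a longer list like this spreads the anchors out naturally.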

Now you can't see my article, but I have a spun one in there that I spun with SpinChimp (any software can be used). I did care about perfect readability on this one.
Also, for categories I used the tip of using *General*, *Misc.*, etc. I read somewhere that using the *s gets you more coverage, since the wildcards match any category name containing that word.
Everything else, like website title, description, etc., is spun manually so it makes 100% sense.
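
If you're new to spintax, the format is just alternatives inside curly braces, and braces can be nested. A tiny made-up title example:

  {Top|Best|Essential} {Tips|Advice} for {Beginners|{New|First-time} Users}

Each submission resolves to one combination, so this short template alone yields 3 x 2 x 3 = 18 distinct titles.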

-> Options

image

- Set to 40 submissions/day
- Set to PR 1+. I'm only running 6 campaigns right now. Once I start running more I'll probably remove this to save resources, but right now I can afford it.
- Ticked "put URL where clearly seen as spam". The reason for this is so that the article gets posted on wikis & social networks instead of just a profile being created, and I get the contextual link.
- You can't see it, but under target URLs I have 8 English search engines checked and told it to use URLs from global lists.
- You can see I have the index links option off.
- Set to verify once per day to save resources
- Everything else is pretty standard



-> Email

image

- I bought the email creator here: https://forum.gsa-online.de/discussion/3056/sell-email-account-creator-#Item_51. Works well. What I do is create 30 emails with it per campaign and have all 30 forward to one main one; the tool sets that up automatically for you. This way GSA only has to log into one email account for everything, and since the main account is never used for signups it can never be blacklisted.


TIER 2:

image

- Same Data as Tier 1, but I don't use the secondary anchor texts as those have my $ site URL in them
- Set to submit 200/day. Thinking of setting this to unlimited?
- Remove the PR 1+ filter
- Index links should be off; I must have accidentally turned it on.

DIRECTORIES

image

- Set to submit 60/day
- PR 1+ checked again.
- You can't see the data in this shot, but it's pretty straightforward. I have a title and description spun well to be readable. I also unchecked all the anchor options (use generic, use secondary, etc.) for directories.

PRs

image

- I have 65 PR URLs loaded up in this
- 400 submissions per day. That comes to only ~6 per URL (400 / 65 ≈ 6). Should I remove the limit on this?
- What's different about this one is I have the "crap" links here together with the contextual links. The reason is that I don't plan to backlink these contextual links, so there's no need to separate them. That was the only reason to separate earlier: I didn't want to waste resources backlinking the "crap" links. BTW, my calling them crap links doesn't mean they're bad; that's just a memorable name I had for them.

CRAP/SECONDARY LINKS

image

- These are the secondary links, so fewer filters.
- The Data, which you can't see, is the same as the contextual Tier 1 stuff. Same article and titles. For stuff like blog comments, forums, etc. I just use the standard info that GSA puts in there.
- 200/day
- The indexer should also be off...
- Otherwise same options as the others

QUANTITY OF LINKS BUILT

OK, quantity is something I didn't find discussed much in other people's threads. This would be great to get feedback on. What I'm doing for campaigns for sites I call "mature" (either 3+ years old, or ranking in the top 100, or already having some good natural backlinks) is everything above, so basically:

- Tier 1: 40-80 contextual submissions per day & 60 directories (depends on the site) to the $ site.
- To the contextuals only: a Tier 2 of 200 contextual/day
- Then everything except the $ site is given the "crap" links
- Plus, on top of this, I've got my 65 PRs that I always have made in month 1, plus soon ~10 web 2.0s built per month until I run out of them (so maybe 30-50 max, as these are limited)

What are your thoughts on that?

For new sites (not yet ranking in the top 10 pages for any of their keywords / without a good set of natural backlinks yet):

- Tier 1: 10 contextual submissions per day & 30 directories to the $ site (the reason for the high number of directories is that I figure only 10% of them will accept the submission. Thoughts?). This is the only thing I change in terms of quantity. Everything else below gets the same:
- To the contextuals only: a Tier 2 of 200 contextual/day
- Then everything except the $ site is given the "crap" links
- Plus, on top of this, I've got my 65 PRs that I always have made in month 1, plus soon ~10 web 2.0s built per month until I run out of them (so maybe 30-50 max, as these are limited)

What are your thoughts on this? Is it too much/too little? Anyone with experience on this who can pitch in?

Anchor texts: THIS IS PERHAPS THE MOST IMPORTANT PART. Many people I've talked to spend the first month only building links with the anchor text being the site name/variations of the name, without going for the keyword they want to rank for. So the PRs, web 2.0s, contextual links, and directories will all use variations of the anchor "Business Name" (a made-up sample follows below). I think it makes sense and is most natural to do this, so I plan to do 80% of links like that and 20% of links with the keyword anchor. Thoughts on this? And what about using "Business name - keyword" as the anchor?
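
To illustrate with a hypothetical business (not a real campaign), the branded 80% could be a spintax set like:

  {Acme Plumbing|Acme Plumbing Co|AcmePlumbing.com|www.acmeplumbing.com|Acme Plumbing - emergency plumber}

with the remaining 20% being the money keyword itself, e.g. {emergency plumber|24 hour plumber}. Note the last branded variant is the "Business name - keyword" form I asked about.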

Other tips:

- Download a multi-clipboard, you'll thank me. I use Ditto on PC; Jumpcut on Mac is good.
- Proxies are not an option, they're a requirement
- Use hotmail
- When setting up a new campaign for a site, I duplicate a Contextual-Tier1 project, move it to a new folder named after the site, and edit it as needed. Then I duplicate that into Tier 2, duplicate that into the directories one, then duplicate Tier 2 again and edit that one into the "crap" projects I need. This is an OK way of doing it, but I know there has to be a better way. Any ideas?

What I've got:
GSA SER obviously
GSA Captcha Breaker
GSA SEO Indexer
SpinChimp
Excel
For articles, I grab 3 off GoArticles and spin paragraphs first, then spin words.



Comments

  • Oh yes, another question: I always just check the entire category of backlink when I want to use it, so for wikis I just check the box in front of Wiki. But some of the engine platforms only have a small number of installs, so very few if any backlinks come from them. I feel like it might waste resources. Am I right? Does anyone check just the individual ones they want?
  • Looking forward to some answers by some of the big names on this board! I'm at a similar skill level in my GSA usage and have been learning so much from all the great feedback this forum gets.
  • @TheReaper234 - First off, I would offer a "Well Done" and a pat on the back.  =D> You have shown more initiative in the one week that you have owned SER than many (most!) other users probably have in months of using it. It is well worth the time you have put in up front to learn how to use the program right and get it set up well from the beginning.

    I did not have time to read through everything and check all the settings. I will try to come back and give some more input at another time (if others have not already). A couple of small things that I did notice are...

    1.) On your email settings, you should check the box to wait 15 minutes between checking emails.
    2.) Are you using one article for your "crap" projects? If so, you probably need to change this to use more than one, since you are building so many links (look at the %spinfolder macro; see the example after this list). You may even want to do this for your tier 1 projects. I create a folder with about 20 super-spun articles (each article can produce thousands of 70% unique articles), and use that macro in my projects.
    3.) Where you have created one project for multiple pages on your main site, I usually create multiple projects - one for each page. This is because I am trying to rank each page for different search terms. You can accomplish this the way you have it set up as long as you code the keywords with the URLs, but it doesn't look like you have it set up this way. If you don't, you are telling Google that the same search terms apply to every page on your site (that you have in this project anyway). Also, if you set each page up as a different project, you can build more links to your whole site faster.
    4.) When you do create some tier 1 links to your money site outside of SER, I would create a project in SER and load these URLs into it and start building links to those as well.
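
    A minimal example of that macro in the article body field (the folder path is just a placeholder, and it's worth double-checking the exact syntax against SER's macro guide):

        %spinfolder-C:\SER\SpunArticles%

    As I understand it, SER grabs a random file from that folder on each submission and resolves its spin syntax, so 20 super-spun files give you a huge pool of unique articles.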

    Anyway...a few things I saw right off the top. But, like I said to start...job well done!

    What type of PC and internet connection are you running this on?

    I will check back and see how things are going and what others have to say...good luck!
  • some ideas that could increase your LpM --

    uncheck "collect keywords from target sites" and "use collected keywords to find new target sites", don't use PR filter.

    reference: https://forum.gsa-online.de/discussion/2611/my-gsa-ser-config-full-monty-warning-low-lpm
  • This is quickly turning into a sticky thread for me :) @DavidA2, how do you produce your super-spun articles, please?
  • @sobiman - I developed a site about a year ago that provided super-spun articles to members when they joined. I created a couple thousand super-spun articles for this site, but I could never figure out how to get it marketed properly. It is set up on JVZoo, but never really did anything (sigh!) So it is just sitting there...and I have access to all these articles on over 100 different topics.

    If you (or anyone else, for that matter) are interested, PM me. If there is some interest, I will see what I can do to get people set up with access to it. Maybe it was overpriced for what I was providing...I don't know (I don't think so). But it is a shame for it to just be sitting there (unused).

    Basically, it was set up with 3 membership levels that allowed members to download different numbers of "super-spun" articles each month.

    If there is interest, I would "resurrect" it and reduce the price (greatly?) for people here.


  • @DavidA2 - I am interested, and I would think others are. Everyone is looking for good content. How about posting the level of spinning and the price you have in mind? Then the community can say what they think, and hopefully provide a level of comparison with other offers.
  • I'd say 95% done. The rest is just cosmetics, like choosing the best engines and so on.

    Watch out for that generic anchor %... it produced crap for me (5% turned out to be 30%...). I uncheck it now and use 40% URL/domain anchor instead. I never liked that "click here" generic anchor idea anyway ;-)
  • @DavidA2

    1.) On your email settings, you should check the box to wait 15 minutes between checking emails.

    I only have one email that it logs into, though. The seller of the email creator made a video; you can watch that to see what I mean.

    2.) Are you using one article for your "crap" projects? If so, you probably need to change this to be using more than one since you are building so many links (look at %spinfolder macro). You may even want to do this for your tier 1 projects. I create a folder with about 20 super-spun articles (each article can produce thousands of 70% unique articles), and use that macro in my projects.

    I am. That's an excellent idea that I need to implement.

    3.) Where you have created one project for multiple pages on your main site, I usually create multiple projects - one for each page. This is because I am trying to rank each page for different search terms. You can accomplish this the way you have it set up as long as you code the keywords with the URLs, but it doesn't look like you have it set up this way. If you don't, you are telling Google that the same search terms apply to every page on your site (that you have in this project anyway). Also, if you set each page up as a different project, you can build more links to your whole site faster.

    Thx for the tip, I will consider doing that.

    4.) When you do create some tier 1 links to your money site outside of SER, I would create a project in SER and load these URLs into it and start building links to those as well.

    Yup, that's what I'm doing.

    Anyway...a few things I saw right off the top. But, like I said to start...job well done!

    What type of PC and internet connection are you running this on?

    i5 2.27GHz, 4GB RAM. Basic home cable internet.

    I will check back and see how things are going and what others have to say...good luck!

    Thx!


    uncheck "collect keywords from target sites" and "use collected keywords to find new target sites", don't use PR filter.

    Do "collect keywords from target sites" and "use collected keywords to find new target sites" slow it down?


    Yeah, I'm noticing a lot of generic anchors, a higher % than I put in. I think I'm just gonna mix a few generic terms in with my secondary keywords.

    Thx for the feedback, everyone. Please, if anyone has input to share about the quantity of links to build, that'd be greatly appreciated!
  • AlexR (Cape Town)
    This is a great post. :-) 

    I think @Sven should add this to the FAQ. 

    "It's not doing what I expect, what can I do" - you know those generic help me questions with no details!

    Answer - "Read This Post!" 
  • ron (SERLists.com)
    edited April 2013

    @TheReaper234, I've been busy and haven't had a chance to respond. Here are my thoughts:

    - I would be careful with directories going to the moneysite. I didn't even include them in my diagram, as I have found them to be generally worthless, and I was not able to make many links with them. So for me, it is an inefficient platform for creating links.

    - I would not use "collect keywords" and/or "use collected keywords". I do not have these checked on any projects. You get a lot of longtail crap that won't help you find new targets, so having them checked is an inefficiency.

    - You have captcha selected to also ask user. I assume you have a captcha cracker. If you do, then have "Ask 1st service to fill captcha".

    - I would drop the PR filter. I don't have it on for any project and rank just great without it. It will really slow down your links built and LPM in general. The exception is things like blog comments, or in your case directories, that go straight to the moneysite.

    - Do 40 "submissions reached in a 'day'", not 1440. That screws everybody up. Always use 'per day' and always use 'submissions', never 'verifications', just for the record.

    - Always tick on "Verified links must have exact URL". If it is a Tier 1, set verification to automatic every 1440 minutes. If it is a high-volume crap tier, choose "Never/Disabled" and manually run verification on those crap tiers once a week, or every 3 days, or whenever you like. You want a daily verification process on your T1 because the tiers underneath can start building links to them right away; it will help index those T1 links and, of course, shoot link juice through them to your moneysite.

    - Tier 2 - Uncheck "Avoid posting URL on same domain" on high-volume crap tiers - you want massive links. I would set a number of links 'per URL' as opposed to a flat number; that way your T2 link building increases hand-in-hand with your T1, otherwise you will build too few links over time on the 'underneath' tiers. Fix the captcha setting, again, to not involve you, the user. Check that box on verification and change it to Never/Disabled for high-volume crap tiers, assuming you verify manually on your own.

  • Follow ron's advice, 99.9% done ;-)
  • @ron

    As usual your input is great.

    Yeah, what I'm trying to do with directories is build a nice list of verified ones. I've found these links do help with ranking, my stuff anyway.

    This: Uncheck "Avoid posting URL on same domain" on high volume crap tiers... just saved my sanity. I'm testing a new platform and was going crazy wondering why, on a list of 200k identified targets, it would stop posting at 4k.

    Would you mind giving some input on the quantities I wrote that I'm doing? I've found very little on this forum about the quantity of backlinks people are doing.
  • ron (SERLists.com)
    edited April 2013

    I will start brand new domains with a drip of 5 submitted per day. Then week 2 I might put in a contextual T2 at 10X per URL.

    Then week 3 I will probably put in a T1A at 10X per URL. And then keep building downward (T3) and sideways (T2A & T3A in subsequent weeks).

    Just by what I said, you can tell that I gradually accelerate into a new website. I am in no rush. This technique works, and it sticks.

    You then have the ability to make those 10X per URL crap tiers into 20X. You then have the ability to make the T1,2,3 contextual tiers go to 20X in subsequent weeks. I always save the direct T1 increases for last.

    If you were to draw a picture of what I do, it is a diagram with a small head, big shoulders and long legs.

    If this is my baseline for a new website, then you can make rational assumptions of what I do for more mature websites. Trust me, it isn't a lot more.

    If you accelerate into a scenario where you make a lot of links, you will run out  of the ability to feed such a beast. It becomes insane. I can maintain position in 50k - 150k exact match with this outline.
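
    Laid out as a rough timeline, that ramp reads as:

        Week 1:  T1 only, drip of 5 submissions/day to the new domain
        Week 2:  add a contextual T2 at 10X per T1 URL
        Week 3:  add a T1A at 10X per URL
        Week 4+: keep building downward (T3) and sideways (T2A, T3A)
        Later:   bump the crap tiers from 10X to 20X, then the contextual tiers, with direct T1 increases last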

  • Good thread, bookmarking!

    Question: are you guys re-verifying your contextual Tier 1 links? I found that some of those backlinks no longer exist after some time (blog deleted, site doesn't work anymore), so it is a big waste of resources to create a Tier 2, Tier 3, or even the T1A/T2A/T3A like Ron does...
  • ron (SERLists.com)

    I re-verify those once per week. They do tend to stick, so I am not too worried about creating tiers underneath to links that do not exist. Very small fallout. 

  • Ron, how do you re-verify links that have already been verified once? Does Modify Project > Verify All Links actually re-verify all links, both newly submitted links and previously verified links?
  • ron (SERLists.com)

    Right click on Project.

    Show URLs>Verified>Verify

    You can highlight all projects at once and do this, or just highlight all the T1's like I do, or whichever projects you want to do.

  • Got it, Ron. TQ \m/
  • @ron 10x per URL per day?
  • ron (SERLists.com)
    Yes. Remember only 10%-20% become links, so it makes for a whopping drip of 1-2 real links per URL per day.
  • Point taken. I really appreciate your help.

    If you are building links to an authority site with > 100 pages, do you prioritize them, or just submit 5 links per URL per day to each money page (and subsequent tiers)?
  • Right now I build links to one project using only directories. I have 2000 submitted and only 80 verified. The one notable thing is that traffic increased a lot; I think the directory owners just check out the websites.
    No proxies were used for this project.

    thx
  • Hey @Ron

    Can you talk about your indexing rate? I've got about 20k backlinks I've run through Indexification and GSA Indexer, and very, very few are getting indexed. The contextual ones have a better rate, maybe 5%, but the others maybe 0.5% if even that. What are you getting?

    Does a link even help your site increase in ranking if it doesn't get indexed? I've always heard no.
  • My indexing rate is as bad as yours, @TheReaper234; I have followed the same steps as you and use KM spintax like @ron, but I don't get anywhere near an index rate like ron's 50% or more. I think that is impossible... sorry guys...
  • @TheReaper234, just wondering how your projects / rankings are going? I hope all is going great.

    I am just about to start my campaign based on your very helpful thread! Many thanks


  • I've had mixed results. The indexing rate is extremely poor. I just changed up my setup quite a bit. Will report back to let you know what happened.
  • ron (SERLists.com)
    edited April 2013

    I just checked the indexing rate on some projects where I did not use the SER Ping function, and where SER automatically sent them to Lindexed.  So Lindexed was the only method I used.

    The indexation rate was 37%. This was for contextual T1, T2 & T3 on a base of 35,000 links, so statistically these are good numbers.

    On this batch of links I will now be putting the non-indexed links through GSA Indexer. We will see what happens. Usually when I do something like this I get a really nice jump, like an extra 25% will get indexed within a week or two.

    With respect to Indexing in Google...

    The indexation rate for crap links like profiles will be the lowest, whereas the indexation rate on things like articles and web 2.0's will always be higher.

    Please remember that indexation is a function of 'value' with Google...pages Google believes 'worthy' to be indexed and cached...just like a library would be making that decision.

    Some things will never index because they are crap. Google will also tend to index inner pages on properties that have a higher PR on the homepage.

    So to illustrate...on the crap end of the spectrum you have a PR0 crap forum with an inner page forum profile of PR N/A. This is total crap and will never get indexed.

    On the other end of the spectrum you have a fresh web 2.0 property with a page PR N/A, but it's on the Wordpress platform which has a homepage PR9. This has the best chance of getting indexed.

    So when you try to analyze your indexation rate, you need to evaluate it based on the type of links you are building. Look at your pie charts to understand the distribution of how you build links.

    10,000 wikis will have a much higher indexation rate than 10,000 forum profiles. Blog comments on blogs that are already indexed solve that problem. Leaving a forum comment on page 37 of comments doesn't.

    So if you are very hung up on indexing: sort your verified links by platform, export them, and run them through Scrapebox to check the indexation level for each platform and understand where your 'problem' lies.
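
    If you want to script the grouping step before the Scrapebox check, here is a minimal Python sketch. It assumes the export is a CSV with "Engine" and "URL" columns; adjust the column names to whatever your SER export actually contains:

        import csv
        from collections import defaultdict

        # Group verified URLs by the platform (engine) that produced them.
        urls_by_engine = defaultdict(list)
        with open("verified_export.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                urls_by_engine[row["Engine"]].append(row["URL"])

        # Write one URL list per platform, ready to paste into an index checker.
        for engine, urls in urls_by_engine.items():
            safe_name = "".join(c if c.isalnum() else "_" for c in engine)
            with open(safe_name + ".txt", "w", encoding="utf-8") as out:
                out.write("\n".join(urls) + "\n")

    Then check each platform's file separately, and you will see exactly which link types are dragging your indexation rate down.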

    I tend to not fret over this issue. I understand the importance of indexing - and take some measures to help things index - but I never lose sight of the fact that building more links is even more important.  

     

  • Another piece of golden advice from @ron, thanks. I wasn't sure about that; now I'm very clear.
  • ron (SERLists.com)
    edited April 2013

    Just an update - this took me all day to do...

    I checked the indexing rate on my crap tiers (T1A, T2A, T3A), which is where the majority of my links are built.

    The index rate was 75.7% on the junk tiers. This was on a base of several hundred thousand links.
