
What Type Of Backlinks Are Recommended Towards Tier 1 ?

edited January 2013 in Need Help
Hi guys,

Purchased GSA SER a couple of days ago and I'm currently testing different options.

My link building pyramid concept looks like this:

***Money Site

***Tier 1 - Unique, High Quality, Manually Created Web 2.0

***Tiers 2 and 3 - I will let GSA handle them.

My question to those of you with more experience is: what type of links should I build, via GSA, towards my Tier 1 (which consists of manually created Web 2.0s)? I am asking because I've seen many people advising against guestbooks or image sharing websites. What do you recommend? (I'm trying to play it safe and not blast those Web 2.0s into oblivion :) )

Thanks

Comments

  • I am sure you will find different answers based on various experiences.

    My T2s are usually PR 1+ Forums, Bookmarking, Profiles, Microblog & Directories
    My T3s are usually everything SER can handle (plus Scrapebox blog posts)

    This has worked well for me so far.
  • Thanks Chutney.

    Any other opinions? Which engines, in your experience, should be avoided when creating backlinks towards Tier 1?
  • Well, since it's tier 1 my general preference is to include as many contextual platforms as possible... places where I can submit an article with an in-context backlink to T1 that is thematically related. I would also suggest trying to avoid platforms with a high turnover rate for lost links (guestbooks, blog comments and forum profiles tend to be a bit worse in this regard). It's good to have plenty of variety in T1 though, but I usually reserve the more high-volume, don't-care-about-link-loss stuff for T2+.
  • ron SERLists.com
    edited January 2013

    Tier 1 - Contextual platforms where you can leave an article - Social Network, Web2.0 and Articles. I use KM exclusively for these with outstanding results (and I also endorse ACW which is great as well). If you have a prized moneysite that is worth the extra effort, I would also build some manual web2.0's that are human reviewed, and write some articles or outsource it.

    Tier 2 - The kitchen sink, which is basically the other platforms. I don't use trackbacks, pingback or referrer for these. If you plan on having a Third Tier, then include in Tier 2 (as a separate project) the same platforms as Tier 1.

    Tier 3 - If you do a Tier 3, then shoot the kitchen sink (above) to both Tier 2 projects.

    It is important to understand that contextual properties will gather the most value in tiering. The other links are of lesser value (for example, a bookmark is not something to create tiers under, if you get my drift - who would be linking in to bookmarks in real life - and it will never accumulate value like an article). So always draw up your tiering scheme to run contextual properties underneath one another, and hit each layer of contextual links with the kitchen sink. These are the properties you want to feed the link juice.

    Bear in mind that using Forum will get your email blacklisted quicker, so you may want to separate that platform. If you have anything that you must absolutely have a PR filter on (like blogs), you are best to break out those platforms as separate projects as you will slow down GSA and find fewer targets. The same applies to the OBL link filter.

    Personally, I do not use these filters as I have found that I am ranking really well without them. I have always been a fan of high PR, but I have found since Penguin that this rule is no longer the rule. Most things you post on with the exception of blogs and maybe guestbooks have very low OBL's, so it isn't worth it.
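    The tier layout described above (contextual platforms tiered under one another, the kitchen sink hitting each contextual layer, and the deepest tier being kitchen sink only) can be sketched as a tiny helper. This is a hypothetical illustration in Python, not anything GSA SER itself exposes; the labels "contextual" and "kitchen sink" are shorthand for the platform groupings named in the post:

```python
# Toy model of the tiering scheme: "contextual" stands for the
# Article / Social Network / Web 2.0 / Wiki projects; "kitchen sink"
# stands for every other platform (no trackbacks, pingbacks, referrers).
def plan_tiers(depth):
    """Return {tier name: list of project types} for a pyramid `depth` tiers deep."""
    plan = {"Tier 1": ["contextual"]}
    for level in range(2, depth + 1):
        if level < depth:
            # Middle tiers are split into two projects: a contextual one
            # (which gets its own tier underneath) and a kitchen-sink one.
            plan["Tier %d" % level] = ["contextual", "kitchen sink"]
        else:
            # The deepest tier is kitchen sink only - no tiers below it.
            plan["Tier %d" % level] = ["kitchen sink"]
    return plan
```

    With two tiers this yields contextual on top and the kitchen sink below; with three tiers, Tier 2 becomes two projects and the kitchen sink moves down to Tier 3, matching the scheme in the post.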

  • Thanks Ron. Stellar Post!
  • Contextual links are always good (Web 2.0, article directories) but I have also had good results with blog comments, forum profiles and social networks in Tier 1.

    Don't use guestbook, image comments, indexer, pingback, referrer, trackback in Tier 1. I use these platforms in Tier 3 along with ScrapeBox.

    I don't use any PR filters as that decreases the number of links but I do use OBL filter of 60 to 100 (for blog comments).

    You will however get the best results by having Tier 1 created manually. Also, use the Tier 1 properties as mini blogs, i.e. with anywhere from 5 to 20 posts each. You can get good results with single-post Web 2.0s but they don't stick around for long, and with mini Web 2.0 blogs you can also use them for lead gen, and eventually they will increase in PR and authority.

    I also use Lindexed to get all the links indexed.
  • AlexR Cape Town
    @ron & @tumpien - would love to get your opinion on the following.

    I have some super well written spun articles that have taken my article writer over a day to write each. 

    What would you guys advise:
    1) Use these to create Tier 1 links to my moneysite? Manual web2.0's?
    2) If so, what platforms?
    3) If contextual (using GSA SER), what article sites and PR filter should I apply?

    I really don't want to waste them.

    Then are you using a tool to spin articles to use for contextual tier 2 links? I just don't want to waste these great articles there.
  • ron SERLists.com
    edited January 2013

    If you are paying for content, I would use those on non-GSA Tier 1 properties that are the tough/manual ones that you would have to do manually anyway. And then of course set up tiers underneath those.

    For valuable money sites, I would create some of those prized elusive web 2.0's as a tier1. Then I would also use GSA to create separate Tier1's, and use a tool like KM or ACW for those, and everything underneath it. And then of course, put tiers below that.

    I cannot tell you how much success I have had using GSA with just KM. People go on and on about all these high quality tier1's. To each his or her own. But I am knocking it out of the park with those spinners and just GSA with no manual web 2.0's. If they (your tier 1's) start to rank, then of course you will want to rewrite them into sales pieces.

    Like I said above, Tier 1 should be only those platforms that yield articles, and only those properties within each platform that yield articles - Articles, Web 2.0, Social Network, and Wiki. If you just go two tiers deep, then Tier 2 is the kitchen sink (all non-article properties). If you have a Tier 3, then separate Tier 2 into two projects: contextual properties like Tier 1, and the kitchen sink as the other project. Then Tier 3 is the kitchen sink, and it goes underneath the Tier 2 contextual properties. Don't waste your time creating tiers under worthless links like the kitchen sink - they will never accumulate or store value - they are just links.

  • +1 to the above. Use manual web 2.0 properties. Any of the usual suspects will do like Squidoo, Tumblr, Wordpress, Blogger, etc. Try to build 5-10 page web 2.0 blogs... I've seen much better success with them.

    Also, PDF and document sites, especially if the content is unique and reads well - then build web 2.0s to them and use GSA on the web 2.0s. Don't spam the document sharing sites... I've lost a few that way. But if done right you'll get lots of top rankings with them. I've got a ton of rankings using the likes of SlideShare.

    PS - Don't spin the same articles for the lower tiers. Makes them lose their uniqueness. Use something like Kontent Machine or Fiverr for spun content for the lower tiers. My personal favorite is http://www.uniqueblend.net/... spun well, Copyscape pass and also they read ok.
  • For Tier1 - All SER platforms that create contextual links + KM

    If you are using unique or highly manual spun content for Tier 1, don't bother to use SER with spun content.
    If you are using SER with spun content, don't bother with unique or manually spun content.

    Use one or the other :-)
  • edited January 2013
    @ron do you have a list of which properties within which platforms are yielding articles? I don't know yet what each property looks like, and such a list would probably be very helpful to other rookies as well.

    And another question: do you word-spin your KM content? Or just spin paragraphs or sentences?

    Thanks
  • ron SERLists.com

    @divatz - Take "Articles" as an example. Click the lead box so that all article platforms are highlighted. On the expander box, right-click on it and "Uncheck engines that use no contextual links". That leaves you with what you want. Social Networks and Web 2.0 are the others that are worthy.

    If you want to see what they look like, right-click on your project, Show URL=>Verified, double click on a property and take a look.

    I use the defaults for KM, and integrate a SpinChimp API. 

  • ron SERLists.com
    I also said Microblog further up and meant social network for Tier 1. I was tired I guess. So articles, social network and web 2.0 for tier 1.
  • @ron - Thanks a lot, it's all much clearer now.

    Another thing I would like to ask is: how do you approach indexing?

    I was thinking of using scrapebox...

    Do you try to index only tier 2 (articles, social network, web 2.0) [for me tier 1 consists of only original, high quality web 2.0s]? Or the whole kitchen sink? I assume it's the latter, but I want to know your opinion.

    Thanks
  • ron SERLists.com

    I try to index everything. I have GSA sending everything to Lindexed. After a couple of weeks, I look to see what is not indexing (so that means I keep master lists for projects and compare).

    Then I take those and put them in GSA Indexer. I love that program - it works extremely well. But I build so many links that I can't auto-feed that indexer - just too many links. So I do it separately.

    Scrapebox is a great way to help links get indexed, and I would highly recommend it for that. It is my final step in getting things indexed.

  • What's the difference between Scrapebox and using GSA with only blog comments and trackbacks?
  • ron SERLists.com

    If you own scrapebox then you would know that it can scrape tens of thousands of blog sites to post to quickly. It does only one thing - it finds blog pages to target for posting.

    Whereas GSA is dividing its resources across multiple projects - finding multiple targets - on multiple platforms. And then it opens emails, verifies, etc. If I can just get GSA to vacuum and get me some beer...

    You really want GSA to do its thing because it is so good at it. Let it build links. Try to take the indexing burden and handle that separately. I think you will find that to be the best use of resources.

     

  • scrapebox posts faster
  • ron SERLists.com
    ^^Exactly!
  • AlexR Cape Town
    Let's say you're targeting 5 keywords.

    Are you creating a set of web2.0's for each keyword? i.e. 10 web2.0's per keyword, total of 50 web2.0's?

    OR 

    are you using a set of 10 web2.0's each with an article on the keyword? Thus you're getting a broader web2.0?
  • @ron What's the reason for separating Tier 2 into two projects, one for contextual platforms like Tier 1 and one for the rest?


  • @Ozz Thanks for your comprehensive guide.
  • @ron Your answers have been really helpful but, as always, I have some more questions :)

    How do you index your backlinks with Scrapebox? By posting comments to blogs, right? Do you use any auto-approve lists (free or paid)? Do you make your own auto-approve lists? Any other useful approach?

    Thanks
  • ATM, I am shooting out some spintax to my moneysite, posting it on article directories (Drupal, wiki, etc.).

    The spintax is readable (!), created by myself, no autospin. Using approx 250 different anchor texts (will increase that).

    Some other projects look like this:

    moneysite
    L1: Web 2.0 with ONE unique article, approx 400 words, related to the money site.
    L2: spun articles, posted to article directories.
    L3: "junk" links, like forum profiles, etc. (all the rest, except trackbacks and stuff).

    I think some SEOs (like Greg Morrison) point their spun articles to their moneysite too, but those spins are readable :-) and related.

    For existing projects that have some links, traffic, and customers, I would go for the Web 2.0 strategy, just to be safe.

  • ron SERLists.com
    edited January 2013

    @divatz - I take a multi-step approach to index:

    1) I use the Lindexed API key to autofeed them as they are built in GSA.

    2) After a couple of weeks, I re-verify all verified links. Then I take what is there and check if they are indexed in Scrapebox (I don't want to slow down GSA, and I need public proxies to do this anyway).

    3) I create two separate lists for each Project Grouping - Indexed and NonIndexed.

    4) I keep the Indexed list updated, always adding to it. I take the NonIndexed list and submit it to GSA Indexer - a great program by the way, and it deserves more credit. That program builds 1400 links to each nonindexed link.

    5) I wait a couple of weeks. I then check that list I submitted to GSA to see what got indexed and what didn't. I add the ones that got indexed to my master list. Why? Because I use SB to compare any new verified lists from GSA to weed out ones I already indexed. Saves a ton of time.

    6) I take the nonindexed ones from GSA Indexer, and I run an SB blast. Scrape your own list in SB to blast. If you already have a list, great. These are just links. That is my final step.

    If I had xrumer, I would have just eliminated steps 2-6 lol :) But I don't have XR yet. Doesn't mean I won't. 
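    Steps 2-6 above are essentially set arithmetic on URL lists. A minimal sketch of that bookkeeping (function and variable names are invented for illustration; this is not a GSA or Scrapebox API):

```python
def split_by_index_status(verified_urls, indexed_urls):
    """Step 3: partition the verified links into Indexed / NonIndexed sets."""
    verified = set(verified_urls)
    indexed = verified & set(indexed_urls)   # verified links found in the index
    return indexed, verified - indexed       # (Indexed list, NonIndexed list)

def filter_new_batch(new_verified, master_indexed):
    """Step 5: weed out links already on the master Indexed list, so new
    verified batches are not re-checked or re-submitted."""
    known = set(master_indexed)
    return [url for url in new_verified if url not in known]
```

    The NonIndexed set is what would go to GSA Indexer, and whatever is still not indexed after that gets the final Scrapebox blast.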

  • @ron Thank you for your willingness to share!

    So if I am reading correctly you are using KM to produce articles for Tier 1? I can't get KM to produce much readable content - any tips for that?


  • ron SERLists.com

    @ambition - No spinner is going to create readable content. I don't worry about that and my rankings are golden and they stick.

    The only time you ever need to worry about readable content is if you are trying to create accounts at human moderated platforms. It's not google that's going to pick you off - it's the humans - remember that.

  • @ron, I remember I asked you before about GSA SEO Indexer, and you said you use full indexing instead of quick indexing. But I don't understand why you didn't select "use only indexing sites that allow deep links"?

    From my point of view, it is useless to blast to 1400 indexing sites when more than half don't allow deep links. Try selecting the option "use only indexing sites that allow deep links" - that is only 450+ sites, about 1/3 of the full indexing site list.

    By doing this, you will speed up your GSA SEO Indexer blast by 3x...

    Correct me if I'm wrong... since I never used a full blast... I used quick, and now full but with only indexing services that allow deep links.
  • ron SERLists.com

    @darman82 - I completely forgot about that - too many balls in the air. Brilliant observation. I just checked Indexer and have full and deep link both checked.

    Do you just check the deep link and that's all?

  • hey @ron, how many links do you put in your tier 1 content?
  • ron SERLists.com
    Just one. Works like a charm :)
  • @ron, just check full with those deep links and you will see GSA SEO Indexer only submits to around 450+ indexing sites rather than the 1500 indexing sites that you mention.
  • ron SERLists.com
    Ok thanks!
  • yeah, thanks for the info @darman82
  • ron SERLists.com
    @darman82 gets the award for the most useful post today =D>
  • Ron, with most of your projects do you use just 2 tiers?
  • ron SERLists.com

    I do to start off. When I see myself closing in, I usually turn on a 3rd tier to have a strong gallop to the finish line. But I have gotten to the top with just two tiers.

    Some keywords need more juice. The best thing to do is hold something in reserve until you actually need it. Otherwise you are going full force at everything, and that's when you tend to get spanked. 

  • OK just so I'm clear about the indexer options guys?
    Full=checked
    Deep link= checked
    Is this how you do it @ron and @darman82? Thanks guys
  • ron SERLists.com
    Full unchecked. Deep checked.
  • Aha! thanks @ron. You are indeed one of the most helpful fellows I have come across
  • OK, so if I uncheck full it goes to "quick". So "quick" and "deep" are both checked, correct?
  • @ron full & deep both checked gives me approx 450 df links to each URL. Quick & deep both checked only gives me 50 odd
  • hey @ron, do you put a custom link in the article body, or the GSA default link?
  • ron SERLists.com

    @seagul - Just check deep. I need to test it but I am pretty sure that is the solution.

    @rodol - I use <a href="%url%">%anchor_text%</a>. It is randomly inserted in spin syntax articles from KM. I don't use any other tokens.
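    The macro above is the stock GSA link token. As a rough illustration of what "randomly inserted in spin syntax articles" means, here is a toy version of the insertion step - a made-up sketch, not KM's actual algorithm:

```python
import random

# GSA SER replaces these tokens with the target URL and a chosen anchor text.
LINK_TOKEN = '<a href="%url%">%anchor_text%</a>'

def insert_link_token(article, rng=None):
    """Append the link token to one randomly chosen sentence, so every
    rendered spin carries exactly one in-context link."""
    rng = rng or random.Random()
    sentences = [s.strip() for s in article.split(".") if s.strip()]
    i = rng.randrange(len(sentences))
    sentences[i] += " " + LINK_TOKEN
    return ". ".join(sentences) + "."
```

    KM goes one step further than this sketch: it puts the token inside every variant of a single spun sentence, so the link survives no matter which spin is rendered.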

     

  • Ron - no other links to Wikipedia or authority domains, so the only link is to your moneysite?
  • @Ron it will select full indexer mode when you select "only sites that index deep links". I can't get it to just select the one. It's always both...
  • ron SERLists.com
    edited February 2013

    @spunko2010 - You got it. Not saying that authority linkouts are a bad idea. At face value it seems logical. All I'm saying is that I have never used them and rank great.

    There are so many things out there that people keep repeating. Like you need high quality on tier 1, blah, blah, blah, which is complete and utter bs.

    If you really, really cared about an authority linkout and if it was so important, wouldn't you really want your moneysite to have one rather than a tier1? But I digress. 

  • ron SERLists.com
    @m1xf1 - Hmmmm. I will give it a whirl tomorrow. In the middle of stuff now.
  • You have to select full and deep to run GSA SEO Indexer with the deep-link indexing sites...
  • hey @ron, so I believe you make sure to insert your token in a non-spun sentence?
  • ron SERLists.com
    @rodol - I don't do anything - KM does it automatically. It sticks the token in something like 20-25 sentences that are all the same sentence spin. So no matter which spin is rendered, it is always in that unique spin automatically. So I never insert anything anywhere. All automatic.
  • Ooh ok, thanks.
  • spunko2010 Isle of Man
    edited February 2013
    @ron understood, but what if Google adds that to their algorithm soon? I read it was already in it, but maybe that was b/s. Surely worth doing it on a few to spread the risk though?


    Also, wish ACW had that feature! Would save me 10 mins per day.  Looks like they do.
  • ron SERLists.com
    Don't believe what you read spunko. You can randomize placement too. There are no issues, trust me.
  • @ron do you uncheck nofollow platforms?
  • ron SERLists.com
    No, you want both. Otherwise it's unnatural.
  • Hi Ron

    How many keywords do you use for your Tier 1, and do you use general keywords in Tier 1?
  • I have another question: are you guys using character spinning for tier 1 content?
  • ron SERLists.com

    @shan - I was using about 1,000 on T1 and recently upped that to about 5,000. The key is finding nice niche words - there are only so many. And yes, on T2 and beyond, I use about 10,000 random generic. I might test 100,000 just to see if it doesn't cause memory issues.

    @traged - I don't use it but that doesn't mean it's a bad idea. Many swear by it. I'm sure you are completely safe using it. 
