
Is Anyone Getting Penalized For Tiered Link Building?

Has anyone noticed sites dropping out of the SERPs completely when using tiered link building?

I've noticed something odd: I've been building tier 1 at a slow pace but blasting tier 2 and tier 3, and a few of my experimental sites have been knocked out of the SERPs.

This is worrying me. I'm wondering if the tiered link building rollout that Matt Cutts said would come out in the summer is out yet? Has anyone noticed something similar?

Comments

  • Let me ask this another way: has anyone been able to rank a site post-Penguin 2.0 using tiered link building similar to Ron's method? Thanks.
  • Are you using the PR filters on your tier 1 - tier 3 campaigns? Penguin tends to not favor low PR on tier 1, so what I've found beneficial is using a higher PR for tier 1 with minimal outbound links and lower PR for your 2nd and 3rd tiers. Try keeping tier 1 strictly to web 2.0s and higher PR article directories; this works for me. Good luck! :)
  • Thanks chaoticsmooth. How high do you set your PR filter for tier 1? I just set mine to PR 2 with no more than 50 outbound links. For all my other tiers (1A, 2, 2A, 3, 3A) I just left the outbound link box unchecked and used no PR filter.
  • @sonic81 that is Google propaganda. How can they tell whether it's tiered or not? How can they tell the link you placed apart from the hundreds or thousands of others?

    @chaoticsmooth using only web 2.0s and high PR links is a sure way to create an easy-to-spot footprint. REAL websites get a variety of links from lots of different PR/quality/age/type sites.

    And I don't even want to start on your "Penguin tends to not favor low PR on tier 1" statement...
    :D
  • I am going off of personal experience, davbel. What works for someone may not work for someone else. Glennf, I set my PR filter to 3 on the 1st tier. All the sites I've been building with high PR on the 1st tier stuck after Penguin rolled through, so I must be doing something right :)
  • @davbel

    It's actually quite possible to do this algorithmically. Whether Google does it or not is another question, because it would require a lot of computing resources.

    Think about it: when you analyze the competition, you can quickly tell if it's a black hat tiered campaign. Just load up the website in ahrefs.com and look at their link profile.

    For their money anchor text links, take each of them and analyze them again; if each one has 10 or so links pointing at it, it's likely a tiered campaign.

    If this can be done manually, it can be done by a computer (there's a rough sketch of what I mean at the bottom of this comment).

    Maybe I'm paranoid.

    Maybe my settings are wrong.

    GSA was working great for me, but one month post-Penguin I've been using the same strategies as before and the results seem slower and not as good.

    The other theory, which I have no way to verify without a lot of work, is that Google has a lot of spam site lists from their own research and also from user submissions.

    It's possible that my global list, built over the last year with GSA, contains such sites, and if I keep building links to them it may cause problems.

    The correct way to fix this would be to take all the global lists, run them through ScrapeBox, check which domains are still indexed, and delete the rest (the second sketch below covers the pruning step).

    But it's a tonne of manual work, and I'm trying to avoid it if I can!
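
    A rough sketch of the first idea, just the manual ahrefs check above turned into a script (this is not anything Google is confirmed to run). It assumes you've exported the competitor's backlinks to a CSV; the column names, anchor list, threshold, and file name are all hypothetical, so adjust them to whatever your export actually contains.

```python
# Rough sketch: flag likely tiered campaigns from a backlink export.
# Hypothetical CSV columns: referring_page, anchor, links_to_referring_page
# (the last column = how many backlinks the referring page itself has).
import csv

MONEY_ANCHORS = {"buy blue widgets", "cheap blue widgets"}  # your guess at their money terms
TIER2_THRESHOLD = 10  # a tier-1 page with this many links of its own looks "boosted"

def flag_tiered(export_path):
    suspects = []
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor"].strip().lower()
            tier2_links = int(row.get("links_to_referring_page") or 0)
            # A money-anchor link whose source page is itself heavily linked
            # matches the pattern described above: tier 1 blasted by tier 2/3.
            if anchor in MONEY_ANCHORS and tier2_links >= TIER2_THRESHOLD:
                suspects.append((row["referring_page"], tier2_links))
    return suspects

if __name__ == "__main__":
    for url, count in flag_tiered("competitor_backlinks.csv"):
        print(f"{url} looks tiered ({count} links pointing at it)")
```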
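
    And a sketch of the pruning step for the global list. It assumes the global list is a plain text file of URLs (with http:// or https:// prefixes) and that ScrapeBox, or whatever index checker you use, has already given you a text file of the domains it found indexed; the file names here are made up.

```python
# Rough sketch: keep only global-list URLs whose domain is in the indexed list.
# Assumes: gsa_global_list.txt = one URL per line,
#          scrapebox_indexed_domains.txt = one bare domain per line.
from urllib.parse import urlparse

def strip_www(domain):
    return domain[4:] if domain.startswith("www.") else domain

def load_indexed_domains(path):
    with open(path, encoding="utf-8", errors="ignore") as f:
        return {strip_www(line.strip().lower()) for line in f if line.strip()}

def prune_global_list(list_path, indexed, out_path):
    kept = dropped = 0
    with open(list_path, encoding="utf-8", errors="ignore") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            url = line.strip()
            if not url:
                continue
            domain = strip_www(urlparse(url).netloc.lower())
            if domain in indexed:
                dst.write(url + "\n")
                kept += 1
            else:
                dropped += 1  # deindexed (or never indexed) -> likely a flagged spam domain
    print(f"kept {kept}, dropped {dropped}")

if __name__ == "__main__":
    indexed = load_indexed_domains("scrapebox_indexed_domains.txt")
    prune_global_list("gsa_global_list.txt", indexed, "gsa_global_list_clean.txt")
```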