
Penguin Update and T1 Anchortexts


Comments

  • AlexRAlexR Cape Town
    @spunko2010 - what do you mean? Does anchor % ratio impact number of required links?
  • spunko2010spunko2010 Isle of Man
    edited October 2013
    @AlexR well it was actually a sweeping generalisation and maybe a bit misleading. But generally of course, for very competitive niches more links is better than fewer.

    I use variable anchor texts. Let's just say my KW is "Pratik SEO", I would have:

    30% Generic
    30% URL
    15% Secondary  - {view all Pratik Seo|Pratik SEO 2013|click to go to Pratik SEO|Pratik SEO tips and tricks|compare Pratik SEO|open Pratik SEO site here}
    Main Anchors  -  {Pratik SEO|PratikSEO|click here to visit Pratik SEO|Pratik|about Pratik SEO}

    Something like that.


    I have spammed at 40% of the same anchor text keyword (just 2 words) and done very well, though. So I don't think there is any "magic" number to avoid. And I bought some crap off the BHW forums where the seller spammed at 60%... but it has good reviews. Will wait and see.
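    The {option|option} groups above are spintax: each time a link is built, one option from each group is picked at random. A minimal sketch in Python of how such a group might be expanded (the `expand_spintax` helper is hypothetical, for illustration only, not part of GSA SER):

    ```python
    import random
    import re

    def expand_spintax(text):
        """Replace each {a|b|c} group with one randomly chosen option."""
        pattern = re.compile(r"\{([^{}]*)\}")
        while True:
            m = pattern.search(text)
            if not m:
                return text
            choice = random.choice(m.group(1).split("|"))
            text = text[:m.start()] + choice + text[m.end():]

    main = "{Pratik SEO|PratikSEO|click here to visit Pratik SEO|Pratik|about Pratik SEO}"
    print(expand_spintax(main))  # one of the five anchors, chosen at random
    ```

    Calling this once per link, with the group chosen according to the percentage buckets above, reproduces the variable-anchor approach described in the post.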
  • @spunko2010 Thanks for the advice! :)

    I'm always confused by anchor text %, so hopefully my confusion is cleared up now. Let me just test whether I've got it right:

    Say I want 15% for my main anchor text, then let's say I set:

    LSI: 25%
    Secondary Anchor text: 30%
    Some other: 30%

    = total ratio = 85%

    Then my main anchor text should have 15% weighting, right or wrong?

    Cheers.
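    The arithmetic above checks out: if the listed categories sum to 85%, the main anchor is left with the remaining 15%. A quick sketch (category names and percentages are the poster's example):

    ```python
    # Anchor-text ratios from the example above; the main anchor gets the remainder.
    ratios = {"LSI": 25, "Secondary": 30, "Some other": 30}
    main_anchor_pct = 100 - sum(ratios.values())
    print(main_anchor_pct)  # 15
    ```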
  • ronron SERLists.com

    There aren't any answers on any of this. On a few of my serious sites I am doing branding + naked URL = 99%. That means only 1% for all anchor phrases, and my actual keyword will only have a few links. The rest are LSI. Your on-page SEO tells Google clearly what your site is about. It may be slower, but I don't see how I can get penalized.

    On regular affiliate sites, I am trying two methods: 1) just contextual tiers, no junk tiers, with 3% anchor text for primary keyword; 2) Pure junk and contextual all in one tier with a high % anchor text (30%++).

    Anything can work in the short term, especially spam. My goal is to see if I can get a site to make it past a few algo updates.

    I don't know any answers. No one does. All you can do is test. If it is a serious site, then play it conservative. If it is a standard-issue affiliate site, then I would go for the fast cash, and rinse and repeat.

  • spunko2010spunko2010 Isle of Man
    edited October 2013
    @Pratik as @ron says, there is no specific answer. Personally I would not use 15% main anchor, 30% secondary and 25% LSI. I think it's too high. And you really need more URLs in there - look at natural link profiles on Ahrefs.com; some of them have 70% URLs.

     A lot of people here and elsewhere suggest 20-40% generic. If I was being ultra conservative with a money site that I wanted to last for months+ then I would go for 40% generic, 30% URL, and the rest is up to you. You don't need a high % of main anchors in some niches. I ranked one site (pre-Penguin 2.0) with 10x Web 2.0 blogs and only 1 had the KW in the anchor, so 10%. It wasn't that competitive, but you couldn't do this in a lot of niches. Just try out different approaches; it's trial and error... and even doing the exact same thing twice can result in wildly different results, as others here have said. I'm pretty sure Google is using some randomization too, so that x sometimes doesn't equal y. There probably is no magic number or answer, and they designed it that way.

  • @spunko2010 @ron I was merely stating that as an example, as it was never clear to me how anchor text % is counted in GSA SER. For example, I set it up so that the main anchor should be only around 10%, but it was 23% as per the Ahrefs report. So I wanted advice on configuring those settings so I can control the % of my main anchor text. I wasn't asking about where the safe line is, as I know only Google knows that; perhaps the most guarded secret on this earth, even more so than NASA lol.
  • spunko2010spunko2010 Isle of Man
    edited October 2013
    Oh, I see. Well, I get similarly 'inaccurate' reports on anchor text on Ahrefs.com. I assume it doesn't count places like microblogs, where it's always going to be the naked URL only, profile pages, etc.
  • ronron SERLists.com

    Ahrefs and Majestic are estimated to see only about 40% of all links, which is not a great way to take an accurate measurement. I love those services, especially Ahrefs, but for what you are doing, it isn't good enough.

    I use the SER verified report to give me what I need. It keeps track of all the links you built, plus you re-verify to remove the dead links. You can't get much more accurate than that. Of course, you may have more indexed pages with a particular anchor, but that should smooth out in larger numbers.

    I think it was @sweeppicker that loves Backlink Monitor. If I am not mistaken, you load up all the links you verified, and it keeps track of those plus indexing. So that product may have a better answer, but you would have to ask a user whether you can get a breakdown of anchors for only indexed pages.

  • 2Take22Take2 UK
    edited October 2013
    @ron, You probably already know this, but you can also use Scrapebox to output the data in the way that you mention.

    First you use the index-checking function on your URL list, and then run the indexed URLs through the backlink checker tool to get a nice spreadsheet with a list of live, indexed links and their corresponding anchor text.
  • ronron SERLists.com
    Very nice. I never thought about it even though I use SB for index checks, lol. Makes sense.
  • @pratik are you using only 1 keyword on main anchor?


  • edited October 2013
    This Penguin 2.1 pretty much made my site go from page 1 to page 30+ for most of the keywords I used as anchor text in my backlinks. Even though I tried not to use too many main anchor texts during my last year of link building, I'm thinking that there were still too many. I'm still ranking on page 1 for a few keywords that I didn't use as anchors, so this has definitely got me rethinking anchor text strategy. I've thought about trying to recover by creating thousands of new links with only "generic" anchors and thinning things out. Anyone have any luck trying that method to recover? Or is it too late for me to try this? Will G not let me rank on my main keywords ever again?
  • ronron SERLists.com
    I already tried the recovery deal going back to Penguin 1.0. I will never waste my time again; not so much the effort, but more the wasted time. You won't bring it back (those are the odds). I would just relaunch it on a new domain. You will quickly pass your current visitor count and get to where you were within a couple of months. I can't imagine taking another path.
  • @rodol Yes, only one main anchor text. Others are % based (i.e. LSI, etc).
  • goonergooner SERLists.com
    edited October 2013
    @mike - My experience is a little different to Ron's. I have been able to recover sites by thinning out the anchor text as you mentioned. However, guess what... all those sites were hit again in Penguin 2.1!

    So you can recover with a lot of patience, but once you are on Google's radar I believe they will just penalise you with every new update.

    I do agree with Ron about the new-domain method; I've found new sites can be ranked on page one in, let's say, 2-3 months, while recovering the old site might take up to 6 months.
  • grax1grax1 Professional SEO, UK | White Label SEO Provider
    edited October 2013
    Hi guys, I hope you don't mind if I join the discussion and ask one question. I recently read about finding target sites with your own footprints on some forum, probably BHW. Is it really necessary? I'm not able to create my own lists of footprints because it would be really time-consuming for someone with little experience.

    The argument was that when you use GSA's default footprints, for example for posting articles, your articles end up on sites that are spammed to death by the big community of GSA users, because a lot of them run the software on default settings and don't go deep enough to change footprints etc. I checked a lot of the article sites I posted to manually and there is so much spam. Yet a lot of people mention articles for tier 1 in their tutorials, and it's a contextual backlink so it shouldn't hurt if done right. The problem is: is it possible to get articles onto valuable websites that are not spammed, using GSA with default footprints? After checking a lot of article sites manually I decided not to include them in tier 1, so to be honest I'm now using only Web 2.0 blogs, and I'm afraid that's not enough to rank, even for my low-competition KWs.
  • Ron, for a site that you want to try and keep in the SERPs for a length of time, with Web 2.0s as tier 1, would you split the anchor text of tier 2 into URL / generic / LSI etc., or would you just use your money keywords? Would you do the same if tier 2 is split into HQ and KS?

    Thanks

    Ross
  • edited October 2013
    @ron @gooner thanks for the advice. I have to keep this 5 yr old site running due to the nature of the business so I figured anything to increase the rank is worth a try. It's on thousands of customers business cards, ads, correspondence, etc. I have also started work on another domain/site.
  • goonergooner SERLists.com
    @mike - I have to keep working on client sites for the same reasons you mentioned, but I try to convince them to use a new site if the original is penalised. You can have one site on branded paperwork and another for SEO.

    @grax1 - All article, web 2.0 etc sites will be spammed to death as you say, but that's the nature of those sites - you can still rank well using them.

    With regards to scraping sites, I just let SER do its thing. I don't want to mess around with GScraper or Scrapebox every day. I'm now getting 100,000+ verified links every day using only SER default footprints, and plenty of page one rankings too.

    So, I suggest you experiment with SER, run some projects, and see what rankings you get before you worry too much about footprints etc.

    Hope that helps.
  • ronron SERLists.com
    What @gooner said about having a site for branded traffic and another site for SEO is spot-on =D> One of the most intelligent pieces of advice ever given, especially for the way it was said.

    People (small website business owners) have all this hype about the brand, the name, blah blah blah. What matters is having a successful presence on the internet. It just may not be the site name of the mothership. Big deal. Get over it. Create an ass-kicking new site and have two sites. Win / win. 
  • 2Take22Take2 UK
    edited October 2013
    Agree one hundred percent with what @gooner and @ron are saying regarding branded sites.

    My branded site has minimal off-page SEO done on it these days; it's just not worth it to me, as re-branding would mean I would need to change all my company uniforms, paperwork, sign writing on the vehicles, etc.

    That said, my main site is 8 years old and any new pages that I create rank easily pretty much with just internal links, and a few authority links from a small network that I'm building.

    Although it's a bit more work, I tend to just churn and burn lead-gen sites these days to keep an additional steady flow of new clients, as it works out much cheaper than AdWords, although I might diversify and look for some affiliate opportunities as well.

  • grax1grax1 Professional SEO, UK | White Label SEO Provider
    edited October 2013
    @gooner thank you for your advice, so I'll try to play a little with articles to get the best quality sites. 
  • I think it's also very important to talk about link velocity. For me, link velocity is the main key. Some people here stated that they spam a brand new domain to death and they rank(ed), but it never happened to me. And believe me, I have ranked hundreds of websites. When it comes to a tool like SER, you can really kill a domain in one day if you do not know how to use it. About link velocity, I have a question for @gooner: how many verified (submitted?) links do you build per day on a brand new domain? Have you ever tried to rank aged domains? I know @ron advises building links slowly and that works for me, but it's very, very slow, and in my opinion you also have to have related links etc. Very curious about the answer, also considering my websites' life in the top 10 is becoming shorter and shorter...
  • goonergooner SERLists.com
    Hello @theguruland - You're right, link velocity is very important.

    To answer your questions, I usually go for about 50 submitted per day on a new domain. I do use aged domains too, and usually start them off at about 80 submitted per day.

    But in most cases I use other links too - high-PR links from my blog network and manual Web 2.0s.

    I've managed to rank easy KWs with SER alone and it worked fine; not sure if it would for more difficult KWs. But seeing as I already have the high-PR sites/Web 2.0s, it makes sense to use them, for link diversity if nothing else.

    With regards to the lifespans of sites, in the last update only sites that were Penguin'ed before were hit again this time. So, fingers crossed, it's working well for me.
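    The velocity figures above (about 50 submitted per day on a new domain, about 80 per day on an aged one) can be sketched as a simple cap rule. This is illustrative only: the function and the 180-day ageing threshold are assumptions, not settings SER exposes.

    ```python
    def daily_submission_cap(domain_age_days: int) -> int:
        """Hypothetical daily submission cap based on the figures in this thread:
        ~50/day for a brand new domain, ~80/day once the domain is aged.
        The 180-day ageing threshold is an illustrative assumption."""
        return 80 if domain_age_days >= 180 else 50

    print(daily_submission_cap(0))    # 50 - brand new domain
    print(daily_submission_cap(365))  # 80 - aged domain
    ```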


  • @gooner, 50 new submissions per day for a brand new blog is a huge number :) That's my opinion. I got slapped even with 20 or fewer. Still, what platforms do you use?
  • goonergooner SERLists.com
    @theguruland - Well, I figure 50 submitted = around 30 verified, and they don't all get indexed.
    I'm not using a high-performance indexing service like the two advertised on this forum.
    I have too many sites to submit all of them to those indexing services, as it would cost a fortune, so I just use Lindexed and build more links instead.

    Tiers 1 and 2 are contextual only.
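    The funnel described above (50 submitted, roughly 30 verified, fewer indexed) implies about a 60% verification rate. A rough back-of-the-envelope sketch; the 60% figure is inferred from the numbers in this post, and the 50% index rate is a placeholder assumption, not a measured value:

    ```python
    def expected_links(submitted: int,
                       verify_rate: float = 0.6,
                       index_rate: float = 0.5) -> tuple:
        """Estimate verified and indexed counts from submissions.
        verify_rate ~0.6 is inferred from '50 submitted = around 30 verified';
        index_rate 0.5 is an illustrative placeholder."""
        verified = round(submitted * verify_rate)
        indexed = round(verified * index_rate)
        return verified, indexed

    print(expected_links(50))  # (30, 15)
    ```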