
Is Penguin Released


Comments

  • spunko2010 Isle of Man
    @Alex make social signals, as in buy them? :) Or only 100% natural social signals?
  • @spunko2010 It's up to you, I don't know. If your content is "delicious", try some Social Locker, set it as "This content is locked, share to have it!", and inside it you could put something even if you don't really need it locked... Humans are curious ;)
    Or try AMF, YLH or some 5rr gigs, it's up to you. At least this method worked for Penguin 1.0; people recovered their ranks in a few days. I don't know if it still works in Penguin 2.0.
  • AlexR Cape Town
    @jigsaw - thanks! I'm just reviewing some ratios. What are your thoughts on the ratio of:
    1) Brandname
    2) Generic
    3) Domain URL
    4) Anchors
    5) Home/Deep link

    In an ideal backlink profile?

    There just seems to be a lot of guessing around these, and I'd like to hear your thoughts, and anyone else's. I might even start a separate discussion thread on them.
  • spunko2010 Isle of Man
    edited May 2013
    @AlexR well, let me tell you what I was doing, because I was hit harder than most by Google, so maybe here's a guide in what not to do (?):


    Generic - 20%
    Domain URL - 15%
    Anchors - I use 3 anchors per project, so a roughly even three-way split of 65%...
    Home/Deep link - 0-5% were home links, the rest were deep links


    This was on T1. For T2 I was doing 60% generic and 35% domain URLs; the remaining 5% was keywords. I also checked my T1 links today and there are loads of Wikis, even though I didn't select them that often, so I think that's an issue for me. I will rarely if ever point another Wiki at my MS :)
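
    As a sanity check on ratios like these, here's a minimal Python sketch (the numbers are the figures from this post; the category names are illustrative, not GSA SER settings) that turns such a split into a per-anchor share:

    # Minimal sketch: turn a backlink-profile split like the one above into
    # per-anchor percentages. Numbers come from this post.
    profile = {
        "generic": 20.0,          # 20% generic anchors
        "domain_url": 15.0,       # 15% naked domain URLs
        "keyword_anchors": 65.0,  # 65% shared by the keyword anchors
    }
    num_keyword_anchors = 3       # "3 anchors per project so a 33% split of 65%"

    per_anchor = profile["keyword_anchors"] / num_keyword_anchors
    print(f"each keyword anchor: {per_anchor:.1f}% of all links")  # ~21.7%
    assert sum(profile.values()) == 100.0, "splits should total 100%"

    So each of the three keyword anchors ends up at roughly 21.7% of the whole profile, well above the ~6% per key term suggested just below.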

  • >>> @spunko2010: Startrip, mind sharing what type of links you have pointing to your MS?

    It's article engines (yad, joomla, buddypress, articlescript and the like), social networks (phpfox, dolphin etc.), directories (aardvark topsites and some French thing that performs well), some microblogs and some social bookmarking (pligg, scuttle and a few others).

    Nothing special, and not a single Web 2.0.
  • ron SERLists.com
    edited May 2013

    @Spunko - Way too high on the key terms. I average 6% per key term. Not saying that's the model, but it's a good start, especially for longer-term strategies. Churn and burn is a different deal.

    But it's more than just that which affected sites. It never is just one thing.

  • This is the funny thing about this update. The churn-and-burn sites that I own all stayed and even moved up in positions. The ones that were more for the longer term, following what G says blah blah, were all penalized. Which kinda confirms what I've always thought, and that is to do the exact opposite of whatever G says.

    I do about 70% churn and burn and 30% longer term.
  • spunko2010 Isle of Man
    edited May 2013
    @ron thanks... I'm only doing 15 or so verifieds a day at most with those numbers; not sure if that makes a difference on the key terms? I've turned it down now.

    I think the main issue for me was that the homepage:inner link ratio was suspicious. I'm trying to get it to 25:75 now. I just hope it's possible to recover lost rank and there isn't some sort of X mark hanging over my site now.
  • ron SERLists.com

    @spunko2010 - When a page suffers a big penalty (like -100, -200, -300), it's typically death for that page. Typically. But not always.

    In Penguin 1.0, for example, some people were able to recover by diluting anchor text. Those people had very few links, so it was fixable. Plus they knew exactly what the issue was, so they could take action by diluting anchors. Most others built tens of thousands of links with the same anchor, and there was no way of recovering.

    If you can't figure out the problem, then it is almost impossible to undo the problem. This algo had many aspects to it, and who knows, they may have run a couple of other updates concurrently just to throw everybody off. I know I would do that.

    All you can do is make logical adjustments based on the info you have, and then start working on other sites.

    @Hunar - What's funny is I commented earlier that the viable timeline for a website seems to be getting shorter and shorter. If it gets any shorter all my sites will be considered churn and burn, lol.

  • LeeG Eating your first bourne
    What percentages of keywords, generics and bare URLs are you guys running on T2 and T3?
  • Please, guys, stop taking any action now. It's too early for this. The full effect hasn't played out yet. Most of my main keywords went from page 1 to page 5/6/7.

    After a few hours some of them went back to page 2 and some did not. Just wait till the update is complete, then start analyzing the penalty and take action based on that.
  • spunko2010 Isle of Man
    edited May 2013
    @LeeG for T2:


    60% Generic
    30% URL
    10% KW

    (Approx with some variation)

    For T3:
    58% Generic
    40% URL
    2% Keyword


    Anyone think that's related? As @ron says, I've changed the T2 to increase the KW percentage. No idea what to do about T3. I had one of my first ever T3s at 33% KW and that KW ranked on page 1...

    Now running my T2s at 25% KW
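
    To see how splits like these behave in practice, here's a small illustrative Python sketch (hypothetical, not a GSA SER feature) that samples anchor types in proportion to the T2 weights above:

    import random

    # Hypothetical sketch: pick an anchor type per link according to the
    # T2 split quoted above (60% generic / 30% URL / 10% keyword).
    T2_WEIGHTS = {"generic": 60, "url": 30, "keyword": 10}

    def pick_anchor_type(weights):
        """Return one anchor type, sampled in proportion to its weight."""
        types, w = zip(*weights.items())
        return random.choices(types, weights=w, k=1)[0]

    # Simulate 1,000 T2 links; the realised mix should track the target.
    sample = [pick_anchor_type(T2_WEIGHTS) for _ in range(1000)]
    for t in T2_WEIGHTS:
        print(t, round(100 * sample.count(t) / len(sample), 1), "%")

    At only 15 or so verifieds a day, the realised percentages will wander a long way from the targets for weeks, which is worth keeping in mind before reading too much into exact ratios.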

    @mamadou with respect, who are you to tell us not to analyze our own SERP rankings? I spend 2-3 hours a day on this and it's something I'm keen to get to the bottom of.
  • LeeG Eating your first bourne
    Top man, cheers for that
  • edited May 2013
    @spunko2010

    I didn't say stop analyzing your SERPs! I analyze my SERP rankings at least twice a day.

    I'm just saying stop taking action now, because you don't know which keywords are really penalized and which are just having a "Google dance" because of the new update. As I said, some of my keywords went off to page 5/6 on the 23rd and then recovered.

    Just wait till you really know which keywords are penalized and which will settle at better rankings. Then you can make wise decisions based on facts, not random decisions based on "Google dance"-like behavior.
  • spunko2010 Isle of Man
    @mamadou well, I saw a big drop on 20th May and it's dropped every day since; yesterday was my worst day. So I hope it improves and dances back up, but I doubt it.
  • We got hit a bit on a few sites. Not too badly this round, but enough that we have to scramble to get back where we were. It looks like they hit a lot of the dormant links from locations with little social value.

    My primarily-SER site projects have mixed results. Where we did a lot of diverse-content 2.0s (again only doing 10-15 Tier 1 2.0s per project, extremely well spun) and then added the "Ron"-formatted tiers, we are fine. In fact those are still creeping up nicely. Where we did some flat linking, our pages dropped off a lot more. So it looks like tiers are good right now, even if they are a lot more work.

    Where do we see the best results over the last few days? Pages where we used SER AND pushed a lot of social juice to the pages.

    If you think about it, it makes perfect sense, and it's the right thing for Google to do: continue to crush the spammy linking and add value to engagement. No longer one-way; the value is in a conversation (engagement).

    My opinion is that SEO now is about marketing to get engagement, not marketing for Google. 

    I agree with jigsaw. We need to be doing a broad range of activity, not only to limit exposure in the future, but because that is the expected activity level.
  • ron SERLists.com
    edited May 2013

    I don't want to set a path that people may take as set in stone.

    I have a tendency to pick 10 great keywords as my primary anchors, and I treat them equally for link building. By the time I dilute with secondary, naked URL and generic anchors, I end up at a lower value like 6% per keyword, or sometimes lower.

    That said, I have plenty of examples where I am 2X-3X that level successfully. So please take everything in context. And make sure you look at your competitors too - that is always instructive.
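
    To make that dilution arithmetic concrete, here's a short worked Python sketch (the 60% keyword share and the dilution split are assumptions for illustration, not ron's actual settings):

    # Worked example of the dilution math above (illustrative numbers):
    # 10 primary keywords share the keyword-anchor slice of the profile;
    # secondary, naked-URL and generic anchors make up the rest.
    primary_keywords = 10
    keyword_share = 60.0  # assumed: 60% of links carry a primary-keyword anchor
    dilution = {"secondary": 15.0, "naked_url": 15.0, "generic": 10.0}

    per_keyword = keyword_share / primary_keywords
    print(f"each primary keyword: {per_keyword:.1f}% of all links")  # 6.0%
    assert keyword_share + sum(dilution.values()) == 100.0

    Under those assumptions each primary keyword lands at the ~6% figure; raise the keyword share and you get the 2X-3X levels mentioned above.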

  • spunko2010 Isle of Man
    edited May 2013
    @viking can you please clarify what you mean here: 'Where do we see the best results over the last few days? Pages where we used SER AND pushed a lot of social juice to the pages.' By social juice, do you mean natural or purchased, i.e. are your visitors just clicking share, or are you doing something else?

    I just struggle with all this social stuff. My site offers something new but I just can't seem to get people to talk about it. :)

    Also, how often are you adding more T1 links after the initial 10-15? Or are you not adding any at all once they're done?
  • @alexr here's what I did on my main page, which has now gone from zero to hero:
    T1:
    3 main KWs, 2 secondary with 10%, 20% generic and 20% URL.
    On T2 I tried something special: 10% generic, 10% URL and 80% LSI keywords for my main KW (170 different ones).
    On T3 (which I personally direct at both T1 and T2) I ticked "use anchor of verified URL".

    The page really crept up for weeks and has now hit page 1 (spots 3-6) for the main & secondary KWs.

    So maybe LSI is your best bet.

    I'm on my mobile, so forgive the shitty formatting ;)



  • So it looks like Ron's tiered diagram is still a "safe bet" post-Penguin 2.0, so far? All my projects follow that diagram with only 10 links per day on Tier 1, but I have some projects using the same tiered diagram that I've made inactive for the time being, as they point links at 2 of my main money sites that are basically my livelihood for now.

    Can others validate Ron's tiered linking strategy to put my mind at ease?
  • spunko2010 Isle of Man
    edited May 2013
    @glennf I used tiered linking, although I stuck to regular T1 < T2 < T3, not with T1A, T2A etc. My site has dropped to page 2 for several key KWs. It's a setback of about 1 month. I haven't been doing SEO for that long, so that could be a factor. Maybe you guys who have been doing tiered linking for longer have more links, so your sites can hold up better; I guess that's one reason.
  • ron SERLists.com

    It's not the tiered linking. Almost everything I have done with tiers is just fine. There are many ways to do it, but they all boil down to link juice anyway.

    Changes in algorithms can cause changes in how certain types of links are valued. So you need to experiment with things like that, and in order to do that, you really need a bunch of sites. It is profoundly difficult to test things if all you have is one website.

    The other thing you have to really pay attention to is link-loss. Rising in rankings is heavily dependent on a positive acceleration in total links. If you don't understand this concept, I would strongly advise doing some homework.
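
    If the acceleration point is unclear, here's a minimal illustrative Python sketch (the snapshot numbers are made up): track total live links over time, then look at the net gain per period and whether that gain is itself growing.

    # Illustrative only: weekly snapshots of a page's total live backlinks.
    # "Velocity" is the net change per week; the point above is that rankings
    # reward velocity that is itself increasing (positive acceleration).
    snapshots = [1200, 1260, 1340, 1450, 1430]  # last week shows net link loss

    velocity = [b - a for a, b in zip(snapshots, snapshots[1:])]
    acceleration = [b - a for a, b in zip(velocity, velocity[1:])]

    print("weekly net gain:", velocity)     # [60, 80, 110, -20]
    print("change in gain:", acceleration)  # [20, 30, -130]
    # A negative tail means link loss is outpacing new links: build more.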

  • spunko2010 Isle of Man
    @ron ahrefs.com shows that I have 2 links pointing to my main KW page, even though GSA says I have 70 verifieds. That's kind of strange; I just re-verified them, so I can't see major link loss there.

    If I had to hazard a guess at what caused my issues it would be either some or all of these:

    - I built too many web 2.0 sites on the same domain, e.g. *.wordpress.com
    - There weren't enough links pointing to my homepage.
    - I didn't vary my anchors enough for T1, and I varied them too much on T2 & T3.
    - I used ALL search engines as opposed to 10 Google ones. I think some of those are probably prone to returning blacklisted sites.
  • Ahrefs won't display all the links (no tool will). There's a study on which tools show the best index coverage:


    If you build your own links, i.e. using GSA, you will always have the complete blueprint of your website's link structure.

    Check which links Google has indexed using a tool like Backlink Monitor with free proxies scraped by GSA.
  • Backlink tools can never be more than a control sample. For Ahrefs you can take the number of links shown, multiply by 3, and you have a vague number. For SeoMOZ it's x4...
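
    Taken literally, that heuristic is just a per-tool multiplier. A minimal Python sketch (the x3/x4 factors are this post's rules of thumb, not measured constants):

    # Rough sketch of the rule of thumb above: backlink tools only show a
    # sample, so scale the reported count by a per-tool fudge factor.
    TOOL_FACTORS = {"ahrefs": 3, "seomoz": 4}

    def estimate_total_links(reported, tool):
        """Very rough estimate of the true link count from a tool's report."""
        return reported * TOOL_FACTORS[tool]

    print(estimate_total_links(2, "ahrefs"))  # 6 -- still far short of the 70
    # GSA verifieds mentioned above, hence "control sample" only.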

    Best Regards
  • ron SERLists.com

    I use it mainly to pay attention to my link loss/link gain metrics.

    For competitor analysis I use the usual metrics like anchors, etc.

    On my sites that were affected: I am seeing a strong rebound in my rankings on about half those sites, and very positive gains on the other half. My conclusion is that this is a devaluation, not a penalty. I'm just building more links. Period. 

  • @ron I agree on the devaluation side of things. My sites that were affected show ranking drops, not obliteration; if it were a penalty it would be -100 or more.
  • ron SERLists.com
    I definitely had some go -300 and come back. When Penguin 1.0 hit, one went down like -500, and then, to rub salt in the wound, it dropped another -20 every day for a week. Now that is a penalty. You don't gain back quickly when there is a penalty.
  • Something interesting: I've had some keywords (not sites) not appearing at all, and they haven't recovered yet. Too early to tell; I'm just keeping everything the same for now, reading a lot and waiting till more data comes out. Business as usual with link building.

    Terry Kyle did a good post just now.

  • spunko2010 Isle of Man
    @sonic81 OK, do you mind sharing what sort of link generation you were mostly doing to the sites that were affected?

    @ron glad to hear you are recovering. So far I'm still being whacked by Google; hopefully it will end soon. No warning in WMT though, and I'm still doing tiered linking.