
Feb 2017 G Algo Update

I think most are aware of a G update that rolled out the first week of this month. It turns out this may have been a significant update whose effects are only now being felt; in a number of forums there are widespread reports of falling traffic. I don't pay close attention to my MS's GA account, but I just did a deep dive and am now concerned that I have been affected as well. My traffic is off 30%. My traffic consists of PBN referrals, which have held steady, and web 2.0s, which I use to hide my PBN. The web 2.0 traffic is usually not a big thing; some deliver and some don't, and I use them for link juice too. It looks like that traffic dropped off quite a bit from the accounts that had live visitors. Behind the 2.0s I use SER.

So, this would not be a huge concern if my rankings had held.

Checked the SERP rankings, and sure enough more than a few of my KWs have been hit and dropped quite a bit.

So, putting the pieces together, either the 2.0s went tits-up on me (I use SERE to create and post) or the SER layer exposed them somehow. I am pretty conservative with SER and keep the quality high there.

The bigger picture: G seems to be getting better at detecting these types of networks. Staying ahead of the curve will require the next phase of thinking. Remember infographics and how easy it was to rank with them? Would anyone ever point them at an MS these days? Nope. Things change, and we have to stay ahead of the change.

Would love to hear of anyone else's experience this week and whether you have determined what the trigger could be. I'll report back with any other findings.

Comments

  • Hey @viking - pretty sure it's the web 2.0s, based on what you're saying and what I've seen personally. Is it possible that a bunch of them have been deleted or deindexed for one reason or another?
  • @redrays hey there. You dodged another big storm back home :-)

    I thought that too, but the page drops were to positions that were worse than before the 2.0's were added indicating some sort of a penalty rather than loss of link juice. At least that is my thinking. I'll comb through the 2.0's though and see if I am missing something. Thanks!
  • @viking - so I heard. It's been 65 here (though mostly cloudy) the past few weeks again :D

    Anyway, hope what you're seeing is just fluctuations from link loss and not a full blown penalty. Please keep us updated if you're able to figure out what's going on.
  • shaun https://www.youtube.com/ShaunMarrs
    One of my new sites has been hit; on the flip side, though, I have a C&B that's just climbing.
  • @shaun that's interesting!
  • spunko2010 Isle of Man
    edited February 2017
    301 it to a virtually identical domain and move on.  :)

    PS: One thing I've noticed these past few months is that Google is taking a lot longer to index 301 domains. It used to be <24 hours but now can take up to 5 days.
  • @spunko2010 I hear you. But I cannot, as the MS is a branded site.

    @shaun huh. The C+B uses different link types or just blasting articles?

    Yesterday I had a KW disappear that was top 20 and had zero seo activity for the page. 

    Scratching my head here trying to find a clue as to what the algo is catching.
  • edited February 2017
    OK, here is an article that may shed some light. It hints at PBN detection or possibly spammy link devaluation, and it confirms an update hit the first week of February. If this is a new ability for G to detect PBNs, I am screwed. It seems like some are affected while others are not, so perhaps the platform has something to do with it.

  • shaun https://www.youtube.com/ShaunMarrs
    @viking articles, forums, social networks and wikis. Blasted 100,000 at it and just left it to see how it's going.
  • @viking - what kind of keyword is it? I've seen this kind of behavior on a few local keywords I track (think: "widgets in New York"). My rankings have always bounced back eventually.
  • @redrays not a local KW; it is with the big boys. The affected KWs range from 900k to 11M results on the SERP page. I was doing really well (page 1/2), but I did something to muck it up and I am sliding across all my KWs now. I stopped the SERE projects and am actually thinking of diluting things with a quick flash of SER linkage on high quality settings. Not panicking yet, but definitely concerned. I have a hunch it is SERE related, as I created thousands of 2.0 links for various projects. I use a catchall email, but maybe the domain used is an issue, as opposed to using 1000 mail.ru accounts, for example?
  • @viking - yea, I can see that leading to a mass ban if you registered enough accounts. Are your web 2.0s dead?
  • @redrays nope. think I should pull them down? 
  • @viking - I wouldn't. I don't think anyone except for the owner of the Web 2 has access to your email address. So it's a footprint that puts you at risk for getting your accounts deleted but not something Google will figure out.
  • @redrays thx. I will ride it out a bit and see what happens. 
  • @viking - no worries, that's the course I would take :)
  • Well, you know those moments when you suddenly realize how stupid you can be? I had one this weekend. While I have been fiddling off-site with various SEO techniques, I left an employee to manage the on-site stuff. After the Feb 2017 algo changes and the sudden plunge of my pages, I have been obsessed with finding out how this could have happened to what I thought was a solid off-page setup.

    I was doing some backend updates, stumbled across a script, and stopped in my tracks. The site had the GA script on it. At some point, someone had installed it to better analyze customer activity. Suddenly it all became clear: we had let the fox into the hen house. G was able not only to see our traffic with detailed accuracy, but also to analyze the incoming links and possibly expose my PBN layer.

    I was gutted over this discovery. The first thing I did was remove all the scripts and snippets I could find. Then I re-wrote the content on the pages that had suffered the most in the update.

    After four days, rankings are back on the way up. One page that disappeared from the index is now at #30.

    Lesson learned: avoid giving G any opportunity to get a bead on my site. Once they can read your actual data, they will absolutely rank you in the index against the other data they hold. When they don't have that data, they can only judge what they can see, which is a link profile, measured against peer sites with similar link profiles. So it's not so much a 'penalty' as being placed where that data puts you in a relative sense. If you think about it, it makes perfect sense.
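Since the fix above boiled down to hunting for stray GA snippets by hand, here is a minimal sketch of how that audit could be scripted. The patterns are my own illustrative guesses at common GA/gtag fingerprints, not an exhaustive list, and `find_trackers` is a hypothetical helper name.

```python
import re

# Illustrative fingerprints of common Google tracking snippets (classic
# analytics.js / ga.js loaders, gtag.js, and the GA bootstrap object).
# Assumption: this list is a sketch, not a complete inventory.
TRACKER_PATTERNS = [
    r"google-analytics\.com/analytics\.js",
    r"google-analytics\.com/ga\.js",
    r"googletagmanager\.com/gtag/js",
    r"GoogleAnalyticsObject",
]

def find_trackers(html):
    """Return every pattern that matches somewhere in the page source."""
    return [p for p in TRACKER_PATTERNS if re.search(p, html)]

# Example: a page still carrying the classic GA loader gets flagged.
page = '<script src="https://www.google-analytics.com/analytics.js"></script>'
print(find_trackers(page))
```

Running this over each saved page of the MS would surface any leftover snippet without eyeballing every template by hand.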

  • That's very interesting @viking. I've used Piwik in the past to gain insight without letting G in the door, but flip-flopped last year when I figured that not having analytics on a bunch of websites is something of a footprint in itself. A bit like the guys who block ahrefs from crawling their sites; that's a red flag as well.

    But like you say, fox inside the hen house. I'm going to need a rethink. Thanks for the heads up.
  • @Johan yep. Hard to tell what to do these days. I think erring on the side of caution is a risk I will take. If G wants to consider it a red flag that I don't have the GA or WM script installed, then I will take that risk as opposed to letting them under the hood for a full-on look at what I have going on. I am going into hyper anti-G mode now and am stopping my use of Chrome and Gmail. Who knows what has been leaking to them over time.
    Right now I have almost all of my KW's improving in my rank tracker. I'll report back in another week or two on this as an update to report if my changes made a difference.
  • @viking - you said you updated the content on the pages that dropped? Perhaps that was the reason that they've jumped back up, and removing the GA code actually had nothing to do with it? 

  • @2Take2 good thought. So many moving pieces to this stuff. I only updated the pages that dropped out of the top 100 in my tracker; the pages that plunged less severely I basically left alone as a sort of control, so I could understand what was happening. As of this morning the results are good, but mixed. I had one KW at #4 on page 1, a SERP of about 400k, that just disappeared from the index. I never did any SEO on that page, and it is still missing, while other pages seem to have come back nicely. The rise in rankings covers both re-written content and untouched content, so I am not certain how to read the recovery just yet. I completely re-wrote the page for the KW that was at #4 and nothing has happened with it. Yet. Traffic is slightly better, but not in a meaningful way. It is too early to tell if any of this really had an effect, but I am watching it closely.

    Other than the anecdotal changes, I would love feedback from anyone on whether or not they are using GA and WM scripting on-site. I think it was a dumb error on my company's part to do so. Giving G real data to judge you by is very risky, unless of course you have absolute certainty that all of your measurable metrics are better than your competitors', and that the effort is sustainable. Letting G into your site to watch visitor traffic gives them a baseline to value your site and content that goes beyond what they can see from the outside. Of course the marketing types love GA. I do too; the in-depth analytical information is great. But the flip side is that G is cataloging your data precisely. In my company's case, I know we compete with global enterprises with armies of internal SEO/SEM employees, and it is highly likely we could never compete on real data. But I can certainly build a link profile every bit as good as theirs, which could allow us to compete.

    Will post any updates as they come in.  Good luck all.
  • @viking - I use GA / WM on sites I'm serious about and that I legitimately feel stand above the competition in the ways that you mention. I don't know how usage data are measured, but I've seen enough to believe it has some impact post-panda in certain niches. It sounds like you're in the kind of niche where it does matter.
  • @redrays right! Exactly my point. So you have domain expertise for your KW. Your metrics are better than your competitors. So 100% I would let GA and WM crawl my site all day long. It only adds more value and strength to a site's ranking. I agree with your position and would ride that for as long as I could.

    But I am not that lucky/good. I chose a market where I am competing against global giants (think Ford, GE, Whole Foods type competitors) with in-house staff. I know first-hand from a CDO friend at one of these places that I will never have a chance to rank against the hundreds/thousands of companies I compete with. No way. If we all allow GA inside for a peek, I lose on every single metric. My MS has great, original, unique content that I wrote. I have an excellent PBN layer. I have a decent tiered-up link profile. But as soon as G has a peek at traffic flow and engagement, I lose. It is the way it should be. No arguing that.

    This is why we are all here. Because we need leverage and need to operate in the shadows. Or at least that is why I am here. 

    So my point is that since I cannot possibly match my competitors' metrics, why would I prove that to GA/WM? Isn't it better to keep them guessing and possibly get the benefit of the doubt? As long as IE, Firefox, etc. exist, they cannot rely on Chrome data alone; in 2017 Chrome owns about 57% market share. The only way they can move you in a relative way is by actual data in GA/WM, or by link profile. I don't think it is a coincidence that the GA script was installed on my MS, we plunged like a flushed toilet on the next significant update, and then stabilized and started to rise when we yanked the script off the site. Maybe it is, but I can't find any other clue to go on right now.

    I may be in LV this summer and we'll have to compare notes! Until then thanks for all your great advice. I appreciate it. Will keep updating from my side.
  • @viking - I agree with you and think it's a clever solution on your part :) Frankly I wish I'd thought of it for a website I used to manage that had subpar engagement metrics, but live and learn.

    Hope you're able to make it out here!
  • Remember, gstatic fonts could give Google info too.
  • @content32 thanks for that! Do you know what info gstatic passes to G?

    Update on my strategy: the KW I had at #4 that disappeared from the SERP has now returned to the #4 position. Again, not conclusive by any means, but it looks like not allowing G access to your real data makes a difference for rankings. It could also just be settling from the last algo update, as that usually happens too. I have 5/20 KWs still off by 50-75 spots, so I will watch those for movement. The other variable here is that I have stopped all link building for the last week or so, and perhaps that plays into it as well.
  • @viking I heard about it on another forum, and it makes sense: gstatic is like Google Analytics in the sense that it is on every one of your pages. Users who visit your pages automatically call Google's servers to fetch the font, which lets Google gather data; for example, anything they would do with a Google Analytics cookie, such as IP, location and all that funky stuff.

    Also, nothing is ever "free" with Google; there is always a catch, and gathering user data is probably it with their fonts.
  • @content32 thanks so much for that! I was clueless to that leak.

    I wanted to post a Friday afternoon update on my side for those who may be interested.

    I lost 2 KWs in total that were top 10 before the Feb algo change. When I say lost, I mean dropped off the SERP completely. Both are in 1M+ SERPs.

    As stated a few days ago, I had stopped link building to sort things out, and my #4 KW re-appeared. So the immediate thought was one of two things: either I was leaking in my PBN + 2.0 layers, or the GA/WM data screwed me. Since I yanked out the GA/WM script and didn't touch my existing backlinks, the script smells more and more like the dirty diaper.

    So after that, since I had one of my two big KWs back, I decided to experiment with the missing one. On the 23rd, I booted up SER and started blasting contextual links directly at the page. Nothing crazy; a few hundred a day, high quality.

    Nothing happened until I checked SERPLab about an hour ago. The keyword is back! Not top 10, it is at #18, but it is back.

    My overall rankings are slightly off, but for the most part back where they were in January.

    What is my takeaway:
    1. I am absolutely convinced that having the GA/WM script on-site is a huge mistake for blackhat activity. Unless you are fully confident your data exceeds your peer group's, keep G out of your business. You will suffer for it.

    2. The 2.0s could be an accomplice here. My KWs returned after I stopped the 2.0s and diversified the link profile across a number of platforms.

    3. In my case, SER direct to the MS clearly made a difference and, with high quality settings, still works like a champ.

    4. Finally, I have no real clue what the issue was, just anecdotal evidence. G and SEO in 2017 are scary and tough. I will continue to focus on building new-domain PBNs and growing them over time as a way to protect my MS. I fully expect to continue to stumble through the year as G refines their algo. In fact, I am sure they monitor forums like these to see what we are up to. I would if I were them.

    With that, I am off to a case of cold ones to celebrate. Good luck all.
  • Thanks @viking, good info there, glad you got it on its way back! Yeah, I never use GA/WM for those specific reasons. Never knew that about fonts though, cheeky sods xD

    I agree with diversification!
  • As this has been mentioned, what about adsense? @viking @content32