#65 – Google FRED: What You Need To Know

What you will learn

  • How the new Google FRED update affected our sites
  • Some theories around what actually caused the update
  • Why nobody really knows what caused it at this point
  • Whether or not the authority site model still works (hint: it does)
  • What to do if you get hit by any algorithm update

Today, we’re talking about the recent Google FRED update (on March 8th), as well as Google updates in general and what they mean for you as an authority site owner.

What Happened To Our Sites?

Kicking things off with what happened to our sites, our analytics actually show little change after the update.

Health Ambition is down 5%, which – considering the time of year – is pretty normal and something we’re expecting anyway. As for Authority Hacker, it’s still very stable at +/- 1%.

‘The Authority Site System‘ case study site is steadily growing, though it’s still in the early growth stages so it’s difficult to associate any algorithmic changes with that, but it’s a good sign nevertheless.

Reports Of Significant Traffic Loss

Some people have reported drops of up to 90% in traffic as a result of this update.

As worrying as that is, this kind of thing happens with every update. There's always a small group of people claiming they did everything right but were still somehow hit.

What you often find is, their site isn’t as clean as they make out and they forget to tell you about the time they bought some links or ripped some content — so you have to be careful not to take these claims at face value.

The Root Cause

Whenever a big update like this happens, people scramble to find out which factors are responsible for the sites that were "hit". As a result, people share URLs that were affected, and the community as a whole tries to piece things together.

At the moment, nobody really knows what Google is targeting with this update, and it may be a while until we have any idea what it might be.

If you look at how Google did things 5-6 years ago, it was still regular programming (if this, then that – kinda thing). Now, they’re investing heavily in AI which means it will be much harder to pinpoint the root cause because it’s a lot more refined.

It's easy for people to make blanket statements – like "this update targets low quality sites", or "sites with thin content" – but they're far too broad to be of any real use to the community. After all, what exactly is thin content? At what point is content not thin anymore?

A Big Game Of Chinese Whispers

In a situation like this, where panic and confusion are rife, people tend to flock to industry experts for advice.

As a result, these 'experts' are under pressure to come up with something in order to retain their status, even if there's little evidence or justification behind it. These theories then become the basis for other, fan-made theories, which creates a massive game of Chinese whispers.

Eventually, people start creating false beliefs about what happened, purely on the back of theories that have now snowballed out of control.

If you’ve ever watched Derren Brown’s Trick or Treat, you may have seen an experiment he carried out highlighting this problem perfectly.

Our Take On This, And Every Future Update

Basically, we don’t know what caused it.

It’s probably more complicated than we know and there’s a good chance we’ll never really understand what caused it.

What we do know, however, is what Google wants. They want to provide people with the best user experience so they’ll keep coming back, and they’ll keep making searches.

That's how Google makes the majority of their money, and you can bet they're doing everything they can to improve the user experience. Every update is designed to surface relevant, high-quality and trustworthy sites…

…so guess what your site needs to be?

What To Do If You’ve Been Hit


#1: Just Stay Cool

When an update is released, it tends to have an immediate – but not permanent – effect.

We've seen this happen enough times to know that an update is fine-tuned for weeks, even months, after the initial rollout.

In other words, it’s very possible your site will bounce back.

Seriously, the best thing to do is to just take a break. Wait for the dust to settle. Work on other projects. Expand your portfolio.

#2: Let The Community Figure It Out

Sometimes the best thing to do is just wait for other people to figure it out. It sounds lazy, but some people are in a much better position than you to solve these kinds of problems.

Agencies are a great example, because they'll be under pressure from clients, and they'll also have a ton of sites to try different things with.

Even better, when they find a solution, it's not long before they publish a guest post on a site like Moz.com detailing their success.

#3: Take Incremental Steps To Fix It

It’s common for people to read a theory on a blog, and then make significant changes to their site based on what that person ‘thinks’ was the cause.

If your site does bounce back, you'll never know why, because you threw the baby out with the bathwater. We know it can be tempting to go into repair mode when you see a big traffic loss, but that approach will hurt you in the long run.

Note: A lot of these updates don’t happen in real-time. Google has to hit refresh for any changes to be applied to your site. That means anything you’ve done to your site may not have an effect until Google rolls it out again.


If you’ve followed us for any length of time, you’ll know this is exactly why we focus on building white-hat sites. Sites that remain unaffected by updates because they’re sites Google WANTS to show.

Want to learn how to build sites that are update-proof? We cover everything in our step-by-step program, The Authority Site System.




  1. Nice podcast! Thanks for reading my Q :)

    I read a few tweets by Gary Illyes, a Google employee. None directly mention the update, but he's a good person to follow anyway for those who haven't read him. Lots of his DYK (Did You Know) posts give valuable info about some inner workings of Google.

    He also tweeted this link in Feb to the updated Google Search Quality Guidelines, which were revised on Mar 14, after FRED. I'm SURE it's worth a read, and I'm surprised more search engine blogs aren't covering it in their FRED write-ups.


    1. Yep, they actually said at SMX that the answer to this update was in this giant PDF. I'll take a look; with a version-diff tool we could probably find what changed.

  2. Hi,

    I'm with Neil. In early February (the 6th or 7th), my two largest sites (large for me, probably tiny for most) started trending down. One of them, which I'd already removed AdSense from, started back up in early March. The other is continuing down. I removed AdSense from that one as well, since it wasn't really bringing in that much anyway. Both are Amazon affiliate based with fewer than 60 articles.

    Most articles have about 4-5 Amazon products: one link at the bottom, one at the top, and three for each product (one image, one text, and one CTA button). Each of these links is 302 redirected (as described in Google's guidelines; maybe not the 302 part, I may have found that suggestion elsewhere) from a "/specialstr/ProductName" type link to the actual affiliate link. I then have robots.txt Disallow all "/specialstr/" links so as not to pass the juice along to the affiliate.

    I'm thinking about removing my "text" affiliate links and my image links and just keeping the CTA button. This would reduce my affiliate outbound link count by around 60%, and Google might look at my site more favorably. But since I've already told Google to "Disallow" those links, I wasn't sure if they were even looking at them. Google shouldn't be following them from a "juice calculation" standpoint, but maybe they still count them toward the "total percentage of affiliate links on my site". Any advice from the experts on how the "percentage of outbound links being affiliate links" affects the Google algorithm? Or on whether having these Disallowed via robots.txt makes Google look more favorably on the site than not Disallowing them? I'm really guessing not doing this is a "no no".

    I had 3 sites that were killed by the Sept. 23rd PBN updates, where they were all marked as "thin content" (one of them I could see being possibly "thin", but I had some meat on the bones for the other two). I didn't figure out to do the "don't follow" (Disallow) until after several failed attempts to get the penalty dropped. I eventually just abandoned those websites.
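    The redirect-plus-robots.txt setup described above would look roughly like this. This is a sketch under the commenter's assumptions: "/specialstr/ExampleProduct" and the Amazon URL are placeholders, and the redirect line assumes an Apache server with mod_alias available.

    ```
    # robots.txt (site root): stop crawlers from following the local redirect paths
    User-agent: *
    Disallow: /specialstr/

    # .htaccess (Apache, mod_alias): 302-redirect the local path to the affiliate URL
    RedirectMatch 302 ^/specialstr/ExampleProduct$ https://www.amazon.com/dp/EXAMPLE?tag=example-20
    ```

    Note that robots.txt only blocks crawling of those paths; it doesn't qualify the links themselves the way rel="nofollow" does.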

    The site that I had already removed AdSense from dropped about 25% from early Feb to early March. Since then it has come some of the way back (still down maybe 5-10%).

    The site that still had AdSense on it started down in early Feb (you could make an argument for maybe the last day or two of January). It did not start to rebound in early March like my other site did, and has dropped more than 50% since the 1st of Feb (even more if you count the last 3 days of Jan). So, since I read that there was a possibility that "heavy ad" sites were being hit by Fred, I removed AdSense from that site as well a day or two ago (even though it only had two AdSense blocks; side and bottom). AdSense wasn't really making that much for me anyway, and I've always wondered if the site would have more credibility with AdSense removed, thereby getting more Amazon affiliate clicks. So, I'm going without AdSense on both for now.

    I can't say I saw CTR improvement on my other site when I removed the AdSense. I just like the look and feel of not having AdSense. Neither site really did very well with AdSense without me planting a big (ok, medium-sized, squarish) obnoxious ad right in the middle of my opening paragraph. I never really liked that. But, contrary to my expectations (and what I read online), those ads converted a TON better on my sites than the tall side banner or bottom wide banner (like 5x more with the obnoxious opening-paragraph AdSense block).

    Anyway, I thought I’d pass my small experiences along in case it helps anyone.


    1. I don’t understand your method using 302s etc, but if you are just trying to stop Google counting affiliate links, I just add: rel=”nofollow” to those Amazon links. Inside the anchor tag before href=”www.amzn…”

      Maybe you already do that as well. Much easier than fiddling with robots.txt and other stuff.
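      For reference, the rel="nofollow" approach mentioned above looks like this (the URL and anchor text are placeholders):

      ```html
      <!-- rel="nofollow" asks Google not to pass link equity to the target -->
      <a rel="nofollow" href="https://www.amazon.com/dp/EXAMPLE?tag=example-20">Check price on Amazon</a>
      ```

      Attribute order inside the tag doesn't matter; rel="nofollow" can come before or after href.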

      Also worth noting that there was (apparently) a Feb 1st update AND a Feb 7th one. Presumably they were both targeting different things. My traffic began declining after the 7th.

      What I’ve been looking at is trying to improve UX. ESPECIALLY for mobile device users.

      So I thought I’d got the mobile version working okay, and had responsive Adsense at the top. I expected that to be a short banner-type thing. BUT!! It was a giant square ad – entirely filled the Above-Fold space for mobile users.

      My advice is to check mobile usability using anything other than Google's tools – they strip out the AdSense, so you don't see what real mobile users are seeing – took me a year to figure that out! (I use mobiletest.me now.)

      And like Authority Hacker, and tons of other well-constructed sites these days, make sure you have shorter paragraphs and plenty of white space. Mine were gigantic walls of text, with tiny images stuffed into HTML tables (because it made everything look pretty on desktop!).

      60% of my traffic is mobile, so I figured if G thinks I’m offering a bad UX to them, they’ll hit my site.

      But like the advice in the podcast here, I haven’t done a TON to change things, other than go through every post, splitting paragraphs, adding space, reloading larger images so they look great on mobile and desktop, etc.

      The volatility makes me think I’m right on a “boundary” and G can’t figure out whether to hit me or not. And it also maybe indicates it’s something in a real-time algo – rather than something requiring a refresh. But I don’t know! lol.

      On UX, maybe check your bounce rates – mine are NEVER below 80% and often higher than 90%.

  3. I went from 4,500 visitors/day to 1,300 visitors/day within 48 hours. Some info about my site:

    Age: 7 months
    Backlinks: 103
    Referring Domains: 97
    Backlink Types: Mostly outreach (zero PBNs or black hat stuff)
    Site Page Speed: Approx. 1 second
    Pages: 121
    Avg. Article Length: 700 to 1,500 depending on topic
    Monetization Method: Google AdSense & Amazon
    # of Ads Per Page: 2-3 (non intrusive)
    Popups: None
    HTTPS: Yes
    Anchor Text: Normal
    On-Site SEO: Normal

    To be honest, I’m totally clueless as to what I could have done. I’ve tried to be as whitehat as possible and I still got hit. It really sucks, but I guess that’s what you get when you rely 100% on Google.

    To stay sane, I’m currently working on a totally new project that’s more focused on reputation (actually being GOOD) rather than relying on Google.

    Thanks for the podcast guys.

  4. Haha! Glad I checked Google before posting that REALLY long comment I just wrote. So the Fred Update is different to the Feb 7th Update. I got hit by the Feb 7th one and lost 67% of my traffic.

    Can’t say whether I got hit by Fred because traffic volatility on a day-to-day basis since the “Groundhog” Update has been running at plus/minus 20%. Just rollercoastering. Up 22% yesterday, down 21% so far today. lol. Been like that for weeks.

    1. I firmly believe that the recent update was caused by the ghostly presence of Fred Dibnah. An eccentric Englishman unfortunately deceased but a national treasure to all us Brits.
      Fred liked blowing things up.

  5. My site was completely whitehat based on your techniques and I got hit. My content has always been based around quality too. I always work from a perspective that I want people to land on an article on my site and not need to go elsewhere to find information on the subject tackled.

    I am just continuing at the moment until I know it is completely over, but being whitehat doesn't make you immune to the updates.
