#332 – 4 PROVEN Steps To Beat Google Updates

🗒️ Overview

  • Breakdown of the content auditing process
  • How often should I be running a content audit?
  • When do I need to start auditing my content?

A special thanks to our sponsors for this episode, Digital PR Agency Search Intelligence.

Google just released a fat core update and all the rules you know about SEO are about to change again. With Google already de-indexing what they deem to be low-quality sites, content auditing has never been more important.

It’s the process that might save you from an extinction event during the next update, or could resurrect your site if you’ve already been affected. Plus, it’s the only process that’s consistently shown results for websites affected by previous large updates like Panda, Medic, spam, and core updates.

So in this episode, we break down exactly what you need to do to curate content that will keep Google – and you – happy.

Our Content Auditing Process

  1. Start by gathering a comprehensive list of all indexed pages on your site using Google Search Console (GSC) and Ahrefs’ Webmaster Tools. While GSC is a go-to for many, Ahrefs is invaluable – even using the free version – for its depth of data, such as linking root domains, top keywords, and organic traffic insights.
  2. Next, import this list into a Notion template crafted for content audits. This step is crucial for categorizing pages based on traffic and backlinks, which will determine what actions you need to take next.
  3. Next, it’s time to grade your content like Google would. Consider factors like keyword relevance, search intent match, ranking potential, content accuracy, mobile optimization, and conversion effectiveness. This grading system will help you see your content a bit more objectively (a minimal scoring sketch follows this list).
  4. Now it’s time to make decisions based on your grades. Should you keep a page, update it, repurpose it, redirect it, no-index it, or delete it altogether? Decisions should be influenced by factors such as page age, ranking, backlinks, business value outside of SEO, etc.
    Implement a task management system within Notion to efficiently execute the necessary changes for each page. It’s crucial to be quick and efficient here to translate your audit into tangible benefits.
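To make the grading step concrete, here is a minimal sketch of what a checklist-based scoring pass can look like. The criteria and the seven-out-of-ten threshold echo what’s discussed in the episode, but the exact checklist items and weights here are illustrative assumptions, not the Blueprint’s real scoring system:

```python
# Illustrative grading checklist; criteria and weights are assumptions,
# not the exact scoring system from the Blueprint.
CRITERIA = {
    "realistic_keyword": 2.0,  # can we actually outrank the current SERP?
    "matches_intent": 2.0,     # format matches what ranks on page one
    "up_to_date": 2.0,         # facts, versions, screenshots still accurate
    "mobile_friendly": 2.0,    # no walls of text or broken tables on mobile
    "converts": 2.0,           # the page has produced revenue or leads
}

def grade(checks: dict) -> float:
    """Score a page out of 10 from the checklist items it passes."""
    return sum(weight for name, weight in CRITERIA.items() if checks.get(name))

pages = {
    "/old-review": {"matches_intent": True, "converts": True},
    "/pillar-guide": {name: True for name in CRITERIA},
}
for url, checks in pages.items():
    score = grade(checks)
    verdict = "needs a decision" if score < 7 else "keep as is"
    print(f"{url}: {score}/10 -> {verdict}")
```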

Key Points to Keep in Mind

  • Mobile Optimization: Google is mobile-first now. Ensure your pages are optimized and formatted well for mobile screens. Avoid common pitfalls like poor table presentation and walls of text.
  • Monetization Sensitivity: Be cautious of over-monetization and promoting shady companies or products.
  • Implementation Efficiency: For implementing 301 redirects, consider using Cloudflare’s free bulk redirect tool rather than an SEO plugin; plugin-based redirects can slow down your site and eat into your crawl budget. A sketch of the CSV export follows this list.
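For the Cloudflare route, the output of your audit can be turned into a CSV and imported as a Bulk Redirect List. A minimal sketch; the `source_url,target_url,status_code` column layout is an assumption on my part, so verify it against Cloudflare’s current import format before uploading:

```python
import csv

# Redirect decisions from the audit: old URL -> new destination.
redirects = {
    "https://example.com/old-review/": "https://example.com/current-guide/",
    "https://example.com/thin-post/": "https://example.com/pillar-page/",
}

# Assumed column layout for a Cloudflare Bulk Redirect List import;
# check Cloudflare's docs for the exact headers before uploading.
with open("bulk_redirects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["source_url", "target_url", "status_code"])
    for source, target in redirects.items():
        writer.writerow([source, target, 301])  # 301 = permanent redirect
```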

Recommendations

  • Periodic Audits: Conduct content audits annually, as a form of site “spring cleaning.” This way, you stay proactive and maintain site quality even as Google’s standards evolve.
  • Realistic Benchmarking: Set realistic expectations about your content’s competitiveness. Don’t hesitate to make tough decisions about repurposing or redirecting content that no longer fits or that you simply can’t rank for.
  • Conversion Focus: Traffic isn’t the end goal; focus on the conversion and revenue generation potential of your pages.

When to start content audits

While newer sites might only benefit from an audit after the first couple of years, established sites should aim for an annual audit. This gives new sites time to grow and to see what’s working and what’s not.

Outsourcing content audits

As a site owner or SEO, your oversight in the auditing process is vital. The third party who is running the content audit might not know which pages are valuable to your business, so while you can delegate tasks, you should definitely be reviewing the final decisions here.

Balancing Topical Authority and Quality

Quality should always come before quantity. Regular maintenance of your content ensures that your site remains authoritative and relevant within its niche.
Aim for less content that’s high quality and well-maintained rather than more content that’s weak. It’s all about creating quality content that truly adds value.

Long-term Impact

Content audits are no quick fix but position your site for future success. In the ever-changing game of SEO, it’s critical to regularly reflect on your content strategy, make sure you’re still doing the right thing, and be proactive about making changes.

By regularly running a content audit, not only do you potentially safeguard your site against future Google updates, but you also align it more closely with your audience’s needs and expectations. Remember, the goal isn’t just to climb to the top; it’s to stay there.

Transcript

Google just released a fat core update and all the rules you know about SEO are about to change again.

But I’m not going to go over the update now because we’ve already done it on our news channel. If you want to check the video,
click on the card above or check the link in the description.

But what we’re talking about in today’s podcast is very closely related to what’s happening now on Google. They’ve started literally de-indexing sites that they deem low quality. And I’m not just talking about AI sites here; a lot of non-AI sites are also getting caught in the process.

My hunch is that now, with generative AI, the web is getting a lot larger while not necessarily adding a lot
of new information.

So Google has to become some kind of curator for websites and just can’t maintain an index where all the websites are part of it anymore.

And that’s what brings us to today’s podcast topic; content auditing.

I know it may sound boring, but that’s the process that might save you from an extinction event during the next update or resurrect your site if you’ve already been affected.

It’s the only process that has consistently shown results for websites affected by large updates like Panda, Medic, spam, and core updates. And yes, even HCU, though not the 2023 version yet at the time of recording. But I suspect we’ll see those recoveries very soon as Google rolls out the new version as we speak.

So, for example, a friend of ours was affected by the series of core updates and HCU towards the end of 2023, like many of you, and his traffic went from around 2,600 to around 1,300 clicks per day, so about half. And I’m not even counting Christmas, because we know traffic is always lower during that period.

On first look, his site looks really good, but as we dove into it, we found many issues, which we’ll go over during this episode. Long story short, we advised him to do an aggressive site audit and no-index most of his content to elevate the average page quality. A few weeks later, following our advice, he de-indexed 90% of his site, keeping only the best pages that were actually competitive in the index.

And what’s the result?

Well, a few weeks later his traffic mostly recovered. He jumped up to 2 to 2.2k clicks per day, and he’s in a good spot to make a full recovery at this point.

And we followed the exact same process on Authority Hacker preemptively towards the end of last year, and we no-indexed 60% of all the pages on the site. During the next core update, our traffic jumped up 30%, and it jumped up 30% again this January.

So do I think content audits are one of the cornerstone SEO activities everyone should carry out?

Absolutely.
So much so that a few weeks ago we shot an entire Blueprint about it for our Pro members, including premade Notion templates, tactics and automations to execute things faster and better without cutting corners, checklists, and everything you can think of.

And so far people are really loving it.

And next week we will be offering it to every one of our listeners and email subscribers at an 80% discount on
authorityhacker.com/content-audit.

If you go on that page now, you can sign up for the waiting list and we’ll let you know when it’s available.

But whether you buy our Blueprint or not, we’ll walk you through the whole process in today’s episode, and why more than ever this is a good time to look into this.

Now unfortunately, this episode was recorded 4 hours before the update was announced, so you will hear us talk about how Google hasn’t released an update in a while, etc. But this changes nothing about the value of the process, and I’d argue it’s even more important now.

Before we jump into today’s episode, I want to thank today’s sponsor, Search Intelligence, and we’ll tell you more about them a little bit later.

For now, let’s jump in.

Hi everyone.

Welcome to the Authority Hacker Podcast.

Today we’re not going to talk about a very joyful topic, but it is definitely an interesting one, and a topic that matters to a lot of the people listening to this episode: recovering from large Google updates.

So we’re not necessarily going
to target HCU specifically here.

I mean, I’m not going to come out and say
we have an amazing recovery story from HCU.

There aren’t really strong recovery stories from HCU, but there are good, strong recoveries from core updates, which have
affected a lot of people recently,

and in general, how to deal with these
large updates that send your traffic down.

So you’ve probably heard
it in the pre intro.

We are releasing a new Blueprint
that helps you prevent and recover

from large updates,
but we’re also going to give you a lot

of that in this episode,
so feel free to listen to that.

Before we jump into the exact step by step

process, we’re actually going to have
to establish a lot of things on how Google

works and why this methodology is not
bullshit, basically because I think

there’s a lot of bullshit
around recovering sites, etc.

And feel free to comment under the
episode, ask us some questions, et cetera.

But I think it’s something a lot of people

need to do, even if you don’t actually get
affected by penalties yet,

or an algo update hasn’t affected your site
yet, because it’s very much a process

that helps keep your site clean and lean
and remove a lot of essentially bloat

from it that could cause issues
later, which is the main reason why people

see these issues happening
in the first place.

So for today’s episode I have Mark.

Mark, you’re going to mostly be here
to challenge me and be the voice

of the people and everything,
which I think is good.

I think we need to do that because
it’s such a sensitive topic.

Right.
People care a lot about this.

It is.
And I think you brought up a good point

earlier, that this is not just for people
who have been hit already,

but it’s a good prevention
technique to run on an annual basis.

And we’ve run that on Authority Hacker

towards the end of last year and
it netted us some pretty good results.

Right?
Yeah, that’s the thing.

I don’t think we’ve
reinvented the wheel here.

We merely kind of like built a nice

process that’s easy to follow with some
templates, et cetera, that helps you out,

but it’s more about the framework
that helps you think about this and make

the right decisions for your
site that I think is valuable.

And yeah, we have some case studies,
we have case studies on Authority Hacker.

We have an anonymous case study.

I mean, the guy we helped didn’t want me

to share his website,
so I’m not going to do that.

But he did get affected by core updates,
lost around half of his traffic

and recovered most of it
using this methodology.

So we’ll show you some graphs
on the screen if you want.

So I suggest we just get in and we get
started, first of all with some background

information on how Google updates work,
how Google works in general,

so we can understand the methodology
behind the Blueprint.

Basically, Google has what we think are two levels of algorithms.

One is what I call the light algorithms,

which is kind of like the day to day
classic SEO metrics that you’ll see,

you know, that would be like the domain authority, the links to the page,

the keywords that are on the page,
the publish date,

all the classic SEO stuff that creates the
fluctuation in your day to day rankings.

These light systems work at the page level, which means that essentially you

can have a page shooting up because
you’re doing the right thing.

You have the right links,

you have the right keyword on the page,
et cetera, and you can have another page

that does things completely wrong
and does not rank well, right?

So it’s like you tend to not have massive

traffic fluctuations day to day with these
kind of like light, let’s say,

legacy algorithms inside of Google
that don’t cost them much resources as

well, which is why they can afford to run them all the time.

Then you have the heavy
algorithm updates, right?

These are like the most compute intensive

updates where they use kind of like
advanced AI and it costs them a lot

of money and it takes a lot
of time to roll out as well.

Right.
The last core update took more than three weeks to roll out, presumably because it takes that many resources.

These tend to be the updates that they announce and that they give names to, right?

Yeah.
And the thing with these updates recently is that Google has been releasing them in clusters.

So you can see, for example, with the last round of core updates, we got one every three weeks for, you know, three months, and then nothing for more than six months.

And I feel like potentially it helps
them create what I call a fog of war.

Like in Age of Empires,
you don’t see the map

and then as a result,
it helps them fight spam, basically.

It’s harder to reverse engineer
what happens, et cetera.

I don’t know if it’s that, or if it’s just that they ship it when it’s ready. I can’t tell for sure, but the timing of a lot of these updates is pretty weird, and they usually come quite close together.

Maybe they’re tested together,
though that could be a thing as well.

There was a lot of speculation last year as well that the updates post-HCU were designed to fix some of the apparent damage HCU caused.

Yeah, well, that’s also up for debate, but we actually looked at the density, or the frequency, of updates, and it was the most updated period, the highest number of days with updates running, since Google started releasing that data.

There were sometimes four or five updates, I think, running concurrently, going over holidays like Black Friday and Thanksgiving, which they typically tend to avoid doing updates on.

So there’s a lot of activity.

Yeah, but the thing with these updates is
that they apply a modifier on your site.

It applies to all the pages on your site,
which as a result leads to either very

large traffic increases or very large
traffic decreases for most people.

And some people just see nothing change. Basically, it’s one of these three cases, but it’s quite rare that you get

a five or 10% traffic increase
or decrease from this.

It tends to be quite drastic.
Why?

Because it affects all the pages on your site, regardless of the individual quality of the pages. In this case, Google has just decided: look, this site is pretty shit, let’s just tank all the pages, basically.

And so that’s kind of like the core

difference between the light algorithm
updates and the heavy algorithm updates.

The thing with these updates is they can
disagree with essentially the legacy

factors, like the links,
the keywords on page, et cetera.

So you can do something
that essentially rewards you day to day.

So you’re like, oh, I’m publishing

this content and Google
is ranking me for more and more keywords.

And it’s working.
They’re liking it, et cetera,

i.e. publishing a lot of AI
content, for example.

And then a big core update comes
out and your traffic goes like.

And it’s just like, tanks completely.
Why?

Because this core update that is

essentially smarter than the day to day
algorithm updates essentially disagrees.

And once it looks a little bit deeper at your site, it decides your site is shit and therefore should not rank. It applies a modifier site-wide and your traffic tanks completely, which is what we see for a lot of AI-generated content sites nowadays. Quite a few of them are still ranking, but it’s been six months and there’s been no proper update either.

But this was also applied to a lot of non-AI sites. People who have had sites for four, five years or more, this affected them too, right?

And that’s really counterintuitive
in a way, because you’re doing something.

And especially now we’re in a period
like six months, no updates, right?

So for six months, essentially, these light algorithms have been deciding the fate of websites that have no classifier applied to them.

Like you go up or you go down,

and so you get no feedback from these
smarter algorithm updates at this point.

And so what that spells is potentially large drops the next time one of these updates runs,

because it’s been such a long time where
you got no feedback from Google on whether

what you’re doing is
good or bad, basically.

And so that makes SEO
very counterintuitive.

You’ll have people posting Twitter
screenshots of GSC of the last three

months or something and be like, oh,
look, my tactic is working, et cetera.

The reality is they’re probably doomed

in the next big core update or
spam update or something like that.

But because the classifier hasn’t been
applied and the dumb algorithms are not

smart enough to catch that, yes,
they are winning now, but long term,

they’ll pretty much be
messed up, basically.

And that’s kind of like the difference

between short term SEO and long term SEO
is like, if you want to do long term SEO,

you need to survive not
one of these updates.

You need to survive,
like 25 of these updates.

Right.
And so you essentially need to play quite

safe compared to the people who just want
to play the day to day algorithms and can

take more risk because
they’re just not very smart

basically. I would push back
on that and say they’re not very smart.

I’m talking about the algorithms, not –

Okay, fair enough.

That short term approach,
like the rank and tank model, as you said.

Yeah, there’s a lot of short
term profit to be made.

It’s not illegal as well.

It’s completely legal,
like publishing content.

You’re allowed to publish
content on the Internet.

If Google decides to rank it, that is their problem.

Yeah, exactly.

It’s their editorial decision to rank it.

The problem is, the algorithms doing that are pretty old right now.

They’re pretty figured out.

And therefore,

it doesn’t mean that because you win, you
will win the next core update, basically.

Right.

So now we’ve established that. It’s pretty basic; a lot of people probably know that already.

Let’s talk about these site wide updates.
How do they work?

Right.

Most likely what they do is they take
a sample of pages on your site,

they run it against
some kind of AI system that essentially

decides, is this a good
page or is this a bad page?

That is probably quite different
from the day to day algorithms.

It’s more machine learning, et cetera.

So it’s going to be based, for example,
on what the quality raters say.

So for example, quality raters,
for six months, they will be visiting

sites on the Internet and they’ll say,
oh, this site is very helpful.

This site is not very helpful, et cetera.

They feed that data to machine learning.

And these algorithms, they’re like, oh,
look, there’s a bunch of common factors

between all these sites that have been
deemed helpful,

and it might not be the factors that
the quality raters have deemed helpful.

So the quality rater,
they read their quality rater guidelines,

and they get told about EEAT,
they get told about, you can understand

who is the author,
who is behind this, et cetera.

But the algorithm might not
understand the website the same way.

It just knows it’s a helpful website.

It’s going to look at random
factors that are common.

So I listed some random examples that the
algorithm could be using, for example.

So I don’t know if you know, but if you’re a tech nerd like me, you know that font sizes on your website can be determined in different ways, right?

You can determine the size in pixels,
which is kind of like the legacy way.

So you say 18 pixel, nine pixel,
very much like Microsoft Word.

Most people build their site that way.

The problem with that is
it’s not very accessible.

If I have poor vision and I increase the base font size in my browser, pixel-based text doesn’t scale, and I can’t read very well.

So the modern way of building websites is

to use kind of like relative
values called EM and REM, right?

And these allow you to scale the font

based on the base font
size of your browser.

So it’s 1.2rem instead: if my base font size is 16, it’s 16 times 1.2.

If my base font size is 20,
it’s 20 times 1.2.

Makes my site accessible
to people with poor vision, right?

It’s very likely that the sites deemed helpful, large sites like Forbes, et cetera, have implemented these accessibility features, like the font sizing, for example.

And it’s very likely that the machine
learning could be like, oh, look,

all these sites deemed helpful are using
REM for font size instead of using pixels,

like most of these niche sites that are
using it the legacy way,

and didn’t really learn about these
new ways of making sites accessible.

And so while the quality rater has
identified EEAT, the machine learning has

identified REM as a font size factor
to actually decide that, oh,

this is a helpful site, right,
while looking at the same site.
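Whether or not Google actually weighs it, the px-versus-rem point is easy to check on your own stylesheet. Here’s a rough sketch that flags pixel-based font sizes; it’s a naive regex scan, not a real CSS parser, so treat it as a starting point:

```python
import re
import sys

# Naive scan for font-size declarations that use fixed px instead of relative units.
# A regex sketch, not a real CSS parser; it will miss shorthand `font:` rules.
PX_FONT = re.compile(r"font-size\s*:\s*(\d+(?:\.\d+)?)px", re.IGNORECASE)
BASE = 16  # the usual browser default base font size, in px

css = open(sys.argv[1]).read() if len(sys.argv) > 1 else "h2 { font-size: 18px; } p { font-size: 1.2rem; }"
for px in PX_FONT.findall(css):
    # e.g. 18px on a 16px base is 1.125rem, which scales with the reader's settings
    print(f"font-size: {px}px is fixed; {float(px) / BASE:.3g}rem would scale with the base size")
```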

Another thing might be the cookie consent

thing that actually Cyrus’ study showed
recently was a positive ranking factor

in the recent HCU, right? Sites that had cookie consent in the EU tended not to do as badly after the HCU as sites that did not have one.

It’s probably something that the machine

learning has picked up on that the quality
raters didn’t deem as helpful,

but the sites that are helpful tend
to have a cookie consent box, for example.

Right.
To play Devil’s advocate on that one,

that feels like it’s very much like
a correlation causation question.

Yeah, but that’s exactly how machine learning works.

The better quality sites that care about
their content, that care about everything

else, they do the things
that they’re supposed to do legally.

Yeah, makes sense.

And so as a result,
Google is actually rewarding the REM font

size or whatever because it correlates
very highly with good EEAT, for example.

And talking about EEAT,

a lot of sites now have these
kind of like content reviewers.

So they don’t just show the author,
they show the reviewer.

My belief is that initially this was not a ranking factor, right? All this stuff, it was very hard for Google to determine how legit it is.
for Google to determine how legit this is.

However, if you go on large quality sites

like Healthline, like Forbes,
whatever, they all have it now.

They have an author,
they have a reviewer, right?

So now there’s a correlation between sites
that are winning the SERPs and the sites

that have a reviewer
on their content, right?

And therefore, it’s not that Google is verifying it with some crazy algorithm or whatever; it’s just that Google now associates having a reviewer on your content with being a quality site.

Therefore EEAT becomes a factor.

But it becomes a factor because
of Google’s PR effort to push EEAT and big

sites adopting it and correlating
with essentially being good sites rather

than Google coming up with some
super fancy way of evaluating EEAT.

So EEAT is almost like a self-fulfilling prophecy, you

know? I’ve seen that mentioned before.

We talked about this in various
niches, like the VPN niche is

a great example of this. There are maybe a couple dozen popular VPN companies, and ExpressVPN and Nord are the two that are, I guess, promoted a lot.

Surfshark as well, is a big one.

And it’s kind of like if you don’t mention

those big ones in any of your articles,
you’re missing some of the –

Yeah, exactly.
So yeah, it’s interesting. But you see how EEAT can go from essentially a wish from Google, to something people buy into and edit their sites for, to quality raters flagging these sites that have this stuff as helpful.

Therefore the machine learning knows how

to recognise EEAT through
correlation, right?

In my opinion, this is
very much how this works.

And that kind of reconciles things for the people who ask: how does this work technically, and how does Google verify your claims?

I don’t think they do.

But potentially the badges on your page
and all that stuff, et cetera,

that is visible correlates
with being a high quality site.

And I’m not saying it’s going to fix

all your problems,
but it’s one of the things that might be

associated with you being a higher
quality site during these core updates.

Similarly with user metrics,
like user engagement metrics, right?

It’s like, for example,
a site that has been deemed helpful might

have a time between the person clicks
on the result and the person comes back

to the SERP of 2 minutes
and 30 seconds, right?

It’s like we know that.

And then Google can benchmark that time

of your site against that time on the site
that they know is helpful because a human

reviewed it and then they
see how far off you are.

And for many reasons, people could be spending less time on your site. One of the main reasons, I think, is that most people’s sites look absolute trash on mobile, whereas large sites look very good on mobile, and most traffic is mobile these days.

And if you essentially have lower time
on page because your site is a poor

experience on mobile,
even if your content is great,

a lot of people will be like, oh, but my content is great.

It’s better than these
big sites, et cetera.

People are potentially staying less

on your site and therefore you’re deemed
unhelpful, even if your content

and the words you have on your page
and the photos and everything are great.

Same with over monetization, right?

It’s like it could fuck you over.
If there’s ads everywhere, people click back too early, and that essentially messes up your site.

So think about this as like Google taking
the benchmark that is set by quality

raters and then putting your site against
them for many, many metrics and then

deciding how close you are to them to
determine if you’re a helpful site or not.

Now, I don’t think Google looks
at all the pages on your site.

I think that would be very expensive.

They probably look at the pages they send

the most traffic to, which essentially
is what affects them the most, right?

They take their visitors,
they send it to your website.

They want this to be a good experience.

So most likely.

And that’s kind of the good
thing with the audit, right?

It’s like you can 80/20 it if you want.

You can kind of review
your top pages and fix them.

And it probably is a lot more
helpful than doing everything.

There’s kind of diminishing returns as you
do that, which is nice because doing

an audit is pretty heavy
and time consuming, actually.

It’s a grind for sure.
Yeah.

So you can 80/20 it basically.

But basically that’s the idea.

Google uses something
like a quality score.

So quality score is actually
something borrowed from AdWords.

So I don’t know if you know,
but when you do AdWords,

when you do Google Ads,
Google actually assigns a score between

one and ten to your landing page,
matching it with your keywords.

And it’s trying to decide
how closely you are matched.

And the better you match, the lower
your CPC, so you pay less for your ads.

And it’s done this for a long time.

This isn’t a new thing.

That’s what we kind of like borrowed
from Google, the quality score idea.

And so our whole auditing system
essentially

revolves around assigning a quality score,
assigning an average quality score target.

So I usually tell people like,

you want all the index pages on your site
to be at least a seven out of ten

and we have a whole scoring
system for that, et cetera.

And then anything that’s below a seven out
of ten, you’re going to have to do

something about it, basically, because
that’s what’s bringing your average down.

That’s kind of the reason for all this explainer.

Yeah, go ahead.
One thing that’s really important here,

because most people I speak to are like,
oh yeah, I have great content.

And there’s a couple of things.

Often people don’t realise where their content stands;

maybe there’s a gap between what they have
and what’s currently ranking at the top

of the SERP, or even if when you created
that content, it was really good.

Best in the world.

SERPs change very quickly,
especially in competitive spaces.

People are updating their articles,
creating new content.

The search intent itself changes.

So that’s why you really need to go
through this regularly,

at least once a year,
to kind of reevaluate things,

because it’s very easy to create a process to create content, but much harder to create a process to constantly evaluate and analyse it.

There’s actually more to that as well.
There’s also like,

from an SEO perspective,
let’s work with Google for a second.

There’s also kind of like strategic

decisions that probably made sense
at the time when you made them.

So you were trying to rank for a keyword, for example, and now there’s a bunch of high-DR sites there. It doesn’t make sense anymore.

And quite often a lot of your link profile

is pointed at places where
you have no chance of ranking.

So these audits help with this. And link profile redeployment, which we’ll talk about, is a big thing as well.

And I think it’s a big
thing in terms of recovery.

It’s not just recovering your rankings,

it’s kind of re optimising
your assets as well.

It’s a big, big deal.

And that has helped us a lot.

I might give an example later on,
some of the stuff we’ve done around that.

But doing this exercise basically helps

you also redeploy things in a smarter way
towards keywords you actually have

a chance to rank for,
especially after large updates, right?

It’s like when Reddit ranks for everything. I’m telling you, there are many keywords your site is targeting that you can’t rank for anymore.

And it’s a bit of a painful realisation.
You’re like, oh,

I could easily rank for this before, but now I can’t. For Authority Hacker, I think we had “best keyword tool”. We were number one for “best keyword tool” for a long time, right? Now,

good luck.

It’s like DR 95 sites ranking
for it, we can’t rank for it.

It’s unfortunate, but just,
it’s not possible.

And so we’ve redeployed,

we had 100 plus linking root domains
to that best keyword tool page or something

earned through actually
making a good review.

And we redeployed them to a page
that can actually rank.

I can’t remember which page.

I think it’s pointed to the topical
authority page now or something like this,

like something that’s like semi
related that we can actually rank for.

And so we kind of keep doing that dance

with other sites when we can’t beat them,
we just dodge them instead of trying to –

This is a really interesting point as
well, because we’re not just making SEO

decisions here, but we’re
making business decisions here.

Because ultimately traffic,
we always say it’s a vanity metric.

It doesn’t matter how much traffic you

have, it’s how much money
you generate from that.

So if you’re able to redirect some of this
link juice to pages,

which can bring you leads or sales or
whatever it is you’re going for, then

it’s a much better thing to do than having
these links go to a page that has no

chance of even being
anywhere close to page one.

I suggest we jump into the audit process
because now we’re kind of like chipping

at it, but we are actually
not getting into it.

So we just spent,
I’m looking at my timer here.

We talked for 20 something
minutes about how Google works.

Now the question is like,
how do I solve this update?

That’s why I click on the podcast.
What are you talking about?

Let’s talk about this right now and we’re
going to talk about the whole process.

And for those of you who want to see us do it, who want the templates, et cetera, that’s when you might be interested in the Blueprint.

Otherwise we’ll give you a bunch of info
here and you can try it yourself.

But basically,

if Google is going to sample pages on your
site and give them a quality score,

we’re going to do the same
with our auditing process, right?

We’re going to go through essentially all the indexed pages on our website. And to be very precise: indexed pages, not all the pages.

So if you have no-indexed stuff, it doesn’t matter. And no-indexing stuff works, as we could see from the anonymous case study that I showed: he no-indexed 90% of the pages on the site and his traffic actually went up.

So no-index is going to be an option, and your role is going to be going through the pages that have a low quality score. You can either delete them.

So if they have no traffic and no links,

we tend to delete them because
they add very little value.

Unless you maybe have a realistic shot at ranking for the keyword; that’s where you reevaluate your keywords. Or you can redirect them.

So if a page has no traffic but links,
you’re much better off redirecting

that page to a page that has
a shot at actually getting traffic.

You can update the page.

So essentially if the page is valuable

for your business, it can help
you make sales, et cetera.

And some of the content is salvageable.

Then we do a content update,

which is essentially you keep the core
of the content and you just fix it.

You patch it up basically.

Or you can rewrite everything.

That’s when nothing’s salvageable

but the page is valuable to you,
or you can no index pages.

So we do that when the page is valuable

to the user, but we don’t think
that’s what Google is looking for.

So for example, a lot of people like republishing their newsletter on their site, right?

It’s cool and everything,

but the truth is it tends to not be
deemed as a high quality page by Google.

While Google says “write for humans”, they’re full of shit, in my opinion.

And the truth is these kind of like dumb

algorithms very much rule
the core of how Google works.

And they want like an article that’s

structured the way it’s
supposed to be structured.

Etc.
Yeah.

You’re not really rewarded for posting
content for humans at this point.

I hope they fix that at some point, but at
this point I would no index it, actually.

Or you can repurpose stuff.

And I’ll give you an example recently
on stuff that I’ve repurposed.

Repurposing is basically content that
might not be good enough for your domain.

Let’s say you’re a lower-DR site trying to rank for a keyword.

So either the keyword is too difficult or

the content is not good enough for your
quality threshold,

then you might be better off using this as
a guest post or potentially using this as

parasite SEO, as I’ll talk about in a second.

So that’s essentially the process
of auditing, like the very wide process.

I’m going to get into specifics in a second. For Authority Hacker,

yeah, we removed 100 blog posts
that we either deleted or redirected.

We even removed an entire category.

You know, we had the funnels category,

and this is gone from the site now,
mostly because we found that 90%

of the content in that category
was not on the standards we wanted.

And so we’ve deleted, I think,
like 40 or 50 articles from this category.

The writer was probably one
of our worst writers, to be honest.

And we were making money,
that’s the thing.

We kind of gave up on some affiliate

commissions on ActiveCampaign and so on,
but we felt that it could be the cause

for the site dropping
in an update in the future.

So we decided to not take that risk.

It was worth it.

We no-indexed 300 podcast pages as well.

It’s kind of a difficult one because

several people want to find
a podcast on Google.

But Google doesn’t like our podcast pages

because podcast show notes
tend to be very thin.

There’s low value on them.

And so therefore, it kind of like brings
your average quality score down, right?

It’s like if you took an average page
on Authority Hacker with the podcast pages,

probably the median page
is a podcast page, right?

So it means you cut halfway through the
page count, you land on a podcast page.

It’s not a great page.

It’s not reflective of the content
we put on the blog elsewhere.

So we didn’t want this to bring
down the average quality.

Bam.
No-indexed everything.

And if you put Authority Hacker in Ahrefs,

you’ll see the number of index pages
dropped quite a lot when we did this.

We’re still in the process of updating

and rewriting dozens of pages,
following this process as well.

And we’re testing repurposing.

So I’ll give you
an example of repurposing.

That’s a funny one.

So in our funnel section, actually,

that we deleted, we had an article
on ThriveCart review, right?

So ThriveCart is
the shopping cart we’re using.

Some days we love it,
some days we hate it.

Let’s just say that.

Still pretty decent if you’re getting

started, because there’s no
subscription, which is nice.

But the point is, the review was outdated.

It was four or five years old.

And when we did the audit process,

one of the points is like,
is the content up to date?

And in this case, it was clearly no,

we were missing lots of new features,
it wasn’t very good, et cetera.

So we unpublished it.

But ThriveCart, we’ve made
money from this review, right?

I checked; I think there was over 20

grand of commissions
on the affiliate dashboard.

Right.

I was like, that kind of sucks to kind
of give up on these commissions,

but we just don’t have the resources
to redo that review properly.

I don’t think we have a writer

that specialises enough in shopping
carts to do justice to that review.

So we removed it from our site
to essentially increase our quality score.

But I recently republished it on LinkedIn, which is basically a completely unmoderated, very high DR platform where you can publish anything you want, including your affiliate link CTAs, anything.

Three days later,

that review is now ranking higher than it ever did on authorityhacker.com.

And we can reap the benefits

of essentially the trickling commissions
from the ThriveCart review that we had

without damaging the average
quality score of the site.

We do lose a little bit on email

subscribers and retargeting
data, et cetera.

It’s not perfect, but that’s what
I’m talking about with repurposing.

That’s where you can be a little sneaky.

I mean, Google’s being sneaky with us.

I don’t feel bad for repurposing,
essentially.

It’s not a sneaky review.

When the user lands on the review is
the same experience we had on Authority

Hacker, except it’s
on LinkedIn, basically.

So that’s an example of repurposing here.

Is that something you recommend doing

for most people,
or is that mostly just a fun experiment?

It’s a good question,
but the thing is, it works super well.

It’s too easy.
Like three days later we were ranking.

I outranked DR 75+ sites that have had more updated reviews for years.

Just putting it on LinkedIn right now,
and I mean, I don’t know if it will still

rank when we release this podcast,
but right now it’s ranking for everything.

So it’s like potentially, potentially,
but I don’t think it’s going to last.

I think LinkedIn is either going to put

some kind of moderation system in place,
or Google is going to not reward them as

much because it’s a short
term game, basically.

But again, if LinkedIn stops ranking,

I can take it off LinkedIn
and put it somewhere else,

basically. The idea of repurposing is

like, I have a piece of content,
what do I do with this?

It could be a guest post,

could be posted on another domain,
whatever serves my business goals, really.

And my kind of code of ethics is like,

as long as I’m not cheating the user, the
reader of the article, I’m okay with it.

Basically, if I’m not selling some shitty
product, if the advice is the same,

that I would give an Authority Hacker,
et cetera, I’m kind of okay with it.

I don’t think it’s detrimental to the
Internet and to the users, and it’s fine.

So it’s kind of like white hat parasite SEO, you’d call it.

But I mean, play with the current
set of rules, right?

Anyway, let’s go back to the audit

process, because that is
just one of the things.

So how do you actually do this?

How do you actually do this
audit thing, et cetera?

Well, the first thing you need to do is

collect a list of all
your index pages, right?

So there’s multiple ways to do that.

A lot of people will recommend GSC,

but I actually prefer using
the free version of Ahrefs.

Free version. Please don’t leave us negative comments.

No credits necessary.

No credits.
Why?

Because there’s actually a lot of very

useful metrics in
Webmasters Tools from Ahrefs.

I think it’s the free plan, basically.

And so it allows you to not only just get

your index pages, but you can get their
linking root domains,

you can get their top keyword,
and you can get their organic traffic.

You can get some of that in GSC,
but you’re missing the link data,

particularly. Now if you
want to be more precise,

you can take both data sets,

the GSC data and the Ahrefs data and just
do a quick vlookup on the URL level

to bring the GSC data for the clicks,
while you can use the linking root

domains, et cetera, from Ahrefs
and have more precise data, basically.
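If you’d rather skip the spreadsheet vlookup, the same join is a few lines of pandas. The CSV file names and column labels below are assumptions; rename them to match however GSC and Ahrefs label their actual exports:

```python
import pandas as pd

# Column names are assumptions; adjust to your actual GSC and Ahrefs exports.
gsc = pd.read_csv("gsc_pages.csv")        # e.g. columns: url, clicks, impressions
ahrefs = pd.read_csv("ahrefs_pages.csv")  # e.g. columns: url, ref_domains, top_keyword, organic_traffic

# Left join on URL: keep every indexed page from Ahrefs, pull in GSC clicks where available.
audit = ahrefs.merge(gsc[["url", "clicks"]], on="url", how="left")
audit["clicks"] = audit["clicks"].fillna(0).astype(int)

audit.to_csv("audit_import.csv", index=False)  # ready to import into the Notion database
```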

So once you have that spreadsheet,

basically it’s a CSV,
you import that into a Notion template.

This Notion template is
available in the Blueprint.

That’s kind of the benefit for buying

the Blueprint is that you
actually get the Notion template.

So I’ll tell you how you do this,

but if you want it done for
you, buy the Blueprint, basically.

And so you import that data into a Notion

database that essentially
is pre built for you.

And that Notion database
is already pretty smart.

It’s doing a bunch of work for you.

So it’s going to bucket together all

the pages that you should
be considering deleting.

So that’s pages that have
no links and no traffic.

Then, for example, the pages you should consider redirecting: it’s going to look at linking root domains and traffic.

So if you see that you have links but no
traffic, for example, it’d be like hey,

here’s a bucket of pages you
should consider redirecting.

It’s also going to make a bucket of pages
you should consider doing nothing with.

Because obviously sometimes pages are
good, you don’t need to change everything.

And so if you rank like top five for your

target keyword, et cetera,
it’s going to bucket all these pages

together and you can quickly
flag them as like okay.

And that saves a lot of time

with the audit because using these
metrics, it does 80% of the work.

You should still check it.

It’s not perfect,
especially on the deleting side.

Sometimes even if a page has no organic

traffic, it might have like internal
traffic that is not reported in Ahrefs,

for example, a page that explains your

business model, your About page,
things like that.

Don’t delete this for that reason.

And it’s okay to keep
them indexed as well.

So that’s the thing.

Now once you’ve made that database of all your indexed pages in Notion, pretty quickly, using Ahrefs and GSC, you need to start grading your pages like Google would, basically.

So again, we have prepared a big checklist

that is essentially created for every
single page on the page level.

So you have your database in Notion.

Every time you open the page you have

a checklist and you can go through
this and tick what is working?

What is not working.

I’m going to give you a bunch
of elements that are in the checklist.

I’m going to not give you the whole
checklist because that’s part

of the product, but let’s
just talk about some of them.

So for example, and when we were talking

about business decisions,
that’s actually one of them.

It’s like the page targets a keyword
I can realistically rank for.

Again, that means go
and Google your keyword again,

check who’s above you.

Can you beat these people or not?
Or are they way above your DR?

It’s time to be realistic.

For a lot of people, a lot of keywords got closed down. The faster you realise that, the faster you can redeploy those resources, the content and the links pointed at that page, to something that’s actually going to be productive for your business.

Right?

And then there’s like the page
matches the search intent.

So does the format of the article match

the kind of format of articles
you see on page one?

The page ranks well and gets
traffic for intended keywords.

So if not, obviously that’s not working.

And the page generates
good conversions as well.

And I think that’s one that a lot
of people kind of miss on.

So, like, this industry is obsessed with traffic and GSC data, and has very, very little obsession with how much money it makes them.
And so what I want to say is that it’s

quite important that if you’ve been
ranking for a while, if you have

historical data on that page,
you need to go and look back how much

money you’ve made from this or how
many conversions you’ve made from this.

Because while you may be able to rank

for that keyword, if that’s making you no
money, please, please redirect that page,

reuse that content somewhere else,
and actually make it something productive

for your business rather than just
ego inflation through SEO and GSC data.

Because I’ll tell you, for us,

we have periods where sometimes we have
way more traffic than usual,

but our conversions are not as high
and vice versa, where the traffic is not

as high, but conversions
are very good, for example.

So traffic is not conversions.

And you should go look back at that.

And it can be quite difficult
to measure this sometimes.

I know Amazon used to give you a report
where it would actually show you how much

money each page was making,
but they took it away for some reason.

I know other platforms,

other affiliate platforms allow you to do
this if you’re selling your own products.

There are much more advanced tools,

like we use a company called
Segmetrics to do this.

And it actually tracks leads coming in, and then follows them as they go through your sales funnel and eventually buy.

So you can see where the people

who purchased your products,
which organic pages they came from.

Yeah.
You also can use tracking IDs.

Right.

It gets really messy really quickly if

you start having a lot.
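One simple version of the tracking-ID approach is stamping each affiliate link with a per-page SubID so the network’s reports can be grouped by page. A sketch; the `subid` parameter name is a placeholder, since every network calls it something different:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_affiliate_link(link: str, page_slug: str, param: str = "subid") -> str:
    """Append a per-page tracking ID to an affiliate link.
    `subid` is a placeholder; use whatever SubID parameter your network supports."""
    parts = urlparse(link)
    query = dict(parse_qsl(parts.query))
    query[param] = page_slug
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_affiliate_link("https://affiliate.example.com/track?offer=42", "thrivecart-review"))
# -> https://affiliate.example.com/track?offer=42&subid=thrivecart-review
```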

I think Lasso has a free analytics tool as well, where, I think, you can track up to a few thousand dollars a month of affiliate income for free, and then they charge you.

But yeah, check the price.

I don’t remember the price.

I’m not going to say if it’s a good
deal or not because I haven’t checked.

But I know there are tools now
that help you with that for affiliate.

And if you have your own products,

you definitely need some
analytics set up properly.

And that’s kind of like the higher level.

It’s like, is it even worth putting
any kind of effort into that page?

Do I even want to recover that page?

Because quite often
the answer would be no.

And it’s like, I’d rather my writers try

something that might work rather than
work on something that’s sure to not work.

And so this is kind
of like the high level.

Obviously, we need to go back to more
like SEO considerations as well.

The next one I have is: the page is related to my site’s main topic or activity.

I think that’s a big one as well,

because a lot of people,
they do well in SEO and they’re like, oh,

this site is rocking, let’s just
put some more content onto it.

And then they kind of like expand way
beyond the original scope of their site.

That’s what happened to the anonymous
case study that we talked about as well.

He expanded quite a bit beyond his

original scope and as a result,
well, he took a hit.

Kind of dilutes it a bit
really, doesn’t it?

Yeah, and that’s the problem.

I think when you’re growing,
it’s kind of fine to kind of slowly expand

your scope because otherwise you
might run out of topics to talk about.

But when you’re shrinking,
it’s also important to understand that you

need to reduce your scope and go back
to your roots because that’s what Google

knows your site for, that’s what
most of your links are relevant for.

And essentially rebuilding that kind

of core relevancy is very important
for sites to come back, actually.

And so while most people have accepted

the fact that they can expand their scope
when things are going well,

very few have accepted the fact that they
need to reduce their scope when things are

not going well, and to follow the analytics curve, basically, to know whether

you need to recenter on your
topic or if you can expand.

Basically, if you win the last core
update, yeah, maybe you can expand.

If you lose the last core update,

then kind of like recenter
on what you’re doing, right.

The next one is like the content
is up to date and accurate.

I mean, pretty obvious.

But what that means is especially
on competitive SERPs where people are

updating their content, very often the
search intent is a moving target, right?

It’s not a fixed thing.
So it’s like what it was when you wrote

the article versus what it is
today is probably quite different.

And it’s quite easy to take your eyes off

that when you’re not
paying attention to a page.

It’s doing well, it’s rocking, et cetera.

You’re busy with something else,

but that’s the time to kind of relook
at that and be honest about it.

Is the product I reviewed,
is there a new version?

Did they do some updates?

Was there some kind of mass recall for this product that I did not cover?

Or has everyone else just written better

reviews than you now and you
don’t stand a chance?

Yeah, exactly.

It also ties into the ‘page targets a keyword I can realistically rank for’ part.

I would say we saw a lot

of that with helpful
content update as well.

So where there would be ten affiliate

sites on page one, there was maybe
one or sometimes zero on page one.

And then you just had Reddit,

Quora and a bunch of random ecom sites
which also kind of had shitty content.

But another question there. Well, the type of site ranking is also potentially part of the search intent. If it’s all ecom sites ranking, then maybe my content site is not really welcome right now, at least.

Hopefully when it changes
with updates it will be better.

But it’s just time to be realistic
on a lot of these things.

So that’s one of them.

Like up to date and accurate.

Quite often you’ll realise how outdated

your content is as you just
spend time just on this point.

Then the next one, I think, is really one where a lot of people are failing: the page is well formatted and looks good on all screen sizes. Emphasis on all screen sizes.

It’s like most people build their site
on desktop, they look at their site

on desktop, they never experience their
site on mobile, and it’s a shit show.

You need to change the size of your fonts on H2s, et cetera, on mobile. You need to do all these things.

You need to have set up all these rules on
your site because otherwise it’s not good.

It’s not good enough,
the spacings, et cetera.

You need to set all this stuff up,
otherwise you don’t have a good experience

on mobile, for example,
and Google is mobile first now.

There’s no wall of text as well, and no wall of text on mobile, which is very different from a wall of text on desktop, because you can fit something like six mobile screens on one desktop screen.

And so what that means is while the page
might look good when you browse it in your

browser, if I open it on my mobile,
it’s all like shrunk horizontally.

Holy shit.
You could be scrolling like four or five

screens of words and words
and words and nothing.

And this is how people drop off your page.

Like when I was talking about Google
looking at average time on site versus

sites that have been deemed helpful,
that’s one thing that could potentially

mess you up with these helpful
content updates, et cetera.

So it’s like evaluating
that for all screen sizes.

So that means: right-click on your page in the browser, hit Inspect, then click the device toggle in the DevTools toolbar, and you can set responsive mode or even select a device.

So I usually take the iPhone 12 Pro, I think, and it just sizes the page like it’s an iPhone 12 Pro, right?

And it’s like I see and I’m scrolling

through and I’m kind of looking at the
experience and most people don’t do that.
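You can also automate that spot check across your whole audit list instead of eyeballing pages one by one. A sketch using Playwright’s bundled device profiles (it ships an “iPhone 12 Pro” descriptor); it assumes `pip install playwright` followed by `playwright install chromium`:

```python
from playwright.sync_api import sync_playwright

URLS = ["https://example.com/old-review/", "https://example.com/pillar-guide/"]

with sync_playwright() as p:
    browser = p.chromium.launch()
    # Emulate a real phone viewport instead of only checking on desktop.
    context = browser.new_context(**p.devices["iPhone 12 Pro"])
    page = context.new_page()
    for url in URLS:
        page.goto(url)
        name = url.rstrip("/").rsplit("/", 1)[-1] or "home"
        # Full-page screenshots: scroll through these to spot walls of text and broken tables.
        page.screenshot(path=f"mobile-{name}.png", full_page=True)
    browser.close()
```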

And I’m telling you,
I think that’s a big reason a lot

of people got messed up and a big,
big reason why large sites are ranking

well because they actually put
effort into their mobile experience.

Quite often you go on a Forbes page,

it looks very different
on mobile and desktop.

Like you’ll have more collapsible
parts on mobiles, right?

They’ll have this accordion thing that

collapse on mobile and then
expand it on desktop, for example.

And as a result you see
a full article on desktop.

But you go on mobile it’s like you can
actually scroll the page in like three

swipes and just expand
the section you want, for example.

So that’s the kind of like mobile UX stuff
that is becoming essentially search intent

as it’s correlating
with sites deemed helpful.

It also ties back into when you’re creating content. I know I hammer on a lot about fluff content, and especially intros, which we’ve talked about a lot, being super long and just unnecessary.

So it’s even more so on mobile.

People search for something,
you just want the answer.

So give it to them
and give it to them fast.

The time to value is
super important there.

Yeah.
And just the formatting,

like a lot of these comparison tables,
they look shit on mobile and so on.

We had several members, AH Pro Platinum members, talking about this process.

And I had many members come back to me.

I opened their site on the call.

I’m like, I’m sorry,
but it’s just not great.

And then they came back
to me and they fixed it.

It helps them identify this stuff.
Basically.

Another one I have is the page is not over
monetized or promoting shady companies.

Now the shady companies one is

an interesting one because it’s funny how
the world of SEO is very,

very excited about the idea of semantics,
entities, et cetera,

when it comes to optimising your page,
but not so much when it comes

to the companies you’re mentioning
in there that might have negative

sentiment associated with them,
bad reviews, et cetera.

If you are the person that pushes this

stuff, if Google is using all the semantic
stuff to evaluate your pages,

they probably use that semantic stuff
to evaluate the reputation of the people

you’re associating yourself with as well
through your entities.

So, first of all, sites that have lots and lots of affiliate links tend to have done worse recently. Kind of tone it down.

Same with ads, basically.

And I think sites that have been

essentially associating themselves with
shadier companies – unless they’re large sites –

like you’ll find an Outlook India article

promoting the shittiest
diet pills right now.

And maybe ranking. Outlook India is not doing so hot recently, but still, you get the idea. Being a large site forgives you all your sins, basically.

But if you’re not a large site,
it’s important basically.

And so I want to address that because

people are going to come back
in the comments like, oh,

but I see this large site ranking
for this keyword, et cetera.

Yes, that’s true.
That’s how Google works right now.

They need to fix that.

There’s a problem, but it doesn’t change

the fact that that’s how they judge
your smaller site, potentially.

So that’s pretty much like
grading your page, right?

So you go through this checklist
and you have a grading system.

You grade your page the same way Google would, basically.

And so you’re able to see which pages are
below your seven and then that tells you,

like, look, I need to do
something about that, right?

And that’s exactly the next thing.

It’s making decisions in the process.
Right.

So once you’ve gone through the checklist,

it helps you have a more honest view of the page and see its shortcomings.

It’s kind of like a conscience check.

That’s why I wrote in the notes, like,

everyone thinks their content is
great until they look more closely.

It’s true.

And it was the case with the
anonymous case study as well.

Right?
We had a call with him and I think most

people would say his site is pretty good,
even though he got affected.

Before we looked closer, I thought it was great.

Then I looked closer and I was like, man, your reviews could actually be quite a bit better.

It’s missing a lot
of information and so on.

It’s not that up to date.

And that helped him kind of take a step

back and be like, oh shit,
I need to do something about it.

Maybe I was great at one time, and now maybe I’ve kind of fallen behind.

Yeah, that’s the thing.

A large part of this is competition
and content quality especially.

It just gets better over time as people level up, adapt, learn from what other people are doing...

Or they copy your content.

Exactly.

While you were the best in the world three
years ago, it’s often not the case today.

Yeah, I call it content inflation.
Yeah.

It’s a good way of describing it.

Sometimes I go back, I know you don’t like talking about it, but sometimes I go back and check Health Ambition, and I’m like, shit.

Like GPT 3.5

is like way better than
what we had on there.

And it’s like, that’s the perfect
example of content inflation.

Right?

You go back to a site you’ve built like

seven, eight years ago, and I agree,
AI content is better than that.

So essentially those are the gaps in the market you can take with AI content, because it definitely does a better job.

But, yeah, so it’s like these
things, they’re not forever.

Basically, nothing’s forever
except diamonds, I guess.

But anyway, once you have gone through that checklist: we made an interactive flowchart that essentially decides the best path for your page.

So again, I’m not going to give the whole
flowchart because essentially it’s

the course, but I’m going
to give you an example.

Right?
So we’re going to go through one.

I put the screenshot on the page and we’ll go through it together.

So one of the first questions is: is the page older than six months?

And you just answer yes or no, right?

If no, we tend to tell people, like,
look, give it a little bit of time.

That page probably needs to age a little

bit before you decide to slash
it or do something with it.

Then it’s like, does the page rank
top three for its targeted keyword?

You can say yes, you can say no.

If you want to know what’s behind yes,
you’re going to have to buy the Blueprint.

Then it’s like, is the keyword unique

and important to you and can
you realistically rank for it?

And that’s where you essentially
go back and Google the keyword.

You see who’s ranking and you’re like,
do I have a chance?

Can I actually rank?

Let’s say in this case we say no, right?

If you say yes, it might be different.

But if you say no: does the page have valuable links pointing to it?

Because we know it can’t rank for its unique keyword, basically.

But there might be salvageable
parts that we can go after here.

And this is an interesting point,
actually, because a lot of people,

they will leave an article that doesn’t rank and is out of date, so it’s wasted, because it has links and they want those links for their domain, essentially.

Even though it’s not helping that page.

But, man, redirects are so powerful. They work so well, actually.

Maybe I’ll give an example of a redirect after.

So in this case, I put yes, there are links pointing to it.

So it’s interesting.

It’s like, is this page valuable
for your business outside of SEO?

And the reason I’m asking that is

sometimes these pages are still worth it if they convert really well.

Like, imagine you have a squeeze page,
for example, right?

It doesn’t rank for a unique
keyword or can’t rank for anything.

It has links pointing to it because people

have been promoting it or
you’ve been promoting it online.

I’m not going to cut my squeeze page
to just salvage some links.

However,

I could take the content of that squeeze
page, put it somewhere else,

and redirect that link with all
the backlinks somewhere else.

It depends on how the traffic
comes in, basically.

But you can be a bit sneaky.

That’s actually a tactic that we’ve done.

So, for example, I think we had
an old blog post from Perrin, right?

Perrin’s case study on hip hop,
his website that he built and sold, right?

This page had like 100
plus linking root domains.

But it’s a great story for Authority Hacker.
I don’t want to get rid of it.

It’s kind of like part
of our legacy, et cetera.

It’s interesting.
So what we did is we took the content

of that page and we just put it
on a new URL that has no links.

And we took the original URL with the 100 linking root domains, and we pointed it at a page that actually has a chance of ranking.

So we kind of get, as we say in French, the butter and the money from the butter,

which is essentially
we get to use the links more efficiently

while keeping the legacy page
and the content we want on the site.

I think the English saying is
have your cake and eat it too.

Okay, well, pretty similar, except for us,

we got too lazy, so we just ate the butter.
Anyway.

So that’s why we’re asking that.

But anyway, to go back to the flowchart: is the page valuable outside of SEO?

If you say yes, it says,

consider reposting the content
on a different URL and redirecting this

URL to a page that competes in SEO
to better utilise links pointing to it.

So that’s exactly what I was describing.

If no, redirect the page to a more
valuable page, unless you think it’s

valuable to your business and gets
traffic from other sources.

So you can see how the flowchart works.
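For illustration, here’s a rough Python sketch of just the branch walked through above; the real flowchart has many more branches, and the paths behind the “yes” answers aren’t covered here.

```python
# A rough sketch of the single flowchart branch walked through above.
# Only this example path is encoded; the other branches (and anything
# behind the "yes" answers) live in the actual Blueprint flowchart.

def decide(page: dict) -> str:
    if not page["older_than_6_months"]:
        return "wait: let the page age before deciding"
    if page["ranks_top_3_for_target_keyword"]:
        return "yes-branch not covered here (see the Blueprint)"
    if page["keyword_unique_and_realistically_rankable"]:
        return "yes-branch not covered here (see the Blueprint)"
    if not page["has_valuable_links"]:
        return "branch not discussed in this walkthrough"
    if page["valuable_outside_seo"]:
        # repost the content on a fresh URL, then 301 the old,
        # link-rich URL to a page that actually competes in SEO
        return "repost content on a new URL, redirect this one"
    return "redirect to a more valuable page (unless it gets non-SEO traffic)"

print(decide({
    "older_than_6_months": True,
    "ranks_top_3_for_target_keyword": False,
    "keyword_unique_and_realistically_rankable": False,
    "has_valuable_links": True,
    "valuable_outside_seo": True,
}))
```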

Like, you get to look at your checklist,
you see all the shortcomings of the page,

you have graded your page,
and then you make a decision based

on looking at this information
and using the flowchart.

So we have basically mapped out all

potential scenarios in there, and if we
get more feedback, I’ll keep updating it.

That’s kind of the beauty
of a Notion template.

Yeah, that’s pretty much the flowchart.

So once you get there,
you make a decision, you go in your Notion

template and you decide like,
do I delete this, do I redirect this,

do I update this,
do I leave it as is, et cetera.

But the thing is, making a decision
doesn’t fix your website, right?

It’s not going to do anything.

You just have a Notion template
with a decision assigned to a URL.

Google doesn’t give a shit.

What matters is you acting on your

decisions and doing it
fast and efficiently.

So that’s why we’ve actually built a full task system into our Notion template.

So when you work on that page, when you’re immersed in the decision, the content, everything, you’re into it.

You don’t have to reread the page
like six months later to do something.

You can actually write down all your tasks and then just queue them up, and they’re associated with your URL and with all the metrics.

So we built a task system that lets you see which pages each task is associated with, which tags, and which type of task it is.

So for example, you can tag tasks as 301s and do all your 301s at once, whereas most people do their 301s in their SEO plugin.

Terrible mistake.
Why?

Because your SEO plugin is actually going
to load your website and then redirect.

It’s very slow, basically,
and resource intensive.

So when Google follows it, it’s heavy on their resources and you’re eating into your crawl budget.

You’re much better off using Cloudflare’s mass 301 redirect manager, which everyone can use.

It’s free, and it’s on Cloudflare.

And we show you how to do
that actually in the Blueprint.

And so essentially you can take all your tasks tagged as 301 and do them all at once in the Cloudflare mass redirect manager.

You tick all these tasks off at once because they’re grouped together, and bam, that part of your audit is done.

Same for the updates, same for everything.

So it makes essentially acting
on your decisions a lot easier.
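As a small illustration of that batching idea, here’s a hedged Python sketch: it pulls every task tagged 301 out of a CSV export of the task database and prints source,target pairs you could enter into Cloudflare’s bulk redirect manager. The file name and column names are assumptions for illustration, not the actual template schema.

```python
# A hedged sketch of batching the 301 tasks: read a CSV export of the
# task database, keep only tasks tagged "301", and print source,target
# pairs ready for Cloudflare's bulk redirect manager.
# The file name and column names are assumptions, not the real schema.

import csv

with open("notion_tasks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("Tag") == "301":
            print(f'{row["Source URL"]},{row["Target URL"]}')
```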

And that’s what I was saying.

I don’t think we reinvented the wheel
here, but we’ve built like a nice,

sleek process that allows you to actually
get this stuff done in less time,

less effort, and essentially make
potentially less – good decisions.

That’s why the flowchart

and the checklist, et cetera, are here,
is to help you not mess it up,

because if you mess it up, you can
really mess up your traffic as well.

You said less good decisions.

You mean less bad decisions, I’m sure.
Less bad decisions.

Yeah.
No, I don’t want to oversell it.

No, that’s good.
Yeah.

So that’s essentially the whole process.
Right?

And the value added is like

the checklists, the Notion templates,
it’s the decision flowchart,

the task system to make sure
everything gets done, et cetera.

So it’s streamlined basically.

That’s kind of how it works.

And to be honest,

every single site that has recovered
from a large update has followed some kind

of a similar system
ever since Google Panda.

I think that’s when this
kind of came out originally.

But the problem is people talk about this but don’t really solve the practical problems of executing it, basically.

And I think that’s what we
try to do with this Blueprint.

A couple of questions.
So how often should you run this process?

Ideally, I’d say once a year.

We frame it as being for people who got affected by algorithm updates.

But ideally you do this
before you are affected.

Right.
Because keeping your site clean is

the most reliable way to make
sure this doesn’t happen to you.

It’s not for sure.
Like nothing’s for sure with Google.

And I don’t want to be that guy
that’s like, look, this is perfect.

It’s going to work every time, et cetera.

It’s not even for sure going to recover

your website because we’re
not in control of Google.

I can just tell you historically that’s
what people have done to recover websites.

But personally I would run it every year.

Just pick a month of the year.

I like January, kind of like do a big

cleaning in January
of your site every year.

Potentially update your title tags
with the year, et cetera,

update your publish dates,
do all these things that kind of boost up

your rankings anyway, while you’re
kind of auditing the website.

And that’s a really good
housekeeping exercise to do.

And we are going to keep doing that every
year on the projects we work on

actively, actually.

And if you’re running a newer site,
is this something you need to do quite

quickly or at what stage does
this become kind of relevant?

Maybe after two years or something?

I don’t think I would do
it immediately, obviously.

What’s the point of auditing something brand new?

New sites kind of obey different rules in a way.

They need some time to establish
themselves, et cetera.

And then if they do the things right,
they will experience growth anyway.

And it’s like you kind of need to see what

grows, what doesn’t,
what sticks, what doesn’t.

And you clean up once a year.

So I would say for probably the first two years I wouldn’t bother too much, and then after that I would slowly start.

But you can kind of be milder in the earlier years of your site.

And as your site gets older and its pages age, you’re going to have to be a lot harsher on these audits, basically.

And how critical do you think it is

that you, the site owner,
the business owner, the SEO,

make these decisions versus maybe
outsourcing some of this work? For us,

I didn’t make all the decisions, for example; we have someone who works on SEO, but I reviewed everything.

It was kind of pre-done.

So I had someone prepare the buckets, like: hey, here’s what we want to delete, here’s what we want to redirect, et cetera, based on the rules I set.

I think it’s important to review it
because this SEO person that does it

for you, they might not understand which pages are important to your business.

And sometimes a page that doesn’t get much

traffic is extremely
important to your business.

And they’ll delete it.

If they’re not careful, they’ll delete
it or they’ll redirect it, et cetera.

It’s not good.

So it’s like there’s a degree of knowing

the business, not just
knowing SEO, to doing this.

So I don’t think you
need to do everything.

I think you can definitely pass this
on to someone else to essentially do up

to the decision part and then go
through this yourself and approve it.

I think that’s kind of like if you don’t

have time to do this because it can be
quite time consuming,

especially for large sites,
I think that’s the best way to do this.

The point of the template and the system
is you can pass this on to someone

and review it in a convenient format
in the Notion database, basically.

And if your site has 10,000 pages, are you literally going through all 10,000?

Sounds like it’s going to take a long time.

I mean, it’s not as bad because

the template kind of like
buckets things for you.

So you’re very likely to have lots
of pages with like no traffic, no links.

It’s just going to bucket these and it’s
like, yeah, you see the URL,

you see the page title, and you see no traffic, no links.

And then you kind of tag them, really.

I select 10 or 15 in a batch or something, and I just flag them all as delete.

For example, if you have thousands

of pages as well,
you tend to have categories and you might

make a decision across a category,
like we did with the funnel category,

just like, okay, well,
let’s get rid of the whole thing here.

And it’s quicker than going
through them all one by one.

But the buckets help a lot, like the automatic bucketing of pages through the Ahrefs and GSC data.

It saves so much time because once it’s

there and you see all these metrics
in one dashboard, quite often,

yeah, I was selecting chunks of 20 pages for redirect or deletion or whatever.

Most of the time the bucket was correct, and then maybe 20% of the time I would make a different decision.

So it still takes time,
but it saves time to do it that way.
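To give a feel for what that automatic bucketing is doing, here’s a minimal Python sketch; the thresholds and field names are illustrative assumptions, not the template’s actual logic.

```python
# A minimal sketch of the bucketing idea: group pages by their GSC and
# Ahrefs metrics so the obvious cases can be batch-flagged.
# Thresholds and field names are illustrative assumptions.

def bucket(page: dict) -> str:
    if page["organic_traffic"] == 0 and page["linking_root_domains"] == 0:
        return "likely delete (batch-flag these)"
    if page["organic_traffic"] == 0:
        return "no traffic but has links (redirect candidate)"
    return "review individually"

pages = [
    {"url": "/old-post", "organic_traffic": 0, "linking_root_domains": 0},
    {"url": "/squeeze", "organic_traffic": 0, "linking_root_domains": 40},
    {"url": "/guide", "organic_traffic": 1200, "linking_root_domains": 35},
]

for page in pages:
    print(page["url"], "->", bucket(page))
```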

So how does this then impact this concept
of topical authority where we want to be

covering everything to do
with our niche or our industry?

I mean, are we going to be deleting half

the pages and wrecking
that or what’s the plan?

Yeah, it’s interesting, right?

On one side you have people
that essentially want to add more pages

on their site, and then on the other side,
I’m telling you, delete a bunch of stuff.

I think that these
concepts can live together.

It’s really like: whether you want to believe in topical authority, that’s fine.

The question is, what is the quality
of the execution of that content?

Right?
So if you’re doing low quality content,

then potentially your pages
get low quality score.

And despite the fact that you’re covering

a lot of topics, your site
is going to get spanked.

And honestly,

you can find plenty of examples of that
following the recent updates, right?

I don’t think topical authority
is a way to protect your website.

However, if you execute your content

properly, it’s well maintained
and everything, and you use a process like

this one to make sure your page
goes well with everything Google’s looking

for, et cetera, then I
think it’s completely fine.

I think it potentially works together.

It’s like, I’m not saying make only tiny sites, I’m saying don’t bite off more than you can chew, and make sites that you can actually maintain properly.

Because I think that matters
a lot for Google.

So a good example is Healthline, right?

Healthline is a big health site, lots of pages, et cetera.

But all the pages are executed at a pretty high level.

Google rewards them,
they’re doing very well, right?

But then you can find many more bloated
sites that have lots of pages but not

executed well that get
tanked in the updates.

I think part of the reason for that is that it’s easy to create content and to build processes to do it, but it’s much harder to know when to stop so that you can maintain that content at the best-of-the-best standard that’s necessary across your entire site in order to actually realise all these gains.

And so, yeah, that’s the challenge.

It’s not just like building the site,

it’s maintaining it and lasting over time,
which is where you make the real money.

Right?

It sucks to climb all the way there just to fall off when you actually get there.

You need to last long enough to actually make your money back.

So potentially the challenge of building

large sites is maybe more than
what people thought it was.

Right, because it’s one thing to get
there, it’s another thing to stay.

But you can absolutely have your topical
map, do all your topics, et cetera,

but once a year, go through all your
content and make sure it’s still in line.

Potentially that improves your topical
authority because you’re looking at what’s

ranking, you’re looking at what new topics
are associated with your pages, et cetera.

You re-add them in there and that helps you out.

So it’s like, I think topical authority,

it’s not necessarily a bad concept and it
potentially is how Google works,

but it has pushed people to produce
lots of low quality content and bloat their sites.

It’s the execution of it.

Like if you lower your standards too much, other factors can come into play.

Exactly.

So while technically it may be how this works, I don’t think it’s necessarily a good framework for people with very limited resources, because they should focus more on creating fewer good pages than many low quality ones.

When you have a larger amount of resources, I think it comes into play.

It’s a little bit more
practical, basically.

So that’s how it works.

I don’t think these are particularly
opposed strategies,

but quality of execution is
often lacking in this industry.

And this audit process helps open your eyes to where your execution quality is potentially lacking.

It did on ours.

And quite often you’ll realise that when you do the audit and it buckets your pages.

For some sites, like 80% of the pages are going to land in the delete bucket, and that raises questions, right?

You just spent all this money creating
all this content you’re about to delete

if you follow this process.
Is it really worth it to keep doing the same thing, or should you actually change the way you create content?

Is there a better way to do this

that potentially gets you
better results down the line?

So that’s going to open up all these kind
of philosophical questions on how you

create content and how you do things,
et cetera, which is a good thing.

It’s like a good questioning once a year,
are we doing the right thing?

Is this actually helping us, or should we change direction and re-optimise things?

It’s really a healthy thing to do
in a content business,

and this process will help you do that,
which is another perk of doing this.

I think that kind of plays into the kind

of long term results here,
because we’re not saying, hey,

go do a site audit and you’ll
get full recovery from HCU.

That’s just not the case.

And the thing as well is, you often need to wait for another update.

Like quite often you do these audits and it’s like, don’t expect results tomorrow.

I’m not going to say that either.

Quite often you need Google to press

the button again and then
potentially you get a recovery.

And it’s not guaranteed either.

What I can tell you is everyone who has recovered has done something similar.

That’s the truth.

But in general, it’s going to raise questions.

Usually after this process, I’m firing people, pointing fingers at who did the bad stuff, et cetera, and identifying things that could be done better.

And I think it’s just a healthy thing
to do rather than just keep going as we’ve

always done, not change anything
and then bam, one day -80% traffic.

Honestly, we’ve been guilty
of that on previous sites, right.

It’s happened to us, and I think this process helps prevent it.

It hasn’t happened to us on Authority Hacker because we’re a bit more proactive on this and we do this before it goes bad, basically.

Yeah.
I mean, that’s the process.

Again, I know this corner of the industry

can be a bit shady and people will
promise you a lot of things, et cetera.

I don’t want to make these promises,

but I want to tell you that this is
a healthy thing to do for your site.

This is something that people
who recover actually end up doing.

And what we’ve done is we’ve streamlined
the process and made it easier, basically.

And the best thing if you’re an AH Pro
member is you can actually get this

Blueprint right now
in our new members area.

It’s available right now.

So head on over to members.authorityhacker.com if you’re an AH Pro member.

If you’re not an AH Pro member, first,
what are you doing with your life?

And second, you can pick up this Blueprint a week from today.

So that’s going to be on Monday, 18 March, and it will be available as

part of a bundle of Blueprints
and resources and bonuses.

Do you want to tell us a little bit
more about what’s included, Gael?

Yeah, so I’ve made a module
on technical audit as well.

So I’m still using the Ahrefs free site audit tool, because it actually surfaces a lot of technical issues if you have them.

And I’ll give you an example: a member whose URLs without a trailing slash were not redirecting to the versions with a trailing slash.

And he was getting links to both.
Right.

It’s not too bad when
you have a rel canonical.

Like, rel canonical is supposed to fix that basically, but it’s still not great.

Essentially your links pass through something a little bit weaker than a redirect.

And so this helped him identify these kinds of issues, and I told him how to fix it on Cloudflare in the member area.

Essentially it helps you cover some ground that a content audit will not cover.
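If you want to spot-check the trailing-slash issue on your own site, here’s a quick hedged Python sketch; the URL is a placeholder, and this is just a sanity check, not the Blueprint’s process.

```python
# A quick sanity check for the trailing-slash issue: does the non-slash
# version of a URL 301 to the slash version? Placeholder URL; requires
# the third-party "requests" package (pip install requests).

import requests

urls = ["https://example.com/some-page"]  # non-slash variants to test

for url in urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    if r.status_code in (301, 308) and location.endswith("/"):
        print(f"{url} -> OK, redirects to {location}")
    else:
        print(f"{url} -> check this one (status {r.status_code})")
```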

And we’ve included
the EEAT Blueprint as well.

Again, we’ve talked about EEAT
quite a lot in this episode.

Originally I was not a believer in EEAT,
and now I think EEAT is a factor through

correlation, just because essentially
the sites that have been deemed as helpful

by quality raters have these
kind of on page factors.

If you correlate with these sites,
you’re more likely to be deemed as helpful

by a machine learning algorithm
and therefore implementing a bunch

of these things is
a good idea on your website.

Is this going to fix your
website from HCU on its own?

I don’t think so.

I think you need to do
the content audit as well.

I think you need to look
in your tech as well.

And so that’s why we kind of made

that bundle because it kind
of covers most grounds.

And there’s actually a bonus as well
where I audit a bunch of pages as well.

So the Blueprint shows the process
and walks you through it.

And I just recorded an extra video where I go through a bunch of pages that were maybe challenging in audits we’ve done, things like that, and kind of talk through why we did things and why you might consider them, so that we can get a little bit more in the trenches, doing it in real life on real sites, basically.

So that’s pretty much the bundle.

So this is going to be available
for just a few days from next Monday.

So make sure you’re
subscribed to our email list.

Go to AuthorityHacker.com/subscribe to do that if you want to be notified of this.

And it’s worth doing because there will be
a big discount as well if you’re on there.

So highly recommend that.
Alright, cool.

Well, that’s pretty much the episode. Whether you buy the Blueprint or not, I hope this was helpful and made you think about how to manage your website, et cetera.

We always want to provide value on this podcast. It’s not just a sales fest. So hopefully that was good.

If you enjoyed it, don’t forget to like, subscribe and let us know what you thought of it in the comments.

And let us know what topics in the podcast you’d like us to cover as well or what guests you’d like us to have. We’re always looking to kind of expand the podcast. I think the interviews have been really good lately. I’ve been really happy with it. I’ve learned a lot personally. I know the last one I did on site speed was pretty nerdy, but I kind of enjoyed that.

So yeah, pretty much, let us know in the comments.

Hope you enjoyed it and we’ll see you in the next episode.