August 2024 Google Core Update – News, Opinion and Analysis

Overview

  • August Core Update analysis
  • Interview with Danny Sullivan, Google Search Liaison
  • Moz Brand Authority Study
  • Mediavine banning AI content
  • WordPress Security Updates
  • Google Spam Score Discussions on Reddit

Resources

We have created an unbranded slide deck for you to present to your team, which you can download here.

If you want this delivered to your inbox each week in PowerPoint and Google Slides formats, subscribe to our newsletter.

Intro

August has been a very busy month in the world of SEO consultants, with Google updating their algorithm using the new ranking systems that were rolled out in the March 2024 update.

This blog post is dedicated to analysing the August core update and the Search Engine Roundtable interview with Google Search Liaison Danny Sullivan, to give us a better understanding of what is at play and how best to carry out our optimisations to future-proof our clients’ websites.

Please note that this blog post contains our SEO agency’s analysis and opinion. No one knows the exact mix of ranking factors in the Google algorithm and, to quote the hilarious Spurious Correlations website, correlation does not equal causation. In this blog post we will seek to read between the lines of the Googlers’ word salad and attempt to codify their advice into actionable steps.

The interview with Danny Sullivan, Search Liaison

Source

Preamble and disclaimer

Interviews with Googlers are interesting in that they also don’t truly know anything about the algorithm; rather, they have a PR-friendly, abstracted understanding of it. This makes Google representatives very similar to SEO consultants who spend a lot of time on Twitter: high on head knowledge and low on dirty hands.

At Type A we are in the unique position of having access to client Search Console accounts with over 10m queries registering more than 10 impressions per month. Although this sample size sounds large and impressive, it falls victim to the same issues as any of the other studies: the data is heavily skewed and biased towards particular countries and industries, and it is not a large enough corpus to derive many meaningful correlations that are globally applicable.

That being said, we are a professional SEO agency managing enterprise accounts, so we apply a high degree of analytical rigour to the data we do have and use the findings to extrapolate some potential correlations. We then add a sprinkle of our 12 years of SEO experience and a little personal opinion to formulate the day-to-day optimisations that have been driving success for our clients.

Interview point 1: content creation for plumbers

Halfway through the interview, Barry Schwartz asks about a plumbing website that has all the right content, all keyword gaps filled and plenty of articles in its help section to show its expertise. Sullivan responds with,

“a lot of the traffic that the local plumber was getting probably wasn’t even something that was converting to them.” In contrast, if that plumber shared really personal and professional stories about plumbing issues in their local area, that would be more something Google would want to reward.

Firstly, having recently dealt with a plumber to move a radiator in my house, I had a little cynical smile at the thought of Dave the plumber sharing his personal anecdotes around plumbing.

“I got up, had a coffee with 5 sugars, went to the client’s house, dropped off a log in their bathroom and then picked my favourite radio station to blast out 80s bangers whilst I ripped their living room to pieces. I didn’t have the right parts in the van so I turned off their water for the day and left at 3pm to attempt to fix it tomorrow.”

Now clearly, my plumber is a modern-day Shakespeare, so he wouldn’t have any trouble writing up his personal experiences on his company blog, but for the rest of us, we need to dig into what Danny is getting at and work out some specific actions.

Decoding interview point 1

It appears Danny is alluding to E-E-A-T (Experience, Expertise, Authoritativeness and Trustworthiness); in particular, he is focusing on the experience part.

As any copywriter is able to write up an article about “fixing a leaky tap”, it’s hard to work out what “real, high-value” content actually is in this scenario. From a keyword point of view, all the boxes are ticked, so assuming all the plumbers have roughly the same domain authority, how does Google decide what to rank? It ranks the article that shows the most “experience”.

So, how do you show “experience” as a set of algorithmically defined principles that are easily measured? You can do things like the following (a minimal schema sketch follows the list):

  • Add an author bio
    • Link to the author’s social profiles
    • Link to the author’s professional certifications
    • Use the Schema sameAs property to connect the dots
    • Link to the about us section if the author is also the owner or a key employee
  • Add quotes from the plumber with quotation schema
  • Quote the customer
  • Add original imagery with EXIF data still intact
  • Share these pages to social
  • Write in the first person (say “I did this”, not “we” )
  • Talk about your unique approach, saying what you don’t do as well as what you do, and show how you use your discretion to deliver specific services
  • Go into a weird amount of detail about specific things you do – if you prefer the ABTECH2000 blow torch to the ZAPYY5000, say that! As content production relies more on AI, having unusual entities in the text that don’t exist in other documents may be a way to differentiate and show true experience and expertise. Take it a step further and write and link to another blog post where you review the two items you are talking about.
  • Internally link similar content types together and deliberately exclude some pages from internal linking if they are not semantically related
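
To make the author bio and sameAs points above a little more concrete, here is a minimal sketch of Article and Person markup. It is illustrative only: every name, URL and job title is a placeholder, and your CMS or schema plugin may already generate something equivalent.

```python
import json

# Illustrative only: the author, URLs and certification body below are placeholders.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How We Saved Mrs Jones' Basement from Flooding",
    "author": {
        "@type": "Person",
        "name": "Dave Example",                              # placeholder plumber
        "jobTitle": "Heating Engineer",
        "url": "https://example.com/about/dave",             # about us / bio page
        "sameAs": [
            "https://www.linkedin.com/in/dave-example",      # social profile
            "https://www.examplecertbody.org/members/dave",  # professional certification
        ],
    },
}

# Paste the output into a <script type="application/ld+json"> block in the page head.
print(json.dumps(article_markup, indent=2))
```

The same connect-the-dots pattern can be extended to the plumber and customer quotes mentioned above, marked up in whatever quotation format your schema tooling supports.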

Workflow to get your clients to actually do this

  • Get original images
  • Get quotes from the client
  • Do an entity audit to plan content production outside of the usual gap analysis
  • Enhance pages to make them shareable on social (write about “How We Saved Mrs Jones’ Basement from Flooding” instead of “Fixing a Burst Pipe in a Basement”)

I recommend an entity audit. After you have done your keyword research and know the terms you need to make new pages about, ask your client for the various methodologies, tools and techniques that they use. In the case of the plumber, are there specific brands of tools they use? Do they specialise in one form of plumbing, like heating? Do they have any opinions on brands of boiler or radiator types? Do they have different approaches depending on the type of house?

This approach may feel a little unwieldy, or only applicable to some industries, but that is not the case. We can apply it to a white-collar business like consulting. What school did they go to? What’s their personal productivity setup? What are their must-read books? What cases have they seen that they think are examples of well-executed business transformations?

Interview Point 2: Why did I drop from position 1 to position 2?

In the interview Danny gives an “it’s not you, it’s me, babe” answer when Barry asks about core updates knocking you off the top spot.

Sullivan says,

“If you move from first to second, that can be a notable traffic impact. That’s what happens. It doesn’t mean that we don’t like your content. We clearly do like your content. That’s why you’re in the top results. But it’s going to be hard for you to then regain all that traffic back because of something else ranking higher, which is still useful to people as well, and overall if everything is useful to people on search, then overall everybody gains.”

Having seen some clients obsessively check rankings and suffer a minor pulmonary embolism when things move in the wrong direction, this particular point hit home quite a bit.

I read that quote, hear Bob Dylan’s “It Ain’t Me Babe” and see the “you vs the guy she tells you not to worry about” meme. On the surface, it’s a little defeatist. It says, “the algorithm has moved someone higher, accept your new place in life, peasant!”

Decoding Interview Point 2: CTR

If we try to apply some algorithmic reasoning to why this might be the case, we can confidently say that machine learning is at play.

SEOs have long thought that clickstream data is being used, with the famous study by Rand Fishkin indicating this might be the case, and the recent Google API leak showing conclusively that it is the case – and that our Google reps have been very selective in their wording to deflect us from the truth, presumably to discourage CTR manipulation.

As it’s very likely that the top spots are controlled by user interaction with the results, what can we do to increase the number of clicks coming through from the search results to keep us on top?

Workflow to improve CTR

  • Optimise page titles to be clickable, not just contain the keyword
  • Optimise meta descriptions to be enticing and clickable, and be aware of the multiple ways Google decides on titles in the SERPs
  • Optimise Open Graph tags now that they can also be used as a title – make them a bit more “engagement-bait-esque” (see the sketch after this list)
    • SEO title: How to Fix a Burst Pipe in a Basement
    • Open Graph title: Saving a Family Home from a Flood by Fixing Burst Pipes
  • Add schema
  • Optimise intro fragments of text in your page copy
  • Add internal anchors to content
  • Judiciously use header tags for proper page structure
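
As a rough illustration of keeping the SEO title, meta description and Open Graph title doing three different jobs, here is a minimal sketch. All titles, descriptions and URLs are made-up placeholders, and how you actually output the tags will depend on your CMS or templating setup.

```python
# Minimal sketch: one page, three different "titles" doing three different jobs.
# All values below are illustrative placeholders.
page = {
    "seo_title": "How to Fix a Burst Pipe in a Basement",
    "meta_description": "A step-by-step account of how we traced and repaired a burst pipe before it flooded the basement.",
    "og_title": "Saving a Family Home from a Flood by Fixing Burst Pipes",
    "og_image": "https://example.com/images/burst-pipe-repair.jpg",
}

head_tags = "\n".join([
    f'<title>{page["seo_title"]}</title>',
    f'<meta name="description" content="{page["meta_description"]}">',
    f'<meta property="og:title" content="{page["og_title"]}">',
    f'<meta property="og:image" content="{page["og_image"]}">',
])

print(head_tags)
```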

Interview Point 3: Changes to the ranking system

Sullivan mentions the change in their ranking systems quite a bit in this interview, and there are various breadcrumb clues in all of Google’s correspondence since March 2024 that the way they are evaluating pages is changing quite a bit.

Sullivan mentions this in the interview,

“And our ranking systems are different, and among other things, our ranking systems are also rewarding other kind of content too, including forum content and social content, because that’s an important part of providing a good set of diverse results”

If we are to read the changelog in the Quality Rater Guidelines, we can see the inclusion of Experience in December 2022:

“Added new ‘Page Quality Rating FAQs’ to clarify how Experience factors into PQ rating; minor revisions to other FAQs for consistency”

And since then, Google have also reported that they are using more personal experiences when ranking content. We can assume this is an attempt to filter out content that has been written by a copywriter who doesn’t actually have any real-world experience with the subject matter and is instead just writing to a very detailed SEO brief.

Also, if we are all honest with ourselves, we skip past the vast majority of content on webpages that have been “designed for search”. In this week’s Canonical Chronicle, I use the example of recipe sites giving you their family history before revealing a 5-ingredient cupcake recipe, or an e-commerce website writing about the history of DC electricity to sell you a light bulb. Both are examples of content that will now be demoted as it doesn’t demonstrate any experience.

We need to re-evaluate what we understand good content to be. It used to be that factually correct content that mentioned all the correct entities in our gap analysis and fulfilled all of the keywords in our content briefs was enough. As every SEO in the world starts to do this, it’s now impossible to work out objectively which content is the “best” or “most valuable” to the user. As SEOs, we have become victims of our own success. We’ve created so many factually correct, well-structured pieces of content for our clients that this is no longer the bar.

The bar for quality content contains real world experience. But, the question is, how is this codified into an algorithm?

How we need to think about “quality content” moving forward

Quality is subjective. For example, as a Scotsman I think a roll and square sausage with a glass bottle of Irn-Bru is the pinnacle of human culinary achievement; Gordon Ramsay would beg to differ. Algorithms are a little more cut and dried. They are objective.

So what objective points can we work into our content production to do better in search and, more generally, to design content that can travel and gather traffic through sources other than Google?

Here are some things I think we should pause and think about:

  • Negative TF:IDF as a potential ranking factor? 😝
  • Callbacks to things you’ve written about before
  • Reinforcing your opinion through multiple posts and linking back to older posts
  • Having an opinion on major happenings in your industry

The following is pure conjecture at this point. Expect a more objective deep dive soon.

Negative TF:IDF

I can’t see you reading this post, but if I could, I bet a good portion of you just rolled your eyes.

TF:IDF stands for term frequency–inverse document frequency and was all the rage in the late 2000s when optimising content. As SEOs used it, it was essentially just a sophisticated keyword density metric that mined the top 10 results to work out which words to put in an article.

But what if Google is now looking at other text corpora with words that relate to generalised experience on a topic? If you have a perfectly keyword-optimised article without any of the writer’s cruft that would traditionally “dilute” the keyword potency, your article may now be seen as low quality. Perfect optimisation is now deemed low quality because it’s not human enough. This does not mean I am advocating for making deliberate mistakes, but I am advocating for creating content that has some personality outside of your company’s “tone of voice” documentation.
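
For anyone who hasn’t touched TF:IDF since the late 2000s, here is a toy sketch of the classic calculation. The three “documents” are made up, but it shows why a term that appears everywhere (like “tap”) carries little to no weight, while an unusual entity (like the ABTECH2000 mentioned earlier) stands out.

```python
import math
from collections import Counter

# Toy TF-IDF calculation over a tiny, made-up corpus (not real ranking data).
docs = [
    "how to fix a leaky tap step by step",
    "fix a leaky tap with a new washer",
    "i prefer the abtech2000 blow torch when soldering copper pipe",
]

def tf_idf(term, doc, corpus):
    words = doc.split()
    tf = Counter(words)[term] / len(words)                # how often the term appears in this doc
    docs_with_term = sum(1 for d in corpus if term in d.split())
    idf = math.log(len(corpus) / (1 + docs_with_term))    # rarer across the corpus = higher weight
    return tf * idf

for term in ["tap", "abtech2000"]:
    scores = [round(tf_idf(term, d, docs), 3) for d in docs]
    print(term, scores)
```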

Callbacks or referencing without links

Referencing things you’ve written before is typically done by linking to other posts you’ve created or linking to things that are semantically related. But as we know from research into content fragments, or Fraggles as the wonderful Cindy Krum of Mobile Moxie would say, Google understands the text fragments in your content and indexes them as fragments, not just as entire documents.

So it makes sense to reference and create callbacks in your content without hyperlinking, to indicate your experience. Sometimes linking to the same thing over and over again doesn’t make sense for the user, but mentioning it or quoting yourself does.

As Google indexes content as fragments, it’s relatively easy to codify this into an algorithm: do you reference the things you have said in the past? Or is this just copy for search with no cogent through-line that connects everything together?

It’s weird when something massive happens in your industry and you don’t acknowledge it

Was there a major change or update that affects your client’s industry? If they work in finance, was the new budget just announced? If they work in property, did the law just change? If they are a plumber, are there new heating regulations to follow? If they are in pest control, did a picture of a mouse carrying a slice of pizza on the New York subway just go viral?

It’s likely that your client’s domain will have a topical relevance score. However, have you ever considered the possibility of a topical relevance score that changes based on the time of year and the search patterns of the end user?

As the popularity of topics rise and fall across a year, how well does your content serve the changes in popularity? Are you publishing according to an editorial calendar that fits with the goings-on in your industry or are you publishing another listicle that’s connected to a keyword that you want to rank for?

As Google develops its algorithm, we believe that timing and freshness will become more important as the index grows with more and more content and Google needs to make editorial decisions about what constitutes quality at an algorithmic level.
