Google Drops the Spam Hammer

Download this week's SEO News Deck by joining our email list.

In this week's Canonical Chronicle there are a few minor updates in the world of SEO and Google. The most important is that FID (First Input Delay) has now been deprecated in favour of INP.

We were warned about this several months ago, but as of this month there will no longer be any FID metric available in your CrUX reports. Instead, it is replaced with INP (Interaction to Next Paint).

Why has Google deprecated FID?

FID measured only the first interaction between the user and the browser. However, to quote the Google web.dev documentation, Chrome usage data shows that 90% of a user's time on a page is spent after it loads. They therefore needed a metric that measures responsiveness throughout the entire page lifecycle.

What is INP?

INP stands for Interaction to Next Paint and it is a Core Web Vital metric that assesses the responsiveness of a webpage using the Event Timing API. It essentially measures the time difference between a user doing something and the page visually responding. The lower the INP score the better, as this shows the page was consistently able to respond quickly to the majority of user actions.
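
You can see the raw data INP is built from in the browser today. Below is a minimal sketch using the standard PerformanceObserver with the Event Timing API; the 40ms threshold and the console logging are illustrative choices, and in practice you would more likely use Google's web-vitals library and its onINP helper.

```typescript
// Track the slowest interaction seen so far via the Event Timing API.
// entry.duration spans input delay + processing + presentation delay,
// which is the same window INP is based on.
let worstInteraction = 0;

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.duration > worstInteraction) {
      worstInteraction = entry.duration;
      console.log(
        `Slowest interaction so far: ${Math.round(entry.duration)}ms (${entry.name})`
      );
    }
  }
});

// durationThreshold filters out interactions too fast to matter;
// buffered replays entries that fired before the observer was created.
observer.observe({ type: 'event', durationThreshold: 40, buffered: true });
```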

How do we optimise INP?

Firstly, you need to work out which elements are creating slow interactions. The best way to do this is with “field data”, which you would get from the Chrome User Experience Report (CrUX).

Usually you would look at 3 things when assessing an issue: input delay, processing duration and presentation delay. Each one will give clues about what to optimise.

If it’s input delay, it’s likely the scripts the browser is evaluating need to be optimised and minified; a long processing duration points to heavy event handlers that need to be broken up; and presentation delay relates to rendering work, such as minimising the overall size of the DOM.
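
The most common fix for long processing durations is to split a handler's work into chunks and yield back to the main thread between them, so the browser can paint in between. Here's a minimal sketch; yieldToMain is a common setTimeout-based pattern, and validateForm, saveDraft and updateAnalytics are hypothetical stand-ins for your own work.

```typescript
// Hypothetical units of work inside a heavy click handler.
declare function validateForm(): void;
declare function saveDraft(): void;
declare function updateAnalytics(): void;

// Resolve in a new task, letting the browser paint in between chunks.
function yieldToMain(): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

document.querySelector('#save')?.addEventListener('click', async () => {
  validateForm();      // the work the user's visual feedback depends on runs first
  await yieldToMain(); // give the browser a chance to paint
  saveDraft();
  await yieldToMain();
  updateAnalytics();   // lowest-priority work runs last
});
```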

For detailed documentation on how to optimise INP, see the official Google docs.

Spam Warning for Indexing API

It’s rare that you hear about spam in the world of SEO anymore. Long gone are the days of firing up Scrapebox and XRumer to do a “link building” campaign. Although, I am glad to see that article spinning has reinvented itself, from The Best Spinner to ChatGPT. I kid, I kid.

Google has updated its Indexing API documentation to tell us naughty SEOs to stop spamming irrelevant links and using multiple accounts to get our crappy content indexed.

In case you didn’t know, the Indexing API is not a general-purpose indexing tool and should only be used for livestream videos and job postings. For all other indexing you should use the normal methods.
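
For reference, a legitimate use looks like the sketch below: one authenticated call per URL, for a page type the API actually supports. The endpoint and request body follow Google's documented v3 API; the URL is hypothetical, and obtaining the OAuth token from a service account is left out for brevity.

```typescript
// Notify Google that a job posting page has been updated.
// INDEXING_API_TOKEN stands in for an OAuth 2.0 access token from a
// service account with the https://www.googleapis.com/auth/indexing scope.
const token = process.env.INDEXING_API_TOKEN;

const res = await fetch(
  'https://indexing.googleapis.com/v3/urlNotifications:publish',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      url: 'https://example.com/jobs/seo-manager', // the job posting page
      type: 'URL_UPDATED', // or 'URL_DELETED' when the listing comes down
    }),
  }
);

console.log(res.status, await res.json());
```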

How do you get pages indexed without the Indexing API?

There are 3 ways you would typically get a page indexed:

  1. Make sure it’s in your XML sitemap and submit it to Google Search Console (see the sketch after this list)
  2. Add internal links to the URL you want to index
  3. Share the URL that you want to get indexed on social or link to it from another domain
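
On the first option, a sitemap is just an XML file listing the URLs you want crawled. Below is a minimal sketch of generating one in a build script; the URLs and output path are hypothetical.

```typescript
// Write a minimal XML sitemap to disk during a build step.
import { writeFileSync } from 'node:fs';

const urls = ['https://example.com/', 'https://example.com/new-page'];

const sitemap =
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  urls.map((url) => `  <url><loc>${url}</loc></url>\n`).join('') +
  '</urlset>\n';

writeFileSync('public/sitemap.xml', sitemap);
// Then submit https://example.com/sitemap.xml under Sitemaps in Search Console.
```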

There are lots of services out there that will index pages for you, but typically they are just pinging the URLs or submitting them in a big sitemap to a burner Search Console account.

Back in the day, if you wanted to index lots of URLs during a migration or new site build, you would take the domain’s backlink profile, turn it into an XML sitemap and submit it to GSC. This doesn’t really work anymore, but feel free to give it a shot for academic purposes.

Wayback Machine in the SERPs

Google has struck a deal with the Wayback Machine and will now show old versions of a URL in the search results. Just click on “about this page” on the result itself and the link will appear.

It’s not a major update, but it could be interesting to check the changes a competitor has made to a page over time if you see their rankings changing.

New Privacy Tech in Google Ads

Google caring about user privacy is akin to a fox caring about hen welfare. The new update to the Ads platform allows advertisers to upload their customer data but keep it “private”.

This is rolling out to all ad accounts free of charge.

It is primarily done by 3 methods:

  1. Trusted Execution Environments – this is essentially partitioning your data in a container that only you have access to
  2. Attestation – which just means a technical test to prove your data is processed as intended
  3. Open-sourcing the code on GitHub

Whilst any move to encrypt and protect user data should be applauded, if we look behind the curtain this is actually a move to increase Google’s dominance in the world of ads. If they are the only player to offer this technology, they become immune to further regulation because they have outsourced privacy back to the advertiser. This means the more places like the EU try to regulate them with things like GDPR, the more it hurts competition and the more they win.

Ahh the joys of being an unregulated monopoly.

Structured Data for Video

There is a minor update to the Schema markup for VideoObject called ineligibleRegion.

The schema property is called ‘ineligibleRegion’ and you can nest it inside the VideoObject schema. Make sure to use the two- or three-letter ISO 3166 format and implement it in the usual way, with JSON-LD or as meta tags in the Microdata format.
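
Below is a minimal sketch of what that looks like, building the JSON-LD as an object and injecting it as a script tag (server-rendering the same tag works just as well). The video details are hypothetical; only the property names follow schema.org.

```typescript
// A VideoObject marked as unavailable in two regions via ineligibleRegion.
const videoSchema = {
  '@context': 'https://schema.org',
  '@type': 'VideoObject',
  name: 'Example product demo',
  description: 'A short walkthrough of the product.',
  uploadDate: '2024-09-01',
  thumbnailUrl: 'https://example.com/thumb.jpg',
  contentUrl: 'https://example.com/video.mp4',
  ineligibleRegion: ['CN', 'RU'], // ISO 3166 codes where the video cannot be watched
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(videoSchema);
document.head.appendChild(script);
```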

Download the deck

If you want to present the news to your team each week, subscribe to our newsletter to get the free deck delivered to your inbox each week.

If you CBA with that, you can view it on Google Drive without an email here.
