The Google Algorithm Updates April and May 2019

May 16, 2019

The Google Algorithm Updates That Probably Never Happened

I’ve spent years trying to work out what Google is doing and, more to the point, what they want, examining each Google algorithm update for cause and effect.

At the top level it’s easy. But once you are working with or against other webmasters who also know their stuff, the little things really matter.

  • Just how much content does Google expect for this market?
  • What is the balance between link quality and link quantity?
  • Where does site age come into it?
  • How come sites with poor or non-existent meta data rank so well?
  • How does a page get to number 1 in Google when it looks like it’s full of spun content?

It might be best to accept the weird and wonderful ways Google works, to take a phlegmatic view of the changes and undulations of success and failure a website might have.

Roll Forward to Huge Google Algorithm Moves in Spring 2019

For example, the SERP noise created in April 2019 led many to believe that there had been a Google algorithm update. When there seemed to be a massive correction, and more changes in SERPs, in early May 2019, many webmasters were sure there had been a major Google algorithm update.

Moz’s “Mozcast” showed that the movements in rankings for the sites they tracked were extreme on the 18th and 19th of April, then, after settling, rocketed again on the 30th of April. On the 7th and 8th of May there seemed to be a correction; something happened that put the meter above the 80% storm level again.

Google SERP activity, Mid April to Mid May 2019

Google were asked via Webmaster forums what had happened.

They replied “Nothing… we’ve done nothing”

If True – How Is This Possible?

Let’s take a look at the Google SERP algorithm from a bird’s eye perspective. A macro view of what it is and what it tries to achieve.

Let’s look at how the Google algorithm crawls, scrapes, gathers, abstracts, obfuscates and alters (often seemingly for the sake of altering), to see if we can work out how such huge movements in search position could happen without an explicit Google update.

An overview of the process at this level could be described with two ideas.

  • Google may abstract intentionally because they do not want their search algorithm to be reverse engineered.
  • Of greater importance: the algorithm is now so complex, pulling in factors from so many different areas, that it is unlikely that the outcome for any URL in its index could be calculated by hand, even with a calculator, even within a spreadsheet. Google is now so complex and holds so much information that they are virtually Skynet (pre self-awareness and cyber-android holocaust).

The index moves all the time. It is live. Major updates caused by fundamental changes to the algorithm, put there deliberately to alter some aspect of the SERPs, tend to be recorded.

Sites like Moz keep track of these things, and to their credit Google often tell us when a change is coming, or at least announce after the event that a major alteration to search was made.

“We just rolled out another Panda Update”

Here’s the thing that a good physicist or experimental psychologist might tell you.

Complex systems develop a life of their own

Google’s search ranking algorithm is an incredibly complex system, and it can fairly be said that it is not possible to predict its outcomes.

At this point I want to introduce a gentleman called John Gall and his work on complex systems.

Here are a few quotes from John to whet your appetite.

“Complex systems usually operate in failure mode”

“A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system”

I plucked these two out of a huge number of insightful, and somewhat terrifying, ideas. You can find more here:

https://en.wikiquote.org/wiki/John_Gall

Let’s take a look at these two and work out what they mean in terms of Google algorithm updates.

 

“Complex systems operating in failure mode”

This means that a complex system’s main goal, once it reaches a level of complexity that cannot be held within the knowledge and agency of a single person or small group of people, is simply not to fail.

The goals of that system were set initially. The system was designed to achieve something or measure something. It was simpler and likely more focussed on a describable goal or set of goals.

These may have been expanded upon over time with peripheral goals, and new goals added. Some compatible, and likely many incompatible with the initial goal.

The system is hacked, whacked and hammered until it manages, somehow, to conform and give reasonable results across a range of goals, many of which were not part of its original design.

As this evolution goes on, the system gets more complex. Inter-relationships between goals emerge.

Clashes and redundancies appear all over the place. The ability for anyone, even highly intelligent people, even the very people who designed much of the system, to understand it is lost.

Hopelessly lost.

At that point every new addition needs to be tested on data. Tested for impact. Predicting that impact theoretically is not possible, and any attempt will invariably be wrong, often catastrophically so.

However, these tests are not designed to be exhaustive or empirical. They could not be. The testing required to understand and account for every variable and every factor is almost infinite. Exhaustive testing is not possible.

Welcome to the world of “Testing for reasonableness”

Where

“These results are within expected parameters” replaces the old and largely preferable

“These results are correct”

To ask for the results to be “correct” is to ask the wrong question and to misunderstand the nature of complex systems.
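As a concrete illustration, here is a minimal sketch in Python of the two questions side by side. The function names and the drift threshold are invented for this example; the point is only that an exact-match check gives way to a “within expected parameters” check.

# A minimal sketch of "testing for reasonableness" versus "testing for correctness".
# All names and thresholds are hypothetical; nothing here comes from Google.

def exact_test(observed_rank: int, expected_rank: int) -> bool:
    """The old question: are these results correct?"""
    return observed_rank == expected_rank

def reasonableness_test(observed_rank: int, previous_rank: int,
                        max_drift: int = 5) -> bool:
    """The new question: are these results within expected parameters?
    We no longer claim to know the "right" rank; we only flag changes
    that drift further than we consider plausible."""
    return abs(observed_rank - previous_rank) <= max_drift

# A page moving from position 3 to position 6 fails the exactness test
# but passes the reasonableness test.
print(exact_test(6, 3))            # False
print(reasonableness_test(6, 3))   # True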

How Complex Does a System Need to Be Before Control Of Outcomes is Lost?

A system doesn’t have to appear to be that complex from the outside to be utterly inscrutable once allowed to expand into the real world.

Think of the person who invented chess. There are 64 squares and 2 colours on the board, and 2 sets of pieces. Each set comprises 6 types: Pawn, Rook, Knight, Bishop, King and Queen. They have basic rules to move within the confines of the board and squares.

This is a rule set that could have been devised within an hour.

Yet chess has been with us for 1,400 years or more. It has been studied and analysed by some of the sharpest minds humanity can offer, and has been set as a puzzle for computers that can crunch millions upon millions of its variations in minutes.

Yet it is still not possible to play “A perfect game”. In fact, in most circumstances it is not possible to play ”The best possible move” – or even to always know what the best possible move might be.

The data set of possibilities cannot be confined to a human mind, and it appears that computer-controlled AI that can crunch data at incredible speed is merely “better” at it than humans.

It is not perfect, and it cannot tell “best” from “probably the best I’ve found so far”.

The best artificial chess AI may not be much closer to knowing how to win in any circumstances than a human player. It is just able to crunch more possibilities and hold the potential outcomes of those possibilities.

And chess, at its core, is such a limited system, with so few moving parts, that its rules can be taught to a child in 30 minutes.

Yet once “unleashed” by starting the process of moving the first piece, with each subsequent move the number of potential outcomes grows exponentially.
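To put a rough number on that growth, here is a back-of-the-envelope sketch. The average branching factor of about 35 moves per position comes from Claude Shannon’s classic estimate; the rest is simple exponentiation, so treat it as an illustration of scale rather than a precise count.

# Rough sketch of how fast the chess game tree grows.
# An average branching factor of ~35 is Claude Shannon's classic estimate;
# real positions vary, so this only illustrates the order of magnitude.

BRANCHING_FACTOR = 35

def move_sequences_after(plies: int, branching: int = BRANCHING_FACTOR) -> int:
    """Approximate number of distinct move sequences after a given number
    of plies (a ply is one move by one player)."""
    return branching ** plies

for plies in (1, 2, 4, 10, 20, 80):
    print(f"after {plies:>2} plies: ~{move_sequences_after(plies):.2e} sequences")

# After 80 plies (a 40-move game) the estimate is on the order of 10^123,
# which is why neither a human nor an engine can enumerate "the perfect game".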

John Gall’s next handpicked quote:

A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system

This is where the changes to the Google algorithm come in. Google wants its algorithm to do more and more. It was such a simple process 20 years ago: content richness, inbound links, and an initial allocation of authority (the starting point of authority, in the form of PageRank, allocated to seeder sites).

That was pretty much it.

With this initial data set, and the goal of creating an ordered index for user-entered search terms, Google managed to be up and running with a functional system very quickly.
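That early, working simple system can still be sketched in a few lines. The miniature link graph below is invented, and the loop leaves out refinements such as handling pages with no outbound links, but the damping factor of 0.85 and the basic iteration do follow the original PageRank paper.

# A stripped-down PageRank iteration over a tiny, made-up link graph.
# The damping factor 0.85 matches the original paper; everything else is
# simplified for illustration.

links = {              # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def simple_pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(simple_pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

A simple system like this works, can be reasoned about, and can be tested for correctness rather than mere reasonableness, which is exactly Gall’s point about where you have to start.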

Roll forward to 2019

A subset of Google’s algorithm considerations includes the following (a toy sketch of how a few such factors might combine follows the list):

  • Retention
  • Bounce rate
  • Visitor levels
  • Inbound link volume
  • Inbound link authority
  • Inbound link relevance
  • Social signals
  • Page load speed
  • Outbound links
  • Domain suffix
  • Meta data
  • Keyword use
  • Associated keywords
  • Competition
  • Commerciality
  • Verboseness of topic or market (content volume expectation)
  • Trust of website or business entity that owns website
  • Location of website servers
  • Market size

And on and on and on
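To make the scale of the problem concrete, here is a purely hypothetical toy scorer built from just five of the factors above. Every factor name, weight and formula is invented for illustration; it is emphatically not Google’s algorithm. Even at this size, the interactions (link authority discounted by relevance, bounce rate dragging everything down) are awkward to work out by hand, and the real system has vastly more signals.

# A purely hypothetical toy scorer combining a handful of ranking factors.
# All weights and formulas are invented; this is not Google's algorithm.

from dataclasses import dataclass

@dataclass
class PageSignals:
    inbound_link_authority: float   # 0..1
    inbound_link_relevance: float   # 0..1
    page_load_seconds: float
    content_words: int
    bounce_rate: float              # 0..1

def toy_score(s: PageSignals, expected_words: int = 1200) -> float:
    # Links: authority only counts in proportion to relevance (an interaction).
    link_score = s.inbound_link_authority * s.inbound_link_relevance
    # Speed: the slower the page, the smaller this term.
    speed_score = 1.0 / (1.0 + s.page_load_seconds)
    # Content: rewarded up to the volume "expected" for the market, then flat.
    content_score = min(s.content_words / expected_words, 1.0)
    # Engagement: a high bounce rate drags the whole score down multiplicatively.
    engagement = 1.0 - s.bounce_rate
    return (0.4 * link_score + 0.2 * speed_score + 0.4 * content_score) * engagement

page = PageSignals(inbound_link_authority=0.7, inbound_link_relevance=0.9,
                   page_load_seconds=2.5, content_words=800, bounce_rate=0.35)
print(round(toy_score(page), 3))   # roughly 0.374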

Game theory (not really)

Most tellingly, unlike chess, whose rules and pieces are controlled and have hardly changed, the websites that the Google algorithm ranks get more complex year on year.

Imagine chess changing like that;

 “Every 4th move the King can move like a Knight and every time a Bishop lands on this square it turns into a Rook and if there is any Rook already in the same row it changes colour”

Then a year later someone changes those rules and adds four more, or adds nuance and complexity to existing rules:

“A Rook in the same row changes colour – but if the King is still on its starting square that is optional”

then

“A Rook in the same row changes colour – but if the King is still on its starting square that is optional unless the player has already done it once then his opponent decides whether it changes colour”

then

“A Rook in the same row changes colour – but if the King is still on its starting square that is optional unless the player has already done it once then his opponent decides whether it changes colour. This only applies if at least 16 pieces in total are still on the board”

then…

Google Algorithm Has Exponential Complexity

Imagine this game, where the rules for every piece in every circumstance are added to, nuanced, elaborated and altered regularly.

Suppose now that you are in charge and expected to understand everything, all the time, in every circumstance, for every game of chess going on in the world, all at once.

The games are websites and the entity trying to make sense of all of them, simultaneously, is Google.

OK, my game analogy is running a little thin, but hopefully you understand the levels of complexity Google deals with. Self-created complexity it might be, but that doesn’t make it less of a data issue for Google.

The point of the quote from John Gall is this: there is nothing Google can do about it other than scrap the system and start again, and as websites and web technology change, it’s only going to get more complex.

What is The Butterfly Effect?

You saw this coming, didn’t you? That old “Butterfly” trope?

What is the butterfly effect? The online dictionary nails it quite succinctly:

“The phenomenon whereby a minute localized change in a complex system can have large effects elsewhere.”

Google is now an incredibly complex system whose inputs and outcomes are well beyond the comprehension of individual humans or even groups of humans. The overall Google algorithm is certainly complex enough to be subject to these rules.

Massive ripples can be caused within this system by relatively small and seemingly inconsequential changes in state or alterations in expected input.
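A classic toy example makes the idea tangible. The snippet below iterates the logistic map in its chaotic regime (r = 4); it has nothing to do with Google’s code, but it shows how two inputs differing by one part in a billion end up nowhere near each other after a few dozen steps.

# The logistic map at r = 4: a standard demonstration of sensitive
# dependence on initial conditions (the butterfly effect).

def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 40) -> float:
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)   # a butterfly-sized change in input

print(a, b, abs(a - b))
# After ~40 iterations the two trajectories typically differ by an amount
# of order 1, despite starting only 1e-9 apart.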

April May Google Algorithm Update – Truth or Lie?

I believe Google when they say they did not make major changes to the algorithm a few weeks ago. It has got to the point where, for major changes to occur, Google don’t need to do anything.

They will just occur consequentially as a result of the massive number of inputs and variables.

Maybe a major player like Amazon changed the meta data across huge swathes of their website (something they could do in a few button presses). Perhaps a world event outside the internet distracted people from their expected search patterns for long enough for a contingency to kick in?

Who knows? (I certainly don’t)

But what I do predict is that this will happen more often over time, because this level of unexpected consequence is built into the very nature of complex systems. It is inevitable.