The Rank Volatility Scenario in 2020: A Rank Stability Study
February 12, 2020
In a way, Google's core algorithm updates
have come to dominate the SEO conversation. The core updates generate a constant stream of SEO chatter as we look to better understand them and avoid their wrath. When another update arrives, nothing rivals the amount of attention it garners. In 2019, we experienced three such updates: the March, June, and September core updates. Moreover, by just the second week of 2020, we were already witness to the first core update of the year
. I would imagine that if you asked the average SEO whether rank is getting more and more unstable, the answer would be an unequivocal yes!
That got me thinking, is such a notion accurate? Was rank more volatile in 2019 than it was before the latest string of core updates arrived? Is the continued presence of both Google's confirmed and unconfirmed updates making rank stability harder and harder to come by? As machine learning progresses are we seeing more and more rank volatility? Has August 2018's Medic update
put us on a new path of increased rank fluctuations?
Let's find out!
Rank Volatility at a Glance
When I started my analysis of rank stability heading into 2020, it seemed as if the notion of rank being more unstable than ever was a no-brainer. The trend charts, which track rank volatility over the past 4+ years, seem to show a serious increase in ranking shifts in the more recent past.
Looking at the Health niche, the extent of the rank movement at the top position on the SERP seems noticeably greater than in the past. (By the way, you can clearly see the impact of the Medic Update on the Health niche!)
Similarly, when looking at the top 5 positions on the SERP... volatility in 2019 seems to have picked up significantly.
However, looking at increases and decreases in isolation does not tell the entire story. (Also, drawing conclusions from trends alone is nonsensical at best.) Take the above instance as an example. Here we're looking at the percentage of times the same URLs were shown in the same order for the same keywords from one month to the next. A decrease would mean that the same keywords produced fewer results with the same URLs at the same ranking positions from one month to the next.
Now, let's suppose that an update dropped the rate at which the top 5 results matched exactly (same URLs in the same ranking positions from one point in time to the next) from 50% to 30%. That would represent some significant rank volatility, to say the least. However, we have to consider that 30% in historical context. While it is low relative to before the update hit the SERP, perhaps in 2017 the norm was a 20% match of URLs in the same ranking order from one month to the next. In this case, rank, historically speaking, is actually more stable even with the update's impact.
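The comparison above is easy to get backwards, so here it is as a minimal sketch. All of the numbers are hypothetical values from the scenario, not measured data:

```python
# Hypothetical exact-match rates from the scenario above (not measured data).
pre_update_match = 0.50   # top-5 exact-match rate before the update
post_update_match = 0.30  # rate after the update hit the SERP
norm_2017 = 0.20          # hypothetical 2017 historical norm

# Relative to the pre-update SERP, the update looks highly volatile...
print(post_update_match < pre_update_match)  # True

# ...yet relative to the 2017 norm, rank is actually *more* stable,
# despite the update's impact.
print(post_update_match > norm_2017)  # True
```

The takeaway: a drop in the match rate only signals "unprecedented" volatility if the new rate also falls below the historical baseline.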
Looking back at the two trend charts above, the overall levels of rank stability seem to be in the same ballpark as they have been since circa January 2017. In other words... if we drew a straight arrow through this data from 2018 on... the overall rank volatility levels seem relatively consistent when all is said and done. In fact, rank looks a bit more stable, at least according to this one metric.
So, let us ask again, is rank really less stable?
A Bit on the Method
Before I get into the data itself, let me briefly explain what I did so that you know the strengths and limitations of this study. To determine the relative rank stability of URLs on the SERP, I analyzed both the top position on the SERP in isolation and the top five positions as a singular unit. Specifically, I looked at what I will refer to in this study as an "exact match" metric. For my purposes here, an "exact match" means that the URL shown at a given ranking position at one moment in time is used by Google at the same ranking position at another point in time. For example, say the top three sites shown on the SERP for a keyword on January 1, 2019, were:
Now suppose that on February 1, 2019, the top three sites listed for the same keyword were:
In this case, the percentage of URLs that matched from one data point to the next would be 0%. This is the "exact match" metric that I employed, utilizing a dataset of upwards of 7,500 keywords.
With this metric, I analyzed the percentage of keywords that reflected exact matches at the first through fifth ranking positions from one month to the next for 2017, 2018, and 2019. The data was composed of keywords from the Travel, Retail, Health, and Finance niches. This means that, for each yearly average, a total of 48 data points were analyzed. As a result, the per-niche data, which utilized just 12 data points per niche, is extremely limited.
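To make the metric concrete, here is a minimal sketch of how an "exact match" rate could be computed. It assumes monthly SERP snapshots are stored as ordered lists of URLs per keyword; the function name, data shapes, and toy URLs are all my own illustration, not the study's actual tooling:

```python
def exact_match_rate(month_a, month_b, top_n=5):
    """Share of keywords whose top-N URLs match exactly, in order,
    between two monthly SERP snapshots.

    month_a / month_b: dict mapping keyword -> ordered list of ranked URLs.
    """
    keywords = month_a.keys() & month_b.keys()
    if not keywords:
        return 0.0
    # A keyword "matches" only if the same URLs appear in the same order.
    matches = sum(
        month_a[kw][:top_n] == month_b[kw][:top_n]
        for kw in keywords
    )
    return matches / len(keywords)

# Toy example mirroring the article's January/February illustration:
jan = {"best shoes": ["a.com", "b.com", "c.com"]}
feb = {"best shoes": ["c.com", "a.com", "b.com"]}  # same URLs, new order
print(exact_match_rate(jan, feb, top_n=3))  # -> 0.0, no exact match
```

Note that under this definition a simple reshuffle of the same URLs counts as a non-match, which is exactly the 0% outcome in the example above.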
How Stable is Rank on the Google SERP?
Rank on the Google SERP has traditionally been relatively volatile from a certain perspective. Google's algorithm updates have come to be relatively frequent and its confirmed core updates have become infamous for their impact on rank volatility.
My attempt to land a Featured Snippet in the above paragraph aside, let's start our analysis of rank stability over time with a look at the average "ranking match" (i.e., the "exact match" - if you don't know what that is you have been caught skipping the previous section where I outlined my method - shame on you).
Rank Stability at the Top Position of the SERP
Here's the average percentage of keywords that produce the same URL at the same position (i.e., position 1 in this case) from one month to the next over the past three years:
The differences highlighted above are not entirely drastic, not by a long shot. From 2017 to 2019 we're only talking about a roughly 3.5% decrease in rank stability at the first position (or a 3.5% increase in volatility, if that helps make things clearer). Meaning, rank, overall, has not gotten more volatile at the top-ranking position. (There's more to this story, but for now, that's a fair conclusion.)
The same can't be said when looking at the top 5 positions on the SERP overall:
In this instance, while there is a marginal gap in rank stability between 2017 and 2018, there is a much wider gap between 2018 and 2019. Comparing 2019 to 2018, there was a 31% increase in rank volatility over the top five positions on the SERP. It would appear, then, that Google, while committed to its accuracy at the top position on the SERP, has considerably shaken up the positions that follow it.
In fact, this uptick in rank volatility at the top five positions overall was consistent across all of the niches studied:
As I mentioned earlier, there are major limitations to using this data at the niche level. That said, it is evident that Google's tendency to increasingly swap URLs at the top five positions applies across the board. That is, there is no one niche or type of niche (say, Your Money Your Life [YMYL] niches) that is skewing the average (which is important to note with there being such a focus on YMYL sites within the SEO industry).
How Drastic Is Rank Volatility Now?
While rank overall, once you move past the top position on the SERP, seems to have gotten more volatile in 2019, there is still the question of how volatile the SERP is from one month to the next. In other words, we still don't know what the average site can expect to see in terms of volatility from one moment in time to the next. All we know at this point is the overall level of volatility over the past three years.
In other words, even if rank had gotten more stable at positions 1-5 in 2019, it would still be entirely possible for the individual swings in fluctuation to be more drastic than in 2018. Or the inverse: while rank seems to have been more stable in 2018, it's entirely possible that the swings in rank stability were actually less drastic in 2019, with the increase in volatility consisting of more gradual rank movements.
To answer this question I tracked the percentage increase/decrease in rank stability from month to month in order to arrive at the standard deviation for each calendar year. The results are incredibly telling. Here is the standard deviation of the month-to-month increase/decrease in rank stability at the first ranking position:
There is an absolutely incredible increase in the standard deviation produced by the 2018 and 2019 data! The standard deviation in both 2018 and 2019 was more than double that of 2017. Keep in mind... this is the first position on the SERP we're talking about here.
That rank stability at the top position has not changed much since 2017 (see the data in the previous section) means that while Google may be happy with what it's showing at the very top of the SERP overall, when it does make a change the consequences vis-a-vis rank stability are far more drastic.
As to be expected, this pattern can also be seen when looking at positions 1-5 as a singular unit:
What's interesting here is that, proportionately, the gap between 2017's standard deviation and those of 2018 and 2019 narrows when looking at the top five ranking positions overall. Meaning, Google has specifically gotten far more drastic with how it approaches movement at the very first position on the SERP. In simple terms, the top position on the SERP has been unique in the drastic ranking swings it tends to see since 2018.
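The distinction this section leans on, that two years can have similar average stability while differing wildly in swing size, falls out of the standard-deviation calculation itself. Here is a hedged sketch of the computation described above (the function name and the monthly rates are my own invented illustration; the article does not specify whether sample or population standard deviation was used, so sample standard deviation is assumed):

```python
from statistics import stdev

def volatility_of_stability(monthly_match_rates):
    """Standard deviation of the month-to-month percentage change
    in the exact-match (rank stability) rate over a calendar year.

    monthly_match_rates: 12 exact-match rates, e.g. [0.42, 0.38, ...]
    """
    # Percentage increase/decrease between each consecutive pair of months.
    pct_changes = [
        (curr - prev) / prev * 100
        for prev, curr in zip(monthly_match_rates, monthly_match_rates[1:])
    ]
    return stdev(pct_changes)

# Two hypothetical years with a similar average stability level,
# but far wilder month-to-month swings in the second:
calm_year = [0.40, 0.41, 0.39, 0.40, 0.42, 0.40,
             0.39, 0.41, 0.40, 0.40, 0.41, 0.40]
wild_year = [0.40, 0.30, 0.50, 0.35, 0.48, 0.32,
             0.49, 0.33, 0.47, 0.34, 0.48, 0.36]
print(volatility_of_stability(calm_year) < volatility_of_stability(wild_year))  # True
```

Both toy years hover around a 40% match rate, yet the second produces a far larger standard deviation, which is the pattern the 2018 and 2019 data shows at position one.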
The Bottom Line on Rank Stability on the Google SERP
It's hard to digest all this and what it means for sites and ranking on the SERP as it's a bit of a multifaceted picture. That said, the data points to a slightly more volatile landscape since 2017 once you move past the first position on the SERP.
This may be surprising since Google's core updates seemingly wreak large amounts of ranking havoc for many sites, and surely they do. This, I believe, is reflected in the increasingly drastic nature of rank fluctuations on the SERP. That is, when volatility does occur, the data points to it being far more severe than it once was.
As oversimplified as it may be, I would think of the current rank volatility situation as follows (and of course, I'm not speaking to any specific instances or sites): The chances of seeing ranking losses have not changed much since 2017 but the impact of such changes or the amount of change seen at a given moment in time has! (Which with bigger and broader updates, i.e., the core updates, makes a lot of sense.)
In other words, the chances that a keyword won't continue to show the same results in the same order from one point in time to the next have not increased substantially since 2018 (when looking at the top five results). However, the chances of your volatility being more extreme when Google does mix things up have.
What's Behind More Drastic Rank Volatility?
The one thing that really stands out to me is that Google is less afraid to do some serious damage at the very top of the SERP (position one). That right there tells me that Google is far more confident in what it knows, in how to analyze a site, and in how it interprets web content. A bolder and more self-assured interpretation of web content speaks directly to Google's machine learning advancements, increased entity interpretation, and so forth.
By having more accurate and more authentic
means to decipher content and evaluate websites, Google has in a way become fearless. As a result, it's no longer holding sites that may rank #1 on the SERP in the same esteem. If Google thinks that content is not relevant, it's not afraid to demote it. This "fearlessness" results in Google taking a harder look at the URLs ranking #1 and swapping more of those URLs when running an update. Again, seeing more concentrated ranking movement makes a good deal of sense as we now live in the age of the "core update."
With a more qualitative understanding of the web comes a Google that is willing to act on that interpretation without bounds. Google is confident in how its machine learning is better able to help interpret multiple facets of the ranking equation and we're seeing Google act on that interpretation unequivocally.
Ranking Fortunes: What's Next for Rank Stability?
What's next for rank stability? As if I'm some sort of fortune-teller. OK, Google... stick your palm out. The truth is, I can see this going one of two directions. Either, as Google's machine learning properties continue evolving, rank will stabilize over time, or, as Google introduces more properties, we will continue to see drastic changes to the SERP's rankings.
In other words, all things being equal as properties such as BERT learn more, the changes they make to rank should be more refined. That, in turn, should bring rank volatility down. Of course, at the same time, Google could do anything from introducing more machine learning properties to the implementation of major improvements to its current properties that could induce more volatility.
Due to this scenario, volatility could decrease, stay the same, or even increase. And with a prediction like that, you really can't be wrong.