Google Algorithm Update – Frequency of Quality Updates, Surfing The Gray Area, and Reversals
We’ve seen our fair share of major core ranking updates this year: one in early January, the February 7 update, Fred on March 7, and more movement in late April and early May. Then, just a few weeks after the last update, we witnessed another big core update, which rolled out on May 17, 2017.
Barry Schwartz was the first to report the update, and it wasn’t long before the impact was clear. I’ve seen substantial volatility from this update and it does look like yet another quality update. More on that soon.
Since 5/17, I’ve been digging into the update and analyzing sites that saw impact (both positive and negative). From a data standpoint, I have access to a number of sites that saw movement on 5/17, and I’ve also had a number of companies reach out to me after seeing drops or gains starting on the 17th. So there was plenty of data to dig into.
Below, I’ll describe what I’ve uncovered during my research, share some interesting cases, and offer some observations about how Google is rolling out quality updates. I’ll also cover what you can do now if you have been impacted (which shouldn’t be a surprise if you’ve read my previous posts about major core updates focused on quality).
Examples of Impact
Just like with other major core ranking updates focused on quality, I’ve seen a range of impact. Some sites dropped or surged 20-30%, while others saw much more movement. And several dropped heavily in early May, only to surge back on 5/17. For example, one site dropped by 83% on 5/7, only to fully recover on 5/17. It’s been fascinating to analyze cases like that (there were several).
First, here are some examples of movement based on the update:
You get the picture. When the update rolled out, there was significant movement across many sites globally.
Remember, real people are visiting your site. Respect them.
After digging into many drops, I saw the usual suspects when it comes to “quality updates”. For example, aggressive advertising, UX barriers, thin content mixed with UX barriers, frustrating user interface problems, deceptive ads, low-quality content, and more.
If you’ve read ANY of my posts about previous major core updates focused on quality, then what I just listed should be very familiar to you. As Google collects fresh data and refreshes its algorithms, sites drop or gain. And if you’ve been pushing the limits from a user experience standpoint, good luck.
Since Phantom 2 in May of 2015, I’ve been saying, “hell hath no fury like a user scorned”. You should get a poster of that saying and hang it in your office. Actually, hang several of them. I’m not kidding.
For example, check out this page on a site that was negatively impacted by the 5/17 update:
And on the flip side, there are sites that surged during the update that have been working hard to enhance the user experience, cut down on aggressive advertising, and boost content quality. And some have been working on this for a long time… so it’s great to see them increase during the May 17 update.
Remember, Google’s John Mueller said you need to significantly improve quality overall (and for the long-term). I’ve seen this first hand while helping companies impacted by these updates.
Side Note: Eighth Graders Know What’s Frustrating, Shouldn’t We All?
Every year I present to eighth graders at a local school about SEO, digital marketing, Google, etc. And every year I ask them if they’ve experienced any sites where it’s hard to find the content (due to ads, popups, and other aggressive monetization tactics). Well, this year they all raised their hands and groaned in agreement at the same time.
THOSE ARE YOUR VISITORS.
And Google is trying to make them (and others like them) happy. So if eighth graders know the difference between what’s acceptable usability-wise and what’s not, shouldn’t we all? That includes C-level executives, monetization teams, marketing teams, and content teams.
So read the Quality Rater Guidelines, share the PDF with everyone on your team, hold meetings to cover the guidelines, and internalize what Google deems low versus high quality. I’ve seen many connections between what’s contained in the QRG and what I’m seeing in the field. Beware.
Here’s a screenshot from the Rater Guidelines about distracting ads. Keep in mind, this is directly from Google:
Below, I’ll cover a few interesting examples from my analysis of the update. I can’t cover everything I’ve seen or this post would be fifty pages long. So I’ve surfaced some of the more intriguing examples.
Major Reversals – Fine Tweaks or Collateral Damage?
I can’t cover this update without mentioning a very interesting case. There’s a site that was algorithmically smoked on 5/7 (around the 5/4 update) that lost 83% of its Google organic traffic overnight. Yes, 83%. It was a horrible hit, and one of the worst I have seen. After the initial hit, the company reached out to me to let me know what was going on, explained what they thought it could be, etc.
After reading about updates like this, they decided to change how sponsored links were handled across the site and refined how advertising was displayed in general (although it doesn’t appear that their ads were overly aggressive from a UX standpoint). Then they waited… and on 5/17, almost all of their rankings and traffic returned. Here is a screenshot of the drop and the reversal.
Now, I’ve covered how long it typically takes to recover from a major core ranking update focused on quality, and it’s usually not just a few weeks. It often takes months. In addition, I’ve seen reversals before based on Google tweaking an algorithm to ensure we are seeing the best possible results in the SERPs. And those tweaks can sometimes yield complete reversals.
Like these reversals from past algorithm updates:
So, did the changes the company implemented after getting smoked cause the recovery during the 5/17 update or did Google simply adjust the algorithm? The only way to test this out would be to reverse their changes and see how it goes. But I definitely don’t recommend doing that… since the changes they implemented were the right ones. Anyway, just an interesting case.
Surfing the Gray Area of Quality
There was another interesting example I’ve been analyzing. It’s a site that got hammered by the June 2016 update and then surged during the February 2017 update. The site definitely did some work from an aggressive monetization standpoint, but probably just enough to creep out of the gray area. I call this “surfing the gray area of quality” because at any point, the wave could crash and you can fall back into the black.
During the May 17 update, the site lost close to 20% of its Google organic traffic. It’s in a very competitive niche and dropped in rankings for a number of important queries. So, this was clearly an adjustment after the surge in February. I’m working with them now to identify all potential quality problems, so it will be interesting to see how they progress during future quality updates.
In-Depth Content Paying Off
There’s another site I’ve helped extensively over the years that has been working hard on creating killer content on their blog. It’s an ecommerce site in a competitive niche and they decided to invest heavily in content development (including video).
During the May 2017 update, they jumped 21% overall, and the blog content really surged (jumping 37%). When checking blog versus non-blog content, you can really see the difference in the surge. So it’s interesting to see their hard work pay off from a content development standpoint.
A Splattering of Ectoplasm and a Sprinkle of Panda
When analyzing sites that were impacted, you could see relevancy adjustments show up again (just like with previous quality updates). I’ve mentioned this before in previous posts about major core ranking updates focused on quality. For example, sites dropping in rankings when their content couldn’t really live up to user expectations.
And remember, that’s exactly how Google has explained Panda ever since it became part of Google’s core ranking algorithm. Here’s a video of Gary Illyes explaining this. So, was Panda prepped again and rolled out at the same time other quality algorithms were refreshed? That’s totally possible, but hard to pin down. All I can say is that I saw relevancy adjustments many times during my travels while analyzing the 5/17 update.
Frequency of Quality Updates – Google’s quasi human is progressing…
Ever since the fall of 2016 when we saw a number of updates in a short period of time, I’ve been hinting that Google might be increasing the frequency of “quality updates”. And 2017 has followed that pattern. Here are the core ranking updates focused on quality that I have seen since the beginning of 2017:
Notice anything interesting? We’ve gone from every few months to once per month (and sometimes multiple times per month). That’s not surprising, since Google’s ultimate goal is to NOT roll out updates like this all at once. Instead, I’m sure they would love to have these updates continually rolling. Yes, like Panda of the present, which is very different from old-school Panda (which rolled out on one day and caused mass hysteria).
I’m sure I’ll be covering this more in future posts, but it’s worth noting that Google is upping its game from a frequency standpoint. On the one hand, that’s great for sites that have been negatively impacted. And on the other hand, that means sites can be impacted more often (and not just every few months).
So heads-up… it’s not unusual to see impact like what’s shown below. That’s when a site sees impact during a number of quality updates (since it’s in the gray area quality-wise). That’s why it’s so important to get out of the gray area.
What You Can Do Now:
I’ve already covered this extensively while writing about quality updates, but I’ll list it again below. If you’ve been negatively impacted by a major core ranking update focused on quality, then:
- Analyze your site objectively from both a content quality and user experience standpoint. Run a crawl, or several, to get a solid look at your entire site. Then analyze that crawl through a quality lens.
- Improve low quality content, nuke thin content, or noindex pages that shouldn’t be in Google’s index. If they aren’t indexed, they can’t hurt you. Focus on “quality indexation”.
- Tone down aggressive advertising. If you annoy your users, Google can pick that up. And if Google sees this in aggregate, you can get smoked. Beware.
- Fix all major user experience problems on the site. Go through your site like a user would. If there are any issues that inhibit users from achieving their goals, fix them. And fast.
- Hunt down technical problems that can be causing either content or UX problems. For example, major fetch and render problems, tech glitches that can cause thin content, performance problems, and more. Technical SEO is extremely important to nail down. So don’t overlook what’s under the hood.
- Continually work to add high quality content to your site. Make sure what you’re publishing is unique, relevant, and will satisfy user needs. If you can’t do that, don’t publish it. And boost the quality of content already on your site (if you feel some of it is lacking). Check the queries leading to your content and make sure it can meet or exceed user expectations. If it can’t, enhance it.
- Understand that you probably won’t recover quickly. John Mueller has said many times that Google needs to see significant improvement in quality overall, and over the long term. So keep fighting your way back. You can recover, but it’s not a sprint. It’s a marathon.
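The “quality indexation” triage above can be roughed out with a simple script. This is a minimal sketch, not a definitive process: it assumes a crawl export in CSV form with hypothetical `url`, `word_count`, and `indexable` columns (real crawler exports and sensible thresholds will differ by tool and by site), and it simply flags indexable pages whose word counts look thin so you can review them by hand.

```python
import csv
import io

# Hypothetical crawl export. In practice this text would come from your
# crawler's CSV export; the column names here are assumptions.
crawl_csv = """url,word_count,indexable
https://example.com/guide,1850,true
https://example.com/tag/widgets,42,true
https://example.com/old-press-release,95,true
https://example.com/about,600,false
"""

# Words per page below which a page is worth a manual quality review.
# Tune this per site and per page type; 200 is just an illustrative value.
THIN_CONTENT_THRESHOLD = 200

def flag_thin_pages(csv_text, threshold=THIN_CONTENT_THRESHOLD):
    """Return indexable URLs whose word count falls below the threshold."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row["url"]
        for row in reader
        if row["indexable"] == "true" and int(row["word_count"]) < threshold
    ]

if __name__ == "__main__":
    for url in flag_thin_pages(crawl_csv):
        # Candidates to improve, consolidate, noindex, or remove.
        print(url)
```

A script like this only surfaces candidates; the decision to improve, consolidate, or noindex each page should still be made through a quality lens, page by page.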
Summary – Will the frequency of quality updates continue to increase?
Based on the (new) frequency of updates, I’m wondering if we’ll see another quality update soon. Google seems to be pushing these updates monthly now (and we’ve even seen multiple quality updates per month in some cases). Again, that’s good for sites that have been impacted, but it could mean a lot of volatility for others. Stay tuned. It could be a hot summer algo-wise.
Source: G-Squared Interactive