Many factors can affect rankings after a core algorithm update. It’s not always about the helpfulness of content; other factors can play a role in why the algorithm changed and negatively affected your website’s rankings.
If you find yourself saying, “It used to rank before, why doesn’t it rank now?” then some of these factors may be something to consider.
1. Algorithmic Losses Are Not Necessarily Persistent
Sites hit by a core algorithm update (which now includes the Helpful Content system) do not have a permanent strike against them. Over the past ten years, however, Google has rolled out complex algorithms and systems that can take months between update cycles, leaving affected sites without a quick path back into the search results. That is not a permanent mark, but it can feel as if a site has acquired a curse that brands it as no good and excludes it for good.
Google’s John Mueller answered a question in which he confirmed that getting caught in a core algorithm update is not persistent and that, with work, a site can recover from being hit by an update.
Someone asked on X (formerly Twitter):
“Can a site hit by HCU grow again in terms of traffic if it improves in quality? Many fear that no matter the amount of improvements we make a HCU hit site will forever have a classifier assigned to it that keeps it from growing again.”
John Mueller responded:
“Yes, sites can grow again after being affected by the “HCU” (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”
2. Recovering Is Not The Right Word
A lot of people think of recovering from an update as a reset that restores a website’s previous ranking positions. John Mueller’s answer on X suggests a different framing: an algorithmic effect is something publishers address by adjusting a website to fit an evolving web, including evolving user expectations.
Mueller tweeted:
“Permanent changes are not very useful in a dynamic world, so yes. However, “recover” implies going back to just-as-before, and IMO that is always unrealistic, since the world, user-expectations, and the rest of the web continues to change. It’s never “just-as-before”.”
This statement seems to imply that, to a certain degree, algorithmic updates reflect what users expect to see in the search results. A way to understand this is Google’s Medic update from a few years back. That update realigned the search results with what users expect to see for certain queries. After the Medic update, medical topics required search results with a scientific approach, and sites built around folk remedies and unscientific advice no longer fit that updated definition of relevance.
There are subtle variations to this realignment of search results, and they go directly to the question: what do users mean when they ask a search query? For some queries, relevance means informational sites; for others, review sites are what users expect to see.
So if your site is hit by a core algorithm update, revisit the search engine results pages (SERPs), try to determine what the new results mean in terms of relevance, and self-assess whether your site meets that new definition of relevance.
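As a rough way to make that self-assessment concrete, the sketch below tallies hand-labeled content types for the top results of a query. Everything in it is an assumption for illustration: the labels come from your own manual review of the SERP, the cutoff is arbitrary, and nothing is pulled from a Google API or tool.

```python
from collections import Counter

# Hypothetical, hand-labeled top 10 results for one query after an update.
# Each label is your own judgment of what kind of page is ranking.
top_results = [
    "informational", "review", "review", "informational", "review",
    "forum", "review", "informational", "review", "e-commerce",
]

counts = Counter(top_results)
dominant_type, dominant_count = counts.most_common(1)[0]

print(f"Result mix: {dict(counts)}")
if dominant_count >= 5:  # arbitrary cutoff for calling an intent "dominant"
    print(f"The SERP now leans toward '{dominant_type}' pages; "
          f"ask whether your page matches that intent.")
else:
    print("Mixed-intent SERP; no single content type dominates.")
```

In this made-up example the SERP leans toward review pages, which would suggest that an informational article may no longer match what the query now rewards.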
Circling back to Mueller’s response, there is no “going back to just-as-before,” and that may be because there has been a subtle shift in relevance. Sometimes the fix is subtle. Sometimes getting back into the SERPs requires a major change to the website so that it meets user expectations.
3. Thresholds And Ranking Formulas
Another interesting point that Mueller discussed is the difference between an ongoing algorithmic evaluation and the more persistent effects from a ranking system that requires an update cycle before a site can recover.
Someone asked:
“The simple question is whether you need to wait for a new core update to recover from the HCU. A simple “yes” or “no you can recover anytime” would suffice.”
John Mueller answered:
“It’s because not all changes require another update cycle. In practice, I’d assume that stronger effects will require another update. Core updates can include many things.”
Then continued with these interesting comments:
“For example, a ranking formula + some thresholds could be updated. The effects from the updated formula are mostly ongoing, the changes to thresholds often require another update to adjust.
…(“thresholds” is a simplification for any numbers that need a lot of work and data to be recalculated, reevaluated, reviewed)”
The above means there are two kinds of effects that can hit a site. The first is part of a continually updated ranking formula that can quickly reflect changes made to a site. These used to be called rolling updates, where the core algorithm makes relatively instant evaluations about a site and boosts or demotes its rankings.
The other kind of algorithmic issue requires a large-scale recalculation. This is how the HCU, and even the Penguin algorithm, used to work until they were folded into the core algorithm: big calculations that seemed to assign scores which were only updated on the following cycle.
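To make that distinction concrete, here is a purely conceptual sketch, not Google’s actual system: every signal name, weight, and threshold below is invented for illustration. The idea is simply that an ongoing formula re-scores a page whenever its signals change, while a stored threshold only moves when it is recalculated in a later update.

```python
# Conceptual illustration only -- invented weights, not Google's implementation.
ongoing_weights = {"helpfulness": 0.5, "relevance": 0.3, "quality": 0.2}

def ongoing_score(signals: dict) -> float:
    """Recomputed continuously, so page improvements show up quickly."""
    return sum(ongoing_weights[name] * value for name, value in signals.items())

# A stored number that is only recalculated during an update cycle.
STORED_THRESHOLD = 0.65

def is_promoted(signals: dict) -> bool:
    # Improvements raise ongoing_score() right away, but a page sitting
    # below the stored threshold may stay demoted until the threshold
    # itself is re-evaluated in a later update.
    return ongoing_score(signals) >= STORED_THRESHOLD

page = {"helpfulness": 0.6, "relevance": 0.7, "quality": 0.8}
print(ongoing_score(page), is_promoted(page))  # 0.67 True
```

Under this reading, fixes that move the ongoing part of the formula can help between updates, while effects tied to thresholds tend to wait for the next cycle, which matches Mueller’s point that stronger effects will require another update.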
4. The Web & Users Change
In another recent exchange on X, John Mueller affirmed that a key to success is keeping track of what users expect.
He tweeted:
“…there is no one-shot secret to long-lasting online success. Even if you find something that works now, the web, user desires, and how they engage with websites changes. It’s really hard to make good, popular, persistent things.”
That statement offers these concepts to keep in mind for online success:
- The Internet
- User desires
- How users engage with websites
- Popularity is not persistent
Those are not algorithm factors, but they could be things Google picks up on to understand what users expect to see when they make a search query.
What users expect to see is my preferred definition of relevance. It has practically nothing to do with “semantic relevance” and everything to do with what users themselves expect. This is something some SEOs and publishers trip over: they focus hard on what words and phrases mean and forget that what really matters is what those words mean to users.
Mueller posted something similar in an answer about why a website ranks #1 in one country but doesn’t perform as well in another. He said that what users expect to see in response to a query can differ from country to country. The point is that search ranking relevance is often less about semantics, entities, and other technical aspects and more about the users themselves.
He tweeted:
“It’s normal for the search results in countries to vary. Users are different, expectations may vary, and the web is also very different.”
That insight may be helpful for some publishers who have lost rankings in a core algorithm update. It could be that user expectations have changed and the algorithm is reflecting those expectations.
5. Page-Level Signal
Google’s SearchLiaison affirmed that the Helpful Content component of the core algorithm is generally a page-level signal but that there are sitewide signals as well. His tweet quoted the Helpful Content Update FAQ, which says:
“Do Google’s core ranking systems assess the helpfulness of content on a page-level or site-wide basis?
Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.”
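As a purely conceptual illustration of “primarily page level, with some site-wide signals,” the toy sketch below blends an invented page score with a smaller site-wide adjustment. The weights and names are assumptions for demonstration only; Google has not published any such formula.

```python
# Toy illustration only: invented weights, not Google's implementation.
def blended_score(page_score: float, site_score: float) -> float:
    """Page-level assessment dominates; a site-wide signal nudges it."""
    return 0.85 * page_score + 0.15 * site_score

# The practical upshot of a primarily page-level system: a strong page on
# a weaker site still scores fairly well, and vice versa.
print(blended_score(page_score=0.9, site_score=0.4))  # 0.825
print(blended_score(page_score=0.4, site_score=0.9))  # 0.475
```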
Keep An Open Mind
It’s frustrating to lose rankings in a core algorithm update. I’ve been working in SEO for about 25 years and auditing websites since 2004. Helping site owners identify why their sites no longer rank has taught me that it’s useful to keep an open mind about what is affecting the rankings.
The core algorithm has a lot of signals, some of which pertain to helpfulness, while others relate to relevance to users, relevance to search queries, and plain site quality. So it may be helpful not to get stuck on the idea that a site lost rankings because of one thing, because it could be something else, or even multiple factors.
Featured Image by Shutterstock/Benny Marty