
The Ultimate Google Algorithm Guide: Critical Updates, Tips And Ranking Factors

by Patrick D

Every year, Google announces new updates to its search algorithm. In the vast majority of cases, Google algorithm changes are insignificant enough to go unnoticed. But now and then, Google releases an update so significant that it completely changes how we practice search engine optimization (SEO).

We’ll review the most important search algorithm changes in this article. We’ll also investigate why these changes were made, how they work, and how they affected or changed SEO strategies.

Content

When you’re a publisher trying to maintain the highest possible page ranking at all times, Google’s algorithm updates can cause you some problems.

But the good news is that it is possible to adapt quickly to new Google algorithms and maintain your SERP ranking.

What is a Google algorithm update?

First and foremost, let’s talk about the Google algorithm. It’s extremely complicated, and it’s only getting more so, because every update aims to improve the searcher’s experience. Until 2010, life was relatively calm for publishers and SEO specialists.

But the Caffeine algorithm in 2010 changed everything. Search engine results started changing several times a day rather than once every few weeks. In any given year, Google makes over 600 algorithm updates, the vast majority of which go unnoticed.

But when Google announces major updates, it sends the SEO community into a frenzy trying to understand and benefit from them.

How often are Google algorithm changes released?

Google’s algorithm is updated daily, sometimes multiple times per day. These updates are usually minor, and you won’t notice a drop in search rankings as a result. Core updates are different: a few times a year, Google makes large-scale changes that can directly affect your website’s performance.

Publisher’s checklist of the major Google algorithm updates through 2022

The following are the six most significant Google algorithm updates, as explained by Google and SEO experts.

1. Panda algorithm

The Panda algorithm, named after its developer Navneet Panda, debuted on February 23, 2011. Panda was created to rank high-quality websites higher in search results while demoting low-quality sites. When this algorithm update first appeared, it was unnamed, and many of us dubbed it the “Farmer” update because it seemed to affect content farms. (Content farms are websites that collect information from a variety of sources, often stealing it from other websites, to generate a large number of pages with the sole purpose of ranking well in Google for a variety of keywords.) It did, however, affect a large number of sites.

Many SEOs in forums assumed that Panda was targeting sites with unnatural backlink patterns when it first appeared. But it’s all about your website’s quality.

Panda severely harmed some websites. Panda led to site-wide issues, which means it didn’t just degrade individual web pages in SERPs. Instead, Google considered your entire website to be of lower quality. But Panda can also affect only a portion of a website, such as a news blog or a single subdomain.

A blog post by Google employee Amit Singhal includes a checklist that you can use to see if your site is truly high quality. Here’s the rundown:

  • Is this an article written by an expert or enthusiast who knows a lot about the subject, or is it more superficial?
  • Is there any duplicate, overlapping, or redundant content on the same or similar topics with slight keyword variations on the site?
  • Are the topics chosen based on genuine reader interests, or does the site generate content by guessing what will rank well in search engines?
  • When compared to other pages in the search results, does the page provide significant value?
  • Is the site a well-known authority on the subject?
  • Would you trust the information on this site if you had a health-related question?
  • When this site’s name is mentioned, do you think you’ll recognize it as a reliable source?
  • Is there any insightful analysis or interesting information that isn’t obvious in this article?
  • Is this a page you’d want to bookmark, share, or recommend to a friend?
  • Are the pages made with great care and attention to detail?

That’s a lot of information to take in. These questions do not imply that Google algorithmically determines whether your articles are interesting or whether you have presented both sides of a story.

Rather, these questions outline factors that can influence how real-life users rate your site’s quality. No one truly understands all of the factors that Google considers when evaluating the quality of your site through Panda. Ultimately, the goal is to create the best possible website for your users.

Google Panda algorithm fact table

Date: February 23, 2011

Main Focus: Duplicate pages, plagiarized or thin content, user-generated spam, keyword stuffing

How it works: The Panda algorithm update gives web pages a “quality score”, which is used as a ranking metric.

Best practices: Check your site regularly for duplicate content, thin content, and keyword stuffing. You’ll need a site crawler like SEO PowerSuite’s WebSite Auditor to do this.
You can also avoid a potential penalty by using WebSite Auditor’s new Content Editor module to create pages without the risk of keyword stuffing.

Content Editor examines your top competitors’ pages and makes SEO recommendations based on content that has already proven successful in Google searches.
Use a plagiarism checker like Copyscape to see if your content has been duplicated elsewhere on the web.
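
Parts of this audit can be scripted. Below is a minimal, illustrative sketch (not Google’s actual method, and not how Copyscape works) that flags near-duplicate pages by comparing word-level shingles with Jaccard similarity:

```python
def shingles(text, k=5):
    """Split text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard_similarity(a, b, k=5):
    """Jaccard similarity of two pages' texts: 1.0 means identical shingle sets."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Pages scoring above a threshold (say 0.8) against each other are candidates for consolidation or rewriting.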

2. Penguin algorithm

Penguin’s goal is to reduce Google’s trust in websites that have cheated by creating unnatural backlinks to gain a competitive advantage in the SERPs. While Penguin’s primary focus is on unnatural links, other factors can influence a website’s ranking. Backlinks are widely acknowledged as the most critical factor to consider.

A backlink serves as a vote for your website. If a well-known website links to yours, Google sees it as a recommendation for your website. But if a smaller unknown website links to you, this vote will not be as influential as a vote from a more established website. Yet, if a large number of these small votes are cast, it can make a significant difference.

Another thing Google’s Penguin algorithm considers is anchor text: the clickable text of a link. The anchor text for a link to our homepage would be “ad network.”

When several sites link to Adsterra.com with the anchor text “Ad network,” Google knows that people looking for “Ad network” will want to see websites like Adsterra in their search results.

It’s easy to see how this part of the algorithm could be manipulated. We don’t know exactly what signals the Penguin algorithm works with, but we do know that it attempts to detect low-quality, self-made backlinks. The Penguin algorithm is similar to Google assigning a “trust factor” to your backlinks.
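
A quick way to audit your own link profile for this pattern is to compute the anchor-text distribution. This is an illustrative sketch; the (anchor, url) input format is an assumption, and in practice you would export backlinks from a backlink tool:

```python
from collections import Counter

def anchor_distribution(backlinks):
    """Return each anchor text's share of a list of (anchor, url) backlinks.

    A distribution heavily skewed toward one exact-match commercial
    anchor is the classic pattern Penguin is believed to target.
    """
    counts = Counter(anchor.lower() for anchor, _ in backlinks)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}
```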

Penguin is a site-wide algorithm, according to John Mueller, a Google employee. So if the Penguin algorithm determines that a majority of your site’s backlinks are untrustworthy, Google will reduce its trust in your entire site. As a result, the site’s overall ranking will suffer.

Penguin search algorithm best practices

Penguin, like Panda, is a filter. This means that the algorithm is re-run regularly, and sites are re-evaluated. The latest Google Penguin update, which made Penguin run in real time, was released on September 23, 2016. So once you’ve made the necessary changes to your page, you’ll be able to recover almost immediately.

To recover from Penguin, you must first identify the unnatural links pointing to your site and either remove them or ask Google not to count them using the disavow tool if you can’t remove them.

Also, the Penguin algorithm is not the same as a manual penalty for unnatural links. To recover from Penguin, you do not need to submit a reconsideration request. You also won’t need to document your work in removing links because no Google employee will be reviewing it manually.

Google Penguin fact sheet

Date: April 24, 2012

Main Focus: Spammy or irrelevant backlinks, over-optimized anchor text

How it works: The goal of Google Penguin is to penalize websites with unnatural backlinks. After this update, low-effort link building, such as buying links from link farms and PBNs, was no longer effective.

How to adjust: Monitor the growth of your link profile and conduct regular audits with a backlink checker like SEO SpyGlass to avoid the effects of the Google Penguin update. Keep an eye out for any unusual spikes, which could indicate a negative SEO attack by your competitors.

The statistics that we know Penguin considers are factored into SEO SpyGlass’ Penalty Risk formula. Sort your backlink list from highest risk to lowest risk by going to the Penalty Risk tab. Add them to the disavow file, download it, and submit it to Google’s Disavow Links Tool if they turn out to be malicious.
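
The disavow file itself is plain text: lines starting with “#” are comments, “domain:” entries disavow an entire domain, and bare URLs disavow individual pages. A small helper (hypothetical, for illustration) can assemble it from your audit results:

```python
def build_disavow_file(domains, urls=()):
    """Build the text of a disavow file in the format Google's
    Disavow Links Tool accepts: '#' comment lines, 'domain:'
    prefixes for whole domains, and bare URLs for single pages."""
    lines = ["# Unnatural links identified during a backlink audit"]
    lines += [f"domain:{d}" for d in domains]
    lines += list(urls)
    return "\n".join(lines) + "\n"
```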

3. Hummingbird algorithm

Google first announced the Hummingbird update on September 26, 2013.

Hummingbird was a complete rewrite of Google’s entire algorithm. If you think of the Google algorithm as an engine, Panda and Penguin are algorithm changes that are akin to adding a new part to the engine, such as a filter or a fuel pump, as Danny Sullivan put it. On the other hand, Hummingbird wasn’t just a new part; it was an entirely new engine.

Many of the old updates (such as Panda and Penguin) are still used in the new engine, but a large portion of it is brand new.

The Hummingbird algorithm is designed to help Google better understand a user’s query. In his post about Google patents, Bill Slawski provides a great example of this. He explains that when a user searches for “What is the best place to find and eat seafood?” Hummingbird can tell that the user is most likely interested in “restaurants” results.

These changes helped Google’s voice search to become more effective. When we type a search query, we might type “best ad network,” but when we speak a query (for example, through Google Glass or Google Now), we’re more likely to say, “Which ad network is the most profitable for publishers?” Hummingbird’s goal is to understand better what users are asking when they ask questions like this.

Hummingbird best practices

If you read the posts linked above, you’ll notice that the answer to this question is to create content that satisfies search intent rather than simply trying to rank for a specific keyword. But you should be doing this already!

Google’s goal is to encourage webmasters to publish the best content possible and provide answers to people looking for information. You’re on the right track if you can create content that answers people’s questions.

When it comes to “recovering” from Hummingbird, we realize that’s a very hazy answer. Hummingbird is distinct from Panda and Penguin. When the Panda or Penguin algorithm demotes a site, it’s because Google has lost faith in its quality, whether that’s on-site quality or the legitimacy of its backlinks. You can regain the algorithm’s trust and, as a result, see improvements if you address those quality issues.

However, if your website has been performing poorly since the launch of Hummingbird, there is no way to reclaim those previously held keyword rankings. You can, however, attract new visitors by improving the depth of your website’s offerings.

Date: August 22, 2013

Main Focus: Low-quality content, keyword stuffing

How it works: The Hummingbird search algorithm helps Google better understand search queries and deliver relevant results (instead of the individual terms within the query). Even if a page doesn’t contain the exact words entered by the searcher, the Hummingbird algorithm allows it to rank for a query. Natural language processing uses latent semantic indexing, co-occurring terms, and synonyms to achieve this.

How to adjust: Expand your keyword research to include concepts. Analyze related searches, synonyms, and related terms. Similar searches and questions, as well as Google Autocomplete, are excellent sources of ideas.
Learn your audience’s language and adapt your content accordingly. Creating comprehensive content that meets searcher intent benefits both your site’s engagement and SEO.

4. RankBrain algorithm

RankBrain is a machine learning system that complements Hummingbird. It assists Google in processing and answering unfamiliar, unique, and original queries by using historical data on previous user behavior.

Date: October 26, 2015

Main Focus: Lack of query-specific relevance; shallow content; poor UX.

Best practices:

To diversify your content, it’s a good idea to broaden your keyword research and pay special attention to related searches and synonyms. Whether you like it or not, the days of relying solely on Google AdWords’ short-tail terms are over.

Furthermore, as search engines’ ability to process natural language improves, unnatural phrasing, particularly in titles and meta descriptions, may become an issue.

With competitive analysis, you can optimize your pages for relevance and comprehensiveness. With WebSite Auditor’s TF-IDF tool, you can determine what phrases and keywords your competitors are using. Find a way to include them in your content to improve search relevance.
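
TF-IDF itself is a standard weighting formula that you can sketch in a few lines. The following is a minimal illustration of the idea behind such tools, not WebSite Auditor’s actual implementation:

```python
import math
from collections import Counter

def tf_idf(pages):
    """Compute TF-IDF weights for every term on every page.

    pages: list of page texts (e.g. your page plus top-ranking
    competitors). Terms that score high on competitor pages but are
    absent from yours are candidate topics to cover.
    """
    docs = [p.lower().split() for p in pages]
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return weights
```

Terms appearing in every document get a weight of zero, which is exactly why generic words drop out of the comparison.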

5. Fred algorithm

Fred is the unofficial name for a Google update that penalized websites that used overly aggressive monetization. Excessive advertising, low-value content, and websites that provide little value to users were all targeted by the algorithm. It penalized websites that existed solely to generate revenue rather than to provide useful content.

Date: March 7, 2017

Main Focus: Aggressive monetization, Misleading or deceptive ads, User experience barriers, Poor mobile compatibility, Thin content.

Best practice

It’s perfectly fine to have ads on your website; however, if they prevent users from reading your content, consider reducing their quantity and reconsidering their placement. It’s also a good idea to self-evaluate your website using the Google Search Quality Rater Guidelines.

As usual, look for pages with insufficient content and fix them. And, of course, keep striving to improve the user experience.

6. Mobilegeddon (mobile-friendly update)

Google’s Mobile-Friendly Update (2015), or Mobilegeddon, was created to ensure that mobile-friendly pages rank higher in mobile search and non-mobile-friendly pages rank lower. But ranking websites on mobile-friendliness alone was no longer sufficient, so Google later introduced mobile-first indexing, which means it began indexing webpages with the smartphone agent first. Websites with only desktop versions are still indexed, however.

Date: April 21, 2015.

Main Focus: Mobile-friendliness (the update affected all websites, mobile-friendly and not)

Best Practice: Check your Search Console to see if your website has migrated to mobile-first indexing. Also, double-check that your robots.txt file does not prevent Googlebot from crawling your pages.
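
The robots.txt check can be scripted with Python’s standard library. Here is a minimal sketch that parses the file body directly, so it runs offline (in practice you would fetch https://yoursite.com/robots.txt first):

```python
from urllib import robotparser

def googlebot_allowed(robots_txt, url):
    """Return whether a robots.txt body allows Googlebot to crawl a URL."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)
```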

If you haven’t optimized your website for mobile devices, now is the time to do so. If your site is already mobile-friendly, use the mobile-friendly test to see if it meets Google’s requirements.

Make sure your mobile site has the same content as your desktop site if you use dynamic serving or separate URLs. Furthermore, structured data and metadata should be available on both versions of your website.

How to succeed with Google’s algorithm in 2022

1. Consistently publish engaging content

It’s been three years since content overtook links as the most critical factor in Google’s search algorithm, and content’s weight in the algorithm increased in 2022.

In the last year, it’s become increasingly clear that Google tests new content to see if it responds well to the keyword’s search intent. If searcher behavior indicates that your content answers their questions, the content is promoted. Google’s AI prefers thought leadership content that is published at least twice a week.

2. Use keywords in meta and title tags

It’s essential for ranking to include the keywords your page should rank for in the title tag. While any publisher with SEO experience understands this, keyword strategy is a time-consuming intellectual task that can easily take 20-30 minutes per page. Place keywords at the beginning of the title tag, or as close to the beginning as possible. Keyword density in the web page’s content is also critical.
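
Title-tag checks like these are easy to automate across a site. A minimal sketch follows; the 60-character limit is a common display rule of thumb, not an official Google number:

```python
def title_keyword_check(title, keyword, max_length=60):
    """Sanity-check a title tag: keyword presence, how early it
    appears, and whether the title fits typical display limits."""
    pos = title.lower().find(keyword.lower())
    return {
        "contains_keyword": pos != -1,
        "keyword_position": pos,          # 0 means at the very start
        "within_display_limit": len(title) <= max_length,
    }
```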

While backlinks are still a significant factor in Google’s decision to rank a website in its search results, content is now the most important ranking factor.

3. Improve your Niche expertise

Google started ranking websites that are niche experts in mid-2017. Being a niche expert means having a hub of 10+ authoritative pages centered on the same “seed” keyword. For example, the keyword “advertising network” could be the nucleus keyword for an ad network company with landing pages devoted to “ad networks for small business,” “ad network for advertisers,” as well as FAQ landing pages devoted to “ad pricing,” and more.

The nucleus keyword’s consistency across the website’s pages acts as a magnet, attracting visitors from any Google search that includes the nucleus keyword.

4. Use internal links

In 2017, Google also placed a lot more emphasis on internal links, which is often discussed in conjunction with hubs. If internal links connect web pages with the same or similar keywords in their title tags, the website will often rank higher for that keyword. 

Publishing 100 articles on various aspects of a subject and linking them all back to a single authoritative page would be a powerful expression of that page’s value and would give that page a higher ranking ability.
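
You can verify that the hub page really does attract the most internal links with a simple count. The link_graph input format below is an assumption for illustration; a site crawler would produce it:

```python
from collections import Counter

def inbound_link_counts(link_graph):
    """Count internal links pointing at each page.

    link_graph: {page_url: [linked_page_url, ...]}. In a healthy
    hub-and-spoke structure, the hub shows the highest inbound count.
    """
    counts = Counter()
    for source, targets in link_graph.items():
        for target in targets:
            if target != source:  # ignore self-links
                counts[target] += 1
    return counts
```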

Note: In 2022, the hub-and-spoke approach, which combines keywords in meta title tags, niche expertise, and internal links, will be the most comprehensive and effective SEO strategy.

5. Make your website Mobile-first

If you want to get more visitors in 2022, your website must be easy to navigate on mobile phones and tablets, not just on computers. The old standard was “mobile-friendliness,” but Google has moved to a mobile-first world, which means it expects you to focus on mobile visitors when designing your website. The site should serve the same content on mobile and desktop: keep the layout simple, and optimize navigation for a mobile user experience.

6. Improve page speed

Google has always placed a premium on the user experience, which is why it has invested in thousands of data centers worldwide to serve search results in milliseconds. Your website should follow Google’s lead and prioritize site speed. Pages should load as quickly as possible. Your site’s ranking ability decreases with each additional second it takes to load. Google’s free PageSpeed Insights tool can help you test your page speed.
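
PageSpeed Insights also has a public API (the v5 runPagespeed endpoint), so speed audits can be scripted. A minimal sketch that only builds the request URL, so nothing is fetched here:

```python
from urllib.parse import urlencode

def pagespeed_request_url(page_url, strategy="mobile", api_key=None):
    """Build a request URL for the PageSpeed Insights v5 API.

    strategy is "mobile" or "desktop"; an API key is optional for
    light use but recommended for regular auditing.
    """
    base = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{base}?{urlencode(params)}"
```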

7. Enable Site security / SSL certificate

 As the internet has become more important in our lives, so have hackers. Serving up sites that are harmful to Google’s searchers would be Google’s worst nightmare. As a result, if your domain is even vulnerable to being hacked—for example, if your site doesn’t have an SSL certificate (indicated by the “s” at the end of “HTTPS”)—it will lose its ranking ability. An SSL certificate is usually free and straightforward to obtain from your registrar.

8. Offsite mentions

Experts have long debated whether Google considers mentions of a website that aren’t hyperlinks. A mention is the same concept as a link, but without the actual linking, so why shouldn’t it boost a site’s Google credibility? This factor, while still new, accounts for a small but significant portion of Google’s algorithm.

9. Boost user engagement

This factor is part of the most significant change to Google’s algorithm in the last five years. Previously, Google was hesitant to give weight to an on-site factor that site owners could easily manipulate. But Google’s increasingly sophisticated search technology has made user engagement a significant part of its algorithm.

10. Deploy Schema Markup / Structured Data

Like a modern version of meta tags, schema markup lets you add tags to your web pages that help Google serve more visual search results like rich snippets. You’re probably familiar with schema markup if you’ve ever seen search results that are longer and list out a site’s main pages, highlight an important piece of data, or include a 5-star rating system or a calendar of events.

Google prefers pages that use schema markup because the search results from those pages are more useful to searchers. As an added benefit, they make search results stand out from the rest of the page.
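
Schema markup is usually embedded as a JSON-LD script block. As a minimal illustration, this sketch builds a schema.org Article block; real pages typically include more properties (image, publisher, dateModified, and so on), and you should validate the output with Google’s Rich Results Test:

```python
import json

def article_jsonld(headline, author, date_published):
    """Build a minimal schema.org Article JSON-LD script tag
    for the page's <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)
```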

11. Put keywords in URL 

A holdover of old-school SEO from the 2000s: including the keyword(s) you’re targeting in the page’s URL is still a best practice, even if it has very little weight in the algorithm.

12. Use keywords in Header Tags 

Using keywords in a webpage’s H1, H2, and H3 tags is a best practice that might improve a page’s ranking ability by a small margin. This isn’t something you should do excessively, but it’s something to keep in mind.

2021-2022 Google algorithm ranking factors

Factor: Algorithm Effect
Regular Engaging Content: 26%
Keywords in Meta Title Tags: 22%
Backlinks: 16%
Mobile-First Website: 13%
Niche Expertise: 12%
Internal Links: 5%
User Engagement: 5%
Page Speed: 3%
Site Security / SSL Certificate: 2%
Offsite Mentions: 1%
Schema Markup / Structured Data: 1%
Keywords in URL: 1%
Keywords in Header Tags: 1%
Keywords in Meta Description Tags and Other Factors: 1%
  • The #1 factor, Consistently Publish Engaging Content, became more important, especially with creating hubs of interlinked pages targeting the same keyword.
  • Mobile-friendliness may not be as important a differentiator because it is now the standard, and Google only penalizes websites that have made no effort in this area.
  • Keywords in meta titles became more important, emphasizing the importance of a proper keyword research strategy that satisfies users’ search intent.

How do I know when Google releases a new algorithm update?

There’s no need to track every single tweak Google makes to its algorithm. However, it’s crucial you track core updates, so you can adapt your SEO strategy accordingly.

First, set up a Google Alert for algorithm changes. With Google Alerts, you’ll receive a notification straight to your inbox whenever algorithm updates are mentioned online so you can start preparing for the changes as soon as possible.

Next, if you’re on Twitter, follow Google SearchLiaison. It’s an official account where you’ll see notifications of core algorithm updates. Google Alerts don’t work on social media, so following this account gives you another way to track any mentions of algorithm changes.

From here, you can read the official announcements to learn more about how these algorithm updates affect you. You can also learn more about planned improvements to Google, which may help you improve your SEO strategy more generally.

Finally, you can use Google Analytics to help you spot changes to the algorithm. How can Google Analytics help? Well, it can help you identify unusual fluctuations in traffic and conversions, for one thing. You can then take a closer look at your performance to see if you’ve been affected by a Google algorithm change.

Is Google’s algorithm different from other search engines?

Yes. For one thing, we don’t know how often search engines like Bing or Yahoo update their algorithms, whereas we know that Google updates its algorithm very frequently.

On the other hand, there’s a little more transparency around Bing’s key ranking factors, which include:

  • metadata
  • page loading time
  • quantity and quality of backlinks

In some ways, then, it’s easier to understand the Bing algorithm and tweak your SEO strategy to match. Since Yahoo is closely tied to Bing, the principles are very similar; the difference is that Bing has published a transparent and useful guide to help publishers do better.

How to determine which Google algorithm update hit your website

Have you noticed a steep decline in traffic recently? An algorithm change could be responsible. Here are the two best ways to determine which update just hit your website and how to recover from the penalties:

  1. First, check out Google Search Central. This platform contains multiple resources to help you diagnose common performance problems and identify possible algorithm penalties.
  2. Next, log on to Search Console, a free analytics tool from Google. Whether you want to identify mobile usability problems or monitor your site’s performance, Search Console has the resources you need.

Combined, these tools can help you easily track changes to the Google algorithm.
