
BERT: Google’s Largest Update in Years

By Manick Bhan on Sep 30, 2024 - 9 minute read

The search engine optimization landscape changed in 2019 when Google introduced the BERT update for SEO. The change in the algorithm focused on improving how search queries were understood to provide better results for users.

With this update, search engine optimization strategies were elevated to a new level, allowing for a better understanding of user intent and optimizing on-page SEO dynamics. Here is everything SEOs and site owners should know about Google’s advanced natural language processing model.

What is Google’s BERT Update for SEO?

BERT is a deep learning, natural language processing (NLP) algorithm launched by Google. The update rolled out in October 2019 for US English, and before the end of the year it had already expanded to more than 70 languages. The algorithm aims to understand context and interpret queries more accurately to provide users with more relevant results.

The name is an acronym for Bidirectional Encoder Representations from Transformers. BERT utilizes a machine learning model that is capable of comprehending the nuances and subtleties of language, including context and relationships between words in a sentence.

What was the effect of the BERT algorithm update on Google search?

This is what Google said:

“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.

Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”

In fact, according to Google, the BERT update affects 1 in 10 English searches in the U.S., or 10% of all search queries. It is also Google’s most significant update in the past five years, at least by the company’s own account:

“We’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”

The introduction of BERT caused major ranking fluctuations in the first week of the new algorithm’s deployment. SEOs over at WebmasterWorld commented on the fluctuations throughout that week, noting the largest rankings volatility to hit the SEO world since RankBrain.


But What Is Natural Language Processing?

As you saw, BERT uses natural language processing (NLP) to comprehend the user’s search intent. NLP is a field of artificial intelligence (AI) that combines linguistics, computer science, and other methods to bridge the gap between human communication and machine understanding.

In verbal communication, it’s easy to notice variations in how we speak, right? A person who grew up in Texas probably knows different slang or uses a different meaning for a term than someone from California.

If comprehension is sometimes challenging for us, the problem is even bigger for machines, which have traditionally needed structured data to understand queries, especially local searches. After all, these regional variations exist in almost every language and country in the world.

NLP models aim to analyze and process human language in a way that allows machines to derive meaning and context from words, sentences, and entire documents. This involves tasks such as text analysis, sentiment analysis, language translation, and speech recognition.
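To make one of those tasks concrete, here is a minimal sentiment-analysis sketch using the open-source Hugging Face transformers library and its default English sentiment model. This is purely an illustration of an NLP task, not a tool Google Search itself uses:

```python
# pip install transformers torch
from transformers import pipeline

# Load a general-purpose English sentiment classifier
# (transformers picks a default model for this task).
classifier = pipeline("sentiment-analysis")

# An NLP model turns raw human language into structured output:
# here, a sentiment label plus a confidence score.
for text in [
    "This update made our rankings soar!",
    "Our organic traffic dropped off a cliff last week.",
]:
    result = classifier(text)[0]
    print(f"{text!r} -> {result['label']} ({result['score']:.2f})")
```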

BERT Analyzes Search Queries, NOT Webpages

The October 24th, 2019, release of the BERT update improved how the search engine giant analyzes and understands search queries, not how it reads webpages themselves.

This is possible because BERT helps Google understand the context of each word in a search query. Roger Montti from SEJ provided a great example of this in his piece on the BERT update:
He used the example search “How to catch a cow fishing?” Traditionally, Google displayed results about livestock. After BERT, the search engine understands that “fishing” changes the meaning of “cow,” which in a fishing context refers to a large bass rather than a bovine.
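You can reproduce this kind of context sensitivity with the publicly released BERT weights. Below is a sketch of our own (using Hugging Face’s transformers and the bert-base-uncased checkpoint, not Google Search itself) that compares BERT’s vector for “cow” in a fishing sentence against its vector in farming sentences:

```python
# pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

fishing = embedding_of("how to catch a cow while fishing", "cow")
farming = embedding_of("how to milk a cow on a dairy farm", "cow")
grazing = embedding_of("the cow grazed in the green field", "cow")

# The same word gets a different vector in different contexts:
cos = torch.nn.functional.cosine_similarity
print(cos(fishing, farming, dim=0).item())  # lower: different senses
print(cos(farming, grazing, dim=0).item())  # higher: same bovine sense
```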


How Does BERT Work?

BERT improves Google’s understanding of search queries by using a bidirectional (that’s the B in BERT) context model, which in turn surfaces more appropriate organic search results. Google actually talked a bit about the mechanics of BERT in 2018, when it announced BERT as a new technique for natural language processing pre-training.

What is pre-training? Pre-training is just teaching a machine how to do a task before you actually give it real work to do. Traditionally, pre-training datasets are loaded with a few thousand to a few hundred thousand human-labeled examples.

Pre-training has been around for a while, but what makes BERT special is that it’s both contextual (each word’s meaning shifts based on the words around it) and bidirectional. In other words, the meaning of a term is understood based on the words both before it and after it.

Per Google’s Blog:

“In the sentence ‘I accessed the bank account,’ a unidirectional contextual model would represent ‘bank’ based on ‘I accessed the’ but not ‘account.’ However, BERT represents ‘bank’ using both its previous and next context — ‘I accessed the … account’ — starting from the very bottom of a deep neural network, making it deeply bidirectional.”
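You can observe this bidirectional behavior directly through the masked-word prediction task BERT is pre-trained on. Here is a minimal sketch using the open-source bert-base-uncased checkpoint via Hugging Face’s transformers library (again, an illustration with the published research model, not Google’s production Search system):

```python
# pip install transformers torch
from transformers import pipeline

# Load the public BERT model with its masked-language-modeling head.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words on BOTH sides of [MASK] before predicting it.
for prediction in unmasker("I accessed the [MASK] account."):
    print(prediction["token_str"], round(prediction["score"], 3))

# "bank" should rank highly, because the left context ("accessed the")
# and the right context ("account") are considered together.
```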

The Google BERT update builds on recent advancements in machine learning and entity recognition to provide a better user experience. Basically, BERT helps identify all the parts of speech and the context of the words before Google processes a search.
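As a rough illustration of what identifying parts of speech and entities in a query involves, here is a sketch using the open-source spaCy library. To be clear, spaCy is a separate toolkit chosen for demonstration, not one of Google’s systems, and the example assumes its small English model has been downloaded:

```python
# pip install spacy
# python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("can you get medicine for someone else at the pharmacy")

# Part-of-speech tag for each word in the query.
for token in doc:
    print(f"{token.text:>10} -> {token.pos_}")

# Any named entities detected in the query.
for ent in doc.ents:
    print(ent.text, ent.label_)
```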

What Does BERT Mean for Search Results?

According to Google, users are going to start seeing more relevant results that better match the intent of a search. This algorithm improvement spans regular results and rich snippet results.

Google provided a few helpful examples in their blog entry announcing BERT.

The Esthetician Example

First, we have a user trying to understand whether estheticians spend a lot of time on their feet as part of the job.

Before BERT, Google read the query “Do estheticians stand a lot at work?” and produced a result comparing types of work environments for estheticians.

After BERT, Google surfaces an article on the physical demands of being an esthetician, much more in line with the information the searcher was originally looking for.


The Math Practice Books Example

In this example, a user looking for math practice books for adults was instead shown math practice books for children.

After BERT, Google correctly recognizes the context and the user intent of the query, accounting better for the second part of the search “for adults.”


The Can You Pick Up Medicine For Someone Else Example

In this example, the query “Can you get medicine for someone pharmacy” used to return a result about filling prescriptions in general rather than how to fill them for a third party.

With the BERT update for SEO, Google better understands the goal of the user and surfaces a piece about whether or not a patient can have a friend or family member pick up their prescription for them.


The BERT update also helped Google maintain its market dominance in voice search, which is naturally more context-driven and relies on full sentences and questions.

What Can I Do to Rank Better after BERT?

For any keywords where your brand lost rankings, you should look at the revised search results page to better understand how Google is viewing the search intent of your target terms. Then, revise your content accordingly to better meet the user’s goals.

If you lost rankings under BERT, it’s more likely to be an issue related to how well your page matches a user’s search intent (helping a user reach their goal) than a content quality issue.

Given that the BERT update for SEO is likely to further support voice search efforts over time, we’d also recommend websites write clear and concise copy for their content strategy. Don’t use filler language, don’t be vague, get straight to the point.

In addition, invest in deeper keyword research for your target audience to write content that is more in line with each user intent and business goal.

Need help optimizing your content? Check out LinkGraph’s On Page Content Optimizer, reach out to a member of our team at [email protected], or schedule a meeting to get set up today.


Where Can I Learn More?

Dawn Anderson gave a great presentation at Pubcon on “Google BERT and Family and the Natural Language Understanding Leaderboard Race,” and you can take a look at her slides.

Jeff Dean also gave a keynote on AI at Google, including BERT, that you can watch.
