Google's BERT Search Algorithm Helps It Understand Natural Language

Google has released BERT, a new neural network-based technique that helps Search understand the intent behind search queries.

BERT stands for Bidirectional Encoder Representations from Transformers.

What is BERT?

BERT is one of the biggest updates to Google's search algorithm in quite some time.

In fact, it is likely to be the biggest change Google have made since RankBrain was introduced.

BERT is essentially a neural network-based technique for natural language processing (NLP) pre-training.

Put simply, it allows computers, including Google Search, to understand language more like humans do.

In more depth, Google puts it this way:

Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it–BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system. 
This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
But it’s not just advancements in software that can make this possible: we needed new hardware too. Some of the models we can build with BERT are so complex that they push the limits of what we can do using traditional hardware, so for the first time we’re using the latest Cloud TPUs to serve search results and get you more relevant information quickly. 

https://www.blog.google/products/search/search-language-understanding-bert/
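Under the hood, this "full context" behaviour comes from BERT's masked-language-model pre-training: the model learns to predict a hidden word from the words on both sides of it. As a rough illustration only, here is a minimal sketch using the open-source Hugging Face transformers library and the published bert-base-uncased model (an assumption for demonstration; Google's production Search system is not public):

# Minimal sketch: BERT fills in a masked word using context from BOTH sides.
# Assumes the open-source Hugging Face "transformers" package is installed;
# this is the public research model, not Google's production Search stack.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words before AND after [MASK] when predicting it.
for prediction in fill_mask("A traveler to the [MASK] may need a visa."):
    print(prediction["token_str"], round(prediction["score"], 3))

An older left-to-right language model would only see "A traveler to the" when guessing the missing word; BERT also sees "may need a visa", which is the "bidirectional" part of its name.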

What impact will BERT have?

Google says BERT will impact around 1 in 10 of all search queries.

BERT helps Google understand the context and nuance of words, which means queries can be matched to more relevant search results.

BERT is also used to improve featured snippets.

For now, the update applies to English-language searches in the US, with support for other locales and languages to follow.

Google says:

With the latest advancements from our research team in the science of language understanding–made possible by machine learning–we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search. 

https://www.blog.google/products/search/search-language-understanding-bert/

How do I optimise for BERT?

It is highly unlikely you can optimise your site or content explicitly for BERT.

You couldn't previously optimise for RankBrain either, and the same is likely to be true of BERT.

At the end of the day, it comes down to writing content for users, as you should already be doing. If you do this well, your content should rank for the corresponding queries.

https://twitter.com/dannysullivan/status/1188689288915050498
https://twitter.com/dannysullivan/status/1188698321113759744

Google gives the following example of how results will change.

For example, if someone searches "2019 brazil traveler to usa need a visa", Google can now better understand how the word "to" relates to the rest of the query. BERT allows Google to understand it is a Brazilian traveling to the US, rather than the other way around.
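To make that concrete, the sketch below (again using the public bert-base-uncased model as a stand-in, since Google's production models are not available) shows that BERT gives the word "to" a different vector in each query. A traditional one-vector-per-word model would represent "to" identically in both, losing the direction of travel.

# Hypothetical illustration with the public bert-base-uncased model:
# the same word "to" gets a different contextual vector in each query.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    # Return BERT's contextual vector for the first occurrence of `word`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one 768-dim vector per token
    position = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[position]

a = embedding_of("2019 brazil traveler to usa need a visa", "to")
b = embedding_of("2019 usa traveler to brazil need a visa", "to")
print(torch.cosine_similarity(a, b, dim=0))  # below 1.0: same word, different meaning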

\"\"
