BERT Update: How Google’s State-of-the-Art AI Algorithm Will Impact You


Understanding human language well enough to offer users the most relevant results has always been Google's prime goal. In 2015, Google released RankBrain, its first big step toward that goal.

Years later, just a few days ago, the search engine giant made "the biggest leap forward in five years, and one of the biggest leaps forward in the history of Search".

According to Pandu Nayak, Google Fellow and Vice President, Search, “Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it–BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.”

BERT originated as a research paper, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", authored by Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova at Google AI Language.

By delivering state-of-the-art results on a wide range of NLP tasks, including Natural Language Inference (MNLI), Question Answering (SQuAD v1.1), and various others, BERT has caused a stir in the machine learning community.

But what is BERT actually all about? Will it bring a major change to search results? Will it be beneficial for SEO?

Read on to get the answers.

What is Google BERT?

Google BERT, or Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing that has been pre-trained on the Wikipedia corpus. It is a machine learning model that should lead to a better understanding of queries as well as content.
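As a concrete illustration, here is a minimal sketch using the open-source Hugging Face "transformers" library (an assumption made purely for illustration; Google's production systems are not public). It loads a pre-trained BERT model and turns a sentence into one contextual vector per token, which is the raw material the model uses to understand queries and content.

# Minimal sketch, assuming the Hugging Face "transformers" and PyTorch packages are installed.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Break the sentence into sub-word tokens and run it through the pre-trained model.
inputs = tokenizer("Can you get medicine for someone at the pharmacy?", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token (shape: [1, number_of_tokens, 768] for bert-base).
print(outputs.last_hidden_state.shape)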

What makes it different?

BERT builds upon recent work in pre-training contextual representations, including Generative Pre-Training, ELMo, Semi-supervised Sequence Learning, and ULMFit. But unlike these prior models, BERT is the first unsupervised, deeply bidirectional language representation, pre-trained using only a plain text corpus.

What is the difference between BERT and RankBrain?

There are some similarities between BERT and RankBrain, Google's first artificial intelligence method for understanding queries, but they are two separate algorithms, and there are differences too.

According to Eric Enge, general manager at Perficient Digital, “The first thing to understand about RankBrain is that it runs in parallel with the normal organic search ranking algorithms, and it is used to make adjustments to the results calculated by those algorithms”.

To adjust results, RankBrain looks at the current query and finds similar past queries. It also helps Google interpret search queries so that it can surface results that may not contain the exact words in the query.

However, “BERT operates in a completely different manner,” said Enge. It looks at the content before and after a word to inform its understanding of the meaning and significance of the word.

Google uses both BERT and RankBrain to process queries and web content and gain a better understanding of what the words mean.

What challenges does it help to solve?

There are many things that we humans recognize easily but that are difficult for machines and search engines to comprehend. These include:

  • The problem with words

Words are tricky to understand because plenty of them are polysemous, ambiguous, or synonymous. BERT is designed to help decipher ambiguous phrases and sentences made up of words with multiple meanings (the short sketch after this list illustrates the point).

  • Word’s context

According to the philosopher Ludwig Wittgenstein, "The meaning of a word is its use in a language". In other words, a word has no fixed meaning until it is used in a particular context. The meaning of a word in a sentence shifts with its part of speech and the words around it, so the longer a sentence is, the harder it becomes to keep track of the various parts of speech in that sentence.

  • On NER and NLU

NLU, or Natural Language Understanding, requires knowledge of context and a sense of reasoning. That comes easily to humans but is highly challenging for machines. Structured data helps search engines pin down named entities, but much of the text in between is unstructured, leaving lots of gaps to fill. This is where NLU comes in: it helps search engines bridge the gaps between named entities.
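To see why context matters so much, here is a hedged sketch (again assuming the Hugging Face "transformers" library and PyTorch, purely for illustration) showing that BERT gives the same word a different vector depending on the sentence it appears in: "bank" beside a river ends up far from "bank" as a financial institution.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_word(sentence, word):
    # Return BERT's contextual vector for the first occurrence of `word` in `sentence`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one vector per token
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

river = embed_word("he sat on the bank of the river", "bank")
money = embed_word("she deposited the cash at the bank", "bank")
shore = embed_word("he fished from the bank of the stream", "bank")

cos = torch.nn.functional.cosine_similarity
print(cos(river, money, dim=0))  # expected to be lower: different senses of "bank"
print(cos(river, shore, dim=0))  # expected to be higher: same "river bank" sense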

How will it work?

To understand how this language model works, let's have a look at the acronym and what it stands for:

B: Bi-directional

Previously, language models were uni-directional: they could move their context window in only one direction (either from left to right or from right to left) to understand a word's context. BERT is different. It is a bi-directional language model, so it can see the whole sentence on both sides of a word and use all of it as context.
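A quick way to see this bi-directionality in action is BERT's "fill-mask" task. In the hedged sketch below (assuming the Hugging Face "transformers" library), the words to the left of the blank are identical in both sentences, so only the words to the right can change the prediction; a purely left-to-right model could not tell the two apart.

from transformers import pipeline

# BERT was pre-trained to fill in masked words using context from BOTH sides.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Identical left-hand context, different right-hand context.
bread = unmasker("The man went to the [MASK] to buy a loaf of bread.")
pray = unmasker("The man went to the [MASK] to pray.")

# The top predictions differ, which shows the model is reading past the blank.
print(bread[0]["token_str"])
print(pray[0]["token_str"])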

ER: Encoder Representations

The encoder turns each word of the input into a representation that captures its meaning in context; what gets encoded can then be decoded by downstream tasks. It is an in-and-out mechanism.

T: Transformers

BERT uses “transformers” as well as “masked language modeling”. One of the biggest problems with NLU in the past has been the inability to work out in what context a word is used. Transformers pay attention to all the surrounding words and weigh those relationships to work out a word's actual meaning.
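The sketch below (same assumed Hugging Face "transformers" library, for illustration only) exposes those attention weights: for every token in a sentence, each transformer layer produces a grid of scores describing how strongly that token attends to every other token.

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The bank will not approve the loan.", return_tensors="pt")
outputs = model(**inputs)

# One attention map per layer, each shaped [batch, heads, tokens, tokens];
# row i says how much attention token i pays to every other token.
print(len(outputs.attentions), outputs.attentions[0].shape)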

What kinds of Natural Language tasks does BERT help with?

BERT will not help with everything; it has its limitations. But the tasks it does help with include (see the short sketch after this list for one of them):

  • Coreference resolution
  • Named entity determination
  • Question answering
  • Textual entailment and predicting the next sentence
  • Automatic summarization
  • Polysemy resolution
  • Word sense disambiguation
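As a taste of one of these tasks, here is a hedged question-answering sketch. It assumes the Hugging Face "transformers" library and a publicly shared BERT checkpoint fine-tuned on SQuAD ("bert-large-uncased-whole-word-masking-finetuned-squad"); these names are illustrative and are not part of Google's search stack.

from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT was introduced by researchers at Google AI Language and was "
    "open-sourced as a pre-training technique for natural language processing."
)

# The model extracts the answer span directly from the supplied passage.
print(qa(question="Who introduced BERT?", context=context))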

How will it impact search?

According to Google, “BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time”. But how will BERT help Google understand? BERT has a better grasp of the nuances of human language, so it can help Google interpret longer, question-like queries and give users the right answers.

Moreover, because BERT can carry what it learns in one language over to many others, there is a colossal possibility of transferring a lot of learnings to other languages, even where it does not fully understand those languages.

Does Google BERT affect SEO?

Yes, the BERT update will affect SEO. Let me explain.

Search engine optimization (SEO) is the process of making your site better for search engines. Therefore, any update a search engine makes to its algorithm influences the search engine optimization process.

Now, the question that comes to mind is: what should you do to optimize for the Google BERT update? If we go by Danny Sullivan, Google’s public Search Liaison, the answer is simple: nothing new.

According to his tweet, “There’s nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged”. The one and only thing he asks everyone to focus on is writing content for users.

Conclusion

BERT is undeniably a major breakthrough in the use of machine learning for natural language processing. Because it is openly available and permits fine-tuning, it is expected to enable many practical applications in the future. Here, we have tried to present the main ideas of the paper. If you are keen to dig deeper, read the full paper and let us know what you think.
