Google: How BERT Works, The Largest Update Of The Most Used Search Engine Algorithm In The World

Image caption (Getty Images): Nearly 3 billion searches are made daily around the world.

"People's curiosity is infinite."

If one thing has defined the giant and hugely popular Google, it is the algorithm behind the nearly 3 billion searches made on its page every day.

However, in recent months – and according to the company's own report – that algorithm needed to be optimized to match the objectives of the search giant.

That is why, according to a statement published this month by company vice president Pandu Nayak, Google launched Bidirectional Encoder Representations from Transformers (BERT), the search engine's largest update in recent years.

According to Nayak, 15% of the billions of searches that Google processes each day are on topics or issues that they had not seen before.

"So we are building a mechanism to respond to queries that we cannot anticipate," he explained.

As Nayak himself points out, BERT is a system designed to better understand the language we speak every day.

"At its core, Google Search is based on understanding the language. Our job is to decipher what you want to look for on the web, no matter how you type it or how you combine the words on the keyboard," he noted.

Image caption (Getty Images): BERT functions as a neural network that can be trained to process natural language.

"Although we continue to improve the way we understand language, the truth is that sometimes we don't succeed," he added.

Neural system

According to Google, BERT is an open source neural network that has been trained to process natural language.

"This technology allows you to train your own system to improve and increase the ability to answer questions on Google," Nayak explained.

In short, what makes BERT different from what Google currently offers is that it processes words in the context of a sentence, rather than word for word.

"BERT can consider the complete context of a word when observing those that come before and after, and this is useful to understand the intention of the queries that are made in the search engine," said Nayak.

Image caption (Getty Images): Google has built the company's success on search algorithms.

But an example is worth a thousand words.

Take the search "2019 Brazilian traveler to the US needs a visa", in which the preposition "to" is essential. Before BERT, the search engine ignored it entirely, so the results "understood" the query as being about someone from the US who needed a visa for Brazil.

That is why the top results were usually press articles about Americans traveling to Brazil.

"With BERT, the search engine is able to understand the context provided by the preposition 'to' and can therefore offer better results, such as the State Department page where this information is provided," said Nayak.
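A toy sketch in Python (my own illustration, not Google's implementation) can show why this matters: a model that throws away "stop words" such as "to" cannot tell which direction the traveler is going, while even a minimal form of context, here just pairs of neighboring words, recovers it. BERT does something far more sophisticated, but the intuition is the same.

```python
# Toy sketch (illustrative only, not Google's ranking code): why dropping
# the preposition "to" loses the direction of travel in the visa query.

STOP_WORDS = {"to", "a"}

def content_words(text):
    """Bag of words with stop words discarded (a caricature of pre-BERT matching)."""
    return {w for w in text.lower().split() if w not in STOP_WORDS}

def bigrams(text):
    """Adjacent word pairs: a crude stand-in for 'words before and after'."""
    words = text.lower().split()
    return set(zip(words, words[1:]))

query = "brazilian traveler to usa needs a visa"
doc_brazil_to_usa = "brazilian traveler to usa visa rules"
doc_usa_to_brazil = "usa traveler to brazilian visa rules"

# Without the preposition, both documents look equally relevant...
tie = len(content_words(query) & content_words(doc_brazil_to_usa)) == \
      len(content_words(query) & content_words(doc_usa_to_brazil))

# ...but keeping "to" in its local context separates them.
score_right = len(bigrams(query) & bigrams(doc_brazil_to_usa))
score_wrong = len(bigrams(query) & bigrams(doc_usa_to_brazil))

print(tie, score_right, score_wrong)  # → True 3 1
```

With stop words removed, both documents share exactly the same four content words with the query, so they tie; once "to" is kept next to its neighbors, the document about a Brazilian traveling to the US scores clearly higher.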

Is it more effective?

For now BERT is only in use for English searches, but Google announced that it will soon be available for languages such as Korean, Hindi and Portuguese.

However, several technology analysts have pointed out that, despite the advances BERT brings, the underlying problem of understanding language remains unsolved.

"Prepositions and pronouns have been very problematic historically for search engines, but BERT helps a lot with this. The context improves due to the bi-directional nature of the new algorithm," algorithm expert Dawn Anderson told Search Engine Journal.

Image caption (Getty Images): BERT has been regarded as a real breakthrough in artificial intelligence.

"But there is still a lot of work ahead. For example, the word 'bass' means many things (it can be an adjective or a musical instrument). There are many meanings for a single word. That is why it is vital that the context be understood in order to grasp the precise meaning of the term," she added.

So far, according to outlets such as The Verge and Wired, Google's results have improved in up to 10% of searches since BERT's rollout.

"Everyone who makes money on web traffic should take note of this progress," said Dieter Bohn of The Verge.

And he highlighted a phrase from Nayak himself: "This is the biggest, most positive change we've had in the last five years and perhaps one of the biggest from the beginning."
