A blog post from Google has revealed that the company is making its most significant algorithm change in several years, as it aims to get the search engine to better understand the language humans use every day.
The update is based on a technique called Bidirectional Encoder Representations from Transformers (BERT). It tackles a problem users sometimes face: converting the query in their minds into words a search engine can understand. Search Engine Land describes it as the “biggest change in search since Google released RankBrain” in 2015.
BERT was released as open source last year, but Google only began rolling it out in its search engine last week, and it is likely to be fully live in the near future. Google explains that the algorithm helps the search engine understand the nuance of the user’s query, and has given examples to illustrate this.
One example is that with a search for “2019 brazil traveler to usa need a visa”, a pre-BERT Google would have returned results related to US citizens travelling to Brazil. It would have overlooked the word ‘to’, which indicates the opposite – a Brazilian visiting the USA. Once BERT is fully operational, such a search will direct the user to the site of the US embassy in Brazil.
A second example relates to Google Snippets, and shows that they previously sometimes put too much emphasis on nouns and not enough on the prepositions around them. For the search “parking on a hill with no curb”, Google previously provided a Snippet that matched the word ‘curb’ while ignoring the word ‘no’, so in effect it failed to answer the question. With BERT implemented, it will now understand that the user wants guidance on situations where there is no curb (or ‘kerb’ here in the UK).
From an SEO perspective, this has interesting ramifications for the concept of ‘stop words’. Traditionally, these are words like ‘to’, ‘the’, ‘a’ and ‘some’ that search engines tend to ignore. As search engines come to understand human language better, they are learning that these words can actually change the meaning of a sentence.
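The cost of discarding stop words can be sketched in a few lines of Python. The stop-word list and the `strip_stop_words` helper below are illustrative assumptions, not Google’s actual implementation, but they show how both of the examples above go wrong once the small words are dropped:

```python
# Illustrative stop-word list -- not Google's actual list.
STOP_WORDS = {"to", "the", "a", "some", "no", "with", "on"}

def strip_stop_words(query):
    """Return the query's tokens with stop words removed."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]

# Two opposite travel queries collapse to the same bag of words
# once 'to' is discarded, so their direction is lost.
q1 = strip_stop_words("brazil traveler to usa")
q2 = strip_stop_words("usa traveler to brazil")
print(sorted(q1) == sorted(q2))  # prints True

# And 'no' vanishes from the parking query, leaving tokens that
# match pages about curbs rather than the absence of one.
print(strip_stop_words("parking on a hill with no curb"))
# prints ['parking', 'hill', 'curb']
```

In both cases the surviving keywords point at plausible-looking but wrong results, which is exactly the gap a context-aware model such as BERT is meant to close.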