Google introduces new “natural language” algorithm

Posted on October 28, 2019


A blog post from Google has revealed that it is making its most significant algorithm change in several years, as it aims to get the search engine to better understand the language humans use every day.

The algorithm alteration is called Bidirectional Encoder Representations from Transformers (BERT). It focuses on the problem users sometimes face in converting a query they have in their minds into words a search engine can understand. Search Engine Land describes it as the “biggest change in search since Google released RankBrain” in 2015.

BERT was made available on an open-source basis last year, but as of last week Google began rolling it out in search, and it is likely to be fully live in the near future. Google explains that the algorithm helps the search engine understand the nuance of the user’s query, and has given examples to illustrate this.

One example is that with a search for “2019 brazil traveler to usa need a visa”, a pre-BERT Google would have returned results related to US citizens travelling to Brazil. It would have overlooked the word ‘to’, which indicates the opposite – a Brazilian visiting the USA. Once BERT is fully operational, such a search will direct the user to the site of the US embassy in Brazil.

A second example relates to featured snippets, and shows that they previously sometimes put too much emphasis on nouns and not the prepositions around them. In a search for “parking on a hill with no curb”, Google previously provided a snippet that matched the word ‘curb’ while overlooking the word ‘no’, so in effect it failed to answer the question. The implementation of BERT means that it will now understand that the user wants guidance on situations where there is no curb (or ‘kerb’ here in the UK).

From an SEO perspective, this has interesting ramifications for the concept of ‘stop words’. Traditionally, these are words like ‘to’, ‘the’, ‘a’ and ‘some’ that search engines tend to ignore. As search engines come to understand human language better, they are learning that these words can actually change the meaning of a sentence.
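To see why discarding stop words can lose meaning, here is a minimal sketch (a toy illustration, not Google’s actual pipeline) of a bag-of-words model that drops stop words. With the word ‘to’ ignored and word order discarded, the visa query from Google’s example and its exact opposite become indistinguishable:

```python
# Toy bag-of-words model that discards stop words, as older keyword
# matching tended to do. Function and variable names are illustrative.
STOP_WORDS = {"to", "the", "a", "some"}

def bag_of_words(query: str) -> frozenset:
    """Lower-case, split on whitespace, and drop stop words.
    Word order is discarded entirely."""
    return frozenset(w for w in query.lower().split() if w not in STOP_WORDS)

q1 = "2019 brazil traveler to usa need a visa"   # Brazilian visiting the USA
q2 = "2019 usa traveler to brazil need a visa"   # the opposite direction

# Both queries collapse to the same set of keywords, so a stop-word-blind
# model cannot tell which direction of travel the user means.
print(bag_of_words(q1) == bag_of_words(q2))  # True
```

A model such as BERT, which reads the query in order and in both directions, keeps the preposition ‘to’ and so can distinguish the two.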

John Murray
Content Team Leader at Engage Web
John works for Engage Web as a Content Team Leader and regularly contributes to the website and programmes of his beloved Chester F.C.