Last week, Bing revealed that it had been using BERT in its search results since before Google announced its own update, and at a much larger scale.
Bing made the announcement in a blog post published on November 18th, in which Microsoft details the challenges it faced when rolling BERT out across its global search results. Bing also explained that it has been using the technology since April this year, roughly six months earlier than its competitor Google.
Google revealed in a blog post towards the end of October that it would begin using BERT to better understand search queries. However, the company is not yet using it at global scale: BERT currently affects around 10% of searches in its native US market, plus featured snippets in around two dozen countries.
What is BERT?
BERT, which stands for Bidirectional Encoder Representations from Transformers, is a natural-language-processing technique rather than a traditional ranking change. It tackles the problem users face in translating what they want into language search engines understand, with the aim of grasping what people actually need from their searches. We have written about this topic here and here.
BERT has been described by some as the biggest change to search in five years.
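The "bidirectional" part is what sets BERT apart from earlier left-to-right language models: every word in a query can draw on context from both sides. The toy sketch below (a hypothetical illustration, not Bing's or Google's implementation) shows which positions each token can "see" under the two schemes:

```python
# Toy illustration of bidirectional vs. left-to-right context.
# These helper functions are hypothetical, for illustration only.

def unidirectional_context(n_tokens):
    """Left-to-right model: token i can only attend to positions 0..i."""
    return [list(range(i + 1)) for i in range(n_tokens)]

def bidirectional_context(n_tokens):
    """BERT-style encoder: every token can attend to every position."""
    return [list(range(n_tokens)) for _ in range(n_tokens)]

tokens = ["can", "you", "get", "medicine", "for", "someone", "pharmacy"]
n = len(tokens)

# Left-to-right: "medicine" (index 3) cannot use "pharmacy" (index 6).
print(unidirectional_context(n)[3])  # [0, 1, 2, 3]

# Bidirectional: "medicine" sees the whole query, including "pharmacy",
# which helps disambiguate what the searcher is really asking.
print(bidirectional_context(n)[3])   # [0, 1, 2, 3, 4, 5, 6]
```

In practice this means a query's meaning is read as a whole, so small but important words ("for", "to", "no") are no longer dropped as noise.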
Bing has admitted that it found implementing BERT challenging, explaining that applying such a deep-learning model to web search at global scale is difficult and expensive. The company said the rollout was made possible by Azure, Microsoft's cloud computing service.
Azure allowed Bing to serve the BERT model faster and more efficiently than it had previously. The resulting improvements are now live worldwide.