Google's Search engine has gotten smarter at understanding users' conversational queries. According to a news report, the search giant has implemented a neural network-based technique for language processing called Bidirectional Encoder Representations from Transformers (BERT). This gives the search engine an advanced capability to recognize word sequences.
The company announced that this is one of the biggest advancements in the history of Search, and that understanding word sequences will make a noticeable difference in search results.
For example, previously when a user searched for 'Australian traveler to Canada need a visa,' most results covered information on Australian travelers looking to visit Canada generally. After the implementation of BERT, however, the top results include information on how to get a visa to Canada as an Australian citizen.
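BERT itself is a large neural network, but the underlying problem it addresses — that word order changes a query's meaning — can be sketched with a toy comparison. The queries below are made up for illustration and have nothing to do with Google's actual system:

```python
from collections import Counter

def bag_of_words(query):
    # Order-insensitive representation: just a multiset of words.
    return Counter(query.lower().split())

q1 = "travel from australia to canada"
q2 = "travel from canada to australia"

# An order-blind model cannot tell these two queries apart,
# even though they mean opposite things:
print(bag_of_words(q1) == bag_of_words(q2))  # True

# A sequence-aware model (like BERT) receives the words in order,
# so the two inputs are genuinely different:
print(q1.split() == q2.split())  # False
```

The same effect drives the visa example above: the direction of travel is carried entirely by word order and small function words like "to."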
Google announced the update in a press release, in which the company's VP of Search, Pandu Nayak, gave a few examples of the updated search algorithm.
Of course, Google admits that Search is still not as perfect as it would like it to be. For instance, if a user searches 'what state is south of Nebraska,' Google will return results about South Nebraska instead of Kansas – even after the adoption of BERT.
Nevertheless, Google continues to improve Search with frequent updates, and we can expect further gains in the near future.
For now, BERT applies only to English-language searches made in the US. Google is, however, working on ways to bring it to a global audience over time.