Milestone: BERT Boosts Google Search
Google built its brand on Search, and the tech giant has not forgotten that. In what the company calls "the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search," Google today announced that it has leveraged its pretrained language model BERT to dramatically improve the understanding of search queries. The next time you search on Google, you won't need to worry about speaking or typing each word precisely to get the results you're looking for, thanks to BERT (Bidirectional Encoder Representations from Transformers).

BERT is a neural network-based technique for natural language processing (NLP) pretraining that Google introduced and open-sourced last year. When applied to ranking and featured snippets in Search, BERT models process each word in relation to all the other words in a sentence, rather than one by one, in order. This enables a better "understanding" of context, which is particularly helpful for longer, more conversational queries, and for searches where prepositions strongly affect meaning.
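The core mechanism behind that bidirectional "understanding" is self-attention, where every token is weighted against every other token in the sentence at once. The following is a minimal, illustrative sketch in NumPy (not Google's implementation; the toy embeddings and the query length are made up) showing how each position in a query ends up with a full attention distribution over all positions, left and right:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Bidirectional self-attention over a (seq_len, d) embedding matrix.

    In a real Transformer, queries, keys, and values come from learned
    projections of X; here we use X directly to keep the sketch minimal.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)        # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over ALL tokens
    return weights @ X, weights

# Toy embeddings for a hypothetical 7-token search query.
rng = np.random.default_rng(0)
X = rng.normal(size=(7, 8))
contextualized, attn = self_attention(X)

# Every row of `attn` sums to 1 and spans the whole sentence, so a
# preposition's representation is shaped by words on both sides of it --
# unlike a strictly left-to-right model.
print(attn.shape)                         # (7, 7)
print(bool(np.allclose(attn.sum(axis=1), 1.0)))  # True
```

Because each row of the attention matrix covers the entire query, a word like "to" or "for" is contextualized by everything around it, which is what makes BERT helpful on queries where such small words carry the meaning.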
Oct-25-2019, 19:26:53 GMT