California [USA], October 25 (ANI): After 15 years of handling unanticipated search queries, Google announced that it has made progress in understanding language using machine learning. Google said in its official blog that by using a Natural Language Processing (NLP) pre-training technique called Bidirectional Encoder Representations from Transformers, or BERT, understanding "keyword-ese" search queries has become easier. The company is using some of the BERT models with the latest Cloud TPUs to serve search results and surface more relevant information quickly by understanding the context of the query. BERT has shown search improvements in languages such as Korean, Hindi, and Portuguese. (ANI)
(The above story is verified and authored by ANI staff. ANI is South Asia's leading multimedia news agency, with over 100 bureaus in India, South Asia, and across the globe. ANI brings the latest news on Politics and Current Affairs in India and around the World, Sports, Health, Fitness, Entertainment, and News. The views appearing in the above post do not reflect the opinions of LatestLY.)















