How Google uses AI/ML to give us better search results

Shobhit Singh Pal
4 min read · Dec 19, 2020


Whenever we look for information, we go to Google Search. But how does Google answer our questions? We query Google for every kind of information: sometimes we misspell the query, sometimes we ask for information that involves statistics, and sometimes we want to recognize something in an image. Whatever mistakes we make, Google processes the query and, most of the time, returns the correct result for us.

Let's discuss what Google uses in Search to get the best results for us:

To help us find exactly what we’re looking for

At the heart of Google Search is its ability to understand our query and rank relevant results for it. Google has invested deeply in language understanding research, and last year it introduced BERT language understanding systems to deliver more relevant results in Google Search. BERT is now used in almost every English query, helping surface higher-quality results for our questions. BERT is a natural language processing model pre-trained on unlabeled text.
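
To get a feel for what a BERT-style model does, here is a minimal sketch (assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint, not Google's production ranking stack) that embeds a query and two candidate results and compares them by cosine similarity:

```python
# Toy illustration of BERT-based query/result matching, NOT Google's ranking system.
# Assumes: pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool BERT's last hidden states into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)

query = "can you get medicine for someone pharmacy"
candidates = [
    "Rules on picking up a prescription for a family member",
    "Store hours for a local pharmacy chain",
]
q_vec = embed(query)
for text in candidates:
    score = torch.cosine_similarity(q_vec, embed(text), dim=0).item()
    print(f"{score:.3f}  {text}")
```

The point of the sketch is that the model scores meaning in context, so the first candidate should score higher even though it shares few exact words with the query.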

Spelling
One in ten queries every day is misspelled. To address this, Google recently introduced a new spelling algorithm that uses a deep neural net to significantly improve its ability to decipher misspellings. In fact, this single change improves spelling more than all of Google's improvements over the last five years combined.
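
As a rough illustration of how a neural language model can pick the intended word, here is a toy sketch that masks a suspicious token and lets a masked language model score candidate spellings in context (this uses the Hugging Face fill-mask pipeline and is not Google's new spelling algorithm):

```python
# Toy neural spell-check sketch: score candidate spellings in context with a masked LM.
# Assumes: pip install transformers torch
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Suppose the user typed "dinasour"; rank plausible corrections by how well they fit the context.
sentence = "the largest [MASK] that ever lived"
candidates = ["dinosaur", "dinner"]
for result in fill_mask(sentence, targets=candidates):
    print(f"{result['score']:.4f}  {result['token_str']}")
```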

Passages
Very specific searches can be the hardest to get right, since sometimes the single sentence that answers our question is buried deep in a web page. Google has recently made a breakthrough in ranking and can now index not just web pages but individual passages from those pages. By better understanding the relevancy of a specific passage, not just the overall page, Google can find that needle-in-a-haystack information we’re looking for. This technology will improve 7 percent of search queries across all languages as Google rolls it out globally.
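
The idea behind passage-level ranking can be sketched with off-the-shelf tools. This is a toy sketch assuming the sentence-transformers library and the all-MiniLM-L6-v2 model, not Google's actual system: split a page into passages, embed each one, and rank the passages themselves against the query.

```python
# Toy passage-level ranking sketch, NOT Google's passage indexing system.
# Assumes: pip install sentence-transformers
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

page = (
    "Our windows use double-pane glass rated for UV protection. "
    "Installation usually takes two days. "
    "You can check whether your house windows are UV filtered by holding a match "
    "near the pane and looking at the number of reflections."
)
passages = page.split(". ")

query = "how can I tell if my windows are UV glass"
query_vec = model.encode(query, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

scores = util.cos_sim(query_vec, passage_vecs)[0]
best = scores.argmax().item()
print(f"Best passage ({scores[best].item():.3f}): {passages[best]}")
```

Scoring passages instead of whole pages is what lets the single buried sentence win, even when the rest of the page is about something else.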

Understanding key moments in videos

Using a new AI-driven approach, Google can now understand the deep semantics of a video and automatically identify key moments. This lets it tag those moments in the video so we can navigate them like chapters in a book. Whether we’re looking for that one step in a recipe tutorial or the game-winning home run in a highlights reel, we can easily jump to those moments. Google started testing this technology this year, and by the end of 2020 it expects 10 percent of searches on Google to use it.
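
Google's system understands video semantics, which is far beyond a blog snippet, but a crude stand-in for "find candidate key moments" is shot-boundary detection: compare color histograms of consecutive frames and record the timestamps where they change sharply. The sketch below uses OpenCV; the threshold and the file name are assumptions, and it only flags visual changes, not meaning.

```python
# Crude shot-boundary detection as a stand-in for key-moment candidates.
# This is NOT Google's semantic video understanding; it only flags big visual changes.
# Assumes: pip install opencv-python
import cv2

def candidate_moments(path: str, threshold: float = 0.5) -> list[float]:
    """Return timestamps (seconds) where the frame histogram changes sharply."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    moments, prev_hist, frame_idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if similarity < threshold:
                moments.append(frame_idx / fps)
        prev_hist, frame_idx = hist, frame_idx + 1
    cap.release()
    return moments

print(candidate_moments("recipe_tutorial.mp4"))  # hypothetical video file
```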

Deepening understanding through data

Sometimes the best search result is a statistic. But stats are often buried in large datasets and not easily comprehensible or accessible online. Since 2018, Google has been working on the Data Commons project, an open knowledge database of statistical data started in collaboration with the U.S. Census Bureau, the Bureau of Labor Statistics, the World Bank and many others. Bringing these datasets together was a first step, and now Google is making this information more accessible and useful through Google Search.

Now, when we ask a question like “how many people work in Chicago,” Google uses natural language processing to map our search to one specific set among the billions of data points in Data Commons and provide the right stat in a visual, easy-to-understand format. We’ll also find other relevant data points and context, such as stats for other cities, to explore the topic in more depth.
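
Data Commons is publicly queryable, so we can reproduce a tiny piece of this ourselves, minus the query-understanding step that Google Search does. Below is a minimal sketch assuming the public datacommons Python client; the place DCID for Chicago and the statistical variable name are assumptions to double-check in the Data Commons browser.

```python
# Minimal Data Commons lookup sketch, NOT Google Search's NLP-to-stat mapping.
# Assumes: pip install datacommons, and that the DCID / stat-var names below are correct.
import datacommons as dc

CHICAGO = "geoId/1714000"           # assumed DCID for Chicago, IL
STAT_VAR = "Count_Person_Employed"  # assumed statistical variable for "people who work"

value = dc.get_stat_value(CHICAGO, STAT_VAR)
print(f"Employed people in Chicago (latest available): {value}")
```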

If you don’t know how to search it, sing it

We’ve all had the experience of having a tune stuck in our head without quite remembering the lyrics. Now, when those moments arise, we can simply hum to search, and Google’s AI models can match the melody to the right song.
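
Google's models turn the hummed audio into a number-based representation of the melody and match it against many songs. A very simplified version of that idea (a toy sketch, nothing like the production model) is to reduce both the hummed tune and each catalog song to an up/down/same pitch contour and compare the contours:

```python
# Toy melody matching: reduce a tune to its up/down/same pitch contour and compare.
# This is a simplified illustration, NOT Google's hum-to-search model.
from difflib import SequenceMatcher

def contour(pitches: list[int]) -> str:
    """Encode a pitch sequence as U(p), D(own), S(ame) steps."""
    steps = []
    for prev, cur in zip(pitches, pitches[1:]):
        steps.append("U" if cur > prev else "D" if cur < prev else "S")
    return "".join(steps)

# Hypothetical pitch sequences (e.g. MIDI note numbers extracted from audio).
hummed = [60, 60, 67, 67, 69, 69, 67]
catalog = {
    "Twinkle Twinkle Little Star": [60, 60, 67, 67, 69, 69, 67, 65, 65, 64],
    "Happy Birthday": [60, 60, 62, 60, 65, 64],
}

query = contour(hummed)
for title, pitches in catalog.items():
    score = SequenceMatcher(None, query, contour(pitches)).ratio()
    print(f"{score:.2f}  {title}")
```

Working with relative contours rather than absolute notes is what makes the match tolerant of humming off-key or in a different register.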

This is enough to say that, nowadays, whatever Google is and whatever it can do in Search is largely because of AI.
