Google’s latest AI advances will now be visible in its search products, bringing what it believes are “new ways to find and explore information in a more natural and intuitive way.” As Google rolls out the Multitask Unified Model, or MUM, users will start to see new ways to search visually and gain a better understanding of the topics they are looking for.
With the new visual search capabilities, according to a blog post by Senior Vice President Prabhakar Raghavan that coincides with Google’s annual Search On event, Google users will be able to simply tap the Lens icon while looking at “a photo of a shirt, and ask Google to find you the same pattern, but on a different item of clothing, such as socks”.
The post explained that this helps when a search is difficult to describe accurately in words alone. You can also use visual search to point at a part whose name you don’t know and get a tutorial on how to use or fix it.
Google Search’s Liz Reid explained that the new AI capabilities do three things. “One, what really is the question behind your query and how do we understand it; and be able to do that in new ways, not just with text, voice or images. The second helps you ask the next question when sometimes you don’t know what it should be. The third is just to make it easier to explore information… the web is amazing with this vastness, but sometimes it can be a little overwhelming,” she said.
The blog said it will also become easier to explore and understand new topics with “Things to Know”. For many topics, Google will use what it has learned to surface the aspects people are likely to look at first. Google promises that soon “MUM will unlock more in-depth information that you might not have known to search for.” It will also start showing new features to refine and broaden searches, and offer a new browsable results page for easy inspiration.
For videos, Google will begin to identify related topics “with links for easy digging and learning more.” The post read: “Using MUM, we can even show related topics that are not explicitly mentioned in the video, based on our advanced understanding of the information in the video.”
When asked how Google would contextualise search in a different language, or from a place where sensitivities might differ, Reid told indianexpress.com: “What MUM really understands is the concepts, so it might take your query and sort of map it, but then connect it to related information that might be expressed differently in another language. The fact that it’s cross-trained with all the different languages together also makes it easier.”
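The idea Reid describes, that a query is mapped to concepts and then matched against related information in other languages, can be illustrated with a toy sketch of a shared multilingual vector space. Everything below is hypothetical: the embedding values are hand-made for illustration and have nothing to do with MUM’s actual representations; only the general technique (cosine similarity over a shared embedding space) is real.

```python
import math

# Toy, hand-made "multilingual" embeddings -- purely illustrative values,
# not real model vectors. In a cross-trained multilingual model, words for
# the same concept land near each other in one shared vector space,
# regardless of language.
EMBEDDINGS = {
    ("en", "mountain"): [0.90, 0.10, 0.20],
    ("hi", "pahaad"):   [0.88, 0.12, 0.21],  # Hindi word for "mountain"
    ("en", "ocean"):    [0.10, 0.90, 0.30],
    ("ja", "umi"):      [0.12, 0.88, 0.31],  # Japanese word for "sea"
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def related_across_languages(query_key, top_k=1):
    """Rank entries from *other* languages by similarity to the query."""
    query_vec = EMBEDDINGS[query_key]
    candidates = [
        (key, cosine(query_vec, vec))
        for key, vec in EMBEDDINGS.items()
        if key[0] != query_key[0]  # keep only other languages
    ]
    candidates.sort(key=lambda item: item[1], reverse=True)
    return candidates[:top_k]

# An English query for "mountain" retrieves the nearest Hindi concept.
print(related_across_languages(("en", "mountain")))
```

Because all languages share one space, no explicit translation step is needed: nearest-neighbour search alone connects the English query to its Hindi counterpart, which is the intuition behind cross-lingual retrieval.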
Currently, MUM has visibility across 75 different languages and understands queries in all of them.