Multisearch could make Google Lens your search sensei

Google search is about to get more capable with the introduction of multisearch, a combination of text and image searching in Google Lens. After searching an image through Lens, you'll now be able to ask additional questions or add parameters to your search to narrow the results. Google's example use cases include shopping for clothing with a certain pattern in a different color, or pointing your camera at a bicycle part and typing "how to fix" to see guides and videos on bike repairs. According to Google, the best use case for multisearch, for now, is shopping results.

The company launched a beta of the feature on Thursday for US users of the Google app on Android and iOS. Just tap the camera icon next to the microphone icon, or open a photo from your gallery, select what you want to search for, and swipe up on your results to reveal the "Add to your search" button, where you can type additional text.

This announcement is the public trial of a feature the search giant has been teasing for almost a year: Google discussed it when introducing MUM at Google I/O 2021, then provided more information about it in September 2021. MUM, or Multitask Unified Model, is Google's new AI model for search, revealed at that same I/O event. MUM replaces the older AI model BERT, or Bidirectional Encoder Representations from Transformers. MUM, according to Google, is roughly a thousand times more powerful than BERT.

Analysis: will it be any good?

It’s only in beta for now, but Google certainly made a big hoopla about MUM during its announcement. From what we've seen, Lens is usually good enough at identifying objects and translating text. The AI upgrade should add another dimension, making it a more useful tool for finding the information you need about what you're looking at right now, as opposed to general information about that kind of thing. However, that raises questions about how well it will determine exactly what you want. For example, if you're looking at a sofa with a striking pattern on it but would rather have that pattern on a chair, can it parse your intent well enough to find what you want? Will it point you to a physical store or to an online storefront like Wayfair? Google search often gets local in-store stock wrong; will that get better, too?

We have plenty of questions, but they'll likely only be answered once more people start using multisearch. AI features tend to get better with use, after all.