Google Lens will soon be able to help you find nearby objects and services


Running a query in Google’s search engine is as simple as typing a few keywords. But while that gesture has become second nature to many of us, Google believes the future of online search lies in something even more natural: images.

The American company, which held its annual I/O conference yesterday, took the opportunity to present the new features rolling out to Google Lens, its visual search module, and to preview the major innovations still to come.

Multisearch in your neighborhood

A few weeks ago, Google announced the arrival of “Multisearch” in its search engine. This feature, which lets you combine keywords, images, and voice in a single query, can now be used to find local information.
By adding location to the mix, Google hopes to help you find what you’re looking for by searching near you first.

To illustrate the improvement, Google used the example of someone who wants to try a particular dish. By combining an image (a photo or a screenshot) of a dish whose name you don’t know with the keywords “near me”, you can now ask the search engine to find restaurants serving that dish near your location.

To pull off this feat, the search engine naturally relies on artificial intelligence, but above all on the enormous library of data it holds. Google scans millions of images and reviews published on the web and on Google Maps to find the restaurant that will satisfy your craving. Note, however, that this local multisearch will initially only be available in English.

Real-time search in augmented reality

But the best is probably yet to come. Google also previewed Scene Exploration, a new feature capable of analyzing an entire scene. You will be able to point Google Lens at whatever is in front of you to launch a multisearch, with the results displayed in augmented reality, in real time.

Here again, the Mountain View company illustrated the concept with a concrete example: you are standing in the chocolate aisle and want to pick the best dark chocolate bar without hazelnuts. With Scene Exploration, all you have to do is point your smartphone’s camera at the shelf and let Google Lens scan the different bars on display.
The module then displays the results directly on your screen in augmented reality. Google notes, however, that Scene Exploration is a feature that will come to Google Lens “in the future”, without committing to a release date.

Source: Google
