Google’s AI Mode tipped to gain a ‘Live’ feature similar to Gemini
What you need to know
- Google’s AI Mode is rumored to receive a new “Live for AI Mode” feature following newly spotted code.
- The feature seems to leverage Google Lens quite heavily, and could provide a more vocal or conversational experience with AI Mode like Gemini Live.
- A few days ago, Google’s AI Mode opened to all Labs testers in the U.S. and added visual cards for products and places.
Google isn’t slowing down the development of its latest AI feature, as a newly spotted mode could help take things further.
AI Mode for Search is Google’s latest focus, and a post by 9to5Google suggests there’s a little more on the way for mobile. The publication’s teardown of the Google app reportedly uncovered code for a “Live for AI Mode” feature in the works. The early code suggests it will function similarly to Gemini Live: aside from the “Live” tag itself, the strings include a button to “End Session” and another to mute your microphone.
A description within the code states this “Live” feature lets users “have a real-time voice conversation with AI Mode to find exactly what you’re looking for.”
Beyond these similarities, the post states “Live for AI Mode” will primarily piggyback off the mode’s Google Lens integration. Many of the code strings begin with “lens_live,” supporting this reading. It seems users will be able to bring an item or place into Google Lens’ view and “start speaking to search.” Additionally, users may be able to interrupt the AI to offer more information or to steer it in a new direction.
However, the feature seems to have one limitation: it can’t handle follow-up questions. Anything phrased as a follow-up will instead be treated as a brand-new question that kicks off a whole new search.
AI Mode’s Adventure Continues
As the publication notes, Google rolled out its Lens integration for AI Mode in early April. That feature worked like Lens elsewhere: users take a snapshot and then speak a query to search. What’s rumored now is a little more flexible and intuitive, letting users have a genuinely conversational experience with the mode. Around the same time, Google Lens also received an update for “multisearch.”
In short, that update let users combine images with text when searching for items through Lens. As an example, Google says a user could take a snapshot of a dress they like and specify a different color in the text field. Lens would take everything into account and try to locate a similarly styled dress (or the same one) in the requested color.
There’s no telling when “Live for AI Mode” could arrive for users; however, AI Mode picked up a new update on May 1. Now that it’s open to everyone using Labs in the U.S., AI Mode displays a history of your previous queries, much like Gemini. Google also said users will soon be able to tap for more details when using AI Mode, presented in cards for “local spots” like restaurants and stores.
Reviews, ratings, store hours, and the like will be available for interested parties.