Google rolls out slew of new AI-powered features for Search, Maps, and more
- Google is rolling out new AI features for a variety of services.
- Search, Maps, Lens, and Shopping are all getting new tools.
- Bard will now be able to use images for prompts.
From Google Maps to Bard, Google’s mission to integrate AI into every aspect of its services is not slowing down for even a second. The company has just announced a bevy of new AI tools and other features that will be coming to various products starting today or in the near future. Here’s everything that was announced and what you need to know.
Google Maps is getting a new feature and two updates. Starting with the updates, Immersive View — a feature that stitches together images to create a multidimensional view of a location — is now rolling out to more cities. The newly added cities are Amsterdam, Dublin, Florence, and Venice. Google announced in May that it planned to roll out the feature to 15 cities — including these four — by the end of the year. The remaining cities are Berlin, Las Vegas, London, Los Angeles, Miami, New York, Paris, San Francisco, San Jose, Seattle, and Tokyo. Immersive View will also expand to 500 landmarks around the globe.
The other update is coming to the Recents feature on desktop. Slated to roll out globally next month, this update will save recent highlights even after you close the Google Maps window. That means you’ll be able to plan multiple trips at once, or take a break and come back, without losing your progress.
As for the new feature, Google is bringing a tool called glanceable directions to Android and iOS globally this month. It lets users view updated ETAs and their next turn from the lock screen or the route overview — information that was previously only available in full navigation mode. The tech giant says the feature works in walking, cycling, and driving modes.
Two new features will be available to those who shop using Google Search. The first is a virtual try-on tool that’s available starting today. Google says the tool lets users see how clothes look on real models, accurately reflecting “how it would drape, fold, cling, stretch, and form wrinkles and shadows on a diverse set of real models in various poses.” Right out of the gate, the tool works with brands including Everlane, H&M, LOFT, and Anthropologie.
The other feature is called guided refinements, which aims to help US shoppers narrow their search to find the perfect clothing item. As Google describes it, the company is trying to recreate the experience of a store associate helping you find options similar to the item you’re looking at, but perhaps with a different detail. Except, in this situation, the store associate is Google’s AI.
Google Lens — the tool that lets you take a picture and search with it — will now let you search for skin conditions. With the new Search Your Skin feature, users will be able to take a picture of their skin and get visual matches for skin conditions. Google says the feature also works for other areas of the body, like nails, lips, and hair.
Google spoke at length about bringing an AI-powered snapshot feature to its search engine during Google I/O earlier this year. The company is now deploying that generative AI feature to help with research on restaurants, hotels, and tourist attractions. Users will reportedly be able to ask detailed questions and get answers drawn from the web, reviews, and photos.
Also announced earlier and coming soon: image prompts for Bard. Previously, Bard could only accept text prompts, but in the coming weeks users will be able to include images as well. For example, a user could attach a photo and ask, “Help me come up with a caption for this photo.”