Google Glass resurrected: What we want to see from Google's new smart glasses

Google showed off new smart glasses at I/O 2022, but they’ll need more features to succeed commercially.
Published on May 22, 2022

Google I/O 2022 smart glasses (Image credit: Google)

Google Glass was the company’s first foray into smart eyewear back in 2012/2013, touting features like a projector-driven display, touchpad, camera for photos and video recording, bone conduction audio, and voice commands. It certainly made for an innovative but ultimately unsuccessful first try, eventually finding a small niche for itself in the business world.

It looks like Google isn’t giving up on consumer-oriented smart glasses just yet, as it demonstrated a prototype pair of unnamed glasses at its I/O 2022 developer conference. The new glasses look similar to the Focals smart glasses from startup North, which Google acquired in 2020, and the search giant used them to demonstrate live translation and real-time transcription.

Our guide: Everything you need to know about Google hardware

This is certainly a cool proof of concept, though we don’t know whether Google plans to commercialize the product. Either way, the company needs to bring plenty more to the table if it wants a future pair of smart glasses to succeed with a wider audience. Here are some features we hope to see on a potential Google Glass successor.

Sign language interpretation

Google’s own video showcased real-time transcription of speech, demonstrating how hearing-impaired people could benefit from the tech. But what about sign language translation and interpretation?

Related: The best American Sign Language apps for Android

It sounds super high-tech, but Google has already laid some groundwork. The company launched a web game in December 2021 to help people learn sign language, and it announced on-device hand-tracking tech back in 2019, building the foundation for sign language applications. So we’d love to see sign language translated into written or spoken language on a future pair of Google smart glasses.
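For a sense of what that foundation looks like in practice, here’s a minimal sketch using Google’s open-source MediaPipe Hands library, the public face of that 2019 hand-tracking work. The classify_sign() helper is a hypothetical placeholder; a real system would feed sequences of landmarks to a trained sign language model.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def classify_sign(landmarks):
    """Hypothetical stand-in for a trained sign language classifier."""
    return "?"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Each hand yields 21 (x, y, z) landmarks: the raw input a
                # sign language model would consume.
                print(classify_sign(hand.landmark))
cap.release()
```

The hard part, of course, is the classifier itself; the landmark extraction shown here already runs on-device, which is exactly what a pair of glasses would need.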

Control your smart home gadgets by looking at them

One of the more interesting smart home advancements in recent years has been ultra-wideband (UWB) technology, found in devices from the likes of Apple, Google, Samsung, and Xiaomi. The latter even posted a neat demo showing that you can control smart home gadgets by simply pointing your phone at them.

Imagine looking in the direction of the front door to access your smart doorbell's camera feed.

So what if we brought this UWB-enabled tech to a future pair of Google smart glasses? Imagine simply looking at your smart door lock to lock or unlock it, or looking in the direction of the front door to pull up your smart doorbell’s camera feed. This could theoretically extend to smart speakers and smart displays too: glancing at a UWB-equipped smart speaker could prompt it to start listening, or give you visual access to routines and other commands.
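To make the idea concrete, here’s a rough sketch of the gaze-selection logic such a feature might use. UWB ranging can report distance plus angle of arrival, so picking out the device the wearer is facing is mostly geometry; the device names and readings below are entirely hypothetical.

```python
# Hypothetical UWB ranging results: device -> (distance in meters, azimuth in
# degrees). Azimuth is the horizontal angle between the glasses' forward axis
# and the device; 0 means the wearer is looking straight at it.
ranging_results = {
    "smart_lock":    (4.2, 2.5),
    "doorbell_cam":  (4.5, -35.0),
    "smart_speaker": (2.1, 60.0),
}

GAZE_CONE_DEGREES = 10.0  # how directly the wearer must face a device

def device_in_gaze(results, cone=GAZE_CONE_DEGREES):
    """Return the nearest device within the wearer's gaze cone, if any."""
    candidates = [
        (dist, name) for name, (dist, az) in results.items() if abs(az) <= cone
    ]
    return min(candidates)[1] if candidates else None

target = device_in_gaze(ranging_results)
if target:
    print(f"Looking at {target}")  # e.g. surface the unlock UI or camera feed
```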

More reading: Everything you need to know about UWB wireless technology

Filters, filters everywhere

Instagram NASA filter (Image credit: Hadlee Simons / Android Authority)

It’s not the most useful example of machine learning, but it’s hard to argue that filters haven’t been one of the main driving forces behind AR tech in the last five years. Everyone from Snapchat and Instagram to TikTok has used machine learning to offer fun face filters for use on their platforms.
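Under the hood, most face filters start the same way: find facial landmarks, then anchor graphics to them. Here’s a compact sketch of that first step, assuming Google’s open-source MediaPipe Face Mesh as the landmark detector; the input image path is a placeholder.

```python
import cv2
import mediapipe as mp

image = cv2.imread("selfie.jpg")  # placeholder input image
height, width = image.shape[:2]

with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as face_mesh:
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    # Face Mesh returns 468 3D landmarks; index 1 sits roughly on the nose tip.
    nose = results.multi_face_landmarks[0].landmark[1]
    # A real filter engine would composite a graphic here; we just mark the anchor.
    cv2.circle(image, (int(nose.x * width), int(nose.y * height)), 25, (0, 0, 255), 3)
    cv2.imwrite("selfie_filtered.jpg", image)
```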

Dig deeper: What’s the difference between AR and VR?

This wouldn’t be the first time we’ve seen AR filters and 3D effects on a pair of smart glasses; Snapchat’s Spectacles offer some of these effects too, though they’re generally limited to environmental effects rather than face filters. A more open approach that includes face filters would be great for creators on bigger platforms like TikTok and Instagram.

A major upgrade for navigation

This seems like a no-brainer, but Google Maps Live View is certainly a feature we’d love to see on a future pair of smart glasses. Google introduced this augmented reality feature for Maps in late 2020, overlaying directions and other navigation info on your phone camera viewfinder. Simply lift your phone up and point it around to receive directions.

More on augmented reality: The best augmented reality apps for Android

Google Maps Live View also gained more functionality last year, offering virtual street labels, signs to point out landmarks, and the ability to view details about some places (e.g. reviews or whether they’re busy).

With glasses, the camera follows your gaze, so you don't need to stand in the street holding your phone up to discover the world around you.

This all sounds like a natural fit for a next-generation pair of smart glasses: the camera would always be on your head anyway, and you wouldn’t need to stand in the street holding your phone up to discover the world around you. Live View on glasses also seems like a more seamless experience while driving, as opposed to taking your eyes off the road to check a phone, although we hope a smart glasses implementation isn’t too distracting either.

More advanced AR searches

Google showed off a couple of advancements for Google Lens and AR-powered search at I/O 2022, namely multisearch and scene exploration. Both seem like an obvious fit for a future pair of Google smart glasses.

Multisearch allows you to point your camera at an object or product to initiate a search while also letting you add text-based search modifiers. For example, you could take a photo of a rosemary plant and add “care instructions.”

Opinion: Reliving the good old days — Google I/O 2022 felt like a flashback

Scene exploration is another nifty AR search tool: you hold your phone camera up to search the world around you. That sounds pedestrian, but it’s more expansive than Lens’s existing visual search, taking multiple objects in a scene into account rather than just one and overlaying handy insights onto the scene and its products. Google gave the example of someone searching for chocolate in a grocery store aisle. You won’t just get ratings for each chocolate bar; the feature can also suss out desired keywords like “dark” or “nut-free” chocolate.
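As a toy illustration of that last step, here’s a sketch that matches recognized products against a shopper’s keywords. The detections are hypothetical stand-ins for what a real on-device vision model would return.

```python
# Hypothetical detections a vision model might return for one camera frame.
detections = [
    {"name": "Midnight Dark 85%", "tags": {"dark", "vegan"},     "rating": 4.6},
    {"name": "Hazelnut Crunch",   "tags": {"milk", "nuts"},      "rating": 4.2},
    {"name": "Silky Oat Bar",     "tags": {"nut-free", "vegan"}, "rating": 3.9},
]

def annotate_scene(detections, wanted):
    """Yield an overlay label per recognized product, flagging keyword matches."""
    for item in detections:
        match = " <- matches your search" if wanted & item["tags"] else ""
        yield f'{item["name"]}: rated {item["rating"]}{match}'

for label in annotate_scene(detections, wanted={"dark", "nut-free"}):
    print(label)
```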

Other Lens features like copy/pasting words from the real world, highlighting popular dishes on restaurant menus, and translating foreign languages also seem like great additions to a new version of Google Glass. So fingers crossed that a future product indeed offers all of these features.

Most of the stuff your smartwatch does

A Samsung Galaxy Watch 4 on a leather surface displays the Info Brick watch face (Image credit: Kaitlyn Cimino / Android Authority)

Aside from health and fitness, smartwatches excel in several other areas owing to their wearable nature. These tasks include taking phone calls, offering music controls for your phone, and editing/checking grocery shopping lists. This is in addition to more mundane features like notification mirroring, viewing calendar entries, and showing weather forecasts.

We'd like to see next-generation Google smart glasses that reduce the need to pick up our phones for every task.

We’d therefore like to see next-generation Google smart glasses take their cue from Wear OS smartwatches in this regard, reducing the need to pick up a phone for every task. We could even see some light fitness functionality coming to these glasses, such as step counting for walking and GPS tracking for cycling.
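Step counting, at least, wouldn’t demand much extra hardware, since it’s fairly simple signal processing over accelerometer data. Here’s a rough sketch on a synthetic signal; real thresholds would need tuning against actual sensor traces.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 50  # samples per second, a typical wearable IMU rate
t = np.arange(0, 10, 1 / fs)
# Synthetic walking signal: ~2 steps per second on top of gravity, plus noise.
accel = 9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * t) + np.random.normal(0, 0.3, t.size)

# Each stride appears as a peak above the gravity baseline; the height and
# minimum-spacing thresholds here are illustrative, not tuned values.
peaks, _ = find_peaks(accel, height=10.8, distance=fs // 4)
print(f"Estimated steps: {len(peaks)}")  # expect about 20 over 10 seconds
```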

Buyer’s guide: The best smartwatches on the market today


What’s stopping new smart glasses?

We’d definitely argue that the timing has never been better for a Google Glass successor of sorts. For starters, augmented reality has improved in a major way from those early days of requiring dedicated hardware. Google’s own ARCore augmented reality suite only requires a camera and some sensors like a gyroscope to overlay animals and objects in front of you.

Natural language processing is another area where the search giant has made impressive strides, compared to the pre-Assistant original Google Glass era. The Tensor chipset in the Pixel 6 phones even allows for offline voice typing, highlighting just how far we’ve come in the last decade. This is another area where a future pair of smart glasses could see notable improvements.
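To illustrate how accessible on-device speech recognition has become, here’s a minimal sketch that transcribes audio fully offline, using the open-source Vosk library rather than Google’s private models. It assumes you’ve downloaded a Vosk model directory and have a 16kHz mono WAV file to hand.

```python
import json
import wave
from vosk import Model, KaldiRecognizer

model = Model("model")  # path to an unpacked Vosk model directory
wf = wave.open("speech.wav", "rb")  # placeholder 16kHz mono WAV file
rec = KaldiRecognizer(model, wf.getframerate())

while True:
    data = wf.readframes(4000)
    if not data:
        break
    rec.AcceptWaveform(data)  # feed audio; no network connection required

print(json.loads(rec.FinalResult())["text"])
```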

There’s also been a major increase in horsepower since the first Google Glass of 2012/2013, which shipped with a dual-core chipset and just 1GB of RAM. Google has already taken advantage of this power boost in 2019’s Glass Enterprise Edition 2, which packs a Snapdragon XR1 processor. These advancements could definitely come to consumer-level spectacles as well.

In other words, the technological pieces of the puzzle, both hardware and software, seem to be coming together for a Google Glass follow-up. But such a product will live and die by its use cases, so hopefully Google adds a few of our wishlist features above if it’s aiming for a commercial release soon.
