I tried Google's 3 new Android XR glasses prototypes, and they're incredible
A few months ago, Meta held one of the most disastrous launch events in recent memory, at which it unveiled the Meta Ray-Ban Display Glasses. Despite a wave of technical issues that flat-out ruined many of the live demos on stage, Meta still won the day, because its new AR glasses were genuinely cool. A more advanced follow-up to the surprisingly popular Ray-Ban Meta AI Glasses ($329 at Amazon), the new glasses have a monocular display (meaning there’s one screen in one of the lenses), a Neural Band wrist strap that lets you control what you see by performing air gestures, and, most importantly, a price: $799. That’s expensive, but it kept Meta at the forefront of the burgeoning XR revolution by making it the first major tech player with a retail-ready set of smart glasses that seem to deliver the future Google Glass promised us over a decade ago.
Unfortunately for Meta, there are three major problems with the new Ray-Bans. First, and most obviously, they are made by Meta, a company few people like or trust enough to wear its face computer and feed it deeply personal data. Second, it is a Meta-only platform, with app availability limited for now to the company’s own products: WhatsApp, Instagram, and Facebook. Third, the glasses’ AI smarts also come from Meta, not from the leading AI players of the moment, namely OpenAI and Google.
Well, Google invited me to San Jose last week to try out its new glasses running Android XR, the operating system that debuted with the Samsung Galaxy XR ($1799.99 at Samsung) in October. Although what I saw were just prototypes of real products coming later, they could solve all three of Meta’s problems, immediately putting Google in a terrific position to be the real star of the upcoming XR revolution.
Android XR glasses: 3 different prototypes and approaches
Google showed me three different sets of glasses, two of which it asked me not to photograph. Let’s start with the one it is willing to show off: Project Aura.
Project Aura

Project Aura is a collaborative effort between Google and XREAL, a Chinese company well known in XR fan circles that has been making various types of smart glasses since 2017. However, none of its products have been runaway hits or made much of a dent outside the tech early-adopter market.
Project Aura is poised to essentially be a lighter, more portable version of the Galaxy XR. It’s the same core concept: You put a headset on your face that’s tethered to a “puck” you keep in your pocket or clip to your belt. However, instead of a fully immersive VR-style headset, Project Aura resembles traditional glasses (although definitely not the kind you would wear while out and about).
Project Aura is, remarkably, almost a 1:1 recreation of the Galaxy XR. It's just in the form of glasses instead of a headset.
Remarkably, outside of this physical difference, Project Aura is essentially a 1:1 recreation of the Galaxy XR. It runs the same operating system, Android XR, and uses the same chipset, Qualcomm’s Snapdragon XR2 Plus Gen 2. It offers the same gesture controls through hand tracking and supports all the apps users already know from the Galaxy XR. It’s just much more portable and likely has much worse battery life. We don’t have exact battery details yet, but I’m guessing this based on the size of the puck, the fact that it houses both the battery and the computer, and how much it warmed up during my brief time with it.
So if Project Aura is just a less immersive Galaxy XR that won’t last as long on your face, what’s the point? Google sees it as a better option for travelers, for one. Wearing a full-on VR headset on a plane isn’t great, but a relatively lightweight pair of glasses is a much easier pill to swallow. This type of product has been XREAL’s bread and butter (just look at the XREAL One Pro), so Project Aura makes perfect sense from that angle. Additionally, there could be uses for it beyond the strictly consumer space. For example, handing Project Aura to a museum visitor for a guided virtual tour makes a lot more sense than handing them a Galaxy XR.
Although Project Aura doesn’t have a retail name, price, or set release date yet, Google told me we can expect a full launch in 2026.
Monocular XR glasses (and an audio-only counterpart)
Up next, Google showed me what was probably the most exciting prototype: a set of monocular XR glasses. Although the Android XR team asked me not to photograph them, it’s easy to imagine what they look like, because they are very similar to the even earlier prototype I used at Google I/O 2025, shown in the thumbnail above. They follow the same core concept as the new Meta Ray-Ban Display Glasses: face gear with POV cameras and a single full-color display embedded in one of the lenses.
Like the new Meta Ray-Ban Display Glasses, Google's prototypes have one display embedded in the right lens.
Right off the bat, though, these glasses were much, much more advanced than what I used at I/O. The display was sharper and more visible. The software was smooth and refined, the integration of Gemini felt natural, and it was incredibly easy to figure out how to use them with very little instruction from the Google team.

There are two ways to control the glasses. The first is the touchpad built into the right stem, which responds to taps and swipes: a tap-and-hold launches Gemini, a single tap acts as an “OK” for on-screen UI elements, and so on. The other way, and the one Google clearly prefers, is to just talk to Gemini. Through familiar Gemini Live interactions, I could “show” it what I see through the glasses and ask questions about the real world. For example, I could be in the kitchen looking at my pantry and ask Gemini to help me with a recipe that uses the ingredients I already have.
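To make that interaction model concrete, here’s a purely hypothetical Kotlin sketch of the gesture-to-action mapping. Google hasn’t published any input API for these glasses, so every name below is invented for illustration:

```kotlin
// Hypothetical sketch: no public Android XR glasses input API exists yet.
// The gestures and actions mirror the behavior described above.
enum class TouchpadGesture { TAP, TAP_AND_HOLD, SWIPE_FORWARD, SWIPE_BACK }

// Stub actions standing in for real system behavior.
fun launchGemini() = println("Launching Gemini Live")
fun confirmFocusedElement() = println("OK on the focused UI element")
fun focusNext() = println("Focus next element")
fun focusPrevious() = println("Focus previous element")

fun onTouchpadGesture(gesture: TouchpadGesture) = when (gesture) {
    TouchpadGesture.TAP -> confirmFocusedElement() // single tap acts as an "OK"
    TouchpadGesture.TAP_AND_HOLD -> launchGemini() // tap-and-hold summons Gemini
    TouchpadGesture.SWIPE_FORWARD -> focusNext()
    TouchpadGesture.SWIPE_BACK -> focusPrevious()
}

fun main() {
    onTouchpadGesture(TouchpadGesture.TAP_AND_HOLD)
}
```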
I even conducted a Google Meet call with a Googler in another room. I could see her perfectly well on the tiny display and hear her clearly through the speakers in the stems that fired directly into my ears. She couldn’t see me, of course, but she could see what I was seeing through the glasses. Max Spear, the Android XR product lead at Google, told me that he used the glasses to make a Thanksgiving dish with his dad remotely. He wore the glasses, and his dad instructed him on what to do with a full view of Max’s kitchen.
Listening to music, conducting Google Meet calls, Maps navigation, Gemini Live conversations — all happening on a tiny display near my eye.
I could also ask Gemini to do basic things, like take a photo. When you take one, the shot briefly appears on the display as visual confirmation that you successfully captured the photo and to show you what it looks like. If you have a Pixel Watch, it can briefly show the captured photo there instead, if you prefer. Either way, the photo is immediately sent to your Google Photos account, so you can view it on your phone.
Speaking of which, your phone is the real brains of the operation. Although some very minor local computing happens on the glasses themselves, Google isn’t even trying to make these a standalone product. That’s why there’s no Neural Band counterpart, though Google did say it is working on a gesture system for Wear OS-based watches (we’ve seen hints of it already) that, crucially, would not require you to wear anything you’re not already wearing. That minimal on-board hardware also means the glasses aren’t nearly as thick and heavy as the new Meta Ray-Ban Display Glasses.
There is very little active compute on the glasses themselves. The wirelessly connected phone is the brain.
I told the Google team how smart this is: most people in the next five to ten years won’t leave the house with just a pair of smart glasses and no phone in their pocket, so why bother putting any compute into the glasses themselves? The team nearly erupted in cheers when I said that, with one member even saying, “OK, so this guy gets it.”
Anyway, the software on the glasses is the exact opposite of what Meta is offering. Let me show you the difference succinctly with Meta’s UI next to Google’s. The images below are simulated renders of Google’s UI, but as someone who’s used the prototypes, I can confirm they’re not far off:
Google is going for an absolutely bare-bones UI that shows only the information you need and nothing else. I didn’t see a launcher or an app drawer, just the information I needed when I needed it. If you’re using Maps to navigate somewhere, you see only the next direction you need to take. If you want more info, you tilt your head down to see a small map, which goes away as soon as you look forward again. If you’re listening to music, you’ll see the most basic of music players, with the only splash of color being a tiny album-art square. If you’re trying to get to your Uber pick-up (Uber is an early partner with Google on this), you’ll see the core info about your driver and the pickup location on your display.

The killer aspect of all this (the thing Meta can’t even begin to touch) is that the things I saw on the display weren’t Android XR apps. They were repurposed ongoing notifications from the connected phone. In other words, the music player was the YouTube Music player you see in the notification shade on your phone. It wasn’t a facsimile of it; it was that exact thing, just skinned to be more minimalistic. The Maps directions were grabbed from the usual notification you’d see while navigating. Uber, same thing. The way Google explained it to me, any Android developer with an existing ongoing notification could offer a monocular glasses experience with just a few new API hooks. What’s more, Google is not vetting these beforehand: literally anyone could make one starting on day one.
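If you’re an Android developer, the building block here is something you may already ship: a standard ongoing notification. As a minimal sketch of what the glasses reportedly mirror, here’s a navigation-style ongoing notification using only stock AndroidX APIs. Nothing in it is XR-specific, because the glasses-side hooks Google mentioned aren’t public yet; the channel name, text, and ID are placeholder values:

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import androidx.core.app.NotificationCompat

// Placeholder ID for this example.
private const val NAV_NOTIFICATION_ID = 42

// A plain ongoing notification: the kind of object Google says the glasses
// mirror onto the display. Assumes minSdk 26 for NotificationChannel.
fun showTurnByTurnNotification(context: Context) {
    val channelId = "navigation"
    val manager = context.getSystemService(NotificationManager::class.java)
    manager.createNotificationChannel(
        NotificationChannel(channelId, "Navigation", NotificationManager.IMPORTANCE_LOW)
    )

    val notification = NotificationCompat.Builder(context, channelId)
        .setSmallIcon(android.R.drawable.ic_menu_directions)
        .setContentTitle("Turn left onto Main St")
        .setContentText("400 ft · ETA 12 min")
        .setOngoing(true) // ongoing notifications are what Google says get surfaced
        .setOnlyAlertOnce(true) // update silently as directions change
        .build()

    manager.notify(NAV_NOTIFICATION_ID, notification)
}
```

Per Google’s description, the glasses would render a minimalist skin of that same notification; the only new work for a developer would be the handful of API hooks Google alluded to.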
Everything I saw on the glasses came from the connected phone. This sounds too simple, but it’s actually brilliant: It’s your phone, but on your face.
I really need to emphasize how huge this is. It means that when you take a pair of these glasses out of the box, you will be able to integrate them into your life immediately. You won’t need to visit an anemic app store and hope there are more than ten choices, or be hassled to re-download apps you already use. You won’t need to wear a superfluous wrist strap and make gestures in the air like you’re casting a spell just to reach a clunky app that was tossed together to meet a deadline. Your phone is the brains, and many of the apps on it will already work with the glasses without you needing to do anything.

The final retail version of these glasses will be a co-development between Google, Samsung, and the traditional eyewear companies Warby Parker and Gentle Monster. The name, price, and release date are unknown for now, but Google is committing to a 2026 launch.
Oh, and there will also be a display-free version. Obviously, you’ll lose out on things like visual Maps instructions and other visual-first experiences, but Gemini, POV cameras, music, etc., will all still be supported. These audio-only models will also land in 2026 for an undisclosed price.
Binocular XR glasses
Finally, Google gave me a very brief demo of something coming in the far-off future: binocular XR glasses. As the name suggests, these glasses have two displays, one in each lens, which give you a wider field of view for software content and put it front and center rather than slightly to the right, as on the monocular version.
If you're really excited about everyday XR glasses with two displays, you're going to need to wait quite a while.
Crucially, though, having two displays allows the software to create the illusion of depth. I was shown a demo of Maps navigation with these glasses, and while it looked a lot like what I saw on the monocular version, it just felt a bit more engaging and realistic because the pins seemed to pop out at me more. I was also able to watch a YouTube video with surprisingly good audio and visual fidelity, which could be a game-changer for people who ride the subway to work in the morning.
Unfortunately, Google only gave me a few minutes with these and told me upfront that we won’t see them in retail any time soon. Google was cagey with details, but it confirmed they absolutely won’t be coming in 2026, so that means 2027 at the earliest.
Android XR glasses hands-on: The REAL future
In October, when I walked out onto the streets of Manhattan after my first time using the retail-ready Galaxy XR, I remember feeling very little excitement. I was much more concerned about where I was going to get lunch than I was with the technology I had just experienced. This is chiefly because the Galaxy XR is just another VR headset. Despite Google and Samsung trying to hype it up as something different, it’s a VR headset with a few new tricks — just like the Apple Vision Pro. And we all know how well the Vision Pro is doing (read: not well at all).
I wasn't too excited about the Galaxy XR, but I am really, really excited for everyday XR glasses.
But after leaving the demo event with Google’s various XR glasses last week in San Jose, I felt elated. I truly felt like I had used something that is going to be a major aspect of our lives in the future. Google might not hit a grand slam right out of the gate with these in 2026, but the core idea is there. Smart glasses are the future, and Google is now the company I most suspect will be at the forefront of that future.
And that brings me back to where I started: Meta. Google’s XR glasses have already solved two of the big problems with Meta’s glasses, and they’re not even out yet. They will have a robust software experience right out of the box, with apps from Google, close partners like Uber, and pretty much any other Android developer who wants to get on board. That stands in stark contrast to the “you only get Meta apps” nonsense of the Meta Ray-Ban Display Glasses. Likewise, the glasses run on Gemini, which is, in my opinion, the best generative AI experience on the market right now (even ChatGPT’s maker is starting to admit it feels the heat from Gemini). And even if Gemini weren’t the best, it is head and shoulders above anything Meta has produced.
That leaves us at the third problem with Meta’s ambitions, which is Meta itself. Is Google any better than Meta at the moment? Sure, most people I know don’t hate Google as passionately as they do Meta, but there are still lots of people who distrust the company. Are we ready for a future where you wear a face computer that Google has deep control over? Are you ready for Google to, quite literally, peer into your life?
Are we ready for Google to be on our faces every waking moment of the day?
I had a nice, long chat with Google reps after my demo about these heavy topics, and they said they are taking them very seriously. The glasses will all have a prominent on/off switch. In the “off” position, that switch will be distinctly red, signaling to everyone, “My spy glasses aren’t on right now, guys.” There’s also an LED on the front that lights up whenever the cameras are capturing any kind of information, and any attempt to cover the LED will render the cameras inoperable. Google also told me it’s working hard to develop clear privacy statements that are shown early and often to the user, so they know (and can control) everything Google gets from them.
This was all comforting to hear, but Google has built a bad reputation over the past few years for a whole slew of reasons that I won’t get into. Suffice it to say the company will need to work overtime to convince the masses that it will be responsible with its new face computers. If it can do that, though? I foresee Google very quickly overtaking Meta in this space, and the sky’s the limit from there.