This wild video shows what could be the future of mobile voice assistants
Imagine a world where you can look at something, ask a question about it, and immediately get an answer. That’s exactly what researchers at the Human-Computer Interaction Institute at Carnegie Mellon University are developing. The project is called WorldGaze, and it’s pretty amazing.
According to the team, WorldGaze “enhances mobile voice interaction with real-world gaze location.” Holding a WorldGaze-equipped smartphone out in front of you lets you use various voice assistants to engage with your surroundings without having to describe what you’re looking at.
The software works by simultaneously activating the front and rear cameras on a smartphone, taking in a combined 200-degree field of view. This allows WorldGaze to home in on where you are looking.
WorldGaze then passes that contextual info to Siri, Alexa, or Google Assistant to make voice-activated commands much more powerful. That means you can find out what time a business closes, how much something costs, or even control your smart home gadgets just by looking at something and initiating a “Hey Siri/Google/Alexa” command.
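The researchers haven’t published their code here, but the basic idea described above — estimate the user’s gaze bearing from the front camera, detect objects in the rear camera’s view, and hand the best match to the assistant as context — can be sketched in a few lines. Everything below (the `DetectedObject` type, the bearing values, the tolerance cone) is a hypothetical simplification, not the team’s actual implementation.

```python
# Hypothetical sketch of WorldGaze-style gaze-to-object matching.
# Assumed inputs: a gaze bearing estimated from the front camera, and
# labeled objects with bearings detected in the rear camera's view.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # e.g. a storefront recognized by the rear camera
    bearing_deg: float  # horizontal angle within the combined field of view

def object_under_gaze(gaze_bearing_deg, objects, tolerance_deg=15.0):
    """Return the detected object closest to the gaze bearing,
    or None if nothing falls within the tolerance cone."""
    best = min(objects,
               key=lambda o: abs(o.bearing_deg - gaze_bearing_deg),
               default=None)
    if best and abs(best.bearing_deg - gaze_bearing_deg) <= tolerance_deg:
        return best
    return None

# Example: the user looks about 30 degrees to the right, toward a cafe.
scene = [DetectedObject("coffee shop", 28.0),
         DetectedObject("bookstore", -40.0)]
target = object_under_gaze(30.0, scene)
if target:
    # This label is the contextual info a voice assistant would receive.
    print(f"Context for assistant: {target.label}")
```

In this toy version, the object’s label is simply the extra context that makes a bare “when does this close?” query answerable.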
What’s even more interesting about WorldGaze is that it’s a software-only solution, meaning it doesn’t require any special hardware. The team says WorldGaze could launch as a standalone application, but it’s more likely to be integrated as a background service.
Unfortunately, WorldGaze isn’t a product you can use just yet. So far, it’s just a proof of concept, though the team has tested it in streetscapes, retail environments, and smart home settings.
Plus, just the thought of holding a phone in front of me wherever I go makes my arm tired. Thankfully, the team will explore implementing WorldGaze into smart glasses in the future.
This sounds a lot more promising than adding it to existing mobile devices. Hopefully, it’s not too long before we see some WorldGaze-equipped consumer products. Until then, if you want to learn more about the project, you can do so here, or you can check out the team’s official research paper here.