
Project Soli showcased on a smartwatch at Google I/O

At Google I/O, the search giant revealed that they had made tremendous strides in the realm of power efficiency for Project Soli, their hands-off, radar-based input technology.

Published on May 20, 2016

We haven’t heard much from Project Soli in a while, but those with their fingertips to the pulse of the tech world will recall Google demonstrating their hands-off approach to device interaction last year. The technology uses radar to determine the position of your hands in space, and natural movements can be used for input. Unfortunately, the power demands for this tiny radar device proved to be restrictive, and developers who received the early access dev kit last year had to rely on power from a laptop to keep it alive. Since this technology is being specifically developed for wearables, you can see how this was something of a roadblock.

At Google I/O, however, the search giant revealed that they had made tremendous strides in the realm of power efficiency for Project Soli. The team cut power consumption by a factor of 22, meaning the Soli chip can run on a mere 0.054 W instead of its previous demand of 1.2 W. In terms of raw computational power, this iteration of the technology is fully 256 times as efficient as its predecessor and is still able to accurately interpret hand gestures at a rate of 18,000 frames per second. These advancements have enabled the team to install Soli in a smartwatch.
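As a quick sanity check on those figures (the variable names below are ours, not Google's), dividing the old power draw by the new one recovers the quoted factor of roughly 22:

previous_power_w = 1.2    # power draw of the earlier Soli dev kit chip, as reported
current_power_w = 0.054   # figure quoted at Google I/O 2016
reduction = previous_power_w / current_power_w
print(round(reduction, 1))  # prints 22.2, in line with the "factor of 22" claim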


Interacting with this wearable makes you look like a freaking magician, as simple commands can be relayed to the device without even making physical contact with it. One ailment that has long plagued smartwatches is that information display is constrained by the small screen. UI designers have to account for the fact that a finger on such a small touch display covers up a substantial portion of the screen. Being able to interact with the device at a distance alleviates this problem.

The team also demonstrated that the technology isn't limited to wearables: they created a speaker system controllable via hand gestures that can be read from up to 15 meters away. The future smart home may see us controlling all aspects of our environment by simply standing in the living room and motioning like a maestro before an orchestra.


It will likely be some time before we start seeing this technology at the consumer level. However, Google plans to release a new, improved beta-level developer kit in 2017. What are your thoughts on Project Soli? Is this the future of device interaction, or will things take a more verbal, conversational route? Let us know your theories and opinions in the comments!