Another rumor: The Pixel 4 might recognize your hand gestures (Update)
Update, June 11, 2019 (6:15 pm ET): Shortly after the initial rumor was published, XDA-Developers shared information about a gesture-based feature it has been tracking in the Android Q betas. According to the publication, the new gestures are called “Skip” and “Silence” and would rely on the device including an “Aware” sensor.
XDA-Developers notes that the strings of code for Aware are incomplete placeholders. With the rumor that Project Soli could be added to the Pixel 4 or even updated Nest smart speakers, these media-control gestures make a lot of sense.
Original article, June 11, 2019 (5:19 pm ET): Just hours after we reported on the Pixel 4 and Pixel 4 XL’s potential camera systems, a new rumor has sprung up surrounding the two handsets. According to 9to5Google, Google could be adding hand gesture tracking technology to its upcoming smartphones.
Google ATAP introduced what it calls Project Soli at the company’s 2015 I/O developer conference. In a nutshell, the technology is a miniature radar that can pick up on minute hand gestures. As you can see from the above video, Google envisioned Project Soli as a touch-free user interface for interacting with wearables, radios, and other electronics.
The photo below demonstrates how various finger taps and twists could change aspects of an app running on a tablet.
Google’s special projects department was relatively quiet about Project Soli after its debut, up until last December. On the very last day of the year, the FCC gave the search giant permission to start testing the radar technology.
This chain of events would give Google enough time to experiment with the technology more and potentially add it to a commercial device — such as the Pixel 4.
For now, Project Soli’s inclusion in the Pixel 4 and Pixel 4 XL is very much a rumor. It’s unclear how Google would incorporate the feature, but 9to5Google believes it could be used for basic interactions or even triggering various Assistant features.