Google implemented Motion Sense gestures in the Pixel 4 series, using a tiny ‘Soli’ radar to detect your hand movements. The search giant already explained how the tech works last year, but it’s now given us a look at what exactly the radar sees.
In a post on Google’s AI Blog, the company says it has developed algorithms that mean the Soli radar system doesn’t need a “well-defined image” of a person to work properly. This has privacy implications too, as the firm says the system doesn’t generate “distinguishable” images of a person’s body or face.
The team also shared several GIFs showing what the Pixel 4’s Soli radar actually sees, and it’s indeed tough, if not impossible, to tell who’s using the system. The first GIF shows a person approaching the radar, the second shows someone reaching for the device, and the third shows a person performing a swipe gesture.
From here, Google said it had to develop on-device algorithms to accurately detect gestures, owing to the wide variety of unique ways people might perform them. For example, one person’s way of swiping might differ noticeably from another’s. It also developed algorithms to ensure that background movements weren’t falsely detected as gestures.
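To make the two requirements concrete, here is a toy sketch of that idea. This is not Google’s algorithm; the function names, template shape, and thresholds are all hypothetical. It normalizes a 1-D motion-energy trace so fast and slow performances of the same gesture look alike, rejects low-energy background movement, and then compares the trace against a canonical swipe shape:

```python
# Illustrative sketch only: names, template, and thresholds are hypothetical,
# not Google's actual Soli gesture pipeline.

def normalize(trace):
    """Scale a 1-D motion-energy trace to unit peak, so that stronger and
    weaker performances of the same gesture produce a similar shape."""
    peak = max(abs(v) for v in trace) or 1.0
    return [v / peak for v in trace]

def detect_swipe(trace, energy_threshold=0.5, shape_tolerance=0.35):
    """Return True if the trace looks like a swipe: enough average energy
    to rule out background motion, and a shape close to a canonical
    rise-then-fall swipe template."""
    if sum(abs(v) for v in trace) / len(trace) < energy_threshold:
        return False  # low-energy background movement: reject outright
    t = normalize(trace)
    n = len(t)
    # Canonical template: linear ramp up to a single peak, then ramp down.
    template = [1.0 - abs(2.0 * i / (n - 1) - 1.0) for i in range(n)]
    # Mean squared error between the observed shape and the template.
    error = sum((a - b) ** 2 for a, b in zip(t, template)) / n
    return error < shape_tolerance
```

A real system would classify many gestures from multi-dimensional radar features rather than a single template, but the same two-stage idea (reject background motion, then match shape regardless of speed or intensity) carries over.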
Another factor Google had to take into account when developing the Pixel 4 Soli radar system was interference from other components within the phone. In fact, the firm had to develop new signal processing techniques to reduce the effect of audio vibration on the radar’s signal. This latter breakthrough paved the way for Motion Sense to be used for music playback as well.
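The general principle here can be sketched with a generic filter. Again, this is not Google’s technique, only a hypothetical illustration of the premise: hand motion shows up as slow changes in the radar signal, while speaker-induced vibration sits at audio frequencies, so a simple low-pass filter can suppress the latter:

```python
import math

# Illustrative sketch only: a generic single-pole low-pass filter, not the
# signal-processing technique Google actually developed for Soli.

def lowpass(samples, sample_rate, cutoff_hz):
    """Single-pole (RC-style) low-pass filter over a list of samples."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)  # exponential smoothing step
        out.append(y)
    return out

# Hypothetical demo: a 2 Hz "hand motion" component mixed with a 200 Hz
# "speaker vibration" component, sampled at 1 kHz.
rate = 1000
t = [i / rate for i in range(rate)]
hand = [math.sin(2 * math.pi * 2 * s) for s in t]
vibration = [0.5 * math.sin(2 * math.pi * 200 * s) for s in t]
mixed = [h + v for h, v in zip(hand, vibration)]

# Cutting at 20 Hz keeps the slow hand motion and attenuates the vibration.
cleaned = lowpass(mixed, rate, cutoff_hz=20)
```

The slow gesture component passes through nearly intact while the audio-band component is strongly attenuated, which is the separation that would let gestures be read reliably even while music is playing.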
“We are excited to continue researching and developing Soli to enable new radar-based sensing and perception capabilities,” the team concluded, suggesting that this isn’t the end of the road for the technology.
Are you a fan of Motion Sense on the Pixel 4 series? Let us know in the comments!