
Google Assistant may be vulnerable to attacks via subsonic commands

A new study claims voice-based AI services like Google Assistant could be fed malicious commands through subsonic audio.
Published on May 11, 2018

TL;DR
  • A new study claims that Google Assistant, along with other voice command-based AI services like Alexa and Siri, may be vulnerable to subsonic commands.
  • The study says that while these commands cannot be heard by humans, they can be detected by Google Assistant, Siri and Alexa.
  • In theory, cybercriminals could use these commands to order these services to purchase products, launch websites and more.

We have already seen that voice-based AI services like Google Assistant can be triggered accidentally by something as mundane as a TV commercial. Now a new study claims that Google Assistant, along with rivals like Apple’s Siri and Amazon’s Alexa, could be vulnerable to sound commands that humans can’t even hear.

According to The New York Times, the research was conducted by teams at the University of California, Berkeley, and Princeton University in the US, along with China’s Zhejiang University. The researchers say they have found a way to cancel out the sounds that Google Assistant, Siri, and Alexa would normally hear and replace them with audio that the human ear cannot detect, but that the machine learning software powering these digital assistants can still recognize and act on.

So what does that mean? The researchers claim that, in theory, cybercriminals could use these subsonic commands to cause all sorts of havoc. They could embed audio in a YouTube video or website that causes Google Assistant to order products online without your consent, open malicious sites, and more. If a speaker like Google Home is connected to smart home devices, these stealth commands could potentially shut down your security cameras, turn off your lights, or unlock your doors.

The good news is that there is no evidence these kinds of subsonic commands have been used outside the university labs where they were developed. When asked for comment, Google said Assistant already has measures in place to defeat such commands, and Apple and Amazon said they have taken steps to address the concerns as well. Hopefully, all three companies will continue to develop security measures against these kinds of threats.
