Hidden Voice Commands in Videos Can Hack Your Smartphone


Our smartphones come with built-in digital assistants that perform actions based on voice commands. As speech recognition systems improve, these assistants are becoming more capable and a bigger part of our daily lives. But this convenience can be exploited to hack your smartphone.

Research conducted at UC Berkeley and Georgetown University shows how distorted voice commands hidden in YouTube videos can be used to attack a smartphone. It may sound strange, but the researchers actually demonstrated the attack and reported that it works.

The distorted audio contains ordinary commands addressed to a smartphone's voice assistant, asking it to perform some action. The commands are mangled just enough that a human listener finds them hard to understand, while the speech recognition system still parses them correctly.
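To make the idea concrete, here is a minimal sketch of that kind of distortion. This is not the researchers' actual technique (their white-box attack works on the recognizer's internal features); it just illustrates the principle of degrading audio for human ears while keeping the dominant content intact. The function name and parameter values are illustrative assumptions.

```python
import random

def distort(samples, speed=1.5, noise_level=0.05):
    """Crude illustration: speed up the audio and overlay noise so the
    phonetics become hard for a human to follow, while much of the
    spectral content a recognizer keys on survives.
    `samples` is a list of floats in [-1, 1]; parameters are arbitrary.
    """
    # Drop samples to play back roughly `speed` times faster.
    sped_up = [samples[int(i * speed)] for i in range(int(len(samples) / speed))]
    # Add low-level white noise to mask the residual voice character.
    return [s + random.uniform(-noise_level, noise_level) for s in sped_up]
```

A real attack tunes the distortion against the target recognizer, but the trade-off is the same: more distortion confuses humans, too much confuses the machine as well.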

When you play such a video on your computer or laptop, the hidden voice commands can trigger operations on a nearby smartphone: opening a web page, downloading a file, launching an app, or something similar.

Researchers proposed two attack models:

Black-box model: the attacker has no knowledge of the target's speech recognition system. The resulting voice commands can still be deciphered by a human who concentrates on them.

White-box model: the attacker knows the recognizer's internals and can produce distorted voice commands that are next to impossible for a human being to understand.

The researchers also said they are working on defenses: an alert system that warns users when such a voice command initiates an action on the smartphone, a machine-learning detector that they report catches hidden commands with 99.8% accuracy, and a challenge-response system that asks the user to confirm before acting.
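The challenge-response idea can be sketched in a few lines. This is an illustrative mock-up, not the researchers' implementation: all names, the word pool, and the `transcribe` callback are assumptions standing in for a real speech round trip.

```python
import random

# Arbitrary pool of confirmation words (illustrative only).
CONFIRM_WORDS = ["apple", "river", "candle", "falcon"]

def require_confirmation(command, transcribe):
    """Before running a sensitive command, challenge the user with a
    random word and proceed only if they repeat it. A video's
    pre-recorded audio cannot anticipate the word, so a hidden
    command fails the check.

    `transcribe(prompt)` stands in for speaking the prompt aloud and
    transcribing the user's spoken reply.
    """
    challenge = random.choice(CONFIRM_WORDS)
    reply = transcribe(f"Say '{challenge}' to run: {command}")
    return reply.strip().lower() == challenge
```

The security comes from the randomness of the challenge: an attacker who cannot hear the prompt cannot pre-record the right answer.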

While we don't know whether this kind of attack has been used in the past, you should now be more alert when watching any random video on YouTube. And it is not just YouTube: the technique can be used in any video on any online site, or even in files played offline on your own system.

If you think this kind of attack is not possible, watch the video added below. The smartphone was placed around 10 feet away from the speakers, yet the attack still succeeded in triggering actions on it.

That said, the actions these voice assistants can perform are limited, so the scope of the attack is limited as well.