Silent Commands Compromise Security For Alexa, Siri


Researchers have discovered security vulnerabilities affecting digital assistants such as Apple’s Siri, Amazon’s Alexa, and Google’s Assistant.

According to a report in The New York Times, over the last two years researchers in the United States and China have discovered ways to send digital assistants hidden commands that activate the systems and issue orders through sounds undetectable to the human ear.

That ability could allow hackers to use the assistants to unlock smart locks, access users’ bank accounts, or steal a variety of personal information.

In 2016, students from the University of California, Berkeley, and Georgetown University showed that they could hide commands in white noise played over loudspeakers and in YouTube videos, getting smart devices to turn on airplane mode or open a website.

This month, some of those Berkeley researchers published a paper showing that such inaudible commands could be embedded directly into music or spoken text.
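At a high level, this line of work treats the attack as an optimization problem: add a perturbation to an ordinary waveform that is small enough to escape human notice, yet steers a speech-recognition model toward an attacker-chosen transcription. The Python sketch below is only a toy illustration of that idea; the random linear scorer standing in for a recognizer, the amplitude budget `epsilon`, and the other parameters are assumptions for demonstration, not the researchers’ actual attack, which optimized against a full speech-to-text system (Mozilla’s DeepSpeech).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16_000                              # one second of 16 kHz audio
audio = 0.1 * rng.standard_normal(n)    # stand-in "benign" waveform
w = rng.standard_normal(n)              # stand-in recognizer: score = w . x
target_score = 1.0                      # pretend this score triggers the command

delta = np.zeros(n)                     # the hidden perturbation
epsilon = 0.002                         # per-sample amplitude cap (keeps it quiet)
lr = 1e-4                               # gradient-ascent step size

for _ in range(500):
    if w @ (audio + delta) >= target_score:
        break
    delta += lr * w                     # d(score)/d(delta) is just w here
    delta = np.clip(delta, -epsilon, epsilon)  # stay inside the audibility budget

print(f"clean score:       {w @ audio:.2f}")
print(f"adversarial score: {w @ (audio + delta):.2f}")
print(f"peak perturbation: {np.abs(delta).max():.4f} (audio peak {np.abs(audio).max():.2f})")
```

The point the toy makes is visible in the output: the perturbation’s peak amplitude stays at a small fraction of the audio’s, yet the stand-in model’s decision flips.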

“We wanted to see if we could make it even more stealthy,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors.

While there is no evidence these techniques have moved outside the laboratory, it’s possible hackers will eventually discover the method on their own.

“My assumption is that the malicious people already employ people to do what I do,” said Carlini.

Last year, researchers at Princeton University and China’s Zhejiang University found that voice-activated devices could also be given orders through ultrasonic frequencies that humans cannot hear but dolphins can. The Zhejiang researchers dubbed the technique DolphinAttack.

Humans can’t hear ultrasonic frequencies of this kind, but a device’s microphone picks them up clearly. Such ultrasonic commands can be used to trick a voice-activated assistant into carrying out actions its owner never requested.
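The published description of DolphinAttack amounts to classic amplitude modulation: the voice command is shifted onto an ultrasonic carrier, and nonlinearity in the target microphone’s hardware demodulates it back into the audible band, where the assistant’s recognizer hears it as ordinary speech. The Python sketch below illustrates that signal construction; the 192 kHz sample rate, 25 kHz carrier, and 400 Hz stand-in “command” tone are illustrative assumptions.

```python
import numpy as np

fs = 192_000                  # sample rate high enough to represent ultrasound
carrier_hz = 25_000           # above the ~20 kHz ceiling of human hearing
t = np.arange(fs) / fs        # one second of time samples

# Stand-in for a recorded voice command: a 400 Hz "baseband" tone.
command = np.sin(2 * np.pi * 400 * t)

# Amplitude modulation shifts the command up around the ultrasonic carrier,
# so the transmitted signal carries no energy in the audible band at all.
transmitted = (1 + 0.8 * command) * np.sin(2 * np.pi * carrier_hz * t)

# Nonlinear distortion in a microphone (modeled crudely here as squaring)
# recreates a copy of the command at its original, audible frequency.
received = transmitted ** 2

spectrum = np.abs(np.fft.rfft(received)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
print(f"received energy at 400 Hz: {spectrum[np.argmin(np.abs(freqs - 400))]:.2f}")
```

Squaring is only a crude stand-in for real microphone distortion, but it shows the effect: the received spectrum contains a strong component right back at 400 Hz, even though everything transmitted sat above 20 kHz.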

Amazon responded to the report by saying it has taken steps to ensure its Echo speakers are secure. Google said its platform has features that mitigate undetectable commands, and Apple said an iPhone or iPad must be unlocked before Siri will open an app.