Messages Hidden in Bird Calls or Music Could Be Used to Hijack Your Voice Assistant

Virtual assistants are supposed to make our lives easier, but a disturbing new revelation shows how they could be used for evil. In what sounds like a Black Mirror episode, Fast Company notes that hackers could theoretically disguise commands as ordinary sounds—like a bird’s chirps or music—and broadcast them across apps or TV commercials. These messages, while imperceptible to human ears, would be specially coded so that a virtual assistant like Alexa or Cortana could pick up on them and act accordingly.

This discovery comes from scientists at Germany’s Ruhr University Bochum, who have been studying “adversarial attacks.” These attacks act as “optical illusions for machines,” as the nonprofit research company OpenAI puts it: input fed into a machine learning system is deliberately crafted to confuse it into producing an error.
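
What does an adversarial attack look like in practice? Below is a minimal sketch in Python of the “fast gradient sign method,” one of the simplest published recipes for building an adversarial example against a machine learning model. It is a generic illustration rather than the Bochum team’s technique, and the model it perturbs is a hypothetical stand-in.

import torch
import torch.nn.functional as F

def fgsm_example(model, x, label, epsilon=0.01):
    # Ask the model how its error would change if each input value
    # moved slightly (i.e., take the gradient of the loss w.r.t. x).
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), label)
    loss.backward()
    # Nudge every input value a tiny step in the direction that
    # increases the error. The change is too small for a person to
    # notice, but it can flip the model's answer entirely.
    return (x + epsilon * x.grad.sign()).detach()

The same logic applies whether the input is an image or a slice of audio; the attacker only needs a way to estimate how the system reacts to small changes.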

According to the researchers, hackers could hide messages in songs, speech, or other sounds that only a voice assistant can “hear.” An attacker could use this to make unauthorized purchases or compromise private information. Take one of the researchers’ sample clips, for example.

The audio sounds a bit off, but the hidden message, “deactivate security camera and unlock front door,” is impossible to make out: in an experiment involving 22 test subjects, none of the listeners could understand or transcribe the command after hearing it. This and other findings led the researchers to conclude that, “in general, it is possible to hide any target transcription within any audio file” [PDF].
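
Public research on “audio adversarial examples” describes the general recipe behind results like this: start with any audio file, then repeatedly adjust a quiet layer of added noise until a speech recognizer transcribes the attacker’s chosen phrase. The Python sketch below illustrates that idea in broad strokes; the asr_model and target_ids names are hypothetical stand-ins, and this is not the Bochum group’s actual code.

import torch
import torch.nn.functional as F

def hide_phrase(asr_model, audio, target_ids, steps=500, lr=1e-3, c=0.05):
    # delta is the hidden "message": a perturbation layered on the audio.
    delta = torch.zeros_like(audio, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Assume the recognizer returns per-frame log-probabilities
        # over its vocabulary, shaped (time, batch, characters).
        log_probs = asr_model(audio + delta)
        input_lengths = torch.tensor([log_probs.size(0)])
        target_lengths = torch.tensor([target_ids.size(1)])
        # The CTC loss pulls the transcription toward the target phrase,
        # while the second term keeps the added noise quiet.
        loss = F.ctc_loss(log_probs, target_ids,
                          input_lengths, target_lengths)
        loss = loss + c * delta.abs().mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return (audio + delta).detach()

The Bochum paper goes a step further, using psychoacoustic models of human hearing to tuck the noise beneath sounds the ear is already busy processing, which helps explain why listeners in the experiment couldn’t pick it out.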

This isn’t the first time privacy and information security concerns have surfaced in regard to voice assistants. A study last year found that Alexa could pick up on “whisper” commands pitched outside the range of human hearing. And last May, Alexa recorded an Oregon woman’s private conversation with her husband and sent it, unprompted, to one of her contacts in Seattle. Fortunately, they were only talking about hardwood floors, but Alexa still got the boot.

Amazon told Co.Design that the company is looking into the researchers’ latest findings. In the meantime, you might not want to trust your voice assistant with your most sensitive information or darkest secrets.

[h/t Fast Company]