Hackers Can Silently Control Siri From 16 Feet Away

A clever attack by French researchers turns your headphone cable into an antenna to send surreptitious voice commands.

Siri may be your personal assistant. But your voice is not the only one she listens to. As a group of French researchers has discovered, Siri also helpfully obeys the orders of any hacker who talks to her—even, in some cases, one who’s silently transmitting those commands via radio from as far as 16 feet away.

A pair of researchers at ANSSI, a French government agency devoted to information security, have shown that they can use radio waves to silently trigger voice commands on any Android phone or iPhone that has Google Now or Siri enabled, if it also has a pair of headphones with a microphone plugged into its jack. Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone. Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, send the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.

“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers, José Lopes Esteves and Chaouki Kasmi, write in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”

The researchers’ work, which was first presented at the Hack in Paris conference over the summer but received little notice outside of a few French websites, uses a relatively simple collection of equipment: It generates its electromagnetic waves with a laptop running the open-source software GNU Radio, a USRP software-defined radio, an amplifier, and an antenna. In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack’s range to more than 16 feet.
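For readers unfamiliar with that toolchain, the sketch below shows in rough outline what a GNU Radio transmit chain of the kind described above looks like: a recorded audio clip is resampled, given a simple AM envelope, and handed to a USRP for transmission. It is illustrative only; the file name, carrier frequency, gain, and sample rate are placeholders rather than the researchers’ actual parameters, and the amplifier and antenna are separate hardware outside the software’s control.

```python
# Illustrative GNU Radio flowgraph (Python): resample a recorded voice clip,
# give it a simple AM envelope, and stream it to a USRP software-defined radio.
# All parameters (file name, frequency, gain, rates) are hypothetical
# placeholders -- and transmitting on most frequencies is regulated, which is
# why the researchers demonstrated their work inside a Faraday cage.
from gnuradio import gr, blocks, filter, uhd

class VoiceCommandTx(gr.top_block):
    def __init__(self, wav_path="command.wav", carrier_hz=100e6, samp_rate=1.2e6):
        gr.top_block.__init__(self, "voice_command_tx")

        # Mono 48 kHz recording of the spoken command.
        src = blocks.wavfile_source(wav_path, False)

        # Bring the 48 kHz audio up to the SDR's sample rate (48 kHz x 25 = 1.2 MHz).
        resamp = filter.rational_resampler_fff(interpolation=25, decimation=1)

        # Add a DC offset so the signal's envelope carries the audio (simple AM),
        # then convert to the complex samples the USRP expects.
        offset = blocks.add_const_ff(1.0)
        to_complex = blocks.float_to_complex()

        # The USRP up-converts the samples to the chosen carrier frequency.
        sink = uhd.usrp_sink("", uhd.stream_args(cpu_format="fc32", channels=[0]))
        sink.set_samp_rate(samp_rate)
        sink.set_center_freq(carrier_hz)
        sink.set_gain(50)

        self.connect(src, resamp, offset, to_complex, sink)

if __name__ == "__main__":
    VoiceCommandTx().run()
```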

The experimental setup Kasmi and Esteves used to hijack smartphones’ voice commands with radio waves. PHOTOGRAPH: JOSÉ LOPES ESTEVES

In a video demonstrating the attack, the researchers commandeer Google Now via radio on an Android smartphone and force the phone’s browser to visit the ANSSI website. (That experiment was performed inside a radio-wave-blocking Faraday cage, the researchers say, to abide by French regulations that forbid broadcasting on certain electromagnetic frequencies. But Kasmi and Esteves say that the Faraday cage wasn’t necessary for the attack to work.)

The researchers’ silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don’t have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user’s voice. iPhones have Siri enabled from the lockscreen by default, but the new version of Siri for the iPhone 6s verifies the owner’s voice just as Google Now does.1 Another limitation is that attentive victims would likely see that the phone was receiving mysterious voice commands and could cancel them before the mischief was complete.

Then again, the researchers contend that a hacker could hide the radio device inside a backpack in a crowded area and use it to transmit voice commands to all the surrounding phones, many of which might be vulnerable and hidden in victims’ pockets or purses. “You could imagine a bar or an airport where there are lots of people,” says Strubel. “Sending out some electromagnetic waves could cause a lot of smartphones to call a paid number and generate cash.”

Although the latest version of iOS now has a hands-free feature that allows iPhone owners to send voice commands merely by saying “Hey Siri,” Kasmi and Esteves say that their attack works on older versions of the operating system, too. iPhone headphones have long had a button on their cord that allows the user to enable Siri with a long press. By reverse engineering and spoofing the electrical signal of that button press, their radio attack can trigger Siri from the lockscreen without any interaction from the user. “It’s not mandatory to have an always-on voice interface,” says Kasmi. “It doesn’t make the phone more vulnerable, it just makes the attack less complex.”

Of course, security-conscious smartphone users probably already know that leaving Siri or Google Now enabled on their phone’s lock screen represents a security risk. At least in Apple’s case, anyone who gets hands-on access to the device has long been able to use those voice command features to squeeze sensitive information out of the phone—from contacts to recent calls—or even hijack social media accounts. But the radio attack extends the range and stealth of that intrusion, making it all the more important for users to disable the voice command functions from their lock screen.

The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended other fixes, too: They advise that better shielding on headphone cords would force attackers to use a higher-power radio signal, for instance, or that an electromagnetic sensor in the phone could block the attack. But they note that the attack could also be prevented in software, by letting users create their own custom “wake” words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands. Neither Google nor Apple has yet responded to WIRED’s inquiry about the ANSSI research.
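As a rough illustration of the wake-word fix, the hypothetical sketch below accepts a command only if it begins with a phrase the owner chose, so a radio attacker broadcasting the default “Hey Siri” or “OK Google” trigger would simply be ignored. The class, phrase, and transcripts are invented for illustration and are not how Apple or Google actually implement voice activation.

```python
# Hypothetical sketch of the researchers' suggested software mitigation:
# only act on voice commands that start with a wake phrase the user picked,
# which a remote attacker has no way of knowing.
from dataclasses import dataclass

@dataclass
class VoiceCommandGate:
    custom_wake_phrase: str  # enrolled by the owner, e.g. during setup

    def should_handle(self, transcript: str) -> bool:
        """Accept a transcribed command only if it opens with the owner's phrase."""
        return transcript.strip().lower().startswith(self.custom_wake_phrase.lower())

gate = VoiceCommandGate(custom_wake_phrase="aubergine, wake up")

# A spoofed command using the stock wake word is dropped...
print(gate.should_handle("OK Google, open http://attacker.example"))   # False
# ...while the owner's own phrase still works.
print(gate.should_handle("Aubergine, wake up, what's the weather?"))   # True
```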

Without the security features Kasmi and Esteves recommend, any smartphone’s voice features could represent a security liability—whether from an attacker with the phone in hand or one that’s hidden in the next room. “To use a phone’s keyboard you need to enter a PIN code. But the voice interface is listening all the time with no authentication,” says Strubel. “That’s the main issue here and the goal of this paper: to point out these failings in the security model.”

1. Correction 10/15/2015 12:00pm EST: An earlier version of the story stated that Siri doesn’t have verification of the owner’s voice. In fact, that feature was introduced with the iPhone 6s. Apologies for the error.