Combining machine learning and data analytics, Siri, the personal assistant for millions of Apple users, is a powerful tool. Voice a command and Siri listens and obeys, whether you want to know how many calories are in your soda can or how many planes are flying above your head this very instant. But what if someone commanded Siri without your permission? A group of ethical French hackers recently showed it's possible to hijack Siri from up to 16 feet away, and satisfy any whim, using hardware that fits in a backpack.
The experiment was carried out by researchers at ANSSI, a French government agency specializing in information security. The vulnerability comes into play when an Apple user plugs in headphones equipped with a microphone. Using a laptop running GNU Radio, hooked up to an antenna and an amplifier, the French hackers demonstrated that they could broadcast electromagnetic waves that get picked up by the headphone cord, which acts as an antenna. The cord converts the waves into an electrical signal, which the phone digitizes and interprets as voice commands for Siri, or for Google Now on Android handsets. The commands are pre-programmed, so the hackers didn't even have to whisper a request to Siri.
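To make the principle concrete, here is a minimal Python sketch of amplitude modulation, the basic trick of riding an audio signal on a radio carrier so that a stray wire can pick it up. This is not the researchers' actual GNU Radio setup; the sample rate, carrier frequency, and placeholder tone are all illustrative assumptions.

```python
import numpy as np

# --- Hypothetical parameters, for illustration only ---
fs = 1_000_000        # sample rate in Hz (a real SDR would run much faster)
carrier_hz = 50_000   # stand-in carrier; the real attack targets RF bands
duration_s = 0.01

t = np.arange(int(fs * duration_s)) / fs

# Placeholder "voice command": a 1 kHz tone standing in for recorded speech.
command = np.sin(2 * np.pi * 1_000 * t)

# Classic AM: the audio rides on the carrier's amplitude envelope.
modulation_index = 0.8
am_signal = (1 + modulation_index * command) * np.cos(2 * np.pi * carrier_hz * t)

# am_signal would then be handed off to SDR hardware (e.g., via GNU Radio)
# for amplification and over-the-air transmission -- omitted here.
```

In the attack described above, the headphone cord picks up the transmitted waveform and the phone's microphone circuitry unwittingly demodulates it back into an audio-band signal, which Siri treats as an ordinary spoken request.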
“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers, José Lopes Esteves and Chaouki Kasmi, write in a paper published by the IEEE.
“The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”
Kasmi and Esteves say that a hacker could walk inside an airport or some other busy public space with the hardware switched on, listening for and sending signals to any Apple device with Siri enabled and headphones plugged in. The phone can then be instructed to open a malware site, which can install further malicious code, or to send text messages to premium-rate numbers that earn the attackers money. The researchers told Wired that they've contacted Apple and Google about the issue and recommend manufacturers build headphone cords with better shielding. I can't help thinking, however, that all these ethical hacks meant to showcase security breaches are actually giving malicious hackers ideas. We can only hope companies stay one step ahead of the wave, for our own sake.
Users concerned about such hacks should disable access to Siri from the lock screen. This can be done by opening the iOS Settings app, selecting Touch ID & Passcode, then scrolling down and toggling off Siri under Allow Access When Locked.