Hackers use radio waves to silently control Apple's Siri, Android's Google Now

A newly spotlighted hack targets iPhone and Android handsets with headphones plugged in, remotely and silently triggering the smartphone's built-in voice controls, potentially unbeknownst to the user.

Researchers from French government agency ANSSI found they were able to control Apple's Siri or Android's Google Now from as far as 16 feet away, according to Wired. The attack uses a radio transmitter to beam signals at a headset with an integrated microphone plugged into the mobile device, with the headphone cable acting as an antenna.

Headphone cables make decent radio antennas, as evidenced by Apple's use of them to enable FM radio reception on its iPod nano. The team at ANSSI found they could exploit this property to trick an iPhone or Android device into believing that audio commands are coming from the connected microphone.

"Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker's number to turn the phone into an eavesdropping device, send the phone's browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter," Wired explained.

In its most compact form, the hack works from up to about six and a half feet away with equipment that could fit inside a backpack. A more powerful version, effective at up to 16 feet, would require hardware housed in a car or van.

The hack only works on headphone-connected iPhones that have Siri enabled from the lock screen, which is Apple's default setting. It works not only with the new iPhone 6s, whose "Hey Siri" feature is always listening, but also with older devices, by spoofing the button press used to activate Siri from a headset, such as Apple's own EarPods.

Of course, anyone who can get their hands on a user's iPhone can access Siri as long as it's enabled from the lock screen. But the ANSSI technique would allow for remote, stealthy access to a device without the owner noticing.

Some Android devices do feature voice recognition for Google Now access, which could thwart such an attack. Apple has no comparable functionality built into Siri yet.

Starting with iOS 9, Apple has begun tailoring "Hey Siri" to each individual user, helping the personal assistant recognize that user's voice when they invoke the feature. The new setup process could be a precursor to voice recognition security in future versions of iOS.

Users concerned about such hacks should disable access to Siri from the lock screen. This can be done by opening the iOS Settings app, selecting Touch ID & Passcode, then scrolling down and toggling off Siri under "Allow Access When Locked." There, users can also disable lock screen access to the Today view, Notifications View, Reply with Message, and Wallet, if they so choose.

For further security, users can also return to the main Settings menu, choose Control Center, and disable "Access on Lock Screen." This prevents a stolen iPhone from being placed into Airplane Mode without first unlocking or powering off the device.
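
For readers who manage fleets of iPhones rather than a single handset, the same two lock screen protections can typically be enforced through a configuration profile instead of the Settings app. The fragment below is a minimal, illustrative sketch assuming Apple's documented Restrictions payload keys allowAssistantWhileLocked and allowLockScreenControlCenter; the identifier and UUID values are placeholders, and the enclosing .mobileconfig wrapper is omitted.

    <!-- Illustrative Restrictions payload fragment for a .mobileconfig profile -->
    <dict>
        <key>PayloadType</key>
        <string>com.apple.applicationaccess</string>
        <key>PayloadVersion</key>
        <integer>1</integer>
        <key>PayloadIdentifier</key>
        <string>com.example.lockscreen-restrictions</string> <!-- placeholder -->
        <key>PayloadUUID</key>
        <string>00000000-0000-0000-0000-000000000000</string> <!-- placeholder -->
        <key>PayloadDisplayName</key>
        <string>Lock Screen Restrictions</string>

        <!-- Block Siri while the device is locked -->
        <key>allowAssistantWhileLocked</key>
        <false/>

        <!-- Hide Control Center, and its Airplane Mode toggle, from the lock screen -->
        <key>allowLockScreenControlCenter</key>
        <false/>
    </dict>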

As for the hardware side of security, the researchers at ANSSI have reached out to both Apple and Google, recommending that the companies adopt better shielding on their headphone cords, which would make them more difficult for attackers to co-opt as antennas. Future handsets could also include electromagnetic sensors as an added safeguard.

Apple and Google could also address the issue in software by allowing users to create custom voice prompts to invoke Siri and Google Now, phrases an attacker would not know in advance. Like Apple's "Hey Siri," Google lets users begin a voice search with the generic phrase "OK Google."