Siri or Google Now may be your personal assistant, but your voice is not the only one it listens to. Both assistants will also helpfully obey the orders of any hacker who talks to them, even, in some cases, one who is silently transmitting those commands via radio from as far as 16 feet away.
A pair of researchers at ANSSI, a French government agency devoted to information security, have discovered a way to use radio waves to silently activate Siri or Android’s Google Now from across the room. The researchers, José Lopes Esteves and Chaouki Kasmi, say the method lets a hacker attack a target phone through Siri or Google Now from up to 16 feet away.
The hack only works on an Android phone or iPhone that has Google Now or Siri enabled and that has a pair of headphones with a microphone plugged into its jack. As Wired’s report explains: “Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone.”
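The coupling can be illustrated with a toy signal-processing sketch. This is an illustration only, not the researchers’ actual setup: it assumes the attacker amplitude-modulates the voice command onto a radio carrier, and stands in for the phone’s audio front-end with a crude envelope detector that recovers the baseband audio from the signal induced on the cord.

```python
import numpy as np

fs = 1_000_000          # 1 MHz sample rate (toy scale, not a real VHF carrier)
t = np.arange(0, 0.01, 1 / fs)

# A 1 kHz tone stands in for the attacker's spoken command.
audio = np.sin(2 * np.pi * 1_000 * t)

# Amplitude-modulate the "command" onto a carrier, as in classic AM injection.
carrier_hz = 100_000
am = (1 + 0.5 * audio) * np.sin(2 * np.pi * carrier_hz * t)

# Stand-in for the audio front-end acting as an unintentional envelope
# detector: rectification followed by low-pass filtering.
rectified = np.abs(am)
kernel = np.ones(200) / 200          # crude moving-average low-pass filter
recovered = np.convolve(rectified, kernel, mode="same")
recovered -= recovered.mean()        # drop the DC offset

# The recovered waveform correlates strongly with the injected audio,
# which is why the OS hears it as microphone input.
corr = np.corrcoef(audio, recovered)[0, 1]
print(f"correlation with injected audio: {corr:.2f}")
```

In this toy model the recovered signal tracks the injected tone almost perfectly, which is the core of the attack: anything the cord picks up and the front-end demodulates is indistinguishable from real microphone audio to the software above it.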
Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, direct the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.
“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers write in a paper published by the IEEE.
The silent radio wave hack does have some serious limitations. It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don’t have Google Now enabled from their lock screen, or have it set to respond only when it recognizes the user’s voice. iPhones have Siri enabled from the lock screen by default, but the new version of Siri for the iPhone 6s verifies the owner’s voice just as Google Now does. Another limitation is that attentive victims would likely notice that the phone was receiving mysterious voice commands and cancel them before the mischief was complete.
Here’s a video uploaded by Wired showing the attack in action. In the demo, the researchers commandeer Google Now via radio on an Android smartphone and force the phone’s browser to visit the ANSSI website.
According to the researchers, a hacker could hide the radio device inside a backpack in a crowded area and use it to transmit voice commands to all the surrounding phones, many of which might be vulnerable and hidden in victims’ pockets or purses.
“You could imagine a bar or an airport where there are lots of people,” — says Vincent Strubel, the director of their research group at ANSSI. “Sending out some electromagnetic waves could cause a lot of smartphones to call a paid number and generate cash.”
Leaving Siri or Google Now enabled on a phone’s lock screen already represents a security risk. The radio attack extends the range and stealth of that intrusion, making it all the more important for users to disable the voice command functions from their lock screen.
“To use a phone’s keyboard you need to enter a PIN code. But the voice interface is listening all the time with no authentication,” — says Strubel. “That’s the main issue here and the goal of this paper: to point out these failings in the security model.”
The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended other fixes: better shielding on headphone cords, for instance, would force attackers to use a higher-power radio signal, or an electromagnetic sensor in the phone could block the attack. But they note that the attack could also be prevented in software, by letting users create their own custom “wake” words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands.
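The software fixes the researchers suggest can be sketched as a simple gate in front of the voice assistant. All names and thresholds below are hypothetical, purely to illustrate the idea of combining a user-chosen wake phrase with a speaker-verification score; neither Apple’s nor Google’s actual implementation is shown:

```python
from dataclasses import dataclass

@dataclass
class VoiceGate:
    """Toy gate for the proposed software mitigations: a user-defined
    wake phrase plus a speaker-verification check."""
    wake_phrase: str          # user-chosen, so an attacker can't guess it
    min_speaker_score: float  # threshold for voice-match confidence

    def should_execute(self, transcript: str, speaker_score: float) -> bool:
        # Reject any command that lacks the custom wake phrase.
        if not transcript.lower().startswith(self.wake_phrase.lower()):
            return False
        # Reject voices that don't match the enrolled owner.
        return speaker_score >= self.min_speaker_score

gate = VoiceGate(wake_phrase="okay banana", min_speaker_score=0.8)

# An injected stock "OK Google"-style command fails the custom wake phrase...
print(gate.should_execute("ok google, open evil.example", speaker_score=0.9))  # False
# ...and even the right phrase fails if the voice doesn't match the owner.
print(gate.should_execute("okay banana, call mom", speaker_score=0.3))   # False
print(gate.should_execute("okay banana, call mom", speaker_score=0.95))  # True
```

An attacker injecting audio over radio would need to know the victim’s personal wake phrase and defeat the voice match, which is exactly why the researchers suggest both measures.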