Virtual voice assistants such as Siri and Google Now detect keywords in your questions in order to understand them and offer the service you need. They also have access to most of the tools built into your phone: Siri, for example, can search your contact list and tell you where your friends are at any given moment, and both Siri and Google Now can place calls or send messages on a simple, direct command.
But what would happen if you were not the only one who could give those commands? What if someone else could send orders to your phone remotely, without uttering a word?
A group of researchers at ANSSI, France's national agency for information systems security, has discovered that these voice assistants can be hijacked by outside parties. They have devised a method for sending them commands from a distance of up to 10 meters.
To carry out their tests, the researchers used radio waves to communicate with the voice assistants without making a sound. The only requirement on the victim's side is a pair of headphones with a built-in microphone plugged into the phone.
For short distances (around two meters), the tools needed are even simpler: the group used an open-source software toolkit called GNU Radio, a USRP software-defined radio, an antenna, and a signal amplifier.
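For illustration only, a transmit chain of this general kind can be sketched with GNU Radio's Python API. The sketch below is a generic AM transmitter flowgraph, not the ANSSI team's actual code: the file name command.wav, the carrier frequency, the sample rates, and the gain are all placeholder assumptions, and the real modulation parameters used in the study are not described in this article.

```python
# Minimal sketch of an AM transmit flowgraph in GNU Radio (Python API).
# Assumptions: a pre-recorded voice command in "command.wav" (hypothetical
# file), a USRP as the transmitter, and placeholder RF parameters.
from gnuradio import gr, blocks, filter, uhd


class am_voice_tx(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self, "AM voice injection sketch")
        audio_rate = 48_000      # sample rate of the WAV file (assumed)
        rf_rate = 480_000        # USRP sample rate, 10x the audio rate
        carrier_freq = 100e6     # placeholder carrier frequency

        # Pre-recorded voice command (hypothetical file name)
        src = blocks.wavfile_source("command.wav", repeat=False)
        # Classic AM: scale the audio by a modulation index, then add a
        # DC offset so the signal becomes (1 + m * audio)
        mod_index = blocks.multiply_const_ff(0.5)
        dc_offset = blocks.add_const_ff(1.0)
        # Interpolate the audio up to the USRP's sample rate
        resamp = filter.rational_resampler_fff(
            interpolation=rf_rate // audio_rate, decimation=1)
        # The USRP sink expects complex baseband samples
        f2c = blocks.float_to_complex()
        sink = uhd.usrp_sink(
            ",".join(("", "")),
            uhd.stream_args(cpu_format="fc32", channels=[0]))
        sink.set_samp_rate(rf_rate)
        sink.set_center_freq(carrier_freq)
        sink.set_gain(60)        # placeholder gain

        self.connect(src, mod_index, dc_offset, resamp, f2c, sink)


if __name__ == "__main__":
    am_voice_tx().run()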
```

In the setup the researchers describe, the output of a chain like this is fed through the amplifier and antenna so that a nearby phone's headphone cable picks up the modulated signal.

[Video: http://www.youtube.com/watch?v=l9txd4a4tUE]
The headphone cable serves as an antenna (it is for the same reason that, on cellphones with a built-in FM radio, you have to plug in headphones before you can listen) and converts the attacker's electromagnetic waves into electrical signals.
Once that signal reaches the phone, it looks like audio coming from the microphone: the operating system treats it as such and passes the instructions on to Siri or Google Now.
In this way, cybercriminals can make the assistant place calls, send text messages, or even dial their own number, turning the device into a listening tool. What's more, they could point the web browser at a page loaded with malware, or send spam and phishing messages through the victim's email, Facebook, or Twitter accounts.
“The possibility of sending signals to devices that accept voice commands could provoke an increase in attacks,” stated the authors of the study, which was published by the IEEE.
Anything a user can do with a voice command is an opening for cybercriminals, who could communicate with several devices at once. In public spaces such as airports, the scale of an attack could be enormous.
This strategy, however, is not without limitations. Many Android phones do not make Google Now available from the lock screen, or are configured to respond only to their owner's voice. And although Siri can be accessed from the lock screen, the latest version (on the iPhone 6s) can also be configured to recognize only one voice: the user's.