Excuse me, but is that really you? The risks of voice-activated tech

Just when we were all getting comfy with Siri and the gang, using these voice-activated gods and goddesses to fulfill our every need (or at least the little needs, like knowing what time it is), we get word from a bunch of uptight, glass-half-empty researchers at security firm AVG that these disembodied virtual assistants may actually pose a security risk.

One expert quoted by the BBC thinks developers need to give Siri and her ilk more smarts so they can tell if it’s really you, their master, they’re talking to, or some interloper barking mischievous commands at the VA behind that home screen.

An expert at security firm AVG found some voice-activated systems responded just as well to fake voices as they did to that of the owner.

Clever fraudsters could subvert this to send bogus messages or compromise gadgets in the future, said AVG.

Problems with voice-activated systems were found by Yuval Ben-Itzhak, chief technology officer at anti-virus firm AVG, who managed to turn on and control a smart TV using a synthesised voice. The attack worked, he said, because the gadget did nothing to check who was speaking.

Voice-activated functions on Apple and Android smartphones were also vulnerable to the same attack, he found. In one demonstration, he used the synthesised voice to send a bogus message via an Android smartphone telling everyone in the device’s contacts book that a company was going out of business.

The bigger problem, says Ben-Itzhak, is that in the future, as these voice-activated systems become more sophisticated and more widely adopted, someone impersonating your voice could wreak all kinds of havoc! Even kids could hack a workaround to misbehave with Siri:

Mr Ben-Itzhak also wondered if children could exploit the flaw and use it to turn off safety features that stop them seeing or using inappropriate content.

In the future, when homes and offices are peppered with more and more devices that are controlled via voice, attackers might well be tempted to abuse them, he warned.

“Utilising voice activation technology in the Internet of Things without authenticating the source of the voice is like leaving your computer without a password, everyone can use it and send commands,” he wrote in a blog about the research.
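Ben-Itzhak's password analogy can be sketched in a few lines of Python. This is a toy illustration, not any real assistant's code: all class and function names below are invented, and the "speaker" label stands in for the voiceprint matching a real biometric system would perform. A gadget that obeys any voice it hears serves the attacker just as readily as the owner, while even a crude speaker check refuses commands from unrecognised voices.

```python
# Toy sketch (hypothetical names throughout) of the flaw AVG describes:
# a gadget that acts on any voice command vs. one that checks the speaker first.

class NaiveAssistant:
    """Executes whatever it hears, with no check on who is speaking."""
    def handle(self, command, speaker):
        # An attacker's synthesised voice works exactly as well as the owner's.
        return f"executing: {command}"

class AuthenticatingAssistant:
    """Stand-in for voice biometrics: only enrolled speakers are obeyed."""
    def __init__(self, enrolled_speakers):
        self.enrolled = set(enrolled_speakers)

    def handle(self, command, speaker):
        # A real system would match the audio against a stored voiceprint;
        # here a simple membership test plays that role.
        if speaker not in self.enrolled:
            return "rejected: unrecognised voice"
        return f"executing: {command}"

naive = NaiveAssistant()
secure = AuthenticatingAssistant(enrolled_speakers={"owner"})

# The naive gadget obeys the attacker, mirroring AVG's smart-TV demo.
print(naive.handle("send message to all contacts", speaker="attacker"))
print(secure.handle("send message to all contacts", speaker="attacker"))
print(secure.handle("what time is it", speaker="owner"))
```

The point of the sketch is only that the authentication step has to exist somewhere in the pipeline; without it, the microphone is effectively a passwordless login prompt.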

Photo of Siri icon by Gary Reyes/Mercury News archives


  • Voice rec has little to do with it. If someone has your phone and gets past your password, then it doesn’t matter what input method they use, keyboard or voice. This is a silly alarmist PR message.
