

By Pat May, business reporter, San Jose Mercury News

Just when we were all getting comfy with Siri and the gang, using these voice-activated gods and goddesses to fulfill our every need, or at least the little needs like knowing what time it is, we get word from a bunch of uptight, glass-half-empty researchers at security firm AVG that these disembodied virtual assistants may actually pose a security risk.

Apparently, one expert quoted by the BBC thinks developers need to give Siri and her ilk more smarts so they can tell if it's really you, their master, they're talking to, or some interloper barking mischievous commands at the VA behind that home screen.


An expert at security firm AVG found some voice-activated systems responded just as well to fake voices as they did to that of the owner.

Clever fraudsters could subvert this to send bogus messages or compromise gadgets in the future, said AVG.

Problems with voice-activated systems were found by Yuval Ben-Itzhak, chief technology officer at anti-virus firm AVG, who managed to turn on and control a smart TV using a synthesised voice. The attack worked, he said, because the gadget did nothing to check who was speaking.

Voice-activated functions on Apple and Android smartphones were also vulnerable to the same attack, he said. In one demonstration, he used the synthesised voice to send a bogus message via an Android smartphone telling everyone in the device's contacts book that a company was going out of business.

The bigger problem, says Ben-Itzhak, is that in the future, as these voice-activated systems become more sophisticated and more widely adopted, all kinds of havoc could be wreaked by someone impersonating your voice. Even kids could hack a workaround to behave badly with Siri:


Mr Ben-Itzhak also wondered if children could exploit the flaw and use it to turn off safety features that stop them seeing or using inappropriate content.

In the future, when homes and offices are peppered with more and more devices that are controlled via voice, attackers might well be tempted to abuse them, he warned.

Utilising voice-activation technology in the Internet of Things without authenticating the source of the voice is like leaving your computer without a password; everyone can use it and send commands, he said.
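The missing check Ben-Itzhak describes can be sketched in a few lines. Everything below is purely illustrative and not from any real assistant's code: the "voiceprints" are toy vectors standing in for the embeddings a real speaker-recognition model would produce, and the `VoiceAssistant` class is a hypothetical name. The point is simply that the device refuses commands whose speaker doesn't match the enrolled owner, rather than obeying any voice it hears.

```python
import math

def cosine_similarity(a, b):
    """How alike two voiceprint vectors are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class VoiceAssistant:
    """Hypothetical assistant that gates every command on speaker identity."""

    def __init__(self, owner_voiceprint, threshold=0.95):
        self.owner_voiceprint = owner_voiceprint  # enrolled at setup time
        self.threshold = threshold                # how strict the match must be

    def handle_command(self, voiceprint, command):
        # Reject the command unless the incoming voiceprint is close
        # enough to the enrolled owner's -- the check the demoed
        # smart TV and phones skipped entirely.
        if cosine_similarity(voiceprint, self.owner_voiceprint) < self.threshold:
            return "rejected: unrecognised speaker"
        return f"executing: {command}"

# Toy embeddings for demonstration only.
owner = [0.9, 0.1, 0.3]        # the owner's enrolled voiceprint
synthesised = [0.1, 0.9, 0.2]  # an attacker's synthesised voice

assistant = VoiceAssistant(owner)
print(assistant.handle_command(owner, "send message"))        # accepted
print(assistant.handle_command(synthesised, "send message"))  # rejected
```

Without the threshold check, both calls would execute; with it, the synthesised voice is turned away, which is the "password" Ben-Itzhak says these devices currently lack.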
