
Could hackers trick voice assistants into committing fraud? Researchers say yes.

Posted on February 18, 2019
by l33tdawg
Venture Beat

Voice assistant technology is supposed to make our lives easier, but security experts say it comes with some uniquely invasive risks. Since the beginning of the year, multiple Nest security camera users have reported strangers hacking into their devices and issuing voice commands to Alexa, falsely announcing a North Korean missile attack, and targeting one family by speaking directly to their child, turning their home thermostat up to 90 degrees, and shouting insults. These incidents are alarming, but the potential for silent compromises of voice assistants could be even more damaging.

Nest owner Google — which recently integrated Google Assistant support into Nest control hubs — has blamed weak user passwords and a lack of two-factor authentication for the attacks. But even voice assistants with strong security may be vulnerable to stealthier forms of hacking. Over the past couple of years, researchers at universities in the US, China, and Germany have successfully used hidden audio files to make AI-powered voice assistants like Siri and Alexa follow their commands.
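The hidden-audio attacks the researchers demonstrated rely on perturbing ordinary audio so that a human hears nothing unusual while a speech recognizer decodes a command. The toy sketch below (an illustration only, not any research team's actual method, which typically uses gradient-based optimization against the target model) shows the basic intuition: a "command" signal can ride inside louder audio at an amplitude tens of decibels below the audible carrier.

```python
import math

SAMPLE_RATE = 16_000  # assumed sample rate in Hz for this illustration


def tone(freq_hz: float, seconds: float, amplitude: float) -> list[float]:
    """Generate a pure sine tone as a list of float samples."""
    n = int(SAMPLE_RATE * seconds)
    return [
        amplitude * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
        for t in range(n)
    ]


# "Carrier": loud, audible audio (a 440 Hz tone standing in for music or speech).
carrier = tone(440, 0.5, amplitude=0.8)

# "Hidden command": a far quieter perturbation a listener is unlikely to notice,
# but which could still influence a speech model's input features.
perturbation = tone(3_000, 0.5, amplitude=0.01)

# The adversarial audio is simply the sum of the two signals.
adversarial = [c + p for c, p in zip(carrier, perturbation)]

# The perturbation sits roughly 38 dB below the carrier.
ratio_db = 20 * math.log10(0.8 / 0.01)
print(f"carrier-to-perturbation ratio: {ratio_db:.1f} dB")  # → 38.1 dB
```

In the published attacks, the perturbation is not a fixed tone but is optimized specifically so the victim model transcribes an attacker-chosen phrase; the amplitude disparity above is only a stand-in for that imperceptibility constraint.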
