Study finds hackers can control Siri, Alexa and Google Assistant using inaudible commands
Researchers have developed a way to hijack popular voice assistants, including Apple's Siri, Google Assistant and Amazon's Alexa, right under the user's nose. All it takes is slipping secret commands into music playing on the radio, YouTube videos or white noise to control a smart speaker. The commands are inaudible to the human ear, so there is little the device owner can do to stop them.

Luckily, the disconcerting vulnerability was only demonstrated as part of the study, which was conducted by researchers from the University of California, Berkeley. But it still highlights a critical flaw that experts warn could be exploited for far more nefarious purposes, such as unlocking doors, wiring money or purchasing items online, according to the New York Times.
May-10-2018, 22:20:09 GMT