Hackers Can Use Ultrasonic Waves to Secretly Control Voice Assistant Devices

#artificialintelligence

Researchers have discovered a new way to target voice-controlled devices: propagating ultrasonic waves through solid materials to interact with and compromise them using inaudible voice commands, without the victims' knowledge. Called "SurfingAttack," the attack leverages the unique properties of acoustic transmission in solid materials such as tables to "enable multiple rounds of interactions between the voice-controlled device and the attacker over a longer distance and without the need to be in line-of-sight." This makes it possible, the researchers explain in the paper, for an attacker to interact with the devices through their voice assistants, hijack SMS two-factor authentication codes, and even place fraudulent calls, all while controlling the victim device inconspicuously. The research was published by a group of academics from Michigan State University, Washington University in St. Louis, the Chinese Academy of Sciences, and the University of Nebraska-Lincoln. The results were presented at the Network and Distributed System Security Symposium (NDSS) on February 24 in San Diego.


Amazon Alexa, Apple's Siri and Google Assistant can be hacked using lasers, experts warn

FOX News

Voice assistants such as Amazon's Alexa, Apple's Siri and Google Assistant can be hacked by shining a laser on the devices' microphones, according to an international team of researchers. Dubbed "Light Commands," the hack "allows attackers to remotely inject inaudible and invisible commands into voice assistants," according to a statement from experts at the University of Electro-Communications in Tokyo and the University of Michigan. By targeting the MEMS (micro-electro-mechanical systems) microphones with lasers, the researchers say they were able to make the microphones respond to light as if it were sound. "Exploiting this effect, we can inject sound into microphones by simply modulating the amplitude of a laser light," they wrote in the research paper.
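
The researchers' one-line description above (modulating the amplitude of a laser) is the whole trick in principle. As a rough, purely illustrative sketch of that idea, the Python snippet below maps an audio waveform onto a non-negative laser-intensity level by riding the signal on a DC bias; the function name, bias point, modulation depth and sample rate are assumptions for illustration, not values taken from the Light Commands paper.

# Illustrative sketch only: encode an audio waveform as a normalized
# laser-intensity (drive) signal via amplitude modulation around a DC bias,
# the general principle the Light Commands researchers describe.
# All numeric values are assumptions, not parameters from the paper.
import numpy as np

SAMPLE_RATE = 48_000   # Hz, assumed rate of the signal generator driving the laser
DURATION = 1.0         # seconds of signal to synthesize

def audio_to_laser_drive(audio, bias=0.5, depth=0.4):
    """Map audio in [-1, 1] to a drive level in [0, 1].

    Light intensity cannot go negative, so the audio rides on a DC bias:
    drive(t) = bias + depth * audio(t), clipped to the valid range.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return np.clip(bias + depth * audio, 0.0, 1.0)

if __name__ == "__main__":
    # Stand-in "voice command": a 440 Hz test tone instead of recorded speech.
    t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
    tone = np.sin(2 * np.pi * 440.0 * t)
    drive = audio_to_laser_drive(tone)
    print(f"drive level range: {drive.min():.2f} .. {drive.max():.2f}")

In the researchers' demonstrations, a beam whose intensity varies like this is aimed at the microphone port, and the MEMS sensor responds to the fluctuating light much as it would to sound pressure.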


Researchers use laser to hack voice-activated devices like Amazon Echo

USATODAY - Tech Top Stories

In the scenario the researchers describe, a laser aimed at an Amazon Echo opens the garage door; a burglar slides in, uses another laser to have the Echo start the car, and drives off. Researchers from the University of Michigan have used lasers to exploit a wide variety of voice-activated devices, gaining access to everything from thermostats to garage door openers to front door locks. The researchers have communicated their findings to Amazon, Google and Apple, which are studying the research. Working with researchers from the University of Electro-Communications in Japan, the U-M team published a paper and a website detailing how the attack works, along with videos showing it in action.


Alexa, Cortana, Google, Siri user? Watch out for these inaudible command attacks

ZDNet

The attack works against a variety of hardware running the Alexa, Cortana, Google and Siri assistants by shifting human voice commands to an ultrasonic frequency above 20 kHz. Researchers have devised a method to give potentially harmful instructions to the most popular voice assistants using voice commands that can't be heard by humans. The researchers from Zhejiang University have validated that their so-called DolphinAttack works against a range of hardware with the Siri, Google, Cortana and Alexa assistants. They have also developed proof-of-concept attacks to illustrate how an attacker could exploit inaudible voice commands, including silently instructing Siri to make a FaceTime call on an iPhone, telling Google to switch the phone into airplane mode, and even manipulating the navigation system in an Audi. A portable version of the attack is also extremely cheap, requiring only an amplifier, an ultrasonic transducer and a battery that together cost just $3.
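
At the signal level, the description above reduces to amplitude-modulating the audible command onto a carrier above the roughly 20 kHz limit of human hearing; nonlinearities in the microphone's capture chain then recover the audible baseband, which is how the assistant ends up "hearing" a command no human did. The Python snippet below sketches only that modulation step; the carrier frequency, sample rate, modulation depth and test tone are illustrative assumptions rather than parameters from the DolphinAttack paper.

# Illustrative sketch only: amplitude-modulate an audio signal onto an
# ultrasonic carrier above 20 kHz, the general idea behind inaudible
# voice-command attacks such as DolphinAttack.
# All numeric values are assumptions, not parameters from the paper.
import numpy as np

SAMPLE_RATE = 192_000   # Hz; must be well above twice the carrier frequency
CARRIER_HZ = 25_000     # Hz; above the ~20 kHz threshold of human hearing
DURATION = 1.0          # seconds

def am_modulate(audio, carrier_hz=CARRIER_HZ, depth=0.8, rate=SAMPLE_RATE):
    """Return a double-sideband AM signal: (1 + depth * audio) * carrier."""
    audio = np.clip(audio, -1.0, 1.0)
    t = np.arange(len(audio)) / rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return (1.0 + depth * audio) * carrier

if __name__ == "__main__":
    # Stand-in "voice command": a 300 Hz test tone instead of recorded speech.
    t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
    baseband = np.sin(2 * np.pi * 300.0 * t)
    ultrasonic = am_modulate(baseband)
    print(f"samples: {len(ultrasonic)}, peak amplitude: {np.abs(ultrasonic).max():.2f}")

Played through an ultrasonic transducer, a signal like this is silent to anyone nearby, which is why the $3 kit described above (amplifier, transducer and battery) is enough to deliver it.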


Laser can be used to simulate a human voice and hack into Google Home and other smart devices

Daily Mail - Science & tech

A group of researchers has published results from a shocking experiment that shows how voice-controlled smart devices can be operated remotely using targeted laser beams to simulate human speech. The researchers announced Monday that they were able to control a Google Home and command it to remotely open the garage door from a separate building 230 feet away. Also susceptible were Amazon's Echo, Facebook Portal, a range of Android smartphones and tablets, and both iPhones and iPads. The experiments were conducted by a group of scientists from the University of Michigan and the University of Electro-Communications in Tokyo. 'It's possible to make microphones respond to light as if it were sound,' Takeshi Sugawara, of the University of Electro-Communications in Tokyo, told Wired.