Amazon Alexa, Apple's Siri and Google Assistant can be hacked using lasers, experts warn

FOX News

Voice assistants such as Amazon's Alexa, Apple's Siri and Google Assistant can be hacked by shining a laser on the devices' microphones, according to an international team of researchers. Dubbed "Light Commands," the hack "allows attackers to remotely inject inaudible and invisible commands into voice assistants," according to a statement from experts at the University of Electro-Communications in Tokyo and the University of Michigan. By targeting MEMS (micro-electro-mechanical systems) microphones with lasers, the researchers say they were able to make the microphones respond to light as if it were sound. "Exploiting this effect, we can inject sound into microphones by simply modulating the amplitude of a laser light," they wrote in the research paper.
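The quoted description amounts to ordinary amplitude modulation: the attacker's audio command is encoded as small variations in the laser's optical power around a constant bias, and the microphone diaphragm picks up those variations roughly as it would sound pressure. The sketch below illustrates that mapping in Python; the function name and the bias and depth parameters are illustrative assumptions, not values from the Light Commands paper.

```python
# Hypothetical sketch of the amplitude-modulation principle described above:
# an audio command waveform is encoded as variations in laser intensity,
# which a MEMS microphone then transduces as if it were sound.
# bias_mw and depth are assumed illustrative values, not figures from the paper.
import numpy as np

def audio_to_laser_intensity(audio, bias_mw=5.0, depth=0.8):
    """Map an audio waveform in [-1, 1] to a non-negative laser power trace (mW).

    The command rides on a constant DC bias so the modulated intensity
    never goes negative.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    power = bias_mw * (1.0 + depth * audio)   # amplitude modulation around the bias
    return np.clip(power, 0.0, None)

# Example: a 1 kHz test tone sampled at 48 kHz standing in for a spoken command.
fs = 48_000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1_000 * t)
laser_trace = audio_to_laser_intensity(tone)
```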


Researchers use laser to hack voice-activated devices like Amazon Echo

USATODAY - Tech Top Stories

Suddenly, the garage door opens, a burglar slides in, uses another laser to have the Echo start the car, and drives off. Researchers from the University of Michigan have used lasers to exploit a wide variety of voice-activated devices, giving them access to everything from thermostats to garage door openers to front door locks. The researchers have communicated their findings to Amazon, Google and Apple, which are studying the research. Working with researchers from the University of Electro-Communications in Japan, U-M's researchers published a paper and a website detailing how the attack works. There are also videos showing it in action.


Alexa, Cortana, Google, Siri user? Watch out for these inaudible command attacks

ZDNet

The attack works against a variety of hardware running the Alexa, Cortana, Google, and Siri assistants by shifting human voice commands to an ultrasonic frequency above 20 kHz. Researchers have devised a method to give potentially harmful instructions to the most popular voice assistants using voice commands that can't be heard by humans. The researchers from Zhejiang University have validated that their so-called DolphinAttack works against a variety of hardware running the Siri, Google, Cortana and Alexa assistants. They have also developed proof-of-concept attacks to illustrate how an attacker could exploit inaudible voice commands, including silently instructing Siri to make a FaceTime call on an iPhone, telling Google to switch the phone into airplane mode, and even manipulating the navigation system in an Audi. A portable attack rig they devised is also extremely cheap, requiring an amplifier, an ultrasonic transducer and a battery that cost just $3.
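The "inaudible" trick is, at its core, standard amplitude modulation onto an ultrasonic carrier: the recorded command becomes sidebands around a tone above 20 kHz, and nonlinearity in the target microphone's amplifier and ADC chain demodulates it back into the audible band, where the assistant interprets it as speech. A minimal Python sketch of that modulation step follows; the 25 kHz carrier, 96 kHz sample rate, and function name are illustrative assumptions rather than the researchers' exact parameters.

```python
# Hypothetical sketch of the modulation step behind an inaudible-command attack:
# a baseband voice command is amplitude-modulated onto an ultrasonic carrier
# (25 kHz here, an assumed value above the ~20 kHz limit of human hearing).
import numpy as np

def modulate_ultrasonic(voice, fs, carrier_hz=25_000.0, depth=1.0):
    """Amplitude-modulate a voice waveform in [-1, 1] onto an ultrasonic carrier."""
    voice = np.clip(np.asarray(voice, dtype=float), -1.0, 1.0)
    t = np.arange(voice.size) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    return (1.0 + depth * voice) * carrier     # classic AM: carrier plus sidebands

# Example: a synthetic 300 Hz tone standing in for a recorded command,
# sampled well above twice the carrier frequency to avoid aliasing.
fs = 96_000
t = np.arange(fs) / fs
command = 0.5 * np.sin(2 * np.pi * 300 * t)
ultrasonic = modulate_ultrasonic(command, fs)
```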


Laser can be used to simulate a human voice and hack into Google Home and other smart devices

Daily Mail - Science & tech

A group of researchers has published results from a shocking experiment that shows how voice-controlled smart devices can be operated remotely using targeted laser beams to simulate human speech. The researchers announced Monday that they were able to control a Google Home and command it to remotely open the garage door from a separate building 230 feet away. Also susceptible were Amazon's Echo, Facebook Portal, a range of Android smartphones and tablets, and both iPhones and iPads. The experiments were conducted by a group of scientists from the University of Michigan and The University of Electro-Communications in Tokyo. 'It's possible to make microphones respond to light as if it were sound,' Takeshi Sugawara, of the University of Electro-Communications in Tokyo, told Wired.


Could hackers trick voice assistants into committing fraud? Researchers say yes.

#artificialintelligence

Voice assistant technology is supposed to make our lives easier, but security experts say it comes with some uniquely invasive risks. Since the beginning of the year, multiple Nest security camera users have reported strangers hacking in and issuing voice commands to Alexa, falsely announcing a North Korean missile attack, and, in one case, targeting a family by speaking directly to their child, turning the home thermostat up to 90 degrees, and shouting insults. These incidents are alarming, but the potential for silent compromises of voice assistants could be even more damaging. Nest owner Google, which recently integrated Google Assistant support into Nest control hubs, has blamed weak user passwords and a lack of two-factor authentication for the attacks. But even voice assistants with strong security may be vulnerable to stealthier forms of hacking.