LightCommands Audio Injection Attacks Threaten Voice Assistants


Researchers have devised a new attack against smart assistants that threatens any device featuring a voice interface. Dubbed 'LightCommands', the attack enables a potential attacker to inject voice commands into a device by shining modulated light at its microphone, and thereby take control of it.
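In essence, the attack amplitude-modulates a recorded voice command onto a light source's intensity, so the microphone's MEMS diaphragm registers the fluctuating light as if it were sound. A minimal sketch of that modulation step, assuming a normalized audio waveform (the function name and parameters are illustrative, not taken from the published tooling):

```python
import math

SAMPLE_RATE = 44_100  # samples per second

def amplitude_modulate(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] onto a light-intensity
    signal in [0, 1]: the light source's output power tracks the
    audio, and the microphone responds to the varying intensity
    as if it were acoustic pressure."""
    if depth + abs(bias - 0.5) > 0.5:
        raise ValueError("modulation depth would clip below zero intensity")
    return [bias + depth * sample for sample in audio]

# A 1 kHz test tone standing in for a recorded voice command (10 ms).
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE)
        for n in range(SAMPLE_RATE // 100)]

# Intensity signal that a light driver would emit toward the microphone.
light = amplitude_modulate(tone)
```

Intensity can never be negative, which is why the audio rides on a constant bias rather than being emitted directly; the real attack additionally needs a driver circuit and precise aiming, which this sketch omits.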