LightCommands Audio Injection Attacks Threaten Voice Assistants


Researchers have developed a new attack strategy against smart assistants that threatens any device featuring a voice assistant. Dubbed 'LightCommands', the attack enables a potential attacker to inject voice commands into such devices and take control of them.
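The core idea behind light-based audio injection is to amplitude-modulate a voice command onto the intensity of a light beam aimed at the device's microphone, which responds to the modulated light as if it were sound. The sketch below illustrates only the modulation step; the function and parameter names are illustrative, not the researchers' actual tooling.

```python
import numpy as np

def am_modulate(audio, dc_bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] to a light-intensity
    drive signal in [0, 1] (a light source cannot emit negative power)."""
    audio = np.clip(audio, -1.0, 1.0)
    intensity = dc_bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)

# Example: a 1 kHz test tone sampled at 44.1 kHz
fs = 44_100
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
drive = am_modulate(tone)
```

The DC bias keeps the drive signal positive while the modulation depth carries the audio waveform, so the microphone's diaphragm reproduces the original command.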
