Laser-Based Audio Injection on Voice-Controllable Systems

Light Commands is a vulnerability in MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri, using light.

In our paper we demonstrate this effect, successfully using light to inject malicious commands into several voice-controlled devices, such as smart speakers, tablets, and phones, across large distances and through glass windows.
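At a high level, the injection works by amplitude-modulating a laser's intensity with the desired audio waveform; the microphone then responds to the fluctuating light as if it were sound. The sketch below illustrates only the modulation step, mapping a normalized audio signal onto a laser-diode drive current. The bias and modulation-depth values are hypothetical placeholders, not figures from our experimental setup:

```python
import math

def am_laser_drive(audio, bias_ma=200.0, depth_ma=150.0):
    """Amplitude-modulate a laser diode's drive current with an audio signal.

    `audio` is an iterable of samples; the result is a per-sample drive
    current in milliamps, swinging around `bias_ma` by up to `depth_ma`.
    Keeping depth_ma < bias_ma ensures the current never goes negative,
    so the laser stays on throughout the modulation.
    (bias_ma and depth_ma are illustrative values, not from the paper.)
    """
    samples = list(audio)
    # Normalize to [-1, 1] so the full modulation depth is used.
    peak = max((abs(s) for s in samples), default=1.0) or 1.0
    return [bias_ma + depth_ma * (s / peak) for s in samples]

# Example: one second of a 1 kHz test tone sampled at 48 kHz.
fs = 48_000
tone = [math.sin(2 * math.pi * 1_000 * n / fs) for n in range(fs)]
drive = am_laser_drive(tone)
```

In a real attack chain this drive signal would feed a laser-diode driver aimed at the target microphone; here it only demonstrates the intensity-modulation principle.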

The implications of injecting unauthorized voice commands vary in severity depending on the type of commands that can be executed through voice. As an example, in our paper we show how an attacker can use light-injected voice commands to unlock the victim's smart-lock-protected home doors, or even locate, unlock, and start various vehicles.

Citation
@inproceedings{Sugawara2020LightCommands,
    author = {Sugawara, Takeshi and Cyr, Benjamin and Rampazzi, Sara and Genkin, Daniel and Fu, Kevin},
    title = {Light Commands: Laser-Based Audio Injection on Voice-Controllable Systems},
    year = {2019}
}


Acknowledgments

This research was funded by JSPS KAKENHI Grant Numbers JP18K18047 and JP18KK0312; by the Defense Advanced Research Projects Agency (DARPA) under contract FA8750-19-C-0531; by gifts from Intel, AMD, and Analog Devices; by an award from MCity at the University of Michigan; and by the National Science Foundation under grant CNS-1330142.