DolphinAttack: Inaudible Voice Commands
Title | DolphinAttack: Inaudible Voice Commands |
Publication Type | Conference Paper |
Year of Publication | 2017 |
Authors | Zhang, Guoming; Yan, Chen; Ji, Xiaoyu; Zhang, Tianchen; Zhang, Taimin; Xu, Wenyuan |
Conference Name | Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security |
Publisher | ACM |
Conference Location | New York, NY, USA |
ISBN Number | 978-1-4503-4946-8 |
Keywords | command injection attacks, defense, MEMS microphones, security analysis, speech recognition, voice controllable systems |
Abstract | Speech recognition (SR) systems such as Siri or Google Now have become an increasingly popular human-computer interaction method, and have turned various systems into voice controllable systems (VCS). Prior work on attacking VCS shows that hidden voice commands that are incomprehensible to people can control the systems. Hidden voice commands, though "hidden", are nonetheless audible. In this work, we design a totally inaudible attack, DolphinAttack, that modulates voice commands on ultrasonic carriers (e.g., f > 20 kHz) to achieve inaudibility. By leveraging the nonlinearity of the microphone circuits, the modulated low-frequency audio commands can be successfully demodulated, recovered, and more importantly interpreted by the speech recognition systems. We validated DolphinAttack on popular speech recognition systems, including Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Alexa. By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to airplane mode, and even manipulating the navigation system in an Audi automobile. We propose hardware and software defense solutions, and suggest re-designing voice controllable systems to be resilient to inaudible voice command attacks. |
URL | https://dl.acm.org/citation.cfm?doid=3133956.3134052 |
DOI | 10.1145/3133956.3134052 |
Citation Key | zhang_dolphinattack:_2017 |
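The attack summarized in the abstract rests on amplitude-modulating a baseband voice command onto an ultrasonic carrier and letting the nonlinearity of the microphone circuitry demodulate it back to baseband. The sketch below is a minimal, hypothetical illustration of that principle in Python using only NumPy; the sample rate, carrier frequency, modulation depth, and square-law coefficient are assumptions chosen for demonstration, not parameters taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' implementation): amplitude-modulate a
# baseband "voice command" onto an ultrasonic carrier, then model a square-law
# microphone nonlinearity to show how a low-frequency copy of the command
# reappears after low-pass filtering.
fs = 192_000          # assumed sample rate, high enough to represent ultrasound
fc = 25_000           # assumed ultrasonic carrier frequency (> 20 kHz)
t = np.arange(0, 1.0, 1 / fs)

# Stand-in for a voice command: a 1 kHz baseband tone.
m = np.sin(2 * np.pi * 1_000 * t)

# Double-sideband AM with carrier: all transmitted energy sits near 25 kHz,
# so the signal is inaudible to humans but still reaches the microphone.
s = (1 + 0.8 * m) * np.cos(2 * np.pi * fc * t)

# Nonlinear front end modeled as y = s + alpha * s^2. The s^2 term contains
# a baseband copy of m, which survives the low-pass filtering that precedes
# the speech recognizer in a real device.
alpha = 0.1
y = s + alpha * s ** 2

# Crude low-pass filter via FFT masking, keeping components below 10 kHz.
Y = np.fft.rfft(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
Y[freqs > 10_000] = 0
recovered = np.fft.irfft(Y, n=len(y))

# The recovered signal correlates strongly with the original baseband command.
corr = np.corrcoef(m, recovered)[0, 1]
print(f"correlation with baseband command: {corr:.2f}")
```

Running this prints a correlation close to 1: although the transmitted signal contains only ultrasonic frequencies, the squared term of the nonlinearity restores the baseband command, which is the demodulation effect the paper exploits.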