DolphinAttack: Inaudible Voice Commands

Title: DolphinAttack: Inaudible Voice Commands
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Zhang, Guoming; Yan, Chen; Ji, Xiaoyu; Zhang, Tianchen; Zhang, Taimin; Xu, Wenyuan
Conference Name: Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4946-8
Keywords: command injection attacks, composability, defense, MEMS microphones, Metrics, pubcrawl, resilience, Resiliency, security analysis, Speech recognition, voice controllable systems
Abstract

Speech recognition (SR) systems such as Siri or Google Now have become an increasingly popular human-computer interaction method, and have turned various systems into voice controllable systems (VCS). Prior work on attacking VCS shows that hidden voice commands that are incomprehensible to people can control the systems. Hidden voice commands, though "hidden", are nonetheless audible. In this work, we design a totally inaudible attack, DolphinAttack, that modulates voice commands on ultrasonic carriers (e.g., f > 20 kHz) to achieve inaudibility. By leveraging the nonlinearity of the microphone circuits, the modulated low-frequency audio commands can be successfully demodulated, recovered, and more importantly interpreted by the speech recognition systems. We validated DolphinAttack on popular speech recognition systems, including Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Alexa. By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to airplane mode, and even manipulating the navigation system in an Audi automobile. We propose hardware and software defense solutions, and suggest re-designing voice controllable systems to be resilient to inaudible voice command attacks.
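The mechanism the abstract describes is amplitude modulation of a voice command onto an ultrasonic carrier, with the microphone's nonlinear front end acting as an unintended demodulator. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a hypothetical mono input file command.wav, an arbitrarily chosen 25 kHz carrier, a 192 kHz output rate, and a modulation depth of 0.8, none of which are parameters taken from the paper.

import numpy as np
from scipy.io import wavfile

fs_out = 192_000          # output sample rate high enough to represent ultrasound
fc = 25_000               # ultrasonic carrier frequency in Hz (> 20 kHz)

fs_in, voice = wavfile.read("command.wav")    # hypothetical mono voice command
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice)) or 1.0         # normalize to [-1, 1]

# Resample the baseband command to the output rate (simple linear interpolation).
t_in = np.arange(len(voice)) / fs_in
t_out = np.arange(0.0, t_in[-1], 1.0 / fs_out)
baseband = np.interp(t_out, t_in, voice)

# Standard AM with carrier: s(t) = (1 + m * x(t)) * cos(2*pi*fc*t).
# A square-law term in the microphone circuit produces a component proportional
# to x(t) at baseband, which is the demodulation effect the attack relies on.
m = 0.8                                       # modulation depth (assumed)
carrier = np.cos(2 * np.pi * fc * t_out)
modulated = (1 + m * baseband) * carrier
modulated /= np.max(np.abs(modulated))

wavfile.write("dolphin_modulated.wav", fs_out, (modulated * 32767).astype(np.int16))

Playing the resulting file requires a speaker and amplifier with ultrasonic response; ordinary audio hardware typically attenuates content above 20 kHz.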

URL: https://dl.acm.org/citation.cfm?doid=3133956.3134052
DOI: 10.1145/3133956.3134052
Citation Key: zhang_dolphinattack:_2017