Biblio

Filters: Author is Liu, Jian
2021-08-17
Liu, Jian, Chen, Yingying, Dong, Yudi, Wang, Yan, Zhao, Tianming, Yao, Yu-Dong.  2020.  Continuous User Verification via Respiratory Biometrics. IEEE INFOCOM 2020 - IEEE Conference on Computer Communications. :1–10.
The ever-growing security issues in mobile applications and smart devices create an urgent demand for a reliable and convenient user verification method. Traditional verification methods require users to provide their secrets actively (e.g., entering passwords or scanning fingerprints). We envision that the essential trend of user verification is to free users from active participation in the verification process. Toward this end, we propose a continuous user verification system that reuses the widely deployed WiFi infrastructure to capture the unique physiological characteristics rooted in a user's respiratory motions. Unlike existing continuous verification approaches, which depend on restricted scenarios or user behaviors (e.g., keystrokes and gaits), our system can be easily integrated into any WiFi infrastructure to provide non-intrusive continuous verification. Specifically, we extract respiration-related signals from the channel state information (CSI) of WiFi. We then derive user-specific respiratory features based on waveform morphology analysis and fuzzy wavelet transformation of the respiration signals. Additionally, a deep-learning-based user verification scheme is developed to identify legitimate users accurately and detect spoofing attacks. Extensive experiments involving 20 participants demonstrate that the proposed system can robustly verify/identify users and detect spoofers under various types of attacks.
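The core sensing step above, recovering a breathing rate from a periodic component buried in the WiFi CSI amplitude, can be sketched as a spectral-peak estimator. This is illustrative only, not the authors' pipeline: the synthetic `csi` stream, the 50 Hz sampling rate, and the 0.1-0.5 Hz respiration band are all assumptions.

```python
import numpy as np

def estimate_breathing_rate(csi_amplitude, fs, band=(0.1, 0.5)):
    """Estimate breathing rate (breaths/min) from a CSI amplitude stream
    by locating the dominant spectral peak inside the typical human
    respiration band (0.1-0.5 Hz, i.e. 6-30 breaths per minute)."""
    x = csi_amplitude - np.mean(csi_amplitude)          # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])   # respiration band only
    peak_freq = freqs[in_band][np.argmax(spectrum[in_band])]
    return peak_freq * 60.0                              # Hz -> breaths/min

# Synthetic CSI amplitude: a 0.25 Hz breathing component plus noise.
fs = 50.0                                # assumed 50 Hz CSI sampling rate
t = np.arange(0, 60, 1.0 / fs)           # one minute of data
rng = np.random.default_rng(0)
csi = 1.0 + 0.2 * np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.standard_normal(t.size)

print(round(estimate_breathing_rate(csi, fs)))  # prints 15 (breaths/min)
```

The paper's waveform-morphology and fuzzy-wavelet features would operate on this same respiration-band signal; the sketch stops at rate estimation.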
2020-09-04
Wu, Yi, Liu, Jian, Chen, Yingying, Cheng, Jerry.  2019.  Semi-black-box Attacks Against Speech Recognition Systems Using Adversarial Samples. 2019 IEEE International Symposium on Dynamic Spectrum Access Networks (DySPAN). :1–5.
As automatic speech recognition (ASR) systems have been integrated into a diverse set of devices around us in recent years, their security vulnerabilities have become an increasing concern for the public. Existing studies have demonstrated that deep neural networks (DNNs), the computation core of ASR systems, are vulnerable to deliberately designed adversarial attacks. Based on the gradient descent algorithm, existing studies have successfully generated adversarial samples that disturb ASR systems and produce the transcript texts expected by the adversary. Most of this research simulated white-box attacks, which require knowledge of all the components in the targeted ASR system. In this work, we propose the first semi-black-box attack against the ASR system Kaldi. Requiring only partial information from Kaldi and none from the DNN, we embed malicious commands into a single audio clip using a gradient-independent genetic algorithm. The crafted audio clip is recognized as the embedded malicious commands by Kaldi while remaining unnoticeable to humans. Experiments show that our attack achieves a high success rate with unnoticeable perturbations on three types of audio clips (pop music, pure music, and human commands) without access to the underlying DNN model parameters and architecture.
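The gradient-independent genetic algorithm the abstract describes can be sketched as a population search driven only by black-box scores. The `blackbox_loss` below is a hypothetical stand-in for querying the target recognizer; the population size, mutation scale, and target template are assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def blackbox_loss(audio):
    # Stand-in for querying the target recognizer: distance to a
    # hypothetical "malicious command" template. Only scores are
    # observed, no gradients -- the semi-black-box setting.
    target = np.linspace(-0.5, 0.5, audio.size)
    return float(np.mean((audio - target) ** 2))

def genetic_attack(clean, pop_size=30, generations=200, sigma=0.05, elite=5):
    """Gradient-free search for a perturbation of `clean` that lowers
    the black-box loss: selection, crossover, and mutation with elitism."""
    pop = clean + sigma * rng.standard_normal((pop_size, clean.size))
    for _ in range(generations):
        scores = np.array([blackbox_loss(p) for p in pop])
        parents = pop[np.argsort(scores)[:elite]]            # selection
        children = []
        for _ in range(pop_size - elite):
            a, b = parents[rng.integers(elite, size=2)]
            mask = rng.random(clean.size) < 0.5              # crossover
            child = np.where(mask, a, b)
            child += sigma * 0.1 * rng.standard_normal(clean.size)  # mutation
            children.append(child)
        pop = np.vstack([parents, children])                 # elitism
    return pop[np.argmin([blackbox_loss(p) for p in pop])]

clean = np.zeros(64)
adv = genetic_attack(clean)
print(blackbox_loss(adv) < blackbox_loss(clean))  # attack lowers the loss
```

In the real attack the loss would come from Kaldi's partial output scores and the perturbation budget would be tuned for imperceptibility; this sketch only shows the gradient-free search loop.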
2020-02-17
Wang, Chen, Liu, Jian, Guo, Xiaonan, Wang, Yan, Chen, Yingying.  2019.  WristSpy: Snooping Passcodes in Mobile Payment Using Wrist-worn Wearables. IEEE INFOCOM 2019 - IEEE Conference on Computer Communications. :2071–2079.
Mobile payment has drawn considerable attention due to its convenience of paying via personal mobile devices anytime and anywhere, and passcodes (i.e., PINs or patterns) are the first choice of most consumers to authorize the payment. This paper demonstrates a serious security breach and aims to raise the awareness of the public that the passcodes for authorizing transactions in mobile payments can be leaked by exploiting the embedded sensors in wearable devices (e.g., smartwatches). We present a passcode inference system, WristSpy, which examines to what extent the user's PIN/pattern during the mobile payment could be revealed from a single wrist-worn wearable device under different passcode input scenarios involving either two hands or a single hand. In particular, WristSpy has the capability to accurately reconstruct fine-grained hand movement trajectories and infer PINs/patterns when mobile and wearable devices are on two hands, through building a Euclidean-distance-based model and developing a training-free parallel PIN/pattern inference algorithm. When both devices are on the same single hand, a highly challenging case, WristSpy extracts multi-dimensional features by capturing the dynamics of minute hand vibrations and performs machine-learning-based classification to identify PIN entries. Extensive experiments with 15 volunteers and 1600 passcode inputs demonstrate that an adversary is able to recover a user's PIN/pattern with up to 92% success rate within 5 tries under various input scenarios.
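The training-free, Euclidean-distance-based matching can be illustrated with a toy keypad model: given reconstructed fingertip displacement vectors between consecutive taps, greedily pick the key whose offset from the current key best matches each vector. The keypad geometry, starting key, and noise levels below are hypothetical, not WristSpy's actual parameters.

```python
import numpy as np

# Hypothetical 3x4 phone keypad layout (key -> (x, y) position in cm).
KEYPAD = {str(d): ((d - 1) % 3 * 2.0, -((d - 1) // 3) * 2.0) for d in range(1, 10)}
KEYPAD["0"] = (2.0, -6.0)

def infer_pin(displacements, start_key="5"):
    """Greedy Euclidean-distance matching of inter-tap displacement
    vectors to keypad key transitions: a toy stand-in for WristSpy's
    training-free trajectory matching."""
    pin = [start_key]
    for d in displacements:
        cur = np.array(KEYPAD[pin[-1]])
        # pick the key whose position best matches the landing point
        best = min(KEYPAD, key=lambda k: np.linalg.norm(cur + d - KEYPAD[k]))
        pin.append(best)
    return "".join(pin)

# Simulated noisy displacements for the tap sequence 5 -> 2 -> 9 -> 0.
true_moves = [(0.1, 2.1), (1.9, -4.05), (-1.95, -2.0)]
print(infer_pin([np.array(m) for m in true_moves]))  # prints 5290
```

The hard part the paper solves, reconstructing those displacement vectors from raw wrist accelerometer and gyroscope data, is assumed away here.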
2019-01-16
Wang, Chen, Liu, Jian, Guo, Xiaonan, Wang, Yan, Chen, Yingying.  2018.  Inferring Mobile Payment Passcodes Leveraging Wearable Devices. Proceedings of the 24th Annual International Conference on Mobile Computing and Networking. :789–791.
Mobile payment has drawn considerable attention due to its convenience of paying via personal mobile devices anytime and anywhere, and passcodes (i.e., PINs) are the first choice of most consumers to authorize the payment. This work demonstrates a serious security breach and aims to raise the awareness of the public that the passcodes for authorizing transactions in mobile payments can be leaked by exploiting the embedded sensors in wearable devices (e.g., smartwatches). We present a passcode inference system, which examines to what extent the user's PIN during mobile payment could be revealed from a single wrist-worn wearable device under different input scenarios involving either two hands or a single hand. Extensive experiments with 15 volunteers demonstrate that an adversary is able to recover a user's PIN with a high success rate within 5 tries under various input scenarios.
Zhao, Tianming, Wang, Yan, Liu, Jian, Chen, Yingying.  2018.  Your Heart Won't Lie: PPG-based Continuous Authentication on Wrist-worn Wearable Devices. Proceedings of the 24th Annual International Conference on Mobile Computing and Networking. :783–785.
This paper presents a photoplethysmography (PPG)-based continuous user authentication (CA) system, which leverages the PPG sensors in wrist-worn wearable devices to identify users. We explore the uniqueness of the human cardiac system captured by PPG sensing technology. Existing CA systems require either dedicated sensing hardware or specific gestures, whereas our system requires no user interaction, only the wearable device, which is already pervasively equipped with PPG sensors. Notably, we design a robust motion artifact (MA) removal method to mitigate the impact of MA from wrist movements. Additionally, we explore characteristic fiducial features from PPG measurements to efficiently distinguish the human cardiac system. Furthermore, we develop a cardiac-based classifier for user identification using the Gradient Boosting Tree (GBT). Experiments with the prototype of the wrist-worn PPG sensing platform and 10 participants in different scenarios demonstrate that our system can effectively remove MA and achieve a high average authentication success rate of over 90%.
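Fiducial-feature extraction from a PPG pulse can be sketched as follows. The three features here (systolic peak amplitude, rise time, half-height pulse width) are an illustrative subset, not the paper's feature set, and the synthetic pulse shape is an assumption; in the full system such vectors would feed the GBT classifier.

```python
import numpy as np

def ppg_fiducial_features(pulse, fs):
    """Extract simple fiducial features from one PPG pulse cycle:
    systolic peak amplitude, foot-to-peak rise time, and pulse width
    at half height."""
    pulse = pulse - pulse.min()                # re-reference to the pulse foot
    peak = int(np.argmax(pulse))               # systolic peak index
    amp = float(pulse[peak])
    rise_time = peak / fs                      # foot-to-peak time (s)
    above = np.nonzero(pulse >= amp / 2)[0]    # samples above half height
    width = (above[-1] - above[0]) / fs
    return {"amplitude": round(amp, 3),
            "rise_time_s": round(rise_time, 3),
            "half_width_s": round(width, 3)}

# Synthetic one-second pulse: fast systolic upstroke, slower exponential decay.
fs = 100
t = np.linspace(0, 1, fs, endpoint=False)
pulse = np.where(t < 0.2, t / 0.2, np.exp(-(t - 0.2) / 0.25))
print(ppg_fiducial_features(pulse, fs))
```

Per-user differences in these timing and amplitude ratios are what a cardiac-based classifier would learn to separate; motion-artifact removal would run before this step.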
2018-06-07
Liu, Jian, Wang, Chen, Chen, Yingying, Saxena, Nitesh.  2017.  VibWrite: Towards Finger-input Authentication on Ubiquitous Surfaces via Physical Vibration. Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security. :73–87.
The goal of this work is to enable user authentication via finger inputs on ubiquitous surfaces leveraging low-cost physical vibration. We propose VibWrite, which extends finger-input authentication beyond touch screens to any solid surface for smart access systems (e.g., access to apartments, vehicles, or smart appliances). It integrates passcode, behavioral and physiological characteristics, and surface dependency to provide a low-cost, tangible, and enhanced security solution. VibWrite builds upon a touch sensing technique with vibration signals that can operate on surfaces constructed from a broad range of materials. It differs significantly from traditional password-based approaches, which only authenticate the password itself rather than the legitimate user, and from behavioral-biometrics-based solutions, which usually involve specific or expensive hardware (e.g., a touch screen or fingerprint reader), raise privacy concerns, and suffer from smudge attacks. VibWrite is based on new algorithms to discriminate fine-grained finger inputs and supports three independent passcode secrets, including PINs, lock patterns, and simple gestures, by extracting unique features in the frequency domain that capture both behavioral and physiological characteristics such as contact area and touching force. VibWrite is implemented using a single pair of a low-cost vibration motor and a receiver that can be easily attached to any surface (e.g., a door panel, a desk, or an appliance). Our extensive experiments demonstrate that VibWrite can authenticate users with high accuracy (e.g., over 95% within two trials), a low false positive rate (e.g., less than 3%), and robustness to various types of attacks.
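The frequency-domain finger-input features can be illustrated with a toy band-energy descriptor: normalize out touch force and compare touches by spectral shape. The band layout, sampling rate, and damped-sinusoid touch models below are assumptions for illustration, not VibWrite's actual feature design.

```python
import numpy as np

def vibration_features(signal, fs, n_bands=8, fmax=2000.0):
    """Coarse frequency-domain feature vector for a vibration segment:
    normalized energy in n_bands equal-width bands up to fmax. A toy
    stand-in for VibWrite's frequency-domain finger-input features."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    edges = np.linspace(0, fmax, n_bands + 1)
    energy = np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                       for lo, hi in zip(edges[:-1], edges[1:])])
    return energy / energy.sum()               # normalize out touch force

# Two touches by the same finger (similar damping) vs. a different contact.
fs = 8000
t = np.arange(0, 0.1, 1.0 / fs)
touch_a1 = np.sin(2 * np.pi * 300 * t) * np.exp(-t / 0.02)
touch_a2 = 0.7 * np.sin(2 * np.pi * 305 * t) * np.exp(-t / 0.02)  # lighter press
touch_b  = np.sin(2 * np.pi * 900 * t) * np.exp(-t / 0.05)        # different contact

fa1, fa2, fb = (vibration_features(x, fs) for x in (touch_a1, touch_a2, touch_b))
same = np.linalg.norm(fa1 - fa2)
diff = np.linalg.norm(fa1 - fb)
print(same < diff)  # prints True: same-finger touches are closer in feature space
```

Normalizing the band energies captures how the contact (area, force, tissue damping) reshapes the vibration spectrum rather than how hard the surface was tapped, which is the intuition behind using frequency-domain features for both behavioral and physiological traits.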