Earprint: Transient Evoked Otoacoustic Emission for Biometrics

Title: Earprint: Transient Evoked Otoacoustic Emission for Biometrics
Publication Type: Journal Article
Year of Publication: 2014
Authors: Liu, Yuxi; Hatzinakos, D.
Journal: IEEE Transactions on Information Forensics and Security
Volume: 9
Pagination: 2291-2301
Date Published: Dec
ISSN: 1556-6013
Keywords: Access Control, acoustic response, advance forgery methods, advanced forgery techniques, advanced spoofing techniques, Auditory system, authorisation, biometric fusion, biometrics, biometrics (access control), click stimulus, cochlea, data privacy, Data security, earprint, falsification attacks, feature extraction, financial transaction, human auditory system, information fusion, learning (artificial intelligence), Linear discriminant analysis, machine learning technique linear discriminant analysis, nonstationary signal time-frequency representation, otoacoustic emissions, replay attacks, Robust Biometric Modality, sensor fusion, signal representation, TEOAE, Time-frequency Analysis, transient evoked otoacoustic emission, wavelet analysis
Abstract

Biometrics is attracting increasing attention in privacy- and security-sensitive applications such as access control and remote financial transactions. However, advanced forgery and spoofing techniques are threatening the reliability of conventional biometric modalities. This has motivated our investigation of a novel yet promising modality, the transient evoked otoacoustic emission (TEOAE), an acoustic response generated by the cochlea after a click stimulus. Unlike conventional modalities that are easily accessible or captured, TEOAE, as a physiological outcome of the human auditory system, is naturally immune to replay and falsification attacks. In this paper, we resort to wavelet analysis to derive a time-frequency representation of this nonstationary signal, which reveals individual uniqueness and long-term reproducibility. A machine learning technique, linear discriminant analysis, is subsequently applied to reduce intrasubject variability and capture intersubject differentiating features. With practical application in mind, we also introduce a complete framework for the biometric system in both verification and identification modes. Comparative experiments on a TEOAE data set collected in a biometric setting show the merits of the proposed method. Performance is further improved by fusing information from both ears.
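
The abstract describes a three-stage pipeline: a wavelet-derived time-frequency representation of the TEOAE signal, linear discriminant analysis to reduce intrasubject variability, and matching in verification or identification mode. The Python sketch below only illustrates how those stages fit together; the specific choices here (a Morlet continuous wavelet transform via PyWavelets, scikit-learn's LDA, cosine-similarity scoring, a fixed threshold, and synthetic stand-in data) are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): wavelet time-frequency features
# from a TEOAE-like recording, LDA projection, and cosine-similarity verification.
# All signals below are synthetic; real TEOAE recordings, sampling rates,
# wavelet scales, and decision thresholds would differ.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics.pairwise import cosine_similarity

FS = 16_000                 # assumed sampling rate (Hz)
SCALES = np.arange(1, 33)   # CWT scales; a stand-in for the paper's wavelet setup

def teoae_features(signal):
    """Continuous wavelet transform magnitude, flattened into a feature vector."""
    coeffs, _ = pywt.cwt(signal, SCALES, "morl", sampling_period=1.0 / FS)
    return np.abs(coeffs).ravel()

# --- Toy enrollment data: 10 "subjects", 5 short recordings each ------------
rng = np.random.default_rng(0)
n_subjects, n_rec, n_samples = 10, 5, 256
bases, X, y = {}, [], []
for subj in range(n_subjects):
    bases[subj] = rng.standard_normal(n_samples)          # subject-specific template
    for _ in range(n_rec):
        rec = bases[subj] + 0.3 * rng.standard_normal(n_samples)  # intrasubject noise
        X.append(teoae_features(rec))
        y.append(subj)
X, y = np.asarray(X), np.asarray(y)

# LDA projects features so intrasubject variability shrinks while
# intersubject differences are preserved.
lda = LinearDiscriminantAnalysis(n_components=n_subjects - 1).fit(X, y)
templates = {s: lda.transform(X[y == s]).mean(axis=0) for s in range(n_subjects)}

def verify(probe_signal, claimed_subject, threshold=0.8):
    """Verification mode: accept if the probe matches the claimed template."""
    probe = lda.transform(teoae_features(probe_signal).reshape(1, -1))
    score = cosine_similarity(probe, templates[claimed_subject].reshape(1, -1))[0, 0]
    return score >= threshold, score

# Genuine attempt: a fresh noisy recording of subject 3 should be accepted.
probe = bases[3] + 0.3 * rng.standard_normal(n_samples)
accepted, score = verify(probe, claimed_subject=3)
print(f"accepted={accepted}, score={score:.3f}")
```

Fusion of both ears, as mentioned at the end of the abstract, could be sketched by combining the two ears' similarity scores before thresholding, but that detail is likewise an assumption rather than the paper's prescribed rule.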

URL: https://ieeexplore.ieee.org/document/6914592
DOI: 10.1109/TIFS.2014.2361205
Citation Key: 6914592