Fusion of Face Recognition and Facial Expression Detection for Authentication: A Proposed Model

Title: Fusion of Face Recognition and Facial Expression Detection for Authentication: A Proposed Model
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Yin, Delina Beh Mei, Omar, Shariman, Talip, Bazilah A., Muklas, Amalia, Norain, Nur Afiqah Mohd, Othman, Abu Talib
Conference Name: Proceedings of the 11th International Conference on Ubiquitous Information Management and Communication
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4888-1
Keywords: authentication, biometric, face recognition, facial expression, human behaviour, human factors, identity verification, two-factor authentication
Abstract:

The paper presents a novel model of hybrid biometric-based authentication. The recognition accuracy of a single-biometric verification system is often greatly reduced by factors such as the environment, the user's mode of interaction, and an individual's physiological defects. Moreover, enrolment of a static biometric is highly vulnerable to impersonation attacks. Because single-biometric authentication offers only one factor of verification, we propose to hybridise two biometric attributes, one physiological and one behavioural. In this study, we utilise the static and dynamic features of the human face. To extract the important features from a face, the primary steps are image pre-processing and face detection. To distinguish a genuine user from an impostor, the first authentication step verifies the user's identity through face recognition. Relying solely on a single biometric modality can lead to false acceptance when two or more similar sets of face features produce a relatively high match score; we found a False Acceptance Rate of 0.55% and a False Rejection Rate of 7%. Because of these security discrepancies, we propose a fusion method in which a genuine user selects a facial expression from the seven universal expressions (happy, sad, anger, disgust, surprise, fear and neutral), as enrolled earlier in the database. As a proof of concept, our results show that even when two or more users coincidentally have similar face features, the selected facial expression acts as a password that clearly distinguishes a genuine user from an impostor.
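
The abstract outlines a two-stage decision rule: verify identity by face recognition first, then require the facial expression enrolled as a behavioural password. The Python sketch below illustrates that rule only; FaceRecord, face_match_score and face_threshold are hypothetical names introduced for illustration, not the authors' implementation.

# Minimal sketch of the two-stage fusion decision described in the abstract.
# FaceRecord, face_match_score and face_threshold are hypothetical placeholders.
from dataclasses import dataclass
from typing import Any, Callable

UNIVERSAL_EXPRESSIONS = {"happy", "sad", "anger", "disgust", "surprise", "fear", "neutral"}

@dataclass
class FaceRecord:
    face_template: Any        # static (physiological) face features captured at enrolment
    chosen_expression: str    # behavioural "expression password" selected at enrolment

def authenticate(probe_face: Any,
                 probe_expression: str,
                 enrolled: FaceRecord,
                 face_match_score: Callable[[Any, Any], float],
                 face_threshold: float = 0.8) -> bool:
    """Accept only if both the face identity and the enrolled expression match."""
    # Stage 1: face recognition -- reject when the identity match score is too low.
    if face_match_score(probe_face, enrolled.face_template) < face_threshold:
        return False
    # Stage 2: the presented expression must be one of the seven universal
    # expressions and equal the one enrolled earlier; it acts as a password
    # that separates users with coincidentally similar face features.
    return (probe_expression in UNIVERSAL_EXPRESSIONS
            and probe_expression == enrolled.chosen_expression)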

URL: http://doi.acm.org/10.1145/3022227.3022247
DOI: 10.1145/3022227.3022247
Citation Key: yin_fusion_2017