Region Based Robust Facial Expression Analysis

Title: Region Based Robust Facial Expression Analysis
Publication Type: Conference Paper
Year of Publication: 2018
Authors: Lian, Zheng, Li, Ya, Tao, Jianhua, Huang, Jian, Niu, Mingyue
Conference Name: 2018 First Asian Conference on Affective Computing and Intelligent Interaction (ACII Asia)
Keywords: class activation map, confusion matrix, emotion recognition, eyes areas, face recognition, facial areas, facial emotion recognition, facial expression recognition, facial recognition, feature extraction, Human Behavior, human computer interaction, human-machine interaction, low-resolution images, Metrics, Mice, mouse areas, Mouth, mouth regions, Nose, nose areas, partial faces, pose variations, pubcrawl, Resiliency, robust facial expression analysis, Testing, Training
Abstract: Facial emotion recognition is an essential aspect of human-machine interaction. Under real-world conditions, it faces many challenges, e.g., illumination changes, large pose variations, and partial or full occlusions, which leave different facial areas with different sharpness and completeness. Motivated by this fact, this paper focuses on facial expression recognition based on partial faces. We compare the contributions of seven facial areas of low-resolution images: nose areas, mouth areas, eyes areas, nose-to-mouth areas, nose-to-eyes areas, mouth-to-eyes areas, and whole-face areas. Through analysis of the confusion matrix and the class activation map, we find that mouth regions contain more emotional information than nose areas and eyes areas. At the same time, considering larger facial areas helps to judge the expression more precisely. In summary, the contributions of this paper are two-fold: (1) we reveal which areas of the human face matter for emotion recognition, and (2) we quantify the contribution of different facial parts.
DOI: 10.1109/ACIIAsia.2018.8470391
Citation Key: lian_region_2018
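
The abstract above refers to analyzing class activation maps to see which facial regions carry emotional information. The following minimal sketch, which is not the authors' code, computes a standard class activation map for a cropped face region; it assumes a torchvision ResNet-18 backbone and a hypothetical set of seven expression classes.

import torch
import torch.nn.functional as F
from torchvision import models

# Illustrative sketch only: a ResNet-18 with 7 hypothetical expression
# classes, not the network described in the paper.
model = models.resnet18(num_classes=7)
model.eval()

features = {}
model.layer4.register_forward_hook(
    lambda module, inp, out: features.update(conv=out))  # (1, 512, H, W) maps

def class_activation_map(face_crop, target_class):
    """CAM_c(x, y) = sum_k w_k^c * f_k(x, y), normalized to [0, 1]."""
    with torch.no_grad():
        model(face_crop.unsqueeze(0))            # populates features["conv"]
    fmap = features["conv"].squeeze(0)           # (512, H, W)
    weights = model.fc.weight[target_class]      # (512,) classifier weights
    cam = F.relu(torch.einsum("k,khw->hw", weights, fmap))
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

# Example: a 96x96 crop standing in for, e.g., the mouth region.
crop = torch.rand(3, 96, 96)
cam = class_activation_map(crop, target_class=0)  # coarse map from layer4

Upsampling the resulting map to the input size and overlaying it on each facial crop (nose, mouth, eyes, and their combinations) gives the kind of per-region visualization the abstract describes.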