Facial Motion Prior Networks for Facial Expression Recognition

Title: Facial Motion Prior Networks for Facial Expression Recognition
Publication Type: Conference Paper
Year of Publication: 2019
Authors: Chen, Yuedong, Wang, Jianfeng, Chen, Shikai, Shi, Zhongchao, Cai, Jianfei
Conference Name: 2019 IEEE Visual Communications and Image Processing (VCIP)
Keywords: Deep Learning, emotion recognition, expressive faces, face recognition, facial expression benchmark datasets, facial expression recognition, facial mask learning, facial motion prior networks, facial muscle moving regions, facial recognition, facial-motion mask, FER framework, FER methods, Human Behavior, human factors, learning (artificial intelligence), Metrics, muscle, neutral faces, prior domain knowledge, prior knowledge, pubcrawl, resilience, Resiliency
Abstract

Deep learning based facial expression recognition (FER) has received a lot of attention in the past few years. Most existing deep learning based FER methods do not make good use of domain knowledge, and thereby fail to extract representative features. In this work, we propose a novel FER framework, named Facial Motion Prior Networks (FMPN). In particular, we introduce an additional branch that generates a facial mask so as to focus on facial muscle moving regions. To guide the facial mask learning, we propose to incorporate prior domain knowledge by using the average differences between neutral faces and the corresponding expressive faces as the training guidance. Extensive experiments on three facial expression benchmark datasets demonstrate the effectiveness of the proposed method compared with state-of-the-art approaches.
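The mask guidance described in the abstract can be illustrated with a minimal sketch: average the pixel-wise differences between neutral faces and their corresponding expressive faces, then normalize the result into a [0, 1] mask that highlights muscle-motion regions. This is a hypothetical NumPy illustration of the idea, not the authors' implementation; the function name `prior_facial_mask` and the normalization choice are assumptions.

```python
import numpy as np

def prior_facial_mask(neutral_faces, expressive_faces, eps=1e-8):
    """Sketch of the FMPN-style training guidance: average absolute
    differences between paired neutral and expressive faces,
    min-max normalized to [0, 1]. Illustrative only."""
    diffs = np.abs(expressive_faces.astype(np.float64)
                   - neutral_faces.astype(np.float64))
    mask = diffs.mean(axis=0)  # average over all face pairs
    return (mask - mask.min()) / (mask.max() - mask.min() + eps)

# Toy usage: 4 grayscale 8x8 "face" pairs with one moving region.
rng = np.random.default_rng(0)
neutral = rng.integers(0, 216, size=(4, 8, 8))
expressive = neutral.copy()
expressive[:, 2:5, 2:5] += 40  # simulated facial-muscle motion
mask = prior_facial_mask(neutral, expressive)
# The mask is near 1 inside the moving region and 0 elsewhere.
```

In the actual FMPN framework this averaged-difference map serves as the supervision target for the mask-generating branch, so the network learns to attend to expression-relevant regions even when no neutral reference is available at test time.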

DOI: 10.1109/VCIP47243.2019.8965826
Citation Key: chen_facial_2019