Relation-aware facial expression recognition

2021 
Research on facial expression recognition has been moving from constrained lab scenarios to in-the-wild settings and has made progress in recent years. However, recognizing facial expressions in the wild remains very challenging due to large pose variations, occlusion, and changes in illumination and expression intensity. Existing methods generally treat the whole face as a uniform source of features for facial expression analysis. Yet research in physiology and psychology shows that crucial regions such as the eyes and mouth reflect the differences among facial expressions and are closely related to emotion expression. Inspired by this observation, this paper proposes a novel relation-aware facial expression recognition method, the Relation Convolutional Neural Network (ReCNN), which adaptively captures the relationship between crucial regions and facial expressions, focusing recognition on the most discriminative regions. We evaluate ReCNN on two large in-the-wild databases, AffectNet and RAF-DB. Extensive experiments on these databases show that our method achieves superior recognition accuracy compared with state-of-the-art methods, and that modeling the relationship between crucial regions and facial expressions improves facial expression recognition performance.
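The abstract does not specify how ReCNN weights the crucial regions, but the general idea of adaptively emphasizing discriminative regions can be sketched as attention-weighted fusion of per-region features. The following is a minimal NumPy illustration, not the paper's actual architecture; the region count, feature dimension, and the scoring vector `w` (standing in for a learned attention layer) are all assumptions.

```python
import numpy as np

def region_attention(region_feats, w):
    """Fuse per-region features with softmax attention.

    region_feats: (R, D) array, one D-dim feature per crucial region
                  (e.g. eyes, mouth) -- hypothetical stand-ins here.
    w:            (D,) scoring vector standing in for a learned layer.
    Returns (attention weights over regions, fused feature vector).
    """
    scores = region_feats @ w                # (R,) relevance score per region
    scores = scores - scores.max()           # shift for numerical stability
    alphas = np.exp(scores) / np.exp(scores).sum()  # softmax weights
    fused = alphas @ region_feats            # (D,) attention-weighted fusion
    return alphas, fused

# Toy usage: 4 hypothetical facial regions with 8-dim features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))
w = rng.normal(size=8)
alphas, fused = region_attention(feats, w)
```

In such a scheme, regions whose features score higher against the learned vector dominate the fused representation, so the classifier effectively focuses on the most discriminative regions, consistent with the motivation stated in the abstract.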