A Brief Review of Facial Emotion Recognition Based on Visual Information

Cited by: 418
Author
Ko, Byoung Chul [1]
Affiliation
[1] Keimyung Univ, Dept Comp Engn, Daegu 42601, South Korea
Keywords
facial emotion recognition; conventional FER; deep learning-based FER; convolutional neural networks; long short term memory; facial action coding system; facial action unit; EXPRESSION RECOGNITION; SYSTEM;
DOI
10.3390/s18020401
Abstract
Facial emotion recognition (FER) is an important topic in the fields of computer vision and artificial intelligence owing to its significant academic and commercial potential. Although FER can be conducted using multiple sensors, this review focuses on studies that exclusively use facial images, because visual expressions are one of the main information channels in interpersonal communication. This paper provides a brief review of research in the field of FER conducted over the past decades. First, conventional FER approaches are described along with a summary of the representative categories of FER systems and their main algorithms. Deep-learning-based FER approaches using deep networks enabling "end-to-end" learning are then presented. This review also focuses on an up-to-date hybrid deep-learning approach combining a convolutional neural network (CNN) for the spatial features of an individual frame and long short-term memory (LSTM) for the temporal features of consecutive frames. In the later part of this paper, a brief review of publicly available evaluation metrics is given, along with a comparison against benchmark results, which serve as a standard for the quantitative comparison of FER research. This review can serve as a brief guidebook for newcomers to the field of FER, providing basic knowledge and a general understanding of the latest state-of-the-art studies, as well as for experienced researchers looking for productive directions for future work.
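The hybrid CNN-LSTM pipeline summarized in the abstract can be sketched as follows. This is a minimal, dependency-free illustration of the general idea only, not the reviewed papers' implementation: the per-frame feature extractor below is a toy stand-in for a real CNN, and all names, sizes, and weight initializations are assumptions made for the example.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy stand-in for a CNN: in a real FER pipeline a convolutional
# network would extract spatial features from each face image;
# here we simply average pixel intensities per row.
def extract_spatial_features(frame):
    return [sum(row) / len(row) for row in frame]

class LSTMCell:
    """Minimal single-sequence LSTM cell, illustrating how temporal
    features of consecutive frames are accumulated in a hidden state."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = random.Random(seed)
        n = input_size + hidden_size
        # one weight matrix and bias vector per gate:
        # i = input, f = forget, o = output, c = candidate
        self.W = {g: [[rng.uniform(-0.1, 0.1) for _ in range(n)]
                      for _ in range(hidden_size)]
                  for g in "ifoc"}
        self.b = {g: [0.0] * hidden_size for g in "ifoc"}
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        z = x + h  # concatenate current input with previous hidden state
        gate = lambda g, act: [act(sum(w * v for w, v in zip(row, z)) + bb)
                               for row, bb in zip(self.W[g], self.b[g])]
        i, f, o = gate("i", sigmoid), gate("f", sigmoid), gate("o", sigmoid)
        g = gate("c", math.tanh)
        c_new = [ff * cc + ii * gg for ff, cc, ii, gg in zip(f, c, i, g)]
        h_new = [oo * math.tanh(cc) for oo, cc in zip(o, c_new)]
        return h_new, c_new

def hybrid_fer_features(frames, cell):
    """Spatial features per frame (CNN stand-in), fused over time (LSTM).
    The final hidden state summarizes the whole expression sequence and
    would be fed to a classifier in a complete FER system."""
    h = [0.0] * cell.hidden_size
    c = [0.0] * cell.hidden_size
    for frame in frames:
        x = extract_spatial_features(frame)  # spatial step
        h, c = cell.step(x, h, c)            # temporal step
    return h
```

In an actual system, the per-frame features would come from a trained CNN and the final hidden state would pass through a softmax layer over emotion classes; this sketch only shows the spatial-then-temporal data flow the abstract describes.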
Pages: 20