Detection, tracking, and classification of action units in facial expression

Times Cited: 150
Authors
Lien, JJJ
Kanade, T
Cohn, JF
Li, CC
Affiliations
[1] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
[2] Vis Corp, Jersey City, NJ 07302 USA
[3] Univ Pittsburgh, Dept Psychol, Pittsburgh, PA 15260 USA
[4] Univ Pittsburgh, Dept Elect Engn, Pittsburgh, PA 15260 USA
Keywords
face expression recognition; optical flow; high-gradient component detection; hidden Markov model; human-computer interaction;
DOI
10.1016/S0921-8890(99)00103-7
Chinese Library Classification (CLC)
TP [automation and computer technology];
Subject Classification Code
0812;
Abstract
Most of the current work on automated facial expression analysis attempts to recognize a small set of prototypic expressions, such as joy and fear. Such prototypic expressions, however, occur infrequently, and human emotions and intentions are more often communicated by changes in one or two discrete facial features. To capture the full range of facial expression, detection, tracking, and classification of fine-grained changes in facial features are needed. We developed the first version of a computer vision system that is sensitive to subtle changes in the face. The system includes three modules to extract feature information: dense-flow extraction using a wavelet motion model, facial-feature tracking, and edge and line extraction. The feature information thus extracted is fed to discriminant classifiers or hidden Markov models that classify it into action units of the Facial Action Coding System (FACS), the descriptive system for coding fine-grained changes in facial expression. The system was tested on image sequences from 100 male and female subjects of varied ethnicity. Agreement with manual FACS coding was strong for the results based on dense-flow extraction and facial-feature tracking, and strong to moderate for edge and line extraction. (C) 2000 Elsevier Science B.V. All rights reserved.
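The abstract describes a pipeline in which tracked facial-feature motion is classified into FACS action units by hidden Markov models. The following minimal Python sketch illustrates that general idea only, not the authors' system: feature points are tracked with pyramidal Lucas-Kanade optical flow (OpenCV), their displacements from the neutral first frame form the observation sequence, and a set of per-action-unit Gaussian HMMs (hmmlearn) is scored by log-likelihood. All library choices, function names, and parameter values are illustrative assumptions.

# Minimal sketch (not the authors' implementation): track facial feature
# points with pyramidal Lucas-Kanade optical flow, then score the resulting
# displacement trajectory against per-action-unit hidden Markov models.
import cv2
import numpy as np
from hmmlearn import hmm

def track_feature_points(frames, initial_points):
    """Track manually initialized feature points through a grayscale sequence."""
    pts = np.asarray(initial_points, dtype=np.float32).reshape(-1, 1, 2)
    trajectories = [pts.reshape(-1, 2).copy()]
    prev = frames[0]
    for frame in frames[1:]:
        pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev, frame, pts, None,
            winSize=(15, 15), maxLevel=3)   # image pyramid handles larger motions
        trajectories.append(pts.reshape(-1, 2).copy())
        prev = frame
    return np.stack(trajectories)           # shape: (n_frames, n_points, 2)

def displacement_features(trajectories):
    """Per-frame displacement of every point from its first-frame (neutral) position."""
    return (trajectories - trajectories[0]).reshape(len(trajectories), -1)

def classify_action_unit(observation_seq, au_models):
    """Return the action-unit label whose HMM gives the highest log-likelihood."""
    return max(au_models, key=lambda au: au_models[au].score(observation_seq))

# Hypothetical usage: one GaussianHMM per action unit, trained elsewhere on
# labeled displacement sequences, e.g.
#   au_models = {"AU12": hmm.GaussianHMM(n_components=3).fit(X_au12), ...}
#   label = classify_action_unit(displacement_features(trajs), au_models)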
Pages: 131-146
Page count: 16