Tracking hand dynamics in unconstrained environments

Cited by: 5
Authors
Azoz, Y [1 ]
Devi, L [1 ]
Sharma, R [1 ]
Affiliation
[1] Penn State Univ, Dept Comp Engn & Sci, Pond Lab 220, University Pk, PA 16802 USA
Source
AUTOMATIC FACE AND GESTURE RECOGNITION - THIRD IEEE INTERNATIONAL CONFERENCE PROCEEDINGS | 1998
Keywords
DOI
10.1109/AFGR.1998.670961
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A key problem in building an interface in which the user controls a computer-generated display with unrestricted hand gestures is localizing and tracking the human arm in image sequences. This paper proposes a multimodal localization scheme combined with a tracking framework that exploits the articulated structure of the arm. The localization uses the multiple cues of motion, shape, and color to locate a set of image features. Using constraint fusion, these features are tracked by a modified Extended Kalman Filter. An interaction scheme between tracking and localization is proposed to improve the estimation while decreasing the computational requirements. The results of extensive simulations and experiments with real data are described, including a large database of hand gestures involved in display control.
Pages: 274-279
Page count: 6
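To make the abstract's tracking step concrete, below is a minimal sketch of a Kalman-filter tracking loop for a single 2-D image feature (e.g., a hand centroid), using a constant-velocity model. This is an illustrative simplification, not the paper's modified Extended Kalman Filter, which additionally fuses articulation constraints of the arm and interacts with the multimodal localizer; the frame rate, noise levels, and measurement sequence here are assumptions.

```python
import numpy as np

# Constant-velocity Kalman filter for one 2-D image feature.
# State: [x, y, vx, vy]; measurement: localized feature position [x, y].
dt = 1.0 / 30.0                                  # assumed frame interval (30 fps)

F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)       # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)        # measurement model
Q = np.eye(4) * 1e-2                             # process noise (assumed)
R = np.eye(2) * 4.0                              # measurement noise (assumed)

def predict(x, P):
    """Propagate the state and covariance one frame ahead."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the prediction with a localized feature position z = [u, v]."""
    y = z - H @ x                                # innovation
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Usage: initialize from the first localization result, then alternate
# predict/update as new detections from the multimodal cues arrive.
x = np.array([100.0, 120.0, 0.0, 0.0])
P = np.eye(4) * 10.0
for z in [np.array([101.0, 122.0]), np.array([103.0, 125.0])]:
    x, P = predict(x, P)
    x, P = update(x, P, z)
    print("estimated position:", x[:2])
```

In the paper's framework, the measurement update would also incorporate constraints from the arm's articulated structure (constraint fusion), and localization would be re-invoked only when tracking confidence drops, which is what reduces the computational requirements.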