Human skeleton tracking from depth data using geodesic distances and optical flow

Cited by: 127
Authors
Schwarz, Loren Arthur [1 ]
Mkhitaryan, Artashes [1 ]
Mateus, Diana [1 ]
Navab, Nassir [1 ]
Affiliations
[1] Tech Univ Munich, Dept Informat, D-85748 Garching, Germany
Keywords
Human pose estimation; Depth imaging; Geodesic distances
DOI
10.1016/j.imavis.2011.12.001
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present a method for human full-body pose estimation from depth data that can be obtained using Time of Flight (ToF) cameras or the Kinect device. Our approach consists of robustly detecting anatomical landmarks in the 3D data and fitting a skeleton body model using constrained inverse kinematics. Instead of relying on appearance-based features for interest point detection that can vary strongly with illumination and pose changes, we build upon a graph-based representation of the depth data that allows us to measure geodesic distances between body parts. As these distances do not change with body movement, we are able to localize anatomical landmarks independent of pose. For differentiation of body parts that occlude each other, we employ motion information, obtained from the optical flow between subsequent intensity images. We provide a qualitative and quantitative evaluation of our pose tracking method on ToF and Kinect sequences containing movements of varying complexity. (C) 2011 Elsevier B.V. All rights reserved.
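The abstract's core idea is that geodesic distances measured along a graph built from the depth data stay nearly constant under articulation, so geodesic extrema from the body centre reliably mark landmarks such as the head, hands, and feet. A minimal sketch of that idea, assuming a toy point cloud and a simple distance-threshold graph (the `max_edge` threshold and helper names are illustrative, not from the paper):

```python
import heapq
import math

def build_depth_graph(points, max_edge=0.06):
    """Connect 3D points closer than max_edge (metres).
    Returns an adjacency list: node index -> [(neighbor, weight), ...]."""
    adj = {i: [] for i in range(len(points))}
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            if d <= max_edge:
                adj[i].append((j, d))
                adj[j].append((i, d))
    return adj

def geodesic_distances(adj, source):
    """Dijkstra shortest paths from `source`; path lengths along the
    surface graph approximate geodesic distances on the body."""
    dist = {i: math.inf for i in adj}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def geodesic_extremum(adj, source):
    """The node geodesically farthest from the body centre is a
    candidate anatomical landmark (e.g. a hand or foot tip)."""
    dist = geodesic_distances(adj, source)
    return max(dist, key=dist.get)

# An L-shaped chain of points: geodesic distance along the chain (0.20)
# exceeds the straight-line Euclidean distance (~0.14), illustrating
# why these distances are stable under bending of a limb.
points = [(0, 0, 0), (0.05, 0, 0), (0.10, 0, 0),
          (0.10, 0.05, 0), (0.10, 0.10, 0)]
adj = build_depth_graph(points)
print(geodesic_distances(adj, 0)[4])   # path length along the chain
print(geodesic_extremum(adj, 0))       # farthest node = chain tip
```

This is only a schematic of the geodesic-extrema step; the paper's full method additionally fits a skeleton via constrained inverse kinematics and uses optical flow between intensity images to disambiguate self-occluding parts.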
Pages: 217-226
Page count: 10