Active gaze tracking for human-robot interaction

Cited by: 12
Authors
Atienza, R [1 ]
Zelinsky, A [1 ]
Affiliation
[1] Australian Natl Univ, Res Sch Informat Sci, Canberra, ACT 0200, Australia
Source
FOURTH IEEE INTERNATIONAL CONFERENCE ON MULTIMODAL INTERFACES, PROCEEDINGS | 2002
Keywords
active gaze tracking; active face tracking; human-robot interface;
DOI
10.1109/ICMI.2002.1167004
CLC number
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
In our effort to make human-robot interfaces more user-friendly, we built an active gaze tracking system that can measure a person's gaze direction in real time. Gaze normally indicates which object in the surroundings a person is interested in; it can therefore serve as a medium for human-robot interaction, for example instructing a robot arm to pick up the object a user is looking at. In this paper, we discuss how we developed and integrated algorithms for zoom camera calibration, low-level control of an active head, face tracking, and gaze tracking to create an active gaze tracking system.
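To illustrate the interaction idea described in the abstract (using measured gaze direction to select the object a user is looking at), below is a minimal sketch, not taken from the paper: it assumes the tracker already provides a 3-D gaze origin and unit direction in the robot's frame, and that candidate object positions are known. The function name select_gazed_object, the max_dist threshold, and the object dictionary are all hypothetical choices for this example; it simply picks the object closest to the gaze ray.

import numpy as np

def select_gazed_object(gaze_origin, gaze_dir, objects, max_dist=0.05):
    """Return the name of the object whose 3-D position lies closest to the gaze ray.

    gaze_origin : (3,) array, eye position in the robot's frame (assumed available).
    gaze_dir    : (3,) array, gaze direction from the tracker (assumed available).
    objects     : dict mapping object name -> (3,) position array.
    max_dist    : reject objects farther than this distance (metres) from the ray.
    """
    gaze_origin = np.asarray(gaze_origin, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)

    best_name, best_dist = None, max_dist
    for name, pos in objects.items():
        v = np.asarray(pos, dtype=float) - gaze_origin
        t = np.dot(v, gaze_dir)
        if t <= 0:
            # Object lies behind the viewer; it cannot be the gaze target.
            continue
        # Perpendicular distance from the object position to the gaze ray.
        dist = np.linalg.norm(v - t * gaze_dir)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Example use (positions are made up): the selected name could then be passed
# to a pick-up command for the robot arm.
target = select_gazed_object(
    gaze_origin=[0.0, 0.0, 1.6],
    gaze_dir=[0.0, 0.7, -0.7],
    objects={"cup": [0.02, 0.55, 1.05], "book": [0.40, 0.60, 1.00]},
)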
Pages: 261-266
Number of pages: 6