NAVIG: augmented reality guidance system for the visually impaired: Combining object localization, GNSS, and spatial audio

Cited by: 102
Authors
Katz, Brian F. G. [1 ]
Kammoun, Slim [2 ,3 ]
Parseihian, Gaetan [1 ]
Gutierrez, Olivier [2 ,3 ]
Brilhault, Adrien [2 ,3 ,4 ]
Auvray, Malika [1 ]
Truillet, Philippe [2 ,3 ]
Denis, Michel [1 ]
Thorpe, Simon [3 ,4 ]
Jouffrais, Christophe [2 ,3 ]
Affiliations
[1] Univ Paris 11, CNRS, LIMSI, F-91403 Orsay, France
[2] CNRS, IRIT, Toulouse, France
[3] Univ Toulouse 3, F-31062 Toulouse, France
[4] CNRS, CerCo, Toulouse, France
Keywords
Assisted navigation; Guidance; Spatial audio; Visually impaired assistive device; Need analysis; AUDITORY DISPLAY; ROUTE; BLIND; SUBSTITUTION; EXPLORATION; PEDESTRIANS; PERCEPTION; PEOPLE; SOUND; AID;
DOI
10.1007/s10055-012-0213-6
Chinese Library Classification
TP39 [Applications of computers]
Discipline classification codes
081203; 0835
Abstract
Navigating complex routes and finding objects of interest are challenging tasks for the visually impaired. The project NAVIG (Navigation Assisted by artificial VIsion and GNSS) is directed toward increasing personal autonomy via a virtual augmented reality system. The system integrates an adapted geographic information system with different classes of objects useful for improving route selection and guidance. The database also includes models of important geolocated objects that may be detected by real-time embedded vision algorithms. Object localization (relative to the user) may serve both global positioning and sensorimotor actions such as heading, grasping, or piloting. The user is guided to his desired destination through spatialized semantic audio rendering, always maintained in the head-centered reference frame. This paper presents the overall project design and architecture of the NAVIG system. In addition, details of a new type of detection and localization device are presented. This approach combines a bio-inspired vision system that can recognize and locate objects very quickly and a 3D sound rendering system that is able to perceptually position a sound at the location of the recognized object. This system was developed in relation to guidance directives developed through participative design with potential users and educators for the visually impaired.
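To make the head-centered rendering described in the abstract more concrete, the following is a minimal sketch, not the NAVIG implementation, of how a geolocated target (from GNSS or the vision module) could be expressed as a head-relative azimuth and distance before being passed to a 3D audio renderer. It assumes a flat-Earth (equirectangular) approximation over pedestrian distances and a compass-referenced head yaw; all function and variable names are illustrative.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def target_in_head_frame(user_lat, user_lon, head_yaw_deg,
                         target_lat, target_lon):
    """Return (azimuth_deg, distance_m) of a geolocated target relative to
    the user's head: 0 deg is straight ahead, positive azimuth to the right."""
    # Local east/north offsets of the target from the user (equirectangular
    # approximation; adequate over short pedestrian distances).
    d_lat = math.radians(target_lat - user_lat)
    d_lon = math.radians(target_lon - user_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(user_lat))

    distance = math.hypot(east, north)
    bearing = math.degrees(math.atan2(east, north))  # compass bearing of target
    # Head-relative azimuth, wrapped to [-180, 180).
    azimuth = (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0
    return azimuth, distance

if __name__ == "__main__":
    # Hypothetical example: user in central Toulouse facing due north,
    # target roughly 100 m to the north-east.
    az, dist = target_in_head_frame(43.6045, 1.4440, 0.0, 43.6050, 1.4450)
    print(f"target at {az:+.1f} deg, {dist:.0f} m")
```

In a full system of the kind the abstract describes, an azimuth and distance like these would drive the spatialized semantic audio cues, with the head yaw refreshed continuously from an orientation sensor so that the sound stays perceptually anchored to the object in the head-centered reference frame.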
Pages: 253-269
Number of pages: 17