Resolving multiple occluded layers in augmented reality

Cited by: 73
Authors
Livingston, MA [1 ]
Swan, JE [1 ]
Gabbard, JL [1 ]
Höllerer, TH [1 ]
Hix, D [1 ]
Julier, SJ [1 ]
Baillot, Y [1 ]
Brown, D [1 ]
Affiliations
[1] USN, Res Lab, Virtual Real Lab, Washington, DC 20375 USA
Source
SECOND IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, PROCEEDINGS | 2003
DOI
10.1109/ISMAR.2003.1240688
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A useful function of augmented reality (AR) systems is their ability to visualize occluded infrastructure directly in a user's view of the environment. This is especially important for our application context, which utilizes mobile AR for navigation and other operations in an urban environment. A key problem in the AR field is how best to depict occluded objects in such a way that the viewer can correctly infer the depth relationships between different physical and virtual objects. Showing a single occluded object with no depth context presents an ambiguous picture to the user. But showing all occluded objects in the environment leads to the "Superman's X-ray vision" problem, in which the user sees too much information to make sense of the depth relationships of objects. Our efforts differ qualitatively from previous work in AR occlusion, because our application domain involves far-field occluded objects, which are tens of meters distant from the user. Previous work has focused on near-field occluded objects, which are within or just beyond arm's reach, and which rely on different perceptual cues. We designed and evaluated a number of sets of display attributes. We then conducted a user study to determine which representations best express occlusion relationships among far-field objects. We identify a drawing style and opacity settings that enable the user to accurately interpret three layers of occluded objects, even in the absence of perspective constraints.
Pages: 56-65 (10 pages)