MPI CyberMotion Simulator: Implementation of a Novel Motion Simulator to Investigate Multisensory Path Integration in Three Dimensions

Cited by: 22
Authors
Barnett-Cowan, Michael [1 ]
Meilinger, Tobias [1 ]
Vidal, Manuel [2 ]
Teufel, Harald [1 ]
Buelthoff, Heinrich H. [1 ,3 ]
Affiliations
[1] Max Planck Inst Biol Cybernet, Dept Human Percept Cognit & Act, D-72076 Tubingen, Germany
[2] CNRS, Coll France, Lab Physiol Percept & Act, F-75700 Paris, France
[3] Korea Univ, Dept Brain & Cognit Engn, Seoul, South Korea
Source
JOVE-JOURNAL OF VISUALIZED EXPERIMENTS | 2012, Issue 63
Keywords
Neuroscience; Issue 63; Motion simulator; multisensory integration; path integration; space perception; vestibular; vision; robotics; cybernetics
DOI
10.3791/3436
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Path integration is a process in which self-motion is integrated over time to obtain an estimate of one's current position relative to a starting point (1). Humans can perform path integration based exclusively on visual (2-3), auditory (4), or inertial cues (5). However, with multiple cues present, inertial cues - particularly kinaesthetic cues - seem to dominate (6-7). In the absence of vision, humans tend to overestimate short distances (<5 m) and turning angles (<30 degrees), but underestimate longer ones (5). Movement through physical space therefore does not seem to be accurately represented by the brain. Extensive work has been done on evaluating path integration in the horizontal plane, but little is known about vertical movement (see (3) for virtual movement from vision alone). One reason for this is that traditional motion simulators have a small range of motion restricted mainly to the horizontal plane. Here we take advantage of a motion simulator (8-9) with a large range of motion to assess whether path integration is similar between the horizontal and vertical planes. The relative contributions of inertial and visual cues to path navigation were also assessed. Sixteen observers sat upright in a seat mounted to the flange of a modified KUKA anthropomorphic robot arm. Sensory information was manipulated by providing visual (optic flow, limited-lifetime star field), vestibular-kinaesthetic (passive self-motion with eyes closed), or combined visual and vestibular-kinaesthetic motion cues. Movement trajectories in the horizontal, sagittal, and frontal planes consisted of two segment lengths (1st: 0.4 m, 2nd: 1 m; +/- 0.24 m/s² peak acceleration). The angle between the two segments was either 45 degrees or 90 degrees. Observers pointed back to their origin by moving an arrow that was superimposed on an avatar presented on the screen. Observers were more likely to underestimate angle size for movement in the horizontal plane and to overestimate it for movement in the vertical planes, with such bias evident in the sagittal plane.
Finally, observers responded more slowly when answering based on vestibular-kinaesthetic information alone. Human path integration based on vestibular-kinaesthetic information alone thus takes longer than when visual information is present. That pointing was consistent with underestimating and overestimating the angle moved through in the horizontal and vertical planes, respectively, suggests that the neural representation of self-motion through space is non-symmetrical, which may relate to the fact that humans experience movement mostly within the horizontal plane.
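As an illustrative sketch (not code from the paper), the ideal response in this pointing task can be computed with simple trigonometry: given the two segment lengths and the turn angle between them, dead reckoning yields the direction from the endpoint back to the origin, expressed relative to the final heading. The function name and sign conventions (positive angles counterclockwise within the movement plane) are our own assumptions:

```python
import math

def homing_direction(seg1, seg2, turn_deg):
    """Ideal path integration for a two-segment trajectory.

    The observer moves seg1 metres along the initial heading, turns by
    turn_deg degrees within the movement plane (positive = counterclockwise),
    then moves seg2 metres. Returns the angle in degrees, relative to the
    final heading, that points back to the starting position.
    """
    t = math.radians(turn_deg)
    # Endpoint with the origin at the start and the initial heading along +x.
    x = seg1 + seg2 * math.cos(t)
    y = seg2 * math.sin(t)
    # World-frame direction from the endpoint back to the origin.
    back = math.atan2(-y, -x)
    # Express relative to the final heading (rotated by turn_deg), then
    # normalise to (-180, 180].
    rel = math.degrees(back - t)
    return (rel + 180.0) % 360.0 - 180.0

# Trajectory parameters used in the study: 0.4 m then 1 m, 45° or 90° turns.
for angle in (45.0, 90.0):
    print(f"turn {angle:.0f} deg -> point back at "
          f"{homing_direction(0.4, 1.0, angle):.1f} deg")
```

Deviations of observers' pointing responses from this geometric ideal are what reveal the under- and overestimation biases described in the abstract.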
Pages: 6
References
16 records in total
[1] Bakker NH, Werkhoven PJ, Passenier PO. The effects of proprioceptive and visual feedback on geographical orientation in virtual environments. Presence-Teleoperators and Virtual Environments, 1999, 8(1): 36-53.
[2] Barnett-Cowan M. Experimental Brain Research, 2012 (in press).
[3] Barnett-Cowan M, Harris LR. Temporal processing of active and passive head movement. Experimental Brain Research, 2011, 214(1): 27-35.
[4] Barnett-Cowan M, Harris LR. Perceived timing of vestibular stimulation relative to touch, light and sound. Experimental Brain Research, 2009, 198(2-3): 221-231.
[5] Correia MJ. Acta Oto-Laryngologica, 1968, 230: 3.
[6] Garling T, Book A, Lindberg E, Arce C. Is elevation encoded in cognitive maps? Journal of Environmental Psychology, 1990, 10(4): 341-351.
[7] Kearns MJ, Warren WH, Duchon AP, Tarr MJ. Path integration from optic flow and body senses in a homing task. Perception, 2002, 31(3): 349-374.
[8] Loomis JM, Klatzky RL, Golledge RG, Cicinelli JG, Pellegrino JW, Fry PA. Nonvisual navigation by blind and sighted: assessment of path integration ability. Journal of Experimental Psychology-General, 1993, 122(1): 73-91.
[9] Loomis JM, Klatzky RL, Philbeck JW, Golledge RG. Assessing auditory distance perception using perceptually directed action. Perception & Psychophysics, 1998, 60(6): 966-980.
[10] Loomis JM, Klatzky RL, Golledge RG. Navigating without vision: basic and applied research. Optometry and Vision Science, 2001, 78(5): 282-289.