1. Kopiske K, Heinrich EM, Jahn G, Bendixen A, Einhäuser W. Multisensory cues for walking in virtual reality: humans combine conflicting visual and self-motion information to reproduce distances. J Neurophysiol 2023; 130:1028-1040. [PMID: 37701952] [DOI: 10.1152/jn.00011.2023]
Abstract
When humans walk, it is important for them to have some measure of the distance they have traveled. Typically, many cues from different modalities are available, as humans perceive both the environment around them (for example, through vision and haptics) and their own walking. Here, we investigate the contribution of visual cues and nonvisual self-motion cues to distance reproduction when walking on a treadmill through a virtual environment by separately manipulating the speed of the treadmill belt and of the virtual environment. Using mobile eye tracking, we also investigate how our participants sampled the visual information through gaze. We show that, as predicted, both modalities affected how participants (N = 28) reproduced a distance. Participants weighed nonvisual self-motion cues more strongly than visual cues, corresponding also to their respective reliabilities, but with some interindividual variability. Those who looked more toward those parts of the visual scene that contained cues to speed and distance tended also to weigh visual information more strongly, although this correlation was nonsignificant, and participants generally directed their gaze toward visually informative areas of the scene less than expected. As measured by motion capture, participants adjusted their gait patterns to the treadmill speed but not to walked distance. In sum, we show in a naturalistic virtual environment how humans use different sensory modalities when reproducing distances and how the use of these cues differs between participants and depends on information sampling.

NEW & NOTEWORTHY Combining virtual reality with treadmill walking, we measured the relative importance of visual cues and nonvisual self-motion cues for distance reproduction. Participants used both cues but put more weight on self-motion; the weight on visual cues tended to correlate with looking at visually informative areas, though not significantly. Participants overshot distances, especially when self-motion was slow; they adjusted steps to self-motion cues but not to visual cues. Our work thus quantifies the multimodal contributions to distance reproduction.
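The weighting "corresponding also to their respective reliabilities" refers to the standard maximum-likelihood model of cue combination, in which each cue is weighted by its inverse variance. A minimal sketch of that model; the function name and all numbers are illustrative, not data from the study:

```python
# Reliability-weighted (maximum-likelihood) cue combination: each cue's
# weight is proportional to its reliability, i.e., the inverse of its variance.
# All values below are illustrative, not taken from the study.

def combine_cues(est_self, var_self, est_visual, var_visual):
    """Fuse two distance estimates, weighting each by its reliability (1/variance)."""
    r_self, r_visual = 1.0 / var_self, 1.0 / var_visual
    w_self = r_self / (r_self + r_visual)          # weight on the self-motion cue
    combined = w_self * est_self + (1.0 - w_self) * est_visual
    combined_var = 1.0 / (r_self + r_visual)       # fused estimate beats either cue alone
    return combined, combined_var

# A more reliable (lower-variance) self-motion cue dominates the fused estimate:
dist, var = combine_cues(est_self=10.0, var_self=1.0, est_visual=12.0, var_visual=4.0)
print(round(dist, 2), round(var, 2))  # -> 10.4 0.8 (self-motion weight = 0.8)
```

On this account, the interindividual variability in cue weights reported above corresponds to different effective variance ratios between the two modalities.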
Affiliation(s)
- Karl Kopiske
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Elisa-Maria Heinrich
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Georg Jahn
- Applied Geropsychology and Cognition, Faculty of Behavioural and Social Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Alexandra Bendixen
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Wolfgang Einhäuser
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
2. Hanna M, Fung J, Lamontagne A. Multisensory control of a straight locomotor trajectory. J Vestib Res 2018; 27:17-25. [PMID: 28387689] [DOI: 10.3233/ves-170603]
Abstract
Locomotor steering is contingent upon orienting oneself spatially in the environment. When the head is turned while walking, the optic flow projected onto the retina is a complex pattern comprising a translational and a rotational component. We have created a unique paradigm to simulate different optic flows in a virtual environment. We hypothesized that non-visual (vestibular and somatosensory) cues are required for proper control of a straight trajectory while walking. This study included 9 healthy young subjects walking in a large physical space (40 × 25 m²) while viewing the virtual environment in a helmet-mounted display. They were instructed to walk straight in the physical world while being exposed to three conditions: (1) self-initiated active head turns (AHT: 40° right, left, or none); (2) visually simulated head turns (SHT); and (3) visually simulated head turns with no target element (SHT_NT). Conditions 1 and 2 involved an eye-level target which subjects were instructed to fixate, whereas condition 3 was similar to condition 2 but with no target. Identical retinal flow patterns were present in the AHT and SHT conditions, whereas non-visual cues differed in that a head rotation was sensed only in AHT but not in SHT. Body motions were captured by a 12-camera Vicon system. Horizontal orientations of the head and body segments, as well as the trajectory of the body's centre of mass, were analyzed. SHT and SHT_NT yielded similar results. Heading and body segment orientations changed in the direction opposite to the head turns in SHT conditions. Heading remained unchanged across head turn directions in AHT. Results suggest that non-visual information is used in the control of heading while being exposed to changing rotational optic flows. The small magnitude of the changes in SHT conditions suggests that the CNS can re-weight relevant sources of information to minimize heading errors in the presence of sensory conflicts.
Affiliation(s)
- Maxim Hanna
- School of Physical and Occupational Therapy, McGill University, Montreal, QC, Canada; Feil and Oberfeld/CRIR Research Centre, Jewish Rehabilitation Hospital, CISSS-Laval, QC, Canada
- Joyce Fung
- School of Physical and Occupational Therapy, McGill University, Montreal, QC, Canada; Feil and Oberfeld/CRIR Research Centre, Jewish Rehabilitation Hospital, CISSS-Laval, QC, Canada
- Anouk Lamontagne
- School of Physical and Occupational Therapy, McGill University, Montreal, QC, Canada; Feil and Oberfeld/CRIR Research Centre, Jewish Rehabilitation Hospital, CISSS-Laval, QC, Canada
3. Manning JR, Lew TF, Li N, Sekuler R, Kahana MJ. MAGELLAN: a cognitive map-based model of human wayfinding. J Exp Psychol Gen 2014; 143:1314-1330. [PMID: 24490847] [DOI: 10.1037/a0035542]
Abstract
In an unfamiliar environment, searching for and navigating to a target requires that spatial information be acquired, stored, processed, and retrieved. In a study encompassing all of these processes, participants acted as taxicab drivers who learned to pick up and deliver passengers in a series of small virtual towns. We used data from these experiments to refine and validate MAGELLAN, a cognitive map-based model of spatial learning and wayfinding. MAGELLAN accounts for the shapes of participants' spatial learning curves, which measure their experience-based improvement in navigational efficiency in unfamiliar environments. The model also predicts the ease (or difficulty) with which different environments are learned and, within a given environment, which landmarks will be easy (or difficult) to localize from memory. Using just 2 free parameters, MAGELLAN provides a useful account of how participants' cognitive maps evolve over time with experience, and how participants use the information stored in their cognitive maps to navigate and explore efficiently.
Affiliation(s)
- Timothy F Lew
- Department of Psychology, University of Pennsylvania
- Ningcheng Li
- Department of Bioengineering, University of Pennsylvania
4. Do walkers follow their heads? Investigating the role of head rotation in locomotor control. Exp Brain Res 2012; 219:175-90. [PMID: 22466410] [DOI: 10.1007/s00221-012-3077-9]
Abstract
Eye and head rotations are normally correlated with changes in walking direction; however, it is unknown whether they play a causal role in the control of steering. The objective of the present study was to answer two questions about the role of head rotations in steering control when walking to a goal. First, are head rotations sufficient to elicit a change in walking direction? Second, are head rotations necessary to initiate a change in walking direction or to guide steering to a goal? To answer these questions, participants either walked toward a goal located 7 m away or were cued to steer to the left or right by 37°. On a subset of trials, participants were either cued to voluntarily turn their heads to the left or right, or they underwent an involuntary head perturbation via a head-mounted air jet. The results showed that large voluntary head turns (35°) yielded only slight path deviations (1°-2°), in the same or the opposite direction as the head turn depending on conditions, and these deviations admit alternative explanations. Involuntary head rotations did not elicit path deviations despite comparable head rotation magnitudes. In addition, the walking trajectory when turning toward an eccentric goal was the same regardless of head orientation. Steering can thus be decoupled from head rotation during walking. We conclude that head rotations are neither a sufficient nor a necessary component of steering control: they do not induce a turn, and they are not required to initiate a turn or to guide the locomotor trajectory to a goal.
5.
Abstract
The study of spatial vision is a long and well-traveled road (which, of course, converges to a vanishing point at the horizon). Its various distortions have been widely investigated empirically, and most studies concentrate, pragmatically, on the space anterior to the observer. The visual world behind the observer has received relatively less attention, and it is this perspective the current experiments address. Our results show systematic perceptual distortions in the posterior visual world when viewed statically. Under static viewing conditions, observers' perceptual representation was consistently 'spread' in a hyperbolic fashion. Directions to distant, peripheral locations were consistently overestimated by about 11 degrees from the ground truth, and this overestimation increased as the target was moved toward the center of the observer's back. The perceptual representation of posterior visual space is, no doubt, secondary to the more immediate needs of the anterior visual world. Still, it is important in some domains, including certain sports, such as rowing, and in vehicular navigation.
Affiliation(s)
- Flip Phillips
- Department of Psychology, Skidmore College, Vision Laboratories, Saratoga Springs, NY 12866-1632, USA.
6. Lamontagne A, Fung J, McFadyen B, Faubert J, Paquette C. Stroke affects locomotor steering responses to changing optic flow directions. Neurorehabil Neural Repair 2010; 24:457-68. [PMID: 20067950] [DOI: 10.1177/1545968309355985]
Abstract
BACKGROUND Stroke patients manifest steering difficulties during walking, which may arise from an altered perception of visual motion. OBJECTIVE To examine the ability of stroke patients to control their heading direction while walking in a virtual environment (VE) depicting translational optic flows (OFs) expanding from different directions. METHODS The authors evaluated 10 stroke patients and 11 healthy people while they were walking overground and viewing a VE in a helmet-mounted display. Participants were instructed to walk straight in the VE and were randomly exposed to an OF having a focus of expansion (FOE) located in 5 possible locations (0°, ±20°, and ±40° to the right or left). The body's center of mass (CoM) trajectory, heading direction, and horizontal body reorientation were recorded with a Vicon-512 system. RESULTS Healthy participants veered opposite to the FOE location in the physical world, with larger deviations occurring at the most eccentric FOE locations. Stroke patients displayed altered steering behaviors characterized either by an absence of CoM trajectory corrections, multiple errors in the heading direction, or systematic veering to the nonparetic side. Both groups displayed relatively small CoM trajectory corrections that led to large virtual heading errors. CONCLUSIONS The control of heading of locomotion in response to different OF directions is affected by stroke. An altered perception of heading direction and/or a poor integration of sensory and motor information are likely causes. This altered response to OF direction while walking may contribute to steering difficulties after stroke.
Affiliation(s)
- Anouk Lamontagne
- School of Physical and Occupational Therapy, McGill University and Jewish Rehabilitation Hospital (Feil & Oberfeld/CRIR) Research Center, Montréal, Canada.
7. Toussaint Y, Fagard J. A counterclockwise bias in running. Neurosci Lett 2008; 442:59-62. [DOI: 10.1016/j.neulet.2008.06.056]
8. Wagner J, Stephan T, Kalla R, Brückmann H, Strupp M, Brandt T, Jahn K. Mind the bend: cerebral activations associated with mental imagery of walking along a curved path. Exp Brain Res 2008; 191:247-55. [DOI: 10.1007/s00221-008-1520-8]
9. Schmuckler MA, Collimore LM, Dannemiller JL. Infants' reactions to object collision on hit and miss trajectories. Infancy 2007; 12:105-118. [DOI: 10.1111/j.1532-7078.2007.tb00236.x]
10. Lichtenstein L, Barabas J, Woods RL, Peli E. A feedback-controlled interface for treadmill locomotion in virtual environments. ACM Trans Appl Percept 2007; 4:7. [PMID: 18167515] [PMCID: PMC2132658] [DOI: 10.1145/1227134.1227141]
Abstract
Virtual environments (VEs) allow safe, repeatable, and controlled evaluations of obstacle avoidance and navigation performance of people with visual impairments using visual aids. Proper simulation of mobility in a VE requires an interface which allows subjects to set their walking pace. Using conventional treadmills, the subject can change their walking speed by pushing the tread with their feet while leveraging handrails or ropes (self-propelled mode). We developed a feedback-controlled locomotion interface that allows the VE workstation to control the speed of the treadmill based on the position of the user. The position and speed information is also used to implement automated safety measures, so that the treadmill can be halted in case of erratic behavior. We compared the feedback-controlled mode to the self-propelled mode by using speed-matching tasks (follow a moving object or match the speed of an independently moving scene) to measure the efficacy of each mode in maintaining constant subject position, subject control of the treadmill, and subject pulse rates. Additionally, we measured the perception of speed in the VE in each mode. The feedback-controlled mode required less physical exertion than the self-propelled mode. The average position of subjects on the feedback-controlled treadmill was always within a centimeter of the desired position. There was a smaller standard deviation in subject position when using the self-propelled mode than when using the feedback-controlled mode, but the difference averaged less than six centimeters across all subjects walking at a constant speed. Although all subjects underestimated the speed of an independently moving scene at higher speeds, their estimates were more accurate when using the feedback-controlled treadmill than with the self-propelled mode.
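The position-based speed control described above can be illustrated with a simple proportional controller. This is only a sketch of the idea; the gain, speed cap, and halt threshold are assumed values, not the parameters of the actual interface:

```python
# Sketch of feedback-controlled treadmill speed: the belt speeds up when the
# user drifts forward of the desired position and slows when they drift back.
# Gain, speed cap, and halt threshold are assumptions made for illustration.

def update_belt_speed(current_speed, user_pos, target_pos=0.0,
                      gain=0.5, max_speed=2.0, halt_dist=0.8):
    """Return the new belt speed (m/s) given the user's position (m, + = forward).

    If the user strays too far from the target position (erratic behavior),
    halt the treadmill as an automated safety measure.
    """
    error = user_pos - target_pos
    if abs(error) > halt_dist:
        return 0.0  # emergency stop
    new_speed = current_speed + gain * error
    return min(max(new_speed, 0.0), max_speed)

# User 0.2 m ahead of target while belt runs at 1.0 m/s: belt speeds up slightly.
print(update_belt_speed(1.0, 0.2))  # -> 1.1
```

Closing the loop this way is what lets the interface keep the user near a fixed position (within a centimeter on average, per the abstract) without the physical exertion of pushing the tread.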
11. Jahn K, Kalla R, Karg S, Strupp M, Brandt T. Eccentric eye and head positions in darkness induce deviation from the intended path. Exp Brain Res 2006; 174:152-7. [PMID: 16604319] [DOI: 10.1007/s00221-006-0431-9]
Abstract
Head and gaze are aligned with the actual path during locomotion. Before a turn is made, gaze changes in the direction of the planned trajectory. We investigated whether eccentric horizontal head and/or eye position without vision causes deviations from the intended straight path. Twenty blindfolded healthy volunteers were asked to walk toward a previously seen target 10 m straight ahead. Various combinations of head and eye positions were tested (eye-in-head gaze straight ahead or 35° left or right, with head straight ahead or 70° left or right). Head rotation to the left caused a gait deviation to the right (3.7°) and head rotation to the right caused a deviation to the left (2.7°; F(2,40) = 34.966; P < 0.00001). Eye position also showed a tendency to cause gait deviations opposite in direction to gaze, which was, however, not significant. Deviations from the intended straight path were largest with head rotation and eyes straight ahead (gaze 70° off target) or eyes opposite to head rotation (gaze 35° off target). Notably, when lateral eye deviation added to head rotation (gaze 105° off target), i.e., gaze directed backward, mean deviations decreased (2.3° to the right and 1.2° to the left). Thus, we show (1) that eccentric head positions induce direction-specific gait deviations that are independent of concurrent environmental visual information, and (2) that gait deviations are contraversive to eye-head gaze rather than ipsiversive as reported by others for visually controlled locomotion. The direction of deviation may reflect the compensation of an expected or perceived deviation in the direction of gaze.
Affiliation(s)
- Klaus Jahn
- Klinikum Grosshadern, Department of Neurology, Ludwig-Maximilians University, Marchioninistrasse 15, 81377 Munich, Germany.