1. Jerjian SJ, Harsch DR, Fetsch CR. Self-motion perception and sequential decision-making: where are we heading? Philos Trans R Soc Lond B Biol Sci 2023; 378:20220333. PMID: 37545301; PMCID: PMC10404932; DOI: 10.1098/rstb.2022.0333.
Abstract
To navigate and guide adaptive behaviour in a dynamic environment, animals must accurately estimate their own motion relative to the external world. This is a fundamentally multisensory process involving integration of visual, vestibular and kinesthetic inputs. Ideal observer models, paired with careful neurophysiological investigation, helped to reveal how visual and vestibular signals are combined to support perception of linear self-motion direction, or heading. Recent work has extended these findings by emphasizing the dimension of time, both with regard to stimulus dynamics and the trade-off between speed and accuracy. Both time and certainty, i.e. the degree of confidence in a multisensory decision, are essential to the ecological goals of the system: terminating a decision process is necessary for timely action, and predicting one's accuracy is critical for making multiple decisions in a sequence, as in navigation. Here, we summarize a leading model for multisensory decision-making, then show how the model can be extended to study confidence in heading discrimination. Lastly, we preview ongoing efforts to bridge self-motion perception and navigation per se, including closed-loop virtual reality and active self-motion. The design of unconstrained, ethologically inspired tasks, accompanied by large-scale neural recordings, holds promise for a deeper understanding of spatial perception and decision-making in the behaving animal. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
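The ideal observer framework referenced in this abstract combines cues weighted by their reliability (inverse variance). A minimal numerical sketch, with illustrative values not taken from the paper:

```python
import numpy as np

def combine_cues(means, sigmas):
    """Reliability-weighted (maximum-likelihood) cue combination.

    Each cue contributes weight proportional to its inverse variance;
    the combined estimate is more precise than either cue alone.
    """
    means = np.asarray(means, dtype=float)
    var = np.asarray(sigmas, dtype=float) ** 2
    weights = (1.0 / var) / np.sum(1.0 / var)
    mu = np.sum(weights * means)                  # combined estimate
    sigma = np.sqrt(1.0 / np.sum(1.0 / var))      # combined uncertainty
    return mu, sigma

# Hypothetical heading estimates (degrees): a visual cue of 2.0 deg with
# sigma 1.0, and a vestibular cue of -1.0 deg with sigma 2.0.
mu, sigma = combine_cues([2.0, -1.0], [1.0, 2.0])
```

Here the more reliable visual cue dominates (weight 0.8 vs. 0.2), and the combined standard deviation falls below that of either single cue.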
Affiliation(s)
- Steven J. Jerjian: Solomon H. Snyder Department of Neuroscience, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA
- Devin R. Harsch: Solomon H. Snyder Department of Neuroscience, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA; Center for Neuroscience and Department of Neurobiology, University of Pittsburgh, Pittsburgh, PA 15213, USA
- Christopher R. Fetsch: Solomon H. Snyder Department of Neuroscience, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA
2. Muller KS, Matthis J, Bonnen K, Cormack LK, Huk AC, Hayhoe M. Retinal motion statistics during natural locomotion. eLife 2023; 12:e82410. PMID: 37133442; PMCID: PMC10156169; DOI: 10.7554/eLife.82410.
Abstract
Walking through an environment generates retinal motion, which humans rely on to perform a variety of visual tasks. Retinal motion patterns are determined by an interconnected set of factors, including gaze location, gaze stabilization, the structure of the environment, and the walker's goals. The characteristics of these motion signals have important consequences for neural organization and behavior. However, to date, there are no empirical in situ measurements of how combined eye and body movements interact with real 3D environments to shape the statistics of retinal motion signals. Here, we collect measurements of the eyes, the body, and the 3D environment during locomotion. We describe properties of the resulting retinal motion patterns. We explain how these patterns are shaped by gaze location in the world, as well as by behavior, and how they may provide a template for the way motion sensitivity and receptive field properties vary across the visual field.
Affiliation(s)
- Karl S Muller: Center for Perceptual Systems, The University of Texas at Austin, Austin, United States
- Jonathan Matthis: Department of Biology, Northeastern University, Boston, United States
- Kathryn Bonnen: School of Optometry, Indiana University, Bloomington, United States
- Lawrence K Cormack: Center for Perceptual Systems, The University of Texas at Austin, Austin, United States
- Alex C Huk: Center for Perceptual Systems, The University of Texas at Austin, Austin, United States
- Mary Hayhoe: Center for Perceptual Systems, The University of Texas at Austin, Austin, United States
3. van Helvert MJL, Selen LPJ, van Beers RJ, Medendorp WP. Predictive steering: integration of artificial motor signals in self-motion estimation. J Neurophysiol 2022; 128:1395-1408. PMID: 36350058; DOI: 10.1152/jn.00248.2022.
Abstract
The brain's computations for active and passive self-motion estimation can be unified in a single model that optimally combines vestibular and visual signals with sensory predictions based on efference copies. It is unknown whether this theoretical framework also applies to the integration of artificial motor signals, such as those that occur when driving a car, or whether self-motion estimation in this situation relies solely on feedback control. Here, we examined whether training humans to control a self-motion platform leads to the construction of an accurate internal model of the mapping between the steering movement and the vestibular reafference. Participants (n = 15) sat on a linear motion platform and actively controlled the platform's velocity using a steering wheel to translate their body to a memorized visual target (motion condition). We compared their steering behavior with that of participants (n = 15) who remained stationary and instead aligned a nonvisible line with the target (stationary condition). To probe learning, the gain between the steering wheel angle and the platform or line velocity changed abruptly twice during the experiment. These gain changes were virtually undetectable in the displacement error in the motion condition, whereas clear deviations were observed in the stationary condition, showing that participants in the motion condition made within-trial changes to their steering behavior. We conclude that vestibular feedback allows not only the online control of steering but also a rapid adaptation to the gain changes to update the brain's internal model of the mapping between the steering movement and the vestibular reafference.
NEW & NOTEWORTHY Perception of self-motion is known to depend on the integration of sensory signals and, when the motion is self-generated, the predicted sensory reafference based on motor efference copies. Here we show, using a closed-loop steering experiment with a direct coupling between the steering movement and the vestibular self-motion feedback, that humans are also able to integrate artificial motor signals, such as those that occur when driving a car.
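The logic of this experiment can be illustrated with a toy simulation (an illustrative sketch under assumed dynamics, not the authors' model): a controller steers a platform toward a target, with platform velocity equal to gain times wheel angle. With vestibular feedback, position is estimated from the true velocity, so an abrupt gain change barely affects the final displacement; without feedback, the estimate relies on an assumed gain, and a gain change produces a systematic error.

```python
def steer(gain, feedback=True, assumed_gain=1.0, target=1.0, dt=0.01, T=5.0, k=2.0):
    """Drive a platform to `target` by turning a wheel; platform velocity
    is gain * wheel angle. With vestibular feedback, position is estimated
    from the true velocity; without it, from an assumed (learned) gain."""
    pos = est = 0.0
    for _ in range(int(T / dt)):
        wheel = k * (target - est)          # steer toward the remaining distance
        vel = gain * wheel                  # actual platform velocity
        pos += vel * dt
        # Position estimate: true velocity (vestibular) vs. predicted velocity.
        est += (vel if feedback else assumed_gain * wheel) * dt
    return pos

with_feedback = steer(gain=1.5)                      # unexpected gain, feedback on
without_feedback = steer(gain=1.5, feedback=False)   # same gain change, no feedback
```

With feedback the displacement lands on the target despite the altered gain; without it the final position scales with the gain mismatch (here overshooting toward 1.5), mirroring the motion vs. stationary conditions described above.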
Affiliation(s)
- Milou J L van Helvert: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Luc P J Selen: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Robert J van Beers: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands; Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- W Pieter Medendorp: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
4. Modeling Physiological Sources of Heading Bias from Optic Flow. eNeuro 2021; 8:ENEURO.0307-21.2021. PMID: 34642226; PMCID: PMC8607907; DOI: 10.1523/eneuro.0307-21.2021.
Abstract
Human heading perception from optic flow is accurate for directions close to straight ahead, and systematic biases emerge in the periphery (Cuturi and MacNeilage, 2013; Sun et al., 2020). In pursuit of the underlying neural mechanisms, the dorsal medial superior temporal area (MSTd) of the primate brain has been a focus because of its causal link with heading perception (Gu et al., 2012). Computational models generally explain heading sensitivity in individual MSTd neurons as a feedforward integration of motion signals from the middle temporal (MT) area that resemble full-field optic flow patterns consistent with the preferred heading direction (Britten, 2008; Mineault et al., 2012). In the present simulation study, we quantified, within the structure of this feedforward model, how physiological properties of MT and MSTd shape heading signals. We found that known physiological tuning characteristics generally supported the accuracy of heading estimation, but not always. A weak-to-moderate overrepresentation of peripheral headings in MSTd garnered the highest accuracy and precision of the models we tested. The model also performed well when noise corrupted a high proportion of the optic flow vectors. Such a peripheral MSTd model performed well when units possessed a range of receptive field (RF) sizes and were strongly direction tuned. Physiological biases in MT direction tuning toward the radial direction also supported heading estimation, but the tendency for MT preferred speed and RF size to scale with eccentricity did not. Our findings help elucidate the extent to which different physiological tuning properties influence the accuracy and precision of neural heading signals.
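The feedforward template-matching idea in this class of models can be sketched in a few lines (a simplified illustration, not the paper's implementation): each model MSTd unit stores a radial flow template centered on a candidate focus of expansion (FOE), and the heading estimate is the preferred FOE of the best-matching unit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample the visual field at a grid of locations (arbitrary units).
xs = np.linspace(-1, 1, 9)
grid = np.array([(x, y) for x in xs for y in xs])

def unit_radial(foe):
    """Unit flow vectors radiating from a focus of expansion (FOE)."""
    v = grid - foe
    n = np.linalg.norm(v, axis=1, keepdims=True)
    n[n == 0] = 1.0
    return v / n

# Candidate headings: templates tuned to different FOEs (model MSTd units).
candidates = [np.array([x, y]) for x in (-0.5, 0.0, 0.5) for y in (-0.5, 0.0, 0.5)]
templates = [unit_radial(c) for c in candidates]

def estimate_heading(flow):
    """Heading estimate = preferred FOE of the unit with the largest
    summed dot product between the observed flow and its template."""
    responses = [np.sum(flow * t) for t in templates]
    return candidates[int(np.argmax(responses))]

# Noisy radial flow for a true heading at FOE (0.5, 0.0).
true_foe = np.array([0.5, 0.0])
flow = unit_radial(true_foe) + 0.1 * rng.standard_normal(grid.shape)
estimate = estimate_heading(flow)
```

The argmax readout recovers the true FOE even with noisy flow vectors; the paper's question, in these terms, is how tuning properties of the inputs and templates bias that readout.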
5. Churan J, Kaminiarz A, Schwenk JCB, Bremmer F. Action-dependent processing of self-motion in parietal cortex of macaque monkeys. J Neurophysiol 2021; 125:2432-2443. PMID: 34010579; DOI: 10.1152/jn.00049.2021.
Abstract
Successful interaction with the environment requires the dissociation of self-induced from externally induced sensory stimulation. Temporal proximity of action and effect is often used as an indicator of whether an observed event should be interpreted as a result of one's own actions. We tested how the delay between an action (press of a touch bar) and an effect (onset of simulated self-motion) influences the processing of visually simulated self-motion in the ventral intraparietal area (VIP) of macaque monkeys. We found that a delay between the action and the start of the self-motion stimulus led to a rise in activity above baseline before motion onset in a subpopulation of 21% of the investigated neurons. In the responses to the stimulus, we found significantly lower sustained activity when the press of the touch bar and the motion onset were contiguous compared with the condition in which the motion onset was delayed. We speculate that this weak inhibitory effect might be part of a mechanism that sharpens the tuning of VIP neurons during self-induced motion and thus has the potential to increase the precision of heading information required to adjust the orientation of self-motion in everyday navigational tasks.
NEW & NOTEWORTHY Neurons in the macaque ventral intraparietal area (VIP) respond to sensory stimulation related to self-motion, e.g., visual optic flow. Here, we found that self-motion-induced activation depends on the sense of agency, i.e., it differed when optic flow was perceived as self-induced or externally induced. This demonstrates that area VIP is well suited for studying the interplay between active behavior and sensory processing during self-motion.
Affiliation(s)
- Jan Churan: Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
- Andre Kaminiarz: Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
- Jakob C B Schwenk: Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
- Frank Bremmer: Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
6. Lockwood CT, Duffy CJ. Hyperexcitability in Aging Is Lost in Alzheimer's: What Is All the Excitement About? Cereb Cortex 2020; 30:5874-5884. PMID: 32548625; DOI: 10.1093/cercor/bhaa163.
Abstract
Neuronal hyperexcitability has emerged as a potential biomarker of late-onset early-stage Alzheimer's disease (LEAD). We hypothesize that aging-related posterior cortical hyperexcitability anticipates the loss of excitability with the emergence of impairment in LEAD. To test this hypothesis, we compared the behavioral and neurophysiological responses of young normal adults, older normal (ON) adults, and LEAD patients during a visuospatial attentional control task. ONs show frontal cortical signal incoherence and posterior cortical hyper-responsiveness with preserved attentional control. LEADs lose the posterior hyper-responsiveness and fail in the attentional task. Our findings suggest that signal incoherence and cortical hyper-responsiveness in aging may contribute to the development of functional impairment in LEAD.
Affiliation(s)
- Colin T Lockwood: Departments of Neurology and Brain and Cognitive Sciences, University of Rochester Medical Center, Rochester, NY 14642, USA
- Charles J Duffy: Departments of Neurology and Brain and Cognitive Sciences, University of Rochester Medical Center, Rochester, NY 14642, USA
7. Cullen KE. Vestibular processing during natural self-motion: implications for perception and action. Nat Rev Neurosci 2019; 20:346-363. PMID: 30914780; PMCID: PMC6611162; DOI: 10.1038/s41583-019-0153-1.
Abstract
How the brain computes accurate estimates of our self-motion relative to the world and our orientation relative to gravity in order to ensure accurate perception and motor control is a fundamental neuroscientific question. Recent experiments have revealed that the vestibular system encodes this information during everyday activities using pathway-specific neural representations. Furthermore, new findings have established that vestibular signals are selectively combined with extravestibular information at the earliest stages of central vestibular processing in a manner that depends on the current behavioural goal. These findings have important implications for our understanding of the brain mechanisms that ensure accurate perception and behaviour during everyday activities and for our understanding of disorders of vestibular processing.
Affiliation(s)
- Kathleen E Cullen: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD, USA
8. Lockwood CT, Vaughn W, Duffy CJ. Attentional ERPs distinguish aging and early Alzheimer's dementia. Neurobiol Aging 2018; 70:51-58. PMID: 29960173; DOI: 10.1016/j.neurobiolaging.2018.05.022.
Abstract
The early detection of Alzheimer's disease requires distinguishing it from cognitive aging. Here, we test whether spatial attentional changes might support that distinction. We engaged young normal (YN) adults, older normal (ON) adults, and patients with early Alzheimer's dementia (EAD) in an attentionally cued, self-movement heading discrimination task while we recorded push-button response times and event-related potentials (ERPs). YNs and ONs show the behavioral effects of attentional shifts from the cue to the target, whereas EAD patients did not (p < 0.001). YNs and ONs also show the shifting lateralization of a newly described attentional ERP component, whereas EAD patients did not (p < 0.001). Our findings suggest that spatial inattention in EAD patients may contribute to heading direction processing impairments that distinguish them from ONs and undermine their navigational capacity and driving safety.
Affiliation(s)
- Colin T Lockwood: Departments of Neurology, Brain and Cognitive Sciences, Ophthalmology, The Center for Visual Science, The University of Rochester Medical Center, Rochester, NY 14642-0673, USA
- William Vaughn: Departments of Neurology, Brain and Cognitive Sciences, Ophthalmology, The Center for Visual Science, The University of Rochester Medical Center, Rochester, NY 14642-0673, USA
- Charles J Duffy: Departments of Neurology, Brain and Cognitive Sciences, Ophthalmology, The Center for Visual Science, The University of Rochester Medical Center, Rochester, NY 14642-0673, USA
9. Page WK, Duffy CJ. Path perturbation detection tasks reduce MSTd neuronal self-movement heading responses. J Neurophysiol 2017; 119:124-133. PMID: 29046430; DOI: 10.1152/jn.00958.2016.
Abstract
We presented optic flow and real-movement heading stimuli while recording MSTd neuronal activity. Monkeys alternately engaged in three tasks: visual detection of optic flow heading perturbations, vestibular detection of real-movement heading perturbations, and auditory detection of brief tones. Push-button reaction times (RTs) were fastest for tones and slower for visual and vestibular heading perturbations, suggesting that the tone detection task was easier. Neuronal heading selectivity was strongest during the tone detection task and weaker during the visual and vestibular heading perturbation detection tasks, even though heading cues were presented only in the visual and vestibular modalities. We conclude that focusing on the self-movement transients of path perturbations distracted the monkeys from their heading and reduced neuronal responsiveness to heading direction.
NEW & NOTEWORTHY Heading analysis is critical for steering and navigation. We recorded the activity of monkey cortical heading neurons during naturalistic self-movement. When the monkeys were required to respond to transient changes in their path, neuronal responses to heading direction were diminished. This suggests that the need to respond to momentary path perturbations reduces the ability to process heading direction.
Affiliation(s)
- William K Page: Departments of Neurology, Neurobiology and Anatomy, Ophthalmology, Brain and Cognitive Sciences, and The Center for Visual Science, The University of Rochester Medical Center, Rochester, New York
- Charles J Duffy: Departments of Neurology, Neurobiology and Anatomy, Ophthalmology, Brain and Cognitive Sciences, and The Center for Visual Science, The University of Rochester Medical Center, Rochester, New York