1
Sun Q, Wang JY, Gong XM. Conflicts between short- and long-term experiences affect visual perception through modulating sensory or motor response systems: Evidence from Bayesian inference models. Cognition 2024; 246:105768. [PMID: 38479091 DOI: 10.1016/j.cognition.2024.105768]
Abstract
The independent effects of short- and long-term experiences on visual perception have been discussed for decades. However, no study has investigated whether and how these experiences simultaneously affect our visual perception. To address this question, we asked participants to estimate their self-motion directions (i.e., headings) simulated from optic flow, a task in which a long-term experience learned in everyday life (i.e., straight-forward motion being more common than lateral motion) plays an important role. The headings were selected from three distributions that resembled a peak, a hill, and a flat line, creating different short-term experiences. Importantly, the proportion of headings deviating from straight-forward motion gradually increased across the peak, hill, and flat distributions, producing a growing conflict between long- and short-term experiences. The results showed that participants biased their heading estimates towards the straight-ahead direction and towards previously seen headings, and both biases increased as the experience conflict grew. This suggests that long- and short-term experiences simultaneously affect visual perception. Finally, we developed two Bayesian models based on two assumptions: that the experience conflict altered the likelihood distribution of the sensory representation (Model 1) or the motor response system (Model 2). Both models accurately predicted participants' estimation biases. However, Model 1 predicted a higher variance of serial dependence than Model 2, while Model 2 predicted a higher variance of the bias towards the straight-ahead direction than Model 1. This suggests that the experience conflict can influence visual perception by affecting both the sensory and motor response systems. Taken together, the current study systematically revealed the effects of long- and short-term experiences on visual perception and the underlying Bayesian processing mechanisms.
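The Bayesian account summarized above rests on the standard combination of a prior (here, a straight-ahead preference) with a sensory likelihood. As a minimal illustrative sketch of that combination, not the paper's actual models, the posterior mean for Gaussian prior and likelihood is a reliability-weighted average; the function name and parameter values below are assumptions for illustration only:

```python
def bayes_heading_estimate(sensory_deg, sigma_like, prior_mean_deg=0.0, sigma_prior=10.0):
    """Posterior mean for a Gaussian likelihood combined with a Gaussian prior.

    With Gaussian prior and likelihood, the posterior mean is a
    reliability-weighted average of the sensory measurement and the prior
    mean (straight ahead = 0 deg); a narrower likelihood pulls the
    estimate closer to the measurement, a narrower prior pulls it
    toward straight ahead.
    """
    w_like = (1.0 / sigma_like**2) / (1.0 / sigma_like**2 + 1.0 / sigma_prior**2)
    return w_like * sensory_deg + (1.0 - w_like) * prior_mean_deg

# Equal reliabilities: a 20 deg heading is pulled halfway to straight ahead.
print(bayes_heading_estimate(20.0, sigma_like=10.0))  # 10.0
# A more reliable measurement (sigma 5 vs. prior sigma 10) is biased less.
print(bayes_heading_estimate(20.0, sigma_like=5.0))   # ~16.0
```

The same formula shows why a growing experience conflict can be modeled either as a widened likelihood or as a change downstream of the estimate, as the two models in the abstract assume.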
Affiliation(s)
- Qi Sun
- Department of Psychology, Zhejiang Normal University, Jinhua, PR China; Intelligent Laboratory of Zhejiang Province in Mental Health and Crisis Intervention for Children and Adolescents, Jinhua, PR China; Key Laboratory of Intelligent Education Technology and Application of Zhejiang Province, Zhejiang Normal University, Jinhua, PR China.
- Jing-Yi Wang
- Department of Psychology, Zhejiang Normal University, Jinhua, PR China
- Xiu-Mei Gong
- Department of Psychology, Zhejiang Normal University, Jinhua, PR China
2
Sun Q, Zhan LZ, You FH, Dong XF. Attention affects the perception of self-motion direction from optic flow. iScience 2024; 27:109373. [PMID: 38500831 PMCID: PMC10946324 DOI: 10.1016/j.isci.2024.109373]
Abstract
Many studies have demonstrated that attention affects the perception of a wide range of visual features. However, previous studies report conflicting results regarding the effect of attention on the perception of self-motion direction (i.e., heading) from optic flow. To address this discrepancy, we conducted three behavioral experiments and found that estimation accuracy for large headings (>14°) decreased with attention load, that discrimination thresholds for these headings increased with attention load, and that heading estimates were systematically compressed toward the focus of attention. The current study therefore demonstrates that attention affects heading perception from optic flow, showing that this perception is both information-driven and cognitive.
Affiliation(s)
- Qi Sun
- School of Psychology, Zhejiang Normal University, Jinhua, P.R. China
- Zhejiang Philosophy and Social Science Laboratory for the Mental Health and Crisis Intervention of Children and Adolescents, Zhejiang Normal University, Jinhua, P.R. China
- Key Laboratory of Intelligent Education Technology and Application of Zhejiang Province, Zhejiang Normal University, Jinhua, P.R. China
- Lin-Zhe Zhan
- School of Psychology, Zhejiang Normal University, Jinhua, P.R. China
- Fan-Huan You
- School of Psychology, Zhejiang Normal University, Jinhua, P.R. China
- Xiao-Fei Dong
- School of Psychology, Zhejiang Normal University, Jinhua, P.R. China
3
Yang L, Jin M, Zhang C, Qian N, Zhang M. Distributions of Visual Receptive Fields from Retinotopic to Craniotopic Coordinates in the Lateral Intraparietal Area and Frontal Eye Fields of the Macaque. Neurosci Bull 2024; 40:171-181. [PMID: 37573519 PMCID: PMC10838878 DOI: 10.1007/s12264-023-01097-8]
Abstract
Even though retinal images of objects change their locations following each eye movement, we perceive a stable and continuous world. One possible mechanism by which the brain achieves such visual stability is to construct a craniotopic coordinate by integrating retinal and extraretinal information. There have been several proposals on how this may be done, including eye-position modulation (gain fields) of retinotopic receptive fields (RFs) and craniotopic RFs. In the present study, we investigated coordinate systems used by RFs in the lateral intraparietal (LIP) cortex and frontal eye fields (FEF) and compared the two areas. We mapped the two-dimensional RFs of neurons in detail under two eye fixations and analyzed how the RF of a given neuron changes with eye position to determine its coordinate representation. The same recording and analysis procedures were applied to the two brain areas. We found that, in both areas, RFs were distributed from retinotopic to craniotopic representations. There was no significant difference between the distributions in the LIP and FEF. Only a small fraction of neurons was fully craniotopic, whereas most neurons were between the retinotopic and craniotopic representations. The distributions were strongly biased toward the retinotopic side but with significant craniotopic shifts. These results suggest that there is only weak evidence for craniotopic RFs in the LIP and FEF, and that transformation from retinotopic to craniotopic coordinates in these areas must rely on other factors such as gain fields.
Affiliation(s)
- Lin Yang
- Key Laboratory of Cognitive Neuroscience and Learning, Division of Psychology, Beijing Normal University, Beijing, 100875, China
- Min Jin
- Key Laboratory of Cognitive Neuroscience and Learning, Division of Psychology, Beijing Normal University, Beijing, 100875, China
- Cong Zhang
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, 200031, China
- Ning Qian
- Department of Neuroscience and Zuckerman Institute, Columbia University, New York, 10027, USA
- Mingsha Zhang
- Key Laboratory of Cognitive Neuroscience and Learning, Division of Psychology, Beijing Normal University, Beijing, 100875, China
4
Zeng Z, Zhang C, Gu Y. Visuo-vestibular heading perception: a model system to study multi-sensory decision making. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220334. [PMID: 37545303 PMCID: PMC10404926 DOI: 10.1098/rstb.2022.0334]
Abstract
Integrating noisy signals across time as well as across sensory modalities, a process named multi-sensory decision making (MSDM), is an essential strategy for making more accurate and sensitive decisions in complex environments. Although this field is just emerging, recent work from different perspectives, including computational theory, psychophysical behaviour and neurophysiology, has begun to shed new light on MSDM. In the current review, we focus on MSDM using the model system of visuo-vestibular heading. Combining well-controlled behavioural paradigms on virtual-reality systems, single-unit recordings, causal manipulations and computational theory based on spiking activity, recent progress reveals that vestibular signals carry complex temporal dynamics in many brain regions, including unisensory, multi-sensory and sensory-motor association areas. This poses a challenge for the brain when integrating cues across time and across sensory modalities, for example with optic flow, which mainly contains a motion velocity signal. In addition, new evidence from higher-level decision-related areas, mostly in the posterior and frontal/prefrontal regions, helps revise the conventional view of how signals from different sensory modalities are processed, converged, and accumulated moment by moment through neural circuits to form a unified, optimal perceptual decision. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Zhao Zeng
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
- Ce Zhang
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
- Yong Gu
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
5
Liu B, Shan J, Gu Y. Temporal and spatial properties of vestibular signals for perception of self-motion. Front Neurol 2023; 14:1266513. [PMID: 37780704 PMCID: PMC10534010 DOI: 10.3389/fneur.2023.1266513]
Abstract
It is well recognized that the vestibular system is involved in numerous important cognitive functions, including self-motion perception, spatial orientation, locomotion, and vector-based navigation, in addition to basic reflexes such as oculomotor or body postural control. Consistent with this role, vestibular signals are found broadly in the brain, including several regions of the cerebral cortex, potentially allowing tight coordination with other sensory systems to improve the accuracy and precision of perception or action during self-motion. Recent neurophysiological studies in animal models at single-cell resolution indicate that vestibular signals exhibit complex spatiotemporal dynamics, which complicates identifying their exact functions and how they are integrated with signals from other modalities. For example, vestibular and optic-flow signals can be congruent or incongruent with respect to spatial tuning functions, reference frames, and temporal dynamics. Comprehensive studies, including behavioral tasks, neural recording across sensory and sensory-motor association areas, and causal-link manipulations, have provided some insights into the neural mechanisms underlying multisensory self-motion perception.
Affiliation(s)
- Bingyu Liu
- Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
- University of Chinese Academy of Sciences, Beijing, China
- Jiayu Shan
- Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
- University of Chinese Academy of Sciences, Beijing, China
- Yong Gu
- Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
- University of Chinese Academy of Sciences, Beijing, China
6
Gao W, Lin Y, Shen J, Han J, Song X, Lu Y, Zhan H, Li Q, Ge H, Lin Z, Shi W, Drugowitsch J, Tang H, Chen X. Diverse effects of gaze direction on heading perception in humans. Cereb Cortex 2023. [PMID: 36734278 DOI: 10.1093/cercor/bhac541]
Abstract
Gaze changes can misalign the spatial reference frames encoding visual and vestibular signals in cortex, which may affect heading discrimination. Here, by systematically manipulating the eye-in-head and head-on-body positions to change subjects' gaze direction, we tested heading discrimination with visual, vestibular, and combined stimuli in a reaction-time task in which subjects controlled their response time. We found that gaze changes induced substantial biases in perceived heading and increased subjects' discrimination thresholds and reaction times in all stimulus conditions. For the visual stimulus, the gaze effects were induced by changing the eye-in-world position, and the perceived heading was biased opposite to the gaze direction. In contrast, the vestibular gaze effects were induced by changing the eye-in-head position, and the perceived heading was biased in the gaze direction. Although the bias was reduced when the visual and vestibular stimuli were combined, integration of the two signals deviated substantially from the predictions of an extended diffusion model that accumulates evidence optimally over time and across sensory modalities. These findings reveal diverse gaze effects on heading discrimination and suggest that the transformation of spatial reference frames may underlie these effects.
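The diffusion-model benchmark invoked here accumulates reliability-weighted momentary evidence from both modalities to a decision bound. A generic minimal sketch of that idea, not the authors' extended model; the drift and noise values, inverse-variance weighting, and function name are illustrative assumptions:

```python
import numpy as np

def accumulate_to_bound(drift_vis, drift_vest, sigma_vis, sigma_vest,
                        bound=30.0, max_steps=10_000, seed=0):
    """Accumulate reliability-weighted visual + vestibular evidence to a bound.

    Each modality contributes momentary evidence (drift + Gaussian noise).
    Weighting each stream by its inverse variance before summing is the
    optimal linear combination; accumulation stops when either decision
    bound is crossed, yielding a choice and a decision time.
    """
    rng = np.random.default_rng(seed)
    w_vis = (1.0 / sigma_vis**2) / (1.0 / sigma_vis**2 + 1.0 / sigma_vest**2)
    dv = 0.0  # decision variable
    for step in range(1, max_steps + 1):
        e_vis = drift_vis + sigma_vis * rng.standard_normal()
        e_vest = drift_vest + sigma_vest * rng.standard_normal()
        dv += w_vis * e_vis + (1.0 - w_vis) * e_vest
        if abs(dv) >= bound:
            return (1 if dv > 0 else -1), step  # choice sign, decision time
    return 0, max_steps  # no decision within the time limit

# Congruent rightward cues with low noise reach the positive bound.
choice, rt = accumulate_to_bound(0.5, 0.5, 0.1, 0.1)
```

The finding above is that human behavior departs from this optimal benchmark when gaze is eccentric, which is exactly why such a reference model is useful.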
Affiliation(s)
- Wei Gao
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Yipeng Lin
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Jiangrong Shen
- College of Computer Science and Technology, Zhejiang University, 38 Zheda Road, Xihu District, Hangzhou 310027, China
- Jianing Han
- College of Computer Science and Technology, Zhejiang University, 38 Zheda Road, Xihu District, Hangzhou 310027, China
- Xiaoxiao Song
- Department of Liberal Arts, School of Art Administration and Education, China Academy of Art, 218 Nanshan Road, Shangcheng District, Hangzhou 310002, China
- Yukun Lu
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Huijia Zhan
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Qianbing Li
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Haoting Ge
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
- Zheng Lin
- Department of Psychiatry, Second Affiliated Hospital, School of Medicine, Zhejiang University, 88 Jiefang Road, Shangcheng District, Hangzhou 310009, China
- Wenlei Shi
- Center for the Study of the History of Chinese Language and Center for the Study of Language and Cognition, Zhejiang University, 866 Yuhangtang Road, Xihu District, Hangzhou 310058, China
- Jan Drugowitsch
- Department of Neurobiology, Harvard Medical School, Longwood Avenue 220, Boston, MA 02116, United States
- Huajin Tang
- College of Computer Science and Technology, Zhejiang University, 38 Zheda Road, Xihu District, Hangzhou 310027, China
- Xiaodong Chen
- Department of Neurology and Psychiatry of the Second Affiliated Hospital, College of Biomedical Engineering and Instrument Science, Interdisciplinary Institute of Neuroscience and Technology, School of Medicine, Zhejiang University, 268 Kaixuan Road, Jianggan District, Hangzhou 310029, China
7
The Effects of Depth Cues and Vestibular Translation Signals on the Rotation Tolerance of Heading Tuning in Macaque Area MSTd. eNeuro 2020; 7:ENEURO.0259-20.2020. [PMID: 33127626 PMCID: PMC7688306 DOI: 10.1523/eneuro.0259-20.2020]
Abstract
When the eyes rotate during translational self-motion, the focus of expansion (FOE) in optic flow no longer indicates heading, yet heading judgements are largely unbiased. Much emphasis has been placed on the role of extraretinal signals in compensating for the visual consequences of eye rotation. However, recent studies also support a purely visual mechanism of rotation compensation in heading-selective neurons. Computational theories support a visual compensatory strategy but require different visual depth cues. We examined the rotation tolerance of heading tuning in macaque area MSTd using two different virtual environments, a frontoparallel (2D) wall and a 3D cloud of random dots. Both environments contained rotational optic flow cues (i.e., dynamic perspective), but only the 3D cloud stimulus contained local motion parallax cues, which are required by some models. The 3D cloud environment did not enhance the rotation tolerance of heading tuning for individual MSTd neurons, nor the accuracy of heading estimates decoded from population activity, suggesting a key role for dynamic perspective cues. We also added vestibular translation signals to optic flow, to test whether rotation tolerance is enhanced by non-visual cues to heading. We found no benefit of vestibular signals overall, but a modest effect for some neurons with significant vestibular heading tuning. We also find that neurons with more rotation tolerant heading tuning typically are less selective to pure visual rotation cues. Together, our findings help to clarify the types of information that are used to construct heading representations that are tolerant to eye rotations.
8
Abstract
Previous work shows that observers can use information from optic flow to perceive the direction of self-motion (i.e. heading) and that perceived heading exhibits a bias towards the center of the display (center bias). More recent work shows that the brain is sensitive to serial correlations and the perception of current stimuli can be affected by recently seen stimuli, a phenomenon known as serial dependence. In the current study, we examined whether, apart from center bias, serial dependence could be independently observed in heading judgments and how adding noise to optic flow affected center bias and serial dependence. We found a repulsive serial dependence effect in heading judgments after factoring out center bias in heading responses. The serial effect expands heading estimates away from the previously seen heading to increase overall sensitivity to changes in heading directions. Both the center bias and repulsive serial dependence effects increased with increasing noise in optic flow, and the noise-dependent changes in the serial effect were consistent with an ideal observer model. Our results suggest that the center bias effect is due to a prior of the straight-ahead direction in the Bayesian inference account for heading perception, whereas the repulsive serial dependence is an effect that reduces response errors and has the added utility of counteracting the center bias in heading judgments.
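Serial dependence of the kind analyzed above is commonly quantified by relating each trial's estimation error to the difference between the previous and current stimulus, after removing other biases; a negative slope indicates the repulsive effect reported here. A minimal sketch of that standard analysis, not the authors' exact procedure (names and the synthetic data are illustrative):

```python
import numpy as np

def serial_dependence_slope(headings, estimates):
    """Slope relating current estimation error to the previous heading.

    For each trial n >= 1, the estimation error (estimate - heading) is
    regressed on (previous heading - current heading). A positive slope
    means attraction toward the previously seen heading; a negative
    slope means repulsion, the pattern reported in this abstract.
    """
    headings = np.asarray(headings, dtype=float)
    estimates = np.asarray(estimates, dtype=float)
    errors = (estimates - headings)[1:]
    prev_minus_curr = headings[:-1] - headings[1:]
    slope, _intercept = np.polyfit(prev_minus_curr, errors, 1)
    return slope

# Synthetic repulsive observer: each estimate is pushed 30% away from
# the previous heading, so the recovered slope is about -0.3.
headings = [0.0, 10.0, -5.0, 20.0, 3.0, -12.0]
estimates = [headings[0]] + [
    h - 0.3 * (prev - h) for prev, h in zip(headings, headings[1:])
]
print(serial_dependence_slope(headings, estimates))  # ~ -0.3
```

In practice a derivative-of-Gaussian fit is often preferred over a linear slope because the effect falls off at large heading differences; the linear version above is the simplest form of the idea.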
Affiliation(s)
- Qi Sun
- Department of Psychology, The University of Hong Kong, Hong Kong SAR
- Huihui Zhang
- School of Psychology, The University of Sydney, Sydney, Australia
- David Alais
- School of Psychology, The University of Sydney, Sydney, Australia
- Li Li
- Department of Psychology, The University of Hong Kong, Hong Kong SAR
- Faculty of Arts and Science, New York University Shanghai, Shanghai, People's Republic of China
- NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai, People's Republic of China
9
Ulozienė I, Totilienė M, Balnytė R, Kuzminienė A, Kregždytė R, Paulauskas A, Blažauskas T, Marozas V, Uloza V, Kaski D. Subjective visual vertical and visual dependency in patients with multiple sclerosis. Mult Scler Relat Disord 2020; 44:102255. [DOI: 10.1016/j.msard.2020.102255]
10
Retinal Stabilization Reveals Limited Influence of Extraretinal Signals on Heading Tuning in the Medial Superior Temporal Area. J Neurosci 2019; 39:8064-8078. [PMID: 31488610 DOI: 10.1523/jneurosci.0388-19.2019]
Abstract
Heading perception in primates depends heavily on visual optic-flow cues. Yet during self-motion, heading percepts remain stable, even though smooth-pursuit eye movements often distort optic flow. According to theoretical work, self-motion can be represented accurately by compensating for these distortions in two ways: via retinal mechanisms or via extraretinal efference-copy signals, which predict the sensory consequences of movement. Psychophysical evidence strongly supports the efference-copy hypothesis, but physiological evidence remains inconclusive. Neurons that signal the true heading direction during pursuit are found in visual areas of monkey cortex, including the dorsal medial superior temporal area (MSTd). Here we measured heading tuning in MSTd using a novel stimulus paradigm, in which we stabilize the optic-flow stimulus on the retina during pursuit. This approach isolates the effects on neuronal heading preferences of extraretinal signals, which remain active while the retinal stimulus is prevented from changing. Our results from 3 female monkeys demonstrate a significant but small influence of extraretinal signals on the preferred heading directions of MSTd neurons. Under our stimulus conditions, which are rich in retinal cues, we find that retinal mechanisms dominate physiological corrections for pursuit eye movements, suggesting that extraretinal cues, such as predictive efference-copy mechanisms, have a limited role under naturalistic conditions.
SIGNIFICANCE STATEMENT: Sensory systems discount stimulation caused by an animal's own behavior. For example, eye movements cause irrelevant retinal signals that could interfere with motion perception. The visual system compensates for such self-generated motion, but how this happens is unclear. Two theoretical possibilities are a purely visual calculation or one using an internal signal of eye movements to compensate for their effects. The latter can be isolated by experimentally stabilizing the image on a moving retina, but this approach has never been adopted to study motion physiology. Using this method, we find that extraretinal signals have little influence on activity in visual cortex, whereas visually based corrections for ongoing eye movements have stronger effects and are likely most important under real-world conditions.
11
Bronstein AM. A conceptual model of the visual control of posture. Prog Brain Res 2019; 248:285-302. [PMID: 31239139 DOI: 10.1016/bs.pbr.2019.04.023]
Abstract
In order to isolate the visual contribution to the control of postural balance, experiments in which subjects are exposed to large-field visual motion (optokinetic) stimuli are reviewed. In these situations, at motion onset, the visual stimulus signals self-motion but inertial (vestibulo-proprioceptive) cues do not. The visually evoked postural responses (VEPR) thus induced can be quickly suppressed by cognitive status or simple repetition of the stimulus, provided the inertial self-motion cues available to the subject are reliable. In the conceptual model presented here, the process of assessing the reliability, and degree of matching, of visual and inertial signals is carried out by a General comparator, which in turn accesses the Gain control mechanism of the visuo-postural system. Complexity and congruency of the visual stimulus itself are assessed by a Visual comparator; for example, the presence of motion parallax in the visual stimulus can reverse the direction of the sway response. VEPR can also be re-oriented according to the position of the eyes in the head and of the head on the trunk, indicating that ocular and cervical proprioceptors must also access the gain control mechanism so that visual stimuli can recruit and silence different postural muscles appropriately. The overall gain of the visuo-postural system is also influenced by less easily defined idiosyncratic factors, such as visual dependence and psychological traits; interestingly, both factors have been associated with poor long-term outcome in vestibular disorders. The experimental results and model presented illustrate that the visuo-postural system is a prime example of the interaction between physics (e.g., stimulus geometry, body dynamics), neuroscience, and the border zone between neurology and psychosomatic medicine.
Affiliation(s)
- Adolfo M Bronstein
- Neuro-Otology Unit, Division of Brain Sciences, Imperial College London, Charing Cross Hospital, London, United Kingdom.
12
Rodriguez R, Crane BT. Effect of range of heading differences on human visual-inertial heading estimation. Exp Brain Res 2019; 237:1227-1237. [PMID: 30847539 DOI: 10.1007/s00221-019-05506-1]
Abstract
Both visual and inertial cues are salient for heading determination. However, optic flow can ambiguously represent either self-motion or environmental motion, and it is unclear how the brain determines whether visual and inertial heading cues share a common cause and should be integrated, or should instead be perceived independently. In four experiments, visual and inertial headings were presented simultaneously, with ten subjects reporting visual or inertial headings in separate trial blocks. Experiment 1 examined inertial headings within 30° of straight ahead and visual headings offset by up to 60°. Perception of the inertial heading was shifted in the direction of the visual stimulus by as much as 35° at the 60° offset, while perception of the visual stimulus remained largely uninfluenced. Experiment 2 used a ±140° range of inertial headings with visual offsets up to 120°. This experiment found variable behavior between subjects: most perceived the sensory stimuli as shifted towards an intermediate heading, but a few perceived the headings independently. The visual and inertial headings influenced each other even at the largest offsets. Experiments 3 and 4 used inertial headings similar to Experiments 1 and 2, respectively, except that subjects reported the direction of environmental motion. Experiment 4 displayed perceptual influences similar to Experiment 2, but in Experiment 3 percepts were independent. The results suggest that most subjects perceive visual and inertial stimuli as having a common cause at offsets up to 90°, although with significant variation between individuals. Limiting the range of inertial headings caused the visual heading to dominate perception.
Affiliation(s)
- Raul Rodriguez
- Department of Bioengineering, University of Rochester, 601 Elmwood Avenue, Box 629, Rochester, NY, 14642, USA
- Benjamin T Crane
- Department of Bioengineering, University of Rochester, 601 Elmwood Avenue, Box 629, Rochester, NY, 14642, USA
- Department of Otolaryngology, University of Rochester, 601 Elmwood Avenue, Box 629, Rochester, NY, 14642, USA
- Department of Neuroscience, University of Rochester, 601 Elmwood Avenue, Box 629, Rochester, NY, 14642, USA
13
Abstract
Detection of the state of self-motion, such as the instantaneous heading direction, the traveled trajectory, and the traveled distance or time, is critical for efficient spatial navigation. Numerous psychophysical studies have indicated that the vestibular system, originating from the otoliths and semicircular canals in the inner ear, provides robust signals for different aspects of self-motion perception. In addition, vestibular signals interact with other sensory signals, such as visual optic flow, to facilitate natural navigation. These behavioral results are consistent with recent findings in neurophysiological studies. In particular, vestibular activity in response to translation or rotation of the head/body in darkness has been revealed in a growing number of cortical regions, many of which are also sensitive to visual motion stimuli. The temporal dynamics of vestibular activity in the central nervous system vary widely, ranging from acceleration-dominant to velocity-dominant. Signals with different temporal dynamics may be decoded by higher-level areas for different functions. For example, acceleration signals during translation of the body in the horizontal plane may be used by the brain to estimate heading directions. Although translation and rotation signals arise from independent peripheral organs, that is, the otoliths and canals, respectively, they frequently converge onto single neurons in the central nervous system, including both the brainstem and the cerebral cortex. These convergent neurons typically exhibit stronger responses during combined curved motion trajectories, which may serve as the neural correlate of complex path perception. During spatial navigation, traveled distance or time may be encoded by different populations of neurons in multiple regions, including the hippocampal-entorhinal system, posterior parietal cortex, and frontal cortex.
Affiliation(s)
- Zhixian Cheng
- Department of Neuroscience, Yale School of Medicine, New Haven, CT, United States
- Yong Gu
- Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
14
Effect of vibration during visual-inertial integration on human heading perception during eccentric gaze. PLoS One 2018; 13:e0199097. [PMID: 29902253 PMCID: PMC6002115 DOI: 10.1371/journal.pone.0199097]
Abstract
Heading direction is determined from visual and inertial cues. Visual headings are represented in retinal coordinates while inertial headings use body coordinates, so during eccentric gaze the same heading may be perceived differently by the visual and inertial modalities. Stimulus weights should depend on the relative reliability of the stimuli, but previous work suggests that the inertial heading may be given more weight than predicted. Those experiments varied only the reliability of the visual stimulus, and it is unclear what occurs when inertial reliability is varied. Five human subjects completed a heading discrimination task using 2 s of translation with a peak velocity of 16 cm/s. Eye position was ±25° left/right with visual, inertial, or combined motion. The visual motion coherence was 50%. Inertial stimuli included 6 Hz vertical vibration with 0, 0.10, 0.15, or 0.20 cm amplitude. Subjects reported the perceived heading relative to the midline. With an inertial heading, perception was biased 3.6° towards the gaze direction; visual headings biased perception 9.6° opposite to gaze. The inertial threshold without vibration was 4.8°, which increased significantly to 8.8° with vibration, but the amplitude of vibration did not influence reliability. For visual-inertial headings, empirical stimulus weights were calculated from the bias and compared with the optimal weights calculated from the thresholds. In two subjects the empirical weights were near optimal, while in the remaining three subjects the inertial stimuli were weighted more than optimal predictions. On average, the inertial stimulus was weighted more heavily than predicted. These results indicate that multisensory integration may not be a function of stimulus reliability when the reliability of the inertial stimulus is varied.
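The optimal weights referred to above come from the standard maximum-likelihood cue-combination rule: reliability is the inverse variance of each cue, and thresholds are proportional to cue standard deviations. A minimal sketch of that calculation; the 6° visual threshold is an assumed value for illustration (only the inertial thresholds, 4.8° and 8.8°, are reported in the abstract):

```python
def optimal_visual_weight(visual_threshold_deg, inertial_threshold_deg):
    """Maximum-likelihood (reliability-based) weight for the visual cue.

    Discrimination thresholds are proportional to cue standard deviations,
    so each cue's reliability is 1/threshold^2 and the optimal visual
    weight is sigma_inertial^2 / (sigma_visual^2 + sigma_inertial^2).
    The inertial weight is 1 minus this value.
    """
    return inertial_threshold_deg**2 / (
        visual_threshold_deg**2 + inertial_threshold_deg**2
    )

# Inertial thresholds from the abstract (4.8 deg without vibration,
# 8.8 deg with vibration); the 6 deg visual threshold is hypothetical.
w_visual_no_vibration = optimal_visual_weight(6.0, 4.8)    # ~0.39
w_visual_with_vibration = optimal_visual_weight(6.0, 8.8)  # ~0.68
# Degrading the inertial cue should shift weight toward vision; the study
# found subjects often weighted the inertial cue more than this predicts.
```

Comparing such predicted weights against the empirical weights recovered from perceptual biases is the standard test of optimal integration used in this literature.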