1. Kayser C, Debats N, Heuer H. Both stimulus-specific and configurational features of multiple visual stimuli shape the spatial ventriloquism effect. Eur J Neurosci 2024;59:1770-1788. PMID: 38230578. DOI: 10.1111/ejn.16251.
Abstract
Studies on multisensory perception often focus on simplistic conditions in which a single stimulus is presented per modality. Yet, in everyday life, we usually encounter multiple signals per modality. To understand how multiple signals within and across the senses are combined, we extended the classical audio-visual spatial ventriloquism paradigm to combine two visual stimuli with one sound. The individual visual stimuli presented in the same trial differed in their relative timing and spatial offsets to the sound, allowing us to contrast their individual and combined influence on sound localization judgements. We find that the ventriloquism bias is not dominated by a single visual stimulus but rather is shaped by the collective multisensory evidence. In particular, the contribution of an individual visual stimulus to the ventriloquism bias depends not only on its own relative spatio-temporal alignment to the sound but also on the spatio-temporal alignment of the other visual stimulus. We propose that this pattern of multi-stimulus multisensory integration reflects the evolution of evidence for sensory causal relations during individual trials, highlighting the need to extend established models of multisensory causal inference to more naturalistic conditions. Our data also suggest that this pattern of multisensory interactions extends to the ventriloquism aftereffect, a bias in sound localization observed in unisensory judgements following a multisensory stimulus.
Affiliation(s)
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Nienke Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
2. Bruns P, Thun C, Röder B. Quantifying accuracy and precision from continuous response data in studies of spatial perception and crossmodal recalibration. Behav Res Methods 2024;56:3814-3830. PMID: 38684625. PMCID: PMC11133116. DOI: 10.3758/s13428-024-02416-1.
Abstract
The ability to detect the absolute location of sensory stimuli can be quantified with either error-based metrics derived from single-trial localization errors or regression-based metrics derived from a linear regression of localization responses on the true stimulus locations. Here, we tested the agreement between these two approaches in estimating accuracy and precision in a large sample of 188 subjects who localized auditory stimuli from different azimuthal locations. A subsample of 57 subjects was subsequently exposed to audiovisual stimuli with a consistent spatial disparity before performing the sound localization test again, allowing us to additionally test which of the different metrics best assessed correlations between the amount of crossmodal spatial recalibration and baseline localization performance. First, our findings support a distinction between accuracy and precision. Localization accuracy was mainly reflected in the overall spatial bias and was moderately correlated with precision metrics. However, in our data, the variability of single-trial localization errors (variable error in error-based metrics) and the amount by which the eccentricity of target locations was overestimated (slope in regression-based metrics) were highly correlated, suggesting that intercorrelations between individual metrics need to be carefully considered in spatial perception studies. Second, exposure to spatially discrepant audiovisual stimuli resulted in a shift in bias toward the side of the visual stimuli (ventriloquism aftereffect) but did not affect localization precision. The size of the aftereffect shift in bias was at least partly explainable by unspecific test repetition effects, highlighting the need to account for inter-individual baseline differences in studies of spatial learning.
Affiliation(s)
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Caroline Thun
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
3. Lin R, Zeng F, Wang Q, Chen A. Cross-Modal Plasticity during Self-Motion Perception. Brain Sci 2023;13:1504. PMID: 38002465. PMCID: PMC10669852. DOI: 10.3390/brainsci13111504.
Abstract
To maintain stable and coherent perception in an ever-changing environment, the brain needs to continuously and dynamically calibrate information from multiple sensory sources, using sensory and non-sensory information in a flexible manner. Here, we review how the vestibular and visual signals are recalibrated during self-motion perception. We illustrate two different types of recalibration: one long-term cross-modal (visual-vestibular) recalibration concerning how multisensory cues recalibrate over time in response to a constant cue discrepancy, and one rapid-term cross-modal (visual-vestibular) recalibration concerning how recent prior stimuli and choices differentially affect subsequent self-motion decisions. In addition, we highlight the neural substrates of long-term visual-vestibular recalibration, with profound differences observed in neuronal recalibration across multisensory cortical areas. We suggest that multisensory recalibration is a complex process in the brain, is modulated by many factors, and requires the coordination of many distinct cortical areas. We hope this review will shed some light on research into the neural circuits of visual-vestibular recalibration and help develop a more generalized theory for cross-modal plasticity.
Affiliation(s)
- Rushi Lin
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, 3663 Zhongshan Road N., Shanghai 200062, China
- Fu Zeng
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, 3663 Zhongshan Road N., Shanghai 200062, China
- Qingjun Wang
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, 3663 Zhongshan Road N., Shanghai 200062, China
- Aihua Chen
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, 3663 Zhongshan Road N., Shanghai 200062, China
- NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai 200122, China
4. Bruns P, Röder B. Development and experience-dependence of multisensory spatial processing. Trends Cogn Sci 2023;27:961-973. PMID: 37208286. DOI: 10.1016/j.tics.2023.04.012.
Abstract
Multisensory spatial processes are fundamental for efficient interaction with the world. They include not only the integration of spatial cues across sensory modalities, but also the adjustment or recalibration of spatial representations to changing cue reliabilities, crossmodal correspondences, and causal structures. Yet how multisensory spatial functions emerge during ontogeny is poorly understood. New results suggest that temporal synchrony and enhanced multisensory associative learning capabilities first guide causal inference and initiate early coarse multisensory integration capabilities. These multisensory percepts are crucial for the alignment of spatial maps across sensory systems, and are used to derive more stable biases for adult crossmodal recalibration. The refinement of multisensory spatial integration with increasing age is further promoted by the inclusion of higher-order knowledge.
Affiliation(s)
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
5. Debats NB, Heuer H, Kayser C. Different time scales of common-cause evidence shape multisensory integration, recalibration and motor adaptation. Eur J Neurosci 2023;58:3253-3269. PMID: 37461244. DOI: 10.1111/ejn.16095.
Abstract
Perceptual coherence in the face of discrepant multisensory signals is achieved via the processes of multisensory integration, recalibration and sometimes motor adaptation. These supposedly operate on different time scales, with integration reducing immediate sensory discrepancies and recalibration and motor adaptation reflecting the cumulative influence of their recent history. Importantly, whether discrepant signals are bound during perception is guided by the brain's inference of whether they originate from a common cause. When combined, these two notions lead to the hypothesis that the time scales on which integration and recalibration (or motor adaptation) operate are associated with different time scales of evidence about a common cause underlying two signals. We tested this prediction in a well-established visuo-motor paradigm, in which human participants performed visually guided hand movements. The kinematic correlation between hand and cursor movements indicates their common origin, which allowed us to manipulate the common-cause evidence by titrating this correlation. Specifically, we dissociated hand and cursor signals during individual movements while preserving their correlation across the series of movement endpoints. Following our hypothesis, this manipulation reduced integration compared with a condition in which visual and proprioceptive signals were perfectly correlated. In contrast, recalibration and motor adaptation were not affected by this manipulation. This supports the notion that multisensory integration and recalibration deal with sensory discrepancies on different time scales guided by common-cause evidence: Integration is prompted by local common-cause evidence and reduces immediate discrepancies, whereas recalibration and motor adaptation are prompted by global common-cause evidence and reduce persistent discrepancies.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
6. Kayser C, Park H, Heuer H. Cumulative multisensory discrepancies shape the ventriloquism aftereffect but not the ventriloquism bias. PLoS One 2023;18:e0290461. PMID: 37607201. PMCID: PMC10443876. DOI: 10.1371/journal.pone.0290461.
Abstract
Multisensory integration and recalibration are two processes by which perception deals with discrepant signals. Both are often studied in the spatial ventriloquism paradigm. There, integration is probed by the presentation of discrepant audio-visual stimuli, while recalibration manifests as an aftereffect in subsequent judgements of unisensory sounds. Both biases are typically quantified against the degree of audio-visual discrepancy, reflecting the possibility that both may arise from common underlying multisensory principles. We tested a specific prediction of this: that both processes should also scale similarly with the history of multisensory discrepancies, i.e., the sequence of discrepancies in several preceding audio-visual trials. Analyzing data from ten experiments with randomly varying spatial discrepancies, we confirmed the expected dependency of each bias on the immediately presented discrepancy. In line with the aftereffect being a cumulative process, the aftereffect also scaled with the discrepancies presented in at least three preceding audio-visual trials. However, the ventriloquism bias did not depend on this three-trial history of multisensory discrepancies and also did not depend on the aftereffect biases in previous trials, making these two multisensory processes experimentally dissociable. These findings support the notion that the ventriloquism bias and the aftereffect reflect distinct functions, with integration maintaining a stable percept by reducing immediate sensory discrepancies and recalibration maintaining an accurate percept by accounting for consistent discrepancies.
Affiliation(s)
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Hame Park
- Department of Neurophysiology & Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
7. Kirsch W, Kunde W. Changes in body perception following virtual object manipulation are accompanied by changes of the internal reference scale. Sci Rep 2023;13:7137. PMID: 37130888. PMCID: PMC10154308. DOI: 10.1038/s41598-023-34311-8.
Abstract
Changes in body perception often arise when observers are confronted with related yet discrepant multisensory signals. Some of these effects are interpreted as outcomes of sensory integration of various signals, whereas related biases are ascribed to learning-dependent recalibration of the coding of individual signals. The present study explored whether the same sensorimotor experience entails changes in body perception that are indicative of multisensory integration and those that indicate recalibration. Participants enclosed visual objects with a pair of visual cursors controlled by finger movements. They then either judged their perceived finger posture (indicating multisensory integration) or produced a certain finger posture (indicating recalibration). An experimental variation of the size of the visual object resulted in systematic and opposite biases of the perceived and produced finger distances. This pattern of results is consistent with the assumption that multisensory integration and recalibration had a common origin in the task we used.
Affiliation(s)
- Wladimir Kirsch
- Department of Psychology, University of Würzburg, Röntgenring 11, 97070 Würzburg, Germany
- Wilfried Kunde
- Department of Psychology, University of Würzburg, Röntgenring 11, 97070 Würzburg, Germany
8. Sciortino P, Kayser C. Steady state visual evoked potentials reveal a signature of the pitch-size crossmodal association in visual cortex. Neuroimage 2023;273:120093. PMID: 37028733. DOI: 10.1016/j.neuroimage.2023.120093.
Abstract
Crossmodal correspondences describe our tendency to associate sensory features from different modalities with each other, such as the pitch of a sound with the size of a visual object. While such crossmodal correspondences (or associations) are described in many behavioural studies, their neurophysiological correlates remain unclear. Under the current working model of multisensory perception both a low- and a high-level account seem plausible. That is, the neurophysiological processes shaping these associations could commence in low-level sensory regions, or may predominantly emerge in high-level association regions of semantic and object identification networks. We exploited steady-state visual evoked potentials (SSVEP) to directly probe this question, focusing on the associations between pitch and the visual features of size, hue or chromatic saturation. We found that SSVEPs over occipital regions are sensitive to the congruency between pitch and size, and a source analysis pointed to an origin around primary visual cortices. We speculate that this signature of the pitch-size association in low-level visual cortices reflects the successful pairing of congruent visual and acoustic object properties and may contribute to establishing causal relations between multisensory objects. Besides this, our study also provides a paradigm that can be exploited to study other crossmodal associations involving visual stimuli in the future.
9. Zeng F, Zaidel A, Chen A. Contrary neuronal recalibration in different multisensory cortical areas. eLife 2023;12:e82895. PMID: 36877555. PMCID: PMC9988259. DOI: 10.7554/eLife.82895.
Abstract
The adult brain demonstrates remarkable multisensory plasticity by dynamically recalibrating itself based on information from multiple sensory sources. After a systematic visual-vestibular heading offset is experienced, the unisensory perceptual estimates for subsequently presented stimuli are shifted toward each other (in opposite directions) to reduce the conflict. The neural substrate of this recalibration is unknown. Here, we recorded single-neuron activity from the dorsal medial superior temporal (MSTd), parietoinsular vestibular cortex (PIVC), and ventral intraparietal (VIP) areas in three male rhesus macaques during this visual-vestibular recalibration. Both visual and vestibular neuronal tuning curves in MSTd shifted - each according to their respective cues' perceptual shifts. Tuning of vestibular neurons in PIVC also shifted in the same direction as vestibular perceptual shifts (cells were not robustly tuned to the visual stimuli). By contrast, VIP neurons demonstrated a unique phenomenon: both vestibular and visual tuning shifted in accordance with vestibular perceptual shifts, such that visual tuning shifted, surprisingly, contrary to visual perceptual shifts. Therefore, while unsupervised recalibration (to reduce cue conflict) occurs in early multisensory cortices, higher-level VIP reflects only a global shift in vestibular space.
Affiliation(s)
- Fu Zeng
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, Shanghai, China
- Adam Zaidel
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
- Aihua Chen
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, Shanghai, China
10. Debats NB, Heuer H, Kayser C. Short-term effects of visuomotor discrepancies on multisensory integration, proprioceptive recalibration, and motor adaptation. J Neurophysiol 2023;129:465-478. PMID: 36651909. DOI: 10.1152/jn.00478.2022.
Abstract
Information about the position of our hand is provided by multisensory signals that are often not perfectly aligned. Discrepancies between the seen and felt hand position or its movement trajectory engage the processes of 1) multisensory integration, 2) sensory recalibration, and 3) motor adaptation, which adjust perception and behavioral responses to apparently discrepant signals. To foster our understanding of the coemergence of these three processes, we probed their short-term dependence on multisensory discrepancies in a visuomotor task that has served as a model for multisensory perception and motor control previously. We found that the well-established integration of discrepant visual and proprioceptive signals is tied to the immediate discrepancy and independent of the outcome of the integration of discrepant signals in immediately preceding trials. However, the strength of integration was context dependent, being stronger in an experiment featuring stimuli that covered a smaller range of visuomotor discrepancies (±15°) compared with one covering a larger range (±30°). Both sensory recalibration and motor adaptation for nonrepeated movement directions were absent after two bimodal trials with same or opposite visuomotor discrepancies. Hence our results suggest that short-term sensory recalibration and motor adaptation are not an obligatory consequence of the integration of preceding discrepant multisensory signals.

NEW & NOTEWORTHY: The functional relation between multisensory integration and recalibration remains debated. We here refute the notion that they coemerge in an obligatory manner and support the hypothesis that they serve distinct goals of perception.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
11. Lokša P, Kopčo N. Toward a Unified Theory of the Reference Frame of the Ventriloquism Aftereffect. Trends Hear 2023;27:23312165231201020. PMID: 37715636. PMCID: PMC10505348. DOI: 10.1177/23312165231201020.
Abstract
The ventriloquism aftereffect (VAE), observed as a shift in the perceived locations of sounds after audio-visual stimulation, requires reference frame (RF) alignment since hearing and vision encode space in different RFs (head-centered vs. eye-centered). Previous experimental studies reported inconsistent results, observing either a mixture of head-centered and eye-centered frames, or a predominantly head-centered frame. Here, a computational model is introduced, examining the neural mechanisms underlying these effects. The basic model version assumes that the auditory spatial map is head-centered and the visual signals are converted to head-centered frame prior to inducing the adaptation. Two mechanisms are considered as extended model versions to describe the mixed-frame experimental data: (1) additional presence of visual signals in eye-centered frame and (2) eye-gaze direction-dependent attenuation in VAE when eyes shift away from the training fixation. Simulation results show that the mixed-frame results are mainly due to the second mechanism, suggesting that the RF of VAE is mainly head-centered. Additionally, a mechanism is proposed to explain a new ventriloquism-aftereffect-like phenomenon in which adaptation is induced by aligned audio-visual signals when saccades are used for responding to auditory targets. A version of the model extended to consider such response-method-related biases accurately predicts the new phenomenon. When attempting to model all the experimentally observed phenomena simultaneously, the model predictions are qualitatively similar but less accurate, suggesting that the proposed neural mechanisms interact in a more complex way than assumed in the model.
Affiliation(s)
- Peter Lokša
- Institute of Computer Science, Faculty of Science, P. J. Šafárik University in Košice, Košice, Slovakia
- Norbert Kopčo
- Institute of Computer Science, Faculty of Science, P. J. Šafárik University in Košice, Košice, Slovakia
12. The development of audio-visual temporal precision precedes its rapid recalibration. Sci Rep 2022;12:21591. PMID: 36517503. PMCID: PMC9751280. DOI: 10.1038/s41598-022-25392-y.
Abstract
Through development, multisensory systems reach a balance between stability and flexibility: the systems optimally integrate cross-modal signals from the same events, while remaining adaptive to environmental changes. Is continuous intersensory recalibration required to shape optimal integration mechanisms, or does multisensory integration develop prior to recalibration? Here, we examined the development of multisensory integration and rapid recalibration in the temporal domain by re-analyzing published datasets for audio-visual, audio-tactile, and visual-tactile combinations. Results showed that children reach an adult level of precision in audio-visual simultaneity perception and show the first sign of rapid recalibration at 9 years of age. In contrast, there was very weak rapid recalibration for other cross-modal combinations at all ages, even when adult levels of temporal precision had developed. Thus, the development of audio-visual rapid recalibration appears to require the maturation of temporal precision. It may serve to accommodate distance-dependent travel time differences between light and sound.
13. Musical training refines audiovisual integration but does not influence temporal recalibration. Sci Rep 2022;12:15292. PMID: 36097277. PMCID: PMC9468170. DOI: 10.1038/s41598-022-19665-9.
Abstract
When the brain is exposed to a temporal asynchrony between the senses, it will shift its perception of simultaneity towards the previously experienced asynchrony (temporal recalibration). It is unknown whether recalibration depends on how accurately an individual integrates multisensory cues or on experiences they have had over their lifespan. Hence, we assessed whether musical training modulated audiovisual temporal recalibration. Musicians (n = 20) and non-musicians (n = 18) made simultaneity judgements to flash-tone stimuli before and after adaptation to asynchronous (± 200 ms) flash-tone stimuli. We analysed these judgements via an observer model that described the left and right boundaries of the temporal integration window (decisional criteria) and the amount of sensory noise that affected these judgements. Musicians’ boundaries were narrower (closer to true simultaneity) than non-musicians’, indicating stricter criteria for temporal integration, and they also exhibited enhanced sensory precision. However, while both musicians and non-musicians experienced cumulative and rapid recalibration, these recalibration effects did not differ between the groups. Unexpectedly, cumulative recalibration was caused by auditory-leading but not visual-leading adaptation. Overall, these findings suggest that the precision with which observers perceptually integrate audiovisual temporal cues does not predict their susceptibility to recalibration.
14. Park H, Kayser C. The context of experienced sensory discrepancies shapes multisensory integration and recalibration differently. Cognition 2022;225:105092. DOI: 10.1016/j.cognition.2022.105092.
15. Sciortino P, Kayser C. The rubber hand illusion is accompanied by a distributed reduction of alpha and beta power in the EEG. PLoS One 2022;17:e0271659. PMID: 35905100. PMCID: PMC9337658. DOI: 10.1371/journal.pone.0271659.
Abstract
Previous studies have reported correlates of bodily self-illusions such as the rubber hand in signatures of rhythmic brain activity. However, individual studies focused on specific variations of the rubber hand paradigm, used different experimental setups to induce the illusion, or used different control conditions to isolate the neurophysiological signatures related to the illusory state, leaving the specificity of the reported illusion signatures unclear. We here quantified correlates of the rubber hand illusion in EEG-derived oscillatory brain activity and asked two questions: which of the observed correlates are robust to the precise nature of the control conditions used as contrast for the illusory state, and whether such correlates emerge directly around the subjective illusion onset. To address these questions, we relied on two experimental configurations to induce the illusion, on different non-illusion conditions to isolate neurophysiological signatures of the illusory state, and we implemented an analysis directly focusing on the immediate moment of the illusion onset. Our results reveal a widespread suppression of alpha and beta-band activity associated with the illusory state in general, whereby the reduction of beta power prevailed around the immediate illusion onset. These results confirm previous reports of a suppression of alpha and beta rhythms during body illusions, but also highlight the difficulties to directly pinpoint the precise neurophysiological correlates of the illusory state.
Affiliation(s)
- Placido Sciortino
- Department of Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany
16
|
Aller M, Mihalik A, Noppeney U. Audiovisual adaptation is expressed in spatial and decisional codes. Nat Commun 2022; 13:3924. [PMID: 35798733 PMCID: PMC9262908 DOI: 10.1038/s41467-022-31549-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2021] [Accepted: 06/21/2022] [Indexed: 11/09/2022] Open
Abstract
The brain adapts dynamically to the changing sensory statistics of its environment. Recent research has started to delineate the neural circuitries and representations that support this cross-sensory plasticity. Combining psychophysics with model-based representational fMRI and EEG, we characterized how the adult human brain adapts to misaligned audiovisual signals. We show that audiovisual adaptation is associated with changes in regional BOLD-responses and fine-scale activity patterns in a widespread network from Heschl's gyrus to dorsolateral prefrontal cortices. Audiovisual recalibration relies on distinct spatial and decisional codes that are expressed with opposite gradients and time courses across the auditory processing hierarchy. Early activity patterns in auditory cortices encode sounds in a continuous space that flexibly adapts to misaligned visual inputs. Later activity patterns in frontoparietal cortices code decisional uncertainty consistent with these spatial transformations. Our findings suggest that regions within the auditory processing hierarchy multiplex spatial and decisional codes to adapt flexibly to the changing sensory statistics in the environment.
Affiliation(s)
- Máté Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, UK
- Agoston Mihalik
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- Department of Psychiatry, University of Cambridge, Cambridge, UK
- Uta Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
17
|
Bruns P, Li L, Guerreiro MJ, Shareef I, Rajendran SS, Pitchaimuthu K, Kekunnaya R, Röder B. Audiovisual spatial recalibration but not integration is shaped by early sensory experience. iScience 2022; 25:104439. [PMID: 35874923 PMCID: PMC9301879 DOI: 10.1016/j.isci.2022.104439] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2021] [Revised: 02/14/2022] [Accepted: 05/06/2022] [Indexed: 11/15/2022] Open
Affiliation(s)
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Corresponding author
- Lux Li
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Department of Epidemiology and Biostatistics, Schulich School of Medicine & Dentistry, Western University, London, ON N6G 2M1, Canada
- Maria J.S. Guerreiro
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Biological Psychology, Department of Psychology, School of Medicine and Health Sciences, University of Oldenburg, 26111 Oldenburg, Germany
- Idris Shareef
- Jasti V Ramanamma Children’s Eye Care Centre, LV Prasad Eye Institute, Hyderabad, Telangana 500034, India
- Siddhart S. Rajendran
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Jasti V Ramanamma Children’s Eye Care Centre, LV Prasad Eye Institute, Hyderabad, Telangana 500034, India
- Kabilan Pitchaimuthu
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Jasti V Ramanamma Children’s Eye Care Centre, LV Prasad Eye Institute, Hyderabad, Telangana 500034, India
- Ramesh Kekunnaya
- Jasti V Ramanamma Children’s Eye Care Centre, LV Prasad Eye Institute, Hyderabad, Telangana 500034, India
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
18
|
Multivariate Analysis of Evoked Responses during the Rubber Hand Illusion Suggests a Temporal Parcellation into Manipulation and Illusion-Specific Correlates. eNeuro 2022; 9:ENEURO.0355-21.2021. [PMID: 34980661 PMCID: PMC8805188 DOI: 10.1523/eneuro.0355-21.2021] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2021] [Revised: 11/16/2021] [Accepted: 12/13/2021] [Indexed: 11/23/2022] Open
Abstract
The neurophysiological processes reflecting body illusions such as the rubber hand remain debated. Previous studies investigating the neural responses evoked by the illusion-inducing stimulation have provided diverging reports as to when these responses reflect the illusory state of the artificial limb becoming embodied. One reason for these diverging reports may be that different studies contrasted different experimental conditions to isolate potential correlates of the illusion, but individual contrasts may reflect multiple facets of the adopted experimental paradigm and not just the illusory state. To resolve these controversies, we recorded EEG responses in human participants and combined multivariate (cross-)classification with multiple illusion and non-illusion conditions. These conditions were designed to probe for markers of the illusory state that generalize across the spatial arrangements of the limbs and the specific nature of the control object (a rubber hand or the participant’s real hand), and hence are independent of the precise experimental conditions used as contrast for the illusion. Our results reveal a parcellation of evoked responses into a temporal sequence of events: around 125 and 275 ms following stimulus onset, the neurophysiological signals reliably differentiate the illusory state from non-illusion epochs. These results consolidate previous work by demonstrating multiple neurophysiological correlates of the rubber hand illusion and illustrate how multivariate approaches can help pinpoint those that are independent of the precise experimental configuration used to induce the illusion.
19
|
Debats NB, Heuer H, Kayser C. Visuo-proprioceptive integration and recalibration with multiple visual stimuli. Sci Rep 2021; 11:21640. [PMID: 34737371 PMCID: PMC8569193 DOI: 10.1038/s41598-021-00992-2] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2021] [Accepted: 10/18/2021] [Indexed: 11/29/2022] Open
Abstract
To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, hence paving the way to study multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information on the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint. The visual stimuli were briefly shown, one synchronously with the hand reaching the movement endpoint, the other delayed. In Experiment 1, the judgments of hand movement endpoint revealed integration and recalibration biases oriented towards the position of the synchronous stimulus and away from the delayed one. In Experiment 2 we contrasted two alternative accounts: that only the temporally more proximal visual stimulus enters integration similar to a winner-takes-all process, or that the influences of both stimuli superpose. The proprioceptive biases revealed that integration—and likely also recalibration—are shaped by the superposed contributions of multiple stimuli rather than by only the most powerful individual one.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615 Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615 Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615 Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Universität Bielefeld, Bielefeld, Germany