1
Ampollini S, Ardizzi M, Ferroni F, Cigala A. Synchrony perception across senses: A systematic review of temporal binding window changes from infancy to adolescence in typical and atypical development. Neurosci Biobehav Rev 2024; 162:105711. [PMID: 38729280 DOI: 10.1016/j.neubiorev.2024.105711]
Abstract
Sensory integration is increasingly acknowledged as crucial for the development of cognitive and social abilities. However, its developmental trajectory is still little understood. This systematic review investigates the literature on developmental changes, from infancy through adolescence, in the Temporal Binding Window (TBW) - the epoch of time within which sensory inputs are perceived as simultaneous and therefore integrated. Following comprehensive searches of the PubMed, Elsevier, and PsycInfo databases, only experimental, behavioral, English-language, peer-reviewed studies on multisensory temporal processing in 0-17-year-olds were included. Non-behavioral, non-multisensory, and non-human studies were excluded, as were those that did not directly focus on the TBW. The selection process was performed independently by two authors. The 39 selected studies involved 2859 participants in total. Findings indicate a predisposition towards cross-modal asynchrony sensitivity and a composite, still unclear, developmental trajectory, with atypical development associated with increased asynchrony tolerance. These results highlight the need for consistent and thorough research into TBW development to inform potential interventions.
Affiliation(s)
- Silvia Ampollini
- Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Borgo Carissimi, 10, Parma 43121, Italy.
- Martina Ardizzi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Via Volturno 39E, Parma 43121, Italy
- Francesca Ferroni
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Via Volturno 39E, Parma 43121, Italy
- Ada Cigala
- Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Borgo Carissimi, 10, Parma 43121, Italy
2
Jiang Z, An X, Liu S, Yin E, Yan Y, Ming D. Beyond alpha band: prestimulus local oscillation and interregional synchrony of the beta band shape the temporal perception of the audiovisual beep-flash stimulus. J Neural Eng 2024; 21:036035. [PMID: 37419108 DOI: 10.1088/1741-2552/ace551]
Abstract
Objective. Multisensory integration is more likely to occur if the multimodal inputs fall within a narrow temporal window called the temporal binding window (TBW). Prestimulus local neural oscillations and interregional synchrony within sensory areas can modulate cross-modal integration. Previous work has examined the role of ongoing neural oscillations in audiovisual temporal integration, but there is no unified conclusion. This study aimed to explore whether local ongoing neural oscillations and interregional audiovisual synchrony modulate audiovisual temporal integration. Approach. Human participants performed a simultaneity judgment (SJ) task with beep-flash stimuli while electroencephalography was recorded. We focused on the two stimulus onset asynchrony (SOA) conditions where subjects reported ∼50% synchronous responses with auditory- and visual-leading SOAs (A50V and V50A). Main results. We found that alpha band power was larger for synchronous responses in the central-right posterior and posterior sensors in the A50V and V50A conditions, respectively. This suggests that alpha band power reflects neuronal excitability in the auditory or visual cortex, which can modulate audiovisual temporal perception depending on the leading sense. Additionally, SJs were modulated by opposite phases of the alpha (5-10 Hz) and low beta (14-20 Hz) bands in the A50V condition, and by the low beta band (14-18 Hz) in the V50A condition. One cycle of alpha or two cycles of beta oscillations matched an auditory-leading TBW of ∼86 ms, while two cycles of beta oscillations matched a visual-leading TBW of ∼105 ms. This result indicated that the opposite phases in the alpha and beta bands reflect opposite cortical excitability, which modulated the audiovisual SJs. Finally, we found stronger high beta (21-28 Hz) audiovisual phase synchronization for synchronous responses in the A50V condition. The phase synchrony of the beta band might be related to maintaining information flow between visual and auditory regions in a top-down manner. Significance. These results clarify whether and how the prestimulus brain state, including local neural oscillations and functional connectivity between brain regions, affects audiovisual temporal integration.
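The cycle-duration arithmetic linking these band frequencies to TBW widths is easy to reproduce. The Python sketch below uses only the band edges reported above; which frequency within a band to evaluate is an assumption, since the abstract does not give the exact peak frequencies behind the ∼86 ms and ∼105 ms estimates.

```python
def cycle_duration_ms(freq_hz: float) -> float:
    """Duration of one oscillatory cycle, in milliseconds."""
    return 1000.0 / freq_hz

def window_ms(freq_hz: float, n_cycles: int = 1) -> float:
    """Width of the temporal window spanned by n_cycles at freq_hz."""
    return n_cycles * cycle_duration_ms(freq_hz)

# Band edges reported in the abstract (alpha 5-10 Hz, low beta 14-20 Hz).
print(window_ms(10, 1))   # fastest alpha cycle: 100 ms
print(window_ms(5, 1))    # slowest alpha cycle: 200 ms
print(window_ms(20, 2))   # two cycles of 20 Hz low beta: 100 ms
print(window_ms(14, 2))   # two cycles of 14 Hz low beta: ~143 ms
```

A single alpha or double beta cycle thus spans a window on the order of 100-200 ms, which is the right magnitude for the reported TBWs, though matching ∼86 ms exactly requires frequencies near the fast end of each band.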
Affiliation(s)
- Zeliang Jiang
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Xingwei An
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Shuang Liu
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Erwei Yin
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Ye Yan
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Dong Ming
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
3
Marsicano G, Bertini C, Ronconi L. Alpha-band sensory entrainment improves audiovisual temporal acuity. Psychon Bull Rev 2024; 31:874-885. [PMID: 37783899 DOI: 10.3758/s13423-023-02388-x]
Abstract
Visual and auditory stimuli are transmitted from the environment to sensory cortices with different timing, requiring the brain to encode when sensory inputs must be segregated or integrated into a single percept. The probability that different audiovisual (AV) stimuli are integrated into a single percept even when presented asynchronously is reflected in the construct of the temporal binding window (TBW). There is strong interest in testing whether it is possible to broaden or shrink the TBW using neuromodulatory approaches that can speed up or slow down ongoing alpha oscillations, which have repeatedly been hypothesized to be an important determinant of the TBW's size. Here, we employed a web-based sensory entrainment protocol combined with a simultaneity judgment task using simple flash-beep stimuli. The aim was to test whether AV temporal acuity could be modulated trial by trial by synchronizing ongoing neural oscillations in the prestimulus period to a rhythmic sensory stream presented in the upper (∼12 Hz) or lower (∼8.5 Hz) alpha range. As a control, we implemented a nonrhythmic condition where only the first and last entrainers were employed. Results show that upper alpha entrainment shrinks the AV TBW and improves AV temporal acuity compared with the lower alpha and control conditions. Our findings represent a proof of concept of the efficacy of sensory entrainment for improving AV temporal acuity on a trial-by-trial basis, and they strengthen the idea that alpha oscillations may reflect the temporal unit of AV temporal binding.
Affiliation(s)
- Gianluca Marsicano
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Caterina Bertini
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, Via Olgettina 58, 20132, Milan, Italy.
- Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy.
4
Wang L, Lin L, Ren J. The characteristics of audiovisual temporal integration in streaming-bouncing bistable motion perception: considering both implicit and explicit processing perspectives. Cereb Cortex 2023; 33:11541-11555. [PMID: 37874024 DOI: 10.1093/cercor/bhad388]
Abstract
This study explored the behavioral and neural activity characteristics of audiovisual temporal integration in motion perception from both implicit and explicit perspectives. The streaming-bouncing bistable paradigm (SB task) was employed to investigate implicit temporal integration, while the corresponding simultaneity judgment task (SJ task) was used to examine explicit temporal integration. The behavioral results revealed a negative correlation between implicit and explicit temporal processing. In the ERP results of both tasks, three neural phases (PD100, ND180, and PD290) in the fronto-central region were identified as reflecting integration effects, and the auditory-evoked multisensory N1 component may serve as a primary component responsible for cross-modal temporal processing. However, there were significant differences between the VA ERPs in the SB and SJ tasks, and the influence of speed on implicit and explicit integration effects also varied. These results, building upon the validation of previous temporal renormalization theory, suggest that implicit and explicit temporal integration operate under distinct processing modes within a shared neural network, underscoring the brain's flexibility and adaptability in cross-modal temporal processing.
Affiliation(s)
- Luning Wang
- School of Psychology, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
- Liyue Lin
- School of Psychology, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
- Jie Ren
- China Table Tennis College, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
5
Drijvers L, Holler J. The multimodal facilitation effect in human communication. Psychon Bull Rev 2023; 30:792-801. [PMID: 36138282 PMCID: PMC10104796 DOI: 10.3758/s13423-022-02178-x]
Abstract
During face-to-face communication, recipients need to rapidly integrate a plethora of auditory and visual signals. This integration of signals from many different bodily articulators, all offset in time, with the information in the speech stream may either tax the cognitive system, thus slowing down language processing, or may result in multimodal facilitation. Using the classical shadowing paradigm, participants shadowed speech from face-to-face, naturalistic dyadic conversations in an audiovisual context, an audiovisual context without visual speech (e.g., lips), and an audio-only context. Our results provide evidence of a multimodal facilitation effect in human communication: participants were faster in shadowing words when seeing multimodal messages compared with when hearing only audio. Also, the more visual context was present, the fewer shadowing errors were made, and the earlier in time participants shadowed predicted lexical items. We propose that the multimodal facilitation effect may contribute to the ease of fast face-to-face conversational interaction.
Affiliation(s)
- Linda Drijvers
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Montessorilaan 3, 6525, HR, Nijmegen, The Netherlands.
- Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525, XD, Nijmegen, The Netherlands.
- Judith Holler
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Montessorilaan 3, 6525, HR, Nijmegen, The Netherlands
- Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525, XD, Nijmegen, The Netherlands
6
Ainsworth K, Bertone A. Audiovisual temporal binding window narrows with age in autistic individuals. Autism Res 2023; 16:355-363. [PMID: 36426723 DOI: 10.1002/aur.2860]
Abstract
Atypical sensory perception has been recognized in autistic individuals since the earliest descriptions of autism and is now considered a key characteristic of the condition. Although the integration of sensory information (multisensory integration; MSI) has been demonstrated to be altered in autism, less is known about how this perceptual process differs with age. This study aimed to assess the integration of audiovisual information across autistic children and adolescents. MSI was measured using a non-social simultaneity judgment task. Variation in temporal sensitivity was evaluated via Gaussian curve-fitting procedures, allowing us to compare the width of temporal binding windows (TBWs), where wider TBWs indicate less sensitivity to temporal alignment. We compared TBWs in age- and IQ-matched groups of autistic (n = 32) and neurotypical (NT; n = 73) children and adolescents. The sensory profile of all participants was also measured. Across all ages assessed (i.e., 6 through 18 years), TBWs were negatively correlated with age in the autistic group; no significant correlation was found in the NT group. When compared as a function of child (6-12 years) and adolescent (13-18 years) age groups, a significant group (autism vs NT) by age group interaction was found, whereby TBWs became narrower with age in the autistic, but not the neurotypical, group. We also found a significant main effect of age and no significant main effect of group. Results suggest that TBW differences between autistic and neurotypical groups diminish with increasing age, indicating an atypical developmental profile of MSI in autism that ameliorates across development.
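The Gaussian curve-fitting approach to estimating TBW width from simultaneity judgment data can be sketched as follows. The SOA grid, the response proportions, and the full-width-at-half-maximum convention for window width are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, pss, sigma):
    """Proportion of 'simultaneous' responses modeled as a Gaussian over SOA."""
    return amp * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

# Hypothetical SJ data: SOA in ms (negative = auditory leading) and the
# proportion of trials judged simultaneous at each SOA.
soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
p_simult = np.array([0.05, 0.25, 0.80, 0.95, 0.85, 0.30, 0.08])

(amp, pss, sigma), _ = curve_fit(gaussian, soas, p_simult,
                                 p0=[1.0, 0.0, 100.0])
sigma = abs(sigma)  # sigma enters squared, so take its magnitude

# One common convention: TBW width = full width at half maximum.
fwhm = 2 * np.sqrt(2 * np.log(2)) * sigma
print(f"PSS = {pss:.1f} ms, TBW (FWHM) = {fwhm:.1f} ms")
```

The fitted curve's peak location (PSS) and width can then be compared across age groups, as in this study; the FWHM criterion is one of several width conventions used in the TBW literature.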
Affiliation(s)
- Kirsty Ainsworth
- Perceptual Neuroscience Laboratory (PNLab) for Autism and Development, McGill University, Montréal, Quebec, Canada
- Department of Educational and Counselling Psychology, McGill University, Montréal, Quebec, Canada
- Armando Bertone
- Perceptual Neuroscience Laboratory (PNLab) for Autism and Development, McGill University, Montréal, Quebec, Canada
- Department of Educational and Counselling Psychology, McGill University, Montréal, Quebec, Canada
7
Johnston PR, Alain C, McIntosh AR. Individual Differences in Multisensory Processing Are Related to Broad Differences in the Balance of Local versus Distributed Information. J Cogn Neurosci 2022; 34:846-863. [PMID: 35195723 DOI: 10.1162/jocn_a_01835]
Abstract
The brain's ability to extract information from multiple sensory channels is crucial to perception and effective engagement with the environment, but the individual differences observed in multisensory processing lack mechanistic explanation. We hypothesized that, from the perspective of information theory, individuals with more effective multisensory processing will exhibit a higher degree of shared information among distributed neural populations while engaged in a multisensory task, representing more effective coordination of information among regions. To investigate this, healthy young adults completed an audiovisual simultaneity judgment task to measure their temporal binding window (TBW), which quantifies the ability to distinguish fine discrepancies in timing between auditory and visual stimuli. EEG was then recorded during a second run of the simultaneity judgment task, and partial least squares was used to relate individual differences in the TBW width to source-localized EEG measures of local entropy and mutual information, indexing local and distributed processing of information, respectively. The narrowness of the TBW, reflecting more effective multisensory processing, was related to a broad pattern of higher mutual information and lower local entropy at multiple timescales. Furthermore, a small group of temporal and frontal cortical regions, including those previously implicated in multisensory integration and response selection, respectively, played a prominent role in this pattern. Overall, these findings suggest that individual differences in multisensory processing are related to widespread individual differences in the balance of distributed versus local information processing among a large subset of brain regions, with more distributed information being associated with more effective multisensory processing. 
The balance of distributed versus local information processing may therefore be a useful measure for exploring individual differences in multisensory processing, its relationship to higher cognitive traits, and its disruption in neurodevelopmental disorders and clinical conditions.
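The two information-theoretic quantities contrasted here, local entropy and mutual information, can be illustrated on synthetic signals. This is a generic histogram-based numpy sketch over made-up data, not the source-localized EEG pipeline used in the study.

```python
import numpy as np

def entropy(x, bins=8):
    """Shannon entropy (bits) of a signal after histogram discretization."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log2(pxy[pxy > 0]))
    return hx + hy - hxy

rng = np.random.default_rng(0)
shared = rng.normal(size=5000)
a = shared + 0.5 * rng.normal(size=5000)   # two "regions" driven by
b = shared + 0.5 * rng.normal(size=5000)   # a common source
c = rng.normal(size=5000)                  # an independent "region"

print(entropy(a), entropy(c))        # local information in each signal
print(mutual_information(a, b))      # shared information: well above 0
print(mutual_information(a, c))      # independent signals: near 0
```

In the study's framing, higher mutual information among regions alongside lower local entropy indexes a shift toward distributed rather than local information processing; the histogram estimator above is the simplest (and most biased) of several available estimators.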
8
Marsicano G, Cerpelloni F, Melcher D, Ronconi L. Lower multisensory temporal acuity in individuals with high schizotypal traits: a web-based study. Sci Rep 2022; 12:2782. [PMID: 35177673 PMCID: PMC8854550 DOI: 10.1038/s41598-022-06503-1]
Abstract
Natural events are often multisensory, requiring the brain to combine information from the same spatial location and timing, across different senses. The importance of temporal coincidence has led to the introduction of the temporal binding window (TBW) construct, defined as the time range within which multisensory inputs are highly likely to be perceptually bound into a single entity. Anomalies in TBWs have been linked to confused perceptual experiences and inaccurate filtering of sensory inputs coming from different environmental sources. Indeed, larger TBWs have been associated with disorders such as schizophrenia and autism and are also correlated to a higher level of subclinical traits of these conditions in the general population. Here, we tested the feasibility of using a web-based version of a classic audio-visual simultaneity judgment (SJ) task with simple flash-beep stimuli in order to measure multisensory temporal acuity and its relationship with schizotypal traits as measured in the general population. Results show that: (i) the response distribution obtained in the web-based SJ task was strongly similar to those reported by studies carried out in controlled laboratory settings, and (ii) lower multisensory temporal acuity was associated with higher schizotypal traits in the “cognitive-perceptual” domains. Our findings reveal the possibility of adequately using a web-based audio-visual SJ task outside a controlled laboratory setting, available to a more diverse and representative pool of participants. These results provide additional evidence for a close relationship between lower multisensory acuity and the expression of schizotypal traits in the general population.
Affiliation(s)
- Gianluca Marsicano
- School of Psychology, Vita-Salute San Raffaele University, Milan, Italy
- Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy
- Filippo Cerpelloni
- Center for Mind/Brain Sciences and Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy
- Laboratory of Biological Psychology, Department of Brain and Cognition, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), University of Louvain (UCLouvain), Leuven, Belgium
- David Melcher
- Center for Mind/Brain Sciences and Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy
- Psychology Program, Division of Science, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, Milan, Italy
- Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy
9
Wang L, Lin L, Sun Y, Hou S, Ren J. The effect of movement speed on audiovisual temporal integration in streaming-bouncing illusion. Exp Brain Res 2022; 240:1139-1149. [PMID: 35147722 DOI: 10.1007/s00221-022-06312-y]
Abstract
Motion perception in real situations is often driven by multisensory information. Speed is an essential characteristic of moving objects; however, it is not yet clear whether speed affects audiovisual temporal integration in motion perception. Therefore, this study used a streaming-bouncing task (a bistable motion perception paradigm; SB task) combined with a simultaneity judgment task (SJ task) to explore the effect of speed on audiovisual temporal integration from implicit and explicit perspectives. The experiment had a within-subjects design, with two speed conditions (fast/slow), eleven audiovisual conditions [stimulus onset asynchrony (SOA): 0 ms/ ± 60 ms/ ± 120 ms/ ± 180 ms/ ± 240 ms/ ± 300 ms], and a visual-only condition. A total of 30 subjects were recruited and completed the SB task and the SJ task in succession. The results showed the following: (1) the optimal times needed to induce the "bouncing" illusion and the maximum audiovisual bounce-inducing effect (ABE) magnitude were much earlier than the optimal time for audiovisual synchrony; (2) speed, as a bottom-up factor, affected the proportion of "bouncing" percepts in the SB illusion but did not affect the ABE magnitude; (3) speed also affected audiovisual temporal integration in motion perception, mainly in that the point of subjective simultaneity (PSS) in the SJ task was earlier in the fast speed condition than in the slow speed condition; and (4) performance on the SB and SJ tasks was not correlated. In conclusion, the time needed for maximum audiovisual integration differed from the optimal time for synchrony perception; moreover, speed affected audiovisual temporal integration in motion perception, but only in explicit temporal tasks.
Affiliation(s)
- Luning Wang
- School of Psychology, Shanghai University of Sport, Shanghai, 200438, China
- Liyue Lin
- School of Psychology, Shanghai University of Sport, Shanghai, 200438, China
- Yujia Sun
- China Table Tennis College, Shanghai University of Sport, Shanghai, 200438, China
- Shuang Hou
- School of Psychology, Shanghai University of Sport, Shanghai, 200438, China
- Jie Ren
- China Table Tennis College, Shanghai University of Sport, Shanghai, 200438, China.
10
Nazaré CJ, Oliveira AM. Effects of Audiovisual Presentations on Visual Localization Errors: One or Several Multisensory Mechanisms? Multisens Res 2021; 34:1-35. [PMID: 33882452 DOI: 10.1163/22134808-bja10048]
Abstract
The present study examines the extent to which temporal and spatial properties of sound modulate visual motion processing in spatial localization tasks. Participants were asked to locate the place at which a moving visual target unexpectedly vanished. Across different tasks, accompanying sounds were factorially varied within subjects as to their onset and offset times and/or positions relative to visual motion. Sound onset had no effect on the localization error. Sound offset was shown to modulate the perceived visual offset location, both for temporal and spatial disparities. This modulation did not conform to attraction toward the timing or location of the sounds but, demonstrably in the case of temporal disparities, to bimodal enhancement instead. Favorable indications of a contextual effect of audiovisual presentations on interspersed visual-only trials were also found. A short sound-leading offset asynchrony had benefits equivalent to audiovisual offset synchrony, suggestive of the involvement of early-level mechanisms, constrained by a temporal window, under these conditions. Yet, we tentatively hypothesize that the whole of the results, and how they compare with previous studies, requires the contribution of additional mechanisms, including learned detection of auditory-visual associations and cross-sensory spread of endogenous attention.
Affiliation(s)
- Cristina Jordão Nazaré
- Instituto Politécnico de Coimbra, ESTESC - Coimbra Health School, Audiologia, Coimbra, Portugal
11
Csonka M, Mardmomen N, Webster PJ, Brefczynski-Lewis JA, Frum C, Lewis JW. Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain. Cereb Cortex Commun 2021; 2:tgab002. [PMID: 33718874 PMCID: PMC7941256 DOI: 10.1093/texcom/tgab002]
Abstract
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
Affiliation(s)
- Matt Csonka
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Nadia Mardmomen
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Paula J Webster
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Julie A Brefczynski-Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Chris Frum
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- James W Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
12
Horsfall RP. Narrowing of the Audiovisual Temporal Binding Window Due To Perceptual Training Is Specific to High Visual Intensity Stimuli. Iperception 2021; 12:2041669520978670. [PMID: 33680418 PMCID: PMC7897829 DOI: 10.1177/2041669520978670]
Abstract
The temporal binding window (TBW), which reflects the range of temporal offsets in which audiovisual stimuli are combined to form a singular percept, can be reduced through training. Our research aimed to investigate whether training-induced reductions in TBW size transfer across stimulus intensities. A total of 32 observers performed simultaneity judgements at two visual intensities with a fixed auditory intensity, before and after receiving audiovisual TBW training at just one of these two intensities. We show that training individuals with a high visual intensity reduces the size of the TBW for bright stimuli, but this improvement did not transfer to dim stimuli. The reduction in TBW can be explained by shifts in decision criteria. Those trained with the dim visual stimuli, however, showed no reduction in TBW. Our main finding is that perceptual improvements following training are specific for high-intensity stimuli, potentially highlighting limitations of proposed TBW training procedures.
Affiliation(s)
- Ryan P. Horsfall
- Division of Neuroscience & Experimental Psychology, University of Manchester, Manchester M13 9PL, United Kingdom
13
Scurry AN, Chifamba K, Jiang F. Electrophysiological Dynamics of Visual-Tactile Temporal Order Perception in Early Deaf Adults. Front Neurosci 2020; 14:544472. [PMID: 33071731 PMCID: PMC7539666 DOI: 10.3389/fnins.2020.544472]
Abstract
Studies of compensatory plasticity in early deaf (ED) individuals have mainly focused on unisensory processing, and on spatial rather than temporal coding. However, precise discrimination of the temporal relationship between stimuli is imperative for successful perception of and interaction with the complex, multimodal environment. Although the properties of cross-modal temporal processing have been extensively studied in neurotypical populations, remarkably little is known about how the loss of one sense impacts the integrity of temporal interactions among the remaining senses. To understand how auditory deprivation affects multisensory temporal interactions, ED and age-matched normal hearing (NH) controls performed a visual-tactile temporal order judgment task in which visual and tactile stimuli were separated by varying stimulus onset asynchronies (SOAs) and subjects had to discern the leading stimulus. Participants performed the task while EEG data were recorded. Group-averaged event-related potential waveforms were compared between groups in occipital and fronto-central electrodes. Despite similar temporal order sensitivities and performance accuracy, ED adults had larger visual P100 amplitudes for all SOA levels and larger tactile N140 amplitudes for the shortest asynchronous (± 30 ms) and synchronous SOA levels. The enhanced signal strength reflected in these components in ED adults is discussed in terms of compensatory recruitment of cortical areas for visual-tactile processing. In addition, ED adults had similar tactile P200 amplitudes to NH controls but longer P200 latencies, suggesting reduced efficiency in later processing of tactile information. Overall, these results suggest that greater responses by ED adults in early processing of visual and tactile signals are likely critical for maintained performance in visual-tactile temporal order discrimination.
Affiliation(s)
- Alexandra N Scurry
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Kudzai Chifamba
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Fang Jiang
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
14
Individual differences in sensory integration predict differences in time perception and individual levels of schizotypy. Conscious Cogn 2020; 84:102979. [DOI: 10.1016/j.concog.2020.102979]
15
Zhou HY, Cheung EFC, Chan RCK. Audiovisual temporal integration: Cognitive processing, neural mechanisms, developmental trajectory and potential interventions. Neuropsychologia 2020; 140:107396. [PMID: 32087206] [DOI: 10.1016/j.neuropsychologia.2020.107396]
Abstract
To integrate auditory and visual signals into a unified percept, the paired stimuli must co-occur within a limited time window known as the Temporal Binding Window (TBW). The width of the TBW, a proxy of audiovisual temporal integration ability, has been found to be correlated with higher-order cognitive and social functions. A comprehensive review of studies investigating audiovisual TBW reveals several findings: (1) a wide range of top-down processes and bottom-up features can modulate the width of the TBW, facilitating adaptation to the changing and multisensory external environment; (2) a large-scale brain network works in coordination to ensure successful detection of audiovisual (a)synchrony; (3) developmentally, audiovisual TBW follows a U-shaped pattern across the lifespan, with a protracted developmental course into late adolescence and rebounding in size again in late life; (4) an enlarged TBW is characteristic of a number of neurodevelopmental disorders; and (5) the TBW is highly flexible via perceptual and musical training. Interventions targeting the TBW may be able to improve multisensory function and ameliorate social communicative symptoms in clinical populations.
Affiliation(s)
- Han-Yu Zhou
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
16
Zerr M, Freihorst C, Schütz H, Sinke C, Müller A, Bleich S, Münte TF, Szycik GR. Brief Sensory Training Narrows the Temporal Binding Window and Enhances Long-Term Multimodal Speech Perception. Front Psychol 2019; 10:2489. [PMID: 31749748] [PMCID: PMC6848860] [DOI: 10.3389/fpsyg.2019.02489]
Abstract
Our ability to integrate multiple sensory-based representations of our surroundings supplies us with a more holistic view of our world. The nervous system uses many complex algorithms to construct a coherent percept. One cue for solving this 'binding problem' is timing: because environmental signals propagate at different speeds (e.g., sound versus electromagnetic waves) and incur different sensory processing times, the temporal relationship of a stimulus pair derived from the same event must be flexibly adjusted by the brain. This tolerance can be conceptualized in the form of the cross-modal temporal binding window (TBW). Several studies have shown the plasticity of the TBW and its importance concerning audio-visual illusions, synesthesia, as well as psychiatric disturbances. Using three audio-visual paradigms, we investigated the importance of the length (short vs. long) as well as the modality (uni- vs. multimodal) of a perceptual training aiming at reducing the TBW in a healthy population. We also investigated the influence of the TBW on speech intelligibility, where participants had to integrate auditory and visual speech information from a videotaped speaker. We showed that simple sensory training can change the TBW and is capable of optimizing speech perception at a very naturalistic level. While training length had no differential effect on the malleability of the TBW, multisensory training induced a significantly stronger narrowing of the TBW than its unisensory counterparts. Furthermore, a narrowing of the TBW was associated with better performance in speech perception, meaning that participants showed a greater capacity for integrating information from different sensory modalities in situations with one modality impaired. All effects persisted for at least seven days.
Our findings show the significance of multisensory temporal processing regarding ecologically valid measures and have important clinical implications for interventions that may be used to alleviate debilitating conditions (e.g., autism, schizophrenia), in which multisensory temporal function is shown to be impaired.
Affiliation(s)
- Michael Zerr
- Department of Psychosomatic Medicine and Psychotherapy, Hannover Medical School, Hanover, Germany
- Christina Freihorst
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Helene Schütz
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Christopher Sinke
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Astrid Müller
- Department of Psychosomatic Medicine and Psychotherapy, Hannover Medical School, Hanover, Germany
- Stefan Bleich
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Thomas F Münte
- Department of Neurology, University of Lübeck, Lübeck, Germany; Institute of Psychology II, University of Lübeck, Lübeck, Germany
- Gregor R Szycik
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
17
Cortical processes underlying the effects of static sound timing on perceived visual speed. Neuroimage 2019; 199:194-205. [DOI: 10.1016/j.neuroimage.2019.05.062]
18
Ikumi N, Torralba M, Ruzzoli M, Soto-Faraco S. The phase of pre-stimulus brain oscillations correlates with cross-modal synchrony perception. Eur J Neurosci 2018; 49:150-164. [DOI: 10.1111/ejn.14186]
Affiliation(s)
- Nara Ikumi
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Mireia Torralba
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Manuela Ruzzoli
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Salvador Soto-Faraco
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
19
Lange J, Kapala K, Krause H, Baumgarten TJ, Schnitzler A. Rapid temporal recalibration to visuo-tactile stimuli. Exp Brain Res 2017; 236:347-354. [PMID: 29143125] [PMCID: PMC5809529] [DOI: 10.1007/s00221-017-5132-z]
Abstract
For a comprehensive understanding of the environment, the brain must constantly decide whether the incoming information originates from the same source and needs to be integrated into a coherent percept. This integration process is believed to be mediated by temporal integration windows. If presented with temporally asynchronous stimuli for a few minutes, the brain adapts to this new temporal relation by recalibrating the temporal integration windows. Such recalibration can occur even more rapidly after exposure to just a single trial of asynchronous stimulation. While rapid recalibration has been demonstrated for audio-visual stimuli, evidence for rapid recalibration of visuo-tactile stimuli is lacking. Here, we investigated rapid recalibration in the visuo-tactile domain. Subjects received visual and tactile stimuli with different stimulus onset asynchronies (SOA) and were asked to report whether the visuo-tactile stimuli were presented simultaneously. Our results demonstrate visuo-tactile rapid recalibration by revealing that subjects' simultaneity reports were modulated by the temporal order of stimulation in the preceding trial. This rapid recalibration effect, however, was only significant if the SOA in the preceding trial was smaller than 100 ms, while rapid recalibration could not be demonstrated for SOAs larger than 100 ms. Since rapid recalibration in the audio-visual domain has been demonstrated for SOAs larger than 100 ms, we propose that visuo-tactile recalibration works at shorter SOAs, and thus faster time scales than audio-visual rapid recalibration.
Affiliation(s)
- Joachim Lange
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Katharina Kapala
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Holger Krause
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Thomas J Baumgarten
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Alfons Schnitzler
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
20
Stevenson RA, Toulmin JK, Youm A, Besney RMA, Schulz SE, Barense MD, Ferber S. Increases in the autistic trait of attention to detail are associated with decreased multisensory temporal adaptation. Sci Rep 2017; 7:14354. [PMID: 29085016] [PMCID: PMC5662613] [DOI: 10.1038/s41598-017-14632-1]
Abstract
Recent empirical evidence suggests that autistic individuals perceive the world differently than their typically-developed peers. One theoretical account, the predictive coding hypothesis, posits that autistic individuals show a decreased reliance on previous perceptual experiences, which may relate to autism symptomatology. We tested this through a well-characterized, audiovisual statistical-learning paradigm in which typically-developed participants were first adapted to consistent temporal relationships between audiovisual stimulus pairs (audio-leading, synchronous, visual-leading) and then performed a simultaneity judgement task with audiovisual stimulus pairs varying in temporal offset from auditory-leading to visual-leading. Following exposure to the visual-leading adaptation phase, participants' perception of synchrony was biased towards visual-leading presentations, reflecting the statistical regularities of their previously experienced environment. Importantly, the strength of adaptation was significantly related to the level of autistic traits that the participant exhibited, measured by the Autism Quotient (AQ). This was specific to the Attention to Detail subscale of the AQ that assesses the perceptual propensity to focus on fine-grain aspects of sensory input at the expense of more integrative perceptions. More severe Attention to Detail was related to weaker adaptation. These results support the predictive coding framework, and suggest that changes in sensory perception commonly reported in autism may contribute to autistic symptomatology.
Affiliation(s)
- Ryan A Stevenson
- Western University, Department of Psychology, London, ON, Canada
- Western University, Brain and Mind Institute, London, ON, Canada
- Western University, Program in Neuroscience, London, ON, Canada
- Western University, Department of Psychiatry, London, ON, Canada
- York University, Centre for Vision Research, Toronto, ON, Canada
- Jennifer K Toulmin
- The University of Toronto, Department of Psychology, Toronto, ON, Canada
- Ariana Youm
- The University of Toronto, Department of Psychology, Toronto, ON, Canada
- Samantha E Schulz
- Western University, Department of Psychology, London, ON, Canada
- Western University, Brain and Mind Institute, London, ON, Canada
- Morgan D Barense
- The University of Toronto, Department of Psychology, Toronto, ON, Canada
- The Rotman Research Institute, Toronto, ON, Canada
- Susanne Ferber
- The University of Toronto, Department of Psychology, Toronto, ON, Canada
- The Rotman Research Institute, Toronto, ON, Canada
21
Being First Matters: Topographical Representational Similarity Analysis of ERP Signals Reveals Separate Networks for Audiovisual Temporal Binding Depending on the Leading Sense. J Neurosci 2017; 37:5274-5287. [PMID: 28450537] [PMCID: PMC5456109] [DOI: 10.1523/jneurosci.2926-16.2017]
Abstract
In multisensory integration, processing in one sensory modality is enhanced by complementary information from other modalities. Intersensory timing is crucial in this process because only inputs reaching the brain within a restricted temporal window are perceptually bound. Previous research in the audiovisual field has investigated various features of the temporal binding window, revealing asymmetries in its size and plasticity depending on the leading input: auditory–visual (AV) or visual–auditory (VA). Here, we tested whether separate neuronal mechanisms underlie this AV–VA dichotomy in humans. We recorded high-density EEG while participants performed an audiovisual simultaneity judgment task including various AV–VA asynchronies and unisensory control conditions (visual-only, auditory-only) and tested whether AV and VA processing generate different patterns of brain activity. After isolating the multisensory components of AV–VA event-related potentials (ERPs) from the sum of their unisensory constituents, we ran a time-resolved topographical representational similarity analysis (tRSA) comparing the AV and VA ERP maps. Spatial cross-correlation matrices were built from real data to index the similarity between the AV and VA maps at each time point (500 ms window after stimulus) and then correlated with two alternative similarity model matrices: AVmaps = VAmaps versus AVmaps ≠ VAmaps. The tRSA results favored the AVmaps ≠ VAmaps model across all time points, suggesting that audiovisual temporal binding (indexed by synchrony perception) engages different neural pathways depending on the leading sense. The existence of such a dual route supports recent theoretical accounts proposing that multiple binding mechanisms are implemented in the brain to accommodate different information parsing strategies in auditory and visual sensory systems.
SIGNIFICANCE STATEMENT Intersensory timing is a crucial aspect of multisensory integration, determining whether and how inputs in one modality enhance stimulus processing in another modality. Our research demonstrates that evaluating synchrony of auditory-leading (AV) versus visual-leading (VA) audiovisual stimulus pairs is characterized by two distinct patterns of brain activity. This suggests that audiovisual integration is not a unitary process and that different binding mechanisms are recruited in the brain based on the leading sense. These mechanisms may be relevant for supporting different classes of multisensory operations, for example, auditory enhancement of visual attention (AV) and visual enhancement of auditory speech (VA).
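The core of the tRSA procedure described above, comparing AV and VA topographies at each time point via spatial correlation, can be illustrated with simulated data. This is a toy sketch under simplifying assumptions (Pearson correlation of channel vectors; random maps standing in for real ERP components), not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ERP topographies: channels x time points for two conditions.
# Real tRSA would use the multisensory components left after
# subtracting the summed unisensory ERPs; here the data are simulated.
n_chan, n_time = 64, 250                      # e.g., 500 ms at 500 Hz
shared = rng.standard_normal((n_chan, n_time))
av = shared + 0.3 * rng.standard_normal((n_chan, n_time))
av_rep = shared + 0.3 * rng.standard_normal((n_chan, n_time))  # same map, new noise
va = rng.standard_normal((n_chan, n_time))    # unrelated map

def spatial_similarity(a, b):
    # Pearson correlation between the two channel vectors at each time point.
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    return (a * b).sum(axis=0) / (np.linalg.norm(a, axis=0) * np.linalg.norm(b, axis=0))

# Under the "AVmaps = VAmaps" model the correlation time course should sit
# near +1; under "AVmaps != VAmaps" it should hover near 0.
r_same = spatial_similarity(av, av_rep)
r_diff = spatial_similarity(av, va)
print(f"same-map r = {r_same.mean():.2f}, different-map r = {r_diff.mean():.2f}")
```

The paper's analysis additionally compares these empirical correlation matrices against the two model matrices statistically at every time point, rather than eyeballing the mean correlation as this toy does.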
22
Shahin AJ, Shen S, Kerlin JR. Tolerance for audiovisual asynchrony is enhanced by the spectrotemporal fidelity of the speaker's mouth movements and speech. Lang Cogn Neurosci 2017; 32:1102-1118. [PMID: 28966930] [PMCID: PMC5617130] [DOI: 10.1080/23273798.2017.1283428]
Abstract
We examined the relationship between tolerance for audiovisual onset asynchrony (AVOA) and the spectrotemporal fidelity of the spoken words and the speaker's mouth movements. In two experiments that varied only in the temporal order of sensory modality, with visual speech leading (exp1) or lagging (exp2) acoustic speech, participants watched intact and blurred videos of a speaker uttering trisyllabic words and nonwords that were noise vocoded with 4, 8, 16, and 32 channels. They judged whether the speaker's mouth movements and the speech sounds were in-sync or out-of-sync. Individuals perceived synchrony (tolerated AVOA) on more trials when the acoustic speech was more speech-like (8 channels and higher vs. 4 channels), and when visual speech was intact rather than blurred (exp1 only). These findings suggest that enhanced spectrotemporal fidelity of the audiovisual (AV) signal prompts the brain to widen the window of integration, promoting the fusion of temporally distant AV percepts.
Affiliation(s)
- Antoine J Shahin
- Center for Mind and Brain, University of California, Davis, CA 95618
- Stanley Shen
- Center for Mind and Brain, University of California, Davis, CA 95618
- Jess R Kerlin
- Center for Mind and Brain, University of California, Davis, CA 95618