1
Akdogan I, Ogmen H, Kafaligonul H. The phase coherence of cortical oscillations predicts dynamic changes in perceived visibility. Cereb Cortex 2024; 34:bhae380. [PMID: 39319441; PMCID: PMC11422671; DOI: 10.1093/cercor/bhae380]
Abstract
The phase synchronization of brain oscillations plays an important role in visual processing, perceptual awareness, and performance. Yet, the cortical mechanisms underlying modulatory effects of post-stimulus phase coherence and frequency-specific oscillations associated with different aspects of vision are still subject to debate. In this study, we aimed to identify the post-stimulus phase coherence of cortical oscillations associated with perceived visibility and contour discrimination. We analyzed electroencephalogram data from two masking experiments where target visibility was manipulated by the contrast ratio or polarity of the mask under various onset timing conditions (stimulus onset asynchronies, SOAs). The behavioral results indicated an SOA-dependent suppression of target visibility due to masking. The time-frequency analyses revealed significant modulations of phase coherence over occipital and parieto-occipital regions. We particularly identified modulations of phase coherence in the (i) 2-5 Hz frequency range, which may reflect feedforward-mediated contour detection and sustained visibility; and (ii) 10-25 Hz frequency range, which may be associated with suppressed visibility through inhibitory interactions between and within synchronized neural pathways. Taken together, our findings provide evidence that oscillatory phase alignments, not only in the pre-stimulus but also in the post-stimulus window, play a crucial role in shaping perceived visibility and dynamic vision.
Affiliation(s)
- Irem Akdogan
- Department of Neuroscience, Bilkent University, Cankaya, Ankara 06800, Türkiye
- Aysel Sabuncu Brain Research Center, Bilkent University, Cankaya, Ankara 06800, Türkiye
- National Magnetic Resonance Research Center (UMRAM), Bilkent University, Cankaya, Ankara 06800, Türkiye
- Haluk Ogmen
- Laboratory of Perceptual and Cognitive Dynamics, Electrical & Computer Engineering, Ritchie School of Engineering & Computer Science, University of Denver, Denver, CO 80210, United States
- Hulusi Kafaligonul
- Department of Neuroscience, Bilkent University, Cankaya, Ankara 06800, Türkiye
- Aysel Sabuncu Brain Research Center, Bilkent University, Cankaya, Ankara 06800, Türkiye
- National Magnetic Resonance Research Center (UMRAM), Bilkent University, Cankaya, Ankara 06800, Türkiye
- Neuroscience and Neurotechnology Center of Excellence (NÖROM), Faculty of Medicine, Gazi University, Yenimahalle, Ankara 06560, Türkiye
2
Jiang Z, An X, Liu S, Yin E, Yan Y, Ming D. Beyond alpha band: prestimulus local oscillation and interregional synchrony of the beta band shape the temporal perception of the audiovisual beep-flash stimulus. J Neural Eng 2024; 21:036035. [PMID: 37419108; DOI: 10.1088/1741-2552/ace551]
Abstract
Objective. Multisensory integration is more likely to occur if the multimodal inputs fall within a narrow temporal window called the temporal binding window (TBW). Prestimulus local neural oscillations and interregional synchrony within sensory areas can modulate cross-modal integration. Previous work has examined the role of ongoing neural oscillations in audiovisual temporal integration, but there is no unified conclusion. This study aimed to explore whether local ongoing neural oscillations and interregional audiovisual synchrony modulate audiovisual temporal integration. Approach. Human participants performed a simultaneity judgment (SJ) task with beep-flash stimuli while electroencephalography was recorded. We focused on two stimulus onset asynchrony (SOA) conditions in which subjects reported ∼50% synchronous responses in the auditory- and visual-leading SOAs (A50V and V50A). Main results. We found that alpha band power was larger for synchronous responses over the central-right posterior and posterior sensors in the A50V and V50A conditions, respectively. The results suggested that alpha band power reflects neuronal excitability in the auditory or visual cortex, which can modulate audiovisual temporal perception depending on the leading sense. Additionally, the SJs were modulated by the opposite phases of the alpha (5-10 Hz) and low beta (14-20 Hz) bands in the A50V condition, and by the low beta band (14-18 Hz) in the V50A condition. One cycle of alpha or two cycles of beta oscillations matched an auditory-leading TBW of ∼86 ms, while two cycles of beta oscillations matched a visual-leading TBW of ∼105 ms. This result indicated that the opposite phases in the alpha and beta bands reflect opposite cortical excitability, which modulated the audiovisual SJs. Finally, we found stronger high beta (21-28 Hz) audiovisual phase synchronization for synchronous responses in the A50V condition. The phase synchrony of the beta band might be related to maintaining information flow between visual and auditory regions in a top-down manner. Significance. These results clarified whether and how the prestimulus brain state, including local neural oscillations and functional connectivity between brain regions, affects audiovisual temporal integration.
Affiliation(s)
- Zeliang Jiang
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Xingwei An
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Shuang Liu
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Erwei Yin
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Ye Yan
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Dong Ming
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
3
Marsicano G, Bertini C, Ronconi L. Alpha-band sensory entrainment improves audiovisual temporal acuity. Psychon Bull Rev 2024; 31:874-885. [PMID: 37783899; DOI: 10.3758/s13423-023-02388-x]
Abstract
Visual and auditory stimuli are transmitted from the environment to sensory cortices with different timing, requiring the brain to encode when sensory inputs must be segregated or integrated into a single percept. The probability that different audiovisual (AV) stimuli are integrated into a single percept even when presented asynchronously is reflected in the construct of the temporal binding window (TBW). There is a strong interest in testing whether it is possible to broaden or shrink the TBW by using different neuromodulatory approaches that can speed up or slow down ongoing alpha oscillations, which have been repeatedly hypothesized to be an important determinant of the TBW's size. Here, we employed a web-based sensory entrainment protocol combined with a simultaneity judgment task using simple flash-beep stimuli. The aim was to test whether AV temporal acuity could be modulated trial by trial by synchronizing ongoing neural oscillations in the prestimulus period to a rhythmic sensory stream presented in the upper (∼12 Hz) or lower (∼8.5 Hz) alpha range. As a control, we implemented a nonrhythmic condition in which only the first and the last entrainers were employed. Results show that upper alpha entrainment shrinks the AV TBW and improves AV temporal acuity compared with the lower alpha and control conditions. Our findings represent a proof of concept of the efficacy of sensory entrainment to improve AV temporal acuity on a trial-by-trial basis, and they strengthen the idea that alpha oscillations may reflect the temporal unit of AV temporal binding.
Affiliation(s)
- Gianluca Marsicano
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Caterina Bertini
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, Via Olgettina 58, 20132, Milan, Italy.
- Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy.
4
Wang L, Lin L, Ren J. The characteristics of audiovisual temporal integration in streaming-bouncing bistable motion perception: considering both implicit and explicit processing perspectives. Cereb Cortex 2023; 33:11541-11555. [PMID: 37874024; DOI: 10.1093/cercor/bhad388]
Abstract
This study explored the behavioral and neural activity characteristics of audiovisual temporal integration in motion perception from both implicit and explicit perspectives. The streaming-bouncing bistable paradigm (SB task) was employed to investigate implicit temporal integration, while the corresponding simultaneity judgment task (SJ task) was used to examine explicit temporal integration. The behavioral results revealed a negative correlation between implicit and explicit temporal processing. In the ERP results of both tasks, three neural phases (PD100, ND180, and PD290) in the fronto-central region were identified as reflecting integration effects, and the auditory-evoked multisensory N1 component may serve as a primary component responsible for cross-modal temporal processing. However, there were significant differences between the VA ERPs in the SB and SJ tasks, and the influence of speed on implicit and explicit integration effects also varied. Building on the validation of the temporal renormalization theory, these results suggest that implicit and explicit temporal integration operate under distinct processing modes within a shared neural network. This underscores the brain's flexibility and adaptability in cross-modal temporal processing.
Affiliation(s)
- Luning Wang
- School of Psychology, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
- Liyue Lin
- School of Psychology, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
- Jie Ren
- China Table Tennis College, Shanghai University of Sport, No. 399, Changhai Road, Yangpu District, Shanghai, 200438, China
5
Azizi L, Polti I, van Wassenhove V. Spontaneous α Brain Dynamics Track the Episodic "When". J Neurosci 2023; 43:7186-7197. [PMID: 37704373; PMCID: PMC10601376; DOI: 10.1523/jneurosci.0816-23.2023]
Abstract
Across species, neurons track time over the course of seconds to minutes, which may feed the sense of time passing. Here, we asked whether neural signatures of time-tracking could be found in humans. Participants stayed quietly awake for a few minutes while being recorded with magnetoencephalography (MEG). They were either unaware they would be asked how long the recording lasted (retrospective timing) or instructed beforehand to estimate how long it would last (prospective timing). At rest, rhythmic brain activity is nonstationary and displays bursts of activity in the alpha range (α: 7-14 Hz). When participants were not instructed to attend to time, the relative duration of α bursts linearly predicted individuals' retrospective estimates of how long their quiet wakefulness lasted. The relative duration of α bursts was a better predictor than α power or burst amplitude. No other rhythmic or arrhythmic activity predicted retrospective duration. However, when participants timed prospectively, the relative duration of α bursts failed to predict their duration estimates. Consistent with this, the amount of α bursts discriminated between prospective and retrospective timing. Last, with a control experiment, we demonstrate that the relation between α bursts and retrospective time is preserved even when participants are engaged in a visual counting task. Thus, at the time scale of minutes, we report that the relative time of spontaneous α burstiness predicts conscious retrospective time. We conclude that in the absence of overt attention to time, α bursts embody discrete states of awareness constitutive of episodic timing. SIGNIFICANCE STATEMENT: The feeling that time passes is a core component of consciousness and episodic memory. A century ago, brain rhythms called "α" were hypothesized to embody an internal clock. However, rhythmic brain activity is nonstationary and displays on-and-off oscillatory bursts, which would serve as irregular ticks to the hypothetical clock. Here, we discovered that in a given lapse of time, the relative bursting time of α rhythms is a good indicator of how much time an individual will report to have elapsed. Remarkably, this relation only holds true when the individual does not attend to time and vanishes when attending to it. Our observations suggest that at the scale of minutes, α brain activity tracks episodic time.
Affiliation(s)
- Leila Azizi
- Cognitive Neuroimaging Unit, NeuroSpin, Commissariat à l'énergie atomique et aux énergies alternatives, Institut National de la Santé et de la Recherche Médicale, Université Paris-Saclay, Gif/Yvette 91191, France
- Ignacio Polti
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Trondheim, Norway 7030
- Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany D-04103
- Virginie van Wassenhove
- Cognitive Neuroimaging Unit, NeuroSpin, Commissariat à l'énergie atomique et aux énergies alternatives, Institut National de la Santé et de la Recherche Médicale, Université Paris-Saclay, Gif/Yvette 91191, France
6
Xu Q, Hu J, Qin Y, Li G, Zhang X, Li P. Intention affects fairness processing: Evidence from behavior and representational similarity analysis of event-related potential signals. Hum Brain Mapp 2023; 44:2451-2464. [PMID: 36749642; PMCID: PMC10028638; DOI: 10.1002/hbm.26223]
Abstract
In an ultimatum game, the responder must decide between pursuing self-interest and insisting on fairness, and these choices are affected by the intentions of the proposer. However, the time course of this social decision-making process is unclear. Representational similarity analysis (RSA) is a useful technique for linking brain activity with rich behavioral data sets. In this study, electroencephalography (EEG) was used to measure the time course of neural responses to proposed allocation schemes with different intentions. Twenty-eight participants played an ultimatum game as responders. They had to choose between accepting and rejecting the fair or unfair money allocation schemes of proposers. The schemes were offered based on the proposer's selfish intention (monetary gain), altruistic intention (donation to charity), or ambiguous intention (unknown to the responder). We used spatiotemporal RSA and inter-subject RSA (IS-RSA) to explore the connections between event-related potentials (ERPs) after offer presentation and intention presentation and four types of behavioral data (acceptance, response time, fairness ratings, and pleasantness ratings). The spatiotemporal RSA results revealed that only response time variation was linked with the difference in ERPs at 432-592 ms after offer presentation over the posterior parietal and prefrontal regions. Meanwhile, the IS-RSA results found a significant association between inter-individual differences in response time and differences in ERP activity at 596-812 ms after the presentation of ambiguous intention, particularly in the prefrontal region. This study expands the intention-based reciprocal model to the third-party context and demonstrates that brain activity can represent response time differences in social decision-making.
Affiliation(s)
- Qiang Xu
- Brain Function and Psychological Science Research Center, Shenzhen University, Shenzhen, China
- Jiali Hu
- Brain Function and Psychological Science Research Center, Shenzhen University, Shenzhen, China
- Yi Qin
- Brain Function and Psychological Science Research Center, Shenzhen University, Shenzhen, China
- Guojie Li
- Brain Function and Psychological Science Research Center, Shenzhen University, Shenzhen, China
- Xukai Zhang
- Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
- Peng Li
- Brain Function and Psychological Science Research Center, Shenzhen University, Shenzhen, China
7
Zhou HY, Yang HX, Wei Z, Wan GB, Lui SSY, Chan RCK. Audiovisual synchrony detection for fluent speech in early childhood: An eye-tracking study. Psych J 2022; 11:409-418. [PMID: 35350086; DOI: 10.1002/pchj.538]
Abstract
During childhood, the ability to detect audiovisual synchrony gradually sharpens for simple stimuli such as flash-beeps and single syllables. However, little is known about how children perceive synchrony in natural and continuous speech. This study investigated young children's gaze patterns while they watched movies of two identical speakers telling stories side by side. Only one speaker's lip movements matched the voices; the other's either led or lagged behind the soundtrack by 600 ms. Children aged 3-6 years (n = 94, 52.13% males) showed an overall preference for the synchronous speaker, with no age-related changes in synchrony-detection sensitivity, as indicated by similar gaze patterns across ages. However, viewing time for the synchronous speech was significantly longer in the auditory-leading (AL) condition than in the visual-leading (VL) condition, suggesting that asymmetric sensitivities to AL versus VL asynchrony are already established in early childhood. When further examining gaze patterns on dynamic faces, we found that greater attention to the mouth region was an adaptive strategy for reading visual speech signals and was thus associated with longer viewing time of the synchronous videos. Attention to detail, a dimension of autistic traits characterized by local processing, was correlated with worse performance in speech synchrony processing. These findings extend previous research by charting the development of speech synchrony perception in young children and may have implications for clinical populations (e.g., autism) with impaired multisensory integration.
Affiliation(s)
- Han-Yu Zhou
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Han-Xue Yang
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Zhen Wei
- Affiliated Shenzhen Maternity and Child Healthcare Hospital, Shenzhen, China
- Guo-Bin Wan
- Affiliated Shenzhen Maternity and Child Healthcare Hospital, Shenzhen, China
- Simon S Y Lui
- Department of Psychiatry, The University of Hong Kong, Hong Kong Special Administrative Region, China
- Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
8
Johnston PR, Alain C, McIntosh AR. Individual Differences in Multisensory Processing Are Related to Broad Differences in the Balance of Local versus Distributed Information. J Cogn Neurosci 2022; 34:846-863. [PMID: 35195723; DOI: 10.1162/jocn_a_01835]
Abstract
The brain's ability to extract information from multiple sensory channels is crucial to perception and effective engagement with the environment, but the individual differences observed in multisensory processing lack mechanistic explanation. We hypothesized that, from the perspective of information theory, individuals with more effective multisensory processing will exhibit a higher degree of shared information among distributed neural populations while engaged in a multisensory task, representing more effective coordination of information among regions. To investigate this, healthy young adults completed an audiovisual simultaneity judgment task to measure their temporal binding window (TBW), which quantifies the ability to distinguish fine discrepancies in timing between auditory and visual stimuli. EEG was then recorded during a second run of the simultaneity judgment task, and partial least squares was used to relate individual differences in the TBW width to source-localized EEG measures of local entropy and mutual information, indexing local and distributed processing of information, respectively. The narrowness of the TBW, reflecting more effective multisensory processing, was related to a broad pattern of higher mutual information and lower local entropy at multiple timescales. Furthermore, a small group of temporal and frontal cortical regions, including those previously implicated in multisensory integration and response selection, respectively, played a prominent role in this pattern. Overall, these findings suggest that individual differences in multisensory processing are related to widespread individual differences in the balance of distributed versus local information processing among a large subset of brain regions, with more distributed information being associated with more effective multisensory processing. The balance of distributed versus local information processing may therefore be a useful measure for exploring individual differences in multisensory processing, its relationship to higher cognitive traits, and its disruption in neurodevelopmental disorders and clinical conditions.
9
Marsicano G, Cerpelloni F, Melcher D, Ronconi L. Lower multisensory temporal acuity in individuals with high schizotypal traits: a web-based study. Sci Rep 2022; 12:2782. [PMID: 35177673; PMCID: PMC8854550; DOI: 10.1038/s41598-022-06503-1]
Abstract
Natural events are often multisensory, requiring the brain to combine information from the same spatial location and timing, across different senses. The importance of temporal coincidence has led to the introduction of the temporal binding window (TBW) construct, defined as the time range within which multisensory inputs are highly likely to be perceptually bound into a single entity. Anomalies in TBWs have been linked to confused perceptual experiences and inaccurate filtering of sensory inputs coming from different environmental sources. Indeed, larger TBWs have been associated with disorders such as schizophrenia and autism and are also correlated to a higher level of subclinical traits of these conditions in the general population. Here, we tested the feasibility of using a web-based version of a classic audio-visual simultaneity judgment (SJ) task with simple flash-beep stimuli in order to measure multisensory temporal acuity and its relationship with schizotypal traits as measured in the general population. Results show that: (i) the response distribution obtained in the web-based SJ task was strongly similar to those reported by studies carried out in controlled laboratory settings, and (ii) lower multisensory temporal acuity was associated with higher schizotypal traits in the “cognitive-perceptual” domains. Our findings reveal the possibility of adequately using a web-based audio-visual SJ task outside a controlled laboratory setting, available to a more diverse and representative pool of participants. These results provide additional evidence for a close relationship between lower multisensory acuity and the expression of schizotypal traits in the general population.
Affiliation(s)
- Gianluca Marsicano
- School of Psychology, Vita-Salute San Raffaele University, Milan, Italy; Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy
- Filippo Cerpelloni
- Center for Mind/Brain Sciences and Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy; Laboratory of Biological Psychology, Department of Brain and Cognition, Leuven Brain Institute, KU Leuven, Leuven, Belgium; Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), University of Louvain (UCLouvain), Leuven, Belgium
- David Melcher
- Center for Mind/Brain Sciences and Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy; Psychology Program, Division of Science, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, Milan, Italy; Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy
10
Liang J, Li Y, Zhang Z, Luo W. Sound gaps boost emotional audiovisual integration independent of attention: Evidence from an ERP study. Biol Psychol 2021; 168:108246. [PMID: 34968556; DOI: 10.1016/j.biopsycho.2021.108246]
Abstract
The emotion discrimination paradigm was adopted to study the effect of interrupted sound on visual emotional processing under different attentional states. Two experiments were conducted: in Experiment 1, participants judged facial expressions (explicit task); in Experiment 2, they judged the position of a bar (implicit task). In Experiment 1, ERP results showed that the sound-gap acceleration effect on the P1 was present only for neutral faces. In Experiment 2, the accelerating effect (P1) existed regardless of the emotional condition. Across the two experiments, the P1 findings suggest that sound gaps enhance bottom-up attention. The N170 and late positive component (LPC) were modulated by facial emotion in both experiments, with fearful faces eliciting larger responses than neutral ones. Comparing the two experiments, the explicit task induced a larger LPC than the implicit task. Overall, sound gaps boosted audiovisual integration via bottom-up attention at early integration stages, while cognitive expectations engaged top-down attention at late stages.
Affiliation(s)
- Junyu Liang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, Liaoning Province, China
- Yuchen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, Liaoning Province, China
- Zhao Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Institute of Psychology, Weifang Medical University, Weifang 216053, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, Liaoning Province, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, Liaoning Province, China.
11
Neural correlates of metacontrast masking across different contrast polarities. Brain Struct Funct 2021; 226:3067-3081. [PMID: 33779794; DOI: 10.1007/s00429-021-02260-5]
Abstract
Metacontrast masking is a powerful illusion to investigate the dynamics of perceptual processing and to control conscious visual perception. However, the neural mechanisms underlying this fundamental investigative tool are still debated. In the present study, we examined metacontrast masking across different contrast polarities by employing a contour discrimination task combined with EEG (Electroencephalography). When the target and mask had the same contrast polarity, a typical U-shaped metacontrast function was observed. A change in mask polarity (i.e., opposite mask polarity) shifted this masking function to a monotonic increasing function such that the target visibility was strongly suppressed at stimulus onset asynchronies less than 50 ms. This transition in metacontrast function has been typically interpreted as an increase in intrachannel inhibition of the sustained activities functionally linked to object visibility and identity. Our EEG analyses revealed an early (160-300 ms) and a late (300-550 ms) spatiotemporal cluster associated with this effect of polarity. The early cluster was mainly over occipital and parieto-occipital scalp sites. On the other hand, the later modulations of the evoked activities were centered over parietal and centro-parietal sites. Since both of these clusters were beyond 160 ms, the EEG results point to late recurrent inhibitory mechanisms. Although the findings here do not directly preclude other proposed mechanisms for metacontrast, they highlight the involvement of recurrent intrachannel inhibition in metacontrast masking.
12
Kaya U, Kafaligonul H. Audiovisual interactions in speeded discrimination of a visual event. Psychophysiology 2021; 58:e13777. [PMID: 33483971; DOI: 10.1111/psyp.13777]
Abstract
The integration of information from different senses is central to our perception of the external world. Audiovisual interactions have been particularly well studied in this context and various illusions have been developed to demonstrate strong influences of these interactions on the final percept. Using audiovisual paradigms, previous studies have shown that even task-irrelevant information provided by a secondary modality can change the detection and discrimination of a primary target. These modulations have been found to be significantly dependent on the relative timing between auditory and visual stimuli. Although these interactions in time have been commonly reported, we have still limited understanding of the relationship between the modulations of event-related potentials (ERPs) and final behavioral performance. Here, we aimed to shed light on this important issue by using a speeded discrimination paradigm combined with electroencephalogram (EEG). During the experimental sessions, the timing between an auditory click and a visual flash was varied over a wide range of stimulus onset asynchronies and observers were engaged in speeded discrimination of flash location. Behavioral reaction times were significantly changed by click timing. Furthermore, the modulations of evoked activities over medial parietal/parieto-occipital electrodes were associated with this effect. These modulations were within the 126-176 ms time range and more importantly, they were also correlated with the changes in reaction times. These results provide an important functional link between audiovisual interactions at early stages of sensory processing and reaction times. Together with previous research, they further suggest that early crossmodal interactions play a critical role in perceptual performance.
Affiliation(s)
- Utku Kaya
- National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara, Turkey; Informatics Institute, Middle East Technical University, Ankara, Turkey; Department of Anesthesiology, University of Michigan, Ann Arbor, MI, USA
- Hulusi Kafaligonul
- National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara, Turkey; Interdisciplinary Neuroscience Program, Aysel Sabuncu Brain Research Center, Bilkent University, Ankara, Turkey
13
Zhao Z, Lei S, Weiqi H, Suyong Y, Wenbo L. The influence of the cross-modal emotional pre-preparation effect on audiovisual integration. Neuroreport 2020; 31:1161-1166. [PMID: 32991523; DOI: 10.1097/wnr.0000000000001530]
Abstract
Previous studies have shown that the cross-modal pre-preparation effect is an important factor in audiovisual integration. However, the facilitating influence of the pre-preparation effect on the integration of emotional cues remains unclear. Therefore, this study examined the emotional pre-preparation effect during the multistage process of audiovisual integration. Event-related potentials (ERPs) were recorded while participants performed a synchronous or asynchronous integration task with fearful or neutral stimuli. The results indicated that, compared with the sum of the unisensory presentations of visual (V) and auditory (A) stimuli (A+V), only fearful audiovisual stimuli induced a decreased N1 and an enhanced P2; this was not found for the neutral stimuli. Moreover, the fearful stimuli triggered a larger P2 than the neutral stimuli in the audiovisual condition, but not in the sum of the combined (A+V) waveforms. Our findings imply that, in the early perceptual processing stage and the perceptual fine-processing stage, fear improves the processing efficiency of emotional audiovisual integration. In the final cognitive assessment stage, fearful audiovisual stimuli induced a larger late positive component (LPC) than neutral audiovisual stimuli. Moreover, the asynchronous audiovisual condition induced a greater LPC than the synchronous audiovisual condition during the 400-550 ms period. The different integration effects between the fearful and neutral stimuli may reflect distinct mechanisms of pre-preparation along the emotional dimension. In light of these results, we propose a cross-modal emotional pre-preparation effect involving three phases of emotional audiovisual integration.
Affiliation(s)
- Zhang Zhao
- Institute of Psychology, Weifang Medical University, Weifang; Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province
- Sun Lei
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province
- He Weiqi
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province
- Yang Suyong
- School of Psychology, Shanghai University of Sport, Shanghai, China
- Luo Wenbo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province
14
Individual differences in sensory integration predict differences in time perception and individual levels of schizotypy. Conscious Cogn 2020; 84:102979. [DOI: 10.1016/j.concog.2020.102979]
15
Zumer JM, White TP, Noppeney U. The neural mechanisms of audiotactile binding depend on asynchrony. Eur J Neurosci 2020; 52:4709-4731. [PMID: 32725895; DOI: 10.1111/ejn.14928]
Abstract
Asynchrony is a critical cue informing the brain whether sensory signals are caused by a common source and should be integrated or segregated. This psychophysics-electroencephalography (EEG) study investigated the influence of asynchrony on how the brain binds audiotactile (AT) signals to enable faster responses in a redundant target paradigm. Human participants actively responded (psychophysics) or passively attended (EEG) to noise bursts, "taps-to-the-face" and their AT combinations at seven AT asynchronies: 0, ±20, ±70 and ±500 ms. Behaviourally, observers were faster at detecting AT than unisensory stimuli within a temporal integration window: the redundant target effect was maximal for synchronous stimuli and declined within a ≤70 ms AT asynchrony. EEG revealed a cascade of AT interactions that relied on different neural mechanisms depending on AT asynchrony. At small (≤20 ms) asynchronies, AT interactions arose for evoked response potentials (ERPs) at 110 ms and ~400 ms post-stimulus. Selectively at ±70 ms asynchronies, AT interactions were observed for the P200 ERP, theta-band inter-trial coherence (ITC) and power at ~200 ms post-stimulus. In conclusion, AT binding was mediated by distinct neural mechanisms depending on the asynchrony of the AT signals. Early AT interactions in ERPs and theta-band ITC and power were critical for the behavioural response facilitation within a ≤±70 ms temporal integration window.
Affiliation(s)
- Johanna M Zumer
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; School of Life and Health Sciences, Aston University, Birmingham, UK
- Thomas P White
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, The Netherlands
16
Bastiaansen M, Berberyan H, Stekelenburg JJ, Schoffelen JM, Vroomen J. Are alpha oscillations instrumental in multisensory synchrony perception? Brain Res 2020; 1734:146744. [DOI: 10.1016/j.brainres.2020.146744]
17
Judging Relative Onsets and Offsets of Audiovisual Events. Vision (Basel) 2020; 4:vision4010017. [PMID: 32138261; PMCID: PMC7157228; DOI: 10.3390/vision4010017]
Abstract
This study assesses the fidelity with which people can make temporal order judgments (TOJ) between auditory and visual onsets and offsets. Using an adaptive staircase task administered to a large sample of young adults, we find that the ability to judge temporal order varies widely among people, with notable difficulty created when auditory events closely follow visual events. Those findings are interpretable within the context of an independent channels model. Visual onsets and offsets can be difficult to localize in time when they occur within the temporal neighborhood of sound onsets or offsets.
18
Zhou HY, Cheung EFC, Chan RCK. Audiovisual temporal integration: Cognitive processing, neural mechanisms, developmental trajectory and potential interventions. Neuropsychologia 2020; 140:107396. [PMID: 32087206; DOI: 10.1016/j.neuropsychologia.2020.107396]
Abstract
To integrate auditory and visual signals into a unified percept, the paired stimuli must co-occur within a limited time window known as the Temporal Binding Window (TBW). The width of the TBW, a proxy of audiovisual temporal integration ability, has been found to be correlated with higher-order cognitive and social functions. A comprehensive review of studies investigating audiovisual TBW reveals several findings: (1) a wide range of top-down processes and bottom-up features can modulate the width of the TBW, facilitating adaptation to the changing and multisensory external environment; (2) a large-scale brain network works in coordination to ensure successful detection of audiovisual (a)synchrony; (3) developmentally, audiovisual TBW follows a U-shaped pattern across the lifespan, with a protracted developmental course into late adolescence and rebounding in size again in late life; (4) an enlarged TBW is characteristic of a number of neurodevelopmental disorders; and (5) the TBW is highly flexible via perceptual and musical training. Interventions targeting the TBW may be able to improve multisensory function and ameliorate social communicative symptoms in clinical populations.
Affiliation(s)
- Han-Yu Zhou
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China.
19
Zhang Z, He W, Li Y, Zhang M, Luo W. Facilitation of Crossmodal Integration During Emotional Prediction in Methamphetamine Dependents. Front Neural Circuits 2020; 13:80. [PMID: 32038178; PMCID: PMC6989411; DOI: 10.3389/fncir.2019.00080]
Abstract
Methamphetamine (meth) can greatly damage the prefrontal cortex and trigger dysfunction of the cognitive control loop, which contributes not only to drug dependence but also to emotional disorders. The imbalance between the cognitive and emotional systems can lead to crossmodal emotional deficits. Until now, the negative impact of meth dependence on crossmodal emotional processing has received little attention. Therefore, the present study first examined the differences in crossmodal emotional processing between healthy controls and meth dependents (MADs) and then investigated the role of visual- or auditory-leading cues in promoting crossmodal emotional processing. Experiment 1 found that MADs showed a visual-auditory integration deficit for fearful emotion, which may be related to defects in information transmission between the visual and auditory cortices. Experiment 2 found that MADs had a crossmodal deficit for fear under visual-leading cues, but fearful sounds improved their detection of facial emotions. Experiment 3 reconfirmed that, for MADs, auditory-leading cues induced immediate crossmodal integration more readily than visual-leading ones. These findings provide quantitative evidence that meth dependence is associated with crossmodal integration deficits and that auditory-leading cues enhance MADs' ability to recognize complex emotions (all results are available at: https://osf.io/x6rv5/). These results offer a better understanding of complex crossmodal emotional integration in individuals who use drugs.
Affiliation(s)
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
20
Cortical processes underlying the effects of static sound timing on perceived visual speed. Neuroimage 2019; 199:194-205. [DOI: 10.1016/j.neuroimage.2019.05.062]
21
Ikumi N, Torralba M, Ruzzoli M, Soto-Faraco S. The phase of pre-stimulus brain oscillations correlates with cross-modal synchrony perception. Eur J Neurosci 2018; 49:150-164. [DOI: 10.1111/ejn.14186]
Affiliation(s)
- Nara Ikumi
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Mireia Torralba
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Manuela Ruzzoli
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Salvador Soto-Faraco
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
22
Simon DM, Wallace MT. Integration and Temporal Processing of Asynchronous Audiovisual Speech. J Cogn Neurosci 2018; 30:319-337. [DOI: 10.1162/jocn_a_01205]
Abstract
Multisensory integration of visual mouth movements with auditory speech is known to offer substantial perceptual benefits, particularly under challenging (i.e., noisy) acoustic conditions. Previous work characterizing this process has found that ERPs to auditory speech are of shorter latency and smaller magnitude in the presence of visual speech. We sought to determine the dependency of these effects on the temporal relationship between the auditory and visual speech streams using EEG. We found that reductions in ERP latency and suppression of ERP amplitude are maximal when the visual signal precedes the auditory signal by a small interval and that increasing amounts of asynchrony reduce these effects in a continuous manner. Time–frequency analysis revealed that these effects are found primarily in the theta (4–8 Hz) and alpha (8–12 Hz) bands, with a central topography consistent with auditory generators. Theta effects also persisted in the lower portion of the band (3.5–5 Hz), and this late activity was more frontally distributed. Importantly, the magnitude of these late theta oscillations not only differed with the temporal characteristics of the stimuli but also served to predict participants' task performance. Our analysis thus reveals that suppression of single-trial brain responses by visual speech depends strongly on the temporal concordance of the auditory and visual inputs. It further illustrates that processes in the lower theta band, which we suggest as an index of incongruity processing, might serve to reflect the neural correlates of individual differences in multisensory temporal perception.
23
Lange J, Kapala K, Krause H, Baumgarten TJ, Schnitzler A. Rapid temporal recalibration to visuo-tactile stimuli. Exp Brain Res 2017; 236:347-354. [PMID: 29143125; PMCID: PMC5809529; DOI: 10.1007/s00221-017-5132-z]
Abstract
For a comprehensive understanding of the environment, the brain must constantly decide whether the incoming information originates from the same source and needs to be integrated into a coherent percept. This integration process is believed to be mediated by temporal integration windows. If presented with temporally asynchronous stimuli for a few minutes, the brain adapts to this new temporal relation by recalibrating the temporal integration windows. Such recalibration can occur even more rapidly after exposure to just a single trial of asynchronous stimulation. While rapid recalibration has been demonstrated for audio-visual stimuli, evidence for rapid recalibration of visuo-tactile stimuli is lacking. Here, we investigated rapid recalibration in the visuo-tactile domain. Subjects received visual and tactile stimuli with different stimulus onset asynchronies (SOA) and were asked to report whether the visuo-tactile stimuli were presented simultaneously. Our results demonstrate visuo-tactile rapid recalibration by revealing that subjects' simultaneity reports were modulated by the temporal order of stimulation in the preceding trial. This rapid recalibration effect, however, was only significant if the SOA in the preceding trial was smaller than 100 ms, while rapid recalibration could not be demonstrated for SOAs larger than 100 ms. Since rapid recalibration in the audio-visual domain has been demonstrated for SOAs larger than 100 ms, we propose that visuo-tactile recalibration works at shorter SOAs, and thus faster time scales than audio-visual rapid recalibration.
Affiliation(s)
- Joachim Lange
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany.
- Katharina Kapala
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Holger Krause
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Thomas J Baumgarten
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
- Alfons Schnitzler
- Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
24
Covic A, Keitel C, Porcu E, Schröger E, Müller MM. Audio-visual synchrony and spatial attention enhance processing of dynamic visual stimulation independently and in parallel: A frequency-tagging study. Neuroimage 2017; 161:32-42. [PMID: 28802870; DOI: 10.1016/j.neuroimage.2017.08.022]
Abstract
The neural processing of a visual stimulus can be facilitated by attending to its position or by a co-occurring auditory tone. Using frequency-tagging, we investigated whether facilitation by spatial attention and audio-visual synchrony rely on similar neural processes. Participants attended to one of two flickering Gabor patches (14.17 and 17 Hz) located in opposite lower visual fields. Gabor patches further "pulsed" (i.e. showed smooth spatial frequency variations) at distinct rates (3.14 and 3.63 Hz). Frequency-modulating an auditory stimulus at the pulse-rate of one of the visual stimuli established audio-visual synchrony. Flicker and pulsed stimulation elicited stimulus-locked rhythmic electrophysiological brain responses that allowed tracking the neural processing of simultaneously presented Gabor patches. These steady-state responses (SSRs) were quantified in the spectral domain to examine visual stimulus processing under conditions of synchronous vs. asynchronous tone presentation and when respective stimulus positions were attended vs. unattended. Strikingly, unique patterns of effects on pulse- and flicker driven SSRs indicated that spatial attention and audiovisual synchrony facilitated early visual processing in parallel and via different cortical processes. We found attention effects to resemble the classical top-down gain effect facilitating both, flicker and pulse-driven SSRs. Audio-visual synchrony, in turn, only amplified synchrony-producing stimulus aspects (i.e. pulse-driven SSRs) possibly highlighting the role of temporally co-occurring sights and sounds in bottom-up multisensory integration.
Affiliation(s)
- Amra Covic
- Institut für Psychologie, Universität Leipzig, Neumarkt 9-19, 04109, Leipzig, Germany; Institut für Medizinische Psychologie und Medizinische Soziologie, Universitätsmedizin Göttingen, Georg-August-Universität, 37973, Göttingen, Germany
- Christian Keitel
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, G12 8QB, Glasgow, UK.
- Emanuele Porcu
- Institut für Psychologie, Otto-von-Guericke-Universität Magdeburg, Universitätsplatz 2, Gebäude 23, 39106, Magdeburg, Germany
- Erich Schröger
- Institut für Psychologie, Universität Leipzig, Neumarkt 9-19, 04109, Leipzig, Germany
- Matthias M Müller
- Institut für Psychologie, Universität Leipzig, Neumarkt 9-19, 04109, Leipzig, Germany