1. Yan D, Seki A. Differential modulations of theta and beta oscillations by audiovisual congruency in letter-speech sound integration. Eur J Neurosci 2024. PMID: 39469847. DOI: 10.1111/ejn.16563.
Abstract
The integration of visual letters and speech sounds is a crucial part of learning to read. Previous studies investigating this integration have revealed a modulation by audiovisual (AV) congruency, commonly known as the congruency effect. To investigate the cortical oscillations underlying the congruency effect across frequency bands, we conducted a priming task in Japanese in which a visual letter was followed by a speech sound. We analyzed the power and phase properties of oscillatory activity in the theta and beta bands for congruent versus incongruent letter-speech sound (L-SS) pairs. Our results revealed stronger theta-band (5-7 Hz) power in the congruent condition and cross-modal phase resetting within the auditory cortex, accompanied by enhanced inter-trial phase coherence (ITPC) in auditory-related areas in response to the congruent condition. The observed congruency effect on theta-band power may reflect increased neural activity in the left auditory region during L-SS integration. Additionally, the theta ITPC findings suggest that visual letters amplify neuronal responses to the corresponding auditory stimulus that follows, which may reflect differential cross-modal influences in the primary auditory cortex. In contrast, decreased beta-band (20-35 Hz) oscillatory power was observed over right centroparietal regions in the congruent condition. The reduced beta power appears unrelated to AV integration per se and may instead reflect prediction of auditory sounds during language processing. Our data indicate that oscillations in different frequency bands contribute to distinct aspects of L-SS integration.
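ITPC, the phase-locking measure reported above, quantifies how consistent the oscillatory phase at a given frequency is across trials (0 = random phase, 1 = perfect locking). The following is a minimal numerical sketch on synthetic trials, not the authors' pipeline; the frequency, trial count, and noise level are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n_trials, n_samples = 500, 60, 500      # 1 s of data per trial
t = np.arange(n_samples) / fs
f0 = 6.0                                    # theta-band frequency of interest (Hz)

# Synthetic trials: a 6 Hz component whose phase is only loosely jittered
# across trials (partial phase locking), buried in white noise.
jitter = rng.normal(0, 0.3, n_trials)       # phase jitter in radians
trials = np.array([np.cos(2 * np.pi * f0 * t + j) + rng.normal(0, 1.0, n_samples)
                   for j in jitter])

# Single-trial phase at f0 from the Fourier coefficient at that frequency.
coeff = trials @ np.exp(-2j * np.pi * f0 * t)
phases = np.angle(coeff)

# ITPC = length of the mean resultant vector of the single-trial phases:
# near 0 for random phases, 1 for perfect phase locking.
itpc = np.abs(np.mean(np.exp(1j * phases)))
```

With the small jitter used here the ITPC comes out close to 1; drawing the trial phases uniformly at random instead would drive it toward 0.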
Affiliation(s)
- Dongyang Yan
- Graduate School of Education, Hokkaido University, Sapporo, Japan
- Ayumi Seki
- Faculty of Education, Hokkaido University, Sapporo, Japan
2. Zhang H, Xie J, Tao Q, Ge Z, Xiong Y, Xu G, Li M, Han C. The effect of rhythmic stimuli with spatial information on sensorimotor synchronization: an EEG and EMG study. Front Neurosci 2024; 18:1448051. PMID: 39429702. PMCID: PMC11486764. DOI: 10.3389/fnins.2024.1448051.
Abstract
Introduction: Sensorimotor synchronization (SMS) is the human ability to align body movement rhythms with external rhythmic stimuli. While the effects of rhythmic stimuli containing only temporal information on SMS have been extensively studied, less is known about how spatial information affects SMS performance. This study investigates the neural mechanisms underlying SMS with rhythmic stimuli that include both temporal and spatial information, providing insights into the influence of these factors across sensory modalities. Methods: We compared the effects of temporal and spatial information on SMS performance across stimulus conditions, simultaneously recording the electroencephalogram (EEG), the electromyogram (EMG), and behavioral data while subjects tapped in synchrony with rhythmic stimuli. SMS performance was analyzed under auditory, visual, and auditory-visual motion stimuli (containing both temporal and spatial information), as well as auditory, visual, and auditory-visual non-motion stimuli (containing only temporal information). Specifically, we examined behavioral measures (mean asynchrony, absolute asynchrony, and variability), neural oscillations, cortico-muscular coherence (CMC), and brain connectivity. Results: SMS performance was superior with rhythmic stimuli containing both temporal and spatial information compared to stimuli with only temporal information. Moreover, sensory-motor neural entrainment was stronger during SMS with rhythmic stimuli containing spatial information within the same sensory modality. SMS with both types of rhythmic stimuli was dynamically modulated by neural oscillations and cortico-muscular coupling in the beta band (13-30 Hz). Discussion: These findings provide deeper insights into the combined effects of temporal and spatial information, and of sensory modality, on SMS performance. The study highlights the dynamic modulation of SMS by neural oscillations and CMC, particularly in the beta band, contributing to our understanding of the neural basis of sensorimotor synchronization.
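Cortico-muscular coherence of the kind examined above is typically computed as the magnitude-squared coherence between simultaneously recorded EEG and EMG. A minimal sketch on synthetic signals sharing a 20 Hz (beta) drive, not the study's actual analysis; all signal parameters are illustrative assumptions:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs, dur = 1000, 20                          # 20 s of simulated data at 1 kHz
n = fs * dur
t = np.arange(n) / fs

# A common 20 Hz (beta) cortical drive appears in both the "EEG" and the
# "EMG", each contaminated by independent noise.
drive = np.sin(2 * np.pi * 20 * t)
eeg = drive + rng.normal(0, 1.0, n)
emg = 0.5 * drive + rng.normal(0, 1.0, n)

# Magnitude-squared coherence with Welch-style segment averaging.
f, cxy = signal.coherence(eeg, emg, fs=fs, nperseg=1024)

beta = (f >= 13) & (f <= 30)
outside = (f >= 40) & (f <= 100)
beta_cmc = cxy[beta].max()                  # peaks near the shared 20 Hz drive
baseline = cxy[outside].mean()              # near zero where nothing is shared
```

Coherence is bounded in [0, 1] per frequency bin; the shared beta drive produces a clear peak, while frequencies carrying only independent noise stay near the small bias floor set by the number of averaged segments.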
Affiliation(s)
- Huanqing Zhang
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Jun Xie
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- School of Mechanical Engineering, Xinjiang University, Ürümqi, China
- Qing Tao
- School of Mechanical Engineering, Xinjiang University, Ürümqi, China
- Zengle Ge
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Yu Xiong
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Guanghua Xu
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Min Li
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Chengcheng Han
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
3. Senkowski D, Engel AK. Multi-timescale neural dynamics for multisensory integration. Nat Rev Neurosci 2024; 25:625-642. PMID: 39090214. DOI: 10.1038/s41583-024-00845-7.
Abstract
Carrying out any everyday task, be it driving in traffic, conversing with friends or playing basketball, requires rapid selection, integration and segregation of stimuli from different sensory modalities. At present, even the most advanced artificial intelligence-based systems are unable to replicate the multisensory processes that the human brain routinely performs, but how neural circuits in the brain carry out these processes is still not well understood. In this Perspective, we discuss recent findings that shed fresh light on the oscillatory neural mechanisms that mediate multisensory integration (MI), including power modulations, phase resetting, phase-amplitude coupling and dynamic functional connectivity. We then consider studies that also suggest multi-timescale dynamics in intrinsic ongoing neural activity and during stimulus-driven bottom-up and cognitive top-down neural network processing in the context of MI. We propose a new concept of MI that emphasizes the critical role of neural dynamics at multiple timescales within and across brain networks, enabling the simultaneous integration, segregation, hierarchical structuring and selection of information in different time windows. To highlight predictions from our multi-timescale concept of MI, real-world scenarios in which multi-timescale processes may coordinate MI in a flexible and adaptive manner are considered.
Affiliation(s)
- Daniel Senkowski
- Department of Psychiatry and Neurosciences, Charité - Universitätsmedizin Berlin, Berlin, Germany
- Andreas K Engel
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
4. Yusuf PA, Hubka P, Konerding W, Land R, Tillein J, Kral A. Congenital deafness reduces alpha-gamma cross-frequency coupling in the auditory cortex. Hear Res 2024; 449:109032. PMID: 38797035. DOI: 10.1016/j.heares.2024.109032.
Abstract
Neurons within a neuronal network can be grouped by bottom-up and top-down influences through synchrony in neuronal oscillations, creating representations of perceptual objects from sensory features. Oscillatory activity can be differentiated into stimulus-phase-locked (evoked) and non-phase-locked (induced) components; the former is mainly determined by sensory input, the latter by higher-level (cortical) processing. Effects of auditory deprivation on cortical oscillations have been studied in congenitally deaf cats (CDCs) using cochlear implant (CI) stimulation. CI-induced alpha, beta, and gamma activity is compromised in the auditory cortex of CDCs. Furthermore, top-down information flow between secondary and primary auditory areas in hearing cats, conveyed by induced alpha oscillations, is lost in CDCs. Here we used the matching pursuit algorithm to assess components of such oscillatory activity in local field potentials recorded in primary field A1. In addition to the loss of induced alpha oscillations, we found a loss of evoked theta activity in CDCs. The loss of theta and alpha activity in CDCs can be directly related to reduced high-frequency (gamma-band) activity through cross-frequency coupling. We quantified such coupling in three groups of adult animals: 1) hearing-experienced, acoustically stimulated cats (aHCs); 2) hearing-experienced cats that were acutely deafened pharmacologically and then electrically stimulated through CIs (eHCs); and 3) electrically stimulated CDCs. We found significant cross-frequency coupling in all groups in >70% of auditory-responsive sites. The predominant coupling in aHCs and eHCs was between theta/alpha phase and gamma power. In CDCs this coupling was lost and replaced by alpha oscillations coupling to delta/theta phase. Thus, alpha/theta oscillations synchronize high-frequency gamma activity only in hearing-experienced cats. The absence of induced alpha and theta oscillations contributes to the loss of induced gamma power in CDCs, signifying impaired local network activity.
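Cross-frequency (phase-amplitude) coupling of the kind quantified here is commonly summarized by a modulation index relating the gamma-band envelope to the phase of a slower rhythm. The following is a minimal sketch using a Canolty-style mean-vector-length index on a synthetic signal, not the matching-pursuit-based analysis of the paper; band edges and signal parameters are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(2)
fs, dur = 1000, 30
n = fs * dur
t = np.arange(n) / fs

# Synthetic LFP: a 40 Hz gamma rhythm whose amplitude waxes and wanes with
# the phase of a 6 Hz theta rhythm, plus broadband noise.
theta = np.sin(2 * np.pi * 6 * t)
gamma_amp = 0.5 * (1 + theta)               # gamma strongest at theta peaks
lfp = theta + gamma_amp * np.sin(2 * np.pi * 40 * t) + 0.5 * rng.normal(0, 1, n)

def bandpass(x, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype='band', fs=fs, output='sos')
    return sosfiltfilt(sos, x)

phase = np.angle(hilbert(bandpass(lfp, 4, 8, fs)))   # theta phase
amp = np.abs(hilbert(bandpass(lfp, 30, 50, fs)))     # gamma envelope

# Mean-vector-length modulation index: large when the gamma envelope is
# systematically tied to a preferred theta phase, near 0 when unrelated.
mi = np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)
```

Shuffling the gamma envelope relative to the theta phase would collapse `mi` toward zero, which is the usual surrogate test for significance.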
Affiliation(s)
- Prasandhya A Yusuf
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Faculty of Medicine University of Indonesia, Department of Medical Physiology and Biophysics / Medical Technology IMERI, Jakarta, Indonesia
- Peter Hubka
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Wiebke Konerding
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Rüdiger Land
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Jochen Tillein
- J.W. Goethe University, Department of Otorhinolaryngology, Frankfurt am Main, Germany
- Andrej Kral
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Australian Hearing Hub, School of Medicine and Health Sciences, Macquarie University, Sydney, Australia
5. Jiang Z, An X, Liu S, Yin E, Yan Y, Ming D. Beyond alpha band: prestimulus local oscillation and interregional synchrony of the beta band shape the temporal perception of the audiovisual beep-flash stimulus. J Neural Eng 2024; 21:036035. PMID: 37419108. DOI: 10.1088/1741-2552/ace551.
Abstract
Objective. Multisensory integration is more likely to occur if multimodal inputs fall within a narrow temporal window called the temporal binding window (TBW). Prestimulus local neural oscillations and interregional synchrony within sensory areas can modulate cross-modal integration. Previous work has examined the role of ongoing neural oscillations in audiovisual temporal integration, but without reaching a unified conclusion. This study explored whether local ongoing neural oscillations and interregional audiovisual synchrony modulate audiovisual temporal integration. Approach. Human participants performed a simultaneity judgment (SJ) task with beep-flash stimuli while electroencephalography was recorded. We focused on the two stimulus onset asynchrony (SOA) conditions in which subjects reported ∼50% synchronous responses with auditory- and visual-leading SOAs (A50V and V50A). Main results. Alpha-band power was larger for synchronous responses over central-right posterior and posterior sensors in the A50V and V50A conditions, respectively, suggesting that alpha power reflects neuronal excitability in the auditory or visual cortex, which can modulate audiovisual temporal perception depending on the leading sense. Additionally, SJs were modulated by opposite phases of the alpha (5-10 Hz) and low beta (14-20 Hz) bands in the A50V condition, and by the low beta band (14-18 Hz) in the V50A condition. One cycle of alpha or two cycles of beta oscillations matched an auditory-leading TBW of ∼86 ms, while two cycles of beta oscillations matched a visual-leading TBW of ∼105 ms, indicating that opposite phases in the alpha and beta bands reflect opposite states of cortical excitability that modulate audiovisual SJs. Finally, we found stronger high beta (21-28 Hz) audiovisual phase synchronization for synchronous responses in the A50V condition; this beta-band phase synchrony may help maintain information flow between visual and auditory regions in a top-down manner. Significance. These results clarify whether and how the prestimulus brain state, including local neural oscillations and functional connectivity between brain regions, affects audiovisual temporal integration.
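Interregional phase synchronization of the sort reported in the high beta band is often quantified with the phase-locking value (PLV): the consistency over time of the phase difference between two channels. A minimal sketch on two synthetic channels sharing a lagged 25 Hz rhythm, not the study's own connectivity pipeline; all signal parameters are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(3)
fs, n_trials, n_samples = 500, 40, 500      # 1 s trials
t = np.arange(n_samples) / fs

def band_phase(x, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype='band', fs=fs, output='sos')
    return np.angle(hilbert(sosfiltfilt(sos, x)))

# Two simulated channels ("auditory", "visual") share a 25 Hz rhythm with a
# fixed lag, plus independent noise -> a stable phase difference over time.
plvs = []
for _ in range(n_trials):
    phi0 = rng.uniform(0, 2 * np.pi)        # rhythm phase varies across trials
    ch_a = np.sin(2 * np.pi * 25 * t + phi0) + rng.normal(0, 0.8, n_samples)
    ch_v = np.sin(2 * np.pi * 25 * t + phi0 - np.pi / 4) + rng.normal(0, 0.8, n_samples)
    dphi = band_phase(ch_a, 21, 28, fs) - band_phase(ch_v, 21, 28, fs)
    plvs.append(np.abs(np.mean(np.exp(1j * dphi))))  # PLV for one trial

plv = float(np.mean(plvs))                  # ~1 = stable phase relation, ~0 = none
```

Note that PLV is insensitive to the size of the lag itself (here a fixed π/4), only to how stable the phase difference is.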
Affiliation(s)
- Zeliang Jiang
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Xingwei An
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Shuang Liu
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Erwei Yin
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Ye Yan
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
- Defense Innovation Institute, Academy of Military Sciences (AMS), 100071 Beijing, People's Republic of China
- Tianjin Artificial Intelligence Innovation Center (TAIIC), 300457 Tianjin, People's Republic of China
- Dong Ming
- Academy of Medical Engineering and Translational Medicine, Tianjin University, 300072 Tianjin, People's Republic of China
6. Ahveninen J, Lee HJ, Yu HY, Lee CC, Chou CC, Ahlfors SP, Kuo WJ, Jääskeläinen IP, Lin FH. Visual Stimuli Modulate Local Field Potentials But Drive No High-Frequency Activity in Human Auditory Cortex. J Neurosci 2024; 44:e0890232023. PMID: 38129133. PMCID: PMC10869150. DOI: 10.1523/jneurosci.0890-23.2023.
Abstract
Neuroimaging studies suggest cross-sensory visual influences in human auditory cortices (ACs). Whether these influences reflect active visual processing in human ACs, which drives neuronal firing and concurrent broadband high-frequency activity (BHFA; >70 Hz), or whether they merely modulate sound processing is still debatable. Here, we presented auditory, visual, and audiovisual stimuli to 16 participants (7 women, 9 men) with stereo-EEG depth electrodes implanted near ACs for presurgical monitoring. Anatomically normalized group analyses were facilitated by inverse modeling of intracranial source currents. Analyses of intracranial event-related potentials (iERPs) suggested cross-sensory responses to visual stimuli in ACs, which lagged the earliest auditory responses by several tens of milliseconds. Visual stimuli also modulated the phase of intrinsic low-frequency oscillations and triggered 15-30 Hz event-related desynchronization in ACs. However, BHFA, a putative correlate of neuronal firing, was not significantly increased in ACs after visual stimuli, not even when they coincided with auditory stimuli. Intracranial recordings demonstrate cross-sensory modulations, but no indication of active visual processing in human ACs.
Affiliation(s)
- Jyrki Ahveninen
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Hsin-Ju Lee
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Hsiang-Yu Yu
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Cheng-Chia Lee
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Department of Neurosurgery, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- Chien-Chen Chou
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Seppo P Ahlfors
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Wen-Jui Kuo
- Institute of Neuroscience, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Iiro P Jääskeläinen
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
- International Laboratory of Social Neurobiology, Institute of Cognitive Neuroscience, Higher School of Economics, Moscow 101000, Russia
- Fa-Hsuan Lin
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
7. Zuo Y, Wang Z. Neural Oscillations and Multisensory Processing. Adv Exp Med Biol 2024; 1437:121-137. PMID: 38270857. DOI: 10.1007/978-981-99-7611-9_8.
Abstract
Neural oscillations play a role in sensory processing by coordinating synchronized neuronal activity. Synchronization of gamma oscillations supports local computation of feedforward signals, whereas synchronization of alpha-beta oscillations supports feedback processing over long-range areas. These spatially and spectrally segregated bidirectional signals may be integrated through cross-frequency coupling. Synchronization of neural oscillations has also been proposed as a mechanism for information integration across sensory modalities: a transient or rhythmic stimulus in one modality may phase-align ongoing neural oscillations in multiple sensory cortices, through cross-modal phase reset or cross-modal neural entrainment. Synchronized activity across multiple sensory cortices is more likely to boost activity in downstream areas. In contrast, asynchronous oscillations may impede signal processing and may contribute to sensory selection by setting oscillations in the target-related cortex and the distractor-related cortex to opposite phases.
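The cross-modal phase reset described above can be illustrated numerically: if a stimulus resets the phase of an ongoing oscillation, phase locking across trials (ITPC) is low before the stimulus and high after it. A toy simulation under that assumption, with the rhythm frequency, trial count, and window choices picked arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(4)
fs, n_trials, f0 = 500, 80, 10.0            # ongoing 10 Hz "alpha" rhythm
t = np.arange(-0.5, 1.0, 1 / fs)            # stimulus at t = 0

trials = np.empty((n_trials, t.size))
for i in range(n_trials):
    pre_phase = rng.uniform(0, 2 * np.pi)   # random ongoing phase per trial
    phase = np.where(t < 0,
                     2 * np.pi * f0 * t + pre_phase,  # before: random phase
                     2 * np.pi * f0 * t)              # after: phase is reset
    trials[i] = np.cos(phase) + 0.5 * rng.normal(0, 1, t.size)

def itpc(seg_t, seg):
    """Inter-trial phase coherence at f0 over a time window."""
    coeff = seg @ np.exp(-2j * np.pi * f0 * seg_t)
    return np.abs(np.mean(np.exp(1j * np.angle(coeff))))

pre = (t >= -0.4) & (t < 0.0)
post = (t >= 0.1) & (t < 0.5)
itpc_pre = itpc(t[pre], trials[:, pre])     # low: phases are random
itpc_post = itpc(t[post], trials[:, post])  # high: phases were reset
```

The contrast between the pre- and post-stimulus ITPC is the signature that distinguishes a phase reset of ongoing activity from a mere absence of oscillation.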
Affiliation(s)
- Yanfang Zuo
- Department of Neurology, Guangzhou First People's Hospital, School of Medicine, South China University of Technology, Guangzhou, China
- Center for Medical Research on Innovation and Translation, Institute of Clinical Medicine, Guangzhou First People's Hospital, School of Medicine, South China University of Technology, Guangzhou, China
- Zuoren Wang
- Institute of Neuroscience, State Key Laboratory of Neuroscience, CAS Center for Excellence in Brain Science & Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- University of Chinese Academy of Sciences, Beijing, China
8. Beker S, Molholm S. Do we all synch alike? Brain-body-environment interactions in ASD. Front Neural Circuits 2023; 17:1275896. PMID: 38186630. PMCID: PMC10769494. DOI: 10.3389/fncir.2023.1275896.
Abstract
Autism Spectrum Disorder (ASD) is characterized by rigidity of routines, restricted interests, and atypical social communication and interaction. Recent evidence for altered synchronization of neuro-oscillatory brain activity with regularities in the environment, and for altered peripheral nervous system function in ASD, presents promising directions for studying its pathophysiology and the relationship to the ASD clinical phenotype. Human cognition and action are significantly influenced by physiological rhythmic processes generated by both the central nervous system (CNS) and the autonomic nervous system (ANS). Normally, perception occurs in a dynamic context in which brain oscillations and autonomic signals synchronize with external events to optimally receive temporally predictable rhythmic information, leading to improved performance. Recent findings on time-sensitive coupling between the brain and the periphery during effective perception and successful social interaction in typically developing individuals highlight the brain-body-environment triad as a critical direction for the study of ASD. Here we offer a perspective on autism as a case in which the temporal dynamics of brain-body-environment coupling are impaired. We present evidence from the literature supporting the idea that in autism the nervous system fails to adaptively synchronize with temporally predictable events in the environment to optimize perception and behavior. This framework could lead to novel biomarkers of hallmark deficits in ASD such as cognitive rigidity and altered social interaction.
Affiliation(s)
- Shlomit Beker
- Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
9. Liu Y, Wang Z, Wei T, Zhou S, Yin Y, Mi Y, Liu X, Tang Y. Alterations of Audiovisual Integration in Alzheimer's Disease. Neurosci Bull 2023; 39:1859-1872. PMID: 37812301. PMCID: PMC10661680. DOI: 10.1007/s12264-023-01125-7.
Abstract
Audiovisual integration is a vital information process involved in cognition and is closely correlated with aging and Alzheimer's disease (AD). In this review, we evaluate the altered audiovisual integrative behavioral symptoms in AD and analyze the bidirectional relationships between AD pathologies and alterations in audiovisual integration. We suggest possible mechanisms underlying these alterations in AD, including an imbalance between energy demand and supply, activity-dependent degeneration, disrupted brain networks, and cognitive resource overloading. Based on clinical characteristics, including electrophysiological and imaging data related to audiovisual integration, we emphasize the value of audiovisual integration alterations as potential biomarkers for the early diagnosis and progression of AD. We also highlight that treatments targeting audiovisual integration have produced widespread pathological improvements in AD animal models and cognitive improvements in AD patients. Moreover, investigating audiovisual integration alterations in AD provides new insight into sensory information processing.
Affiliation(s)
- Yufei Liu
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Zhibin Wang
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Tao Wei
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Shaojiong Zhou
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Yunsi Yin
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Yingxin Mi
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Xiaoduo Liu
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
- Yi Tang
- Department of Neurology and Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, National Center for Neurological Disorders, Beijing, 100053, China
10. Tan SHJ, Kalashnikova M, Di Liberto GM, Crosse MJ, Burnham D. Seeing a Talking Face Matters: Gaze Behavior and the Auditory-Visual Speech Benefit in Adults' Cortical Tracking of Infant-directed Speech. J Cogn Neurosci 2023; 35:1741-1759. PMID: 37677057. DOI: 10.1162/jocn_a_02044.
Abstract
In face-to-face conversations, listeners gather visual speech information from a speaker's talking face that enhances their perception of the incoming auditory speech signal. This auditory-visual (AV) speech benefit is evident even in quiet environments but is stronger in situations that require greater listening effort such as when the speech signal itself deviates from listeners' expectations. One example is infant-directed speech (IDS) presented to adults. IDS has exaggerated acoustic properties that are easily discriminable from adult-directed speech (ADS). Although IDS is a speech register that adults typically use with infants, no previous neurophysiological study has directly examined whether adult listeners process IDS differently from ADS. To address this, the current study simultaneously recorded EEG and eye-tracking data from adult participants as they were presented with auditory-only (AO), visual-only, and AV recordings of IDS and ADS. Eye-tracking data were recorded because looking behavior to the speaker's eyes and mouth modulates the extent of AV speech benefit experienced. Analyses of cortical tracking accuracy revealed that cortical tracking of the speech envelope was significant in AO and AV modalities for IDS and ADS. However, the AV speech benefit [i.e., AV > (A + V)] was only present for IDS trials. Gaze behavior analyses indicated differences in looking behavior during IDS and ADS trials. Surprisingly, looking behavior to the speaker's eyes and mouth was not correlated with cortical tracking accuracy. Additional exploratory analyses indicated that attention to the whole display was negatively correlated with cortical tracking accuracy of AO and visual-only trials in IDS. Our results underscore the nuances involved in the relationship between neurophysiological AV speech benefit and looking behavior.
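Cortical tracking accuracy of the kind analyzed here is usually quantified by reconstructing the speech envelope from the EEG with a lagged linear (backward, TRF-style) model and correlating the reconstruction with the actual envelope. A minimal sketch on synthetic data, not the authors' pipeline: the simulated "EEG" is just a delayed, noisy copy of the envelope, and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
fs, dur = 64, 120                           # envelope-rate sampling, 2 minutes
n = fs * dur

# A slow, smooth stand-in for a speech envelope.
env = np.convolve(rng.normal(0, 1, n), np.ones(8) / 8, mode='same')

# Simulated "EEG": a delayed, noisy copy of the envelope (cortical tracking).
delay = 6                                   # ~94 ms neural delay at 64 Hz
eeg = np.roll(env, delay) + rng.normal(0, 1.0, n)

# Backward model: reconstruct the envelope from a window of EEG lags,
# fit on the first half of the data and evaluated on the held-out half.
lags = np.arange(16)
X = np.stack([np.roll(eeg, -L) for L in lags], axis=1)
half = n // 2
w, *_ = np.linalg.lstsq(X[:half], env[:half], rcond=None)
pred = X[half:] @ w

# Tracking accuracy = correlation between reconstruction and true envelope.
r = float(np.corrcoef(pred, env[half:])[0, 1])
```

In an actual AV-benefit analysis, such accuracies would be computed per condition and the AV accuracy compared against the A+V model, which this sketch does not attempt.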
Affiliation(s)
- Sok Hui Jessica Tan
- The MARCS Institute of Brain, Behaviour and Development, Western Sydney University, Australia
- Science of Learning in Education Centre, Office of Education Research, National Institute of Education, Nanyang Technological University, Singapore
- Marina Kalashnikova
- The Basque Center on Cognition, Brain and Language
- IKERBASQUE, Basque Foundation for Science
- Giovanni M Di Liberto
- ADAPT Centre, School of Computer Science and Statistics, Trinity College Institute of Neuroscience, Trinity College, The University of Dublin, Ireland
- Michael J Crosse
- SEGOTIA, Galway, Ireland
- Trinity Center for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland
- Denis Burnham
- The MARCS Institute of Brain, Behaviour and Development, Western Sydney University, Australia
11. Symeonidou ER, Ferris DP. Visual Occlusions Result in Phase Synchrony Within Multiple Brain Regions Involved in Sensory Processing and Balance Control. IEEE Trans Neural Syst Rehabil Eng 2023; 31:3772-3780. PMID: 37725737. PMCID: PMC10616968. DOI: 10.1109/tnsre.2023.3317055.
Abstract
There is a need to develop appropriate balance training interventions to minimize the risk of falls. Recently, we found that intermittent visual occlusions can substantially improve the effectiveness and retention of balance beam walking practice (Symeonidou & Ferris, 2022). Here, we sought to determine how intermittent visual occlusions affect electrocortical activity during beam walking. We hypothesized that areas involved in sensorimotor processing and balance control would show spectral power changes and inter-trial coherence modulations after loss and restoration of vision. Ten healthy young adults practiced walking on a treadmill-mounted balance beam while wearing high-density EEG and experiencing recurring visual occlusions. Results revealed spectral power fluctuations and inter-trial coherence changes in the visual, occipital, temporal, and sensorimotor cortices as well as the posterior parietal cortex and the anterior cingulate. We observed a prolonged alpha increase in the occipital, temporal, sensorimotor, and posterior parietal cortices after occlusion onset. In contrast, the anterior cingulate showed a strong alpha and theta increase after occlusion offset. Transient phase synchrony in the alpha, theta, and beta bands appeared within the sensory, posterior parietal, and anterior cingulate cortices immediately after occlusion onset and offset. Intermittent visual occlusions thus induced electrocortical spectral power and inter-trial coherence changes across a wide range of frequencies in cortical areas relevant to multisensory integration and processing as well as balance control. This training intervention could be implemented in senior and rehabilitation centers, improving the quality of life of elderly and neurologically impaired individuals.
12
Wakim KM, Foxe JJ, Molholm S. Cued motor processing in autism and typical development: A high-density electrical mapping study of response-locked neural activity in children and adolescents. Eur J Neurosci 2023; 58:2766-2786. [PMID: 37340622 DOI: 10.1111/ejn.16063] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Received: 11/12/2022] [Revised: 05/29/2023] [Accepted: 05/30/2023] [Indexed: 06/22/2023]
Abstract
Motor atypicalities are common in autism spectrum disorder (ASD) and are often evident prior to classical ASD symptoms. Despite evidence of differences in neural processing during imitation in autistic individuals, research on the integrity and spatiotemporal dynamics of basic motor processing is surprisingly sparse. To address this need, we analysed electroencephalography (EEG) data recorded from a large sample of autistic (n = 84) and neurotypical (n = 84) children and adolescents while they performed an audiovisual speeded reaction time (RT) task. Analyses focused on RTs and response-locked motor-related electrical brain responses over frontoparietal scalp regions: the late Bereitschaftspotential, the motor potential and the reafferent potential. Evaluation of behavioural task performance indicated greater RT variability and lower hit rates in autistic participants compared to age-matched neurotypical participants. Overall, the data revealed clear motor-related neural responses in ASD, but with subtle differences relative to typically developing participants evident over fronto-central and bilateral parietal scalp sites prior to response onset. Group differences were further parsed as a function of age (6-9, 9-12 and 12-15 years), sensory cue preceding the response (auditory, visual and bi-sensory audiovisual) and RT quartile. Group differences in motor-related processing were most prominent in the youngest group of children (age 6-9), with attenuated cortical responses observed for young autistic participants. Future investigations assessing the integrity of such motor processes in younger children, where larger differences may be present, are warranted.
Affiliation(s)
- Kathryn-Mary Wakim
- The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, New York, USA
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, New York, USA
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, New York, USA
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
13
Lestang JH, Cai H, Averbeck BB, Cohen YE. Functional network properties of the auditory cortex. Hear Res 2023; 433:108768. [PMID: 37075536 PMCID: PMC10205700 DOI: 10.1016/j.heares.2023.108768] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 11/28/2022] [Revised: 03/27/2023] [Accepted: 04/11/2023] [Indexed: 04/21/2023]
Abstract
The auditory system transforms auditory stimuli from the external environment into perceptual auditory objects. Recent studies have focused on the contribution of the auditory cortex to this transformation. Other studies have yielded important insights into the contributions of neural activity in the auditory cortex to cognition and decision-making. However, despite this important work, the relationship between auditory-cortex activity and behavior/perception has not been fully elucidated. Two of the more important gaps in our understanding are (1) the specific and differential contributions of different fields of the auditory cortex to auditory perception and behavior and (2) the way networks of auditory neurons impact and facilitate auditory information processing. Here, we focus on recent work from non-human-primate models of hearing, review work related to these gaps, and put forth challenges to further our understanding of how single-unit activity and network activity in different cortical fields contribute to behavior and perception.
Affiliation(s)
- Jean-Hugues Lestang
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, PA 19104, USA
- Huaizhen Cai
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, PA 19104, USA
- Bruno B Averbeck
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Yale E Cohen
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, PA 19104, USA; Neuroscience, University of Pennsylvania, Philadelphia, PA 19104, USA; Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, USA
14
Ronconi L, Vitale A, Federici A, Mazzoni N, Battaglini L, Molteni M, Casartelli L. Neural dynamics driving audio-visual integration in autism. Cereb Cortex 2023; 33:543-556. [PMID: 35266994 DOI: 10.1093/cercor/bhac083] [Citation(s) in RCA: 16] [Impact Index Per Article: 8.0] [Received: 10/08/2021] [Revised: 02/04/2022] [Indexed: 02/03/2023]
Abstract
Audio-visual (AV) integration plays a crucial role in supporting social functions and communication in autism spectrum disorder (ASD). However, behavioral findings remain mixed and, importantly, little is known about the underlying neurophysiological bases. Studies in neurotypical adults indicate that oscillatory brain activity in different frequencies subserves AV integration, pointing to a central role of (i) individual alpha frequency (IAF), which would determine the width of the cross-modal binding window; (ii) pre-/peri-stimulus theta oscillations, which would reflect the expectation of AV co-occurrence; (iii) post-stimulus oscillatory phase reset, which would temporally align the different unisensory signals. Here, we investigate the neural correlates of AV integration in children with ASD and typically developing (TD) peers, measuring electroencephalography during resting state and in an AV integration paradigm. As for neurotypical adults, AV integration dynamics in TD children could be predicted by the IAF measured at rest and by a modulation of anticipatory theta oscillations at single-trial level. Conversely, in ASD participants, AV integration/segregation was driven exclusively by the neural processing of the auditory stimulus and the consequent auditory-induced phase reset in visual regions, suggesting that a disproportionate elaboration of the auditory input could be the main factor characterizing atypical AV integration in autism.
Affiliation(s)
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, 20132 Milan, Italy; Division of Neuroscience, IRCCS San Raffaele Scientific Institute, 20132 Milan, Italy
- Andrea Vitale
- Theoretical and Cognitive Neuroscience Unit, Child Psychopathology Department, Scientific Institute IRCCS Eugenio Medea, 23842 Bosisio Parini, Italy
- Alessandra Federici
- Theoretical and Cognitive Neuroscience Unit, Child Psychopathology Department, Scientific Institute IRCCS Eugenio Medea, 23842 Bosisio Parini, Italy; Sensory Experience Dependent (SEED) group, IMT School for Advanced Studies Lucca, 55100 Lucca, Italy
- Noemi Mazzoni
- Theoretical and Cognitive Neuroscience Unit, Child Psychopathology Department, Scientific Institute IRCCS Eugenio Medea, 23842 Bosisio Parini, Italy; Laboratory for Autism and Neurodevelopmental Disorders, Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy; Department of Psychology and Cognitive Science, University of Trento, 38068 Rovereto, Italy
- Luca Battaglini
- Department of General Psychology, University of Padova, 35131 Padova, Italy; Department of Physics and Astronomy "Galileo Galilei", University of Padova, 35131 Padova, Italy
- Massimo Molteni
- Child Psychopathology Department, Scientific Institute IRCCS Eugenio Medea, 23842 Bosisio Parini, Italy
- Luca Casartelli
- Theoretical and Cognitive Neuroscience Unit, Child Psychopathology Department, Scientific Institute IRCCS Eugenio Medea, 23842 Bosisio Parini, Italy
15
Cuppini C, Magosso E, Monti M, Ursino M, Yau JM. A neurocomputational analysis of visual bias on bimanual tactile spatial perception during a crossmodal exposure. Front Neural Circuits 2022; 16:933455. [PMID: 36439678 PMCID: PMC9684216 DOI: 10.3389/fncir.2022.933455] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 04/30/2022] [Accepted: 10/13/2022] [Indexed: 11/11/2022]
Abstract
Vision and touch both support spatial information processing. These sensory systems also exhibit highly specific interactions in spatial perception, which may reflect multisensory representations that are learned through visuo-tactile (VT) experiences. Recently, Wani and colleagues reported that task-irrelevant visual cues bias tactile perception, in a brightness-dependent manner, on a task requiring participants to detect unimanual and bimanual cues. Importantly, tactile performance remained spatially biased after VT exposure, even when no visual cues were presented. These effects on bimanual touch conceivably reflect cross-modal learning, but the neural substrates that are changed by VT experience are unclear. We previously described a neural network capable of simulating VT spatial interactions. Here, we exploited this model to test different hypotheses regarding potential network-level changes that may underlie the VT learning effects. Simulation results indicated that VT learning effects are inconsistent with plasticity restricted to unisensory visual and tactile hand representations. Similarly, VT learning effects were also inconsistent with changes restricted to the strength of inter-hemispheric inhibitory interactions. Instead, we found that both the hand representations and the inter-hemispheric inhibitory interactions need to be plastic to fully recapitulate VT learning effects. Our results imply that crossmodal learning of bimanual spatial perception involves multiple changes distributed over a VT processing cortical network.
Affiliation(s)
- Cristiano Cuppini
- Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Elisa Magosso
- Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Melissa Monti
- Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Mauro Ursino
- Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Jeffrey M. Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, United States
16
Mercier MR, Dubarry AS, Tadel F, Avanzini P, Axmacher N, Cellier D, Vecchio MD, Hamilton LS, Hermes D, Kahana MJ, Knight RT, Llorens A, Megevand P, Melloni L, Miller KJ, Piai V, Puce A, Ramsey NF, Schwiedrzik CM, Smith SE, Stolk A, Swann NC, Vansteensel MJ, Voytek B, Wang L, Lachaux JP, Oostenveld R. Advances in human intracranial electroencephalography research, guidelines and good practices. Neuroimage 2022; 260:119438. [PMID: 35792291 PMCID: PMC10190110 DOI: 10.1016/j.neuroimage.2022.119438] [Citation(s) in RCA: 56] [Impact Index Per Article: 18.7] [Received: 12/15/2021] [Revised: 05/23/2022] [Accepted: 06/30/2022] [Indexed: 12/11/2022]
Abstract
Since the second half of the twentieth century, intracranial electroencephalography (iEEG), including both electrocorticography (ECoG) and stereo-electroencephalography (sEEG), has provided an intimate view into the human brain. At the interface between fundamental research and the clinic, iEEG provides both high temporal resolution and high spatial specificity, but comes with constraints, such as the sparse electrode sampling tailored to each individual. Over the years, researchers in neuroscience have developed their practices to make the most of the iEEG approach. Here we offer a critical review of iEEG research practices in a didactic framework for newcomers, as well as addressing issues encountered by proficient researchers. The scope is threefold: (i) review common practices in iEEG research, (ii) suggest potential guidelines for working with iEEG data and answer frequently asked questions based on the most widespread practices, and (iii) based on current neurophysiological knowledge and methodologies, pave the way to good practice standards in iEEG research. The organization of this paper follows the steps of iEEG data processing. The first section contextualizes iEEG data collection. The second section focuses on localization of intracranial electrodes. The third section highlights the main pre-processing steps. The fourth section presents iEEG signal analysis methods. The fifth section discusses statistical approaches. The sixth section draws some unique perspectives on iEEG research. Finally, to ensure a consistent nomenclature throughout the manuscript and to align with other guidelines, e.g., Brain Imaging Data Structure (BIDS) and the OHBM Committee on Best Practices in Data Analysis and Sharing (COBIDAS), we provide a glossary to disambiguate terms related to iEEG research.
Affiliation(s)
- Manuel R Mercier
- INSERM, INS, Institut de Neurosciences des Systèmes, Aix-Marseille University, Marseille, France
- François Tadel
- Signal & Image Processing Institute, University of Southern California, Los Angeles, CA, United States of America
- Pietro Avanzini
- Institute of Neuroscience, National Research Council of Italy, Parma, Italy
- Nikolai Axmacher
- Department of Neuropsychology, Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr University Bochum, Universitätsstraße 150, Bochum 44801, Germany; State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, 19 Xinjiekou Outer St, Beijing 100875, China
- Dillan Cellier
- Department of Cognitive Science, University of California, La Jolla, San Diego, United States of America
- Maria Del Vecchio
- Institute of Neuroscience, National Research Council of Italy, Parma, Italy
- Liberty S Hamilton
- Department of Neurology, Dell Medical School, The University of Texas at Austin, Austin, TX, United States of America; Institute for Neuroscience, The University of Texas at Austin, Austin, TX, United States of America; Department of Speech, Language, and Hearing Sciences, Moody College of Communication, The University of Texas at Austin, Austin, TX, United States of America
- Dora Hermes
- Department of Physiology and Biomedical Engineering, Mayo Clinic, Rochester, MN, United States of America
- Michael J Kahana
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, United States of America
- Robert T Knight
- Department of Psychology and the Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94720, United States of America
- Anais Llorens
- Helen Wills Neuroscience Institute, University of California, Berkeley, United States of America
- Pierre Megevand
- Department of Clinical Neurosciences, Faculty of Medicine, University of Geneva, Geneva, Switzerland
- Lucia Melloni
- Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Grüneburgweg 14, Frankfurt am Main 60322, Germany; Department of Neurology, NYU Grossman School of Medicine, 145 East 32nd Street, Room 828, New York, NY 10016, United States of America
- Kai J Miller
- Department of Neurosurgery, Mayo Clinic, Rochester, MN 55905, USA
- Vitória Piai
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands; Department of Medical Psychology, Radboudumc, Donders Centre for Medical Neuroscience, Nijmegen, the Netherlands
- Aina Puce
- Department of Psychological & Brain Sciences, Programs in Neuroscience, Cognitive Science, Indiana University, Bloomington, IN, United States of America
- Nick F Ramsey
- Department of Neurology and Neurosurgery, UMC Utrecht Brain Center, UMC Utrecht, the Netherlands
- Caspar M Schwiedrzik
- Neural Circuits and Cognition Lab, European Neuroscience Institute Göttingen - A Joint Initiative of the University Medical Center Göttingen and the Max Planck Society, Göttingen, Germany; Perception and Plasticity Group, German Primate Center, Leibniz Institute for Primate Research, Göttingen, Germany
- Sydney E Smith
- Neurosciences Graduate Program, University of California, La Jolla, San Diego, United States of America
- Arjen Stolk
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands; Psychological and Brain Sciences, Dartmouth College, Hanover, NH, United States of America
- Nicole C Swann
- Department of Human Physiology, University of Oregon, United States of America
- Mariska J Vansteensel
- Department of Neurology and Neurosurgery, UMC Utrecht Brain Center, UMC Utrecht, the Netherlands
- Bradley Voytek
- Department of Cognitive Science, University of California, La Jolla, San Diego, United States of America; Neurosciences Graduate Program, University of California, La Jolla, San Diego, United States of America; Halıcıoğlu Data Science Institute, University of California, La Jolla, San Diego, United States of America; Kavli Institute for Brain and Mind, University of California, La Jolla, San Diego, United States of America
- Liang Wang
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Jean-Philippe Lachaux
- Lyon Neuroscience Research Center, EDUWELL Team, INSERM UMRS 1028, CNRS UMR 5292, Université Claude Bernard Lyon 1, Université de Lyon, Lyon F-69000, France
- Robert Oostenveld
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands; NatMEG, Karolinska Institutet, Stockholm, Sweden
17
Ross LA, Molholm S, Butler JS, Del Bene VA, Foxe JJ. Neural correlates of multisensory enhancement in audiovisual narrative speech perception: a fMRI investigation. Neuroimage 2022; 263:119598. [PMID: 36049699 DOI: 10.1016/j.neuroimage.2022.119598] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Received: 02/18/2022] [Revised: 08/26/2022] [Accepted: 08/28/2022] [Indexed: 11/25/2022]
Abstract
This fMRI study investigated the effect of seeing the articulatory movements of a speaker while listening to a naturalistic narrative stimulus. Its goal was to identify regions of the language network showing multisensory enhancement under synchronous audiovisual conditions. We expected this enhancement to emerge in regions known to underlie the integration of auditory and visual information, such as the posterior superior temporal gyrus, as well as parts of the broader language network, including the semantic system. To this end we presented 53 participants with a continuous narration of a story in auditory alone, visual alone, and both synchronous and asynchronous audiovisual speech conditions while recording brain activity using BOLD fMRI. We found multisensory enhancement in an extensive network of regions underlying multisensory integration and parts of the semantic network as well as extralinguistic regions not usually associated with multisensory integration, namely the primary visual cortex and the bilateral amygdala. Analysis also revealed involvement of thalamic brain regions along the visual and auditory pathways more commonly associated with early sensory processing. We conclude that under natural listening conditions, multisensory enhancement not only involves sites of multisensory integration but many regions of the wider semantic network and includes regions associated with extralinguistic sensory, perceptual and cognitive processing.
Affiliation(s)
- Lars A Ross
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, 14642, USA; Department of Imaging Sciences, University of Rochester Medical Center, University of Rochester School of Medicine and Dentistry, Rochester, New York, 14642, USA; The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, 10461, USA
- Sophie Molholm
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, 14642, USA; The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, 10461, USA
- John S Butler
- The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, 10461, USA; School of Mathematical Sciences, Technological University Dublin, Kevin Street Campus, Dublin, Ireland
- Victor A Del Bene
- The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, 10461, USA; University of Alabama at Birmingham, Heersink School of Medicine, Department of Neurology, Birmingham, Alabama, 35233, USA
- John J Foxe
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, 14642, USA; The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, 10461, USA
18
Crosse MJ, Foxe JJ, Tarrit K, Freedman EG, Molholm S. Resolution of impaired multisensory processing in autism and the cost of switching sensory modality. Commun Biol 2022; 5:601. [PMID: 35773473 PMCID: PMC9246932 DOI: 10.1038/s42003-022-03519-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Received: 05/17/2021] [Accepted: 05/23/2022] [Indexed: 11/09/2022]
Abstract
Children with autism spectrum disorders (ASD) exhibit alterations in multisensory processing, which may contribute to the prevalence of social and communicative deficits in this population. Resolution of multisensory deficits has been observed in teenagers with ASD for complex, social speech stimuli; however, whether this resolution extends to more basic multisensory processing deficits remains unclear. Here, in a cohort of 364 participants we show using simple, non-social audiovisual stimuli that deficits in multisensory processing observed in high-functioning children and teenagers with ASD are not evident in adults with the disorder. Computational modelling indicated that multisensory processing transitions from a default state of competition to one of facilitation, and that this transition is delayed in ASD. Further analysis revealed group differences in how sensory channels are weighted, and how this is impacted by preceding cross-sensory inputs. Our findings indicate that there is a complex and dynamic interplay among the sensory systems that differs considerably in individuals with ASD.
Affiliation(s)
- Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; Trinity Centre for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Katy Tarrit
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Edward G Freedman
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
19
Michail G, Senkowski D, Holtkamp M, Wächter B, Keil J. Early beta oscillations in multisensory association areas underlie crossmodal performance enhancement. Neuroimage 2022; 257:119307. [PMID: 35577024 DOI: 10.1016/j.neuroimage.2022.119307] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 03/09/2022] [Revised: 04/29/2022] [Accepted: 05/10/2022] [Indexed: 11/28/2022]
Abstract
The combination of signals from different sensory modalities can enhance perception and facilitate behavioral responses. While previous research described crossmodal influences in a wide range of tasks, it remains unclear how such influences drive performance enhancements. In particular, the neural mechanisms underlying performance-relevant crossmodal influences, as well as the latency and spatial profile of such influences are not well understood. Here, we examined data from high-density electroencephalography (N = 30) recordings to characterize the oscillatory signatures of crossmodal facilitation of response speed, as manifested in the speeding of visual responses by concurrent task-irrelevant auditory information. Using a data-driven analysis approach, we found that individual gains in response speed correlated with larger beta power difference (13-25 Hz) between the audiovisual and the visual condition, starting within 80 ms after stimulus onset in the secondary visual cortex and in multisensory association areas in the parietal cortex. In addition, we examined data from electrocorticography (ECoG) recordings in four epileptic patients in a comparable paradigm. These ECoG data revealed reduced beta power in audiovisual compared with visual trials in the superior temporal gyrus (STG). Collectively, our data suggest that the crossmodal facilitation of response speed is associated with reduced early beta power in multisensory association and secondary visual areas. The reduced early beta power may reflect an auditory-driven feedback signal to improve visual processing through attentional gating. These findings improve our understanding of the neural mechanisms underlying crossmodal response speed facilitation and highlight the critical role of beta oscillations in mediating behaviorally relevant multisensory processing.
Affiliation(s)
- Georgios Michail
- Department of Psychiatry and Psychotherapy, Charité Campus Mitte (CCM), Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Charitéplatz 1, Berlin 10117, Germany
- Daniel Senkowski
- Department of Psychiatry and Psychotherapy, Charité Campus Mitte (CCM), Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Charitéplatz 1, Berlin 10117, Germany
- Martin Holtkamp
- Epilepsy-Center Berlin-Brandenburg, Institute for Diagnostics of Epilepsy, Berlin 10365, Germany; Department of Neurology, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Charité Campus Mitte (CCM), Charitéplatz 1, Berlin 10117, Germany
- Bettina Wächter
- Epilepsy-Center Berlin-Brandenburg, Institute for Diagnostics of Epilepsy, Berlin 10365, Germany
- Julian Keil
- Biological Psychology, Christian-Albrechts-University Kiel, Kiel 24118, Germany
20
Brang D, Plass J, Sherman A, Stacey WC, Wasade VS, Grabowecky M, Ahn E, Towle VL, Tao JX, Wu S, Issa NP, Suzuki S. Visual cortex responds to sound onset and offset during passive listening. J Neurophysiol 2022; 127:1547-1563. [PMID: 35507478 DOI: 10.1152/jn.00164.2021] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Indexed: 11/22/2022]
Abstract
Sounds enhance our ability to detect, localize, and respond to co-occurring visual targets. Research suggests that sounds improve visual processing by resetting the phase of ongoing oscillations in visual cortex. However, it remains unclear what information is relayed from the auditory system to visual areas and if sounds modulate visual activity even in the absence of visual stimuli (e.g., during passive listening). Using intracranial electroencephalography (iEEG) in humans, we examined the sensitivity of visual cortex to three forms of auditory information during a passive listening task: auditory onset responses, auditory offset responses, and rhythmic entrainment to sounds. Because some auditory neurons respond to both sound onsets and offsets, visual timing and duration processing may benefit from each. Additionally, if auditory entrainment information is relayed to visual cortex, it could support the processing of complex stimulus dynamics that are aligned between auditory and visual stimuli. Results demonstrate that in visual cortex, amplitude-modulated sounds elicited transient onset and offset responses in multiple areas, but no entrainment to sound modulation frequencies. These findings suggest that activity in visual cortex (as measured with iEEG in response to auditory stimuli) may not be affected by temporally fine-grained auditory stimulus dynamics during passive listening (though it remains possible that this signal may be observable with simultaneous auditory-visual stimuli). Moreover, auditory responses were maximal in low-level visual cortex, potentially implicating a direct pathway for rapid interactions between auditory and visual cortices. This mechanism may facilitate perception by time-locking visual computations to environmental events marked by auditory discontinuities.
Affiliation(s)
- David Brang: Department of Psychology, University of Michigan, Ann Arbor, MI, United States
- John Plass: Department of Psychology, University of Michigan, Ann Arbor, MI, United States
- Aleksandra Sherman: Department of Cognitive Science, Occidental College, Los Angeles, CA, United States
- William C Stacey: Department of Neurology, University of Michigan, Ann Arbor, MI, United States
- Marcia Grabowecky: Department of Psychology, Northwestern University, Evanston, IL, United States
- EunSeon Ahn: Department of Psychology, University of Michigan, Ann Arbor, MI, United States
- Vernon L Towle: Department of Neurology, The University of Chicago, Chicago, IL, United States
- James X Tao: Department of Neurology, The University of Chicago, Chicago, IL, United States
- Shasha Wu: Department of Neurology, The University of Chicago, Chicago, IL, United States
- Naoum P Issa: Department of Neurology, The University of Chicago, Chicago, IL, United States
- Satoru Suzuki: Department of Psychology, Northwestern University, Evanston, IL, United States
21
Woolnough O, Forseth KJ, Rollo PS, Roccaforte ZJ, Tandon N. Event-Related Phase Synchronization Propagates Rapidly across Human Ventral Visual Cortex. Neuroimage 2022; 256:119262. [PMID: 35504563] [PMCID: PMC9382906] [DOI: 10.1016/j.neuroimage.2022.119262]
Abstract
Visual inputs to early visual cortex integrate with semantic, linguistic and memory inputs in higher visual cortex, in a manner that is rapid and accurate and enables complex computations such as face recognition and word reading. This implies the existence of fundamental organizational principles that enable such efficiency. To elaborate on these principles, we performed intracranial recordings in 82 individuals while they performed tasks of varying visual and cognitive complexity. We discovered that visual inputs induce highly organized posterior-to-anterior propagating patterns of phase modulation across the ventral occipitotemporal cortex. At individual electrodes there was a stereotyped temporal pattern of phase progression following both stimulus onset and offset, consistent across trials and tasks. The phase of low-frequency activity in anterior regions was predicted by the prior phase in posterior cortical regions. This spatiotemporal propagation of phase likely serves as a feed-forward organizational influence enabling the integration of information across the ventral visual stream. The phase modulation manifests as the early components of the event-related potential, one of the most commonly used measures in human electrophysiology. These findings illuminate fundamental organizational principles of the higher-order visual system that enable the rapid recognition and characterization of a variety of inputs.
Affiliation(s)
- Oscar Woolnough: Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX, 77030, United States of America; Texas Institute for Restorative Neurotechnologies, University of Texas Health Science Center at Houston, Houston, TX, 77030, United States of America
- Kiefer J Forseth: Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX, 77030, United States of America; Texas Institute for Restorative Neurotechnologies, University of Texas Health Science Center at Houston, Houston, TX, 77030, United States of America
- Patrick S Rollo: Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX, 77030, United States of America; Texas Institute for Restorative Neurotechnologies, University of Texas Health Science Center at Houston, Houston, TX, 77030, United States of America
- Zachary J Roccaforte: Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX, 77030, United States of America; Texas Institute for Restorative Neurotechnologies, University of Texas Health Science Center at Houston, Houston, TX, 77030, United States of America
- Nitin Tandon: Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX, 77030, United States of America; Texas Institute for Restorative Neurotechnologies, University of Texas Health Science Center at Houston, Houston, TX, 77030, United States of America; Memorial Hermann Hospital, Texas Medical Center, Houston, TX, 77030, United States of America
22
Jessica Tan SH, Kalashnikova M, Di Liberto GM, Crosse MJ, Burnham D. Seeing a Talking Face Matters: The Relationship between Cortical Tracking of Continuous Auditory-Visual Speech and Gaze Behaviour in Infants, Children and Adults. Neuroimage 2022; 256:119217. [PMID: 35436614] [DOI: 10.1016/j.neuroimage.2022.119217]
Abstract
An auditory-visual speech benefit, the benefit that visual speech cues bring to auditory speech perception, is experienced from early in infancy and is experienced to an increasing degree with age. While there is both behavioural and neurophysiological evidence for children and adults, only behavioural evidence exists for infants, as no neurophysiological study has provided a comprehensive examination of the auditory-visual speech benefit in infants. It is also surprising that most studies of the auditory-visual speech benefit do not concurrently report looking behaviour, especially since the benefit rests on the assumption that listeners attend to a speaker's talking face and since there are meaningful individual differences in looking behaviour. To address these gaps, we simultaneously recorded electroencephalographic (EEG) and eye-tracking data from 5-month-olds, 4-year-olds and adults as they were presented with a speaker in auditory-only (AO), visual-only (VO) and auditory-visual (AV) modes. Cortical tracking analyses using forward encoding models of the speech envelope revealed an auditory-visual speech benefit [i.e., AV > (A + V)] in 5-month-olds and adults but not in 4-year-olds. Examining cortical tracking accuracy in relation to looking behaviour showed that infants' relative attention to the speaker's mouth (vs. eyes) was positively correlated with cortical tracking accuracy of VO speech, whereas adults' attention to the display overall was negatively correlated with cortical tracking accuracy of VO speech. This study provides the first neurophysiological evidence of an auditory-visual speech benefit in infants, and our results suggest ways in which current models of speech processing can be fine-tuned.
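The forward-encoding ("temporal response function") analysis described in this abstract can be sketched compactly: ridge regression maps time-lagged copies of the speech envelope onto the EEG, and cortical tracking accuracy is the correlation between predicted and recorded signals on held-out data. This is an illustrative sketch only, not the authors' pipeline; the lag count, regularisation value, and function names are arbitrary assumptions.

```python
import numpy as np

def lag_matrix(env, n_lags):
    """Design matrix of time-lagged copies of the stimulus envelope."""
    n = len(env)
    X = np.zeros((n, n_lags))
    for k in range(n_lags):
        X[k:, k] = env[:n - k]
    return X

def fit_trf(env, eeg, n_lags=32, lam=1.0):
    """Ridge-regression forward model (TRF): envelope -> EEG channel."""
    X = lag_matrix(env, n_lags)
    # Solve the regularised normal equations (X'X + lam*I) w = X'y.
    return np.linalg.solve(X.T @ X + lam * np.eye(n_lags), X.T @ eeg)

def tracking_accuracy(env, eeg, w, n_lags=32):
    """Cortical tracking accuracy: Pearson r between predicted and recorded EEG."""
    pred = lag_matrix(env, n_lags) @ w
    return np.corrcoef(pred, eeg)[0, 1]
```

In this framework, an additive-model test of the AV benefit would compare tracking accuracy for AV recordings against the unisensory (A and V) models' combined prediction on the same data.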
Affiliation(s)
- S H Jessica Tan: The MARCS Institute of Brain, Behaviour and Development, Western Sydney University
- Marina Kalashnikova: The Basque Center on Cognition, Brain and Language; IKERBASQUE, Basque Foundation for Science
- Michael J Crosse: Trinity Center for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland
- Denis Burnham: The MARCS Institute of Brain, Behaviour and Development, Western Sydney University
23
Cherenkova L, Sokolova L. Age-Related Dynamics of Crossmodal Priming. Experimental Psychology (Russia) 2022. [DOI: 10.17759/exppsy.2022150405]
Abstract
The study aimed to determine the temporal dynamics of crossmodal priming in preschool children. The study involved 60 children aged 4 to 6 years (M = 5.6; SD = 1.2) and 20 adult subjects aged 17 to 23 years (M = 20.4; SD = 2.6). The priming paradigm was used as a research model. We determined the influence of prior visual stimulation on the speed and accuracy of identification of test sounds, depending on the congruence of their combination with visual objects and on the interval between the test and prime stimuli. We found that in 4-year-old children, prior visual information reduces the accuracy and speed of reaction to test sound stimuli (a negative priming effect). The magnitude of this negative priming effect decreases as the interval between prime and test stimuli increases. In 5-year-old children, the number of errors increases only when incongruent combinations of stimuli are presented (a negative priming effect); conversely, the reaction time decreases only in congruent trials in which the test stimulus is delayed relative to the prime by 150-500 ms (a positive priming effect). In 6-year-old children and adults, the accuracy of the reaction does not change, while the reaction speed increases significantly in congruent trials (positive priming effect) and decreases in incongruent trials (negative priming effect). The observed dynamics of the interaction between auditory and visual stimulation point to the ongoing development of attention and multisensory integration mechanisms in preschool children.
Affiliation(s)
- L.V. Sokolova: Saint Petersburg State University (SPbU)
24
Multisensory stimuli shift perceptual priors to facilitate rapid behavior. Sci Rep 2021; 11:23052. [PMID: 34845325] [PMCID: PMC8629992] [DOI: 10.1038/s41598-021-02566-8]
Abstract
Multisensory stimuli speed behavioral responses, but the mechanisms subserving these effects remain disputed. Historically, the observation that multisensory reaction times (RTs) outpace models assuming independent sensory channels has been taken as evidence for multisensory integration (the "redundant target effect"; RTE). However, this interpretation has been challenged by alternative explanations based on stimulus sequence effects, RT variability, and/or negative correlations in unisensory processing. To clarify the mechanisms subserving the RTE, we collected RTs from 78 undergraduates in a multisensory simple RT task. Based on previous neurophysiological findings, we hypothesized that the RTE was unlikely to reflect these alternative mechanisms, and more likely reflected pre-potentiation of sensory responses through crossmodal phase-resetting. Contrary to accounts based on stimulus sequence effects, we found that preceding stimuli explained only 3-9% of the variance in apparent RTEs. Comparing three plausible evidence accumulator models, we found that multisensory RT distributions were best explained by increased sensory evidence at stimulus onset. Because crossmodal phase-resetting increases cortical excitability before sensory input arrives, these results are consistent with a mechanism based on pre-potentiation through phase-resetting. Mathematically, this model entails increasing the prior log-odds of stimulus presence, providing a potential link between neurophysiological, behavioral, and computational accounts of multisensory interactions.
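The winning accumulator account — increased sensory evidence at stimulus onset, equivalent to raising the prior log-odds of stimulus presence — can be illustrated with a toy one-boundary diffusion simulation in which the multisensory condition simply starts accumulation closer to the bound. All parameter values below are arbitrary assumptions for illustration, not the paper's fitted parameters.

```python
import numpy as np

def simulate_rts(n_trials, drift=1.0, bound=2.0, start=0.0, noise=1.0,
                 dt=0.01, rng=None):
    """One-boundary drift-diffusion model of simple detection RTs.

    Evidence starts at `start` and accumulates with rate `drift` plus
    Gaussian noise until it reaches `bound`; the hitting time is the RT.
    Raising `start` mimics pre-potentiation of sensory responses at
    stimulus onset (a shift in the prior log-odds of stimulus presence).
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    rts = np.empty(n_trials)
    for i in range(n_trials):
        x, t = start, 0.0
        while x < bound:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts[i] = t
    return rts

# Multisensory trials start closer to the bound -> faster mean RT,
# with no change in drift rate.
unisensory = simulate_rts(400, start=0.0)
multisensory = simulate_rts(400, start=0.5)
```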
25
Zulfiqar I, Moerel M, Lage-Castellanos A, Formisano E, De Weerd P. Audiovisual Interactions Among Near-Threshold Oscillating Stimuli in the Far Periphery Are Phase-Dependent. Front Hum Neurosci 2021; 15:642341. [PMID: 34526884] [PMCID: PMC8435850] [DOI: 10.3389/fnhum.2021.642341]
Abstract
Recent studies have highlighted the possible contributions of direct connectivity between early sensory cortices to audiovisual integration. Anatomical connections between the early auditory and visual cortices are concentrated in visual sites representing the peripheral field of view. Here, we aimed to engage early sensory interactive pathways with simple, far-peripheral audiovisual stimuli (auditory noise and visual gratings). Using a modulation detection task in one modality performed at an 84% correct threshold level, we investigated multisensory interactions by simultaneously presenting weak stimuli from the other modality in which the temporal modulation was barely-detectable (at 55 and 65% correct detection performance). Furthermore, we manipulated the temporal congruence between the cross-sensory streams. We found evidence for an influence of barely-detectable visual stimuli on the response times for auditory stimuli, but not for the reverse effect. These visual-to-auditory influences only occurred for specific phase-differences (at onset) between the modulated audiovisual stimuli. We discuss our findings in the light of a possible role of direct interactions between early visual and auditory areas, along with contributions from the higher-order association cortex. In sum, our results extend the behavioral evidence of audio-visual processing to the far periphery, and suggest - within this specific experimental setting - an asymmetry between the auditory influence on visual processing and the visual influence on auditory processing.
Affiliation(s)
- Isma Zulfiqar: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Centre for Systems Biology, Maastricht University, Maastricht, Netherlands
- Michelle Moerel: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Centre for Systems Biology, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Centre (MBIC), Maastricht, Netherlands
- Agustin Lage-Castellanos: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Elia Formisano: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Centre for Systems Biology, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Centre (MBIC), Maastricht, Netherlands
- Peter De Weerd: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Centre for Systems Biology, Maastricht University, Maastricht, Netherlands
26
Opoku-Baah C, Schoenhaut AM, Vassall SG, Tovar DA, Ramachandran R, Wallace MT. Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021; 22:365-386. [PMID: 34014416] [PMCID: PMC8329114] [DOI: 10.1007/s10162-021-00789-0]
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions derived from this combination of information and that shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the state of our understanding at this point in time regarding this topic. Following a general introduction, the review is divided into 5 sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence in audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built based on available psychophysical data and that seek to provide greater mechanistic insights into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches toward understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception: scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
Affiliation(s)
- Collins Opoku-Baah: Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Adriana M Schoenhaut: Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Sarah G Vassall: Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- David A Tovar: Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ramnarayan Ramachandran: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Vision Research Center, Nashville, TN, USA
- Mark T Wallace: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Vision Research Center, Nashville, TN, USA; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Pharmacology, Vanderbilt University, Nashville, TN, USA
27
Hirst RJ, McGovern DP, Setti A, Shams L, Newell FN. What you see is what you hear: Twenty years of research using the Sound-Induced Flash Illusion. Neurosci Biobehav Rev 2020; 118:759-774. [DOI: 10.1016/j.neubiorev.2020.09.006]
28
Mégevand P, Mercier MR, Groppe DM, Zion Golumbic E, Mesgarani N, Beauchamp MS, Schroeder CE, Mehta AD. Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex. J Neurosci 2020; 40:8530-8542. [PMID: 33023923] [PMCID: PMC7605423] [DOI: 10.1523/jneurosci.0555-20.2020]
Abstract
Natural conversation is multisensory: when we can see the speaker's face, visual speech cues improve our comprehension. The neuronal mechanisms underlying this phenomenon remain unclear. The two main alternatives are visually mediated phase modulation of neuronal oscillations (excitability fluctuations) in auditory neurons and visual input-evoked responses in auditory neurons. Investigating this question using naturalistic audiovisual speech with intracranial recordings in humans of both sexes, we find evidence for both mechanisms. Remarkably, auditory cortical neurons track the temporal dynamics of purely visual speech using the phase of their slow oscillations and phase-related modulations in broadband high-frequency activity. Consistent with known perceptual enhancement effects, the visual phase reset amplifies the cortical representation of concomitant auditory speech. In contrast to this, and in line with earlier reports, visual input reduces the amplitude of evoked responses to concomitant auditory input. We interpret the combination of improved phase tracking and reduced response amplitude as evidence for more efficient and reliable stimulus processing in the presence of congruent auditory and visual speech inputs.

SIGNIFICANCE STATEMENT: Watching the speaker can facilitate our understanding of what is being said. The mechanisms responsible for this influence of visual cues on the processing of speech remain incompletely understood. We studied these mechanisms by recording the electrical activity of the human brain through electrodes implanted surgically inside the brain. We found that visual inputs can operate by directly activating auditory cortical areas, and also indirectly by modulating the strength of cortical responses to auditory input. Our results help to understand the mechanisms by which the brain merges auditory and visual speech into a unitary perception.
Affiliation(s)
- Pierre Mégevand: Department of Neurosurgery, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York 11549; Feinstein Institutes for Medical Research, Manhasset, New York 11030; Department of Basic Neurosciences, Faculty of Medicine, University of Geneva, 1211 Geneva, Switzerland
- Manuel R Mercier: Department of Neurology, Montefiore Medical Center, Bronx, New York 10467; Department of Neuroscience, Albert Einstein College of Medicine, Bronx, New York 10461; Institut de Neurosciences des Systèmes, Aix Marseille University, INSERM, 13005 Marseille, France
- David M Groppe: Department of Neurosurgery, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York 11549; Feinstein Institutes for Medical Research, Manhasset, New York 11030; The Krembil Neuroscience Centre, University Health Network, Toronto, Ontario M5T 1M8, Canada
- Elana Zion Golumbic: The Gonda Brain Research Center, Bar Ilan University, Ramat Gan 5290002, Israel
- Nima Mesgarani: Department of Electrical Engineering, Columbia University, New York, New York 10027
- Michael S Beauchamp: Department of Neurosurgery, Baylor College of Medicine, Houston, Texas 77030
- Charles E Schroeder: Nathan S. Kline Institute, Orangeburg, New York 10962; Department of Psychiatry, Columbia University, New York, New York 10032
- Ashesh D Mehta: Department of Neurosurgery, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York 11549; Feinstein Institutes for Medical Research, Manhasset, New York 11030
29
Tagini S, Scarpina F, Scacchi M, Mauro A, Zampini M. Reduced Temporal Sensitivity in Obesity: Evidence From a Simultaneity Judgement Task. Multisens Res 2020; 33:777-791. [PMID: 31978872] [DOI: 10.1163/22134808-20201501]
Abstract
Preliminary evidence showed a reduced temporal sensitivity (i.e., larger temporal binding window) to audiovisual asynchrony in obesity. Our aim was to extend this investigation to visuotactile stimuli, comparing individuals of healthy weight and with obesity in a simultaneity judgment task. We verified that individuals with obesity had a larger temporal binding window than healthy-weight individuals, meaning that they tend to integrate visuotactile stimuli over an extended range of stimulus onset asynchronies. We point out that our finding gives evidence in support of a more pervasive impairment of the temporal discrimination of co-occurrent stimuli, which might affect multisensory integration in obesity. We discuss our results referring to the possible role of atypical oscillatory neural activity and structural anomalies in affecting the perception of simultaneity between multisensory stimuli in obesity. Finally, we highlight the urgency of a deeper understanding of multisensory integration in obesity at least for two reasons. First, multisensory bodily illusions might be used to manipulate body dissatisfaction in obesity. Second, multisensory integration anomalies in obesity might lead to a dissimilar perception of food, encouraging overeating behaviours.
Affiliation(s)
- Sofia Tagini: Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto (TN), Italy
- Federica Scarpina: Istituto Auxologico Italiano, IRCCS, Ospedale S. Giuseppe, Piancavallo (VCO), Italy; 'Rita Levi Montalcini' Department of Neuroscience, University of Turin, Turin, Italy
- Massimo Scacchi: Istituto Auxologico Italiano, IRCCS, Ospedale S. Giuseppe, Piancavallo (VCO), Italy; Department of Clinical Sciences and Community Health, University of Milan, Milan, Italy
- Alessandro Mauro: Istituto Auxologico Italiano, IRCCS, Ospedale S. Giuseppe, Piancavallo (VCO), Italy; 'Rita Levi Montalcini' Department of Neuroscience, University of Turin, Turin, Italy
- Massimiliano Zampini: Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto (TN), Italy; Department of Psychology and Cognitive Science, University of Trento, Rovereto (TN), Italy
30
Zumer JM, White TP, Noppeney U. The neural mechanisms of audiotactile binding depend on asynchrony. Eur J Neurosci 2020; 52:4709-4731. [PMID: 32725895] [DOI: 10.1111/ejn.14928]
Abstract
Asynchrony is a critical cue informing the brain whether sensory signals are caused by a common source and should be integrated or segregated. This psychophysics-electroencephalography (EEG) study investigated the influence of asynchrony on how the brain binds audiotactile (AT) signals to enable faster responses in a redundant target paradigm. Human participants actively responded (psychophysics) or passively attended (EEG) to noise bursts, "taps-to-the-face" and their AT combinations at seven AT asynchronies: 0, ±20, ±70 and ±500 ms. Behaviourally, observers were faster at detecting AT than unisensory stimuli within a temporal integration window: the redundant target effect was maximal for synchronous stimuli and declined within a ≤70 ms AT asynchrony. EEG revealed a cascade of AT interactions that relied on different neural mechanisms depending on AT asynchrony. At small (≤20 ms) asynchronies, AT interactions arose for evoked response potentials (ERPs) at 110 ms and ~400 ms post-stimulus. Selectively at ±70 ms asynchronies, AT interactions were observed for the P200 ERP, theta-band inter-trial coherence (ITC) and power at ~200 ms post-stimulus. In conclusion, AT binding was mediated by distinct neural mechanisms depending on the asynchrony of the AT signals. Early AT interactions in ERPs and theta-band ITC and power were critical for the behavioural response facilitation within a ≤±70 ms temporal integration window.
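Inter-trial coherence (ITC) of the kind reported above is conventionally computed as the magnitude of the across-trials mean of unit-length phase vectors at a given time-frequency point: 1 indicates perfect phase locking to stimulus onset, values near 0 indicate random phases. The sketch below uses a complex Morlet wavelet to extract phase at one frequency; it is illustrative only — the wavelet parameters and function names are assumptions, not the authors' exact analysis.

```python
import numpy as np

def itc_at_freq(trials, fs, freq, n_cycles=5):
    """Inter-trial coherence at one frequency, at the centre of the epoch.

    trials : array (n_trials, n_samples) of time-locked epochs
    fs     : sampling rate in Hz
    freq   : frequency of interest in Hz (e.g. ~5 Hz for theta)

    Each trial is projected onto a complex Morlet wavelet; the ITC is
    the magnitude of the mean unit phasor across trials.
    """
    n = trials.shape[1]
    t = (np.arange(n) - n // 2) / fs
    sigma = n_cycles / (2 * np.pi * freq)          # wavelet temporal width
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    coeffs = trials @ np.conj(wavelet)             # one complex value per trial
    phases = coeffs / np.abs(coeffs)               # unit phasors
    return np.abs(phases.mean())                   # 1 = locked, ~0 = random
```

Sliding this window across time (and repeating over frequencies) yields the familiar ITC time-frequency map in which a theta-band peak at ~200 ms post-stimulus would appear.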
Affiliation(s)
- Johanna M Zumer: School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; School of Life and Health Sciences, Aston University, Birmingham, UK
- Thomas P White: School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney: School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, The Netherlands
31
Bauer AKR, Debener S, Nobre AC. Synchronisation of Neural Oscillations and Cross-modal Influences. Trends Cogn Sci 2020; 24:481-495. [PMID: 32317142] [PMCID: PMC7653674] [DOI: 10.1016/j.tics.2020.03.003]
Abstract
At any given moment, we receive multiple signals from our different senses. Prior research has shown that signals in one sensory modality can influence neural activity and behavioural performance associated with another sensory modality. Recent human and nonhuman primate studies suggest that such cross-modal influences in sensory cortices are mediated by the synchronisation of ongoing neural oscillations. In this review, we consider two mechanisms proposed to facilitate cross-modal influences on sensory processing, namely cross-modal phase resetting and neural entrainment. We consider how top-down processes may further influence cross-modal processing in a flexible manner, and we highlight fruitful directions for further research.
Collapse
Affiliation(s)
- Anna-Katharina R Bauer
- Department of Experimental Psychology, Brain and Cognition Lab, Oxford Centre for Human Brain Activity, Department of Psychiatry, Wellcome Centre for Integrative Neuroimaging, University of Oxford, UK.
| | - Stefan Debener
- Department of Psychology, Neuropsychology Lab, Cluster of Excellence Hearing4All, University of Oldenburg, Germany
| | - Anna C Nobre
- Department of Experimental Psychology, Brain and Cognition Lab, Oxford Centre for Human Brain Activity, Department of Psychiatry, Wellcome Centre for Integrative Neuroimaging, University of Oxford, UK
32
The interplay between multisensory integration and perceptual decision making. Neuroimage 2020; 222:116970. PMID: 32454204; DOI: 10.1016/j.neuroimage.2020.116970.
Abstract
Facing perceptual uncertainty, the brain combines information from different senses to make optimal perceptual decisions and to guide behavior. However, decision making has been investigated mostly in unimodal contexts. Thus, how the brain integrates multisensory information during decision making is still unclear. Two opposing, but not mutually exclusive, scenarios are plausible: either the brain thoroughly combines the signals from different modalities before starting to build a supramodal decision, or unimodal signals are integrated during decision formation. To answer this question, we devised a paradigm mimicking naturalistic situations where human participants were exposed to continuous cacophonous audiovisual inputs containing an unpredictable signal cue in one or two modalities and had to perform a signal detection task or a cue categorization task. First, model-based analyses of behavioral data indicated that multisensory integration takes place alongside perceptual decision making. Next, using supervised machine learning on concurrently recorded EEG, we identified neural signatures of two processing stages: sensory encoding and decision formation. Generalization analyses across experimental conditions and time revealed that multisensory cues were processed faster during both stages. We further established that acceleration of neural dynamics during sensory encoding and decision formation was directly linked to multisensory integration. Our results were consistent across both signal detection and categorization tasks. Taken together, the results revealed a continuous dynamic interplay between multisensory integration and decision making processes (mixed scenario), with integration of multimodal information taking place both during sensory encoding as well as decision formation.
33
Wilsch A, Mercier MR, Obleser J, Schroeder CE, Haegens S. Spatial Attention and Temporal Expectation Exert Differential Effects on Visual and Auditory Discrimination. J Cogn Neurosci 2020; 32:1562-1576. PMID: 32319865; DOI: 10.1162/jocn_a_01567.
Abstract
Anticipation of an impending stimulus shapes the state of the sensory systems, optimizing neural and behavioral responses. Here, we studied the role of brain oscillations in mediating spatial and temporal anticipations. Because spatial attention and temporal expectation are often associated with visual and auditory processing, respectively, we directly contrasted the visual and auditory modalities and asked whether these anticipatory mechanisms are similar in both domains. We recorded the magnetoencephalogram in healthy human participants performing an auditory and visual target discrimination task, in which cross-modal cues provided both temporal and spatial information with regard to upcoming stimulus presentation. Motivated by prior findings, we were specifically interested in delta (1-3 Hz) and alpha (8-13 Hz) band oscillatory state in anticipation of target presentation and their impact on task performance. Our findings support the view that spatial attention has a stronger effect in the visual domain, whereas temporal expectation effects are more prominent in the auditory domain. For the spatial attention manipulation, we found a typical pattern of alpha lateralization in the visual system, which correlated with response speed. Providing a rhythmic temporal cue led to increased postcue synchronization of low-frequency rhythms, although this effect was more broadband in nature, suggesting a general phase reset rather than frequency-specific neural entrainment. In addition, we observed delta-band synchronization with a frontal topography, which correlated with performance, especially in the auditory task. Combined, these findings suggest that spatial and temporal anticipations operate via a top-down modulation of the power and phase of low-frequency oscillations, respectively.
Affiliation(s)
- Manuel R Mercier
- University of Toulouse Paul Sabatier; Aix Marseille University, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- Jonas Obleser
- University of Lübeck; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Charles E Schroeder
- Columbia University College of Physicians and Surgeons; Nathan Kline Institute, Orangeburg, NY
- Saskia Haegens
- Columbia University College of Physicians and Surgeons; Radboud University Nijmegen
34
Keil J. Double Flash Illusions: Current Findings and Future Directions. Front Neurosci 2020; 14:298. PMID: 32317920; PMCID: PMC7146460; DOI: 10.3389/fnins.2020.00298.
Abstract
Twenty years ago, the first report on the sound-induced double flash illusion, a visual illusion induced by sound, was published. In this paradigm, participants are presented with different numbers of auditory and visual stimuli. When the numbers of auditory and visual stimuli are incongruent, the influence of auditory information on visual perception can lead to the perception of the illusion. Thus, combining two auditory stimuli with one visual stimulus can induce the perception of two visual stimuli, the so-called fission illusion. Alternatively, combining one auditory stimulus with two visual stimuli can induce the perception of one visual stimulus, the so-called fusion illusion. Overall, current research shows that the illusion is a reliable indicator of multisensory integration. It has also been replicated using different stimulus combinations, such as visual and tactile stimuli. Importantly, the robustness of the illusion allows its widespread use for assessing multisensory integration across different groups of healthy participants and clinical populations, and in various task settings. This review will give an overview of the experimental evidence supporting the illusion, the current state of research concerning the influence of cognitive processes on the illusion, the neural mechanisms underlying the illusion, and future research directions. Moreover, an exemplary experimental setup will be described with different options to examine perception, alongside code to test and replicate the illusion online or in the laboratory.
Affiliation(s)
- Julian Keil
- Biological Psychology, Christian-Albrechts-Universität zu Kiel, Kiel, Germany
35
Laffere A, Dick F, Tierney A. Effects of auditory selective attention on neural phase: individual differences and short-term training. Neuroimage 2020; 213:116717. PMID: 32165265; DOI: 10.1016/j.neuroimage.2020.116717.
Abstract
How does the brain follow a sound that is mixed with others in a noisy environment? One possible strategy is to allocate attention to task-relevant time intervals. Prior work has linked auditory selective attention to alignment of neural modulations with stimulus temporal structure. However, since this prior research used relatively easy tasks and focused on analysis of main effects of attention across participants, relatively little is known about the neural foundations of individual differences in auditory selective attention. Here we investigated individual differences in auditory selective attention by asking participants to perform a 1-back task on a target auditory stream while ignoring a distractor auditory stream presented 180° out of phase. Neural entrainment to the attended auditory stream was strongly linked to individual differences in task performance. Some variability in performance was accounted for by degree of musical training, suggesting a link between long-term auditory experience and auditory selective attention. To investigate whether short-term improvements in auditory selective attention are possible, we gave participants 2 h of auditory selective attention training and found improvements in both task performance and enhancements of the effects of attention on neural phase angle. Our results suggest that although there exist large individual differences in auditory selective attention and attentional modulation of neural phase angle, this skill improves after a small amount of targeted training.
Affiliation(s)
- Aeron Laffere
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK
- Fred Dick
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK; Division of Psychology & Language Sciences, UCL, Gower Street, London, WC1E 6BT, UK
- Adam Tierney
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK
36
Molholm S, Murphy JW, Bates J, Ridgway EM, Foxe JJ. Multisensory Audiovisual Processing in Children With a Sensory Processing Disorder (I): Behavioral and Electrophysiological Indices Under Speeded Response Conditions. Front Integr Neurosci 2020; 14:4. PMID: 32116583; PMCID: PMC7026671; DOI: 10.3389/fnint.2020.00004.
Abstract
Background: Maladaptive reactivity to sensory inputs is commonly observed in neurodevelopmental disorders (e.g., autism, ADHD). Little is known, however, about the underlying neural mechanisms. For some children, atypical sensory reactivity is the primary complaint, despite absence of another identifiable neurodevelopmental diagnosis. Studying Sensory Processing Disorder (SPD) may well provide a window into the neuropathology of these symptoms. It has been proposed that a deficit in sensory integration underlies the SPD phenotype, but objective quantification of sensory integration is lacking. Here we used neural and behavioral measures of multisensory integration (MSI), which would be affected by impaired sensory integration and for which there are well accepted objective measures, to test whether failure to integrate across the senses is associated with atypical sensory reactivity in SPD. An autism group served to determine if observed differences were unique to SPD. Methods: We tested whether children aged 6–16 years with SPD (N = 14) integrate multisensory inputs differently from age-matched typically developing controls (TD: N = 54), or from children with an autism spectrum disorder (ASD: N = 44). Participants performed a simple reaction-time task to the occurrence of auditory, visual, and audiovisual stimuli presented in random order, while high-density recordings of electrical brain activity were made. Results: Children with SPD showed large reductions in the extent to which they benefited from multisensory inputs compared to TDs. The ASD group showed similarly reduced response speeding to multisensory relative to unisensory inputs. Neural evidence for MSI was seen across all three groups, with the multisensory response differing from the sum of the unisensory responses. Post hoc tests suggested the possibility of enhanced MSI in SPD in timeframes consistent with cortical sensory registration (∼60 ms), followed by reduced MSI during a timeframe consistent with object formation (∼130 ms). The ASD group also showed reduced MSI in the later timeframe. Conclusion: Children with SPD showed reduction in their ability to benefit from redundant audio-visual inputs, similar to children with ASD. Neurophysiological recordings, on the other hand, showed that major indices of MSI were largely intact, although post hoc testing pointed to periods of potential differential processing. While these exploratory electrophysiological observations point to potential sensory-perceptual differences in multisensory processing in SPD, it remains equally plausible at this stage that later attentional processing differences may yet prove responsible for the multisensory behavioral deficits uncovered here.
Affiliation(s)
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Montefiore Medical Center, Albert Einstein College of Medicine, Bronx, NY, United States; Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States; The Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States
- Jeremy W Murphy
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Montefiore Medical Center, Albert Einstein College of Medicine, Bronx, NY, United States
- Juliana Bates
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Montefiore Medical Center, Albert Einstein College of Medicine, Bronx, NY, United States
- Elizabeth M Ridgway
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Montefiore Medical Center, Albert Einstein College of Medicine, Bronx, NY, United States
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Montefiore Medical Center, Albert Einstein College of Medicine, Bronx, NY, United States; Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States; The Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States
37
La Rocca D, Ciuciu P, Engemann DA, van Wassenhove V. Emergence of β and γ networks following multisensory training. Neuroimage 2020; 206:116313. PMID: 31676416; PMCID: PMC7355235; DOI: 10.1016/j.neuroimage.2019.116313.
Abstract
Our perceptual reality relies on inferences about the causal structure of the world given by multiple sensory inputs. In ecological settings, multisensory events that cohere in time and space benefit inferential processes: hearing and seeing a speaker enhances speech comprehension, and the acoustic changes of flapping wings naturally pace the motion of a flock of birds. Here, we asked how a few minutes of (multi)sensory training could shape cortical interactions in a subsequent unisensory perceptual task. For this, we investigated oscillatory activity and functional connectivity as a function of individuals' sensory history during training. Human participants performed a visual motion coherence discrimination task while being recorded with magnetoencephalography. Three groups of participants performed the same task with visual stimuli only, while listening to acoustic textures temporally comodulated with the strength of visual motion coherence, or with auditory noise uncorrelated with visual motion. The functional connectivity patterns before and after training were contrasted to resting-state networks to assess the variability of common task-relevant networks, and the emergence of new functional interactions as a function of sensory history. One major finding is the emergence of a large-scale synchronization in the high γ (gamma: 60-120 Hz) and β (beta: 15-30 Hz) bands for individuals who underwent comodulated multisensory training. The post-training network involved prefrontal, parietal, and visual cortices. Our results suggest that the integration of evidence and decision-making strategies become more efficient following congruent multisensory training through plasticity in network routing and oscillatory regimes.
Affiliation(s)
- Daria La Rocca
- CEA/DRF/Joliot, Université Paris-Saclay, 91191, Gif-sur-Yvette, France; Université Paris-Saclay, Inria, CEA, Palaiseau, 91120, France
- Philippe Ciuciu
- CEA/DRF/Joliot, Université Paris-Saclay, 91191, Gif-sur-Yvette, France; Université Paris-Saclay, Inria, CEA, Palaiseau, 91120, France
- Denis-Alexander Engemann
- CEA/DRF/Joliot, Université Paris-Saclay, 91191, Gif-sur-Yvette, France; Université Paris-Saclay, Inria, CEA, Palaiseau, 91120, France
- Virginie van Wassenhove
- CEA/DRF/Joliot, Université Paris-Saclay, 91191, Gif-sur-Yvette, France; Cognitive Neuroimaging Unit, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191, Gif-sur-Yvette, France
38
Stereotactic electroencephalography in humans reveals multisensory signal in early visual and auditory cortices. Cortex 2020; 126:253-264. PMID: 32092494; DOI: 10.1016/j.cortex.2019.12.032.
Abstract
Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the signal extracted from early visual (calcarine and pericalcarine) and auditory (Heschl's gyrus and planum temporale) regions during a simple audio-visual oddball task. We provide evidence that both cross-modal responses (visual responses in auditory cortex or the reverse) and multisensory processing (alteration of the unimodal responses during bimodal stimulation) can be observed in intracranial event-related potentials (iERPs) and in power modulations of oscillatory activity at different temporal scales within the first 150 msec after stimulus onset. The temporal profiles of the iERPs are compatible with the hypothesis that multisensory integration (MSI) occurs by means of direct pathways linking early visual and auditory regions. Our data indicate, moreover, that MSI mainly relies on modulations of the low-frequency bands (foremost the theta band in the auditory cortex and the alpha band in the visual cortex), suggesting the involvement of feedback pathways between the two sensory regions. Remarkably, we also observed high-gamma power modulations by sounds in the early visual cortex, thus suggesting the presence of neuronal populations involved in auditory processing in the calcarine and pericalcarine region in humans.
39
Sugiyama S, Kinukawa T, Takeuchi N, Nishihara M, Shioiri T, Inui K. Tactile Cross-Modal Acceleration Effects on Auditory Steady-State Response. Front Integr Neurosci 2019; 13:72. PMID: 31920574; PMCID: PMC6927992; DOI: 10.3389/fnint.2019.00072.
Abstract
In the sensory cortex, cross-modal interaction occurs during the early cortical stages of processing; however, its effect on the speed of neuronal activity remains unclear. In this study, we used magnetoencephalography (MEG) to investigate whether tactile stimulation influences auditory steady-state responses (ASSRs). To this end, a 0.5-ms electrical pulse was randomly presented to the dorsum of the left or right hand of 12 healthy volunteers at 700 ms while a train of 25-ms pure tones was applied to the left or right side at 75 dB for 1,200 ms. Peak latencies of the 40-Hz ASSR were measured. Our results indicated that tactile stimulation significantly shortened subsequent ASSR latency. This cross-modal effect was observed from approximately 50 ms to 125 ms after the onset of tactile stimulation. The somatosensory information that appeared to converge on the auditory system may have arisen during the early processing stages, with the reduced ASSR latency indicating that a new sensory event from the cross-modal inputs served to increase the speed of ongoing sensory processing. Collectively, our findings indicate that ASSR latency changes are a sensitive index of accelerated processing.
Affiliation(s)
- Shunsuke Sugiyama
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Tomoaki Kinukawa
- Department of Anesthesiology, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Makoto Nishihara
- Multidisciplinary Pain Center, Aichi Medical University, Nagakute, Japan
- Toshiki Shioiri
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Koji Inui
- Department of Functioning and Disability, Institute for Developmental Research, Kasugai, Japan
40
Sengupta R, Yaruss JS, Loucks TM, Gracco VL, Pelczarski K, Nasir SM. Theta Modulated Neural Phase Coherence Facilitates Speech Fluency in Adults Who Stutter. Front Hum Neurosci 2019; 13:394. PMID: 31798431; PMCID: PMC6878001; DOI: 10.3389/fnhum.2019.00394.
Abstract
Adults who stutter (AWS) display altered patterns of neural phase coherence within the speech motor system preceding disfluencies. These altered patterns may distinguish fluent speech episodes from disfluent ones. Phase coherence is relevant to the study of stuttering because it reflects neural communication within brain networks. In this follow-up study, the oscillatory cortical dynamics preceding fluent speech in AWS and adults who do not stutter (AWNS) were examined during a single-word delayed reading task using electroencephalographic (EEG) techniques. Compared to AWNS, fluent speech preparation in AWS was characterized by a decrease in theta-gamma phase coherence and a corresponding increase in theta-beta coherence level. Higher spectral powers in the beta and gamma bands were also observed preceding fluent utterances by AWS. Overall, there was altered neural communication during speech planning in AWS that provides novel evidence for atypical allocation of feedforward control by AWS even before fluent utterances.
Affiliation(s)
- Ranit Sengupta
- Department of Communication Sciences and Disorders, Northwestern University, Evanston, IL, United States
- J Scott Yaruss
- Department of Communicative Sciences and Disorders, Michigan State University, East Lansing, MI, United States
- Torrey M Loucks
- Department of Communication Sciences and Disorders, Faculty of Rehabilitation Medicine, University of Alberta, Edmonton, AB, Canada; Institute for Stuttering Treatment and Research, Faculty of Rehabilitation Medicine, University of Alberta, Edmonton, AB, Canada
- Kristin Pelczarski
- School of Family Studies and Human Services, Kansas State University, Manhattan, KS, United States
- Sazzad M Nasir
- Haskins Laboratories, New Haven, CT, United States; Indiana Academy, Ball State University, Muncie, IN, United States
41
Wagner J, Makeig S, Hoopes D, Gola M. Can Oscillatory Alpha-Gamma Phase-Amplitude Coupling be Used to Understand and Enhance TMS Effects? Front Hum Neurosci 2019; 13:263. PMID: 31427937; PMCID: PMC6689956; DOI: 10.3389/fnhum.2019.00263.
Abstract
Recent applications of simultaneous scalp electroencephalography (EEG) and transcranial magnetic stimulation (TMS) suggest that adapting stimulation to underlying brain states may enhance neuroplastic effects of TMS. It is often assumed that longer-lasting effects of TMS on brain function may be mediated by phasic interactions between TMS pulses and endogenous cortical oscillatory dynamics. The mechanisms by which TMS exerts its neuromodulatory effects, however, remain unknown. Here, we discuss evidence concerning the functional effects on synaptic plasticity of oscillatory cross-frequency coupling in cortical networks as a potential framework for understanding the neuromodulatory effects of TMS. We first discuss evidence for interactions between endogenous oscillatory brain dynamics and externally induced electromagnetic field activity. Alpha band (8-12 Hz) activities are of special interest here because of the wide application and therapeutic effectiveness of rhythmic TMS (rTMS) using a stimulus repetition frequency at or near 10 Hz. We discuss the large body of literature on alpha oscillations suggesting that alpha oscillatory cycles produce periodic inhibition or excitation of neuronal processing through phase-amplitude coupling (PAC) of low-frequency oscillations with high-frequency broadband (or gamma) bursting. Such alpha-gamma coupling may reflect excitability of neuronal ensembles underlying neuroplasticity effects of TMS. We propose that TMS delivery with simultaneous EEG recording and near real-time estimation of source-resolved alpha-gamma PAC might be used to select the precise timing of TMS pulse deliveries so as to enhance the neuroplastic effects of TMS therapies.
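The phase-amplitude coupling (PAC) discussed in this abstract is commonly quantified with a mean-vector-length measure: high-frequency amplitude is used to weight the unit vector of the low-frequency phase, so the result is near zero when amplitude is unrelated to phase and grows when amplitude peaks at a preferred phase. A minimal NumPy sketch on synthetic signals (an illustration of the generic measure only, not this paper's analysis pipeline; all names and parameters are made up):

```python
import numpy as np

def pac_mvl(phase_low, amp_high):
    # Mean vector length: average of high-frequency amplitude times the
    # unit vector of the low-frequency phase. Unrelated amplitude and
    # phase average out toward 0; phase-locked amplitude does not.
    phase_low = np.asarray(phase_low)
    amp_high = np.asarray(amp_high)
    return np.abs(np.mean(amp_high * np.exp(1j * phase_low)))

# Synthetic example: "gamma" amplitude modulated by 10 Hz "alpha" phase.
t = np.arange(0.0, 10.0, 0.001)                    # 10 s at 1 kHz
alpha_phase = np.angle(np.exp(1j * 2 * np.pi * 10 * t))
coupled_amp = 1.0 + 0.8 * np.cos(alpha_phase)      # amplitude tied to phase
flat_amp = np.ones_like(t)                         # no coupling
print(pac_mvl(alpha_phase, coupled_amp) > pac_mvl(alpha_phase, flat_amp))  # → True
```

In practice the phase and amplitude time series would come from band-pass filtering and a Hilbert transform of the same recording, and the raw MVL is usually compared against surrogate (time-shifted) data rather than read off directly.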
Affiliation(s)
- Johanna Wagner
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, San Diego, CA, United States
- Scott Makeig
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, San Diego, CA, United States
- David Hoopes
- Department of Radiation Medicine and Applied Sciences, School of Medicine, University of California, San Diego, San Diego, CA, United States
- Mateusz Gola
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, San Diego, CA, United States; Institute of Psychology, Polish Academy of Sciences, Warsaw, Poland
42
Benedetto A, Morrone MC, Tomassini A. The Common Rhythm of Action and Perception. J Cogn Neurosci 2019; 32:187-200. PMID: 31210564; DOI: 10.1162/jocn_a_01436.
Abstract
Research in the last decade has undermined the idea of perception as a continuous process, providing strong empirical support for its rhythmic modulation. More recently, it has been revealed that the ongoing motor processes influence the rhythmic sampling of sensory information. In this review, we will focus on a growing body of evidence suggesting that oscillation-based mechanisms may structure the dynamic interplay between the motor and sensory system and provide a unified temporal frame for their effective coordination. We will describe neurophysiological data, primarily collected in animals, showing phase-locking of neuronal oscillations to the onset of (eye) movements. These data are complemented by novel evidence in humans, which demonstrate the behavioral relevance of these oscillatory modulations and their domain-general nature. Finally, we will discuss the possible implications of these modulations for action-perception coupling mechanisms.
43
Long-range functional coupling predicts performance: Oscillatory EEG networks in multisensory processing. Neuroimage 2019; 196:114-125. PMID: 30959196; DOI: 10.1016/j.neuroimage.2019.04.001.
Abstract
The integration of sensory signals from different modalities requires flexible interaction of remote brain areas. One candidate mechanism to establish communication in the brain is transient synchronization of oscillatory neural signals. Although there is abundant evidence for the involvement of cortical oscillations in brain functions based on the analysis of local power, assessment of the phase dynamics among spatially distributed neuronal populations and their relevance for behavior is still sparse. In the present study, we investigated the interaction between remote brain areas by analyzing high-density electroencephalogram (EEG) data obtained from human participants engaged in a visuotactile pattern matching task. We deployed an approach for purely data-driven clustering of neuronal phase coupling in source space, which allowed imaging of large-scale functional networks in space, time and frequency without defining a priori constraints. Based on the phase coupling results, we further explored how brain areas interacted across frequencies by computing phase-amplitude coupling. Several networks of interacting sources were identified with our approach, synchronizing their activity within and across the theta (∼5 Hz), alpha (∼10 Hz), and beta (∼20 Hz) frequency bands and involving multiple brain areas that have previously been associated with attention and motor control. We demonstrate the functional relevance of these networks by showing that phase delays - in contrast to spectral power - were predictive of task performance. The data-driven analysis approach employed in the current study allowed an unbiased examination of functional brain networks based on EEG source level connectivity data. Showcased for multisensory processing, our results provide evidence that large-scale neuronal coupling is vital to long-range communication in the human brain and relevant for the behavioral outcome in a cognitive task.
44
Attention Periodically Binds Visual Features As Single Events Depending on Neural Oscillations Phase-Locked to Action. J Neurosci 2019; 39:4153-4161. PMID: 30886011; DOI: 10.1523/jneurosci.2494-18.2019.
Abstract
Recent psychophysical studies have demonstrated that periodic attention in the 4-8 Hz range facilitates performance on visual detection. The present study examined the periodicity of feature binding, another major function of attention, in human observers (3 females and 5 males for behavior, with 7 males added for the EEG experiment). In a psychophysical task, observers reported a synchronous pair of brightness (light/dark) and orientation (clockwise/counterclockwise) patterns from two combined brightness-orientation pairs presented in rapid succession. We found that temporal binding performance exhibits periodic oscillations at ∼8 Hz as a function of stimulus onset delay from a self-initiated button press in conditions where brightness-orientation pairs were spatially separated. However, as one would expect from previous studies on pre-attentive binding, significant oscillations were not apparent in conditions where brightness-orientation pairs were spatially superimposed. EEG results, while fully compatible with behavioral oscillations, also revealed a significant dependence of binding performance across trials on prestimulus neural oscillatory phases within the corresponding band. The peak frequency of this dependence was found to be correlated with intertrial phase coherence (ITPC) around the timing of button press in parietal sensors. Moreover, the peak frequency of the ITPC was found to predict behavioral frequency in individual observers. Together, these results suggest that attention operates periodically (at ∼8 Hz) on the perceptual binding of multimodal visual information and is mediated by neural oscillations phase-locked to voluntary action.
Significance Statement
Recent studies in neuroscience suggest that the brain's attention network operates rhythmically at 4-8 Hz. The present behavioral task revealed that attentional binding of visual features is performed periodically at ∼8 Hz, and EEG analysis showed a dependence of binding performance on prestimulus neural oscillatory phase. Furthermore, this association between perceptual and neural oscillations is triggered by voluntary action. Periodic processes driven by attention appear to contribute not only to sensory processing but also to the temporal binding of diverse information into a conscious event synchronized with action.
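Both this study and the headline article quantify phase consistency with inter-trial phase coherence (ITPC): the length of the mean resultant vector of the per-trial phases, ranging from 0 (random phases) to 1 (identical phases across trials). A minimal numerical sketch (the function name and the simulated data below are illustrative, not from either paper):

```python
import numpy as np

def itpc(phases, axis=0):
    """Inter-trial phase coherence: magnitude of the mean of unit
    phasors across trials; 1 = perfectly phase-locked, ~0 = random."""
    phases = np.asarray(phases, dtype=float)
    return np.abs(np.mean(np.exp(1j * phases), axis=axis))

# Identical phase on every trial -> ITPC of 1
locked = np.full(200, 0.3)
# Uniformly random phases -> ITPC near 0 for many trials
rng = np.random.default_rng(0)
jittered = rng.uniform(-np.pi, np.pi, size=200)

print(itpc(locked))    # ~1.0
print(itpc(jittered))  # small value
```

In practice the per-trial phases would come from a time-frequency decomposition (e.g. wavelet or Hilbert transform) at each sensor, time point, and frequency, with ITPC computed across the trial axis.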
45
Abstract
At any given moment, we receive input through our different sensory systems, and this information needs to be processed and integrated. Multisensory processing requires the coordinated activity of distinct cortical areas. Key mechanisms implicated in these processes include local neural oscillations and functional connectivity between distant cortical areas. Evidence is now emerging that neural oscillations in distinct frequency bands reflect different mechanisms of multisensory processing. Moreover, studies suggest that aberrant neural oscillations contribute to multisensory processing deficits in clinical populations, such as schizophrenia. In this article, we review recent literature on the neural mechanisms underlying multisensory processing, focusing on neural oscillations. We derive a framework that summarizes findings on (1) stimulus-driven multisensory processing, (2) the influence of top-down information on multisensory processing, and (3) the role of predictions for the formation of multisensory perception. We propose that different frequency band oscillations subserve complementary mechanisms of multisensory processing. These processes can act in parallel and are essential for multisensory processing.
Affiliation(s)
- Julian Keil
- 1 Biological Psychology, Christian-Albrechts-University Kiel, Kiel, Germany
- 2 Department of Psychiatry and Psychotherapy, St. Hedwig Hospital, Charité-Universitätsmedizin Berlin, Berlin, Germany
- Daniel Senkowski
- 2 Department of Psychiatry and Psychotherapy, St. Hedwig Hospital, Charité-Universitätsmedizin Berlin, Berlin, Germany
46
Effect of acceleration of auditory inputs on the primary somatosensory cortex in humans. Sci Rep 2018; 8:12883. [PMID: 30150686 PMCID: PMC6110726 DOI: 10.1038/s41598-018-31319-3] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2018] [Accepted: 08/17/2018] [Indexed: 11/09/2022] Open
Abstract
Cross-modal interaction occurs during the early stages of processing in the sensory cortex; however, its effect on neuronal activity speed remains unclear. We used magnetoencephalography to investigate whether auditory stimulation influences the initial cortical activity in the primary somatosensory cortex. A 25-ms pure tone was randomly presented to the left or right side of healthy volunteers at 1000 ms when electrical pulses were applied to the left or right median nerve at 20 Hz for 1500 ms because we did not observe any cross-modal effect elicited by a single pulse. The latency of N20 m originating from Brodmann's area 3b was measured for each pulse. The auditory stimulation significantly shortened the N20 m latency at 1050 and 1100 ms. This reduction in N20 m latency was identical for the ipsilateral and contralateral sounds for both latency points. Therefore, somatosensory-auditory interaction, such as input to the area 3b from the thalamus, occurred during the early stages of synaptic transmission. Auditory information that converged on the somatosensory system was considered to have arisen from the early stages of the feedforward pathway. Acceleration of information processing through the cross-modal interaction seemed to be partly due to faster processing in the sensory cortex.
47
Oya H, Gander PE, Petkov CI, Adolphs R, Nourski KV, Kawasaki H, Howard MA, Griffiths TD. Neural phase locking predicts BOLD response in human auditory cortex. Neuroimage 2018; 169:286-301. [PMID: 29274745 PMCID: PMC6139034 DOI: 10.1016/j.neuroimage.2017.12.051] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2017] [Revised: 11/22/2017] [Accepted: 12/16/2017] [Indexed: 11/16/2022] Open
Abstract
Natural environments elicit both phase-locked and non-phase-locked neural responses to the stimulus in the brain. To date, the interpretation of the BOLD signal has been based on its association with the non-phase-locked power of high-frequency local field potentials (LFPs), or with the related spiking activity in single neurons or groups of neurons. Previous studies have not examined the prediction of the BOLD signal by phase-locked responses. We examined the relationship between the BOLD response and LFPs in the same nine human subjects from multiple corresponding points in the auditory cortex, using amplitude modulated pure tone stimuli of a duration to allow an analysis of phase locking of the sustained time period without contamination from the onset response. The results demonstrate that both phase locking at the modulation frequency and its harmonics, and the oscillatory power in gamma/high-gamma bands are required to predict the BOLD response. Biophysical models of BOLD signal generation in auditory cortex therefore require revision and the incorporation of both phase locking to rhythmic sensory stimuli and power changes in the ensemble neural activity.
Affiliation(s)
- Hiroyuki Oya
- Department of Neurosurgery, Human Brain Research Laboratory, University of Iowa, Iowa City, IA 52252, USA
- Phillip E Gander
- Department of Neurosurgery, Human Brain Research Laboratory, University of Iowa, Iowa City, IA 52252, USA
- Ralph Adolphs
- Division of the Humanities and Social Sciences, California Institute of Technology, Pasadena, CA 91125, USA
- Kirill V Nourski
- Department of Neurosurgery, Human Brain Research Laboratory, University of Iowa, Iowa City, IA 52252, USA
- Hiroto Kawasaki
- Department of Neurosurgery, Human Brain Research Laboratory, University of Iowa, Iowa City, IA 52252, USA
- Matthew A Howard
- Department of Neurosurgery, Human Brain Research Laboratory, University of Iowa, Iowa City, IA 52252, USA
- Timothy D Griffiths
- Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, UK
48
Murray MM, Thelen A, Ionta S, Wallace MT. Contributions of Intraindividual and Interindividual Differences to Multisensory Processes. J Cogn Neurosci 2018; 31:360-376. [PMID: 29488852 DOI: 10.1162/jocn_a_01246] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Most evidence on the neural and perceptual correlates of sensory processing derives from studies that have focused on only a single sensory modality and averaged the data from groups of participants. Although valuable, such studies ignore the substantial interindividual and intraindividual differences that are undoubtedly at play. Such variability plays an integral role in both the behavioral/perceptual realms and in the neural correlates of these processes, but substantially less is known when compared with group-averaged data. Recently, it has been shown that the presentation of stimuli from two or more sensory modalities (i.e., multisensory stimulation) not only results in the well-established performance gains but also gives rise to reductions in behavioral and neural response variability. To better understand the relationship between neural and behavioral response variability under multisensory conditions, this study investigated both behavior and brain activity in a task requiring participants to discriminate moving versus static stimuli presented in either a unisensory or multisensory context. EEG data were analyzed with respect to intraindividual and interindividual differences in RTs. The results showed that trial-by-trial variability of RTs was significantly reduced under audiovisual presentation conditions as compared with visual-only presentations across all participants. Intraindividual variability of RTs was linked to changes in correlated activity between clusters within an occipital to frontal network. In addition, interindividual variability of RTs was linked to differential recruitment of medial frontal cortices. The present findings highlight differences in the brain networks that support behavioral benefits during unisensory versus multisensory motion detection and provide an important view into the functional dynamics within neuronal networks underpinning intraindividual performance differences.
Affiliation(s)
- Micah M Murray
- Vaudois University Hospital Center and University of Lausanne; Center for Biomedical Imaging of Lausanne and Geneva; Fondation Asile des Aveugles and University of Lausanne; Vanderbilt University Medical Center
- Silvio Ionta
- Vaudois University Hospital Center and University of Lausanne; Fondation Asile des Aveugles and University of Lausanne; ETH Zürich
- Mark T Wallace
- Vanderbilt University Medical Center; Vanderbilt University
49
Cruzat J, Deco G, Tauste-Campo A, Principe A, Costa A, Kringelbach ML, Rocamora R. The dynamics of human cognition: Increasing global integration coupled with decreasing segregation found using iEEG. Neuroimage 2018; 172:492-505. [PMID: 29425897 DOI: 10.1016/j.neuroimage.2018.01.064] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/18/2017] [Revised: 01/23/2018] [Accepted: 01/25/2018] [Indexed: 11/28/2022] Open
Abstract
Cognitive processing requires the ability to flexibly integrate and process information across large brain networks. How do brain networks dynamically reorganize to allow broad communication between many different brain regions in order to integrate information? We recorded neural activity from 12 epileptic patients using intracranial EEG while they performed three cognitive tasks, and assessed how the functional connectivity between different brain areas changes to facilitate communication across them. At the topological level, this facilitation is characterized by measures of integration and segregation. Across all patients, we found significant increases in integration and decreases in segregation during cognitive processing, especially in the gamma band (50-90 Hz). We also found higher levels of global synchronization and functional connectivity during task execution, again particularly in the gamma band. More importantly, functional connectivity modulations were not caused by changes in the level of the underlying oscillations. Instead, these modulations were caused by a rearrangement of the mutual synchronization between the different nodes, as proposed by the "Communication Through Coherence" theory.
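The topological measures referred to here summarize a functional-connectivity graph. As a rough numerical sketch, two standard graph measures often used for this purpose are global efficiency (integration: how short the paths are across the whole network) and the mean clustering coefficient (segregation: how locally clustered connections are). This is a generic illustration on a binary undirected graph, assuming a thresholded connectivity matrix; the paper's exact definitions may differ:

```python
import numpy as np

def global_efficiency(adj):
    """Integration: average inverse shortest-path length (in hops)
    over all ordered node pairs of a binary undirected graph."""
    n = len(adj)
    dist = np.where(np.asarray(adj) > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):  # Floyd-Warshall shortest paths
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    inv = np.zeros_like(dist)
    mask = np.isfinite(dist) & (dist > 0)
    inv[mask] = 1.0 / dist[mask]
    return inv.sum() / (n * (n - 1))

def mean_clustering(adj):
    """Segregation: average local clustering coefficient,
    C_i = closed triplets at node i / possible triplets at node i."""
    a = (np.asarray(adj) > 0).astype(float)
    k = a.sum(axis=1)
    closed = np.diagonal(a @ a @ a)  # 2 x triangles through each node
    denom = k * (k - 1)              # 2 x possible triangles
    c = np.divide(closed, denom, out=np.zeros_like(closed), where=denom > 0)
    return c.mean()

# Fully synchronized 4-node network: maximal integration and segregation
fc = np.ones((4, 4)) - np.eye(4)
print(global_efficiency(fc), mean_clustering(fc))  # both ~1.0
```

In a pipeline like the one described, `adj` would come from thresholding a band-limited synchronization matrix (e.g. gamma-band phase-locking values between contact pairs), computed separately for rest and task periods.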
Affiliation(s)
- Josephine Cruzat
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Ramon Trias Fargas 25-27, 08005, Barcelona, Spain
- Gustavo Deco
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Ramon Trias Fargas 25-27, 08005, Barcelona, Spain; Institució Catalana de la Recerca i Estudis Avançats (ICREA), Barcelona, Spain; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, 04103, Leipzig, Germany; School of Psychological Sciences, Monash University, Melbourne, Clayton, VIC, 3800, Australia
- Adrià Tauste-Campo
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Ramon Trias Fargas 25-27, 08005, Barcelona, Spain; Epilepsy Unit, Department of Neurology, IMIM Hospital del Mar, Universitat Pompeu Fabra, Passeig Marítim, 25, 08003, Barcelona, Spain
- Alessandro Principe
- Epilepsy Unit, Department of Neurology, IMIM Hospital del Mar, Universitat Pompeu Fabra, Passeig Marítim, 25, 08003, Barcelona, Spain
- Albert Costa
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Ramon Trias Fargas 25-27, 08005, Barcelona, Spain; Institució Catalana de la Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Morten L Kringelbach
- Department of Psychiatry, University of Oxford, OX3 7JX, Oxford, UK; Center for Music in the Brain (MIB), Department of Clinical Medicine, Aarhus University, Nørrebrogade 44, Building 10G, 8000, Aarhus, Denmark; Institut d'études avancées de Paris, France
- Rodrigo Rocamora
- Epilepsy Unit, Department of Neurology, IMIM Hospital del Mar, Universitat Pompeu Fabra, Passeig Marítim, 25, 08003, Barcelona, Spain
50
Atilgan H, Town SM, Wood KC, Jones GP, Maddox RK, Lee AKC, Bizley JK. Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding. Neuron 2018; 97:640-655.e4. [PMID: 29395914 PMCID: PMC5814679 DOI: 10.1016/j.neuron.2017.12.034] [Citation(s) in RCA: 90] [Impact Index Per Article: 12.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2017] [Revised: 10/28/2017] [Accepted: 12/22/2017] [Indexed: 12/29/2022]
Abstract
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.
Highlights
- Visual stimuli can shape how auditory cortical neurons respond to sound mixtures
- Temporal coherence between senses enhances sound features of a bound multisensory object
- Visual stimuli elicit changes in the phase of the local field potential in auditory cortex
- Vision-induced phase effects are lost when visual cortex is reversibly silenced
Affiliation(s)
- Huriye Atilgan
- The Ear Institute, University College London, London, UK
- Stephen M Town
- The Ear Institute, University College London, London, UK
- Gareth P Jones
- The Ear Institute, University College London, London, UK
- Ross K Maddox
- Department of Biomedical Engineering and Department of Neuroscience, Del Monte Institute for Neuroscience, University of Rochester, Rochester, NY, USA; Institute for Learning and Brain Sciences and Department of Speech and Hearing Sciences, University of Washington, Seattle, WA, USA
- Adrian K C Lee
- Institute for Learning and Brain Sciences and Department of Speech and Hearing Sciences, University of Washington, Seattle, WA, USA