1
Vogler NW, Chen R, Virkler A, Tu VY, Gottfried JA, Geffen MN. Direct piriform-to-auditory cortical projections shape auditory-olfactory integration. bioRxiv 2024:2024.07.11.602976. PMID: 39071445; PMCID: PMC11275881; DOI: 10.1101/2024.07.11.602976.
Abstract
In a real-world environment, the brain must integrate information from multiple sensory modalities, including the auditory and olfactory systems. However, little is known about the neuronal circuits governing how odors influence and modulate sound processing. Here, we investigated the mechanisms underlying auditory-olfactory integration using anatomical, electrophysiological, and optogenetic approaches, focusing on the auditory cortex as a key locus for cross-modal integration. First, retrograde and anterograde viral tracing strategies revealed a direct projection from the piriform cortex to the auditory cortex. Next, using in vivo electrophysiological recordings of neuronal activity in the auditory cortex of awake male or female mice, we found that odors modulate auditory cortical responses to sound. Finally, we used in vivo optogenetic manipulations during electrophysiology to demonstrate that olfactory modulation in auditory cortex, specifically, odor-driven enhancement of sound responses, depends on direct input from the piriform cortex. Together, our results identify a novel role of piriform-to-auditory cortical circuitry in shaping olfactory modulation in the auditory cortex, shedding new light on the neuronal mechanisms underlying auditory-olfactory integration.
Affiliation(s)
- Nathan W. Vogler
- Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Ruoyi Chen
- Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Alister Virkler
- Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Violet Y. Tu
- Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Jay A. Gottfried
- Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Maria N. Geffen
- Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
2
Schormans AL, Allman BL. Layer-specific enhancement of visual-evoked activity in the audiovisual cortex following a mild degree of hearing loss in adult rats. Hear Res 2024; 450:109071. PMID: 38941694; DOI: 10.1016/j.heares.2024.109071.
Abstract
Following adult-onset hearing impairment, crossmodal plasticity can occur within various sensory cortices, often characterized by increased neural responses to visual stimulation not only in the auditory cortex, but also in the visual and audiovisual cortices. In the present study, we used an established model of loud noise exposure in rats to examine, for the first time, whether the crossmodal plasticity in the audiovisual cortex that occurs following a relatively mild degree of hearing loss emerges solely from altered intracortical processing or whether thalamocortical changes also contribute to the crossmodal effects. Using a combination of an established pharmacological 'cortical silencing' protocol and current source density analysis of the laminar activity recorded across the layers of the audiovisual cortex (i.e., the lateral extrastriate visual cortex, V2L), we observed layer-specific post-silencing changes in the strength of the residual visual, but not auditory, input in noise-exposed rats with mild hearing loss compared to rats with normal hearing. Furthermore, a comparison of the laminar profiles pre- versus post-silencing in both groups indicated that noise exposure caused a re-allocation of the strength of visual inputs across the layers of the V2L cortex, including enhanced visual-evoked activity in the granular layer; these findings are consistent with thalamocortical plasticity. Finally, we confirmed that audiovisual integration within the V2L cortex depends on intact processing within intracortical circuits, and that this form of multisensory processing is vulnerable to disruption by noise-induced hearing loss. Ultimately, the present study furthers our understanding of the contribution of intracortical and thalamocortical processing to crossmodal plasticity, as well as to audiovisual integration, under both normal and mildly impaired hearing conditions.
Affiliation(s)
- Ashley L Schormans
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, 1151 Richmond St., London, ON N6A 5C1, Canada.
- Brian L Allman
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, 1151 Richmond St., London, ON N6A 5C1, Canada
3
Zhang YJ, Lee JY, Igarashi KM. Circuit dynamics of the olfactory pathway during olfactory learning. Front Neural Circuits 2024; 18:1437575. PMID: 39036422; PMCID: PMC11258029; DOI: 10.3389/fncir.2024.1437575.
Abstract
The olfactory system plays a crucial role in how animals perceive and interact with their surroundings. Previous studies have deciphered basic odor perception, but how information processing in the olfactory system is associated with learning and memory remains poorly understood. In this review, we summarize recent studies on the anatomy and functional dynamics of the mouse olfactory learning pathway, focusing on how neuronal circuits in the olfactory bulb (OB) and olfactory cortical areas integrate odor information during learning. We also highlight in vivo evidence for the role of the lateral entorhinal cortex (LEC) in olfactory learning. Altogether, these studies demonstrate that brain regions throughout the olfactory system are critically involved in forming and representing learned knowledge. The role of olfactory areas in learning and memory, and their susceptibility to dysfunction in neurodegenerative diseases, necessitate further research.
Affiliation(s)
- Yutian J. Zhang
- Department of Anatomy and Neurobiology, School of Medicine, University of California, Irvine, Irvine, United States
- Jason Y. Lee
- Department of Anatomy and Neurobiology, School of Medicine, University of California, Irvine, Irvine, United States
- Kei M. Igarashi
- Department of Anatomy and Neurobiology, School of Medicine, University of California, Irvine, Irvine, United States
- Department of Biomedical Engineering, Samueli School of Engineering, University of California, Irvine, Irvine, United States
- Center for Neural Circuit Mapping, School of Medicine, University of California, Irvine, Irvine, United States
- Center for the Neurobiology of Learning and Memory, University of California, Irvine, Irvine, United States
- Institute for Memory Impairments and Neurological Disorders, University of California, Irvine, Irvine, United States
4
Huang Y, Brosch M. Behavior-related visual activations in the auditory cortex of nonhuman primates. Prog Neurobiol 2024; 240:102637. PMID: 38879074; DOI: 10.1016/j.pneurobio.2024.102637.
Abstract
While it is well established that sensory cortical regions traditionally thought to be unimodal can be activated by stimuli from modalities other than the dominant one, functions of such foreign-modal activations are still not clear. Here we show that visual activations in early auditory cortex can be related to whether or not the monkeys engaged in audio-visual tasks, to the time when the monkeys reacted to the visual component of such tasks, and to the correctness of the monkeys' response to the auditory component of such tasks. These relationships between visual activations and behavior suggest that auditory cortex can be recruited for visually-guided behavior and that visual activations can prime auditory cortex such that it is prepared for processing future sounds. Our study thus provides evidence that foreign-modal activations in sensory cortex can contribute to a subject's ability to perform tasks on stimuli from foreign and dominant modalities.
Affiliation(s)
- Ying Huang
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Brenneckestraße 6, Magdeburg 39118, Germany.
- Michael Brosch
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Brenneckestraße 6, Magdeburg 39118, Germany
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Universitätsplatz 2, Magdeburg 39106, Germany
5
Ma S, Zhou Y, Wan T, Ren Q, Yan J, Fan L, Yuan H, Chan M, Chai Y. Bioinspired In-Sensor Multimodal Fusion for Enhanced Spatial and Spatiotemporal Association. Nano Lett 2024; 24:7091-7099. PMID: 38804877; DOI: 10.1021/acs.nanolett.4c01727.
Abstract
Multimodal perception can capture more precise and comprehensive information than unimodal approaches. However, current sensory systems typically merge multimodal signals at computing terminals after parallel processing and transmission, which risks losing spatial association information and requires time stamps to maintain temporal coherence for time-series data. Here we demonstrate bioinspired in-sensor multimodal fusion, which effectively enhances comprehensive perception and reduces data transfer between sensory terminals and computation units. By adopting floating-gate phototransistors with reconfigurable photoresponse plasticity, we realize agile spatial and spatiotemporal fusion under nonvolatile and volatile photoresponse modes. To realize optimal spatial estimation, we integrate spatial information from visual-tactile signals. For dynamic events, we capture and fuse spatiotemporal information from visual-audio signals in real time, realizing a dance-music synchronization recognition task without a time-stamping process. This in-sensor multimodal fusion approach offers the potential to simplify multimodal integration systems, extending the in-sensor computing paradigm.
Affiliation(s)
- Sijie Ma
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Yue Zhou
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Tianqing Wan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Qinqi Ren
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Jianmin Yan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Lingwei Fan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Huanmei Yuan
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong 999077, People's Republic of China
- Mansun Chan
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong 999077, People's Republic of China
- Yang Chai
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
6
Oude Lohuis MN, Marchesi P, Olcese U, Pennartz CMA. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 2024; 27:758-771. PMID: 38307971; DOI: 10.1038/s41593-023-01564-5.
Abstract
Primary sensory cortices respond to crossmodal stimuli; for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80-100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.
Affiliation(s)
- Matthijs N Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Pietro Marchesi
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Cyriel M A Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
7
Sun W, Wu H, Peng Y, Zheng X, Li J, Zeng D, Tang P, Zhao M, Feng H, Li H, Liang Y, Su J, Chen X, Hökfelt T, He J. Heterosynaptic plasticity of the visuo-auditory projection requires cholecystokinin released from entorhinal cortex afferents. eLife 2024; 13:e83356. PMID: 38436304; PMCID: PMC10954309; DOI: 10.7554/elife.83356.
Abstract
The entorhinal cortex is involved in establishing enduring visuo-auditory associative memory in the neocortex. Here we explored the mechanisms underlying this synaptic plasticity related to projections from the visual and entorhinal cortices to the auditory cortex in mice using optogenetics of dual pathways. High-frequency laser stimulation (HFS laser) of the visuo-auditory projection did not induce long-term potentiation. However, after pairing with sound stimulus, the visuo-auditory inputs were potentiated following either infusion of cholecystokinin (CCK) or HFS laser of the entorhino-auditory CCK-expressing projection. Combining retrograde tracing and RNAscope in situ hybridization, we show that Cck expression is higher in entorhinal cortex neurons projecting to the auditory cortex than in those originating from the visual cortex. In the presence of CCK, potentiation in the neocortex occurred when the presynaptic input arrived 200 ms before postsynaptic firing, even after just five trials of pairing. Behaviorally, inactivation of the CCK+ projection from the entorhinal cortex to the auditory cortex blocked the formation of visuo-auditory associative memory. Our results indicate that neocortical visuo-auditory association is formed through heterosynaptic plasticity, which depends on release of CCK in the neocortex mostly from entorhinal afferents.
Affiliation(s)
- Wenjian Sun
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Haohao Wu
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Yujie Peng
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Xuejiao Zheng
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Jing Li
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Dingxuan Zeng
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Peng Tang
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Ming Zhao
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Hemin Feng
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Hao Li
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Ye Liang
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Centre for Regenerative Medicine and Health, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences, Hong Kong, China
- Junfeng Su
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- Xi Chen
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- City University of Hong Kong Shenzhen Research Institute, Shenzhen, China
- Tomas Hökfelt
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Institute of Advanced Study, City University of Hong Kong, Hong Kong, China
- Jufang He
- Department of Neuroscience, City University of Hong Kong, Hong Kong, China
- City University of Hong Kong Shenzhen Research Institute, Shenzhen, China
8
Monaco S, Menghi N, Crawford JD. Action-specific feature processing in the human cortex: An fMRI study. Neuropsychologia 2024; 194:108773. PMID: 38142960; DOI: 10.1016/j.neuropsychologia.2023.108773.
Abstract
Sensorimotor integration involves feedforward and reentrant processing of sensory input. Grasp-related motor activity precedes and is thought to influence visual object processing. Yet, while the importance of reentrant feedback is well established in perception, the top-down modulations for action and the neural circuits involved in this process have received less attention. Do action-specific intentions influence the processing of visual information in the human cortex? Using a cue-separation fMRI paradigm, we found that action-specific instruction processing (manual alignment vs. grasp) became apparent only after the visual presentation of oriented stimuli, and occurred as early as in the primary visual cortex and extended to the dorsal visual stream, motor and premotor areas. Further, dorsal stream area aIPS, known to be involved in object manipulation, and the primary visual cortex showed task-related functional connectivity with frontal, parietal and temporal areas, consistent with the idea that reentrant feedback from dorsal and ventral visual stream areas modifies visual inputs to prepare for action. Importantly, both the task-dependent modulations and connections were linked specifically to the object presentation phase of the task, suggesting a role in processing the action goal. Our results show that intended manual actions have an early, pervasive, and differential influence on the cortical processing of vision.
Affiliation(s)
- Simona Monaco
- CIMeC - Center for Mind/Brain Sciences, University of Trento, Rovereto (TN), Italy.
- Nicholas Menghi
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- J Douglas Crawford
- Center for Vision Research, York University, Toronto, Ontario M3J 1P3, Canada
- Vision: Science to Applications (VISTA) Program, Neuroscience Graduate Diploma Program and Departments of Psychology, Biology, and Kinesiology and Health Science, York University, Toronto, Ontario M3J 1P3, Canada
9
Ahveninen J, Lee HJ, Yu HY, Lee CC, Chou CC, Ahlfors SP, Kuo WJ, Jääskeläinen IP, Lin FH. Visual Stimuli Modulate Local Field Potentials But Drive No High-Frequency Activity in Human Auditory Cortex. J Neurosci 2024; 44:e0890232023. PMID: 38129133; PMCID: PMC10869150; DOI: 10.1523/jneurosci.0890-23.2023.
Abstract
Neuroimaging studies suggest cross-sensory visual influences in human auditory cortices (ACs). Whether these influences reflect active visual processing in human ACs, which drives neuronal firing and concurrent broadband high-frequency activity (BHFA; >70 Hz), or whether they merely modulate sound processing is still debatable. Here, we presented auditory, visual, and audiovisual stimuli to 16 participants (7 women, 9 men) with stereo-EEG depth electrodes implanted near ACs for presurgical monitoring. Anatomically normalized group analyses were facilitated by inverse modeling of intracranial source currents. Analyses of intracranial event-related potentials (iERPs) suggested cross-sensory responses to visual stimuli in ACs, which lagged the earliest auditory responses by several tens of milliseconds. Visual stimuli also modulated the phase of intrinsic low-frequency oscillations and triggered 15-30 Hz event-related desynchronization in ACs. However, BHFA, a putative correlate of neuronal firing, was not significantly increased in ACs after visual stimuli, not even when they coincided with auditory stimuli. Intracranial recordings demonstrate cross-sensory modulations, but no indication of active visual processing in human ACs.
Affiliation(s)
- Jyrki Ahveninen
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Hsin-Ju Lee
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Hsiang-Yu Yu
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Cheng-Chia Lee
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Department of Neurosurgery, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- Chien-Chen Chou
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Seppo P Ahlfors
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Wen-Jui Kuo
- Institute of Neuroscience, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Iiro P Jääskeläinen
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
- International Laboratory of Social Neurobiology, Institute of Cognitive Neuroscience, Higher School of Economics, Moscow 101000, Russia
- Fa-Hsuan Lin
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
10
Wang Y, Zhao R, Zhu D, Fu X, Sun F, Cai Y, Ma J, Guo X, Zhang J, Xue Y. Voxel- and tensor-based morphometry with machine learning techniques identifying characteristic brain impairment in patients with cervical spondylotic myelopathy. Front Neurol 2024; 15:1267349. PMID: 38419699; PMCID: PMC10899699; DOI: 10.3389/fneur.2024.1267349.
Abstract
Aim: The diagnosis of cervical spondylotic myelopathy (CSM) relies on several methods, including x-rays, computed tomography, and magnetic resonance imaging (MRI). Although MRI is the most useful diagnostic tool, strategies to improve the precise and independent diagnosis of CSM using novel MRI techniques are urgently needed. This study aimed to explore potential brain biomarkers to improve the precise diagnosis of CSM by combining voxel-based morphometry (VBM) and tensor-based morphometry (TBM) with machine learning techniques.
Methods: In this retrospective study, 57 patients with CSM and 57 healthy controls (HCs) were enrolled. Structural changes in gray matter volume and white matter volume were determined by VBM. Gray and white matter deformations were measured by TBM. A support vector machine (SVM) was used to classify CSM patients and HCs based on the structural features derived from VBM and TBM.
Results: CSM patients exhibited characteristic structural abnormalities in sensorimotor, visual, cognitive, and subcortical regions, as well as in the anterior corona radiata and the corpus callosum [P < 0.05, false discovery rate (FDR) corrected]. A multivariate pattern classification analysis revealed that VBM and TBM features could successfully distinguish CSM patients from HCs [classification accuracy: 81.58%, area under the curve (AUC): 0.85; P < 0.005, Bonferroni corrected] through characteristic gray matter and white matter impairments.
Conclusion: CSM may cause widespread and remote impairments in brain structures. This study provides a valuable reference for developing novel diagnostic strategies to identify CSM.
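As an illustration only, the morphometry-features-to-SVM classification pipeline summarized in this abstract can be sketched with scikit-learn. The feature matrix below is random synthetic data standing in for voxel-wise VBM/TBM measures, not the study's data, and the linear kernel, fold count, and feature dimension are assumptions made for the sketch.

```python
# Hypothetical sketch of an SVM classifier on structural-morphometry features,
# in the spirit of the VBM/TBM + SVM analysis described above. Features are
# random stand-ins; the real study used morphometry maps from 57 CSM patients
# and 57 healthy controls.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n_per_group, n_features = 57, 200                 # 57 per group, as in the study
X_hc = rng.normal(0.0, 1.0, (n_per_group, n_features))   # healthy controls
X_csm = rng.normal(0.5, 1.0, (n_per_group, n_features))  # shifted "atrophy" pattern
X = np.vstack([X_hc, X_csm])
y = np.array([0] * n_per_group + [1] * n_per_group)      # 0 = HC, 1 = CSM

# Standardize features, then fit a linear SVM with probability estimates,
# evaluated by stratified 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
proba = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]
pred = (proba > 0.5).astype(int)

acc = accuracy_score(y, pred)
auc = roc_auc_score(y, proba)
print(f"accuracy={acc:.2f}, AUC={auc:.2f}")
```

Because the synthetic groups are well separated, the cross-validated accuracy and AUC come out high; on real, noisier morphometry data the numbers would be closer to those reported in the abstract.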
Affiliation(s)
- Yang Wang
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Rui Zhao
- Department of Orthopedics Surgery, Tianjin Medical University General Hospital, Tianjin, China
- Dan Zhu
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Department of Radiology, Tianjin Medical University General Hospital Airport Hospital, Tianjin, China
- Xiuwei Fu
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Fengyu Sun
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Yuezeng Cai
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Juanwei Ma
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Xing Guo
- Department of Orthopedics Surgery, Tianjin Medical University General Hospital, Tianjin, China
- Jing Zhang
- Department of Radiology, Tianjin Medical University General Hospital, Tianjin, China
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Yuan Xue
- Tianjin Key Laboratory of Functional Imaging, Tianjin Medical University General Hospital, Tianjin, China
- Tianjin Key Laboratory of Spine and Spinal Cord, Tianjin Medical University General Hospital, Tianjin, China
11
Lin X, Liu Y, Huang J. Reducing sweetness expectation in milk tea by crossmodal visuo-auditory interaction. Appetite 2024; 192:107107. PMID: 37890531; DOI: 10.1016/j.appet.2023.107107.
Abstract
Crossmodal effects are a frequently explored route to reducing sweetness perception in pursuit of healthier dietary choices. Both music and color can independently influence flavor evaluation and the gustatory experience by eliciting emotions. However, less research has examined how audio-visual crossmodal interactions affect sweetness expectations and perceptions. The present study conducted two experiments examining the crossmodal effect on the sweetness expectation and perception of milk tea by manipulating the emotional valence of music and packaging color. The results showed that positive (vs. negative) music led to higher sweetness expectations and perceptions for milk teas with neutral packaging color. Irrespective of music, participants had higher sweetness expectations for milk tea with positive or neutral (vs. negative) packaging colors. The congruence of valence between music and packaging color influenced sweetness perception: positive (vs. negative) music was associated with a sweeter perception when the packaging color was positive, whereas under negative music, participants reported higher sweetness perception with negative (vs. positive) packaging colors. In conclusion, the results suggest that the valence of music and packaging color crossmodally influence consumers' evaluation of milk tea, and that this influence differs depending on whether the tea is actually tasted. The study thus demonstrates the crossmodal influence of music and packaging color, with implications for healthy eating and marketing applications.
Affiliation(s)
- Xin Lin
  - Department of Psychology, Soochow University, Suzhou, 215123, China
  - Department of Applied Psychology, Fuzhou University, Fuzhou, 350108, China
- Yujia Liu
  - Department of Music, Soochow University, Suzhou, 215000, China
- Jianping Huang
  - Department of Psychology, Soochow University, Suzhou, 215123, China
12. Suzuki M, Pennartz CMA, Aru J. How deep is the brain? The shallow brain hypothesis. Nat Rev Neurosci 2023;24:778-791. [PMID: 37891398] [DOI: 10.1038/s41583-023-00756-z] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0]
Abstract
Deep learning and predictive coding architectures commonly assume that inference in neural networks is hierarchical. However, largely neglected in deep learning and predictive coding architectures is the neurobiological evidence that all hierarchical cortical areas, higher or lower, project to and receive signals directly from subcortical areas. Given these neuroanatomical facts, today's dominance of cortico-centric, hierarchical architectures in deep learning and predictive coding networks is highly questionable; such architectures are likely to be missing essential computational principles the brain uses. In this Perspective, we present the shallow brain hypothesis: hierarchical cortical processing is integrated with a massively parallel process to which subcortical areas substantially contribute. This shallow architecture exploits the computational capacity of cortical microcircuits and thalamo-cortical loops that are not included in typical hierarchical deep learning and predictive coding networks. We argue that the shallow brain architecture provides several critical benefits over deep hierarchical structures and a more complete depiction of how mammalian brains achieve fast and flexible computational capabilities.
Affiliation(s)
- Mototaka Suzuki
  - Department of Cognitive and Systems Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands
- Cyriel M A Pennartz
  - Department of Cognitive and Systems Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands
- Jaan Aru
  - Institute of Computer Science, University of Tartu, Tartu, Estonia
13. Paraouty N, Yao JD, Varnet L, Chou CN, Chung S, Sanes DH. Sensory cortex plasticity supports auditory social learning. Nat Commun 2023;14:5828. [PMID: 37730696] [PMCID: PMC10511464] [DOI: 10.1038/s41467-023-41641-8] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0]
Abstract
Social learning (SL) through experience with conspecifics can facilitate the acquisition of many behaviors. Thus, when Mongolian gerbils are exposed to a demonstrator performing an auditory discrimination task, their subsequent task acquisition is facilitated, even in the absence of visual cues. Here, we show that transient inactivation of auditory cortex (AC) during exposure caused a significant delay in task acquisition during the subsequent practice phase, suggesting that AC activity is necessary for SL. Moreover, social exposure induced an improvement in AC neuron sensitivity to auditory task cues. The magnitude of neural change during exposure correlated with task acquisition during practice. In contrast, exposure to only auditory task cues led to poorer neurometric and behavioral outcomes. Finally, social information during exposure was encoded in the AC of observer animals. Together, our results suggest that auditory SL is supported by AC neuron plasticity occurring during social exposure and prior to behavioral performance.
Affiliation(s)
- Nihaad Paraouty
  - Center for Neural Science, New York University, New York, NY, 10003, USA
- Justin D Yao
  - Department of Otolaryngology, Rutgers University, New Brunswick, NJ, 08901, USA
- Léo Varnet
  - Laboratoire des Systèmes Perceptifs, UMR 8248, Ecole Normale Supérieure, PSL University, Paris, 75005, France
- Chi-Ning Chou
  - Center for Computational Neuroscience, Flatiron Institute, Simons Foundation, New York, NY, USA
  - School of Engineering & Applied Sciences, Harvard University, Cambridge, MA, 02138, USA
- SueYeon Chung
  - Center for Neural Science, New York University, New York, NY, 10003, USA
  - Center for Computational Neuroscience, Flatiron Institute, Simons Foundation, New York, NY, USA
- Dan H Sanes
  - Center for Neural Science, New York University, New York, NY, 10003, USA
  - Department of Psychology, New York University, New York, NY, 10003, USA
  - Department of Biology, New York University, New York, NY, 10003, USA
  - Neuroscience Institute, NYU Langone Medical Center, New York, NY, 10003, USA
14. Haimson B, Mizrahi A. Plasticity in auditory cortex during parenthood. Hear Res 2023;431:108738. [PMID: 36931020] [DOI: 10.1016/j.heares.2023.108738] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Most animals display robust parental behaviors that support the survival and well-being of their offspring. The manifestation of parental behaviors is accompanied by physiological and hormonal changes, which affect both the body and the brain to support better caregiving. Rodents exhibit a behavior called pup retrieval - a stereotyped sequence of perception and action - used to identify and retrieve their newborn pups back to the nest. Pup retrieval has a significant auditory component, which depends on plasticity in the auditory cortex (ACx). We review the evidence of neural changes taking place in the ACx of rodents during the transition to parenthood. We discuss how plastic changes both in and outside the ACx support the encoding of pup vocalizations. Key players in the mechanism of this plasticity are hormones and experience, both of which have a clear dynamic signature during the transition to parenthood. Mothers, co-caring females, and fathers have been used as models to understand parental plasticity at disparate levels of organization. Yet, common principles of cortical plasticity and the biological mechanisms underlying its involvement in parental behavior are only beginning to be unpacked.
Affiliation(s)
- Baruch Haimson
  - The Edmond and Lily Safra Center for Brain Sciences and Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
- Adi Mizrahi
  - The Edmond and Lily Safra Center for Brain Sciences and Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
15. Franken MK, Liu BC, Ostry DJ. Towards a somatosensory theory of speech perception. J Neurophysiol 2022;128:1683-1695. [PMID: 36416451] [PMCID: PMC9762980] [DOI: 10.1152/jn.00381.2022] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Speech perception is known to be a multimodal process, relying not only on auditory input but also on the visual system and possibly on the motor system as well. To date there has been little work on the potential involvement of the somatosensory system in speech perception. In the present review, we identify the somatosensory system as another contributor to speech perception. First, we argue that evidence in favor of a motor contribution to speech perception can just as easily be interpreted as showing somatosensory involvement. Second, physiological and neuroanatomical evidence for auditory-somatosensory interactions across the auditory hierarchy indicates the availability of a neural infrastructure that supports somatosensory involvement in auditory processing in general. Third, there is accumulating evidence for somatosensory involvement in the context of speech specifically. In particular, tactile stimulation modifies speech perception, and speech auditory input elicits activity in somatosensory cortical areas. Moreover, speech sounds can be decoded from activity in somatosensory cortex; lesions to this region affect perception, and vowels can be identified based on somatic input alone. We suggest that the somatosensory involvement in speech perception derives from the somatosensory-auditory pairing that occurs during speech production and learning. By bringing together findings from a set of studies that have not been previously linked, the present article identifies the somatosensory system as a presently unrecognized contributor to speech perception.
Affiliation(s)
- David J Ostry
  - McGill University, Montreal, Quebec, Canada
  - Haskins Laboratories, New Haven, Connecticut
16. Merrikhi Y, Kok MA, Lomber SG, Meredith MA. A comparison of multisensory features of two auditory cortical areas: primary (A1) and higher-order dorsal zone (DZ). Cereb Cortex Commun 2022;4:tgac049. [PMID: 36632047] [PMCID: PMC9825723] [DOI: 10.1093/texcom/tgac049] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
From the myriad of ongoing stimuli, the brain creates a fused percept of the environment. This process, which culminates in perceptual binding, is presumed to occur through the operations of multisensory neurons found throughout the brain. However, because different brain areas receive different inputs and differ in cytoarchitectonics, local multisensory features would also be expected to vary across regions. The present study investigated that hypothesis using multiple single-unit recordings from anesthetized cats in response to controlled, electronically generated separate and combined auditory, visual, and somatosensory stimulation. These results were used to compare the multisensory features of neurons in cat primary auditory cortex (A1) with those identified in the nearby higher-order auditory region, the Dorsal Zone (DZ). Both regions exhibited the same forms of multisensory neurons, albeit in different proportions. Multisensory neurons exhibiting excitatory or inhibitory properties occurred in similar proportions in both areas, and multisensory neurons in both areas expressed similar levels of multisensory integration. Because responses to auditory cues alone were so similar to those that included non-auditory stimuli, it is proposed that this effect represents a mechanism by which multisensory neurons subserve the process of perceptual binding.
Affiliation(s)
- Yaser Merrikhi (corresponding author)
  - Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec H3G 1Y6, Canada
- Melanie A Kok
  - Graduate Program in Neuroscience, University of Western Ontario, London, Ontario N6A 5K8, Canada
- Stephen G Lomber (corresponding author)
  - Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec H3G 1Y6, Canada
- M Alex Meredith
  - Department of Anatomy and Neurobiology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia 23298, USA
17. Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022;16:1010211. [PMID: 36330342] [PMCID: PMC9622781] [DOI: 10.3389/fnins.2022.1010211] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
18. Rabinovich RJ, Kato DD, Bruno RM. Learning enhances encoding of time and temporal surprise in mouse primary sensory cortex. Nat Commun 2022;13:5504. [PMID: 36127340] [PMCID: PMC9489862] [DOI: 10.1038/s41467-022-33141-y] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0]
Abstract
Primary sensory cortex has long been believed to play a straightforward role in the initial processing of sensory information. Yet, the superficial layers of cortex overall are sparsely active, even during sensory stimulation; additionally, cortical activity is influenced by other modalities, task context, reward, and behavioral state. Our study demonstrates that reinforcement learning dramatically alters representations among longitudinally imaged neurons in superficial layers of mouse primary somatosensory cortex. Learning an object detection task recruits previously unresponsive neurons, enlarging the neuronal population sensitive to touch and behavioral choice. Cortical responses decrease upon repeated stimulus presentation outside of the behavioral task. Moreover, training improves population encoding of the passage of time, and unexpected deviations in trial timing elicit even stronger responses than touches do. In conclusion, the superficial layers of sensory cortex exhibit a high degree of learning-dependent plasticity and are strongly modulated by non-sensory but behaviorally-relevant features, such as timing and surprise. Activity in the superficial layers of the sensory cortex is believed to be largely driven by incoming sensory stimuli. Here the authors demonstrate how learning changes neural responses to sensations according to both behavioral relevance and timing, suggesting a high degree of non-sensory modulation.
Affiliation(s)
- Rebecca J Rabinovich
  - Department of Neuroscience, Columbia University, New York, NY, 10027, USA
  - Kavli Institute for Brain Science, Columbia University, New York, NY, 10027, USA
  - Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, 10027, USA
- Daniel D Kato
  - Department of Neuroscience, Columbia University, New York, NY, 10027, USA
  - Kavli Institute for Brain Science, Columbia University, New York, NY, 10027, USA
  - Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, 10027, USA
- Randy M Bruno
  - Department of Neuroscience, Columbia University, New York, NY, 10027, USA
  - Kavli Institute for Brain Science, Columbia University, New York, NY, 10027, USA
  - Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, 10027, USA
  - Department of Physiology, Anatomy & Genetics, University of Oxford, Oxford, UK
19. Anandakumar DB, Liu RC. More than the end: OFF response plasticity as a mnemonic signature of a sound's behavioral salience. Front Comput Neurosci 2022;16:974264. [PMID: 36148326] [PMCID: PMC9485674] [DOI: 10.3389/fncom.2022.974264] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
In studying how neural populations in sensory cortex code dynamically varying stimuli to guide behavior, the role of spiking after stimuli have ended has been underappreciated. This is despite growing evidence that such activity can be tuned, experience- and context-dependent, and necessary for sensory decisions that play out on a slower timescale. Here we review recent studies, focusing on the auditory modality, demonstrating that this so-called OFF activity can have a more complex temporal structure than the purely phasic firing that has often been interpreted as merely marking the end of stimuli. While diverse and still incompletely understood mechanisms are likely involved in generating phasic and tonic OFF firing, a growing number of studies point to this continuing post-stimulus activity serving a short-term, stimulus-specific mnemonic function that is enhanced when the stimuli are particularly salient. We summarize these results with a conceptual model highlighting how more neurons within the auditory cortical population fire for a longer duration after a sound's termination during an active behavior, and can continue to do so even while the animal passively listens to behaviorally salient stimuli. Overall, these studies increasingly suggest that tonic auditory cortical OFF activity holds an echoic memory of specific, salient sounds to guide behavioral decisions.
Affiliation(s)
- Dakshitha B Anandakumar
  - Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA, United States
  - Department of Biology, Emory University, Atlanta, GA, United States
- Robert C Liu
  - Department of Biology, Emory University, Atlanta, GA, United States
  - Center for Translational Social Neuroscience, Emory University, Atlanta, GA, United States
20. Bailey KM, Giordano BL, Kaas AL, Smith FW. Decoding sounds depicting hand-object interactions in primary somatosensory cortex. Cereb Cortex 2022;33:3621-3635. [PMID: 36045002] [DOI: 10.1093/cercor/bhac296] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Neurons, even in the earliest sensory regions of cortex, are subject to a great deal of contextual influence from both within- and across-modality connections. Recent work has shown that primary sensory areas can respond to and, in some cases, discriminate stimuli that are not of their target modality: for example, primary somatosensory cortex (SI) discriminates visual images of graspable objects. In the present work, we investigated whether SI would discriminate sounds depicting hand-object interactions (e.g. bouncing a ball). In a rapid event-related functional magnetic resonance imaging experiment, participants listened attentively to sounds from 3 categories: hand-object interactions, and control categories of pure tones and animal vocalizations, while performing a one-back repetition detection task. Multivoxel pattern analysis revealed significant decoding of hand-object interaction sounds within SI, but not for either control category. Crucially, in the hand-sensitive voxels defined from an independent tactile localizer, decoding accuracies were significantly higher for hand-object interactions compared to pure tones in left SI. Our findings indicate that simply hearing sounds depicting familiar hand-object interactions elicits different patterns of activity in SI, despite the complete absence of tactile stimulation. These results highlight the rich contextual information that can be transmitted across sensory modalities even to primary sensory areas.
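The decoding logic behind abstracts like this one (multivoxel pattern analysis with cross-validation over runs) can be illustrated with a toy sketch. Everything below is an illustrative assumption - synthetic voxel patterns, the noise level, and a simple Haxby-style correlation classifier - not the authors' actual fMRI pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 3 sound categories x 8 scanner runs x 50 voxels.
# Each category has a distinct mean pattern plus per-run noise.
n_cat, n_runs, n_vox = 3, 8, 50
means = rng.standard_normal((n_cat, n_vox))
data = means[:, None, :] + 0.8 * rng.standard_normal((n_cat, n_runs, n_vox))

def loro_correlation_decoding(data):
    """Leave-one-run-out decoding with a correlation classifier: a test
    pattern is assigned to the training centroid it correlates with most."""
    n_cat, n_runs, _ = data.shape
    correct = 0
    for test_run in range(n_runs):
        # Average the training runs to get one centroid per category.
        train = data[:, np.arange(n_runs) != test_run].mean(axis=1)
        for cat in range(n_cat):
            r = [np.corrcoef(data[cat, test_run], train[c])[0, 1]
                 for c in range(n_cat)]
            correct += int(np.argmax(r) == cat)
    return correct / (n_cat * n_runs)

acc = loro_correlation_decoding(data)  # chance level here is 1/3
```

Above-chance accuracy on held-out runs is the criterion behind "significant decoding"; real analyses additionally assess significance, e.g. with permutation tests.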
Affiliation(s)
- Kerri M Bailey
  - School of Psychology, University of East Anglia, Norwich NR4 7TJ, United Kingdom
- Bruno L Giordano
  - Institut des Neurosciences de la Timone, CNRS UMR 7289, Aix-Marseille Université, Marseille, France
- Amanda L Kaas
  - Department of Cognitive Neuroscience, Maastricht University, Maastricht 6229 EV, The Netherlands
- Fraser W Smith
  - School of Psychology, University of East Anglia, Norwich NR4 7TJ, United Kingdom
21. Kashash Y, Smarsh G, Zilkha N, Yovel Y, Kimchi T. Alone, in the dark: The extraordinary neuroethology of the solitary blind mole rat. eLife 2022;11:e78295. [PMID: 35674717] [PMCID: PMC9177142] [DOI: 10.7554/elife.78295] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5]
Abstract
On the social scale, the blind mole rat (BMR; Spalax ehrenbergi) is an extreme. It is exceedingly solitary, territorial, and aggressive. BMRs reside underground, in self-excavated tunnels that they rarely leave. They possess specialized sensory systems for social communication and navigation, which allow them to cope with the harsh environmental conditions underground. This review aims to present the blind mole rat as an ideal, novel neuroethological model for studying aggressive and solitary behaviors. We discuss the BMR's unique behavioral phenotype, particularly in the context of 'anti-social' behaviors, and review the available literature regarding its specialized sensory adaptations to the social and physical habitat. To date, the neurobiology of the blind mole rat remains mostly unknown and holds a promising avenue for scientific discovery. Unraveling the neural basis of the BMR's behavior, in comparison to that of social rodents, can shed important light on the underlying mechanisms of psychiatric disorders in humans, in which similar behaviors are displayed.
Affiliation(s)
- Yael Kashash
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
- Grace Smarsh
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
  - School of Zoology, Faculty of Life Sciences, Tel Aviv University, Tel Aviv, Israel
- Noga Zilkha
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
- Yossi Yovel
  - School of Zoology, Faculty of Life Sciences, Tel Aviv University, Tel Aviv, Israel
- Tali Kimchi
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
22. Oddball-irrelevant visual stimuli cross-modally attenuate auditory mismatch negativity in rats. Neuroreport 2022;33:363-368. [DOI: 10.1097/wnr.0000000000001793] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
23. Li YT, Chen JW, Yan LF, Hu B, Chen TQ, Chen ZH, Sun JT, Shang YX, Lu LJ, Cui GB, Wang W. Dynamic Alterations of Functional Connectivity and Amplitude of Low-Frequency Fluctuations in Patients with Unilateral Sudden Sensorineural Hearing Loss. Neurosci Lett 2022;772:136470. [PMID: 35066092] [DOI: 10.1016/j.neulet.2022.136470] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0]
Abstract
Unilateral sudden sensorineural hearing loss (SSNHL) adversely affects quality of life, leading to increased risk of depression and cognitive decline. Our previous studies have mainly focused on static brain function abnormalities in SSNHL patients; the dynamic features of brain activity in these patients remain unelucidated. To explore dynamic brain functional alterations in SSNHL, age- and sex-matched SSNHL patients (n=38) and healthy controls (HC, n=44) were enrolled. The dynamic functional connectivity (dFC) and dynamic amplitude of low-frequency fluctuation (dALFF) methods were used to compare temporal features and dynamic neural activity between the two groups. In the dFC analyses, the multiple functional connectivities (FCs) were clustered into 2 different states; a greater proportion of FCs in SSNHL patients showed a sparse state compared with HC. In the dALFF analyses, SSNHL individuals exhibited decreased dALFF variability in the bilateral inferior occipital gyrus, middle occipital gyrus, calcarine cortex, right lingual gyrus, and right fusiform gyrus. dALFF variability showed a negative correlation with activated partial thromboplastin time. The dynamic characteristics of SSNHL patients differed from static functional connectivity and static amplitude of low-frequency fluctuation, especially within the visual cortices. These findings suggest that SSNHL patients experience cross-modal plasticity and visual compensation, which may be closely related to the pathophysiology of SSNHL.
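The dFC state analysis summarized above follows a standard recipe: sliding-window correlations between regional time series, then clustering of the windowed connectivity patterns into recurring states. A minimal numpy sketch with synthetic time series; the region count, window length, step, and the bare-bones k-means are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy BOLD time series: 4 regions x 200 time points (stand-ins for real ROIs).
ts = rng.standard_normal((4, 200))

def sliding_window_fc(ts, win=30, step=5):
    """Correlation matrix per sliding window, vectorized to its upper triangle."""
    n, t = ts.shape
    iu = np.triu_indices(n, k=1)
    mats = []
    for start in range(0, t - win + 1, step):
        c = np.corrcoef(ts[:, start:start + win])
        mats.append(c[iu])
    return np.array(mats)  # shape: (n_windows, n_edges)

def kmeans(X, k=2, iters=50, seed=0):
    """Minimal k-means to group windowed FC patterns into recurring states."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels

fc = sliding_window_fc(ts)                       # 35 windows x 6 edges here
states = kmeans(fc, k=2)                         # state label per window
occupancy = np.bincount(states, minlength=2) / len(states)
```

State occupancy (the fraction of windows assigned to each state) is the kind of temporal feature such studies compare between patients and controls.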
Affiliation(s)
- Yu-Ting Li
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Jia-Wei Chen
  - Department of Otolaryngology Head and Neck Surgery, Tangdu Hospital, Fourth Military Medical University, Xi'an 710038, Shaanxi, China
- Lin-Feng Yan
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Bo Hu
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Tian-Qi Chen
  - Institution of Basic Medicine, Fourth Military Medical University, 169 Changle Road, Xi'an 710032, Shaanxi, China
- Zhu-Hong Chen
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Jing-Ting Sun
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
  - Shaanxi University of Chinese Medicine, Middle Section of Century Avenue, Xianyang 712046, Shaanxi, China
- Yu-Xuan Shang
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Lian-Jun Lu
  - Department of Otolaryngology Head and Neck Surgery, Tangdu Hospital, Fourth Military Medical University, Xi'an 710038, Shaanxi, China
- Guang-Bin Cui
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
- Wen Wang
  - Department of Radiology, Functional and Molecular Imaging Key Lab of Shaanxi Province, Tangdu Hospital, Fourth Military Medical University, 569 Xinsi Road, Xi'an 710038, Shaanxi, China
24. Shiramatsu TI, Mori K, Ishizu K, Takahashi H. Auditory, Visual, and Cross-Modal Mismatch Negativities in the Rat Auditory and Visual Cortices. Front Hum Neurosci 2021;15:721476. [PMID: 34602996] [PMCID: PMC8484534] [DOI: 10.3389/fnhum.2021.721476] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3]
Abstract
When the brain tries to acquire an elaborate model of the world, multisensory integration should contribute to building predictions based on the various pieces of information, and deviance detection should repeatedly update these predictions by detecting “errors” from the actual sensory inputs. Accumulating evidence such as a hierarchical organization of the deviance-detection system indicates that the deviance-detection system can be interpreted in the predictive coding framework. Herein, we targeted mismatch negativity (MMN) as a type of prediction-error signal and investigated the relationship between multisensory integration and MMN. In particular, we studied whether and how cross-modal information processing affected MMN in rodents. We designed a new surface microelectrode array and simultaneously recorded visual and auditory evoked potentials from the visual and auditory cortices of rats under anesthesia. Then, we mapped MMNs for five types of deviant stimuli: single-modal deviants in (i) the visual oddball and (ii) auditory oddball paradigms, eliciting single-modal MMN; (iii) congruent audio-visual deviants, (iv) incongruent visual deviants, and (v) incongruent auditory deviants in the audio-visual oddball paradigm, eliciting cross-modal MMN. First, we demonstrated that visual MMN exhibited deviance detection properties and that the first-generation focus of visual MMN was localized in the visual cortex, as previously reported in human studies. Second, a comparison of MMN amplitudes revealed a non-linear relationship between single-modal and cross-modal MMNs. Moreover, congruent audio-visual MMN exhibited characteristics of both visual and auditory MMNs—its latency was similar to that of auditory MMN, whereas local blockage of N-methyl-D-aspartic acid receptors in the visual cortex diminished it as well as visual MMN. 
These results indicate that cross-modal information processing affects MMN without involving strong top-down effects, such as those of prior knowledge and attention. The present study is the first electrophysiological evidence of cross-modal MMN in animal models, and future studies on the neural mechanisms combining multisensory integration and deviance detection are expected to provide electrophysiological evidence to confirm the links between MMN and predictive coding theory.
Affiliation(s)
- Kanato Mori, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
- Kotaro Ishizu, Institute for Quantitative Biosciences, The University of Tokyo, Tokyo, Japan
- Hirokazu Takahashi, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
25
Lohse M, Dahmen JC, Bajo VM, King AJ. Subcortical circuits mediate communication between primary sensory cortical areas in mice. Nat Commun 2021; 12:3916. [PMID: 34168153; PMCID: PMC8225818; DOI: 10.1038/s41467-021-24200-x]
Abstract
Integration of information across the senses is critical for perception and is a common property of neurons in the cerebral cortex, where it is thought to arise primarily from corticocortical connections. Much less is known about the role of subcortical circuits in shaping the multisensory properties of cortical neurons. We show that stimulation of the whiskers causes widespread suppression of sound-evoked activity in mouse primary auditory cortex (A1). This suppression depends on the primary somatosensory cortex (S1), and is implemented through a descending circuit that links S1, via the auditory midbrain, with thalamic neurons that project to A1. Furthermore, a direct pathway from S1 has a facilitatory effect on auditory responses in higher-order thalamic nuclei that project to other brain areas. Crossmodal corticofugal projections to the auditory midbrain and thalamus therefore play a pivotal role in integrating multisensory signals and in enabling communication between different sensory cortical areas.
Affiliation(s)
- Michael Lohse, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK; Sainsbury Wellcome Centre, London, UK
- Johannes C Dahmen, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Victoria M Bajo, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Andrew J King, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
26
Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021; 22:365-386. [PMID: 34014416; PMCID: PMC8329114; DOI: 10.1007/s10162-021-00789-0]
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions derived from this combination of information and that shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the state of our understanding at this point in time regarding this topic. Following a general introduction, the review is divided into 5 sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence in audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built based on available psychophysical data and that seek to provide greater mechanistic insights into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches toward understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. 
It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception-scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
27
Development of Auditory Cortex Circuits. J Assoc Res Otolaryngol 2021; 22:237-259. [PMID: 33909161; DOI: 10.1007/s10162-021-00794-3]
Abstract
The ability to process and perceive sensory stimuli is an essential function for animals. Among the sensory modalities, audition is crucial for communication, pleasure, care for the young, and perceiving threats. The auditory cortex (ACtx) is a key sound processing region that combines ascending signals from the auditory periphery and inputs from other sensory and non-sensory regions. The development of ACtx is a protracted process starting prenatally and requires the complex interplay of molecular programs, spontaneous activity, and sensory experience. Here, we review the development of thalamic and cortical auditory circuits during pre- and early post-natal periods.
28

29
Yao JD, Gimoto J, Constantinople CM, Sanes DH. Parietal Cortex Is Required for the Integration of Acoustic Evidence. Curr Biol 2020; 30:3293-3303.e4. [PMID: 32619478; DOI: 10.1016/j.cub.2020.06.017]
Abstract
Sensory-driven decisions are formed by accumulating information over time. Although parietal cortex activity is thought to represent accumulated evidence for sensory-based decisions, recent perturbation studies in rodents and non-human primates have challenged the hypothesis that these representations actually influence behavior. Here, we asked whether the parietal cortex integrates acoustic features from auditory cortical inputs during a perceptual decision-making task. If so, we predicted that selective inactivation of this projection should impair subjects' ability to accumulate sensory evidence. We trained gerbils to perform an auditory discrimination task and obtained measures of integration time as a readout of evidence accumulation capability. Minimum integration time was calculated behaviorally as the shortest stimulus duration for which subjects could discriminate the acoustic signals. Direct pharmacological inactivation of parietal cortex increased minimum integration times, suggesting its role in the behavior. To determine the specific impact of sensory evidence, we chemogenetically inactivated the excitatory projections from auditory cortex to parietal cortex and found this was sufficient to increase minimum behavioral integration times. Our signal-detection-theory-based model accurately replicated behavioral outcomes and indicated that the deficits in task performance were plausibly explained by elevated sensory noise. Together, our findings provide causal evidence that parietal cortex plays a role in the network that integrates auditory features for perceptual judgments.
Affiliation(s)
- Justin D Yao, Center for Neural Science, New York University, New York, NY 10003, USA
- Justin Gimoto, Center for Neural Science, New York University, New York, NY 10003, USA
- Christine M Constantinople, Center for Neural Science, New York University, New York, NY 10003, USA; Neuroscience Institute, NYU Langone Medical Center, New York University, New York, NY 10016, USA
- Dan H Sanes, Center for Neural Science, New York University, New York, NY 10003, USA; Department of Psychology, New York University, New York, NY 10003, USA; Department of Biology, New York University, New York, NY 10003, USA; Neuroscience Institute, NYU Langone Medical Center, New York University, New York, NY 10016, USA
30
Qiu J, Singh P, Pan G, de Paolis A, Champagne FA, Liu J, Cardoso L, Rodríguez-Contreras A. Defining the relationship between maternal care behavior and sensory development in Wistar rats: Auditory periphery development, eye opening and brain gene expression. PLoS One 2020; 15:e0237933. [PMID: 32822407; PMCID: PMC7442246; DOI: 10.1371/journal.pone.0237933]
Abstract
Defining the relationship between maternal care, sensory development and brain gene expression in neonates is important to understand the impact of environmental challenges during sensitive periods in early life. In this study, we used a selection approach to test the hypothesis that variation in maternal licking and grooming (LG) during the first week of life influences sensory development in Wistar rat pups. We tracked the onset of the auditory brainstem response (ABR), the timing of eye opening (EO), middle ear development with micro-CT X-ray tomography, and used qRT-PCR to monitor changes in gene expression of the hypoxia-sensitive pathway and neurotrophin signaling in pups reared by low-LG or high-LG dams. The results show the first evidence that the transcription of genes involved in the hypoxia-sensitive pathway and neurotrophin signaling is regulated during separate sensitive periods that occur before and after hearing onset, respectively. Although the timing of ABR onset, EO, and the relative mRNA levels of genes involved in the hypoxia-sensitive pathway did not differ between pups from different LG groups, we found statistically significant increases in the relative mRNA levels of four genes involved in neurotrophin signaling in auditory brain regions from pups of different LG backgrounds. These results suggest that sensitivity to hypoxic challenge might be widespread in the auditory system of neonate rats before hearing onset, and that maternal LG may affect the transcription of genes involved in experience-dependent neuroplasticity.
Affiliation(s)
- Jingyun Qiu, Department of Biology and Center for Discovery and Innovation, City College, City University of New York, New York, New York, United States of America
- Preethi Singh, Department of Biology and Center for Discovery and Innovation, City College, City University of New York, New York, New York, United States of America
- Geng Pan, Department of Biology and Center for Discovery and Innovation, City College, City University of New York, New York, New York, United States of America
- Annalisa de Paolis, Department of Biomedical Engineering, City College, City University of New York, New York, New York, United States of America
- Frances A. Champagne, Department of Psychology, University of Texas at Austin, Austin, Texas, United States of America
- Jia Liu, Neuroscience Initiative, Advanced Science Research Center at the Graduate Center, City University of New York, New York, New York, United States of America
- Luis Cardoso, Department of Biomedical Engineering, City College, City University of New York, New York, New York, United States of America
- Adrián Rodríguez-Contreras, Department of Biology and Center for Discovery and Innovation, City College, City University of New York, New York, New York, United States of America
31
Anomalous intrinsic connectivity within and between visual and auditory networks in major depressive disorder. Prog Neuropsychopharmacol Biol Psychiatry 2020; 100:109889. [PMID: 32067960; DOI: 10.1016/j.pnpbp.2020.109889]
Abstract
OBJECTIVE Major depressive disorder (MDD) is a ubiquitous mental illness with heterogeneous symptoms; however, its pathophysiological mechanisms are still not fully understood. Clinical and preclinical studies suggest that depression can disturb sensory perception systems, and disruptions of auditory and visual function may serve as essential clinical features of MDD. METHODS The current study investigated abnormal intrinsic connectivity within and between visual and auditory networks in 95 MDD patients and 97 age-, gender-, and education-matched healthy controls (HCs) using resting-state functional magnetic resonance imaging (fMRI). One auditory network (AN) and three visual components (VC1, VC2, and VC3) were identified with independent component analysis, by selecting the resting-state fMRI networks with the largest spatial correlations to brain regions in specific network templates. RESULTS We found that MDD could be characterized by the following disrupted network model relative to HCs: (i) reduced within-network connectivity in the AN, VC2, and VC3; and (ii) reduced between-network connectivity between the AN and VC3. Furthermore, aberrant functional connectivity (FC) within the visual network was linked to clinical symptoms. CONCLUSIONS Overall, our results demonstrate that abnormalities of FC in perception systems, including the intrinsic visual and auditory networks, may help explain the neurobiological mechanisms underlying MDD and could serve as a potential biomarker.
32
Selective attention to sound features mediates cross-modal activation of visual cortices. Neuropsychologia 2020; 144:107498. [PMID: 32442445; DOI: 10.1016/j.neuropsychologia.2020.107498]
Abstract
Contemporary schemas of brain organization now include multisensory processes both in low-level cortices as well as at early stages of stimulus processing. Evidence has also accumulated showing that unisensory stimulus processing can result in cross-modal effects. For example, task-irrelevant and lateralised sounds can activate visual cortices; a phenomenon referred to as the auditory-evoked contralateral occipital positivity (ACOP). Some claim this is an example of automatic attentional capture in visual cortices. Other results, however, indicate that context may play a determinant role. Here, we investigated whether selective attention to spatial features of sounds is a determining factor in eliciting the ACOP. We recorded high-density auditory evoked potentials (AEPs) while participants selectively attended and discriminated sounds according to four possible stimulus attributes: location, pitch, speaker identity or syllable. Sound acoustics were held constant, and their location was always equiprobable (50% left, 50% right). The only manipulation was to which sound dimension participants attended. We analysed the AEP data from healthy participants within an electrical neuroimaging framework. The presence of sound-elicited activations of visual cortices depended on the to-be-discriminated, goal-based dimension. The ACOP was elicited only when participants were required to discriminate sound location, but not when they attended to any of the non-spatial features. These results provide a further indication that the ACOP is not automatic. Moreover, our findings showcase the interplay between task-relevance and spatial (un)predictability in determining the presence of the cross-modal activation of visual cortices.
33
Rahimi MD, Fadardi JS, Saeidi M, Bigdeli I, Kashiri R. Effectiveness of cathodal tDCS of the primary motor or sensory cortex in migraine: A randomized controlled trial. Brain Stimul 2020; 13:675-682. [DOI: 10.1016/j.brs.2020.02.012]
34
Beebe NL, Noftz WA, Schofield BR. Perineuronal nets and subtypes of GABAergic cells differentiate auditory and multisensory nuclei in the intercollicular area of the midbrain. J Comp Neurol 2020; 528:2695-2707. [PMID: 32304096; DOI: 10.1002/cne.24926]
Abstract
The intercollicular region, which lies between the inferior and superior colliculi in the midbrain, contains neurons that respond to auditory, visual, and somatosensory stimuli. Golgi studies have been used to parse this region into three distinct nuclei: the intercollicular tegmentum (ICt), the rostral pole of the inferior colliculus (ICrp), and the nucleus of the brachium of the IC (NBIC). Few reports have focused on these nuclei, especially the ICt and the ICrp, possibly due to lack of a marker that distinguishes these areas and is compatible with modern methods. Here, we found that staining for GABAergic cells and perineuronal nets differentiates these intercollicular nuclei in guinea pigs. Further, we found that the proportions of four subtypes of GABAergic cells differentiate intercollicular nuclei from each other and from adjacent inferior collicular subdivisions. Our results support earlier studies that suggest distinct morphology and functions for intercollicular nuclei, and provide staining methods that differentiate intercollicular nuclei and are compatible with most modern techniques. We hope that this will help future studies to further characterize the intercollicular region.
Affiliation(s)
- Nichole L Beebe, Hearing Research Group, Department of Anatomy and Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio, USA
- William A Noftz, Hearing Research Group, Department of Anatomy and Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio, USA; Biomedical Sciences Program, Kent State University, Kent, Ohio, USA
- Brett R Schofield, Hearing Research Group, Department of Anatomy and Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio, USA; Biomedical Sciences Program, Kent State University, Kent, Ohio, USA
35
Zhang M, Kwon SE, Ben-Johny M, O'Connor DH, Issa JB. Spectral hallmark of auditory-tactile interactions in the mouse somatosensory cortex. Commun Biol 2020; 3:64. [PMID: 32047263; PMCID: PMC7012892; DOI: 10.1038/s42003-020-0788-5]
Abstract
To synthesize a coherent representation of the external world, the brain must integrate inputs across different types of stimuli. Yet the mechanistic basis of this computation at the level of neuronal populations remains obscure. Here, we investigate tactile-auditory integration using two-photon Ca2+ imaging in the mouse primary (S1) and secondary (S2) somatosensory cortices. Pairing sound with whisker stimulation modulates tactile responses in both S1 and S2, with the most prominent modulation being robust inhibition in S2. The degree of inhibition depends on tactile stimulation frequency, with lower frequency responses the most severely attenuated. Alongside these neurons, we identify sound-selective neurons in S2 whose responses are inhibited by high tactile frequencies. These results are consistent with a hypothesized local mutually-inhibitory S2 circuit that spectrally selects tactile versus auditory inputs. Our findings enrich mechanistic understanding of multisensory integration and suggest a key role for S2 in combining auditory and tactile information.
Affiliation(s)
- Manning Zhang, Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD, 21205, USA; Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, 63130, USA
- Sung Eun Kwon, Solomon H. Snyder Department of Neuroscience, The Johns Hopkins University School of Medicine, Kavli Neuroscience Discovery Institute, and Brain Science Institute, Baltimore, MD, 21205, USA; Department of Molecular, Cellular and Developmental Biology, University of Michigan, Ann Arbor, MI, 48109, USA
- Manu Ben-Johny, Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD, 21205, USA; Department of Physiology and Cellular Biophysics, Columbia University, New York, NY, 10032, USA
- Daniel H O'Connor, Solomon H. Snyder Department of Neuroscience, The Johns Hopkins University School of Medicine, Kavli Neuroscience Discovery Institute, and Brain Science Institute, Baltimore, MD, 21205, USA
- John B Issa, Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD, 21205, USA; Department of Neurobiology, Northwestern University, Evanston, IL, 60201, USA
36
Nakata S, Takemoto M, Song WJ. Differential cortical and subcortical projection targets of subfields in the core region of mouse auditory cortex. Hear Res 2020; 386:107876. [PMID: 31881516; DOI: 10.1016/j.heares.2019.107876]
Abstract
The core region of the rodent auditory cortex has two areas: the primary auditory area (A1) and the anterior auditory field (AAF). However, the functional difference between these areas is unclear. To elucidate this issue, here we studied the projections from A1 and AAF in mice using adeno-associated virus (AAV) vectors expressing either a green fluorescent protein or a red fluorescent protein. After mapping A1 and AAF using optical imaging, we injected a distinct AAV vector into each of the two fields at a frequency-matched high-frequency location. We found that A1 and AAF projected commonly to virtually all target areas examined, but each field had its own preference for projection targets. Frontal and parietal regions were the major cortical targets: in the frontal cortex, A1 and AAF showed dominant projections to the anterior cingulate cortex Cg1 and the secondary motor cortex (M2), respectively; in the parietal cortex, A1 and AAF exhibited dense projections to the medial secondary visual cortex and the posterior parietal cortex (PPC), respectively. Although M2 and PPC received considerable input from A1 as well, A1 innervated the medial part whereas AAF innervated the lateral part of these cortical regions. A1 also projected to the orbitofrontal cortex, while AAF also projected to the primary somatosensory cortex and insular auditory cortex. As for subcortical projections, A1 and AAF projected to a common ventromedial region in the caudal striatum with a comparable strength; they also both projected to the medial geniculate body and the inferior colliculus, innervating common and distinct divisions of the nuclei. A1 also projected to visual subcortical structures, such as the superior colliculus and the lateral posterior nucleus of the thalamus, where fibres from AAF were sparse. Our results demonstrate the preference of A1 and AAF for cortical and subcortical targets, and for divisions in individual target. 
The preference of A1 and AAF for sensory-related structures suggests a role for A1 in providing auditory information for audio-visual association at both the cortical and subcortical levels, and a distinct role for AAF in providing auditory information for association with somatomotor information in the cortex.
Affiliation(s)
- Shiro Nakata, Department of Sensory and Cognitive Physiology, Graduate School of Medical Sciences, Kumamoto University, Kumamoto, 860-8556, Japan
- Makoto Takemoto, Department of Sensory and Cognitive Physiology, Graduate School of Medical Sciences, Kumamoto University, Kumamoto, 860-8556, Japan
- Wen-Jie Song, Department of Sensory and Cognitive Physiology, Graduate School of Medical Sciences, Kumamoto University, Kumamoto, 860-8556, Japan; Center for Metabolic Regulation of Healthy Aging, Faculty of Life Sciences, Kumamoto University, Kumamoto, 860-8556, Japan
37
Stereotactic electroencephalography in humans reveals multisensory signal in early visual and auditory cortices. Cortex 2020; 126:253-264. [PMID: 32092494; DOI: 10.1016/j.cortex.2019.12.032]
Abstract
Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the signal extracted from early visual (calcarine and pericalcarine) and auditory (Heschl's gyrus and planum temporale) regions during a simple audio-visual oddball task. We provide evidence that both cross-modal responses (visual responses in auditory cortex or the reverse) and multisensory processing (alteration of the unimodal responses during bimodal stimulation) can be observed in intracranial event-related potentials (iERPs) and in power modulations of oscillatory activity at different temporal scales within the first 150 msec after stimulus onset. The temporal profiles of the iERPs are compatible with the hypothesis that MSI occurs by means of direct pathways linking early visual and auditory regions. Our data indicate, moreover, that MSI mainly relies on modulations of the low-frequency bands (foremost the theta band in the auditory cortex and the alpha band in the visual cortex), suggesting the involvement of feedback pathways between the two sensory regions. Remarkably, we also observed high-gamma power modulations by sounds in the early visual cortex, thus suggesting the presence of neuronal populations involved in auditory processing in the calcarine and pericalcarine region in humans.
38
Gau R, Bazin PL, Trampel R, Turner R, Noppeney U. Resolving multisensory and attentional influences across cortical depth in sensory cortices. eLife 2020; 9:46856. [PMID: 31913119; PMCID: PMC6984812; DOI: 10.7554/elife.46856]
Abstract
In our environment, our senses are bombarded with a myriad of signals, only a subset of which is relevant for our goals. Using sub-millimeter-resolution fMRI at 7T, we resolved BOLD-response and activation patterns across cortical depth in early sensory cortices to auditory, visual and audiovisual stimuli under auditory or visual attention. In visual cortices, auditory stimulation induced widespread inhibition irrespective of attention, whereas auditory relative to visual attention suppressed mainly central visual field representations. In auditory cortices, visual stimulation suppressed activations, but amplified responses to concurrent auditory stimuli, in a patchy topography. Critically, multisensory interactions in auditory cortices were stronger in deeper laminae, while attentional influences were greatest at the surface. These distinct depth-dependent profiles suggest that multisensory and attentional mechanisms regulate sensory processing via partly distinct circuitries. Our findings are crucial for understanding how the brain regulates information flow across senses to interact with our complex multisensory world.
Affiliation(s)
- Remi Gau, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, United Kingdom; Institute of Psychology, Université Catholique de Louvain, Louvain-la-Neuve, Belgium; Institute of Neuroscience, Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Pierre-Louis Bazin, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Integrative Model-based Cognitive Neuroscience research unit, University of Amsterdam, Amsterdam, Netherlands
- Robert Trampel, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Robert Turner, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Sir Peter Mansfield Imaging Centre, University of Nottingham, Nottingham, United Kingdom
- Uta Noppeney, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, United Kingdom; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
39
Macharadze T, Budinger E, Brosch M, Scheich H, Ohl FW, Henschke JU. Early Sensory Loss Alters the Dendritic Branching and Spine Density of Supragranular Pyramidal Neurons in Rodent Primary Sensory Cortices. Front Neural Circuits 2019; 13:61. [PMID: 31611778; PMCID: PMC6773815; DOI: 10.3389/fncir.2019.00061]
Abstract
Multisensory integration in primary auditory (A1), visual (V1), and somatosensory cortex (S1) is substantially mediated by their direct interconnections and by thalamic inputs across the sensory modalities. We have previously shown in rodents (Mongolian gerbils) that during postnatal development, the anatomical and functional strengths of these crossmodal, and also sensory-matched, connections are determined by early auditory, somatosensory, and visual experience. Because supragranular layer III pyramidal neurons are major targets of corticocortical and thalamocortical connections, we investigated in this follow-up study how the loss of early sensory experience changes their dendritic morphology. Gerbils were sensory deprived early in development by either bilateral sciatic nerve transection at postnatal day (P) 5, ototoxic inner hair cell damage at P10, or eye enucleation at P10. Sholl and branch order analyses of Golgi-stained layer III pyramidal neurons at P28, which demarcates the end of the sensory critical period in this species, revealed that visual and somatosensory deprivation leads to a general increase of apical and basal dendritic branching in A1, V1, and S1. In contrast, dendritic branching, particularly of apical dendrites, decreased in all three areas following auditory deprivation. Generally, the number of spines, and consequently spine density, along the apical and basal dendrites decreased in both sensory-deprived and non-deprived cortical areas. We therefore conclude that the loss of early sensory experience induces a refinement of crossmodal corticocortical and other cortical and thalamic connections by pruning of dendritic spines at the end of the critical period. Based on the present results, our own previous findings, and findings from the literature, we propose a scenario for multisensory development following early sensory loss.
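The Sholl analysis used in this study boils down to counting how many dendritic branches cross concentric circles of increasing radius centred on the soma. As an illustrative aside (this is not the authors' reconstruction pipeline; the segment coordinates, soma position, and radii are hypothetical), the core count can be sketched as:

```python
import numpy as np

def sholl_intersections(segments, soma, radii):
    """Count dendritic segments crossing each concentric circle.

    segments: iterable of ((x1, y1), (x2, y2)) dendrite pieces
    soma: (x, y) centre of the circles
    radii: increasing radii, in the same units as the coordinates
    """
    soma = np.asarray(soma, dtype=float)
    counts = []
    for r in radii:
        n = 0
        for p1, p2 in segments:
            d1 = np.linalg.norm(np.asarray(p1, dtype=float) - soma)
            d2 = np.linalg.norm(np.asarray(p2, dtype=float) - soma)
            # a straight segment crosses the circle exactly when its two
            # endpoints lie on opposite sides of radius r
            if min(d1, d2) < r <= max(d1, d2):
                n += 1
        counts.append(n)
    return counts
```

A branching arbor then shows up as a rise and fall of the intersection count with radius; the deprivation-induced branching changes reported above would shift this profile up or down.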
Affiliation(s)
- Tamar Macharadze
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Clinic for Anesthesiology and Intensive Care Medicine, Otto von Guericke University Hospital, Magdeburg, Germany
| | - Eike Budinger
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
| | - Michael Brosch
- Center for Behavioral Brain Sciences, Magdeburg, Germany; Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany
| | - Henning Scheich
- Center for Behavioral Brain Sciences, Magdeburg, Germany; Emeritus Group Lifelong Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany
| | - Frank W Ohl
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Institute for Biology, Otto von Guericke University, Magdeburg, Germany
| | - Julia U Henschke
- Institute of Cognitive Neurology and Dementia Research (IKND), Otto von Guericke University, Magdeburg, Germany
| |
|
40
|
Odor Identification in Rats: Behavioral and Electrophysiological Evidence of Learned Olfactory-Auditory Associations. eNeuro 2019; 6:ENEURO.0102-19.2019. [PMID: 31362955 PMCID: PMC6709214 DOI: 10.1523/eneuro.0102-19.2019] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2019] [Revised: 06/28/2019] [Accepted: 07/15/2019] [Indexed: 12/31/2022] Open
Abstract
The ability to recognize and identify a smell is highly dependent on multisensory context and expectation, for example, hearing the name of the odor source. Here, we develop a novel auditory-odor association task in rats, wherein the animal learns that a specific auditory tone, when associated with a specific odor, predicts reward (Go signal), whereas the same tone associated with a different odor, or vice versa, does not (No-Go signal). The tone occurs prior to the onset of the odor, allowing physiological analyses of sensory-evoked local field potential (LFP) activity to each stimulus in primary auditory cortex and anterior piriform cortex (aPCX). In trained animals that have acquired the task, both auditory and subsequent olfactory cues activate β band oscillations in both the auditory cortex and aPCX, suggesting multisensory integration. Naive animals show no such multisensory responses, suggesting the response is learned. In addition to the learned multisensory evoked responses, functional connectivity between auditory cortex and aPCX, as assessed with spectral coherence and phase lag index (PLI), is enhanced. Importantly, both the multisensory evoked responses and the functional connectivity are context-dependent. In trained animals, the same auditory stimuli presented in the home cage evoke no responses in auditory cortex or aPCX, and functional connectivity between the sensory cortices is reduced. Together, the results demonstrate how learning and context shape the expression of multisensory cortical processing. Given that odor identification impairment is associated with preclinical dementia in humans, the mechanisms suggested here may help develop experimental models to assess effects of neuropathology on behavior.
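The phase lag index (PLI) used here as a connectivity measure has a compact definition: the absolute mean sign of the instantaneous phase difference between two signals, which is near 0 for inconsistent (or zero-lag, volume-conduction-driven) phase relations and near 1 when one signal consistently leads. A minimal sketch of that formula (illustrative only; in practice the instantaneous phases would come from, e.g., a Hilbert transform of band-passed LFPs):

```python
import numpy as np

def phase_lag_index(phase_a, phase_b):
    """PLI = |mean(sign(sin(phi_a - phi_b)))| over time samples."""
    dphi = np.asarray(phase_a, dtype=float) - np.asarray(phase_b, dtype=float)
    # sign(sin(.)) asks only which signal leads at each sample,
    # ignoring the magnitude of the lag
    return float(np.abs(np.mean(np.sign(np.sin(dphi)))))
```

A constant nonzero lag yields a PLI of 1, whereas a phase difference that flips sign from sample to sample averages out to 0.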
|
41
|
Multisensory learning between odor and sound enhances beta oscillations. Sci Rep 2019; 9:11236. [PMID: 31375760 PMCID: PMC6677763 DOI: 10.1038/s41598-019-47503-y] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/28/2019] [Accepted: 06/26/2019] [Indexed: 11/22/2022] Open
Abstract
Multisensory interactions are essential to make sense of the environment by transforming the mosaic of sensory inputs received by the organism into a unified perception. Brain rhythms allow coherent processing within areas or between distant brain regions and could thus be instrumental in functionally connecting remote brain areas in the context of multisensory interactions. Still, odor and sound processing relate to two sensory systems with specific anatomofunctional characteristics. How does the brain handle their association? Rats were challenged to discriminate between unisensory stimulation (odor or sound) and the multisensory combination of both. During learning, we observed the progressive establishment of high-power beta oscillations (15–35 Hz) spanning the olfactory bulb, the piriform cortex and the perirhinal cortex, but not the primary auditory cortex. In the piriform cortex, beta oscillation power was higher in the multisensory condition than during presentation of the odor alone. Furthermore, in the olfactory structures, the sound alone was able to elicit a beta oscillatory response. These findings emphasize the functional differences between olfactory and auditory cortices and reveal that beta oscillations contribute to the memory formation of the multisensory association.
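Beta-band (15–35 Hz) power of the kind tracked here can be estimated in several ways; one minimal FFT-based sketch (illustrative only, not the study's actual spectral pipeline; the sampling rate and band edges are stand-ins) is:

```python
import numpy as np

def band_power(signal, fs, f_lo=15.0, f_hi=35.0):
    """Summed spectral power in [f_lo, f_hi] Hz, normalised by length."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(power[band].sum() / len(signal))
```

Tracking this quantity trial by trial would expose the progressive establishment of beta power described above; on noisy LFPs, Welch averaging or a wavelet decomposition would be more robust than a single raw periodogram.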
|
42
|
Császár-Nagy N, Kapócs G, Bókkon I. Classic psychedelics: the special role of the visual system. Rev Neurosci 2019; 30:651-669. [PMID: 30939118 DOI: 10.1515/revneuro-2018-0092] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2018] [Accepted: 11/05/2018] [Indexed: 12/23/2022]
Abstract
Here, we briefly review the various aspects of classic serotonergic hallucinogens reported in a number of studies. One of the key hypotheses of our paper is that the visual effects of psychedelics might play a key role in resetting fears. We focus especially on visual processes because they are among the most prominent features of hallucinogen-induced hallucinations. We hypothesize that our brain has an ancient, visual-based (preverbal) intrinsic cognitive process that, during the transient inhibition by psychedelics of top-down convergent and abstract thinking (mediated by the prefrontal cortex), can neutralize emotional fears of unconscious and conscious life experiences from the past. In these processes, the decreased functional integrity of the self-referencing processes of the default mode network, the modified multisensory integration (linked to bodily self-consciousness and self-awareness), and the modified amygdala activity may also play key roles. Moreover, the emotional reset (elimination of stress-related emotions) by psychedelics may induce psychological changes and overwrite the stress-related neuroepigenetic information of past unconscious and conscious emotional fears.
Affiliation(s)
- Noemi Császár-Nagy
- National University of Public Services, Budapest, Hungary; Psychosomatic Outpatient Clinics, Budapest, Hungary
| | - Gábor Kapócs
- Saint John Hospital, Budapest, Hungary; Institute of Behavioral Sciences, Semmelweis University, Budapest, Hungary
| | - István Bókkon
- Psychosomatic Outpatient Clinics, Budapest, Hungary; Vision Research Institute, Neuroscience and Consciousness Research Department, Lowell, MA, USA
| |
|
43
|
Császár N, Kapócs G, Bókkon I. A possible key role of vision in the development of schizophrenia. Rev Neurosci 2019; 30:359-379. [PMID: 30244235 DOI: 10.1515/revneuro-2018-0022] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2018] [Accepted: 08/01/2018] [Indexed: 12/12/2022]
Abstract
Based on a brief overview of the various aspects of schizophrenia reported by numerous studies, here we hypothesize that schizophrenia may originate in (and in part be expressed through) visual areas. In other words, a normal visual system, or at least an evanescent visual perception, may be an essential prerequisite for the development of schizophrenia as well as of various types of hallucinations. Our study focuses on auditory and visual hallucinations, as they are the most prominent, and also the most studied, features of schizophrenic hallucinations. Here, we evaluate the possible key role of the visual system in the development of schizophrenia.
Affiliation(s)
- Noemi Császár
- Gaspar Karoly University Psychological Institute, H-1091 Budapest, Hungary; Psychosomatic Outpatient Department, H-1037 Budapest, Hungary
| | - Gabor Kapócs
- Buda Family Centred Mental Health Centre, Department of Psychiatry and Psychiatric Rehabilitation, St. John Hospital, Budapest, Hungary
| | - István Bókkon
- Psychosomatic Outpatient Department, H-1037 Budapest, Hungary; Vision Research Institute, Neuroscience and Consciousness Research Department, 25 Rita Street, Lowell, MA 01854, USA
| |
|
44
|
Li Q, Liu G, Yuan G, Wang G, Wu Z, Zhao X. DC Shifts-fMRI: A Supplement to Event-Related fMRI. Front Comput Neurosci 2019; 13:37. [PMID: 31244636 PMCID: PMC6581730 DOI: 10.3389/fncom.2019.00037] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/24/2019] [Accepted: 05/21/2019] [Indexed: 11/13/2022] Open
Abstract
Event-related fMRI has been widely used to locate brain regions that respond to specific tasks. However, the activity of brain regions that modulate, or indirectly participate in, the response to a specific task is not event-related. Event-related fMRI cannot locate these regulatory regions, which undermines the completeness of the results it reveals. Direct-current EEG shifts (DC shifts) have been found to be linked to inner brain activity, so a fused DC shifts-fMRI method may be able to reveal a more complete response of the brain. In this study, we used DC shifts-fMRI to verify that, even when the brain responds to a very simple task, (1) its response is more complicated than event-related fMRI generally reveals, and (2) DC shifts-fMRI can reveal brain regions whose responses are not event-related. We used a classical and simple paradigm that is often used in auditory cortex tonotopic mapping. Data were recorded from 50 subjects (25 male, 25 female) who were presented with randomly ordered pure-tone sequences at six frequencies (200, 400, 800, 1,600, 3,200, and 6,400 Hz). Our traditional fMRI results are consistent with previous findings that activations are concentrated in the auditory cortex. Our DC shifts-fMRI results showed that the cingulate-caudate-thalamus network, which underpins sustained attention, is positively activated, whereas the dorsal attention network and the right middle frontal gyrus, which underpin attention orientation, are negatively activated. The region-specific correlations between DC shifts and brain networks indicate the complexity of the brain's response even to a simple task, and show that DC shifts can effectively reflect these non-event-related inner brain activities.
Affiliation(s)
- Qiang Li
- Education Science College, Guizhou Normal College, Guiyang, China
| | - Guangyuan Liu
- College of Electronic and Information Engineering, Southwest University, Chongqing, China; Chongqing Collaborative Innovation Center for Brain Science, Southwest University, Chongqing, China
| | - Guangjie Yuan
- College of Electronic and Information Engineering, Southwest University, Chongqing, China
| | - Gaoyuan Wang
- College of Music, Southwest University, Chongqing, China
| | - Zonghui Wu
- Southwest University Hospital, Southwest University, Chongqing, China
| | - Xingcong Zhao
- College of Electronic and Information Engineering, Southwest University, Chongqing, China
| |
|
45
|
Schormans AL, Typlt M, Allman BL. Adult-Onset Hearing Impairment Induces Layer-Specific Cortical Reorganization: Evidence of Crossmodal Plasticity and Central Gain Enhancement. Cereb Cortex 2019; 29:1875-1888. [PMID: 29668848 PMCID: PMC6458918 DOI: 10.1093/cercor/bhy067] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2017] [Revised: 02/22/2018] [Indexed: 11/14/2022] Open
Abstract
Adult-onset hearing impairment can lead to hyperactivity in the auditory pathway (i.e., central gain enhancement) as well as increased cortical responsiveness to nonauditory stimuli (i.e., crossmodal plasticity). However, it remained unclear to what extent hearing loss-induced hyperactivity is relayed beyond the auditory cortex, and thus, whether central gain enhancement competes or coexists with crossmodal plasticity throughout the distinct layers of the audiovisual cortex. To that end, we investigated the effects of partial hearing loss on laminar processing in the auditory, visual and audiovisual cortices of adult rats using extracellular electrophysiological recordings performed 2 weeks after loud noise exposure. Current-source density analyses revealed that central gain enhancement was not relayed to the audiovisual cortex (V2L), and was instead restricted to the granular layer of the higher order auditory area, AuD. In contrast, crossmodal plasticity was evident across multiple cortical layers within V2L, and also manifested in AuD. Surprisingly, despite this coexistence of central gain enhancement and crossmodal plasticity, noise exposure did not disrupt the responsiveness of these neighboring cortical regions to combined audiovisual stimuli. Overall, we have shown for the first time that adult-onset hearing impairment causes a complex assortment of intramodal and crossmodal changes across the layers of higher order sensory cortices.
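The current-source density (CSD) analysis applied in this study estimates laminar sinks and sources as the negative, conductivity-scaled second spatial derivative of the LFP across equally spaced contacts. A one-dimensional sketch (illustrative only; the conductivity value and electrode spacing below are placeholders, and real pipelines add smoothing and boundary handling):

```python
import numpy as np

def csd_profile(lfp, spacing, sigma=1.0):
    """CSD = -sigma * d2(phi)/dz2 via a second-order finite difference.

    lfp: voltages at equally spaced laminar contacts (shallow to deep)
    spacing: inter-contact distance
    sigma: extracellular conductivity (placeholder value of 1 here)
    Returns len(lfp) - 2 values; sinks and sources appear with
    opposite signs under this convention.
    """
    lfp = np.asarray(lfp, dtype=float)
    return -sigma * np.diff(lfp, n=2) / spacing ** 2
```

Applied to a depth profile of evoked LFPs, this localises the granular-layer activity changes described above to specific laminae.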
Affiliation(s)
- Ashley L Schormans
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
| | - Marei Typlt
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
| | - Brian L Allman
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
| |
|
46
|
Chanauria N, Bharmauria V, Bachatene L, Cattan S, Rouat J, Molotchnikoff S. Sound Induces Change in Orientation Preference of V1 Neurons: Audio-Visual Cross-Influence. Neuroscience 2019; 404:48-61. [PMID: 30703505 DOI: 10.1016/j.neuroscience.2019.01.039] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2018] [Revised: 01/18/2019] [Accepted: 01/21/2019] [Indexed: 10/27/2022]
Abstract
In the cortex, demarcated unimodal sensory regions often respond to unforeseen sensory stimuli and exhibit plasticity. The goal of the current investigation was to test evoked responses of primary visual cortex (V1) neurons when an adapting auditory stimulus is applied in isolation. Using extracellular recordings in anesthetized cats, we demonstrate that, unlike the prevailing observation of only slight modulations in the firing rates of the neurons, sound imposition in isolation entirely shifted the peaks of orientation tuning curves of neurons in both supra- and infragranular layers of V1. Our results suggest that neurons specific to either layer dynamically integrate features of sound and modify the organization of the orientation map of V1. Intriguingly, these experiments present the novel finding that the mere presentation of a prolonged auditory stimulus may drastically recalibrate the tuning properties of visual neurons, highlighting the phenomenal neuroplasticity of V1 neurons.
Affiliation(s)
- Nayan Chanauria
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Vishal Bharmauria
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Lyes Bachatene
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Sarah Cattan
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Jean Rouat
- Departement de Génie Électrique et Génie Informatique, Université de Sherbrooke, Sherbrooke, QC, Canada
| | - Stéphane Molotchnikoff
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada.
| |
|
47
|
Maruyama AT, Komai S. Auditory-induced response in the primary sensory cortex of rodents. PLoS One 2018; 13:e0209266. [PMID: 30571722 PMCID: PMC6301624 DOI: 10.1371/journal.pone.0209266] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2018] [Accepted: 12/03/2018] [Indexed: 11/18/2022] Open
Abstract
The details of auditory response at the subthreshold level in the rodent primary somatosensory cortex, the barrel cortex, have not been studied extensively, although several phenomenological reports have been published. Multisensory features may act as neuronal representations of links between inputs from one sensory modality to other sensory modalities. Here, we examined the basic multisensory postsynaptic responses in the rodent barrel cortex using in vivo whole-cell recordings of neurons. We observed robust responses to acoustic stimuli in most barrel cortex neurons. Acoustically evoked responses were mediated by hearing and reached approximately 60% of the postsynaptic response amplitude elicited by strong somatosensory stimuli. Compared to tactile stimuli, auditory stimuli evoked postsynaptic potentials with a longer latency and longer duration. Specifically, auditory stimuli in barrel cortex neurons appeared to trigger "up states", episodes associated with membrane depolarization and increased synaptic activity. Taken together, our data suggest that barrel cortex neurons have multisensory properties, with distinct synaptic mechanisms underlying tactile and non-tactile responses.
Affiliation(s)
- Atsuko T. Maruyama
- Department of Science and Technology, Nara Institute of Science and Technology, Takayama, Japan
| | - Shoji Komai
- Department of Science and Technology, Nara Institute of Science and Technology, Takayama, Japan
| |
|
48
|
Bieler M, Xu X, Marquardt A, Hanganu-Opatz IL. Multisensory integration in rodent tactile but not visual thalamus. Sci Rep 2018; 8:15684. [PMID: 30356135 PMCID: PMC6200796 DOI: 10.1038/s41598-018-33815-y] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2018] [Accepted: 10/04/2018] [Indexed: 11/09/2022] Open
Abstract
Behavioural performance requires a coherent perception of environmental features that address multiple senses. These diverse sensory inputs are integrated in primary sensory cortices, yet it is still largely unknown whether their convergence occurs even earlier along the sensory tract. Here we investigate the role of putatively modality-specific first-order (FO) thalamic nuclei (ventral posteromedial nucleus (VPM), dorsal lateral geniculate nucleus (dLGN)) and their interactions with primary sensory cortices (S1, V1) for multisensory integration in pigmented rats in vivo. We show that bimodal stimulation (i.e. simultaneous light flash and whisker deflection) enhances sensory evoked activity in VPM, but not dLGN. Moreover, cross-modal stimuli reset the phase of thalamic network oscillations and strengthen the coupling efficiency between VPM and S1, but not between dLGN and V1. Finally, the information flow from VPM to S1 is enhanced. Thus, FO tactile, but not visual, thalamus processes and relays sensory inputs from multiple senses, revealing a functional difference between sensory thalamic nuclei during multisensory integration.
Affiliation(s)
- Malte Bieler
- Developmental Neurophysiology, Institute of Neuroanatomy, University Medical Center Hamburg-Eppendorf, 20251, Hamburg, Germany; Laboratory for Neural Computation, Department of Physiology, University of Oslo, 0372, Oslo, Norway
| | - Xiaxia Xu
- Developmental Neurophysiology, Institute of Neuroanatomy, University Medical Center Hamburg-Eppendorf, 20251, Hamburg, Germany
| | - Annette Marquardt
- Developmental Neurophysiology, Institute of Neuroanatomy, University Medical Center Hamburg-Eppendorf, 20251, Hamburg, Germany
| | - Ileana L Hanganu-Opatz
- Developmental Neurophysiology, Institute of Neuroanatomy, University Medical Center Hamburg-Eppendorf, 20251, Hamburg, Germany.
| |
|
49
|
Milne AE, Petkov CI, Wilson B. Auditory and Visual Sequence Learning in Humans and Monkeys using an Artificial Grammar Learning Paradigm. Neuroscience 2018; 389:104-117. [PMID: 28687306 PMCID: PMC6278909 DOI: 10.1016/j.neuroscience.2017.06.059] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2017] [Revised: 06/26/2017] [Accepted: 06/27/2017] [Indexed: 11/30/2022]
Abstract
Language flexibly supports the human ability to communicate using different sensory modalities, such as writing and reading in the visual modality and speaking and listening in the auditory domain. Although it has been argued that nonhuman primate communication abilities are inherently multisensory, direct behavioural comparisons between human and nonhuman primates are scant. Artificial grammar learning (AGL) tasks and statistical learning experiments can be used to emulate ordering relationships between words in a sentence. However, previous comparative work using such paradigms has primarily investigated sequence learning within a single sensory modality. We used an AGL paradigm to evaluate how humans and macaque monkeys learn and respond to identically structured sequences of either auditory or visual stimuli. In the auditory and visual experiments, we found that both species were sensitive to the ordering relationships between elements in the sequences. Moreover, the humans and monkeys produced largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed in comparable ways across the sensory modalities. These results provide evidence that human sequence processing abilities stem from an evolutionarily conserved capacity that appears to operate comparably across the sensory modalities in both human and nonhuman primates. The findings set the stage for future neurobiological studies to investigate the multisensory nature of these sequencing operations in nonhuman primates and how they compare to related processes in humans.
Affiliation(s)
- Alice E Milne
- Institute of Neuroscience, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom; Centre for Behaviour and Evolution, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom
| | - Christopher I Petkov
- Institute of Neuroscience, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom; Centre for Behaviour and Evolution, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom.
| | - Benjamin Wilson
- Institute of Neuroscience, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom; Centre for Behaviour and Evolution, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, United Kingdom
| |
|
50
|
Mattingly MM, Donell BM, Rosen MJ. Late maturation of backward masking in auditory cortex. J Neurophysiol 2018; 120:1558-1571. [PMID: 29995598 DOI: 10.1152/jn.00114.2018] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Speech perception relies on the accurate resolution of brief, successive sounds that change rapidly over time. Deficits in the perception of such sounds, indicated by a reduced ability to detect signals during auditory backward masking, strongly relate to language processing difficulties in children. Backward masking during normal development has a longer maturational trajectory than many other auditory percepts, implicating the involvement of central auditory neural mechanisms with protracted developmental time courses. Despite the importance of this percept, its neural correlates are not well described at any developmental stage. We therefore measured auditory cortical responses to masked signals in juvenile and adult Mongolian gerbils and quantified the detection ability of individual neurons and neural populations in a manner comparable with psychoacoustic measurements. Perceptually, auditory backward masking manifests as higher thresholds for detection of a short signal followed by a masker than for the same signal in silence. Cortical masking was driven by a combination of suppressed responses to the signal and a reduced dynamic range available for signal detection in the presence of the masker. Both coding elements contributed to greater masked threshold shifts in juveniles compared with adults, but signal-evoked firing suppression was more pronounced in juveniles. Neural threshold shifts were a better match to human psychophysical threshold shifts when quantified with a longer temporal window that included the response to the delayed masker, suggesting that temporally selective listening may contribute to age-related differences in backward masking. NEW & NOTEWORTHY In children, auditory detection of backward masked signals is immature well into adolescence, and detection deficits correlate with problems in speech processing. Our auditory cortical recordings reveal immature backward masking in adolescent animals that mirrors the prolonged development seen in children. This is driven by both signal-evoked suppression and dynamic range reduction. An extended window of analysis suggests that differences in temporally focused listening may contribute to late maturing thresholds for backward masked signals.
Affiliation(s)
- Michelle M Mattingly
- Department of Anatomy & Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio
| | - Brittany M Donell
- Department of Anatomy & Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio
| | - Merri J Rosen
- Department of Anatomy & Neurobiology, Northeast Ohio Medical University, Rootstown, Ohio
| |
|