1. Kato DD, Bruno RM. Stability of cross-sensory input to primary somatosensory cortex across experience. Neuron 2025; 113:291-306.e7. [PMID: 39561767; PMCID: PMC11757082; DOI: 10.1016/j.neuron.2024.10.020] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Merging information across sensory modalities is key to forming robust percepts, yet how the brain achieves this feat remains unclear. Recent studies report cross-modal influences in the primary sensory cortex, suggesting possible multisensory integration in the early stages of cortical processing. We test several hypotheses about the function of auditory influences on mouse primary somatosensory cortex (S1) using in vivo two-photon calcium imaging. We found sound-evoked spiking activity in an extremely small fraction of cells, and this sparse activity did not encode auditory stimulus identity. Moreover, S1 did not encode information about specific audio-tactile feature conjunctions. Auditory and audio-tactile stimulus encoding remained unchanged after both passive experience and reinforcement. These results suggest that while primary sensory cortex is plastic within its own modality, the influence of other modalities is remarkably stable and stimulus nonspecific.
Affiliation(s)
- Daniel D Kato: Department of Neuroscience, Columbia University, New York, NY 10027, USA
- Randy M Bruno: Department of Neuroscience, Columbia University, New York, NY 10027, USA; Department of Physiology, Anatomy, & Genetics, University of Oxford, Oxford OX1 3PT, UK

2. Li Z, He L, Peng L, Zhu X, Li M, Hu D. Negative hemodynamic response in the visual cortex: Evidence supporting neuronal origin via hemodynamic observation and two-photon imaging. Brain Res Bull 2025; 220:111149. [PMID: 39615859; DOI: 10.1016/j.brainresbull.2024.111149] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
The positive hemodynamic response (PHR) during stimulation often co-occurs with a strong, sustained negative hemodynamic response (NHR). However, the characteristics and neurophysiological mechanisms of the NHR, especially in regions distal to the PHR, remain incompletely understood. Using intrinsic optical imaging (OI) and two-photon imaging, we observed that forelimb electrical stimulation evoked strong PHR signals in the forelimb region of the primary somatosensory cortex (S1FL). Meanwhile, NHR signals primarily appeared in the primary visual cortex (V1), with a delayed onset and lower amplitude relative to the PHR signals. Additionally, stimulation led to a reduction in cerebral blood flow (CBF) in the NHR region. Notably, there was an overall suppression of the calcium response in the NHR region, although a small proportion (14 %) of neurons exhibited concurrent activation. Axon tracing revealed cortico-cortical projections from S1FL to V1. These findings suggest that neuronal deactivation significantly contributes to the origin of the NHR, offering additional insights into the specific inhibitory mechanisms underlying the NHR.
Affiliation(s)
- Zhen Li: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China
- Lihua He: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China
- Limin Peng: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China
- Xuan Zhu: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China
- Ming Li: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China
- Dewen Hu: College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China

3. Vogler NW, Chen R, Virkler A, Tu VY, Gottfried JA, Geffen MN. Direct Piriform-to-Auditory Cortical Projections Shape Auditory-Olfactory Integration. J Neurosci 2024; 44:e1140242024. [PMID: 39510831; PMCID: PMC11622214; DOI: 10.1523/jneurosci.1140-24.2024] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
In a real-world environment, the brain must integrate information from multiple sensory modalities, including the auditory and olfactory systems. However, little is known about the neuronal circuits governing how odors influence and modulate sound processing. Here, we investigated the mechanisms underlying auditory-olfactory integration using anatomical, electrophysiological, and optogenetic approaches, focusing on the auditory cortex as a key locus for cross-modal integration. First, retrograde and anterograde viral tracing strategies revealed a direct projection from the piriform cortex to the auditory cortex. Next, using in vivo electrophysiological recordings of neuronal activity in the auditory cortex of awake male or female mice, we found that odors modulate auditory cortical responses to sound. Finally, we used in vivo optogenetic manipulations during electrophysiology to demonstrate that olfactory modulation in the auditory cortex, specifically, odor-driven enhancement of sound responses, depends on direct input from the piriform cortex. Together, our results identify a novel role of piriform-to-auditory cortical circuitry in shaping olfactory modulation in the auditory cortex, shedding new light on the neuronal mechanisms underlying auditory-olfactory integration.
Affiliation(s)
- Nathan W Vogler: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Ruoyi Chen: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Alister Virkler: Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Violet Y Tu: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Jay A Gottfried: Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Maria N Geffen: Departments of Otorhinolaryngology, Neurology, and Neuroscience, Perelman School of Medicine, University of Pennsylvania

4. Vogler NW, Chen R, Virkler A, Tu VY, Gottfried JA, Geffen MN. Direct piriform-to-auditory cortical projections shape auditory-olfactory integration. bioRxiv [Preprint] 2024:2024.07.11.602976. [PMID: 39071445; PMCID: PMC11275881; DOI: 10.1101/2024.07.11.602976] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
In a real-world environment, the brain must integrate information from multiple sensory modalities, including the auditory and olfactory systems. However, little is known about the neuronal circuits governing how odors influence and modulate sound processing. Here, we investigated the mechanisms underlying auditory-olfactory integration using anatomical, electrophysiological, and optogenetic approaches, focusing on the auditory cortex as a key locus for cross-modal integration. First, retrograde and anterograde viral tracing strategies revealed a direct projection from the piriform cortex to the auditory cortex. Next, using in vivo electrophysiological recordings of neuronal activity in the auditory cortex of awake male or female mice, we found that odors modulate auditory cortical responses to sound. Finally, we used in vivo optogenetic manipulations during electrophysiology to demonstrate that olfactory modulation in auditory cortex, specifically, odor-driven enhancement of sound responses, depends on direct input from the piriform cortex. Together, our results identify a novel role of piriform-to-auditory cortical circuitry in shaping olfactory modulation in the auditory cortex, shedding new light on the neuronal mechanisms underlying auditory-olfactory integration.
Affiliation(s)
- Nathan W. Vogler: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Ruoyi Chen: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Alister Virkler: Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Violet Y. Tu: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania
- Jay A. Gottfried: Department of Neurology, Perelman School of Medicine, University of Pennsylvania
- Maria N. Geffen: Department of Otorhinolaryngology, Perelman School of Medicine, University of Pennsylvania

5. Kato DD, Bruno RM. Stability of cross-sensory input to primary somatosensory cortex across experience. bioRxiv [Preprint] 2024:2024.08.07.607026. [PMID: 39149350; PMCID: PMC11326227; DOI: 10.1101/2024.08.07.607026] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Merging information from across sensory modalities is key to forming robust, disambiguated percepts of the world, yet how the brain achieves this feat remains unclear. Recent observations of cross-modal influences in primary sensory cortical areas have suggested that multisensory integration may occur in the earliest stages of cortical processing, but the role of these responses is still poorly understood. We address these questions by testing several hypotheses about the possible functions served by auditory influences on the barrel field of mouse primary somatosensory cortex (S1) using in vivo 2-photon calcium imaging. We observed sound-evoked spiking activity in a small fraction of cells overall, and, moreover, this sparse activity was insufficient to encode auditory stimulus identity; few cells responded preferentially to one sound or another, and a linear classifier trained to decode auditory stimuli from population activity performed barely above chance. Furthermore, S1 did not encode information about the specific audio-tactile feature conjunctions that we tested. Our ability to decode auditory and audio-tactile stimuli from neural activity remained unchanged after both passive experience and reinforcement. Collectively, these results suggest that while a primary sensory cortex is highly plastic with regard to its own modality, the influence of other modalities is remarkably stable and plays a largely stimulus-nonspecific role.
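
As a concrete illustration of the population-decoding approach described in this abstract, the sketch below trains a cross-validated linear classifier to predict auditory stimulus identity from trial-by-trial population activity and compares it against a label-shuffled control. It is a generic, minimal example under assumed data shapes and a logistic-regression decoder, not the authors' analysis pipeline.

```python
# Minimal sketch of decoding stimulus identity from population activity.
# Generic illustration only: the data shapes, simulated responses, and choice of a
# regularized logistic-regression decoder are assumptions, not the authors' pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_cells = 200, 150
X = rng.poisson(1.0, size=(n_trials, n_cells)).astype(float)  # trials x cells response matrix
y = rng.integers(0, 2, size=n_trials)                         # auditory stimulus label per trial

decoder = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
accuracy = cross_val_score(decoder, X, y, cv=5).mean()

# Label-shuffle control: expected to sit near chance (0.5 for two stimuli).
accuracy_shuffled = cross_val_score(decoder, X, rng.permutation(y), cv=5).mean()
print(f"decoder accuracy: {accuracy:.2f} (shuffled control: {accuracy_shuffled:.2f})")
```
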
Affiliation(s)
- Daniel D Kato: Department of Neuroscience, Columbia University, New York, NY 10027, USA
- Randy M Bruno: Department of Neuroscience, Columbia University, New York, NY 10027, USA; Department of Physiology, Anatomy, & Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom

6. Jordan J, Sacramento J, Wybo WAM, Petrovici MA, Senn W. Conductance-based dendrites perform Bayes-optimal cue integration. PLoS Comput Biol 2024; 20:e1012047. [PMID: 38865345; PMCID: PMC11168673; DOI: 10.1371/journal.pcbi.1012047] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
A fundamental function of cortical circuits is the integration of information from different sources to form a reliable basis for behavior. While animals behave as if they optimally integrate information according to Bayesian probability theory, the implementation of the required computations in the biological substrate remains unclear. We propose a novel, Bayesian view on the dynamics of conductance-based neurons and synapses which suggests that they are naturally equipped to optimally perform information integration. In our approach apical dendrites represent prior expectations over somatic potentials, while basal dendrites represent likelihoods of somatic potentials. These are parametrized by local quantities, the effective reversal potentials and membrane conductances. We formally demonstrate that under these assumptions the somatic compartment naturally computes the corresponding posterior. We derive a gradient-based plasticity rule, allowing neurons to learn desired target distributions and weight synaptic inputs by their relative reliabilities. Our theory explains various experimental findings on the system and single-cell level related to multi-sensory integration, which we illustrate with simulations. Furthermore, we make experimentally testable predictions on Bayesian dendritic integration and synaptic plasticity.
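
The Bayes-optimal combination referred to above can be made concrete with the textbook Gaussian case, in which precisions add and means are combined in proportion to their reliabilities; in the paper's framing, the precisions play the role of dendritic conductances and the means the role of effective reversal potentials. The notation below is a generic sketch of that standard result, not the authors' full conductance-based derivation.

```latex
% Gaussian prior (apical input) and likelihood (basal input) over the somatic variable u:
%   prior:      p(u)     = N(u; mu_a, 1/g_a)
%   likelihood: p(x | u) = N(u; mu_b, 1/g_b)
% The posterior is again Gaussian, with precisions (conductance-like terms) adding
% and means weighted by their relative reliabilities:
\begin{aligned}
p(u \mid x) &\propto p(u)\, p(x \mid u)
             = \mathcal{N}\!\left(u;\ \mu_{\mathrm{post}},\ g_{\mathrm{post}}^{-1}\right),\\
g_{\mathrm{post}} &= g_a + g_b, \qquad
\mu_{\mathrm{post}} = \frac{g_a\,\mu_a + g_b\,\mu_b}{g_a + g_b}.
\end{aligned}
```
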
Affiliation(s)
- Jakob Jordan: Department of Physiology, University of Bern, Bern, Switzerland; Electrical Engineering, Yale University, New Haven, Connecticut, United States of America
- João Sacramento: Department of Physiology, University of Bern, Bern, Switzerland; Institute of Neuroinformatics, UZH / ETH Zurich, Zurich, Switzerland
- Willem A. M. Wybo: Department of Physiology, University of Bern, Bern, Switzerland; Institute of Neuroscience and Medicine, Forschungszentrum Jülich, Jülich, Germany
- Walter Senn: Department of Physiology, University of Bern, Bern, Switzerland

7. Stocke S, Samuelsen CL. Multisensory Integration Underlies the Distinct Representation of Odor-Taste Mixtures in the Gustatory Cortex of Behaving Rats. J Neurosci 2024; 44:e0071242024. [PMID: 38548337; PMCID: PMC11097261; DOI: 10.1523/jneurosci.0071-24.2024] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
The perception of food relies on the integration of olfactory and gustatory signals originating from the mouth. This multisensory process generates robust associations between odors and tastes, significantly influencing the perceptual judgment of flavors. However, the specific neural substrates underlying this integrative process remain unclear. Previous electrophysiological studies identified the gustatory cortex as a site of convergent olfactory and gustatory signals, but whether neurons represent multimodal odor-taste mixtures as distinct from their unimodal odor and taste components is unknown. To investigate this, we recorded single-unit activity in the gustatory cortex of behaving female rats during the intraoral delivery of individual odors, individual tastes, and odor-taste mixtures. Our results demonstrate that chemoselective neurons in the gustatory cortex are broadly responsive to intraoral chemosensory stimuli, exhibiting time-varying multiphasic changes in activity. In a subset of these chemoselective neurons, odor-taste mixtures elicit nonlinear cross-modal responses that distinguish them from their olfactory and gustatory components. These findings provide novel insights into multimodal chemosensory processing by the gustatory cortex, highlighting the distinct representation of unimodal and multimodal intraoral chemosensory signals. Overall, our findings suggest that olfactory and gustatory signals interact nonlinearly in the gustatory cortex to enhance the identity coding of both unimodal and multimodal chemosensory stimuli.
Affiliation(s)
- Sanaya Stocke: Department of Biology, University of Louisville, Louisville, Kentucky 40292
- Chad L Samuelsen: Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, Kentucky 40292

8. Lemercier CE, Krieger P, Manahan-Vaughan D. Dynamic modulation of mouse thalamocortical visual activity by salient sounds. iScience 2024; 27:109364. [PMID: 38523779; PMCID: PMC10959669; DOI: 10.1016/j.isci.2024.109364] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Visual responses of the primary visual cortex (V1) are altered by sound. Sound-driven behavioral arousal suggests that, in addition to direct inputs from the primary auditory cortex (A1), multiple other sources may shape V1 responses to sound. Here, we show in anesthetized mice that sound (white noise, ≥70dB) drives a biphasic modulation of V1 visually driven gamma-band activity, comprising fast-transient inhibitory and slow, prolonged excitatory (A1-independent) arousal-driven components. An analogous yet quicker modulation of the visual response also occurred earlier in the visual pathway, at the level of the dorsolateral geniculate nucleus (dLGN), where sound transiently inhibited the early phasic visual response and subsequently induced a prolonged increase in tonic spiking activity and gamma rhythmicity. Our results demonstrate that sound-driven modulations of visual activity are not exclusive to V1 and suggest that thalamocortical inputs from the dLGN to V1 contribute to shaping V1 visual response to sound.
Affiliation(s)
- Clément E. Lemercier: Department of Neurophysiology, Medical Faculty, Ruhr-University Bochum, 44801 Bochum, Germany
- Patrik Krieger: Department of Neurophysiology, Medical Faculty, Ruhr-University Bochum, 44801 Bochum, Germany
- Denise Manahan-Vaughan: Department of Neurophysiology, Medical Faculty, Ruhr-University Bochum, 44801 Bochum, Germany

9. Mazo C, Baeta M, Petreanu L. Auditory cortex conveys non-topographic sound localization signals to visual cortex. Nat Commun 2024; 15:3116. [PMID: 38600132; PMCID: PMC11006897; DOI: 10.1038/s41467-024-47546-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Spatiotemporally congruent sensory stimuli are fused into a unified percept. The auditory cortex (AC) sends projections to the primary visual cortex (V1), which could provide signals for binding spatially corresponding audio-visual stimuli. However, whether AC inputs in V1 encode sound location remains unknown. Using two-photon axonal calcium imaging and a speaker array, we measured the auditory spatial information transmitted from AC to layer 1 of V1. AC conveys information about the location of ipsilateral and contralateral sound sources to V1. Sound location could be accurately decoded by sampling AC axons in V1, providing a substrate for making location-specific audiovisual associations. However, AC inputs were not retinotopically arranged in V1, and audio-visual modulations of V1 neurons did not depend on the spatial congruency of the sound and light stimuli. The non-topographic sound localization signals provided by AC might allow the association of specific audiovisual spatial patterns in V1 neurons.
Affiliation(s)
- Camille Mazo: Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Margarida Baeta: Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Leopoldo Petreanu: Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal

10. Oude Lohuis MN, Marchesi P, Olcese U, Pennartz CMA. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 2024; 27:758-771. [PMID: 38307971; DOI: 10.1038/s41593-023-01564-5] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0]
Abstract
Primary sensory cortices respond to crossmodal stimuli-for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80-100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.
Affiliation(s)
- Matthijs N Oude Lohuis: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands; Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands; Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Pietro Marchesi: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands; Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Umberto Olcese: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands; Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Cyriel M A Pennartz: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands; Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands

11. Meneghetti N, Vannini E, Mazzoni A. Rodents' visual gamma as a biomarker of pathological neural conditions. J Physiol 2024; 602:1017-1048. [PMID: 38372352; DOI: 10.1113/jp283858] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Neural gamma oscillations (indicatively 30-100 Hz) are ubiquitous: they are associated with a broad range of functions in multiple cortical areas and across many animal species. Experimental and computational works established gamma rhythms as a global emergent property of neuronal networks generated by the balanced and coordinated interaction of excitation and inhibition. Coherently, gamma activity is strongly influenced by the alterations of synaptic dynamics which are often associated with pathological neural dysfunctions. We argue therefore that these oscillations are an optimal biomarker for probing the mechanism of cortical dysfunctions. Gamma oscillations are also highly sensitive to external stimuli in sensory cortices, especially the primary visual cortex (V1), where the stimulus dependence of gamma oscillations has been thoroughly investigated. Gamma manipulation by visual stimuli tuning is particularly easy in rodents, which have become a standard animal model for investigating the effects of network alterations on gamma oscillations. Overall, gamma in the rodents' visual cortex offers an accessible probe on dysfunctional information processing in pathological conditions. Beyond vision-related dysfunctions, alterations of gamma oscillations in rodents were indeed also reported in neural deficits such as migraine, epilepsy and neurodegenerative or neuropsychiatric conditions such as Alzheimer's, schizophrenia and autism spectrum disorders. Altogether, the connections between visual cortical gamma activity and physio-pathological conditions in rodent models underscore the potential of gamma oscillations as markers of neuronal (dys)functioning.
Affiliation(s)
- Nicolò Meneghetti: The Biorobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy; Department of Excellence for Robotics and AI, Scuola Superiore Sant'Anna, Pisa, Italy
- Eleonora Vannini: Neuroscience Institute, National Research Council (CNR), Pisa, Italy
- Alberto Mazzoni: The Biorobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy; Department of Excellence for Robotics and AI, Scuola Superiore Sant'Anna, Pisa, Italy

12. Zhuo C, Tian H, Zhu J, Fang T, Ping J, Wang L, Sun Y, Cheng L, Chen C, Chen G. Low-dose lithium adjunct to quetiapine improves cognitive task performance in mice with MK801-induced long-term cognitive impairment: Evidence from a pilot study. J Affect Disord 2023; 340:42-52. [PMID: 37506773; DOI: 10.1016/j.jad.2023.07.104] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
BACKGROUND: Low-dose lithium (LD-Li) has been shown to rescue cognitive impairment in mouse models of short-term mild cognitive impairment, dementia, and schizophrenia. However, few studies have characterized the effects of LD-Li, alone or in conjunction with antipsychotics, in the mouse model of MK801-induced long-term cognitive impairment.
METHODS: The present study used in vivo Ca2+ imaging and a battery of cognitive function assessments to investigate the long-term effects of LD-Li on cognition in mice exposed to repeated injections of MK801. Prefrontal Ca2+ activity was visualized to estimate alterations in neural activity in the model mice. Pre-pulse inhibition (PPI), novel object recognition (NOR), Morris water maze (MWM), and fear conditioning (FC) tasks were used to characterize cognitive performance; open field activity (OFA) testing was used to observe psychotic symptoms. Two treatment strategies were tested: LD-Li [250 mg/d human equivalent dose (HED)] adjunct to quetiapine (QTP; 600 mg/d HED); and QTP monotherapy (QTP-mt; 600 mg/d HED).
RESULTS: Compared to the QTP-mt group, the LD-Li + QTP group showed greatly improved cognitive performance on all measures between experimental days 29 and 85. QTP-mt improved behavioral measures compared to untreated controls, but the effects persisted only from day 29 to day 43. These data suggest that LD-Li + QTP is superior to QTP-mt for improving long-term cognitive impairments in the MK801 mouse model.
LIMITATIONS: There is no medical consensus regarding lithium use in patients with schizophrenia.
CONCLUSION: More pre-clinical and clinical studies are needed to further investigate effective treatment strategies for patients with long-term cognitive impairments, such as chronic schizophrenia.
Affiliation(s)
- Chuanjun Zhuo: Key Laboratory of Sensory Information Processing Abnormalities in Schizophrenia (SIPAC_Lab), Tianjin Fourth Center Hospital, Nankai University Affiliated Tianjin Fourth Center Hospital, Tianjin Medical University Affiliated Tianjin Fourth Center Hospital, Tianjin 300140, China; Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China; Laboratory of Psychiatric-Neuroimaging-Genetic and Co-morbidity (PNGC_Lab), Tianjin Anding Hospital, Nankai University Affiliated Tianjin Anding Hospital, Tianjin Mental Health Center of Tianjin Medical University, Tianjin Medical University Affiliated Tianjin Anding Hospital, Tianjin 300222, China
- Hongjun Tian: Key Laboratory of Sensory Information Processing Abnormalities in Schizophrenia (SIPAC_Lab), Tianjin Fourth Center Hospital, Nankai University Affiliated Tianjin Fourth Center Hospital, Tianjin Medical University Affiliated Tianjin Fourth Center Hospital, Tianjin 300140, China
- Jingjing Zhu: Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China
- Tao Fang: Key Laboratory of Sensory Information Processing Abnormalities in Schizophrenia (SIPAC_Lab), Tianjin Fourth Center Hospital, Nankai University Affiliated Tianjin Fourth Center Hospital, Tianjin Medical University Affiliated Tianjin Fourth Center Hospital, Tianjin 300140, China
- Jing Ping: Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China
- Lina Wang: Laboratory of Psychiatric-Neuroimaging-Genetic and Co-morbidity (PNGC_Lab), Tianjin Anding Hospital, Nankai University Affiliated Tianjin Anding Hospital, Tianjin Mental Health Center of Tianjin Medical University, Tianjin Medical University Affiliated Tianjin Anding Hospital, Tianjin 300222, China
- Yun Sun: Laboratory of Psychiatric-Neuroimaging-Genetic and Co-morbidity (PNGC_Lab), Tianjin Anding Hospital, Nankai University Affiliated Tianjin Anding Hospital, Tianjin Mental Health Center of Tianjin Medical University, Tianjin Medical University Affiliated Tianjin Anding Hospital, Tianjin 300222, China
- Langlang Cheng: Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China
- Chunmian Chen: Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China
- Guangdong Chen: Animal Imaging Center (AIC), Wenzhou Seventh People's Hospital, Wenzhou 325000, China

13. Hajnal MA, Tran D, Einstein M, Martelo MV, Safaryan K, Polack PO, Golshani P, Orbán G. Continuous multiplexed population representations of task context in the mouse primary visual cortex. Nat Commun 2023; 14:6687. [PMID: 37865648; PMCID: PMC10590415; DOI: 10.1038/s41467-023-42441-w] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5]
Abstract
Effective task execution requires the representation of multiple task-related variables that determine how stimuli lead to correct responses. Even the primary visual cortex (V1) represents other task-related variables such as expectations, choice, and context. However, it is unclear how V1 can flexibly accommodate these variables without interfering with visual representations. We trained mice on a context-switching cross-modal decision task, where performance depends on inferring task context. We found that the context signal that emerged in V1 was behaviorally relevant as it strongly covaried with performance, independent from movement. Importantly, this signal was integrated into V1 representation by multiplexing visual and context signals into orthogonal subspaces. In addition, auditory and choice signals were also multiplexed as these signals were orthogonal to the context representation. Thus, multiplexing allows V1 to integrate visual inputs with other sensory modalities and cognitive variables to avoid interference with the visual representation while ensuring the maintenance of task-relevant variables.
Affiliation(s)
- Márton Albert Hajnal: Department of Computational Sciences, Wigner Research Center for Physics, Budapest, 1121, Hungary
- Duy Tran: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, 90095, USA; Albert Einstein College of Medicine, New York, NY, 10461, USA
- Michael Einstein: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, 90095, USA
- Mauricio Vallejo Martelo: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, 90095, USA
- Karen Safaryan: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, 90095, USA
- Pierre-Olivier Polack: Center for Molecular and Behavioral Neuroscience, Rutgers University, Newark, NJ, 07102, USA
- Peyman Golshani: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, 90095, USA; Integrative Center for Learning and Memory, Brain Research Institute, University of California, Los Angeles, Los Angeles, CA, 90095, USA; West Los Angeles VA Medical Center, Los Angeles, CA, 90073, USA
- Gergő Orbán: Department of Computational Sciences, Wigner Research Center for Physics, Budapest, 1121, Hungary

14. Choi I, Demir I, Oh S, Lee SH. Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. [PMID: 37545309; PMCID: PMC10404930; DOI: 10.1098/rstb.2022.0338] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. In traditional studies on sensory processing, the sensory cortices have been considered for processing sensory information in a modality-specific manner. The sensory cortices, however, send the information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where the multiple modality inputs converge and integrate to generate a meaningful percept. This integration process is neither simple nor fixed because these brain areas interact with each other via complicated circuits, which can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders, which may result in an altered perception of the multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Ilsong Choi: Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Ilayda Demir: Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seungmi Oh: Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seung-Hee Lee: Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea; Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea

15. Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220336. [PMID: 37545313; PMCID: PMC10404929; DOI: 10.1098/rstb.2022.0336] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0]
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude that the visual cortex may process more information than of retinal origin alone, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in visual cortex play a role in visual processing itself, for instance disentangling direct auditory effects on visual cortex from effects of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual cortical system, we propose that visual percepts are generated by a larger network-the extended visual system-spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Matthijs N. Oude Lohuis: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands; Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal
- Umberto Olcese: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands

16. Coen P, Sit TPH, Wells MJ, Carandini M, Harris KD. Mouse frontal cortex mediates additive multisensory decisions. Neuron 2023; 111:2432-2447.e13. [PMID: 37295419; PMCID: PMC10957398; DOI: 10.1016/j.neuron.2023.05.008] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0]
Abstract
The brain can combine auditory and visual information to localize objects. However, the cortical substrates underlying audiovisual integration remain uncertain. Here, we show that mouse frontal cortex combines auditory and visual evidence; that this combination is additive, mirroring behavior; and that it evolves with learning. We trained mice in an audiovisual localization task. Inactivating frontal cortex impaired responses to either sensory modality, while inactivating visual or parietal cortex affected only visual stimuli. Recordings from >14,000 neurons indicated that after task learning, activity in the anterior part of frontal area MOs (secondary motor cortex) additively encodes visual and auditory signals, consistent with the mice's behavioral strategy. An accumulator model applied to these sensory representations reproduced the observed choices and reaction times. These results suggest that frontal cortex adapts through learning to combine evidence across sensory cortices, providing a signal that is transformed into a binary decision by a downstream accumulator.
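
The accumulator model mentioned above can be illustrated with a minimal additive drift-diffusion sketch, in which visual and auditory evidence contribute additively to a single decision variable that drifts toward one of two bounds. This is a generic toy simulation under assumed weights, noise, and bounds, not the fitted model from the paper.

```python
# Minimal additive evidence-accumulator sketch (generic; all parameters are
# assumptions, not fitted values from the paper). Visual and auditory evidence
# sum into one decision variable that diffuses to a left/right bound.
import numpy as np

def simulate_trial(vis_evidence, aud_evidence, w_vis=1.0, w_aud=1.0,
                   noise_sd=0.5, bound=1.0, dt=0.001, max_t=2.0, rng=None):
    rng = rng or np.random.default_rng()
    drift = w_vis * vis_evidence + w_aud * aud_evidence   # additive combination
    dv, t = 0.0, 0.0
    while abs(dv) < bound and t < max_t:
        dv += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = "right" if dv >= bound else ("left" if dv <= -bound else "no-response")
    return choice, t

rng = np.random.default_rng(1)
# Congruent audiovisual trial: both cues favor 'right', so choices should be more
# accurate (and faster) than with either cue alone.
choices = [simulate_trial(0.4, 0.4, rng=rng)[0] for _ in range(500)]
print("P(right | congruent AV):", choices.count("right") / len(choices))
```
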
Affiliation(s)
- Philip Coen: UCL Queen Square Institute of Neurology, University College London, London, UK; UCL Institute of Ophthalmology, University College London, London, UK
- Timothy P H Sit: Sainsbury Wellcome Centre, University College London, London, UK
- Miles J Wells: UCL Queen Square Institute of Neurology, University College London, London, UK
- Matteo Carandini: UCL Institute of Ophthalmology, University College London, London, UK
- Kenneth D Harris: UCL Queen Square Institute of Neurology, University College London, London, UK

17. Dorman R, Bos JJ, Vinck MA, Marchesi P, Fiorilli J, Lorteije JAM, Reiten I, Bjaalie JG, Okun M, Pennartz CMA. Spike-based coupling between single neurons and populations across rat sensory cortices, perirhinal cortex, and hippocampus. Cereb Cortex 2023; 33:8247-8264. [PMID: 37118890; PMCID: PMC10425201; DOI: 10.1093/cercor/bhad111] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5]
Abstract
Cortical computations require coordination of neuronal activity within and across multiple areas. We characterized spiking relationships within and between areas by quantifying coupling of single neurons to population firing patterns. Single-neuron population coupling (SNPC) was investigated using ensemble recordings from hippocampal CA1 region and somatosensory, visual, and perirhinal cortices. Within-area coupling was heterogeneous across structures, with area CA1 showing higher levels than neocortical regions. In contrast to known anatomical connectivity, between-area coupling showed strong firing coherence of sensory neocortices with CA1, but less with perirhinal cortex. Cells in sensory neocortices and CA1 showed positive correlations between within- and between-area coupling; these were weaker for perirhinal cortex. All four areas harbored broadcasting cells, connecting to multiple external areas, which was uncorrelated to within-area coupling strength. When examining correlations between SNPC and spatial coding, we found that, if such correlations were significant, they were negative. This result was consistent with an overall preservation of SNPC across different brain states, suggesting a strong dependence on intrinsic network connectivity. Overall, SNPC offers an important window on cell-to-population synchronization in multi-area networks. Instead of pointing to specific information-coding functions, our results indicate a primary function of SNPC in dynamically organizing communication in systems composed of multiple, interconnected areas.
Affiliation(s)
- Reinder Dorman: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands
- Jeroen J Bos: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands; Donders Institute for Brain, Cognition and Behavior, Radboud University, 6500 HC Nijmegen, The Netherlands
- Martin A Vinck: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands; Ernst Strüngmann Institute for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt, Germany
- Pietro Marchesi: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands
- Julien Fiorilli: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands
- Jeanette A M Lorteije: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands
- Ingrid Reiten: Institute of Basic Medical Sciences, University of Oslo, NO-0316 Oslo, Norway
- Jan G Bjaalie: Institute of Basic Medical Sciences, University of Oslo, NO-0316 Oslo, Norway
- Michael Okun: Department of Psychology and Neuroscience Institute, University of Sheffield, Sheffield S10 2TN, UK
- Cyriel M A Pennartz: Systems and Cognitive Neuroscience Group, SILS Center for Neuroscience, University of Amsterdam, 1098 XH Amsterdam, The Netherlands

18. Klaver LMF, Brinkhof LP, Sikkens T, Casado-Román L, Williams AG, van Mourik-Donga L, Mejías JF, Pennartz CMA, Bosman CA. Spontaneous variations in arousal modulate subsequent visual processing and local field potential dynamics in the ferret during quiet wakefulness. Cereb Cortex 2023; 33:7564-7581. [PMID: 36935096; PMCID: PMC10267643; DOI: 10.1093/cercor/bhad061] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5]
Abstract
Behavioral states affect neuronal responses throughout the cortex and influence visual processing. Quiet wakefulness (QW) is a behavioral state during which subjects are quiescent but awake and connected to the environment. Here, we examined the effects of pre-stimulus arousal variability on post-stimulus neural activity in the primary visual cortex and posterior parietal cortex in awake ferrets, using pupil diameter as an indicator of arousal. We observed that the power of stimulus-induced alpha (8-12 Hz) decreases when the arousal level increases. The peak of alpha power shifts depending on arousal. High arousal increases inter- and intra-areal coherence. Using a simplified model of laminar circuits, we show that this connectivity pattern is compatible with feedback signals targeting infragranular layers in area posterior parietal cortex and supragranular layers in V1. During high arousal, neurons in V1 displayed higher firing rates at their preferred orientations. Broad-spiking cells in V1 are entrained to high-frequency oscillations (>80 Hz), whereas narrow-spiking neurons are phase-locked to low- (12-18 Hz) and high-frequency (>80 Hz) rhythms. These results indicate that the variability and sensitivity of post-stimulus cortical responses and coherence depend on the pre-stimulus behavioral state and account for the neuronal response variability observed during repeated stimulation.
Affiliation(s)
- Lianne M F Klaver: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Lotte P Brinkhof: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Tom Sikkens: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Lorena Casado-Román: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Alex G Williams: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Laura van Mourik-Donga: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands
- Jorge F Mejías: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Cyriel M A Pennartz: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Conrado A Bosman: Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098 XH Amsterdam, The Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands

19. Mertens PEC, Marchesi P, Ruikes TR, Oude Lohuis M, Krijger Q, Pennartz CMA, Lansink CS. Coherent mapping of position and head direction across auditory and visual cortex. Cereb Cortex 2023; 33:7369-7385. [PMID: 36967108; PMCID: PMC10267650; DOI: 10.1093/cercor/bhad045] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5]
Abstract
Neurons in primary visual cortex (V1) may not only signal current visual input but also relevant contextual information such as reward expectancy and the subject's spatial position. Such contextual representations need not be restricted to V1 but could participate in a coherent mapping throughout sensory cortices. Here, we show that spiking activity coherently represents a location-specific mapping across auditory cortex (AC) and lateral, secondary visual cortex (V2L) of freely moving rats engaged in a sensory detection task on a figure-8 maze. Single-unit activity of both areas showed extensive similarities in terms of spatial distribution, reliability, and position coding. Importantly, reconstructions of subject position based on spiking activity displayed decoding errors that were correlated between areas. Additionally, we found that head direction, but not locomotor speed or head angular velocity, was an important determinant of activity in AC and V2L. By contrast, variables related to the sensory task cues or to trial correctness and reward were not markedly encoded in AC and V2L. We conclude that sensory cortices participate in coherent, multimodal representations of the subject's sensory-specific location. These may provide a common reference frame for distributed cortical sensory and motor processes and may support crossmodal predictive processing.
Affiliation(s)
- Paul E C Mertens: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands
- Pietro Marchesi: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands
- Thijs R Ruikes: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands
- Matthijs Oude Lohuis: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands; Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Quincy Krijger: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands
- Cyriel M A Pennartz: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands
- Carien S Lansink: Center for Neuroscience, Faculty of Science, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, Amsterdam 1098 XH, The Netherlands

20. Muller M, Pennartz CMA, Bosman CA, Olcese U. A novel task to investigate vibrotactile detection in mice. PLoS One 2023; 18:e0284735. [PMID: 37079581; PMCID: PMC10118142; DOI: 10.1371/journal.pone.0284735] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Throughout the last decades, understanding the neural mechanisms of sensory processing has been a key objective for neuroscientists. Many studies focused on uncovering the microcircuit-level architecture of somatosensation using the rodent whisker system as a model. Although these studies have significantly advanced our understanding of tactile processing, the question remains to what extent the whisker system can provide results translatable to the human somatosensory system. To address this, we developed a restrained vibrotactile detection task involving the limb system in mice. A vibrotactile stimulus was delivered to the hindlimb of head-fixed mice, who were trained to perform a Go/No-go detection task. Mice were able to learn this task with satisfactory performance and with reasonably short training times. In addition, the task we developed is versatile, as it can be combined with diverse neuroscience methods. Thus, this study introduces a novel task to study the neuron-level mechanisms of tactile processing in a system other than the more commonly studied whisker system.
Affiliation(s)
- Mariel Muller: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Cyriel M. A. Pennartz: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Conrado A. Bosman: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Umberto Olcese: Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands

21. Williams AM, Angeloni CF, Geffen MN. Sound Improves Neuronal Encoding of Visual Stimuli in Mouse Primary Visual Cortex. J Neurosci 2023; 43:2885-2906. [PMID: 36944489; PMCID: PMC10124961; DOI: 10.1523/jneurosci.2444-21.2023] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5]
Abstract
In everyday life, we integrate visual and auditory information in routine tasks such as navigation and communication. While concurrent sound can improve visual perception, the neuronal correlates of audiovisual integration are not fully understood. Specifically, it remains unclear whether neuronal firing patterns in the primary visual cortex (V1) of awake animals demonstrate similar sound-induced improvement in visual discriminability. Furthermore, presentation of sound is associated with movement in the subjects, but little is understood about whether and how sound-associated movement affects audiovisual integration in V1. Here, we investigated how sound and movement interact to modulate V1 visual responses in awake, head-fixed mice and whether this interaction improves neuronal encoding of the visual stimulus. We presented visual drifting gratings with and without simultaneous auditory white noise to awake mice while recording mouse movement and V1 neuronal activity. Sound modulated activity of 80% of light-responsive neurons, with 95% of neurons increasing activity when the auditory stimulus was present. A generalized linear model (GLM) revealed that sound and movement had distinct and complementary effects on the neuronal visual responses. Furthermore, decoding of the visual stimulus from the neuronal activity was improved with sound, an effect that persisted even when controlling for movement. These results demonstrate that sound and movement modulate visual responses in complementary ways, improving neuronal representation of the visual stimulus. This study clarifies the role of movement as a potential confound in neuronal audiovisual responses and expands our knowledge of how multimodal processing is mediated at a neuronal level in the awake brain.
SIGNIFICANCE STATEMENT: Sound and movement are both known to modulate visual responses in the primary visual cortex; however, sound-induced movement has largely remained unaccounted for as a potential confound in audiovisual studies in awake animals. Here, the authors found that sound and movement both modulate visual responses in an important visual brain area, the primary visual cortex, in distinct, yet complementary ways. Furthermore, sound improved encoding of the visual stimulus even when accounting for movement. This study reconciles contrasting theories on the mechanism underlying audiovisual integration and asserts the primary visual cortex as a key brain region participating in tripartite sensory interactions.
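
A minimal sketch of the kind of GLM analysis described above: single-neuron spike counts modeled as a Poisson function of visual, sound, and movement regressors, so that their contributions can be estimated separately. The regressors, simulated data, and use of statsmodels are assumptions made for illustration, not the authors' implementation.

```python
# Minimal Poisson-GLM sketch separating visual, sound, and movement contributions
# to single-neuron activity. Generic illustration; the design-matrix contents and
# the use of statsmodels are assumptions, not the authors' implementation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_trials = 400
visual = rng.integers(0, 2, n_trials)      # drifting grating present (0/1)
sound = rng.integers(0, 2, n_trials)       # white-noise burst present (0/1)
movement = rng.gamma(2.0, 1.0, n_trials)   # e.g., locomotion/whisking energy per trial

# Simulated spike counts with distinct visual, sound, and movement effects.
rate = np.exp(0.2 + 0.8 * visual + 0.4 * sound + 0.15 * movement)
spikes = rng.poisson(rate)

X = sm.add_constant(np.column_stack([visual, sound, movement]))
glm = sm.GLM(spikes, X, family=sm.families.Poisson()).fit()
print(glm.params)  # separate coefficients for the visual, sound, and movement terms
```
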
Collapse
Affiliation(s)
- Aaron M Williams
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
- Department of Neuroscience, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
- Department of Neurology, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
| | - Christopher F Angeloni
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
- Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania 19104
| | - Maria N Geffen
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
- Department of Neuroscience, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
- Department of Neurology, University of Pennsylvania, Philadelphia, Pennsylvania, 19104
| |
Collapse
|
22
|
Bimbard C, Sit TPH, Lebedeva A, Reddy CB, Harris KD, Carandini M. Behavioral origin of sound-evoked activity in mouse visual cortex. Nat Neurosci 2023; 26:251-258. [PMID: 36624279 PMCID: PMC9905016 DOI: 10.1038/s41593-022-01227-x] [Citation(s) in RCA: 42] [Impact Index Per Article: 21.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2021] [Accepted: 10/31/2022] [Indexed: 01/10/2023]
Abstract
Sensory cortices can be affected by stimuli of multiple modalities and are thus increasingly thought to be multisensory. For instance, primary visual cortex (V1) is influenced not only by images but also by sounds. Here we show that the activity evoked by sounds in V1, measured with Neuropixels probes, is stereotyped across neurons and even across mice. It is independent of projections from auditory cortex and resembles activity evoked in the hippocampal formation, which receives little direct auditory input. Its low-dimensional nature starkly contrasts the high-dimensional code that V1 uses to represent images. Furthermore, this sound-evoked activity can be precisely predicted by small body movements that are elicited by each sound and are stereotyped across trials and mice. Thus, neural activity that is apparently multisensory may simply arise from low-dimensional signals associated with internal state and behavior.
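A hedged sketch of the kind of movement-based prediction described above, using synthetic data and an assumed pipeline (motion-energy principal components, cross-validated ridge regression) rather than the authors' actual methods:

```python
# Illustrative sketch: how much of sound-evoked V1 activity can be predicted
# from body-movement features (e.g., PCs of facial motion energy)?
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_motion_pix, n_neurons = 400, 200, 50
motion_energy = rng.normal(size=(n_trials, n_motion_pix))          # video-derived features
movement_pcs = PCA(n_components=10).fit_transform(motion_energy)   # low-dimensional movement

# Synthetic "sound-evoked" population response driven mostly by movement
W = rng.normal(size=(10, n_neurons))
neural = movement_pcs @ W + 0.5 * rng.normal(size=(n_trials, n_neurons))

r2 = [cross_val_score(Ridge(alpha=1.0), movement_pcs, neural[:, i], cv=5).mean()
      for i in range(n_neurons)]
print("median cross-validated R^2 from movement alone:", np.median(r2))
```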
Collapse
Affiliation(s)
- Célian Bimbard
- UCL Institute of Ophthalmology, University College London, London, UK.
| | - Timothy P H Sit
- Sainsbury Wellcome Centre, University College London, London, UK
- UCL Queen Square Institute of Neurology, University College London, London, UK
| | - Anna Lebedeva
- Sainsbury Wellcome Centre, University College London, London, UK
- UCL Queen Square Institute of Neurology, University College London, London, UK
| | - Charu B Reddy
- UCL Institute of Ophthalmology, University College London, London, UK
| | - Kenneth D Harris
- UCL Queen Square Institute of Neurology, University College London, London, UK
| | - Matteo Carandini
- UCL Institute of Ophthalmology, University College London, London, UK
| |
Collapse
|
23
|
Idris A, Christensen BA, Walker EM, Maier JX. Multisensory integration of orally-sourced gustatory and olfactory inputs to the posterior piriform cortex in awake rats. J Physiol 2023; 601:151-169. [PMID: 36385245 PMCID: PMC9869978 DOI: 10.1113/jp283873] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2022] [Accepted: 11/09/2022] [Indexed: 11/18/2022] Open
Abstract
Flavour refers to the sensory experience of food, which is a combination of sensory inputs sourced from multiple modalities during consumption, including taste and odour. Previous work has demonstrated that orally-sourced taste and odour cues interact to determine perceptual judgements of flavour stimuli, although the underlying cellular- and circuit-level neural mechanisms remain unknown. We recently identified a region of the piriform olfactory cortex in rats that responds to both taste and odour stimuli. Here, we investigated how converging taste and odour inputs to this area interact to affect single neuron responsiveness and ensemble coding of flavour identity. To accomplish this, we recorded spiking activity from ensembles of single neurons in the posterior piriform cortex (pPC) in awake, tasting rats while delivering taste solutions, odour solutions and taste + odour mixtures directly into the oral cavity. Our results show that taste and odour inputs evoke highly selective, temporally-overlapping responses in multisensory pPC neurons. Comparing responses to mixtures and their unisensory components revealed that taste and odour inputs interact in a non-linear manner to produce unique response patterns. Taste input enhances trial-by-trial decoding of odour identity from small ensembles of simultaneously recorded neurons. Together, these results demonstrate that taste and odour inputs to pPC interact in complex, non-linear ways to form amodal flavour representations that enhance identity coding. KEY POINTS: Experience of food involves taste and smell, although how information from these different senses is combined by the brain to create our sense of flavour remains unknown. We recorded from small groups of neurons in the olfactory cortex of awake rats while they consumed taste solutions, odour solutions and taste + odour mixtures. Taste and smell solutions evoke highly selective responses. When presented in a mixture, taste and smell inputs interacted to alter responses, resulting in activation of unique sets of neurons that could not be predicted by the component responses. Synergistic interactions increase discriminability of odour representations. The olfactory cortex uses taste and smell to create new information representing multisensory flavour identity.
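The decoding comparison described above could, in spirit, look like the following sketch; the ensemble size, odour count, and simulated effect sizes are assumptions, not values from the paper:

```python
# Illustrative sketch: trial-by-trial decoding of odour identity from a small
# ensemble, with and without a concurrent taste (sizes and gains are assumed).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_neurons, n_odours = 200, 12, 4
odour = rng.integers(0, n_odours, n_trials)

def simulate(signal_gain):
    # each odour drives a distinct ensemble pattern plus trial-to-trial noise
    tuning = rng.normal(size=(n_odours, n_neurons))
    return signal_gain * tuning[odour] + rng.normal(size=(n_trials, n_neurons))

odour_alone = simulate(0.6)           # odour delivered alone
odour_plus_taste = simulate(0.9)      # same odour paired with a tastant

clf = LogisticRegression(max_iter=1000)
for name, X in [("odour alone", odour_alone), ("odour + taste", odour_plus_taste)]:
    acc = cross_val_score(clf, X, odour, cv=5).mean()
    print(f"{name}: decoding accuracy = {acc:.2f} (chance = {1/n_odours:.2f})")
```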
Collapse
Affiliation(s)
- Ammar Idris
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Brooke A. Christensen
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Ellen M. Walker
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Joost X. Maier
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| |
Collapse
|
24
|
Existing function in primary visual cortex is not perturbed by new skill acquisition of a non-matched sensory task. Nat Commun 2022; 13:3638. [PMID: 35752622 PMCID: PMC9233699 DOI: 10.1038/s41467-022-31440-y] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2021] [Accepted: 06/16/2022] [Indexed: 02/07/2023] Open
Abstract
Acquisition of new skills has the potential to disturb existing network function. To directly assess whether previously acquired cortical function is altered during learning, mice were trained in an abstract task in which selected activity patterns were rewarded using an optical brain-computer interface device coupled to primary visual cortex (V1) neurons. Excitatory neurons were longitudinally recorded using 2-photon calcium imaging. Despite significant changes in local neural activity during task performance, tuning properties and stimulus encoding assessed outside of the trained context were not perturbed. Similarly, stimulus tuning was stable in neurons that remained responsive following a different, visual discrimination training task. However, visual discrimination training increased the rate of representational drift. Our results indicate that while some forms of perceptual learning may modify the contribution of individual neurons to stimulus encoding, new skill learning is not inherently disruptive to the quality of stimulus representation in adult V1.
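An illustrative way to quantify the representational drift mentioned above is to correlate each neuron's tuning curve across sessions; the sketch below uses synthetic drift rates, not the study's data:

```python
# Illustrative sketch: representational drift as the decline in tuning-curve
# correlation across imaging sessions (all values are synthetic).
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_orientations, n_days = 100, 8, 5
tuning_day0 = rng.normal(size=(n_neurons, n_orientations))

def drift(tuning, noise):
    # each day, tuning curves are partially re-drawn ("drift")
    return (1 - noise) * tuning + noise * rng.normal(size=tuning.shape)

def mean_tuning_corr(a, b):
    return np.nanmean([np.corrcoef(a[i], b[i])[0, 1] for i in range(len(a))])

for label, noise in [("BCI-trained (stable)", 0.1), ("visually trained (faster drift)", 0.3)]:
    tuning = tuning_day0.copy()
    for _ in range(n_days):
        tuning = drift(tuning, noise)
    print(label, "day-0 vs day-5 tuning correlation:",
          round(mean_tuning_corr(tuning_day0, tuning), 2))
```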
Collapse
|
25
|
Corbo J, McClure JP, Erkat OB, Polack PO. Dynamic Distortion of Orientation Representation after Learning in the Mouse Primary Visual Cortex. J Neurosci 2022; 42:4311-4325. [PMID: 35477902 PMCID: PMC9145234 DOI: 10.1523/jneurosci.2272-21.2022] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/16/2021] [Revised: 01/24/2022] [Accepted: 02/13/2022] [Indexed: 11/21/2022] Open
Abstract
Learning is an essential cognitive mechanism allowing behavioral adaptation through adjustments in neuronal processing. It is associated with changes in the activity of sensory cortical neurons evoked by task-relevant stimuli. However, the exact nature of those modifications and the computational advantages they may confer are still debated. Here, we investigated how learning an orientation discrimination task alters the neuronal representations of the cue orientations in the primary visual cortex (V1) of male and female mice. When comparing the activity evoked by the task stimuli in naive mice and the mice performing the task, we found that the representations of the orientation of the rewarded and nonrewarded cues were more accurate and stable in trained mice. This better cue representation in trained mice was associated with a distortion of the orientation representation space such that stimuli flanking the task-relevant orientations were represented as the task stimuli themselves, suggesting that those stimuli were generalized as the task cues. This distortion was context dependent as it was absent in trained mice passively viewing the task cues and enhanced in the behavioral sessions where mice performed best. Those modifications of the V1 population orientation representation in performing mice were supported by a suppression of the activity of neurons tuned for orientations neighboring the orientations of the task cues. Thus, visual processing in V1 is dynamically adapted to enhance the reliability of the representation of the learned cues and favor generalization in the task-relevant computational space. SIGNIFICANCE STATEMENT Performance improvement in a task often requires facilitating the extraction of the information necessary for its execution. Here, we demonstrate the existence of a suppression mechanism that improves the representation of the orientations of the task stimuli in the V1 of mice performing an orientation discrimination task. We also show that this mechanism distorts the V1 orientation representation space, leading stimuli flanking the task-stimulus orientations to be generalized as the task stimuli themselves.
Collapse
Affiliation(s)
- Julien Corbo
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey 07102
| | - John P McClure
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey 07102
- Behavioral and Neural Sciences Graduate Program, Rutgers University-Newark, Newark, New Jersey 07102
| | - O Batuhan Erkat
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey 07102
- Behavioral and Neural Sciences Graduate Program, Rutgers University-Newark, Newark, New Jersey 07102
| | - Pierre-Olivier Polack
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey 07102
| |
Collapse
|
26
|
Bigelow J, Morrill RJ, Olsen T, Hasenstaub AR. Visual modulation of firing and spectrotemporal receptive fields in mouse auditory cortex. CURRENT RESEARCH IN NEUROBIOLOGY 2022; 3:100040. [PMID: 36518337 PMCID: PMC9743056 DOI: 10.1016/j.crneur.2022.100040] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/08/2022] [Revised: 04/26/2022] [Accepted: 05/06/2022] [Indexed: 10/18/2022] Open
Abstract
Recent studies have established significant anatomical and functional connections between visual areas and primary auditory cortex (A1), which may be important for cognitive processes such as communication and spatial perception. These studies have raised two important questions: First, which cell populations in A1 respond to visual input and/or are influenced by visual context? Second, which aspects of sound encoding are affected by visual context? To address these questions, we recorded single-unit activity across cortical layers in awake mice during exposure to auditory and visual stimuli. Neurons responsive to visual stimuli were most prevalent in the deep cortical layers and included both excitatory and inhibitory cells. The overwhelming majority of these neurons also responded to sound, indicating unimodal visual neurons are rare in A1. Other neurons for which sound-evoked responses were modulated by visual context were similarly excitatory or inhibitory but more evenly distributed across cortical layers. These modulatory influences almost exclusively affected sustained sound-evoked firing rate (FR) responses or spectrotemporal receptive fields (STRFs); transient FR changes at stimulus onset were rarely modified by visual context. Neuron populations with visually modulated STRFs and sustained FR responses were mostly non-overlapping, suggesting spectrotemporal feature selectivity and overall excitability may be differentially sensitive to visual context. The effects of visual modulation were heterogeneous, increasing and decreasing STRF gain in roughly equal proportions of neurons. Our results indicate visual influences are surprisingly common and diversely expressed throughout layers and cell types in A1, affecting nearly one in five neurons overall.
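For readers unfamiliar with spectrotemporal receptive fields, the sketch below shows one common way to estimate them (ridge regression on a time-lagged spectrogram); it is a generic illustration, not the authors' pipeline:

```python
# Illustrative sketch: estimating a spectrotemporal receptive field (STRF) by
# ridge-regressing spike counts onto the time-lagged stimulus spectrogram.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
n_t, n_freq, n_lag = 5000, 16, 20
spectrogram = rng.normal(size=(n_t, n_freq))

# Lagged design matrix: each row holds the last n_lag spectrogram frames
X = np.zeros((n_t, n_freq * n_lag))
for lag in range(n_lag):
    X[lag:, lag * n_freq:(lag + 1) * n_freq] = spectrogram[: n_t - lag]

true_strf = rng.normal(size=n_freq * n_lag) * (rng.random(n_freq * n_lag) < 0.05)
spikes = rng.poisson(np.exp(0.2 * (X @ true_strf)))

strf = Ridge(alpha=10.0).fit(X, spikes).coef_.reshape(n_lag, n_freq).T  # freq x lag
print("STRF shape (freq x lag):", strf.shape)
# Visual modulation could then be assessed by fitting separate STRFs for
# sound-only vs sound+visual trials and comparing their gain (e.g., peak amplitude).
```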
Collapse
Affiliation(s)
- James Bigelow
- Coleman Memorial Laboratory, University of California, San Francisco, USA
- Department of Otolaryngology–Head and Neck Surgery, University of California, San Francisco, 94143, USA
| | - Ryan J. Morrill
- Coleman Memorial Laboratory, University of California, San Francisco, USA
- Neuroscience Graduate Program, University of California, San Francisco, USA
- Department of Otolaryngology–Head and Neck Surgery, University of California, San Francisco, 94143, USA
| | - Timothy Olsen
- Coleman Memorial Laboratory, University of California, San Francisco, USA
- Department of Otolaryngology–Head and Neck Surgery, University of California, San Francisco, 94143, USA
| | - Andrea R. Hasenstaub
- Coleman Memorial Laboratory, University of California, San Francisco, USA
- Neuroscience Graduate Program, University of California, San Francisco, USA
- Department of Otolaryngology–Head and Neck Surgery, University of California, San Francisco, 94143, USA
| |
Collapse
|
27
|
McClure JP, Erkat OB, Corbo J, Polack PO. Estimating How Sounds Modulate Orientation Representation in the Primary Visual Cortex Using Shallow Neural Networks. Front Syst Neurosci 2022; 16:869705. [PMID: 35615425 PMCID: PMC9124944 DOI: 10.3389/fnsys.2022.869705] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2022] [Accepted: 04/07/2022] [Indexed: 12/15/2022] Open
Abstract
Audiovisual perception results from the interaction between visual and auditory processing. Hence, presenting auditory and visual inputs simultaneously usually improves the accuracy of the unimodal percepts, but can also lead to audiovisual illusions. Cross-talks between visual and auditory inputs during sensory processing were recently shown to occur as early as in the primary visual cortex (V1). In a previous study, we demonstrated that sounds improve the representation of the orientation of visual stimuli in the naïve mouse V1 by promoting the recruitment of neurons better tuned to the orientation and direction of the visual stimulus. However, we did not test if this type of modulation was still present when the auditory and visual stimuli were both behaviorally relevant. To determine the effect of sounds on active visual processing, we performed calcium imaging in V1 while mice were performing an audiovisual task. We then compared the representations of the task stimuli orientations in the unimodal visual and audiovisual context using shallow neural networks (SNNs). SNNs were chosen because of the biological plausibility of their computational structure and the possibility of identifying post hoc the biological neurons having the strongest influence on the classification decision. We first showed that SNNs can categorize the activity of V1 neurons evoked by drifting gratings of 12 different orientations. Then, we demonstrated using the connection weight approach that SNN training assigns the largest computational weight to the V1 neurons having the best orientation and direction selectivity. Finally, we showed that it is possible to use SNNs to determine how V1 neurons represent the orientations of stimuli that do not belong to the set of orientations used for SNN training. Once the SNN approach was established, we replicated the previous finding that sounds improve orientation representation in the V1 of naïve mice. Then, we showed that, in mice performing an audiovisual detection task, task tones improve the representation of the visual cues associated with the reward while deteriorating the representation of non-rewarded cues. Altogether, our results suggest that the direction of sound modulation in V1 depends on the behavioral relevance of the visual cue.
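A minimal sketch of the shallow-network plus connection-weight approach described above (synthetic population activity; the architecture and hyperparameters are assumptions, not those of the study):

```python
# Illustrative sketch: classify grating orientation from V1 population activity
# with a one-hidden-layer network, then rank neurons by connection-weight products.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_trials, n_neurons, n_orient = 960, 80, 12
orientation = rng.integers(0, n_orient, n_trials)
tuning = rng.normal(size=(n_orient, n_neurons))
X = tuning[orientation] + rng.normal(size=(n_trials, n_neurons))

X_tr, X_te, y_tr, y_te = train_test_split(X, orientation, test_size=0.2, random_state=0)
snn = MLPClassifier(hidden_layer_sizes=(24,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", snn.score(X_te, y_te))

# Connection-weight idea: neuron j's influence on class c is the sum over hidden
# units of w_in(j,h) * w_out(h,c); magnitudes are then summed across classes.
influence = np.abs(snn.coefs_[0] @ snn.coefs_[1]).sum(axis=1)
print("five most influential neurons:", np.argsort(influence)[::-1][:5])
```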
Collapse
Affiliation(s)
- John P. McClure
- Center for Molecular and Behavioral Neuroscience, Rutgers University–Newark, Newark, NJ, United States
- Behavioral and Neural Sciences Graduate Program, Rutgers University–Newark, Newark, NJ, United States
| | - O. Batuhan Erkat
- Center for Molecular and Behavioral Neuroscience, Rutgers University–Newark, Newark, NJ, United States
- Behavioral and Neural Sciences Graduate Program, Rutgers University–Newark, Newark, NJ, United States
| | - Julien Corbo
- Center for Molecular and Behavioral Neuroscience, Rutgers University–Newark, Newark, NJ, United States
| | - Pierre-Olivier Polack
- Center for Molecular and Behavioral Neuroscience, Rutgers University–Newark, Newark, NJ, United States
| |
Collapse
|
28
|
Samuelsen CL, Vincis R. Cortical Hub for Flavor Sensation in Rodents. Front Syst Neurosci 2021; 15:772286. [PMID: 34867223 PMCID: PMC8636119 DOI: 10.3389/fnsys.2021.772286] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2021] [Accepted: 10/21/2021] [Indexed: 01/05/2023] Open
Abstract
The experience of eating is inherently multimodal, combining intraoral gustatory, olfactory, and somatosensory signals into a single percept called flavor. As foods and beverages enter the mouth, movements associated with chewing and swallowing activate somatosensory receptors in the oral cavity, dissolve tastants in the saliva to activate taste receptors, and release volatile odorant molecules to retronasally activate olfactory receptors in the nasal epithelium. Human studies indicate that sensory cortical areas are important for intraoral multimodal processing, yet their circuit-level mechanisms remain unclear. Animal models allow for detailed analyses of neural circuits due to the large number of molecular tools available for tracing and neuronal manipulations. In this review, we concentrate on the anatomical and neurophysiological evidence from rodent models toward a better understanding of the circuit-level mechanisms underlying the cortical processing of flavor. While more work is needed, the emerging view pertaining to the multimodal processing of food and beverages is that the piriform, gustatory, and somatosensory cortical regions do not function solely as independent areas. Rather they act as an intraoral cortical hub, simultaneously receiving and processing multimodal sensory information from the mouth to produce the rich and complex flavor experience that guides consummatory behavior.
Collapse
Affiliation(s)
- Chad L Samuelsen
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY, United States
| | - Roberto Vincis
- Department of Biological Science and Program in Neuroscience, Florida State University, Tallahassee, FL, United States
| |
Collapse
|
29
|
Rezaul Karim AKM, Proulx MJ, de Sousa AA, Likova LT. Neuroplasticity and Crossmodal Connectivity in the Normal, Healthy Brain. PSYCHOLOGY & NEUROSCIENCE 2021; 14:298-334. [PMID: 36937077 PMCID: PMC10019101 DOI: 10.1037/pne0000258] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Objective: Neuroplasticity enables the brain to establish new crossmodal connections or reorganize old connections which are essential to perceiving a multisensorial world. The intent of this review is to identify and summarize the current developments in neuroplasticity and crossmodal connectivity, and deepen understanding of how crossmodal connectivity develops in the normal, healthy brain, highlighting novel perspectives about the principles that guide this connectivity. Methods: To the above end, a narrative review is carried out. The data documented in prior relevant studies in neuroscience, psychology and other related fields available in a wide range of prominent electronic databases are critically assessed, synthesized, interpreted with qualitative rather than quantitative elements, and linked together to form new propositions and hypotheses about neuroplasticity and crossmodal connectivity. Results: Three major themes are identified. First, it appears that neuroplasticity operates by following eight fundamental principles and crossmodal integration operates by following three principles. Second, two different forms of crossmodal connectivity, namely direct crossmodal connectivity and indirect crossmodal connectivity, are suggested to operate in both unisensory and multisensory perception. Third, three principles possibly guide the development of crossmodal connectivity into adulthood. These are labeled as the principle of innate crossmodality, the principle of evolution-driven 'neuromodular' reorganization and the principle of multimodal experience. These principles are combined to develop a three-factor interaction model of crossmodal connectivity. Conclusions: The hypothesized principles and the proposed model together advance understanding of neuroplasticity, the nature of crossmodal connectivity, and how such connectivity develops in the normal, healthy brain.
Collapse
|
30
|
Han X, Xu J, Chang S, Keniston L, Yu L. Multisensory-Guided Associative Learning Enhances Multisensory Representation in Primary Auditory Cortex. Cereb Cortex 2021; 32:1040-1054. [PMID: 34378017 DOI: 10.1093/cercor/bhab264] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2021] [Revised: 07/13/2021] [Accepted: 07/15/2021] [Indexed: 11/12/2022] Open
Abstract
Sensory cortices, classically considered to represent modality-specific sensory information, are also found to engage in multisensory processing. However, how sensory processing in sensory cortices is cross-modally modulated remains an open question. Specifically, we understand little of cross-modal representation in sensory cortices in perceptual tasks and how perceptual learning modifies this process. Here, we recorded neural responses in primary auditory cortex (A1) both while freely moving rats discriminated stimuli in Go/No-Go tasks and when anesthetized. Our data show that cross-modal representation in auditory cortices varies with task contexts. In the task in which an audiovisual cue was the target associated with water reward, a significantly higher proportion of auditory neurons showed a visually evoked response. The vast majority of auditory neurons, if processing auditory-visual interactions, exhibit significant multisensory enhancement. However, when the rats performed tasks with unisensory cues being the target, cross-modal inhibition, rather than enhancement, predominated. In addition, multisensory associational learning appeared to leave a trace of plastic change in A1, as a larger proportion of A1 neurons showed multisensory enhancement under anesthesia. These findings indicate that multisensory processing in principal sensory cortices is not static, and having cross-modal interaction in the task requirement can substantially enhance multisensory processing in sensory cortices.
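The enhancement versus inhibition contrast described above is typically quantified with a multisensory enhancement index; the following is a generic sketch with synthetic responses, not the study's data:

```python
# Illustrative sketch: multisensory enhancement index contrasting the audiovisual
# response with the strongest unisensory response, computed per neuron.
import numpy as np

rng = np.random.default_rng(6)
n_neurons = 200
resp_A = rng.gamma(2.0, 1.0, n_neurons)    # mean auditory-evoked firing rate
resp_V = rng.gamma(1.5, 1.0, n_neurons)    # mean visual-evoked firing rate
resp_AV = resp_A + resp_V * rng.uniform(0.2, 1.2, n_neurons)  # audiovisual response

best_unisensory = np.maximum(resp_A, resp_V)
enhancement = 100 * (resp_AV - best_unisensory) / best_unisensory   # percent change

print("fraction of neurons with multisensory enhancement:", np.mean(enhancement > 0))
print("fraction with cross-modal suppression:", np.mean(enhancement < 0))
```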
Collapse
Affiliation(s)
- Xiao Han
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai) School of Life Sciences, East China Normal University, Shanghai 200062, China
| | - Jinghong Xu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai) School of Life Sciences, East China Normal University, Shanghai 200062, China
| | - Song Chang
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai) School of Life Sciences, East China Normal University, Shanghai 200062, China
| | - Les Keniston
- Department of Physical Therapy, University of Maryland Eastern Shore, Princess Anne, MD 21853, USA
| | - Liping Yu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai) School of Life Sciences, East China Normal University, Shanghai 200062, China; Key Laboratory of Adolescent Health Assessment and Exercise Intervention of Ministry of Education, School of Life Sciences, East China Normal University, Shanghai 200062, China
| |
Collapse
|
31
|
Meijer GT, Marchesi P, Mejias JF, Montijn JS, Lansink CS, Pennartz CMA. Neural Correlates of Multisensory Detection Behavior: Comparison of Primary and Higher-Order Visual Cortex. Cell Rep 2020; 31:107636. [PMID: 32402272 DOI: 10.1016/j.celrep.2020.107636] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2019] [Revised: 01/10/2020] [Accepted: 04/21/2020] [Indexed: 01/16/2023] Open
Abstract
We act upon stimuli in our surrounding environment by gathering the multisensory information they convey and by integrating this information to decide on a behavioral action. We hypothesized that the anterolateral secondary visual cortex (area AL) of the mouse brain may serve as a hub for sensorimotor transformation of audiovisual information. We imaged neuronal activity in primary visual cortex (V1) and AL of the mouse during a detection task using visual, auditory, and audiovisual stimuli. We found that AL neurons were more sensitive to weak uni- and multisensory stimuli compared to V1. Depending on contrast, different subsets of AL and V1 neurons showed cross-modal modulation of visual responses. During audiovisual stimulation, AL neurons showed stronger differentiation of behaviorally reported versus unreported stimuli compared to V1, whereas V1 showed this distinction during unisensory visual stimulation. Thus, neural population activity in area AL correlates more closely with multisensory detection behavior than V1.
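A simple way to express the sensitivity comparison described above is a neurometric d'; the sketch below is illustrative only, with assumed effect sizes for the two areas:

```python
# Illustrative sketch: comparing neuronal sensitivity to weak stimuli in two areas
# (labelled "V1" and "AL") with a d-prime on trial-by-trial responses.
import numpy as np

rng = np.random.default_rng(7)
n_trials = 200

def dprime(stim, blank):
    pooled_sd = np.sqrt(0.5 * (stim.var(ddof=1) + blank.var(ddof=1)))
    return (stim.mean() - blank.mean()) / pooled_sd

for area, signal in [("V1", 0.3), ("AL", 0.6)]:          # AL assumed more sensitive
    blank = rng.normal(0.0, 1.0, n_trials)                # no-stimulus trials
    stim = rng.normal(signal, 1.0, n_trials)              # weak audiovisual stimulus
    print(area, "d' for weak stimulus:", round(dprime(stim, blank), 2))
```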
Collapse
Affiliation(s)
- Guido T Meijer
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands
| | - Pietro Marchesi
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands
| | - Jorge F Mejias
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands
| | - Jorrit S Montijn
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands
| | - Carien S Lansink
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, 1098 XH Amsterdam, the Netherlands.
| | - Cyriel M A Pennartz
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, 1098 XH Amsterdam, the Netherlands.
| |
Collapse
|
32
|
Liang Y, Fan JL, Sun W, Lu R, Chen M, Ji N. A Distinct Population of L6 Neurons in Mouse V1 Mediate Cross-Callosal Communication. Cereb Cortex 2021; 31:4259-4273. [PMID: 33987642 DOI: 10.1093/cercor/bhab084] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Through the corpus callosum, interhemispheric communication is mediated by callosal projection (CP) neurons. Using retrograde labeling, we identified a population of layer 6 (L6) excitatory neurons as the main conveyer of transcallosal information in the monocular zone of the mouse primary visual cortex (V1). Distinct from the L6 corticothalamic (CT) population, V1 L6 CP neurons contribute to an extensive reciprocal network across multiple sensory cortices over two hemispheres. Receiving both local and long-range cortical inputs, they encode orientation, direction, and receptive field information, while also being highly spontaneously active. The spontaneous activity of L6 CP neurons exhibits complex relationships with brain states and stimulus presentation, distinct from the spontaneous activity patterns of the CT population. The anatomical and functional properties of these L6 CP neurons enable them to broadcast visual and nonvisual information across two hemispheres, and thus may play a role in regulating and coordinating brain-wide activity events.
Collapse
Affiliation(s)
- Yajie Liang
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA; Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland, Baltimore, MD 21201, USA
| | - Jiang Lan Fan
- UCSF-UC Berkeley Joint PhD Program in Bioengineering, University of California, Berkeley, CA 94720, USA
| | - Wenzhi Sun
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA; iHuman Institute, ShanghaiTech University, Shanghai 201210, China; Chinese Institute for Brain Research, Beijing 102206, China
| | - Rongwen Lu
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA; National Eye Institute, National Institutes of Health, Bethesda, MD 20892, USA
| | - Ming Chen
- iHuman Institute, ShanghaiTech University, Shanghai 201210, China
| | - Na Ji
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA; Department of Physics, Department of Molecular and Cell Biology, Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94720, USA
| |
Collapse
|
33
|
Siemann JK, Veenstra-VanderWeele J, Wallace MT. Approaches to Understanding Multisensory Dysfunction in Autism Spectrum Disorder. Autism Res 2020; 13:1430-1449. [PMID: 32869933 PMCID: PMC7721996 DOI: 10.1002/aur.2375] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2020] [Revised: 07/20/2020] [Accepted: 07/28/2020] [Indexed: 12/14/2022]
Abstract
Abnormal sensory responses are a DSM-5 symptom of autism spectrum disorder (ASD), and research findings demonstrate altered sensory processing in ASD. Beyond difficulties with processing information within single sensory domains, including both hypersensitivity and hyposensitivity, difficulties in multisensory processing are becoming a core issue of focus in ASD. These difficulties may be targeted by treatment approaches such as "sensory integration," which is frequently applied in autism treatment but not yet based on clear evidence. Recently, psychophysical data have emerged to demonstrate multisensory deficits in some children with ASD. Unlike deficits in social communication, which are best understood in humans, sensory and multisensory changes offer a tractable marker of circuit dysfunction that is more easily translated into animal model systems to probe the underlying neurobiological mechanisms. Paralleling experimental paradigms that were previously applied in humans and larger mammals, we and others have demonstrated that multisensory function can also be examined behaviorally in rodents. Here, we review the sensory and multisensory difficulties commonly found in ASD, examining laboratory findings that relate these findings across species. Next, we discuss the known neurobiology of multisensory integration, drawing largely on experimental work in larger mammals, and extensions of these paradigms into rodents. Finally, we describe emerging investigations into multisensory processing in genetic mouse models related to autism risk. By detailing findings from humans to mice, we highlight the advantage of multisensory paradigms that can be easily translated across species, as well as the potential for rodent experimental systems to reveal opportunities for novel treatments. LAY SUMMARY: Sensory and multisensory deficits are commonly found in ASD and may result in cascading effects that impact social communication. By using similar experiments to those in humans, we discuss how studies in animal models may allow an understanding of the brain mechanisms that underlie difficulties in multisensory integration, with the ultimate goal of developing new treatments. Autism Res 2020, 13: 1430-1449. © 2020 International Society for Autism Research, Wiley Periodicals, Inc.
Collapse
Affiliation(s)
- Justin K Siemann
- Department of Biological Sciences, Vanderbilt University, Nashville, Tennessee, USA
| | - Jeremy Veenstra-VanderWeele
- Department of Psychiatry, Columbia University, Center for Autism and the Developing Brain, New York Presbyterian Hospital, and New York State Psychiatric Institute, New York, New York, USA
| | - Mark T Wallace
- Department of Psychiatry, Vanderbilt University, Nashville, Tennessee, USA
- Department of Psychology, Vanderbilt University, Nashville, Tennessee, USA
- Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, Tennessee, USA
- Kennedy Center for Research on Human Development, Vanderbilt University, Nashville, Tennessee, USA
| |
Collapse
|
34
|
Kimura A. Cross-modal modulation of cell activity by sound in first-order visual thalamic nucleus. J Comp Neurol 2020; 528:1917-1941. [PMID: 31983057 DOI: 10.1002/cne.24865] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2019] [Revised: 12/19/2019] [Accepted: 01/16/2020] [Indexed: 12/16/2022]
Abstract
Cross-modal auditory influence on cell activity in the primary visual cortex emerging at short latencies raises the possibility that the first-order visual thalamic nucleus, which is considered dedicated to unimodal visual processing, could contribute to cross-modal sensory processing, as has been indicated in the auditory and somatosensory systems. To test this hypothesis, the effects of sound stimulation on visual cell activity in the dorsal lateral geniculate nucleus were examined in anesthetized rats, using juxta-cellular recording and labeling techniques. Visual responses evoked by light (white LED) were modulated by sound (noise burst) given simultaneously or 50-400 ms after the light, even though sound stimuli alone did not evoke cell activity. Alterations of visual response were observed in 71% of cells (57/80) with regard to response magnitude, latency, and/or burst spiking. Suppression predominated in response magnitude modulation, but de novo responses were also induced by combined stimulation. Sound affected not only onset responses but also late responses. Late responses were modulated by sound given before or after onset responses. Further, visual responses evoked by the second light stimulation of a double flash with a 150-700 ms interval were also modulated by sound given together with the first light stimulation. In morphological analysis of labeled cells, projection cells comparable to X-, Y-, and W-like cells and interneurons were all susceptible to auditory influence. These findings suggest that the first-order visual thalamic nucleus incorporates auditory influence into parallel and complex thalamic visual processing for cross-modal modulation of visual attention and perception.
Collapse
Affiliation(s)
- Akihisa Kimura
- Department of Physiology, Wakayama Medical University, Wakayama, Japan
| |
Collapse
|
35
|
Shaw LH, Freedman EG, Crosse MJ, Nicholas E, Chen AM, Braiman MS, Molholm S, Foxe JJ. Operating in a Multisensory Context: Assessing the Interplay Between Multisensory Reaction Time Facilitation and Inter-sensory Task-switching Effects. Neuroscience 2020; 436:122-135. [PMID: 32325100 DOI: 10.1016/j.neuroscience.2020.04.013] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2019] [Revised: 04/03/2020] [Accepted: 04/06/2020] [Indexed: 11/28/2022]
Abstract
Individuals respond faster to presentations of bisensory stimuli (e.g. audio-visual targets) than to presentations of either unisensory constituent in isolation (i.e. to the auditory-alone or visual-alone components of an audio-visual stimulus). This well-established multisensory speeding effect, termed the redundant signals effect (RSE), is not predicted by simple linear summation of the unisensory response time probability distributions. Rather, the speeding is typically faster than this prediction, leading researchers to ascribe the RSE to a so-called co-activation account. According to this account, multisensory neural processing occurs whereby the unisensory inputs are integrated to produce more effective sensory-motor activation. However, the typical paradigm used to test for RSE involves random sequencing of unisensory and bisensory inputs in a mixed design, raising the possibility of an alternate attention-switching account. This intermixed design requires participants to switch between sensory modalities on many task trials (e.g. from responding to a visual stimulus to an auditory stimulus). Here we show that much, if not all, of the RSE under this paradigm can be attributed to slowing of reaction times to unisensory stimuli resulting from modality switching, and is not in fact due to speeding of responses to AV stimuli. As such, the present data do not support a co-activation account, but rather suggest that switching and mixing costs akin to those observed during classic task-switching paradigms account for the observed RSE.
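The co-activation versus switching argument rests on two standard computations, Miller's race-model bound and the modality-switch cost; the following is a generic sketch with simulated reaction times, not the study's data:

```python
# Illustrative sketch: test the redundant signals effect against Miller's
# race-model bound, and estimate the modality-switch cost on unisensory trials.
import numpy as np

rng = np.random.default_rng(8)
n = 500
rt_A = rng.lognormal(np.log(0.32), 0.2, n)   # auditory-alone RTs (s)
rt_V = rng.lognormal(np.log(0.36), 0.2, n)   # visual-alone RTs (s)
rt_AV = rng.lognormal(np.log(0.30), 0.2, n)  # audiovisual RTs (s)

t = np.quantile(np.concatenate([rt_A, rt_V, rt_AV]), np.linspace(0.05, 0.95, 19))
cdf = lambda rt: np.array([np.mean(rt <= ti) for ti in t])
violation = cdf(rt_AV) - np.minimum(cdf(rt_A) + cdf(rt_V), 1.0)   # Miller's inequality
print("max race-model violation:", round(violation.max(), 3))     # >0 suggests co-activation

# Switch cost: unisensory RTs on modality-switch trials vs modality-repeat trials
switched = rng.integers(0, 2, n).astype(bool)
rt_uni = np.where(switched, rt_A * 1.15, rt_A)   # assume ~15% slowing after a switch
print("switch cost (ms):",
      round(1000 * (rt_uni[switched].mean() - rt_uni[~switched].mean()), 1))
```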
Collapse
Affiliation(s)
- Luke H Shaw
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Edward G Freedman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
| | - Eric Nicholas
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Allen M Chen
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Matthew S Braiman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Sophie Molholm
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
| | - John J Foxe
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA.
| |
Collapse
|
36
|
Xu X, Hanganu-Opatz IL, Bieler M. Cross-Talk of Low-Level Sensory and High-Level Cognitive Processing: Development, Mechanisms, and Relevance for Cross-Modal Abilities of the Brain. Front Neurorobot 2020; 14:7. [PMID: 32116637 PMCID: PMC7034303 DOI: 10.3389/fnbot.2020.00007] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2019] [Accepted: 01/27/2020] [Indexed: 12/18/2022] Open
Abstract
The emergence of cross-modal learning capabilities requires the interaction of neural areas accounting for sensory and cognitive processing. Convergence of multiple sensory inputs is observed in low-level sensory cortices including primary somatosensory (S1), visual (V1), and auditory cortex (A1), as well as in high-level areas such as prefrontal cortex (PFC). Evidence shows that local neural activity and functional connectivity between sensory cortices participate in cross-modal processing. However, little is known about the functional interplay between neural areas underlying sensory and cognitive processing required for cross-modal learning capabilities across life. Here we review our current knowledge on the interdependence of low- and high-level cortices for the emergence of cross-modal processing in rodents. First, we summarize the mechanisms underlying the integration of multiple senses and how cross-modal processing in primary sensory cortices might be modified by top-down modulation of the PFC. Second, we examine the critical factors and developmental mechanisms that account for the interaction between neuronal networks involved in sensory and cognitive processing. Finally, we discuss the applicability and relevance of cross-modal processing for brain-inspired intelligent robotics. An in-depth understanding of the factors and mechanisms controlling cross-modal processing might inspire the refinement of robotic systems by better mimicking neural computations.
Collapse
Affiliation(s)
- Xiaxia Xu
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
| | - Ileana L Hanganu-Opatz
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
| | - Malte Bieler
- Laboratory for Neural Computation, Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway
| |
Collapse
|
37
|
Gau R, Bazin PL, Trampel R, Turner R, Noppeney U. Resolving multisensory and attentional influences across cortical depth in sensory cortices. eLife 2020; 9:46856. [PMID: 31913119 PMCID: PMC6984812 DOI: 10.7554/elife.46856] [Citation(s) in RCA: 38] [Impact Index Per Article: 7.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2019] [Accepted: 01/07/2020] [Indexed: 11/13/2022] Open
Abstract
In our environment, our senses are bombarded with a myriad of signals, only a subset of which is relevant for our goals. Using sub-millimeter-resolution fMRI at 7T, we resolved BOLD-response and activation patterns across cortical depth in early sensory cortices to auditory, visual and audiovisual stimuli under auditory or visual attention. In visual cortices, auditory stimulation induced widespread inhibition irrespective of attention, whereas auditory relative to visual attention suppressed mainly central visual field representations. In auditory cortices, visual stimulation suppressed activations, but amplified responses to concurrent auditory stimuli, in a patchy topography. Critically, multisensory interactions in auditory cortices were stronger in deeper laminae, while attentional influences were greatest at the surface. These distinct depth-dependent profiles suggest that multisensory and attentional mechanisms regulate sensory processing via partly distinct circuitries. Our findings are crucial for understanding how the brain regulates information flow across senses to interact with our complex multisensory world.
Collapse
Affiliation(s)
- Remi Gau
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, United Kingdom; Institute of Psychology, Université Catholique de Louvain, Louvain-la-Neuve, Belgium; Institute of Neuroscience, Université Catholique de Louvain, Louvain-la-Neuve, Belgium
| | - Pierre-Louis Bazin
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Integrative Model-based Cognitive Neuroscience research unit, University of Amsterdam, Amsterdam, Netherlands
| | - Robert Trampel
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
| | - Robert Turner
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Sir Peter Mansfield Imaging Centre, University of Nottingham, Nottingham, United Kingdom
| | - Uta Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, United Kingdom; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
| |
Collapse
|
38
|
Audio-visual experience strengthens multisensory assemblies in adult mouse visual cortex. Nat Commun 2019; 10:5684. [PMID: 31831751 PMCID: PMC6908602 DOI: 10.1038/s41467-019-13607-2] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2018] [Accepted: 11/07/2019] [Indexed: 11/09/2022] Open
Abstract
We experience the world through multiple senses simultaneously. To better understand mechanisms of multisensory processing we ask whether inputs from two senses (auditory and visual) can interact and drive plasticity in neural-circuits of the primary visual cortex (V1). Using genetically-encoded voltage and calcium indicators, we find coincident audio-visual experience modifies both the supra and subthreshold response properties of neurons in L2/3 of mouse V1. Specifically, we find that after audio-visual pairing, a subset of multimodal neurons develops enhanced auditory responses to the paired auditory stimulus. This cross-modal plasticity persists over days and is reflected in the strengthening of small functional networks of L2/3 neurons. We find V1 processes coincident auditory and visual events by strengthening functional associations between feature specific assemblies of multimodal neurons during bouts of sensory driven co-activity, leaving a trace of multisensory experience in the cortical network.
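One simple way to operationalize the "strengthening of small functional networks" described above is to count strongly correlated neuron pairs before and after pairing; the sketch below is illustrative, with an assumed correlation threshold and synthetic activity:

```python
# Illustrative sketch: functional "assemblies" as sets of neurons whose pairwise
# co-activity exceeds a threshold, compared before vs after audio-visual pairing.
import numpy as np

rng = np.random.default_rng(9)
n_neurons, n_frames = 60, 3000

def co_active_edges(coupling):
    # shared latent drive -> correlated activity among the first 10 "multimodal" neurons
    latent = rng.normal(size=n_frames)
    act = rng.normal(size=(n_neurons, n_frames))
    act[:10] += coupling * latent
    corr = np.corrcoef(act)
    np.fill_diagonal(corr, 0)
    return np.sum(corr > 0.2) // 2          # number of strongly correlated pairs

print("edges before pairing:", co_active_edges(coupling=0.3))
print("edges after pairing: ", co_active_edges(coupling=0.8))
```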
Collapse
|
39
|
Delving Deep into Crossmodal Integration. J Neurosci 2018; 38:6442-6444. [PMID: 30021764 PMCID: PMC6052241 DOI: 10.1523/jneurosci.0988-18.2018] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2018] [Revised: 06/03/2018] [Accepted: 06/08/2018] [Indexed: 11/21/2022] Open
|
40
|
Abstract
In this article, we review the anatomical inputs and outputs to the mouse primary visual cortex, area V1. Our survey of data from the Allen Institute Mouse Connectivity project indicates that mouse V1 is highly interconnected with both cortical and subcortical brain areas. This pattern of innervation allows for computations that depend on the state of the animal and on behavioral goals, which contrasts with simple feedforward, hierarchical models of visual processing. Thus, to have an accurate description of the function of V1 during mouse behavior, its involvement with the rest of the brain circuitry has to be considered. Finally, it remains an open question whether the primary visual cortex of higher mammals displays the same degree of sensorimotor integration in the early visual system.
Collapse
Affiliation(s)
- Emmanouil Froudarakis
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA;
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030, USA
| | - Paul G Fahey
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA;
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030, USA
| | - Jacob Reimer
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA;
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030, USA
| | - Stelios M Smirnakis
- Department of Neurology, Brigham and Women's Hospital, Boston, Massachusetts 02115, USA
- Jamaica Plain VA Medical Center, Boston, Massachusetts 02130, USA
| | - Edward J Tehovnik
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA;
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030, USA
| | - Andreas S Tolias
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA;
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030, USA
- Department of Electrical and Computer Engineering, Rice University, Houston, Texas 77005, USA
| |
Collapse
|
41
|
Deneux T, Harrell ER, Kempf A, Ceballo S, Filipchuk A, Bathellier B. Context-dependent signaling of coincident auditory and visual events in primary visual cortex. eLife 2019; 8:44006. [PMID: 31115334 PMCID: PMC6544434 DOI: 10.7554/elife.44006] [Citation(s) in RCA: 50] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2018] [Accepted: 05/20/2019] [Indexed: 01/10/2023] Open
Abstract
Detecting rapid, coincident changes across sensory modalities is essential for recognition of sudden threats or events. Using two-photon calcium imaging in identified cell types in awake, head-fixed mice, we show that, among the basic features of a sound envelope, loud sound onsets are a dominant feature coded by the auditory cortex neurons projecting to primary visual cortex (V1). In V1, a small number of layer 1 interneurons gates this cross-modal information flow in a context-dependent manner. In dark conditions, auditory cortex inputs lead to suppression of the V1 population. However, when sound input coincides with a visual stimulus, visual responses are boosted in V1, most strongly after loud sound onsets. Thus, a dynamic, asymmetric circuit connecting AC and V1 contributes to the encoding of visual events that are coincident with sounds.
Collapse
Affiliation(s)
- Thomas Deneux
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| | - Evan R Harrell
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| | - Alexandre Kempf
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| | - Sebastian Ceballo
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| | - Anton Filipchuk
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| | - Brice Bathellier
- Department for Integrative and Computational Neuroscience (ICN), Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS, University Paris Sud, Gif-sur-Yvette, France
| |
Collapse
|
42
|
Chanauria N, Bharmauria V, Bachatene L, Cattan S, Rouat J, Molotchnikoff S. Sound Induces Change in Orientation Preference of V1 Neurons: Audio-Visual Cross-Influence. Neuroscience 2019; 404:48-61. [PMID: 30703505 DOI: 10.1016/j.neuroscience.2019.01.039] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2018] [Revised: 01/18/2019] [Accepted: 01/21/2019] [Indexed: 10/27/2022]
Abstract
In the cortex, demarcated unimodal sensory regions often respond to unforeseen sensory stimuli and exhibit plasticity. The goal of the current investigation was to test evoked responses of primary visual cortex (V1) neurons when an adapting auditory stimulus is applied in isolation. Using extracellular recordings in anesthetized cats, we demonstrate that, unlike the prevailing observation of only slight modulations in the firing rates of the neurons, sound imposition in isolation entirely shifted the peaks of orientation tuning curves of neurons in both supra- and infragranular layers of V1. Our results suggest that neurons specific to either layer dynamically integrate features of sound and modify the organization of the orientation map of V1. Intriguingly, these experiments present novel findings that the mere presentation of a prolonged auditory stimulus may drastically recalibrate the tuning properties of the visual neurons and highlight the phenomenal neuroplasticity of V1 neurons.
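The peak shifts described above can be quantified by comparing vector-averaged preferred orientations across conditions; this is a generic sketch with synthetic tuning curves, not the recorded data:

```python
# Illustrative sketch: estimate preferred orientation by vector averaging, then
# measure the tuning-peak shift between control and sound-adapted blocks.
import numpy as np

orientations = np.deg2rad(np.arange(0, 180, 22.5))      # 8 tested orientations

def preferred_orientation(responses):
    # orientation is pi-periodic, so double the angle before vector averaging
    z = np.sum(responses * np.exp(2j * orientations))
    return np.rad2deg(np.angle(z) / 2) % 180

rng = np.random.default_rng(10)
control = np.exp(np.cos(2 * (orientations - np.deg2rad(45)))) + 0.1 * rng.random(8)
adapted = np.exp(np.cos(2 * (orientations - np.deg2rad(70)))) + 0.1 * rng.random(8)

shift = (preferred_orientation(adapted) - preferred_orientation(control) + 90) % 180 - 90
print("peak shift after sound adaptation (deg):", round(shift, 1))
```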
Collapse
Affiliation(s)
- Nayan Chanauria
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Vishal Bharmauria
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Lyes Bachatene
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Sarah Cattan
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada
| | - Jean Rouat
- Departement de Génie Électrique et Génie Informatique, Université de Sherbrooke, Sherbrooke, QC, Canada
| | - Stéphane Molotchnikoff
- Neurophysiology of Visual System, Département de Sciences Biologiques, Université de Montréal, CP 6128 Succursale Centre-Ville, Montréal, QC H3C 3J7, Canada.
| |
Collapse
|
43
|
McClure JP, Polack PO. Pure tones modulate the representation of orientation and direction in the primary visual cortex. J Neurophysiol 2019; 121:2202-2214. [PMID: 30969800 DOI: 10.1152/jn.00069.2019] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/23/2023] Open
Abstract
Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. Whereas multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds were shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet, the net effect of sound modulation at the V1 population level remained unclear. In the present study, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and the direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) or multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue's orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or the direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3. Namely, visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than when presented alone. NEW & NOTEWORTHY The primary visual cortex (V1) receives direct inputs from the primary auditory cortex. Yet, the impact of sounds on visual processing in V1 remains controverted. We show that the modulation by pure tones of V1 visual responses depends on the orientation selectivity, direction selectivity, and response amplitudes of V1 neurons. Hence, audiovisual stimuli recruit a population of V1 neurons better tuned to the orientation and direction of the visual stimulus than unimodal visual stimuli.
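The tuning-dependence and inverse-effectiveness claims above involve standard selectivity indices; the sketch below illustrates how OSI/DSI and the dependence of sound modulation on response amplitude might be computed (synthetic values, assumed relationship):

```python
# Illustrative sketch: orientation/direction selectivity indices and a test of
# inverse effectiveness (sound modulation strongest in weakly responding neurons).
import numpy as np

rng = np.random.default_rng(11)
n_neurons = 300
r_pref = rng.gamma(2.0, 1.0, n_neurons)                 # response to preferred direction
r_orth = r_pref * rng.uniform(0.1, 0.9, n_neurons)      # orthogonal orientation
r_null = r_pref * rng.uniform(0.1, 0.9, n_neurons)      # opposite direction

osi = (r_pref - r_orth) / (r_pref + r_orth)
dsi = (r_pref - r_null) / (r_pref + r_null)

# Simulate sound modulation that shrinks as the unimodal response grows
sound_gain = 1 + 0.8 * np.exp(-r_pref) + 0.1 * rng.normal(size=n_neurons)
print("mean OSI:", round(osi.mean(), 2), " mean DSI:", round(dsi.mean(), 2))
print("corr(unimodal response, sound gain):",
      round(np.corrcoef(r_pref, sound_gain)[0, 1], 2))
```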
Collapse
Affiliation(s)
- John P McClure
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey
| | - Pierre-Olivier Polack
- Center for Molecular and Behavioral Neuroscience, Rutgers University-Newark, Newark, New Jersey
| |
Collapse
|
44
|
Meijer GT, Mertens PEC, Pennartz CMA, Olcese U, Lansink CS. The circuit architecture of cortical multisensory processing: Distinct functions jointly operating within a common anatomical network. Prog Neurobiol 2019; 174:1-15. [PMID: 30677428 DOI: 10.1016/j.pneurobio.2019.01.004] [Citation(s) in RCA: 34] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2017] [Revised: 12/21/2018] [Accepted: 01/21/2019] [Indexed: 12/16/2022]
Abstract
Our perceptual systems continuously process sensory inputs from different modalities and organize these streams of information such that our subjective representation of the outside world is a unified experience. By doing so, they also enable further cognitive processing and behavioral action. While cortical multisensory processing has been extensively investigated in terms of psychophysics and mesoscale neural correlates, an in-depth understanding of the underlying circuit-level mechanisms is lacking. Previous studies on circuit-level mechanisms of multisensory processing have predominantly focused on cue integration, i.e., the mechanism by which sensory features from different modalities are combined to yield more reliable stimulus estimates than those obtained by using single sensory modalities. In this review, we expand the framework on the circuit-level mechanisms of cortical multisensory processing by highlighting that multisensory processing is a family of functions, rather than a single operation, that involves not only the integration but also the segregation of modalities. In addition, multisensory processing not only depends on stimulus features, but also on cognitive resources, such as attention and memory, as well as behavioral context, to determine the behavioral outcome. We focus on rodent models as a powerful instrument to study the circuit-level bases of multisensory processes, because they enable combining cell-type-specific recording and interventional techniques with complex behavioral paradigms. We conclude that distinct multisensory processes share overlapping anatomical substrates, are implemented by diverse neuronal micro-circuitries that operate in parallel, and are flexibly recruited based on factors such as stimulus features and behavioral constraints.
Collapse
Affiliation(s)
- Guido T Meijer
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands.
| | - Paul E C Mertens
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands.
| | - Cyriel M A Pennartz
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands.
| | - Umberto Olcese
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands.
| | - Carien S Lansink
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands.
| |
Collapse
|
45
|
How Senses Work Together: Cross-Modal Interactions between Primary Sensory Cortices. Neural Plast 2018; 2018:5380921. [PMID: 30647732 PMCID: PMC6311735 DOI: 10.1155/2018/5380921] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/05/2018] [Accepted: 11/04/2018] [Indexed: 11/17/2022] Open
Abstract
On our way through a town, the things we see can make us change the way we go. The things that we hear can make us stop or walk on, or the things we feel can cause us to wear a warm jacket or just a t-shirt. All these behaviors are mediated by highly complex processing mechanisms in our brain and reflect responses to many important sensory inputs. The mammalian cerebral cortex, which processes the sensory information, consists of largely specialized sensory areas mainly receiving information from their corresponding sensory modalities. The first cortical regions receiving the input from the outer world are the so-called primary sensory cortices. Strikingly, there is convincing evidence that primary sensory cortices do not work in isolation but are substantially affected by other sensory modalities. Here, we will review previous and current literature on this cross-modal interplay.
Collapse
|
46
|
Olcese U, Oude Lohuis MN, Pennartz CMA. Sensory Processing Across Conscious and Nonconscious Brain States: From Single Neurons to Distributed Networks for Inferential Representation. Front Syst Neurosci 2018; 12:49. [PMID: 30364373 PMCID: PMC6193318 DOI: 10.3389/fnsys.2018.00049] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2018] [Accepted: 09/25/2018] [Indexed: 11/29/2022] Open
Abstract
Neuronal activity is markedly different across brain states: it varies from desynchronized activity during wakefulness to the synchronous alternation between active and silent states characteristic of deep sleep. Surprisingly, limited attention has been paid to investigating how brain states affect sensory processing. While it was long assumed that the brain was mostly disconnected from external stimuli during sleep, an increasing number of studies indicates that sensory stimuli continue to be processed across all brain states, albeit differently. In this review article, we first discuss what constitutes a brain state. We argue that, next to global behavioral states such as wakefulness and sleep, there is a concomitant need to distinguish bouts of oscillatory dynamics with specific global/local activity patterns and lasting for a few hundreds of milliseconds, as these can lead to the same sensory stimulus being either perceived or not. We define these short-lasting bouts as micro-states. We proceed to characterize how sensory-evoked neural responses vary between conscious and nonconscious states. We focus on two complementary aspects: neuronal ensembles and inter-areal communication. First, we review which features of ensemble activity are conducive to perception, and how these features vary across brain states. Properties such as heterogeneity, sparsity and synchronicity in neuronal ensembles will especially be considered as essential correlates of conscious processing. Second, we discuss how inter-areal communication varies across brain states and how this may affect brain operations and sensory processing. Finally, we discuss predictive coding (PC) and the concept of multi-level representations as a key framework for understanding conscious sensory processing. In this framework, the brain implements conscious representations as inferences about world states across multiple representational levels. In this representational hierarchy, low-level inference may be carried out nonconsciously, whereas high levels integrate across different sensory modalities and larger spatial scales, correlating with conscious processing. This inferential framework is used to interpret several cellular and population-level findings in the context of brain states, and we briefly compare its implications to two other theories of consciousness. In conclusion, this review article provides foundations to guide future studies aiming to uncover the mechanisms of sensory processing and perception across brain states.
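As a rough, self-contained illustration of the predictive coding framework mentioned above (a generic toy model, not tied to the review's specific proposals; the weight matrix, learning rate, and input are all made up), the sketch below updates a latent representation by gradient descent on the sensory prediction error until the input is explained.

```python
# Minimal sketch of one level of predictive coding inference:
# a latent estimate r is adjusted to minimize the prediction error x - W r.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.3, (10, 3))      # hypothetical generative mapping: 3 causes -> 10 sensory channels
x = W @ np.array([1.0, -0.5, 0.2])   # sensory input generated by "true" causes
r = np.zeros(3)                      # latent representation to be inferred

lr = 0.1
for _ in range(200):
    error = x - W @ r                # prediction error at the sensory level
    r += lr * (W.T @ error)          # update the representation to explain the input

print("inferred causes:", np.round(r, 3))
print("remaining prediction error:", np.linalg.norm(x - W @ r))
```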
Collapse
Affiliation(s)
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| | - Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| | - Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| |
Collapse
|
47
|
Meijer GT, Pie JL, Dolman TL, Pennartz CMA, Lansink CS. Audiovisual Integration Enhances Stimulus Detection Performance in Mice. Front Behav Neurosci 2018; 12:231. [PMID: 30337861 PMCID: PMC6180166 DOI: 10.3389/fnbeh.2018.00231] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2018] [Accepted: 09/14/2018] [Indexed: 11/13/2022] Open
Abstract
The detection of objects in the external world improves when humans and animals integrate object features of multiple sensory modalities. Behavioral and neuronal mechanisms underlying multisensory stimulus detection are poorly understood, mainly because they have not been investigated with suitable behavioral paradigms. Such behavioral paradigms should (i) elicit a robust multisensory gain, (ii) incorporate systematic calibration of stimulus amplitude to the sensory capacities of the individual subject, (iii) yield a high trial count, and (iv) be easily compatible with a large variety of neurophysiological recording techniques. We developed an audiovisual stimulus detection task for head-fixed mice that meets all of these critical behavioral constraints. Behavioral data obtained with this task indicated a robust increase in detection performance of multisensory stimuli compared with unisensory cues, which was maximal when both stimulus constituents were presented at threshold intensity. The multisensory behavioral effect was associated with a change in perceptual performance that consisted of two components. First, the visual and auditory perceptual systems increased their sensitivity, meaning that low-intensity stimuli were more often detected. Second, enhanced acuity enabled the systems to better classify whether there was a stimulus or not. Fitting our data to signal detection models revealed that the multisensory gain was more likely to be achieved by integration of sensory signals rather than by stimulus redundancy or competition. This validated behavioral paradigm can be exploited to reliably investigate the neuronal correlates of multisensory stimulus detection at the level of single neurons, microcircuits, and larger perceptual systems.
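To make the signal-detection framing above concrete, here is a minimal sketch (not the authors' analysis code; the hit and false-alarm rates are hypothetical) that computes d' for unimodal and audiovisual detection and compares the observed multisensory hit rate against the independent-channels (probability summation) prediction, the kind of benchmark used to separate true integration from mere stimulus redundancy.

```python
# Minimal sketch: yes/no detection sensitivity (d') and a probability-summation
# benchmark for the multisensory condition. All rates are invented.
import numpy as np
from scipy.stats import norm

def d_prime(hit_rate, fa_rate, n_trials=200):
    """d' from hit and false-alarm rates, clipped away from 0 and 1."""
    lo, hi = 0.5 / n_trials, 1 - 0.5 / n_trials
    return norm.ppf(np.clip(hit_rate, lo, hi)) - norm.ppf(np.clip(fa_rate, lo, hi))

def probability_summation(p_visual, p_auditory):
    """Hit rate expected if the two unimodal channels are detected independently
    (stimulus redundancy), with no integration of the sensory signals."""
    return 1 - (1 - p_visual) * (1 - p_auditory)

if __name__ == "__main__":
    p_v, p_a, p_av, p_fa = 0.55, 0.50, 0.85, 0.10   # hypothetical threshold-level rates
    print("d' visual      :", round(d_prime(p_v, p_fa), 2))
    print("d' audiovisual :", round(d_prime(p_av, p_fa), 2))
    # If the observed audiovisual hit rate exceeds this prediction,
    # redundancy alone cannot explain the multisensory gain.
    print("probability-summation prediction:", probability_summation(p_v, p_a))
```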
Collapse
Affiliation(s)
- Guido T. Meijer
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
| | - Jean L. Pie
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
| | - Thomas L. Dolman
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
| | - Cyriel M. A. Pennartz
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| | - Carien S. Lansink
- Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| |
Collapse
|
48
|
Tivadar RI, Retsa C, Turoman N, Matusz PJ, Murray MM. Sounds enhance visual completion processes. Neuroimage 2018; 179:480-488. [PMID: 29959049 DOI: 10.1016/j.neuroimage.2018.06.070] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2018] [Revised: 06/13/2018] [Accepted: 06/25/2018] [Indexed: 10/28/2022] Open
Abstract
Everyday vision includes the detection of stimuli, figure-ground segregation, as well as object localization and recognition. Such processes must often surmount impoverished or noisy conditions; borders are perceived despite occlusion or absent contrast gradients. These illusory contours (ICs) are an example of so-called mid-level vision, with an event-related potential (ERP) correlate at ∼100-150 ms post-stimulus onset and originating within lateral-occipital cortices (the IC effect). Presently, visual completion processes supporting IC perception are considered exclusively visual; any influence from other sensory modalities is currently unknown. It is now well-established that multisensory processes can influence both low-level vision (e.g. detection) as well as higher-level object recognition. By contrast, it is unknown if mid-level vision exhibits multisensory benefits and, if so, through what mechanisms. We hypothesized that sounds would impact the IC effect. We recorded 128-channel ERPs from 17 healthy, sighted participants who viewed ICs or no-contour (NC) counterparts either in the presence or absence of task-irrelevant sounds. The IC effect was enhanced by sounds and resulted in the recruitment of a distinct configuration of active brain areas over the 70-170 ms post-stimulus period. IC-related source-level activity within the lateral occipital cortex (LOC), inferior parietal lobe (IPL), as well as primary visual cortex (V1) was enhanced by sounds. Moreover, the activity in these regions was correlated when sounds were present, but not when absent. Results from a control experiment, which employed amodal variants of the stimuli, suggested that sounds impact the perceived brightness of the IC rather than shape formation per se. We provide the first demonstration that multisensory processes augment mid-level vision and everyday visual completion processes, and that one of the mechanisms is brightness enhancement. These results have important implications for the design of treatments and/or visual aids for low-vision patients.
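As an illustration of the kind of measurement described above (simulated single-channel data, not the authors' 128-channel pipeline; component latency, analysis window, and amplitudes are invented), the sketch below computes an "IC effect" as the IC-minus-NC difference averaged over a 70-170 ms window, with and without an accompanying sound.

```python
# Minimal sketch: an IC-minus-NC ERP difference in a fixed post-stimulus window,
# compared between sound-present and sound-absent conditions. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000                                   # sampling rate (Hz)
t = np.arange(-100, 400) / fs * 1000        # time axis in ms (1 sample per ms at 1 kHz)
n_trials = 100

def simulate_erp(amplitude):
    """Trials of one channel: a Gaussian component near 120 ms plus noise."""
    component = amplitude * np.exp(-((t - 120) ** 2) / (2 * 25 ** 2))
    return component + rng.normal(0, 1.0, (n_trials, t.size))

ic_no_sound, nc_no_sound = simulate_erp(1.0), simulate_erp(0.3)
ic_sound, nc_sound = simulate_erp(1.5), simulate_erp(0.3)   # hypothetical sound-driven enhancement

window = (t >= 70) & (t <= 170)
effect_no_sound = (ic_no_sound.mean(0) - nc_no_sound.mean(0))[window].mean()
effect_sound = (ic_sound.mean(0) - nc_sound.mean(0))[window].mean()
print(f"IC effect without sound: {effect_no_sound:.2f} a.u., with sound: {effect_sound:.2f} a.u.")
```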
Collapse
Affiliation(s)
- Ruxandra I Tivadar
- The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology, University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland; Department of Ophthalmology, University of Lausanne and Fondation Asile des Aveugles, 1003, Lausanne, Switzerland
| | - Chrysa Retsa
- The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology, University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland
| | - Nora Turoman
- The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology, University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland
| | - Pawel J Matusz
- The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology, University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland; Information Systems Institute at the University of Applied Sciences Western Switzerland (HES-SO Valais), 3960, Sierre, Switzerland
| | - Micah M Murray
- The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology, University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland; Department of Ophthalmology, University of Lausanne and Fondation Asile des Aveugles, 1003, Lausanne, Switzerland; The EEG Brain Mapping Core, Center for Biomedical Imaging (CIBM), University Hospital Center and University of Lausanne, 1011, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, 37203-5721, USA.
| |
Collapse
|
49
|
Follmann R, Goldsmith CJ, Stein W. Multimodal sensory information is represented by a combinatorial code in a sensorimotor system. PLoS Biol 2018; 16:e2004527. [PMID: 30321170 PMCID: PMC6201955 DOI: 10.1371/journal.pbio.2004527] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2017] [Revised: 10/25/2018] [Accepted: 10/02/2018] [Indexed: 11/22/2022] Open
Abstract
A ubiquitous feature of the nervous system is the processing of simultaneously arriving sensory inputs from different modalities. Yet, because of the difficulties of monitoring large populations of neurons with the single-cell resolution required to determine their sensory responses, the cellular mechanisms of how populations of neurons encode different sensory modalities often remain enigmatic. We studied multimodal information encoding in a small sensorimotor system of the crustacean stomatogastric nervous system that drives rhythmic motor activity for the processing of food. This system is experimentally advantageous, as it produces a fictive behavioral output in vitro, and distinct sensory modalities can be selectively activated. It has the additional advantage that all sensory information is routed through a hub ganglion, the commissural ganglion, a structure with fewer than 220 neurons. Using optical imaging of a population of commissural neurons to track each individual neuron's response across sensory modalities, we provide evidence that multimodal information is encoded via a combinatorial code of recruited neurons. By selectively stimulating chemosensory and mechanosensory inputs that are functionally important for processing of food, we find that these two modalities were processed in a distributed network comprising the majority of commissural neurons imaged. In a total of 12 commissural ganglia, we show that 98% of all imaged neurons were involved in sensory processing, with the two modalities being processed by a highly overlapping set of neurons. Of these, 80% were multimodal, 18% were unimodal, and only 2% of the neurons did not respond to either modality. Differences between modalities were represented by the identities of the neurons participating in each sensory condition and by differences in response sign (excitation versus inhibition), with 46% changing their responses in the other modality. Consistent with the hypothesis that the commissural network encodes different sensory conditions in the combination of activated neurons, a new combination of excitation and inhibition was found when both pathways were activated simultaneously. The responses to this bimodal condition were distinct from either unimodal condition, and for 30% of the neurons, they were not predictable from the individual unimodal responses. Thus, in a sensorimotor network, different sensory modalities are encoded using a combinatorial code of neurons that are activated or inhibited. This provides motor networks with the ability to differentially respond to categorically different sensory conditions and may serve as a model to understand higher-level processing of multimodal information.
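As a toy version of the combinatorial-code bookkeeping described above (illustrative only; the response labels are randomly generated rather than measured), the sketch below tags each neuron's response to two modalities as excited, inhibited, or unresponsive and tallies multimodal, unimodal, and sign-switching cells.

```python
# Minimal sketch: classify simulated neurons by response sign across two modalities.
# Labels: +1 excited, -1 inhibited, 0 no response. Proportions are made up.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 100
chemo = rng.choice([-1, 0, 1], size=n_neurons, p=[0.3, 0.1, 0.6])    # hypothetical chemosensory responses
mechano = rng.choice([-1, 0, 1], size=n_neurons, p=[0.3, 0.1, 0.6])  # hypothetical mechanosensory responses

responds_chemo = chemo != 0
responds_mech = mechano != 0
multimodal = responds_chemo & responds_mech
unimodal = responds_chemo ^ responds_mech
silent = ~(responds_chemo | responds_mech)
sign_switch = multimodal & (chemo != mechano)   # excited in one modality, inhibited in the other

print(f"multimodal: {multimodal.mean():.0%}, unimodal: {unimodal.mean():.0%}, "
      f"unresponsive: {silent.mean():.0%}, sign change among multimodal: "
      f"{sign_switch.sum() / max(multimodal.sum(), 1):.0%}")
```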
Collapse
Affiliation(s)
- Rosangela Follmann
- School of Biological Sciences, Illinois State University, Normal, Illinois, United States of America
| | | | - Wolfgang Stein
- School of Biological Sciences, Illinois State University, Normal, Illinois, United States of America
| |
Collapse
|
50
|
Goltstein PM, Meijer GT, Pennartz CM. Conditioning sharpens the spatial representation of rewarded stimuli in mouse primary visual cortex. eLife 2018; 7:37683. [PMID: 30222107 PMCID: PMC6141231 DOI: 10.7554/elife.37683] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2018] [Accepted: 08/29/2018] [Indexed: 11/13/2022] Open
Abstract
Reward is often employed as reinforcement in behavioral paradigms, but it is unclear how the visuospatial aspect of a stimulus-reward association affects the cortical representation of visual space. Using a head-fixed paradigm, we conditioned mice to associate the same visual pattern in adjacent retinotopic regions with availability and absence of reward. Time-lapse intrinsic optical signal imaging under anesthesia showed that conditioning increased the spatial separation of mesoscale cortical representations of reward-predicting and non-reward-predicting stimuli. Subsequent in vivo two-photon calcium imaging revealed that this improved separation correlated with enhanced population coding for retinotopic location, specifically for the trained orientation and spatially confined to the V1 region where rewarded and non-rewarded stimulus representations bordered. These results are corroborated by conditioning-induced differences in the correlation structure of population activity. Thus, the cortical representation of visual space is sharpened as a consequence of associative stimulus-reward learning while the overall retinotopic map remains unaltered.
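As a toy version of the population-coding analysis referenced above (simulated activity and a generic linear decoder, not the authors' pipeline; trial counts, tuning, and signal strength are invented), the sketch below estimates how well stimulus retinotopic location can be read out from population activity using cross-validation.

```python
# Minimal sketch: cross-validated decoding of stimulus location (rewarded vs.
# non-rewarded retinotopic region) from simulated population activity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_neurons = 200, 50
labels = rng.integers(0, 2, n_trials)          # 0 = rewarded location, 1 = non-rewarded location
tuning = rng.normal(0, 1, n_neurons)           # hypothetical per-neuron location preference
activity = rng.normal(0, 1, (n_trials, n_neurons)) + 0.3 * np.outer(2 * labels - 1, tuning)

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, activity, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")   # chance level is ~0.5
```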
Collapse
Affiliation(s)
- Pieter M Goltstein
- Center for Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| | - Guido T Meijer
- Center for Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| | - Cyriel M A Pennartz
- Center for Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| |
Collapse
|