1. Efron B, Ntelezos A, Katz Y, Lampl I. Detection and neural encoding of whisker-generated sounds in mice. Curr Biol 2025:S0960-9822(25)00124-1. PMID: 39978346. DOI: 10.1016/j.cub.2025.01.061.
Abstract
The vibrissa system of mice and other rodents enables active sensing via whisker movements and is traditionally considered a purely tactile system. Here, we ask whether whisking against objects produces audible sounds and whether mice are capable of perceiving these sounds. We found that whisking by head-fixed mice against objects produces audible sounds well within their hearing range. We recorded neural activity in the auditory cortex of mice in which we had abolished vibrissae tactile sensation and found that the firing rate of auditory neurons was strongly modulated by whisking against objects. Furthermore, the object's identity could be reliably decoded from the population's neuronal activity and closely matched the decoding patterns derived from sounds that were recorded simultaneously, suggesting that neuronal activity reflects acoustic information. Lastly, trained mice, in which vibrissae tactile sensation was abolished, were able to accurately identify objects solely based on the sounds produced during whisking. Our results suggest that, beyond its traditional role as a tactile sensory system, the vibrissa system of rodents engages both tactile and auditory modalities in a multimodal manner during active exploration.
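A population-decoding analysis of this kind can be illustrated with a minimal, self-contained sketch: a cross-validated linear classifier predicting object identity from trial-by-neuron firing rates. The data below are simulated, and every array name, shape, and model choice is a placeholder, not the authors' pipeline.

```python
# Minimal sketch: cross-validated decoding of object identity from
# population firing rates. Simulated data; shapes and models are
# illustrative placeholders, not the published analysis pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_neurons, n_objects = 200, 80, 4

object_id = rng.integers(0, n_objects, size=n_trials)      # object per trial
tuning = rng.normal(0, 1, size=(n_objects, n_neurons))     # object "signatures"
firing_rates = rng.poisson(np.exp(1.0 + 0.5 * tuning[object_id]))  # trials x neurons

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(decoder, firing_rates, object_id, cv=cv)
print(f"decoding accuracy: {accuracy.mean():.2f} (chance = {1 / n_objects:.2f})")
```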
Affiliation(s)
- Ben Efron
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel
- Athanasios Ntelezos
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel
- Yonatan Katz
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel
- Ilan Lampl
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel
2. Harmon TC, Madlon-Kay S, Pearson J, Mooney R. Vocalization modulates the mouse auditory cortex even in the absence of hearing. Cell Rep 2024; 43:114611. PMID: 39116205. PMCID: PMC11720499. DOI: 10.1016/j.celrep.2024.114611.
Abstract
Vocal communication depends on distinguishing self-generated vocalizations from other sounds. Vocal motor corollary discharge (CD) signals are thought to support this ability by adaptively suppressing auditory cortical responses to auditory feedback. One challenge is that vocalizations, especially those produced during courtship and other social interactions, are accompanied by other movements and are emitted during a state of heightened arousal, factors that could potentially modulate auditory cortical activity. Here, we monitor auditory cortical activity, ultrasonic vocalizations (USVs), and other non-vocal courtship behaviors in a head-fixed male mouse while he interacts with a female mouse. This approach reveals a vocalization-specific signature in the auditory cortex that suppresses the activity of USV playback-excited neurons, emerges before vocal onset, and scales with USV band power. Notably, this vocal modulatory signature is also present in the auditory cortex of congenitally deaf mice, revealing an adaptive vocal CD signal that manifests independently of auditory feedback or auditory experience.
Affiliation(s)
- Thomas C Harmon
- Department of Neurobiology, Duke University, Durham, NC 27710, USA
- Seth Madlon-Kay
- Department of Neurobiology, Duke University, Durham, NC 27710, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC 27708, USA
- John Pearson
- Department of Neurobiology, Duke University, Durham, NC 27710, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC 27708, USA; Department of Biostatistics & Bioinformatics, Duke University, Durham, NC 27710, USA
- Richard Mooney
- Department of Neurobiology, Duke University, Durham, NC 27710, USA
3. Wang B, Audette NJ, Schneider DM, Aljadeff J. Desegregation of neuronal predictive processing. bioRxiv [Preprint] 2024:2024.08.05.606684. PMID: 39149380. PMCID: PMC11326200. DOI: 10.1101/2024.08.05.606684.
Abstract
Neural circuits construct internal 'world-models' to guide behavior. The predictive processing framework posits that neural activity signaling sensory predictions and concurrently computing prediction-errors is a signature of those internal models. Here, to understand how the brain generates predictions for complex sensorimotor signals, we investigate the emergence of high-dimensional, multi-modal predictive representations in recurrent networks. We find that robust predictive processing arises in a network with loose excitatory/inhibitory balance. Contrary to previous proposals of functionally specialized cell-types, the network exhibits desegregation of stimulus and prediction-error representations. We confirmed these model predictions by experimentally probing predictive-coding circuits using a rich stimulus-set to violate learned expectations. When constrained by data, our model further reveals and makes concrete testable experimental predictions for the distinct functional roles of excitatory and inhibitory neurons, and of neurons in different layers along a laminar hierarchy, in computing multi-modal predictions. These results together imply that in natural conditions, neural representations of internal models are highly distributed, yet structured to allow flexible readout of behaviorally-relevant information. The generality of our model advances the understanding of computation of internal models across species, by incorporating different types of predictive computations into a unified framework.
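As a loose illustration of the kind of predictive computation described above (not the authors' network architecture), the toy sketch below drives a random recurrent network with a "motor" command and trains a linear readout to predict the sensory input that the movement causes; the residual plays the role of a prediction error. All sizes and parameters are arbitrary assumptions.

```python
# Toy predictive-processing sketch: a random recurrent network driven by a
# "motor" command learns, via a linear readout, to predict the sensory input
# that the movement causes; the residual plays the role of a prediction error.
# Architecture and parameters are arbitrary illustrations, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
T, N = 2000, 200                                  # timesteps, recurrent units
motor = np.sin(np.linspace(0, 40 * np.pi, T))     # self-generated command
sensory = 0.8 * np.roll(motor, 5)                 # reafference: delayed, scaled copy

W_in = rng.normal(0, 1.0, N)                      # motor -> network weights
W = 0.9 * rng.normal(0, 1.0 / np.sqrt(N), (N, N)) # recurrent weights (stable scale)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * motor[t])          # simple rate dynamics
    states[t] = x

train = slice(0, 1500)                            # fit readout on early timesteps
ridge = 1e-2 * np.eye(N)
w_out = np.linalg.solve(states[train].T @ states[train] + ridge,
                        states[train].T @ sensory[train])

prediction = states @ w_out
prediction_error = sensory - prediction           # mismatch signal
print("held-out error variance / sensory variance:",
      np.var(prediction_error[1500:]) / np.var(sensory[1500:]))
```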
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, CA, 92093, USA
- David M Schneider
- Center for Neural Science, New York University, New York, NY 10003, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, CA, 92093, USA
4. Holey BE, Schneider DM. Sensation and expectation are embedded in mouse motor cortical activity. Cell Rep 2024; 43:114396. PMID: 38923464. PMCID: PMC11304474. DOI: 10.1016/j.celrep.2024.114396.
Abstract
During behavior, the motor cortex sends copies of motor-related signals to sensory cortices. Here, we combine closed-loop behavior with large-scale physiology, projection-pattern-specific recordings, and circuit perturbations to show that neurons in mouse secondary motor cortex (M2) encode sensation and are influenced by expectation. When a movement unexpectedly produces a sound, M2 becomes dominated by sound-evoked activity. Sound responses in M2 are inherited partially from the auditory cortex and are routed back to the auditory cortex, providing a path for the reciprocal exchange of sensory-motor information during behavior. When the acoustic consequences of a movement become predictable, M2 responses to self-generated sounds are selectively gated off. These changes in single-cell responses are reflected in population dynamics, which are influenced by both sensation and expectation. Together, these findings reveal the embedding of sensory and expectation signals in motor cortical activity.
Affiliation(s)
- Brooke E Holey
- Center for Neural Science, New York University, New York, NY 10003, USA; Neuroscience Institute, NYU Medical Center, New York, NY 10016, USA
- David M Schneider
- Center for Neural Science, New York University, New York, NY 10003, USA
5. Beach SD, Tang DL, Kiran S, Niziolek CA. Pars Opercularis Underlies Efferent Predictions and Successful Auditory Feedback Processing in Speech: Evidence From Left-Hemisphere Stroke. Neurobiol Lang (Camb) 2024; 5:454-483. PMID: 38911464. PMCID: PMC11192514. DOI: 10.1162/nol_a_00139.
Abstract
Hearing one's own speech allows for acoustic self-monitoring in real time. Left-hemisphere motor planning regions are thought to give rise to efferent predictions that can be compared to true feedback in sensory cortices, resulting in neural suppression commensurate with the degree of overlap between predicted and actual sensations. Sensory prediction errors thus serve as a possible mechanism of detection of deviant speech sounds, which can then feed back into corrective action, allowing for online control of speech acoustics. The goal of this study was to assess the integrity of this detection-correction circuit in persons with aphasia (PWA) whose left-hemisphere lesions may limit their ability to control variability in speech output. We recorded magnetoencephalography (MEG) while 15 PWA and age-matched controls spoke monosyllabic words and listened to playback of their utterances. From this, we measured speaking-induced suppression of the M100 neural response and related it to lesion profiles and speech behavior. Both speaking-induced suppression and cortical sensitivity to deviance were preserved at the group level in PWA. PWA with more spared tissue in pars opercularis had greater left-hemisphere neural suppression and greater behavioral correction of acoustically deviant pronunciations, whereas sparing of superior temporal gyrus was not related to neural suppression or acoustic behavior. In turn, PWA who made greater corrections had fewer overt speech errors in the MEG task. Thus, the motor planning regions that generate the efferent prediction are integral to performing corrections when that prediction is violated.
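Speaking-induced suppression boils down to comparing the response in an M100 analysis window during speaking with the response to playback of the same utterance. A minimal sketch of that comparison on simulated evoked traces follows; the sampling rate, analysis window, and traces are assumptions for illustration, not the study's MEG pipeline.

```python
# Minimal sketch of speaking-induced suppression (SIS): the M100-window
# response during speaking vs. during playback of the same utterance.
# Traces, sampling rate, and window are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)
fs = 1000                                    # Hz, assumed sampling rate
t = np.arange(-0.2, 0.5, 1 / fs)             # time relative to vowel onset (s)

def evoked(amplitude):
    """Simulate an averaged evoked field with an M100-like peak near 100 ms."""
    peak = amplitude * np.exp(-0.5 * ((t - 0.1) / 0.02) ** 2)
    return peak + rng.normal(0, 0.05, t.size)

speak_trace = evoked(amplitude=0.6)          # talk condition (suppressed)
listen_trace = evoked(amplitude=1.0)         # playback of the same utterance

m100_window = (t >= 0.08) & (t <= 0.12)      # analysis window around the M100
m100_speak = speak_trace[m100_window].mean()
m100_listen = listen_trace[m100_window].mean()

sis = m100_listen - m100_speak               # positive value = suppression
print(f"M100 speak={m100_speak:.2f}, listen={m100_listen:.2f}, SIS={sis:.2f}")
```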
Affiliation(s)
- Ding-lan Tang
- Waisman Center, The University of Wisconsin–Madison
- Academic Unit of Human Communication, Development, and Information Sciences, University of Hong Kong, Hong Kong, SAR China
- Swathi Kiran
- Department of Speech, Language & Hearing Sciences, Boston University
- Caroline A. Niziolek
- Waisman Center, The University of Wisconsin–Madison
- Department of Communication Sciences and Disorders, The University of Wisconsin–Madison
6. Clayton KK, Stecyk KS, Guo AA, Chambers AR, Chen K, Hancock KE, Polley DB. Sound elicits stereotyped facial movements that provide a sensitive index of hearing abilities in mice. Curr Biol 2024; 34:1605-1620.e5. PMID: 38492568. PMCID: PMC11043000. DOI: 10.1016/j.cub.2024.02.057.
Abstract
Sound elicits rapid movements of muscles in the face, ears, and eyes that protect the body from injury and trigger brain-wide internal state changes. Here, we performed quantitative facial videography from mice resting atop a piezoelectric force plate and observed that broadband sounds elicited rapid and stereotyped facial twitches. Facial motion energy (FME) adjacent to the whisker array was 30 dB more sensitive than the acoustic startle reflex and offered greater inter-trial and inter-animal reliability than sound-evoked pupil dilations or movement of other facial and body regions. FME tracked the low-frequency envelope of broadband sounds, providing a means to study behavioral discrimination of complex auditory stimuli, such as speech phonemes in noise. Approximately 25% of layer 5-6 units in the auditory cortex (ACtx) exhibited firing rate changes during facial movements. However, FME facilitation during ACtx photoinhibition indicated that sound-evoked facial movements were mediated by a midbrain pathway and modulated by descending corticofugal input. FME and auditory brainstem response (ABR) thresholds were closely aligned after noise-induced sensorineural hearing loss, yet FME growth slopes were disproportionately steep at spared frequencies, reflecting a central plasticity that matched commensurate changes in ABR wave 4. Sound-evoked facial movements were also hypersensitive in Ptchd1 knockout mice, highlighting the use of FME for identifying sensory hyper-reactivity phenotypes after adult-onset hyperacusis and inherited deficiencies in autism risk genes. These findings present a sensitive and integrative measure of hearing while also highlighting that even low-intensity broadband sounds can elicit a complex mixture of auditory, motor, and reafferent somatosensory neural activity.
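Facial motion energy is essentially frame-to-frame change within a facial region of interest. The sketch below computes an FME trace from a simulated video array and correlates it with a sound envelope; the ROI coordinates, frame rate, and arrays are hypothetical and not the authors' processing code.

```python
# Sketch of a facial motion energy (FME) trace: mean absolute frame-to-frame
# difference inside a whisker-pad region of interest, compared with a sound
# envelope. Video array, ROI coordinates, and frame rate are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n_frames, height, width = 600, 120, 160            # ~2 s at an assumed 300 fps
video = 0.01 * rng.normal(0, 1, (n_frames, height, width)).cumsum(axis=0)

roi = video[:, 40:80, 60:120]                      # whisker-pad ROI (placeholder)
fme = np.abs(np.diff(roi, axis=0)).mean(axis=(1, 2))   # one value per frame pair

# Compare the FME trace with the low-frequency envelope of a broadband sound
sound_envelope = np.abs(np.sin(np.linspace(0, 8 * np.pi, n_frames - 1)))
r = np.corrcoef(fme, sound_envelope)[0, 1]
print(f"FME samples: {fme.size}, correlation with sound envelope: {r:.2f}")
```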
Affiliation(s)
- Kameron K Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Kamryn S Stecyk
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna A Guo
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna R Chambers
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Ke Chen
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Kenneth E Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
7. Tsunada J, Wang X, Eliades SJ. Multiple processes of vocal sensory-motor interaction in primate auditory cortex. Nat Commun 2024; 15:3093. PMID: 38600118. PMCID: PMC11006904. DOI: 10.1038/s41467-024-47510-2.
Abstract
Sensory-motor interactions in the auditory system play an important role in vocal self-monitoring and control. These result from top-down corollary discharges, relaying predictions about vocal timing and acoustics. Recent evidence suggests such signals may be two distinct processes, one suppressing neural activity during vocalization and another enhancing sensitivity to sensory feedback, rather than a single mechanism. Single-neuron recordings have been unable to disambiguate due to overlap of motor signals with sensory inputs. Here, we sought to disentangle these processes in marmoset auditory cortex during production of multi-phrased 'twitter' vocalizations. Temporal responses revealed two timescales of vocal suppression: temporally-precise phasic suppression during phrases and sustained tonic suppression. Both components were present within individual neurons, however, phasic suppression presented broadly regardless of frequency tuning (gating), while tonic was selective for vocal frequencies and feedback (prediction). This suggests that auditory cortex is modulated by concurrent corollary discharges during vocalization, with different computational mechanisms.
Affiliation(s)
- Joji Tsunada
- Auditory and Communication Systems Laboratory, Department of Otorhinolaryngology: Head and Neck Surgery, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA
- Chinese Institute for Brain Research, Beijing, China
- Xiaoqin Wang
- Laboratory of Auditory Neurophysiology, Department of Biomedical Engineering, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Steven J Eliades
- Auditory and Communication Systems Laboratory, Department of Otorhinolaryngology: Head and Neck Surgery, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA
- Department of Head and Neck Surgery & Communication Sciences, Duke University School of Medicine, Durham, NC, USA
8. Morandell K, Yin A, Triana Del Rio R, Schneider DM. Movement-Related Modulation in Mouse Auditory Cortex Is Widespread Yet Locally Diverse. J Neurosci 2024; 44:e1227232024. PMID: 38286628. PMCID: PMC10941236. DOI: 10.1523/jneurosci.1227-23.2024.
Abstract
Neurons in the mouse auditory cortex are strongly influenced by behavior, including both suppression and enhancement of sound-evoked responses during movement. The mouse auditory cortex comprises multiple fields with different roles in sound processing and distinct connectivity to movement-related centers of the brain. Here, we asked whether movement-related modulation in male mice might differ across auditory cortical fields, thereby contributing to the heterogeneity of movement-related modulation at the single-cell level. We used wide-field calcium imaging to identify distinct cortical fields and cellular-resolution two-photon calcium imaging to visualize the activity of layer 2/3 excitatory neurons within each field. We measured each neuron's responses to three sound categories (pure tones, chirps, and amplitude-modulated white noise) as mice rested and ran on a non-motorized treadmill. We found that individual neurons in each cortical field typically respond to just one sound category. Some neurons are only active during rest and others during locomotion, and those that are responsive across conditions retain their sound-category tuning. The effects of locomotion on sound-evoked responses vary at the single-cell level, with both suppression and enhancement of neural responses, and the net modulatory effect of locomotion is largely conserved across cortical fields. Movement-related modulation in auditory cortex also reflects more complex behavioral patterns, including instantaneous running speed and nonlocomotor movements such as grooming and postural adjustments, with similar patterns seen across all auditory cortical fields. Our findings underscore the complexity of movement-related modulation throughout the mouse auditory cortex and indicate that movement-related modulation is a widespread phenomenon.
Affiliation(s)
- Karin Morandell
- Center for Neural Science, New York University, New York, New York 10012
- Audrey Yin
- Center for Neural Science, New York University, New York, New York 10012
- David M Schneider
- Center for Neural Science, New York University, New York, New York 10012
9. Zhou W, Schneider DM. Learning within a sensory-motor circuit links action to expected outcome. bioRxiv [Preprint] 2024:2024.02.08.579532. PMID: 38370770. PMCID: PMC10871315. DOI: 10.1101/2024.02.08.579532.
Abstract
The cortex integrates sound- and movement-related signals to predict the acoustic consequences of behavior and detect violations from expectations. Although expectation- and prediction-related activity has been observed in the auditory cortex of humans, monkeys, and mice during vocal and non-vocal acoustic behaviors, the specific cortical circuitry required for forming memories, recalling expectations, and making predictions remains unknown. By combining closed-loop behavior, electrophysiological recordings, longitudinal pharmacology, and targeted optogenetic circuit activation, we identify a cortical locus for the emergence of expectation and error signals. Movement-related expectation signals and sound-related error signals emerge in parallel in the auditory cortex and are concentrated in largely distinct neurons, consistent with a compartmentalization of different prediction-related computations. On a trial-by-trial basis, expectation and error signals are correlated in auditory cortex, consistent with a local circuit implementation of an internal model. Silencing the auditory cortex during motor-sensory learning prevents the emergence of expectation signals and error signals, revealing the auditory cortex as a necessary node for learning to make predictions. Prediction-like signals can be experimentally induced in the auditory cortex, even in the absence of behavioral experience, by pairing optogenetic motor cortical activation with sound playback, indicating that cortical circuits are sufficient for movement-like predictive processing. Finally, motor-sensory experience realigns the manifold dimensions in which auditory cortical populations encode movement and sound, consistent with predictive processing. These findings show that prediction-related signals reshape auditory cortex dynamics during behavior and reveal a cortical locus for the emergence of expectation and error.
Affiliation(s)
- WenXi Zhou
- Center for Neural Science, New York University, New York, NY, 10012
10. Rummell BP, Bikas S, Babl SS, Gogos JA, Sigurdsson T. Altered corollary discharge signaling in the auditory cortex of a mouse model of schizophrenia predisposition. Nat Commun 2023; 14:7388. PMID: 37968289. PMCID: PMC10651874. DOI: 10.1038/s41467-023-42964-2.
Abstract
The ability to distinguish sensations that are self-generated from those caused by external events is disrupted in schizophrenia patients. However, the neural circuit abnormalities underlying this sensory impairment and its relationship to the risk factors for the disease are not well understood. To address this, we examined the processing of self-generated sounds in male Df(16)A+/- mice, which model one of the largest genetic risk factors for schizophrenia, the 22q11.2 microdeletion. We find that auditory cortical neurons in Df(16)A+/- mice fail to attenuate their responses to self-generated sounds, recapitulating deficits seen in schizophrenia patients. Notably, the auditory cortex of Df(16)A+/- mice displayed weaker motor-related signals and received fewer inputs from the motor cortex, suggesting an anatomical basis underlying the sensory deficit. These results provide insights into the mechanisms by which a major genetic risk factor for schizophrenia disrupts the top-down processing of sensory information.
Affiliation(s)
- Brian P Rummell
- Institute of Neurophysiology, Goethe University, Theodor-Stern Kai 7, 60590, Frankfurt, Germany
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528, Frankfurt am Main, Germany
- Solmaz Bikas
- Institute of Neurophysiology, Goethe University, Theodor-Stern Kai 7, 60590, Frankfurt, Germany
- Susanne S Babl
- Institute of Neurophysiology, Goethe University, Theodor-Stern Kai 7, 60590, Frankfurt, Germany
- Joseph A Gogos
- Mortimer B. Zuckerman Mind Brain and Behavior Institute, Columbia University, New York, NY, 10027, USA
- Departments of Physiology, Neuroscience and Psychiatry, Vagelos College of Physicians & Surgeons, Columbia University, New York, NY, 10032, USA
- Torfi Sigurdsson
- Institute of Neurophysiology, Goethe University, Theodor-Stern Kai 7, 60590, Frankfurt, Germany
11. Audette NJ, Schneider DM. Stimulus-Specific Prediction Error Neurons in Mouse Auditory Cortex. J Neurosci 2023; 43:7119-7129. PMID: 37699716. PMCID: PMC10601367. DOI: 10.1523/jneurosci.0512-23.2023.
Abstract
Comparing expectation with experience is an important neural computation performed throughout the brain and is a hallmark of predictive processing. Experiments that alter the sensory outcome of an animal's behavior reveal enhanced neural responses to unexpected self-generated stimuli, indicating that populations of neurons in sensory cortex may reflect prediction errors (PEs), mismatches between expectation and experience. However, enhanced neural responses to self-generated stimuli could also arise through nonpredictive mechanisms, such as the movement-based facilitation of a neuron's inherent sound responses. If sensory prediction error neurons exist in sensory cortex, it is unknown whether they manifest as general error responses, or respond with specificity to errors in distinct stimulus dimensions. To answer these questions, we trained mice of either sex to expect the outcome of a simple sound-generating behavior and recorded auditory cortex activity as mice heard either the expected sound or sounds that deviated from expectation in one of multiple distinct dimensions. Our data reveal that the auditory cortex learns to suppress responses to self-generated sounds along multiple acoustic dimensions simultaneously. We identify a distinct population of auditory cortex neurons that are not responsive to passive sounds or to the expected sound but that encode prediction errors. These prediction error neurons are abundant only in animals with a learned motor-sensory expectation, and encode one or two specific violations rather than a generic error signal. Together, these findings reveal that cortical predictions about self-generated sounds have specificity in multiple simultaneous dimensions and that cortical prediction error neurons encode specific violations from expectation. SIGNIFICANCE STATEMENT: Audette et al. record neural activity in the auditory cortex while mice perform a sound-generating forelimb movement and measure neural responses to sounds that violate an animal's expectation in different ways. They find that predictions about self-generated sounds are highly specific across multiple stimulus dimensions and that a population of typically nonsound-responsive neurons respond to sounds that violate an animal's expectation in a specific way. These results identify specific prediction error (PE) signals in the mouse auditory cortex and suggest that errors may be calculated early in sensory processing.
Affiliation(s)
- Nicholas J Audette
- Center for Neural Science, New York University, New York, New York 10003
- David M Schneider
- Center for Neural Science, New York University, New York, New York 10003
12. Beño-Ruiz-de-la-Sierra RM, Arjona-Valladares A, Fondevila Estevez S, Fernández-Linsenbarth I, Díez Á, Molina V. Corollary discharge function in healthy controls: Evidence about self-speech and external speech processing. Eur J Neurosci 2023; 58:3705-3713. PMID: 37635264. DOI: 10.1111/ejn.16125.
Abstract
As we speak, corollary discharge mechanisms suppress the auditory conscious perception of the self-generated voice in healthy subjects. This suppression has been associated with the attenuation of the auditory N1 component. To analyse this corollary discharge phenomenon (agency and ownership), we registered the event-related potentials of 42 healthy subjects. The N1 and P2 components were elicited by spoken vowels (talk condition; agency), by played-back vowels recorded with their own voice (listen-self condition; ownership) and by played-back vowels recorded with an external voice (listen-other condition). The N1 amplitude elicited by the talk condition was smaller compared with the listen-self and listen-other conditions. There were no amplitude differences in N1 between listen-self and listen-other conditions. The P2 component did not show differences between conditions. Additionally, a peak latency analysis of N1 and P2 components between the three conditions showed no differences. These findings corroborate previous results showing that the corollary discharge mechanisms dampen sensory responses to self-generated speech (agency experience) and provide new neurophysiological evidence about the similarities in the processing of played-back vowels with our own voice (ownership experience) and with an external voice.
Affiliation(s)
- Álvaro Díez
- Department of Psychiatry, School of Medicine, University of Valladolid, Valladolid, Spain
- Vicente Molina
- Department of Psychiatry, School of Medicine, University of Valladolid, Valladolid, Spain
- Psychiatry Service, University Clinical Hospital of Valladolid, Valladolid, Spain
13. Holey BE, Schneider DM. Sensation and expectation are embedded in mouse motor cortical activity. bioRxiv [Preprint] 2023:2023.09.13.557633. PMID: 37745573. PMCID: PMC10515891. DOI: 10.1101/2023.09.13.557633.
Abstract
During behavior, the motor cortex sends copies of motor-related signals to sensory cortices. It remains unclear whether these corollary discharge signals strictly encode movement or whether they also encode sensory experience and expectation. Here, we combine closed-loop behavior with large-scale physiology, projection-pattern specific recordings, and circuit perturbations to show that neurons in mouse secondary motor cortex (M2) encode sensation and are influenced by expectation. When a movement unexpectedly produces a sound, M2 becomes dominated by sound-evoked activity. Sound responses in M2 are inherited partially from the auditory cortex and are routed back to the auditory cortex, providing a path for the dynamic exchange of sensory-motor information during behavior. When the acoustic consequences of a movement become predictable, M2 responses to self-generated sounds are selectively gated off. These changes in single-cell responses are reflected in population dynamics, which are influenced by both sensation and expectation. Together, these findings reveal the rich embedding of sensory and expectation signals in motor cortical activity.
Affiliation(s)
- Brooke E Holey
- Center for Neural Science, New York University
- Neuroscience Institute, NYU Medical Center
14. Vivaldo CA, Lee J, Shorkey M, Keerthy A, Rothschild G. Auditory cortex ensembles jointly encode sound and locomotion speed to support sound perception during movement. PLoS Biol 2023; 21:e3002277. PMID: 37651461. PMCID: PMC10499203. DOI: 10.1371/journal.pbio.3002277.
Abstract
The ability to process and act upon incoming sounds during locomotion is critical for survival and adaptive behavior. Despite the established role that the auditory cortex (AC) plays in behavior- and context-dependent sound processing, previous studies have found that auditory cortical activity is on average suppressed during locomotion as compared to immobility. While suppression of auditory cortical responses to self-generated sounds results from corollary discharge, which weakens responses to predictable sounds, the functional role of weaker responses to unpredictable external sounds during locomotion remains unclear. In particular, whether suppression of external sound-evoked responses during locomotion reflects reduced involvement of the AC in sound processing or whether it results from masking by an alternative neural computation in this state remains unresolved. Here, we tested the hypothesis that rather than simple inhibition, reduced sound-evoked responses during locomotion reflect a tradeoff with the emergence of explicit and reliable coding of locomotion velocity. To test this hypothesis, we first used neural inactivation in behaving mice and found that the AC plays a critical role in sound-guided behavior during locomotion. To investigate the nature of this processing, we used two-photon calcium imaging of local excitatory auditory cortical neural populations in awake mice. We found that locomotion had diverse influences on activity of different neurons, with a net suppression of baseline-subtracted sound-evoked responses and neural stimulus detection, consistent with previous studies. Importantly, we found that the net inhibitory effect of locomotion on baseline-subtracted sound-evoked responses was strongly shaped by elevated ongoing activity that compressed the response dynamic range, and that rather than reflecting enhanced "noise," this ongoing activity reliably encoded the animal's locomotion speed. Decoding analyses revealed that locomotion speed and sound are robustly co-encoded by auditory cortical ensemble activity. Finally, we found consistent patterns of joint coding of sound and locomotion speed in electrophysiologically recorded activity in freely moving rats. Together, our data suggest that rather than being suppressed by locomotion, auditory cortical ensembles explicitly encode it alongside sound information to support sound perception during locomotion.
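The joint read-out described above can be sketched as two read-outs from the same simulated ensemble: a classifier for sound identity and a regression for locomotion speed. The additive mixing model, shapes, and estimator choices below are illustrative assumptions, not the paper's analysis.

```python
# Sketch of joint coding: the same simulated population activity is read out
# twice, once to classify sound identity and once to regress locomotion speed.
# Shapes, the additive mixing, and the estimators are illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score, KFold
from sklearn.linear_model import LogisticRegression, Ridge

rng = np.random.default_rng(4)
n_trials, n_neurons = 300, 60
sound_id = rng.integers(0, 3, n_trials)                 # three sound stimuli
speed = rng.uniform(0, 30, n_trials)                    # locomotion speed (cm/s)

sound_weights = rng.normal(0, 1, (3, n_neurons))
speed_weights = rng.normal(0, 0.05, n_neurons)
activity = (sound_weights[sound_id]                     # sound-evoked component
            + np.outer(speed, speed_weights)            # speed-dependent component
            + rng.normal(0, 0.5, (n_trials, n_neurons)))  # trial-to-trial noise

cv = KFold(n_splits=5, shuffle=True, random_state=0)
sound_acc = cross_val_score(LogisticRegression(max_iter=1000),
                            activity, sound_id, cv=cv).mean()
speed_r2 = cross_val_score(Ridge(alpha=1.0), activity, speed,
                           cv=cv, scoring="r2").mean()
print(f"sound decoding accuracy: {sound_acc:.2f}, speed R^2: {speed_r2:.2f}")
```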
Affiliation(s)
- Carlos Arturo Vivaldo
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Joonyeup Lee
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- MaryClaire Shorkey
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Ajay Keerthy
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Gideon Rothschild
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Kresge Hearing Research Institute and Department of Otolaryngology—Head and Neck Surgery, University of Michigan, Ann Arbor, Michigan, United States of America
15. Keshavarzi S, Velez-Fort M, Margrie TW. Cortical Integration of Vestibular and Visual Cues for Navigation, Visual Processing, and Perception. Annu Rev Neurosci 2023; 46:301-320. PMID: 37428601. PMCID: PMC7616138. DOI: 10.1146/annurev-neuro-120722-100503.
Abstract
Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within cortical sensory representation and how they might be relied upon for sensory-driven decision-making, during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves both the cortical representation and perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on cortical circuits involved in visual perception and spatial navigation and highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and access to such information by the cortex is used for sensory perception and predictions that may be implemented for rapid, navigation-related decision-making.
Affiliation(s)
- Sepiedeh Keshavarzi
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Mateo Velez-Fort
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Troy W Margrie
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
16. Morandell K, Yin A, Del Rio RT, Schneider DM. Movement-related modulation in mouse auditory cortex is widespread yet locally diverse. bioRxiv [Preprint] 2023:2023.07.03.547560. PMID: 37461568. PMCID: PMC10349927. DOI: 10.1101/2023.07.03.547560.
Abstract
Neurons in the mouse auditory cortex are strongly influenced by behavior, including both suppression and enhancement of sound-evoked responses during movement. The mouse auditory cortex comprises multiple fields with different roles in sound processing and distinct connectivity to movement-related centers of the brain. Here, we asked whether movement-related modulation might differ across auditory cortical fields, thereby contributing to the heterogeneity of movement-related modulation at the single-cell level. We used wide-field calcium imaging to identify distinct cortical fields followed by cellular-resolution two-photon calcium imaging to visualize the activity of layer 2/3 excitatory neurons within each field. We measured each neuron's responses to three sound categories (pure tones, chirps, and amplitude modulated white noise) as mice rested and ran on a non-motorized treadmill. We found that individual neurons in each cortical field typically respond to just one sound category. Some neurons are only active during rest and others during locomotion, and those that are responsive across conditions retain their sound-category tuning. The effects of locomotion on sound-evoked responses vary at the single-cell level, with both suppression and enhancement of neural responses, and the net modulatory effect of locomotion is largely conserved across cortical fields. Movement-related modulation in auditory cortex also reflects more complex behavioral patterns, including instantaneous running speed and non-locomotor movements such as grooming and postural adjustments, with similar patterns seen across all auditory cortical fields. Our findings underscore the complexity of movement-related modulation throughout the mouse auditory cortex and indicate that movement-related modulation is a widespread phenomenon.
Affiliation(s)
- Karin Morandell
- Center for Neural Science, New York University, New York, NY 10012
- Audrey Yin
- Center for Neural Science, New York University, New York, NY 10012
17. Price BH, Jensen CM, Khoudary AA, Gavornik JP. Expectation violations produce error signals in mouse V1. Cereb Cortex 2023; 33:8803-8820. PMID: 37183176. PMCID: PMC10321125. DOI: 10.1093/cercor/bhad163.
Abstract
Repeated exposure to visual sequences changes the form of evoked activity in the primary visual cortex (V1). Predictive coding theory provides a potential explanation for this, namely that plasticity shapes cortical circuits to encode spatiotemporal predictions and that subsequent responses are modulated by the degree to which actual inputs match these expectations. Here we use a recently developed statistical modeling technique called Model-Based Targeted Dimensionality Reduction (MbTDR) to study visually evoked dynamics in mouse V1 in the context of an experimental paradigm called "sequence learning." We report that evoked spiking activity changed significantly with training, in a manner generally consistent with the predictive coding framework. Neural responses to expected stimuli were suppressed in a late window (100-150 ms) after stimulus onset following training, whereas responses to novel stimuli were not. Substituting a novel stimulus for a familiar one led to increases in firing that persisted for at least 300 ms. Omitting predictable stimuli in trained animals also led to increased firing at the expected time of stimulus onset. Finally, we show that spiking data can be used to accurately decode time within the sequence. Our findings are consistent with the idea that plasticity in early visual circuits is involved in coding spatiotemporal information.
Affiliation(s)
- Byron H Price
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Graduate Program in Neuroscience, Boston University, Boston, MA 02215, USA
- Cambria M Jensen
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Anthony A Khoudary
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Jeffrey P Gavornik
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Graduate Program in Neuroscience, Boston University, Boston, MA 02215, USA
18. Audette NJ, Schneider DM. Stimulus-specific prediction error neurons in mouse auditory cortex. bioRxiv [Preprint] 2023:2023.01.06.523032. PMID: 36711690. PMCID: PMC9881916. DOI: 10.1101/2023.01.06.523032.
Abstract
Comparing expectation with experience is an important neural computation performed throughout the brain and is a hallmark of predictive processing. Experiments that alter the sensory outcome of an animal's behavior reveal enhanced neural responses to unexpected self-generated stimuli, indicating that populations of neurons in sensory cortex may reflect prediction errors - mismatches between expectation and experience. However, enhanced neural responses to self-generated stimuli could also arise through non-predictive mechanisms, such as the movement-based facilitation of a neuron's inherent sound responses. If sensory prediction error neurons exist in sensory cortex, it is unknown whether they manifest as general error responses, or respond with specificity to errors in distinct stimulus dimensions. To answer these questions, we trained mice to expect the outcome of a simple sound-generating behavior and recorded auditory cortex activity as mice heard either the expected sound or sounds that deviated from expectation in one of multiple distinct dimensions. Our data reveal that the auditory cortex learns to suppress responses to self-generated sounds along multiple acoustic dimensions simultaneously. We identify a distinct population of auditory cortex neurons that are not responsive to passive sounds or to the expected sound but that explicitly encode prediction errors. These prediction error neurons are abundant only in animals with a learned motor-sensory expectation, and encode one or two specific violations rather than a generic error signal.
Affiliation(s)
- Nicholas J Audette
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
- David M Schneider
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
19. English G, Ghasemi Nejad N, Sommerfelt M, Yanik MF, von der Behrens W. Bayesian surprise shapes neural responses in somatosensory cortical circuits. Cell Rep 2023; 42:112009. PMID: 36701237. DOI: 10.1016/j.celrep.2023.112009.
Abstract
Numerous psychophysical studies show that Bayesian inference governs sensory decision-making; however, the specific neural circuitry underlying this probabilistic mechanism remains unclear. We record extracellular neural activity along the somatosensory pathway of mice while delivering sensory stimulation paradigms designed to isolate the response to the surprise generated by Bayesian inference. Our results demonstrate that laminar cortical circuits in early sensory areas encode Bayesian surprise. Systematic sensitivity to surprise is not identified in the somatosensory thalamus, rather emerging in the primary (S1) and secondary (S2) somatosensory cortices. Multiunit spiking activity and evoked potentials in layer 6 of these regions exhibit the highest sensitivity to surprise. Gamma power in S1 layer 2/3 exhibits an NMDAR-dependent scaling with surprise, as does alpha power in layers 2/3 and 6 of S2. These results show a precise spatiotemporal neural representation of Bayesian surprise and suggest that Bayesian inference is a fundamental component of cortical processing.
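Bayesian surprise is commonly quantified as the Kullback-Leibler divergence between the belief held before and after an observation. The sketch below computes it for a simple standard/deviant sequence under Dirichlet-categorical belief updating; the flat prior and the example sequence are arbitrary choices for illustration, not the paper's stimulus statistics.

```python
# Sketch of Bayesian surprise for a standard/deviant stimulus sequence:
# surprise = KL(posterior || prior) under Dirichlet-categorical belief updating.
# The flat prior and the example sequence are arbitrary illustrations.
import numpy as np
from scipy.special import gammaln, digamma

def dirichlet_kl(post, prior):
    """KL divergence between two Dirichlet distributions, KL(post || prior)."""
    return (gammaln(post.sum()) - gammaln(prior.sum())
            - np.sum(gammaln(post) - gammaln(prior))
            + np.sum((post - prior) * (digamma(post) - digamma(post.sum()))))

sequence = [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0]    # 0 = standard, 1 = deviant
counts = np.ones(2)                                # Dirichlet(1, 1) prior counts

for stim in sequence:
    prior = counts.copy()
    counts[stim] += 1                              # posterior after the observation
    surprise = dirichlet_kl(counts, prior)
    print(f"stimulus {stim}: Bayesian surprise = {surprise:.3f} nats")
```

With these choices, deviants late in the sequence produce larger surprise than repeated standards, which is the qualitative effect the stimulation paradigms were designed to isolate.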
Affiliation(s)
- Gwendolyn English
- Institute of Neuroinformatics, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland; ZNZ Neuroscience Center Zurich, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland
- Newsha Ghasemi Nejad
- Institute of Neuroinformatics, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland; ZNZ Neuroscience Center Zurich, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland
- Marcel Sommerfelt
- Institute of Neuroinformatics, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland
- Mehmet Fatih Yanik
- Institute of Neuroinformatics, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland; ZNZ Neuroscience Center Zurich, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland
- Wolfger von der Behrens
- Institute of Neuroinformatics, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland; ZNZ Neuroscience Center Zurich, ETH Zurich & University of Zurich, 8057 Zurich, Switzerland
20. Kyzar EJ, Denfield GH. Taking subjectivity seriously: towards a unification of phenomenology, psychiatry, and neuroscience. Mol Psychiatry 2023; 28:10-16. PMID: 36460728. PMCID: PMC10130907. DOI: 10.1038/s41380-022-01891-2.
Abstract
Nearly all psychiatric diseases involve alterations in subjective, lived experience. The scientific study of the biological basis of mental illness has generally focused on objective measures and observable behaviors, limiting the potential for our understanding of brain mechanisms of disease states and possible treatments. However, applying methods designed principally to interpret objective behavioral measures to the measurement and extrapolation of subjective states presents a number of challenges. In order to help bridge this gap, we draw on the tradition of phenomenology, a philosophical movement concerned with elucidating the structure of lived experience, which emerged in the early 20th century and influenced philosophy of mind, cognitive science, and psychiatry. A number of early phenomenologically-oriented psychiatrists made influential contributions to the field, but this approach retreated to the background as psychiatry moved towards more operationalized disease classifications. Recently, clinical-phenomenological research and viewpoints have re-emerged in the field. We argue that the potential for phenomenological research and methods to generate productive hypotheses about the neurobiological basis of psychiatric diseases has thus far been underappreciated. Using specific examples drawing on the subjective experience of mania and psychosis, we demonstrate that phenomenologically-oriented clinical studies can generate novel and fruitful propositions for neuroscientific investigation. Additionally, we outline a proposal for more rigorously integrating phenomenological investigations of subjective experience with the methods of modern neuroscience research, advocating a cross-species approach with a key role for human subjects research. Collaborative interaction between phenomenology, psychiatry, and neuroscience has the potential to move these fields towards a unified understanding of the biological basis of mental illness.
Affiliation(s)
- Evan J Kyzar
- Department of Psychiatry, Columbia University, New York, NY, USA; Research Foundation for Mental Hygiene, Menands, NY, USA; New York State Psychiatric Institute, 1051 Riverside Drive, New York, NY, USA
- George H Denfield
- Department of Psychiatry, Columbia University, New York, NY, USA; Research Foundation for Mental Hygiene, Menands, NY, USA; New York State Psychiatric Institute, 1051 Riverside Drive, New York, NY, USA
21. Audette NJ, Zhou W, La Chioma A, Schneider DM. Precise movement-based predictions in the mouse auditory cortex. Curr Biol 2022; 32:4925-4940.e6. PMID: 36283411. PMCID: PMC9691550. DOI: 10.1016/j.cub.2022.09.064.
Abstract
Many of the sensations experienced by an organism are caused by their own actions, and accurately anticipating both the sensory features and timing of self-generated stimuli is crucial to a variety of behaviors. In the auditory cortex, neural responses to self-generated sounds exhibit frequency-specific suppression, suggesting that movement-based predictions may be implemented early in sensory processing. However, it remains unknown whether this modulation results from a behaviorally specific and temporally precise prediction, nor is it known whether corresponding expectation signals are present locally in the auditory cortex. To address these questions, we trained mice to expect the precise acoustic outcome of a forelimb movement using a closed-loop sound-generating lever. Dense neuronal recordings in the auditory cortex revealed suppression of responses to self-generated sounds that was specific to the expected acoustic features, to a precise position within the movement, and to the movement that was coupled to sound during training. Prediction-based suppression was concentrated in L2/3 and L5, where deviations from expectation also recruited a population of prediction-error neurons that was otherwise unresponsive. Recording in the absence of sound revealed abundant movement signals in deep layers that were biased toward neurons tuned to the expected sound, as well as expectation signals that were present throughout the cortex and peaked at the time of expected auditory feedback. Together, these findings identify distinct populations of auditory cortical neurons with movement, expectation, and error signals consistent with a learned internal model linking an action to its specific acoustic outcome.
Affiliation(s)
- Nicholas J Audette
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
- WenXi Zhou
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
- Alessandro La Chioma
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
- David M Schneider
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
22. Billig AJ, Lad M, Sedley W, Griffiths TD. The hearing hippocampus. Prog Neurobiol 2022; 218:102326. PMID: 35870677. PMCID: PMC10510040. DOI: 10.1016/j.pneurobio.2022.102326.
Abstract
The hippocampus has a well-established role in spatial and episodic memory but a broader function has been proposed including aspects of perception and relational processing. Neural bases of sound analysis have been described in the pathway to auditory cortex, but wider networks supporting auditory cognition are still being established. We review what is known about the role of the hippocampus in processing auditory information, and how the hippocampus itself is shaped by sound. In examining imaging, recording, and lesion studies in species from rodents to humans, we uncover a hierarchy of hippocampal responses to sound including during passive exposure, active listening, and the learning of associations between sounds and other stimuli. We describe how the hippocampus' connectivity and computational architecture allow it to track and manipulate auditory information - whether in the form of speech, music, or environmental, emotional, or phantom sounds. Functional and structural correlates of auditory experience are also identified. The extent of auditory-hippocampal interactions is consistent with the view that the hippocampus makes broad contributions to perception and cognition, beyond spatial and episodic memory. More deeply understanding these interactions may unlock applications including entraining hippocampal rhythms to support cognition, and intervening in links between hearing loss and dementia.
Affiliation(s)
- Meher Lad
- Translational and Clinical Research Institute, Newcastle University Medical School, Newcastle upon Tyne, UK
- William Sedley
- Translational and Clinical Research Institute, Newcastle University Medical School, Newcastle upon Tyne, UK
- Timothy D Griffiths
- Biosciences Institute, Newcastle University Medical School, Newcastle upon Tyne, UK; Wellcome Centre for Human Neuroimaging, UCL Queen Square Institute of Neurology, University College London, London, UK; Human Brain Research Laboratory, Department of Neurosurgery, University of Iowa Hospitals and Clinics, Iowa City, USA
23. Abram SV, Hua JPY, Ford JM. Consider the pons: bridging the gap on sensory prediction abnormalities in schizophrenia. Trends Neurosci 2022; 45:798-808. PMID: 36123224. PMCID: PMC9588719. DOI: 10.1016/j.tins.2022.08.008.
Abstract
A shared mechanism across species heralds the arrival of self-generated sensations, helping the brain to anticipate, and therefore distinguish, self-generated from externally generated sensations. In mammals, this sensory prediction mechanism is supported by communication within a cortico-ponto-cerebellar-thalamo-cortical loop. Schizophrenia is associated with impaired sensory prediction as well as abnormal structural and functional connections between nodes in this circuit. Despite the pons' principal role in relaying and processing sensory information passed from the cortex to cerebellum, few studies have examined pons connectivity in schizophrenia. Here, we first briefly describe how the pons contributes to sensory prediction. We then summarize schizophrenia-related abnormalities in the cortico-ponto-cerebellar-thalamo-cortical loop, emphasizing the dearth of research on the pons relative to thalamic and cerebellar connections. We conclude with recommendations for advancing our understanding of how the pons relates to sensory prediction failures in schizophrenia.
Collapse
Affiliation(s)
- Samantha V Abram
- San Francisco Veterans Affairs Medical Center, San Francisco, CA, USA; University of California, San Francisco, CA, USA
| | - Jessica P Y Hua
- San Francisco Veterans Affairs Medical Center, San Francisco, CA, USA; University of California, San Francisco, CA, USA; Sierra Pacific Mental Illness Research Education and Clinical Centers, San Francisco Veterans Affairs Medical Center, San Francisco, CA, USA; Department of Psychiatry and Behavioral Sciences, The University of California, San Francisco, CA, USA
| | - Judith M Ford
- San Francisco Veterans Affairs Medical Center, San Francisco, CA, USA; University of California, San Francisco, CA, USA.
| |
Collapse
|
24
|
Olsen T, Hasenstaub AR. Offset Responses in the Auditory Cortex Show Unique History Dependence. J Neurosci 2022; 42:7370-7385. [PMID: 35999053 PMCID: PMC9525174 DOI: 10.1523/jneurosci.0494-22.2022] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2022] [Revised: 08/10/2022] [Accepted: 08/13/2022] [Indexed: 11/21/2022] Open
Abstract
Sensory responses typically vary depending on the recent history of sensory experience. This is essential for processes, including adaptation, efficient coding, and change detection. In the auditory cortex (AC), the short-term history dependence of sound-evoked (onset) responses has been well characterized. Yet many AC neurons also respond to sound terminations, and little is known about the history dependence of these "offset" responses, whether the short-term dynamics of onset and offset responses are correlated, or how these properties are distributed among cell types. Here we presented awake male and female mice with repeating noise burst stimuli while recording single-unit activity from primary AC. We identified parvalbumin and somatostatin interneurons through optotagging, and also separated narrow-spiking from broad-spiking units. We found that offset responses are typically less depressive than onset responses, and this result was robust to a variety of stimulus parameters, controls, measurement types, and selection criteria. Whether a cell's onset response facilitates or depresses does not predict whether its offset response facilitates or depresses. Cell types differed in the dynamics of their onset responses, and in the prevalence, but not the dynamics, of their offset responses. Finally, we clustered cells according to spiking responses and found that response clusters were associated with cell type. Each cluster contained cells of several types, but even within a cluster, cells often showed cell type-specific response dynamics. We conclude that onset and offset responses are differentially influenced by recent sound history, and discuss the implications of this for the encoding of ongoing sound stimuli. SIGNIFICANCE STATEMENT: Sensory neuron responses depend on stimulus history. This history dependence is crucial for sensory processing, is precisely controlled at individual synapses and circuits, and is adaptive to the specific requirements of different sensory systems. In the auditory cortex, neurons respond to sound cessation as well as to sound itself, but how history dependence is used along this separate, "offset" information stream is unknown. We show that offset responses are more facilitatory than sound responses, even in neurons where sound responses depress. In contrast to sound onset responses, offset responses are absent in many cells, are relatively homogeneous, and show no cell type-specific differences in history dependence. Offset responses thus show unique response dynamics, suggesting their unique functions.
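History dependence of the kind described above is often summarized by comparing responses late in a stimulus train with the response to the first stimulus. The Python sketch below illustrates one generic facilitation/depression index of this sort; the convention, the function name, and the toy onset/offset values are ours and are not taken from the study.

    import numpy as np

    def history_dependence_index(rates_per_repeat):
        """Illustrative facilitation/depression index for responses measured
        across successive stimuli in a repeating train (e.g. onset or offset
        spike counts for repeats 1..N of a noise burst).

            index = (late - first) / (late + first)

        'late' is the mean response over the final repeats. Positive values
        indicate facilitation, negative values depression. This is a generic
        convention, not necessarily the exact metric used in the study.
        """
        r = np.asarray(rates_per_repeat, float)
        first, late = r[0], r[-3:].mean()
        return (late - first) / (late + first)

    # Toy example: an onset response that depresses across repeats and an
    # offset response that does not.
    onset = [20, 14, 11, 9, 8, 8]
    offset = [6, 6, 7, 7, 6, 7]
    print(history_dependence_index(onset), history_dependence_index(offset))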
Collapse
Affiliation(s)
- Timothy Olsen
- Coleman Memorial Laboratory
- Department of Otolaryngology-Head and Neck Surgery, University of California, San Francisco, California 94143
| | - Andrea R Hasenstaub
- Coleman Memorial Laboratory
- Department of Otolaryngology-Head and Neck Surgery, University of California, San Francisco, California 94143
| |
Collapse
|
25
|
Echolocation-related reversal of information flow in a cortical vocalization network. Nat Commun 2022; 13:3642. [PMID: 35752629 PMCID: PMC9233670 DOI: 10.1038/s41467-022-31230-6] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2021] [Accepted: 05/30/2022] [Indexed: 11/09/2022] Open
Abstract
The mammalian frontal and auditory cortices are important for vocal behavior. Here, using local-field potential recordings, we demonstrate that the timing and spatial patterns of oscillations in the fronto-auditory network of vocalizing bats (Carollia perspicillata) predict the purpose of vocalization: echolocation or communication. Transfer entropy analyses revealed predominant top-down (frontal-to-auditory cortex) information flow during spontaneous activity and pre-vocal periods. The dynamics of information flow depend on the behavioral role of the vocalization and on the timing relative to vocal onset. We observed the emergence of predominant bottom-up (auditory-to-frontal) information transfer during the post-vocal period specific to echolocation pulse emission, leading to self-directed acoustic feedback. Electrical stimulation of frontal areas selectively enhanced responses to sounds in auditory cortex. These results reveal unique changes in information flow across sensory and frontal cortices, potentially driven by the purpose of the vocalization in a highly vocal mammalian model.
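The directionality analysis summarized above rests on transfer entropy between frontal and auditory cortical field potentials. Purely as an illustration of the general quantity (not the authors' pipeline), the Python sketch below implements a plug-in, equal-width-binned transfer-entropy estimator with a history of one sample; the variable names, bin count, and toy signals are all assumptions.

    import numpy as np

    def transfer_entropy(x, y, n_bins=8):
        """Estimate TE(x -> y) in bits for two 1-D signals, history length 1.

        Plug-in estimator: discretize both signals into n_bins amplitude bins,
        then evaluate
            TE = sum p(y_t+1, y_t, x_t) * log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ].
        """
        def discretize(s):
            edges = np.linspace(s.min(), s.max(), n_bins + 1)
            return np.clip(np.digitize(s, edges) - 1, 0, n_bins - 1)

        xd = discretize(np.asarray(x, float))
        yd = discretize(np.asarray(y, float))
        y_next, y_past, x_past = yd[1:], yd[:-1], xd[:-1]

        # Joint histogram over (y_next, y_past, x_past).
        joint, _ = np.histogramdd(
            np.column_stack([y_next, y_past, x_past]),
            bins=(n_bins, n_bins, n_bins),
            range=[(-0.5, n_bins - 0.5)] * 3,
        )
        p_xyz = joint / joint.sum()                  # p(y_next, y_past, x_past)
        p_yz = p_xyz.sum(axis=2, keepdims=True)      # p(y_next, y_past)
        p_zx = p_xyz.sum(axis=0, keepdims=True)      # p(y_past, x_past)
        p_z = p_xyz.sum(axis=(0, 2), keepdims=True)  # p(y_past)

        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = (p_xyz * p_z) / (p_yz * p_zx)    # p(y+|y,x) / p(y+|y)
            terms = np.where(p_xyz > 0, p_xyz * np.log2(ratio), 0.0)
        return float(np.nansum(terms))

    # Toy check: x drives y with a one-sample lag, so TE(x -> y) > TE(y -> x).
    rng = np.random.default_rng(0)
    x = rng.standard_normal(20000)
    y = np.roll(x, 1) + 0.5 * rng.standard_normal(20000)
    print(transfer_entropy(x, y), transfer_entropy(y, x))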
Collapse
|
26
|
Abstract
The last decade has seen the emergence of new theoretical frameworks to explain pathological fatigue, a much-neglected yet highly significant symptom across a wide range of diseases. While the new models of fatigue provide new hypotheses to test, they also raise a number of questions. The primary purpose of this essay is to examine the predictions of three recently proposed models of fatigue, the overlap and differences between them, and the evidence from diseases that may lend support to the models of fatigue. I also present expansions for the sensory attenuation model of fatigue. Further questions examined here are the following: What are the neural substrates of fatigue? How can sensory attenuation, which underpins agency, also explain fatigue? Are fatigue and agency related?
Collapse
Affiliation(s)
- Annapoorna Kuppuswamy
- Department of Clinical and Movement Neuroscience, Institute of Neurology, University College London, London, UK
| |
Collapse
|
27
|
Lesicko AMH, Angeloni CF, Blackwell JM, De Biasi M, Geffen MN. Cortico-fugal regulation of predictive coding. eLife 2022; 11:73289. [PMID: 35290181 PMCID: PMC8983050 DOI: 10.7554/elife.73289] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2021] [Accepted: 03/12/2022] [Indexed: 11/13/2022] Open
Abstract
Sensory systems must account for both contextual factors and prior experience to adaptively engage with the dynamic external environment. In the central auditory system, neurons modulate their responses to sounds based on statistical context. These response modulations can be understood through a hierarchical predictive coding lens: responses to repeated stimuli are progressively decreased, in a process known as repetition suppression, whereas unexpected stimuli produce a prediction error signal. Prediction error incrementally increases along the auditory hierarchy from the inferior colliculus (IC) to the auditory cortex (AC), suggesting that these regions may engage in hierarchical predictive coding. A potential substrate for top-down predictive cues is the massive set of descending projections from the auditory cortex to subcortical structures, although the role of this system in predictive processing has never been directly assessed. We tested the effect of optogenetic inactivation of the auditory cortico-collicular feedback in awake mice on responses of IC neurons to stimuli designed to test prediction error and repetition suppression. Inactivation of the cortico-collicular pathway led to a decrease in prediction error in IC. Repetition suppression was unaffected by cortico-collicular inactivation, suggesting that this metric may reflect fatigue of bottom-up sensory inputs rather than predictive processing. We also discovered populations of IC units that exhibit repetition enhancement, a sequential increase in firing with stimulus repetition. Cortico-collicular inactivation led to a decrease in repetition enhancement in the central nucleus of IC, suggesting that it is a top-down phenomenon. Negative prediction error, a stronger response to a tone in a predictable rather than unpredictable sequence, was suppressed in shell IC units during cortico-collicular inactivation. These changes in predictive coding metrics arose from bidirectional modulations in the response to the standard and deviant contexts, such that units in IC responded more similarly to each context in the absence of cortical input. We also investigated how these metrics compare between the anesthetized and awake states by recording from the same units under both conditions. We found that metrics of predictive coding and deviance detection differ depending on the anesthetic state of the animal, with negative prediction error emerging in the central IC and repetition enhancement and prediction error being more prevalent in the absence of anesthesia. Overall, our results demonstrate that the auditory cortex provides cues about the statistical context of sound to subcortical brain regions via direct feedback, regulating processing of both prediction and repetition.
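For readers unfamiliar with how prediction error is separated from repetition suppression using a control sequence, the short Python sketch below shows one common indexing convention (deviant vs. control and control vs. standard responses). The normalization and the toy firing rates are assumptions; the study's exact metrics may differ.

    import numpy as np

    def deviance_indices(dev, std, ctrl):
        """Illustrative per-unit indices separating prediction error from
        repetition suppression, given trial-averaged firing rates to the
        deviant, standard, and control (many-standards/cascade) tones.

            iPE = (dev - ctrl) / (dev + ctrl)   # deviant vs. control
            iRS = (ctrl - std) / (ctrl + std)   # control vs. standard

        Positive iPE suggests prediction error signaling; positive iRS
        suggests repetition suppression of the standard.
        """
        dev, std, ctrl = (np.asarray(a, float) for a in (dev, std, ctrl))
        return (dev - ctrl) / (dev + ctrl), (ctrl - std) / (ctrl + std)

    # Toy rates (spikes/s) for three hypothetical IC units.
    ipe, irs = deviance_indices(dev=[12.0, 8.0, 20.0],
                                std=[4.0, 7.5, 6.0],
                                ctrl=[6.0, 8.0, 18.0])
    print(ipe, irs)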
Collapse
Affiliation(s)
- Alexandria M H Lesicko
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, United States
| | | | - Jennifer M Blackwell
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, United States
| | - Mariella De Biasi
- Department of Psychiatry, University of Pennsylvania, Philadelphia, United States
| | - Maria N Geffen
- Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, United States
| |
Collapse
|
28
|
Braga A, Schönwiesner M. Neural Substrates and Models of Omission Responses and Predictive Processes. Front Neural Circuits 2022; 16:799581. [PMID: 35177967 PMCID: PMC8844463 DOI: 10.3389/fncir.2022.799581] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2021] [Accepted: 01/05/2022] [Indexed: 11/24/2022] Open
Abstract
Predictive coding theories argue that deviance detection phenomena, such as mismatch responses and omission responses, are generated by predictive processes with possibly overlapping neural substrates. Molecular imaging and electrophysiology studies of mismatch responses and corollary discharge in the rodent model allowed the development of mechanistic and computational models of these phenomena. These models enable translation between human and non-human animal research and help to uncover fundamental features of change-processing microcircuitry in the neocortex. This microcircuitry is characterized by stimulus-specific adaptation and feedforward inhibition of stimulus-selective populations of pyramidal neurons and interneurons, with specific contributions from different interneuron types. The overlap of the substrates of different types of responses to deviant stimuli remains to be understood. Omission responses, which are observed both in corollary discharge and mismatch response protocols in humans, are underutilized in animal research and may be pivotal in uncovering the substrates of predictive processes. Omission studies comprise a range of methods centered on the withholding of an expected stimulus. This review aims to provide an overview of omission protocols and showcase their potential to integrate and complement the different models and procedures employed to study prediction and deviance detection. This approach may reveal the biological foundations of core concepts of predictive coding, and allow an empirical test of the framework's promise to unify theoretical models of attention and perception.
Collapse
Affiliation(s)
- Alessandro Braga
- Institute of Biology, Faculty of Life Sciences, University of Leipzig, Leipzig, Germany
- International Max Planck Research School, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
| | - Marc Schönwiesner
- Institute of Biology, Faculty of Life Sciences, University of Leipzig, Leipzig, Germany
- International Laboratory for Research on Brain, Music, and Sound (BRAMS), Université de Montréal, Montreal, QC, Canada
| |
Collapse
|
29
|
Parthasharathy M, Mantini D, Orban de Xivry JJ. Increased upper-limb sensory attenuation with age. J Neurophysiol 2021; 127:474-492. [PMID: 34936521 DOI: 10.1152/jn.00558.2020] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
The pressure of our own finger on the arm feels different from the same pressure exerted by an external agent: the latter involves just touch, whereas the former involves a combination of touch and predictive output from the internal model of the body. This internal model predicts the movement of our own finger, and hence the intensity of the sensation of the finger press is decreased. A decrease in intensity of a self-produced stimulus is called sensory attenuation. It has been reported that, due to decreased proprioception with age and an increased reliance on the prediction of the internal model, sensory attenuation is increased in older adults. In this study, we used a force-matching paradigm to test whether sensory attenuation is also present over the arm and whether aging increases sensory attenuation. We demonstrated that, while both young and older adults overestimate a self-produced force, older adults overestimate it even more, showing increased sensory attenuation. We also found that both younger and older adults self-produce higher forces when activating the homologous muscles of the upper limb. While this is traditionally viewed as evidence for an increased reliance on internal model function in older adults because of decreased proprioception, proprioception appeared unimpaired in our older participants. This raises the question of whether an age-related decrease in proprioception is really responsible for the increased sensory attenuation observed in older people.
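As an illustration of how attenuation is usually quantified in a force-matching paradigm, the Python sketch below compares overshoot in the self-generated (direct press) and externally generated (e.g. slider) conditions. The function and argument names are ours and the numbers are invented toy values, not data from the study.

    import numpy as np

    def force_matching_attenuation(target, reproduced_direct, reproduced_external):
        """Summarize sensory attenuation as the extra overshoot when forces are
        reproduced by a direct self-generated press, relative to reproduction
        via an external device (slider/joystick control condition).
        """
        target = np.asarray(target, float)
        overshoot_direct = np.mean(np.asarray(reproduced_direct, float) - target)
        overshoot_external = np.mean(np.asarray(reproduced_external, float) - target)
        return overshoot_direct - overshoot_external

    # Toy example: 2 N targets; self-produced presses overshoot more than slider presses.
    att = force_matching_attenuation(
        target=[2.0, 2.0, 2.0, 2.0],
        reproduced_direct=[2.9, 3.1, 2.8, 3.0],
        reproduced_external=[2.2, 2.4, 2.1, 2.3],
    )
    print(f"attenuation = {att:.2f} N")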
Collapse
Affiliation(s)
- Manasa Parthasharathy
- Motor Control and Neuroplasticity Research group, Department of Movement Sciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
| | - Dante Mantini
- Motor Control and Neuroplasticity Research group, Department of Movement Sciences, KU Leuven, Leuven, Belgium; Brain Imaging and Neural Dynamics Research Group, IRCCS San Camillo Hospital, Venice, Italy
| | - Jean-Jacques Orban de Xivry
- Motor Control and Neuroplasticity Research group, Department of Movement Sciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
| |
Collapse
|
30
|
Stripeikyte G, Pereira M, Rognini G, Potheegadoo J, Blanke O, Faivre N. Increased Functional Connectivity of the Intraparietal Sulcus Underlies the Attenuation of Numerosity Estimations for Self-Generated Words. J Neurosci 2021; 41:8917-8927. [PMID: 34497152 PMCID: PMC8549530 DOI: 10.1523/jneurosci.3164-20.2021] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2020] [Revised: 06/29/2021] [Accepted: 07/01/2021] [Indexed: 11/21/2022] Open
Abstract
Previous studies have shown that self-generated stimuli in auditory, visual, and somatosensory domains are attenuated, producing decreased behavioral and neural responses compared with the same stimuli that are externally generated. Yet, whether such attenuation also occurs for higher-level cognitive functions beyond sensorimotor processing remains unknown. In this study, we assessed whether cognitive functions such as numerosity estimations are subject to attenuation in 56 healthy participants (32 women). We designed a task allowing the controlled comparison of numerosity estimations for self-generated (active condition) and externally generated (passive condition) words. Our behavioral results showed a larger underestimation of self-generated compared with externally generated words, suggesting that numerosity estimations for self-generated words are attenuated. Moreover, the linear relationship between the reported and actual number of words was stronger for self-generated words, although the ability to track errors about numerosity estimations was similar across conditions. Neuroimaging results revealed that numerosity underestimation involved increased functional connectivity between the right intraparietal sulcus and an extended network (bilateral supplementary motor area, left inferior parietal lobule, and left superior temporal gyrus) when estimating the number of self-generated versus externally generated words. We interpret our results in light of two models of attenuation and discuss their perceptual versus cognitive origins. SIGNIFICANCE STATEMENT: We perceive sensory events as less intense when they are self-generated compared with when they are externally generated. This phenomenon, called attenuation, enables us to distinguish sensory events from self and external origins. Here, we designed a novel fMRI paradigm to assess whether cognitive processes such as numerosity estimations are also subject to attenuation. When asking participants to estimate the number of words they had generated or passively heard, we found greater underestimation in the former case, providing behavioral evidence of attenuation. Attenuation was associated with increased functional connectivity of the intraparietal sulcus, a region involved in numerosity processing. Together, our results indicate that the attenuation of self-generated stimuli is not limited to sensory consequences but also impacts cognitive processes such as numerosity estimations.
Collapse
Affiliation(s)
- Giedre Stripeikyte
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
| | - Michael Pereira
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
- Laboratoire de Psychologie et NeuroCognition, CNRS, Univ. Grenoble Alpes, CNRS, LPNC, 38000 Grenoble, France
| | - Giulio Rognini
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
| | - Jevita Potheegadoo
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
| | - Olaf Blanke
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
- Department of Neurology, University of Geneva, CH-1211 Geneva, Switzerland
| | - Nathan Faivre
- Center for Neuroprosthetics, Swiss Federal Institute of Technology (EPFL), CH-1202 Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
- Laboratoire de Psychologie et NeuroCognition, CNRS, Univ. Grenoble Alpes, CNRS, LPNC, 38000 Grenoble, France
| |
Collapse
|
31
|
Klee JL, Souza BC, Battaglia FP. Learning differentially shapes prefrontal and hippocampal activity during classical conditioning. eLife 2021; 10:e65456. [PMID: 34665131 PMCID: PMC8545395 DOI: 10.7554/elife.65456] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2020] [Accepted: 10/10/2021] [Indexed: 11/25/2022] Open
Abstract
The ability to use sensory cues to inform goal-directed actions is a critical component of behavior. To study how sounds guide anticipatory licking during classical conditioning, we employed high-density electrophysiological recordings from the hippocampal CA1 area and the prefrontal cortex (PFC) in mice. CA1 and PFC neurons undergo distinct learning-dependent changes at the single-cell level and maintain representations of cue identity at the population level. In addition, reactivation of task-related neuronal assemblies during hippocampal awake Sharp-Wave Ripples (aSWRs) changed within individual sessions in CA1 and over the course of multiple sessions in PFC. Despite both areas being highly engaged and synchronized during the task, we found no evidence for coordinated single cell or assembly activity during conditioning trials or aSWR. Taken together, our findings support the notion that persistent firing and reactivation of task-related neural activity patterns in CA1 and PFC support learning during classical conditioning.
Collapse
Affiliation(s)
- Jan L Klee
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
| | - Bryan C Souza
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
| | - Francesco P Battaglia
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
| |
Collapse
|
32
|
Muzzu T, Saleem AB. Feature selectivity can explain mismatch signals in mouse visual cortex. Cell Rep 2021; 37:109772. [PMID: 34610298 PMCID: PMC8655498 DOI: 10.1016/j.celrep.2021.109772] [Citation(s) in RCA: 17] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2021] [Revised: 07/28/2021] [Accepted: 09/09/2021] [Indexed: 11/23/2022] Open
Abstract
Sensory experience often depends on one's own actions, including self-motion. Theories of predictive coding postulate that actions are regulated by calculating prediction error, which is the difference between sensory experience and expectation based on self-generated actions. Signals consistent with prediction error have been reported in the mouse visual cortex (V1) when visual flow coupled to running was unexpectedly stopped. Here, we show that such signals can be elicited by visual stimuli that are uncoupled from the animal's running. We record V1 neurons while presenting drifting gratings that unexpectedly stop. We find strong responses to visual perturbations, which are enhanced during running. Perturbation responses are strongest in the preferred orientation of individual neurons, and perturbation-responsive neurons are more likely to prefer slow visual speeds. Our results indicate that prediction error signals can be explained by the convergence of known motor and sensory signals, providing a purely sensory and motor explanation for purported mismatch signals.
Collapse
Affiliation(s)
- Tomaso Muzzu
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, 26 Bedford Way, London WC1H 0AP, UK.
| | - Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, 26 Bedford Way, London WC1H 0AP, UK.
| |
Collapse
|
33
|
The posterior auditory field is the chief generator of prediction error signals in the auditory cortex. Neuroimage 2021; 242:118446. [PMID: 34352393 DOI: 10.1016/j.neuroimage.2021.118446] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2021] [Revised: 07/26/2021] [Accepted: 08/02/2021] [Indexed: 01/13/2023] Open
Abstract
The auditory cortex (AC) encompasses distinct fields subserving partly different aspects of sound processing. One essential function of the AC is the detection of unpredicted sounds, as revealed by differential neural activity to predictable and unpredictable sounds. According to the predictive coding framework, this effect can be explained by repetition suppression and/or prediction error signaling. The present study investigates functional specialization of the rat AC fields in repetition suppression and prediction error by combining a tone frequency oddball paradigm (involving high-probable standard and low-probable deviant tones) with two different control sequences (many-standards and cascade). Tones in the control sequences were comparable to deviant events with respect to neural adaptation but were not violating a regularity. Therefore, a difference in the neural activity between deviant and control tones indicates a prediction error effect, whereas a difference between control and standard tones indicates a repetition suppression effect. Single-unit recordings revealed by far the largest prediction error effects for the posterior auditory field, while the primary auditory cortex, the anterior auditory field, the ventral auditory field, and the suprarhinal auditory field were dominated by repetition suppression effects. Statistically significant repetition suppression effects occurred in all AC fields, whereas prediction error effects were less robust in the primary auditory cortex and the anterior auditory field. Results indicate that the non-lemniscal, posterior auditory field is more engaged in context-dependent processing underlying deviance-detection than the other AC fields, which are more sensitive to stimulus-dependent effects underlying differential degrees of neural adaptation.
Collapse
|
34
|
Clayton KK, Asokan MM, Watanabe Y, Hancock KE, Polley DB. Behavioral Approaches to Study Top-Down Influences on Active Listening. Front Neurosci 2021; 15:666627. [PMID: 34305516 PMCID: PMC8299106 DOI: 10.3389/fnins.2021.666627] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2021] [Accepted: 06/09/2021] [Indexed: 11/21/2022] Open
Abstract
The massive network of descending corticofugal projections has long been recognized by anatomists, but the functional contributions of these projections to sound processing and auditory-guided behaviors remain a mystery. Most efforts to characterize the auditory corticofugal system have been inductive, wherein function is inferred from a few studies employing a wide range of methods to manipulate varying limbs of the descending system in a variety of species and preparations. An alternative approach, which we focus on here, is to first establish auditory-guided behaviors that reflect the contribution of top-down influences on auditory perception. To this end, we postulate that auditory corticofugal systems may contribute to active listening behaviors in which the timing of bottom-up sound cues can be predicted from top-down signals arising from cross-modal cues, temporal integration, or self-initiated movements. Here, we describe a behavioral framework for investigating how auditory perceptual performance is enhanced when subjects can anticipate the timing of upcoming target sounds. Our first paradigm, studied both in human subjects and mice, reports species-specific differences in visually cued expectation of sound onset in a signal-in-noise detection task. A second paradigm performed in mice reveals the benefits of temporal regularity as a perceptual grouping cue when detecting repeating target tones in complex background noise. A final behavioral approach demonstrates significant improvements in frequency discrimination threshold and perceptual sensitivity when auditory targets are presented at a predictable temporal interval following motor self-initiation of the trial. Collectively, these three behavioral approaches identify paradigms to study top-down influences on sound perception that are amenable to head-fixed preparations in genetically tractable animals, where it is possible to monitor and manipulate particular nodes of the descending auditory pathway with unparalleled precision.
Collapse
Affiliation(s)
- Kameron K. Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
| | - Meenakshi M. Asokan
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
| | - Yurika Watanabe
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
| | - Kenneth E. Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Department of Otolaryngology – Head and Neck Surgery, Harvard Medical School, Boston, MA, United States
| | - Daniel B. Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Department of Otolaryngology – Head and Neck Surgery, Harvard Medical School, Boston, MA, United States
| |
Collapse
|
35
|
Li Y, Wang X, Li Z, Chen J, Qin L. Effect of locomotion on the auditory steady state response of head-fixed mice. World J Biol Psychiatry 2021; 22:362-372. [PMID: 32901530 DOI: 10.1080/15622975.2020.1814409] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
OBJECTIVES: Electroencephalographic (EEG) examinations of the auditory steady-state response (ASSR) can non-invasively probe the capacity of cortical circuits to generate the gamma-band (40 Hz) oscillation, and they are increasingly applied in neurophysiological studies of rodent models of psychiatric disorders. Although it is well established that brain activity is strongly modulated by behavioural state (such as locomotion), how the ASSR is affected remains unclear. METHODS: We investigated the effect of locomotion by recording local field potentials (LFPs) evoked by 40-Hz click trains in multiple brain areas: auditory cortex (AC), medial geniculate body (MGB), hippocampus (HP) and prefrontal cortex (PFC), in head-fixed mice free to run on a treadmill. LFPs recorded during spontaneous movement were compared with those recorded during stationary periods. RESULTS: In both the auditory (AC and MGB) and non-auditory areas (HP and PFC), locomotion reduced the initial negative deflection of the LFP (the early response, 0-100 ms from stimulus onset) but had no significant effect on ASSR phase-locking during the late stimulus period (100-500 ms). CONCLUSIONS: Our results suggest that different neural mechanisms contribute to the early response and the ASSR, and that the ASSR is a more robust biomarker for investigating the pathogenesis of neuropsychiatric disorders.
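One minimal way to quantify the 40-Hz ASSR phase-locking referred to above is inter-trial phase coherence at the stimulation frequency. The Python sketch below shows such a computation under assumptions we introduce (a single-frequency Fourier coefficient per trial, a 100-500 ms analysis window, simulated data); it is not the exact analysis used in the study.

    import numpy as np

    def assr_phase_locking(trials, fs, freq=40.0, t_start=0.1, t_stop=0.5):
        """Inter-trial phase coherence (ITC) at the ASSR frequency.

        trials : array (n_trials, n_samples), time-locked to click-train onset.
        fs     : sampling rate in Hz.
        Returns a value between 0 (random phases) and 1 (perfect phase-locking).
        """
        trials = np.asarray(trials, float)
        i0, i1 = int(t_start * fs), int(t_stop * fs)
        seg = trials[:, i0:i1]
        t = np.arange(seg.shape[1]) / fs
        # Complex Fourier coefficient at `freq` for each trial.
        coef = seg @ np.exp(-2j * np.pi * freq * t)
        # Magnitude of the mean unit phase vector across trials.
        return np.abs(np.mean(coef / np.abs(coef)))

    # Simulated data: a 40 Hz response with phase jitter plus noise.
    rng = np.random.default_rng(1)
    fs, n_trials, n_samples = 1000, 60, 600
    t = np.arange(n_samples) / fs
    trials = np.array([np.sin(2 * np.pi * 40 * t + rng.normal(0, 0.4))
                       + rng.standard_normal(n_samples) for _ in range(n_trials)])
    print(assr_phase_locking(trials, fs))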
Collapse
Affiliation(s)
- Yingzhuo Li
- Department of Physiology, China Medical University, Shenyang, PR China
| | - Xuejiao Wang
- Department of Physiology, China Medical University, Shenyang, PR China
| | - Zijie Li
- Department of Physiology, China Medical University, Shenyang, PR China
| | - Jingyu Chen
- Department of Physiology, China Medical University, Shenyang, PR China
| | - Ling Qin
- Department of Physiology, China Medical University, Shenyang, PR China
| |
Collapse
|
36
|
Henschke JU, Price AT, Pakan JMP. Enhanced modulation of cell-type specific neuronal responses in mouse dorsal auditory field during locomotion. Cell Calcium 2021; 96:102390. [PMID: 33744780 DOI: 10.1016/j.ceca.2021.102390] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2021] [Revised: 03/05/2021] [Accepted: 03/10/2021] [Indexed: 11/16/2022]
Abstract
As we move through the environment we experience constantly changing sensory input that must be merged with our ongoing motor behaviors, creating dynamic interactions between our sensory and motor systems. Active behaviors such as locomotion generally increase the sensory-evoked neuronal activity in visual and somatosensory cortices, but evidence suggests that locomotion largely suppresses neuronal responses in the auditory cortex. However, whether this effect is ubiquitous across different anatomical regions of the auditory cortex is largely unknown. In mice, auditory association fields such as the dorsal auditory cortex (AuD) have been shown to have different physiological response properties, protein expression patterns, and cortical as well as subcortical connections, in comparison to primary auditory regions (A1), suggesting there may be important functional differences. Here we examined locomotion-related modulation of neuronal activity in cortical layers 2/3 of AuD and A1 using two-photon Ca2+ imaging in head-fixed behaving mice that are able to freely run on a spherical treadmill. We determined the proportion of neurons in these two auditory regions that show enhanced and suppressed sensory-evoked responses during locomotion and quantified the depth of modulation. We found that A1 shows more suppression and AuD more enhanced responses during locomotion periods. We further revealed differences in the circuitry between these auditory regions and motor cortex, and found that AuD is more highly connected to motor cortical regions. Finally, we compared the cell-type specific locomotion-evoked modulation of responses in AuD and found that, while subpopulations of PV-expressing interneurons showed heterogeneous responses, the population in general was largely suppressed during locomotion, while excitatory population responses were generally enhanced in AuD. Therefore, neurons in primary and dorsal auditory fields have distinct response properties, with dorsal regions exhibiting enhanced activity in response to movement. This functional distinction may be important for auditory processing during navigation and acoustically guided behavior.
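Locomotion-related enhancement or suppression of sensory-evoked responses, as described above, is often summarized per cell with a modulation index. The short Python sketch below shows one widely used convention; the exact metric in the study may differ, and the responses shown are invented toy values.

    import numpy as np

    def locomotion_modulation_index(resp_running, resp_stationary):
        """Per-cell locomotion modulation index:

            LMI = (R_run - R_still) / (R_run + R_still)

        +1 means responses occur only during locomotion (enhancement),
        -1 means responses occur only while stationary (suppression).
        """
        r_run = np.asarray(resp_running, float)
        r_still = np.asarray(resp_stationary, float)
        return (r_run - r_still) / (r_run + r_still)

    # Toy sound-evoked dF/F responses for four cells.
    lmi = locomotion_modulation_index(resp_running=[0.8, 0.2, 0.5, 0.05],
                                      resp_stationary=[0.4, 0.6, 0.5, 0.20])
    print(lmi)  # positive = enhanced, negative = suppressed during running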
Collapse
Affiliation(s)
- Julia U Henschke
- Institute of Cognitive Neurology and Dementia Research, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany; German Centre for Neurodegenerative Diseases, Leipziger Str. 44, 39120, Magdeburg, Germany
| | - Alan T Price
- Institute of Cognitive Neurology and Dementia Research, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany; German Centre for Neurodegenerative Diseases, Leipziger Str. 44, 39120, Magdeburg, Germany; Cognitive Neurophysiology group, Leibniz Institute for Neurobiology (LIN), 39118, Magdeburg, Germany
| | - Janelle M P Pakan
- Institute of Cognitive Neurology and Dementia Research, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany; German Centre for Neurodegenerative Diseases, Leipziger Str. 44, 39120, Magdeburg, Germany; Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany.
| |
Collapse
|
37
|
Altmann CF, Yamasaki D, Song Y, Bucher B. Processing of self-initiated sound motion in the human brain. Brain Res 2021; 1762:147433. [PMID: 33737062 DOI: 10.1016/j.brainres.2021.147433] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2020] [Revised: 03/10/2021] [Accepted: 03/11/2021] [Indexed: 12/01/2022]
Abstract
Interacting with objects in our environment usually leads to audible noise. Brain responses to such self-initiated sounds have been shown to be attenuated, in particular the so-called N1 component measured with electroencephalography (EEG). This attenuation has been proposed to be the effect of an internal forward model that allows for cancellation of the sensory consequences of a motor command. In the current study we asked whether the attenuation due to self-initiation of a sound also affects a later event-related potential, the so-called motion-onset response, which arises in response to moving sounds. To this end, volunteers were instructed to move their index fingers either leftward or rightward, which resulted in virtual movement of a sound either to the left or to the right. In Experiment 1, sound motion was induced with in-ear headphones by shifting interaural time and intensity differences and thus shifting the intracranial sound image. We compared the motion-onset responses under two conditions: a) congruent, and b) incongruent. In the congruent condition, the sound image moved in the direction of the finger movement, while in the incongruent condition sound motion was in the opposite direction of the finger movement. Clear motion-onset responses with a negative cN1 component peaking at about 160 ms and a positive cP2 component peaking at about 230 ms after motion onset were obtained for both the congruent and incongruent conditions. However, the motion-onset responses did not significantly differ between congruent and incongruent conditions in amplitude or latency. In Experiment 2, in which sounds were presented with loudspeakers, we observed attenuation for self-induced versus externally triggered sound motion onset, but again, there was no difference between congruent and incongruent conditions. In sum, these two experiments suggest that the motion-onset response measured by EEG can be attenuated for self-generated sounds. However, our results did not indicate that this attenuation depended on the congruency of action and sound motion direction.
Collapse
Affiliation(s)
- Christian F Altmann
- Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto 606-8507, Japan; Parkinson-Klinik Ortenau, 77709 Wolfach, Germany.
| | - Daiki Yamasaki
- Department of Psychology, Graduate School of Letters, Kyoto University, Kyoto 606-8501, Japan; Japan Society for the Promotion of Science, Tokyo 102-0083, Japan
| | - Yunqing Song
- Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto 606-8507, Japan
| | - Benoit Bucher
- Department of Psychology, Graduate School of Letters, Kyoto University, Kyoto 606-8501, Japan
| |
Collapse
|
38
|
Clayton KK, Williamson RS, Hancock KE, Tasaka GI, Mizrahi A, Hackett T, Polley DB. Auditory Corticothalamic Neurons Are Recruited by Motor Preparatory Inputs. Curr Biol 2021; 31:310-321.e5. [PMID: 33157020 PMCID: PMC7855066 DOI: 10.1016/j.cub.2020.10.027] [Citation(s) in RCA: 46] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2020] [Revised: 10/06/2020] [Accepted: 10/08/2020] [Indexed: 12/22/2022]
Abstract
Corticothalamic (CT) neurons comprise the largest component of the descending sensory corticofugal pathway, but their contributions to brain function and behavior remain an unsolved mystery. To address the hypothesis that layer 6 (L6) CTs may be activated by extra-sensory inputs prior to anticipated sounds, we performed optogenetically targeted single-unit recordings and two-photon imaging of Ntsr1-Cre+ L6 CT neurons in the primary auditory cortex (A1) while mice were engaged in an active listening task. We found that L6 CTs and other L6 units began spiking hundreds of milliseconds prior to orofacial movements linked to sound presentation and reward, but not to other movements such as locomotion, which were not linked to an explicit behavioral task. Rabies tracing of monosynaptic inputs to A1 L6 CT neurons revealed a narrow strip of cholinergic and non-cholinergic projection neurons in the external globus pallidus, suggesting a potential source of motor-related input. These findings identify new pathways and local circuits for motor modulation of sound processing and suggest a new role for CT neurons in active sensing.
Collapse
Affiliation(s)
- Kameron K. Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston MA 02114 USA
| | - Ross S. Williamson
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston MA 02114 USA
- Dept. Otolaryngology, Harvard Medical School, Boston MA 02114 USA
| | - Kenneth E. Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston MA 02114 USA
- Dept. Otolaryngology, Harvard Medical School, Boston MA 02114 USA
| | - Gen-ichi Tasaka
- The Edmond and Lily Safra Center for Brain Sciences, Dept. Neurobiology, Hebrew University of Jerusalem, Jerusalem ISR
| | - Adi Mizrahi
- The Edmond and Lily Safra Center for Brain Sciences, Dept. Neurobiology, Hebrew University of Jerusalem, Jerusalem ISR
| | - Troy Hackett
- Dept. Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville TN, 37203 USA
| | - Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Boston MA 02114 USA
- Dept. Otolaryngology, Harvard Medical School, Boston MA 02114 USA
| |
Collapse
|
39
|
Asilador A, Llano DA. Top-Down Inference in the Auditory System: Potential Roles for Corticofugal Projections. Front Neural Circuits 2021; 14:615259. [PMID: 33551756 PMCID: PMC7862336 DOI: 10.3389/fncir.2020.615259] [Citation(s) in RCA: 29] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/08/2020] [Accepted: 12/17/2020] [Indexed: 01/28/2023] Open
Abstract
It has become widely accepted that humans use contextual information to infer the meaning of ambiguous acoustic signals. In speech, for example, high-level semantic, syntactic, or lexical information shapes our understanding of a phoneme buried in noise. Most current theories to explain this phenomenon rely on hierarchical predictive coding models involving a set of Bayesian priors emanating from high-level brain regions (e.g., prefrontal cortex) that are used to influence processing at lower levels of the cortical sensory hierarchy (e.g., auditory cortex). As such, virtually all proposed models to explain top-down facilitation are focused on intracortical connections, and consequently, subcortical nuclei have scarcely been discussed in this context. However, subcortical auditory nuclei receive massive, heterogeneous, and cascading descending projections at every level of the sensory hierarchy, and activation of these systems has been shown to improve speech recognition. It is not yet clear whether or how top-down modulation to resolve ambiguous sounds calls upon these corticofugal projections. Here, we review the literature on top-down modulation in the auditory system, primarily focused on humans and cortical imaging/recording methods, and attempt to relate these findings to a growing animal literature, which has primarily been focused on corticofugal projections. We argue that corticofugal pathways contain the requisite circuitry to implement predictive coding mechanisms to facilitate perception of complex sounds and that top-down modulation at early (i.e., subcortical) stages of processing complements modulation at later (i.e., cortical) stages of processing. Finally, we suggest experimental approaches for future studies on this topic.
Collapse
Affiliation(s)
- Alexander Asilador
- Neuroscience Program, The University of Illinois at Urbana-Champaign, Champaign, IL, United States
- Beckman Institute for Advanced Science and Technology, Urbana, IL, United States
| | - Daniel A. Llano
- Neuroscience Program, The University of Illinois at Urbana-Champaign, Champaign, IL, United States
- Beckman Institute for Advanced Science and Technology, Urbana, IL, United States
- Molecular and Integrative Physiology, The University of Illinois at Urbana-Champaign, Champaign, IL, United States
| |
Collapse
|
40
|
Casado-Román L, Carbajal GV, Pérez-González D, Malmierca MS. Prediction error signaling explains neuronal mismatch responses in the medial prefrontal cortex. PLoS Biol 2020; 18:e3001019. [PMID: 33347436 PMCID: PMC7785337 DOI: 10.1371/journal.pbio.3001019] [Citation(s) in RCA: 40] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2020] [Revised: 01/05/2021] [Accepted: 12/03/2020] [Indexed: 02/06/2023] Open
Abstract
The mismatch negativity (MMN) is a key biomarker of automatic deviance detection thought to emerge from 2 cortical sources. First, the auditory cortex (AC) encodes spectral regularities and reports frequency-specific deviances. Then, more abstract representations in the prefrontal cortex (PFC) allow the detection of contextual changes of potential behavioral relevance. However, the precise locations of, and time asynchronies between, the neuronal correlates underlying this frontotemporal network remain unclear. Our study presented auditory oddball paradigms along with "no-repetition" controls to record mismatch responses in neuronal spiking activity and local field potentials at the rat medial PFC. Whereas mismatch responses in the auditory system are mainly induced by stimulus-dependent effects, we found that auditory responsiveness in the PFC was driven by unpredictability, yielding context-dependent, comparatively delayed, more robust and longer-lasting mismatch responses mostly comprised of prediction error signaling activity. This characteristically different composition ruled out the possibility that mismatch responses in the PFC are simply inherited or amplified downstream from the auditory system. Conversely, it is more plausible for the PFC to exert top-down influences on the AC, since the PFC exhibited flexible and potent predictive processing, capable of suppressing redundant input more efficiently than the AC. Remarkably, the time course of the mismatch responses we observed in the spiking activity and local field potentials of the AC and the PFC combined coincided with the time course of the large-scale MMN-like signals reported in the rat brain, thereby linking the microscopic, mesoscopic, and macroscopic levels of automatic deviance detection.
Collapse
Affiliation(s)
- Lorena Casado-Román
- Cognitive and Auditory Neuroscience Laboratory (CANELAB), Institute of Neuroscience of Castilla y León (INCYL), Salamanca, Spain
- Institute for Biomedical Research of Salamanca (IBSAL), Salamanca, Spain
| | - Guillermo V. Carbajal
- Cognitive and Auditory Neuroscience Laboratory (CANELAB), Institute of Neuroscience of Castilla y León (INCYL), Salamanca, Spain
- Institute for Biomedical Research of Salamanca (IBSAL), Salamanca, Spain
| | - David Pérez-González
- Cognitive and Auditory Neuroscience Laboratory (CANELAB), Institute of Neuroscience of Castilla y León (INCYL), Salamanca, Spain
- Institute for Biomedical Research of Salamanca (IBSAL), Salamanca, Spain
| | - Manuel S. Malmierca
- Cognitive and Auditory Neuroscience Laboratory (CANELAB), Institute of Neuroscience of Castilla y León (INCYL), Salamanca, Spain
- Institute for Biomedical Research of Salamanca (IBSAL), Salamanca, Spain
- Department of Biology and Pathology, Faculty of Medicine, University of Salamanca, Salamanca, Spain
| |
Collapse
|
41
|
Guitchounts G, Masís J, Wolff SB, Cox D. Encoding of 3D Head Orienting Movements in the Primary Visual Cortex. Neuron 2020; 108:512-525.e4. [DOI: 10.1016/j.neuron.2020.07.014] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2020] [Revised: 06/11/2020] [Accepted: 07/13/2020] [Indexed: 10/23/2022]
|
42
|
Hua L, Recasens M, Grent-'t-Jong T, Adams RA, Gross J, Uhlhaas PJ. Investigating cortico-subcortical circuits during auditory sensory attenuation: A combined magnetoencephalographic and dynamic causal modeling study. Hum Brain Mapp 2020; 41:4419-4430. [PMID: 32662585 PMCID: PMC7502827 DOI: 10.1002/hbm.25134] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2020] [Revised: 06/19/2020] [Accepted: 06/26/2020] [Indexed: 01/27/2023] Open
Abstract
Sensory attenuation refers to the decreased intensity of a sensory percept when a sensation is self-generated compared with when it is externally triggered. However, the underlying brain regions and network interactions that give rise to this phenomenon remain to be determined. To address this issue, we recorded magnetoencephalographic (MEG) data from 35 healthy controls during an auditory task in which pure tones were either elicited through a button press or passively presented. We analyzed the auditory M100 at sensor- and source-level and identified movement-related magnetic fields (MRMFs). Regression analyses were used to further identify brain regions that contributed significantly to sensory attenuation, followed by a dynamic causal modeling (DCM) approach to explore network interactions between generators. Attenuation of the M100 was pronounced in right Heschl's gyrus (HES), superior temporal cortex (ST), thalamus, rolandic operculum (ROL), precuneus and inferior parietal cortex (IPL). Regression analyses showed that right postcentral gyrus (PoCG) and left precentral gyrus (PreCG) predicted M100 sensory attenuation. In addition, DCM results indicated that auditory sensory attenuation involved bi-directional information flow between thalamus, IPL, and auditory cortex. In summary, our data show that sensory attenuation is mediated by bottom-up and top-down information flow in a thalamocortical network, providing support for the role of predictive processing in the sensorimotor system.
Collapse
Affiliation(s)
- Lingling Hua
- Institute for Neuroscience and Psychology, University of Glasgow, Glasgow, UK
| | - Marc Recasens
- Institute for Neuroscience and Psychology, University of Glasgow, Glasgow, UK
| | - Tineke Grent-'t-Jong
- Institute for Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Department of Child and Adolescent Psychiatry, Charité-Universitätsmedizin Berlin, Berlin, Germany
| | - Rick A Adams
- Centre for Medical Image Computing, Department of Computer Science, University College London, London, UK
| | - Joachim Gross
- Institute for Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Institute of Biomagnetism and Biosignal analysis, Westphalian Wilhelms University Muenster, Münster, Germany
| | - Peter J Uhlhaas
- Institute for Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Department of Child and Adolescent Psychiatry, Charité-Universitätsmedizin Berlin, Berlin, Germany
| |
Collapse
|
43
|
Makov S, Zion Golumbic E. Irrelevant Predictions: Distractor Rhythmicity Modulates Neural Encoding in Auditory Cortex. Cereb Cortex 2020; 30:5792-5805. [DOI: 10.1093/cercor/bhaa153] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2019] [Revised: 04/10/2020] [Accepted: 05/02/2020] [Indexed: 12/12/2022] Open
Abstract
Dynamic attending theory suggests that predicting the timing of upcoming sounds can assist in focusing attention toward them. However, whether similar predictive processes are also applied to background noises and assist in guiding attention “away” from potential distractors, remains an open question. Here we address this question by manipulating the temporal predictability of distractor sounds in a dichotic listening selective attention task. We tested the influence of distractors’ temporal predictability on performance and on the neural encoding of sounds, by comparing the effects of Rhythmic versus Nonrhythmic distractors. Using magnetoencephalography we found that, indeed, the neural responses to both attended and distractor sounds were affected by distractors’ rhythmicity. Baseline activity preceding the onset of Rhythmic distractor sounds was enhanced relative to nonrhythmic distractor sounds, and sensory response to them was suppressed. Moreover, detection of nonmasked targets improved when distractors were Rhythmic, an effect accompanied by stronger lateralization of the neural responses to attended sounds to contralateral auditory cortex. These combined behavioral and neural results suggest that not only are temporal predictions formed for task-irrelevant sounds, but that these predictions bear functional significance for promoting selective attention and reducing distractibility.
Collapse
Affiliation(s)
- Shiri Makov
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan 5290002, Israel
| | - Elana Zion Golumbic
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan 5290002, Israel
| |
Collapse
|
44
|
Abstract
Contemporary brain research seeks to understand how cognition is reducible to neural activity. Crucially, much of this effort is guided by a scientific paradigm that views neural activity as essentially driven by external stimuli. In contrast, recent perspectives argue that this paradigm is by itself inadequate and that understanding patterns of activity intrinsic to the brain is needed to explain cognition. Yet, despite this critique, the stimulus-driven paradigm still dominates-possibly because a convincing alternative has not been clear. Here, we review a series of findings suggesting such an alternative. These findings indicate that neural activity in the hippocampus occurs in one of three brain states that have radically different anatomical, physiological, representational, and behavioral correlates, together implying different functional roles in cognition. This three-state framework also indicates that neural representations in the hippocampus follow a surprising pattern of organization at the timescale of ∼1 s or longer. Lastly, beyond the hippocampus, recent breakthroughs indicate three parallel states in the cortex, suggesting shared principles and brain-wide organization of intrinsic neural activity.
Collapse
Affiliation(s)
- Kenneth Kay
- Howard Hughes Medical Institute, Kavli Institute for Fundamental Neuroscience, Department of Physiology, University of California San Francisco, San Francisco, California
| | - Loren M Frank
- Howard Hughes Medical Institute, Kavli Institute for Fundamental Neuroscience, Department of Physiology, University of California San Francisco, San Francisco, California
| |
Collapse
|
45
|
Heins N, Pomp J, Kluger DS, Trempler I, Zentgraf K, Raab M, Schubotz RI. Incidental or Intentional? Different Brain Responses to One's Own Action Sounds in Hurdling vs. Tap Dancing. Front Neurosci 2020; 14:483. [PMID: 32477059 PMCID: PMC7237737 DOI: 10.3389/fnins.2020.00483] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2020] [Accepted: 04/20/2020] [Indexed: 12/20/2022] Open
Abstract
Most human actions produce concomitant sounds. Action sounds can be either part of the action goal (GAS, goal-related action sounds), as in tap dancing, or a mere by-product of the action (BAS, by-product action sounds), as in hurdling. It is currently unclear whether these two types of action sounds, incidental or intentional, differ in their neural representation, and whether their impact on the evaluation of action performance differs between the two. Here we examined whether, during the observation of tap dancing compared with hurdling, auditory information is a more important factor in positive action quality ratings. Moreover, we tested whether observation of tap dancing versus hurdling leads to stronger attenuation in primary auditory cortex and to a stronger mismatch signal when sounds do not match expectations. We recorded individual point-light videos of newly trained participants performing tap dancing and hurdling. In the subsequent functional magnetic resonance imaging (fMRI) session, participants were presented with videos of their own actions, including the corresponding action sounds, and were asked to rate the quality of their performance. Videos were either in their original form or scrambled in the visual modality, the auditory modality, or both. As hypothesized, behavioral results showed significantly lower rating scores in the GAS condition than in the BAS condition when the auditory modality was scrambled. Functional MRI contrasts between BAS and GAS actions revealed higher activation of primary auditory cortex in the BAS condition, consistent with stronger attenuation in GAS, as well as stronger activation of the posterior superior temporal gyri and the supplementary motor area in GAS. These results suggest that the processing of self-generated action sounds depends on whether or not we intend to produce a sound with our action, and that action sounds may be more likely to be used as sensory feedback when they are part of the explicit action goal. Our findings contribute to a better understanding of the function of action sounds in learning and controlling sound-producing actions.
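The behavioral prediction above (scrambling the sound should hurt quality ratings more for goal-related than for by-product action sounds) can be illustrated with a toy interaction test. The participant count, rating scale, and values below are hypothetical, and this is a sketch of the logic rather than the authors' analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 24                                   # hypothetical number of participants

# Hypothetical mean quality ratings per participant and condition (1-7 scale)
gas_intact    = rng.normal(5.2, 0.6, n)  # tap dancing, original sound
gas_scrambled = rng.normal(4.1, 0.6, n)  # tap dancing, auditory scrambled
bas_intact    = rng.normal(5.0, 0.6, n)  # hurdling, original sound
bas_scrambled = rng.normal(4.8, 0.6, n)  # hurdling, auditory scrambled

# Prediction: scrambling the sound lowers ratings more for GAS than for BAS.
# Test the interaction as a paired t-test on the difference of differences.
gas_drop = gas_intact - gas_scrambled
bas_drop = bas_intact - bas_scrambled
t, p = stats.ttest_rel(gas_drop, bas_drop)
print(f"interaction (GAS drop vs BAS drop): t({n - 1}) = {t:.2f}, p = {p:.3g}")
```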
Collapse
Affiliation(s)
- Nina Heins
- Department of Psychology, University of Muenster, Münster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
| | - Jennifer Pomp
- Department of Psychology, University of Muenster, Münster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
| | - Daniel S. Kluger
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
- Institute for Biomagnetism and Biosignalanalysis, University of Muenster, Münster, Germany
| | - Ima Trempler
- Department of Psychology, University of Muenster, Münster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
| | - Karen Zentgraf
- Department of Movement Science and Training in Sports, Institute of Sport Sciences, Goethe University Frankfurt, Frankfurt, Germany
| | - Markus Raab
- Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany
- School of Applied Sciences, London South Bank University, London, United Kingdom
| | - Ricarda I. Schubotz
- Department of Psychology, University of Muenster, Münster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
| |
Collapse
|
46
|
Riecke L, Marianu IA, De Martino F. Effect of Auditory Predictability on the Human Peripheral Auditory System. Front Neurosci 2020; 14:362. [PMID: 32351361 PMCID: PMC7174672 DOI: 10.3389/fnins.2020.00362] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2019] [Accepted: 03/24/2020] [Indexed: 11/13/2022] Open
Abstract
Auditory perception is facilitated by prior knowledge about the statistics of the acoustic environment. Predictions about upcoming auditory stimuli are processed at various stages along the human auditory pathway, including the cortex and midbrain. Whether such auditory predictions are also processed at hierarchically lower stages, in the peripheral auditory system, is unclear. To address this question, we assessed outer hair cell (OHC) activity in response to isochronous tone sequences and varied the predictability and behavioral relevance of the individual tones (by manipulating tone-to-tone probabilities and the human participants' task, respectively). We found that predictability alters the amplitude of distortion-product otoacoustic emissions (DPOAEs, a measure of OHC activity) in a manner that depends on the behavioral relevance of the tones. Simultaneously recorded cortical responses showed a significant effect of both the predictability and the behavioral relevance of the tones, indicating that the experimental manipulations were effective at central auditory processing stages. Our results provide evidence for a top-down effect on the processing of auditory predictability in the human peripheral auditory system, in line with previous studies showing peripheral effects of auditory attention.
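As a sketch of how OHC activity is typically quantified, the snippet below estimates the amplitude of the cubic distortion product at 2f1 - f2 from a simulated ear-canal recording via an FFT. The primary frequencies, sampling rate, and signal are hypothetical and this is not the authors' recording setup.

```python
import numpy as np

fs = 48000             # microphone sampling rate in Hz (hypothetical)
f1, f2 = 4000, 4800    # primary tones in Hz (hypothetical); 2*f1 - f2 = 3200 Hz
dp = 2 * f1 - f2       # cubic distortion-product frequency

rng = np.random.default_rng(1)
t = np.arange(0, 1.0, 1 / fs)
# Simulated ear-canal signal: two primaries plus a weak distortion product and noise
x = (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
     + 1e-3 * np.sin(2 * np.pi * dp * t) + 1e-4 * rng.standard_normal(t.size))

# Amplitude spectrum; with a 1-s window each FFT bin is exactly 1 Hz wide
spec = np.abs(np.fft.rfft(x)) * 2 / x.size
freqs = np.fft.rfftfreq(x.size, 1 / fs)

dp_bin = np.argmin(np.abs(freqs - dp))
noise_bins = np.r_[dp_bin - 12:dp_bin - 2, dp_bin + 3:dp_bin + 13]   # neighboring bins as noise floor
dp_db = 20 * np.log10(spec[dp_bin])
nf_db = 20 * np.log10(spec[noise_bins].mean())
print(f"DPOAE at {dp} Hz: {dp_db:.1f} dB re full scale (noise floor {nf_db:.1f} dB)")
```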
Collapse
Affiliation(s)
- Lars Riecke
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
| | - Irina-Andreea Marianu
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
| | - Federico De Martino
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, MN, United States
| |
Collapse
|
47
|
Dissociation of Unit Activity and Gamma Oscillations during Vocalization in Primate Auditory Cortex. J Neurosci 2020; 40:4158-4171. [PMID: 32295815 DOI: 10.1523/jneurosci.2749-19.2020] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2019] [Revised: 02/10/2020] [Accepted: 02/26/2020] [Indexed: 11/21/2022] Open
Abstract
Vocal production is a sensory-motor process in which auditory self-monitoring is used to ensure accurate communication. During vocal production, the auditory cortex of both humans and animals is suppressed, a phenomenon that plays an important role in self-monitoring and vocal motor control. However, the underlying neural mechanisms of this vocalization-induced suppression are unknown. γ-band oscillations (>25 Hz) have been implicated in a variety of cortical functions and are thought to arise from the activity of local inhibitory interneurons, but they have not been studied during vocal production. We therefore examined γ-band activity in the auditory cortex of vocalizing marmoset monkeys, of either sex, and found that γ responses increased during vocal production. This increase in γ contrasts with the simultaneously recorded suppression of single-unit and multiunit responses. Recorded vocal γ oscillations exhibited two separable components: a vocalization-specific nonsynchronized ("induced") response correlating with vocal suppression, and a synchronized ("evoked") response that was also present during passive sound playback. These results provide evidence for a role of cortical γ oscillations in inhibitory processing. Furthermore, the two distinct components of the γ response suggest possible mechanisms for vocalization-induced suppression and may correspond to the sensory-motor integration of top-down and bottom-up inputs to the auditory cortex during vocal production. SIGNIFICANCE STATEMENT: Vocal communication is important to both humans and animals. In order to ensure accurate information transmission, we must monitor our own vocal output. Surprisingly, spiking activity in the auditory cortex is suppressed during vocal production yet maintains sensitivity to the sound of our own voice ("feedback"). The mechanisms of this vocalization-induced suppression are unknown. Here we show that auditory cortical γ oscillations, which reflect interneuron activity, are actually increased during vocal production, the opposite of the response seen in spiking units. We discuss these results in relation to proposed functions of γ activity in inhibitory sensory processing and in coordination of different brain regions, suggesting a role in sensory-motor integration.
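The evoked/induced distinction drawn above has a standard signal-processing reading: the "evoked" part is the gamma-band power of the trial-averaged signal (phase-locked across trials), while the "induced" part is the remaining single-trial gamma power. A minimal sketch of that decomposition, with hypothetical band limits, sampling rate, and simulated data rather than the recorded marmoset signals:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_power(trials, fs, band=(30, 80)):
    """Band-limited power envelope for each trial (trials x samples)."""
    b, a = butter(4, np.array(band) / (fs / 2), btype="bandpass")
    filtered = filtfilt(b, a, trials, axis=-1)
    return np.abs(hilbert(filtered, axis=-1)) ** 2

fs = 1000
rng = np.random.default_rng(2)
trials = rng.standard_normal((100, 2000))   # simulated LFP: 100 trials x 2 s

total = gamma_power(trials, fs).mean(axis=0)                       # mean single-trial gamma power
evoked = gamma_power(trials.mean(axis=0, keepdims=True), fs)[0]    # power of the trial average (phase-locked)
induced = total - evoked                                           # non-phase-locked ("induced") component

print(f"mean evoked power {evoked.mean():.4f}, mean induced power {induced.mean():.4f}")
```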
Collapse
|
48
|
Listening to music while running alters ground reaction forces: a study of acute exposure to varying speed and loudness levels in young women and men. Eur J Appl Physiol 2020; 120:1391-1401. [PMID: 32277258 DOI: 10.1007/s00421-020-04371-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2019] [Accepted: 04/05/2020] [Indexed: 10/24/2022]
Abstract
PURPOSE: Music listening while running enhances physiological and psychological features, resulting in a more enjoyable experience. The possible influence of music on ground reaction forces (GRF) during running, however, is unknown. Considering the 'distracting' role of music for a runner's attention, we hypothesized that music would mask the perception of foot impacts against the ground. This study tested this hypothesis by examining the effects of different music volumes while running at different velocities. METHODS: Fifty fit volunteers (F:M = 22:8; 23 ± 2 years) performed 2-min running stints under 3 randomized conditions (80-dB music, 85-dB music, 'no music') at 3 velocities (8, 10, 12 km/h). Participants ran on a sensorized treadmill that recorded GRF throughout all experiments. RESULTS: Listening to 85-dB music resulted in greater GRF at 8 (p = 0.0005) and 10 km/h (p = 0.04) but not at 12 km/h (p = 0.35), and not with the 80-dB volume. Gender-based analyses revealed significant Condition × gender interactions only for 85-dB music vs. 'no music'. Bonferroni-adjusted comparisons revealed significant music-induced increases in GRF only in men, at 8 km/h (+4.1 kg/cm2, p < 0.0005; women: +0.8 kg/cm2, p = 0.47) and 10 km/h (+3.3 kg/cm2, p = 0.004; women: +0.8 kg/cm2, p = 0.51), but not at 12 km/h. CONCLUSION: In active men, listening to loud music while running results in increased GRF, whereas no effect was observed in women. The lack of a music effect in women may be related to structural factors, such as a larger hip width-to-femoral length ratio, possibly resulting in different loading patterns. These preliminary findings identify high-volume music listening as a new potential risk factor for injury in young runners.
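A rough sketch of the kind of analysis implied above: extract the peak vertical GRF of each stance phase from a treadmill force trace, then run Bonferroni-adjusted pairwise comparisons across the three sound conditions. The contact threshold, force values, and independent-samples tests are placeholder assumptions, not the study's actual statistics (which used Condition × gender models).

```python
import numpy as np
from itertools import combinations
from scipy import stats

def peak_grf_per_step(force, threshold=50.0):
    """Peak vertical force (N) of each stance phase, found by threshold crossings."""
    on_ground = (force > threshold).astype(int)
    onsets = np.flatnonzero(np.diff(on_ground) == 1)
    offsets = np.flatnonzero(np.diff(on_ground) == -1)
    offsets = offsets[offsets > onsets[0]]                       # pair each onset with the offset after it
    steps = [(s, e) for s, e in zip(onsets, offsets) if e - s > 50]   # ignore spurious micro-contacts (<50 ms)
    return np.array([force[s:e].max() for s, e in steps])

# Hypothetical 2-min force traces at 1 kHz for the three conditions
rng = np.random.default_rng(3)
t = np.arange(0, 120, 0.001)
def trace(gain):
    stance = np.clip(np.sin(2 * np.pi * 2.7 * t), 0, None)      # ~2.7 steps/s, half-sine stance phases
    return gain * stance * (1 + 0.05 * rng.standard_normal(t.size))

peaks = {name: peak_grf_per_step(trace(g))
         for name, g in [("no_music", 1600), ("80dB", 1610), ("85dB", 1660)]}

pairs = list(combinations(peaks, 2))
for a, b in pairs:                                               # Bonferroni-adjusted pairwise tests
    _, p = stats.ttest_ind(peaks[a], peaks[b])
    print(f"{a} vs {b}: adjusted p = {min(p * len(pairs), 1.0):.3g}")
```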
Collapse
|
49
|
Schneider DM. Reflections of action in sensory cortex. Curr Opin Neurobiol 2020; 64:53-59. [PMID: 32171079 DOI: 10.1016/j.conb.2020.02.004] [Citation(s) in RCA: 32] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/06/2019] [Revised: 01/25/2020] [Accepted: 02/09/2020] [Indexed: 11/26/2022]
Abstract
Nearly every movement that one makes produces a corresponding set of sensations. The simple fact that much of our sensory world is driven by our own actions underscores one of the major computations that our brains execute every day: to interpret the sensory world even as we interact with and change it. It should not be surprising therefore that activity in sensory cortex reflects not only incoming sensory inputs but also ongoing movement and behavioral state. With a focus on the mouse as a model organism, this review highlights recent findings revealing the widespread modulation of sensory cortex across diverse movements, the circuitry through which movement-related inputs are integrated with sensory signals, and the computational and perceptual roles that motor-sensory integration may serve within the brain.
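As a toy illustration of the point that sensory-cortex activity reflects both incoming stimuli and ongoing movement, the regression below asks how much of a simulated firing rate is explained by sound level versus locomotion speed. The variables and coefficients are invented for the example and do not come from the review.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000                                                 # time bins (hypothetical)

sound = rng.uniform(0, 1, n)                             # stimulus drive per bin
speed = rng.uniform(0, 1, n)                             # locomotion speed per bin
rate = 4 + 6 * sound - 2 * speed + rng.normal(0, 1, n)   # simulated firing rate (spikes/s)

# Ordinary least squares: partition the rate into baseline, stimulus, and movement terms
X = np.column_stack([np.ones(n), sound, speed])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(f"baseline = {beta[0]:.2f}, sound gain = {beta[1]:.2f}, movement gain = {beta[2]:.2f}")
```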
Collapse
Affiliation(s)
- David M Schneider
- Center for Neural Science, New York University, New York, NY 10003, United States.
| |
Collapse
|
50
|
Gogos JA, Crabtree G, Diamantopoulou A. The abiding relevance of mouse models of rare mutations to psychiatric neuroscience and therapeutics. Schizophr Res 2020; 217:37-51. [PMID: 30987923 PMCID: PMC6790166 DOI: 10.1016/j.schres.2019.03.018] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/29/2019] [Revised: 03/19/2019] [Accepted: 03/22/2019] [Indexed: 01/08/2023]
Abstract
Studies using powerful family-based designs, aided by large-scale case-control studies, have been instrumental in cracking the genetic complexity of schizophrenia, identifying rare and highly penetrant risk mutations and providing a handle on experimentally tractable model systems. Mouse models of rare mutations, paired with analysis of homologous cognitive and sensory processing deficits and with state-of-the-art neuroscience methods for manipulating and recording neuronal activity, have started to provide unprecedented insights into pathogenic mechanisms and to build the foundation of a new biological framework for understanding mental illness. A number of important principles are emerging. Degradation of the computational mechanisms underlying the ordered activity and plasticity of both local and long-range neuronal assemblies, the building blocks necessary for stable cognition and perception, may be the inevitable consequence and the common point of convergence of a vastly heterogeneous genetic liability, manifesting as defective internally driven or stimulus-driven neuronal activation patterns and triggering the constellation of schizophrenia symptoms. Animal models of rare mutations have the unique potential to help us move from "which" (gene) to "how", "where", and "when" the computational regimes of neural ensembles are affected. Linking these variables should improve our understanding of how symptoms emerge and how diagnostic boundaries are established at the circuit level. Eventually, a better understanding of pathophysiological trajectories at the level of neural circuitry in mice, aided by basic human experimental biology, should guide the development of new therapeutics targeting either the altered circuitry itself or the underlying biological pathways.
Collapse
Affiliation(s)
- Joseph A. Gogos
- Mortimer B. Zuckerman Mind Brain and Behavior Institute, Columbia University, New York, NY 10027, USA
- Department of Physiology and Cellular Biophysics, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA
- Department of Neuroscience, Columbia University, New York, NY 10032, USA
- Correspondence should be addressed to: Joseph A. Gogos ()
| | - Gregg Crabtree
- Mortimer B. Zuckerman Mind Brain and Behavior Institute, Columbia University, New York, NY 10027, USA
- Department of Physiology and Cellular Biophysics, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA
| | - Anastasia Diamantopoulou
- Mortimer B. Zuckerman Mind Brain and Behavior Institute, Columbia University, New York, NY 10027, USA
- Department of Physiology and Cellular Biophysics, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA
| |
Collapse
|