1
Hariharan S, Palomares EG, Babl SS, López-Jury L, Hechavarria JC. Cerebellar activity predicts vocalization in fruit bats. Curr Biol 2024; 34:5112-5119.e3. PMID: 39389060. DOI: 10.1016/j.cub.2024.09.033.
Abstract
Echolocating bats exhibit remarkable auditory behaviors, enabled by adaptations both within and outside their auditory system. Yet research on echolocating bats has focused mostly on brain areas that belong to the classic ascending auditory pathway. This study provides direct evidence linking the cerebellum, an evolutionarily ancient and non-classic auditory structure, to vocalization and hearing. We report that in the fruit-eating bat Carollia perspicillata, external sounds can evoke cerebellar responses with latencies below 20 ms. Such fast responses are indicative of early inputs to the bat cerebellum. After establishing fruit-eating bats as a good model to study cerebellar auditory responses, we searched for a neural correlate of vocal production within the cerebellum. We investigated spike trains and field potentials occurring before and after vocalization and found that the type of sound produced (echolocation pulses or communication calls) can be decoded from pre-vocal and post-vocal neural signals, with prediction accuracies exceeding 85%. The latter provides a direct correlate of vocalization in an ancient motor-coordination structure that lies outside the classic ascending auditory pathway. Taken together, our findings provide evidence of specializations for vocalization and hearing in the cerebellum of an auditory specialist.
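For readers who want a concrete picture of the decoding analysis summarized in this abstract, the following Python sketch shows one standard way to cross-validate a call-type decoder. It is not the authors' code: the feature matrix, labels, and effect size are synthetic stand-ins for per-trial pre-vocal spike counts.

```python
# Minimal sketch of a call-type decoder: classify echolocation vs. communication
# calls from pre-vocal neural features. The feature matrix here is synthetic;
# in a real analysis it would hold per-trial spike counts or LFP power.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_features = 200, 32          # trials x (units or LFP bands)
labels = rng.integers(0, 2, n_trials)   # 0 = echolocation pulse, 1 = communication call

# Synthetic pre-vocal features with a small class-dependent offset so the
# decoder has something to find (stand-in for real recordings).
features = rng.normal(size=(n_trials, n_features))
features[labels == 1, :8] += 0.8

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(decoder, features, labels, cv=cv)

print(f"cross-validated decoding accuracy: {accuracy.mean():.2f} +/- {accuracy.std():.2f}")
```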
Affiliation(s)
- Shivani Hariharan
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany; Institute of Cell Biology and Neuroscience, Goethe University Frankfurt, 60438 Frankfurt am Main, Germany.
- Eugenia González Palomares
- Institute of Cell Biology and Neuroscience, Goethe University Frankfurt, 60438 Frankfurt am Main, Germany.
- Susanne S Babl
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany; Institute of Cell Biology and Neuroscience, Goethe University Frankfurt, 60438 Frankfurt am Main, Germany.
- Luciana López-Jury
- Max Planck Institute for Brain Research, 60438 Frankfurt am Main, Germany.
- Julio C Hechavarria
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany; Institute of Cell Biology and Neuroscience, Goethe University Frankfurt, 60438 Frankfurt am Main, Germany.
2
García-Rosales F, Schaworonkow N, Hechavarria JC. Oscillatory waveform shape and temporal spike correlations differ across bat frontal and auditory cortex. J Neurosci 2024; 44:e1236232023. PMID: 38262724. PMCID: PMC10919256. DOI: 10.1523/jneurosci.1236-23.2023.
Abstract
Neural oscillations are associated with diverse computations in the mammalian brain. The waveform shape of oscillatory activity measured in the cortex relates to local physiology and can be informative about aberrant or dynamically changing states. However, how waveform shape differs across distant yet functionally and anatomically related cortical regions is largely unknown. In this study, we capitalize on simultaneous recordings of local field potentials (LFPs) in the auditory and frontal cortices of awake, male Carollia perspicillata bats to examine, on a cycle-by-cycle basis, waveform shape differences across cortical regions. We find that waveform shape differs markedly in the fronto-auditory circuit even for temporally correlated rhythmic activity in comparable frequency ranges (i.e., in the delta and gamma bands) during spontaneous activity. In addition, we report consistent differences between areas in the variability of waveform shape across individual cycles. A conceptual model predicts higher spike-spike and spike-LFP correlations in regions with more asymmetric shapes, a phenomenon that was observed in the data: spike-spike and spike-LFP correlations were higher in the frontal cortex. The model suggests a relationship between waveform shape differences and differences in spike correlations across cortical areas. Altogether, these results indicate that oscillatory activity in the frontal and auditory cortex possesses distinct dynamics related to the anatomical and functional diversity of the fronto-auditory circuit.
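The cycle-by-cycle waveform-shape analysis mentioned above can be illustrated with a short, self-contained sketch. This is not the published pipeline (which may rely on dedicated cycle-by-cycle tooling); it simply band-pass filters a simulated LFP, segments it into trough-to-trough cycles, and computes a rise-decay asymmetry per cycle.

```python
# Sketch of a cycle-by-cycle waveform-shape measure (rise-decay asymmetry)
# on a band-pass-filtered LFP trace. Synthetic data stand in for recordings.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# 4 Hz rhythm with a harmonic and noise -> asymmetric cycles.
lfp = np.sin(2 * np.pi * 4 * t) + 0.4 * np.sin(2 * np.pi * 8 * t) \
      + 0.2 * np.random.default_rng(0).normal(size=t.size)

# Band-pass in the delta range (2-6 Hz) to isolate the rhythm of interest.
b, a = butter(3, [2 / (fs / 2), 6 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, lfp)

# Troughs delimit cycles; the peak inside each cycle splits rise from decay.
troughs, _ = find_peaks(-filtered, distance=int(fs / 8))
asymmetry = []
for start, stop in zip(troughs[:-1], troughs[1:]):
    cycle = filtered[start:stop]
    peak = start + int(np.argmax(cycle))
    rise, decay = peak - start, stop - peak
    asymmetry.append(rise / (rise + decay))   # 0.5 = symmetric cycle

asymmetry = np.asarray(asymmetry)
print(f"{asymmetry.size} cycles, mean rise-decay asymmetry {asymmetry.mean():.2f}")
```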
Affiliation(s)
- Francisco García-Rosales
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Frankfurt am Main 60528, Germany.
- Natalie Schaworonkow
- Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Frankfurt am Main 60528, Germany.
- Julio C Hechavarria
- Institut für Zellbiologie und Neurowissenschaft, Goethe-Universität, Frankfurt am Main 60438, Germany.
3
Tsunada J, Eliades SJ. Frontal-auditory cortical interactions and sensory prediction during vocal production in marmoset monkeys. bioRxiv 2024:2024.01.28.577656. Preprint. PMID: 38352422. PMCID: PMC10862695. DOI: 10.1101/2024.01.28.577656.
Abstract
The control of speech and vocal production involves the calculation of error between the intended vocal output and the resulting auditory feedback. Consistent with this model, recent evidence has demonstrated that the auditory cortex is suppressed immediately before and during vocal production, yet is still sensitive to differences between vocal output and altered auditory feedback. This suppression has been suggested to be the result of top-down signals containing information about the intended vocal output, potentially originating from motor or other frontal cortical areas. However, whether such frontal areas are the source of suppressive and predictive signaling to the auditory cortex during vocalization is unknown. Here, we simultaneously recorded neural activity from both the auditory and frontal cortices of marmoset monkeys while they produced self-initiated vocalizations. We found increases in neural activity in both brain areas preceding the onset of vocal production, notably changes in both multi-unit activity and local field potential theta-band power. Connectivity analysis using Granger causality demonstrated that frontal cortex sends directed signaling to the auditory cortex during this pre-vocal period. Importantly, this pre-vocal activity predicted both vocalization-induced suppression of the auditory cortex as well as the acoustics of subsequent vocalizations. These results suggest that frontal cortical areas communicate with the auditory cortex preceding vocal production, with frontal-auditory signals that may reflect the transmission of sensory prediction information. This interaction between frontal and auditory cortices may contribute to mechanisms that calculate errors between intended and actual vocal outputs during vocal communication.
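The directed-connectivity step described here (frontal cortex sending signals to auditory cortex) can be sketched with a pairwise Granger-causality test. The snippet below uses simulated LFP traces with a built-in frontal-to-auditory lag, so it only illustrates the statistical logic, not the authors' actual analysis.

```python
# Hedged sketch of a directed-interaction test: does a "frontal" signal
# Granger-cause an "auditory" signal? Both signals are simulated; the column
# order follows statsmodels' convention (second column tested as the driver
# of the first).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 2000
frontal = rng.normal(size=n)
auditory = np.zeros(n)
for i in range(2, n):
    # Auditory signal depends on frontal activity two samples earlier,
    # so a frontal -> auditory influence should be detected.
    auditory[i] = 0.5 * auditory[i - 1] + 0.4 * frontal[i - 2] + rng.normal(scale=0.5)

data = np.column_stack([auditory, frontal])   # test: frontal -> auditory
results = grangercausalitytests(data, maxlag=5)

for lag, (tests, _) in results.items():
    f_stat, p_value, _, _ = tests["ssr_ftest"]
    print(f"lag {lag}: F = {f_stat:.1f}, p = {p_value:.3g}")
```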
Affiliation(s)
- Joji Tsunada
- Chinese Institute for Brain Research, Beijing, China.
- Department of Veterinary Medicine, Faculty of Agriculture, Iwate University, Morioka, Iwate, Japan.
- Steven J. Eliades
- Department of Head and Neck Surgery & Communication Sciences, Duke University School of Medicine, Durham, NC 27710, USA.
4
Kiai A, Clemens J, Kössl M, Poeppel D, Hechavarría J. Flexible control of vocal timing in Carollia perspicillata bats enables escape from acoustic interference. Commun Biol 2023; 6:1153. PMID: 37957351. PMCID: PMC10643407. DOI: 10.1038/s42003-023-05507-5.
Abstract
In natural environments, background noise can degrade the integrity of acoustic signals, posing a problem for animals that rely on their vocalizations for communication and navigation. A simple behavioral strategy to combat acoustic interference would be to restrict call emissions to periods of low-amplitude noise or silence. Using audio playback and computational tools for the automated detection of over 2.5 million vocalizations from groups of freely vocalizing bats, we show that bats (Carollia perspicillata) can dynamically adapt the timing of their calls to avoid acoustic jamming in both predictably and unpredictably patterned noise. This study demonstrates that bats spontaneously seek out temporal windows of opportunity for vocalizing in acoustically crowded environments, providing a mechanism for efficient echolocation and communication in cluttered acoustic landscapes.
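A minimal sketch of the kind of automated call detection such a large-scale study depends on: threshold a smoothed amplitude envelope and read call onsets and offsets from the threshold crossings. The audio, sampling rate, and threshold rule below are assumptions for illustration, not the study's actual pipeline.

```python
# Minimal sketch of call detection by amplitude-envelope thresholding.
# The audio is synthetic: low-level noise with three brief tonal "calls".
import numpy as np
from scipy.signal import hilbert

fs = 250_000                                  # microphone sampling rate (Hz)
t = np.arange(0, 0.5, 1 / fs)
audio = 0.01 * np.random.default_rng(2).normal(size=t.size)   # background noise
for onset in (0.10, 0.25, 0.40):              # three 5 ms synthetic calls
    idx = (t >= onset) & (t < onset + 0.005)
    audio[idx] += np.sin(2 * np.pi * 60_000 * t[idx])

# Smooth amplitude envelope, then mark samples above a noise-based threshold.
envelope = np.abs(hilbert(audio))
kernel = np.ones(int(0.001 * fs)) / int(0.001 * fs)      # 1 ms moving average
envelope = np.convolve(envelope, kernel, mode="same")
threshold = 5 * np.median(envelope)
above = envelope > threshold

# Call onsets/offsets are the rising/falling edges of the above-threshold mask.
edges = np.diff(above.astype(int))
onsets, offsets = np.flatnonzero(edges == 1), np.flatnonzero(edges == -1)
for on, off in zip(onsets, offsets):
    print(f"call from {t[on] * 1000:.1f} ms to {t[off] * 1000:.1f} ms")
```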
Affiliation(s)
- Ava Kiai
- Institute for Cell Biology and Neuroscience, Goethe University, Frankfurt am Main, Germany.
- Jan Clemens
- European Neuroscience Center, Göttingen, Germany.
- Manfred Kössl
- Institute for Cell Biology and Neuroscience, Goethe University, Frankfurt am Main, Germany.
- David Poeppel
- Ernst Strüngmann Institute, Frankfurt am Main, Germany.
- Julio Hechavarría
- Institute for Cell Biology and Neuroscience, Goethe University, Frankfurt am Main, Germany.
5
Forli A, Yartsev MM. Hippocampal representation during collective spatial behaviour in bats. Nature 2023; 621:796-803. PMID: 37648869. PMCID: PMC10533399. DOI: 10.1038/s41586-023-06478-7.
Abstract
Social animals live and move through spaces shaped by the presence, motion and sensory cues of multiple other individuals [1-6]. Neural activity in the hippocampus is known to reflect spatial behaviour [7-9], yet its study is lacking in such dynamic group settings, which are ubiquitous in natural environments. Here we studied hippocampal activity in groups of bats engaged in collective spatial behaviour. We find that, under spontaneous conditions, a robust spatial structure emerges at the group level whereby behaviour is anchored to specific locations, movement patterns and individual social preferences. Using wireless electrophysiological recordings from both stationary and flying bats, we find that many hippocampal neurons are tuned to key features of group dynamics. These include the presence or absence of a conspecific, but not typically of an object, at landing sites, shared spatial locations, individual identities and sensory signals that are broadcasted in the group setting. Finally, using wireless calcium imaging, we find that social responses are anatomically distributed and robustly represented at the population level. Combined, our findings reveal that hippocampal activity contains a rich representation of naturally emerging spatial behaviours in animal groups that could in turn support the complex feat of collective behaviour.
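As a toy illustration of testing whether a neuron is "tuned to the presence or absence of a conspecific", the sketch below compares firing rates across visits with and without a conspecific present and assesses the difference with a permutation test. All numbers are simulated; this is not the paper's analysis code.

```python
# Compare a neuron's firing rate when a conspecific is present vs. absent at a
# landing site, with a permutation test on the rate difference. Simulated data.
import numpy as np

rng = np.random.default_rng(3)
n_visits = 120
conspecific_present = rng.integers(0, 2, n_visits).astype(bool)

# Simulated spike counts per 1 s visit window; higher rate when a bat is present.
rates = rng.poisson(lam=np.where(conspecific_present, 8, 4))

observed = rates[conspecific_present].mean() - rates[~conspecific_present].mean()

# Null distribution: shuffle the presence labels and recompute the difference.
n_perm = 5000
null = np.empty(n_perm)
for i in range(n_perm):
    shuffled = rng.permutation(conspecific_present)
    null[i] = rates[shuffled].mean() - rates[~shuffled].mean()

p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
print(f"rate difference {observed:.2f} spikes/s, permutation p = {p_value:.4f}")
```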
Affiliation(s)
- Angelo Forli
- Department of Bioengineering, UC Berkeley, Berkeley, CA, USA.
- Michael M Yartsev
- Department of Bioengineering, UC Berkeley, Berkeley, CA, USA.
- Helen Wills Neuroscience Institute, UC Berkeley, Berkeley, CA, USA.
6
Son S, Moon J, Kim YJ, Kang MS, Lee J. Frontal-to-visual information flow explains predictive motion tracking. Neuroimage 2023; 269:119914. PMID: 36736637. DOI: 10.1016/j.neuroimage.2023.119914.
Abstract
Predictive tracking demonstrates our ability to maintain a line of vision on moving objects even when they temporarily disappear. Models of smooth pursuit eye movements posit that our brain achieves this ability by directly streamlining motor programming from continuously updated sensory motion information. To test this hypothesis, we obtained a sensory motion representation from multivariate electroencephalogram activity while human participants covertly tracked a temporarily occluded moving stimulus with their eyes remaining stationary at the fixation point. The sensory motion representation of the occluded target evolves to its maximum strength at the expected timing of reappearance, suggesting a timely modulation of the internal model of the visual target. We further characterize the spatiotemporal dynamics of the task-relevant motion information by computing the phase gradients of slow oscillations. We discovered a predominant posterior-to-anterior phase gradient immediately after stimulus occlusion; however, at the expected timing of reappearance, the gradient reverses, becoming anterior-to-posterior. The behavioral bias of smooth pursuit eye movements, which is a signature of the predictive process of the pursuit, was correlated with the posterior division of the gradient. These results suggest that the sensory motion area modulated by the prediction signal is involved in updating motor programming.
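The phase-gradient computation mentioned in this abstract can be sketched as follows: extract each channel's instantaneous phase with a Hilbert transform and regress phase against posterior-to-anterior channel position, the sign of the slope indicating the direction of propagation. The channel layout, frequency, and simulated travelling wave below are illustrative assumptions, not the study's recording setup.

```python
# Estimate the direction of a travelling slow oscillation from the phase
# gradient along a simulated posterior-to-anterior channel axis.
import numpy as np
from scipy.signal import hilbert

fs = 250.0
t = np.arange(0, 2, 1 / fs)
n_channels = 16
positions = np.linspace(0.0, 1.0, n_channels)   # 0 = posterior, 1 = anterior

# Simulated 5 Hz travelling wave moving posterior -> anterior (phase lags with
# increasing anterior position), plus noise.
rng = np.random.default_rng(4)
delay_per_unit = 0.02                            # seconds of lag across the array
eeg = np.array([np.sin(2 * np.pi * 5 * (t - delay_per_unit * p)) for p in positions])
eeg += 0.3 * rng.normal(size=eeg.shape)

phase = np.angle(hilbert(eeg, axis=1))           # instantaneous phase per channel

# At one time point, fit phase vs. position (unwrap across channels first).
sample = len(t) // 2
channel_phase = np.unwrap(phase[:, sample])
slope = np.polyfit(positions, channel_phase, 1)[0]
direction = "posterior-to-anterior" if slope < 0 else "anterior-to-posterior"
print(f"phase-position slope {slope:.2f} rad/unit -> {direction} propagation")
```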
Affiliation(s)
- Sangkyu Son
- Center for Neuroscience Imaging Research, Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, South Korea.
- Joonsik Moon
- Center for Neuroscience Imaging Research, Institute for Basic Science (IBS), Suwon 16419, South Korea.
- Yee-Joon Kim
- Center for Cognition and Sociality, Institute for Basic Science (IBS), Daejeon 34141, South Korea.
- Min-Suk Kang
- Center for Neuroscience Imaging Research, Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Psychology, Sungkyunkwan University, Seoul 03063, South Korea.
- Joonyeol Lee
- Center for Neuroscience Imaging Research, Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, South Korea; Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, South Korea.