1
Kaplan HS, Horvath PM, Rahman MM, Dulac C. The neurobiology of parenting and infant-evoked aggression. Physiol Rev 2025; 105:315-381. PMID: 39146250. DOI: 10.1152/physrev.00036.2023.
Abstract
Parenting behavior comprises a variety of adult-infant and adult-adult interactions across multiple timescales. The state transition from nonparent to parent requires an extensive reorganization of individual priorities and physiology and is facilitated by combinatorial hormone action on specific cell types that are integrated throughout interconnected and brainwide neuronal circuits. In this review, we take a comprehensive approach to integrate historical and current literature on each of these topics across multiple species, with a focus on rodents. New and emerging molecular, circuit-based, and computational technologies have recently been used to address outstanding gaps in our current framework of knowledge on infant-directed behavior. This work is raising fundamental questions about the interplay between instinctive and learned components of parenting and the mutual regulation of affiliative versus agonistic infant-directed behaviors in health and disease. Whenever possible, we point to how these technologies have helped gain novel insights and opened new avenues of research into the neurobiology of parenting. We hope this review will serve as an introduction for those new to the field, a comprehensive resource for those already studying parenting, and a guidepost for designing future studies.
Affiliation(s)
- Harris S Kaplan
- Department of Molecular and Cellular Biology, Howard Hughes Medical Institute, Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States
- Patricia M Horvath
- Department of Molecular and Cellular Biology, Howard Hughes Medical Institute, Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States
- Mohammed Mostafizur Rahman
- Department of Molecular and Cellular Biology, Howard Hughes Medical Institute, Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States
- Catherine Dulac
- Department of Molecular and Cellular Biology, Howard Hughes Medical Institute, Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States
2
Hope J, Beckerle TM, Cheng PH, Viavattine Z, Feldkamp M, Fausner SML, Saxena K, Ko E, Hryb I, Carter RE, Ebner TJ, Kodandaramaiah SB. Brain-wide neural recordings in mice navigating physical spaces enabled by robotic neural recording headstages. Nat Methods 2024; 21:2171-2181. PMID: 39375573. DOI: 10.1038/s41592-024-02434-z.
Abstract
Technologies that can record neural activity at cellular resolution across multiple spatial and temporal scales are typically much larger than the animals being recorded from and are thus limited to head-fixed subjects. Here we have engineered robotic neural recording devices, 'cranial exoskeletons', that assist mice in maneuvering recording headstages that are orders of magnitude larger and heavier than the mice while they navigate physical behavioral environments. We discovered optimal controller parameters that enable mice to locomote at physiologically realistic velocities while maintaining natural walking gaits. We show that mice learn to work with the robot to make turns and perform decision-making tasks. Robotic imaging and electrophysiology headstages were used to record Ca2+ activity of thousands of neurons distributed across the dorsal cortex and spiking activity of hundreds of neurons across multiple brain regions over multiple days, respectively.
Affiliation(s)
- James Hope
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Travis M Beckerle
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Pin-Hao Cheng
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Zoey Viavattine
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Michael Feldkamp
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Skylar M L Fausner
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Kapil Saxena
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Eunsong Ko
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Ihor Hryb
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Department of Neuroscience, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Russell E Carter
- Department of Neuroscience, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Timothy J Ebner
- Department of Neuroscience, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Suhasa B Kodandaramaiah
- Department of Mechanical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Department of Neuroscience, University of Minnesota, Twin Cities, Minneapolis, MN, USA
- Department of Biomedical Engineering, University of Minnesota, Twin Cities, Minneapolis, MN, USA
3
Tariq MF, Sterrett SC, Moore S, Lane, Perkel DJ, Gire DH. Dynamics of odor-source localization: Insights from real-time odor plume recordings and head-motion tracking in freely moving mice. PLoS One 2024; 19:e0310254. PMID: 39325742. PMCID: PMC11426488. DOI: 10.1371/journal.pone.0310254.
Abstract
Animals navigating turbulent odor plumes exhibit a rich variety of behaviors, and employ efficient strategies to locate odor sources. A growing body of literature has started to probe this complex task of localizing airborne odor sources in walking mammals to further our understanding of neural encoding and decoding of naturalistic sensory stimuli. However, correlating the intermittent olfactory information with behavior has remained a long-standing challenge due to the stochastic nature of the odor stimulus. We recently reported a method to record real-time olfactory information available to freely moving mice during odor-guided navigation, hence overcoming that challenge. Here we combine our odor-recording method with head-motion tracking to establish correlations between plume encounters and head movements. We show that mice exhibit robust head-pitch motions in the 5-14 Hz range during an odor-guided navigation task, and that these head motions are modulated by plume encounters. Furthermore, mice reduce their angles with respect to the source upon plume contact. Head motions may thus be an important part of the sensorimotor behavioral repertoire during naturalistic odor-source localization.
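The 5-14 Hz head-pitch rhythm reported in this abstract can be quantified with a simple spectral band-power measure. A minimal sketch, assuming a uniformly sampled pitch trace; the function name, sampling rate, and synthetic signal are illustrative, not from the paper:

```python
import numpy as np

def band_power(signal, fs, f_lo=5.0, f_hi=14.0):
    """Fraction of total (mean-subtracted) power in [f_lo, f_hi] Hz."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[band].sum() / spectrum.sum()

# Synthetic example: a 10 Hz "head bob" riding on slow drift, 100 Hz sampling.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
pitch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 0.3 * t)
print(band_power(pitch, fs))  # most power falls in the 5-14 Hz band
```

Comparing this fraction between plume-contact and no-contact epochs would be one simple way to test for the modulation the authors describe.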
Affiliation(s)
- Mohammad F. Tariq
- Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Scott C. Sterrett
- Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Sidney Moore
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Lane
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Department of Psychology, Seattle University, Seattle, Washington, United States of America
- David J. Perkel
- Departments of Biology & Otolaryngology, University of Washington, Seattle, Washington, United States of America
- David H. Gire
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
4
Fu J, Pierzchlewicz PA, Willeke KF, Bashiri M, Muhammad T, Diamantaki M, Froudarakis E, Restivo K, Ponder K, Denfield GH, Sinz F, Tolias AS, Franke K. Heterogeneous orientation tuning in the primary visual cortex of mice diverges from Gabor-like receptive fields in primates. Cell Rep 2024; 43:114639. PMID: 39167488. PMCID: PMC11463840. DOI: 10.1016/j.celrep.2024.114639.
Abstract
A key feature of neurons in the primary visual cortex (V1) of primates is their orientation selectivity. Recent studies using deep neural network models showed that the most exciting inputs (MEIs) for mouse V1 neurons exhibit complex spatial structures that predict non-uniform orientation selectivity across the receptive field (RF), in contrast to the classical Gabor filter model. Using local patches of drifting gratings, we identified heterogeneous orientation tuning in mouse V1 that varied by up to 90° across sub-regions of the RF. This heterogeneity correlated with deviations from optimal Gabor filters and was consistent across cortical layers and recording modalities (calcium vs. spikes). In contrast, model-synthesized MEIs for macaque V1 neurons were predominantly Gabor-like, consistent with previous studies. These findings suggest that complex spatial feature selectivity emerges earlier in the visual pathway in mice than in primates. This may provide a faster, though less general, method of extracting task-relevant information.
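For readers unfamiliar with the classical model this study tests against: a Gabor receptive field is an oriented sinusoid under a Gaussian envelope, and can be written in a few lines. A sketch with illustrative parameter values (not fit to any data from the paper); the heterogeneous RFs reported above are precisely those that deviate from any single such filter:

```python
import numpy as np

def gabor(size=32, sigma=5.0, theta=0.0, freq=0.12, phase=0.0):
    """Classical Gabor receptive-field model: an oriented cosine
    grating (carrier) multiplied by a Gaussian envelope."""
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)   # axis along the grating
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * freq * xr + phase)
    return envelope * carrier

rf = gabor(theta=np.pi / 4)   # oblique-preferring model RF
print(rf.shape)
```

A filter like this responds strongly to a grating matching its orientation and weakly to the orthogonal one, which is the uniform-tuning prediction the paper's grating-patch experiments contradict for mouse V1.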
Affiliation(s)
- Jiakun Fu
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA
- Paweł A Pierzchlewicz
- Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany; Georg-August University Göttingen, Göttingen, Germany
- Konstantin F Willeke
- Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany; Georg-August University Göttingen, Göttingen, Germany
- Mohammad Bashiri
- Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany; Georg-August University Göttingen, Göttingen, Germany
- Taliah Muhammad
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA
- Maria Diamantaki
- Institute of Molecular Biology & Biotechnology, Foundation of Research & Technology - Hellas, Heraklion, Crete, Greece; School of Medicine, University of Crete, Heraklion, Crete, Greece
- Emmanouil Froudarakis
- Institute of Molecular Biology & Biotechnology, Foundation of Research & Technology - Hellas, Heraklion, Crete, Greece; School of Medicine, University of Crete, Heraklion, Crete, Greece
- Kelli Restivo
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA
- Kayla Ponder
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA
- George H Denfield
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA
- Fabian Sinz
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA; Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany; Georg-August University Göttingen, Göttingen, Germany
- Andreas S Tolias
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA; Department of Ophthalmology, Byers Eye Institute, Stanford University School of Medicine, Stanford, CA 94303, USA; Stanford Bio-X, Stanford University, Stanford, CA 94305, USA; Wu Tsai Neurosciences Institute, Stanford University, Stanford, CA 94305, USA; Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA
- Katrin Franke
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX 77030, USA; Department of Ophthalmology, Byers Eye Institute, Stanford University School of Medicine, Stanford, CA 94303, USA; Stanford Bio-X, Stanford University, Stanford, CA 94305, USA; Wu Tsai Neurosciences Institute, Stanford University, Stanford, CA 94305, USA
5
González-Rueda A, Jensen K, Noormandipour M, de Malmazet D, Wilson J, Ciabatti E, Kim J, Williams E, Poort J, Hennequin G, Tripodi M. Kinetic features dictate sensorimotor alignment in the superior colliculus. Nature 2024; 631:378-385. PMID: 38961292. PMCID: PMC11236723. DOI: 10.1038/s41586-024-07619-2.
Abstract
The execution of goal-oriented behaviours requires a spatially coherent alignment between sensory and motor maps. The current model for sensorimotor transformation in the superior colliculus relies on the topographic mapping of static spatial receptive fields onto movement endpoints [1-6]. Here, to experimentally assess the validity of this canonical static model of alignment, we dissected the visuo-motor network in the superior colliculus and performed in vivo intracellular and extracellular recordings across layers, in restrained and unrestrained conditions, to assess both the motor and the visual tuning of individual motor and premotor neurons. We found that collicular motor units have poorly defined static visual spatial receptive fields and respond instead to kinetic visual features, revealing the existence of a direct alignment in vectorial space between sensory and movement vectors, rather than between spatial receptive fields and movement endpoints as canonically hypothesized. We show that a neural network built according to these kinetic alignment principles is ideally placed to sustain ethological behaviours such as the rapid interception of moving and static targets. These findings reveal a novel dimension of the sensorimotor alignment process. By extending the alignment from the static to the kinetic domain, this work provides a novel conceptual framework for understanding the nature of sensorimotor convergence and its relevance in guiding goal-directed behaviours.
Affiliation(s)
- Ana González-Rueda
- MRC Laboratory of Molecular Biology, Cambridge, UK
- St Edmund's College, University of Cambridge, Cambridge, UK
- Jisoo Kim
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, UK
- Jasper Poort
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, UK
- Guillaume Hennequin
- MRC Laboratory of Molecular Biology, Cambridge, UK
- Department of Engineering, University of Cambridge, Cambridge, UK
6
Tzanou A, Theodorou E, Mantas I, Dalezios Y. Excitatory Projections of Wide Field Collicular Neurons to the Nucleus of the Optic Tract in the Rat. J Comp Neurol 2024; 532:e25651. PMID: 38961597. DOI: 10.1002/cne.25651.
Abstract
The superficial layers of the mammalian superior colliculus (SC) contain neurons that are generally responsive to visual stimuli but can differ considerably in morphology and response properties. To elucidate the structure and function of these neurons, we combined extracellular recording and juxtacellular labeling, detailed anatomical reconstruction, and ultrastructural analysis of the synaptic contacts of labeled neurons using transmission electron microscopy. Our labeled neurons project to different brainstem nuclei. Of particular importance are neurons that fit the morphological criteria of wide field (WF) neurons and whose dendrites are horizontally oriented. They display a rather characteristic axonal projection pattern to the nucleus of the optic tract (NOT); thus, we call them superior collicular WF neurons projecting to the NOT (SCWFNOT). We corroborated the morphological characterization of this neuronal type as a distinct class with the help of unsupervised hierarchical cluster analysis. Our ultrastructural data demonstrate that SCWFNOT neurons establish excitatory connections with their targets in the NOT. Although the rodent literature on WF neurons has focused on their extensive projection to the lateral posterior nucleus of the thalamus, as a conduit for information to reach the visual association areas of the cortex, our data suggest that this subclass of WF neurons may participate in optokinetic nystagmus.
Affiliation(s)
- Athanasia Tzanou
- School of Medicine, University of Crete, Heraklion, Greece
- Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology-Hellas (FORTH), Heraklion, Greece
- Eirini Theodorou
- School of Medicine, University of Crete, Heraklion, Greece
- Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology-Hellas (FORTH), Heraklion, Greece
- Ioannis Mantas
- School of Medicine, University of Crete, Heraklion, Greece
- Yannis Dalezios
- School of Medicine, University of Crete, Heraklion, Greece
- Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology-Hellas (FORTH), Heraklion, Greece
7
Skyberg RJ, Niell CM. Natural visual behavior and active sensing in the mouse. Curr Opin Neurobiol 2024; 86:102882. PMID: 38704868. PMCID: PMC11254345. DOI: 10.1016/j.conb.2024.102882.
Abstract
In the natural world, animals use vision for a wide variety of behaviors not reflected in most laboratory paradigms. Although mice have low-acuity vision, they use their vision for many natural behaviors, including predator avoidance, prey capture, and navigation. They also perform active sensing, moving their head and eyes to achieve behavioral goals and acquire visual information. These aspects of natural vision result in visual inputs and corresponding behavioral outputs that are outside the range of conventional vision studies but are essential aspects of visual function. Here, we review recent studies in mice that have tapped into natural behavior and active sensing to reveal the computational logic of neural circuits for vision.
Affiliation(s)
- Rolf J Skyberg
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Cristopher M Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
8
Oesch LT, Ryan MB, Churchland AK. From innate to instructed: A new look at perceptual decision-making. Curr Opin Neurobiol 2024; 86:102871. PMID: 38569230. PMCID: PMC11162954. DOI: 10.1016/j.conb.2024.102871.
Abstract
Understanding how subjects perceive sensory stimuli in their environment and use this information to guide appropriate actions is a major challenge in neuroscience. To study perceptual decision-making in animals, researchers use tasks that either probe spontaneous responses to stimuli (often described as "naturalistic") or train animals to associate stimuli with experimenter-defined responses. Spontaneous decisions rely on animals' pre-existing knowledge, while trained tasks offer greater versatility, albeit often at the cost of extensive training. Here, we review emerging approaches to investigate perceptual decision-making using both spontaneous and trained behaviors, highlighting their strengths and limitations. Additionally, we propose how trained decision-making tasks could be improved to achieve faster learning and a more generalizable understanding of task rules.
Affiliation(s)
- Lukas T Oesch
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Michael B Ryan
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Anne K Churchland
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
9
Ambrad Giovannetti E, Rancz E. Behind mouse eyes: The function and control of eye movements in mice. Neurosci Biobehav Rev 2024; 161:105671. PMID: 38604571. DOI: 10.1016/j.neubiorev.2024.105671.
Abstract
The mouse visual system has become the most popular model to study the cellular and circuit mechanisms of sensory processing. However, the importance of eye movements only started to be appreciated recently. Eye movements provide a basis for predictive sensing and deliver insights into various brain functions and dysfunctions. A plethora of knowledge on the central control of eye movements and their role in perception and behaviour arose from work on primates. However, an overview of various eye movements in mice and a comparison to primates is missing. Here, we review the eye movement types described to date in mice and compare them to those observed in primates. We discuss the central neuronal mechanisms for their generation and control. Furthermore, we review the mounting literature on eye movements in mice during head-fixed and freely moving behaviours. Finally, we highlight gaps in our understanding and suggest future directions for research.
Affiliation(s)
- Ede Rancz
- INMED, INSERM, Aix-Marseille University, Marseille, France
10
Singh VP, Li J, Mitchell J, Miller C. Active vision in freely moving marmosets using head-mounted eye tracking. bioRxiv [preprint] 2024:2024.05.11.593707. PMID: 38766147. PMCID: PMC11100783. DOI: 10.1101/2024.05.11.593707.
Abstract
Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. As most data have been limited to chaired and typically head-restrained animals, the synergistic interactions of the different motor actions/plans inherent to active sensing (e.g., eyes, head, posture, movement) on visual perception are largely unknown. To address this considerable gap in knowledge, we developed an innovative wireless head-mounted eye-tracking system called CEREBRO for small mammals, such as marmoset monkeys. Our system performs Chair-free Eye-Recording using Backpack-mounted micROcontrollers. Because eye illumination and environmental lighting change continuously in natural contexts, we developed a segmentation artificial neural network to perform robust pupil tracking in these conditions. Leveraging this system to investigate active vision, we demonstrate that although freely moving marmosets exhibit frequent compensatory eye movements equivalent to those of other primates, including humans, the predictability of the visual system is enhanced when animals are freely moving relative to when they are head-fixed. Moreover, despite increases in eye/head motion during locomotion, gaze stabilization actually improved relative to periods when the monkeys were stationary. Rather than impairing vision, the dynamics of gaze stabilization in freely moving primates have been optimized over evolution to enable active sensing during natural exploration.
Affiliation(s)
- Vikram Pal Singh
- Cortical Systems & Behavior Lab, University of California San Diego
- Jingwen Li
- Cortical Systems & Behavior Lab, University of California San Diego
- Jude Mitchell
- Department of Brain and Cognitive Science, University of Rochester
- Cory Miller
- Cortical Systems & Behavior Lab, University of California San Diego
- Neurosciences Graduate Program, University of California San Diego
11
Clayton KK, Stecyk KS, Guo AA, Chambers AR, Chen K, Hancock KE, Polley DB. Sound elicits stereotyped facial movements that provide a sensitive index of hearing abilities in mice. Curr Biol 2024; 34:1605-1620.e5. PMID: 38492568. PMCID: PMC11043000. DOI: 10.1016/j.cub.2024.02.057.
Abstract
Sound elicits rapid movements of muscles in the face, ears, and eyes that protect the body from injury and trigger brain-wide internal state changes. Here, we performed quantitative facial videography from mice resting atop a piezoelectric force plate and observed that broadband sounds elicited rapid and stereotyped facial twitches. Facial motion energy (FME) adjacent to the whisker array was 30 dB more sensitive than the acoustic startle reflex and offered greater inter-trial and inter-animal reliability than sound-evoked pupil dilations or movement of other facial and body regions. FME tracked the low-frequency envelope of broadband sounds, providing a means to study behavioral discrimination of complex auditory stimuli, such as speech phonemes in noise. Approximately 25% of layer 5-6 units in the auditory cortex (ACtx) exhibited firing rate changes during facial movements. However, FME facilitation during ACtx photoinhibition indicated that sound-evoked facial movements were mediated by a midbrain pathway and modulated by descending corticofugal input. FME and auditory brainstem response (ABR) thresholds were closely aligned after noise-induced sensorineural hearing loss, yet FME growth slopes were disproportionately steep at spared frequencies, reflecting a central plasticity that matched commensurate changes in ABR wave 4. Sound-evoked facial movements were also hypersensitive in Ptchd1 knockout mice, highlighting the use of FME for identifying sensory hyper-reactivity phenotypes after adult-onset hyperacusis and inherited deficiencies in autism risk genes. These findings present a sensitive and integrative measure of hearing while also highlighting that even low-intensity broadband sounds can elicit a complex mixture of auditory, motor, and reafferent somatosensory neural activity.
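Facial motion energy (FME) of the kind used in this study is commonly computed as frame-to-frame pixel change within a region of interest near the whisker pad. A generic sketch on toy data, not the authors' exact pipeline; the function name, ROI format, and synthetic "video" are illustrative:

```python
import numpy as np

def motion_energy(frames, roi):
    """Per-frame motion energy: mean absolute pixel change inside a
    rectangular region of interest (r0, r1, c0, c1)."""
    r0, r1, c0, c1 = roi
    patch = frames[:, r0:r1, c0:c1].astype(float)
    return np.abs(np.diff(patch, axis=0)).mean(axis=(1, 2))

# Toy video: 5 identical 64x64 noise frames, except frame 3 "twitches".
rng = np.random.default_rng(0)
frames = np.repeat(rng.integers(0, 255, (1, 64, 64)), 5, axis=0)
frames[3, 10:20, 10:20] += 50
fme = motion_energy(frames, (0, 64, 0, 64))
print(fme)  # nonzero only at the transitions into and out of frame 3
```

Aligning such a trace to sound onsets, as the authors do at much higher fidelity, would reveal sound-evoked facial twitches as brief FME transients.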
Affiliation(s)
- Kameron K Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Kamryn S Stecyk
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna A Guo
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna R Chambers
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Ke Chen
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Kenneth E Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
12
Yamashita J, Takimoto Y, Oishi H, Kumada T. How do personality traits modulate real-world gaze behavior? Generated gaze data shows situation-dependent modulations. Front Psychol 2024; 14:1144048. PMID: 38268808. PMCID: PMC10805946. DOI: 10.3389/fpsyg.2023.1144048.
Abstract
It has both scientific and practical benefits to substantiate the theoretical prediction that personality (Big Five) traits systematically modulate gaze behavior in various real-world (working) situations. Nevertheless, previous methods that required controlled situations and large numbers of participants failed to incorporate real-world personality modulation analysis. One cause of this research gap is the mixed effects of individual attributes (e.g., the accumulated attributes of age, gender, and degree of measurement noise) and personality traits in gaze data. Previous studies may have used larger sample sizes to average out the possible concentration of specific individual attributes in some personality traits, and may have imposed control situations to prevent unexpected interactions between these possibly biased individual attributes and complex, realistic situations. Therefore, we generated and analyzed real-world gaze behavior where the effects of personality traits are separated out from individual attributes. In Experiment 1, we successfully provided a methodology for generating such sensor data on head and eye movements for a small sample of participants who performed realistic nonsocial (data-entry) and social (conversation) work tasks (i.e., the first contribution). In Experiment 2, we evaluated the effectiveness of generated gaze behavior for real-world personality modulation analysis. We successfully showed how openness systematically modulates the autocorrelation coefficients of sensor data, reflecting the period of head and eye movements in data-entry and conversation tasks (i.e., the second contribution). We found different openness modulations in the autocorrelation coefficients from the generated sensor data of the two tasks. These modulations could not be detected using real sensor data because of the contamination of individual attributes. In conclusion, our method is a potentially powerful tool for understanding theoretically expected, systematic situation-specific personality modulation of real-world gaze behavior.
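The analysis above reads personality modulation out of autocorrelation coefficients of head- and eye-movement sensor traces. A minimal sketch of that computation, with a synthetic periodic signal standing in for real sensor data (the signal, noise level, and lag range are illustrative assumptions, not the study's parameters):

```python
import numpy as np

def autocorr(x, max_lag):
    """Autocorrelation coefficients of a 1-D time series for lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

# Synthetic "gaze" trace: a periodic head-movement rhythm plus noise (illustrative).
rng = np.random.default_rng(0)
t = np.arange(1000)
period = 50  # samples per movement cycle
signal = np.sin(2 * np.pi * t / period) + 0.3 * rng.standard_normal(t.size)

coeffs = autocorr(signal, max_lag=70)
# The dominant movement period appears as a peak near lag == period.
peak_lag = int(np.argmax(coeffs[25:])) + 26
```

A peak in the coefficients near the movement period is the kind of feature whose strength could then be compared across openness scores.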
Affiliation(s)
- Jumpei Yamashita
- NTT Access Network Service Systems Laboratories, Nippon Telegraph and Telephone Corporation, Tokyo, Japan
- Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Yoshiaki Takimoto
- NTT Human Informatics Laboratories, Nippon Telegraph and Telephone Corporation, Kanagawa, Japan
- Haruo Oishi
- NTT Access Network Service Systems Laboratories, Nippon Telegraph and Telephone Corporation, Tokyo, Japan

13
Syeda A, Zhong L, Tung R, Long W, Pachitariu M, Stringer C. Facemap: a framework for modeling neural activity based on orofacial tracking. Nat Neurosci 2024; 27:187-195. PMID: 37985801; PMCID: PMC10774130; DOI: 10.1038/s41593-023-01490-6.
Abstract
Recent studies in mice have shown that orofacial behaviors drive a large fraction of neural activity across the brain. To understand the nature and function of these signals, we need better computational models to characterize the behaviors and relate them to neural activity. Here we developed Facemap, a framework consisting of a keypoint tracker and a deep neural network encoder for predicting neural activity. Our algorithm for tracking mouse orofacial behaviors was more accurate than existing pose estimation tools, while the processing speed was several times faster, making it a powerful tool for real-time experimental interventions. The Facemap tracker was easy to adapt to data from new labs, requiring as few as 10 annotated frames for near-optimal performance. We used the keypoints as inputs to a deep neural network which predicts the activity of ~50,000 simultaneously-recorded neurons and, in visual cortex, we doubled the amount of explained variance compared to previous methods. Using this model, we found that the neuronal activity clusters that were well predicted from behavior were more spatially spread out across cortex. We also found that the deep behavioral features from the model had stereotypical, sequential dynamics that were not reversible in time. In summary, Facemap provides a stepping stone toward understanding the function of the brain-wide neural signals and their relation to behavior.
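Facemap's encoder is a deep neural network; as a much-simplified stand-in, the idea of predicting neural activity from tracked keypoints, and scoring it by explained variance, can be sketched with ridge regression on synthetic data (the linear model, sizes, and noise level are illustrative assumptions, not Facemap's architecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_keypoints, n_neurons = 2000, 30, 50  # toy sizes, not Facemap's

# Synthetic keypoint coordinates and neural activity linearly driven by them.
K = rng.standard_normal((n_frames, n_keypoints))
W_true = rng.standard_normal((n_keypoints, n_neurons))
Y = K @ W_true + 0.5 * rng.standard_normal((n_frames, n_neurons))

# Ridge regression from behavior (keypoints) to neural activity.
lam = 1.0
W = np.linalg.solve(K.T @ K + lam * np.eye(n_keypoints), K.T @ Y)
Y_hat = K @ W

# Fraction of explained variance, the metric used to compare encoding models.
ev = 1 - ((Y - Y_hat) ** 2).sum() / ((Y - Y.mean(0)) ** 2).sum()
```

Replacing the linear map with a deep network is what allows models like Facemap's to capture the nonlinear, temporally structured behavior-to-activity relationship.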
Affiliation(s)
- Atika Syeda
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Lin Zhong
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Renee Tung
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Will Long
- HHMI Janelia Research Campus, Ashburn, VA, USA

14
Blanco-Hernández E, Balsamo G, Preston-Ferrer P, Burgalossi A. Sensory and behavioral modulation of thalamic head-direction cells. Nat Neurosci 2024; 27:28-33. PMID: 38177338; DOI: 10.1038/s41593-023-01506-1.
Abstract
Head-direction (HD) neurons are thought to exclusively encode directional heading. In awake mice, we found that sensory stimuli evoked robust short-latency responses in thalamic HD cells, but not in non-HD neurons. The activity of HD cells, but not that of non-HD neurons, was tightly correlated to brain-state fluctuations and dynamically modulated during social interactions. These data point to a new role for the thalamic compass in relaying sensory and behavioral-state information.
Affiliation(s)
- Eduardo Blanco-Hernández
- Institute of Neurobiology, University of Tübingen, Tübingen, Germany
- Werner-Reichardt Centre for Integrative Neuroscience, Tübingen, Germany
- Giuseppe Balsamo
- Institute of Neurobiology, University of Tübingen, Tübingen, Germany
- Werner-Reichardt Centre for Integrative Neuroscience, Tübingen, Germany
- Graduate Training Centre of Neuroscience, IMPRS, Tübingen, Germany
- Patricia Preston-Ferrer
- Institute of Neurobiology, University of Tübingen, Tübingen, Germany
- Werner-Reichardt Centre for Integrative Neuroscience, Tübingen, Germany
- Andrea Burgalossi
- Institute of Neurobiology, University of Tübingen, Tübingen, Germany
- Werner-Reichardt Centre for Integrative Neuroscience, Tübingen, Germany

15
Xu A, Hou Y, Niell CM, Beyeler M. Multimodal deep learning model unveils behavioral dynamics of V1 activity in freely moving mice. Adv Neural Inf Process Syst 2023; 36:15341-15357. PMID: 39005944; PMCID: PMC11242920.
Abstract
Despite their immense success as a model of macaque visual cortex, deep convolutional neural networks (CNNs) have struggled to predict activity in visual cortex of the mouse, which is thought to be strongly dependent on the animal's behavioral state. Furthermore, most computational models focus on predicting neural responses to static images presented under head fixation, which are dramatically different from the dynamic, continuous visual stimuli that arise during movement in the real world. Consequently, it is still unknown how natural visual input and different behavioral variables may integrate over time to generate responses in primary visual cortex (V1). To address this, we introduce a multimodal recurrent neural network that integrates gaze-contingent visual input with behavioral and temporal dynamics to explain V1 activity in freely moving mice. We show that the model achieves state-of-the-art predictions of V1 activity during free exploration and demonstrate the importance of each component in an extensive ablation study. Analyzing our model using maximally activating stimuli and saliency maps, we reveal new insights into cortical function, including the prevalence of mixed selectivity for behavioral variables in mouse V1. In summary, our model offers a comprehensive deep-learning framework for exploring the computational principles underlying V1 neurons in freely-moving animals engaged in natural behavior.
Affiliation(s)
- Aiwen Xu
- Department of Computer Science, University of California, Santa Barbara, Santa Barbara, CA 93117, USA
- Yuchen Hou
- Department of Computer Science, University of California, Santa Barbara, Santa Barbara, CA 93117, USA
- Cristopher M Niell
- Department of Biology, Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Michael Beyeler
- Department of Computer Science, Department of Psychological & Brain Sciences, University of California, Santa Barbara, Santa Barbara, CA 93117, USA

16
Parker PRL, Martins DM, Leonard ESP, Casey NM, Sharp SL, Abe ETT, Smear MC, Yates JL, Mitchell JF, Niell CM. A dynamic sequence of visual processing initiated by gaze shifts. Nat Neurosci 2023; 26:2192-2202. PMID: 37996524; PMCID: PMC11270614; DOI: 10.1038/s41593-023-01481-7.
Abstract
Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, where head movements are accompanied by saccadic eye movements, rather than to head movements where compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting a large component corresponds to onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.
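The distinction drawn above, between gaze shifts (head movement plus a saccade in the same direction) and compensatory epochs (the eye counter-rotates so gaze stays stable), reduces to whether eye velocity cancels or adds to head velocity. A toy classifier along those lines (thresholds, units, and traces are illustrative assumptions, not the study's actual criteria):

```python
import numpy as np

def classify_head_movements(head_vel, eye_vel, move_thresh=50.0, gaze_thresh=30.0):
    """Label each sample: 'gaze_shift' if the head moves and gaze moves with it,
    'compensatory' if the head moves but the eye cancels it (gaze stable),
    else 'still'. Velocities in deg/s; gaze velocity = head + eye."""
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)
    gaze_vel = head_vel + eye_vel
    return np.where(
        np.abs(head_vel) < move_thresh, "still",
        np.where(np.abs(gaze_vel) > gaze_thresh, "gaze_shift", "compensatory"),
    )

# Toy traces: a compensatory epoch (eye counter-rotates), then a gaze shift.
head = np.array([0.0, 100.0, 100.0, 100.0, 100.0, 0.0])
eye = np.array([0.0, -95.0, -95.0, 80.0, 80.0, 0.0])
labels = classify_head_movements(head, eye)
```

Neural responses could then be aligned to the onsets of the 'gaze_shift' epochs, which is the event class that drove the sequential V1 dynamics reported above.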
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ, USA
- Dylan M Martins
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Nathan M Casey
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Shelby L Sharp
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Matthew C Smear
- Institute of Neuroscience and Department of Psychology, University of Oregon, Eugene, OR, USA
- Jacob L Yates
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Jude F Mitchell
- Department of Brain and Cognitive Sciences and Center for Visual Sciences, University of Rochester, Rochester, NY, USA
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA

17
Hope J, Beckerle T, Cheng PH, Viavattine Z, Feldkamp M, Fausner S, Saxena K, Ko E, Hryb I, Carter R, Ebner T, Kodandaramaiah S. Brain-wide neural recordings in mice navigating physical spaces enabled by a cranial exoskeleton. Res Sq 2023:rs.3.rs-3491330 (preprint). PMID: 38014260; PMCID: PMC10680923; DOI: 10.21203/rs.3.rs-3491330/v1.
Abstract
Complex behaviors are mediated by neural computations occurring throughout the brain. In recent years, tremendous progress has been made in developing technologies that can record neural activity at cellular resolution at multiple spatial and temporal scales. However, these technologies are primarily designed for studying the mammalian brain during head fixation - wherein the behavior of the animal is highly constrained. Miniaturized devices for studying neural activity in freely behaving animals are largely confined to recording from small brain regions owing to performance limitations. We present a cranial exoskeleton that assists mice in maneuvering neural recording headstages that are orders of magnitude larger and heavier than the mice, while they navigate physical behavioral environments. Force sensors embedded within the headstage are used to detect the mouse's milli-Newton scale cranial forces which then control the x, y, and yaw motion of the exoskeleton via an admittance controller. We discovered optimal controller tuning parameters that enable mice to locomote at physiologically realistic velocities and accelerations while maintaining natural walking gait. Mice maneuvering headstages weighing up to 1.5 kg can make turns, navigate 2D arenas, and perform a navigational decision-making task with the same performance as when freely behaving. We designed an imaging headstage and an electrophysiology headstage for the cranial exoskeleton to record brain-wide neural activity in mice navigating 2D arenas. The imaging headstage enabled recordings of Ca2+ activity of 1000s of neurons distributed across the dorsal cortex. The electrophysiology headstage supported independent control of up to 4 silicon probes, enabling simultaneous recordings from 100s of neurons across multiple brain regions and multiple days. Cranial exoskeletons provide flexible platforms for large-scale neural recording during the exploration of physical spaces, a critical new paradigm for unraveling the brain-wide neural mechanisms that control complex behavior.
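The admittance control scheme described above turns measured cranial forces into commanded exoskeleton motion. A minimal single-axis, discrete-time sketch of that idea (the virtual mass, damping, time step, and force magnitude are illustrative assumptions, not the published controller gains):

```python
def admittance_step(v, force, mass=2.0, damping=5.0, dt=0.001):
    """One Euler step of the admittance law  M*dv/dt + B*v = F:
    the measured force drives a virtual mass-damper, and the resulting
    velocity becomes the commanded stage velocity for that axis."""
    dv = (force - damping * v) / mass
    return v + dv * dt

# A sustained 0.01 N (10 mN, i.e., milli-Newton scale) push converges to the
# steady-state velocity F/B = 0.002 m/s = 2 mm/s.
v = 0.0
for _ in range(20000):  # 20 s of simulated time at dt = 1 ms
    v = admittance_step(v, force=0.01)
```

Tuning the virtual mass and damping is exactly the "controller tuning" the abstract refers to: lower values make the heavy headstage feel lighter to the mouse, at the cost of stability margins.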
Affiliation(s)
- James Hope
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Travis Beckerle
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Pin-Hao Cheng
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Zoey Viavattine
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Michael Feldkamp
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Skylar Fausner
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Kapil Saxena
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Eunsong Ko
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Ihor Hryb
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Neuroscience, University of Minnesota, Twin Cities
- Russell Carter
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Timothy Ebner
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Suhasa Kodandaramaiah
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Department of Neuroscience, University of Minnesota, Twin Cities

18
Asokan MM, Watanabe Y, Kimchi EY, Polley DB. Potentiation of cholinergic and corticofugal inputs to the lateral amygdala in threat learning. Cell Rep 2023; 42:113167. PMID: 37742187; PMCID: PMC10879743; DOI: 10.1016/j.celrep.2023.113167.
Abstract
The amygdala, cholinergic basal forebrain, and higher-order auditory cortex (HO-AC) regulate brain-wide plasticity underlying auditory threat learning. Here, we perform multi-regional extracellular recordings and optical measurements of acetylcholine (ACh) release to characterize the development of discriminative plasticity within and between these brain regions as mice acquire and recall auditory threat memories. Spiking responses are potentiated for sounds paired with shock (CS+) in the lateral amygdala (LA) and optogenetically identified corticoamygdalar projection neurons, although not in neighboring HO-AC units. Spike- or optogenetically triggered local field potentials reveal enhanced corticofugal-but not corticopetal-functional coupling between HO-AC and LA during threat memory recall that is correlated with pupil-indexed memory strength. We also note robust sound-evoked ACh release that rapidly potentiates for the CS+ in LA but habituates across sessions in HO-AC. These findings highlight a distributed and cooperative plasticity in LA inputs as mice learn to reappraise neutral stimuli as possible threats.
Affiliation(s)
- Meenakshi M Asokan
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Division of Medical Sciences, Harvard Medical School, Boston, MA 02114, USA
- Yurika Watanabe
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Eyal Y Kimchi
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Neurology, Massachusetts General Hospital, Boston, MA 02114, USA
- Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Division of Medical Sciences, Harvard Medical School, Boston, MA 02114, USA; Department of Otolaryngology - Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA

19
Chen J, Gish CM, Fransen JW, Salazar-Gatzimas E, Clark DA, Borghuis BG. Direct comparison reveals algorithmic similarities in fly and mouse visual motion detection. iScience 2023; 26:107928. PMID: 37810236; PMCID: PMC10550730; DOI: 10.1016/j.isci.2023.107928.
Abstract
Evolution has equipped vertebrates and invertebrates with neural circuits that selectively encode visual motion. While similarities in the computations performed by these circuits in mouse and fruit fly have been noted, direct experimental comparisons have been lacking. Because molecular mechanisms and neuronal morphology in the two species are distinct, we directly compared motion encoding in these two species at the algorithmic level, using matched stimuli and focusing on a pair of analogous neurons, the mouse ON starburst amacrine cell (ON SAC) and Drosophila T4 neurons. We find that the cells share similar spatiotemporal receptive field structures, sensitivity to spatiotemporal correlations, and tuning to sinusoidal drifting gratings, but differ in their responses to apparent motion stimuli. Both neuron types showed a response to summed sinusoids that deviates from models for motion processing in these cells, underscoring the similarities in their processing and identifying response features that remain to be explained.
Affiliation(s)
- Juyue Chen
- Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Caitlin M Gish
- Department of Physics, Yale University, New Haven, CT 06511, USA
- James W Fransen
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
- Damon A Clark
- Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Molecular, Cellular, Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Bart G Borghuis
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA

20
Cruz TL, Chiappe ME. Multilevel visuomotor control of locomotion in Drosophila. Curr Opin Neurobiol 2023; 82:102774. PMID: 37651855; DOI: 10.1016/j.conb.2023.102774.
Abstract
Vision is critical for the control of locomotion, but the neural mechanisms by which visuomotor circuits contribute to the movement of the body through space are not yet well understood. Locomotion engages multiple control systems, forming distinct interacting "control levels" driven by the activity of distributed and overlapping circuits. Therefore, a comprehensive understanding of the mechanisms underlying locomotion control requires the consideration of all control levels and their necessary coordination. Due to its small size and the wide availability of experimental tools, Drosophila has become an important model system to study this coordination. Traditionally, studies of insect locomotion have focused either on the biomechanics and local control of limbs, or on navigation and course control. However, recent developments in tracking techniques and in physiological and genetic tools in Drosophila have prompted researchers to examine multilevel control coordination in flight and walking.
Affiliation(s)
- Tomás L Cruz
- Champalimaud Research, Champalimaud Centre for the Unknown, 1400-038 Lisbon, Portugal
- M Eugenia Chiappe
- Champalimaud Research, Champalimaud Centre for the Unknown, 1400-038 Lisbon, Portugal

21
Keshavarzi S, Velez-Fort M, Margrie TW. Cortical integration of vestibular and visual cues for navigation, visual processing, and perception. Annu Rev Neurosci 2023; 46:301-320. PMID: 37428601; PMCID: PMC7616138; DOI: 10.1146/annurev-neuro-120722-100503.
Abstract
Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within cortical sensory representation and how they might be relied upon for sensory-driven decision-making, during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves both the cortical representation and perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on cortical circuits involved in visual perception and spatial navigation and highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and access to such information by the cortex is used for sensory perception and predictions that may be implemented for rapid, navigation-related decision-making.
Affiliation(s)
- Sepiedeh Keshavarzi
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Mateo Velez-Fort
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Troy W Margrie
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom

22
Zahler SH, Taylor DE, Wright BS, Wong JY, Shvareva VA, Park YA, Feinberg EH. Hindbrain modules differentially transform activity of single collicular neurons to coordinate movements. Cell 2023; 186:3062-3078.e20. PMID: 37343561; PMCID: PMC10424787; DOI: 10.1016/j.cell.2023.05.031.
Abstract
Seemingly simple behaviors such as swatting a mosquito or glancing at a signpost involve the precise coordination of multiple body parts. Neural control of coordinated movements is widely thought to entail transforming a desired overall displacement into displacements for each body part. Here we reveal a different logic implemented in the mouse gaze system. Stimulating superior colliculus (SC) elicits head movements with stereotyped displacements but eye movements with stereotyped endpoints. This is achieved by individual SC neurons whose branched axons innervate modules in medulla and pons that drive head movements with stereotyped displacements and eye movements with stereotyped endpoints, respectively. Thus, single neurons specify a mixture of endpoints and displacements for different body parts, not overall displacement, with displacements for different body parts computed at distinct anatomical stages. Our study establishes an approach for unraveling motor hierarchies and identifies a logic for coordinating movements and the resulting pose.
Affiliation(s)
- Sebastian H Zahler
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA; Neuroscience Graduate Program, University of California, San Francisco, San Francisco, CA 94143, USA
- David E Taylor
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA; Neuroscience Graduate Program, University of California, San Francisco, San Francisco, CA 94143, USA
- Brennan S Wright
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA; Neuroscience Graduate Program, University of California, San Francisco, San Francisco, CA 94143, USA
- Joey Y Wong
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA
- Varvara A Shvareva
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA
- Yusol A Park
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA
- Evan H Feinberg
- Department of Anatomy, University of California, San Francisco, San Francisco, CA 94143, USA; Neuroscience Graduate Program, University of California, San Francisco, San Francisco, CA 94143, USA; Kavli Institute for Fundamental Neuroscience, University of California, San Francisco, San Francisco, CA 94143, USA; Center for Integrative Neuroscience, University of California, San Francisco, San Francisco, CA 94143, USA

23
Mimica B, Tombaz T, Battistin C, Fuglstad JG, Dunn BA, Whitlock JR. Behavioral decomposition reveals rich encoding structure employed across neocortex in rats. Nat Commun 2023; 14:3947. PMID: 37402724; PMCID: PMC10319800; DOI: 10.1038/s41467-023-39520-3.
Abstract
The cortical population code is pervaded by activity patterns evoked by movement, but it remains largely unknown how such signals relate to natural behavior or how they might support processing in sensory cortices where they have been observed. To address this we compared high-density neural recordings across four cortical regions (visual, auditory, somatosensory, motor) in relation to sensory modulation, posture, movement, and ethograms of freely foraging male rats. Momentary actions, such as rearing or turning, were represented ubiquitously and could be decoded from all sampled structures. However, more elementary and continuous features, such as pose and movement, followed region-specific organization, with neurons in visual and auditory cortices preferentially encoding mutually distinct head-orienting features in world-referenced coordinates, and somatosensory and motor cortices principally encoding the trunk and head in egocentric coordinates. The tuning properties of synaptically coupled cells also exhibited connection patterns suggestive of area-specific uses of pose and movement signals, particularly in visual and auditory regions. Together, our results indicate that ongoing behavior is encoded at multiple levels throughout the dorsal cortex, and that low-level features are differentially utilized by different regions to serve locally relevant computations.
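The claim above that momentary actions such as rearing or turning "could be decoded from all sampled structures" can be illustrated with a nearest-centroid decoder on synthetic population rate vectors (the action labels, population sizes, trial counts, and decoder choice are all illustrative assumptions, not the study's methods):

```python
import numpy as np

rng = np.random.default_rng(2)
actions = ["rear", "turn", "still"]  # toy ethogram labels
n_neurons, n_trials = 40, 60

# Synthetic population rates: each action evokes a distinct mean pattern.
means = {a: rng.uniform(0.0, 10.0, n_neurons) for a in actions}
X = np.vstack([means[a] + rng.standard_normal(n_neurons)
               for a in actions for _ in range(n_trials)])
y = np.repeat(actions, n_trials)

# Fit centroids on the first 40 trials per action, decode the held-out 20.
train = np.concatenate([np.arange(i * n_trials, i * n_trials + 40)
                        for i in range(len(actions))])
test = np.setdiff1d(np.arange(len(actions) * n_trials), train)
centroids = np.stack([X[train][y[train] == a].mean(axis=0) for a in actions])
pred = np.array([actions[int(np.argmin(((x - centroids) ** 2).sum(axis=1)))]
                 for x in X[test]])
accuracy = float((pred == y[test]).mean())
```

Above-chance held-out accuracy from a given region's population is the operational sense in which an action is "represented" there.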
Affiliation(s)
- Bartul Mimica
- Princeton Neuroscience Institute, Princeton University, Washington Road, Princeton, 100190, NJ, USA
- Tuçe Tombaz
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Claudia Battistin
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Department of Mathematical Sciences, Norwegian University of Science and Technology, 7491, Trondheim, Norway
- Jingyi Guo Fuglstad
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Benjamin A Dunn
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Department of Mathematical Sciences, Norwegian University of Science and Technology, 7491, Trondheim, Norway
- Jonathan R Whitlock
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway

24
Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. PMID: 37380885; DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK
- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany

25
Shaw L, Wang KH, Mitchell J. Fast prediction in marmoset reach-to-grasp movements for dynamic prey. Curr Biol 2023; 33:2557-2565.e4. PMID: 37279754; PMCID: PMC10330526; DOI: 10.1016/j.cub.2023.05.032.
Abstract
Primates have evolved sophisticated, visually guided reaching behaviors for interacting with dynamic objects, such as insects, during foraging.1,2,3,4,5 Reaching control in dynamic natural conditions requires active prediction of the target's future position to compensate for visuo-motor processing delays and to enhance online movement adjustments.6,7,8,9,10,11,12 Past reaching research in non-human primates mainly focused on seated subjects engaged in repeated ballistic arm movements to either stationary targets or targets that instantaneously change position during the movement.13,14,15,16,17 However, those approaches impose task constraints that limit the natural dynamics of reaching. A recent field study in marmoset monkeys highlights predictive aspects of visually guided reaching during insect prey capture among wild marmoset monkeys.5 To examine the complementary dynamics of similar natural behavior within a laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to capture the movements of common marmosets (Callithrix jacchus) and crickets stereoscopically and applied machine vision algorithms for marker-free object and hand tracking. Contrary to estimates under traditional constrained reaching paradigms, we find that reaching for dynamic targets can operate at incredibly short visuo-motor delays around 80 ms, rivaling the speeds that are typical of the oculomotor systems during closed-loop visual pursuit.18 Multivariate linear regression modeling of the kinematic relationships between the hand and cricket velocity revealed that predictions of the expected future location can compensate for visuo-motor delays during fast reaching. These results suggest a critical role of visual prediction facilitating online movement adjustments for dynamic prey.
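The ~80 ms visuo-motor delay above was inferred from kinematic relationships between hand and cricket velocity. A toy lagged-correlation version of such a delay estimate (synthetic traces and an assumed 240 Hz camera rate; the paper's actual analysis used multivariate linear regression on stereo-tracked kinematics):

```python
import numpy as np

def best_lag(target_vel, hand_vel, max_lag):
    """Lag (in samples) at which hand velocity best correlates with past
    target velocity -- a simple estimate of the visuo-motor delay."""
    corrs = [np.corrcoef(target_vel[:-k], hand_vel[k:])[0, 1]
             for k in range(1, max_lag + 1)]
    return int(np.argmax(corrs)) + 1

# Synthetic tracking data: the hand follows the cricket ~80 ms late.
rng = np.random.default_rng(3)
fs = 240                           # frames per second (illustrative camera rate)
n = 5000
cricket = np.empty(n)
cricket[0] = 0.0
for t in range(1, n):              # smooth AR(1) velocity trace for the prey
    cricket[t] = 0.9 * cricket[t - 1] + rng.standard_normal()
delay = int(round(0.080 * fs))     # 19 samples, roughly 80 ms
hand = np.concatenate([np.zeros(delay), cricket[:-delay]]) \
       + 0.5 * rng.standard_normal(n)

lag = best_lag(cricket, hand, max_lag=60)
delay_ms = 1000.0 * lag / fs
```

Recovering a lag near the true 80 ms from noisy traces is the same logic, in miniature, as the delay estimates reported for the marmoset reaches.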
Affiliation(s)
- Luke Shaw
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY 14642, USA
- Kuan Hong Wang
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY 14642, USA
- Jude Mitchell
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA
26
Hope J, Beckerle T, Cheng PH, Viavattine Z, Feldkamp M, Fausner S, Saxena K, Ko E, Hryb I, Carter R, Ebner T, Kodandaramaiah S. Brain-wide neural recordings in mice navigating physical spaces enabled by a cranial exoskeleton. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.06.04.543578. [PMID: 37333228 PMCID: PMC10274744 DOI: 10.1101/2023.06.04.543578] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/20/2023]
Abstract
Complex behaviors are mediated by neural computations occurring throughout the brain. In recent years, tremendous progress has been made in developing technologies that can record neural activity at cellular resolution at multiple spatial and temporal scales. However, these technologies are primarily designed for studying the mammalian brain during head fixation, wherein the behavior of the animal is highly constrained. Miniaturized devices for studying neural activity in freely behaving animals are largely confined to recording from small brain regions owing to performance limitations. We present a cranial exoskeleton that assists mice in maneuvering neural recording headstages that are orders of magnitude larger and heavier than the mice themselves while they navigate physical behavioral environments. Force sensors embedded within the headstage detect the mouse's milli-Newton-scale cranial forces, which then control the x, y, and yaw motion of the exoskeleton via an admittance controller. We discovered optimal controller tuning parameters that enable mice to locomote at physiologically realistic velocities and accelerations while maintaining a natural walking gait. Mice maneuvering headstages weighing up to 1.5 kg can make turns, navigate 2D arenas, and perform a navigational decision-making task with the same performance as when freely behaving. We designed an imaging headstage and an electrophysiology headstage for the cranial exoskeleton to record brain-wide neural activity in mice navigating 2D arenas. The imaging headstage enabled recordings of Ca2+ activity of thousands of neurons distributed across the dorsal cortex. The electrophysiology headstage supported independent control of up to four silicon probes, enabling simultaneous recordings from hundreds of neurons across multiple brain regions and multiple days. Cranial exoskeletons provide flexible platforms for large-scale neural recording during the exploration of physical spaces, a critical new paradigm for unraveling the brain-wide neural mechanisms that control complex behavior.
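The admittance controller described above maps measured cranial forces onto platform motion through a virtual mass-damper. A minimal single-axis sketch, with illustrative parameter values rather than the paper's tuning:

```python
# Single-axis admittance controller: the commanded exoskeleton velocity
# follows a virtual mass-damper driven by the measured cranial force,
#     M * dv/dt + B * v = F_measured
# Parameter values below are illustrative, not those used in the paper.
M = 2.0        # virtual mass (kg): larger -> more sluggish response
B = 4.0        # virtual damping (N*s/m): larger -> lower steady-state speed
dt = 0.001     # control loop period (s)

def admittance_step(v, force):
    """Advance the commanded velocity by one control tick (explicit Euler)."""
    return v + dt / M * (force - B * v)

# A sustained 20 mN push should settle at v = F/B = 0.005 m/s
v = 0.0
for _ in range(20000):           # 20 s of simulated control ticks
    v = admittance_step(v, 0.020)
print(round(v, 4))               # -> 0.005
```

Tuning M and B trades responsiveness against stability, which is presumably what the reported "optimal controller tuning parameters" optimize over.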
Affiliation(s)
- James Hope
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Travis Beckerle
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Pin-Hao Cheng
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Zoey Viavattine
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Michael Feldkamp
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Skylar Fausner
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Kapil Saxena
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Eunsong Ko
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Ihor Hryb
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Neuroscience, University of Minnesota, Twin Cities
- Russell Carter
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Timothy Ebner
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Suhasa Kodandaramaiah
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Department of Neuroscience, University of Minnesota, Twin Cities
27
Jauch I, Kamm J, Benn L, Rettig L, Friederich HC, Tesarz J, Kuner T, Wieland S. 2MDR, a Microcomputer-Controlled Visual Stimulation Device for Psychotherapy-Like Treatments of Mice. eNeuro 2023; 10:10/6/ENEURO.0394-22.2023. [PMID: 37268421 DOI: 10.1523/eneuro.0394-22.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2022] [Revised: 02/16/2023] [Accepted: 02/27/2023] [Indexed: 06/04/2023] Open
Abstract
Post-traumatic stress disorder and other mental disorders can be treated with an established psychotherapy called Eye Movement Desensitization and Reprocessing (EMDR). In EMDR, patients are confronted with traumatic memories while they are stimulated with alternating bilateral stimuli (ABS). How ABS affects the brain, and whether ABS could be adapted to different patients or mental disorders, is unknown. Interestingly, ABS reduced conditioned fear in mice. Yet an approach to systematically test complex visual stimuli and compare respective differences in emotional processing based on semiautomated/automated behavioral analysis is lacking. We developed 2MDR (MultiModal Visual Stimulation to Desensitize Rodents), a novel, open-source, low-cost, customizable device that can be integrated into commercial rodent behavioral setups and controlled by them via transistor-transistor logic (TTL). 2MDR allows the design and precise steering of multimodal visual stimuli in the head direction of freely moving mice. Optimized videography allows semiautomatic analysis of rodent behavior during visual stimulation. Detailed building, integration, and treatment instructions, along with open-source software, provide easy access for inexperienced users. Using 2MDR, we confirmed that EMDR-like ABS persistently improves fear extinction in mice and showed for the first time that ABS-mediated anxiolytic effects strongly depend on physical stimulus properties such as ABS brightness. 2MDR not only enables researchers to interfere with mouse behavior in an EMDR-like setting, but also demonstrates that visual stimuli can be used as noninvasive brain stimulation to differentially alter emotional processing in mice.
Affiliation(s)
- Isa Jauch
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Jan Kamm
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Luca Benn
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Lukas Rettig
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Hans-Christoph Friederich
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany
- Jonas Tesarz
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany
- Thomas Kuner
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Sebastian Wieland
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany
28
Xu A, Hou Y, Niell CM, Beyeler M. Multimodal Deep Learning Model Unveils Behavioral Dynamics of V1 Activity in Freely Moving Mice. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.05.30.542912. [PMID: 37398256 PMCID: PMC10312557 DOI: 10.1101/2023.05.30.542912] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 07/04/2023]
Abstract
Despite their immense success as a model of macaque visual cortex, deep convolutional neural networks (CNNs) have struggled to predict activity in visual cortex of the mouse, which is thought to be strongly dependent on the animal's behavioral state. Furthermore, most computational models focus on predicting neural responses to static images presented under head fixation, which are dramatically different from the dynamic, continuous visual stimuli that arise during movement in the real world. Consequently, it is still unknown how natural visual input and different behavioral variables may integrate over time to generate responses in primary visual cortex (V1). To address this, we introduce a multimodal recurrent neural network that integrates gaze-contingent visual input with behavioral and temporal dynamics to explain V1 activity in freely moving mice. We show that the model achieves state-of-the-art predictions of V1 activity during free exploration and demonstrate the importance of each component in an extensive ablation study. Analyzing our model using maximally activating stimuli and saliency maps, we reveal new insights into cortical function, including the prevalence of mixed selectivity for behavioral variables in mouse V1. In summary, our model offers a comprehensive deep-learning framework for exploring the computational principles underlying V1 neurons in freely-moving animals engaged in natural behavior.
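The model's core idea, a recurrent layer that mixes gaze-contingent visual input with behavioral variables at every time step, can be sketched in a few lines of numpy. All dimensions and weights below are arbitrary placeholders, not the paper's trained network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions (hypothetical, for illustration only)
n_vis, n_beh, n_hidden, n_neurons, n_t = 64, 4, 32, 10, 100

# One recurrent layer mixes gaze-contingent visual features with
# behavioral variables (e.g. pupil size, running speed) at each time step
W_vis = rng.standard_normal((n_hidden, n_vis)) * 0.1
W_beh = rng.standard_normal((n_hidden, n_beh)) * 0.1
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.05
W_out = rng.standard_normal((n_neurons, n_hidden)) * 0.1

def predict_rates(vis, beh):
    """Run the RNN over time; a softplus output keeps firing rates positive."""
    h = np.zeros(n_hidden)
    rates = []
    for t in range(vis.shape[0]):
        h = np.tanh(W_vis @ vis[t] + W_beh @ beh[t] + W_rec @ h)
        rates.append(np.log1p(np.exp(W_out @ h)))   # softplus
    return np.array(rates)

rates = predict_rates(rng.standard_normal((n_t, n_vis)),
                      rng.standard_normal((n_t, n_beh)))
print(rates.shape)        # one predicted rate per neuron per time step
```

The ablation study described above corresponds to zeroing one input stream (e.g. `W_beh`) and measuring the drop in predictive performance.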
Affiliation(s)
- Aiwen Xu
- Department of Computer Science, University of California, Santa Barbara, Santa Barbara, CA 93117
- Yuchen Hou
- Department of Computer Science, University of California, Santa Barbara, Santa Barbara, CA 93117
- Cristopher M Niell
- Department of Biology, Institute of Neuroscience, University of Oregon, Eugene, OR 97403
- Michael Beyeler
- Department of Computer Science and Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, CA 93117
29
Vandevelde JR, Yang JW, Albrecht S, Lam H, Kaufmann P, Luhmann HJ, Stüttgen MC. Layer- and cell-type-specific differences in neural activity in mouse barrel cortex during a whisker detection task. Cereb Cortex 2023; 33:1361-1382. [PMID: 35417918 DOI: 10.1093/cercor/bhac141] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2021] [Revised: 03/17/2022] [Accepted: 03/18/2022] [Indexed: 11/14/2022] Open
Abstract
To address the question of which neocortical layers and cell types are important for the perception of a sensory stimulus, we performed multielectrode recordings in the barrel cortex of head-fixed mice performing a single-whisker go/no-go detection task with vibrotactile stimuli of differing intensities. We found that behavioral detection probability decreased gradually over the course of each session, which was well explained by a signal detection theory-based model that posits stable psychometric sensitivity and a variable decision criterion updated after each reinforcement, reflecting decreasing motivation. Analysis of multiunit activity demonstrated the highest neurometric sensitivity in layer 4, which was achieved within only 30 ms after stimulus onset. At the level of single neurons, we observed substantial heterogeneity of neurometric sensitivity within and across layers, ranging from nonresponsiveness to approaching or even exceeding psychometric sensitivity. In all cortical layers, putative inhibitory interneurons on average exhibited higher neurometric sensitivity than putative excitatory neurons. In infragranular layers, neurons increasing firing rate in response to stimulation featured higher sensitivities than neurons decreasing firing rate. Offline machine-learning-based analysis of videos of behavioral sessions showed that mice performed better when not moving, which, at the neuronal level, was reflected by increased stimulus-evoked firing rates.
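The signal-detection account above, stable sensitivity with a drifting criterion, has a compact form. A sketch using the common convention that the criterion is measured from the noise mean (the numbers are illustrative, not the paper's fits):

```python
from math import erf, sqrt

def hit_rate(dprime, criterion):
    """Signal detection theory: P(hit) = Phi(d' - c), unit-variance noise,
    with the criterion c measured from the noise distribution's mean."""
    z = dprime - criterion
    return 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF

# Stable sensitivity, criterion creeping upward as motivation wanes
dprime = 2.0
early = hit_rate(dprime, 0.5)   # start of session: liberal criterion
late = hit_rate(dprime, 1.5)    # end of session: conservative criterion
print(round(early, 3), round(late, 3))   # detection probability falls
```

Updating `criterion` after each reinforcement, as the model posits, reproduces the gradual within-session decline in detection probability without any change in `dprime`.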
Affiliation(s)
- Jens R Vandevelde
- Institute of Physiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
- Institute of Pathophysiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
- Jenq-Wei Yang
- Institute of Physiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
- Steffen Albrecht
- Institute of Physiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
- Henry Lam
- Computational Intelligence, Faculty of Law, Management and Economics, Johannes Gutenberg University Mainz, Jakob-Welder-Weg 9, 55128 Mainz, Germany
- Paul Kaufmann
- Computational Intelligence, Faculty of Law, Management and Economics, Johannes Gutenberg University Mainz, Jakob-Welder-Weg 9, 55128 Mainz, Germany
- Heiko J Luhmann
- Institute of Physiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
- Maik C Stüttgen
- Institute of Pathophysiology, University Medical Center of the Johannes Gutenberg University Mainz, Duesbergweg 6, 55128 Mainz, Germany
30
Bimbard C, Sit TPH, Lebedeva A, Reddy CB, Harris KD, Carandini M. Behavioral origin of sound-evoked activity in mouse visual cortex. Nat Neurosci 2023; 26:251-258. [PMID: 36624279 PMCID: PMC9905016 DOI: 10.1038/s41593-022-01227-x] [Citation(s) in RCA: 33] [Impact Index Per Article: 33.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2021] [Accepted: 10/31/2022] [Indexed: 01/10/2023]
Abstract
Sensory cortices can be affected by stimuli of multiple modalities and are thus increasingly thought to be multisensory. For instance, primary visual cortex (V1) is influenced not only by images but also by sounds. Here we show that the activity evoked by sounds in V1, measured with Neuropixels probes, is stereotyped across neurons and even across mice. It is independent of projections from auditory cortex and resembles activity evoked in the hippocampal formation, which receives little direct auditory input. Its low-dimensional nature starkly contrasts with the high-dimensional code that V1 uses to represent images. Furthermore, this sound-evoked activity can be precisely predicted by small body movements that are elicited by each sound and are stereotyped across trials and mice. Thus, neural activity that is apparently multisensory may simply arise from low-dimensional signals associated with internal state and behavior.
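The claim that body movements predict the apparently sound-evoked activity can be illustrated with a regression from movement traces to neural activity. The sketch below uses synthetic data and plain ridge regression, a simplification of whatever model the paper fits:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: "neural" activity that is a linear readout of a few
# body-movement signals plus noise (all dimensions hypothetical)
n_t, n_move, n_neurons = 500, 5, 30
movement = rng.standard_normal((n_t, n_move))
readout = rng.standard_normal((n_move, n_neurons))
neural = movement @ readout + 0.3 * rng.standard_normal((n_t, n_neurons))

# Ridge regression from movement traces to neural activity
lam = 1.0
W = np.linalg.solve(movement.T @ movement + lam * np.eye(n_move),
                    movement.T @ neural)
pred = movement @ W

# Fraction of neural variance explained by movement alone
ss_res = ((neural - pred) ** 2).sum()
ss_tot = ((neural - neural.mean(0)) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))   # most variance captured by movement
```

A high movement-only R² on real data is what licenses the paper's conclusion that the "multisensory" signal is behavioral in origin.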
Affiliation(s)
- Célian Bimbard
- UCL Institute of Ophthalmology, University College London, London, UK
- Timothy P H Sit
- Sainsbury Wellcome Centre, University College London, London, UK
- UCL Queen Square Institute of Neurology, University College London, London, UK
- Anna Lebedeva
- Sainsbury Wellcome Centre, University College London, London, UK
- UCL Queen Square Institute of Neurology, University College London, London, UK
- Charu B Reddy
- UCL Institute of Ophthalmology, University College London, London, UK
- Kenneth D Harris
- UCL Queen Square Institute of Neurology, University College London, London, UK
- Matteo Carandini
- UCL Institute of Ophthalmology, University College London, London, UK
31
Horrocks EAB, Mareschal I, Saleem AB. Walking humans and running mice: perception and neural encoding of optic flow during self-motion. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210450. [PMID: 36511417 PMCID: PMC9745880 DOI: 10.1098/rstb.2021.0450] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2022] [Accepted: 08/30/2022] [Indexed: 12/15/2022] Open
Abstract
Locomotion produces full-field optic flow that often dominates the visual motion inputs to an observer. The perception of optic flow is in turn important for animals to guide their heading and interact with moving objects. Understanding how locomotion influences optic flow processing and perception is therefore essential to understand how animals successfully interact with their environment. Here, we review research investigating how perception and neural encoding of optic flow are altered during self-motion, focusing on locomotion. Self-motion has been found to influence estimation and sensitivity for optic flow speed and direction. Nonvisual self-motion signals also increase compensation for self-driven optic flow when parsing the visual motion of moving objects. The integration of visual and nonvisual self-motion signals largely follows principles of Bayesian inference and can improve the precision and accuracy of self-motion perception. The calibration of visual and nonvisual self-motion signals is dynamic, reflecting the changing visuomotor contingencies across different environmental contexts. Throughout this review, we consider experimental research using humans, non-human primates and mice. We highlight experimental challenges and opportunities afforded by each of these species and draw parallels between experimental findings. These findings reveal a profound influence of locomotion on optic flow processing and perception across species. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
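The Bayesian integration principle invoked above has a compact closed form: each cue is weighted by its inverse variance, and the combined estimate is more precise than either cue alone. A sketch with made-up cue values:

```python
# Reliability-weighted (Bayesian) fusion of visual and nonvisual
# self-motion cues: each cue is a noisy speed estimate, and the optimal
# combined estimate weights each cue by its inverse variance.
def fuse(mu_vis, var_vis, mu_nonvis, var_nonvis):
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_nonvis)
    mu = w_vis * mu_vis + (1 - w_vis) * mu_nonvis
    var = 1 / (1 / var_vis + 1 / var_nonvis)
    return mu, var

# Optic flow says 0.30 m/s (reliable); vestibular/motor cues say 0.40 m/s (noisy)
mu, var = fuse(0.30, 0.01, 0.40, 0.04)
print(round(mu, 2), round(var, 3))   # estimate pulled toward the reliable cue;
                                     # combined variance below either single cue
```

The dynamic recalibration described in the review corresponds to the variances (and hence the weights) changing with environmental context.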
Affiliation(s)
- Edward A. B. Horrocks
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
- Isabelle Mareschal
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London E1 4NS, UK
- Aman B. Saleem
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
32
McBride SD, Ober J, Dylak J, Schneider W, Morton AJ. Oculomotor Abnormalities in a Sheep (Ovis aries) Model of Huntington's Disease: Towards a Biomarker for Assessing Therapeutic Efficacy. J Huntingtons Dis 2023; 12:189-200. [PMID: 37718849 DOI: 10.3233/jhd-230584] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 09/19/2023]
Abstract
BACKGROUND Huntington's disease (HD) is characterized by a loss of control of motor function that causes abnormal eye movements at early stages. OBJECTIVE To determine whether, compared to normal sheep, HD sheep have abnormal eye movements. METHODS We measured eye movements in a transgenic sheep (Ovis aries) model of HD using a purpose-built, head-mounted sheep oculometer. This allowed us to measure saccades without the need for either behavioral training or head fixation. At the age of testing (6 years old), the HD sheep were premanifest. We used 21 sheep (11 HD, 10 normal). RESULTS We found small but significant differences in eye movements between normal (control) and HD sheep during vestibulo-ocular reflex (VOR)- and vestibular post-rotational nystagmus (PRN)-based tests. CONCLUSIONS Two measures were identified that could distinguish normal from HD sheep: the number of PRN oscillations when tested in the dark, and the gain (the ratio of eye movement to head movement) during the VOR when tested in the light. To our knowledge, this is the first study in which eye movements have been quantified in sheep. It demonstrates the feasibility of measuring and quantifying human-relevant eye movements in this species. The HD-relevant deficits show that even in premanifest sheep there are measurable signs of neurological dysfunction, characterized by loss of control of eye movements.
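The VOR gain measure used here, the ratio of eye movement to head movement, can be sketched as a regression slope on velocity traces. The traces below are synthetic, with an assumed gain of 0.85:

```python
import numpy as np

# VOR gain: ratio of compensatory eye velocity to head velocity. A simple
# estimate is the magnitude of the slope from regressing eye velocity on
# head velocity. All signal parameters below are made up.
t = np.linspace(0, 2, 2000)
head_v = 40 * np.sin(2 * np.pi * 1.0 * t)          # deg/s, 1 Hz head rotation
eye_v = -0.85 * head_v + np.random.default_rng(3).normal(0, 2, t.size)

slope = np.polyfit(head_v, eye_v, 1)[0]            # eye moves opposite to head
vor_gain = abs(slope)
print(round(vor_gain, 2))   # ≈ 0.85: gain < 1 means incomplete compensation
```

A between-group comparison of such gains (light vs. dark, HD vs. control) is the kind of statistic the study reports.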
Affiliation(s)
- Jan Ober
- Ober Consulting Sp. z o.o., Poznań, Poland
- A Jennifer Morton
- Department of Physiology, Development and Neuroscience, University of Cambridge, Downing Street, Cambridge, UK
33
Parker PRL, Abe ETT, Leonard ESP, Martins DM, Niell CM. Joint coding of visual input and eye/head position in V1 of freely moving mice. Neuron 2022; 110:3897-3906.e5. [PMID: 36137549 PMCID: PMC9742335 DOI: 10.1016/j.neuron.2022.08.029] [Citation(s) in RCA: 13] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2022] [Revised: 07/16/2022] [Accepted: 08/30/2022] [Indexed: 12/15/2022]
Abstract
Visual input during natural behavior is highly dependent on movements of the eyes and head, but how information about eye and head position is integrated with visual processing during free movement is unknown, as visual physiology is generally performed under head fixation. To address this, we performed single-unit electrophysiology in V1 of freely moving mice while simultaneously measuring the mouse's eye position, head orientation, and the visual scene from the mouse's perspective. From these measures, we mapped spatiotemporal receptive fields during free movement based on the gaze-corrected visual input. Furthermore, we found that a significant fraction of neurons was tuned for eye and head position, and these signals were integrated with visual responses through a multiplicative mechanism in the majority of modulated neurons. These results provide new insight into coding in mouse V1 and, more generally, provide a paradigm for investigating visual physiology under natural conditions, including active sensing and ethological behavior.
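The multiplicative mechanism reported above makes a concrete, testable prediction: changing eye position rescales responses by a constant factor across all stimuli, whereas an additive mechanism would shift them by a constant offset. A toy sketch (gain field and numbers are illustrative, not fits to the data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy multiplicative model of a V1 neuron: rate = visual drive x gain(eye position)
visual_drive = np.exp(0.5 * rng.standard_normal(1000))   # stimulus-driven rate (a.u.)

def gain(eye_pos_deg):
    """Hypothetical linear gain field of horizontal eye position."""
    return 1 + 0.02 * eye_pos_deg

# Under a multiplicative mechanism, responses at two eye positions differ
# by a constant *ratio* across all stimuli
r_center = visual_drive * gain(0.0)     # gaze centered
r_right = visual_drive * gain(15.0)     # gaze 15 deg to the right
ratio = r_right / r_center
print(ratio.min(), ratio.max())         # identical: a pure gain change
```

On real data, comparing multiplicative and additive fits of eye/head-position modulation is one way to adjudicate between the two mechanisms.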
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Dylan M Martins
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
34
Younk R, Widge A. Quantifying defensive behavior and threat response through integrated headstage accelerometry. J Neurosci Methods 2022; 382:109725. [PMID: 36243171 DOI: 10.1016/j.jneumeth.2022.109725] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/13/2022] [Revised: 10/03/2022] [Accepted: 10/08/2022] [Indexed: 11/06/2022]
Abstract
BACKGROUND Defensive and threat-related behaviors are common models for aspects of human mental illness. These behaviors are typically quantified by potentially laborious and/or computationally intensive video recording and post hoc analysis. Depending on the analysis method, the resulting measurements can be noisy or inaccurate. Other defensive behaviors, such as suppression of operant reward seeking, require extensive animal pre-training. Inertial tracking and accelerometry can be computationally efficient, but require specialized hardware. NEW METHOD We quantified rodent defensive behavior using a commercially available electrophysiology headstage with 3-axis accelerometry integration during a threat conditioning and extinction paradigm. We tested multiple pre-processing and smoothing methods and correlated them against video-derived freezing and suppression of operant bar pressing. RESULTS The best approach to tracking defensive behavior from accelerometry was Gaussian filter smoothing of the first derivative. Behavior scores from this method reproduced canonical conditioning and extinction curves. Timepoint-to-timepoint correlations between accelerometry, video, and bar press metrics were statistically significant but modest (largest r = 0.53, between accelerometry and bar press). These increased when traditional thresholding-based analyses were used, at the cost of a loss of temporal resolution (r = 0.97 between thresholded accelerometry and percent time freezing). COMPARISON WITH EXISTING METHODS Accelerometry's integration with standard electrophysiology systems and relatively lightweight signal processing may make it particularly well suited to detecting behavior in resource-constrained or real-time applications. CONCLUSIONS Accelerometry allows researchers already using electrophysiology to assess defensive behaviors without the need for additional behavioral measures or video. The modest correlations between metrics suggest that each measures a distinct aspect of defensive behavior. Accelerometry is a viable alternative to current defensive measurements, and its non-overlap with other metrics may allow for more sophisticated dissection of threat responses.
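The best-performing pipeline, Gaussian smoothing of the first derivative of the accelerometer signal followed by thresholding, can be sketched in numpy. All signal parameters (sampling rate, epoch layout, smoothing width, threshold rule) are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic 3-axis accelerometer trace at 100 Hz: a near-immobile
# ("freezing") epoch flanked by movement epochs; amplitudes are arbitrary
fs = 100
accel = rng.normal(0, 1.0, (3, 30 * fs))
accel[:, 10 * fs:20 * fs] *= 0.05           # middle 10 s: near-immobile

# Motion score: Gaussian-smoothed magnitude of the first derivative,
# summed across the three axes
deriv = np.abs(np.diff(accel, axis=1)).sum(axis=0)
sigma = 0.5 * fs                             # 0.5 s smoothing
x = np.arange(-3 * sigma, 3 * sigma + 1)
kernel = np.exp(-0.5 * (x / sigma) ** 2)
kernel /= kernel.sum()
score = np.convolve(deriv, kernel, mode="same")

# Threshold the smoothed score to call freezing frame by frame
freezing = score < 0.5 * np.median(score)
pct_freezing = 100 * freezing.mean()
print(round(pct_freezing, 1))   # roughly a third of the session
```

Thresholding recovers the clean freezing epoch at the cost of temporal resolution at its edges, mirroring the trade-off the abstract reports.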
Affiliation(s)
- Rebecca Younk
- Department of Neuroscience, University of Minnesota, Minneapolis, MN 55455, USA
- Alik Widge
- Department of Psychiatry, University of Minnesota, Minneapolis, MN 55455, USA
35
Thurley K. Naturalistic neuroscience and virtual reality. Front Syst Neurosci 2022; 16:896251. [PMID: 36467978 PMCID: PMC9712202 DOI: 10.3389/fnsys.2022.896251] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2022] [Accepted: 10/31/2022] [Indexed: 04/04/2024] Open
Abstract
Virtual reality (VR) is one of the techniques that became particularly popular in neuroscience over the past few decades. VR experiments feature a closed-loop between sensory stimulation and behavior. Participants interact with the stimuli and not just passively perceive them. Several senses can be stimulated at once, large-scale environments can be simulated as well as social interactions. All of this makes VR experiences more natural than those in traditional lab paradigms. Compared to the situation in field research, a VR simulation is highly controllable and reproducible, as required of a laboratory technique used in the search for neural correlates of perception and behavior. VR is therefore considered a middle ground between ecological validity and experimental control. In this review, I explore the potential of VR in eliciting naturalistic perception and behavior in humans and non-human animals. In this context, I give an overview of recent virtual reality approaches used in neuroscientific research.
Affiliation(s)
- Kay Thurley
- Faculty of Biology, Ludwig-Maximilians-Universität München, Munich, Germany
- Bernstein Center for Computational Neuroscience Munich, Munich, Germany
36
Efficient training approaches for optimizing behavioral performance and reducing head fixation time. PLoS One 2022; 17:e0276531. [DOI: 10.1371/journal.pone.0276531] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/17/2022] [Accepted: 10/10/2022] [Indexed: 11/12/2022] Open
Abstract
The use of head fixation has become routine in systems neuroscience. However, whether behavior changes with head fixation, and whether animals can learn aspects of a task while freely moving and transfer this knowledge to the head-fixed condition, have not been examined in much detail. Here, we used a novel floating platform, the "Air-Track", which simulates free movement in a real-world environment, to address the effect of head fixation, and we developed methods to accelerate training of behavioral tasks for head-fixed mice. We trained mice in a Y-maze two-choice discrimination task. One group was trained while head-fixed and compared to a separate group that was pre-trained while freely moving and then trained on the same task while head-fixed. Pre-training significantly reduced the time needed to relearn the discrimination task while head-fixed. Freely moving and head-fixed mice displayed similar behavioral patterns; however, head fixation significantly slowed movement speed. The speed of movement in the head-fixed mice depended on the weight of the platform. We conclude that home-cage pre-training improves learning performance of head-fixed mice and that, while head fixation obviously limits some aspects of movement, the patterns of behavior observed in head-fixed and freely moving mice are similar.
37
Franke K, Willeke KF, Ponder K, Galdamez M, Zhou N, Muhammad T, Patel S, Froudarakis E, Reimer J, Sinz FH, Tolias AS. State-dependent pupil dilation rapidly shifts visual feature selectivity. Nature 2022; 610:128-134. [PMID: 36171291 PMCID: PMC10635574 DOI: 10.1038/s41586-022-05270-3] [Citation(s) in RCA: 26] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2021] [Accepted: 08/23/2022] [Indexed: 11/09/2022]
Abstract
To increase computational flexibility, the processing of sensory inputs changes with behavioural context. In the visual system, active behavioural states characterized by motor activity and pupil dilation1,2 enhance sensory responses, but typically leave the preferred stimuli of neurons unchanged2-9. Here we find that behavioural state also modulates stimulus selectivity in the mouse visual cortex in the context of coloured natural scenes. Using population imaging in behaving mice, pharmacology and deep neural network modelling, we identified a rapid shift in colour selectivity towards ultraviolet stimuli during an active behavioural state. This was exclusively caused by state-dependent pupil dilation, which resulted in a dynamic switch from rod to cone photoreceptors, thereby extending their role beyond night and day vision. The change in tuning facilitated the decoding of ethological stimuli, such as aerial predators against the twilight sky10. For decades, studies in neuroscience and cognitive science have used pupil dilation as an indirect measure of brain state. Our data suggest that, in addition, state-dependent pupil dilation itself tunes visual representations to behavioural demands by differentially recruiting rods and cones on fast timescales.
Affiliation(s)
- Katrin Franke
- Institute for Ophthalmic Research, Tübingen University, Tübingen, Germany
- Center for Integrative Neuroscience, Tübingen University, Tübingen, Germany
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Konstantin F Willeke
- Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany
- Department of Computer Science, Göttingen University, Göttingen, Germany
- Kayla Ponder
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Mario Galdamez
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Na Zhou
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Taliah Muhammad
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Saumil Patel
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Emmanouil Froudarakis
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology Hellas, Heraklion, Greece
- Jacob Reimer
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Fabian H Sinz
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Institute for Bioinformatics and Medical Informatics, Tübingen University, Tübingen, Germany
- Department of Computer Science, Göttingen University, Göttingen, Germany
- Andreas S Tolias
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, TX, USA
- Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA
Collapse
|
38
|
Parker PRL, Abe ETT, Beatie NT, Leonard ESP, Martins DM, Sharp SL, Wyrick DG, Mazzucato L, Niell CM. Distance estimation from monocular cues in an ethological visuomotor task. eLife 2022; 11:e74708. [PMID: 36125119 PMCID: PMC9489205 DOI: 10.7554/elife.74708] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2021] [Accepted: 08/29/2022] [Indexed: 12/02/2022] Open
Abstract
In natural contexts, sensory processing and motor output are closely coupled, which is reflected in the fact that many brain areas contain both sensory and movement signals. However, standard reductionist paradigms decouple sensory decisions from their natural motor consequences, and head-fixation prevents the natural sensory consequences of self-motion. In particular, movement through the environment provides a number of depth cues beyond stereo vision that are poorly understood. To study the integration of visual processing and motor output in a naturalistic task, we investigated distance estimation in freely moving mice. We found that mice use vision to accurately jump across a variable gap, thus directly coupling a visual computation to its corresponding ethological motor output. Monocular eyelid suture did not affect gap jumping success, thus mice can use cues that do not depend on binocular disparity and stereo vision. Under monocular conditions, mice altered their head positioning and performed more vertical head movements, consistent with a shift from using stereopsis to other monocular cues, such as motion or position parallax. Finally, optogenetic suppression of primary visual cortex impaired task performance under both binocular and monocular conditions when optical fiber placement was localized to binocular or monocular zone V1, respectively. Together, these results show that mice can use monocular cues, relying on visual cortex, to accurately judge distance. Furthermore, this behavioral paradigm provides a foundation for studying how neural circuits convert sensory information into ethological motor output.
Collapse
Affiliation(s)
- Philip RL Parker
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | - Elliott TT Abe
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | - Natalie T Beatie
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | | | - Dylan M Martins
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | - Shelby L Sharp
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | - David G Wyrick
- Institute of Neuroscience, University of Oregon, Eugene, United States
| | - Luca Mazzucato
- Institute of Neuroscience, University of Oregon, Eugene, United States
- Department of Mathematics, University of Oregon, Eugene, United States
| | - Cristopher M Niell
- Institute of Neuroscience, University of Oregon, Eugene, United States
- Department of Biology, University of Oregon, Eugene, United States
| |
Collapse
|
39
|
Distinguishing externally from saccade-induced motion in visual cortex. Nature 2022; 610:135-142. [PMID: 36104560 PMCID: PMC9534749 DOI: 10.1038/s41586-022-05196-w] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2021] [Accepted: 08/04/2022] [Indexed: 12/03/2022]
Abstract
Distinguishing sensory stimuli caused by changes in the environment from those caused by an animal's own actions is a hallmark of sensory processing [1]. Saccades are rapid eye movements that shift the image on the retina. How visual systems differentiate motion of the image induced by saccades from actual motion in the environment is not fully understood [2]. Here we discovered that in mouse primary visual cortex (V1) the two types of motion evoke distinct activity patterns. This is because, during saccades, V1 combines the visual input with a strong non-visual input arriving from the thalamic pulvinar nucleus. The non-visual input triggers responses that are specific to the direction of the saccade, and the visual input triggers responses that are specific to the direction of the shift of the stimulus on the retina, yet the preferred directions of these two responses are uncorrelated. Thus, the pulvinar input ensures differential V1 responses to external and self-generated motion. Integration of external sensory information with information about body movement may be a general mechanism for sensory cortices to distinguish between self-generated and external stimuli.
Collapse
|
40
|
Abstract
Since the discovery of rapid eye movement (REM) sleep, the nature of the eye movements that characterize this sleep phase has remained elusive. Do they reveal gaze shifts in the virtual environment of dreams or simply reflect random brainstem activity? We harnessed the head direction (HD) system of the mouse thalamus, a neuronal population whose activity reports, in awake mice, their actual HD as they explore their environment and, in sleeping mice, their virtual HD. We discovered that the direction and amplitude of rapid eye movements during REM sleep reveal the direction and amplitude of the ongoing changes in virtual HD. Thus, rapid eye movements disclose gaze shifts in the virtual world of REM sleep, thereby providing a window into the cognitive processes of the sleeping brain.
Collapse
Affiliation(s)
- Yuta Senzai
- Department of Physiology, University of California, San Francisco, San Francisco, CA, USA
- Howard Hughes Medical Institute, University of California, San Francisco, San Francisco, CA, USA
| | - Massimo Scanziani
- Department of Physiology, University of California, San Francisco, San Francisco, CA, USA
- Howard Hughes Medical Institute, University of California, San Francisco, San Francisco, CA, USA
| |
Collapse
|
41
|
Pedrosa R, Song C, Knöpfel T, Battaglia F. Combining Cortical Voltage Imaging and Hippocampal Electrophysiology for Investigating Global, Multi-Timescale Activity Interactions in the Brain. Int J Mol Sci 2022; 23:6814. [PMID: 35743257 PMCID: PMC9224488 DOI: 10.3390/ijms23126814] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/02/2022] [Revised: 06/14/2022] [Accepted: 06/16/2022] [Indexed: 11/17/2022] Open
Abstract
A new generation of optogenetic tools for analyzing neural activity has been contributing to the elucidation of classical open questions in neuroscience. Specifically, voltage imaging technologies using enhanced genetically encoded voltage indicators have been increasingly used to observe the dynamics of large circuits at the mesoscale. Here, we describe how to combine cortical wide-field voltage imaging with hippocampal electrophysiology in awake, behaving mice. Furthermore, we highlight how this method can be useful for different possible investigations, using the characterization of hippocampal-neocortical interactions as a case study.
Collapse
Affiliation(s)
- Rafael Pedrosa
- Donders Institute for Brain Cognition and Behaviour, Radboud University, 6525AJ Nijmegen, The Netherlands;
| | - Chenchen Song
- Laboratory for Neuronal Circuit Dynamics, Imperial College London, London W12 0NN, UK;
| | - Thomas Knöpfel
- Laboratory for Neuronal Circuit Dynamics, Imperial College London, London W12 0NN, UK;
| | - Francesco Battaglia
- Donders Institute for Brain Cognition and Behaviour, Radboud University, 6525AJ Nijmegen, The Netherlands;
| |
Collapse
|
42
|
Huang K, Yang Q, Han Y, Zhang Y, Wang Z, Wang L, Wei P. An Easily Compatible Eye-tracking System for Freely-moving Small Animals. Neurosci Bull 2022; 38:661-676. [PMID: 35325370 PMCID: PMC9206064 DOI: 10.1007/s12264-022-00834-9] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2021] [Accepted: 12/03/2021] [Indexed: 12/13/2022] Open
Abstract
Measuring eye movement is a fundamental approach in cognitive science, as it provides a variety of insightful parameters that reflect brain states such as visual attention and emotions. Combining eye-tracking with multimodal neural recording or manipulation techniques is beneficial for understanding the neural substrates of cognitive function. Many commercially-available and custom-built systems have been widely applied to awake, head-fixed small animals. However, the existing eye-tracking systems used in freely-moving animals remain limited in their compatibility with other devices and in the algorithms used to detect eye movements. Here, we report a novel system that integrates general-purpose, easily compatible eye-tracking hardware with a robust eye feature-detection algorithm. With ultra-light hardware and a detachable design, the system allows more implants to be added to the animal's exposed head and has a precise synchronization module to coordinate with other neural implants. Moreover, we systematically compared the performance of existing commonly-used pupil-detection approaches, and demonstrated that the proposed adaptive pupil feature-detection algorithm allows the analysis of more complex and dynamic eye-tracking data in freely-moving animals. Synchronized eye-tracking and electroencephalogram recordings, as well as algorithm validation under five noise conditions, suggest that our system is flexibly adaptable and can be combined with a wide range of neural manipulation and recording technologies.
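The core idea of adaptive pupil detection can be illustrated with a minimal sketch. This is not the authors' algorithm: the percentile-based threshold, centroid/radius estimate, and synthetic test frame below are illustrative assumptions only.

```python
import numpy as np

def detect_pupil(frame, dark_percentile=5.0):
    """Estimate pupil centre and radius from a grayscale eye image.

    Generic sketch (assumption, not the published algorithm): the pupil is
    taken to be the darkest region, segmented with a percentile threshold so
    the cut-off adapts to overall illumination changes across frames.
    """
    thresh = np.percentile(frame, dark_percentile)  # adaptive brightness cut-off
    mask = frame <= thresh                          # candidate pupil pixels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    cx, cy = xs.mean(), ys.mean()                   # pupil centroid
    radius = np.sqrt(mask.sum() / np.pi)            # equivalent-circle radius
    return cx, cy, radius

# Synthetic test frame: bright background with a dark disc as the "pupil".
frame = np.full((120, 160), 200.0)
yy, xx = np.mgrid[:120, :160]
frame[(xx - 80) ** 2 + (yy - 60) ** 2 <= 20 ** 2] = 10.0
cx, cy, r = detect_pupil(frame)
```

A percentile threshold rather than a fixed intensity value is one simple way to keep the segmentation stable under the kinds of illumination noise the authors validate against.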
Collapse
Affiliation(s)
- Kang Huang
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Qin Yang
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
| | - Yaning Han
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Yulin Zhang
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
| | - Zhiyi Wang
- Harbin Institute of Technology Shenzhen, Shenzhen, 518055, China
| | - Liping Wang
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Pengfei Wei
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China.
- University of Chinese Academy of Sciences, Beijing, 100049, China.
| |
Collapse
|
43
|
Suárez-Pereira I, Llorca-Torralba M, Bravo L, Camarena-Delgado C, Soriano-Mas C, Berrocoso E. The Role of the Locus Coeruleus in Pain and Associated Stress-Related Disorders. Biol Psychiatry 2022; 91:786-797. [PMID: 35164940 DOI: 10.1016/j.biopsych.2021.11.023] [Citation(s) in RCA: 57] [Impact Index Per Article: 28.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/11/2021] [Revised: 11/24/2021] [Accepted: 11/26/2021] [Indexed: 12/26/2022]
Abstract
The locus coeruleus (LC)-noradrenergic system is the main source of noradrenaline in the central nervous system and is intensively involved in modulating pain and stress-related disorders (e.g., major depressive disorder and anxiety) and in their comorbidity. However, the mechanisms involving the LC that underlie these effects have not been fully elucidated, in part owing to the technical difficulties inherent in exploring such a tiny nucleus. Fortunately, novel research tools are now available that have helped redefine the LC system, moving away from the traditional view of the LC as a homogeneous structure that exerts a uniform influence on neural activity. Indeed, innovative techniques such as DREADDs (designer receptors exclusively activated by designer drugs) and optogenetics have demonstrated the functional heterogeneity of the LC, and novel magnetic resonance imaging applications combined with pupillometry have opened the way to evaluating LC activity in vivo. This review aims to bring together the data available on the efferent activity of the LC-noradrenergic system in relation to pain and its comorbidity with anxiodepressive disorders. Acute pain triggers a robust LC stress response, producing spinal cord-mediated endogenous analgesia while promoting aversion, vigilance, and threat detection through its ascending efferents. However, this protective biological system fails in chronic pain, and LC activity instead produces pain facilitation, anxiety, increased aversive memory, and behavioral despair, acting at the level of the medulla, prefrontal cortex, and amygdala. Thus, the activation/deactivation of specific LC projections contributes to different behavioral outcomes in the shift from acute to chronic pain.
Collapse
Affiliation(s)
- Irene Suárez-Pereira
- Neuropsychopharmacology and Psychobiology Research Group, Department of Neuroscience, University of Cádiz, Cádiz, Spain; Instituto de Investigación e Innovación Biomédica de Cádiz, Hospital Universitario Puerta del Mar, Cádiz, Spain; Centro de Investigación Biomédica en Red de Salud Mental, Instituto de Salud Carlos III, Madrid, Spain
| | - Meritxell Llorca-Torralba
- Neuropsychopharmacology and Psychobiology Research Group, Department of Psychology, University of Cádiz, Cádiz, Spain; Instituto de Investigación e Innovación Biomédica de Cádiz, Hospital Universitario Puerta del Mar, Cádiz, Spain; Centro de Investigación Biomédica en Red de Salud Mental, Instituto de Salud Carlos III, Madrid, Spain
| | - Lidia Bravo
- Neuropsychopharmacology and Psychobiology Research Group, Department of Neuroscience, University of Cádiz, Cádiz, Spain; Instituto de Investigación e Innovación Biomédica de Cádiz, Hospital Universitario Puerta del Mar, Cádiz, Spain; Centro de Investigación Biomédica en Red de Salud Mental, Instituto de Salud Carlos III, Madrid, Spain
| | - Carmen Camarena-Delgado
- Neuropsychopharmacology and Psychobiology Research Group, Department of Psychology, University of Cádiz, Cádiz, Spain; Instituto de Investigación e Innovación Biomédica de Cádiz, Hospital Universitario Puerta del Mar, Cádiz, Spain
| | - Carles Soriano-Mas
- Centro de Investigación Biomédica en Red de Salud Mental, Instituto de Salud Carlos III, Madrid, Spain; Department of Psychiatry, Bellvitge University Hospital, Bellvitge Biomedical Research Institute, Barcelona, Spain; Department of Psychobiology and Methodology in Health Sciences, Universitat Autònoma de Barcelona, Barcelona, Spain
| | - Esther Berrocoso
- Neuropsychopharmacology and Psychobiology Research Group, Department of Psychology, University of Cádiz, Cádiz, Spain; Instituto de Investigación e Innovación Biomédica de Cádiz, Hospital Universitario Puerta del Mar, Cádiz, Spain; Centro de Investigación Biomédica en Red de Salud Mental, Instituto de Salud Carlos III, Madrid, Spain.
| |
Collapse
|
44
|
Guardamagna M, Eichler R, Pedrosa R, Aarts AAA, Meyer AF, Battaglia F. The Hybrid Drive: a chronic implant device combining tetrode arrays with silicon probes for layer-resolved ensemble electrophysiology in freely moving mice. J Neural Eng 2022; 19. [PMID: 35421850 DOI: 10.1088/1741-2552/ac6771] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2021] [Accepted: 04/10/2022] [Indexed: 10/18/2022]
Abstract
Objective. Understanding the function of brain cortices requires simultaneous investigation at multiple spatial and temporal scales and linking neural activity to an animal's behavior. A major challenge is to measure within- and across-layer information in actively behaving animals, in particular in mice, which have become a major species in neuroscience owing to an extensive genetic toolkit. Here we describe the Hybrid Drive, a new chronic implant for mice that combines tetrode arrays to record within-layer information with silicon probes to simultaneously measure across-layer information. Approach. The design of our device combines up to 14 tetrodes and 2 silicon probes, which can be arranged in custom arrays to generate unique area-specific (and multi-area) layouts. Main Results. We show that large numbers of neurons and layer-resolved local field potentials can be recorded from the same brain region across weeks without loss in electrophysiological signal quality. The drive's lightweight structure (~3.5 g) leaves animal behavior largely unchanged, compared to other tetrode drives, during a variety of experimental paradigms. We demonstrate how the data collected with the Hybrid Drive allow state-of-the-art analyses in a series of experiments linking the spiking activity of CA1 pyramidal layer neurons to the oscillatory activity across hippocampal layers. Significance. Our new device fills a gap in the existing technology and increases the range and precision of questions that can be addressed about neural computations in freely behaving mice.
Collapse
Affiliation(s)
| | - Ronny Eichler
- Radboud University, Heyendaalseweg 135, Nijmegen, 6500 HC, The Netherlands
| | - Rafael Pedrosa
- Radboud University, Heyendaalseweg 135, Nijmegen, 6500 HC, The Netherlands
| | - Arno A A Aarts
- ATLAS Neuroengineering, Kapeldreef 75, Leuven, B-3000, Belgium
| | - Arne F Meyer
- Radboud University, Heyendaalseweg 135, Nijmegen, 6500 HC, The Netherlands
| | | |
Collapse
|
45
|
Abstract
Retinal circuits transform the pixel representation of photoreceptors into the feature representations of ganglion cells, whose axons transmit these representations to the brain. Functional, morphological, and transcriptomic surveys have identified more than 40 retinal ganglion cell (RGC) types in mice. RGCs extract features of varying complexity; some simply signal local differences in brightness (i.e., luminance contrast), whereas others detect specific motion trajectories. To understand the retina, we need to know how retinal circuits give rise to the diverse RGC feature representations. A catalog of the RGC feature set, in turn, is fundamental to understanding visual processing in the brain. Anterograde tracing indicates that RGCs innervate more than 50 areas in the mouse brain. Current maps connecting RGC types to brain areas are rudimentary, as is our understanding of how retinal signals are transformed downstream to guide behavior. In this article, I review the feature selectivities of mouse RGCs, how they arise, and how they are utilized downstream. Not only is knowledge of the behavioral purpose of RGC signals critical for understanding the retinal contributions to vision; it can also guide us to the most relevant areas of visual feature space.
Collapse
Affiliation(s)
- Daniel Kerschensteiner
- John F. Hardesty, MD, Department of Ophthalmology and Visual Sciences; Department of Neuroscience; Department of Biomedical Engineering; and Hope Center for Neurological Disorders, Washington University School of Medicine, Saint Louis, Missouri, USA;
| |
Collapse
|
46
|
Han Y, Huang K, Chen K, Pan H, Ju F, Long Y, Gao G, Wu R, Wang A, Wang L, Wei P. MouseVenue3D: A Markerless Three-Dimension Behavioral Tracking System for Matching Two-Photon Brain Imaging in Free-Moving Mice. Neurosci Bull 2022; 38:303-317. [PMID: 34637091 PMCID: PMC8975979 DOI: 10.1007/s12264-021-00778-6] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2021] [Accepted: 06/23/2021] [Indexed: 10/20/2022] Open
Abstract
Understanding the connection between brain and behavior in animals requires precise monitoring of their behaviors in three-dimensional (3-D) space. However, no available 3-D behavior-capture system focuses on rodents. Here, we present MouseVenue3D, an automated and low-cost system for efficient markerless capture of 3-D skeleton trajectories in rodents. We improved the most time-consuming step in 3-D behavior capture by developing an automatic calibration module. We then validated this process in behavior recognition tasks, and showed that 3-D behavioral data achieved higher accuracy than 2-D data. Subsequently, MouseVenue3D was combined with fast, high-resolution miniature two-photon microscopy for synchronous neural recording and behavioral tracking in freely-moving mice. Finally, we successfully decoded spontaneous neuronal activity from the 3-D behavior of mice. Our findings reveal that subtle, spontaneous behavior modules are strongly correlated with spontaneous neuronal activity patterns.
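The multi-view geometry behind markerless 3-D skeleton tracking of this kind can be sketched with standard linear (DLT) triangulation. The toy projection matrices and point below are illustrative assumptions; they do not reproduce MouseVenue3D's actual calibration or keypoint pipeline.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3-D point from two camera views.

    P1, P2 are 3x4 projection matrices; uv1, uv2 are matching 2-D keypoints.
    Each view contributes two linear constraints; the 3-D point is the
    least-squares null vector of the stacked system.
    """
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                    # homogeneous solution
    return X[:3] / X[3]           # homogeneous -> Euclidean

# Two toy cameras: an identity view and a camera shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.2, -0.1, 4.0])
uv1 = (point / point[2])[:2]                       # projection in camera 1
p2 = point + np.array([-1.0, 0.0, 0.0])
uv2 = p2[:2] / p2[2]                               # projection in camera 2
recovered = triangulate(P1, P2, uv1, uv2)
```

With noise-free projections the null vector is exact; with real detections the same least-squares formulation simply returns the best-fit point, which is why DLT is a common core for multi-camera pose systems.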
Collapse
Affiliation(s)
- Yaning Han
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Kang Huang
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Ke Chen
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Hongli Pan
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
| | - Furong Ju
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
| | - Yueyue Long
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- University of Rochester, Rochester, NY, 14627, USA
| | - Gao Gao
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- Honam University, Gwangju, 62399, South Korea
| | - Runlong Wu
- State Key Laboratory of Membrane Biology, Institute of Molecular Medicine, Peking University, Beijing, 100101, China
| | - Aimin Wang
- Department of Electronics, Peking University, Beijing, 100871, China
- State Key Laboratory of Advanced Optical Communication Systems and Networks, Peking University, Beijing, 100101, China
| | - Liping Wang
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China.
- University of Chinese Academy of Sciences, Beijing, 100049, China.
| | - Pengfei Wei
- Shenzhen Key Laboratory of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China.
- University of Chinese Academy of Sciences, Beijing, 100049, China.
| |
Collapse
|
47
|
Romano V, Zhai P, van der Horst A, Mazza R, Jacobs T, Bauer S, Wang X, White JJ, De Zeeuw CI. Olivocerebellar control of movement symmetry. Curr Biol 2022; 32:654-670.e4. [PMID: 35016009 DOI: 10.1016/j.cub.2021.12.020] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/12/2021] [Revised: 10/26/2021] [Accepted: 12/08/2021] [Indexed: 01/02/2023]
Abstract
Coordination of bilateral movements is essential for a large variety of animal behaviors. The olivocerebellar system is critical for the control of movement, but its role in bilateral coordination has yet to be elucidated. Here, we examined whether Purkinje cells encode and influence synchronicity of left-right whisker movements. We found that complex spike activity is correlated with a prominent left-right symmetry of spontaneous whisker movements within parts, but not all, of Crus1 and Crus2. Optogenetic stimulation of climbing fibers in the areas with high and low correlations resulted in symmetric and asymmetric whisker movements, respectively. Moreover, when simple spike frequency prior to the complex spike was higher, the complex spike-related symmetric whisker protractions were larger. This finding alludes to a role for rebound activity in the cerebellar nuclei, which indeed turned out to be enhanced during symmetric protractions. Tracer injections suggest that regions associated with symmetric whisker movements are anatomically connected to the contralateral cerebellar hemisphere. Together, these data point toward the existence of modules on both sides of the cerebellar cortex that can differentially promote or reduce the symmetry of left and right movements in a context-dependent fashion.
Affiliation(s)
- Vincenzo Romano
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Peipei Zhai
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Roberta Mazza
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Thomas Jacobs
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Staf Bauer
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Xiaolu Wang
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- Joshua J White
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands.
- C I De Zeeuw
- Department of Neuroscience, Erasmus MC, Rotterdam, the Netherlands; Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, the Netherlands.
48
Abstract
The mouse has dichromatic color vision based on two opsin types: short (S)- and middle (M)-wavelength-sensitive opsins, with peak sensitivities to ultraviolet (UV; 360 nm) and green (508 nm) light, respectively. In the mouse retina, cone photoreceptors that predominantly express S-opsin are more sensitive to contrast and are denser towards the ventral retina, preferentially sampling the upper part of the visual field. In contrast, M-opsin expression gradually increases towards the dorsal retina, which encodes the lower visual field. This distinctive retinal organization is assumed to arise from selective pressure in evolution to encode natural scenes efficiently. However, the natural image statistics of UV light remain largely unexplored. Here we developed a multi-spectral camera to acquire high-quality UV and green images of the same natural scenes and examined how well the mouse retina is matched to these image statistics. We found that both local contrast and spatial correlation were higher in UV than in green for images above the horizon, but lower in UV than in green for those below it. This suggests that the dorsoventral functional division of the mouse retina is not optimal for maximizing the bandwidth of information transmission. Factors beyond coding efficiency, such as visual behavioral requirements, will thus need to be considered to fully explain the characteristic organization of the mouse retina.
Affiliation(s)
- Luca Abballe
- Department of Biomedical Engineering, Sapienza University of Rome, Rome, Italy
- Hiroki Asari
- European Molecular Biology Laboratory, Epigenetics and Neurobiology Unit, EMBL Rome, Monterotondo, Rome, Italy
49
Woodward K, Apps R, Goodfellow M, Cerminara NL. Cerebello-Thalamo-Cortical Network Dynamics in the Harmaline Rodent Model of Essential Tremor. Front Syst Neurosci 2022; 16:899446. [PMID: 35965995 PMCID: PMC9365993 DOI: 10.3389/fnsys.2022.899446] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2022] [Accepted: 06/22/2022] [Indexed: 11/18/2022] Open
Abstract
Essential tremor (ET) is a common movement disorder characterised by a posture- or movement-related tremor of the upper limbs. Abnormalities within cerebellar circuits are thought to underlie the pathogenesis of ET, producing aberrant synchronous oscillatory activity within the thalamo-cortical network that leads to tremor. Harmaline produces pathological oscillations within the cerebellum and a tremor that phenotypically resembles ET. However, the neural network dynamics of cerebello-thalamo-cortical circuits in harmaline-induced tremor remain unclear, including how circuit interactions may be influenced by behavioural state. Here, we examined the effect of harmaline on cerebello-thalamo-cortical oscillations during rest and movement. EEG from the sensorimotor cortex and local field potentials (LFPs) from the thalamus and medial cerebellar nuclei were recorded simultaneously in awake behaving rats, alongside measures of tremor using EMG and accelerometry. Analyses compared neural oscillations before and after systemic administration of harmaline (10 mg/kg, i.p.) and compared coherence across periods when rats were resting vs. moving. During movement, harmaline increased the 9-15 Hz behavioural tremor amplitude and increased the coherence of thalamic LFPs with tremor. Coherence of medial cerebellar nuclei and cerebellar vermis LFPs with tremor, however, remained unchanged from rest. These findings suggest that harmaline-induced cerebellar oscillations are independent of behavioural state and the associated changes in tremor amplitude, whereas thalamic oscillations depend on behavioural state and the related changes in tremor amplitude. This study provides new insights into the role of cerebello-thalamo-cortical network interactions in tremor, whereby neural oscillations in thalamocortical, but not cerebellar, circuits can be influenced by movement and/or behavioural tremor amplitude in the harmaline model.
Affiliation(s)
- Kathryn Woodward
- School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, United Kingdom
- Richard Apps
- School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, United Kingdom
- Marc Goodfellow
- Department of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter, United Kingdom
- Living Systems Institute, University of Exeter, Exeter, United Kingdom
- Nadia L. Cerminara
- School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, United Kingdom
- Correspondence: Nadia L. Cerminara
50
Barnes SJ, Keller GB, Keck T. Homeostatic regulation through strengthening of neuronal network-correlated synaptic inputs. eLife 2022; 11:81958. [PMID: 36515269 PMCID: PMC9803349 DOI: 10.7554/elife.81958] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2022] [Accepted: 11/30/2022] [Indexed: 12/15/2022] Open
Abstract
Homeostatic regulation is essential for stable neuronal function. Several synaptic mechanisms of homeostatic plasticity have been described, but the functional properties of the synapses involved in homeostasis are unknown. We used longitudinal two-photon functional imaging of dendritic spine calcium signals in the visual and retrosplenial cortices of awake adult mice to quantify sensory deprivation-induced changes in the responses of functionally identified spines. We found that spines whose activity selectively correlated with intrinsic network activity underwent tumor necrosis factor alpha (TNF-α)-dependent homeostatic increases in their response amplitudes, whereas spines identified as responsive to sensory stimulation did not. We observed an increase in global sensory-evoked responses following sensory deprivation, even though the identified sensory inputs did not strengthen. Instead, global sensory-evoked responses correlated with the strength of network-correlated inputs. Our results suggest that homeostatic regulation of global responses is mediated through changes to intrinsic network-correlated inputs rather than to the identified sensory inputs thought to drive sensory processing.
Affiliation(s)
- Samuel J Barnes
- Department of Brain Sciences, Division of Neuroscience, Imperial College London, Hammersmith Hospital Campus, London, United Kingdom; UK Dementia Research Institute at Imperial College, London, United Kingdom
- Georg B Keller
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
- Tara Keck
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom