51. MouseVenue3D: A Markerless Three-Dimension Behavioral Tracking System for Matching Two-Photon Brain Imaging in Free-Moving Mice. Neurosci Bull 2021; 38:303-317. [PMID: 34637091; PMCID: PMC8975979; DOI: 10.1007/s12264-021-00778-6]
Abstract
Understanding the connection between brain and behavior in animals requires precise monitoring of their behaviors in three-dimensional (3-D) space. However, no available 3-D behavior capture system focuses on rodents. Here, we present MouseVenue3D, an automated and low-cost system for the efficient markerless capture of 3-D skeleton trajectories in rodents. We improved the most time-consuming step in 3-D behavior capture by developing an automatic calibration module. We then validated this process in behavior recognition tasks, and showed that 3-D behavioral data achieved higher accuracy than 2-D data. Subsequently, MouseVenue3D was combined with fast, high-resolution miniature two-photon microscopy for synchronous neural recording and behavioral tracking in freely moving mice. Finally, we successfully decoded spontaneous neuronal activity from the 3-D behavior of mice. Our findings reveal that subtle, spontaneous behavior modules are strongly correlated with spontaneous neuronal activity patterns.
52. Fayat R, Delgado Betancourt V, Goyallon T, Petremann M, Liaudet P, Descossy V, Reveret L, Dugué GP. Inertial Measurement of Head Tilt in Rodents: Principles and Applications to Vestibular Research. Sensors (Basel) 2021; 21:6318. [PMID: 34577524; PMCID: PMC8472891; DOI: 10.3390/s21186318]
Abstract
Inertial sensors are increasingly used in rodent research, in particular for estimating head orientation relative to gravity, or head tilt. Despite this growing interest, the accuracy of tilt estimates computed from rodent head inertial data has never been assessed. Using readily available inertial measurement units mounted onto the head of freely moving rats, we benchmarked a set of tilt estimation methods against concurrent 3D optical motion capture. We show that, while low-pass filtered head acceleration signals only provided reliable tilt estimates in static conditions, sensor calibration combined with an appropriate choice of orientation filter and parameters could yield average tilt estimation errors below 1.5° during movement. We then illustrate an application of inertial head tilt measurements in a preclinical rat model of unilateral vestibular lesion and propose a set of metrics describing the severity of associated postural and motor symptoms and the time course of recovery. We conclude that head-borne inertial sensors are an attractive tool for quantitative rodent behavioral analysis in general and for the study of vestibulo-postural functions in particular.
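The low-pass-filter baseline mentioned in this abstract (treating smoothed acceleration as an estimate of the gravity vector) can be sketched as follows. This is a minimal illustration under assumed conventions (the sensor z axis as the "upright" reference, a single-pole IIR smoother); the function name and parameters are hypothetical, not the authors' implementation.

```python
import numpy as np

def tilt_from_accel(accel, fs, fc=1.0):
    """Estimate head tilt (degrees) from accelerometer samples by
    treating low-pass-filtered acceleration as the gravity vector.

    accel: (n, 3) array in the sensor frame; fs: sampling rate (Hz);
    fc: filter cutoff (Hz). Uses a single-pole IIR low-pass filter.
    """
    alpha = np.exp(-2.0 * np.pi * fc / fs)  # IIR smoothing factor
    g = np.empty_like(accel, dtype=float)
    g[0] = accel[0]
    for i in range(1, len(accel)):
        g[i] = alpha * g[i - 1] + (1.0 - alpha) * accel[i]
    g /= np.linalg.norm(g, axis=1, keepdims=True)
    # Tilt = angle between the gravity estimate and the sensor z axis.
    return np.degrees(np.arccos(np.clip(g[:, 2], -1.0, 1.0)))
```

As the paper notes, such acceleration-only estimates are reliable only when the head is static; during movement, linear accelerations contaminate the gravity estimate, which is why orientation filters fusing gyroscope data performed better in their benchmark.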
Affiliation(s)
- Romain Fayat
- Neurophysiologie des Circuits Cérébraux, Institut de Biologie de l’ENS (IBENS), Ecole Normale Supérieure, UMR CNRS 8197, INSERM U1024, Université PSL, 75005 Paris, France
- Laboratoire MAP5, UMR CNRS 8145, Université Paris Descartes, 75006 Paris, France
- Thibault Goyallon
- Laboratoire Jean Kuntzmann, Université Grenoble Alpes, UMR CNRS 5224, INRIA, 38330 Montbonnot-Saint-Martin, France
- Mathieu Petremann
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Pauline Liaudet
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Vincent Descossy
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Lionel Reveret
- Laboratoire Jean Kuntzmann, Université Grenoble Alpes, UMR CNRS 5224, INRIA, 38330 Montbonnot-Saint-Martin, France
- Guillaume P. Dugué
- Neurophysiologie des Circuits Cérébraux, Institut de Biologie de l’ENS (IBENS), Ecole Normale Supérieure, UMR CNRS 8197, INSERM U1024, Université PSL, 75005 Paris, France
53. Niell CM, Scanziani M. How Cortical Circuits Implement Cortical Computations: Mouse Visual Cortex as a Model. Annu Rev Neurosci 2021; 44:517-546. [PMID: 33914591; PMCID: PMC9925090; DOI: 10.1146/annurev-neuro-102320-085825]
Abstract
The mouse, as a model organism to study the brain, gives us unprecedented experimental access to the mammalian cerebral cortex. By determining the cortex's cellular composition, revealing the interaction between its different components, and systematically perturbing these components, we are obtaining mechanistic insight into some of the most basic properties of cortical function. In this review, we describe recent advances in our understanding of how circuits of cortical neurons implement computations, as revealed by the study of mouse primary visual cortex. Further, we discuss how studying the mouse has broadened our understanding of the range of computations performed by visual cortex. Finally, we address how future approaches will fulfill the promise of the mouse in elucidating fundamental operations of cortex.
Affiliation(s)
- Cristopher M. Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, Oregon 97403, USA
- Massimo Scanziani
- Department of Physiology and Howard Hughes Medical Institute, University of California San Francisco, San Francisco, California 94158, USA
54. van Beest EH, Mukherjee S, Kirchberger L, Schnabel UH, van der Togt C, Teeuwen RRM, Barsegyan A, Meyer AF, Poort J, Roelfsema PR, Self MW. Mouse visual cortex contains a region of enhanced spatial resolution. Nat Commun 2021; 12:4029. [PMID: 34188047; PMCID: PMC8242089; DOI: 10.1038/s41467-021-24311-5]
Abstract
The representation of space in mouse visual cortex was thought to be relatively uniform. Here we reveal, using population receptive-field (pRF) mapping techniques, that mouse visual cortex contains a region in which pRFs are considerably smaller. This region, the “focea,” represents a location in space in front of, and slightly above, the mouse. Using two-photon imaging we show that the smaller pRFs are due to lower scatter of receptive-fields at the focea and an over-representation of binocular regions of space. We show that receptive-fields of single-neurons in areas LM and AL are smaller at the focea and that mice have improved visual resolution in this region of space. Furthermore, freely moving mice make compensatory eye-movements to hold this region in front of them. Our results indicate that mice have spatial biases in their visual processing, a finding that has important implications for the use of the mouse model of vision.

Editor's summary: The representation of space in mouse visual cortex was considered to be relatively uniform. The authors show that mice have improved visual resolution in a cortical region representing a location in space directly in front and slightly above them, showing that the representation of space in mouse visual cortex is non-uniform.
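As background on the pRF technique mentioned in this abstract: a population receptive field is commonly modeled as a 2D Gaussian over visual space, and its fitted width serves as the pRF size. A minimal sketch, assuming an isotropic Gaussian and hypothetical function names (this is not the authors' analysis pipeline):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_prf(coords, x0, y0, sigma, amp):
    """2D isotropic Gaussian pRF model: response amplitude as a function
    of stimulus position (azimuth, elevation in degrees)."""
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2))

def fit_prf(x, y, resp):
    """Fit the pRF centre (x0, y0), size (sigma), and amplitude to
    responses measured at stimulus positions (x, y)."""
    # Initialise at the position of the strongest response.
    p0 = [x[np.argmax(resp)], y[np.argmax(resp)], 10.0, resp.max()]
    popt, _ = curve_fit(gaussian_prf, (x, y), resp, p0=p0)
    return popt  # x0, y0, sigma, amp
```

In this framing, the paper's "focea" would appear as a region of visual space where the fitted sigma values are systematically smaller.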
Affiliation(s)
- Enny H van Beest
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Sreedeep Mukherjee
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Lisa Kirchberger
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Ulf H Schnabel
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Chris van der Togt
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Rob R M Teeuwen
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Areg Barsegyan
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
- Arne F Meyer
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands; Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London, UK
- Jasper Poort
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, UK; Department of Psychology, University of Cambridge, Cambridge, UK
- Pieter R Roelfsema
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands; Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, VU University, Amsterdam, The Netherlands; Department of Psychiatry, Academic Medical Center, Amsterdam, The Netherlands
- Matthew W Self
- Department of Vision & Cognition, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands
55. Qiu Y, Zhao Z, Klindt D, Kautzky M, Szatko KP, Schaeffel F, Rifai K, Franke K, Busse L, Euler T. Natural environment statistics in the upper and lower visual field are reflected in mouse retinal specializations. Curr Biol 2021; 31:3233-3247.e6. [PMID: 34107304; DOI: 10.1016/j.cub.2021.05.017]
Abstract
Pressures for survival drive sensory circuits to adapt to a species' natural habitat and its behavioral challenges. Thus, to advance our understanding of the visual system, it is essential to consider an animal's specific visual environment by capturing natural scenes, characterizing their statistical regularities, and using them to probe visual computations. Mice, a prominent visual system model, have salient visual specializations, being dichromatic with enhanced sensitivity to green and UV in the dorsal and ventral retina, respectively. However, the characteristics of their visual environment that likely have driven these adaptations are rarely considered. Here, we built a UV-green-sensitive camera to record footage from mouse habitats. This footage is publicly available as a resource for mouse vision research. We found chromatic contrast to greatly diverge in the upper, but not the lower, visual field. Moreover, training a convolutional autoencoder on upper, but not lower, visual field scenes was sufficient for the emergence of color-opponent filters, suggesting that this environmental difference might have driven superior chromatic opponency in the ventral mouse retina, supporting color discrimination in the upper visual field. Furthermore, the upper visual field was biased toward dark UV contrasts, paralleled by more light-offset-sensitive ganglion cells in the ventral retina. Finally, footage recorded at twilight suggests that UV promotes aerial predator detection. Our findings support that natural scene statistics shaped early visual processing in evolution.
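One simple way to quantify the chromatic (UV vs. green) statistics described in this abstract is a Michelson-style opponency index computed per pixel and compared between the upper and lower halves of the image. The metric and function names below are illustrative assumptions, not necessarily the paper's exact definition:

```python
import numpy as np

def chromatic_opponency(uv, green, eps=1e-9):
    """Pixel-wise chromatic opponency index in [-1, 1]:
    +1 = pure UV, -1 = pure green, 0 = chromatically balanced."""
    uv = np.asarray(uv, dtype=float)
    green = np.asarray(green, dtype=float)
    return (uv - green) / (uv + green + eps)

def field_contrast(uv, green, horizon_row):
    """Compare the spread of chromatic contrast above vs. below an
    assumed horizon line: std of the opponency index in each half."""
    c = chromatic_opponency(uv, green)
    return c[:horizon_row].std(), c[horizon_row:].std()
```

Under this kind of metric, the paper's finding would correspond to a larger contrast spread in the image region representing the upper visual field.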
Affiliation(s)
- Yongrong Qiu
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany
- Zhijian Zhao
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany
- David Klindt
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany
- Magdalena Kautzky
- Division of Neurobiology, Faculty of Biology, LMU Munich, 82152 Planegg-Martinsried, Germany; Graduate School of Systemic Neurosciences (GSN), LMU Munich, 82152 Planegg-Martinsried, Germany
- Klaudia P Szatko
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany
- Frank Schaeffel
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany
- Katharina Rifai
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Carl Zeiss Vision International GmbH, 73430 Aalen, Germany
- Katrin Franke
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany
- Laura Busse
- Division of Neurobiology, Faculty of Biology, LMU Munich, 82152 Planegg-Martinsried, Germany; Bernstein Centre for Computational Neuroscience, 82152 Planegg-Martinsried, Germany
- Thomas Euler
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany
56. Ebbesen CL, Froemke RC. Body language signals for rodent social communication. Curr Opin Neurobiol 2021; 68:91-106. [PMID: 33582455; PMCID: PMC8243782; DOI: 10.1016/j.conb.2021.01.008]
Abstract
Integration of social cues to initiate adaptive emotional and behavioral responses is a fundamental aspect of animal and human behavior. In humans, social communication includes prominent nonverbal components, such as social touch, gestures and facial expressions. Comparative studies investigating the neural basis of social communication in rodents have historically been centered on olfactory signals and vocalizations, with relatively less focus on nonverbal social cues. Here, we outline two exciting research directions: First, we will review recent observations pointing to a role of social facial expressions in rodents. Second, we will review observations that point to a role of 'non-canonical' rodent body language: body posture signals beyond stereotyped displays in aggressive and sexual behavior. In both sections, we will outline how social neuroscience can build on recent advances in machine learning, robotics and micro-engineering to push these research directions forward towards a holistic systems neurobiology of rodent body language.
Affiliation(s)
- Christian L Ebbesen
- Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA
- Robert C Froemke
- Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA; Howard Hughes Medical Institute Faculty Scholar, USA
57. Poort J, Meyer AF. Vision: Depth perception in climbing mice. Curr Biol 2021; 31:R486-R488. [PMID: 34033773; DOI: 10.1016/j.cub.2021.03.066]
Abstract
Depth perception helps animals interact with a three-dimensional world. A new study presents a novel paradigm for studying depth perception in naturally climbing mice and links their behavior to binocular disparity signals in primary visual cortical neurons.
Affiliation(s)
- Jasper Poort
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge CB2 3EG, UK; Department of Psychology, University of Cambridge, Cambridge CB2 3EB, UK
- Arne F Meyer
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen 6525 AJ, The Netherlands; Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London W1T 4JG, UK
58. Ivanchenko D, Rifai K, Hafed ZM, Schaeffel F. A low-cost, high-performance video-based binocular eye tracker for psychophysical research. J Eye Mov Res 2021; 14. [PMID: 34122750; PMCID: PMC8190563; DOI: 10.16910/jemr.14.3.3]
Abstract
We describe a high-performance, pupil-based binocular eye tracker that approaches the performance of a well-established commercial system, but at a fraction of the cost. The eye tracker is built from standard hardware components, and its software (written in Visual C++) can be easily implemented. Because of its fast and simple linear calibration scheme, the eye tracker performs best in the central 10 degrees of the visual field. The eye tracker possesses a number of useful features: (1) automated calibration simultaneously in both eyes while subjects fixate four fixation points sequentially on a computer screen, (2) automated real-time continuous analysis of measurement noise, (3) automated blink detection, and (4) real-time analysis of pupil centration artifacts. This last feature is critical because it is known that pupil diameter changes can be erroneously registered by pupil-based trackers as a change in eye position. We evaluated the performance of our system against that of a well-established commercial system using simultaneous measurements in 10 participants. We propose our low-cost eye tracker as a promising resource for studies of binocular eye movements.
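A linear (affine) calibration of the kind described, mapping pupil-center coordinates to screen coordinates from a handful of fixation targets, can be fit by least squares. A minimal sketch with hypothetical function names (the published system is written in Visual C++; Python is used here only for illustration):

```python
import numpy as np

def fit_linear_calibration(pupil_xy, screen_xy):
    """Fit an affine map from pupil-center coordinates to screen
    coordinates by least squares, using responses collected while the
    subject fixates a few known targets.

    pupil_xy, screen_xy: (n, 2) arrays. Returns a (3, 2) coefficient
    matrix (two slope rows plus an offset row).
    """
    n = len(pupil_xy)
    A = np.hstack([pupil_xy, np.ones((n, 1))])  # (n, 3) design matrix
    coef, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coef

def apply_calibration(coef, pupil_xy):
    """Map pupil-center coordinates to screen coordinates."""
    A = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])
    return A @ coef
```

A purely linear map like this is exact only over a small region, which is consistent with the abstract's note that the tracker performs best in the central 10 degrees of the visual field.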
59. Evaluating Visual Cues Modulates Their Representation in Mouse Visual and Cingulate Cortex. J Neurosci 2021; 41:3531-3544. [PMID: 33687964; DOI: 10.1523/jneurosci.1828-20.2021]
Abstract
Choosing an action in response to visual cues relies on cognitive processes, such as perception, evaluation, and prediction, which can modulate visual representations even at early processing stages. In the mouse, it is challenging to isolate cognitive modulations of sensory signals because concurrent overt behavior patterns, such as locomotion, can also have brainwide influences. To address this challenge, we designed a task in which head-fixed mice had to evaluate one of two visual cues. While their global shape signaled the opportunity to earn reward, the cues provided equivalent local stimulation to receptive fields of neurons in primary visual (V1) and anterior cingulate cortex (ACC). We found that mice evaluated these cues within a few hundred milliseconds. During this period, ∼30% of V1 neurons became cue-selective, with preferences for either cue being balanced across the recorded population. This selectivity emerged in response to the behavioral demands because the same neurons could not discriminate the cues in sensory control measurements. In ACC, cue evaluation affected a similar fraction of neurons; emerging selectivity, however, was stronger than in V1, and preferences in the recorded population were biased toward the cue promising reward. Such a biased selectivity regime might allow the mouse to infer the promise of reward simply by the overall level of activity. Together, these experiments isolate the impact of task demands on neural responses in mouse cerebral cortex, and document distinct neural signatures of cue evaluation in V1 and ACC.
SIGNIFICANCE STATEMENT Performing a cognitive task, such as evaluating visual cues, not only recruits frontal and parietal brain regions, but also modulates sensory processing stages. We trained mice to evaluate two visual cues, and show that, during this task, ∼30% of neurons recorded in V1 became selective for either cue, although they provided equivalent visual stimulation. We also show that, during cue evaluation, mice frequently move their eyes, even under head fixation, and that ignoring systematic differences in eye position can substantially obscure the modulations seen in V1 neurons. Finally, we document that modulations are stronger in ACC, and biased toward the reward-predicting cue, suggesting a transition in the neural representation of task-relevant information across processing stages in mouse cerebral cortex.
60. Johnson KP, Fitzpatrick MJ, Zhao L, Wang B, McCracken S, Williams PR, Kerschensteiner D. Cell-type-specific binocular vision guides predation in mice. Neuron 2021; 109:1527-1539.e4. [PMID: 33784498; DOI: 10.1016/j.neuron.2021.03.010]
Abstract
Predators use vision to hunt, and hunting success is one of evolution's main selection pressures. However, how viewing strategies and visual systems are adapted to predation is unclear. Tracking predator-prey interactions of mice and crickets in 3D, we find that mice trace crickets with their binocular visual fields and that monocular mice are poor hunters. Mammalian binocular vision requires ipsi- and contralateral projections of retinal ganglion cells (RGCs) to the brain. Large-scale single-cell recordings and morphological reconstructions reveal that only a small subset (9 of 40+) of RGC types in the ventrotemporal mouse retina innervate ipsilateral brain areas (ipsi-RGCs). Selective ablation of ipsi-RGCs (<2% of RGCs) in the adult retina drastically reduces the hunting success of mice. Stimuli based on ethological observations indicate that five ipsi-RGC types reliably signal prey. Thus, viewing strategies align with a spatially restricted and cell-type-specific set of ipsi-RGCs that supports binocular vision to guide predation.
Affiliation(s)
- Keith P Johnson
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA; Graduate Program in Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Michael J Fitzpatrick
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA; Graduate Program in Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA; Medical Scientist Training Program, Washington University School of Medicine, St. Louis, MO 63110, USA
- Lei Zhao
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA
- Bing Wang
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA
- Sean McCracken
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA
- Philip R Williams
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA; Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA; Hope Center for Neurological Disorders, Washington University School of Medicine, St. Louis, MO 63110, USA
- Daniel Kerschensteiner
- John F. Hardesty, MD Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, MO 63110, USA; Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA; Hope Center for Neurological Disorders, Washington University School of Medicine, St. Louis, MO 63110, USA; Department of Biomedical Engineering, Washington University School of Medicine, St. Louis, MO 63110, USA
61. Dennis EJ, El Hady A, Michaiel A, Clemens A, Tervo DRG, Voigts J, Datta SR. Systems Neuroscience of Natural Behaviors in Rodents. J Neurosci 2021; 41:911-919. [PMID: 33443081; PMCID: PMC7880287; DOI: 10.1523/jneurosci.1877-20.2020]
Abstract
Animals evolved in complex environments, producing a wide range of behaviors, including navigation, foraging, prey capture, and conspecific interactions, which vary over timescales ranging from milliseconds to days. Historically, these behaviors have been the focus of study for ecology and ethology, while systems neuroscience has largely focused on short timescale behaviors that can be repeated thousands of times and occur in highly artificial environments. Thanks to recent advances in machine learning, miniaturization, and computation, it is newly possible to study freely moving animals in more natural conditions while applying systems techniques: performing temporally specific perturbations, modeling behavioral strategies, and recording from large numbers of neurons while animals are freely moving. The authors of this review are a group of scientists with deep appreciation for the common aims of systems neuroscience, ecology, and ethology. We believe it is an extremely exciting time to be a neuroscientist, as we have an opportunity to grow as a field, to embrace interdisciplinary, open, collaborative research to provide new insights and allow researchers to link knowledge across disciplines, species, and scales. Here we discuss the origins of ethology, ecology, and systems neuroscience in the context of our own work and highlight how combining approaches across these fields has provided fresh insights into our research. We hope this review facilitates some of these interactions and alliances and helps us all do even better science, together.
Affiliation(s)
- Emily Jane Dennis
- Princeton University and Howard Hughes Medical Institute, Princeton, New Jersey, 08540
- Ahmed El Hady
- Princeton University and Howard Hughes Medical Institute, Princeton, New Jersey, 08540
- Ann Clemens
- University of Edinburgh, Edinburgh, Scotland, EH8 9JZ
- Jakob Voigts
- Massachusetts Institute of Technology, Cambridge, Massachusetts, 02139
62. Marshall JD, Aldarondo DE, Dunn TW, Wang WL, Berman GJ, Ölveczky BP. Continuous Whole-Body 3D Kinematic Recordings across the Rodent Behavioral Repertoire. Neuron 2021; 109:420-437.e8. [PMID: 33340448; PMCID: PMC7864892; DOI: 10.1016/j.neuron.2020.11.016]
Abstract
In mammalian animal models, high-resolution kinematic tracking is restricted to brief sessions in constrained environments, limiting our ability to probe naturalistic behaviors and their neural underpinnings. To address this, we developed CAPTURE (Continuous Appendicular and Postural Tracking Using Retroreflector Embedding), a behavioral monitoring system that combines motion capture and deep learning to continuously track the 3D kinematics of a rat's head, trunk, and limbs for week-long timescales in freely behaving animals. CAPTURE realizes 10- to 100-fold gains in precision and robustness compared with existing convolutional network approaches to behavioral tracking. We demonstrate CAPTURE's ability to comprehensively profile the kinematics and sequential organization of natural rodent behavior, its variation across individuals, and its perturbation by drugs and disease, including identifying perseverative grooming states in a rat model of fragile X syndrome. CAPTURE significantly expands the range of behaviors and contexts that can be quantitatively investigated, opening the door to a new understanding of natural behavior and its neural basis.
Affiliation(s)
- Jesse D Marshall
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA
- Diego E Aldarondo
- Program in Neuroscience, Harvard University, Cambridge, MA 02138, USA
- Timothy W Dunn
- Department of Statistical Science, Duke University, Durham, NC 27710, USA
- William L Wang
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA
- Gordon J Berman
- Department of Biology, Emory University, Atlanta, GA 30322, USA
- Bence P Ölveczky
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA
63. Mallory CS, Hardcastle K, Campbell MG, Attinger A, Low IIC, Raymond JL, Giocomo LM. Mouse entorhinal cortex encodes a diverse repertoire of self-motion signals. Nat Commun 2021; 12:671. [PMID: 33510164; PMCID: PMC7844029; DOI: 10.1038/s41467-021-20936-8]
Abstract
Neural circuits generate representations of the external world from multiple information streams. The navigation system provides an exceptional lens through which we may gain insights about how such computations are implemented. Neural circuits in the medial temporal lobe construct a map-like representation of space that supports navigation. This computation integrates multiple sensory cues, and, in addition, is thought to require cues related to the individual's movement through the environment. Here, we identify multiple self-motion signals, related to the position and velocity of the head and eyes, encoded by neurons in a key node of the navigation circuitry of mice, the medial entorhinal cortex (MEC). The representation of these signals is highly integrated with other cues in individual neurons. Such information could be used to compute the allocentric location of landmarks from visual cues and to generate internal representations of space.
Affiliation(s)
- Caitlin S Mallory
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Kiah Hardcastle
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Malcolm G Campbell
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Alexander Attinger
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Isabel I C Low
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Jennifer L Raymond
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Lisa M Giocomo
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
64
Sattler NJ, Wehr M. A Head-Mounted Multi-Camera System for Electrophysiology and Behavior in Freely-Moving Mice. Front Neurosci 2021; 14:592417. [PMID: 33584174] [PMCID: PMC7874224] [DOI: 10.3389/fnins.2020.592417]
Abstract
Advances in the ability to monitor freely-moving mice may prove valuable for the study of behavior and its neural correlates. Here we present a head-mounted multi-camera system composed of inexpensive miniature analog camera modules, and illustrate its use for investigating natural behaviors such as prey capture, courtship, sleep, jumping, and exploration. With a four-camera headset, the eyes, ears, whiskers, rhinarium, and binocular visual field can all be monitored simultaneously alongside high-density electrophysiology. With appropriate focus and positioning, all eye movements can be captured, including cyclotorsion. For studies of vision and eye movements, cyclotorsion provides the final degree of freedom required to reconstruct the visual scene in retinotopic coordinates or to investigate the vestibulo-ocular reflex in mice. Altogether, this system allows for comprehensive measurement of freely-moving mouse behavior, enabling a more holistic and multimodal approach to investigating ethological behaviors and other processes of active perception.
Affiliation(s)
- Nicholas J. Sattler
- Department of Biology, Institute of Neuroscience, University of Oregon, Eugene, OR, United States
- Michael Wehr
- Department of Psychology, Institute of Neuroscience, University of Oregon, Eugene, OR, United States
65
Flossmann T, Rochefort NL. Spatial navigation signals in rodent visual cortex. Curr Opin Neurobiol 2020; 67:163-173. [PMID: 33360769] [DOI: 10.1016/j.conb.2020.11.004]
Abstract
During navigation, animals integrate sensory information with body movements to guide actions. The impact of both navigational and movement-related signals on cortical visual information processing remains largely unknown. We review recent studies in awake rodents that have revealed navigation-related signals in the primary visual cortex (V1) including speed, distance travelled and head-orienting movements. Both cortical and subcortical inputs convey self-motion related information to V1 neurons: for example, top-down inputs from secondary motor and retrosplenial cortices convey information about head movements and spatial expectations. Within V1, subtypes of inhibitory neurons are critical for the integration of navigation-related and visual signals. We conclude with potential functional roles of navigation-related signals in V1 including gain control, motor error signals and predictive coding.
Affiliation(s)
- Tom Flossmann
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom
- Nathalie L Rochefort
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom; Simons Initiative for the Developing Brain, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom
66
Storchi R, Milosavljevic N, Allen AE, Zippo AG, Agnihotri A, Cootes TF, Lucas RJ. A High-Dimensional Quantification of Mouse Defensive Behaviors Reveals Enhanced Diversity and Stimulus Specificity. Curr Biol 2020; 30:4619-4630.e5. [PMID: 33007242] [PMCID: PMC7728163] [DOI: 10.1016/j.cub.2020.09.007]
Abstract
Instinctive defensive behaviors, consisting of stereotyped sequences of movements and postures, are an essential component of the mouse behavioral repertoire. Because defensive behaviors can be reliably triggered by threatening sensory stimuli, the selection of the most appropriate action depends on the properties of the stimulus. However, since the mouse has a wide repertoire of motor actions, it is not clear which set of movements and postures represents the relevant action. So far, this has been empirically identified as a change in locomotion state. However, the extent to which locomotion alone captures the diversity of defensive behaviors and their sensory specificity is unknown. To tackle this problem, we developed a method to obtain a faithful 3D reconstruction of the mouse body that enabled us to quantify a wide variety of motor actions. This higher-dimensional description revealed that defensive behaviors are more stimulus-specific than indicated by locomotion data. Thus, responses to distinct stimuli that were equivalent in terms of locomotion (e.g., freezing induced by looming and sound) could be discriminated along other dimensions. The enhanced stimulus specificity was explained by a surprising diversity. A clustering analysis revealed that distinct combinations of movements and postures, giving rise to at least 7 different behaviors, were required to account for stimulus specificity. Moreover, each stimulus evoked more than one behavior, revealing a robust one-to-many mapping between sensations and behaviors that was not apparent from locomotion data. Our results indicate that the diversity and sensory specificity of mouse defensive behaviors unfold in a higher-dimensional space spanning multiple motor actions.
Affiliation(s)
- Riccardo Storchi
- Division of Neuroscience and Experimental Psychology, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Nina Milosavljevic
- Division of Neuroscience and Experimental Psychology, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Annette E Allen
- Division of Neuroscience and Experimental Psychology, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Antonio G Zippo
- Institute of Neuroscience, Consiglio Nazionale delle Ricerche, Milan, Italy
- Aayushi Agnihotri
- Division of Neuroscience and Experimental Psychology, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Timothy F Cootes
- Division of Informatics, Imaging & Data Science, School of Health Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Robert J Lucas
- Division of Neuroscience and Experimental Psychology, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
67
Guitchounts G, Masís J, Wolff SB, Cox D. Encoding of 3D Head Orienting Movements in the Primary Visual Cortex. Neuron 2020; 108:512-525.e4. [DOI: 10.1016/j.neuron.2020.07.014]
68
Disparity Sensitivity and Binocular Integration in Mouse Visual Cortex Areas. J Neurosci 2020; 40:8883-8899. [PMID: 33051348] [DOI: 10.1523/jneurosci.1060-20.2020]
Abstract
Binocular disparity, the difference between the two eyes' images, is a powerful cue to generate the 3D depth percept known as stereopsis. In primates, binocular disparity is processed in multiple areas of the visual cortex, with distinct contributions of higher areas to specific aspects of depth perception. Mice, too, can perceive stereoscopic depth, and neurons in primary visual cortex (V1) and higher-order, lateromedial (LM) and rostrolateral (RL) areas were found to be sensitive to binocular disparity. A detailed characterization of disparity tuning across mouse visual areas is lacking, however, and acquiring such data might help clarifying the role of higher areas for disparity processing and establishing putative functional correspondences to primate areas. We used two-photon calcium imaging in female mice to characterize the disparity tuning properties of neurons in visual areas V1, LM, and RL in response to dichoptically presented binocular gratings, as well as random dot correlograms (RDC). In all three areas, many neurons were tuned to disparity, showing strong response facilitation or suppression at optimal or null disparity, respectively, even in neurons classified as monocular by conventional ocular dominance (OD) measurements. Neurons in higher areas exhibited broader and more asymmetric disparity tuning curves compared with V1, as observed in primate visual cortex. Finally, we probed neurons' sensitivity to true stereo correspondence by comparing responses to correlated RDC (cRDC) and anticorrelated RDC (aRDC). Area LM, akin to primate ventral visual stream areas, showed higher selectivity for correlated stimuli and reduced anticorrelated responses, indicating higher-level disparity processing in LM compared with V1 and RL.
SIGNIFICANCE STATEMENT A major cue for inferring 3D depth is disparity between the two eyes' images. Investigating how binocular disparity is processed in the mouse visual system will not only help delineate the role of mouse higher areas for visual processing, but also shed light on how the mammalian brain computes stereopsis. We found that binocular integration is a prominent feature of mouse visual cortex, as many neurons are selectively and strongly modulated by binocular disparity. Comparison of responses to correlated and anticorrelated random dot correlograms (RDC) revealed that lateromedial area (LM) is more selective to correlated stimuli, while less sensitive to anticorrelated stimuli compared with primary visual cortex (V1) and rostrolateral area (RL), suggesting higher-level disparity processing in LM, resembling primate ventral visual stream areas.
69
Deichler A, Carrasco D, Lopez-Jury L, Vega-Zuniga T, Márquez N, Mpodozis J, Marín GJ. A specialized reciprocal connectivity suggests a link between the mechanisms by which the superior colliculus and parabigeminal nucleus produce defensive behaviors in rodents. Sci Rep 2020; 10:16220. [PMID: 33004866] [PMCID: PMC7530999] [DOI: 10.1038/s41598-020-72848-0]
Abstract
The parabigeminal nucleus (PBG) is the mammalian homologue to the isthmic complex of other vertebrates. Optogenetic stimulation of the PBG induces freezing and escape in mice, a result thought to be caused by a PBG projection to the central nucleus of the amygdala. However, the isthmic complex, including the PBG, has been classically considered a satellite nucleus of the superior colliculus (SC), which upon stimulation of its medial part also triggers fear and avoidance reactions. As the PBG-SC connectivity is not well characterized, we investigated whether the topology of the PBG projection to the SC could be related to the behavioral consequences of PBG stimulation. To that end, we performed immunohistochemistry, in situ hybridization and neural tracer injections in the SC and PBG in a diurnal rodent, the Octodon degus. We found that all PBG neurons expressed both glutamatergic and cholinergic markers and were distributed in clearly defined anterior (aPBG) and posterior (pPBG) subdivisions. The pPBG is connected reciprocally and topographically to the ipsilateral SC, whereas the aPBG receives afferent axons from the ipsilateral SC and projected exclusively to the contralateral SC. This contralateral projection forms a dense field of terminals that is restricted to the medial SC, corresponding to the SC representation of the aerial binocular field, which, as we also found, prompts escape reactions in O. degus upon looming stimulation. Therefore, this specialized topography allows binocular interactions in the SC region controlling responses to aerial predators, suggesting a link between the mechanisms by which the SC and PBG produce defensive behaviors.
Affiliation(s)
- Alfonso Deichler
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile
- Denisse Carrasco
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile
- Luciana Lopez-Jury
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile
- Tomas Vega-Zuniga
- Institute of Science and Technology Austria (IST Austria), Klosterneuburg, Austria
- Natalia Márquez
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile
- Jorge Mpodozis
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile
- Gonzalo J Marín
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Las Palmeras 3425, Santiago, Chile; Facultad de Medicina, Universidad Finis Terrae, Santiago, Chile
70
Revealing the structure of pharmacobehavioral space through motion sequencing. Nat Neurosci 2020; 23:1433-1443. [PMID: 32958923] [PMCID: PMC7606807] [DOI: 10.1038/s41593-020-00706-3]
Abstract
Understanding how genes, drugs and neural circuits influence behavior requires the ability to effectively organize information about similarities and differences within complex behavioral datasets. Motion Sequencing (MoSeq) is an ethologically-inspired behavioral analysis method that identifies modular components of 3D mouse body language called “syllables.” Here we show that MoSeq effectively parses behavioral differences and captures similarities elicited by a panel of neuro- and psychoactive drugs administered to a cohort of nearly 700 mice. MoSeq identifies syllables that are characteristic of individual drugs; we leverage this finding to reveal specific on- and off-target effects of both established and candidate therapeutics in a mouse model of autism spectrum disorder. These results demonstrate that MoSeq can meaningfully organize large-scale behavioral data, illustrate the power of a fundamentally modular description of behavior, and suggest that behavioral syllables represent a new class of druggable target.
71
Privitera M, Ferrari KD, von Ziegler LM, Sturman O, Duss SN, Floriou-Servou A, Germain PL, Vermeiren Y, Wyss MT, De Deyn PP, Weber B, Bohacek J. A complete pupillometry toolbox for real-time monitoring of locus coeruleus activity in rodents. Nat Protoc 2020; 15:2301-2320. [PMID: 32632319] [DOI: 10.1038/s41596-020-0324-6]
Abstract
The locus coeruleus (LC) is a region in the brainstem that produces noradrenaline and is involved in both normal and pathological brain function. Pupillometry, the measurement of pupil diameter, provides a powerful readout of LC activity in rodents, primates and humans. The protocol detailed here describes a miniaturized setup that can screen LC activity in rodents in real-time and can be established within 1-2 d. Using low-cost Raspberry Pi computers and cameras, the complete custom-built system costs only ~300 euros, is compatible with stereotaxic surgery frames and seamlessly integrates into complex experimental setups. Tools for pupil tracking and a user-friendly Pupillometry App allow quantification, analysis and visualization of pupil size. Pupillometry can discriminate between different, physiologically relevant firing patterns of the LC and can accurately report LC activation as measured by noradrenaline turnover. Pupillometry provides a rapid, non-invasive readout that can be used to verify accurate placement of electrodes/fibers in vivo, thus allowing decisions about the inclusion/exclusion of individual animals before experiments begin.
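The core computation behind a pupillometry readout like the one described above can be caricatured in a few lines: threshold a grayscale eye image to isolate the dark pupil, then convert the thresholded area to an equivalent circle diameter. The sketch below is a toy illustration of that idea, not code from the authors' toolbox; the function name, threshold value, and synthetic frame are all illustrative assumptions.

```python
import math

def pupil_diameter(frame, threshold=50):
    """Estimate pupil diameter (in pixels) from a grayscale eye image.

    Toy rule: pixels darker than `threshold` are treated as pupil, and
    the diameter is recovered from the pupil area via A = pi * (d/2)^2.
    A real pipeline would also reject eyelid shadows and fit an ellipse.
    """
    area = sum(1 for row in frame for px in row if px < threshold)
    return 2.0 * math.sqrt(area / math.pi)

# Synthetic 200x200 bright frame (value 200) with a dark pupil
# (value 10) of radius 30 px centered at (100, 100).
frame = [[10 if (y - 100) ** 2 + (x - 100) ** 2 <= 30 ** 2 else 200
          for x in range(200)] for y in range(200)]

print(pupil_diameter(frame))  # close to the true diameter of 60 px
```

Because the estimate depends only on a pixel count, it is robust to small pupil shape irregularities, which is one reason area-based tracking works well on low-cost camera hardware.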
Affiliation(s)
- Mattia Privitera
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Kim David Ferrari
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland; Experimental Imaging and Neuroenergetics, Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Lukas M von Ziegler
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Oliver Sturman
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Sian N Duss
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Amalia Floriou-Servou
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Pierre-Luc Germain
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Yannick Vermeiren
- Department of Biomedical Sciences, Laboratory of Neurochemistry and Behavior, Institute Born-Bunge, University of Antwerp, Wilrijk (Antwerp), Antwerpen, Belgium; Department of Neurology and Alzheimer Center, University of Groningen and University Medical Center Groningen (UMCG), Groningen, the Netherlands
- Matthias T Wyss
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland; Experimental Imaging and Neuroenergetics, Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Peter P De Deyn
- Department of Biomedical Sciences, Laboratory of Neurochemistry and Behavior, Institute Born-Bunge, University of Antwerp, Wilrijk (Antwerp), Antwerpen, Belgium; Department of Neurology and Alzheimer Center, University of Groningen and University Medical Center Groningen (UMCG), Groningen, the Netherlands; Department of Neurology, Memory Clinic of Hospital Network Antwerp (ZNA) Middelheim and Hoge Beuken, Antwerp, Belgium
- Bruno Weber
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland; Experimental Imaging and Neuroenergetics, Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Johannes Bohacek
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland; Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
72
Parker PRL, Brown MA, Smear MC, Niell CM. Movement-Related Signals in Sensory Areas: Roles in Natural Behavior. Trends Neurosci 2020; 43:581-595. [PMID: 32580899] [PMCID: PMC8000520] [DOI: 10.1016/j.tins.2020.05.005]
Abstract
Recent studies have demonstrated prominent and widespread movement-related signals in the brain of head-fixed mice, even in primary sensory areas. However, it is still unknown what role these signals play in sensory processing. Why are these sensory areas 'contaminated' by movement signals? During natural behavior, animals actively acquire sensory information as they move through the environment and use this information to guide ongoing actions. In this context, movement-related signals could allow sensory systems to predict self-induced sensory changes and extract additional information about the environment. In this review we summarize recent findings on the presence of movement-related signals in sensory areas and discuss how their study, in the context of natural freely moving behaviors, could advance models of sensory processing.
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Morgan A Brown
- Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Matthew C Smear
- Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA; Department of Psychology, University of Oregon, Eugene, OR 97403, USA
- Cristopher M Niell
- Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA; Department of Biology, University of Oregon, Eugene, OR 97403, USA
73
Dynamic coordination of the perirhinal cortical neurons supports coherent representations between task epochs. Commun Biol 2020; 3:406. [PMID: 32733065] [PMCID: PMC7393175] [DOI: 10.1038/s42003-020-01129-3]
Abstract
Cortical neurons show distinct firing patterns across multiple task epochs characterized by different computations. Recent studies suggest that such distinct patterns underlie a dynamic population code that achieves computational flexibility, whereas neurons in some cortical areas often show coherent firing patterns across epochs. To understand how a coherent single-neuron code contributes to the dynamic population code, we analyzed neural responses in the rat perirhinal cortex (PRC) during the cue and reward epochs of a two-alternative forced-choice task. We found that PRC neurons often encoded opposite choice directions between those epochs. Using principal component analysis as a population-level analysis, we identified neural subspaces associated with each epoch, which reflected coordination across the neurons. The cue and reward epochs shared neural dimensions in which the choice directions were consistently discriminated. Interestingly, those dimensions were supported by dynamically changing contributions of the individual neurons. These results demonstrate the heterogeneity of coherent single-neuron representations in their contributions to the population code.
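The population-level step in this kind of analysis — using PCA to find a low-dimensional subspace in which choice is discriminated — can be sketched on synthetic data. The example below is a minimal illustration under invented assumptions (population size, signal strength, and variable names are not from the paper): trials carry a choice signal along one fixed axis in neural space, and the top principal component recovers that axis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population activity: 50 neurons x 200 trials in one task epoch.
# Each trial carries a choice signal (+1 / -1) along a fixed "choice
# axis" in neural state space, plus unit-variance noise.
n_neurons, n_trials = 50, 200
choice = rng.choice([-1.0, 1.0], size=n_trials)
axis_cue = rng.standard_normal(n_neurons)
axis_cue /= np.linalg.norm(axis_cue)          # unit choice axis

cue_epoch = np.outer(choice, 3.0 * axis_cue) \
    + rng.standard_normal((n_trials, n_neurons))

# PCA via SVD of the trial-centered data matrix: the rows of Vt span
# the dominant neural subspace for this epoch.
X = cue_epoch - cue_epoch.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = Vt[0]

# The leading PC should align with the embedded choice axis,
# so projecting trials onto it separates the two choices.
print(abs(pc1 @ axis_cue))  # close to 1 (strong alignment)
```

Comparing the subspaces recovered from two different epochs (e.g., cue vs. reward) would then amount to checking how well dimensions like `pc1` from one epoch discriminate choice in the other.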
74
Michaiel AM, Abe ETT, Niell CM. Dynamics of gaze control during prey capture in freely moving mice. eLife 2020; 9:e57458. [PMID: 32706335] [PMCID: PMC7438109] [DOI: 10.7554/elife.57458]
Abstract
Many studies of visual processing are conducted in constrained conditions such as head- and gaze-fixation, and therefore less is known about how animals actively acquire visual information in natural contexts. To determine how mice target their gaze during natural behavior, we measured head and bilateral eye movements in mice performing prey capture, an ethological behavior that engages vision. We found that the majority of eye movements are compensatory for head movements, thereby serving to stabilize the visual scene. During movement, however, periods of stabilization are interspersed with non-compensatory saccades that abruptly shift gaze position. Notably, these saccades do not preferentially target the prey location. Rather, orienting movements are driven by the head, with the eyes following in coordination to sequentially stabilize and recenter the gaze. These findings relate eye movements in the mouse to other species, and provide a foundation for studying active vision during ethological behaviors in the mouse.
Affiliation(s)
- Angie M Michaiel
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, United States
- Elliott TT Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, United States
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, United States
75
Abstract
Across vertebrates, eye movements serve the dual purpose of image stabilization during head or body movement, and gaze relocation. A new study has measured head and bilateral eye movements in freely moving mice, providing a detailed characterization of dynamic gaze behavior.
Affiliation(s)
- Magdalena Kautzky
- Division of Neurobiology, Department Biology II, LMU Munich, 82151 Munich, Germany; Graduate School of Systemic Neuroscience (GSN), LMU Munich, 82151 Munich, Germany
- Laura Busse
- Division of Neurobiology, Department Biology II, LMU Munich, 82151 Munich, Germany; Bernstein Centre for Computational Neuroscience, 82151 Munich, Germany
76
Meyer AF, O'Keefe J, Poort J. Two Distinct Types of Eye-Head Coupling in Freely Moving Mice. Curr Biol 2020; 30:2116-2130.e6. [PMID: 32413309] [PMCID: PMC7284311] [DOI: 10.1016/j.cub.2020.04.042]
Abstract
Animals actively interact with their environment to gather sensory information. There is conflicting evidence about how mice use vision to sample their environment. During head restraint, mice make rapid eye movements coupled between the eyes, similar to conjugate saccadic eye movements in humans. However, when mice are free to move their heads, eye movements are more complex and often non-conjugate, with the eyes moving in opposite directions. We combined head and eye tracking in freely moving mice and found both observations are explained by two eye-head coupling types, associated with vestibular mechanisms. The first type comprised non-conjugate eye movements, which compensate for head tilt changes to maintain a similar visual field relative to the horizontal ground plane. The second type of eye movements was conjugate and coupled to head yaw rotation to produce a "saccade and fixate" gaze pattern. During head-initiated saccades, the eyes moved together in the head direction but during subsequent fixation moved in the opposite direction to the head to compensate for head rotation. This saccade and fixate pattern is similar to humans who use eye movements (with or without head movement) to rapidly shift gaze but in mice relies on combined head and eye movements. Both couplings were maintained during social interactions and visually guided object tracking. Even in head-restrained mice, eye movements were invariably associated with attempted head motion. Our results reveal that mice combine head and eye movements to sample their environment and highlight similarities and differences between eye movements in mice and humans.
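The "saccade and fixate" pattern described above has a simple kinematic signature: gaze velocity (head velocity plus eye-in-head velocity) is near zero during compensatory stabilization and large during gaze shifts. The toy classifier below illustrates that decomposition on synthetic velocity traces; the threshold and all numeric values are illustrative assumptions, not parameters from the paper.

```python
def classify_eye_samples(head_vel, eye_vel, gaze_thresh=20.0):
    """Label each time sample as 'compensatory' or 'saccadic'.

    Toy rule: gaze velocity is the sum of head and eye-in-head
    velocity (deg/s). When the eye largely cancels the head
    (|gaze velocity| below threshold) the sample is compensatory;
    otherwise it is treated as a gaze-shifting saccade.
    """
    labels = []
    for h, e in zip(head_vel, eye_vel):
        gaze = h + e  # gaze velocity in deg/s
        labels.append("compensatory" if abs(gaze) < gaze_thresh
                      else "saccadic")
    return labels

# Synthetic trace: steady head rotation at 100 deg/s; the eye
# counter-rotates except at one sample, where it moves with the head.
head = [100.0] * 5
eye = [-100.0, -95.0, 150.0, -105.0, -100.0]
print(classify_eye_samples(head, eye))
# ['compensatory', 'compensatory', 'saccadic', 'compensatory', 'compensatory']
```

Thresholding gaze velocity rather than eye velocity is the key design point: during head-initiated saccades the eye moves fast in both regimes, and only the head-plus-eye sum distinguishes stabilization from a genuine gaze shift.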
Affiliation(s)
- Arne F Meyer
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen 6525, the Netherlands; Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), London W1T 4JG, UK
- John O'Keefe
- Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), London W1T 4JG, UK; Department of Cell and Developmental Biology, UCL, London WC1E 6BT, UK
- Jasper Poort
- Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), London W1T 4JG, UK; Department of Psychology, University of Cambridge, Cambridge CB2 3EB, UK
77
Kaplan HS, Zimmer M. Brain-wide representations of ongoing behavior: a universal principle? Curr Opin Neurobiol 2020; 64:60-69. [PMID: 32203874] [DOI: 10.1016/j.conb.2020.02.008]
Abstract
Recent neuronal activity recordings of unprecedented breadth and depth in worms, flies, and mice have uncovered a surprising common feature: brain-wide behavior-related signals. These signals pervade, and even dominate, neuronal populations thought to function primarily in sensory processing. Such convergent findings across organisms suggest that brain-wide representations of behavior might be a universal neuroscientific principle. What purpose(s) do these representations serve? Here we review these findings along with suggested functions, including sensory prediction, context-dependent sensory processing, and, perhaps most speculatively, distributed motor command generation. It appears that a large proportion of the brain's energy and coding capacity is used to represent ongoing behavior; understanding the function of these representations should therefore be a major goal in neuroscience research.
Collapse
Affiliation(s)
- Harris S Kaplan
- Department of Neuroscience and Developmental Biology, University of Vienna, Althanstrasse 14, 1090 Vienna, Austria; Research Institute of Molecular Pathology (IMP), Vienna Biocenter (VBC), Campus-Vienna-Biocenter 1, 1030 Vienna, Austria.
| | - Manuel Zimmer
- Department of Neuroscience and Developmental Biology, University of Vienna, Althanstrasse 14, 1090 Vienna, Austria; Research Institute of Molecular Pathology (IMP), Vienna Biocenter (VBC), Campus-Vienna-Biocenter 1, 1030 Vienna, Austria
| |
Collapse
|
78
|
Schneider DM. Reflections of action in sensory cortex. Curr Opin Neurobiol 2020; 64:53-59. [PMID: 32171079 DOI: 10.1016/j.conb.2020.02.004] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/06/2019] [Revised: 01/25/2020] [Accepted: 02/09/2020] [Indexed: 11/26/2022]
Abstract
Nearly every movement that one makes produces a corresponding set of sensations. The simple fact that much of our sensory world is driven by our own actions underscores one of the major computations that our brains execute every day: to interpret the sensory world even as we interact with and change it. It should not be surprising therefore that activity in sensory cortex reflects not only incoming sensory inputs but also ongoing movement and behavioral state. With a focus on the mouse as a model organism, this review highlights recent findings revealing the widespread modulation of sensory cortex across diverse movements, the circuitry through which movement-related inputs are integrated with sensory signals, and the computational and perceptual roles that motor-sensory integration may serve within the brain.
Collapse
Affiliation(s)
- David M Schneider
- Center for Neural Science, New York University, New York, NY 10003, United States.
| |
Collapse
|
79
|
Bjerre AS, Palmer LM. Probing Cortical Activity During Head-Fixed Behavior. Front Mol Neurosci 2020; 13:30. [PMID: 32180705 PMCID: PMC7059801 DOI: 10.3389/fnmol.2020.00030] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2019] [Accepted: 02/10/2020] [Indexed: 01/20/2023] Open
Abstract
The cortex is crucial for many behaviors, ranging from sensory-based behaviors to working memory and social behaviors. To gain an in-depth understanding of its contribution to these behaviors, cellular and sub-cellular recordings from both individual and populations of cortical neurons are vital. However, techniques allowing such recordings, such as two-photon imaging and whole-cell electrophysiology, require absolute stability of the head, a requirement not often fulfilled in freely moving animals. Here, we review and compare behavioral paradigms that have been developed and adapted for the head-fixed preparation, which together offer the needed stability for live recordings of neural activity in behaving animals. We also review how the head-fixed preparation has been used to explore the function of primary sensory cortices, posterior parietal cortex (PPC) and anterior lateral motor (ALM) cortex in sensory-based behavioral tasks, while also discussing the considerations of performing such recordings. Overall, this review highlights the head-fixed preparation as allowing in-depth investigation into the neural activity underlying behaviors by providing highly controllable settings for precise stimulus presentation, which can be combined with behavioral paradigms ranging from simple sensory detection tasks to complex, cross-modal, memory-guided decision-making tasks.
Collapse
Affiliation(s)
- Ann-Sofie Bjerre
- Florey Institute of Neuroscience and Mental Health, University of Melbourne, Parkville, VIC, Australia
| | - Lucy M Palmer
- Florey Institute of Neuroscience and Mental Health, University of Melbourne, Parkville, VIC, Australia
| |
Collapse
|
80
|
Cortical circuits for integration of self-motion and visual-motion signals. Curr Opin Neurobiol 2019; 60:122-128. [PMID: 31869592 DOI: 10.1016/j.conb.2019.11.013] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2019] [Revised: 11/13/2019] [Accepted: 11/15/2019] [Indexed: 12/19/2022]
Abstract
The cerebral cortex contains cells which respond to movement of the head, and these cells are thought to be involved in the perception of self-motion. In particular, studies in the primary visual cortex of mice show that both running speed and passive whole-body rotation modulate neuronal activity, and modern genetically targeted viral tracing approaches have begun to identify previously unknown circuits that underlie these responses. Here we review recent experimental findings and provide a road map for future work in mice to elucidate the functional architecture and emergent properties of a cortical network potentially involved in the generation of egocentric-based visual representations for navigation.
Collapse
|
81
|
Corthals K, Moore S, Geurten BR. Strategies of locomotion composition. CURRENT OPINION IN INSECT SCIENCE 2019; 36:140-148. [PMID: 31622810 DOI: 10.1016/j.cois.2019.09.007] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/01/2019] [Revised: 09/10/2019] [Accepted: 09/24/2019] [Indexed: 06/10/2023]
Abstract
This review aims to highlight the importance of saccades during locomotion as a strategy to reduce sensory information loss while the subject is moving. Acquiring sensory data from the environment during movement results in a temporal flow of information, as the sensory percept changes with the position of the observer. Accordingly, the movement pattern shapes the sensory flow. Therefore, the requirements of locomotion and sensation have to be balanced in the behaviour of the organism. Insect vision provides deep insight into the interplay between action and perception. Insects can shape their optic flow by reducing their rotational movements to fast and short saccades. This generates prolonged phases of translation which provide depth information. Extensive behavioural and physiological studies on insects show how shaping the optic flow facilitates the coding of motion vision. Indeed, the saccadic strategy provides an elegant solution to optimise sensory flow. Complementary studies in other taxa have reported similar locomotion strategies, emphasising the crucial influence of sensory flow on locomotion.
Collapse
Affiliation(s)
- Kristina Corthals
- Lund University, Functional Zoology, Sölvegatan 35, 223 62 Lund, Sweden
| | - Sharlen Moore
- Instituto de Fisiología Celular - Neurociencias, Universidad Nacional Autónoma de México, Av. Universidad 3000, Coyoacán, 04510 Mexico City, Mexico; Max Planck Institute of Experimental Medicine, Department of Neurogenetics, Hermann-Rein-Str. 3, 37075 Göttingen, Germany
| | - Bart Rh Geurten
- Georg-August-University Göttingen, Department of Cellular Neuroscience, Julia-Lermontowa-Weg 3, 37077 Göttingen, Germany.
| |
Collapse
|
82
|
Wang C, Chen X, Knierim JJ. Egocentric and allocentric representations of space in the rodent brain. Curr Opin Neurobiol 2019; 60:12-20. [PMID: 31794917 DOI: 10.1016/j.conb.2019.11.005] [Citation(s) in RCA: 61] [Impact Index Per Article: 12.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2019] [Revised: 10/27/2019] [Accepted: 11/08/2019] [Indexed: 12/17/2022]
Abstract
Spatial signals are prevalent within the hippocampus and its neighboring regions. It is generally accepted that these signals are defined with respect to the external world (i.e., a world-centered, or allocentric, frame of reference). Recently, evidence of egocentric processing (i.e., self-centered, defined relative to the subject) in the extended hippocampal system has accumulated. These results support the idea that egocentric sensory information, derived from primary sensory cortical areas, may be transformed to allocentric representations that interact with the allocentric hippocampal system. We propose a framework to explain the implications of the egocentric-allocentric transformations to the functions of the medial temporal lobe memory system.
Collapse
Affiliation(s)
- Cheng Wang
- Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, The Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, China; Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA.
| | - Xiaojing Chen
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
| | - James J Knierim
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA.
| |
Collapse
|
83
|
Baden T, Euler T, Berens P. Understanding the retinal basis of vision across species. Nat Rev Neurosci 2019; 21:5-20. [PMID: 31780820 DOI: 10.1038/s41583-019-0242-1] [Citation(s) in RCA: 143] [Impact Index Per Article: 28.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/22/2019] [Indexed: 12/12/2022]
Abstract
The vertebrate retina first evolved some 500 million years ago in ancestral marine chordates. Since then, the eyes of different species have been tuned to best support their unique visuoecological lifestyles. Visual specializations in eye designs, large-scale inhomogeneities across the retinal surface and local circuit motifs mean that all species' retinas are unique. Computational theories, such as the efficient coding hypothesis, have come a long way towards an explanation of the basic features of retinal organization and function; however, they cannot explain the full extent of retinal diversity within and across species. To build a truly general understanding of vertebrate vision and the retina's computational purpose, it is therefore important to more quantitatively relate different species' retinal functions to their specific natural environments and behavioural requirements. Ultimately, the goal of such efforts should be to build up to a more general theory of vision.
Collapse
Affiliation(s)
- Tom Baden
- Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton, UK; Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany.
| | - Thomas Euler
- Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
| | - Philipp Berens
- Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany; Institute for Bioinformatics and Medical Informatics, University of Tübingen, Tübingen, Germany; Bernstein Centre for Computational Neuroscience, University of Tübingen, Tübingen, Germany
| |
Collapse
|
84
|
Schwartz ZP, Buran BN, David SV. Pupil-associated states modulate excitability but not stimulus selectivity in primary auditory cortex. J Neurophysiol 2019; 123:191-208. [PMID: 31721652 DOI: 10.1152/jn.00595.2019] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/11/2022] Open
Abstract
Recent research in mice indicates that luminance-independent fluctuations in pupil size predict variability in spontaneous and evoked activity of single neurons in auditory and visual cortex. These findings suggest that pupil is an indicator of large-scale changes in arousal state that affect sensory processing. However, it is not known whether pupil-related state also influences the selectivity of auditory neurons. We recorded pupil size and single-unit spiking activity in the primary auditory cortex (A1) of nonanesthetized male and female ferrets during presentation of natural vocalizations and tone stimuli that allow measurement of frequency and level tuning. Neurons showed a systematic increase in both spontaneous and sound-evoked activity when pupil was large, as well as desynchronization and a decrease in trial-to-trial variability. Relationships between pupil size and firing rate were nonmonotonic in some cells. In most neurons, several measurements of tuning, including acoustic threshold, spectral bandwidth, and best frequency, remained stable across large changes in pupil size. Across the population, however, there was a small but significant decrease in acoustic threshold when pupil was dilated. In some recordings, we observed rapid, saccade-like eye movements during sustained pupil constriction, which may indicate sleep. Including the presence of this state as a separate variable in a regression model of neural variability accounted for some, but not all, of the variability and nonmonotonicity associated with changes in pupil size.
NEW & NOTEWORTHY Cortical neurons vary in their response to repeated stimuli, and some portion of the variability is due to fluctuations in network state. By simultaneously recording pupil and single-neuron activity in auditory cortex of ferrets, we provide new evidence that network state affects the excitability of auditory neurons, but not sensory selectivity. In addition, we report the occurrence of possible sleep states, adding to evidence that pupil provides an index of both sleep and physiological arousal.
Collapse
Affiliation(s)
- Zachary P Schwartz
- Neuroscience Graduate Program, Oregon Health and Science University, Portland, Oregon
| | - Brad N Buran
- Oregon Hearing Research Center, Oregon Health and Science University, Portland, Oregon
| | - Stephen V David
- Oregon Hearing Research Center, Oregon Health and Science University, Portland, Oregon
| |
Collapse
|
85
|
Datta SR, Anderson DJ, Branson K, Perona P, Leifer A. Computational Neuroethology: A Call to Action. Neuron 2019; 104:11-24. [PMID: 31600508 PMCID: PMC6981239 DOI: 10.1016/j.neuron.2019.09.038] [Citation(s) in RCA: 191] [Impact Index Per Article: 38.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2019] [Revised: 09/16/2019] [Accepted: 09/23/2019] [Indexed: 12/11/2022]
Abstract
The brain is worthy of study because it is in charge of behavior. A flurry of recent technical advances in measuring and quantifying naturalistic behaviors provide an important opportunity for advancing brain science. However, the problem of understanding unrestrained behavior in the context of neural recordings and manipulations remains unsolved, and developing approaches to addressing this challenge is critical. Here we discuss considerations in computational neuroethology-the science of quantifying naturalistic behaviors for understanding the brain-and propose strategies to evaluate progress. We point to open questions that require resolution and call upon the broader systems neuroscience community to further develop and leverage measures of naturalistic, unrestrained behavior, which will enable us to more effectively probe the richness and complexity of the brain.
Collapse
Affiliation(s)
| | - David J Anderson
- Division of Biology and Biological Engineering 156-29, California Institute of Technology, Pasadena, CA 91125, USA; Howard Hughes Medical Institute, Pasadena, CA, 91125, USA; Tianqiao and Chrissy Chen Institute for Neuroscience, California Institute of Technology, Pasadena, CA 91125, USA
| | - Kristin Branson
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
| | - Pietro Perona
- Division of Engineering & Applied Sciences 136-93, California Institute of Technology, Pasadena, CA 91125, USA
| | - Andrew Leifer
- Department of Physics, Princeton University, Princeton, NJ 08544, USA; Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, USA.
| |
Collapse
|
86
|
Paradoxical Rules of Spike Train Decoding Revealed at the Sensitivity Limit of Vision. Neuron 2019; 104:576-587.e11. [PMID: 31519460 DOI: 10.1016/j.neuron.2019.08.005] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2018] [Revised: 05/28/2019] [Accepted: 08/03/2019] [Indexed: 12/11/2022]
Abstract
All sensory information is encoded in neural spike trains. It is unknown how the brain utilizes this neural code to drive behavior. Here, we unravel the decoding rules of the brain at the most elementary level by linking behavioral decisions to retinal output signals in a single-photon detection task. A transgenic mouse line allowed us to separate the two primary retinal outputs, ON and OFF pathways, carrying information about photon absorptions as increases and decreases in spiking, respectively. We measured the sensitivity limit of rods and the most sensitive ON and OFF ganglion cells and correlated these results with visually guided behavior using markerless head and eye tracking. We show that behavior relies only on the ON pathway even when the OFF pathway would allow higher sensitivity. Paradoxically, behavior does not rely on the spike code with maximal information but instead relies on a decoding strategy based on increases in spiking.
Collapse
|
87
|
La Chioma A, Bonhoeffer T, Hübener M. Area-Specific Mapping of Binocular Disparity across Mouse Visual Cortex. Curr Biol 2019; 29:2954-2960.e5. [DOI: 10.1016/j.cub.2019.07.037] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2019] [Revised: 07/05/2019] [Accepted: 07/11/2019] [Indexed: 10/26/2022]
|
88
|
Genetically Defined Functional Modules for Spatial Orienting in the Mouse Superior Colliculus. Curr Biol 2019; 29:2892-2904.e8. [PMID: 31474533 PMCID: PMC6739420 DOI: 10.1016/j.cub.2019.07.083] [Citation(s) in RCA: 51] [Impact Index Per Article: 10.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2019] [Revised: 07/26/2019] [Accepted: 07/30/2019] [Indexed: 01/27/2023]
Abstract
In order to explore and interact with their surroundings, animals need to orient toward specific positions in space. Throughout the animal kingdom, head movements represent a primary form of orienting behavior. The superior colliculus (SC) is a fundamental structure for the generation of orienting responses, but how genetically distinct groups of collicular neurons contribute to these spatially tuned behaviors remains largely to be defined. Here, through the genetic dissection of the murine SC, we identify a functionally and genetically homogeneous subclass of glutamatergic neurons defined by the expression of the paired-like homeodomain transcription factor Pitx2. We show that the optogenetic stimulation of Pitx2ON neurons drives three-dimensional head displacements characterized by stepwise, saccade-like kinematics. Furthermore, during naturalistic foraging behavior, the activity of Pitx2ON neurons precedes and predicts the onset of spatially tuned head movements. Intriguingly, we reveal that Pitx2ON neurons are clustered in an orderly array of anatomical modules that tile the entire intermediate layer of the SC. Such a modular organization gives origin to a discrete and discontinuous representation of the motor space, with each Pitx2ON module subtending a defined portion of the animal’s egocentric space. The modularity of Pitx2ON neurons provides an anatomical substrate for the convergence of spatially coherent sensory and motor signals of cortical and subcortical origins, thereby promoting the recruitment of appropriate movement vectors. Overall, these data support the view of the superior colliculus as a selectively addressable and modularly organized spatial-motor register. 
- Pitx2 expression labels a functionally homogeneous class of projecting SC neurons
- Pitx2ON neurons drive three-dimensional head movements during foraging behavior
- Pitx2ON neurons are organized in an orderly array of anatomical modules
- Modularity of Pitx2ON neurons defines a discrete motor map for spatial orienting
Collapse
|
89
|
Abstract
Understanding the brain requires understanding behavior. New machine vision and learning techniques are poised to revolutionize our ability to analyze behaviors exhibited by animals in the laboratory. Here we describe one such method, Motion Sequencing (MoSeq), which combines three-dimensional (3D) imaging with unsupervised machine learning techniques to identify the syllables and grammar that comprise mouse body language. This Q&A situates MoSeq within the array of novel methods currently being developed for behavioral analysis, enumerates its relative strengths and weaknesses, and describes its future trajectory.
Collapse
Affiliation(s)
- Sandeep Robert Datta
- Harvard Medical School Department of Neurobiology, WAB 336, 200 Longwood Avenue, Boston, MA, 02115, USA.
| |
Collapse
|