51. Wang C, Chen X, Knierim JJ. Egocentric and allocentric representations of space in the rodent brain. Curr Opin Neurobiol 2019; 60:12-20. [PMID: 31794917] [DOI: 10.1016/j.conb.2019.11.005]
Abstract
Spatial signals are prevalent within the hippocampus and its neighboring regions. It is generally accepted that these signals are defined with respect to the external world (i.e., a world-centered, or allocentric, frame of reference). Recently, evidence of egocentric processing (i.e., self-centered, defined relative to the subject) in the extended hippocampal system has accumulated. These results support the idea that egocentric sensory information, derived from primary sensory cortical areas, may be transformed to allocentric representations that interact with the allocentric hippocampal system. We propose a framework to explain the implications of the egocentric-allocentric transformations to the functions of the medial temporal lobe memory system.
Affiliation(s)
- Cheng Wang
- Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, The Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, China; Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Xiaojing Chen
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- James J Knierim
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA
52. Smith PF. The Growing Evidence for the Importance of the Otoliths in Spatial Memory. Front Neural Circuits 2019; 13:66. [PMID: 31680880] [PMCID: PMC6813194] [DOI: 10.3389/fncir.2019.00066]
Abstract
Many studies have demonstrated that vestibular sensory input is important for spatial learning and memory. However, it has been unclear what contributions the different parts of the vestibular system - the semi-circular canals and otoliths - make to these processes. The advent of mutant otolith-deficient mice has made it possible to isolate the relative contributions of the otoliths (the utricle and saccule). A number of studies have now indicated that the loss of otolithic function impairs normal spatial memory and also impairs the normal function of head direction cells in the thalamus and place cells in the hippocampus. Epidemiological studies have also provided evidence that spatial memory impairment with aging may be linked to saccular function. The otoliths may be important in spatial cognition because of their evolutionary age as sensory detectors of orientation and the fact that velocity storage is important to the way that the brain encodes its place in space.
Affiliation(s)
- Paul F. Smith
- Department of Pharmacology and Toxicology, Brain Health Research Centre, School of Biomedical Sciences, University of Otago Medical School, Dunedin, New Zealand
- Brain Research New Zealand, Auckland, New Zealand
- Eisdell Moore Centre for Hearing and Balance Research, University of Auckland, Auckland, New Zealand
53. Vanzella W, Grion N, Bertolini D, Perissinotto A, Gigante M, Zoccolan D. A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents. J Neurophysiol 2019; 122:2220-2242. [PMID: 31553687] [DOI: 10.1152/jn.00301.2019]
Abstract
Tracking head position and orientation in small mammals is crucial for many applications in the field of behavioral neurophysiology, from the study of spatial navigation to the investigation of active sensing and perceptual representations. Many approaches to head tracking exist, but most of them only estimate the 2D coordinates of the head over the plane where the animal navigates. Full reconstruction of the pose of the head in 3D is much more challenging and has been achieved only in a handful of studies, which employed headsets made of multiple LEDs or inertial units. However, these assemblies are rather bulky and need to be powered to operate, which prevents their application in wireless experiments and in the small enclosures often used in perceptual studies. Here we propose an alternative approach, based on passively imaging a lightweight, compact, 3D structure, painted with a pattern of black dots over a white background. By applying a cascade of feature extraction algorithms that progressively refine the detection of the dots and reconstruct their geometry, we developed a tracking method that is highly precise and accurate, as assessed through a battery of validation measurements. We show that this method can be used to study how a rat samples sensory stimuli during a perceptual discrimination task and how a hippocampal place cell represents head position over extremely small spatial scales. Given its minimal encumbrance and wireless nature, our method could be ideal for high-throughput applications, where tens of animals need to be simultaneously and continuously tracked. NEW & NOTEWORTHY Head tracking is crucial in many behavioral neurophysiology studies. Yet reconstruction of the head's pose in 3D is challenging and typically requires implanting bulky, electrically powered headsets that prevent wireless experiments and are hard to employ in operant boxes. Here we propose an alternative approach, based on passively imaging a compact, 3D dot pattern that, once implanted over the head of a rodent, allows estimating the pose of its head with high precision and accuracy.
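As a rough illustration of the kind of pipeline summarized above (detect the dots, establish correspondences, recover 3D pose), here is a minimal Python/OpenCV sketch. It is not the authors' cascade: the dot layout in model_points, the camera matrix, and the assumption that detected dots arrive in a known order are hypothetical placeholders that would come from calibration in a real system.

```python
# Illustrative sketch (not the authors' pipeline): estimate the 3D pose of a
# rigid, dot-patterned head implant from a single camera image with OpenCV.
import cv2
import numpy as np

# Hypothetical 3D coordinates (mm) of the dots on the implant, in its own frame.
model_points = np.array([
    [0.0, 0.0, 0.0],
    [4.0, 0.0, 0.0],
    [0.0, 4.0, 0.0],
    [4.0, 4.0, 1.5],
    [2.0, 2.0, 3.0],
], dtype=np.float64)

camera_matrix = np.array([[900.0, 0.0, 320.0],
                          [0.0, 900.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume an undistorted (or pre-rectified) image

def estimate_head_pose(gray_image):
    """Detect the black dots and recover head orientation and position."""
    # 1) Detect dark blobs (the painted dots) on the white implant surface.
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 0          # dark blobs
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray_image)
    image_points = np.array([kp.pt for kp in keypoints], dtype=np.float64)

    # 2) Match detected dots to the model (here: assume a fixed ordering; a real
    #    system would use the pattern's geometry to establish correspondences).
    if len(image_points) != len(model_points):
        return None

    # 3) Solve the Perspective-n-Point problem for rotation and translation.
    ok, rvec, tvec = cv2.solvePnP(model_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation_matrix, _ = cv2.Rodrigues(rvec)   # 3x3 head orientation
    return rotation_matrix, tvec               # orientation + 3D position
```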
Affiliation(s)
- Walter Vanzella
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy; Glance Vision Technologies, Trieste, Italy
- Natalia Grion
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
- Daniele Bertolini
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
- Andrea Perissinotto
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy; Glance Vision Technologies, Trieste, Italy
- Marco Gigante
- Mechatronics Lab, International School for Advanced Studies (SISSA), Trieste, Italy
- Davide Zoccolan
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
54. Laurens J, Angelaki DE. The Brain Compass: A Perspective on How Self-Motion Updates the Head Direction Cell Attractor. Neuron 2018; 97:275-289. [PMID: 29346751] [DOI: 10.1016/j.neuron.2017.12.020]
Abstract
Head direction cells form an internal compass signaling head azimuth orientation even without visual landmarks. This property is generated by a neuronal ring attractor that is updated using rotation velocity cues. The properties and origin of this velocity drive remain, however, unknown. We propose a quantitative framework whereby this drive represents a multisensory self-motion estimate computed through an internal model that uses sensory prediction errors of vestibular, visual, and somatosensory cues to improve on-line motor drive. We show how restraint-dependent strength of recurrent connections within the attractor can explain differences in head direction cell firing between free foraging and restrained passive rotation. We also summarize recent findings on how gravity influences azimuth coding, indicating that the velocity drive is not purely egocentric. Finally, we show that the internal compass may be three-dimensional and hypothesize that the additional vertical degrees of freedom use global allocentric gravity cues.
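To make the idea of a velocity-driven compass concrete, the following minimal Python sketch integrates an angular-velocity drive into a heading estimate and reads it out as an activity bump on a ring of units. It is an illustration only, with made-up parameters; it does not implement the paper's internal-model framework or its recurrent attractor dynamics.

```python
# Minimal sketch (illustrative, not the paper's model): a ring of head-direction
# units whose activity bump follows a heading estimate obtained by integrating
# an angular-velocity drive. For clarity the bump is recomputed from the
# integrated heading rather than evolved through recurrent attractor dynamics.
import numpy as np

N_UNITS = 120
preferred_dirs = np.linspace(0.0, 2.0 * np.pi, N_UNITS, endpoint=False)

def bump(heading, width=0.3):
    """Ring activity: a bump centred on the current heading estimate."""
    # wrapped angular distance between each unit's preferred direction and heading
    d = np.angle(np.exp(1j * (preferred_dirs - heading)))
    return np.exp(-d**2 / (2.0 * width**2))

def integrate_heading(angular_velocity, duration=5.0, dt=0.01, gain=1.0):
    """Integrate a velocity drive (rad/s) into a heading estimate (radians)."""
    heading = 0.0
    for _ in range(int(duration / dt)):
        heading = (heading + gain * angular_velocity * dt) % (2.0 * np.pi)
    return heading, bump(heading)

# Example: 90 deg/s rotation for 5 s ends at 450 deg, i.e. 90 deg after wrapping.
heading, activity = integrate_heading(np.deg2rad(90.0))
print(round(np.degrees(heading), 1))   # ~90.0
print(int(np.argmax(activity)))        # index of the unit tuned near 90 deg
```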
Affiliation(s)
- Jean Laurens
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
- Dora E Angelaki
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
55. Matsumoto N, Kitanishi T, Mizuseki K. The subiculum: Unique hippocampal hub and more. Neurosci Res 2019; 143:1-12. [DOI: 10.1016/j.neures.2018.08.002]
56. Jercog PE, Ahmadian Y, Woodruff C, Deb-Sen R, Abbott LF, Kandel ER. Heading direction with respect to a reference point modulates place-cell activity. Nat Commun 2019; 10:2333. [PMID: 31133685] [PMCID: PMC6536526] [DOI: 10.1038/s41467-019-10139-7]
Abstract
The tuning of neurons in area CA1 of the hippocampus emerges through a combination of non-spatial input from different sensory modalities and spatial information about the animal’s position and heading direction relative to the spatial enclosure being navigated. The positional modulation of CA1 neuronal responses has been widely studied (e.g. place tuning), but less is known about the modulation of these neurons by heading direction. Here, utilizing electrophysiological recordings from CA1 pyramidal cells in freely moving mice, we report that a majority of neural responses are modulated by the heading-direction of the animal relative to a point within or outside their enclosure that we call a reference point. The finding of heading-direction modulation relative to reference points identifies a novel representation encoded in the neuronal responses of the dorsal hippocampus. Place cells are neurons in the hippocampus which encode an animal’s location in space. Here, in mice, the authors show that place cell activity is also modulated by the heading-direction of the animal relative to a particular “reference point” that can be either within or outside their enclosure.
Affiliation(s)
- P E Jercog
- Department of Neuroscience, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA
- Y Ahmadian
- Institute of Neuroscience, Department of Biology and Mathematics, University of Oregon, Eugene, OR, 97401, USA
- C Woodruff
- Department of Neuroscience, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA
- R Deb-Sen
- Department of Neuroscience, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA
- L F Abbott
- Department of Neuroscience, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA; Department of Physiology and Cellular Biophysics, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA; Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, 10027, USA; Kavli Institute for Brain Science, Columbia University, New York, NY, 10032, USA
- E R Kandel
- Department of Neuroscience, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA; Department of Physiology and Cellular Biophysics, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA; Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, 10027, USA; Kavli Institute for Brain Science, Columbia University, New York, NY, 10032, USA; Department of Psychiatry, College of Physicians and Surgeons of Columbia University, New York, NY, 10032, USA; Howard Hughes Medical Institute at Columbia University, New York, NY, 10032, USA
57.
Abstract
The mammalian hippocampus is important for normal memory function, particularly memory for places and events. Place cells, neurons within the hippocampus that have spatial receptive fields, represent information about an animal’s position. During periods of rest, but also during active task engagement, place cells spontaneously recapitulate past trajectories. Such ‘replay’ has been proposed as a mechanism necessary for a range of neurobiological functions, including systems memory consolidation, recall and spatial working memory, navigational planning, and reinforcement learning. Focusing mainly, but not exclusively, on work conducted in rodents, we describe the methodologies used to analyse replay and review evidence for its putative roles. We identify outstanding questions as well as apparent inconsistencies in existing data, making suggestions as to how these might be resolved. In particular, we find support for the involvement of replay in disparate processes, including the maintenance of hippocampal memories and decision making. We propose that the function of replay changes dynamically according to task demands placed on an organism and its current level of arousal.
Affiliation(s)
- H Freyja Ólafsdóttir
- Research Department of Cell and Developmental Biology, UCL, Gower Street, London, WC1E 6BT, UK
- Daniel Bush
- UCL Institute of Cognitive Neuroscience and UCL Institute of Neurology, 17 Queen Square, London, WC1N 3AZ, UK
- Caswell Barry
- Research Department of Cell and Developmental Biology, UCL, Gower Street, London, WC1E 6BT, UK
58. Haberkern H, Basnak MA, Ahanonu B, Schauder D, Cohen JD, Bolstad M, Bruns C, Jayaraman V. Visually Guided Behavior and Optogenetically Induced Learning in Head-Fixed Flies Exploring a Virtual Landscape. Curr Biol 2019; 29:1647-1659.e8. [DOI: 10.1016/j.cub.2019.04.033]
59. Wang C, Chen X, Lee H, Deshmukh SS, Yoganarasimha D, Savelli F, Knierim JJ. Egocentric coding of external items in the lateral entorhinal cortex. Science 2018; 362:945-949. [PMID: 30467169] [DOI: 10.1126/science.aau4940]
Abstract
Episodic memory, the conscious recollection of past events, is typically experienced from a first-person (egocentric) perspective. The hippocampus plays an essential role in episodic memory and spatial cognition. Although the allocentric nature of hippocampal spatial coding is well understood, little is known about whether the hippocampus receives egocentric information about external items. We recorded in rats the activity of single neurons from the lateral entorhinal cortex (LEC) and medial entorhinal cortex (MEC), the two major inputs to the hippocampus. Many LEC neurons showed tuning for egocentric bearing of external items, whereas MEC cells tended to represent allocentric bearing. These results demonstrate a fundamental dissociation between the reference frames of LEC and MEC neural representations.
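The distinction between allocentric and egocentric bearing of an external item, which is central to the abstract above, reduces to a simple change of reference frame. The short Python sketch below (illustrative only, not from the paper) makes the geometry explicit.

```python
# Illustrative sketch: allocentric bearing of an item (world-referenced) versus
# its egocentric bearing relative to the animal's current head direction.
import numpy as np

def allocentric_bearing(animal_xy, item_xy):
    """Direction from the animal to the item in world coordinates (radians)."""
    dx, dy = item_xy[0] - animal_xy[0], item_xy[1] - animal_xy[1]
    return np.arctan2(dy, dx) % (2 * np.pi)

def egocentric_bearing(animal_xy, head_direction, item_xy):
    """Direction to the item relative to the animal's current heading."""
    return (allocentric_bearing(animal_xy, item_xy) - head_direction) % (2 * np.pi)

# Example: an item due north of the animal.
animal, item = (0.0, 0.0), (0.0, 1.0)
print(np.degrees(allocentric_bearing(animal, item)))            # 90 (north, world frame)
print(np.degrees(egocentric_bearing(animal, np.pi / 2, item)))  # 0 (straight ahead)
```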
Affiliation(s)
- Cheng Wang
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Xiaojing Chen
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Heekyung Lee
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Sachin S Deshmukh
- Centre for Neuroscience, Indian Institute of Science, Bangalore, India
- Francesco Savelli
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- James J Knierim
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA
60. Correlation structure of grid cells is preserved during sleep. Nat Neurosci 2019; 22:598-608. [DOI: 10.1038/s41593-019-0360-0]
61. Mankin EA, Thurley K, Chenani A, Haas OV, Debs L, Henke J, Galinato M, Leutgeb JK, Leutgeb S, Leibold C. The hippocampal code for space in Mongolian gerbils. Hippocampus 2019; 29:787-801. [PMID: 30746805] [DOI: 10.1002/hipo.23075]
Abstract
Large parts of our knowledge about the physiology of the hippocampus in the intact brain are derived from studies in rats and mice. While many of those findings fit well to the limited data available from humans and primates, there are also marked differences, for example, in hippocampal oscillation frequencies and in the persistence of theta oscillations. To test whether the distinct sensory specializations of the visual and auditory system of primates play a key role in explaining these differences, we recorded basic hippocampal physiological properties in Mongolian gerbils, a rodent species with high visual acuity, and good low-frequency hearing, similar to humans. We found that gerbils show only minor differences to rats regarding hippocampal place field activity, theta properties (frequency, persistence, phase precession, theta compression), and sharp wave ripple events. The only major difference between rats and gerbils was a considerably higher degree of head direction selectivity of gerbil place fields, which may be explained by their visual system being able to better resolve distant cues. Thus, differences in sensory specializations between rodent species only affect hippocampal circuit dynamics to a minor extent, which implies that differences to other mammalian lineages, such as bats and primates, cannot be solely explained by specialization in the auditory or visual system.
Affiliation(s)
- Emily A Mankin
- Neurobiology Section and Center for Neural Circuits and Behavior, University of California, San Diego, La Jolla, California; Department of Neurosurgery, David Geffen School of Medicine and Semel Institute For Neuroscience and Human Behavior, University of California, Los Angeles, California
- Kay Thurley
- Department Biologie II, Ludwig-Maximilians-Universität München, Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Martinsried, Germany
- Alireza Chenani
- Department Biologie II, Ludwig-Maximilians-Universität München, Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Martinsried, Germany
- Olivia V Haas
- Department Biologie II, Ludwig-Maximilians-Universität München, Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Martinsried, Germany
- Luca Debs
- Neurobiology Section and Center for Neural Circuits and Behavior, University of California, San Diego, La Jolla, California
- Josephine Henke
- Department Biologie II, Ludwig-Maximilians-Universität München, Martinsried, Germany
- Melissa Galinato
- Neurobiology Section and Center for Neural Circuits and Behavior, University of California, San Diego, La Jolla, California
- Jill K Leutgeb
- Neurobiology Section and Center for Neural Circuits and Behavior, University of California, San Diego, La Jolla, California
- Stefan Leutgeb
- Neurobiology Section and Center for Neural Circuits and Behavior, University of California, San Diego, La Jolla, California; Kavli Institute for Brain and Mind, University of California, San Diego, La Jolla, California
- Christian Leibold
- Department Biologie II, Ludwig-Maximilians-Universität München, Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Martinsried, Germany
62. Differential influences of environment and self-motion on place and grid cell firing. Nat Commun 2019; 10:630. [PMID: 30733457] [PMCID: PMC6367320] [DOI: 10.1038/s41467-019-08550-1]
Abstract
Place and grid cells in the hippocampal formation provide foundational representations of environmental location, and potentially of locations within conceptual spaces. Some accounts predict that environmental sensory information and self-motion are encoded in complementary representations, while other models suggest that both features combine to produce a single coherent representation. Here, we use virtual reality to dissociate visual environmental from physical motion inputs, while recording place and grid cells in mice navigating virtual open arenas. Place cell firing patterns predominantly reflect visual inputs, while grid cell activity reflects a greater influence of physical motion. Thus, even when recorded simultaneously, place and grid cell firing patterns differentially reflect environmental information (or 'states') and physical self-motion (or 'transitions'), and need not be mutually coherent.
63. Diersch N, Wolbers T. The potential of virtual reality for spatial navigation research across the adult lifespan. J Exp Biol 2019; 222(Suppl 1):jeb187252. [PMID: 30728232] [DOI: 10.1242/jeb.187252]
Abstract
Older adults often experience serious problems in spatial navigation, and alterations in underlying brain structures are among the first indicators for a progression to neurodegenerative diseases. Studies investigating the neural mechanisms of spatial navigation and its changes across the adult lifespan are increasingly using virtual reality (VR) paradigms. VR offers major benefits in terms of ecological validity, experimental control and options to track behavioral responses. However, navigation in the real world differs from navigation in VR in several aspects. In addition, the importance of body-based or visual cues for navigation varies between animal species. Incongruences between sensory and motor input in VR might consequently affect their performance to a different degree. After discussing the specifics of using VR in spatial navigation research across species, we outline several challenges when investigating age-related deficits in spatial navigation with the help of VR. In addition, we discuss ways to reduce their impact, together with the possibilities VR offers for improving navigational abilities in older adults.
Affiliation(s)
- Nadine Diersch
- Aging & Cognition Research Group, German Center for Neurodegenerative Diseases (DZNE), 39120 Magdeburg, Germany
- Thomas Wolbers
- Aging & Cognition Research Group, German Center for Neurodegenerative Diseases (DZNE), 39120 Magdeburg, Germany; Center for Behavioural Brain Sciences (CBBS), Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany; Medical Faculty, University Hospital Magdeburg, Otto-von-Guericke-University Magdeburg, 39120 Magdeburg, Germany
64. Jayakumar RP, Madhav MS, Savelli F, Blair HT, Cowan NJ, Knierim JJ. Recalibration of path integration in hippocampal place cells. Nature 2019; 566:533-537. [PMID: 30742074] [PMCID: PMC6629428] [DOI: 10.1038/s41586-019-0939-3]
Abstract
Hippocampal place cells are spatially tuned neurons that serve as elements of a 'cognitive map' in the mammalian brain [1]. To detect the animal's location, place cells are thought to rely upon two interacting mechanisms: sensing the position of the animal relative to familiar landmarks [2,3] and measuring the distance and direction that the animal has travelled from previously occupied locations [4-7]. The latter mechanism-known as path integration-requires a finely tuned gain factor that relates the animal's self-movement to the updating of position on the internal cognitive map, as well as external landmarks to correct the positional error that accumulates [8,9]. Models of hippocampal place cells and entorhinal grid cells based on path integration treat the path-integration gain as a constant [9-14], but behavioural evidence in humans suggests that the gain is modifiable [15]. Here we show, using physiological evidence from rat hippocampal place cells, that the path-integration gain is a highly plastic variable that can be altered by persistent conflict between self-motion cues and feedback from external landmarks. In an augmented-reality system, visual landmarks were moved in proportion to the movement of a rat on a circular track, creating continuous conflict with path integration. Sustained exposure to this cue conflict resulted in predictable and prolonged recalibration of the path-integration gain, as estimated from the place cells after the landmarks were turned off. We propose that this rapid plasticity keeps the positional update in register with the movement of the rat in the external world over behavioural timescales. These results also demonstrate that visual landmarks not only provide a signal to correct cumulative error in the path-integration system [4,8,16-19], but also rapidly fine-tune the integration computation itself.
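The notion of a plastic path-integration gain can be illustrated with a toy Python sketch in which the gain drifts toward the ratio implied by landmark feedback under sustained cue conflict. The update rule, learning rate, and landmark_gain manipulation are hypothetical choices for illustration and are not taken from the paper's analysis.

```python
# Illustrative sketch (hypothetical parameters, not the paper's analysis): a
# position estimate updated by self-motion through a path-integration gain,
# with the gain slowly recalibrated toward the ratio implied by landmark feedback.
import numpy as np

def recalibrate_gain(true_speed, landmark_gain, n_steps=5000, dt=0.01,
                     initial_gain=1.0, learning_rate=0.002):
    """Return the path-integration gain after sustained cue conflict.

    `landmark_gain` is the factor by which landmark motion is yoked to the
    animal's movement (an experimental manipulation), so landmark feedback
    implies the animal moved `landmark_gain * true displacement`.
    """
    gain = initial_gain
    for _ in range(n_steps):
        self_motion = true_speed * dt                  # odometry increment
        internal_step = gain * self_motion             # path-integrated update
        landmark_step = landmark_gain * self_motion    # displacement implied by landmarks
        error = landmark_step - internal_step          # persistent conflict
        gain += learning_rate * error / max(self_motion, 1e-9)  # slow gain adaptation
    return gain

# Example: landmarks yoked at 1.5x the animal's movement drive the gain upward.
print(round(recalibrate_gain(true_speed=0.2, landmark_gain=1.5), 3))  # -> about 1.5
```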
Affiliation(s)
- Manu S Madhav
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Francesco Savelli
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Hugh T Blair
- Department of Psychology, UCLA, Los Angeles, CA, USA
- Noah J Cowan
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- James J Knierim
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA
65. Monaco JD, De Guzman RM, Blair HT, Zhang K. Spatial synchronization codes from coupled rate-phase neurons. PLoS Comput Biol 2019; 15:e1006741. [PMID: 30682012] [PMCID: PMC6364943] [DOI: 10.1371/journal.pcbi.1006741]
Abstract
During spatial navigation, the frequency and timing of spikes from spatial neurons including place cells in hippocampus and grid cells in medial entorhinal cortex are temporally organized by continuous theta oscillations (6-11 Hz). The theta rhythm is regulated by subcortical structures including the medial septum, but it is unclear how spatial information from place cells may reciprocally organize subcortical theta-rhythmic activity. Here we recorded single-unit spiking from a constellation of subcortical and hippocampal sites to study spatial modulation of rhythmic spike timing in rats freely exploring an open environment. Our analysis revealed a novel class of neurons that we termed 'phaser cells,' characterized by a symmetric coupling between firing rate and spike theta-phase. Phaser cells encoded space by assigning distinct phases to allocentric isocontour levels of each cell's spatial firing pattern. In our dataset, phaser cells were predominantly located in the lateral septum, but also the hippocampus, anteroventral thalamus, lateral hypothalamus, and nucleus accumbens. Unlike the unidirectional late-to-early phase precession of place cells, bidirectional phase modulation acted to return phaser cells to the same theta-phase along a given spatial isocontour, including cells that characteristically shifted to later phases at higher firing rates. Our dynamical models of intrinsic theta-bursting neurons demonstrated that experience-independent temporal coding mechanisms can qualitatively explain (1) the spatial rate-phase relationships of phaser cells and (2) the observed temporal segregation of phaser cells according to phase-shift direction. In open-field phaser cell simulations, competitive learning embedded phase-code entrainment maps into the weights of downstream targets, including path integration networks. Bayesian phase decoding revealed error correction capable of resetting path integration at subsecond timescales. Our findings suggest that phaser cells may instantiate a subcortical theta-rhythmic loop of spatial feedback. We outline a framework in which location-dependent synchrony reconciles internal idiothetic processes with the allothetic reference points of sensory experience.
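A toy Python sketch of the rate-phase idea described above: a unit whose preferred theta phase is a (hypothetical, linear) function of its firing rate, so that positions on the same rate isocontour receive the same phase tag. The rate map, slope, and base phase are illustrative, not the paper's fitted model.

```python
# Minimal sketch (illustrative, not the paper's model): a "rate-phase" unit
# whose preferred theta phase shifts with firing rate, tagging spatial
# isocontours of the rate map with distinct phases.
import numpy as np

def place_like_rate(x, y, center=(0.5, 0.5), peak=20.0, width=0.2):
    """Smooth 2D firing-rate map (spikes/s) over a unit-square arena."""
    d2 = (x - center[0])**2 + (y - center[1])**2
    return peak * np.exp(-d2 / (2 * width**2))

def preferred_phase(rate, base_phase=np.pi, slope=-0.08):
    """Map firing rate to a preferred theta phase (radians); negative-slope
    coupling means higher rate -> earlier phase. `slope` is a made-up constant."""
    return (base_phase + slope * rate) % (2 * np.pi)

# Two positions on the same rate isocontour get the same phase tag,
# whereas a position with a different rate gets a different phase.
r1 = place_like_rate(0.7, 0.5)    # on one isocontour
r2 = place_like_rate(0.5, 0.7)    # same distance from the field centre -> same rate
r3 = place_like_rate(0.55, 0.5)   # closer to the field centre -> higher rate
print(np.isclose(preferred_phase(r1), preferred_phase(r2)))   # True
print(np.isclose(preferred_phase(r1), preferred_phase(r3)))   # False
```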
Affiliation(s)
- Joseph D. Monaco
- Biomedical Engineering Department, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Rose M. De Guzman
- Psychology Department, University of California, Los Angeles, Los Angeles, CA, USA
- Hugh T. Blair
- Psychology Department, University of California, Los Angeles, Los Angeles, CA, USA
- Kechen Zhang
- Biomedical Engineering Department, Johns Hopkins University School of Medicine, Baltimore, MD, USA
66. Rolls ET, Wirth S. Spatial representations in the primate hippocampus, and their functions in memory and navigation. Prog Neurobiol 2018; 171:90-113. [DOI: 10.1016/j.pneurobio.2018.09.004]
67. Kornienko O, Latuske P, Bassler M, Kohler L, Allen K. Non-rhythmic head-direction cells in the parahippocampal region are not constrained by attractor network dynamics. eLife 2018; 7:e35949. [PMID: 30222110] [PMCID: PMC6158010] [DOI: 10.7554/elife.35949]
Abstract
Computational models postulate that head-direction (HD) cells are part of an attractor network integrating head turns. This network requires inputs from visual landmarks to anchor the HD signal to the external world. We investigated whether information about HD and visual landmarks is integrated in the medial entorhinal cortex and parasubiculum, resulting in neurons expressing a conjunctive code for HD and visual landmarks. We found that parahippocampal HD cells could be divided into two classes based on their theta-rhythmic activity: non-rhythmic and theta-rhythmic HD cells. Manipulations of the visual landmarks caused tuning curve alterations in most HD cells, with the largest visually driven changes observed in non-rhythmic HD cells. Importantly, the tuning modifications of non-rhythmic HD cells were often non-coherent across cells, refuting the notion that attractor-like dynamics control non-rhythmic HD cells. These findings reveal a new population of non-rhythmic HD cells whose malleable organization is controlled by visual landmarks.
Affiliation(s)
- Olga Kornienko
- Department of Clinical Neurobiology, Medical Faculty of Heidelberg University and German Cancer Research Center, Heidelberg, Germany
- Patrick Latuske
- Department of Clinical Neurobiology, Medical Faculty of Heidelberg University and German Cancer Research Center, Heidelberg, Germany
- Mathis Bassler
- Department of Clinical Neurobiology, Medical Faculty of Heidelberg University and German Cancer Research Center, Heidelberg, Germany
- Laura Kohler
- Department of Clinical Neurobiology, Medical Faculty of Heidelberg University and German Cancer Research Center, Heidelberg, Germany
- Kevin Allen
- Department of Clinical Neurobiology, Medical Faculty of Heidelberg University and German Cancer Research Center, Heidelberg, Germany
68. Park JL, Dudchenko PA, Donaldson DI. Navigation in Real-World Environments: New Opportunities Afforded by Advances in Mobile Brain Imaging. Front Hum Neurosci 2018; 12:361. [PMID: 30254578] [PMCID: PMC6141718] [DOI: 10.3389/fnhum.2018.00361]
Abstract
A central question in neuroscience and psychology is how the mammalian brain represents the outside world and enables interaction with it. Significant progress on this question has been made in the domain of spatial cognition, where a consistent network of brain regions that represent external space has been identified in both humans and rodents. In rodents, much of the work to date has been done in situations where the animal is free to move about naturally. By contrast, the majority of work carried out to date in humans is static, due to limitations imposed by traditional laboratory based imaging techniques. In recent years, significant progress has been made in bridging the gap between animal and human work by employing virtual reality (VR) technology to simulate aspects of real-world navigation. Despite this progress, the VR studies often fail to fully simulate important aspects of real-world navigation, where information derived from self-motion is integrated with representations of environmental features and task goals. In the current review article, we provide a brief overview of animal and human imaging work to date, focusing on commonalities and differences in findings across species. Following on from this we discuss VR studies of spatial cognition, outlining limitations and developments, before introducing mobile brain imaging techniques and describing technical challenges and solutions for real-world recording. Finally, we discuss how these advances in mobile brain imaging technology provide an unprecedented opportunity to illuminate how the brain represents complex multifaceted information during naturalistic navigation.
Affiliation(s)
- Joanne L Park
- Department of Psychology, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
- Paul A Dudchenko
- Department of Psychology, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
- David I Donaldson
- Department of Psychology, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
69. Finkelstein A, Ulanovsky N, Tsodyks M, Aljadeff J. Optimal dynamic coding by mixed-dimensionality neurons in the head-direction system of bats. Nat Commun 2018; 9:3590. [PMID: 30181554] [PMCID: PMC6123463] [DOI: 10.1038/s41467-018-05562-1]
Abstract
Ethologically relevant stimuli are often multidimensional. In many brain systems, neurons with “pure” tuning to one stimulus dimension are found along with “conjunctive” neurons that encode several dimensions, forming an apparently redundant representation. Here we show using theoretical analysis that a mixed-dimensionality code can efficiently represent a stimulus in different behavioral regimes: encoding by conjunctive cells is more robust when the stimulus changes quickly, whereas on long timescales pure cells represent the stimulus more efficiently with fewer neurons. We tested our predictions experimentally in the bat head-direction system and found that many head-direction cells switched their tuning dynamically from pure to conjunctive representation as a function of angular velocity—confirming our theoretical prediction. More broadly, our results suggest that optimal dimensionality depends on population size and on the time available for decoding—which might explain why mixed-dimensionality representations are common in sensory, motor, and higher cognitive systems across species. Multidimensional stimuli are often represented by neurons encoding only a single dimension and those encoding multiple dimensions. Here, the authors present theoretical and experimental analyses to show that mixed representations are optimal to efficiently encode such stimuli under different behavioral modes.
Affiliation(s)
- Arseny Finkelstein
- Department of Neurobiology, Weizmann Institute of Science, 76100, Rehovot, Israel; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, 20147, USA
- Nachum Ulanovsky
- Department of Neurobiology, Weizmann Institute of Science, 76100, Rehovot, Israel
- Misha Tsodyks
- Department of Neurobiology, Weizmann Institute of Science, 76100, Rehovot, Israel
- Johnatan Aljadeff
- Department of Neurobiology, University of Chicago, Chicago, IL, 60637, USA; Department of Bioengineering, Imperial College London, London, SW7 2AZ, UK
70. Zhao M. Human spatial representation: what we cannot learn from the studies of rodent navigation. J Neurophysiol 2018; 120:2453-2465. [PMID: 30133384] [DOI: 10.1152/jn.00781.2017]
Abstract
Studies of human and rodent navigation often reveal a remarkable cross-species similarity between the cognitive and neural mechanisms of navigation. Such cross-species resemblance often overshadows some critical differences between how humans and nonhuman animals navigate. In this review, I propose that a navigation system requires both a storage system (i.e., representing spatial information) and a positioning system (i.e., sensing spatial information) to operate. I then argue that the way humans represent spatial information is different from that inferred from the cellular activity observed during rodent navigation. Such difference spans the whole hierarchy of spatial representation, from representing the structure of an environment to the representation of subregions of an environment, routes and paths, and the distance and direction relative to a goal location. These cross-species inconsistencies suggest that what we learn from rodent navigation does not always transfer to human navigation. Finally, I argue for closing the loop for the dominant, unidirectional animal-to-human approach in navigation research so that insights from behavioral studies of human navigation may also flow back to shed light on the cellular mechanisms of navigation for both humans and other mammals (i.e., a human-to-animal approach).
Affiliation(s)
- Mintao Zhao
- School of Psychology, University of East Anglia, Norwich, United Kingdom; Department of Human Perception, Cognition, and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
71. Meister M. Memory System Neurons Represent Gaze Position and the Visual World. J Exp Neurosci 2018; 12:1179069518787484. [PMID: 30034250] [PMCID: PMC6050609] [DOI: 10.1177/1179069518787484]
Abstract
The entorhinal cortex, a brain area critical for memory, contains neurons that fire when a rodent is in a certain location (e.g., grid cells), or when a monkey looks at certain locations. In rodents, these spatial representations align to visual objects in the environment by firing when the animal is in a preferred location defined by the relative position of visual environmental features. Recently, our laboratory found that simultaneously recorded entorhinal neurons in monkeys can exhibit different spatial reference frames for gaze position, including a reference frame of visual environmental features. We also discovered that most of the neurons represent gaze position. These results suggest that gaze information in multiple spatial reference frames is a potent signal used in the primate memory system. Here, I describe how these findings support three underappreciated views of the hippocampal memory system.
Affiliation(s)
- Miriam Meister
- Washington National Primate Research Center, Seattle, WA, USA
- Department of Physiology and Biophysics, University of Washington, Seattle, WA, USA
- University of Washington School of Medicine, Seattle, WA, USA
72. Chen G, King JA, Lu Y, Cacucci F, Burgess N. Spatial cell firing during virtual navigation of open arenas by head-restrained mice. eLife 2018; 7:e34789. [PMID: 29911974] [PMCID: PMC6029848] [DOI: 10.7554/elife.34789]
Abstract
We present a mouse virtual reality (VR) system which restrains head movements to horizontal rotations, compatible with multi-photon imaging. This system allows expression of the spatial navigation and neuronal firing patterns characteristic of real open arenas (R). Comparing VR to R: place and grid, but not head-direction, cell firing had broader spatial tuning; place, but not grid, cell firing was more directional; theta frequency increased less with running speed, whereas increases in firing rates with running speed and place and grid cells' theta phase precession were similar. These results suggest that the omni-directional place cell firing in R may require local cues unavailable in VR, and that the scale of grid and place cell firing patterns, and theta frequency, reflect translational motion inferred from both virtual (visual and proprioceptive) and real (vestibular translation and extra-maze) cues. By contrast, firing rates and theta phase precession appear to reflect visual and proprioceptive cues alone.
Affiliation(s)
- Guifen Chen
- UCL Institute of Cognitive Neuroscience, University College London, London, United Kingdom; Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- John Andrew King
- Department of Clinical, Educational and Health Psychology, University College London, London, United Kingdom
- Yi Lu
- UCL Institute of Cognitive Neuroscience, University College London, London, United Kingdom; Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Francesca Cacucci
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Neil Burgess
- UCL Institute of Cognitive Neuroscience, University College London, London, United Kingdom; UCL Institute of Neurology, University College London, London, United Kingdom
73. Viejo G, Cortier T, Peyrache A. Brain-state invariant thalamo-cortical coordination revealed by non-linear encoders. PLoS Comput Biol 2018; 14:e1006041. [PMID: 29565979] [PMCID: PMC5882158] [DOI: 10.1371/journal.pcbi.1006041]
Abstract
Understanding how neurons cooperate to integrate sensory inputs and guide behavior is a fundamental problem in neuroscience. A large body of methods has been developed to study neuronal firing at the single cell and population levels, generally seeking interpretability as well as predictivity. However, these methods are usually confronted with the lack of ground-truth necessary to validate the approach. Here, using neuronal data from the head-direction (HD) system, we present evidence demonstrating how gradient boosted trees, a non-linear and supervised Machine Learning tool, can learn the relationship between behavioral parameters and neuronal responses with high accuracy by optimizing the information rate. Interestingly, and unlike other classes of Machine Learning methods, the intrinsic structure of the trees can be interpreted in relation to behavior (e.g. to recover the tuning curves) or to study how neurons cooperate with their peers in the network. We show how the method, unlike linear analysis, reveals that the coordination in thalamo-cortical circuits is qualitatively the same during wakefulness and sleep, indicating a brain-state independent feed-forward circuit. Machine Learning tools thus open new avenues for benchmarking model-based characterization of spike trains.
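For a concrete sense of the supervised, non-linear encoding approach described above, here is a minimal sketch that fits scikit-learn's GradientBoostingRegressor to simulated head-direction data and reads back the tuning curve. The simulated cell, bin size, and hyperparameters are arbitrary choices for illustration; the authors' actual implementation and feature set may differ.

```python
# Minimal sketch (not the authors' code): fit a gradient-boosted-tree regressor
# to predict a neuron's spike counts from head direction, then recover the
# tuning curve from the fitted model. Data are simulated.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Simulated HD cell: tuning curve peaked at 90 degrees, Poisson spike counts.
hd = rng.uniform(0, 2 * np.pi, size=5000)               # head-direction samples
tuning = 5.0 * np.exp(3.0 * np.cos(hd - np.pi / 2))     # von Mises-like rate (Hz)
spikes = rng.poisson(tuning * 0.025)                    # counts per 25-ms bin

# Encode the circular variable as (cos, sin) features so the trees need not
# learn the wrap-around discontinuity.
X = np.column_stack([np.cos(hd), np.sin(hd)])

model = GradientBoostingRegressor(n_estimators=200, max_depth=3,
                                  learning_rate=0.05)
model.fit(X, spikes)

# Recovered tuning curve: predicted counts on a grid of head directions.
grid = np.linspace(0, 2 * np.pi, 72)
predicted = model.predict(np.column_stack([np.cos(grid), np.sin(grid)]))
print(grid[np.argmax(predicted)])   # should be near pi/2, the preferred direction
```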
Affiliation(s)
- Guillaume Viejo
- Montreal Neurological Institute, McGill University, 3801 University Street, Montreal, QC H3A 2B4, Canada
- Thomas Cortier
- Montreal Neurological Institute, McGill University, 3801 University Street, Montreal, QC H3A 2B4, Canada
- École Normale Supérieure, 45 Rue d’Ulm, 75005 Paris, France
- Adrien Peyrache
- Montreal Neurological Institute, McGill University, 3801 University Street, Montreal, QC H3A 2B4, Canada
74.
Abstract
All motile organisms use spatially distributed chemical features of their surroundings to guide their behaviors, but the neural mechanisms underlying such behaviors in mammals have been difficult to study, largely due to the technical challenges of controlling chemical concentrations in space and time during behavioral experiments. To overcome these challenges, we introduce a system to control and maintain an olfactory virtual landscape. This system uses rapid flow controllers and an online predictive algorithm to deliver precise odorant distributions to head-fixed mice as they explore a virtual environment. We establish an odor-guided virtual navigation behavior that engages hippocampal CA1 "place cells" that exhibit similar properties to those previously reported for real and visual virtual environments, demonstrating that navigation based on different sensory modalities recruits a similar cognitive map. This method opens new possibilities for studying the neural mechanisms of olfactory-driven behaviors, multisensory integration, innate valence, and low-dimensional sensory-spatial processing.
Affiliation(s)
- Brad A Radvansky
- Department of Neurobiology, Northwestern University, Evanston, IL, 60208, USA
- Daniel A Dombeck
- Department of Neurobiology, Northwestern University, Evanston, IL, 60208, USA
75. Aitken P, Zheng Y, Smith PF. The modulation of hippocampal theta rhythm by the vestibular system. J Neurophysiol 2018; 119:548-562. [DOI: 10.1152/jn.00548.2017]
Abstract
The vestibular system is a sensory system that has evolved over millions of years to detect acceleration of the head, both rotational and translational, in three dimensions. One of its most important functions is to stabilize gaze during unexpected head movement; however, it is also important in the control of posture and autonomic reflexes. Theta rhythm is a 3- to 12-Hz oscillating EEG signal that is intimately linked to self-motion and is also known to be important in learning and memory. Many studies over the last two decades have shown that selective activation of the vestibular system, using either natural rotational or translational stimulation, or electrical stimulation of the peripheral vestibular system, can induce and modulate theta activity. Furthermore, inactivation of the vestibular system has been shown to significantly reduce theta in freely moving animals, which may be linked to its impairment of place cell function as well as spatial learning and memory. The pathways through which vestibular information modulates theta rhythm remain debatable. However, vestibular responses have been found in the pedunculopontine tegmental nucleus (PPTg) and activation of the vestibular system causes an increase in acetylcholine release into the hippocampus, probably from the medial septum. Therefore, a pathway from the vestibular nucleus complex and/or cerebellum to the PPTg, supramammillary nucleus, posterior hypothalamic nucleus, and septum to the hippocampus is likely. The modulation of theta by the vestibular system may have implications for vestibular effects on cognitive function and the contribution of vestibular impairment to the risk of dementia.
Affiliation(s)
- Phillip Aitken
- Department of Pharmacology and Toxicology, School of Biomedical Sciences, and Brain Health Research Centre, University of Otago, Dunedin, New Zealand
- Yiwen Zheng
- Department of Pharmacology and Toxicology, School of Biomedical Sciences, and Brain Health Research Centre, University of Otago, Dunedin, New Zealand
- Brain Research New Zealand Centre of Research Excellence
- Eisdell Moore Centre for Hearing and Balance Research, University of Auckland, Auckland, New Zealand
- Paul F. Smith
- Department of Pharmacology and Toxicology, School of Biomedical Sciences, and Brain Health Research Centre, University of Otago, Dunedin, New Zealand
- Brain Research New Zealand Centre of Research Excellence
- Eisdell Moore Centre for Hearing and Balance Research, University of Auckland, Auckland, New Zealand
76. Opposing and Complementary Topographic Connectivity Gradients Revealed by Quantitative Analysis of Canonical and Noncanonical Hippocampal CA1 Inputs. eNeuro 2018; 5:eN-NWR-0322-17. [PMID: 29387780] [PMCID: PMC5790753] [DOI: 10.1523/eneuro.0322-17.2018]
Abstract
Physiological studies suggest spatial representation gradients along the CA1 proximodistal axis. To determine the underlying anatomical basis, we quantitatively mapped canonical and noncanonical inputs to excitatory neurons in dorsal hippocampal CA1 along the proximal-distal axis in mice of both sexes using monosynaptic rabies tracing. Our quantitative analyses show comparable strength of subiculum complex and entorhinal cortex (EC) inputs to CA1, significant inputs from presubiculum and parasubiculum to CA1, and a threefold stronger input to proximal versus distal CA1 from CA3. Noncanonical subicular complex inputs exhibit opposing topographic connectivity gradients whereby the subiculum-CA1 input strength systematically increases but the presubiculum-CA1 input strength decreases along the proximal-distal axis. The subiculum input strength cotracks that of the lateral EC, known to be less spatially selective than the medial EC. The functional significance of this organization is verified physiologically for subiculum-to-CA1 inputs. These results reveal a novel anatomical framework by which to determine the circuit bases for CA1 representations.
77.
Abstract
The mammalian brain has neurons that specifically represent the animal’s location in the environment. Place cells in the hippocampus encode position, whereas grid cells in the medial entorhinal cortex, one synapse away, also express information about the distance and direction that the animal is moving. In this study, we show that, in 2.5–3-wk-old rat pups, place cells have firing fields whose positions depend on distance travelled, despite the immature state of grid fields at this age. The results suggest that place fields can be generated from self-motion–induced distance information in the absence of fully matured grid patterns. Place cells in the hippocampus and grid cells in the medial entorhinal cortex rely on self-motion information and path integration for spatially confined firing. Place cells can be observed in young rats as soon as they leave their nest at around 2.5 wk of postnatal life. In contrast, the regularly spaced firing of grid cells develops only after weaning, during the fourth week. In the present study, we sought to determine whether place cells are able to integrate self-motion information before maturation of the grid-cell system. Place cells were recorded on a 200-cm linear track while preweaning, postweaning, and adult rats ran on successive trials from a start wall to a box at the end of a linear track. The position of the start wall was altered in the middle of the trial sequence. When recordings were made in complete darkness, place cells maintained fields at a fixed distance from the start wall regardless of the age of the animal. When lights were on, place fields were determined primarily by external landmarks, except at the very beginning of the track. This shift was observed in both young and adult animals. The results suggest that preweaning rats are able to calculate distances based on information from self-motion before the grid-cell system has matured to its full extent.
78. Transformation of the head-direction signal into a spatial code. Nat Commun 2017; 8:1752. [PMID: 29170377] [PMCID: PMC5700966] [DOI: 10.1038/s41467-017-01908-3]
Abstract
Animals integrate multiple sensory inputs to successfully navigate in their environments. Head direction (HD), boundary vector, grid and place cells in the entorhinal-hippocampal network form the brain’s navigational system that allows the animal’s current location to be identified, but how the functions of these specialized neuron types are acquired remains to be understood. Here we report that activity of HD neurons is influenced by the ambulatory constraints imposed upon the animal by the boundaries of the explored environment, leading to spurious spatial information. However, in the post-subiculum, the main cortical stage of HD signal processing, HD neurons convey true spatial information in the form of border modulated activity through the integration of additional sensory modalities relative to egocentric position, unlike their driving thalamic inputs. These findings demonstrate how the combination of HD and egocentric information can be transduced into a spatial code. A cognitive map of space must integrate allocentric cues such as head direction (HD) with various egocentric cues. Here the authors report that anterior thalamic (ADn) neurons encode a pure HD signal, while neurons in post-subiculum represent a conjunction of HD and egocentric cues such as body posture with respect to environment boundaries.
Collapse
|
79
|
Cullen KE, Taube JS. Our sense of direction: progress, controversies and challenges. Nat Neurosci 2017; 20:1465-1473. [PMID: 29073639 PMCID: PMC10278035 DOI: 10.1038/nn.4658] [Citation(s) in RCA: 121] [Impact Index Per Article: 17.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/02/2017] [Accepted: 09/14/2017] [Indexed: 12/16/2022]
Abstract
In this Perspective, we evaluate current progress in understanding how the brain encodes our sense of direction, within the context of parallel work focused on how early vestibular pathways encode self-motion. In particular, we discuss how these systems work together and provide evidence that they involve common mechanisms. We first consider the classic view of the head direction cell and results of recent experiments in rodents and primates indicating that inputs to these neurons encode multimodal information during self-motion, such as proprioceptive and motor efference copy signals, including gaze-related information. We also consider the paradox that, while the head-direction network is generally assumed to generate a fixed representation of perceived directional heading, this computation would need to be dynamically updated when the relationship between voluntary motor command and its sensory consequences changes. Such situations include navigation in virtual reality and head-restricted conditions, since the natural relationship between visual and extravisual cues is altered.
Collapse
Affiliation(s)
- Kathleen E Cullen
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore, Maryland, USA
| | - Jeffrey S Taube
- Department of Psychological & Brain Sciences, Dartmouth College, Hanover, New Hampshire, USA
| |
Collapse
|
80
|
Sarel A, Finkelstein A, Las L, Ulanovsky N. Vectorial representation of spatial goals in the hippocampus of bats. Science 2017; 355:176-180. [PMID: 28082589 DOI: 10.1126/science.aak9589] [Citation(s) in RCA: 159] [Impact Index Per Article: 22.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2016] [Accepted: 12/14/2016] [Indexed: 11/02/2022]
Abstract
To navigate, animals need to represent not only their own position and orientation, but also the location of their goal. Neural representations of an animal's own position and orientation have been extensively studied. However, it is unknown how navigational goals are encoded in the brain. We recorded from hippocampal CA1 neurons of bats flying in complex trajectories toward a spatial goal. We discovered a subpopulation of neurons with angular tuning to the goal direction. Many of these neurons were tuned to an occluded goal, suggesting that goal-direction representation is memory-based. We also found cells that encoded the distance to the goal, often in conjunction with goal direction. The goal-direction and goal-distance signals make up a vectorial representation of spatial goals, suggesting a previously unrecognized neuronal mechanism for goal-directed navigation.
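The vectorial goal representation described here amounts to a distance and a direction computed from the animal's position, its heading, and the goal location. A minimal sketch of that geometry (illustrative only; the function name and example coordinates are hypothetical):

```python
import numpy as np

def goal_vector(position, heading, goal):
    """Distance and egocentric angle from the animal to a goal location.
    `heading` is the allocentric heading in radians."""
    dx, dy = goal[0] - position[0], goal[1] - position[1]
    distance = np.hypot(dx, dy)
    allocentric_angle = np.arctan2(dy, dx)
    ego_angle = (allocentric_angle - heading + np.pi) % (2 * np.pi) - np.pi
    return distance, ego_angle

# Example: animal at (1, 2) m heading due east (0 rad); goal at (4, 6) m.
dist, ang = goal_vector((1.0, 2.0), 0.0, (4.0, 6.0))
print(f"goal distance = {dist:.2f} m, goal direction = {np.rad2deg(ang):.1f} deg")
```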
Collapse
Affiliation(s)
- Ayelet Sarel
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Arseny Finkelstein
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Liora Las
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Nachum Ulanovsky
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel.
| |
Collapse
|
81
|
Grieves RM, Duvelle É, Wood ER, Dudchenko PA. Field repetition and local mapping in the hippocampus and the medial entorhinal cortex. J Neurophysiol 2017; 118:2378-2388. [PMID: 28814638 PMCID: PMC5646201 DOI: 10.1152/jn.00933.2016] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2016] [Revised: 07/20/2017] [Accepted: 07/20/2017] [Indexed: 11/22/2022] Open
Abstract
Hippocampal place cells support spatial cognition and are thought to form the neural substrate of a global "cognitive map." A widely held view is that parts of the hippocampus also underlie the ability to separate patterns or to provide different neural codes for distinct environments. However, a number of studies have shown that in environments composed of multiple, repeating compartments, place cells and other spatially modulated neurons show the same activity in each local area. This repetition of firing fields may reflect pattern completion and may make it difficult for animals to distinguish similar local environments. In this review we 1) highlight some of the navigation difficulties encountered by humans in repetitive environments, 2) summarize literature demonstrating that place and grid cells represent local and not global space, and 3) attempt to explain the origin of these phenomena. We argue that the repetition of firing fields can be a useful tool for understanding the relationship between grid cells in the entorhinal cortex and place cells in the hippocampus, the spatial inputs shared by these cells, and the propagation of spatially related signals through these structures.
Collapse
Affiliation(s)
- Roddy M Grieves
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, United Kingdom
| | - Éléonore Duvelle
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, United Kingdom
| | - Emma R Wood
- Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, Edinburgh, United Kingdom
| | - Paul A Dudchenko
- Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, Edinburgh, United Kingdom; and
- Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
| |
Collapse
|
82
|
Stowers JR, Hofbauer M, Bastien R, Griessner J, Higgins P, Farooqui S, Fischer RM, Nowikovsky K, Haubensak W, Couzin ID, Tessmar-Raible K, Straw AD. Virtual reality for freely moving animals. Nat Methods 2017; 14:995-1002. [PMID: 28825703 PMCID: PMC6485657 DOI: 10.1038/nmeth.4399] [Citation(s) in RCA: 137] [Impact Index Per Article: 19.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2016] [Accepted: 07/06/2017] [Indexed: 12/29/2022]
Abstract
Standard animal behavior paradigms incompletely mimic nature and thus limit our understanding of behavior and brain function. Virtual reality (VR) can help, but it poses challenges. Typical VR systems require movement restrictions that disrupt sensorimotor experience, causing neuronal and behavioral alterations. We report the development of FreemoVR, a VR system for freely moving animals. We validate immersive VR for mice, flies, and zebrafish. FreemoVR allows instant, disruption-free environmental reconfigurations and interactions between real organisms and computer-controlled agents. Using the FreemoVR platform, we established a height-aversion assay in mice and studied visuomotor effects in Drosophila and zebrafish. Furthermore, by photorealistically mimicking zebrafish we discovered that effective social influence depends on a prospective leader balancing its internally preferred directional choice with social interaction. FreemoVR technology facilitates detailed investigations into neural function and behavior through the precise manipulation of sensorimotor feedback loops in unrestrained animals.
Collapse
Affiliation(s)
- John R. Stowers
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
- loopbio gmbh, Kritzendorf, Austria
| | - Maximilian Hofbauer
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
- loopbio gmbh, Kritzendorf, Austria
- Max F. Perutz Laboratories, University of Vienna, Vienna, Austria
- Research Platform “Rhythms of Life”, University of Vienna, Vienna, Austria
| | - Renaud Bastien
- Department of Collective Behaviour, Max Planck Institute for Ornithology, 78457 Konstanz, Germany
- Chair of Biodiversity and Collective Behaviour, Department of Biology, University of Konstanz, 78457 Konstanz, Germany
| | - Johannes Griessner
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
| | - Peter Higgins
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
| | - Sarfarazhussain Farooqui
- Max F. Perutz Laboratories, University of Vienna, Vienna, Austria
- Research Platform “Rhythms of Life”, University of Vienna, Vienna, Austria
- Medizinische Universität Wien, Dept. for Internal Medicine I, 1090 Wien, Austria
| | - Ruth M. Fischer
- Max F. Perutz Laboratories, University of Vienna, Vienna, Austria
| | - Karin Nowikovsky
- Medizinische Universität Wien, Dept. for Internal Medicine I, 1090 Wien, Austria
| | - Wulf Haubensak
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
| | - Iain D. Couzin
- Department of Collective Behaviour, Max Planck Institute for Ornithology, 78457 Konstanz, Germany
- Chair of Biodiversity and Collective Behaviour, Department of Biology, University of Konstanz, 78457 Konstanz, Germany
| | - Kristin Tessmar-Raible
- Max F. Perutz Laboratories, University of Vienna, Vienna, Austria
- Research Platform “Rhythms of Life”, University of Vienna, Vienna, Austria
| | - Andrew D. Straw
- Research Institute of Molecular Pathology, Vienna Biocenter, Vienna, Austria
- Institute of Biology I and Bernstein Center Freiburg, Faculty of Biology, Albert-Ludwigs-University Freiburg, Freiburg, Germany
| |
Collapse
|
83
|
Harland B, Grieves RM, Bett D, Stentiford R, Wood ER, Dudchenko PA. Lesions of the Head Direction Cell System Increase Hippocampal Place Field Repetition. Curr Biol 2017; 27:2706-2712.e2. [PMID: 28867207 PMCID: PMC5607353 DOI: 10.1016/j.cub.2017.07.071] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2017] [Revised: 07/02/2017] [Accepted: 07/31/2017] [Indexed: 11/26/2022]
Abstract
A central tenet of systems neuroscience is that the mammalian hippocampus provides a cognitive map of the environment. This view is supported by the finding of place cells, neurons whose firing is tuned to specific locations in an animal's environment, within this brain region. Recent work, however, has shown that these cells repeat their firing fields across visually identical maze compartments [1, 2]. This repetition is not observed if these compartments face different directions, suggesting that place cells use a directional input to differentiate otherwise similar local environments [3, 4]. A clear candidate for this input is the head direction cell system. To test this, we disrupted the head direction cell system by lesioning the lateral mammillary nuclei and then recorded place cells as rats explored multiple, connected compartments, oriented in the same or in different directions. As shown previously, we found that place cells in control animals exhibited repeated fields in compartments arranged in parallel, but not in compartments facing different directions. In contrast, the place cells of animals with lesions of the head direction cell system exhibited repeating fields in both conditions. Thus, directional information provided by the head direction cell system appears essential for the angular disambiguation by place cells of visually identical compartments.
Collapse
Affiliation(s)
- Bruce Harland
- Faculty of Natural Sciences, University of Stirling, Stirling FK9 4LA, UK; Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK
| | - Roddy M Grieves
- Faculty of Natural Sciences, University of Stirling, Stirling FK9 4LA, UK; Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK; University College London, Institute of Behavioural Neuroscience, Department of Experimental Psychology, London, UK
| | - David Bett
- Faculty of Natural Sciences, University of Stirling, Stirling FK9 4LA, UK; Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK
| | - Rachael Stentiford
- Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK
| | - Emma R Wood
- Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK
| | - Paul A Dudchenko
- Faculty of Natural Sciences, University of Stirling, Stirling FK9 4LA, UK; Centre for Cognitive and Neural Systems, Edinburgh Medical School: Biomedical Sciences, University of Edinburgh, 1 George Square, Edinburgh EH8 9JZ, UK.
| |
Collapse
|
84
|
Hardcastle K, Maheswaranathan N, Ganguli S, Giocomo LM. A Multiplexed, Heterogeneous, and Adaptive Code for Navigation in Medial Entorhinal Cortex. Neuron 2017; 94:375-387.e7. [PMID: 28392071 PMCID: PMC5498174 DOI: 10.1016/j.neuron.2017.03.025] [Citation(s) in RCA: 171] [Impact Index Per Article: 24.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2016] [Revised: 01/21/2017] [Accepted: 03/20/2017] [Indexed: 12/19/2022]
Abstract
Medial entorhinal grid cells display strikingly symmetric spatial firing patterns. The clarity of these patterns motivated the use of specific activity pattern shapes to classify entorhinal cell types. While this approach successfully revealed cells that encode boundaries, head direction, and running speed, it left a majority of cells unclassified, and its pre-defined nature may have missed unconventional, yet important coding properties. Here, we apply an unbiased statistical approach to search for cells that encode navigationally relevant variables. This approach successfully classifies the majority of entorhinal cells and reveals unsuspected entorhinal coding principles. First, we find a high degree of mixed selectivity and heterogeneity in superficial entorhinal neurons. Second, we discover a dynamic and remarkably adaptive code for space that enables entorhinal cells to rapidly encode navigational information accurately at high running speeds. Combined, these observations advance our current understanding of the mechanistic origins and functional implications of the entorhinal code for navigation. VIDEO ABSTRACT.
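The unbiased approach described here is, in essence, encoding-model comparison: build models from different combinations of navigational variables and ask which combination best predicts held-out spiking. The sketch below illustrates that logic on simulated data using scikit-learn's Poisson regression; it is a simplified stand-in for the paper's LN-model framework, and all variable names and parameters are assumptions.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 5000
position = rng.uniform(0, 100, n)                 # cm along a track
head_dir = rng.uniform(-np.pi, np.pi, n)          # radians
speed = rng.uniform(0, 40, n)                     # cm/s

# Simulated cell: spatially tuned (field near 50 cm) and speed-modulated, HD-invariant.
rate_hz = 0.5 + 8 * np.exp(-0.5 * ((position - 50) / 8) ** 2) + 0.1 * speed
spikes = rng.poisson(rate_hz * 0.02)              # spike counts in 20-ms bins

P = np.eye(10)[np.digitize(position, np.linspace(0, 100, 11)) - 1]   # one-hot position bins
H = np.column_stack([np.cos(head_dir), np.sin(head_dir)])            # circular HD basis

designs = {"position": P,
           "head direction": H,
           "position + speed": np.column_stack([P, speed])}

# Higher (less negative) held-out scores indicate variables the cell actually encodes.
for name, X in designs.items():
    score = cross_val_score(PoissonRegressor(alpha=0.1, max_iter=500), X, spikes,
                            cv=5, scoring="neg_mean_poisson_deviance").mean()
    print(f"{name:>17s}: {score:.4f}")
```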
Collapse
Affiliation(s)
- Kiah Hardcastle
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA.
| | - Niru Maheswaranathan
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
| | - Surya Ganguli
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
| | - Lisa M Giocomo
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA.
| |
Collapse
|
85
|
Tafazoli S, Safaai H, De Franceschi G, Rosselli FB, Vanzella W, Riggi M, Buffolo F, Panzeri S, Zoccolan D. Emergence of transformation-tolerant representations of visual objects in rat lateral extrastriate cortex. eLife 2017; 6. [PMID: 28395730 PMCID: PMC5388540 DOI: 10.7554/elife.22794] [Citation(s) in RCA: 31] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2016] [Accepted: 02/26/2017] [Indexed: 01/17/2023] Open
Abstract
Rodents are emerging as increasingly popular models of visual functions. Yet, evidence that rodent visual cortex is capable of advanced visual processing, such as object recognition, is limited. Here we investigate how neurons located along the progression of extrastriate areas that, in the rat brain, run laterally to primary visual cortex, encode object information. We found a progressive functional specialization of neural responses along these areas, with: (1) a sharp reduction of the amount of low-level, energy-related visual information encoded by neuronal firing; and (2) a substantial increase in the ability of both single neurons and neuronal populations to support discrimination of visual objects under identity-preserving transformations (e.g., position and size changes). These findings strongly argue for the existence of a rat object-processing pathway, and point to rodents as promising models to dissect the neuronal circuitry underlying transformation-tolerant recognition of visual objects. Every day, we see thousands of different objects with many different shapes, colors, sizes and textures. Even an individual object – for example, a face – can present us with a virtually infinite number of different images, depending on where we view it from. In spite of this extraordinary variability, our brain can recognize objects in a fraction of a second and without any apparent effort. Our closest relatives in the animal kingdom, the non-human primates, share our ability to effortlessly recognize objects. For many decades, they have served as invaluable models to investigate the circuits of neurons in the brain that underlie object recognition. In recent years, mice and rats have also emerged as useful models for studying some aspects of vision. However, it was not clear whether these rodents’ brains could also perform complex visual processes like recognizing objects. Tafazoli, Safaai et al. have now recorded the responses of visual neurons in rats to a set of objects, each presented across a range of positions, sizes, rotations and brightness levels. Applying computational and mathematical tools to these responses revealed that visual information progresses through a number of brain regions. The identity of the visual objects is gradually extracted as the information travels along this pathway, in a way that becomes more and more robust to changes in how the object appears. Overall, Tafazoli, Safaai et al. suggest that rodents share with primates some of the key computations that underlie the recognition of visual objects. Therefore, the powerful sets of experimental approaches that can be used to study rats and mice – for example, genetic and molecular tools – could now be used to study the circuits of neurons that enable object recognition. Gaining a better understanding of such circuits can, in turn, inspire the design of more powerful artificial vision systems and help to develop visual prosthetics. Achieving these goals will require further work to understand how different classes of neurons in different brain regions interact as rodents perform complex visual discrimination tasks.
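Transformation tolerance of the kind reported here is commonly quantified with a generalization analysis: train a linear decoder to discriminate object identity at one transformation (for example, one size) and test it at another. The sketch below illustrates that analysis on simulated population responses; it is not the paper's code, and all names and parameters are hypothetical.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_neurons, n_trials = 60, 200

obj_axis = rng.normal(0, 1, n_neurons)    # identity-related response component per neuron
size_axis = rng.normal(0, 1, n_neurons)   # transformation-related component per neuron

def population_responses(obj, size):
    """Simulated single-trial population responses to one object at one size."""
    mean = obj * obj_axis + size * size_axis
    return mean + rng.normal(0, 1.5, (n_trials, n_neurons))

# Train the decoder on the small-size presentations of both objects...
X_train = np.vstack([population_responses(-1, -1), population_responses(+1, -1)])
# ...and test it on the large-size presentations (an identity-preserving transformation).
X_test = np.vstack([population_responses(-1, +1), population_responses(+1, +1)])
y = np.repeat([0, 1], n_trials)

clf = LinearSVC(C=0.1, max_iter=20000).fit(X_train, y)
print("identity decoding accuracy across the size change:", clf.score(X_test, y))
```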
Collapse
Affiliation(s)
- Sina Tafazoli
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| | - Houman Safaai
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy; Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy; Department of Neurobiology, Harvard Medical School, Boston, United States
| | - Gioia De Franceschi
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| | | | - Walter Vanzella
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| | - Margherita Riggi
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| | - Federica Buffolo
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| | - Stefano Panzeri
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
| | - Davide Zoccolan
- Visual Neuroscience Lab, International School for Advanced Studies (SISSA), Trieste, Italy
| |
Collapse
|
86
|
Wirth S, Baraduc P, Planté A, Pinède S, Duhamel JR. Gaze-informed, task-situated representation of space in primate hippocampus during virtual navigation. PLoS Biol 2017; 15:e2001045. [PMID: 28241007 PMCID: PMC5328243 DOI: 10.1371/journal.pbio.2001045] [Citation(s) in RCA: 64] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2016] [Accepted: 01/18/2017] [Indexed: 01/11/2023] Open
Abstract
To elucidate how gaze informs the construction of mental space during wayfinding in visual species like primates, we jointly examined navigation behavior, visual exploration, and hippocampal activity as macaque monkeys searched a virtual reality maze for a reward. Cells sensitive to place also responded to one or more variables like head direction, point of gaze, or task context. Many cells fired at the sight (and in anticipation) of a single landmark in a viewpoint- or task-dependent manner, simultaneously encoding the animal’s logical situation within a set of actions leading to the goal. Overall, hippocampal activity was best fit by a fine-grained state space comprising current position, view, and action contexts. Our findings indicate that counterparts of rodent place cells in primates embody multidimensional, task-situated knowledge pertaining to the target of gaze, therein supporting self-awareness in the construction of space. In the brain of mammalian species, the hippocampus is a key structure for episodic and spatial memory and is home to neurons coding a selective location in space (“place cells”). These neurons have been mostly investigated in the rat. However, species such as rodents and primates have access to different olfactory and visual information, and it is still unclear how their hippocampal cells compare. By analyzing hippocampal activity of nonhuman primates (rhesus macaques) while they searched a virtual environment for a reward, we show that space coding is more complex than a mere position or orientation selectivity. Rather, space is represented as a combination of visually derived information and task-related knowledge. Here, we uncover how this multidimensional representation emerges from gazing at the environment at key moments of the animal’s exploration of space. We show that neurons are active for precise positions and actions related to the landmarks gazed at by the animals. Neurons were even found to anticipate the appearance of landmarks, sometimes responding to a landmark that was not yet visible. Overall, the place fields of primate hippocampal neurons appear as the projection of a multidimensional memory onto physical space.
Collapse
Affiliation(s)
- Sylvia Wirth
- Centre de Neuroscience Cognitive, UMR 5229, CNRS and University of Lyon, Bron, France
| | - Pierre Baraduc
- Centre de Neuroscience Cognitive, UMR 5229, CNRS and University of Lyon, Bron, France
- GIPSA-lab, UMR 5216, CNRS and University of Grenoble-Alpes, Saint Martin d'Hères, France
| | - Aurélie Planté
- Centre de Neuroscience Cognitive, UMR 5229, CNRS and University of Lyon, Bron, France
| | - Serge Pinède
- Centre de Neuroscience Cognitive, UMR 5229, CNRS and University of Lyon, Bron, France
| | - Jean-René Duhamel
- Centre de Neuroscience Cognitive, UMR 5229, CNRS and University of Lyon, Bron, France
| |
Collapse
|
87
|
Thurley K, Ayaz A. Virtual reality systems for rodents. Curr Zool 2017; 63:109-119. [PMID: 29491968 PMCID: PMC5804145 DOI: 10.1093/cz/zow070] [Citation(s) in RCA: 48] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/11/2016] [Accepted: 05/26/2016] [Indexed: 01/24/2023] Open
Abstract
Over the last decade virtual reality (VR) setups for rodents have been developed and utilized to investigate the neural foundations of behavior. Such VR systems became very popular since they allow the use of state-of-the-art techniques to measure neural activity in behaving rodents that cannot be easily used with classical behavior setups. Here, we provide an overview of rodent VR technologies and review recent results from related research. We discuss commonalities and differences as well as merits and issues of different approaches. A special focus is given to experimental (behavioral) paradigms in use. Finally we comment on possible use cases that may further exploit the potential of VR in rodent research and hence inspire future studies.
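Common to essentially all of the systems reviewed here is a closed loop in which measured locomotion (ball or treadmill displacement, for instance) is scaled by a gain and applied to the virtual viewpoint on every frame. A minimal, generic sketch of that loop follows; the class and parameter names are assumptions, not taken from any particular system.

```python
from dataclasses import dataclass

@dataclass
class VirtualViewpoint:
    """Minimal closed-loop rule shared by most rodent VR systems: measured
    locomotion, scaled by a gain, displaces the rendered viewpoint."""
    position_cm: float = 0.0
    gain: float = 1.0            # virtual cm moved per real cm of locomotion

    def update(self, measured_displacement_cm: float) -> float:
        self.position_cm += self.gain * measured_displacement_cm
        return self.position_cm

vp = VirtualViewpoint(gain=1.5)              # gain manipulations are a common experimental paradigm
for frame_displacement in [2.0, 2.0, 1.5]:   # ball/treadmill displacement read each frame
    print(f"virtual position: {vp.update(frame_displacement):.1f} cm")
```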
Collapse
Affiliation(s)
- Kay Thurley
- Department Biologie II, Ludwig-Maximilians-Universität München, Großhaderner Straße 2, D-82152 Planegg-Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Germany; Brain Research Institute, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland
| | - Aslı Ayaz
- Department Biologie II, Ludwig-Maximilians-Universität München, Großhaderner Straße 2, D-82152 Planegg-Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Germany; Brain Research Institute, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland
| |
Collapse
|
88
|
Nashaat MA, Oraby H, Sachdev RNS, Winter Y, Larkum ME. Air-Track: a real-world floating environment for active sensing in head-fixed mice. J Neurophysiol 2016; 116:1542-1553. [PMID: 27486102 DOI: 10.1152/jn.00088.2016] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2016] [Accepted: 07/01/2016] [Indexed: 11/22/2022] Open
Abstract
Natural behavior occurs in multiple sensory and motor modalities and in particular is dependent on sensory feedback that constantly adjusts behavior. To investigate the underlying neuronal correlates of natural behavior, it is useful to have access to state-of-the-art recording equipment (e.g., 2-photon imaging, patch recordings, etc.) that frequently requires head fixation. This limitation has been addressed with various approaches such as virtual reality/air ball or treadmill systems. However, achieving multimodal realistic behavior in these systems can be challenging. These systems are often also complex and expensive to implement. Here we present "Air-Track," an easy-to-build head-fixed behavioral environment that requires only minimal computational processing. The Air-Track is a lightweight physical maze floating on an air table that has all the properties of the "real" world, including multiple sensory modalities tightly coupled to motor actions. To test this system, we trained mice in Go/No-Go and two-alternative forced choice tasks in a plus maze. Mice chose lanes and discriminated apertures or textures by moving the Air-Track back and forth and rotating it around themselves. Mice rapidly adapted to moving the track and used visual, auditory, and tactile cues to guide them in performing the tasks. A custom-controlled camera system monitored animal location and generated data that could be used to calculate reaction times in the visual and somatosensory discrimination tasks. We conclude that the Air-Track system is ideal for eliciting natural behavior in concert with virtually any system for monitoring or manipulating brain activity.
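As an illustration of the kind of measurement described here, deriving reaction times from camera tracking of the floating maze, the sketch below finds the first tracked-position sample after stimulus onset that exceeds a movement threshold. It is a generic example rather than the Air-Track code; the threshold and sampling rate are hypothetical.

```python
import numpy as np

def reaction_time(timestamps, positions, stim_onset, threshold_cm=0.5):
    """Time from stimulus onset until the tracked position first deviates
    from its value at onset by more than `threshold_cm`."""
    timestamps = np.asarray(timestamps)
    positions = np.asarray(positions)
    after = timestamps >= stim_onset
    baseline = positions[after][0]
    moved = np.abs(positions[after] - baseline) > threshold_cm
    if not moved.any():
        return None                            # no detectable response
    return timestamps[after][moved.argmax()] - stim_onset

t = np.arange(0.0, 2.0, 0.01)                          # 100 Hz camera frames
x = np.where(t < 1.25, 0.0, (t - 1.25) * 10.0)         # maze starts moving 0.25 s after onset
print(reaction_time(t, x, stim_onset=1.0))             # ~0.31 s
```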
Collapse
Affiliation(s)
- Mostafa A Nashaat
- Neurocure Cluster of Excellence, Humboldt-Universität zu Berlin, Berlin, Germany; and Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Hatem Oraby
- Neurocure Cluster of Excellence, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Robert N S Sachdev
- Neurocure Cluster of Excellence, Humboldt-Universität zu Berlin, Berlin, Germany
| | - York Winter
- Neurocure Cluster of Excellence, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Matthew E Larkum
- Neurocure Cluster of Excellence, Humboldt-Universität zu Berlin, Berlin, Germany
| |
Collapse
|
90
|
A new direction. Nat Rev Neurosci 2016. [DOI: 10.1038/nrn.2016.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
91
|
Abstract
To understand the origins of spatial navigational signals, Acharya et al. record the activity of hippocampal neurons in rats running in open two-dimensional environments, both in the real world and in virtual reality. They find that a subset of hippocampal neurons has directional tuning that persists in virtual reality, where vestibular cues are absent.
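Directional tuning of this kind is typically quantified by binning spikes by movement or head direction, normalizing by directional occupancy, and computing the mean resultant (Rayleigh) vector length, a metric that can be applied identically to real-world and VR sessions. A generic sketch on simulated data follows; all names and parameters are assumptions.

```python
import numpy as np

def directional_tuning(spike_dirs, occupancy_dirs, n_bins=36):
    """Occupancy-normalized directional tuning curve and Rayleigh vector length.
    `spike_dirs`: direction (radians) at each spike; `occupancy_dirs`: direction
    at every behavioural sample."""
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    spike_counts, _ = np.histogram(spike_dirs, edges)
    occupancy, _ = np.histogram(occupancy_dirs, edges)
    rate = spike_counts / np.maximum(occupancy, 1)        # spikes per behavioural sample per bin
    centres = (edges[:-1] + edges[1:]) / 2
    r = np.abs(np.sum(rate * np.exp(1j * centres))) / np.sum(rate)
    return rate, r                                        # r close to 1 => strongly directional

rng = np.random.default_rng(2)
occupancy_dirs = rng.uniform(-np.pi, np.pi, 20000)        # roughly uniform directional sampling
spike_dirs = rng.vonmises(np.pi / 4, 4.0, 500)            # a cell preferring roughly 45 deg
_, r = directional_tuning(spike_dirs, occupancy_dirs)
print(f"Rayleigh vector length = {r:.2f}")
```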
Collapse
Affiliation(s)
- Cian O'Donnell
- Department of Computer Science, Faculty of Engineering, University of Bristol, Bristol BS8 1UB, UK
| | - Terrence J Sejnowski
- Howard Hughes Medical Institute at the Salk Institute for Biological Studies, La Jolla, CA 92037, USA; Division of Biological Sciences, University of California, San Diego, La Jolla, CA 92161, USA.
| |
Collapse
|