1. Aziz A, Patil BK, Lakshmikanth K, Sreeharsha PSS, Mukhopadhyay A, Chakravarthy VS. Modeling hippocampal spatial cells in rodents navigating in 3D environments. Sci Rep 2024; 14:16714. PMID: 39030197; PMCID: PMC11271631; DOI: 10.1038/s41598-024-66755-x.
Abstract
Studies of the neural correlates of navigation in 3D environments face several unresolved issues. For example, experimental studies report markedly different place cell responses in rats and bats navigating in 3D environments. In this study, we focus on modelling the spatial cells of rodents in a 3D environment. We propose a deep autoencoder network to model the place and grid cells of a simulated agent navigating in a 3D environment. The input layer of the autoencoder is the head direction (HD) layer, which encodes the agent's heading in terms of azimuth (θ) and pitch (ϕ) angles. The output of this layer is fed to the Path Integration (PI) layer, which computes displacement along all the preferred directions. The bottleneck layer of the autoencoder encodes spatial cell-like responses; both grid cell-like and place cell-like responses are observed. The proposed model is verified against two experimental studies using two 3D environments. This model paves the way for a holistic approach to modelling spatial cells in 3D navigation with deep neural networks.
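The HD-to-PI encoding described in this abstract can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions (randomly sampled preferred directions, unit-speed steps; all names are my own), not the authors' published code:

```python
import numpy as np

# Illustrative sketch: encode head direction (HD) as azimuth/pitch, then
# project a movement step onto a bank of preferred 3D directions, as a
# path-integration (PI) layer would. Accumulating these signed
# displacements along a trajectory and compressing them through an
# autoencoder bottleneck is the step the paper describes next.

rng = np.random.default_rng(0)

# Preferred directions of the HD/PI units, sampled over the sphere.
n_units = 64
theta_pref = rng.uniform(0, 2 * np.pi, n_units)          # azimuth
phi_pref = rng.uniform(-np.pi / 2, np.pi / 2, n_units)   # pitch

def unit_vector(theta, phi):
    """3D unit vector for azimuth theta and pitch phi."""
    return np.array([np.cos(phi) * np.cos(theta),
                     np.cos(phi) * np.sin(theta),
                     np.sin(phi)])

pref = np.stack([unit_vector(t, p) for t, p in zip(theta_pref, phi_pref)])

def pi_layer(theta, phi, speed, dt=1.0):
    """Displacement of one movement step projected onto each preferred direction."""
    step = speed * dt * unit_vector(theta, phi)
    return pref @ step  # one signed displacement per unit

# A unit responds most when the agent moves along its preferred direction.
resp = pi_layer(theta_pref[0], phi_pref[0], speed=1.0)
assert np.argmax(resp) == 0
```

The dot-product projection is the standard way a PI layer turns a heading plus speed into direction-tuned displacement signals; the autoencoder training itself is omitted here.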
Affiliation(s)
- Azra Aziz
  - Computational Neuroscience Lab, Indian Institute of Technology Madras, Chennai, 600036, India
- Bharat K Patil
  - Computational Neuroscience Lab, Indian Institute of Technology Madras, Chennai, 600036, India
- Kailash Lakshmikanth
  - Computational Neuroscience Lab, Indian Institute of Technology Madras, Chennai, 600036, India
- Ayan Mukhopadhyay
  - Department of Physics, Indian Institute of Technology Madras, Chennai, 600036, India
  - Instituto de Física, Pontificia Universidad Católica de Valparaíso, Valparaíso, Chile
- V Srinivasa Chakravarthy
  - Computational Neuroscience Lab, Indian Institute of Technology Madras, Chennai, 600036, India
  - Center for Complex Systems and Dynamics, Indian Institute of Technology Madras, Chennai, 600036, India
  - Department of Biotechnology, Indian Institute of Technology Madras, Chennai, 600036, India
2. Zhu Y, Gelnaw H, Auer F, Hamling KR, Ehrlich DE, Schoppik D. A brainstem circuit for gravity-guided vertical navigation. bioRxiv [Preprint] 2024:2024.03.12.584680. PMID: 38559209; PMCID: PMC10980031; DOI: 10.1101/2024.03.12.584680.
Abstract
The sensation of gravity anchors our perception of the environment and is crucial for navigation. However, the neural circuits that transform gravity into commands for navigation are undefined. We first determined that larval zebrafish (Danio rerio) navigate vertically by maintaining a consistent heading across a series of upward climb or downward dive bouts. Gravity-blind mutant fish swim with more variable heading and excessive veering, leading to inefficient vertical navigation. Targeted photoablation of ascending vestibular neurons or spinal-projecting midbrain neurons, but not of vestibulospinal neurons, impaired vertical navigation. These data define a sensorimotor circuit that uses evolutionarily conserved brainstem architecture to transform gravitational signals into persistent heading for vertical navigation. The work lays a foundation for understanding how vestibular inputs allow animals to move efficiently through their environment.
Affiliation(s)
- Yunlu Zhu
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
- Hannah Gelnaw
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
- Franziska Auer
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
- Kyla R. Hamling
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
- David E. Ehrlich
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
- David Schoppik
  - Departments of Otolaryngology, Neuroscience & Physiology, and the Neuroscience Institute, New York University Grossman School of Medicine
  - Lead Contact
3. Zwergal A, Grabova D, Schöberl F. Vestibular contribution to spatial orientation and navigation. Curr Opin Neurol 2024; 37:52-58. PMID: 38010039; PMCID: PMC10779452; DOI: 10.1097/wco.0000000000001230.
Abstract
PURPOSE OF REVIEW: The vestibular system provides three-dimensional idiothetic cues for updating one's position in space during head and body movement. Ascending vestibular signals reach entorhinal and hippocampal networks via head-direction pathways, where they converge with multisensory information to tune the place and grid cell code.
RECENT FINDINGS: Animal models have provided insight into the neurobiological consequences of vestibular lesions for the cerebral networks controlling spatial cognition. In recent years, multimodal cerebral imaging combined with behavioural testing of spatial orientation and navigation performance and strategy has helped decipher vestibular-cognitive interactions in humans as well.
SUMMARY: This review updates current knowledge on the anatomical and cellular basis of vestibular contributions to spatial orientation and navigation from a translational perspective (animal and human studies), delineates the behavioural and functional consequences of different vestibular pathologies on these cognitive domains, and lastly speculates on a potential role of vestibular dysfunction in cognitive aging and impending cognitive impairment, in analogy to the well-known effects of hearing loss.
Affiliation(s)
- Andreas Zwergal
  - German Center for Vertigo and Balance Disorders (DSGZ), LMU University Hospital, LMU Munich
  - Department of Neurology, LMU University Hospital, LMU Munich, Munich, Germany
- Denis Grabova
  - German Center for Vertigo and Balance Disorders (DSGZ), LMU University Hospital, LMU Munich
- Florian Schöberl
  - German Center for Vertigo and Balance Disorders (DSGZ), LMU University Hospital, LMU Munich
  - Department of Neurology, LMU University Hospital, LMU Munich, Munich, Germany
4. Liu B, Shan J, Gu Y. Temporal and spatial properties of vestibular signals for perception of self-motion. Front Neurol 2023; 14:1266513. PMID: 37780704; PMCID: PMC10534010; DOI: 10.3389/fneur.2023.1266513.
Abstract
It is well recognized that the vestibular system is involved in numerous important cognitive functions, including self-motion perception, spatial orientation, locomotion, and vector-based navigation, in addition to basic reflexes such as oculomotor and postural control. Consistent with this, vestibular signals are broadly distributed in the brain, including several regions of the cerebral cortex, potentially allowing tight coordination with other sensory systems to improve the accuracy and precision of perception or action during self-motion. Recent neurophysiological studies in animal models at single-cell resolution indicate that vestibular signals exhibit complex spatiotemporal dynamics, making it challenging to identify their exact functions and how they are integrated with signals from other modalities. For example, vestibular and optic flow signals can be congruent or incongruent with respect to spatial tuning functions, reference frames, and temporal dynamics. Comprehensive studies, including behavioral tasks, neural recordings across sensory and sensorimotor association areas, and causal manipulations, have provided some insights into the neural mechanisms underlying multisensory self-motion perception.
Affiliation(s)
- Bingyu Liu
  - Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
  - University of Chinese Academy of Sciences, Beijing, China
- Jiayu Shan
  - Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
  - University of Chinese Academy of Sciences, Beijing, China
- Yong Gu
  - Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, International Center for Primate Brain Research, Chinese Academy of Sciences, Shanghai, China
  - University of Chinese Academy of Sciences, Beijing, China
5. Parra-Barrero E, Vijayabaskaran S, Seabrook E, Wiskott L, Cheng S. A map of spatial navigation for neuroscience. Neurosci Biobehav Rev 2023; 152:105200. PMID: 37178943; DOI: 10.1016/j.neubiorev.2023.105200.
Abstract
Spatial navigation has received much attention from neuroscientists, leading to the identification of key brain areas and the discovery of numerous spatially selective cells. Despite this progress, our understanding of how the pieces fit together to drive behavior is generally lacking. We argue that this is partly caused by insufficient communication between behavioral and neuroscientific researchers, which has led the latter to under-appreciate the relevance and complexity of spatial behavior and to focus too narrowly on characterizing neural representations of space, disconnected from the computations these representations are meant to enable. We therefore propose a taxonomy of navigation processes in mammals that can serve as a common framework for structuring and facilitating interdisciplinary research in the field. Using the taxonomy as a guide, we review behavioral and neural studies of spatial navigation. In doing so, we validate the taxonomy and showcase its usefulness in identifying potential issues with common experimental approaches, designing experiments that adequately target particular behaviors, correctly interpreting neural activity, and pointing to new avenues of research.
Affiliation(s)
- Eloy Parra-Barrero
  - Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
  - International Graduate School of Neuroscience, Ruhr University Bochum, Bochum, Germany
- Sandhiya Vijayabaskaran
  - Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Eddie Seabrook
  - Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Laurenz Wiskott
  - Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
  - International Graduate School of Neuroscience, Ruhr University Bochum, Bochum, Germany
- Sen Cheng
  - Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
  - International Graduate School of Neuroscience, Ruhr University Bochum, Bochum, Germany
6. Sinha AK, Lee C, Holt JC. Elucidating the role of muscarinic acetylcholine receptor (mAChR) signaling in efferent-mediated responses of vestibular afferents in mammals. bioRxiv [Preprint] 2023:2023.07.31.549902. PMID: 37577578; PMCID: PMC10418111; DOI: 10.1101/2023.07.31.549902.
Abstract
The peripheral vestibular system detects head position and movement through activation of vestibular hair cells (HCs) in the vestibular end organs. HCs transmit this information to the CNS by way of primary vestibular afferent neurons. The CNS, in turn, modulates HCs and afferents via the efferent vestibular system (EVS) through activation of cholinergic signaling mechanisms. In mice, we previously demonstrated that activation of muscarinic acetylcholine receptors (mAChRs) during EVS stimulation gives rise to a slow excitation that takes seconds to peak and tens of seconds to decay back to baseline. This slow excitation is mimicked by muscarine and abolished by the non-selective mAChR blockers scopolamine, atropine, and glycopyrrolate. While five distinct mAChRs (M1-M5) exist, the subtype(s) driving EVS-mediated slow excitation remain unidentified, and how these mAChRs alter vestibular function is not well understood. The objective of this study was to characterize which mAChR subtypes drive the EVS-mediated slow excitation and how their activation impacts vestibular physiology and behavior. In C57Bl/6J mice, M3 mAChR antagonists were more potent at blocking slow excitation than M1 mAChR antagonists, while M2/M4 blockers were ineffective. While unchanged in M2/M4 mAChR double-KO mice, EVS-mediated slow excitation in M3 mAChR-KO animals was reduced or absent in irregular afferents but appeared unchanged in regular afferents. In agreement, vestibular sensory-evoked potentials (VsEP), known to be generated predominantly by irregular afferents, were significantly less enhanced by mAChR activation in M3 mAChR-KO mice than in controls. Finally, M3 mAChR-KO mice displayed distinct behavioral phenotypes in open-field activity, thermal profiles, and balance-beam and forced-swim tests. M3 mAChRs mediate efferent-mediated slow excitation in irregular afferents, while M1 mAChRs may drive the same process in regular afferents.
7. Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. PMID: 37380885; DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations of one's spatial location, one's orientation or heading direction, and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representation of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact to enable complex behaviours.
Affiliation(s)
- Aman B Saleem
  - UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK
- Laura Busse
  - Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany
  - Bernstein Centre for Computational Neuroscience Munich, Munich, Germany
8. Ginosar G, Aljadeff J, Las L, Derdikman D, Ulanovsky N. Are grid cells used for navigation? On local metrics, subjective spaces, and black holes. Neuron 2023; 111:1858-1875. PMID: 37044087; DOI: 10.1016/j.neuron.2023.03.027.
Abstract
The symmetric, lattice-like spatial pattern of grid-cell activity is thought to provide a neuronal global metric for space. This view is compatible with grid cells recorded in empty boxes but inconsistent with data from more naturalistic settings. We review evidence arguing against the global-metric notion, including the distortion and disintegration of the grid pattern in complex and three-dimensional environments. We argue that deviations from lattice symmetry are key for understanding grid-cell function. We propose three possible functions for grid cells, which treat real-world grid distortions as a feature rather than a bug. First, grid cells may constitute a local metric for proximal space rather than a global metric for all space. Second, grid cells could form a metric for subjective action-relevant space rather than physical space. Third, distortions may represent salient locations. Finally, we discuss mechanisms that can underlie these functions. These ideas may transform our thinking about grid cells.
Affiliation(s)
- Gily Ginosar
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot 76100, Israel
- Johnatan Aljadeff
  - Department of Neurobiology, University of California, San Diego, La Jolla, CA 92093, USA
- Liora Las
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot 76100, Israel
- Dori Derdikman
  - Department of Neuroscience, Rappaport Faculty of Medicine and Research Institute, Technion, Haifa 31096, Israel
- Nachum Ulanovsky
  - Department of Brain Sciences, Weizmann Institute of Science, Rehovot 76100, Israel
9. Zhu SL, Lakshminarasimhan KJ, Angelaki DE. Computational cross-species views of the hippocampal formation. Hippocampus 2023; 33:586-599. PMID: 37038890; PMCID: PMC10947336; DOI: 10.1002/hipo.23535.
Abstract
The discovery of place cells and head direction cells in the hippocampal formation of freely foraging rodents has led to an emphasis on its role in encoding allocentric spatial relationships. In contrast, studies in head-fixed primates have additionally found representations of spatial views. We review recent experiments in freely moving monkeys that expand upon these findings and show that postural variables such as eye/head movements strongly influence neural activity in the hippocampal formation, suggesting that the function of the hippocampus depends on where the animal looks. We interpret these results in the light of recent studies in humans performing challenging navigation tasks, which suggest that, depending on the context, eye/head movements serve one of two roles: gathering information about the structure of the environment (active sensing) or externalizing the contents of internal beliefs/deliberation (embodied cognition). These findings prompt future experimental investigations into the information carried by signals flowing between the hippocampal formation and the brain regions controlling postural variables, and constitute a basis for updating computational theories of the hippocampal system to accommodate the influence of eye/head movements.
Affiliation(s)
- Seren L Zhu
  - Center for Neural Science, New York University, New York, New York, USA
- Kaushik J Lakshminarasimhan
  - Center for Theoretical Neuroscience, Zuckerman Mind Brain Behavior Institute, Columbia University, New York, New York, USA
- Dora E Angelaki
  - Center for Neural Science, New York University, New York, New York, USA
  - Mechanical and Aerospace Engineering, Tandon School of Engineering, New York University, New York, New York, USA
10. Alexander AS, Place R, Starrett MJ, Chrastil ER, Nitz DA. Rethinking retrosplenial cortex: Perspectives and predictions. Neuron 2023; 111:150-175. PMID: 36460006; DOI: 10.1016/j.neuron.2022.11.006.
Abstract
The last decade has produced exciting new ideas about retrosplenial cortex (RSC) and its role in integrating diverse inputs. Here, we review the diversity in forms of spatial and directional tuning of RSC activity, temporal organization of RSC activity, and features of RSC interconnectivity with other brain structures. We find that RSC anatomy and dynamics are more consistent with roles in multiple sensorimotor and cognitive processes than with any isolated function. However, two more generalized categories of function may best characterize roles for RSC in complex cognitive processes: (1) shifting and relating perspectives for spatial cognition and (2) prediction and error correction for current sensory states with internal representations of the environment. Both functions likely take advantage of RSC's capacity to encode conjunctions among sensory, motor, and spatial mapping information streams. Together, these functions provide the scaffold for intelligent actions, such as navigation, perspective taking, interaction with others, and error detection.
Affiliation(s)
- Andrew S Alexander
  - Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Ryan Place
  - Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
- Michael J Starrett
  - Department of Neurobiology & Behavior, University of California, Irvine, Irvine, CA 92697, USA
- Elizabeth R Chrastil
  - Department of Neurobiology & Behavior, University of California, Irvine, Irvine, CA 92697, USA
  - Department of Cognitive Sciences, University of California, Irvine, Irvine, CA 92697, USA
- Douglas A Nitz
  - Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
11. Fundamental Cause of Bio-Chirality: Space-Time Symmetry—Concept Review. Symmetry (Basel) 2022. DOI: 10.3390/sym15010079.
Abstract
The search for fundamental determinants of biomolecular chirality is a hot topic in biology, clarifying the meaning of evolution and the enigma of life's origin. The question of origin may be resolved by assuming that non-biological and biological entities obey nature's universal laws grounded in space-time symmetry (STS) and space-time relativity (SPR). The fabric of STS is our review's primary subject. This symmetry, encompassing the behavior of elementary particles and galaxy structure, imposes its fundamental laws on all hierarchical levels of the biological world. From the perspective of STS, objects across spatial scales may be classified as chiral or achiral with respect to a specific space-related symmetry transformation: mirror reflection. A chiral object is not identical (i.e., not superimposable) to its mirror image. Geometry distinguishes two kinds of chiral objects: the first has no reflective symmetry elements (a point or plane of symmetry) but may have rotational symmetry axes (dissymmetry); the second has no symmetry elements at all (asymmetry). As a deficiency of form symmetry, chirality is a critical structural feature of natural systems, including subatomic particles and living matter. According to the Standard Model (SM) and String Theory (StrT), the elementary particles associated with the four fundamental forces of nature determine the existence of the micro- and galaxy scales of nature. Therefore, the inheritance of molecular symmetry from the symmetry of elementary particles indicates a bi-directional, internal (micro-scale) and external (galaxy-scale), causal pathway of prevalent bio-chirality. We assume that the laws of the physical world impact the appearance of biological matter through both extremities of spatial dimensions. An extended network of multi-disciplinary experimental evidence supports this hypothesis.
However, many experimental results are derived and interpreted from a narrow perspective using highly specific terminology. The current review promotes a holistic approach to experimental results in two fast-developing, seemingly unrelated, divergent branches of STS and biological chirality. A generalized view of the origin of prevalent biomolecular chirality is necessary for understanding the link between a diverse range of biological events. The chain of chirality transfer links ribosomal protein synthesis, cell morphology, and neuronal signaling with the laterality of cognitive functions.
12. Hagbi Z, Segev E, Eilam D. Keep a level head to know the way ahead: How rodents travel on inclined surfaces? iScience 2022; 25:104424. PMID: 35663016; PMCID: PMC9157226; DOI: 10.1016/j.isci.2022.104424.
Abstract
Animals traveling on a horizontal surface stabilize their head in relation to the substrate in order to gather spatial information and orient. What, however, do they do when traveling on an incline? We examined how three rodent species differing in motor abilities and habitats explore a platform tilted at 0°–90°, hypothesizing that they would attempt to maintain bilateral vestibular cues. We found that traveling up or down was mainly straight vertical rather than diagonal, which yields identical bilateral vestibular cues. The same was achieved when traveling horizontally by rotating the head to parallel the horizontal plane. Traveling diagonally up or down was avoided, perhaps due to differing bilateral vestibular cues that could hinder orientation. Accordingly, we suggest that maintaining identical bilateral cues is an orientational necessity that overrides differences in motor abilities and habitats, and that this necessity is a general characteristic of animals in motion.
Highlights:
- Three rodent species were tested on a platform inclined at 0°–90°
- Increased inclination results in traveling straight vertically or horizontally
- Both these trajectory shapes feature a horizontally leveled head
- We suggest that such posture is required for spatial orientation when in motion
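The bilateral-cue argument admits a back-of-the-envelope geometric check. This is an illustrative sketch under my own parametrization of the slope and heading (it is not the authors' analysis): for a head aligned with the travel direction on a plane inclined by alpha, the left-right gravity component vanishes for straight up/down travel and grows with heading obliqueness.

```python
import numpy as np

# Illustrative geometry only: lateral (left-right) gravity component sensed
# by a head aligned with the travel direction on a surface inclined by
# alpha, as a function of in-plane heading phi
# (phi = 0: straight up/down the slope; phi = 90 deg: horizontal traverse).

def lateral_gravity(alpha_deg, phi_deg, g=9.81):
    alpha, phi = np.radians(alpha_deg), np.radians(phi_deg)
    # Component of gravity along the in-plane axis perpendicular to travel.
    return g * np.sin(alpha) * np.sin(phi)

slope = 45.0
headings = np.array([0.0, 30.0, 60.0, 90.0])
imbalance = np.array([lateral_gravity(slope, p) for p in headings])

# Straight-vertical travel (phi = 0) yields zero left-right imbalance,
# i.e., identical bilateral vestibular cues, with no head rotation needed.
assert abs(imbalance[0]) < 1e-12
# Oblique headings produce a growing imbalance; per the study, diagonal
# travel was avoided, while horizontal traverses were accompanied by
# rolling the head to parallel the horizontal plane, nulling the imbalance.
assert np.all(np.diff(imbalance) > 0)
```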
Affiliation(s)
- Zohar Hagbi
  - School of Zoology, George S. Wise Faculty of Life Sciences, Tel-Aviv University, Israel
- Elad Segev
  - Department of Applied Mathematics, Holon Institute of Technology, Holon, Israel
- David Eilam
  - School of Zoology, George S. Wise Faculty of Life Sciences, Tel-Aviv University, Israel
13. Schneider A, Zimmermann C, Alyahyay M, Steenbergen F, Brox T, Diester I. 3D pose estimation enables virtual head fixation in freely moving rats. Neuron 2022; 110:2080-2093.e10. DOI: 10.1016/j.neuron.2022.04.019.
14. Ebbesen CL, Froemke RC. Automatic mapping of multiplexed social receptive fields by deep learning and GPU-accelerated 3D videography. Nat Commun 2022; 13:593. PMID: 35105858; PMCID: PMC8807631; DOI: 10.1038/s41467-022-28153-7.
Abstract
Social interactions powerfully impact the brain and the body, but high-resolution descriptions of these important physical interactions and their neural correlates are lacking. Currently, most studies rely on labor-intensive methods such as manual annotation. Scalable and objective tracking methods are required to understand the neural circuits underlying social behavior. Here we describe a hardware/software system and analysis pipeline that combines 3D videography, deep learning, physical modeling, and GPU-accelerated robust optimization, with automatic analysis of neuronal receptive fields recorded in interacting mice. Our system ("3DDD Social Mouse Tracker") is capable of fully automatic multi-animal tracking with minimal errors (including in complete darkness) during complex, spontaneous social encounters, together with simultaneous electrophysiological recordings. We capture posture dynamics of multiple unmarked mice with high spatiotemporal precision (~2 mm, 60 frames/s). A statistical model that relates 3D behavior and neural activity reveals multiplexed 'social receptive fields' of neurons in barrel cortex. Our approach could be broadly useful for neurobehavioral studies of multiple animals interacting in complex low-light environments.
Affiliation(s)
- Christian L Ebbesen
  - Skirball Institute of Biomolecular Medicine, New York University School of Medicine, New York, NY, 10016, USA
  - Neuroscience Institute, New York University School of Medicine, New York, NY, 10016, USA
  - Department of Otolaryngology, New York University School of Medicine, New York, NY, 10016, USA
  - Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA
  - Center for Neural Science, New York University, New York, NY, 10003, USA
- Robert C Froemke
  - Skirball Institute of Biomolecular Medicine, New York University School of Medicine, New York, NY, 10016, USA
  - Neuroscience Institute, New York University School of Medicine, New York, NY, 10016, USA
  - Department of Otolaryngology, New York University School of Medicine, New York, NY, 10016, USA
  - Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA
  - Center for Neural Science, New York University, New York, NY, 10003, USA
15. Hennestad E, Witoelar A, Chambers AR, Vervaeke K. Mapping vestibular and visual contributions to angular head velocity tuning in the cortex. Cell Rep 2021; 37:110134. PMID: 34936869; PMCID: PMC8721284; DOI: 10.1016/j.celrep.2021.110134.
Abstract
Neurons that signal the angular velocity of head movements (AHV cells) are important for processing visual and spatial information. However, it has been challenging to isolate the sensory modality that drives them and to map their cortical distribution. To address this, we develop a method that enables rotating awake, head-fixed mice under a two-photon microscope in a visual environment. Starting in layer 2/3 of the retrosplenial cortex, a key area for vision and navigation, we find that 10% of neurons report angular head velocity (AHV). Their tuning properties depend on vestibular input with a smaller contribution of vision at lower speeds. Mapping the spatial extent, we find AHV cells in all cortical areas that we explored, including motor, somatosensory, visual, and posterior parietal cortex. Notably, the vestibular and visual contributions to AHV are area dependent. Thus, many cortical circuits have access to AHV, enabling a diverse integration with sensorimotor and cognitive information.
Affiliation(s)
- Eivind Hennestad: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Aree Witoelar: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Anna R Chambers: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Koen Vervaeke: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
16
Delle Monache S, Indovina I, Zago M, Daprati E, Lacquaniti F, Bosco G. Watching the Effects of Gravity. Vestibular Cortex and the Neural Representation of "Visual" Gravity. Front Integr Neurosci 2021; 15:793634. PMID: 34924968; PMCID: PMC8671301; DOI: 10.3389/fnint.2021.793634.
Abstract
Gravity is a physical constraint all terrestrial species have adapted to through evolution. Indeed, gravity effects are taken into account in many forms of interaction with the environment, from the seemingly simple task of maintaining balance to the complex motor skills performed by athletes and dancers. Graviceptors, primarily located in the vestibular otolith organs, feed the Central Nervous System with information related to the gravity acceleration vector. This information is integrated with signals from the semicircular canals, vision, and proprioception in an ensemble of interconnected brain areas, including the vestibular nuclei, cerebellum, thalamus, insula, retroinsula, parietal operculum, and temporo-parietal junction, in the so-called vestibular network. Classical views consider this stage of multisensory integration as instrumental to sort out conflicting and/or ambiguous information from the incoming sensory signals. However, there is compelling evidence that it also contributes to an internal representation of gravity effects based on prior experience with the environment. This a priori knowledge could be engaged by various types of information, including sensory signals like the visual ones, which lack a direct correspondence with physical gravity. Indeed, the retinal accelerations elicited by gravitational motion in a visual scene are not invariant, but scale with viewing distance. Moreover, the "visual" gravity vector may not be aligned with physical gravity, as when we watch a scene on a tilted monitor or in weightlessness. This review discusses experimental evidence from behavioral, neuroimaging (connectomics, fMRI, TMS), and patient studies supporting the idea that the internal model estimating the effects of gravity on visual objects is constructed by transforming the vestibular estimates of physical gravity, which are computed in the brainstem and cerebellum, into internalized estimates of virtual gravity, stored in the vestibular cortex. The integration of this internal model of gravity with visual and non-visual signals would take place at multiple levels in the cortex and might involve recurrent connections between early visual areas engaged in the analysis of spatio-temporal features of the visual stimuli and higher visual areas in temporo-parietal-insular regions.
Affiliation(s)
- Sergio Delle Monache: UniCamillus—Saint Camillus International University of Health Sciences, Rome, Italy; Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy
- Iole Indovina: Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy; Department of Biomedical and Dental Sciences and Morphofunctional Imaging, University of Messina, Messina, Italy
- Myrka Zago: Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy; Center for Space Biomedicine and Department of Civil and Computer Engineering, University of Rome “Tor Vergata”, Rome, Italy
- Elena Daprati: Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy; Center for Space Biomedicine and Department of Systems Medicine, University of Rome “Tor Vergata”, Rome, Italy
- Francesco Lacquaniti: Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy; Center for Space Biomedicine and Department of Systems Medicine, University of Rome “Tor Vergata”, Rome, Italy
- Gianfranco Bosco: Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy; Center for Space Biomedicine and Department of Systems Medicine, University of Rome “Tor Vergata”, Rome, Italy
17
Mao D, Avila E, Caziot B, Laurens J, Dickman JD, Angelaki DE. Spatial modulation of hippocampal activity in freely moving macaques. Neuron 2021; 109:3521-3534.e6. PMID: 34644546; DOI: 10.1016/j.neuron.2021.09.032.
Abstract
The hippocampal formation is linked to spatial navigation, but there is little corroborating evidence from freely moving primates with concurrent monitoring of head and gaze. We recorded neural activity across hippocampal regions in rhesus macaques during free foraging in an open environment while tracking their head and eyes. Theta activity was intermittently present at movement onset and modulated by saccades. Many neurons were phase-locked to theta, with few showing phase precession. Most neurons encoded a mixture of spatial variables beyond place and grid tuning. Spatial representations were dominated by facing location and allocentric direction, mostly in head, rather than gaze, coordinates. Importantly, eye movements strongly modulated neural activity in all regions. These findings reveal that the macaque hippocampal formation represents three-dimensional (3D) space using a multiplexed code, with head orientation and eye movement properties being dominant during free exploration.
Affiliation(s)
- Dun Mao: Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
- Eric Avila: Center for Neural Science, New York University, New York, NY 10003, USA
- Baptiste Caziot: Center for Neural Science, New York University, New York, NY 10003, USA
- Jean Laurens: Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Frankfurt, Germany
- J David Dickman: Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
- Dora E Angelaki: Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Tandon School of Engineering, New York University, New York, NY 11201, USA
18
Kobel MJ, Wagner AR, Merfeld DM. Impact of gravity on the perception of linear motion. J Neurophysiol 2021; 126:875-887. PMID: 34320866; PMCID: PMC8461827; DOI: 10.1152/jn.00274.2021.
Abstract
Accurate perception of gravity and translation is fundamental for balance, navigation, and motor control. Previous studies have reported that perceptual thresholds for earth-vertical (i.e., parallel to gravity) and earth-horizontal (i.e., perpendicular to gravity) translations are equivalent in healthy adults, suggesting that the nervous system compensates for the presence of gravity. However, past study designs were not able to fully separate the effect of gravity from the potential effects of motion direction and body orientation. To quantify the effect of gravity on translation perception relative to these alternative factors, we measured vestibular perceptual thresholds for three motion directions (inter-aural, naso-occipital, and superior-inferior) and three body orientations (upright, supine, and ear-down). In contrast to prior reports, our data suggest that the nervous system does not universally compensate for the effects of gravity during translation; instead, we show that the colinear effect of gravity significantly decreases sensitivity to stimuli for motions sensed by the utricles (inter-aural and naso-occipital translations), whereas this effect was not significant for motions sensed by the saccules (superior-inferior translations). We also identified increased thresholds for superior-inferior translation, suggesting decreased sensitivity to motions sensed predominantly by the saccule. An overall effect of body orientation on perception was seen; however, post hoc analyses suggest that this orientation effect may reflect the impact of gravity on self-motion perception. Overall, our data provide fundamental insights into the manner by which the nervous system processes vestibular self-motion cues, showing that the effect of gravity on translation perception depends on the direction of motion.
NEW & NOTEWORTHY: Perception of gravity and translation is fundamental for self-motion perception, balance, and motor control. The central nervous system must accurately disambiguate peripheral otolith signals encoding both linear acceleration and gravity. In contrast to past reports, we show that perception of translation depends on both motion relative to gravity and motion relative to the head. These results provide fundamental insights into otolith-mediated perception and suggest that the nervous system must compensate for the presence of gravity.
Affiliation(s)
- Megan J Kobel: Department of Otolaryngology-Head & Neck Surgery, Ohio State University Wexner Medical Center, Columbus, Ohio; Department of Speech and Hearing Science, Ohio State University, Columbus, Ohio
- Andrew R Wagner: Department of Otolaryngology-Head & Neck Surgery, Ohio State University Wexner Medical Center, Columbus, Ohio; Health and Rehabilitation Sciences, Ohio State University, Columbus, Ohio
- Daniel M Merfeld: Department of Otolaryngology-Head & Neck Surgery, Ohio State University Wexner Medical Center, Columbus, Ohio; Department of Speech and Hearing Science, Ohio State University, Columbus, Ohio; Health and Rehabilitation Sciences, Ohio State University, Columbus, Ohio; Department of Biomedical Engineering, Ohio State University, Columbus, Ohio
19
Nedelkou A, Hatzitaki V, Chatzinikolaou K, Grouios G. Does somatosensory feedback from the plantar foot sole contribute to verticality perception? Somatosens Mot Res 2021; 38:214-222. PMID: 34256655; DOI: 10.1080/08990220.2021.1949977.
Abstract
AIM OF THE STUDY: In upright standing, the human foot sole is the only point of contact with the ground, conveying information about the pressure distribution under the feet. We examined how altered somatosensory input from the plantar foot receptors, when standing on a soft surface, affects the subjective estimation of the earth vertical in different sensory contexts.
MATERIALS AND METHODS: Twelve (12) healthy young females (mean age: 21.8 ± 2.4 years) adjusted the orientation of a visual line (35 × 1.5 cm) representing the roll orientation of a hand-held (attached to a 24.9 × 4 cm cylinder) or head-attached electromagnetic tracking sensor (Nest of Birds, Ascension Technologies Inc., VT, USA; 60 Hz) under two visual conditions (eyes open, eyes closed) while standing on a soft or firm surface. The mean absolute (accuracy) and variable (precision) error of the verticality estimate was derived from the sensor's roll deviation from the gravitational vertical.
RESULTS: The accuracy and the precision of the estimate decreased in the absence of vision, while standing on the soft surface, and when the estimate was provided by an active hand rather than head rotation. The surface effect was significant only in the absence of vision and when the estimate was provided by the hand.
CONCLUSIONS: The contribution of the plantar foot mechanoreceptors to gravity perception is sensory-context dependent. Perception of the earth vertical is more accurate when estimated by active head rotation, owing to the integration of vestibular and neck proprioceptive afferents.
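As a concrete reading of the two error measures above, accuracy and precision can be computed directly from the recorded roll deviations. This is an illustrative sketch, not the authors' analysis code; the function name and sample values are invented:

```python
import statistics

def verticality_errors(roll_deviations_deg):
    """Summarize subjective-vertical settings given as roll deviations
    (degrees) from the gravitational vertical (0 deg).

    accuracy  = mean absolute error of the settings
    precision = variability (sample standard deviation) of the settings
    """
    accuracy = sum(abs(r) for r in roll_deviations_deg) / len(roll_deviations_deg)
    precision = statistics.stdev(roll_deviations_deg)
    return accuracy, precision

# Example: settings scattered symmetrically around the true vertical are
# imprecise (high variable error) even though their mean is unbiased.
acc, prec = verticality_errors([2.0, -2.0, 2.0, -2.0])
```

Note that the two measures dissociate: a constant tilt bias hurts accuracy but not precision, while symmetric scatter does the opposite.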
Affiliation(s)
- A Nedelkou: Laboratory of Motor Behavior and Adapted Physical Activity, Aristotle University of Thessaloniki, Thessaloniki, Greece
- V Hatzitaki: Laboratory of Motor Behavior and Adapted Physical Activity, Aristotle University of Thessaloniki, Thessaloniki, Greece
- K Chatzinikolaou: Laboratory of Motor Behavior and Adapted Physical Activity, Aristotle University of Thessaloniki, Thessaloniki, Greece
- G Grouios: Laboratory of Motor Behavior and Adapted Physical Activity, Aristotle University of Thessaloniki, Thessaloniki, Greece
20
Perry BAL, Lomi E, Mitchell AS. Thalamocortical interactions in cognition and disease: the mediodorsal and anterior thalamic nuclei. Neurosci Biobehav Rev 2021; 130:162-177. PMID: 34216651; DOI: 10.1016/j.neubiorev.2021.05.032.
Abstract
The mediodorsal thalamus (MD) and anterior thalamic nuclei (ATN) are two adjacent brain nodes that support our ability to make decisions, learn, update information, form and retrieve memories, and find our way around. The MD and prefrontal cortex (PFC) work in partnership to support cognitive processes linked to successful learning and decision-making, while the ATN and extended hippocampal system together coordinate the encoding and retrieval of memories and successful spatial navigation. Yet, while these roles may appear segregated, the MD and ATN together support our higher cognitive functions as they regulate, and are influenced by, interconnected fronto-temporal neural networks and subcortical inputs. Our review focuses on recent studies in animal models and in humans. This evidence is re-shaping our understanding of the importance of MD and ATN cortico-thalamocortical pathways in complex cognitive functions. Given the evidence from clinical settings and neuroscience research labs, the MD and ATN should be considered targets for effective treatments in neuropsychiatric diseases, disorders, and neurodegeneration.
Affiliation(s)
- Brook A L Perry: Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
- Eleonora Lomi: Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
- Anna S Mitchell: Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
21
Ebbesen CL, Froemke RC. Body language signals for rodent social communication. Curr Opin Neurobiol 2021; 68:91-106. PMID: 33582455; PMCID: PMC8243782; DOI: 10.1016/j.conb.2021.01.008.
Abstract
Integration of social cues to initiate adaptive emotional and behavioral responses is a fundamental aspect of animal and human behavior. In humans, social communication includes prominent nonverbal components, such as social touch, gestures, and facial expressions. Comparative studies investigating the neural basis of social communication in rodents have historically centered on olfactory signals and vocalizations, with relatively less focus on nonverbal social cues. Here, we outline two exciting research directions. First, we review recent observations pointing to a role of social facial expressions in rodents. Second, we review observations that point to a role of 'non-canonical' rodent body language: body posture signals beyond stereotyped displays in aggressive and sexual behavior. In both sections, we outline how social neuroscience can build on recent advances in machine learning, robotics, and micro-engineering to push these research directions forward, towards a holistic systems neurobiology of rodent body language.
Affiliation(s)
- Christian L Ebbesen: Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY 10016, USA; Center for Neural Science, New York University, New York, NY 10003, USA
- Robert C Froemke: Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY 10016, USA; Center for Neural Science, New York University, New York, NY 10003, USA; Howard Hughes Medical Institute Faculty Scholar, USA
22
Wang Y, Xu X, Wang R. Modeling the grid cell activity on non-horizontal surfaces based on oscillatory interference modulated by gravity. Neural Netw 2021; 141:199-210. PMID: 33915445; DOI: 10.1016/j.neunet.2021.04.009.
Abstract
Internal representation of space is a fundamental and crucial function of the animal's brain. Grid cells in the medial entorhinal cortex are thought to provide an environment-invariant metric system for the navigation of the animal. Most experimental and theoretical studies have focused on the horizontal planar codes of grid cells, while how this metric coordinate system is configured in actual three-dimensional space remains unclear. Evidence has implied that spatial cognition may not be fully volumetric. We propose an oscillatory interference model with a novel gravity and body-plane modulation to simulate grid cell activity in complex space for rodents. The animal can perceive the rotation of its body plane along the local surface by sensing gravity, which modulates the dendritic oscillations. The results not only reproduce the firing patterns of grid cells recorded in previous experiments, but also predict the grid codes in novel environments. The model further demonstrates that the gravity signal is indispensable for the animal's navigation, and supports the hypothesis that the periodic firing of the grid cell is intrinsically not a volumetric code in three-dimensional space. This provides new insights into understanding the spatial representation of the actual world in the brain.
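For readers unfamiliar with oscillatory interference, the planar core of such a model can be sketched in a few lines. This is a generic, Burgess-style illustration under assumed parameters (BETA sets the grid scale), not the authors' gravity-modulated model; their extension, which rotates the oscillators' preferred directions with the body plane via sensed gravity, is only noted in the comments:

```python
import numpy as np

BETA = 0.05  # cycles per cm along a preferred direction; sets grid scale (illustrative)

def band(pos_cm, pref_dir_rad):
    """Interference band of one velocity-controlled oscillator: its phase
    lead over the somatic theta rhythm equals BETA times the animal's
    displacement along the oscillator's preferred direction."""
    proj = pos_cm[..., 0] * np.cos(pref_dir_rad) + pos_cm[..., 1] * np.sin(pref_dir_rad)
    return np.cos(2.0 * np.pi * BETA * proj)

def grid_rate(pos_cm):
    """Thresholded product of three bands 60 degrees apart yields a
    hexagonal, grid-cell-like firing map on a horizontal plane. In the
    paper's gravity-modulated extension, the sensed gravity vector would
    rotate these preferred directions with the animal's body plane on
    non-horizontal surfaces (not implemented here)."""
    g = np.prod([band(pos_cm, np.deg2rad(a)) for a in (0.0, 60.0, 120.0)], axis=0)
    return np.maximum(g, 0.0)
```

With BETA = 0.05 the firing fields repeat every 20 cm along each of the three band directions, producing a hexagonal lattice of peaks.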
Affiliation(s)
- Yihong Wang: Institute for Cognitive Neurodynamics, East China University of Science and Technology, China; Mathematics Department, East China University of Science and Technology, China
- Xuying Xu: Institute for Cognitive Neurodynamics, East China University of Science and Technology, China; Mathematics Department, East China University of Science and Technology, China
- Rubin Wang: Institute for Cognitive Neurodynamics, East China University of Science and Technology, China; Computer and Software School, Hangzhou Dianzi University, China
23
Liu B, Tian Q, Gu Y. Robust vestibular self-motion signals in macaque posterior cingulate region. eLife 2021; 10:e64569. PMID: 33827753; PMCID: PMC8032402; DOI: 10.7554/eLife.64569.
Abstract
Self-motion signals, distributed ubiquitously across the parietal-temporal lobes, propagate to the limbic hippocampal system for vector-based navigation via hubs including the posterior cingulate cortex (PCC) and retrosplenial cortex (RSC). Although numerous studies have indicated that posterior cingulate areas are involved in spatial tasks, it is unclear how their neurons represent self-motion signals. Providing translation and rotation stimuli to macaques on a 6-degree-of-freedom motion platform, we discovered robust vestibular responses in PCC. A combined three-dimensional spatiotemporal model captured the data well and revealed multiple temporal components including velocity, acceleration, jerk, and position. Compared with PCC, RSC contained moderate vestibular temporal modulations and lacked significant spatial tuning. Visual self-motion signals were much weaker in both regions compared with the vestibular signals. We conclude that the macaque posterior cingulate region carries vestibular-dominant self-motion signals with plentiful temporal components that could be useful for path integration.
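The idea of decomposing a neural response into temporal components (position, velocity, acceleration, jerk of the motion stimulus) can be illustrated with a plain linear regression. This is a generic sketch of the approach, not the paper's combined 3D spatiotemporal model; all names, weights, and sampling values are invented:

```python
import numpy as np

def temporal_component_fit(stimulus_pos, rate, dt):
    """Regress a firing rate onto the position, velocity, acceleration,
    and jerk of a motion stimulus (plus a constant), returning the
    least-squares weight of each temporal component."""
    vel = np.gradient(stimulus_pos, dt)
    acc = np.gradient(vel, dt)
    jerk = np.gradient(acc, dt)
    X = np.column_stack([np.ones_like(stimulus_pos), stimulus_pos, vel, acc, jerk])
    weights, *_ = np.linalg.lstsq(X, rate, rcond=None)
    return weights  # [baseline, position, velocity, acceleration, jerk]

# Synthetic check: a rate built from velocity and acceleration only.
dt = 0.01
t = np.arange(0.0, 2.0, dt)
pos = np.sin(2 * np.pi * t) + 0.5 * np.sin(2 * np.pi * 2.3 * t)
vel = np.gradient(pos, dt)
acc = np.gradient(vel, dt)
rate = 5.0 + 2.0 * vel + 0.5 * acc
w = temporal_component_fit(pos, rate, dt)
```

A cell dominated by the velocity weight would be a classic velocity-coding neuron; mixed weights correspond to the multiplexed temporal coding described in the abstract.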
Affiliation(s)
- Bingyu Liu: CAS Center for Excellence in Brain Science and Intelligence Technology, Key Laboratory of Primate Neurobiology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China; University of Chinese Academy of Sciences, Beijing, China
- Qingyang Tian: CAS Center for Excellence in Brain Science and Intelligence Technology, Key Laboratory of Primate Neurobiology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China; University of Chinese Academy of Sciences, Beijing, China
- Yong Gu: CAS Center for Excellence in Brain Science and Intelligence Technology, Key Laboratory of Primate Neurobiology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China; University of Chinese Academy of Sciences, Beijing, China
24
Mallory CS, Hardcastle K, Campbell MG, Attinger A, Low IIC, Raymond JL, Giocomo LM. Mouse entorhinal cortex encodes a diverse repertoire of self-motion signals. Nat Commun 2021; 12:671. PMID: 33510164; PMCID: PMC7844029; DOI: 10.1038/s41467-021-20936-8.
Abstract
Neural circuits generate representations of the external world from multiple information streams. The navigation system provides an exceptional lens through which we may gain insights about how such computations are implemented. Neural circuits in the medial temporal lobe construct a map-like representation of space that supports navigation. This computation integrates multiple sensory cues, and, in addition, is thought to require cues related to the individual's movement through the environment. Here, we identify multiple self-motion signals, related to the position and velocity of the head and eyes, encoded by neurons in a key node of the navigation circuitry of mice, the medial entorhinal cortex (MEC). The representation of these signals is highly integrated with other cues in individual neurons. Such information could be used to compute the allocentric location of landmarks from visual cues and to generate internal representations of space.
Affiliation(s)
- Caitlin S Mallory: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Kiah Hardcastle: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Malcolm G Campbell: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Alexander Attinger: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Isabel I C Low: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Jennifer L Raymond: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Lisa M Giocomo: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
25
Time Course of Sensory Substitution for Gravity Sensing in Visual Vertical Orientation Perception following Complete Vestibular Loss. eNeuro 2020; 7:ENEURO.0021-20.2020. PMID: 32561572; PMCID: PMC7358335; DOI: 10.1523/ENEURO.0021-20.2020.
Abstract
Loss of vestibular function causes severe acute symptoms of dizziness and disorientation, yet the brain can adapt and regain near-normal locomotor and orientation function through sensory substitution. Animal studies quantifying functional recovery have so far been limited to reflexive eye movements. Here, we studied the interplay between vestibular and proprioceptive graviception in macaque monkeys trained in an earth-vertical visual orientation (subjective visual vertical; SVV) task and measured the time course of sensory substitution for gravity perception following complete bilateral vestibular loss (BVL). Graviceptive gain, defined as the ratio of perceived versus actual tilt angle, decreased to 20% immediately following labyrinthectomy, and recovered to nearly prelesion levels with a time constant of approximately three weeks of postsurgery testing. We conclude that proprioception accounts for up to 20% of gravity sensing in normal animals, and is re-weighted after vestibular loss to fully support perceptual graviception. We show that these results can be accounted for by an optimal sensory fusion model.
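A minimal sketch of the weighted-fusion account described here: the 20% proprioceptive share and the roughly three-week time constant are taken from the abstract, while the exponential re-weighting form and all names are illustrative assumptions, not the authors' model code:

```python
import math

PROP_WEIGHT = 0.2  # proprioceptive share of graviception in intact animals (from the abstract)
TAU_DAYS = 21.0    # approximate recovery time constant (~3 weeks, from the abstract)

def perceived_tilt(true_tilt_deg, w_vest, vestibular_intact=True):
    """Weighted fusion of vestibular and proprioceptive tilt estimates;
    after labyrinthectomy the vestibular channel contributes nothing."""
    theta_vest = true_tilt_deg if vestibular_intact else 0.0
    theta_prop = true_tilt_deg
    return w_vest * theta_vest + (1.0 - w_vest) * theta_prop

def graviceptive_gain(days_post_lesion):
    """Gain (perceived / actual tilt) after bilateral vestibular loss:
    the stale vestibular weight decays as the proprioceptive cue is
    up-weighted toward 1, so the gain recovers from 0.2 toward 1."""
    w_vest = (1.0 - PROP_WEIGHT) * math.exp(-days_post_lesion / TAU_DAYS)
    return 1.0 - w_vest
```

Immediately after the lesion the gain equals the old proprioceptive weight (0.2, matching the reported drop to 20%), and with re-weighting it asymptotes back to 1.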
26
Taube JS, Shinder ME. On the absence or presence of 3D tuned head direction cells in rats: a review and rebuttal. J Neurophysiol 2020; 123:1808-1827. PMID: 32208877; PMCID: PMC8086636; DOI: 10.1152/jn.00475.2019.
Abstract
A major question in the field of spatial cognition is how animals represent three-dimensional (3D) space. Different results have been obtained across various species and may depend on whether the species inhabits a 3D environment or is terrestrial (land dwelling). The head direction (HD) cell system is an attractive candidate to study in terms of 3D representations. HD cells fire as a function of the animal's directional heading in the horizontal plane, independent of the animal's location and on-going behavior. Another issue concerns whether HD cells are tuned in 3D space or tuned to the 2D horizontal plane. Shinder and Taube (Shinder ME, Taube JS. J Neurophysiol 121: 4-37, 2019) addressed this issue by manipulating a rat's orientation in 3D space while monitoring responses from classic HD cells in the rat anterodorsal thalamus. They reported that HD cells did not display conjunctive firing with pitch or roll orientations, that direction-specific firing was primarily derived from horizontal semicircular canal information, and that the gravity vector played an important role in influencing the cell's firing rate and its preferred firing direction. Laurens and Angelaki (Laurens J, Angelaki DE. J Neurophysiol 122: 1274-1287, 2019) challenged this view by performing a mathematical analysis of the Shinder and Taube data and concluded that they would not have seen 3D tuning based on their experimental approach. We provide a historical review of these issues, followed by a summary of the experiments, which includes additional analyses. We then define what it means for an HD cell to be tuned in 3D and finish by rebutting the reanalyses performed by Laurens and Angelaki.
Affiliation(s)
- Jeffrey S Taube: Department of Psychological & Brain Sciences, Dartmouth College, Hanover, New Hampshire
- Michael E Shinder: Department of Psychological & Brain Sciences, Dartmouth College, Hanover, New Hampshire
27
Alexander AS, Robinson JC, Dannenberg H, Kinsky NR, Levy SJ, Mau W, Chapman GW, Sullivan DW, Hasselmo ME. Neurophysiological coding of space and time in the hippocampus, entorhinal cortex, and retrosplenial cortex. Brain Neurosci Adv 2020; 4:2398212820972871. PMID: 33294626; PMCID: PMC7708714; DOI: 10.1177/2398212820972871.
Abstract
Neurophysiological recordings in behaving rodents demonstrate neuronal response properties that may code space and time for episodic memory and goal-directed behaviour. Here, we review recordings from hippocampus, entorhinal cortex, and retrosplenial cortex to address the problem of how neurons encode multiple overlapping spatiotemporal trajectories and disambiguate these for accurate memory-guided behaviour. The solution could involve neurons in the entorhinal cortex and hippocampus that show mixed selectivity, coding both time and location. Some grid cells and place cells that code space also respond selectively as time cells, allowing differentiation of time intervals when a rat runs in the same location during a delay period. Cells in these regions also develop new representations that differentially code the context of prior or future behaviour allowing disambiguation of overlapping trajectories. Spiking activity is also modulated by running speed and head direction, supporting the coding of episodic memory not as a series of snapshots but as a trajectory that can also be distinguished on the basis of speed and direction. Recent data also address the mechanisms by which sensory input could distinguish different spatial locations. Changes in firing rate reflect running speed on long but not short time intervals, and few cells code movement direction, arguing against path integration for coding location. Instead, new evidence for neural coding of environmental boundaries in egocentric coordinates fits with a modelling framework in which egocentric coding of barriers combined with head direction generates distinct allocentric coding of location. The egocentric input can be used both for coding the location of spatiotemporal trajectories and for retrieving specific viewpoints of the environment. 
Overall, these different patterns of neural activity can be used for encoding and disambiguation of prior episodic spatiotemporal trajectories or for planning of future goal-directed spatiotemporal trajectories.
Affiliation(s)
- Samuel J. Levy: Center for Systems Neuroscience, Boston University, Boston, MA, USA
- William Mau: Center for Systems Neuroscience, Boston University, Boston, MA, USA