1. Buckley M, McGregor A, Ihssen N, Austen J, Thurlbeck S, Smith SP, Heinecke A, Lew AR. The well-worn route revisited: Striatal and hippocampal system contributions to familiar route navigation. Hippocampus 2024; 34:310-326. PMID: 38721743. DOI: 10.1002/hipo.23607.
Abstract
Classic research has shown a division in the neuroanatomical structures that support flexible (e.g., short-cutting) and habitual (e.g., familiar route following) navigational behavior, with hippocampal-caudate systems associated with the former and putamen systems with the latter. There is, however, disagreement about whether the neural structures involved in navigation process particular forms of spatial information, such as associations between constellations of cues forming a cognitive map, versus single landmark-action associations, or alternatively, perform particular reinforcement learning algorithms that allow the use of different spatial strategies, so-called model-based (flexible) or model-free (habitual) forms of learning. We sought to test these theories by asking participants (N = 24) to navigate within a virtual environment through a previously learned, 9-junction route with distinctive landmarks at each junction while undergoing functional magnetic resonance imaging (fMRI). In a series of probe trials, we distinguished knowledge of individual landmark-action associations along the route versus knowledge of the correct sequence of landmark-action associations, either by having absent landmarks, or "out-of-sequence" landmarks. Under a map-based perspective, sequence knowledge would not require hippocampal systems, because there are no constellations of cues available for cognitive map formation. Within a learning-based model, however, responding based on knowledge of sequence would require hippocampal systems because prior context has to be utilized. We found that hippocampal-caudate systems were more active in probes requiring sequence knowledge, supporting the learning-based model. However, we also found greater putamen activation in probes where navigation based purely on sequence memory could be planned, supporting models of putamen function that emphasize its role in action sequencing.
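The contrast the probe trials exploit can be sketched as two lookup schemes (a toy illustration with invented landmark names, not the study's task code): a habitual policy needs only the currently visible landmark, whereas a sequence-based policy conditions on position in the learned route, so it tolerates an absent landmark and can flag an out-of-sequence one.

```python
# Toy 3-junction route; each landmark cues a turn.
route = ["cafe", "statue", "fountain"]
turns = {"cafe": "left", "statue": "right", "fountain": "left"}

def habitual_policy(landmark):
    """Single landmark-action association: respond to whatever is visible."""
    return turns.get(landmark)

def sequence_policy(junction_index, landmark):
    """Sequence knowledge: the expected landmark at this point in the
    route drives the response, so navigation survives an absent landmark
    and detects an out-of-sequence one."""
    expected = route[junction_index]
    if landmark is not None and landmark != expected:
        return "out-of-sequence"
    return turns[expected]

print(habitual_policy("statue"))       # right
print(habitual_policy(None))           # None -- fails without a landmark
print(sequence_policy(1, None))        # right -- absent-landmark probe
print(sequence_policy(1, "fountain"))  # out-of-sequence probe detected
```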
Affiliation(s)
- Niklas Ihssen - Department of Psychology, Durham University, Durham, UK
- Joseph Austen - Department of Psychology, Durham University, Durham, UK
- Shamus P Smith - School of Information and Physical Sciences, University of Newcastle Australia, Callaghan, New South Wales, Australia
- Adina R Lew - Department of Psychology, Lancaster University, Lancaster, UK
2. Khosla A, Moscovitch M, Ryan JD. Spatial updating of gaze position in younger and older adults - A path integration-like process in eye movements. Cognition 2024; 250:105835. PMID: 38875941. DOI: 10.1016/j.cognition.2024.105835.
Abstract
Path integration (PI) is a navigation process that allows an organism to update its current location in reference to a starting point. PI can involve updating self-position continuously with respect to the starting point (continuous updating) or creating a map representation of the route which is then used to compute the homing vector (configural updating). One of the brain areas involved in PI, the entorhinal cortex, is modulated similarly by whole-body and eye movements, suggesting that if PI updates self-position, an analogous process may be used to update gaze position, and may undergo age-related changes. Here, we created an eyetracking version of a PI task in which younger and older participants followed routes with their eyes as guided by visual onsets; at the end of each route, participants were cued to return to the starting point or another enroute location. When only memory for the starting location was required for successful task performance, younger and older adults were generally not influenced by the number of locations, indicative of continuous updating. However, when participants could be cued to any enroute location, thereby requiring memory for the entire route, processing times increased, accuracy decreased, and overt revisits to enroute locations increased with the number of locations in a route, indicative of configural updating. Older participants showed evidence for similar updating strategies as younger participants, but they were less accurate and made more overt revisits to mid-route locations. These findings suggest that spatial updating mechanisms are generalizable across effector systems.
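The two updating strategies can be sketched as different bookkeeping over the same route (a minimal 2-D sketch, not the eye-tracking task itself): continuous updating carries only a running return vector, while configural updating stores every en-route location so any of them can be cued later.

```python
def homing_vector(displacements):
    """Continuous updating: fold each displacement into one running
    vector; only the way back to the start is ever represented."""
    x = sum(dx for dx, dy in displacements)
    y = sum(dy for dx, dy in displacements)
    return (-x, -y)

def route_map(displacements):
    """Configural updating: keep a map of every visited location so a
    cue to any en-route position (not just the start) can be answered."""
    locations = [(0.0, 0.0)]
    for dx, dy in displacements:
        px, py = locations[-1]
        locations.append((px + dx, py + dy))
    return locations

route = [(2.0, 0.0), (0.0, 1.0), (1.0, 0.0)]  # three saccade-like steps
print(homing_vector(route))   # (-3.0, -1.0): straight back to the start
print(route_map(route)[2])    # (2.0, 1.0): a mid-route location, if cued
```

Note the asymmetry this predicts: the homing vector's cost is constant in route length, while the map grows with the number of locations, matching the set-size effects reported only in the any-location condition.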
Affiliation(s)
- Anisha Khosla - Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Baycrest, Toronto, Ontario, Canada
- Morris Moscovitch - Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Baycrest, Toronto, Ontario, Canada
- Jennifer D Ryan - Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Baycrest, Toronto, Ontario, Canada; Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada
3. Chen X, Wei Z, Wolbers T. Repetition Suppression Reveals Cue-Specific Spatial Representations for Landmarks and Self-Motion Cues in the Human Retrosplenial Cortex. eNeuro 2024; 11:ENEURO.0294-23.2024. PMID: 38519127. PMCID: PMC11007318. DOI: 10.1523/eneuro.0294-23.2024.
Abstract
The efficient use of various spatial cues within a setting is crucial for successful navigation. Two fundamental forms of spatial navigation, landmark-based and self-motion-based, engage distinct cognitive mechanisms. Whether these modes invoke shared or separate spatial representations in the brain remains unresolved: nonhuman animal studies have yielded inconsistent results, and evidence from humans is limited. In our previous work (Chen et al., 2019), we introduced a novel spatial navigation paradigm utilizing ultra-high-field fMRI to explore neural coding of positional information, and found that different entorhinal subregions in the right hemisphere encode positional information for landmarks and self-motion cues. The present study tested the generalizability of that finding with a modified navigation paradigm. Although we did not replicate our previous finding in the entorhinal cortex, we identified adaptation-based allocentric positional codes for both cue types in the retrosplenial cortex (RSC), and these codes were not confounded by the path taken to the spatial location. Crucially, the multi-voxel patterns of these spatial codes differed between the cue types, suggesting cue-specific positional coding. The parahippocampal cortex exhibited positional coding for self-motion cues that was not dissociable from path length. Finally, the brain regions involved in successful navigation differed from those in our previous study, indicating that the two studies recruited distinct neural mechanisms overall. Taken together, the current findings demonstrate, for the first time, cue-specific allocentric positional coding in the human RSC within the same navigation task, and show that spatial representations in the brain are contingent on specific experimental conditions.
Affiliation(s)
- Xiaoli Chen - Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310058, P.R. China
- Ziwei Wei - Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310058, P.R. China
- Thomas Wolbers - German Center for Neurodegenerative Diseases (DZNE), Magdeburg 39120, Germany; Department of Neurology, Otto-von-Guericke University Magdeburg, Magdeburg 39106, Germany; Center for Behavioral Brain Sciences (CBBS), Otto-von-Guericke University, Magdeburg 39106, Germany
4. Castegnaro A, Ji Z, Rudzka K, Chan D, Burgess N. Overestimation in angular path integration precedes Alzheimer's dementia. Curr Biol 2023; 33:4650-4661.e7. PMID: 37827151. PMCID: PMC10957396. DOI: 10.1016/j.cub.2023.09.047.
Abstract
Path integration (PI) is impaired early in Alzheimer's disease (AD) but reflects multiple sub-processes that may be differentially sensitive to AD. To characterize these sub-processes, we developed a novel generative linear-angular model of PI (GLAMPI) to fit the inbound paths of healthy elderly participants performing triangle completion, a popular PI task, in immersive virtual reality with real movement. The model fits seven parameters reflecting the encoding, calculation, and production errors associated with inaccuracies in PI. We compared these parameters across younger and older participants and patients with mild cognitive impairment (MCI), including those with (MCI+) and without (MCI-) cerebrospinal fluid biomarkers of AD neuropathology. MCI patients showed overestimation of the angular turn in the outbound path and more variable inbound distances and directions compared with healthy elderly. MCI+ were best distinguished from MCI- patients by overestimation of outbound turns and more variable inbound directions. Our results suggest that overestimation of turning underlies the PI errors seen in patients with early AD, indicating specific neural pathways and diagnostic behaviors for further research.
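A minimal sketch of the key error source (a multiplicative gain on the encoded outbound turn; the parameterization is illustrative and far simpler than the seven-parameter GLAMPI model) shows how turn overestimation alone distorts the homing vector:

```python
import math

def triangle_completion(leg1, leg2, turn_deg, turn_gain=1.0):
    """Encode an outbound path of two legs separated by a turn, with the
    turn angle scaled by a gain (gain > 1 models overestimation), and
    return the homing vector implied by the encoded path."""
    encoded_turn = math.radians(turn_deg * turn_gain)
    x = leg1 + leg2 * math.cos(encoded_turn)  # first leg along +x
    y = leg2 * math.sin(encoded_turn)
    return (-x, -y)

unbiased = triangle_completion(3.0, 4.0, 90.0, turn_gain=1.0)
biased = triangle_completion(3.0, 4.0, 90.0, turn_gain=1.2)

print([round(v, 3) for v in unbiased])  # [-3.0, -4.0]
print([round(v, 3) for v in biased])    # [-1.764, -3.804]: systematic direction error
```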
Affiliation(s)
- Andrea Castegnaro - UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AZ, UK; UCL Queen Square Institute of Neurology, University College London, Queen Square, London WC1N 3BG, UK
- Zilong Ji - UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AZ, UK; Peking-Tsinghua Center for Life Sciences, Academy for Advanced Interdisciplinary Studies, Peking University, Haidian District, Beijing 100871, China
- Katarzyna Rudzka - UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AZ, UK
- Dennis Chan - UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AZ, UK
- Neil Burgess - UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AZ, UK; UCL Queen Square Institute of Neurology, University College London, Queen Square, London WC1N 3BG, UK
5. Alexander AS, Robinson JC, Stern CE, Hasselmo ME. Gated transformations from egocentric to allocentric reference frames involving retrosplenial cortex, entorhinal cortex, and hippocampus. Hippocampus 2023; 33:465-487. PMID: 36861201. PMCID: PMC10403145. DOI: 10.1002/hipo.23513.
Abstract
This paper reviews the recent experimental finding that neurons in behaving rodents show egocentric coding of the environment in a number of structures associated with the hippocampus. Many animals generating behavior on the basis of sensory input must deal with the transformation of coordinates from the egocentric position of sensory input relative to the animal, into an allocentric framework concerning the position of multiple goals and objects relative to each other in the environment. Neurons in retrosplenial cortex show egocentric coding of the position of boundaries in relation to an animal. These neuronal responses are discussed in relation to existing models of the transformation from egocentric to allocentric coordinates using gain fields and a new model proposing transformations of phase coding that differ from current models. The same type of transformations could allow hierarchical representations of complex scenes. The responses in rodents are also discussed in comparison to work on coordinate transformations in humans and non-human primates.
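Stripped of the gain-field and phase-coding machinery, the transformation under discussion reduces to a rotation by head direction plus a translation by self-position; a bare-bones 2-D sketch (illustrative only, not the reviewed circuit models):

```python
import math

def ego_to_allo(ego, head_direction_deg, self_pos):
    """Map an egocentric offset (ahead, left) to allocentric coordinates:
    rotate by the current head direction (0 deg = facing +x), then
    translate by the animal's allocentric position."""
    ahead, left = ego
    hd = math.radians(head_direction_deg)
    ax = self_pos[0] + ahead * math.cos(hd) - left * math.sin(hd)
    ay = self_pos[1] + ahead * math.sin(hd) + left * math.cos(hd)
    return (ax, ay)

# A boundary 2 m straight ahead, viewed facing +y from (1, 1), lands at a
# fixed allocentric spot regardless of the egocentric view it came from.
boundary = ego_to_allo((2.0, 0.0), 90.0, (1.0, 1.0))
print([round(v, 3) for v in boundary])  # [1.0, 3.0]
```

The head-direction signal acts as the "gate" here: the same egocentric boundary input maps to different allocentric locations depending on the heading used in the rotation.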
Affiliation(s)
- Andrew S Alexander - Center for Systems Neuroscience, Boston University, Boston, Massachusetts, USA
- Jennifer C Robinson - Center for Systems Neuroscience, Boston University, Boston, Massachusetts, USA
- Chantal E Stern - Center for Systems Neuroscience, Boston University, Boston, Massachusetts, USA
- Michael E Hasselmo - Center for Systems Neuroscience, Boston University, Boston, Massachusetts, USA
6. Alexander AS, Place R, Starrett MJ, Chrastil ER, Nitz DA. Rethinking retrosplenial cortex: Perspectives and predictions. Neuron 2023; 111:150-175. PMID: 36460006. DOI: 10.1016/j.neuron.2022.11.006.
Abstract
The last decade has produced exciting new ideas about retrosplenial cortex (RSC) and its role in integrating diverse inputs. Here, we review the diversity in forms of spatial and directional tuning of RSC activity, temporal organization of RSC activity, and features of RSC interconnectivity with other brain structures. We find that RSC anatomy and dynamics are more consistent with roles in multiple sensorimotor and cognitive processes than with any isolated function. However, two more generalized categories of function may best characterize roles for RSC in complex cognitive processes: (1) shifting and relating perspectives for spatial cognition and (2) prediction and error correction for current sensory states with internal representations of the environment. Both functions likely take advantage of RSC's capacity to encode conjunctions among sensory, motor, and spatial mapping information streams. Together, these functions provide the scaffold for intelligent actions, such as navigation, perspective taking, interaction with others, and error detection.
Affiliation(s)
- Andrew S Alexander - Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Ryan Place - Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
- Michael J Starrett - Department of Neurobiology & Behavior, University of California, Irvine, Irvine, CA 92697, USA
- Elizabeth R Chrastil - Department of Neurobiology & Behavior, University of California, Irvine, Irvine, CA 92697, USA; Department of Cognitive Sciences, University of California, Irvine, Irvine, CA 92697, USA
- Douglas A Nitz - Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
7. Seton C, Coutrot A, Hornberger M, Spiers HJ, Knight R, Whyatt C. Wayfinding and path integration deficits detected using a virtual reality mobile app in patients with traumatic brain injury. PLoS One 2023; 18:e0282255. PMID: 36893089. PMCID: PMC9997943. DOI: 10.1371/journal.pone.0282255.
Abstract
The ability to navigate is supported by a wide network of brain areas that are particularly vulnerable to disruption following brain injury, including traumatic brain injury (TBI). Wayfinding and the ability to orient back toward where you have recently come from (path integration) are likely to be impacted in daily life but have so far not been tested in patients with TBI. Here, we assessed spatial navigation in thirty-eight participants, fifteen of whom had a history of TBI and twenty-three of whom were controls. Self-estimated spatial navigation ability was assessed using the Santa Barbara Sense of Direction (SBSOD) scale; no significant difference between the TBI and control groups was identified, with both groups reporting 'good' self-inferred spatial navigational ability. Objective navigation ability was tested via the virtual mobile app test Sea Hero Quest (SHQ), which has been shown to predict real-world navigation difficulties and assesses (a) wayfinding across several environments and (b) path integration. Compared to a sub-sample of 13 control participants, a matched sub-sample of 10 TBI patients demonstrated generally poorer performance in all wayfinding environments tested. Further analysis revealed that TBI participants consistently spent a shorter duration viewing a map prior to navigating to goals. Patients showed mixed performance on the path integration task, with poor performance evident when proximal cues were absent. Our results provide preliminary evidence that TBI has a long-lasting impact on both wayfinding and, to some extent, path integration.
Affiliation(s)
- Caroline Seton - Department of Psychology, Sport and Geography, University of Hertfordshire, Hatfield, Hertfordshire, United Kingdom
- Antoine Coutrot - Laboratoire d’InfoRmatique en Image et Systèmes d’information, French Centre National de la Recherche Scientifique, University of Lyon, Lyon, France
- Michael Hornberger - Applied Dementia Research, Norwich Medical School, University of East Anglia, Norwich, United Kingdom
- Hugo J. Spiers - Division of Psychology and Language Sciences, Department of Experimental Psychology, University College London, London, United Kingdom
- Rebecca Knight - Department of Psychology, Sport and Geography, University of Hertfordshire, Hatfield, Hertfordshire, United Kingdom
- Caroline Whyatt - Department of Psychology, Sport and Geography, University of Hertfordshire, Hatfield, Hertfordshire, United Kingdom
8. Linking global top-down views to first-person views in the brain. Proc Natl Acad Sci U S A 2022; 119:e2202024119. PMID: 36322732. PMCID: PMC9659407. DOI: 10.1073/pnas.2202024119.
Abstract
Humans and other animals have a remarkable capacity to translate their position from one spatial frame of reference to another. The ability to seamlessly move between top-down and first-person views is important for navigation, memory formation, and other cognitive tasks. Evidence suggests that the medial temporal lobe and other cortical regions contribute to this function. To understand how a neural system might carry out these computations, we used variational autoencoders (VAEs) to reconstruct the first-person view from the top-down view of a robot simulation, and vice versa. Many latent variables in the VAEs had similar responses to those seen in neuron recordings, including location-specific activity, head direction tuning, and encoding of distance to local objects. Place-specific responses were prominent when reconstructing a first-person view from a top-down view, but head direction-specific responses were prominent when reconstructing a top-down view from a first-person view. In both cases, the model could recover from perturbations without retraining, but rather through remapping. These results could advance our understanding of how brain regions support viewpoint linkages and transformations.
9. Alefantis P, Lakshminarasimhan K, Avila E, Noel JP, Pitkow X, Angelaki DE. Sensory Evidence Accumulation Using Optic Flow in a Naturalistic Navigation Task. J Neurosci 2022; 42:5451-5462. PMID: 35641186. PMCID: PMC9270913. DOI: 10.1523/jneurosci.2203-21.2022.
Abstract
Sensory evidence accumulation is considered a hallmark of decision-making in noisy environments. Integration of sensory inputs has traditionally been studied using passive stimuli, segregating perception from action. Lessons learned from this approach, however, may not generalize to ethological behaviors like navigation, where there is an active interplay between perception and action. We designed a sensory-based sequential decision task in virtual reality in which humans and monkeys navigated to a memorized location by integrating optic flow generated by their own joystick movements. A major challenge in such closed-loop tasks is that subjects' actions determine future sensory input, causing ambiguity about whether they rely on sensory input rather than on expectations based solely on a learned model of the dynamics. To test whether subjects integrated optic flow over time, we used three independent experimental manipulations: unpredictable optic flow perturbations, which pushed subjects off their trajectory; gain manipulation of the joystick controller, which changed the consequences of actions; and manipulation of the optic flow density, which changed the information borne by sensory evidence. Our results suggest that both macaques (male) and humans (female/male) relied heavily on optic flow, thereby demonstrating a critical role for sensory evidence accumulation during naturalistic action-perception closed-loop tasks.

SIGNIFICANCE STATEMENT: The temporal integration of evidence is a fundamental component of mammalian intelligence. Yet, it has traditionally been studied using experimental paradigms that fail to capture the closed-loop interaction between actions and sensations inherent in real-world continuous behaviors. These conventional paradigms use binary decision tasks and passive stimuli with statistics that remain stationary over time. Instead, we developed a naturalistic visuomotor navigation paradigm that mimics the causal structure of real-world sensorimotor interactions and probed the extent to which participants integrate sensory evidence by adding task manipulations that reveal complementary aspects of the computation.
Affiliation(s)
- Panos Alefantis - Center for Neural Science, New York University, New York, New York 10003
- Eric Avila - Center for Neural Science, New York University, New York, New York 10003
- Jean-Paul Noel - Center for Neural Science, New York University, New York, New York 10003
- Xaq Pitkow - Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030; Department of Electrical and Computer Engineering, Rice University, Houston, Texas 77005-1892; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, Texas 77030
- Dora E Angelaki - Center for Neural Science, New York University, New York, New York 10003; Tandon School of Engineering, New York University, New York, New York 11201
10. Stavropoulos A, Lakshminarasimhan KJ, Laurens J, Pitkow X, Angelaki D. Influence of sensory modality and control dynamics on human path integration. eLife 2022; 11:e63405. PMID: 35179488. PMCID: PMC8856658. DOI: 10.7554/eLife.63405.
Abstract
Path integration is a sensorimotor computation that can be used to infer latent dynamical states by integrating self-motion cues. We studied the influence of sensory observation (visual/vestibular) and latent control dynamics (velocity/acceleration) on human path integration using a novel motion-cueing algorithm. Sensory modality and control dynamics were both varied randomly across trials, as participants controlled a joystick to steer to a memorized target location in virtual reality. Visual and vestibular steering cues allowed comparable accuracies only when participants controlled their acceleration, suggesting that vestibular signals, on their own, fail to support accurate path integration in the absence of sustained acceleration. Nevertheless, performance in all conditions reflected a failure to fully adapt to changes in the underlying control dynamics, a result that was well explained by a bias in the dynamics estimation. This work demonstrates how an incorrect internal model of control dynamics affects navigation in volatile environments in spite of continuous sensory feedback.
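The velocity versus acceleration control regimes can be illustrated with a one-dimensional integrator (a toy sketch; the study's actual dynamics and motion-cueing algorithm are more elaborate). Under acceleration control the state keeps evolving after the joystick is released, which is what makes the underlying dynamics latent and sustains a vestibular-accessible signal.

```python
def simulate(joystick, mode, dt=0.1):
    """Integrate a joystick trace under one of two control regimes:
    'velocity'     -- the joystick sets velocity directly;
    'acceleration' -- the joystick sets acceleration, so velocity
                      persists after the input returns to zero."""
    pos, vel = 0.0, 0.0
    for u in joystick:
        if mode == "velocity":
            vel = u
        else:
            vel += u * dt
        pos += vel * dt
    return pos

push = [1.0] * 10 + [0.0] * 10  # push forward for 1 s, then release
print(round(simulate(push, "velocity"), 6))      # 1.0: stops with the input
print(round(simulate(push, "acceleration"), 6))  # 1.55: coasts after release
```

A participant carrying the wrong internal model (e.g., integrating as if in velocity control while actually in acceleration control) would therefore systematically mis-estimate distance traveled, the kind of bias the paper models.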
Affiliation(s)
- Akis Stavropoulos - Center for Neural Science, New York University, New York, United States
- Jean Laurens - Ernst Strüngmann Institute for Neuroscience, Frankfurt, Germany
- Xaq Pitkow - Department of Electrical and Computer Engineering, Rice University, Houston, United States; Department of Neuroscience, Baylor College of Medicine, Houston, United States; Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, United States
- Dora Angelaki - Center for Neural Science, New York University, New York, United States; Department of Neuroscience, Baylor College of Medicine, Houston, United States; Tandon School of Engineering, New York University, New York, United States
11. Perry BAL, Lomi E, Mitchell AS. Thalamocortical interactions in cognition and disease: the mediodorsal and anterior thalamic nuclei. Neurosci Biobehav Rev 2021; 130:162-177. PMID: 34216651. DOI: 10.1016/j.neubiorev.2021.05.032.
Abstract
The mediodorsal thalamus (MD) and anterior thalamic nuclei (ATN) are two adjacent brain nodes that support our ability to make decisions, learn, update information, form and retrieve memories, and find our way around. The MD and prefrontal cortex (PFC) work in partnership to support cognitive processes linked to successful learning and decision-making, while the ATN and the extended hippocampal system together coordinate the encoding and retrieval of memories and successful spatial navigation. Yet, while these roles may appear segregated, the MD and ATN jointly support higher cognitive functions, as they regulate and are influenced by interconnected fronto-temporal neural networks and subcortical inputs. Our review focuses on recent studies in animal models and in humans. This evidence is re-shaping our understanding of the importance of MD and ATN cortico-thalamocortical pathways in complex cognitive functions. Given the evidence from clinical settings and neuroscience research labs, the MD and ATN should be considered targets for effective treatments of neuropsychiatric diseases and disorders and of neurodegeneration.
Affiliation(s)
- Brook A L Perry - Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
- Eleonora Lomi - Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
- Anna S Mitchell - Department of Experimental Psychology, Oxford University, The Tinsley Building, Mansfield Road, OX1 3SR, United Kingdom
12. Impaired Parahippocampal Gyrus-Orbitofrontal Cortex Circuit Associated with Visuospatial Memory Deficit as a Potential Biomarker and Interventional Approach for Alzheimer Disease. Neurosci Bull 2020; 36:831-844. PMID: 32350798. DOI: 10.1007/s12264-020-00498-3.
Abstract
The parahippocampal gyrus-orbitofrontal cortex (PHG-OFC) circuit in humans is homologous to the postrhinal cortex (POR)-ventral lateral orbitofrontal cortex (vlOFC) circuit in rodents. Both are associated with visuospatial malfunctions in Alzheimer's disease (AD). However, the underlying mechanisms remain to be elucidated. In this study, we explored the relationship between an impaired POR-vlOFC circuit and visuospatial memory deficits through retrograde tracing and in vivo local field potential recordings in 5XFAD mice, and investigated alterations of the PHG-OFC circuit by multi-domain magnetic resonance imaging (MRI) in patients on the AD spectrum. We demonstrated that an impaired glutamatergic POR-vlOFC circuit resulted in deficient visuospatial memory in 5XFAD mice. Moreover, MRI measurements of the PHG-OFC circuit had an accuracy of 77.33% for the classification of amnestic mild cognitive impairment converters versus non-converters. Thus, the PHG-OFC circuit explains the neuroanatomical basis of visuospatial memory deficits in AD, thereby providing a potential predictor for AD progression and a promising interventional approach for AD.
13. Krala M, van Kemenade B, Straube B, Kircher T, Bremmer F. Predictive coding in a multisensory path integration task: An fMRI study. J Vis 2020; 19:13. PMID: 31561251. DOI: 10.1167/19.11.13.
Abstract
During self-motion through an environment, our sensory systems are confronted with a constant flow of information from different modalities. To successfully navigate, self-induced sensory signals have to be dissociated from externally induced sensory signals. Previous studies have suggested that the processing of self-induced sensory information is modulated by means of predictive coding mechanisms. However, the neural correlates of processing self-induced sensory information from different modalities during self-motion are largely unknown. Here, we asked if and how the processing of visually simulated self-motion and/or associated auditory stimuli is modulated by self-controlled action. Participants were asked to actively reproduce a previously observed simulated self-displacement (path integration). Blood oxygen level-dependent (BOLD) activation during this path integration was compared with BOLD activation during a condition in which we passively replayed the exact sensory stimulus that had been produced by the participants in previous trials. We found supramodal BOLD suppression in parietal and frontal regions. Remarkably, BOLD contrast in sensory areas was enhanced in a modality-specific manner. We conclude that the effect of action on sensory processing is strictly dependent on the respective behavioral task and its relevance.
Affiliation(s)
- Milosz Krala
- Department of Neurophysics, University of Marburg, Marburg, Germany; Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
- Bianca van Kemenade
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany; Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Benjamin Straube
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany; Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Tilo Kircher
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany; Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Frank Bremmer
- Department of Neurophysics, University of Marburg, Marburg, Germany; Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
14
Alexander AS, Robinson JC, Dannenberg H, Kinsky NR, Levy SJ, Mau W, Chapman GW, Sullivan DW, Hasselmo ME. Neurophysiological coding of space and time in the hippocampus, entorhinal cortex, and retrosplenial cortex. Brain Neurosci Adv 2020; 4:2398212820972871. [PMID: 33294626] [PMCID: PMC7708714] [DOI: 10.1177/2398212820972871]
Abstract
Neurophysiological recordings in behaving rodents demonstrate neuronal response properties that may code space and time for episodic memory and goal-directed behaviour. Here, we review recordings from hippocampus, entorhinal cortex, and retrosplenial cortex to address the problem of how neurons encode multiple overlapping spatiotemporal trajectories and disambiguate these for accurate memory-guided behaviour. The solution could involve neurons in the entorhinal cortex and hippocampus that show mixed selectivity, coding both time and location. Some grid cells and place cells that code space also respond selectively as time cells, allowing differentiation of time intervals when a rat runs in the same location during a delay period. Cells in these regions also develop new representations that differentially code the context of prior or future behaviour allowing disambiguation of overlapping trajectories. Spiking activity is also modulated by running speed and head direction, supporting the coding of episodic memory not as a series of snapshots but as a trajectory that can also be distinguished on the basis of speed and direction. Recent data also address the mechanisms by which sensory input could distinguish different spatial locations. Changes in firing rate reflect running speed on long but not short time intervals, and few cells code movement direction, arguing against path integration for coding location. Instead, new evidence for neural coding of environmental boundaries in egocentric coordinates fits with a modelling framework in which egocentric coding of barriers combined with head direction generates distinct allocentric coding of location. The egocentric input can be used both for coding the location of spatiotemporal trajectories and for retrieving specific viewpoints of the environment. 
Overall, these different patterns of neural activity can be used for encoding and disambiguation of prior episodic spatiotemporal trajectories or for planning of future goal-directed spatiotemporal trajectories.
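The modelling framework this abstract describes combines egocentric boundary bearings with head direction to recover allocentric location. At its simplest, that combination is a frame rotation plus a translation; the following sketch is illustrative only (function and variable names are ours, not from the reviewed work):

```python
import math

def egocentric_to_allocentric(agent_xy, head_dir, ego_bearing, ego_dist):
    """Map an egocentric observation (bearing relative to the head and
    distance to a boundary point) to allocentric coordinates, given the
    agent's position and head direction (radians, world frame)."""
    world_angle = head_dir + ego_bearing  # rotate the bearing into the world frame
    return (agent_xy[0] + ego_dist * math.cos(world_angle),
            agent_xy[1] + ego_dist * math.sin(world_angle))
```

For example, a boundary seen straight ahead at distance 2 by an agent at (1, 1) facing "north" (head direction pi/2) maps to the allocentric point (1, 3), regardless of which egocentric view produced the observation.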
Affiliation(s)
- Samuel J. Levy
- Center for Systems Neuroscience, Boston University, Boston, MA, USA
- William Mau
- Center for Systems Neuroscience, Boston University, Boston, MA, USA
15
Dannenberg H, Alexander AS, Robinson JC, Hasselmo ME. The Role of Hierarchical Dynamical Functions in Coding for Episodic Memory and Cognition. J Cogn Neurosci 2019; 31:1271-1289. [PMID: 31251890] [DOI: 10.1162/jocn_a_01439]
Abstract
Behavioral research in human verbal memory function led to the initial definition of episodic memory and semantic memory. A complete model of the neural mechanisms of episodic memory must include the capacity to encode and mentally reconstruct everything that humans can recall from their experience. This article proposes new model features necessary to address the complexity of episodic memory encoding and recall in the context of broader cognition and the functional properties of neurons that could contribute to this broader scope of memory. Many episodic memory models represent individual snapshots of the world with a sequence of vectors, but a full model must represent complex functions encoding and retrieving the relations between multiple stimulus features across space and time on multiple hierarchical scales. Episodic memory involves not only the space and time of an agent experiencing events within an episode but also features shown in neurophysiological data such as coding of speed, direction, boundaries, and objects. Episodic memory includes not only a spatio-temporal trajectory of a single agent but also segments of spatio-temporal trajectories for other agents and objects encountered in the environment consistent with data on encoding the position and angle of sensory features of objects and boundaries. We will discuss potential interactions of episodic memory circuits in the hippocampus and entorhinal cortex with distributed neocortical circuits that must represent all features of human cognition.
16
Zajac L, Burte H, Taylor HA, Killiany R. Self-reported navigation ability is associated with optic flow-sensitive regions' functional connectivity patterns during visual path integration. Brain Behav 2019; 9:e01236. [PMID: 30884216] [PMCID: PMC6456774] [DOI: 10.1002/brb3.1236]
Abstract
INTRODUCTION: Spatial navigation is a complex cognitive skill that varies between individuals, and the mechanisms underlying this variability are not clear. Studying simpler components of spatial navigation may help illuminate factors that contribute to variation in this complex skill; path integration is one such component. Optic flow provides self-motion information while moving through an environment and is sufficient for path integration. This study aims to investigate whether self-reported navigation ability is related to information transfer between optic flow-sensitive (OF-sensitive) cortical regions and regions important to navigation during environmental spatial tasks.
METHODS: Functional magnetic resonance imaging was used to define OF-sensitive regions and map their functional connectivity (FC) with the retrosplenial cortex and hippocampus during visual path integration (VPI) and turn counting (TC) tasks. Both tasks presented visual self-motion through a real-world environment. Correlations predicting a positive association between self-reported navigation ability (measured with the Santa Barbara Sense of Direction scale) and FC strength between OF-sensitive regions and retrosplenial cortex and OF-sensitive regions and the hippocampus were performed.
RESULTS: During VPI, FC strength between left cingulate sulcus visual area (L CSv) and right retrosplenial cortex and L CSv and right hippocampus was positively associated with self-reported navigation ability. FC strength between right cingulate sulcus visual area (R CSv) and right retrosplenial cortex during VPI was also positively associated with self-reported navigation ability. These relationships were specific to VPI, and whole-brain exploratory analyses corroborated these results.
CONCLUSIONS: These findings support the hypothesis that perceived spatial navigation ability is associated with communication strength between OF-sensitive and navigationally relevant regions during visual path integration, which may represent the transformation accuracy of visual motion information into internal spatial representations. More broadly, these results illuminate underlying mechanisms that may explain some variability in spatial navigation ability.
Affiliation(s)
- Lauren Zajac
- Department of Anatomy & Neurobiology, Boston University School of Medicine, Boston, Massachusetts; Center for Biomedical Imaging, Boston University School of Medicine, Boston, Massachusetts
- Heather Burte
- Department of Psychology, Tufts University, Medford, Massachusetts
- Holly A Taylor
- Department of Psychology, Tufts University, Medford, Massachusetts
- Ronald Killiany
- Department of Anatomy & Neurobiology, Boston University School of Medicine, Boston, Massachusetts; Center for Biomedical Imaging, Boston University School of Medicine, Boston, Massachusetts
18
Kim M, Maguire EA. Encoding of 3D head direction information in the human brain. Hippocampus 2018; 29:619-629. [PMID: 30561118] [PMCID: PMC6618148] [DOI: 10.1002/hipo.23060]
Abstract
Head direction cells are critical for navigation because they convey information about which direction an animal is facing within an environment. To date, most studies on head direction encoding have been conducted on a horizontal two-dimensional (2D) plane, and little is known about how three-dimensional (3D) direction information is encoded in the brain despite humans and other animals living in a 3D world. Here, we investigated head direction encoding in the human brain while participants moved within a virtual 3D "spaceship" environment. Movement was not constrained to planes and instead participants could move along all three axes in volumetric space as if in zero gravity. Using functional magnetic resonance imaging (fMRI) multivoxel pattern similarity analysis, we found evidence that the thalamus, particularly the anterior portion, and the subiculum encoded the horizontal component of 3D head direction (azimuth). In contrast, the retrosplenial cortex was significantly more sensitive to the vertical direction (pitch) than to the azimuth. Our results also indicated that vertical direction information in the retrosplenial cortex was significantly correlated with behavioral performance during a direction judgment task. Our findings represent the first evidence showing that the "classic" head direction system that has been identified on a horizontal 2D plane also seems to encode vertical and horizontal heading in 3D space in the human brain.
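The two heading components analysed in this study, azimuth and pitch, can be illustrated with simple geometry: given a 3D facing vector, azimuth is the angle within the horizontal plane and pitch is the elevation above it. A toy sketch (names are ours, not the authors' analysis code):

```python
import math

def heading_to_azimuth_pitch(vx, vy, vz):
    """Split a 3D facing vector into azimuth (horizontal angle) and
    pitch (vertical angle), both in radians. Purely geometric; this is
    not the study's multivoxel pattern analysis."""
    azimuth = math.atan2(vy, vx)                 # angle in the horizontal plane
    pitch = math.atan2(vz, math.hypot(vx, vy))   # elevation above that plane
    return azimuth, pitch
```

A facing vector of (0, 1, 1), for instance, points "north" and 45 degrees upward: azimuth pi/2, pitch pi/4.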
Affiliation(s)
- Misun Kim
- Wellcome Centre for Human Neuroimaging, Queen Square Institute of Neurology, University College London, London, United Kingdom
- Eleanor A Maguire
- Wellcome Centre for Human Neuroimaging, Queen Square Institute of Neurology, University College London, London, United Kingdom
19
Izen SC, Chrastil ER, Stern CE. Resting State Connectivity Between Medial Temporal Lobe Regions and Intrinsic Cortical Networks Predicts Performance in a Path Integration Task. Front Hum Neurosci 2018; 12:415. [PMID: 30459579] [PMCID: PMC6232837] [DOI: 10.3389/fnhum.2018.00415]
Abstract
Humans differ in their individual navigational performance, in part because successful navigation relies on several diverse abilities. One such navigational capability is path integration, the updating of position and orientation during movement, typically in a sparse, landmark-free environment. This study examined the relationship between path integration abilities and functional connectivity to several canonical intrinsic brain networks. Intrinsic networks within the brain reflect past inputs and communication as well as structural architecture. Individual differences in intrinsic connectivity have been observed for common networks, suggesting that these networks can inform our understanding of individual spatial abilities. Here, we examined individual differences in intrinsic connectivity using resting state magnetic resonance imaging (rsMRI). We tested path integration ability using a loop closure task, in which participants viewed a single video of movement in a circle trajectory in a sparse environment, and then indicated whether the video ended in the same location in which it started. To examine intrinsic brain networks, participants underwent a resting state scan. We found that better performance in the loop task was associated with increased connectivity during rest between the central executive network (CEN) and posterior hippocampus, parahippocampal cortex (PHC) and entorhinal cortex. We also found that connectivity between PHC and the default mode network (DMN) during rest was associated with better loop closure performance. The results indicate that interactions between medial temporal lobe (MTL) regions and intrinsic networks that involve prefrontal cortex (PFC) are important for path integration and navigation.
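The loop closure judgment described in this abstract can be mimicked by dead reckoning: integrate each turn and forward step along the viewed trajectory, then test whether the endpoint lies near the start. A minimal sketch under our own naming (not the study's stimulus or analysis code):

```python
import math

def closes_loop(steps, tol=1e-6):
    """Path-integrate a sequence of (turn, forward-distance) movements,
    both starting from the origin with heading 0, and report whether the
    trajectory returns to its start point. Turns are in radians."""
    x = y = heading = 0.0
    for turn, dist in steps:
        heading += turn                  # update facing direction
        x += dist * math.cos(heading)    # dead-reckon position
        y += dist * math.sin(heading)
    return math.hypot(x, y) <= tol       # endpoint close to start?
```

A square path of four unit steps with 90-degree turns closes the loop; two straight steps do not.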
Affiliation(s)
- Sarah C. Izen
- Department of Psychological & Brain Sciences and Center for Memory & Brain, Boston University, Boston, MA, United States
- Elizabeth R. Chrastil
- Department of Psychological & Brain Sciences and Center for Memory & Brain, Boston University, Boston, MA, United States
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, United States
- Department of Geography, University of California, Santa Barbara, Santa Barbara, CA, United States
- Chantal E. Stern
- Department of Psychological & Brain Sciences and Center for Memory & Brain, Boston University, Boston, MA, United States
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, United States
20
Zhao M. Human spatial representation: what we cannot learn from the studies of rodent navigation. J Neurophysiol 2018; 120:2453-2465. [PMID: 30133384] [DOI: 10.1152/jn.00781.2017]
Abstract
Studies of human and rodent navigation often reveal a remarkable cross-species similarity between the cognitive and neural mechanisms of navigation. Such cross-species resemblance often overshadows some critical differences between how humans and nonhuman animals navigate. In this review, I propose that a navigation system requires both a storage system (i.e., representing spatial information) and a positioning system (i.e., sensing spatial information) to operate. I then argue that the way humans represent spatial information is different from that inferred from the cellular activity observed during rodent navigation. Such difference spans the whole hierarchy of spatial representation, from representing the structure of an environment to the representation of subregions of an environment, routes and paths, and the distance and direction relative to a goal location. These cross-species inconsistencies suggest that what we learn from rodent navigation does not always transfer to human navigation. Finally, I argue for closing the loop for the dominant, unidirectional animal-to-human approach in navigation research so that insights from behavioral studies of human navigation may also flow back to shed light on the cellular mechanisms of navigation for both humans and other mammals (i.e., a human-to-animal approach).
Affiliation(s)
- Mintao Zhao
- School of Psychology, University of East Anglia, Norwich, United Kingdom; Department of Human Perception, Cognition, and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
21
Sherrill KR, Chrastil ER, Aselcioglu I, Hasselmo ME, Stern CE. Structural Differences in Hippocampal and Entorhinal Gray Matter Volume Support Individual Differences in First Person Navigational Ability. Neuroscience 2018; 380:123-131. [DOI: 10.1016/j.neuroscience.2018.04.006]
22
Hinman JR, Dannenberg H, Alexander AS, Hasselmo ME. Neural mechanisms of navigation involving interactions of cortical and subcortical structures. J Neurophysiol 2018; 119:2007-2029. [PMID: 29442559] [DOI: 10.1152/jn.00498.2017]
Abstract
Animals must perform spatial navigation for a range of different behaviors, including selection of trajectories toward goal locations and foraging for food sources. To serve this function, a number of different brain regions play a role in coding different dimensions of sensory input important for spatial behavior, including the entorhinal cortex, the retrosplenial cortex, the hippocampus, and the medial septum. This article will review data concerning the coding of the spatial aspects of animal behavior, including location of the animal within an environment, the speed of movement, the trajectory of movement, the direction of the head in the environment, and the position of barriers and objects both relative to the animal's head direction (egocentric) and relative to the layout of the environment (allocentric). The mechanisms for coding these important spatial representations are not yet fully understood but could involve mechanisms including integration of self-motion information or coding of location based on the angle of sensory features in the environment. We will review available data and theories about the mechanisms for coding of spatial representations. The computation of different aspects of spatial representation from available sensory input requires complex cortical processing mechanisms for transformation from egocentric to allocentric coordinates that will only be understood through a combination of neurophysiological studies and computational modeling.
Affiliation(s)
- James R Hinman
- Center for Systems Neuroscience, Boston University, Boston, Massachusetts
- Holger Dannenberg
- Center for Systems Neuroscience, Boston University, Boston, Massachusetts
- Andrew S Alexander
- Center for Systems Neuroscience, Boston University, Boston, Massachusetts
- Michael E Hasselmo
- Center for Systems Neuroscience, Boston University, Boston, Massachusetts
23
Individual Differences in Human Path Integration Abilities Correlate with Gray Matter Volume in Retrosplenial Cortex, Hippocampus, and Medial Prefrontal Cortex. eNeuro 2017; 4:eN-NWR-0346-16. [PMID: 28451633] [PMCID: PMC5392707] [DOI: 10.1523/eneuro.0346-16.2017]
Abstract
Humans differ in their individual navigational abilities. These individual differences may exist in part because successful navigation relies on several disparate abilities, which rely on different brain structures. One such navigational capability is path integration, the updating of position and orientation, in which navigators track distances, directions, and locations in space during movement. Although structural differences related to landmark-based navigation have been examined, gray matter volume related to path integration ability has not yet been tested. Here, we examined individual differences in two path integration paradigms: (1) a location tracking task and (2) a task tracking translational and rotational self-motion. Using voxel-based morphometry, we related differences in performance in these path integration tasks to variation in brain morphology in 26 healthy young adults. Performance in the location tracking task positively correlated with individual differences in gray matter volume in three areas critical for path integration: the hippocampus, the retrosplenial cortex, and the medial prefrontal cortex. These regions are consistent with the path integration system known from computational and animal models and provide novel evidence that morphological variability in retrosplenial and medial prefrontal cortices underlies individual differences in human path integration ability. The results for tracking rotational self-motion-but not translation or location-demonstrated that cerebellum gray matter volume correlated with individual performance. Our findings also suggest that these three aspects of path integration are largely independent. Together, the results of this study provide a link between individual abilities and the functional correlates, computational models, and animal models of path integration.