1. Zhang Y, Yuan L, Zhu Q, Wu J, Nöbauer T, Zhang R, Xiao G, Wang M, Xie H, Guo Z, Dai Q, Vaziri A. A miniaturized mesoscope for the large-scale single-neuron-resolved imaging of neuronal activity in freely behaving mice. Nat Biomed Eng 2024. [PMID: 38902522] [DOI: 10.1038/s41551-024-01226-2]
Abstract
Exploring the relationship between neuronal dynamics and ethologically relevant behaviour involves recording neuronal-population activity using technologies that are compatible with unrestricted animal behaviour. However, head-mounted microscopes that accommodate weight limits to allow for free animal behaviour typically compromise field of view, resolution or depth range, and are susceptible to movement-induced artefacts. Here we report a miniaturized head-mounted fluorescent mesoscope that we systematically optimized for calcium imaging at single-neuron resolution, for increased fields of view and depth of field, and for robustness against motion-generated artefacts. Weighing less than 2.5 g, the mesoscope enabled recordings of neuronal-population activity at up to 16 Hz, with 4 μm resolution over 300 μm depth-of-field across a field of view of 3.6 × 3.6 mm² in the cortex of freely moving mice. We used the mesoscope to record large-scale neuronal-population activity in socially interacting mice during free exploration and during fear-conditioning experiments, and to investigate neurovascular coupling across multiple cortical regions.
Affiliation(s)
- Yuanlong Zhang
- Department of Automation, Tsinghua University, Beijing, China
- Laboratory of Neurotechnology and Biophysics, The Rockefeller University, New York, NY, USA

- Lekang Yuan
- Tsinghua-Berkeley Shenzhen Institute, Tsinghua University, Shenzhen, China

- Qiyu Zhu
- School of Medicine, Tsinghua University, Beijing, China
- Tsinghua-Peking Joint Center for Life Sciences, Beijing, China
- IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China

- Jiamin Wu
- Department of Automation, Tsinghua University, Beijing, China

- Tobias Nöbauer
- Laboratory of Neurotechnology and Biophysics, The Rockefeller University, New York, NY, USA

- Rujin Zhang
- Department of Anesthesiology, the First Medical Center, Chinese PLA General Hospital, Beijing, China

- Guihua Xiao
- Department of Automation, Tsinghua University, Beijing, China

- Mingrui Wang
- Tsinghua-Berkeley Shenzhen Institute, Tsinghua University, Shenzhen, China

- Hao Xie
- Department of Automation, Tsinghua University, Beijing, China

- Zengcai Guo
- School of Medicine, Tsinghua University, Beijing, China
- Tsinghua-Peking Joint Center for Life Sciences, Beijing, China
- IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China

- Qionghai Dai
- Department of Automation, Tsinghua University, Beijing, China
- IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China

- Alipasha Vaziri
- Laboratory of Neurotechnology and Biophysics, The Rockefeller University, New York, NY, USA
- The Kavli Neural Systems Institute, The Rockefeller University, New York, NY, USA
2. Gianatti M, Garvert AC, Lenkey N, Ebbesen NC, Hennestad E, Vervaeke K. Multiple long-range projections convey position information to the agranular retrosplenial cortex. Cell Rep 2023; 42:113109. [PMID: 37682706] [DOI: 10.1016/j.celrep.2023.113109]
Abstract
Neuronal signals encoding the animal's position widely modulate neocortical processing. While these signals are assumed to depend on hippocampal output, their origin has not been investigated directly. Here, we asked which brain region sends position information to the retrosplenial cortex (RSC), a key circuit for memory and navigation. We comprehensively characterized the long-range inputs to agranular RSC using two-photon axonal imaging in head-fixed mice performing a spatial task in darkness. Surprisingly, most long-range pathways convey position information, but with notable differences. Axons from the secondary motor and posterior parietal cortex transmit the most position information. By contrast, axons from the anterior cingulate and orbitofrontal cortex and thalamus convey substantially less position information. Axons from the primary and secondary visual cortex contribute negligibly. This demonstrates that the hippocampus is not the only source of position information. Instead, the RSC is a hub in a distributed brain network that shares position information.
Affiliation(s)
- Michele Gianatti
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway

- Anna Christina Garvert
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway

- Nora Lenkey
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway

- Nora Cecilie Ebbesen
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway

- Eivind Hennestad
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway

- Koen Vervaeke
- Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
3. Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220336. [PMID: 37545313] [PMCID: PMC10404929] [DOI: 10.1098/rstb.2022.0336]
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude that the visual cortex may process more information than of retinal origin alone, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in visual cortex play a role in visual processing itself, for instance disentangling direct auditory effects on visual cortex from effects of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual cortical system, we propose that visual percepts are generated by a larger network-the extended visual system-spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands

- Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal

- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
4. Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. [PMID: 37380885] [DOI: 10.1038/s41583-023-00716-7]
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK

- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany
5. Price BH, Jensen CM, Khoudary AA, Gavornik JP. Expectation violations produce error signals in mouse V1. Cereb Cortex 2023; 33:8803-8820. [PMID: 37183176] [PMCID: PMC10321125] [DOI: 10.1093/cercor/bhad163]
Abstract
Repeated exposure to visual sequences changes the form of evoked activity in the primary visual cortex (V1). Predictive coding theory provides a potential explanation for this, namely that plasticity shapes cortical circuits to encode spatiotemporal predictions and that subsequent responses are modulated by the degree to which actual inputs match these expectations. Here we use a recently developed statistical modeling technique called Model-Based Targeted Dimensionality Reduction (MbTDR) to study visually evoked dynamics in mouse V1 in the context of an experimental paradigm called "sequence learning." We report that evoked spiking activity changed significantly with training, in a manner generally consistent with the predictive coding framework. Neural responses to expected stimuli were suppressed in a late window (100-150 ms) after stimulus onset following training, whereas responses to novel stimuli were not. Substituting a novel stimulus for a familiar one led to increases in firing that persisted for at least 300 ms. Omitting predictable stimuli in trained animals also led to increased firing at the expected time of stimulus onset. Finally, we show that spiking data can be used to accurately decode time within the sequence. Our findings are consistent with the idea that plasticity in early visual circuits is involved in coding spatiotemporal information.
Affiliation(s)
- Byron H Price
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Graduate Program in Neuroscience, Boston University, Boston, MA 02215, USA

- Cambria M Jensen
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA

- Anthony A Khoudary
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA

- Jeffrey P Gavornik
- Center for Systems Neuroscience, Department of Biology, Boston University, Boston, MA 02215, USA
- Graduate Program in Neuroscience, Boston University, Boston, MA 02215, USA
6. Rolls ET. Hippocampal spatial view cells for memory and navigation, and their underlying connectivity in humans. Hippocampus 2023; 33:533-572. [PMID: 36070199] [PMCID: PMC10946493] [DOI: 10.1002/hipo.23467]
Abstract
Hippocampal and parahippocampal gyrus spatial view neurons in primates respond to the spatial location being looked at. The representation is allocentric, in that the responses are to locations "out there" in the world, and are relatively invariant with respect to retinal position, eye position, head direction, and the place where the individual is located. The underlying connectivity in humans is from ventromedial visual cortical regions to the parahippocampal scene area, leading to the theory that spatial view cells are formed by combinations of overlapping feature inputs self-organized based on their closeness in space. Thus, although spatial view cells represent "where" for episodic memory and navigation, they are formed by ventral visual stream feature inputs in the parahippocampal gyrus in what is the parahippocampal scene area. A second "where" driver of spatial view cells is parietal input, which, it is proposed, provides the idiothetic update for spatial view cells, used for memory recall and navigation when the spatial view details are obscured. Inferior temporal object "what" inputs and orbitofrontal cortex reward inputs connect to the human hippocampal system, and in macaques can be associated in the hippocampus with spatial view cell "where" representations to implement episodic memory. Hippocampal spatial view cells also provide a basis for navigation to a series of viewed landmarks, with the orbitofrontal cortex reward inputs to the hippocampus providing the goals for navigation, which can then be implemented by hippocampal connectivity in humans to parietal cortex regions involved in visuomotor actions in space. The presence of foveate vision and the highly developed temporal lobe for object and scene processing in primates including humans provide a basis for hippocampal spatial view cells to be key to understanding episodic memory in the primate and human hippocampus, and the roles of this system in primate including human navigation.
Affiliation(s)
- Edmund T. Rolls
- Oxford Centre for Computational Neuroscience, Oxford, UK
- Department of Computer Science, University of Warwick, Coventry, UK
7. Surinach D, Rynes ML, Saxena K, Ko E, Redish AD, Kodandaramaiah SB. Distinct mesoscale cortical dynamics encode search strategies during spatial navigation. bioRxiv [Preprint] 2023:2023.03.27.534480. [PMID: 37034682] [PMCID: PMC10081171] [DOI: 10.1101/2023.03.27.534480]
Abstract
Spatial navigation is a complex cognitive process that involves neural computations in distributed regions of the brain. Little is known about how cortical regions are coordinated when animals navigate novel spatial environments or how that coordination changes as environments become familiar. We recorded mesoscale calcium (Ca2+) dynamics across large swathes of the dorsal cortex in mice solving the Barnes maze, a 2D spatial navigation task where mice used random, serial, and spatial search strategies to navigate to the goal. Cortical dynamics exhibited patterns of repeated calcium activity with rapid and abrupt shifts between cortical activation patterns at sub-second time scales. We used a clustering algorithm to decompose the spatial patterns of cortical calcium activity in a low dimensional state space, identifying 7 states, each corresponding to a distinct spatial pattern of cortical activation, sufficient to describe the cortical dynamics across all the mice. When mice used serial or spatial search strategies to navigate to the goal, the frontal regions of the cortex were reliably activated for prolonged durations of time (> 1s) shortly after trial initiation. These frontal cortex activation events coincided with mice approaching the edge of the maze from the center and were preceded by temporal sequences of cortical activation patterns that were distinct for serial and spatial search strategies. In serial search trials, frontal cortex activation events were preceded by activation of the posterior regions of the cortex followed by lateral activation of one hemisphere. In spatial search trials, frontal cortical events were preceded by activation of posterior regions of the cortex followed by broad activation of the lateral regions of the cortex. Our results delineated cortical components that differentiate goal- and non-goal oriented spatial navigation strategies.
Affiliation(s)
- Daniel Surinach
- Department of Mechanical Engineering, University of Minnesota, Twin Cities

- Mathew L Rynes
- Department of Biomedical Engineering, University of Minnesota, Twin Cities

- Kapil Saxena
- Department of Mechanical Engineering, University of Minnesota, Twin Cities

- Eunsong Ko
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Biomedical Engineering, University of Minnesota, Twin Cities

- A David Redish
- Department of Neuroscience, University of Minnesota, Twin Cities

- Suhasa B Kodandaramaiah
- Department of Mechanical Engineering, University of Minnesota, Twin Cities
- Department of Biomedical Engineering, University of Minnesota, Twin Cities
- Department of Neuroscience, University of Minnesota, Twin Cities
8. Wiesing M, Zimmermann E. Serial dependencies between locomotion and visual space. Sci Rep 2023; 13:3302. [PMID: 36849556] [PMCID: PMC9970965] [DOI: 10.1038/s41598-023-30265-z]
Abstract
How do we know the spatial distance of objects around us? Only by physical interaction within an environment can we measure true physical distances. Here, we investigated the possibility that travel distances, measured during walking, could be used to calibrate visual spatial perception. The sensorimotor contingencies that arise during walking were carefully altered using virtual reality and motion tracking. Participants were asked to walk to a briefly highlighted location. During walking, we systematically changed the optic flow, i.e., the ratio between the visual and physical motion speed. Although participants remained unaware of this manipulation, they walked a shorter or longer distance as a function of the optic flow speed. Following walking, participants were required to estimate the perceived distance of visual objects. We found that visual estimates were serially dependent on the experience of the manipulated flow in the previous trial. Additional experiments confirmed that both visual and physical motion are required to affect visual perception. We conclude that the brain constantly uses movements to measure space for both actions and perception.
Affiliation(s)
- Michael Wiesing
- Institute for Experimental Psychology, Heinrich Heine University Duesseldorf, Düsseldorf, Germany

- Eckart Zimmermann
- Institute for Experimental Psychology, Heinrich Heine University Duesseldorf, Düsseldorf, Germany
9.
Abstract
The elucidation of spatial coding in the hippocampus requires exploring diverse animal species. While robust place cells are found in the mammalian hippocampus, much less is known about spatial coding in the hippocampus of birds. Here we used a wireless electrophysiology system to record single neurons in the hippocampus and two other dorsal pallial structures from freely flying barn owls (Tyto alba), a central-place nocturnal predator species with excellent navigational abilities. The owl's 3D position was monitored while it flew between perches. We found place cells (neurons that fired when the owl flew through a spatially restricted region in at least one direction), as well as neurons that encoded the direction of flight, and neurons that represented the owl's perching position between flights. Many neurons encoded combinations of position, direction, and perching. Spatial coding remained stable and invariant to lighting conditions. Place cells were observed in owls performing two different types of flying tasks, highlighting the generality of the result. Place coding was found in the anterior hippocampus and in the posterior part of the hyperpallium apicale, and to a lesser extent in the visual Wulst. The finding of place cells in flying owls suggests commonalities in spatial coding across mammals and birds.
10. Topographic organization of eye-position dependent gain fields in human visual cortex. Nat Commun 2022; 13:7925. [PMID: 36564372] [PMCID: PMC9789150] [DOI: 10.1038/s41467-022-35488-8]
Abstract
The ability to move presents animals with a problem of sensory ambiguity: the position of an external stimulus could change over time because the stimulus moved, or because the animal moved its receptors. This ambiguity can be resolved with a change in neural response gain as a function of receptor orientation. Here, we developed an encoding model to capture gain modulation of visual responses in high-field (7 T) fMRI data. We characterized population eye-position dependent gain fields (pEGFs). The information contained in the pEGFs allowed us to reconstruct eye positions over time across the visual hierarchy. We discovered a systematic distribution of pEGF centers: pEGF centers shift from contra- to ipsilateral following pRF eccentricity. Such a topographical organization suggests that signals beyond pure retinotopy are accessible early in the visual hierarchy, providing the potential to resolve sensory ambiguity and optimize sensory processing for functionally relevant behavior.
11. Zhang X, Long X, Zhang SJ, Chen ZS. Excitatory-inhibitory recurrent dynamics produce robust visual grids and stable attractors. Cell Rep 2022; 41:111777. [PMID: 36516752] [PMCID: PMC9805366] [DOI: 10.1016/j.celrep.2022.111777]
Abstract
Spatially modulated grid cells have recently been found in the rat secondary visual cortex (V2) during active navigation. However, the computational mechanism and functional significance of V2 grid cells remain unknown. To address this knowledge gap, we train a biologically inspired excitatory-inhibitory recurrent neural network (RNN) to perform a two-dimensional spatial navigation task with multisensory input. We find grid-like responses in both excitatory and inhibitory RNN units, which are robust with respect to spatial cues, the dimensionality of visual input, and the activation function. Population responses reveal a low-dimensional, torus-like manifold and attractor. We find a link between functional grid clusters with similar receptive fields and structured excitatory-to-excitatory connections. Additionally, multistable torus-like attractors emerge with increasing sparsity in inter- and intra-subnetwork connectivity. Finally, irregular grid patterns are found in RNN units during a visual sequence recognition task. Together, our results suggest common computational mechanisms of V2 grid cells for spatial and non-spatial tasks.
Affiliation(s)
- Xiaohan Zhang
- Department of Psychiatry, New York University Grossman School of Medicine, New York, NY, USA

- Xiaoyang Long
- Department of Neurosurgery, Xinqiao Hospital, Chongqing, China

- Sheng-Jia Zhang
- Department of Neurosurgery, Xinqiao Hospital, Chongqing, China

- Zhe Sage Chen
- Department of Psychiatry, New York University Grossman School of Medicine, New York, NY, USA
- Department of Neurosurgery, Xinqiao Hospital, Chongqing, China
- Neuroscience Institute, New York University Grossman School of Medicine, New York, NY, USA
12. Lee JY, You T, Woo CW, Kim SG. Optogenetic fMRI for Brain-Wide Circuit Analysis of Sensory Processing. Int J Mol Sci 2022; 23:12268. [PMID: 36293125] [PMCID: PMC9602603] [DOI: 10.3390/ijms232012268]
Abstract
Sensory processing is a complex neurological process that receives, integrates, and responds to information from one's own body and environment, which is closely related to survival as well as neurological disorders. Brain-wide networks of sensory processing are difficult to investigate due to their dynamic regulation by multiple brain circuits. Optogenetics, a neuromodulation technique that uses light-sensitive proteins, can be combined with functional magnetic resonance imaging (ofMRI) to measure whole-brain activity. Since ofMRI has increasingly been used for investigating brain circuits underlying sensory processing for over a decade, we systematically reviewed recent ofMRI studies of sensory circuits and discussed the challenges of optogenetic fMRI in rodents.
Affiliation(s)
- Jeong-Yun Lee
- Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, Korea

- Taeyi You
- Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, Korea
- Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, Korea
- Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, Korea

- Choong-Wan Woo
- Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, Korea
- Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, Korea
- Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, Korea

- Seong-Gi Kim
- Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, Korea
- Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, Korea
- Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, Korea
- Correspondence: Tel. +82-31-299-4350; Fax +82-31-299-4506
13. Lee JJ, Krumin M, Harris KD, Carandini M. Task specificity in mouse parietal cortex. Neuron 2022; 110:2961-2969.e5. [PMID: 35963238] [PMCID: PMC9616730] [DOI: 10.1016/j.neuron.2022.07.017]
Abstract
Parietal cortex is implicated in a variety of behavioral processes, but it is unknown whether and how its individual neurons participate in multiple tasks. We trained head-fixed mice to perform two visual decision tasks involving a steering wheel or a virtual T-maze and recorded from the same parietal neurons during these two tasks. Neurons that were active during the T-maze task were typically inactive during the steering-wheel task and vice versa. Recording from the same neurons in the same apparatus without task stimuli yielded the same specificity as in the task, suggesting that task specificity depends on physical context. To confirm this, we trained some mice in a third task combining the steering wheel context with the visual environment of the T-maze. This hybrid task engaged the same neurons as those engaged in the steering-wheel task. Thus, participation by neurons in mouse parietal cortex is task specific, and this specificity is determined by physical context.
Affiliation(s)
- Julie J Lee
- UCL Institute of Ophthalmology, University College London, Gower Street, London WC1E 6AE, UK

- Michael Krumin
- UCL Institute of Ophthalmology, University College London, Gower Street, London WC1E 6AE, UK

- Kenneth D Harris
- UCL Queen Square Institute of Neurology, University College London, Gower Street, London WC1E 6AE, UK

- Matteo Carandini
- UCL Institute of Ophthalmology, University College London, Gower Street, London WC1E 6AE, UK
14. Tsui KC, Roy J, Chau SC, Wong KH, Shi L, Poon CH, Wang Y, Strekalova T, Aquili L, Chang RCC, Fung ML, Song YQ, Lim LW. Distribution and inter-regional relationship of amyloid-beta plaque deposition in a 5xFAD mouse model of Alzheimer’s disease. Front Aging Neurosci 2022; 14:964336. [PMID: 35966777] [PMCID: PMC9371463] [DOI: 10.3389/fnagi.2022.964336]
Abstract
Alzheimer’s disease (AD) is the most common form of dementia. Although previous studies have selectively investigated the localization of amyloid-beta (Aβ) deposition in certain brain regions, a comprehensive characterization of the rostro-caudal distribution of Aβ plaques in the brain and their inter-regional correlation remain unexplored. Our results demonstrated remarkable working and spatial memory deficits in 9-month-old 5xFAD mice compared to wildtype mice. High Aβ plaque load was detected in the somatosensory cortex, piriform cortex, thalamus, and dorsal/ventral hippocampus; moderate levels of Aβ plaques were observed in the motor cortex, orbital cortex, visual cortex, and retrosplenial dysgranular cortex; and low levels of Aβ plaques were located in the amygdala, and the cerebellum; but no Aβ plaques were found in the hypothalamus, raphe nuclei, vestibular nucleus, and cuneate nucleus. Interestingly, the deposition of Aβ plaques was positively associated with brain inter-regions including the prefrontal cortex, somatosensory cortex, medial amygdala, thalamus, and the hippocampus. In conclusion, this study provides a comprehensive morphological profile of Aβ deposition in the brain and its inter-regional correlation. This suggests an association between Aβ plaque deposition and specific brain regions in AD pathogenesis.
Affiliation(s)
- Ka Chun Tsui: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Jaydeep Roy: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Sze Chun Chau: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Kah Hui Wong: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Anatomy, Faculty of Medicine, Universiti Malaya, Kuala Lumpur, Malaysia
- Lei Shi: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Chi Him Poon: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Yingyi Wang: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Tatyana Strekalova: Department of Neuroscience, Maastricht University, Maastricht, Netherlands; Department of Normal Physiology and Laboratory of Psychiatric Neurobiology, Sechenov First Moscow State Medical University, Moscow, Russia
- Luca Aquili: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Discipline of Psychology, College of Science, Health, Engineering, and Education, Murdoch University, Perth, WA, Australia
- Raymond Chuen-Chung Chang: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Man-Lung Fung: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- You-qiang Song: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; The State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Lee Wei Lim: School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
Correspondence: Man-Lung Fung, You-qiang Song, Lee Wei Lim
15
Price BH, Gavornik JP. Efficient Temporal Coding in the Early Visual System: Existing Evidence and Future Directions. Front Comput Neurosci 2022; 16:929348. PMID: 35874317; PMCID: PMC9298461; DOI: 10.3389/fncom.2022.929348.
Abstract
While it is universally accepted that the brain makes predictions, there is little agreement about how this is accomplished and under which conditions. Accurate prediction requires neural circuits to learn and store spatiotemporal patterns observed in the natural environment, but it is not obvious how such information should be stored or encoded. Information theory provides a mathematical formalism that can be used to measure the efficiency and utility of different coding schemes for data transfer and storage. This theory shows that codes become efficient when they remove predictable, redundant spatial and temporal information. Efficient coding has been used to understand retinal computations and may also be relevant to understanding more complicated temporal processing in visual cortex. However, the literature on efficient coding in cortex is varied and can be confusing, since the same terms are used to mean different things in different experimental and theoretical contexts. In this work, we attempt to provide a clear summary of the theoretical relationship between efficient coding and temporal prediction, and review evidence that efficient coding principles explain computations in the retina. We then apply the same framework to computations occurring in early visuocortical areas, arguing that data from rodents are largely consistent with the predictions of this model. Finally, we review and respond to criticisms of efficient coding and suggest ways that this theory might be used to design future experiments, with particular focus on understanding the extent to which neural circuits make predictions from efficient representations of environmental statistics.
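The redundancy-reduction idea summarized in this abstract can be illustrated with a toy predictive code (a hypothetical sketch, not taken from the paper): for a temporally correlated signal, transmitting only the prediction error removes the predictable component, so the variance a downstream channel must encode shrinks dramatically.

```python
import numpy as np

rng = np.random.default_rng(1)

# A temporally redundant "natural" signal: an AR(1) process in which each
# sample is largely predictable from the previous one.
phi, n = 0.95, 100_000
noise = rng.normal(0.0, 1.0, n)
x = np.empty(n)
x[0] = noise[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]

# Predictive (temporally efficient) code: transmit only the prediction error.
residual = x[1:] - phi * x[:-1]

# For a Gaussian source, coding cost grows with log-variance, so removing the
# predictable component reduces the required channel capacity.
print(f"raw variance      ~ {x.var():.2f}")        # ~ 1 / (1 - phi**2), i.e. about 10
print(f"residual variance ~ {residual.var():.2f}")  # about 1
```

This is the same logic the abstract attributes to efficient codes: the residual stream carries all the new information in the signal while using a far narrower dynamic range.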
16
Hippocampal place codes are gated by behavioral engagement. Nat Neurosci 2022; 25:561-566. PMID: 35449355; PMCID: PMC9076532; DOI: 10.1038/s41593-022-01050-4.
Abstract
As animals explore an environment, the hippocampus is thought to automatically form and maintain a place code by combining sensory and self-motion signals. Instead, we observed an extensive degradation of the place code when mice voluntarily disengaged from a virtual navigation task, remarkably even as they continued to traverse the identical environment. Internal states, therefore, can strongly gate spatial maps and reorganize hippocampal activity even without sensory and self-motion changes.

The authors found that the expression of spatial maps in the hippocampus is modulated by the internal state of an animal. Thus, the brain’s code for spatial positions within an environment can transform even without changes to the external world.
17
Low IIC, Giocomo LM. Task engagement turns on spatial maps. Nat Neurosci 2022; 25:534-535. PMID: 35449356; DOI: 10.1038/s41593-022-01051-3.
Affiliation(s)
- Isabel I C Low: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
- Lisa M Giocomo: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA, USA
18
Zong W, Obenhaus HA, Skytøen ER, Eneqvist H, de Jong NL, Vale R, Jorge MR, Moser MB, Moser EI. Large-scale two-photon calcium imaging in freely moving mice. Cell 2022; 185:1240-1256.e30. PMID: 35305313; PMCID: PMC8970296; DOI: 10.1016/j.cell.2022.02.017.
Abstract
We developed a miniaturized two-photon microscope (MINI2P) for fast, high-resolution, multiplane calcium imaging of over 1,000 neurons at a time in freely moving mice. With a microscope weight below 3 g and a highly flexible connection cable, MINI2P allowed stable imaging with no impediment of behavior in a variety of assays compared to untethered, unimplanted animals. The improved cell yield was achieved through an optical system design featuring an enlarged field of view (FOV) and a microtunable lens with increased z-scanning range and speed that allows fast and stable imaging of multiple interleaved planes, as well as 3D functional imaging. Successive imaging across multiple, adjacent FOVs enabled recordings from more than 10,000 neurons in the same animal. Large-scale proof-of-principle data were obtained from cell populations in visual cortex, medial entorhinal cortex, and hippocampus, revealing spatial tuning of cells in all areas.
Highlights:
- We made a light-weight 2-photon miniscope for calcium imaging in freely moving mice
- Stable high-quality imaging was observed during a wide spectrum of behaviors
- Activity can be monitored in volumes of over 1,000 visual or entorhinal-cortex cells
- A custom-designed z-scanning module allows fast imaging across multiple planes
Affiliation(s)
- Weijian Zong: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Horst A Obenhaus: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Emilie R Skytøen: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Hanna Eneqvist: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Nienke L de Jong: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Ruben Vale: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Marina R Jorge: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- May-Britt Moser: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
- Edvard I Moser: Kavli Institute for Systems Neuroscience and Centre for Neural Computation, Norwegian University of Science and Technology (NTNU), Trondheim NO-7491, Norway
19
Parametric Copula-GP model for analyzing multidimensional neuronal and behavioral relationships. PLoS Comput Biol 2022; 18:e1009799. PMID: 35089913; PMCID: PMC8827448; DOI: 10.1371/journal.pcbi.1009799.
Abstract
One of the main goals of current systems neuroscience is to understand how neuronal populations integrate sensory information to inform behavior. However, estimating stimulus or behavioral information that is encoded in high-dimensional neuronal populations is challenging. We propose a method based on parametric copulas which allows modeling joint distributions of neuronal and behavioral variables characterized by different statistics and timescales. To account for temporal or spatial changes in dependencies between variables, we model varying copula parameters by means of Gaussian processes (GPs). We validate the resulting Copula-GP framework on synthetic data and on neuronal and behavioral recordings obtained in awake mice. We show that the use of a parametric description of the high-dimensional dependence structure in our method provides better accuracy in mutual information estimation in higher dimensions compared to other non-parametric methods. Moreover, by quantifying the redundancy between neuronal and behavioral variables, our model exposed the location of the reward zone in an unsupervised manner (i.e., without using any explicit cues about the task structure). These results demonstrate that the Copula-GP framework is particularly useful for the analysis of complex multidimensional relationships between neuronal, sensory, and behavioral variables.

Understanding the relationship between a set of variables is a common problem in many fields, such as weather forecasting or stock market analysis. In neuroscience, one of the main challenges is to characterize the dependencies between neuronal activity, sensory stimuli, and behavioral outputs. A method of choice for modeling such statistical dependencies is based on copulas, which disentangle dependencies from single-variable statistics. To account for changes in dependencies, we model changes in copula parameters by means of Gaussian processes conditioned on a task-related variable. The novelty of our approach includes (1) explicit modeling of the dependencies and (2) combining different copulas to describe experimentally observed variability. We validate the goodness-of-fit as well as information estimates on synthetic data and on recordings from the visual cortex of mice performing a behavioral task. Our parametric model demonstrates significantly better performance in describing high-dimensional dependencies compared to other commonly used techniques. We demonstrate that our model can estimate information and predict behaviorally relevant parameters of the task without providing any explicit cues to the model. Our results indicate that our model is interpretable in the context of neuroscience applications, scalable to large datasets, and suitable for accurate statistical modeling and information estimation.
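The core idea described in this abstract, a copula whose parameter is modulated by a task variable, can be sketched in a few lines. This is a hypothetical illustration using a single Gaussian copula with a hand-picked parameter curve; the actual Copula-GP framework combines several copula families and infers the parameter functions with Gaussian processes.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def sample_gaussian_copula(rho, n):
    """Draw n pairs whose dependence is a Gaussian copula with parameter rho.

    A correlated bivariate normal is pushed through the normal CDF, so each
    margin becomes Uniform(0, 1); the dependence that remains is the copula.
    """
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return norm.cdf(z)

# Dependence between a "neuronal" and a "behavioral" variable that varies with
# a task variable t (e.g. position along a track), peaking mid-track.
for t in np.linspace(0.0, 1.0, 5):
    rho = 0.9 * np.exp(-((t - 0.5) ** 2) / 0.05)
    u = sample_gaussian_copula(rho, 20_000)
    r = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
    print(f"t={t:.2f}  copula rho={rho:.2f}  empirical corr={r:.2f}")
```

Because the margins are uniform by construction, the measured correlation reflects dependence alone, independent of each variable's single-variable statistics, which is exactly the property that lets copulas "disentangle dependencies from single-variable statistics."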
20
Resulaj A. Projections of the Mouse Primary Visual Cortex. Front Neural Circuits 2021; 15:751331. PMID: 34867213; PMCID: PMC8641241; DOI: 10.3389/fncir.2021.751331.
Abstract
Lesion or damage to the primary visual cortex (V1) results in a profound loss of visual perception in humans. Similarly, in mice, optogenetic silencing of V1 profoundly impairs discrimination of oriented gratings. V1 is thought to have such a critical role in perception in part because of its position in the visual processing hierarchy: it is the first neocortical area to receive visual input, and it distributes this information to more than 18 brain areas. Here I review recent advances in our understanding of the organization and function of V1 projections in the mouse. This progress is in part due to new anatomical and viral techniques that allow for efficient labeling of projection neurons. I conclude by highlighting challenges and opportunities for future research.
Affiliation(s)
- Arbora Resulaj: Department of Biology, University of Toronto Mississauga, Mississauga, ON, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto, ON, Canada
21
Speed A, Haider B. Probing mechanisms of visual spatial attention in mice. Trends Neurosci 2021; 44:822-836. PMID: 34446296; PMCID: PMC8484049; DOI: 10.1016/j.tins.2021.07.009.
Abstract
The role of spatial attention for visual perception has been thoroughly studied in primates, but less so in mice. Several behavioral tasks in mice reveal spatial attentional effects, with similarities to observations in primates. Pairing these tasks with large-scale, cell-type-specific techniques could enable deeper access to underlying mechanisms, and help define the utility and limitations of resolving attentional effects on visual perception and neural activity in mice. In this Review, we evaluate behavioral and neural evidence for visual spatial attention in mice; assess how specializations of the mouse visual system and behavioral repertoire impact interpretation of spatial attentional effects; and outline how several measurement and manipulation techniques in mice could precisely test and refine models of attentional modulation across scales.
Affiliation(s)
- Anderson Speed: Biomedical Engineering, Georgia Institute of Technology & Emory University, Atlanta, GA, USA
- Bilal Haider: Biomedical Engineering, Georgia Institute of Technology & Emory University, Atlanta, GA, USA
22
Linton P. V1 as an egocentric cognitive map. Neurosci Conscious 2021; 2021:niab017. PMID: 34532068; PMCID: PMC8439394; DOI: 10.1093/nc/niab017.
Abstract
We typically distinguish between V1 as an egocentric perceptual map and the hippocampus as an allocentric cognitive map. In this article, we argue that V1 also functions as a post-perceptual egocentric cognitive map. We argue that three well-documented functions of V1, namely (i) the estimation of distance, (ii) the estimation of size, and (iii) multisensory integration, are better understood as post-perceptual cognitive inferences. This argument has two important implications. First, we argue that V1 must function as the neural correlates of the visual perception/cognition distinction and suggest how this can be accommodated by V1's laminar structure. Second, we use this insight to propose a low-level account of visual consciousness in contrast to mid-level accounts (recurrent processing theory; integrated information theory) and higher-level accounts (higher-order thought; global workspace theory). Detection thresholds have been traditionally used to rule out such an approach, but we explain why it is a mistake to equate visibility (and therefore the presence/absence of visual experience) with detection thresholds.
Affiliation(s)
- Paul Linton: Centre for Applied Vision Research, City, University of London, Northampton Square, London EC1V 0HB, UK