1. Skyberg RJ, Niell CM. Natural visual behavior and active sensing in the mouse. Curr Opin Neurobiol 2024; 86:102882. PMID: 38704868. DOI: 10.1016/j.conb.2024.102882.
Abstract
In the natural world, animals use vision for a wide variety of behaviors not reflected in most laboratory paradigms. Although mice have low-acuity vision, they use it for many natural behaviors, including predator avoidance, prey capture, and navigation. They also perform active sensing, moving their head and eyes to achieve behavioral goals and acquire visual information. These features of natural vision produce visual inputs and corresponding behavioral outputs that fall outside the range of conventional vision studies but are essential to visual function. Here, we review recent studies in mice that have tapped into natural behavior and active sensing to reveal the computational logic of neural circuits for vision.
Affiliation(s)
- Rolf J Skyberg
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Cristopher M Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene OR 97403, USA.

2. Oesch LT, Ryan MB, Churchland AK. From innate to instructed: A new look at perceptual decision-making. Curr Opin Neurobiol 2024; 86:102871. PMID: 38569230. PMCID: PMC11162954. DOI: 10.1016/j.conb.2024.102871.
Abstract
Understanding how subjects perceive sensory stimuli in their environment and use this information to guide appropriate actions is a major challenge in neuroscience. To study perceptual decision-making in animals, researchers use tasks that either probe spontaneous responses to stimuli (often described as "naturalistic") or train animals to associate stimuli with experimenter-defined responses. Spontaneous decisions rely on animals' pre-existing knowledge, while trained tasks offer greater versatility, albeit often at the cost of extensive training. Here, we review emerging approaches to investigate perceptual decision-making using both spontaneous and trained behaviors, highlighting their strengths and limitations. Additionally, we propose how trained decision-making tasks could be improved to achieve faster learning and a more generalizable understanding of task rules.
Affiliation(s)
- Lukas T Oesch
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Michael B Ryan
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Anne K Churchland
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States.

3. Ambrad Giovannetti E, Rancz E. Behind mouse eyes: The function and control of eye movements in mice. Neurosci Biobehav Rev 2024; 161:105671. PMID: 38604571. DOI: 10.1016/j.neubiorev.2024.105671.
Abstract
The mouse visual system has become the most popular model to study the cellular and circuit mechanisms of sensory processing. However, the importance of eye movements only started to be appreciated recently. Eye movements provide a basis for predictive sensing and deliver insights into various brain functions and dysfunctions. A plethora of knowledge on the central control of eye movements and their role in perception and behaviour arose from work on primates. However, an overview of various eye movements in mice and a comparison to primates is missing. Here, we review the eye movement types described to date in mice and compare them to those observed in primates. We discuss the central neuronal mechanisms for their generation and control. Furthermore, we review the mounting literature on eye movements in mice during head-fixed and freely moving behaviours. Finally, we highlight gaps in our understanding and suggest future directions for research.
Affiliation(s)
- Ede Rancz
- INMED, INSERM, Aix-Marseille University, Marseille, France.

4. Pinke D, Issa JB, Dara GA, Dobos G, Dombeck DA. Full field-of-view virtual reality goggles for mice. Neuron 2023; 111:3941-3952.e6. PMID: 38070501. PMCID: PMC10841834. DOI: 10.1016/j.neuron.2023.11.019.
Abstract
Visual virtual reality (VR) systems for head-fixed mice offer advantages over real-world studies for investigating the neural circuitry underlying behavior. However, current VR approaches do not fully cover the visual field of view of mice, do not stereoscopically illuminate the binocular zone, and leave the lab frame visible. To overcome these limitations, we developed iMRSIV (Miniature Rodent Stereo Illumination VR), a VR goggle system for mice. Our system is compact, separately illuminates each eye for stereo vision, and provides each eye with an ∼180° field of view, thus excluding the lab frame while accommodating saccades. Mice using iMRSIV engaged in virtual navigation behaviors more quickly than with a conventional monitor-based system and displayed freezing and fleeing reactions to overhead looming stimulation. Using iMRSIV with two-photon functional imaging, we found large populations of hippocampal place cells during virtual navigation, global remapping during environment changes, and unique responses of place cell ensembles to overhead looming stimulation.
Affiliation(s)
- Domonkos Pinke
- Department of Neurobiology, Northwestern University, Evanston, IL 60208, USA
- John B Issa
- Department of Neurobiology, Northwestern University, Evanston, IL 60208, USA
- Gabriel A Dara
- Department of Neurobiology, Northwestern University, Evanston, IL 60208, USA
- Gergely Dobos
- 360world Ltd, Sümegvár köz 9, 1118 Budapest, Hungary
- Daniel A Dombeck
- Department of Neurobiology, Northwestern University, Evanston, IL 60208, USA

5. Parker PRL, Martins DM, Leonard ESP, Casey NM, Sharp SL, Abe ETT, Smear MC, Yates JL, Mitchell JF, Niell CM. A dynamic sequence of visual processing initiated by gaze shifts. Nat Neurosci 2023; 26:2192-2202. PMID: 37996524. DOI: 10.1038/s41593-023-01481-7.
Abstract
Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, where head movements are accompanied by saccadic eye movements, rather than to head movements where compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting a large component corresponds to onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ, USA
- Dylan M Martins
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Nathan M Casey
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Shelby L Sharp
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Matthew C Smear
- Institute of Neuroscience and Department of Psychology, University of Oregon, Eugene, OR, USA
- Jacob L Yates
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Jude F Mitchell
- Department of Brain and Cognitive Sciences and Center for Visual Sciences, University of Rochester, Rochester, NY, USA
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA

6. Klioutchnikov A, Kerr JND. Chasing cortical behavior: designing multiphoton microscopes for imaging neuronal populations in freely moving rodents. Neurophotonics 2023; 10:044411. PMID: 37886044. PMCID: PMC10599648. DOI: 10.1117/1.nph.10.4.044411.
Abstract
Imaging in the freely moving animal gives unparalleled access to circuit activity as the animal interacts with its environment in a self-guided way. Over the past few years, new imaging technologies have enabled the interrogation of neuronal populations located at any depth of the cortex in freely moving mice while preserving the animal's behavioral repertoire. This commentary gives an updated overview of the recent advances that have enabled the link between behavior and the underlying neuronal activity to be explored.
Affiliation(s)
- Alexandr Klioutchnikov
- Max Planck Institute for Neurobiology of Behavior, Department of Behavior and Brain Organization, Bonn, Germany
- Jason N. D. Kerr
- Max Planck Institute for Neurobiology of Behavior, Department of Behavior and Brain Organization, Bonn, Germany

7. Lehnert J, Cha K, Halperin J, Yang K, Zheng DF, Khadra A, Cook EP, Krishnaswamy A. Visual attention to features and space in mice using reverse correlation. Curr Biol 2023; 33:3690-3701.e4. PMID: 37611588. DOI: 10.1016/j.cub.2023.07.060.
Abstract
Visual attention allows the brain to evoke behaviors based on the most important visual features. Mouse models offer immense potential to gain a circuit-level understanding of this phenomenon, yet how mice distribute attention across features and locations is not well understood. Here, we describe a new approach to address this limitation by training mice to detect weak vertical bars in a background of dynamic noise while spatial cues manipulate their attention. By adapting a reverse-correlation method from human studies, we linked behavioral decisions to stimulus features and locations. We show that mice deployed attention to a small rostral region of the visual field. Within this region, mice attended to multiple features (orientation, spatial frequency, contrast) that indicated the presence of weak vertical bars. This attentional tuning grew with training, multiplicatively scaled behavioral sensitivity, approached that of an ideal observer, and resembled the effects of attention in humans. Taken together, we demonstrate that mice can simultaneously attend to multiple features and locations of a visual stimulus.
Affiliation(s)
- Jonas Lehnert
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada; Quantitative Life Sciences, McGill University, Montreal, QC H3A 1E3, Canada
- Kuwook Cha
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada
- Jamie Halperin
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada
- Kerry Yang
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada
- Daniel F Zheng
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada
- Anmar Khadra
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada; Quantitative Life Sciences, McGill University, Montreal, QC H3A 1E3, Canada; Centre for Applied Mathematics in Bioscience and Medicine, McGill University, Montreal, QC H3G 0B1, Canada
- Erik P Cook
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada; Quantitative Life Sciences, McGill University, Montreal, QC H3A 1E3, Canada; Centre for Applied Mathematics in Bioscience and Medicine, McGill University, Montreal, QC H3G 0B1, Canada
- Arjun Krishnaswamy
- Department of Physiology, McGill University, Montreal, QC H3G 1Y6, Canada; Quantitative Life Sciences, McGill University, Montreal, QC H3A 1E3, Canada

8. Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. PMID: 37380885. DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK.
- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany.
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany.

9. Yates JL, Coop SH, Sarch GH, Wu RJ, Butts DA, Rucci M, Mitchell JF. Detailed characterization of neural selectivity in free viewing primates. Nat Commun 2023; 14:3656. PMID: 37339973. PMCID: PMC10282080. DOI: 10.1038/s41467-023-38564-9.
Abstract
Fixation constraints in visual tasks are ubiquitous in visual and cognitive neuroscience. Despite their widespread use, fixation requires trained subjects, is limited by the accuracy of fixational eye movements, and ignores the role of eye movements in shaping visual input. To overcome these limitations, we developed a suite of hardware and software tools to study vision during natural behavior in untrained subjects. We measured visual receptive fields and tuning properties from multiple cortical areas of marmoset monkeys that freely viewed full-field noise stimuli. The resulting receptive fields and tuning curves from primary visual cortex (V1) and area MT match the selectivity reported in the literature, which was measured using conventional approaches. We then combined free viewing with high-resolution eye tracking to make the first detailed 2D spatiotemporal measurements of foveal receptive fields in V1. These findings demonstrate the power of free viewing to characterize neural responses in untrained animals while simultaneously studying the dynamics of natural behavior.
Affiliation(s)
- Jacob L Yates
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA.
- Center for Visual Science, University of Rochester, Rochester, NY, USA.
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA.
- Herbert Wertheim School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA.
- Shanna H Coop
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Neurobiology, Stanford University, Stanford, CA, USA
- Gabriel H Sarch
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA, USA
- Ruei-Jr Wu
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Institute of Optics, University of Rochester, Rochester, NY, USA
- Daniel A Butts
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Michele Rucci
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Jude F Mitchell
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA

10. Jauch I, Kamm J, Benn L, Rettig L, Friederich HC, Tesarz J, Kuner T, Wieland S. 2MDR, a Microcomputer-Controlled Visual Stimulation Device for Psychotherapy-Like Treatments of Mice. eNeuro 2023; 10:ENEURO.0394-22.2023. PMID: 37268421. DOI: 10.1523/eneuro.0394-22.2023.
Abstract
Post-traumatic stress disorder and other mental disorders can be treated by an established psychotherapy called Eye Movement Desensitization and Reprocessing (EMDR). In EMDR, patients are confronted with traumatic memories while they are stimulated with alternating bilateral stimuli (ABS). How ABS affects the brain, and whether ABS could be adapted to different patients or mental disorders, is unknown. Interestingly, ABS reduced conditioned fear in mice. Yet an approach for systematically testing complex visual stimuli and comparing the resulting differences in emotional processing using semiautomated or automated behavioral analysis has been lacking. We developed 2MDR (MultiModal Visual Stimulation to Desensitize Rodents), a novel, open-source, low-cost, customizable device that can be integrated into commercial rodent behavioral setups and controlled via transistor-transistor logic (TTL). 2MDR allows the design and precise steering of multimodal visual stimuli in the head direction of freely moving mice. Optimized videography allows semiautomatic analysis of rodent behavior during visual stimulation. Detailed building, integration, and treatment instructions, along with open-source software, provide easy access for inexperienced users. Using 2MDR, we confirmed that EMDR-like ABS persistently improves fear extinction in mice and showed for the first time that ABS-mediated anxiolytic effects strongly depend on physical stimulus properties such as ABS brightness. 2MDR not only enables researchers to interfere with mouse behavior in an EMDR-like setting, but also demonstrates that visual stimuli can be used as a noninvasive brain stimulation to differentially alter emotional processing in mice.
Affiliation(s)
- Isa Jauch
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Jan Kamm
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Luca Benn
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Lukas Rettig
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Hans-Christoph Friederich
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany
- Jonas Tesarz
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany
- Thomas Kuner
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Sebastian Wieland
- Department of Functional Neuroanatomy, Institute for Anatomy and Cell Biology, Heidelberg University, 69120 Heidelberg, Germany
- Department of General Internal and Psychosomatic Medicine, Heidelberg University and Heidelberg University Hospital, 69115 Heidelberg, Germany

11. Klioutchnikov A, Wallace DJ, Sawinski J, Voit KM, Groemping Y, Kerr JND. A three-photon head-mounted microscope for imaging all layers of visual cortex in freely moving mice. Nat Methods 2023; 20:610-616. PMID: 36443485. PMCID: PMC10089923. DOI: 10.1038/s41592-022-01688-9.
Abstract
Advances in head-mounted microscopes have enabled imaging of neuronal activity using genetic tools in freely moving mice but these microscopes are restricted to recording in minimally lit arenas and imaging upper cortical layers. Here we built a 2-g, three-photon excitation-based microscope, containing a z-drive that enabled access to all cortical layers while mice freely behaved in a fully lit environment. The microscope had on-board photon detectors, robust to environmental light, and the arena lighting was timed to the end of each line-scan, enabling functional imaging of activity from cortical layer 4 and layer 6 neurons expressing jGCaMP7f in mice roaming a fully lit or dark arena. By comparing the neuronal activity measured from populations in these layers we show that activity in cortical layer 4 and layer 6 is differentially modulated by lit and dark conditions during free exploration.
Affiliation(s)
- Alexandr Klioutchnikov
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Damian J Wallace
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Juergen Sawinski
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Kay-Michael Voit
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Yvonne Groemping
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Jason N D Kerr
- Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany

12. Harris SC, Dunn FA. Asymmetric retinal direction tuning predicts optokinetic eye movements across stimulus conditions. eLife 2023; 12:e81780. PMID: 36930180. PMCID: PMC10023158. DOI: 10.7554/elife.81780.
Abstract
Across species, the optokinetic reflex (OKR) stabilizes vision during self-motion. OKR occurs when ON direction-selective retinal ganglion cells (oDSGCs) detect slow, global image motion on the retina. How oDSGC activity is integrated centrally to generate behavior remains unknown. Here, we discover mechanisms that contribute to motion encoding in vertically tuned oDSGCs and leverage these findings to empirically define signal transformation between retinal output and vertical OKR behavior. We demonstrate that motion encoding in vertically tuned oDSGCs is contrast-sensitive and asymmetric for oDSGC types that prefer opposite directions. These phenomena arise from the interplay between spike threshold nonlinearities and differences in synaptic input weights, including shifts in the balance of excitation and inhibition. In behaving mice, these neurophysiological observations, along with a central subtraction of oDSGC outputs, accurately predict the trajectories of vertical OKR across stimulus conditions. Thus, asymmetric tuning across competing sensory channels can critically shape behavior.
Affiliation(s)
- Scott C Harris
- Department of Ophthalmology, University of California, San Francisco, San Francisco, United States
- Neuroscience Graduate Program, University of California, San Francisco, San Francisco, United States
- Felice A Dunn
- Department of Ophthalmology, University of California, San Francisco, San Francisco, United States

13. Through Hawks’ Eyes: Synthetically Reconstructing the Visual Field of a Bird in Flight. Int J Comput Vis 2023; 131:1497-1531. PMID: 37089199. PMCID: PMC10110700. DOI: 10.1007/s11263-022-01733-2.
Abstract
Birds of prey rely on vision to execute flight manoeuvres that are key to their survival, such as intercepting fast-moving targets or navigating through clutter. A better understanding of the role played by vision during these manoeuvres is not only relevant within the field of animal behaviour, but could also have applications for autonomous drones. In this paper, we present a novel method that uses computer vision tools to analyse the role of active vision in bird flight, and demonstrate its use to answer behavioural questions. Combining motion capture data from Harris’ hawks with a hybrid 3D model of the environment, we render RGB images, semantic maps, depth information and optic flow outputs that characterise the visual experience of the bird in flight. In contrast with previous approaches, our method allows us to consider different camera models and alternative gaze strategies for the purposes of hypothesis testing, allows us to consider visual input over the complete visual field of the bird, and is not limited by the technical specifications and performance of a head-mounted camera light enough to attach to a bird’s head in flight. We present pilot data from three sample flights: a pursuit flight, in which a hawk intercepts a moving target, and two obstacle avoidance flights. With this approach, we provide a reproducible method that facilitates the collection of large volumes of data across many individuals, opening up new avenues for data-driven models of animal behaviour.

14. Horrocks EAB, Mareschal I, Saleem AB. Walking humans and running mice: perception and neural encoding of optic flow during self-motion. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210450. PMID: 36511417. PMCID: PMC9745880. DOI: 10.1098/rstb.2021.0450.
Abstract
Locomotion produces full-field optic flow that often dominates the visual motion inputs to an observer. The perception of optic flow is in turn important for animals to guide their heading and interact with moving objects. Understanding how locomotion influences optic flow processing and perception is therefore essential to understand how animals successfully interact with their environment. Here, we review research investigating how perception and neural encoding of optic flow are altered during self-motion, focusing on locomotion. Self-motion has been found to influence estimation and sensitivity for optic flow speed and direction. Nonvisual self-motion signals also increase compensation for self-driven optic flow when parsing the visual motion of moving objects. The integration of visual and nonvisual self-motion signals largely follows principles of Bayesian inference and can improve the precision and accuracy of self-motion perception. The calibration of visual and nonvisual self-motion signals is dynamic, reflecting the changing visuomotor contingencies across different environmental contexts. Throughout this review, we consider experimental research using humans, non-human primates and mice. We highlight experimental challenges and opportunities afforded by each of these species and draw parallels between experimental findings. These findings reveal a profound influence of locomotion on optic flow processing and perception across species. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Edward A. B. Horrocks
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
- Isabelle Mareschal
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London E1 4NS, UK
- Aman B. Saleem
- Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
| |
15
Juvenile Shank3 KO Mice Adopt Distinct Hunting Strategies during Prey Capture Learning. eNeuro 2022; 9:ENEURO.0230-22.2022. [PMID: 36446569 PMCID: PMC9768843 DOI: 10.1523/eneuro.0230-22.2022]
Abstract
Mice are opportunistic omnivores that readily learn to hunt and eat insects such as crickets. The details of how mice learn these behaviors and how these behaviors may differ in strains with altered neuroplasticity are unclear. We quantified the behavior of juvenile wild-type (WT) and Shank3 knock-out (KO) mice as they learned to hunt crickets during the critical period for ocular dominance plasticity. This stage involves heightened cortical plasticity including homeostatic synaptic scaling, which requires Shank3, a glutamatergic synaptic protein that, when mutated, produces Phelan-McDermid syndrome and is often comorbid with autism spectrum disorder (ASD). Both strains showed interest in examining live and dead crickets and learned to hunt. Shank3 knock-out mice took longer to become proficient, and, after 5 d, did not achieve the efficiency of wild-type mice in either time-to-capture or distance-to-capture. Shank3 knock-out mice also exhibited different characteristics when pursuing crickets that could not be explained by a simple motor deficit. Although both genotypes moved at the same average speed when approaching a cricket, Shank3 KO mice paused more often, did not begin final accelerations toward crickets as early, and did not close the distance gap to the cricket as quickly as wild-type mice. These differences in Shank3 KO mice are reminiscent of some behavioral characteristics of individuals with ASD as they perform complex tasks, such as slower action initiation and completion. This paradigm will be useful for exploring the neural circuit mechanisms that underlie these learning and performance differences in monogenic ASD rodent models.
16
Parker PRL, Abe ETT, Beatie NT, Leonard ESP, Martins DM, Sharp SL, Wyrick DG, Mazzucato L, Niell CM. Distance estimation from monocular cues in an ethological visuomotor task. eLife 2022; 11:e74708. [PMID: 36125119 PMCID: PMC9489205 DOI: 10.7554/elife.74708]
Abstract
In natural contexts, sensory processing and motor output are closely coupled, which is reflected in the fact that many brain areas contain both sensory and movement signals. However, standard reductionist paradigms decouple sensory decisions from their natural motor consequences, and head-fixation prevents the natural sensory consequences of self-motion. In particular, movement through the environment provides a number of depth cues beyond stereo vision that are poorly understood. To study the integration of visual processing and motor output in a naturalistic task, we investigated distance estimation in freely moving mice. We found that mice use vision to accurately jump across a variable gap, thus directly coupling a visual computation to its corresponding ethological motor output. Monocular eyelid suture did not affect gap jumping success, thus mice can use cues that do not depend on binocular disparity and stereo vision. Under monocular conditions, mice altered their head positioning and performed more vertical head movements, consistent with a shift from using stereopsis to other monocular cues, such as motion or position parallax. Finally, optogenetic suppression of primary visual cortex impaired task performance under both binocular and monocular conditions when optical fiber placement was localized to binocular or monocular zone V1, respectively. Together, these results show that mice can use monocular cues, relying on visual cortex, to accurately judge distance. Furthermore, this behavioral paradigm provides a foundation for studying how neural circuits convert sensory information into ethological motor output.
Collapse
Affiliation(s)
- Philip RL Parker, Institute of Neuroscience, University of Oregon, Eugene, United States
- Elliott TT Abe, Institute of Neuroscience, University of Oregon, Eugene, United States
- Natalie T Beatie, Institute of Neuroscience, University of Oregon, Eugene, United States
- Dylan M Martins, Institute of Neuroscience, University of Oregon, Eugene, United States
- Shelby L Sharp, Institute of Neuroscience, University of Oregon, Eugene, United States
- David G Wyrick, Institute of Neuroscience, University of Oregon, Eugene, United States
- Luca Mazzucato, Institute of Neuroscience and Department of Mathematics, University of Oregon, Eugene, United States
- Cristopher M Niell, Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, United States
17
Treviño M, Medina-Coss Y León R, Lezama E. Response Time Distributions and the Accumulation of Visual Evidence in Freely Moving Mice. Neuroscience 2022; 501:25-41. [PMID: 35995337 DOI: 10.1016/j.neuroscience.2022.08.015]
Abstract
Response time (RT) distributions are histograms of the observed RTs for discriminative choices, comprising a rich source of empirical information for studying perceptual processes. The drift-diffusion model (DDM), a mathematical model of two-alternative decision tasks, reproduces these RT distributions, contributing to our understanding of these processes from a theoretical perspective. Notably, although the mouse is a popular model system for studying brain function and behavior, little is known about mouse perceptual RT distributions or their description from an information-accumulation perspective. We combined an automated visual discrimination task with pharmacological micro-infusions into targeted brain regions to acquire thousands of responses from freely moving adult mice. Both choices and escape latencies showed a strong dependency on stimulus discriminability. By applying a DDM fit to our experimental data, we found that the rate of incoming evidence (drift rate) increased with stimulus contrast but was reversibly impaired when inactivating the primary visual cortex (V1). Other brain regions involved in the decision-making process, the posterior parietal cortex (PPC) and the frontal orienting fields (FOF), also influenced relevant parameters of the DDM. The large number of empirical observations that we collected for this study allowed us to achieve accurate convergence for the model fit. Therefore, changes in the experimental conditions were mirrored by changes in model parameters, suggesting the participation of relevant brain areas in the decision-making process. This approach could help interpret future studies involving attention, discrimination, and learning in adult mice.
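The DDM described above models a choice as noisy evidence accumulating toward one of two bounds; drift rate sets how fast evidence arrives. A minimal simulation makes the abstract's central claim concrete: raising the drift rate (as higher contrast, or an intact V1, would) yields faster and more accurate responses. All parameter values here are illustrative sketches, not fits to the paper's data.

```python
import random

# Minimal drift-diffusion model (DDM): evidence x starts at z*a and
# accumulates with drift v plus Gaussian noise until it hits the upper
# bound a ("correct") or the lower bound 0 ("error").

def ddm_trial(v, a=1.0, z=0.5, dt=0.001, sigma=1.0, rng=random):
    x, t = z * a, 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return (x >= a), t  # (correct?, decision time in seconds)

def summarize(v, n=2000, seed=1):
    rng = random.Random(seed)
    trials = [ddm_trial(v, rng=rng) for _ in range(n)]
    accuracy = sum(c for c, _ in trials) / n
    mean_rt = sum(t for _, t in trials) / n
    return accuracy, mean_rt

low = summarize(v=0.5)   # weak evidence, e.g. low-contrast stimulus
high = summarize(v=3.0)  # strong evidence, e.g. high-contrast stimulus
# high drift gives higher accuracy and shorter mean decision time than low
```

Fitting the model to data inverts this simulation: observed choice proportions and RT histograms constrain v, a, and z, which is how the study could attribute the effect of V1 inactivation specifically to a reduced drift rate.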
Collapse
Affiliation(s)
- Mario Treviño, Laboratorio de Plasticidad Cortical y Aprendizaje Perceptual, Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Jalisco, Mexico
- Ricardo Medina-Coss Y León, Laboratorio de Plasticidad Cortical y Aprendizaje Perceptual, Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Jalisco, Mexico; Simmons Cancer Institute at Southern Illinois University, USA
- Elí Lezama, Laboratorio de Plasticidad Cortical y Aprendizaje Perceptual, Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Jalisco, Mexico
18
Abstract
An ultimate goal in retina science is to understand how the neural circuit of the retina processes natural visual scenes. Yet most studies in laboratories have long been performed with simple, artificial visual stimuli such as full-field illumination, spots of light, or gratings. The underlying assumption is that the features of the retina thus identified carry over to the more complex scenario of natural scenes. As the application of corresponding natural settings is becoming more commonplace in experimental investigations, this assumption is being put to the test and opportunities arise to discover processing features that are triggered by specific aspects of natural scenes. Here, we review how natural stimuli have been used to probe, refine, and complement knowledge accumulated under simplified stimuli, and we discuss challenges and opportunities along the way toward a comprehensive understanding of the encoding of natural scenes. Expected final online publication date for the Annual Review of Vision Science, Volume 8 is September 2022. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Collapse
Affiliation(s)
- Dimokratis Karamanlis, Department of Ophthalmology, University Medical Center Göttingen, Göttingen, Germany; Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany; International Max Planck Research School for Neurosciences, Göttingen, Germany
- Helene Marianne Schreyer, Department of Ophthalmology, University Medical Center Göttingen, Göttingen, Germany; Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany
- Tim Gollisch, Department of Ophthalmology, University Medical Center Göttingen, Göttingen, Germany; Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany; Cluster of Excellence "Multiscale Bioimaging: from Molecular Machines to Networks of Excitable Cells" (MBExC), University of Göttingen, Göttingen, Germany
19
Wheatcroft T, Saleem AB, Solomon SG. Functional Organisation of the Mouse Superior Colliculus. Front Neural Circuits 2022; 16:792959. [PMID: 35601532 PMCID: PMC9118347 DOI: 10.3389/fncir.2022.792959]
Abstract
The superior colliculus (SC) is a highly conserved area of the mammalian midbrain that is widely implicated in the organisation and control of behaviour. SC receives input from, and provides output to, a large number of brain areas. The convergence and divergence of anatomical connections with different areas and systems pose challenges for understanding how SC contributes to behaviour. Recent work in mouse has provided large anatomical datasets, and a wealth of new data from experiments that identify and manipulate different cells within SC, and their inputs and outputs, during simple behaviours. These data offer an opportunity to better understand the roles that SC plays in these behaviours. However, some of the observations appear, at first sight, to be contradictory. Here we review this recent work and hypothesise a simple framework that can capture the observations and requires only a small change to previous models. Specifically, the functional organisation of SC can be explained by supposing that three largely distinct circuits support three largely distinct classes of simple behaviours: arrest, turning towards, and the triggering of escape or capture. These behaviours are hypothesised to be supported by the optic, intermediate and deep layers, respectively.
Collapse
Affiliation(s)
- Samuel G. Solomon, Institute of Behavioural Neuroscience, University College London, London, United Kingdom
20
Sedigh-Sarvestani M, Fitzpatrick D. What and Where: Location-Dependent Feature Sensitivity as a Canonical Organizing Principle of the Visual System. Front Neural Circuits 2022; 16:834876. [PMID: 35498372 PMCID: PMC9039279 DOI: 10.3389/fncir.2022.834876]
Abstract
Traditionally, functional representations in early visual areas are conceived as retinotopic maps preserving ego-centric spatial location information while ensuring that other stimulus features are uniformly represented for all locations in space. Recent results challenge this framework of relatively independent encoding of location and features in the early visual system, emphasizing location-dependent feature sensitivities that reflect specialization of cortical circuits for different locations in visual space. Here we review the evidence for such location-specific encoding including: (1) systematic variation of functional properties within conventional retinotopic maps in the cortex; (2) novel periodic retinotopic transforms that dramatically illustrate the tight linkage of feature sensitivity, spatial location, and cortical circuitry; and (3) retinotopic biases in cortical areas, and groups of areas, that have been defined by their functional specializations. We propose that location-dependent feature sensitivity is a fundamental organizing principle of the visual system that achieves efficient representation of positional regularities in visual experience, and reflects the evolutionary selection of sensory and motor circuits to optimally represent behaviorally relevant information. Future studies are necessary to discover mechanisms underlying joint encoding of location and functional information, how this relates to behavior, emerges during development, and varies across species.
21
Matthis JS, Muller KS, Bonnen KL, Hayhoe MM. Retinal optic flow during natural locomotion. PLoS Comput Biol 2022; 18:e1009575. [PMID: 35192614 PMCID: PMC8896712 DOI: 10.1371/journal.pcbi.1009575]
Abstract
We examine the structure of the visual motion projected on the retina during natural locomotion in real world environments. Bipedal gait generates a complex, rhythmic pattern of head translation and rotation in space, so without gaze stabilization mechanisms such as the vestibulo-ocular reflex (VOR) a walker's visually specified heading would vary dramatically throughout the gait cycle. The act of fixation on stable points in the environment nulls image motion at the fovea, resulting in stable patterns of outflow on the retinae centered on the point of fixation. These outflowing patterns retain a higher order structure that is informative about the stabilized trajectory of the eye through space. We measured this structure by applying curl and divergence operations to the retinal flow velocity vector fields and found features that may be valuable for the control of locomotion. In particular, the sign and magnitude of foveal curl in retinal flow specifies the body's trajectory relative to the gaze point, while the point of maximum divergence in the retinal flow field specifies the walker's instantaneous overground velocity/momentum vector in retinotopic coordinates. Assuming that walkers can determine the body position relative to gaze direction, these time-varying retinotopic cues for the body's momentum could provide a visual control signal for locomotion over complex terrain. In contrast, the temporal variation of the eye-movement-free, head-centered flow fields is large enough to be problematic for use in steering towards a goal. Consideration of optic flow in the context of real-world locomotion therefore suggests a re-evaluation of the role of optic flow in the control of action during natural behavior. We recorded the full body kinematics and binocular gaze of humans walking through real-world natural environments and estimated visual motion (optic flow) using both computational video analysis and geometric simulation.
Contrary to the established theories of the role of optic flow in the control of locomotion, we found that eye-movement-free, head-centric optic flow is highly unstable due to the complex phasic trajectory of the head during natural locomotion, rendering it an unlikely candidate for heading perception. In contrast, retina-centered optic flow consisted of a regular pattern of outflowing motion centered on the fovea. Retinal optic flow contained highly consistent patterns that specified the walker's trajectory relative to the point of fixation, which may provide powerful retinotopic cues for the visual control of locomotion in natural environments. This examination of optic flow in real-world contexts suggests a need to re-evaluate existing theories of the role of optic flow in the visual control of action during natural behavior.
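The curl and divergence measures central to the abstract above can be illustrated on a toy flow field. A pure expansion field (u, v) = (x, y) mimics the outflow from a focus of expansion at the origin: divergence is uniformly positive and curl is zero, whereas a rotational field would show the reverse. This synthetic example (finite differences via `numpy.gradient`) is only a sketch of the operations, not the paper's gaze-stabilized retinal flow data.

```python
import numpy as np

# Curl and divergence of a synthetic 2D "retinal flow" field via finite
# differences. Grid: 41 x 41 samples over [-1, 1] in each direction.
y, x = np.mgrid[-1:1:41j, -1:1:41j]
u, v = x.copy(), y.copy()           # radial outflow: (u, v) = (x, y)

# np.gradient returns derivatives along axis 0 (y) then axis 1 (x).
du_dy, du_dx = np.gradient(u, y[:, 0], x[0, :])
dv_dy, dv_dx = np.gradient(v, y[:, 0], x[0, :])

divergence = du_dx + dv_dy          # local expansion rate
curl = dv_dx - du_dy                # local rotation rate

print(round(float(divergence.mean()), 6), round(float(np.abs(curl).max()), 6))
# prints: 2.0 0.0  (uniform expansion, no rotation)
```

On real retinal flow, the same two scalar fields would be evaluated at the fovea (curl, signaling trajectory relative to the gaze point) and at their spatial maximum (divergence, signaling the instantaneous momentum vector).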
Collapse
Affiliation(s)
- Jonathan Samir Matthis, Department of Biology, Northeastern University, Boston, Massachusetts, United States of America
- Karl S. Muller, Center for Perceptual Systems, University of Texas at Austin, Austin, Texas, United States of America
- Kathryn L. Bonnen, School of Optometry, Indiana University Bloomington, Bloomington, Indiana, United States of America
- Mary M. Hayhoe, Center for Perceptual Systems, University of Texas at Austin, Austin, Texas, United States of America