1. Wu N, Zhou B, Agrochao M, Clark DA. Broken time reversal symmetry in visual motion detection. bioRxiv 2024:2024.06.08.598068. [PMID: 38915608; PMCID: PMC11195140; DOI: 10.1101/2024.06.08.598068]
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in many classical theoretical and practical models of motion detection. However, here we demonstrate that this symmetry of motion perception upon time reversal is often broken in real visual systems. In this work, we designed a set of visual stimuli to investigate how stimulus symmetries affect time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We discovered a suite of new stimuli with a wide variety of different properties that can lead to broken time reversal symmetries in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that break time reversal symmetry, even when the training data was time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks promote some forms of time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.

Significance
In neuroscience, symmetries can tell us about the computations being performed by a circuit. In vision, for instance, one might expect that when a movie is played backward, one's motion percepts should all be reversed. Exact perceptual reversal would indicate a time reversal symmetry, but surprisingly, real visual systems break this symmetry. In this research, we designed visual stimuli to probe different symmetries in motion detection and identify features that lead to symmetry breaking in motion percepts. We discovered that symmetry breaking in motion detection depends strongly on both the detector's architecture and how it is optimized. Interestingly, we find analytically and in simulations that time reversal symmetries are broken in systems optimized to perform with natural inputs.
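The broken symmetry described above can be illustrated with a toy simulation: a classic Hassenstein-Reichardt correlator (HRC) inverts its mean output exactly when the movie is time reversed, whereas a detector sensitive to third-order stimulus correlations need not. The sketch below is illustrative only; the sawtooth stimulus and the "2-1" correlator are assumptions chosen to expose the effect, not the paper's trained network models.

```python
import numpy as np

def hrc(x1, x2, d):
    # Hassenstein-Reichardt correlator with a pure delay of d samples:
    # delayed left input times right input, minus the mirror-image term.
    return np.mean(np.roll(x1, d) * x2 - np.roll(x2, d) * x1)

def corr21(x1, x2, d):
    # A toy "2-1" correlator: squaring one arm makes the detector
    # sensitive to third-order (skewed) stimulus statistics.
    return np.mean(np.roll(x1, d) ** 2 * x2 - np.roll(x2, d) ** 2 * x1)

T, delta, d = 20000, 3, 5

# Rigidly translating stimulus seen by two neighboring photoreceptors.
# A sawtooth is temporally asymmetric: slow rise, abrupt fall.
u = (np.arange(T) % 40) / 40.0 - 0.5
x1, x2 = u, np.roll(u, delta)      # x2 lags x1: rightward motion

# The same movie played backward.
y1, y2 = x1[::-1], x2[::-1]

# The HRC's mean response inverts exactly under time reversal...
assert np.isclose(hrc(x1, x2, d), -hrc(y1, y2, d))

# ...but the 2-1 correlator's does not: time reversal symmetry is broken.
fwd, rev = corr21(x1, x2, d), corr21(y1, y2, d)
assert not np.isclose(rev, -fwd)
```

For this stimulus, fwd and rev even come out with the same sign: reversing the movie does not reverse this detector's motion estimate, which is the qualitative behavior the abstract reports for real visual systems.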
Affiliations
- Nathan Wu
  - Yale College, New Haven, CT 06511, USA
- Baohua Zhou
  - Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
  - Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Damon A. Clark
  - Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
  - Department of Physics, Yale University, New Haven, CT 06511, USA
  - Department of Neuroscience, Yale University, New Haven, CT 06511, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
  - Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
2. Ammer G, Serbe-Kamp E, Mauss AS, Richter FG, Fendl S, Borst A. Multilevel visual motion opponency in Drosophila. Nat Neurosci 2023; 26:1894-1905. [PMID: 37783895; PMCID: PMC10620086; DOI: 10.1038/s41593-023-01443-z]
Abstract
Inhibitory interactions between opponent neuronal pathways constitute a common circuit motif across brain areas and species. However, in most cases, synaptic wiring and biophysical, cellular and network mechanisms generating opponency are unknown. Here, we combine optogenetics, voltage and calcium imaging, connectomics, electrophysiology and modeling to reveal multilevel opponent inhibition in the fly visual system. We uncover a circuit architecture in which a single cell type implements direction-selective, motion-opponent inhibition at all three network levels. This inhibition, mediated by GluClα receptors, is balanced with excitation in strength, despite tenfold fewer synapses. The different opponent network levels constitute a nested, hierarchical structure operating at increasing spatiotemporal scales. Electrophysiology and modeling suggest that distributing this computation over consecutive network levels counteracts a reduction in gain, which would result from integrating large opposing conductances at a single instance. We propose that this neural architecture provides resilience to noise while enabling high selectivity for relevant sensory information.
Affiliations
- Georg Ammer
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Etienne Serbe-Kamp
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
  - Ludwig Maximilian University of Munich, Munich, Germany
- Alex S Mauss
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Florian G Richter
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Sandra Fendl
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Alexander Borst
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
3. Chen J, Gish CM, Fransen JW, Salazar-Gatzimas E, Clark DA, Borghuis BG. Direct comparison reveals algorithmic similarities in fly and mouse visual motion detection. iScience 2023; 26:107928. [PMID: 37810236; PMCID: PMC10550730; DOI: 10.1016/j.isci.2023.107928]
Abstract
Evolution has equipped vertebrates and invertebrates with neural circuits that selectively encode visual motion. While similarities in the computations performed by these circuits in mouse and fruit fly have been noted, direct experimental comparisons have been lacking. Because molecular mechanisms and neuronal morphology in the two species are distinct, we directly compared motion encoding in these two species at the algorithmic level, using matched stimuli and focusing on a pair of analogous neurons, the mouse ON starburst amacrine cell (ON SAC) and Drosophila T4 neurons. We find that the cells share similar spatiotemporal receptive field structures, sensitivity to spatiotemporal correlations, and tuning to sinusoidal drifting gratings, but differ in their responses to apparent motion stimuli. Both neuron types showed a response to summed sinusoids that deviates from models for motion processing in these cells, underscoring the similarities in their processing and identifying response features that remain to be explained.
Affiliations
- Juyue Chen
  - Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Caitlin M Gish
  - Department of Physics, Yale University, New Haven, CT 06511, USA
- James W Fransen
  - Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
- Damon A Clark
  - Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
  - Department of Physics, Yale University, New Haven, CT 06511, USA
  - Department of Molecular, Cellular, Developmental Biology, Yale University, New Haven, CT 06511, USA
  - Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Bart G Borghuis
  - Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
4. Fu Q. Motion perception based on ON/OFF channels: A survey. Neural Netw 2023; 165:1-18. [PMID: 37263088; DOI: 10.1016/j.neunet.2023.05.031]
Abstract
Motion perception is an essential ability for animals and artificially intelligent systems to interact effectively and safely with surrounding objects and environments. Biological visual systems, which have evolved over hundreds of millions of years, are efficient and robust at motion perception, whereas artificial vision systems remain far from such capability. This paper argues that the gap can be significantly reduced by formulating ON/OFF channels in motion perception models, which separately encode luminance increment (ON) and decrement (OFF) responses within the receptive field. Such a signal-bifurcating structure has been found in the neural systems of many animal species, indicating that early motion signals are split and processed in segregated pathways. However, the corresponding biological substrates and their necessity for artificial vision systems have never been elucidated together, leaving open questions about the uniqueness and advantages of ON/OFF channels for building dynamic vision systems that address real-world challenges. This paper highlights the importance of ON/OFF channels in motion perception by surveying current progress in both neuroscience and computational modelling, together with applications. Compared with the related literature, it provides for the first time insights into how different selectivities to directional motion of looming, translating, and small-target movement can be implemented on the basis of ON/OFF channels while keeping with sound, robust biological principles. Finally, existing challenges and future trends of this bio-plausible computational structure for visual perception are discussed in connection with hotspots in machine learning and advanced vision sensors such as event-driven cameras.
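The survey's central structural motif, half-wave rectification splitting a signal into parallel brightness-increment and brightness-decrement channels, is easy to state in code. A minimal sketch (my own illustration, not a model taken from the survey):

```python
import numpy as np

def on_off_split(luminance):
    # Temporal contrast at one photoreceptor, split by half-wave
    # rectification into parallel ON and OFF channels.
    dl = np.diff(luminance)
    on = np.maximum(dl, 0.0)     # ON channel: luminance increments only
    off = np.maximum(-dl, 0.0)   # OFF channel: luminance decrements only
    return on, off

L = np.array([0.2, 0.5, 0.4, 0.9, 0.1])
on, off = on_off_split(L)

# The two rectified channels jointly preserve the signed contrast signal.
assert np.allclose(on - off, np.diff(L))
```

Downstream motion detectors can then correlate ON with ON and OFF with OFF across space, the segregated-pathway arrangement the survey traces from biological circuits into computational models.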
Affiliations
- Qinbing Fu
  - Machine Life and Intelligence Research Centre, School of Mathematics and Information Science, Guangzhou University, Guangzhou, 510006, China
5.
Abstract
How neurons detect the direction of motion is a prime example of neural computation: Motion vision is found in the visual systems of virtually all sighted animals, it is important for survival, and it requires interesting computations with well-defined linear and nonlinear processing steps-yet the whole process is of moderate complexity. The genetic methods available in the fruit fly Drosophila and the charting of a connectome of its visual system have led to rapid progress and unprecedented detail in our understanding of how neurons compute the direction of motion in this organism. The picture that emerged incorporates not only the identity, morphology, and synaptic connectivity of each neuron involved but also its neurotransmitters, its receptors, and their subcellular localization. Together with the neurons' membrane potential responses to visual stimulation, this information provides the basis for a biophysically realistic model of the circuit that computes the direction of visual motion.
Affiliations
- Alexander Borst
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Lukas N Groschner
  - Max Planck Institute for Biological Intelligence, Martinsried, Germany
6
|
Kadakia N, Demir M, Michaelis BT, DeAngelis BD, Reidenbach MA, Clark DA, Emonet T. Odour motion sensing enhances navigation of complex plumes. Nature 2022; 611:754-761. [PMID: 36352224 PMCID: PMC10039482 DOI: 10.1038/s41586-022-05423-4] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2021] [Accepted: 10/06/2022] [Indexed: 11/11/2022]
Abstract
Odour plumes in the wild are spatially complex and rapidly fluctuating structures carried by turbulent airflows[1-4]. To successfully navigate plumes in search of food and mates, insects must extract and integrate multiple features of the odour signal, including odour identity[5], intensity[6] and timing[6-12]. Effective navigation requires balancing these multiple streams of olfactory information and integrating them with other sensory inputs, including mechanosensory and visual cues[9,12,13]. Studies dating back a century have indicated that, of these many sensory inputs, the wind provides the main directional cue in turbulent plumes, leading to the longstanding model of insect odour navigation as odour-elicited upwind motion[6,8-12,14,15]. Here we show that Drosophila melanogaster shape their navigational decisions using an additional directional cue, the direction of motion of odours, which they detect using temporal correlations in the odour signal between their two antennae. Using a high-resolution virtual-reality paradigm to deliver spatiotemporally complex fictive odours to freely walking flies, we demonstrate that such odour-direction sensing involves algorithms analogous to those in visual-direction sensing[16]. Combining simulations, theory and experiments, we show that odour motion contains valuable directional information that is absent from the airflow alone, and that both Drosophila and virtual agents are aided by that information in navigating naturalistic plumes. The generality of our findings suggests that odour-direction sensing may exist throughout the animal kingdom and could improve olfactory robot navigation in uncertain environments.
Affiliations
- Nirag Kadakia
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT, USA
  - Swartz Foundation for Theoretical Neuroscience, Yale University, New Haven, CT, USA
- Mahmut Demir
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT, USA
- Brenden T Michaelis
  - Department of Environmental Sciences, University of Virginia, Charlottesville, VA, USA
- Brian D DeAngelis
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT, USA
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA
- Matthew A Reidenbach
  - Department of Environmental Sciences, University of Virginia, Charlottesville, VA, USA
- Damon A Clark
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT, USA
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA
  - Department of Physics, Yale University, New Haven, CT, USA
- Thierry Emonet
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
  - Quantitative Biology Institute, Yale University, New Haven, CT, USA
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA
  - Department of Physics, Yale University, New Haven, CT, USA
7. Gonzalez-Suarez AD, Zavatone-Veth JA, Chen J, Matulis CA, Badwan BA, Clark DA. Excitatory and inhibitory neural dynamics jointly tune motion detection. Curr Biol 2022; 32:3659-3675.e8. [PMID: 35868321; PMCID: PMC9474608; DOI: 10.1016/j.cub.2022.06.075]
Abstract
Neurons integrate excitatory and inhibitory signals to produce their outputs, but the role of input timing in this integration remains poorly understood. Motion detection is a paradigmatic example of this integration, since theories of motion detection rely on different delays in visual signals. These delays allow circuits to compare scenes at different times to calculate the direction and speed of motion. Different motion detection circuits have different velocity sensitivity, but it remains untested how the response dynamics of individual cell types drive this tuning. Here, we sped up or slowed down specific neuron types in Drosophila's motion detection circuit by manipulating ion channel expression. Altering the dynamics of individual neuron types upstream of motion detectors increased their sensitivity to fast or slow visual motion, exposing distinct roles for excitatory and inhibitory dynamics in tuning directional signals, including a role for the amacrine cell CT1. A circuit model constrained by functional data and anatomy qualitatively reproduced the observed tuning changes. Overall, these results reveal how excitatory and inhibitory dynamics together tune a canonical circuit computation.
Affiliations
- Jacob A Zavatone-Veth
  - Department of Physics, Harvard University, Cambridge, MA 02138, USA
  - Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Juyue Chen
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Bara A Badwan
  - School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Damon A Clark
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
  - Department of Physics, Yale University, New Haven, CT 06511, USA
  - Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
  - Department of Neuroscience, Yale University, New Haven, CT 06511, USA
8. Zhou B, Li Z, Kim S, Lafferty J, Clark DA. Shallow neural networks trained to detect collisions recover features of visual loom-selective neurons. eLife 2022; 11:e72067. [PMID: 35023828; PMCID: PMC8849349; DOI: 10.7554/elife.72067]
Abstract
Animals have evolved sophisticated visual circuits to solve a vital inference problem: detecting whether or not a visual signal corresponds to an object on a collision course. Such events are detected by specific circuits sensitive to visual looming, or objects increasing in size. Various computational models have been developed for these circuits, but how the collision-detection inference problem itself shapes the computational structures of these circuits remains unknown. Here, inspired by the distinctive structures of LPLC2 neurons in the visual system of Drosophila, we build anatomically-constrained shallow neural network models and train them to identify visual signals that correspond to impending collisions. Surprisingly, the optimization arrives at two distinct, opposing solutions, only one of which matches the actual dendritic weighting of LPLC2 neurons. Both solutions can solve the inference problem with high accuracy when the population size is large enough. The LPLC2-like solution reproduces experimentally observed LPLC2 neuron responses for many stimuli, and reproduces the canonical tuning of loom-sensitive neurons, even though the models are never trained on neural data. Thus, LPLC2 neuron properties and tuning are predicted by optimizing an anatomically-constrained neural network to detect impending collisions. More generally, these results illustrate how optimizing inference tasks that are important for an animal's perceptual goals can reveal and explain computational properties of specific sensory neurons.
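The anatomical constraint at the heart of these models, dendrites that pool motion signals pointing radially outward from the receptive-field center, can be caricatured in a few lines. The fixed outward-selective pooling unit below is an assumption of mine for illustration (the paper trains the dendritic weights rather than fixing them), but it shows why such wiring prefers looming over translation:

```python
import numpy as np

# A small receptive-field grid and, at each location, the unit vector
# pointing radially outward from the receptive-field center.
ys, xs = np.mgrid[-3:4, -3:4].astype(float)
r = np.stack([xs, ys], axis=-1)                   # (7, 7, 2) positions
norm = np.linalg.norm(r, axis=-1, keepdims=True)
norm[norm == 0] = 1.0                             # avoid divide-by-zero at center
rhat = r / norm

def unit_response(flow):
    # flow: (7, 7, 2) array of local motion vectors across the grid.
    outward = np.sum(flow * rhat, axis=-1)        # radial component at each point
    return np.maximum(outward, 0.0).sum()         # rectify, then pool

loom = rhat.copy()                 # expanding flow field: an approaching object
translate = np.zeros_like(rhat)
translate[..., 0] = 1.0            # uniform rightward translation

assert unit_response(loom) > unit_response(translate)
```

A purely translating field drives only the half of the grid where motion happens to point outward, while an expanding field drives every location, so the rectified radial pooling responds far more strongly to loom.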
Affiliations
- Baohua Zhou
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
- Zifan Li
  - Department of Statistics and Data Science, Yale University, New Haven, United States
- Sunnie Kim
  - Department of Statistics and Data Science, Yale University, New Haven, United States
- John Lafferty
  - Department of Statistics and Data Science, Yale University, New Haven, United States
- Damon A Clark
  - Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
9. Kohn JR, Portes JP, Christenson MP, Abbott LF, Behnia R. Flexible filtering by neural inputs supports motion computation across states and stimuli. Curr Biol 2021; 31:5249-5260.e5. [PMID: 34670114; DOI: 10.1016/j.cub.2021.09.061]
Abstract
Sensory systems flexibly adapt their processing properties across a wide range of environmental and behavioral conditions. Such variable processing complicates attempts to extract a mechanistic understanding of sensory computations. This is evident in the highly constrained, canonical Drosophila motion detection circuit, where the core computation underlying direction selectivity is still debated despite extensive studies. Here we measured the filtering properties of neural inputs to the OFF motion-detecting T5 cell in Drosophila. We report state- and stimulus-dependent changes in the shape of these signals, which become more biphasic under specific conditions. Summing these inputs within the framework of a connectomic-constrained model of the circuit demonstrates that these shapes are sufficient to explain T5 responses to various motion stimuli. Thus, our stimulus- and state-dependent measurements reconcile motion computation with the anatomy of the circuit. These findings provide a clear example of how a basic circuit supports flexible sensory computation.
Affiliations
- Jessica R Kohn
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
- Jacob P Portes
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Matthias P Christenson
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- L F Abbott
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Rudy Behnia
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Kavli Institute for Brain Science, Columbia University, New York, NY 10027, USA
10. Mano O, Creamer MS, Badwan BA, Clark DA. Predicting individual neuron responses with anatomically constrained task optimization. Curr Biol 2021; 31:4062-4075.e4. [PMID: 34324832; DOI: 10.1016/j.cub.2021.06.090]
Abstract
Artificial neural networks trained to solve sensory tasks can develop statistical representations that match those in biological circuits. However, it remains unclear whether they can reproduce properties of individual neurons. Here, we investigated how artificial networks predict individual neuron properties in the visual motion circuits of the fruit fly Drosophila. We trained anatomically constrained networks to predict movement in natural scenes, solving the same inference problem as fly motion detectors. Units in the artificial networks adopted many properties of analogous individual neurons, even though they were not explicitly trained to match these properties. Among these properties was the split into ON and OFF motion detectors, which is not predicted by classical motion detection models. The match between model and neurons was closest when models were trained to be robust to noise. These results demonstrate how anatomical, task, and noise constraints can explain properties of individual neurons in a small neural network.
Affiliations
- Omer Mano
  - Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
  - Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Matthew S Creamer
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Bara A Badwan
  - School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Damon A Clark
  - Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
  - Department of Neuroscience, Yale University, New Haven, CT 06511, USA
  - Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
  - Department of Physics, Yale University, New Haven, CT 06511, USA
11. Ramos-Traslosheros G, Silies M. The physiological basis for contrast opponency in motion computation in Drosophila. Nat Commun 2021; 12:4987. [PMID: 34404776; PMCID: PMC8371135; DOI: 10.1038/s41467-021-24986-w]
Abstract
In Drosophila, direction-selective neurons implement a mechanism of motion computation similar to cortical neurons, using contrast-opponent receptive fields with ON and OFF subfields. It is not clear how the presynaptic circuitry of direction-selective neurons in the OFF pathway supports this computation if all major inputs are OFF-rectified neurons. Here, we reveal the biological substrate for motion computation in the OFF pathway. Three interneurons, Tm2, Tm9 and CT1, provide information about ON stimuli to the OFF direction-selective neuron T5 across its receptive field, supporting a contrast-opponent receptive field organization. Consistent with its prominent role in motion detection, variability in Tm9 receptive field properties transfers to T5, and calcium decrements in Tm9 in response to ON stimuli persist across behavioral states, while spatial tuning is sharpened by active behavior. Together, our work shows how a key neuronal computation is implemented by its constituent neuronal circuit elements to ensure direction selectivity.
Affiliations
- Giordano Ramos-Traslosheros
  - Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
  - International Max Planck Research School Neurosciences and Göttingen Graduate School for Neurosciences, Biophysics, and Molecular Biosciences (GGNB) at the University of Göttingen, Göttingen, Germany
- Marion Silies
  - Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
12. Fu Q, Yue S. Modelling Drosophila motion vision pathways for decoding the direction of translating objects against cluttered moving backgrounds. Biol Cybern 2020; 114:443-460. [PMID: 32623517; PMCID: PMC7554016; DOI: 10.1007/s00422-020-00841-x]
Abstract
Decoding the direction of translating objects in front of cluttered moving backgrounds, accurately and efficiently, is still a challenging problem. In nature, lightweight, low-powered flying insects apply motion vision to detect moving targets in highly variable environments during flight, making them excellent paradigms for learning motion perception strategies. This paper investigates the fruit fly Drosophila motion vision pathways and presents computational modelling based on cutting-edge physiological research. The proposed visual system model features bio-plausible ON and OFF pathways and wide-field horizontal-sensitive (HS) and vertical-sensitive (VS) systems. The main contributions of this research are twofold: (1) the proposed model articulates how both direction-selective and direction-opponent responses, revealed as principal features of motion perception neural circuits, are formed in a feed-forward manner; (2) it also shows robust direction selectivity to translating objects in front of cluttered moving backgrounds, via the modelling of spatiotemporal dynamics, including a combination of motion pre-filtering mechanisms and ensembles of local correlators inside both the ON and OFF pathways, which work effectively to suppress irrelevant background motion and distractors and to improve the dynamic response. Accordingly, the direction of translating objects is decoded as the global response of the HS and VS systems, with positive or negative output indicating preferred-direction or null-direction translation. The experiments have verified the effectiveness of the proposed neural system model and demonstrated its responsive preference for faster-moving, higher-contrast and larger targets embedded in cluttered moving backgrounds.
Affiliations
- Qinbing Fu
  - Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China
  - Computational Intelligence Lab/Lincoln Centre for Autonomous Systems, University of Lincoln, Lincoln, UK
- Shigang Yue
  - Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China
  - Computational Intelligence Lab/Lincoln Centre for Autonomous Systems, University of Lincoln, Lincoln, UK
13. Agrochao M, Tanaka R, Salazar-Gatzimas E, Clark DA. Mechanism for analogous illusory motion perception in flies and humans. Proc Natl Acad Sci U S A 2020; 117:23044-23053. [PMID: 32839324; PMCID: PMC7502748; DOI: 10.1073/pnas.2002937117]
Abstract
Visual motion detection is one of the most important computations performed by visual circuits. Yet, we perceive vivid illusory motion in stationary, periodic luminance gradients that contain no true motion. This illusion is shared by diverse vertebrate species, but theories proposed to explain this illusion have remained difficult to test. Here, we demonstrate that in the fruit fly Drosophila, the illusory motion percept is generated by unbalanced contributions of direction-selective neurons' responses to stationary edges. First, we found that flies, like humans, perceive sustained motion in the stationary gradients. The percept was abolished when the elementary motion detector neurons T4 and T5 were silenced. In vivo calcium imaging revealed that T4 and T5 neurons encode the location and polarity of stationary edges. Furthermore, our proposed mechanistic model allowed us to predictably manipulate both the magnitude and direction of the fly's illusory percept by selectively silencing either T4 or T5 neurons. Interestingly, human brains possess the same mechanistic ingredients that drive our model in flies. When we adapted human observers to moving light edges or dark edges, we could manipulate the magnitude and direction of their percepts as well, suggesting that mechanisms similar to the fly's may also underlie this illusion in humans. By taking a comparative approach that exploits Drosophila neurogenetics, our results provide a causal, mechanistic account for a long-known visual illusion. These results argue that this illusion arises from architectures for motion detection that are shared across phyla.
Affiliation(s)
- Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511
- Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511; Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511; Department of Physics, Yale University, New Haven, CT 06511; Department of Neuroscience, Yale University, New Haven, CT 06511

14
Zavatone-Veth JA, Badwan BA, Clark DA. A minimal synaptic model for direction selective neurons in Drosophila. J Vis 2020; 20:2. [PMID: 32040161 PMCID: PMC7343402 DOI: 10.1167/jov.20.2.2]
Abstract
Visual motion estimation is a canonical neural computation. In Drosophila, recent advances have identified anatomic and functional circuitry underlying direction-selective computations. Models with varying levels of abstraction have been proposed to explain specific experimental results but have rarely been compared across experiments. Here we use the wealth of available anatomical and physiological data to construct a minimal, biophysically inspired synaptic model for Drosophila’s first-order direction-selective T4 cells. We show how this model relates mathematically to classical models of motion detection, including the Hassenstein-Reichardt correlator model. We used numerical simulation to test how well this synaptic model could reproduce measurements of T4 cells across many datasets and stimulus modalities. These comparisons include responses to sinusoid gratings, to apparent motion stimuli, to stochastic stimuli, and to natural scenes. Without fine-tuning this model, it sufficed to reproduce many, but not all, response properties of T4 cells. Since this model is flexible and based on straightforward biophysical properties, it provides an extensible framework for developing a mechanistic understanding of T4 neural response properties. Moreover, it can be used to assess the sufficiency of simple biophysical mechanisms to describe features of the direction-selective computation and identify where our understanding must be improved.
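The classical Hassenstein-Reichardt correlator this abstract relates its synaptic model to can be sketched in a few lines. This is a minimal illustration with arbitrary filter and stimulus parameters, not the paper's biophysical T4 model:

```python
import numpy as np

def lowpass(x, dt=0.01, tau=0.1):
    """First-order low-pass filter, serving as the correlator's delay line."""
    y = np.zeros_like(x, dtype=float)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def hrc(left, right):
    """Hassenstein-Reichardt correlator: each arm multiplies the delayed
    signal from one photoreceptor by the undelayed signal from its
    neighbor; opponent subtraction yields a signed direction estimate."""
    return lowpass(left) * right - left * lowpass(right)

# A rightward-moving sinusoid reaches the left input before the right,
# so the delayed-left arm correlates and the mean output is positive;
# swapping the inputs mirrors the motion and exactly flips the sign.
t = np.arange(0.0, 2.0, 0.01)
left = np.sin(2 * np.pi * t)
right = np.sin(2 * np.pi * (t - 0.1))  # right input lags by 100 ms
```

Note that this model responds antisymmetrically to mirrored (and time-reversed) motion, which is exactly the symmetry the preprint above shows real visual systems can break.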
15
Tanaka R, Clark DA. Object-Displacement-Sensitive Visual Neurons Drive Freezing in Drosophila. Curr Biol 2020; 30:2532-2550.e8. [PMID: 32442466 PMCID: PMC8716191 DOI: 10.1016/j.cub.2020.04.068]
Abstract
Visual systems are often equipped with neurons that detect small moving objects, which may represent prey, predators, or conspecifics. Although the processing properties of those neurons have been studied in diverse organisms, links between the proposed algorithms and animal behaviors or circuit mechanisms remain elusive. Here, we have investigated behavioral function, computational algorithm, and neurochemical mechanisms of an object-selective neuron, LC11, in Drosophila. With genetic silencing and optogenetic activation, we show that LC11 is necessary for a visual object-induced stopping behavior in walking flies, a form of short-term freezing, and its activity can promote stopping. We propose a new quantitative model for small object selectivity based on the physiology and anatomy of LC11 and its inputs. The model accurately reproduces LC11 responses by pooling fast-adapting, tightly size-tuned inputs. Direct visualization of neurotransmitter inputs to LC11 confirmed the model conjectures about upstream processing. Our results demonstrate how adaptation can enhance selectivity for behaviorally relevant, dynamic visual features.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA.

16
Matulis CA, Chen J, Gonzalez-Suarez AD, Behnia R, Clark DA. Heterogeneous Temporal Contrast Adaptation in Drosophila Direction-Selective Circuits. Curr Biol 2020; 30:222-236.e6. [PMID: 31928874 PMCID: PMC7003801 DOI: 10.1016/j.cub.2019.11.077]
Abstract
In visual systems, neurons adapt both to the mean light level and to the range of light levels, or the contrast. Contrast adaptation has been studied extensively, but it remains unclear how it is distributed among neurons in connected circuits, and how early adaptation affects subsequent computations. Here, we investigated temporal contrast adaptation in neurons across Drosophila's visual motion circuitry. Several ON-pathway neurons showed strong adaptation to changes in contrast over time. One of these neurons, Mi1, showed almost complete adaptation on fast timescales, and experiments ruled out several potential mechanisms for its adaptive properties. When contrast adaptation reduced the gain in ON-pathway cells, it was accompanied by decreased motion responses in downstream direction-selective cells. Simulations show that contrast adaptation can substantially improve motion estimates in natural scenes. The benefits are larger for ON-pathway adaptation, which helps explain the heterogeneous distribution of contrast adaptation in these circuits.
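As a toy illustration of the gain control described above (not the paper's Mi1 mechanism; the divisive form, time constant, and floor term are illustrative assumptions), temporal contrast adaptation can be modeled as division by a running estimate of recent contrast:

```python
import numpy as np

def contrast_adapt(x, dt=0.01, tau=0.5, eps=0.05):
    """Toy temporal contrast adaptation: divide the input by a low-pass
    estimate of its own RMS contrast, so gain falls as recent contrast
    rises. tau sets the adaptation timescale; eps prevents blow-up."""
    a = dt / (tau + dt)
    power = 0.0
    out = np.zeros_like(x, dtype=float)
    for i, xi in enumerate(x):
        power += a * (xi * xi - power)       # running power estimate
        out[i] = xi / (np.sqrt(power) + eps)
    return out

# After the adaptation transient, a 5x difference in input contrast is
# largely equalized in the output amplitude.
t = np.arange(0.0, 4.0, 0.01)
s = np.sin(2 * np.pi * 2 * t)
hi_out = contrast_adapt(1.0 * s)   # high-contrast input
lo_out = contrast_adapt(0.2 * s)   # low-contrast input
```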
Affiliation(s)
- Catherine A Matulis
- Department of Physics, Yale University, 217 Prospect Street, New Haven, CT 06511, USA
- Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06510, USA
- Rudy Behnia
- Department of Neuroscience, Columbia University, 3227 Broadway, New York, NY 10027, USA
- Damon A Clark
- Department of Physics, Yale University, 217 Prospect Street, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06510, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, 260 Whitney Avenue, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, 333 Cedar Street, New Haven, CT 06510, USA.

17
How fly neurons compute the direction of visual motion. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 206:109-124. [PMID: 31691093 PMCID: PMC7069908 DOI: 10.1007/s00359-019-01375-9]
Abstract
Detecting the direction of image motion is a fundamental component of visual computation, essential for survival of the animal. However, at the level of individual photoreceptors, the direction in which the image is shifting is not explicitly represented. Rather, directional motion information needs to be extracted from the photoreceptor array by comparing the signals of neighboring units over time. The exact nature of this process as implemented in the visual system of the fruit fly Drosophila melanogaster has been studied in great detail, and much progress has recently been made in determining the neural circuits giving rise to directional motion information. The results reveal the following: (1) motion information is computed in parallel ON and OFF pathways. (2) Within each pathway, T4 (ON) and T5 (OFF) cells are the first neurons to represent the direction of motion. Four subtypes of T4 and T5 cells exist, each sensitive to one of the four cardinal directions. (3) The core process of direction selectivity as implemented on the dendrites of T4 and T5 cells comprises both an enhancement of signals for motion along their preferred direction as well as a suppression of signals for motion along the opposite direction. This combined strategy ensures a high degree of direction selectivity right at the first stage where the direction of motion is computed. (4) At the subsequent processing stage, tangential cells spatially integrate direct excitation from ON and OFF-selective T4 and T5 cells and indirect inhibition from bi-stratified LPi cells activated by neighboring T4/T5 terminals, thus generating flow-field-selective responses.
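The combined enhance-and-suppress strategy in point (3) can be caricatured with a three-input detector in which a delayed preferred-side input multiplies the central signal and a delayed null-side input divides it. This is a hedged toy model: the filters, offsets, and divisive form are illustrative choices, not measured T4 biophysics:

```python
import numpy as np

def lowpass(x, dt=0.01, tau=0.1):
    """First-order low-pass filter used as the delay on both flanks."""
    y = np.zeros_like(x, dtype=float)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def t4_like(a, b, c):
    """Toy three-input detector: delayed input A (preferred side)
    enhances the central input B multiplicatively, while delayed
    input C (null side) suppresses it divisively."""
    return (lowpass(a) * b) / (1.0 + lowpass(c))

# A non-negative luminance wave sweeping A -> B -> C (preferred) drives
# the detector more strongly than the reverse sweep (null).
t = np.arange(0.0, 4.0, 0.01)
s = lambda delay: 0.5 + 0.5 * np.sin(2 * np.pi * (t - delay))
pd = t4_like(s(0.0), s(0.1), s(0.2))  # preferred direction: A leads
nd = t4_like(s(0.2), s(0.1), s(0.0))  # null direction: C leads
```

The point of the two-pronged design, as the review notes, is that either arm alone gives only partial direction selectivity; combining them sharpens tuning at the first direction-selective stage.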
18
Chen J, Mandel HB, Fitzgerald JE, Clark DA. Asymmetric ON-OFF processing of visual motion cancels variability induced by the structure of natural scenes. eLife 2019; 8:e47579. [PMID: 31613221 PMCID: PMC6884396 DOI: 10.7554/elife.47579]
Abstract
Animals detect motion using a variety of visual cues that reflect regularities in the natural world. Experiments in animals across phyla have shown that motion percepts incorporate both pairwise and triplet spatiotemporal correlations that could theoretically benefit motion computation. However, it remains unclear how visual systems assemble these cues to build accurate motion estimates. Here, we used systematic behavioral measurements of fruit fly motion perception to show how flies combine local pairwise and triplet correlations to reduce variability in motion estimates across natural scenes. By generating synthetic images with statistics controlled by maximum entropy distributions, we show that the triplet correlations are useful only when images have light-dark asymmetries that mimic natural ones. This suggests that asymmetric ON-OFF processing is tuned to the particular statistics of natural scenes. Since all animals encounter the world's light-dark asymmetries, many visual systems are likely to use asymmetric ON-OFF processing to improve motion estimation.
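For intuition, the pairwise and triplet spatiotemporal correlations discussed here can be estimated directly from a space-time contrast array. The specific offsets and the single triplet shown are illustrative examples, not the full set of correlators the paper analyzes:

```python
import numpy as np

def correlators(stim):
    """stim: 2D contrast array indexed (time, space).
    Returns one two-point motion correlator, <C(x,t) C(x+1,t+1)>, and
    one example three-point correlator, <C(x,t) C(x+1,t) C(x+1,t+1)>."""
    two = np.mean(stim[:-1, :-1] * stim[1:, 1:])
    three = np.mean(stim[:-1, :-1] * stim[:-1, 1:] * stim[1:, 1:])
    return two, three

# A random binary pattern drifting rightward one pixel per frame drives
# the two-point correlator to its maximum; leftward drift leaves it
# near zero, since the offsets no longer align with the motion.
rng = np.random.default_rng(0)
base = rng.choice([-1.0, 1.0], size=400)
T, X = 50, 200
rightward = np.array([[base[(x - tt) % 400] for x in range(X)] for tt in range(T)])
leftward = np.array([[base[(x + tt) % 400] for x in range(X)] for tt in range(T)])
```

With a symmetric binary pattern like this one, the triplet correlator carries no extra information; the abstract's point is that it becomes useful only once the contrast distribution has natural light-dark asymmetries.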
Affiliation(s)
- Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Holly B Mandel
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
- James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States; Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States; Department of Physics, Yale University, New Haven, United States; Department of Neuroscience, Yale University, New Haven, United States