1. Wu N, Zhou B, Agrochao M, Clark DA. Broken time reversal symmetry in visual motion detection. bioRxiv 2024. PMID: 38915608; PMCID: PMC11195140; DOI: 10.1101/2024.06.08.598068.
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in many classical theoretical and practical models of motion detection. However, here we demonstrate that this symmetry of motion perception upon time reversal is often broken in real visual systems. In this work, we designed a set of visual stimuli to investigate how stimulus symmetries affect time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We discovered a suite of new stimuli with a wide variety of different properties that can lead to broken time reversal symmetries in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that break time reversal symmetry, even when the training data was time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks promote some forms of time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.
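The antisymmetry that classical correlator models predict under time reversal, and how easily a nonlinearity downstream of the correlation can break it, can be sketched numerically. This is an illustrative toy, not the stimuli or models used in the paper; the rectified variant is a hypothetical output nonlinearity added here purely for demonstration.

```python
import numpy as np

def hrc(left, right, delay=3):
    """Classical Hassenstein-Reichardt correlator, averaged over time.
    np.roll treats the signals as circular, so the time-reversal
    antisymmetry is exact for these periodic inputs."""
    l_d = np.roll(left, delay)   # delayed left arm
    r_d = np.roll(right, delay)  # delayed right arm
    return np.mean(l_d * right - left * r_d)

def hrc_rectified(left, right, delay=3):
    """Same correlator with a rectifier on the output before temporal
    averaging (a hypothetical downstream nonlinearity, not from the paper)."""
    l_d = np.roll(left, delay)
    r_d = np.roll(right, delay)
    return np.mean(np.maximum(l_d * right - left * r_d, 0.0))

t = np.arange(2000)
left = np.sin(2 * np.pi * t / 50)         # rightward-drifting sinusoid:
right = np.sin(2 * np.pi * (t - 5) / 50)  # the right input lags the left

# pure correlator: playing the movie backwards exactly inverts the response
fwd, rev = hrc(left, right), hrc(left[::-1], right[::-1])

# rectified model: the reversed response is no longer the negative of the
# forward response, i.e. time reversal symmetry is broken
fwd_r, rev_r = hrc_rectified(left, right), hrc_rectified(left[::-1], right[::-1])
```

For a drifting sinusoid the correlator's instantaneous output is constant and positive, so the forward rectified response is unchanged while the reversed one collapses to zero, a simple instance of the asymmetry the paper measures in real visual systems.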
Affiliation(s)
Nathan Wu
- Yale College, New Haven, CT 06511, USA
Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
Damon A. Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
2. Ammer G, Serbe-Kamp E, Mauss AS, Richter FG, Fendl S, Borst A. Multilevel visual motion opponency in Drosophila. Nat Neurosci 2023;26:1894-1905. PMID: 37783895; PMCID: PMC10620086; DOI: 10.1038/s41593-023-01443-z.
Abstract
Inhibitory interactions between opponent neuronal pathways constitute a common circuit motif across brain areas and species. However, in most cases, synaptic wiring and biophysical, cellular and network mechanisms generating opponency are unknown. Here, we combine optogenetics, voltage and calcium imaging, connectomics, electrophysiology and modeling to reveal multilevel opponent inhibition in the fly visual system. We uncover a circuit architecture in which a single cell type implements direction-selective, motion-opponent inhibition at all three network levels. This inhibition, mediated by GluClα receptors, is balanced with excitation in strength, despite tenfold fewer synapses. The different opponent network levels constitute a nested, hierarchical structure operating at increasing spatiotemporal scales. Electrophysiology and modeling suggest that distributing this computation over consecutive network levels counteracts a reduction in gain, which would result from integrating large opposing conductances at a single instance. We propose that this neural architecture provides resilience to noise while enabling high selectivity for relevant sensory information.
Affiliation(s)
Georg Ammer
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
Etienne Serbe-Kamp
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Ludwig Maximilian University of Munich, Munich, Germany
Alex S Mauss
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
Florian G Richter
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
Sandra Fendl
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
3. Zavatone-Veth JA, Masset P, Tong WL, Zak JD, Murthy VN, Pehlevan C. Neural Circuits for Fast Poisson Compressed Sensing in the Olfactory Bulb. bioRxiv 2023. PMID: 37961548; PMCID: PMC10634677; DOI: 10.1101/2023.06.21.545947.
Abstract
Within a single sniff, the mammalian olfactory system can decode the identity and concentration of odorants wafted on turbulent plumes of air. Yet, it must do so given access only to the noisy, dimensionally-reduced representation of the odor world provided by olfactory receptor neurons. As a result, the olfactory system must solve a compressed sensing problem, relying on the fact that only a handful of the millions of possible odorants are present in a given scene. Inspired by this principle, past works have proposed normative compressed sensing models for olfactory decoding. However, these models have not captured the unique anatomy and physiology of the olfactory bulb, nor have they shown that sensing can be achieved within the 100-millisecond timescale of a single sniff. Here, we propose a rate-based Poisson compressed sensing circuit model for the olfactory bulb. This model maps onto the neuron classes of the olfactory bulb, and recapitulates salient features of their connectivity and physiology. For circuit sizes comparable to the human olfactory bulb, we show that this model can accurately detect tens of odors within the timescale of a single sniff. We also show that this model can perform Bayesian posterior sampling for accurate uncertainty estimation. Fast inference is possible only if the geometry of the neural code is chosen to match receptor properties, yielding a distributed neural code that is not axis-aligned to individual odor identities. Our results illustrate how normative modeling can help us map function onto specific neural circuits to generate new hypotheses.
Affiliation(s)
Jacob A Zavatone-Veth
- Center for Brain Science, Harvard University, Cambridge, MA 02138
- Department of Physics, Harvard University, Cambridge, MA 02138
Paul Masset
- Center for Brain Science, Harvard University, Cambridge, MA 02138
- Department of Molecular and Cellular Biology, Harvard University, Cambridge, MA 02138
William L Tong
- Center for Brain Science, Harvard University, Cambridge, MA 02138
- John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138
- Kempner Institute for the Study of Natural and Artificial Intelligence, Harvard University, Cambridge, MA 02138
Joseph D Zak
- Department of Biological Sciences, University of Illinois at Chicago, Chicago, IL 60607
Venkatesh N Murthy
- Center for Brain Science, Harvard University, Cambridge, MA 02138
- Department of Molecular and Cellular Biology, Harvard University, Cambridge, MA 02138
Cengiz Pehlevan
- Center for Brain Science, Harvard University, Cambridge, MA 02138
- John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138
- Kempner Institute for the Study of Natural and Artificial Intelligence, Harvard University, Cambridge, MA 02138
4.
Abstract
How neurons detect the direction of motion is a prime example of neural computation: Motion vision is found in the visual systems of virtually all sighted animals, it is important for survival, and it requires interesting computations with well-defined linear and nonlinear processing steps, yet the whole process is of moderate complexity. The genetic methods available in the fruit fly Drosophila and the charting of a connectome of its visual system have led to rapid progress and unprecedented detail in our understanding of how neurons compute the direction of motion in this organism. The picture that emerged incorporates not only the identity, morphology, and synaptic connectivity of each neuron involved but also its neurotransmitters, its receptors, and their subcellular localization. Together with the neurons' membrane potential responses to visual stimulation, this information provides the basis for a biophysically realistic model of the circuit that computes the direction of visual motion.
Affiliation(s)
Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
Lukas N Groschner
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
5. Lee H, Lee HJ, Choe KW, Lee SH. Neural Evidence for Boundary Updating as the Source of the Repulsive Bias in Classification. J Neurosci 2023;43:4664-4683. PMID: 37286349; PMCID: PMC10286949; DOI: 10.1523/jneurosci.0166-23.2023.
Abstract
Binary classification, an act of sorting items into two classes by setting a boundary, is biased by recent history. One common form of such bias is repulsive bias, a tendency to sort an item into the class opposite to its preceding items. Sensory-adaptation and boundary-updating are considered two contending sources of the repulsive bias, yet no neural support has been provided for either source. Here, we explored human brains of both men and women, using functional magnetic resonance imaging (fMRI), to find such support by relating the brain signals of sensory-adaptation and boundary-updating to human classification behavior. We found that the stimulus-encoding signal in the early visual cortex adapted to previous stimuli, yet its adaptation-related changes were dissociated from current choices. Contrastingly, the boundary-representing signals in the inferior-parietal and superior-temporal cortices shifted to previous stimuli and covaried with current choices. Our exploration points to boundary-updating, rather than sensory-adaptation, as the origin of the repulsive bias in binary classification.
SIGNIFICANCE STATEMENT
Many animal and human studies on perceptual decision-making have reported an intriguing history effect called "repulsive bias," a tendency to classify an item as the opposite class of its previous item. Regarding the origin of repulsive bias, two contending ideas have been proposed: "bias in stimulus representation because of sensory adaptation" versus "bias in class-boundary setting because of belief updating." By conducting model-based neuroimaging experiments, we verified their predictions about which brain signal should contribute to the trial-to-trial variability in choice behavior. We found that the brain signal of class boundary, but not stimulus representation, contributed to the choice variability associated with repulsive bias. Our study provides the first neural evidence supporting the boundary-based hypothesis of repulsive bias.
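The boundary-updating account can be illustrated with a toy simulation: if the class boundary drifts toward each stimulus after the trial, the next stimulus is more likely to land on the other side of the boundary and be classified as the opposite class. This sketch is not the authors' model; the Gaussian stimulus distribution and the update rate `alpha` are arbitrary assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 50_000
stimuli = rng.normal(0.0, 1.0, n_trials)  # one scalar stimulus per trial

alpha = 0.3     # how strongly the boundary tracks the most recent stimulus
boundary = 0.0
choices = np.empty(n_trials, dtype=bool)
for i, s in enumerate(stimuli):
    choices[i] = s > boundary                      # classify against current boundary
    boundary = (1 - alpha) * boundary + alpha * s  # boundary-updating toward stimulus

# conditional choice probabilities given the previous trial's classification
prev_high = choices[:-1]
p_high_after_high = choices[1:][prev_high].mean()
p_high_after_low = choices[1:][~prev_high].mean()
# repulsive bias: "high" is less likely right after a "high" trial
```

After a "high" trial the boundary has, on average, moved upward, so the conditional probability of another "high" choice drops below its counterpart after a "low" trial, reproducing the repulsive direction of the bias without any sensory adaptation.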
Affiliation(s)
Heeseung Lee
- Department of Brain and Cognitive Sciences, Seoul National University, Seoul 08826, Republic of Korea
Hyang-Jung Lee
- Department of Brain and Cognitive Sciences, Seoul National University, Seoul 08826, Republic of Korea
Kyoung Whan Choe
- Department of Brain and Cognitive Sciences, Seoul National University, Seoul 08826, Republic of Korea
Sang-Hun Lee
- Department of Brain and Cognitive Sciences, Seoul National University, Seoul 08826, Republic of Korea
6. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568; DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings over several hundred metres to kilometres, is necessary for mediating behaviours, such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent the distances unambiguously, but these are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
Affiliation(s)
Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
7. An Artificial Visual System for Motion Direction Detection Based on the Hassenstein–Reichardt Correlator Model. Electronics 2022. DOI: 10.3390/electronics11091423.
Abstract
The perception of motion direction is essential for the survival of visual animals. Despite various theoretical and biophysical investigations that have been conducted to elucidate directional selectivity at the neural level, the systemic mechanism of motion direction detection remains elusive. Here, we develop an artificial visual system (AVS) based on the core computation of the Hassenstein–Reichardt correlator (HRC) model for global motion direction detection. With reference to the biological investigations of Drosophila, we first describe a local motion-sensitive, directionally detective neuron that only responds to ON motion signals with high pattern contrast in a particular direction. Then, we use the full-neurons scheme motion direction detection mechanism to detect the global motion direction based on our previous research. The mechanism enables our AVS to detect multiple directions in a two-dimensional view, and the global motion direction is inferred from the outputs of all local motion-sensitive directionally detective neurons. To verify the reliability of our AVS, we conduct a series of experiments and compare its performance with the time-considered convolution neural network (CNN) and the EfficientNetB0 under the same conditions. The experimental results demonstrate that our system is reliable in detecting the direction of motion, and among the three models, our AVS has the best motion direction detection capability.
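The detector-array idea can be sketched minimally: tile a frame sequence with local delay-and-correlate (HRC-type) detectors for eight preferred directions and read out the global direction as a population vector over their summed responses. The drifting-grating stimulus, one-pixel detector spacing, and vector-sum readout here are simplifying assumptions, not the paper's implementation (which uses ON-rectified, contrast-gated local units).

```python
import numpy as np

def hrc_population_estimate(frames, offsets):
    """Estimate global motion direction from an array of local
    Hassenstein-Reichardt correlators, one per candidate offset (dx, dy)."""
    prev, curr = frames[:-1], frames[1:]
    responses = []
    for dx, dy in offsets:
        # neighbor's signal, shifted by (dx, dy); np.roll wraps around,
        # which is harmless for a spatially periodic stimulus
        n_prev = np.roll(prev, (dy, dx), axis=(1, 2))
        n_curr = np.roll(curr, (dy, dx), axis=(1, 2))
        # delay-and-correlate, averaged over all pixels and frame pairs
        responses.append(np.mean(n_prev * curr - prev * n_curr))
    responses = np.array(responses)
    # population-vector readout over the detectors' preferred directions
    angles = np.arctan2([dy for _, dy in offsets], [dx for dx, _ in offsets])
    return np.arctan2(np.sum(responses * np.sin(angles)),
                      np.sum(responses * np.cos(angles)))

# upward-drifting grating (period 16 px, 1 px per frame), 64x64, 32 frames
y = np.arange(64)[None, :, None]
t = np.arange(32)[:, None, None]
frames = np.sin(2 * np.pi * (y - t) / 16) * np.ones((1, 1, 64))

eight_dirs = [(1, 0), (1, 1), (0, 1), (-1, 1),
              (-1, 0), (-1, -1), (0, -1), (1, -1)]
theta = hrc_population_estimate(frames, eight_dirs)
# theta comes out at pi/2, i.e. motion along +y
```

Detectors aligned with the motion respond positively, anti-aligned ones negatively, and orthogonal ones not at all, so the population vector points in the true direction of drift.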
8. Identifying Inputs to Visual Projection Neurons in Drosophila Lobula by Analyzing Connectomic Data. eNeuro 2022;9:ENEURO.0053-22.2022. PMID: 35410869; PMCID: PMC9034759; DOI: 10.1523/eneuro.0053-22.2022.
Abstract
Electron microscopy (EM)-based connectomes provide important insights into how the visual circuitry of the fruit fly Drosophila computes various visual features, guiding and complementing behavioral and physiological studies. However, connectomic analyses of the lobula, a neuropil putatively dedicated to detecting object-like features, remain underdeveloped, largely because of incomplete data on the inputs to the brain region. Here, we attempted to map the columnar inputs into the Drosophila lobula neuropil by performing connectivity-based and morphology-based clustering on a densely reconstructed connectome dataset. While the dataset mostly lacked visual neuropils other than lobula, which would normally help identify inputs to lobula, our clustering analysis successfully extracted clusters of cells with homogeneous connectivity and morphology, likely representing genuine cell types. We were able to draw a correspondence between the resulting clusters and previously identified cell types, revealing previously undocumented connectivity between lobula input and output neurons. While future, more complete connectomic reconstructions are necessary to verify the results presented here, they can serve as a useful basis for formulating hypotheses on mechanisms of visual feature detection in lobula.
9. Groschner LN, Malis JG, Zuidinga B, Borst A. A biophysical account of multiplication by a single neuron. Nature 2022;603:119-123. PMID: 35197635; PMCID: PMC8891015; DOI: 10.1038/s41586-022-04428-3.
Abstract
Nonlinear, multiplication-like operations carried out by individual nerve cells greatly enhance the computational power of a neural system, but our understanding of their biophysical implementation is scant. Here we pursue this problem in the Drosophila melanogaster ON motion vision circuit, in which we record the membrane potentials of direction-selective T4 neurons and of their columnar input elements in response to visual and pharmacological stimuli in vivo. Our electrophysiological measurements and conductance-based simulations provide evidence for a passive supralinear interaction between two distinct types of synapse on T4 dendrites. We show that this multiplication-like nonlinearity arises from the coincidence of cholinergic excitation and release from glutamatergic inhibition. The latter depends on the expression of the glutamate-gated chloride channel GluClα in T4 neurons, which sharpens the directional tuning of the cells and shapes the optomotor behaviour of the animals. Interacting pairs of shunting inhibitory and excitatory synapses have long been postulated as an analogue approximation of a multiplication, which is integral to theories of motion detection, sound localization and sensorimotor control.
Affiliation(s)
Birte Zuidinga
- Max Planck Institute of Neurobiology, Martinsried, Germany
10. Zhou B, Li Z, Kim S, Lafferty J, Clark DA. Shallow neural networks trained to detect collisions recover features of visual loom-selective neurons. eLife 2022;11:e72067. PMID: 35023828; PMCID: PMC8849349; DOI: 10.7554/elife.72067.
Abstract
Animals have evolved sophisticated visual circuits to solve a vital inference problem: detecting whether or not a visual signal corresponds to an object on a collision course. Such events are detected by specific circuits sensitive to visual looming, or objects increasing in size. Various computational models have been developed for these circuits, but how the collision-detection inference problem itself shapes the computational structures of these circuits remains unknown. Here, inspired by the distinctive structures of LPLC2 neurons in the visual system of Drosophila, we build anatomically-constrained shallow neural network models and train them to identify visual signals that correspond to impending collisions. Surprisingly, the optimization arrives at two distinct, opposing solutions, only one of which matches the actual dendritic weighting of LPLC2 neurons. Both solutions can solve the inference problem with high accuracy when the population size is large enough. The LPLC2-like solution reproduces experimentally observed LPLC2 neuron responses for many stimuli, and reproduces canonical tuning of loom-sensitive neurons, even though the models are never trained on neural data. Thus, LPLC2 neuron properties and tuning are predicted by optimizing an anatomically-constrained neural network to detect impending collisions. More generally, these results illustrate how optimizing inference tasks that are important for an animal's perceptual goals can reveal and explain computational properties of specific sensory neurons.
Affiliation(s)
Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
Zifan Li
- Department of Statistics and Data Science, Yale University, New Haven, United States
Sunnie Kim
- Department of Statistics and Data Science, Yale University, New Haven, United States
John Lafferty
- Department of Statistics and Data Science, Yale University, New Haven, United States
Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
11. Predictive encoding of motion begins in the primate retina. Nat Neurosci 2021;24:1280-1291. PMID: 34341586; PMCID: PMC8728393; DOI: 10.1038/s41593-021-00899-1.
Abstract
Predictive motion encoding is an important aspect of visually guided behavior that allows animals to estimate the trajectory of moving objects. Motion prediction is understood primarily in the context of translational motion, but the environment contains other types of behaviorally salient motion correlations, such as those produced by approaching or receding objects. However, the neural mechanisms that detect and predictively encode these correlations remain unclear. We report here that four of the parallel output pathways in the primate retina encode predictive motion information, and this encoding occurs for several classes of spatiotemporal correlation that are found in natural vision. Such predictive coding can be explained by known nonlinear circuit mechanisms that produce a nearly optimal encoding, with transmitted information approaching the theoretical limit imposed by the stimulus itself. Thus, these neural circuit mechanisms efficiently separate predictive information from nonpredictive information during the encoding process.
12. Ramos-Traslosheros G, Silies M. The physiological basis for contrast opponency in motion computation in Drosophila. Nat Commun 2021;12:4987. PMID: 34404776; PMCID: PMC8371135; DOI: 10.1038/s41467-021-24986-w.
Abstract
In Drosophila, direction-selective neurons implement a mechanism of motion computation similar to cortical neurons, using contrast-opponent receptive fields with ON and OFF subfields. It is not clear how the presynaptic circuitry of direction-selective neurons in the OFF pathway supports this computation if all major inputs are OFF-rectified neurons. Here, we reveal the biological substrate for motion computation in the OFF pathway. Three interneurons, Tm2, Tm9 and CT1, provide information about ON stimuli to the OFF direction-selective neuron T5 across its receptive field, supporting a contrast-opponent receptive field organization. Consistent with its prominent role in motion detection, variability in Tm9 receptive field properties transfers to T5, and calcium decrements in Tm9 in response to ON stimuli persist across behavioral states, while spatial tuning is sharpened by active behavior. Together, our work shows how a key neuronal computation is implemented by its constituent neuronal circuit elements to ensure direction selectivity.
Affiliation(s)
Giordano Ramos-Traslosheros
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- International Max Planck Research School for Neurosciences and Göttingen Graduate School for Neurosciences, Biophysics, and Molecular Biosciences (GGNB) at the University of Göttingen, Göttingen, Germany
Marion Silies
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
|
13
|
Agrochao M, Tanaka R, Salazar-Gatzimas E, Clark DA. Mechanism for analogous illusory motion perception in flies and humans. Proc Natl Acad Sci U S A 2020; 117:23044-23053. [PMID: 32839324 PMCID: PMC7502748 DOI: 10.1073/pnas.2002937117] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Visual motion detection is one of the most important computations performed by visual circuits. Yet, we perceive vivid illusory motion in stationary, periodic luminance gradients that contain no true motion. This illusion is shared by diverse vertebrate species, but theories proposed to explain this illusion have remained difficult to test. Here, we demonstrate that in the fruit fly Drosophila, the illusory motion percept is generated by unbalanced contributions of direction-selective neurons' responses to stationary edges. First, we found that flies, like humans, perceive sustained motion in the stationary gradients. The percept was abolished when the elementary motion detector neurons T4 and T5 were silenced. In vivo calcium imaging revealed that T4 and T5 neurons encode the location and polarity of stationary edges. Furthermore, our proposed mechanistic model allowed us to predictably manipulate both the magnitude and direction of the fly's illusory percept by selectively silencing either T4 or T5 neurons. Interestingly, human brains possess the same mechanistic ingredients that drive our model in flies. When we adapted human observers to moving light edges or dark edges, we could manipulate the magnitude and direction of their percepts as well, suggesting that mechanisms similar to the fly's may also underlie this illusion in humans. By taking a comparative approach that exploits Drosophila neurogenetics, our results provide a causal, mechanistic account for a long-known visual illusion. These results argue that this illusion arises from architectures for motion detection that are shared across phyla.
Affiliation(s)
Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511
Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511
- Department of Physics, Yale University, New Haven, CT 06511
- Department of Neuroscience, Yale University, New Haven, CT 06511
14. Yildizoglu T, Riegler C, Fitzgerald JE, Portugues R. A Neural Representation of Naturalistic Motion-Guided Behavior in the Zebrafish Brain. Curr Biol 2020;30:2321-2333.e6. PMID: 32386533; DOI: 10.1016/j.cub.2020.04.043.
Abstract
All animals must transform ambiguous sensory data into successful behavior. This requires sensory representations that accurately reflect the statistics of natural stimuli and behavior. Multiple studies show that visual motion processing is tuned for accuracy under naturalistic conditions, but the sensorimotor circuits extracting these cues and implementing motion-guided behavior remain unclear. Here we show that the larval zebrafish retina extracts a diversity of naturalistic motion cues, and the retinorecipient pretectum organizes these cues around the elements of behavior. We find that higher-order motion stimuli, gliders, induce optomotor behavior matching expectations from natural scene analyses. We then image activity of retinal ganglion cell terminals and pretectal neurons. The retina exhibits direction-selective responses across glider stimuli, and anatomically clustered pretectal neurons respond with magnitudes matching behavior. Peripheral computations thus reflect natural input statistics, whereas central brain activity precisely codes information needed for behavior. This general principle could organize sensorimotor transformations across animal species.
Affiliation(s)
Tugce Yildizoglu
- Max Planck Institute of Neurobiology, Research Group of Sensorimotor Control, Martinsried 82152, Germany
Clemens Riegler
- Department of Molecular and Cellular Biology, Harvard University, Cambridge, MA 02138, USA
- Department of Neurobiology, Faculty of Life Sciences, University of Vienna, Althanstrasse 14, 1090 Vienna, Austria
James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
Ruben Portugues
- Max Planck Institute of Neurobiology, Research Group of Sensorimotor Control, Martinsried 82152, Germany
- Institute of Neuroscience, Technical University of Munich, Munich 80802, Germany
- Munich Cluster for Systems Neurology (SyNergy), Munich 80802, Germany