1. Ammer G, Serbe-Kamp E, Mauss AS, Richter FG, Fendl S, Borst A. Multilevel visual motion opponency in Drosophila. Nat Neurosci 2023; 26:1894-1905. PMID: 37783895; PMCID: PMC10620086; DOI: 10.1038/s41593-023-01443-z. Received 02/03/2023; accepted 08/30/2023.
Abstract
Inhibitory interactions between opponent neuronal pathways constitute a common circuit motif across brain areas and species. However, in most cases, synaptic wiring and biophysical, cellular and network mechanisms generating opponency are unknown. Here, we combine optogenetics, voltage and calcium imaging, connectomics, electrophysiology and modeling to reveal multilevel opponent inhibition in the fly visual system. We uncover a circuit architecture in which a single cell type implements direction-selective, motion-opponent inhibition at all three network levels. This inhibition, mediated by GluClα receptors, is balanced with excitation in strength, despite tenfold fewer synapses. The different opponent network levels constitute a nested, hierarchical structure operating at increasing spatiotemporal scales. Electrophysiology and modeling suggest that distributing this computation over consecutive network levels counteracts a reduction in gain, which would result from integrating large opposing conductances at a single instance. We propose that this neural architecture provides resilience to noise while enabling high selectivity for relevant sensory information.
Affiliation(s)
- Georg Ammer: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Etienne Serbe-Kamp: Max Planck Institute for Biological Intelligence, Martinsried, Germany; Ludwig Maximilian University of Munich, Munich, Germany
- Alex S Mauss: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Florian G Richter: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Sandra Fendl: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Alexander Borst: Max Planck Institute for Biological Intelligence, Martinsried, Germany
2. Chen J, Gish CM, Fransen JW, Salazar-Gatzimas E, Clark DA, Borghuis BG. Direct comparison reveals algorithmic similarities in fly and mouse visual motion detection. iScience 2023; 26:107928. PMID: 37810236; PMCID: PMC10550730; DOI: 10.1016/j.isci.2023.107928. Received 06/27/2023; revised 08/07/2023; accepted 09/12/2023.
Abstract
Evolution has equipped vertebrates and invertebrates with neural circuits that selectively encode visual motion. While similarities in the computations performed by these circuits in mouse and fruit fly have been noted, direct experimental comparisons have been lacking. Because molecular mechanisms and neuronal morphology in the two species are distinct, we directly compared motion encoding in these two species at the algorithmic level, using matched stimuli and focusing on a pair of analogous neurons, the mouse ON starburst amacrine cell (ON SAC) and Drosophila T4 neurons. We find that the cells share similar spatiotemporal receptive field structures, sensitivity to spatiotemporal correlations, and tuning to sinusoidal drifting gratings, but differ in their responses to apparent motion stimuli. Both neuron types showed a response to summed sinusoids that deviates from models for motion processing in these cells, underscoring the similarities in their processing and identifying response features that remain to be explained.
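The comparison above is made at the algorithmic level, against correlator-style models of the kind long used for T4 and SAC motion detection. As an illustrative sketch only (not the authors' exact model; the function name and stimulus are hypothetical), a minimal Hassenstein-Reichardt-style correlator multiplies a delayed signal from one point in space with an undelayed signal from a neighboring point and subtracts the mirror-symmetric term:

```python
import numpy as np

def reichardt_correlator(left, right, delay=1):
    """Minimal correlator for two adjacent photoreceptor signals
    sampled over time. Positive output indicates motion from
    `left` toward `right`; negative output the reverse."""
    d_left = np.roll(left, delay)    # delayed copy of the left input
    d_right = np.roll(right, delay)  # delayed copy of the right input
    d_left[:delay] = 0.0             # discard wrapped-around samples
    d_right[:delay] = 0.0
    # Opponent subtraction of the two mirror-symmetric half-detectors
    return d_left * right - left * d_right

# A bright pulse moving left-to-right arrives at `right` one step later:
t = np.arange(10)
left = (t == 3).astype(float)
right = (t == 4).astype(float)
out = reichardt_correlator(left, right)  # net positive: rightward motion
```

Such a detector responds to the spatiotemporal correlations probed by the stimuli described above, which is why matched stimulus sets allow a direct cross-species comparison.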
Affiliation(s)
- Juyue Chen: Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Caitlin M Gish: Department of Physics, Yale University, New Haven, CT 06511, USA
- James W Fransen: Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
- Damon A Clark: Interdepartmental Neurosciences Program; Department of Physics; Department of Molecular, Cellular, and Developmental Biology; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Bart G Borghuis: Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
3. Mano O, Choi M, Tanaka R, Creamer MS, Matos NCB, Shomar JW, Badwan BA, Clandinin TR, Clark DA. Long-timescale anti-directional rotation in Drosophila optomotor behavior. eLife 2023; 12:e86076. PMID: 37751469; PMCID: PMC10522332; DOI: 10.7554/elife.86076. Received 01/10/2023; accepted 09/12/2023.
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied Drosophila melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such 'anti-directional turning' is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
- Minseung Choi: Department of Neurobiology, Stanford University, Stanford, United States
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Matthew S Creamer: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Natalia CB Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Joseph W Shomar: Department of Physics, Yale University, New Haven, United States
- Bara A Badwan: Department of Chemical Engineering, Yale University, New Haven, United States
- Damon A Clark: Department of Molecular, Cellular, and Developmental Biology; Interdepartmental Neuroscience Program; Department of Physics; Department of Neuroscience, Yale University, New Haven, United States
4. Fu Q. Motion perception based on ON/OFF channels: A survey. Neural Netw 2023; 165:1-18. PMID: 37263088; DOI: 10.1016/j.neunet.2023.05.031. Received 11/05/2022; revised 04/02/2023; accepted 05/17/2023.
Abstract
Motion perception is an essential ability for animals and artificially intelligent systems to interact effectively and safely with surrounding objects and environments. Biological visual systems, which have evolved over hundreds of millions of years, are efficient and robust at motion perception, whereas artificial vision systems remain far from such capability. This paper argues that the gap can be significantly reduced by formulating ON/OFF channels in motion perception models, which separately encode luminance increments (ON) and decrements (OFF) within the receptive field. Such a signal-bifurcating structure has been found in the neural systems of many animal species, indicating that early motion information is split and processed in segregated pathways. However, the corresponding biological substrates and the necessity of this structure for artificial vision systems have never been elucidated together, leaving open questions about the uniqueness and advantages of ON/OFF channels for building dynamic vision systems that address real-world challenges. This paper highlights the importance of ON/OFF channels in motion perception by surveying current progress across both neuroscience and computational modelling, including applications. Compared to related literature, it provides, for the first time, insights into how different selectivities for directional motion, looming, translation, and small-target movement can be implemented with ON/OFF channels while remaining faithful to sound and robust biological principles. Finally, existing challenges and future trends for this biologically plausible computational structure are discussed in connection with current machine learning topics and advanced vision sensors such as event-driven cameras.
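At its simplest, the ON/OFF decomposition surveyed above is half-wave rectification of the temporal luminance change. The following is a minimal sketch of that common modelling convention (hypothetical function name; not the survey's specific formulation):

```python
import numpy as np

def on_off_channels(frames):
    """Split a sequence of luminance frames into ON (brightening)
    and OFF (darkening) channels by half-wave rectifying the
    frame-to-frame luminance change."""
    diff = np.diff(frames, axis=0)  # temporal luminance change per pixel
    on = np.maximum(diff, 0.0)      # luminance increments only
    off = np.maximum(-diff, 0.0)    # luminance decrements only
    return on, off

# A dark edge sweeping rightward across a 1-D "retina":
frames = np.array([[1.0, 1.0, 1.0],
                   [0.0, 1.0, 1.0],
                   [0.0, 0.0, 1.0]])
on, off = on_off_channels(frames)
# The moving dark edge activates only the OFF channel.
```

Feeding these two rectified streams into separate downstream detectors is what allows the distinct selectivities (looming, translation, small targets) discussed in the survey.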
Affiliation(s)
- Qinbing Fu: Machine Life and Intelligence Research Centre, School of Mathematics and Information Science, Guangzhou University, Guangzhou, 510006, China
5. Cai LT, Krishna VS, Hladnik TC, Guilbeault NC, Vijayakumar C, Arunachalam M, Juntti SA, Arrenberg AB, Thiele TR, Cooper EA. Spatiotemporal visual statistics of aquatic environments in the natural habitats of zebrafish. Sci Rep 2023; 13:12028. PMID: 37491571; PMCID: PMC10368656; DOI: 10.1038/s41598-023-36099-z. Received 01/28/2023; accepted 05/29/2023.
Abstract
Animal sensory systems are tightly adapted to the demands of their environment. In the visual domain, research has shown that many species have circuits and systems that exploit statistical regularities in natural visual signals. The zebrafish is a popular model animal in visual neuroscience, but relatively little quantitative data is available about the visual properties of the aquatic habitats where zebrafish reside, as compared to terrestrial environments. Improving our understanding of the visual demands of the aquatic habitats of zebrafish can enhance the insights about sensory neuroscience yielded by this model system. We analyzed a video dataset of zebrafish habitats captured by a stationary camera and compared this dataset to videos of terrestrial scenes in the same geographic area. Our analysis of the spatiotemporal structure in these videos suggests that zebrafish habitats are characterized by low visual contrast and strong motion when compared to terrestrial environments. Similar to terrestrial environments, zebrafish habitats tended to be dominated by dark contrasts, particularly in the lower visual field. We discuss how these properties of the visual environment can inform the study of zebrafish visual behavior and neural processing and, by extension, can inform our understanding of the vertebrate brain.
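The scene statistics summarized above (low contrast, dominance of dark contrasts) can be illustrated with a minimal computation of RMS contrast and the fraction of below-mean "dark" pixels. This is a hypothetical sketch under a synthetic, positively skewed luminance distribution, not the authors' analysis pipeline:

```python
import numpy as np

def contrast_stats(image):
    """RMS contrast and dark-pixel fraction for a luminance image."""
    mean = image.mean()
    rms_contrast = image.std() / mean      # standard RMS contrast
    dark_fraction = (image < mean).mean()  # share of below-mean pixels
    return rms_contrast, dark_fraction

# Positively skewed luminances (as in many natural scenes): most pixels
# fall below the mean, so dark contrasts dominate.
rng = np.random.default_rng(0)
patch = rng.exponential(scale=1.0, size=(64, 64))
rms, dark = contrast_stats(patch)
```

Comparing such statistics between aquatic and terrestrial footage is the kind of analysis that supports the contrast and dark-dominance claims above.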
Affiliation(s)
- Lanya T Cai: Herbert Wertheim School of Optometry & Vision Science, University of California, Berkeley, CA, USA
- Venkatesh S Krishna: Department of Biological Sciences, University of Toronto, Scarborough, ON, Canada
- Tim C Hladnik: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, Tübingen, Germany; Graduate Training Centre for Neuroscience, University of Tübingen, Tübingen, Germany
- Nicholas C Guilbeault: Department of Biological Sciences, University of Toronto, Scarborough, ON, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto, Canada
- Chinnian Vijayakumar: Department of Zoology, St. Andrew's College, Gorakhpur, Uttar Pradesh, India
- Muthukumarasamy Arunachalam: Department of Zoology, School of Biological Sciences, Central University of Kerala, Kasaragod, Kerala, India; Centre for Inland Fishes and Conservation, St. Andrew's College, Gorakhpur, Uttar Pradesh, India
- Scott A Juntti: Department of Biology, University of Maryland, College Park, MD, USA
- Aristides B Arrenberg: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, Tübingen, Germany
- Tod R Thiele: Department of Biological Sciences, University of Toronto, Scarborough, ON, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto, Canada
- Emily A Cooper: Herbert Wertheim School of Optometry & Vision Science, University of California, Berkeley, CA, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, CA, USA
6. Pungor JR, Allen VA, Songco-Casey JO, Niell CM. Functional organization of visual responses in the octopus optic lobe. Curr Biol 2023; 33:2784-2793.e3. PMID: 37343556; PMCID: PMC11056276; DOI: 10.1016/j.cub.2023.05.069. Received 02/23/2023; revised 04/24/2023; accepted 05/30/2023.
Abstract
Cephalopods are highly visual animals with camera-type eyes, large brains, and a rich repertoire of visually guided behaviors. However, the cephalopod brain evolved independently from those of other highly visual species, such as vertebrates; therefore, the neural circuits that process sensory information are profoundly different. It is largely unknown how their powerful but unique visual system functions, as there have been no direct neural measurements of visual responses in the cephalopod brain. In this study, we used two-photon calcium imaging to record visually evoked responses in the primary visual processing center of the octopus central brain, the optic lobe, to determine how basic features of the visual scene are represented and organized. We found spatially localized receptive fields for light (ON) and dark (OFF) stimuli, which were retinotopically organized across the optic lobe, demonstrating a hallmark of visual system organization shared across many species. An examination of these responses revealed transformations of the visual representation across the layers of the optic lobe, including the emergence of the OFF pathway and increased size selectivity. We also identified asymmetries in the spatial processing of ON and OFF stimuli, which suggest unique circuit mechanisms for form processing that may have evolved to suit the specific demands of processing an underwater visual scene. This study provides insight into the neural processing and functional organization of the octopus visual system, highlighting both shared and unique aspects, and lays a foundation for future studies of the neural circuits that mediate visual processing and behavior in cephalopods.
Affiliation(s)
- Judit R Pungor: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405, USA
- V Angelique Allen: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405, USA
- Jeremea O Songco-Casey: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405, USA
- Cristopher M Niell: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405, USA
7. Currier TA, Pang MM, Clandinin TR. Visual processing in the fly, from photoreceptors to behavior. Genetics 2023; 224:iyad064. PMID: 37128740; PMCID: PMC10213501; DOI: 10.1093/genetics/iyad064. Received 12/08/2022; accepted 03/22/2023.
Abstract
Originally a genetic model organism, the experimental use of Drosophila melanogaster has grown to include quantitative behavioral analyses, sophisticated perturbations of neuronal function, and detailed sensory physiology. A highlight of these developments can be seen in the context of vision, where pioneering studies have uncovered fundamental and generalizable principles of sensory processing. Here we begin with an overview of vision-guided behaviors and common methods for probing visual circuits. We then outline the anatomy and physiology of brain regions involved in visual processing, beginning at the sensory periphery and ending with descending motor control. Areas of focus include contrast and motion detection in the optic lobe, circuits for visual feature selectivity, computations in support of spatial navigation, and contextual associative learning. Finally, we look to the future of fly visual neuroscience and discuss promising topics for further study.
Affiliation(s)
- Timothy A Currier: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Michelle M Pang: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Thomas R Clandinin: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
8. Mano O, Choi M, Tanaka R, Creamer MS, Matos NC, Shomar J, Badwan BA, Clandinin TR, Clark DA. Long timescale anti-directional rotation in Drosophila optomotor behavior. bioRxiv 2023:2023.01.06.523055 (preprint of entry 3). PMID: 36711627; PMCID: PMC9882005; DOI: 10.1101/2023.01.06.523055.
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied D. melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such "anti-directional turning" is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Minseung Choi: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Matthew S. Creamer: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Natalia C.B. Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Joseph Shomar: Department of Physics, Yale University, New Haven, CT 06511, USA
- Bara A. Badwan: Department of Chemical Engineering, Yale University, New Haven, CT 06511, USA
- Damon A. Clark: Department of Molecular, Cellular, and Developmental Biology; Interdepartmental Neuroscience Program; Department of Physics; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
9. Fu Q, Li Z, Peng J. Harmonizing motion and contrast vision for robust looming detection. Array 2023. DOI: 10.1016/j.array.2022.100272.
10. Pungor JR, Allen VA, Songco-Casey JO, Niell CM. Functional organization of visual responses in the octopus optic lobe. bioRxiv 2023:2023.02.16.528734 (preprint of entry 6). PMID: 36824726; PMCID: PMC9949128; DOI: 10.1101/2023.02.16.528734.
Abstract
Cephalopods are highly visual animals with camera-type eyes, large brains, and a rich repertoire of visually guided behaviors. However, the cephalopod brain evolved independently from that of other highly visual species, such as vertebrates, and therefore the neural circuits that process sensory information are profoundly different. It is largely unknown how their powerful but unique visual system functions, since there have been no direct neural measurements of visual responses in the cephalopod brain. In this study, we used two-photon calcium imaging to record visually evoked responses in the primary visual processing center of the octopus central brain, the optic lobe, to determine how basic features of the visual scene are represented and organized. We found spatially localized receptive fields for light (ON) and dark (OFF) stimuli, which were retinotopically organized across the optic lobe, demonstrating a hallmark of visual system organization shared across many species. Examination of these responses revealed transformations of the visual representation across the layers of the optic lobe, including the emergence of the OFF pathway and increased size selectivity. We also identified asymmetries in the spatial processing of ON and OFF stimuli, which suggest unique circuit mechanisms for form processing that may have evolved to suit the specific demands of processing an underwater visual scene. This study provides insight into the neural processing and functional organization of the octopus visual system, highlighting both shared and unique aspects, and lays a foundation for future studies of the neural circuits that mediate visual processing and behavior in cephalopods. 
Highlights
- The functional organization and visual response properties of the cephalopod visual system are largely unknown
- Using calcium imaging, we performed mapping of visual responses in the octopus optic lobe
- Visual responses demonstrate localized ON and OFF receptive fields with retinotopic organization
- ON/OFF pathways and size selectivity emerge across layers of the optic lobe and have distinct properties relative to other species
Affiliation(s)
- Judit R Pungor: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405
- V Angelique Allen: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405
- Jeremea O Songco-Casey: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405
- Cristopher M Niell: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97405
11. Alexander E, Cai LT, Fuchs S, Hladnik TC, Zhang Y, Subramanian V, Guilbeault NC, Vijayakumar C, Arunachalam M, Juntti SA, Thiele TR, Arrenberg AB, Cooper EA. Optic flow in the natural habitats of zebrafish supports spatial biases in visual self-motion estimation. Curr Biol 2022; 32:5008-5021.e8. PMID: 36327979; PMCID: PMC9729457; DOI: 10.1016/j.cub.2022.10.009. Received 03/23/2022; revised 08/15/2022; accepted 10/05/2022.
Abstract
Animals benefit from knowing if and how they are moving. Across the animal kingdom, sensory information in the form of optic flow over the visual field is used to estimate self-motion. However, different species exhibit strong spatial biases in how they use optic flow. Here, we show computationally that noisy natural environments favor visual systems that extract spatially biased samples of optic flow when estimating self-motion. The performance associated with these biases, however, depends on interactions between the environment and the animal's brain and behavior. Using the larval zebrafish as a model, we recorded natural optic flow associated with swimming trajectories in the animal's habitat with an omnidirectional camera mounted on a mechanical arm. An analysis of these flow fields suggests that lateral regions of the lower visual field are most informative about swimming speed. This pattern is consistent with the recent findings that zebrafish optomotor responses are preferentially driven by optic flow in the lateral lower visual field, which we extend with behavioral results from a high-resolution spherical arena. Spatial biases in optic-flow sampling are likely pervasive because they are an effective strategy for determining self-motion in noisy natural environments.
Affiliation(s)
- Emma Alexander (lead contact): Herbert Wertheim School of Optometry & Vision Science, University of California, Berkeley, Berkeley, CA 94720, USA; present address: Department of Computer Science, Northwestern University, Evanston, IL 60208, USA
- Lanya T. Cai: Herbert Wertheim School of Optometry & Vision Science, University of California, Berkeley, Berkeley, CA 94720, USA; present address: Department of Radiology and Biomedical Imaging, University of California, San Francisco, San Francisco, CA 94158, USA
- Sabrina Fuchs: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, 72076 Tübingen, Germany
- Tim C. Hladnik: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre for Neuroscience, University of Tübingen, 72074 Tübingen, Germany
- Yue Zhang: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre for Neuroscience, University of Tübingen, 72074 Tübingen, Germany; present address: Department of Cellular and Systems Neurobiology, Max Planck Institute for Biological Intelligence, 82152 Martinsried, Germany
- Venkatesh Subramanian: Department of Biological Sciences, University of Toronto Scarborough, Toronto M1C 1A4, Canada
- Nicholas C. Guilbeault: Department of Biological Sciences, University of Toronto Scarborough, Toronto M1C 1A4, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto M5S 3G5, Canada
- Chinnian Vijayakumar: Department of Zoology, St. Andrew's College, Gorakhpur, Uttar Pradesh 273001, India
- Muthukumarasamy Arunachalam: Department of Zoology, School of Biological Sciences, Central University of Kerala, Kerala 671316, India; present address: Centre for Inland Fishes and Conservation, St. Andrew's College, Gorakhpur, Uttar Pradesh 273001, India
- Scott A. Juntti: Department of Biology, University of Maryland, College Park, MD 20742, USA
- Tod R. Thiele: Department of Biological Sciences, University of Toronto Scarborough, Toronto M1C 1A4, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto M5S 3G5, Canada
- Aristides B. Arrenberg: Werner Reichardt Centre for Integrative Neuroscience, Institute of Neurobiology, University of Tübingen, 72076 Tübingen, Germany
- Emily A. Cooper: Herbert Wertheim School of Optometry & Vision Science, University of California, Berkeley, Berkeley, CA 94720, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720, USA
12. Alexander E, Cai LT, Fuchs S, Hladnik TC, Zhang Y, Subramanian V, Guilbeault NC, Vijayakumar C, Arunachalam M, Juntti SA, Thiele TR, Arrenberg AB, Cooper EA. Optic flow in the natural habitats of zebrafish supports spatial biases in visual self-motion estimation. Curr Biol 2022 (duplicate record of entry 11). PMID: 36327979; DOI: 10.5281/zenodo.6604546.
|
14
|
Kadakia N, Demir M, Michaelis BT, DeAngelis BD, Reidenbach MA, Clark DA, Emonet T. Odour motion sensing enhances navigation of complex plumes. Nature 2022; 611:754-761. [PMID: 36352224] [PMCID: PMC10039482] [DOI: 10.1038/s41586-022-05423-4]
Abstract
Odour plumes in the wild are spatially complex and rapidly fluctuating structures carried by turbulent airflows1-4. To successfully navigate plumes in search of food and mates, insects must extract and integrate multiple features of the odour signal, including odour identity5, intensity6 and timing6-12. Effective navigation requires balancing these multiple streams of olfactory information and integrating them with other sensory inputs, including mechanosensory and visual cues9,12,13. Studies dating back a century have indicated that, of these many sensory inputs, the wind provides the main directional cue in turbulent plumes, leading to the longstanding model of insect odour navigation as odour-elicited upwind motion6,8-12,14,15. Here we show that Drosophila melanogaster shape their navigational decisions using an additional directional cue-the direction of motion of odours-which they detect using temporal correlations in the odour signal between their two antennae. Using a high-resolution virtual-reality paradigm to deliver spatiotemporally complex fictive odours to freely walking flies, we demonstrate that such odour-direction sensing involves algorithms analogous to those in visual-direction sensing16. Combining simulations, theory and experiments, we show that odour motion contains valuable directional information that is absent from the airflow alone, and that both Drosophila and virtual agents are aided by that information in navigating naturalistic plumes. The generality of our findings suggests that odour-direction sensing may exist throughout the animal kingdom and could improve olfactory robot navigation in uncertain environments.
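The inter-antennal idea can be illustrated with a toy estimator that recovers the direction of odour motion from the time lag maximizing the cross-correlation between the two antennal signals. This is an illustrative stand-in, not the fly's actual algorithm (which the paper shows is correlator-like, analogous to visual direction sensing).

```python
import numpy as np

def odor_motion_lag(left_ant, right_ant, max_lag=20):
    """Return the time lag (in samples) at which the two antennal
    signals are most correlated. A positive lag means odour packets
    hit the left antenna first, i.e. the odour moves left-to-right.
    Toy peak-correlation estimator using circular shifts."""
    l = left_ant - left_ant.mean()
    r = right_ant - right_ant.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = [np.dot(np.roll(l, k), r) for k in lags]
    return int(lags[np.argmax(corrs)])

# Synthetic plume: smoothed noise as the odour signal at the left
# antenna; the right antenna sees the same signal 5 samples later.
rng = np.random.default_rng(0)
left = np.convolve(rng.normal(size=500), np.ones(5) / 5, mode="same")
right = np.roll(left, 5)
print(odor_motion_lag(left, right))  # 5 -> odour moving left-to-right
```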
Collapse
Affiliation(s)
- Nirag Kadakia
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
- Quantitative Biology Institute, Yale University, New Haven, CT, USA
- Swartz Foundation for Theoretical Neuroscience, Yale University, New Haven, CT, USA
| | - Mahmut Demir
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
- Quantitative Biology Institute, Yale University, New Haven, CT, USA
| | - Brenden T Michaelis
- Department of Environmental Sciences, University of Virginia, Charlottesville, VA, USA
| | - Brian D DeAngelis
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA
- Quantitative Biology Institute, Yale University, New Haven, CT, USA
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA
| | - Matthew A Reidenbach
- Department of Environmental Sciences, University of Virginia, Charlottesville, VA, USA
| | - Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA.
- Quantitative Biology Institute, Yale University, New Haven, CT, USA.
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA.
- Department of Physics, Yale University, New Haven, CT, USA.
| | - Thierry Emonet
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT, USA.
- Quantitative Biology Institute, Yale University, New Haven, CT, USA.
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, USA.
- Department of Physics, Yale University, New Haven, CT, USA.
| |
Collapse
|
15
|
Dewell RB, Zhu Y, Eisenbrandt M, Morse R, Gabbiani F. Contrast polarity-specific mapping improves efficiency of neuronal computation for collision detection. eLife 2022; 11:e79772. [PMID: 36314775] [PMCID: PMC9674337] [DOI: 10.7554/eLife.79772]
Abstract
Neurons receive information through their synaptic inputs, but the functional significance of how those inputs are mapped on to a cell's dendrites remains unclear. We studied this question in a grasshopper visual neuron that tracks approaching objects and triggers escape behavior before an impending collision. In response to black approaching objects, the neuron receives OFF excitatory inputs that form a retinotopic map of the visual field onto compartmentalized, distal dendrites. Subsequent processing of these OFF inputs by active membrane conductances allows the neuron to discriminate the spatial coherence of such stimuli. In contrast, we show that ON excitatory synaptic inputs activated by white approaching objects map in a random manner onto a more proximal dendritic field of the same neuron. The lack of retinotopic synaptic arrangement results in the neuron's inability to discriminate the coherence of white approaching stimuli. Yet, the neuron retains the ability to discriminate stimulus coherence for checkered stimuli of mixed ON/OFF polarity. The coarser mapping and processing of ON stimuli thus has a minimal impact, while reducing the total energetic cost of the circuit. Further, we show that these differences in ON/OFF neuronal processing are behaviorally relevant, being tightly correlated with the animal's escape behavior to light and dark stimuli of variable coherence. Our results show that the synaptic mapping of excitatory inputs affects the fine stimulus discrimination ability of single neurons and document the resulting functional impact on behavior.
Collapse
Affiliation(s)
| | - Ying Zhu
- Department of Neuroscience, Baylor College of Medicine, Houston, United States
| | | | | | - Fabrizio Gabbiani
- Department of Neuroscience, Baylor College of Medicine, Houston, United States
| |
Collapse
|
16
|
Gonzalez-Suarez AD, Zavatone-Veth JA, Chen J, Matulis CA, Badwan BA, Clark DA. Excitatory and inhibitory neural dynamics jointly tune motion detection. Curr Biol 2022; 32:3659-3675.e8. [PMID: 35868321] [PMCID: PMC9474608] [DOI: 10.1016/j.cub.2022.06.075]
Abstract
Neurons integrate excitatory and inhibitory signals to produce their outputs, but the role of input timing in this integration remains poorly understood. Motion detection is a paradigmatic example of this integration, since theories of motion detection rely on different delays in visual signals. These delays allow circuits to compare scenes at different times to calculate the direction and speed of motion. Different motion detection circuits have different velocity sensitivity, but it remains untested how the response dynamics of individual cell types drive this tuning. Here, we sped up or slowed down specific neuron types in Drosophila's motion detection circuit by manipulating ion channel expression. Altering the dynamics of individual neuron types upstream of motion detectors increased their sensitivity to fast or slow visual motion, exposing distinct roles for excitatory and inhibitory dynamics in tuning directional signals, including a role for the amacrine cell CT1. A circuit model constrained by functional data and anatomy qualitatively reproduced the observed tuning changes. Overall, these results reveal how excitatory and inhibitory dynamics together tune a canonical circuit computation.
Collapse
Affiliation(s)
| | - Jacob A Zavatone-Veth
- Department of Physics, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
| | - Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | | | - Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
17
|
Ketkar MD, Gür B, Molina-Obando S, Ioannidou M, Martelli C, Silies M. First-order visual interneurons distribute distinct contrast and luminance information across ON and OFF pathways to achieve stable behavior. eLife 2022; 11:e74937. [PMID: 35263247] [PMCID: PMC8967382] [DOI: 10.7554/eLife.74937]
Abstract
The accurate processing of contrast is the basis for all visually guided behaviors. Visual scenes with rapidly changing illumination challenge contrast computation because photoreceptor adaptation is not fast enough to compensate for such changes. Yet, human perception of contrast is stable even when the visual environment is quickly changing, suggesting rapid post-receptor luminance gain control. Similarly, in the fruit fly Drosophila, such gain control leads to luminance-invariant behavior for moving OFF stimuli. Here, we show that behavioral responses to moving ON stimuli also utilize a luminance gain, and that ON-motion guided behavior depends on inputs from three first-order interneurons L1, L2, and L3. Each of these neurons encodes contrast and luminance differently and distributes information asymmetrically across both ON and OFF contrast-selective pathways. Behavioral responses to both ON and OFF stimuli rely on a luminance-based correction provided by L1 and L3, wherein L1 supports contrast computation linearly, and L3 non-linearly amplifies dim stimuli. Therefore, L1, L2, and L3 are not specific inputs to the ON and OFF pathways; rather, the lamina serves as a separate processing layer that distributes distinct luminance and contrast information across ON and OFF pathways to support behavior in varying conditions.
Collapse
Affiliation(s)
- Madhura D Ketkar
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Burak Gür
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Sebastian Molina-Obando
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Maria Ioannidou
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Carlotta Martelli
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Marion Silies
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| |
Collapse
|
18
|
Mano O, Creamer MS, Badwan BA, Clark DA. Predicting individual neuron responses with anatomically constrained task optimization. Curr Biol 2021; 31:4062-4075.e4. [PMID: 34324832] [DOI: 10.1016/j.cub.2021.06.090]
Abstract
Artificial neural networks trained to solve sensory tasks can develop statistical representations that match those in biological circuits. However, it remains unclear whether they can reproduce properties of individual neurons. Here, we investigated how artificial networks predict individual neuron properties in the visual motion circuits of the fruit fly Drosophila. We trained anatomically constrained networks to predict movement in natural scenes, solving the same inference problem as fly motion detectors. Units in the artificial networks adopted many properties of analogous individual neurons, even though they were not explicitly trained to match these properties. Among these properties was the split into ON and OFF motion detectors, which is not predicted by classical motion detection models. The match between model and neurons was closest when models were trained to be robust to noise. These results demonstrate how anatomical, task, and noise constraints can explain properties of individual neurons in a small neural network.
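The logic of anatomically constrained task optimization can be caricatured in a few lines: fix a correlator-style architecture as the "anatomy" and let a motion-estimation task set its gains. Everything below (the two-pixel stimulus statistics, the two-feature model, plain gradient descent) is an illustrative assumption, far simpler than the networks in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def correlator_features(l, r):
    """Fixed 'anatomy': two mirror-symmetric delay-and-multiply
    features; only their gains are left for the task to tune."""
    return np.array([np.mean(l[:-1] * r[1:]), np.mean(r[:-1] * l[1:])])

# Synthetic task: two-pixel movies where the right pixel is a
# one-sample-delayed copy of the left (rightward motion, label +1)
# or vice versa (leftward motion, label -1).
X, y = [], []
for _ in range(200):
    d = rng.choice([-1, 1])
    s = np.convolve(rng.normal(size=60), np.ones(4) / 4, mode="same")
    l, r = (s[1:], s[:-1]) if d > 0 else (s[:-1], s[1:])
    X.append(correlator_features(l, r))
    y.append(d)
X, y = np.array(X), np.array(y)

# Plain gradient descent on squared error over the two gains.
w = np.zeros(2)
losses = []
for _ in range(200):
    err = X @ w - y
    losses.append(np.mean(err ** 2))
    w -= 0.5 * (2.0 / len(y)) * (X.T @ err)
```

With these stimulus statistics the fitted gains come out opponent (opposite signs), echoing the paper's broader point that anatomy plus task constraints alone can push a model toward familiar motion-detector structure.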
Collapse
Affiliation(s)
- Omer Mano
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
| | - Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
19
|
Abstract
[Figure: see text].
Collapse
Affiliation(s)
- Liqun Luo
- Department of Biology and Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
| |
Collapse
|
20
|
Predictive encoding of motion begins in the primate retina. Nat Neurosci 2021; 24:1280-1291. [PMID: 34341586] [PMCID: PMC8728393] [DOI: 10.1038/s41593-021-00899-1]
Abstract
Predictive motion encoding is an important aspect of visually guided behavior that allows animals to estimate the trajectory of moving objects. Motion prediction is understood primarily in the context of translational motion, but the environment contains other types of behaviorally salient motion correlation such as those produced by approaching or receding objects. However, the neural mechanisms that detect and predictively encode these correlations remain unclear. We report here that four of the parallel output pathways in the primate retina encode predictive motion information, and this encoding occurs for several classes of spatiotemporal correlation that are found in natural vision. Such predictive coding can be explained by known nonlinear circuit mechanisms that produce a nearly optimal encoding, with transmitted information approaching the theoretical limit imposed by the stimulus itself. Thus, these neural circuit mechanisms efficiently separate predictive information from nonpredictive information during the encoding process.
Collapse
|
21
|
Matsuda K, Kubo F. Circuit Organization Underlying Optic Flow Processing in Zebrafish. Front Neural Circuits 2021; 15:709048. [PMID: 34366797] [PMCID: PMC8334359] [DOI: 10.3389/fncir.2021.709048]
Abstract
Animals’ self-motion generates a drifting movement of the visual scene across the entire field of view, called optic flow. Animals use the sensation of optic flow to estimate their own movements and accordingly adjust their body posture and position and stabilize the direction of gaze. In zebrafish and other vertebrates, optic flow typically drives the optokinetic response (OKR) and optomotor response (OMR). Recent functional imaging studies in larval zebrafish have identified the pretectum as a primary center for optic flow processing. In contrast to the view that the pretectum acts as a relay station of direction-selective retinal inputs, pretectal neurons respond to much more complex visual features relevant to behavior, such as spatially and temporally integrated optic flow information. Furthermore, optic flow signals, as well as motor signals, are represented in the cerebellum in a region-specific manner. Here we review recent findings on the circuit organization that underlies the optic flow processing driving OKR and OMR.
Collapse
Affiliation(s)
- Koji Matsuda
- Center for Frontier Research, National Institute of Genetics, Mishima, Japan
| | - Fumi Kubo
- Center for Frontier Research, National Institute of Genetics, Mishima, Japan; Department of Genetics, SOKENDAI (The Graduate University for Advanced Studies), Mishima, Japan
| |
Collapse
|
22
|
Ding J, Chen A, Chung J, Acaron Ledesma H, Wu M, Berson DM, Palmer SE, Wei W. Spatially displaced excitation contributes to the encoding of interrupted motion by a retinal direction-selective circuit. eLife 2021; 10:e68181. [PMID: 34096504] [PMCID: PMC8211448] [DOI: 10.7554/eLife.68181]
Abstract
Spatially distributed excitation and inhibition collectively shape a visual neuron's receptive field (RF) properties. In the direction-selective circuit of the mammalian retina, the role of strong null-direction inhibition of On-Off direction-selective ganglion cells (On-Off DSGCs) on their direction selectivity is well-studied. However, how excitatory inputs influence the On-Off DSGC's visual response is underexplored. Here, we report that On-Off DSGCs have a spatially displaced glutamatergic receptive field along their horizontal preferred-null motion axes. This displaced receptive field contributes to DSGC null-direction spiking during interrupted motion trajectories. Theoretical analyses indicate that population responses during interrupted motion may help populations of On-Off DSGCs signal the spatial location of moving objects in complex, naturalistic visual environments. Our study highlights that the direction-selective circuit exploits separate sets of mechanisms under different stimulus conditions, and these mechanisms may help encode multiple visual features.
Collapse
Affiliation(s)
- Jennifer Ding
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Albert Chen
- Department of Organismal Biology, The University of Chicago, Chicago, United States
| | - Janet Chung
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Hector Acaron Ledesma
- Graduate Program in Biophysical Sciences, The University of Chicago, Chicago, United States
| | - Mofei Wu
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - David M Berson
- Department of Neuroscience and Carney Institute for Brain Science, Brown University, Providence, United States
| | - Stephanie E Palmer
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Organismal Biology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| | - Wei Wei
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| |
Collapse
|
23
|
Darks and Lights, the 'Yin-Yang' of Vision Depends on Luminance. Trends Neurosci 2021; 44:339-341. [PMID: 33712269] [DOI: 10.1016/j.tins.2021.02.007]
Abstract
We all know the disappointment when, after a wonderful snapshot, the details in the photo show much lower contrast than we saw with our own eyes. A recent study by Rahimi-Nasrabadi et al. revealed that this is because human vision accounts for the actual luminance range and for the accompanying asymmetric changes in dark and light contrasts.
Collapse
|
24
|
Biswas T, Bishop WE, Fitzgerald JE. Theoretical principles for illuminating sensorimotor processing with brain-wide neuronal recordings. Curr Opin Neurobiol 2020; 65:138-145. [PMID: 33248437] [PMCID: PMC8754199] [DOI: 10.1016/j.conb.2020.10.021]
Abstract
Modern recording techniques now permit brain-wide sensorimotor circuits to be observed at single neuron resolution in small animals. Extracting theoretical understanding from these recordings requires principles that organize findings and guide future experiments. Here we review theoretical principles that shed light onto brain-wide sensorimotor processing. We begin with an analogy that conceptualizes principles as streetlamps that illuminate the empirical terrain, and we illustrate the analogy by showing how two familiar principles apply in new ways to brain-wide phenomena. We then focus the bulk of the review on describing three more principles that have wide utility for mapping brain-wide neural activity, making testable predictions from highly parameterized mechanistic models, and investigating the computational determinants of neuronal response patterns across the brain.
Collapse
Affiliation(s)
- Tirthabir Biswas
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
| | - William E Bishop
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
| | - James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA.
| |
Collapse
|
25
|
Agrochao M, Tanaka R, Salazar-Gatzimas E, Clark DA. Mechanism for analogous illusory motion perception in flies and humans. Proc Natl Acad Sci U S A 2020; 117:23044-23053. [PMID: 32839324] [PMCID: PMC7502748] [DOI: 10.1073/pnas.2002937117]
Abstract
Visual motion detection is one of the most important computations performed by visual circuits. Yet, we perceive vivid illusory motion in stationary, periodic luminance gradients that contain no true motion. This illusion is shared by diverse vertebrate species, but theories proposed to explain this illusion have remained difficult to test. Here, we demonstrate that in the fruit fly Drosophila, the illusory motion percept is generated by unbalanced contributions of direction-selective neurons' responses to stationary edges. First, we found that flies, like humans, perceive sustained motion in the stationary gradients. The percept was abolished when the elementary motion detector neurons T4 and T5 were silenced. In vivo calcium imaging revealed that T4 and T5 neurons encode the location and polarity of stationary edges. Furthermore, our proposed mechanistic model allowed us to predictably manipulate both the magnitude and direction of the fly's illusory percept by selectively silencing either T4 or T5 neurons. Interestingly, human brains possess the same mechanistic ingredients that drive our model in flies. When we adapted human observers to moving light edges or dark edges, we could manipulate the magnitude and direction of their percepts as well, suggesting that mechanisms similar to the fly's may also underlie this illusion in humans. By taking a comparative approach that exploits Drosophila neurogenetics, our results provide a causal, mechanistic account for a long-known visual illusion. These results argue that this illusion arises from architectures for motion detection that are shared across phyla.
Collapse
Affiliation(s)
- Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
| | - Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511
| | | | - Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511;
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511
- Department of Physics, Yale University, New Haven, CT 06511
- Department of Neuroscience, Yale University, New Haven, CT 06511
| |
Collapse
|
26
|
Zavatone-Veth JA, Badwan BA, Clark DA. A minimal synaptic model for direction selective neurons in Drosophila. J Vis 2020; 20:2. [PMID: 32040161] [PMCID: PMC7343402] [DOI: 10.1167/jov.20.2.2]
Abstract
Visual motion estimation is a canonical neural computation. In Drosophila, recent advances have identified anatomic and functional circuitry underlying direction-selective computations. Models with varying levels of abstraction have been proposed to explain specific experimental results but have rarely been compared across experiments. Here we use the wealth of available anatomical and physiological data to construct a minimal, biophysically inspired synaptic model for Drosophila’s first-order direction-selective T4 cells. We show how this model relates mathematically to classical models of motion detection, including the Hassenstein-Reichardt correlator model. We used numerical simulation to test how well this synaptic model could reproduce measurements of T4 cells across many datasets and stimulus modalities. These comparisons include responses to sinusoid gratings, to apparent motion stimuli, to stochastic stimuli, and to natural scenes. Without fine-tuning this model, it sufficed to reproduce many, but not all, response properties of T4 cells. Since this model is flexible and based on straightforward biophysical properties, it provides an extensible framework for developing a mechanistic understanding of T4 neural response properties. Moreover, it can be used to assess the sufficiency of simple biophysical mechanisms to describe features of the direction-selective computation and identify where our understanding must be improved.
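The relationship to the Hassenstein-Reichardt correlator can be made concrete with a minimal discrete-time version. The first-order low-pass delay filter and the parameter values here are generic textbook choices, not the paper's fitted synaptic model.

```python
import numpy as np

def low_pass(x, tau=5.0):
    """First-order low-pass filter, the classic stand-in for the
    correlator's delay line (discrete-time exponential smoothing)."""
    y = np.zeros_like(x)
    a = 1.0 / tau
    for t in range(1, len(x)):
        y[t] = y[t - 1] + a * (x[t] - y[t - 1])
    return y

def hrc(left, right, tau=5.0):
    """Opponent Hassenstein-Reichardt correlator: each photoreceptor
    signal multiplies a delayed copy of its neighbour, and the two
    mirror-symmetric products are subtracted. The time-averaged output
    is positive for left-to-right motion and negative for the reverse."""
    return low_pass(left, tau) * right - left * low_pass(right, tau)

# Drifting sinusoid sampled at two points; `right` lags `left`, i.e.
# the pattern moves from the left detector toward the right one.
t = np.arange(400)
left = np.sin(2 * np.pi * t / 40)
right = np.sin(2 * np.pi * (t - 10) / 40)
print(hrc(left, right).mean() > 0)  # True: net rightward signal
```

Swapping the two inputs flips the sign of the time-averaged output, which is the opponency the model class is built around.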
27
Yildizoglu T, Riegler C, Fitzgerald JE, Portugues R. A Neural Representation of Naturalistic Motion-Guided Behavior in the Zebrafish Brain. Curr Biol 2020; 30:2321-2333.e6. [PMID: 32386533] [DOI: 10.1016/j.cub.2020.04.043]
Abstract
All animals must transform ambiguous sensory data into successful behavior. This requires sensory representations that accurately reflect the statistics of natural stimuli and behavior. Multiple studies show that visual motion processing is tuned for accuracy under naturalistic conditions, but the sensorimotor circuits extracting these cues and implementing motion-guided behavior remain unclear. Here we show that the larval zebrafish retina extracts a diversity of naturalistic motion cues, and the retinorecipient pretectum organizes these cues around the elements of behavior. We find that higher-order motion stimuli, gliders, induce optomotor behavior matching expectations from natural scene analyses. We then image activity of retinal ganglion cell terminals and pretectal neurons. The retina exhibits direction-selective responses across glider stimuli, and anatomically clustered pretectal neurons respond with magnitudes matching behavior. Peripheral computations thus reflect natural input statistics, whereas central brain activity precisely codes information needed for behavior. This general principle could organize sensorimotor transformations across animal species.
Affiliation(s)
- Tugce Yildizoglu
- Max Planck Institute of Neurobiology, Research Group of Sensorimotor Control, Martinsried 82152, Germany
- Clemens Riegler
- Department of Molecular and Cellular Biology, Harvard University, Cambridge, MA 02138, USA; Department of Neurobiology, Faculty of Life Sciences, University of Vienna, Althanstrasse 14, 1090 Vienna, Austria
- James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA.
- Ruben Portugues
- Max Planck Institute of Neurobiology, Research Group of Sensorimotor Control, Martinsried 82152, Germany; Institute of Neuroscience, Technical University of Munich, Munich 80802, Germany; Munich Cluster for Systems Neurology (SyNergy), Munich 80802, Germany.
28
Matulis CA, Chen J, Gonzalez-Suarez AD, Behnia R, Clark DA. Heterogeneous Temporal Contrast Adaptation in Drosophila Direction-Selective Circuits. Curr Biol 2020; 30:222-236.e6. [PMID: 31928874] [PMCID: PMC7003801] [DOI: 10.1016/j.cub.2019.11.077]
Abstract
In visual systems, neurons adapt both to the mean light level and to the range of light levels, or the contrast. Contrast adaptation has been studied extensively, but it remains unclear how it is distributed among neurons in connected circuits, and how early adaptation affects subsequent computations. Here, we investigated temporal contrast adaptation in neurons across Drosophila's visual motion circuitry. Several ON-pathway neurons showed strong adaptation to changes in contrast over time. One of these neurons, Mi1, showed almost complete adaptation on fast timescales, and experiments ruled out several potential mechanisms for its adaptive properties. When contrast adaptation reduced the gain in ON-pathway cells, it was accompanied by decreased motion responses in downstream direction-selective cells. Simulations show that contrast adaptation can substantially improve motion estimates in natural scenes. The benefits are larger for ON-pathway adaptation, which helps explain the heterogeneous distribution of contrast adaptation in these circuits.
Affiliation(s)
- Catherine A Matulis
- Department of Physics, Yale University, 217 Prospect Street, New Haven, CT 06511, USA
- Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06510, USA
- Rudy Behnia
- Department of Neuroscience, Columbia University, 3227 Broadway, New York, NY 10027, USA
- Damon A Clark
- Department of Physics, Yale University, 217 Prospect Street, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06510, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, 260 Whitney Avenue, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, 333 Cedar Street, New Haven, CT 06510, USA.
29
Dynamic Signal Compression for Robust Motion Vision in Flies. Curr Biol 2020; 30:209-221.e8. [PMID: 31928873] [DOI: 10.1016/j.cub.2019.10.035]
Abstract
Sensory systems need to reliably extract information from highly variable natural signals. Flies, for instance, use optic flow to guide their course and are remarkably adept at estimating image velocity regardless of image statistics. Current circuit models, however, cannot account for this robustness. Here, we demonstrate that the Drosophila visual system reduces input variability by rapidly adjusting its sensitivity to local contrast conditions. We exhaustively map functional properties of neurons in the motion detection circuit and find that local responses are compressed by surround contrast. The compressive signal is fast, integrates spatially, and derives from neural feedback. Training convolutional neural networks on estimating the velocity of natural stimuli shows that this dynamic signal compression can close the performance gap between model and organism. Overall, our work represents a comprehensive mechanistic account of how neural systems attain the robustness to carry out survival-critical tasks in challenging real-world environments.
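The surround-driven compression described here is reminiscent of divisive normalization. The sketch below is only a loose illustration of that idea, not the feedback mechanism the paper characterizes; the RMS surround measure and the semi-saturation constant `sigma` are assumptions of this sketch:

```python
import numpy as np

def compress_by_surround(center, surround, sigma=0.2):
    """Divisive-normalization sketch of contrast-dependent compression.

    The center response is divided by the surround's RMS contrast plus
    a semi-saturation constant, so the same center input produces a
    weaker response when the surround is high-contrast.
    """
    surround_contrast = np.sqrt(np.mean(np.square(surround)))
    return center / (sigma + surround_contrast)
```

The divisive form keeps responses within a usable range across contrast conditions, which is the variability-reduction role the abstract describes.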
30
Gorur-Shandilya S, Martelli C, Demir M, Emonet T. Controlling and measuring dynamic odorant stimuli in the laboratory. J Exp Biol 2019; 222:jeb207787. [PMID: 31672728] [DOI: 10.1242/jeb.207787]
Abstract
Animals experience complex odorant stimuli that vary widely in composition, intensity and temporal properties. However, stimuli used to study olfaction in the laboratory are much simpler. This mismatch arises from the challenges in measuring and controlling them precisely and accurately. Even simple pulses can have diverse kinetics that depend on their molecular identity. Here, we introduce a model that describes how stimulus kinetics depend on the molecular identity of the odorant and the geometry of the delivery system. We describe methods to deliver dynamic odorant stimuli of several types, including broadly distributed stimuli that reproduce some of the statistics of naturalistic plumes, in a reproducible and precise manner. Finally, we introduce a method to calibrate a photo-ionization detector to any odorant it can detect, using no additional components. Our approaches are affordable and flexible and can be used to advance our understanding of how olfactory neurons encode real-world odor signals.
Affiliation(s)
- Srinivas Gorur-Shandilya
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Carlotta Martelli
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Biology, University of Konstanz, Konstanz 78457, Germany
- Mahmut Demir
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Thierry Emonet
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA
31
Chen J, Mandel HB, Fitzgerald JE, Clark DA. Asymmetric ON-OFF processing of visual motion cancels variability induced by the structure of natural scenes. eLife 2019; 8:e47579. [PMID: 31613221] [PMCID: PMC6884396] [DOI: 10.7554/elife.47579]
Abstract
Animals detect motion using a variety of visual cues that reflect regularities in the natural world. Experiments in animals across phyla have shown that motion percepts incorporate both pairwise and triplet spatiotemporal correlations that could theoretically benefit motion computation. However, it remains unclear how visual systems assemble these cues to build accurate motion estimates. Here, we used systematic behavioral measurements of fruit fly motion perception to show how flies combine local pairwise and triplet correlations to reduce variability in motion estimates across natural scenes. By generating synthetic images with statistics controlled by maximum entropy distributions, we show that the triplet correlations are useful only when images have light-dark asymmetries that mimic natural ones. This suggests that asymmetric ON-OFF processing is tuned to the particular statistics of natural scenes. Since all animals encounter the world's light-dark asymmetries, many visual systems are likely to use asymmetric ON-OFF processing to improve motion estimation.
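The pairwise and triplet correlations discussed above are simple averages over a space-time contrast array. A minimal sketch, assuming rows are time steps and columns are spatial positions; the triplet shown is just one example "converging" correlator, not the paper's full set of glider statistics:

```python
import numpy as np

def motion_correlators(stim):
    """Pairwise and triplet spatiotemporal correlators of an x-t stimulus.

    stim: 2-D zero-mean contrast array, rows = time, columns = space.
    Returns the rightward two-point correlator <I(x,t) I(x+1,t+1)> and
    one example three-point correlator <I(x,t) I(x+1,t) I(x+1,t+1)>.
    """
    two_pt = np.mean(stim[:-1, :-1] * stim[1:, 1:])
    three_pt = np.mean(stim[:-1, :-1] * stim[:-1, 1:] * stim[1:, 1:])
    return two_pt, three_pt
```

A pattern rigidly drifting rightward by one pixel per time step drives the two-point correlator to its maximum, while drift in the opposite direction leaves it near zero.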
Affiliation(s)
- Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Holly B Mandel
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
- James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
- Department of Physics, Yale University, New Haven, United States
- Department of Neuroscience, Yale University, New Haven, United States
32
Optomotor Swimming in Larval Zebrafish Is Driven by Global Whole-Field Visual Motion and Local Light-Dark Transitions. Cell Rep 2019; 29:659-670.e3. [DOI: 10.1016/j.celrep.2019.09.024]
33
Vilidaite G, Norcia AM, West RJH, Elliott CJH, Pei F, Wade AR, Baker DH. Autism sensory dysfunction in an evolutionarily conserved system. Proc Biol Sci 2019; 285:20182255. [PMID: 30963913] [PMCID: PMC6304042] [DOI: 10.1098/rspb.2018.2255]
Abstract
There is increasing evidence for a strong genetic basis for autism, with many genetic models being developed in an attempt to replicate autistic symptoms in animals. However, current animal behaviour paradigms rarely match the social and cognitive behaviours exhibited by autistic individuals. Here, we instead assay another functional domain, sensory processing, known to be affected in autism to test a novel genetic autism model in Drosophila melanogaster. We show similar visual response alterations and a similar developmental trajectory in Nhe3 mutant flies (total n = 72) and in autistic human participants (total n = 154). We report a dissociation between first- and second-order electrophysiological visual responses to steady-state stimulation in adult mutant fruit flies that is strikingly similar to the response pattern in human adults with ASD as well as that of a large sample of neurotypical individuals with high numbers of autistic traits. We explain this as a genetically driven, selective signalling alteration in transient visual dynamics. In contrast to adults, autistic children show a decrease in the first-order response that is matched by the fruit fly model, suggesting that a compensatory change in processing occurs during development. Our results provide the first animal model of autism comprising a differential developmental phenotype in visual processing.
Affiliation(s)
- Greta Vilidaite
- Department of Psychology, Stanford University, Stanford, CA 94305, USA
- Anthony M Norcia
- Department of Psychology, Stanford University, Stanford, CA 94305, USA
- Ryan J H West
- Department of Biology, University of York, York YO10 5DD, UK
- Francesca Pei
- Department of Psychiatry, Stanford University, Stanford, CA 94305, USA
- Alex R Wade
- Department of Biology, University of York, York YO10 5DD, UK; Department of Psychology, University of York, York YO10 5DD, UK
- Daniel H Baker
- Department of Psychology, University of York, York YO10 5DD, UK
34
Dynamic nonlinearities enable direction opponency in Drosophila elementary motion detectors. Nat Neurosci 2019; 22:1318-1326. [PMID: 31346296] [PMCID: PMC6748873] [DOI: 10.1038/s41593-019-0443-y]
Abstract
Direction-selective neurons respond to visual motion in a preferred direction. They are direction-opponent if they are also inhibited by motion in the opposite direction. In flies and vertebrates, direction opponency has been observed in second-order direction-selective neurons, which achieve this opponency by subtracting signals from first-order direction-selective cells with opposite directional tunings. Here, we report direction opponency in Drosophila that emerges in first-order direction-selective neurons, the elementary motion detectors T4 and T5. This opponency persists when synaptic output from these cells is blocked, suggesting that it arises from feedforward, not feedback, computations. These observations exclude a broad class of linear-nonlinear models that have been proposed to describe direction-selective computations. However, they are consistent with models that include dynamic nonlinearities. Simulations of opponent models suggest that direction opponency in first-order motion detectors improves motion discriminability by suppressing noise generated by the local structure of natural scenes.
35
Neural mechanisms of contextual modulation in the retinal direction selective circuit. Nat Commun 2019; 10:2431. [PMID: 31160566] [PMCID: PMC6547848] [DOI: 10.1038/s41467-019-10268-z]
Abstract
Contextual modulation of neuronal responses by surrounding environments is a fundamental attribute of sensory processing. In the mammalian retina, responses of On-Off direction selective ganglion cells (DSGCs) are modulated by motion contexts. However, the underlying mechanisms are unknown. Here, we show that posterior-preferring DSGCs (pDSGCs) are sensitive to discontinuities of moving contours owing to contextually modulated cholinergic excitation from starburst amacrine cells (SACs). Using a combination of synapse-specific genetic manipulations, patch clamp electrophysiology and connectomic analysis, we identified distinct circuit motifs upstream of On and Off SACs that are required for the contextual modulation of pDSGC activity for bright and dark contrasts. Furthermore, our results reveal a class of wide-field amacrine cells (WACs) with straight, unbranching dendrites that function as "continuity detectors" of moving contours. Therefore, divergent circuit motifs in the On and Off pathways extend the information encoding of On-Off DSGCs beyond their direction selectivity during complex stimuli. The mechanisms of contextual modulation in direction selective ganglion cells in the retina remain unclear. Here, the authors find that On-Off direction-selective ganglion cells are differentially sensitive to discontinuities of dark and bright moving edges in the visual environment and, using synapse-specific genetic manipulations with functional measurements, reveal the microcircuits underlying this contextual sensitivity.
36
Creamer MS, Mano O, Tanaka R, Clark DA. A flexible geometry for panoramic visual and optogenetic stimulation during behavior and physiology. J Neurosci Methods 2019; 323:48-55. [PMID: 31103713] [DOI: 10.1016/j.jneumeth.2019.05.005]
Abstract
BACKGROUND: To study visual processing, it is necessary to precisely control visual stimuli while recording neural and behavioral responses. It can be important to present stimuli over a broad area of the visual field, which can be technically difficult.
NEW METHOD: We present a simple geometry that can be used to display panoramic stimuli. A single digital light projector generates images that are reflected by mirrors onto flat screens that surround an animal. It can be used for behavioral and neurophysiological measurements, so virtually identical stimuli can be presented. Moreover, this geometry permits light from the projector to be used to activate optogenetic tools.
RESULTS: Using this geometry, we presented panoramic visual stimulation to Drosophila in three paradigms. We presented drifting contrast gratings while recording walking and turning speed. We used the same projector to activate optogenetic channels during visual stimulation. Finally, we used two-photon microscopy to record responses in direction-selective cells to drifting gratings.
COMPARISON WITH EXISTING METHOD(S): Existing methods have typically required custom hardware or curved screens, while this method requires only flat back-projection screens and a digital light projector. The projector generates images in real time and does not require pre-generated images. Finally, while many setups are large, this geometry occupies a 30 × 20 cm footprint with a 25 cm height.
CONCLUSIONS: This flexible geometry enables measurements of behavioral and neural responses to panoramic stimuli. This allows moderate-throughput behavioral experiments with simultaneous optogenetic manipulation, with easy comparisons between behavior and neural activity using virtually identical stimuli.
Affiliation(s)
- Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, United States
- Omer Mano
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT, United States
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, United States
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT, United States; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT, United States; Department of Physics, Yale University, New Haven, CT, United States; Department of Neuroscience, Yale University, New Haven, CT, United States.
37
Dyakova O, Rångtell FH, Tan X, Nordström K, Benedict C. Acute sleep loss induces signs of visual discomfort in young men. J Sleep Res 2019; 28:e12837. [PMID: 30815934] [PMCID: PMC6900002] [DOI: 10.1111/jsr.12837]
Abstract
Acute sleep loss influences visual processes in humans, such as recognizing facial emotions. However, to the best of our knowledge, no study to date has examined whether acute sleep loss alters visual comfort when looking at images. One image statistic that can be used to investigate the level of visual comfort experienced under visual encoding is the slope of the amplitude spectrum, also referred to as the slope constant. The slope constant describes the spatial distribution of pixel intensities, and deviations from the natural slope constant can induce visual discomfort. In the present counterbalanced crossover design study, 11 young men with normal or corrected-to-normal vision participated in two experimental conditions: one night of sleep loss and one night of sleep. In the morning after each intervention, subjects performed a computerized psychophysics task. Specifically, they were required to adjust the slope constant of images depicting natural landscapes and close-ups with a randomly chosen initial slope constant until they perceived each image as most natural looking. Subjects also rated the pleasantness of each selected image. Our analysis showed that following sleep loss, higher slope constants were perceived as most natural looking when viewing images of natural landscapes. Images with a higher slope constant are generally perceived as blurrier. The selected images were also rated as less pleasant after sleep loss. No such differences between the experimental conditions were noted for images of close-ups. The results suggest that sleep loss induces signs of visual discomfort in young men. Possible implications of these findings are discussed.
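The slope constant referred to here is the exponent of the roughly 1/f falloff of an image's rotationally averaged amplitude spectrum. A rough estimator follows; the integer radial binning and least-squares fit are choices of this sketch, not necessarily the study's exact procedure:

```python
import numpy as np

def amplitude_spectrum_slope(image):
    """Estimate the slope constant alpha of a 2-D image.

    Natural images have amplitude spectra falling off roughly as
    1/f**alpha with alpha near 1. This computes the rotationally
    averaged amplitude at each integer spatial-frequency radius and
    fits a line in log-log coordinates, returning the fitted alpha.
    """
    amp = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h // 2, xx - w // 2).astype(int)
    radii = np.arange(1, min(h, w) // 2)  # skip the DC component
    radial = np.array([amp[r == k].mean() for k in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(radial), 1)
    return -slope
```

Applied to synthetic noise whose spectrum is built to fall as 1/f, the estimator returns a value near 1, the slope typical of natural scenes.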
Affiliation(s)
- Olga Dyakova
- Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Xiao Tan
- Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Karin Nordström
- Department of Neuroscience, Uppsala University, Uppsala, Sweden; Centre for Neuroscience, Flinders University, Adelaide, South Australia, Australia
38
Salazar-Gatzimas E, Agrochao M, Fitzgerald JE, Clark DA. The Neuronal Basis of an Illusory Motion Percept Is Explained by Decorrelation of Parallel Motion Pathways. Curr Biol 2018; 28:3748-3762.e8. [PMID: 30471993] [DOI: 10.1016/j.cub.2018.10.007]
Abstract
Both vertebrates and invertebrates perceive illusory motion, known as "reverse-phi," in visual stimuli that contain sequential luminance increments and decrements. However, increment (ON) and decrement (OFF) signals are initially processed by separate visual neurons, and parallel elementary motion detectors downstream respond selectively to the motion of light or dark edges, often termed ON- and OFF-edges. It remains unknown how and where ON and OFF signals combine to generate reverse-phi motion signals. Here, we show that each of Drosophila's elementary motion detectors encodes motion by combining both ON and OFF signals. Their pattern of responses reflects combinations of increments and decrements that co-occur in natural motion, serving to decorrelate their outputs. These results suggest that the general principle of signal decorrelation drives the functional specialization of parallel motion detection channels, including their selectivity for moving light or dark edges.
Affiliation(s)
- Emilio Salazar-Gatzimas
- Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, 219 Prospect Street, New Haven, CT 06511, USA
- James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, 19700 Helix Drive, Ashburn, VA 20147, USA
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, 333 Cedar Street, New Haven, CT 06511, USA; Department of Molecular Cellular and Developmental Biology, Yale University, 219 Prospect Street, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA.
39
Creamer MS, Mano O, Clark DA. Visual Control of Walking Speed in Drosophila. Neuron 2018; 100:1460-1473.e6. [PMID: 30415994] [DOI: 10.1016/j.neuron.2018.10.028]
Abstract
An animal's self-motion generates optic flow across its retina, and it can use this visual signal to regulate its orientation and speed through the world. While orientation control has been studied extensively in Drosophila and other insects, much less is known about the visual cues and circuits that regulate translational speed. Here, we show that flies regulate walking speed with an algorithm that is tuned to the speed of visual motion, causing them to slow when visual objects are nearby. This regulation does not depend strongly on the spatial structure or the direction of visual stimuli, making it algorithmically distinct from the classic computation that controls orientation. Despite the different algorithms, the visual circuits that regulate walking speed overlap with those that regulate orientation. Taken together, our findings suggest that walking speed is controlled by a hierarchical computation that combines multiple motion detectors with distinct tunings.
Affiliation(s)
- Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Omer Mano
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA.
40
Flow stimuli reveal ecologically appropriate responses in mouse visual cortex. Proc Natl Acad Sci U S A 2018; 115:11304-11309. [PMID: 30327345] [DOI: 10.1073/pnas.1811265115]
Abstract
Assessments of the mouse visual system based on spatial-frequency analysis imply that its visual capacity is low, with few neurons responding to spatial frequencies greater than 0.5 cycles per degree. However, visually mediated behaviors, such as prey capture, suggest that the mouse visual system is more precise. We introduce a stimulus class, visual flow patterns, that is more like what the mouse would encounter in the natural world than are sine-wave gratings but is more tractable for analysis than are natural images. We used 128-site silicon microelectrodes to measure the simultaneous responses of single neurons in the primary visual cortex (V1) of alert mice. While holding temporal-frequency content fixed, we explored a class of drifting patterns of black or white dots that have energy only at higher spatial frequencies. These flow stimuli evoke strong visually mediated responses well beyond those predicted by spatial-frequency analysis. Flow responses predominate in higher spatial-frequency ranges (0.15-1.6 cycles per degree), many are orientation or direction selective, and flow responses of many neurons depend strongly on sign of contrast. Many cells exhibit distributed responses across our stimulus ensemble. Together, these results challenge conventional linear approaches to visual processing and expand our understanding of the mouse's visual capacity to behaviorally relevant ranges.
41
Kohler PJ, Meredith WJ, Norcia AM. Revisiting the functional significance of binocular cues for perceiving motion-in-depth. Nat Commun 2018; 9:3511. [PMID: 30158523] [PMCID: PMC6115357] [DOI: 10.1038/s41467-018-05918-7]
Abstract
Binocular differencing of spatial cues required for perceiving depth relationships is associated with decreased sensitivity to the corresponding retinal image displacements. However, binocular summation of contrast signals increases sensitivity. Here, we investigated this divergence in sensitivity by making direct neural measurements of responses to suprathreshold motion in human adults and 5-month-old infants using steady-state visually evoked potentials. Interocular differences in retinal image motion generated suppressed response functions and correspondingly elevated perceptual thresholds compared to motion matched between the two eyes. This suppression was of equal strength for horizontal and vertical motion and therefore not specific to the perception of motion-in-depth. Suppression is strongly dependent on the presence of spatial references in the image and highly immature in infants. Suppression appears to be the manifestation of a succession of spatial and interocular opponency operations that occur at an intermediate processing stage either before or in parallel with the extraction of motion-in-depth.
Affiliation(s)
- Peter J Kohler
- Department of Psychology, Stanford University, Stanford, CA, 94305, USA.
- Wesley J Meredith
- Department of Psychology, Stanford University, Stanford, CA, 94305, USA
- Anthony M Norcia
- Department of Psychology, Stanford University, Stanford, CA, 94305, USA
42
Abstract
Visual motion processing can be conceptually divided into two levels. In the lower level, local motion signals are detected by spatiotemporal-frequency-selective sensors and then integrated into a motion vector flow. Although the model based on V1-MT physiology provides a good computational framework for this level of processing, it needs to be updated to fully explain psychophysical findings about motion perception, including complex motion signal interactions in the spatiotemporal-frequency and space domains. In the higher level, the velocity map is interpreted. Although there are many motion interpretation processes, we highlight the recent progress in research on the perception of material (e.g., specular reflection, liquid viscosity) and on animacy perception. We then consider possible linking mechanisms of the two levels and propose intrinsic flow decomposition as the key problem. To provide insights into computational mechanisms of motion perception, in addition to psychophysics and neurosciences, we review machine vision studies seeking to solve similar problems.
Affiliation(s)
- Shin'ya Nishida
- NTT Communication Science Labs, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa 243-0198, Japan
- Takahiro Kawabe
- NTT Communication Science Labs, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa 243-0198, Japan
- Masataka Sawayama
- NTT Communication Science Labs, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa 243-0198, Japan
- Taiki Fukiage
- NTT Communication Science Labs, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa 243-0198, Japan
43
Abstract
Motion in the visual world provides critical information to guide the behavior of sighted animals. Furthermore, as visual motion estimation requires comparisons of signals across inputs and over time, it represents a paradigmatic and generalizable neural computation. Focusing on the Drosophila visual system, where an explosion of technological advances has recently accelerated experimental progress, we review our understanding of how, algorithmically and mechanistically, motion signals are first computed.
Affiliation(s)
- Helen H Yang
- Department of Neurobiology, Stanford University, Stanford, California 94305, USA. Current affiliation: Department of Neurobiology, Harvard Medical School, Boston, Massachusetts 02115, USA
- Thomas R Clandinin
- Department of Neurobiology, Stanford University, Stanford, California 94305, USA
44
White noise analysis for the correlation-type elementary motion detectors with half-wave rectifiers. Neural Netw 2018; 102:96-106. [DOI: 10.1016/j.neunet.2018.02.014] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 09/02/2017] [Revised: 02/20/2018] [Accepted: 02/23/2018] [Indexed: 11/19/2022]
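The model class this entry analyzes can be sketched in a few lines. The following toy implementation is illustrative only, not the paper's exact formulation: each input is split into ON (positive) and OFF (negative) half-waves, and a delay-and-multiply correlator operates on each channel separately. All signal arrays and function names are invented for the example.

```python
import numpy as np

def correlate(a, b, delay=1):
    """Delay-and-multiply correlator minus its mirror term, summed over time.
    Positive output signals motion from input a toward input b."""
    a_d, b_d = np.roll(a, delay), np.roll(b, delay)
    a_d[:delay] = 0.0  # discard samples wrapped around by np.roll
    b_d[:delay] = 0.0
    return float(np.sum(a_d * b - b_d * a))

def half_wave_emd(left, right, delay=1):
    """Correlator-type EMD with half-wave rectified inputs: ON and OFF
    half-waves are correlated in separate channels and then summed."""
    on = correlate(np.maximum(left, 0), np.maximum(right, 0), delay)
    off = correlate(np.maximum(-left, 0), np.maximum(-right, 0), delay)
    return on + off

# A bright edge drifting rightward reaches the left input one step earlier.
left = np.array([0.0, 1.0, 1.0, 0.0, 0.0])
right = np.array([0.0, 0.0, 1.0, 1.0, 0.0])
print(half_wave_emd(left, right))   # positive: rightward
print(half_wave_emd(right, left))   # negative: leftward
```

Splitting into rectified channels before correlation is what makes such detectors amenable to white noise (Wiener-kernel) analysis of each polarity pathway.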
45
Bialek W. Perspectives on theory at the interface of physics and biology. Rep Prog Phys 2018; 81:012601. [PMID: 29214982] [DOI: 10.1088/1361-6633/aa995b] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Indexed: 06/07/2023]
Abstract
Theoretical physics is the search for simple and universal mathematical descriptions of the natural world. In contrast, much of modern biology is an exploration of the complexity and diversity of life. For many, this contrast is prima facie evidence that theory, in the sense that physicists use the word, is impossible in a biological context. For others, this contrast serves to highlight a grand challenge. I am an optimist, and believe (along with many colleagues) that the time is ripe for the emergence of a more unified theoretical physics of biological systems, building on successes in thinking about particular phenomena. In this essay I try to explain the reasons for my optimism, through a combination of historical and modern examples.
Affiliation(s)
- William Bialek
- Joseph Henry Laboratories of Physics and Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544, USA; Initiative for the Theoretical Sciences, The Graduate Center, City University of New York, 365 Fifth Ave, New York, NY 10016, USA
46
Clark DA, Demb JB. Parallel Computations in Insect and Mammalian Visual Motion Processing. Curr Biol 2016; 26:R1062-R1072. [PMID: 27780048] [DOI: 10.1016/j.cub.2016.08.003] [Citation(s) in RCA: 36] [Impact Index Per Article: 5.1] [Indexed: 02/05/2023]
Abstract
Sensory systems use receptors to extract information from the environment and neural circuits to perform subsequent computations. These computations may be described as algorithms composed of sequential mathematical operations. Comparing these operations across taxa reveals how different neural circuits have evolved to solve the same problem, even when using different mechanisms to implement the underlying math. In this review, we compare how insect and mammalian neural circuits have solved the problem of motion estimation, focusing on the fruit fly Drosophila and the mouse retina. Although the two systems implement computations with grossly different anatomy and molecular mechanisms, the underlying circuits transform light into motion signals with strikingly similar processing steps. These similarities run from photoreceptor gain control and spatiotemporal tuning to ON and OFF pathway structures, motion detection, and computed motion signals. The parallels between the two systems suggest that a limited set of algorithms for estimating motion satisfies both the needs of sighted creatures and the constraints imposed on them by metabolism, anatomy, and the structure and regularities of the visual world.
Affiliation(s)
- Damon A Clark
- Department of Molecular, Cellular, and Developmental Biology and Department of Physics, Yale University, New Haven, CT 06511, USA
- Jonathan B Demb
- Department of Ophthalmology and Visual Science and Department of Cellular and Molecular Physiology, Yale University, New Haven, CT 06511, USA
47
Neural mechanisms underlying sensitivity to reverse-phi motion in the fly. PLoS One 2017; 12:e0189019. [PMID: 29261684] [PMCID: PMC5737883] [DOI: 10.1371/journal.pone.0189019] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Received: 08/25/2017] [Accepted: 11/18/2017] [Indexed: 01/18/2023]
Abstract
Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion: when contrast reversals accompany discrete movement, the detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies, as well as electrophysiology and calcium imaging, we demonstrate that the ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to specific interactions between ON and OFF signals. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, such as the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics.
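The algorithmic account of reverse-phi maps naturally onto a correlator model: because the detector multiplies contrast values across space and time, a contrast sign flip between successive frames flips the sign of the output. A minimal sketch, with stimulus vectors and names invented for illustration rather than taken from the paper's model:

```python
import numpy as np

def hrc(left, right, delay=1):
    """Minimal Hassenstein-Reichardt correlator on two contrast signals.
    Positive output = motion from the left input toward the right input."""
    l_d, r_d = np.roll(left, delay), np.roll(right, delay)
    l_d[:delay] = 0.0  # discard samples wrapped around by np.roll
    r_d[:delay] = 0.0
    return float(np.sum(l_d * right - r_d * left))

# Phi motion: a bright bar steps rightward with unchanged contrast.
phi_left  = np.array([0.0, 1.0, 0.0, 0.0])
phi_right = np.array([0.0, 0.0, 1.0, 0.0])

# Reverse-phi: the same rightward step, but the bar flips contrast sign.
rphi_left  = np.array([0.0, 1.0, 0.0, 0.0])
rphi_right = np.array([0.0, 0.0, -1.0, 0.0])

print(hrc(phi_left, phi_right))    # positive: rightward percept
print(hrc(rphi_left, rphi_right))  # negative: illusory reversal
```

The product of a positive and a negative contrast is negative, so the rightward displacement registers as leftward, which is the behavioral signature of the illusion.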
48
Dyakova O, Nordström K. Image statistics and their processing in insect vision. Curr Opin Insect Sci 2017; 24:7-14. [PMID: 29208226] [DOI: 10.1016/j.cois.2017.08.002] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Received: 06/16/2017] [Revised: 08/17/2017] [Accepted: 08/24/2017] [Indexed: 06/07/2023]
Abstract
Natural scenes may appear random, but they are constrained in space and time and show strong spatial and temporal correlations. Spatial constraints and correlations can be described by quantifying image statistics, which include intuitive measures such as contrast, color and luminance, but also parameters that require some transformation of the image. In this review we discuss common tools used to quantify spatial and temporal parameters of naturalistic visual input, and how these tools have informed us about visual processing in insects. In particular, we review findings that would not have been possible using conventional, experimenter-defined stimuli.
Affiliation(s)
- Olga Dyakova
- Department of Neuroscience, Uppsala University, Box 593, 751 24 Uppsala, Sweden
- Karin Nordström
- Department of Neuroscience, Uppsala University, Box 593, 751 24 Uppsala, Sweden; Centre for Neuroscience, Flinders University, GPO Box 2100, Adelaide, SA 5001, Australia
49
Salazar-Gatzimas E, Chen J, Creamer MS, Mano O, Mandel HB, Matulis CA, Pottackal J, Clark DA. Direct Measurement of Correlation Responses in Drosophila Elementary Motion Detectors Reveals Fast Timescale Tuning. Neuron 2016; 92:227-239. [PMID: 27710784] [DOI: 10.1016/j.neuron.2016.09.017] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6] [Received: 02/07/2016] [Revised: 05/22/2016] [Accepted: 08/29/2016] [Indexed: 10/20/2022]
Abstract
Animals estimate visual motion by integrating light intensity information over time and space. The integration requires nonlinear processing, which makes motion estimation circuitry sensitive to specific spatiotemporal correlations that signify visual motion. Classical models of motion estimation weight these correlations to produce direction-selective signals. However, the correlational algorithms they describe have not been directly measured in elementary motion-detecting neurons (EMDs). Here, we employed stimuli to directly measure responses to pairwise correlations in Drosophila's EMD neurons, T4 and T5. Activity in these neurons was required for behavioral responses to pairwise correlations and was predictive of those responses. The pattern of neural responses in the EMDs was inconsistent with one classical model of motion detection, and the timescale and selectivity of correlation responses constrained the temporal filtering properties in potential models. These results reveal how neural responses to pairwise correlations drive visual behavior in this canonical motion-detecting circuit.
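The pairwise-correlation statistic these experiments probe can be illustrated directly: a correlator-type detector weights two-point correlations <s(x,t)·s(x+Δx,t+Δt)> of the space-time stimulus. The following is a hypothetical sketch of that statistic, not the authors' analysis code; the drifting pattern and all names are invented for the example.

```python
import numpy as np

def pairwise_corr(stim, dx, dt):
    """Mean two-point correlation <s(x,t) * s(x+dx, t+dt)> of a space-time
    stimulus stim[t, x], with circular spatial boundaries (dx, dt >= 0)."""
    shifted = np.roll(np.roll(stim, -dt, axis=0), -dx, axis=1)
    valid = stim.shape[0] - dt  # drop rows that wrapped around in time
    return float(np.mean(stim[:valid] * shifted[:valid]))

# A zero-mean binary contrast pattern drifting rightward by 1 pixel/frame.
pattern = np.array([1.0, -1.0, 1.0, 1.0, -1.0, -1.0, 1.0, -1.0])
stim = np.stack([np.roll(pattern, t) for t in range(20)])

# Displacement matching the motion direction vs. the spatially flipped
# (leftward-moving) version of the same stimulus.
along = pairwise_corr(stim, dx=1, dt=1)
against = pairwise_corr(np.flip(stim, axis=1), dx=1, dt=1)
print(along, against)  # high along the motion direction, weak against it
```

A detector that weights the (Δx=1, Δt=1) correlation positively therefore responds selectively to rightward drift of this pattern.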
Affiliation(s)
- Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Omer Mano
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Holly B Mandel
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Joseph Pottackal
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA
50
Haag J, Mishra A, Borst A. A common directional tuning mechanism of Drosophila motion-sensing neurons in the ON and in the OFF pathway. eLife 2017; 6:e29044. [PMID: 28829040] [PMCID: PMC5582866] [DOI: 10.7554/eLife.29044] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6] [Received: 05/30/2017] [Accepted: 08/21/2017] [Indexed: 01/19/2023]
Abstract
In the fruit fly optic lobe, T4 and T5 cells represent the first direction-selective neurons, with T4 cells responding selectively to moving brightness increments (ON) and T5 cells to brightness decrements (OFF). Both T4 and T5 cells comprise four subtypes with directional tuning to one of the four cardinal directions. We had previously found that upward-sensitive T4 cells implement both preferred direction enhancement and null direction suppression (Haag et al., 2016). Here, we asked whether this mechanism generalizes to OFF-selective T5 cells and to all four subtypes of both cell classes. We found that all four subtypes of both T4 and T5 cells implement both mechanisms, that is preferred direction enhancement and null direction inhibition, on opposing sides of their receptive fields. This gives rise to the high degree of direction selectivity observed in both T4 and T5 cells within each subpopulation.
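The two mechanisms described here, preferred-direction enhancement and null-direction suppression on opposing flanks of the receptive field, can be caricatured with a three-input unit: one delayed flank multiplies the central input, while the other divisively suppresses it. This is a deliberately simplified sketch with invented signals and names, not the biophysical model from the paper:

```python
import numpy as np

def delay(x, d=1):
    """Delay a signal by d time steps, zeroing samples wrapped by np.roll."""
    y = np.roll(x, d)
    y[:d] = 0.0
    return y

def three_arm_detector(a, b, c):
    """Three-input unit: delayed flank a enhances the central input b
    (multiplication); delayed flank c suppresses it (division)."""
    enhance = delay(a) * b
    suppress = delay(c) * b
    return float(np.sum(enhance / (1.0 + suppress)))

# An edge moving in the preferred direction visits a, then b, then c.
a_pd = np.array([0.0, 1.0, 0.0, 0.0, 0.0])
b_pd = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
c_pd = np.array([0.0, 0.0, 0.0, 1.0, 0.0])

print(three_arm_detector(a_pd, b_pd, c_pd))  # strong preferred-direction response
print(three_arm_detector(c_pd, b_pd, a_pd))  # null direction: no enhancement
```

Combining both operations on opposite flanks yields sharper direction selectivity than either mechanism alone, which is the qualitative point of the T4/T5 findings.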
Affiliation(s)
- Juergen Haag
- Max-Planck-Institute of Neurobiology, Martinsried, Germany