1
Matsliah A, Yu SC, Kruk K, Bland D, Burke AT, Gager J, Hebditch J, Silverman B, Willie KP, Willie R, Sorek M, Sterling AR, Kind E, Garner D, Sancer G, Wernet MF, Kim SS, Murthy M, Seung HS. Neuronal parts list and wiring diagram for a visual system. Nature 2024; 634:166-180. PMID: 39358525; PMCID: PMC11446827; DOI: 10.1038/s41586-024-07981-1
Abstract
A catalogue of neuronal cell types has often been called a 'parts list' of the brain1, and regarded as a prerequisite for understanding brain function2,3. In the optic lobe of Drosophila, rules of connectivity between cell types have already proven to be essential for understanding fly vision4,5. Here we analyse the fly connectome to complete the list of cell types intrinsic to the optic lobe, as well as the rules governing their connectivity. Most new cell types contain 10 to 100 cells, and integrate information over medium distances in the visual field. Some existing type families (Tm, Li, and LPi)6-10 at least double in number of types. A new serpentine medulla (Sm) interneuron family contains more types than any other. Three families of cross-neuropil types are revealed. The consistency of types is demonstrated by analysing the distances in high-dimensional feature space, and is further validated by algorithms that select small subsets of discriminative features. We use connectivity to hypothesize about the functional roles of cell types in motion, object and colour vision. Connectivity with 'boundary types' that straddle the optic lobe and central brain is also quantified. We showcase the advantages of connectomic cell typing: complete and unbiased sampling, a rich array of features based on connectivity and reduction of the connectome to a substantially simpler wiring diagram of cell types, with immediate relevance for brain function and development.
Affiliation(s)
- Arie Matsliah
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Szi-Chieh Yu
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Krzysztof Kruk
  - Independent researcher, Kielce, Poland
  - Eyewire, Boston, MA, USA
- Doug Bland
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Austin T Burke
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Jay Gager
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- James Hebditch
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Ben Silverman
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Ryan Willie
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Marissa Sorek
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
  - Eyewire, Boston, MA, USA
- Amy R Sterling
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
  - Eyewire, Boston, MA, USA
- Emil Kind
  - Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Dustin Garner
  - Molecular, Cellular and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA, USA
- Gizem Sancer
  - Department of Neuroscience, Yale University, New Haven, CT, USA
- Mathias F Wernet
  - Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Sung Soo Kim
  - Molecular, Cellular and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA, USA
- Mala Murthy
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- H Sebastian Seung
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
  - Computer Science Department, Princeton University, Princeton, NJ, USA
2
Hao YA, Lee S, Roth RH, Natale S, Gomez L, Taxidis J, O'Neill PS, Villette V, Bradley J, Wang Z, Jiang D, Zhang G, Sheng M, Lu D, Boyden E, Delvendahl I, Golshani P, Wernig M, Feldman DE, Ji N, Ding J, Südhof TC, Clandinin TR, Lin MZ. A fast and responsive voltage indicator with enhanced sensitivity for unitary synaptic events. Neuron 2024:S0896-6273(24)00643-3. PMID: 39305894; DOI: 10.1016/j.neuron.2024.08.019
Abstract
A remaining challenge for genetically encoded voltage indicators (GEVIs) is the reliable detection of excitatory postsynaptic potentials (EPSPs). Here, we developed ASAP5 as a GEVI with enhanced activation kinetics and responsivity near resting membrane potentials for improved detection of both spiking and subthreshold activity. ASAP5 reported action potentials (APs) in vivo with higher signal-to-noise ratios than previous GEVIs and successfully detected graded and subthreshold responses to sensory stimuli in single two-photon trials. In cultured rat or human neurons, somatic ASAP5 reported synaptic events propagating centripetally and could detect ∼1-mV EPSPs. By imaging spontaneous EPSPs throughout dendrites, we found that EPSP amplitudes decay exponentially during propagation and that amplitude at the initiation site generally increases with distance from the soma. These results extend the applications of voltage imaging to the quantal response domain, including in human neurons, opening up the possibility of high-throughput, high-content characterization of neuronal dysfunction in disease.
Affiliation(s)
- Yukun A Hao
  - Department of Bioengineering, Stanford University, Stanford, CA 94305, USA; Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Sungmoo Lee
  - Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Richard H Roth
  - Department of Neurosurgery, Stanford University, Stanford, CA 94305, USA
- Silvia Natale
  - Department of Molecular & Cellular Physiology, Stanford University, Stanford, CA 94305, USA
- Laura Gomez
  - Department of Molecular and Cell Biology and Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA 94720, USA; Department of Physics, University of California Berkeley, CA 94720, USA
- Jiannis Taxidis
  - Department of Neurology, UCLA David Geffen School of Medicine, Los Angeles, CA 90095, USA
- Philipp S O'Neill
  - Department of Molecular Life Sciences, University of Zurich (UZH), 8057 Zurich, Switzerland; Neuroscience Center Zurich, 8057 Zurich, Switzerland
- Vincent Villette
  - Institut de Biologie de l'École Normale Supérieure (IBENS), CNRS, INSERM, PSL Research University, Paris 75005, France
- Jonathan Bradley
  - Institut de Biologie de l'École Normale Supérieure (IBENS), CNRS, INSERM, PSL Research University, Paris 75005, France
- Zeguan Wang
  - Departments of Brain and Cognitive Sciences, Media Arts and Sciences, and Biological Engineering, MIT, Cambridge, MA 02139, USA; McGovern Institute, MIT, Cambridge, MA 02139, USA
- Dongyun Jiang
  - Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Guofeng Zhang
  - Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Mengjun Sheng
  - Department of Neurosurgery, Stanford University, Stanford, CA 94305, USA
- Di Lu
  - Department of Neurosurgery, Stanford University, Stanford, CA 94305, USA
- Edward Boyden
  - Departments of Brain and Cognitive Sciences, Media Arts and Sciences, and Biological Engineering, MIT, Cambridge, MA 02139, USA; McGovern Institute, MIT, Cambridge, MA 02139, USA; Howard Hughes Medical Institute, Cambridge, MA 02139, USA
- Igor Delvendahl
  - Department of Molecular Life Sciences, University of Zurich (UZH), 8057 Zurich, Switzerland; Neuroscience Center Zurich, 8057 Zurich, Switzerland
- Peyman Golshani
  - Department of Neurology, UCLA David Geffen School of Medicine, Los Angeles, CA 90095, USA; Semel Institute for Neuroscience and Human Behavior, David Geffen School of Medicine, Los Angeles, CA 90095, USA
- Marius Wernig
  - Department of Pathology, Stanford University, Stanford, CA 94305, USA
- Daniel E Feldman
  - Department of Molecular and Cell Biology and Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA 94720, USA
- Na Ji
  - Department of Molecular and Cell Biology and Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA 94720, USA; Department of Physics, University of California Berkeley, CA 94720, USA
- Jun Ding
  - Department of Neurosurgery, Stanford University, Stanford, CA 94305, USA
- Thomas C Südhof
  - Department of Molecular & Cellular Physiology, Stanford University, Stanford, CA 94305, USA; Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
- Thomas R Clandinin
  - Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Michael Z Lin
  - Department of Bioengineering, Stanford University, Stanford, CA 94305, USA; Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
3
Ganguly I, Heckman EL, Litwin-Kumar A, Clowney EJ, Behnia R. Diversity of visual inputs to Kenyon cells of the Drosophila mushroom body. Nat Commun 2024; 15:5698. PMID: 38972924; PMCID: PMC11228034; DOI: 10.1038/s41467-024-49616-z
Abstract
The arthropod mushroom body is well-studied as an expansion layer representing olfactory stimuli and linking them to contingent events. However, 8% of mushroom body Kenyon cells in Drosophila melanogaster receive predominantly visual input, and their function remains unclear. Here, we identify inputs to visual Kenyon cells using the FlyWire adult whole-brain connectome. Input repertoires are similar across hemispheres and connectomes with certain inputs highly overrepresented. Many visual neurons presynaptic to Kenyon cells have large receptive fields, while interneuron inputs receive spatially restricted signals that may be tuned to specific visual features. Individual visual Kenyon cells randomly sample sparse inputs from combinations of visual channels, including multiple optic lobe neuropils. These connectivity patterns suggest that visual coding in the mushroom body, like olfactory coding, is sparse, distributed, and combinatorial. However, the specific input repertoire to the smaller population of visual Kenyon cells suggests a constrained encoding of visual stimuli.
Affiliation(s)
- Ishani Ganguly
  - Department of Neuroscience, Columbia University, New York, NY, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
  - Zuckerman Institute, Columbia University, New York, NY, USA
- Emily L Heckman
  - Department of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI, USA
- Ashok Litwin-Kumar
  - Department of Neuroscience, Columbia University, New York, NY, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
  - Zuckerman Institute, Columbia University, New York, NY, USA
- E Josephine Clowney
  - Department of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI, USA
  - Michigan Neuroscience Institute, University of Michigan, Ann Arbor, MI, USA
- Rudy Behnia
  - Department of Neuroscience, Columbia University, New York, NY, USA
  - Zuckerman Institute, Columbia University, New York, NY, USA
  - Kavli Institute for Brain Science, Columbia University, New York, NY, USA
4
Cowley BR, Calhoun AJ, Rangarajan N, Ireland E, Turner MH, Pillow JW, Murthy M. Mapping model units to visual neurons reveals population code for social behaviour. Nature 2024; 629:1100-1108. PMID: 38778103; PMCID: PMC11136655; DOI: 10.1038/s41586-024-07451-8
Abstract
The rich variety of behaviours observed in animals arises through the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to build models that predict not only neural responses to sensory input1-5 but also how each neuron causally contributes to behaviour6,7. Here we demonstrate a novel modelling approach to identify a one-to-one mapping between internal units in a deep neural network and real neurons by predicting the behavioural changes that arise from systematic perturbations of more than a dozen neuronal cell types. A key ingredient that we introduce is 'knockout training', which involves perturbing the network during training to match the perturbations of the real neurons during behavioural experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during a complex, visually guided social behaviour8-11. The visual projection neurons at the interface between the optic lobe and central brain form a set of discrete channels12, and prior work indicates that each channel encodes a specific visual feature to drive a particular behaviour13,14. Our model reaches a different conclusion: combinations of visual projection neurons, including those involved in non-social behaviours, drive male interactions with the female, forming a rich population code for behaviour. Overall, our framework consolidates behavioural effects elicited from various neural perturbations into a single, unified model, providing a map from stimulus to neuronal cell type to behaviour, and enabling future incorporation of wiring diagrams of the brain15 into the model.
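The 'knockout training' idea described above can be illustrated with a minimal sketch: during training, a random subset of hidden units is silenced (masked to zero), mirroring the experimental silencing of neuronal cell types, and the masked network is then fit to the behaviour recorded under the matching perturbation. This is an illustrative toy, not the authors' implementation; all names, sizes, and the random data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def knockout_mask(n_units, knockout_prob, rng):
    """Sample a binary mask that silences ('knocks out') a random
    subset of model units, mimicking experimental silencing of a
    neuronal cell type. (Illustrative helper, not from the paper.)"""
    return (rng.random(n_units) >= knockout_prob).astype(float)

def forward(x, w_in, w_out, mask):
    """One hidden layer; masked units output zero, as a silenced
    cell type would."""
    h = np.tanh(x @ w_in) * mask
    return h @ w_out

# Toy training step: fit the perturbed network to behaviour measured
# under the matching perturbation (random stand-in data).
n_in, n_hidden = 4, 8
w_in = rng.normal(size=(n_in, n_hidden))
w_out = rng.normal(size=(n_hidden, 1))
x = rng.normal(size=(16, n_in))   # stimulus features
y = rng.normal(size=(16, 1))      # behavioural readout

mask = knockout_mask(n_hidden, knockout_prob=0.25, rng=rng)
y_hat = forward(x, w_in, w_out, mask)
# Least-squares gradient step on the output weights of the masked model.
grad_w_out = ((np.tanh(x @ w_in) * mask).T @ (y_hat - y)) / len(x)
w_out -= 0.1 * grad_w_out
```

Repeating such steps while cycling masks over the cell types perturbed experimentally is, in spirit, what ties each model unit to a real neuron.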
Affiliation(s)
- Benjamin R Cowley
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
  - Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
- Adam J Calhoun
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Elise Ireland
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Maxwell H Turner
  - Department of Neurobiology, Stanford University, Stanford, CA, USA
- Jonathan W Pillow
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Mala Murthy
  - Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
5
Ramdya P. AI networks reveal how flies find a mate. Nature 2024; 629:1010-1011. PMID: 38778186; DOI: 10.1038/d41586-024-01320-0
6
Ganguly I, Heckman EL, Litwin-Kumar A, Clowney EJ, Behnia R. Diversity of visual inputs to Kenyon cells of the Drosophila mushroom body. bioRxiv 2023:2023.10.12.561793. PMID: 37873086; PMCID: PMC10592809; DOI: 10.1101/2023.10.12.561793
Abstract
The arthropod mushroom body is well-studied as an expansion layer that represents olfactory stimuli and links them to contingent events. However, 8% of mushroom body Kenyon cells in Drosophila melanogaster receive predominantly visual input, and their tuning and function are poorly understood. Here, we use the FlyWire adult whole-brain connectome to identify inputs to visual Kenyon cells. The types of visual neurons we identify are similar across hemispheres and connectomes with certain inputs highly overrepresented. Many visual projection neurons presynaptic to Kenyon cells receive input from large swathes of visual space, while local visual interneurons, providing smaller fractions of input, receive more spatially restricted signals that may be tuned to specific features of the visual scene. Like olfactory Kenyon cells, visual Kenyon cells receive sparse inputs from different combinations of visual channels, including inputs from multiple optic lobe neuropils. The sets of inputs to individual visual Kenyon cells are consistent with random sampling of available inputs. These connectivity patterns suggest that visual coding in the mushroom body, like olfactory coding, is sparse, distributed, and combinatorial. However, the expansion coding properties appear different, with a specific repertoire of visual inputs projecting onto a relatively small number of visual Kenyon cells.
Affiliation(s)
- Ishani Ganguly
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY 10027, USA
- Emily L Heckman
  - Department of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI 48109, USA
- Ashok Litwin-Kumar
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
  - Center for Theoretical Neuroscience, Columbia University, New York, NY 10027, USA
- E Josephine Clowney
  - Department of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI 48109, USA
  - Michigan Neuroscience Institute Affiliate
- Rudy Behnia
  - The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
7
Mano O, Choi M, Tanaka R, Creamer MS, Matos NCB, Shomar JW, Badwan BA, Clandinin TR, Clark DA. Long-timescale anti-directional rotation in Drosophila optomotor behavior. eLife 2023; 12:e86076. PMID: 37751469; PMCID: PMC10522332; DOI: 10.7554/elife.86076
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied Drosophila melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such 'anti-directional turning' is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano
  - Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
- Minseung Choi
  - Department of Neurobiology, Stanford University, Stanford, United States
- Ryosuke Tanaka
  - Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Matthew S Creamer
  - Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Natalia CB Matos
  - Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Joseph W Shomar
  - Department of Physics, Yale University, New Haven, United States
- Bara A Badwan
  - Department of Chemical Engineering, Yale University, New Haven, United States
- Damon A Clark
  - Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
  - Interdepartmental Neuroscience Program, Yale University, New Haven, United States
  - Department of Physics, Yale University, New Haven, United States
  - Department of Neuroscience, Yale University, New Haven, United States
8
Kumar S, Sharma AK, Tran A, Liu M, Leifer AM. Inhibitory feedback from the motor circuit gates mechanosensory processing in Caenorhabditis elegans. PLoS Biol 2023; 21:e3002280. PMID: 37733772; PMCID: PMC10617738; DOI: 10.1371/journal.pbio.3002280
Abstract
Animals must integrate sensory cues with their current behavioral context to generate a suitable response. How this integration occurs is poorly understood. Previously, we developed high-throughput methods to probe neural activity in populations of Caenorhabditis elegans and discovered that the animal's mechanosensory processing is rapidly modulated by the animal's locomotion. Specifically, we found that when the worm turns it suppresses its mechanosensory-evoked reversal response. Here, we report that C. elegans use inhibitory feedback from turning-associated neurons to provide this rapid modulation of mechanosensory processing. By performing high-throughput optogenetic perturbations triggered on behavior, we show that turning-associated neurons SAA, RIV, and/or SMB suppress mechanosensory-evoked reversals during turns. We find that activation of the gentle-touch mechanosensory neurons or of any of the interneurons AIZ, RIM, AIB, and AVE during a turn is less likely to evoke a reversal than activation during forward movement. Inhibiting neurons SAA, RIV, and SMB during a turn restores the likelihood with which mechanosensory activation evokes reversals. Separately, activation of premotor interneuron AVA evokes reversals regardless of whether the animal is turning or moving forward. We therefore propose that inhibitory signals from SAA, RIV, and/or SMB gate mechanosensory signals upstream of neuron AVA. We conclude that C. elegans rely on inhibitory feedback from the motor circuit to modulate its response to sensory stimuli on fast timescales. This need for motor signals in sensory processing may explain the ubiquity in many organisms of motor-related neural activity patterns seen across the brain, including in sensory processing areas.
Affiliation(s)
- Sandeep Kumar
  - Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, United States of America
- Anuj K. Sharma
  - Department of Physics, Princeton University, Princeton, New Jersey, United States of America
- Andrew Tran
  - Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, United States of America
- Mochi Liu
  - Department of Physics, Princeton University, Princeton, New Jersey, United States of America
- Andrew M. Leifer
  - Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, United States of America
  - Department of Physics, Princeton University, Princeton, New Jersey, United States of America
9
Tsuji M, Nishizuka Y, Emoto K. Threat gates visual aversion via theta activity in Tachykinergic neurons. Nat Commun 2023; 14:3987. PMID: 37443364; PMCID: PMC10345120; DOI: 10.1038/s41467-023-39667-z
Abstract
Animals must adapt sensory responses to an ever-changing environment for survival. Such sensory modulation is especially critical in a threatening situation, in which animals often promote aversive responses to, among others, visual stimuli. Recently, threatened Drosophila has been shown to exhibit a defensive internal state. Whether and how threatened Drosophila promotes visual aversion, however, remains elusive. Here we report that mechanical threats to Drosophila transiently gate aversion from an otherwise neutral visual object. We further identified the neuropeptide tachykinin, and a single cluster of neurons expressing it ("Tk-GAL42 ∩ Vglut neurons"), that are responsible for gating visual aversion. Calcium imaging analysis revealed that mechanical threats are encoded in Tk-GAL42 ∩ Vglut neurons as elevated activity. Remarkably, we also discovered that a visual object is encoded in Tk-GAL42 ∩ Vglut neurons as θ oscillation, which is causally linked to visual aversion. Our data reveal how a single cluster of neurons adapt organismal sensory response to a threatening situation through a neuropeptide and a combination of rate/temporal coding schemes.
Affiliation(s)
- Masato Tsuji
  - Department of Biological Sciences, Graduate School of Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
- Yuto Nishizuka
  - Department of Biological Sciences, Graduate School of Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
- Kazuo Emoto
  - Department of Biological Sciences, Graduate School of Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
  - International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
10
Currier TA, Pang MM, Clandinin TR. Visual processing in the fly, from photoreceptors to behavior. Genetics 2023; 224:iyad064. PMID: 37128740; PMCID: PMC10213501; DOI: 10.1093/genetics/iyad064
Abstract
Originally a genetic model organism, the experimental use of Drosophila melanogaster has grown to include quantitative behavioral analyses, sophisticated perturbations of neuronal function, and detailed sensory physiology. A highlight of these developments can be seen in the context of vision, where pioneering studies have uncovered fundamental and generalizable principles of sensory processing. Here we begin with an overview of vision-guided behaviors and common methods for probing visual circuits. We then outline the anatomy and physiology of brain regions involved in visual processing, beginning at the sensory periphery and ending with descending motor control. Areas of focus include contrast and motion detection in the optic lobe, circuits for visual feature selectivity, computations in support of spatial navigation, and contextual associative learning. Finally, we look to the future of fly visual neuroscience and discuss promising topics for further study.
Affiliation(s)
- Timothy A Currier
  - Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Michelle M Pang
  - Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Thomas R Clandinin
  - Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
11
Prinz R. Nothing in evolution makes sense except in the light of code biology. Biosystems 2023; 229:104907. PMID: 37207840; DOI: 10.1016/j.biosystems.2023.104907
Abstract
This article highlights the potential contribution of biological codes to the course and dynamics of evolution. The concept of organic codes, developed by Marcello Barbieri, has fundamentally changed our view of how living systems function. The notion that molecular interactions are built on adaptors that arbitrarily link molecules from different "worlds" in a conventional, i.e., rule-based, way departs significantly from the law-based constraints imposed on living things by physical and chemical mechanisms. In other words, living things follow rules while non-living things follow laws, but this important distinction is rarely considered in current evolutionary theory. The many known codes allow quantification of the codes that relate to a cell, or comparisons between different biological systems, and may pave the way to a quantitative and empirical research agenda in code biology. A starting point for such an endeavour is the introduction of a simple dichotomous classification of structural and regulatory codes. This classification can be used as a tool to analyse and quantify key organising principles of the living world, such as modularity, hierarchy, and robustness, based on organic codes. The implications for evolutionary research relate to the unique dynamics of codes, or 'Eigendynamics' (self-momentum), and how they determine the behaviour of biological systems from within, whereas physical constraints are imposed mainly from without. A speculation on the drivers of macroevolution in light of codes is followed by the conclusion that a meaningful and comprehensive understanding of evolution depends on including codes in the equation of life.
12
Turner MH, Krieger A, Pang MM, Clandinin TR. Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila. eLife 2022; 11:e82587. PMID: 36300621; PMCID: PMC9651947; DOI: 10.7554/elife.82587
Abstract
Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here, we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement, a motor-related signal, and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.
Affiliation(s)
- Maxwell H Turner
  - Department of Neurobiology, Stanford University, Stanford, United States
- Avery Krieger
  - Department of Neurobiology, Stanford University, Stanford, United States
- Michelle M Pang
  - Department of Neurobiology, Stanford University, Stanford, United States