1
Cellini B, Ferrero M, Mongeau JM. Drosophila flying in augmented reality reveals the vision-based control autonomy of the optomotor response. Curr Biol 2024; 34:68-78.e4. PMID: 38113890; DOI: 10.1016/j.cub.2023.11.045.
Abstract
For walking, swimming, and flying animals, the optomotor response is essential to stabilize gaze. How flexible is the optomotor response? Classic work in Drosophila has argued that flies adapt flight control under augmented visual feedback conditions during goal-directed bar fixation. However, whether the lower-level, reflexive optomotor response can similarly adapt to augmented visual feedback (partially autonomous) or not (autonomous) over long timescales is poorly understood. To address this question, we developed an augmented reality paradigm to study the vision-based control autonomy of the yaw optomotor response of flying fruit flies (Drosophila). Flies were placed in a flight simulator, which permitted free body rotation about the yaw axis. By feeding back body movements in real time to a visual display, we augmented and inverted visual feedback. Thus, this experimental paradigm caused a constant visual error between expected and actual visual feedback to study potential adaptive visuomotor control. By combining experiments with control theory, we demonstrate that the optomotor response is autonomous during augmented reality flight bouts of up to 30 min, which exceeds the reported learning epoch during bar fixation. Agreement between predictions from linear systems theory and experimental data supports the notion that the optomotor response is approximately linear and time invariant within our experimental assay. Even under positive visual feedback, which revealed the stability limit of flies in augmented reality, the optomotor response was autonomous. Our results support a hierarchical motor control architecture in flies with fast and autonomous reflexes at the bottom and more flexible behavior at higher levels.
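The feedback manipulation described in this abstract can be illustrated with a minimal closed-loop sketch (a hypothetical toy model for intuition only; the reflex gain K, time constant tau, and visual feedback gain g are invented here, not taken from the paper):

```python
def simulate_loop(g, K=2.0, tau=0.05, w_ext=1.0, dt=0.001, T=2.0):
    """First-order yaw stabilization loop under a visual feedback gain g.

    retinal slip: e = w_ext - g * v   (g = 1: natural closed loop,
                                       g = -1: inverted feedback)
    turning response: tau * dv/dt = -v + K * e
    The closed-loop pole sits at -(1 + K * g) / tau, so the loop is
    unstable (positive feedback) whenever g < -1 / K.
    """
    v, vs = 0.0, []
    for _ in range(int(T / dt)):
        slip = w_ext - g * v              # visual error seen by the fly
        v += dt / tau * (-v + K * slip)   # forward-Euler integration
        vs.append(v)
    return vs

natural = simulate_loop(g=1.0)    # stable pole: settles at K*w_ext/(1 + K*g)
inverted = simulate_loop(g=-1.0)  # unstable pole: response diverges
```

Scaling g augments the feedback and flipping its sign inverts it; once the loop gain is inverted far enough, the same stabilizing reflex drives the system unstable, which is the positive-feedback stability limit the abstract refers to.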
Affiliation(s)
- Benjamin Cellini
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA; Department of Mechanical Engineering, University of Nevada, Reno, NV 89557, USA.
- Marioalberto Ferrero
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA.
- Jean-Michel Mongeau
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA.
2
Davis BA, Mongeau JM. The influence of saccades on yaw gaze stabilization in fly flight. PLoS Comput Biol 2023; 19:e1011746. PMID: 38127819; PMCID: PMC10769041; DOI: 10.1371/journal.pcbi.1011746.
Abstract
In a way analogous to human vision, the fruit fly D. melanogaster and many other flying insects generate smooth and saccadic movements to stabilize and shift their gaze in flight, respectively. It has been hypothesized that this combination of continuous and discrete movements benefits both flight stability and performance, particularly at high frequencies or speeds. Here we develop a hybrid control system model to explore the effects of saccades on the yaw stabilization reflex of D. melanogaster. Inspired by experimental data, the model includes a first-order plant, a proportional-integral (PI) continuous controller, and a saccadic reset system that fires based on the integrated error of the continuous controller. We explore the gain, delay, and switching-threshold parameter space to quantify the optimal regions for yaw stability and performance. We show that the addition of saccades to a continuous controller provides benefits to both stability and performance across a range of frequencies. Our model suggests that Drosophila operates near its optimal switching threshold for its experimental gain set. We also show that, based on experimental data, D. melanogaster operates in a region that trades off performance and stability. This trade-off increases flight robustness to compensate for internal perturbations such as wing damage.
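The model structure described above (first-order plant, PI controller, saccade triggered by integrated error) can be sketched in a few lines. This is a hypothetical illustration with invented parameter values (kp, ki, tau, thresh), not the authors' fitted model:

```python
def hybrid_gaze(ref, dt=0.001, kp=3.0, ki=8.0, tau=0.05, thresh=0.1):
    """Toy hybrid gaze controller: a PI loop produces smooth movement, and a
    discrete 'saccade' snaps gaze to the reference (clearing the integrator)
    whenever the integrated error crosses a threshold."""
    gaze, v, ierr = 0.0, 0.0, 0.0
    trace, saccades = [], 0
    for r in ref:
        err = r - gaze
        ierr += err * dt
        if abs(ierr) > thresh:              # saccadic reset
            gaze, v, ierr, err = r, 0.0, 0.0, 0.0
            saccades += 1
        u = kp * err + ki * ierr            # PI command
        v += dt / tau * (-v + u)            # first-order yaw plant
        gaze += v * dt
        trace.append(gaze)
    return trace, saccades

trace, n_sacc = hybrid_gaze([2.0] * 3000)   # 2 rad step held for 3 s
```

For a step in reference heading, the integrator winds up until a saccade fires, after which gaze holds the reference; varying thresh trades smooth tracking against saccade frequency, which is the kind of parameter space the paper explores.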
Affiliation(s)
- Brock A. Davis
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, Pennsylvania, United States of America
- Jean-Michel Mongeau
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, Pennsylvania, United States of America
3
Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self-movement estimation. Curr Biol 2023; 33:4960-4979.e7. PMID: 37918398; PMCID: PMC10848174; DOI: 10.1016/j.cub.2023.10.011.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can be spuriously triggered by visual motion created by objects moving in the world. Here, we show that stationary patterns on the retina, which constitute evidence against observer rotation, suppress inappropriate stabilizing rotational behavior in the fruit fly Drosophila. In silico experiments show that artificial neural networks (ANNs) that are optimized to distinguish observer movement from external object motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's local motion and optic-flow detectors. Our results show how the fly brain incorporates negative evidence to improve heading stability, exemplifying how a compact brain exploits geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Baohua Zhou
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C B Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA.
4
Rimniceanu M, Currea JP, Frye MA. Proprioception gates visual object fixation in flying flies. Curr Biol 2023; 33:1459-1471.e3. PMID: 37001520; PMCID: PMC10133043; DOI: 10.1016/j.cub.2023.03.018.
Abstract
Visual object tracking in animals as diverse as felines, frogs, and fish supports behaviors including predation, predator avoidance, and landscape navigation. Decades of experimental results show that a rigidly body-fixed tethered fly in a "virtual reality" visual flight simulator steers to follow the motion of a vertical bar, thereby "fixating" it on visual midline. This behavior likely reflects a desire to seek natural features such as plant stalks and has inspired algorithms for visual object tracking predicated on robust responses to bar velocity, particularly near visual midline. Using a modified flight simulator equipped with a magnetic pivot to allow frictionless turns about the yaw axis, we discovered that bar fixation as well as smooth steering responses to bar velocity are attenuated or eliminated in yaw-free conditions. Body-fixed Drosophila melanogaster respond to bar oscillation on a stationary ground with frequency-matched wing kinematics and fixate the bar on midline. Yaw-free flies respond to the same stimulus by ignoring the bar and maintaining their original heading. These differences are driven by proprioceptive signals, rather than visual signals, as artificially "clamping" a bar in the periphery of a yaw-free fly has no effect. When presented with a bar and ground oscillating at different frequencies, a yaw-free fly follows the frequency of the ground only, whereas a body-fixed fly robustly steers at the frequencies of both the bar and ground. Our findings support a model in which proprioceptive feedback promotes active damping of high-gain optomotor responses to object motion.
Affiliation(s)
- Martha Rimniceanu
- Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA 90095, USA
- John P Currea
- Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA 90095, USA
- Mark A Frye
- Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA 90095, USA.
5
Turner MH, Krieger A, Pang MM, Clandinin TR. Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila. eLife 2022; 11:e82587. PMID: 36300621; PMCID: PMC9651947; DOI: 10.7554/elife.82587.
Abstract
Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here, we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement: a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.
Affiliation(s)
- Maxwell H Turner
- Department of Neurobiology, Stanford University, Stanford, United States
- Avery Krieger
- Department of Neurobiology, Stanford University, Stanford, United States
- Michelle M Pang
- Department of Neurobiology, Stanford University, Stanford, United States
6
Leibbrandt R, Nicholas S, Nordström K. The impulse response of optic flow-sensitive descending neurons to roll m-sequences. J Exp Biol 2021; 224:jeb242833. PMID: 34870706; PMCID: PMC8714074; DOI: 10.1242/jeb.242833.
Abstract
When animals move through the world, their own movements generate widefield optic flow across their eyes. In insects, such widefield motion is encoded by optic lobe neurons. These lobula plate tangential cells (LPTCs) synapse with optic flow-sensitive descending neurons, which in turn project to areas that control neck, wing and leg movements. As the descending neurons play a role in sensorimotor transformation, it is important to understand their spatio-temporal response properties. Recent work shows that a relatively fast and efficient way to quantify such response properties is to use m-sequences or other white noise techniques. Therefore, here we used m-sequences to quantify the impulse responses of optic flow-sensitive descending neurons in male Eristalis tenax hoverflies. We focused on roll impulse responses as hoverflies perform exquisite head roll stabilizing reflexes, and the descending neurons respond particularly well to roll. We found that the roll impulse responses were fast, peaking after 16.5–18.0 ms. This is similar to the impulse response time to peak (18.3 ms) to widefield horizontal motion recorded in hoverfly LPTCs. We found that the roll impulse response amplitude scaled with the size of the stimulus impulse, and that its shape could be affected by the addition of constant velocity roll or lift. For example, the roll impulse response became faster and stronger with the addition of excitatory stimuli, and vice versa. We also found that the roll impulse response had a long return to baseline, which was significantly and substantially reduced by the addition of either roll or lift.
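For readers unfamiliar with the stimulus class, an m-sequence is a maximal-length pseudorandom binary sequence generated by a linear feedback shift register; its impulse-like circular autocorrelation is what lets cross-correlation recover an impulse response. A minimal sketch (illustrative only; the tap choice is a standard primitive polynomial, not the stimulus used in the paper):

```python
def m_sequence(taps, n):
    """Maximal-length (m-)sequence from an n-bit Fibonacci LFSR whose
    feedback taps (1-indexed) come from a primitive polynomial.
    Returns one full period of +/-1 values, length 2**n - 1."""
    state = [1] * n                     # any nonzero seed works
    seq = []
    for _ in range(2 ** n - 1):
        out = state[-1]                 # bit shifted out this step
        fb = 0
        for t in taps:
            fb ^= state[t - 1]          # XOR of tapped bits
        state = [fb] + state[:-1]       # shift feedback bit in
        seq.append(1 if out else -1)
    return seq

# x^5 + x^3 + 1 is primitive over GF(2), so taps [5, 3] give period 31
seq = m_sequence([5, 3], 5)
```

One period of a ±1 m-sequence has circular autocorrelation N at zero lag and exactly -1 at every other lag, so for an approximately linear system the stimulus-response cross-correlation approximates the impulse response.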
Affiliation(s)
- Richard Leibbrandt
- Neuroscience, Flinders Health and Medical Research Institute, Flinders University, GPO Box 2100, 5001 Adelaide, SA, Australia
- Sarah Nicholas
- Neuroscience, Flinders Health and Medical Research Institute, Flinders University, GPO Box 2100, 5001 Adelaide, SA, Australia
- Karin Nordström
- Neuroscience, Flinders Health and Medical Research Institute, Flinders University, GPO Box 2100, 5001 Adelaide, SA, Australia; Department of Neuroscience, Uppsala University, Box 593, 751 24 Uppsala, Sweden
7
Cellini B, Salem W, Mongeau JM. Mechanisms of punctuated vision in fly flight. Curr Biol 2021; 31:4009-4024.e3. PMID: 34329590; DOI: 10.1016/j.cub.2021.06.080.
Abstract
To guide locomotion, animals control gaze via movements of their eyes, head, and/or body, but how the nervous system controls gaze during complex motor tasks remains elusive. In many animals, shifts in gaze consist of periods of smooth movement punctuated by rapid eye saccades. Notably, eye movements are constrained by anatomical limits, which requires resetting eye position. By studying tethered, flying fruit flies (Drosophila), we show that flies perform stereotyped head saccades to reset gaze, analogous to optokinetic nystagmus in primates. Head-reset saccades interrupted head smooth movement for as little as 50 ms (less than 5% of the total flight time), thereby enabling punctuated gaze stabilization. By revealing the passive mechanics of the neck joint, we show that head-reset saccades leverage the neck's natural elastic recoil, enabling mechanically assisted redirection of gaze. The consistent head orientation at saccade initiation, the influence of the head's angular position on saccade rate, the decrease in wing saccade frequency in head-fixed flies, and the decrease in head-reset saccade rate in flies with their head range of motion restricted together implicate proprioception as the primary trigger of head-reset saccades. Wing-reset saccades were influenced by head orientation, establishing a causal link between neck sensory signals and the execution of body saccades. Head-reset saccades were abolished when flies switched to a landing state, demonstrating that head movements are gated by behavioral state. We propose a control architecture for active vision systems with limits in sensor range of motion.
Affiliation(s)
- Benjamin Cellini
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA
- Wael Salem
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA
- Jean-Michel Mongeau
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA 16802, USA.
8
Abstract
Multisensory integration is synergistic: input from one sensory modality might modulate the behavioural response to another. Work in flies has shown that a small visual object presented in the periphery elicits innate aversive steering responses in flight, likely representing an approaching threat. Object aversion is switched to approach when paired with a plume of food odour. The 'open-loop' design of prior work facilitated the observation of changing valence. How does odour influence visual object responses when an animal has naturally active control over its visual experience? In this study, we use closed-loop feedback conditions, in which a fly's steering effort is coupled to the angular velocity of the visual stimulus, to confirm that flies steer toward or 'fixate' a long vertical stripe on the visual midline. They tend either to steer away from or 'antifixate' a small object, or to disengage active visual control, which manifests as uncontrolled object 'spinning' within this experimental paradigm. Adding a plume of apple cider vinegar decreases the probability of both antifixation and spinning, while increasing the probability of frontal fixation for objects of any size, including a typically aversive small object.
Affiliation(s)
- Karen Y Cheng
- UCLA Department of Integrative Biology and Physiology, Los Angeles, CA, USA
- Mark A Frye
- UCLA Department of Integrative Biology and Physiology, Los Angeles, CA, USA
9
Li L, Zhang Z, Lu J. Artificial fly visual joint perception neural network inspired by multiple-regional collision detection. Neural Netw 2020; 135:13-28. PMID: 33338802; DOI: 10.1016/j.neunet.2020.11.018.
Abstract
The biological visual system includes multiple types of motion-sensitive neurons that preferentially respond to specific perceptual regions. However, it remains an open question how to leverage such neurons to construct bio-inspired computational models for multiple-regional collision detection. To fill this gap, this work proposes a visual joint perception neural network with two subnetworks (presynaptic and postsynaptic neural networks), inspired by the preferential perception characteristics of three horizontal and vertical motion-sensitive neurons. Building on the neural network and three hazard detection mechanisms, an artificial fly visual synthesized collision detection model for multiple-regional collision detection is developed to monitor possible danger in cases where one or more moving objects appear in the whole field of view. The experiments support two conclusions: (i) the acquired neural network can effectively display the characteristics of visual movement, and (ii) the collision detection model, which outperforms the compared models, can effectively perform multiple-regional collision detection at a high success rate, taking only about 0.24 s to complete collision detection for each virtual or actual image frame with resolution 110×60.
Affiliation(s)
- Lun Li
- College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China
- Zhuhong Zhang
- College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China.
- Jiaxuan Lu
- College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China
10
Cellini B, Mongeau JM. Hybrid visual control in fly flight: insights into gaze shift via saccades. Curr Opin Insect Sci 2020; 42:23-31. PMID: 32896628; DOI: 10.1016/j.cois.2020.08.009.
Abstract
Flies fly by alternating between periods of fixation and body saccades, analogous to how our own eyes move. Gaze fixation via smooth movement in fly flight has been studied extensively, but comparatively less is known about the mechanism by which flies trigger and control body saccades to shift their gaze. Why do flies implement a hybrid fixate-and-saccade locomotion strategy? Here we review recent developments that provide new insights into this question. We focus on the interplay between smooth movement and saccades, the trigger classes of saccades, and the timeline of saccade execution. We emphasize recent mechanistic advances in Drosophila, where genetic tools have enabled cellular circuit analysis at an unprecedented level in a flying insect. In addition, we review trade-offs in behavioral paradigms used to study saccades. Throughout we highlight exciting avenues for future research in the control of fly flight.
Affiliation(s)
- Benjamin Cellini
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA, 16801, USA
- Jean-Michel Mongeau
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, PA, 16801, USA.
11
Salem W, Cellini B, Frye MA, Mongeau JM. Fly eyes are not still: a motion illusion in Drosophila flight supports parallel visual processing. J Exp Biol 2020; 223:jeb212316. PMID: 32321749; PMCID: PMC7272343; DOI: 10.1242/jeb.212316.
Abstract
Most animals shift gaze by a 'fixate and saccade' strategy, where the fixation phase stabilizes background motion. A logical prerequisite for robust detection and tracking of moving foreground objects, therefore, is to suppress the perception of background motion. In a virtual reality magnetic tether system enabling free yaw movement, Drosophila implemented a fixate and saccade strategy in the presence of a static panorama. When the spatial wavelength of a vertical grating was below the Nyquist wavelength of the compound eyes, flies drifted continuously and gaze could not be maintained at a single location. Because the drift occurs in response to a motionless stimulus, any perceived motion is generated by the fly itself; the drift is therefore illusory, driven by perceptual aliasing. Notably, the drift speed was significantly faster than under a uniform panorama, suggesting perceptual enhancement as a result of aliasing. Under the same visual conditions in a rigid-tether paradigm, wing steering responses to the unresolvable static panorama were not distinguishable from those to a resolvable static pattern, suggesting visual aliasing is induced by ego motion. We hypothesized that obstructing the control of gaze fixation also disrupts detection and tracking of objects. Using the illusory motion stimulus, we show that magnetically tethered Drosophila track objects robustly in flight even when gaze is not fixated as flies continuously drift. Taken together, our study provides further support for parallel visual motion processing and reveals the critical influence of body motion on visuomotor processing. Motion illusions can reveal important shared principles of information processing across taxa.
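The aliasing argument above reduces to classic frequency folding: gratings finer than twice the ommatidial spacing are reported at a folded, sign-flipped spatial frequency. A hedged sketch (the 4.5 degree interommatidial angle is an assumed round number for Drosophila, not a value from the paper):

```python
def aliased_frequency(f_signal, f_sample):
    """Fold a spatial frequency into the baseband [-fs/2, fs/2):
    the frequency an undersampling photoreceptor array actually reports.
    A negative result means the grating appears phase-reversed."""
    return (f_signal + f_sample / 2) % f_sample - f_sample / 2

dphi = 4.5                        # interommatidial angle, degrees (assumed)
f_s = 1 / dphi                    # ommatidial sampling rate, cycles/degree
nyquist_wavelength = 2 * dphi     # 9 degrees: finest resolvable grating

resolvable = aliased_frequency(1 / 20, f_s)   # 20 deg grating: unchanged
aliased = aliased_frequency(1 / 6, f_s)       # 6 deg grating: folds over
```

Under these assumptions a 6 degree grating, finer than the 9 degree Nyquist wavelength, is reported as an 18 degree grating with inverted sign, i.e. apparent motion in the wrong direction once the fly moves.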
Affiliation(s)
- Wael Salem
- Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, USA
- Benjamin Cellini
- Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, USA
- Mark A Frye
- Department of Integrative Biology and Physiology, University of California - Los Angeles, Los Angeles, CA 90095-7239, USA
- Jean-Michel Mongeau
- Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, USA
12
Mathejczyk TF, Wernet MF. Modular assays for the quantitative study of visually guided navigation in both flying and walking flies. J Neurosci Methods 2020; 340:108747. PMID: 32339523; DOI: 10.1016/j.jneumeth.2020.108747.
Abstract
BACKGROUND: The quantitative study of behavioral responses to visual stimuli provides crucial information about the computations executed by neural circuits. Insects have long served as powerful model systems, either when walking on air-suspended balls (spherical treadmill) or flying while glued to a needle (virtual flight arena). NEW METHOD: Here we present detailed instructions for 3D-printing and assembly of arenas optimized for visually guided navigation, including code for presenting both celestial and panorama cues. These modular arenas can be used either as virtual flight arenas or as spherical treadmills, and consist entirely of commercial and 3D-printed components placed in a temperature- and humidity-controlled environment. COMPARISON WITH EXISTING METHODS: Previous assays often combine cost-intensive and technically complex custom-built mechanical, electronic, and software components. Implementation amounts to a major challenge when working in an academic environment without the support of a professional machine shop. RESULTS: Robust optomotor responses are induced in flying Drosophila by displaying moving stripes in a cylinder surrounding the magnetically tethered fly. Similarly, changes in flight heading are induced by presenting changes in the orientation of linearly polarized UV light from above. Finally, responses to moving patterns are induced when individual flies walk on an air-suspended ball. CONCLUSION: These modular assays allow for the investigation of a diverse combination of navigational cues (sky and panorama) in both flying and walking flies. They can be used for the molecular dissection of neural circuitry in Drosophila and can easily be rescaled to accommodate other insects.
Affiliation(s)
- Thomas F Mathejczyk
- Freie Universität Berlin, Fachbereich Biologie, Chemie und Pharmazie, Institut für Biologie - Neurobiologie, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Mathias F Wernet
- Freie Universität Berlin, Fachbereich Biologie, Chemie und Pharmazie, Institut für Biologie - Neurobiologie, Königin-Luise Strasse 1-3, 14195 Berlin, Germany.
13
Ji X, Yuan D, Wei H, Cheng Y, Wang X, Yang J, Hu P, Gestrich JY, Liu L, Zhu Y. Differentiation of Theta Visual Motion from Fourier Motion Requires LC16 and R18C12 Neurons in Drosophila. iScience 2020; 23:101041. PMID: 32325414; PMCID: PMC7176990; DOI: 10.1016/j.isci.2020.101041.
Abstract
Many animals perceive features of higher-order visual motion that are beyond the spatiotemporal correlations of luminance defined in first-order motion. Although the neural mechanisms of first-order motion detection have become understood in recent years, those underlying higher-order motion perception remain unclear. Here, we established a paradigm to assess the detection of theta motion (a type of higher-order motion) in freely walking Drosophila. Behavioral screening using this paradigm identified two clusters of neurons in the central brain, designated as R18C12, which were required for perception of theta motion but not for first-order motion. Furthermore, theta motion-activated R18C12 neurons were structurally and functionally located downstream of visual projection neurons in lobula, lobula columnar cells LC16, which activated R18C12 neurons via interactions of acetylcholine (ACh) and muscarinic acetylcholine receptors (mAChRs). The current study provides new insights into LC neurons and the neuronal mechanisms underlying visual information processing in complex natural scenes.
Highlights:
- Perception of theta motion requires LC16 and R18C12 neurons
- R18C12 neurons are activated by theta motion
- R18C12 neurons form synaptic connections with LC16 neurons
- LC16 neurons activate R18C12 neurons through ACh acting on mAChR
Affiliation(s)
- Xiaoxiao Ji
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
- Deliang Yuan
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
- Hongying Wei
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
- Yaxin Cheng
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
| | - Xinwei Wang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
| | - Jihua Yang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
| | - Pengbo Hu
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
| | - Julia Yvonne Gestrich
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China
| | - Li Liu
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China; CAS Key Laboratory of Mental Health, Beijing 100101, P. R. China.
| | - Yan Zhu
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, 15 Datun Road, Chaoyang District, Beijing 100101, P. R. China; College of Life Sciences, University of the Chinese Academy of Sciences, Beijing 100049, P. R. China.
| |
14
Second-order cues to figure motion enable object detection during prey capture by praying mantises. Proc Natl Acad Sci U S A 2019; 116:27018-27027. [PMID: 31818943 DOI: 10.1073/pnas.1912310116] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/26/2023] Open
Abstract
Detecting motion is essential for animals to perform a wide variety of functions. To do so, animals can exploit motion cues, including both first-order cues, such as luminance correlation over time, and second-order cues, obtained by correlating higher-order visual statistics. Since first-order motion cues are typically sufficient for motion detection, it is unclear why sensitivity to second-order motion has evolved in animals, including insects. Here, we investigate the role of second-order motion in prey capture by praying mantises. We show that prey detection uses second-order motion cues to detect figure motion. We further present a model of prey detection based on second-order motion sensitivity, resulting from a layer of position detectors feeding into a second layer of elementary-motion detectors. Mantis stereopsis, in contrast, does not require figure motion and is explained by a simpler model that uses only the first layer in both eyes. Second-order motion cues thus enable prey motion to be detected even when the prey perfectly matches the average background luminance and independently of the elementary motion of any of its parts. Following prey detection, processes such as stereopsis can then determine the distance to the prey. We thus demonstrate how second-order motion mechanisms enable ecologically relevant behavior, such as detecting camouflaged targets, and supply input to other visual functions including stereopsis and target tracking.
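A common way to model this kind of second-order sensitivity, broadly in the spirit of the two-layer scheme proposed here, is to full-wave rectify local luminance change (a stand-in for the position-detector layer) and feed the result to a correlation-type motion detector. The sketch below, including all parameter choices and the drifting-flicker stimulus, is our own illustration, not the authors' fitted model:

```python
import numpy as np

def second_order_motion_response(frames, tau=4.0):
    """Two-layer second-order motion detector (illustrative):
    layer 1 rectifies the temporal luminance derivative at each pixel;
    layer 2 is a Reichardt-style correlator applied to that change
    signal, so motion of a flicker-defined region is detected even
    when the region carries no net first-order motion."""
    change = np.abs(np.diff(frames, axis=0))          # layer 1: |dL/dt|
    alpha = 1.0 / (tau + 1.0)
    delayed = np.zeros_like(change)
    for t in range(1, change.shape[0]):               # low-pass filter = delay arm
        delayed[t] = delayed[t - 1] + alpha * (change[t] - delayed[t - 1])
    # layer 2: correlate delayed and undelayed change at neighbouring pixels
    return (delayed[:, :-1] * change[:, 1:] - change[:, :-1] * delayed[:, 1:]).mean()

# Flicker-defined window drifting rightward over a static textured background:
# no coherent first-order motion, only second-order (contrast-modulation) motion.
rng = np.random.default_rng(0)
T, W, win = 2400, 60, 8
frames = np.tile(rng.uniform(-1.0, 1.0, W), (T, 1))   # static background texture
for t in range(T):
    x = t % (W - win)                                 # window drifts rightward
    frames[t, x:x + win] = rng.uniform(-1.0, 1.0, win)  # flickering interior
r = second_order_motion_response(frames)
```

By construction the detector's sign follows the drift direction of the flicker-defined window; mirroring the stimulus flips the sign exactly.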
15
Shoemaker PA. Neural Network Model for Detection of Edges Defined by Image Dynamics. Front Comput Neurosci 2019; 13:76. [PMID: 31787888 PMCID: PMC6854273 DOI: 10.3389/fncom.2019.00076] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2019] [Accepted: 10/14/2019] [Indexed: 11/24/2022] Open
Abstract
Insects can detect the presence of discrete objects in their visual fields based on a range of differences in spatiotemporal characteristics between the images of object and background. This includes but is not limited to relative motion. Evidence suggests that edge detection is an integral part of this capability, and this study examines the ability of a bio-inspired processing model to detect the presence of boundaries between two regions of a one-dimensional visual field, based on general differences in image dynamics. The model consists of two parts. The first is an early vision module inspired by insect visual processing, which implements adaptive photoreception, ON and OFF channels with transient and sustained characteristics, and delayed and undelayed signal paths. This is replicated for a number of photoreceptors in a small linear array. It is followed by an artificial neural network trained to discriminate the presence vs. absence of an edge based on the array output signals. Input data are derived from natural imagery and feature both static and moving edges between regions with moving texture, flickering texture, and static patterns in all possible combinations. The model can discriminate the presence of edges, stationary or moving, at rates far higher than chance. The resources required (numbers of neurons and visual signals) are realistic relative to those available in the insect second optic ganglion, where the bulk of such processing would be likely to take place.
Affiliation(s)
- Patrick A Shoemaker
- Computational Science Research Center, San Diego State University, San Diego, CA, United States
16
Olfactory and Neuromodulatory Signals Reverse Visual Object Avoidance to Approach in Drosophila. Curr Biol 2019; 29:2058-2065.e2. [PMID: 31155354 DOI: 10.1016/j.cub.2019.05.010] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2018] [Revised: 04/01/2019] [Accepted: 05/01/2019] [Indexed: 12/15/2022]
Abstract
Behavioral reactions of animals to environmental sensory stimuli are sometimes reflexive and stereotyped but can also vary depending on contextual conditions. Engaging in active foraging or flight provokes a reversal in the valence of carbon dioxide responses from aversion to approach in Drosophila [1, 2], whereas mosquitoes encountering this same chemical cue show enhanced approach toward a small visual object [3]. Sensory plasticity in insects has been broadly attributed to the action of biogenic amines, which modulate behaviors such as olfactory learning, aggression, feeding, and egg laying [4-14]. Octopamine acts rapidly upon the onset of flight to modulate the response gain of directionally selective motion-detecting neurons in Drosophila [15]. How the action of biogenic amines might couple sensory modalities to each other or to locomotive states remains poorly understood. Here, we use a visual flight simulator [16] equipped for odor delivery [17] to confirm that flies avoid a small contrasting visual object in odorless air [18] but that the same animals reverse their preference to approach in the presence of attractive food odor. An aversive odor does not reverse object aversion. Optogenetic activation of either octopaminergic neurons or directionally selective motion-detecting neurons that express octopamine receptors elicits visual valence reversal in the absence of odor. Our results suggest a parsimonious model in which odor-activated octopamine release excites the motion detection pathway to increase the saliency of either a small object or a bar, eliciting tracking responses by both visual features.
17
Mongeau JM, Cheng KY, Aptekar J, Frye MA. Visuomotor strategies for object approach and aversion in Drosophila melanogaster. ACTA ACUST UNITED AC 2019; 222:jeb.193730. [PMID: 30559298 DOI: 10.1242/jeb.193730] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/02/2018] [Accepted: 12/10/2018] [Indexed: 02/01/2023]
Abstract
Animals classify stimuli to generate appropriate motor actions. In flight, Drosophila melanogaster classify equidistant large and small objects with categorically different behaviors: a tall object evokes approach whereas a small object elicits avoidance. We studied visuomotor behavior in rigidly and magnetically tethered D. melanogaster to reveal strategies that generate aversion to a small object. We discovered that small-object aversion in tethered flight is enabled by aversive saccades and smooth movement, which vary with the stimulus type. Aversive saccades in response to a short bar had different dynamics from approach saccades in response to a tall bar and the distribution of pre-saccade error angles was more stochastic for a short bar. Taken together, we show that aversive responses in D. melanogaster are driven in part by processes that elicit signed saccades with distinct dynamics and trigger mechanisms. Our work generates new hypotheses to study brain circuits that underlie classification of objects in D. melanogaster.
Affiliation(s)
- Jean-Michel Mongeau, Karen Y Cheng, Jacob Aptekar, Mark A Frye
- Department of Integrative Biology and Physiology, University of California - Los Angeles, Los Angeles, CA 90095-7239, USA
18
Keleş MF, Mongeau JM, Frye MA. Object features and T4/T5 motion detectors modulate the dynamics of bar tracking by Drosophila. ACTA ACUST UNITED AC 2019; 222:jeb.190017. [PMID: 30446539 DOI: 10.1242/jeb.190017] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2018] [Accepted: 11/09/2018] [Indexed: 01/21/2023]
Abstract
Visual objects can be discriminated by static spatial features such as luminance or dynamic features such as relative movement. Flies track a solid dark vertical bar moving on a bright background, a behavioral reaction so strong that for a rigidly tethered fly, the steering trajectory is phase advanced relative to the moving bar, apparently in anticipation of its future position. By contrast, flickering bars that generate no coherent motion or have a surface texture that moves in the direction opposite to the bar generate steering responses that lag behind the stimulus. It remains unclear how the spatial properties of a bar influence behavioral response dynamics. Here, we show that a dark bar defined by its luminance contrast to the uniform background drives a co-directional steering response that is phase advanced relative to the response to a textured bar defined only by its motion relative to a stationary textured background. The textured bar drives an initial contra-directional turn and phase-locked tracking. The qualitatively distinct response dynamics could indicate parallel visual processing of a luminance versus motion-defined object. Calcium imaging shows that T4/T5 motion-detecting neurons are more responsive to a solid dark bar than a motion-defined bar. Genetically blocking T4/T5 neurons eliminates the phase-advanced co-directional response to the luminance-defined bar, leaving the orientation response largely intact. We conclude that T4/T5 neurons mediate a co-directional optomotor response to a luminance-defined bar, thereby driving phase-advanced wing kinematics, whereas separate unknown visual pathways elicit the contra-directional orientation response.
Affiliation(s)
- Mehmet F Keleş, Jean-Michel Mongeau, Mark A Frye
- Department of Integrative Biology and Physiology, University of California Los Angeles, Los Angeles, CA 90095-7239, USA
19
Fu Q, Wang H, Hu C, Yue S. Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review. ARTIFICIAL LIFE 2019; 25:263-311. [PMID: 31397604 DOI: 10.1162/artl_a_00297] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Motion perception is a critical capability underpinning many aspects of insects' lives, including predator avoidance and foraging. A good number of motion detectors have been identified in insect visual pathways. Computational modeling of these motion detectors has not only provided effective solutions for artificial intelligence but has also benefited our understanding of complicated biological visual systems. These biological mechanisms, refined over millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models originating from biological research on insect visual systems. These motion perception models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees, and locusts, and the small-target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modeling studies, we summarize the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-systems integration and the hardware realization of these bio-inspired motion perception models.
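Several of the model families surveyed here share the same computational core: the correlation-type elementary motion detector of the Hassenstein-Reichardt type. A minimal sketch of that core, our own illustration rather than any specific model from the review:

```python
import numpy as np

def emd_output(left, right, tau=10.0, dt=1.0):
    """Hassenstein-Reichardt elementary motion detector: each input is
    low-pass filtered (a delay surrogate) and multiplied with the
    undelayed neighbouring input; subtracting the two mirror-symmetric
    half-detectors yields a direction-selective signal."""
    alpha = dt / (tau + dt)                       # first-order low-pass coefficient
    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        for t in range(1, len(x)):
            y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
        return y
    return lowpass(left) * right - left * lowpass(right)

# Rightward motion: the right photoreceptor sees a delayed copy of the left.
t = np.arange(400)
left = (np.sin(2 * np.pi * t / 50) > 0).astype(float)   # square-wave grating
right = np.roll(left, 5)                                # arrives 5 samples later
out = emd_output(left, right)
```

Because the stimulus delay matches the sign of the filter-delay arm, the mean output is positive for this direction and, by the detector's antisymmetry, negative when the inputs are swapped.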
Affiliation(s)
- Qinbing Fu, Cheng Hu, Shigang Yue
- Guangzhou University, School of Mechanical and Electrical Engineering; Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems
- Hongxin Wang
- University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems
20
Park EJ, Wasserman SM. Diversity of visuomotor reflexes in two Drosophila species. Curr Biol 2018; 28:R865-R866. [DOI: 10.1016/j.cub.2018.06.071] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/28/2022]
21
Toepfer F, Wolf R, Heisenberg M. Multi-stability with ambiguous visual stimuli in Drosophila orientation behavior. PLoS Biol 2018; 16:e2003113. [PMID: 29438378 PMCID: PMC5826666 DOI: 10.1371/journal.pbio.2003113] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2017] [Revised: 02/26/2018] [Accepted: 01/19/2018] [Indexed: 11/23/2022] Open
Abstract
It is widely accepted for humans and higher animals that vision is an active process in which the organism interprets the stimulus. To find out whether this also holds for lower animals, we designed an ambiguous motion stimulus, which serves as something like a multi-stable perception paradigm in Drosophila behavior. Confronted with a uniform panoramic texture in a closed-loop situation in stationary flight, the flies adjust their yaw torque to stabilize their virtual self-rotation. To make the visual input ambiguous, we added a second texture. Both textures got a rotatory bias to move into opposite directions at a constant relative angular velocity. The results indicate that the fly now had three possible frames of reference for self-rotation: either of the two motion components as well as the integrated motion vector of the two. In this ambiguous stimulus situation, the flies generated a continuous sequence of behaviors, each one adjusted to one or another of the three references.

Vision is considered an active process in humans and higher animals in which the stimulus is interpreted by the subject and can be perceived in different ways if it is ambiguous. We aimed to find out whether this also holds for lower animals, such as the fruit fly Drosophila melanogaster. To provide ambiguity, we exposed flies to transparent motion stimuli in a flight simulator and found their behavior to be multi-stable. These results show that the visual system of the fly can separate the individual components of a transparent motion stimulus, and that this kind of stimulus is ambiguous to the fly. The extent to which the fly shows component selectivity in its behavior depends on several properties of the stimulus, like pattern contrast and element density. The alternations between the different behaviors exhibit a stochasticity reminiscent of the temporal dynamics in human multi-stable perception.
Affiliation(s)
- Reinhard Wolf
- Rudolf Virchow Center, University of Wuerzburg, Germany
22
Nityananda V, Tarawneh G, Henriksen S, Umeton D, Simmons A, Read JCA. A Novel Form of Stereo Vision in the Praying Mantis. Curr Biol 2018; 28:588-593.e4. [PMID: 29429616 DOI: 10.1016/j.cub.2018.01.012] [Citation(s) in RCA: 45] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2017] [Revised: 12/14/2017] [Accepted: 01/04/2018] [Indexed: 12/19/2022]
Abstract
Stereopsis is the ability to estimate distance based on the different views seen in the two eyes [1-5]. It is an important model perceptual system in neuroscience and a major area of machine vision. Mammalian, avian, and almost all machine stereo algorithms look for similarities between the luminance-defined images in the two eyes, using a series of computations to produce a map showing how depth varies across the scene [3, 4, 6-14]. Stereopsis has also evolved in at least one invertebrate, the praying mantis [15-17]. Mantis stereopsis is presumed to be simpler than vertebrates' [15, 18], but little is currently known about the underlying computations. Here, we show that mantis stereopsis uses a fundamentally different computational algorithm from vertebrate stereopsis-rather than comparing luminance in the two eyes' images directly, mantis stereopsis looks for regions of the images where luminance is changing. Thus, while there is no evidence that mantis stereopsis works at all with static images, it successfully reveals the distance to a moving target even in complex visual scenes with targets that are perfectly camouflaged against the background in terms of texture. Strikingly, these insects outperform human observers at judging stereoscopic distance when the pattern of luminance in the two eyes does not match. Insect stereopsis has thus evolved to be computationally efficient while being robust to poor image resolution and to discrepancies in the pattern of luminance between the two eyes. VIDEO ABSTRACT.
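The change-based matching computation proposed here can be caricatured in a few lines: compute a map of where luminance is changing in each eye, then find the horizontal shift that best aligns the two maps. Everything below (the stimulus layout, window sizes, and the dot-product matching score) is our own toy reduction, not the authors' model:

```python
import numpy as np

def change_based_disparity(left_frames, right_frames, max_disp):
    """Match maps of temporal luminance *change* between the eyes,
    rather than raw luminance, and return the best-aligning shift."""
    lc = np.abs(np.diff(left_frames, axis=0)).sum(axis=0)    # change map, left eye
    rc = np.abs(np.diff(right_frames, axis=0)).sum(axis=0)   # change map, right eye
    scores = [np.dot(lc[:len(lc) - d], rc[d:]) for d in range(max_disp + 1)]
    return int(np.argmax(scores))

# A flickering target on static backgrounds that differ between the eyes,
# so raw-luminance matching has nothing consistent to latch onto.
rng = np.random.default_rng(1)
W, T, d_true = 80, 40, 6
left = np.zeros((T, W)) + rng.uniform(0, 1, W)    # static pattern, left eye
right = np.zeros((T, W)) + rng.uniform(0, 1, W)   # different static pattern, right eye
flick = rng.uniform(0, 1, (T, 10))
left[:, 30:40] = flick                            # flickering target, left eye
right[:, 36:46] = flick                           # same target shifted by 6 px
est = change_based_disparity(left, right, max_disp=15)
```

The static backgrounds contribute nothing to the change maps, so only the flickering target drives the match, mirroring the abstract's point that the mechanism works with camouflaged, luminance-matched targets.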
Affiliation(s)
- Vivek Nityananda, Ghaith Tarawneh, Sid Henriksen, Diana Umeton, Adam Simmons, Jenny C A Read
- Institute of Neuroscience, Henry Wellcome Building, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, UK
23
Dyakova O, Nordström K. Image statistics and their processing in insect vision. CURRENT OPINION IN INSECT SCIENCE 2017; 24:7-14. [PMID: 29208226 DOI: 10.1016/j.cois.2017.08.002] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/16/2017] [Revised: 08/17/2017] [Accepted: 08/24/2017] [Indexed: 06/07/2023]
Abstract
Natural scenes may appear random, but they are constrained in space and time and show strong spatial and temporal correlations. Spatial constraints and correlations can be described by quantifying image statistics, which include intuitive measures such as contrast, color, and luminance, but also parameters that require some transformation of the image. In this review we discuss common tools used to quantify spatial and temporal parameters of naturalistic visual input, and how these tools have informed us about visual processing in insects. In particular, we review findings that would not have been possible using conventional, experimenter-defined stimuli.
Affiliation(s)
- Olga Dyakova
- Department of Neuroscience, Uppsala University, Box 593, 751 24 Uppsala, Sweden
- Karin Nordström
- Department of Neuroscience, Uppsala University, Box 593, 751 24 Uppsala, Sweden; Centre for Neuroscience, Flinders University, GPO Box 2100, Adelaide, SA 5001, Australia
24
Mongeau JM, Frye MA. Drosophila Spatiotemporally Integrates Visual Signals to Control Saccades. Curr Biol 2017; 27:2901-2914.e2. [PMID: 28943085 DOI: 10.1016/j.cub.2017.08.035] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2017] [Revised: 07/31/2017] [Accepted: 08/15/2017] [Indexed: 11/16/2022]
Abstract
Like many visually active animals, including humans, flies generate both smooth and rapid saccadic movements to stabilize their gaze. How rapid body saccades and smooth movement interact for simultaneous object pursuit and gaze stabilization is not understood. We directly observed these interactions in magnetically tethered Drosophila free to rotate about the yaw axis. A moving bar elicited sustained bouts of saccades following the bar, with surprisingly little smooth movement. By contrast, a moving panorama elicited robust smooth movement interspersed with occasional optomotor saccades. The amplitude, angular velocity, and torque transients of bar-fixation saccades were finely tuned to the speed of bar motion and were triggered by a threshold in the temporal integral of the bar error angle rather than its absolute retinal position error. Optomotor saccades were tuned to the dynamics of panoramic image motion and were triggered by a threshold in the integral of velocity over time. A hybrid control model based on integrated motion cues simulates saccade trigger and dynamics. We propose a novel algorithm for tuning fixation saccades in flies.
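The integrate-and-trigger rule described here (a saccade fires when the time-integral of the error crosses a threshold) can be sketched in a few lines. The threshold value, reset rule, and constant-error stimulus below are our illustrative assumptions, not the paper's fitted parameters:

```python
import numpy as np

def saccade_trigger(error, threshold, dt=0.01):
    """Integrate the (retinal) error angle over time and trigger a
    saccade whenever the integral's magnitude crosses a threshold;
    the integrator resets after each saccade (our assumed reset rule)."""
    acc = 0.0
    saccade_times = []
    for i, e in enumerate(error):
        acc += e * dt
        if abs(acc) >= threshold:
            saccade_times.append(i * dt)
            acc = 0.0                    # reset the leaky-free integrator
    return saccade_times

# A constant 10 deg error with a 4.95 deg*s threshold triggers a saccade
# roughly every 0.5 s, so 3 s of stimulus yields 6 saccades.
times = saccade_trigger(np.full(300, 10.0), threshold=4.95)
```

The key property, matching the abstract's claim, is that the trigger depends on integrated error rather than instantaneous retinal position: halving the error doubles the inter-saccade interval.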
Affiliation(s)
- Jean-Michel Mongeau, Mark A Frye
- Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA 90095-7239, USA
25
Keleş MF, Frye MA. Object-Detecting Neurons in Drosophila. Curr Biol 2017; 27:680-687. [PMID: 28190726 DOI: 10.1016/j.cub.2017.01.012] [Citation(s) in RCA: 80] [Impact Index Per Article: 11.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2016] [Revised: 12/15/2016] [Accepted: 01/06/2017] [Indexed: 12/18/2022]
Abstract
Many animals rely on vision to detect objects such as conspecifics, predators, and prey. Hypercomplex cells found in feline cortex and small target motion detectors found in dragonfly and hoverfly optic lobes demonstrate robust tuning for small objects, with weak or no response to larger objects or movement of the visual panorama [1-3]. However, the relationship among anatomical, molecular, and functional properties of object detection circuitry is not understood. Here we characterize a specialized object detector in Drosophila, the lobula columnar neuron LC11 [4]. By imaging calcium dynamics with two-photon excitation microscopy, we show that LC11 responds to the omni-directional movement of a small object darker than the background, with little or no responses to static flicker, vertically elongated bars, or panoramic gratings. LC11 dendrites innervate multiple layers of the lobula, and each dendrite spans enough columns to sample 75° of visual space, yet the area that evokes calcium responses is only 20° wide and shows robust responses to a 2.2° object spanning less than half of one facet of the compound eye. The dendrites of neighboring LC11s encode object motion retinotopically, but the axon terminals fuse into a glomerular structure in the central brain where retinotopy is lost. Blocking inhibitory ionic currents abolishes small object sensitivity and facilitates responses to elongated bars and gratings. Our results reveal high-acuity object motion detection in the Drosophila optic lobe.
Affiliation(s)
- Mehmet F Keleş, Mark A Frye
- Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA 90095, USA
26
Wu M, Nern A, Williamson WR, Morimoto MM, Reiser MB, Card GM, Rubin GM. Visual projection neurons in the Drosophila lobula link feature detection to distinct behavioral programs. eLife 2016; 5. [PMID: 28029094 PMCID: PMC5293491 DOI: 10.7554/elife.21022] [Citation(s) in RCA: 147] [Impact Index Per Article: 18.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2016] [Accepted: 12/23/2016] [Indexed: 12/13/2022] Open
Abstract
Visual projection neurons (VPNs) provide an anatomical connection between early visual processing and higher brain regions. Here we characterize lobula columnar (LC) cells, a class of Drosophila VPNs that project to distinct central brain structures called optic glomeruli. We anatomically describe 22 different LC types and show that, for several types, optogenetic activation in freely moving flies evokes specific behaviors. The activation phenotypes of two LC types closely resemble natural avoidance behaviors triggered by a visual loom. In vivo two-photon calcium imaging reveals that these LC types respond to looming stimuli, while another type does not, but instead responds to the motion of a small object. Activation of LC neurons on only one side of the brain can result in attractive or aversive turning behaviors depending on the cell type. Our results indicate that LC neurons convey information on the presence and location of visual features relevant for specific behaviors.
Affiliation(s)
- Ming Wu, Aljoscha Nern, W Ryan Williamson, Mai M Morimoto, Michael B Reiser, Gwyneth M Card, Gerald M Rubin
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
27
Abstract
Many animals rely on visual figure-ground discrimination to aid in navigation and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure-ground stimuli containing a motion-defined edge elicit statistically similar behavioral responses to one another, and statistically distinct behavioral responses from ground motion alone. From studies in larger flies and other insect species, we hypothesized that the circuitry of the lobula, one of the four primary neuropiles of the fly optic lobe, performs this visual discrimination. Using calcium imaging of input dendrites, we then show that information encoded in cells projecting from the lobula to discrete optic glomeruli in the central brain groups these sets of figure-ground stimuli in a manner homologous to the behavior: "figure-like" stimuli are coded similarly to one another and "ground-like" stimuli are encoded differently. One cell class responds to the leading edge of a figure and is suppressed by ground motion. Two other classes cluster any figure-like stimuli, including a figure moving opposite the ground, distinctly from ground alone. This evidence demonstrates that lobula outputs provide a diverse basis set encoding visual features necessary for figure detection.
28
Spatio-temporal dynamics of impulse responses to figure motion in optic flow neurons. PLoS One 2015; 10:e0126265. [PMID: 25955416 PMCID: PMC4425674 DOI: 10.1371/journal.pone.0126265] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2014] [Accepted: 03/31/2015] [Indexed: 11/24/2022] Open
Abstract
White noise techniques have been used widely to investigate sensory systems in both vertebrates and invertebrates. White noise stimuli are powerful in their ability to rapidly generate data that help the experimenter decipher the spatio-temporal dynamics of neural and behavioral responses. One type of white noise stimuli, maximal length shift register sequences (m-sequences), have recently become particularly popular for extracting response kernels in insect motion vision. We here use such m-sequences to extract the impulse responses to figure motion in hoverfly lobula plate tangential cells (LPTCs). Figure motion is behaviorally important and many visually guided animals orient towards salient features in the surround. We show that LPTCs respond robustly to figure motion in the receptive field. The impulse response is scaled down in amplitude when the figure size is reduced, but its time course remains unaltered. However, a low contrast stimulus generates a slower response with a significantly longer time-to-peak and half-width. Impulse responses in females have a slower time-to-peak than males, but are otherwise similar. Finally we show that the shapes of the impulse response to a figure and a widefield stimulus are very similar, suggesting that the figure response could be coded by the same input as the widefield response.
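The m-sequence approach the abstract builds on can be shown compactly: the circular autocorrelation of a maximal-length ±1 sequence is nearly a delta function, so circularly cross-correlating a linear system's response with the stimulus recovers its first-order kernel. A minimal sketch, our own rather than the paper's code, with an arbitrary exponential kernel standing in for an LPTC impulse response:

```python
import numpy as np

def m_sequence(nbits=7, taps=(7, 1)):
    """±1-valued maximal-length shift-register sequence (m-sequence).
    taps=(7, 1) gives a primitive degree-7 feedback polynomial, so the
    period is 2**7 - 1 = 127."""
    state = [1] * nbits
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(1.0 if state[-1] else -1.0)
        fb = 0
        for t in taps:                       # XOR of the tapped register bits
            fb ^= state[t - 1]
        state = [fb] + state[:-1]            # shift the register
    return np.array(out)

def extract_kernel(stimulus, response, nlags):
    """First-order kernel via circular cross-correlation: the m-sequence
    autocorrelation is N at lag 0 and -1 elsewhere, so this recovers the
    impulse response up to an O(1/N) bias."""
    N = len(stimulus)
    return np.array([np.dot(np.roll(stimulus, k), response) / N
                     for k in range(nlags)])

stim = m_sequence()
h_true = np.exp(-np.arange(20) / 5.0)                        # toy impulse response
resp = sum(h_true[k] * np.roll(stim, k) for k in range(20))  # circular convolution
h_est = extract_kernel(stim, resp, 20)                       # ≈ h_true
```

With N = 127 the residual bias is about 0.04 here; longer sequences shrink it further, which is one reason m-sequences extract kernels so rapidly in practice.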
29
30
Fenk LM, Poehlmann A, Straw AD. Asymmetric processing of visual motion for simultaneous object and background responses. Curr Biol 2014; 24:2913-9. [PMID: 25454785 DOI: 10.1016/j.cub.2014.10.042]
Abstract
Visual object fixation and figure-ground discrimination in Drosophila are robust behaviors requiring sophisticated computation by the visual system, yet the neural substrates remain unknown. Recent experiments in walking flies revealed object fixation behavior mediated by circuitry independent from the motion-sensitive T4-T5 cells required for wide-field motion responses. In tethered flight experiments under closed-loop conditions, we found similar results for one feedback gain, whereas intact T4-T5 cells were necessary for robust object fixation at a higher feedback gain and in figure-ground discrimination tasks. We implemented dynamical models (available at http://strawlab.org/asymmetric-motion/) based on neurons downstream of T4-T5 cells—one a simple phenomenological model and another, physiologically more realistic model—and found that both predict key features of stripe fixation and figure-ground discrimination and are consistent with a classical formulation. Fundamental to both models is motion asymmetry in the responses of model neurons, whereby front-to-back motion elicits stronger responses than back-to-front motion. When a bilateral pair of such model neurons, based on well-understood horizontal system cells, downstream of T4-T5, is coupled to turning behavior, asymmetry leads to object fixation and figure-ground discrimination in the presence of noise. Furthermore, the models also predict fixation in front of a moving background, a behavior previously suggested to require an additional pathway. Thus, the models predict several aspects of object responses on the basis of neurons that are also thought to serve a key role in background stabilization.
Affiliation(s)
- Lisa M Fenk
- Research Institute of Molecular Pathology, Vienna Biocenter, Dr. Bohr-Gasse 7, 1030 Vienna, Austria
- Andreas Poehlmann
- Research Institute of Molecular Pathology, Vienna Biocenter, Dr. Bohr-Gasse 7, 1030 Vienna, Austria
- Andrew D Straw
- Research Institute of Molecular Pathology, Vienna Biocenter, Dr. Bohr-Gasse 7, 1030 Vienna, Austria
31
Aptekar JW, Keles MF, Mongeau JM, Lu PM, Frye MA, Shoemaker PA. Method and software for using m-sequences to characterize parallel components of higher-order visual tracking behavior in Drosophila. Front Neural Circuits 2014; 8:130. [PMID: 25400550 PMCID: PMC4215624 DOI: 10.3389/fncir.2014.00130]
Abstract
A moving visual figure may contain first-order signals defined by variation in mean luminance, as well as second-order signals defined by constant mean luminance and variation in luminance envelope, or higher-order signals that cannot be estimated by taking higher moments of the luminance distribution. Separating these properties of a moving figure to experimentally probe the visual subsystems that encode them is technically challenging and has resulted in debated mechanisms of visual object detection by flies. Our prior work took a white noise systems identification approach using a commercially available electronic display system to characterize the spatial variation in the temporal dynamics of two distinct subsystems for first- and higher-order components of visual figure tracking. The method relied on the use of single pixel displacements of two visual stimuli according to two binary maximum length shift register sequences (m-sequences) and cross-correlation of each m-sequence with time-varying flight steering measurements. The resultant spatio-temporal action fields (STAFs) represent temporal impulse responses parameterized by the azimuthal location of the visual figure, one STAF for first-order and another for higher-order components of compound stimuli. Here we review m-sequence and reverse correlation procedures, then describe our application in detail, provide Matlab code, validate the STAFs, and demonstrate the utility and robustness of STAFs by predicting the results of other published experimental procedures. This method has demonstrated how two relatively modest innovations on classical white noise analysis—the inclusion of space as a way to organize response kernels and the use of linear decoupling to measure the response to two channels of visual information simultaneously—could substantially improve our basic understanding of visual processing in the fly.
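The linear decoupling mentioned here, measuring two response kernels from one simultaneous recording, can be sketched compactly. The paper drove the two stimulus channels with two different m-sequences; for an exactly checkable sketch, this uses the closely related shifted-copy variant (one m-sequence plus a circularly shifted copy, with the shift longer than the kernels), which decouples the channels exactly. The kernels and names are hypothetical.

```python
# Sketch of measuring two response kernels simultaneously from one recording.
# The published method used two distinct m-sequences; this sketch uses the
# shifted-copy variant of the same idea. Kernels and names are hypothetical.

def m_sequence(degree=4):
    state, seq = 0b1000, []
    for _ in range(2 ** degree - 1):
        bit = ((state >> 0) ^ (state >> 1)) & 1  # taps for x^4 + x^3 + 1
        seq.append(1 if state & 1 else -1)
        state = (state >> 1) | (bit << (degree - 1))
    return seq

N, SHIFT = 15, 7
s1 = m_sequence()
s2 = [s1[(t + SHIFT) % N] for t in range(N)]  # second, decorrelated channel

h1 = [1.0, 2.0, 0.5, 0.0]                     # hypothetical figure kernel
h2 = [0.0, 0.5, 1.5, 0.25]                    # hypothetical ground kernel

# One combined recording: the superposition of both channels' responses.
y = [sum(h1[j] * s1[(t - j) % N] for j in range(len(h1)))
     + sum(h2[j] * s2[(t - j) % N] for j in range(len(h2)))
     for t in range(N)]

def xcorr(stim, resp):
    return [sum(stim[(t - k) % N] * resp[t] for t in range(N)) / N
            for k in range(N)]

# Cross-correlating with each sequence recovers its own kernel at short lags;
# lags at or beyond SHIFT alias into the other channel and are discarded.
est1 = xcorr(s1, y)[:SHIFT]
est2 = xcorr(s2, y)[:SHIFT]
```

Each estimate peaks at its own kernel's peak lag despite both kernels being mixed into a single response trace, which is the essence of measuring two channels of visual information at once.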
Affiliation(s)
- Jacob W Aptekar
- Department of Integrative Biology and Physiology, Howard Hughes Medical Institute, University of California, Los Angeles, Los Angeles, CA, USA
- Mehmet F Keles
- Department of Integrative Biology and Physiology, Howard Hughes Medical Institute, University of California, Los Angeles, Los Angeles, CA, USA
- Jean-Michel Mongeau
- Department of Integrative Biology and Physiology, Howard Hughes Medical Institute, University of California, Los Angeles, Los Angeles, CA, USA
- Patrick M Lu
- Department of Integrative Biology and Physiology, Howard Hughes Medical Institute, University of California, Los Angeles, Los Angeles, CA, USA
- Mark A Frye
- Department of Integrative Biology and Physiology, Howard Hughes Medical Institute, University of California, Los Angeles, Los Angeles, CA, USA
32
Kress D, Egelhaaf M. Impact of stride-coupled gaze shifts of walking blowflies on the neuronal representation of visual targets. Front Behav Neurosci 2014; 8:307. [PMID: 25309362 PMCID: PMC4164030 DOI: 10.3389/fnbeh.2014.00307]
Abstract
During locomotion, animals rely heavily on visual cues from the environment to guide their behavior. Examples are basic behaviors like collision avoidance or the approach to a goal. The saccadic gaze strategy of flying flies, which separates translational from rotational phases of locomotion, has been suggested to facilitate the extraction of environmental information, because only image flow evoked by translational self-motion contains relevant distance information about the surrounding world. In contrast to the translational phases of flight during which gaze direction is kept largely constant, walking flies experience continuous rotational image flow that is coupled to their stride-cycle. The consequences of these self-produced image shifts for the extraction of environmental information are still unclear. To assess the impact of stride-coupled image shifts on visual information processing, we performed electrophysiological recordings from the HSE cell, a motion sensitive wide-field neuron in the blowfly visual system. This cell is thought to play a key role in mediating optomotor behavior, self-motion estimation, and spatial information processing. We used visual stimuli that were based on the visual input experienced by walking blowflies while approaching a black vertical bar. The response of HSE to these stimuli was dominated by periodic membrane potential fluctuations evoked by stride-coupled image shifts. Nevertheless, during the approach the cell's response contained information about the bar and its background. The response components evoked by the bar were larger than the responses to its background, especially during the last phase of the approach. However, as revealed by targeted modifications of the visual input during walking, the extraction of distance information on the basis of HSE responses is substantially impaired by stride-coupled retinal image shifts. Possible mechanisms that may cope with these stride-coupled responses are discussed.
Affiliation(s)
- Daniel Kress
- Department of Neurobiology, Bielefeld University, Bielefeld, Germany
- CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, Germany
- CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
33
Abstract
Understanding how the brain controls behaviour is undisputedly one of the grand goals of neuroscience research, and the pursuit of this goal has a long tradition in insect neuroscience. However, appropriate techniques were lacking for a long time. Recent advances in genetic and recording techniques now allow the participation of identified neurons in the execution of specific behaviours to be interrogated. By focusing on fly visual course control, I highlight what has been learned about the neuronal circuit modules that control visual guidance in Drosophila melanogaster through the use of these techniques.
Affiliation(s)
- Alexander Borst
- Max Planck Institute of Neurobiology, Systems and Computational Neuroscience, Am Klopferspitz 18, 82152 Martinsried, Germany
34
Kress D, Egelhaaf M. Gaze characteristics of freely walking blowflies Calliphora vicina in a goal-directed task. J Exp Biol 2014; 217:3209-20. [PMID: 25013104 DOI: 10.1242/jeb.097436]
Abstract
In contrast to flying flies, walking flies experience relatively strong rotational gaze shifts, even during overall straight phases of locomotion. These gaze shifts are caused by the walking apparatus and modulated by the stride frequency. Accordingly, even during straight walking phases, the retinal image flow is composed of both translational and rotational optic flow, which might affect spatial vision as well as fixation behavior. We addressed this issue for an orientation task where walking blowflies approached a black vertical bar. The visual stimulus was stationary, or either the bar or the background moved horizontally. The stride-coupled gaze shifts of flies walking toward the bar had similar amplitudes under all visual conditions tested. This finding indicates that these shifts are an inherent feature of walking, which are not compensated even during a visual goal fixation task. By contrast, approaching flies showed a frequent stop-and-go behavior that was affected by the stimulus conditions. As sustained image rotations may impair distance estimation during walking, we propose a hypothesis that explains how rotation-independent translatory image flow containing distance information can be determined. The proposed algorithm works without requiring the rotational and translational flow components to be separated at the behavioral level. By contrast, such disentangling has been proposed to be necessary during flight. By comparing the retinal velocities of the edges of the goal, its rotational image motion component can be removed. Consequently, the expansion velocity of the goal and, thus, its proximity can be extracted, irrespective of distance-independent stride-coupled rotational image shifts.
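The proposed edge-comparison computation can be made concrete: self-rotation shifts both edges of the goal by the same retinal velocity, while approach moves them apart, so the difference of the two edge velocities cancels the rotational component. The function and numbers below are an illustrative reconstruction under that assumption, not the authors' implementation.

```python
# Illustrative sketch of the proposed rotation-invariant expansion estimate.
# Function name and numbers are hypothetical.

def decompose_edge_velocities(v_left, v_right):
    """Split the retinal velocities of the goal's two edges (deg/s) into a
    shared rotational component and an expansion component."""
    rotation = (v_left + v_right) / 2.0   # common (stride-coupled) rotation
    expansion = v_right - v_left          # edges moving apart: proximity cue
    return rotation, expansion

# A stride-coupled body rotation of 30 deg/s plus an approach that expands
# the goal at 10 deg/s would be seen on the retina as:
v_left = 30.0 - 5.0    # left edge: rotation minus half the expansion
v_right = 30.0 + 5.0   # right edge: rotation plus half the expansion
rotation, expansion = decompose_edge_velocities(v_left, v_right)
```

The expansion term is untouched by any common rotation, which is exactly the claimed stride-robustness of the proximity cue.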
Affiliation(s)
- Daniel Kress
- Department of Neurobiology and CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
35
Abstract
Visually guided animals rely on their ability to stabilize the panorama and simultaneously track salient objects, or figures, that are distinct from the background in order to avoid predators, pursue food resources and mates, and navigate spatially. Visual figures are distinguished by luminance signals that produce coherent motion cues, as well as by more enigmatic 'higher-order' statistical features. Figure discrimination is thus a complex form of motion vision requiring specialized neural processing. In this minireview, we will highlight recent advances in understanding the perceptual, behavioral, and neurophysiological basis of higher-order figure detection in flies, much of which is grounded in the historical perspective and mechanistic underpinnings of human psychophysics.
Affiliation(s)
- Jacob W Aptekar
- Howard Hughes Medical Institute, Department of Integrative Biology and Physiology, University of California Los Angeles, Los Angeles, CA 90095, USA
36
Roth E, Sponberg S, Cowan NJ. A comparative approach to closed-loop computation. Curr Opin Neurobiol 2014; 25:54-62. [DOI: 10.1016/j.conb.2013.11.005]
37
Fox JL, Frye MA. Figure-ground discrimination behavior in Drosophila. II. Visual influences on head movement behavior. J Exp Biol 2014; 217:570-9. [PMID: 24198264 PMCID: PMC3922834 DOI: 10.1242/jeb.080192]
Abstract
Visual identification of small moving targets is a challenge for all moving animals. Their own motion generates displacement of the visual surroundings, inducing wide-field optic flow across the retina. Wide-field optic flow is used to sense perturbations in the flight course. Both ego-motion and corrective optomotor responses confound any attempt to track a salient target moving independently of the visual surroundings. What are the strategies that flying animals use to discriminate small-field figure motion from superimposed wide-field background motion? We examined how fruit flies adjust their gaze in response to a compound visual stimulus comprising a small moving figure against an independently moving wide-field ground, which they do by re-orienting their head or their flight trajectory. We found that fixing the head in place impairs object fixation in the presence of ground motion, and that head movements are necessary for stabilizing wing steering responses to wide-field ground motion when a figure is present. When a figure is moving relative to a moving ground, wing steering responses follow components of both the figure and ground trajectories, but head movements follow only the ground motion. To our knowledge, this is the first demonstration that wing responses can be uncoupled from head responses and that the two follow distinct trajectories in the case of simultaneous figure and ground motion. These results suggest that whereas figure tracking by wing kinematics is independent of head movements, head movements are important for stabilizing ground motion during active figure tracking.
Affiliation(s)
- Jessica L Fox
- Howard Hughes Medical Institute and Department of Integrative Biology and Physiology, University of California Los Angeles, Los Angeles, CA 90095-7239, USA
38
Fox JL, Aptekar JW, Zolotova NM, Shoemaker PA, Frye MA. Figure-ground discrimination behavior in Drosophila. I. Spatial organization of wing-steering responses. J Exp Biol 2014; 217:558-69. [PMID: 24198267 PMCID: PMC3922833 DOI: 10.1242/jeb.097220]
Abstract
The behavioral algorithms and neural subsystems for visual figure–ground discrimination are not sufficiently described in any model system. The fly visual system shares structural and functional similarity with that of vertebrates and, like vertebrates, flies robustly track visual figures in the face of ground motion. This computation is crucial for animals that pursue salient objects under the high performance requirements imposed by flight behavior. Flies smoothly track small objects and use wide-field optic flow to maintain flight-stabilizing optomotor reflexes. The spatial and temporal properties of visual figure tracking and wide-field stabilization have been characterized in flies, but how the two systems interact spatially to allow flies to actively track figures against a moving ground has not. We took a systems identification approach in flying Drosophila and measured wing-steering responses to velocity impulses of figure and ground motion independently. We constructed a spatiotemporal action field (STAF) – the behavioral analog of a spatiotemporal receptive field – revealing how the behavioral impulse responses to figure tracking and concurrent ground stabilization vary for figure motion centered at each location across the visual azimuth. The figure tracking and ground stabilization STAFs show distinct spatial tuning and temporal dynamics, confirming the independence of the two systems. When the figure tracking system is activated by a narrow vertical bar moving within the frontal field of view, ground motion is essentially ignored despite comprising over 90% of the total visual input.
Affiliation(s)
- Jessica L Fox
- Howard Hughes Medical Institute and Department of Integrative Biology and Physiology, University of California Los Angeles, Los Angeles, CA 90095-7239, USA
39
Clark DA, Freifeld L, Clandinin TR. Mapping and cracking sensorimotor circuits in genetic model organisms. Neuron 2013; 78:583-95. [PMID: 23719159 DOI: 10.1016/j.neuron.2013.05.006]
Abstract
One central goal of systems neuroscience is to understand how neural circuits implement the computations that link sensory inputs to behavior. Work combining electrophysiological and imaging-based approaches to measure neural activity with pharmacological and electrophysiological manipulations has provided fundamental insights. More recently, genetic approaches have been used to monitor and manipulate neural activity, opening up new experimental opportunities and challenges. Here, we discuss issues associated with applying genetic approaches to circuit dissection in sensorimotor transformations, outlining important considerations for experimental design and considering how modeling can complement experimental approaches.
Affiliation(s)
- Damon A Clark
- Department of Neurobiology, 299 W. Campus Drive, Stanford University, Stanford, CA 94305, USA
40
Abstract
A compact genome and a tiny brain make Drosophila the prime model to understand the neural substrate of behavior. The neurogenetic efforts to reveal neural circuits underlying Drosophila vision started about half a century ago, and now the field is booming with sophisticated genetic tools, rich behavioral assays, and importantly, a greater number of scientists joining from different backgrounds. This review will briefly cover the structural anatomy of the Drosophila visual system, the animal’s visual behaviors, the genes involved in assembling these circuits, the new and powerful techniques, and the challenges ahead for ultimately identifying the general principles of biological computation in the brain.
A typical brain utilizes a great many compact neural circuits to collect and process information from the internal biological and external environmental worlds and generates motor commands for observable behaviors. The fruit fly Drosophila melanogaster, despite its miniature body and tiny brain, can survive in almost any corner of the world.1 It can find food, court mates, fight rival conspecifics, avoid predators, and, amazingly, fly without crashing into trees. Drosophila vision and its underlying neuronal machinery have been a key research model for neurogeneticists for at least half a century.2 Given the efforts invested in the visual system, this animal model is likely to offer the first full understanding of how visual information is computed by a multi-cellular organism. Furthermore, research in Drosophila has revealed many genes that play crucial roles in the formation of functional brains across species. The architectural similarities between the visual systems of Drosophila and vertebrates at the molecular, cellular, and network levels suggest that new principles discovered at the circuit level in Drosophila, concerning the relationship between neurons and behavior, will also contribute greatly to our understanding of the general principles of how bigger brains work.3 I start with the anatomy of the Drosophila visual system, which surprisingly still contains many uncharted areas.
Affiliation(s)
- Yan Zhu
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
41
Bahl A, Ammer G, Schilling T, Borst A. Object tracking in motion-blind flies. Nat Neurosci 2013; 16:730-8. [DOI: 10.1038/nn.3386]
42
Discriminating external and internal causes for heading changes in freely flying Drosophila. PLoS Comput Biol 2013; 9:e1002891. [PMID: 23468601 PMCID: PMC3585425 DOI: 10.1371/journal.pcbi.1002891]
Abstract
As animals move through the world in search of resources, they change course in reaction to both external sensory cues and internally-generated programs. Elucidating the functional logic of complex search algorithms is challenging because the observable actions of the animal cannot be unambiguously assigned to externally- or internally-triggered events. We present a technique that addresses this challenge by assessing quantitatively the contribution of external stimuli and internal processes. We apply this technique to the analysis of rapid turns (“saccades”) of freely flying Drosophila melanogaster. We show that a single scalar feature computed from the visual stimulus experienced by the animal is sufficient to explain a majority (93%) of the turning decisions. We automatically estimate this scalar value from the observable trajectory, without any assumption regarding the sensory processing. A posteriori, we show that the estimated feature field is consistent with previous results measured in other experimental conditions. The remaining turning decisions, not explained by this feature of the visual input, may be attributed to a combination of deterministic processes based on unobservable internal states and purely stochastic behavior. We cannot distinguish these contributions using external observations alone, but we are able to provide a quantitative bound of their relative importance with respect to stimulus-triggered decisions. Our results suggest that comparatively few saccades in free-flying conditions are a result of an intrinsic spontaneous process, contrary to previous suggestions. We discuss how this technique could be generalized for use in other systems and employed as a tool for classifying effects into sensory, decision, and motor categories when used to analyze data from genetic behavioral screens. 
Researchers have spent considerable effort studying how specific sensory stimuli elicit behavioral responses and how other behaviors may arise independent of external inputs in conditions of sensory deprivation. Yet an animal in its natural context, such as searching for food or mates, turns both in response to external stimuli and intrinsic, possibly stochastic, decisions. We show how to estimate the contribution of vision and internal causes on the observable behavior of freely flying Drosophila. We developed a dimensionality reduction scheme that finds a one-dimensional feature of the visual stimulus that best predicts turning decisions. This visual feature extraction is consistent with previous literature on visually elicited fly turning and predicts a large majority of turns in the tested environment. The rarity of stimulus-independent events suggests that fly behavior is more deterministic than previously suggested and that, more generally, animal search strategies may be dominated by responses to stimuli with only modest contributions from internal causes.
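The core result above, a single scalar visual feature that predicts most turning decisions, can be illustrated with a toy stand-in: fit a logistic model that maps a scalar feature to turn probability. The feature values, labels, and training settings below are hypothetical, and this simple regression is only a sketch of the idea, not the paper's dimensionality-reduction scheme.

```python
# Toy stand-in for predicting turn decisions from one scalar visual feature,
# in the spirit of the paper's one-dimensional feature result.
# Feature values, labels, and training settings are hypothetical.
import math

# Scalar feature per flight segment (e.g. a signed expansion measure)
# and whether a saccade was observed (1) or not (0).
features = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
turned = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit p(turn | x) = sigmoid(w * x + b) by batch gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(features, turned))
    grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(features, turned))
    w -= lr * grad_w
    b -= lr * grad_b

predictions = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in features]
accuracy = sum(p == y for p, y in zip(predictions, turned)) / len(turned)
```

On this deterministic toy data the single feature separates turns from non-turns completely, mirroring the paper's finding that one well-chosen scalar explains the large majority of saccade decisions.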
43
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913 PMCID: PMC3526811 DOI: 10.3389/fncir.2012.00108]
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown, through their characteristic behavioral actions, to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
44
Zhang X, Liu H, Lei Z, Wu Z, Guo A. Lobula-specific visual projection neurons are involved in perception of motion-defined second-order motion in Drosophila. J Exp Biol 2013; 216:524-34. [PMID: 23077158 DOI: 10.1242/jeb.079095]
Abstract
A wide variety of animal species, including humans and fruit flies, can see second-order motion, even though such stimuli lack coherent spatiotemporal correlations in luminance. Recent electrophysiological recordings, together with intensive psychophysical studies, are bringing to light the neural underpinnings of second-order motion perception in mammals. However, where and how higher-order motion signals are processed in the fly brain is poorly understood. Using the rich genetic tools available in Drosophila and examining optomotor responses of fruit flies to several stimuli, we revealed that two lobula-specific visual projection neurons, specifically connecting the lobula and the central brain, are involved in the perception of motion-defined second-order motion, independent of whether the second-order feature is moving perpendicular or opposite to the local first-order motion. By contrast, blocking these neurons has no effect, in terms of response delay, on responses to first-order and flicker-defined second-order stimuli. Our results suggest that visual neuropils deep in the optic lobe and the central brain, whose functional roles in motion processing were previously unclear, may be specifically required for motion-defined motion processing.
Affiliation(s)
- Xiaonan Zhang
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, China
45
Soibam B, Goldfeder RL, Manson-Bishop C, Gamblin R, Pletcher SD, Shah S, Gunaratne GH, Roman GW. Modeling Drosophila positional preferences in open field arenas with directional persistence and wall attraction. PLoS One 2012; 7:e46570. [PMID: 23071591 PMCID: PMC3468593 DOI: 10.1371/journal.pone.0046570]
Abstract
In open field arenas, Drosophila adults exhibit a preference for arena boundaries over internal walls and open regions. Herein, we investigate the nature of this preference using phenomenological modeling of locomotion to determine whether local arena features and constraints on movement alone are sufficient to drive positional preferences within open field arenas of different shapes and with different internal features. Our model has two components: directional persistence and local wall force. In regions far away from walls, the trajectory is entirely characterized by a directional persistence probability for each movement, defined by the step size and the turn angle. In close proximity to walls, motion is computed from the directional persistence probability together with a local attractive force that depends on the distance between the fly and points on the walls. The directional persistence probability was obtained experimentally from trajectories of wild type Drosophila in a circular open field arena, and the wall force was computed to minimize the difference between the radial distributions of the model and of Drosophila in the same circular arena. The two-component model was then challenged by comparing its positional preferences with those of wild type Drosophila in a variety of open field arenas. In most arenas there was a strong concordance between the two-component model and Drosophila. In more complex arenas, the model exhibits similar trends, but some significant differences were found. These differences suggest that there are emergent features within these complex arenas that have significance for the fly, such as potential shelter. Hence, the two-component model is an important step in defining how Drosophila interact with their environment.
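The two-component structure described here, directional persistence away from walls plus a local attractive turn near them, can be sketched as a short simulation. All parameters, the wall-zone width, and the attraction form below are hypothetical choices for illustration, not the paper's fitted model.

```python
# Minimal sketch of the two-component locomotion model: directional
# persistence plus an attractive turn toward the wall when close to it.
# All parameters and functional forms are hypothetical.
import math
import random

def simulate(steps=500, radius=1.0, step=0.02, persist=0.9,
             wall_zone=0.15, wall_gain=0.5, seed=1):
    rng = random.Random(seed)
    x, y, heading = 0.0, 0.0, 0.0
    traj = [(x, y)]
    for _ in range(steps):
        # Directional persistence: mostly keep heading, sometimes redraw it.
        if rng.random() > persist:
            heading = rng.uniform(-math.pi, math.pi)
        # Wall attraction: near the boundary, blend in a turn toward the
        # nearest wall point (radially outward).
        r = math.hypot(x, y)
        if radius - r < wall_zone and r > 0:
            outward = math.atan2(y, x)
            diff = math.atan2(math.sin(outward - heading),
                              math.cos(outward - heading))
            heading += wall_gain * diff
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        # Keep the fly inside the arena (turn back at the boundary).
        r = math.hypot(x, y)
        if r > radius:
            x, y = x * radius / r, y * radius / r
            heading += math.pi
        traj.append((x, y))
    return traj

traj = simulate()
near_wall = sum(math.hypot(px, py) > 0.85 for px, py in traj) / len(traj)
```

Fitting would then adjust the persistence probability and wall force until the model's radial distribution (summarized here by `near_wall`) matches the observed one.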
Affiliation(s)
- Benjamin Soibam, Department of Computer Science, University of Houston, Houston, Texas, United States of America
- Rachel L. Goldfeder, Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Claire Manson-Bishop, Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Rachel Gamblin, Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Scott D. Pletcher, University of Michigan Geriatrics Center, Department of Molecular and Integrative Physiology, University of Michigan, Ann Arbor, Michigan, United States of America
- Shishir Shah, Department of Computer Science, University of Houston, Houston, Texas, United States of America
- Gemunu H. Gunaratne, Department of Physics, University of Houston, Houston, Texas, United States of America
- Gregg W. Roman, Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America; Biology of Behavior Institute, University of Houston, Texas, United States of America
46
Abstract
In higher-order motion stimuli, the direction of object motion does not follow the direction of luminance change. Such stimuli could be generated by the wing movements of a flying butterfly and further complicated by its motion in and out of shadows. Human subjects readily perceive the direction of higher-order motion, although this stands in stark contrast to prevailing motion vision models. Flies and humans compute motion in similar ways, and because flies behaviorally track bars containing higher-order motion cues, they become an attractive model system for investigating the neurophysiology underlying higher-order motion sensitivity. We here use intracellular electrophysiology of motion-vision-sensitive neurons in the hoverfly lobula plate to quantify responses to stimuli containing higher-order motion. We show that motion sensitivity can be broken down into two separate streams, directionally coding for elementary motion and figure motion, respectively, and that responses to Fourier and theta motion can be predicted from these. The sensitivity is affected both by the stimulus' time course and by the neuron's underlying receptive field. Responses to preferred-direction theta motion are sexually dimorphic and particularly robust along the visual midline.
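The decomposition above (a directional elementary-motion stream and a figure-motion stream whose sum predicts responses to Fourier and theta motion) can be sketched as a weighted sum. The unit weights, the dummy response traces, and the sign convention for theta motion are hypothetical illustrations of the idea, not fitted values from the paper.

```python
import numpy as np

def predict_response(em, fm, w_em=1.0, w_fm=1.0):
    """Composite response as a weighted sum of the elementary-motion (EM)
    and figure-motion (FM) stream responses. The unit weights are
    hypothetical, not fitted values from the paper."""
    return w_em * np.asarray(em, dtype=float) + w_fm * np.asarray(fm, dtype=float)

# Dummy stream responses over four time bins (arbitrary units)
em = np.array([0.0, 1.0, 2.0, 1.0])
fm = np.array([0.5, 0.5, 0.5, 0.5])

# Fourier motion: luminance and figure move together, so the streams agree
fourier_pred = predict_response(em, fm)
# Theta motion: the internal luminance pattern moves opposite the figure,
# so the EM stream enters with reversed sign (an assumed sign convention)
theta_pred = predict_response(-em, fm)
```

The sign flip captures why theta motion is diagnostic: a pure elementary-motion detector would report the wrong direction, and only the figure-motion stream tracks the object itself.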
47
Saccadic tracking of targets mediated by the anterior-lateral eyes of jumping spiders. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2012; 198:411-7. [PMID: 22457074 DOI: 10.1007/s00359-012-0719-0] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2012] [Revised: 03/15/2012] [Accepted: 03/16/2012] [Indexed: 10/28/2022]
Abstract
The modular visual system of jumping spiders (Salticidae) divides characteristics such as high spatial acuity and wide-field motion detection between different pairs of eyes. A large pair of telescope-like anterior-median (AM) eyes is supported by 2-3 pairs of 'secondary' eyes, which provide almost 360 degrees of visual coverage at lower resolution. The AM retinae are moveable and can be pointed at stimuli within their range of motion, but salticids have to turn to bring targets into this frontal zone in the first place. We describe how the front-facing pair of secondary eyes (anterior lateral, AL) mediates this through a series of whole-body 'tracking saccades' in response to computer-generated stimuli. We investigated the 'response area' of the AL eyes and show a clear correspondence between the physical margins of the retina and stimulus position at the onset of the first saccade. Saccade frequency is maximal at the margin of the AL and AM fields of view. Furthermore, spiders markedly increase the velocity of higher-magnitude tracking saccades. As a result, the time during which vision is impaired by motion blur is kept at an almost constant low level, even during saccades of large magnitude.
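The final observation (larger saccades are executed faster, holding the blur period roughly constant) implies that mean angular velocity must scale linearly with saccade amplitude for a fixed duration. A back-of-envelope sketch, where the duration value is chosen purely for illustration and is not a measured salticid figure:

```python
def saccade_velocity(amplitude_deg, duration_s=0.1):
    """Mean angular velocity needed to complete a saccade of the given
    amplitude within a fixed duration. duration_s (the 'blur window')
    is an illustrative constant, not a measured value."""
    return amplitude_deg / duration_s

# A 6x larger saccade needs 6x the velocity, keeping the blur time equal
small = saccade_velocity(10.0)   # 10 deg in 0.1 s -> 100 deg/s
large = saccade_velocity(60.0)   # 60 deg in 0.1 s -> 600 deg/s
```

Under this scaling, amplitude divided by velocity (the blur time) is identical for every saccade, which is exactly the constant-impairment pattern the authors report.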