1. Hulse BK, Haberkern H, Franconville R, Turner-Evans D, Takemura SY, Wolff T, Noorman M, Dreher M, Dan C, Parekh R, Hermundstad AM, Rubin GM, Jayaraman V. A connectome of the Drosophila central complex reveals network motifs suitable for flexible navigation and context-dependent action selection. eLife 2021; 10:e66039. [PMID: 34696823] [PMCID: PMC9477501] [DOI: 10.7554/eLife.66039]
Abstract
Flexible behaviors over long timescales are thought to engage recurrent neural networks in deep brain regions, which are experimentally challenging to study. In insects, recurrent circuit dynamics in a brain region called the central complex (CX) enable directed locomotion, sleep, and context- and experience-dependent spatial navigation. We describe the first complete electron microscopy-based connectome of the Drosophila CX, including all its neurons and circuits at synaptic resolution. We identified new CX neuron types, novel sensory and motor pathways, and network motifs that likely enable the CX to extract the fly's head direction, maintain it with attractor dynamics, and combine it with other sensorimotor information to perform vector-based navigational computations. We also identified numerous pathways that may facilitate the selection of CX-driven behavioral patterns by context and internal state. The CX connectome provides a comprehensive blueprint necessary for a detailed understanding of network dynamics underlying sleep, flexible navigation, and state-dependent action selection.
Affiliation(s)
- Brad K Hulse, Hannah Haberkern, Romain Franconville, Daniel Turner-Evans, Shin-ya Takemura, Tanya Wolff, Marcella Noorman, Marisa Dreher, Chuntao Dan, Ruchi Parekh, Ann M Hermundstad, Gerald M Rubin, Vivek Jayaraman: Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
2. Supple JA, Pinto-Benito D, Khoo C, Wardill TJ, Fabian ST, Liu M, Pusdekar S, Galeano D, Pan J, Jiang S, Wang Y, Liu L, Peng H, Olberg RM, Gonzalez-Bellido PT. Binocular Encoding in the Damselfly Pre-motor Target Tracking System. Curr Biol 2020; 30:645-656.e4. [DOI: 10.1016/j.cub.2019.12.031]
3. Fu Q, Wang H, Hu C, Yue S. Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review. Artif Life 2019; 25:263-311. [PMID: 31397604] [DOI: 10.1162/artl_a_00297]
Abstract
Motion perception is a critical capability that shapes many aspects of insect life, including predator avoidance and foraging. A good number of motion detectors have been identified in insect visual pathways. Computational modeling of these motion detectors has not only provided effective solutions for artificial intelligence, but has also deepened our understanding of complex biological visual systems. Honed over millions of years of evolution, these biological mechanisms offer robust building blocks for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These motion perception models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees, and locusts, and the small-target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modeling studies, we summarize the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss the integration of multiple systems and the hardware realization of these bio-inspired motion perception models.
Affiliation(s)
- Qinbing Fu, Cheng Hu, Shigang Yue: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Hongxin Wang: University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
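The correlation-type elementary motion detector (EMD), the building block behind many of the models this review covers, can be sketched in a few lines. This is a minimal sketch: the first-order low-pass delay, stimulus frequency, and receptor phase offset below are illustrative choices, not parameters drawn from any of the cited models.

```python
import numpy as np

def lowpass(signal, tau=3.0):
    """First-order discrete low-pass filter, acting as the EMD's delay line."""
    out = np.zeros_like(signal)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + (signal[t - 1] - out[t - 1]) / tau
    return out

def emd_response(left, right, tau=3.0):
    """Hassenstein-Reichardt correlator: delayed input from one receptor
    multiplied with the undelayed neighbour, minus the mirror-image term."""
    return lowpass(left, tau) * right - left * lowpass(right, tau)

# two neighbouring photoreceptors sampling a drifting sinusoidal grating
t = np.arange(200)

def receptors(direction, omega=0.2, phase_shift=0.5):
    left = np.sin(omega * t)
    right = np.sin(omega * t - direction * phase_shift)
    return left, right

rightward = emd_response(*receptors(+1)).mean()   # preferred direction
leftward = emd_response(*receptors(-1)).mean()    # null direction
```

The time-averaged output has opposite sign for the two motion directions, but, as the review stresses, it also depends on pattern contrast and spatial frequency rather than on velocity alone.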
4. Nityananda V, Read JCA. Stereopsis in animals: evolution, function and mechanisms. J Exp Biol 2018; 220:2502-2512. [PMID: 28724702] [PMCID: PMC5536890] [DOI: 10.1242/jeb.143883]
Abstract
Stereopsis is the computation of depth information from views acquired simultaneously from different points in space. For many years, stereopsis was thought to be confined to primates and other mammals with front-facing eyes. However, stereopsis has now been demonstrated in many other animals, including lateral-eyed prey mammals, birds, amphibians and invertebrates. The diversity of animals known to have stereo vision allows us to begin to investigate ideas about its evolution and the underlying selective pressures in different animals. It also further prompts the question of whether all animals have evolved essentially the same algorithms to implement stereopsis. If so, this must be the best way to do stereo vision, and should be implemented by engineers in machine stereopsis. Conversely, if animals have evolved a range of stereo algorithms in response to different pressures, that could inspire novel forms of machine stereopsis appropriate for distinct environments, tasks or constraints. As a first step towards addressing these ideas, we here review our current knowledge of stereo vision in animals, with a view towards outlining common principles about the evolution, function and mechanisms of stereo vision across the animal kingdom. We conclude by outlining avenues for future work, including research into possible new mechanisms of stereo vision, with implications for machine vision and the role of stereopsis in the evolution of camouflage. Summary: Stereopsis has evolved independently in different animals. We review the various functions it serves and the variety of mechanisms that could underlie stereopsis in different species.
Affiliation(s)
- Vivek Nityananda: Wissenschaftskolleg zu Berlin, Institute for Advanced Study, Wallotstraße 19, 14193 Berlin, Germany; Newcastle University, Institute of Neuroscience, Henry Wellcome Building, Framlington Place, Newcastle Upon Tyne NE2 4HH, UK
- Jenny C A Read: Newcastle University, Institute of Neuroscience, Henry Wellcome Building, Framlington Place, Newcastle Upon Tyne NE2 4HH, UK
5. Binocular Neuronal Processing of Object Motion in an Arthropod. J Neurosci 2018; 38:6933-6948. [PMID: 30012687] [DOI: 10.1523/jneurosci.3641-17.2018]
Abstract
Animals use binocular information to guide many behaviors. In highly visual arthropods, complex binocular computations involved in processing the panoramic optic flow generated during self-motion occur in the optic neuropils. However, the extent to which binocular processing of object motion occurs in these neuropils remains unknown. We investigated this in a crab, where the distance between the eyes and the extensive overlap of their visual fields argue for the use of binocular processing. By performing in vivo intracellular recordings from the lobula (third optic neuropil) of male crabs, we assessed responses of object-motion-sensitive neurons to ipsilateral or contralateral moving objects under binocular and monocular conditions. Most recorded neurons responded to stimuli seen independently with either eye, proving that each lobula receives profuse visual information from both eyes. The contribution of each eye to the binocular response varies among neurons, from those receiving comparable inputs from both eyes to those with mainly ipsilateral or contralateral components, some including contralateral inhibition. Electrophysiological profiles indicated that a similar number of neurons were recorded from their input side as from their output side. In monocular conditions, the first group showed shorter response delays to ipsilateral than to contralateral stimulation, whereas the second group showed the opposite. These results fit well with neurons conveying centripetal and centrifugal information from and toward the lobula, respectively. Intracellular and massive stainings provided anatomical support for this and for direct connections between the two lobulae, but simultaneous recordings failed to reveal such connections. Simplified model circuits of interocular connections are discussed.
Significance statement: Most active animals are equipped with two eyes, which contributes to functions such as depth perception, spatial localization of objects, and motion processing, all used to guide behavior. In visually active arthropods, binocular neural processing of the panoramic optic flow generated during self-motion already happens in the optic neuropils. However, whether binocular processing of single-object motion occurs in these neuropils remained unknown. We investigated this in a crab, in which motion-sensitive neurons of the lobula can be recorded in the intact animal. Here we demonstrate that different classes of lobula neurons compute binocular information. Our results provide new insight into where and how the visual information acquired by the two eyes is first combined in the brain of an arthropod.
6. Bertrand OJN, Lindemann JP, Egelhaaf M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput Biol 2015; 11:e1004339. [PMID: 26583771] [PMCID: PMC4652890] [DOI: 10.1371/journal.pcbi.1004339]
Abstract
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, when searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of optic flow on a spherical eye experienced during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e., by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e., during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and, thus, do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to the collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors at its input. Even then, the algorithm successfully led to collision avoidance and, in addition, replicated the characteristics of the collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
Affiliation(s)
- Martin Egelhaaf: Neurobiologie & CITEC, Bielefeld University, Bielefeld, Germany
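The step from a depth map to a steering direction can be illustrated with a 2-D toy version of such a model: nearness-weighted viewing directions are summed, and the agent steers away from the resultant, loosely in the spirit of the article's nearness-vector scheme. The obstacle position, width, and baseline nearness below are made-up values for illustration, not quantities from the article.

```python
import numpy as np

# viewing directions sampled around the azimuth (a 2-D stand-in for a spherical eye)
n_dirs = 360
angles = np.linspace(-np.pi, np.pi, n_dirs, endpoint=False)

# hypothetical nearness (inverse distance) map: one obstacle 30 degrees to the left
obstacle_az, obstacle_width = np.deg2rad(-30.0), np.deg2rad(20.0)
nearness = 0.1 + 0.9 * np.exp(-((angles - obstacle_az) ** 2) / (2 * obstacle_width ** 2))

# sum the nearness-weighted viewing vectors; steering away from the resultant
# yields a collision avoidance direction
obstacle_vec = np.array([np.sum(nearness * np.cos(angles)),
                         np.sum(nearness * np.sin(angles))])
avoidance_deg = np.rad2deg(np.arctan2(-obstacle_vec[1], -obstacle_vec[0]))
```

With the obstacle ahead-left, the avoidance direction points behind-right; blending such a direction with a goal direction corresponds to the goal-driven variant the article tests in cluttered environments.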
7. Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 2014; 8:127. [PMID: 25389392] [PMCID: PMC4211400] [DOI: 10.3389/fncir.2014.00127]
Abstract
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate through cluttered environments with highly aerobatic flight maneuvers. To accomplish this extraordinary performance, they rely on the spatial information contained in the retinal motion patterns induced on the eyes while moving around ("optic flow"). To do so, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases in which gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments, we conclude that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
Affiliation(s)
- Martin Egelhaaf, Roland Kern, Jens Peter Lindemann: Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
8. Lich M, Bremmer F. Self-motion perception in the elderly. Front Hum Neurosci 2014; 8:681. [PMID: 25309379] [PMCID: PMC4163979] [DOI: 10.3389/fnhum.2014.00681]
Abstract
Self-motion through space generates a visual pattern called optic flow, which can be used to determine one's direction of self-motion (heading). Previous studies have shown that this perceptual ability, which is of critical importance in everyday life, changes with age. In most of these studies, subjects were asked to judge whether they appeared to be heading to the left or right of a target; thresholds were found to increase continuously with age. In our current study, we were interested in absolute rather than relative heading judgments, and in a potential neural correlate of the age-related deterioration of heading perception. Two groups, older test subjects and younger controls, were shown optic flow stimuli in a virtual-reality setup. Visual stimuli simulated self-motion through a 3-D cloud of dots, and subjects had to indicate their perceived heading direction after each trial. In different subsets of experiments we varied individually relevant stimulus parameters: presentation time, number of dots in the display, stereoscopic vs. non-stereoscopic stimulation, and motion coherence. We found decrements in heading performance with age for each stimulus parameter. In a final step, we aimed to determine a putative neural basis of this behavioral decline. To this end, we modified a neural network model which had previously proven capable of reproducing and predicting certain aspects of heading perception. We show that the observed data can be modeled by implementing an age-related loss of neurons in this network. We conclude that a continuous decline of certain aspects of motion perception, among them heading, might be based on an age-related progressive loss of groups of neurons activated by visual motion.
Affiliation(s)
- Matthias Lich, Frank Bremmer: Department Neurophysics, Philipps-Universität Marburg, Marburg, Germany
9. Kress D, Egelhaaf M. Impact of stride-coupled gaze shifts of walking blowflies on the neuronal representation of visual targets. Front Behav Neurosci 2014; 8:307. [PMID: 25309362] [PMCID: PMC4164030] [DOI: 10.3389/fnbeh.2014.00307]
Abstract
During locomotion, animals rely heavily on visual cues gained from the environment to guide their behavior. Examples are basic behaviors like collision avoidance or the approach to a goal. The saccadic gaze strategy of flying flies, which separates translational from rotational phases of locomotion, has been suggested to facilitate the extraction of environmental information, because only image flow evoked by translational self-motion contains relevant distance information about the surrounding world. In contrast to the translational phases of flight, during which gaze direction is kept largely constant, walking flies experience continuous rotational image flow that is coupled to their stride-cycle. The consequences of these self-produced image shifts for the extraction of environmental information are still unclear. To assess the impact of stride-coupled image shifts on visual information processing, we performed electrophysiological recordings from the HSE cell, a motion sensitive wide-field neuron in the blowfly visual system. This cell is thought to play a key role in mediating optomotor behavior, self-motion estimation and spatial information processing. We used visual stimuli that were based on the visual input experienced by walking blowflies while approaching a black vertical bar. The response of HSE to these stimuli was dominated by periodic membrane potential fluctuations evoked by stride-coupled image shifts. Nevertheless, during the approach the cell's response contained information about the bar and its background. The response components evoked by the bar were larger than the responses to its background, especially during the last phase of the approach. However, as revealed by targeted modifications of the visual input during walking, the extraction of distance information on the basis of HSE responses is substantially impaired by stride-coupled retinal image shifts. Possible mechanisms that may cope with these stride-coupled responses are discussed.
Affiliation(s)
- Daniel Kress, Martin Egelhaaf: Department of Neurobiology, Bielefeld University, Bielefeld, Germany; CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
10. Schwegmann A, Lindemann JP, Egelhaaf M. Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Front Comput Neurosci 2014; 8:83. [PMID: 25136314] [PMCID: PMC4118023] [DOI: 10.3389/fncom.2014.00083]
Abstract
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
Affiliation(s)
- Martin Egelhaaf: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
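The geometry behind this result can be sketched directly: during pure translation at constant speed, local image speed is the product of forward speed, nearness (inverse distance), and the sine of the viewing angle relative to the heading, so a contrast-weighted motion signal divided by the known geometry factor recovers contrast-weighted nearness. The depth and contrast profiles below are invented for illustration, and the "motion energy" stand-in omits the texture dependence and velocity nonlinearity of the real EMDs analyzed in the article.

```python
import numpy as np

# viewing directions between the flow field's poles (focus of expansion/contraction)
angles = np.linspace(0.05, np.pi - 0.05, 50)

speed = 0.3                                  # forward translation speed
nearness = 1.0 / np.linspace(0.5, 5.0, 50)   # hypothetical inverse-distance profile
contrast = np.linspace(1.0, 0.2, 50)         # hypothetical local pattern contrast

# translational optic flow magnitude: speed * sin(azimuth) * nearness
flow = speed * np.sin(angles) * nearness

# crude EMD stand-in: motion energy proportional to contrast-weighted flow
motion_energy = contrast * np.abs(flow)

# dividing out the known self-motion geometry leaves contrast-weighted nearness
recovered = motion_energy / (speed * np.sin(angles))
```

In this idealized setting the recovered profile equals contrast times nearness exactly; the article's point is that real EMD responses approximate this quantity well enough to highlight the contours of nearby objects.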
11. Cooperative integration and representation underlying bilateral network of fly motion-sensitive neurons. PLoS One 2014; 9:e85790. [PMID: 24465711] [PMCID: PMC3900430] [DOI: 10.1371/journal.pone.0085790]
Abstract
How is binocular motion information integrated in the bilateral network of wide-field motion-sensitive neurons, the lobula plate tangential cells (LPTCs), in the visual system of flies? It is possible to construct an accurate model of this network because a complete picture of its synaptic interactions has been experimentally identified. We investigated the cooperative behavior of the network of horizontal LPTCs underlying the integration of binocular motion information, and the information representation in the bilateral LPTC network, through numerical simulations on the network model. First, we qualitatively reproduced the rotational-motion-sensitive response of the H2 cell previously reported in in vivo experiments and ascertained that it can be accounted for by the cooperative behavior of the bilateral network, mainly via interhemispheric electrical coupling. We demonstrated that the response properties of single H1 and Hu cells, unlike H2 cells, are not influenced by motion stimuli in the contralateral visual hemifield, but that the correlations between these cell activities are enhanced by rotational motion stimuli. We next examined whole-population activity by performing principal component analysis (PCA) on the population activities of the simulated LPTCs. We show that the two orthogonal patterns of correlated population activity given by the first two principal components represent rotational and translational motion, respectively, and that, similar to the H2 cell, rotational motion produces a stronger response in the network than translational motion. Furthermore, we found that these population-coding properties are strongly influenced by the interhemispheric electrical coupling. Finally, to test the generality of our conclusions, we used a more simplified model and verified that the numerical results are not specific to the particular network model we constructed.
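The PCA step can be mimicked with a toy bilateral population: units on each side follow their own eye's horizontal motion, rotation drives the two eyes with the same sign, and translation with opposite signs. The cell counts, gains, and noise levels are arbitrary assumptions, and the linear units below stand in for, but do not model, real LPTCs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# toy bilateral population: 10 units driven by the left eye, 10 by the right
gains = rng.uniform(0.5, 1.5, 20)

# rotation: same-signed motion on both eyes; translation: opposite-signed
labels = rng.integers(0, 2, n_trials)            # 1 = rotation, 0 = translation
left = rng.normal(1.0, 0.1, n_trials)
right = np.where(labels == 1, left, -left) + rng.normal(0.0, 0.1, n_trials)

# each unit linearly follows its own eye, plus private noise
resp = np.empty((n_trials, 20))
resp[:, :10] = np.outer(left, gains[:10])
resp[:, 10:] = np.outer(right, gains[10:])
resp += rng.normal(0.0, 0.05, resp.shape)

# PCA via SVD on the mean-centered population activity
centered = resp - resp.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T   # projections onto the first two principal components
```

Rotation and translation trials separate along the leading component, echoing the article's finding that correlated patterns of population activity carry the rotational and translational motion signals.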
12. Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Front Behav Neurosci 2013; 6:92. [PMID: 23335890] [PMCID: PMC3542507] [DOI: 10.3389/fnbeh.2012.00092]
Abstract
Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the previous translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture dependence of elementary motion detection in flies, reflected also in the responses of large-field neurons as used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Even subtle changes in texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions, or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independently of the textural properties of the environment.
13. Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913] [PMCID: PMC3526811] [DOI: 10.3389/fncir.2012.00108]
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. In solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown to actively shape, through characteristic behavioral actions, the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment, and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful in finding technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
14
Rien D, Kern R, Kurtz R. Octopaminergic modulation of contrast gain adaptation in fly visual motion-sensitive neurons. Eur J Neurosci 2012; 36:3030-9. [PMID: 22775326 DOI: 10.1111/j.1460-9568.2012.08216.x] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Abstract
Locomotor activity such as walking or flying has recently been shown to alter visual processing in several species. In insects, the neuromodulator octopamine is thought to play an important role in mediating state changes during locomotion [K.D. Longden & H.G. Krapp (2009) J. Neurophysiol., 102, 3606-3618; (2010) Front. Syst. Neurosci., 4, 153; S.N. Jung et al. (2011) J. Neurosci., 31, 9231-9237]. Here, we used the octopamine agonist chlordimeform (CDM) to mimic effects of behavioural state changes on visual motion processing. We recorded from identified motion-sensitive visual interneurons in the lobula plate of the blowfly Calliphora vicina. In these neurons, which are thought to be involved in visual guidance of locomotion, motion adaptation leads to a prominent attenuation of contrast sensitivity. Following CDM application, the neurons maintained high contrast sensitivity in the adapted state. This modulation of contrast gain adaptation was independent of the activity of the recorded neurons, because it was also present after stimulation with visual motion that did not cause deviations from the neurons' resting activity. We conclude that CDM affects presynaptic inputs of the recorded neurons. Accordingly, the effect of CDM was weak when adapting and test stimuli were presented in different parts of the receptive field, stimulating separate populations of local presynaptic neurons. In the peripheral visual system, adaptation depends on the temporal frequency of the stimulus pattern and is therefore related to pattern velocity. Contrast gain adaptation could therefore be the basis for a shift in the velocity tuning that was previously suggested to contribute to state-dependent processing of visual motion information in the lobula plate interneurons.
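The contrast gain adaptation described above can be caricatured as a Hill-type contrast-response function whose semi-saturation contrast rises with adaptation; the CDM condition is then modeled as simply blocking that rise. A hypothetical toy model (functional form, exponents, and parameter values are all illustrative assumptions, not the study's analysis):

```python
import numpy as np

# Toy contrast-response model: a Hill function whose semi-saturation
# contrast c50 increases with motion adaptation.  Octopaminergic
# modulation (the CDM condition) is modeled as preventing that increase,
# so contrast sensitivity is preserved in the adapted state.
def contrast_response(c, c50, n=2.0):
    """Normalized response to a stimulus of contrast c."""
    return c**n / (c**n + c50**n)

def adapted_c50(c50_rest, adaptation, cdm=False):
    """c50 after adaptation; under CDM the shift is suppressed."""
    return c50_rest if cdm else c50_rest * (1.0 + adaptation)

c = 0.2                                                    # test contrast
r_rest    = contrast_response(c, adapted_c50(0.1, 0.0))
r_adapted = contrast_response(c, adapted_c50(0.1, 3.0))            # attenuated
r_cdm     = contrast_response(c, adapted_c50(0.1, 3.0, cdm=True))  # preserved
assert r_adapted < r_rest and np.isclose(r_cdm, r_rest)
```

In this sketch, adaptation attenuates the response to a fixed test contrast by shifting c50 rightward, while the CDM condition leaves the contrast-response curve at its unadapted position, matching the qualitative finding.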
Affiliation(s)
- Diana Rien
- Department of Neurobiology, Faculty of Biology, Bielefeld University, PO Box 10 01 31, 33501 Bielefeld, Germany
15
Hennig P, Egelhaaf M. Neuronal encoding of object and distance information: a model simulation study on naturalistic optic flow processing. Front Neural Circuits 2012; 6:14. [PMID: 22461769 PMCID: PMC3309705 DOI: 10.3389/fncir.2012.00014] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2011] [Accepted: 03/05/2012] [Indexed: 11/13/2022] Open
Abstract
We developed a model of the input circuitry of the FD1 cell, an identified motion-sensitive interneuron in the blowfly's visual system. The model circuit successfully reproduces the FD1 cell's most conspicuous property: its larger responses to objects than to spatially extended patterns. The model circuit also mimics the time-dependent responses of FD1 to dynamically complex naturalistic stimuli, shaped by the blowfly's saccadic flight and gaze strategy: the FD1 responses are enhanced when, as a consequence of self-motion, a nearby object crosses the receptive field during intersaccadic intervals. Moreover, the model predicts that these object-induced responses are overlaid by pronounced pattern-dependent fluctuations during movements on virtual test flights in a three-dimensional environment with systematic modifications of the environmental patterns. Hence, the FD1 cell is predicted not only to detect objects defined by the spatial layout of the environment, but also to be sensitive to objects distinguished by textural features. These ambiguous detection abilities suggest that information about objects - irrespective of the features by which the objects are defined - is encoded by a population of cells, with the FD1 cell presumably playing a prominent role in such an ensemble.
Affiliation(s)
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology”, Bielefeld University, Bielefeld, Germany
16
Geurten BRH, Kern R, Egelhaaf M. Species-Specific Flight Styles of Flies are Reflected in the Response Dynamics of a Homolog Motion-Sensitive Neuron. Front Integr Neurosci 2012; 6:11. [PMID: 22485089 PMCID: PMC3307035 DOI: 10.3389/fnint.2012.00011] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2011] [Accepted: 02/28/2012] [Indexed: 11/22/2022] Open
Abstract
Hoverflies and blowflies have distinctly different flight styles. Yet, both species have been shown to structure their flight behavior in a way that facilitates extraction of 3D information from the image flow on the retina (optic flow). Neuronal candidates to analyze the optic flow are the tangential cells in the third optic ganglion - the lobula complex. These neurons are directionally selective and integrate the optic flow over large parts of the visual field. Homolog tangential cells in hoverflies and blowflies have a similar morphology. Because blowflies and hoverflies have a similar neuronal layout but distinctly different flight behaviors, they are an ideal substrate for pinpointing potential neuronal adaptations to the different flight styles. In this article we describe the relationship between locomotion behavior and motion vision on three different levels: (1) We compare the different flight styles based on the categorization of flight behavior into prototypical movements. (2) We measure the species-specific dynamics of the optic flow under naturalistic flight conditions. We found the translational optic flow of the two species to be very different. (3) We describe possible adaptations of a homolog motion-sensitive neuron. We stimulate this cell in blowflies (Calliphora) and hoverflies (Eristalis) with naturalistic optic flow generated by both species during free flight. The characterized hoverfly tangential cell responds faster to transient changes in the optic flow than its blowfly homolog. It is discussed whether and how the different dynamical response properties aid optic flow analysis.
Affiliation(s)
- Bart R. H. Geurten
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Department of Cellular Neurobiology, Johann-Friedrich-Blumenbach Institute for Zoology and Anthropology, Georg-August-University Göttingen, Göttingen, Lower Saxony, Germany
- Roland Kern
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
17
Liang P, Heitwerth J, Kern R, Kurtz R, Egelhaaf M. Object representation and distance encoding in three-dimensional environments by a neural circuit in the visual system of the blowfly. J Neurophysiol 2012; 107:3446-57. [PMID: 22423002 DOI: 10.1152/jn.00530.2011] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Three motion-sensitive key elements of a neural circuit, presumably involved in processing object and distance information, were analyzed with optic flow sequences as experienced by blowflies in a three-dimensional environment. This optic flow is largely shaped by the blowfly's saccadic flight and gaze strategy, which separates translational flight segments from fast saccadic rotations. By modifying this naturalistic optic flow, all three analyzed neurons could be shown to respond during the intersaccadic intervals not only to nearby objects but also to changes in the distance to background structures. In the presence of strong background motion, the three types of neuron differ in their sensitivity to object motion. Object-induced response increments are largest in FD1, a neuron long known to respond better to moving objects than to spatially extended motion patterns, but weakest in VCH, a neuron that integrates wide-field motion from both eyes and, by inhibiting the FD1 cell, is responsible for its object preference. Small but significant object-induced response increments are present in HS cells, which serve both as a major input neuron of VCH and as output neurons of the visual system. In both HS and FD1, intersaccadic background responses decrease with increasing distance to the animal, although much more prominently in FD1. This strong dependence of FD1 on background distance is concluded to be a consequence of VCH, which dramatically increases its activity, and thus its inhibitory strength, with increasing distance.
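The circuit motif described above - wide-field pooling by VCH inhibiting FD1, so that FD1 prefers small objects over extended patterns - can be sketched in a few lines. This is a highly simplified caricature; the rectification, pooling rule, and weight are illustrative assumptions, not measured values:

```python
import numpy as np

# Toy FD1/VCH motif: VCH pools motion over the whole visual field and
# inhibits FD1.  A small object drives one local input strongly while
# keeping the pooled (inhibitory) signal weak; wide-field motion drives
# the pooled inhibition as strongly as the local excitation.
def relu(x):
    """Half-wave rectification (assumed output nonlinearity)."""
    return np.maximum(x, 0.0)

def fd1_response(local_motion, w_inh=1.0):
    """local_motion: array of local motion signals across the field."""
    vch = local_motion.mean()                 # wide-field pooling stage
    return relu(local_motion.max() - w_inh * vch)

small_object = np.array([1.0, 0.0, 0.0, 0.0])  # motion in one patch only
wide_field   = np.array([1.0, 1.0, 1.0, 1.0])  # spatially extended motion
assert fd1_response(small_object) > fd1_response(wide_field)
```

The same structure also hints at the distance dependence reported above: the nearer the background, the larger the intersaccadic wide-field flow, the stronger the pooled VCH signal, and hence the stronger the inhibition of FD1.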
Affiliation(s)
- Pei Liang
- Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
18
Duistermars BJ, Care RA, Frye MA. Binocular interactions underlying the classic optomotor responses of flying flies. Front Behav Neurosci 2012; 6:6. [PMID: 22375108 PMCID: PMC3284692 DOI: 10.3389/fnbeh.2012.00006] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2011] [Accepted: 02/08/2012] [Indexed: 11/25/2022] Open
Abstract
In response to imposed course deviations, the optomotor reactions of animals reduce motion blur and facilitate the maintenance of stable body posture. In flies, many anatomical and electrophysiological studies suggest that disparate motion cues stimulating the left and right eyes are not processed in isolation but rather are integrated in the brain to produce a cohesive panoramic percept. To investigate the strength of such inter-ocular interactions and their role in compensatory sensory–motor transformations, we utilize a virtual reality flight simulator to record wing and head optomotor reactions by tethered flying flies in response to imposed binocular rotation and monocular front-to-back and back-to-front motion. Within a narrow range of stimulus parameters that generates large contrast insensitive optomotor responses to binocular rotation, we find that responses to monocular front-to-back motion are larger than those to panoramic rotation, but are contrast sensitive. Conversely, responses to monocular back-to-front motion are slower than those to rotation and peak at the lowest tested contrast. Together our results suggest that optomotor responses to binocular rotation result from the influence of non-additive contralateral inhibitory as well as excitatory circuit interactions that serve to confer contrast insensitivity to flight behaviors influenced by rotatory optic flow.
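One way to picture the non-additive binocular interaction proposed above is a divisive contralateral normalization: monocular responses grow with contrast, but for binocular rotation the combined signal is normalized and saturates, conferring contrast insensitivity. This is a hypothetical sketch of that idea only (functional forms and constants are assumptions, not the study's model):

```python
# Toy binocular-integration sketch: monocular motion signals scale with
# contrast, but the response to binocular rotation passes through a
# divisive (non-additive) contralateral interaction that saturates,
# making it nearly contrast-insensitive.
def monocular_signal(contrast, gain=1.0):
    """Contrast-sensitive motion signal from one eye."""
    return gain * contrast

def rotation_response(c_left, c_right):
    """Binocular rotation: divisively normalized, saturating combination."""
    s = monocular_signal(c_left) + monocular_signal(c_right)
    return s / (0.01 + s)

def monocular_response(contrast):
    """Monocular motion: no contralateral normalization, stays sensitive."""
    return monocular_signal(contrast)

lo, hi = 0.1, 0.9
# Rotation responses barely change across contrast; monocular ones do.
assert abs(rotation_response(lo, lo) - rotation_response(hi, hi)) < 0.1
assert monocular_response(hi) > 2 * monocular_response(lo)
```

The qualitative point matches the behavioral result: responses that depend on signals from both eyes can be buffered against contrast, while purely monocular responses remain contrast sensitive.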
Affiliation(s)
- Brian J Duistermars
- Department of Physiological Science, Howard Hughes Medical Institute, University of California Los Angeles, Los Angeles, CA, USA