1. Zheng Z, Guo A, Wu Z. Moving object detection based on bioinspired background subtraction. Bioinspir Biomim 2024; 19:056002. PMID: 38917814. DOI: 10.1088/1748-3190/ad5ba3.
Abstract
Flying insects rely mainly upon visual motion to detect and track objects. There has been much research on fly-inspired algorithms for object detection, but few have been developed based on visual motion alone. One of the daunting difficulties is that the neural and circuit mechanisms underlying foreground-background segmentation are still unclear. Our previous modeling study proposed that the lobula held parallel pathways with distinct directional selectivity, each of which could retinotopically discriminate figures moving in its own preferred direction based on relative motion cues. The previous model, however, did not address how the multiple parallel pathways converged to a single detection output at their common downstream stage. Since the preferred directions of the pathways along either the horizontal or the vertical axis were opposite to each other, a background moving in the direction opposite to an object also activated the corresponding lobula pathway. Indiscriminate or ungated projection from all the pathways to their common downstream stage would mix objects with the moving background, making the previous model fail with non-stationary backgrounds. Here, we extend the previous model by proposing that background motion-dependent gating of individual lobula projections is the key to object detection. Large-field lobula plate tangential cells are hypothesized to perform this gating, realizing bioinspired background subtraction. The model is shown to be capable of robustly detecting moving objects in video sequences recorded with either a static camera or a moving camera that induces translational optic flow. The model sheds light on the potential of the concise fly algorithm in real-world applications.
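The gating scheme above is specific to the authors' lobula model, but the background-subtraction principle it draws on can be illustrated with a conventional running-average scheme. This is a generic sketch, not the paper's method; `alpha` and `threshold` are arbitrary illustrative values:

```python
import numpy as np

def detect_moving_objects(frames, alpha=0.05, threshold=25.0):
    """Generic running-average background subtraction (illustrative only;
    not the lobula-pathway model described in the paper)."""
    background = frames[0].astype(float)
    masks = []
    for frame in frames[1:]:
        frame = frame.astype(float)
        # Pixels that deviate strongly from the background estimate are foreground.
        mask = np.abs(frame - background) > threshold
        # Slowly adapt the background estimate toward the current frame.
        background = (1 - alpha) * background + alpha * frame
        masks.append(mask)
    return masks
```

The design choice worth noting is the adaptation rate `alpha`: too high and slow-moving objects are absorbed into the background, too low and lighting changes are flagged as motion.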
Affiliation(s)
- Zhu'anzhen Zheng: School of Life Sciences, Shanghai University, Shanghai 200444, People's Republic of China
- Aike Guo: School of Life Sciences, Shanghai University, Shanghai 200444, People's Republic of China; International Academic Center of Complex Systems, Advanced Institute of Natural Sciences, Beijing Normal University at Zhuhai, Zhuhai, Guangdong 519087, People's Republic of China
- Zhihua Wu: School of Life Sciences, Shanghai University, Shanghai 200444, People's Republic of China
2. Li L, Zhang Z, Lu J. Artificial fly visual joint perception neural network inspired by multiple-regional collision detection. Neural Netw 2020; 135:13-28. PMID: 33338802. DOI: 10.1016/j.neunet.2020.11.018.
Abstract
The biological visual system includes multiple types of motion-sensitive neurons that preferentially respond to specific perceptual regions. However, it remains an open question how such neurons can be exploited to construct bio-inspired computational models for multiple-regional collision detection. To fill this gap, this work proposes a visual joint perception neural network with two subnetworks, a presynaptic and a postsynaptic neural network, inspired by the preferential-perception characteristics of three horizontal and vertical motion-sensitive neurons. Building on this neural network and three hazard-detection mechanisms, an artificial fly visual synthesized collision detection model for multiple-regional collision detection is developed to monitor possible danger in cases where one or more moving objects appear in the whole field of view. The experiments support two conclusions: (i) the acquired neural network can effectively display the characteristics of visual movement, and (ii) the collision detection model, which outperforms the compared models, can effectively perform multiple-regional collision detection at a high success rate, and takes only about 0.24 s to complete collision detection for each virtual or actual image frame with resolution 110×60.
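Hazard detection in models of this family is commonly grounded in the looming cue: the angular size of an approaching object grows, and the ratio of size to expansion rate estimates time to contact. A minimal time-to-contact sketch, not the paper's joint-perception network; `dt` and `tau_threshold` are assumed values:

```python
def looming_alarm(angular_sizes, dt=0.1, tau_threshold=1.0):
    """Flag imminent collision when estimated time-to-contact
    (angular size divided by its expansion rate) drops below a threshold.
    Minimal looming sketch, not the paper's detection model."""
    alarms = []
    for prev, curr in zip(angular_sizes, angular_sizes[1:]):
        rate = (curr - prev) / dt
        if rate <= 0:
            # Shrinking or constant size: object is receding, no alarm.
            alarms.append(False)
            continue
        tau = curr / rate  # estimated time to contact, in seconds
        alarms.append(tau < tau_threshold)
    return alarms
```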
Affiliation(s)
- Lun Li: College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China
- Zhuhong Zhang: College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China
- Jiaxuan Lu: College of Big Data and Information Engineering, Guizhou University, Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computing, Guiyang, Guizhou 550025, PR China
3. Fu Q, Wang H, Hu C, Yue S. Towards computational models and applications of insect visual systems for motion perception: a review. Artif Life 2019; 25:263-311. PMID: 31397604. DOI: 10.1162/artl_a_00297.
Abstract
Motion perception is a critical capability underpinning many aspects of insects' lives, including avoiding predators and foraging. A good number of motion detectors have been identified in insect visual pathways. Computational modeling of these motion detectors has not only provided effective solutions for artificial intelligence but has also deepened the understanding of complicated biological visual systems. These biological mechanisms, shaped by millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These motion perception models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees, and locusts, and the small-target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modeling studies, we summarize the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-systems integration and hardware realization of these bio-inspired motion perception models.
Affiliation(s)
- Qinbing Fu: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Hongxin Wang: University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Cheng Hu: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Shigang Yue: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
4. Nityananda V, Read JCA. Stereopsis in animals: evolution, function and mechanisms. J Exp Biol 2017; 220:2502-2512. PMID: 28724702. PMCID: PMC5536890. DOI: 10.1242/jeb.143883.
Abstract
Stereopsis is the computation of depth information from views acquired simultaneously from different points in space. For many years, stereopsis was thought to be confined to primates and other mammals with front-facing eyes. However, stereopsis has now been demonstrated in many other animals, including lateral-eyed prey mammals, birds, amphibians and invertebrates. The diversity of animals known to have stereo vision allows us to begin to investigate ideas about its evolution and the underlying selective pressures in different animals. It also further prompts the question of whether all animals have evolved essentially the same algorithms to implement stereopsis. If so, this must be the best way to do stereo vision, and should be implemented by engineers in machine stereopsis. Conversely, if animals have evolved a range of stereo algorithms in response to different pressures, that could inspire novel forms of machine stereopsis appropriate for distinct environments, tasks or constraints. As a first step towards addressing these ideas, we here review our current knowledge of stereo vision in animals, with a view towards outlining common principles about the evolution, function and mechanisms of stereo vision across the animal kingdom. We conclude by outlining avenues for future work, including research into possible new mechanisms of stereo vision, with implications for machine vision and the role of stereopsis in the evolution of camouflage. Summary: Stereopsis has evolved independently in different animals. We review the various functions it serves and the variety of mechanisms that could underlie stereopsis in different species.
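Whatever algorithm a given species has evolved, the geometric computation underlying stereopsis is triangulation: depth follows from binocular disparity as depth = focal length × baseline / disparity. A textbook pinhole-camera sketch, independent of any particular species or machine implementation; the parameter values in the test are illustrative:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard pinhole-stereo triangulation: depth = f * B / d.
    disparity_px: horizontal disparity between the two views, in pixels.
    focal_px: focal length expressed in pixels.
    baseline_m: separation between the two viewpoints, in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Note the inverse relation: small disparities correspond to distant objects, which is why stereo acuity limits the useful working range of stereopsis in both animals and machines.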
Affiliation(s)
- Vivek Nityananda: Wissenschaftskolleg zu Berlin, Institute for Advanced Study, Wallotstraße 19, Berlin 14193, Germany; Newcastle University, Institute of Neuroscience, Henry Wellcome Building, Framlington Place, Newcastle Upon Tyne NE2 4HH, UK
- Jenny C. A. Read: Newcastle University, Institute of Neuroscience, Henry Wellcome Building, Framlington Place, Newcastle Upon Tyne NE2 4HH, UK
5. Li J, Lindemann JP, Egelhaaf M. Peripheral processing facilitates optic flow-based depth perception. Front Comput Neurosci 2016; 10:111. PMID: 27818631. PMCID: PMC5073142. DOI: 10.3389/fncom.2016.00111.
Abstract
Flying insects, such as flies or bees, rely on consistent information regarding the depth structure of the environment when performing their flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets or spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially when the vast range of light intensities encountered in natural environments is taken into account. This question is addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependence of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
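The correlation-type EMD mentioned here is the classic Hassenstein-Reichardt detector: each of two neighbouring inputs is delayed (low-pass filtered) and multiplied with the undelayed signal of the other, and the two mirror-symmetric half-detectors are subtracted. A minimal discrete-time sketch; the filter coefficient `tau` is an illustrative choice, not a value from the paper:

```python
def hassenstein_reichardt(left, right, tau=0.9):
    """Correlation-type elementary motion detector (EMD).
    Each input is passed through a first-order low-pass filter (the delay
    line), correlated with the undelayed neighbouring input, and the two
    mirror-symmetric half-detectors are subtracted. Positive output
    indicates motion from `left` toward `right`."""
    lp_left = lp_right = 0.0
    out = []
    for l, r in zip(left, right):
        # First-order low-pass filters serve as the delay elements.
        lp_left = tau * lp_left + (1 - tau) * l
        lp_right = tau * lp_right + (1 - tau) * r
        # Delayed-left times right, minus delayed-right times left.
        out.append(lp_left * r - lp_right * l)
    return out
```

Feeding the detector a pulse that reaches the left input one step before the right yields a net positive response; reversing the inputs flips the sign, which is the direction selectivity the abstract refers to.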
Affiliation(s)
- Jinglin Li: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
6. Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 2014; 8:127. PMID: 25389392. PMCID: PMC4211400. DOI: 10.3389/fncir.2014.00127.
Abstract
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate by highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information contained in the retinal motion patterns induced on the eyes while moving around ("optic flow") to accomplish their extraordinary performance. To do so, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases during which the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments, we conclude that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
Affiliation(s)
- Martin Egelhaaf: Department of Neurobiology and Center of Excellence "Cognitive Interaction Technology" (CITEC), Bielefeld University, Bielefeld, Germany
- Roland Kern: Department of Neurobiology and Center of Excellence "Cognitive Interaction Technology" (CITEC), Bielefeld University, Bielefeld, Germany
- Jens Peter Lindemann: Department of Neurobiology and Center of Excellence "Cognitive Interaction Technology" (CITEC), Bielefeld University, Bielefeld, Germany
7. Temporal statistics of natural image sequences generated by movements with insect flight characteristics. PLoS One 2014; 9:e110386. PMID: 25340761. PMCID: PMC4207754. DOI: 10.1371/journal.pone.0110386.
Abstract
Many flying insects, such as flies, wasps and bees, pursue a saccadic flight and gaze strategy. This behavioral strategy is thought to separate the translational and rotational components of self-motion and, thereby, to reduce the computational effort needed to extract information about the environment from the retinal image flow. Because of the distinctive dynamic features of this active flight and gaze strategy, the present study systematically analyzes the spatiotemporal statistics of image sequences generated during saccades and intersaccadic intervals in cluttered natural environments. We show that, in general, rotational movements with saccade-like dynamics elicit fluctuations and overall changes in brightness, contrast and spatial frequency that are up to two orders of magnitude larger than those caused by translational movements at velocities characteristic of insects. Distinct changes in image parameters during translations are only caused by nearby objects. Image analysis based on larger patches in the visual field reveals smaller fluctuations in brightness and spatial-frequency composition compared to small patches. The temporal structure and extent of these changes in image parameters define the temporal constraints imposed on signal processing performed by the insect visual system under behavioral conditions in natural environments.
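Two of the image parameters tracked in this kind of analysis, brightness and contrast, are simple patch statistics. A sketch of how they might be computed for a rectangular patch (illustrative only; the study's exact patch definitions and its spatial-frequency analysis are omitted):

```python
import numpy as np

def patch_statistics(image, patch):
    """Mean brightness and RMS contrast of a rectangular image patch.
    `patch` is (row_start, row_stop, col_start, col_stop). RMS contrast
    is the standard deviation of intensity normalised by the mean."""
    y0, y1, x0, x1 = patch
    region = image[y0:y1, x0:x1].astype(float)
    mean = region.mean()
    rms_contrast = region.std() / mean if mean > 0 else 0.0
    return mean, rms_contrast
```

Tracking these statistics frame by frame over small versus large patches reproduces the comparison described in the abstract: larger patches average out local structure and therefore fluctuate less.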
8. Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 2014; 8:335. PMID: 25309374. PMCID: PMC4173878. DOI: 10.3389/fnbeh.2014.00335.
Abstract
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help the animals acquire these memories at newly discovered foraging locations, where landmarks, salient objects in the vicinity of the goal location, can play an important role in guiding homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it was shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might, thus, play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses to a typical learning flight with responses to movies containing targeted modifications of landmark properties, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
Affiliation(s)
- Marcel Mertes: Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Laura Dittmar: Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf: Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Norbert Boeddeker: Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
9. Lich M, Bremmer F. Self-motion perception in the elderly. Front Hum Neurosci 2014; 8:681. PMID: 25309379. PMCID: PMC4163979. DOI: 10.3389/fnhum.2014.00681.
Abstract
Self-motion through space generates a visual pattern called optic flow, which can be used to determine one's direction of self-motion (heading). Previous studies have shown that this perceptual ability, which is of critical importance in everyday life, changes with age. In most of these studies, subjects were asked to judge whether they appeared to be heading to the left or right of a target; thresholds were found to increase continuously with age. In our current study, we were interested in absolute rather than relative heading judgments and in a potential neural correlate of the age-related deterioration of heading perception. Two groups, older test subjects and younger controls, were shown optic flow stimuli in a virtual-reality setup. Visual stimuli simulated self-motion through a 3-D cloud of dots, and subjects had to indicate their perceived heading direction after each trial. In different subsets of experiments we varied relevant stimulus parameters individually: presentation time, number of dots in the display, stereoscopic vs. non-stereoscopic stimulation, and motion coherence. We found decrements in heading performance with age for each stimulus parameter. In a final step we aimed to determine a putative neural basis of this behavioral decline. To this end we modified a neural network model which has previously proven capable of reproducing and predicting certain aspects of heading perception. We show that the observed data can be modeled by implementing an age-related loss of neurons in this network. We conclude that a continuous decline of certain aspects of motion perception, among them heading perception, might be based on an age-related progressive loss of groups of neurons activated by visual motion.
Affiliation(s)
- Matthias Lich: Department Neurophysics, Philipps-Universität Marburg, Marburg, Germany
- Frank Bremmer: Department Neurophysics, Philipps-Universität Marburg, Marburg, Germany
10. Kress D, Egelhaaf M. Impact of stride-coupled gaze shifts of walking blowflies on the neuronal representation of visual targets. Front Behav Neurosci 2014; 8:307. PMID: 25309362. PMCID: PMC4164030. DOI: 10.3389/fnbeh.2014.00307.
Abstract
During locomotion, animals rely heavily on visual cues gained from the environment to guide their behavior. Examples are basic behaviors such as collision avoidance or the approach to a goal. The saccadic gaze strategy of flying flies, which separates translational from rotational phases of locomotion, has been suggested to facilitate the extraction of environmental information, because only image flow evoked by translational self-motion contains relevant distance information about the surrounding world. In contrast to the translational phases of flight, during which gaze direction is kept largely constant, walking flies experience continuous rotational image flow that is coupled to their stride cycle. The consequences of these self-produced image shifts for the extraction of environmental information are still unclear. To assess the impact of stride-coupled image shifts on visual information processing, we performed electrophysiological recordings from the HSE cell, a motion-sensitive wide-field neuron in the blowfly visual system. This cell is thought to play a key role in mediating optomotor behavior, self-motion estimation and spatial information processing. We used visual stimuli based on the visual input experienced by walking blowflies while approaching a black vertical bar. The response of HSE to these stimuli was dominated by periodic membrane-potential fluctuations evoked by stride-coupled image shifts. Nevertheless, during the approach the cell's response contained information about the bar and its background. The response components evoked by the bar were larger than the responses to its background, especially during the last phase of the approach. However, as revealed by targeted modifications of the visual input during walking, the extraction of distance information on the basis of HSE responses is strongly impaired by stride-coupled retinal image shifts. Possible mechanisms that may cope with these stride-coupled responses are discussed.
Affiliation(s)
- Daniel Kress: Department of Neurobiology, Bielefeld University, Bielefeld, Germany; CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf: Department of Neurobiology, Bielefeld University, Bielefeld, Germany; CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
11. Schwegmann A, Lindemann JP, Egelhaaf M. Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Front Comput Neurosci 2014; 8:83. PMID: 25136314. PMCID: PMC4118023. DOI: 10.3389/fncom.2014.00083.
Abstract
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
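The motion-parallax relation underlying this analysis is standard: during pure translation at speed v, the flow magnitude at azimuth θ from the direction of travel is v·sin(θ)/D, so nearness 1/D can be read off from the local flow. A sketch of that geometric relation (a textbook formula, not the authors' EMD-based estimator):

```python
import math

def nearness_from_flow(angular_velocity, translation_speed, azimuth_rad):
    """Invert the translational flow-field equation
    flow = v * sin(theta) / D to recover nearness 1/D.
    angular_velocity: local image flow magnitude (rad/s).
    translation_speed: self-motion speed v (m/s).
    azimuth_rad: angle theta between viewing direction and travel direction."""
    s = math.sin(azimuth_rad)
    if abs(s) < 1e-9:
        # Along the direction of travel there is no translational parallax.
        raise ValueError("no parallax information along the direction of travel")
    return angular_velocity / (translation_speed * s)
```

This also makes explicit why the paper's result requires roughly constant translation velocity: without knowing v, flow only yields nearness up to a scale factor.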
Affiliation(s)
- Martin Egelhaaf: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
12. Berón de Astrada M, Bengochea M, Sztarker J, Delorenzi A, Tomsic D. Behaviorally related neural plasticity in the arthropod optic lobes. Curr Biol 2013; 23:1389-1398. PMID: 23831291. DOI: 10.1016/j.cub.2013.05.061.
Abstract
Background: Due to the complexity and variability of natural environments, the ability to adaptively modify behavior is of fundamental biological importance. Motion vision provides essential cues for guiding critical behaviors such as prey, predator, or mate detection. However, when confronted with the repeated sight of a moving object that turns out to be irrelevant, most animals will learn to ignore it. The neural mechanisms by which moving objects come to be ignored are unknown. Although many arthropods exhibit behavioral adaptation to repetitive moving objects, the underlying neural mechanisms have been difficult to study because of the challenge of recording activity from the small columnar neurons in peripheral motion-detection circuits.
Results: We developed an experimental approach in an arthropod for recording the calcium responses of visual neurons in vivo. We show that peripheral columnar neurons that convey visual information into the second optic neuropil persist in responding to the repeated presentation of an innocuous moving object. However, activity in the columnar neurons that convey visual information from the second to the third optic neuropil is suppressed during high-frequency stimulus repetitions. In accordance with the animal's behavioral changes, the suppression of neural activity is fast but short-lasting and restricted to the trained area of the retina.
Conclusions: Columnar neurons of the second optic neuropil are likely the main plastic locus responsible for the modifications in behavior when the animal is confronted with rapidly repeated object motion. Our results demonstrate that visually guided behaviors can be shaped by neural plasticity that occurs surprisingly early in the visual pathway.
Affiliation(s)
- Martín Berón de Astrada: Laboratorio de Neurobiología de la Memoria, Departamento Fisiología, Biología Molecular y Celular, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, IFIBYNE-CONICET, CP 1428 Buenos Aires, Argentina
13. Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Front Behav Neurosci 2013; 6:92. PMID: 23335890. PMCID: PMC3542507. DOI: 10.3389/fnbeh.2012.00092.
Abstract
Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the previous translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture-dependence of elementary motion detection in flies, reflected also in the responses of large-field neurons as used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Already subtle changes in the texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independent of the textural properties of the environment.
|
14
|
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913] [PMCID: PMC3526811] [DOI: 10.3389/fncir.2012.00108]
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown, through their characteristic behavioral actions, to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment, not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually guided flight control might be helpful in finding technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
|