1. Wang H, Zhao J, Wang H, Hu C, Peng J, Yue S. Attention and Prediction-Guided Motion Detection for Low-Contrast Small Moving Targets. IEEE Trans Cybern 2023; 53:6340-6352. [PMID: 35533156] [DOI: 10.1109/TCYB.2022.3170699]
Abstract
Small target motion detection within complex natural environments is an extremely challenging task for autonomous robots. Surprisingly, the visual systems of insects have evolved to be highly efficient in detecting mates and tracking prey, even though such targets subtend as little as a few degrees of their visual field. This excellent sensitivity to small target motion relies on a class of specialized neurons called small target motion detectors (STMDs). However, existing STMD-based models are heavily dependent on visual contrast and perform poorly in complex natural environments, where small targets generally exhibit extremely low contrast against neighboring backgrounds. In this article, we develop an attention-and-prediction-guided visual system to overcome this limitation. The developed visual system comprises three main subsystems: 1) an attention module; 2) an STMD-based neural network; and 3) a prediction module. The attention module searches for potential small targets in the predicted areas of the input image and enhances their contrast against the complex background. The STMD-based neural network receives the contrast-enhanced image and discriminates small moving targets from background false positives. The prediction module foresees future positions of the detected targets and generates a prediction map for the attention module. The three subsystems are connected in a recurrent architecture, allowing information to be processed sequentially to activate specific areas for small target detection. Extensive experiments on synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed visual system for detecting small, low-contrast moving targets in complex natural environments.
2. Wu Z, Guo A. Bioinspired figure-ground discrimination via visual motion smoothing. PLoS Comput Biol 2023; 19:e1011077. [PMID: 37083880] [PMCID: PMC10155969] [DOI: 10.1371/journal.pcbi.1011077]
Abstract
Flies detect and track moving targets among visual clutter, and this process mainly relies on visual motion. Visual motion is analyzed or computed with the pathway from the retina to T4/T5 cells. The computation of local directional motion was formulated as an elementary movement detector (EMD) model more than half a century ago. Solving target detection or figure-ground discrimination problems can be equivalent to extracting boundaries between a target and the background based on the motion discontinuities in the output of a retinotopic array of EMDs. Individual EMDs cannot measure true velocities, however, due to their sensitivity to pattern properties such as luminance contrast and spatial frequency content. It remains unclear how local directional motion signals are further integrated to enable figure-ground discrimination. Here, we present a computational model inspired by fly motion vision. Simulations suggest that the heavily fluctuating output of an EMD array is naturally surmounted by a lobula network, which is hypothesized to be downstream of the local motion detectors and have parallel pathways with distinct directional selectivity. The lobula network carries out a spatiotemporal smoothing operation for visual motion, especially across time, enabling the segmentation of moving figures from the background. The model qualitatively reproduces experimental observations in the visually evoked response characteristics of one type of lobula columnar (LC) cell. The model is further shown to be robust to natural scene variability. Our results suggest that the lobula is involved in local motion-based target detection.
Affiliation(s)
- Zhihua Wu
- School of Life Sciences, Shanghai University, Shanghai, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
- Aike Guo
- School of Life Sciences, Shanghai University, Shanghai, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
- International Academic Center of Complex Systems, Advanced Institute of Natural Sciences, Beijing Normal University at Zhuhai, Zhuhai, Guangdong, China
- University of Chinese Academy of Sciences, Beijing, China
3. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 36609568] [DOI: 10.1007/s00359-022-01610-w]
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations, and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent distances unambiguously; rather, they are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
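The speed-scaling ambiguity noted in this abstract follows directly from the standard flow equation for pure translation. A minimal illustrative sketch (not code from the paper; the function names are mine):

```python
import math

def translational_flow(v, d, theta):
    """Angular velocity (rad/s) induced at viewing angle theta (rad,
    measured from the direction of travel) by a point at distance d,
    during pure translation at speed v: flow = (v / d) * sin(theta)."""
    return (v / d) * math.sin(theta)

def distance_from_flow(v, flow, theta):
    """Invert the flow equation; valid only if the speed v is known."""
    return v * math.sin(theta) / flow

# The same flow arises when speed and distance are both doubled,
# so distance is recoverable only up to the locomotion-speed scale:
f_slow = translational_flow(1.0, 2.0, math.pi / 2)   # v = 1 m/s, d = 2 m
f_fast = translational_flow(2.0, 4.0, math.pi / 2)   # v = 2 m/s, d = 4 m
# f_slow == f_fast == 0.5 rad/s
```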
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
4. Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. Biol Cybern 2022; 116:635-660. [PMID: 36303043] [PMCID: PMC9691503] [DOI: 10.1007/s00422-022-00948-3]
Abstract
The visual systems of insects are relatively simple compared to those of humans, yet they enable navigation through complex environments in which insects perform exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze shifts (rotational motion, known as saccades) and inter-saccadic translational motion. While the fundamental process of insect optic flow estimation has been known since the 1950s, so too has its dependence on contrast. The surrounding visual pathways used to overcome environmental dependencies are less well known. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of the time to impact. To consistently estimate the time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from the literature to work for translational motion, this paper proposes a novel algorithm for overcoming the contrast dependence of time-to-impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processes, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact to a target across 360 background, distance, and velocity combinations: a 17-fold increase over the fundamental process. These results show that the contrast dependence of time-to-impact estimation can be overcome in a biologically plausible manner. This, combined with previous results for low-speed rotational motion estimation, allows for contrast-invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
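For context, time to impact is in principle recoverable from purely angular quantities, without knowing absolute distance or speed. The classic relation (Lee's tau, shown here for illustration; this is not the paper's algorithm):

```python
def time_to_impact(theta, theta_dot):
    """Time to impact from the angular size theta (rad) of an
    approached surface patch and its rate of expansion theta_dot
    (rad/s): tau = theta / theta_dot.  Holds for small angles and a
    constant closing speed."""
    return theta / theta_dot

# Object of width 0.2 m at distance 10 m, approached at 2 m/s:
#   theta     ~ w / d      = 0.02  rad
#   theta_dot ~ w * v / d² = 0.004 rad/s
tau = time_to_impact(0.02, 0.004)
# -> 5.0 s, matching d / v = 10 / 2
```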
Affiliation(s)
- Phillip S. M. Skelton
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
- Anthony Finn
- Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095, Australia
- Russell S. A. Brinkworth
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
5. Grittner R, Baird E, Stöckl A. Spatial tuning of translational optic flow responses in hawkmoths of varying body size. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2021; 208:279-296. [PMID: 34893928] [PMCID: PMC8934765] [DOI: 10.1007/s00359-021-01530-1]
Abstract
To safely navigate their environment, flying insects rely on visual cues, such as optic flow. Which cues insects can extract from their environment depends closely on the spatial and temporal response properties of their visual system. These in turn can vary between individuals that differ in body size. How optic flow-based flight control depends on the spatial structure of visual cues, and how this relationship scales with body size, has previously been investigated in insects with apposition compound eyes. Here, we characterised the visual flight control response limits and their relationship to body size in an insect with superposition compound eyes: the hummingbird hawkmoth Macroglossum stellatarum. We used the hawkmoths’ centring response in a flight tunnel as a readout for their reception of translational optic flow stimuli of different spatial frequencies. We show that their responses cut off at different spatial frequencies when translational optic flow was presented on either one, or both tunnel walls. Combined with differences in flight speed, this suggests that their flight control was primarily limited by their temporal rather than spatial resolution. We also observed strong individual differences in flight performance, but no correlation between the spatial response cutoffs and body or eye size.
Affiliation(s)
- Rebecca Grittner
- Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
- Emily Baird
- Department of Zoology, Stockholm University, Stockholm, Sweden
- Anna Stöckl
- Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
6. Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. [PMID: 33048938] [PMCID: PMC7553325] [DOI: 10.1371/journal.pcbi.1008272]
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects such as bees or ants are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the insect's actual view matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees' return flights in such circumstances and investigated where they searched for their nest entrance as a function of the degree of displacement between the two visually relevant cue constellations. Bumblebees mostly searched at the fictive nest location indicated by either cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information, rather than just brightness information, was encoded and learned.

Returning home sounds trivial, but returning to a concealed underground location such as a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowering meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee; it therefore guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, leading the bees to a single new location. In nature, however, the objects constituting the visual environment may be displaced incoherently, since some are more susceptible than others to factors such as wind. In our study, we displaced objects so as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees' search. Finally, we could predict the search locations using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
7. Meyer HG, Klimeck D, Paskarbeit J, Rückert U, Egelhaaf M, Porrmann M, Schneider A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS One 2020; 15:e0230620. [PMID: 32236111] [PMCID: PMC7112198] [DOI: 10.1371/journal.pone.0230620]
Abstract
Emulating the highly resource-efficient processing of visual motion information in the brains of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. The hardware module is used to control the visually guided navigation behavior of the stick-insect-like hexapod robot HECTOR. By leveraging highly parallelized bio-inspired algorithms to extract nearness information from visual motion in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics-card-based implementations in terms of speed and resource efficiency, making it suitable also for fast-moving robots such as flying drones.
Affiliation(s)
- Hanno Gerd Meyer
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
- Daniel Klimeck
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Jan Paskarbeit
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Ulrich Rückert
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Mario Porrmann
- Computer Engineering Group, Osnabrück University, Osnabrück, Germany
- Axel Schneider
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
8. Mauss AS, Borst A. Optic flow-based course control in insects. Curr Opin Neurobiol 2020; 60:21-27. [DOI: 10.1016/j.conb.2019.10.007]
9. Image statistics of the environment surrounding freely behaving hoverflies. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 205:373-385. [PMID: 30937518] [PMCID: PMC6579776] [DOI: 10.1007/s00359-019-01329-1]
Abstract
Natural scenes are not as random as they might appear, but are constrained in both space and time. The 2-dimensional spatial constraints can be described by quantifying the image statistics of photographs. Human observers perceive images with naturalistic image statistics as more pleasant to view, and both fly and vertebrate peripheral and higher order visual neurons are tuned to naturalistic image statistics. However, for a given animal, what is natural differs depending on the behavior, and even if we have a broad understanding of image statistics, we know less about the scenes relevant for particular behaviors. To mitigate this, we here investigate the image statistics surrounding Episyrphus balteatus hoverflies, where the males hover in sun shafts created by surrounding trees, producing a rich and dense background texture and also intricate shadow patterns on the ground. We quantified the image statistics of photographs of the ground and the surrounding panorama, as the ventral and lateral visual field is particularly important for visual flight control, and found differences in spatial statistics in photos where the hoverflies were hovering compared to where they were flying. Our results can, in the future, be used to create more naturalistic stimuli for experimenter-controlled experiments in the laboratory.
10. Spatial Encoding of Translational Optic Flow in Planar Scenes by Elementary Motion Detector Arrays. Sci Rep 2018; 8:5821. [PMID: 29643402] [PMCID: PMC5895815] [DOI: 10.1038/s41598-018-24162-z]
Abstract
Elementary Motion Detectors (EMDs) are well-established models of visual motion estimation in insects. The responses of EMDs are tuned to specific temporal and spatial frequencies of the input stimuli, which matches the behavioural response of insects to wide-field image rotation, known as the optomotor response. However, other behaviours, such as speed and position control, cannot be fully accounted for by EMDs because these behaviours are largely unaffected by image properties and appear to be controlled by the ratio between the flight speed and the distance to an object, defined here as relative nearness. We present a method that resolves this inconsistency by extracting an unambiguous estimate of relative nearness from the output of an EMD array. Our method is suitable for the estimation of relative nearness in planar scenes, such as when flying above the ground or beside large flat objects. We demonstrate closed-loop control of the lateral position and forward velocity of a simulated agent flying in a corridor. This finding may explain how insects can measure relative nearness and control their flight despite the frequency tuning of EMDs. Our method also provides engineers with a relative nearness estimation technique that benefits from the low computational cost of EMDs.
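The correlation-type EMD that this line of work builds on can be sketched in a few lines. This toy implementation (my own, for illustration only) delays each photoreceptor signal with a first-order low-pass and correlates it with the undelayed signal of its neighbour, in both directions:

```python
import numpy as np

def emd_responses(frames, dt=1.0, tau=2.0):
    """Correlation-type (Hassenstein-Reichardt) EMD array.
    frames: 2-D array (time, space) of luminance samples.
    Each detector correlates the low-pass-delayed signal of one
    input with the undelayed signal of its neighbour, in both
    directions, and outputs the difference (rightward minus leftward)."""
    alpha = dt / (tau + dt)                 # first-order low-pass coefficient
    lp = np.zeros_like(frames, dtype=float)
    for t in range(1, frames.shape[0]):
        lp[t] = lp[t - 1] + alpha * (frames[t] - lp[t - 1])
    return lp[:, :-1] * frames[:, 1:] - frames[:, :-1] * lp[:, 1:]

# A rightward-drifting sine grating yields a positive mean output:
x = np.arange(32)
frames = np.stack([np.sin(2 * np.pi * (x - t) / 8) for t in range(64)])
# emd_responses(frames)[10:].mean() > 0  (sign flips for leftward motion)
```

Note the pattern dependence the abstract refers to: the output scales with the square of the grating contrast, which is exactly why raw EMD responses cannot serve directly as velocity or nearness estimates.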
11. Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLoS Comput Biol 2017; 13:e1005919. [PMID: 29281631] [PMCID: PMC5760083] [DOI: 10.1371/journal.pcbi.1005919]
Abstract
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion even holds under the dynamic flight conditions of insects.
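The adaptive response property described here (responses to sustained motion decrease while sensitivity to rapid changes is retained) can be caricatured by normalising a detector's output with a slow running average of its own magnitude. A toy sketch under that assumption (not the paper's model):

```python
import numpy as np

def adapt(signal, dt=1.0, tau=20.0, eps=0.1):
    """Divide each sample by a slowly updated running average of the
    response magnitude: sustained input is progressively attenuated,
    while abrupt changes pass before the average catches up."""
    alpha = dt / (tau + dt)
    state, out = 0.0, []
    for s in signal:
        state += alpha * (abs(s) - state)   # slow adaptation state
        out.append(s / (eps + state))       # divisive normalisation
    return np.array(out)

resp = adapt(np.ones(200))   # constant-velocity motion signal
# resp[0] is large; resp[-1] has decayed toward 1 / (eps + 1)
```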
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Jens P. Lindemann
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
12. Crall JD, Chang JJ, Oppenheimer RL, Combes SA. Foraging in an unsteady world: bumblebee flight performance in field-realistic turbulence. Interface Focus 2017; 7:20160086. [PMID: 28163878] [DOI: 10.1098/rsfs.2016.0086]
Abstract
Natural environments are characterized by variable wind that can pose significant challenges for flying animals and robots. However, our understanding of the flow conditions that animals experience outdoors, and how these impact flight performance, remains limited. Here, we combine laboratory and field experiments to characterize wind conditions encountered by foraging bumblebees in outdoor environments and test the effects of these conditions on flight. We used radio-frequency tags to track the foraging activity of uniquely identified bumblebee (Bombus impatiens) workers, while simultaneously recording local wind flows. Despite being subjected to a wide range of speeds and turbulence intensities, we find that bees do not avoid foraging in windy conditions. We then examined the impacts of turbulence on bumblebee flight in a wind tunnel. Rolling instabilities increased in turbulence, but only at higher wind speeds. Bees displayed higher mean wingbeat frequency and stroke amplitude in these conditions, as well as increased asymmetry in stroke amplitude, suggesting that bees employ an array of active responses to enable flight in turbulence, which may increase the energetic cost of flight. Our results provide the first direct evidence that moderate, environmentally relevant turbulence affects insect flight performance, and suggest that flying insects use diverse mechanisms to cope with these instabilities.
Affiliation(s)
- J D Crall
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA, USA
- J J Chang
- Department of Neuroscience, Columbia University, New York, NY, USA
- R L Oppenheimer
- Department of Biological Sciences, University of New Hampshire, Durham, NH, USA
- S A Combes
- Department of Neurobiology, Physiology, and Behavior, University of California, Davis, Davis, CA, USA
13. Monteagudo J, Lindemann JP, Egelhaaf M. Head orientation of walking blowflies is controlled by visual and mechanical cues. J Exp Biol 2017; 220:4578-4582. [DOI: 10.1242/jeb.164129]
Abstract
During locomotion, animals employ visual and mechanical cues to establish the orientation of their head, which reflects the orientation of the visual coordinate system. However, in certain situations, contradictory cues may suggest different orientations relative to the environment. We recorded blowflies walking on a horizontal or tilted surface surrounded by visual cues suggesting a variety of orientations. We found that the orientations relative to gravity of the visual cues and of the walking surface were integrated, with the orientation of the surface being the major contributor to head orientation, while visual cues and gravity also play an important role. In contrast, visual cues did not affect body orientation much. Cue integration was modeled as the weighted sum of the orientations suggested by the different cues. Our model suggests that when visual cues are lacking, more weight is given to gravity.
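The weighted-sum integration described in the final sentences can be written down directly. The weights below are illustrative placeholders, not the fitted values from the study, and linear averaging of orientations is only valid for small angles:

```python
def integrate_cues(cues):
    """Weighted-sum cue integration: cues is a list of
    (orientation_deg, weight) pairs; the head-orientation estimate
    is the weight-normalised sum of the suggested orientations."""
    total_w = sum(w for _, w in cues)
    return sum(o * w for o, w in cues) / total_w

# Walking surface tilted 30 deg (dominant cue), visual cues and
# gravity both suggesting 0 deg:
head = integrate_cues([(30.0, 0.6), (0.0, 0.25), (0.0, 0.15)])
# -> 18.0 deg, biased toward the walking-surface orientation
```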
Affiliation(s)
- José Monteagudo
- Department of Neurobiology & Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Jens P. Lindemann
- Department of Neurobiology & Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology & Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
14. Li J, Lindemann JP, Egelhaaf M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front Comput Neurosci 2016; 10:111. [PMID: 27818631] [PMCID: PMC5073142] [DOI: 10.3389/fncom.2016.00111]
Abstract
Flying insects, such as flies or bees, rely on consistent information regarding the depth structure of the environment when performing flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets and spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. This question is addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependency of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
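The LMC-like band-pass characteristic referred to here can be mimicked by the difference of a fast and a slow first-order low-pass. A toy sketch (my own parameterisation, not the paper's model) showing that the steady luminance level is removed while temporal modulation passes:

```python
import numpy as np

def lowpass(x, alpha):
    """First-order recursive low-pass filter."""
    y = np.zeros_like(x, dtype=float)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def lmc_bandpass(x, a_fast=0.5, a_slow=0.05):
    """Toy LMC-like temporal band-pass: difference of a fast and a
    slow low-pass.  The steady (DC) luminance component cancels,
    while intermediate temporal frequencies are transmitted."""
    return lowpass(x, a_fast) - lowpass(x, a_slow)

t = np.arange(400)
signal = 10.0 + np.sin(2 * np.pi * t / 20)   # large DC offset + flicker
out = lmc_bandpass(signal)
# in steady state the mean of `out` is ~0: the light level is discarded,
# but the flicker survives with nonzero amplitude
```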
Collapse
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld UniversityBielefeld, Germany
15
Asymmetry of Drosophila ON and OFF motion detectors enhances real-world velocity estimation. Nat Neurosci 2016; 19:706-715. [PMID: 26928063 DOI: 10.1038/nn.4262] [Citation(s) in RCA: 58] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2015] [Accepted: 01/29/2016] [Indexed: 12/13/2022]
Abstract
The reliable estimation of motion across varied surroundings represents a survival-critical task for sighted animals. How neural circuits have adapted to the particular demands of natural environments, however, is not well understood. We explored this question in the visual system of Drosophila melanogaster. Here, as in many mammalian retinas, motion is computed in parallel streams for brightness increments (ON) and decrements (OFF). When genetically isolated, ON and OFF pathways proved equally capable of accurately matching walking responses to realistic motion. To our surprise, detailed characterization of their functional tuning properties through in vivo calcium imaging and electrophysiology revealed stark differences in temporal tuning between ON and OFF channels. We trained an in silico motion estimation model on natural scenes and discovered that our optimized detector exhibited differences similar to those of the biological system. Thus, functional ON-OFF asymmetries in fly visual circuitry may reflect ON-OFF asymmetries in natural environments.
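An ON/OFF split with asymmetric temporal tuning, as the abstract describes, can be sketched as follows. The half-wave rectification and the particular filter constants are illustrative assumptions, not the tuning measured in the paper:

```python
import numpy as np

def lowpass(x, alpha):
    """First-order discrete low-pass filter (alpha = dt / (tau + dt))."""
    y = np.zeros_like(x, dtype=float)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def half_correlator(a, b, alpha):
    """Correlation-type subunit: delayed input times undelayed neighbour,
    minus the mirror-symmetric product."""
    return lowpass(a, alpha) * b - a * lowpass(b, alpha)

def on_off_detector(left, right, alpha_on=0.2, alpha_off=0.05):
    """Sum of separate ON and OFF correlators whose delay filters differ,
    mimicking the ON/OFF temporal-tuning asymmetry (constants are
    illustrative, not fitted)."""
    on_l, on_r = np.maximum(left, 0), np.maximum(right, 0)      # increments
    off_l, off_r = np.maximum(-left, 0), np.maximum(-right, 0)  # decrements
    return (half_correlator(on_l, on_r, alpha_on)
            + half_correlator(off_l, off_r, alpha_off))

t = np.arange(3000)
phase = 2 * np.pi * t / 150.0
left = np.sin(phase)                # band-passed (zero-mean) input signals
right = np.sin(phase - 0.6)         # neighbour lags: rightward motion

rightward = on_off_detector(left, right).mean()
leftward = on_off_detector(right, left).mean()
print(rightward > 0, leftward < 0)
```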
16
Bertrand OJN, Lindemann JP, Egelhaaf M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput Biol 2015; 11:e1004339. [PMID: 26583771 PMCID: PMC4652890 DOI: 10.1371/journal.pcbi.1004339] [Citation(s) in RCA: 28] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2014] [Accepted: 05/13/2015] [Indexed: 11/18/2022] Open
Abstract
Avoiding collisions is one of the most basic needs of any mobile agent, biological or technical, whether exploring the surroundings or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by the properties of optic flow experienced on a spherical eye during translation, and test the interaction of this model with goal-driven behavior. Insects such as flies and bees actively separate the rotational and translational optic-flow components behaviorally, i.e., by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e., during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model that extracts the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow with correlation-type elementary motion detectors, whose responses depend not only on velocity but also on the texture and contrast of objects and thus do not measure object velocity veridically. Therefore, we initially used geometrically determined optic flow as input to the collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. The algorithm was then tested with bio-inspired correlation-type elementary motion detectors as its input. Even then, it led successfully to collision avoidance and, in addition, replicated the characteristics of insect collision avoidance behavior. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
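The core step, turning a nearness map into a collision-avoiding heading, can be sketched in 2D. This is a simplified planar version under the assumption that the avoidance direction is opposite the nearness-weighted average of the viewing directions; the paper's model operates on a full spherical eye:

```python
import numpy as np

def collision_avoidance_direction(nearness, azimuths):
    """Sum the unit viewing-direction vectors weighted by the nearness
    measured along them; the resulting vector points toward the average
    location of nearby obstacles, so heading opposite to it steers the
    agent away (planar sketch of the spherical-eye scheme)."""
    vx = np.sum(nearness * np.cos(azimuths))
    vy = np.sum(nearness * np.sin(azimuths))
    return np.arctan2(-vy, -vx)       # opposite of the obstacle vector

# Ring of viewing directions with a single nearby object at +90 degrees:
azimuths = np.linspace(-np.pi, np.pi, 72, endpoint=False)
nearness = 0.1 * np.ones_like(azimuths)           # distant background
nearness[np.abs(azimuths - np.pi / 2) < 0.3] = 2.0  # close obstacle, left

heading = collision_avoidance_direction(nearness, azimuths)
print(np.degrees(heading))   # ~ -90: steer away from the obstacle
```

The uniform background nearness cancels over the closed ring of viewing directions, so only the nearby object biases the heading, which matches the abstract's point that depth information alone suffices for avoidance.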
Affiliation(s)
- Martin Egelhaaf
- Neurobiologie & CITEC, Bielefeld University, Bielefeld, Germany
17
Strübbe S, Stürzl W, Egelhaaf M. Insect-Inspired Self-Motion Estimation with Dense Flow Fields--An Adaptive Matched Filter Approach. PLoS One 2015; 10:e0128413. [PMID: 26308839 PMCID: PMC4550262 DOI: 10.1371/journal.pone.0128413] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2014] [Accepted: 04/28/2015] [Indexed: 11/18/2022] Open
Abstract
The control of self-motion is a basic but complex task for both technical and biological systems. Various algorithms have been proposed that allow the estimation of self-motion from the optic flow on the eyes. We show that two apparently very different approaches to this task, one technically and one biologically inspired, can be transformed into each other under certain conditions. One estimator of self-motion is based on a matched-filter approach; it was developed to describe the function of motion-sensitive cells in the fly brain. The other, the Koenderink and van Doorn (KvD) algorithm, was derived analytically with a technical background. If the distances to the objects in the environment can be assumed to be known, the two estimators are linear and equivalent, merely expressed in different mathematical forms. For most situations, however, it is unrealistic to assume that the distances are known; the depth structure of the environment then needs to be determined in parallel with the self-motion parameters, which leads to a non-linear problem. We show that the standard least-mean-square approach used by the KvD algorithm leads to a biased estimator, derive a modification that removes the bias, and demonstrate its improved performance by means of numerical simulations. For self-motion estimation it is beneficial to have a spherical visual field, as many flying insects do. We show that in this case the representation of the depth structure of the environment derived from the optic flow can be simplified. Based on this result, we develop an adaptive matched-filter approach for systems with a nearly spherical visual field, in which only eight parameters describing the environment have to be memorized and updated during self-motion.
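For the linear case the abstract mentions, known distances, the self-motion estimate reduces to ordinary least squares on the spherical-eye flow equations. The following is a sketch under a standard flow model (sign conventions assumed; the function names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Unit viewing directions roughly covering a sphere, with known nearness:
d = rng.normal(size=(300, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
mu = 1.0 / rng.uniform(1.0, 5.0, size=300)   # nearness = inverse distance

def flow(d, mu, T, R):
    """Optic flow on a spherical eye: translational component scaled by
    nearness plus rotational component (standard flow model)."""
    trans = -mu[:, None] * (T - (d @ T)[:, None] * d)
    rot = -np.cross(R, d)                    # equals d x R per direction
    return trans + rot

def estimate_self_motion(d, mu, p):
    """Linear least-squares estimate of (T, R) from flow p, assuming the
    nearness mu is known -- the case in which the matched-filter and KvD
    estimators coincide."""
    n = len(d)
    A = np.zeros((3 * n, 6))
    for i in range(n):
        P = np.eye(3) - np.outer(d[i], d[i])   # projection orthogonal to d_i
        A[3*i:3*i+3, :3] = -mu[i] * P          # translational columns
        Dx = np.array([[0, -d[i, 2], d[i, 1]],  # skew matrix: Dx @ R = d x R
                       [d[i, 2], 0, -d[i, 0]],
                       [-d[i, 1], d[i, 0], 0]])
        A[3*i:3*i+3, 3:] = Dx                  # rotational columns
    x, *_ = np.linalg.lstsq(A, p.reshape(-1), rcond=None)
    return x[:3], x[3:]

T_true, R_true = np.array([0.5, 0.0, 0.2]), np.array([0.0, 0.1, -0.05])
p = flow(d, mu, T_true, R_true)
T_est, R_est = estimate_self_motion(d, mu, p)
print(np.allclose(T_est, T_true), np.allclose(R_est, R_true))
```

With unknown distances, the nearness values would have to be estimated jointly with (T, R), which is the non-linear, bias-prone problem the paper addresses.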
Affiliation(s)
- Simon Strübbe
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Wolfgang Stürzl
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
18
Ullrich TW, Kern R, Egelhaaf M. Influence of environmental information in natural scenes and the effects of motion adaptation on a fly motion-sensitive neuron during simulated flight. Biol Open 2014; 4:13-21. [PMID: 25505148 PMCID: PMC4295162 DOI: 10.1242/bio.20149449] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
Gaining information about the spatial layout of natural scenes is a challenging task that flies need to solve, especially when moving at high velocities. A group of motion-sensitive cells in the lobula plate of flies is thought to represent information about self-motion as well as about the environment. Relevant environmental features include the nearness of structures, which influences retinal velocity during translational self-motion, and the brightness contrast. We recorded the responses of the H1 cell, an individually identifiable lobula plate tangential cell, during stimulation with image sequences simulating translational motion through natural scenes with a variety of depth structures. The cell's response across scenes correlated with the average nearness of environmental structures within large parts of its receptive field, but not with the brightness contrast of the stimuli. As a consequence of motion adaptation resulting from repeated translation through the environment, the time-dependent response modulations induced by the spatial structure of the environment increased relative to the background activity of the cell. These results support the hypothesis that some lobula plate tangential cells serve not only as sensors of self-motion but also as part of a neural system that processes information about the spatial layout of natural scenes.
Affiliation(s)
- Thomas W Ullrich
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
- Roland Kern
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
19
Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 2014; 8:127. [PMID: 25389392 PMCID: PMC4211400 DOI: 10.3389/fncir.2014.00127] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2014] [Accepted: 10/05/2014] [Indexed: 11/13/2022] Open
Abstract
Despite their miniature brains, insects such as flies, bees, and wasps are able to navigate through cluttered environments with highly aerobatic flight maneuvers. To accomplish this extraordinary performance, they rely on the spatial information contained in the retinal motion patterns induced on the eyes while moving around ("optic flow"). They employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases in which the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is contained only in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not represent the velocity of the optic flow vectors veridically; their responses also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, from analyses challenging insect movement detectors with the image flow generated during translatory locomotion through cluttered natural environments, we conclude that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural scenes. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects perform extraordinarily well even with a computationally simple motion detection mechanism.
Affiliation(s)
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
- Roland Kern
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
- Jens Peter Lindemann
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
20
Temporal statistics of natural image sequences generated by movements with insect flight characteristics. PLoS One 2014; 9:e110386. [PMID: 25340761 PMCID: PMC4207754 DOI: 10.1371/journal.pone.0110386] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2014] [Accepted: 09/09/2014] [Indexed: 11/19/2022] Open
Abstract
Many flying insects, such as flies, wasps, and bees, pursue a saccadic flight and gaze strategy. This behavioral strategy is thought to separate the translational and rotational components of self-motion and, thereby, to reduce the computational effort needed to extract information about the environment from the retinal image flow. Because of the distinctive dynamics of this active flight and gaze strategy, the present study systematically analyzes the spatiotemporal statistics of image sequences generated during saccades and intersaccadic intervals in cluttered natural environments. We show that, in general, rotational movements with saccade-like dynamics elicit fluctuations and overall changes in brightness, contrast, and spatial frequency up to two orders of magnitude larger than those caused by translational movements at velocities characteristic of insects. Distinct changes in image parameters during translations are caused only by nearby objects. Image analysis based on larger patches in the visual field reveals smaller fluctuations in brightness and spatial-frequency composition than for small patches. The temporal structure and extent of these changes in image parameters define the temporal constraints imposed on signal processing performed by the insect visual system under behavioral conditions in natural environments.
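A 1D toy version of such an analysis, a fixed viewing window over a drifting random panorama standing in for rotational (large per-frame shift) versus translational (small per-frame shift) self-motion, illustrates why saccade-like dynamics produce much larger fluctuations in image statistics. All parameters here are illustrative:

```python
import numpy as np

def patch_stats(panorama, shift, width=50, n_frames=20):
    """Mean brightness and RMS contrast of a fixed viewing window while
    the panorama drifts past it by `shift` pixels per frame."""
    brightness, contrast = [], []
    for k in range(n_frames):
        patch = np.roll(panorama, -k * shift)[:width]
        brightness.append(patch.mean())
        contrast.append(patch.std() / patch.mean())   # RMS contrast
    return np.array(brightness), np.array(contrast)

# Smoothed random panorama, offset so brightness stays well above zero:
rng = np.random.default_rng(1)
panorama = np.convolve(rng.random(4000), np.ones(15) / 15, mode="same") + 0.5

b_trans, _ = patch_stats(panorama, shift=1)     # slow, translation-like drift
b_rot, _ = patch_stats(panorama, shift=300)     # fast, saccade-like rotation

# Frame-to-frame brightness fluctuations are far larger for the fast shift,
# because successive window contents barely overlap:
fluct_trans = np.abs(np.diff(b_trans)).mean()
fluct_rot = np.abs(np.diff(b_rot)).mean()
print(fluct_rot > fluct_trans)
```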
21
Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 2014; 8:335. [PMID: 25309374 PMCID: PMC4173878 DOI: 10.3389/fnbeh.2014.00335] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2014] [Accepted: 09/07/2014] [Indexed: 11/13/2022] Open
Abstract
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help them acquire these memories at newly discovered foraging locations, where landmarks (salient objects in the vicinity of the goal location) can play an important role in guiding the animal's homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, how landmark features are encoded by the visual system remains an open question. Recently, it was shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with those to targeted modifications of landmark properties in the movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
Affiliation(s)
- Marcel Mertes
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Laura Dittmar
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Norbert Boeddeker
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany