1. Davis BA, Mongeau JM. The influence of saccades on yaw gaze stabilization in fly flight. PLoS Comput Biol 2023; 19:e1011746. PMID: 38127819; PMCID: PMC10769041; DOI: 10.1371/journal.pcbi.1011746.
Abstract
In a way analogous to human vision, the fruit fly D. melanogaster and many other flying insects generate smooth and saccadic movements to stabilize and shift their gaze in flight, respectively. It has been hypothesized that this combination of continuous and discrete movements benefits both flight stability and performance, particularly at high frequencies or speeds. Here we develop a hybrid control system model to explore the effects of saccades on the yaw stabilization reflex of D. melanogaster. Inspired by experimental data, the model includes a first-order plant, a proportional-integral (PI) continuous controller, and a saccadic reset system that fires based on the integrated error of the continuous controller. We explore the gain, delay and switching-threshold parameter space to quantify the optimal regions for yaw stability and performance. We show that adding saccades to a continuous controller benefits both stability and performance across a range of frequencies. Our model suggests that Drosophila operates near its optimal switching threshold for its experimentally determined gain set. We also show that, based on experimental data, D. melanogaster operates in a region that trades off performance and stability. This trade-off increases flight robustness to compensate for internal perturbations such as wing damage.
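The hybrid scheme the abstract describes can be sketched in a few lines. The parameter values and the saccade rule below are illustrative assumptions for exposition, not the authors' fitted model:

```python
import numpy as np

def simulate_yaw(ref, dt=0.001, tau=0.05, kp=2.0, ki=10.0,
                 delay=0.02, threshold=0.2):
    """First-order yaw plant driven by a delayed PI controller; a saccade
    fires when the integrated error crosses a threshold, discretely shifting
    gaze toward the reference and resetting the integrator."""
    d = int(round(delay / dt))            # sensorimotor delay in samples
    yaw = np.zeros(len(ref))
    integ = 0.0
    for k in range(1, len(ref)):
        j = max(k - 1 - d, 0)             # delayed measurement index
        err = ref[j] - yaw[j]
        integ += err * dt
        if abs(integ) > threshold:        # saccadic reset system fires
            yaw[k - 1] += 0.5 * (ref[k - 1] - yaw[k - 1])  # discrete gaze shift (assumed rule)
            integ = 0.0
        u = kp * err + ki * integ         # continuous PI command
        yaw[k] = yaw[k - 1] + dt * (u - yaw[k - 1]) / tau  # first-order plant
    return yaw
```

For a unit step in the reference, the simulated gaze settles on the target; lowering the threshold makes the discrete reset system fire more often, which is the stability/performance trade-off the paper sweeps.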
Affiliation(s)
- Brock A. Davis
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, Pennsylvania, United States of America
- Jean-Michel Mongeau
- Department of Mechanical Engineering, The Pennsylvania State University, University Park, Pennsylvania, United States of America
2. Skeels S, von der Emde G, Burt de Perera T. Mormyrid fish as models for investigating sensory-motor integration: A behavioural perspective. J Zool 2023; 319:243-253. PMID: 38515784; PMCID: PMC10953462; DOI: 10.1111/jzo.13046.
Abstract
Animals possess senses that gather information from their environment. They can tune into important aspects of this information and decide on the most appropriate response, which requires coordination of their sensory and motor systems. This interaction is bidirectional: animals can actively shape their perception with self-driven motion, altering sensory flow to maximise the environmental information they can extract. Mormyrid fish are excellent candidates for studying sensory-motor interactions because they possess a unique sensory system (the active electric sense) and exhibit notable behaviours that appear to be associated with electrosensing. This review takes a behavioural approach to unpicking this relationship, using active electrolocation as an example in which body movements and sensing capabilities are closely related and can be assessed in tandem. Active electrolocation is the process by which individuals generate and detect low-voltage electric fields to locate and recognise nearby objects. We focus on research in the mormyrid Gnathonemus petersii, given the extensive study of this species, particularly its object recognition abilities. By studying object detection and recognition, we can assess the potential benefits of self-driven movements for enhancing the selection of biologically relevant information. Finally, these findings are highly relevant to understanding the involvement of movement in shaping the sensory experience of animals that use other sensory modalities. Understanding the overlap between sensory and motor systems will give insight into how different species have become adapted to their environments.
Affiliation(s)
- S. Skeels
- Department of Biology, University of Oxford, Oxford, UK
3. Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M. Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 2022; 225:274096. PMID: 35067721; PMCID: PMC8920035; DOI: 10.1242/jeb.243021.
Abstract
Insects are remarkable flyers, capable of navigating through highly cluttered environments. We tracked the head and thorax of bumblebees freely flying in a tunnel containing vertically oriented obstacles to uncover the sensorimotor strategies used for obstacle detection and collision avoidance. Bumblebees displayed all the characteristics of active vision during flight, stabilizing their head relative to the external environment and maintaining close alignment between their gaze and flightpath. Head stabilization increased the motion contrast of nearby features against the background, enabling obstacle detection. As bees approached obstacles, they appeared to modulate avoidance responses based on the relative retinal expansion velocity (RREV) of obstacles, and their maximum evasion acceleration was linearly related to RREVmax. Finally, bees prevented collisions through rapid roll manoeuvres implemented by their thorax. Overall, this combination of visuo-motor strategies highlights the elegant solutions insects have evolved for visually guided flight through cluttered environments.
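The looming variable named above has a compact geometric form. As a sketch (the exact definition used in the paper may differ), for a head-on approach to an obstacle of radius r at distance d and closing speed v:

```python
import math

def rrev(d, v, r):
    """Relative retinal expansion velocity for a head-on approach:
    angular expansion rate normalised by current angular size.
    d: distance (m), v: closing speed (m/s), r: obstacle radius (m)."""
    theta = 2.0 * math.atan(r / d)          # angular size on the retina
    dtheta = 2.0 * r * v / (d * d + r * r)  # rate of angular expansion
    return dtheta / theta
```

For objects that subtend a small angle, this reduces to v/d, the inverse of the time to contact, so RREV grows sharply during an approach and makes a natural trigger for graded evasion.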
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany; School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2600, Australia. Author for correspondence.
- Tim Siesenop
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Olivier J. Bertrand
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Liang Li
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, University of Konstanz, 78464 Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, 78464 Konstanz, Germany; Department of Biology, University of Konstanz, 78464 Konstanz, Germany
- Charlotte Doussot
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- William H. Warren
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, RI 02912, USA
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
4. Grittner R, Baird E, Stöckl A. Spatial tuning of translational optic flow responses in hawkmoths of varying body size. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2021; 208:279-296. PMID: 34893928; PMCID: PMC8934765; DOI: 10.1007/s00359-021-01530-1.
Abstract
To safely navigate their environment, flying insects rely on visual cues such as optic flow. Which cues insects can extract from their environment depends closely on the spatial and temporal response properties of their visual system, which in turn can vary between individuals that differ in body size. How optic flow-based flight control depends on the spatial structure of visual cues, and how this relationship scales with body size, has previously been investigated in insects with apposition compound eyes. Here, we characterised the visual flight control response limits and their relationship to body size in an insect with superposition compound eyes: the hummingbird hawkmoth Macroglossum stellatarum. We used the hawkmoths' centring response in a flight tunnel as a readout for their reception of translational optic flow stimuli of different spatial frequencies. We show that their responses cut off at different spatial frequencies when translational optic flow was presented on one or on both tunnel walls. Combined with differences in flight speed, this suggests that their flight control was limited primarily by temporal rather than spatial resolution. We also observed strong individual differences in flight performance, but no correlation between the spatial response cutoffs and body or eye size.
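A side note on the stimulus geometry (an illustrative calculation, not taken from the paper): the retinal spatial frequency of a wall grating depends on the animal's distance from the wall, which is why flight position in the tunnel matters when measuring such cutoffs.

```python
import math

def spatial_frequency_cpd(wavelength_m, distance_m):
    """Retinal spatial frequency (cycles per degree) of a grating of the
    given spatial wavelength viewed frontally from the given distance."""
    deg_per_cycle = math.degrees(2.0 * math.atan(wavelength_m / (2.0 * distance_m)))
    return 1.0 / deg_per_cycle
```

A 0.1 m grating period seen from 0.5 m subtends about 11.4 degrees per cycle, i.e. roughly 0.09 cycles/deg; moving closer to the wall lowers the retinal spatial frequency of the same physical pattern.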
Affiliation(s)
- Rebecca Grittner
- Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
- Emily Baird
- Department of Zoology, Stockholm University, Stockholm, Sweden
- Anna Stöckl
- Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
5. Facilitation of neural responses to targets moving against optic flow. Proc Natl Acad Sci U S A 2021; 118:2024966118. PMID: 34531320; PMCID: PMC8463850; DOI: 10.1073/pnas.2024966118.
Abstract
Target detection in visual clutter is a difficult computational task that insects, with their poor-spatial-resolution compound eyes and small brains, accomplish successfully and with extremely short behavioral delays. We here show that the responses of target-selective descending neurons are attenuated by background motion in the same direction as target motion but facilitated by background motion in the opposite direction. This finding is important for understanding how target pursuit can occur in tandem with gaze stabilization. Indeed, the neural facilitation would come into effect if the hoverfly is subjected to background motion in one direction while the target it is pursuing moves in the opposite direction, and could therefore be used to override gaze-stabilizing corrective turns.

For the human observer, it can be difficult to follow the motion of small objects, especially when they move against background clutter. In contrast, insects do this efficiently, as evidenced by their ability to capture prey, pursue conspecifics, or defend territories, even in highly textured surrounds. We here recorded from target-selective descending neurons (TSDNs), which likely subserve these impressive behaviors. To simulate the type of optic flow that would be generated by the pursuer's own movements through the world, we used the motion of a perspective-corrected sparse dot field. We show that hoverfly TSDN responses to target motion are suppressed when such optic flow moves syn-directional to the target. Indeed, neural responses are strongly suppressed when targets move over either translational sideslip or rotational yaw. More strikingly, we show that TSDNs are facilitated by optic flow moving counterdirectional to the target, if the target moves horizontally. Furthermore, we show that a small, frontal spatial window of optic flow is enough to fully facilitate or suppress TSDN responses to target motion. We argue that such TSDN response facilitation could be beneficial in modulating corrective turns during target pursuit.
6. Li J, Niemeier M, Kern R, Egelhaaf M. Disentangling of Local and Wide-Field Motion Adaptation. Front Neural Circuits 2021; 15:713285. PMID: 34531728; PMCID: PMC8438216; DOI: 10.3389/fncir.2021.713285.
Abstract
Motion adaptation has been attributed a pivotal functional role in the optic flow-based spatial vision of flying insects. Ongoing motion enhances the representation in the visual pathway of spatial discontinuities, which manifest themselves as velocity discontinuities in the retinal optic flow pattern during translational locomotion. There is evidence for different spatial scales of motion adaptation at different visual processing stages: motion adaptation is thought to take place, on the one hand, on a retinotopic basis at the level of local motion-detecting neurons and, on the other hand, at the level of wide-field neurons pooling the output of many of these local motion detectors. So far, local and wide-field adaptation could not be analyzed separately, since conventional motion stimuli jointly affect both adaptive processes. We therefore designed a novel stimulus paradigm based on two types of motion stimuli that had the same overall strength but differed in that one led to local motion adaptation while the other did not. We recorded intracellularly the activity of a particular wide-field motion-sensitive neuron, the horizontal system equatorial (HSE) cell, in blowflies. The experimental data were interpreted based on a computational model of the visual motion pathway that included the spatially pooling HSE cell. By comparing the differences between recorded and modeled HSE-cell responses induced by the two types of motion adaptation, the major characteristics of local and wide-field adaptation could be pinpointed. Wide-field adaptation was shown to depend strongly on the activation level of the cell and, thus, on the direction of motion. In contrast, local motion adaptation reduces the response gain to a similar extent irrespective of the direction of motion. This direction-independent adaptation differs fundamentally from the well-known adaptive adjustment of response gain according to the prevailing overall stimulus level, which is considered essential for an efficient signal representation by neurons with a limited operating range. Direction-independent adaptation is argued to result from the joint activity of local motion-sensitive neurons with different preferred directions, and to lead to a representation of local motion direction that is independent of the overall direction of global motion.
Affiliation(s)
- Jinglin Li
- Neurobiology, Bielefeld University, Bielefeld, Germany
- Roland Kern
- Neurobiology, Bielefeld University, Bielefeld, Germany
7. Bertrand OJN, Doussot C, Siesenop T, Ravi S, Egelhaaf M. Visual and movement memories steer foraging bumblebees along habitual routes. J Exp Biol 2021; 224:269087. PMID: 34115117; DOI: 10.1242/jeb.237867.
Abstract
One persistent question in animal navigation is how animals follow habitual routes between their home and a food source. Our current understanding of insect navigation suggests an interplay between visual memories, collision avoidance and path integration, the continuous integration of distance and direction travelled. However, these behavioural modules have to be continuously updated with instantaneous visual information. In order to alleviate this need, the insect could learn and replicate habitual movements ('movement memories') around objects (e.g. a bent trajectory around an object) to reach its destination. We investigated whether bumblebees, Bombus terrestris, learn and use movement memories en route to their home. Using a novel experimental paradigm, we habituated bumblebees to establish a habitual route in a flight tunnel containing 'invisible' obstacles. We then confronted them with conflicting cues leading to different choice directions depending on whether they rely on movement or visual memories. The results suggest that they use movement memories to navigate, but also rely on visual memories to solve conflicting situations. We investigated whether the observed behaviour was due to other guidance systems, such as path integration or optic flow-based flight control, and found that neither of these systems was sufficient to explain the behaviour.
Affiliation(s)
- Olivier J N Bertrand
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Charlotte Doussot
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Sridhar Ravi
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3083, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
8. Wang H, Fu Q, Wang H, Baxter P, Peng J, Yue S. A bioinspired angular velocity decoding neural network model for visually guided flights. Neural Netw 2021; 136:180-193. PMID: 33494035; DOI: 10.1016/j.neunet.2020.12.008.
Abstract
Efficient and robust motion perception systems are important prerequisites for achieving visually guided flight in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as the honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In this paper, we have used this approach to develop a novel method that solves the fundamental problem of estimating angular velocity for visually guided flight. Compared with previous models, our elementary motion detector (EMD)-based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings. Using the Unity development platform, the model is further tested in tunnel-centering and terrain-following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee utilizes the proposed angular velocity control schemes to accurately navigate through a patterned tunnel while maintaining a suitable distance from the undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, demonstrating the model's potential for implementation in micro air vehicles that carry only visual sensors.
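The correlation-type detector at the core of such models can be sketched as a one-dimensional Hassenstein-Reichardt EMD array. This is a generic sketch, with an assumed delay-filter time constant; the paper's texture-estimation pathway and fitted parameters are not reproduced here:

```python
import numpy as np

def emd_response(frames, dt=0.001, tau=0.035):
    """Summed output of a 1-D Hassenstein-Reichardt EMD array.
    frames: (time, space) luminance samples; tau: delay-filter time constant.
    Positive output indicates motion toward increasing spatial index."""
    alpha = dt / (tau + dt)                  # first-order low-pass coefficient
    lp = np.zeros(frames.shape[1])
    out = np.zeros(frames.shape[0])
    for t in range(frames.shape[0]):
        lp += alpha * (frames[t] - lp)       # delayed (low-passed) channel
        # mirror-symmetric arms: delayed signal times undelayed neighbour
        out[t] = np.sum(lp[:-1] * frames[t, 1:] - frames[t, :-1] * lp[1:])
    return out
```

Because the raw EMD output varies with the spatial frequency and contrast of the pattern, decoding true angular velocity requires normalising away these pattern dependencies, which is what the paper's separate texture pathway is for.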
Affiliation(s)
- Huatian Wang
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China
- Qinbing Fu
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Hongxin Wang
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Paul Baxter
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Jigen Peng
- School of Mathematics and Information Science, Guangzhou University, Guangzhou, China; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China
- Shigang Yue
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
9. Odenthal L, Doussot C, Meyer S, Bertrand OJN. Analysing Head-Thorax Choreography During Free-Flights in Bumblebees. Front Behav Neurosci 2021; 14:610029. PMID: 33510626; PMCID: PMC7835495; DOI: 10.3389/fnbeh.2020.610029.
Abstract
Animals coordinate their various body parts, sometimes in elaborate manners, to swim, walk, climb, fly, and navigate their environment. The coordination of body parts is essential to behaviors such as chasing, escaping, landing, and the extraction of relevant information. For example, by shaping the movement of the head and body in an active and controlled manner, flying insects structure their flights to facilitate the acquisition of distance information: they condense their turns into a short period of time (the saccade), interspersed with relatively long translations (the intersaccades). However, due to technological limitations, the precise coordination of the head and thorax during insects' free flight remains unclear. Here, we propose methods to analyse the orientation of the head and thorax of bumblebees Bombus terrestris, to segregate the trajectories of flying insects into saccades and intersaccades using supervised machine learning (ML) techniques, and finally to analyse the coordination between head and thorax using artificial neural networks (ANNs). Segregating flights into saccades and intersaccades with ML, based on thorax angular velocities, decreased misclassification by 12% compared with classically used methods. Our results demonstrate how machine learning techniques can improve the analysis of insect flight structure and shed light on the complexity of head-body coordination. We anticipate our assay to be a starting point for more sophisticated experiments and analyses on freely flying insects. For example, the coordination of head and body movements during collision avoidance, chasing behavior, or gap negotiation could be investigated by monitoring the head and thorax orientation of freely flying insects within and across behavioral tasks, and in different species.
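A minimal version of the classical velocity-threshold baseline that such ML classifiers are compared against might look like this (the threshold and gap-merging values are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def segment_saccades(yaw_deg, dt=0.005, thresh=200.0, min_gap=0.02):
    """Boolean mask marking 'saccade' samples where the thorax yaw velocity
    exceeds `thresh` (deg/s); saccade bouts separated by less than
    `min_gap` seconds are merged into one saccade."""
    vel = np.abs(np.gradient(yaw_deg, dt))       # unsigned angular velocity
    sacc = vel > thresh
    gap = int(round(min_gap / dt))
    idx = np.flatnonzero(sacc)
    for a, b in zip(idx[:-1], idx[1:]):          # merge brief sub-threshold gaps
        if b - a <= gap:
            sacc[a:b] = True
    return sacc
```

Everything outside the mask is treated as intersaccade; the hard threshold is exactly what makes this baseline fragile to noise and slow, large turns, which is the weakness the supervised classifier addresses.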
Affiliation(s)
- Stefan Meyer
- Department of Informatics, University of Sussex, Brighton, United Kingdom
10. Ando N, Kanzaki R. Insect-machine hybrid robot. Curr Opin Insect Sci 2020; 42:61-69. PMID: 32992040; DOI: 10.1016/j.cois.2020.09.006.
Abstract
Recently, insect-machine hybrid robots have been developed that incorporate insects into robots or machines into insects. Most previous studies were motivated by using the functions of insects for robots, but this technology can also serve as an experimental tool for neuroethology. We review hybrid robots in terms of the closed loop between an insect, a robot, and the real environment. Incorporated biological components provide the robot with the sensory signals received by the insect and the adaptive functions of its brain. Incorporated artificial components permit us to understand the biological system by controlling insect behavior. Hybrid robots thus extend the role of mobile robot experiments in neuroethology, for both model evaluation and the analysis of brain function.
Affiliation(s)
- Noriyasu Ando
- Department of Systems Life Engineering, Maebashi Institute of Technology, 460-1, Kamisadori-cho, Maebashi, Gunma 371-0816, Japan
- Ryohei Kanzaki
- Research Center for Advanced Science and Technology, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8904, Japan
11. Bumblebees perceive the spatial layout of their environment in relation to their body size and form to minimize inflight collisions. Proc Natl Acad Sci U S A 2020; 117:31494-31499. PMID: 33229535; DOI: 10.1073/pnas.2016872117.
Abstract
Animals that move through complex habitats must frequently contend with obstacles in their path. Humans and other highly cognitive vertebrates avoid collisions by perceiving the relationship between the layout of their surroundings and the properties of their own body profile and action capacity. It is unknown whether insects, which have much smaller brains, possess such abilities. We used bumblebees, which vary widely in body size and regularly forage in dense vegetation, to investigate whether flying insects consider their own size when interacting with their surroundings. Bumblebees trained to fly in a tunnel were sporadically presented with an obstructing wall containing a gap that varied in width. Bees successfully flew through narrow gaps, even those that were much smaller than their wingspans, by first performing lateral scanning (side-to-side flights) to visually assess the aperture. Bees then reoriented their in-flight posture (i.e., yaw or heading angle) while passing through, minimizing their projected frontal width and mitigating collisions; in extreme cases, bees flew entirely sideways through the gap. Both the time that bees spent scanning during their approach and the extent to which they reoriented themselves to pass through the gap were determined not by the absolute size of the gap, but by the size of the gap relative to each bee's own wingspan. Our findings suggest that, similar to humans and other vertebrates, flying bumblebees perceive the affordance of their surroundings relative to their body size and form to navigate safely through complex environments.
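The geometry behind this reorientation strategy is simple to sketch. The following is an illustration of the idea, not the authors' analysis: modelling the bee as a rigid cross of wingspan w and body length l, yawing by an angle changes the frontal profile it presents to the gap.

```python
import math

def projected_width(wingspan, body_len, yaw_rad):
    """Frontal width presented to a gap by a bee yawed by `yaw_rad`
    (0 = flying head-on), approximating the body as a rigid cross."""
    return wingspan * math.cos(yaw_rad) + body_len * math.sin(yaw_rad)

def min_yaw_to_fit(wingspan, body_len, gap, step_deg=1):
    """Smallest yaw angle (radians) at which the projected width fits
    through the gap, or None if no orientation up to 90 degrees fits."""
    for deg in range(0, 91, step_deg):
        phi = math.radians(deg)
        if projected_width(wingspan, body_len, phi) <= gap:
            return phi
    return None
```

With an assumed 30 mm wingspan and 15 mm body, a 20 mm gap only becomes passable near a fully sideways posture, consistent with the extreme sideways flights reported for the narrowest apertures.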
12. Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. PMID: 33048938; PMCID: PMC7553325; DOI: 10.1371/journal.pcbi.1008272.
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects like bees and ants are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the actual sight of the insect matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees' return flights in such circumstances and investigated where they searched for their nest entrance as a function of the degree of displacement between the two visually relevant cue constellations. Bumblebees mostly searched at the fictive nest location indicated by one or the other cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information was encoded and learned, rather than just brightness information.

Returning home sounds trivial, but returning to a concealed underground location like a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowering meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee; it therefore guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, hence leading the bees to a single new location. But in nature, the objects constituting the visual environment may be displaced in a disorderly manner, as some are more susceptible than others to factors such as wind. In our study, we moved objects in a tricky way to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees' search. Finally, we could predict the search location using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
13. Lagogiannis K, Diana G, Meyer MP. Learning steers the ontogeny of an efficient hunting sequence in zebrafish larvae. eLife 2020; 9:55119. PMID: 32773042; PMCID: PMC7561354; DOI: 10.7554/elife.55119.
Abstract
Goal-directed behaviors may be poorly coordinated in young animals but, with age and experience, behavior progressively adapts to efficiently exploit the animal’s ecological niche. How experience impinges on the developing neural circuits of behavior is an open question. We have conducted a detailed study of the effects of experience on the ontogeny of hunting behavior in larval zebrafish. We report that larvae with prior experience of live prey consume considerably more prey than naive larvae. This is mainly due to increased capture success and a modest increase in hunt rate. We demonstrate that the initial turn to prey and the final capture manoeuvre of the hunting sequence were jointly modified by experience and that modification of these components predicted capture success. Our findings establish an ethologically relevant paradigm in zebrafish for studying how the brain is shaped by experience to drive the ontogeny of efficient behavior.
Collapse
Affiliation(s)
- Konstantinos Lagogiannis
- Centre for Developmental Neurobiology, MRC Centre for Neurodevelopmental Disorders, King's College London, London, United Kingdom
- Giovanni Diana
- Centre for Developmental Neurobiology, MRC Centre for Neurodevelopmental Disorders, King's College London, London, United Kingdom
- Martin P Meyer
- Centre for Developmental Neurobiology, MRC Centre for Neurodevelopmental Disorders, King's College London, London, United Kingdom
Collapse
|
14
|
Meyer HG, Klimeck D, Paskarbeit J, Rückert U, Egelhaaf M, Porrmann M, Schneider A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS One 2020; 15:e0230620. [PMID: 32236111 PMCID: PMC7112198 DOI: 10.1371/journal.pone.0230620] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2019] [Accepted: 03/04/2020] [Indexed: 11/26/2022] Open
Abstract
Emulating the highly resource-efficient processing of visual motion information in the brain of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. The hardware module is used to control visually-guided navigation behavior of the stick insect-like hexapod robot HECTOR. By leveraging highly parallelized bio-inspired algorithms to extract nearness information from visual motion in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics card-based implementations in terms of speed and resource efficiency, making it suitable to be also placed on fast moving robots, such as flying drones.
Collapse
Affiliation(s)
- Hanno Gerd Meyer
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
- Daniel Klimeck
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Jan Paskarbeit
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Ulrich Rückert
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Mario Porrmann
- Computer Engineering Group, Osnabrück University, Osnabrück, Germany
- Axel Schneider
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
Collapse
|
15
|
Ravi S, Bertrand O, Siesenop T, Manz LS, Doussot C, Fisher A, Egelhaaf M. Gap perception in bumblebees. J Exp Biol 2019; 222:jeb184135. [PMID: 30683732 DOI: 10.1242/jeb.184135] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2018] [Accepted: 10/26/2018] [Indexed: 11/20/2022]
Abstract
A number of insects fly over long distances below the natural canopy, where the physical environment is highly cluttered, consisting of obstacles of varying shape, size and texture. While navigating within such environments, animals need to perceive and disambiguate environmental features that might obstruct their flight. The most elemental aspect of aerial navigation through such environments is gap identification and 'passability' evaluation. We used bumblebees to seek insights into the mechanisms used for gap identification when confronted with an obstacle in their flight path and the behavioral compensations employed to assess gap properties. Initially, bumblebee foragers were trained to fly through an unobstructed flight tunnel that led to a foraging chamber. After the bees were familiar with this situation, we placed a wall containing a gap that unexpectedly obstructed the flight path on a return trip to the hive. The flight trajectories of the bees as they approached the obstacle wall and traversed the gap were analyzed in order to evaluate their behavior as a function of the distance between the gap and a background wall that was placed behind the gap. Bumblebees initially decelerated when confronted with an unexpected obstacle. Deceleration was first noticed when the obstacle subtended around 35 deg on the retina but also depended on the properties of the gap. Subsequently, the bees gradually traded off their longitudinal velocity for lateral velocity and approached the gap with increasing lateral displacement and lateral velocity. Bumblebees shaped their flight trajectory depending on the salience of the gap, indicated in our case by the optic flow contrast between the region within the gap and on the obstacle, which decreased with decreasing distance between the gap and the background wall. As the optic flow contrast decreased, the bees spent an increasing amount of time moving laterally across the obstacles. During these repeated lateral maneuvers, the bees are probably assessing gap geometry and passability.
Collapse
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Olivier Bertrand
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Lea-Sophie Manz
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; Faculty of Biology, Johannes Gutenberg-Universität Mainz, 55122 Mainz, Germany
- Charlotte Doussot
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
Collapse
|
16
|
Busch C, Borst A, Mauss AS. Bi-directional Control of Walking Behavior by Horizontal Optic Flow Sensors. Curr Biol 2018; 28:4037-4045.e5. [PMID: 30528583 DOI: 10.1016/j.cub.2018.11.010] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2018] [Revised: 10/02/2018] [Accepted: 11/02/2018] [Indexed: 12/13/2022]
Abstract
Moving animals experience constant sensory feedback, such as panoramic image shifts on the retina, termed optic flow. Underlying neuronal signals are thought to be important for exploratory behavior by signaling unintended course deviations and by providing spatial information about the environment [1, 2]. Particularly in insects, the encoding of self-motion-related optic flow is well understood [1-5]. However, a gap remains in understanding how the associated neuronal activity controls locomotor trajectories. In flies, visual projection neurons belonging to two groups encode panoramic horizontal motion: horizontal system (HS) cells respond with depolarization to front-to-back motion and hyperpolarization to the opposite direction [6, 7], and other neurons have the mirror-symmetrical response profile [6, 8, 9]. With primarily monocular sensitivity, the neurons' responses are ambiguous for different rotational and translational self-movement components. Such ambiguities can be greatly reduced by combining signals from both eyes [10-12] to determine turning and movement speed [13-16]. Here, we explore the underlying functional logic by optogenetic HS cell manipulation in tethered walking Drosophila. We show that de- and hyperpolarization evoke opposite turning behavior, indicating that both direction-selective signals are transmitted to descending pathways for course control. Further experiments reveal a negative effect of bilaterally symmetric de- and hyperpolarization on walking velocity. Our results are therefore consistent with a functional architecture in which the HS cells' membrane potential influences walking behavior bi-directionally via two decelerating pathways.
Collapse
Affiliation(s)
- Christian Busch
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
- Alexander Borst
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
- Alex S Mauss
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
Collapse
|
17
|
Milde MB, Bertrand OJN, Ramachandran H, Egelhaaf M, Chicca E. Spiking Elementary Motion Detector in Neuromorphic Systems. Neural Comput 2018; 30:2384-2417. [PMID: 30021082 DOI: 10.1162/neco_a_01112] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Apparent motion of the surroundings on an agent's retina can be used to navigate through cluttered environments, avoid collisions with obstacles, or track targets of interest. The pattern of apparent motion of objects (i.e., the optic flow) contains spatial information about the surrounding environment. For a small, fast-moving agent, such as those used in search and rescue missions, quickly estimating the distance to nearby objects is crucial for avoiding collisions. This estimation cannot be done by conventional methods, such as frame-based optic flow estimation, given the size, power, and latency constraints of the necessary hardware. A practical alternative makes use of event-based vision sensors. Contrary to the frame-based approach, they produce so-called events only when there are changes in the visual scene. We propose a novel asynchronous circuit, the spiking elementary motion detector (sEMD), composed of a single silicon neuron and synapse, to detect elementary motion from an event-based vision sensor. The sEMD encodes the time an object's image needs to travel across the retina into a burst of spikes. The number of spikes within the burst is proportional to the speed of events across the retina. A fast but imprecise estimate of the time-to-travel can already be obtained from the first two spikes of a burst and refined by subsequent interspike intervals. The latter encoding scheme is possible due to an adaptive nonlinear synaptic efficacy scaling. We show that the sEMD can be used to compute a collision avoidance direction in the context of robotic navigation in a cluttered outdoor environment, and we compared this direction to that produced by a frame-based algorithm. The proposed computational principle constitutes a generic spiking temporal correlation detector that can be applied to other sensory modalities (e.g., sound localization), and it provides a novel perspective on gating information in spiking neural networks.
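The burst-length code described in this abstract can be sketched in a few lines. This is a minimal illustration of the time-to-travel principle only, not the authors' silicon neuron circuit; the function names, the gain constant and the burst cap are illustrative assumptions.

```python
def time_to_travel(event_t_a, event_t_b):
    """Time (s) an edge needs to move between two neighboring pixels,
    given the timestamps of the events it triggers at each pixel."""
    return event_t_b - event_t_a

def semd_burst_length(ttt, gain=10.0, max_spikes=20):
    """Map time-to-travel to a spike-burst length: faster motion
    (smaller ttt) yields more spikes, caricaturing the sEMD's
    speed-proportional burst code. gain/max_spikes are made up."""
    if ttt <= 0:
        return 0  # null direction or simultaneous events: no burst
    return min(max_spikes, int(gain / ttt))

# A fast edge (short time-to-travel) evokes a longer burst than a slow one.
fast_burst = semd_burst_length(time_to_travel(0.0, 0.5))  # ttt = 0.5 s
slow_burst = semd_burst_length(time_to_travel(0.0, 2.0))  # ttt = 2.0 s
```

As in the circuit, a coarse speed estimate is available as soon as the burst begins, and longer observation (more interspike intervals) refines it.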
Collapse
Affiliation(s)
- M B Milde
- Institute of Neuroinformatics, University of Zurich, and ETH Zurich, 8057 Zurich, Switzerland
- O J N Bertrand
- Neurobiology, Faculty of Biology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology, Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- H Ramachandran
- Faculty of Technology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology, Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- M Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology, Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- E Chicca
- Faculty of Technology, Bielefeld University, 33615 Bielefeld, Germany, and Cognitive Interaction Technology, Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
Collapse
|
18
|
Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLoS Comput Biol 2017; 13:e1005919. [PMID: 29281631 PMCID: PMC5760083 DOI: 10.1371/journal.pcbi.1005919] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2017] [Revised: 01/09/2018] [Accepted: 11/13/2017] [Indexed: 11/18/2022] Open
Abstract
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion even holds under the dynamic flight conditions of insects.
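The attenuation effect this abstract describes can be caricatured with a single leaky adaptation state per detector: sustained (background-like) motion fades from the output while rapid changes (foreground-like objects) remain salient. This is a toy sketch of the principle, not the authors' elaborated model of the fly motion pathway; the function name and adaptation rate are illustrative assumptions.

```python
def adaptive_response(signal, rate=0.05):
    """Toy motion-adaptation stage: an internal state slowly tracks the
    detector input, and the output is input minus state. Responses to
    constant input therefore decay, while sudden input changes produce
    a fresh, large response."""
    state = 0.0
    out = []
    for s in signal:
        out.append(s - state)        # adapted output
        state += rate * (s - state)  # state drifts toward the input
    return out

# Constant motion adapts away; a step change stays clearly visible.
resp = adaptive_response([1.0] * 100 + [3.0])
```

Run over a long constant segment followed by a step, the constant response decays toward zero while the step evokes a near-full-size transient, mirroring the background/foreground contrast described above.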
Collapse
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Jens P. Lindemann
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
Collapse
|
19
|
Abstract
Tiger beetles pursue prey by adjusting their heading according to a time-delayed proportional control law that minimizes the error angle (Haselsteiner et al 2014 J. R. Soc. Interface 11 20140216). This control law can be further interpreted in terms of mechanical actuation: to catch prey, tiger beetles exert a sideways force by biasing their tripod gait in proportion to the error angle measured half a stride earlier. The proportional gain was found to be nearly optimal in the sense that it minimizes the time to point directly toward the prey. For a time-delayed linear proportional controller, the optimal gain, k, is inversely proportional to the time delay, τ, and satisfies [Formula: see text]. Here we present evidence that tiger beetles adjust their control gain during their pursuit of prey. Our analysis shows two critical distances: one corresponding to the beetle's final approach to the prey, and the second, less expected, occurring at a distance around 10 cm for a prey size of 4.5 mm. The beetle initiates its chase using a sub-critical gain and increases the gain to the optimal value once the prey is within this critical distance. Insects use a variety of methods to detect distance, often involving different visual cues. Here we examine two such methods: one based on motion parallax and the other based on the prey's elevation angle. We show that, in order for the motion parallax method to explain the observed data, the beetle needs to correct for the ratio of the prey's sideways velocity relative to its own. On the other hand, the simpler method based on the elevation angle can detect both the distance and the prey's size. Moreover we find that the transition distance corresponds to the accuracy required to distinguish small prey from large predators.
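The time-delayed proportional law discussed above can be sketched numerically: the turn rate at time t is proportional to the error angle measured a delay τ earlier, and the pursuit is well behaved only when the product of gain and delay stays below the stability limit. This is a hedged illustration, not the paper's fitted model; the function and parameter names are made up, and the prey is taken as stationary for simplicity.

```python
def simulate_pursuit(k, tau, steps, dt=0.01, theta0=1.0):
    """Time-delayed proportional control of heading: the turn rate at
    time t is -k * (error angle at time t - tau). Returns the
    error-angle history (rad) for a stationary target."""
    delay = int(round(tau / dt))
    theta = [theta0] * (delay + 1)  # seed the delayed history
    for _ in range(steps):
        turn_rate = -k * theta[-1 - delay]  # act on the delayed error
        theta.append(theta[-1] + turn_rate * dt)
    return theta

# With k * tau well below the stability limit the error angle decays
# smoothly; pushing k * tau past roughly pi/2 makes the pursuit oscillate.
errors = simulate_pursuit(k=5.0, tau=0.05, steps=500)
```

Sweeping k at fixed τ in such a simulation reproduces the qualitative trade-off in the abstract: sub-critical gains converge slowly, and the fastest non-oscillatory convergence occurs near the optimal gain set by the delay.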
Collapse
Affiliation(s)
- R M Noest
- Department of Physics, Cornell University, Ithaca, NY 14853, United States of America
Collapse
|
20
|
Faghihi F, Moustafa AA, Heinrich R, Wörgötter F. A computational model of conditioning inspired by Drosophila olfactory system. Neural Netw 2017; 87:96-108. [DOI: 10.1016/j.neunet.2016.11.002] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2016] [Revised: 11/07/2016] [Accepted: 11/11/2016] [Indexed: 11/15/2022]
|
21
|
Gonzalez-Bellido PT, Fabian ST, Nordström K. Target detection in insects: optical, neural and behavioral optimizations. Curr Opin Neurobiol 2016; 41:122-128. [DOI: 10.1016/j.conb.2016.09.001] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2016] [Revised: 08/10/2016] [Accepted: 09/05/2016] [Indexed: 11/16/2022]
|
22
|
Li J, Lindemann JP, Egelhaaf M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front Comput Neurosci 2016; 10:111. [PMID: 27818631 PMCID: PMC5073142 DOI: 10.3389/fncom.2016.00111] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2016] [Accepted: 10/04/2016] [Indexed: 12/19/2022] Open
Abstract
Flying insects, such as flies or bees, rely on consistent information regarding the depth structure of the environment when performing their flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets or spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. This question will be addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependence of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
Collapse
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
Collapse
|
23
|
Zoccolan D, Cox DD, Benucci A. Editorial: What can simple brains teach us about how vision works. Front Neural Circuits 2015; 9:51. [PMID: 26483639 PMCID: PMC4586271 DOI: 10.3389/fncir.2015.00051] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2015] [Accepted: 09/14/2015] [Indexed: 11/30/2022] Open
Affiliation(s)
- Davide Zoccolan
- Visual Neuroscience Lab, International School for Advanced Studies, Trieste, Italy
- David D Cox
- Department of Molecular and Cellular Biology and Center for Brain Science, Harvard University, Cambridge, MA, USA
- Andrea Benucci
- Laboratory for Neural Circuit and Behavior, RIKEN Brain Science Institute, Wako City, Japan
Collapse
|
24
|
Strübbe S, Stürzl W, Egelhaaf M. Insect-Inspired Self-Motion Estimation with Dense Flow Fields--An Adaptive Matched Filter Approach. PLoS One 2015; 10:e0128413. [PMID: 26308839 PMCID: PMC4550262 DOI: 10.1371/journal.pone.0128413] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2014] [Accepted: 04/28/2015] [Indexed: 11/18/2022] Open
Abstract
The control of self-motion is a basic, but complex task for both technical and biological systems. Various algorithms have been proposed that allow the estimation of self-motion from the optic flow on the eyes. We show that two apparently very different approaches to solve this task, one technically and one biologically inspired, can be transformed into each other under certain conditions. One estimator of self-motion is based on a matched filter approach; it has been developed to describe the function of motion sensitive cells in the fly brain. The other estimator, the Koenderink and van Doorn (KvD) algorithm, was derived analytically with a technical background. If the distances to the objects in the environment can be assumed to be known, the two estimators are linear and equivalent, but are expressed in different mathematical forms. However, for most situations it is unrealistic to assume that the distances are known. Therefore, the depth structure of the environment needs to be determined in parallel to the self-motion parameters and leads to a non-linear problem. It is shown that the standard least mean square approach that is used by the KvD algorithm leads to a biased estimator. We derive a modification of this algorithm in order to remove the bias and demonstrate its improved performance by means of numerical simulations. For self-motion estimation it is beneficial to have a spherical visual field, similar to many flying insects. We show that in this case the representation of the depth structure of the environment derived from the optic flow can be simplified. Based on this result, we develop an adaptive matched filter approach for systems with a nearly spherical visual field. Then only eight parameters about the environment have to be memorized and updated during self-motion.
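As the abstract notes, the estimation problem is linear when distances are known. The sketch below recovers yaw rate and forward speed from a synthetic horizontal flow field by least squares. The planar flow model flow(az) = w + T*sin(az)/d(az) and all names here are simplifying assumptions for illustration; this is neither the KvD algorithm nor the matched-filter formulation itself.

```python
import math

def estimate_self_motion(azimuths, flows, distances):
    """Least-squares fit of yaw rate w and forward speed T to a planar
    horizontal flow field, assuming known distances d(az):
        flow(az) = w + T * sin(az) / d(az)
    Solves the 2x2 normal equations of this linear model directly."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for az, f, d in zip(azimuths, flows, distances):
        a = math.sin(az) / d  # coefficient of T in this viewing direction
        s11 += 1.0
        s12 += a
        s22 += a * a
        b1 += f
        b2 += a * f
    det = s11 * s22 - s12 * s12
    w = (s22 * b1 - s12 * b2) / det
    T = (s11 * b2 - s12 * b1) / det
    return w, T

# Synthetic flow field from known self-motion (w = 0.2 rad/s, T = 1.5 m/s)
# over an uneven depth structure; the linear estimator recovers both.
azs = [i * 2.0 * math.pi / 36.0 for i in range(36)]
dists = [2.0 + math.cos(az) for az in azs]
flows = [0.2 + 1.5 * math.sin(az) / d for az, d in zip(azs, dists)]
w_hat, T_hat = estimate_self_motion(azs, flows, dists)
```

With noiseless input and known depths the fit is exact up to floating-point error; the non-linear difficulty discussed in the abstract arises precisely when the distances must be estimated jointly with the self-motion parameters.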
Collapse
Affiliation(s)
- Simon Strübbe
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Wolfgang Stürzl
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
Collapse
|
25
|
Kress D, van Bokhorst E, Lentink D. How Lovebirds Maneuver Rapidly Using Super-Fast Head Saccades and Image Feature Stabilization. PLoS One 2015; 10:e0129287. [PMID: 26107413 PMCID: PMC4481315 DOI: 10.1371/journal.pone.0129287] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2014] [Accepted: 05/06/2015] [Indexed: 11/18/2022] Open
Abstract
Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take-off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkable stereotypical gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
Collapse
Affiliation(s)
- Daniel Kress
- Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America
- Evelien van Bokhorst
- Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America; Department of Mechanical Engineering and Aeronautics, City University London, London, United Kingdom
- David Lentink
- Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America; Experimental Zoology Group, Wageningen University, Wageningen, The Netherlands
Collapse
|