1. Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020;16:e1008272. PMID: 33048938. PMCID: PMC7553325. DOI: 10.1371/journal.pcbi.1008272.
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects such as bees and ants are good models for studying efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the insect's current view matches the learned information, the insect is easily guided home. Occasionally, however, the visual environment may change while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips to create a conflict situation. We recorded bumblebees' return flights under these circumstances and investigated where they searched for their nest entrance depending on the degree of displacement between the two visually relevant cue constellations. Bumblebees mostly searched at the fictive nest location indicated by one cue constellation or the other, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information, rather than just brightness information, was encoded and learned.

Returning home sounds trivial, but returning to a concealed underground location such as a burrow is less easy. For buff-tailed bumblebees, this task is routine: after collecting pollen in gardens or flowering meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee, which therefore guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have examined how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, leading the bees to a single new location. In nature, however, the objects constituting the visual environment may be displaced in a disorderly fashion, as some are more susceptible than others to factors such as wind. In our study, we moved objects so as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between, and the distance between the fictitious nests affected the bees' search. Finally, we could predict the search locations using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
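The contrast the paper draws between a single holistic view and a bank of views can be illustrated in a few lines. The following is a minimal numpy sketch under invented assumptions (a toy 1-D panorama renderer, a ring of eight snapshots around the nest, arbitrary step sizes), not the authors' models; familiarity is simply the best RMS match against the snapshot bank, and the agent greedily descends on it.

```python
import numpy as np

def render_panorama(pos, landmarks, n_px=180):
    """Toy 1-D panorama: one brightness bump per landmark, wider when the
    landmark is closer (a crude apparent-size model)."""
    az = np.linspace(-np.pi, np.pi, n_px, endpoint=False)
    img = np.zeros(n_px)
    for lm in landmarks:
        d = lm - pos
        bearing = np.arctan2(d[1], d[0])
        width = 0.5 / max(np.hypot(*d), 0.5)          # nearer -> wider
        ang = np.angle(np.exp(1j * (az - bearing)))   # wrapped angle difference
        img += np.exp(-(ang / width) ** 2)
    return img

def familiarity(view, snapshots):
    """Best (lowest) RMS difference against the whole snapshot bank."""
    return min(np.sqrt(np.mean((view - s) ** 2)) for s in snapshots)

landmarks = np.array([[2.0, 0.0], [0.0, 3.0], [-2.5, -1.0]])
nest = np.array([0.0, 0.0])
# Multi-view model: a bank of snapshots taken on a small ring around the nest.
ring = [nest + 0.3 * np.array([np.cos(a), np.sin(a)])
        for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
snapshots = [render_panorama(p, landmarks) for p in ring]

pos = np.array([3.0, 2.0])                    # release point
for _ in range(100):                          # greedy descent on familiarity
    steps = [pos + 0.1 * np.array([np.cos(a), np.sin(a)])
             for a in np.linspace(0, 2 * np.pi, 16, endpoint=False)]
    pos = min(steps, key=lambda q: familiarity(render_panorama(q, landmarks),
                                               snapshots))
print("final distance to nest:", round(float(np.hypot(*(pos - nest))), 2))
```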
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany

2. Schwarz S, Mangan M, Webb B. Route-following ants respond to alterations of the view sequence. J Exp Biol 2020;223:jeb218701. PMID: 32487668. DOI: 10.1242/jeb.218701.
Abstract
Ants can navigate by comparing the currently perceived view with memorised views along a familiar foraging route. Models of route-following suggest that the views are stored and recalled independently of the sequence in which they occur. Hence, the ant only needs to evaluate the instantaneous familiarity of the current view to obtain a heading direction. This study investigates whether ant homing behaviour is influenced by alterations in the sequence of views experienced along a familiar route, using the frequency of stop-and-scan behaviour as an indicator of the ant's navigational uncertainty. Ants were trained to forage between their nest and a feeder, which they exited through a short channel before proceeding along the homeward route. In tests, ants were collected before entering the nest and released again in the channel, which was placed either in its original location or halfway along the route. Ants exiting the familiar channel in the middle of the route would thus experience familiar views in a novel sequence. Results show that ants exiting the channel scan significantly more when they find themselves in the middle of the route than when they emerge at the expected location near the feeder. This behaviour suggests that previously encountered views influence the recognition of current views, even when these views are highly familiar, revealing a sequence component of route memory. How information about view sequences could be implemented in the insect brain, as well as potential alternative explanations for our results, are discussed.
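One hypothetical way to add a sequence component to view-based route memory is to gate recall by route position: compare the current view only against a small window of memories just ahead of the last recognised one. The sketch below is an illustration of that idea, not the mechanism the paper tests; the views are toy random vectors, and the window size and acceptance threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
route_views = rng.normal(size=(50, 64))   # toy views memorised along a route

def best_match(view, bank):
    """Index and RMS distance of the closest view in the bank."""
    d = np.sqrt(np.mean((bank - view) ** 2, axis=1))
    return int(np.argmin(d)), float(np.min(d))

class SequenceRecall:
    """Sequence-gated recall: only memories in a window just ahead of the
    last recognised one are considered; the pointer advances on a good
    match. High returned distance would correspond to scanning."""
    def __init__(self, bank, window=5, threshold=0.9):
        self.bank, self.window, self.threshold = bank, window, threshold
        self.pointer = 0
    def familiarity(self, view):
        lo = self.pointer
        hi = min(lo + self.window, len(self.bank))
        idx, dist = best_match(view, self.bank[lo:hi])
        if dist < self.threshold:         # accept and advance along the route
            self.pointer = lo + idx
        return dist

seq = SequenceRecall(route_views)
# Re-experiencing view 25 right after view 3 is 'out of sequence', so the
# second familiarity value is high even though the view itself is familiar:
print(seq.familiarity(route_views[3]), seq.familiarity(route_views[25]))
```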
Affiliation(s)
- Sebastian Schwarz
- Centre de Recherches sur la Cognition Animale, CNRS, Université Paul Sabatier, Toulouse, 31062 Cedex 09, France
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, Western Bank, Sheffield S10 2TN, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, Crichton Street, Edinburgh EH8 9AB, UK
- Antoine Wystrach
- Centre de Recherches sur la Cognition Animale, CNRS, Université Paul Sabatier, Toulouse, 31062 Cedex 09, France

3. Le Möel F, Wystrach A. Opponent processes in visual memories: A model of attraction and repulsion in navigating insects' mushroom bodies. PLoS Comput Biol 2020;16:e1007631. PMID: 32023241. PMCID: PMC7034919. DOI: 10.1371/journal.pcbi.1007631.
Abstract
Solitary foraging insects display stunning navigational behaviours in visually complex natural environments. Current literature assumes that these insects are mostly driven by attractive visual memories, which are learnt when the insect's gaze is precisely oriented toward the goal direction, typically along its familiar route or towards its nest. That way, an insect could return home by simply moving in the direction that appears most familiar. Here we show, using virtual reconstructions of natural environments, that this principle suffers from fundamental drawbacks, notably that a given view of the world does not provide information about whether the agent should turn or not to reach its goal. We propose a simple model where the agent continuously compares its current view with both goal and anti-goal visual memories, which are treated as attractive and repulsive respectively. We show that this strategy effectively results in an opponent process, albeit not at the perceptual level (such as those proposed for colour vision or polarisation detection) but at the level of environmental space. This opponent process yields a signal that strongly correlates with the angular error of the current body orientation, so that a single view of the world now suffices to indicate whether the agent should turn or not. By incorporating this principle into a simple agent navigating in reconstructed natural environments, we show that it overcomes the usual shortcomings and produces a step increase in navigation effectiveness and robustness. Our findings provide a functional explanation for recent behavioural observations in ants, and for why and how so-called aversive and appetitive memories must be combined. We propose a likely neural implementation, based on the circuitry of insects' mushroom bodies, that produces behavioural and neural predictions contrasting with those of previous models.
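The opponent computation itself is compact: steer by the difference between the familiarity of goal-facing and anti-goal-facing memories. Below is a minimal sketch in a deliberately unrealistic toy world where the view depends only on body orientation; it is meant to show the claimed correlation between the opponent signal and angular error, not to reproduce the paper's mushroom-body model.

```python
import numpy as np

def rms(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

def opponent_signal(view, goal_bank, antigoal_bank):
    """Memories learnt facing the goal are attractive, memories learnt
    facing away are repulsive. Positive output suggests 'keep going',
    negative suggests 'turn'."""
    att = min(rms(view, m) for m in goal_bank)       # low = familiar
    rep = min(rms(view, m) for m in antigoal_bank)
    return rep - att

# Toy world: the view is a noisy function of body orientation only.
rng = np.random.default_rng(1)
def view_at(angle, n_px=90):
    base = np.sin(np.linspace(0, 2 * np.pi, n_px) * 3 + angle)
    return base + 0.05 * rng.normal(size=n_px)

goal_bank = [view_at(0.0) for _ in range(5)]         # facing the nest
antigoal_bank = [view_at(np.pi) for _ in range(5)]   # facing away from it

for err in np.linspace(-np.pi, np.pi, 9):            # sweep of angular error
    s = opponent_signal(view_at(err), goal_bank, antigoal_bank)
    print(f"angular error {err:+.2f} rad -> opponent signal {s:+.3f}")
```

In this toy the signal is maximal at zero error and falls off monotonically with |error|, which is the property that lets a single view tell the agent whether to turn.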
Affiliation(s)
- Florent Le Möel
- Research Centre on Animal Cognition, University Paul Sabatier/CNRS, Toulouse, France
- Antoine Wystrach
- Research Centre on Animal Cognition, University Paul Sabatier/CNRS, Toulouse, France

4. Presotto A, Fayrer-Hosken R, Curry C, Madden M. Spatial mapping shows that some African elephants use cognitive maps to navigate the core but not the periphery of their home ranges. Anim Cogn 2019;22:251-263. PMID: 30689116. DOI: 10.1007/s10071-019-01242-9.
Abstract
Strategies of navigation have been shown to play a critical role when animals revisit resource sites across large home ranges. The habitual route system appears to be a sufficient strategy for animals to navigate while avoiding the cognitive cost of traveling using a Euclidean map. We hypothesized that wild elephants travel more frequently along habitual routes to revisit resource sites, as opposed to using a Euclidean map. To identify the elephants' habitual routes, we created a Python script that identified the frequently used route segments constituting habitual routes. Results showed that elephants navigate flexibly across the Kruger National Park landscape, shifting strategies depending on the familiarity of their surroundings. In the core area of their home range, elephants traveled as if using a Euclidean map, but intraindividual differences showed that they switched to habitual routes when navigating within the less familiar periphery of their home range. These findings are analogous to recent experimental results in smaller mammals showing that rats encode locations according to their familiarity with their surroundings. In addition, as recently observed in monkeys, intersections of habitual routes are important locations for elephants when making navigation decisions. We found a strong association between intersections and new segment usage when elephants revisit resource sites, suggesting that intersection choice may contribute to the spatial representations elephants use when repeatedly revisiting resource sites.
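The route-segment analysis suggests a simple frequency computation. The sketch below is a guess at the general approach rather than the authors' script: GPS tracks are discretised onto a grid, directed cell-to-cell steps are counted across trips, and recurring steps are kept as habitual-route segments; the cell size and count threshold are invented.

```python
import numpy as np
from collections import Counter

def habitual_segments(tracks, cell=0.01, min_count=3):
    """Count how often each directed cell-to-cell step is traversed across
    trips; steps used repeatedly are taken to form the habitual routes."""
    counts = Counter()
    for track in tracks:                    # track: (n, 2) lon/lat array
        cells = np.floor(np.asarray(track) / cell).astype(int)
        # Drop consecutive duplicates so dwelling in a cell is not counted.
        keep = np.any(np.diff(cells, axis=0) != 0, axis=1)
        cells = np.vstack([cells[:1], cells[1:][keep]])
        for a, b in zip(map(tuple, cells[:-1]), map(tuple, cells[1:])):
            counts[(a, b)] += 1
    return {seg for seg, n in counts.items() if n >= min_count}

# Toy usage: three trips re-using one corridor, plus one exploratory trip.
rng = np.random.default_rng(2)
corridor = np.column_stack([np.linspace(0, 1, 50), np.full(50, 0.5)])
trips = [corridor + 0.001 * rng.normal(size=corridor.shape) for _ in range(3)]
trips.append(rng.random((50, 2)))
print(f"{len(habitual_segments(trips))} habitual segments found")
```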
Affiliation(s)
- Andrea Presotto
- Department of Geography and Geosciences, Salisbury University, 1101 Camden Avenue, Salisbury, MD, 21801, USA
- Richard Fayrer-Hosken
- San Diego Zoo, Institute for Conservation Research, 15600 San Pasqual Valley Rd, Escondido, CA, 92027, USA
- Caitlin Curry
- Department of Geography and Geosciences, Salisbury University, 1101 Camden Avenue, Salisbury, MD, 21801, USA
- Marguerite Madden
- Center for Geospatial Research, University of Georgia, 210 Field Street, Athens, GA, 30602, USA

5. Stone T, Mangan M, Wystrach A, Webb B. Rotation invariant visual processing for spatial memory in insects. Interface Focus 2018;8:20180010. PMID: 29951190. PMCID: PMC6015815. DOI: 10.1098/rsfs.2018.0010.
Abstract
Visual memory is crucial to navigation in many animals, including insects. Here, we focus on the problem of visual homing, that is, using a comparison of the view at the current location with a view stored at the home location to control movement towards home by a novel shortcut. Insects show several visual specializations that appear advantageous for this task, including an almost panoramic field of view and ultraviolet light sensitivity, which enhances the salience of the skyline. We discuss several proposals for subsequent processing of the image to obtain the required motion information, focusing on how each might deal with the problem of yaw rotation of the current view relative to the home view. Possible solutions include tagging of views with information from the celestial compass system, using multiple views pointing towards home, or a rotation-invariant encoding of the view. We briefly illustrate how a well-known shape description method from computer vision, Zernike moments, could provide a compact and rotation-invariant representation of sky shapes to enhance visual homing. We discuss the biological plausibility of this solution, and also a fourth strategy, based on observed behaviour of insects, that involves transfer of information from visual memory matching to the compass system.
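The defining property of such an encoding is that yawing on the spot leaves it unchanged. The sketch below does not compute Zernike moments; as a simpler numpy-only stand-in with the same rotation-invariance property, it takes the Fourier magnitude spectrum of a 1-D panoramic skyline profile (a yaw rotation circularly shifts the profile and therefore changes only the phases).

```python
import numpy as np

def rotation_invariant_code(skyline):
    """A circular shift of the skyline profile changes only the phases of
    its DFT, so the magnitude spectrum is invariant to yaw rotation. This
    is a 1-D stand-in for the property Zernike-moment magnitudes provide
    for 2-D sky shapes."""
    return np.abs(np.fft.rfft(skyline))

rng = np.random.default_rng(3)
skyline = rng.random(360)              # skyline elevation, one sample per degree
rotated = np.roll(skyline, 117)        # same place, different body heading

a = rotation_invariant_code(skyline)
b = rotation_invariant_code(rotated)
print("max code difference under rotation:", np.max(np.abs(a - b)))  # ~0
```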
Affiliation(s)
- Thomas Stone
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, Regent Court, Sheffield S1 4DP, UK
- Antoine Wystrach
- CNRS, Université Paul Sabatier, Toulouse, 31062 cedex 09, France
- Barbara Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, UK

6.
Abstract
Navigation in cluttered environments is an important challenge for animals and robots alike and has been the subject of many studies trying to explain and mimic animal navigational abilities. However, the question of selecting an appropriate home location has so far received little attention. This is surprising, since the choice of a home location might greatly influence an animal's navigation performance. To address the question of home choice in cluttered environments, we performed a systematic analysis of homing trajectories in computer simulations using a skyline-based local homing method. Our analysis reveals that homing performance strongly depends on the location of the home in the environment. Furthermore, it appears that by assessing homing success in the immediate vicinity of the home, an animal might be able to predict its overall success in returning to it from within a much larger area.
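The closing idea, that homing success right next to the home predicts success from much farther away, can be probed with a cheap local score. The sketch below is not the authors' simulation: the skyline renderer over random objects and all parameters are invented, and the score merely counts how often one descent step on the skyline difference moves a nearby probe point closer to home.

```python
import numpy as np

def skyline(pos, objects, n_bins=72):
    """Toy skyline: elevation angle of the tallest object per azimuth bin,
    seen from pos = (x, y). Objects are rows of (x, y, height)."""
    sky = np.zeros(n_bins)
    for ox, oy, h in objects:
        d = np.hypot(ox - pos[0], oy - pos[1])
        b = np.arctan2(oy - pos[1], ox - pos[0])
        i = int((b + np.pi) / (2 * np.pi) * n_bins) % n_bins
        sky[i] = max(sky[i], np.arctan2(h, d))
    return sky

def local_homing_score(home, objects, r=0.5, n_probes=16):
    """Fraction of probe points on a small ring around `home` from which a
    single descent step on the skyline difference moves closer to home: a
    cheap local proxy for the size of the surrounding catchment area."""
    ref = skyline(home, objects)
    ok = 0
    for a in np.linspace(0, 2 * np.pi, n_probes, endpoint=False):
        p = home + r * np.array([np.cos(a), np.sin(a)])
        steps = [p + 0.1 * np.array([np.cos(t), np.sin(t)])
                 for t in np.linspace(0, 2 * np.pi, 12, endpoint=False)]
        best = min(steps,
                   key=lambda q: np.mean((skyline(q, objects) - ref) ** 2))
        ok += np.hypot(*(best - home)) < np.hypot(*(p - home))
    return ok / n_probes

rng = np.random.default_rng(4)
objects = rng.uniform([-5, -5, 0.5], [5, 5, 2.0], size=(30, 3))
print("local homing score:", local_homing_score(np.array([0.0, 0.0]), objects))
```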
Affiliation(s)
- Martin M. Müller
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Olivier J. N. Bertrand
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Dario Differt
- Computer Engineering Group, Faculty of Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany

7.
Abstract
Navigation is an essential skill for many animals, and understanding how animals use environmental information, particularly visual information, to navigate has a long history in both ethology and psychology. In birds, the dominant approach for investigating navigation at small scales comes from comparative psychology, which emphasizes the cognitive representations underpinning spatial memory. The majority of this work is based in the laboratory, and it is unclear whether this context itself affects the information that birds learn and use when they search for a location. Data from hummingbirds suggest that birds in the wild might use visual information in quite a different manner. To reconcile these differences, we propose here a new approach to avian navigation, inspired by the sensory-driven study of navigation in insects. Using methods devised for studying the navigation of insects, it is possible to quantify the visual information available to navigating birds, and then to determine how this information influences those birds' navigation decisions. Focusing on four areas that we consider characteristic of the insect navigation perspective, we discuss how this approach has shed light on the information insects use to navigate, and assess the prospects of taking a similar approach with birds. Although birds and insects differ in many ways, there is nothing in the insect-inspired approach we describe that means these methods need be restricted to insects. On the contrary, adopting such an approach could provide a fresh perspective on the well-studied question of how birds navigate through a variety of environments.
Affiliation(s)
- Susan D Healy
- School of Biology, University of St Andrews, Fife, UK

8. Murray T, Zeil J. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes. PLoS One 2017;12:e0187226. PMID: 29088300. PMCID: PMC5663442. DOI: 10.1371/journal.pone.0187226.
Abstract
Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals, by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals, there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models, with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending over metres, depending on the sensitivity of the viewer to image differences. The size and shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used, and also when views include a panorama 1 km away. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
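A catchment volume can be estimated by thresholding an image difference function on a 3-D grid around the reference position. The sketch below is a crude stand-in for the paper's method, which rendered views inside photogrammetric 3-D models of real habitats; here a toy beacon renderer, grid spacing, and tolerance are invented for illustration.

```python
import numpy as np

beacons = np.array([[4, 0, 1], [-3, 3, 2], [0, -5, 1.5]], dtype=float)

def render(p, n_px=60):
    """Toy panorama: a brightness bump per beacon, scaled by 1/distance."""
    az = np.linspace(-np.pi, np.pi, n_px, endpoint=False)
    img = np.zeros(n_px)
    for b in beacons:
        v = b - p
        ang = np.angle(np.exp(1j * (az - np.arctan2(v[1], v[0]))))
        img += np.exp(-ang ** 2) / max(np.linalg.norm(v), 0.5)
    return img

def catchment_fraction(ref_pos, extent=5.0, step=0.5, tol=0.1):
    """Fraction of grid points around ref_pos whose view differs from the
    reference snapshot by less than tol (RMS): a crude 'catchment volume'."""
    ref = render(ref_pos)
    axis = np.arange(-extent, extent + step, step)
    inside = total = 0
    for dx in axis:
        for dy in axis:
            for dz in axis[axis >= -ref_pos[2]]:     # stay above the ground
                view = render(ref_pos + [dx, dy, dz])
                inside += np.sqrt(np.mean((view - ref) ** 2)) < tol
                total += 1
    return inside / total

print("catchment fraction:", catchment_fraction(np.array([0.0, 0.0, 1.0])))
```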
Affiliation(s)
- Trevor Murray
- Research School of Biology, Australian National University, Canberra, Australia
- Jochen Zeil
- Research School of Biology, Australian National University, Canberra, Australia

9. Webb B, Wystrach A. Neural mechanisms of insect navigation. Curr Opin Insect Sci 2016;15:27-39. PMID: 27436729. DOI: 10.1016/j.cois.2016.02.011.
Abstract
We know more about the ethology of insect navigation than about the neural substrates. Few studies have shown direct effects of brain manipulation on navigational behaviour, or measured brain responses that clearly relate to the animal's current location or spatial target independently of specific sensory cues. This is partly due to the methodological problems of obtaining neural data from a naturally behaving animal. However, substantial indirect evidence, such as comparative anatomy and knowledge of the neural circuits that provide relevant sensory inputs, offers converging arguments for the role of two specific brain areas: the mushroom bodies and the central complex. Finally, modelling can help bridge the gap by relating the computational requirements of a given navigational task to the type of computation offered by different brain areas.
Affiliation(s)
- Barbara Webb
- School of Informatics, University of Edinburgh, 10 Crichton St, Edinburgh EH8 9AB, UK
- Antoine Wystrach
- Centre de Recherches sur la Cognition Animale, Centre National de la Recherche Scientifique, Université Paul Sabatier, Toulouse, France

10. Ardin PB, Mangan M, Webb B. Ant Homing Ability Is Not Diminished When Traveling Backwards. Front Behav Neurosci 2016;10:69. PMID: 27147991. PMCID: PMC4829585. DOI: 10.3389/fnbeh.2016.00069.
Abstract
Ants are known to be capable of homing to their nest after displacement to a novel location. This is widely assumed to involve some form of retinotopic matching between their current view and previously experienced views. One simple algorithm proposed to explain this behavior is continuous retinotopic alignment, in which the ant constantly adjusts its heading by rotating to minimize the pixel-wise difference of its current view from all views stored while facing the nest. However, ants with large prey items will often drag them home while facing backwards. We tested whether displaced ants (Myrmecia croslandi) dragging prey could still home despite experiencing an inverted view of their surroundings under these conditions. Ants moving backwards with food took similarly direct paths to the nest as ants moving forward without food, demonstrating that continuous retinotopic alignment is not a critical component of homing. It is possible that ants use initial or intermittent retinotopic alignment, coupled with some other direction-stabilizing cue that they can utilize when moving backward. However, though most ants dragging prey would occasionally look toward the nest, we observed that their heading direction was not noticeably improved afterwards. We assume ants must use comparison of current and stored images to correct their path, but suggest that they are either able to choose the appropriate visual memory for comparison using an additional mechanism, or can make such comparisons without retinotopic alignment.
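The continuous retinotopic alignment algorithm that the paper argues against is simple to state: rotate the current panorama until the pixel-wise difference to the bank of nest-facing memories is minimal. Below is a minimal numpy sketch of that algorithm, with circular shifts of a 1-D panorama standing in for body rotations and toy random data.

```python
import numpy as np

def heading_from_alignment(current, nest_facing_views):
    """Rotate the current panorama (circular shift) and return the rotation
    that minimises the pixel-wise difference to the best nest-facing memory."""
    n = len(current)
    err, shift = min((np.mean((np.roll(current, -s) - m) ** 2), s)
                     for s in range(n) for m in nest_facing_views)
    return shift * 360.0 / n       # rotation (degrees) that best aligns them

rng = np.random.default_rng(5)
memory = rng.random(120)                       # stored while facing the nest
current = np.roll(memory, 40)                  # same place, body rotated
print(heading_from_alignment(current, [memory]), "degrees")   # 120.0
```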
Affiliation(s)
- Paul B Ardin
- Insect Robotics Lab, School of Informatics, University of Edinburgh, Edinburgh, UK
- Michael Mangan
- Insect Robotics Lab, School of Informatics, University of Edinburgh, Edinburgh, UK
- Barbara Webb
- Insect Robotics Lab, School of Informatics, University of Edinburgh, Edinburgh, UK

11. Dewar ADM, Wystrach A, Graham P, Philippides A. Navigation-specific neural coding in the visual system of Drosophila. Biosystems 2015;136:120-127. PMID: 26310914. DOI: 10.1016/j.biosystems.2015.07.008.
Abstract
Drosophila melanogaster is a good system in which to understand the minimal requirements for widespread visually guided behaviours such as navigation, owing to its small brain (adults possess only 100,000 neurons) and the availability of neurogenetic techniques that allow the identification of task-specific cell types. Recently published data describe the receptive fields of two classes of visually responsive neurons (R2 and R3/R4d ring neurons in the central complex) that are essential for visual tasks such as orientation memory for salient objects and simple pattern discriminations. What is interesting is that these cells are few in number and have very large receptive fields, suggesting that each sub-population of cells might be a bottleneck in the processing of visual information for a specific behaviour, as each subset of cells effectively condenses information from approximately 3000 visual receptors in the eye to fewer than 50 neurons in total. It has recently been shown that R1 ring neurons, which receive input from the same areas as the R2 and R3/R4d cells, are necessary for place learning in Drosophila. However, how R1 neurons enable place learning is unknown. By examining the information provided by different populations of hypothetical visual neurons in simulations of experimental arenas, we show that neurons with ring-neuron-like receptive fields are sufficient for defining a location visually. In this way we provide a link between the type of information conveyed by ring neurons and the behaviour they support.
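The bottleneck the abstract describes, thousands of photoreceptors condensed into fewer than 50 broadly tuned cells, can be mimicked with a single weighted pooling. The sketch below is a hypothetical illustration rather than the paper's arena simulations: a 1-D panorama is pooled by a few dozen broad von Mises receptive fields, and the resulting activation vector repeats exactly for one place while differing between places.

```python
import numpy as np

def ring_neuron_code(panorama, n_cells=46, kappa=2.0):
    """Pool a 1-D panorama with a few dozen very broad, overlapping azimuthal
    receptive fields (von Mises weights): a ~3000-input to <50-output
    compression in the spirit of the ring-neuron bottleneck."""
    n_px = len(panorama)
    az = np.linspace(-np.pi, np.pi, n_px, endpoint=False)
    centres = np.linspace(-np.pi, np.pi, n_cells, endpoint=False)
    weights = np.exp(kappa * np.cos(az[None, :] - centres[:, None]))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ panorama          # one activation per model cell

rng = np.random.default_rng(6)
view_a = rng.random(3000)              # ~3000 'photoreceptor' samples, place A
view_b = rng.random(3000)              # view from a different place B
code_a = ring_neuron_code(view_a)
print("same place :", np.linalg.norm(code_a - ring_neuron_code(view_a)))  # 0
print("other place:", np.linalg.norm(code_a - ring_neuron_code(view_b)))  # > 0
```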
Affiliation(s)
- Alex D M Dewar
- School of Life Sciences, John Maynard Smith Building, University of Sussex, Falmer BN1 9QJ, UK
- Antoine Wystrach
- School of Informatics, University of Edinburgh, Appleton Tower, 11 Crichton Street, Edinburgh EH8 9LE, UK
- Paul Graham
- School of Life Sciences, John Maynard Smith Building, University of Sussex, Falmer BN1 9QJ, UK
- Andrew Philippides
- Department of Informatics, Chichester I, University of Sussex, Falmer, Brighton BN1 9QJ, UK

12.
Abstract
Visual navigation is a critical behaviour for many animals, and it has been particularly well studied in ants. Decades of ant navigation research have uncovered many ways in which efficient navigation can be implemented in small brains. For example, ants show us how visual information can drive navigation via procedural rather than map-like instructions. Two recent behavioural observations highlight interesting adaptive ways in which ants implement visual guidance. Firstly, it has been shown that the systematic nest searches of ants can be biased by recent experience of familiar scenes. Secondly, ants have been observed to show temporary periods of confusion when asked to repeat a route segment, even if that route segment is very familiar. Taken together, these results indicate that the navigational decisions of ants take into account their recent experiences as well as the currently perceived environment.
Affiliation(s)
- Paul Graham
- School of Life Sciences, University of Sussex, CCNR, JMS Building, Brighton BN1 9QG, UK
- Michael Mangan
- School of Life Sciences, University of Sussex, CCNR, JMS Building, Brighton BN1 9QG, UK

13. Ardin P, Mangan M, Wystrach A, Webb B. How variation in head pitch could affect image matching algorithms for ant navigation. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015;201:585-597. PMID: 25895895. PMCID: PMC4439443. DOI: 10.1007/s00359-015-1005-8.
Abstract
Desert ants are a model system for animal navigation, using visual memory to follow long routes across both sparse and cluttered environments. Most accounts of this behaviour assume retinotopic image matching, e.g. recovering heading direction by finding a minimum in the image difference function as the viewpoint rotates. But most models neglect the potential image distortion that could result from unstable head motion. We report that for ants running across a short section of natural substrate, head pitch varies substantially: by over 20 degrees with no load, and by 60 degrees when carrying a large food item. There is no evidence of head stabilisation. Using a realistic simulation of the ant's visual world, we demonstrate that this range of head pitch significantly degrades image matching. The effect of pitch variation can be ameliorated by a memory bank of views densely sampled along a route, so that an image sufficiently similar in pitch and location is available for comparison. However, with large pitch disturbance, inappropriate memories sampled at distant locations are often recalled, and navigation along a route can be adversely affected. Ignoring images obtained at extreme pitches, or averaging images over several pitches, does not significantly improve performance.
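Why pitch degrades retinotopic matching, and why a denser memory bank helps, can be seen in a toy 2-D panorama where pitch is modelled as a vertical image shift. This is not the paper's simulation, which used a realistic reconstruction of the ants' habitat; the synthetic panorama, the shift model, and the sampled pitches are all invented, and the demonstration is qualitative.

```python
import numpy as np

y = np.linspace(0, np.pi, 40)[:, None]                        # elevation rows
x = np.linspace(0, 2 * np.pi, 120, endpoint=False)[None, :]   # azimuth columns
pano = np.sin(3 * x) * np.cos(2 * y) + np.cos(5 * x + 1.0) * np.sin(y)

def match_error(view, memory):
    """Depth of the rotational image difference function: RMS difference at
    the best yaw (column shift). Larger = worse retinotopic match."""
    return min(np.sqrt(np.mean((np.roll(view, s, axis=1) - memory) ** 2))
               for s in range(view.shape[1]))

def pitched(img, rows):
    """Crude pitch model: shift the panorama vertically, zero-fill the gap."""
    out = np.roll(img, rows, axis=0)
    if rows > 0:
        out[:rows] = 0
    elif rows < 0:
        out[rows:] = 0
    return out

memory = pano.copy()                          # stored at level head pitch
for rows in (0, 3, 8):                        # growing head-pitch disturbance
    err = match_error(pitched(pano, rows), memory)
    print(rows, "rows of pitch -> error", round(err, 3))

# A bank of memories sampled at several pitches restores a closer match:
bank = [pitched(pano, r) for r in (-8, -4, 0, 4, 8)]
best = min(match_error(pitched(pano, 6), m) for m in bank)
print("with pitch-sampled bank:", round(best, 3))
```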
Affiliation(s)
- Paul Ardin
- School of Informatics, University of Edinburgh, 10 Crichton St, Edinburgh, EH8 9AB, UK
- Michael Mangan
- School of Informatics, University of Edinburgh, 10 Crichton St, Edinburgh, EH8 9AB, UK
- Antoine Wystrach
- School of Informatics, University of Edinburgh, 10 Crichton St, Edinburgh, EH8 9AB, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, 10 Crichton St, Edinburgh, EH8 9AB, UK

14. Stürzl W, Grixa I, Mair E, Narendra A, Zeil J. Three-dimensional models of natural environments and the mapping of navigational information. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015;201:563-584. DOI: 10.1007/s00359-015-1002-y.

15. Möller R. A model of ant navigation based on visual prediction. J Theor Biol 2012;305:118-130. PMID: 22554981. DOI: 10.1016/j.jtbi.2012.04.022.
Abstract
A model of visual navigation in ants is presented, based on a simple network that predicts how a visual scene changes under translatory movements. The model contains two behavioral components: the acquisition of multiple snapshots in different orientations during a learning walk, and the selection of a movement direction by a scanning behavior in which the ant searches through different headings. Both components fit observations from experiments with desert ants. In most aspects the model is biologically plausible with respect to the equivalent neural networks, and it produces reliable homing behavior in a simulated environment with a complex random surface texture. The model is closely related to the algorithmic min-warping method for visual robot navigation, which shows good homing performance in real-world environments.
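The scanning component can be sketched without the paper's prediction network: predict the view after a short forward step in each candidate heading and keep the heading whose prediction best matches a stored snapshot. The sketch below uses the classic equal-distance warping approximation for the prediction step, which is related to, but much simpler than, the learned predictor and the min-warping method mentioned above; the data are toy.

```python
import numpy as np

def warp_forward(pano, heading, step=0.2, depth=1.0):
    """Predict a 1-D panorama after a short forward step toward `heading`,
    under the equal-distance assumption: a feature at azimuth a shifts by
    roughly (step / depth) * sin(a - heading)."""
    n = len(pano)
    az = np.linspace(-np.pi, np.pi, n, endpoint=False)
    src = az + (step / depth) * np.sin(az - heading)   # sampling positions
    idx = np.round((src + np.pi) / (2 * np.pi) * n).astype(int) % n
    return pano[idx]

def scan_for_heading(current, snapshot, n_headings=36):
    """Scanning behaviour: test candidate headings and keep the one whose
    predicted next view best matches the stored snapshot."""
    cands = np.linspace(-np.pi, np.pi, n_headings, endpoint=False)
    errs = [np.mean((warp_forward(current, h) - snapshot) ** 2) for h in cands]
    return cands[int(np.argmin(errs))]

rng = np.random.default_rng(8)
current = rng.random(180)
snapshot = warp_forward(current, heading=0.7)   # view expected nearer home
print("recovered heading:", scan_for_heading(current, snapshot))  # ~0.7 rad
```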
Affiliation(s)
- Ralf Möller
- Computer Engineering, Faculty of Technology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, POB 10 01 31, 33501 Bielefeld, Germany

16. Visual homing: an insect perspective. Curr Opin Neurobiol 2012;22:285-293. PMID: 22221863. DOI: 10.1016/j.conb.2011.12.008.

17. Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2011;4:17-20. PMID: 21509170. DOI: 10.4161/cib.4.1.13763.
Abstract
Bees, wasps and ants navigate successfully between feeding sites and their nest, despite the small size of their brains, which contain fewer than a million neurons. A long history of studies examining the role of visual memories in homing behavior shows that insects can localize a goal by finding a close match between a memorized view at the goal location and their current view ("snapshot matching"). However, the concept of static snapshot matching might not explain all aspects of homing behavior, as honeybees are able to use landmarks that are statically camouflaged. In this case the landmarks are detectable only by relative motion cues between the landmark and the background, which the bees generate when they perform characteristic flight maneuvers close to the landmarks. The bees' navigation performance can be explained by a matching scheme based on optic flow amplitudes ("dynamic snapshot matching"). In this article, I discuss the concept of dynamic snapshot matching in the light of previous literature.
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology & Center of Excellence 'Cognitive Interaction Technology', Bielefeld University, Bielefeld, Germany

18. Dittmar L, Egelhaaf M, Stürzl W, Boeddeker N. The behavioral relevance of landmark texture for honeybee homing. Front Behav Neurosci 2011;5:20. PMID: 21541258. PMCID: PMC3083717. DOI: 10.3389/fnbeh.2011.00020.
Abstract
Honeybees visually pinpoint the location of a food source using landmarks. Studies on the role of visual memories have suggested that bees approach the goal by finding a close match between their current view and a memorized view of the goal location. The most relevant landmark features for this matching process seem to be their retinal positions, their size as defined by their edges, and their color. Recently, we showed that honeybees can use landmarks that are statically camouflaged, suggesting that motion cues are relevant as well. Currently it is unclear how bees weight these different landmark features when accomplishing navigational tasks, and whether this depends on their saliency. Since natural objects are often distinguished by their texture, we investigated the behavioral relevance and the interplay of the spatial configuration and the texture of landmarks. We show that landmark texture is a feature that bees memorize, and that being given the opportunity to identify landmarks by their texture improves the bees' navigational performance. Landmark texture is weighted more strongly than landmark configuration when it provides the bees with positional information and when the texture is salient. In the vicinity of the landmark, honeybees changed their flight behavior according to its texture.
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology and Center of Excellence 'Cognitive Interaction Technology', Bielefeld University, Bielefeld, Germany

19. Dittmar L, Stürzl W, Baird E, Boeddeker N, Egelhaaf M. Goal seeking in honeybees: matching of optic flow snapshots? J Exp Biol 2010;213:2913-2923. DOI: 10.1242/jeb.043737.
Abstract
Visual landmarks guide humans and animals, including insects, to a goal location. Insects, with their miniature brains, have evolved a simple strategy to find their nests or profitable food sources: they approach a goal by finding a close match between the current view and a memorised retinotopic representation of the landmark constellation around the goal. Recent implementations of such a matching scheme use raw panoramic images (‘image matching’) and show that it is well suited to work on robots and even in natural environments. However, this matching scheme works only if relevant landmarks can be detected by their contrast and texture. We therefore tested how honeybees perform in localising a goal if the landmarks can hardly be distinguished from the background by such cues. We recorded the honeybees' flight behaviour with high-speed cameras and compared their search behaviour with computer simulations. We show that honeybees are able to use landmarks that have the same contrast and texture as the background, and suggest that the bees use relative motion cues between the landmark and the background. These cues are generated on the eyes when the bee moves in a characteristic way in the vicinity of the landmarks. This extraordinary navigation performance can be explained by a matching scheme that includes snapshots based on optic flow amplitudes (‘optic flow matching’). This new matching scheme provides a robust strategy for navigation, as it depends primarily on the depth structure of the environment.
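The difference between brightness-based and optic-flow-based snapshots is easy to demonstrate: a nearby landmark with the same brightness as the background is invisible to a brightness snapshot but produces a clear peak in translational flow amplitude. The sketch below is an invented toy (1-D depth profiles, sideways motion, flow magnitude speed * |sin(azimuth)| / depth), not the matching model fitted to the bees' flights.

```python
import numpy as np

def flow_amplitudes(depths, speed=1.0):
    """Translational optic-flow magnitude per azimuth for sideways motion:
    |flow| = speed * |sin(azimuth)| / depth. Near objects move faster on
    the eye, so depth structure is visible at zero brightness contrast."""
    az = np.linspace(-np.pi, np.pi, len(depths), endpoint=False)
    return speed * np.abs(np.sin(az)) / depths

# Toy scene: uniform brightness; a nearby camouflaged landmark (depth 1 m)
# in front of a distant background (10 m), at two different azimuth ranges.
depth_goal = np.full(120, 10.0); depth_goal[40:60] = 1.0   # view at the goal
depth_away = np.full(120, 10.0); depth_away[70:90] = 1.0   # displaced view
brightness = np.full(120, 0.5)                             # identical everywhere

d_bright = np.abs(brightness - brightness).max()           # snapshot is blind
d_flow = np.abs(flow_amplitudes(depth_goal) - flow_amplitudes(depth_away)).max()
print(f"brightness difference: {d_bright}, flow difference: {d_flow:.2f}")
```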
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’, Bielefeld University, 33615 Bielefeld, Germany
- Wolfgang Stürzl
- Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’, Bielefeld University, 33615 Bielefeld, Germany
- Emily Baird
- Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’, Bielefeld University, 33615 Bielefeld, Germany
- Norbert Boeddeker
- Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’, Bielefeld University, 33615 Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’, Bielefeld University, 33615 Bielefeld, Germany