1
Cormons MJ, Zeil J. Digger wasps Microbembex monodonta Say (Hymenoptera, Crabronidae) rely exclusively on visual cues when pinpointing their nest entrances. PLoS One 2023; 18:e0282144. [PMID: 36989296; PMCID: PMC10058119; DOI: 10.1371/journal.pone.0282144]
Abstract
The ability of insects to navigate and home is crucial to fundamental tasks, such as pollination, parental care, procuring food, and finding mates. Despite recent advances in our understanding of visual homing in insects, it remains unclear exactly how ground-nesting Hymenoptera are able to precisely locate their often inconspicuous or hidden reproductive burrow entrances. Here we show that the ground-nesting wasp Microbembex monodonta locates her hidden burrow entrance with the help of local landmarks, but only if her view of the wider panorama is not blocked. Moreover, the wasps are able to pinpoint the burrow location to within a few centimeters even when potential olfactory, tactile and auditory cues are locally masked. We conclude that M. monodonta locates her hidden burrow by relying exclusively on local visual cues in the context of the wider panorama. We discuss these results in the light of the older and more recent literature on nest recognition and homing in insects.
Affiliation(s)
- Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, ACT, Australia
2
Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. [PMID: 36515743; DOI: 10.1007/s00359-022-01599-2]
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorised and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorised (positional image matching). Central place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest, which are likely designed to systematically collect scene memories tagged with information, provided by path integration, on the direction of and the distance to the nest. Equally, when traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
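The alignment-matching idea summarised in this abstract can be sketched numerically: rotate the current panorama against a memorised one and take the rotation that minimises the pixel difference as the heading correction. This is a minimal toy illustration with random panoramas, not the representation used by any particular insect or model; the function names and array sizes are illustrative choices.

```python
import numpy as np

def rot_idf(current, memory):
    """Root-mean-square pixel difference between the memorised panorama and
    the current panorama rotated through every azimuthal shift.
    Views are 2-D arrays (elevation x azimuth)."""
    n_az = memory.shape[1]
    return np.array([
        np.sqrt(np.mean((np.roll(current, s, axis=1) - memory) ** 2))
        for s in range(n_az)
    ])

def heading_from_view(current, memory):
    """Alignment matching: the azimuth shift minimising the image
    difference gives the rotation needed to regain the memorised heading."""
    shift = int(np.argmin(rot_idf(current, memory)))
    return 360.0 * shift / memory.shape[1]  # degrees

# toy panorama: memorise a view, then rotate it and recover the rotation
rng = np.random.default_rng(0)
memory = rng.random((10, 72))           # 5-degree azimuthal resolution
current = np.roll(memory, -6, axis=1)   # agent has turned by 6 bins = 30 deg
```

Positional image matching works the same way in translation rather than rotation: image differences grow smoothly with distance from the memorised location, so a gradient descent on them leads back to it.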
3
Islam M, Deeti S, Murray T, Cheng K. What view information is most important in the homeward navigation of an Australian bull ant, Myrmecia midas? J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022; 208:545-559. [PMID: 36048246; PMCID: PMC9734209; DOI: 10.1007/s00359-022-01565-y]
Abstract
Many insects orient by comparing current panoramic views of their environment to memorised views. We tested the navigational abilities of night-active Myrmecia midas foragers while we blocked segments of their visual panorama. Foragers failed to orient homewards when the front view, lower elevations, entire terrestrial surround, or the full panorama was blocked. Initial scanning increased whenever the visual panorama was blocked, but scanning only increased along the rest of the route when the front, back, higher, or lower elevations were blocked. Ants meandered more when the front, the back, or the higher elevations were obscured. When everything except the canopy was blocked, the ants were quick and direct, but moved in random directions, as if to escape. We conclude that a clear front view, or a clear lower panorama, is necessary for initial homeward headings. Furthermore, the canopy is neither necessary nor sufficient for an initial homeward heading, and the back and upper segments of views, while not necessary, do make finding home easier. Discrepancies between image analysis and ant behaviour when the upper and lower views were blocked suggest that ants are selective in what portions of the scene they attend to or learn.
Affiliation(s)
- Muzahid Islam
- School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Sudhakar Deeti
- School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Trevor Murray
- School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Ken Cheng
- School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
4
Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021; 16:055007. [PMID: 34243169; DOI: 10.1088/1748-3190/ac1307]
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal, or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low-resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route-following approach that makes use of transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view-matching process which can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet-based bandpass orientated filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate using a vision system with a biologically relevant visual acuity and viewing direction.
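The view-matching step this model depends on can be illustrated with a far simpler scheme than the paper's wavelet filter bank: score familiarity as the best raw-pixel match over a small window of translations of a low-resolution downward view. This is only a sketch of "familiarity with translational tolerance" under that simplifying assumption; the function name, window size and resolution are invented for illustration.

```python
import numpy as np

def familiarity(view, memory, max_shift=2):
    """Translationally tolerant familiarity: the smallest RMS difference
    between the stored ground view and the current view over a small
    window of x/y shifts. Lower values mean more familiar."""
    best = np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(view, dy, axis=0), dx, axis=1)
            best = min(best, np.sqrt(np.mean((shifted - memory) ** 2)))
    return best

rng = np.random.default_rng(1)
memory = rng.random((16, 16))           # low-resolution downward view
on_route = np.roll(memory, 1, axis=1)   # same ground patch, slightly displaced
off_route = rng.random((16, 16))        # unrelated patch of ground
```

A larger shift window widens the catchment area at the cost of more comparisons, which is the trade-off the bandpass-filter representation improves on.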
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
5
Freas CA, Plowes NJR, Spetch ML. Traveling through light clutter: path integration and panorama guided navigation in the Sonoran Desert ant, Novomessor cockerelli. Behav Processes 2021; 186:104373. [PMID: 33684462; DOI: 10.1016/j.beproc.2021.104373]
Abstract
Foraging ants use multiple navigational strategies, including path integration and visual panorama cues, which are used simultaneously and weighted based upon context, the environment and the species' sensory ecology. In particular, the amount of visual clutter in the habitat predicts the weighting given to the forager's path integrator and surrounding panorama cues. Here, we characterize the individual cue use and cue weighting of the Sonoran Desert ant, Novomessor cockerelli, by testing foragers after local and distant displacement. Foragers attend to both a path-integration-based vector and the surrounding panorama to navigate, on and off foraging routes. When both cues were present, foragers initially oriented to their path integrator alone, yet weighting was dynamic, with foragers abandoning the vector and switching to panorama-based navigation after a few meters. If displaced to unfamiliar locations, experienced foragers travelled almost their full homeward vector (∼85 %) before the onset of search. Through panorama analysis, we show views acquired on-route provide sufficient information for orientation over only short distances, with rapid parallel decreases in panorama similarity and navigational performance after even small local displacements. These findings are consistent with heavy path integrator weighting over the panorama when the local habitat contains few prominent terrestrial cues.
Affiliation(s)
- Cody A Freas
- Department of Psychology, University of Alberta, Alberta, Canada
- Nicola J R Plowes
- Department of Biology, Mesa Community College, Mesa, AZ, United States
- Marcia L Spetch
- Department of Psychology, University of Alberta, Alberta, Canada
6
Freas CA, Congdon JV, Plowes NJR, Spetch ML. Same but different: socially foraging ants backtrack like individually foraging ants but use different mechanisms. J Insect Physiol 2019; 118:103944. [PMID: 31520596; DOI: 10.1016/j.jinsphys.2019.103944]
Abstract
Diverse species may adopt behaviourally identical solutions to similar environmental challenges. However, the underlying mechanisms dictating these responses may be quite different and are often associated with the specific ecology or habitat of these species. Foraging desert ants use multiple strategies in order to navigate successfully. In individually foraging ants, these strategies are largely visually based; this includes path integration and learned panorama cues, with systematic search and backtracking acting as backup mechanisms. Backtracking is believed to be controlled, at least in solitary foraging species, by three criteria: 1) foragers must have recent exposure to the nest panorama, 2) the path integrator must be near zero, and 3) the ant must be displaced to an unfamiliar location. Under these conditions, instead of searching for the nest, foragers head in the compass direction opposite to the one in which they were recently travelling. Here, we explore backtracking in the socially foraging desert harvester ant (Veromessor pergandei), whose foraging ecology combines social and individual cues in a column and fan structure. We find that backtracking in V. pergandei, similar to solitary foraging species, is dependent on celestial cues, and in particular on the sun's position. However, unlike in solitary foraging species, backtracking in V. pergandei is not mediated by the same criteria. Instead, the expression of this behaviour depends on the presence of the social cues of the column and the proportion of the column that foragers have completed prior to displacement.
Affiliation(s)
- Cody A Freas
- Department of Psychology, University of Alberta, Canada
7
Schulte P, Zeil J, Stürzl W. An insect-inspired model for acquiring views for homing. Biol Cybern 2019; 113:439-451. [PMID: 31076867; DOI: 10.1007/s00422-019-00800-1]
Abstract
Wasps and bees perform learning flights when leaving their nest or food locations for the first time, during which they acquire visual information that enables them to return successfully. Here we present and test a set of simple control rules underlying the execution of learning flights that closely mimic those performed by ground-nesting wasps. In the simplest model, we assume that the angle between the flight direction and the nest direction, as seen from the position of the insect, is constant and only flips sign when the pivoting direction around the nest is changed, resulting in a concatenation of piecewise defined logarithmic spirals. We then added characteristic properties of real learning flights, such as head saccades and the condition that the position of the nest entrance within the visual field is kept nearly constant, to describe the development of a learning flight in a head-centered frame of reference, assuming that the retinal position of the nest is known. We finally implemented a closed-loop simulation of learning flights based on a small set of visual control rules. The visual input for this model consists of rendered views generated from 3D reconstructions of natural wasp nesting sites, and the retinal nest position is controlled by means of simple template-based tracking. We show that naturalistic paths can be generated without knowledge of the absolute distance to the nest or of the flight speed. We demonstrate in addition that nest-tagged views recorded during such simulated learning flights are sufficient for a homing agent to pinpoint the goal, by identifying the nest direction when encountering familiar views. We discuss how the information acquired during learning flights close to the nest can be integrated with long-range homing models.
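The simplest control rule described above can be sketched as a short simulation: hold a constant angle between flight direction and the direction to the nest, and flip its sign periodically. With an angle greater than 90 degrees the path spirals outwards, and each sign flip reverses the pivoting direction, giving concatenated logarithmic-spiral arcs. All parameter values here are illustrative, not fitted to real wasp flights.

```python
import numpy as np

def learning_flight(alpha_deg=100.0, step=0.01, n_steps=600, flip_every=150):
    """Generate a 2-D path that keeps a constant angle alpha between the
    flight direction and the direction to the nest (at the origin),
    flipping the sign of alpha every flip_every steps."""
    p = np.array([0.05, 0.0])   # start close to the nest
    sign = 1.0
    path = [p.copy()]
    for i in range(1, n_steps + 1):
        to_nest = -p / np.linalg.norm(p)        # unit vector toward the nest
        a = sign * np.radians(alpha_deg)
        rot = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)]])
        p = p + step * (rot @ to_nest)          # fly at constant angle to nest
        if i % flip_every == 0:
            sign = -sign                        # change pivoting direction
        path.append(p.copy())
    return np.array(path)

path = learning_flight()
dist = np.linalg.norm(path, axis=1)  # distance from the nest grows steadily
```

Because the angle, not the distance or speed, is controlled, the same rule produces naturalistic backing-away arcs without any explicit knowledge of range, which is the point the abstract makes.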
Affiliation(s)
- Patrick Schulte
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
- Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, Australia
- Wolfgang Stürzl
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
8
Wehner R. The Cataglyphis Mahrèsienne: 50 years of Cataglyphis research at Mahrès. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 205:641-659. [DOI: 10.1007/s00359-019-01333-5]
9
Terrestrial cue learning and retention during the outbound and inbound foraging trip in the desert ant, Cataglyphis velox. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 205:177-189. [DOI: 10.1007/s00359-019-01316-6]
10
Freas CA, Cheng K. Panorama similarity and navigational knowledge in the nocturnal bull ant, Myrmecia midas. J Exp Biol 2019; 222:jeb193201. [DOI: 10.1242/jeb.193201]
Abstract
Nocturnal ants forage and navigate during periods of reduced light, making detection of visual cues difficult, yet they are skilled visual navigators. These foragers retain visual panoramic memories both around the nest and along known routes for later use, be it to return to previously visited food sites or to the nest. Here, we explore the navigational knowledge of the nocturnal bull ant, Myrmecia midas, by investigating differences in nest-ward homing after displacement of three forager groups based on similarities in the panoramas between the release site and previously visited locations. Foragers that travel straight up the foraging tree or to close trees around the nest show reduced navigational success in orienting and returning from displacements compared to individuals that forage further from the nest site. By analysing the cues present in the panorama, we show that multiple metrics of forager navigational performance correspond with the degree of similarity between the release site panorama and panoramas of previously visited sites. In highly cluttered environments, where panoramas change rapidly over short distances, the views acquired near the nest are only useful over a small area and memories acquired along foraging routes become critical.
Affiliation(s)
- Cody A. Freas
- Department of Psychology, University of Alberta, Canada
- Department of Biological Sciences, Macquarie University, Sydney, Australia
- Ken Cheng
- Department of Biological Sciences, Macquarie University, Sydney, Australia
11
12
Shamsyeh Zahedi M, Zeil J. Fractal dimension and the navigational information provided by natural scenes. PLoS One 2018; 13:e0196227. [PMID: 29734381; PMCID: PMC5937794; DOI: 10.1371/journal.pone.0196227]
Abstract
Recent work on virtual reality navigation in humans has suggested that navigational success is inversely correlated with the fractal dimension (FD) of artificial scenes. Here we investigate the generality of this claim by analysing the relationship between the fractal dimension of natural insect navigation environments and a quantitative measure of the navigational information content of natural scenes. We show that the fractal dimension of natural scenes is in general inversely proportional to the information they provide to navigating agents on heading direction, as measured by the rotational image difference function (rotIDF). The rotIDF determines the precision and accuracy with which the orientation of a reference image can be recovered or maintained, and the range over which a gradient descent in image differences will find the minimum of the rotIDF, that is, the reference orientation. However, scenes with similar fractal dimension can differ significantly in the depth of the rotIDF, because FD does not discriminate between the orientations of edges, while the rotIDF is mainly affected by edge orientation parallel to the axis of rotation. We present a new equation for the rotIDF relating navigational information to quantifiable image properties such as contrast, to show (1) that for any given scene the maximum value of the rotIDF (its depth) is proportional to pixel variance and (2) that FD is inversely proportional to pixel variance. This contrast dependence, together with scene differences in orientation statistics, explains why there is no strict relationship between FD and navigational information. Our experimental data and their numerical analysis corroborate these results.
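Fractal dimension of an image, as discussed here, is commonly estimated by box counting: count the occupied boxes of a binary image at several box sizes and fit the slope of log(count) against log(1/size). The following is a generic sketch of that estimator, not necessarily the exact procedure used in the paper.

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting:
    count occupied s x s boxes for several sizes s, then fit the slope of
    log(count) versus log(1/s)."""
    counts = []
    for s in sizes:
        h, w = img.shape
        trimmed = img[: h - h % s, : w - w % s]   # tile exactly into boxes
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# sanity checks: a filled square is 2-D, a straight line is 1-D
square = np.ones((64, 64), dtype=bool)
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
```

Note that, as the abstract argues, this measure is blind to edge orientation: a horizontal and a vertical line have the same FD but contribute very differently to the rotIDF.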
Affiliation(s)
- Moosarreza Shamsyeh Zahedi
- Research School of Biology, Australian National University, Canberra, ACT, Australia
- Department of Mathematics, Payame Noor University, Tehran, Iran
- Jochen Zeil
- Research School of Biology, Australian National University, Canberra, ACT, Australia
13
Müller J, Nawrot M, Menzel R, Landgraf T. A neural network model for familiarity and context learning during honeybee foraging flights. Biol Cybern 2018; 112:113-126. [PMID: 28917001; DOI: 10.1007/s00422-017-0732-z]
Abstract
How complex is the memory structure that honeybees use to navigate? Recently, an insect-inspired parsimonious spiking neural network model was proposed that enabled simulated ground-moving agents to follow learned routes. We adapted this model to flying insects and evaluated its route-following performance in three different worlds with gradually decreasing object density. In addition, we propose an extension that enables the model to associate sensory input with a behavioral context, such as foraging or homing. The spiking neural network model makes use of a sparse stimulus representation in the mushroom body and reward-based synaptic plasticity at its output synapses. In our experiments, simulated bees were able to navigate correctly even when panoramic cues were missing. The context extension we propose enabled agents to successfully discriminate partly overlapping routes. The structure of the visual environment, however, crucially determines the success rate. We find that the model fails more often in visually rich environments due to the overlap of features represented by the Kenyon cell layer. Reducing the landmark density improves the agent's route-following performance. In very sparse environments, we find that extended landmarks, such as roads or field edges, may help the agent stay on its route, but often act as strong distractors, yielding poor route-following performance. We conclude that the presented model is valid for simple route-following tasks and may represent one component of insect navigation. Additional components might still be necessary for guidance and action selection while navigating along different memorised routes in complex natural environments.
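The core circuit this family of models shares, a random projection onto a large, sparsely active Kenyon-cell layer with plasticity at the output synapses, can be caricatured in a few lines. This is a generic rate-based novelty sketch, not the paper's spiking implementation; all dimensions, the winner-take-all sparsification and the learning rule are deliberate simplifications.

```python
import numpy as np

class MushroomBodySketch:
    """Toy familiarity network: visual input is randomly projected onto a
    large Kenyon-cell (KC) layer and sparsified by winner-take-all.
    Learning depresses the output synapses of active KCs, so previously
    seen views drive the output only weakly (low output = familiar)."""

    def __init__(self, n_in=100, n_kc=2000, sparseness=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.proj = rng.random((n_kc, n_in))   # random input-to-KC weights
        self.w_out = np.ones(n_kc)             # plastic output synapses
        self.k = int(sparseness * n_kc)        # number of active KCs

    def kcs(self, view):
        act = self.proj @ view
        active = np.zeros_like(self.w_out)
        active[np.argsort(act)[-self.k:]] = 1.0   # winner-take-all
        return active

    def learn(self, view):
        self.w_out[self.kcs(view) > 0] = 0.0      # depress active synapses

    def novelty(self, view):
        return float(self.w_out @ self.kcs(view))  # low = familiar

mb = MushroomBodySketch()
rng = np.random.default_rng(42)
seen, unseen = rng.random(100), rng.random(100)
mb.learn(seen)
```

The failure mode the abstract describes falls out of this sketch: in feature-rich worlds many views recruit overlapping KC sets, so depression caused by one view bleeds into the novelty scores of others.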
Affiliation(s)
- Jurek Müller
- Institute for Computer Science, Free University Berlin, Berlin, Germany
- Martin Nawrot
- Computational Systems Neuroscience, Institute for Zoology, University of Cologne, Cologne, Germany
- Randolf Menzel
- Institute for Neurobiology, Free University Berlin, Berlin, Germany
- Tim Landgraf
- Institute for Computer Science, Free University Berlin, Berlin, Germany
14
Abstract
In the last decades, desert ants have become model organisms for the study of insect navigation. In finding their way, they use two major navigational routines: path integration using a celestial compass and landmark guidance based on sets of panoramic views of the terrestrial environment. It has been claimed that this information would enable the insect to acquire and use a centralized cognitive map of its foraging terrain. Here, we present a decentralized architecture, in which the concurrently operating path integration and landmark guidance routines contribute optimally to the directions to be steered, with "optimal" meaning maximizing the certainty (reliability) of the combined information. At any one time during its journey, the animal computes a path integration (global) vector and landmark guidance (local) vector, in which the length of each vector is proportional to the certainty of the individual estimates. Hence, these vectors represent the limited knowledge that the navigator has at any one place about the direction of the goal. The sum of the global and local vectors indicates the navigator's optimal directional estimate. Wherever applied, this decentralized model architecture is sufficient to simulate the results of quite a number of diverse cue-conflict experiments, which have recently been performed in various behavioral contexts by different authors in both desert ants and honeybees. They include even those experiments that have deliberately been designed by former authors to strengthen the evidence for a metric cognitive map in bees.
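The certainty-weighted vector summation described above is easy to make concrete: represent each guidance routine's directional estimate as a vector whose length is proportional to its certainty, and steer along their sum. The bearing convention and certainty units below are illustrative choices, not taken from the paper.

```python
import numpy as np

def combined_heading(pi_bearing, pi_certainty, lm_bearing, lm_certainty):
    """Decentralised cue combination: the path-integration (global) and
    landmark-guidance (local) estimates are vectors whose lengths encode
    certainty; their sum gives the direction to steer (degrees)."""
    def vec(bearing_deg, length):
        a = np.radians(bearing_deg)
        return length * np.array([np.cos(a), np.sin(a)])
    total = vec(pi_bearing, pi_certainty) + vec(lm_bearing, lm_certainty)
    return np.degrees(np.arctan2(total[1], total[0]))
```

With equal certainties a 90-degree cue conflict is split down the middle; as one cue's certainty grows (e.g. the home vector far from the nest), the compromise shifts toward it, reproducing the graded outcomes of cue-conflict experiments without any centralised map.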
Affiliation(s)
- Thierry Hoinville
- Biological Cybernetics Department, Bielefeld University, 33615 Bielefeld, Germany
- Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Rüdiger Wehner
- Brain Research Institute, University of Zürich, 8057 Zürich, Switzerland
15
Abstract
Navigation in cluttered environments is an important challenge for animals and robots alike and has been the subject of many studies trying to explain and mimic animal navigational abilities. However, the question of selecting an appropriate home location has so far received little attention. This is surprising, since the choice of a home location might greatly influence an animal's navigation performance. To address the question of home choice in cluttered environments, a systematic analysis of homing trajectories was performed by computer simulations using a skyline-based local homing method. Our analysis reveals that homing performance strongly depends on the location of the home in the environment. Furthermore, it appears that by assessing homing success in the immediate vicinity of the home, an animal might be able to predict its overall success in returning to it from within a much larger area.
Affiliation(s)
- Martin M. Müller
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Olivier J. N. Bertrand
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Dario Differt
- Computer Engineering Group, Faculty of Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
16
Abstract
Navigation is an essential skill for many animals, and understanding how animals use environmental information, particularly visual information, to navigate has a long history in both ethology and psychology. In birds, the dominant approach for investigating navigation at small scales comes from comparative psychology, which emphasizes the cognitive representations underpinning spatial memory. The majority of this work is based in the laboratory, and it is unclear whether this context itself affects the information that birds learn and use when they search for a location. Data from hummingbirds suggest that birds in the wild might use visual information in quite a different manner. To reconcile these differences, here we propose a new approach to avian navigation, inspired by the sensory-driven study of navigation in insects. Using methods devised for studying the navigation of insects, it is possible to quantify the visual information available to navigating birds, and then to determine how this information influences those birds' navigation decisions. Focusing on four areas that we consider characteristic of the insect navigation perspective, we discuss how this approach has shed light on the information insects use to navigate, and assess the prospects of taking a similar approach with birds. Although birds and insects differ in many ways, there is nothing in the insect-inspired approach of the kind we describe that means these methods need be restricted to insects. On the contrary, adopting such an approach could provide a fresh perspective on the well-studied question of how birds navigate through a variety of environments.
Affiliation(s)
- Susan D Healy
- School of Biology, University of St Andrews, Fife, UK
17
Freas CA, Wystrach A, Narendra A, Cheng K. The View from the Trees: Nocturnal Bull Ants, Myrmecia midas, Use the Surrounding Panorama While Descending from Trees. Front Psychol 2018; 9:16. [PMID: 29422880; PMCID: PMC5788958; DOI: 10.3389/fpsyg.2018.00016]
Abstract
Solitary foraging ants commonly use visual cues from their environment for navigation. Foragers are known to store visual scenes from the surrounding panorama for later guidance to known resources and to return successfully back to the nest. Several ant species travel not only on the ground, but also climb trees to locate resources. The navigational information that guides animals back home during their descent, while their body is perpendicular to the ground, is largely unknown. Here, we investigate in a nocturnal ant, Myrmecia midas, whether foragers travelling down a tree use visual information to return home. These ants establish nests at the base of a tree on which they forage and, in addition, they also forage on nearby trees. We collected foragers and placed them on the trunk of the nest tree or a foraging tree in multiple compass directions. Regardless of the displacement location, upon release ants immediately moved to the side of the trunk facing the nest during their descent. When ants were released on non-foraging trees near the nest, displaced foragers again travelled around the tree to the side facing the nest. All the displaced foragers reached the correct side of the tree well before reaching the ground. However, when the terrestrial cues around the tree were blocked, foragers were unable to orient correctly, suggesting that the surrounding panorama is critical to successful orientation on the tree. Through analysis of panoramic pictures, we show that views acquired at the base of the foraging tree can provide reliable nest-ward orientation up to 1.75 m above the ground. We discuss how animals descending from trees compare their current scene to a memorised scene, and report on the similarities in visually guided behaviour while navigating on the ground and descending from trees.
Affiliation(s)
- Cody A. Freas
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
- Antoine Wystrach
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Ajay Narendra
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
- Ken Cheng
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
18
Lobecke A, Kern R, Egelhaaf M. Taking a goal-centred dynamic snapshot as a possibility for local homing in initially naïve bumblebees. J Exp Biol 2018; 221:jeb168674. [PMID: 29150448; DOI: 10.1242/jeb.168674]
Abstract
It is essential for central place foragers, such as bumblebees, to return reliably to their nest. Bumblebees, leaving their inconspicuous nest hole for the first time need to gather and learn sufficient information about their surroundings to allow them to return to their nest at the end of their trip, instead of just flying away to forage. Therefore, we assume an intrinsic learning programme that manifests itself in the flight structure immediately after leaving the nest for the first time. In this study, we recorded and analysed the first outbound flight of individually marked naïve bumblebees in an indoor environment. We found characteristic loop-like features in the flight pattern that appear to be necessary for the bees to acquire environmental information and might be relevant for finding the nest hole after a foraging trip. Despite common features in their spatio-temporal organisation, first departure flights from the nest are characterised by a high level of variability in their loop-like flight structure across animals. Changes in turn direction of body orientation, for example, are distributed evenly across the entire area used for the flights without any systematic relationship to the nest location. By considering the common flight motifs and this variability, we came to the hypothesis that a kind of dynamic snapshot is taken during the early phase of departure flights centred at the nest location. The quality of this snapshot is hypothesised to be 'tested' during the later phases of the departure flights concerning its usefulness for local homing.
Affiliation(s)
- Anne Lobecke, Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Roland Kern, Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Martin Egelhaaf, Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
|
19
|
Jayatilaka P, Murray T, Narendra A, Zeil J. The choreography of learning walks in the Australian jack jumper ant Myrmecia croslandi. J Exp Biol 2018; 221:jeb.185306. [DOI: 10.1242/jeb.185306] [Citation(s) in RCA: 31] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2018] [Accepted: 08/12/2018] [Indexed: 11/20/2022]
Abstract
We provide a detailed analysis of the learning walks performed by Myrmecia croslandi ants at the nest, during which they acquire visual information on its location. Most learning walks of 12 individually marked naïve ants took place in the morning, with a narrow time window separating the first two learning walks, which most often occurred on the same day. Naïve ants performed between 2 and 7 walks over up to 4 consecutive days before heading out to forage. On subsequent walks, naïve ants tend to explore the area around the nest in new compass directions. During learning walks, ants move along arcs around the nest while performing oscillating scanning movements. In a regular temporal sequence, the ants' gaze oscillates between the nest direction and the direction pointing away from the nest. Ants thus experience a sequence of views roughly across the nest and away from the nest from systematically spaced vantage points around the nest. We show further that ants leaving the nest for a foraging trip often walk in an arc around the nest on the side opposite to the intended foraging direction, performing a scanning routine indistinguishable from that of a learning walk. These partial learning walks are triggered by disturbance around the nest and may help returning ants reorient when they overshoot the nest, which they frequently do. We discuss what is known about learning walks in different ant species and their adaptive significance for acquiring robust navigational memories.
Affiliation(s)
- Piyankarie Jayatilaka, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra ACT 2601, Australia
- Trevor Murray, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra ACT 2601, Australia
- Ajay Narendra, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra ACT 2601, Australia; present address: Department of Biological Sciences, Macquarie University, 205 Culloden Road, Sydney, NSW 2109, Australia
- Jochen Zeil, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra ACT 2601, Australia
|
20
|
Narendra A, Ramirez-Esquivel F. Subtle changes in the landmark panorama disrupt visual navigation in a nocturnal bull ant. Philos Trans R Soc Lond B Biol Sci 2017; 372:rstb.2016.0068. [PMID: 28193813 DOI: 10.1098/rstb.2016.0068] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/29/2016] [Indexed: 11/12/2022] Open
Abstract
The ability of ants to navigate when visual landmark information is altered has often been tested by creating large, artificial discrepancies in their visual environment. Here, we had an opportunity to slightly modify the natural visual environment around the nest of the nocturnal bull ant Myrmecia pyriformis. We achieved this by felling three dead trees, two located along the typical route followed by the foragers of that particular nest and one in a direction perpendicular to their foraging direction. An image difference analysis showed that the change in the overall panorama following the removal of these trees was relatively small. We filmed the behaviour of ants close to the nest and tracked their entire paths, both before and after the trees were removed. We found that immediately after the trees were removed, ants walked more slowly and were less directed. Their foraging success decreased and they looked around more, including turning back to look towards the nest. We document how their behaviour changed over subsequent nights and discuss how the ants may detect and respond to a modified visual environment in the evening twilight period. This article is part of the themed issue 'Vision in dim light'.
Affiliation(s)
- Ajay Narendra, Department of Biological Sciences, Macquarie University, 205 Culloden Road, Sydney, New South Wales 2109, Australia
- Fiorella Ramirez-Esquivel, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia
|
21
|
Murray T, Zeil J. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes. PLoS One 2017; 12:e0187226. [PMID: 29088300 PMCID: PMC5663442 DOI: 10.1371/journal.pone.0187226] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2017] [Accepted: 10/16/2017] [Indexed: 11/18/2022] Open
Abstract
Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
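The image difference function underlying this kind of analysis can be sketched in a few lines. The following toy numpy example (invented landmark positions, a 2D ground-plane grid rather than the paper's rendered 3D volumes) shows how a catchment region emerges as the zone where the difference to a reference panorama stays below a sensitivity threshold:

```python
import numpy as np

def panorama(x, y, landmarks, n_az=90):
    """Toy binary panorama: which 4-degree azimuth bins are filled by a landmark."""
    view = np.zeros(n_az)
    bin_deg = 360 / n_az
    for lx, ly, r in landmarks:
        d = np.hypot(lx - x, ly - y)
        az = int((np.degrees(np.arctan2(ly - y, lx - x)) % 360) / bin_deg)
        half = int(np.degrees(np.arctan2(r, d)) / bin_deg)
        for k in range(-half, half + 1):
            view[(az + k) % n_az] = 1.0
    return view

# Invented landmarks (x, y, radius) around a reference location at the origin.
landmarks = [(5.0, 0.0, 0.5), (0.0, 6.0, 0.8), (-4.0, -3.0, 0.6)]
ref = panorama(0.0, 0.0, landmarks)

# Root-mean-square image difference to the reference view on a grid of positions.
xs = np.linspace(-3, 3, 25)
idf = np.array([[np.sqrt(np.mean((panorama(x, y, landmarks) - ref) ** 2))
                 for x in xs] for y in xs])

# The catchment region is where the difference stays below the viewer's sensitivity.
catchment = idf < 0.2
```

The difference is zero at the reference location and grows with displacement; sweeping the same comparison over a 3D grid of rendered views is what turns a catchment area into a catchment volume.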
Affiliation(s)
- Trevor Murray, Research School of Biology, Australian National University, Canberra, Australia
- Jochen Zeil, Research School of Biology, Australian National University, Canberra, Australia
|
22
|
Lee C, Yu SE, Kim D. Landmark-Based Homing Navigation Using Omnidirectional Depth Information. SENSORS (BASEL, SWITZERLAND) 2017; 17:E1928. [PMID: 28829387 PMCID: PMC5580246 DOI: 10.3390/s17081928] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/22/2017] [Revised: 08/16/2017] [Accepted: 08/18/2017] [Indexed: 11/16/2022]
Abstract
A number of landmark-based navigation algorithms have been studied using feature extraction from visual information. In this paper, we apply distance information about the surrounding environment in a landmark navigation model. We mount a depth sensor on a mobile robot in order to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. The depth snapshots at the current position and the target position are compared to determine the homing direction, inspired by the snapshot model. Here, we suggest a holistic view of panoramic depth information for homing navigation, where each sample point is taken as a landmark. The results are shown in a vector map of homing vectors. The performance of the suggested method is evaluated based on the angular errors and the homing success rate. Omnidirectional depth information about the surrounding environment can be a promising source for landmark homing navigation. We demonstrate that a holistic approach with omnidirectional depth information achieves effective homing navigation.
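The core comparison of depth snapshots can be illustrated with a minimal sketch. This toy version (a circular room, depth sampled at 72 bearings, a homing vector formed by summing depth differences along each bearing) is a simplification for illustration, not the authors' implementation:

```python
import numpy as np

def depth_snapshot(p, radius=5.0, n=72):
    """Omnidirectional depth profile for a point p inside a circular room."""
    angles = np.linspace(0, 2 * np.pi, n, endpoint=False)
    u = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    pu = u @ p
    # Distance to the circular wall along each viewing direction.
    depth = -pu + np.sqrt(pu ** 2 + radius ** 2 - p @ p)
    return angles, depth

def homing_vector(p_current, snapshot_depth, radius=5.0, n=72):
    """Sum, over bearings, of (current depth - snapshot depth) along that bearing."""
    angles, d_cur = depth_snapshot(p_current, radius, n)
    u = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    return ((d_cur - snapshot_depth)[:, None] * u).sum(axis=0)

# Depth snapshot taken at the goal (the room centre); agent displaced to (1, 0).
_, d_goal = depth_snapshot(np.array([0.0, 0.0]))
hv = homing_vector(np.array([1.0, 0.0]), d_goal)
# hv points back toward the goal, i.e. in the negative x direction.
```

Along the displacement direction the wall is closer than remembered (negative contribution), behind the agent it is farther (again a homeward contribution), so the summed vector points toward the snapshot position.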
Affiliation(s)
- Changmin Lee, School of Electrical and Electronic Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, Korea
- Seung-Eun Yu, School of Electrical and Electronic Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, Korea
- DaeEun Kim, School of Electrical and Electronic Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, Korea
|
23
|
Towne WF, Ritrovato AE, Esposto A, Brown DF. Honeybees use the skyline in orientation. ACTA ACUST UNITED AC 2017; 220:2476-2485. [PMID: 28450409 DOI: 10.1242/jeb.160002] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2017] [Accepted: 04/23/2017] [Indexed: 11/20/2022]
Abstract
In view-based navigation, animals acquire views of the landscape from various locations and then compare the learned views with current views in order to orient in certain directions or move toward certain destinations. One landscape feature of great potential usefulness in view-based navigation is the skyline, the silhouette of terrestrial objects against the sky, as it is distant, relatively stable and easy to detect. The skyline has been shown to be important in the view-based navigation of ants, but no flying insect has yet been shown definitively to use the skyline in this way. Here, we show that honeybees do indeed orient using the skyline. A feeder was surrounded with an artificial replica of the natural skyline there, and the bees' departures toward the nest were recorded from above with a video camera under overcast skies (to eliminate celestial cues). When the artificial skyline was rotated, the bees' departures were rotated correspondingly, showing that the bees oriented by the artificial skyline alone. We discuss these findings in the context of the likely importance of the skyline in long-range homing in bees, the likely importance of altitude in using the skyline, the likely role of ultraviolet light in detecting the skyline, and what we know about the bees' ability to resolve skyline features.
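The rotational matching implied by the experiment can be sketched as follows: represent the skyline as silhouette height per azimuth bin and find the circular shift of the stored skyline that best matches the current view. Bin width and skyline heights here are invented for illustration:

```python
import numpy as np

def best_alignment(stored, current):
    """Circular shift of the stored skyline that best matches the current view."""
    n = len(stored)
    errors = [np.sum((np.roll(stored, s) - current) ** 2) for s in range(n)]
    return int(np.argmin(errors))

# Skyline height (elevation of the silhouette) in 15-degree azimuth bins.
stored = np.array([10., 10., 25., 40., 40., 25., 10., 5., 5., 5., 15., 30.,
                   30., 15., 5., 5., 10., 20., 20., 10., 5., 5., 5., 5.])

# Rotating the artificial skyline rotates the best-matching alignment with it,
# which is how rotated departures reveal that the bees use the skyline.
current = np.roll(stored, 7)
shift = best_alignment(stored, current)   # -> 7
```

Because the stored pattern is not periodic, the minimum of the rotational difference function is unique, so the recovered shift tracks the experimental rotation of the skyline replica.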
Affiliation(s)
- William F Towne, Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA
- Antonina Esposto, Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA
- Duncan F Brown, Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA
|
24
|
|
25
|
Abstract
Despite their tiny eyes and brains, nocturnal insects have evolved a remarkable capacity to visually navigate at night. Whereas some use moonlight or the stars as celestial compass cues to maintain a straight-line course, others use visual landmarks to navigate to and from their nest. These impressive abilities rely on highly sensitive compound eyes and specialized visual processing strategies in the brain.
Affiliation(s)
- Eric Warrant, Department of Biology, Lund Vision Group, University of Lund, Lund, Sweden
- Marie Dacke, Department of Biology, Lund Vision Group, University of Lund, Lund, Sweden
|
26
|
Vanderelst D, Steckel J, Boen A, Peremans H, Holderied MW. Place recognition using batlike sonar. eLife 2016; 5:e14188. [PMID: 27481189 PMCID: PMC4970868 DOI: 10.7554/elife.14188] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2016] [Accepted: 06/20/2016] [Indexed: 11/28/2022] Open
Abstract
Echolocating bats have excellent spatial memory and are able to navigate to salient locations using bio-sonar. Navigating and route-following require animals to recognize places. Currently, it is mostly unknown how bats recognize places using echolocation. In this paper, we propose that template-based place recognition might underlie sonar-based navigation in bats. Under this hypothesis, bats recognize places by remembering their echo signature, rather than their 3D layout. Using a large body of ensonification data collected in three different habitats, we test the viability of this hypothesis by assessing two critical properties of the proposed echo signatures: (1) they can be uniquely classified and (2) they vary continuously across space. Based on the results presented, we conclude that the proposed echo signatures satisfy both criteria. We discuss how these two properties of the echo signatures can support navigation and the building of a cognitive map.
Affiliation(s)
- Dieter Vanderelst, School of Biological Sciences, University of Bristol, Bristol, United Kingdom; Active Perception Lab, University of Antwerp, Antwerp, Belgium
- Jan Steckel, Active Perception Lab, University of Antwerp, Antwerp, Belgium; Constrained Systems Lab, Faculty of Applied Engineering, University of Antwerp, Antwerp, Belgium
- Andre Boen, Active Perception Lab, University of Antwerp, Antwerp, Belgium
- Marc W Holderied, School of Biological Sciences, University of Bristol, Bristol, United Kingdom
|
27
|
Buehlmann C, Woodgate JL, Collett TS. On the Encoding of Panoramic Visual Scenes in Navigating Wood Ants. Curr Biol 2016; 26:2022-2027. [PMID: 27476601 DOI: 10.1016/j.cub.2016.06.005] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/23/2016] [Revised: 05/12/2016] [Accepted: 06/01/2016] [Indexed: 11/18/2022]
Abstract
A natural visual panorama is a complex stimulus formed of many component shapes. It gives an animal a sense of place and supplies guiding signals for controlling the animal's direction of travel [1]. Insects with their economical neural processing [2] are good subjects for analyzing the encoding and memory of such scenes [3-5]. Honeybees [6] and ants [7, 8] foraging from their nest can follow habitual routes guided only by visual cues within a natural panorama. Here, we analyze the headings that ants adopt when a familiar panorama composed of two or three shapes is manipulated by removing a shape or by replacing training shapes with unfamiliar ones. We show that (1) ants recognize a component shape not only through its particular visual features, but also by its spatial relation to other shapes in the scene, and that (2) each segmented shape [9] contributes its own directional signal to generating the ant's chosen heading. We found earlier that ants trained to a feeder placed to one side of a single shape [10] and tested with shapes of different widths learn the retinal position of the training shape's center of mass (CoM) [11, 12] when heading toward the feeder. They then guide themselves by placing the shape's CoM in the remembered retinal position [10]. This use of CoM in a one-shape panorama combined with the results here suggests that the ants' memory of a multi-shape panorama comprises the retinal positions of the horizontal CoMs of each major component shape within the scene, bolstered by local descriptors of that shape.
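The centre-of-mass guidance described here can be sketched compactly: compute the horizontal CoM of a segmented shape on the retina and turn so that it sits at the remembered retinal position. The retina size, pixel scale, and remembered position below are invented for illustration:

```python
import numpy as np

def horizontal_com(shape_mask, az_per_px=0.5):
    """Horizontal centre of mass (CoM) of a binary shape, in degrees of azimuth."""
    cols = shape_mask.sum(axis=0)
    return float((np.arange(len(cols)) * cols).sum() / cols.sum()) * az_per_px

# Binary silhouette of one panorama shape on a 40 x 120 retinal patch (0.5 deg/px).
shape = np.zeros((40, 120))
shape[10:30, 50:70] = 1           # rectangle whose CoM column is 59.5 px

remembered_com = 25.0             # azimuth (deg) of this shape's CoM when on course
current_com = horizontal_com(shape)
turn = current_com - remembered_com  # rotate by this much to restore the stored view
```

In a multi-shape panorama, each segmented shape would contribute such a directional estimate, and the chosen heading would combine them.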
Affiliation(s)
- Cornelia Buehlmann, School of Life Sciences, University of Sussex, John Maynard Smith Building, Brighton BN1 9QG, UK
- Joseph L Woodgate, School of Life Sciences, University of Sussex, John Maynard Smith Building, Brighton BN1 9QG, UK
- Thomas S Collett, School of Life Sciences, University of Sussex, John Maynard Smith Building, Brighton BN1 9QG, UK
|
28
|
Autonomous Visual Navigation of an Indoor Environment Using a Parsimonious, Insect Inspired Familiarity Algorithm. PLoS One 2016; 11:e0153706. [PMID: 27119720 PMCID: PMC4847926 DOI: 10.1371/journal.pone.0153706] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2015] [Accepted: 04/03/2016] [Indexed: 11/19/2022] Open
Abstract
The navigation of bees and ants from hive to food and back has captivated people for more than a century. Recently, the Navigation by Scene Familiarity Hypothesis (NSFH) has been proposed as a parsimonious approach that is congruent with the limited neural elements of these insects' brains. In the NSFH approach, an agent completes an initial training excursion, storing images along the way. To retrace the path, the agent scans the area and compares the current scenes to those previously experienced. By turning and moving to minimize the pixel-by-pixel differences between encountered and stored scenes, the agent is guided along the path without having memorized the sequence. An important premise of the NSFH is that the visual information of the environment is adequate to guide navigation without aliasing. Here we demonstrate that an image landscape of an indoor setting possesses ample navigational information. We produced a visual landscape of our laboratory and part of the adjoining corridor consisting of 2816 panoramic snapshots arranged in a grid at 12.7-cm centers. We show that pixel-by-pixel comparisons of these images yield robust translational and rotational visual information. We also produced a simple algorithm that tracks previously experienced routes within our lab based on an insect-inspired scene familiarity approach and demonstrate that adequate visual information exists for an agent to retrace complex training routes, including those where the path's end is not visible from its origin. We used this landscape to systematically test the interplay of sensor morphology, angles of inspection, and similarity threshold with the recapitulation performance of the agent. Finally, we compared the relative information content and chance of aliasing within our visually rich laboratory landscape to scenes acquired from indoor corridors with more repetitive scenery.
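The familiarity comparison at the heart of the NSFH reduces to a few lines: score each candidate heading by the minimum pixel-by-pixel difference between its view and the stored route views, and move in the most familiar direction. The following numpy sketch uses synthetic random "views" purely for illustration:

```python
import numpy as np

def familiarity(view, memory_bank):
    """Minimum pixel-by-pixel squared difference to any stored view."""
    return min(np.sum((view - m) ** 2) for m in memory_bank)

def best_heading(scans, memory_bank):
    """Index of the scanned view that is most familiar (lowest difference)."""
    scores = [familiarity(v, memory_bank) for v in scans]
    return int(np.argmin(scores))

rng = np.random.default_rng(0)
# Views stored during the training excursion (8 x 30 grayscale snapshots).
memory_bank = [rng.random((8, 30)) for _ in range(5)]

# At a test point the agent scans candidate headings; one view nearly matches
# a stored snapshot (small added noise), the rest are unfamiliar scenes.
scans = [rng.random((8, 30)) for _ in range(7)]
scans[3] = memory_bank[2] + 0.01 * rng.random((8, 30))

heading = best_heading(scans, memory_bank)   # -> 3
```

Note that no sequence of snapshots needs to be recalled: the agent only asks, at each scan, "have I seen something like this before?".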
|
29
|
Raderschall CA, Narendra A, Zeil J. Head roll stabilisation in the nocturnal bull ant Myrmecia pyriformis: implications for visual navigation. ACTA ACUST UNITED AC 2016; 219:1449-57. [PMID: 26994172 DOI: 10.1242/jeb.134049] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2015] [Accepted: 02/24/2016] [Indexed: 10/22/2022]
Abstract
Ant foragers are known to memorise visual scenes that allow them to repeatedly travel along idiosyncratic routes and to return to specific places. Guidance is provided by a comparison between visual memories and current views, which critically depends on how well the attitude of the visual system is controlled. Here we show that nocturnal bull ants stabilise their head to varying degrees against locomotion-induced body roll movements, and that this ability decreases as light levels fall. There are always uncompensated head roll oscillations that match the frequency of the stride cycle. Head roll stabilisation involves both visual and non-visual cues, as ants compensate for body roll in complete darkness and also respond with head roll movements when confronted with visual pattern oscillations. We show that imperfect head roll control degrades navigation-relevant visual information and discuss ways in which navigating ants may deal with this problem.
Affiliation(s)
- Chloé A Raderschall, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia
- Ajay Narendra, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia; Department of Biological Sciences, Macquarie University, W19F, 205 Culloden Road, Sydney, New South Wales 2109, Australia
- Jochen Zeil, Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia
|
30
|
Ardin P, Peng F, Mangan M, Lagogiannis K, Webb B. Using an Insect Mushroom Body Circuit to Encode Route Memory in Complex Natural Environments. PLoS Comput Biol 2016; 12:e1004683. [PMID: 26866692 PMCID: PMC4750948 DOI: 10.1371/journal.pcbi.1004683] [Citation(s) in RCA: 93] [Impact Index Per Article: 11.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2015] [Accepted: 11/30/2015] [Indexed: 11/30/2022] Open
Abstract
Ants, like many other animals, use visual memory to follow extended routes through complex environments, but it is unknown how their small brains implement this capability. The mushroom body neuropils have been identified as a crucial memory circuit in the insect brain, but their function has mostly been explored for simple olfactory association tasks. We show that a spiking neural model of this circuit, originally developed to describe fruit fly (Drosophila melanogaster) olfactory association, can also account for the ability of desert ants (Cataglyphis velox) to rapidly learn visual routes through complex natural environments. We further demonstrate that abstracting the key computational principles of this circuit, which include one-shot learning of sparse codes, enables the theoretical storage capacity of the ant mushroom body to be estimated at hundreds of independent images.
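The "one-shot learning of sparse codes" principle can be sketched in the spirit of this model (not the spiking implementation itself; all sizes and rates below are invented): a fixed random projection expands the view into a sparse Kenyon-cell code, and a single presentation depresses the output weights of the active cells, so familiar views later evoke a weak novelty signal.

```python
import numpy as np

rng = np.random.default_rng(42)
N_IN, N_KC, SPARSITY = 300, 5000, 0.05   # input pixels, Kenyon cells, active fraction

# Fixed, sparse random projection from visual input to Kenyon cells (KCs).
W_in = (rng.random((N_KC, N_IN)) < 0.02).astype(float)
w_out = np.ones(N_KC)                     # KC -> output weights, all initially 1

def kc_code(view):
    """Sparse KC code: only the top 5% most strongly activated cells fire."""
    act = W_in @ view
    thresh = np.quantile(act, 1 - SPARSITY)
    return (act >= thresh).astype(float)

def learn(view):
    """One-shot anti-Hebbian learning: depress output weights of active KCs."""
    w_out[kc_code(view) > 0] = 0.0

def novelty(view):
    """Summed output: high for unfamiliar views, low for learned ones."""
    return float(w_out @ kc_code(view))

route_views = [rng.random(N_IN) for _ in range(10)]
for v in route_views:
    learn(v)

# Views from the learned route now evoke a weaker novelty signal than a new view.
novel_view = rng.random(N_IN)
```

Because each view recruits only a small, largely non-overlapping subset of KCs, many images can be stored before their codes start to collide, which is what makes the large theoretical capacity estimate possible.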
Affiliation(s)
- Paul Ardin, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Fei Peng, Biological and Experimental Psychology, School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
- Michael Mangan, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Barbara Webb, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
|
31
|
How Wasps Acquire and Use Views for Homing. Curr Biol 2016; 26:470-82. [DOI: 10.1016/j.cub.2015.12.052] [Citation(s) in RCA: 69] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2015] [Revised: 11/20/2015] [Accepted: 12/18/2015] [Indexed: 11/21/2022]
|
32
|
How do field of view and resolution affect the information content of panoramic scenes for visual navigation? A computational investigation. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015; 202:87-95. [PMID: 26582183 PMCID: PMC4722065 DOI: 10.1007/s00359-015-1052-1] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2014] [Revised: 10/29/2015] [Accepted: 10/30/2015] [Indexed: 10/26/2022]
Abstract
The visual systems of animals have to provide information to guide behaviour and the informational requirements of an animal's behavioural repertoire are often reflected in its sensory system. For insects, this is often evident in the optical array of the compound eye. One behaviour that insects share with many animals is the use of learnt visual information for navigation. As ants are expert visual navigators it may be that their vision is optimised for navigation. Here we take a computational approach in asking how the details of the optical array influence the informational content of scenes used in simple view matching strategies for orientation. We find that robust orientation is best achieved with low-resolution visual information and a large field of view, similar to the optical properties seen for many ant species. A lower resolution allows for a trade-off between specificity and generalisation for stored views. Additionally, our simulations show that orientation performance increases if different portions of the visual field are considered as discrete visual sensors, each giving an independent directional estimate. This suggests that ants might benefit by processing information from their two eyes independently.
|
33
|
A novel robot visual homing method based on SIFT features. SENSORS 2015; 15:26063-84. [PMID: 26473880 PMCID: PMC4634498 DOI: 10.3390/s151026063] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/02/2015] [Revised: 09/30/2015] [Accepted: 10/09/2015] [Indexed: 11/18/2022]
Abstract
Warping is an effective visual homing method for robot local navigation. However, the performance of the warping method can be greatly influenced by the changes of the environment in a real scene, thus resulting in lower accuracy. In order to solve the above problem and to get higher homing precision, a novel robot visual homing algorithm is proposed by combining SIFT (scale-invariant feature transform) features with the warping method. The algorithm is novel in using SIFT features as landmarks instead of the pixels in the horizon region of the panoramic image. In addition, to further improve the matching accuracy of landmarks in the homing algorithm, a novel mismatching elimination algorithm, based on the distribution characteristics of landmarks in the catadioptric panoramic image, is proposed. Experiments on image databases and on a real scene confirm the effectiveness of the proposed method.
|
34
|
Strübbe S, Stürzl W, Egelhaaf M. Insect-Inspired Self-Motion Estimation with Dense Flow Fields--An Adaptive Matched Filter Approach. PLoS One 2015; 10:e0128413. [PMID: 26308839 PMCID: PMC4550262 DOI: 10.1371/journal.pone.0128413] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2014] [Accepted: 04/28/2015] [Indexed: 11/18/2022] Open
Abstract
The control of self-motion is a basic, but complex task for both technical and biological systems. Various algorithms have been proposed that allow the estimation of self-motion from the optic flow on the eyes. We show that two apparently very different approaches to solve this task, one technically and one biologically inspired, can be transformed into each other under certain conditions. One estimator of self-motion is based on a matched filter approach; it has been developed to describe the function of motion sensitive cells in the fly brain. The other estimator, the Koenderink and van Doorn (KvD) algorithm, was derived analytically with a technical background. If the distances to the objects in the environment can be assumed to be known, the two estimators are linear and equivalent, but are expressed in different mathematical forms. However, for most situations it is unrealistic to assume that the distances are known. Therefore, the depth structure of the environment needs to be determined in parallel to the self-motion parameters and leads to a non-linear problem. It is shown that the standard least mean square approach that is used by the KvD algorithm leads to a biased estimator. We derive a modification of this algorithm in order to remove the bias and demonstrate its improved performance by means of numerical simulations. For self-motion estimation it is beneficial to have a spherical visual field, similar to many flying insects. We show that in this case the representation of the depth structure of the environment derived from the optic flow can be simplified. Based on this result, we develop an adaptive matched filter approach for systems with a nearly spherical visual field. Then only eight parameters about the environment have to be memorized and updated during self-motion.
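The linear, known-distance case described here can be illustrated with a small least-squares sketch. On a unit viewing sphere, the flow at direction d_i with nearness mu_i = 1/distance is p_i = -mu_i (t - (t·d_i) d_i) - omega × d_i, which is linear in the six self-motion parameters (t, omega). This is a generic worked example of that linear estimator, not the paper's adaptive matched-filter algorithm; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def flow(d, mu, t, omega):
    """Optic flow on the unit sphere: translational + rotational components."""
    trans = -mu[:, None] * (t - (d @ t)[:, None] * d)
    rot = -np.cross(omega, d)
    return trans + rot

# Spherical visual field: random viewing directions with known nearness mu.
n = 500
d = rng.normal(size=(n, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
mu = rng.uniform(0.1, 1.0, n)

t_true = np.array([0.3, -0.1, 0.2])
w_true = np.array([0.05, 0.2, -0.1])
p = flow(d, mu, t_true, w_true)

def skew(v):
    """Cross-product matrix: skew(v) @ u == v x u."""
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

# With distances known the problem is linear: stack one 3-row block per
# viewing direction and solve least squares for (t, omega).
rows = []
for i in range(n):
    P = np.eye(3) - np.outer(d[i], d[i])      # projection orthogonal to d_i
    rows.append(np.hstack([-mu[i] * P, skew(d[i])]))
A = np.vstack(rows)
x, *_ = np.linalg.lstsq(A, p.reshape(-1), rcond=None)
t_est, w_est = x[:3], x[3:]
```

With unknown distances the nearness values join the unknowns and the problem becomes non-linear, which is where the bias discussed in the abstract and the adaptive matched-filter machinery come in.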
Affiliation(s)
- Simon Strübbe, Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Wolfgang Stürzl, Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
- Martin Egelhaaf, Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
|
35
|
Dewar ADM, Wystrach A, Graham P, Philippides A. Navigation-specific neural coding in the visual system of Drosophila. Biosystems 2015; 136:120-7. [PMID: 26310914 DOI: 10.1016/j.biosystems.2015.07.008] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2015] [Revised: 07/21/2015] [Accepted: 07/26/2015] [Indexed: 11/15/2022]
Abstract
Drosophila melanogaster is a good system in which to understand the minimal requirements for widespread visually guided behaviours such as navigation, due to its small brain (adults possess only 100,000 neurons) and the availability of neurogenetic techniques which allow the identification of task-specific cell types. Recently published data describe the receptive fields of two classes of visually responsive neurons (R2 and R3/R4d ring neurons in the central complex) that are essential for visual tasks such as orientation memory for salient objects and simple pattern discriminations. What is interesting is that these cells have very large receptive fields and are very small in number, suggesting that each sub-population of cells might be a bottleneck in the processing of visual information for a specific behaviour, as each subset of cells effectively condenses information from approximately 3000 visual receptors in the eye to fewer than 50 neurons in total. It has recently been shown that R1 ring neurons, which receive input from the same areas as the R2 and R3/R4d cells, are necessary for place learning in Drosophila. However, how R1 neurons enable place learning is unknown. By examining the information provided by different populations of hypothetical visual neurons in simulations of experimental arenas, we show that neurons with ring-neuron-like receptive fields are sufficient for defining a location visually. In this way we provide a link between the type of information conveyed by ring neurons and the behaviour they support.
Affiliation(s)
- Alex D M Dewar
- School of Life Sciences, John Maynard Smith Building, University of Sussex, Falmer BN1 9QJ, UK.
- Antoine Wystrach
- School of Informatics, University of Edinburgh, Appleton Tower, 11 Crichton Street, Edinburgh EH8 9LE, UK
- Paul Graham
- School of Life Sciences, John Maynard Smith Building, University of Sussex, Falmer BN1 9QJ, UK
- Andrew Philippides
- Department of Informatics, Chichester I, University of Sussex, Falmer, Brighton BN1 9QJ, UK.
37
Stürzl W, Grixa I, Mair E, Narendra A, Zeil J. Three-dimensional models of natural environments and the mapping of navigational information. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015; 201:563-84. [DOI: 10.1007/s00359-015-1002-y] [Citation(s) in RCA: 42] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2014] [Revised: 03/10/2015] [Accepted: 03/13/2015] [Indexed: 11/24/2022]
38
Path integration, views, search, and matched filters: the contributions of Rüdiger Wehner to the study of orientation and navigation. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015; 201:517-32. [DOI: 10.1007/s00359-015-0984-9] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2014] [Revised: 01/11/2015] [Accepted: 01/27/2015] [Indexed: 10/24/2022]
39
Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2014. [DOI: 10.4161/cib.13763] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
40
Milford M, Vig E, Scheirer W, Cox D. Vision-based Simultaneous Localization and Mapping in Changing Outdoor Environments. J FIELD ROBOT 2014. [DOI: 10.1002/rob.21532] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Affiliation(s)
- Michael Milford
- Queensland University of Technology, Brisbane, Queensland 4001, Australia
- Eleonora Vig
- Harvard University, Cambridge, Massachusetts 02138
- David Cox
- Harvard University, Cambridge, Massachusetts 02138
41
25 years of research on the use of geometry in spatial reorientation: a current theoretical perspective. Psychon Bull Rev 2014; 20:1033-54. [PMID: 23456412 DOI: 10.3758/s13423-013-0416-1] [Citation(s) in RCA: 116] [Impact Index Per Article: 11.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
The purpose of this article is to review and evaluate the range of theories proposed to explain findings on the use of geometry in reorientation. We consider five key approaches and models associated with them and, in the course of reviewing each approach, five key issues. First, we take up modularity theory itself, as recently revised by Lee and Spelke (Cognitive Psychology, 61, 152-176, 2010a; Experimental Brain Research, 206, 179-188, 2010b). In this context, we discuss issues concerning the basic distinction between geometry and features. Second, we review the view-matching approach (Stürzl, Cheung, Cheng, & Zeil, Journal of Experimental Psychology: Animal Behavior Processes, 34, 1-14, 2008). In this context, we highlight the possibility of cross-species differences, as well as commonalities. Third, we review an associative theory (Miller & Shettleworth, Journal of Experimental Psychology: Animal Behavior Processes, 33, 191-212, 2007; Journal of Experimental Psychology: Animal Behavior Processes, 34, 419-422, 2008). In this context, we focus on phenomena of cue competition. Fourth, we take up adaptive combination theory (Newcombe & Huttenlocher, 2006). In this context, we focus on discussing development and the effects of experience. Fifth, we examine various neurally based approaches, including frameworks proposed by Doeller and Burgess (Proceedings of the National Academy of Sciences of the United States of America, 105, 5909-5914, 2008; Doeller, King, & Burgess, Proceedings of the National Academy of Sciences of the United States of America, 105, 5915-5920, 2008) and by Sheynikhovich, Chavarriaga, Strösslin, Arleo, and Gerstner (Psychological Review, 116, 540-566, 2009). In this context, we examine the issue of the neural substrates of spatial navigation. We conclude that none of these approaches can account for all of the known phenomena concerning the use of geometry in reorientation and clarify what the challenges are for each approach.
43
Zeil J, Narendra A, Stürzl W. Looking and homing: how displaced ants decide where to go. Philos Trans R Soc Lond B Biol Sci 2014; 369:20130034. [PMID: 24395961 DOI: 10.1098/rstb.2013.0034] [Citation(s) in RCA: 66] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
We caught solitary foragers of the Australian Jack Jumper ant, Myrmecia croslandi, and released them in three compass directions at distances of 10 and 15 m from the nest, at locations they had never been to before. We recorded the head orientation and the movements of ants within a radius of 20 cm from the release point and, in some cases, tracked their subsequent paths with a differential GPS. We find that upon surfacing from their transport vials onto a release platform, most ants move into the home direction after looking around briefly. The ants use a systematic scanning procedure, consisting of saccadic head and body rotations that sweep gaze across the scene with an average angular velocity of 90° s⁻¹ and intermittent changes in turning direction. By mapping the ants' gaze directions onto the local panorama, we find that neither the ants' gaze nor their decisions to change turning direction are clearly associated with salient or significant features in the scene. Instead, the ants look most frequently in the home direction and start walking fast when doing so. Displaced ants can thus identify home direction with little translation, but exclusively through rotational scanning. We discuss the navigational information content of the ants' habitat and how the insects' behaviour informs us about how they may acquire and retrieve that information.
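Reading home direction from purely rotational scanning is commonly modelled with a rotational image difference function: the current panorama is compared with a stored, nest-directed snapshot at every azimuth, and the rotation with the smallest mismatch indicates the home direction. A minimal sketch with a synthetic 1-D panorama (this is a generic illustration of the technique, not the authors' analysis pipeline):

```python
import numpy as np

def rotational_idf(current_view, home_view):
    """Root-mean-square pixel difference between a stored snapshot and
    the current 1-D panorama, evaluated at every azimuthal rotation."""
    n = current_view.size
    return np.array([
        np.sqrt(np.mean((np.roll(current_view, shift) - home_view) ** 2))
        for shift in range(n)
    ])

# Synthetic 360-pixel panorama; the agent has turned by 90 pixels (90°)
# relative to the orientation at which the snapshot was stored.
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
home_view = np.sin(angles) + 0.3 * np.sin(3 * angles)
current_view = np.roll(home_view, -90)

idf = rotational_idf(current_view, home_view)
best_rotation = int(np.argmin(idf))  # the scan's minimum recovers the 90° turn
assert best_rotation == 90
```

No translation is needed: sweeping gaze through all rotations, as the scanning ants do, is enough to find the heading at which the remembered and current views align.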
Affiliation(s)
- Jochen Zeil
- ARC Centre of Excellence in Vision Science, Research School of Biology, The Australian National University, Building 46, Biology Place, Canberra, Australian Capital Territory 0200, Australia
44
Narendra A, Gourmaud S, Zeil J. Mapping the navigational knowledge of individually foraging ants, Myrmecia croslandi. Proc Biol Sci 2013; 280:20130683. [PMID: 23804615 DOI: 10.1098/rspb.2013.0683] [Citation(s) in RCA: 85] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Ants are efficient navigators, guided by path integration and visual landmarks. Path integration is the primary strategy in landmark-poor habitats, but landmarks are readily used when available. The landmark panorama provides reliable information about heading direction, routes and specific location. Visual memories for guidance are often acquired along routes or near to significant places. Over what area can such locally acquired memories provide information for reaching a place? This question is unusually approachable in the solitary foraging Australian jack jumper ant, since individual foragers typically travel to one or two nest-specific foraging trees. We find that within 10 m from the nest, ants both with and without home vector information available from path integration return directly to the nest from all compass directions, after briefly scanning the panorama. By reconstructing panoramic views within the successful homing range, we show that in the open woodland habitat of these ants, snapshot memories acquired close to the nest provide sufficient navigational information to determine nest-directed heading direction over a surprisingly large area, including areas that animals may have not visited previously.
Affiliation(s)
- Ajay Narendra
- ARC Centre of Excellence in Vision Science, Research School of Biology, The Australian National University, Canberra, Australian Capital Territory 0200, Australia.
45
Lent D, Graham P, Collett T. Visual Scene Perception in Navigating Wood Ants. Curr Biol 2013; 23:684-90. [PMID: 23583550 DOI: 10.1016/j.cub.2013.03.016] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2012] [Revised: 02/05/2013] [Accepted: 03/07/2013] [Indexed: 11/24/2022]
47
Narendra A, Raderschall C, Robson S. Homing abilities of the Australian intertidal ant, Polyrhachis sokolova. J Exp Biol 2013; 216:3674-81. [DOI: 10.1242/jeb.089649] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
The pressure of returning and locating the nest after a successful foraging trip is immense in ants. To find their way back home, ants use a number of different strategies (e.g., path integration, trail-following) and rely on a range of cues (e.g., pattern of polarised skylight, landmark panorama) available in their environment. How ants weigh different cues has been a question of great interest and has primarily been addressed in desert ants from Africa and Australia. Here we identify the navigational abilities of an intertidal ant, Polyrhachis sokolova, which lives on mudflats where nests and foraging areas are frequently inundated with tidal water. We find that these solitary foraging ants rely heavily on visual landmark information for navigation, but they are also capable of path integration. By displacing ants with and without vector information at different locations within the local familiar territory we created conflicts between information from the landmarks and the path integrator. The homing success of full-vector ants, compared to the zero-vector ants, when displaced 5 m behind the feeder indicates that vector information had to be coupled with landmark information for successful homing. To explain the differences in the homing abilities of ants from different locations we determined the navigational information content at each release station and compared it to that available at the feeder location. We report here on the interaction of multiple navigation strategies in the context of the information content in the environment.
48
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913 PMCID: PMC3526811 DOI: 10.3389/fncir.2012.00108] [Citation(s) in RCA: 71] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2012] [Accepted: 12/03/2012] [Indexed: 11/30/2022] Open
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight manoeuvres and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown by their characteristic behavioral actions to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
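The computational point behind the saccadic strategy, that rotational optic flow is independent of distance while translational flow scales with 1/distance, can be written down for a simplified 1-D eye. This is the textbook flow equation for a planar agent, not the review's own model:

```python
import numpy as np

def optic_flow(theta, v, phi, omega, distance):
    """Angular image velocity seen at viewing direction `theta` by an agent
    translating at speed `v` towards azimuth `phi` while rotating at `omega`.
    The rotational term (-omega) is distance-independent; the translational
    term scales with 1/distance, which is what makes it informative about depth."""
    return -omega + (v / distance) * np.sin(theta - phi)

theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
near_flow = optic_flow(theta, v=1.0, phi=0.0, omega=0.5, distance=1.0)
far_flow = optic_flow(theta, v=1.0, phi=0.0, omega=0.5, distance=10.0)

# Subtract the rotational component, as the saccadic flight strategy does
# behaviourally; the residual flow then reveals relative depth directly.
residual_near = near_flow + 0.5
residual_far = far_flow + 0.5
assert np.abs(residual_near).max() > np.abs(residual_far).max()
```

Once rotation is confined to brief saccades and removed, nearby objects produce large residual flow and distant ones almost none, so depth falls out of the translational intersaccadic intervals for free.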
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
49
Cheung A, Hiby L, Narendra A. Ant navigation: fractional use of the home vector. PLoS One 2012; 7:e50451. [PMID: 23209744 PMCID: PMC3510198 DOI: 10.1371/journal.pone.0050451] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/24/2012] [Accepted: 10/22/2012] [Indexed: 11/19/2022] Open
Abstract
Home is a special location for many animals, offering shelter from the elements, protection from predation, and a common gathering place for members of the same species. Not surprisingly, many species have evolved efficient, robust homing strategies, which are used as part of each and every foraging journey. A basic strategy used by most animals is to take the shortest possible route home by accruing the net distances and directions travelled during foraging, a strategy well known as path integration. This strategy is part of the navigation toolbox of ants occupying different landscapes. However, when there is a visual discrepancy between test and training conditions, the distance travelled by animals relying on the path integrator varies dramatically between species: from 90% of the home vector to an absolute distance of only 50 cm. Here we ask what the theoretically optimal balance between path-integration (PI)-driven and landmark-driven navigation should be. In combination with well-established results from optimal search theory, we show analytically that this fractional use of the home vector is an optimal homing strategy under a variety of circumstances. Assuming there is a familiar route that an ant recognizes, theoretically optimal search should always begin at some fraction of the home vector, depending on the region of familiarity. These results are shown to be largely independent of the search algorithm used. Ant species from different habitats appear to have optimized their navigation strategy based on the availability and nature of navigational information content in their environment.
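Path integration and the "fractional" read-out discussed here are easy to sketch: accumulate outbound steps into a home vector, then run off only a fraction of it before switching to landmark-guided search. The step data and function names below are invented for illustration; this is a toy, not the paper's analytical model:

```python
import numpy as np

def integrate_path(steps):
    """Path integration: accumulate outbound steps and return the home
    vector (the straight-line vector pointing back to the start)."""
    position = np.sum(steps, axis=0)
    return -position

def search_start(position, home_vector, fraction):
    """Run off only a fraction of the home vector before abandoning the
    path integrator in favour of landmark-driven search."""
    return position + fraction * home_vector

outbound = np.array([[1.0, 0.0], [0.0, 2.0], [2.0, 1.0]])
position = outbound.sum(axis=0)      # forager ends its outbound trip at (3, 3)
home = integrate_path(outbound)      # accumulated home vector is (-3, -3)
switch = search_start(position, home, fraction=0.9)
# With fraction = 0.9 the ant begins its systematic search 10% of the home
# distance short of the nest, where familiar landmark views can take over.
```

The paper's question is then which `fraction` minimizes expected total homing time given the size of the region the ant can recognize visually.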
Affiliation(s)
- Allen Cheung
- Queensland Brain Institute, The University of Queensland, Brisbane, Queensland, Australia
- Lex Hiby
- Conservation Research Ltd., Gt. Shelford, Cambridge, United Kingdom
- Ajay Narendra
- ARC Centre of Excellence in Vision Science, Research School of Biology, The Australian National University, Canberra, Australian Capital Territory, Australia
50
Palikij J, Ebert E, Preston M, McBride A, Jander R. Evidence for the honeybee's place knowledge in the vicinity of the hive. JOURNAL OF INSECT PHYSIOLOGY 2012; 58:1289-1298. [PMID: 22796223 DOI: 10.1016/j.jinsphys.2012.07.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/07/2011] [Revised: 06/28/2012] [Accepted: 07/02/2012] [Indexed: 06/01/2023]
Abstract
Upon leaving the nest for the first time, honeybees employ a tripartite orientation/exploration system to gain the requisite knowledge to return to their hive after foraging. Focal exploration comes first: the departing bee turns around to face the return target and oscillates in a lateral flight pattern of increasing amplitude and distance. Thereafter, for the peripheral exploration, the forward-flying bee circles the return-goal area with expanding and alternating clockwise and counterclockwise arcs. After this two-part proximal exploration follows distal exploration: the bee flies straight towards her potential distal goal. For the return path, supported by the preceding exploratory learning, the return navigational performance is expected to reflect the three exploratory parts in reverse order. Previously only two performance parts had been experimentally identified: focal navigation and distal navigation. Here we discovered that peripheral navigation is distinct from focal and distal navigation. Like focal navigation, yet unlike distal navigation, peripheral navigation is invariably triggered by local place recognition. Whereas focal navigation (orientation) is close to unidirectional, peripheral navigation makes use of multiple goal-vector knowledge. We term the area in question the Peripheral Correction Area because within it peripheral navigation is triggered, which in turn is capable of correcting errors that accumulated during a preceding distal, dead-reckoning-based flight.
Affiliation(s)
- Jason Palikij
- University of Kansas, Department of Ecology and Evolutionary Biology, 2041 Haworth Hall, 1200 Sunnyside Avenue, Lawrence, KS 66045-7534, USA