1. Schultheiss P. Unbalanced visual cues do not affect search precision at the nest in desert ants (Cataglyphis nodus). Learn Behav 2024;52:85-91. PMID: 37985604; PMCID: PMC10923989; DOI: 10.3758/s13420-023-00613-0.
Abstract
Desert ant foragers are well known for their visual navigation abilities, relying on visual cues in the environment to find their way along routes back to the nest. If the inconspicuous nest entrance is missed, ants engage in a highly structured systematic search until it is discovered. Searching ants continue to be guided by visual cues surrounding the nest, from which they derive a location estimate. The precision level of this estimate depends on the information content of the nest panorama. This study examines whether search precision is also affected by the directional distribution of visual information. The systematic searching behavior of ants is examined under laboratory settings. Two different visual scenarios are compared - a balanced one where visual information is evenly distributed, and an unbalanced one where all visual information is located on one side of an experimental arena. The identity and number of visual objects are similar across both conditions. The ants search with comparable precision in both conditions. Even in the visually unbalanced condition, searches are characterized by balanced precision on both sides of the arena. This finding lends support to the idea that ants memorize the visual scenery at the nest as panoramic views from different locations. A searching ant is thus able to estimate its location with equal precision in all directions, leading to symmetrical search paths.
Affiliation(s)
- Patrick Schultheiss
- Behavioral Physiology and Sociobiology, University of Würzburg, Am Hubland, 97074, Würzburg, Germany.
2. Konnerth MM, Foster JJ, el Jundi B, Spaethe J, Beetz MJ. Monarch butterflies memorize the spatial location of a food source. Proc Biol Sci 2023;290:20231574. PMID: 38113939; PMCID: PMC10730289; DOI: 10.1098/rspb.2023.1574.
Abstract
Spatial memory helps animals to navigate familiar environments. In insects, spatial memory has been studied extensively in central place foragers such as ants and bees. However, whether butterflies memorize a spatial location remains unclear. Here, we conducted behavioural experiments to test whether monarch butterflies (Danaus plexippus) can remember and retrieve the spatial location of a food source. We placed several visually identical feeders in a flight cage, with only one feeder providing sucrose solution. Across multiple days, individual butterflies predominantly visited the rewarding feeder. Next, we displaced a salient landmark close to the feeders to test which visual cue the butterflies used to relocate the rewarding feeder. While occasional landmark displacements were ignored by the butterflies and did not affect their decisions, systematic displacement of both the landmark and the rewarding feeder demonstrated that the butterflies associated the salient landmark with the feeder's position. Altogether, we show that butterflies consolidate and retrieve spatial memory in the context of foraging.
Affiliation(s)
- M. Marcel Konnerth
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
- James J. Foster
- Department of Biology, University of Konstanz, 78464 Konstanz, Baden-Württemberg, Germany
- Basil el Jundi
- Department of Biology, Norwegian University of Science and Technology, NO-7491 Trondheim, Norway
- Johannes Spaethe
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
- M. Jerome Beetz
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
3. Zhu L, Mangan M, Webb B. Neuromorphic sequence learning with an event camera on routes through vegetation. Sci Robot 2023;8:eadg3679. PMID: 37756384; DOI: 10.1126/scirobotics.adg3679.
Abstract
For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity and be subject to changing lighting conditions and motion over uneven terrain, and the effects of wind on leaves increase the variability of the input. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method can plausibly support route recognition for visual navigation and is more robust than SeqSLAM when evaluated on repeated runs on the same route or routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.
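The core idea of familiarity-based route recognition can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's spiking/event-camera implementation: memorize a sequence of low-resolution views along a route, then score any query view by its best match against the stored sequence.

```python
# Hypothetical sketch of familiarity-based route recognition: store
# low-resolution views along a route, then score a query view by its
# distance to the closest stored view (higher familiarity = closer).

def view_difference(a, b):
    """Mean absolute pixel difference between two equal-length views."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def familiarity(query, route_views):
    """Negated distance to the closest stored view; higher is more familiar."""
    return -min(view_difference(query, v) for v in route_views)

# Toy 1-D "views" (e.g. flattened, downsampled images); values illustrative.
route = [[0, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
on_route = [0, 1, 1, 0]   # identical to a stored view
off_route = [1, 0, 0, 1]  # unlike any stored view

assert familiarity(on_route, route) > familiarity(off_route, route)
```

In the paper this familiarity signal is computed by a spiking network modelled on the insect mushroom body; the sketch above only captures the behavioural readout (more familiar views indicate the robot is on the learned route).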
Affiliation(s)
- Le Zhu
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, S1 4DP Sheffield, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
4. Bertrand OJN, Sonntag A. The potential underlying mechanisms during learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37204434; DOI: 10.1007/s00359-023-01637-7.
Abstract
Hymenopterans, such as bees and wasps, have long fascinated researchers with their sinuous movements at novel locations. These movements, such as loops, arcs, or zigzags, help insects learn their surroundings at important locations. They also allow the insects to explore and orient themselves in their environment. After gaining experience with their environment, insects fly along optimized paths guided by several guidance strategies, such as path integration, local homing, and route-following, which together form a navigational toolkit. Whereas experienced insects combine these strategies efficiently, naive insects must first learn about their surroundings and tune the navigational toolkit. We will see that the structure of the movements performed during learning flights leverages the robustness of certain strategies within a given scale to tune other strategies that are more efficient at a larger scale. Thus, an insect can explore its environment incrementally without risking losing its way back to essential locations.
Affiliation(s)
- Olivier J N Bertrand
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615, Bielefeld, NRW, Germany.
- Annkathrin Sonntag
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615, Bielefeld, NRW, Germany
5. Collett TS, Hempel de Ibarra N. An 'instinct for learning': the learning flights and walks of bees, wasps and ants from the 1850s to now. J Exp Biol 2023;226:301237. PMID: 37015045; PMCID: PMC10112973; DOI: 10.1242/jeb.245278.
Abstract
The learning flights and walks of bees, wasps and ants are precisely coordinated movements that enable insects to memorise the visual surroundings of their nest or other significant places such as foraging sites. These movements occur on the first few occasions that an insect leaves its nest. They are of special interest because their discovery in the middle of the 19th century provided perhaps the first evidence that insects can learn and are not solely governed by instinct. Here, we recount the history of research on learning flights from their discovery to the present day. The first studies were conducted by skilled naturalists and then, over the following 50 years, by neuroethologists examining the insects' learning behaviour in the context of experiments on insect navigation and its underlying neural mechanisms. The most important property of these movements is that insects repeatedly fixate their nest and look in other favoured directions, either in a preferred compass direction, such as North, or towards preferred objects close to the nest. Nest facing is accomplished through path integration. Memories of views along a favoured direction can later guide an insect's return to its nest. In some ant species, the favoured direction is adjusted to future foraging needs. These memories can then guide both the outward and homeward legs of a foraging trip. Current studies of central areas of the insect brain indicate what regions implement the behavioural manoeuvres underlying learning flights and the resulting visual memories.
Affiliation(s)
- Thomas S Collett
- School of Life Sciences, University of Sussex, Brighton, BN1 9QG, UK
6. Clement L, Schwarz S, Wystrach A. An intrinsic oscillator underlies visual navigation in ants. Curr Biol 2023;33:411-422.e5. PMID: 36538930; DOI: 10.1016/j.cub.2022.11.059.
Abstract
Many insects display lateral oscillations while moving, but how these oscillations are produced and participate in visual navigation remains unclear. Here, we show that visually navigating ants continuously display regular lateral oscillations coupled with variations of forward speed that strongly optimize the distance covered while simultaneously enabling them to scan left and right directions. This pattern of movement is produced endogenously and conserved across navigational contexts in two phylogenetically distant ant species. Moreover, the oscillations' amplitude can be modulated by both innate and learnt visual cues to adjust the exploration/exploitation balance to the current need. This lower-level motor pattern thus drastically reduces the degrees of freedom needed for higher-level strategies to control behavior. The observed dynamical signature readily emerges from a simple neural circuit model of the insect's conserved pre-motor area known as the lateral accessory lobe, offering a surprisingly simple but effective neural control and endorsing oscillation as a core, ancestral way of moving in insects.
Affiliation(s)
- Leo Clement
- Centre de Recherches sur la Cognition Animale, CBI, CNRS, Université Paul Sabatier, 31062 Toulouse Cedex 09, France.
- Sebastian Schwarz
- Centre de Recherches sur la Cognition Animale, CBI, CNRS, Université Paul Sabatier, 31062 Toulouse Cedex 09, France
- Antoine Wystrach
- Centre de Recherches sur la Cognition Animale, CBI, CNRS, Université Paul Sabatier, 31062 Toulouse Cedex 09, France
7. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568; DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings over several hundred metres to kilometres, is necessary for mediating behaviours, such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent the distances unambiguously, but these are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
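The distance/speed ambiguity discussed in this abstract follows directly from the standard geometry of motion parallax. A minimal numerical illustration (textbook geometry, not code from the review): during pure translation at speed v, an object at distance d and bearing theta from the direction of travel sweeps across the retina at angular velocity omega = v * sin(theta) / d, so the flow only yields the ratio v/d (speed-scaled nearness), never absolute distance.

```python
# Illustrative sketch of translational motion parallax (standard
# geometry, not from the review itself): angular velocity of an object
# on the retina during pure translation.
import math

def flow_magnitude(v, theta, d):
    """omega = v * sin(theta) / d, for speed v, bearing theta, distance d."""
    return v * math.sin(theta) / d

# Doubling speed and distance together leaves the flow unchanged,
# which is exactly the scaling ambiguity described in the abstract.
a = flow_magnitude(1.0, math.pi / 2, 2.0)
b = flow_magnitude(2.0, math.pi / 2, 4.0)
assert abs(a - b) < 1e-12
```

This is why the review emphasizes that flow-derived "distances" are scaled by locomotion speed and must be disambiguated by other means.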
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615, Bielefeld, Germany.
8. Cormons MJ, Zeil J. Digger wasps Microbembex monodonta Say (Hymenoptera, Crabronidae) rely exclusively on visual cues when pinpointing their nest entrances. PLoS One 2023;18:e0282144. PMID: 36989296; PMCID: PMC10058119; DOI: 10.1371/journal.pone.0282144.
Abstract
The ability of insects to navigate and home is crucial to fundamental tasks, such as pollination, parental care, procuring food, and finding mates. Despite recent advances in our understanding of visual homing in insects, it remains unclear exactly how ground-nesting Hymenoptera are able to precisely locate their often inconspicuous or hidden reproductive burrow entrances. Here we show that the ground-nesting wasp Microbembex monodonta locates her hidden burrow entrance with the help of local landmarks, but only if their view of the wider panorama is not blocked. Moreover, the wasps are able to pinpoint the burrow location to within a few centimeters when potential olfactory, tactile and auditory cues are locally masked. We conclude that M. monodonta locate their hidden burrows relying exclusively on local visual cues in the context of the wider panorama. We discuss these results in the light of the older and more recent literature on nest recognition and homing in insects.
Affiliation(s)
- Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, ACT, Australia
9. Martin-Ordas G. Frames of reference in small-scale spatial tasks in wild bumblebees. Sci Rep 2022;12:21683. PMID: 36522430; PMCID: PMC9755249; DOI: 10.1038/s41598-022-26282-z.
Abstract
Spatial cognitive abilities are fundamental to foraging animal species. In particular, being able to encode the location of an object in relation to another object (i.e., spatial relationships) is critical for successful foraging. Whether egocentric (i.e., viewer-dependent) or allocentric (i.e., dependent on the external environment or cues) representations underlie these behaviours is still a highly debated question in vertebrates and invertebrates. Previous research shows that bees encode spatial information largely using egocentric information. However, no research has investigated this question in the context of relational similarity. To test this, a spatial matching task previously used with humans and great apes was adapted for use with wild-caught bumblebees. In a series of experiments, bees first experienced a rewarded object and then had to spontaneously (Experiment 1) find, or learn (Experiments 2 and 3) to find, a second one, based on the location of the first one. The results showed that bumblebees predominantly exhibited an allocentric strategy in the three experiments. These findings suggest that egocentric representations alone might not be evolutionarily ancestral and clearly indicate similarities between vertebrates and invertebrates when encoding spatial information.
Affiliation(s)
- Gema Martin-Ordas
- Department of Psychology, University of Oviedo, Oviedo, Spain; Division of Psychology, University of Stirling, Stirling, UK
10. Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. PMID: 36515743; DOI: 10.1007/s00359-022-01599-2.
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorized (positional image matching). Central place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest that are likely to be designed to systematically collect scene memories tagged with information provided by path integration on the direction of and the distance to the nest. Equally, traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
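The positional image matching described here (a gradient descent in image differences leading back to the memorized location) can be sketched in a few lines. This is an assumed simplification on a toy grid world, not an implementation from the review: the agent repeatedly steps to whichever neighbouring location's view best matches the memorized nest view.

```python
# Assumed, simplified sketch of positional image matching: descend the
# gradient of image differences by stepping to the neighbouring cell
# whose local "view" best matches the memorized goal view.

def image_diff(a, b):
    """Sum of squared pixel differences between two equal-length views."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def view_at(world, x, y):
    """Hypothetical 'panorama': the 3x3 neighbourhood around (x, y), flattened."""
    return [world[j][i] for j in (y - 1, y, y + 1) for i in (x - 1, x, x + 1)]

def home(world, start, goal_view, max_steps=20):
    """Greedy descent on image differences; returns the final position."""
    x, y = start
    for _ in range(max_steps):
        candidates = [(x, y), (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        x, y = min(candidates,
                   key=lambda p: image_diff(view_at(world, *p), goal_view))
    return x, y

# Toy visual world with smoothly varying, location-unique views.
world = [[i + 10 * j for i in range(9)] for j in range(9)]
goal_view = view_at(world, 3, 3)       # view memorized at the "nest"
assert home(world, (5, 5), goal_view) == (3, 3)
```

The catchment area of such a scheme depends on the image-difference function decreasing smoothly toward the goal, which is exactly the property of natural panoramic views that the review discusses.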
11. Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M. Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 2022;225:274096. PMID: 35067721; PMCID: PMC8920035; DOI: 10.1242/jeb.243021.
Abstract
Insects are remarkable flyers and capable of navigating through highly cluttered environments. We tracked the head and thorax of bumblebees freely flying in a tunnel containing vertically oriented obstacles to uncover the sensorimotor strategies used for obstacle detection and collision avoidance. Bumblebees presented all the characteristics of active vision during flight by stabilizing their head relative to the external environment and maintained close alignment between their gaze and flightpath. Head stabilization increased motion contrast of nearby features against the background to enable obstacle detection. As bees approached obstacles, they appeared to modulate avoidance responses based on the relative retinal expansion velocity (RREV) of obstacles and their maximum evasion acceleration was linearly related to RREVmax. Finally, bees prevented collisions through rapid roll manoeuvres implemented by their thorax. Overall, the combination of visuo-motor strategies of bumblebees highlights elegant solutions developed by insects for visually guided flight through cluttered environments.
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany; School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2600, Australia
- Tim Siesenop
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Olivier J. Bertrand
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Liang Li
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, University of Konstanz, 78464 Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, 78464 Konstanz, Germany; Department of Biology, University of Konstanz, 78464 Konstanz, Germany
- Charlotte Doussot
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- William H. Warren
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, RI 02912, USA
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
12. Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021;16:055007. PMID: 34243169; DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal, or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route following approach that makes use of transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view matching process which can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet-based bandpass orientated filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate using a vision system with a biologically relevant visual acuity and viewing direction.
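The translational invariance requirement on the view matcher can be illustrated with a crude stand-in. This sketch is an assumption-laden simplification, not the paper's wavelet-based bandpass filtering: it gains shift tolerance simply by taking the best match over small horizontal offsets of the query view.

```python
# Hedged sketch of view matching with a degree of translational
# invariance (a naive stand-in for the paper's wavelet-based filters):
# score = minimum mean absolute difference over small circular shifts.

def shifted(view, s):
    """Circularly shift a 1-D view by s pixels."""
    return view[s:] + view[:s]

def match_score(query, memory, max_shift=2):
    """Smaller is better: best mean abs. difference over allowed shifts."""
    n = len(query)
    best = float("inf")
    for s in range(-max_shift, max_shift + 1):
        q = shifted(query, s % n)
        d = sum(abs(a - b) for a, b in zip(q, memory)) / n
        best = min(best, d)
    return best

memory = [0, 1, 2, 3, 4, 5, 6, 7]
query = shifted(memory, 1)        # same view, offset by one pixel
assert match_score(query, memory) == 0.0
```

Tolerating small offsets in this way is what widens the catchment area of each learned view, which the paper achieves far more effectively with orientated bandpass filtering.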
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
13. Wystrach A. Movements, embodiment and the emergence of decisions. Insights from insect navigation. Biochem Biophys Res Commun 2021;564:70-77. PMID: 34023071; DOI: 10.1016/j.bbrc.2021.04.114.
Abstract
We readily infer that animals make decisions, but what this implies is usually not clearly defined. The notion of 'decision-making' ultimately stems from human introspection, and is thus loaded with anthropomorphic assumptions. Notably, the decision is made internally, is based on information, and precedes the goal directed behaviour. Also, making a decision implies that 'something' did it, thus hints at the presence of a cognitive mind, whose existence is independent of the decision itself. This view may convey some truth, but here I take the opposite stance. Using examples from research in insect navigation, this essay highlights how apparent decisions can emerge without a brain, how actions can precede information or how sophisticated goal directed behaviours can be implemented without neural decisions. This perspective requires us to shake off the idea that behaviour is a consequence of the brain; and embrace the concept that movements arise from - as much as participate in - distributed interactions between various computational centres - including the body - that reverberate in closed-loop with the environment. From this perspective we may start to picture how a cognitive mind can be the consequence, rather than the cause, of such neural and body movements.
Affiliation(s)
- Antoine Wystrach
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 route deNarbonne, F-31062, Toulouse, France.
14. Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021;14:606590. PMID: 33542681; PMCID: PMC7852487; DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including their spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: Bees employ a saccadic flight and gaze strategy, where rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When during intersaccades the gaze direction is kept relatively constant, the apparent motion contains information about the distance of the animal to environmental objects, and thus, in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time, the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
15. Bumblebees perceive the spatial layout of their environment in relation to their body size and form to minimize inflight collisions. Proc Natl Acad Sci U S A 2020;117:31494-31499. PMID: 33229535; DOI: 10.1073/pnas.2016872117.
Abstract
Animals that move through complex habitats must frequently contend with obstacles in their path. Humans and other highly cognitive vertebrates avoid collisions by perceiving the relationship between the layout of their surroundings and the properties of their own body profile and action capacity. It is unknown whether insects, which have much smaller brains, possess such abilities. We used bumblebees, which vary widely in body size and regularly forage in dense vegetation, to investigate whether flying insects consider their own size when interacting with their surroundings. Bumblebees trained to fly in a tunnel were sporadically presented with an obstructing wall containing a gap that varied in width. Bees successfully flew through narrow gaps, even those that were much smaller than their wingspans, by first performing lateral scanning (side-to-side flights) to visually assess the aperture. Bees then reoriented their in-flight posture (i.e., yaw or heading angle) while passing through, minimizing their projected frontal width and mitigating collisions; in extreme cases, bees flew entirely sideways through the gap. Both the time that bees spent scanning during their approach and the extent to which they reoriented themselves to pass through the gap were determined not by the absolute size of the gap, but by the size of the gap relative to each bee's own wingspan. Our findings suggest that, similar to humans and other vertebrates, flying bumblebees perceive the affordance of their surroundings relative to their body size and form to navigate safely through complex environments.
16. Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020;16:e1008272. PMID: 33048938; PMCID: PMC7553325; DOI: 10.1371/journal.pcbi.1008272.
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects such as bees and ants are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when what the insect currently sees matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees’ homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees’ return flights in such circumstances and investigated where they searched for their nest entrance as a function of the degree of displacement between the two visually relevant cues. Bumblebees mostly searched at the fictive nest location as indicated by either cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees’ search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information was encoded and learned, rather than just brightness information. Returning home sounds trivial, but returning to a concealed underground location such as a burrow is less easy. For buff-tailed bumblebees, this task is routine.
After collecting pollen in gardens or flowering meadows, bees must return to their underground nest to feed the queen’s larvae. The nest entrance is almost invisible to a returning bee; it therefore guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, hence leading the bees to a single new location. In nature, however, the objects constituting the visual environment may be displaced in a disorderly manner, as some are more susceptible than others to the action of factors such as wind. In our study, we displaced objects so as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees’ search. Finally, we could predict the search locations using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
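The multi-view homing models favoured by these results can be caricatured in a few lines: an agent stores several views, each tagged with the local direction to the nest, and when homing follows the direction attached to the best-matching stored view. The sketch below is an illustrative reduction under my own assumptions (views as brightness vectors, a root-mean-square difference function); it is not the authors' implementation.

```python
import numpy as np

def view_difference(current, memory):
    # Root-mean-square brightness difference between two panoramic views.
    return np.sqrt(np.mean((current - memory) ** 2))

def multi_view_homing_step(current_view, snapshots):
    """Return the homing direction tagged to the best-matching stored view.

    snapshots: list of (view, home_direction) pairs recorded around the nest.
    """
    best = min(snapshots, key=lambda s: view_difference(current_view, s[0]))
    return best[1]

# Toy example: three stored views, each tagged with a direction to the nest.
rng = np.random.default_rng(0)
views = [rng.random(64) for _ in range(3)]
snapshots = list(zip(views, ["north", "east", "south"]))
# A view nearly identical to the second snapshot recalls its tag, "east".
noisy = views[1] + 0.01
recalled = multi_view_homing_step(noisy, snapshots)
```

In a cue-conflict arena, two displaced cue constellations would each best match a different subset of snapshots, producing two distinct search spots rather than a compromise, which is the qualitative behaviour reported above.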
Collapse
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
| | | | - Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
| |
Collapse
|
17
|
MaBouDi H, Solvi C, Chittka L. Bumblebees Learn a Relational Rule but Switch to a Win-Stay/Lose-Switch Heuristic After Extensive Training. Front Behav Neurosci 2020; 14:137. [PMID: 32903410 PMCID: PMC7434978 DOI: 10.3389/fnbeh.2020.00137] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2020] [Accepted: 07/16/2020] [Indexed: 11/22/2022] Open
Abstract
Mapping animal performance in a behavioral task to underlying cognitive mechanisms and strategies is rarely straightforward, since a task may be solvable in more than one manner. Here, we show that bumblebees perform well on a concept-based visual discrimination task but spontaneously switch from a concept-based solution to a simpler heuristic with extended training, all while continually increasing performance. Bumblebees were trained in an arena to find rewards on displays with shapes of different sizes where they could not use low-level visual cues. One group of bees was rewarded at displays with larger shapes and another group at displays with smaller shapes. Analysis of total choices shows that bees increased their performance over 30 bouts to above chance. However, analyses of first and sequential choices suggest that after approximately 20 bouts, bumblebees changed to a win-stay/lose-switch strategy. Comparing the bees' behavior to a probabilistic model based on a win-stay/lose-switch strategy further supports the idea that bees changed strategies with extensive training. Analyses of unrewarded tests indicate that bumblebees learned and retained the concept of relative size even after they had already switched to a win-stay/lose-switch strategy. We propose that this switch in strategy may reflect cognitive flexibility and efficiency.
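The generic win-stay/lose-switch heuristic the bees converged on can be written in a couple of lines. This is an illustrative sketch of the heuristic itself, not the authors' probabilistic model of bee choices.

```python
import random

def win_stay_lose_switch(prev_choice, prev_rewarded, options):
    """One win-stay/lose-switch step: repeat a rewarded choice,
    otherwise switch at random to one of the other options."""
    if prev_rewarded:
        return prev_choice
    return random.choice([o for o in options if o != prev_choice])

# With two display types, a win forces a repeat and a loss forces a switch.
stay = win_stay_lose_switch("large", True, ["large", "small"])
switch = win_stay_lose_switch("large", False, ["large", "small"])
```

Note the contrast with the concept-based rule ("choose the relatively larger shape"): the heuristic needs no evaluation of the display content at all, only the outcome of the previous visit, which is presumably what makes it cheaper after extensive training.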
Collapse
Affiliation(s)
- HaDi MaBouDi
- School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
| | - Cwyn Solvi
- School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
- Department of Biological Sciences, Macquarie University, North Ryde, NSW, Australia
| | - Lars Chittka
- School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
| |
Collapse
|
18
|
The representation selection problem: Why we should favor the geometric-module framework of spatial reorientation over the view-matching framework. Cognition 2019; 192:103985. [DOI: 10.1016/j.cognition.2019.05.022] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Revised: 05/22/2019] [Accepted: 05/25/2019] [Indexed: 01/20/2023]
|
19
|
Kheradmand B, Nieh JC. The Role of Landscapes and Landmarks in Bee Navigation: A Review. INSECTS 2019; 10:E342. [PMID: 31614833 PMCID: PMC6835465 DOI: 10.3390/insects10100342] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/31/2019] [Revised: 10/08/2019] [Accepted: 10/09/2019] [Indexed: 11/16/2022]
Abstract
The ability of animals to explore landmarks in their environment is essential to their fitness. Landmarks are widely recognized to play a key role in navigation by providing information in multiple sensory modalities. However, what is a landmark? We propose that animals use a hierarchy of information based upon its utility and salience when an animal is in a given motivational state. Focusing on honeybees, we suggest that foragers choose landmarks based upon their relative uniqueness, conspicuousness, stability, and context. We also propose that it is useful to distinguish between landmarks that provide sensory input that changes ("near") or does not change ("far") as the receiver uses these landmarks to navigate. However, we recognize that this distinction occurs on a continuum and is not a clear-cut dichotomy. We review the rich literature on landmarks, focusing on recent studies that have illuminated our understanding of the kinds of information that bees use, how they use it, potential mechanisms, and future research directions.
Collapse
Affiliation(s)
- Bahram Kheradmand
- Section of Ecology, Behavior, and Evolution, Division of Biological Sciences, UC San Diego, La Jolla, CA 92093, USA.
| | - James C Nieh
- Section of Ecology, Behavior, and Evolution, Division of Biological Sciences, UC San Diego, La Jolla, CA 92093, USA.
| |
Collapse
|
20
|
Schulte P, Zeil J, Stürzl W. An insect-inspired model for acquiring views for homing. BIOLOGICAL CYBERNETICS 2019; 113:439-451. [PMID: 31076867 DOI: 10.1007/s00422-019-00800-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/20/2018] [Accepted: 04/27/2019] [Indexed: 06/09/2023]
Abstract
Wasps and bees perform learning flights when leaving their nest or food locations for the first time, during which they acquire visual information that enables them to return successfully. Here we present and test a set of simple control rules underlying the execution of learning flights that closely mimic those performed by ground-nesting wasps. In the simplest model, we assume that the angle between the flight direction and the nest direction, as seen from the position of the insect, is constant and only flips sign when the direction of pivoting around the nest changes, resulting in a concatenation of piecewise defined logarithmic spirals. We then added characteristic properties of real learning flights, such as head saccades and the condition that the position of the nest entrance within the visual field is kept nearly constant, to describe the development of a learning flight in a head-centered frame of reference, assuming that the retinal position of the nest is known. We finally implemented a closed-loop simulation of learning flights based on a small set of visual control rules. The visual input for this model consists of rendered views generated from 3D reconstructions of natural wasp nesting sites, and the retinal nest position is controlled by means of simple template-based tracking. We show that naturalistic paths can be generated without knowledge of the absolute distance to the nest or of the flight speed. We demonstrate in addition that nest-tagged views recorded during such simulated learning flights are sufficient for a homing agent to pinpoint the goal, by identifying the nest direction when encountering familiar views. We discuss how the information acquired during learning flights close to the nest can be integrated with long-range homing models.
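The simplest model described here, a constant angle alpha between flight direction and nest direction whose sign flips with the pivoting direction, traces logarithmic spirals: with the tangent at a fixed angle alpha to the radial direction, dr/dθ = r/tan(alpha), so r = r0·exp(θ/tan alpha). A minimal geometric sketch of one such concatenation follows; parameter names are mine, and the head-saccade and closed-loop components of the full model are omitted.

```python
import math

def spiral_arc(r0, alpha_deg, dtheta_deg, n_steps, sign=1):
    """Sample points on a logarithmic spiral arc around the nest (origin).

    A constant angle alpha between flight direction and nest direction
    implies dr/dtheta = r / tan(alpha), i.e. r = r0 * exp(theta / tan(alpha)).
    `sign` is the pivoting direction around the nest.
    """
    alpha = math.radians(alpha_deg)
    pts = []
    for i in range(n_steps):
        theta = sign * i * math.radians(dtheta_deg)
        r = r0 * math.exp(abs(theta) / math.tan(alpha))
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

# A learning flight as a concatenation of arcs with alternating turn sign,
# each starting at the radius where the previous arc ended.
arc1 = spiral_arc(0.1, 45, 5, 20, sign=+1)
arc2 = spiral_arc(math.hypot(*arc1[-1]), 45, 5, 20, sign=-1)
```

The steadily growing radius reproduces the characteristic outward drift of learning flights without any explicit distance or speed estimate, consistent with the claim that naturalistic paths need neither.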
Collapse
Affiliation(s)
- Patrick Schulte
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
| | - Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, Australia
| | - Wolfgang Stürzl
- Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany.
| |
Collapse
|
21
|
Abstract
Insect navigation is strikingly geometric. Many species use path integration to maintain an accurate estimate of their distance and direction (a vector) to their nest and can store the vector information for multiple salient locations in the world, such as food sources, in a common coordinate system. Insects can also use remembered views of the terrain around salient locations or along travelled routes to guide return, which is a fundamentally geometric process. Recent modelling of these abilities shows convergence on a small set of algorithms and assumptions that appear sufficient to account for a wide range of behavioural data. Notably, this 'base model' does not include any significant topological knowledge: the insect does not need to recover the information (implicit in their vector memory) about the relationships between salient places; nor to maintain any connectedness or ordering information between view memories; nor to form any associations between views and vectors. However, there remains some experimental evidence not fully explained by this base model that may point towards the existence of a more complex or integrated mental map in insects.
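Path integration, the first geometric ability mentioned, amounts to accumulating step vectors; the home vector is the negated sum. A minimal sketch, with my own angle convention (0 deg along the x-axis, degrees increasing counter-clockwise):

```python
import math

def path_integrate(steps):
    """Accumulate (heading_deg, distance) steps into a home vector.

    Returns (distance, direction_deg) the animal would need to travel
    to return to the start point.
    """
    x = y = 0.0
    for heading_deg, dist in steps:
        x += dist * math.cos(math.radians(heading_deg))
        y += dist * math.sin(math.radians(heading_deg))
    home_dist = math.hypot(x, y)
    home_dir = math.degrees(math.atan2(-y, -x)) % 360  # negated sum
    return home_dist, home_dir

# Out ~3 m along 0 deg, then ~4 m along 90 deg: home is ~5 m away.
dist, direction = path_integrate([(0, 3), (90, 4)])
```

Storing one such accumulated vector per salient place (nest, food sources) in a common coordinate system is exactly the vector memory the 'base model' assumes, with no topological relations between the stored vectors.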
Collapse
Affiliation(s)
- Barbara Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, UK
| |
Collapse
|
22
|
Ravi S, Bertrand O, Siesenop T, Manz LS, Doussot C, Fisher A, Egelhaaf M. Gap perception in bumblebees. ACTA ACUST UNITED AC 2019; 222:222/2/jeb184135. [PMID: 30683732 DOI: 10.1242/jeb.184135] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2018] [Accepted: 10/26/2018] [Indexed: 11/20/2022]
Abstract
A number of insects fly over long distances below the natural canopy, where the physical environment is highly cluttered, consisting of obstacles of varying shape, size and texture. While navigating within such environments, animals need to perceive and disambiguate environmental features that might obstruct their flight. The most elemental aspect of aerial navigation through such environments is gap identification and 'passability' evaluation. We used bumblebees to seek insights into the mechanisms used for gap identification when confronted with an obstacle in their flight path and the behavioral compensations employed to assess gap properties. Initially, bumblebee foragers were trained to fly through an unobstructed flight tunnel that led to a foraging chamber. After the bees were familiar with this situation, we placed a wall containing a gap that unexpectedly obstructed the flight path on a return trip to the hive. The flight trajectories of the bees as they approached the obstacle wall and traversed the gap were analyzed in order to evaluate their behavior as a function of the distance between the gap and a background wall that was placed behind the gap. Bumblebees initially decelerated when confronted with an unexpected obstacle. Deceleration was first noticed when the obstacle subtended around 35 deg on the retina, but it also depended on the properties of the gap. Subsequently, the bees gradually traded off their longitudinal velocity for lateral velocity and approached the gap with increasing lateral displacement and lateral velocity. Bumblebees shaped their flight trajectory depending on the salience of the gap, indicated in our case by the optic flow contrast between the region within the gap and on the obstacle, which decreased with decreasing distance between the gap and the background wall. As the optic flow contrast decreased, the bees spent an increasing amount of time moving laterally across the obstacles. During these repeated lateral maneuvers, the bees were probably assessing gap geometry and passability.
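The ~35 deg retinal threshold reported for deceleration onset is simply the visual angle subtended by the obstacle, which ties obstacle width, distance, and retinal size together. A hedged helper pair (function names are illustrative, not from the paper):

```python
import math

def subtended_angle_deg(width, distance):
    # Visual angle (deg) subtended by an object of a given width
    # seen frontally at a given distance.
    return math.degrees(2 * math.atan(width / (2 * distance)))

def distance_at_angle(width, angle_deg):
    # Distance at which that object subtends a given visual angle,
    # e.g. the reported ~35 deg deceleration-onset threshold.
    return width / (2 * math.tan(math.radians(angle_deg) / 2))
```

For example, a 0.3 m wide obstacle wall reaches 35 deg of visual angle at roughly half a metre, so a fixed angular threshold would trigger deceleration proportionally earlier for wider obstacles.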
Collapse
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany .,School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
| | - Olivier Bertrand
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| | - Tim Siesenop
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| | - Lea-Sophie Manz
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany.,Faculty of Biology, Johannes Gutenberg-Universität Mainz, 55122 Mainz, Germany
| | - Charlotte Doussot
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| | - Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
| | - Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| |
Collapse
|
23
|
Linander N, Dacke M, Baird E, Hempel de Ibarra N. The role of spatial texture in visual control of bumblebee learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2018; 204:737-745. [PMID: 29980840 PMCID: PMC6096632 DOI: 10.1007/s00359-018-1274-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2018] [Revised: 06/07/2018] [Accepted: 06/19/2018] [Indexed: 11/29/2022]
Abstract
When leaving the nest for the first time, bees and wasps perform elaborate learning flights, during which the location of the nest is memorised. These flights are characterised by a succession of arcs or loops of increasing radius centred around the nest, with an incremental increase in ground speed, which requires precise control of the flight manoeuvres by the insect. Here, we investigated the role of optic flow cues in the control of learning flights by manipulating spatial texture in the ventral and panoramic visual field. We measured height, lateral displacement relative to the nest and ground speed during learning flights in bumblebees when ventral and panoramic optic flow cues were present or minimised, or features of the ground texture varied in size. Our observations show that ventral optic flow cues were required for the smooth execution of learning flights. We also found that bumblebees adjusted their flight height in response to variations of the visual texture on the ground. However, the presence or absence of panoramic optic flow did not have a substantial effect on flight performance. Our findings suggest that bumblebees mainly rely on optic flow information from the ventral visual field to control their learning flights.
Collapse
Affiliation(s)
- Nellie Linander
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden. .,Centre for Research in Animal Behaviour, Psychology, University of Exeter, Exeter, EX4 4QG, UK.
| | - Marie Dacke
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
| | - Emily Baird
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
| | | |
Collapse
|
24
|
Pritchard DJ, Hurly TA, Healy SD. Wild hummingbirds require a consistent view of landmarks to pinpoint a goal location. Anim Behav 2018. [DOI: 10.1016/j.anbehav.2018.01.014] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
25
|
Abstract
Navigation is an essential skill for many animals, and understanding how animals use environmental information, particularly visual information, to navigate has a long history in both ethology and psychology. In birds, the dominant approach for investigating navigation at small scales comes from comparative psychology, which emphasizes the cognitive representations underpinning spatial memory. The majority of this work is based in the laboratory, and it is unclear whether this context itself affects the information that birds learn and use when they search for a location. Data from hummingbirds suggest that birds in the wild might use visual information in quite a different manner. To reconcile these differences, here we propose a new approach to avian navigation, inspired by the sensory-driven study of navigation in insects. Using methods devised for studying the navigation of insects, it is possible to quantify the visual information available to navigating birds, and then to determine how this information influences those birds' navigation decisions. Focusing on four areas that we consider characteristic of the insect navigation perspective, we discuss how this approach has shed light on the information insects use to navigate, and assess the prospects of taking a similar approach with birds. Although birds and insects differ in many ways, there is nothing in the insect-inspired approach of the kind we describe that means these methods need be restricted to insects. On the contrary, adopting such an approach could provide a fresh perspective on the well-studied question of how birds navigate through a variety of environments.
Collapse
Affiliation(s)
| | - Susan D Healy
- School of Biology, University of St Andrews, Fife, UK
| |
Collapse
|
26
|
Lobecke A, Kern R, Egelhaaf M. Taking a goal-centred dynamic snapshot as a possibility for local homing in initially naïve bumblebees. ACTA ACUST UNITED AC 2018; 221:jeb.168674. [PMID: 29150448 DOI: 10.1242/jeb.168674] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2017] [Accepted: 11/13/2017] [Indexed: 11/20/2022]
Abstract
It is essential for central place foragers, such as bumblebees, to return reliably to their nest. Bumblebees leaving their inconspicuous nest hole for the first time need to gather and learn sufficient information about their surroundings to allow them to return to their nest at the end of their trip, instead of just flying away to forage. We therefore assume an intrinsic learning programme that manifests itself in the flight structure immediately after leaving the nest for the first time. In this study, we recorded and analysed the first outbound flight of individually marked naïve bumblebees in an indoor environment. We found characteristic loop-like features in the flight pattern that appear to be necessary for the bees to acquire environmental information and might be relevant for finding the nest hole after a foraging trip. Despite common features in their spatio-temporal organisation, first departure flights from the nest are characterised by a high level of variability in their loop-like flight structure across animals. Changes in the turn direction of body orientation, for example, are distributed evenly across the entire area used for the flights, without any systematic relationship to the nest location. Considering the common flight motifs and this variability, we hypothesise that a kind of dynamic snapshot is taken during the early phase of departure flights, centred at the nest location. The quality of this snapshot is hypothesised to be 'tested' during the later phases of the departure flights concerning its usefulness for local homing.
Collapse
Affiliation(s)
- Anne Lobecke
- Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| | - Roland Kern
- Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| | - Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
| |
Collapse
|
27
|
Jayatilaka P, Murray T, Narendra A, Zeil J. The choreography of learning walks in the Australian jack jumper ant Myrmecia croslandi. J Exp Biol 2018; 221:jeb.185306. [DOI: 10.1242/jeb.185306] [Citation(s) in RCA: 31] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2018] [Accepted: 08/12/2018] [Indexed: 11/20/2022]
Abstract
We provide a detailed analysis of the learning walks performed by Myrmecia croslandi ants at the nest, during which they acquire visual information on its location. Most learning walks of 12 individually marked naïve ants took place in the morning, with a narrow time window separating the first two learning walks, which most often occurred on the same day. Naïve ants performed between 2 and 7 walks over up to 4 consecutive days before heading out to forage. On subsequent walks, naïve ants tended to explore the area around the nest in new compass directions. During learning walks, ants move along arcs around the nest while performing oscillating scanning movements. In a regular temporal sequence, the ants’ gaze oscillates between the nest direction and the direction pointing away from the nest. Ants thus experience a sequence of views roughly across the nest and away from the nest from systematically spaced vantage points around the nest. We further show that ants leaving the nest for a foraging trip often walk in an arc around the nest on the opposite side to the intended foraging direction, performing a scanning routine indistinguishable from that of a learning walk. These partial learning walks are triggered by disturbance around the nest and may help returning ants reorient when they overshoot the nest, which they frequently do. We discuss what is known about learning walks in different ant species and their adaptive significance for acquiring robust navigational memories.
Collapse
Affiliation(s)
- Piyankarie Jayatilaka
- Research School of Biology, The Australian National University 46 Sullivans Creek Road, Canberra ACT2601, Australia
| | - Trevor Murray
- Research School of Biology, The Australian National University 46 Sullivans Creek Road, Canberra ACT2601, Australia
| | - Ajay Narendra
- Research School of Biology, The Australian National University 46 Sullivans Creek Road, Canberra ACT2601, Australia
- Present address: Department of Biological Sciences, Macquarie University, 205 Culloden Road, Sydney, NSW 2109, Australia
| | - Jochen Zeil
- Research School of Biology, The Australian National University 46 Sullivans Creek Road, Canberra ACT2601, Australia
| |
Collapse
|
28
|
Palavalli-Nettimi R, Narendra A. Miniaturisation decreases visual navigational competence in ants. J Exp Biol 2018; 221:jeb.177238. [DOI: 10.1242/jeb.177238] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2018] [Accepted: 02/15/2018] [Indexed: 12/25/2022]
Abstract
The evolution of smaller body size in a given lineage, called miniaturisation, is commonly observed in many animals, including ants. It affects various morphological features and is hypothesised to result in inferior behavioural capabilities, possibly owing to smaller sensory organs. To test this hypothesis, we studied whether the reduced spatial resolution of compound eyes influences obstacle detection or obstacle avoidance in five different species of ants. We trained all ant species to travel to a sugar feeder. During their return journeys, we placed an obstacle close to the nest entrance. We found that ants with higher spatial resolution exited the corridor, the area spanned between the two ends of the obstacle, on average 10 cm earlier, suggesting that they detected the obstacle earlier in their path. Ants with the lowest spatial resolution changed their viewing directions only when they were close to the obstacle. We discuss the effects of miniaturisation on visual navigational competence in ants.
Collapse
Affiliation(s)
| | - Ajay Narendra
- Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia
| |
Collapse
|
29
|
Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLoS Comput Biol 2017; 13:e1005919. [PMID: 29281631 PMCID: PMC5760083 DOI: 10.1371/journal.pcbi.1005919] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2017] [Revised: 01/09/2018] [Accepted: 11/13/2017] [Indexed: 11/18/2022] Open
Abstract
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropil layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed with a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion holds even under the dynamic flight conditions of insects.
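The local motion detectors in such arrays are correlation-type elementary motion detectors (EMDs). Below is a minimal, non-adaptive Hassenstein-Reichardt correlator on two photoreceptor time series; the adaptive component that is the subject of this paper is deliberately omitted, so this is only a sketch of the detector the model builds on.

```python
def hassenstein_reichardt(left, right, delay=1):
    """Correlation-type EMD on two photoreceptor signals sampled over time.

    Each half-detector multiplies a delayed copy of one input with the
    undelayed other input; subtracting the two mirror-symmetric halves
    yields a direction-selective output (positive = left-to-right motion).
    """
    out = []
    for t in range(delay, len(left)):
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out

# A brightness edge passing left photoreceptor first, then the right one,
# produces a net positive response; the reverse sequence a negative one.
forward = hassenstein_reichardt([0, 1, 0, 0], [0, 0, 1, 0])
backward = hassenstein_reichardt([0, 0, 1, 0], [0, 1, 0, 0])
```

In the adaptive model described above, the effective gain of each such detector would additionally decrease under sustained constant-velocity stimulation, which is what attenuates the background response relative to nearby objects.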
Collapse
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| | - Jens P. Lindemann
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| | - Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
30
|
Altitude control in honeybees: joint vision-based learning and guidance. Sci Rep 2017; 7:9231. [PMID: 28835634 PMCID: PMC5569062 DOI: 10.1038/s41598-017-09112-5] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2016] [Accepted: 07/24/2017] [Indexed: 11/15/2022] Open
Abstract
Studies on insects’ visual guidance systems have shed little light on how learning contributes to insects’ altitude control system. In this study, honeybees were trained to fly along a double-roofed tunnel after entering it near either the ceiling or the floor of the tunnel. The honeybees trained to hug the ceiling therefore encountered a sudden change in the tunnel configuration midway: a "dorsal ditch". The trained honeybees thus met a sudden increase in the distance to the ceiling, corresponding to a sudden, strong change in the visual cues available in their dorsal field of view. Honeybees reacted by rising quickly and hugging the new, higher ceiling, keeping a forward speed, distance to the ceiling and dorsal optic flow similar to those observed during the training step; whereas bees trained to follow the floor kept on following the floor regardless of the change in the ceiling height. When trained honeybees entered the tunnel via the other entry (the lower or upper entry) to that used during the training step, they quickly changed their altitude and hugged the surface they had previously learned to follow. These findings clearly show that trained honeybees control their altitude based on visual cues memorized during training. The memorized visual cues generated by the surfaces followed form a complex optic flow pattern: trained honeybees may attempt to match the visual cues they perceive with this memorized optic flow pattern by controlling their altitude.
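One common reading of such optic-flow-based altitude control is a set point on translational flow: the angular velocity of texture on a followed surface (floor or ceiling) is forward speed divided by distance to that surface, so holding the flow at a memorized value ties altitude to speed. The sketch below is a deliberate simplification of that idea (the paper's memorized pattern is richer than a single set point):

```python
def translational_flow(forward_speed, distance_to_surface):
    # Angular velocity (rad/s) of texture on a followed surface, e.g. the
    # ceiling or floor, during level translational flight.
    return forward_speed / distance_to_surface

def distance_for_flow(forward_speed, target_flow):
    # Surface distance that restores a memorized flow set point at the
    # current forward speed.
    return forward_speed / target_flow

# Bee flying 2 m/s at 0.5 m from the ceiling experiences 4 rad/s of flow;
# after the ceiling jumps away, restoring 4 rad/s means closing back to 0.5 m.
flow = translational_flow(2.0, 0.5)
restored = distance_for_flow(2.0, flow)
```

Under this reading, the quick rise to the new, higher ceiling is just the closed loop driving the dorsal flow back to its memorized value, which is consistent with the similar speed, distance, and dorsal flow reported before and after the ditch.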
Collapse
|
31
|
Homing in a tropical social wasp: role of spatial familiarity, motivation and age. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2017; 203:915-927. [DOI: 10.1007/s00359-017-1202-8] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2017] [Revised: 07/16/2017] [Accepted: 07/17/2017] [Indexed: 10/19/2022]
|
32
|
Howard SR, Avarguès-Weber A, Garcia J, Dyer AG. Free-flying honeybees extrapolate relational size rules to sort successively visited artificial flowers in a realistic foraging situation. Anim Cogn 2017; 20:627-638. [DOI: 10.1007/s10071-017-1086-6] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2016] [Revised: 03/27/2017] [Accepted: 03/29/2017] [Indexed: 02/08/2023]
|
33
|
Klein S, Cabirol A, Devaud JM, Barron AB, Lihoreau M. Why Bees Are So Vulnerable to Environmental Stressors. Trends Ecol Evol 2017; 32:268-278. [PMID: 28111032 DOI: 10.1016/j.tree.2016.12.009] [Citation(s) in RCA: 104] [Impact Index Per Article: 14.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2016] [Revised: 12/21/2016] [Accepted: 12/22/2016] [Indexed: 12/25/2022]
Abstract
Bee populations are declining in the industrialized world, raising concerns for the sustainable pollination of crops. Pesticides, pollutants, parasites, diseases, and malnutrition have all been linked to this problem. We consider here neurobiological, ecological, and evolutionary reasons why bees are particularly vulnerable to these environmental stressors. Central-place foraging on flowers demands advanced capacities of learning, memory, and navigation. However, even at low intensity levels, many stressors damage the bee brain, disrupting key cognitive functions needed for effective foraging, with dramatic consequences for brood development and colony survival. We discuss how understanding the relationships between the actions of stressors on the nervous system, individual cognitive impairments, and colony decline can inform constructive interventions to sustain bee populations.
Collapse
Affiliation(s)
- Simon Klein
- Research Center on Animal Cognition, Center for Integrative Biology, National Center for Scientific Research(CNRS), University Paul Sabatier(UPS), Toulouse, France; Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
| | - Amélie Cabirol
- Research Center on Animal Cognition, Center for Integrative Biology, National Center for Scientific Research(CNRS), University Paul Sabatier(UPS), Toulouse, France; Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
| | - Jean-Marc Devaud
- Research Center on Animal Cognition, Center for Integrative Biology, National Center for Scientific Research(CNRS), University Paul Sabatier(UPS), Toulouse, France
| | - Andrew B Barron
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
| | - Mathieu Lihoreau
- Research Center on Animal Cognition, Center for Integrative Biology, National Center for Scientific Research(CNRS), University Paul Sabatier(UPS), Toulouse, France.
| |
Collapse
|
34
|
Object Recognition in Flight: How Do Bees Distinguish between 3D Shapes? PLoS One 2016; 11:e0147106. [PMID: 26886006 PMCID: PMC4757030 DOI: 10.1371/journal.pone.0147106] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2015] [Accepted: 12/29/2015] [Indexed: 11/19/2022] Open
Abstract
Honeybees (Apis mellifera) discriminate multiple object features such as colour, pattern and 2D shape, but it remains unknown whether and how bees recover three-dimensional shape. Here we show that bees can recognize objects by their three-dimensional form, whereby they employ an active strategy to uncover the depth profiles. We trained individual, free flying honeybees to collect sugar water from small three-dimensional objects made of styrofoam (sphere, cylinder, cuboids) or folded paper (convex, concave, planar) and found that bees can easily discriminate between these stimuli. We also tested possible strategies employed by the bees to uncover the depth profiles. For the card stimuli, we excluded overall shape and pictorial features (shading, texture gradients) as cues for discrimination. Lacking sufficient stereo vision, bees are known to use speed gradients in optic flow to detect edges; could the bees apply this strategy also to recover the fine details of a surface depth profile? Analysing the bees' flight tracks in front of the stimuli revealed specific combinations of flight maneuvers (lateral translations in combination with yaw rotations), which are particularly suitable to extract depth cues from motion parallax. We modelled the generated optic flow and found characteristic patterns of angular displacement corresponding to the depth profiles of our stimuli: optic flow patterns from pure translations successfully recovered depth relations from the magnitude of angular displacements, while additional rotation provided robust depth information based on the direction of the displacements; thus, the bees' flight maneuvers may reflect an optimized visuo-motor strategy to extract depth structure from motion signals. The robustness and simplicity of this strategy offers an efficient solution for 3D object recognition without stereo vision, and could be employed by other flying insects, or mobile robots.
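The parallax geometry invoked above can be made concrete with a few lines of trigonometry. The sketch below is an illustration of the general principle, not the authors' flow model: an observer translates sideways and optionally yaws, and the bearing change of a point falls off with the point's distance, so displacement magnitude encodes depth under pure translation, whereas a pure rotation shifts every bearing by the same amount regardless of depth.

```python
import math

def bearing_change(x, y, lateral_step=0.0, yaw=0.0):
    """Change in a point's bearing (radians) after the observer translates
    sideways by lateral_step and rotates its gaze by yaw. Only the
    translational part of the change depends on the point's distance."""
    before = math.atan2(y, x)
    after = math.atan2(y, x - lateral_step) - yaw
    return after - before

near = bearing_change(0.1, 0.5, lateral_step=0.01)   # surface point 0.5 m away
far = bearing_change(0.1, 2.0, lateral_step=0.01)    # surface point 2 m away
# Pure translation: the nearer point sweeps through a larger angle,
# so the magnitude of angular displacement recovers relative depth.
```

Combining such lateral translations with yaw rotations, as the tracked bees did, additionally changes the sign of displacement across depth, which is the direction-based cue the modelling identified.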
Collapse
|
35
|
Collett TS, Philippides A, Hempel de Ibarra N. Insect Navigation: How Do Wasps Get Home? Curr Biol 2016; 26:R166-8. [DOI: 10.1016/j.cub.2016.01.003] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
|
36
|
How Wasps Acquire and Use Views for Homing. Curr Biol 2016; 26:470-82. [DOI: 10.1016/j.cub.2015.12.052] [Citation(s) in RCA: 69] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2015] [Revised: 11/20/2015] [Accepted: 12/18/2015] [Indexed: 11/21/2022]
|
37
|
Minoura M, Sonoda K, Sakiyama T, Gunji YP. Rotating panoramic view: interaction between visual and olfactory cues in ants. ROYAL SOCIETY OPEN SCIENCE 2016; 3:150426. [PMID: 26909169 PMCID: PMC4736924 DOI: 10.1098/rsos.150426] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/22/2015] [Accepted: 12/24/2015] [Indexed: 06/05/2023]
Abstract
Insects use a navigational toolkit consisting of multiple strategies, such as path integration, view-dependent recognition and olfactory cues. This raises the question of how directional cues afforded by a visual panorama combine with olfactory cues from a pheromone trail to guide ants towards their nest. We positioned a garden ant (Lasius niger) on a rotating table, on which a segment of a pheromone trail was rotated relative to the stationary panorama while the ant walked along the trail towards its nest. The rotational speed of the table (3 r.p.m.) was set so that the table would rotate through about 90° by the time an ant had walked from the start to the centre of the table. The ant completed a U-turn at about this point and so travelled in a nest-ward direction without leaving the trail. These results suggest that the ants persist on the pheromone trail and use visual input to determine their direction of travel along it.
Collapse
Affiliation(s)
- Mai Minoura
- School of Fundamental Science and Engineering, Waseda University, Tokyo, Japan
| | - Kohei Sonoda
- Research Organization of Science and Technology, Ritsumeikan University, Shiga, Japan
| | - Tomoko Sakiyama
- School of Fundamental Science and Engineering, Waseda University, Tokyo, Japan
| | - Yukio-Pegio Gunji
- School of Fundamental Science and Engineering, Waseda University, Tokyo, Japan
| |
Collapse
|
38
|
Boeddeker N, Mertes M, Dittmar L, Egelhaaf M. Bumblebee Homing: The Fine Structure of Head Turning Movements. PLoS One 2015; 10:e0135020. [PMID: 26352836 PMCID: PMC4564262 DOI: 10.1371/journal.pone.0135020] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2014] [Accepted: 07/17/2015] [Indexed: 11/18/2022] Open
Abstract
Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body. Head orientation is stabilized for most of the time by counter rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments that are interspersed with rapid turns. These short and fast yaw turns ("saccades") are usually accompanied by even faster head yaw turns that change gaze direction. Since a large part of image rotation is thereby reduced to brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. The detailed analysis of the fine structure of the bees' head turning movements shows that the time course of single head saccades is very stereotypical. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements, which in its main characteristics resembles the so-called "saccadic main sequence" in humans. The fact that bumblebee head saccades are highly stereotyped, as in humans, may hint at a common principle, where fast and precise motor control is used to reliably reduce the time during which the retinal image moves.
Collapse
Affiliation(s)
- Norbert Boeddeker
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Department of Cognitive Neurosciences & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Marcel Mertes
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Laura Dittmar
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Martin Egelhaaf
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
39
|
Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2014. [DOI: 10.4161/cib.13763] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
|
40
|
Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 2014; 8:127. [PMID: 25389392 PMCID: PMC4211400 DOI: 10.3389/fncir.2014.00127] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2014] [Accepted: 10/05/2014] [Indexed: 11/13/2022] Open
Abstract
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate by highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information that is contained in the retinal motion patterns induced on the eyes while moving around ("optic flow") to accomplish their extraordinary performance. Thereby, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases where the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not represent the velocity of the optic flow vectors veridically, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, we conclude from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
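The correlation-type movement detector discussed here is classically modelled as a Hassenstein-Reichardt detector. The sketch below is a deliberately simplified version (a one-sample delay stands in for the usual low-pass filter): the delayed signal from one photoreceptor is multiplied with the undelayed signal of its neighbour, and the mirror-symmetric product is subtracted. Its output scales with pattern contrast as well as velocity, which is exactly the non-veridical property the review reinterprets as useful.

```python
def reichardt_response(a, b, delay=1):
    """Correlation-type (Hassenstein-Reichardt) motion detector output,
    summed over time. a and b are luminance samples at two neighbouring
    photoreceptors; the response is direction-selective and depends
    quadratically on pattern contrast, not only on velocity."""
    return sum(a[t - delay] * b[t] - b[t - delay] * a[t]
               for t in range(delay, len(a)))

edge = [0, 0, 1, 1, 1, 0, 0]        # a bright edge passing receptor A ...
edge_later = [0, 0, 0, 1, 1, 1, 0]  # ... reaches neighbour B one step later
rightward = reichardt_response(edge, edge_later)  # preferred direction: positive
leftward = reichardt_response(edge_later, edge)   # null direction: negative
```

Because the products of input signals enter the response, doubling the edge contrast quadruples the output: textural (contrast-border) information is mixed into the "velocity" signal, which is how the detector ends up highlighting the contours of nearby objects.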
Collapse
Affiliation(s)
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
| | - Roland Kern
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
| | - Jens Peter Lindemann
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
41
|
Najera DA, McCullough EL, Jander R. Honeybees Use Celestial and/or Terrestrial Compass Cues for Inter-Patch Navigation. Ethology 2014. [DOI: 10.1111/eth.12319] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
42
|
Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 2014; 8:335. [PMID: 25309374 PMCID: PMC4173878 DOI: 10.3389/fnbeh.2014.00335] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2014] [Accepted: 09/07/2014] [Indexed: 11/13/2022] Open
Abstract
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help to acquire these memories at newly discovered foraging locations, where landmarks (salient objects in the vicinity of the goal location) can play an important role in guiding the animal's homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. It was recently shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
Collapse
Affiliation(s)
- Marcel Mertes
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
| | - Laura Dittmar
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
| | - Martin Egelhaaf
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
| | - Norbert Boeddeker
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
43
|
Hempel de Ibarra N, Vorobyev M, Menzel R. Mechanisms, functions and ecology of colour vision in the honeybee. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2014; 200:411-33. [PMID: 24828676 PMCID: PMC4035557 DOI: 10.1007/s00359-014-0915-1] [Citation(s) in RCA: 101] [Impact Index Per Article: 10.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2014] [Revised: 04/15/2014] [Accepted: 04/17/2014] [Indexed: 11/06/2022]
Abstract
Research in the honeybee has laid the foundations for our understanding of insect colour vision. The trichromatic colour vision of honeybees shares fundamental properties with primate and human colour perception, such as colour constancy, colour opponency, and segregation of colour and brightness coding. Laborious efforts to reconstruct the colour vision pathway in the honeybee have provided detailed descriptions of neural connectivity and the properties of photoreceptors and interneurons in the optic lobes of the bee brain. The modelling of colour perception advanced with the establishment of colour discrimination models that were based on experimental data, the Colour-Opponent Coding and Receptor Noise-Limited models, which are important tools for the quantitative assessment of bee colour vision and colour-guided behaviours. Major insights into the visual ecology of bees have been gained by combining behavioural experiments and quantitative modelling, and asking how bee vision has influenced the evolution of flower colours and patterns. Recently, research has focused on the discrimination and categorisation of coloured patterns, colourful scenes and various other groupings of coloured stimuli, highlighting the bees' behavioural flexibility. The identification of perceptual mechanisms remains of fundamental importance for the interpretation of their learning strategies and performance in diverse experimental tasks.
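The Receptor Noise-Limited model mentioned here (Vorobyev and Osorio's formulation for a trichromat) computes a chromatic distance from the per-channel signal differences between two stimuli, weighted by receptor noise. Below is a sketch of that standard formula; the noise values used in the usage note are illustrative placeholders, not honeybee measurements.

```python
import math

def rnl_distance(df, e):
    """Chromatic distance (in just-noticeable-difference units) between two
    stimuli for a trichromat under the Receptor Noise-Limited model.
    df[i] is the difference in receptor channel i's signal (e.g. log
    quantum catch), e[i] is that channel's noise standard deviation."""
    df1, df2, df3 = df
    e1, e2, e3 = e
    num = ((e1 * (df3 - df2)) ** 2
           + (e2 * (df3 - df1)) ** 2
           + (e3 * (df1 - df2)) ** 2)
    den = (e1 * e2) ** 2 + (e1 * e3) ** 2 + (e2 * e3) ** 2
    return math.sqrt(num / den)
```

A useful sanity check on the model's structure: a uniform shift applied to all three channels (a pure brightness change) yields a distance of zero, reflecting the segregation of colour from brightness coding noted in the abstract.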
Collapse
Affiliation(s)
- N Hempel de Ibarra
- Department of Psychology, Centre for Research in Animal Behaviour, University of Exeter, Exeter, UK
| | | | | |
Collapse
|
44
|
Nourani-Vatani N, Borges PVK, Roberts JM, Srinivasan MV. On the Use of Optical Flow for Scene Change Detection and Description. J INTELL ROBOT SYST 2014. [DOI: 10.1007/s10846-013-9840-8] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
45
|
|
46
|
Abstract
Primates can analyse visual scenes extremely rapidly, making accurate decisions for presentation times of only 20 ms. We asked if bumblebees, despite having potentially more limited processing power, could similarly detect and discriminate visual patterns presented for durations of 100 ms or less. Bumblebees detected stimuli and discriminated between differently oriented and coloured stimuli even when presented as briefly as 20 ms but failed to identify ecologically relevant shapes (predatory spiders on flowers) even when presented for 100 ms. This suggests a profound difference between primate and insect visual processing, so that while primates can capture entire visual scenes 'at a glance', insects might have to rely on continuous online sampling of the world around them, using a process of active vision which requires longer integration times.
Collapse
|
47
|
Philippides A, de Ibarra NH, Riabinina O, Collett TS. Bumblebee calligraphy: the design and control of flight motifs in the learning and return flights of Bombus terrestris. J Exp Biol 2013; 216:1093-104. [DOI: 10.1242/jeb.081455] [Citation(s) in RCA: 53] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
SUMMARY
Many wasps and bees learn the position of their nest relative to nearby visual features during elaborate ‘learning’ flights that they perform on leaving the nest. Return flights to the nest are thought to be patterned so that insects can reach their nest by matching their current view to views of their surroundings stored during learning flights. To understand how ground-nesting bumblebees might implement such a matching process, we have video-recorded the bees' learning and return flights and analysed the similarities and differences between the principal motifs of their flights. Loops that take bees away from and bring them back towards the nest are common during learning flights and less so in return flights. Zigzags are more prominent on return flights. Both motifs tend to be nest based. Bees often both fly towards and face the nest in the middle of loops and at the turns of zigzags. Before and after the moments when flight direction and body orientation are aligned, the two diverge from each other so that the nest is held within the bees' fronto-lateral visual field while flight direction relative to the nest can fluctuate more widely. These and other parallels between loops and zigzags suggest that they are stable variations of an underlying pattern, which enable bees to store and reacquire similar nest-focused views during learning and return flights.
Collapse
Affiliation(s)
| | | | - Olena Riabinina
- Department of Informatics, University of Sussex, Brighton BN1 9QG, UK
| | | |
Collapse
|
48
|
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913 PMCID: PMC3526811 DOI: 10.3389/fncir.2012.00108] [Citation(s) in RCA: 71] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2012] [Accepted: 12/03/2012] [Indexed: 11/30/2022] Open
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown, through characteristic behavioral actions, to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
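Why segregating rotation from translation helps can be read directly off the standard optic-flow equation for a point at bearing theta and distance d: the rotational term is independent of distance, so only (near-)pure translational phases carry depth. A minimal illustration of this geometry (a textbook relation, not a model from the review):

```python
import math

def optic_flow(bearing, distance, v=1.0, yaw_rate=0.0):
    """Angular velocity (rad/s) of a point at the given bearing (radians from
    the heading) and distance, for an observer translating forward at v and
    rotating at yaw_rate. The rotational term carries no depth information;
    the translational term scales with 1/distance."""
    return yaw_rate + v * math.sin(bearing) / distance
```

During a saccade the yaw term dominates and swamps the depth-dependent part; confining rotations to brief saccades leaves long intersaccadic intervals in which the measured flow is almost purely translational and hence informative about the spatial layout.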
Collapse
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
| | | | | | | | | |
Collapse
|
49
|
Kern R, Boeddeker N, Dittmar L, Egelhaaf M. Blowfly flight characteristics are shaped by environmental features and controlled by optic flow information. J Exp Biol 2012; 215:2501-14. [PMID: 22723490 DOI: 10.1242/jeb.061713] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
Blowfly flight consists of two main components, saccadic turns and intervals of mostly straight gaze direction, although, as a consequence of inertia, flight trajectories usually change direction smoothly. We investigated how flight behavior changes depending on the surroundings and how saccadic turns and intersaccadic translational movements might be controlled in arenas of different width with and without obstacles. Blowflies do not fly in straight trajectories, even when traversing straight flight arenas; rather, they fly in meandering trajectories. Flight speed and the amplitude of meanders increase with arena width. Although saccade duration is largely constant, peak angular velocity and the direction of successive turns are variable and depend on the visual surroundings. Saccade rate and amplitude also vary with arena layout and are correlated with the 'time-to-contact' to the arena wall. We provide evidence that both saccade and velocity control rely to a large extent on the intersaccadic optic flow generated in eye regions looking well in front of the fly, rather than in the lateral visual field, where the optic flow, at least during forward flight, tends to be strongest.
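'Time-to-contact' here is the standard quantity tau = distance / approach speed, which a fly can estimate purely visually from relative retinal expansion, without knowing distance or speed separately. The sketch below illustrates that relation and a saccade trigger keyed to it; the threshold value is an arbitrary illustration, not a parameter from the study.

```python
def time_to_contact(distance, approach_speed):
    """Seconds until the wall is reached at the current closing speed."""
    return distance / approach_speed

def tau_from_expansion(theta, theta_dot):
    """The same tau, estimated visually: for a small object of angular size
    theta expanding at rate theta_dot, tau is approximately theta / theta_dot."""
    return theta / theta_dot

def should_saccade(distance, approach_speed, tau_threshold=0.3):
    """Trigger a saccadic turn when time-to-contact drops below threshold
    (hypothetical rule; the correlation reported in the study does not
    imply this exact mechanism)."""
    return time_to_contact(distance, approach_speed) < tau_threshold
```

The visual estimate works because, for a small frontal object of physical size s, theta is roughly s/d and theta_dot is roughly s*v/d^2, so their ratio collapses to d/v with both unknowns cancelling out.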
Collapse
Affiliation(s)
- Roland Kern
- Department of Neurobiology and Center of Excellence, Cognitive Interaction Technology, Bielefeld University, D-33501 Bielefeld, Germany.
| | | | | | | |
Collapse
|
50
|
Palikij J, Ebert E, Preston M, McBride A, Jander R. Evidence for the honeybee's place knowledge in the vicinity of the hive. JOURNAL OF INSECT PHYSIOLOGY 2012; 58:1289-1298. [PMID: 22796223 DOI: 10.1016/j.jinsphys.2012.07.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/07/2011] [Revised: 06/28/2012] [Accepted: 07/02/2012] [Indexed: 06/01/2023]
Abstract
Upon leaving the nest for the first time, honeybees employ a tripartite orientation/exploration system to gain the requisite knowledge to return to their hive after foraging. Focal exploration comes first: the departing bee turns around to face the return target and oscillates in a lateral flight pattern of increasing amplitude and distance. Thereafter, for the peripheral exploration, the forward-flying bee circles the return-goal area with expanding and alternating clockwise and counterclockwise arcs. After this two-part proximal exploration follows distal exploration: the bee flies straight towards her potential distal goal. For the return path, supported by the preceding exploratory learning, the navigational performance is expected to reflect the three exploratory parts in reverse order. Previously, only two performance parts had been experimentally identified: focal navigation and distal navigation. Here we discovered peripheral navigation as being distinct from focal and distal navigation. Like focal navigation, yet unlike distal navigation, peripheral navigation is invariably triggered by local place recognition. Whereas focal navigation (orientation) is close to unidirectional, peripheral navigation makes use of multiple goal-vector knowledge. We term the area in question the Peripheral Correction Area because within it peripheral navigation is triggered, which in turn is capable of correcting errors accumulated during a preceding distal dead-reckoning-based flight.
Collapse
Affiliation(s)
- Jason Palikij
- University of Kansas, Department of Ecology and Evolutionary Biology, 2041 Haworth Hall, 1200 Sunnyside Avenue, Lawrence, KS 66045-7534, USA
| | | | | | | | | |
Collapse
|