1
Beetz MJ, El Jundi B. The influence of stimulus history on directional coding in the monarch butterfly brain. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 37095358] [DOI: 10.1007/s00359-023-01633-x]
Abstract
The central complex is a region of the insect brain that houses a neural network specialized to encode directional information. Directional coding has traditionally been investigated with compass cues that revolve in full rotations and at constant angular velocities around the insect's head. However, these stimulus conditions do not fully simulate an insect's sensory perception of compass cues during navigation. In nature, insect flight is characterized by abrupt changes in moving direction as well as constant changes in velocity. The influence of such varying cue dynamics on compass coding remains unclear. We performed long-term tetrode recordings from the brain of monarch butterflies to study how central complex neurons respond to different stimulus velocities and directions. As these butterflies derive directional information from the sun during migration, we measured the neural response to a virtual sun. The virtual sun was either presented as a spot that appeared at random angular positions or was rotated around the butterfly at different angular velocities and directions. By specifically manipulating the stimulus velocity and trajectory, we dissociated the influence of angular velocity and direction on compass coding. While the angular velocity substantially affected the tuning directedness, the stimulus trajectory influenced the shape of the angular tuning curve. Taken together, our results suggest that the central complex flexibly adjusts its directional coding to the current stimulus dynamics, ensuring a precise compass even under highly demanding conditions such as during rapid flight maneuvers.
Affiliation(s)
- M Jerome Beetz
- Zoology II, Biocenter, University of Würzburg, Würzburg, Germany.
- Basil El Jundi
- Zoology II, Biocenter, University of Würzburg, Würzburg, Germany
- Animal Physiology, Department of Biology, Norwegian University of Science and Technology, Trondheim, Norway
2
Yadipour M, Billah MA, Faruque IA. Optic flow enrichment via Drosophila head and retina motions to support inflight position regulation. J Theor Biol 2023; 562:111416. [PMID: 36681182] [DOI: 10.1016/j.jtbi.2023.111416]
Abstract
Developing a functional description of the neural control circuits and visual feedback paths underlying insect flight behaviors is an active research area. Feedback controllers incorporating engineering models of the insect visual system outputs have described some flight behaviors, yet they do not explain how insects are able to stabilize their body position relative to nearby targets such as neighbors or forage sources, especially in challenging environments in which optic flow is poor. The insect experimental community is simultaneously recording a growing library of in-flight head and eye motions that may be linked to increased perception. This study develops a quantitative model of the optic flow experienced by a flying insect or robot during head yawing rotations (distinct from lateral peering motions in previous work) with a single other target in view. This study then applies a model of insect visuomotor feedback to show, via analysis and simulation of five species, that these head motions sufficiently enrich the optic flow and that the output feedback can regulate position relative to the single target (asymptotic stability). In the simplifying case of pure rotation relative to the body, theoretical analysis provides a stronger stability guarantee. The results are shown to be robust to anatomical neck angle limits and body vibrations, persist with more detailed Drosophila lateral-directional flight dynamics simulations, and generalize to recent retinal motion studies. Together, these results suggest that the optic flow enrichment provided by head or pseudopupil rotation could be used in an insect's neural processing circuit to enable position regulation.
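The geometric point behind this enrichment can be sketched in a minimal planar form: the retinal motion of a single target combines a distance-independent rotational term (the yaw rate) with a distance-dependent translational term, so a known head rotation adds structure that can be factored back out. The function and numbers below are illustrative assumptions, not the paper's model:

```python
import math

def retinal_flow(v, gamma, d, omega):
    """Angular velocity (rad/s) of a single target on the retina:
    a distance-dependent translational term v*sin(gamma)/d (target at
    bearing gamma rad, distance d m, forward speed v m/s) plus a
    distance-independent rotational term -omega (yaw rate, rad/s).
    Planar sketch; all values are illustrative."""
    return v * math.sin(gamma) / d - omega

# Subtracting the known yaw rate (e.g. from an efference copy of the
# head motor command) isolates the distance-dependent cue:
flow = retinal_flow(v=0.5, gamma=math.pi / 3, d=2.0, omega=1.5)
translational = flow + 1.5
assert math.isclose(translational, 0.5 * math.sin(math.pi / 3) / 2.0)
```

The design point is that the rotational term carries no depth information at all, which is why adding a known rotation can only help if it is later removed.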
Affiliation(s)
- Mehdi Yadipour
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK, 74078, USA.
- Md Arif Billah
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK, 74078, USA.
- Imraan A Faruque
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK, 74078, USA.
3
Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 36609568] [DOI: 10.1007/s00359-022-01610-w]
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings up to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent distances unambiguously; rather, they are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
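The speed-scaling ambiguity described above can be made concrete. During pure translation at speed v, a point at distance d and angle θ from the direction of travel moves across the retina at v·sin(θ)/d, so flow alone constrains only the ratio d/v (nearness in units of travel speed). A minimal sketch with hypothetical numbers:

```python
import math

def translational_flow(v, d, theta):
    """Angular speed (rad/s) of a point at distance d (m), at angle
    theta (rad) from the direction of travel, during pure translation
    at speed v (m/s)."""
    return v * math.sin(theta) / d

# Two scenes that are indistinguishable from flow alone:
# flying at 1 m/s past an object 2 m away ...
flow_a = translational_flow(v=1.0, d=2.0, theta=math.pi / 2)
# ... or at 2 m/s past an object 4 m away.
flow_b = translational_flow(v=2.0, d=4.0, theta=math.pi / 2)
assert math.isclose(flow_a, flow_b)  # both 0.5 rad/s: distance is scaled by speed
```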
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615, Bielefeld, Germany.
4
Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. [PMID: 36515743] [DOI: 10.1007/s00359-022-01599-2]
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorized (positional image matching). Central place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest that are likely to be designed to systematically collect scene memories tagged with information provided by path integration on the direction of and the distance to the nest. Similarly, when traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
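Alignment matching, as described above, can be sketched as finding the rotation that minimizes the difference between a memorized panoramic view and the currently experienced one. The toy below uses 1-D random "views" purely for illustration; it is not a claim about how any insect actually represents scenes:

```python
import numpy as np

def heading_by_alignment(memorized, current):
    """Alignment matching: rotate the current 1-D panoramic view and
    return the shift (in azimuthal bins) that minimizes the RMS image
    difference to the memorized view. Real views are 2-D and noisy;
    this is a minimal sketch."""
    n = len(memorized)
    diffs = [np.sqrt(np.mean((np.roll(current, s) - memorized) ** 2))
             for s in range(n)]
    return int(np.argmin(diffs)), diffs

rng = np.random.default_rng(0)
snapshot = rng.random(72)            # 5-degree azimuthal bins
rotated = np.roll(snapshot, -10)     # the animal has turned by 10 bins
shift, _ = heading_by_alignment(snapshot, rotated)
assert shift == 10                   # recovered heading offset
```

The same difference measure, evaluated across nearby positions rather than rotations, is what the gradient descent of positional image matching operates on.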
5
Moura PA, Corso G, Montgomery SH, Cardoso MZ. True site fidelity in pollen-feeding butterflies. Funct Ecol 2021. [DOI: 10.1111/1365-2435.13976]
Affiliation(s)
- Priscila A. Moura
- Departamento de Ecologia Universidade Federal do Rio Grande do Norte Natal Brazil
- Gilberto Corso
- Departamento de Biofísica e Farmacologia Universidade Federal do Rio Grande do Norte Natal Brazil
- Marcio Z. Cardoso
- Departamento de Ecologia Universidade Federal do Rio Grande do Norte Natal Brazil
- Departamento de Ecologia Instituto de Biologia Universidade Federal do Rio de Janeiro Rio de Janeiro Brazil
6
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. [PMID: 33542681] [PMCID: PMC7852487] [DOI: 10.3389/fnbeh.2020.606590]
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including its spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, where rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, i.e., in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time, the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
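A pivot point of the kind analysed above can be estimated as the point closest, in a least-squares sense, to all gaze lines over a short time window. The sketch below is a generic planar geometric construction, not the authors' published pipeline:

```python
import numpy as np

def pivot_point(positions, gaze_angles):
    """Least-squares intersection of 2-D gaze lines: the point the head
    appears to pivot around. positions: (n, 2) head positions;
    gaze_angles: (n,) gaze directions in radians."""
    d = np.stack([np.cos(gaze_angles), np.sin(gaze_angles)], axis=1)
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, di in zip(positions, d):
        P = np.eye(2) - np.outer(di, di)  # projector perpendicular to gaze
        A += P
        b += P @ p
    return np.linalg.solve(A, b)  # needs non-parallel gaze directions

# Synthetic check: three head positions, gaze always aimed at (1.0, 2.0)
target = np.array([1.0, 2.0])
pos = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])
ang = np.arctan2(*(target - pos).T[::-1])  # arctan2(dy, dx)
assert np.allclose(pivot_point(pos, ang), target)
```

With real tracking data, a pivot point near the nest hole would indicate nest-centric sampling; one far away, some other allocentric anchor.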
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
7
Odenthal L, Doussot C, Meyer S, Bertrand OJN. Analysing Head-Thorax Choreography During Free-Flights in Bumblebees. Front Behav Neurosci 2021; 14:610029. [PMID: 33510626] [PMCID: PMC7835495] [DOI: 10.3389/fnbeh.2020.610029]
Abstract
Animals coordinate their various body parts, sometimes in elaborate manners, to swim, walk, climb, fly, and navigate their environment. The coordination of body parts is essential to behaviors such as chasing, escaping, landing, and the extraction of relevant information. For example, by shaping the movement of the head and body in an active and controlled manner, flying insects structure their flights to facilitate the acquisition of distance information. They condense their turns into a short period of time (the saccade) interspaced by a relatively long translation (the intersaccade). However, due to technological limitations, the precise coordination of the head and thorax during insects' free flight remains unclear. Here, we propose methods to analyse the orientation of the head and thorax of bumblebees Bombus terrestris, to segregate the trajectories of flying insects into saccades and intersaccades by using supervised machine learning (ML) techniques, and finally to analyse the coordination between head and thorax by using artificial neural networks (ANN). The segregation of flights into saccades and intersaccades by ML, based on the thorax angular velocities, decreased the misclassification by 12% compared to classically used methods. Our results demonstrate how machine learning techniques can be used to improve the analyses of insect flight structures and to learn about the complexity of head-body coordination. We anticipate our assay to be a starting point for more sophisticated experiments and analysis on freely flying insects. For example, the coordination of head and body movements during collision avoidance, chasing behavior, or negotiation of gaps could be investigated by monitoring the head and thorax orientation of freely flying insects within and across behavioral tasks, and in different species.
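The "classically used methods" that the ML approach is benchmarked against are typically angular-velocity thresholds. A minimal sketch of such a threshold segmenter; the threshold and minimum-duration values are illustrative, not taken from the paper:

```python
import numpy as np

def segment_saccades(yaw_rate, threshold=200.0, min_len=3):
    """Classical threshold segmentation of a flight into saccades and
    intersaccades: runs of at least min_len consecutive samples where
    |yaw rate| (deg/s) exceeds the threshold are labeled saccadic."""
    fast = np.abs(yaw_rate) > threshold
    labels = np.zeros(len(yaw_rate), dtype=bool)
    start = None
    for i, f in enumerate(np.append(fast, False)):  # sentinel closes final run
        if f and start is None:
            start = i
        elif not f and start is not None:
            if i - start >= min_len:
                labels[start:i] = True
            start = None
    return labels  # True = saccade, False = intersaccade

yaw = np.array([10, 20, 500, 600, 550, 30, 5, 400, 15], float)
assert list(segment_saccades(yaw).astype(int)) == [0, 0, 1, 1, 1, 0, 0, 0, 0]
```

The brief 400 deg/s spike is rejected by the duration criterion; misclassifications of exactly such borderline samples are what the supervised classifier reportedly reduces.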
Affiliation(s)
- Stefan Meyer
- Department of Informatics, University of Sussex, Brighton, United Kingdom
8
Winsor AM, Pagoti GF, Daye DJ, Cheries EW, Cave KR, Jakob EM. What gaze direction can tell us about cognitive processes in invertebrates. Biochem Biophys Res Commun 2021; 564:43-54. [PMID: 33413978] [DOI: 10.1016/j.bbrc.2020.12.001]
Abstract
Most visually guided animals shift their gaze using body movements, eye movements, or both to gather information selectively from their environments. Psychological studies of eye movements have advanced our understanding of perceptual and cognitive processes that mediate visual attention in humans and other vertebrates. However, much less is known about how these processes operate in other organisms, particularly invertebrates. We here make the case that studies of invertebrate cognition can benefit by adding precise measures of gaze direction. To accomplish this, we briefly review the human visual attention literature and outline four research themes and several experimental paradigms that could be extended to invertebrates. We briefly review selected studies where the measurement of gaze direction in invertebrates has provided new insights, and we suggest future areas of exploration.
Affiliation(s)
- Alex M Winsor
- Graduate Program in Organismic and Evolutionary Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
- Guilherme F Pagoti
- Programa de Pós-Graduação em Zoologia, Instituto de Biociências, Universidade de São Paulo, Rua do Matão, 321, Travessa 14, Cidade Universitária, São Paulo, SP, 05508-090, Brazil
- Daniel J Daye
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA; Graduate Program in Biological and Environmental Sciences, University of Rhode Island, Kingston, RI, 02881, USA
- Erik W Cheries
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Kyle R Cave
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Elizabeth M Jakob
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
9
Bumblebees perceive the spatial layout of their environment in relation to their body size and form to minimize inflight collisions. Proc Natl Acad Sci U S A 2020; 117:31494-31499. [PMID: 33229535] [DOI: 10.1073/pnas.2016872117]
Abstract
Animals that move through complex habitats must frequently contend with obstacles in their path. Humans and other highly cognitive vertebrates avoid collisions by perceiving the relationship between the layout of their surroundings and the properties of their own body profile and action capacity. It is unknown whether insects, which have much smaller brains, possess such abilities. We used bumblebees, which vary widely in body size and regularly forage in dense vegetation, to investigate whether flying insects consider their own size when interacting with their surroundings. Bumblebees trained to fly in a tunnel were sporadically presented with an obstructing wall containing a gap that varied in width. Bees successfully flew through narrow gaps, even those that were much smaller than their wingspans, by first performing lateral scanning (side-to-side flights) to visually assess the aperture. Bees then reoriented their in-flight posture (i.e., yaw or heading angle) while passing through, minimizing their projected frontal width and mitigating collisions; in extreme cases, bees flew entirely sideways through the gap. Both the time that bees spent scanning during their approach and the extent to which they reoriented themselves to pass through the gap were determined not by the absolute size of the gap, but by the size of the gap relative to each bee's own wingspan. Our findings suggest that, similar to humans and other vertebrates, flying bumblebees perceive the affordance of their surroundings relative to their body size and form to navigate safely through complex environments.
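The reorientation strategy reported above has a simple geometric core: yawing reduces the flyer's projected frontal extent. The caricature below models the bee as a cross of wingspan and body length and searches for the smallest yaw that fits a gap; all numbers are hypothetical:

```python
import math

def projected_width(wingspan, body_length, yaw):
    """Frontal extent of a flyer caricatured as two line segments:
    wingspan across the body, body_length along it, yawed by `yaw` rad."""
    return wingspan * abs(math.cos(yaw)) + body_length * abs(math.sin(yaw))

def min_yaw_to_pass(wingspan, body_length, gap, steps=900):
    """Smallest reorientation (scanning toward fully sideways at pi/2)
    whose projected width fits the gap; None if even sideways is too wide."""
    for i in range(steps + 1):
        yaw = (math.pi / 2) * i / steps
        if projected_width(wingspan, body_length, yaw) <= gap:
            return yaw
    return None

# A flyer with a 30 mm wingspan and 15 mm body facing a 20 mm gap
yaw = min_yaw_to_pass(0.030, 0.015, 0.020)
assert yaw is not None and yaw > 0  # must reorient to pass
```

The key behavioral finding, that the required yaw depends on gap width relative to wingspan rather than absolute gap width, falls directly out of this geometry.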
10
Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. [PMID: 33048938] [PMCID: PMC7553325] [DOI: 10.1371/journal.pcbi.1008272]
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects, like bees or ants, are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the actual sight of the insect matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees' return flights in such circumstances and investigated where they search for their nest entrance depending on the degree of displacement between the two visually relevant cues. Bumblebees mostly searched at the fictive nest location as indicated by either cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and were more robust to height changes if optic flow-based spatial information, rather than just brightness information, was encoded and learned.

Returning home sounds trivial, but returning to a concealed underground location like a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowered meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee; therefore, it guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, hence leading the bees to a unique new location. But in nature, the objects constituting the visual environment may be displaced in a disorderly manner, as some are affected differently by factors such as wind. In our study, we moved objects in such a way as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees' search. Finally, we could predict the search location using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
11
Active vision shapes and coordinates flight motor responses in flies. Proc Natl Acad Sci U S A 2020; 117:23085-23095. [PMID: 32873637] [DOI: 10.1073/pnas.1920846117]
Abstract
Animals use active sensing to respond to sensory inputs and guide future motor decisions. In flight, flies generate a pattern of head and body movements to stabilize gaze. How the brain relays visual information to control head and body movements, and how active head movements influence downstream motor control, remain elusive. Using a control theoretic framework, we studied the optomotor gaze stabilization reflex in tethered flight and quantified how head movements stabilize visual motion and shape wing steering efforts in fruit flies (Drosophila). By shaping visual inputs, head movements increased the gain of wing steering responses and coordination between stimulus and wings, pointing to a tight coupling between head and wing movements. Head movements followed the visual stimulus in as little as 10 ms, a delay similar to the human vestibulo-ocular reflex, whereas wing steering responses lagged by more than 40 ms. This timing difference suggests a temporal order in the flow of visual information such that the head filters visual information eliciting downstream wing steering responses. Head fixation significantly decreased the mechanical power generated by the flight motor by reducing wingbeat frequency and overall thrust. By simulating an elementary motion detector array, we show that head movements shift the effective visual input dynamic range onto the sensitivity optimum of the motion vision pathway. Taken together, our results reveal a transformative influence of active vision on flight motor responses in flies. Our work provides a framework for understanding how to coordinate moving sensors on a moving body.
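An elementary motion detector of the kind simulated above is, in its textbook form, a Hassenstein-Reichardt correlator: each arm multiplies a delayed photoreceptor signal with its neighbor's undelayed signal, and the two mirror-symmetric arms are subtracted. The sketch below uses a one-sample delay in place of a low-pass filter and illustrates only the general scheme, not the authors' model:

```python
import numpy as np

def hassenstein_reichardt(left, right, delay=1):
    """Elementary motion detector: correlate each photoreceptor signal
    with the delayed signal of its neighbor, then subtract the mirror
    arm. Positive output = motion from `left` toward `right`.
    `delay` is in samples, a toy stand-in for a low-pass filter."""
    l_d = np.roll(left, delay)   # delayed left arm
    r_d = np.roll(right, delay)  # delayed right arm
    l_d[:delay] = 0              # discard wrap-around samples
    r_d[:delay] = 0
    return l_d * right - r_d * left

t = np.arange(200)
stim = np.sin(2 * np.pi * t / 40)                           # drifting grating proxy
resp_pref = hassenstein_reichardt(stim, np.roll(stim, 2))   # left-to-right motion
resp_null = hassenstein_reichardt(np.roll(stim, 2), stim)   # right-to-left motion
assert resp_pref.mean() > 0 > resp_null.mean()              # direction selectivity
```

The multiplicative nonlinearity is what makes the detector's output depend on stimulus contrast and temporal frequency, which is why shifting the input dynamic range via head movements matters.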
12
MaBouDi H, Galpayage Dona HS, Gatto E, Loukola OJ, Buckley E, Onoufriou PD, Skorupski P, Chittka L. Bumblebees Use Sequential Scanning of Countable Items in Visual Patterns to Solve Numerosity Tasks. Integr Comp Biol 2020; 60:929-942. [PMID: 32369562] [PMCID: PMC7750931] [DOI: 10.1093/icb/icaa025]
Abstract
Most research in comparative cognition focuses on measuring whether animals manage certain tasks; fewer studies explore how animals might solve them. We investigated bumblebees' scanning strategies in a numerosity task, distinguishing patterns with two items from four and one from three, and subsequently transferring numerical information to novel numbers, shapes, and colors. Video analyses of flight paths indicate that bees do not determine the number of items by using a rapid assessment of number (as mammals do in "subitizing"); instead, they rely on sequential enumeration even when items are presented simultaneously and in small quantities. This process, equivalent to the motor tagging ("pointing") found for large number tasks in some primates, results in longer scanning times for patterns containing larger numbers of items. Bees used a highly accurate working memory, remembering which items had already been scanned and re-inspecting items before making a decision in fewer than 1% of cases. Our results indicate that the small brain of bees, with less parallel processing capacity than mammals, might constrain them to use sequential pattern evaluation even for low quantities.
Affiliation(s)
- HaDi MaBouDi
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK
- H Samadi Galpayage Dona
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK
- Elia Gatto
- Department of General Psychology, University of Padova, 35131 Padova, Italy
- Olli J Loukola
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK
- Emma Buckley
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK; Faculty of Biology, Medicine and Health, University of Manchester, Manchester M13 9PL, UK
- Panayiotis D Onoufriou
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK
- Peter Skorupski
- Institute of Medical and Biomedical Education, St George's, University of London, London SW17 0RE, UK
- Lars Chittka
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK; Wissenschaftskolleg zu Berlin-Institute for Advanced Study, Wallotstrasse 19, 14193 Berlin, Germany
13
Kheradmand B, Nieh JC. The Role of Landscapes and Landmarks in Bee Navigation: A Review. Insects 2019; 10:342. [PMID: 31614833] [PMCID: PMC6835465] [DOI: 10.3390/insects10100342]
Abstract
The ability of animals to explore landmarks in their environment is essential to their fitness. Landmarks are widely recognized to play a key role in navigation by providing information in multiple sensory modalities. However, what is a landmark? We propose that animals use a hierarchy of information based upon its utility and salience when an animal is in a given motivational state. Focusing on honeybees, we suggest that foragers choose landmarks based upon their relative uniqueness, conspicuousness, stability, and context. We also propose that it is useful to distinguish between landmarks that provide sensory input that changes (“near”) or does not change (“far”) as the receiver uses these landmarks to navigate. However, we recognize that this distinction occurs on a continuum and is not a clear-cut dichotomy. We review the rich literature on landmarks, focusing on recent studies that have illuminated our understanding of the kinds of information that bees use, how they use it, potential mechanisms, and future research directions.
Affiliation(s)
- Bahram Kheradmand
- Section of Ecology, Behavior, and Evolution, Division of Biological Sciences, UC San Diego, La Jolla, CA 92093, USA.
- James C Nieh
- Section of Ecology, Behavior, and Evolution, Division of Biological Sciences, UC San Diego, La Jolla, CA 92093, USA.
14
Ravi S, Bertrand O, Siesenop T, Manz LS, Doussot C, Fisher A, Egelhaaf M. Gap perception in bumblebees. J Exp Biol 2019; 222:jeb184135. [PMID: 30683732] [DOI: 10.1242/jeb.184135]
Abstract
A number of insects fly over long distances below the natural canopy, where the physical environment is highly cluttered, consisting of obstacles of varying shape, size and texture. While navigating within such environments, animals need to perceive and disambiguate environmental features that might obstruct their flight. The most elemental aspect of aerial navigation through such environments is gap identification and 'passability' evaluation. We used bumblebees to seek insights into the mechanisms used for gap identification when confronted with an obstacle in their flight path and the behavioral compensations employed to assess gap properties. Initially, bumblebee foragers were trained to fly through an unobstructed flight tunnel that led to a foraging chamber. After the bees were familiar with this situation, we placed a wall containing a gap that unexpectedly obstructed the flight path on a return trip to the hive. The flight trajectories of the bees as they approached the obstacle wall and traversed the gap were analyzed in order to evaluate their behavior as a function of the distance between the gap and a background wall that was placed behind the gap. Bumblebees initially decelerated when confronted with an unexpected obstacle. Deceleration was first noticed when the obstacle subtended around 35 deg on the retina, but also depended on the properties of the gap. Subsequently, the bees gradually traded off their longitudinal velocity for lateral velocity and approached the gap with increasing lateral displacement and lateral velocity. Bumblebees shaped their flight trajectory depending on the salience of the gap, indicated in our case by the optic flow contrast between the region within the gap and on the obstacle, which decreased with decreasing distance between the gap and the background wall. As the optic flow contrast decreased, the bees spent an increasing amount of time moving laterally across the obstacles. During these repeated lateral maneuvers, the bees are probably assessing gap geometry and passability.
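The 35 deg retinal-subtense trigger reported above translates directly into a distance via the visual-angle formula θ = 2·arctan(w/2d). The helper functions and the obstacle width below are illustrative, not values from the paper:

```python
import math

def subtense_deg(width, distance):
    """Visual angle (degrees) subtended by an object of a given width
    viewed head-on from a given distance."""
    return math.degrees(2 * math.atan(width / (2 * distance)))

def distance_at_subtense(width, angle_deg):
    """Distance at which an object of `width` first subtends angle_deg."""
    return width / (2 * math.tan(math.radians(angle_deg) / 2))

# Hypothetical example: a 0.3 m-wide obstacle subtends 35 deg at ~0.48 m,
# so a constant-angle trigger scales the reaction distance with obstacle size.
d = distance_at_subtense(0.3, 35.0)
assert abs(subtense_deg(0.3, d) - 35.0) < 1e-9
```

A fixed angular threshold of this kind means larger obstacles elicit deceleration from proportionally farther away, without the bee needing absolute distance.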
Affiliation(s)
- Sridhar Ravi: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Olivier Bertrand: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Tim Siesenop: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Lea-Sophie Manz: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; Faculty of Biology, Johannes Gutenberg-Universität Mainz, 55122 Mainz, Germany
- Charlotte Doussot: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Alex Fisher: School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Martin Egelhaaf: Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
15
Lobecke A, Kern R, Egelhaaf M. Taking a goal-centred dynamic snapshot as a possibility for local homing in initially naïve bumblebees. J Exp Biol 2018; 221:jeb.168674. [PMID: 29150448] [DOI: 10.1242/jeb.168674]
Abstract
It is essential for central-place foragers, such as bumblebees, to return reliably to their nest. Bumblebees leaving their inconspicuous nest hole for the first time need to gather and learn sufficient information about their surroundings to allow them to return to the nest at the end of their trip, instead of just flying away to forage. We therefore assume an intrinsic learning programme that manifests itself in the flight structure immediately after leaving the nest for the first time. In this study, we recorded and analysed the first outbound flight of individually marked naïve bumblebees in an indoor environment. We found characteristic loop-like features in the flight pattern that appear to be necessary for the bees to acquire environmental information and might be relevant for finding the nest hole after a foraging trip. Despite common features in their spatio-temporal organisation, first departure flights from the nest are characterised by a high level of variability in their loop-like flight structure across animals. Changes in the turn direction of body orientation, for example, are distributed evenly across the entire area used for the flights, without any systematic relationship to the nest location. Considering the common flight motifs together with this variability, we hypothesise that a kind of dynamic snapshot, centred at the nest location, is taken during the early phase of departure flights. The quality of this snapshot is then thought to be 'tested' during the later phases of the departure flights for its usefulness in local homing.
Affiliation(s)
- Anne Lobecke: Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Roland Kern: Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Martin Egelhaaf: Department of Neurobiology and Cluster of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, 33615 Bielefeld, Germany
16
Longden KD, Wicklein M, Hardcastle BJ, Huston SJ, Krapp HG. Spike Burst Coding of Translatory Optic Flow and Depth from Motion in the Fly Visual System. Curr Biol 2017; 27:3225-3236.e3. [PMID: 29056452] [DOI: 10.1016/j.cub.2017.09.044]
Abstract
Many animals use the visual motion generated by traveling straight (the translatory optic flow) to successfully navigate obstacles: near objects appear larger and move more quickly than distant objects. Flies are expert at navigating cluttered environments, and while their visual processing of rotatory optic flow is understood in exquisite detail, how they process translatory optic flow remains a mystery. We present novel cell types, the vertical translation (VT) cells, that have local motion receptive fields matched to translational self-motion. One of these, the VT1 cell, encodes self-motion in the forward-sideslip direction and fires action potentials in spike bursts as well as single spikes. We show that the spike burst coding is size- and speed-tuned and is selectively modulated by motion parallax, the relative motion experienced during translation. These properties are spatially organized, so that the cell is most excited by clutter rather than by isolated objects. When the fly is presented with a simulation of flying past an elevated object, the spike burst activity is modulated by the height of the object, while the rate of single spikes is unaffected. When the moving object alone is experienced, the cell is only weakly driven. Meanwhile, the VT2-3 cells have motion receptive fields matched to the lift axis. In conjunction with previously described horizontal cells, the VT cells have properties well suited to the visual navigation of clutter and to encoding the fly's movements along the near-cardinal axes of thrust, lift, and forward sideslip.
Affiliation(s)
- Kit D Longden: Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Martina Wicklein: Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Ben J Hardcastle: Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Stephen J Huston: Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Holger G Krapp: Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
17
Geurten BRH, Niesterok B, Dehnhardt G, Hanke FD. Saccadic movement strategy in a semiaquatic species - the harbour seal (Phoca vitulina). J Exp Biol 2017; 220:1503-1508. [PMID: 28167803] [DOI: 10.1242/jeb.150763]
Abstract
Moving animals can estimate the distance of visual objects from the image shift on their retina (optic flow) created during translational, but not rotational, movements. To facilitate this distance estimation, many terrestrial and flying animals perform saccadic movements, thereby temporally separating translational and rotational movements and keeping rotation times short. In this study, we analysed whether a semiaquatic mammal, the harbour seal, also adopts a saccadic movement strategy. We recorded the seals' normal swimming pattern with video cameras and analysed head and body movements. The swimming seals indeed minimized rotation times by saccadic head and body turns, with top rotation speeds exceeding 350 deg s-1, thereby increasing the proportion of time spent in translational movement. Saccades occurred during both types of locomotion of the seals' intermittent swimming mode: active propulsion and gliding. In conclusion, harbour seals share the saccadic movement strategy of terrestrial animals. Whether this strategy is adopted to facilitate distance estimation from optic flow or serves a different function will be a topic of future research.
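The geometric reason why rotations must be kept short can be sketched with a small calculation: during pure translation at speed v, a point at distance d and bearing θ from the heading moves across the retina at angular velocity ω = (v/d)·sin θ, so d can be recovered from the measured flow, whereas any rotation adds a distance-independent component that corrupts the estimate. A sketch under these textbook assumptions (function names and values are illustrative, not taken from the study):

```python
import math

def translational_flow(v, d, bearing_deg):
    """Angular speed (deg/s) of the retinal image of a point at distance
    d (m) and the given bearing from the heading, during pure translation
    at speed v (m/s): omega = (v / d) * sin(theta)."""
    return math.degrees((v / d) * math.sin(math.radians(bearing_deg)))

def distance_from_flow(v, flow_deg_s, bearing_deg):
    """Invert the relation: estimate distance from the measured flow.
    Only valid while no rotational flow is superimposed, which is what
    a saccadic strategy ensures."""
    return v * math.sin(math.radians(bearing_deg)) / math.radians(flow_deg_s)
```

For example, an animal translating at 1 m/s that measures about 28.6 deg/s of image motion at 90 deg bearing would infer an object distance of 2 m; a 350 deg/s body turn would swamp that translational signal, which is why it is confined to brief saccades.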
Affiliation(s)
- Bart R H Geurten: Georg-August-University of Göttingen, Department of Cellular Neurobiology, Schwann-Schleiden Research Center, Julia-Lermontowa-Weg 3, Göttingen 37007, Germany
- Benedikt Niesterok: University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
- Guido Dehnhardt: University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
- Frederike D Hanke: University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
18
Li J, Lindemann JP, Egelhaaf M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front Comput Neurosci 2016; 10:111. [PMID: 27818631] [PMCID: PMC5073142] [DOI: 10.3389/fncom.2016.00111]
Abstract
Flying insects, such as flies or bees, rely on consistent information about the depth structure of the environment when performing flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets and navigating in space. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially when the vast range of light intensities encountered in natural environments is taken into account. We address this question by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependency of the EMDs and effectively enhances the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
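The correlation-type EMD mentioned above can be sketched in a few lines: each half-detector low-pass filters (delays) the signal of one photoreceptor, multiplies it with the undelayed signal of a neighbour, and the two mirror-symmetric products are subtracted, yielding a direction-selective output. A minimal discrete-time sketch (the time constant and sampling step are illustrative assumptions, not the paper's parameters, and the modeled LMC and adaptation stages are omitted):

```python
import numpy as np

def emd_response(signal, dt=1e-3, tau=0.05):
    """Hassenstein-Reichardt correlation-type EMD.
    signal: array of shape (time, 2) holding the luminance at two
    neighbouring photoreceptors. One arm of each half-detector is
    delayed by a first-order low-pass filter (time constant tau),
    multiplied with the undelayed neighbour, and the two mirror-
    symmetric products are subtracted."""
    alpha = dt / (tau + dt)                  # low-pass update coefficient
    lp = np.zeros_like(signal)
    for t in range(1, len(signal)):          # first-order low-pass filter
        lp[t] = lp[t - 1] + alpha * (signal[t] - lp[t - 1])
    # positive output for motion from receptor 0 towards receptor 1
    return lp[:, 0] * signal[:, 1] - lp[:, 1] * signal[:, 0]
```

A grating drifting from receptor 0 to receptor 1 yields a positive mean response and the opposite direction a negative one, but the output magnitude also varies with contrast and pattern, which is exactly the dependency the modeled peripheral processing has to tame.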
Affiliation(s)
- Jinglin Li: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
19
How Wasps Acquire and Use Views for Homing. Curr Biol 2016; 26:470-82. [DOI: 10.1016/j.cub.2015.12.052]