1. Cormons MJ, Zeil J. Digger wasps Microbembex monodonta Say (Hymenoptera, Crabronidae) rely exclusively on visual cues when pinpointing their nest entrances. PLoS One 2023; 18:e0282144. PMID: 36989296; PMCID: PMC10058119; DOI: 10.1371/journal.pone.0282144.
Abstract
The ability of insects to navigate and home is crucial to fundamental tasks, such as pollination, parental care, procuring food, and finding mates. Despite recent advances in our understanding of visual homing in insects, it remains unclear exactly how ground-nesting Hymenoptera are able to precisely locate their often inconspicuous or hidden reproductive burrow entrances. Here we show that the ground-nesting wasp Microbembex monodonta locates her hidden burrow entrance with the help of local landmarks, but only if her view of the wider panorama is not blocked. Moreover, the wasps are able to pinpoint the burrow location to within a few centimeters when potential olfactory, tactile and auditory cues are locally masked. We conclude that M. monodonta locates her hidden burrow by relying exclusively on local visual cues in the context of the wider panorama. We discuss these results in the light of the older and more recent literature on nest recognition and homing in insects.
Affiliation(s)
- Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, ACT, Australia

2. Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. PMID: 36515743; DOI: 10.1007/s00359-022-01599-2.
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorized (positional image matching). Central place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest, which are likely designed to systematically collect scene memories tagged with path-integration information on the direction of and the distance to the nest. Similarly, when traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
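
The two matching operations described in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not code from the review: views are assumed to be equally sized NumPy float arrays holding a panoramic image, and render_view is a hypothetical function returning the panorama visible at a given 2D position.

```python
import numpy as np

def image_difference(view_a, view_b):
    # Root-mean-square pixel difference between two equally sized panoramic views.
    return np.sqrt(np.mean((view_a.astype(float) - view_b.astype(float)) ** 2))

def alignment_matching(current, memory):
    # Heading retrieval: rotate the current panorama column by column (azimuth shift)
    # and return the shift that minimises the difference to the memorised view.
    diffs = [image_difference(np.roll(current, shift, axis=1), memory)
             for shift in range(current.shape[1])]
    return int(np.argmin(diffs)), diffs

def positional_matching_step(position, memory, render_view, step=0.05):
    # Positional image matching: one step of gradient descent in image differences,
    # sampling the difference at small displacements and moving towards the minimum.
    # render_view(p) is a placeholder for whatever supplies the panorama at position p.
    candidates = [np.asarray(position, dtype=float) + step * np.array([dx, dy])
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    scores = [image_difference(render_view(p), memory) for p in candidates]
    return candidates[int(np.argmin(scores))]
```
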

3. Space, the original frontier. Curr Opin Behav Sci 2022. DOI: 10.1016/j.cobeha.2022.101106.

4. Goulard R, Buehlmann C, Niven JE, Graham P, Webb B. A unified mechanism for innate and learned visual landmark guidance in the insect central complex. PLoS Comput Biol 2021; 17:e1009383. PMID: 34555013; PMCID: PMC8491911; DOI: 10.1371/journal.pcbi.1009383.
Abstract
Insects can navigate efficiently in both novel and familiar environments, and this requires flexibility in how they are guided by sensory cues. A prominent landmark, for example, can elicit strong innate behaviours (attraction or menotaxis) but can also be used, after learning, as a specific directional cue as part of a navigation memory. However, the mechanisms that allow both pathways to co-exist, interact or override each other are largely unknown. Here we propose a model for the behavioural integration of innate and learned guidance based on the neuroanatomy of the central complex (CX), adapted to control landmark-guided behaviours. We consider a reward signal provided either by an innate attraction to landmarks or by a long-term visual memory in the mushroom bodies (MB) that modulates the formation of a local vector memory in the CX. Using an operant strategy for a simulated agent exploring a simple world containing a single visual cue, we show how the generated short-term memory can support both innate and learned steering behaviour. In addition, we show how this architecture is consistent with the observed effects of unilateral MB lesions in ants that cause a reversion to innate behaviour. We suggest that the formation of a directional memory in the CX can be interpreted as transforming rewarding (positive or negative) sensory signals into a mapping of the environment that describes the geometrical attractiveness (or repulsion). We discuss how this scheme might represent an ideal way to combine multisensory information gathered during the exploration of an environment and support optimal cue integration.
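
As a rough sketch of the idea of a reward-modulated vector memory, the snippet below accumulates the agent's current heading into a ring of direction-tuned weights in proportion to a reward signal (innate landmark attraction or MB visual familiarity), and reads the memory out as a steering error. It is an illustrative simplification under assumed names, not the published CX model.

```python
import numpy as np

N_DIRS = 16
PREFS = np.linspace(0.0, 2.0 * np.pi, N_DIRS, endpoint=False)  # preferred directions of ring cells

def update_vector_memory(memory, heading, reward, lr=0.1):
    # Reward-gated write: add a cosine bump centred on the current heading,
    # scaled by the reward signal (innate attraction or MB familiarity).
    return memory + lr * reward * np.cos(PREFS - heading)

def steering_error(memory, heading):
    # Read out the memory's preferred direction as a population vector and return
    # the signed angular difference to the current heading (left turn positive).
    preferred = np.arctan2(np.sum(memory * np.sin(PREFS)),
                           np.sum(memory * np.cos(PREFS)))
    return float(np.angle(np.exp(1j * (preferred - heading))))
```

Starting from memory = np.zeros(N_DIRS), repeatedly applying update_vector_memory while the agent explores builds a directional preference that steering_error can then track.
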
Affiliation(s)
- Roman Goulard
- Institute for Perception, Action, and Behaviour, School of Informatics, University of Edinburgh, Edinburgh, Scotland, United Kingdom
- Cornelia Buehlmann
- School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Jeremy E. Niven
- School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Paul Graham
- School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Barbara Webb
- Institute for Perception, Action, and Behaviour, School of Informatics, University of Edinburgh, Edinburgh, Scotland, United Kingdom

5. Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021; 16:055007. PMID: 34243169; DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal, or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route following approach that makes use of transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view matching process which can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet based bandpass orientated filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate using a vision system with a biologically relevant visual acuity and viewing direction.
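
The following is a loose sketch, not the paper's implementation, of how transverse oscillation plus view familiarity can re-centre a path: the agent drifts sideways while flying forward and reverses the drift whenever the familiarity of the downward-facing view drops. Plain pixel differences stand in for the wavelet-based view matching used in the paper, and render_ground_view and memories (float arrays) are assumed inputs.

```python
import numpy as np

def follow_route(start, heading, memories, render_ground_view,
                 n_steps=200, speed=0.1, sway=0.3):
    # Advance along the route heading while oscillating laterally; reverse the
    # lateral drift whenever familiarity of the current ground view decreases.
    forward = np.array([np.cos(heading), np.sin(heading)])
    side = np.array([-np.sin(heading), np.cos(heading)])   # unit vector to the left
    pos = np.asarray(start, dtype=float)
    drift, last_familiarity = 1.0, -np.inf
    path = [pos.copy()]
    for _ in range(n_steps):
        view = render_ground_view(pos)
        familiarity = -min(np.mean((view - m) ** 2) for m in memories)  # best match in the bank
        if familiarity < last_familiarity:
            drift = -drift                                  # familiarity dropped: swing back
        last_familiarity = familiarity
        pos = pos + speed * (forward + sway * drift * side)
        path.append(pos.copy())
    return np.array(path)
```
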
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom

6. Gallistel C. The physical basis of memory. Cognition 2021; 213:104533. DOI: 10.1016/j.cognition.2020.104533.

7. Wystrach A. Movements, embodiment and the emergence of decisions. Insights from insect navigation. Biochem Biophys Res Commun 2021; 564:70-77. PMID: 34023071; DOI: 10.1016/j.bbrc.2021.04.114.
Abstract
We readily infer that animals make decisions, but what this implies is usually not clearly defined. The notion of 'decision-making' ultimately stems from human introspection, and is thus loaded with anthropomorphic assumptions. Notably, the decision is made internally, is based on information, and precedes the goal-directed behaviour. Also, making a decision implies that 'something' did it, and thus hints at the presence of a cognitive mind, whose existence is independent of the decision itself. This view may convey some truth, but here I take the opposite stance. Using examples from research in insect navigation, this essay highlights how apparent decisions can emerge without a brain, how actions can precede information, or how sophisticated goal-directed behaviours can be implemented without neural decisions. This perspective requires us to shake off the idea that behaviour is a consequence of the brain, and to embrace the concept that movements arise from, as much as participate in, distributed interactions between various computational centres (including the body) that reverberate in closed-loop with the environment. From this perspective we may start to picture how a cognitive mind can be the consequence, rather than the cause, of such neural and body movements.
Affiliation(s)
- Antoine Wystrach
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 route de Narbonne, F-31062, Toulouse, France.

8. Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. PMID: 33542681; PMCID: PMC7852487; DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including their spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: Bees employ a saccadic flight and gaze strategy, where rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, that is, depth in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time, the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
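
A pivot point of the kind discussed above can be estimated from tracked head data by intersecting successive gaze lines in a least-squares sense. The sketch below assumes 2D head positions and gaze direction vectors as inputs; it illustrates the geometry only and is not the authors' analysis pipeline.

```python
import numpy as np

def estimate_pivot(head_positions, gaze_directions):
    # Least-squares intersection of gaze lines: find the point x minimising the summed
    # squared perpendicular distance to every line (p_i, d_i). If the gaze direction
    # rotates about a fixed point in space, x recovers that pivot point.
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(head_positions, gaze_directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)      # projector onto the line's normal direction
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)            # singular if all gaze lines are parallel
```
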
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany

9. Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. PMID: 33048938; PMCID: PMC7553325; DOI: 10.1371/journal.pcbi.1008272.
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects, like bees or ants, are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest-entrance. During the return, when the actual sight of the insect matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees' return flights in such circumstances and investigated where they search for their nest entrance depending on the degree of displacement between the two visually relevant cues. Bumblebees mostly searched at the fictive nest location as indicated by either cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information was encoded and learned, rather than just brightness information.

Returning home sounds trivial, but returning to a concealed underground location like a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowered meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible for a returning bee; therefore, it guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, hence leading the bees to a unique new location. But in nature, the objects constituting the visual environment may be displaced in a disorderly way, as some are more susceptible than others to factors such as wind. In our study, we moved objects in a tricky way to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees' search. Finally, we could predict the search location using bio-inspired homing models that are potentially of interest for implementation in autonomous robots.
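
To make the distinction between holistic single-view and multi-view homing concrete, the sketch below steers by scanning candidate headings and picking the one whose view best matches any snapshot in a memory bank; with a single memory it collapses to a holistic model, with several memories it behaves like the multi-view models referred to above. render_view and the memory bank are assumed inputs, and plain pixel brightness stands in for the optic flow-based encoding discussed in the abstract.

```python
import numpy as np

def most_familiar_heading(position, memories, render_view, n_headings=36):
    # Scan candidate headings, render the view seen when facing each one, and
    # return the heading whose view best matches ANY stored snapshot.
    headings = np.linspace(0.0, 2.0 * np.pi, n_headings, endpoint=False)
    scores = [min(np.mean((render_view(position, h) - m) ** 2) for m in memories)
              for h in headings]
    return float(headings[int(np.argmin(scores))])
```
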
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany

10. Sun X, Yue S, Mangan M. A decentralised neural model explaining optimal integration of navigational strategies in insects. eLife 2020; 9:e54026. PMID: 32589143; PMCID: PMC7365663; DOI: 10.7554/elife.54026.
Abstract
Insect navigation arises from the coordinated action of concurrent guidance systems, but the neural mechanisms through which each functions, and through which they are then coordinated, remain unknown. We propose that insects require distinct strategies to retrace familiar routes (route-following) and to return directly from novel to familiar terrain (homing), using different aspects of frequency-encoded views that are processed in different neural pathways. We also demonstrate how the Central Complex and Mushroom Bodies regions of the insect brain may work in tandem to coordinate the directional output of different guidance cues through a contextually switched ring attractor inspired by neural recordings. The resultant unified model of insect navigation reproduces behavioural data from a series of cue-conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, utilise the different information that they provide and coordinate their outputs to achieve the adaptive behaviours observed in the wild.
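
The weighted combination of directional cues by a ring attractor can be approximated, at steady state, by summing cue-driven activity bumps on a ring of heading cells and reading out the population vector; the result is a certainty-weighted circular mean. This is a back-of-the-envelope sketch of the integration step only, not the full decentralised model.

```python
import numpy as np

def integrate_cues(cue_directions, cue_weights, n_cells=16):
    # Each cue injects a cosine-shaped bump into a ring of heading cells, scaled by its
    # weight (certainty); the output direction is the population vector of the summed activity.
    prefs = np.linspace(0.0, 2.0 * np.pi, n_cells, endpoint=False)
    activity = np.zeros(n_cells)
    for direction, weight in zip(cue_directions, cue_weights):
        activity += weight * np.cos(prefs - direction)
    return float(np.arctan2(np.sum(activity * np.sin(prefs)),
                            np.sum(activity * np.cos(prefs))))

# Example: a path-integration cue pointing north weighted 0.7 and a visual-memory cue
# pointing east weighted 0.3 yield an intermediate, PI-dominated heading.
combined = integrate_cues([np.pi / 2, 0.0], [0.7, 0.3])
```
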
Affiliation(s)
- Xuelong Sun
- Computational Intelligence Lab & L-CAS, School of Computer Science, University of Lincoln, Lincoln, United Kingdom
- Shigang Yue
- Computational Intelligence Lab & L-CAS, School of Computer Science, University of Lincoln, Lincoln, United Kingdom
- Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, Sheffield, United Kingdom

11. Le Möel F, Wystrach A. Opponent processes in visual memories: A model of attraction and repulsion in navigating insects' mushroom bodies. PLoS Comput Biol 2020; 16:e1007631. PMID: 32023241; PMCID: PMC7034919; DOI: 10.1371/journal.pcbi.1007631.
Abstract
Solitary foraging insects display stunning navigational behaviours in visually complex natural environments. Current literature assumes that these insects are mostly driven by attractive visual memories, which are learnt when the insect's gaze is precisely oriented toward the goal direction, typically along its familiar route or towards its nest. That way, an insect could return home by simply moving in the direction that appears most familiar. Here we show using virtual reconstructions of natural environments that this principle suffers from fundamental drawbacks; notably, a given view of the world does not provide information about whether the agent should turn or not to reach its goal. We propose a simple model where the agent continuously compares its current view with both goal and anti-goal visual memories, which are treated as attractive and repulsive respectively. We show that this strategy effectively results in an opponent process, albeit not at the perceptual level (such as those proposed for colour vision or polarisation detection) but at the level of the environmental space. This opponent process results in a signal that strongly correlates with the angular error of the current body orientation so that a single view of the world now suffices to indicate whether the agent should turn or not. By incorporating this principle into a simple agent navigating in reconstructed natural environments, we show that it overcomes the usual shortcomings and produces a step-increase in navigation effectiveness and robustness. Our findings provide a functional explanation to recent behavioural observations in ants and why and how so-called aversive and appetitive memories must be combined. We propose a likely neural implementation based on insects' mushroom bodies' circuitry that produces behavioural and neural predictions contrasting with previous models.
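
A minimal sketch of the proposed opponent process, under assumed data structures (views as equally sized NumPy float arrays, two banks of stored memories): familiarity to goal-facing memories minus familiarity to anti-goal memories yields a single scalar that can drive the turn-or-go-straight decision described above.

```python
import numpy as np

def familiarity(view, memories):
    # Higher (closer to zero) when the current view closely matches any stored memory.
    return -min(np.mean((view - m) ** 2) for m in memories)

def opponent_signal(view, goal_memories, antigoal_memories):
    # Attractive minus repulsive familiarity; a high value suggests the current
    # body orientation is close to the goal direction.
    return familiarity(view, goal_memories) - familiarity(view, antigoal_memories)

def turn_command(view, goal_memories, antigoal_memories, gain=1.0, threshold=0.0):
    # Turn more sharply as the opponent signal drops below a threshold;
    # zero means the agent keeps its current heading.
    signal = opponent_signal(view, goal_memories, antigoal_memories)
    return gain * max(0.0, threshold - signal)
```
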
Affiliation(s)
- Florent Le Möel
- Research Centre on Animal Cognition, University Paul Sabatier/CNRS, Toulouse, France
- Antoine Wystrach
- Research Centre on Animal Cognition, University Paul Sabatier/CNRS, Toulouse, France

12. Schofield AJ, Gilchrist ID, Bloj M, Leonardis A, Bellotto N. Understanding images in biological and computer vision. Interface Focus 2018. DOI: 10.1098/rsfs.2018.0027.
Affiliation(s)
- Andrew J. Schofield
- School of Psychology, University of Birmingham, Edgbaston, Birmingham, B15 2TT, UK
- Iain D. Gilchrist
- School of Experimental Psychology, University of Bristol, 12A Priory Road, Bristol, BS8 1TU, UK
- Marina Bloj
- School of Optometry and Vision Sciences, University of Bradford, Bradford, BD7 1DP, UK
- Ales Leonardis
- School of Computer Science, University of Birmingham, Edgbaston, Birmingham, B15 2TT, UK
- Nicola Bellotto
- School of Computer Science, University of Lincoln, Brayford Pool, Lincoln, LN6 7TS, UK