1. Buehlmann C, Dell-Cronin S, Diyalagoda Pathirannahelage A, Goulard R, Webb B, Niven JE, Graham P. Impact of central complex lesions on innate and learnt visual navigation in ants. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36790487; DOI: 10.1007/s00359-023-01613-1.
Abstract
Wood ants are excellent navigators, using a combination of innate and learnt navigational strategies to travel between their nest and feeding sites. Visual navigation in ants has been studied extensively; however, we have little direct evidence for the underlying neural mechanisms. Here, we perform lateralized mechanical lesions in the central complex (CX) of wood ants, a midline structure known to allow an insect to keep track of the direction of sensory cues relative to its own orientation and to control movement. We lesioned two groups of ants and observed their behaviour in an arena with a large visual landmark present. The first group of ants were naïve; when intact, such ants show a clear innate attraction to the conspicuous landmark. The second group of ants were trained to aim for a food location to the side of the landmark. The general heading of naïve ants towards the visual cue was not altered by the lesions, but the heading of ants trained to a landmark-adjacent food position was affected. Thus, CX lesions had a specific impact on learnt visual guidance. We also observed that lateralized lesions altered the fine details of turning, with lesioned ants spending less time turning to the side ipsilateral to the lesion. These results confirm the role of the CX in turn control and highlight its important role in the implementation of learnt behaviours that rely on information from other brain regions.
Affiliation(s)
- Roman Goulard
- School of Informatics, University of Edinburgh, Edinburgh, EH8 9AB, UK; Lund Vision Group, Department of Biology, Lund University, 223 62, Lund, Sweden
- Barbara Webb
- School of Informatics, University of Edinburgh, Edinburgh, EH8 9AB, UK
- Jeremy E Niven
- School of Life Sciences, University of Sussex, Brighton, BN1 9QG, UK
- Paul Graham
- School of Life Sciences, University of Sussex, Brighton, BN1 9QG, UK
2. Nguyen TAT, Beetz MJ, Merlin C, Pfeiffer K, el Jundi B. Weighting of Celestial and Terrestrial Cues in the Monarch Butterfly Central Complex. Front Neural Circuits 2022; 16:862279. PMID: 35847485; PMCID: PMC9285895; DOI: 10.3389/fncir.2022.862279.
Abstract
Monarch butterflies rely on external cues for orientation during their annual long-distance migration from the northern US and Canada to central Mexico. These external cues can be celestial, such as the sun or polarized light, and are processed in a brain region termed the central complex (CX). Previous research has typically focused on how individual simulated celestial cues are encoded in the butterfly's CX. In nature, however, butterflies perceive several celestial cues at the same time and need to integrate them to use the combination of all cues for orientation effectively. In addition, a recent behavioral study revealed that monarch butterflies can rely on terrestrial cues, such as the panoramic skyline, for orientation, and use them in combination with the sun to maintain a directed flight course. How the CX encodes a combination of celestial and terrestrial cues, and how these cues are weighted, is still unknown. Here, we examined how input neurons of the CX, termed TL neurons, combine celestial and terrestrial information. While recording intracellularly from these neurons, we presented a sun stimulus and polarized light to the butterflies, as well as a simulated sun and a panoramic scene simultaneously. Our results show that celestial cues are integrated linearly in these cells, whereas the combination of the sun and a panoramic skyline did not always follow a linear integration of action-potential rates. Interestingly, while the sun and polarized light were weighted invariantly across individual neurons, the sun stimulus and panoramic skyline were weighted dynamically when both stimuli were presented simultaneously. Taken together, this dynamic weighting between celestial and terrestrial cues may allow the butterflies to flexibly set their cue preference during navigation.
Affiliation(s)
- M. Jerome Beetz
- Biocenter, Zoology II, University of Wuerzburg, Würzburg, Germany
- Christine Merlin
- Department of Biology and Center for Biological Clocks Research, Texas A&M University, College Station, TX, United States
- Keram Pfeiffer
- Biocenter, Zoology II, University of Wuerzburg, Würzburg, Germany
- Basil el Jundi
- Biocenter, Zoology II, University of Wuerzburg, Würzburg, Germany; Department of Biology, Animal Physiology, Norwegian University of Science and Technology, Trondheim, Norway
- *Correspondence: Basil el Jundi
3. Buehlmann C, Graham P. Innate visual attraction in wood ants is a hardwired behavior seen across different motivational and ecological contexts. Insectes Soc 2022; 69:271-277. PMID: 35909593; PMCID: PMC9314291; DOI: 10.1007/s00040-022-00867-3.
Abstract
Ants are expert navigators, combining innate and learnt navigational strategies. Whereas we know that the ants' feeding state segregates visual-navigational memories in ants navigating along a learnt route, it is an open question whether motivational state also affects the ants' innate visual preferences. Wood ant foragers show an innate attraction to conspicuous visual cues. These foragers inhabit cluttered woodland habitats and feed on honeydew from aphids on trees. Hence, the attraction to 'tree-like' objects might be an ecologically relevant behavior that is tailored to the wood ants' foraging ecology. Foragers from other ant species with different foraging ecologies show very different innate attractions. Here, we investigated the innate visual response of wood ant foragers with different motivational states, i.e., unfed or fed, as well as males, which show no foraging activity. Our results show that ants from all three groups orient toward a prominent visual cue, i.e., this intrinsic visuomotor response is not context-dependent but a hardwired behavior seen across different motivational and ecological contexts.
Supplementary information: The online version contains supplementary material available at 10.1007/s00040-022-00867-3.
Affiliation(s)
- C. Buehlmann
- School of Life Sciences, University of Sussex, Brighton, BN1 9QG, UK
- P. Graham
- School of Life Sciences, University of Sussex, Brighton, BN1 9QG, UK
4. Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021; 16:055007. PMID: 34243169; DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and on how these change as the animal moves. However, honey bees are known to navigate visually in flat, open meadows where visual information at the horizon is minimal or remains relatively constant across a wide range of positions. In this paper, we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low-resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route-following approach that uses transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it performs robustly in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view-matching process that can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet-based orientated bandpass filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate with a vision system of biologically relevant visual acuity and viewing direction.
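The view-matching idea in this abstract can be illustrated in miniature. The sketch below is not the authors' pipeline: it substitutes a simple difference-of-Gaussians bandpass (an assumption) for their orientated wavelet filters, and scores familiarity as the lowest sum of squared differences against a bank of stored ground views.

```python
import numpy as np

def bandpass(img, s1=1.0, s2=3.0):
    """Crude difference-of-Gaussians bandpass via separable blurs."""
    def blur(a, sigma):
        r = int(3 * sigma)
        x = np.arange(-r, r + 1)
        k = np.exp(-x ** 2 / (2 * sigma ** 2))
        k /= k.sum()
        a = np.apply_along_axis(lambda v: np.convolve(v, k, mode='same'), 0, a)
        return np.apply_along_axis(lambda v: np.convolve(v, k, mode='same'), 1, a)
    return blur(img, s1) - blur(img, s2)

def most_familiar(view, memory_bank):
    """Index of the stored view most similar to the current view
    (lower SSD on bandpassed images = more familiar)."""
    v = bandpass(view)
    ssd = [np.sum((v - bandpass(m)) ** 2) for m in memory_bank]
    return int(np.argmin(ssd))

# Toy usage: three random 'ground views'; a noisy copy of the second
# should be recognised as most familiar.
rng = np.random.default_rng(0)
bank = [rng.random((32, 32)) for _ in range(3)]
query = bank[1] + 0.05 * rng.standard_normal((32, 32))
match = most_familiar(query, bank)
```

A route follower in the spirit of the paper would evaluate such familiarity scores to the left and right of the current heading during the transverse oscillation and steer towards the more familiar side.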
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
5. Kim SS, Hermundstad AM, Romani S, Abbott LF, Jayaraman V. Generation of stable heading representations in diverse visual scenes. Nature 2019; 576:126-131. PMID: 31748750; PMCID: PMC8115876; DOI: 10.1038/s41586-019-1767-1.
Abstract
Many animals rely on an internal heading representation when navigating in varied environments [1-10]. How this representation is linked to the sensory cues defining different surroundings is unclear. In the fly brain, heading is represented by 'compass neurons' that innervate a ring-shaped structure, the ellipsoid body [3,11,12]. Each compass neuron receives inputs from visual-feature-selective 'ring neurons' [13-16], providing the ideal substrate for the extraction of directional information from a visual scene. We combine two-photon calcium imaging and optogenetics in tethered flying flies with circuit modeling to show how the correlated activity of compass and visual neurons drives plasticity [17-22] that flexibly transforms two-dimensional visual cues into a stable heading representation. We also describe how this plasticity enables the fly to convert a partial heading representation, established from orienting within part of a novel setting, into a complete heading representation. Our results provide mechanistic insight into memory-related computations essential for flexible navigation in varied surroundings.
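The plasticity described in this abstract can be caricatured in a few lines. The toy sketch below is not the authors' circuit model; it assumes a single landmark, an accurate compass bump during learning, and a plain Hebbian outer-product update (all simplifications), to show how a visual feature's retinal azimuth can become tethered to heading so that vision alone later recalls the heading.

```python
import numpy as np

N_H, N_V = 16, 64  # heading bins, visual azimuth bins (arbitrary sizes)

def bump(center, n, width=2.0):
    """Circular Gaussian bump of activity over n bins."""
    idx = np.arange(n)
    d = np.minimum(np.abs(idx - center), n - np.abs(idx - center))
    return np.exp(-(d / width) ** 2)

landmark_offset = 20       # fixed world position of the single landmark
W = np.zeros((N_H, N_V))   # visual-to-compass weights, initially blank

# 'Learning': the fly turns through all headings; the correct compass
# bump is assumed available (e.g. from angular path integration), and
# correlated visual/compass activity potentiates the weights.
for h in range(N_H):
    c = bump(h, N_H)                                          # compass bump
    v = bump((h * N_V // N_H + landmark_offset) % N_V, N_V)   # retinal azimuth
    W += 0.1 * np.outer(c, v)                                 # Hebbian update

# 'Recall': given only the visual scene for some true heading,
# the learned weights reactivate the matching compass bin.
true_h = 5
v = bump((true_h * N_V // N_H + landmark_offset) % N_V, N_V)
decoded = int(np.argmax(W @ v))
```

Under these assumptions the weight matrix ends up acting as a lookup table from retinal landmark position to heading bin, which is the essence of tethering a heading representation to an arbitrary scene.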
Affiliation(s)
- Sung Soo Kim
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA; Department of Molecular, Cellular, and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA, USA; Neuroscience Research Institute, University of California, Santa Barbara, Santa Barbara, CA, USA
- Ann M Hermundstad
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA
- Sandro Romani
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA
- L F Abbott
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA; Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, USA
- Vivek Jayaraman
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA
6. Spatial Cognition: Allowing Natural Behaviours to Flourish in the Lab. Curr Biol 2019; 29:R639-R641. PMID: 31287984; DOI: 10.1016/j.cub.2019.05.031.
Abstract
Understanding the computational basis of spatial cognition requires observations of natural behaviour and of the underlying neural circuits, which are difficult to make simultaneously. However, recent studies show how this might be achieved by combining rich virtual-reality set-ups with optogenetics in freely moving animals.
7. Pomaville MB, Lent DD. Multiple Representations of Space by the Cockroach, Periplaneta americana. Front Psychol 2018; 9:1312. PMID: 30104993; PMCID: PMC6077775; DOI: 10.3389/fpsyg.2018.01312.
Abstract
When cockroaches are trained to a visual-olfactory cue pairing using the antennal projection response (APR), they can form different memories for the location of a visual cue. A series of experiments, each examining memory for the spatial location of a visual cue, was performed using restrained cockroaches. The first group of experiments involved training cockroaches to associate a visual cue (CS: green LED) with an odor cue (US) in the presence or absence of a second visual reference cue (white LED). These experiments revealed that cockroaches have at least two forms of spatial memory. First, during learning, the movements of the antennae in response to the odor influenced the cockroaches' memory. If they use only one antenna, cockroaches form a memory that results in an APR being elicited to the CS irrespective of its location in space. When using both antennae, the cockroaches' resulting memory leads to an APR to the CS that is spatially confined to within 15° of the trained position; this memory represents an egocentric spatial representation. Second, the cockroaches simultaneously formed a memory for the angular spatial relationships between two visual cues when trained in the presence of a second visual reference cue. This training provided the cockroaches with an allocentric representation, or visual snapshot, of the environment. If both the egocentric representation and the visual snapshot were available to the cockroach to localize the learned cue, the visual snapshot determined the behavioral response in this assay. Finally, a split-brain assay was used to characterize the cockroach's ability to establish a memory for the angular relationship between two visual cues with half a brain. Split-brain cockroaches were trained to unilaterally associate a pair of visual cues (CS: green LED; reference: white LED) with an odor cue (US). Split-brain cockroaches learned the general arrangement of the visual cues (i.e., the green LED is right of the white LED), but not the precise angular relationship. These experiments provide new insight into spatial memory processes in the cockroach.
Affiliation(s)
- Matthew B Pomaville
- Department of Biology, California State University, Fresno, CA, United States
- David D Lent
- Department of Biology, California State University, Fresno, CA, United States
8. Stone T, Mangan M, Wystrach A, Webb B. Rotation invariant visual processing for spatial memory in insects. Interface Focus 2018; 8:20180010. PMID: 29951190; PMCID: PMC6015815; DOI: 10.1098/rsfs.2018.0010.
Abstract
Visual memory is crucial to navigation in many animals, including insects. Here, we focus on the problem of visual homing, that is, using a comparison of the view at the current location with a view stored at the home location to control movement towards home along a novel shortcut. Insects show several visual specializations that appear advantageous for this task, including an almost panoramic field of view and ultraviolet light sensitivity, which enhances the salience of the skyline. We discuss several proposals for subsequent processing of the image to obtain the required motion information, focusing on how each might deal with the problem of yaw rotation of the current view relative to the home view. Possible solutions include tagging views with information from the celestial compass system, using multiple views pointing towards home, or a rotation-invariant encoding of the view. We illustrate briefly how a well-known shape-description method from computer vision, Zernike moments, could provide a compact and rotation-invariant representation of sky shapes to enhance visual homing. We discuss the biological plausibility of this solution, and also a fourth strategy, based on observed insect behaviour, that involves transfer of information from visual memory matching to the compass system.
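The rotation-invariance property of Zernike moments mentioned in this abstract is easy to check numerically: rotating an image about the disk centre by an angle phi multiplies the moment A_nm by a phase factor, so the magnitude |A_nm| is unchanged. The sketch below is a minimal, unoptimised implementation (not the paper's code); the order (n, m) = (4, 2) and the test image are arbitrary choices for illustration.

```python
import numpy as np
from math import factorial

def zernike_magnitude(img, n, m):
    """|A_nm|: magnitude of the (n, m) Zernike moment of a square image,
    evaluated on the inscribed unit disk. Invariant to rotation of the
    image about the disk centre. Requires |m| <= n and n - |m| even."""
    N = img.shape[0]
    ys, xs = np.mgrid[0:N, 0:N]
    x = (2 * xs - N + 1) / (N - 1)   # map pixel grid to [-1, 1]
    y = (2 * ys - N + 1) / (N - 1)
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0
    # Radial polynomial R_n^|m|(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s)
                * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    V = R * np.exp(-1j * m * theta)  # complex Zernike basis function
    A = (n + 1) / np.pi * np.sum(img[mask] * V[mask])
    return abs(A)

# An off-centre rectangle and its 90-degree rotation give the same magnitude.
img = np.zeros((64, 64))
img[10:30, 20:50] = 1.0
a = zernike_magnitude(img, 4, 2)
b = zernike_magnitude(np.rot90(img), 4, 2)
```

For homing, a small vector of such magnitudes over several (n, m) orders would serve as a compact sky-shape descriptor that can be compared between current and home views regardless of the agent's yaw.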
Affiliation(s)
- Thomas Stone
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, Regent Court, Sheffield S1 4DP, UK
- Antoine Wystrach
- CNRS, Université Paul Sabatier, Toulouse, 31062 cedex 09, France
- Barbara Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, UK