1. Schwarz S, Wystrach A, Cheng K, Kelly DM. Landmarks, beacons, or panoramic views: What do pigeons attend to for guidance in familiar environments? Learn Behav 2024;52:69-84. PMID: 38379118; DOI: 10.3758/s13420-023-00610-3.
Abstract
Birds and social insects represent excellent systems for understanding visually guided navigation. Both animal groups use surrounding visual cues for homing and foraging. Ants extract sufficient spatial information from panoramic views, which naturally embed all near and far spatial information, for successful homing. Although egocentric panoramic views allow for parsimonious explanations of navigational behaviors, this potential source of spatial information has been mostly neglected during studies of vertebrates. Here we investigate how distinct landmarks, a beacon, and panoramic views influence reorientation behavior in pigeons (Columba livia). Pigeons were trained to search for a location characterized by a beacon and several distinct landmarks. Transformation tests manipulated aspects of the landmark configuration, allowing for a dissociation among navigational strategies. Quantitative image and path analyses indicated that the pigeons used the panoramic view. Although the results from some individuals support the use of beaconing, overall the pigeons relied predominantly on the panoramic view when spatial cues provided conflicting information regarding the goal location. Reorientation based on vector and bearing information derived from distinct landmarks as well as environmental geometry failed to account fully for the results. Thus, our results indicate that pigeons can use panoramic views for reorientation in familiar environments. Given that the current model for landmark use by pigeons posits the use of different vectors from an object, a global panorama-matching strategy suggests a fundamental change in the theory of how pigeons use surrounding visual cues for localization.
Affiliation(s)
- Sebastian Schwarz: Department of Psychology, University of Manitoba, 190 Dysart Road, 190 Duff Roblin Building, Winnipeg, MB R3T 2N2, Canada; Centre de Recherches sur la Cognition Animale, CNRS, Université Paul Sabatier, 31062 Toulouse Cedex 09, France; Institute of Biology, Karl-Franzens University Graz, Universitätsplatz 2, 8010 Graz, Austria
- Antoine Wystrach: Centre de Recherches sur la Cognition Animale, CNRS, Université Paul Sabatier, 31062 Toulouse Cedex 09, France
- Ken Cheng: School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Debbie M Kelly: Department of Psychology, University of Manitoba, 190 Dysart Road, 190 Duff Roblin Building, Winnipeg, MB R3T 2N2, Canada; Department of Biological Sciences, University of Manitoba, 212 Biological Sciences Building, Winnipeg, MB R3T 2N2, Canada
2. Ortega-Escobar J, Hebets EA, Bingman VP, Wiegmann DD, Gaffin DD. Comparative biology of spatial navigation in three arachnid orders (Amblypygi, Araneae, and Scorpiones). J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36781447; DOI: 10.1007/s00359-023-01612-2.
Abstract
From both comparative biology and translational research perspectives, there is escalating interest in understanding how animals navigate their environments. Considerable work is being directed towards understanding the sensory transduction and neural processing of environmental stimuli that guide animals to, for example, food and shelter. While much has been learned about the spatial orientation behavior, sensory cues, and neurophysiology of champion navigators such as bees and ants, many other, often overlooked animal species possess extraordinary sensory and spatial capabilities that can broaden our understanding of the behavioral and neural mechanisms of animal navigation. For example, arachnids are predators that often return to retreats after hunting excursions. Many of these arachnid central-place foragers are large and highly conducive to scientific investigation. In this review we highlight research on three orders within the Class Arachnida: Amblypygi (whip spiders), Araneae (spiders), and Scorpiones (scorpions). For each, we describe (I) their natural history and spatial navigation, (II) how they sense the world, (III) what information they use to navigate, and (IV) how they process information for navigation. We discuss similarities and differences among the groups and highlight potential avenues for future research.
Affiliation(s)
- Eileen A Hebets: School of Biological Sciences, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Verner P Bingman: Department of Psychology and J. P. Scott Center for Neuroscience, Mind and Behavior, Bowling Green State University, Bowling Green, OH 43403, USA
- Daniel D Wiegmann: Department of Biological Sciences and J. P. Scott Center for Neuroscience, Mind and Behavior, Bowling Green State University, Bowling Green, OH 43403, USA
- Douglas D Gaffin: Department of Biology, University of Oklahoma, Norman, OK 73019, USA
3. Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. PMID: 36515743; DOI: 10.1007/s00359-022-01599-2.
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorized (positional image matching). Central-place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest that are likely to be designed to systematically collect scene memories tagged with information provided by path integration on the direction of and the distance to the nest. Similarly, when traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
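The alignment matching described in this abstract can be sketched in a few lines: treat a panorama as a one-dimensional array of brightness values (one per azimuth) and rotate the current view until its pixel difference to a memorized view is minimal. This is only an illustrative toy, not the representation used in the reviewed work; the function names and sample skyline are assumptions.

```python
# Minimal sketch of alignment matching: recover heading by rotating the
# current panoramic view until it best matches a memorized view.
# Views are 1-D brightness lists (one value per azimuth); all names and
# data here are illustrative, not taken from the reviewed papers.

def image_difference(a, b):
    """Root-mean-square pixel difference between two equal-length views."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def best_alignment(memory, current):
    """Return the rotation (in pixels) that minimizes the image difference."""
    n = len(current)
    diffs = [image_difference(memory, current[r:] + current[:r]) for r in range(n)]
    return min(range(n), key=diffs.__getitem__)

# A memorized skyline, and the same view seen after turning by 3 pixels.
memory = [0, 1, 2, 5, 9, 5, 2, 1, 0, 0, 0, 0]
current = memory[-3:] + memory[:-3]     # rotated copy
print(best_alignment(memory, current))  # → 3
```

The same difference function underlies positional image matching: evaluated at nearby locations instead of rotations, its gradient points back toward the memorized position.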
4. Gaffin DD, Muñoz MG, Hoefnagels MH. Evidence of learning walks related to scorpion home burrow navigation. J Exp Biol 2022;225:275795. PMID: 35638243; PMCID: PMC9250797; DOI: 10.1242/jeb.243947.
Abstract
The navigation by chemo-textural familiarity hypothesis (NCFH) suggests that scorpions use their midventral pectines to gather chemical and textural information near their burrows and use this information as they subsequently return home. For NCFH to be viable, animals must somehow acquire home-directed ‘tastes’ of the substrate, such as through path integration (PI) and/or learning walks. We conducted laboratory behavioral trials using desert grassland scorpions (Paruroctonus utahensis). Animals reliably formed burrows in small mounds of sand we provided in the middle of circular, sand-lined behavioral arenas. We processed overnight infrared video recordings with a MATLAB script that tracked animal movements at 1–2 s intervals. In all, we analyzed the movements of 23 animals, representing nearly 1500 h of video recording. We found that once animals established their home burrows, they immediately made one to several short, looping excursions away from and back to their burrows before walking greater distances. We also observed similar excursions when animals made burrows in level sand in the middle of the arena (i.e. no mound provided). These putative learning walks, together with recently reported PI in scorpions, may provide the crucial home-directed information requisite for NCFH.
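One way to flag the short, looping excursions described in this abstract from tracked coordinates is to scan the path for bouts that leave a radius around the burrow and later return to it, recording each bout's farthest point. This is a hedged sketch of such an analysis, not the study's MATLAB pipeline; the radius threshold and toy track are assumptions.

```python
# Illustrative excursion detector: split a tracked path into bouts that
# leave a radius around the burrow and return home. The leave_radius and
# the toy track below are assumed values, not taken from the study.
import math

def excursions(track, burrow, leave_radius=5.0):
    """Return the maximum home distance reached during each completed bout."""
    bouts, current = [], None
    for x, y in track:
        d = math.hypot(x - burrow[0], y - burrow[1])
        if d > leave_radius:
            current = max(current or 0.0, d)  # track farthest point of this bout
        elif current is not None:
            bouts.append(current)             # animal came back: close the bout
            current = None
    return bouts

# A toy track: two short loops away from a burrow at (0, 0).
track = [(0, 0), (3, 0), (8, 0), (3, 0), (0, 0), (0, 6), (0, 9), (0, 2), (0, 0)]
print(excursions(track, (0, 0)))  # → [8.0, 9.0]
```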
Affiliation(s)
- Douglas D Gaffin: Department of Biology, University of Oklahoma, Norman, OK 73019, USA
- Maria G Muñoz: Department of Biology, University of Oklahoma, Norman, OK 73019, USA
- Mariëlle H Hoefnagels: Department of Microbiology and Plant Biology, University of Oklahoma, Norman, OK 73019, USA
5. Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021;16:055007. PMID: 34243169; DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal, or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low-resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route following approach that makes use of transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view matching process which can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet-based, oriented bandpass filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate using a vision system with a biologically relevant visual acuity and viewing direction.
Affiliation(s)
- J Stankiewicz: School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb: School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
6. Murray T, Zeil J. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes. PLoS One 2017;12:e0187226. PMID: 29088300; PMCID: PMC5663442; DOI: 10.1371/journal.pone.0187226. Open access.
Abstract
Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres, depending on the sensitivity of the viewer to image differences. The size and shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
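The catchment mapping described above reduces to a simple recipe: render a view at each sampled position, compare it with the reference view, and keep the positions whose image difference stays below a matching threshold. The toy single-landmark renderer and threshold below are assumptions standing in for the 3D-model rendering used in the paper; the sketch only illustrates that image differences grow smoothly with distance from the reference location.

```python
# Hedged sketch of mapping a 2-D 'catchment area' from image differences.
# view_at is a toy renderer (brightness pattern tied to the bearing of a
# single landmark at (10, 0)); names and thresholds are illustrative.
import math

def view_at(x, y, n=36):
    """Toy panoramic view at position (x, y): a cosine skyline whose phase
    follows the bearing of a landmark at (10, 0)."""
    bearing = math.atan2(0 - y, 10 - x)
    return [math.cos(2 * math.pi * i / n - bearing) for i in range(n)]

def rms_diff(a, b):
    """Root-mean-square pixel difference between two views."""
    return (sum((p - q) ** 2 for p, q in zip(a, b)) / len(a)) ** 0.5

reference = view_at(0, 0)  # snapshot memorized at the reference location

# Catchment area: grid positions whose view still matches the snapshot.
catchment = [(x, y) for x in range(-5, 6) for y in range(-5, 6)
             if rms_diff(view_at(x, y), reference) < 0.2]
```

Extending the grid to a third coordinate (height) turns the same computation into the catchment-volume mapping the abstract describes.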
Affiliation(s)
- Trevor Murray: Research School of Biology, Australian National University, Canberra, Australia
- Jochen Zeil: Research School of Biology, Australian National University, Canberra, Australia
7. Autonomous Visual Navigation of an Indoor Environment Using a Parsimonious, Insect Inspired Familiarity Algorithm. PLoS One 2016;11:e0153706. PMID: 27119720; PMCID: PMC4847926; DOI: 10.1371/journal.pone.0153706. Open access.
Abstract
The navigation of bees and ants from hive to food and back has captivated people for more than a century. Recently, the Navigation by Scene Familiarity Hypothesis (NSFH) has been proposed as a parsimonious approach that is congruent with the limited neural elements of these insects' brains. In the NSFH approach, an agent completes an initial training excursion, storing images along the way. To retrace the path, the agent scans the area and compares the current scenes to those previously experienced. By turning and moving to minimize the pixel-by-pixel differences between encountered and stored scenes, the agent is guided along the path without having memorized the sequence. An important premise of the NSFH is that the visual information of the environment is adequate to guide navigation without aliasing. Here we demonstrate that an image landscape of an indoor setting possesses ample navigational information. We produced a visual landscape of our laboratory and part of the adjoining corridor consisting of 2816 panoramic snapshots arranged in a grid at 12.7-cm centers. We show that pixel-by-pixel comparisons of these images yield robust translational and rotational visual information. We also produced a simple algorithm that tracks previously experienced routes within our lab based on an insect-inspired scene familiarity approach and demonstrate that adequate visual information exists for an agent to retrace complex training routes, including those where the path's end is not visible from its origin. We used this landscape to systematically test the interplay of sensor morphology, angles of inspection, and similarity threshold with the recapitulation performance of the agent. Finally, we compared the relative information content and chance of aliasing within our visually rich laboratory landscape to scenes acquired from indoor corridors with more repetitive scenery.
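The familiarity-following loop described in this abstract (scan, compare to stored scenes pixel by pixel, move to minimize the difference) can be sketched on a toy grid world: an agent stores views along a training route, then retraces it by always stepping toward the most familiar neighboring view. The sinusoidal 'scene', the grid world, and the visited-set bookkeeping are illustrative assumptions, not the paper's 2816-snapshot landscape or algorithm.

```python
# Toy sketch of scene-familiarity route following on a grid world.
# All names and the sinusoidal scene generator are illustrative assumptions.
import math

def view_at(x, y, n=8):
    """Toy view: n brightness values that vary smoothly with position."""
    return [math.sin(1.3 * x + 0.7 * y + i) for i in range(n)]

def unfamiliarity(view, memory):
    """Smallest pixel-by-pixel difference between a view and any stored view."""
    return min(sum((p - q) ** 2 for p, q in zip(view, m)) for m in memory)

def retrace(start, goal, memory, max_steps=20):
    """Greedy familiarity-following walk: step to the most familiar neighbor."""
    path, pos, visited = [start], start, {start}
    while pos != goal and len(path) <= max_steps:
        neighbours = [(pos[0] + dx, pos[1] + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if (pos[0] + dx, pos[1] + dy) not in visited]
        pos = min(neighbours, key=lambda p: unfamiliarity(view_at(*p), memory))
        visited.add(pos)
        path.append(pos)
    return path

route = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]  # training excursion
memory = [view_at(x, y) for x, y in route]        # stored snapshots, unordered
print(retrace(route[0], route[-1], memory))       # recapitulates the route
```

Note that `memory` is an unordered set of snapshots: as in the NSFH, the agent never memorizes the sequence, only the scenes, yet the route is recovered.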