1
Bertrand OJN, Sonntag A. The potential underlying mechanisms during learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37204434. DOI: 10.1007/s00359-023-01637-7.
Abstract
Hymenopterans, such as bees and wasps, have long fascinated researchers with the sinuous movements they perform at novel locations. These movements, such as loops, arcs, or zigzags, help the insects learn their surroundings at important locations and allow them to explore and orient themselves in their environment. Once they have gained experience with their environment, the insects fly along optimized paths guided by several strategies, such as path integration, local homing, and route-following, which together form a navigational toolkit. Whereas experienced insects combine these strategies efficiently, naive insects must first learn about their surroundings and tune the toolkit. We will see that the structure of the movements performed during learning flights leverages the robustness of certain strategies at a given spatial scale to tune other strategies that are more efficient at a larger scale. An insect can thus explore its environment incrementally without risking being unable to find its way back to essential locations.
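As a concrete anchor for one strategy in this toolkit, the sketch below shows a minimal form of path integration: accumulating self-motion into a home vector that always points back to the nest (an illustration only, not the authors' model; the update rule, units, and toy trajectory are assumptions).

    import numpy as np

    def update_home_vector(home, heading_rad, speed, dt):
        """Accumulate self-motion into the vector pointing back to the nest."""
        step = speed * dt * np.array([np.cos(heading_rad), np.sin(heading_rad)])
        return home - step  # the home vector changes opposite to travel

    # Toy outbound trip: east, then north, then back west.
    home = np.zeros(2)
    for heading, speed in [(0.0, 1.0), (np.pi / 2, 0.5), (np.pi, 1.0)]:
        home = update_home_vector(home, heading, speed, dt=1.0)
    print(home)  # direction and distance of the direct path home

Because errors in such a running estimate accumulate, the abstract's argument follows naturally: strategies that are robust at one scale can be used to tune others that are more efficient at larger scales.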
Affiliation(s)
- Olivier J N Bertrand
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615, Bielefeld, NRW, Germany.
- Annkathrin Sonntag
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615, Bielefeld, NRW, Germany
2
Beetz MJ, El Jundi B. The influence of stimulus history on directional coding in the monarch butterfly brain. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37095358. DOI: 10.1007/s00359-023-01633-x.
Abstract
The central complex is a region of the insect brain that houses a neural network specialized to encode directional information. Directional coding has traditionally been investigated with compass cues that revolve in full rotations around the insect's head at constant angular velocities. However, these stimulus conditions do not fully simulate an insect's sensory experience of compass cues during navigation. In nature, insect flight is characterized by abrupt changes in movement direction as well as continuous changes in velocity. How such varying cue dynamics influence compass coding remains unclear. We performed long-term tetrode recordings from the brain of monarch butterflies to study how central complex neurons respond to different stimulus velocities and directions. Because these butterflies derive directional information from the sun during migration, we measured the neural response to a virtual sun. The virtual sun was either presented as a spot appearing at random angular positions or rotated around the butterfly at different angular velocities and directions. By specifically manipulating the stimulus velocity and trajectory, we dissociated the influences of angular velocity and direction on compass coding. While the angular velocity substantially affected the directedness of the tuning, the stimulus trajectory influenced the shape of the angular tuning curve. Taken together, our results suggest that the central complex flexibly adjusts its directional coding to the current stimulus dynamics, ensuring a precise compass signal even under highly demanding conditions, such as during rapid flight maneuvers.
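The "directedness of the tuning" mentioned here is commonly quantified as the mean resultant vector length of a neuron's angular tuning curve; the sketch below shows that standard computation (our illustration, not the authors' analysis code; the toy tuning curve is an assumption).

    import numpy as np

    def tuning_directedness(angles_deg, firing_rates):
        """Mean resultant vector length: 0 for flat tuning, 1 for perfect directedness."""
        angles = np.deg2rad(np.asarray(angles_deg))
        rates = np.asarray(firing_rates, dtype=float)
        # Treat the tuning curve as a weighted distribution of spike directions.
        x = np.sum(rates * np.cos(angles))
        y = np.sum(rates * np.sin(angles))
        return np.hypot(x, y) / np.sum(rates)

    # Toy neuron firing most strongly when the virtual sun sits at 90 deg.
    angles = np.arange(0, 360, 30)
    rates = 5 + 20 * np.exp(np.cos(np.deg2rad(angles - 90)) - 1)
    print(tuning_directedness(angles, rates))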
Affiliation(s)
- M Jerome Beetz
- Zoology II, Biocenter, University of Würzburg, Würzburg, Germany.
- Basil El Jundi
- Zoology II, Biocenter, University of Würzburg, Würzburg, Germany
- Animal Physiology, Department of Biology, Norwegian University of Science and Technology, Trondheim, Norway
3
Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568. DOI: 10.1007/s00359-022-01610-w.
Abstract
Optic flow, i.e., the displacement of the retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings through several hundred metres up to kilometres, is necessary to mediate behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations, and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies, which largely separate translations from rotations through a saccadic flight and gaze mode; only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic map of the spatial proximity of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge is that the distance information extracted from the optic flow is not unambiguous: distances are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
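The speed scaling noted at the end follows directly from the translational flow field: a stationary point at distance d and bearing theta from the direction of travel moves across the retina at omega = (v/d) sin(theta), so flow alone constrains only the ratio d/v. A minimal sketch (our illustration; it assumes straight-line translation past a stationary point):

    import numpy as np

    def apparent_angular_velocity(v, d, theta):
        """Retinal angular velocity (rad/s) of a stationary point during pure
        translation at speed v (m/s), seen at distance d (m) and bearing
        theta (rad) from the direction of travel."""
        return (v / d) * np.sin(theta)

    # The inverse problem exposes the ambiguity: flow yields only d / v.
    v, d, theta = 1.0, 2.0, np.pi / 2
    omega = apparent_angular_velocity(v, d, theta)
    print(d / v, np.sin(theta) / omega)  # identical: distance is speed-scaled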
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615, Bielefeld, Germany.
4
Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. PMID: 36515743. DOI: 10.1007/s00359-022-01599-2.
Abstract
Panoramic views offer visually navigating animals information on heading direction and on location. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of how images are represented. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and gradient descent on image differences can lead an animal back to the location at which a view was memorized (positional image matching). Central-place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest, which are likely designed to systematically collect scene memories tagged with path-integration information on the direction of, and distance to, the nest. Equally, when traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the insect brain, how they are acquired, and how they are subsequently used for traveling along routes and for pinpointing places.
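Alignment matching as described above is often formalized as a rotational image difference function (RIDF): the current panoramic view is rotated in azimuth against the memorized view, and the best-matching rotation recovers the stored heading. A minimal sketch (our illustration, not the review's code; the toy panorama and one-degree azimuth resolution are assumptions):

    import numpy as np

    def ridf(current, memory):
        """Rotational image difference function: RMS pixel difference between
        the memorized view and the current view rotated in azimuth."""
        n_cols = current.shape[1]  # one column per degree of azimuth
        diffs = np.empty(n_cols)
        for shift in range(n_cols):
            rotated = np.roll(current, shift, axis=1)
            diffs[shift] = np.sqrt(np.mean((rotated - memory) ** 2))
        return diffs

    rng = np.random.default_rng(0)
    memory = rng.random((30, 360))           # toy panoramic snapshot
    current = np.roll(memory, -42, axis=1)   # same place, body turned by 42 deg
    print(np.argmin(ridf(current, memory)))  # -> 42, the heading correction

Positional image matching builds on the same measure: the minimum of the image difference deepens as the animal approaches the place where the view was stored, so descending that gradient leads back to it.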
5
Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M. Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 2022; 225:jeb243021. PMID: 35067721. PMCID: PMC8920035. DOI: 10.1242/jeb.243021.
Abstract
Insects are remarkable flyers, capable of navigating through highly cluttered environments. We tracked the head and thorax of bumblebees flying freely in a tunnel containing vertically oriented obstacles to uncover the sensorimotor strategies used for obstacle detection and collision avoidance. Bumblebees displayed all the characteristics of active vision during flight, stabilizing their head relative to the external environment and maintaining close alignment between their gaze and flightpath. Head stabilization increased the motion contrast of nearby features against the background, enabling obstacle detection. As bees approached obstacles, they appeared to modulate their avoidance responses based on the relative retinal expansion velocity (RREV) of the obstacles, and their maximum evasion acceleration was linearly related to RREVmax. Finally, bees prevented collisions through rapid roll manoeuvres implemented by their thorax. Overall, this combination of visuo-motor strategies highlights the elegant solutions insects have developed for visually guided flight through cluttered environments.
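For a frontal obstacle approached head-on, RREV follows directly from the viewing geometry: it is the expansion rate of the obstacle's retinal image divided by its current angular size (the inverse of the time to contact). A minimal sketch (our illustration, not the paper's analysis code; the obstacle geometry and units are assumptions):

    import numpy as np

    def rrev(distance, half_width, speed):
        """Relative retinal expansion velocity (1/s) of a frontal obstacle of
        half-width half_width (m) at distance (m), approached at speed (m/s)."""
        theta = 2 * np.arctan(half_width / distance)  # retinal angular size
        theta_dot = 2 * half_width * speed / (distance**2 + half_width**2)
        return theta_dot / theta

    # RREV grows steeply on approach, a useful trigger for evasive manoeuvres.
    for d in (1.0, 0.5, 0.25):
        print(d, rrev(d, half_width=0.05, speed=1.0))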
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2600, Australia
- Author for correspondence
- Tim Siesenop
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Olivier J. Bertrand
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Liang Li
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, University of Konstanz, 78464 Konstanz, Germany
- Centre for the Advanced Study of Collective Behaviour, University of Konstanz, 78464 Konstanz, Germany
- Department of Biology, University of Konstanz, 78464 Konstanz, Germany
- Charlotte Doussot
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- William H. Warren
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, RI 02912, USA
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
6
Straw AD. Review of methods for animal videography using camera systems that automatically move to follow the animal. Integr Comp Biol 2021; 61:917-925. PMID: 34117754. DOI: 10.1093/icb/icab126.
Abstract
Digital photography and videography provide rich data for the study of animal behavior and are consequently widely used techniques. For fixed, unmoving cameras, the image sensor and optics govern the field of view and the spatial detail obtained. For a given sensor resolution, the optics determine a tradeoff: high magnification yields high spatial detail from a restricted field of view, whereas low magnification yields lower spatial detail from a larger region. In addition to this geometric resolution-versus-field-of-view tradeoff, limited light availability sets a physical limit when imaging movement: if the animal is moving, motion blur smears the subject on the sensor during the exposure. In practice, motion blur is further compounded by sensor inefficiency and noise. While these fundamental tradeoffs of stationary cameras can be sidestepped by employing multiple cameras and providing additional illumination, this may not always be desirable. An alternative that overcomes these issues is to keep a high-magnification camera directed at the animal as it moves. Here, we review systems in which automatic tracking is used to maintain an animal in the working volume of a moving optical path. Such methods escape the tradeoff between resolution and field of view and also reduce motion blur, while still enabling automated image acquisition. We argue that further development will be useful and outline potential innovations that may improve the technology and lead to more widespread use.
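The closed-loop principle common to such systems can be reduced to a feedback controller that re-aims the optics from the animal's offset in the image. The sketch below is a deliberately simplified proportional controller (our illustration only; real systems add target prediction and latency compensation, and all names and parameters here are assumptions):

    def pan_tilt_update(target_px, image_size, fov_deg, gain=0.5):
        """Pan/tilt corrections (deg) from the pixel offset of the detected
        animal relative to the image centre."""
        cx, cy = image_size[0] / 2, image_size[1] / 2
        deg_per_px = fov_deg / image_size[0]  # assumes square pixels
        d_pan = gain * (target_px[0] - cx) * deg_per_px
        d_tilt = gain * (target_px[1] - cy) * deg_per_px
        return d_pan, d_tilt

    # Animal detected at (800, 450) in a 1280x720 frame through a 10 deg lens:
    print(pan_tilt_update((800, 450), (1280, 720), fov_deg=10.0))

Run at frame rate, such a correction loop keeps the animal near the optical axis, which is what lets a high-magnification camera follow a moving subject without sacrificing field of view.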
Affiliation(s)
- Andrew D Straw
- Institute of Biology I and Bernstein Center Freiburg, Faculty of Biology, Albert-Ludwigs-Universität Freiburg, Germany
7
Odenthal L, Doussot C, Meyer S, Bertrand OJN. Analysing head-thorax choreography during free-flights in bumblebees. Front Behav Neurosci 2021; 14:610029. PMID: 33510626. PMCID: PMC7835495. DOI: 10.3389/fnbeh.2020.610029.
Abstract
Animals coordinate their various body parts, sometimes in elaborate ways, to swim, walk, climb, fly, and navigate their environment. The coordination of body parts is essential to behaviors such as chasing, escaping, landing, and the extraction of relevant information. For example, by shaping the movement of the head and body in an active and controlled manner, flying insects structure their flights to facilitate the acquisition of distance information: they condense their turns into short periods of time (saccades) interspersed with relatively long translations (intersaccades). However, due to technological limitations, the precise coordination of the head and thorax during insects' free flight has remained unclear. Here, we propose methods to analyse the orientation of the head and thorax of bumblebees (Bombus terrestris), to segregate the trajectories of flying insects into saccades and intersaccades using supervised machine learning (ML) techniques, and to analyse the coordination between head and thorax using artificial neural networks (ANNs). The ML-based segregation of flights into saccades and intersaccades, based on thorax angular velocities, decreased misclassification by 12% compared with classically used methods. Our results demonstrate how machine learning techniques can improve the analysis of insect flight structure and shed light on the complexity of head-body coordination. We anticipate that our approach will be a starting point for more sophisticated experiments and analyses of freely flying insects. For example, the coordination of head and body movements during collision avoidance, chasing behavior, or the negotiation of gaps could be investigated by monitoring the head and thorax orientations of freely flying insects within and across behavioral tasks, and in different species.
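The segmentation step can be pictured as follows: each time step is labelled saccade or intersaccade from features of the thorax angular velocity by a supervised classifier. The sketch below is our illustration only; the paper's actual features, classifier, and data differ, and the toy signal and helper names are assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def windowed_features(yaw_velocity, win=5):
        """Per-sample features from a sliding window of thorax angular
        velocity: local mean, absolute peak, and variance."""
        padded = np.pad(yaw_velocity, win, mode="edge")
        return np.array([
            [w.mean(), np.abs(w).max(), w.var()]
            for w in (padded[i : i + 2 * win + 1] for i in range(len(yaw_velocity)))
        ])

    # Toy data: brief high-angular-velocity bursts (saccades) on a slow baseline.
    rng = np.random.default_rng(1)
    yaw = rng.normal(0.0, 20.0, 1000)        # deg/s, intersaccadic noise
    labels = np.zeros(1000, dtype=int)
    for start in range(100, 1000, 200):      # inject saccade-like bursts
        yaw[start : start + 10] += 800.0
        labels[start : start + 10] = 1

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    X = windowed_features(yaw)
    clf.fit(X[:800], labels[:800])
    print((clf.predict(X[800:]) == labels[800:]).mean())  # held-out accuracy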
Affiliation(s)
- Stefan Meyer
- Department of Informatics, University of Sussex, Brighton, United Kingdom