1. Brebner JS, Loconsole M, Hanley D, Vasas V. Through an animal's eye: the implications of diverse sensory systems in scientific experimentation. Proc Biol Sci 2024; 291:20240022. [PMID: 39016597] [PMCID: PMC11253838] [DOI: 10.1098/rspb.2024.0022]
Abstract
'Accounting for the sensory abilities of animals is critical in experimental design.' No researcher would disagree with this statement, yet we often inadvertently fall for anthropocentric biases and use ourselves as the reference point. This paper discusses the risks of adopting an anthropocentric view when working with non-human animals, and the unintended consequences this has for our experimental designs and results. To this end, we provide general examples of anthropocentric bias from different fields of animal research, with a particular focus on animal cognition and behaviour, and lay out the potential consequences of adopting a human-based perspective. Knowledge of the sensory abilities of the investigated species, both in terms of their similarities to humans and their peculiarities, is crucial to ensure solid conclusions. A more careful consideration of the diverse sensory systems of animals would improve many scientific fields and enhance animal welfare in the laboratory.
Affiliation(s)
- Joanna S. Brebner: Research Centre on Animal Cognition (CRCA), Centre for Integrative Biology (CBI), CNRS, University Paul Sabatier – Toulouse III, Toulouse, France
- Maria Loconsole: School of Biological and Behavioural Sciences, Queen Mary University of London, London, UK; Department of General Psychology, University of Padova, Padova, Italy
- Daniel Hanley: Department of Biology, George Mason University, Fairfax, VA, USA
- Vera Vasas: School of Life Sciences, University of Sussex, Brighton BN1 9RH, UK
2. Zeil J. Views from 'crabworld': the spatial distribution of light in a tropical mudflat. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023; 209:859-876. [PMID: 37460846] [PMCID: PMC10643439] [DOI: 10.1007/s00359-023-01653-7]
Abstract
Natural scene analysis has been extensively used to understand how the invariant structure of the visual environment may have shaped biological image processing strategies. This paper deals with four crucial, but hitherto largely neglected aspects of natural scenes: (1) the viewpoint of specific animals; (2) the fact that image statistics are not independent of the position within the visual field; (3) the influence of the direction of illumination on luminance, spectral and polarization contrast in a scene; and (4) the biologically relevant information content of natural scenes. To address these issues, I recorded the spatial distribution of light in a tropical mudflat with a spectrographic imager equipped with a polarizing filter, in an attempt to describe quantitatively the visual environment of fiddler crabs. The environment viewed by the crabs has a distinct structure. Depending on the position of the sun, the luminance, spectral composition and polarization characteristics of the horizontal light distribution are not uniform. This is true both for skylight and for reflections from the mudflat surface. The high-contrast line of the horizon dominates the vertical distribution of light and marks a discontinuity in luminance, spectral distribution and image statistics. On a clear day, skylight intensity increases towards the horizon due to multiple scattering, and its spectral composition increasingly resembles that of sunlight. Sky-substratum contrast is highest at short wavelengths. I discuss the consequences of this extreme example of the topography of vision for extracting biologically relevant information from natural scenes.
Affiliation(s)
- Jochen Zeil: Research School of Biology, Australian National University, P.O. Box 475, Canberra, ACT 2601, Australia
3. Pfeiffer K. The neuronal building blocks of the navigational toolkit in the central complex of insects. Curr Opin Insect Sci 2023; 55:100972. [PMID: 36126877] [DOI: 10.1016/j.cois.2022.100972]
Abstract
The central complex in the brain of insects is a group of midline-spanning neuropils at the interface between sensory and premotor tasks of the brain. It is involved in sleep control, decision-making and most prominently in goal-directed locomotion behaviors. The recently published connectome of the central complex of Drosophila melanogaster is a milestone in understanding the intricacies of the central-complex circuits and will provide inspiration for testable hypotheses for the coming years. Here, I provide a basic neuroanatomical description of the central complex of Drosophila and other species and discuss some recent advancements, some of which, such as the discovery of coordinate transformation through vector math, have been predicted from connectomics data.
Affiliation(s)
- Keram Pfeiffer: Behavioural Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, 97074 Würzburg, Germany
4. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 36609568] [DOI: 10.1007/s00359-022-01610-w]
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings up to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent distances unambiguously; rather, distances are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
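The speed-scaling ambiguity described in this abstract can be made concrete with a short numerical sketch. The relation used here, angular flow = v·sin(θ)/d for a point at distance d and bearing θ from the direction of travel, is the standard geometry of translational optic flow; the specific numbers are illustrative, not taken from the paper:

```python
import math

def translational_optic_flow(v, d, theta_deg=90.0):
    """Angular velocity (rad/s) of a point at distance d (m), at bearing
    theta_deg from the travel direction, for an observer moving at v (m/s)."""
    return v * math.sin(math.radians(theta_deg)) / d

# The same retinal flow arises from very different speed/distance pairs:
slow_near = translational_optic_flow(v=0.5, d=0.5)  # slow flight past a near object
fast_far  = translational_optic_flow(v=5.0, d=5.0)  # fast flight past a far object
assert math.isclose(slow_near, fast_far)  # identical flow: distance is speed-scaled
```

Without an independent estimate of its own speed, the animal can recover only the ratio d/v (sometimes called "relative nearness"), which is exactly the ambiguity the review discusses.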
Affiliation(s)
- Martin Egelhaaf: Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
5. Krongauz DL, Lazebnik T. Collective evolution learning model for vision-based collective motion with collision avoidance. PLoS One 2023; 18:e0270318. [PMID: 37163523] [PMCID: PMC10171646] [DOI: 10.1371/journal.pone.0270318]
Abstract
Collective motion (CM) takes many forms in nature: schools of fish, flocks of birds, and swarms of locusts, to name a few. Commonly, during CM the individuals of the group avoid collisions. These CM and collision avoidance (CA) behaviors are driven by input from the environment, such as smell, air pressure, and vision, which the individual processes to determine its actions. In this work, a novel vision-based CM with CA model (VCMCA), simulating a collective evolution learning process, is proposed. In this setting, a learning agent obtains a visual signal about its environment and, through trial-and-error over multiple attempts, learns to perform local CM with CA, which emerges into global CM with CA dynamics. The proposed algorithm was evaluated in the case of locust swarms, showing how these behaviors evolve in a swarm from the learning process of the individuals within it. Thus, this work proposes a biologically inspired learning process for obtaining multi-agent multi-objective dynamics.
Affiliation(s)
- David L Krongauz: Department of Computer Science, Bar-Ilan University, Ramat-Gan, Israel
- Teddy Lazebnik: Department of Cancer Biology, Cancer Institute, University College London, London, United Kingdom
6
|
Franzke M, Kraus C, Gayler M, Dreyer D, Pfeiffer K, el Jundi B. Stimulus-dependent orientation strategies in monarch butterflies. J Exp Biol 2022; 225:274064. [PMID: 35048981 PMCID: PMC8918799 DOI: 10.1242/jeb.243687] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2021] [Accepted: 01/12/2022] [Indexed: 11/20/2022]
Abstract
Insects are well known for their ability to keep track of their heading direction based on a combination of skylight cues and visual landmarks. This allows them to navigate back to their nest, disperse throughout unfamiliar environments, and migrate over large distances between their breeding and non-breeding habitats. The monarch butterfly (Danaus plexippus), for instance, is known for its annual southward migration from North America to certain trees in Central Mexico. To maintain a constant flight route, these butterflies use a time-compensated sun compass, which is processed in a brain region termed the central complex. However, to successfully complete their journey, the butterflies' brain must generate a multitude of orientation strategies, allowing them to switch dynamically from sun-compass orientation to a tactic behavior toward a certain target. To study whether monarch butterflies exhibit different orientation modes and whether they can switch between them, we observed the orientation behavior of tethered flying butterflies in a flight simulator while presenting different visual cues to them. We found that the butterflies' behavior depended on the presented visual stimulus: while a dark stripe was used for flight stabilization, a bright stripe was fixated in the frontal visual field. When we replaced the bright stripe with a simulated sun stimulus, the butterflies switched their behavior and exhibited compass orientation. Taken together, our data show that monarch butterflies rely on and switch between different orientation modes, allowing the animal to adjust its orientation to its current behavioral demands.
Affiliation(s)
- Myriam Franzke: University of Wuerzburg, Biocenter, Zoology II, Würzburg, Germany
- Christian Kraus: University of Wuerzburg, Biocenter, Zoology II, Würzburg, Germany
- Maria Gayler: University of Wuerzburg, Biocenter, Zoology II, Würzburg, Germany
- David Dreyer: Lund University, Department of Biology, Lund Vision Group, Lund, Sweden
- Keram Pfeiffer: University of Wuerzburg, Biocenter, Zoology II, Würzburg, Germany
- Basil el Jundi: University of Wuerzburg, Biocenter, Zoology II, Würzburg, Germany
7. Miles J, Vowles AS, Kemp PS. The response of common minnows, Phoxinus phoxinus, to visual cues under flowing and static water conditions. Anim Behav 2021. [DOI: 10.1016/j.anbehav.2021.07.004]
8. Verbe A, Varennes LP, Vercher JL, Viollet S. How do hoverflies use their righting reflex? J Exp Biol 2020; 223:jeb215327. [PMID: 32527962] [DOI: 10.1242/jeb.215327]
Abstract
When taking off from a sloping surface, flies have to reorient themselves dorsoventrally and stabilize their body by actively controlling their flapping wings. We have observed that righting is achieved solely by performing a rolling manoeuvre. How flies manage to do this has not yet been elucidated. It was observed here for the first time that hoverfly reorientation is entirely achieved within 6 wingbeats (48.8 ms) at angular roll velocities of up to 10 × 10³ deg s⁻¹ and that the onset of their head rotation consistently follows that of their body rotation after a time lag of 16 ms. The insects' body roll was found to be triggered by the asymmetric wing stroke amplitude, as expected. The righting process starts immediately with the first wingbeat and seems unlikely to depend on visual feedback. A dynamic model for the fly's righting reflex is presented, which accounts for the head/body movements and the time lag recorded in these experiments. This model consists of a closed-loop control of the body roll, combined with a feedforward control of the head/body angle. During the righting manoeuvre, a strong coupling seems to exist between the activation of the halteres (which measure the body's angular speed) and the gaze stabilization reflex. These findings again confirm the fundamental role played by the halteres in both body and head stabilization processes.
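The control architecture the abstract describes, closed-loop body-roll control plus a feedforward head command lagging the body, can be sketched in a few lines. Only the 16 ms head/body lag and the ~60 ms righting duration come from the abstract; the proportional control law and its gain are illustrative stand-ins, not the paper's fitted model:

```python
DT = 0.001        # simulation step: 1 ms
HEAD_LAG = 0.016  # head rotation onset lags body rotation by ~16 ms (from the study)
K_BODY = 120.0    # proportional gain driving body roll to zero (illustrative)

def simulate_righting(initial_roll_deg=180.0, duration=0.06):
    """Closed-loop body-roll correction with a delayed feedforward head command."""
    body = [initial_roll_deg]
    t = 0.0
    while t < duration:
        roll = body[-1]
        roll_rate = -K_BODY * roll       # closed loop: steer the roll error to zero
        body.append(roll + roll_rate * DT)
        t += DT
    # feedforward head command: the head replays the body trajectory with a fixed lag
    lag_steps = int(HEAD_LAG / DT)
    head = [body[0]] * lag_steps + body[:len(body) - lag_steps]
    return body, head

body, head = simulate_righting()
assert abs(body[-1]) < 1.0  # roll largely corrected within the ~60 ms window
```

Even this toy version reproduces the qualitative signature reported in the study: the head trace is a time-shifted copy of the body trace, so head and body roll look coupled but offset by the lag.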
Affiliation(s)
- Anna Verbe: Institute of Movement Sciences, Biorobotics Department, Aix-Marseille Université, CNRS, ISM, Marseille cedex 09, France
- Léandre P Varennes: Institute of Movement Sciences, Biorobotics Department, Aix-Marseille Université, CNRS, ISM, Marseille cedex 09, France
- Jean-Louis Vercher: Institute of Movement Sciences, Biorobotics Department, Aix-Marseille Université, CNRS, ISM, Marseille cedex 09, France
- Stéphane Viollet: Institute of Movement Sciences, Biorobotics Department, Aix-Marseille Université, CNRS, ISM, Marseille cedex 09, France
9. Baden T, Euler T, Berens P. Understanding the retinal basis of vision across species. Nat Rev Neurosci 2019; 21:5-20. [PMID: 31780820] [DOI: 10.1038/s41583-019-0242-1]
Abstract
The vertebrate retina first evolved some 500 million years ago in ancestral marine chordates. Since then, the eyes of different species have been tuned to best support their unique visuoecological lifestyles. Visual specializations in eye designs, large-scale inhomogeneities across the retinal surface and local circuit motifs mean that all species' retinas are unique. Computational theories, such as the efficient coding hypothesis, have come a long way towards an explanation of the basic features of retinal organization and function; however, they cannot explain the full extent of retinal diversity within and across species. To build a truly general understanding of vertebrate vision and the retina's computational purpose, it is therefore important to more quantitatively relate different species' retinal functions to their specific natural environments and behavioural requirements. Ultimately, the goal of such efforts should be to build up to a more general theory of vision.
Affiliation(s)
- Tom Baden: Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton, UK; Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany
- Thomas Euler: Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
- Philipp Berens: Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany; Institute for Bioinformatics and Medical Informatics, University of Tübingen, Tübingen, Germany; Bernstein Centre for Computational Neuroscience, University of Tübingen, Tübingen, Germany
10.
Abstract
Navigation is an essential skill for many animals, and understanding how animals use environmental information, particularly visual information, to navigate has a long history in both ethology and psychology. In birds, the dominant approach for investigating navigation at small scales comes from comparative psychology, which emphasizes the cognitive representations underpinning spatial memory. The majority of this work is laboratory-based, and it is unclear whether this context itself affects the information that birds learn and use when they search for a location. Data from hummingbirds suggest that birds in the wild might use visual information in quite a different manner. To reconcile these differences, here we propose a new approach to avian navigation, inspired by the sensory-driven study of navigation in insects. Using methods devised for studying insect navigation, it is possible to quantify the visual information available to navigating birds, and then to determine how this information influences those birds' navigation decisions. Focusing on four areas that we consider characteristic of the insect navigation perspective, we discuss how this approach has shed light on the information insects use to navigate, and assess the prospects of taking a similar approach with birds. Although birds and insects differ in many ways, nothing in the insect-inspired approach we describe restricts these methods to insects. On the contrary, adopting such an approach could provide a fresh perspective on the well-studied question of how birds navigate through a variety of environments.
Affiliation(s)
- Susan D Healy: School of Biology, University of St Andrews, Fife, UK
11.
Abstract
The use of vision to coordinate behavior requires an efficient control design that stabilizes the world on the retina or directs the gaze towards salient features in the surroundings. With a level gaze, visual processing tasks are simplified and behaviorally relevant features from the visual environment can be extracted. No matter how simple or sophisticated the eye design, mechanisms have evolved across phyla to stabilize gaze. In this review, we describe functional similarities in eyes and gaze stabilization reflexes, emphasizing their fundamental role in transforming sensory information into motor commands that support postural and locomotor control. We then focus on gaze stabilization design in flying insects and detail some of the underlying principles. Systems analysis reveals that gaze stabilization often involves several sensory modalities, including vision itself, and makes use of feedback as well as feedforward signals. Independent of phylogenetic distance, the physical interaction between an animal and its natural environment - its available senses and how it moves - appears to shape the adaptation of all aspects of gaze stabilization.
Affiliation(s)
- Ben J Hardcastle: Department of Bioengineering, Imperial College London, South Kensington Campus, London, SW7 2AZ, UK
- Holger G Krapp: Department of Bioengineering, Imperial College London, South Kensington Campus, London, SW7 2AZ, UK
12. Ros IG, Biewener AA. Pigeons (C. livia) Follow Their Head during Turning Flight: Head Stabilization Underlies the Visual Control of Flight. Front Neurosci 2017; 11:655. [PMID: 29249929] [PMCID: PMC5717024] [DOI: 10.3389/fnins.2017.00655]
Abstract
Similar flight control principles operate across insect and vertebrate fliers. These principles indicate that robust solutions have evolved to meet complex behavioral challenges. Following from studies of visual and cervical feedback control of flight in insects, we investigate the role of head stabilization in providing feedback cues for controlling turning flight in pigeons. Based on previous observations that the eyes of pigeons remain at relatively fixed orientations within the head during flight, we test potential sensory control inputs derived from head and body movements during 90° aerial turns. We observe that periods of angular head stabilization alternate with rapid head repositioning movements (head saccades), and confirm that control of head motion is decoupled from aerodynamic and inertial forces acting on the bird's continuously rotating body during turning flapping flight. Visual cues inferred from head saccades correlate with changes in flight trajectory, whereas the magnitude of neck bending predicts angular changes in body position. The control of head motion to stabilize a pigeon's gaze may therefore facilitate extraction of important motion cues, in addition to offering mechanisms for controlling body and wing movements. Strong similarities between the sensory flight control of birds and insects may also inspire novel designs of robust controllers for human-engineered autonomous aerial vehicles.
Affiliation(s)
- Ivo G Ros: Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA, United States; Division of Biology and Bioengineering, California Institute of Technology, Pasadena, CA, United States
- Andrew A Biewener: Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA, United States
13. The brain during free movement - What can we learn from the animal model. Brain Res 2017; 1716:3-15. [PMID: 28893579] [DOI: 10.1016/j.brainres.2017.09.003]
Abstract
Animals, just like humans, can freely move. They do so for various important reasons, such as finding food and escaping predators. Observing these behaviors can inform us about the underlying cognitive processes. In addition, while humans can convey complicated information easily through speaking, animals need to move their bodies to communicate. This has prompted many creative solutions by animal neuroscientists to enable studying the brain during movement. In this review, we first summarize how animal researchers record from the brain while an animal is moving, by describing the most common neural recording techniques in animals and how they were adapted to record during movement. We further discuss the challenge of controlling or monitoring sensory input during free movement. However, not only is free movement a necessity to reflect the outcome of certain internal cognitive processes in animals, it is also a fascinating field of research since certain crucial behavioral patterns can only be observed and studied during free movement. Therefore, in a second part of the review, we focus on some key findings in animal research that specifically address the interaction between free movement and brain activity. First, focusing on walking as a fundamental form of free movement, we discuss how important such intentional movements are for understanding processes as diverse as spatial navigation, active sensing, and complex motor planning. Second, we propose the idea of regarding free movement as the expression of a behavioral state. This view can help to understand the general influence of movement on brain function. Together, the technological advancements towards recording from the brain during movement, and the scientific questions asked about the brain engaged in movement, make animal research highly valuable to research into the human "moving brain".
14. Serres JR, Ruffier F. Optic flow-based collision-free strategies: from insects to robots. Arthropod Struct Dev 2017; 46:703-717. [PMID: 28655645] [DOI: 10.1016/j.asd.2017.06.003]
Abstract
Flying insects are able to fly smartly in unpredictable environments. Their tiny brains contain neurons that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff and landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without requiring any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate rotations at the eye level in roll and yaw, respectively (i.e., they cancel rotational optic flow), ensuring purely translational optic flow between two successive saccades. Our survey focuses on the feedback loops based on translational optic flow that insects employ for collision-free navigation. Optic flow is likely, over the next decade, to be one of the most important visual cues for explaining flying insects' behaviors during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
15. Goulard R, Vercher JL, Viollet S. To crash or not to crash: how do hoverflies cope with free-fall situations and weightlessness? J Exp Biol 2016; 219:2497-503. [DOI: 10.1242/jeb.141150]
Abstract
Insects' aptitude for performing hovering, automatic landing and tracking tasks involves accurately controlling their head and body roll and pitch movements, but whether this attitude control depends on an internal estimate of gravity orientation is still an open question. Gravity perception in flying insects has mainly been studied in terms of grounded animals' tactile orientation responses, but it has not yet been established whether hoverflies use gravity perception cues to detect a nearly weightless state at an early stage. Ground-based microgravity simulators provide biologists with useful tools for studying the effects of changes in gravity. However, in view of the cost and complexity of these set-ups, an alternative Earth-based free-fall procedure was developed with which flying insects can be briefly exposed to microgravity under various visual conditions. Hoverflies frequently initiated wingbeats in response to an imposed free fall in all the conditions tested, but managed to avoid crashing only in variably structured visual environments, and only episodically in darkness. The crash-avoidance performance of these insects across visual environments suggests the existence of a multisensory control system based mainly on vision rather than gravity perception.
Affiliation(s)
- Roman Goulard: Aix-Marseille Université, CNRS, Institute of Movement Science, UMR 7287, Marseille 13288, France
- Jean-Louis Vercher: Aix-Marseille Université, CNRS, Institute of Movement Science, UMR 7287, Marseille 13288, France
- Stéphane Viollet: Aix-Marseille Université, CNRS, Institute of Movement Science, UMR 7287, Marseille 13288, France
16. Katz HK, Lustig A, Lev-Ari T, Nov Y, Rivlin E, Katzir G. Eye movements in chameleons are not truly independent - evidence from simultaneous monocular tracking of two targets. J Exp Biol 2015; 218:2097-105. [PMID: 26157161] [DOI: 10.1242/jeb.113084]
Abstract
Chameleons perform large-amplitude eye movements that are frequently referred to as independent, or disconjugate. When prey (an insect) is detected, the chameleon's eyes converge to view it binocularly and 'lock' in their sockets so that subsequent visual tracking is by head movements. However, the extent of the eyes' independence is unclear. For example, can a chameleon visually track two small targets simultaneously and monocularly, i.e. one with each eye? This is of special interest because eye movements in ectotherms and birds are frequently independent, with optic nerves that are fully decussated and intertectal connections that are not as developed as in mammals. Here, we demonstrate that chameleons presented with two small targets moving in opposite directions can perform simultaneous, smooth, monocular, visual tracking. To our knowledge, this is the first demonstration of such a capacity. The fine patterns of the eye movements in monocular tracking were composed of alternating, longer, 'smooth' phases and abrupt 'step' events, similar to smooth pursuits and saccades. Monocular tracking differed significantly from binocular tracking with respect to both 'smooth' phases and 'step' events. We suggest that in chameleons, eye movements are not simply 'independent'. Rather, at the gross level, eye movements are (i) disconjugate during scanning, (ii) conjugate during binocular tracking and (iii) disconjugate, but coordinated, during monocular tracking. At the fine level, eye movements are disconjugate in all cases. These results support the view that in vertebrates, basic monocular control is under a higher level of regulation that dictates the eyes' level of coordination according to context.
Collapse
Affiliation(s)
- Hadas Ketter Katz
- Department of Neurobiology, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel
| | - Avichai Lustig
- Department of Neurobiology, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel
| | - Tidhar Lev-Ari
- Department of Evolutionary and Environmental Biology, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel
| | - Yuval Nov
- Department of Statistics, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel
| | - Ehud Rivlin
- Faculty of Computer Sciences, Technion - Israel Institute of Technology, Technion City, Haifa 3200003, Israel
| | - Gadi Katzir
- Department of Evolutionary and Environmental Biology, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel; Department of Marine Biology, University of Haifa, 199 Aba Khoushy Ave., Mount Carmel, Haifa 3498838, Israel
| |
Collapse
|
17
|
Raderschall CA, Narendra A, Zeil J. Head roll stabilisation in the nocturnal bull ant Myrmecia pyriformis: implications for visual navigation. J Exp Biol 2016; 219:1449-57. [PMID: 26994172 DOI: 10.1242/jeb.134049] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2015] [Accepted: 02/24/2016] [Indexed: 10/22/2022]
Abstract
Ant foragers are known to memorise visual scenes that allow them to repeatedly travel along idiosyncratic routes and to return to specific places. Guidance is provided by a comparison between visual memories and current views, which critically depends on how well the attitude of the visual system is controlled. Here we show that nocturnal bull ants stabilise their head to varying degrees against locomotion-induced body roll movements, and that this ability decreases as light levels fall. There are always uncompensated head roll oscillations that match the frequency of the stride cycle. Head roll stabilisation involves both visual and non-visual cues, as ants compensate for body roll in complete darkness and also respond with head roll movements when confronted with visual pattern oscillations. We show that imperfect head roll control degrades navigation-relevant visual information and discuss ways in which navigating ants may deal with this problem.
Collapse
Affiliation(s)
- Chloé A Raderschall
- Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia
| | - Ajay Narendra
- Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia; Department of Biological Sciences, Macquarie University, W19F, 205 Culloden Road, Sydney, New South Wales 2109, Australia
| | - Jochen Zeil
- Research School of Biology, The Australian National University, 46 Sullivans Creek Road, Canberra, Australian Capital Territory 2601, Australia
| |
Collapse
|
18
|
Goulard R, Julien-Laferriere A, Fleuriet J, Vercher JL, Viollet S. Behavioural evidence for a visual and proprioceptive control of head roll in hoverflies (Episyrphus balteatus). J Exp Biol 2015; 218:3777-87. [PMID: 26486370 DOI: 10.1242/jeb.127043] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2015] [Accepted: 10/01/2015] [Indexed: 11/20/2022]
Abstract
The ability of hoverflies to control their head orientation with respect to their body contributes importantly to their agility and their autonomous navigation abilities. Many tasks performed by this insect during flight, especially while hovering, involve a head stabilization reflex. This reflex, which is mediated by multisensory channels, prevents the visual processing from being disturbed by motion blur and maintains a consistent perception of the visual environment. The so-called dorsal light response (DLR) is another head control reflex, which makes insects sensitive to the brightest part of the visual field. In this study, we experimentally validate and quantify the control loop driving the head roll with respect to the horizon in hoverflies. The new approach developed here consisted of using an upside-down horizon in a body roll paradigm. In this unusual configuration, tethered flying hoverflies surprisingly no longer use purely vision-based control for head stabilization. These results shed new light on the role of neck proprioceptor organs in head and body stabilization with respect to the horizon. Based on the responses obtained with male and female hoverflies, an improved model was then developed in which the output signals delivered by the neck proprioceptor organs are combined with the visual error in the estimated position of the body roll. An internal estimation of the body roll angle with respect to the horizon might explain the extremely accurate flight performances achieved by some hovering insects.
Collapse
Affiliation(s)
- Roman Goulard
- Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
| | - Alice Julien-Laferriere
- INRIA and Université de Lyon, Lyon 69000, France; CNRS, UMR 5558, Laboratoire de Biométrie et Biologie Évolutive, Villeurbanne 69622, France
| | - Jérome Fleuriet
- Washington National Primate Research Center and Department of Ophthalmology, University of Washington, Seattle, WA 98195, USA
| | | | - Stéphane Viollet
- Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
| |
Collapse
|
19
|
Boeddeker N, Mertes M, Dittmar L, Egelhaaf M. Bumblebee Homing: The Fine Structure of Head Turning Movements. PLoS One 2015; 10:e0135020. [PMID: 26352836 PMCID: PMC4564262 DOI: 10.1371/journal.pone.0135020] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2014] [Accepted: 07/17/2015] [Indexed: 11/18/2022] Open
Abstract
Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body. Head orientation is stabilized for most of the time by counter-rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments that are interspersed with rapid turns. These short and fast yaw turns ("saccades") are usually accompanied by even faster head yaw turns that change gaze direction. Since a large part of image rotation is thereby reduced to brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. The detailed analysis of the fine structure of the bees' head turning movements shows that the time course of single head saccades is very stereotypical. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements, which in its main characteristics resembles the so-called "saccadic main sequence" in humans. The fact that bumblebee head saccades are as highly stereotyped as those of humans may hint at a common principle, whereby fast and precise motor control is used to reliably reduce the time during which the retinal image moves.
Collapse
Affiliation(s)
- Norbert Boeddeker
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Department of Cognitive Neurosciences & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Marcel Mertes
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Laura Dittmar
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| | - Martin Egelhaaf
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
20
|
Encoding of yaw in the presence of distractor motion: studies in a fly motion sensitive neuron. J Neurosci 2015; 35:6481-94. [PMID: 25904799 DOI: 10.1523/jneurosci.4256-14.2015] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Motion estimation is crucial for aerial animals such as the fly, which perform fast and complex maneuvers while flying through a 3-D environment. Motion-sensitive neurons in the lobula plate, a part of the visual brain, of the fly have been studied extensively for their specialized role in motion encoding. However, the visual stimuli used in such studies are typically highly simplified, often move in restricted ways, and do not represent the complexities of optic flow generated during actual flight. Here, we use combined rotations about different axes to study how H1, a wide-field motion-sensitive neuron, encodes preferred yaw motion in the presence of stimuli not aligned with its preferred direction. Our approach is an extension of "white noise" methods, providing a framework that is readily adaptable to quantitative studies into the coding of mixed dynamic stimuli in other systems. We find that the presence of a roll or pitch ("distractor") stimulus reduces information transmitted by H1 about yaw, with the amount of this reduction depending on the variance of the distractor. Spike generation is influenced by features of both yaw and the distractor, where the degree of influence is determined by their relative strengths. Certain distractor features may induce bidirectional responses, which are indicative of an imbalance between global excitation and inhibition resulting from complex optic flow. Further, the response is shaped by the dynamics of the combined stimulus. Our results provide intuition for plausible strategies involved in efficient coding of preferred motion from complex stimuli having multiple motion components.
Collapse
|
21
|
Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2014. [DOI: 10.4161/cib.13763] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
|
22
|
Affiliation(s)
- Karin Nordström
- Department of Neuroscience, Uppsala University, SE-751 24 Uppsala, Sweden.
| |
Collapse
|
23
|
Viollet S, Zeil J. Feed-forward and visual feedback control of head roll orientation in wasps (Polistes humilis, Vespidae, Hymenoptera). J Exp Biol 2012; 216:1280-91. [PMID: 23239889 DOI: 10.1242/jeb.074773] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
Flying insects keep their visual system horizontally aligned, suggesting that gaze stabilization is a crucial first step in flight control. Unlike flies, hymenopteran insects such as bees and wasps do not have halteres that provide fast, feed-forward angular rate information to stabilize head orientation in the presence of body rotations. We tested whether hymenopteran insects use inertial (mechanosensory) information from other sources, such as the wings, to control head orientation, by applying periodic roll perturbations to male Polistes humilis wasps flying tethered under different visual conditions indoors and in natural outdoor conditions. We oscillated the thorax of the insects with frequency-modulated sinusoids (chirps) with frequencies increasing from 0.2 to 2 Hz, at a maximal amplitude of 50 deg peak-to-peak and a maximal angular velocity of ±245 deg s⁻¹. We found that head roll stabilization is best outdoors, but completely absent in uniform visual conditions and in darkness. Step responses confirm that compensatory head roll movements are purely visually driven. Modelling the step responses indicates that head roll stabilization is achieved by merging information on head angular velocity, presumably provided by motion-sensitive neurons, with information on head orientation, presumably provided by light level integration across the compound eyes and/or ocelli (dorsal light response). Body roll in free flight reaches amplitudes of ±40 deg and angular velocities greater than 1000 deg s⁻¹, while head orientation remains horizontal to within ±10 deg for most of the time. In free flight, we did not find a delay between spontaneous body roll and compensatory head movements, and suggest that this is evidence for a feed-forward contribution to head stabilization.
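The visually driven head roll compensation described in this abstract can be caricatured as a single proportional feedback loop acting on the visually sensed horizon error. A minimal sketch, not the authors' model; the gain, body roll amplitude and oscillation frequency are assumptions:

```python
import numpy as np

# Toy visual feedback loop for head roll stabilization: the neck rotates
# the head at a rate proportional to the absolute head roll error.
# Gain k, body roll amplitude and frequency are hypothetical values.
dt, k = 0.001, 20.0                          # time step (s), visual gain (1/s)
t = np.arange(0.0, 10.0, dt)
body = 25.0 * np.sin(2 * np.pi * 0.5 * t)    # imposed body roll (deg)
head = np.zeros_like(t)                      # absolute head roll (deg)
for i in range(1, len(t)):
    body_rate = (body[i] - body[i - 1]) / dt
    # the head inherits body rotation; the reflex counter-rotates the neck
    head[i] = head[i - 1] + (body_rate - k * head[i - 1]) * dt
residual = np.abs(head[len(t) // 2 :]).max()  # steady-state residual roll
print(round(residual, 1), "deg residual vs 25.0 deg body roll")
```

The loop acts as a high-pass filter on body roll: slow body rotations are largely cancelled, leaving only a small residual head oscillation, qualitatively like the ±10 deg residual reported above.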
Collapse
Affiliation(s)
- Stéphane Viollet
- Aix-Marseille Université, CNRS, ISM UMR 7287, CP 910, 13288, Marseille Cedex 09, France.
| | | |
Collapse
|
24
|
Nørgaard T, Gagnon YL, Warrant EJ. Nocturnal homing: learning walks in a wandering spider? PLoS One 2012; 7:e49263. [PMID: 23145137 PMCID: PMC3492270 DOI: 10.1371/journal.pone.0049263] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2012] [Accepted: 10/05/2012] [Indexed: 11/21/2022] Open
Abstract
Homing by the nocturnal Namib Desert spider Leucorchestris arenicola (Araneae: Sparassidae) is comparable to homing in diurnal bees, wasps and ants in terms of path length and layout. The spiders' homing is based on vision but their basic navigational strategy is unclear. Diurnal homing insects use memorised views of their home in snapshot matching strategies. The insects learn the visual scenery identifying their nest location during learning flights (e.g. bees and wasps) or walks (ants). These learning flights and walks are stereotyped movement patterns clearly different from other movement behaviours. If the visual homing of L. arenicola is also based on an image matching strategy they are likely to exhibit learning walks similar to diurnal insects. To explore this possibility we recorded departures of spiders from a new burrow in an unfamiliar area with infrared cameras and analysed their paths using computer tracking techniques. We found that L. arenicola performs distinct stereotyped movement patterns during the first part of their departures in an unfamiliar area and that they seem to learn the appearance of their home during these movement patterns. We conclude that the spiders perform learning walks and this strongly suggests that L. arenicola uses a visual memory of the burrow location when homing.
Collapse
|
25
|
Abstract
Virtual reality (VR) holds great promise as a tool to study the neural circuitry underlying animal behaviors. Here, we discuss the advantages of VR and the experimental paradigms and technologies that enable closed-loop behavioral experiments. We review recent results from VR research in genetic model organisms, where the potential combination of rich behaviors, genetic tools and cutting-edge neural recording techniques is leading to breakthroughs in our understanding of the neural basis of behavior. We also discuss several key issues to consider when performing VR experiments and provide an outlook for the future of this exciting experimental toolkit.
Collapse
|
26
|
Abstract
Summary
Animals have needed to find their way about almost since a free-living lifestyle evolved. Particularly if an animal has a home – a shelter or nesting site – true navigation becomes necessary to shuttle between this home and areas of other activities, such as feeding. Navigation in the animal kingdom is as old as it is diverse in its mechanisms and implementations, depending on an organism's ecology and its endowment with sensors and actuators. The use of landmarks for piloting and the use of trail pheromones for route following have been examined in great detail and in a variety of animal species. The same is true for senses of direction – the compasses for navigation – and for the construction of navigation vectors from compass and distance cues. The measurement of distance itself – odometry – has received much less attention. The present review addresses recent progress in the understanding of odometers in invertebrates, after outlining general principles of navigation to put odometry in its proper context. Finally, a number of refinements that increase navigation accuracy and safety are addressed.
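The construction of navigation vectors from compass and distance cues mentioned in this abstract can be illustrated with a minimal path integrator, a schematic sketch in which the outbound headings and distances are hypothetical:

```python
import math

def home_vector(steps):
    """Path integrator: accumulate compass heading x odometer distance."""
    x = y = 0.0
    for heading_deg, distance in steps:   # hypothetical outbound legs
        x += distance * math.cos(math.radians(heading_deg))
        y += distance * math.sin(math.radians(heading_deg))
    # the home vector simply points back along the accumulated displacement
    return math.hypot(x, y), math.degrees(math.atan2(-y, -x))

# outbound: 10 m east (0 deg), then 10 m north (90 deg)
d, b = home_vector([(0.0, 10.0), (90.0, 10.0)])
print(round(d, 2), round(b, 1))  # 14.14 -135.0 (home lies to the south-west)
```

Any error in the compass or odometer estimates accumulates in this sum, which is why the refinements discussed in the review matter for navigation accuracy.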
Collapse
Affiliation(s)
- Harald Wolf
- Institute for Advanced Study Berlin, Wallotstr. 19, D-14193 Berlin, Germany
| |
Collapse
|
27
|
Portelli G, Ruffier F, Roubieu FL, Franceschini N. Honeybees' speed depends on dorsal as well as lateral, ventral and frontal optic flows. PLoS One 2011; 6:e19486. [PMID: 21589861 PMCID: PMC3093387 DOI: 10.1371/journal.pone.0019486] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2010] [Accepted: 04/08/2011] [Indexed: 11/19/2022] Open
Abstract
Flying insects use the optic flow to navigate safely in unfamiliar environments, especially by adjusting their speed and their clearance from surrounding objects. It has not yet been established, however, which specific parts of the optical flow field insects use to control their speed. With a view to answering this question, freely flying honeybees were trained to fly along a specially designed tunnel including two successive tapering parts: the first part was tapered in the vertical plane and the second one, in the horizontal plane. The honeybees were found to adjust their speed on the basis of the optic flow they perceived not only in the lateral and ventral parts of their visual field, but also in the dorsal part. More specifically, the honeybees' speed varied monotonically, depending on the minimum cross-section of the tunnel, regardless of whether the narrowing occurred in the horizontal or vertical plane. The honeybees' speed decreased or increased whenever the minimum cross-section decreased or increased. In other words, the larger sum of the two opposite optic flows in the horizontal and vertical planes was kept practically constant thanks to the speed control performed by the honeybees upon encountering a narrowing of the tunnel. The previously described ALIS ("AutopiLot using an Insect-based vision System") model nicely matches the present behavioral findings. The ALIS model is based on a feedback control scheme that explains how honeybees may keep their speed proportional to the minimum local cross-section of a tunnel, based solely on optic flow processing, without any need for speedometers or rangefinders. The present behavioral findings suggest how flying insects may succeed in adjusting their speed in their complex foraging environments, while at the same time adjusting their distance not only from lateral and ventral objects but also from those located in their dorsal visual field.
Collapse
Affiliation(s)
- Geoffrey Portelli
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
| | - Franck Ruffier
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
| | - Frédéric L. Roubieu
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
| | - Nicolas Franceschini
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
| |
Collapse
|
28
|
Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2011; 4:17-20. [PMID: 21509170 DOI: 10.4161/cib.4.1.13763] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2010] [Accepted: 09/27/2010] [Indexed: 11/19/2022] Open
Abstract
Bees, wasps and ants navigate successfully between feeding sites and their nest, despite the small size of their brains, which contain fewer than a million neurons. A long history of studies examining the role of visual memories in homing behavior shows that insects can localize a goal by finding a close match between a memorized view at the goal location and their current view ("snapshot matching"). However, the concept of static snapshot matching might not explain all aspects of homing behavior, as honeybees are able to use landmarks that are statically camouflaged. In this case the landmarks are only detectable by relative motion cues between the landmark and the background, which the bees generate when they perform characteristic flight maneuvers close to the landmarks. The bees' navigation performance can be explained by a matching scheme based on optic flow amplitudes ("dynamic snapshot matching"). In this article, I will discuss the concept of dynamic snapshot matching in the light of previous literature.
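The static snapshot-matching idea can be illustrated in a few lines: a view is memorised at the goal, and the goal is then recovered as the location where the current view best matches the memory. A minimal sketch; the landmark layout, view encoding and search grid are illustrative assumptions, not the scheme used in the cited studies:

```python
import numpy as np

def view(pos, landmarks, radius=0.1):
    """Panoramic 'snapshot': bearing and angular size of each landmark."""
    out = []
    for lx, ly in landmarks:
        dx, dy = lx - pos[0], ly - pos[1]
        dist = np.hypot(dx, dy)
        out.append((np.arctan2(dy, dx), 2 * np.arctan2(radius, dist)))
    return np.array(out).ravel()

# hypothetical layout: three landmarks around a goal at the origin
landmarks = [(0.3, 0.0), (-0.2, 0.25), (0.0, -0.35)]
snapshot = view((0.0, 0.0), landmarks)    # view memorised at the goal

# image difference over a grid of positions: the goal is the global minimum
xs = np.linspace(-0.5, 0.5, 21)
diff = np.array([[np.linalg.norm(view((gx, gy), landmarks) - snapshot)
                  for gx in xs] for gy in xs])
iy, ix = np.unravel_index(diff.argmin(), diff.shape)
print(xs[ix], xs[iy])   # the best-matching position is the goal itself
```

A dynamic snapshot, as discussed in the article, would replace the static view vector with optic flow amplitudes generated by the insect's own flight maneuvers; the matching principle stays the same.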
Collapse
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology & Center of Excellence 'Cognitive Interaction Technology' Bielefeld University; Bielefeld, Germany
| |
Collapse
|
29
|
Reiser MB, Dickinson MH. Drosophila fly straight by fixating objects in the face of expanding optic flow. J Exp Biol 2010; 213:1771-81. [PMID: 20435828 DOI: 10.1242/jeb.035147] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
Flies, like all animals that depend on vision to navigate through the world, must integrate the optic flow created by self-motion with the images generated by prominent features in their environment. Although much is known about the responses of Drosophila melanogaster to rotating flow fields, their reactions to the more complex patterns of motion that occur as they translate through the world are not well understood. In the present study we explore the interactions between two visual reflexes in Drosophila: object fixation and expansion avoidance. As a fly flies forward, it encounters an expanding visual flow field. However, recent results have demonstrated that Drosophila strongly turn away from patterns of expansion. Given the strength of this reflex, it is difficult to explain how flies make forward progress through a visual landscape. This paradox is partially resolved by the finding reported here that when undergoing flight directed towards a conspicuous object, Drosophila will tolerate a level of expansion that would otherwise induce avoidance. This navigation strategy allows flies to fly straight when orienting towards prominent visual features.
Collapse
Affiliation(s)
- Michael B Reiser
- Department of Computational and Neural Systems, California Institute of Technology, Pasadena, CA 91125, USA.
| | | |
Collapse
|
30
|
Path Integration Provides a Scaffold for Landmark Learning in Desert Ants. Curr Biol 2010; 20:1368-71. [DOI: 10.1016/j.cub.2010.06.035] [Citation(s) in RCA: 145] [Impact Index Per Article: 9.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2010] [Revised: 06/03/2010] [Accepted: 06/03/2010] [Indexed: 11/21/2022]
|
31
|
Boeddeker N, Dittmar L, Stürzl W, Egelhaaf M. The fine structure of honeybee head and body yaw movements in a homing task. Proc Biol Sci 2010; 277:1899-906. [PMID: 20147329 PMCID: PMC2871881 DOI: 10.1098/rspb.2009.2326] [Citation(s) in RCA: 73] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2009] [Accepted: 01/22/2010] [Indexed: 11/12/2022] Open
Abstract
Honeybees turn their thorax and thus their flight motor to change direction or to fly sideways. If the bee's head were fixed to its thorax, such movements would have great impact on vision. Head movements independent of thorax orientation can stabilize gaze and thus play an important and active role in shaping the structure of the visual input the animal receives. Here, we investigate how gaze and flight control interact in a homing task. We use high-speed video equipment to record the head and body movements of honeybees approaching and departing from a food source that was located between three landmarks in an indoor flight arena. During these flights, the bees' trajectories consist of straight flight segments combined with rapid turns. These short and fast yaw turns ('saccades') are in most cases accompanied by even faster head yaw turns that start about 8 ms earlier than the body saccades. Between saccades, gaze stabilization leads to a behavioural elimination of rotational components from the optical flow pattern, which facilitates depth perception from motion parallax.
Collapse
Affiliation(s)
- Norbert Boeddeker
- Bielefeld University, Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld, Germany.
| | | | | | | |
Collapse
|
32
|
Kerhuel L, Viollet S, Franceschini N. Steering by Gazing: An Efficient Biomimetic Control Strategy for Visually Guided Micro Aerial Vehicles. IEEE T ROBOT 2010. [DOI: 10.1109/tro.2010.2042537] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
|
33
|
Abstract
The concept of novelty has acquired a large number of diverse referents over the past quarter-century as a result of new methods that permit measurement of a variety of biological and behavioral reactions to novel incentives in both humans and animals. As a result, the term has acquired varied meanings. This analysis of novelty makes four claims. First, the specific state of uncertainty that a novel event creates depends on its origin. Second, unexpected events that alter the immediate stimulus surround (called stimulus novelty) should be distinguished from those that are inconsistent with an agent's long-term knowledge (called conceptual novelty). Third, the critical features that render an event novel can vary with the agent's intention to classify or to act on an object, and the balance between these two frames changes with development. Finally, the state of uncertainty created when an agent must choose one response from two or more alternatives differs from the states provoked by stimulus and conceptual novelty.
Collapse
Affiliation(s)
- Jerome Kagan
- Department of Psychology, Harvard University, Cambridge, Massachusetts
| |
Collapse
|
34
|
Modeling and measuring the visual detection of ecologically relevant motion by an Anolis lizard. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2009; 196:1-13. [PMID: 19908049 DOI: 10.1007/s00359-009-0487-7] [Citation(s) in RCA: 15] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2009] [Revised: 10/24/2009] [Accepted: 10/24/2009] [Indexed: 10/20/2022]
Abstract
Motion in the visual periphery of lizards, and other animals, often causes a shift of visual attention toward the moving object. This behavioral response must be more responsive to relevant motion (predators, prey, conspecifics) than to irrelevant motion (windblown vegetation). Early stages of visual motion detection rely on simple local circuits known as elementary motion detectors (EMDs). We presented videos of natural motion patterns, including prey, predators and windblown vegetation, to a computer model consisting of a grid of correlation-type EMDs. We systematically varied the model parameters and quantified the relative response to the different classes of motion. We carried out behavioral experiments with the lizard Anolis sagrei and determined that their visual response could be modeled with a grid of correlation-type EMDs with a spacing parameter of 0.3 degrees of visual angle and a time constant of 0.1 s. The model with these parameters gave substantially stronger responses to relevant motion patterns than to windblown vegetation under equivalent conditions. However, the model is sensitive to local contrast and viewer-object distance. Therefore, additional neural processing is probably required for the visual system to reliably distinguish relevant from irrelevant motion under a full range of natural conditions.
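The correlation-type EMD grid described in this abstract follows the classic Hassenstein-Reichardt scheme: each detector low-pass filters one receptor's signal (here using the 0.1 s time constant reported above) and correlates it with the undelayed signal of a neighbouring receptor, subtracting the mirror-image product. A minimal sketch; the stimulus, grid size and time step are illustrative assumptions, not the published model:

```python
import numpy as np

def emd_response(stimulus, dt=0.005, tau=0.1):
    """Row of Hassenstein-Reichardt correlators over a 1-D receptor array.

    stimulus: (time, receptors) luminance array. Each receptor signal is
    low-pass filtered (time constant tau); the delayed signal of one
    receptor is multiplied by the undelayed signal of its neighbour, and
    the mirror-image product is subtracted.
    """
    alpha = dt / (tau + dt)                # first-order low-pass coefficient
    lp = np.zeros_like(stimulus)
    for t in range(1, stimulus.shape[0]):
        lp[t] = lp[t - 1] + alpha * (stimulus[t] - lp[t - 1])
    a, b = stimulus[:, :-1], stimulus[:, 1:]       # neighbouring receptors
    a_lp, b_lp = lp[:, :-1], lp[:, 1:]
    return (a_lp * b - b_lp * a).sum(axis=1)       # + for motion from a to b

# a single bright bar drifting across the receptor row, one receptor/0.05 s
n_t, n_rx = 400, 40
stim = np.zeros((n_t, n_rx))
for t in range(n_t):
    stim[t, (t * n_rx) // n_t] = 1.0
rightward = emd_response(stim).mean()
leftward = emd_response(stim[:, ::-1]).mean()      # same bar, mirrored
print(rightward > 0, leftward < 0)                 # True True
```

Because the output is a product of local signals, it scales with local contrast, which is exactly the sensitivity to contrast and viewing distance the abstract flags as a limitation.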
Collapse
|
35
|
Portelli G, Serres J, Ruffier F, Franceschini N. Modelling honeybee visual guidance in a 3-D environment. J Physiol Paris 2009; 104:27-39. [PMID: 19909808 DOI: 10.1016/j.jphysparis.2009.11.011] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
In view of the behavioral findings published on bees during the last two decades, we set out to decipher the principles underlying the bee's autopilot system, focusing in particular on these insects' use of optic flow (OF). Based on computer-simulated experiments, we developed a vision-based autopilot that enables a "simulated bee" to travel along a tunnel, controlling both its speed and its clearance from the right wall, left wall, ground, and roof. The flying agent thus equipped enjoys three uncoupled translational degrees of freedom, on the surge (x), sway (y), and heave (z) axes. This visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of two interdependent feedback loops, each of which has its own OF set-point. The experiments presented here showed that the simulated bee was able to navigate safely along a straight or tapered tunnel and to react appropriately to any untoward OF perturbations, such as those resulting from the occasional lack of texture on one wall or the tapering of the tunnel. The minimalistic visual system used here (involving only eight pixels) suffices to jointly control both the clearance from the four walls and the forward speed, without having to measure any speeds or distances. The OF sensors and the simple visuo-motor control system we have developed account well for the results of ethological studies performed on honeybees flying freely along straight and tapered corridors.
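The dual optic-flow regulator idea can be reduced to two coupled set-point controllers in a 2-D tunnel: one loop adjusts forward speed to hold the sum of the two lateral flows, the other adjusts lateral position to hold the larger of the two. A toy version; the set-points, gains and tunnel geometry are assumptions, not the published ALIS parameters:

```python
def alis_sim(W=1.0, v=0.5, y=0.5, set_sum=4.0, set_side=3.0,
             k_v=0.2, k_y=0.2, dt=0.05, steps=5000):
    """Toy dual optic-flow regulator in a straight tunnel of width W.

    Translational optic flow from a wall at distance d is v / d.
    Loop 1 adjusts forward speed v to hold the sum of the two lateral
    flows at set_sum; loop 2 adjusts the distance y to the left wall to
    hold the larger lateral flow at set_side (all values hypothetical).
    """
    for _ in range(steps):
        om_l, om_r = v / y, v / (W - y)
        v += k_v * (set_sum - (om_l + om_r)) * dt
        if om_l >= om_r:                 # left wall dominates: drift right
            y += k_y * (om_l - set_side) * dt
        else:                            # right wall dominates: drift left
            y -= k_y * (om_r - set_side) * dt
    return v, y

v, y = alis_sim()
# equilibrium: v/y = 3 and v/y + v/(1-y) = 4, i.e. y = 0.25, v = 0.75
print(round(v, 2), round(y, 2))  # 0.75 0.25
```

Note that, as in the abstract, neither loop ever measures speed or distance directly: both act only on optic-flow errors, and speed and clearance settle jointly at the set-point-determined equilibrium.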
Collapse
Affiliation(s)
- G Portelli
- Institute of Movement Sciences, UMR CNRS, Aix-Marseille University II, France.
| | | | | | | |
Collapse
|