1
Fujibayashi M, Abe K. A behavioral analysis system MCFBM enables objective inference of songbirds' attention during social interactions. Cell Rep Methods 2024;4:100844. PMID: 39232558; PMCID: PMC11440064; DOI: 10.1016/j.crmeth.2024.100844.
Abstract
Understanding animal behavior is crucial in behavioral neuroscience, which aims to unravel the mechanisms driving these behaviors. A significant milestone in this field is the analysis of behavioral reactions during social interactions. Despite their importance in social learning, these interactions are not well understood in behavioral detail due to the lack of appropriate tools. We introduce a high-precision, marker-based motion-capture system for analyzing behavior in songbirds that accurately tracks body location and head direction in multiple freely moving finches during social interactions. Focusing on zebra finches, our analysis revealed variations in eye use depending on the individual presented. We also observed behavioral changes during virtual and live presentations and in a conditioned-learning paradigm. Additionally, the system effectively analyzed social interactions among mice. This system provides an efficient tool for advanced behavioral analysis in small animals and offers an objective method to infer their focus of attention.
Affiliation(s)
- Mizuki Fujibayashi
- Lab of Brain Development, Graduate School of Life Sciences, Tohoku University, Katahira 2-1-1, Aoba-ku, Sendai, Miyagi 980-8577, Japan
- Kentaro Abe
- Lab of Brain Development, Graduate School of Life Sciences, Tohoku University, Katahira 2-1-1, Aoba-ku, Sendai, Miyagi 980-8577, Japan; Division for the Establishment of Frontier Sciences of the Organization for Advanced Studies, Tohoku University, Katahira 2-1-1, Aoba-ku, Sendai, Miyagi 980-8577, Japan.
2
Borsier E, Sanders H, Taylor GK. Brightness cues affect gap negotiation behaviours in zebra finches flying between perches. R Soc Open Sci 2024;11:240007. PMID: 39100151; PMCID: PMC11296001; DOI: 10.1098/rsos.240007.
Abstract
Flying animals have had to evolve robust and effective guidance strategies for dealing with habitat clutter. Birds and insects use optic flow expansion cues to sense and avoid obstacles, but orchid bees have also been shown to use brightness cues during gap negotiation. Such brightness cues might therefore be of general importance in structuring visually guided flight behaviours. To test the hypothesis that brightness cues also affect gap negotiation behaviours in birds, we presented captive zebra finches Taeniopygia guttata with a symmetric or asymmetric background brightness distribution on the other side of a tunnel. The background brightness conditions influenced both the birds' decision to enter the tunnel aperture, and their flight direction upon exit. Zebra finches were more likely to initiate flight through the tunnel if they could see a bright background through it; they were also more likely to fly to the bright side upon exiting. We found no evidence of the centring response that would be expected if optic flow cues were balanced bilaterally during gap negotiation. Instead, the birds entered the tunnel by targeting a clearance of approximately one wing length from its near edge. Brightness cues therefore affect how zebra finches structure their flight when negotiating gaps in enclosed environments.
Affiliation(s)
- Emma Borsier
- Department of Biology, University of Oxford, Oxford OX1 3SZ, UK
- Helen Sanders
- Department of Biology, University of Oxford, Oxford OX1 3SZ, UK
3
Schoepe T, Janotte E, Milde MB, Bertrand OJN, Egelhaaf M, Chicca E. Finding the gap: neuromorphic motion-vision in dense environments. Nat Commun 2024;15:817. PMID: 38280859; PMCID: PMC10821932; DOI: 10.1038/s41467-024-45063-y.
Abstract
Animals have evolved mechanisms to travel safely and efficiently within different habitats. On a journey through dense terrain, animals avoid collisions and cross narrow passages while maintaining an overall course. Multiple hypotheses address how animals solve the challenges faced during such travel. Here we show that a single mechanism enables safe and efficient travel. We developed an insect-inspired robot with remarkable capabilities for travelling in dense terrain: avoiding collisions, crossing gaps and selecting safe passages. These capabilities are accomplished by a neuromorphic network that steers the robot toward regions of low apparent motion. Our system leverages knowledge about vision processing and obstacle avoidance in insects. Our results demonstrate how insects might travel safely through diverse habitats. We anticipate that our system will serve as a working hypothesis for studying insects' travels in dense terrain. Furthermore, it illustrates that novel hardware systems can be designed by understanding the mechanisms underlying behaviour.
Affiliation(s)
- Thorben Schoepe
- Peter Grünberg Institut 15, Forschungszentrum Jülich, Aachen, Germany.
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany.
- Bio-Inspired Circuits and Systems (BICS) Lab. Zernike Institute for Advanced Materials (Zernike Inst Adv Mat), University of Groningen, Groningen, Netherlands.
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands.
- Ella Janotte
- Event Driven Perception for Robotics, Italian Institute of Technology, iCub facility, Genoa, Italy
- Moritz B Milde
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Penrith, Australia
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Elisabetta Chicca
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab. Zernike Institute for Advanced Materials (Zernike Inst Adv Mat), University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
4
Gunner RM, Wilson RP, Holton MD, Bennett NC, Alagaili AN, Bertelsen MF, Mohammed OB, Wang T, Manger PR, Ismael K, Scantlebury DM. Examination of head versus body heading may help clarify the extent to which animal movement pathways are structured by environmental cues? Mov Ecol 2023;11:71. PMID: 37891697; PMCID: PMC10612247; DOI: 10.1186/s40462-023-00432-y.
Abstract
Understanding the processes that determine how animals allocate time to space is a major challenge, although it is acknowledged that summed animal movement pathways over time must define space-time use. The critical question is then: what processes structure these pathways? Following the idea that turns within pathways might be based on environmentally determined decisions, we equipped Arabian oryx with head- and body-mounted tags to determine how they orientated their heads (which we posit is indicative of them assessing the environment) in relation to their movement paths, to investigate the role of environmental scanning in path tortuosity. After simulating predators to verify that oryx look directly at objects of interest, we recorded that, during routine movement, over 60% of all turns in the animals' paths were preceded by a change in head heading that was not immediately mirrored by the body heading: the path turn angle (as indicated by the body heading) correlated with a prior change in head heading (with head heading being mirrored by subsequent turns in the path) twenty-one times more often than path turns in which the animals adopted a body heading opposite to the change in head heading. Although we could not determine what the objects of interest were, and therefore the reasons for turning, we suggest that this reflects the use of cephalic senses to detect advantageous environmental features (e.g. food) or detrimental ones (e.g. predators). The results of our pilot study suggest how turns might emerge in animal pathways; we propose that points of inflection in highly resolved animal paths could represent decisions in landscapes, and that their examination could enhance our understanding of how animal pathways are structured.
Affiliation(s)
- Richard M Gunner
- Department for the Ecology of Animal Societies, Max Planck Institute of Animal Behavior, 78467, Konstanz, Germany.
- Department of Biosciences, College of Science, Swansea University, Swansea, SA2 8PP, Wales.
- Rory P Wilson
- Department of Biosciences, College of Science, Swansea University, Swansea, SA2 8PP, Wales.
- Mark D Holton
- Department of Biosciences, College of Science, Swansea University, Swansea, SA2 8PP, Wales
- Nigel C Bennett
- Mammal Research Institute, Department of Zoology and Entomology, University of Pretoria, Pretoria, 0002, South Africa
- Abdulaziz N Alagaili
- Zoology Department, King Saud University, P. O. Box 2455, Riyadh, 11451, Saudi Arabia
- Mads F Bertelsen
- Copenhagen Zoo, Centre for Zoo and Wild Animal Health, Frederiksberg, Denmark
- Osama B Mohammed
- KSU Mammals Research Chair, Zoology Department, King Saud University, P.O Box 2455, Riyadh, 11451, Saudi Arabia
- Tobias Wang
- Zoophysiology, Department of Biology, Aarhus University, Aarhus, Denmark
- Paul R Manger
- School of Anatomical Sciences, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa
- Khairi Ismael
- Prince Saud Al-Faisal Wildlife Research Center, National Center for Wildlife, Taif, Saudi Arabia
- D Michael Scantlebury
- School of Biological Sciences, Queen's University Belfast, 19 Chlorine Gardens, Belfast, BT9 5DL, UK.
5
El-Ghazali HM, Abdelbaset-Ismail A, Goda NIA, Aref M. Morphological, radiographic, three-dimensional computed tomographic, and histological features of the primary upstroke and downstroke muscles and bones in the domestic duck (Anas platyrhynchos domesticus) and the cattle egret (Bubulcus ibis, Linnaeus, 1758), reflecting the evolutionary transition towards the irreversible flightlessness. BMC Vet Res 2023;19:133. PMID: 37626319; PMCID: PMC10464456; DOI: 10.1186/s12917-023-03649-6.
Abstract
BACKGROUND: The purpose of this study was to explore whether domestication could lead to evolutionary changes towards flightlessness in the domestic duck (Anas platyrhynchos domesticus) compared to the cattle egret (Bubulcus ibis), used as non-flying and flying biological models, respectively. Bones of the pectoral girdle (scapula, clavicle, and coracoid) and the foramen triosseum were comparatively assessed using anatomical, radiographic, and 3D computed tomographic (CT) studies. Additionally, the pectoralis and supracoracoideus muscles were assessed histologically and immunohistochemically. RESULTS: Among the differences observed radiographically, the distance between the paired clavicles was significantly greater (p < 0.05) in the domestic duck (mean ± SD 1.43 ± 0.23 cm) than in the cattle egret (0.96 ± 0.13 cm). Unlike in cattle egrets, there was no connection between the sternum and the hypocladium of the furcula in domestic ducks. The scapula, clavicle, coracoid, sternum, and humerus were considerably longer in domestic ducks than in cattle egrets. The foramen triosseum was significantly (p < 0.01) wider in domestic ducks (0.7 ± 1.17 cm) than in cattle egrets (0.49 ± 0.03 cm). Histologically, compared to cattle egrets, the muscle fibers in domestic ducks were loosely connected and contained fewer nuclei and perimysial/endomysial spaces. Higher myoglobin expression was evident in cattle egrets than in domestic ducks. CONCLUSIONS: The results of this study indicate that the bones and muscles of the pectoral girdle show specific morphological and structural changes reflecting the loss of prerequisites associated with flight behavior in domestic ducks, due to domestication, compared to cattle egrets.
Affiliation(s)
- Hanaa M El-Ghazali
- Anatomy and Embryology Department, Faculty of Veterinary Medicine, Zagazig University, Zagazig, 44519, El-Sharkia, Egypt
- Ahmed Abdelbaset-Ismail
- Department of Surgery, Anesthesiology, and Radiology, Faculty of Veterinary Medicine, Zagazig University, Zagazig, 44519, El-Sharkia, Egypt.
- Nehal I A Goda
- Department of Histology and Cytology, Faculty of Veterinary Medicine, Zagazig University, Zagazig, 44519, El-Sharkia, Egypt
- Mohamed Aref
- Anatomy and Embryology Department, Faculty of Veterinary Medicine, Zagazig University, Zagazig, 44519, El-Sharkia, Egypt
6
Through Hawks’ Eyes: Synthetically Reconstructing the Visual Field of a Bird in Flight. Int J Comput Vis 2023;131:1497-1531. PMID: 37089199; PMCID: PMC10110700; DOI: 10.1007/s11263-022-01733-2.
Abstract
Birds of prey rely on vision to execute flight manoeuvres that are key to their survival, such as intercepting fast-moving targets or navigating through clutter. A better understanding of the role played by vision during these manoeuvres is not only relevant within the field of animal behaviour, but could also have applications for autonomous drones. In this paper, we present a novel method that uses computer vision tools to analyse the role of active vision in bird flight, and demonstrate its use to answer behavioural questions. Combining motion capture data from Harris’ hawks with a hybrid 3D model of the environment, we render RGB images, semantic maps, depth information and optic flow outputs that characterise the visual experience of the bird in flight. In contrast with previous approaches, our method allows us to consider different camera models and alternative gaze strategies for the purposes of hypothesis testing, allows us to consider visual input over the complete visual field of the bird, and is not limited by the technical specifications and performance of a head-mounted camera light enough to attach to a bird’s head in flight. We present pilot data from three sample flights: a pursuit flight, in which a hawk intercepts a moving target, and two obstacle avoidance flights. With this approach, we provide a reproducible method that facilitates the collection of large volumes of data across many individuals, opening up new avenues for data-driven models of animal behaviour.
7
Arnold F, Staniszewski MS, Pelzl L, Ramenda C, Gahr M, Hoffmann S. Vision and vocal communication guide three-dimensional spatial coordination of zebra finches during wind-tunnel flights. Nat Ecol Evol 2022;6:1221-1230. PMID: 35773345; PMCID: PMC9349042; DOI: 10.1038/s41559-022-01800-4.
Abstract
Animal collective motion is a natural phenomenon readily observable in various taxa. Although theoretical models can predict the macroscopic pattern of group movements based on the relative spatial position of group members, it is poorly understood how group members exchange directional information, which enables the spatial coordination between individuals during collective motion. To test if vocalizations emitted during flocking flight are used by birds to transmit directional information between group members, we recorded vocal behaviour, head orientation and spatial position of each individual in a small flock of zebra finches (Taeniopygia guttata) flying in a wind tunnel. We found that the finches can use both visual and acoustic cues for three-dimensional flock coordination. When visual information is insufficient, birds can increasingly exploit active vocal communication to avoid collisions with flock mates. Our study furthers the mechanistic understanding of collective motion in birds and highlights the impact interindividual vocal interactions can have on group performances in these animals. Zebra finches flying in a wind tunnel use both vocal and visual communication to orientate themselves within the flock, and are able to enhance their use of one form of communication over another depending on circumstance.
Affiliation(s)
- Fabian Arnold
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Faculty of Biology, Ludwig-Maximilians-University of Munich, Planegg-Martinsried, Germany; TUM School of Life Sciences, Technical University of Munich, Freising, Germany
- Michael S Staniszewski
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Faculty of Biology, Ludwig-Maximilians-University of Munich, Planegg-Martinsried, Germany; Faculty of Sciences and Bioengineering Sciences, Vrije Universiteit Brussel, Elsene, Belgium
- Lisa Pelzl
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Faculty of Biology, Ludwig-Maximilians-University of Munich, Planegg-Martinsried, Germany
- Claudia Ramenda
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Department of Behavioural Neurobiology, Max Planck Institute for Biological Intelligence (in Foundation), Seewiesen, Germany
- Manfred Gahr
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Department of Behavioural Neurobiology, Max Planck Institute for Biological Intelligence (in Foundation), Seewiesen, Germany
- Susanne Hoffmann
- Department of Behavioural Neurobiology, Max Planck Institute for Ornithology, Seewiesen, Germany; Department of Behavioural Neurobiology, Max Planck Institute for Biological Intelligence (in Foundation), Seewiesen, Germany
8
Thady RG, Emerson LC, Swaddle JP. Evaluating acoustic signals to reduce avian collision risk. PeerJ 2022;10:e13313. PMID: 35573177; PMCID: PMC9104101; DOI: 10.7717/peerj.13313.
Abstract
Collisions with human-made structures are responsible for billions of bird deaths each year, resulting in ecological damage as well as regulatory and financial burdens to many industries. Acoustic signals can alert birds to obstacles in their flight paths in order to mitigate collisions, but these signals should be tailored to the sensory ecology of birds in flight as the effectiveness of various acoustic signals potentially depends on the influence of background noise and the relative ability of various sound types to propagate within a landscape. We measured changes in flight behaviors from zebra finches released into a flight corridor containing a physical obstacle, either in no-additional-sound control conditions or when exposed to one of four acoustic signals. We selected signals to test two frequency ranges (4-6 kHz or 6-8 kHz) and two temporal modulation patterns (broadband or frequency-modulated oscillating) to determine whether any particular combination of sound attributes elicited the strongest collision avoidance behaviors. We found that, relative to control flights, all sound treatments caused birds to maintain a greater distance from hazards and to adjust their flight trajectories before coming close to obstacles. There were no statistical differences among different sound treatments, but consistent trends within the data suggest that the 4-6 kHz frequency-modulated oscillating signal elicited the strongest avoidance behaviors. We conclude that a variety of acoustic signals can be effective as avian collision deterrents, at least in the context in which we tested these birds. These results may be most directly applicable in scenarios when birds are at risk of collisions with solid structures, such as wind turbines and communication towers, as opposed to window collisions or collisions involving artificial lighting. 
We recommend the incorporation of acoustic signals into multimodal collision deterrents and demonstrate the value of using behavioral data to assess collision risk.
Affiliation(s)
- Robin G. Thady
- Biology Department, William & Mary, Williamsburg, VA, United States of America
- Lauren C. Emerson
- Biology Department, William & Mary, Williamsburg, VA, United States of America
- John P. Swaddle
- Biology Department, William & Mary, Williamsburg, VA, United States of America; Institute for Integrative Conservation, William & Mary, Williamsburg, VA, United States of America
9
Chen R, Gadagkar V, Roeser AC, Puzerey PA, Goldberg JH. Movement signaling in ventral pallidum and dopaminergic midbrain is gated by behavioral state in singing birds. J Neurophysiol 2021;125:2219-2227. PMID: 33949888; DOI: 10.1152/jn.00110.2021.
Abstract
Movement-related neuronal discharge in the ventral tegmental area (VTA) and ventral pallidum (VP) is inconsistently observed across studies. One possibility is that some neurons are movement related and others are not. Another possibility is that the precise behavioral conditions matter: a single neuron can be movement related under certain behavioral states but not others. We recorded single VTA and VP neurons in birds transitioning between singing and nonsinging states while monitoring body movement with microdrive-mounted accelerometers. Many VP and VTA neurons exhibited body movement-locked activity exclusively when the bird was not singing. During singing, VP and VTA neurons could switch off their tuning to body movement and instead become precisely time-locked to specific song syllables. These changes in neuronal tuning occurred rapidly at state boundaries. Our findings show that movement-related activity in limbic circuits can be gated by behavioral context.
NEW & NOTEWORTHY: Neural signals in the limbic system have long been known to represent body movements as well as reward. Here, we show that single neurons dramatically change their tuning from movement to song timing when a bird starts to sing.
Affiliation(s)
- Ruidong Chen
- Department of Neurobiology and Behavior, Cornell University, Ithaca, New York
- Vikram Gadagkar
- Department of Neurobiology and Behavior, Cornell University, Ithaca, New York; Department of Neuroscience, Zuckerman Mind Brain Behavior Institute, Columbia University, New York, New York
- Andrea C Roeser
- Department of Neurobiology and Behavior, Cornell University, Ithaca, New York
- Pavel A Puzerey
- Department of Neurobiology and Behavior, Cornell University, Ithaca, New York
- Jesse H Goldberg
- Department of Neurobiology and Behavior, Cornell University, Ithaca, New York
10
Beetz MJ, Kössl M, Hechavarría JC. The frugivorous bat Carollia perspicillata dynamically changes echolocation parameters in response to acoustic playback. J Exp Biol 2021;224:jeb234245. PMID: 33568443; DOI: 10.1242/jeb.234245.
Abstract
Animals extract behaviorally relevant signals from 'noisy' environments. Echolocation behavior provides a rich system testbed for investigating signal extraction. When echolocating in acoustically enriched environments, bats show many adaptations that are believed to facilitate signal extraction. Most studies to date focused on describing adaptations in insectivorous bats while frugivorous bats have rarely been tested. Here, we characterize how the frugivorous bat Carollia perspicillata adapts its echolocation behavior in response to acoustic playback. Since bats not only adapt their echolocation calls in response to acoustic interference but also with respect to target distances, we swung bats on a pendulum to control for distance-dependent call changes. Forward swings evoked consistent echolocation behavior similar to approach flights. By comparing the echolocation behavior recorded in the presence and absence of acoustic playback, we could precisely define the influence of the acoustic context on the bats' vocal behavior. Our results show that C. perspicillata decrease the terminal peak frequencies of their calls when echolocating in the presence of acoustic playback. When considering the results at an individual level, it became clear that each bat dynamically adjusts different echolocation parameters across and even within experimental days. Utilizing such dynamics, bats create unique echolocation streams that could facilitate signal extraction in noisy environments.
Affiliation(s)
- M Jerome Beetz
- Institute for Cell Biology and Neuroscience, Goethe University, 60438 Frankfurt am Main, Germany
- Manfred Kössl
- Institute for Cell Biology and Neuroscience, Goethe University, 60438 Frankfurt am Main, Germany
- Julio C Hechavarría
- Institute for Cell Biology and Neuroscience, Goethe University, 60438 Frankfurt am Main, Germany
11
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021;14:606590. PMID: 33542681; PMCID: PMC7852487; DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including its spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, in which rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, and thus depth in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking head orientation, we found that half of the time the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
Collapse
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
12
Winsor AM, Pagoti GF, Daye DJ, Cheries EW, Cave KR, Jakob EM. What gaze direction can tell us about cognitive processes in invertebrates. Biochem Biophys Res Commun 2021;564:43-54. PMID: 33413978; DOI: 10.1016/j.bbrc.2020.12.001.
Abstract
Most visually guided animals shift their gaze using body movements, eye movements, or both to gather information selectively from their environments. Psychological studies of eye movements have advanced our understanding of perceptual and cognitive processes that mediate visual attention in humans and other vertebrates. However, much less is known about how these processes operate in other organisms, particularly invertebrates. We here make the case that studies of invertebrate cognition can benefit by adding precise measures of gaze direction. To accomplish this, we briefly review the human visual attention literature and outline four research themes and several experimental paradigms that could be extended to invertebrates. We briefly review selected studies where the measurement of gaze direction in invertebrates has provided new insights, and we suggest future areas of exploration.
Affiliation(s)
- Alex M Winsor
- Graduate Program in Organismic and Evolutionary Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
- Guilherme F Pagoti
- Programa de Pós-Graduação em Zoologia, Instituto de Biociências, Universidade de São Paulo, Rua do Matão, 321, Travessa 14, Cidade Universitária, São Paulo, SP, 05508-090, Brazil
- Daniel J Daye
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA; Graduate Program in Biological and Environmental Sciences, University of Rhode Island, Kingston, RI, 02881, USA
- Erik W Cheries
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Kyle R Cave
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Elizabeth M Jakob
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
13
Corthals K, Moore S, Geurten BR. Strategies of locomotion composition. Curr Opin Insect Sci 2019;36:140-148. PMID: 31622810; DOI: 10.1016/j.cois.2019.09.007.
Abstract
This review aims to highlight the importance of saccades during locomotion as a strategy to reduce sensory information loss while the subject is moving. Acquiring sensory data from the environment during movement results in a temporal flow of information, as the sensory percept changes with the position of the observer. Accordingly, the movement pattern shapes the sensory flow. Therefore, the requirements of locomotion and sensation have to be balanced in the behaviour of the organism. Insect vision provides deep insight into the interplay between action and perception. Insects can shape their optic flow by reducing their rotational movements to fast and short saccades. This generates prolonged phases of translation which provide depth information. Extensive behavioural and physiological studies on insects show how shaping the optic flow facilitates the coding of motion vision. Indeed, the saccadic strategy provides an elegant solution to optimise sensory flow. Complementary studies in other taxa reported similar locomotion strategies, emphasising the crucial influence of sensory flow on locomotion.
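The geometric point at the heart of this review, that translational optic flow depends on object distance while rotational flow does not, can be sketched numerically. The following Python snippet is purely illustrative; the speeds, bearings, and distances are arbitrary, not values from any study cited here:

```python
import math

def flow_translation(v, theta_deg, d):
    """Angular image speed (rad/s) of a point at distance d (m) and
    bearing theta (deg) for an observer translating at speed v (m/s)."""
    return v * math.sin(math.radians(theta_deg)) / d

def flow_rotation(omega):
    """For pure rotation at omega (rad/s), every point in the image
    moves at the same angular speed, whatever its distance."""
    return omega

# An observer translating at 1 m/s: a near object (1 m) slips across
# the retina ten times faster than a far one (10 m) at the same bearing.
near = flow_translation(1.0, 90.0, 1.0)    # 1.0 rad/s
far = flow_translation(1.0, 90.0, 10.0)    # 0.1 rad/s

# Distance is therefore recoverable from translational flow alone:
# d = v * sin(theta) / omega
print(1.0 * math.sin(math.radians(90.0)) / near)  # 1.0

# During rotation, near and far points move identically, so prolonged
# translations separated by short saccadic turns maximise the time
# spent gathering depth information.
```

This is why keeping rotations brief, as the saccadic strategy does, maximises the fraction of locomotion during which optic flow carries depth information.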
Affiliation(s)
- Kristina Corthals
- Lund University, Functional Zoology, Sölvegatan 35, 223 62 Lund, Sweden
| | - Sharlen Moore
- Instituto de Fisiología Celular - Neurociencias, Universidad Nacional Autónoma de México, Av. Universidad 3000, Coyoacán, 04510 Mexico City, Mexico; Max Planck Institute of Experimental Medicine, Department of Neurogenetics, Hermann-Rein-Str. 3, 37075 Göttingen, Germany
| | - Bart Rh Geurten
- Georg-August-University Göttingen, Department of Cellular Neuroscience, Julia-Lermontowa-Weg 3, 37077 Göttingen, Germany.
| |
|
14
|
Abstract
Sensing from a moving platform is challenging for both man-made machines and animals. Animals' heads jitter during movement, so if the sensors they carry are not stabilized, any spatial estimation might be biased. Flying animals, like bats, suffer seriously from this problem because flapping flight induces rapid changes in acceleration that move the body up and down. For echolocating bats, the problem is crucial: because they emit sound to sense the world, an unstable head means sound energy pointed in the wrong direction. It is unknown how bats mitigate this problem. By tracking the head and body of flying fruit bats, we show that they stabilize their heads, accurately maintaining a fixed acoustic gaze relative to a target. Bats can solve the stabilization task even in complete darkness using only echo-based information. Moreover, the bats point their echolocation beam below the target and not towards it, a strategy that should result in better estimations of target elevation.
Affiliation(s)
- O Eitan
- School of Zoology, Faculty of Life Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
| | - G Kosa
- Intelligent Medical Micro/Nano Systems Group, University Hospital of Basel, Basel, Switzerland
| | - Y Yovel
- School of Zoology, Faculty of Life Sciences, Tel Aviv University, Tel Aviv 6997801, Israel; Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel; School of Mechanical Engineering, Faculty of Engineering, Tel Aviv University, Tel Aviv 6997801, Israel
| |
|
15
|
Goller B, Fellows TK, Dakin R, Tyrrell L, Fernández-Juricic E, Altshuler DL. Spatial and Temporal Resolution of the Visual System of the Anna's Hummingbird (Calypte anna) Relative to Other Birds. Physiol Biochem Zool 2019; 92:481-495. [PMID: 31393209 DOI: 10.1086/705124] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022]
Abstract
Hummingbirds are an emerging model for studies of the visual guidance of flight. However, basic properties of their visual systems, such as spatial and temporal visual resolution, have not been characterized. We measured both the spatial and temporal visual resolution of Anna's hummingbirds using behavioral experiments and anatomical estimates. Spatial visual resolution was determined behaviorally using the optocollic reflex and anatomically using peak retinal ganglion cell densities from retinal whole mounts and eye size. Anna's hummingbirds have a spatial visual resolution of 5-6 cycles per degree when measured behaviorally, which matches anatomical estimates (fovea: 6.26 ± 0.12 cycles per degree; area temporalis: 5.59 ± 0.15 cycles per degree; and whole-eye average: 4.64 ± 0.08 cycles per degree). To determine temporal visual resolution, we used an operant conditioning paradigm wherein hummingbirds were trained to use a flickering light to find a food reward. The limits of temporal visual resolution were estimated as 70-80 Hz. To compare Anna's hummingbirds with other bird species, we used a phylogenetically controlled analysis of previously published data on avian visual resolutions and body size. Our measurements for Anna's hummingbird vision fall close to and below predictions based on body size for spatial visual resolution and temporal visual resolution, respectively. These results indicate that the enhanced flight performance and foraging behaviors of hummingbirds do not require enhanced spatial or temporal visual resolution. This finding is important for interpreting flight control studies and contributes to a growing understanding of avian vision.
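Anatomical acuity estimates of the kind described above are conventionally derived from eye size and peak retinal ganglion cell density via the Nyquist limit. A minimal sketch of that calculation, assuming a square cell lattice (hexagonal lattices use a slightly different packing constant) and with made-up input values, not the measurements from this study:

```python
import math

def anatomical_acuity_cpd(pnd_mm, rgc_per_mm2):
    """Nyquist-limited spatial resolution (cycles/degree) estimated from
    the eye's posterior nodal distance (mm) and peak retinal ganglion
    cell density (cells/mm^2), assuming a square cell lattice."""
    rmf = 2 * math.pi * pnd_mm / 360       # retinal magnification factor, mm/deg
    cells_per_deg = rmf * math.sqrt(rgc_per_mm2)
    return cells_per_deg / 2               # two cells sample one cycle (Nyquist)

# Hypothetical inputs for illustration only:
print(anatomical_acuity_cpd(3.0, 40_000))  # ~5.2 cycles per degree
```

The same formula shows why acuity scales with eye size: doubling the posterior nodal distance doubles the estimated resolution at a fixed cell density.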
|
16
|
Goller B, Blackwell BF, DeVault TL, Baumhardt PE, Fernández-Juricic E. Assessing bird avoidance of high-contrast lights using a choice test approach: implications for reducing human-induced avian mortality. PeerJ 2018; 6:e5404. [PMID: 30280013 PMCID: PMC6163032 DOI: 10.7717/peerj.5404] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2018] [Accepted: 07/18/2018] [Indexed: 01/11/2023] Open
Abstract
Background: Avian collisions with man-made objects and vehicles (e.g., buildings, cars, airplanes, power lines) have increased recently. Lights have been proposed to alert birds and minimize the chances of collisions, but it is challenging to choose lights that are tuned to the avian eye and can also lead to avoidance, given the differences between human and avian vision. We propose a choice test to address this problem by first identifying wavelengths of light that would over-stimulate the retina using species-specific perceptual models and by then assessing the avoidance/attraction responses of brown-headed cowbirds to these lights during daytime using a behavioral assay.
Methods: We used perceptual models to estimate wavelength-specific light-emitting diode (LED) lights with high chromatic contrast. The behavioral assay consisted of an arena where the bird moved in a single direction and was forced to make a choice (right/left) using a single-choice design (one side with the light on, the other with the light off) under diurnal light conditions.
Results: First, we identified lights with high saliency from the cowbird visual perspective: LED lights with peaks at 380 nm (ultraviolet), 470 nm (blue), 525 nm (green), 630 nm (red), and broad-spectrum (white) LED lights. Second, we found that cowbirds significantly avoided LED lights with peaks at 470 and 630 nm, but did not avoid or prefer LED lights with peaks at 380 and 525 nm or white lights.
Discussion: The two lights avoided had the highest chromatic contrast but relatively lower levels of achromatic contrast. Our approach can optimize limited resources to narrow down wavelengths of light with high visual saliency for a target species, leading to avoidance. These lights can be used as candidates for visual deterrents to reduce collisions with man-made objects and vehicles.
Affiliation(s)
- Benjamin Goller
- Department of Biological Sciences, Purdue University, West Lafayette, IN, USA
| | | | - Travis L DeVault
- USDA/APHIS/WS National Wildlife Research Center, Sandusky, OH, USA
| | - Patrice E Baumhardt
- Department of Biological Sciences, Purdue University, West Lafayette, IN, USA
| | | |
|
17
|
Kano F, Walker J, Sasaki T, Biro D. Head-mounted sensors reveal visual attention of free-flying homing pigeons. J Exp Biol 2018; 221:jeb183475. [PMID: 30190414 DOI: 10.1242/jeb.183475] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/24/2018] [Accepted: 06/19/2018] [Indexed: 01/08/2023]
Abstract
Gaze behavior offers valuable insights into attention and cognition. However, technological limitations have prevented the examination of animals' gaze behavior in natural, information-rich contexts; for example, during navigation through complex environments. Therefore, we developed a lightweight, custom-made logger equipped with an inertial measurement unit (IMU) and GPS to simultaneously track the head movements and flight trajectories of free-flying homing pigeons. Pigeons have a limited range of eye movement, and their eyes move in coordination with the head in a saccadic manner (similar to primate eye saccades). This allows head movement to act as a proxy for visual scanning behavior. Our IMU sensor recorded the 3D movement of the birds' heads at high resolution, allowing us to reliably detect distinct saccade signals. The birds moved their heads far more than necessary for maneuvering flight, suggesting that they actively scanned the environment. This movement was predominantly horizontal (yaw) and sideways (roll), allowing them to scan the environment with their lateral visual field. They decreased their head movement when they flew solo over prominent landmarks (major roads and a railway line) and also when they flew in pairs (especially when flying side by side, with the partner maintained in their lateral visual field). Thus, a decrease in head movement indicates a shift in the birds' focus of attention. We conclude that pigeons use their head gaze in a task-related manner and that tracking flying birds' head movements is a promising method for examining their visual attention during natural tasks.
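Saccade detection of the kind these loggers make possible can be approximated by a simple threshold-and-merge rule on the angular-rate signal. This is a generic sketch, not the authors' pipeline; the sampling rate, velocity threshold, and merge gap below are invented for illustration:

```python
import numpy as np

def detect_saccades(yaw_rate, fs, thresh=200.0, min_gap=0.02):
    """Return (start, end) sample indices of head saccades, defined as
    runs where |yaw rate| (deg/s) exceeds thresh; runs separated by
    less than min_gap seconds are merged into one saccade."""
    above = np.abs(np.asarray(yaw_rate)) > thresh
    edges = np.diff(above.astype(int))            # +1 at run starts, -1 at run ends
    starts = [int(i) + 1 for i in np.where(edges == 1)[0]]
    ends = [int(i) + 1 for i in np.where(edges == -1)[0]]
    if above[0]:
        starts.insert(0, 0)                       # run begins at the first sample
    if above[-1]:
        ends.append(len(above))                   # run continues to the last sample
    merged = []
    for s, e in zip(starts, ends):
        if merged and (s - merged[-1][1]) / fs < min_gap:
            merged[-1] = (merged[-1][0], e)       # close the gap: same saccade
        else:
            merged.append((s, e))
    return merged

# Synthetic 1 s trace at 100 Hz: two bursts 10 ms apart (merged into one
# saccade) and one isolated burst later.
yaw = np.zeros(100)
yaw[10:15], yaw[16:20], yaw[50:55] = 300.0, 300.0, -400.0
print(detect_saccades(yaw, fs=100))  # [(10, 20), (50, 55)]
```

In practice the threshold would be tuned to the species' saccade kinematics, and the merge gap to the sensor's noise floor.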
Affiliation(s)
- Fumihiro Kano
- Kumamoto Sanctuary, Wildlife Research Center, Kyoto University, Uki, Kumamoto, Japan; Department of Zoology, University of Oxford, Oxford OX1 3PS, UK
| | - James Walker
- Department of Zoology, University of Oxford, Oxford OX1 3PS, UK
| | - Takao Sasaki
- Department of Zoology, University of Oxford, Oxford OX1 3PS, UK
| | - Dora Biro
- Department of Zoology, University of Oxford, Oxford OX1 3PS, UK
| |
|
18
|
Wylie DR, Gutiérrez-Ibáñez C, Gaede AH, Altshuler DL, Iwaniuk AN. Visual-Cerebellar Pathways and Their Roles in the Control of Avian Flight. Front Neurosci 2018; 12:223. [PMID: 29686605 PMCID: PMC5900027 DOI: 10.3389/fnins.2018.00223] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2017] [Accepted: 03/21/2018] [Indexed: 11/20/2022] Open
Abstract
In this paper, we review the connections and physiology of visual pathways to the cerebellum in birds and consider their role in flight. We emphasize that there are two visual pathways to the cerebellum. One is to the vestibulocerebellum (folia IXcd and X) that originates from two retinal-recipient nuclei that process optic flow: the nucleus of the basal optic root (nBOR) and the pretectal nucleus lentiformis mesencephali (LM). The second is to the oculomotor cerebellum (folia VI-VIII), which receives optic flow information, mainly from LM, but also local visual motion information from the optic tectum, and other visual information from the ventral lateral geniculate nucleus (Glv). The tectum, LM and Glv are all intimately connected with the pontine nuclei, which also project to the oculomotor cerebellum. We believe this rich integration of visual information in the cerebellum is important for analyzing motion parallax that occurs during flight. Finally, we extend upon a suggestion by Ibbotson (2017) that the hypertrophy that is observed in LM in hummingbirds might be due to an increase in the processing demands associated with the pathway to the oculomotor cerebellum as they fly through a cluttered environment while feeding.
Affiliation(s)
- Douglas R Wylie
- Neuroscience and Mental Health Institute, University of Alberta, Edmonton, AB, Canada
| | | | - Andrea H Gaede
- Neuroscience and Mental Health Institute, University of Alberta, Edmonton, AB, Canada; Department of Zoology, University of British Columbia, Vancouver, BC, Canada
| | - Douglas L Altshuler
- Department of Zoology, University of British Columbia, Vancouver, BC, Canada
| | - Andrew N Iwaniuk
- Department of Neuroscience, Canadian Centre for Behavioural Neuroscience, University of Lethbridge, Lethbridge, AB, Canada
| |
|
19
|
Altshuler DL, Srinivasan MV. Comparison of Visually Guided Flight in Insects and Birds. Front Neurosci 2018; 12:157. [PMID: 29615852 PMCID: PMC5864886 DOI: 10.3389/fnins.2018.00157] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2017] [Accepted: 02/27/2018] [Indexed: 11/14/2022] Open
Abstract
Over the last half century, work with flies, bees, and moths has revealed a number of visual guidance strategies for controlling different aspects of flight. Some algorithms, such as the use of pattern velocity in forward flight, are employed by all insects studied so far, and are used to control multiple flight tasks such as regulation of speed, measurement of distance, and positioning through narrow passages. Although much attention has been devoted to long-range navigation and homing in birds, until recently, very little was known about how birds control flight in a moment-to-moment fashion. A bird that flies rapidly through dense foliage to land on a branch—as birds often do—engages in a veritable three-dimensional slalom, in which it has to continually dodge branches and leaves, and find, and possibly even plan, a collision-free path to the goal in real time. Each mode of flight from take-off to goal could potentially involve a different visual guidance algorithm. Here, we briefly review strategies for visual guidance of flight in insects, synthesize recent work on short-range visual guidance in birds, and offer a general comparison between the two groups of organisms.
Affiliation(s)
- Douglas L Altshuler
- Department of Zoology, University of British Columbia, Vancouver, BC, Canada
| | - Mandyam V Srinivasan
- Queensland Brain Institute, University of Queensland, St Lucia, QLD, Australia; School of Information Technology and Electrical Engineering, University of Queensland, St Lucia, QLD, Australia
| |
|
20
|
Abstract
Navigation is an essential skill for many animals, and understanding how animals use environmental information, particularly visual information, to navigate has a long history in both ethology and psychology. In birds, the dominant approach for investigating navigation at small scales comes from comparative psychology, which emphasizes the cognitive representations underpinning spatial memory. The majority of this work is based in the laboratory, and it is unclear whether this context itself affects the information that birds learn and use when they search for a location. Data from hummingbirds suggest that birds in the wild might use visual information in quite a different manner. To reconcile these differences, here we propose a new approach to avian navigation, inspired by the sensory-driven study of navigation in insects. Using methods devised for studying the navigation of insects, it is possible to quantify the visual information available to navigating birds, and then to determine how this information influences those birds' navigation decisions. Focusing on four areas that we consider characteristic of the insect navigation perspective, we discuss how this approach has shed light on the information insects use to navigate, and assess the prospects of taking a similar approach with birds. Although birds and insects differ in many ways, there is nothing in the insect-inspired approach of the kind we describe that means these methods need be restricted to insects. On the contrary, adopting such an approach could provide a fresh perspective on the well-studied question of how birds navigate through a variety of environments.
Affiliation(s)
| | - Susan D Healy
- School of Biology, University of St Andrews, Fife, UK
| |
|
21
|
Krause ET, Bischof HJ, Engel K, Golüke S, Maraci Ö, Mayer U, Sauer J, Caspers BA. Olfaction in the Zebra Finch (Taeniopygia guttata): What Is Known and Further Perspectives. ADVANCES IN THE STUDY OF BEHAVIOR 2018. [DOI: 10.1016/bs.asb.2017.11.001] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
|
22
|
Corcoran AJ, Moss CF. Sensing in a noisy world: lessons from auditory specialists, echolocating bats. J Exp Biol 2017; 220:4554-4566. [DOI: 10.1242/jeb.163063] [Citation(s) in RCA: 46] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
All animals face the essential task of extracting biologically meaningful sensory information from the ‘noisy’ backdrop of their environments. Here, we examine mechanisms used by echolocating bats to localize objects, track small prey and communicate in complex and noisy acoustic environments. Bats actively control and coordinate both the emission and reception of sound stimuli through integrated sensory and motor mechanisms that have evolved together over tens of millions of years. We discuss how bats behave in different ecological scenarios, including detecting and discriminating target echoes from background objects, minimizing acoustic interference from competing conspecifics and overcoming insect noise. Bats tackle these problems by deploying a remarkable array of auditory behaviors, sometimes in combination with the use of other senses. Behavioral strategies such as ceasing sonar call production and active jamming of the signals of competitors provide further insight into the capabilities and limitations of echolocation. We relate these findings to the broader topic of how animals extract relevant sensory information in noisy environments. While bats have highly refined abilities for operating under noisy conditions, they face the same challenges encountered by many other species. We propose that the specialized sensory mechanisms identified in bats are likely to occur in analogous systems across the animal kingdom.
Affiliation(s)
- Aaron J. Corcoran
- Department of Biology, Wake Forest University, Box 7325 Reynolda Station, Winston-Salem, NC 27109, USA
| | - Cynthia F. Moss
- Department of Psychological and Brain Sciences, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
| |
|
23
|
Ros IG, Biewener AA. Pigeons (C. livia) Follow Their Head during Turning Flight: Head Stabilization Underlies the Visual Control of Flight. Front Neurosci 2017; 11:655. [PMID: 29249929 PMCID: PMC5717024 DOI: 10.3389/fnins.2017.00655] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2017] [Accepted: 11/09/2017] [Indexed: 11/13/2022] Open
Abstract
Similar flight control principles operate across insect and vertebrate fliers. These principles indicate that robust solutions have evolved to meet complex behavioral challenges. Following from studies of visual and cervical feedback control of flight in insects, we investigate the role of head stabilization in providing feedback cues for controlling turning flight in pigeons. Based on previous observations that the eyes of pigeons remain at relatively fixed orientations within the head during flight, we test potential sensory control inputs derived from head and body movements during 90° aerial turns. We observe that periods of angular head stabilization alternate with rapid head repositioning movements (head saccades), and confirm that control of head motion is decoupled from aerodynamic and inertial forces acting on the bird's continuously rotating body during turning flapping flight. Visual cues inferred from head saccades correlate with changes in flight trajectory; whereas the magnitude of neck bending predicts angular changes in body position. The control of head motion to stabilize a pigeon's gaze may therefore facilitate extraction of important motion cues, in addition to offering mechanisms for controlling body and wing movements. Strong similarities between the sensory flight control of birds and insects may also inspire novel designs of robust controllers for human-engineered autonomous aerial vehicles.
Affiliation(s)
- Ivo G Ros
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA, United States; Division of Biology and Bioengineering, California Institute of Technology, Pasadena, CA, United States
| | - Andrew A Biewener
- Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA, United States
| |
|
24
|
Longden KD, Wicklein M, Hardcastle BJ, Huston SJ, Krapp HG. Spike Burst Coding of Translatory Optic Flow and Depth from Motion in the Fly Visual System. Curr Biol 2017; 27:3225-3236.e3. [PMID: 29056452 DOI: 10.1016/j.cub.2017.09.044] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2017] [Revised: 08/11/2017] [Accepted: 09/20/2017] [Indexed: 11/19/2022]
Abstract
Many animals use the visual motion generated by traveling straight (the translatory optic flow) to successfully navigate obstacles: near objects appear larger and move more quickly than distant objects. Flies are expert at navigating cluttered environments, and while their visual processing of rotatory optic flow is understood in exquisite detail, how they process translatory optic flow remains a mystery. We present novel cell types that have local motion receptive fields matched to translational self-motion, the vertical translation (VT) cells. One of these, the VT1 cell, encodes self-motion in the forward-sideslip direction and fires action potentials in spike bursts as well as single spikes. We show that the spike burst coding is size- and speed-tuned and is selectively modulated by motion parallax (the relative motion experienced during translation). These properties are spatially organized, so that the cell is most excited by clutter rather than isolated objects. When the fly is presented with a simulation of flying past an elevated object, the spike burst activity is modulated by the height of the object, and the rate of single spikes is unaffected. When the moving object alone is experienced, the cell is weakly driven. Meanwhile, the VT2-3 cells have motion receptive fields matched to the lift axis. In conjunction with previously described horizontal cells, the VT cells have properties well suited to the visual navigation of clutter and to encoding the fly's movements along near-cardinal axes of thrust, lift, and forward sideslip.
Affiliation(s)
- Kit D Longden
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK.
| | - Martina Wicklein
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
| | - Ben J Hardcastle
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
| | - Stephen J Huston
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
| | - Holger G Krapp
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
| |
|
25
|
Geurten BRH, Niesterok B, Dehnhardt G, Hanke FD. Saccadic movement strategy in a semiaquatic species - the harbour seal (Phoca vitulina). J Exp Biol 2017; 220:1503-1508. [PMID: 28167803 DOI: 10.1242/jeb.150763] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2016] [Accepted: 01/31/2017] [Indexed: 11/20/2022]
Abstract
Moving animals can estimate the distance of visual objects from the image shift on their retina (optic flow) created during translational, but not rotational, movements. To facilitate this distance estimation, many terrestrial and flying animals perform saccadic movements, thereby temporally separating translational and rotational movements and keeping rotation times short. In this study, we analysed whether a semiaquatic mammal, the harbour seal, also adopts a saccadic movement strategy. We recorded the seals' normal swimming pattern with video cameras and analysed head and body movements. The swimming seals indeed minimized rotation times by saccadic head and body turns, with top rotation speeds exceeding 350 deg s-1, which increases the proportion of time spent in translational movement. Saccades occurred during both types of locomotion of the seals' intermittent swimming mode: active propulsion and gliding. In conclusion, harbour seals share the saccadic movement strategy of terrestrial animals. Whether this movement strategy is adopted to facilitate distance estimation from optic flow or serves a different function will be a topic of future research.
Affiliation(s)
- Bart R H Geurten
- Georg-August-University of Göttingen, Department of Cellular Neurobiology, Schwann-Schleiden Research Center, Julia-Lermontowa-Weg 3, Göttingen 37077, Germany
| | - Benedikt Niesterok
- University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
| | - Guido Dehnhardt
- University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
| | - Frederike D Hanke
- University of Rostock, Institute for Biosciences, Sensory and Cognitive Ecology, Albert-Einstein-Str. 3, Rostock 18059, Germany
| |
|
26
|
Ros IG, Bhagavatula PS, Lin HT, Biewener AA. Rules to fly by: pigeons navigating horizontal obstacles limit steering by selecting gaps most aligned to their flight direction. Interface Focus 2017; 7:20160093. [PMID: 28163883 DOI: 10.1098/rsfs.2016.0093] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success.
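The steering rules contrasted here are simple enough to state in a few lines: for horizontal obstacles, take the gap most aligned with the current flight direction; for vertical obstacles, the earlier study favoured the widest gap. The following sketch is an illustrative formalization, not the authors' model; the headings and gap bearings are hypothetical:

```python
def angular_deviation(a, b):
    """Smallest signed difference a - b in degrees, wrapped to [-180, 180)."""
    return (a - b + 180.0) % 360.0 - 180.0

def choose_gap_ho(heading, gap_bearings):
    """Horizontal-obstacle rule: pick the gap whose bearing deviates
    least from the current flight direction (all angles in degrees)."""
    return min(gap_bearings, key=lambda g: abs(angular_deviation(g, heading)))

def choose_gap_vo(gaps):
    """Vertical-obstacle rule from the earlier study: steer toward the
    widest visual gap. `gaps` maps bearing (deg) -> angular width (deg)."""
    return max(gaps, key=gaps.get)

print(choose_gap_ho(0.0, [-40.0, 5.0, 30.0]))   # 5.0 (most aligned)
print(choose_gap_ho(170.0, [-175.0, 120.0]))    # -175.0 (wrap-around handled)
print(choose_gap_vo({-40.0: 12.0, 5.0: 30.0}))  # 5.0 (widest gap)
```

The contrast between the two rules captures the paper's behavioural finding: HO navigation minimizes steering, while VO navigation maximizes clearance.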
Affiliation(s)
- Ivo G Ros
- Department of Organismic and Evolutionary Biology, Concord Field Station, Harvard University, Bedford, MA 01730, USA; Division of Biology and Bioengineering, California Institute of Technology, Pasadena, CA 91125, USA
| | - Partha S Bhagavatula
- Department of Organismic and Evolutionary Biology, Concord Field Station, Harvard University, Bedford, MA 01730, USA
| | - Huai-Ti Lin
- Department of Organismic and Evolutionary Biology, Concord Field Station, Harvard University, Bedford, MA 01730, USA; HHMI Janelia Research Campus, Ashburn, VA 20147, USA
| | - Andrew A Biewener
- Department of Organismic and Evolutionary Biology, Concord Field Station, Harvard University, Bedford, MA 01730, USA
| |
|
27
|
Sumiya M, Fujioka E, Motoi K, Kondo M, Hiryu S. Coordinated Control of Acoustical Field of View and Flight in Three-Dimensional Space for Consecutive Capture by Echolocating Bats during Natural Foraging. PLoS One 2017; 12:e0169995. [PMID: 28085936 PMCID: PMC5234808 DOI: 10.1371/journal.pone.0169995] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2016] [Accepted: 12/27/2016] [Indexed: 11/30/2022] Open
Abstract
Echolocating bats prey upon small moving insects in the dark using sophisticated sonar techniques. The direction and directivity pattern of the ultrasound broadcast of these bats are important factors that affect their acoustical field of view, allowing us to investigate how the bats control their acoustic attention (pulse direction) for advanced flight maneuvers. The purpose of this study was to understand the behavioral strategies of acoustical sensing of wild Japanese house bats Pipistrellus abramus in three-dimensional (3D) space during consecutive capture flights. The results showed that when the bats successively captured multiple airborne insects at short time intervals (less than 1.5 s), they maintained not only the immediate prey but also the subsequent one simultaneously within the beam widths of the emitted pulses in both the horizontal and vertical planes before capturing the immediate one. This suggests that echolocating bats maintain multiple prey within their acoustical field of view in a single sensing act using a wide directional beam while approaching the immediate prey, instead of frequently shifting acoustic attention between multiple prey. We also numerically simulated the bats' flight trajectories when approaching two prey successively to investigate the relationship between the acoustical field of view and the prey direction for effective consecutive captures. This simulation demonstrated that acoustically viewing both the immediate and the subsequent prey simultaneously increases the success rate of capturing both prey, which is considered to be one of the basic axes of efficient route planning for consecutive capture flight. The bat's wide sonar beam can incidentally cover multiple prey while the bat forages in an area where prey density is high. Our findings suggest that the bats then keep future targets within their acoustical field of view for effective foraging. In addition, in both the experimental results and the numerical simulations, the acoustic sensing and flights of the bats showed narrower vertical ranges than horizontal ranges. This suggests that the bats control their acoustic sensing according to different schemes in the horizontal and vertical planes, depending on their surroundings. These findings suggest that echolocating bats coordinate their control of the acoustical field of view and flight for consecutive captures in 3D space during natural foraging.
Affiliation(s)
- Miwa Sumiya: Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan
- Emyo Fujioka: Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan; Organization for Research Initiatives and Development, Doshisha University, Kyotanabe, Kyoto, Japan
- Kazuya Motoi: Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan
- Masaru Kondo: Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan
- Shizuko Hiryu: Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan; JST PRESTO, Kawaguchi, Saitama, Japan
28
Helmer D, Geurten BRH, Dehnhardt G, Hanke FD. Saccadic Movement Strategy in Common Cuttlefish (Sepia officinalis). Front Physiol 2017; 7:660. [PMID: 28105017] [PMCID: PMC5214429] [DOI: 10.3389/fphys.2016.00660]
Abstract
Most moving animals segregate their locomotion trajectories into short, burst-like rotations and prolonged translations to enhance distance information from optic flow, as only translational, not rotational, optic flow holds distance information. Underwater, optic flow is as valuable a source of information as it is in terrestrial habitats; so far, however, it has received little attention. To extend the knowledge of underwater optic flow perception and use, we filmed the movement patterns of six common cuttlefish (Sepia officinalis) with a high-speed camera. In the subsequent analysis, the center of mass of the cuttlefish body was manually traced to obtain thrust, slip, and yaw of the cuttlefish movements over time. Cuttlefish indeed performed short rotations, saccades, with rotational velocities up to 343°/s. They clearly separated rotations from translations, in line with the saccadic movement strategy documented for animals inhabiting terrestrial habitats as well as for semiaquatic harbor seals. However, this separation only occurred during fin motion. In contrast, during jet-propelled swimming, the separation between rotational and translational movements, and thus probably distance estimation on the basis of the optic flow field, is abolished in favor of high movement velocities. In conclusion, this study provides first evidence that an aquatic invertebrate, the cuttlefish, adopts a saccadic movement strategy depending on the behavioral context, which could enhance the information gained from optic flow.
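The thrust/slip/yaw traces described above are typically followed by labeling each frame as saccadic or intersaccadic. A minimal sketch of such a classification, thresholding absolute yaw velocity; the function name and the 100°/s threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def segment_saccades(yaw_deg_per_s, threshold=100.0):
    """Label a frame as saccadic (True) when the absolute yaw velocity
    exceeds a threshold, separating short rotational bursts (saccades)
    from prolonged translational phases."""
    yaw = np.asarray(yaw_deg_per_s, dtype=float)
    return np.abs(yaw) > threshold

# Synthetic trace: mostly translation, one 343 deg/s rotational burst.
trace = np.concatenate([np.full(50, 5.0), np.full(5, 343.0), np.full(50, -8.0)])
mask = segment_saccades(trace)
assert mask.sum() == 5  # only the burst frames are flagged as saccadic
```

In practice the threshold would be chosen from the bimodal distribution of yaw velocities rather than fixed a priori.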
Affiliation(s)
- Desiree Helmer: Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
- Bart R. H. Geurten: Department of Cellular Neurobiology, Schwann-Schleiden Research Center, Georg-August-University of Göttingen, Göttingen, Germany
- Guido Dehnhardt: Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
- Frederike D. Hanke: Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
29
Beetz MJ, Hechavarría JC, Kössl M. Cortical neurons of bats respond best to echoes from nearest targets when listening to natural biosonar multi-echo streams. Sci Rep 2016; 6:35991. [PMID: 27786252] [PMCID: PMC5081524] [DOI: 10.1038/srep35991]
Abstract
Bats orientate in darkness by listening to echoes of their biosonar calls, a behaviour known as echolocation. Recent studies showed that cortical neurons respond in a highly selective manner when stimulated with natural echolocation sequences that contain echoes from single targets. However, it remains unknown how cortical neurons process echolocation sequences containing echo information from multiple objects. In the present study, we used echolocation sequences containing echoes from three, two or one object separated in depth as stimuli to study neuronal activity in the bat auditory cortex. Neuronal activity was recorded with multi-electrode arrays placed in the dorsal auditory cortex, where neurons tuned to target distance are found. Our results show that target-distance encoding neurons are mostly selective to echoes coming from the closest object, and that the representation of echo information from distant objects is selectively suppressed. This suppression extends over a large part of the dorsal auditory cortex and may override possible parallel processing of multiple objects. These data suggest that global cortical suppression might establish a cortical "default mode" that allows the animal to focus selectively on the closest obstacle even without active attention.
Affiliation(s)
- M. Jerome Beetz: Institut für Zellbiologie und Neurowissenschaft, Goethe-Universität, Frankfurt/M., Germany
- Julio C. Hechavarría: Institut für Zellbiologie und Neurowissenschaft, Goethe-Universität, Frankfurt/M., Germany
- Manfred Kössl: Institut für Zellbiologie und Neurowissenschaft, Goethe-Universität, Frankfurt/M., Germany
30
Li J, Lindemann JP, Egelhaaf M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front Comput Neurosci 2016; 10:111. [PMID: 27818631] [PMCID: PMC5073142] [DOI: 10.3389/fncom.2016.00111]
Abstract
Flying insects, such as flies or bees, rely on consistent information about the depth structure of the environment when performing flight maneuvers in cluttered natural environments; these behaviors include avoiding collisions, approaching targets and spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. We address this question by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependency of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
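The correlation-type EMD that this model (and several others in this list) builds on can be sketched in a few lines. This is a generic Reichardt-detector sketch with a first-order low-pass as the delay filter; it is not the authors' full peripheral model, and the time constant is an illustrative choice:

```python
import numpy as np

def lowpass(signal, tau, dt=1.0):
    """First-order low-pass filter used as the EMD delay line."""
    alpha = dt / (tau + dt)
    out = np.zeros_like(signal, dtype=float)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def emd_response(left, right, tau=10.0):
    """Correlation-type EMD: each half-detector multiplies the delayed
    signal of one photoreceptor with the undelayed signal of its
    neighbour; the two mirror-symmetric halves are subtracted."""
    return lowpass(left, tau) * right - left * lowpass(right, tau)

# A rightward-moving brightness edge reaches the left receptor first,
# so the mean detector output is positive for rightward motion.
t = np.arange(200)
left = (t > 50).astype(float)
right = (t > 60).astype(float)   # same edge arrives 10 steps later
assert emd_response(left, right).mean() > 0
```

Because the output is a product of luminance signals, it scales with stimulus contrast, which is exactly the dependence the peripheral processing discussed above mitigates.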
Affiliation(s)
- Jinglin Li: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
31
Greer DA, Bonnen K, Huk AC, Cormack LK. Speed discrimination in the far monocular periphery: A relative advantage for interocular comparisons consistent with self-motion. J Vis 2016; 16:7. [PMID: 27548085] [PMCID: PMC5015968] [DOI: 10.1167/16.10.7]
Abstract
Some animals with lateral eyes (such as bees) control their navigation through the 3D world using velocity differences between the two eyes. Other animals with frontal eyes (such as primates, including humans) can perceive 3D motion based on the different velocities that a moving object projects upon the two retinae. Although one type of 3D motion perception involves a comparison between velocities from vastly different (monocular) portions of the visual field, and the other involves a comparison within overlapping (binocular) portions of the visual field, both compare velocities across the two eyes. Here we asked whether human interocular velocity comparisons, typically studied in the context of binocularly overlapping vision, operate in the far lateral (and hence, monocular) periphery and, if so, whether these comparisons were accordant with conventional interocular motion processing. We found that speed discrimination was indeed better between the two eyes' monocular visual fields, as compared to within a single eye's (monocular) visual field, but only when the velocities were consistent with commonly encountered motion. This intriguing finding suggests that mechanisms sensitive to relative motion information on opposite sides of an animal may have been retained, or at some point independently achieved, as the eyes became frontal in some animals.
32
Ros IG, Biewener AA. Optic flow stabilizes flight in ruby-throated hummingbirds. J Exp Biol 2016; 219:2443-8. [DOI: 10.1242/jeb.128488]
Abstract
Flying birds rely on visual cues for retinal image stabilization, negating rotation-induced optic flow, the motion of the visual panorama across the retina, through corrective eye and head movements. In combination with vestibular and proprioceptive feedback, birds may also use visual cues to stabilize their body during flight. Here, we test whether artificially induced wide-field motion generated through projected visual patterns elicits maneuvers in body orientation and flight position, in addition to stabilizing vision. To test this hypothesis, we presented hummingbirds flying freely within a 1.2 m cylindrical visual arena with a virtual surround rotated at different speeds about its vertical axis. The birds responded robustly to these visual perturbations by rotating their heads and bodies with the moving visual surround, and by adjusting their flight trajectories to follow the surround. Thus, similar to insects, hummingbirds appear to use optic flow cues to control flight maneuvers in addition to stabilizing their visual inputs.
Affiliation(s)
- Ivo G. Ros: Harvard University, Department of Organismic and Evolutionary Biology, Concord Field Station, 100 Old Causeway Road, Bedford, MA 01730, USA
- Andrew A. Biewener: Harvard University, Department of Organismic and Evolutionary Biology, Concord Field Station, 100 Old Causeway Road, Bedford, MA 01730, USA
33
Altshuler DL, Bahlman JW, Dakin R, Gaede AH, Goller B, Lentink D, Segre PS, Skandalis DA. The biophysics of bird flight: functional relationships integrate aerodynamics, morphology, kinematics, muscles, and sensors. Can J Zool 2015. [DOI: 10.1139/cjz-2015-0103]
Abstract
Bird flight is a remarkable adaptation that has allowed the approximately 10 000 extant species to colonize all terrestrial habitats on earth including high elevations, polar regions, distant islands, arid deserts, and many others. Birds exhibit numerous physiological and biomechanical adaptations for flight. Although bird flight is often studied at the level of aerodynamics, morphology, wingbeat kinematics, muscle activity, or sensory guidance independently, in reality these systems are naturally integrated. There has been an abundance of new studies in these mechanistic aspects of avian biology but comparatively less recent work on the physiological ecology of avian flight. Here we review research at the interface of the systems used in flight control and discuss several common themes. Modulation of aerodynamic forces to respond to different challenges is driven by three primary mechanisms: wing velocity about the shoulder, shape within the wing, and angle of attack. For birds that flap, the distinction between velocity and shape modulation synthesizes diverse studies in morphology, wing motion, and motor control. Recently developed tools for studying bird flight are influencing multiple areas of investigation, and in particular the role of sensory systems in flight control. How sensory information is transformed into motor commands in the avian brain remains, however, a largely unexplored frontier.
Affiliation(s)
- Douglas L. Altshuler: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Joseph W. Bahlman: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Roslyn Dakin: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Andrea H. Gaede: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Benjamin Goller: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- David Lentink: Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA
- Paolo S. Segre: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Dimitri A. Skandalis: Department of Zoology, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
34
Bertrand OJN, Lindemann JP, Egelhaaf M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput Biol 2015; 11:e1004339. [PMID: 26583771] [PMCID: PMC4652890] [DOI: 10.1371/journal.pcbi.1004339]
Abstract
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, when searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of optic flow on a spherical eye experienced during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and, thus, do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors as its input. Even then, the algorithm led successfully to collision avoidance and, in addition, replicated the characteristics of collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
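The step that turns a depth estimate into a motion direction can be illustrated by a nearness-weighted vector sum, in the spirit of the center-of-mass average nearness vector used in this line of work. The sketch below is a simplification with illustrative names and a synthetic Gaussian nearness profile, not the authors' full closed-loop model:

```python
import numpy as np

def collision_avoidance_direction(nearness, azimuths):
    """Sum unit vectors toward each viewing direction, weighted by the
    nearness measured there; the agent steers opposite to this resultant
    vector, i.e. away from the region of closest objects."""
    vectors = np.stack([np.cos(azimuths), np.sin(azimuths)], axis=-1)
    resultant = (nearness[:, None] * vectors).sum(axis=0)
    return np.arctan2(-resultant[1], -resultant[0])  # opposite heading

# A single close object at +30 deg azimuth; the avoidance heading
# points into the opposite half of the visual field.
az = np.deg2rad(np.arange(-180.0, 180.0, 10.0))
near = np.exp(-0.5 * ((az - np.deg2rad(30)) / 0.2) ** 2)
h = collision_avoidance_direction(near, az)
assert -np.pi <= h <= 0
```

Combining this avoidance heading with a goal direction, as in the final experiments of the abstract, amounts to a weighted sum of the two headings.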
Affiliation(s)
- Martin Egelhaaf: Neurobiologie & CITEC, Bielefeld University, Bielefeld, Germany
35
Kress D, van Bokhorst E, Lentink D. How Lovebirds Maneuver Rapidly Using Super-Fast Head Saccades and Image Feature Stabilization. PLoS One 2015; 10:e0129287. [PMID: 26107413] [PMCID: PMC4481315] [DOI: 10.1371/journal.pone.0129287]
Abstract
Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take-off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkable stereotypical gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
Affiliation(s)
- Daniel Kress: Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America
- Evelien van Bokhorst: Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America; Department of Mechanical Engineering and Aeronautics, City University London, London, United Kingdom
- David Lentink: Department of Mechanical Engineering, Stanford University, Stanford, California, United States of America; Experimental Zoology Group, Wageningen University, Wageningen, The Netherlands
36
Abstract
Relatively little is known about how sensory information is used for controlling flight in birds. A powerful method is to immerse an animal in a dynamic virtual reality environment to examine behavioral responses. Here, we investigated the role of vision during free-flight hovering in hummingbirds to determine how optic flow--image movement across the retina--is used to control body position. We filmed hummingbirds hovering in front of a projection screen with the prediction that projecting moving patterns would disrupt hovering stability but stationary patterns would allow the hummingbird to stabilize position. When hovering in the presence of moving gratings and spirals, hummingbirds lost positional stability and responded to the specific orientation of the moving visual stimulus. There was no loss of stability with stationary versions of the same stimulus patterns. When exposed to a single stimulus many times or to a weakened stimulus that combined a moving spiral with a stationary checkerboard, the response to looming motion declined. However, even minimal visual motion was sufficient to cause a loss of positional stability despite prominent stationary features. Collectively, these experiments demonstrate that hummingbirds control hovering position by stabilizing motions in their visual field. The high sensitivity and persistence of this disruptive response is surprising, given that the hummingbird brain is highly specialized for sensory processing and spatial mapping, providing other potential mechanisms for controlling position.
37
38
Kane SA, Zamani M. Falcons pursue prey using visual motion cues: new perspectives from animal-borne cameras. J Exp Biol 2014; 217:225-34. [PMID: 24431144] [DOI: 10.1242/jeb.092403]
Abstract
This study reports on experiments on falcons wearing miniature videocameras mounted on their backs or heads while pursuing flying prey. Videos of hunts by a gyrfalcon (Falco rusticolus), gyrfalcon (F. rusticolus)/Saker falcon (F. cherrug) hybrids and peregrine falcons (F. peregrinus) were analyzed to determine apparent prey positions on their visual fields during pursuits. These video data were then interpreted using computer simulations of pursuit steering laws observed in insects and mammals. A comparison of the empirical and modeling data indicates that falcons use cues due to the apparent motion of prey on the falcon's visual field to track and capture flying prey via a form of motion camouflage. The falcons also were found to maintain their prey's image at visual angles consistent with using their shallow fovea. These results should prove relevant for understanding the co-evolution of pursuit and evasion, as well as the development of computer models of predation and the integration of sensory and locomotion systems in biomimetic robots.
39
Schwegmann A, Lindemann JP, Egelhaaf M. Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Front Comput Neurosci 2014; 8:83. [PMID: 25136314] [PMCID: PMC4118023] [DOI: 10.3389/fncom.2014.00083]
Abstract
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
Affiliation(s)
- Martin Egelhaaf: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
40
Lin HT, Ros IG, Biewener AA. Through the eyes of a bird: modelling visually guided obstacle flight. J R Soc Interface 2014; 11:20140239. [PMID: 24812052] [DOI: 10.1098/rsif.2014.0239]
Abstract
Various flight navigation strategies for birds have been identified at the large spatial scales of migratory and homing behaviours. However, relatively little is known about close-range obstacle negotiation through cluttered environments. To examine obstacle flight guidance, we tracked pigeons (Columba livia) flying through an artificial forest of vertical poles. Interestingly, pigeons adjusted their flight path only approximately 1.5 m from the forest entry, suggesting a reactive mode of path planning. Combining flight trajectories with obstacle pole positions, we reconstructed the visual experience of the pigeons throughout obstacle flights. Assuming proportional-derivative control with a constant delay, we searched the relevant parameter space of steering gains and visuomotor delays that best explained the observed steering. We found that a pigeon's steering resembles proportional control driven by the error angle between the flight direction and the desired opening, or gap, between obstacles. Using this pigeon steering controller, we simulated obstacle flights and showed that pigeons do not simply steer to the nearest opening in the direction of flight or destination. Pigeons bias their flight direction towards larger visual gaps when making fast steering decisions. The proposed behavioural modelling method converts the obstacle avoidance behaviour into a (piecewise) target-aiming behaviour, which is better defined and understood. This study demonstrates how such an approach decomposes open-loop free-flight behaviours into components that can be independently evaluated.
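The inferred steering law, proportional control on the error angle with a constant visuomotor delay, can be sketched as a discrete-time simulation. The gain, delay, and time step below are illustrative values, not the parameters fitted in the study:

```python
from collections import deque

def simulate_steering(target_angle, gain=4.0, delay_steps=3, dt=0.02, steps=100):
    """Proportional steering with a fixed visuomotor delay: the turning
    rate is proportional to the delayed error angle between the current
    heading and the desired gap direction."""
    heading = 0.0
    buffer = deque([0.0] * delay_steps, maxlen=delay_steps)  # delayed errors
    for _ in range(steps):
        error = target_angle - heading
        delayed_error = buffer[0]     # error sensed delay_steps ago
        buffer.append(error)
        heading += gain * delayed_error * dt  # turn toward the gap
    return heading

# Heading converges toward a gap 0.4 rad off the initial flight direction.
final = simulate_steering(0.4)
assert abs(final - 0.4) < 0.05
```

With too high a gain or too long a delay this loop oscillates, which is why the parameter search over gains and delays described in the abstract is informative about the birds' control.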
Affiliation(s)
- Huai-Ti Lin: Department of Organismic and Evolutionary Biology, Harvard University, Concord Field Station, 100 Old Causeway Road, Bedford, MA 01730, USA
41
Eckmeier D, Kern R, Egelhaaf M, Bischof HJ. Encoding of naturalistic optic flow by motion sensitive neurons of nucleus rotundus in the zebra finch (Taeniopygia guttata). Front Integr Neurosci 2013; 7:68. [PMID: 24065895] [PMCID: PMC3778379] [DOI: 10.3389/fnint.2013.00068]
Abstract
The retinal image changes that occur during locomotion, the optic flow, carry information about self-motion and the three-dimensional structure of the environment. Especially fast-moving animals with only little binocular vision depend on these depth cues for maneuvering; they actively control their gaze to facilitate perception of depth based on cues in the optic flow. In the visual system of birds, nucleus rotundus neurons were originally found to respond to object motion but not to background motion. However, when background and object were both moving, responses increased the more the direction and velocity of object and background motion on the retina differed. These properties may play a role in representing depth cues in the optic flow. We therefore investigated how neurons in nucleus rotundus respond to optic flow that contains depth cues. We presented simplified and naturalistic optic flow on a panoramic LED display while recording from single neurons in nucleus rotundus of anaesthetized zebra finches. Unlike most studies on motion vision in birds, our stimuli included depth information. We found extensive responses of motion-selective neurons in nucleus rotundus to optic flow stimuli. Simplified stimuli revealed preferences for optic flow reflecting translational or rotational self-motion. Naturalistic optic flow stimuli elicited complex response modulations, but the presence of objects was signaled by only a few neurons. The neurons that did respond to objects in the optic flow, however, showed interesting properties.
Affiliation(s)
- Dennis Eckmeier: Neuroethology Group, Department of Behavioural Biology, Bielefeld University, Bielefeld, Germany
42
Wisniewska DM, Johnson M, Beedholm K, Wahlberg M, Madsen PT. Acoustic gaze adjustments during active target selection in echolocating porpoises. J Exp Biol 2013; 215:4358-73. [PMID: 23175527] [DOI: 10.1242/jeb.074013]
Abstract
Visually dominant animals use gaze adjustments to organize perceptual inputs for cognitive processing. Thereby they manage the massive sensory load from complex and noisy scenes. Echolocation, as an active sensory system, may provide more opportunities to control such information flow by adjusting the properties of the sound source. However, most studies of toothed whale echolocation have involved stationed animals in static auditory scenes for which dynamic information control is unnecessary. To mimic conditions in the wild, we designed an experiment with captive, free-swimming harbor porpoises tasked with discriminating between two hydrophone-equipped targets and closing in on the selected target; this allowed us to gain insight into how porpoises adjust their acoustic gaze in a multi-target dynamic scene. By means of synchronized cameras, an acoustic tag and on-target hydrophone recordings we demonstrate that porpoises employ both beam direction control and range-dependent changes in output levels and pulse intervals to accommodate their changing spatial relationship with objects of immediate interest. We further show that, when switching attention to another target, porpoises can set their depth of gaze accurately for the new target location. In combination, these observations imply that porpoises exert precise vocal-motor control that is tied to spatial perception akin to visual accommodation. Finally, we demonstrate that at short target ranges porpoises narrow their depth of gaze dramatically by adjusting their output so as to focus on a single target. This suggests that echolocating porpoises switch from a deliberative mode of sensorimotor operation to a reactive mode when they are close to a target.
Affiliation(s)
- Danuta Maria Wisniewska
- Zoophysiology, Department of Bioscience, Aarhus University, Building 1131, C. F. Moellers Alle 3, DK-8000 Aarhus C, Denmark.
|
43
|
Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Front Behav Neurosci 2013; 6:92. [PMID: 23335890] [PMCID: PMC3542507] [DOI: 10.3389/fnbeh.2012.00092]
Abstract
Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the previous translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture-dependence of elementary motion detection in flies, reflected also in the responses of large-field neurons as used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Already subtle changes in the texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independent of the textural properties of the environment.
|
44
|
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913] [PMCID: PMC3526811] [DOI: 10.3389/fncir.2012.00108]
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight manoeuvres and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown by their characteristic behavioral actions to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
|
45
|
Bischof HJ, Nießner C, Peichl L, Wiltschko R, Wiltschko W. Avian ultraviolet/violet cones as magnetoreceptors: the problem of separating visual and magnetic information. Commun Integr Biol 2012; 4:713-6. [PMID: 22446535] [DOI: 10.4161/cib.17338]
Abstract
In a recent paper, we described the localization of cryptochrome 1a in the retina of domestic chickens, Gallus gallus, and European robins, Erithacus rubecula: Cryptochrome 1a was found exclusively along the membranes of the disks in the outer segments of the ultraviolet/violet single cones. Cryptochrome has been suggested to act as receptor molecule for the avian magnetic compass, which would mean that the UV/V cones have a double function: they mediate vision in the short-wavelength range and, at the same time, magnetic directional information. This has important implications and raises a number of questions, in particular, how the two types of input are separated. Here, we point out several possibilities how this could be achieved.
|
46
|
Are olfactory cues involved in nest recognition in two social species of estrildid finches? PLoS One 2012; 7:e36615. [PMID: 22574196] [PMCID: PMC3344906] [DOI: 10.1371/journal.pone.0036615]
Abstract
Reliably recognizing their own nest provides parents with a necessary skill to invest time and resources efficiently in raising their offspring and thereby maximising their own reproductive success. Studies investigating nest recognition in adult birds have focused mainly on visual cues of the nest or the nest site and acoustic cues of the nestlings. To determine whether adult songbirds also use olfaction for nest recognition, we investigated the use of olfactory nest cues for two estrildid finch species, zebra finches (Taeniopygia guttata) and Bengalese finches (Lonchura striata var. domestica) during the nestling and fledgling phase of their offspring. We found similar behavioural responses to nest odours in both songbird species. Females preferred the odour of their own nest over a control and avoided the foreign conspecific nest scent over a control during the nestling phase of their offspring, but when given the own odour and the foreign conspecific odour simultaneously we did not find a preference for the own nest odour. Males of both species did not show any preferences at all. The behavioural reaction to any nest odour decreased after fledging of the offspring. Our results show that only females show a behavioural response to olfactory nest cues, indicating that the use of olfactory cues for nest recognition seems to be sex-specific and dependent on the developmental stage of the offspring. Although estrildid finches are known to use visual and acoustic cues for nest recognition, the similar behavioural pattern of both species indicates that at least females gain additional information by olfactory nest cues during the nestling phase of their offspring. Thus olfactory cues might be important in general, even in situations in which visual and acoustic cues are known to be sufficient.
|
47
|
Geurten BRH, Kern R, Egelhaaf M. Species-specific flight styles of flies are reflected in the response dynamics of a homolog motion-sensitive neuron. Front Integr Neurosci 2012; 6:11. [PMID: 22485089] [PMCID: PMC3307035] [DOI: 10.3389/fnint.2012.00011]
Abstract
Hoverflies and blowflies have distinctly different flight styles. Yet, both species have been shown to structure their flight behavior in a way that facilitates extraction of 3D information from the image flow on the retina (optic flow). Neuronal candidates to analyze the optic flow are the tangential cells in the third optical ganglion - the lobula complex. These neurons are directionally selective and integrate the optic flow over large parts of the visual field. Homolog tangential cells in hoverflies and blowflies have a similar morphology. Because blowflies and hoverflies have similar neuronal layout but distinctly different flight behaviors, they are an ideal substrate to pinpoint potential neuronal adaptations to the different flight styles. In this article we describe the relationship between locomotion behavior and motion vision on three different levels: (1) We compare the different flight styles based on the categorization of flight behavior into prototypical movements. (2) We measure the species-specific dynamics of the optic flow under naturalistic flight conditions. We found the translational optic flow of both species to be very different. (3) We describe possible adaptations of a homolog motion-sensitive neuron. We stimulate this cell in blowflies (Calliphora) and hoverflies (Eristalis) with naturalistic optic flow generated by both species during free flight. The characterized hoverfly tangential cell responds faster to transient changes in the optic flow than its blowfly homolog. It is discussed whether and how the different dynamical response properties aid optic flow analysis.
Affiliation(s)
- Bart R. H. Geurten
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Department of Cellular Neurobiology, Johann-Friedrich-Blumenbach Institute for Zoology and Anthropology, Georg-August-University Göttingen, Göttingen, Lower Saxony, Germany
- Roland Kern
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
|
48
|
Liang P, Heitwerth J, Kern R, Kurtz R, Egelhaaf M. Object representation and distance encoding in three-dimensional environments by a neural circuit in the visual system of the blowfly. J Neurophysiol 2012; 107:3446-57. [PMID: 22423002] [DOI: 10.1152/jn.00530.2011]
Abstract
Three motion-sensitive key elements of a neural circuit, presumably involved in processing object and distance information, were analyzed with optic flow sequences as experienced by blowflies in a three-dimensional environment. This optic flow is largely shaped by the blowfly's saccadic flight and gaze strategy, which separates translational flight segments from fast saccadic rotations. By modifying this naturalistic optic flow, all three analyzed neurons could be shown to respond during the intersaccadic intervals not only to nearby objects but also to changes in the distance to background structures. In the presence of strong background motion, the three types of neuron differ in their sensitivity for object motion. Object-induced response increments are largest in FD1, a neuron long known to respond better to moving objects than to spatially extended motion patterns, but weakest in VCH, a neuron that integrates wide-field motion from both eyes and, by inhibiting the FD1 cell, is responsible for its object preference. Small but significant object-induced response increments are present in HS cells, which serve both as a major input neuron of VCH and as output neurons of the visual system. In both HS and FD1, intersaccadic background responses decrease with increasing distance to the animal, although much more prominently in FD1. This strong dependence of FD1 on background distance is concluded to be the consequence of the activity of VCH that dramatically increases its activity and, thus, its inhibitory strength with increasing distance.
Affiliation(s)
- Pei Liang
- Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
|
49
|
Bhagavatula P, Claudianos C, Ibbotson M, Srinivasan M. Optic flow cues guide flight in birds. Curr Biol 2011; 21:1794-9. [DOI: 10.1016/j.cub.2011.09.009]
|
50
|
Niessner C, Denzau S, Gross JC, Peichl L, Bischof HJ, Fleissner G, Wiltschko W, Wiltschko R. Avian ultraviolet/violet cones identified as probable magnetoreceptors. PLoS One 2011; 6:e20091. [PMID: 21647441] [PMCID: PMC3102070] [DOI: 10.1371/journal.pone.0020091]
Abstract
Background: The Radical-Pair-Model postulates that the reception of magnetic compass directions in birds is based on spin-chemical reactions in specialized photopigments in the eye, with cryptochromes discussed as candidate molecules. But so far, the exact subcellular characterization of these molecules in the retina remained unknown.
Methodology/Principal Findings: We here describe the localization of cryptochrome 1a (Cry1a) in the retina of European robins, Erithacus rubecula, and domestic chickens, Gallus gallus, two species that have been shown to use the magnetic field for compass orientation. In both species, Cry1a is present exclusively in the ultraviolet/violet (UV/V) cones that are distributed across the entire retina. Electron microscopy shows Cry1a in ordered bands along the membrane discs of the outer segment, and cell fractionation reveals Cry1a in the membrane fraction, suggesting the possibility that Cry1a is anchored along membranes.
Conclusions/Significance: We provide first structural evidence that Cry1a occurs within a sensory structure arranged in a way that fulfils essential requirements of the Radical-Pair-Model. Our findings, identifying the UV/V-cones as probable magnetoreceptors, support the assumption that Cry1a is indeed the receptor molecule mediating information on magnetic directions, and thus provide the Radical-Pair-Model with a profound histological background.
Affiliation(s)
- Christine Niessner
- Fachbereich Biowissenschaften der J.W. Goethe-Universität Frankfurt, Frankfurt am Main, Germany
|