1. Through Hawks’ Eyes: Synthetically Reconstructing the Visual Field of a Bird in Flight. Int J Comput Vis 2023; 131:1497-1531. PMID: 37089199; PMCID: PMC10110700; DOI: 10.1007/s11263-022-01733-2.
Abstract
Birds of prey rely on vision to execute flight manoeuvres that are key to their survival, such as intercepting fast-moving targets or navigating through clutter. A better understanding of the role played by vision during these manoeuvres is not only relevant within the field of animal behaviour, but could also have applications for autonomous drones. In this paper, we present a novel method that uses computer vision tools to analyse the role of active vision in bird flight, and demonstrate its use to answer behavioural questions. Combining motion capture data from Harris’ hawks with a hybrid 3D model of the environment, we render RGB images, semantic maps, depth information and optic flow outputs that characterise the visual experience of the bird in flight. In contrast with previous approaches, our method allows us to consider different camera models and alternative gaze strategies for the purposes of hypothesis testing, covers the complete visual field of the bird, and is not limited by the technical specifications and performance of a head-mounted camera light enough to attach to a bird’s head in flight. We present pilot data from three sample flights: a pursuit flight, in which a hawk intercepts a moving target, and two obstacle avoidance flights. With this approach, we provide a reproducible method that facilitates the collection of large volumes of data across many individuals, opening up new avenues for data-driven models of animal behaviour.
2. Parkinson RH, Fecher C, Gray JR. Chronic exposure to insecticides impairs honeybee optomotor behaviour. Front Insect Sci 2022; 2:936826. PMID: 38468783; PMCID: PMC10926483; DOI: 10.3389/finsc.2022.936826.
Abstract
Honeybees use wide-field visual motion information to calculate the distance they have flown from the hive, and this information is communicated to conspecifics during the waggle dance. Seed treatment insecticides, including neonicotinoids and novel insecticides like sulfoxaflor, have detrimental effects on wild and managed bees, even when present at sublethal quantities. These effects include deficits in flight navigation and homing ability, and decreased survival of exposed worker bees. Neonicotinoid insecticides disrupt visual motion detection in the locust, resulting in impaired escape behaviors, but it had not previously been shown whether seed treatment insecticides disrupt wide-field motion detection in the honeybee. Here, we show that sublethal exposure to two commonly used insecticides, imidacloprid (a neonicotinoid) and sulfoxaflor, impairs optomotor behavior in the honeybee. This behavioral effect correlates with altered stress and detoxification gene expression in the brain. Exposure to sulfoxaflor led to sparse increases in neuronal apoptosis, localized primarily in the optic lobes; imidacloprid had no such effect. We propose that exposure to cholinergic insecticides disrupts the honeybee's ability to accurately encode wide-field visual motion, resulting in impaired optomotor behaviors. These findings provide a novel explanation for previously described effects of neonicotinoid insecticides on navigation, and link these effects to sulfoxaflor, for which there is a gap in scientific knowledge.
Affiliation(s)
- Rachel H. Parkinson: Grass Laboratory, Marine Biological Laboratory, Woods Hole, MA, United States; Department of Zoology, University of Oxford, Oxford, United Kingdom; Department of Biology, University of Saskatchewan, Saskatoon, SK, Canada
- Caroline Fecher: Grass Laboratory, Marine Biological Laboratory, Woods Hole, MA, United States; Institute of Neuronal Cell Biology, Technical University of Munich, Munich, Germany
- John R. Gray: Department of Biology, University of Saskatchewan, Saskatoon, SK, Canada
3. Chatterjee P, Prusty AD, Mohan U, Sane SP. Integration of visual and antennal mechanosensory feedback during head stabilization in hawkmoths. eLife 2022; 11:e78410. PMID: 35758646; PMCID: PMC9259029; DOI: 10.7554/eLife.78410.
Abstract
During flight maneuvers, insects exhibit compensatory head movements which are essential for stabilizing the visual field on their retina, reducing motion blur, and supporting visual self-motion estimation. In Diptera, such head movements are mediated via visual feedback from their compound eyes that detect retinal slip, as well as rapid mechanosensory feedback from their halteres, the modified hindwings that sense the angular rates of body rotations. Because non-Dipteran insects lack halteres, it is not known if mechanosensory feedback about body rotations plays any role in their head stabilization response. Diverse non-Dipteran insects are known to rely on visual and antennal mechanosensory feedback for flight control. In hawkmoths, for instance, reduction of antennal mechanosensory feedback severely compromises their ability to control flight. Similarly, when the head movements of freely flying moths are restricted, their flight ability is also severely impaired. The role of compensatory head movements as well as multimodal feedback in insect flight raises an interesting question: in insects that lack halteres, what sensory cues are required for head stabilization? Here, we show that in the nocturnal hawkmoth Daphnis nerii, compensatory head movements are mediated by combined visual and antennal mechanosensory feedback. We subjected tethered moths to open-loop body roll rotations under different lighting conditions, and measured their ability to maintain head angle in the presence or absence of antennal mechanosensory feedback. Our study suggests that head stabilization in moths is mediated primarily by visual feedback during roll movements at lower frequencies, whereas antennal mechanosensory feedback is required when roll occurs at higher frequencies. These findings are consistent with the hypothesis that control of head angle results from a multimodal feedback loop that integrates both visual and antennal mechanosensory feedback, albeit at different latencies.
At adequate light levels, visual feedback is sufficient for head stabilization primarily at low frequencies of body roll; under dark conditions, however, antennal mechanosensory feedback is essential for the control of head movements at high frequencies of body roll.
Affiliation(s)
- Payel Chatterjee: National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Agnish Dev Prusty: National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Umesh Mohan: National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Sanjay P. Sane: National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
4. Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M. Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 2022; 225:jeb243021. PMID: 35067721; PMCID: PMC8920035; DOI: 10.1242/jeb.243021.
Abstract
Insects are remarkable flyers, capable of navigating through highly cluttered environments. We tracked the head and thorax of bumblebees flying freely in a tunnel containing vertically oriented obstacles to uncover the sensorimotor strategies used for obstacle detection and collision avoidance. Bumblebees displayed all the characteristics of active vision during flight, stabilizing their head relative to the external environment and maintaining close alignment between their gaze and flight path. Head stabilization increased the motion contrast of nearby features against the background, enabling obstacle detection. As bees approached obstacles, they appeared to modulate avoidance responses based on the relative retinal expansion velocity (RREV) of obstacles, and their maximum evasion acceleration was linearly related to the maximum RREV. Finally, bees prevented collisions through rapid roll manoeuvres implemented by their thorax. Overall, this combination of visuo-motor strategies highlights the elegant solutions insects have evolved for visually guided flight through cluttered environments.
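As a rough illustration of the avoidance cue in this abstract: relative retinal expansion velocity is commonly computed as an obstacle's angular expansion rate divided by its current angular size (θ̇/θ, units of 1/s). A minimal sketch, assuming the obstacle's angular size over time has already been extracted from tracking data (the function name and the constant-speed looming geometry are illustrative, not taken from the paper):

```python
import numpy as np

def rrev(theta, dt):
    """Relative retinal expansion velocity: angular expansion rate
    of an obstacle divided by its current angular size (1/s)."""
    theta = np.asarray(theta, dtype=float)
    theta_dot = np.gradient(theta, dt)  # finite-difference expansion rate (rad/s)
    return theta_dot / theta

# Angular size of an obstacle of radius 5 cm approached at constant speed:
# theta(t) = 2 * arctan(r / d(t)), with distance d shrinking over time.
dt = 0.01                           # 100 Hz sampling
d = np.linspace(2.0, 0.2, 181)      # distance to obstacle (m), ~1 m/s approach
theta = 2 * np.arctan(0.05 / d)     # angular size of the obstacle (rad)
r = rrev(theta, dt)
print(r[-1] > r[0])                 # RREV grows as the obstacle looms
```

For a small obstacle approached at constant speed, RREV reduces to approximately speed/distance, so it rises as the obstacle looms; scaling evasion responses by RREV therefore yields stronger responses for faster or closer approaches.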
Affiliation(s)
- Sridhar Ravi (corresponding author): Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany; School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2600, Australia
- Tim Siesenop: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Olivier J. Bertrand: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Liang Li: Department of Collective Behavior, Max Planck Institute of Animal Behavior, 78464 Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, 78464 Konstanz, Germany; Department of Biology, University of Konstanz, 78464 Konstanz, Germany
- Charlotte Doussot: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Alex Fisher: School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- William H. Warren: Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, RI 02912, USA
- Martin Egelhaaf: Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
5. Iyer V, Najafi A, James J, Fuller S, Gollakota S. Wireless steerable vision for live insects and insect-scale robots. Sci Robot 2020; 5(44):eabb0839. PMID: 33022605; DOI: 10.1126/scirobotics.abb0839.
Abstract
Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost of supporting sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and to evolve visual systems that move independently of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics that balance energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams "first person" 160-by-120-pixel monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10-milliamp-hour battery. We also built a small terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views at 26 to 84 times lower energy than moving the whole robot.
Affiliation(s)
- Vikram Iyer: Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA, USA; Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA
- Ali Najafi: Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA
- Johannes James: Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
- Sawyer Fuller: Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
- Shyamnath Gollakota: Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA, USA; Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA; Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
6. Active vision shapes and coordinates flight motor responses in flies. Proc Natl Acad Sci U S A 2020; 117:23085-23095. PMID: 32873637; DOI: 10.1073/pnas.1920846117.
Abstract
Animals use active sensing to respond to sensory inputs and guide future motor decisions. In flight, flies generate a pattern of head and body movements to stabilize gaze. How the brain relays visual information to control head and body movements, and how active head movements influence downstream motor control, remain elusive. Using a control theoretic framework, we studied the optomotor gaze stabilization reflex in tethered flight and quantified how head movements stabilize visual motion and shape wing steering efforts in fruit flies (Drosophila). By shaping visual inputs, head movements increased the gain of wing steering responses and the coordination between stimulus and wings, pointing to a tight coupling between head and wing movements. Head movements followed the visual stimulus in as little as 10 ms, a delay similar to that of the human vestibulo-ocular reflex, whereas wing steering responses lagged by more than 40 ms. This timing difference suggests a temporal order in the flow of visual information, such that the head filters the visual information eliciting downstream wing steering responses. Head fixation significantly decreased the mechanical power generated by the flight motor by reducing wingbeat frequency and overall thrust. By simulating an elementary motion detector array, we show that head movements shift the effective visual input dynamic range onto the sensitivity optimum of the motion vision pathway. Taken together, our results reveal a transformative influence of active vision on flight motor responses in flies. Our work provides a framework for understanding how to coordinate moving sensors on a moving body.
7. Kalyanasundaram P, Willis MA. Parameters of motion vision in low light in the hawkmoth Manduca sexta. J Exp Biol 2018; 221:jeb173344. DOI: 10.1242/jeb.173344.
Abstract
The hawkmoth Manduca sexta is nocturnally active, beginning its flight activity at sunset and executing rapid, controlled maneuvers to search for food and mates in dim light. This moth's visual system has been shown to trade off spatial and temporal resolution for increased sensitivity under these conditions. The study presented here uses tethered flying moths to characterize the performance envelope of M. sexta's wide-field-motion-triggered steering response in low light by measuring attempted turning in response to wide-field visual motion. Moths were challenged with a horizontally oscillating sinusoidal grating over a range of luminances, from daylight to starlight conditions. The impact of luminance on the response to a range of temporal frequencies and spatial wavelengths was assessed across a range of pattern contrasts. The optomotor response decreased with decreasing luminance, and the lower limit of the moth's contrast sensitivity was found to lie between 1% and 5%. The preferred spatial frequency of M. sexta increased from 0.06 to 0.3 cycles/degree as luminance decreased, but the preferred temporal frequency remained stable at 4.5 Hz across all conditions. The relationship between the optomotor response time and the temporal frequency of the pattern movement did not vary significantly with luminance. Taken together, these results suggest that the behavioral response to wide-field visual input in M. sexta is adapted to operate at crepuscular to nocturnal luminance levels, and that the decreasing light levels experienced during that period change visual acuity but do not significantly affect response time.
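For drifting gratings of the kind used here, angular velocity, temporal frequency and spatial frequency are linked by velocity = TF/SF. A short sketch (the function name is illustrative, not from the paper) showing what the reported tuning shift implies for preferred pattern velocity:

```python
def grating_velocity(temporal_freq_hz, spatial_freq_cpd):
    """Angular velocity (deg/s) of a drifting sinusoidal grating:
    velocity = temporal frequency / spatial frequency."""
    return temporal_freq_hz / spatial_freq_cpd

# With the preferred temporal frequency fixed at 4.5 Hz, the reported shift
# in preferred spatial frequency from 0.06 to 0.3 cycles/degree implies a
# drop in preferred pattern velocity as luminance decreases:
print(grating_velocity(4.5, 0.06))  # ~75 deg/s in bright conditions
print(grating_velocity(4.5, 0.3))   # ~15 deg/s in dim conditions
```

Read this way, the moth's optomotor pathway would be tuned to slower image motion under dim light, consistent with the sensitivity trade-off described in the abstract.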
Affiliation(s)
- P. Kalyanasundaram: Department of Biology, Case Western Reserve University, Cleveland, OH 44106-7080, USA
- M. A. Willis: Department of Biology, Case Western Reserve University, Cleveland, OH 44106-7080, USA
8. Windsor SP, Taylor GK. Head movements quadruple the range of speeds encoded by the insect motion vision system in hawkmoths. Proc Biol Sci 2017; 284:20171622. PMID: 28978733; DOI: 10.1098/rspb.2017.1622.
Abstract
Flying insects use compensatory head movements to stabilize gaze. Like other optokinetic responses, these movements can reduce image displacement, motion and misalignment, and simplify the optic flow field. Because gaze is imperfectly stabilized in insects, we hypothesized that compensatory head movements serve to extend the range of velocities of self-motion that the visual system encodes. We tested this by measuring head movements in hawkmoths Hyles lineata responding to full-field visual stimuli of differing oscillation amplitudes, oscillation frequencies and spatial frequencies. We used frequency-domain system identification techniques to characterize the head's roll response, and simulated how this would have affected the output of the motion vision system, modelled as a computational array of Reichardt detectors. The moths' head movements were modulated to allow encoding of both fast and slow self-motion, effectively quadrupling the working range of the visual system for flight control. By using its own output to drive compensatory head movements, the motion vision system thereby works as an adaptive sensor, which will be especially beneficial in nocturnal species with inherently slow vision. Studies of the ecology of motion vision must therefore consider the tuning of motion-sensitive interneurons in the context of the closed-loop systems in which they function.
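The Reichardt detector model referred to above correlates each photoreceptor signal with a delayed (low-pass-filtered) copy of its neighbour's signal and subtracts the mirror-symmetric product, giving a direction-selective output. A minimal sketch of such an array, assuming a 1-D luminance input sampled over time and space (the filter time constant and all names are illustrative, not the paper's parameters):

```python
import numpy as np

def reichardt_output(stimulus, dt, tau=0.035):
    """Time-averaged opponent output of a 1-D array of Reichardt correlators.

    stimulus: 2-D array (time x space) of luminance samples.
    tau: time constant of the first-order low-pass "delay" filter (s).
    Positive output indicates net motion in the +space direction.
    """
    alpha = dt / (tau + dt)  # coefficient of the discrete first-order low-pass
    lp = np.zeros_like(stimulus)
    for t in range(1, stimulus.shape[0]):
        lp[t] = lp[t - 1] + alpha * (stimulus[t] - lp[t - 1])
    # delayed signal from each input correlated with the undelayed signal
    # from its neighbour, minus the mirror-symmetric term
    return np.mean(lp[:, :-1] * stimulus[:, 1:] - lp[:, 1:] * stimulus[:, :-1])

# A sinusoidal grating drifting in +x gives a positive mean output;
# the opposite drift direction flips the sign.
dt, n_t, n_x = 0.001, 2000, 72
t = np.arange(n_t)[:, None] * dt
x = np.arange(n_x)[None, :] / n_x               # one spatial cycle across the array
rightward = np.sin(2 * np.pi * (x - 2.0 * t))   # drifts toward +x at 2 cycles/s
leftward = np.sin(2 * np.pi * (x + 2.0 * t))    # drifts toward -x
print(reichardt_output(rightward, dt) > 0)
print(reichardt_output(leftward, dt) < 0)
```

In the paper's closed-loop argument, compensatory head roll reduces the image velocity seen by such an array, helping keep fast self-motion within the range where the correlator output still increases with speed.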
Affiliation(s)
- Shane P. Windsor: Department of Aerospace Engineering, University of Bristol, University Walk, Bristol BS8 1TR, UK
- Graham K. Taylor: Department of Zoology, University of Oxford, Oxford OX1 3PS, UK