1. Through Hawks’ Eyes: Synthetically Reconstructing the Visual Field of a Bird in Flight. Int J Comput Vis 2023;131:1497-1531. [PMID: 37089199; PMCID: PMC10110700; DOI: 10.1007/s11263-022-01733-2]
Abstract
Birds of prey rely on vision to execute flight manoeuvres that are key to their survival, such as intercepting fast-moving targets or navigating through clutter. A better understanding of the role played by vision during these manoeuvres is not only relevant within the field of animal behaviour, but could also have applications for autonomous drones. In this paper, we present a novel method that uses computer vision tools to analyse the role of active vision in bird flight, and demonstrate its use to answer behavioural questions. Combining motion capture data from Harris’ hawks with a hybrid 3D model of the environment, we render RGB images, semantic maps, depth information and optic flow outputs that characterise the visual experience of the bird in flight. In contrast with previous approaches, our method allows us to consider different camera models and alternative gaze strategies for the purposes of hypothesis testing, allows us to consider visual input over the complete visual field of the bird, and is not limited by the technical specifications and performance of a head-mounted camera light enough to attach to a bird’s head in flight. We present pilot data from three sample flights: a pursuit flight, in which a hawk intercepts a moving target, and two obstacle avoidance flights. With this approach, we provide a reproducible method that facilitates the collection of large volumes of data across many individuals, opening up new avenues for data-driven models of animal behaviour.
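The core of the rendering approach described above can be illustrated with a minimal sketch (hypothetical names and a toy scene, not the authors' code): given a head pose from motion capture, project 3D scene points through a pinhole camera model, then difference consecutive projections to approximate optic flow.

```python
import numpy as np

def project(points, R, t, f=1.0):
    """Project 3D world points into a pinhole camera with pose (R, t).

    R is the world-to-camera rotation, t the camera position in world
    coordinates; returns Nx2 image coordinates after perspective division.
    """
    cam = (points - t) @ R.T  # express points in the camera frame
    return f * cam[:, :2] / cam[:, 2:3]

def optic_flow(points, pose_a, pose_b):
    """Approximate optic flow as the image-plane displacement of the
    same scene points between two head poses."""
    return project(points, *pose_b) - project(points, *pose_a)

# Toy scene in front of the "bird": one point on the optical axis, one off-axis.
pts = np.array([[0.0, 0.0, 5.0], [1.0, 0.5, 10.0]])
I = np.eye(3)
# One unit of forward translation along the camera's z axis.
flow = optic_flow(pts, (I, np.zeros(3)), (I, np.array([0.0, 0.0, 1.0])))
# Forward motion yields radial expansion: the on-axis point stays at the
# focus of expansion, while the off-axis point moves outward.
```

In the paper's setting, the pose sequence would come from the motion-capture head track and the point set from the environment model; the same projection step underlies the RGB, depth and semantic renders.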
2. Giret N, Rolland M, Del Negro C. Multisensory processes in birds: from single neurons to the influence of social interactions and sensory loss. Neurosci Biobehav Rev 2022;143:104942. [DOI: 10.1016/j.neubiorev.2022.104942]
3. Ręk P, Magrath RD. Reality and illusion: the assessment of angular separation of multi-modal signallers in a duetting bird. Proc Biol Sci 2022;289:20220680. [PMID: 35858056; PMCID: PMC9277264; DOI: 10.1098/rspb.2022.0680]
Abstract
The spatial distribution of cooperating individuals plays a strategic role in the territorial interactions of many group-living animals, and can indicate group cohesion. Vocalizations are commonly used to judge the distribution of signallers, but the spatial resolution of sounds is poor. Many species therefore accompany calls with movement; however, little is known about the role of audio-visual perception in natural interactions. We studied the effect of angular separation on the efficacy of multimodal duets in the Australian magpie-lark, Grallina cyanoleuca. We tested specifically whether conspicuous wing movements, which typically accompany duets, affect responses to auditory angular separation. Multimodal playbacks of duets using robotic models and speakers showed that birds relied primarily on acoustic cues when visual and auditory angular separations were congruent, but used both modalities to judge separation between the signallers when the modalities were spatially incongruent. The visual component modified the effect of acoustic separation: robotic models that were apart weakened the response when the speakers were together, while models that were together strengthened responses when the speakers were apart. Our results show that responses are stronger when signallers are together, and suggest that males are able to bind information cross-modally about the senders' spatial location, which is consistent with a multisensory illusion.
Affiliation(s)
- Paweł Ręk
- Department of Behavioural Ecology, Institute of Environmental Biology, Faculty of Biology, Adam Mickiewicz University, 61‐614 Poznan, Poland; Division of Ecology and Evolution, Research School of Biology, The Australian National University, Canberra, Australian Capital Territory 2614, Australia
- Robert D. Magrath
- Division of Ecology and Evolution, Research School of Biology, The Australian National University, Canberra, Australian Capital Territory 2614, Australia
4. Behavioral and neuronal study of inhibition of return in barn owls. Sci Rep 2020;10:7267. [PMID: 32350332; PMCID: PMC7190666; DOI: 10.1038/s41598-020-64197-9]
Abstract
Inhibition of return (IOR) is the reduction of detection speed and/or detection accuracy of a target in a recently attended location. This phenomenon, which has been discovered and studied thoroughly in humans, is believed to reflect a brain mechanism for controlling the allocation of spatial attention in a manner that enhances efficient search. Findings showing that IOR is robust, apparent at a very early age and seemingly dependent on midbrain activity suggest that IOR is a universal attentional mechanism in vertebrates. However, studies in non-mammalian species are scarce. To explore this hypothesis comparatively, we tested for IOR in barn owls (Tyto alba) using the classical Posner cueing paradigm. Two barn owls were trained to initiate a trial by fixating on the center of a computer screen and then turning their gaze to the location of a target. A short, non-informative cue appeared before the target, either at a location predicting the target (valid) or a location not predicting the target (invalid). In one barn owl, the response times (RT) to the valid targets compared to the invalid targets shifted from facilitation (lower RTs) to inhibition (higher RTs) as the time lag between the cue and the target increased. The second owl mostly failed to maintain fixation and responded to the cue before the target onset. However, when the analysis included only the trials in which the owl maintained fixation, an inhibition in the valid trials could be detected. To search for the neural correlates of IOR, we recorded multiunit responses in the optic tectum (OT) of four head-fixed owls passively viewing a cueing paradigm as in the behavioral experiments. At short cue-to-target lags (<100 ms), neural responses to the target in the receptive field (RF) were usually enhanced if the cue appeared earlier inside the RF (valid) and suppressed if the cue appeared earlier outside the RF (invalid). This was reversed at longer lags: neural responses were suppressed in the valid conditions and unaffected in the invalid conditions. The findings support the notion that IOR is a basic mechanism in the evolution of vertebrate behavior and suggest that the effect arises from the interaction between lateral and forward inhibition in the tectal circuitry.
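The facilitation-to-inhibition crossover described above is conventionally quantified as the difference in mean response time between valid and invalid trials at each cue-target lag. A minimal sketch of that analysis (hypothetical numbers, not the study's data):

```python
import statistics

# Hypothetical response times (ms), grouped by cue-target lag and cue validity.
trials = {
    (100, "valid"): [310, 295, 320], (100, "invalid"): [340, 355, 330],
    (700, "valid"): [380, 395, 370], (700, "invalid"): [345, 330, 350],
}

def cueing_effect(trials, lag):
    """Mean RT(valid) minus mean RT(invalid) at a given lag.

    Negative values indicate facilitation at the cued location;
    positive values indicate inhibition of return."""
    valid = statistics.mean(trials[(lag, "valid")])
    invalid = statistics.mean(trials[(lag, "invalid")])
    return valid - invalid

short_lag = cueing_effect(trials, 100)  # negative: faster at the cued location
long_lag = cueing_effect(trials, 700)   # positive: slower at the cued location
```

With real data one would also test each difference statistically; the sign flip between short and long lags is the behavioral signature of IOR.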
5. Dutta A, Lev-Ari T, Barzilay O, Mairon R, Wolf A, Ben-Shahar O, Gutfreund Y. Self-motion trajectories can facilitate orientation-based figure-ground segregation. J Neurophysiol 2020;123:912-926. [PMID: 31967932; DOI: 10.1152/jn.00439.2019]
Abstract
Segregation of objects from the background is a basic and essential property of the visual system. We studied the neural detection of objects defined by an orientation difference from the background in barn owls (Tyto alba). We presented wide-field displays of densely packed stripes with a dominant orientation. Visual objects were created by orienting a circular patch differently from the background. In head-fixed conditions, neurons in both the tecto- and thalamofugal visual pathways (optic tectum and visual Wulst) were weakly responsive to these objects in their receptive fields. Notably, however, in freely viewing conditions, barn owls occasionally perform peculiar side-to-side head motions (peering) when scanning the environment. In the second part of the study we thus recorded the neural response from head-fixed owls while the visual displays replicated the peering conditions; i.e., the displays (objects and backgrounds) were shifted along trajectories that induced a retinal motion identical to sampled peering motions during viewing of a static object. These conditions induced dramatic neural responses to the objects, in the very same neurons that were unresponsive to the objects in static displays. By reverting to circular motions of the display, we show that the pattern of the neural response is mostly shaped by the orientation of the background relative to the motion, not by the orientation of the object. Thus our findings provide evidence that peering and/or other self-motions can facilitate orientation-based figure-ground segregation through interaction with inhibition from the surround.

NEW & NOTEWORTHY Animals frequently move their sensory organs and thereby create motion cues that can enhance object segregation from the background. We address a special example of such active sensing in barn owls. When scanning the environment, barn owls occasionally perform small-amplitude side-to-side head movements called peering. We show that the visual outcome of such peering movements elicits neural detection of objects that are rotated from the dominant orientation of the background scene and which are otherwise mostly undetected. These results suggest a novel role for self-motions in sensing objects that break the regular orientation of elements in the scene.
Affiliation(s)
- Arkadeb Dutta
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
- Tidhar Lev-Ari
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
- Ouriel Barzilay
- Faculty of Mechanical Engineering, The Technion, Haifa, Israel
- Rotem Mairon
- Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Alon Wolf
- Faculty of Mechanical Engineering, The Technion, Haifa, Israel
- Ohad Ben-Shahar
- Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel; The Zlotowski Center for Neuroscience Research, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Yoram Gutfreund
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
6. Emergence of an Adaptive Command for Orienting Behavior in Premotor Brainstem Neurons of Barn Owls. J Neurosci 2018;38:7270-7279. [PMID: 30012694; DOI: 10.1523/jneurosci.0947-18.2018]
Abstract
The midbrain map of auditory space commands sound-orienting responses in barn owls. Owls precisely localize sounds in frontal space but underestimate the direction of peripheral sound sources. This bias for central locations was proposed to be adaptive to the decreased reliability in the periphery of sensory cues used for sound localization by the owl. Understanding the neural pathway supporting this biased behavior provides a means to address how adaptive motor commands are implemented by neurons. Here we find that the sensory input for sound direction is weighted by its reliability in premotor neurons of the midbrain tegmentum of owls (male and female), such that the mean population firing rate approximates the head-orienting behavior. We provide evidence that this coding may emerge through convergence of upstream projections from the midbrain map of auditory space. We further show that manipulating the sensory input yields changes predicted by the convergent network in both premotor neural responses and behavior. This work demonstrates how a topographic sensory representation can be linearly read out to adjust behavioral responses by the reliability of the sensory input.

SIGNIFICANCE STATEMENT This research shows how statistics of the sensory input can be integrated into a behavioral command by readout of a sensory representation. The firing rate of midbrain premotor neurons receiving sensory information from a topographic representation of auditory space is weighted by the reliability of sensory cues. We show that these premotor responses are consistent with a weighted convergence from the topographic sensory representation. This convergence was also tested behaviorally, where manipulation of stimulus properties led to bidirectional changes in sound localization errors. Thus a topographic representation of auditory space is translated into a premotor command for sound localization that is modulated by sensory reliability.
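The reliability-weighted readout described above can be caricatured as a population estimate in which each map location's vote is scaled by the local cue reliability; when reliability falls off toward the periphery, peripheral directions are pulled toward the center. A minimal sketch (illustrative tuning and reliability profiles, not the study's fitted model):

```python
import numpy as np

def readout(target_deg, pref_deg, reliability, tuning_width=20.0):
    """Reliability-weighted population readout of sound direction.

    Each map neuron votes for its preferred direction, weighted by its
    Gaussian tuning response to the target times the local reliability.
    """
    response = np.exp(-0.5 * ((pref_deg - target_deg) / tuning_width) ** 2)
    weights = response * reliability
    return np.sum(weights * pref_deg) / np.sum(weights)

pref = np.linspace(-90, 90, 181)         # preferred directions across the map
rel = np.exp(-0.5 * (pref / 50.0) ** 2)  # reliability decays toward the periphery

center = readout(0.0, pref, rel)   # frontal target: localized accurately
periph = readout(60.0, pref, rel)  # peripheral target: estimate pulled centrally
```

The qualitative behavior, accurate frontal localization with underestimation of peripheral directions, matches the bias the abstract describes; the real circuit implements the weighting through convergent projections rather than an explicit division.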
7. Behavioral Evidence and Neural Correlates of Perceptual Grouping by Motion in the Barn Owl. J Neurosci 2018;38:6653-6664. [PMID: 29967005; DOI: 10.1523/jneurosci.0174-18.2018]
Abstract
Perceiving an object as salient from its surround often requires a preceding process of grouping the object and background elements as perceptual wholes. In humans, motion homogeneity provides a strong cue for grouping, yet it is unknown to what extent this occurs in nonprimate species. To explore this question, we studied the effects of visual motion homogeneity in barn owls of both sexes, at the behavioral as well as the neural level. Our data show that the coherency of the background motion modulates the perceived saliency of the target object. An object moving in an odd direction relative to other objects attracted more attention when the other objects moved homogeneously compared with when they moved in a variety of directions. A possible neural correlate of this effect may arise in the population activity of the intermediate/deep layers of the optic tectum. In these layers, the neural responses to a moving element in the receptive field were suppressed when additional elements moved in the surround. However, when the surrounding elements all moved in one direction (homogeneously), they induced less suppression of the response compared with nonhomogeneously moving elements. Moreover, neural responses were more sensitive to the homogeneity of the background motion than to motion-direction contrasts between the receptive field and the surround. The findings suggest similar principles of saliency-by-motion in an avian species as in humans and show a locus in the optic tectum where the underlying neural circuitry may exist.

SIGNIFICANCE STATEMENT A critical task of the visual system is to arrange incoming visual information into a meaningful scene of objects and background. In humans, elements that move homogeneously are grouped perceptually to form a categorical whole object. We discovered a similar principle in the barn owl's visual system, whereby the homogeneity of the motion of elements in the scene allows an object to be perceptually distinguished from its surround. The novel finding of these visual effects in an avian species, which lacks a neocortical structure, suggests that our basic visual perception shares more universal principles across species than presently thought, and sheds light on possible brain mechanisms for perceptual grouping.
8. Interactions between top-down and bottom-up attention in barn owls (Tyto alba). Anim Cogn 2017;21:197-205. [PMID: 29214438; DOI: 10.1007/s10071-017-1150-2]
Abstract
Selective attention, the prioritization of behaviorally relevant stimuli for behavioral control, is commonly divided into two processes: bottom-up, stimulus-driven selection and top-down, task-driven selection. Here, we tested two barn owls in a visual search task that examines attentional capture of the top-down task by bottom-up mechanisms. We trained barn owls to search for a vertical Gabor patch embedded in a circular array of differently oriented Gabor distractors (top-down guided search). To track the point of gaze, a lightweight wireless video camera was mounted on the owl's head. Three experiments were conducted in which the owls were tested in the following conditions: (1) five distractors; (2) nine distractors; (3) five distractors with one distractor surrounded by a red circle; or (4) five distractors with a brief sound at the initiation of the stimulus. Search times and number of head saccades to reach the target were measured and compared between the different conditions. It was found that search time and number of saccades to the target increased when the number of distractors was larger (condition 2) and when an additional irrelevant salient stimulus, auditory or visual, was added to the scene (conditions 3 and 4). These results demonstrate that in barn owls, bottom-up attention interacts with top-down attention to shape behavior in ways similar to human attentional capture. The findings suggest similar attentional principles in taxa that have been evolutionarily separated for 300 million years.
9. Beatini JR, Proudfoot GA, Gall MD. Frequency sensitivity in Northern saw-whet owls (Aegolius acadicus). J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2017;204:145-154. [DOI: 10.1007/s00359-017-1216-2]
10. Barzilay O, Zelnik-Manor L, Gutfreund Y, Wagner H, Wolf A. From biokinematics to a robotic active vision system. Bioinspir Biomim 2017;12:056004. [PMID: 28581436; DOI: 10.1088/1748-3190/aa7728]
Abstract
Barn owls move their heads in very particular motions, compensating for the quasi-immovability of their eyes. These efficient predators often perform peering side-to-side head motions when scanning their surroundings and seeking prey. In this work, we use the head movements of barn owls as a model to bridge between biological active vision and machine vision. The biomotions are measured and used to actuate a specially built robot equipped with a depth camera for scanning. We hypothesize that the biomotions improve the scan accuracy of static objects. Our experiments show that barn owl biomotion-based trajectories consistently improve scan accuracy when compared to intuitive scanning motions. This constitutes proof-of-concept evidence that the vision of robotic systems can be enhanced by bio-inspired viewpoint manipulation. Such biomimetic scanning systems can have many applications, e.g. manufacturing inspection or autonomous robotics.
Affiliation(s)
- Ouriel Barzilay
- Faculty of Mechanical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
11. Responses to Pop-Out Stimuli in the Barn Owl's Optic Tectum Can Emerge through Stimulus-Specific Adaptation. J Neurosci 2016;36:4876-4887. [PMID: 27122042; DOI: 10.1523/jneurosci.3339-15.2016]
Abstract
Here, we studied neural correlates of orientation-contrast-based saliency in the optic tectum (OT) of barn owls. Neural responses in the intermediate/deep layers of the OT were recorded from lightly anesthetized owls confronted with arrays of bars in which one bar (the target) was orthogonal to the remaining bars (the distractors). Responses to target bars were compared with responses to distractor bars in the receptive field (RF). Initially, no orientation-contrast sensitivity was observed. However, if the position of the target bar in the array was randomly shuffled across trials so that it occasionally appeared in the RF, then such sensitivity emerged. The effect started to become significant after three or four positional changes of the target bar and strengthened with additional trials. Our data further suggest that this effect arises due to specific adaptation to the stimulus in the RF combined with suppression from the surround. By jittering the position of the bar inside the RF across trials, we demonstrate that the adaptation has two components, one position specific and one orientation specific. The findings give rise to the hypothesis that barn owls, by active scanning of the scene, can induce adaptation of the tectal circuitry to the common orientation and thus achieve a "pop-out" of rare orientations. Such a model is consistent with several behavioral observations in owls and may be relevant to other visual features and species.

SIGNIFICANCE STATEMENT Natural scenes are often characterized by a dominant orientation, such as the scenery of a pine forest or the sand dunes in a windy desert. Therefore, an orientation that contrasts with the regularity of the scene is perceived as salient by many animals as a means to break camouflage. By actively moving the scene between trials, we show here that neurons in the retinotopic map of the barn owl's optic tectum specifically adapt to the common orientation, giving rise to a preferential representation of odd orientations. Based on this, we suggest a new mechanism for orientation-based camouflage breaking that links active scanning of scenes with neural adaptation. This mechanism may be relevant to pop-out in other species and visual features.
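The proposed mechanism, adaptation to the common orientation leaving rare orientations preferentially represented, can be sketched as a toy model (illustrative parameters, not the recorded responses): each orientation channel's gain decays toward a floor with every exposure, so after repeated presentation of the distractor orientation the orthogonal target drives the larger response.

```python
# Toy stimulus-specific adaptation: each orientation channel carries a gain
# that decays toward a floor with every exposure to that orientation.
def adapt(gains, orientation, decay=0.7, floor=0.2):
    """Return updated channel gains after one exposure to `orientation`."""
    new = dict(gains)
    new[orientation] = floor + (new[orientation] - floor) * decay
    return new

gains = {"horizontal": 1.0, "vertical": 1.0}

# Several trials with horizontal distractors stimulating the receptive field.
for _ in range(4):
    gains = adapt(gains, "horizontal")

# The adapted horizontal channel now responds more weakly than the
# unadapted vertical channel, so an orthogonal (rare) target "pops out".
```

Mirroring the abstract, the effect builds up gradually over trials; a second, position-specific component would require indexing the gains by location as well as orientation.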