1
Prusty AD, Sane SP. The motor apparatus of head movements in the Oleander hawkmoth (Daphnis nerii, Lepidoptera). J Comp Neurol 2024; 532:e25577. PMID: 38289189. DOI: 10.1002/cne.25577.
Abstract
Head movements of insects play a vital role in diverse locomotory behaviors, including flying and walking. Because insect eyes move minimally within their sockets, head movements are essential to reduce visual blur and maintain a stable gaze. As in most vertebrates, gaze stabilization behavior in insects requires the integration of both visual and mechanosensory feedback by the neck motor neurons. Visual feedback is derived from the optic flow over the retina of the compound eyes, whereas mechanosensory feedback is derived from organs of balance, analogous to the vestibular system in vertebrates. In Diptera, vestibular feedback is derived from the halteres (modified hindwings that evolved into mechanosensory organs) and is integrated with visual feedback to actuate compensatory head movements. However, non-Dipteran insects, including Lepidoptera, lack halteres. In these insects, vestibular feedback is obtained from the antennal Johnston's organs, but how it integrates with visual feedback during head movements is not well understood. Indeed, although head movements are well studied in flies, the underlying motor apparatus in non-Dipteran taxa has received comparatively little attention. As a first step toward understanding compensatory head movements in the Oleander hawkmoth Daphnis nerii, we image the anatomy and architecture of the neck joint sclerites and muscles using X-ray microtomography, and the associated motor neurons using fluorescent dye fills and confocal microscopy. Based on these morphological data, we propose testable hypotheses about the putative function of specific neck muscles during head movements, which can shed light on their role in neck movements and gaze stabilization.
Affiliation(s)
- Agnish D Prusty
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Sanjay P Sane
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
2
Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568. DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings out to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent distances unambiguously; instead, the estimates are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
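The speed-scaling ambiguity central to this review follows from the geometry of translational optic flow. The sketch below is illustrative only (the function names are ours, not from the paper): for pure translation at speed v, an object at distance d and bearing theta induces retinal motion v·sin(theta)/d, so inverting the relation recovers distance only up to the assumed speed.

```python
import math

def translational_flow(v, d, theta):
    """Retinal angular velocity (rad/s) induced by pure translation at
    speed v (m/s), for an object at distance d (m) and bearing theta
    (rad) from the direction of travel."""
    return v * math.sin(theta) / d

def apparent_distance(omega, v_assumed, theta):
    """Inverting the flow equation recovers distance only up to the
    assumed speed of self-motion: the ambiguity discussed above."""
    return v_assumed * math.sin(theta) / omega

# An object 2 m away, viewed at 90 deg while flying at 1 m/s ...
omega = translational_flow(1.0, 2.0, math.pi / 2)
# ... appears twice as distant if the animal assumes it flies at 2 m/s.
print(apparent_distance(omega, 2.0, math.pi / 2))  # 4.0
```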
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
3
McHenry MJ, Hedrick TL. The science and technology of kinematic measurements in a century of Journal of Experimental Biology. J Exp Biol 2023; 226:286615. PMID: 36637450. DOI: 10.1242/jeb.245147.
Abstract
Kinematic measurements have been essential to the study of comparative biomechanics and offer insight into relationships between technological development and scientific progress. Here, we review the 100-year history of kinematic measurements in Journal of Experimental Biology (JEB) through eras that used film, analog video and digital video, and approaches that have circumvented the use of image capture. This history originated with the career of Sir James Gray and has since evolved over the generations of investigators that have followed. Although some JEB studies have featured technological developments that were ahead of their time, the vast majority of research adopted equipment that was broadly available through the consumer or industrial markets. We found that, across eras, an emphasis on high-speed phenomena outpaced the growth in the number of articles published by JEB, and that the size of datasets increased significantly. Despite these advances, the number of species studied within individual reports has not differed significantly over time. Therefore, we find that advances in technology have helped to enable a growth in the number of JEB studies that have included kinematic measurements, contributed to an emphasis on high-speed phenomena, and yielded biomechanical studies that are more data rich, but are no more comparative now than in previous decades.
Affiliation(s)
- Matthew J McHenry
- Department of Ecology and Evolutionary Biology, University of California, Irvine, CA 92697, USA
- Tyson L Hedrick
- Department of Biology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
4
Visual navigation: properties, acquisition and use of views. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022. PMID: 36515743. DOI: 10.1007/s00359-022-01599-2.
Abstract
Panoramic views offer information on heading direction and on location to visually navigating animals. This review covers the properties of panoramic views and the information they provide to navigating animals, irrespective of image representation. Heading direction can be retrieved by alignment matching between memorized and currently experienced views, and a gradient descent in image differences can lead back to the location at which a view was memorized (positional image matching). Central place foraging insects, such as ants, bees and wasps, conduct distinctly choreographed learning walks and learning flights upon first leaving their nest that are likely to be designed to systematically collect scene memories tagged with information provided by path integration on the direction of and the distance to the nest. Equally, traveling along routes, ants have been shown to engage in scanning movements, in particular when routes are unfamiliar, again suggesting a systematic process of acquiring and comparing views. The review discusses what we know and do not know about how view memories are represented in the brain of insects, how they are acquired and how they are subsequently used for traveling along routes and for pinpointing places.
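The alignment matching described above can be sketched as a rotational image-difference search over a one-dimensional panoramic snapshot. This toy implementation is our own simplification, not a model from the review: heading is recovered as the image shift that minimizes the root-mean-square difference between the current and memorized views.

```python
import numpy as np

def heading_from_views(memorized, current):
    """Alignment matching sketch: rotate the current panoramic view in
    one-sample steps and return the shift (as an azimuth, in degrees)
    that minimizes the root-mean-square image difference."""
    n = len(memorized)
    diffs = [np.sqrt(np.mean((np.roll(current, s) - memorized) ** 2))
             for s in range(n)]
    return int(np.argmin(diffs)) * 360.0 / n

# A toy 1-D panorama (brightness vs azimuth), then the same scene after
# the agent has turned 90 deg (turning right shifts the image left).
rng = np.random.default_rng(0)
snapshot = rng.random(360)
print(heading_from_views(snapshot, np.roll(snapshot, -90)))  # 90.0
```

The same image-difference function, evaluated across nearby locations rather than rotations, yields the gradient used for positional image matching.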
5
Chatterjee P, Prusty AD, Mohan U, Sane SP. Integration of visual and antennal mechanosensory feedback during head stabilization in hawkmoths. eLife 2022; 11:e78410. PMID: 35758646. PMCID: PMC9259029. DOI: 10.7554/elife.78410.
Abstract
During flight maneuvers, insects exhibit compensatory head movements which are essential for stabilizing the visual field on their retina, reducing motion blur, and supporting visual self-motion estimation. In Diptera, such head movements are mediated via visual feedback from their compound eyes that detect retinal slip, as well as rapid mechanosensory feedback from their halteres - the modified hindwings that sense the angular rates of body rotations. Because non-Dipteran insects lack halteres, it is not known if mechanosensory feedback about body rotations plays any role in their head stabilization response. Diverse non-Dipteran insects are known to rely on visual and antennal mechanosensory feedback for flight control. In hawkmoths, for instance, reduction of antennal mechanosensory feedback severely compromises their ability to control flight. Similarly, when the head movements of freely flying moths are restricted, their flight ability is also severely impaired. The role of compensatory head movements as well as multimodal feedback in insect flight raises an interesting question: in insects that lack halteres, what sensory cues are required for head stabilization? Here, we show that in the nocturnal hawkmoth Daphnis nerii, compensatory head movements are mediated by combined visual and antennal mechanosensory feedback. We subjected tethered moths to open-loop body roll rotations under different lighting conditions, and measured their ability to maintain head angle in the presence or absence of antennal mechanosensory feedback. Our study suggests that head stabilization in moths is mediated primarily by visual feedback during roll movements at lower frequencies, whereas antennal mechanosensory feedback is required when roll occurs at higher frequencies. These findings are consistent with the hypothesis that control of head angle results from a multimodal feedback loop that integrates both visual and antennal mechanosensory feedback, albeit at different latencies. At adequate light levels, visual feedback is sufficient for head stabilization primarily at low frequencies of body roll. However, under dark conditions, antennal mechanosensory feedback is essential for the control of head movements at high frequencies of body roll.
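A conventional way to model a loop in which vision dominates at low frequencies and antennal mechanosensation at high frequencies is a complementary filter. The sketch below is a generic illustration under that assumption, not the authors' model; all names and parameter values are ours.

```python
def complementary_filter(visual_angle, antennal_rate, dt=0.01, tau=0.1):
    """Fuse a slow, drift-free visual estimate of head error angle with
    a fast antennal estimate of roll rate. The time constant tau sets
    the crossover: below it the visual term dominates the estimate,
    above it the integrated rate term does."""
    alpha = tau / (tau + dt)
    est, out = 0.0, []
    for ang, rate in zip(visual_angle, antennal_rate):
        est = alpha * (est + rate * dt) + (1 - alpha) * ang
        out.append(est)
    return out

# A steady 10 deg visual error with no rate signal: the slow visual
# pathway alone pulls the estimate to 10 deg.
est = complementary_filter([10.0] * 1000, [0.0] * 1000)
print(round(est[-1], 2))  # 10.0
```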
Affiliation(s)
- Payel Chatterjee
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Agnish Dev Prusty
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Umesh Mohan
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
- Sanjay P Sane
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, Bangalore, India
6
Baird E, Boeddeker N, Srinivasan MV. The effect of optic flow cues on honeybee flight control in wind. Proc Biol Sci 2021; 288:20203051. PMID: 33468001. DOI: 10.1098/rspb.2020.3051.
Abstract
To minimize the risk of colliding with the ground or other obstacles, flying animals need to control both their ground speed and ground height. This task is particularly challenging in wind, where head winds require an animal to increase its airspeed to maintain a constant ground speed and tail winds may generate negative airspeeds, rendering flight more difficult to control. In this study, we investigate how head and tail winds affect flight control in the honeybee Apis mellifera, which is known to rely on the pattern of visual motion generated across the eye-known as optic flow-to maintain constant ground speeds and heights. We find that, when provided with both longitudinal and transverse optic flow cues (in or perpendicular to the direction of flight, respectively), honeybees maintain a constant ground speed but fly lower in head winds and higher in tail winds, a response that is also observed when longitudinal optic flow cues are minimized. When the transverse component of optic flow is minimized, or when all optic flow cues are minimized, the effect of wind on ground height is abolished. We propose that the regular sidewards oscillations that the bees make as they fly may be used to extract information about the distance to the ground, independently of the longitudinal optic flow that they use for ground speed control. This computationally simple strategy could have potential uses in the development of lightweight and robust systems for guiding autonomous flying vehicles in natural environments.
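The ground-height behavior above is consistent with set-point regulation of longitudinal optic flow (flow = ground speed / height), a scheme long attributed to bees. The one-line regulator below is a hypothetical sketch of that idea, not the authors' model; gain, time step and names are ours.

```python
def optic_flow_step(v_ground, height, flow_setpoint, gain=0.5, dt=0.1):
    """One step of set-point regulation of longitudinal optic flow,
    flow = v_ground / height: when flow is below the set-point the
    agent descends (raising flow), and vice versa. Ground speed is
    assumed to be held constant by a separate loop."""
    flow = v_ground / height
    return max(height + gain * (flow - flow_setpoint) * dt * height, 0.01)

# From 1 m up at 1 m/s, the height settles at 0.5 m, where the flow
# 1.0 / 0.5 matches the set-point of 2.0.
h = 1.0
for _ in range(200):
    h = optic_flow_step(1.0, h, flow_setpoint=2.0)
print(round(h, 3))  # 0.5
```

Under this scheme a head wind that reduces ground speed would transiently lower the flow, driving the bee downward, which matches the direction of the effect reported above.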
Affiliation(s)
- Emily Baird
- Department of Zoology, Stockholm University, Stockholm, Sweden
- Norbert Boeddeker
- Department of Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany
7
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. PMID: 33542681. PMCID: PMC7852487. DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including their spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, where rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, that is, depth in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
8
Bumblebees perceive the spatial layout of their environment in relation to their body size and form to minimize inflight collisions. Proc Natl Acad Sci U S A 2020; 117:31494-31499. PMID: 33229535. DOI: 10.1073/pnas.2016872117.
Abstract
Animals that move through complex habitats must frequently contend with obstacles in their path. Humans and other highly cognitive vertebrates avoid collisions by perceiving the relationship between the layout of their surroundings and the properties of their own body profile and action capacity. It is unknown whether insects, which have much smaller brains, possess such abilities. We used bumblebees, which vary widely in body size and regularly forage in dense vegetation, to investigate whether flying insects consider their own size when interacting with their surroundings. Bumblebees trained to fly in a tunnel were sporadically presented with an obstructing wall containing a gap that varied in width. Bees successfully flew through narrow gaps, even those that were much smaller than their wingspans, by first performing lateral scanning (side-to-side flights) to visually assess the aperture. Bees then reoriented their in-flight posture (i.e., yaw or heading angle) while passing through, minimizing their projected frontal width and mitigating collisions; in extreme cases, bees flew entirely sideways through the gap. Both the time that bees spent scanning during their approach and the extent to which they reoriented themselves to pass through the gap were determined not by the absolute size of the gap, but by the size of the gap relative to each bee's own wingspan. Our findings suggest that, similar to humans and other vertebrates, flying bumblebees perceive the affordance of their surroundings relative to their body size and form to navigate safely through complex environments.
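Treating the bee as a line segment of length equal to its wingspan, the minimum reorientation needed to pass a gap follows from simple trigonometry: the projected frontal width at yaw angle phi is wingspan · cos(phi). The sketch below is our deliberate simplification, not the paper's analysis.

```python
import math

def min_yaw_deg(wingspan, gap):
    """Minimum in-flight yaw (deg) that shrinks the projected frontal
    width, wingspan * cos(yaw), to fit the gap. The bee is treated as
    a line segment: a deliberate simplification, not the paper's model."""
    if gap >= wingspan:
        return 0.0
    return math.degrees(math.acos(gap / wingspan))

print(round(min_yaw_deg(3.0, 1.5), 1))  # 60.0: gap half the wingspan
print(round(min_yaw_deg(3.0, 0.0), 1))  # 90.0: fully sideways flight
```

As the gap shrinks toward zero the required yaw approaches 90 deg, matching the fully sideways passages reported above.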
9
Ryan LA, Cunningham R, Hart NS, Ogawa Y. The buzz around spatial resolving power and contrast sensitivity in the honeybee, Apis mellifera. Vision Res 2020; 169:25-32. PMID: 32145455. DOI: 10.1016/j.visres.2020.02.005.
Abstract
Most animals rely on vision to perform a range of behavioural tasks and variations in the anatomy and physiology of the eye likely reflect differences in habitat and life history. Moreover, eye design represents a balance between often conflicting requirements for gathering different forms of visual information. The trade-off between spatial resolving power and contrast sensitivity is common to all visual systems, and European honeybees (Apis mellifera) present an important opportunity to better understand this trade-off. Vision has been studied extensively in A. mellifera as it is vital for foraging, navigation and communication. Consequently, spatial resolving power and contrast sensitivity in A. mellifera have been measured using several methodologies; however, there is considerable variation in estimates between methodologies. We assess pattern electroretinography (pERG) as a new method for assessing the trade-off between visual spatial and contrast information in A. mellifera. pERG has the benefit of measuring spatial contrast sensitivity from higher order visual processing neurons in the eye. Spatial resolving power of A. mellifera estimated from pERG was 0.54 cycles per degree (cpd), and contrast sensitivity was 16.9. pERG estimates of contrast sensitivity were comparable to previous behavioural studies. Estimates of spatial resolving power reflected anatomical estimates in the frontal region of the eye, which corresponds to the region stimulated by pERG. Apis mellifera has similar spatial contrast sensitivity to other hymenopteran insects with similar facet diameter (Myrmecia ant species). Our results support the idea that eye anatomy has a substantial effect on spatial contrast sensitivity in compound eyes.
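The two pERG estimates quoted above translate into more intuitive quantities by taking reciprocals: the angular size of one grating cycle at the acuity limit, and the minimum detectable Michelson contrast.

```python
resolving_power_cpd = 0.54   # pERG estimate, cycles per degree
contrast_sensitivity = 16.9  # pERG estimate

# One full grating cycle at the acuity limit spans 1 / 0.54 degrees of
# visual angle; the minimum detectable Michelson contrast is the
# reciprocal of contrast sensitivity.
cycle_deg = 1.0 / resolving_power_cpd
min_contrast = 1.0 / contrast_sensitivity
print(round(cycle_deg, 2), round(min_contrast, 3))  # 1.85 0.059
```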
Affiliation(s)
- Laura A Ryan
- Department of Biological Sciences, Macquarie University, Sydney, New South Wales 2109, Australia
- Rhianon Cunningham
- Department of Biological Sciences, Macquarie University, Sydney, New South Wales 2109, Australia
- Nathan S Hart
- Department of Biological Sciences, Macquarie University, Sydney, New South Wales 2109, Australia
- Yuri Ogawa
- Department of Biological Sciences, Macquarie University, Sydney, New South Wales 2109, Australia
10
Corthals K, Moore S, Geurten BR. Strategies of locomotion composition. Curr Opin Insect Sci 2019; 36:140-148. PMID: 31622810. DOI: 10.1016/j.cois.2019.09.007.
Abstract
This review aims to highlight the importance of saccades during locomotion as a strategy to reduce sensory information loss while the subject is moving. Acquiring sensory data from the environment during movement results in a temporal flow of information, as the sensory percept changes with the position of the observer. Accordingly, the movement pattern shapes the sensory flow. Therefore, the requirements of locomotion and sensation have to be balanced in the behaviour of the organism. Insect vision provides deep insight into the interplay between action and perception. Insects can shape their optic flow by reducing their rotational movements to fast and short saccades. This generates prolonged phases of translation which provide depth information. Extensive behavioural and physiological studies on insects show how shaping the optic flow facilitates the coding of motion vision. Indeed, the saccadic strategy provides an elegant solution to optimise sensory flow. Complementary studies in other taxa reported similar locomotion strategies, emphasising the crucial influence of sensory flow on locomotion.
Affiliation(s)
- Kristina Corthals
- Lund University, Functional Zoology, Sölvegatan 35, 223 62 Lund, Sweden
- Sharlen Moore
- Instituto de Fisiología Celular - Neurociencias, Universidad Nacional Autónoma de México, Av. Universidad 3000, Coyoacán, 04510 Mexico City, Mexico; Max Planck Institute of Experimental Medicine, Department of Neurogenetics, Hermann-Rein-Str. 3, 37075 Göttingen, Germany
- Bart RH Geurten
- Georg-August-University Göttingen, Department of Cellular Neuroscience, Julia-Lermontowa-Weg 3, 37077 Göttingen, Germany
11
Ravi S, Bertrand O, Siesenop T, Manz LS, Doussot C, Fisher A, Egelhaaf M. Gap perception in bumblebees. J Exp Biol 2019; 222:jeb184135. PMID: 30683732. DOI: 10.1242/jeb.184135.
Abstract
A number of insects fly over long distances below the natural canopy, where the physical environment is highly cluttered, consisting of obstacles of varying shape, size and texture. While navigating within such environments, animals need to perceive and disambiguate environmental features that might obstruct their flight. The most elemental aspect of aerial navigation through such environments is gap identification and 'passability' evaluation. We used bumblebees to seek insights into the mechanisms used for gap identification when confronted with an obstacle in their flight path and the behavioral compensations employed to assess gap properties. Initially, bumblebee foragers were trained to fly through an unobstructed flight tunnel that led to a foraging chamber. After the bees were familiar with this situation, we placed a wall containing a gap that unexpectedly obstructed the flight path on a return trip to the hive. The flight trajectories of the bees as they approached the obstacle wall and traversed the gap were analyzed in order to evaluate their behavior as a function of the distance between the gap and a background wall that was placed behind the gap. Bumblebees initially decelerated when confronted with an unexpected obstacle. Deceleration was first noticed when the obstacle subtended around 35 deg on the retina, but also depended on the properties of the gap. Subsequently, the bees gradually traded off their longitudinal velocity for lateral velocity and approached the gap with increasing lateral displacement and lateral velocity. Bumblebees shaped their flight trajectory depending on the salience of the gap, indicated in our case by the optic flow contrast between the region within the gap and on the obstacle, which decreased with decreasing distance between the gap and the background wall. As the optic flow contrast decreased, the bees spent an increasing amount of time moving laterally across the obstacles. During these repeated lateral maneuvers, the bees are probably assessing gap geometry and passability.
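The 35 deg deceleration trigger reported above maps onto viewing distance through the angular-subtense relation theta = 2 · arctan(w / 2d). A small sketch, with a hypothetical 30 cm obstacle width (the width value is ours, for illustration only):

```python
import math

def subtense_distance(width_m, angle_deg):
    """Distance at which a frontal obstacle of a given width subtends a
    given visual angle: d = (w / 2) / tan(angle / 2)."""
    return (width_m / 2) / math.tan(math.radians(angle_deg) / 2)

# A hypothetical 30 cm obstacle wall reaches the ~35 deg deceleration
# trigger at roughly half a metre from the bee.
print(round(subtense_distance(0.30, 35.0), 2))  # 0.48
```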
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Olivier Bertrand
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Lea-Sophie Manz
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; Faculty of Biology, Johannes Gutenberg-Universität Mainz, 55122 Mainz, Germany
- Charlotte Doussot
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
12
Frasnelli E, Hempel de Ibarra N, Stewart FJ. The Dominant Role of Visual Motion Cues in Bumblebee Flight Control Revealed Through Virtual Reality. Front Physiol 2018; 9:1038. PMID: 30108522. PMCID: PMC6079625. DOI: 10.3389/fphys.2018.01038.
Abstract
Flying bees make extensive use of optic flow: the apparent motion in the visual scene generated by their own movement. Much of what is known about bees' visually-guided flight comes from experiments employing real physical objects, which constrains the types of cues that can be presented. Here we implement a virtual reality system allowing us to create the visual illusion of objects in 3D space. We trained bumblebees, Bombus ignitus, to feed from a static target displayed on the floor of a flight arena, and then observed their responses to various interposing virtual objects. When a virtual floor was presented above the physical floor, bees were reluctant to descend through it, indicating that they perceived the virtual floor as a real surface. To reach a target at ground level, they flew through a hole in a virtual surface above the ground, and around an elevated virtual platform, despite receiving no reward for avoiding the virtual obstacles. These behaviors persisted even when the target was made (unrealistically) visible through the obstructing object. Finally, we challenged the bees with physically impossible ambiguous stimuli, which give conflicting motion and occlusion cues. In such cases, they behaved in accordance with the motion information, seemingly ignoring occlusion.
Affiliation(s)
- Elisa Frasnelli
- Department of Evolutionary Studies of Biosystems, The Graduate University for Advanced Studies, Hayama, Japan; School of Life Sciences, University of Lincoln, Lincoln, United Kingdom
- Natalie Hempel de Ibarra
- Department of Psychology, College of Life and Environmental Sciences, University of Exeter, Exeter, United Kingdom
- Finlay J Stewart
- Department of Evolutionary Studies of Biosystems, The Graduate University for Advanced Studies, Hayama, Japan
13
Daly IM, How MJ, Partridge JC, Roberts NW. Complex gaze stabilization in mantis shrimp. Proc Biol Sci 2018; 285:20180594. PMID: 29720419. PMCID: PMC5966611. DOI: 10.1098/rspb.2018.0594.
Abstract
Almost all animals, regardless of the anatomy of the eyes, require some level of gaze stabilization in order to see the world clearly and without blur. For the mantis shrimp, achieving gaze stabilization is unusually challenging as their eyes have an unprecedented scope for movement in all three rotational degrees of freedom: yaw, pitch and torsion. We demonstrate that the species Odontodactylus scyllarus performs stereotypical gaze stabilization in the yaw degree of rotational freedom, which is accompanied by simultaneous changes in the pitch and torsion rotation of the eye. Surprisingly, yaw gaze stabilization performance is unaffected by both the torsional pose and the rate of torsional rotation of the eye. Further to this, we show, for the first time, a lack of a torsional gaze stabilization response in the stomatopod visual system. In the light of these findings, we suggest that the neural wide-field motion detection network in the stomatopod visual system may follow a radially symmetric organization to compensate for the potentially disorientating effects of torsional eye movements, a system likely to be unique to stomatopods.
Affiliation(s)
- Ilse M Daly
- School of Biological Sciences, University of Bristol, Tyndall Avenue, Bristol BS8 1TQ, UK
- Martin J How
- School of Biological Sciences, University of Bristol, Tyndall Avenue, Bristol BS8 1TQ, UK
- Julian C Partridge
- School of Biological Sciences and the Oceans Institute, Faculty of Science, University of Western Australia, 35 Stirling Highway, Crawley, Western Australia 6009, Australia
- Nicholas W Roberts
- School of Biological Sciences, University of Bristol, Tyndall Avenue, Bristol BS8 1TQ, UK
14
Role of side-slip flight in target pursuit: blue-tailed damselflies (Ischnura elegans) avoid body rotation while approaching a moving perch. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2018; 204:561-577. PMID: 29666930. DOI: 10.1007/s00359-018-1261-5.
Abstract
Visually guided flight control requires processing changes in the visual panorama (optic flow) resulting from self-movement relative to stationary objects, as well as from moving objects passing through the field of view. We studied the ability of the blue-tailed damselfly, Ischnura elegans, to successfully land on a perch moving unpredictably. We tracked the insects landing on a vertical pole moved linearly 6 cm back and forth with sinusoidal changes in velocity. When the moving perch changed direction at frequencies higher than 1 Hz, the damselflies engaged in manoeuvres that typically involved sideways flight, with minimal changes in body orientation relative to the stationary environment. We show that these flight manoeuvres attempted to fix the target in the centre of the field of view when flying in any direction, while keeping body rotation about the yaw axis to a minimum. We propose that this pursuit strategy allows the insect to obtain reliable information on self and target motion relative to the stationary environment from the translational optic flow, while minimizing interference from the rotational optic flow. The ability of damselflies to fly in any direction, irrespective of body orientation, underlines the superb flight control of these aerial predators.
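The pursuit strategy described above, closing on the perch while holding yaw fixed, can be sketched as a velocity command in world coordinates: match the target's velocity and add a closing component along the line of sight. This is our illustrative simplification, not the authors' analysis.

```python
import numpy as np

def sideslip_command(pos, target_pos, target_vel, closing_speed):
    """Velocity command for approaching a moving target with body
    orientation held fixed: match the target's velocity and add a
    closing component along the line of sight. With yaw constant, any
    lateral part of the command is flown as side-slip."""
    los = target_pos - pos
    return target_vel + closing_speed * los / np.linalg.norm(los)

# Perch 1 m ahead drifting sideways at 0.1 m/s: close at 0.5 m/s while
# side-slipping to track the perch, with no turn required.
cmd = sideslip_command(np.zeros(2), np.array([0.0, 1.0]),
                       np.array([0.1, 0.0]), 0.5)
print(cmd)  # [0.1 0.5]
```

Because the body never rotates, the rotational component of the optic flow stays near zero, which is the benefit the authors propose.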
15. Ibbotson MR, Hung YS, Meffin H, Boeddeker N, Srinivasan MV. Neural basis of forward flight control and landing in honeybees. Sci Rep 2017; 7:14591. [PMID: 29109404] [PMCID: PMC5673959] [DOI: 10.1038/s41598-017-14954-0]
Abstract
The impressive repertoire of visually guided behaviors in honeybees, and their ability to learn, has made them an important tool for elucidating the visual basis of behavior. Like other insects, bees perform optomotor course correction in response to optic flow, a response that is dependent on the spatial structure of the visual environment. However, bees can also distinguish the speed of image motion during forward flight and landing, as well as estimate flight distances (odometry), irrespective of the visual scene. The neural pathways underlying these abilities are unknown. Here we report on a cluster of descending neurons (DNIIIs) that have the directional tuning properties necessary for detecting image motion during forward flight and landing on vertical surfaces. They have stable firing rates during prolonged periods of stimulation and respond to a wide range of image speeds, making them well suited to detect image flow during flight behaviors. While their responses are not strictly speed tuned, the shape and amplitudes of their speed tuning functions are resistant to large changes in spatial frequency. These cells are prime candidates not only for the control of flight speed and landing, but also as the basis of a neural 'front end' of the honeybee's visual odometer.
Affiliation(s)
- M R Ibbotson
- National Vision Research Institute, Australian College of Optometry, Carlton, Victoria, Australia
- Department of Optometry and Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
- Y-S Hung
- National Vision Research Institute, Australian College of Optometry, Carlton, Victoria, Australia
- Department of Optometry and Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
- National Institute of Child Health and Human Development (NICHD), National Institutes of Health (NIH), 9000 Rockville Pike, Bldg 35A, Bethesda, MD, USA
- H Meffin
- National Vision Research Institute, Australian College of Optometry, Carlton, Victoria, Australia
- Department of Optometry and Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
- N Boeddeker
- Department of Cognitive Neuroscience, Bielefeld University, 33615 Bielefeld, Germany
- M V Srinivasan
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
16. Windsor SP, Taylor GK. Head movements quadruple the range of speeds encoded by the insect motion vision system in hawkmoths. Proc Biol Sci 2017; 284:20171622. [PMID: 28978733] [DOI: 10.1098/rspb.2017.1622]
Abstract
Flying insects use compensatory head movements to stabilize gaze. Like other optokinetic responses, these movements can reduce image displacement, motion and misalignment, and simplify the optic flow field. Because gaze is imperfectly stabilized in insects, we hypothesized that compensatory head movements serve to extend the range of velocities of self-motion that the visual system encodes. We tested this by measuring head movements in hawkmoths Hyles lineata responding to full-field visual stimuli of differing oscillation amplitudes, oscillation frequencies and spatial frequencies. We used frequency-domain system identification techniques to characterize the head's roll response, and simulated how this would have affected the output of the motion vision system, modelled as a computational array of Reichardt detectors. The moths' head movements were modulated to allow encoding of both fast and slow self-motion, effectively quadrupling the working range of the visual system for flight control. By using its own output to drive compensatory head movements, the motion vision system thereby works as an adaptive sensor, which will be especially beneficial in nocturnal species with inherently slow vision. Studies of the ecology of motion vision must therefore consider the tuning of motion-sensitive interneurons in the context of the closed-loop systems in which they function.
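The motion vision model invoked in this abstract, a computational array of Reichardt detectors, can be sketched as a minimal correlation-type elementary motion detector. The low-pass time constant, sampling rate and sinusoidal stimulus below are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def reichardt_output(signal_a, signal_b, dt, tau):
    """Correlation-type (Hassenstein-Reichardt) elementary motion detector.

    signal_a, signal_b: luminance time series from two neighbouring
    photoreceptors; dt: sample interval (s); tau: low-pass time constant (s).
    Returns the opponent detector output over time.
    """
    alpha = dt / (tau + dt)              # first-order low-pass coefficient
    delayed_a = np.zeros_like(signal_a)
    delayed_b = np.zeros_like(signal_b)
    for t in range(1, len(signal_a)):    # low-pass filter delays each input
        delayed_a[t] = delayed_a[t - 1] + alpha * (signal_a[t] - delayed_a[t - 1])
        delayed_b[t] = delayed_b[t - 1] + alpha * (signal_b[t] - delayed_b[t - 1])
    # correlate the delayed signal of each arm with the undelayed signal of
    # the other arm, then subtract the mirror-symmetric arm (opponency)
    return delayed_a * signal_b - delayed_b * signal_a

# drifting sine grating sampled at two points 1/8 of a cycle apart
t = np.arange(0.0, 2.0, 1e-3)
a = np.sin(2 * np.pi * 2 * t)                 # 2 Hz temporal frequency
b = np.sin(2 * np.pi * 2 * t - np.pi / 4)     # phase-lagged neighbour
response = reichardt_output(a, b, dt=1e-3, tau=0.05)
# after the filter transient, the mean response is positive for motion
# in the detector's preferred direction and negative for the reverse
print(np.mean(response[500:]))
```

Swapping the two inputs exactly negates the output, which is the directional opponency the abstract's simulation relies on.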
Affiliation(s)
- Shane P Windsor
- Department of Aerospace Engineering, University of Bristol, University Walk, Bristol BS8 1TR, UK
- Graham K Taylor
- Department of Zoology, University of Oxford, Oxford OX1 3PS, UK
17
|
Rusch C, Roth E, Vinauger C, Riffell JA. Honeybees in a virtual reality environment learn unique combinations of colour and shape. ACTA ACUST UNITED AC 2017; 220:3478-3487. [PMID: 28751492 DOI: 10.1242/jeb.164731] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2017] [Accepted: 07/21/2017] [Indexed: 11/20/2022]
Abstract
Honeybees are well-known models for the study of visual learning and memory. Whereas most of our knowledge of learned responses comes from experiments using free-flying bees, a tethered preparation would allow fine-scale control of the visual stimuli as well as accurate characterization of the learned responses. Unfortunately, conditioning procedures using visual stimuli in tethered bees have been limited in their efficacy. In this study, using a novel virtual reality environment and a differential training protocol in tethered walking bees, we show that the majority of honeybees learn visual stimuli, and need only six paired training trials to learn the stimulus. We found that bees readily learn visual stimuli that differ in both shape and colour. However, bees learn certain components over others (colour versus shape), and visual stimuli are learned in a non-additive manner with the interaction of specific colour and shape combinations being crucial for learned responses. To better understand which components of the visual stimuli the bees learned, the shape-colour association of the stimuli was reversed either during or after training. Results showed that maintaining the visual stimuli in training and testing phases was necessary to elicit visual learning, suggesting that bees learn multiple components of the visual stimuli. Together, our results demonstrate a protocol for visual learning in restrained bees that provides a powerful tool for understanding how components of a visual stimulus elicit learned responses as well as elucidating how visual information is processed in the honeybee brain.
Affiliation(s)
- Claire Rusch
- Department of Biology, University of Washington, Seattle, WA 98195, USA; University of Washington Institute for Neuroengineering, Seattle, WA 98195, USA
- Eatai Roth
- Department of Biology, University of Washington, Seattle, WA 98195, USA; University of Washington Institute for Neuroengineering, Seattle, WA 98195, USA
- Clément Vinauger
- Department of Biology, University of Washington, Seattle, WA 98195, USA
- Jeffrey A Riffell
- Department of Biology, University of Washington, Seattle, WA 98195, USA; University of Washington Institute for Neuroengineering, Seattle, WA 98195, USA
18. Ogawa Y, Ribi W, Zeil J, Hemmi JM. Regional differences in the preferred e-vector orientation of honeybee ocellar photoreceptors. J Exp Biol 2017; 220:1701-1708. [PMID: 28213397] [DOI: 10.1242/jeb.156109]
Abstract
In addition to compound eyes, honeybees (Apis mellifera) possess three single-lens eyes called ocelli located on the top of the head. Ocelli are involved in head-attitude control and in some insects have been shown to provide celestial compass information. Anatomical and early electrophysiological studies have suggested that UV and blue-green photoreceptors in ocelli are polarization sensitive. However, their retinal distribution and receptor characteristics have not been documented. Here, we used intracellular electrophysiology to determine the relationship between the spectral and polarization sensitivity of the photoreceptors and their position within the visual field of the ocelli. We first determined a photoreceptor's spectral response through a series of monochromatic flashes (340-600 nm). We found UV and green receptors, with peak sensitivities at 360 and 500 nm, respectively. We subsequently measured polarization sensitivity at the photoreceptor's peak sensitivity wavelength by rotating a polarizer with monochromatic flashes. Polarization sensitivity (PS) values were significantly higher in UV receptors (3.8±1.5, N=61) than in green receptors (2.1±0.6, N=60). Interestingly, most receptors with receptive fields below 35 deg elevation were sensitive to vertically polarized light while the receptors with visual fields above 35 deg were sensitive to a wide range of polarization angles. These results agree well with anatomical measurements showing differences in rhabdom orientations between dorsal and ventral retinae. We discuss the functional significance of the distribution of polarization sensitivities across the visual field of ocelli by highlighting the information the ocelli are able to extract from the bee's visual environment.
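The polarization-sensitivity (PS) values reported here are the ratio of maximum to minimum response as the polarizer rotates. One standard way to extract them, sketched below under the assumption that the response varies sinusoidally with twice the e-vector angle, is a linear least-squares fit; the synthetic data are illustrative, not the study's recordings.

```python
import numpy as np

def polarization_sensitivity(angles_deg, responses):
    """Estimate a photoreceptor's polarization sensitivity (PS).

    Fits R(theta) = a + c*cos(2*theta) + s*sin(2*theta) by linear least
    squares (responses of polarization-sensitive receptors repeat every
    180 degrees of e-vector angle). Returns PS = Rmax / Rmin and the
    preferred e-vector angle in degrees.
    """
    theta = np.deg2rad(angles_deg)
    X = np.column_stack([np.ones_like(theta),
                         np.cos(2 * theta),
                         np.sin(2 * theta)])
    a, c, s = np.linalg.lstsq(X, responses, rcond=None)[0]
    amp = np.hypot(c, s)
    ps = (a + amp) / (a - amp)          # assumes a > amp (positive responses)
    pref_deg = np.rad2deg(0.5 * np.arctan2(s, c)) % 180
    return ps, pref_deg

# synthetic receptor: mean 2, modulation 1, preferred e-vector at 30 degrees
angles = np.arange(0, 180, 15)
resp = 2.0 + np.cos(2 * np.deg2rad(angles - 30))
ps, pref = polarization_sensitivity(angles, resp)
```

For this noiseless example the fit recovers PS = 3 and a 30 degree preferred angle exactly; with real recordings the same fit gives the PS estimates the abstract compares between UV and green receptors.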
Affiliation(s)
- Yuri Ogawa
- School of Biological Sciences and UWA Oceans Institute (M092), The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009, Australia; Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Willi Ribi
- Research School of Biology, The Australian National University, Canberra, ACT 2601, Australia
- Jochen Zeil
- Research School of Biology, The Australian National University, Canberra, ACT 2601, Australia
- Jan M Hemmi
- School of Biological Sciences and UWA Oceans Institute (M092), The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009, Australia
19. Avarguès-Weber A, Mota T. Advances and limitations of visual conditioning protocols in harnessed bees. J Physiol Paris 2016; 110:107-118. [PMID: 27998810] [DOI: 10.1016/j.jphysparis.2016.12.006]
Abstract
Bees are excellent invertebrate models for studying visual learning and memory mechanisms, because of their sophisticated visual system and impressive cognitive capacities associated with a relatively simple brain. Visual learning in free-flying bees has traditionally been studied using an operant conditioning paradigm. This well-established protocol, however, can hardly be combined with invasive procedures for studying the neurobiological basis of visual learning. Different efforts have been made to develop protocols in which harnessed honey bees could associate visual cues with reinforcement, though learning performances remain poorer than those obtained with free-flying animals. Especially in the last decade, the intention of improving the visual learning performance of harnessed bees led many authors to adopt distinct visual conditioning protocols, altering parameters like harnessing method, nature and duration of visual stimulation, number of trials, and inter-trial intervals, among others. As a result, the literature provides data that are hard to compare and sometimes contradictory. In the present review, we provide an extensive analysis of the literature available on visual conditioning of harnessed bees, with special emphasis on comparing the diverse conditioning parameters adopted by different authors. Together with this comparative overview, we discuss how these conditioning parameters could modulate the visual learning performance of harnessed bees.
Affiliation(s)
- Aurore Avarguès-Weber
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), Université de Toulouse, CNRS, UPS, 118 Route de Narbonne, 31062 Toulouse Cedex 9, France
- Theo Mota
- Departamento de Fisiologia e Biofísica, Instituto de Ciências Biológicas - ICB, Universidade Federal de Minas Gerais - UFMG, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais, Brazil
20. Crall JD, Ravi S, Mountcastle AM, Combes SA. Bumblebee flight performance in cluttered environments: effects of obstacle orientation, body size and acceleration. J Exp Biol 2015; 218:2728-2737. [PMID: 26333927] [DOI: 10.1242/jeb.121293]
Abstract
Locomotion through structurally complex environments is fundamental to the life history of most flying animals, and the costs associated with movement through clutter have important consequences for the ecology and evolution of volant taxa. However, few studies have directly investigated how flying animals navigate through cluttered environments, or examined which aspects of flight performance are most critical for this challenging task. Here, we examined how body size, acceleration and obstacle orientation affect the flight of bumblebees in an artificial, cluttered environment. Non-steady flight performance is often predicted to decrease with body size, as a result of a presumed reduction in acceleration capacity, but few empirical tests of this hypothesis have been performed in flying animals. We found that increased body size is associated with impaired flight performance (specifically transit time) in cluttered environments, but not with decreased peak accelerations. In addition, previous studies have shown that flying insects can produce higher accelerations along the lateral body axis, suggesting that if maneuvering is constrained by acceleration capacity, insects should perform better when maneuvering around objects laterally rather than vertically. Our data show that bumblebees do generate higher accelerations in the lateral direction, but we found no difference in their ability to pass through obstacle courses requiring lateral versus vertical maneuvering. In sum, our results suggest that acceleration capacity is not a primary determinant of flight performance in clutter, as is often assumed. Rather than being driven by the scaling of acceleration, we show that the reduced flight performance of larger bees in cluttered environments is driven by the allometry of both path sinuosity and mean flight speed. Specifically, differences in collision-avoidance behavior underlie much of the variation in flight performance across body size, with larger bees negotiating obstacles more cautiously. Thus, our results show that cluttered environments challenge the flight capacity of insects, but in surprising ways that emphasize the importance of behavioral and ecological context for understanding flight performance in complex environments.
Affiliation(s)
- James D Crall
- Concord Field Station, Department of Organismic and Evolutionary Biology, Harvard University, 100 Old Causeway Rd, Bedford, MA 01730, USA
- Sridhar Ravi
- School of Aerospace Mechanical and Manufacturing Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Andrew M Mountcastle
- Concord Field Station, Department of Organismic and Evolutionary Biology, Harvard University, 100 Old Causeway Rd, Bedford, MA 01730, USA
- Stacey A Combes
- Concord Field Station, Department of Organismic and Evolutionary Biology, Harvard University, 100 Old Causeway Rd, Bedford, MA 01730, USA
21. How Wasps Acquire and Use Views for Homing. Curr Biol 2016; 26:470-82. [DOI: 10.1016/j.cub.2015.12.052]
22. Goulard R, Julien-Laferriere A, Fleuriet J, Vercher JL, Viollet S. Behavioural evidence for a visual and proprioceptive control of head roll in hoverflies (Episyrphus balteatus). J Exp Biol 2015; 218:3777-87. [PMID: 26486370] [DOI: 10.1242/jeb.127043]
Abstract
The ability of hoverflies to control their head orientation with respect to their body contributes importantly to their agility and their autonomous navigation abilities. Many tasks performed by this insect during flight, especially while hovering, involve a head stabilization reflex. This reflex, which is mediated by multisensory channels, prevents the visual processing from being disturbed by motion blur and maintains a consistent perception of the visual environment. The so-called dorsal light response (DLR) is another head control reflex, which makes insects sensitive to the brightest part of the visual field. In this study, we experimentally validate and quantify the control loop driving the head roll with respect to the horizon in hoverflies. The new approach developed here consisted of using an upside-down horizon in a body roll paradigm. In this unusual configuration, tethered flying hoverflies surprisingly no longer use purely vision-based control for head stabilization. These results shed new light on the role of neck proprioceptor organs in head and body stabilization with respect to the horizon. Based on the responses obtained with male and female hoverflies, an improved model was then developed in which the output signals delivered by the neck proprioceptor organs are combined with the visual error in the estimated position of the body roll. An internal estimation of the body roll angle with respect to the horizon might explain the extremely accurate flight performances achieved by some hovering insects.
Affiliation(s)
- Roman Goulard
- Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
- Alice Julien-Laferriere
- INRIA and Université de Lyon, Lyon 69000, France
- CNRS, UMR 5558, Laboratoire de Biométrie et Biologie Évolutive, Villeurbanne 69622, France
- Jérome Fleuriet
- Washington National Primate Research Center and Department of Ophthalmology, University of Washington, Seattle, WA 98195, USA
- Stéphane Viollet
- Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
23. Briod A, Zufferey JC, Floreano D. A method for ego-motion estimation in micro-hovering platforms flying in very cluttered environments. Auton Robots 2015. [DOI: 10.1007/s10514-015-9494-4]
24. Boeddeker N, Mertes M, Dittmar L, Egelhaaf M. Bumblebee Homing: The Fine Structure of Head Turning Movements. PLoS One 2015; 10:e0135020. [PMID: 26352836] [PMCID: PMC4564262] [DOI: 10.1371/journal.pone.0135020]
Abstract
Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body. Head orientation is stabilized for most of the time by counter-rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments that are interspersed with rapid turns. These short and fast yaw turns ("saccades") are usually accompanied by even faster head yaw turns that change gaze direction. Since a large part of image rotation is thereby reduced to brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. A detailed analysis of the fine structure of the bees' head turning movements shows that the time course of single head saccades is very stereotypical. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements, which in its main characteristics resembles the so-called "saccadic main sequence" in humans. The fact that bumblebee head saccades are highly stereotyped, as in humans, may hint at a common principle in which fast and precise motor control is used to reliably reduce the time during which the retinal image moves.
Affiliation(s)
- Norbert Boeddeker
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Department of Cognitive Neurosciences & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Marcel Mertes
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Laura Dittmar
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology & Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
25. Stewart FJ, Kinoshita M, Arikawa K. The roles of visual parallax and edge attraction in the foraging behaviour of the butterfly Papilio xuthus. J Exp Biol 2015; 218:1725-32. [PMID: 25883380] [DOI: 10.1242/jeb.115063]
Abstract
Several examples of insects using visual motion to measure distance have been documented, from locusts peering to gauge the proximity of prey, to honeybees performing visual odometry en route between the hive and a flower patch. However, whether the use of parallax information is confined to specialised behaviours like these or represents a more general-purpose sensory capability is an open question. We investigate this issue in the foraging swallowtail butterfly Papilio xuthus, which we trained to associate a target presented on a monitor with a food reward. We then tracked the animal's flight in real time, allowing us to manipulate the size and/or position of the target in a closed-loop manner to create the illusion that it was situated either above or below the monitor surface. Butterflies are less attracted to (i.e. slower to approach) targets that appear, based on motion parallax, to be more distant. Furthermore, we found that the number of abortive descent manoeuvres performed prior to the first successful target approach varies according to the depth of the virtual target, with expansion and parallax cues having effects of opposing polarity. However, we found no evidence that Papilio modulate the kinematic parameters of their descents according to the apparent distance of the target. Thus, we argue that motion parallax is used to identify a proximal target object, but that the subsequent process of approaching it is based on stabilising its edge in the 2D space of the retina, without estimating its distance.
Affiliation(s)
- Finlay J Stewart
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
- Michiyo Kinoshita
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
- Kentaro Arikawa
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
26. Beatus T, Guckenheimer JM, Cohen I. Controlling roll perturbations in fruit flies. J R Soc Interface 2015; 12:20150075. [PMID: 25762650] [PMCID: PMC4387536] [DOI: 10.1098/rsif.2015.0075]
Abstract
Owing to aerodynamic instabilities, stable flapping flight requires ever-present fast corrective actions. Here, we investigate how flies control perturbations along their body roll angle, which is unstable and their most sensitive degree of freedom. We glue a magnet to each fly and apply a short magnetic pulse that rolls it in mid-air. Fast video shows flies correct perturbations up to 100° within 30 ± 7 ms by applying a stroke-amplitude asymmetry that is well described by a linear proportional-integral controller. For more aggressive perturbations, we show evidence for nonlinear and hierarchical control mechanisms. Flies respond to roll perturbations within 5 ms, making this correction reflex one of the fastest in the animal kingdom.
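The linear proportional-integral control described in this abstract can be sketched in a simple simulation. The gains, the 5 ms sensorimotor latency and the first-order (heavily damped) body model below are illustrative assumptions, not the values fitted by the authors.

```python
import numpy as np

def pi_roll_correction(roll0_deg, kp=100.0, ki=500.0,
                       latency=0.005, dt=1e-4, t_end=0.06):
    """Simulate recovery from an impulsive roll perturbation.

    The body is modelled as heavily damped, so the roll rate is taken to
    be proportional to the stroke-amplitude asymmetry commanded by a PI
    controller that reads the roll angle with a sensorimotor latency.
    Returns time (s) and roll angle (deg) arrays.
    """
    n = int(t_end / dt)
    roll = np.empty(n)
    roll[0] = roll0_deg
    integral = 0.0
    d = max(int(latency / dt), 1)          # latency in samples
    for k in range(1, n):
        err = roll[max(k - d, 0)]          # delayed roll measurement
        integral += err * dt
        rate = -(kp * err + ki * integral)  # PI command -> roll rate
        roll[k] = roll[k - 1] + rate * dt   # forward-Euler integration
    return np.arange(n) * dt, roll

# a 45 degree perturbation decays back toward level within ~60 ms,
# qualitatively matching the correction times reported in the abstract
t, roll = pi_roll_correction(45.0)
```

With these assumed gains the proportional term dominates the fast initial correction while the integral term removes the residual offset, which is the structure of the controller the paper identifies.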
Affiliation(s)
- Tsevi Beatus
- Department of Physics, Cornell University, Ithaca, NY 14853, USA
- Itai Cohen
- Department of Physics, Cornell University, Ithaca, NY 14853, USA
27. Odometry for ground moving agents by optic flow recorded with optical mouse chips. Sensors 2014; 14:21045-64. [PMID: 25384010] [PMCID: PMC4279524] [DOI: 10.3390/s141121045]
Abstract
Optical mouse chips—equipped with adequate lenses—can serve as small, light, precise, fast, and cheap motion sensors monitoring optic flow induced by self motion of an agent in a contrasted environment. We present a device that extracts self motion parameters exclusively from flow in eight mouse sensors. Four pairs of sensors with opposite azimuth are mounted on a sensor head, each individual sensor looking down with −45° elevation. The head is mounted on a carriage and is moved at constant height above a textured planar ground. The calibration procedure and tests on the precision of self motion estimates are reported.
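One way to sketch how the eight flow readings could be combined into a self-motion estimate is a stacked least-squares solve. The per-sensor calibration matrices (Jacobians) below stand in for the calibration procedure the abstract mentions, and their values here are assumptions for illustration.

```python
import numpy as np

def estimate_ego_motion(flows, jacobians):
    """Least-squares self-motion estimate from several optic-flow sensors.

    flows: (n, 2) array of x/y flow read from each mouse chip.
    jacobians: (n, 2, 3) calibration matrices mapping planar self motion
    (vx, vy, yaw rate) to the flow each sensor sees at the working height.
    Returns the 3-vector (vx, vy, omega).
    """
    A = jacobians.reshape(-1, 3)     # stack all 2-row sensor Jacobians
    b = flows.reshape(-1)            # stack the matching flow readings
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol

# synthetic check: four sensors with (assumed) random calibration matrices
rng = np.random.default_rng(0)
J = rng.normal(size=(4, 2, 3))
true_motion = np.array([0.2, -0.1, 0.5])          # vx, vy, yaw rate
flows = np.einsum('nij,j->ni', J, true_motion)    # noise-free flow readings
estimate = estimate_ego_motion(flows, J)
```

Because each sensor contributes two equations for three unknowns, four or more sensors over-determine the system, so noise in individual chips averages out in the least-squares solution.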
28. Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2014. [DOI: 10.4161/cib.13763]
29. Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Front Neural Circuits 2014; 8:127. [PMID: 25389392] [PMCID: PMC4211400] [DOI: 10.3389/fncir.2014.00127]
Abstract
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate by highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information that is contained in the retinal motion patterns induced on the eyes while moving around ("optic flow") to accomplish their extraordinary performance. Thereby, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases in which the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, we conclude from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
Affiliation(s)
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
- Roland Kern
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
- Jens Peter Lindemann
- Department of Neurobiology and Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
30. Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 2014; 8:335. [PMID: 25309374] [PMCID: PMC4173878] [DOI: 10.3389/fnbeh.2014.00335]
Abstract
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help the animals acquire these memories at newly discovered foraging locations, where landmarks (salient objects in the vicinity of the goal location) can play an important role in guiding homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it was shown that motion cues are sufficient for bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, which is in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
Affiliation(s)
- Marcel Mertes
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Laura Dittmar
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Norbert Boeddeker
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
31
Abstract
Primates can analyse visual scenes extremely rapidly, making accurate decisions for presentation times of only 20 ms. We asked whether bumblebees, despite having potentially more limited processing power, could similarly detect and discriminate visual patterns presented for durations of 100 ms or less. Bumblebees detected stimuli and discriminated between differently oriented and coloured stimuli even when these were presented as briefly as 20 ms, but failed to identify ecologically relevant shapes (predatory spiders on flowers) even when presented for 100 ms. This suggests a profound difference between primate and insect visual processing: while primates can capture entire visual scenes 'at a glance', insects might have to rely on continuous online sampling of the world around them, using a process of active vision which requires longer integration times.
32
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. [PMID: 23269913 PMCID: PMC3526811 DOI: 10.3389/fncir.2012.00108] [Citation(s) in RCA: 71] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2012] [Accepted: 12/03/2012] [Indexed: 11/30/2022] Open
Abstract
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt, inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish this extraordinary performance, flies and bees have been shown to actively shape, through characteristic behavioral actions, the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually guided flight control might be helpful in finding technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
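The saccadic flight-and-gaze strategy summarized in this abstract—confining rotations to brief saccades so that intersaccadic optic flow is almost purely translational—can be illustrated with a minimal sketch. All thresholds and yaw-rate samples below are hypothetical, not values from the paper:

```python
# Minimal sketch (hypothetical numbers): classify flight samples into
# saccadic (rotation-dominated) and intersaccadic (translation-dominated)
# segments by thresholding yaw rate, mirroring the saccadic gaze strategy.

def segment_saccades(yaw_rates, threshold=300.0):
    """Return True for samples whose |yaw rate| (deg/s) exceeds the threshold."""
    return [abs(w) > threshold for w in yaw_rates]

# Toy trace: mostly straight flight interrupted by two brief, fast saccades.
trace = [10, 5, -8, 900, 1200, 600, 4, -3, 7, -850, -1100, 2]
saccadic = segment_saccades(trace)
intersaccadic_fraction = saccadic.count(False) / len(saccadic)
print(f"fraction of flight usable for depth from parallax: {intersaccadic_fraction:.2f}")
```

Only the intersaccadic, translation-dominated samples carry distance information via motion parallax, which is why segregating the two flow components is computationally parsimonious.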
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Bielefeld, Germany
33
Hung YS, van Kleef JP, Stange G, Ibbotson MR. Spectral inputs and ocellar contributions to a pitch-sensitive descending neuron in the honeybee. J Neurophysiol 2012. [PMID: 23197452 DOI: 10.1152/jn.00830.2012] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
By measuring insect compensatory optomotor reflexes to visual motion, researchers have examined the computational mechanisms of the motion processing system. However, establishing the spectral sensitivity of the neural pathways that underlie this motion behavior has been difficult, and the contribution of the simple eyes (ocelli) has rarely been examined. In this study we investigate the spectral response properties and ocellar inputs of an anatomically identified descending neuron (DNII(2)) in the honeybee optomotor pathway. Using a panoramic stimulus, we show that it responds selectively to optic flow associated with pitch rotations. We also stimulated the neuron with a custom-built light-emitting diode array that presented moving bars that were either all green (spectrum 500-600 nm, peak 530 nm) or all short-wavelength (spectrum 350-430 nm, peak 380 nm). Although the optomotor response is thought to be dominated by green-sensitive inputs, we show that DNII(2) is equally responsive, and direction selective, to both green and short-wavelength stimuli. The color of the background image also influences the spontaneous spiking behavior of the cell: a green background produces significantly higher spontaneous spiking rates. Stimulating the ocelli produces strong modulatory effects on DNII(2), significantly increasing the amplitude of its responses in the preferred motion direction and decreasing the response latency by adding a directional, short-latency response component. Our results suggest that the spectral sensitivity of the optomotor response in honeybees may be more complicated than previously thought and that the ocelli play a significant role in shaping the timing of motion signals.
Affiliation(s)
- Y-S Hung
- National Vision Research Institute, Australian College of Optometry, Carlton, Victoria, Australia
34
Pant V, Higgins CM. Tracking improves performance of biological collision avoidance models. Biol Cybern 2012; 106:307-22. [PMID: 22744199 DOI: 10.1007/s00422-012-0499-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/22/2011] [Accepted: 05/31/2012] [Indexed: 06/01/2023]
Abstract
Collision avoidance models derived from the study of insect brains do not perform universally well in practical collision scenarios, although the insects themselves may perform well in similar situations. In this article, we present a detailed simulation analysis of two well-known collision avoidance models and illustrate their limitations. In doing so, we present a novel continuous-time implementation of a neuronally based collision avoidance model. We then show that visual tracking can improve the performance of these models by allowing a computation of the relative distance between the obstacle and the observer. We compare the results of simulations of the two models with and without tracking to show how tracking improves the ability of the models to detect an imminent collision. We also present an implementation of one of these models processing imagery from a camera, to show how it performs in real-world scenarios. These results suggest that insects may track looming objects with their gaze.
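The benefit of tracking described in this abstract can be illustrated with the standard time-to-contact relation τ ≈ θ / (dθ/dt) for a tracked, looming object. This is a generic sketch of that textbook computation, not the authors' model, and the numbers are hypothetical:

```python
# Sketch: estimate time-to-contact from the angular size of a tracked object.
# With tracking, the object's angular expansion is measured directly rather
# than inferred from whole-field optic flow.

def time_to_contact(theta_prev, theta_now, dt):
    """Estimate time-to-contact (s) from two angular-size samples (rad)."""
    d_theta = (theta_now - theta_prev) / dt
    if d_theta <= 0:
        return float("inf")  # not expanding: no imminent collision
    return theta_now / d_theta

# An object growing from 0.10 to 0.12 rad over 0.1 s:
tau = time_to_contact(0.10, 0.12, 0.1)
print(f"estimated time to contact: {tau:.2f} s")
```

A shrinking or constant angular size yields an infinite time-to-contact, so the detector stays quiet for receding or stationary objects.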
Affiliation(s)
- Vivek Pant
- Department of Electrical and Computer Engineering, The University of Arizona, Tucson, AZ 85721, USA
35
Visual homing: an insect perspective. Curr Opin Neurobiol 2012; 22:285-93. [PMID: 22221863 DOI: 10.1016/j.conb.2011.12.008] [Citation(s) in RCA: 154] [Impact Index Per Article: 12.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2011] [Revised: 11/28/2011] [Accepted: 12/15/2011] [Indexed: 11/21/2022]
36
37
Hung YS, van Kleef JP, Ibbotson MR. Visual response properties of neck motor neurons in the honeybee. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2011; 197:1173-87. [PMID: 21909972 DOI: 10.1007/s00359-011-0679-9] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2011] [Revised: 08/26/2011] [Accepted: 08/28/2011] [Indexed: 11/28/2022]
Abstract
Recent behavioural studies have demonstrated that honeybees use visual feedback to stabilize their gaze. However, little is known about the neural circuits that perform the visual motor computations that underlie this ability. We investigated the motor neurons that innervate two neck muscles (m44 and m51), which produce stabilizing yaw movements of the head. Intracellular recordings were made from five (out of eight) identified neuron types in the first cervical nerve (IK1) of honeybees. Two motor neurons that innervate muscle 51 were found to be direction-selective, with a preference for horizontal image motion from the contralateral to the ipsilateral side of the head. Three neurons that innervate muscle 44 were tuned to detect motion in the opposite direction (from ipsilateral to contralateral). These cells were binocularly sensitive and responded optimally to frontal stimulation. By combining the directional tuning of the motor neurons in an opponent manner, the neck motor system would be able to mediate reflexive optomotor head turns in the direction of image motion, thus stabilising the retinal image. When the dorsal ocelli were covered, the spontaneous activity of neck motor neurons increased and visual responses were modified, suggesting an ocellar input in addition to that from the compound eyes.
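The opponent read-out suggested in this abstract—combining the oppositely tuned motor pools for muscles m51 and m44—can be sketched as a signed yaw command. The firing rates below are hypothetical illustrations, not recorded values:

```python
# Sketch (hypothetical rates): combining the two oppositely tuned neck motor
# pools in an opponent manner yields a signed yaw drive that follows the
# direction of horizontal image motion.

def yaw_command(rate_m51, rate_m44):
    """Signed head-yaw drive from the two opponent motor pools (spikes/s)."""
    return rate_m51 - rate_m44

# Contralateral-to-ipsilateral motion excites the m51 pool and suppresses m44:
print(yaw_command(rate_m51=80, rate_m44=20))   # positive: turn with the motion
print(yaw_command(rate_m51=15, rate_m44=70))   # negative: turn the other way
```

An opponent scheme like this cancels common-mode activity (for example, luminance-driven rate changes shared by both pools) while preserving the directional signal.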
Affiliation(s)
- Y-S Hung
- ARC Centre of Excellence in Vision Science, Research School of Biology and Division of Biomedical Science and Biochemistry, R.N. Robertson Building, Australian National University, Canberra, ACT, 2601, Australia
38
Portelli G, Ruffier F, Roubieu FL, Franceschini N. Honeybees' speed depends on dorsal as well as lateral, ventral and frontal optic flows. PLoS One 2011; 6:e19486. [PMID: 21589861 PMCID: PMC3093387 DOI: 10.1371/journal.pone.0019486] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2010] [Accepted: 04/08/2011] [Indexed: 11/19/2022] Open
Abstract
Flying insects use the optic flow to navigate safely in unfamiliar environments, especially by adjusting their speed and their clearance from surrounding objects. It has not yet been established, however, which specific parts of the optical flow field insects use to control their speed. With a view to answering this question, freely flying honeybees were trained to fly along a specially designed tunnel including two successive tapering parts: the first part was tapered in the vertical plane and the second one, in the horizontal plane. The honeybees were found to adjust their speed on the basis of the optic flow they perceived not only in the lateral and ventral parts of their visual field, but also in the dorsal part. More specifically, the honeybees' speed varied monotonically, depending on the minimum cross-section of the tunnel, regardless of whether the narrowing occurred in the horizontal or vertical plane. The honeybees' speed decreased or increased whenever the minimum cross-section decreased or increased. In other words, the larger sum of the two opposite optic flows in the horizontal and vertical planes was kept practically constant thanks to the speed control performed by the honeybees upon encountering a narrowing of the tunnel. The previously described ALIS ("AutopiLot using an Insect-based vision System") model nicely matches the present behavioral findings. The ALIS model is based on a feedback control scheme that explains how honeybees may keep their speed proportional to the minimum local cross-section of a tunnel, based solely on optic flow processing, without any need for speedometers or rangefinders. The present behavioral findings suggest how flying insects may succeed in adjusting their speed in their complex foraging environments, while at the same time adjusting their distance not only from lateral and ventral objects but also from those located in their dorsal visual field.
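The feedback scheme attributed here to the ALIS model—holding the sum of the two opposite optic flows at a set value by adjusting forward speed—can be sketched as a simple proportional controller. The setpoint, gain, and wall distances below are hypothetical, not the model's actual parameters:

```python
# Sketch of an ALIS-style speed regulator: forward speed is adjusted so the
# sum of the two opposite optic flows stays at a setpoint, which makes speed
# track the local tunnel cross-section without any speedometer or rangefinder.

def flow_sum(v, d_left, d_right):
    """Sum of the two lateral optic-flow magnitudes (rad/s) at speed v (m/s)."""
    return v / d_left + v / d_right

def speed_step(v, d_left, d_right, setpoint=4.0, gain=0.2):
    """One proportional-control update: drive the flow sum toward the setpoint."""
    return v + gain * (setpoint - flow_sum(v, d_left, d_right))

v = 0.2
for _ in range(200):                    # wide section: both walls 0.5 m away
    v = speed_step(v, 0.5, 0.5)
print(f"speed in wide section: {v:.2f} m/s")

for _ in range(200):                    # narrow section: both walls 0.25 m away
    v = speed_step(v, 0.25, 0.25)
print(f"speed in narrow section: {v:.2f} m/s")
```

At equilibrium the speed settles at setpoint divided by the summed inverse wall distances, so halving the cross-section halves the speed—matching the monotonic speed adjustment reported in the abstract.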
Affiliation(s)
- Geoffrey Portelli
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Franck Ruffier
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Frédéric L. Roubieu
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Nicolas Franceschini
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
39
Dittmar L. Static and dynamic snapshots for goal localization in insects? Commun Integr Biol 2011; 4:17-20. [PMID: 21509170 DOI: 10.4161/cib.4.1.13763] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2010] [Accepted: 09/27/2010] [Indexed: 11/19/2022] Open
Abstract
Bees, wasps, and ants navigate successfully between feeding sites and their nest, despite the small size of their brains, which contain fewer than a million neurons. A long history of studies examining the role of visual memories in homing behavior shows that insects can localize a goal by finding a close match between a memorized view at the goal location and their current view ("snapshot matching"). However, the concept of static snapshot matching might not explain all aspects of homing behavior, as honeybees are able to use landmarks that are statically camouflaged. In this case the landmarks are detectable only through relative motion cues between the landmark and the background, which the bees generate when they perform characteristic flight maneuvers close to the landmarks. The bees' navigation performance can be explained by a matching scheme based on optic flow amplitudes ("dynamic snapshot matching"). In this article, I discuss the concept of dynamic snapshot matching in the light of previous literature.
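The static-versus-dynamic distinction drawn in this abstract can be made concrete: instead of comparing brightness images, a dynamic snapshot stores the pattern of optic-flow amplitudes experienced at the goal. A minimal sketch of such a matching scheme; the flow values and map size are hypothetical:

```python
# Sketch (hypothetical flow maps): "dynamic snapshot matching" compares the
# current pattern of optic-flow amplitudes against a pattern memorized at the
# goal, rather than comparing raw brightness snapshots.

def mismatch(current_flow, snapshot_flow):
    """Sum of squared differences between two optic-flow amplitude maps."""
    return sum((c - s) ** 2 for c, s in zip(current_flow, snapshot_flow))

snapshot = [0.2, 0.9, 0.9, 0.2]       # flow amplitudes memorized at the goal
near_goal = [0.25, 0.85, 0.95, 0.2]   # similar flow pattern, close to the goal
far_away = [0.6, 0.3, 0.2, 0.7]       # different flow pattern, far from it

print(mismatch(near_goal, snapshot) < mismatch(far_away, snapshot))
```

A homing agent would simply move so as to reduce this mismatch; because flow amplitudes depend on distance rather than brightness, statically camouflaged landmarks still contribute to the match.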
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology & Center of Excellence 'Cognitive Interaction Technology', Bielefeld University, Bielefeld, Germany
40
Dittmar L, Egelhaaf M, Stürzl W, Boeddeker N. The behavioral relevance of landmark texture for honeybee homing. Front Behav Neurosci 2011; 5:20. [PMID: 21541258 PMCID: PMC3083717 DOI: 10.3389/fnbeh.2011.00020] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2010] [Accepted: 04/03/2011] [Indexed: 11/15/2022] Open
Abstract
Honeybees visually pinpoint the location of a food source using landmarks. Studies on the role of visual memories have suggested that bees approach the goal by finding a close match between their current view and a memorized view of the goal location. The most relevant landmark features for this matching process seem to be their retinal positions, their size as defined by their edges, and their color. Recently, we showed that honeybees can use landmarks that are statically camouflaged, suggesting that motion cues are relevant as well. Currently it is unclear how bees weight these different landmark features when accomplishing navigational tasks, and whether this depends on their saliency. Since natural objects are often distinguished by their texture, we investigate the behavioral relevance and the interplay of the spatial configuration and the texture of landmarks. We show that landmark texture is a feature that bees memorize, and that being given the opportunity to identify landmarks by their texture improves the bees' navigational performance. Landmark texture is weighted more strongly than landmark configuration when it provides the bees with positional information and when the texture is salient. In the vicinity of the landmark, honeybees changed their flight behavior according to its texture.
Affiliation(s)
- Laura Dittmar
- Department of Neurobiology and Center of Excellence 'Cognitive Interaction Technology', Bielefeld University, Bielefeld, Germany
41
Boeddeker N, Dittmar L, Stürzl W, Egelhaaf M. The fine structure of honeybee head and body yaw movements in a homing task. Proc Biol Sci 2010; 277:1899-906. [PMID: 20147329 PMCID: PMC2871881 DOI: 10.1098/rspb.2009.2326] [Citation(s) in RCA: 73] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2009] [Accepted: 01/22/2010] [Indexed: 11/12/2022] Open
Abstract
Honeybees turn their thorax and thus their flight motor to change direction or to fly sideways. If the bee's head were fixed to its thorax, such movements would have great impact on vision. Head movements independent of thorax orientation can stabilize gaze and thus play an important and active role in shaping the structure of the visual input the animal receives. Here, we investigate how gaze and flight control interact in a homing task. We use high-speed video equipment to record the head and body movements of honeybees approaching and departing from a food source that was located between three landmarks in an indoor flight arena. During these flights, the bees' trajectories consist of straight flight segments combined with rapid turns. These short and fast yaw turns ('saccades') are in most cases accompanied by even faster head yaw turns that start about 8 ms earlier than the body saccades. Between saccades, gaze stabilization leads to a behavioural elimination of rotational components from the optical flow pattern, which facilitates depth perception from motion parallax.
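The roughly 8 ms head-before-body lead reported here can be quantified by finding the lag that best aligns head and body yaw traces. A toy sketch; the traces, lag, and sampling rate below are hypothetical, not the study's data:

```python
# Sketch (toy traces): find the lag, in samples, at which the body yaw trace
# best matches the head yaw trace, quantifying a head-before-body saccade lead.

def best_lag(head, body, max_lag=5):
    """Lag (samples) of body relative to head that minimizes squared mismatch."""
    def cost(lag):
        pairs = [(head[i], body[i + lag]) for i in range(len(head) - max_lag)]
        return sum((h - b) ** 2 for h, b in pairs)
    return min(range(max_lag + 1), key=cost)

head_yaw = [0, 0, 5, 40, 90, 40, 5, 0, 0, 0, 0, 0]
body_yaw = [0, 0, 0, 0, 5, 40, 90, 40, 5, 0, 0, 0]  # same saccade, 2 samples later
lag = best_lag(head_yaw, body_yaw)
print(f"body lags head by {lag} samples")  # at an assumed 250 frames/s, 2 samples = 8 ms
```

On real data one would use a proper cross-correlation over positive and negative lags, but the principle is the same: the minimum-mismatch lag estimates how far head saccades lead body saccades.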
Affiliation(s)
- Norbert Boeddeker
- Bielefeld University, Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld, Germany.