1
Kaushik PK, Olsson SB. Using virtual worlds to understand insect navigation for bio-inspired systems. Curr Opin Insect Sci 2020; 42:97-104. PMID: 33010476; DOI: 10.1016/j.cois.2020.09.010.
Abstract
Insects perform a wide array of intricate behaviors over large spatial and temporal scales in complex natural environments. A mechanistic understanding of insect cognition has direct implications for how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots. Virtual Reality (VR) offers an opportunity to assess insect neuroethology while presenting complex, yet controlled, stimuli. Here, we discuss the use of insects as inspiration for artificial systems, recent advances in different VR technologies, current knowledge gaps, and the potential for applying insect VR research to bio-inspired robots. Finally, we advocate the need to diversify our model organisms and behavioral paradigms, and to embrace the complexity of the natural world. This will help us uncover the proximate and ultimate bases of brain and behavior and extract general principles for common challenging problems.
Affiliation(s)
- Pavan Kumar Kaushik
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India.
- Shannon B Olsson
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India.
2
Kócsi Z, Murray T, Dahmen H, Narendra A, Zeil J. The Antarium: A Reconstructed Visual Reality Device for Ant Navigation Research. Front Behav Neurosci 2020; 14:599374. PMID: 33240057; PMCID: PMC7683616; DOI: 10.3389/fnbeh.2020.599374.
Abstract
We constructed a large projection device (the Antarium) with 20,000 UV-Blue-Green LEDs that allows us to present tethered ants with views of their natural foraging environment. The ants walk on an air-cushioned trackball; their movements are registered and can be fed back to the visual panorama. Views are generated in a 3D model of the ants’ environment so that they experience the changing visual world in the same way as they do when foraging naturally. The Antarium is a biscribed pentakis dodecahedron with 55 facets of identical isosceles triangles. The base of each triangle is 368 mm long, resulting in a device roughly 1 m in diameter. Each triangle contains 361 blue/green LEDs and nine UV LEDs, so the 55 triangles together provide 19,855 green/blue pixels and 495 UV pixels, covering 360° of azimuth and elevations from −50° below to +90° above the horizon. The angular resolution is 1.5° for the green and blue LEDs and 6.7° for the UV LEDs, with 65,536 intensity levels, a flicker frequency of more than 9,000 Hz, and a frame rate of 190 fps. In addition, the direction and degree of polarisation of the UV LEDs can be adjusted through polarisers mounted on the axles of rotary actuators. We build 3D models of the ants’ natural foraging environment using purely camera-based methods and reconstruct panoramic scenes at any point within these models by projecting panoramic images onto six virtual cameras, which capture a cube-map of images to be displayed by the LEDs of the Antarium. The Antarium is a unique instrument for investigating visual navigation in ants. In open loop, it allows us to present ants with familiar and unfamiliar views, with completely featureless visual scenes, or with scenes that are altered in spatial or spectral composition. In closed loop, we can study the behavior of ants that are virtually displaced within their natural foraging environment. In the future, the Antarium can also be used to investigate the dynamics of navigational guidance and the neurophysiological basis of ant navigation in natural visual environments.
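The cube-map sampling step described in this abstract (six virtual cameras, one per cube face, whose rendered images are then sampled at each LED's viewing direction) can be sketched as follows. This is a minimal illustration of the general cube-map technique, not the Antarium's actual rendering code; the face names and axis conventions are assumptions of this sketch.

```python
def direction_to_cubemap(d):
    """Map a nonzero 3D view direction to a cube-map face and (u, v) in [0, 1].

    The face with the largest absolute axis component is selected, the
    remaining two components are divided by it (perspective projection onto
    that face), and the result is remapped from [-1, 1] to [0, 1].
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:          # +X or -X face dominates
        face = '+x' if x > 0 else '-x'
        u, v = (-z / ax if x > 0 else z / ax), -y / ax
    elif ay >= az:                     # +Y or -Y face dominates
        face = '+y' if y > 0 else '-y'
        u, v = x / ay, (z / ay if y > 0 else -z / ay)
    else:                              # +Z or -Z face dominates
        face = '+z' if z > 0 else '-z'
        u, v = (x / az if z > 0 else -x / az), -y / az
    # Remap from [-1, 1] to texture coordinates in [0, 1].
    return face, (u + 1) / 2, (v + 1) / 2
```

Looking up each LED's direction with such a function, against the six rendered face images, yields the intensity that LED should display for the current virtual viewpoint.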
Affiliation(s)
- Zoltán Kócsi
- Research School of Biology, Australian National University, Canberra, ACT, Australia
- Trevor Murray
- Research School of Biology, Australian National University, Canberra, ACT, Australia
- Hansjürgen Dahmen
- Department of Cognitive Neuroscience, University of Tübingen, Tübingen, Germany
- Ajay Narendra
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
- Jochen Zeil
- Research School of Biology, Australian National University, Canberra, ACT, Australia
3
Ullrich TW, Kern R, Egelhaaf M. Influence of environmental information in natural scenes and the effects of motion adaptation on a fly motion-sensitive neuron during simulated flight. Biol Open 2014; 4:13-21. PMID: 25505148; PMCID: PMC4295162; DOI: 10.1242/bio.20149449.
Abstract
Gaining information about the spatial layout of natural scenes is a challenging task that flies need to solve, especially when moving at high velocities. A group of motion-sensitive cells in the lobula plate of flies is thought to represent information about self-motion as well as about the environment. Relevant environmental features might be the nearness of structures, which influences retinal velocity during translational self-motion, and the brightness contrast. We recorded the responses of the H1 cell, an individually identifiable lobula plate tangential cell, during stimulation with image sequences simulating translational motion through natural scenery with a variety of differing depth structures. A correlation was found between the average nearness of environmental structures within large parts of the cell's receptive field and its response across a variety of scenes, but no correlation was found between the brightness contrast of the stimuli and the cell's response. As a consequence of motion adaptation resulting from repeated translation through the environment, the time-dependent response modulations induced by the spatial structure of the environment were increased relative to the background activity of the cell. These results support the hypothesis that some lobula plate tangential cells serve not only as sensors of self-motion, but also as part of a neural system that processes information about the spatial layout of natural scenes.
Affiliation(s)
- Thomas W Ullrich
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
- Roland Kern
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1/Zehlendorfer Damm 201, 33619 Bielefeld, Germany
4
Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Front Behav Neurosci 2014; 8:335. PMID: 25309374; PMCID: PMC4173878; DOI: 10.3389/fnbeh.2014.00335.
Abstract
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help the bees acquire these memories at newly discovered foraging locations, where landmarks—salient objects in the vicinity of the goal location—can play an important role in guiding the animal's homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it was shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses to a typical learning flight and to targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
Affiliation(s)
- Marcel Mertes
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Laura Dittmar
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
- Norbert Boeddeker
- Department of Neurobiology, Center of Excellence 'Cognitive Interaction Technology' (CITEC), Bielefeld University, Bielefeld, Germany
5
Kress D, Egelhaaf M. Impact of stride-coupled gaze shifts of walking blowflies on the neuronal representation of visual targets. Front Behav Neurosci 2014; 8:307. PMID: 25309362; PMCID: PMC4164030; DOI: 10.3389/fnbeh.2014.00307.
Abstract
During locomotion, animals rely heavily on visual cues gained from the environment to guide their behavior; examples are basic behaviors like collision avoidance or the approach to a goal. The saccadic gaze strategy of flying flies, which separates translational from rotational phases of locomotion, has been suggested to facilitate the extraction of environmental information, because only image flow evoked by translational self-motion contains relevant distance information about the surrounding world. In contrast to the translational phases of flight, during which gaze direction is kept largely constant, walking flies experience continuous rotational image flow that is coupled to their stride cycle. The consequences of these self-produced image shifts for the extraction of environmental information are still unclear. To assess the impact of stride-coupled image shifts on visual information processing, we performed electrophysiological recordings from the HSE cell, a motion-sensitive wide-field neuron in the blowfly visual system. This cell is thought to play a key role in mediating optomotor behavior, self-motion estimation, and spatial information processing. We used visual stimuli based on the visual input experienced by walking blowflies while approaching a black vertical bar. The response of HSE to these stimuli was dominated by periodic membrane potential fluctuations evoked by stride-coupled image shifts. Nevertheless, during the approach the cell's response contained information about the bar and its background. The response components evoked by the bar were larger than the responses to its background, especially during the last phase of the approach. However, as revealed by targeted modifications of the visual input during walking, the extraction of distance information on the basis of HSE responses is strongly impaired by stride-coupled retinal image shifts. Possible mechanisms that may cope with these stride-coupled responses are discussed.
Affiliation(s)
- Daniel Kress
- Department of Neurobiology, Bielefeld University, Bielefeld, Germany; CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, Germany; CITEC Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
6
Verderber A, McKnight M, Bozkurt A. Early metamorphic insertion technology for insect flight behavior monitoring. J Vis Exp 2014. PMID: 25079130; DOI: 10.3791/50901.
Abstract
Early Metamorphosis Insertion Technology (EMIT) is a novel methodology for integrating microfabricated neuromuscular recording and actuation platforms on insects during their metamorphic development. Here, the implants are fused within the structure and function of the neuromuscular system as a result of metamorphic tissue remaking. The implants emerge with the insect, and the development of tissue around the electronics during pupal development results in a bioelectrically and biomechanically enhanced tissue interface. This more reliable and stable interface would benefit researchers exploring the neural basis of insect locomotion by alleviating the traumatic effects of adult-stage insertions. In this article, we implant our electrodes into the indirect flight muscles of Manduca sexta. Located in the dorsal thorax, these dorsoventral and dorsolongitudinal muscles power flight: they actuate the wings and supply the mechanical power for the up- and downstrokes. The relative contraction of these two muscle groups has been under investigation to explore how the yaw maneuver is neurophysiologically coordinated. To characterize flight dynamics, insects are often tethered with wires and their flight is recorded with digital cameras. We also developed a novel way to tether Manduca sexta on a magnetically levitating frame where the insect is connected to a commercially available wireless neural amplifier. This setup can be used to limit the insect's degrees of freedom to yawing only, while transmitting the related electromyography signals from the dorsoventral and dorsolongitudinal muscle groups.
Affiliation(s)
- Alexander Verderber
- Department of Electrical and Computer Engineering, North Carolina State University
- Michael McKnight
- Department of Electrical and Computer Engineering, North Carolina State University
- Alper Bozkurt
- Department of Electrical and Computer Engineering, North Carolina State University
7
Ullrich TW, Kern R, Egelhaaf M. Texture-defined objects influence responses of blowfly motion-sensitive neurons under natural dynamical conditions. Front Integr Neurosci 2014; 8:34. PMID: 24808836; PMCID: PMC4010782; DOI: 10.3389/fnint.2014.00034.
Abstract
The responses of fly visual interneurons involved in processing motion information depend not only on velocity, but also on other stimulus parameters, such as the contrast and the spatial frequency content of the stimulus pattern. These dependencies have long been known, but it is still an open question how they affect the neurons' performance in extracting information about the structure of the environment under the specific dynamical conditions of natural flight. Free flight of blowflies is characterized by sequences of translational movements lasting just 30-100 ms, interspersed with even shorter and extremely rapid saccade-like rotational shifts in flight and gaze direction. Previous studies analyzed how nearby objects, which lead to relative motion on the retina with respect to a more distant background, influence the responses of a class of fly motion-sensitive visual interneurons, the horizontal system (HS) cells. In the present study, we focused on objects that differed from their background by discontinuities either in brightness contrast or in spatial frequency content. We found strong object-induced effects on the membrane potential even during the short intersaccadic intervals, provided the background contrast was small and the object contrast sufficiently high. The object evoked similar response increments when it contained higher spatial frequencies than the background, but not under the reversed conditions. This asymmetry in response behavior is partly a consequence of the depolarization level induced by the background. Thus, our results suggest that, under the specific dynamical conditions of natural flight, i.e., on a very short timescale, the responses of HS cells represent object information depending on the polarity of the difference between object and background in contrast and spatial frequency content.
Affiliation(s)
- Thomas W Ullrich
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- Roland Kern
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
8
Moore RJD, Taylor GJ, Paulk AC, Pearson T, van Swinderen B, Srinivasan MV. FicTrac: a visual method for tracking spherical motion and generating fictive animal paths. J Neurosci Methods 2014; 225:106-19. PMID: 24491637; DOI: 10.1016/j.jneumeth.2014.01.010.
Abstract
Studying how animals interface with a virtual reality can further our understanding of how attention, learning and memory, sensory processing, and navigation are handled by the brain, at both the neurophysiological and behavioural levels. To this end, we have developed a novel vision-based tracking system, FicTrac (Fictive path Tracking software), for estimating the path an animal makes whilst rotating an air-supported sphere using only input from a standard camera and computer vision techniques. We have found that the accuracy and robustness of FicTrac outperforms a low-cost implementation of a standard optical mouse-based approach for generating fictive paths. FicTrac is simple to implement for a wide variety of experimental configurations and, importantly, is fast to execute, enabling real-time sensory feedback for behaving animals. We have used FicTrac to record the behaviour of tethered honeybees, Apis mellifera, whilst presenting visual stimuli in both open-loop and closed-loop experimental paradigms. We found that FicTrac could accurately register the fictive paths of bees as they walked towards bright green vertical bars presented on an LED arena. Using FicTrac, we have demonstrated closed-loop visual fixation in both the honeybee and the fruit fly, Drosophila melanogaster, establishing the flexibility of this system. FicTrac provides the experimenter with a simple yet adaptable system that can be combined with electrophysiological recording techniques to study the neural mechanisms of behaviour in a variety of organisms, including walking vertebrates.
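The core of any FicTrac-style tracker is turning per-frame ball rotations into a fictive 2D path. The sketch below illustrates that integration step under assumed conventions (a body-frame forward and sideways displacement plus a heading change per video frame); it is not FicTrac's actual code, and the function name and input format are hypothetical.

```python
import math

def integrate_fictive_path(deltas):
    """Integrate per-frame ball rotations into a 2D fictive path.

    Each delta is (d_forward, d_side, d_heading): the animal's forward and
    sideways displacement (in ball-radius units) and its change in heading
    (radians) for one frame, as a FicTrac-like tracker would report them.
    Returns the list of (x, y) positions, starting at the origin.
    """
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for d_fwd, d_side, d_head in deltas:
        heading += d_head
        # Rotate the body-frame step into world coordinates.
        x += d_fwd * math.cos(heading) - d_side * math.sin(heading)
        y += d_fwd * math.sin(heading) + d_side * math.cos(heading)
        path.append((x, y))
    return path
```

Feeding such a path back into the stimulus display is what closes the loop in the visual fixation experiments described above.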
Affiliation(s)
- Richard J D Moore
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
- Gavin J Taylor
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
- Angelique C Paulk
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
- Thomas Pearson
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
- Bruno van Swinderen
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
- Mandyam V Srinivasan
- Queensland Brain Institute, University of Queensland, St Lucia, QLD 4072, Australia
9
Rien D, Kern R, Kurtz R. Octopaminergic modulation of a fly visual motion-sensitive neuron during stimulation with naturalistic optic flow. Front Behav Neurosci 2013; 7:155. PMID: 24194704; PMCID: PMC3810598; DOI: 10.3389/fnbeh.2013.00155.
Abstract
In a variety of species, locomotor activity such as walking or flying has been demonstrated to alter visual information processing. The neuromodulator octopamine was shown to change the response characteristics of optic-flow-processing neurons in the fly's visual system in a similar way as locomotor activity does. This modulation resulted in enhanced neuronal responses, in particular during sustained stimulation with high temporal frequencies, and in shorter latencies of responses to abrupt onsets of pattern motion. These state-dependent changes have been interpreted as adjusting neuronal tuning to the range of high velocities encountered during locomotion. Here we assess the significance of these changes for the processing of optic flow as experienced during flight. Naturalistic image sequences were reconstructed based on measurements of the head position and gaze direction of Calliphora vicina flying in an arena. We recorded the responses of the V1 neuron during presentation of these image sequences on a panoramic stimulus device ("FliMax"). Consistent with previous accounts, we found that spontaneous as well as stimulus-induced spike rates were increased by an octopamine agonist and decreased by an antagonist. Moreover, octopaminergic activation produced a small but consistent decrease in response latency, which might support fast responses to optic flow cues and limit instabilities during closed-loop optomotor regulation. Apart from these effects, however, the dynamic response properties in the different pharmacologically induced states were surprisingly similar, indicating that the processing of naturalistic optic flow is not fundamentally altered by octopaminergic modulation.
Affiliation(s)
- Diana Rien
- Department of Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
10
Eckmeier D, Kern R, Egelhaaf M, Bischof HJ. Encoding of naturalistic optic flow by motion sensitive neurons of nucleus rotundus in the zebra finch (Taeniopygia guttata). Front Integr Neurosci 2013; 7:68. PMID: 24065895; PMCID: PMC3778379; DOI: 10.3389/fnint.2013.00068.
Abstract
The retinal image changes that occur during locomotion, the optic flow, carry information about self-motion and the three-dimensional structure of the environment. Fast-moving animals with little binocular vision, in particular, depend on these depth cues for maneuvering, and they actively control their gaze to facilitate the perception of depth based on cues in the optic flow. In the visual system of birds, nucleus rotundus neurons were originally found to respond to object motion but not to background motion. However, when background and object were both moving, responses increased the more the direction and velocity of object and background motion on the retina differed. These properties may play a role in representing depth cues in the optic flow. We therefore investigated how neurons in nucleus rotundus respond to optic flow that contains depth cues. We presented simplified and naturalistic optic flow on a panoramic LED display while recording from single neurons in nucleus rotundus of anaesthetized zebra finches. Unlike most studies on motion vision in birds, our stimuli included depth information. We found extensive responses of motion-selective neurons in nucleus rotundus to optic flow stimuli. Simplified stimuli revealed preferences for optic flow reflecting translational or rotational self-motion. Naturalistic optic flow stimuli elicited complex response modulations, but the presence of objects was signaled by only a few neurons. The neurons that did respond to objects in the optic flow, however, showed interesting properties.
Affiliation(s)
- Dennis Eckmeier
- Neuroethology Group, Department of Behavioural Biology, Bielefeld University, Bielefeld, Germany
11
Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Front Behav Neurosci 2013; 6:92. PMID: 23335890; PMCID: PMC3542507; DOI: 10.3389/fnbeh.2012.00092.
Abstract
Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the preceding translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture dependence of elementary motion detection in flies, which is also reflected in the responses of the large-field neurons used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Even subtle changes in texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, the free-flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions, or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independently of the textural properties of the environment.
12
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Front Neural Circuits 2012; 6:108. PMID: 23269913; PMCID: PMC3526811; DOI: 10.3389/fncir.2012.00108.
Abstract
Insects such as flies and bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. In solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish this extraordinary performance, flies and bees have been shown, through their characteristic behavioral actions, to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment, and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually guided flight control might help to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Centre of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
13
Takalo J, Piironen A, Honkanen A, Lempeä M, Aikio M, Tuukkanen T, Vähäsöyrinki M. A fast and flexible panoramic virtual reality system for behavioural and electrophysiological experiments. Sci Rep 2012; 2:324. PMID: 22442752; PMCID: PMC3310229; DOI: 10.1038/srep00324.
Abstract
Ideally, neuronal functions would be studied by performing experiments with unconstrained animals whilst they behave in their natural environment. Although this is not currently feasible for most animal models, one can mimic the natural environment in the laboratory by using a virtual reality (VR) environment. Here we present a novel VR system based upon a spherical projection of computer-generated images using a modified commercial data projector with an add-on fish-eye lens. This system provides equidistant visual stimulation with extensive coverage of the visual field, high spatio-temporal resolution, and flexible stimulus generation using a standard computer. It also includes a track-ball system for closed-loop behavioural experiments with walking animals. We present a detailed description of the system and characterize it thoroughly. Finally, we demonstrate the VR system's performance whilst operating in closed-loop conditions by showing the movement trajectories of cockroaches during exploratory behaviour in a VR forest.
|
14
|
Geurten BRH, Kern R, Egelhaaf M. Species-Specific Flight Styles of Flies are Reflected in the Response Dynamics of a Homolog Motion-Sensitive Neuron. Front Integr Neurosci 2012; 6:11. [PMID: 22485089] [DOI: 10.3389/fnint.2012.00011]
Abstract
Hoverflies and blowflies have distinctly different flight styles. Yet, both species have been shown to structure their flight behavior in a way that facilitates extraction of 3D information from the image flow on the retina (optic flow). Neuronal candidates to analyze the optic flow are the tangential cells in the third optical ganglion - the lobula complex. These neurons are directionally selective and integrate the optic flow over large parts of the visual field. Homolog tangential cells in hoverflies and blowflies have a similar morphology. Because blowflies and hoverflies have similar neuronal layout but distinctly different flight behaviors, they are an ideal substrate to pinpoint potential neuronal adaptations to the different flight styles. In this article we describe the relationship between locomotion behavior and motion vision on three different levels: (1) We compare the different flight styles based on the categorization of flight behavior into prototypical movements. (2) We measure the species-specific dynamics of the optic flow under naturalistic flight conditions. We found the translational optic flow of both species to be very different. (3) We describe possible adaptations of a homolog motion-sensitive neuron. We stimulate this cell in blowflies (Calliphora) and hoverflies (Eristalis) with naturalistic optic flow generated by both species during free flight. The characterized hoverfly tangential cell responds faster to transient changes in the optic flow than its blowfly homolog. It is discussed whether and how the different dynamical response properties aid optic flow analysis.
Affiliation(s)
- Bart R. H. Geurten
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Department of Cellular Neurobiology, Johann-Friedrich-Blumenbach Institute for Zoology and Anthropology, Georg-August-University Göttingen, Göttingen, Lower Saxony, Germany
- Roland Kern
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, North Rhine-Westphalia, Germany
- Centre of Excellence ‘Cognitive Interaction Technology’, Bielefeld, North Rhine-Westphalia, Germany
|
15
|
Liang P, Heitwerth J, Kern R, Kurtz R, Egelhaaf M. Object representation and distance encoding in three-dimensional environments by a neural circuit in the visual system of the blowfly. J Neurophysiol 2012; 107:3446-57. [PMID: 22423002] [DOI: 10.1152/jn.00530.2011]
Abstract
Three motion-sensitive key elements of a neural circuit, presumably involved in processing object and distance information, were analyzed with optic flow sequences as experienced by blowflies in a three-dimensional environment. This optic flow is largely shaped by the blowfly's saccadic flight and gaze strategy, which separates translational flight segments from fast saccadic rotations. By modifying this naturalistic optic flow, all three analyzed neurons could be shown to respond during the intersaccadic intervals not only to nearby objects but also to changes in the distance to background structures. In the presence of strong background motion, the three types of neuron differ in their sensitivity for object motion. Object-induced response increments are largest in FD1, a neuron long known to respond better to moving objects than to spatially extended motion patterns, but weakest in VCH, a neuron that integrates wide-field motion from both eyes and, by inhibiting the FD1 cell, is responsible for its object preference. Small but significant object-induced response increments are present in HS cells, which serve both as a major input neuron of VCH and as output neurons of the visual system. In both HS and FD1, intersaccadic background responses decrease with increasing distance to the animal, although much more prominently in FD1. This strong dependence of FD1 on background distance is concluded to be a consequence of VCH, whose activity, and thus its inhibitory strength, increases dramatically with increasing distance.
Affiliation(s)
- Pei Liang
- Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
|
16
|
Hung YS, van Kleef JP, Ibbotson MR. Visual response properties of neck motor neurons in the honeybee. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2011; 197:1173-87. [PMID: 21909972] [DOI: 10.1007/s00359-011-0679-9]
Abstract
Recent behavioural studies have demonstrated that honeybees use visual feedback to stabilize their gaze. However, little is known about the neural circuits that perform the visual motor computations that underlie this ability. We investigated the motor neurons that innervate two neck muscles (m44 and m51), which produce stabilizing yaw movements of the head. Intracellular recordings were made from five (out of eight) identified neuron types in the first cervical nerve (IK1) of honeybees. Two motor neurons that innervate muscle 51 were found to be direction-selective, with a preference for horizontal image motion from the contralateral to the ipsilateral side of the head. Three neurons that innervate muscle 44 were tuned to detect motion in the opposite direction (from ipsilateral to contralateral). These cells were binocularly sensitive and responded optimally to frontal stimulation. By combining the directional tuning of the motor neurons in an opponent manner, the neck motor system would be able to mediate reflexive optomotor head turns in the direction of image motion, thus stabilizing the retinal image. When the dorsal ocelli were covered, the spontaneous activity of neck motor neurons increased and visual responses were modified, suggesting an ocellar input in addition to that from the compound eyes.
Affiliation(s)
- Y-S Hung
- ARC Centre of Excellence in Vision Science, Research School of Biology and Division of Biomedical Science and Biochemistry, R.N. Robertson Building, Australian National University, Canberra, ACT, 2601, Australia
|
17
|
Rien D, Kern R, Kurtz R. Synaptic transmission of graded membrane potential changes and spikes between identified visual interneurons. Eur J Neurosci 2011; 34:705-16. [PMID: 21819463] [DOI: 10.1111/j.1460-9568.2011.07801.x]
Abstract
Several physiological mechanisms allow sensory information to be propagated in neuronal networks. According to the conventional view of signal processing, graded changes of membrane potential at the dendrite are converted into a sequence of spikes. However, in many sensory receptors and several types of mostly invertebrate neurons, graded potential changes have a direct impact on the cells' output signals. The visual system of the blowfly Calliphora vicina is a good model system to study synaptic transmission in vivo during sensory stimulation. We recorded extracellularly from an identified motion-sensitive neuron while simultaneously measuring and controlling the membrane potential of individual elements of its presynaptic input ensemble. The membrane potential in the terminals of the presynaptic neuron is composed of two components, graded membrane potential changes and action potentials. To dissociate the roles of action potentials and graded potential changes in synaptic transmission we used voltage-clamp-controlled current-clamp techniques to suppress the graded membrane potential changes without affecting action potentials. Our results indicate that both the graded potential and the action potentials of the presynaptic neuron have an impact on the spiking characteristics of the postsynaptic neuron. Although a tight temporal coupling between pre- and postsynaptic spikes exists, the timing between these spikes is also affected by graded potential changes. We propose that the control of synaptic transfer of a dynamically complex signal by graded changes in membrane potential and spikes is useful to enable a temporally precise coupling of spikes in response to sudden transitions in stimulus intensity.
Affiliation(s)
- Diana Rien
- Department of Neurobiology, Faculty of Biology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany.
|
19
|
Hennig P, Kern R, Egelhaaf M. Binocular integration of visual information: a model study on naturalistic optic flow processing. Front Neural Circuits 2011; 5:4. [PMID: 21519385] [DOI: 10.3389/fncir.2011.00004]
Abstract
The computation of visual information from both visual hemispheres is often of functional relevance when solving orientation and navigation tasks. The vCH-cell is a motion-sensitive wide-field neuron in the visual system of the blowfly Calliphora, a model system in the field of optic flow processing. The vCH-cell receives input from various other identified wide-field cells, the receptive fields of which are located in both the ipsilateral and the contralateral visual field. The relevance of this connectivity to the processing of naturalistic image sequences, with their peculiar dynamical characteristics, is still unresolved. To disentangle the contributions of the different input components to the cell's overall response, we used electrophysiologically determined responses of the vCH-cell and its various input elements to tune a model of the vCH-circuit. Their impact on the vCH-cell response could be distinguished by stimulating not only extended parts of the visual field of the fly, but also selected regions in the ipsi- and contralateral visual field with behaviorally generated optic flow. We show that a computational model of the vCH-circuit is able to account for the neuronal activities of the counterparts in the blowfly's visual system. Furthermore, we offer an insight into the dendritic integration of binocular visual input.
Affiliation(s)
- Patrick Hennig
- Department of Neurobiology and Center of Excellence 'Cognitive Interaction Technology', Bielefeld University, Bielefeld, Germany
|
20
|
Liang P, Kern R, Kurtz R, Egelhaaf M. Impact of visual motion adaptation on neural responses to objects and its dependence on the temporal characteristics of optic flow. J Neurophysiol 2011; 105:1825-34. [PMID: 21307322] [DOI: 10.1152/jn.00359.2010]
Abstract
It is still unclear how sensory systems efficiently encode signals with statistics as experienced by animals in the real world and what role adaptation plays during normal behavior. Therefore, we studied the performance of visual motion-sensitive neurons of blowflies, the horizontal system neurons, with optic flow that was reconstructed from the head trajectories of semi-free-flying flies. To test how motion adaptation is affected by optic flow dynamics, we manipulated the seminatural optic flow by targeted modifications of the flight trajectories and assessed to what extent neuronal responses to an object located close to the flight trajectory depend on adaptation dynamics. For all types of adapting optic flow, object-induced response increments were stronger in the adapted than in the nonadapted state. Adaptation with optic flow characterized by the typical alternation between translational and rotational segments produced this effect, but so did adaptation with optic flow that lacked these distinguishing features, and even pure rotation at a constant angular velocity. The enhancement of object-induced response increments had a direction-selective component because preferred-direction rotation and natural optic flow were more efficient adaptors than null-direction rotation. These results indicate that natural dynamics of optic flow is not a basic requirement to adapt neurons in a specific, presumably functionally beneficial way. Our findings are discussed in the light of adaptation mechanisms proposed on the basis of experiments previously done with conventional experimenter-defined stimuli.
Affiliation(s)
- Pei Liang
- Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
|
21
|
Identifying prototypical components in behaviour using clustering algorithms. PLoS One 2010; 5:e9361. [PMID: 20179763] [DOI: 10.1371/journal.pone.0009361]
Abstract
Quantitative analysis of animal behaviour is a requirement to understand the task solving strategies of animals and the underlying control mechanisms. The identification of repeatedly occurring behavioural components is thereby a key element of a structured quantitative description. However, the complexity of most behaviours makes the identification of such behavioural components a challenging problem. We propose an automatic and objective approach for determining and evaluating prototypical behavioural components. Behavioural prototypes are identified using clustering algorithms and finally evaluated with respect to their ability to represent the whole behavioural data set. The prototypes allow for a meaningful segmentation of behavioural sequences. We applied our clustering approach to identify prototypical movements of the head of blowflies during cruising flight. The results confirm the previously established saccadic gaze strategy by the set of prototypes being divided into either predominantly translational or rotational movements, respectively. The prototypes reveal additional details about the saccadic and intersaccadic flight sections that could not be unravelled so far. Successful application of the proposed approach to behavioural data shows its ability to automatically identify prototypical behavioural components within a large and noisy database and to evaluate these with respect to their quality and stability. Hence, this approach might be applied to a broad range of behavioural and neural data obtained from different animals and in different contexts.
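The prototype-extraction idea described in this abstract can be sketched with plain k-means over feature vectors of movement segments. This is an assumption-laden illustration only: the function name, the choice of features, and the distance metric are hypothetical, not the authors' implementation.

```python
import random
from math import dist

def find_prototypes(segments, k, iters=100, seed=0):
    """Toy k-means clustering of movement segments.

    `segments` is a list of feature tuples, e.g. (rotational velocity,
    translational velocity) averaged over one time window (an assumed
    feature choice); the returned centers play the role of
    prototypical movements.
    """
    rng = random.Random(seed)
    centers = rng.sample(segments, k)
    labels = [0] * len(segments)
    for _ in range(iters):
        # Assign every segment to its nearest prototype.
        labels = [min(range(k), key=lambda j: dist(s, centers[j]))
                  for s in segments]
        # Move each prototype to the mean of its members.
        new = []
        for j in range(k):
            members = [s for s, lab in zip(segments, labels) if lab == j]
            new.append(tuple(sum(c) / len(members) for c in zip(*members))
                       if members else centers[j])
        if new == centers:  # converged
            break
        centers = new
    return centers, labels
```

With well-separated predominantly rotational versus predominantly translational segments, the two recovered centers would correspond to saccadic and intersaccadic prototypes, in the spirit of the segmentation reported above.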
|
22
|
Wertz A, Haag J, Borst A. Local and global motion preferences in descending neurons of the fly. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2009; 195:1107-20. [PMID: 19830435] [DOI: 10.1007/s00359-009-0481-0]
Abstract
For a moving animal, optic flow is an important source of information about its ego-motion. In flies, the processing of optic flow is performed by motion sensitive tangential cells in the lobula plate. Amongst them, cells of the vertical system (VS cells) have receptive fields with similarities to optic flows generated during rotations around different body axes. Their output signals are further processed by pre-motor descending neurons. Here, we investigate the local motion preferences of two descending neurons called descending neurons of the ocellar and vertical system (DNOVS1 and DNOVS2). Using an LED arena subtending 240° × 95° of visual space, we mapped the receptive fields of DNOVS1 and DNOVS2 as well as those of their presynaptic elements, i.e. VS cells 1–10 and V2. The receptive field of DNOVS1 can be predicted in detail from the receptive fields of those VS cells that are most strongly coupled to the cell. The receptive field of DNOVS2 is a combination of V2 and VS cells receptive fields. Predicting the global motion preferences from the receptive field revealed a linear spatial integration in DNOVS1 and a superlinear spatial integration in DNOVS2. In addition, the superlinear integration of V2 output is necessary for DNOVS2 to differentiate between a roll rotation and a lift translation of the fly.
Affiliation(s)
- Adrian Wertz
- Department of Systems and Computational Neurobiology, Max-Planck-Institute of Neurobiology, 82152 Martinsried, Germany.
|
23
|
Motion adaptation enhances object-induced neural activity in three-dimensional virtual environment. J Neurosci 2008; 28:11328-32. [PMID: 18971474] [DOI: 10.1523/jneurosci.0203-08.2008]
Abstract
Many response characteristics of neurons sensitive to visual motion depend on stimulus history and change during prolonged stimulation. Although the changes are usually regarded as adaptive, their functional significance is still not fully understood. With experimenter-defined stimuli, previous research on motion adaptation has mainly focused on enhancing the detection of changes in the stimulus domain, on preventing output saturation and on energy efficient coding. Here we analyze in the blowfly visual system the functional significance of motion adaptation under the complex stimulus conditions encountered in the three-dimensional world. Identified motion-sensitive neurons are confronted with seminatural optic flow, as seen by semi-free-flying animals, as well as with targeted modifications of it. Motion adaptation is shown to enhance object-induced neural responses in a three-dimensional environment, although the overall neuronal response amplitude decreases during prolonged motion stimulation.
|
24
|
Straw AD. Vision egg: an open-source library for realtime visual stimulus generation. Front Neuroinform 2008; 2:4. [PMID: 19050754] [DOI: 10.3389/neuro.11.004.2008]
Abstract
Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. A programming library called the Vision Egg aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library making use of OpenGL and written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several techniques for interfacing to input devices such as mice, movement tracking systems, and digital triggers are discussed. Together, these make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. This software is available for free download at visionegg.org.
Affiliation(s)
- Andrew D Straw
- Bioengineering, California Institute of Technology, Pasadena, CA, USA
|
25
|
Taylor GK, Bacic M, Bomphrey RJ, Carruthers AC, Gillies J, Walker SM, Thomas ALR. New experimental approaches to the biology of flight control systems. J Exp Biol 2008; 211:258-66. [PMID: 18165253] [DOI: 10.1242/jeb.012625]
Abstract
Here we consider how new experimental approaches in biomechanics can be used to attain a systems-level understanding of the dynamics of animal flight control. Our aim in this paper is not to provide detailed results and analysis, but rather to tackle several conceptual and methodological issues that have stood in the way of experimentalists in achieving this goal, and to offer tools for overcoming these. We begin by discussing the interplay between analytical and empirical methods, emphasizing that the structure of the models we use to analyse flight control dictates the empirical measurements we must make in order to parameterize them. We then provide a conceptual overview of tethered-flight paradigms, comparing classical 'open-loop' and 'closed-loop' setups, and describe a flight simulator that we have recently developed for making flight dynamics measurements on tethered insects. Next, we provide a conceptual overview of free-flight paradigms, focusing on the need to use system identification techniques in order to analyse the data they provide, and describe two new techniques that we have developed for making flight dynamics measurements on freely flying birds. First, we describe a technique for obtaining inertial measurements of the orientation, angular velocity and acceleration of a steppe eagle Aquila nipalensis in wide-ranging free flight, together with synchronized measurements of wing and tail kinematics using onboard instrumentation and video cameras. Second, we describe a photogrammetric method to measure the 3D wing kinematics of the eagle during take-off and landing. In each case, we provide demonstration data to illustrate the kinds of information available from each method. We conclude by discussing the prospects for systems-level analyses of flight control using these techniques and others like them.
|
26
|
Lindemann JP, Weiss H, Möller R, Egelhaaf M. Saccadic flight strategy facilitates collision avoidance: closed-loop performance of a cyberfly. Biol Cybern 2008; 98:213-227. [PMID: 18180948] [DOI: 10.1007/s00422-007-0205-x]
Abstract
Behavioural and electrophysiological experiments suggest that blowflies employ an active saccadic strategy of flight and gaze control to separate the rotational from the translational optic flow components. This allows motion-sensitive neurons to encode, during the translatory intersaccadic phases of locomotion, information about the spatial layout of the environment. So far, it has not been clear whether and how a motor controller could decode the responses of these neurons to prevent a blowfly from colliding with obstacles. Here we propose a simple model of the blowfly visual course control system, named cyberfly, and investigate its performance and limitations. The sensory input module of the cyberfly emulates a pair of output neurons subserving the two eyes of the blowfly visual motion pathway. We analyse two sensory-motor interfaces (SMI). An SMI coupling the differential signal of the sensory neurons proportionally to the yaw rotation fails to avoid obstacles. A more plausible SMI is based on a saccadic controller. Even with sideward drift after saccades as is characteristic of real blowflies, the cyberfly is able to successfully avoid collisions with obstacles. The relative distance information contained in the optic flow during translatory movements between saccades is provided to the SMI by the responses of the visual output neurons. An obvious limitation of this simple mechanism is its strong dependence on the textural properties of the environment.
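A saccadic sensory-motor interface of the kind described here can be sketched as a threshold rule on the difference of the two wide-field motion responses. The threshold value, the fixed saccade amplitude, and all names below are assumptions for illustration, not the published cyberfly controller.

```python
def saccade_command(left_response, right_response,
                    threshold=0.6, saccade_deg=30.0):
    """Return a discrete yaw command in degrees (0.0 = keep translating).

    During intersaccadic translation the wide-field response is larger
    on the side with nearer objects (stronger translational optic
    flow), so the agent turns away from that side once the left/right
    difference crosses a threshold.
    """
    delta = left_response - right_response
    if abs(delta) < threshold:
        return 0.0  # straight intersaccadic flight
    # Saccade away from the side generating the stronger flow signal.
    return -saccade_deg if delta > 0 else saccade_deg
```

Coupling the same difference proportionally to yaw rate would instead mix rotational into the intersaccadic flow; the discrete rule keeps the intersaccadic segments purely translational, which is the property the study identifies as essential for distance estimation.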
Affiliation(s)
- Jens Peter Lindemann
- Neurobiologie, Fakultät für Biologie, Universität Bielefeld, Postfach 10 01 31, 33501 Bielefeld, Germany.
|
27
|
Reiser MB, Dickinson MH. A modular display system for insect behavioral neuroscience. J Neurosci Methods 2007; 167:127-39. [PMID: 17854905] [DOI: 10.1016/j.jneumeth.2007.07.019]
Abstract
Flying insects exhibit stunning behavioral repertoires that are largely mediated by the visual control of flight. For this reason, presenting a controlled visual environment to tethered insects has been and continues to be a powerful tool for studying the sensory control of complex behaviors. To create an easily controlled, scalable, and customizable visual stimulus, we have designed a modular system, based on panels composed of an 8 x 8 array of individual LEDs, that may be connected together to 'tile' an experimental environment with controllable displays. The panels have been designed to be extremely bright, with the added flexibility of individual-pixel brightness control, allowing experimentation over a broad range of behaviorally relevant conditions. Patterns to be displayed may be designed using custom software, downloaded to a controller board, and displayed on the individually addressed panels via a rapid communication interface. The panels are controlled by a microprocessor-based display controller which, for most experiments, will not require a computer in the loop, greatly reducing the experimental infrastructure. This technology allows an experimenter to build and program a visual arena with a customized geometry in a matter of hours. To demonstrate the utility of this system, we present results from experiments with tethered Drosophila melanogaster: (1) in a cylindrical arena composed of 44 panels, used to test the contrast dependence of object orientation behavior, and (2) above a 30-panel floor display, used to examine the effects of ground motion on orientation during flight.
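The panel-tiling idea can be sketched as a small pixel-addressing helper. The row-major numbering scheme and the names below are assumptions for illustration, not the addressing protocol of the published controller.

```python
PANEL = 8  # each module is an 8 x 8 LED array

def locate_pixel(x, y, panels_per_row):
    """Map a global arena pixel (x, y) to (panel_id, local_x, local_y).

    Panels are assumed to be numbered row-major across the arena and
    pixels row-major within each 8 x 8 module.
    """
    panel_col, local_x = divmod(x, PANEL)
    panel_row, local_y = divmod(y, PANEL)
    return panel_row * panels_per_row + panel_col, local_x, local_y
```

For example, in a hypothetical arena arranged as rows of 11 panels, global pixel (9, 3) would fall on panel 1 at local coordinates (1, 3) under this numbering.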
Affiliation(s)
- Michael B Reiser
- Department of Computation and Neural Systems, Caltech, MC 138-78, Pasadena, CA 91125, USA.
|
28
|
Trischler C, Boeddeker N, Egelhaaf M. Characterisation of a blowfly male-specific neuron using behaviourally generated visual stimuli. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2007; 193:559-72. [PMID: 17333206] [DOI: 10.1007/s00359-007-0212-3]
Abstract
The pursuit system controlling chasing behaviour in male blowflies has to cope with extremely fast and dynamically changing visual input. An identified male-specific visual neuron called Male Lobula Giant 1 (MLG1) is presumably one major element of this pursuit system. Previous behavioural and modelling analyses have indicated that angular target size, retinal target position and target velocity are relevant input variables of the pursuit system. To investigate whether MLG1 specifically represents any of these visual parameters we obtained in vivo intracellular recordings while replaying optical stimuli that simulate the visual signals received by a male fly during chasing manoeuvres. On the basis of these naturalistic stimuli we find that MLG1 shows distinct direction sensitivity and is depolarised if the target motion contains an upward component. The responses of MLG1 are jointly determined by the retinal position, the speed and direction, and the duration of target motion. Coherence analysis reveals that although retinal target size and position are in some way inherent in the responses of MLG1, we find no confirmation of the hypothesis that MLG1 encodes any of these parameters exclusively.
Affiliation(s)
- Christine Trischler
- Department of Neurobiology, Bielefeld University, Post Box 100131, 33501, Bielefeld, Germany.
|
29
|
Karmeier K, van Hateren JH, Kern R, Egelhaaf M. Encoding of Naturalistic Optic Flow by a Population of Blowfly Motion-Sensitive Neurons. J Neurophysiol 2006; 96:1602-14. [PMID: 16687623] [DOI: 10.1152/jn.00023.2006]
Abstract
In sensory systems, information is encoded by the activity of populations of neurons. To analyze the coding properties of neuronal populations, sensory stimuli have usually been used that were much simpler than those encountered in real life. It has been possible only recently to stimulate visual interneurons of the blowfly with naturalistic visual stimuli reconstructed from eye movements measured during free flight. Therefore, we now investigate with naturalistic optic flow the coding properties of a small neuronal population of identified visual interneurons in the blowfly, the so-called VS and HS neurons. These neurons are motion sensitive and directionally selective and are assumed to extract information about the animal's self-motion from optic flow. We could show that neuronal responses of VS and HS neurons are mainly shaped by the characteristic dynamical properties of the fly's saccadic flight and gaze strategy. Individual neurons encode information about both the rotational and the translational components of the animal's self-motion. Thus the information carried by individual neurons is ambiguous. The ambiguities can be reduced by considering neuronal population activity. The joint responses of different subpopulations of VS and HS neurons can provide unambiguous information about the three rotational and the three translational components of the animal's self-motion and also, indirectly, about the three-dimensional layout of the environment.
Affiliation(s)
- K Karmeier
- Department of Neurobiology, Faculty for Biology, Bielefeld University, Bielefeld, Germany
|
30
|
Kern R, van Hateren JH, Egelhaaf M. Representation of behaviourally relevant information by blowfly motion-sensitive visual interneurons requires precise compensatory head movements. J Exp Biol 2006; 209:1251-60. [PMID: 16547297] [DOI: 10.1242/jeb.02127]
Abstract
Flying blowflies shift their gaze by saccadic turns of body and head, keeping their gaze basically fixed between saccades. For the head, this results in almost pure translational optic flow between saccades, enabling visual interneurons in the fly motion pathway to extract information about translation of the animal and thereby about the spatial layout of the environment. There are noticeable differences between head and body movements during flight. Head saccades are faster and shorter than body saccades, and the head orientation is more stable between saccades than the body orientation. Here, we analyse the functional importance of these differences by probing visual interneurons of the blowfly motion pathway with optic flow based on either head movements or body movements, as recorded accurately with a magnetic search coil technique. We find that the precise head-body coordination is essential for the visual system to separate the translational from the rotational optic flow. If the head were tightly coupled to the body, the resulting optic flow would not contain the behaviourally important information on translation. Since it is difficult to resolve head orientation in many experimental paradigms, even when employing state-of-the-art digital video techniques, we introduce a 'headifying algorithm', which transforms the time-dependent body orientation in free flight into an estimate of head orientation. We show that application of this algorithm leads to an estimated head orientation between saccades that is sufficiently stable to enable recovering information on translation. The algorithm may therefore be of practical use when head orientation is needed but cannot be measured.
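The principle behind such an algorithm can be illustrated with a toy version. The saccade-detection threshold and the simple hold-between-saccades rule below are assumptions for illustration only and do not reproduce the published headifying algorithm:

```python
import numpy as np

def headify(body_yaw, dt, rate_threshold=200.0):
    """Toy head-orientation estimate from body yaw (degrees).

    Samples where the body yaw rate exceeds rate_threshold (deg/s) are
    treated as saccadic; the estimated head follows the body during
    saccades and holds its orientation constant between them.
    """
    rate = np.gradient(body_yaw, dt)
    head = np.empty_like(body_yaw)
    head[0] = body_yaw[0]
    for i in range(1, len(body_yaw)):
        if abs(rate[i]) > rate_threshold:           # saccade: follow the body
            head[i] = head[i - 1] + body_yaw[i] - body_yaw[i - 1]
        else:                                       # intersaccadic: hold gaze
            head[i] = head[i - 1]
    return head

# slow body drift (20 deg/s) plus one 90-degree saccade lasting 20 ms
dt = 0.001
t = np.arange(0.0, 1.0, dt)
body = 20.0 * t + 90.0 * np.clip((t - 0.5) / 0.02, 0.0, 1.0)
head = headify(body, dt)
```

The estimated head trace stays flat between saccades (stable intersaccadic gaze) while the saccadic turn itself is preserved.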
Affiliation(s)
- R Kern
- Department of Neurobiology, Faculty for Biology, Bielefeld University, Bielefeld 33501, Germany.
31
Berry R, Stange G, Olberg R, van Kleef J. The mapping of visual space by identified large second-order neurons in the dragonfly median ocellus. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2006; 192:1105-23. [PMID: 16761130 DOI: 10.1007/s00359-006-0142-5] [Citation(s) in RCA: 29] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2006] [Revised: 05/09/2006] [Accepted: 05/14/2006] [Indexed: 11/24/2022]
Abstract
In adult dragonflies, the compound eyes are augmented by three simple eyes known as the dorsal ocelli. The outputs of ocellar photoreceptors converge on relatively few second-order neurons with large axonal diameters (L-neurons). We determine L-neuron morphology by iontophoretic dye injection combined with three-dimensional reconstructions. Using intracellular recording and white noise analysis, we also determine the physiological receptive fields of the L-neurons, in order to identify the extent to which they preserve spatial information. We find a total of 11 median ocellar L-neurons, consisting of five symmetrical pairs and one unpaired neuron. L-neurons are distinguishable by the extent and location of their terminations within the ocellar plexus and brain. In the horizontal dimension, L-neurons project to different regions of the ocellar plexus, in close correlation with their receptive fields. In the vertical dimension, dendritic arborizations overlap widely, paralleled by receptive fields that are narrow and do not differ between different neurons. These results provide the first evidence for the preservation of spatial information by the second-order neurons of any dorsal ocellus. The system essentially forms a one-dimensional image of the equator over a wide azimuthal area, possibly forming an internal representation of the horizon. Potential behavioural roles for the system are discussed.
Affiliation(s)
- Richard Berry
- Centre for Visual Sciences, Research School of Biological Sciences, Australian National University, PO Box 475, Canberra, ACT 2601, Australia.
32
Zanker JM, Zeil J. Movement-induced motion signal distributions in outdoor scenes. NETWORK (BRISTOL, ENGLAND) 2005; 16:357-76. [PMID: 16611590 DOI: 10.1080/09548980500497758] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/08/2023]
Abstract
The movement of an observer generates a characteristic field of velocity vectors on the retina (Gibson 1950). Because such optic flow-fields are useful for navigation, many theoretical, psychophysical and physiological studies have addressed the question of how ego-motion parameters such as the direction of heading can be estimated from optic flow. Little is known, however, about the structure of optic flow under natural conditions. To address this issue, we recorded sequences of panoramic images along accurately defined paths in a variety of outdoor locations and used these sequences as input to a two-dimensional array of correlation-based motion detectors (2DMD). We find that (a) motion signal distributions are sparse and noisy with respect to local motion directions; (b) motion signal distributions contain patches (motion streaks) which are systematically oriented along the principal flow-field directions; (c) motion signal distributions show a distinct dorso-ventral topography, reflecting the distance anisotropy of terrestrial environments; (d) the spatiotemporal tuning of the local motion detector we used has little influence on the structure of motion signal distributions, at least for the range of conditions we tested; and (e) environmental motion is locally noisy throughout the visual field, with little spatial or temporal correlation; it can therefore be removed by temporal averaging and is largely overridden by image motion caused by observer movement. Our results suggest that spatial or temporal integration is important for retrieving reliable information on the local direction and size of motion vectors, because the structure of optic flow is clearly detectable in the temporal average of motion signal distributions. Ego-motion parameters can be reliably retrieved from such averaged distributions under a range of environmental conditions. These observations raise a number of questions about the role of specific environmental and computational constraints in the processing of natural optic flow.
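As an illustration of the kind of detector array used here, a minimal one-dimensional correlation-based (Reichardt-type) motion detector can be sketched as follows. The filter time constant, detector spacing and test grating are illustrative assumptions, not the 2DMD parameters of the study:

```python
import numpy as np

def reichardt_correlate(stimulus, tau=0.2, dt=0.1):
    """1-D array of correlation-based (Reichardt-type) motion detectors.

    stimulus: array of shape (time, space) holding luminance values.
    Each detector low-pass filters one receptor's signal and multiplies it
    with the neighbouring receptor's direct signal; subtracting the
    mirror-symmetric arm yields a direction-selective output.
    """
    alpha = dt / (tau + dt)                 # first-order low-pass coefficient
    lp = np.zeros_like(stimulus)
    for k in range(1, stimulus.shape[0]):   # temporal low-pass per receptor
        lp[k] = lp[k - 1] + alpha * (stimulus[k] - lp[k - 1])
    right = lp[:, :-1] * stimulus[:, 1:]    # delayed left arm x direct right
    left = stimulus[:, :-1] * lp[:, 1:]     # direct left arm x delayed right
    return right - left                     # > 0 on average: rightward motion

# a rightward-drifting sine grating yields a positive mean output
x = np.arange(40)
t = np.arange(200)[:, None] * 0.1
rightward = np.sin(2 * np.pi * (x / 20.0 - t))
output = reichardt_correlate(rightward)
```

Mirroring the grating's drift direction flips the sign of the time-averaged output, which is the direction selectivity the abstract's motion signal distributions are built from.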
Affiliation(s)
- J M Zanker
- Department of Psychology, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK.
33
Karmeier K, Krapp HG, Egelhaaf M. Population Coding of Self-Motion: Applying Bayesian Analysis to a Population of Visual Interneurons in the Fly. J Neurophysiol 2005; 94:2182-94. [PMID: 15901759 DOI: 10.1152/jn.00278.2005] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Coding of sensory information often involves the activity of neuronal populations. We demonstrate how the accuracy of a population code depends on integration time, the size of the population, and noise correlation between the participating neurons. The population we study consists of 10 identified visual interneurons in the blowfly Calliphora vicina involved in optic flow processing. These neurons are assumed to encode the animal's head or body rotations around horizontal axes by means of graded potential changes. From electrophysiological experiments we obtain parameters for modeling the neurons' responses. By applying a Bayesian analysis to the modeled population response, we draw three major conclusions. First, integration of neuronal activities over a period of only 5 ms after response onset is sufficient to decode the rotation axis accurately. Second, noise correlation between neurons has little impact on the population's performance. Third, although a population of only two neurons would be sufficient to encode any horizontal rotation axis, the population of 10 vertical system neurons is advantageous when the available integration time is short. For the fly, short integration times for decoding neuronal responses are important when controlling rapid flight maneuvers.
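The decoding scheme can be illustrated with a minimal sketch, assuming hypothetical cosine tuning curves and independent Gaussian noise rather than the measured response parameters of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical population: 10 neurons tuned (cosine) to the azimuth of
# the rotation axis; the measured tuning curves are not used here
preferred = np.linspace(0.0, 2.0 * np.pi, 10, endpoint=False)

def mean_response(axis):
    return np.cos(axis - preferred)         # graded response, arbitrary units

def map_decode(responses, candidates):
    # Gaussian likelihood with a flat prior: the MAP estimate is the
    # candidate axis whose mean population response best matches the data
    sq_err = [np.sum((responses - mean_response(a)) ** 2) for a in candidates]
    return candidates[int(np.argmin(sq_err))]

candidates = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
true_axis = 1.0
noisy = mean_response(true_axis) + rng.normal(0.0, 0.2, size=preferred.size)
estimate = map_decode(noisy, candidates)
```

Even with independent noise on every cell, the maximum a posteriori estimate lands close to the true axis, which is the sense in which the joint population response disambiguates what any single cell leaves ambiguous.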
Affiliation(s)
- Katja Karmeier
- Bielefeld University, Lehrstuhl für Neurobiologie, Postfach 100131, D-33501 Bielefeld, Germany.
34
Boeddeker N, Lindemann JP, Egelhaaf M, Zeil J. Responses of blowfly motion-sensitive neurons to reconstructed optic flow along outdoor flight paths. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2005; 191:1143-55. [PMID: 16133502 DOI: 10.1007/s00359-005-0038-9] [Citation(s) in RCA: 45] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2005] [Revised: 06/29/2005] [Accepted: 07/02/2005] [Indexed: 11/24/2022]
Abstract
The retinal image flow a blowfly experiences in its daily life on the wing is determined by both the structure of the environment and the animal's own movements. To understand the design of visual processing mechanisms, there is thus a need to analyse the performance of neurons under natural operating conditions. To this end, we recorded flight paths of flies outdoors and reconstructed what they had seen by moving a panoramic camera along exactly the same paths. The reconstructed image sequences were later replayed on a fast, panoramic flight simulator to identified motion-sensitive neurons of the so-called horizontal system (HS) in the lobula plate of the blowfly, which are assumed to extract self-motion parameters from optic flow. We show that under real-life conditions HS-cells not only encode information about self-rotation, but are also sensitive to translational optic flow and thus indirectly signal information about the depth structure of the environment. These properties do not require an elaboration of the known model of these neurons, because the natural optic flow sequences generate, at least qualitatively, the same depth-related response properties when used as input to a computational HS-cell model and to real neurons.
Affiliation(s)
- N Boeddeker
- Lehrstuhl Neurobiologie, Universität Bielefeld, Postfach 10 01 31, 33501 Bielefeld, Germany.
35
Lindemann JP, Kern R, van Hateren JH, Ritter H, Egelhaaf M. On the computations analyzing natural optic flow: quantitative model analysis of the blowfly motion vision pathway. J Neurosci 2005; 25:6435-48. [PMID: 16000634 PMCID: PMC6725274 DOI: 10.1523/jneurosci.1132-05.2005] [Citation(s) in RCA: 65] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2004] [Revised: 05/20/2005] [Accepted: 05/20/2005] [Indexed: 11/21/2022] Open
Abstract
For many animals, including humans, the optic flow generated on the eyes during locomotion is an important source of information about self-motion and the structure of the environment. The blowfly has been used frequently as a model system for experimental analysis of optic flow processing at the microcircuit level. Here, we describe a model of the computational mechanisms implemented by these circuits in the blowfly motion vision pathway. Although this model was originally proposed on the basis of simple experimenter-designed stimuli, we show that it is also capable of quantitatively predicting the responses to the complex dynamic stimuli a blowfly encounters in free flight. In particular, the model visual system exploits the active saccadic gaze and flight strategy of blowflies in a similar way as its neuronal counterpart does. The model circuit extracts information about translation velocity in the intersaccadic intervals and thus, indirectly, about the three-dimensional layout of the environment. By stepwise dissection of the model circuit, we determine which of its components are essential for these remarkable features. When accounting for the responses to complex natural stimuli, the model is much more robust against parameter changes than when explaining the neuronal responses to simple experimenter-defined stimuli. In contrast to conclusions drawn from experiments with simple stimuli, optimization of the parameter set for different segments of natural optic flow stimuli does not indicate pronounced adaptational changes of these parameters during long-lasting stimulation.
Affiliation(s)
- J P Lindemann
- Department of Neurobiology, Faculty for Biology, Bielefeld University, D-33501 Bielefeld, Germany.
36
Heitwerth J, Kern R, van Hateren JH, Egelhaaf M. Motion adaptation leads to parsimonious encoding of natural optic flow by blowfly motion vision system. J Neurophysiol 2005; 94:1761-9. [PMID: 15917319 DOI: 10.1152/jn.00308.2005] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Neurons sensitive to visual motion change their response properties during prolonged motion stimulation. These changes have been interpreted as adaptive, serving, for instance, to adjust the sensitivity of the visual motion pathway to velocity changes or to increase the reliability with which motion information is encoded. These conclusions are based on experiments with experimenter-designed motion stimuli that differ substantially in their dynamical properties from the optic flow an animal experiences during normal behavior. We analyze, for the first time, motion adaptation under natural stimulus conditions. The experiments are done on the H1-cell, an identified neuron in the blowfly visual motion pathway that has served in many previous studies as a model system for visual motion computation. We reconstructed the optic flow perceived by a blowfly in free flight and used this behaviorally generated optic flow to study motion adaptation. A variety of measures (variability in spike count, response latency, jitter of spike timing) suggests that coding quality does not improve with prolonged stimulation. However, although the number of spikes decreases considerably during stimulation with natural optic flow, the amount of information that is conveyed stays nearly constant. Thus the information per spike increases, and motion adaptation leads to parsimonious coding without sacrificing the reliability with which behaviorally relevant information is encoded.
Affiliation(s)
- J Heitwerth
- Department of Neurobiology, Faculty for Biology, Bielefeld University, D33501 Bielefeld, Germany.
37
Kern R, van Hateren JH, Michaelis C, Lindemann JP, Egelhaaf M. Function of a fly motion-sensitive neuron matches eye movements during free flight. PLoS Biol 2005; 3:e171. [PMID: 15884977 PMCID: PMC1110907 DOI: 10.1371/journal.pbio.0030171] [Citation(s) in RCA: 99] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2005] [Accepted: 03/14/2005] [Indexed: 11/28/2022] Open
Abstract
Sensing is often implicitly assumed to be the passive acquisition of information. However, part of the sensory information is generated actively when animals move. For instance, humans shift their gaze actively in a sequence of saccades towards interesting locations in a scene. Likewise, many insects shift their gaze by saccadic turns of body and head, keeping their gaze fixed between saccades. Here we employ a novel panoramic virtual reality stimulator and show that motion computation in a blowfly visual interneuron is tuned to make efficient use of the characteristic dynamics of retinal image flow. The neuron is able to extract information about the spatial layout of the environment by utilizing intervals of stable vision resulting from the saccadic viewing strategy. The extraction is possible because the retinal image flow evoked by translation, containing information about object distances, is confined to low frequencies. This flow component can be derived from the total optic flow between saccades because the residual intersaccadic head rotations are small and encoded at higher frequencies. Information about the spatial layout of the environment can thus be extracted by the neuron in a computationally parsimonious way. These results on neuronal function based on naturalistic, behaviourally generated optic flow are in stark contrast to conclusions based on conventional visual stimuli that the neuron primarily represents a detector for yaw rotations of the animal. With the aid of virtual reality, the authors study the neuronal responses to visual motion associated with natural behavior.
Affiliation(s)
- Roland Kern
- Department of Neurobiology, Faculty for Biology, Bielefeld University, Bielefeld, Germany.
38
van Hateren JH, Kern R, Schwerdtfeger G, Egelhaaf M. Function and coding in the blowfly H1 neuron during naturalistic optic flow. J Neurosci 2005; 25:4343-52. [PMID: 15858060 PMCID: PMC6725116 DOI: 10.1523/jneurosci.0616-05.2005] [Citation(s) in RCA: 51] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2004] [Revised: 03/17/2005] [Accepted: 03/17/2005] [Indexed: 11/21/2022] Open
Abstract
Naturalistic stimuli, reconstructed from measured eye movements of flying blowflies, were replayed on a panoramic stimulus device. The directional movement-sensitive H1 neuron was recorded from blowflies watching these stimuli. The response of the H1 neuron is dominated by the response to fast saccadic turns into one direction. The response between saccades is mostly inhibited by the front-to-back optic flow caused by the forward translation during flight. To unravel the functional significance of the H1 neuron, we replayed, in addition to the original behaviorally generated stimulus, two targeted stimulus modifications: (1) a stimulus in which flow resulting from translation was removed (this stimulus produced strong intersaccadic responses); and (2) a stimulus in which the saccades were removed by assuming that the head follows the smooth flight trajectory (this stimulus produced alternating zero or nearly saturating spike rates). The responses to the two modified stimuli are strongly different from the response to the original stimulus, showing the importance of translation and saccades for the H1 response to natural optic flow. The response to the original stimulus thus suggests a double function for the H1 neuron, assisting two major classes of movement-sensitive output neurons targeted by H1. First, its strong response to saccades may function as a saccadic suppressor (via one of its target neurons) for cells involved in figure-ground discrimination. Second, its intersaccadic response may increase the signal-to-noise ratio (SNR) of wide-field neurons involved in detecting translational optic flow between saccades, in particular when flying speeds are low or when object distances are large.
Affiliation(s)
- J H van Hateren
- Department of Neurobiophysics, University of Groningen, NL-9747 AG Groningen, The Netherlands.
39
Johnson AP, Barnes WJP, Macauley MWS. Local mechanisms for the separation of optic flow-field components in the land crab,Cardisoma guanhumi: A role for motion parallax? Vis Neurosci 2005; 21:905-11. [PMID: 15733345 DOI: 10.1017/s0952523804216108] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2004] [Indexed: 11/05/2022]
Abstract
Although a number of global mechanisms have been proposed over the years to explain how crabs might separate the rotational and translational components of their optic flow field, there has been no evidence to date that local mechanisms such as motion parallax are used in this separation. We describe here a study that takes advantage of a recently developed suite of computer-generated visual stimuli, which creates a three-dimensional world surrounding the crab in which we can simulate translational and rotational optic flow. We show that, while motion parallax is not the only mechanism used in flow-field separation, it does play a role in the recognition of translational optic flow fields: under conditions of low overall light intensity and low contrast ratio, when crabs find the distinction between rotation and translation harder, smaller eye movements occur in response to translation when motion parallax cues are present than when they are absent. Motion parallax is thus one of the many cues that crabs use to separate rotational from translational optic flow, responding with compensatory eye movements only to the former.
Affiliation(s)
- Aaron P Johnson
- Division of Environmental and Evolutionary Biology, Institute of Biomedical & Life Sciences, University of Glasgow, Glasgow, Scotland, UK.
40
Fry SN, Müller P, Baumann HJ, Straw AD, Bichsel M, Robert D. Context-dependent stimulus presentation to freely moving animals in 3D. J Neurosci Methods 2004; 135:149-57. [PMID: 15020099 DOI: 10.1016/j.jneumeth.2003.12.012] [Citation(s) in RCA: 18] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/11/2003] [Revised: 11/21/2003] [Accepted: 12/12/2003] [Indexed: 10/26/2022]
Abstract
The presentation of controllable, dynamic sensory stimuli provides a powerful experimental paradigm, which has been extensively applied to explore sensory processing in walking and tethered flying insects. Recent advances in computer hardware and software technology make it possible to track the 3D flight path of free-flying insects and process these data in real time, opening up the possibility of presenting dynamic stimuli to free-flying animals. To accommodate the increased complexity relating to 3D space, we partitioned experimental design, real-time data acquisition and stimulus control into multiple self-contained modules. 3D experimental scenarios were created in a stand-alone application by forging multiple 3D space-stimulus relationships. The use of dynamic cues is illustrated by an experiment in which dynamic acoustic cues were presented to a free-flying parasitoid fly in a large 3D environment. The combination of loosely coupled modules provides robust and flexible solutions, allowing new paradigms to be readily implemented on the basis of existing technologies. We demonstrate this with a test system that displayed a complex visual stimulus, controlled in real time by the 2D position and orientation of a test object. The presented methods are applicable in a variety of novel experimental paradigms, including learning paradigms, for various sensory modalities in walking, swimming and flying animals.
Affiliation(s)
- S N Fry
- Institute of Neuroinformatics, University/ETH Zürich, Winterthurerstrasse 190, CH-8057 Zürich, Switzerland.
41
Abstract
The behavioural repertoire of male flies includes visually guided chasing after moving targets. The visuomotor control system for these pursuits is among the fastest found in the animal kingdom. We simulated a virtual fly to test whether experimentally established hypotheses on the underlying control system are sufficient to explain chasing behaviour. Two operating instructions for steering the chasing virtual fly were derived from behavioural experiments: (i) the retinal size of the target controls the fly's forward speed and, thus, indirectly its distance to the target; and (ii) a smooth pursuit system uses the retinal position of the target to regulate the fly's flight direction. Low-pass filters implement neuronal processing time. Treating the virtual fly as a point mass, its kinematics are modelled taking into account translatory inertia and air friction. Despite its simplicity, the model shows behaviour similar to that of real flies. Depending on its starting position and orientation as well as on target size and speed, the virtual fly either catches the target or follows it indefinitely without capture. These two behavioural modes emerge from the control system for flight steering without implementation of an explicit decision maker.
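The two operating instructions can be combined in a toy point-mass simulation. The gains, drag coefficient, target size and target trajectory below are illustrative assumptions, not the fitted values of the model, and the low-pass filters on the control signals are omitted:

```python
import numpy as np

def simulate_chase(target_fn, t_end=3.0, dt=0.002,
                   k_turn=15.0, k_size=0.02, target_size=0.01, drag=5.0):
    """Point-mass pursuit with two control rules:
    (i) retinal target size sets forward thrust (small image -> speed up),
    (ii) retinal target position sets the turning rate (smooth pursuit).
    Returns the final position and the final distance to the target."""
    pos = np.array([0.0, 0.0])
    heading, speed = 0.0, 0.0
    for step in range(int(t_end / dt)):
        offset = target_fn(step * dt) - pos
        dist = max(np.linalg.norm(offset), 1e-6)
        retinal_size = target_size / dist          # small-angle approximation
        bearing = np.arctan2(offset[1], offset[0])
        error = np.arctan2(np.sin(bearing - heading),
                           np.cos(bearing - heading))
        heading += k_turn * error * dt             # rule (ii): position -> turn
        thrust = k_size / retinal_size             # rule (i): size -> speed
        speed += (thrust - drag * speed) * dt      # inertia plus air friction
        pos = pos + speed * dt * np.array([np.cos(heading), np.sin(heading)])
    return pos, np.linalg.norm(target_fn(t_end) - pos)

# chase a target moving slowly along a straight line
final_pos, final_dist = simulate_chase(lambda t: np.array([1.0 + 0.2 * t, 0.5]))
```

With these particular gains the virtual fly closes in on the moving target without an explicit decision maker; whether it captures the target or trails it at a fixed distance depends only on the parameter values, mirroring the two behavioural modes described above.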
Affiliation(s)
- Norbert Boeddeker
- Department of Neurobiology, Bielefeld University, PO Box 10 01 31, 33501 Bielefeld, Germany.
42
Warzecha AK, Kurtz R, Egelhaaf M. Synaptic transfer of dynamic motion information between identified neurons in the visual system of the blowfly. Neuroscience 2003; 119:1103-12. [PMID: 12831867 DOI: 10.1016/s0306-4522(03)00204-5] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Synaptic transmission is usually studied in vitro with electrical stimulation replacing the natural input of the system. In contrast, we analyzed the in vivo transfer of visual motion information from graded-potential presynaptic to spiking postsynaptic neurons in the fly. Motion in the null direction leads to hyperpolarization of the presynaptic neuron but has little influence on the postsynaptic cell, because the latter's firing rate is already low at rest, leaving little scope for further reductions. In contrast, preferred-direction motion leads to presynaptic depolarizations and increases the postsynaptic spike rate. Signal transfer to the postsynaptic cell is linear and reliable for presynaptic graded membrane potential fluctuations of up to approximately 10 Hz. This frequency range covers the dynamic range of velocities that is encoded with high gain by visual motion-sensitive neurons. Hence, information about preferred-direction motion is transmitted largely undistorted, ensuring a consistent dependency of neuronal signals on stimulus parameters such as motion velocity. Postsynaptic spikes are often elicited by rapid presynaptic spike-like depolarizations that are superimposed on the graded membrane potential. Although the timing of most of these spike-like depolarizations is set by noise and not by the motion stimulus, it is preserved at the synapse with millisecond precision.
Affiliation(s)
- A-K Warzecha
- Lehrstuhl für Neurobiologie, Fakultät für Biologie, Universität Bielefeld, Postfach 10 01 31, D-33501, Bielefeld, Germany.
43
Kurtz R, Egelhaaf M. Natural patterns of neural activity: how physiological mechanisms are orchestrated to cope with real life. Mol Neurobiol 2003; 27:13-32. [PMID: 12668900 DOI: 10.1385/mn:27:1:13] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Physiological mechanisms of neuronal information processing have been shaped during evolution by a continual interplay between organisms and their sensory surroundings. Thus, when asking for the functional significance of such mechanisms, the natural conditions under which they operate must be considered. This has been done successfully in several studies that employ sensory stimulation under in vivo conditions. These studies address the question of how physiological mechanisms within neurons are properly adjusted to the characteristics of natural stimuli and to the demands imposed on the system being studied. Results from diverse animal models show how neurons exploit natural stimulus statistics efficiently by utilizing specific filtering capacities. Mechanisms that allow neurons to adapt to the currently relevant range from an often immense stimulus spectrum are outlined, and examples are provided that suggest that information transfer between neurons is shaped by the system-specific computational tasks in the behavioral context.
Affiliation(s)
- Rafael Kurtz
- Lehrstuhl für Neurobiologie, Fakultät für Biologie, Universität Bielefeld, Germany.
44
Abstract
Vision guides flight behaviour in numerous insects. Despite their small brain, insects easily outperform current man-made autonomous vehicles in many respects. Examples are the virtuosic chasing manoeuvres male flies perform as part of their mating behaviour and the ability of bees to assess, on the basis of visual motion cues, the distance travelled in a novel environment. Analyses at both the behavioural and neuronal levels are beginning to unveil reasons for such extraordinary capabilities of insects. One recipe for their success is the adaptation of visual information processing to the specific requirements of the behavioural tasks and to the specific spatiotemporal properties of the natural input.
Affiliation(s)
- Martin Egelhaaf
- Lehrstuhl für Neurobiologie, Fakultät für Biologie, Universität Bielefeld, Postfach 100131, Germany