1
Jaimes‐Nino L, Bar A, Subach A, Stoldt M, Libbrecht R, Scharf I, Foitzik S. Transcriptomic Signature of Spatial Navigation in Brains of Desert Ants. Ecol Evol 2024; 14:e70365. PMID: 39371266; PMCID: PMC11449808; DOI: 10.1002/ece3.70365.
Abstract
Navigation is crucial for central-place foragers to locate food and return to the nest. Cataglyphis ants are renowned for their advanced navigation abilities, relying on landmark cues and path integration. This study aims to uncover the transcriptomic basis of exceptional spatial learning in the central nervous system of Cataglyphis niger. Ants navigated a maze with a food reward, and we examined expression changes linked to correct decisions in subsequent runs. Correct decisions correlated with expression changes in the optic lobes, but not the central brain, showing a downregulation of genes associated with sucrose response and Creb3l1. The latter gene is homologous to Drosophila crebA, which is essential for long-term memory formation. To understand how ants use distance information during path integration, we analyzed expression shifts associated with the last distance traveled. We uncovered a transcriptomic footprint in the central brain, but not in the optic lobes, with genes enriched for energy consumption and neurological functions, including neuronal projection development, synaptic target inhibition, and recognition processes. This suggests that transcriptional activity in the central brain is necessary for estimating distance traveled, which is crucial for path integration. Our study supports the distinct roles of different brain parts for navigation in Cataglyphis ants.
Affiliation(s)
- Luisa Maria Jaimes‐Nino
- Institute of Organismic and Molecular Evolution, Johannes Gutenberg University Mainz, Mainz, Germany
- Adi Bar
- School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, Tel Aviv, Israel
- Aziz Subach
- School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, Tel Aviv, Israel
- Marah Stoldt
- Institute of Organismic and Molecular Evolution, Johannes Gutenberg University Mainz, Mainz, Germany
- Romain Libbrecht
- Insect Biology Research Institute, UMR7261, CNRS, University of Tours, Tours, France
- Inon Scharf
- School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, Tel Aviv, Israel
- Susanne Foitzik
- Institute of Organismic and Molecular Evolution, Johannes Gutenberg University Mainz, Mainz, Germany
2
Wagner H, Egelhaaf M, Carr C. Model organisms and systems in neuroethology: one hundred years of history and a look into the future. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2024; 210:227-242. PMID: 38227005; PMCID: PMC10995084; DOI: 10.1007/s00359-023-01685-z.
Abstract
The Journal of Comparative Physiology lived up to its name in the last 100 years by including more than 1500 different taxa in almost 10,000 publications. Seventeen phyla of the animal kingdom were represented. The honeybee (Apis mellifera) is the taxon with the most publications, followed by the locust (Locusta migratoria), crayfishes (Cambarus spp.), and the fruit fly (Drosophila melanogaster). The representation of species in this journal in the past thus differs markedly from the 13 model systems named by the National Institutes of Health (USA). We mention major accomplishments of research on species with specific adaptations, specialist animals, for example, the quantitative description of the processes underlying the action potential in squid (Loligo forbesii) and the isolation of the first receptor channel in the electric eel (Electrophorus electricus) and electric ray (Torpedo spp.). Future neuroethological work should make the recent genetic and technological developments available for specialist animals. There are many research questions left that may be answered with high yield in specialists, and some questions that can only be answered in specialists. Moreover, the adaptations of animals that occupy specific ecological niches often lend themselves to biomimetic applications. We go into some depth in explaining our thoughts on the research of motion vision in insects, sound localization in barn owls, and electroreception in weakly electric fish.
Affiliation(s)
- Hermann Wagner
- Institute of Biology II, RWTH Aachen University, 52074 Aachen, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, Germany
- Catherine Carr
- Department of Biology, University of Maryland at College Park, College Park, USA
3
Schoepe T, Janotte E, Milde MB, Bertrand OJN, Egelhaaf M, Chicca E. Finding the gap: neuromorphic motion-vision in dense environments. Nat Commun 2024; 15:817. PMID: 38280859; PMCID: PMC10821932; DOI: 10.1038/s41467-024-45063-y.
Abstract
Animals have evolved mechanisms to travel safely and efficiently within different habitats. When traveling through dense terrain, animals avoid collisions and cross narrow passages while maintaining an overall course. Multiple hypotheses address how animals solve the challenges faced during such travel. Here we show that a single mechanism enables safe and efficient travel. We developed an insect-inspired robot with remarkable capabilities for traveling in dense terrain: avoiding collisions, crossing gaps, and selecting safe passages. These capabilities are accomplished by a neuromorphic network that steers the robot toward regions of low apparent motion. Our system leverages knowledge about vision processing and obstacle avoidance in insects, and our results demonstrate how insects might travel safely through diverse habitats. We anticipate that our system will serve as a working hypothesis for studying insects' travel in dense terrains. Furthermore, it illustrates that novel hardware systems can be designed by understanding the underlying mechanisms driving behaviour.
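The steering rule summarized in this abstract, turning toward the region of lowest apparent motion, can be illustrated with a toy controller. This is purely a sketch of the general idea; the function name, gain, and hemifield averaging are assumptions of this example, not the paper's neuromorphic network:

```python
def steering_command(flow_left, flow_right, gain=1.0):
    """Toy 'low apparent motion' steering rule.

    flow_left, flow_right: mean optic-flow magnitude in the left and
    right visual hemifield. Returns a turn command: positive = turn
    right (away from stronger flow on the left), negative = turn left.
    """
    total = flow_left + flow_right
    if total == 0:
        return 0.0  # no motion signal at all: hold course
    # Normalized left-right flow imbalance steers toward weaker flow
    return gain * (flow_left - flow_right) / total

# Strong flow on the left (a nearby obstacle): command is positive,
# i.e. steer right, toward the more open region.
print(steering_command(0.8, 0.2))
```

Nearby surfaces generate faster image motion than distant ones during translation, so this imbalance rule implicitly steers toward open space.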
Affiliation(s)
- Thorben Schoepe
- Peter Grünberg Institut 15, Forschungszentrum Jülich, Aachen, Germany
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
- Ella Janotte
- Event Driven Perception for Robotics, Italian Institute of Technology, iCub facility, Genoa, Italy
- Moritz B Milde
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Penrith, Australia
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Elisabetta Chicca
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
4
Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self-movement estimation. Curr Biol 2023; 33:4960-4979.e7. PMID: 37918398; PMCID: PMC10848174; DOI: 10.1016/j.cub.2023.10.011.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can be spuriously triggered by visual motion created by objects moving in the world. Here, we show that stationary patterns on the retina, which constitute evidence against observer rotation, suppress inappropriate stabilizing rotational behavior in the fruit fly Drosophila. In silico experiments show that artificial neural networks (ANNs) that are optimized to distinguish observer movement from external object motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's local motion and optic-flow detectors. Our results show how the fly brain incorporates negative evidence to improve heading stability, exemplifying how a compact brain exploits geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C B Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
5
Ammer G, Serbe-Kamp E, Mauss AS, Richter FG, Fendl S, Borst A. Multilevel visual motion opponency in Drosophila. Nat Neurosci 2023; 26:1894-1905. PMID: 37783895; PMCID: PMC10620086; DOI: 10.1038/s41593-023-01443-z.
Abstract
Inhibitory interactions between opponent neuronal pathways constitute a common circuit motif across brain areas and species. However, in most cases, synaptic wiring and biophysical, cellular and network mechanisms generating opponency are unknown. Here, we combine optogenetics, voltage and calcium imaging, connectomics, electrophysiology and modeling to reveal multilevel opponent inhibition in the fly visual system. We uncover a circuit architecture in which a single cell type implements direction-selective, motion-opponent inhibition at all three network levels. This inhibition, mediated by GluClα receptors, is balanced with excitation in strength, despite tenfold fewer synapses. The different opponent network levels constitute a nested, hierarchical structure operating at increasing spatiotemporal scales. Electrophysiology and modeling suggest that distributing this computation over consecutive network levels counteracts a reduction in gain, which would result from integrating large opposing conductances at a single instance. We propose that this neural architecture provides resilience to noise while enabling high selectivity for relevant sensory information.
Affiliation(s)
- Georg Ammer
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Etienne Serbe-Kamp
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Ludwig Maximilian University of Munich, Munich, Germany
- Alex S Mauss
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Florian G Richter
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Sandra Fendl
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
6
Mathejczyk TF, Babo ÉJ, Schönlein E, Grinda NV, Greiner A, Okrožnik N, Belušič G, Wernet MF. Behavioral responses of free-flying Drosophila melanogaster to shiny, reflecting surfaces. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023; 209:929-941. PMID: 37796303; PMCID: PMC10643280; DOI: 10.1007/s00359-023-01676-0.
Abstract
Active locomotion plays an important role in the life of many animals, permitting them to explore the environment, find vital resources, and escape predators. Most insect species rely on a combination of visual cues such as celestial bodies, landmarks, or linearly polarized light to navigate or orient themselves in their surroundings. In nature, linearly polarized light can arise either from atmospheric scattering or from reflections off shiny non-metallic surfaces like water. Multiple reports have described different behavioral responses of various insects to such shiny surfaces. Our goal was to test whether free-flying Drosophila melanogaster, a molecular genetic model organism and behavioral generalist, also manifests specific behavioral responses when confronted with such polarized reflections. Fruit flies were placed in a custom-built arena with controlled environmental parameters (temperature, humidity, and light intensity). Flight detections and landings were quantified for three different stimuli: a diffusely reflecting matt plate, a small patch of shiny acetate film, and real water. We compared hydrated and dehydrated fly populations, since the state of hydration may change the motivation of flies to seek or avoid water. Our analysis reveals for the first time that flying fruit flies indeed use vision to avoid flying over shiny surfaces.
Affiliation(s)
- Thomas F Mathejczyk
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Édouard J Babo
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Erik Schönlein
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Nikolai V Grinda
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Andreas Greiner
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Nina Okrožnik
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
- Gregor Belušič
- Department of Biology, Biotechnical Faculty, University of Ljubljana, Ljubljana, Slovenia
- Mathias F Wernet
- Division of Neurobiology, Institute of Biology, Fachbereich Biologie, Chemie and Pharmazie, Freie Universität Berlin, Königin-Luise Strasse 1-3, 14195 Berlin, Germany
7
Cruz TL, Chiappe ME. Multilevel visuomotor control of locomotion in Drosophila. Curr Opin Neurobiol 2023; 82:102774. PMID: 37651855; DOI: 10.1016/j.conb.2023.102774.
Abstract
Vision is critical for the control of locomotion, but the underlying neural mechanisms by which visuomotor circuits contribute to the movement of the body through space are not yet well understood. Locomotion engages multiple control systems, forming distinct interacting "control levels" driven by the activity of distributed and overlapping circuits. A comprehensive understanding of the mechanisms underlying locomotion control therefore requires the consideration of all control levels and their necessary coordination. Owing to its small size and the wide availability of experimental tools, Drosophila has become an important model system for studying this coordination. Traditionally, insect locomotion has been divided into studying either the biomechanics and local control of limbs, or navigation and course control. However, recent developments in tracking techniques and in physiological and genetic tools in Drosophila have prompted researchers to examine multilevel control coordination in flight and walking.
Affiliation(s)
- Tomás L Cruz
- Champalimaud Research, Champalimaud Centre for the Unknown, 1400-038 Lisbon, Portugal
- M Eugenia Chiappe
- Champalimaud Research, Champalimaud Centre for the Unknown, 1400-038 Lisbon, Portugal
8
Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self motion estimation. bioRxiv [Preprint] 2023:2023.01.04.522814. PMID: 36711843; PMCID: PMC9881891; DOI: 10.1101/2023.01.04.522814.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can confuse the movement of external objects with genuine self motion. Here, we show that stationary patterns on the retina, which constitute negative evidence against self rotation, are used by the fruit fly Drosophila to suppress inappropriate stabilizing rotational behavior. In silico experiments show that artificial neural networks optimized to distinguish self and world motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's motion- and optic flow-detectors. Our results exemplify how the compact brain of the fly incorporates negative evidence to improve heading stability, exploiting geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Present address: Institute of Neuroscience, Technical University of Munich, Munich 80802, Germany
- Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A. Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C. B. Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A. Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
9
Honkanen A, Hensgen R, Kannan K, Adden A, Warrant E, Wcislo W, Heinze S. Parallel motion vision pathways in the brain of a tropical bee. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37017717; DOI: 10.1007/s00359-023-01625-x.
Abstract
Spatial orientation is a prerequisite for most behaviors. In insects, the underlying neural computations take place in the central complex (CX), the brain's navigational center. In this region different streams of sensory information converge to enable context-dependent navigational decisions. Accordingly, a variety of CX input neurons deliver information about different navigation-relevant cues. In bees, direction encoding polarized light signals converge with translational optic flow signals that are suited to encode the flight speed of the animals. The continuous integration of speed and directions in the CX can be used to generate a vector memory of the bee's current position in space in relation to its nest, i.e., perform path integration. This process depends on specific, complex features of the optic flow encoding CX input neurons, but it is unknown how this information is derived from the visual periphery. Here, we thus aimed at gaining insight into how simple motion signals are reshaped upstream of the speed encoding CX input neurons to generate their complex features. Using electrophysiology and anatomical analyses of the halictic bees Megalopta genalis and Megalopta centralis, we identified a wide range of motion-sensitive neurons connecting the optic lobes with the central brain. While most neurons formed pathways with characteristics incompatible with CX speed neurons, we showed that one group of lobula projection neurons possess some physiological and anatomical features required to generate the visual responses of CX optic-flow encoding neurons. However, as these neurons cannot explain all features of CX speed cells, local interneurons of the central brain or alternative input cells from the optic lobe are additionally required to construct inputs with sufficient complexity to deliver speed signals suited for path integration in bees.
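The path-integration computation described in this abstract, continuously integrating direction and speed into a vector memory of position relative to the nest, can be sketched minimally. This is an illustrative discrete-time sketch only; the function name and sampling scheme are assumptions of the example, not the circuit model studied in the paper:

```python
import math

def integrate_path(steps):
    """Accumulate a home vector from (heading_radians, distance) samples.

    Each sample is one leg of the outbound journey: the direction of
    travel and the distance covered (e.g., flight speed * time step).
    Returns the distance to the nest and the heading that points home.
    """
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)  # accumulate displacement
        y += distance * math.sin(heading)
    home_distance = math.hypot(x, y)
    home_direction = math.atan2(-y, -x)    # nest lies opposite the sum
    return home_distance, home_direction

# Outbound: 3 units east, then 4 units north. The nest is 5 units away.
dist, direction = integrate_path([(0.0, 3.0), (math.pi / 2, 4.0)])
print(round(dist, 1))  # → 5.0
```

In the bee, the heading term would come from the polarized-light compass and the distance term from translational optic flow, scaled by the speed ambiguity the abstract mentions.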
Affiliation(s)
- Anna Honkanen
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Ronja Hensgen
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Kavitha Kannan
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Andrea Adden
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Neural Circuits and Evolution Lab, The Francis Crick Institute, London, UK
- Eric Warrant
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- William Wcislo
- Smithsonian Tropical Research Institute, Panama City, República de Panamá
- Stanley Heinze
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- NanoLund, Lund University, Lund, Sweden
10
Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568; DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings over several hundred metres to kilometres, is necessary for mediating behaviours, such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent the distances unambiguously, but these are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
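The "array of local motion detectors" referred to in this abstract is classically modeled as a correlation-type elementary motion detector (Hassenstein-Reichardt correlator). The following is a minimal illustrative sketch, assuming a first-order low-pass filter as the delay stage and a 1-D photoreceptor array; all parameter values are arbitrary choices for the example, not taken from the review:

```python
import numpy as np

def reichardt_correlator(stimulus, dt=1.0, tau=2.0):
    """Minimal Hassenstein-Reichardt elementary motion detector array.

    stimulus: 2-D array (time, space) of photoreceptor intensities.
    Returns the opponent output (time, space-1): positive values
    signal rightward motion, negative values leftward motion.
    """
    alpha = dt / (tau + dt)              # low-pass filter coefficient
    delayed = np.zeros_like(stimulus)
    for i in range(1, stimulus.shape[0]):  # temporal low-pass = delay
        delayed[i] = delayed[i - 1] + alpha * (stimulus[i] - delayed[i - 1])
    # Correlate each receptor's delayed signal with its neighbour's
    # undelayed signal, then subtract the mirror-symmetric term.
    rightward = delayed[:, :-1] * stimulus[:, 1:]
    leftward = stimulus[:, :-1] * delayed[:, 1:]
    return rightward - leftward

# A sinusoidal grating drifting rightward across 20 photoreceptors
t, x = np.meshgrid(np.arange(100), np.arange(20), indexing="ij")
moving = np.sin(0.5 * (x - 0.3 * t))
response = reichardt_correlator(moving)
# A positive mean (after the filter transient) indicates rightward motion
print(response[20:].mean() > 0)
```

Note that such a correlator confounds velocity with pattern contrast and spatial structure, which is one root of the speed-scaling ambiguity the abstract discusses.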
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
11
Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. Biol Cybern 2022; 116:635-660. PMID: 36303043; PMCID: PMC9691503; DOI: 10.1007/s00422-022-00948-3.
Abstract
The visual systems of insects are relatively simple compared to that of humans, yet they enable navigation through complex environments in which insects achieve exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze fixation (rotational motion, known as saccades) and inter-saccadic translational motion. While the fundamental process underlying insect optic flow has been known since the 1950s, so too has its dependence on contrast; the surrounding visual pathways used to overcome environmental dependencies are less well understood. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of time to impact. To consistently estimate time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from the literature to translational motion, this paper proposes a novel algorithm that overcomes the contrast dependence of time-to-impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processing, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact to a target across 360 background, distance, and velocity combinations: a 17-fold increase over the fundamental process. These results show that the contrast dependence of time-to-impact estimation can be overcome in a biologically plausible manner. Combined with previous results for low-speed rotational motion estimation, this allows contrast-invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
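The time-to-impact quantity estimated in this paper can, in principle, be read out from retinal image expansion alone via the classic tau relation, time to impact ≈ θ / (dθ/dt) for an object subtending half-angle θ. The sketch below illustrates only this geometric relation under a constant-approach-speed assumption; it is not the paper's filtering algorithm, and all numbers are invented for the example:

```python
import math

def time_to_impact(distance, speed, obj_radius, dt=1e-3):
    """Estimate time to impact from angular size alone (tau relation).

    Simulates what an observer could measure: the half-angle theta of
    an approaching object and its rate of change. Neither distance nor
    speed needs to be known individually: their ratio falls out.
    """
    theta_now = math.atan2(obj_radius, distance)
    theta_next = math.atan2(obj_radius, distance - speed * dt)
    d_theta = (theta_next - theta_now) / dt   # angular expansion rate
    return theta_now / d_theta

# Object 2 m away, approached at 0.5 m/s: true time to impact is 4 s
tau = time_to_impact(distance=2.0, speed=0.5, obj_radius=0.05)
print(round(tau, 2))  # ≈ 4.0
```

The practical difficulty the paper addresses is that real angular-expansion measurements come from contrast-dependent motion detectors, so the raw estimate is biased by scene statistics unless that dependence is removed.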
Affiliation(s)
- Phillip S. M. Skelton
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
- Anthony Finn
- Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095, Australia
- Russell S. A. Brinkworth
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
12
Yu L, Zhao J, Ma Z, Wang W, Yan S, Jin Y, Fang Y. Experimental Verification on Steering Flight of Honeybee by Electrical Stimulation. Cyborg Bionic Syst 2022. DOI: 10.34133/2022/9895837.
Abstract
An artificial locomotion control strategy is the fundamental technique for ensuring that cyborg insects accomplish preset assignments. Existing research has established that electrical stimulation applied to the optic lobes is an appropriate flight control strategy for small insects such as the honeybee. This control technique has been confirmed to be effective for honeybee flight initiation and cessation, but its regulatory effect on steering locomotion has not been fully verified. Here, we investigated steering control of honeybees by applying electrical stimulation signals with different duty cycles and frequencies to the unilateral optic lobes, and identified the stimulus parameters with the highest response success rate. Moreover, we confirmed the effectiveness of steering control by verifying the presence of rotation torque in tethered honeybees and changes in body orientation in crawling honeybees. Our study contributes reliable parameter references for the motion control of cyborg honeybees.
Affiliation(s)
- Li Yu
- School of Mechanical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Jieliang Zhao
- School of Mechanical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Zhiyun Ma
- School of Mechanical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Wenzhong Wang
- School of Mechanical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Shaoze Yan
- Division of Intelligent and Biomechanical Systems, State Key Laboratory of Tribology, Department of Mechanical Engineering, Tsinghua University, Beijing 100084, China
- Yue Jin
- Institute of Apicultural Research, Chinese Academy of Agricultural Science, 100193, China
- Yu Fang
- Institute of Apicultural Research, Chinese Academy of Agricultural Science, 100193, China
13
Ali MA, Bollmann JH. Motion vision: Course control in the developing visual system. Curr Biol 2022; 32:R520-R523. PMID: 35671725; DOI: 10.1016/j.cub.2022.04.084.
Abstract
As we move around, the image pattern on our retina is constantly changing. Nervous systems have evolved to detect such global 'optic flow' patterns. A new study reveals how optic flow is encoded in the larval zebrafish brain and could be used for the estimation of self-motion.
Affiliation(s)
- Mir Ahsan Ali
- Institute of Biology I, Faculty of Biology, University of Freiburg, 79104 Freiburg, Germany
- Johann H Bollmann
- Institute of Biology I, Faculty of Biology, University of Freiburg, 79104 Freiburg, Germany; Bernstein Center Freiburg, University of Freiburg, 79104 Freiburg, Germany
14
Ryu L, Kim SY, Kim AJ. From Photons to Behaviors: Neural Implementations of Visual Behaviors in Drosophila. Front Neurosci 2022; 16:883640. [PMID: 35600623 PMCID: PMC9115102 DOI: 10.3389/fnins.2022.883640] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2022] [Accepted: 03/28/2022] [Indexed: 11/17/2022] Open
Abstract
Neural implementations of visual behaviors in Drosophila have been dissected intensively over the past couple of decades. The availability of premier genetic toolkits, behavioral assays in tethered or freely moving conditions, and advances in connectomics have permitted an understanding of the physiological and anatomical details of the nervous system underlying complex visual behaviors. In this review, we describe recent advances in how various features of a visual scene are detected by the Drosophila visual system and how the neural circuits process these signals and elicit an appropriate behavioral response. Special emphasis is placed on the neural circuits that detect visual features such as brightness, color, local motion, optic flow, and translating or approaching visual objects, which are important for behaviors such as phototaxis, the optomotor response, attraction (or aversion) to moving objects, navigation, and visual learning. This review offers an integrative framework for how the fly brain detects visual features and orchestrates an appropriate behavioral response.
Affiliation(s)
- Leesun Ryu
- Department of Electronic Engineering, Hanyang University, Seoul, South Korea
- Sung Yong Kim
- Department of Electronic Engineering, Hanyang University, Seoul, South Korea
- Anmo J. Kim
- Department of Electronic Engineering, Hanyang University, Seoul, South Korea
- Department of Biomedical Engineering, Hanyang University, Seoul, South Korea
15
Ravi S, Siesenop T, Bertrand OJ, Li L, Doussot C, Fisher A, Warren WH, Egelhaaf M. Bumblebees display characteristics of active vision during robust obstacle avoidance flight. J Exp Biol 2022; 225:274096. [PMID: 35067721 PMCID: PMC8920035 DOI: 10.1242/jeb.243021] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2021] [Accepted: 01/18/2022] [Indexed: 11/20/2022]
Abstract
Insects are remarkable flyers capable of navigating through highly cluttered environments. We tracked the head and thorax of bumblebees flying freely in a tunnel containing vertically oriented obstacles to uncover the sensorimotor strategies used for obstacle detection and collision avoidance. Bumblebees displayed all the characteristics of active vision during flight, stabilizing their head relative to the external environment and maintaining close alignment between their gaze and flight path. Head stabilization increased the motion contrast of nearby features against the background, enabling obstacle detection. As bees approached obstacles, they appeared to modulate avoidance responses based on the relative retinal expansion velocity (RREV) of obstacles, and their maximum evasion acceleration was linearly related to RREVmax. Finally, bees prevented collisions through rapid roll manoeuvres implemented by their thorax. Overall, this combination of visuo-motor strategies highlights the elegant solutions developed by insects for visually guided flight through cluttered environments.
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany; School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2600, Australia; Author for correspondence
- Tim Siesenop
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Olivier J. Bertrand
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Liang Li
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, University of Konstanz, 78464 Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, 78464 Konstanz, Germany; Department of Biology, University of Konstanz, 78464 Konstanz, Germany
- Charlotte Doussot
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- William H. Warren
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, RI 02912, USA
- Martin Egelhaaf
- Department of Neurobiology and Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33619 Bielefeld, Germany
16
Leonte MB, Leonhardt A, Borst A, Mauss AS. Aerial course stabilization is impaired in motion-blind flies. J Exp Biol 2021; 224:271038. [PMID: 34297111 DOI: 10.1242/jeb.242219] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/06/2021] [Accepted: 06/22/2021] [Indexed: 01/12/2023]
Abstract
Visual motion detection is among the best understood neuronal computations. As extensively investigated in tethered flies, visual motion signals are assumed to be crucial to detect and counteract involuntary course deviations. During free flight, however, course changes are also signalled by other sensory systems. Therefore, it is as yet unclear to what extent motion vision contributes to course control. To address this question, we genetically rendered flies motion-blind by blocking their primary motion-sensitive neurons and quantified their free-flight performance. We found that such flies have difficulty maintaining a straight flight trajectory, much like unimpaired flies in the dark. By unilateral wing clipping, we generated an asymmetry in propulsive force and tested the ability of flies to compensate for this perturbation. While wild-type flies showed a remarkable level of compensation, motion-blind animals exhibited pronounced circling behaviour. Our results therefore directly confirm that motion vision is necessary to fly straight under realistic conditions.
Affiliation(s)
- Maria-Bianca Leonte
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany; Graduate School of Systemic Neurosciences, Ludwig Maximilians University, Großhadernerstr. 2, Planegg-Martinsried 82152, Germany
- Aljoscha Leonhardt
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
- Alexander Borst
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
- Alex S Mauss
- Circuits - Computation - Models, Max Planck Institute of Neurobiology, Am Klopferspitz 18, Martinsried 82152, Germany
17
Parlevliet PP, Kanaev A, Hung CP, Schweiger A, Gregory FD, Benosman R, de Croon GCHE, Gutfreund Y, Lo CC, Moss CF. Autonomous Flying With Neuromorphic Sensing. Front Neurosci 2021; 15:672161. [PMID: 34054420 PMCID: PMC8160287 DOI: 10.3389/fnins.2021.672161] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2021] [Accepted: 04/07/2021] [Indexed: 11/17/2022] Open
Abstract
Autonomous flight for large aircraft appears to be within our reach. However, launching autonomous systems for everyday missions still requires an immense interdisciplinary research effort supported by pointed policies and funding. We believe that concerted endeavors in the fields of neuroscience, mathematics, sensor physics, robotics, and computer science are needed to address the remaining crucial scientific challenges. In this paper, we argue for a bio-inspired approach to solving the challenges of autonomous flight, outline the frontier of sensing, data processing, and flight control within a neuromorphic paradigm, and chart the directions of research needed to achieve operational capabilities comparable to those we observe in nature. One central problem of neuromorphic computing is learning. In biological systems, learning is achieved by adaptive and relativistic information acquisition characterized by near-continuous information retrieval with variable rates and sparsity. The resulting savings in both energy and computational resources are an inspiration for autonomous systems. We consider pertinent features of insect, bat, and bird flight behavior as examples for addressing various vital aspects of autonomous flight. Insects exhibit sophisticated flight dynamics despite comparatively simple brains, making them excellent subjects for the study of navigation and flight control. Bats and birds enable more complex models of attention and point to the importance of active sensing for conducting more complex missions. Implementing neuromorphic paradigms for autonomous flight will require fundamental changes in both traditional hardware and software. We provide recommendations for sensor hardware and processing algorithm development to enable energy-efficient and computationally effective flight control.
Affiliation(s)
- Andrey Kanaev
- U.S. Office of Naval Research Global, London, United Kingdom
- Chou P. Hung
- United States Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- Frederick D. Gregory
- U.S. Army Research Laboratory, London, United Kingdom
- Department of Bioengineering, Imperial College London, London, United Kingdom
- Ryad Benosman
- Institut de la Vision, INSERM UMRI S 968, Paris, France
- Biomedical Science Tower, University of Pittsburgh, Pittsburgh, PA, United States
- Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, United States
- Guido C. H. E. de Croon
- Micro Air Vehicle Laboratory, Department of Control and Operations, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands
- Yoram Gutfreund
- The Neuroethological Lab, Department of Neurobiology, The Rappaport Institute for Biomedical Research, Technion – Israel Institute of Technology, Haifa, Israel
- Chung-Chuan Lo
- Brain Research Center/Institute of Systems Neuroscience, National Tsing Hua University, Hsinchu, Taiwan
- Cynthia F. Moss
- Laboratory of Comparative Neural Systems and Behavior, Department of Psychological and Brain Sciences, Neuroscience and Mechanical Engineering, Johns Hopkins University, Baltimore, MD, United States
18
Meece M, Rathore S, Buschbeck EK. Stark trade-offs and elegant solutions in arthropod visual systems. J Exp Biol 2021; 224:jeb215541. [PMID: 33632851 DOI: 10.1242/jeb.215541] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
Vision is one of the most important senses for humans and animals alike. Diverse, elegant specializations have evolved among insects and other arthropods in response to specific visual challenges and ecological needs. These specializations are the subject of this Review, and they are best understood in light of the physical limitations of vision. For example, achieving high spatial resolution requires fine sampling in different directions, as demonstrated by the well-studied large eyes of dragonflies. However, it has recently been shown that a comparatively tiny robber fly (Holcocephala) has similarly high visual resolution in the frontal visual field, despite its eyes being a fraction of the size of those of dragonflies. Other visual specializations in arthropods include the ability to discern colors, which relies on parallel inputs that are tuned to spectral content. Color vision is important for the detection of objects such as mates, flowers and oviposition sites, and is particularly well developed in butterflies, stomatopods and jumping spiders. Analogous to color vision, the visual systems of many arthropods are specialized for the detection of polarized light, which, in addition to communication with conspecifics, can be used for orientation and navigation. For vision in low light, optical superposition compound eyes perform particularly well. Other modifications to maximize photon capture involve large lenses, stout photoreceptors and, as has been suggested for nocturnal bees, the neural pooling of information. Extreme adaptations even allow insects to see colors at very low light levels or to navigate using the Milky Way.
Affiliation(s)
- Michael Meece
- Department of Biological Sciences, University of Cincinnati, Cincinnati, OH 45221, USA
- Shubham Rathore
- Department of Biological Sciences, University of Cincinnati, Cincinnati, OH 45221, USA
- Elke K Buschbeck
- Department of Biological Sciences, University of Cincinnati, Cincinnati, OH 45221, USA
19
Kaushik PK, Olsson SB. Using virtual worlds to understand insect navigation for bio-inspired systems. CURRENT OPINION IN INSECT SCIENCE 2020; 42:97-104. [PMID: 33010476 DOI: 10.1016/j.cois.2020.09.010] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/14/2020] [Revised: 09/18/2020] [Accepted: 09/22/2020] [Indexed: 06/11/2023]
Abstract
Insects perform a wide array of intricate behaviors over large spatial and temporal scales in complex natural environments. A mechanistic understanding of insect cognition has direct implications for how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots. Virtual reality (VR) offers an opportunity to assess insect neuroethology while presenting complex, yet controlled, stimuli. Here, we discuss the use of insects as inspiration for artificial systems, recent advances in different VR technologies, current knowledge gaps, and the potential for applying insect VR research to bio-inspired robots. Finally, we advocate the need to diversify our model organisms and behavioral paradigms and to embrace the complexity of the natural world. This will help us to uncover the proximate and ultimate bases of brain and behavior and to extract general principles for common challenging problems.
Affiliation(s)
- Pavan Kumar Kaushik
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India
- Shannon B Olsson
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India
20
Mangalam M, Lee IC, Newell KM, Kelty-Stephen DG. Visual effort moderates postural cascade dynamics. Neurosci Lett 2020; 742:135511. [PMID: 33227367 DOI: 10.1016/j.neulet.2020.135511] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2020] [Revised: 10/15/2020] [Accepted: 11/15/2020] [Indexed: 01/13/2023]
Abstract
Standing still while focusing on a visible target in front of us is a preamble to many coordinated behaviors (e.g., reaching for an object). Hiding behind its apparent simplicity is a deep layering of texture at many scales. The task of standing still laces together activities at multiple scales: from ensuring that a few photoreceptors on the retina cover the target in the visual field, at an extremely fine scale, through synergies spanning the limbs and joints at intermediate scales, to the mechanical layout of the ground underfoot and optic flow in the visual field at coarser scales. Here, we used multiscale probability density function (PDF) analysis to show that postural fluctuations exhibit statistical signatures of cascade dynamics similar to those found in fluid flow. In participants asked to stand quietly, the oculomotor strain of visually fixating at different distances moderated postural cascade dynamics. Visually fixating at a comfortable viewing distance elicited posture with cascade dynamics similar to those of posture with eyes closed. Greater viewing distances, known to stabilize posture, showed more diminished cascade dynamics. In contrast, the nearest and farthest viewing distances, which require greater oculomotor strain to focus on targets, elicited a dramatic strengthening of postural cascade dynamics, reflecting active postural adjustments. Critically, these findings suggest that vision stabilizes posture by reconfiguring the prestressed poise that prepares the body to interact with different spatial layouts.
Affiliation(s)
- Madhur Mangalam
- Department of Physical Therapy, Movement and Rehabilitation Sciences, Northeastern University, Boston, MA 02115, USA
- I-Chieh Lee
- UNC-NC State Joint Department of Biomedical Engineering, UNC-Chapel Hill, Chapel Hill, NC 27514, USA
- Karl M Newell
- Department of Kinesiology, University of Georgia, Athens, GA 30602, USA