1
Morton NJ, Grice M, Kemp S, Grace RC. Non-symbolic estimation of big and small ratios with accurate and noisy feedback. Atten Percept Psychophys 2024. PMID: 38992321. DOI: 10.3758/s13414-024-02914-6.
Abstract
The ratio of two magnitudes can take one of two values depending on the order in which they are operated on: a 'big' ratio of the larger to smaller magnitude, or a 'small' ratio of the smaller to larger. Although big and small ratio scales have different metric properties and carry divergent predictions for perceptual comparison tasks, no psychophysical studies have directly compared them. Two experiments are reported in which subjects implicitly learned to compare pairs of brightnesses and line lengths by non-symbolic feedback based on the scaled big ratio, small ratio or difference of the magnitudes presented. Results of Experiment 1 showed all three operations were learned quickly and estimated with a high degree of accuracy that did not significantly differ across groups or between intensive and extensive modalities, though regressions on individual data suggested an overall predisposition towards differences. Experiment 2 tested whether subjects learned to estimate the operation trained or to associate stimulus pairs with correct responses. For each operation, Gaussian noise was added to the feedback that was constant for repetitions of each pair. For all subjects, coefficients for the added noise component were negative when entered in a regression model alongside the trained differences or ratios, and were statistically significant in 80% of individual cases. Thus, subjects learned to estimate the comparative operations and effectively ignored or suppressed the added noise. These results suggest the perceptual system is highly flexible in its capacity for non-symbolic computation, which may reflect a deeper connection between perceptual structure and mathematics.
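The two ratio conventions and the difference operation compared in this study can be made concrete in a short sketch (an illustrative toy only; the magnitude scaling used for the actual feedback is not reproduced here):

```python
def compare(a: float, b: float) -> dict:
    """Compare two positive magnitudes under the three operations
    described in the abstract: big ratio, small ratio, difference."""
    lo, hi = sorted((a, b))
    return {
        "big_ratio": hi / lo,    # larger/smaller, range [1, inf)
        "small_ratio": lo / hi,  # smaller/larger, range (0, 1]
        "difference": a - b,     # signed, order-dependent
    }

r = compare(8.0, 2.0)
# big ratio 4.0, small ratio 0.25, difference 6.0
```

Note that the big and small ratios are reciprocals of one another, while the difference changes sign rather than value when the operand order is swapped, which is one way the three scales make divergent predictions.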
Affiliation(s)
- Nicola J Morton
- School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand
- Matt Grice
- School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand
- Simon Kemp
- School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand
- Randolph C Grace
- School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand
2
Sibeaux A, Newport C, Green JP, Karlsson C, Engelmann J, Burt de Perera T. Taking a shortcut: what mechanisms do fish use? Commun Biol 2024;7:578. PMID: 38755224. PMCID: PMC11099040. DOI: 10.1038/s42003-024-06179-5.
Abstract
Path integration is a powerful navigational mechanism whereby individuals continuously update their distance and angular vector of movement to calculate their position in relation to their departure location, allowing them to return along the most direct route even across unfamiliar terrain. While path integration has been investigated in several terrestrial animals, it has never been demonstrated in aquatic vertebrates, where movement occurs through volumetric space and sensory cues available for navigation are likely to differ substantially from those in terrestrial environments. By performing displacement experiments with Lamprologus ocellatus, we show evidence consistent with fish using path integration to navigate alongside other mechanisms (allothetic place cues and route recapitulation). These results indicate that the use of path integration is likely to be deeply rooted within the vertebrate phylogeny irrespective of the environment, and suggest that fish may possess a spatial encoding system that parallels that of mammals.
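Path integration as described here, continuously accumulating distance and heading into a position estimate and then returning along the direct vector, can be sketched in a minimal two-dimensional toy model (not the authors' analysis code):

```python
import math

def home_vector(steps):
    """Accumulate (distance, heading) movement steps into a position,
    then return the direct range and bearing back to the start point."""
    x = y = 0.0
    for dist, heading in steps:  # heading in radians
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    # the home vector points opposite to the accumulated displacement
    return math.hypot(x, y), math.atan2(-y, -x)

dist, bearing = home_vector([(3, 0.0), (4, math.pi / 2)])
# after 3 units east then 4 units north, home lies 5 units away
```

The key property, exploited by displacement experiments, is that the estimate depends only on self-motion (idiothetic) input, so an animal displaced without that input should head to where home would have been.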
Affiliation(s)
- Adelaide Sibeaux
- Department of Biology, University of Oxford, Zoology Research and Administration Building, 11a Mansfield Road, Oxford, OX1 3SZ, UK
- Cait Newport
- Department of Biology, University of Oxford, Zoology Research and Administration Building, 11a Mansfield Road, Oxford, OX1 3SZ, UK
- Jonathan P Green
- Department of Biology, University of Oxford, Zoology Research and Administration Building, 11a Mansfield Road, Oxford, OX1 3SZ, UK
- Cecilia Karlsson
- Wolfson College, University of Cambridge, Cambridge, CB3 9BB, UK
- Jacob Engelmann
- Faculty of Biology, Bielefeld University, Universitätstrasse 25, Bielefeld, 33615, Germany
- Theresa Burt de Perera
- Department of Biology, University of Oxford, Zoology Research and Administration Building, 11a Mansfield Road, Oxford, OX1 3SZ, UK
3
Honkanen A, Hensgen R, Kannan K, Adden A, Warrant E, Wcislo W, Heinze S. Parallel motion vision pathways in the brain of a tropical bee. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37017717. DOI: 10.1007/s00359-023-01625-x.
Abstract
Spatial orientation is a prerequisite for most behaviors. In insects, the underlying neural computations take place in the central complex (CX), the brain's navigational center. In this region, different streams of sensory information converge to enable context-dependent navigational decisions. Accordingly, a variety of CX input neurons deliver information about different navigation-relevant cues. In bees, direction-encoding polarized-light signals converge with translational optic-flow signals that are suited to encode the flight speed of the animals. The continuous integration of speed and direction in the CX can be used to generate a vector memory of the bee's current position in space in relation to its nest, i.e., to perform path integration. This process depends on specific, complex features of the optic-flow-encoding CX input neurons, but it is unknown how this information is derived from the visual periphery. Here, we therefore aimed at gaining insight into how simple motion signals are reshaped upstream of the speed-encoding CX input neurons to generate their complex features. Using electrophysiology and anatomical analyses of the halictid bees Megalopta genalis and Megalopta centralis, we identified a wide range of motion-sensitive neurons connecting the optic lobes with the central brain. While most neurons formed pathways with characteristics incompatible with CX speed neurons, we showed that one group of lobula projection neurons possesses some physiological and anatomical features required to generate the visual responses of CX optic-flow-encoding neurons. However, as these neurons cannot explain all features of CX speed cells, local interneurons of the central brain or alternative input cells from the optic lobe are additionally required to construct inputs with sufficient complexity to deliver speed signals suited for path integration in bees.
Affiliation(s)
- Anna Honkanen
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Ronja Hensgen
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Kavitha Kannan
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Andrea Adden
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Neural Circuits and Evolution Lab, The Francis Crick Institute, London, UK
- Eric Warrant
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- William Wcislo
- Smithsonian Tropical Research Institute, Panama City, República de Panamá
- Stanley Heinze
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- NanoLund, Lund University, Lund, Sweden
4
Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568. DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings over several hundred metres to kilometres, is necessary for mediating behaviours, such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent the distances unambiguously, but these are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
Affiliation(s)
- Martin Egelhaaf
- Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615, Bielefeld, Germany
5
Zittrell F, Pabst K, Carlomagno E, Rosner R, Pegel U, Endres DM, Homberg U. Integration of optic flow into the sky compass network in the brain of the desert locust. Front Neural Circuits 2023;17:1111310. PMID: 37187914. PMCID: PMC10175609. DOI: 10.3389/fncir.2023.1111310.
Abstract
Flexible orientation through any environment requires a sense of current relative heading that is updated based on self-motion. Global external cues originating from the sky or the earth's magnetic field, together with local cues, provide a reference frame for the sense of direction. Locally, optic flow may inform about turning maneuvers, travel speed and covered distance. The central complex in the insect brain is associated with orientation behavior and largely acts as a navigation center. Visual information from global celestial cues and local landmarks is integrated in the central complex to form an internal representation of current heading. However, it is less clear how optic flow is integrated into the central-complex network. To identify these sites of integration, we recorded intracellularly from neurons in the locust central complex while presenting lateral grating patterns that simulated translational and rotational motion. Certain types of central-complex neurons were sensitive to optic-flow stimulation independent of the type and direction of simulated motion. Columnar neurons innervating the noduli, paired central-complex substructures, were tuned to the direction of simulated horizontal turns. Modeling the connectivity of these neurons with a system of proposed compass neurons can account for rotation-direction-specific shifts in the activity profile in the central complex corresponding to turn direction. Our model is similar, but not identical, to the mechanisms proposed for angular velocity integration in the navigation compass of the fly Drosophila.
Affiliation(s)
- Frederick Zittrell
- Department of Biology, Philipps-Universität Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Marburg, Germany
- Kathrin Pabst
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Marburg, Germany
- Department of Psychology, Philipps-Universität Marburg, Marburg, Germany
- Elena Carlomagno
- Department of Biology, Philipps-Universität Marburg, Marburg, Germany
- Ronny Rosner
- Department of Biology, Philipps-Universität Marburg, Marburg, Germany
- Uta Pegel
- Department of Biology, Philipps-Universität Marburg, Marburg, Germany
- Dominik M. Endres
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Marburg, Germany
- Department of Psychology, Philipps-Universität Marburg, Marburg, Germany
- Uwe Homberg
- Department of Biology, Philipps-Universität Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Marburg, Germany
- Correspondence: Uwe Homberg
6
Xiong X, Manoonpong P. No need for landmarks: an embodied neural controller for robust insect-like navigation behaviors. IEEE Trans Cybern 2022;52:12893-12904. PMID: 34264833. DOI: 10.1109/TCYB.2021.3091127.
Abstract
Bayesian filters have been considered to help refine and develop theoretical views on spatial cell functions for self-localization. However, extending a Bayesian filter to reproduce insect-like navigation behaviors (e.g., home searching) remains an open and challenging problem. To address this problem, we propose an embodied neural controller for self-localization, foraging, backward homing (BH), and home searching of an advanced mobility sensor (AMOS)-driven insect-like robot. The controller, comprising a navigation module for the Bayesian self-localization and goal-directed control of AMOS and a locomotion module for coordinating the 18 joints of AMOS, leads to its robust insect-like navigation behaviors. As a result, the proposed controller enables AMOS to perform robust foraging, BH, and home searching against various levels of sensory noise, compared to conventional controllers. Its implementation relies only on self-localization and heading perception, rather than global positioning and landmark guidance. Interestingly, the proposed controller makes AMOS achieve spiral searching patterns comparable to those performed by real insects. We also demonstrated the performance of the controller for real-time indoor and outdoor navigation in a real insect-like robot without any landmark and cognitive map.
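The Bayesian self-localization at the heart of the proposed controller can be illustrated with a generic one-dimensional histogram filter on a circular track (a textbook sketch, not the AMOS implementation; the noise parameters are illustrative):

```python
def predict(belief, move, p_correct=0.8):
    """Motion update: shift the belief by `move` cells on a circular
    track, leaking probability to neighbours to model noisy locomotion."""
    n = len(belief)
    out = [0.0] * n
    for i, p in enumerate(belief):
        out[(i + move) % n] += p_correct * p
        out[(i + move - 1) % n] += (1 - p_correct) / 2 * p
        out[(i + move + 1) % n] += (1 - p_correct) / 2 * p
    return out

def update(belief, likelihood):
    """Sensor update: weight by the measurement likelihood, renormalise."""
    post = [b * l for b, l in zip(belief, likelihood)]
    z = sum(post)
    return [p / z for p in post]

belief = [0.25] * 4                 # start fully uncertain
belief = predict(belief, 1)         # take one noisy step
belief = update(belief, [0.1, 0.1, 0.7, 0.1])  # sensor favours cell 2
```

Alternating these two steps is the recursive Bayes filter; the abstract's point is that such a filter alone does not yield insect-like behaviours (e.g. spiral home searching) without an embodied controller around it.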
7
Gaffin DD, Muñoz MG, Hoefnagels MH. Evidence of learning walks related to scorpion home burrow navigation. J Exp Biol 2022;225:275795. PMID: 35638243. PMCID: PMC9250797. DOI: 10.1242/jeb.243947.
Abstract
The navigation by chemo-textural familiarity hypothesis (NCFH) suggests that scorpions use their midventral pectines to gather chemical and textural information near their burrows and use this information as they subsequently return home. For NCFH to be viable, animals must somehow acquire home-directed ‘tastes’ of the substrate, such as through path integration (PI) and/or learning walks. We conducted laboratory behavioral trials using desert grassland scorpions (Paruroctonus utahensis). Animals reliably formed burrows in small mounds of sand we provided in the middle of circular, sand-lined behavioral arenas. We processed overnight infrared video recordings with a MATLAB script that tracked animal movements at 1–2 s intervals. In all, we analyzed the movements of 23 animals, representing nearly 1500 h of video recording. We found that once animals established their home burrows, they immediately made one to several short, looping excursions away from and back to their burrows before walking greater distances. We also observed similar excursions when animals made burrows in level sand in the middle of the arena (i.e. no mound provided). These putative learning walks, together with recently reported PI in scorpions, may provide the crucial home-directed information requisite for NCFH.
Affiliation(s)
- Douglas D Gaffin
- Department of Biology, University of Oklahoma, Norman, OK 73019, USA
- Maria G Muñoz
- Department of Biology, University of Oklahoma, Norman, OK 73019, USA
- Mariëlle H Hoefnagels
- Department of Microbiology and Plant Biology, University of Oklahoma, Norman, OK 73019, USA
8
Rañó I. On motion camouflage as proportional navigation. Biol Cybern 2022;116:69-79. PMID: 34766202. DOI: 10.1007/s00422-021-00907-4.
Abstract
Motion camouflage is a stealth behaviour by which an insect can appear stationary at a fixed point while approaching or escaping another moving insect. Although several approaches have been proposed to generate motion camouflage in simulated and real agents, the exact mechanisms insects use to perform this complex behaviour are not well understood, especially considering their limited perceptual and computational resources. This paper sheds light on the possible underlying control mechanisms insects might use to generate motion camouflage by training and analysing a series of motion camouflage controllers using reinforcement learning. We first investigate through simulations the most relevant information available to the insect that can be used to perform motion camouflage and analyse the learnt controllers. The results of this analysis drove us to hypothesise two simpler control mechanisms which, we show, can also generate motion camouflage. The proposed controllers are an extension of proportional navigation, another interception technique found in nature, and therefore both animal behaviours seem to be connected. Motion camouflage can lead, among other applications, to novel approaches to closely observe animals in the wild, record sports events or gather information in military operations without being noticed.
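Proportional navigation, the interception law that the learnt controllers were found to extend, commands a turn rate proportional to the rotation rate of the line of sight (LOS) to the target (a textbook sketch; the gain N = 3 is an illustrative choice, not a value from the paper):

```python
import math

def pro_nav_turn_rate(pursuer, target, prev_los, dt, N=3.0):
    """Classic proportional navigation: commanded turn rate equals
    N times the measured rotation rate of the line-of-sight angle."""
    los = math.atan2(target[1] - pursuer[1], target[0] - pursuer[0])
    d = (los - prev_los + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    return N * d / dt, los

rate, los = pro_nav_turn_rate((0, 0), (10, 0), prev_los=0.0, dt=0.1)
# target dead ahead with an unchanged LOS -> zero commanded turn
```

Holding the LOS rotation rate at zero is also what makes the geometry of motion camouflage work: a pursuer that keeps the target on a fixed bearing from a reference point appears stationary against that point.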
Affiliation(s)
- Iñaki Rañó
- The Mærsk Mc Kinney Møller Institute, University of Southern Denmark, Campusvej 55, 5230, Odense, Denmark
9
Hulse BK, Haberkern H, Franconville R, Turner-Evans DB, Takemura SY, Wolff T, Noorman M, Dreher M, Dan C, Parekh R, Hermundstad AM, Rubin GM, Jayaraman V. A connectome of the Drosophila central complex reveals network motifs suitable for flexible navigation and context-dependent action selection. eLife 2021;10:e66039. PMID: 34696823. PMCID: PMC9477501. DOI: 10.7554/eLife.66039.
Abstract
Flexible behaviors over long timescales are thought to engage recurrent neural networks in deep brain regions, which are experimentally challenging to study. In insects, recurrent circuit dynamics in a brain region called the central complex (CX) enable directed locomotion, sleep, and context- and experience-dependent spatial navigation. We describe the first complete electron-microscopy-based connectome of the Drosophila CX, including all its neurons and circuits at synaptic resolution. We identified new CX neuron types, novel sensory and motor pathways, and network motifs that likely enable the CX to extract the fly's head-direction, maintain it with attractor dynamics, and combine it with other sensorimotor information to perform vector-based navigational computations. We also identified numerous pathways that may facilitate the selection of CX-driven behavioral patterns by context and internal state. The CX connectome provides a comprehensive blueprint necessary for a detailed understanding of network dynamics underlying sleep, flexible navigation, and state-dependent action selection.
Affiliation(s)
- Brad K Hulse
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Hannah Haberkern
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Romain Franconville
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Tanya Wolff
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Marcella Noorman
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Marisa Dreher
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Chuntao Dan
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Ruchi Parekh
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Gerald M Rubin
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
- Vivek Jayaraman
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
10
Navigation and orientation in Coleoptera: a review of strategies and mechanisms. Anim Cogn 2021;24:1153-1164. PMID: 33846895. DOI: 10.1007/s10071-021-01513-4.
Abstract
Spatial orientation is important for animals to forage, mate, migrate, and escape certain threats, and can require simple to complex cognitive abilities and behaviours. As these behaviours are more difficult to experimentally test in vertebrates, considerable research has focussed on investigating spatial orientation in insects. However, the majority of insect spatial orientation research tends to focus on a few taxa of interest, especially social insects. Beetles present an interesting insect group to study in this respect, due to their diverse taxonomy and biology, and prevalence as agricultural pests. In this article, I review research on beetle spatial orientation. Then, I use this synthesis to discuss mechanisms beetles employ in the context of different behaviours that require orientation or navigation. I conclude by discussing two future avenues for behavioural research on this topic, which could lead to more robust conclusions on how species in this diverse order are able to traverse through a wide variety of environments.
11
Differences and ratios in a nonsymbolic 'artificial algebra': effects of extended training. Behav Processes 2020;180:104242. PMID: 32910993. DOI: 10.1016/j.beproc.2020.104242.
Abstract
Grace et al. (2018) showed that humans could estimate ratios and differences of stimulus magnitudes by feedback and without explicit instruction in a nonsymbolic 'artificial algebra' task, but that responding depended on both operations even though only one was trained. Here we asked whether control by the trained operation would increase over several sessions, that is, if perceptual learning would occur. Observers (n = 16) completed four sessions in which feedback was based on either ratios or differences for stimulus pairs that varied in brightness (Experiment 1) or line length (Experiment 2). Results showed that control by the trained and untrained operations increased and decreased, respectively, over the sessions, indicating perceptual learning. For about two thirds of individual sessions, regressions indicated significant control by both differences and ratios, suggesting that the perceptual system automatically computes two operations. The similarity of results across experiments with both intensive (brightness) and extensive (line length) stimulus dimensions suggests that differences and ratios are computed centrally, perhaps as part of a general system for processing magnitudes (cf. Walsh, 2003).
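The regression logic described, entering both the trained and untrained operations as predictors of each response, can be sketched with ordinary least squares on synthetic data (illustrative only; the variable names, sample size and noise level are assumptions, not the study's):

```python
# Fit responses on both a difference predictor and a ratio predictor,
# using synthetic data from a purely difference-controlled observer.
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(1, 10, 200)   # first stimulus magnitude
b = rng.uniform(1, 10, 200)   # second stimulus magnitude
resp = (a - b) + rng.normal(0, 0.1, 200)  # difference-trained responding

# design matrix: intercept, difference, ratio
X = np.column_stack([np.ones_like(a), a - b, a / b])
coef, *_ = np.linalg.lstsq(X, resp, rcond=None)
# coef[1] (difference) should dominate coef[2] (ratio) for this observer
```

A dataset in which both coefficients come out significant, as in about two thirds of the sessions reported, is the pattern the authors read as the perceptual system computing both operations at once.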
12
Multimodal interactions in insect navigation. Anim Cogn 2020;23:1129-1141. PMID: 32323027. PMCID: PMC7700066. DOI: 10.1007/s10071-020-01383-2.
Abstract
Animals travelling through the world receive input from multiple sensory modalities that could be important for the guidance of their journeys. Given the availability of a rich array of cues, from idiothetic information to input from sky compasses and visual information through to olfactory and other cues (e.g. gustatory, magnetic, anemotactic or thermal) it is no surprise to see multimodality in most aspects of navigation. In this review, we present the current knowledge of multimodal cue use during orientation and navigation in insects. Multimodal cue use is adapted to a species’ sensory ecology and shapes navigation behaviour both during the learning of environmental cues and when performing complex foraging journeys. The simultaneous use of multiple cues is beneficial because it provides redundant navigational information, and in general, multimodality increases robustness, accuracy and overall foraging success. We use examples from sensorimotor behaviours in mosquitoes and flies as well as from large scale navigation in ants, bees and insects that migrate seasonally over large distances, asking at each stage how multiple cues are combined behaviourally and what insects gain from using different modalities.
13
Scheiner R, Frantzmann F, Jäger M, Mitesser O, Helfrich-Förster C, Pauls D. A novel thermal-visual place learning paradigm for honeybees (Apis mellifera). Front Behav Neurosci 2020;14:56. PMID: 32351370. PMCID: PMC7174502. DOI: 10.3389/fnbeh.2020.00056.
Abstract
Honeybees (Apis mellifera) have fascinating navigational skills and learning capabilities in the field. To decipher the mechanisms underlying place learning in honeybees, we need paradigms to study place learning of individual honeybees under controlled laboratory conditions. Here, we present a novel visual place learning arena for honeybees which relies on high temperatures as aversive stimuli. Honeybees learn to locate a safe spot in an unpleasantly warm arena, relying on a visual panorama. Bees can solve this task at a temperature of 46°C, while at temperatures above 48°C bees die quickly. This new paradigm, which is based on pioneering work on Drosophila, now allows us to investigate thermal-visual place learning of individual honeybees in the laboratory, for example after controlled genetic knockout or pharmacological intervention.
Affiliation(s)
- Ricarda Scheiner
- Behavioral Physiology and Sociobiology, Theodor-Boveri-Institute, Biocenter, University of Würzburg, Würzburg, Germany
- Felix Frantzmann
- Department of Animal Physiology, Institute of Biology, Leipzig University, Leipzig, Germany
- Neurobiology and Genetics, Theodor-Boveri-Institute, Biocenter, University of Würzburg, Würzburg, Germany
- Maria Jäger
- Behavioral Physiology and Sociobiology, Theodor-Boveri-Institute, Biocenter, University of Würzburg, Würzburg, Germany
- Oliver Mitesser
- Field Station Fabrikschleichach, Biocenter, Department of Animal Ecology and Tropical Biology, University of Würzburg, Würzburg, Germany
- Charlotte Helfrich-Förster
- Neurobiology and Genetics, Theodor-Boveri-Institute, Biocenter, University of Würzburg, Würzburg, Germany
- Dennis Pauls
- Department of Animal Physiology, Institute of Biology, Leipzig University, Leipzig, Germany
- Neurobiology and Genetics, Theodor-Boveri-Institute, Biocenter, University of Würzburg, Würzburg, Germany
14
Leibold C. A model for navigation in unknown environments based on a reservoir of hippocampal sequences. Neural Netw 2020;124:328-342. DOI: 10.1016/j.neunet.2020.01.014.
15
Kathman ND, Fox JL. Representation of haltere oscillations and integration with visual inputs in the fly central complex. J Neurosci 2019;39:4100-4112. PMID: 30877172. PMCID: PMC6529865. DOI: 10.1523/JNEUROSCI.1707-18.2019.
Abstract
The reduced hindwings of flies, known as halteres, are specialized mechanosensory organs that detect body rotations during flight. Primary afferents of the haltere encode its oscillation frequency linearly over a wide bandwidth and with precise phase-dependent spiking. However, it is not currently known whether information from haltere primary afferent neurons is sent to higher brain centers where sensory information about body position could be used in decision making, or whether precise spike timing is useful beyond the peripheral circuits that drive wing movements. We show that in cells in the central brain, the timing and rates of neural spiking can be modulated by sensory input from experimental haltere movements (driven by a servomotor). Using multichannel extracellular recording in restrained flesh flies (Sarcophaga bullata of both sexes), we examined responses of central complex cells to a range of haltere oscillation frequencies alone, and in combination with visual motion speeds and directions. Haltere-responsive units fell into multiple response classes, including those responding to any haltere motion and others with firing rates linearly related to the haltere frequency. Cells with multisensory responses showed higher firing rates than the sum of the unisensory responses at higher haltere frequencies. They also maintained visual properties, such as directional selectivity, while increasing response gain nonlinearly with haltere frequency. Although haltere inputs have been described extensively in the context of rapid locomotion control, we find haltere sensory information in a brain region known to be involved in slower, higher-order behaviors, such as navigation.

SIGNIFICANCE STATEMENT: Many animals use vision for navigation; however, these cues must be interpreted in the context of the body's position. In mammalian brains, hippocampal cells combine visual and vestibular information to encode head direction. A region of the arthropod brain, known as the central complex (CX), similarly encodes heading information, but it is unknown whether proprioceptive information is integrated here as well. We show that CX neurons respond to input from halteres, specialized proprioceptors in flies that detect body rotations. These neurons also respond to visual input, providing one of the few examples of multiple sensory modalities represented in individual CX cells. Haltere stimulation modifies neural responses to visual signals, providing a mechanism for integrating vision with proprioception.
Affiliation(s)
- Nicholas D Kathman
- Department of Biology, Case Western Reserve University, Cleveland, Ohio 44106
- Jessica L Fox
- Department of Biology, Case Western Reserve University, Cleveland, Ohio 44106

16
Honkanen A, Adden A, da Silva Freitas J, Heinze S. The insect central complex and the neural basis of navigational strategies. J Exp Biol 2019; 222(Suppl 1):jeb188854. [PMID: 30728235] [DOI: 10.1242/jeb.188854] [Citation(s) in RCA: 100]
Abstract
Oriented behaviour is present in almost all animals, indicating that it is an ancient feature that emerged in animal brains hundreds of millions of years ago. Although many complex navigation strategies have been described, each strategy can be broken down into a series of elementary navigational decisions. At each moment in time, an animal has to compare its current heading with its desired direction and compensate for any mismatch by producing a steering response either to the right or to the left. Unlike reflex-driven movements, target-directed navigation is not only initiated in response to sensory input, but also takes into account previous experience and motivational state. Once a series of elementary decisions is chained together to form one of many coherent navigation strategies, the animal can pursue a navigational target, e.g. a food source, a nest entrance or a constant flight direction during migrations. Insects show a great variety of complex navigation behaviours and, owing to their small brains, the pursuit of the neural circuits controlling navigation has made substantial progress in recent years. A brain region as ancient as insects themselves, called the central complex, has emerged as the likely navigation centre of the brain. Research across many species has shown that the central complex contains circuitry that might comprise the neural substrate of elementary navigational decisions. Although this region is also involved in a wide range of other functions, we hypothesize in this Review that its role in mediating the animal's next move during target-directed behaviour is its ancestral function, around which other functions have been layered over the course of evolution.
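The elementary navigational decision described above (compare current heading with desired direction, then steer to reduce the mismatch) can be made concrete with a minimal sketch. The function name and radian convention are our own illustration, not taken from the review:

```python
import math

def steering_response(current_heading, desired_heading):
    """Signed steering command for one elementary navigational decision.

    Headings are in radians. The mismatch is wrapped to (-pi, pi] so the
    response is always a turn the shorter way: positive values mean steer
    left (counter-clockwise), negative values mean steer right.
    """
    error = desired_heading - current_heading
    # atan2(sin, cos) wraps the angular error into (-pi, pi]
    return math.atan2(math.sin(error), math.cos(error))
```

Chained over time, repeatedly applying such a correction keeps an agent on its desired course, which is the sense in which complex strategies decompose into elementary decisions.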
Affiliation(s)
- Anna Honkanen
- Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Andrea Adden
- Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Stanley Heinze
- Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden

17
Ravi S, Bertrand O, Siesenop T, Manz LS, Doussot C, Fisher A, Egelhaaf M. Gap perception in bumblebees. J Exp Biol 2019; 222(2):jeb184135. [PMID: 30683732] [DOI: 10.1242/jeb.184135] [Citation(s) in RCA: 24]
Abstract
A number of insects fly over long distances below the natural canopy, where the physical environment is highly cluttered, consisting of obstacles of varying shape, size and texture. While navigating within such environments, animals need to perceive and disambiguate environmental features that might obstruct their flight. The most elemental aspect of aerial navigation through such environments is gap identification and 'passability' evaluation. We used bumblebees to seek insights into the mechanisms used for gap identification when confronted with an obstacle in their flight path and the behavioral compensations employed to assess gap properties. Initially, bumblebee foragers were trained to fly through an unobstructed flight tunnel that led to a foraging chamber. After the bees were familiar with this situation, we placed a wall containing a gap that unexpectedly obstructed the flight path on a return trip to the hive. The flight trajectories of the bees as they approached the obstacle wall and traversed the gap were analyzed in order to evaluate their behavior as a function of the distance between the gap and a background wall that was placed behind the gap. Bumblebees initially decelerated when confronted with an unexpected obstacle. Deceleration was first noticed when the obstacle subtended around 35 deg on the retina, but also depended on the properties of the gap. Subsequently, the bees gradually traded off their longitudinal velocity for lateral velocity and approached the gap with increasing lateral displacement and lateral velocity. Bumblebees shaped their flight trajectory depending on the salience of the gap, indicated in our case by the optic flow contrast between the region within the gap and the obstacle, which decreased with decreasing distance between the gap and the background wall. As the optic flow contrast decreased, the bees spent an increasing amount of time moving laterally across the obstacles.
During these repeated lateral maneuvers, the bees are probably assessing gap geometry and passability.
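The retinal subtense figure quoted above (around 35 deg) follows from simple viewing geometry. A small illustrative helper (function names and the example width are our assumptions, not values from the paper) relates object width, viewing distance and visual angle:

```python
import math

def subtended_angle_deg(width, distance):
    """Visual angle in degrees subtended by an object of `width` viewed
    frontally from `distance` (same units for both)."""
    return math.degrees(2 * math.atan(width / (2 * distance)))

def distance_at_subtense(width, angle_deg):
    """Distance at which the object first subtends `angle_deg` degrees
    (inverse of subtended_angle_deg)."""
    return width / (2 * math.tan(math.radians(angle_deg) / 2))
```

For example, under this geometry a hypothetical 0.3 m wide obstacle would first subtend 35 deg at roughly 0.48 m, which is the kind of distance at which deceleration would begin.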
Affiliation(s)
- Sridhar Ravi
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Olivier Bertrand
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Lea-Sophie Manz
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany; Faculty of Biology, Johannes Gutenberg-Universität Mainz, 55122 Mainz, Germany
- Charlotte Doussot
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany
- Alex Fisher
- School of Engineering, RMIT University, Melbourne, VIC 3001, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, 33615 Bielefeld, Germany

18
Zhao M. Human spatial representation: what we cannot learn from the studies of rodent navigation. J Neurophysiol 2018; 120:2453-2465. [PMID: 30133384] [DOI: 10.1152/jn.00781.2017] [Citation(s) in RCA: 6]
Abstract
Studies of human and rodent navigation often reveal a remarkable cross-species similarity between the cognitive and neural mechanisms of navigation. Such cross-species resemblance often overshadows critical differences between how humans and nonhuman animals navigate. In this review, I propose that a navigation system requires both a storage system (i.e., representing spatial information) and a positioning system (i.e., sensing spatial information) to operate. I then argue that the way humans represent spatial information differs from that inferred from the cellular activity observed during rodent navigation. This difference spans the whole hierarchy of spatial representation, from representing the structure of an environment to the representation of subregions of an environment, routes and paths, and the distance and direction relative to a goal location. These cross-species inconsistencies suggest that what we learn from rodent navigation does not always transfer to human navigation. Finally, I argue for closing the loop on the dominant, unidirectional animal-to-human approach in navigation research, so that insights from behavioral studies of human navigation may also flow back to shed light on the cellular mechanisms of navigation for both humans and other mammals (i.e., a human-to-animal approach).
Affiliation(s)
- Mintao Zhao
- School of Psychology, University of East Anglia, Norwich, United Kingdom; Department of Human Perception, Cognition, and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany

19
Heinze S. Unraveling the neural basis of insect navigation. Curr Opin Insect Sci 2017; 24:58-67. [PMID: 29208224] [PMCID: PMC6186168] [DOI: 10.1016/j.cois.2017.09.001] [Citation(s) in RCA: 59]
Abstract
One of the defining features of animals is their ability to navigate their environment. This topic has been under intense investigation through behavioral experiments for nearly a century. In insects, this work has largely focused on the remarkable homing abilities of ants and bees. More recently, the neural basis of navigation has shifted into the focus of attention. Starting with work revealing the neurons that process the sensory signals used for navigation, in particular polarized skylight, migratory locusts became the key species for delineating navigation-relevant regions of the insect brain. In recent years, this work served as a basis for research in the fruit fly Drosophila, and extraordinary progress has been made in illuminating the neural underpinnings of navigational processes. With an increasingly detailed understanding of navigation circuits, we can begin to ask whether there is a fundamentally shared concept underlying all navigation behavior across insects. This review highlights recent advances and puts them into the context of the behavioral work on ants and bees, as well as the circuits involved in polarized-light processing. A region of the insect brain called the central complex emerges as the common substrate for guiding navigation, and its highly organized neuroarchitecture provides a framework for future investigations potentially suited to explain all insect navigation behavior at the level of identified neurons.
Affiliation(s)
- Stanley Heinze
- Lund University, Department of Biology, Lund Vision Group, Sölvegatan 35, 22362 Lund, Sweden

20
Stone T, Webb B, Adden A, Weddig NB, Honkanen A, Templin R, Wcislo W, Scimeca L, Warrant E, Heinze S. An Anatomically Constrained Model for Path Integration in the Bee Brain. Curr Biol 2017; 27:3069-3085.e11. [PMID: 28988858] [DOI: 10.1016/j.cub.2017.08.052] [Citation(s) in RCA: 191]
Abstract
Path integration is a widespread navigational strategy in which directional changes and distance covered are continuously integrated on an outward journey, enabling a straight-line return to home. Bees use vision for this task (a celestial-cue-based visual compass and an optic-flow-based visual odometer), but the underlying neural integration mechanisms are unknown. Using intracellular electrophysiology, we show that polarized-light-based compass neurons and optic-flow-based speed-encoding neurons converge in the central complex of the bee brain, and through block-face electron microscopy, we identify potential integrator cells. Based on plausible output targets for these cells, we propose a complete circuit for path integration and steering in the central complex, with anatomically identified neurons suggested for each processing step. The resulting model circuit is thus fully constrained biologically and provides a functional interpretation for many previously unexplained architectural features of the central complex. Moreover, we show that the receptive fields of the newly discovered speed neurons can support path integration for the holonomic motion (i.e., a ground velocity that is not precisely aligned with body orientation) typical of bee flight, a feature not captured in any previously proposed model of path integration. In a broader context, the model circuit presented provides a general mechanism for producing steering signals by comparing current and desired headings, suggesting a more basic function for central complex connectivity, from which path integration may have evolved.
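The holonomic point above (ground velocity need not align with body orientation) is easy to make concrete. The following is a minimal path-integration sketch, not the authors' circuit model: it accumulates a home vector from body-frame velocities, so sideways drift is integrated along with forward motion. All names and the sample format are our own:

```python
import math

def integrate_path(steps):
    """Accumulate a home vector from (heading, v_forward, v_side, dt) samples.

    heading: compass direction of the body axis in radians.
    v_forward / v_side: ground speed along and perpendicular to the body
    axis, so sideways drift (holonomic motion) is integrated as well.
    Returns (distance_home, direction_home), the straight-line return vector.
    """
    x = y = 0.0
    for heading, v_fwd, v_side, dt in steps:
        # rotate the body-frame velocity into the world frame and accumulate
        x += (v_fwd * math.cos(heading) - v_side * math.sin(heading)) * dt
        y += (v_fwd * math.sin(heading) + v_side * math.cos(heading)) * dt
    distance = math.hypot(x, y)
    direction_home = math.atan2(-y, -x)  # points back toward the start
    return distance, direction_home
```

An out-and-back trip leaves a near-zero home vector, while pure sideways translation (body axis fixed, lateral speed nonzero) still displaces the agent, which is exactly the case a non-holonomic model would miss.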
Affiliation(s)
- Thomas Stone
- School of Informatics, University of Edinburgh, Edinburgh, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, Edinburgh, UK
- Andrea Adden
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Anna Honkanen
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Rachel Templin
- Queensland Brain Institute, University of Queensland, Brisbane, Australia
- William Wcislo
- Smithsonian Tropical Research Institute, Panama City, Panama
- Luca Scimeca
- School of Informatics, University of Edinburgh, Edinburgh, UK
- Eric Warrant
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
- Stanley Heinze
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden

21
Sabo C, Chisholm R, Petterson A, Cope A. A lightweight, inexpensive robotic system for insect vision. Arthropod Struct Dev 2017; 46:689-702. [PMID: 28818663] [DOI: 10.1016/j.asd.2017.08.001] [Citation(s) in RCA: 2]
Abstract
Designing hardware for miniaturized robotics that mimics the capabilities of flying insects is of interest because the two share similar constraints (i.e. small size, low weight and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects' impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware available. Suitable hardware is prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress in understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and the camera and insect vision models are then evaluated. We analyse the potential of the system for use in embodiment of higher-level visual processes (i.e. motion detection) and also for the development of vision-based navigation for robotics in general. Optic flow from sample camera data is calculated and compared to a perfect, simulated bee world, showing an excellent resemblance.
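As an example of the kind of higher-level visual process mentioned above (motion detection), the classic insect-inspired model is the Hassenstein-Reichardt elementary motion detector, which correlates a delayed photoreceptor signal with its neighbour's current signal. This sketch is illustrative of the general technique, not the paper's implementation:

```python
def reichardt_output(left, right, delay=1):
    """Hassenstein-Reichardt correlator over two photoreceptor signals.

    left, right: equal-length sequences of photoreceptor samples from two
    neighbouring points in the visual field. Each half-detector multiplies
    one receptor's delayed signal with the other's current signal; the
    difference of the two mirror-symmetric halves is direction selective:
    positive output for left-to-right motion, negative for right-to-left.
    """
    out = []
    for t in range(delay, len(left)):
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out
```

Feeding the detector a brightness pulse that reaches the right receptor one sample after the left yields a net positive response, and the reversed stimulus yields a net negative one, which is the direction selectivity a robotic embodiment would exploit.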
Affiliation(s)
- Chelsea Sabo
- University of Sheffield, Sheffield, S10 2TN, UK
- Alex Cope
- University of Sheffield, Sheffield, S10 2TN, UK

22
Bruck JN, Allen NA, Brass KE, Horn BA, Campbell P. Species differences in egocentric navigation: the effect of burrowing ecology on a spatial cognitive trait in mice. Anim Behav 2017. [DOI: 10.1016/j.anbehav.2017.02.023] [Citation(s) in RCA: 8]
23
Towne WF, Ritrovato AE, Esposto A, Brown DF. Honeybees use the skyline in orientation. J Exp Biol 2017; 220:2476-2485. [PMID: 28450409] [DOI: 10.1242/jeb.160002] [Citation(s) in RCA: 16]
Abstract
In view-based navigation, animals acquire views of the landscape from various locations and then compare the learned views with current views in order to orient in certain directions or move toward certain destinations. One landscape feature of great potential usefulness in view-based navigation is the skyline, the silhouette of terrestrial objects against the sky, as it is distant, relatively stable and easy to detect. The skyline has been shown to be important in the view-based navigation of ants, but no flying insect has yet been shown definitively to use the skyline in this way. Here, we show that honeybees do indeed orient using the skyline. A feeder was surrounded with an artificial replica of the natural skyline there, and the bees' departures toward the nest were recorded from above with a video camera under overcast skies (to eliminate celestial cues). When the artificial skyline was rotated, the bees' departures were rotated correspondingly, showing that the bees oriented by the artificial skyline alone. We discuss these findings in the context of the likely importance of the skyline in long-range homing in bees, the likely importance of altitude in using the skyline, the likely role of ultraviolet light in detecting the skyline, and what we know about the bees' ability to resolve skyline features.
Affiliation(s)
- William F Towne
- Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA
- Antonina Esposto
- Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA
- Duncan F Brown
- Department of Biology, Kutztown University of Pennsylvania, Kutztown, PA 19529, USA

24
Goldschmidt D, Manoonpong P, Dasgupta S. A Neurocomputational Model of Goal-Directed Navigation in Insect-Inspired Artificial Agents. Front Neurorobot 2017; 11:20. [PMID: 28446872] [PMCID: PMC5388780] [DOI: 10.3389/fnbot.2017.00020] [Citation(s) in RCA: 20]
Abstract
Despite their small size, insect brains are able to produce robust and efficient navigation in complex environments. In social insects in particular, such as ants and bees, these navigational capabilities are guided by orientation-directing vectors generated by a process called path integration. During this process, they integrate compass and odometric cues to estimate their current location as a vector, called the home vector, which guides them back home on a straight path. They further acquire and retrieve path-integration-based vector memories, anchored either globally to the nest or to visual landmarks. Although existing computational models have reproduced similar behaviors, a neurocomputational model of vector navigation that includes the acquisition of vector representations has not been described before. Here we present a model of neural mechanisms in a modular closed-loop control scheme that enables vector navigation in artificial agents. The model consists of a path integration mechanism, reward-modulated global learning, random search, and action selection. The path integration mechanism integrates compass and odometric cues to compute a vectorial representation of the agent's current location as neural activity patterns in circular arrays. A reward-modulated learning rule enables the acquisition of vector memories by associating the local food reward with the path integration state. A motor output is computed based on the combination of vector memories and random exploration. In simulation, we show that the neural mechanisms enable robust homing and localization, even in the presence of external sensory noise. The proposed learning rules lead to goal-directed navigation and route formation performed under realistic conditions. Consequently, we provide a novel approach for vector learning and navigation in a simulated, situated agent, linking behavioral observations to their possible underlying neural substrates.
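The circular-array representation mentioned above can be sketched as a population code: each cell has a preferred direction, and a vector is written into the array as a tuning-curve activity pattern and read back out as a population vector. The function names and the rectified-cosine tuning here are our assumptions for illustration, not the paper's exact model:

```python
import math

def encode_vector(distance, direction, n_cells=8):
    """Encode a 2-D vector as activity across n_cells with evenly spaced
    preferred directions (a circular array). Each cell's activity is a
    half-wave-rectified cosine tuning curve scaled by distance."""
    prefs = [2 * math.pi * i / n_cells for i in range(n_cells)]
    return [max(0.0, distance * math.cos(direction - p)) for p in prefs]

def decode_vector(activity):
    """Population-vector readout: sum each cell's preferred direction
    weighted by its activity; returns (magnitude, direction)."""
    n = len(activity)
    x = sum(a * math.cos(2 * math.pi * i / n) for i, a in enumerate(activity))
    y = sum(a * math.sin(2 * math.pi * i / n) for i, a in enumerate(activity))
    return math.hypot(x, y), math.atan2(y, x)
```

The readout recovers the encoded direction, and its magnitude scales linearly with the encoded distance (with a fixed gain set by the tuning and cell count), which is what makes such arrays usable as a path-integration state.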
Affiliation(s)
- Dennis Goldschmidt
- Bernstein Center for Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August University, Göttingen, Germany; Champalimaud Neuroscience Programme, Champalimaud Centre for the Unknown, Lisbon, Portugal
- Poramate Manoonpong
- Embodied AI and Neurorobotics Lab, Centre of BioRobotics, The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark

25
Held M, Berz A, Hensgen R, Muenz TS, Scholl C, Rössler W, Homberg U, Pfeiffer K. Microglomerular Synaptic Complexes in the Sky-Compass Network of the Honeybee Connect Parallel Pathways from the Anterior Optic Tubercle to the Central Complex. Front Behav Neurosci 2016; 10:186. [PMID: 27774056] [PMCID: PMC5053983] [DOI: 10.3389/fnbeh.2016.00186] [Citation(s) in RCA: 39]
Abstract
While the ability of honeybees to navigate relying on sky-compass information has been investigated in a large number of behavioral studies, the underlying neuronal system has so far received less attention. The sky-compass pathway has recently been described from its input region, the dorsal rim area (DRA) of the compound eye, to the anterior optic tubercle (AOTU). The aim of this study is to reveal the connection from the AOTU to the central complex (CX). For this purpose, we investigated the anatomy of large microglomerular synaptic complexes in the medial and lateral bulbs (MBUs/LBUs) of the lateral complex (LX). The synaptic complexes are formed by tubercle-lateral accessory lobe neuron 1 (TuLAL1) neurons of the AOTU and GABAergic tangential neurons of the central body’s (CB) lower division (TL neurons). Both TuLAL1 and TL neurons strongly resemble neurons forming these complexes in other insect species. We further investigated the ultrastructure of these synaptic complexes using transmission electron microscopy. We found that single large presynaptic terminals of TuLAL1 neurons enclose many small profiles (SPs) of TL neurons. The synaptic connections between these neurons are established by two types of synapses: divergent dyads and divergent tetrads. Our data support the assumption that these complexes are a highly conserved feature in the insect brain and play an important role in reliable signal transmission within the sky-compass pathway.
Affiliation(s)
- Martina Held
- Department of Biology, Animal Physiology, Philipps-University Marburg, Marburg, Germany
- Annuska Berz
- Department of Biology, Animal Physiology, Philipps-University Marburg, Marburg, Germany
- Ronja Hensgen
- Department of Biology, Animal Physiology, Philipps-University Marburg, Marburg, Germany
- Thomas S Muenz
- Biozentrum, Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
- Christina Scholl
- Biozentrum, Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
- Wolfgang Rössler
- Biozentrum, Behavioral Physiology and Sociobiology (Zoology II), University of Würzburg, Würzburg, Germany
- Uwe Homberg
- Department of Biology, Animal Physiology, Philipps-University Marburg, Marburg, Germany
- Keram Pfeiffer
- Department of Biology, Animal Physiology, Philipps-University Marburg, Marburg, Germany

26
Cheng K, Ronacher B. A champion of organismal biology. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2015; 201:513-5. [PMID: 25841648] [DOI: 10.1007/s00359-015-1004-9] [Citation(s) in RCA: 0]
Affiliation(s)
- Ken Cheng
- Macquarie University, Sydney, NSW, Australia