1
Beetz MJ. A perspective on neuroethology: what the past teaches us about the future of neuroethology. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2024; 210:325-346. [PMID: 38411712] [PMCID: PMC10995053] [DOI: 10.1007/s00359-024-01695-5]
Abstract
For 100 years, the Journal of Comparative Physiology A has significantly supported research in the field of neuroethology. The journal's centennial is a fitting occasion to appreciate recent progress in neuroethology and to discuss possible future directions for the field. Animal behavior is the main source of inspiration for neuroethologists, as illustrated by the huge diversity of behaviors and species under investigation. To explain behavior at a mechanistic level, neuroethologists combine neuroscientific approaches with sophisticated behavioral analysis. The rapid technological progress in neuroscience makes neuroethology a highly dynamic and exciting field of research. To summarize recent scientific progress in neuroethology, I reviewed all abstracts of the last six International Congresses of Neuroethology (ICNs 2010-2022) and categorized them by sensory modality, experimental model species, and research topic. This survey highlights the diversity of neuroethology and offers a perspective on the field's scientific future. Finally, I highlight three research topics that may, among others, shape the future of neuroethology. I hope that sharing my perspective may inspire other scientists to follow neuroethological approaches.
Affiliation(s)
- M Jerome Beetz
- Zoology II, Biocenter, University of Würzburg, 97074 Würzburg, Germany
2
Konnerth MM, Foster JJ, el Jundi B, Spaethe J, Beetz MJ. Monarch butterflies memorize the spatial location of a food source. Proc Biol Sci 2023; 290:20231574. [PMID: 38113939] [PMCID: PMC10730289] [DOI: 10.1098/rspb.2023.1574]
Abstract
Spatial memory helps animals navigate familiar environments. In insects, spatial memory has been studied extensively in central-place foragers such as ants and bees. However, whether butterflies memorize spatial locations remains unclear. Here, we conducted behavioural experiments to test whether monarch butterflies (Danaus plexippus) can remember and retrieve the spatial location of a food source. We placed several visually identical feeders in a flight cage, with only one feeder providing sucrose solution. Across multiple days, individual butterflies predominantly visited the rewarding feeder. Next, we displaced a salient landmark close to the feeders to test which visual cues the butterflies used to relocate the rewarding feeder. While the butterflies ignored occasional landmark displacements, which did not affect their decisions, systematic displacement of both the landmark and the rewarding feeder demonstrated that the butterflies associated the salient landmark with the feeder's position. Altogether, we show that butterflies consolidate and retrieve spatial memories in the context of foraging.
Affiliation(s)
- M. Marcel Konnerth
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
- James J. Foster
- Department of Biology, University of Konstanz, 78464 Konstanz, Baden-Württemberg, Germany
- Basil el Jundi
- Department of Biology, Norwegian University of Science and Technology, NO-7491 Trondheim, Norway
- Johannes Spaethe
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
- M. Jerome Beetz
- Zoology II, Biocenter, University of Würzburg, Am Hubland, 97074 Würzburg, Bayern, Germany
3
Quinlan PD, Katz PS. State-dependent, visually guided behaviors in the nudibranch Berghia stephanieae. J Exp Biol 2023; 226:jeb245213. [PMID: 37661725] [PMCID: PMC10560555] [DOI: 10.1242/jeb.245213]
Abstract
Nudibranch mollusks have structurally simple eyes whose behavioral roles have not been established. We tested the effects of visual stimuli on the behavior of the nudibranch Berghia stephanieae under different food and hunger conditions. In an arena that was half-shaded, animals spent most of their time in the dark, where they also decreased their speed and made more changes in heading. These behavioral differences between the light and dark were less evident in uniformly illuminated or darkened arenas, suggesting that they were not caused by the level of illumination. Berghia stephanieae responded to distant visual targets; animals approached a black stripe that was at least 15 deg wide on a white background. They did not approach a stripe that was lighter than the background but approached a stripe that was isoluminant with the background, suggesting the detection of spatial information. Animals traveled in convoluted paths in a featureless arena but straightened their paths when a visual target was present even if they did not approach it, suggesting that visual cues were used for navigation. Individuals were less responsive to visual stimuli when food deprived or in the presence of food odor. Thus, B. stephanieae exhibits visually guided behaviors that are influenced by odors and hunger state.
Affiliation(s)
- Phoenix D. Quinlan
- Neuroscience and Behavior Graduate Program, University of Massachusetts Amherst, Amherst, MA 01003, USA
- Department of Biology, University of Massachusetts Amherst, 611 North Pleasant Street, Amherst, MA 01003, USA
- Paul S. Katz
- Neuroscience and Behavior Graduate Program, University of Massachusetts Amherst, Amherst, MA 01003, USA
- Department of Biology, University of Massachusetts Amherst, 611 North Pleasant Street, Amherst, MA 01003, USA
4
Motion cues from the background influence associative color learning of honey bees in a virtual-reality scenario. Sci Rep 2021; 11:21127. [PMID: 34702914] [PMCID: PMC8548521] [DOI: 10.1038/s41598-021-00630-x]
Abstract
Honey bees exhibit remarkable visual learning capacities, which can be studied under laboratory conditions using virtual reality (VR) landscapes. Existing VR environments for bees are imperfect, as they provide either open-loop conditions or 2D displays. Here, we achieved a true 3D environment in which walking bees learned to discriminate a rewarded from a punished virtual stimulus based on color differences. We included ventral or frontal background cues, which were also subject to 3D updating based on the bee's movements. We thus studied whether and how the presence of such motion cues affected visual discrimination in our VR landscape. Our results showed that the presence of frontal, and to a lesser extent ventral, background motion cues impaired the bees' performance. Whenever these cues were suppressed, color discrimination learning became possible. We analyzed the specific contributions of foreground and background cues and discussed the roles of attentional interference and differences in stimulus salience in the VR environment in accounting for these results. Overall, we show how background and target cues may interact at the perceptual level and influence associative learning in bees. In addition, we identify issues that may affect decision-making in VR landscapes and require specific control by experimenters.
5
Paffhausen BH, Petrasch J, Wild B, Meurers T, Schülke T, Polster J, Fuchs I, Drexler H, Kuriatnyk O, Menzel R, Landgraf T. A Flying Platform to Investigate Neuronal Correlates of Navigation in the Honey Bee (Apis mellifera). Front Behav Neurosci 2021; 15:690571. [PMID: 34354573] [PMCID: PMC8329708] [DOI: 10.3389/fnbeh.2021.690571]
Abstract
Navigating animals combine multiple perceptual faculties, learn during exploration, retrieve multifaceted memory contents, and exhibit goal-directedness as an expression of their current needs and motivations. Navigation in insects has been linked to a variety of underlying strategies, such as path integration, view familiarity, visual beaconing, and goal-directed orientation with respect to previously learned ground structures. Most studies, however, either examine navigation from a field perspective, analyzing purely behavioral observations, or combine computational models with neurophysiological evidence obtained from laboratory experiments. The honey bee (Apis mellifera) has long been a popular model in the search for neural correlates of complex behaviors and exhibits extraordinary navigational capabilities. However, the neural basis of bee navigation has not yet been explored under natural conditions. Here, we propose a novel methodology for recording from the brain of a copter-mounted honey bee. This way, the animal experiences natural multimodal sensory input in a natural environment that is familiar to her. We developed a miniaturized electrophysiology recording system that can record spikes in the presence of time-varying electric noise from the copter's motors and rotors, and devised an experimental procedure for recording from mushroom body extrinsic neurons (MBENs). We analyzed the resulting electrophysiological data together with a reconstruction of the animal's visual perception and found that the neural activity of MBENs is linked to sharp turns, possibly related to the relative motion of visual features. This method is a significant technological step toward recording brain activity of navigating honey bees under natural conditions. By providing all system specifications in an online repository, we hope to close a methodological gap and stimulate further research informing future computational models of insect navigation.
Affiliation(s)
- Benjamin H Paffhausen
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Julian Petrasch
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Benjamin Wild
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Thierry Meurers
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Tobias Schülke
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Johannes Polster
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Inga Fuchs
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Helmut Drexler
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Oleksandra Kuriatnyk
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Randolf Menzel
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Tim Landgraf
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany