1
Lafon G, Paoli M, Paffhausen BH, Sanchez GDB, Lihoreau M, Avarguès-Weber A, Giurfa M. Efficient visual learning by bumble bees in virtual-reality conditions: Size does not matter. Insect Sci 2023; 30:1734-1748. PMID: 36734172. DOI: 10.1111/1744-7917.13181.
Abstract
Recent developments have allowed establishing virtual-reality (VR) setups to study multiple aspects of visual learning in honey bees under controlled experimental conditions. Here, we adopted a VR environment to investigate visual learning in the buff-tailed bumble bee Bombus terrestris. Based on responses to the appetitive and aversive reinforcements used for conditioning, we show that bumble bees had the appetitive motivation required to engage in the VR experiments and that they efficiently learned elemental color discriminations. In doing so, they reduced their choice latency, increased the proportion of direct paths toward the virtual stimuli, and walked faster toward them. In a short-term retention test, bumble bees chose, and fixated longer on, the correct stimulus in the absence of reinforcement. Body size and weight, although variable across individuals, did not affect cognitive performance and had only a mild impact on motor performance. Overall, we show that bumble bees are suitable subjects for experiments on visual learning under VR conditions, which opens important perspectives for invasive studies of the neural and molecular bases of such learning, given the robustness of these insects and the accessibility of their brain.
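Editorial aside: the path metrics summarized above (more "direct paths" toward the virtual stimuli) are typically quantified as a straightness index, the ratio of beeline distance to total distance walked. A minimal sketch of that general metric; the function name, example paths, and sampling are illustrative assumptions, not the authors' code or data.

```python
# Hypothetical straightness index for a walked path (beeline / path length).
# 1.0 = perfectly direct; values near 0 = highly convoluted.
import math

def straightness(path):
    """path: list of (x, y) positions sampled along the walk."""
    total = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    beeline = math.dist(path[0], path[-1])
    return beeline / total if total > 0 else 0.0

direct = [(0, 0), (1, 1), (2, 2)]                 # straight walk to a target
meander = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2)]
print(straightness(direct))                        # → 1.0
print(round(straightness(meander), 2))             # → 0.5
```

A rising index across trials is one way such "more direct paths" could be expressed numerically.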
Affiliation(s)
- Gregory Lafon
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Marco Paoli
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Benjamin H Paffhausen
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Gabriela de Brito Sanchez
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Mathieu Lihoreau
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Aurore Avarguès-Weber
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Martin Giurfa
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- French Academy of Sciences for University Professors, Institut Universitaire de France (IUF), Paris, France
2
Claverie N, Buvat P, Casas J. Active Sensing in Bees Through Antennal Movements Is Independent of Odor Molecule. Integr Comp Biol 2023; 63:315-331. PMID: 36958852. DOI: 10.1093/icb/icad010.
Abstract
When sampling odors, many insects move their antennae in a complex but repeatable fashion. Previous studies with bees have tracked antennal movements in only two dimensions, at a low sampling rate, and with relatively few odorants. A detailed characterization of the multimodal antennal movement patterns as a function of olfactory stimuli is thus needed. The aim of this study was to test for a relationship between scanning movements and the properties of the odor molecule. We tracked several key locations on the antennae of bumblebees at high frequency and in three dimensions while stimulating the insects with puffs of 11 common odorants released in a low-speed continuous flow. Water and paraffin were used as negative controls. Movement analysis was performed with the neural-network tracking software DeepLabCut. Bees use a stereotypical oscillating motion of their antennae when smelling odors, similar across all bees and independent of odor identity, and hence of odor diffusivity and vapor pressure. The variability in movement amplitude among odors is as large as that between individuals. The main type of oscillation, at low frequency and large amplitude, is triggered by the presence of an odor and is in line with previous work, as is the speed of movement. A second oscillation mode, at higher frequency and smaller amplitude, is constantly present. Antennae are quickly deployed when a stimulus is perceived, their movement trajectories decorrelate rapidly, and they oscillate vertically with a large amplitude and laterally with a smaller one. The cone of airspace thus sampled was identified through a 3D analysis of the motion patterns. The amplitude and speed of antennal scanning movements seem to be a function of the internal state of the animal rather than determined by the odorant. Still, bees display an active olfactory sampling strategy. First, they deploy their antennae when perceiving an odor. Second, fast vertical scanning movements further increase the odorant capture rate. Finally, lateral movements may enhance the likelihood of locating the odor source, similar to the lateral scanning movements of insects at odor plume boundaries.
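Editorial aside: the two oscillation modes described above (slow/large-amplitude vs. fast/small-amplitude) are the kind of feature one can extract from tracked keypoint trajectories with a simple spectrum. A hedged sketch of that general approach; the frame rate, frequencies, and amplitudes below are invented for illustration and are not the study's values.

```python
# Sketch: find the dominant oscillation frequency of a 1-D tracked
# coordinate (e.g., an antennal tip's vertical position) via an FFT.
import numpy as np

def dominant_frequency(z, fs):
    """Peak frequency (Hz) of a detrended 1-D trajectory sampled at fs Hz."""
    z = np.asarray(z, dtype=float)
    z = z - z.mean()                         # remove DC offset
    spectrum = np.abs(np.fft.rfft(z))
    freqs = np.fft.rfftfreq(z.size, d=1.0 / fs)
    spectrum[0] = 0.0                        # ignore any residual DC
    return freqs[np.argmax(spectrum)]

fs = 500.0                                   # assumed tracking frame rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic trajectory: slow large-amplitude sweep + fast small ripple,
# mimicking the two reported oscillation modes.
z = 2.0 * np.sin(2 * np.pi * 3.0 * t) + 0.2 * np.sin(2 * np.pi * 40.0 * t)

print(dominant_frequency(z, fs))             # slow mode dominates: ~3.0 Hz
```

The large-amplitude mode dominates the spectrum; band-limiting the signal before the FFT would isolate the faster, smaller mode instead.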
Affiliation(s)
- Nicolas Claverie
- Institut de Recherche sur la Biologie de l'Insecte, Université de Tours, 37200 Tours, France
- CEA le Ripault, Centre d'études du Ripault, 37260 Monts, France
- Pierrick Buvat
- CEA le Ripault, Centre d'études du Ripault, 37260 Monts, France
- Jérôme Casas
- Institut de Recherche sur la Biologie de l'Insecte, Université de Tours, 37200 Tours, France
3
Kobayashi N, Hasegawa Y, Okada R, Sakura M. Visual learning in tethered bees modifies flight orientation and is impaired by epinastine. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36930349. DOI: 10.1007/s00359-023-01623-z.
Abstract
Visual-orientation learning of tethered flying bees was investigated using a flight simulator and a novel protocol in which orientation preference toward trained visual targets was assessed in tests performed before and after appetitive conditioning. Either a blue or a green rectangle (conditioned stimulus, CS) was associated with 30% sucrose solution (unconditioned stimulus, US), whereas the other rectangle was not paired with the US. Bees were tested in a closed-loop flight simulator 5 min after ten pairings of the CS and US. Conditioned bees preferentially oriented to the CS after such training. This increased preference for the CS was maintained for 24 h, indicating the presence of long-term memory. Because the total orienting time was not altered by conditioning, conditioning did not enhance orientation activity itself but increased the relative time spent orienting toward the CS. When 0.4 or 4 mM epinastine (an octopamine-receptor antagonist) was injected into the bee's head 30 min prior to the experiment, both short- and long-term memory formation were significantly impaired, suggesting that octopamine, which is crucial for appetitive olfactory learning in insects, is also involved in visual orientation learning.
Affiliation(s)
- Norihiro Kobayashi
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
- Ryuichi Okada
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
- Midori Sakura
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
4
Multimodal Information Processing and Associative Learning in the Insect Brain. Insects 2022; 13:332. PMID: 35447774. PMCID: PMC9033018. DOI: 10.3390/insects13040332.
Abstract
Simple Summary: Insect behaviors are a great indicator of evolution and provide useful information about the complexity of organisms. The realistic sensory scene of an environment is complex and replete with multisensory inputs, making the study of the sensory integration that leads to behavior highly relevant. In this review, we summarize recent findings on multimodal sensory integration and the behaviors that originate from it.
Abstract: The study of sensory systems in insects spans almost an entire century. Olfaction, vision, and gustation are thoroughly researched in several robust insect models, and new discoveries are made every day on the more elusive thermo- and mechanosensory systems. A few specialized senses, such as hygro- and magnetoreception, have also been identified in some insects. In light of recent advancements in the scientific investigation of insect behavior, it is important to study sensory modalities not only individually but also as combinations of multimodal inputs. This is of particular significance, as a combinatorial approach to studying sensory behaviors mimics the real-time environment of an insect, with the wide spectrum of information available to it. As a fascinating field that is gaining new insight, multimodal integration in insects serves as a fundamental basis for understanding complex insect behaviors including, but not limited to, navigation, foraging, learning, and memory. In this review, we summarize various studies that investigated sensory integration across modalities, with emphasis on three insect models (honeybees, ants, and flies), their behaviors, and the corresponding neuronal underpinnings.
5
Lafon G, Geng H, Avarguès-Weber A, Buatois A, Massou I, Giurfa M. The Neural Signature of Visual Learning Under Restrictive Virtual-Reality Conditions. Front Behav Neurosci 2022; 16:846076. PMID: 35250505. PMCID: PMC8888666. DOI: 10.3389/fnbeh.2022.846076.
Abstract
Honey bees are reputed for their remarkable visual learning and navigation capabilities. These capacities can be studied in virtual reality (VR) environments, which allow studying the performance of tethered animals in stationary flight or walk under full control of the sensory environment. Here, we used a 2D VR setup in which a tethered bee walking stationary under restrictive closed-loop conditions learned to discriminate vertical rectangles differing in color and reinforcing outcome. Closed-loop conditions restricted stimulus control to lateral displacements. Consistent with prior VR analyses, bees learned to discriminate the trained stimuli. Ex vivo analyses of the brains of learners and non-learners showed that successful learning led to a downregulation of three immediate early genes in the main regions of the visual circuit, the optic lobes (OLs) and the calyces of the mushroom bodies (MBs). While Egr1 was downregulated in the OLs, Hr38 and kakusei were coincidentally downregulated in the calyces of the MBs. Our work thus reveals that color discrimination learning induced a neural signature distributed along the sequential pathway of color processing that is consistent with an inhibitory trace. This trace may relate to the motor patterns required to solve the discrimination task, which differ from those underlying pathfinding in 3D VR scenarios that allow navigation and exploratory learning and that lead to IEG upregulation.
Affiliation(s)
- Gregory Lafon
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Haiyang Geng
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Aurore Avarguès-Weber
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Alexis Buatois
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Isabelle Massou
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Martin Giurfa
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Institut Universitaire de France, Paris, France
6
Visual learning in a virtual reality environment upregulates immediate early gene expression in the mushroom bodies of honey bees. Commun Biol 2022; 5:130. PMID: 35165405. PMCID: PMC8844430. DOI: 10.1038/s42003-022-03075-8.
Abstract
Free-flying bees learn efficiently to solve numerous visual tasks. Yet the neural underpinnings of this capacity remain unexplored. We used a 3D virtual reality (VR) environment to study visual learning and to determine whether it leads to changes in immediate early gene (IEG) expression in specific areas of the bee brain. We focused on kakusei, Hr38, and Egr1, three IEGs that have been related to bee foraging and orientation, and compared their relative expression in the calyces of the mushroom bodies, the optic lobes, and the rest of the brain after color discrimination learning. Bees learned to discriminate virtual stimuli displaying different colors and retained the information learned. Successful learners exhibited Egr1 upregulation only in the calyces of the mushroom bodies, uncovering a privileged involvement of these brain regions in associative color learning and the usefulness of Egr1 as a marker of the neural activity induced by this phenomenon.
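Editorial aside: relative gene expression of the kind compared here is commonly quantified with the 2^-ΔΔCt (Livak) method. The sketch below illustrates that general method only; it is not the authors' pipeline, and every cycle-threshold value is invented for illustration.

```python
# Illustrative 2^-DeltaDeltaCt fold-change calculation (Livak method):
# normalize a target gene to a reference gene, then compare a sample
# group (e.g., learners) to a control group (e.g., non-learners).

def relative_expression(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of the target gene in the sample vs. the control group."""
    d_ct_sample = ct_target - ct_ref            # normalize sample to reference
    d_ct_control = ct_target_ctrl - ct_ref_ctrl # normalize control to reference
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)

# Hypothetical qPCR cycle thresholds (lower Ct = more transcript):
fold = relative_expression(ct_target=24.0, ct_ref=18.0,
                           ct_target_ctrl=26.0, ct_ref_ctrl=18.0)
print(fold)   # → 4.0: target gene ~4-fold upregulated vs. control
```

A fold change above 1 would correspond to upregulation such as the Egr1 effect reported above.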
7
Motion cues from the background influence associative color learning of honey bees in a virtual-reality scenario. Sci Rep 2021; 11:21127. PMID: 34702914. PMCID: PMC8548521. DOI: 10.1038/s41598-021-00630-x.
Abstract
Honey bees exhibit remarkable visual learning capacities, which can be studied using virtual reality (VR) landscapes in laboratory conditions. Existing VR environments for bees are imperfect, as they provide either open-loop conditions or 2D displays. Here we achieved a true 3D environment in which walking bees learned to discriminate a rewarded from a punished virtual stimulus based on color differences. We included ventral or frontal background cues, which were also subjected to 3D updating based on the bee's movements. We thus studied whether and how the presence of such motion cues affected visual discrimination in our VR landscape. Our results showed that the presence of frontal, and to a lesser extent ventral, background motion cues impaired the bees' performance. Whenever these cues were suppressed, color discrimination learning became possible. We analyzed the specific contributions of foreground and background cues and discuss the role of attentional interference and of differences in stimulus salience in the VR environment in accounting for these results. Overall, we show how background and target cues may interact at the perceptual level and influence associative learning in bees. In addition, we identify issues that may affect decision-making in VR landscapes and that require specific control by experimenters.
8
Visuo-Motor Feedback Modulates Neural Activities in the Medulla of the Honeybee, Apis mellifera. J Neurosci 2021; 41:3192-3203. PMID: 33608383. DOI: 10.1523/JNEUROSCI.1824-20.2021.
Abstract
Behavioral and internal-state modulation of sensory processing has been described in several organisms. In insects, visual neurons in the optic lobe are modulated by locomotion, but the degree to which visuo-motor feedback modulates these neurons remains unclear. Moreover, it also remains unknown whether self-generated and externally generated visual motion are processed differently. Here, we implemented a virtual reality system that allowed fine-scale control over visual stimulation in relation to animal motion, in combination with multichannel recording of neural activity in the medulla of female honeybees (Apis mellifera). We found that this activity was modulated by locomotion, although in most cases only when the bee had behavioral control over the visual stimulus (i.e., in a closed-loop system). Closed-loop control modulated a third of the recorded neurons, and the application of octopamine (OA) evoked changes in neural responses similar to those observed in closed loop. Additionally, in a subset of modulated neurons, fixation on a visual stimulus was preceded by an increase in firing rate. To further explore the relationship between neuromodulation and adaptive control of the bee's visual environment, we modified motor gain sensitivity while locally injecting an OA receptor antagonist into the medulla. Whereas honeybees were tuned to a motor gain of -2 to 2 (between the bee's heading and its visual feedback), local disruption of the OA pathway in the medulla abolished this tuning, resulting in similarly low response levels across motor gains. Our results show that behavioral control modulates neural activity in the medulla and ultimately impacts behavior.
Significance Statement: When moving, an animal generates motion of the visual scene over its retina. We asked whether self-generated and externally generated optic flow are processed differently in the insect medulla. Our results show that closed-loop control of the visual stimulus modulates neural activity as early as the medulla and ultimately impacts behavior. Moreover, blocking octopaminergic modulation further disrupted object-tracking responses. Our results suggest that the medulla is an important site for context-dependent processing of visual information and that placing the animal in a closed-loop environment may be essential to understanding its visual cognition and processing.
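Editorial aside: the "motor gain" manipulated above can be read as a multiplier between the bee's own turning and the resulting rotation of the visual panorama. A minimal closed-loop sketch of that idea; the function name and the sign convention are chosen for illustration and are not the authors' implementation (the study probed gains between -2 and 2).

```python
# Sketch: closed-loop panorama update with an adjustable motor gain.
# gain = 1  -> natural closed loop (scene counter-rotates to the bee's turn)
# gain = -1 -> mirrored feedback
# gain = 0  -> open loop (scene ignores the bee)

def panorama_update(scene_angle, bee_turn, gain):
    """Return the new scene angle (deg) after a bee heading change (deg)."""
    return scene_angle - gain * bee_turn   # scene rotates against the turn

angle = 0.0
for turn in [5.0, 5.0, -2.0]:              # sequence of bee heading changes
    angle = panorama_update(angle, turn, gain=1.0)
print(angle)   # → -8.0: scene rotated opposite the net 8 deg of turning
```

Sweeping the gain parameter while recording, as described above, is what reveals the tuning of the neural responses to the degree of behavioral control.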
9
Abstract
With fewer than a million neurons, the western honeybee Apis mellifera is capable of complex olfactory behaviors and provides an ideal model for investigating the neurophysiology of the olfactory circuit and the basis of olfactory perception and learning. Here, we review the most fundamental aspects of honeybee olfaction: first, we discuss which odorants dominate its environment and how bees use them to communicate and regulate colony homeostasis; then, we describe the neuroanatomy and neurophysiology of the olfactory circuit; finally, we explore the cellular and molecular mechanisms leading to olfactory memory formation. The wealth of histological, neurophysiological, and behavioral data collected during the last century, together with new technological advancements, including genetic tools, confirms the honeybee as an attractive research model for understanding olfactory coding and learning.
Affiliation(s)
- Marco Paoli
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, 31062, Toulouse, France
- Giovanni C Galizia
- Department of Neuroscience, University of Konstanz, 78457, Konstanz, Germany
10
Goulard R, Buehlmann C, Niven JE, Graham P, Webb B. A motion compensation treadmill for untethered wood ants (Formica rufa): evidence for transfer of orientation memories from free-walking training. J Exp Biol 2020; 223:jeb228601. PMID: 33443039. PMCID: PMC7774907. DOI: 10.1242/jeb.228601.
Abstract
The natural scale of insect navigation during foraging makes it challenging to study under controlled conditions. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but the potential limitations and confounds introduced by tethering motivate the development of alternative, untethered solutions. In this paper, we validate the use of a motion compensator (or 'treadmill') to study the visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and preserves foraging motivation over long time frames. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility of studying navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments.
Summary: We have developed and validated a motion-compensating treadmill for wood ants, which opens new perspectives for studying insect navigation behaviour in a fully controlled manner over ecologically relevant durations.
Affiliation(s)
- Roman Goulard
- School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
- Jeremy E Niven
- University of Sussex, School of Life Sciences, Brighton BN1 9QG, UK
- Paul Graham
- University of Sussex, School of Life Sciences, Brighton BN1 9QG, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
11
Kaushik PK, Olsson SB. Using virtual worlds to understand insect navigation for bio-inspired systems. Curr Opin Insect Sci 2020; 42:97-104. PMID: 33010476. DOI: 10.1016/j.cois.2020.09.010.
Abstract
Insects perform a wide array of intricate behaviors over large spatial and temporal scales in complex natural environments. A mechanistic understanding of insect cognition has direct implications for how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots. Virtual reality (VR) offers an opportunity to assess insect neuroethology while presenting complex yet controlled stimuli. Here, we discuss the use of insects as inspiration for artificial systems, recent advances in different VR technologies, current knowledge gaps, and the potential for applying insect VR research to bio-inspired robots. Finally, we advocate the need to diversify our model organisms and behavioral paradigms, and to embrace the complexity of the natural world. This will help us uncover the proximate and ultimate bases of brain and behavior and extract general principles for common challenging problems.
Affiliation(s)
- Pavan Kumar Kaushik
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India.
- Shannon B Olsson
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India
12
Buatois A, Laroche L, Lafon G, Avarguès-Weber A, Giurfa M. Higher-order discrimination learning by honeybees in a virtual environment. Eur J Neurosci 2020; 51:681-694. DOI: 10.1111/ejn.14633.
Affiliation(s)
- Alexis Buatois
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Lou Laroche
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Gregory Lafon
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Aurore Avarguès-Weber
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Martin Giurfa
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- College of Animal Science (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Institut Universitaire de France (IUF), France
13
Little CM, Chapman TW, Hillier NK. Considerations for Insect Learning in Integrated Pest Management. J Insect Sci 2019; 19:6. PMID: 31313814. PMCID: PMC6635889. DOI: 10.1093/jisesa/iez064.
Abstract
The past 100 years have seen dramatic philosophical shifts in our approach to controlling or managing pest species. The introduction of integrated pest management in the 1970s resulted in the incorporation of biological and behavioral approaches to preserve ecosystems and reduce reliance on synthetic chemical pesticides. Increased understanding of the local ecosystem, including its structure and the biology of its species, can improve the efficacy of integrated pest management strategies. Pest management strategies that incorporate insect learning paradigms to control insect pests, or that use insects to control other pests, can mediate risk to nontarget insects, including pollinators. Although our understanding of insect learning is in its early stages, efforts to integrate insect learning into pest management strategies have been promising. Because of considerable differences in cognitive abilities among insect species, a case-by-case assessment is needed for each potential application of insect learning within a pest management strategy.
Affiliation(s)
- Catherine M Little
- Department of Biology, Acadia University, Wolfville, NS, Canada
- Department of Biology, Memorial University of Newfoundland and Labrador, St. John’s, NL, Canada
- Thomas W Chapman
- Department of Biology, Memorial University of Newfoundland and Labrador, St. John's, NL, Canada
- N Kirk Hillier
- Department of Biology, Acadia University, Wolfville, NS, Canada
14
Honeybees foraging for numbers. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 205:439-450. DOI: 10.1007/s00359-019-01344-2.
15
Zwaka H, Bartels R, Lehfeldt S, Jusyte M, Hantke S, Menzel S, Gora J, Alberdi R, Menzel R. Learning and Its Neural Correlates in a Virtual Environment for Honeybees. Front Behav Neurosci 2019; 12:279. PMID: 30740045. PMCID: PMC6355692. DOI: 10.3389/fnbeh.2018.00279.
Abstract
The search for neural correlates of operant and observational learning requires combining two experimental conditions that are very difficult to reconcile: stable recording from higher-order neurons and free movement of the animal in a rather natural environment. We developed a virtual environment (VE) that simulates a simplified 3D world for honeybees walking stationary on an air-supported spherical treadmill. We show that honeybees perceive the stimuli in the VE as meaningful by transferring learned information from free flight to the virtual world. In search of neural correlates of learning in the VE, mushroom body extrinsic neurons were recorded over days during learning. We found changes in neural activity specific to the rewarded and unrewarded visual stimuli. Our results suggest an involvement of mushroom body extrinsic neurons in operant learning in the honeybee (Apis mellifera).
Affiliation(s)
- Hanna Zwaka
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Molecular and Cellular Biology, Harvard University, Cambridge, MA, United States
- Ruth Bartels
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Sophie Lehfeldt
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Meida Jusyte
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Sören Hantke
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Simon Menzel
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Jacob Gora
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Rafael Alberdi
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Randolf Menzel
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
16
Buatois A, Flumian C, Schultheiss P, Avarguès-Weber A, Giurfa M. Transfer of Visual Learning Between a Virtual and a Real Environment in Honey Bees: The Role of Active Vision. Front Behav Neurosci 2018; 12:139. PMID: 30057530. PMCID: PMC6053632. DOI: 10.3389/fnbeh.2018.00139.
Abstract
To study visual learning in honey bees, we developed a virtual reality (VR) system in which the movements of a tethered bee walking stationary on a spherical treadmill update the visual panorama presented in front of it (closed-loop conditions), thus creating an experience of immersion within a virtual environment. In parallel, we developed a small Y-maze with interchangeable end-boxes, which allowed a freely walking bee to be returned repeatedly to the starting point of the maze for successive decision recordings. Using conditioning and transfer experiments between the VR setup and the Y-maze, we studied the extent to which movement freedom and active vision are crucial for learning a simple color discrimination. Approximately 57% of the bees learned the visual discrimination in both conditions. Transfer from VR to the maze significantly improved the bees' performance: 75% of bees having chosen the CS+ continued doing so and 100% of bees having chosen the CS− reverted their choice in favor of the CS+. In contrast, no improvement was seen for these two groups of bees during the reciprocal transfer from the Y-maze to VR. In this case, bees exhibited inconsistent choices in the VR setup. The asymmetric transfer between contexts indicates that the information learned in each environment may differ despite the similar learning success. Moreover, it shows that reducing the possibility of active vision and movement freedom in the passage from the maze to the VR impairs the expression of visual learning, while increasing them in the reciprocal transfer improves it. Our results underline the active nature of visual processing in bees and highlight the developments required for immersive VR experiences in insects.
Affiliation(s)
- Alexis Buatois
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Clara Flumian
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Patrick Schultheiss
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Aurore Avarguès-Weber
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Martin Giurfa
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
|
17
|
Schultheiss P, Buatois A, Avarguès-Weber A, Giurfa M. Using virtual reality to study visual performances of honeybees. CURRENT OPINION IN INSECT SCIENCE 2017; 24:43-50. [PMID: 29208222 DOI: 10.1016/j.cois.2017.08.003] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/26/2017] [Revised: 08/14/2017] [Accepted: 08/24/2017] [Indexed: 06/07/2023]
Abstract
Virtual reality (VR) offers an appealing experimental framework for studying visual performances of insects under highly controlled conditions. In the case of the honeybee Apis mellifera, this possibility may fill the gap between behavioural analyses in free flight and cellular analyses in the laboratory. Using automated, computer-controlled systems, it is possible to generate virtual stimuli or even entire environments that can be modified to test hypotheses on bee visual behaviour. The bee itself can remain tethered in place, making it possible to record neural activity while the bee is performing behavioural tasks. Recent studies have examined visual navigation and attentional processes in VR on flying or walking tethered bees, but experimental paradigms for examining visual learning and memory are only just emerging. Behavioural performances of bees under current experimental conditions are often lower in VR than in natural environments, but further improvements to current experimental protocols seem possible. Here we discuss current developments and conclude that it is essential to tailor the specifications of the VR simulation to the visual processing of honeybees to improve the success of this research endeavour.
Affiliation(s)
- Patrick Schultheiss
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 Route de Narbonne, 31062 Toulouse cedex 09, France
- Alexis Buatois
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 Route de Narbonne, 31062 Toulouse cedex 09, France
- Aurore Avarguès-Weber
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 Route de Narbonne, 31062 Toulouse cedex 09, France
- Martin Giurfa
- Research Centre on Animal Cognition, Centre for Integrative Biology, CNRS, University of Toulouse, 118 Route de Narbonne, 31062 Toulouse cedex 09, France
|