1
Lafon G, Paoli M, Paffhausen BH, Sanchez GDB, Lihoreau M, Avarguès-Weber A, Giurfa M. Efficient visual learning by bumble bees in virtual-reality conditions: Size does not matter. Insect Sci 2023; 30:1734-1748. [PMID: 36734172] [DOI: 10.1111/1744-7917.13181]
Abstract
Recent developments have allowed establishing virtual-reality (VR) setups to study multiple aspects of visual learning in honey bees under controlled experimental conditions. Here, we adopted a VR environment to investigate visual learning in the buff-tailed bumble bee Bombus terrestris. Based on responses to the appetitive and aversive reinforcements used for conditioning, we show that bumble bees had the appetitive motivation necessary to engage in the VR experiments and that they efficiently learned elemental color discriminations. In doing so, they reduced their latency to make a choice, increased the proportion of direct paths toward the virtual stimuli, and walked faster toward them. Performance in a short-term retention test showed that bumble bees chose the correct stimulus and fixated on it longer in the absence of reinforcement. Body size and weight, although variable across individuals, did not affect cognitive performance and had only a mild impact on motor performance. Overall, we show that bumble bees are suitable experimental subjects for visual-learning experiments under VR conditions, which opens important perspectives for invasive studies of the neural and molecular bases of such learning, given the robustness of these insects and the accessibility of their brain.
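The walking metrics summarized here (proportion of direct paths, walking speed) can be derived from tracked positions on the treadmill. A minimal sketch, assuming positions sampled at a fixed rate; the straightness index used below (net displacement over path length) is an illustrative choice, not necessarily the authors' definition:

```python
# Minimal sketch: path directness and walking speed from tracked 2D positions.
# Inputs and the straightness definition are illustrative assumptions.
import numpy as np

def path_metrics(xy, dt):
    """xy: (n, 2) array of positions sampled every dt seconds."""
    steps = np.diff(xy, axis=0)                       # per-frame displacement vectors
    path_length = np.sum(np.linalg.norm(steps, axis=1))
    net_displacement = np.linalg.norm(xy[-1] - xy[0])
    straightness = net_displacement / path_length if path_length > 0 else np.nan
    mean_speed = path_length / (dt * len(steps)) if len(steps) else np.nan
    return straightness, mean_speed

# Example: a slightly meandering walk toward a target.
rng = np.random.default_rng(0)
track = np.cumsum(np.column_stack([np.full(100, 0.1),
                                   rng.normal(0, 0.02, 100)]), axis=0)
print(path_metrics(track, dt=0.05))
```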
Affiliation(s)
- Gregory Lafon
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Marco Paoli
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Benjamin H Paffhausen
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Gabriela de Brito Sanchez
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Mathieu Lihoreau
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Aurore Avarguès-Weber
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Martin Giurfa
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- French Academy of Sciences for University Professors, Institut Universitaire de France (IUF), Paris, France
2
Vinauger C, Riffell JA. Retrospective Review of and Introduction to the Analysis of Mosquito Optomotor Responses. Cold Spring Harb Protoc 2023; 2023:614-617. [PMID: 36997277] [DOI: 10.1101/pdb.top107672]
Abstract
Adult hematophagous female mosquitoes require nutrients and proteins from vertebrate blood to produce progeny. To find these hosts, mosquitoes rely on olfactory, thermal, and visual cues. Among these sensory modalities, vision has received far less attention than olfaction, in part because of a lack of experimental tools providing sufficient control over the delivery of visual stimuli and the recording of mosquito responses. Although free-flight experiments (e.g., wind tunnel and cage assays) ensure higher ecological relevance and allow the observation of more natural flight dynamics, tethered-flight assays offer a greater level of control over the suite of sensory stimuli experienced by mosquitoes. In addition, these tethered assays provide a stepping stone toward understanding the neural underpinnings of mosquito optomotor behavior. Advances in computer-vision tracking systems and programmable light-emitting diode displays have permitted significant discoveries in models such as the fly Drosophila melanogaster. Here, we introduce the use of these methods with mosquitoes.
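The programmable LED displays mentioned here are typically driven frame by frame with a drifting grating as the optomotor stimulus. A purely illustrative sketch of how such frames could be generated; the panel dimensions, drift speed, and output format are assumptions, not the protocol described in this paper:

```python
# Illustrative sketch: frames of a horizontally drifting binary grating for a
# hypothetical 96 x 16 pixel LED panel. Real arenas use their own drivers.
import numpy as np

def grating_frame(t, n_cols=96, n_rows=16, spatial_period_px=16, speed_px_s=32.0):
    """Binary grating shifted by speed * t pixels; returns an (n_rows, n_cols) 0/1 array."""
    phase = (speed_px_s * t) % spatial_period_px
    cols = (np.arange(n_cols) + phase) % spatial_period_px
    row = (cols < spatial_period_px / 2).astype(np.uint8)   # half period on, half off
    return np.tile(row, (n_rows, 1))

# Example: three frames sampled 100 ms apart; a tracking system would log the
# mosquito's turning or wingbeat response in parallel.
for i in range(3):
    print(grating_frame(t=i / 10.0)[0, :8])
```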
Affiliation(s)
- Clément Vinauger
- Department of Biochemistry, Virginia Polytechnic Institute and State University, Blacksburg, Virginia 24061, USA
- Jeffrey A Riffell
- Department of Biology, University of Washington, Seattle, Washington 98195, USA
3
Kobayashi N, Hasegawa Y, Okada R, Sakura M. Visual learning in tethered bees modifies flight orientation and is impaired by epinastine. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 36930349] [DOI: 10.1007/s00359-023-01623-z]
Abstract
Visual-orientation learning in tethered flying bees was investigated using a flight simulator and a novel protocol in which orientation preference toward trained visual targets was assessed in tests performed before and after appetitive conditioning. Either a blue or a green rectangle (conditioned stimulus, CS) was associated with 30% sucrose solution (unconditioned stimulus, US), whereas the other rectangle was not paired with the US. Bees were tested in a closed-loop flight simulator 5 min after ten pairings of the CS and US. Conditioned bees preferentially oriented toward the CS after such training. This increase in preference for the CS was maintained for 24 h, indicating the presence of long-term memory. Because the total orienting time was not altered by conditioning, conditioning did not enhance orientation activity itself but increased the relative time spent orienting toward the CS. When 0.4 or 4 mM epinastine (an antagonist of octopamine receptors) was injected into the bee's head 30 min prior to the experiment, both short- and long-term memory formation were significantly impaired, suggesting that octopamine, which is crucial for appetitive olfactory learning in insects, is also involved in visual-orientation learning.
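The key measure is the relative time spent oriented toward the CS rather than total orientation activity. A minimal sketch of one way such a preference score could be computed from per-frame headings in a closed-loop simulator; the angular window and index definition are assumptions, not the paper's exact analysis:

```python
# Minimal sketch of an orientation preference score from per-frame headings.
# The index (time on the CS over total orienting time) and the 15-degree
# acceptance window are illustrative assumptions.
import numpy as np

def preference_index(heading_deg, cs_azimuth_deg, other_azimuth_deg, window_deg=15.0):
    """heading_deg: per-frame flight headings; targets are azimuths in degrees."""
    heading = np.asarray(heading_deg)
    def within(target):
        diff = (heading - target + 180.0) % 360.0 - 180.0   # wrapped angular difference
        return np.abs(diff) <= window_deg
    t_cs, t_other = within(cs_azimuth_deg).sum(), within(other_azimuth_deg).sum()
    total = t_cs + t_other
    return t_cs / total if total else np.nan

# Example: a bee that spends most orienting frames facing the CS at +90 degrees.
frames = np.concatenate([np.full(700, 88.0), np.full(300, -92.0)])
print(preference_index(frames, cs_azimuth_deg=90.0, other_azimuth_deg=-90.0))
```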
Affiliation(s)
- Norihiro Kobayashi
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
- Ryuichi Okada
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
- Midori Sakura
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
4
Lafon G, Geng H, Avarguès-Weber A, Buatois A, Massou I, Giurfa M. The Neural Signature of Visual Learning Under Restrictive Virtual-Reality Conditions. Front Behav Neurosci 2022; 16:846076. [PMID: 35250505] [PMCID: PMC8888666] [DOI: 10.3389/fnbeh.2022.846076]
Abstract
Honey bees are reputed for their remarkable visual learning and navigation capabilities. These capacities can be studied in virtual reality (VR) environments, which allow studying the performance of tethered animals in stationary flight or walking under full control of the sensory environment. Here, we used a 2D VR setup in which a tethered bee walking stationary under restrictive closed-loop conditions learned to discriminate vertical rectangles differing in color and reinforcing outcome. Closed-loop conditions restricted stimulus control to lateral displacements. Consistent with prior VR analyses, bees learned to discriminate the trained stimuli. Ex vivo analyses of the brains of learners and non-learners showed that successful learning led to a downregulation of three immediate early genes (IEGs) in the main regions of the visual circuit, the optic lobes (OLs) and the calyces of the mushroom bodies (MBs). While Egr1 was downregulated in the OLs, Hr38 and kakusei were coincidentally downregulated in the calyces of the MBs. Our work thus reveals that color discrimination learning induced a neural signature distributed along the sequential pathway of color processing that is consistent with an inhibitory trace. This trace may relate to the motor patterns required to solve the discrimination task, which differ from those underlying pathfinding in 3D VR scenarios that allow navigation and exploratory learning and lead to IEG upregulation.
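The restrictive closed loop described here couples only the bee's lateral (rotational) movements to the display. A minimal sketch of that control logic, with purely hypothetical sensor and rendering placeholders standing in for the actual treadmill and VR software:

```python
# Minimal sketch of the restrictive closed-loop idea: only the bee's yaw
# rotation on the treadmill shifts the on-screen azimuths of the two stimuli;
# forward translation is ignored. Sensor and display calls are hypothetical
# placeholders, not an actual setup API.
import random

def read_ball_yaw_deg():
    # Placeholder for a treadmill/optic-flow sensor read (degrees this frame).
    return random.uniform(-2.0, 2.0)

def render(azimuths_deg):
    # Placeholder for drawing the two colored rectangles at these azimuths.
    print([round(a, 1) for a in azimuths_deg])

azimuths = [-50.0, 50.0]          # CS+ and CS- start on opposite sides
for frame in range(5):            # a few frames of the closed loop
    yaw = read_ball_yaw_deg()     # bee turns by `yaw` degrees this frame
    # Counter-rotate the scene so the world appears stable to the turning bee.
    azimuths = [((a - yaw + 180.0) % 360.0) - 180.0 for a in azimuths]
    render(azimuths)
```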
Affiliation(s)
- Gregory Lafon
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Haiyang Geng
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Aurore Avarguès-Weber
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Alexis Buatois
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Isabelle Massou
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Martin Giurfa
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Institut Universitaire de France, Paris, France
5
Visual learning in a virtual reality environment upregulates immediate early gene expression in the mushroom bodies of honey bees. Commun Biol 2022; 5:130. [PMID: 35165405] [PMCID: PMC8844430] [DOI: 10.1038/s42003-022-03075-8]
Abstract
Free-flying bees learn efficiently to solve numerous visual tasks. Yet, the neural underpinnings of this capacity remain unexplored. We used a 3D virtual reality (VR) environment to study visual learning and determine whether it leads to changes in immediate early gene (IEG) expression in specific areas of the bee brain. We focused on kakusei, Hr38 and Egr1, three IEGs that have been related to bee foraging and orientation, and compared their relative expression in the calyces of the mushroom bodies, the optic lobes and the rest of the brain after color discrimination learning. Bees learned to discriminate virtual stimuli displaying different colors and retained the learned information. Successful learners exhibited Egr1 upregulation only in the calyces of the mushroom bodies, thus uncovering a privileged involvement of these brain regions in associative color learning and the usefulness of Egr1 as a marker of the neural activity induced by this phenomenon.
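The relative IEG expression compared here between learners and controls is usually reported as a fold change. A minimal sketch of the standard 2^-ΔΔCt calculation, assuming RT-qPCR quantification cycles; the abstract does not state the quantification method, so this is only an illustration of the comparison being made:

```python
# Minimal sketch of relative expression by the 2^-ddCt method, assuming
# RT-qPCR quantification cycles (Ct). Treat the numbers purely as an example.
def relative_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Fold change of a target gene vs. a reference gene, learner vs. control."""
    d_ct_sample = ct_target - ct_reference            # normalize to the reference gene
    d_ct_control = ct_target_ctrl - ct_reference_ctrl
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)                            # >1 = upregulated, <1 = downregulated

# Example: a hypothetical Egr1 measurement in mushroom-body calyces,
# learner vs. non-learner (illustrative Ct values only).
print(relative_expression(ct_target=24.0, ct_reference=18.0,
                          ct_target_ctrl=25.5, ct_reference_ctrl=18.2))
```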
6
Motion cues from the background influence associative color learning of honey bees in a virtual-reality scenario. Sci Rep 2021; 11:21127. [PMID: 34702914] [PMCID: PMC8548521] [DOI: 10.1038/s41598-021-00630-x]
Abstract
Honey bees exhibit remarkable visual learning capacities, which can be studied using virtual reality (VR) landscapes under laboratory conditions. Existing VR environments for bees are imperfect, as they provide either open-loop conditions or 2D displays. Here we achieved a true 3D environment in which walking bees learned to discriminate a rewarded from a punished virtual stimulus based on color differences. We included ventral or frontal background cues, which were also subjected to 3D updating based on the bees' movements. We thus studied whether and how the presence of such motion cues affected visual discrimination in our VR landscape. Our results showed that the presence of frontal, and to a lesser extent ventral, background motion cues impaired the bees' performance. Whenever these cues were suppressed, color discrimination learning became possible. We analyzed the specific contributions of foreground and background cues and discuss the roles of attentional interference and of differences in stimulus salience in the VR environment in accounting for these results. Overall, we show how background and target cues may interact at the perceptual level and influence associative learning in bees. In addition, we identify issues that may affect decision-making in VR landscapes and require specific control by experimenters.
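In this 3D closed loop, both the target stimuli and the background cues are re-rendered from a viewpoint that follows the bee's walking. A minimal sketch of the per-frame pose update such a loop would perform, with hypothetical treadmill readings and a placeholder render step:

```python
# Minimal sketch of 3D updating: the walking bee's forward displacement and
# yaw rotation move a virtual viewpoint, from which targets and background
# cues would be redrawn. Sensor values and the render call are hypothetical.
import math

def update_pose(x, y, heading_rad, forward_cm, yaw_rad):
    """Advance the virtual viewpoint by one treadmill reading."""
    heading_rad += yaw_rad                       # rotate first ...
    x += forward_cm * math.cos(heading_rad)      # ... then translate along the new heading
    y += forward_cm * math.sin(heading_rad)
    return x, y, heading_rad

pose = (0.0, 0.0, 0.0)
readings = [(0.4, 0.02), (0.5, -0.01), (0.3, 0.00)]   # (forward cm, yaw rad) per frame
for forward, yaw in readings:
    pose = update_pose(*pose, forward, yaw)
    # render_scene(pose)  # would redraw targets and background from this viewpoint
print(pose)
```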