1
Lafon G, Paoli M, Paffhausen BH, Sanchez GDB, Lihoreau M, Avarguès-Weber A, Giurfa M. Efficient visual learning by bumble bees in virtual-reality conditions: Size does not matter. Insect Science 2023; 30:1734-1748. PMID: 36734172. DOI: 10.1111/1744-7917.13181.
Abstract
Recent developments have allowed establishing virtual-reality (VR) setups to study multiple aspects of visual learning in honey bees under controlled experimental conditions. Here, we adopted a VR environment to investigate visual learning in the buff-tailed bumble bee Bombus terrestris. Based on responses to the appetitive and aversive reinforcements used for conditioning, we show that bumble bees had the appetitive motivation needed to engage in the VR experiments and that they efficiently learned elemental color discriminations. In doing so, they reduced their latency to make a choice, increased the proportion of direct paths toward the virtual stimuli and walked faster toward them. Performance in a short-term retention test showed that bumble bees chose and fixated longer on the correct stimulus in the absence of reinforcement. Body size and weight, although variable across individuals, did not affect cognitive performance and had only a mild impact on motor performance. Overall, we show that bumble bees are suitable experimental subjects for visual-learning experiments under VR conditions, which opens important perspectives for invasive studies on the neural and molecular bases of such learning, given the robustness of these insects and the accessibility of their brain.
Affiliation(s)
- Gregory Lafon
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Marco Paoli
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Benjamin H Paffhausen
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Gabriela de Brito Sanchez
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Mathieu Lihoreau
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Aurore Avarguès-Weber
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- Martin Giurfa
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), University of Toulouse, CNRS, UPS, Toulouse, France
- French Academy of Sciences for University Professors, Institut Universitaire de France (IUF), Paris, France
2
Kobayashi N, Hasegawa Y, Okada R, Sakura M. Visual learning in tethered bees modifies flight orientation and is impaired by epinastine. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36930349. DOI: 10.1007/s00359-023-01623-z.
Abstract
Visual-orientation learning by a tethered flying bee was investigated using a flight simulator and a novel protocol in which orientation preference toward trained visual targets was assessed in tests performed before and after appetitive conditioning. Either a blue or a green rectangle (conditioned stimulus, CS) was associated with 30% sucrose solution (unconditioned stimulus, US), whereas the other rectangle was not paired with the US. Bees were tested in a closed-loop flight simulator 5 min after ten pairings of the US and CS. Conditioned bees oriented preferentially to the CS after such training. This increase in preference for the CS was maintained for 24 h, indicating the presence of long-term memory. Because the total orienting time was not altered by conditioning, conditioning did not enhance orientation activity itself but increased the relative time spent orienting to the CS. When 0.4 or 4 mM epinastine (an antagonist of octopamine receptors) was injected into the bee's head 30 min prior to the experiment, both short- and long-term memory formation were significantly impaired, suggesting that octopamine, which is crucial for appetitive olfactory learning in insects, is also involved in visual orientation learning.
Affiliation(s)
- Norihiro Kobayashi
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan
- Ryuichi Okada
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan.
- Midori Sakura
- Department of Biology, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo, 657-8501, Japan.
3
Lafon G, Geng H, Avarguès-Weber A, Buatois A, Massou I, Giurfa M. The Neural Signature of Visual Learning Under Restrictive Virtual-Reality Conditions. Front Behav Neurosci 2022; 16:846076. PMID: 35250505; PMCID: PMC8888666. DOI: 10.3389/fnbeh.2022.846076.
Abstract
Honey bees are reputed for their remarkable visual learning and navigation capabilities. These capacities can be studied in virtual reality (VR) environments, which allow studying the performance of tethered animals in stationary flight or walking under full control of the sensory environment. Here, we used a 2D VR setup in which a tethered bee walking stationary under restrictive closed-loop conditions learned to discriminate vertical rectangles differing in color and reinforcing outcome. Closed-loop conditions restricted stimulus control to lateral displacements. Consistent with prior VR analyses, bees learned to discriminate the trained stimuli. Ex vivo analyses of the brains of learners and non-learners showed that successful learning led to a downregulation of three immediate early genes in the main regions of the visual circuit, the optic lobes (OLs) and the calyces of the mushroom bodies (MBs). While Egr1 was downregulated in the OLs, Hr38 and kakusei were coincidentally downregulated in the calyces of the MBs. Our work thus reveals that color discrimination learning induced a neural signature distributed along the sequential pathway of color processing that is consistent with an inhibitory trace. This trace may relate to the motor patterns required to solve the discrimination task, which differ from those underlying pathfinding in 3D VR scenarios that allow navigation and exploratory learning and which lead to IEG upregulation.
Affiliation(s)
- Gregory Lafon
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Haiyang Geng
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Aurore Avarguès-Weber
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Alexis Buatois
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Isabelle Massou
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Martin Giurfa
- Research Center on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- College of Animal Sciences (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Institut Universitaire de France, Paris, France
4
Visual learning in a virtual reality environment upregulates immediate early gene expression in the mushroom bodies of honey bees. Commun Biol 2022; 5:130. PMID: 35165405; PMCID: PMC8844430. DOI: 10.1038/s42003-022-03075-8.
Abstract
Free-flying bees learn efficiently to solve numerous visual tasks. Yet, the neural underpinnings of this capacity remain unexplored. We used a 3D virtual reality (VR) environment to study visual learning and determine if it leads to changes in immediate early gene (IEG) expression in specific areas of the bee brain. We focused on kakusei, Hr38 and Egr1, three IEGs that have been related to bee foraging and orientation, and compared their relative expression in the calyces of the mushroom bodies, the optic lobes and the rest of the brain after color discrimination learning. Bees learned to discriminate virtual stimuli displaying different colors and retained the information learned. Successful learners exhibited Egr1 upregulation only in the calyces of the mushroom bodies, thus uncovering a privileged involvement of these brain regions in associative color learning and the usefulness of Egr1 as a marker of neural activity induced by this phenomenon.
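The study compares relative IEG expression across brain regions. The abstract does not state the quantification method, but relative expression of this kind is commonly derived from RT-qPCR Ct values with the 2^-ΔΔCt method; the sketch below is a generic, illustrative implementation of that formula (an assumption, not the authors' pipeline), and the Ct values are placeholders.

```python
def relative_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Relative expression via the 2^-ddCt method (illustrative only).

    ct_target / ct_reference: Ct of the gene of interest and of a housekeeping
    gene in the sample of interest (e.g., learners).
    *_ctrl: the same values in the control sample (e.g., non-learners).
    """
    d_ct_sample = ct_target - ct_reference            # normalize to reference gene
    d_ct_control = ct_target_ctrl - ct_reference_ctrl
    dd_ct = d_ct_sample - d_ct_control                # normalize to control condition
    return 2.0 ** (-dd_ct)

# Placeholder Ct values, not data from the paper; a result > 1 indicates upregulation.
print(relative_expression(24.1, 18.0, 25.3, 18.1))
```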
5
Motion cues from the background influence associative color learning of honey bees in a virtual-reality scenario. Sci Rep 2021; 11:21127. PMID: 34702914; PMCID: PMC8548521. DOI: 10.1038/s41598-021-00630-x.
Abstract
Honey bees exhibit remarkable visual learning capacities, which can be studied using virtual reality (VR) landscapes under laboratory conditions. Existing VR environments for bees are imperfect, as they provide either open-loop conditions or 2D displays. Here we achieved a true 3D environment in which walking bees learned to discriminate a rewarded from a punished virtual stimulus based on color differences. We included ventral or frontal background cues, which were also subjected to 3D updating based on the bee's movements. We thus studied if and how the presence of such motion cues affected visual discrimination in our VR landscape. Our results showed that the presence of frontal, and to a lesser extent ventral, background motion cues impaired the bees' performance. Whenever these cues were suppressed, color discrimination learning became possible. We analyzed the specific contribution of foreground and background cues and discuss the role of attentional interference and differences in stimulus salience in the VR environment to account for these results. Overall, we show how background and target cues may interact at the perceptual level and influence associative learning in bees. In addition, we identify issues that may affect decision-making in VR landscapes and require specific control by experimenters.
6
Paffhausen BH, Petrasch J, Wild B, Meurers T, Schülke T, Polster J, Fuchs I, Drexler H, Kuriatnyk O, Menzel R, Landgraf T. A Flying Platform to Investigate Neuronal Correlates of Navigation in the Honey Bee (Apis mellifera). Front Behav Neurosci 2021; 15:690571. PMID: 34354573; PMCID: PMC8329708. DOI: 10.3389/fnbeh.2021.690571.
Abstract
Navigating animals combine multiple perceptual faculties, learn during exploration, retrieve multifaceted memory contents, and exhibit goal-directedness as an expression of their current needs and motivations. Navigation in insects has been linked to a variety of underlying strategies such as path integration, view familiarity, visual beaconing, and goal-directed orientation with respect to previously learned ground structures. Most studies, however, approach navigation either from a field perspective, analyzing purely behavioral observations, or combine computational models with neurophysiological evidence obtained in lab experiments. The honey bee (Apis mellifera) has long been a popular model in the search for neural correlates of complex behaviors and exhibits extraordinary navigational capabilities. However, the neural basis of bee navigation has not yet been explored under natural conditions. Here, we propose a novel methodology to record from the brain of a copter-mounted honey bee. In this way, the animal experiences natural multimodal sensory input in a natural environment that is familiar to her. We developed a miniaturized electrophysiology recording system that is able to record spikes in the presence of time-varying electric noise from the copter's motors and rotors, and devised an experimental procedure to record from mushroom body extrinsic neurons (MBENs). We analyzed the resulting electrophysiological data combined with a reconstruction of the animal's visual perception and found that the neural activity of MBENs is linked to sharp turns, possibly related to the relative motion of visual features. This method is a significant technological step toward recording brain activity of navigating honey bees under natural conditions. By providing all system specifications in an online repository, we hope to close a methodological gap and stimulate further research informing future computational models of insect navigation.
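A practical challenge named in the abstract is extracting spikes from a recording contaminated by time-varying motor and rotor noise. The sketch below shows a generic band-pass-plus-threshold spike detector; the sampling rate, frequency band and threshold are illustrative assumptions, not the specifications of the recording system described in the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def detect_spikes(trace, fs=25000.0, band=(300.0, 5000.0), thresh_sd=4.0):
    """Band-pass filter an extracellular trace and return spike sample indices."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, trace)
    # Robust noise estimate (median absolute deviation), common in spike sorting
    noise_sd = np.median(np.abs(filtered)) / 0.6745
    above = filtered < -thresh_sd * noise_sd                 # negative-going threshold
    # Keep only the first sample of each threshold crossing
    crossings = np.flatnonzero(above & ~np.r_[False, above[:-1]])
    return crossings, filtered

# Example on synthetic data (placeholder, 10 s at the assumed sampling rate)
rng = np.random.default_rng(0)
spikes, _ = detect_spikes(rng.normal(0.0, 1.0, 250000))
```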
Affiliation(s)
- Benjamin H Paffhausen
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Julian Petrasch
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Benjamin Wild
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Thierry Meurers
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Tobias Schülke
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Johannes Polster
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
- Inga Fuchs
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Helmut Drexler
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Oleksandra Kuriatnyk
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Randolf Menzel
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
- Tim Landgraf
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
7
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. PMID: 33542681; PMCID: PMC7852487. DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including its spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, in which rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, i.e., depth in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time the head appears to pivot actively. However, only a few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
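The analysis hinges on separating saccades from intersaccades in the tracked head orientation. A common way to do this is to threshold the head's angular velocity; the sketch below illustrates that idea with an assumed sampling rate and an illustrative threshold, not the authors' parameters.

```python
import numpy as np

def segment_saccades(yaw_deg, fs, vel_thresh=200.0):
    """Label each sample of a head-yaw trace as saccade (True) or intersaccade.

    yaw_deg: head orientation in degrees, sampled at fs Hz.
    vel_thresh: angular-velocity threshold in deg/s (illustrative value).
    """
    # Unwrap to avoid 359 -> 0 jumps, then differentiate to angular velocity
    yaw_unwrapped = np.unwrap(np.deg2rad(yaw_deg))
    ang_vel = np.abs(np.rad2deg(np.diff(yaw_unwrapped))) * fs
    is_saccade = np.concatenate([[False], ang_vel > vel_thresh])
    return is_saccade

# During intersaccades (~is_saccade) gaze is roughly constant, so the remaining
# apparent motion carries egocentric depth information, as described above.
```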
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
8
Winsor AM, Pagoti GF, Daye DJ, Cheries EW, Cave KR, Jakob EM. What gaze direction can tell us about cognitive processes in invertebrates. Biochem Biophys Res Commun 2021; 564:43-54. PMID: 33413978. DOI: 10.1016/j.bbrc.2020.12.001.
Abstract
Most visually guided animals shift their gaze using body movements, eye movements, or both to gather information selectively from their environments. Psychological studies of eye movements have advanced our understanding of the perceptual and cognitive processes that mediate visual attention in humans and other vertebrates. However, much less is known about how these processes operate in other organisms, particularly invertebrates. Here we make the case that studies of invertebrate cognition can benefit from adding precise measures of gaze direction. To accomplish this, we briefly review the human visual attention literature and outline four research themes and several experimental paradigms that could be extended to invertebrates. We briefly review selected studies in which the measurement of gaze direction in invertebrates has provided new insights, and we suggest future areas of exploration.
Affiliation(s)
- Alex M Winsor
- Graduate Program in Organismic and Evolutionary Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
- Guilherme F Pagoti
- Programa de Pós-Graduação em Zoologia, Instituto de Biociências, Universidade de São Paulo, Rua do Matão, 321, Travessa 14, Cidade Universitária, São Paulo, SP, 05508-090, Brazil
- Daniel J Daye
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA; Graduate Program in Biological and Environmental Sciences, University of Rhode Island, Kingston, RI, 02881, USA
- Erik W Cheries
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Kyle R Cave
- Department of Psychological and Brain Sciences, University of Massachusetts Amherst, Amherst, MA, 01003, USA
- Elizabeth M Jakob
- Department of Biology, University of Massachusetts Amherst, Amherst, MA, 01003, USA.
9
Goulard R, Buehlmann C, Niven JE, Graham P, Webb B. A motion compensation treadmill for untethered wood ants (Formica rufa): evidence for transfer of orientation memories from free-walking training. J Exp Biol 2020; 223:jeb228601. PMID: 33443039; PMCID: PMC7774907. DOI: 10.1242/jeb.228601.
Abstract
The natural scale of insect navigation during foraging makes it challenging to study under controlled conditions. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but the potential limitations and confounds introduced by tethering motivate the development of alternative, untethered solutions. In this paper, we validate the use of a motion compensator (or ‘treadmill’) to study visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and preserves foraging motivation over long time frames. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility of studying navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments.
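As a rough illustration of how a motion compensator can keep an untethered ant near the centre of the treadmill, the sketch below shows a generic proportional compensation step. The camera-tracking interface, gain and speed limit are assumptions made for illustration and do not describe the authors' controller.

```python
def compensation_step(ant_xy, gain=2.0, max_speed=50.0):
    """One step of a proportional compensation loop (illustrative only).

    ant_xy: (x, y) offset of the ant from the treadmill centre, in mm,
            e.g. from overhead camera tracking (assumed interface).
    Returns a treadmill surface velocity command (mm/s) that carries the
    ant back toward the centre while it walks freely on the surface.
    """
    vx = -gain * ant_xy[0]
    vy = -gain * ant_xy[1]
    # Clamp to the hardware's maximum surface speed (assumed value)
    norm = (vx**2 + vy**2) ** 0.5
    if norm > max_speed:
        vx, vy = vx * max_speed / norm, vy * max_speed / norm
    return vx, vy

# Example: ant drifted 10 mm to the right of centre -> surface moves leftward.
print(compensation_step((10.0, 0.0)))
```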
Affiliation(s)
- Roman Goulard
- School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
- Jeremy E Niven
- University of Sussex, School of Life Sciences, Brighton BN1 9QG, UK
- Paul Graham
- University of Sussex, School of Life Sciences, Brighton BN1 9QG, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
10
Kócsi Z, Murray T, Dahmen H, Narendra A, Zeil J. The Antarium: A Reconstructed Visual Reality Device for Ant Navigation Research. Front Behav Neurosci 2020; 14:599374. PMID: 33240057; PMCID: PMC7683616. DOI: 10.3389/fnbeh.2020.599374.
Abstract
We constructed a large projection device (the Antarium) with 20,000 UV-Blue-Green LEDs that allows us to present tethered ants with views of their natural foraging environment. The ants walk on an air-cushioned trackball; their movements are registered and can be fed back to the visual panorama. Views are generated in a 3D model of the ants’ environment so that they experience the changing visual world in the same way as they do when foraging naturally. The Antarium is a biscribed pentakis dodecahedron with 55 facets of identical isosceles triangles. The length of the base of the triangles is 368 mm, resulting in a device that is roughly 1 m in diameter. Each triangle contains 361 blue/green LEDs and nine UV LEDs. The 55 triangles of the Antarium have 19,855 green and blue pixels and 495 UV pixels, covering 360° azimuth and elevation from −50° below the horizon to +90° above the horizon. The angular resolution is 1.5° for green and blue LEDs and 6.7° for UV LEDs, offering 65,536 intensity levels at a flicker frequency of more than 9,000 Hz and a framerate of 190 fps. Also, the direction and degree of polarisation of the UV LEDs can be adjusted through polarisers mounted on the axles of rotary actuators. We build 3D models of the natural foraging environment of ants using purely camera-based methods. We reconstruct panoramic scenes at any point within these models by projecting panoramic images onto six virtual cameras which capture a cube-map of images to be projected by the LEDs of the Antarium. The Antarium is a unique instrument to investigate visual navigation in ants. In open loop, it allows us to provide ants with familiar and unfamiliar views, with completely featureless visual scenes, or with scenes that are altered in spatial or spectral composition. In closed loop, we can study the behavior of ants that are virtually displaced within their natural foraging environment. In the future, the Antarium can also be used to investigate the dynamics of navigational guidance and the neurophysiological basis of ant navigation in natural visual environments.
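The abstract describes reconstructing panoramic views by rendering the 3D model into a cube map and then driving each LED from it. The sketch below shows the standard mapping from a viewing direction (for example, an LED's direction on the dome) to a cube-map face and texel, assuming a simple axis-aligned cube-map layout; it is a generic illustration, not the Antarium's rendering code.

```python
import numpy as np

def direction_to_cubemap(d):
    """Map a unit 3D direction to (face, u, v) with u, v in [0, 1].

    Faces follow a +X, -X, +Y, -Y, +Z, -Z ordering (one of several conventions).
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                       # +/- X face
        face, major, u, v = (0 if x > 0 else 1), ax, (-z if x > 0 else z), y
    elif ay >= az:                                  # +/- Y face
        face, major, u, v = (2 if y > 0 else 3), ay, x, (-z if y > 0 else z)
    else:                                           # +/- Z face
        face, major, u, v = (4 if z > 0 else 5), az, (x if z > 0 else -x), y
    # Normalise from [-major, major] to [0, 1]
    return face, 0.5 * (u / major + 1.0), 0.5 * (v / major + 1.0)

def led_color(direction, cubemap):
    """Sample a cube map (6 x H x W x 3 array) for one LED viewing direction."""
    d = np.asarray(direction, dtype=float)
    face, u, v = direction_to_cubemap(d / np.linalg.norm(d))
    h, w = cubemap.shape[1:3]
    return cubemap[face, int(v * (h - 1)), int(u * (w - 1))]
```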
Affiliation(s)
- Zoltán Kócsi
- Research School of Biology, Australian National University, Canberra, ACT, Australia
- Trevor Murray
- Research School of Biology, Australian National University, Canberra, ACT, Australia
- Hansjürgen Dahmen
- Department of Cognitive Neuroscience, University of Tübingen, Tübingen, Germany
- Ajay Narendra
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia
- Jochen Zeil
- Research School of Biology, Australian National University, Canberra, ACT, Australia
11
Chromatic, achromatic and bimodal negative patterning discrimination by free-flying bumble bees. Anim Behav 2020. DOI: 10.1016/j.anbehav.2020.09.009.
12
Buatois A, Laroche L, Lafon G, Avarguès-Weber A, Giurfa M. Higher-order discrimination learning by honeybees in a virtual environment. Eur J Neurosci 2020; 51:681-694. DOI: 10.1111/ejn.14633.
Affiliation(s)
- Alexis Buatois
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Lou Laroche
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Gregory Lafon
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Aurore Avarguès-Weber
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- Martin Giurfa
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse Cedex 09, France
- College of Animal Science (College of Bee Science), Fujian Agriculture and Forestry University, Fuzhou, China
- Institut Universitaire de France (IUF), France
13
Honeybees foraging for numbers. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2019; 205:439-450. DOI: 10.1007/s00359-019-01344-2.
14
Research on 3D Painting in Virtual Reality to Improve Students’ Motivation of 3D Animation Learning. Sustainability 2019. DOI: 10.3390/su11061605.
Abstract
The purpose of this study was to investigate the use of highly immersive, 6-DoF virtual reality for stereoscopic spatial mapping and to assess the impact of perceived spatial capabilities on motivation to learn 3D software. The study was not part of a course with mandatory participation; students were free to take part in the trial, which employed the HTC VIVE, a headset providing highly immersive experiences, to elicit strong emotional responses. A total of 111 students from a university digital media department were invited to participate in a 3D VR painting experiment in which they created paintings using Google Tilt Brush. A 5-point scale based on the ARCS learning motivation model was adopted to collect student data. A factor analysis of the data was performed twice to select appropriate factors (p = 0.000 < 0.05); specifically, exploratory factor analysis was used to classify factors into four constructs. The Cronbach alpha values of the ARCS subscales were 0.920, 0.929, 0.693 and 0.664, respectively, all > 0.6, indicating acceptable reliability. The results show that immersive VR can promote students’ motivation and interest in learning 3D animation. However, the practical application of this technology requires solving problems related to hardware and space.
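For reference, Cronbach's alpha, as reported here for the ARCS subscales, can be computed from item-level responses as in the sketch below (rows are respondents, columns are items of one subscale). The score matrix is a placeholder, not data from the study.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Placeholder 5-point responses for one subscale (illustrative only)
scores = np.array([[4, 5, 4], [3, 4, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(round(cronbach_alpha(scores), 3))
```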
15
Zwaka H, Bartels R, Lehfeldt S, Jusyte M, Hantke S, Menzel S, Gora J, Alberdi R, Menzel R. Learning and Its Neural Correlates in a Virtual Environment for Honeybees. Front Behav Neurosci 2019; 12:279. PMID: 30740045; PMCID: PMC6355692. DOI: 10.3389/fnbeh.2018.00279.
Abstract
The search for neural correlates of operant and observational learning requires combining two experimental conditions that are very difficult to reconcile: stable recording from high-order neurons and free movement of the animal in a rather natural environment. We developed a virtual environment (VE) that simulates a simplified 3D world for honeybees walking stationary on an air-supported spherical treadmill. We show that honeybees perceive the stimuli in the VE as meaningful by transferring learned information from free flight to the virtual world. In search of neural correlates of learning in the VE, mushroom body extrinsic neurons were recorded over days during learning. We found changes in neural activity specific to the rewarded and unrewarded visual stimuli. Our results suggest an involvement of mushroom body extrinsic neurons in operant learning in the honeybee (Apis mellifera).
Affiliation(s)
- Hanna Zwaka
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany; Molecular and Cellular Biology, Harvard University, Cambridge, MA, United States
- Ruth Bartels
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Sophie Lehfeldt
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Meida Jusyte
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Sören Hantke
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Simon Menzel
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Jacob Gora
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Rafael Alberdi
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany
- Randolf Menzel
- Department of Biology and Neurobiology, Freie Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
16
Buatois A, Flumian C, Schultheiss P, Avarguès-Weber A, Giurfa M. Transfer of Visual Learning Between a Virtual and a Real Environment in Honey Bees: The Role of Active Vision. Front Behav Neurosci 2018; 12:139. PMID: 30057530; PMCID: PMC6053632. DOI: 10.3389/fnbeh.2018.00139.
Abstract
To study visual learning in honey bees, we developed a virtual reality (VR) system in which the movements of a tethered bee walking stationary on a spherical treadmill update the visual panorama presented in front of it (closed-loop conditions), thus creating an experience of immersion within a virtual environment. In parallel, we developed a small Y-maze with interchangeable end-boxes, which allowed a freely walking bee to be repeatedly returned to the starting point of the maze for repeated decision recording. Using conditioning and transfer experiments between the VR setup and the Y-maze, we studied the extent to which movement freedom and active vision are crucial for learning a simple color discrimination. Approximately 57% of the bees learned the visual discrimination in both conditions. Transfer from VR to the maze significantly improved the bees’ performance: 75% of bees having chosen the CS+ continued doing so and 100% of bees having chosen the CS− reverted their choice in favor of the CS+. In contrast, no improvement was seen for these two groups of bees during the reciprocal transfer from the Y-maze to VR. In this case, bees exhibited inconsistent choices in the VR setup. The asymmetric transfer between contexts indicates that the information learned in each environment may be different despite the similar learning success. Moreover, it shows that reducing the scope for active vision and movement freedom in the passage from the maze to the VR impairs the expression of visual learning, whereas increasing them in the reciprocal transfer improves it. Our results underline the active nature of visual processing in bees and allow discussing the developments required for immersive VR experiences in insects.
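In the closed-loop VR described here, the movements of the bee on the spherical treadmill update the panorama in front of it. The sketch below illustrates the basic idea of such an update for the rotational component: the measured ball yaw rotates the displayed scene in the opposite direction. The treadmill read-out and gain are assumptions for illustration, not the authors' implementation.

```python
def update_scene_heading(scene_azimuth_deg, ball_yaw_deg, gain=1.0):
    """Closed-loop update of the panorama's azimuth relative to the bee.

    ball_yaw_deg: the bee's own rotation on the treadmill during one frame,
    positive to the right (assumed read-out from the ball's motion sensors).
    A rightward turn shifts the displayed scene leftward relative to the bee,
    so the virtual world appears stable while the bee steers within it.
    """
    return (scene_azimuth_deg - gain * ball_yaw_deg) % 360.0

# Example: a stimulus at 90 deg azimuth; the bee turns 15 deg to the right,
# so the stimulus is now displayed at 75 deg relative to the bee.
print(update_scene_heading(90.0, 15.0))
```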
Affiliation(s)
- Alexis Buatois
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Clara Flumian
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Patrick Schultheiss
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Aurore Avarguès-Weber
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
- Martin Giurfa
- Research Centre on Animal Cognition, Center for Integrative Biology, CNRS, University of Toulouse, Toulouse, France
17
Freas CA, Schultheiss P. How to Navigate in Different Environments and Situations: Lessons From Ants. Front Psychol 2018; 9:841. PMID: 29896147; PMCID: PMC5986876. DOI: 10.3389/fpsyg.2018.00841.
Abstract
Ants are a globally distributed insect family whose members have adapted to live in a wide range of different environments and ecological niches. Foraging ants everywhere face the recurring challenge of navigating to find food and to bring it back to the nest. More than a century of research has led to the identification of some key navigational strategies, such as compass navigation, path integration, and route following. Ants have been shown to rely on visual, olfactory, and idiothetic cues for navigational guidance. Here, we summarize recent behavioral work, focusing on how these cues are learned and stored as well as how different navigational cues are integrated, often between strategies and even across sensory modalities. Information can also be communicated between different navigational routines. In this way, a shared toolkit of fundamental navigational strategies can lead to substantial flexibility in behavioral outcomes. This allows individual ants to tune their behavioral repertoire to different tasks (e.g., foraging and homing), lifestyles (e.g., diurnal and nocturnal), or environments, depending on the availability and reliability of different guidance cues. We also review recent anatomical and physiological studies in ants and other insects that have started to reveal neural correlates for specific navigational strategies, and which may provide the beginnings of a truly mechanistic understanding of navigation behavior.
Affiliation(s)
- Cody A Freas
- Department of Biological Sciences, Macquarie University, Sydney, NSW, Australia; Department of Psychology, University of Alberta, Edmonton, AB, Canada
- Patrick Schultheiss
- Research Center on Animal Cognition, Center for Integrative Biology, French National Center for Scientific Research, Toulouse University, Toulouse, France
18
Virtual reality method to analyze visual recognition in mice. PLoS One 2018; 13:e0196563. PMID: 29768429; PMCID: PMC5955493. DOI: 10.1371/journal.pone.0196563.
Abstract
Behavioral tests have been extensively used to measure the visual function of mice. To determine how precisely mice perceive certain visual cues, it is necessary to have a quantifiable measurement of their behavioral responses. Recently, virtual reality tests have been utilized for a variety of purposes, from analyzing hippocampal cell functionality to identifying visual acuity. Despite the widespread use of these tests, the training required for the recognition of a variety of different visual targets, and the performance of the behavioral tests, have not been thoroughly characterized. We have developed a virtual reality behavior testing approach that can assay a variety of different aspects of visual perception, including color/luminance and motion detection. When tested for the ability to detect a color/luminance target or a moving target, mice were able to discern the designated target after 9 days of continuous training. However, the quality of their performance is significantly affected by the complexity of the visual target and by their ability to navigate on a spherical treadmill. Importantly, mice retained memory of their visual recognition for at least three weeks after the end of their behavioral training.