1
Bischof WF, Anderson NC, Kingstone A. A tutorial: Analyzing eye and head movements in virtual reality. Behav Res Methods 2024. PMID: 39117987. DOI: 10.3758/s13428-024-02482-5.
Abstract
This tutorial provides instruction on how to use the eye tracking technology built into virtual reality (VR) headsets, emphasizing the analysis of head and eye movement data when an observer is situated in the center of an omnidirectional environment. We begin with a brief description of how VR eye movement research differs from previous forms of eye movement research, and we identify some outstanding gaps in the current literature. We then introduce the basic methodology used to collect VR eye movement data, both in general and with regard to the specific data that we collected to illustrate different analytical approaches. We continue with an introduction to the foundational ideas behind data analysis in VR, including frames of reference, how to map eye and head position, and event detection. Next, we introduce core head and eye data analyses focused on determining where the head and eyes are directed. We then expand on what has been presented, introducing several novel spatial, spatio-temporal, and temporal head-eye data analysis techniques. We conclude with a reflection on what has been presented and on how the techniques introduced in this tutorial provide the scaffolding for extensions to more complex and dynamic VR environments.
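The mapping of eye and head position that this tutorial covers can be illustrated with a small sketch. The snippet below maps a unit gaze (or head) direction vector to longitude and latitude on the viewing sphere; the axis convention and function name are illustrative assumptions, not code from the tutorial.

```python
import math

def direction_to_lonlat(x, y, z):
    """Map a unit direction vector to (longitude, latitude) in degrees.
    Assumed convention: +z is straight ahead, +x is right, +y is up."""
    lon = math.degrees(math.atan2(x, z))                    # left/right angle
    lat = math.degrees(math.asin(max(-1.0, min(1.0, y))))   # up/down angle
    return lon, lat
```

With this convention, looking straight ahead maps to (0, 0) and looking 90° to the right maps to (90, 0); eye-in-head and head-in-world vectors can be mapped separately to compare the two frames of reference.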
Affiliation(s)
- Walter F Bischof
- Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada.
- Nicola C Anderson
- Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada
- Alan Kingstone
- Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada
2
Argota Sánchez-Vaquerizo J, Hausladen CI, Mahajan S, Matter M, Siebenmann M, van Eggermond MAB, Helbing D. A virtual reality experiment to study pedestrian perception of future street scenarios. Sci Rep 2024; 14:4571. PMID: 38403717. PMCID: PMC10894882. DOI: 10.1038/s41598-024-55073-x.
Abstract
The current allocation of street space is based on expected vehicular peak-hour flows. Flexible and adaptive use of this space can respond to changing needs. To evaluate the acceptability of flexible street layouts, several urban environments were designed and implemented in virtual reality. Participants explored these designs in immersive virtual reality in a [Formula: see text] mixed factorial experiment, in which we analysed self-reported, behavioural and physiological responses from participants. Distinct communication strategies were varied between subjects. Participants' responses reveal a preference for familiar solutions. Unconventional street layouts are less preferred, are perceived as unsafe, and cause a measurably greater stress response. Furthermore, information provision focusing on comparisons leads participants to focus primarily on the drawbacks, rather than the advantages, of novel scenarios. When able to freely express thoughts and opinions, participants focus more on the impact of space design on behaviour than on the objective physical features themselves. This last finding in particular suggests that it is vital to develop new street scenarios in an inclusive and democratic way: the success of innovating urban spaces depends on how well the vast diversity of citizens' needs is considered and met.
Affiliation(s)
- Carina I Hausladen
- ETH Zürich, Computational Social Science, 8092, Zurich, Switzerland
- California Institute of Technology, Behavioral Economics, Pasadena, CA, 91125, USA
- Sachit Mahajan
- ETH Zürich, Computational Social Science, 8092, Zurich, Switzerland
- Marc Matter
- ETH Zürich, Computational Social Science, 8092, Zurich, Switzerland
- Dirk Helbing
- ETH Zürich, Computational Social Science, 8092, Zurich, Switzerland
- Complexity Science Hub, 1080, Vienna, Austria
3
Reeves SM, Otero-Millan J. The influence of scene tilt on saccade directions is amplitude dependent. J Neurol Sci 2023; 448:120635. PMID: 37031623. DOI: 10.1016/j.jns.2023.120635.
Abstract
When exploring a visual scene, humans make more saccades in the horizontal direction than in any other direction. While many have shown that the horizontal saccade bias rotates in response to scene tilt, it is unclear whether this effect depends on saccade amplitude. We addressed this question by examining the effect of image tilt on the saccade direction distributions recorded during free viewing of natural scenes. Participants (n = 20) viewed scenes tilted at -30°, 0°, and 30°. Saccade distributions during free viewing rotated by an angle of 12.1° ± 6.7° (t(19) = 8.04, p < 0.001) in the direction of the image tilt. When we partitioned the saccades according to their amplitude, we found that small-amplitude saccades occurred mostly in the horizontal direction, while large-amplitude saccades were oriented more toward the scene tilt (p < 0.001). To further study the characteristics of small saccades and how they are affected by scene tilt, we looked at the effect of image tilt on small fixational saccades made while fixating a central target amidst a larger scene and found that fixational saccade distributions did not rotate with scene tilt (-0.3° ± 1.7°; t(19) = -0.8, p = 0.39). These results suggest a combined effect of two reference frames in saccade generation: one egocentric reference frame that dominates for small saccades, biases them horizontally, and may be common to different tasks, and another allocentric reference frame that biases larger saccades along the orientation of an image during free viewing.
4
Bischof WF, Anderson NC, Kingstone A. Eye and head movements while encoding and recognizing panoramic scenes in virtual reality. PLoS One 2023; 18:e0282030. PMID: 36800398. PMCID: PMC9937482. DOI: 10.1371/journal.pone.0282030.
Abstract
One approach to studying the recognition of scenes and objects relies on the comparison of eye movement patterns during encoding and recognition. Past studies typically analyzed the perception of flat stimuli of limited extent presented on a computer monitor that did not require head movements. In contrast, participants in the present study saw omnidirectional panoramic scenes through an immersive 3D virtual reality viewer, and they could move their head freely to inspect different parts of the visual scenes. This allowed us to examine how unconstrained observers use their head and eyes to encode and recognize visual scenes. By studying head and eye movement within a fully immersive environment, and applying cross-recurrence analysis, we found that eye movements are strongly influenced by the content of the visual environment, as are head movements, though to a much lesser degree. Moreover, we found that the head and eyes are linked, with the head supporting, and by and large mirroring, the movements of the eyes, consistent with the notion that the head operates to support the acquisition of visual information by the eyes.
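Cross-recurrence analysis of the kind applied in this study can be sketched as follows: two fixation sequences recur at position (i, j) when fixation i of one sequence lands within some angular radius of fixation j of the other. The function below is a minimal illustration assuming fixations given as (longitude, latitude) pairs in degrees; it is not the authors' analysis code, and the 5° radius is an illustrative value.

```python
import math

def cross_recurrence_rate(fix_a, fix_b, radius_deg=5.0):
    """Toy cross-recurrence between two fixation sequences. Each fixation
    is a (longitude, latitude) pair in degrees on the viewing sphere.
    Returns the binary recurrence matrix and the recurrence rate (the
    fraction of (i, j) pairs closer than `radius_deg`)."""
    def angular_dist(p, q):
        lon1, lat1, lon2, lat2 = map(math.radians, (*p, *q))
        # great-circle (angular) distance between the two gaze points
        c = (math.sin(lat1) * math.sin(lat2)
             + math.cos(lat1) * math.cos(lat2) * math.cos(lon1 - lon2))
        return math.degrees(math.acos(max(-1.0, min(1.0, c))))
    matrix = [[1 if angular_dist(a, b) < radius_deg else 0 for b in fix_b]
              for a in fix_a]
    n = len(fix_a) * len(fix_b)
    return matrix, sum(map(sum, matrix)) / n if n else 0.0
```

Comparing a sequence with itself yields a fully recurrent diagonal; comparing encoding and recognition sequences yields off-diagonal structure wherever the two viewings revisit the same scene regions.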
Affiliation(s)
- Walter F. Bischof
- Department of Psychology, University of British Columbia, Vancouver, BC, Canada
- Nicola C. Anderson
- Department of Psychology, University of British Columbia, Vancouver, BC, Canada
- Alan Kingstone
- Department of Psychology, University of British Columbia, Vancouver, BC, Canada
5
Abstract
This chapter explores the current state of the art in eye tracking within 3D virtual environments. It begins with the motivation for eye tracking in Virtual Reality (VR) in psychological research, followed by descriptions of the hardware and software used for presenting virtual environments as well as for tracking eye and head movements in VR. This is followed by a detailed description of an example project on eye and head tracking while observers look at 360° panoramic scenes. The example is illustrated with descriptions of the user interface and program excerpts to show the measurement of eye and head movements in VR. The chapter continues with fundamentals of data analysis, in particular methods for the determination of fixations and saccades when viewing spherical displays. We then extend these methodological considerations to determining the spatial and temporal coordination of the eyes and head in VR perception. The chapter concludes with a discussion of outstanding problems and future directions for conducting eye- and head-tracking research in VR. We hope that this chapter will serve as a primer for those intending to implement VR eye tracking in their own research.
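The determination of fixations and saccades on spherical displays that the chapter describes can be sketched with a velocity-threshold (I-VT) classifier operating on the angular velocity between successive gaze vectors. The function name and the 100 deg/s threshold below are illustrative assumptions, not the chapter's implementation.

```python
import math

def detect_saccades(gaze_dirs, timestamps, vel_thresh=100.0):
    """Sketch of velocity-threshold (I-VT) event detection for spherical
    gaze data. `gaze_dirs` are unit 3D gaze vectors and `timestamps` are
    in seconds; samples whose angular velocity (deg/s) exceeds
    `vel_thresh` are labelled saccade ('S'), the rest fixation ('F')."""
    labels = ['F']  # first sample has no velocity estimate
    for i in range(1, len(gaze_dirs)):
        # angle between consecutive gaze vectors, via their dot product
        dot = sum(a * b for a, b in zip(gaze_dirs[i - 1], gaze_dirs[i]))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
        dt = timestamps[i] - timestamps[i - 1]
        labels.append('S' if dt > 0 and angle / dt > vel_thresh else 'F')
    return labels
```

Working on angles between 3D vectors, rather than on pixel displacements, avoids the distortions that arise when spherical gaze data are projected onto a flat image.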
6
Head Orientation Influences Saccade Directions during Free Viewing. eNeuro 2022; 9:ENEURO.0273-22.2022. PMID: 36351820. PMCID: PMC9787809. DOI: 10.1523/eneuro.0273-22.2022.
Abstract
When looking around a visual scene, humans make saccadic eye movements to fixate objects of interest. While the extraocular muscles can execute saccades in any direction, not all saccade directions are equally likely: saccades in horizontal and vertical directions are most prevalent. Here, we asked whether head orientation plays a role in determining saccade direction biases. Study participants (n = 14) viewed natural scenes and abstract fractals (radially symmetric patterns) through a virtual reality headset equipped with eye tracking. Participants' heads were stabilized and tilted at -30°, 0°, or 30° while viewing the images, which could also be tilted by -30°, 0°, or 30° relative to the head. To determine whether the biases in saccade direction changed with head tilt, we calculated polar histograms of saccade directions and cross-correlated pairs of histograms to find the angular displacement resulting in the maximum correlation. During free viewing of fractals, saccade biases largely followed the orientation of the head, with an average displacement of 24° when comparing head upright to head tilt in world-referenced coordinates (t(13) = 17.63, p < 0.001). There was a systematic offset of 2.6° in saccade directions, likely reflecting ocular counter-roll (OCR; t(13) = 3.13, p = 0.008). When participants viewed an Earth-upright natural scene during head tilt, we found that the orientation of the head still influenced saccade directions (t(13) = 3.7, p = 0.001). These results suggest that nonvisual information about head orientation, such as that acquired by vestibular sensors, likely plays a role in saccade generation.
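The displacement analysis described in this abstract (polar histograms of saccade directions, circularly cross-correlated to find the rotation that maximizes their agreement) can be sketched as follows. Function names and the 10° bin width are illustrative assumptions, not the authors' code.

```python
def direction_histogram(directions_deg, bin_width=10):
    """Polar histogram of saccade directions (degrees, wrapped to 0-360)."""
    nbins = 360 // bin_width
    counts = [0] * nbins
    for d in directions_deg:
        counts[int(d % 360) // bin_width] += 1
    return counts

def best_rotation(hist_a, hist_b, bin_width=10):
    """Circularly cross-correlate two direction histograms and return the
    angular displacement (degrees) of hist_b relative to hist_a that
    yields the maximum correlation."""
    n = len(hist_a)
    def corr(shift):
        return sum(hist_a[i] * hist_b[(i + shift) % n] for i in range(n))
    return max(range(n), key=corr) * bin_width
```

For example, a direction distribution and a copy of it rotated by 30° produce histograms whose circular cross-correlation peaks at a 30° displacement, mirroring how the head-upright and head-tilted conditions are compared.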
7
David EJ, Lebranchu P, Perreira Da Silva M, Le Callet P. What are the visuo-motor tendencies of omnidirectional scene free-viewing in virtual reality? J Vis 2022; 22:12. PMID: 35323868. PMCID: PMC8963670. DOI: 10.1167/jov.22.4.12.
Abstract
Central and peripheral vision during visual tasks have been extensively studied on two-dimensional screens, highlighting their perceptual and functional disparities. This study has two objectives: replicating on-screen gaze-contingent experiments that remove the central or peripheral field of view in virtual reality, and identifying visuo-motor biases specific to the exploration of 360° scenes with a wide field of view. Our results are useful for vision modelling, with applications in gaze position prediction (e.g., content compression and streaming). We ask how previous on-screen findings translate to conditions where observers can use their head to explore stimuli. We implemented a gaze-contingent paradigm to simulate loss of vision in virtual reality, in which participants could freely view omnidirectional natural scenes. This protocol allows the simulation of vision loss with an extended field of view (>80°) and the study of the head's contributions to visual attention. The time-course of visuo-motor variables in our pure free-viewing task reveals long fixations and short saccades during the first seconds of exploration, contrary to the literature on visual tasks guided by instructions. We show that the effect of vision loss is reflected primarily in eye movements, in a manner consistent with the two-dimensional screen literature. We hypothesize that head movements mainly serve to explore the scenes during free viewing; the presence of masks did not significantly impact head scanning behaviours. We present new fixational and saccadic visuo-motor tendencies in a 360° context that we hope will help in the creation of gaze prediction models dedicated to virtual reality.
Affiliation(s)
- Erwan Joël David
- Department of Psychology, Goethe-Universität, Frankfurt, Germany
- Pierre Lebranchu
- LS2N UMR CNRS 6004, University of Nantes and Nantes University Hospital, Nantes, France
- Patrick Le Callet
- LS2N UMR CNRS 6004, University of Nantes, Nantes, France
- http://pagesperso.ls2n.fr/~lecallet-p/index.html