1
Lehser C, Hillyard SA, Strauss DJ. Feeling senseless sensations: a crossmodal EEG study of mismatched tactile and visual experiences in virtual reality. J Neural Eng 2024;21:056042. PMID: 39374631. DOI: 10.1088/1741-2552/ad83f5.
Abstract
Objective. To create highly immersive experiences in virtual reality (VR), it is important not only to include the visual sense but also to involve multimodal sensory input. To achieve optimal results, the temporal and spatial synchronization of these multimodal inputs is critical. It is therefore necessary to find methods that objectively evaluate the synchronization of VR experiences while continuously tracking the user. Approach. In this study, a passive touch experience was incorporated into a visual-tactile VR setup using VR glasses and mid-air tactile sensations. Inconsistencies of multimodal perception were intentionally integrated into a discrimination task. The participants' electroencephalogram (EEG) was recorded to obtain neural correlates of visual-tactile mismatch situations. Main results. The results showed significant differences in the event-related potentials (ERPs) between match and mismatch situations. A biphasic ERP configuration consisting of a positivity at 120 ms and a later negativity at 370 ms was observed following a visual-tactile mismatch. Significance. This late negativity may be related to the N400, which is associated with semantic incongruency. These results provide a promising approach toward the objective evaluation of visual-tactile synchronization in virtual experiences.
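The kind of match-versus-mismatch ERP contrast described above is often quantified with a difference wave whose peaks index the reported components. A minimal sketch on synthetic data (not the study's actual analysis; the waveform shapes, amplitudes, and sampling rate are assumptions chosen only to reproduce deflections near 120 ms and 370 ms):

```python
# Hypothetical sketch: build a mismatch-minus-match ERP difference wave from
# synthetic grand averages, then locate its early positivity and late negativity.
import numpy as np

fs = 250                                  # assumed sampling rate in Hz
t = np.arange(-0.2, 0.8, 1 / fs)          # epoch from -200 ms to 800 ms
rng = np.random.default_rng(3)

def gauss(center, width, amp):
    """Gaussian bump standing in for an ERP component (seconds, seconds, µV)."""
    return amp * np.exp(-((t - center) ** 2) / (2 * width ** 2))

# Toy grand averages: the mismatch condition adds a biphasic deflection.
erp_match = rng.normal(0, 0.05, t.size)
erp_mismatch = erp_match + gauss(0.12, 0.02, 2.0) + gauss(0.37, 0.05, -3.0)

diff = erp_mismatch - erp_match           # difference wave isolates the effect
pos_peak_ms = t[np.argmax(diff)] * 1000
neg_peak_ms = t[np.argmin(diff)] * 1000
print(f"positivity at {pos_peak_ms:.0f} ms, negativity at {neg_peak_ms:.0f} ms")
```

In practice the difference wave would be computed from averaged epochs per condition and participant before peak picking.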
Affiliation(s)
- Caroline Lehser
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Center for Digital Neurotechnologies Saar, Homburg/Saar & Saarbruecken, Germany
- Steven A Hillyard
- Leibniz Institute of Neurobiology, Magdeburg, Germany
- Department of Neurosciences, University of California, San Diego, CA, United States of America
- Daniel J Strauss
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Center for Digital Neurotechnologies Saar, Homburg/Saar & Saarbruecken, Germany
2
Nguyen W, Gramann K, Gehrke L. Modeling the Intent to Interact With VR Using Physiological Features. IEEE Trans Vis Comput Graph 2024;30:5893-5900. PMID: 37624723. DOI: 10.1109/tvcg.2023.3308787.
Abstract
OBJECTIVE Mixed reality (XR) technologies promise a user experience (UX) that rivals interactive experience in the real world. The key facilitators in the design of such a natural UX are that the interaction has zero lag and that users experience no excess mental load. This is difficult to achieve due to technical constraints such as motion-to-photon latency as well as false positives during gesture-based interaction. METHODS In this paper, we explored the use of physiological features to model the user's intent to interact with a virtual reality (VR) environment. Accurate predictions about when users want to express an interaction intent could overcome the limitations of an interactive device that lags behind the intention of a user. We computed time-domain features from electroencephalography (EEG) and electromyography (EMG) recordings during a grab-and-drop task in VR and cross-validated a linear discriminant analysis (LDA) classifier for three feature combinations: (1) EEG, (2) EMG, and (3) EEG-EMG. RESULTS & CONCLUSION The classifiers detected the presence of a pre-movement state, reflecting the users' intent to interact with the virtual objects, from background idle activity above simulated chance level (EEG: 62% ± 10%, EMG: 72% ± 9%, EEG-EMG: 69% ± 10%). The features leveraged in our classification scheme have a low computational cost and are especially useful for fast decoding of users' mental states. Our work is a further step towards a useful classification of users' intent to interact, as high temporal resolution and speed of detection are crucial. This facilitates natural experiences through zero-lag adaptive interfaces.
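The core recipe in that abstract, cross-validating an LDA on cheap time-domain features, can be sketched as follows. This is a hypothetical illustration on synthetic data, not the authors' pipeline; the feature set (mean, variance, peak-to-peak per channel) and the simulated "pre-movement" drift are assumptions:

```python
# Hypothetical sketch: LDA cross-validation on time-domain features to
# separate "pre-movement" from "idle" windows, using synthetic signals
# in place of real EEG/EMG recordings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows, n_channels, n_samples = 200, 8, 128

def time_domain_features(windows):
    """Cheap per-channel features: mean, variance, peak-to-peak amplitude."""
    return np.concatenate([
        windows.mean(axis=2),
        windows.var(axis=2),
        np.ptp(windows, axis=2),
    ], axis=1)

# Synthetic data: "pre-movement" windows get a slow negative drift
# (a crude stand-in for a readiness-potential-like shift).
idle = rng.normal(0.0, 1.0, (n_windows, n_channels, n_samples))
pre = rng.normal(0.0, 1.0, (n_windows, n_channels, n_samples))
pre += np.linspace(0.0, -1.0, n_samples)   # drift across each window

X = time_domain_features(np.concatenate([idle, pre]))
y = np.concatenate([np.zeros(n_windows), np.ones(n_windows)])

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Such features are attractive for online use precisely because they avoid spectral transforms and can be recomputed on every incoming window.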
3
Gramann K, Lotte F, Dehais F, Ayaz H, Vukelić M, Karwowski W, Fairclough S, Brouwer AM, Roy RN. Editorial: Open science to support replicability in neuroergonomic research. Front Neuroergon 2024;5:1459204. PMID: 39139473. PMCID: PMC11319283. DOI: 10.3389/fnrgo.2024.1459204.
Affiliation(s)
- Klaus Gramann
- Biological Psychology and Neuroergonomics, Technische Universitaet Berlin, Berlin, Germany
- Fabien Lotte
- Inria Center at the University of Bordeaux/LaBRI, Bordeaux, France
- Frederic Dehais
- Fédération ENAC ISAE-SUPAERO ONERA, Université de Toulouse, Toulouse, France
- Hasan Ayaz
- School of Biomedical Engineering Science and Health Systems, Drexel University, Philadelphia, PA, United States
- Mathias Vukelić
- Fraunhofer Institute for Industrial Engineering IAO, Stuttgart, Germany
- Waldemar Karwowski
- Department of Industrial Engineering and Management Systems, University of Central Florida, Orlando, FL, United States
- Stephen Fairclough
- School of Psychology, Liverpool John Moores University, Liverpool, United Kingdom
- Raphaëlle N. Roy
- Fédération ENAC ISAE-SUPAERO ONERA, Université de Toulouse, Toulouse, France
4
Gehrke L, Terfurth L, Akman S, Gramann K. Visuo-haptic prediction errors: a multimodal dataset (EEG, motion) in BIDS format indexing mismatches in haptic interaction. Front Neuroergon 2024;5:1411305. PMID: 38903905. PMCID: PMC11188399. DOI: 10.3389/fnrgo.2024.1411305.
Affiliation(s)
- Lukas Gehrke
- Biological Psychology and Neuroergonomics, Department of Psychology and Ergonomics, Technische Universität Berlin, Berlin, Germany
5
Wimmer M, Weidinger N, Veas E, Müller-Putz GR. Multimodal decoding of error processing in a virtual reality flight simulation. Sci Rep 2024;14:9221. PMID: 38649681. PMCID: PMC11035577. DOI: 10.1038/s41598-024-59278-y.
Abstract
Technological advances in head-mounted displays (HMDs) facilitate the acquisition of physiological data of the user, such as gaze, pupil size, or heart rate. Still, interactions with such systems can be prone to errors, including unintended behavior or unexpected changes in the presented virtual environments. In this study, we investigated if multimodal physiological data can be used to decode error processing, which has been studied, to date, with brain signals only. We examined the feasibility of decoding errors solely with pupil size data and proposed a hybrid decoding approach combining electroencephalographic (EEG) and pupillometric signals. Moreover, we analyzed if hybrid approaches can improve existing EEG-based classification approaches and focused on setups that offer increased usability for practical applications, such as the presented game-like virtual reality flight simulation. Our results indicate that classifiers trained with pupil size data can decode errors above chance. Moreover, hybrid approaches yielded improved performance compared to EEG-based decoders in setups with a reduced number of channels, which is crucial for many out-of-the-lab scenarios. These findings contribute to the development of hybrid brain-computer interfaces, particularly in combination with wearable devices, which allow for easy acquisition of additional physiological data.
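The hybrid decoding idea described above amounts to fusing EEG and pupillometric feature vectors before classification. A minimal early-fusion sketch on synthetic data (assumptions, not the paper's code: feature dimensions, effect sizes, and the LDA classifier are illustrative choices):

```python
# Hypothetical sketch: compare EEG-only, pupil-only, and hybrid (concatenated)
# feature sets for decoding error vs. correct trials, on synthetic data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials = 300
y = rng.integers(0, 2, n_trials)          # 0 = correct trial, 1 = error trial

# Synthetic features: errors shift a few EEG channels (ErrP-like deflection)
# and enlarge the pupil (dilation response); both signals are weak and noisy.
eeg = rng.normal(0, 1, (n_trials, 16)) + 0.6 * y[:, None] * (np.arange(16) < 4)
pupil = rng.normal(0, 1, (n_trials, 3)) + 0.8 * y[:, None]

results = {}
for name, X in [("EEG", eeg), ("pupil", pupil),
                ("hybrid", np.hstack([eeg, pupil]))]:
    results[name] = cross_val_score(
        LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    print(f"{name:>6}: {results[name]:.2f}")
```

Early fusion keeps the classifier simple; an alternative design would train one classifier per modality and combine their posterior probabilities (late fusion).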
Affiliation(s)
- Michael Wimmer
- Know-Center GmbH, Graz, Austria
- Institute of Neural Engineering, Graz University of Technology, Graz, Austria
- Eduardo Veas
- Know-Center GmbH, Graz, Austria
- Institute of Interactive Systems and Data Science, Graz University of Technology, Graz, Austria
- Gernot R Müller-Putz
- Institute of Neural Engineering, Graz University of Technology, Graz, Austria.
- BioTechMed-Graz, Graz, Austria.
6
Flores-Cortes M, Guerra-Armas J, Pineda-Galan C, La Touche R, Luque-Suarez A. Sensorimotor uncertainty of immersive virtual reality environments for people in pain: scoping review. Brain Sci 2023;13:1461. PMID: 37891829. PMCID: PMC10604973. DOI: 10.3390/brainsci13101461.
Abstract
INTRODUCTION Decision making and action execution both rely on sensory information, and their primary objective is to minimise uncertainty. Virtual reality (VR) introduces uncertainty due to the imprecision of perceptual information. The concept of "sensorimotor uncertainty" is a pivotal element in the interplay between perception and action within the VR environment. The role of immersive VR in the four stages of motor behaviour decision making in people with pain has been discussed previously. These four processing levels are the basis for understanding the uncertainty that a patient experiences when using VR: sensory information, current state, transition rules, and the outcome obtained. METHODS This review examines the different types of uncertainty that a patient may experience when immersed in a virtual reality environment in the context of pain. Randomised clinical trials, secondary analyses of randomised clinical trials, and pilot randomised clinical trials related to sensorimotor uncertainty in immersive virtual reality were included after the search. RESULTS Fifty studies were included in this review. They were divided into four categories according to the type of uncertainty the intervention created and the stage of the decision-making model. CONCLUSIONS Immersive virtual reality makes it possible to alter sensorimotor uncertainty, but studies of higher methodological quality are needed on this topic, as well as an exploration of the patient profile for pain management using immersive VR.
Affiliation(s)
- Mar Flores-Cortes
- Faculty of Health Sciences, University of Malaga, 29071 Malaga, Spain
- Roy La Touche
- Instituto de Dolor Craneofacial y Neuromusculoesquelético (INDCRAN), 28008 Madrid, Spain
- Departamento de Fisioterapia, Centro Superior de Estudios Universitarios La Salle, Universidad Autónoma de Madrid, 28023 Madrid, Spain
- Motion in Brains Research Group, Institute of Neuroscience and Sciences of the Movement (INCIMOV), Centro Superior de Estudios Universitarios La Salle, Universidad Autónoma de Madrid, 28023 Madrid, Spain
- Alejandro Luque-Suarez
- Faculty of Health Sciences, University of Malaga, 29071 Malaga, Spain
- Instituto de Investigacion Biomedica de Malaga (IBIMA), 29071 Malaga, Spain
7
Feder S, Miksch J, Grimm S, Krems JF, Bendixen A. Using event-related brain potentials to evaluate motor-auditory latencies in virtual reality. Front Neuroergon 2023;4:1196507. PMID: 38234486. PMCID: PMC10790907. DOI: 10.3389/fnrgo.2023.1196507.
Abstract
Actions in the real world have immediate sensory consequences. Mimicking these in digital environments is within reach, but technical constraints usually impose a certain latency (delay) between user actions and system responses. It is important to assess the impact of this latency on the users, ideally with measurement techniques that do not interfere with their digital experience. One such unobtrusive technique is electroencephalography (EEG), which can capture the users' brain activity associated with motor responses and sensory events by extracting event-related potentials (ERPs) from the continuous EEG recording. Here we exploit the fact that the amplitude of sensory ERP components (specifically, N1 and P2) reflects the degree to which the sensory event was perceived as an expected consequence of an own action (self-generation effect). Participants (N = 24) elicit auditory events in a virtual-reality (VR) setting by entering codes on virtual keypads to open doors. In a within-participant design, the delay between user input and sound presentation is manipulated across blocks. Occasionally, the virtual keypad is operated by a simulated robot instead, yielding a control condition with externally generated sounds. Results show that N1 (but not P2) amplitude is reduced for self-generated relative to externally generated sounds, and P2 (but not N1) amplitude is modulated by delay of sound presentation in a graded manner. This dissociation between N1 and P2 effects maps back to basic research on self-generation of sounds. We suggest P2 amplitude as a candidate read-out to assess the quality and immersiveness of digital environments with respect to system latency.
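The self-generation effect reported above is typically quantified as mean ERP amplitude within component-specific time windows, contrasted between self-generated and externally generated sounds. A toy sketch (synthetic waveforms; the N1 window of 80-120 ms and all amplitudes are illustrative assumptions, not the study's parameters):

```python
# Hypothetical sketch: quantify N1 attenuation for self-generated sounds as
# mean amplitude in a fixed time window, using simulated ERP traces.
import numpy as np

fs = 500                                  # assumed sampling rate in Hz
t = np.arange(-0.1, 0.5, 1 / fs)          # epoch from -100 ms to 500 ms
rng = np.random.default_rng(2)

def mean_amplitude(erp, lo, hi):
    """Mean amplitude (µV) of an ERP trace within [lo, hi] seconds."""
    return erp[(t >= lo) & (t <= hi)].mean()

def simulate_erp(n1_gain):
    """Toy ERP: N1 (negative, ~100 ms) scaled by n1_gain, fixed P2 (~200 ms)."""
    n1 = -n1_gain * np.exp(-((t - 0.10) ** 2) / (2 * 0.02 ** 2))
    p2 = 3.0 * np.exp(-((t - 0.20) ** 2) / (2 * 0.03 ** 2))
    return n1 + p2 + rng.normal(0, 0.1, t.size)

erp_self = simulate_erp(n1_gain=2.0)      # attenuated N1 for own actions
erp_ext = simulate_erp(n1_gain=5.0)       # full N1 for external sounds

n1_self = mean_amplitude(erp_self, 0.08, 0.12)
n1_ext = mean_amplitude(erp_ext, 0.08, 0.12)
print(f"N1 self: {n1_self:.2f} µV, external: {n1_ext:.2f} µV")
```

A latency-sensitive read-out like the P2 modulation would use the same windowing approach, compared across the manipulated delay conditions.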
Affiliation(s)
- Sascha Feder
- Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Jochen Miksch
- Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Physics of Cognition Group, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Sabine Grimm
- Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Physics of Cognition Group, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Josef F. Krems
- Research Group Cognitive and Engineering Psychology, Institute of Psychology, Faculty of Behavioural and Social Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Alexandra Bendixen
- Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany
8
Wimmer M, Weidinger N, Veas E, Müller-Putz GR. Neural and Pupillometric Correlates of Error Perception in an Immersive VR Flight Simulation. Annu Int Conf IEEE Eng Med Biol Soc 2023;2023:1-4. PMID: 38083691. DOI: 10.1109/embc40787.2023.10340376.
Abstract
Algorithms detecting erroneous events, as used in brain-computer interfaces, usually rely solely on neural correlates of error perception. The increasing availability of wearable displays with built-in pupillometric sensors enables access to additional physiological data, potentially improving error detection. Hence, we measured both electroencephalographic (EEG) and pupillometric signals of 19 participants while performing a navigation task in an immersive virtual reality (VR) setting. We found EEG and pupillometric correlates of error perception and significant differences between distinct error types. Further, we found that actively performing tasks delays error perception. We believe that the results of this work could contribute to improving error detection, which has rarely been studied in the context of immersive VR.
9
Jeung S, Hilton C, Berg T, Gehrke L, Gramann K. Virtual Reality for Spatial Navigation. Curr Top Behav Neurosci 2023;65:103-129. PMID: 36512288. DOI: 10.1007/7854_2022_403.
Abstract
Immersive virtual reality (VR) allows its users to experience physical space in a non-physical world. It has developed into a powerful research tool to investigate the neural basis of human spatial navigation as an embodied experience. The task of wayfinding can be carried out using a wide range of strategies, leading to the recruitment of various sensory modalities and brain areas in real-life scenarios. While traditional desktop-based VR setups primarily focus on vision-based navigation, immersive VR setups, especially mobile variants, can efficiently account for motor processes that constitute locomotion in the physical world, such as head-turning and walking. When used in combination with mobile neuroimaging methods, immersive VR affords a natural mode of locomotion and high immersion in experimental settings, creating an embodied spatial experience. This in turn facilitates ecologically valid investigation of the neural underpinnings of spatial navigation.
Affiliation(s)
- Sein Jeung
- Department of Biological Psychology and Neuroergonomics, Technische Universität Berlin, Berlin, Germany
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Trondheim, Norway
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Christopher Hilton
- Department of Biological Psychology and Neuroergonomics, Technische Universität Berlin, Berlin, Germany
- Timotheus Berg
- Department of Biological Psychology and Neuroergonomics, Technische Universität Berlin, Berlin, Germany
- Lukas Gehrke
- Department of Biological Psychology and Neuroergonomics, Technische Universität Berlin, Berlin, Germany
- Klaus Gramann
- Department of Biological Psychology and Neuroergonomics, Technische Universität Berlin, Berlin, Germany.
- Center for Advanced Neurological Engineering, University of California, San Diego, CA, USA.