1
Ribeiro JC, Rocha C, Barbosa B, Lima RC, Cunha LM. Sensory Analysis Performed within Augmented Virtuality System: Impact on Hedonic Scores, Engagement, and Presence Level. Foods 2024; 13:2456. [PMID: 39123647] [PMCID: PMC11311452] [DOI: 10.3390/foods13152456]
Abstract
Sensory analysis is typically performed in sensory booths designed to minimise external stimuli, and therefore lacks ecological validity. Immersive environments are used to introduce contextual cues, but studies using mixed reality systems are scarce. The main goal of this study was to evaluate an augmented virtuality (AV) system in which participants are placed in a virtual environment while evaluating a real product, able to interact with both dimensions. A panel of 102 consumers evaluated five samples of commercial peach nectar in three sessions, each in a different environment: a public food court, a living room (AV environments), and a laboratory (traditional sensory booth). Consumers rated overall liking, provided open comments, and answered an Engagement Questionnaire (EQ) and a Presence Questionnaire (PQ). The type of environment affected only hedonic discrimination among samples, with the laboratory being the only setting in which samples were discriminated. Nonetheless, no individual sample was evaluated differently across environments. Concerning engagement, the environment significantly influenced only the EQ's 'Affective Value' factor, which was higher in the AV environments. The level of presence in the virtual environment was significantly higher in the public food court and was significantly correlated with the EQ factor scores.
Affiliation(s)
- José Carlos Ribeiro
- GreenUPorto/INOV4Agro & DGAOT, Faculty of Sciences of the University of Porto, Rua da Agrária, 747, 4485-646 Vairão, Portugal
- Célia Rocha
- GreenUPorto/INOV4Agro & DGAOT, Faculty of Sciences of the University of Porto, Rua da Agrária, 747, 4485-646 Vairão, Portugal
- Sense Test, Lda, Rua Zeferino Costa, 341, 4400-345 Vila Nova de Gaia, Portugal
- Bruna Barbosa
- GreenUPorto/INOV4Agro & DGAOT, Faculty of Sciences of the University of Porto, Rua da Agrária, 747, 4485-646 Vairão, Portugal
- Sense Test, Lda, Rua Zeferino Costa, 341, 4400-345 Vila Nova de Gaia, Portugal
- Rui Costa Lima
- Sense Test, Lda, Rua Zeferino Costa, 341, 4400-345 Vila Nova de Gaia, Portugal
- Luís Miguel Cunha
- GreenUPorto/INOV4Agro & DGAOT, Faculty of Sciences of the University of Porto, Rua da Agrária, 747, 4485-646 Vairão, Portugal
2
Narciso D, Melo M, Rodrigues S, Cunha JP, Vasconcelos-Raposo J, Bessa M. Studying the Influence of Multisensory Stimuli on a Firefighting Training Virtual Environment. IEEE Transactions on Visualization and Computer Graphics 2024; 30:4122-4136. [PMID: 37028005] [DOI: 10.1109/tvcg.2023.3251188]
Abstract
How we perceive and experience the world around us is inherently multisensory. Most Virtual Reality (VR) literature is based on the senses of sight and hearing, yet there is considerable potential for integrating additional stimuli into Virtual Environments (VEs), especially in a training context. Identifying the stimuli needed for a virtual experience that is perceptually equivalent to a real one will lead users to behave the same across environments, which adds substantial value for several training areas, such as firefighting. In this article, we present an experiment assessing the impact of different sensory stimuli on stress, fatigue, cybersickness, presence, and knowledge transfer of users during a firefighter training VE. The results suggested that the stimuli that significantly impacted users' responses were wearing a firefighter's uniform and combining all sensory stimuli under study: heat, weight, uniform, and mask. The results also showed that the VE did not induce cybersickness and that it succeeded in transferring knowledge.
3
Xiong N, Liu Q, Zhu K. PetPresence: Investigating the Integration of Real-World Pet Activities in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 2024; 30:2559-2569. [PMID: 38437107] [DOI: 10.1109/tvcg.2024.3372095]
Abstract
For VR interaction, a home environment with a complicated spatial setup and dynamics may hinder the VR user experience; in particular, pets' movements can be unpredictable. In this paper, we investigate the integration of real-world pet activities into immersive VR interaction. Our pilot study showed that active pet movements, especially dogs', could negatively impact users' performance and experience in immersive VR. We propose three types of pet integration: a semitransparent real-world portal, a non-interactive object in VR, and an interactive object in VR. We conducted a user study with 16 pet owners and their pets. The results showed that, compared to a baseline condition without any pet-integration technique, integrating the pet as an interactive object in VR yielded significantly higher participant ratings in perceived realism, joy, multisensory engagement, and connection with their pets in VR.
4
Souchet AD, Lourdeaux D, Burkhardt JM, Hancock PA. Design guidelines for limiting and eliminating virtual reality-induced symptoms and effects at work: a comprehensive, factor-oriented review. Front Psychol 2023; 14:1161932. [PMID: 37359863] [PMCID: PMC10288216] [DOI: 10.3389/fpsyg.2023.1161932]
Abstract
Virtual reality (VR) can induce side effects known as virtual reality-induced symptoms and effects (VRISE). To address this concern, we compile a literature-based listing of the factors thought to influence VRISE, with a focus on office work. Using these, we recommend guidelines for VRISE amelioration intended for virtual environment creators and users. We identify five VRISE risks, focusing on short-term symptoms and their short-term effects. Three overall factor categories are considered: individual, hardware, and software. Over 90 factors may influence VRISE frequency and severity. We provide guidelines for each factor to help reduce VR side effects and, to better reflect our confidence in them, grade each with a level-of-evidence rating. Common factors occasionally influence different forms of VRISE, which can lead to confusion in the literature. General guidelines for using VR at work involve worker adaptation, such as limiting immersion times to between 20 and 30 min and taking regular breaks. Extra care is required for workers with special needs, neurodiversity, and gerontechnological concerns. In addition to following our guidelines, stakeholders should be aware that current head-mounted displays and virtual environments can still induce VRISE. While no single existing method fully alleviates VRISE, workers' health and safety must be monitored and safeguarded when VR is used at work.
Affiliation(s)
- Alexis D. Souchet
- Heudiasyc UMR 7253, Alliance Sorbonne Université, Université de Technologie de Compiègne, CNRS, Compiègne, France
- Institute for Creative Technologies, University of Southern California, Los Angeles, CA, United States
- Domitile Lourdeaux
- Heudiasyc UMR 7253, Alliance Sorbonne Université, Université de Technologie de Compiègne, CNRS, Compiègne, France
- Peter A. Hancock
- Department of Psychology, University of Central Florida, Orlando, FL, United States
5
Covaci A, Saleme EB, Mesfin G, Comsa IS, Trestian R, Santos CAS, Ghinea G. Multisensory 360° Videos Under Varying Resolution Levels Enhance Presence. IEEE Transactions on Visualization and Computer Graphics 2023; 29:2093-2101. [PMID: 34990363] [DOI: 10.1109/tvcg.2022.3140875]
Abstract
Omnidirectional videos have become a leading multimedia format for Virtual Reality applications. While live 360° videos offer a unique immersive experience, streaming omnidirectional content at high resolutions is not always feasible in bandwidth-limited networks. Whereas flat videos scale well to lower resolutions, 360° video quality is seriously degraded because of the viewing distances involved in head-mounted displays. Hence, in this article, we first investigate how quality degradation impacts the sense of presence in immersive Virtual Reality applications. We then push the boundaries of 360° technology through enhancement with multisensory stimuli. Forty-eight participants experienced both 360° scenarios (with and without multisensory content) and were randomly divided among four conditions characterised by different encoding qualities (HD, FullHD, 2.5K, 4K). The results showed that presence is not mediated by streaming at a higher bitrate. However, the trend we identified revealed that presence is positively and significantly impacted by the enhancement with multisensory content. This shows that multisensory technology is crucial in creating more immersive experiences.
6
Evoked sensory stimulation of the eating environment, impacts feeling of presence and food desires in an online environment. Food Res Int 2023; 167:112645. [PMID: 37087236] [DOI: 10.1016/j.foodres.2023.112645]
Abstract
Online food choices are often made outside a regular food environment and suffer from sensory deprivation. The present study investigated whether evoked multi-sensory stimulation can drive context-specific food desires in an online environment. In a randomised between-subject design, participants expressed their food desire on a visual analogue scale and their feeling of presence (e.g., did you feel present on a beach) on a Likert scale, while looking online at a picture and reading a neutral description of a sensory laboratory (control condition), looking at a photo of a beach and reading a neutral description (beach condition), or looking at a photo of a beach and reading a sensory-based description (beach+ condition). Participants (n = 725, 622 females) who saw the beach photo increased their desire for cold, but not neutral, foods (p < 0.05); those who were exposed to the sensory description in addition to the photo showed a higher desire for cold foods than those who saw the beach photo alone (p < 0.001). These effects were modulated by an increased feeling of presence and by how often participants visited the beach. Participants with a higher feeling of presence showed a higher desire for cold foods (p < 0.05), and the food desires of those who visited the beach often were more affected by the evoked sensory stimulation than those of rare visitors. Food desires in an online environment can thus be influenced by visual and text-based evoked sensory stimulation, as long as consumers' feeling of presence is high. These results can inform public health professionals on how to influence healthy food choices online.
7
Chung KS. The effect of sensory experience on sport development: baseball simulation in Korea. Managing Sport and Leisure 2022. [DOI: 10.1080/23750472.2022.2159502]
Affiliation(s)
- Kyu-soo Chung
- Exercise Science and Sport Management, Wellstar College of Health and Human Services, Kennesaw State University, Kennesaw, GA, USA
8
Kim H, Lee IK. Studying the Effects of Congruence of Auditory and Visual Stimuli on Virtual Reality Experiences. IEEE Transactions on Visualization and Computer Graphics 2022; 28:2080-2090. [PMID: 35167477] [DOI: 10.1109/tvcg.2022.3150514]
Abstract
Studies in virtual reality (VR) have introduced numerous multisensory simulation techniques for more immersive VR experiences. However, while these primarily focus on expanding the range of sensory types or increasing individual sensory quality, there is little consensus on designing appropriate interactions between different sensory stimuli. This paper explores how the congruence between auditory and visual (AV) stimuli, the sensory stimuli typically provided by VR devices, affects the cognition and experience of VR users, as a critical interaction factor in promoting multisensory integration. We defined the types of (in)congruence between AV stimuli and then designed 12 virtual spaces with different types or degrees of congruence between AV stimuli. We then evaluated the presence, immersion, motion sickness, and cognition changes in each space. We observed three key findings: 1) there is a limit to the degree of temporal or spatial incongruence that can be tolerated, with few negative effects on user experience until that point is exceeded; 2) users are tolerant of semantic incongruence; 3) a simulation that considers synesthetic congruence contributes to the user's sense of immersion and presence. Based on these insights, we identified essential considerations for designing sensory simulations in VR and proposed future research directions.
9
Virtual Reality for Neurorehabilitation and Cognitive Enhancement. Brain Sci 2021; 11:221. [PMID: 33670277] [PMCID: PMC7918687] [DOI: 10.3390/brainsci11020221]
Abstract
Our access to computer-generated worlds changes the way we feel, how we think, and how we solve problems. In this review, we explore the utility of different types of virtual reality, immersive or non-immersive, for providing controllable, safe environments that enable individual training, neurorehabilitation, or even replacement of lost functions. The neurobiological effects of virtual reality on neuronal plasticity have been shown to result in increased cortical gray matter volumes, higher concentration of electroencephalographic beta-waves, and enhanced cognitive performance. Clinical application of virtual reality is aided by innovative brain–computer interfaces, which allow direct tapping into the electric activity generated by different brain cortical areas for precise voluntary control of connected robotic devices. Virtual reality is also valuable to healthy individuals as a narrative medium for redesigning their individual stories in an integrative process of self-improvement and personal development. Future upgrades of virtual reality-based technologies promise to help humans transcend the limitations of their biological bodies and augment their capacity to mold physical reality to better meet the needs of a globalized world.
10
Design of Desktop Audiovisual Entertainment System with Deep Learning and Haptic Sensations. Symmetry (Basel) 2020. [DOI: 10.3390/sym12101718]
Abstract
In this study, we designed a four-dimensional (4D) audiovisual entertainment system called Sense. The system comprises a scene recognition system and hardware modules that provide haptic sensations for users watching movies and animations at home. In the scene recognition system, we used Google Cloud Vision to detect common scene elements in a video, such as fire, explosions, wind, and rain, and to further determine whether the scene depicts hot weather, rain, or snow. Additionally, for animated videos, we applied deep learning with a single-shot multibox detector to detect whether the video contained fire-related objects. The hardware module was designed to provide six types of haptic sensations, arranged with line symmetry to improve the user experience. Based on the object-detection results from the scene recognition system, the system generates the corresponding haptic sensations. The system integrates deep learning, auditory signals, and haptic sensations to provide an enhanced viewing experience.