1. Nakamura J, Kitazaki M. The effect of posture on virtual walking experience using foot vibrations. Sci Rep 2024;14:19366. PMID: 39169206; PMCID: PMC11339416; DOI: 10.1038/s41598-024-70229-5.
Abstract
Virtual walking systems for stationary observers have been developed using multimodal stimulation such as vision, touch, and sound to overcome physical limitations. In previous studies, participants were typically positioned either standing or seated. It would be beneficial if bedridden users could also have a sufficient virtual walking experience. We therefore aimed to investigate the effects of participants' posture and foot vibrations on the experience of virtual walking. Participants were sitting, standing, or lying down while observing a virtual scene of a walking avatar from the first-person perspective, while vibrations that were either synchronized or asynchronous (randomized) with respect to the avatar's steps were applied to their feet. We found that synchronized foot vibrations improved the virtual walking experience compared to asynchronous vibrations. The standing position consistently offered a better virtual walking experience than the sitting and lying positions with either synchronous or asynchronous foot vibrations, while the difference between the sitting and lying postures was small and not significant. Furthermore, subjective scores for posture matching between real and virtual postures, illusory body ownership, and sense of agency were significantly higher with the synchronous than with the asynchronous vibration. These findings suggest that experiencing virtual walking with foot vibrations in a lying position is less effective than in a standing position, but not much different from a sitting position.
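As a rough illustration of the timing manipulation this abstract describes, the sketch below generates vibration onset times that are either locked to the avatar's footstep events or randomized over the same interval. The function name, step interval, and the exact randomization scheme are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def vibration_onsets(footstep_times, synchronous=True, seed=0):
    """Return vibration onset times (s) for a sequence of avatar footstep times.
    Synchronous: one pulse per footstep. Asynchronous control: the same number of
    pulses at randomized times within the same interval (an illustrative reading
    of the 'randomized' condition; the study's exact scheme may differ)."""
    footstep_times = np.asarray(footstep_times, dtype=float)
    if synchronous:
        return footstep_times
    rng = np.random.default_rng(seed)
    return np.sort(rng.uniform(footstep_times.min(), footstep_times.max(),
                               size=footstep_times.size))

# Avatar steps every 0.6 s for 10 s; left/right alternation omitted for brevity.
steps = np.arange(0.0, 10.0, 0.6)
print(vibration_onsets(steps, synchronous=False)[:5])
```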
Affiliation(s)
- Junya Nakamura: Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi 441-8580, Japan.
- Michiteru Kitazaki: Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi 441-8580, Japan.
2. Lang M, Ghandour S, Rikard B, Balasalle EK, Rouhezamin MR, Zhang H, Uppot RN. Medical Extended Reality for Radiology Education and Training. J Am Coll Radiol 2024:S1546-1440(24)00516-7. PMID: 38866067; DOI: 10.1016/j.jacr.2024.05.006.
Abstract
Medical extended reality (MXR), encompassing augmented reality, virtual reality, and mixed reality, presents a novel paradigm in radiology training by offering immersive, interactive, and realistic learning experiences in health care. Although traditional educational tools in radiology remain essential, it is necessary to capitalize on the innovative and emerging educational applications of extended reality (XR) technologies. At the most basic level of learning anatomy, XR has been used extensively, with an emphasis on its superiority over conventional learning methods, especially in spatial understanding and recall. For imaging interpretation, XR has fostered the concept of virtual reading rooms by enabling collaborative learning environments and enhancing image analysis and understanding. Moreover, image-guided interventions in interventional radiology have seen an uptick in XR utilization, illustrating its effectiveness in procedural training and skill acquisition for medical students and residents in a safe, risk-free environment. However, several challenges and limitations remain for XR in radiology education, including technological, economic, and ergonomic challenges, as well as integration into existing curricula. This review explores the transformative potential of MXR in radiology education and training, along with insights on the future of XR in radiology education, forecasting advancements in immersive simulations, artificial intelligence integration for personalized learning, and the potential of cloud-based XR platforms for remote and collaborative training. In summation, MXR's burgeoning role in reshaping radiology education offers a safer, more scalable, and more efficient training model that aligns with the dynamic health care landscape.
Affiliation(s)
- Min Lang: Director of Innovation and Research, Medical Extended Reality Lab, Mass General Brigham, Boston, Massachusetts; Vice President of Operations, American Medical Extended Reality Association, Boston, Massachusetts; Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts.
- Samir Ghandour: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts.
- Blaire Rikard: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts; Duke University School of Medicine, Durham, North Carolina.
- Eleni K Balasalle: Program Director, Medical Extended Reality Lab, Mass General Brigham, Boston, Massachusetts.
- Haipeng Zhang: Department of Psychosocial Oncology and Palliative Care, Dana-Farber Cancer Institute, Boston, Massachusetts; President, American Medical Extended Reality Association; Chief Innovation Officer and Chief Officer, Office of Healthcare Innovation and Learning, US Department of Veterans Affairs.
- Raul N Uppot: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts; Executive Director, Medical Extended Reality Lab, Mass General Brigham, Boston, Massachusetts; Director of Interventional Radiology Research, Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts.
3. Gao B, Shao T, Tu H, Ma Q, Liu Z, Han T. Exploring Bimanual Haptic Feedback for Spatial Search in Virtual Reality. IEEE Trans Vis Comput Graph 2024;30:2422-2433. PMID: 38437136; DOI: 10.1109/tvcg.2024.3372045.
Abstract
Spatial search tasks are common and crucial in many virtual reality (VR) applications. Traditional methods to enhance spatial search performance often employ sensory cues such as visual, auditory, or haptic feedback. However, the design and use of bimanual haptic feedback with two VR controllers for spatial search in VR remain largely unexplored. In this work, we explored bimanual haptic feedback for spatial search tasks in VR, designing four types of feedback with various combinations of haptic properties. Two experiments were conducted to evaluate the effectiveness of bimanual haptic feedback for spatial direction guidance and search in VR. The results of the first experiment reveal that our proposed bimanual haptic schemes significantly enhanced the recognition of spatial directions in terms of accuracy and speed compared to spatial audio feedback. The findings of the second experiment suggest that bimanual haptic feedback performed comparably to or even better than a visual arrow, especially in reducing head movement and in finding targets behind the participants, which was also supported by subjective feedback. Based on these findings, we derive a set of design recommendations for spatial search using bimanual haptic feedback in VR.
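The abstract does not specify the four bimanual schemes, so the sketch below is only a hypothetical example of how a target's horizontal direction could be encoded as vibration amplitudes on two controllers: the hand on the target's side vibrates more strongly, and both hands are boosted when the target lies behind the user. The function name and the mapping itself are illustrative assumptions, not the paper's design.

```python
import numpy as np

def bimanual_cue(target_azimuth_deg, max_amp=1.0):
    """Illustrative (not the paper's) direction encoding: return (left, right)
    vibration amplitudes for a target at the given azimuth relative to the head,
    where 0 deg is straight ahead, +90 deg is to the right, and +/-180 deg is behind."""
    a = np.deg2rad(target_azimuth_deg)
    right = max_amp * max(0.0, np.sin(a))    # right-hand share grows toward the right
    left = max_amp * max(0.0, -np.sin(a))    # left-hand share grows toward the left
    behind = max_amp * max(0.0, -np.cos(a))  # boost both hands when the target is behind
    return min(max_amp, left + behind), min(max_amp, right + behind)

print(bimanual_cue(120.0))  # target behind-right: right hand vibrates strongest
```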
4. Nakamura J, Ikei Y, Kitazaki M. Effects of self-avatar cast shadow and foot vibration on telepresence, virtual walking experience, and cybersickness from omnidirectional movie. Iperception 2024;15:20416695241227857. PMID: 38404740; PMCID: PMC10894555; DOI: 10.1177/20416695241227857.
Abstract
Human locomotion is most naturally achieved through walking, which benefits both mental and physical health. To provide a virtual walking experience to seated users, a system utilizing foot vibrations and simulated optical flow was previously developed. The current study sought to augment this system and examine the effects of an avatar's cast shadow and foot vibrations on the virtual walking experience and cybersickness. The omnidirectional movie and the avatar's walking animation were synchronized, with the cast shadow reflecting the avatar's movement on the ground. Twenty participants were exposed to virtual walking in six conditions (with/without foot vibrations and no/short/long shadow) and were asked to rate their sense of telepresence, walking experience, and occurrence of cybersickness. Our findings indicate that synchronized foot vibrations enhanced telepresence as well as self-motion, walking, and leg-action sensations, while also reducing nausea and disorientation sickness. The avatar's cast shadow improved telepresence and the leg-action sensation but had no impact on the self-motion and walking sensations. These results suggest that observing the self-body cast shadow does not directly improve the walking sensation but is effective in enhancing telepresence and the leg-action sensation, while foot vibrations are effective in improving telepresence and the walking experience and in reducing cybersickness.
5. Lee H, Oh S, Choi S. Data-Driven Rendering of Motion Effects for Walking Sensations in Different Gaits. IEEE Trans Haptics 2022;15:547-559. PMID: 35604970; DOI: 10.1109/toh.2022.3176964.
Abstract
Motion effects are a vital component of 4D interactive applications, in which special physical effects, such as motion, vibration, and wind, are provided alongside audiovisual stimuli. In 4D films and VR games, scenes showing human locomotion appear frequently, and motion effects emphasizing such movements can enhance viewers' immersive experiences. This paper proposes a data-driven framework for the automatic generation of motion effects that provide users with walking sensations. Measurements are made using motion sensors attached to the human body during locomotion in different gaits, e.g., walking, running, and stomping. The captured data are processed and converted to multiple degree-of-freedom commands for a motion platform. We demonstrate that the data-driven motion commands can be represented in a much lower-dimensional space by principal component analysis. This finding leads to an algorithm for synthesizing new motion commands that can elicit the walking sensations of a target gait. The perceptual performance of our method is validated in two user studies. This work contributes to investigating the feasibility of mimicking walking sensations using a motion platform driven by human locomotion data and to developing an algorithm for automatically generating motion effects that convey the impressions of different gaits.
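A minimal sketch of the dimensionality-reduction step described above, using scikit-learn's PCA on synthetic multi-DoF motion-platform commands. The data, component count, and the scaling used to synthesize a new command are illustrative assumptions rather than the paper's pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for captured motion-platform commands (frames x 6 DoF):
# two latent gait rhythms mixed into six command channels plus a little noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
latents = np.column_stack([np.sin(2 * np.pi * 2 * t), np.sin(2 * np.pi * 4 * t)])
commands = latents @ rng.standard_normal((2, 6)) + 0.02 * rng.standard_normal((200, 6))

# A few principal components capture most of the variance of the commands,
# so new motion effects can be synthesized in this low-dimensional space.
pca = PCA(n_components=2)
scores = pca.fit_transform(commands)               # per-frame coordinates in PC space
synthesized = pca.inverse_transform(1.2 * scores)  # e.g. a slightly exaggerated gait
print(pca.explained_variance_ratio_)
```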
6. Tsao CA, Wu TC, Tsai HR, Wei TY, Liao FY, Chapman S, Chen BY. FrictShoes: Providing Multilevel Nonuniform Friction Feedback on Shoes in VR. IEEE Trans Vis Comput Graph 2022;28:2026-2036. PMID: 35167465; DOI: 10.1109/tvcg.2022.3150492.
Abstract
Many haptic feedback methods have been proposed to enhance realism in virtual reality (VR). However, friction feedback on the feet in VR, which can render the feeling of walking on different terrains or ground textures or of stepping on objects, is still largely unexplored. Herein, we propose a wearable device, FrictShoes, a pair of foot accessories, to provide multilevel nonuniform friction feedback to the feet. This is achieved by independently controlling six brakes on six wheels underneath each FrictShoe, which allows the friction levels of the wheels to be either matched or varied. We conducted a magnitude estimation study to understand how well users can distinguish friction force magnitudes (or levels). Based on the results, we performed an exploratory study to understand how users adjust and map multilevel nonuniform friction patterns to common VR terrains or ground textures. Finally, a VR experience study was conducted to evaluate the performance of the proposed multilevel nonuniform friction feedback to the feet in VR experiences.
7. Sra M, Danry V, Maes P, Johnsen K, Billinghurst M. Situated VR: Toward a Congruent Hybrid Reality Without Experiential Artifacts. IEEE Comput Graph Appl 2022;42:7-18. PMID: 35671280; DOI: 10.1109/mcg.2022.3154358.
Abstract
The vision of extended reality (XR) is a world in which real and virtual elements seamlessly and contextually augment our experiences of ourselves and the worlds we inhabit. While this integration promises exciting opportunities for the future of XR, it comes with the risk of experiential distortions and feelings of dissociation, especially in virtual reality (VR). When transitioning from a virtual world to the real world, users report experiential structures that linger on, like a kind of afterimage, causing disruptions in their daily lives. In this work, we define these atypical experiences as experiential artifacts (EAs) and present preliminary results from an informal online survey of 76 VR users to highlight different types of artifacts and their durations. To avoid disruptions caused by these artifacts and simultaneously increase the user's sense of presence, we propose the idea of situated VR, which blends the real and virtual in novel ways that can reduce incongruencies between the two worlds. We discuss the implications of EAs and, through examples from our own work on building hybrid experiences, demonstrate the potential and relevance of situated VR in the design of a future, more immersive, artifact-free hybrid reality.
8. Melo M, Goncalves G, Monteiro P, Coelho H, Vasconcelos-Raposo J, Bessa M. Do Multisensory Stimuli Benefit the Virtual Reality Experience? A Systematic Review. IEEE Trans Vis Comput Graph 2022;28:1428-1442. PMID: 32746276; DOI: 10.1109/tvcg.2020.3010088.
Abstract
The majority of virtual reality (VR) applications rely on audiovisual stimuli and do not exploit the addition of other sensory cues that could increase the potential of VR. This systematic review surveys the existing literature on multisensory VR and the impact of haptic, olfactory, and taste cues over audiovisual VR. The goal is to identify the extent to which multisensory stimuli affect the VR experience, which stimuli are used in multisensory VR, the types of VR setups used, and the application fields covered. An analysis of the 105 studies that met the eligibility criteria revealed that 84.8 percent of the studies show a positive impact of multisensory VR experiences. Haptics is the most commonly used stimulus in multisensory VR systems (86.6 percent). Non-immersive and immersive VR setups are preferred over semi-immersive setups. Regarding application fields, a considerable part was adopted by health professionals and by science and engineering professionals. We further conclude that smell and taste are still underexplored and can bring significant value to VR applications. More research is recommended on how to synthesize and deliver these stimuli, which still require complex and costly apparatus, so that they can be integrated into the VR experience in a controlled and straightforward manner.
9. Nai W, Liu J, Sun C, Wang Q, Liu G, Sun X. Vibrotactile Feedback Rendering of Patterned Textures Using a Waveform Segment Table Method. IEEE Trans Haptics 2021;14:849-861. PMID: 34043515; DOI: 10.1109/toh.2021.3084304.
Abstract
Vibrotactile feedback is a common form of rendered haptic feedback used to simulate stylus-texture interaction. Most state-of-the-art methods for synthesizing stylus-texture vibrotactile feedback aim to generate signals whose frequency spectra resemble those of recorded interactions. In this paper, we step back and further explore the record-and-playback approach for a subset of textures: those with an obvious spatial pattern, which constitute a significant proportion of the man-made textures we interact with in daily life. We propose a method that explicitly renders the periodic vibrotactile feedback of patterned textures. The method uses Dynamic Time Warping to select the most representative signal segment from a long continuous signal captured under a given interaction condition, and it constructs a waveform segment table to store representative signal segments for different conditions. Results of a similarity-comparison user study show that subjects gave generally higher similarity scores to our proposed method than to a spectrum-oriented method. These results shed light on the importance of preserving the pattern when rendering haptic feedback for patterned textures.
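To make the segment-selection idea concrete, here is a minimal sketch that picks, among candidate signal segments recorded under one interaction condition, the one with the smallest total Dynamic Time Warping distance to the others, and stores it in a waveform segment table. The synthetic data, the condition key, and the selection criterion are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def dtw_distance(a, b):
    """Plain O(n*m) dynamic time warping distance between two 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def most_representative(segments):
    """Return the candidate with the smallest total DTW distance to all other candidates."""
    totals = [sum(dtw_distance(s, t) for j, t in enumerate(segments) if j != i)
              for i, s in enumerate(segments)]
    return segments[int(np.argmin(totals))]

# Synthetic stand-in for vibration segments captured under one (speed, force) condition.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.05, 250)  # one assumed 50 ms spatial period
candidates = [np.sin(2 * np.pi * 120 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(6)]

# Waveform segment table: one representative periodic segment per interaction condition.
segment_table = {("speed=50mm/s", "force=1N"): most_representative(candidates)}
print(len(segment_table[("speed=50mm/s", "force=1N")]))
```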
10. Locomotor illusions are generated by perceptual body-environment organization. PLoS One 2021;16:e0251562. PMID: 33974677; PMCID: PMC8112709; DOI: 10.1371/journal.pone.0251562.
Abstract
While one is walking, the stimulation by one's body forms a structure with the stimulation by the environment. This locomotor array of stimulation corresponds to the human-environment relation that one's body forms with the environment it is moving through. Thus, the perceptual experience of walking may arise from such a locomotor array of stimulation. Humans can also experience walking while they are sitting. In this case, there is no stimulation by one's walking body. Hence, one can experience walking although a basic component of a locomotor array of stimulation is missing. This may be facilitated by perception organizing the sensory input about one's body and environment into a perceptual structure that corresponds to a locomotor array of stimulation. We examined whether locomotor illusions are generated by this perceptual formation of a locomotor structure. We exposed sixteen seated individuals to environmental stimuli that elicited either the perceptual formation of a locomotor structure or that of a control structure. The study participants experienced distinct locomotor illusions when they were presented with environmental stimuli that elicited the perceptual formation of a locomotor structure. They did not experience distinct locomotor illusions when the stimuli instead elicited the perceptual formation of the control structure. These findings suggest that locomotor illusions are generated by the perceptual organization of sensory input about one's body and environment into a locomotor structure. This perceptual body-environment organization elucidates why seated human individuals experience the sensation of walking without any proprioceptive or kinaesthetic stimulation.
11. Yang TH, Son H, Byeon S, Gil H, Hwang I, Jo G, Choi S, Kim SY, Kim JR. Magnetorheological Fluid Haptic Shoes for Walking in VR. IEEE Trans Haptics 2021;14:83-94. PMID: 32804656; DOI: 10.1109/toh.2020.3017099.
Abstract
In this article, we present RealWalk, a pair of haptic shoes for HMD-based VR designed to create realistic sensations of ground surface deformation and texture using magnetorheological (MR) fluid. RealWalk offers a novel interaction scheme through physical interaction between the shoes and the ground surfaces while walking in VR. Each shoe consists of two MR fluid actuators, an insole pressure sensor, and a foot position tracker. The MR fluid actuators are designed as a multi-stacked disc structure with a long flow path to maximize flow resistance. By changing the magnetic field intensity in the MR fluid actuators according to the ground material in the virtual scene, the viscosity of the MR fluid is adjusted accordingly. When a user steps on the ground with the shoes, the two MR fluid actuators are pressed down, creating a variety of ground deformation sensations such as snow, mud, and dry sand. We built an interactive VR application and compared RealWalk with vibrotactile haptic shoes in four different VR scenes: grass, sand, mud, and snow. Compared to the vibrotactile shoes, RealWalk received higher ratings in all scenes for discrimination, realism, and satisfaction. We also report qualitative user feedback on the experience.
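A toy sketch of the terrain-dependent control idea mentioned above: map the virtual ground material to an assumed coil-current level (raising the MR fluid's viscosity) and drive the actuators only while insole pressure indicates a foot strike. The terrain labels, current values, and pressure threshold are invented for illustration and are not RealWalk's actual control law.

```python
# Illustrative mapping only: terrain labels to an assumed coil-current level (A).
TERRAIN_CURRENT = {"dry_sand": 0.3, "mud": 0.8, "snow": 0.5, "grass": 0.1}

def actuator_current(terrain: str, insole_pressure_kpa: float, threshold_kpa: float = 20.0) -> float:
    """Drive the MR actuators only while the insole pressure indicates a foot strike."""
    if insole_pressure_kpa < threshold_kpa:
        return 0.0  # swing phase: leave the fluid free-flowing
    return TERRAIN_CURRENT.get(terrain, 0.0)

print(actuator_current("mud", insole_pressure_kpa=35.0))
```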
12. A Quality of Experience assessment of haptic and augmented reality feedback modalities in a gait analysis system. PLoS One 2020;15:e0230570. PMID: 32203533; PMCID: PMC7089541; DOI: 10.1371/journal.pone.0230570.
Abstract
Gait analysis is a technique used to understand movement patterns and, in some cases, to inform the development of rehabilitation protocols. Traditional rehabilitation approaches have relied on expert-guided feedback in clinical settings. Such efforts require the presence of an expert to inform the re-training (and to evaluate any improvement) and require the patient to travel to the clinic. Opportunities now exist to employ digitized “feedback” modalities to help a user “understand” improved gait technique. This is important, as clear and concise feedback can enhance the quality of rehabilitation and recovery. A critical requirement is to consider the quality of feedback from the user's perspective, i.e., how they process, understand, and react to the feedback. In this context, this paper reports the results of a Quality of Experience (QoE) evaluation of two feedback modalities, Augmented Reality (AR) and haptic, employed as part of an overall gait analysis system. The aim of the feedback is to reduce varus/valgus misalignments, which can cause serious orthopedic problems. The QoE analysis considers objective (improvement in knee alignment) and subjective (questionnaire responses) user metrics in 26 participants as part of a within-subject design. Participants answered 12 questions on QoE aspects such as utility, usability, interaction, and immersion of the feedback modalities via post-test reporting. In addition, objective metrics of participant performance (angles and alignment) were considered as indicators of the utility of each feedback modality. The findings show statistically significantly higher QoE ratings for AR feedback. The number of knee misalignments was also reduced after users experienced AR feedback (a 35% improvement with AR feedback relative to baseline when compared to haptic). Gender analysis showed significant differences in performance for the number of misalignments and the time to correct valgus misalignment (for males when they experienced AR feedback). The female group self-reported higher utility and QoE ratings for AR compared to the male group.
13. Amemiya T, Ikei Y, Kitazaki M. Remapping Peripersonal Space by Using Foot-Sole Vibrations Without Any Body Movement. Psychol Sci 2019;30:1522-1532. PMID: 31545929; DOI: 10.1177/0956797619869337.
Abstract
The limited space immediately surrounding our body, known as peripersonal space (PPS), has been investigated by focusing on changes in the multisensory processing of audio-tactile stimuli occurring within or outside the PPS. Some studies have reported that the PPS representation is extended by body actions such as walking. However, it is unclear whether the PPS changes when a walking-like sensation is induced but the body neither moves nor is forced to move. Here, we show that a rhythmic pattern consisting of walking-sound vibrations applied to the soles of the feet, but not the forearms, boosted tactile processing when looming sounds were located near the body. The findings suggest that an extension of the PPS representation can be triggered by stimulating the soles in the absence of body action, which may automatically drive a motor program for walking, leading to a change in spatial cognition around the body.
Affiliation(s)
- Tomohiro Amemiya: Graduate School of Information Science and Technology, The University of Tokyo; Virtual Reality Educational Research Center, The University of Tokyo; NTT Communication Science Laboratories, NTT Corporation, Kanagawa, Japan.
- Yasushi Ikei: Faculty of Systems Design, Tokyo Metropolitan University.
- Michiteru Kitazaki: Department of Computer Science and Engineering, Toyohashi University of Technology.
14. Karunakaran KK, Abbruzzese KM, Xu H, Foulds RA. The Importance of Haptics in Generating Exoskeleton Gait Trajectory Using Alternate Motor Inputs. IEEE Trans Neural Syst Rehabil Eng 2017;25:2328-2335. PMID: 28715331; DOI: 10.1109/tnsre.2017.2726538.
Abstract
Human gait requires both haptic and visual feedback to generate and control rhythmic movements and to navigate environmental obstacles. Current lower-extremity wearable exoskeletons that restore gait to individuals with paraplegia due to spinal cord injury rely completely on visual feedback to generate limited pre-programmed gait variations and generally give the user little control over the gait cycle. As an alternative to this limitation, we propose user control of gait in real time using the healthy upper extremities. This paper evaluates the feedback conditions required for the hands to generate complex rhythmic trajectories that resemble gait trajectories. The study involved 18 subjects who performed a virtual locomotor task in which contralateral hand movements were mapped to control virtual feet under three feedback conditions: haptic only, visual only, and combined haptic and visual. The results indicate that haptic feedback in addition to visual feedback is required to produce rhythmic hand trajectories similar to gait trajectories.
15. de Jesus Oliveira VA, Brayda L, Nedel L, Maciel A. Designing a Vibrotactile Head-Mounted Display for Spatial Awareness in 3D Spaces. IEEE Trans Vis Comput Graph 2017;23:1409-1417. PMID: 28129175; DOI: 10.1109/tvcg.2017.2657238.
Abstract
Due to the perceptual characteristics of the head, vibrotactile head-mounted displays (HMDs) are built with low actuator density. Therefore, vibrotactile guidance is mostly assessed by pointing towards objects in the azimuthal plane. When it comes to multisensory interaction in 3D environments, it is also important to convey information about objects in the elevation plane. In this paper, we design and assess a haptic guidance technique for 3D environments. First, we explore the modulation of vibration frequency to indicate the position of objects in the elevation plane. Then, we assess a vibrotactile HMD made to render the position of objects in the 3D space around the subject by varying both stimulus locus and vibration frequency. Results show that frequencies modulated with a quadratic growth function allowed more accurate, more precise, and faster target localization in an active head-pointing task. The technique showed high usability and a strong learning effect for haptic search across different scenarios in an immersive VR setup.
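A small sketch of the guidance mapping described above: azimuth selects which actuator on the head band vibrates, while elevation is encoded by vibration frequency through a quadratic growth function. The actuator count, frequency range, and exact curve are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def elevation_to_frequency(elev_deg, f_min=50.0, f_max=250.0):
    """Map elevation angle (-90..+90 deg) to vibration frequency (Hz) using a
    quadratic growth function; the frequency range and curve are assumed values."""
    x = (elev_deg + 90.0) / 180.0          # normalize elevation to 0..1
    return f_min + (f_max - f_min) * x**2  # quadratic growth with elevation

def azimuth_to_actuator(azim_deg, n_actuators=7):
    """Pick which of n equally spaced actuators around the head band to drive."""
    step = 360.0 / n_actuators
    return int(round((azim_deg % 360.0) / step)) % n_actuators

# Example: a target up and to the right of the wearer.
print(azimuth_to_actuator(45.0), elevation_to_frequency(30.0))
```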
16. Karunakaran K, Abbruzzese K, Xu H, Ehrenberg N, Foulds R. Haptic proprioception in a virtual locomotor task. Annu Int Conf IEEE Eng Med Biol Soc 2015;2014:3594-3597. PMID: 25570768; DOI: 10.1109/embc.2014.6944400.
Abstract
Normal gait requires both proprioceptive and visual feedback to the nervous system to effectively control the rhythmicity of motor movement. Current preprogrammed exoskeletons provide only visual feedback, with no user control over the foot trajectory. We propose an intuitive controller in which hand trajectories are mapped to control contralateral foot movement. Our study shows that proprioceptive feedback provided to the user's hand, in addition to visual feedback, results in better control during virtual ambulation than visual feedback alone. Hand trajectories resembled normal foot trajectories when both proprioceptive and visual feedback were present. We conclude that haptic feedback is essential for both the temporal and spatial aspects of motor control in rhythmic movements.