1. Erdeniz B, Tekgün E, Lenggenhager B, Lopez C. Visual perspective, distance, and felt presence of others in dreams. Conscious Cogn 2023; 113:103547. [PMID: 37390767] [DOI: 10.1016/j.concog.2023.103547]
Abstract
The peripersonal space, that is, the limited space surrounding the body, involves multisensory coding and the representation of the self in space. Previous studies have shown that peripersonal space representation and the visual perspective on the environment can be dramatically altered when neurotypical individuals self-identify with a distant avatar (e.g., in virtual reality) or during clinical conditions (e.g., out-of-body experience, heautoscopy, depersonalization). Despite its role in many cognitive and social functions, the perception of peripersonal space in dreams, and its relationship with the perception of other characters (interpersonal distance in dreams), remains largely uncharted. The present study aimed to explore the visuospatial properties of this space, which is likely to underlie self-location as well as self/other distinction in dreams. A total of 530 healthy volunteers answered a web-based questionnaire measuring their dominant visuospatial perspective in dreams, the frequency of recall for felt distances between their dream self and other dream characters, and the dreamer's viewing angle of other dream characters. Most participants reported dream experiences from a first-person perspective (1PP) (82%) rather than a third-person perspective (3PP) (18%). Independent of their dream perspective, participants reported perceiving other dream characters more often in near space, that is, at distances of either 0-90 cm or 90-180 cm, than in farther space (180-270 cm). Regardless of perspective (1PP or 3PP), both groups also reported seeing other dream characters more frequently from eye level (0° viewing angle) than from above (30° and 60°) or below eye level (-30° and -60°).
Moreover, the intensity of sensory experiences in dreams, as measured by the Bodily Self-Consciousness in Dreams Questionnaire, was higher in individuals who habitually see other dream characters close to their dream self (i.e., within 0-90 cm or 90-180 cm). These preliminary findings offer a new, phenomenological account of space representation in dreams with regard to the felt presence of others. They may provide insights not only into our understanding of how dreams are formed, but also into the type of neurocomputations involved in self/other distinction.
Affiliation(s)
- Burak Erdeniz
- İzmir University of Economics, Department of Psychology, İzmir, Turkey
- Ege Tekgün
- İzmir University of Economics, Department of Psychology, İzmir, Turkey
2. Genay A, Lecuyer A, Hachet M. Being an Avatar "for Real": A Survey on Virtual Embodiment in Augmented Reality. IEEE Trans Vis Comput Graph 2022; 28:5071-5090. [PMID: 34310309] [DOI: 10.1109/tvcg.2021.3099290]
Abstract
Virtual self-avatars have been increasingly used in Augmented Reality (AR), where one can see virtual content embedded into physical space. However, little is known about the perception of self-avatars in such a context. The possibility that their embodiment could be achieved in a similar way as in Virtual Reality opens the door to numerous applications in education, communication, entertainment, and the medical field. This article reviews the literature covering the embodiment of virtual self-avatars in AR. Our goal is (i) to guide readers through the different options and challenges linked to the implementation of AR embodiment systems, (ii) to provide a better understanding of AR embodiment perception by classifying the existing knowledge, and (iii) to offer insight into future research topics and trends for AR and avatar research. To do so, we introduce a taxonomy of virtual embodiment experiences by defining a "body avatarization" continuum. The presented knowledge suggests that the sense of embodiment evolves in the same way in AR as in other settings, but this possibility has yet to be fully investigated. We suggest that, while not yet well understood, the embodiment of avatars has a promising future in AR, and we conclude by discussing possible directions for research.
3. Cerasa A, Gaggioli A, Marino F, Riva G, Pioggia G. The promise of the metaverse in mental health: the new era of MEDverse. Heliyon 2022; 8:e11762. [PMID: 36458297] [PMCID: PMC9706139] [DOI: 10.1016/j.heliyon.2022.e11762]
Abstract
Since Mark Zuckerberg's announcement about the development of new three-dimensional virtual worlds for social communication, a great debate has arisen about the promise of such a technology. The metaverse, a term formed by combining meta and universe, could open a new era in mental health, particularly for psychological disorders, where the creation of a full-body illusion via a digital avatar could promote healthcare and personal well-being. Patients affected by body dysmorphia symptoms (e.g., eating disorders) or social deficits (e.g., autism) could greatly benefit from this kind of technology. However, it is not clear what advantage the metaverse would have in treating psychological disorders over the well-known and effective virtual reality (VR) exposure therapy. Indeed, in the last twenty years, a plethora of studies have demonstrated the effectiveness of VR technology in reducing symptoms of pain, anxiety, and stress, as well as in improving cognitive and social skills. We hypothesize that the metaverse will offer more opportunities, such as a more complex virtual realm where sensory inputs and recurrent feedback, mediated by a "federation" of multiple technologies (e.g., artificial intelligence, tangible interfaces, the Internet of Things, and blockchain), can be reinterpreted to facilitate a new kind of communication that goes beyond self-body representation. However, no clear starting point currently exists. For this reason, it is worth defining a theoretical framework for applying this new kind of technology in a social neuroscience context, so as to develop effective solutions for mental health in the future.
4. Moon HJ, Gauthier B, Park HD, Faivre N, Blanke O. Sense of self impacts spatial navigation and hexadirectional coding in human entorhinal cortex. Commun Biol 2022; 5:406. [PMID: 35501331] [PMCID: PMC9061856] [DOI: 10.1038/s42003-022-03361-5]
Abstract
Grid cells in the entorhinal cortex (EC) encode an individual's location in space, relying on environmental cues and on self-motion cues derived from the individual's body. Body-derived signals are also primary signals for the sense of self and are based on integrated sensorimotor signals (proprioceptive, tactile, visual, motor) that have been shown to enhance self-centered processing. However, it is currently unknown whether such sensorimotor signals that modulate self-centered processing impact grid cells and spatial navigation. Integrating the online manipulation of bodily signals, to modulate self-centered processing, with a spatial navigation task and an fMRI measure to detect grid cell-like representation (GCLR) in humans, we report improved performance in spatial navigation and decreased GCLR in EC. This decrease in entorhinal GCLR was associated with an increase in retrosplenial cortex activity, which was correlated with participants' navigation performance. These data link self-centered processes during spatial navigation to entorhinal and retrosplenial activity and highlight the role of different bodily factors at play when navigating in VR.
Affiliation(s)
- Hyuk-June Moon
- Center of Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Lausanne, Switzerland
- Center for Bionics, Biomedical Research Division, Korea Institute of Science and Technology (KIST), Seoul, South Korea
- Baptiste Gauthier
- Center of Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Lausanne, Switzerland
- Hyeong-Dong Park
- Center of Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Lausanne, Switzerland
- Graduate Institute of Mind, Brain and Consciousness, Taipei Medical University, Taipei, Taiwan
- Brain and Consciousness Research Centre, Shuang-Ho Hospital, New Taipei City, Taiwan
- Nathan Faivre
- Center of Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Lausanne, Switzerland
- University Grenoble Alpes, University Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Olaf Blanke
- Center of Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Geneva, Switzerland
- Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (École Polytechnique Fédérale de Lausanne, EPFL), Lausanne, Switzerland
- Department of Neurology, University Hospital Geneva, Geneva, Switzerland
5. Dewez D, Hoyet L, Lecuyer A, Argelaguet F. Do You Need Another Hand? Investigating Dual Body Representations During Anisomorphic 3D Manipulation. IEEE Trans Vis Comput Graph 2022; 28:2047-2057. [PMID: 35167468] [DOI: 10.1109/tvcg.2022.3150501]
Abstract
In virtual reality, several manipulation techniques distort users' motions, for example to reach remote objects or to increase precision. These techniques can become problematic when used with avatars, as they create a mismatch between the action actually performed and the action displayed, which can negatively impact the sense of embodiment. In this paper, we propose using a dual body representation during anisomorphic interaction: a co-located representation serves as a spatial reference and reproduces the user's exact motion, while an interactive representation is used for the distorted interaction. We conducted two experiments investigating the use of dual representations with amplified motion (the Go-Go technique) and decreased motion (the PRISM technique). Two visual appearances for the interactive and co-located representations were explored. This exploratory study showed that participants overall preferred having a single representation, although opinions diverged for the Go-Go technique, and we found no significant differences in performance. While interacting seemed more important than showing exact movements for agency during out-of-reach manipulation, participants felt more in control of the realistic arm during close manipulation.
6. Kondo R, Tani Y, Sugimoto M, Minamizawa K, Inami M, Kitazaki M. Re-association of Body Parts: Illusory Ownership of a Virtual Arm Associated With the Contralateral Real Finger by Visuo-Motor Synchrony. Front Robot AI 2021; 7:26. [PMID: 33501195] [PMCID: PMC7805900] [DOI: 10.3389/frobt.2020.00026]
Abstract
Illusory ownership of a virtual body can be induced by visuo-motor synchrony. Our aim was to test whether the right thumb could be re-associated with a virtual left arm, and whether illusory ownership of the re-associated arm could be induced through synchronous versus asynchronous movement of the body parts, linking action and vision. Participants felt that their right thumb was the virtual left arm more strongly in the synchronous condition than in the asynchronous one, and the feeling of ownership of the virtual arm was also stronger in the synchronous condition. We did not find a significant difference between the two synchrony conditions in startle responses to a knife suddenly threatening the virtual arm, nor was there proprioceptive drift of the thumb. These results suggest that a re-association of the right thumb with a virtual left arm can be induced by visuo-motor synchronization, although it may be weaker than the natural association.
Affiliation(s)
- Ryota Kondo
- Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
- Yamato Tani
- Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
- Maki Sugimoto
- Department of Information and Computer Science, Keio University, Yokohama, Japan
- Kouta Minamizawa
- Graduate School of Media Design, Keio University, Yokohama, Japan
- Masahiko Inami
- Research Center for Advanced Science and Technology, The University of Tokyo, Bunkyo-ku, Japan
- Michiteru Kitazaki
- Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
7. Guterstam A, Larsson DEO, Szczotka J, Ehrsson HH. Duplication of the bodily self: a perceptual illusion of dual full-body ownership and dual self-location. R Soc Open Sci 2020; 7:201911. [PMID: 33489299] [PMCID: PMC7813251] [DOI: 10.1098/rsos.201911]
Abstract
Previous research has shown that it is possible to use multisensory stimulation to induce the perceptual illusion of owning supernumerary limbs, such as two right arms. However, it remains unclear whether the coherent feeling of owning a full body may be duplicated in the same manner, and whether such a dual full-body illusion could be used to split the unitary sense of self-location into two. Here, we examined whether healthy human participants can experience simultaneous ownership of two full bodies, located either close in parallel or in two separate spatial locations. A previously described full-body illusion, based on visuo-tactile stimulation of an artificial body viewed from the first-person perspective (1PP) via head-mounted displays, was adapted to a dual-body setting and quantified in five experiments using questionnaires, a behavioural self-location task, and threat-evoked skin conductance responses. The results of experiments 1-3 showed that synchronous visuo-tactile stimulation of two bodies viewed from the 1PP, lying in parallel next to each other, induced a significant illusion of dual full-body ownership. In experiment 4, we failed to find support for our working hypothesis that splitting the visual scene into two, so that each of the two illusory bodies was placed in a distinct spatial environment, would lead to dual self-location. In a final exploratory experiment (experiment 5), we found preliminary support for an illusion of dual self-location and dual body ownership by using dynamic changes between the 1PPs of the two artificial bodies and/or a common third-person perspective from the ceiling of the testing room. These findings suggest that healthy people, under certain conditions of multisensory perceptual ambiguity, may experience dual body ownership and dual self-location. Together, the results indicate that the coherent sense of the bodily self located at a single place in space results from an active and dynamic perceptual integration process.
Affiliation(s)
- Arvid Guterstam
- Department of Psychology, Princeton University, Princeton, NJ, USA
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Joanna Szczotka
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- H. Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
8. Nakul E, Orlando-Dessaints N, Lenggenhager B, Lopez C. Measuring perceived self-location in virtual reality. Sci Rep 2020; 10:6802. [PMID: 32321976] [PMCID: PMC7176655] [DOI: 10.1038/s41598-020-63643-y]
Abstract
Third-person perspective full-body illusions (3PP-FBI) enable the manipulation of perceived self-location through multisensory stimulation. Perceived self-location is classically measured by a locomotion task. Yet, as locomotion modulates various sensory signals, we developed a measure of self-location in immersive virtual reality that does not involve locomotion. Tactile stimulation was applied to the backs of twenty-five participants and displayed synchronously or asynchronously on the back of an avatar seen from behind. Participants completed the locomotion task and a novel mental imagery task, in which they self-located in relation to a virtual ball approaching them. Participants self-identified with the avatar more during synchronous than asynchronous visuo-tactile stimulation in both tasks. This effect was accentuated in the mental imagery task, which showed a larger self-relocation toward the avatar, together with higher reports of presence, bi-location, and disembodiment in the synchronous condition. In conclusion, the results suggest that avoiding multisensory updating during walking, and using a perceptual rather than a motor task, can improve measures of illusory self-location.
Affiliation(s)
- Estelle Nakul
- Aix Marseille Univ, CNRS, LNSC, FR3C, Marseille, France
9. Kondo R, Tani Y, Sugimoto M, Inami M, Kitazaki M. Scrambled body differentiates body part ownership from the full body illusion. Sci Rep 2020; 10:5274. [PMID: 32210268] [PMCID: PMC7093408] [DOI: 10.1038/s41598-020-62121-9]
Abstract
Illusory body ownership can be induced for a body part or a full body by visual-motor synchronisation. A previous study indicated that an invisible full-body illusion can be induced by the synchronous movement of only the hands and feet. The difference between body part ownership and the full body illusion has not been explained in detail, because there has been no method for separating the two illusions. To develop such a method, we scrambled (randomised) the positions of the hands and feet and compared this stimulus with the normal layout stimulus while manipulating visual-motor synchronisation. In Experiment 1, participants observed the stimuli from a third-person perspective, and the questionnaire results showed that the scrambled body stimulus induced only body part ownership, while the normal layout stimulus induced both body part ownership and full body ownership when the stimuli were synchronous with participants' actions. In Experiment 2, we found similar questionnaire results with first-person perspective stimuli. We did not find a significant skin conductance response difference between any conditions in either Experiment 2 or 3. These results suggest that a spatial relationship between body parts is necessary for the full body illusion, but not for body part ownership.
Affiliation(s)
- Ryota Kondo
- Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi, 441-8580, Japan.
- Yamato Tani
- Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi, 441-8580, Japan
- Maki Sugimoto
- Department of Information and Computer Science, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa, 223-8522, Japan
- Masahiko Inami
- Research Center for Advanced Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
- Michiteru Kitazaki
- Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi, 441-8580, Japan
10. van der Veer AH, Longo MR, Alsmith AJT, Wong HY, Mohler BJ. Self and Body Part Localization in Virtual Reality: Comparing a Headset and a Large-Screen Immersive Display. Front Robot AI 2019; 6:33. [PMID: 33501049] [PMCID: PMC7805778] [DOI: 10.3389/frobt.2019.00033]
Abstract
It is currently not fully understood where people precisely locate themselves in their bodies, particularly in virtual reality. To investigate this, we asked participants to point directly at themselves and at several of their body parts with a virtual pointer, in two virtual reality (VR) setups: a VR headset and a large-screen immersive display (LSID). Distance error in pointing to body parts differed depending on the VR setup. Participants pointed relatively accurately to many of their body parts (i.e., eyes, nose, chin, shoulders, and waist). However, in both VR setups they pointed too low for the feet and the knees, and too high for the top of the head (to larger extents in the VR headset). Taking these distortions into account, the locations found for pointing to self were considered in terms of perceived bodies, based on where the participants had pointed to their body parts in the two VR setups. Pointing to self in terms of the perceived body was mostly to the face (the upper regions more than the lower), with some pointing to the torso. There was no significant overall effect of VR condition for pointing to self in terms of the perceived body, although there was a significant effect of VR setup when only the physical body (as measured) was considered. In a paper-and-pencil task outside of VR, performed by pointing on a picture of a simple body outline (body template task), participants pointed mostly to the upper torso. Possible explanations for the differences between pointing to self in the VR setups and in the body template task are discussed. The main finding of this study is that the VR setup influences where people point to their body parts, but not where they point to themselves, when perceived rather than physical body parts are considered.
Affiliation(s)
- Albert H. van der Veer
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- International Max Planck Research School for Cognitive and Systems Neuroscience, University of Tübingen, Tübingen, Germany
- Matthew R. Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Hong Yu Wong
- Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
- Institute of Philosophy, Department of Philosophy and Media, University of Tübingen, Tübingen, Germany
- Betty J. Mohler
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Institute for Sport Science, Department of Human Sciences, Technical University of Darmstadt, Darmstadt, Germany
- Max Planck Institute for Intelligent Systems, Tübingen, Germany
11. Liang C, Lee YT, Chen WY, Huang HC. Body-as-Subject in the Four-Hand Illusion. Front Psychol 2018; 9:1710. [PMID: 30283376] [PMCID: PMC6157404] [DOI: 10.3389/fpsyg.2018.01710]
Abstract
In a recent study (Chen et al., 2018), we conducted a series of experiments that induced the "four-hand illusion": using a head-mounted display (HMD), the participant adopted the experimenter's first-person perspective (1PP) as if it were his/her own 1PP. The participant saw four hands via the HMD: the experimenter's two hands from the adopted 1PP and the subject's own two hands from the adopted third-person perspective (3PP). In the active four-hand condition, the participant tapped his/her index fingers, imitated by the experimenter. Once all four hands acted synchronously and received synchronous tactile stimulations at the same time, many participants felt as if they owned two more hands. In this paper, we argue that this novel illusion has a philosophical implication. According to Merleau-Ponty (1945/1962) and Legrand (2010), one can experience one's own body or body part either as-object or as-subject, but cannot experience it as both simultaneously; that is, these two experiences are mutually exclusive. Call this view the Experiential Exclusion Thesis. We contend that a key component of the four-hand illusion, the subjective experience of the 1PP-hands involving both a "kinesthetic sense of movement" and a "visual sense of movement" (the movement that the participant sees via the HMD), provides an important counter-example to this thesis. We argue that it is possible for a healthy subject to experience the same body part both as-subject and as-object simultaneously. Our goal is not to annihilate the distinction between body-as-object and body-as-subject, but to show that it is not as rigid as the phenomenologists suggest.
Affiliation(s)
- Caleb Liang
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Yen-Tung Lee
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
- Wen-Yeo Chen
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Hsu-Chia Huang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
12. Gonzalez-Franco M, Peck TC. Avatar Embodiment. Towards a Standardized Questionnaire. Front Robot AI 2018; 5:74. [PMID: 33500953] [PMCID: PMC7805666] [DOI: 10.3389/frobt.2018.00074]
Abstract
Inside virtual reality, users can embody avatars that are co-located with them and seen from a first-person perspective. When doing so, participants have the feeling that their own body has been substituted by the self-avatar, and that the new body is the source of their sensations. Embodiment is complex, as it includes not only body ownership over the avatar, but also agency, co-location, and external appearance. Despite the multiple variables that influence it, the illusion is quite robust, and it can be produced even if the self-avatar is of a different age, size, gender, or race from the participant's own body. Embodiment illusions are therefore the basis for many social VR experiences and an active research area in the community. Researchers are interested both in which body manipulations can be accepted and in how different self-avatars produce different attitudinal, social, perceptual, and behavioral effects. However, findings suggest that, although embodiment is strongly associated with performance and reactions inside virtual reality, the extent to which the illusion is experienced varies between participants. In this paper, we review the questionnaires used in past experiments and propose a standardized embodiment questionnaire based on 25 questions that are prevalent in the literature. We encourage future virtual reality experiments that include first-person virtual avatars to administer this questionnaire in order to evaluate the degree of embodiment.
Collapse
Affiliation(s)
| | - Tabitha C Peck
- Mathematics and Computer Science Department, Davidson College, Davidson, NC, United States
13. Chen WY, Huang HC, Lee YT, Liang C. Body ownership and the four-hand illusion. Sci Rep 2018; 8:2153. [PMID: 29391505] [PMCID: PMC5794744] [DOI: 10.1038/s41598-018-19662-x]
Abstract
Recent studies of the rubber hand illusion (RHI) have shown that the sense of body ownership is constrained by several factors and yet is still very flexible. But exactly how flexible is our sense of body ownership? In this study, we address this issue by investigating the following question: is it possible to have the illusory experience of owning four hands? Under visual manipulation, the participant adopted the experimenter's first-person perspective (1PP) as if it were his/her own. Sitting face to face, the participant saw four hands: the experimenter's two hands from the adopted 1PP together with the subject's own two hands from the adopted third-person perspective (3PP). We found that: (1) the four-hand illusion did not occur in the passive four-hand condition. (2) In the active four-hand condition, the participants tapped their index fingers, imitated by the experimenter; when tactile stimulations were not provided, the key illusion was not induced either. (3) Strikingly, once all four hands began to act with the same pattern and received synchronous tactile stimulations at the same time, many participants felt as if they had two more hands. These results show that the sense of body ownership is much more flexible than most researchers have suggested.
Affiliation(s)
- Wen-Yeo Chen
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Hsu-Chia Huang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Yen-Tung Lee
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
- Caleb Liang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan; Department of Philosophy, National Taiwan University, Taipei, Taiwan
14
Gonzalez-Franco M, Lanier J. Model of Illusions and Virtual Reality. Front Psychol 2017; 8:1125. [PMID: 28713323 PMCID: PMC5492764 DOI: 10.3389/fpsyg.2017.01125] [Citation(s) in RCA: 116] [Impact Index Per Article: 16.6] [Received: 04/14/2017] [Accepted: 06/19/2017] [Indexed: 12/13/2022] Open
Abstract
In Virtual Reality (VR) it is possible to induce illusions in which users report and behave as if they have entered into altered situations and identities. The effect can be robust enough for participants to respond "realistically," meaning behaviors are altered as if subjects had been exposed to the scenarios in reality. The circumstances in which such VR illusions take place were first described in the 1980s. Since then, rigorous empirical research has explored a wide set of illusory experiences in VR. Here, we compile this research and propose a neuroscientific model explaining the underlying perceptual and cognitive mechanisms that enable illusions in VR. Furthermore, we describe the minimum instrumentation requirements to support illusory experiences in VR, and discuss the importance and shortcomings of the generic model.
15
Blefari ML, Martuzzi R, Salomon R, Bello-Ruiz J, Herbelin B, Serino A, Blanke O. Bilateral Rolandic operculum processing underlying heartbeat awareness reflects changes in bodily self-consciousness. Eur J Neurosci 2017; 45:1300-1312. [DOI: 10.1111/ejn.13567] [Citation(s) in RCA: 49] [Impact Index Per Article: 7.0] [Received: 07/20/2016] [Revised: 03/23/2017] [Accepted: 03/23/2017] [Indexed: 11/28/2022]
Affiliation(s)
- Maria Laura Blefari
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Roberto Martuzzi
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Fondation Campus Biotech Geneva, Geneva, Switzerland
- Roy Salomon
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Javier Bello-Ruiz
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Bruno Herbelin
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Andrea Serino
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Department of Clinical Neurosciences, University Hospital Lausanne (CHUV), Lausanne, Switzerland
- Olaf Blanke
- Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Department of Neurology, Geneva University Hospital, Geneva, Switzerland
16
Huang HC, Lee YT, Chen WY, Liang C. The Sense of 1PP-Location Contributes to Shaping the Perceived Self-location Together with the Sense of Body-Location. Front Psychol 2017; 8:370. [PMID: 28352241 PMCID: PMC5348511 DOI: 10.3389/fpsyg.2017.00370] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Received: 10/13/2016] [Accepted: 02/27/2017] [Indexed: 11/13/2022] Open
Abstract
Self-location—the sense of where I am in space—provides an experiential anchor for one's interaction with the environment. In studies of full-body illusions, many researchers have defined self-location solely in terms of body-location—the subjective feeling of where my body is. Although this view is useful, there is an issue regarding whether it can fully accommodate the role of 1PP-location—the sense of where my first-person perspective is located in space. In this study, we investigated self-location by comparing body-location and 1PP-location: using a head-mounted display (HMD) and a stereo camera, the subjects watched their own body standing in front of them and received tactile stimulation. We manipulated their senses of body-location and 1PP-location in three conditions: having the participants stand still (Basic condition), asking them to move forward (Walking condition), and swiftly moving the stereo camera away from their body (Visual condition). In the Walking condition, the participants watched their body moving away from their 1PP. In the Visual condition, the scene seen via the HMD systematically receded. Our data show that, under different manipulations of movement, the spatial unity between 1PP-location and body-location can be temporarily interrupted. Interestingly, we also observed a "double-body effect." We further suggest that it is better to consider body-location and 1PP-location as interrelated but distinct factors that jointly support the sense of self-location.
Affiliation(s)
- Hsu-Chia Huang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Yen-Tung Lee
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
- Wen-Yeo Chen
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Caleb Liang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan; Department of Philosophy, National Taiwan University, Taipei, Taiwan
17
Tidoni E, Gergondet P, Fusco G, Kheddar A, Aglioti SM. The Role of Audio-Visual Feedback in a Thought-Based Control of a Humanoid Robot: A BCI Study in Healthy and Spinal Cord Injured People. IEEE Trans Neural Syst Rehabil Eng 2016; 25:772-781. [PMID: 28113631 DOI: 10.1109/tnsre.2016.2597863] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Indexed: 11/10/2022]
Abstract
The efficient control of our body and successful interaction with the environment are possible through the integration of multisensory information. Brain–computer interfaces (BCIs) may allow people with sensorimotor disorders to actively interact with the world. In this study, visual information was paired with auditory feedback to improve BCI control of a humanoid surrogate. Healthy and spinal cord injured (SCI) people were asked to embody a humanoid robot and complete a pick-and-place task by means of a visual-evoked-potential BCI system. Participants observed the remote environment from the robot's perspective through a head-mounted display. Human-footstep and computer-beep sounds were used as synchronous/asynchronous auditory feedback. Healthy participants achieved better placing accuracy when listening to human footstep sounds than to a computer-generated sound. SCI people had more difficulty steering the robot during asynchronous auditory feedback conditions. Importantly, subjective reports highlighted that the BCI mask overlaying the display did not limit observation of the scenario or the feeling of being in control of the robot. Overall, the data suggest that sensorimotor-related information may improve the control of external devices. Further studies are required to understand how the contribution of residual sensory channels could improve the reliability of BCI systems.
18
In the presence of others: Self-location, balance control and vestibular processing. Neurophysiol Clin 2015; 45:241-54. [DOI: 10.1016/j.neucli.2015.09.001] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Received: 05/27/2015] [Revised: 09/10/2015] [Accepted: 09/11/2015] [Indexed: 11/23/2022] Open
19
Ronchi R, Bello-Ruiz J, Lukowska M, Herbelin B, Cabrilo I, Schaller K, Blanke O. Right insular damage decreases heartbeat awareness and alters cardio-visual effects on bodily self-consciousness. Neuropsychologia 2015; 70:11-20. [DOI: 10.1016/j.neuropsychologia.2015.02.010] [Citation(s) in RCA: 70] [Impact Index Per Article: 7.8] [Received: 07/23/2014] [Revised: 02/03/2015] [Accepted: 02/09/2015] [Indexed: 11/27/2022]
20
Liang C, Chang SY, Chen WY, Huang HC, Lee YT. Body ownership and experiential ownership in the self-touching illusion. Front Psychol 2015; 5:1591. [PMID: 25774138 PMCID: PMC4344111 DOI: 10.3389/fpsyg.2014.01591] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Received: 09/30/2014] [Accepted: 12/27/2014] [Indexed: 11/28/2022] Open
Abstract
We investigate two issues about the subjective experience of one's body: first, is the experience of owning a full body fundamentally different from the experience of owning a body part? Second, when I experience a bodily sensation, does it guarantee that I cannot be wrong about whether it is me who feels it? To address these issues, we conducted a series of experiments that combined the rubber hand illusion (RHI) and the "body swap illusion." The subject wore a head-mounted display (HMD) connected to a stereo camera set on the experimenter's head. Sitting face to face, each used their right hand, holding a paintbrush, to brush the other's left hand. Through the HMD, the subject adopted the experimenter's first-person perspective (1PP) as if it were his/her own: the subject watched the experimenter's hand from the adopted 1PP, and/or the subject's own hand from the adopted third-person perspective (3PP) in the opposite direction (180°), or the subject's full body from the adopted 3PP (180°, with or without face). The synchronous full-body conditions generated a "self-touching illusion": many participants felt that "I was brushing my own hand!" We found that (1) the sense of body-part ownership and the sense of full-body ownership are not fundamentally different from each other; and (2) our data present a strong case against the mainstream philosophical view called the immunity principle (IEM). We argue that it is possible for misrepresentation to occur in the subject's sense of "experiential ownership" (the sense that I am the one who is having this bodily experience). We discuss these findings and conclude that not only the sense of body ownership but also the sense of experiential ownership calls for further interdisciplinary studies.
Affiliation(s)
- Caleb Liang
- Department of Philosophy, National Taiwan University, Taipei, Taiwan; Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Si-Yan Chang
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
- Wen-Yeo Chen
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Hsu-Chia Huang
- Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan
- Yen-Tung Lee
- Department of Philosophy, National Taiwan University, Taipei, Taiwan
21
Macauda G, Bertolini G, Palla A, Straumann D, Brugger P, Lenggenhager B. Binding body and self in visuo-vestibular conflicts. Eur J Neurosci 2014; 41:810-7. [PMID: 25557766 DOI: 10.1111/ejn.12809] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.2] [Received: 06/25/2014] [Revised: 11/21/2014] [Accepted: 11/24/2014] [Indexed: 11/27/2022]
Abstract
Maintenance of the bodily self relies on the accurate integration of multisensory inputs in which visuo-vestibular cue integration is thought to play an essential role. Here, we tested in healthy volunteers how conflicting visuo-vestibular bodily input might impact on body self-coherence in a full body illusion set-up. Natural passive vestibular stimulation was provided on a motion platform, while visual input was manipulated using virtual reality equipment. Explicit (questionnaire) and implicit (skin temperature) measures were employed to assess illusory self-identification with either a mannequin or a control object. Questionnaire results pointed to a relatively small illusion, but hand skin temperature, plausibly an index of illusory body ownership, showed the predicted drop specifically in the condition when participants saw the mannequin moving in congruence with them. We argue that this implicit measure was accessible to visuo-vestibular modulation of the sense of self, possibly mediated by shared neural processes in the insula involved in vestibular and interoceptive signalling, thermoregulation and multisensory integration.
Affiliation(s)
- Gianluca Macauda
- Department of Neurology, University Hospital Zurich, Frauenklinikstrasse 26, Zurich, CH-8091, Switzerland