1
Lin J, Cronje J, Wienrich C, Pauli P, Latoschik ME. Visual Indicators Representing Avatars' Authenticity in Social Virtual Reality and Their Impacts on Perceived Trustworthiness. IEEE Transactions on Visualization and Computer Graphics 2023;29:4589-4599. [PMID: 37788202] [DOI: 10.1109/tvcg.2023.3320234]
Abstract
Photorealistic avatars show great potential for social VR and VR collaboration. However, identity and privacy issues threaten avatars' authenticity in social VR. In addition to the necessary authentication and protection mechanisms, effective solutions are needed to convey an avatar's authenticity status to users and thereby enhance overall trustworthiness. We designed several visual indicators (VIs) using static or dynamic visual effects on photorealistic avatars and evaluated their effectiveness in visualizing avatars' authenticity status. In this study, we explored suitable attributes and designs for conveying the authenticity of photorealistic avatars and influencing their perceived trustworthiness. Furthermore, we investigated how different interactivity levels influence the indicators' effectiveness (the avatar was presented either in a static image, an animated video clip, or an immersive virtual environment). Our findings showed that displaying a full name can increase trust, while most other VIs decreased users' trust. We also found that interactivity levels significantly impacted users' trust and the effectiveness of the VIs. Based on these results, we developed design guidelines for visual indicators as effective tools to convey authenticity, as a first step towards improving trustworthiness in social VR with identity management.
2
Zhao X, Liu Y, Chen T, Wang S, Chen J, Wang L, Liu G. Differences in brain activations between micro- and macro-expressions based on electroencephalography. Front Neurosci 2022;16:903448. [PMID: 36172039] [PMCID: PMC9511965] [DOI: 10.3389/fnins.2022.903448]
Abstract
Micro-expressions can reflect an individual's subjective emotions and true mental state and are widely used in the fields of mental health, justice, law enforcement, intelligence, and security. However, current micro-expression recognition technology, based on images and expert assessment, has limitations such as restricted application scenarios and high time consumption. To overcome these limitations, this study is the first to explore the brain mechanisms of micro-expressions, and their differences from macro-expressions, from a neuroscientific perspective; this can serve as a foundation for micro-expression recognition based on EEG signals. We designed a real-time supervision and emotional expression suppression (SEES) experimental paradigm to synchronously collect facial expressions and electroencephalograms. Electroencephalogram signals were analyzed at the scalp and source levels to determine the temporal and spatial neural patterns of micro- and macro-expressions. We found that under positive emotions, micro-expressions showed stronger activation than macro-expressions in frontal regions, including the premotor cortex, supplementary motor cortex, and middle frontal gyrus. Under negative emotions, micro-expressions showed weaker activation than macro-expressions in the somatosensory cortex and corneal gyrus regions. Activation of the right temporoparietal junction (rTPJ) was stronger for micro-expressions under positive than under negative emotions. A likely reason for these differences is that the pathways of facial control differ: the production of micro-expressions under positive emotions depends on control of the face, while micro-expressions under negative emotions depend more on the intensity of the emotion.
Affiliation(s)
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- Ying Liu
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- School of Music, Southwest University, Chongqing, China
- Tong Chen
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- Shiyuan Wang
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- Jiejia Chen
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- Linwei Wang
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China
3
Inter- and Transcultural Learning in Social Virtual Reality: A Proposal for an Inter- and Transcultural Virtual Object Database to be Used in the Implementation, Reflection, and Evaluation of Virtual Encounters. Multimodal Technologies and Interaction 2022. [DOI: 10.3390/mti6070050]
Abstract
Visual stimuli are frequently used to improve memory, language learning or perception, and the understanding of metacognitive processes. In virtual reality (VR), however, few systematically and empirically derived stimulus databases exist. This paper proposes the first collection of virtual objects based on empirical evaluation for inter- and transcultural encounters between English- and German-speaking learners. Across two online studies with native German- and English-speaking participants, we used explicit and implicit measurement methods to identify cultural associations and the degree of stereotypical perception for each virtual stimulus (n = 293). The analysis resulted in a final, well-describable database of 128 objects (called InteractionSuitcase). In future applications, the objects can serve as interaction or conversation assets and as a behavioral measurement tool in social VR applications, especially in the field of foreign language education. For example, participants can use the objects to describe their culture, or teachers can intuitively assess stereotypical attitudes during the encounters.