1
Liu Y, Liu R, Ge J, Wang Y. Advancements in brain-machine interfaces for application in the metaverse. Front Neurosci 2024;18:1383319. PMID: 38919909; PMCID: PMC11198002; DOI: 10.3389/fnins.2024.1383319.
Abstract
In recent years, as metaverse research has shifted its focus toward content exchange and social interaction, breaking through the current bottleneck of audio-visual media interaction has become an urgent issue. Using brain-machine interfaces for sensory simulation is one proposed solution, and brain-machine interfaces have already demonstrated irreplaceable potential as physiological signal acquisition tools across many areas of the metaverse. This study explores three application scenarios: generative art in the metaverse, serious gaming for healthcare in metaverse medicine, and facial expression synthesis in the metaverse's virtual society. It surveys existing commercial products and patents (such as MindWave Mobile, GVS, and Galea), draws analogies between the development of network security and neurosecurity and between bioethics and neuroethics, and discusses the challenges and potential issues that may arise once brain-machine interfaces mature and are widely deployed. Finally, it looks ahead to the diverse ways brain-machine interfaces could be applied in depth across the metaverse.
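To make the "physiological signal acquisition" role described above concrete, here is a minimal Python sketch of one common BCI preprocessing step: extracting per-band spectral power from a window of raw EEG, the kind of feature a consumer-headset pipeline might map to metaverse interactions. The sampling rate, frequency bands, and window length are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: band-power feature extraction from one EEG channel.
# All parameters below are assumed for illustration.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg_window: np.ndarray) -> dict:
    """Return mean spectral power per band for a 1-D EEG window."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    return {
        name: psd[(freqs >= lo) & (freqs < hi)].mean()
        for name, (lo, hi) in BANDS.items()
    }

# Example on synthetic data: 2 s of noise standing in for one channel.
rng = np.random.default_rng(0)
print(band_powers(rng.standard_normal(2 * FS)))
```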
Affiliation(s)
- Yang Liu
- Department of Ophthalmology, First Hospital of China Medical University, Shenyang, China
- Ruibin Liu
- Department of Clinical Integration of Traditional Chinese and Western Medicine, Liaoning University of Traditional Chinese Medicine, Shenyang, China
- Department of General Surgery, Cancer Hospital of China Medical University, Liaoning Cancer Hospital & Institute, Shenyang, China
- Jinnian Ge
- Department of General Surgery, First Hospital of China Medical University, Shenyang, China
- Yue Wang
- Department of General Surgery, Cancer Hospital of China Medical University, Liaoning Cancer Hospital & Institute, Shenyang, China
2
Huang Y, Gopal J, Kakusa B, Li AH, Huang W, Wang JB, Persad A, Ramayya A, Parvizi J, Buch VP, Keller C. Naturalistic acute pain states decoded from neural and facial dynamics. bioRxiv [Preprint] 2024:2024.05.10.593652. PMID: 38766098; PMCID: PMC11100805; DOI: 10.1101/2024.05.10.593652.
Abstract
Pain is a complex experience that remains largely unexplored in naturalistic contexts, hindering our understanding of its neurobehavioral representation in ecologically valid settings. To address this, we employed a multimodal, data-driven approach integrating intracranial electroencephalography, pain self-reports, and facial expression quantification to characterize the neural and behavioral correlates of naturalistic acute pain in twelve epilepsy patients undergoing continuous monitoring with neural and audiovisual recordings. High self-reported pain states were associated with elevated blood pressure, increased pain medication use, and distinct facial muscle activations. Using machine learning, we successfully decoded individual participants' high versus low self-reported pain states from distributed neural activity patterns (mean AUC = 0.70), involving mesolimbic regions, striatum, and temporoparietal cortex. High self-reported pain states exhibited increased low-frequency activity in temporoparietal areas and decreased high-frequency activity in mesolimbic regions (hippocampus, cingulate, and orbitofrontal cortex) compared to low pain states. This neural pain representation remained stable for hours and was modulated by pain onset and relief. Objective facial expression changes also classified self-reported pain states, with results concordant with electrophysiological predictions. Importantly, we identified transient periods of momentary pain as a distinct naturalistic acute pain measure, which could be reliably differentiated from affect-neutral periods using intracranial and facial features, albeit with neural and facial patterns distinct from self-reported pain. These findings reveal reliable neurobehavioral markers of naturalistic acute pain across contexts and timescales, underscoring the potential for developing personalized pain interventions in real-world settings.
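For a concrete picture of the decoding setup described above (high vs. low self-reported pain classified from distributed neural features and scored by AUC), below is a minimal sketch on synthetic data. The feature construction, logistic-regression model, and cross-validation scheme are assumptions for illustration; the abstract states only that machine learning reached a mean AUC of 0.70.

```python
# Hedged sketch of a binary pain-state decoder evaluated with ROC AUC.
# Synthetic features stand in for the paper's intracranial recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_epochs, n_features = 200, 50          # e.g., band power per electrode (assumed)
X = rng.standard_normal((n_epochs, n_features))
y = rng.integers(0, 2, n_epochs)        # 1 = high pain, 0 = low pain
X[y == 1, :5] += 0.8                    # inject a weak decodable signal

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
aucs = cross_val_score(decoder, X, y, cv=5, scoring="roc_auc")
print(f"mean AUC = {aucs.mean():.2f}")
```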
Affiliation(s)
- Yuhao Huang
- Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Jay Gopal
- Brown University, Providence, RI 02912, USA
- Bina Kakusa
- Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Alice H. Li
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Weichen Huang
- Department of Neurology, Stanford University School of Medicine, Palo Alto, CA, USA
- Jeffrey B. Wang
- Department of Anesthesia and Critical Care Medicine, The Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
- Amit Persad
- Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Ashwin Ramayya
- Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Josef Parvizi
- Department of Neurology, Stanford University School of Medicine, Palo Alto, CA, USA
- Vivek P. Buch
- Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Corey Keller
- Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Palo Alto, CA, USA
- Wu Tsai Neuroscience Institute, Stanford University School of Medicine, Palo Alto, CA, USA
- Veterans Affairs Palo Alto Healthcare System and the Sierra Pacific Mental Illness Research, Education, and Clinical Center (MIRECC), Palo Alto, CA 94394, USA
3
Chen C, Messinger DS, Chen C, Yan H, Duan Y, Ince RAA, Garrod OGB, Schyns PG, Jack RE. Cultural facial expressions dynamically convey emotion category and intensity information. Curr Biol 2024;34:213-223.e5. PMID: 38141619; PMCID: PMC10831323; DOI: 10.1016/j.cub.2023.12.001.
Abstract
Communicating emotional intensity plays a vital ecological role because it provides valuable information about the nature and likelihood of the sender's behavior [1,2,3]. For example, attack often follows signals of intense aggression if receivers fail to retreat [4,5]. Humans regularly use facial expressions to communicate such information [6,7,8,9,10,11]. Yet how this complex signaling task is achieved remains unknown. We addressed this question using a perception-based, data-driven method to mathematically model the specific facial movements that receivers use to classify the six basic emotions ("happy," "surprise," "fear," "disgust," "anger," and "sad") and to judge their intensity in two distinct cultures (East Asian, Western European; total n = 120). In both cultures, receivers expected facial expressions to represent emotion category and intensity information dynamically over time, using a multi-component compositional signaling structure. Specifically, emotion intensifiers peaked earlier or later than emotion classifiers and represented intensity via amplitude variations. Emotion intensifiers were also more similar across emotions than classifiers were, suggesting a latent broad-plus-specific signaling structure. Cross-cultural analysis further revealed similarities and differences in expectations that could impact cross-cultural communication. Specifically, East Asian and Western European receivers had similar expectations about which facial movements represent high intensity for threat-related emotions such as "anger," "disgust," and "fear," but differed on those that represent low-threat emotions such as "happy" and "sad." Together, our results provide new insight into how facial expressions achieve complex dynamic signaling tasks by revealing the rich information embedded in them.
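The "amplitude encodes intensity, peak timing separates intensifiers from classifiers" structure described above can be illustrated with a small sketch. Here each facial action unit's time course is modeled as a Gaussian activation; the functional form and all parameter values are illustrative assumptions, not the authors' reverse-correlation method.

```python
# Hedged sketch: a toy model of dynamic facial-movement signals in which
# amplitude carries intensity and peak latency distinguishes movement roles.
import numpy as np

def au_time_course(t, amplitude, peak_time, width=0.15):
    """Activation of one action unit over time t (seconds)."""
    return amplitude * np.exp(-((t - peak_time) ** 2) / (2 * width ** 2))

t = np.linspace(0, 1, 100)                       # a 1 s expression (assumed)
classifier = au_time_course(t, amplitude=1.0, peak_time=0.5)
intensifier = au_time_course(t, amplitude=0.7, peak_time=0.3)  # peaks earlier

print("classifier peak at", t[classifier.argmax()], "s")
print("intensifier peak at", t[intensifier.argmax()], "s")
```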
Affiliation(s)
- Chaona Chen
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Daniel S Messinger
- Departments of Psychology, Pediatrics, and Electrical & Computer Engineering, University of Miami, 5665 Ponce De Leon Blvd, Coral Gables, FL 33146, USA
- Cheng Chen
- Foreign Language Department, Teaching Centre for General Courses, Chengdu Medical College, 601 Tianhui Street, Chengdu 610083, China
- Hongmei Yan
- The MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, North Jianshe Road, Chengdu 611731, China
- Yaocong Duan
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Robin A A Ince
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Oliver G B Garrod
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Philippe G Schyns
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Rachael E Jack
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK