1
Hamzah HA, Abdalla KK. EEG-based emotion recognition systems; comprehensive study. Heliyon 2024; 10:e31485. [PMID: 38818173 PMCID: PMC11137547 DOI: 10.1016/j.heliyon.2024.e31485]
Abstract
Emotion recognition through EEG signal analysis is currently a fundamental concept in artificial intelligence, with major practical implications in emotional health care, human-computer interaction, and related fields. This paper provides a comprehensive study of methods for extracting electroencephalography (EEG) features for emotion recognition from four perspectives: time-domain features, frequency-domain features, time-frequency features, and nonlinear features. We summarize the pattern recognition methods adopted in most related works and, given the rapid development of deep learning (DL) and the attention it is attracting in this field, pay particular attention to deep-learning-based studies, analysing their characteristics, advantages, disadvantages, and applicable scenarios. Finally, the current challenges and future development directions in this field are summarized. This paper can help novice researchers gain a systematic understanding of the current status of EEG-based emotion recognition research and provides ideas for subsequent related work.
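The frequency-domain feature family surveyed above can be made concrete with a small example. Below is a minimal numpy sketch of per-channel band-power features computed from an FFT periodogram; the band names and edges are conventional illustration choices, not values taken from the paper.

```python
import numpy as np

def band_power_features(eeg, fs, bands=None):
    """Per-channel spectral power in canonical EEG bands via the FFT periodogram."""
    if bands is None:
        bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 45)}
    n = eeg.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2 / (fs * n)  # periodogram
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., mask].sum(axis=-1))            # power in the band
    return np.stack(feats, axis=-1)                          # (channels, n_bands)

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 512))        # 32 channels, 2 s at 256 Hz (synthetic)
f = band_power_features(x, fs=256)
print(f.shape)  # (32, 5)
```

Time-domain, time-frequency, and nonlinear features would replace the periodogram step with, e.g., statistical moments, wavelet coefficients, or entropy measures over the same channel-by-time array.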
Affiliation(s)
- Hussein Ali Hamzah
- Electrical Engineering Department, College of Engineering, University of Babylon, Iraq
- Kasim K. Abdalla
- Electrical Engineering Department, College of Engineering, University of Babylon, Iraq
2
Zhu X, Liu C, Zhao L, Wang S. EEG Emotion Recognition Network Based on Attention and Spatiotemporal Convolution. Sensors (Basel) 2024; 24:3464. [PMID: 38894254 PMCID: PMC11174415 DOI: 10.3390/s24113464]
Abstract
Human emotions are complex psychological and physiological responses to external stimuli. Correctly identifying and providing feedback on emotions is an important goal in human-computer interaction research. Compared with facial expressions, speech, or other physiological signals, using electroencephalogram (EEG) signals for emotion recognition has advantages in authenticity, objectivity, and reliability; thus, it is attracting increasing attention from researchers. However, current methods leave significant room for improvement in combining information exchange between different brain regions with time-frequency feature extraction. Therefore, this paper proposes an EEG emotion recognition network, self-organized graph pseudo-3D convolution (SOGPCN), based on attention and spatiotemporal convolution. Unlike previous methods that directly construct graph structures for brain channels, SOGPCN considers that the spatial relationships between electrodes differ in each frequency band. First, a self-organizing map is constructed for each channel in each frequency band to obtain the 10 channels most relevant to the current channel, and graph convolution captures the spatial relationships between all channels in each self-organizing map. Then, pseudo-three-dimensional convolution combined with partial dot-product attention extracts the temporal features of the EEG sequence. Finally, an LSTM learns the contextual information between adjacent time-series data. Subject-dependent and subject-independent experiments on the SEED dataset yield recognition accuracies of 95.26% and 94.22%, respectively, indicating that the proposed method outperforms several baseline methods.
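The channel-selection idea above (keeping, for each channel, its 10 most relevant neighbours) can be approximated with a correlation-based adjacency matrix. This is an illustrative stand-in: the paper builds per-band self-organizing maps, whereas this sketch simply ranks channels by signal correlation.

```python
import numpy as np

def topk_adjacency(eeg, k=10):
    """Adjacency keeping, for each channel, its k most correlated neighbours."""
    c = np.corrcoef(eeg)              # (n_ch, n_ch) channel-by-channel correlations
    np.fill_diagonal(c, -np.inf)      # exclude self-connections from the ranking
    adj = np.zeros_like(c)
    for i in range(c.shape[0]):
        nbrs = np.argsort(c[i])[-k:]  # indices of the k strongest neighbours
        adj[i, nbrs] = 1.0
    return adj

rng = np.random.default_rng(1)
eeg = rng.standard_normal((62, 1000))  # 62 channels, as in the SEED montage
a = topk_adjacency(eeg, k=10)
print(a.sum(axis=1))                   # each row has exactly 10 ones
```

Such a (possibly asymmetric) adjacency would then feed a graph convolution; in the paper this happens separately per frequency band.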
Affiliation(s)
- Shengming Wang
- National Engineering Research Center of Educational Big Data, Central China Normal University, Wuhan 430079, China
3
Schindler S, Bruchmann M, Straube T. Beyond facial expressions: A systematic review on effects of emotional relevance of faces on the N170. Neurosci Biobehav Rev 2023; 153:105399. [PMID: 37734698 DOI: 10.1016/j.neubiorev.2023.105399]
Abstract
The N170 is the most prominent electrophysiological signature of face processing. While facial expressions reliably modulate the N170, there is considerable variance in N170 modulations by other sources of emotional relevance. We therefore systematically review and discuss research that uses different methods to manipulate the emotional relevance of inherently neutral faces. These methods were categorized as (1) existing pre-experimental affective person knowledge (e.g., negative attitudes towards outgroup faces), (2) experimentally instructed affective person knowledge (e.g., negative person information), (3) contingency-based affective learning (e.g., fear conditioning), or (4) the immediate affective context (e.g., emotional information directly preceding the face presentation). For all categories except the immediate affective context, the majority of studies reported significantly increased N170 amplitudes depending on the emotional relevance of faces. Furthermore, the potentiated N170 was observed across different attention conditions, supporting a role for the emotional relevance of faces in the early prioritized processing of configural facial information, regardless of low-level differences. However, we identify several open research questions and suggest avenues for further research.
Affiliation(s)
- Sebastian Schindler
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany
- Maximilian Bruchmann
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany
- Thomas Straube
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany
4
Effects of facial expression and gaze interaction on brain dynamics during a working memory task in preschool children. PLoS One 2022; 17:e0266713. [PMID: 35482742 PMCID: PMC9049575 DOI: 10.1371/journal.pone.0266713]
Abstract
Executive functioning in preschool children is important for building social relationships during the early stages of development. We investigated the brain dynamics of preschool children during an attention-shifting task involving congruent and incongruent gaze directions in emotional facial expressions (neutral, angry, and happy faces). Ignoring distracting stimuli (gaze direction and expression), participants (17 preschool children and 17 young adults) were required to detect and memorize the location (left or right) of a target symbol as a simple working memory task (i.e., no general priming paradigm in which a target appears after a cue stimulus). For the preschool children, the frontal late positive response and the central and parietal P3 responses increased for angry faces. In addition, parietal midline α (Pmα) power, which indexes changes in attention level, decreased mainly during target encoding for angry faces, possibly explaining the absence of a congruency effect on reaction times (i.e., no faster response in the congruent than in the incongruent gaze condition). For the adults, the parietal P3 response and frontal midline θ (Fmθ) power increased mainly during the encoding period for incongruent gaze shifts in happy faces. Pmα power for happy faces decreased for incongruent gaze during the encoding period and increased for congruent gaze during the first retention period. These results suggest that adults can quickly shift attention to a target in happy faces, allocating sufficient attentional resources to ignore incongruent gazes and detect a target, which can attenuate a congruency effect on reaction times. By contrast, possibly because of still-developing brain activity, preschool children did not show the happy-face superiority effect and may be more responsive to angry faces. These observations point to a key consideration for building better relationships between developing preschoolers and their parents and educators: incorporating nonverbal communication into social and emotional learning.
5
Bijanzadeh M, Khambhati AN, Desai M, Wallace DL, Shafi A, Dawes HE, Sturm VE, Chang EF. Decoding naturalistic affective behaviour from spectro-spatial features in multiday human iEEG. Nat Hum Behav 2022; 6:823-836. [PMID: 35273355 DOI: 10.1038/s41562-022-01310-0]
Abstract
The neurological basis of affective behaviours in everyday life is not well understood. We obtained continuous intracranial electroencephalography recordings from the human mesolimbic network in 11 participants with epilepsy and hand-annotated spontaneous behaviours from 116 h of multiday video recordings. In individual participants, binary random forest models decoded affective behaviours from neutral behaviours with up to 93% accuracy. Both positive and negative affective behaviours were associated with increased high-frequency and decreased low-frequency activity across the mesolimbic network. The insula, amygdala, hippocampus and anterior cingulate cortex made stronger contributions to affective behaviours than the orbitofrontal cortex, but the insula and anterior cingulate cortex were most critical for differentiating behaviours with observable affect from those without. In a subset of participants (N = 3), multiclass decoders distinguished amongst the positive, negative and neutral behaviours. These results suggest that spectro-spatial features of brain activity in the mesolimbic network are associated with affective behaviours of everyday life.
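The decoding setup described above (binary classification of affective versus neutral behaviour from spectro-spatial features) can be illustrated on synthetic data. The paper uses random forests; this sketch substitutes a minimal nearest-centroid classifier and fabricated features that mimic the reported spectral signature (more high-frequency, less low-frequency power during affect).

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_elec, n_bands = 200, 16, 5
# Synthetic spectro-spatial features: affective trials get more high-frequency
# and less low-frequency power, mimicking the reported pattern.
X = rng.standard_normal((n_trials, n_elec, n_bands))
y = rng.integers(0, 2, n_trials)            # 0 = neutral, 1 = affective
X[y == 1, :, -2:] += 1.0                    # boost the two highest bands
X[y == 1, :, :2] -= 1.0                     # suppress the two lowest bands
X = X.reshape(n_trials, -1)                 # flatten to (trials, features)

# Split, fit class centroids, classify held-out trials by nearest centroid.
train, test = np.arange(0, 150), np.arange(150, 200)
mu = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[test, None, :] - mu[None]) ** 2).sum(-1), axis=1)
acc = (pred == y[test]).mean()
print(round(acc, 2))  # high accuracy on held-out trials
```

A random forest would replace the centroid step; the feature layout (electrodes × frequency bands, flattened per trial) is the part this sketch is meant to show.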
Affiliation(s)
- Maryam Bijanzadeh
- Department of Neurological Surgery, University of California San Francisco, San Francisco, CA, USA
- Ankit N Khambhati
- Department of Neurological Surgery, University of California San Francisco, San Francisco, CA, USA
- Maansi Desai
- Department of Communication Sciences and Disorders, Moody College of Communication, University of Texas at Austin, Austin, TX, USA
- Deanna L Wallace
- Department of Mechanical Engineering, Psychology and Neurology, University of Texas at Austin, Austin, TX, USA
- Alia Shafi
- Department of Neurological Surgery, University of California San Francisco, San Francisco, CA, USA
- Heather E Dawes
- Department of Neurological Surgery, University of California San Francisco, San Francisco, CA, USA
- Virginia E Sturm
- Department of Neurology, UCSF Weill Institute for Neurosciences, University of California San Francisco, San Francisco, CA, USA
- Edward F Chang
- Department of Neurological Surgery, University of California San Francisco, San Francisco, CA, USA
6
Esposito D, Centracchio J, Andreozzi E, Gargiulo GD, Naik GR, Bifulco P. Biosignal-Based Human-Machine Interfaces for Assistance and Rehabilitation: A Survey. Sensors (Basel) 2021; 21:6863. [PMID: 34696076 PMCID: PMC8540117 DOI: 10.3390/s21206863]
Abstract
As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. This survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. Retrieved studies were screened at three levels (title, abstract, full text); eventually, 144 journal papers and 37 conference papers were included. Four macrocategories were used to classify the biosignals used for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMI complexity, so their usefulness should be carefully evaluated for each specific application.
Affiliation(s)
- Daniele Esposito
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Jessica Centracchio
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Emilio Andreozzi
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Gaetano D. Gargiulo
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The MARCS Institute, Western Sydney University, Penrith, NSW 2751, Australia
- Ganesh R. Naik
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The Adelaide Institute for Sleep Health, Flinders University, Bedford Park, SA 5042, Australia
- Paolo Bifulco
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
7
Steinert S, Friedrich O. Wired Emotions: Ethical Issues of Affective Brain-Computer Interfaces. Sci Eng Ethics 2020; 26:351-367. [PMID: 30868377 PMCID: PMC6978299 DOI: 10.1007/s11948-019-00087-2]
Abstract
Ethical issues concerning brain-computer interfaces (BCIs) have already received a considerable amount of attention. However, one particular form of BCI has not received the attention it deserves: affective BCIs, which allow for the detection and stimulation of affective states. This paper brings the ethical issues of affective BCIs into sharper focus. It briefly reviews recent applications of affective BCIs and considers the ethical issues that arise from them. Ethical issues that affective BCIs share with other neurotechnologies are presented, and concerns specific to affective BCIs are identified and discussed.
Affiliation(s)
- Steffen Steinert
- Department of Values, Technology and Innovation, Faculty of Technology, Policy and Management, Delft University of Technology, Delft, The Netherlands
- Orsolya Friedrich
- Institute of Ethics, History and Theory of Medicine, Ludwig-Maximilians-Universität München, Lessingstr. 2, 80336 Munich, Germany
8
Lulé D, Kübler A, Ludolph AC. Ethical Principles in Patient-Centered Medical Care to Support Quality of Life in Amyotrophic Lateral Sclerosis. Front Neurol 2019; 10:259. [PMID: 30967833 PMCID: PMC6439311 DOI: 10.3389/fneur.2019.00259]
Abstract
It is one of the primary goals of medical care to secure good quality of life (QoL) while prolonging survival. This is a major challenge in severe medical conditions with a poor prognosis, such as amyotrophic lateral sclerosis (ALS). Further, the definition of QoL, and whether survival in this severe condition is compatible with a good QoL, is a matter of subjective and culture-specific debate. Some people without neurodegenerative conditions believe that physical decline is incompatible with satisfactory QoL. Current data provide extensive evidence that psychosocial adaptation in ALS is possible, indicated by a satisfactory QoL. Thus, there is no fatalistic link between declining physical health and loss of QoL. Intrinsic and extrinsic factors have been shown to successfully facilitate and secure QoL in ALS; these are reviewed here along the four ethical principles regarded as key elements of patient-centered medical care according to Beauchamp and Childress: (1) beneficence, (2) non-maleficence, (3) autonomy, and (4) justice. This is a JPND-funded work summarizing findings of the project NEEDSinALS (www.NEEDSinALS.com), which highlights subjective perspectives and preferences in medical decision making in ALS.
Affiliation(s)
- Dorothée Lulé
- Department of Neurology, University of Ulm, Ulm, Germany
- Andrea Kübler
- Interventional Psychology, Psychology III, University of Würzburg, Würzburg, Germany
9
Grabowski K, Rynkiewicz A, Lassalle A, Baron-Cohen S, Schuller B, Cummins N, Baird A, Podgórska-Bednarz J, Pieniążek A, Łucka I. Emotional expression in psychiatric conditions: New technology for clinicians. Psychiatry Clin Neurosci 2019; 73:50-62. [PMID: 30565801 DOI: 10.1111/pcn.12799]
Abstract
AIM: Emotional expressions are one of the most widely studied topics in neuroscience, from both clinical and non-clinical perspectives. Atypical emotional expressions are seen in various psychiatric conditions, including schizophrenia, depression, and autism spectrum conditions. Understanding the basics of emotional expressions and recognition can be crucial for diagnostic and therapeutic procedures. Emotions can be expressed in the face, gesture, posture, voice, and behavior and affect physiological parameters, such as the heart rate or body temperature. With modern technology, clinicians can use a variety of tools ranging from sophisticated laboratory equipment to smartphones and web cameras. The aim of this paper is to review the currently used tools using modern technology and discuss their usefulness as well as possible future directions in emotional expression research and treatment strategies.
METHODS: The authors conducted a literature review in the PubMed, EBSCO, and SCOPUS databases, using the following key words: 'emotions,' 'emotional expression,' 'affective computing,' and 'autism.' The most relevant and up-to-date publications were identified and discussed. Search results were supplemented by the authors' own research in the field of emotional expression.
RESULTS: We present a critical review of the currently available technical diagnostic and therapeutic methods. The most important studies are summarized in a table.
CONCLUSION: Most of the currently available methods have not been adequately validated in clinical settings. They may be a great help in everyday practice; however, they need further testing. Future directions in this field include more virtual-reality-based and interactive interventions, as well as development and improvement of humanoid robots.
Affiliation(s)
- Karol Grabowski
- Department of Psychiatry, Adult Psychiatry Clinic, Faculty of Medicine, Medical University of Gdansk, Gdansk, Poland
- Agnieszka Rynkiewicz
- Neurodevelopmental Disorders Research Lab, Institute of Experimental and Clinical Medicine, Faculty of Medicine, University of Rzeszow, Rzeszow, Poland; Center for Diagnosis, Therapy and Education SPECTRUM ASC-MED, Gdansk & Rzeszow, Poland
- Amandine Lassalle
- Department of Psychology, Brain & Cognition, University of Amsterdam, Amsterdam, Netherlands
- Simon Baron-Cohen
- Autism Research Centre, Department of Psychiatry, University of Cambridge, Cambridge, UK
- Björn Schuller
- Department of Computing, GLAM - Group on Language, Audio, and Music, Imperial College London, London, UK
- Nicholas Cummins
- Department of Computing, GLAM - Group on Language, Audio, and Music, Imperial College London, London, UK
- Alice Baird
- Embedded Intelligence for Health Care and Wellbeing, University of Augsburg, Augsburg, Germany
- Justyna Podgórska-Bednarz
- Institute of Physiotherapy, Faculty of Medicine, University of Rzeszow, Rzeszow, Poland; Association for Children with Attention Deficit Hyperactivity Disorder in Rzeszow, Rzeszow, Poland
- Agata Pieniążek
- Institute of Physiotherapy, Faculty of Medicine, University of Rzeszow, Rzeszow, Poland; SOLIS RADIUS Association for People with Disabilities and Autism Spectrum Disorders in Rzeszow, Rzeszow, Poland; Medical Center for Children with Autism Spectrum Disorders in Rzeszow, Rzeszow, Poland
- Izabela Łucka
- Developmental Psychiatry, Psychotic and Geriatric Disorders Clinic, Department of Psychiatry, Faculty of Medicine, Medical University of Gdansk, Gdansk, Poland
10
Masood N, Farooq H. Investigating EEG Patterns for Dual-Stimuli Induced Human Fear Emotional State. Sensors (Basel) 2019; 19:522. [PMID: 30691180 PMCID: PMC6387207 DOI: 10.3390/s19030522]
Abstract
Most electroencephalography (EEG) based emotion recognition systems use videos and images as stimuli; few have used sounds, and even fewer studies involve self-induced emotions. Furthermore, most studies rely on a single stimulus type to evoke emotions, so the question of whether different stimuli eliciting the same emotion generate any subject-independent correlations remains unanswered. This paper introduces a dual-modality emotion elicitation paradigm to investigate whether emotions induced with different stimuli can be classified. A method based on the common spatial pattern (CSP) algorithm and linear discriminant analysis (LDA) is proposed to analyze human brain signals for fear emotions evoked with two different stimuli: self-induced emotional imagery and audio/video clips. The method extracts features with the CSP algorithm, and LDA performs classification. To investigate the associated EEG correlations, a spectral analysis was performed; to further improve performance, CSP was compared with other regularized techniques. Critical EEG channels are identified based on spatial filter weights. To the best of our knowledge, our work provides the first assessment of EEG correlations for self- versus video-induced emotions captured with a commercial-grade EEG device.
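The CSP-plus-LDA pipeline named above can be sketched in plain numpy. This is the textbook two-class CSP via whitening and eigendecomposition, not the authors' regularized variants, and the LDA stage is omitted; only the standard log-variance CSP features are computed, on synthetic trials.

```python
import numpy as np

def csp_filters(X1, X2, n_pairs=2):
    """Common spatial pattern filters from two classes of trials (trials, ch, time)."""
    def mean_cov(X):
        covs = [x @ x.T / np.trace(x @ x.T) for x in X]   # normalized trial covariances
        return np.mean(covs, axis=0)
    c1, c2 = mean_cov(X1), mean_cov(X2)
    # Solve the generalized eigenproblem via whitening of the composite covariance.
    evals, evecs = np.linalg.eigh(c1 + c2)
    P = evecs @ np.diag(evals ** -0.5) @ evecs.T          # whitening matrix
    d, V = np.linalg.eigh(P @ c1 @ P.T)                   # eigenvalues ascending
    W = V.T @ P                                           # full filter bank
    idx = np.r_[:n_pairs, -n_pairs:0]                     # extreme-eigenvalue pairs
    return W[idx]

def csp_features(W, X):
    """Log-variance of CSP-filtered trials -- the standard CSP feature."""
    Z = np.einsum('fc,nct->nft', W, X)
    return np.log(Z.var(axis=-1))

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 8, 256))                     # class A trials
B = rng.standard_normal((40, 8, 256))
B[:, 0] *= 3.0                                            # class B: channel 0 stronger
W = csp_filters(A, B)
fa, fb = csp_features(W, A), csp_features(W, B)
print(fa.shape, fb.shape)  # (40, 4) (40, 4)
```

An LDA classifier would then be fit on `fa`/`fb`; the spatial-filter rows of `W` are what the paper inspects to identify critical channels.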
Affiliation(s)
- Naveen Masood
- Electrical Engineering Department, Bahria University, Karachi 75260, Pakistan
- Humera Farooq
- Computer Science Department, Bahria University, Karachi 75260, Pakistan
11
Corsi MC, Chavez M, Schwartz D, Hugueville L, Khambhati AN, Bassett DS, De Vico Fallani F. Integrating EEG and MEG Signals to Improve Motor Imagery Classification in Brain–Computer Interface. Int J Neural Syst 2019; 29:1850014. [DOI: 10.1142/s0129065718500144]
Abstract
We adopted a fusion approach that combines features from simultaneously recorded electroencephalogram (EEG) and magnetoencephalogram (MEG) signals to improve classification performances in motor imagery-based brain–computer interfaces (BCIs). We applied our approach to a group of 15 healthy subjects and found a significant classification performance enhancement as compared to standard single-modality approaches in the alpha and beta bands. Taken together, our findings demonstrate the advantage of considering multimodal approaches as complementary tools for improving the impact of noninvasive BCIs.
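The fusion approach described above amounts, in its simplest form, to normalizing the features from each modality separately and concatenating them before classification. The feature counts below are arbitrary illustration values, not taken from the study.

```python
import numpy as np

def zscore(f):
    """Standardize each feature column so modalities contribute on equal scales."""
    return (f - f.mean(axis=0)) / f.std(axis=0)

rng = np.random.default_rng(4)
eeg_feats = rng.standard_normal((60, 32))   # 60 trials, 32 EEG band-power features
meg_feats = rng.standard_normal((60, 102))  # 60 trials, 102 MEG band-power features
# Fusion: normalize per modality so neither dominates, then concatenate into one
# feature vector per trial for the downstream classifier.
fused = np.hstack([zscore(eeg_feats), zscore(meg_feats)])
print(fused.shape)  # (60, 134)
```

Per-modality normalization before concatenation is the design point: without it, the modality with larger raw amplitudes would dominate any distance-based classifier.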
Affiliation(s)
- Marie-Constance Corsi
- Inria, Aramis project-team, F-75013, Paris, France
- Institut du Cerveau et de la Moelle épinière, ICM, F-75013, Paris, France
- Inserm, U 1127, F-75013, Paris, France
- CNRS, UMR 7225, F-75013, Paris, France
- Sorbonne Université, F-75013, Paris, France
- Denis Schwartz
- Centre de NeuroImagerie de Recherche — CENIR, Centre de Recherche de l’Institut du Cerveau et de la Moelle Épinière, Université Pierre et Marie Curie-Paris 6 UMR-S975, INSERM U975, CNRS UMR7225, Groupe Hospitalier Pitié-Salpêtrière, Paris, France
- Laurent Hugueville
- Centre de NeuroImagerie de Recherche — CENIR, Centre de Recherche de l’Institut du Cerveau et de la Moelle Épinière, Université Pierre et Marie Curie-Paris 6 UMR-S975, INSERM U975, CNRS UMR7225, Groupe Hospitalier Pitié-Salpêtrière, Paris, France
- Ankit N. Khambhati
- Department of Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, USA
- Danielle S. Bassett
- Department of Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, USA
- Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA 19104, USA
- Department of Physics, University of Pennsylvania, Philadelphia, PA 19104, USA
- Department of Neurology, University of Pennsylvania, Philadelphia, PA 19104, USA
- Fabrizio De Vico Fallani
- Inria, Aramis project-team, F-75013, Paris, France
- Institut du Cerveau et de la Moelle épinière, ICM, F-75013, Paris, France
- Inserm, U 1127, F-75013, Paris, France
- CNRS, UMR 7225, F-75013, Paris, France
- Sorbonne Université, F-75013, Paris, France
12
Li R, Zhang X, Lu Z, Liu C, Li H, Sheng W, Odekhe R. An Approach for Brain-Controlled Prostheses Based on a Facial Expression Paradigm. Front Neurosci 2018; 12:943. [PMID: 30618572 PMCID: PMC6305548 DOI: 10.3389/fnins.2018.00943]
Abstract
One of the most exciting areas of rehabilitation research is brain-controlled prostheses, which translate electroencephalography (EEG) signals into control commands that operate prostheses. However, existing brain-control methods face a trade-off between the choice of brain-computer interface (BCI) paradigm and its performance. In this paper, a novel BCI system based on a facial expression paradigm is proposed to control prostheses, using the characteristics of theta and alpha rhythms of the prefrontal and motor cortices. A portable brain-controlled prosthesis system was constructed to validate the feasibility of the facial-expression-based BCI (FE-BCI) system. Four types of facial expressions were used in this study. An effective filtering algorithm based on noise-assisted multivariate empirical mode decomposition (NA-MEMD) and sample entropy (SampEn) was used to remove electromyography (EMG) artifacts. A wavelet transform (WT) was applied to calculate the feature set, and a back-propagation neural network (BPNN) was employed as the classifier. To prove the effectiveness of the FE-BCI system for prosthesis control, 18 subjects took part in both offline and online experiments. The grand average accuracy over the 18 subjects was 81.31 ± 5.82% during the online experiment. The experimental results indicate that the proposed FE-BCI system achieves good performance and can be efficiently applied to prosthesis control.
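Of the processing chain above, the wavelet feature step can be sketched compactly. This example uses a Haar discrete wavelet transform and relative sub-band energies as the feature set; the abstract does not specify these exact choices, and the NA-MEMD filtering and BPNN stages are omitted.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (approximation, detail)."""
    even, odd = x[..., ::2], x[..., 1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def wavelet_energy_features(x, levels=4):
    """Relative energy of the detail sub-bands at each decomposition level."""
    feats, a = [], x
    for _ in range(levels):
        a, d = haar_dwt(a)                        # peel off one detail band
        feats.append((d ** 2).sum(axis=-1))
    feats.append((a ** 2).sum(axis=-1))           # final approximation energy
    e = np.stack(feats, axis=-1)
    return e / e.sum(axis=-1, keepdims=True)      # normalize to relative energy

rng = np.random.default_rng(5)
eeg = rng.standard_normal((8, 256))               # 8 channels, synthetic epoch
f = wavelet_energy_features(eeg)
print(f.shape)  # (8, 5)
```

The resulting per-channel energy vector is the kind of fixed-length feature set a small feed-forward network such as a BPNN can be trained on.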
Affiliation(s)
- Rui Li
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
- Xiaodong Zhang
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
- Zhufeng Lu
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
- Chang Liu
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
- Hanzhe Li
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
- Weihua Sheng
- School of Electrical and Computer Engineering, Oklahoma State University, Stillwater, OK, United States
- Shenzhen Academy of Robotics, Shenzhen, China
- Randolph Odekhe
- Shaanxi Key Laboratory of Intelligent Robot, Xi'an Jiaotong University, Xi'an, China
13
Dima DC, Perry G, Messaritaki E, Zhang J, Singh KD. Spatiotemporal dynamics in human visual cortex rapidly encode the emotional content of faces. Hum Brain Mapp 2018; 39:3993-4006. [PMID: 29885055 PMCID: PMC6175429 DOI: 10.1002/hbm.24226]
Abstract
Recognizing emotion in faces is important in human interaction and survival, yet existing studies do not paint a consistent picture of the neural representation supporting this task. To address this, we collected magnetoencephalography (MEG) data while participants passively viewed happy, angry and neutral faces. Using time-resolved decoding of sensor-level data, we show that responses to angry faces can be discriminated from happy and neutral faces as early as 90 ms after stimulus onset and only 10 ms later than faces can be discriminated from scrambled stimuli, even in the absence of differences in evoked responses. Time-resolved relevance patterns in source space track expression-related information from the visual cortex (100 ms) to higher-level temporal and frontal areas (200-500 ms). Together, our results point to a system optimised for rapid processing of emotional faces and preferentially tuned to threat, consistent with the important evolutionary role that such a system must have played in the development of human social interactions.
Affiliation(s)
- Diana C. Dima: Cardiff University Brain Research Imaging Centre (CUBRIC), School of Psychology, Cardiff University, Cardiff CF24 4HQ, United Kingdom
- Gavin Perry: Cardiff University Brain Research Imaging Centre (CUBRIC), School of Psychology, Cardiff University, Cardiff CF24 4HQ, United Kingdom
- Eirini Messaritaki: BRAIN Unit, School of Medicine, Cardiff University, Cardiff CF24 4HQ, United Kingdom; Cardiff University Brain Research Imaging Centre (CUBRIC), School of Psychology, Cardiff University, Cardiff CF24 4HQ, United Kingdom
- Jiaxiang Zhang: Cardiff University Brain Research Imaging Centre (CUBRIC), School of Psychology, Cardiff University, Cardiff CF24 4HQ, United Kingdom
- Krish D. Singh: Cardiff University Brain Research Imaging Centre (CUBRIC), School of Psychology, Cardiff University, Cardiff CF24 4HQ, United Kingdom
14
Lopatina OL, Komleva YK, Gorina YV, Higashida H, Salmina AB. Neurobiological Aspects of Face Recognition: The Role of Oxytocin. Front Behav Neurosci 2018; 12:195. PMID: 30210321; PMCID: PMC6121008; DOI: 10.3389/fnbeh.2018.00195.
Abstract
Face recognition is an important index in the formation of social cognition and neurodevelopment in humans. Changes in face perception and memory are connected with altered sociability, which is a symptom of numerous brain conditions including autism spectrum disorder (ASD). Various brain regions and neuropeptides are implicated in face processing. The neuropeptide oxytocin (OT) plays an important role in various social behaviors, including face and emotion recognition. Nasal OT administration is a promising new therapy that can address social cognition deficits in individuals with ASD. New instrumental neurotechnologies enable the assessment of brain region activation during specific social tasks and therapies, and can characterize the involvement of genes and peptides in impaired neurodevelopment. The present review sought to discuss some of the mechanisms of the face distinguishing process, the ability of OT to modulate social cognition, as well as new perspectives and technologies for research and rehabilitation of face recognition.
Affiliation(s)
- Olga L Lopatina: Department of Biochemistry, Medical, Pharmaceutical, and Toxicological Chemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Research Institute of Molecular Medicine and Pathobiochemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Department of Basic Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Kanazawa, Japan
- Yulia K Komleva: Department of Biochemistry, Medical, Pharmaceutical, and Toxicological Chemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Research Institute of Molecular Medicine and Pathobiochemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia
- Yana V Gorina: Department of Biochemistry, Medical, Pharmaceutical, and Toxicological Chemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Research Institute of Molecular Medicine and Pathobiochemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia
- Haruhiro Higashida: Research Institute of Molecular Medicine and Pathobiochemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Department of Basic Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Kanazawa, Japan
- Alla B Salmina: Department of Biochemistry, Medical, Pharmaceutical, and Toxicological Chemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Research Institute of Molecular Medicine and Pathobiochemistry, Krasnoyarsk State Medical University named after Prof. V.F. Voino-Yasenetsky, Krasnoyarsk, Russia; Department of Basic Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Kanazawa, Japan
15
Sun N, Lu H, Qu C. Sex differences in extinction to negative stimuli: Event-related brain potentials. Medicine (Baltimore) 2018; 97:e0503. PMID: 29703014; PMCID: PMC5944551; DOI: 10.1097/md.0000000000010503.
Abstract
There are controversial observations regarding whether females take longer to reach extinction than males, which may be related to different levels of conditioning acquisition and/or the influence of the menstrual cycle. We explored the electrophysiological evidence of sex differences in extinction. In this study, females in the luteal phase and in the menstrual phase were examined for event-related potentials (ERPs) and evidence of attention allocation in a conditioning model using electroencephalogram recordings. A group of male participants was also included for comparison. Women in the luteal phase showed a larger difference-waveform P3 amplitude to the conditioned stimulus (CS) in the extinction phase than women in the menstrual phase and men. The latency of P3 to CS+ was shorter in men than in women in the extinction phase, suggesting that men react faster than women to unconditioned stimulus (US) expectation. Our study revealed that women in the luteal phase allocated more attentional resources to the expectation of a US. In contrast, men displayed faster expectation of the extinguished US than women. Our results support the strength of ERP technology in documenting the neural mechanism of the extinction process.
Affiliation(s)
- Nan Sun: School of Education; Center for Brain and Cognitive Sciences, School of Education, Guangzhou University
- Hong Lu: School of Education; Center for Brain and Cognitive Sciences, School of Education, Guangzhou University
- Chen Qu: Psychology Research Center, South China Normal University, Guangzhou, China
16
Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review. Applied Sciences (Basel) 2017. DOI: 10.3390/app7121239.
17
Ahn S, Cho H, Kwon M, Kim K, Kwon H, Kim BS, Chang WS, Chang JW, Jun SC. Interbrain phase synchronization during turn-taking verbal interaction: a hyperscanning study using simultaneous EEG/MEG. Hum Brain Mapp 2017; 39:171-188. PMID: 29024193; DOI: 10.1002/hbm.23834.
Abstract
Recently, neurophysiological findings about social interaction have been investigated widely, and hardware has been developed that can measure multiple subjects' brain activities simultaneously. These hyperscanning studies have enabled us to discover new and important evidence of interbrain interactions. Yet, very little is known about verbal interaction without any visual input. Therefore, we conducted a new hyperscanning study of verbal, interbrain turn-taking interaction using simultaneous EEG/MEG, which measures rapidly changing brain activity. To establish turn-taking verbal interactions between a pair of subjects, we set up two EEG/MEG systems (19 and 146 channels of EEG and MEG, respectively) located ∼100 miles apart. Subjects engaged in verbal communication via condenser microphones and magnetic-compatible earphones, and a network time protocol synchronized the two systems. Ten subjects participated in this experiment and performed verbal interaction and noninteraction tasks separately. We found significant oscillations in the EEG alpha and MEG alpha/gamma bands in several brain regions for all subjects. Furthermore, we estimated phase synchronization between the two brains using the weighted phase lag index and found statistically significant synchronization in the EEG and MEG data. Our novel paradigm and neurophysiological findings may foster a basic understanding of the functional mechanisms involved in human social interactions. Hum Brain Mapp 39:171-188, 2018. © 2017 Wiley Periodicals, Inc.
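The connectivity measure named in this abstract, the weighted phase lag index (WPLI), weights each cross-spectral term by the magnitude of its imaginary part, so zero-lag coupling of the kind produced by volume conduction contributes little. A compact sketch on synthetic two-channel data (illustrative signals and epoch counts, not the study's 19/146-channel recordings):

```python
import numpy as np
from scipy.signal import hilbert

def wpli(x_epochs, y_epochs):
    """Weighted phase lag index between two channels across epochs.

    x_epochs, y_epochs: arrays of shape (n_epochs, n_samples).
    Built from the imaginary part of the cross-spectrum of the
    analytic signals; values range from 0 (no consistent lagged
    coupling) to 1 (perfectly consistent lagged coupling).
    """
    sx = hilbert(x_epochs, axis=-1)
    sy = hilbert(y_epochs, axis=-1)
    im = np.imag(sx * np.conj(sy))        # imaginary cross-spectral terms
    return np.abs(im.mean()) / np.abs(im).mean()

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256, endpoint=False)
n_epochs = 40
phase = 2 * np.pi * 8 * t                 # 8 Hz carrier

# Channel y consistently lags x by a quarter cycle -> high WPLI.
x = np.sin(phase) + 0.3 * rng.normal(size=(n_epochs, t.size))
y_lagged = np.sin(phase - np.pi / 2) + 0.3 * rng.normal(size=(n_epochs, t.size))

# Unrelated noise -> WPLI near zero.
y_noise = rng.normal(size=(n_epochs, t.size))

print(wpli(x, y_lagged), wpli(x, y_noise))
```

In hyperscanning work the two "channels" would come from the two participants' recordings, typically after band-pass filtering to the band of interest.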
Affiliation(s)
- Sangtae Ahn: Department of Psychiatry, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
- Hohyun Cho: New York State Department of Health, Wadsworth Center, Albany, New York
- Moonyoung Kwon: School of Electrical Engineering and Computer Science, Gwangju Institute of Science and Technology, Gwangju, South Korea
- Kiwoong Kim: Center for Biosignals, Korea Research Institute of Standards and Science, Daejeon, South Korea; Department of Medical Physics, University of Science and Technology, Daejeon, South Korea
- Hyukchan Kwon: Center for Biosignals, Korea Research Institute of Standards and Science, Daejeon, South Korea
- Bong Soo Kim: EIT/LOFUS R&D Center, Institute for Integrative Medicine, College of Medicine, Catholic Kwandong University, Gangneung-si, Gangwon-do, South Korea; Catholic Kwandong University International St. Mary's Hospital, Incheon, South Korea
- Won Seok Chang: Department of Neurosurgery, Brain Research Institute, Yonsei University College of Medicine, Seoul, South Korea
- Jin Woo Chang: Department of Neurosurgery, Brain Research Institute, Yonsei University College of Medicine, Seoul, South Korea
- Sung Chan Jun: School of Electrical Engineering and Computer Science, Gwangju Institute of Science and Technology, Gwangju, South Korea
18
McFarland DJ, Parvaz MA, Sarnacki WA, Goldstein RZ, Wolpaw JR. Prediction of subjective ratings of emotional pictures by EEG features. J Neural Eng 2017; 14:016009. PMID: 27934776; PMCID: PMC5476954; DOI: 10.1088/1741-2552/14/1/016009.
Abstract
OBJECTIVE Emotion dysregulation is an important aspect of many psychiatric disorders. Brain-computer interface (BCI) technology could be a powerful new approach to facilitating therapeutic self-regulation of emotions. One possible BCI method would be to provide stimulus-specific feedback based on subject-specific electroencephalographic (EEG) responses to emotion-eliciting stimuli. APPROACH To assess the feasibility of this approach, we studied the relationships between emotional valence/arousal and three EEG features: amplitude of alpha activity over frontal cortex; amplitude of theta activity over frontal midline cortex; and the late positive potential over central and posterior mid-line areas. For each feature, we evaluated its ability to predict emotional valence/arousal on both an individual and a group basis. Twenty healthy participants (9 men, 11 women; ages 22-68) rated each of 192 pictures from the IAPS collection in terms of valence and arousal twice (96 pictures on each of 4 days over 2 weeks). EEG was collected simultaneously and used to develop models based on canonical correlation to predict subject-specific single-trial ratings. Separate models were evaluated for the three EEG features: frontal alpha activity; frontal midline theta; and the late positive potential. In each case, these features were used to simultaneously predict both the normed ratings and the subject-specific ratings. MAIN RESULTS Models using each of the three EEG features with data from individual subjects were generally successful at predicting subjective ratings on training data, but generalization to test data was less successful. Sparse models performed better than models without regularization. SIGNIFICANCE The results suggest that the frontal midline theta is a better candidate than frontal alpha activity or the late positive potential for use in a BCI-based paradigm designed to modify emotional reactions.
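Two of the three EEG features in this study are band-limited amplitudes (frontal alpha, frontal midline theta). A hedged sketch of how such band power is commonly estimated with Welch's method, on synthetic data (the sampling rate, band edges, and signal are illustrative defaults, not the study's recording parameters):

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density of `signal` within `band` (Hz),
    estimated with Welch's averaged-periodogram method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

fs = 256  # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
t = np.arange(0, 10, 1 / fs)

# Synthetic "frontal midline" trace: strong 6 Hz theta plus white noise.
eeg = 2.0 * np.sin(2 * np.pi * 6 * t) + rng.normal(0, 1.0, t.size)

theta = band_power(eeg, fs, (4, 8))    # frontal midline theta band
alpha = band_power(eeg, fs, (8, 13))   # frontal alpha band

print(theta, alpha)
```

In the study's pipeline, per-trial features like these would feed the canonical-correlation models that predict single-trial valence/arousal ratings.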
Affiliation(s)
- Dennis J. McFarland: National Center for Adaptive Neurotechnologies, Wadsworth Center, New York State Department of Health, Albany, New York 12201-0509
- Muhammad A. Parvaz: Departments of Psychiatry and Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY 10029-6574
- William A. Sarnacki: National Center for Adaptive Neurotechnologies, Wadsworth Center, New York State Department of Health, Albany, New York 12201-0509
- Rita Z. Goldstein: Departments of Psychiatry and Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY 10029-6574
- Jonathan R. Wolpaw: National Center for Adaptive Neurotechnologies, Wadsworth Center, New York State Department of Health, Albany, New York 12201-0509
19
Quitadamo LR, Cavrini F, Sbernini L, Riillo F, Bianchi L, Seri S, Saggio G. Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review. J Neural Eng 2017; 14:011001. PMID: 28068295; DOI: 10.1088/1741-2552/14/1/011001.
Abstract
Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and large availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameters selection are reported, making it impossible to reproduce study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning reviewed papers are listed in tables and statistics of SVM use in the literature are presented. Suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.
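As a reminder of the primal objective the reviewed SVM toolboxes optimize, here is a minimal self-contained soft-margin linear SVM trained by stochastic subgradient descent (a Pegasos-style sketch in plain NumPy on a toy two-class feature set; real HCI pipelines would use one of the dedicated toolboxes the review surveys):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Pegasos-style stochastic subgradient training of a soft-margin
    linear SVM.  No bias term, so features should be roughly centered;
    y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            if y[i] * (X[i] @ w) < 1:        # hinge loss is active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                            # only the L2 penalty acts
                w = (1 - eta * lam) * w
    return w

# Toy problem: two Gaussian classes in a 2-D feature space (think
# alpha power vs. theta power per trial); purely illustrative.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)

w = train_linear_svm(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
print(accuracy)
```

The review's point about reproducibility applies directly here: the regularization constant (`lam`), the number of passes, and the preprocessing all change the decision boundary and should be reported.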
Affiliation(s)
- L R Quitadamo: Department of Electronic Engineering, University of Rome Tor Vergata, Rome, Italy; School of Life and Health Sciences, Aston Brain Center, Aston University, Birmingham, UK
20
Bogost MD, Burgos PI, Little CE, Woollacott MH, Dalton BH. Electrocortical Sources Related to Whole-Body Surface Translations during a Single- and Dual-Task Paradigm. Front Hum Neurosci 2016; 10:524. PMID: 27803658; PMCID: PMC5067303; DOI: 10.3389/fnhum.2016.00524.
Abstract
Appropriate reactive motor responses are essential in maintaining upright balance. However, little is known regarding the potential location of cortical sources that are related to the onset of a perturbation during single- and dual-task paradigms. The purpose of this study was to estimate the location of cortical sources in response to a whole-body surface translation and to determine whether diverted attention decreases the N1 event-related potential (ERP) amplitude related to a postural perturbation. This study utilized high-resolution electroencephalography in conjunction with measure projection analysis from ERPs time-locked to backwards surface translation onsets to determine which cortical sources were related to whole-body postural perturbations. Subjects (n = 15) either reacted to whole-body surface translations with (dual task) or without (single task) performing a visual working memory task. For the single task, four domains were identified that were mainly localized within the frontal and parietal lobes and included sources from the prefrontal, premotor, primary and supplementary motor, somatosensory and anterior cingulate cortex. Five domains were estimated for the dual task and also included sources within the frontal and parietal lobes, but the sources also shifted to other locations that included areas within the temporal and occipital lobes. Additionally, mean absolute N1 ERP amplitudes representing the activity from similar locations in both tasks were greater for the single than dual task. The present localization results highlight the importance of frontal, parietal and anterior cingulate cortical areas in reactive postural control and suggest a re-allocation or shift of cortical sources related to reactive balance control in the presence of a secondary task. Thus, this study provides novel insight into the underlying neurophysiology and contribution of cortical sources in relation to the neural control of reactive balance.
Affiliation(s)
- Mark D Bogost: Department of Human Physiology, University of Oregon, Eugene, OR, USA
- Pablo I Burgos: Department of Kinesiology, Universidad de Chile, Santiago, Chile
- C Elaine Little: Faculty of Kinesiology, University of Calgary, Calgary, AB, Canada
- Brian H Dalton: Department of Human Physiology, University of Oregon, Eugene, OR, USA
21
Ordikhani-Seyedlar M, Lebedev MA, Sorensen HBD, Puthusserypady S. Neurofeedback Therapy for Enhancing Visual Attention: State-of-the-Art and Challenges. Front Neurosci 2016; 10:352. PMID: 27536212; PMCID: PMC4971093; DOI: 10.3389/fnins.2016.00352.
Abstract
We have witnessed a rapid development of brain-computer interfaces (BCIs) linking the brain to external devices. BCIs can be utilized to treat neurological conditions and even to augment brain functions. BCIs offer a promising treatment for mental disorders, including disorders of attention. Here we review the current state of the art and challenges of attention-based BCIs, with a focus on visual attention. Attention-based BCIs utilize electroencephalograms (EEGs) or other recording techniques to generate neurofeedback, which patients use to improve their attention, a complex cognitive function. Although progress has been made in the studies of neural mechanisms of attention, extraction of attention-related neural signals needed for BCI operations is a difficult problem. To attain good BCI performance, it is important to select the features of neural activity that represent attentional signals. BCI decoding of attention-related activity may be hindered by the presence of different neural signals. Therefore, BCI accuracy can be improved by signal processing algorithms that dissociate signals of interest from irrelevant activities. Notwithstanding recent progress, optimal processing of attentional neural signals remains a fundamental challenge for the development of efficient therapies for disorders of attention.
Affiliation(s)
- Mehdi Ordikhani-Seyedlar: Division of Biomedical Engineering, Department of Electrical Engineering, Technical University of Denmark, Lyngby, Denmark
- Mikhail A Lebedev: Department of Neurobiology, Duke University, Durham, NC, USA; Center for Neuroengineering, Duke University, Durham, NC, USA
- Helge B D Sorensen: Division of Biomedical Engineering, Department of Electrical Engineering, Technical University of Denmark, Lyngby, Denmark
- Sadasivan Puthusserypady: Division of Biomedical Engineering, Department of Electrical Engineering, Technical University of Denmark, Lyngby, Denmark
22
Pfeiffer T, Heinze N, Frysch R, Deouell LY, Schoenfeld MA, Knight RT, Rose G. Extracting duration information in a picture category decoding task using hidden Markov Models. J Neural Eng 2016; 13:026010. PMID: 26859831; DOI: 10.1088/1741-2560/13/2/026010.
Abstract
OBJECTIVE Adapting classifiers for the purpose of brain signal decoding is a major challenge in brain-computer interface (BCI) research. In a previous study we showed in principle that hidden Markov models (HMM) are a suitable alternative to the well-studied static classifiers. However, since we investigated a rather straightforward task, advantages from modeling of the signal could not be assessed. APPROACH Here, we investigate a more complex data set in order to find out to what extent HMMs, as a dynamic classifier, can provide useful additional information. We show for a visual decoding problem that besides category information, HMMs can simultaneously decode picture duration without any additional training required. This decoding is based on a strong correlation that we found between picture duration and the behavior of the Viterbi paths. MAIN RESULTS Decoding accuracies of up to 80% could be obtained for category and duration decoding with a single classifier trained on category information only. SIGNIFICANCE The extraction of multiple types of information using a single classifier enables the processing of more complex problems, while preserving good training results even on small databases. Therefore, it provides a convenient framework for online real-life BCI applications.
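The duration decoding described here rides on the Viterbi path of a trained HMM. A minimal log-domain Viterbi sketch with toy transition/emission values (not the paper's trained model) shows the mechanism: the decoded state sequence tracks how long the observations stay in each regime.

```python
import numpy as np

def viterbi(log_A, log_B, log_pi, obs):
    """Most likely hidden-state path for a discrete HMM (log domain).

    log_A: (S, S) log transition matrix, log_B: (S, O) log emission
    matrix, log_pi: (S,) log initial distribution, obs: observation ids.
    """
    S = log_A.shape[0]
    T = len(obs)
    delta = np.empty((T, S))            # best log-score ending in state s
    back = np.zeros((T, S), dtype=int)  # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # (prev state, next state)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # Trace the best path backwards.
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Two sticky hidden states that prefer distinct observations; the
# decoded path should switch exactly where the observations do, and
# the dwell time in each state is the "duration" information.
A = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
B = np.log(np.array([[0.8, 0.2], [0.2, 0.8]]))
pi = np.log(np.array([0.5, 0.5]))

obs = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(viterbi(A, B, pi, obs))
```

Counting consecutive identical states along the returned path gives a per-segment duration estimate, which is the quantity the paper correlates with picture duration.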
Affiliation(s)
- Tim Pfeiffer: Institute for Medical Engineering, Otto-von-Guericke-University Magdeburg, Germany
23
Kontson KL, Megjhani M, Brantley JA, Cruz-Garza JG, Nakagome S, Robleto D, White M, Civillico E, Contreras-Vidal JL. Your Brain on Art: Emergent Cortical Dynamics During Aesthetic Experiences. Front Hum Neurosci 2015; 9:626. PMID: 26635579; PMCID: PMC4649259; DOI: 10.3389/fnhum.2015.00626.
Abstract
The brain response to conceptual art was studied with mobile electroencephalography (EEG) to examine the neural basis of aesthetic experiences. In contrast to most studies of perceptual phenomena, participants were moving and thinking freely as they viewed the exhibit The Boundary of Life is Quietly Crossed by Dario Robleto at the Menil Collection-Houston. The brain activity of over 400 subjects was recorded over a period of 3 months using dry-electrode EEG systems and one reference gel-based system. Here, we report initial findings based on the reference system. EEG segments corresponding to each art piece were grouped into one of three classes (complex, moderate, and baseline) based on analysis of a digital image of each piece. Time, frequency, and wavelet features extracted from EEG were used to classify patterns associated with viewing art, and ranked based on their relevance for classification. The maximum classification accuracy was 55% (chance = 33%) with delta and gamma features the most relevant for classification. Functional analysis revealed a significant increase in connection strength in localized brain networks while subjects viewed the most aesthetically pleasing art compared to viewing a blank wall. The direction of signal flow showed early recruitment of broad posterior areas followed by focal anterior activation. Significant differences in the strength of connections were also observed across age and gender. This work provides evidence that EEG, deployed on freely behaving subjects, can detect selective signal flow in neural networks, identify significant differences between subject groups, and report with greater-than-chance accuracy the complexity of a subject's visual percept of aesthetically pleasing art. Our approach, which allows acquisition of neural activity “in action and context,” could lead to understanding of how the brain integrates sensory input and its ongoing internal state to produce the phenomenon which we term aesthetic experience.
Affiliation(s)
- Kimberly L Kontson: Office of Science and Engineering Laboratories, Division of Biomedical Physics, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, MD, USA; Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
- Murad Megjhani: Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
- Justin A Brantley: Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
- Jesus G Cruz-Garza: Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
- Sho Nakagome: Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
- Dario Robleto: American Artist, Houston, TX, USA; The Menil Collection, Houston, TX, USA
- Eugene Civillico: Office of Science and Engineering Laboratories, Division of Biomedical Physics, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, MD, USA
- Jose L Contreras-Vidal: Laboratory for Non-Invasive Brain Machine Interfaces, Department of Electrical and Computer Engineering, University of Houston, Houston, TX, USA
24
Liberati G, Federici S, Pasqualotto E. Extracting neurophysiological signals reflecting users’ emotional and affective responses to BCI use: A systematic literature review. NeuroRehabilitation 2015; 37:341-58. DOI: 10.3233/nre-151266.
Affiliation(s)
- Giulia Liberati: Université Catholique de Louvain, Institute of Neuroscience, Louvain, Belgium
- Stefano Federici: Università di Perugia, Department of Philosophy, Social & Human Sciences and Education, Perugia, Italy
25
van Erp JBF, Brouwer AM, Zander TO. Editorial: Using neurophysiological signals that reflect cognitive or affective state. Front Neurosci 2015; 9:193. PMID: 26074763; PMCID: PMC4448037; DOI: 10.3389/fnins.2015.00193.
Affiliation(s)
- Jan B F van Erp: TNO Human Factors, Soesterberg, Netherlands; Human Media Interaction, University of Twente, Enschede, Netherlands
26
Brouwer AM, Zander TO, van Erp JBF, Korteling JE, Bronkhorst AW. Using neurophysiological signals that reflect cognitive or affective state: six recommendations to avoid common pitfalls. Front Neurosci 2015; 9:136. PMID: 25983676; PMCID: PMC4415417; DOI: 10.3389/fnins.2015.00136.
Abstract
Estimating cognitive or affective state from neurophysiological signals and designing applications that make use of this information requires expertise in many disciplines such as neurophysiology, machine learning, experimental psychology, and human factors. This makes it difficult to perform research that is strong in all its aspects as well as to judge a study or application on its merits. On the occasion of the special topic "Using neurophysiological signals that reflect cognitive or affective state" we here summarize often occurring pitfalls and recommendations on how to avoid them, both for authors (researchers) and readers. They relate to defining the state of interest, the neurophysiological processes that are expected to be involved in the state of interest, confounding factors, inadvertently "cheating" with classification analyses, insight on what underlies successful state estimation, and finally, the added value of neurophysiological measures in the context of an application. We hope that this paper will support the community in producing high quality studies and well-validated, useful applications.
Affiliation(s)
- Anne-Marie Brouwer: Perceptual and Cognitive Systems, Netherlands Organisation for Applied Scientific Research (TNO), Soesterberg, Netherlands
- Thorsten O. Zander: Team PhyPA, Biological Psychology and Neuroergonomics, Technical University Berlin, Germany
- Jan B. F. van Erp: Perceptual and Cognitive Systems, Netherlands Organisation for Applied Scientific Research (TNO), Soesterberg, Netherlands; Human Media Interaction, Twente University, Enschede, Netherlands
- Johannes E. Korteling: Training Performance Innovations, Netherlands Organisation for Applied Scientific Research (TNO), Soesterberg, Netherlands
- Adelbert W. Bronkhorst: Perceptual and Cognitive Systems, Netherlands Organisation for Applied Scientific Research (TNO), Soesterberg, Netherlands; Cognitive Psychology, VU University, Amsterdam, Netherlands