1
Tea A, Ovid D. A Model for Emotional Intelligence in Biology Education Research. CBE Life Sci Educ 2024; 23:es12. [PMID: 39437126 DOI: 10.1187/cbe.23-10-0198]
Abstract
Informed by social science fields including psychology and public health, we propose a Model for Emotional Intelligence to advance biology education research in affective learning. The model offers a shared discourse for biology education researchers to develop and assess evidence-based strategies to perceive, use, understand, and manage emotions for students and instructors in life sciences classrooms. We begin by reviewing the connection between stress, emotional invalidation, Sense of Belonging, and Science Identity as it relates to emotions in undergraduate life sciences classrooms. Next, we highlight the impact that emotionally invalidating classroom environments have on science students' development of psychological distress, maladaptive coping, and high-risk behaviors. Assuming Emotional Intelligence can be taught and learned (i.e., the ability model of Emotional Intelligence), we develop a Model for Emotional Intelligence to advance biology education research in this arena. This essay aims to inform assessments of current and future interventions designed to counteract emotional invalidation and encourage the development of emotional management among students and instructors. In alignment with our collective effort to support student well-being in the life sciences, the study of Emotional Intelligence in undergraduate biology education has the potential to support student mental health as future scientists and health care practitioners.
Affiliation(s)
- Ash Tea
- Department of Physiology and Pharmacology, University of Georgia, Athens, GA 30602
- Dax Ovid
- Department of Physiology and Pharmacology, University of Georgia, Athens, GA 30602
2
Domínguez-Oliva A, Chávez C, Martínez-Burnes J, Olmos-Hernández A, Hernández-Avalos I, Mota-Rojas D. Neurobiology and Anatomy of Facial Expressions in Great Apes: Application of the AnimalFACS and Its Possible Association with the Animal's Affective State. Animals (Basel) 2024; 14:3414. [PMID: 39682379 DOI: 10.3390/ani14233414]
Abstract
The Facial Action Coding System (FACS) is an anatomically based system to study facial expression in humans. Currently, it is recognized that nonhuman animals, particularly nonhuman primates, have an extensive facial ethogram that changes according to the context and affective state. The facial expression of great apes, the closest species to humans, has been studied using the ChimpFACS and OrangFACS as reliable tools to code facial expressions. However, although the FACS does not infer animal emotions, making additional evaluations and associating the facial changes with other parameters could contribute to understanding the facial expressions of nonhuman primates during positive or negative emotions. The present review aims to discuss the neural correlates and anatomical components of emotional facial expression in great apes. It will focus on the use of Facial Action Coding Systems (FACSs) and the movements of the facial muscles (AUs) of chimpanzees, orangutans, and gorillas and their possible association with the affective state of great apes.
Affiliation(s)
- Adriana Domínguez-Oliva
- PhD Program in Biological and Health Sciences, Universidad Autónoma Metropolitana (UAM), Mexico City 04960, Mexico
- Neurophysiology of Pain, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City 04960, Mexico
- Cuauhtémoc Chávez
- Departamento de Ciencias Ambientales, CBS, Universidad Autónoma Metropolitana-Lerma, Lerma de Villada 52005, Mexico
- Julio Martínez-Burnes
- Facultad de Medicina Veterinaria y Zootecnia, Universidad Autónoma de Tamaulipas, Victoria City 87000, Mexico
- Adriana Olmos-Hernández
- Bioterio and Experimental Surgery, Instituto Nacional de Rehabilitación-Luis Guillermo Ibarra Ibarra (INR-LGII), Mexico City 14389, Mexico
- Ismael Hernández-Avalos
- Biological Sciences Department, Facultad de Estudios Superiores Cuautitlán, Universidad Nacional Autónoma de México, Cuautitlán 54714, Mexico
- Daniel Mota-Rojas
- Neurophysiology of Pain, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City 04960, Mexico
3
Buron L, Perreault S, Sultan S, Bonanno M, Coltin H, Laverdière C, Rondeau É, Desjardins L. Full and Partial Facial Affect Recognition in Pediatric Brain Tumour Survivors and Typically Developing Children Following COVID-19 Pandemic. Curr Oncol 2024; 31:4546-4558. [PMID: 39195322 DOI: 10.3390/curroncol31080339]
Abstract
Affect recognition has emerged as a potential mechanism underlying the social competence challenges experienced by pediatric brain tumour survivors (PBTSs). However, many social interactions were altered during the pandemic, with the widespread use of masking potentially impacting affect recognition abilities. Here, we examine affect recognition in PBTSs and typically developing youth (TD) after the onset of the global pandemic. Twenty-three PBTSs and 24 TD between 8 and 16 years old were recruited and completed two performance-based affect recognition tasks (full and partial facial features) and a self-reported questionnaire on mask exposure in their social interactions. Their parents completed parent proxy questionnaires on their child's social adjustment and sociodemographics. The scores between the PBTSs and TD did not differ significantly in full (t(45) = 1.33, p = 0.19, d = 0.39, 95% CI [-0.69, 3.40]) or partial (t(37.36) = 1.56, p = 0.13, d = 0.46, 95% CI [-0.47, 3.60]) affect recognition, suggesting similar affect recognition between the two groups. These skills were also not significantly correlated with social adjustment or mask exposure (p > 0.05). However, the combined sample had significantly better scores in affect recognition when exposed to partial facial cues versus full. Additionally, participants obtained lower scores on a measure of full facial affect recognition and higher scores on a measure of partial affect recognition compared to pre-pandemic data. The pandemic may have influenced affect recognition across youth, underscoring the importance of further research into its lasting impact on the social competence of youth.
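The group comparison reported above (a t statistic with degrees of freedom, a p value, Cohen's d, and a 95% confidence interval) follows the standard two-sample pattern. A minimal sketch of how such a comparison might be computed is shown below; the score arrays are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical affect-recognition scores for the two groups (illustrative only).
pbts = np.array([18, 21, 17, 20, 19, 22, 16, 20, 18, 21, 19, 17])
td = np.array([20, 22, 19, 23, 21, 20, 22, 18, 21, 23, 20, 22])

# Welch's t-test (unequal variances), consistent with the fractional degrees of
# freedom reported for the partial-features comparison in the abstract above.
res = stats.ttest_ind(pbts, td, equal_var=False)

# Cohen's d using the pooled standard deviation.
pooled_sd = np.sqrt(((len(pbts) - 1) * pbts.var(ddof=1) +
                     (len(td) - 1) * td.var(ddof=1)) / (len(pbts) + len(td) - 2))
d = (pbts.mean() - td.mean()) / pooled_sd

print(f"t = {res.statistic:.2f}, p = {res.pvalue:.3f}, Cohen's d = {d:.2f}")
```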
Affiliation(s)
- Laurianne Buron
- Department of Psychology, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Sébastien Perreault
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Department of Neurosciences, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
- Serge Sultan
- Department of Psychology, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Marco Bonanno
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Hallie Coltin
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Department of Pediatrics, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
- Caroline Laverdière
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Department of Pediatrics, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
- Émélie Rondeau
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Leandra Desjardins
- Sainte-Justine's University Health Center, 3175 Chem. de la Côte-Sainte-Catherine, Montreal, QC H3T 1C5, Canada
- Department of Pediatrics, Université de Montréal, 2900 Bd Édouard-Montpetit, Montreal, QC H3T 1J4, Canada
4
Lebert A, Vilarroya O, Stins J. Stepping into emotions: investigating the effect of angry and fearful faces on forward stepping and quiet stance. Front Hum Neurosci 2024; 18:1411246. [PMID: 39183817 PMCID: PMC11341305 DOI: 10.3389/fnhum.2024.1411246]
Abstract
Introduction Facial expressions conveying an emotion may affect social interactions, such as approach- or avoidance-related behaviors. A specific facial feature is the gaze direction. An emotional facial expression such as anger will elicit distinct behavioral tendencies, depending on whether the angry gaze is directed toward the onlooker, or in a different direction. We tested whether facial expressions of anger and fear, combined with direct or averted gaze, elicit approach- or avoidance tendencies, using a go/no-go variant of the whole-body stepping task. Method Healthy adults stood on a force plate, recording the center of pressure (COP). Participants were presented with angry or fearful faces, either with direct or averted gaze. Participants had to identify the emotion and, depending on instructions, either make a single step forward or remain in a quiet stance. From the COP of the forward steps, we derived parameters such as reaction time and step size. From the quiet standing trials we derived parameters of postural sway, indicative of postural "freeze." We used analysis of variance to analyze the outcomes. Results and discussion First, we found that steps were initiated faster with angry faces than with fearful faces, in line with existing literature. Second, we did not observe a significant effect of gaze direction. Forward steps with direct and averted gaze had similar COP characteristics. Finally, we had expected to find freeze (postural immobility) with fearful faces, but this was also not observed. We discuss various explanations for the finding, and implications for research into the motoric grounding of social interactions.
Affiliation(s)
- Angélique Lebert
- Unitat de Recerca en Neurociència Cognitiva, Departament de Psiquiatria i Medicina Legal, Universitat Autònoma de Barcelona, Barcelona, Spain
- Hospital del Mar Research Institute, Barcelona, Spain
- Oscar Vilarroya
- Unitat de Recerca en Neurociència Cognitiva, Departament de Psiquiatria i Medicina Legal, Universitat Autònoma de Barcelona, Barcelona, Spain
- Hospital del Mar Research Institute, Barcelona, Spain
- John Stins
- Department of Human Movement Sciences, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam Movement Sciences, Amsterdam, Netherlands
5
Fuchs M, Kersting A, Suslow T, Bodenschatz CM. Recognizing and Looking at Masked Emotional Faces in Alexithymia. Behav Sci (Basel) 2024; 14:343. [PMID: 38667139 PMCID: PMC11047507 DOI: 10.3390/bs14040343]
Abstract
Alexithymia is a clinically relevant personality construct characterized by difficulties identifying and communicating one's emotions and externally oriented thinking. Alexithymia has been found to be related to poor emotion decoding and diminished attention to the eyes. The present eye tracking study investigated whether high levels of alexithymia are related to impairments in recognizing emotions in masked faces and reduced attentional preference for the eyes. An emotion recognition task with happy, fearful, disgusted, and neutral faces with face masks was administered to high-alexithymic and non-alexithymic individuals. Hit rates, latencies of correct responses, and fixation duration on eyes and face mask were analyzed as a function of group and sex. Alexithymia had no effects on accuracy and speed of emotion recognition. However, alexithymic men showed less attentional preference for the eyes relative to the mask than non-alexithymic men, which was due to their increased attention to face masks. No fixation duration differences were observed between alexithymic and non-alexithymic women. Our data indicate that high levels of alexithymia might not have adverse effects on the efficiency of emotion recognition from faces wearing masks. Future research on gaze behavior during facial emotion recognition in high alexithymia should consider sex as a moderating variable.
Affiliation(s)
- Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, 04103 Leipzig, Germany; (M.F.); (A.K.); (C.M.B.)
6
Bianchini E, Rinaldi D, Alborghetti M, Simonelli M, D'Audino F, Onelli C, Pegolo E, Pontieri FE. The Story behind the Mask: A Narrative Review on Hypomimia in Parkinson's Disease. Brain Sci 2024; 14:109. [PMID: 38275529 PMCID: PMC10814039 DOI: 10.3390/brainsci14010109]
Abstract
Facial movements are crucial for social and emotional interaction and well-being. Reduced facial expression (i.e., hypomimia) is a common feature in patients with Parkinson's disease (PD), and previous studies have linked this manifestation both to motor symptoms of the disease and to altered emotion recognition and processing. Nevertheless, research on facial motor impairment in PD has been rather scarce, and only a limited number of clinical evaluation tools are available, often suffering from poor validation processes and high inter- and intra-rater variability. In recent years, the availability of technology-enhanced methods for quantifying facial movements, such as automated video analysis and machine learning applications, has led to increasing interest in studying hypomimia in PD. In this narrative review, we summarize the current knowledge on the pathophysiological hypotheses underlying hypomimia in PD, with particular focus on the association between reduced facial expression and emotional processing, and we analyze the current evaluation tools and management strategies for this symptom, as well as future research perspectives.
Affiliation(s)
- Edoardo Bianchini
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; (E.B.); (D.R.); (M.A.); (M.S.)
- AGEIS, Université Grenoble Alpes, 38000 Grenoble, France
- Sant’Andrea University Hospital, 00189 Rome, Italy;
- Domiziana Rinaldi
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; (E.B.); (D.R.); (M.A.); (M.S.)
- Sant’Andrea University Hospital, 00189 Rome, Italy;
- Marika Alborghetti
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; (E.B.); (D.R.); (M.A.); (M.S.)
- Sant’Andrea University Hospital, 00189 Rome, Italy;
- Marta Simonelli
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; (E.B.); (D.R.); (M.A.); (M.S.)
- Ospedale dei Castelli, ASL Rome 6, 00040 Ariccia, Italy
- Camilla Onelli
- Department of Molecular Medicine, University of Padova, 35121 Padova, Italy;
- Elena Pegolo
- Department of Information Engineering, University of Padova, 35131 Padova, Italy;
- Francesco E. Pontieri
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; (E.B.); (D.R.); (M.A.); (M.S.)
- Sant’Andrea University Hospital, 00189 Rome, Italy;
- Fondazione Santa Lucia IRCCS, 00179 Rome, Italy
7
Rohrbeck P, Kersting A, Suslow T. Trait anger and negative interpretation bias in neutral face perception. Front Psychol 2023; 14:1086784. [PMID: 37213369 PMCID: PMC10196385 DOI: 10.3389/fpsyg.2023.1086784]
Abstract
Introduction Anger is a basic emotion helping people to achieve goals by preparing the body for action and prompting others to change their behavior but is also associated with health issues and risks. Trait anger, the disposition to experience angry feelings, goes along with an attribution of hostile traits to others. Negative distortions in the interpretation of social information have also been observed in anxiety and depression. The present study examined the associations between components of anger and negative interpretation tendencies in the perception of ambiguous and neutral schematic faces controlling for anxiety, depressed mood, and other variables. Methods A sample of 150 young adults performed a computer-based perception of facial expressions task and completed the State-Trait Anger Expression Inventory (STAXI-2) along with other self-report measures and tests. Results Trait anger and anger expression correlated with the perception of negative affects in neutral but not in ambiguous faces. More specifically, trait anger was linked to the attribution of anger, sadness, and anxiety to neutral faces. Trait anger predicted perceived negative affects in neutral faces when adjusting for anxiety, depression, and state anger. Discussion For neutral schematic faces, the present data support an association between trait anger and negatively biased interpretation of facial expression, which is independent of anxiety and depressed mood. The negative interpretation of neutral schematic faces in trait angry individuals seems not only to comprise the attribution of anger but also of negative emotions signaling weakness. Neutral schematic facial expressions might be useful stimuli in the future study of anger-related interpretation biases.
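The adjusted analysis described above (trait anger predicting perceived negative affect in neutral faces while controlling for anxiety, depression, and state anger) corresponds to an ordinary multiple regression. A minimal sketch follows; the data frame and column names are hypothetical stand-ins, not the study's variables or data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data; column names are illustrative placeholders.
df = pd.DataFrame({
    "negative_affect_neutral": [2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 1.9, 3.6, 2.7, 3.0],
    "trait_anger":             [14, 22, 12, 19, 21, 16, 13, 24, 18, 20],
    "anxiety":                 [30, 42, 28, 38, 40, 33, 29, 45, 36, 39],
    "depression":              [5, 11, 4, 9, 10, 7, 5, 12, 8, 9],
    "state_anger":             [10, 15, 10, 13, 14, 11, 10, 16, 12, 13],
})

# Does trait anger predict negative interpretations of neutral faces after
# adjusting for anxiety, depressed mood, and state anger?
model = smf.ols(
    "negative_affect_neutral ~ trait_anger + anxiety + depression + state_anger",
    data=df,
).fit()
print(model.summary())
```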
8
Kaye LK, Rocabado JF, Rodriguez-Cuadrado S, Jones BR, Malone SA, Wall HJ, Duñabeitia JA. Exploring the (lack of) facilitative effect of emoji for word processing. Comput Human Behav 2022. [DOI: 10.1016/j.chb.2022.107563]
9
Hocking MC, Schultz RT, Minturn JE, Brodsky C, Albee M, Herrington JD. Reduced Fusiform Gyrus Activation During Face Processing in Pediatric Brain Tumor Survivors. J Int Neuropsychol Soc 2022; 28:937-946. [PMID: 34605383 PMCID: PMC8977397 DOI: 10.1017/s135561772100117x]
Abstract
OBJECTIVE The neural mechanisms contributing to the social problems of pediatric brain tumor survivors (PBTS) are unknown. Face processing is important to social communication, social behavior, and peer acceptance. Research with other populations with social difficulties, namely autism spectrum disorder, suggests atypical brain activation in areas important for face processing. This case-controlled functional magnetic resonance imaging (fMRI) study compared brain activation during face processing in PBTS and typically developing (TD) youth. METHODS Participants included 36 age-, gender-, and IQ-matched youth (N = 18 per group). PBTS were at least 5 years from diagnosis and 2 years from the completion of tumor therapy. fMRI data were acquired during a face identity task and a control condition. Groups were compared on activation magnitude within the fusiform gyrus for the faces condition compared to the control condition. Correlational analyses evaluated associations between neuroimaging metrics and indices of social behavior for PBTS participants. RESULTS Both groups demonstrated face-specific activation within the social brain for the faces condition compared to the control condition. PBTS showed significantly decreased activation for faces in the medial portions of the fusiform gyrus bilaterally compared to TD youth, ps ≤ .004. Higher peak activity in the left fusiform gyrus was associated with better socialization (r = .53, p < .05). CONCLUSIONS This study offers initial evidence of atypical activation in a key face processing area in PBTS. Such atypical activation may underlie some of the social difficulties of PBTS. Social cognitive neuroscience methodologies may elucidate the neurobiological bases for PBTS social behavior.
Affiliation(s)
- Matthew C. Hocking
- Children’s Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA 19104, USA
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd., Philadelphia, PA 19104, USA
- Correspondence and reprint requests to: Matthew C. Hocking, Ph.D., Division of Oncology, The Children’s Hospital of Philadelphia, 3615 Civic Center Blvd., 1427B Abramson Pediatric Research Center, Philadelphia, PA 19104, USA.
- Robert T. Schultz
- Children’s Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA 19104, USA
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd., Philadelphia, PA 19104, USA
- Jane E. Minturn
- Children’s Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA 19104, USA
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd., Philadelphia, PA 19104, USA
- Cole Brodsky
- Children’s Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA 19104, USA
- May Albee
- Children’s Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA 19104, USA
- John D. Herrington
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd., Philadelphia, PA 19104, USA
10
Domínguez-Oliva A, Mota-Rojas D, Hernández-Avalos I, Mora-Medina P, Olmos-Hernández A, Verduzco-Mendoza A, Casas-Alvarado A, Whittaker AL. The neurobiology of pain and facial movements in rodents: Clinical applications and current research. Front Vet Sci 2022; 9:1016720. [PMID: 36246319 PMCID: PMC9556725 DOI: 10.3389/fvets.2022.1016720]
Abstract
One of the most controversial aspects of the use of animals in science is the production of pain. Pain is a central ethical concern. The activation of neural pathways involved in the pain response has physiological, endocrine, and behavioral consequences that can affect both the health and welfare of the animals and the validity of research. Preventing these consequences requires understanding of the nociception process, pain itself, and how assessment can be performed using validated, non-invasive methods. Facial expressions related to pain have undergone considerable study, with the finding that certain movements of the facial muscles (called facial action units) are associated with the presence and intensity of pain. This review, focused on rodents, discusses the neurobiology of facial expressions, clinical applications, and current research designed to better understand pain and the nociceptive pathway as a strategy for implementing refinement in biomedical research.
Affiliation(s)
- Adriana Domínguez-Oliva
- Master in Science Program “Maestría en Ciencias Agropecuarias”, Universidad Autónoma Metropolitana, Mexico City, Mexico
- Daniel Mota-Rojas
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana, Mexico City, Mexico
- Ismael Hernández-Avalos
- Facultad de Estudios Superiores Cuautitlán, Universidad Nacional Autónoma de México, Cuautitlán Izcalli, Mexico
- Patricia Mora-Medina
- Facultad de Estudios Superiores Cuautitlán, Universidad Nacional Autónoma de México, Cuautitlán Izcalli, Mexico
- Adriana Olmos-Hernández
- Division of Biotechnology-Bioterio and Experimental Surgery, Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra, Mexico City, Mexico
- Antonio Verduzco-Mendoza
- Division of Biotechnology-Bioterio and Experimental Surgery, Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra, Mexico City, Mexico
- Alejandro Casas-Alvarado
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana, Mexico City, Mexico
- Alexandra L. Whittaker
- School of Animal and Veterinary Sciences, The University of Adelaide, Roseworthy, SA, Australia
11
Sampedro F, Martínez-Horta S, Horta-Barba A, Grothe MJ, Labrador-Espinosa MA, Jesús S, Adarmes-Gomez A, Carrillo F, Puig-Davi A, Roldan-Lora F, Aguilar-Barbera M, Pastor P, Escalante Arroyo S, Solano-Vila B, Cots-Foraster A, Ruiz-Martínez J, Carrillo-Padilla F, Pueyo-Morlans M, Gonzalez-Aramburu I, Infante-Ceberio J, Hernandez-Vara J, de Fabregues-Boixar O, de Deus Fonticoba T, Avila A, Martínez-Castrillo JC, Bejr-Kasem H, Campolongo A, Pascual-Sedano B, Martínez-Martín P, Santos-García D, Mir P, Garcia-Ruiz PJ, Kulisevsky J. Clinical and structural brain correlates of hypomimia in early-stage Parkinson's disease. Eur J Neurol 2022; 29:3720-3727. [DOI: 10.1111/ene.15513]
Affiliation(s)
- Frederic Sampedro
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Saul Martínez-Horta
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Andrea Horta-Barba
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Michel J. Grothe
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología y Neurofisiología Clínica, Instituto de Biomedicina de Sevilla (IBiS), Hospital Universitario Virgen del Rocío/CSIC/Universidad de Sevilla Seville Spain
- Miguel A. Labrador-Espinosa
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología y Neurofisiología Clínica, Instituto de Biomedicina de Sevilla (IBiS), Hospital Universitario Virgen del Rocío/CSIC/Universidad de Sevilla Seville Spain
- Silvia Jesús
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología y Neurofisiología Clínica, Instituto de Biomedicina de Sevilla (IBiS), Hospital Universitario Virgen del Rocío/CSIC/Universidad de Sevilla Seville Spain
- Astrid Adarmes-Gomez
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología y Neurofisiología Clínica, Instituto de Biomedicina de Sevilla (IBiS), Hospital Universitario Virgen del Rocío/CSIC/Universidad de Sevilla Seville Spain
- Fatima Carrillo
- Movement Disorders Unit, Neurology Department, Hospital Universitario Virgen del Rocío Seville Spain
- Arnau Puig-Davi
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Florinda Roldan-Lora
- Unidad de Radiodiagnostico, Hospital Universitario Virgen del Rocío Seville Spain
- Pau Pastor
- Hospital Universitari Mutua de Terrassa, Terrassa Barcelona Spain
- Berta Solano-Vila
- Institut Catala de la Salud (Girona), Institut d’Assistencia Sanitaria (IAS) Spain
- Anna Cots-Foraster
- Institut Catala de la Salud (Girona), Institut d’Assistencia Sanitaria (IAS) Spain
- Javier Ruiz-Martínez
- Instituto de Investigacion Biodonostia, Hospital Universitario Donostia San Sebastian Spain
- Isabel Gonzalez-Aramburu
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital Universitario Marques de Valdecilla Santander Spain
- Jon Infante-Ceberio
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital Universitario Marques de Valdecilla Santander Spain
- Jorge Hernandez-Vara
- Neurology Department and Neurodegenerative Diseases Research Group, Vall D’Hebron Universitary Campus Barcelona Spain
- Oriol de Fabregues-Boixar
- Neurology Department and Neurodegenerative Diseases Research Group, Vall D’Hebron Universitary Campus Barcelona Spain
- Asuncion Avila
- Consorci Sanitari Integral, Hospital General de L’Hospitalet, L’Hospitalet de Llobregat Barcelona Spain
- Helena Bejr-Kasem
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Antonia Campolongo
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Berta Pascual-Sedano
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
- Pablo Martínez-Martín
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Pablo Mir
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología y Neurofisiología Clínica, Instituto de Biomedicina de Sevilla (IBiS), Hospital Universitario Virgen del Rocío/CSIC/Universidad de Sevilla Seville Spain
- Jaime Kulisevsky
- Centro de Investigacion Biomedica en Red Sobre Enfermedades Neurodegenerativas (CIBERNED), Instituto de Salud Carlos III Madrid Spain
- Unidad de Trastornos del Movimiento, Servicio de Neurología, Hospital de Sant Pau Barcelona Spain
- Instituto de Investigacion del Hospital de Sant Pau Barcelona Spain
12
Neuromodulation of facial emotion recognition in health and disease: A systematic review. Neurophysiol Clin 2022; 52:183-201. [DOI: 10.1016/j.neucli.2022.03.005]
13
Grahlow M, Rupp CI, Derntl B. The impact of face masks on emotion recognition performance and perception of threat. PLoS One 2022; 17:e0262840. [PMID: 35148327 PMCID: PMC8836371 DOI: 10.1371/journal.pone.0262840]
Abstract
Facial emotion recognition is crucial for social interaction. However, in times of a global pandemic, where wearing a face mask covering mouth and nose is widely encouraged to prevent the spread of disease, successful emotion recognition may be challenging. In the current study, we investigated whether emotion recognition, assessed by a validated emotion recognition task, is impaired for faces wearing a mask compared to uncovered faces, in a sample of 790 participants between 18 and 89 years (condition mask vs. original). In two more samples of 395 and 388 participants between 18 and 70 years, we assessed emotion recognition performance for faces that are occluded by something other than a mask, i.e., a bubble as well as only showing the upper part of the faces (condition half vs. bubble). Additionally, perception of threat for faces with and without occlusion was assessed. We found impaired emotion recognition for faces wearing a mask compared to faces without mask, for all emotions tested (anger, fear, happiness, sadness, disgust, neutral). Further, we observed that perception of threat was altered for faces wearing a mask. Upon comparison of the different types of occlusion, we found that, for most emotions and especially for disgust, there seems to be an effect that can be ascribed to the face mask specifically, both for emotion recognition performance and perception of threat. Methodological constraints as well as the importance of wearing a mask despite temporarily compromised social interaction are discussed.
Affiliation(s)
- Melina Grahlow
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Graduate Training Centre of Neuroscience, University of Tübingen, Tübingen, Germany
- Tübingen Center for Mental Health (TüCMH), Tübingen, Germany
- Claudia Ines Rupp
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical University of Innsbruck, Innsbruck, Austria
- Birgit Derntl
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Tübingen Center for Mental Health (TüCMH), Tübingen, Germany
- Tübingen Neuro Campus, University of Tübingen, Tübingen, Germany
- Lead Graduate School, University of Tübingen, Tübingen, Germany
14
Yao L, Dai Q, Wu Q, Liu Y, Yu Y, Guo T, Zhou M, Yang J, Takahashi S, Ejima Y, Wu J. Eye Size Affects Cuteness in Different Facial Expressions and Ages. Front Psychol 2022; 12:674456. [PMID: 35087437 PMCID: PMC8786738 DOI: 10.3389/fpsyg.2021.674456]
Abstract
Researchers have suggested that infants exhibiting baby schema are considered cute. Previous studies have mainly focused on changes in overall baby schema facial features. However, whether a change in eye size alone affects the perception of cuteness across different facial expressions and ages has not been explicitly evaluated until now. In the present study, a paired comparison method and a 7-point scale were used to investigate the effects of eye size on perceived cuteness across facial expressions (positive, neutral, and negative) and ages (adults and infants). The results show that stimuli with large eyes were perceived to be cuter than both unmanipulated eyes and small eyes across all facial expressions and age groups. This suggests not only that the effect of baby schema on cuteness is based on changes in a set of features but also that eye size as an individual feature can affect the perception of cuteness.
Affiliation(s)
- Lichang Yao
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Qi Dai
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Qiong Wu
- School of Education, Suzhou University of Science and Technology, Suzhou, China
- Yang Liu
- School of Education, Suzhou University of Science and Technology, Suzhou, China
- Yiyang Yu
- Cognitive Neuroscience Laboratory, Graduate School of Natural Science and Technology, Okayama University, Okayama, Japan
- Ting Guo
- Cognitive Neuroscience Laboratory, Graduate School of Natural Science and Technology, Okayama University, Okayama, Japan
- Mengni Zhou
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Jiajia Yang
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Satoshi Takahashi
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Yoshimichi Ejima
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Jinglong Wu
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Research Center for Medical Artificial Intelligence, Shenzhen Institute of Advanced Technology, Chinese Academy of Science, Shenzhen, China
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
15
Khomchenkova A, Prokopenko S, Gurevich V, Peresunko P. Diagnosis of hypomimia in Parkinson's disease. Zh Nevrol Psikhiatr Im S S Korsakova 2022; 122:24-29. [DOI: 10.17116/jnevro202212211224]
16
Suslow T, Kersting A. Beyond Face and Voice: A Review of Alexithymia and Emotion Perception in Music, Odor, Taste, and Touch. Front Psychol 2021; 12:707599. [PMID: 34393944 PMCID: PMC8362879 DOI: 10.3389/fpsyg.2021.707599]
Abstract
Alexithymia is a clinically relevant personality trait characterized by deficits in recognizing and verbalizing one's emotions. It has been shown that alexithymia is related to an impaired perception of external emotional stimuli, but previous research focused on emotion perception from faces and voices. Since sensory modalities represent rather distinct input channels it is important to know whether alexithymia also affects emotion perception in other modalities and expressive domains. The objective of our review was to summarize and systematically assess the literature on the impact of alexithymia on the perception of emotional (or hedonic) stimuli in music, odor, taste, and touch. Eleven relevant studies were identified. On the basis of the reviewed research, it can be preliminary concluded that alexithymia might be associated with deficits in the perception of primarily negative but also positive emotions in music and a reduced perception of aversive taste. The data available on olfaction and touch are inconsistent or ambiguous and do not allow to draw conclusions. Future investigations would benefit from a multimethod assessment of alexithymia and control of negative affect. Multimodal research seems necessary to advance our understanding of emotion perception deficits in alexithymia and clarify the contribution of modality-specific and supramodal processing impairments.
Affiliation(s)
- Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Anette Kersting
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
17
Hocking MC, Albee M, Brodsky C, Shabason E, Wang L, Schultz RT, Herrington J. Face Processing and Social Functioning in Pediatric Brain Tumor Survivors. J Pediatr Psychol 2021; 46:1267-1275. [PMID: 34313751 DOI: 10.1093/jpepsy/jsab067]
Abstract
OBJECTIVE Pediatric brain tumor survivors (PBTS) experience deficits in social functioning. Facial expression and identity recognition are key components of social information processing and are widely studied as an index of social difficulties in youth with autism spectrum disorder (ASD) and other neurodevelopmental conditions. This study evaluated facial expression and identity recognition among PBTS, youth with ASD, and typically developing (TD) youth, and the associations between these face processing skills and social impairments. METHODS PBTS (N = 54; ages 7-16) who completed treatment at least 2 years prior were matched with TD (N = 43) youth and youth with ASD (N = 55) based on sex and IQ. Parents completed a measure of social impairments and youth completed a measure of facial expression and identity recognition. RESULTS Groups significantly differed on social impairments (p < .001), with youth with ASD scoring highest followed by PBTS and lastly TD youth. Youth with ASD performed significantly worse on the two measures of facial processing, while TD youth and PBTS were not statistically different. The association of facial expression recognition and social impairments was moderated by group, such that PBTS with higher levels of social impairment performed worse on the expression task compared to TD and ASD groups (p < .01, η2 = 0.07). CONCLUSIONS Variability in face processing may be uniquely important to the social challenges of PBTS compared to other neurodevelopmental populations. Future directions include prospectively examining associations between facial expression recognition and social difficulties in PBTS and face processing training as an intervention for PBTS.
Affiliation(s)
- Matthew C Hocking
- Children's Hospital of Philadelphia and The University of Pennsylvania
- Leah Wang
- Children's Hospital of Philadelphia and The University of Pennsylvania
- Robert T Schultz
- Children's Hospital of Philadelphia and The University of Pennsylvania
- John Herrington
- Children's Hospital of Philadelphia and The University of Pennsylvania
18
Suslow T, Günther V, Hensch T, Kersting A, Bodenschatz CM. Alexithymia Is Associated With Deficits in Visual Search for Emotional Faces in Clinical Depression. Front Psychiatry 2021; 12:668019. [PMID: 34267686 PMCID: PMC8275928 DOI: 10.3389/fpsyt.2021.668019]
Abstract
Background: The concept of alexithymia is characterized by difficulties identifying and describing one's emotions. Alexithymic individuals are impaired in the recognition of others' emotional facial expressions. Alexithymia is quite common in patients suffering from major depressive disorder. The face-in-the-crowd task is a visual search paradigm that assesses processing of multiple facial emotions. In the present eye-tracking study, the relationship between alexithymia and visual processing of facial emotions was examined in clinical depression. Materials and Methods: Gaze behavior and manual response times of 20 alexithymic and 19 non-alexithymic depressed patients were compared in a face-in-the-crowd task. Alexithymia was empirically measured via the 20-item Toronto Alexithymia-Scale. Angry, happy, and neutral facial expressions of different individuals were shown as target and distractor stimuli. Our analyses of gaze behavior focused on latency to the target face, number of distractor faces fixated before fixating the target, number of target fixations, and number of distractor faces fixated after fixating the target. Results: Alexithymic patients exhibited in general slower decision latencies compared to non-alexithymic patients in the face-in-the-crowd task. Patient groups did not differ in latency to target, number of target fixations, and number of distractors fixated prior to target fixation. However, after having looked at the target, alexithymic patients fixated more distractors than non-alexithymic patients, regardless of expression condition. Discussion: According to our results, alexithymia goes along with impairments in visual processing of multiple facial emotions in clinical depression. Alexithymia appears to be associated with delayed manual reaction times and prolonged scanning after the first target fixation in depression, but it might have no impact on the early search phase. The observed deficits could indicate difficulties in target identification and/or decision-making when processing multiple emotional facial expressions. Impairments of alexithymic depressed patients in processing emotions in crowds of faces seem not limited to a specific affective valence. In group situations, alexithymic depressed patients might be slowed in processing interindividual differences in emotional expressions compared with non-alexithymic depressed patients. This could represent a disadvantage in understanding non-verbal communication in groups.
Affiliation(s)
- Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Vivien Günther
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Tilman Hensch
- Department of Psychiatry and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Department of Psychology, IU International University of Applied Science, Erfurt, Germany
- Anette Kersting
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Charlott Maria Bodenschatz
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
19
Khafif TC, Rotenberg LDS, Nascimento C, Beraldi GH, Lafer B. Emotion regulation in pediatric bipolar disorder: A meta-analysis of published studies. J Affect Disord 2021; 285:86-96. [PMID: 33639359 DOI: 10.1016/j.jad.2021.02.010]
Abstract
BACKGROUND Emotion regulation is a relatively recent topic in psychiatry, and has only recently begun to be tested across Pediatric Bipolar Disorder (PBD). To date, no meta-analysis has investigated the presence of emotion regulation deficits in PBD patients. OBJECTIVES The aim of this study is to understand where the literature stands on this topic, as well as how different researchers are measuring and grasping the concept of emotion regulation in pediatric bipolar disorders. METHODS A systematic search of trials using the terms ("Pediatric Bipolar Disorder") AND ("Emotion Regulation" OR "Affect Regulation" OR "Mood Lability" OR "Mood Instability" OR "Irritability") was conducted using PubMed, Google Scholar, ResearchGate, Web of Science and Psych Info databases. Of the initial 366 articles identified, 8 met eligibility criteria for the meta-analysis and were included in this study. RESULTS There is a statistically significant difference in Accuracy in Emotion Regulation tasks, with a tendency for lower accuracy in PBD patients; however, both groups did not differ statistically regarding Response Time. CONCLUSION Our data suggests that PBD patients do present emotion regulation deficits, particularly regarding facial emotion recognition and affective language interference tasks mediated by cognitive assignments. These results have important implications in developing novel psychotherapeutic interventions for this population.
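The pooling step behind a meta-analysis of this kind can be illustrated with a fixed-effect, inverse-variance model. The sketch below uses invented per-study effect sizes and variances; the abstract does not report the study-level values, so these numbers are placeholders only.

```python
import numpy as np

# Hypothetical standardized mean differences (e.g., Hedges' g) and variances
# for eight studies; illustrative values, not the effects from this meta-analysis.
g = np.array([-0.45, -0.30, -0.60, -0.20, -0.35, -0.50, -0.25, -0.40])
v = np.array([0.040, 0.055, 0.070, 0.050, 0.045, 0.080, 0.060, 0.050])

w = 1.0 / v                            # inverse-variance weights
g_pooled = np.sum(w * g) / np.sum(w)   # fixed-effect pooled estimate
se = np.sqrt(1.0 / np.sum(w))          # standard error of the pooled effect
ci_low, ci_high = g_pooled - 1.96 * se, g_pooled + 1.96 * se
z = g_pooled / se

print(f"pooled g = {g_pooled:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}], z = {z:.2f}")
```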
Affiliation(s)
- Tatiana Cohab Khafif
- Bipolar Disorder Program (PROMAN), Department of Psychiatry, University of São Paulo Medical School, Rua Dr. Ovídio Pires de Campos, 785, São Paulo, Brazil.
- Luisa de Siqueira Rotenberg
- Bipolar Disorder Program (PROMAN), Department of Psychiatry, University of São Paulo Medical School, Rua Dr. Ovídio Pires de Campos, 785, São Paulo, Brazil
- Camila Nascimento
- Bipolar Disorder Program (PROMAN), Department of Psychiatry, University of São Paulo Medical School, Rua Dr. Ovídio Pires de Campos, 785, São Paulo, Brazil
- Gabriel Henrique Beraldi
- Bipolar Disorder Program (PROMAN), Department of Psychiatry, University of São Paulo Medical School, Rua Dr. Ovídio Pires de Campos, 785, São Paulo, Brazil
- Beny Lafer
- Bipolar Disorder Program (PROMAN), Department of Psychiatry, University of São Paulo Medical School, Rua Dr. Ovídio Pires de Campos, 785, São Paulo, Brazil
20
Csoltova E, Mehinagic E. Where Do We Stand in the Domestic Dog (Canis familiaris) Positive-Emotion Assessment: A State-of-the-Art Review and Future Directions. Front Psychol 2020; 11:2131. [PMID: 33013543 PMCID: PMC7506079 DOI: 10.3389/fpsyg.2020.02131]
Abstract
Although there have been a growing number of studies focusing on dog welfare, the research field concerning dog positive-emotion assessment remains mostly unexplored. This paper aims to provide a state-of-the-art review and summary of the scattered and dispersed research on dog positive-emotion assessment. The review details the current state of dog positive-emotion research and the approaches, measures, methods, and techniques that have been implemented so far in the assessment of emotion perception, processing, and response. Moreover, we propose possible future research directions for the assessment of short-term emotions as well as longer-term emotional states in dogs. The review ends by identifying and addressing some methodological limitations and by pointing out further methodological research needs.
21
Rosenberg N, Ihme K, Lichev V, Sacher J, Rufer M, Grabe HJ, Kugel H, Pampel A, Lepsien J, Kersting A, Villringer A, Suslow T. Alexithymia and automatic processing of facial emotions: behavioral and neural findings. BMC Neurosci 2020; 21:23. [PMID: 32471365 PMCID: PMC7257227 DOI: 10.1186/s12868-020-00572-6]
Abstract
BACKGROUND Alexithymia is a personality trait characterized by difficulties identifying and describing feelings, an externally oriented style of thinking, and a reduced inclination to imagination. Previous research has shown deficits in the recognition of emotional facial expressions in alexithymia and reductions of brain responsivity to emotional stimuli. Using an affective priming paradigm, we investigated automatic perception of facial emotions as a function of alexithymia at the behavioral and neural level. In addition to self-report scales, we applied an interview to assess alexithymic tendencies. RESULTS During 3 T fMRI scanning, 49 healthy individuals judged valence of neutral faces preceded by briefly shown happy, angry, fearful, and neutral facial expressions. Alexithymia was assessed using the 20-Item Toronto Alexithymia Scale (TAS-20), the Bermond-Vorst Alexithymia Questionnaire (BVAQ) and the Toronto Structured Interview for Alexithymia (TSIA). As expected, only negative correlations were found between alexithymic features and affective priming. The global level of self-reported alexithymia (as assessed by the TAS-20 and the BVAQ) was found to be related to less affective priming owing to angry faces. At the facet level, difficulties identifying feelings, difficulties analyzing feelings, and impoverished fantasy (as measured by the BVAQ) were correlated with reduced affective priming due to angry faces. Difficulties identifying feelings (BVAQ) correlated also with reduced affective priming due to fearful faces and reduced imagination (TSIA) was related to decreased affective priming due to happy faces. There was only one significant correlation between alexithymia dimensions and automatic brain response to masked facial emotions: TAS-20 alexithymia correlated with heightened brain response to masked happy faces in superior and medial frontal areas. CONCLUSIONS Our behavioral results provide evidence that alexithymic features are related in particular to less sensitivity for covert facial expressions of anger. The perceptual alterations could reflect impaired automatic recognition or integration of social anger signals into judgemental processes and might contribute to the problems in interpersonal relationships associated with alexithymia. Our findings suggest that self-report measures of alexithymia may have an advantage over interview-based tests as research tools in the field of emotion perception at least in samples of healthy individuals characterized by rather low levels of alexithymia.
Affiliation(s)
- Nicole Rosenberg
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstrasse 10, 04103 Leipzig, Germany
- Klas Ihme
- Institute of Transportation Systems, German Aerospace Center, Lilienthalplatz 7, 38108 Brunswick, Germany
- Vladimir Lichev
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstrasse 10, 04103 Leipzig, Germany
| | - Julia Sacher
- Department of Neurology, Max-Planck-Institute of Human Cognitive and Brain Sciences, Stephanstraße 1, 04103 Leipzig, Germany
- Clinic of Cognitive Neurology, University of Leipzig, Liebigstrasse 18, 04103 Leipzig, Germany
| | - Michael Rufer
- Department of Psychiatry, Psychotherapy and Psychosomatics, University Hospital Zurich, University of Zurich, Militärstrasse 8, 8021 Zurich, Switzerland
| | - Hans Jörgen Grabe
- Department of Psychiatry, University Medicine of Greifswald, Ellernholzstraße 1-2, 17475 Greifswald, Germany
| | - Harald Kugel
- Department of Clinical Radiology, University of Münster, Albert-Schweitzer-Campus 1, 48149 Münster, Germany
| | - André Pampel
- Nuclear Magnetic Resonance Unit, Max-Planck-Institute of Human Cognitive and Brain Sciences, Stephanstraße 1, 04103 Leipzig, Germany
| | - Jöran Lepsien
- Nuclear Magnetic Resonance Unit, Max-Planck-Institute of Human Cognitive and Brain Sciences, Stephanstraße 1, 04103 Leipzig, Germany
| | - Anette Kersting
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstrasse 10, 04103 Leipzig, Germany
| | - Arno Villringer
- Department of Neurology, Max-Planck-Institute of Human Cognitive and Brain Sciences, Stephanstraße 1, 04103 Leipzig, Germany
- Clinic of Cognitive Neurology, University of Leipzig, Liebigstrasse 18, 04103 Leipzig, Germany
| | - Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstrasse 10, 04103 Leipzig, Germany
| |
Collapse
|
22
|
Association Between Hypomimia and Mild Cognitive Impairment in De Novo Parkinson's Disease Patients. Can J Neurol Sci 2020; 47:855-857. [PMID: 32406363 DOI: 10.1017/cjn.2020.93] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
|
23
|
The State of Automated Facial Expression Analysis (AFEA) in Evaluating Consumer Packaged Beverages. BEVERAGES 2020. [DOI: 10.3390/beverages6020027] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/03/2023]
Abstract
In the late 1970s, analysis of facial expressions to unveil emotional states began to grow and flourish along with new technologies and software advances. Researchers have always been able to document what consumers do, but understanding how consumers feel at a specific moment in time is an important part of the product development puzzle. Because of this, biometric testing methods have been used in numerous studies, as researchers have worked to develop a more comprehensive understanding of consumers. Despite the many articles on automated facial expression analysis (AFEA), the literature on food and beverage studies is limited. There are no standards to guide researchers in setting up materials, processing data, or conducting a study, and there are few, if any, compilations of the studies that have been performed to determine whether any methodologies work better than others or what trends have been found. Through a systematic Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) review, 38 articles were found that were relevant to the research goals. The authors identified AFEA study methods that have worked and those that have not been as successful and noted any trends of particular importance. Key takeaways include a listing of commercial AFEA software, the experimental methods used within the PRISMA analysis, and a comprehensive explanation of the critical methods and practices of the studies analyzed. Key information was analyzed and compared to determine its effects on study outcomes. Through analyzing the various studies, suggestions and guidance for conducting AFEA experiments and analyzing their data are discussed.
Collapse
|
24
|
Zeev-Wolf M, Rassovsky Y. Testing the magnocellular-pathway advantage in facial expressions processing for consistency over time. Neuropsychologia 2020; 138:107352. [PMID: 31958409 DOI: 10.1016/j.neuropsychologia.2020.107352] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2019] [Revised: 12/12/2019] [Accepted: 01/16/2020] [Indexed: 10/25/2022]
Abstract
The ability to identify facial expressions rapidly and accurately is central to human evolution. Previous studies have demonstrated that this ability relies to a large extent on the magnocellular, rather than parvocellular, visual pathway, which is biased toward processing low spatial frequencies. Despite the generally consistent finding, no study to date has investigated the reliability of this effect over time. In the present study, 40 participants completed a facial emotion identification task (fearful, happy, or neutral faces) using facial images presented at three different spatial frequencies (low, high, or broad spatial frequency), at two time points separated by one year. Bayesian statistics revealed an advantage for the magnocellular pathway in processing facial expressions; however, no effect of time was found. Furthermore, participants' reaction time (RT) patterns were highly stable over time. Our replication, together with the consistency of our measurements within subjects, underscores the robustness of this effect. This capacity, therefore, may be considered trait-like, suggesting that individuals may possess varying levels of ability in processing facial expressions that can be captured with behavioral measurements.
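As a rough illustration of how test-retest stability of reaction times across the two sessions could be quantified, the sketch below correlates session 1 and session 2 mean RTs per spatial-frequency condition. The file names and column layout are assumptions; the study itself used Bayesian statistics rather than this frequentist shortcut.

```python
# Illustrative test-retest sketch (assumed data layout, not the authors' analysis code).
# rt_t1 / rt_t2: mean reaction times per participant at the two sessions, one column per
# spatial-frequency condition ("low", "high", "broad").
import pandas as pd
from scipy.stats import pearsonr

rt_t1 = pd.read_csv("rt_session1.csv", index_col="participant")  # hypothetical files
rt_t2 = pd.read_csv("rt_session2.csv", index_col="participant")

for condition in ["low", "high", "broad"]:
    r, p = pearsonr(rt_t1[condition], rt_t2[condition])
    print(f"{condition} spatial frequency: test-retest r = {r:.2f} (p = {p:.3f})")
```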
Collapse
Affiliation(s)
- Maor Zeev-Wolf
- Department of Education and Zlotowski Center for Neuroscience, Ben Gurion University of the Negev, Beer Sheva, Israel
| | - Yuri Rassovsky
- Department of Psychology and Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel; Department of Psychiatry and Biobehavioral Sciences, UCLA Semel Institute for Neuroscience and Human Behavior, Los Angeles, CA, USA.
| |
Collapse
|
25
|
Sun Y, Ayaz H, Akansu AN. Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression. Brain Sci 2020; 10:E85. [PMID: 32041316 PMCID: PMC7071625 DOI: 10.3390/brainsci10020085] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2019] [Revised: 01/31/2020] [Accepted: 02/01/2020] [Indexed: 01/04/2023] Open
Abstract
Human facial expressions are regarded as a vital indicator of one's emotion and intention, and even reveal the state of health and wellbeing. Emotional states have been associated with information processing within and between subcortical and cortical areas of the brain, including the amygdala and prefrontal cortex. In this study, we evaluated the relationship between spontaneous human facial affective expressions and multimodal brain activity measured via non-invasive and wearable sensors: functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) signals. The affective states of twelve male participants detected via fNIRS, EEG, and spontaneous facial expressions were investigated in response to both image-content stimuli and video-content stimuli. We propose a method to jointly evaluate fNIRS and EEG signals for affective state detection (emotional valence as positive or negative). Experimental results reveal a strong correlation between spontaneous facial affective expressions and the perceived emotional valence. Moreover, the affective states were estimated by the fNIRS, EEG, and fNIRS + EEG brain activity measurements. We show that the proposed EEG + fNIRS hybrid method outperforms fNIRS-only and EEG-only approaches. Our findings indicate that the dynamic (video-content based) stimuli trigger a larger affective response than the static (image-content based) stimuli. These findings also suggest the joint utilization of facial expressions and wearable neuroimaging (fNIRS and EEG) for improved emotion analysis and affective brain-computer interface applications.
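The kind of feature-level fusion the abstract describes can be sketched as follows: fNIRS and EEG feature vectors are concatenated and a cross-validated classifier is compared against single-modality baselines. The toy data, feature dimensions, and choice of logistic regression are illustrative assumptions, not the authors' method.

```python
# Hedged sketch of feature-level fNIRS + EEG fusion for binary valence classification.
# X_eeg, X_fnirs: (n_trials, n_features) arrays; y: 0 = negative, 1 = positive valence.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)                      # toy stand-in data
X_eeg = rng.normal(size=(120, 32))
X_fnirs = rng.normal(size=(120, 16))
y = rng.integers(0, 2, size=120)

def cv_accuracy(X, y):
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()

print("EEG only:    ", cv_accuracy(X_eeg, y))
print("fNIRS only:  ", cv_accuracy(X_fnirs, y))
print("EEG + fNIRS: ", cv_accuracy(np.hstack([X_eeg, X_fnirs]), y))
```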
Collapse
Affiliation(s)
- Yanjia Sun
- Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ 07102, USA;
| | - Hasan Ayaz
- School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, PA 19104, USA;
- Department of Psychology, College of Arts and Sciences, Drexel University, Philadelphia, PA 19104, USA
- Department of Family and Community Health, University of Pennsylvania, Philadelphia, PA 19104, USA
- Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA 19104, USA
| | - Ali N. Akansu
- Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ 07102, USA;
| |
Collapse
|
26
|
Suslow T, Wildenauer K, Günther V. Ruminative response style is associated with a negative bias in the perception of emotional facial expressions in healthy women without a history of clinical depression. J Behav Ther Exp Psychiatry 2019; 62:125-132. [PMID: 30366227 DOI: 10.1016/j.jbtep.2018.10.004] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/14/2018] [Revised: 09/10/2018] [Accepted: 10/15/2018] [Indexed: 11/28/2022]
Abstract
BACKGROUND AND OBJECTIVES Rumination has been shown to be an important cognitive vulnerability factor affecting the development and maintenance of depression. Ruminative thinking can be divided into a self-focused component, referring to persistent reflection about causes and consequences of depressed mood, and a symptom-focused component, characterized by repetitive thinking about depressive symptoms. Previous research on clinical depression has shown that rumination is associated with the perception of negative emotions in others' facial expressions. The present study was conducted to investigate the relation between habitual rumination and negative bias in face perception in healthy individuals. METHODS 100 healthy young women without a history of clinical depression completed the Response Styles Questionnaire along with measures of depressive symptoms, dysfunctional attitudes, and anxiety. A computer-based version of the perception of facial expressions questionnaire using line drawings (schematic faces) was administered to assess perceived emotions in faces with ambiguous and unambiguous emotional expressions. RESULTS According to hierarchical regression analyses, symptom-focused (but not self-focused) rumination predicted perceived negative emotions in ambiguous as well as unambiguous negative faces after controlling for current depressive symptoms, state and trait anxiety, intelligence, and dysfunctional attitudes. LIMITATIONS Generalization of the present findings is limited by the fact that only women were included as study participants. CONCLUSIONS Habitual rumination about depressive symptoms in healthy, never clinically depressed individuals goes along with a negative bias in the perception of others' facial expressions. Negatively biasing social perception might be one mechanism by which symptom-focused rumination increases vulnerability for depression.
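A hierarchical regression of the sort reported in the RESULTS section can be sketched in Python as two nested models, with the R² change attributable to symptom-focused rumination tested by an F test. The variable names and data file are hypothetical; the original analysis may have used different software and predictors.

```python
# Illustrative hierarchical regression sketch (assumed variable names, not the study's script).
# Step 1 enters control variables; step 2 adds symptom-focused rumination.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.read_csv("rumination_faces.csv")          # hypothetical file

step1 = smf.ols("negative_perception ~ depressive_symptoms + trait_anxiety + "
                "state_anxiety + intelligence + dysfunctional_attitudes", data).fit()
step2 = smf.ols("negative_perception ~ depressive_symptoms + trait_anxiety + "
                "state_anxiety + intelligence + dysfunctional_attitudes + "
                "symptom_rumination", data).fit()

print(f"R-squared change: {step2.rsquared - step1.rsquared:.3f}")
print(anova_lm(step1, step2))                       # F test of the added predictor
```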
Collapse
Affiliation(s)
- Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstr. 10, 04103, Leipzig, Germany.
| | - Kathrin Wildenauer
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstr. 10, 04103, Leipzig, Germany
| | - Vivien Günther
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstr. 10, 04103, Leipzig, Germany
| |
Collapse
|
27
|
Andric Petrovic S, Jerotic S, Mihaljevic M, Pavlovic Z, Ristic I, Soldatovic I, Maric NP. Sex differences in facial emotion recognition in health and psychotic disorders. Cogn Neuropsychiatry 2019; 24:108-122. [PMID: 30789053 DOI: 10.1080/13546805.2019.1582411] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
BACKGROUND Previous studies examining sex differences in facial emotion recognition (FER) in psychosis have yielded inconsistent results. Although females are considered to be superior in FER in health, it remains unclear whether this specific sex difference is present in psychosis. We aimed to examine whether women and men differ in FER ability in health and in psychosis, and to explore potential sex differences in the illness's effects on FER. METHODS Remitted psychotic patients and controls were assessed using the CANTAB Emotion Recognition Task (ERT), examining accuracies and response latencies in identifying basic emotional expressions. A general linear model was used to assess the effects of group, sex, and their interactions on ERT performance. RESULTS Healthy females showed an FER advantage in comparison to healthy males, while this sex difference was not observed in remitted psychotic patients. Our results also demonstrated an overall FER deficit in psychosis in comparison to healthy controls, as well as differential effects of the illness on the recognition accuracy of facial expressions of anger in males and females, suggesting that females with psychotic disorders undergo a more profound deterioration of FER ability than their male counterparts. CONCLUSION The assessment of sex differences in FER and other important features of psychosis is important for a better understanding of its neurobiological basis and for the development of targeted treatments for improved functioning.
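The group-by-sex general linear model described in the METHODS can be illustrated with a minimal sketch using the statsmodels formula interface; the column names and data file are assumptions, not the study's actual specification.

```python
# Illustrative group-by-sex general linear model on emotion recognition accuracy
# (assumed column names; not the study's actual model specification).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# hypothetical file with columns: accuracy, group ("patient"/"control"), sex ("F"/"M")
ert = pd.read_csv("ert_scores.csv")

model = smf.ols("accuracy ~ C(group) * C(sex)", data=ert).fit()
print(anova_lm(model, typ=2))                       # main effects and group x sex interaction
```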
Collapse
Affiliation(s)
| | - Stefan Jerotic
- Clinic for Psychiatry, Clinical Center of Serbia, Belgrade, Serbia
| | - Marina Mihaljevic
- Clinic for Psychiatry, Clinical Center of Serbia, Belgrade, Serbia
- School of Medicine, University of Belgrade, Belgrade, Serbia
| | - Zorana Pavlovic
- Clinic for Psychiatry, Clinical Center of Serbia, Belgrade, Serbia
- School of Medicine, University of Belgrade, Belgrade, Serbia
| | - Ivan Ristic
- School of Medicine, University of Belgrade, Belgrade, Serbia
| | - Ivan Soldatovic
- School of Medicine, University of Belgrade, Belgrade, Serbia
| | - Nadja P Maric
- Clinic for Psychiatry, Clinical Center of Serbia, Belgrade, Serbia
- School of Medicine, University of Belgrade, Belgrade, Serbia
| |
Collapse
|
28
|
Haines N, Southward MW, Cheavens JS, Beauchaine T, Ahn WY. Using computer-vision and machine learning to automate facial coding of positive and negative affect intensity. PLoS One 2019; 14:e0211735. [PMID: 30721270 PMCID: PMC6363175 DOI: 10.1371/journal.pone.0211735] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2018] [Accepted: 01/18/2019] [Indexed: 11/26/2022] Open
Abstract
Facial expressions are fundamental to interpersonal communication, including social interaction, and allow people of different ages, cultures, and languages to quickly and reliably convey emotional information. Historically, facial expression research has followed from discrete emotion theories, which posit a limited number of distinct affective states that are represented with specific patterns of facial action. Much less work has focused on dimensional features of emotion, particularly positive and negative affect intensity. This is likely, in part, because achieving inter-rater reliability for facial action and affect intensity ratings is painstaking and labor-intensive. We use computer-vision and machine learning (CVML) to identify patterns of facial actions in 4,648 video recordings of 125 human participants, which show strong correspondences to positive and negative affect intensity ratings obtained from highly trained coders. Our results show that CVML can both (1) determine the importance of different facial actions that human coders use to derive positive and negative affective ratings when combined with interpretable machine learning methods, and (2) efficiently automate positive and negative affect intensity coding on large facial expression databases. Further, we show that CVML can be applied to individual human judges to infer which facial actions they use to generate perceptual emotion ratings from facial expressions.
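A minimal sketch of the CVML idea, assuming mean facial action unit (AU) activations per video as features and coder-rated affect intensity as the target, is shown below with an interpretable tree-ensemble regressor and its feature importances. The toy data and model choice are illustrative only and do not reproduce the authors' pipeline.

```python
# Hedged sketch of automated affect-intensity coding from facial action units (AUs).
# X: (n_videos, n_action_units) mean AU activations; y: coder-rated positive-affect intensity.
# Data and names are illustrative placeholders, not the study's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
au_names = [f"AU{k}" for k in (1, 2, 4, 6, 7, 12, 15, 17, 20, 25)]
X = rng.uniform(0, 5, size=(500, len(au_names)))    # toy AU intensities
y = (0.8 * X[:, au_names.index("AU12")]             # toy ratings dominated by smile-related AUs
     + 0.3 * X[:, au_names.index("AU6")]
     + rng.normal(scale=0.5, size=500))

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("Cross-validated R2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())

model.fit(X, y)
for name, imp in sorted(zip(au_names, model.feature_importances_),
                        key=lambda t: -t[1])[:3]:
    print(f"{name}: importance = {imp:.2f}")        # which AUs drive the ratings
```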
Collapse
Affiliation(s)
- Nathaniel Haines
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Matthew W. Southward
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Jennifer S. Cheavens
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Theodore Beauchaine
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Woo-Young Ahn
- Department of Psychology, Seoul National University, Seoul, Korea
| |
Collapse
|
29
|
Zhao J, Meng Q, An L, Wang Y. An event-related potential comparison of facial expression processing between cartoon and real faces. PLoS One 2019; 14:e0198868. [PMID: 30629582 PMCID: PMC6328201 DOI: 10.1371/journal.pone.0198868] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2018] [Accepted: 12/14/2018] [Indexed: 11/22/2022] Open
Abstract
Faces play important roles in the social lives of humans. Besides real faces, people also encounter numerous cartoon faces in daily life, which convey basic emotional states through facial expressions. Using event-related potentials (ERPs), we conducted a facial expression recognition experiment with 17 university students to compare the processing of cartoon faces with that of real faces. This study used face type (real vs. cartoon), emotion valence (happy vs. angry), and participant gender (male vs. female) as independent variables. Reaction time, recognition accuracy, and the amplitudes and latencies of emotion processing-related ERP components such as the N170, VPP (vertex positive potential), and LPP (late positive potential) were used as dependent variables. The ERP results revealed that cartoon faces elicited larger N170 and VPP amplitudes as well as a shorter N170 latency than real faces, whereas real faces elicited larger LPP amplitudes than cartoon faces. In addition, the results showed a significant difference across brain regions, reflecting a right-hemisphere advantage. The behavioral results showed that reaction times were shorter for happy faces than for angry faces; that females showed higher accuracy than males; and that males showed higher recognition accuracy for angry faces than for happy faces. Given the sample size, these results suggest, but do not rigorously demonstrate, differences in facial expression recognition and neural processing between cartoon faces and real faces. Cartoon faces were processed with greater intensity and speed than real faces during the early processing stage, whereas more attentional resources were allocated to real faces during the late processing stage.
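For orientation, the ERP component measures named above (N170, VPP, LPP) are typically quantified as mean amplitudes within component-specific time windows of baseline-corrected epochs. The sketch below shows this with toy data; the sampling rate, epoch span, and window boundaries are assumptions and not necessarily those used in the study.

```python
# Illustrative extraction of ERP component amplitudes from epoched EEG.
# Assumptions: 500 Hz sampling, epochs spanning -200 to 800 ms, already baseline-corrected;
# the time windows below are typical values, not the study's exact choices.
import numpy as np

sfreq = 500                                   # Hz
t = np.arange(-0.2, 0.8, 1 / sfreq)           # epoch time axis in seconds
epochs = np.random.randn(40, 64, t.size)      # (n_trials, n_channels, n_samples), toy data

windows = {"N170": (0.14, 0.20), "VPP": (0.14, 0.20), "LPP": (0.40, 0.70)}

def mean_amplitude(epochs, t, start, stop):
    mask = (t >= start) & (t <= stop)
    return epochs[:, :, mask].mean(axis=2)    # (n_trials, n_channels)

for comp, (start, stop) in windows.items():
    amp = mean_amplitude(epochs, t, start, stop)
    print(f"{comp}: grand-average amplitude = {amp.mean():.3f} (arbitrary units)")
```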
Collapse
Affiliation(s)
- Jiayin Zhao
- Beijing Key Laboratory of Learning and Cognition, Department of Psychology, Capital Normal University, Beijing, China
| | - Qi Meng
- Beijing Key Laboratory of Learning and Cognition, Department of Psychology, Capital Normal University, Beijing, China
| | - Licong An
- Beijing Key Laboratory of Learning and Cognition, Department of Psychology, Capital Normal University, Beijing, China
| | - Yifang Wang
- Beijing Key Laboratory of Learning and Cognition, Department of Psychology, Capital Normal University, Beijing, China
| |
Collapse
|
30
|
Automated facial expression analysis for emotional responsivity using an aqueous bitter model. Food Qual Prefer 2018. [DOI: 10.1016/j.foodqual.2018.04.004] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
|
31
|
Ihme K, Unni A, Zhang M, Rieger JW, Jipp M. Recognizing Frustration of Drivers From Face Video Recordings and Brain Activation Measurements With Functional Near-Infrared Spectroscopy. Front Hum Neurosci 2018; 12:327. [PMID: 30177876 PMCID: PMC6109683 DOI: 10.3389/fnhum.2018.00327] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2018] [Accepted: 07/25/2018] [Indexed: 11/13/2022] Open
Abstract
Experiencing frustration while driving can harm cognitive processing, result in aggressive behavior, and hence negatively influence driving performance and traffic safety. Being able to automatically detect frustration would allow adaptive driver assistance and automation systems to adequately react to a driver's frustration and mitigate potential negative consequences. To identify reliable and valid indicators of driver frustration, we conducted two driving simulator experiments. In the first experiment, we aimed to reveal facial expressions that indicate frustration in continuous video recordings of the driver's face taken while driving highly realistic simulator scenarios in which frustrated or non-frustrated emotional states were experienced. An automated analysis of facial expressions combined with multivariate logistic regression classification revealed that frustrated time intervals can be discriminated from non-frustrated ones with an accuracy of 62.0% (mean over 30 participants). A further analysis of the facial expressions revealed that frustrated drivers tend to activate muscles in the mouth region (chin raiser, lip pucker, lip pressor). In the second experiment, we measured cortical activation with almost whole-head functional near-infrared spectroscopy (fNIRS) while participants experienced frustrating and non-frustrating driving simulator scenarios. Multivariate logistic regression applied to the fNIRS measurements allowed us to discriminate between frustrated and non-frustrated driving intervals with a higher accuracy of 78.1% (mean over 12 participants). Frustrated driving intervals were indicated by increased activation in the inferior frontal, putative premotor, and occipito-temporal cortices. Our results show that facial and cortical markers of frustration can be informative for time-resolved driver state identification in complex, realistic driving situations. The markers derived here can potentially be used as input for future adaptive driver assistance and automation systems that detect driver frustration and adaptively react to mitigate it.
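The per-participant classification scheme described above can be sketched as a within-participant, cross-validated logistic regression on facial-expression features, with accuracy averaged across participants. The toy features, fold structure, and participant count are assumptions for illustration.

```python
# Hedged sketch of within-participant classification of frustrated vs. non-frustrated
# intervals from facial-expression features (placeholder data; the study's features,
# folds, and preprocessing may differ).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
accuracies = []
for participant in range(30):                       # one model per participant
    X = rng.normal(size=(80, 17))                   # e.g., facial action unit activations
    y = rng.integers(0, 2, size=80)                 # 1 = frustrated interval
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    accuracies.append(cross_val_score(clf, X, y, cv=5).mean())

print(f"Mean accuracy over participants: {np.mean(accuracies):.3f}")
```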
Collapse
Affiliation(s)
- Klas Ihme
- Department of Human Factors, Institute of Transportation Systems, German Aerospace Center (DLR), Braunschweig, Germany
| | - Anirudh Unni
- Department of Psychology, University of Oldenburg, Oldenburg, Germany
| | - Meng Zhang
- Department of Human Factors, Institute of Transportation Systems, German Aerospace Center (DLR), Braunschweig, Germany
| | - Jochem W. Rieger
- Department of Psychology, University of Oldenburg, Oldenburg, Germany
| | - Meike Jipp
- Department of Human Factors, Institute of Transportation Systems, German Aerospace Center (DLR), Braunschweig, Germany
| |
Collapse
|
32
|
Günther V, Zimmer J, Kersting A, Hoffmann KT, Lobsien D, Suslow T. Automatic processing of emotional facial expressions as a function of social anhedonia. Psychiatry Res Neuroimaging 2017; 270:46-53. [PMID: 29055240 DOI: 10.1016/j.pscychresns.2017.10.002] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/31/2017] [Revised: 09/11/2017] [Accepted: 10/09/2017] [Indexed: 11/21/2022]
Abstract
Anhedonia is an important feature of major depression and schizophrenia-spectrum disorders. Few neuroimaging studies have investigated neural alterations in high anhedonia, isolated from other psychopathological variables, by including only participants without clinical diagnoses. The present study examined healthy individuals scoring high (N = 18) vs. low (N = 19) in social anhedonia, who were carefully selected from a sample of N = 282 participants. To examine differences in automatic brain responses to social-affective stimuli between high vs. low social anhedonia participants, we used functional magnetic resonance imaging. To assess early, automatic stages of emotion processing, we administered a paradigm presenting brief (33ms), backward-masked happy, sad, and neutral facial expressions. Individuals high in social anhedonia demonstrated increased activation in the bilateral thalamus and left red nucleus in response to masked sad faces relative to individuals low in social anhedonia. No significant group differences in brain activation emerged in other regions known to be involved in emotion and reward processing, including the amygdala and nucleus accumbens. Our results suggest that high social anhedonia in otherwise healthy individuals is associated with exaggerated automatic reactivity in the thalamus, which is a brain structure that has been implicated in the mediation of attentional processes.
Collapse
Affiliation(s)
- Vivien Günther
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Leipzig, Germany
| | - Juliane Zimmer
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Leipzig, Germany
| | - Anette Kersting
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Leipzig, Germany
| | | | - Donald Lobsien
- Department of Neuroradiology, University of Leipzig, Leipzig, Germany
| | - Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Leipzig, Germany.
| |
Collapse
|
33
|
Mote J, Kring AM. Facial emotion perception in schizophrenia: Does sex matter? World J Psychiatry 2016; 6:257-268. [PMID: 27354969 PMCID: PMC4919266 DOI: 10.5498/wjp.v6.i2.257] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/12/2015] [Revised: 11/12/2015] [Accepted: 04/11/2016] [Indexed: 02/05/2023] Open
Abstract
AIM: To review the literature on sex differences in facial emotion perception (FEP) across the schizophrenia spectrum.
METHODS: We conducted a systematic review of empirical articles that were included in five separate meta-analyses of FEP across the schizophrenia spectrum, including meta-analyses that predominantly examined adults with chronic schizophrenia, people with early (onset prior to age 18) or recent-onset (experiencing their first or second psychotic episode or illness duration less than 2 years) schizophrenia, and unaffected first-degree relatives of people with schizophrenia. We also examined articles written in English (from November 2011 through June 2015) that were not included in the aforementioned meta-analyses through a literature search in the PubMed database. All relevant articles were accessed in full text. We examined all studies to determine the sample sizes, diagnostic characteristics, demographic information, methodologies, results, and whether each individual study reported on sex differences. The results from the meta-analyses themselves as well as the individual studies are reported in tables and text.
RESULTS: We retrieved 134 articles included in five separate meta-analyses and the PubMed database that examined FEP across the schizophrenia spectrum. Of these articles, 38 examined sex differences in FEP. Thirty of these studies did not find sex differences in FEP in either chronically ill adults with schizophrenia, early-onset or recently diagnosed people with schizophrenia, or first-degree relatives of people with schizophrenia. Of the eight studies that found sex differences in FEP, three found that chronically ill women outperformed men, one study found that girls with early-onset schizophrenia outperformed boys, and two studies found that women (including first-degree relatives, adults with schizophrenia, and the healthy control group) outperformed men on FEP tasks. In total, six of the eight studies that examined sex differences in FEP found that women outperformed men across the schizophrenia spectrum.
CONCLUSION: Evidence to date suggests few sex differences in FEP in schizophrenia; both men and women across the schizophrenia spectrum have deficits in FEP.
Collapse
|
34
|
Dey JK, Ishii LE, Byrne PJ, Boahene KDO, Ishii M. The Social Penalty of Facial Lesions. JAMA Facial Plast Surg 2015; 17:90-6. [DOI: 10.1001/jamafacial.2014.1131] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Affiliation(s)
- Jacob K. Dey
- Division of Facial Plastic and Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
| | - Lisa E. Ishii
- Division of Facial Plastic and Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
| | - Patrick J. Byrne
- Division of Facial Plastic and Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
| | - Kofi D. O. Boahene
- Division of Facial Plastic and Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
| | - Masaru Ishii
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
- Division of Rhinology, Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland
| |
Collapse
|
35
|
Dey JK, Ishii M, Boahene KDO, Byrne P, Ishii LE. Impact of facial defect reconstruction on attractiveness and negative facial perception. Laryngoscope 2015; 125:1316-21. [DOI: 10.1002/lary.25130] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 12/10/2014] [Indexed: 11/05/2022]
Affiliation(s)
- Jacob K. Dey
- Division of Facial Plastic & Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland, U.S.A.
| | - Masaru Ishii
- Division of Rhinology, Department of Otolaryngology-Head & Neck Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland, U.S.A.
| | - Kofi D. O. Boahene
- Division of Facial Plastic & Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland, U.S.A.
| | - Patrick Byrne
- Division of Facial Plastic & Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland, U.S.A.
| | - Lisa E. Ishii
- Division of Facial Plastic & Reconstructive Surgery, Johns Hopkins School of Medicine, Baltimore, Maryland, U.S.A.
| |
Collapse
|
36
|
Donges US, Dukalski B, Kersting A, Suslow T. Automatic processing of facial affects in patients with borderline personality disorder: associations with symptomatology and comorbid disorders. Ann Gen Psychiatry 2015; 14:20. [PMID: 26170894 PMCID: PMC4499878 DOI: 10.1186/s12991-015-0058-y] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/27/2015] [Accepted: 07/06/2015] [Indexed: 11/25/2022] Open
Abstract
BACKGROUND Instability of affect and interpersonal relations are important features of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. METHODS Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, an angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. RESULTS In the total sample, valence-congruent evaluative shifts and delays of evaluative decisions due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was positively correlated with evaluative shifting owing to masked happy primes, regardless of baseline (neutral or no facial expression). The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delays due to masked angry faces, regardless of baseline. CONCLUSIONS In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals. The presence of comorbid anxiety disorders could make patients more susceptible to the influence of a happy expression on judgment processes at an automatic processing level. Comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression may enhance automatic attention allocation to threatening facial expressions in BPD. Increased automatic vigilance for social threat stimuli might contribute to affective instability and interpersonal problems in specific patients with BPD.
Collapse
Affiliation(s)
- Uta-Susan Donges
- Department of Psychosomatic Medicine, University of Leipzig, Leipzig, Germany
| | - Bibiana Dukalski
- Department of Psychosomatic Medicine, University of Leipzig, Leipzig, Germany
| | - Anette Kersting
- Department of Psychosomatic Medicine, University of Leipzig, Leipzig, Germany
| | - Thomas Suslow
- Department of Psychosomatic Medicine, University of Leipzig, Leipzig, Germany; Department of Psychiatry, University of Münster, Münster, Germany
| |
Collapse
|
37
|
Ihme K, Sacher J, Lichev V, Rosenberg N, Kugel H, Rufer M, Grabe HJ, Pampel A, Lepsien J, Kersting A, Villringer A, Lane RD, Suslow T. Alexithymic features and the labeling of brief emotional facial expressions – An fMRI study. Neuropsychologia 2014; 64:289-99. [DOI: 10.1016/j.neuropsychologia.2014.09.044] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2014] [Revised: 08/18/2014] [Accepted: 09/24/2014] [Indexed: 01/29/2023]
|
38
|
[The mutual influence of pain and emotion processing]. Schmerz 2014; 28:631-4. [PMID: 25179417 DOI: 10.1007/s00482-014-1481-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
|
39
|
Abstract
OBJECTIVE Assess the ability of facial reanimation surgery to restore affect display in patients with severe facial paralysis. STUDY DESIGN Survey of healthy observers' perception of change in facial visage from preoperative to postoperative state. SETTING Academic tertiary referral center. MAIN OUTCOME MEASURES Observer graded affect display. METHODS Ninety naive observers completed a survey with pictures of paralyzed faces, smiling and in repose, before and after surgery, as well as normal comparison faces. Observers characterized affect display of each face coded in primary affects from the Derogatis Affects Balance Scale. Results were analyzed with latent class analysis and regression using a three-class model. RESULTS Preoperatively, paralyzed faces in repose were most likely to be considered negative (56.0%), then neutral (41.3%) and positive (2.7%). In this same cohort of patients in repose, reanimation surgery restored affect display to normal levels: decreasing negative classification (18.9%) and increasing neutral (53.4%) and positive (27.7%) classification. Hypothesis testing revealed no statistically significant differences in the mean classification probabilities for postoperative faces in repose versus normal faces in repose. The same analysis was performed for smiling faces, which showed marked improvement with reanimation surgery: decreasing negative (45.6%-11.7%) and increasing positive (26.2%-60.0%) classification. Despite this improvement, there were statistically significant differences in classification of postoperative smiling faces versus normal smiling faces. CONCLUSION Facial reanimation surgery was associated with normalized affect display for faces in repose and improved affect display for smiling faces. These results provide evidence that facial reanimation improves the facial expression of emotion. Further assessment in additional contexts will help better characterize the ability of facial reanimation to mitigate the psychosocial burden associated with facial paralysis.
Collapse
|
40
|
Abstract
Few batteries of prosodic stimuli have been validated for Quebec-French people. Such validation is necessary to develop auditory-verbal tasks in this population. The objective of this study was to validate a battery of emotional prosodic stimuli for aging French-Québec subjects. The battery of 195 prosodic stimuli, developed by Maurage et al. (2007), was administered to 50 healthy Quebecers aged 50 to 80 years. The percentage of correct responses was calculated for each stimulus, and Cronbach's alphas were calculated for each emotion to evaluate the internal consistency of the stimuli. Among the 195 stimuli, 40 were correctly recognized by at least 80 per cent of the subjects. Anger was the most accurately identified emotion, while disgust was the least well recognized. Overall, this study provides data that will guide the selection of prosodic stimuli for evaluating French-Québécois individuals.
Collapse
|
41
|
|
42
|
Lindell AK. Continuities in emotion lateralization in human and non-human primates. Front Hum Neurosci 2013; 7:464. [PMID: 23964230 PMCID: PMC3737467 DOI: 10.3389/fnhum.2013.00464] [Citation(s) in RCA: 63] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2013] [Accepted: 07/26/2013] [Indexed: 11/13/2022] Open
Abstract
Where hemispheric lateralization was once considered an exclusively human trait, it is increasingly recognized that hemispheric asymmetries are evident throughout the animal kingdom. Emotion is a prime example of a lateralized function: given its vital role in promoting adaptive behavior and hence survival, a growing body of research in affective neuroscience is working to illuminate the cortical bases of emotion processing. Presuming that human and non-human primates evolved from a shared ancestor, one would anticipate evidence of organizational continuity in the neural substrate supporting emotion processing. This paper thus reviews research examining the patterns of lateralization for the expression and perception of facial emotion in non-human primates, aiming to determine whether the patterns of hemispheric asymmetry that characterize the human brain are similarly evident in other primate species. As such, this review seeks to enhance understanding of the evolution of hemispheric specialization for emotion, using emotion lateralization in non-human primates as a window through which to view emotion lateralization in humans.
Collapse
Affiliation(s)
- Annukka K Lindell
- School of Psychological Science, La Trobe University , Melbourne, VIC , Australia
| |
Collapse
|
43
|
Rassovsky Y, Lee J, Nori P, D Wu A, Iacoboni M, Breitmeyer BG, Hellemann G, Green MF. Assessing temporal processing of facial emotion perception with transcranial magnetic stimulation. Brain Behav 2013; 3:263-72. [PMID: 23785658 PMCID: PMC3683286 DOI: 10.1002/brb3.136] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/07/2012] [Revised: 02/13/2013] [Accepted: 02/23/2013] [Indexed: 11/30/2022] Open
Abstract
The ability to process facial expressions can be modified by altering the spatial frequency of the stimuli, an effect that has been attributed to differential properties of visual pathways that convey different types of information to distinct brain regions at different speeds. While this effect suggests a potential influence of spatial frequency on the processing speed of facial emotion, this hypothesis has not been examined directly. We addressed this question using a facial emotion identification task with photographs containing either high spatial frequency (HSF), low spatial frequency (LSF), or broadband spatial frequency (BSF). Temporal processing of emotion perception was manipulated by suppressing visual perception with a single-pulse transcranial magnetic stimulation (TMS), delivered to the visual cortex at six intervals prior to (forward masking) or following (backward masking) stimulus presentation. Participants performed best in the BSF, followed by LSF, and finally HSF condition. A spatial frequency by forward/backward masking interaction effect demonstrated reduced performance in the forward masking component in the BSF condition and a reversed performance pattern in the HSF condition, with no significant differences between forward and backward masking in the LSF condition. Results indicate that LSF information may play a greater role than HSF information in emotional processing, but may not be sufficient for fast conscious perception of emotion. As both LSF and HSF filtering reduced the speed of extracting emotional information from faces, it is possible that intact BSF faces have an inherent perceptual advantage and hence benefit from faster temporal processing.
Collapse
Affiliation(s)
- Yuri Rassovsky
- Department of Psychology and Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel; Department of Psychiatry and Biobehavioral Sciences, UCLA Semel Institute for Neuroscience and Human Behavior, Los Angeles, California
| | | | | | | | | | | | | | | |
Collapse
|
44
|
Shortened night sleep impairs facial responsiveness to emotional stimuli. Biol Psychol 2013; 93:41-4. [PMID: 23357729 DOI: 10.1016/j.biopsycho.2013.01.008] [Citation(s) in RCA: 34] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2012] [Revised: 12/27/2012] [Accepted: 01/07/2013] [Indexed: 11/23/2022]
Abstract
Sleep deprivation deteriorates mood, impairs the recognition of facial expressions, and affects the ability to regulate emotions. The present study investigated the effect of partial sleep deprivation on facial responses to emotional stimuli. Thirty-three healthy undergraduates were tested twice: after a night with (i) 8h and (ii) 4h sleep. Self-reported sleepiness and sustained attention (Psychomotor Vigilance Task) were assessed. Emotional reactivity was measured with facial Electromyogram (EMG) while participants were asked to respond with either compatible or incompatible facial muscles to emotional stimuli in order to study whether partial sleep deprivation caused slower reactions mainly in response to incompatible stimuli (due to an additional effort to suppress the compatible reaction caused by decreased inhibitory control) or in response to both compatible and incompatible stimuli. Self-reported sleepiness and reaction times in a sustained attention task significantly increased after one night of partial sleep deprivation. Facial reactions to emotional stimuli were decelerated. No significant interaction between sleep restriction and compatibility of the muscle to the picture valence could be observed. Hence, volitional facial reactions in response to emotional stimuli were slower after one night of reduced sleep, but affective inhibitory control was not significantly impaired. However, slowed facial responding to emotional stimuli may affect social interaction after sleep restriction.
Collapse
|
45
|
Horning SM, Cornwell RE, Davis HP. The recognition of facial expressions: An investigation of the influence of age and cognition. AGING NEUROPSYCHOLOGY AND COGNITION 2012; 19:657-76. [DOI: 10.1080/13825585.2011.645011] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/28/2022]
|
46
|
Mu YG, Huang LJ, Li SY, Ke C, Chen Y, Jin Y, Chen ZP. Working memory and the identification of facial expression in patients with left frontal glioma. Neuro Oncol 2012; 14 Suppl 4:iv81-9. [PMID: 23095835 PMCID: PMC3480252 DOI: 10.1093/neuonc/nos215] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/18/2022] Open
Abstract
Patients with brain tumors may have cognitive dysfunctions, including deterioration of memory such as working memory, that affect quality of life. This study aimed to explore the presence of defects in working memory and the identification of facial expressions in patients with left frontal glioma. This case-control study recruited 11 matched pairs of patients and healthy control subjects (mean age ± standard deviation, 37.00 ± 10.96 years vs 36.73 ± 11.20 years; 7 male and 4 female) from March through December 2011. The psychological battery contained tests estimating verbal/visual-spatial working memory, executive function, and the identification of facial expressions. According to the paired samples analysis, there were no differences in the anxiety and depression scores or in the intelligence quotients between the 2 groups (P > .05). All indices of the Digit Span Test were significantly worse in patients than in control subjects (P < .05), but the Tapping Test scores did not differ between patient and control groups. Of all 7 Wisconsin Card Sorting Test (WCST) indexes, only the Perseverative Response was significantly different between patients and control subjects (P < .05). Patients were significantly less accurate in detecting angry facial expressions than were control subjects (30.3% vs 57.6%; P < .05) but showed no deficits in the identification of other expressions. The backward indexes of the Digit Span Test were associated with emotion scores and tumor size and grade (P < .05). Patients with left frontal glioma had deficits in verbal working memory and in the ability to identify anger. These may have resulted from damage to functional frontal cortex regions whose roles in these 2 capabilities have not been confirmed. However, verbal working memory performance might also be affected by emotional and tumor-related factors.
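The paired-samples comparisons reported for the 11 matched pairs can be illustrated with a minimal sketch using SciPy's paired t-test; the numbers below are invented placeholders, not the study's data.

```python
# Illustrative paired-samples comparison for matched patient-control pairs
# (toy numbers; the study's actual scores are not reproduced here).
import numpy as np
from scipy.stats import ttest_rel

digit_span_patients = np.array([10, 9, 11, 8, 9, 10, 7, 9, 8, 10, 9])
digit_span_controls = np.array([12, 11, 12, 10, 11, 13, 10, 12, 11, 12, 11])

t, p = ttest_rel(digit_span_patients, digit_span_controls)
print(f"Paired t-test: t(10) = {t:.2f}, p = {p:.3f}")
```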
Collapse
Affiliation(s)
- Yong-Gao Mu
- State Key Laboratory of Oncology in South China, Sun Yat-sen University Cancer Center, Guangzhou, People's Republic of China
| | | | | | | | | | | | | |
Collapse
|
47
|
Gerdes AB, Wieser MJ, Alpers GW, Strack F, Pauli P. Why do you smile at me while I'm in pain? — Pain selectively modulates voluntary facial muscle responses to happy faces. Int J Psychophysiol 2012; 85:161-7. [DOI: 10.1016/j.ijpsycho.2012.06.002] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2011] [Revised: 06/04/2012] [Accepted: 06/05/2012] [Indexed: 12/11/2022]
|
48
|
|
49
|
Wieser MJ, Gerdes ABM, Greiner R, Reicherts P, Pauli P. Tonic pain grabs attention, but leaves the processing of facial expressions intact-evidence from event-related brain potentials. Biol Psychol 2012; 90:242-8. [PMID: 22503790 DOI: 10.1016/j.biopsycho.2012.03.019] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2011] [Revised: 03/05/2012] [Accepted: 03/21/2012] [Indexed: 11/24/2022]
Abstract
Emotion and attention are key players in the modulation of pain perception. However, much less is known about the reverse influence of pain on attentional and especially emotional processes. To this end, we employed painful vs. non-painful pressure stimulation to examine effects on the processing of simultaneously presented facial expressions (fearful, neutral, happy). Continuous EEG was recorded and participants had to rate each facial expression with regard to valence and arousal. Painful stimulation attenuated visual processing in general, as reduced P100 and late positive potential (LPP) amplitudes revealed, but did not interfere with structural encoding of faces (N170). In addition, early perceptual discrimination and sustained preferential processing of emotional facial expressions as well as affective ratings were not influenced by pain. Thus, tonic pain demonstrates strong attention-demanding properties, but this does not interfere with concurrently ongoing emotion discrimination processes. These effects point at partially independent effects of pain on emotion and attention, respectively.
Collapse
|
50
|
Shivakumar G, Vijaya PA. Emotion Recognition Using Finger Tip Temperature: First Step towards an Automatic System. ACTA ACUST UNITED AC 2012. [DOI: 10.7763/ijcee.2012.v4.489] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/01/2022]
|