1. Mosley PE, van der Meer JN, Hamilton LHW, Fripp J, Parker S, Jeganathan J, Breakspear M, Parker R, Holland R, Mitchell BL, Byrne E, Hickie IB, Medland SE, Martin NG, Cocchi L. Markers of positive affect and brain state synchrony discriminate melancholic from non-melancholic depression using naturalistic stimuli. Mol Psychiatry 2024. doi: 10.1038/s41380-024-02699-y. PMID: 39191867.
Abstract
Melancholia has been proposed as a qualitatively distinct depressive subtype associated with a characteristic symptom profile (psychomotor retardation, profound anhedonia) and a better response to biological therapies. Existing work has suggested that individuals with melancholia are blunted in their display of positive emotions and differ in their neural response to emotionally evocative stimuli. Here, we unify these brain and behavioural findings amongst a carefully phenotyped group of seventy depressed participants, drawn from an established Australian database (the Australian Genetics of Depression Study) and further enriched for melancholia (high ratings of psychomotor retardation and anhedonia). Melancholic (n = 30) or non-melancholic status (n = 40) was defined using a semi-structured interview (the Sydney Melancholia Prototype Index). Complex facial expressions were captured whilst participants watched a movie clip of a comedian and classified using a machine learning algorithm. Subsequently, the dynamics of sequential changes in brain activity were modelled during the viewing of an emotionally evocative movie in the MRI scanner. We found a quantitative reduction in positive facial expressivity amongst participants with melancholia, combined with differences in the synchronous expression of brain states during positive epochs of the movie. In non-melancholic depression, the display of positive affect was inversely related to the activity of cerebellar regions implicated in the processing of affect. However, this relationship was reduced in those with a melancholic phenotype. Our multimodal findings show differences in evaluative and motoric domains between melancholic and non-melancholic depression through engagement in ecologically valid tasks that evoke positive emotion. These findings provide new markers to stratify depression and an opportunity to support the development of targeted interventions.
Affiliation(s)
- Philip E Mosley
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia.
- Queensland Brain Institute, University of Queensland, St Lucia, QLD, Australia.
- Australian eHealth Research Centre, CSIRO Health and Biosecurity, Herston, QLD, Australia.
- Faculty of Medicine, School of Biomedical Sciences, University of Queensland, St Lucia, QLD, Australia.
- Johan N van der Meer
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- School of Information Systems, Queensland University of Technology, Kelvin Grove, QLD, Australia
- Jurgen Fripp
- Australian eHealth Research Centre, CSIRO Health and Biosecurity, Herston, QLD, Australia
- Stephen Parker
- Faculty of Medicine, School of Biomedical Sciences, University of Queensland, St Lucia, QLD, Australia
- Metro North Mental Health, Royal Brisbane & Women's Hospital, Herston, QLD, Australia
- Jayson Jeganathan
- School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, NSW, Australia
- Brain Neuromodulation Research Program, Hunter Medical Research Institute, Newcastle, NSW, Australia
- School of Medicine and Public Health, College of Medicine, Health and Wellbeing, University of Newcastle, Newcastle, NSW, Australia
- Michael Breakspear
- School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, NSW, Australia
- Brain Neuromodulation Research Program, Hunter Medical Research Institute, Newcastle, NSW, Australia
- School of Medicine and Public Health, College of Medicine, Health and Wellbeing, University of Newcastle, Newcastle, NSW, Australia
- Richard Parker
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- Rebecca Holland
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- Brittany L Mitchell
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- Faculty of Medicine, School of Biomedical Sciences, University of Queensland, St Lucia, QLD, Australia
- Enda Byrne
- Child Health Research Centre, University of Queensland, South Brisbane, QLD, Australia
- Ian B Hickie
- Brain and Mind Centre, University of Sydney, Camperdown, NSW, Australia
- Sarah E Medland
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- School of Psychology, University of Queensland, St Lucia, QLD, Australia
- School of Psychology and Counselling, Queensland University of Technology, Kelvin Grove, QLD, Australia
- Luca Cocchi
- QIMR Berghofer Medical Research Institute, Herston, QLD, Australia
- Faculty of Medicine, School of Biomedical Sciences, University of Queensland, St Lucia, QLD, Australia
2. Martin EA, Lian W, Oltmanns JR, Jonas KG, Samaras D, Hallquist MN, Ruggero CJ, Clouston SAP, Kotov R. Behavioral measures of psychotic disorders: Using automatic facial coding to detect nonverbal expressions in video. J Psychiatr Res 2024;176:9-17. doi: 10.1016/j.jpsychires.2024.05.056. PMID: 38830297.
Abstract
Emotional deficits in psychosis are prevalent and difficult to treat. In particular, much remains unknown about facial expression abnormalities, and a key reason is that expressions are very labor-intensive to code. Automatic facial coding (AFC) can remove this barrier. The current study sought both to provide evidence for the utility of AFC in psychosis for research purposes and to provide evidence that AFC yields valid measures of clinical constructs. Changes in facial expressions and head position of participants-39 with schizophrenia/schizoaffective disorder (SZ), 46 with other psychotic disorders (OP), and 108 never-psychotic individuals (NP)-were assessed via FaceReader, a commercially available automated facial expression analysis software, using video recorded during a clinical interview. We first examined the behavioral measures of the psychotic disorder groups and tested whether they could discriminate between the groups. Next, we evaluated links of behavioral measures with clinical symptoms, controlling for group membership. We found the SZ group was characterized by significantly less variation in neutral expressions, happy expressions, arousal, and head movements compared to NP. These measures discriminated SZ from NP well (AUC = 0.79, sensitivity = 0.79, specificity = 0.67) but discriminated SZ from OP less well (AUC = 0.66, sensitivity = 0.77, specificity = 0.46). We also found significant correlations between clinician-rated symptoms and most behavioral measures (particularly happy expressions, arousal, and head movements). Taken together, these results suggest that AFC can provide useful behavioral measures of psychosis, which could improve research on non-verbal expressions in psychosis and, ultimately, enhance treatment.
Affiliation(s)
- Elizabeth A Martin
- Department of Psychological Science, University of California, Irvine, CA, USA.
- Wenxuan Lian
- Department of Materials Science and Engineering and Department of Applied Math and Statistics, Stony Brook University, Stony Brook, NY, USA
- Joshua R Oltmanns
- Department of Psychiatry, Stony Brook University, Stony Brook, NY, USA
- Katherine G Jonas
- Department of Psychiatry, Stony Brook University, Stony Brook, NY, USA
- Dimitris Samaras
- Department of Computer Science, Stony Brook University, Stony Brook, NY, USA
- Michael N Hallquist
- Department of Psychology and Neuroscience, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Camilo J Ruggero
- Department of Psychology, University of Texas at Dallas, Richardson, TX, USA
- Sean A P Clouston
- Program in Public Health and Department of Family, Population, and Preventive Medicine, Renaissance School of Medicine, Stony Brook University, Stony Brook, NY, USA
- Roman Kotov
- Department of Psychiatry, Stony Brook University, Stony Brook, NY, USA.
3. Chen WT, Hsiao FJ, Coppola G, Wang SJ. Decoding pain through facial expressions: a study of patients with migraine. J Headache Pain 2024;25:33. doi: 10.1186/s10194-024-01742-1. PMID: 38462615; PMCID: PMC10926654.
Abstract
BACKGROUND The present study used the Facial Action Coding System (FACS) to analyse changes in facial activities in individuals with migraine during resting conditions to determine the potential of facial expressions to convey information about pain during headache episodes. METHODS Facial activity was recorded under calm, resting conditions by using a camera for healthy controls (HC) and patients with episodic migraine (EM) and chronic migraine (CM). The FACS was employed to analyse the collected facial images, and intensity scores were generated for each of the 20 action units (AUs) representing expressions. Each AU was then compared across groups and headache pain conditions. RESULTS The study involved 304 participants: 46 HCs, 174 patients with EM, and 84 patients with CM. Elevated headache pain levels were associated with increased lid tightener activity and reduced mouth stretch. In the CM group, moderate to severe headache attacks exhibited decreased activation in the mouth stretch, alongside increased activation in the lid tightener, nose wrinkle, and cheek raiser, compared to mild headache attacks (all corrected p < 0.05). Notably, lid tightener activation was positively correlated with the Numeric Rating Scale (NRS) level of headache (p = 0.012). Moreover, the lip corner depressor was identified as indicative of emotional depression severity (p < 0.001). CONCLUSION Facial expressions, particularly lid tightener actions, served as inherent indicators of headache intensity in individuals with migraine, even during resting conditions. This indicates that the proposed approach holds promise for providing a subjective evaluation of headaches, offering the benefits of real-time assessment and convenience for patients with migraine.
Affiliation(s)
- Wei-Ta Chen
- Brain Research Center, National Yang Ming Chiao Tung University, 155, Linong Street Sec 2, Taipei, 112, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Neurology, Neurological Institute, Taipei Veterans General Hospital, Taipei, Taiwan
- Department of Neurology, Keelung Hospital, Ministry of Health and Welfare, Keelung, Taiwan
- Fu-Jung Hsiao
- Brain Research Center, National Yang Ming Chiao Tung University, 155, Linong Street Sec 2, Taipei, 112, Taiwan.
- Gianluca Coppola
- Department of Medico-Surgical Sciences and Biotechnologies, Sapienza University of Rome Polo Pontino, Latina, Italy
- Shuu-Jiun Wang
- Brain Research Center, National Yang Ming Chiao Tung University, 155, Linong Street Sec 2, Taipei, 112, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Neurology, Neurological Institute, Taipei Veterans General Hospital, Taipei, Taiwan
4. Chen C, Messinger DS, Chen C, Yan H, Duan Y, Ince RAA, Garrod OGB, Schyns PG, Jack RE. Cultural facial expressions dynamically convey emotion category and intensity information. Curr Biol 2024;34:213-223.e5. doi: 10.1016/j.cub.2023.12.001. PMID: 38141619; PMCID: PMC10831323.
Abstract
Communicating emotional intensity plays a vital ecological role because it provides valuable information about the nature and likelihood of the sender's behavior [1-3]. For example, attack often follows signals of intense aggression if receivers fail to retreat [4, 5]. Humans regularly use facial expressions to communicate such information [6-11]. Yet how this complex signaling task is achieved remains unknown. We addressed this question using a perception-based, data-driven method to mathematically model the specific facial movements that receivers use to classify the six basic emotions-"happy," "surprise," "fear," "disgust," "anger," and "sad"-and judge their intensity in two distinct cultures (East Asian, Western European; total n = 120). In both cultures, receivers expected facial expressions to dynamically represent emotion category and intensity information over time, using a multi-component compositional signaling structure. Specifically, emotion intensifiers peaked earlier or later than emotion classifiers and represented intensity using amplitude variations. Emotion intensifiers are also more similar across emotions than classifiers are, suggesting a latent broad-plus-specific signaling structure. Cross-cultural analysis further revealed similarities and differences in expectations that could impact cross-cultural communication. Specifically, East Asian and Western European receivers have similar expectations about which facial movements represent high intensity for threat-related emotions, such as "anger," "disgust," and "fear," but differ on those that represent low threat emotions, such as happiness and sadness. Together, our results provide new insights into the intricate processes by which facial expressions can achieve complex dynamic signaling tasks by revealing the rich information embedded in facial expressions.
Affiliation(s)
- Chaona Chen
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK.
- Daniel S Messinger
- Departments of Psychology, Pediatrics, and Electrical & Computer Engineering, University of Miami, 5665 Ponce De Leon Blvd, Coral Gables, FL 33146, USA
- Cheng Chen
- Foreign Language Department, Teaching Centre for General Courses, Chengdu Medical College, 601 Tianhui Street, Chengdu 610083, China
- Hongmei Yan
- The MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, North Jianshe Road, Chengdu 611731, China
- Yaocong Duan
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Robin A A Ince
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Oliver G B Garrod
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Philippe G Schyns
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Rachael E Jack
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
5. Hur MS, Lee S, Jung HS, Schneider RA. Crossing fibers may underlie the dynamic pulling forces of muscles that attach to cartilage at the tip of the nose. Sci Rep 2023;13:18948. doi: 10.1038/s41598-023-45781-1. PMID: 37919340; PMCID: PMC10622497.
Abstract
The present study used microdissection, histology, and microcomputed tomography (micro-CT) to determine the prevalence and patterns of the depressor septi nasi (DSN) and orbicularis oris (OOr) muscles attached to the footplate of the medial crus (fMC) of the major alar cartilage, focusing on their crossing fibers. The DSN and OOr attached to the fMC of the major alar cartilage were investigated in 76 samples from 38 embalmed Korean adult cadavers (20 males, 18 females; mean age 70 years). The DSN, OOr, or both were attached to the fMC. When the DSN ran unilaterally or was absent, some OOr fibers ascended to attach to the fMC in its place in 20.6% of the samples. Crossing fibers of the DSN or OOr attached to the fMC were found in 82.4% of the samples. Bilateral and unilateral crossing fibers were found in 32.4% and 50.0%, respectively, and no crossing fibers were found in 17.6%. The DSN and OOr that attached to the fMC could be categorized into six types according to the presence of the DSN and the crossing patterns of the DSN and OOr. These anatomical findings were confirmed in histology and micro-CT images. They offer insights into anatomical mechanisms that may underlie the dynamic pulling forces generated by muscles that attach to the fMCs and into evolutionary variation observed in human facial expressions. They can also provide useful information for guiding rhinoplasty of the nasal tip.
Affiliation(s)
- Mi-Sun Hur
- Department of Anatomy, Daegu Catholic University School of Medicine, Daegu, Korea
- Seunggyu Lee
- Division of Applied Mathematical Sciences, Korea University, Sejong, Korea
- Biomedical Mathematics Group, Institute for Basic Science, Daejeon, Korea
- Han-Sung Jung
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Taste Research Center, BK21 FOUR Project, Oral Science Research Center, Yonsei University College of Dentistry, Seoul, Korea.
- Richard A Schneider
- Department of Orthopaedic Surgery, University of California at San Francisco, 513 Parnassus Avenue, S-1161, San Francisco, CA, 94143-0514, USA.