1. Li W, Huang B, Song Y, Hou L, Shi W. Altered neural mechanisms of deception in individuals with autistic traits. Brain Cogn 2023; 170:106005. [PMID: 37320929] [DOI: 10.1016/j.bandc.2023.106005] [Received: 02/16/2023] [Revised: 06/02/2023] [Accepted: 06/04/2023] [Indexed: 06/17/2023]
Abstract
A successful deception involves making a decision, acting on it, and evaluating the outcome. Here, we investigated deception in a non-clinical sample (n = 36) with varying levels of autistic traits, using a coin-toss paradigm of active deception. Participants were asked to respond to instructions by clicking one of two boxes in a way that could mislead their opponents, and then received feedback on their success or failure. EEG activity was recorded throughout. Compared with individuals with low autistic traits, individuals with high autistic traits exhibited longer reaction times and lower P3 amplitudes in the decision-making stage, and lower FRN and P3 amplitudes in the feedback evaluation stage. Overall, these results indicate that people with high autistic traits experience difficulty in deceiving, which may be related to atypical neural mechanisms.
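The P3 and FRN effects reported in this entry are conventionally quantified as the mean voltage within a component time window, averaged across trials. The following is a minimal sketch of that computation, not the authors' pipeline; the data layout, window, and values are all hypothetical:

```python
# Sketch: mean amplitude of an ERP component (e.g., P3) from epoched EEG.
# Hypothetical data layout: epochs = list of per-trial sample lists for one
# electrode, epoch starting at t0_ms relative to stimulus, sampled at fs Hz.

def erp_mean_amplitude(epochs, fs, t0_ms, win_ms):
    """Average voltage within win_ms = (start, end), across all trials."""
    start, end = win_ms
    i0 = int((start - t0_ms) * fs / 1000)   # window onset, in samples
    i1 = int((end - t0_ms) * fs / 1000)     # window offset, in samples
    # Mean over the window per trial, then mean over trials.
    per_trial = [sum(tr[i0:i1]) / (i1 - i0) for tr in epochs]
    return sum(per_trial) / len(per_trial)

# Two toy "conditions": flat trials vs. a positive deflection at 300-499 ms.
fs, t0_ms = 1000, -200                      # 1 kHz sampling, epoch from -200 ms
flat = [[0.0] * 1200 for _ in range(10)]    # 10 trials, -200..1000 ms
bump = [[2.0 if 500 <= i < 700 else 0.0 for i in range(1200)]  # samples 500-699
        for _ in range(10)]                 # = 300-499 ms post-stimulus
print(erp_mean_amplitude(flat, fs, t0_ms, (300, 500)))  # 0.0
print(erp_mean_amplitude(bump, fs, t0_ms, (300, 500)))  # 2.0
```

A smaller group mean in such a window is what "lower amplitude of P3" refers to.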
Affiliation(s)
- Wenrui Li: School of Education, Shanghai Normal University, Shanghai 200000, China
- Bowen Huang: School of Education, Shanghai Normal University, Shanghai 200000, China
- Youming Song: Department of Psychology, College of Education Science, Yan'an University, Yan'an 716000, China
- Lulu Hou: School of Education, Shanghai Normal University, Shanghai 200000, China
- Wendian Shi: School of Education, Shanghai Normal University, Shanghai 200000, China
2. Key AP, Jones D, Corbett BA. Sex differences in automatic emotion regulation in adolescents with autism spectrum disorder. Autism Res 2022; 15:712-728. [PMID: 35103402] [PMCID: PMC9060299] [DOI: 10.1002/aur.2678] [Received: 08/05/2021] [Revised: 12/23/2021] [Accepted: 01/15/2022] [Indexed: 11/11/2022]
Abstract
Autism may be underdiagnosed in females because their social difficulties are often less noticeable. This study explored sex differences in automatic facial emotion processing in 45 adolescents with autism spectrum disorder (22 female, 23 male), aged 10-16 years, who performed an active target detection task and a Go/NoGo task in which faces with positive and negative emotional expressions served as irrelevant distractors. The combined sample performed more accurately on target detection (response initiation) than on the Go/NoGo task (response inhibition), replicating findings previously reported in typical participants. Females exhibited greater difficulty than males with response initiation in the target detection task, especially in the context of angry faces, while males found withholding a response in the Go/NoGo block with happy faces more challenging. Electrophysiological data revealed no sex differences or emotion discrimination effects during early perceptual processing of faces, as indexed by the occipitotemporal N170. Autistic males demonstrated larger frontal N2 and parietal P3 amplitudes than females, suggesting greater neural resource allocation to automatic emotion regulation processes. The associations between standardized behavioral measures (autism severity, theory of mind skills) and brain responses also varied by sex: more adaptive social functioning was related to the speed of perceptual processing (N170 latency) in females and to the extent of deliberate attention allocation (P3 amplitude) in males. Together, these findings suggest that males and females with autism may rely on different strategies for social functioning and highlight the importance of considering sex differences in autism.
LAY SUMMARY: Females with autism may exhibit less noticeable social difficulties than males. This study demonstrates that autistic females are more successful than males at inhibiting behavioral responses in emotional contexts, while males are more likely to initiate a response. At the neural level, social functioning in females is related to the speed of automatic perceptual processing of facial cues, and in males, to the extent of active attention allocation to the stimuli. These findings highlight the importance of considering sex differences in autism diagnosis and treatment selection.
Affiliation(s)
- Alexandra P. Key: Vanderbilt Kennedy Center, Vanderbilt University Medical Center; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center
- Dorita Jones: Vanderbilt Kennedy Center, Vanderbilt University Medical Center
- Blythe A. Corbett: Vanderbilt Kennedy Center, Vanderbilt University Medical Center; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center
3. Meng C, Huo C, Ge H, Li Z, Hu Y, Meng J. Processing of expressions by individuals with autistic traits: Empathy deficit or sensory hyper-reactivity? PLoS One 2021; 16:e0254207. [PMID: 34242310] [PMCID: PMC8270190] [DOI: 10.1371/journal.pone.0254207] [Received: 07/27/2020] [Accepted: 06/22/2021] [Indexed: 11/19/2022]
Abstract
Individuals with autistic traits display impaired social interaction and communication in everyday life, but the underlying cognitive and neural mechanisms remain unclear and controversial. The mind-blindness hypothesis attributes the social difficulties of individuals with autistic traits to impaired empathy, whereas the intense world theory attributes them to sensory hyper-reactivity and sensory overload rather than an empathy deficit. To test these two accounts, this study used event-related potentials (ERPs) to explore the cognitive neural processing of repetitive expressions in individuals with autistic traits. The Mandarin version of the autism-spectrum quotient (AQ) was used to assess autistic traits in 2,502 healthy adults, and two subgroups were formed: the high-AQ group was randomly selected from the 10% of individuals with the highest AQ scores, and the low-AQ group from the 10% with the lowest scores. In the experiment, three facial expressions (positive, neutral, or negative) of the same person were presented successively and pseudo-randomly in each trial, and participants identified the expression of the last face presented. Compared with the low-AQ group, the high-AQ group exhibited higher P1 amplitudes in response to the second and third presented expressions, as well as higher P3 amplitudes in response to the third presented negative expressions. This suggests that individuals with autistic traits may show excessively strong perception, attention, and cognitive evaluation of repetitive expressions, particularly negative ones, a result that supports the intense world theory more strongly than the mind-blindness hypothesis.
Affiliation(s)
- Chunyan Meng: Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China; Laboratory of Emotion and Mental Health, Chongqing University of Arts and Sciences, Chongqing, China; Nanchong Vocational College of Science and Technology, Nanchong, China
- Chao Huo: Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China
- Hongxin Ge: Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China
- Zuoshan Li: Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China
- Yuanyan Hu: Laboratory of Emotion and Mental Health, Chongqing University of Arts and Sciences, Chongqing, China
- Jing Meng: Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China
4. Aktürk T, de Graaf TA, Abra Y, Şahoğlu-Göktaş S, Özkan D, Kula A, Güntekin B. Event-related EEG oscillatory responses elicited by dynamic facial expression. Biomed Eng Online 2021; 20:41. [PMID: 33906649] [PMCID: PMC8077950] [DOI: 10.1186/s12938-021-00882-8] [Received: 01/08/2021] [Accepted: 04/20/2021] [Indexed: 11/30/2022]
Abstract
BACKGROUND Recognition of facial expressions (FEs) plays a crucial role in social interactions. Most studies of FE recognition use static (image) stimuli, even though real-life FEs are dynamic. FE processing is complex and multifaceted, and its neural correlates remain unclear. Transitioning from static to dynamic FE stimuli might help disentangle the neural oscillatory mechanisms underlying face processing and the recognition of emotional expressions. To our knowledge, we present here the first time-frequency exploration of the oscillatory brain mechanisms underlying the processing of dynamic FEs. RESULTS Videos of joyful, fearful, and neutral dynamic facial expressions were presented to 18 healthy young adults. We analyzed event-related activity in electroencephalography (EEG) data, focusing on delta-, theta-, and alpha-band oscillations. Since the videos involved a transition from a neutral to an emotional expression (onset around 500 ms), we identified time windows likely corresponding to initial face perception (first time window, TW1) and to subsequent recognition of the emotional expression (around 1000 ms; second time window, TW2). TW1 showed increased power and phase-locking values in all frequency bands. In TW2, power and phase-locking values in the delta and theta bands were higher for emotional than for neutral FEs, thus potentially serving as a marker of emotion recognition in dynamic face processing. CONCLUSIONS Our time-frequency exploration revealed consistent oscillatory responses to complex, dynamic, ecologically meaningful FE stimuli. We conclude that although dynamic FE processing involves complex network dynamics, dynamic FEs successfully revealed temporally separate oscillatory responses related to face processing and, subsequently, to recognition of the emotional expression.
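The phase-locking values analyzed in this entry quantify how consistent a frequency band's phase is across trials at a given time point: PLV = |(1/N) Σ exp(i φ_n)|, where φ_n is the phase on trial n. A minimal sketch, with made-up trial phases rather than the study's data:

```python
import cmath
import math

def phase_locking_value(phases):
    """PLV across trials: magnitude of the mean unit phasor.
    1.0 = identical phase on every trial; near 0 = random phases."""
    mean_phasor = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(mean_phasor)

# Perfectly locked trials vs. phases spread evenly around the circle.
locked = [0.3] * 8
spread = [2 * math.pi * k / 8 for k in range(8)]
print(phase_locking_value(locked))   # approximately 1.0
print(phase_locking_value(spread))   # approximately 0.0
```

In practice the per-trial phases would come from a time-frequency decomposition (e.g., a wavelet transform) of each EEG epoch at the band and latency of interest.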
Affiliation(s)
- Tuba Aktürk: Program of Electroneurophysiology, Vocational School, Istanbul Medipol University, Istanbul, Turkey; Program of Neuroscience Ph.D., Graduate School of Health Sciences, Istanbul Medipol University, Istanbul, Turkey; Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Tom A de Graaf: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Yasemin Abra: Department of Biological Sciences, Faculty of Arts and Sciences, Middle East Technical University, Ankara, Turkey; Institute for Psychology, Faculty of Human Sciences, Universität der Bundeswehr München, Munich, Germany; Department of Psychology, Faculty of Psychology and Educational Sciences, Ludwig-Maximilians-Universität München, Munich, Germany
- Sevilay Şahoğlu-Göktaş: Program of Neuroscience Ph.D., Graduate School of Health Sciences, Istanbul Medipol University, Istanbul, Turkey; Regenerative and Restorative Medicine Research Center (REMER), Istanbul Medipol University, Istanbul, Turkey
- Dilek Özkan: Meram Faculty of Medicine, Konya Necmettin Erbakan University, Konya, Turkey
- Aysun Kula: Department of Molecular Biology and Genetics, Faculty of Science, Sivas Cumhuriyet University, Sivas, Turkey
- Bahar Güntekin: Department of Biophysics, School of Medicine, Istanbul Medipol University, Istanbul, Turkey; Regenerative and Restorative Medicine Research Center (REMER), Istanbul Medipol University, Istanbul, Turkey
5. Age related differences in the recognition of facial expression: Evidence from EEG event-related brain oscillations. Int J Psychophysiol 2020; 147:244-256. [DOI: 10.1016/j.ijpsycho.2019.11.013] [Received: 06/11/2019] [Revised: 11/04/2019] [Accepted: 11/27/2019] [Indexed: 01/09/2023]
6. Zeng H, Wu Z, Zhang J, Yang C, Zhang H, Dai G, Kong W. EEG Emotion Classification Using an Improved SincNet-Based Deep Learning Model. Brain Sci 2019; 9:E326. [PMID: 31739605] [PMCID: PMC6895992] [DOI: 10.3390/brainsci9110326] [Received: 10/03/2019] [Revised: 11/01/2019] [Accepted: 11/12/2019] [Indexed: 02/08/2023]
Abstract
Deep learning (DL) methods are used increasingly widely, for example in speech and image recognition. However, designing a DL model that classifies electroencephalogram (EEG) signals accurately and efficiently remains a challenge, mainly because EEG signals differ substantially between subjects and vary over time within a single subject, and are non-stationary, highly random, and low in signal-to-noise ratio. SincNet is an efficient classifier for speaker recognition, but it has drawbacks when applied to EEG signal classification. In this paper, we improve on SincNet and propose a SincNet-based classifier, SincNet-R, which consists of three convolutional layers and three deep neural network (DNN) layers. We then use SincNet-R to test classification accuracy and robustness on emotional EEG signals. Comparison with the original SincNet model and traditional classifiers such as CNN, LSTM, and SVM shows that our proposed SincNet-R model achieves higher classification accuracy and better robustness.
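SincNet's defining idea, which SincNet-R builds on, is that each first-layer convolution kernel is a parameterized band-pass filter, formed as the difference of two windowed low-pass sinc kernels, so that only the two cutoff frequencies per filter are learned. A minimal pure-Python sketch of constructing one such kernel; the cutoffs, kernel length, and sampling rate below are illustrative choices, not values from the paper:

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x)/(pi*x), with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_bandpass_kernel(f_low, f_high, length, fs):
    """Band-pass FIR kernel in the style of SincNet's first layer:
    difference of two low-pass sinc kernels, Hamming-windowed.
    f_low/f_high are the cutoffs in Hz; in SincNet these two values
    per filter are the learnable parameters."""
    mid = (length - 1) / 2
    kernel = []
    for n in range(length):
        t = n - mid
        # low-pass at f_high minus low-pass at f_low = band-pass
        band = ((2 * f_high / fs) * sinc(2 * f_high * t / fs)
                - (2 * f_low / fs) * sinc(2 * f_low * t / fs))
        # Hamming window to suppress truncation ripple
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / (length - 1))
        kernel.append(band * window)
    return kernel

# e.g., a 4-8 Hz (theta-band) kernel for 128 Hz EEG
k = sinc_bandpass_kernel(4.0, 8.0, 129, 128.0)
```

In the full model such kernels are convolved with the raw signal as the first layer, and the cutoffs are updated by backpropagation like any other weight.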
Affiliation(s)
- Hong Zeng: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China; Industrial NeuroScience Lab, University of Rome “La Sapienza”, 00161 Rome, Italy
- Zhenhua Wu: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
- Jiaming Zhang: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
- Chen Yang: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
- Hua Zhang: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
- Guojun Dai: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
- Wanzeng Kong: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China