1
Yang D, Svoboda AM, George TG, Mansfield PK, Wheelock MD, Schroeder ML, Rafferty SM, Sherafati A, Tripathy K, Burns-Yocum T, Forsen E, Pruett JR, Marrus NM, Culver JP, Constantino JN, Eggebrecht AT. Mapping neural correlates of biological motion perception in autistic children using high-density diffuse optical tomography. Mol Autism 2024; 15:35. PMID: 39175054; PMCID: PMC11342641; DOI: 10.1186/s13229-024-00614-4.
Abstract
BACKGROUND Autism spectrum disorder (ASD), a neurodevelopmental disorder defined by social communication deficits plus repetitive behaviors and restricted interests, currently affects 1/36 children in the general population. Recent advances in functional brain imaging show promise to provide useful biomarkers of ASD diagnostic likelihood, behavioral trait severity, and even response to therapeutic intervention. However, current gold-standard neuroimaging methods (e.g., functional magnetic resonance imaging, fMRI) are limited in naturalistic studies of brain function underlying ASD-associated behaviors due to the constrained imaging environment. Compared to fMRI, high-density diffuse optical tomography (HD-DOT), a non-invasive and minimally constraining optical neuroimaging modality, can overcome these limitations. Herein, we aimed to establish HD-DOT to evaluate brain function in autistic and non-autistic school-age children as they performed a biological motion perception task previously shown to yield results related to both ASD diagnosis and behavioral traits. METHODS We used HD-DOT to image brain function in 46 ASD school-age participants and 49 non-autistic individuals (NAI) as they viewed dynamic point-light displays of coherent biological and scrambled motion. We assessed group-level cortical brain function with statistical parametric mapping. Additionally, we tested for brain-behavior associations with dimensional metrics of autism traits, as measured with the Social Responsiveness Scale-2, with hierarchical regression models. RESULTS We found that NAI participants presented stronger brain activity contrast (coherent > scrambled) than ASD children in cortical regions related to visual, motor, and social processing. Additionally, regression models revealed multiple cortical regions in autistic participants where brain function is significantly associated with dimensional measures of ASD traits. 
LIMITATIONS Optical imaging methods are limited in depth sensitivity and so cannot measure brain activity within deep subcortical regions. However, the field of view of this HD-DOT system includes multiple brain regions previously implicated in both task-based and task-free studies on autism. CONCLUSIONS This study demonstrates that HD-DOT is sensitive to brain function that both differentiates between NAI and ASD groups and correlates with dimensional measures of ASD traits. These findings establish HD-DOT as an effective tool for investigating brain function in autistic and non-autistic children. Moreover, this study established neural correlates related to biological motion perception and its association with dimensional measures of ASD traits.
Affiliation(s)
- Dalin Yang
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Alexandra M Svoboda
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Tessa G George
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Patricia K Mansfield
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Medical Education, Saint Louis University School of Medicine, St. Louis, MO, 63104, USA
- Muriah D Wheelock
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Department of Biomedical Engineering, Washington University School of Engineering, St. Louis, MO, 63130, USA
- Division of Biology and Biomedical Sciences, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Mariel L Schroeder
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Department of Speech, Language, and Hearing Sciences, Purdue University, West Lafayette, IN, 47907, USA
- Sean M Rafferty
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Arefeh Sherafati
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Department of Physics, Washington University School of Arts and Sciences, St. Louis, MO, 63130, USA
- Department of Neurology, University of California San Francisco, San Francisco, CA, 94158, USA
- Kalyan Tripathy
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Division of Biology and Biomedical Sciences, Washington University School of Medicine, St. Louis, MO, 63110, USA
- University of Pittsburgh Medical Center, Western Psychiatric Hospital, Pittsburgh, PA, 15213, USA
- Tracy Burns-Yocum
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Evolytics, Parkville, MO, 64152, USA
- Elizabeth Forsen
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Doctor of Medicine Program, Washington University School of Medicine, St. Louis, MO, 63110, USA
- John R Pruett
- Department of Psychiatry, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Natasha M Marrus
- Department of Psychiatry, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Joseph P Culver
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Department of Biomedical Engineering, Washington University School of Engineering, St. Louis, MO, 63130, USA
- Division of Biology and Biomedical Sciences, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Department of Physics, Washington University School of Arts and Sciences, St. Louis, MO, 63130, USA
- Department of Electrical and Systems Engineering, Washington University School of Engineering, St. Louis, MO, 63112, USA
- Department of Imaging Sciences Engineering, Washington University School of Engineering, St. Louis, MO, 63112, USA
- John N Constantino
- Department of Psychiatry, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Department of Psychiatry, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Division of Behavioral and Mental Health, Children's Healthcare of Atlanta, Atlanta, GA, 30329, USA
- Adam T Eggebrecht
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 660 S. Euclid Ave, St. Louis, MO, 63110, USA
- Department of Biomedical Engineering, Washington University School of Engineering, St. Louis, MO, 63130, USA
- Division of Biology and Biomedical Sciences, Washington University School of Medicine, St. Louis, MO, 63110, USA
- Department of Physics, Washington University School of Arts and Sciences, St. Louis, MO, 63130, USA
- Department of Electrical and Systems Engineering, Washington University School of Engineering, St. Louis, MO, 63112, USA
- Department of Imaging Sciences Engineering, Washington University School of Engineering, St. Louis, MO, 63112, USA
2
Papoutselou E, Harrison S, Mai G, Buck B, Patil N, Wiggins I, Hartley D. Investigating mother-child inter-brain synchrony in a naturalistic paradigm: A functional near infrared spectroscopy (fNIRS) hyperscanning study. Eur J Neurosci 2024; 59:1386-1403. PMID: 38155106; DOI: 10.1111/ejn.16233.
Abstract
Successful social interactions between mothers and children are hypothesised to play a significant role in a child's social, cognitive and language development. Earlier research has confirmed, through structured experimental paradigms, that these interactions could be underpinned by coordinated neural activity. Nevertheless, the extent of neural synchrony during real-life, ecologically valid interactions between mothers and their children remains largely unexplored. In this study, we investigated mother-child inter-brain synchrony using a naturalistic free-play paradigm. We also examined the relationship between neural synchrony, verbal communication patterns and personality traits to further understand the underpinnings of brain synchrony. Twelve children aged between 3 and 5 years old and their mothers participated in this study. Neural synchrony in mother-child dyads was measured bilaterally over frontal and temporal areas using functional near-infrared spectroscopy (fNIRS) whilst the dyads were asked to play with child-friendly toys together (interactive condition) and separately (independent condition). Communication patterns were captured via video recordings and conversational turns were coded. Compared to the independent condition, mother-child dyads showed increased neural synchrony in the interactive condition across the prefrontal cortex and temporo-parietal junction. No significant relationship was found between neural synchrony and turn-taking, or between neural synchrony and the personality traits of either member of the dyad. Overall, we demonstrate the feasibility of measuring inter-brain synchrony between mothers and children in a naturalistic environment. These findings can inform future study designs to assess inter-brain synchrony between parents and pre-lingual children and/or children with communication needs.
Affiliation(s)
- Efstratia Papoutselou
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Samantha Harrison
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Guangting Mai
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Bryony Buck
- Hearing Sciences - Scottish Section, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nikita Patil
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- School of Medicine, University of Nottingham, Nottingham, UK
- Ian Wiggins
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Douglas Hartley
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Nottingham University Hospitals NHS Trust, Queen's Medical Centre, Nottingham, UK
3
Zhang X, Noah JA, Singh R, McPartland JC, Hirsch J. Support vector machine prediction of individual Autism Diagnostic Observation Schedule (ADOS) scores based on neural responses during live eye-to-eye contact. Sci Rep 2024; 14:3232. PMID: 38332184; PMCID: PMC10853508; DOI: 10.1038/s41598-024-53942-z.
Abstract
Social difficulties during interactions with others are central to autism spectrum disorder (ASD). Understanding the links between these social difficulties and their underlying neural processes is a primary aim focused on improved diagnosis and treatment. In keeping with this goal, we have developed a multivariate classification method based on neural data acquired by functional near-infrared spectroscopy (fNIRS) during live eye-to-eye contact with adults who were either typically developed (TD) or had ASD. The ASD diagnosis was based on the gold-standard Autism Diagnostic Observation Schedule (ADOS), which also provides an index of symptom severity. Using a nested cross-validation method, a support vector machine (SVM) was trained to discriminate between ASD and TD groups based on the neural responses during eye-to-eye contact. ADOS scores were not applied in the classification training. To test the hypothesis that the SVM identifies neural activity patterns related to one of the neural mechanisms underlying the behavioral symptoms of ASD, we determined the correlation coefficient between the SVM scores and the individual ADOS scores. Consistent with the hypothesis, the correlation between observed and predicted ADOS scores was 0.72 (p < 0.002). Findings suggest that multivariate classification methods combined with the live interaction paradigm of eye-to-eye contact provide a promising approach to link neural processes and social difficulties in individuals with ASD.
Affiliation(s)
- Xian Zhang
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, 300 George St., Suite 902, New Haven, CT, USA
- J Adam Noah
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, 300 George St., Suite 902, New Haven, CT, USA
- Rahul Singh
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, 300 George St., Suite 902, New Haven, CT, USA
- Wu Tsai Institute, Yale University, New Haven, CT, 06511, USA
- James C McPartland
- Yale Child Study Center, Nieson Irving Harris Building, 230 South Frontage Road, Floor G, Suite 100A, New Haven, CT, 06519, USA
- Center for Brain and Mind Health, Yale School of Medicine, New Haven, CT, 06511, USA
- Joy Hirsch
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, 300 George St., Suite 902, New Haven, CT, USA
- Wu Tsai Institute, Yale University, New Haven, CT, 06511, USA
- Center for Brain and Mind Health, Yale School of Medicine, New Haven, CT, 06511, USA
- Department of Neuroscience, Yale School of Medicine, New Haven, CT, 06511, USA
- Department of Comparative Medicine, Yale School of Medicine, New Haven, CT, 06511, USA
- Department of Medical Physics and Biomedical Engineering, University College London, London, WC1E 6BT, UK
4
Hirsch J, Zhang X, Noah JA, Bhattacharya A. Neural mechanisms for emotional contagion and spontaneous mimicry of live facial expressions. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210472. PMID: 36871593; PMCID: PMC9985973; DOI: 10.1098/rstb.2021.0472.
Abstract
Viewing a live facial expression typically elicits a similar expression by the observer (facial mimicry) that is associated with a concordant emotional experience (emotional contagion). The model of embodied emotion proposes that emotional contagion and facial mimicry are functionally linked although the neural underpinnings are not known. To address this knowledge gap, we employed a live two-person paradigm (n = 20 dyads) using functional near-infrared spectroscopy during live emotive face-processing while also measuring eye-tracking, facial classifications and ratings of emotion. One dyadic partner, 'Movie Watcher', was instructed to emote natural facial expressions while viewing evocative short movie clips. The other dyadic partner, 'Face Watcher', viewed the Movie Watcher's face. Task and rest blocks were implemented by timed epochs of clear and opaque glass that separated partners. Dyadic roles were alternated during the experiment. Mean cross-partner correlations of facial expressions (r = 0.36 ± 0.11 s.e.m.) and mean cross-partner affect ratings (r = 0.67 ± 0.04) were consistent with facial mimicry and emotional contagion, respectively. Neural correlates of emotional contagion based on covariates of partner affect ratings included angular and supramarginal gyri, whereas neural correlates of the live facial action units included motor cortex and ventral face-processing areas. Findings suggest distinct neural components for facial mimicry and emotional contagion. This article is part of a discussion meeting issue 'Face2face: advancing the science of social interaction'.
Affiliation(s)
- Joy Hirsch
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT 06511, USA
- Department of Neuroscience, Yale School of Medicine, New Haven, CT 06511, USA
- Department of Comparative Medicine, Yale School of Medicine, New Haven, CT 06511, USA
- Wu Tsai Institute, Yale University, PO Box 208091, New Haven, CT 06520, USA
- Haskins Laboratories, 300 George Street, New Haven, CT 06511, USA
- Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, UK
- Xian Zhang
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT 06511, USA
- J. Adam Noah
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT 06511, USA
5
Parker TC, Zhang X, Noah JA, Tiede M, Scassellati B, Kelley M, McPartland JC, Hirsch J. Neural and visual processing of social gaze cueing in typical and ASD adults. medRxiv 2023:2023.01.30.23284243 [preprint]. PMID: 36778502; PMCID: PMC9915835; DOI: 10.1101/2023.01.30.23284243.
Abstract
Atypical eye gaze in joint attention is a clinical characteristic of autism spectrum disorder (ASD). Despite this documented symptom, neural processing of joint attention tasks in real-life social interactions is not understood. To address this knowledge gap, functional near-infrared spectroscopy (fNIRS) and eye-tracking data were acquired simultaneously as ASD and typically developed (TD) individuals engaged in a gaze-directed joint attention task with a live human and robot partner. We test the hypothesis that face-processing deficits in ASD are greater for interactive faces than for simulated (robot) faces. Consistent with prior findings, neural responses during human gaze cueing modulated by face visual dwell time resulted in increased activity of ventral frontal regions in ASD and dorsal parietal systems in TD participants. Hypoactivity of the right dorsal parietal area during live human gaze cueing was correlated with autism spectrum symptom severity: Brief Observation of Symptoms of Autism (BOSA) scores (r = -0.86). In contrast, neural activity in response to robot gaze cueing modulated by visual acquisition factors activated dorsal parietal systems in ASD, and this neural activity was not related to autism symptom severity (r = 0.06). These results are consistent with the hypothesis that altered encoding of incoming facial information to the dorsal parietal cortex is specific to live human faces in ASD. These findings open new directions for understanding joint attention difficulties in ASD by providing a connection between superior parietal lobule activity and live interaction with human faces.
Lay summary: Little is known about why it is so difficult for autistic individuals to make eye contact with other people. We find that in a live face-to-face viewing task with a robot, the brains of autistic participants were similar to those of typical participants, but not when the partner was a live human.
Findings suggest that difficulties in real-life social situations for autistic individuals may be specific to difficulties with live social interaction rather than general face gaze.
6
Desmons C, Lavault S, Mazel A, Niérat M, Tadiello S, Khamassi M, Pelachaud C, Similowski T. Influence d'une activité pseudo-ventilatoire chez un robot humanoïde sur les interactions humain-machine [Influence of a pseudo-ventilatory activity in a humanoid robot on human-machine interactions]. Rev Mal Respir 2023. DOI: 10.1016/j.rmr.2022.11.076.
7
Leung AYM, Zhao IY, Lin S, Lau TK. Exploring the Presence of Humanoid Social Robots at Home and Capturing Human-Robot Interactions with Older Adults: Experiences from Four Case Studies. Healthcare (Basel) 2022; 11:39. PMID: 36611499; PMCID: PMC9818881; DOI: 10.3390/healthcare11010039.
Abstract
BACKGROUND Social robots have the potential to bring benefits to aged care. However, it is uncertain whether placing these robots in older people's homes is acceptable and whether human-robot interactions would occur. METHODS Four case studies were conducted to understand the experiences of older adults and family caregivers when the humanoid social robot Ka Ka was placed in their homes for two weeks. RESULTS Four older adults and three family caregivers were involved. Older adults interacted with the social robot Ka Ka every day during the study period. 'Talking to Ka Ka', 'listening to music', 'using the calendar reminder', and 'listening to the weather report' were the most commonly used features. Qualitative data highlighted the strengths of Ka Ka, such as providing emotional support to older adults living alone, diversifying their daily activities, and enhancing family relationships. The voice of Ka Ka (female, soft, and pleasing to the ear) was considered to 'bring a pleasant feeling' to older adults. CONCLUSIONS To support aging-in-place and address the intensifying shortage of health and social care manpower, it is of prime importance to develop reliable and age-friendly AI-based robotic services that meet the needs and preferences of older adults and caregivers.
Affiliation(s)
- Angela Y. M. Leung
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
- Correspondence: ; Tel.: +852-2766-5587
- Ivy Y. Zhao
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
- Shuanglan Lin
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Terence K. Lau
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
8
Li M, Guo F, Wang X, Chen J, Ham J. Effects of robot gaze and voice human-likeness on users' subjective perception, visual attention, and cerebral activity in voice conversations. Comput Human Behav 2022. DOI: 10.1016/j.chb.2022.107645.
9
Hirsch J, Zhang X, Noah JA, Dravida S, Naples A, Tiede M, Wolf JM, McPartland JC. Neural correlates of eye contact and social function in autism spectrum disorder. PLoS One 2022; 17:e0265798. PMID: 36350848; PMCID: PMC9645655; DOI: 10.1371/journal.pone.0265798.
Abstract
Reluctance to make eye contact during natural interactions is a central diagnostic criterion for autism spectrum disorder (ASD). However, the underlying neural correlates of eye contact in ASD are unknown, and diagnostic biomarkers are active areas of investigation. Here, neuroimaging, eye-tracking, and pupillometry data were acquired simultaneously using two-person functional near-infrared spectroscopy (fNIRS) during live "in-person" eye-to-eye contact and eye gaze at a video face for typically-developed (TD) participants and participants with ASD, to identify the neural correlates of live eye-to-eye contact in both groups. Comparisons between ASD and TD showed decreased right dorsal-parietal activity and increased right ventral temporal-parietal activity for ASD during live eye-to-eye contact (p≤0.05, FDR-corrected), and reduced cross-brain coherence, consistent with atypical neural systems for live eye contact. Hypoactivity of right dorsal-parietal regions during eye contact in ASD was further associated with gold-standard measures of social performance by the correlation of neural responses with individual measures on the ADOS-2 (Autism Diagnostic Observation Schedule, 2nd Edition; r = -0.76, -0.92 and -0.77) and the SRS-2 (Social Responsiveness Scale, Second Edition; r = -0.58). The findings indicate that as categorized social ability decreases, neural responses to real eye contact in the right dorsal-parietal region also decrease, consistent with a neural correlate for social characteristics in ASD.
Affiliation(s)
- Joy Hirsch
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States of America
- Interdepartmental Neuroscience Program, Yale School of Medicine, New Haven, CT, United States of America
- Department of Neuroscience, Yale School of Medicine, New Haven, CT, United States of America
- Department of Comparative Medicine, Yale School of Medicine, New Haven, CT, United States of America
- Department of Medical Physics and Biomedical Engineering, University College London, London, United Kingdom
- Haskins Laboratories, New Haven, CT, United States of America
- Xian Zhang
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States of America
- J. Adam Noah
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States of America
- Swethasri Dravida
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States of America
- Interdepartmental Neuroscience Program, Yale School of Medicine, New Haven, CT, United States of America
- Adam Naples
- Yale Child Study Center, New Haven, CT, United States of America
- Mark Tiede
- Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States of America
- Haskins Laboratories, New Haven, CT, United States of America
- Julie M. Wolf
- Yale Child Study Center, New Haven, CT, United States of America
10
Morillo-Mendez L, Schrooten MGS, Loutfi A, Mozos OM. Age-related differences in the perception of robotic referential gaze in human-robot interaction. Int J Soc Robot 2022:1-13. PMID: 36185773; PMCID: PMC9510350; DOI: 10.1007/s12369-022-00926-6.
Abstract
There is increased interest in using social robots to assist older adults during their daily life activities. As social robots are designed to interact with older users, it becomes relevant to study these interactions under the lens of social cognition. Gaze following, the social ability to infer where other people are looking, deteriorates with older age. Therefore, referential gaze from robots might not be an effective social cue for indicating spatial locations to older users. In this study, we explored the performance of older adults, middle-aged adults, and younger controls in a task assisted by the referential gaze of a Pepper robot. We examined age-related differences in task performance and in self-reported social perception of the robot. Our main findings show that referential gaze from a robot benefited task performance, although the magnitude of this facilitation was lower for older participants. Moreover, perceived anthropomorphism of the robot varied less as a result of its referential gaze in older adults. This research supports the idea that social robots, even if limited in their gazing capabilities, can be effectively perceived as social entities. Additionally, it suggests that robotic social cues, usually validated with young participants, might be less optimal signals for older adults. Supplementary Information The online version contains supplementary material available at 10.1007/s12369-022-00926-6.
Affiliation(s)
- Lucas Morillo-Mendez
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Amy Loutfi
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Oscar Martinez Mozos
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
11
Ayaz H, Baker WB, Blaney G, Boas DA, Bortfeld H, Brady K, Brake J, Brigadoi S, Buckley EM, Carp SA, Cooper RJ, Cowdrick KR, Culver JP, Dan I, Dehghani H, Devor A, Durduran T, Eggebrecht AT, Emberson LL, Fang Q, Fantini S, Franceschini MA, Fischer JB, Gervain J, Hirsch J, Hong KS, Horstmeyer R, Kainerstorfer JM, Ko TS, Licht DJ, Liebert A, Luke R, Lynch JM, Mesquida J, Mesquita RC, Naseer N, Novi SL, Orihuela-Espina F, O'Sullivan TD, Peterka DS, Pifferi A, Pollonini L, Sassaroli A, Sato JR, Scholkmann F, Spinelli L, Srinivasan VJ, St. Lawrence K, Tachtsidis I, Tong Y, Torricelli A, Urner T, Wabnitz H, Wolf M, Wolf U, Xu S, Yang C, Yodh AG, Yücel MA, Zhou W. Optical imaging and spectroscopy for the study of the human brain: status report. Neurophotonics 2022; 9:S24001. PMID: 36052058; PMCID: PMC9424749; DOI: 10.1117/1.nph.9.s2.s24001.
Abstract
This report is the second part of a comprehensive two-part series reviewing an extensive and diverse toolkit of novel methods for exploring brain health and function. While the first report focused on neurophotonic tools mostly applicable to animal studies, here we highlight optical spectroscopy and imaging methods relevant to noninvasive human brain studies. We outline current state-of-the-art technologies and software advances, explore the most recent impact of these technologies on neuroscience and clinical applications, identify the areas where innovation is needed, and provide an outlook on future directions.
Affiliation(s)
- Hasan Ayaz
- Drexel University, School of Biomedical Engineering, Science, and Health Systems, Philadelphia, Pennsylvania, United States
- Drexel University, College of Arts and Sciences, Department of Psychological and Brain Sciences, Philadelphia, Pennsylvania, United States
- Wesley B. Baker
- Children’s Hospital of Philadelphia, Division of Neurology, Philadelphia, Pennsylvania, United States
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, United States
- Giles Blaney
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
- David A. Boas
- Boston University Neurophotonics Center, Boston, Massachusetts, United States
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
- Heather Bortfeld
- University of California, Merced, Departments of Psychological Sciences and Cognitive and Information Sciences, Merced, California, United States
- Kenneth Brady
- Lurie Children’s Hospital, Northwestern University Feinberg School of Medicine, Department of Anesthesiology, Chicago, Illinois, United States
- Joshua Brake
- Harvey Mudd College, Department of Engineering, Claremont, California, United States
- Sabrina Brigadoi
- University of Padua, Department of Developmental and Social Psychology, Padua, Italy
- Erin M. Buckley
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
- Emory University School of Medicine, Department of Pediatrics, Atlanta, Georgia, United States
- Stefan A. Carp
- Massachusetts General Hospital, Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, Massachusetts, United States
- Robert J. Cooper
- University College London, Department of Medical Physics and Bioengineering, DOT-HUB, London, United Kingdom
- Kyle R. Cowdrick
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
- Joseph P. Culver
- Washington University School of Medicine, Department of Radiology, St. Louis, Missouri, United States
- Ippeita Dan
- Chuo University, Faculty of Science and Engineering, Tokyo, Japan
- Hamid Dehghani
- University of Birmingham, School of Computer Science, Birmingham, United Kingdom
- Anna Devor
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
- Turgut Durduran
- ICFO – The Institute of Photonic Sciences, The Barcelona Institute of Science and Technology, Castelldefels, Barcelona, Spain
- Institució Catalana de Recerca I Estudis Avançats (ICREA), Barcelona, Spain
- Adam T. Eggebrecht
- Washington University in St. Louis, Mallinckrodt Institute of Radiology, St. Louis, Missouri, United States
- Lauren L. Emberson
- University of British Columbia, Department of Psychology, Vancouver, British Columbia, Canada
- Qianqian Fang
- Northeastern University, Department of Bioengineering, Boston, Massachusetts, United States
- Sergio Fantini
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
- Maria Angela Franceschini
- Massachusetts General Hospital, Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, Massachusetts, United States
- Jonas B. Fischer
- ICFO – The Institute of Photonic Sciences, The Barcelona Institute of Science and Technology, Castelldefels, Barcelona, Spain
- Judit Gervain
- University of Padua, Department of Developmental and Social Psychology, Padua, Italy
- Université Paris Cité, CNRS, Integrative Neuroscience and Cognition Center, Paris, France
- Joy Hirsch
- Yale School of Medicine, Department of Psychiatry, Neuroscience, and Comparative Medicine, New Haven, Connecticut, United States
- University College London, Department of Medical Physics and Biomedical Engineering, London, United Kingdom
- Keum-Shik Hong
- Pusan National University, School of Mechanical Engineering, Busan, Republic of Korea
- Qingdao University, School of Automation, Institute for Future, Qingdao, China
- Roarke Horstmeyer
- Duke University, Department of Biomedical Engineering, Durham, North Carolina, United States
- Duke University, Department of Electrical and Computer Engineering, Durham, North Carolina, United States
- Duke University, Department of Physics, Durham, North Carolina, United States
- Jana M. Kainerstorfer
- Carnegie Mellon University, Department of Biomedical Engineering, Pittsburgh, Pennsylvania, United States
- Carnegie Mellon University, Neuroscience Institute, Pittsburgh, Pennsylvania, United States
- Tiffany S. Ko
- Children’s Hospital of Philadelphia, Division of Cardiothoracic Anesthesiology, Philadelphia, Pennsylvania, United States
- Daniel J. Licht
- Children’s Hospital of Philadelphia, Division of Neurology, Philadelphia, Pennsylvania, United States
- Adam Liebert
- Polish Academy of Sciences, Nalecz Institute of Biocybernetics and Biomedical Engineering, Warsaw, Poland
- Robert Luke
- Macquarie University, Department of Linguistics, Sydney, New South Wales, Australia
- Macquarie University Hearing, Australia Hearing Hub, Sydney, New South Wales, Australia
- Jennifer M. Lynch
- Children’s Hospital of Philadelphia, Division of Cardiothoracic Anesthesiology, Philadelphia, Pennsylvania, United States
- Jaume Mesquida
- Parc Taulí Hospital Universitari, Critical Care Department, Sabadell, Spain
- Rickson C. Mesquita
- University of Campinas, Institute of Physics, Campinas, São Paulo, Brazil
- Brazilian Institute of Neuroscience and Neurotechnology, Campinas, São Paulo, Brazil
- Noman Naseer
- Air University, Department of Mechatronics and Biomedical Engineering, Islamabad, Pakistan
- Sergio L. Novi
- University of Campinas, Institute of Physics, Campinas, São Paulo, Brazil
- Western University, Department of Physiology and Pharmacology, London, Ontario, Canada
- Thomas D. O’Sullivan
- University of Notre Dame, Department of Electrical Engineering, Notre Dame, Indiana, United States
- Darcy S. Peterka
- Columbia University, Zuckerman Mind Brain Behaviour Institute, New York, United States
- Luca Pollonini
- University of Houston, Department of Engineering Technology, Houston, Texas, United States
- Angelo Sassaroli
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
- João Ricardo Sato
- Federal University of ABC, Center of Mathematics, Computing and Cognition, São Bernardo do Campo, São Paulo, Brazil
- Felix Scholkmann
- University of Bern, Institute of Complementary and Integrative Medicine, Bern, Switzerland
- University of Zurich, University Hospital Zurich, Department of Neonatology, Biomedical Optics Research Laboratory, Zürich, Switzerland
- Lorenzo Spinelli
- National Research Council (CNR), IFN – Institute for Photonics and Nanotechnologies, Milan, Italy
- Vivek J. Srinivasan
- University of California Davis, Department of Biomedical Engineering, Davis, California, United States
- NYU Langone Health, Department of Ophthalmology, New York, New York, United States
- NYU Langone Health, Department of Radiology, New York, New York, United States
- Keith St. Lawrence
- Lawson Health Research Institute, Imaging Program, London, Ontario, Canada
- Western University, Department of Medical Biophysics, London, Ontario, Canada
- Ilias Tachtsidis
- University College London, Department of Medical Physics and Biomedical Engineering, London, United Kingdom
- Yunjie Tong
- Purdue University, Weldon School of Biomedical Engineering, West Lafayette, Indiana, United States
- Alessandro Torricelli
- Politecnico di Milano, Dipartimento di Fisica, Milan, Italy
- National Research Council (CNR), IFN – Institute for Photonics and Nanotechnologies, Milan, Italy
- Tara Urner
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
- Heidrun Wabnitz
- Physikalisch-Technische Bundesanstalt (PTB), Berlin, Germany
- Martin Wolf
- University of Zurich, University Hospital Zurich, Department of Neonatology, Biomedical Optics Research Laboratory, Zürich, Switzerland
- Ursula Wolf
- University of Bern, Institute of Complementary and Integrative Medicine, Bern, Switzerland
- Shiqi Xu
- Duke University, Department of Biomedical Engineering, Durham, North Carolina, United States
- Changhuei Yang
- California Institute of Technology, Department of Electrical Engineering, Pasadena, California, United States
- Arjun G. Yodh
- University of Pennsylvania, Department of Physics and Astronomy, Philadelphia, Pennsylvania, United States
- Meryem A. Yücel
- Boston University Neurophotonics Center, Boston, Massachusetts, United States
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
- Wenjun Zhou
- University of California Davis, Department of Biomedical Engineering, Davis, California, United States
- China Jiliang University, College of Optical and Electronic Technology, Hangzhou, Zhejiang, China

12
Yorgancigil E, Yildirim F, Urgen BA, Erdogan SB. An Exploratory Analysis of the Neural Correlates of Human-Robot Interactions With Functional Near Infrared Spectroscopy. Front Hum Neurosci 2022; 16:883905. [PMID: 35923750] [PMCID: PMC9339604] [DOI: 10.3389/fnhum.2022.883905]
Abstract
Functional near infrared spectroscopy (fNIRS) has been gaining increasing interest as a practical mobile functional brain imaging technology for understanding the neural correlates of social cognition and emotional processing in the human prefrontal cortex (PFC). Considering the cognitive complexity of human-robot interactions, the aim of this study was to explore the neural correlates of emotional processing of congruent and incongruent pairs of human and robot audio-visual stimuli in the human PFC with fNIRS methodology. Hemodynamic responses from the PFC region of 29 subjects were recorded with fNIRS during an experimental paradigm which consisted of auditory and visual presentation of human and robot stimuli. Distinct neural responses to human and robot stimuli were detected at the dorsolateral prefrontal cortex (DLPFC) and orbitofrontal cortex (OFC) regions. Presentation of robot voice elicited significantly less hemodynamic response than presentation of human voice in a left OFC channel. Meanwhile, processing of human faces elicited significantly higher hemodynamic activity when compared to processing of robot faces in two left DLPFC channels and a left OFC channel. Significant correlation between the hemodynamic and behavioral responses for the face-voice mismatch effect was found in the left OFC. Our results highlight the potential of fNIRS for unraveling the neural processing of human and robot audio-visual stimuli, which might enable optimization of social robot designs and contribute to elucidation of the neural processing of human and robot stimuli in the PFC in naturalistic conditions.
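The channel-wise condition contrasts described in this abstract come down to paired comparisons of per-subject hemodynamic response estimates. The following is a minimal numpy sketch with simulated values, not the study's data or pipeline; the variable names and the effect sizes are illustrative assumptions.

```python
import numpy as np

# Hypothetical per-subject mean HbO response estimates (arbitrary units)
# for one fNIRS channel under two conditions; simulated, not study data.
rng = np.random.default_rng(0)
n_subjects = 29
human_voice = rng.normal(0.8, 0.3, n_subjects)  # assumed stronger response
robot_voice = rng.normal(0.5, 0.3, n_subjects)

def paired_t(a, b):
    """Paired t-statistic and degrees of freedom for within-subject contrasts."""
    d = np.asarray(a, float) - np.asarray(b, float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1

t, dof = paired_t(human_voice, robot_voice)
print(f"t({dof}) = {t:.2f}")
```

In practice such a test would be run per channel, with correction for multiple comparisons across the probe layout.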
Affiliation(s)
- Emre Yorgancigil
- Department of Medical Engineering, Acibadem Mehmet Ali Aydinlar University, Istanbul, Turkey
- Correspondence: Emre Yorgancigil
- Funda Yildirim
- Cognitive Science Master's Program, Yeditepe University, Istanbul, Turkey
- Department of Computer Engineering, Yeditepe University, Istanbul, Turkey
- Burcu A. Urgen
- Department of Psychology, Bilkent University, Ankara, Turkey
- Neuroscience Graduate Program, Bilkent University, Ankara, Turkey
- Aysel Sabuncu Brain Research Center, National Magnetic Resonance Research Center (UMRAM), Ankara, Turkey
- Sinem Burcu Erdogan
- Department of Medical Engineering, Acibadem Mehmet Ali Aydinlar University, Istanbul, Turkey

13
Ge Y, Su R, Liang Z, Luo J, Tian S, Shen X, Wu H, Liu C. Transcranial Direct Current Stimulation Over the Right Temporal Parietal Junction Facilitates Spontaneous Micro-Expression Recognition. Front Hum Neurosci 2022; 16:933831. [PMID: 35874155] [PMCID: PMC9305610] [DOI: 10.3389/fnhum.2022.933831]
Abstract
Micro-expressions are fleeting and subtle emotional expressions. Because they are spontaneous and not under conscious control, micro-expressions are considered an indicator of genuine emotions. Their accurate recognition and interpretation promote interpersonal interaction and social communication, so enhancing the ability to recognize micro-expressions has captured much attention. In the current study, we investigated the effects of training on micro-expression recognition with a Chinese version of the Micro-Expression Training Tool (METT). Our goal was to examine whether the recognition accuracy of spontaneous micro-expressions could be improved through training and brain stimulation. Since the right temporal parietal junction (rTPJ) has been shown to be involved in the explicit processing of facial emotion recognition, we hypothesized that the rTPJ would play a role in facilitating the recognition of micro-expressions. The results showed that anodal transcranial direct-current stimulation (tDCS) of the rTPJ indeed improved the recognition of spontaneous micro-expressions, especially those associated with fear. The improvement in recognizing spontaneous fear micro-expressions was positively correlated with personal distress in the anodal group but not in the sham group. Our study supports the combined use of tDCS and METT as a viable way to train and enhance micro-expression recognition.
Affiliation(s)
- Yue Ge
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Beijing Institute of Biomedicine, Beijing, China
- Rui Su
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Zilu Liang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Jing Luo
- Beijing Institute of Biomedicine, Beijing, China
- Suizi Tian
- School of Psychology, Beijing Normal University, Beijing, China
- Xunbing Shen
- College of Humanities, Jiangxi University of Chinese Medicine, Nanchang, China
- Haiyan Wu
- Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Taipa, China
- Chao Liu
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China

14
Kesner L, Adámek P, Grygarová D. How Neuroimaging Can Aid the Interpretation of Art. Front Hum Neurosci 2021; 15:702473. [PMID: 34594192] [PMCID: PMC8476868] [DOI: 10.3389/fnhum.2021.702473]
Abstract
Cognitive neuroscience of art continues to be criticized for failing to provide interesting results about art itself. In particular, the results of brain-imaging experiments have not yet been utilized in the interpretation of particular works of art. Here we revisit a recent study in which we explored the neuronal and behavioral response to painted portraits with a direct versus an averted gaze. We then demonstrate how fMRI results can be related to the art-historical interpretation of a specific painting. The evidentiary status of neuroimaging data is no different from that of any other extra-pictorial facts that art historians uncover in their research and relate to their account of the significance of a work of art. Such data are not explanatory in a strong sense, yet they provide supportive evidence for the art writer's inference about the intended meaning of a given work. We thus argue that brain imaging can assume an important role in the interpretation of particular artworks.
Affiliation(s)
- Ladislav Kesner
- National Institute of Mental Health, Klecany, Czechia
- Faculty of Arts, Masaryk University, Brno, Czechia
- Petr Adámek
- National Institute of Mental Health, Klecany, Czechia
- Third Faculty of Medicine, Charles University, Prague, Czechia

15
Eye contact marks the rise and fall of shared attention in conversation. Proc Natl Acad Sci U S A 2021; 118:2106645118. [PMID: 34504001] [DOI: 10.1073/pnas.2106645118]
Abstract
Conversation is the platform where minds meet: the venue where information is shared, ideas cocreated, cultural norms shaped, and social bonds forged. Its frequency and ease belie its complexity. Every conversation weaves a unique shared narrative from the contributions of independent minds, requiring partners to flexibly move into and out of alignment as needed for conversation to both cohere and evolve. How two minds achieve this coordination is poorly understood. Here we test whether eye contact, a common feature of conversation, predicts this coordination by measuring dyadic pupillary synchrony (a corollary of shared attention) during natural conversation. We find that eye contact is positively correlated with synchrony as well as ratings of engagement by conversation partners. However, rather than elicit synchrony, eye contact commences as synchrony peaks and predicts its immediate and subsequent decline until eye contact breaks. This relationship suggests that eye contact signals when shared attention is high. Furthermore, we speculate that eye contact may play a corrective role in disrupting shared attention (reducing synchrony) as needed to facilitate independent contributions to conversation.
16
Ono Y, Zhang X, Noah JA, Dravida S, Hirsch J. Bidirectional Connectivity Between Broca's Area and Wernicke's Area During Interactive Verbal Communication. Brain Connect 2021; 12:210-222. [PMID: 34128394] [DOI: 10.1089/brain.2020.0790]
Abstract
Aim: This investigation aims to advance understanding of the neural dynamics underlying live, natural interactions during spoken dialogue between two individuals. Introduction: The underlying hypothesis is that functional connectivity between canonical speech areas in the human brain is modulated by social interaction. Methods: Granger causality was applied to compare directional connectivity across Broca's and Wernicke's areas during verbal conditions consisting of interactive and noninteractive communication. Thirty-three pairs of healthy adult participants alternately talked and listened to each other while performing an object naming and description task that was either interactive or not, during hyperscanning with functional near-infrared spectroscopy (fNIRS). In the noninteractive condition, the speaker named and described a picture-object without reference to the partner's description. In the interactive condition, the speaker performed the same task but included an interactive response to the partner's preceding comments. Causality measures of hemodynamic responses from Broca's and Wernicke's areas were compared between real, surrogate, and shuffled trials within dyads. Results: Interactive communication was characterized by bidirectional connectivity between Wernicke's and Broca's areas in the listener's brain, whereas this connectivity was unidirectional in the speaker's brain. In the noninteractive condition, both the speaker's and listener's brains showed unidirectional top-down (Broca's area to Wernicke's area) connectivity. Conclusion: Together, directional connectivity as determined by Granger analysis reveals a bidirectional flow of neuronal information during dynamic two-person verbal interaction for processes that are active during listening (reception) but not during talking (production).
Findings are consistent with prior contrast findings (general linear model) showing neural modulation of the receptive language system associated with Wernicke's area during a two-person live interaction. Impact statement: The neural dynamics that underlie real-life social interactions are an emerging topic of interest. Dynamically coupled cross-brain neural mechanisms between interacting partners during verbal dialogue have been shown within Wernicke's area. However, it is not known how within-brain long-range neural mechanisms operate during these live social functions. Using Granger causality analysis, we show bidirectional neural activity between Broca's and Wernicke's areas during interactive dialogue, compared with only unidirectional activity in a noninteractive control task. Findings are consistent with an Interactive Brain Model in which long-range neural mechanisms process interactive processes associated with rapid and spontaneous spoken social cues.
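The directional-connectivity analysis this abstract describes rests on the core Granger idea: past values of one signal improve prediction of another beyond that signal's own past. The following is a minimal bivariate sketch on simulated hemodynamic-like series, assuming ordinary least squares with a fixed lag order; it is illustrative only and not the study's implementation.

```python
import numpy as np

def granger_f(x, y, lags=2):
    """F-statistic testing whether past values of x help predict y
    beyond y's own past (a minimal bivariate Granger test via OLS)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n = y.size - lags
    Y = y[lags:]
    # Lagged design matrices: y's own past (restricted) plus x's past (full).
    own = np.column_stack([y[lags - k: -k] for k in range(1, lags + 1)])
    cross = np.column_stack([x[lags - k: -k] for k in range(1, lags + 1)])
    ones = np.ones((n, 1))
    full = np.hstack([ones, own, cross])
    restr = np.hstack([ones, own])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(restr), rss(full)
    df1, df2 = lags, n - full.shape[1]
    return ((rss_r - rss_f) / df1) / (rss_f / df2)

# Simulated series in which x drives y with a one-sample delay.
rng = np.random.default_rng(2)
T = 500
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * x[t - 1] + 0.3 * rng.normal()
```

Comparing `granger_f(x, y)` against `granger_f(y, x)` recovers the simulated direction of influence; the study applies the same logic to hemodynamic time series from Broca's and Wernicke's areas, with surrogate and shuffled trials as controls.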
Affiliation(s)
- Yumie Ono
- Department of Electronics and Bioinformatics, School of Science and Technology, Meiji University, Kawasaki, Kanagawa, Japan
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut, USA
- Xian Zhang
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut, USA
- J Adam Noah
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut, USA
- Swethasri Dravida
- Interdepartmental Program for Neuroscience, Yale School of Medicine, New Haven, Connecticut, USA
- Medical Student Training Program, Yale School of Medicine, New Haven, Connecticut, USA
- Joy Hirsch
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut, USA
- Interdepartmental Program for Neuroscience, Yale School of Medicine, New Haven, Connecticut, USA
- Department of Neuroscience, Yale School of Medicine, New Haven, Connecticut, USA
- Department of Comparative Medicine, Yale School of Medicine, New Haven, Connecticut, USA
- Department of Medical Physics and Biomedical Engineering, University College London, London, United Kingdom