1
Pecukonis M, Gerson J, Gustafson-Alm H, Wood M, Yücel M, Boas D, Tager-Flusberg H. The Neural Bases of Language Processing During Social and Non-Social Contexts: A fNIRS Study of Autistic and Neurotypical Preschool-Aged Children. Research Square 2024:rs.3.rs-4450882. [PMID: 38883761 PMCID: PMC11177967 DOI: 10.21203/rs.3.rs-4450882/v1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 06/18/2024]
Abstract
Background
Little is known about how the brains of autistic children process language during real-world "social contexts," despite the fact that challenges with language, communication, and social interaction are core features of Autism Spectrum Disorder (ASD).
Methods
We investigated the neural bases of language processing during social and non-social contexts in a sample of N=20 autistic and N=20 neurotypical (NT) preschool-aged children, 3 to 6 years old. Functional near-infrared spectroscopy (fNIRS) was used to measure children's brain response to "live language" spoken by a live experimenter during an in-person social context (i.e., book reading), and "recorded language" played via an audio recording during a non-social context (i.e., screen time). We examined within-group and between-group differences in the strength and localization of brain response to live language and recorded language, as well as correlations between children's brain response and language skills measured by the Preschool Language Scales.
Results
In the NT group, brain response to live language was greater than brain response to recorded language in the right temporal parietal junction (TPJ). In the ASD group, the strength of brain response did not differ between conditions. The ASD group showed greater brain response to recorded language than the NT group in the right inferior and middle frontal gyrus (IMFG). Across groups, children's language skills were negatively associated with brain response to recorded language in the right IMFG, suggesting that processing recorded language required more cognitive effort for children with lower language skills. Children's language skills were also positively associated with the difference in brain response between conditions in the right TPJ, demonstrating that children who showed a greater difference in brain response to live language versus recorded language had higher language skills.
Limitations
Findings should be considered preliminary until they are replicated in a larger sample.
Conclusions
Findings suggest that the brains of NT children, but not autistic children, process language differently during social and non-social contexts. Individual differences in how the brain processes language during social and non-social contexts may help to explain why language skills are so variable across children with and without autism.
2
Green GD, Jacewicz E, Santosa H, Arzbecker LJ, Fox RA. Evaluating Speaker-Listener Cognitive Effort in Speech Communication Through Brain-to-Brain Synchrony: A Pilot Functional Near-Infrared Spectroscopy Investigation. J Speech Lang Hear Res 2024; 67:1339-1359. [PMID: 38535722 DOI: 10.1044/2024_jslhr-23-00476] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 05/09/2024]
Abstract
PURPOSE
We explore a new approach to the study of cognitive effort involved in listening to speech by measuring the brain activity in a listener in relation to the brain activity in a speaker. We hypothesize that the strength of this brain-to-brain synchrony (coupling) reflects the magnitude of cognitive effort involved in verbal communication and includes both listening effort and speaking effort. We investigate whether interbrain synchrony is greater in native-to-native versus native-to-nonnative communication using functional near-infrared spectroscopy (fNIRS).
METHOD
Two speakers participated, a native speaker of American English and a native speaker of Korean who spoke English as a second language. Each speaker was fitted with the fNIRS cap and told short stories. The native English speaker provided the English narratives, and the Korean speaker provided both the nonnative (accented) English and Korean narratives. In separate sessions, fNIRS data were obtained from seven English monolingual participants ages 20-24 years who listened to each speaker's stories. After listening to each story in native and nonnative English, they retold the content, and their transcripts and audio recordings were analyzed for comprehension and discourse fluency, measured in the number of hesitations and articulation rate. No story retellings were obtained for narratives in Korean (an incomprehensible language for English listeners). Utilizing an fNIRS technique termed sequential scanning, we quantified the brain-to-brain synchronization in each speaker-listener dyad.
RESULTS
For native-to-native dyads, multiple brain regions associated with various linguistic and executive functions were activated. There was a weaker coupling for native-to-nonnative dyads, and only the brain regions associated with higher order cognitive processes and functions were synchronized. All listeners understood the content of all stories, but they hesitated significantly more when retelling stories told in accented English. The nonnative speaker hesitated significantly more often than the native speaker and had a significantly slower articulation rate. There was no brain-to-brain coupling during listening to Korean, indicating a break in communication when listeners failed to comprehend the speaker.
CONCLUSIONS
We found that effortful speech processing decreased interbrain synchrony and delayed comprehension processes. The obtained brain-based and behavioral patterns are consistent with our proposal that cognitive effort in verbal communication pertains to both the listener and the speaker and that brain-to-brain synchrony can be an indicator of differences in their cumulative communicative effort.
SUPPLEMENTAL MATERIAL
https://doi.org/10.23641/asha.25452142
Affiliation(s)
- Geoff D Green
- Department of Speech and Hearing Science, The Ohio State University, Columbus
- Ewa Jacewicz
- Department of Speech and Hearing Science, The Ohio State University, Columbus
- Lian J Arzbecker
- Department of Speech and Hearing Science, The Ohio State University, Columbus
- Robert A Fox
- Department of Speech and Hearing Science, The Ohio State University, Columbus
3
Wohltjen S, Wheatley T. Interpersonal eye-tracking reveals the dynamics of interacting minds. Front Hum Neurosci 2024; 18:1356680. [PMID: 38532792 PMCID: PMC10963423 DOI: 10.3389/fnhum.2024.1356680] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/16/2023] [Accepted: 02/20/2024] [Indexed: 03/28/2024]
Abstract
The human eye is a rich source of information about where, when, and how we attend. Our gaze paths indicate where and what captures our attention, while changes in pupil size can signal surprise, revealing our expectations. Similarly, the pattern of our blinks suggests levels of alertness and when our attention shifts between external engagement and internal thought. During interactions with others, these cues reveal how we coordinate and share our mental states. To leverage these insights effectively, we need accurate, timely methods to observe these cues as they naturally unfold. Advances in eye-tracking technology now enable real-time observation of these cues, shedding light on mutual cognitive processes that foster shared understanding, collaborative thought, and social connection. This brief review highlights these advances and the new opportunities they present for future research.
Affiliation(s)
- Sophie Wohltjen
- Department of Psychology, University of Wisconsin–Madison, Madison, WI, United States
- Thalia Wheatley
- Department of Psychological and Brain Sciences, Consortium for Interacting Minds, Dartmouth College, Hanover, NH, United States
- Santa Fe Institute, Santa Fe, NM, United States
4
Konrad K, Gerloff C, Kohl SH, Mehler DMA, Mehlem L, Volbert EL, Komorek M, Henn AT, Boecker M, Weiss E, Reindl V. Interpersonal neural synchrony and mental disorders: unlocking potential pathways for clinical interventions. Front Neurosci 2024; 18:1286130. [PMID: 38529267 PMCID: PMC10962391 DOI: 10.3389/fnins.2024.1286130] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 08/30/2023] [Accepted: 01/30/2024] [Indexed: 03/27/2024]
Abstract
Introduction
Interpersonal synchronization involves the alignment of behavioral, affective, physiological, and brain states during social interactions. It facilitates empathy, emotion regulation, and prosocial commitment. Mental disorders characterized by social interaction dysfunction, such as Autism Spectrum Disorder (ASD), Reactive Attachment Disorder (RAD), and Social Anxiety Disorder (SAD), often exhibit atypical synchronization with others across multiple levels. With the introduction of the "second-person" neuroscience perspective, our understanding of interpersonal neural synchronization (INS) has improved; so far, however, it has hardly influenced the development of novel therapeutic interventions.
Methods
To evaluate the potential of INS-based treatments for mental disorders, we performed two systematic literature searches identifying studies that directly target INS through neurofeedback (12 publications; 9 independent studies) or brain stimulation techniques (7 studies), following PRISMA guidelines. In addition, we narratively review indirect INS manipulations through behavioral, biofeedback, or hormonal interventions. We discuss the potential of such treatments for ASD, RAD, and SAD and, using a systematic database search, assess the acceptability of neurofeedback (4 studies) and neurostimulation (4 studies) in patients with social dysfunction.
Results
Although behavioral approaches, such as engaging in eye contact or cooperative actions, have been shown to be associated with increased INS, little is known about potential long-term consequences of such interventions. Few proof-of-concept studies have utilized brain stimulation techniques, like transcranial direct current stimulation or INS-based neurofeedback, showing feasibility and preliminary evidence that such interventions can boost behavioral synchrony and social connectedness. Yet, optimal brain stimulation protocols and neurofeedback parameters are still undefined. For ASD, RAD, or SAD, no randomized controlled trial has yet proven the efficacy of direct INS-based intervention techniques, although brain stimulation and neurofeedback methods in general seem to be well accepted in these patient groups.
Discussion
Significant work remains to translate INS-based manipulations into effective treatments for social interaction disorders. Future research should focus on mechanistic insights into INS, technological advancements, and rigorous design standards. Furthermore, it will be key to compare interventions directly targeting INS to those targeting other modalities of synchrony, as well as to define optimal target dyads and target synchrony states in clinical interventions.
Affiliation(s)
- Kerstin Konrad
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- JARA Brain Institute II, Molecular Neuroscience and Neuroimaging (INM-11), Jülich Research Centre, Jülich, Germany
- Christian Gerloff
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- JARA Brain Institute II, Molecular Neuroscience and Neuroimaging (INM-11), Jülich Research Centre, Jülich, Germany
- Department of Applied Mathematics and Theoretical Physics, Cambridge Centre for Data-Driven Discovery, University of Cambridge, Cambridge, United Kingdom
- Simon H. Kohl
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- JARA Brain Institute II, Molecular Neuroscience and Neuroimaging (INM-11), Jülich Research Centre, Jülich, Germany
- David M. A. Mehler
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Institute for Translational Psychiatry, University of Münster, Münster, Germany
- School of Psychology, Cardiff University Brain Research Imaging Center (CUBRIC), Cardiff University, Cardiff, United Kingdom
- Lena Mehlem
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Emily L. Volbert
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Maike Komorek
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Alina T. Henn
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Maren Boecker
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Institute of Medical Psychology and Medical Sociology, University Hospital RWTH, Aachen, Germany
- Eileen Weiss
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Institute of Medical Psychology and Medical Sociology, University Hospital RWTH, Aachen, Germany
- Vanessa Reindl
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Department of Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
5
Papoutselou E, Harrison S, Mai G, Buck B, Patil N, Wiggins I, Hartley D. Investigating mother-child inter-brain synchrony in a naturalistic paradigm: A functional near infrared spectroscopy (fNIRS) hyperscanning study. Eur J Neurosci 2024; 59:1386-1403. [PMID: 38155106 DOI: 10.1111/ejn.16233] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/19/2022] [Revised: 11/27/2023] [Accepted: 12/01/2023] [Indexed: 12/30/2023]
Abstract
Successful social interactions between mothers and children are hypothesised to play a significant role in a child's social, cognitive and language development. Earlier research has confirmed, through structured experimental paradigms, that these interactions could be underpinned by coordinated neural activity. Nevertheless, the extent of neural synchrony during real-life, ecologically valid interactions between mothers and their children remains largely unexplored. In this study, we investigated mother-child inter-brain synchrony using a naturalistic free-play paradigm. We also examined the relationship between neural synchrony, verbal communication patterns and personality traits to further understand the underpinnings of brain synchrony. Twelve children aged between 3 and 5 years old and their mothers participated in this study. Neural synchrony in mother-child dyads was measured bilaterally over frontal and temporal areas using functional near-infrared spectroscopy (fNIRS) whilst the dyads were asked to play with child-friendly toys together (interactive condition) and separately (independent condition). Communication patterns were captured via video recordings and conversational turns were coded. Compared to the independent condition, mother-child dyads showed increased neural synchrony in the interactive condition across the prefrontal cortex and temporo-parietal junction. There was no significant relationship between neural synchrony and turn-taking, or between neural synchrony and the personality traits of either member of the dyad. Overall, we demonstrate the feasibility of measuring inter-brain synchrony between mothers and children in a naturalistic environment. These findings can inform future study designs to assess inter-brain synchrony between parents and pre-lingual children and/or children with communication needs.
Affiliation(s)
- Efstratia Papoutselou
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Samantha Harrison
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Guangting Mai
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Bryony Buck
- Hearing Sciences - Scottish Section, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nikita Patil
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- School of Medicine, University of Nottingham, Nottingham, UK
- Ian Wiggins
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Douglas Hartley
- Hearing Sciences, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, UK
- Nottingham Biomedical Research Centre (BRC), National Institute for Health Research (NIHR), Nottingham, UK
- Nottingham University Hospitals NHS Trust, Queen's Medical Centre, Nottingham, UK
6
Bánki A, Köster M, Cichy RM, Hoehl S. Communicative signals during joint attention promote neural processes of infants and caregivers. Dev Cogn Neurosci 2024; 65:101321. [PMID: 38061133 PMCID: PMC10754706 DOI: 10.1016/j.dcn.2023.101321] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 05/12/2023] [Revised: 10/13/2023] [Accepted: 11/04/2023] [Indexed: 01/01/2024]
Abstract
Communicative signals such as eye contact increase infants' brain activation to visual stimuli and promote joint attention. Our study assessed whether communicative signals during joint attention enhance infant-caregiver dyads' neural responses to objects, and their neural synchrony. To track mutual attention processes, we applied rhythmic visual stimulation (RVS), presenting images of objects to 12-month-old infants and their mothers (n = 37 dyads), while we recorded dyads' brain activity (i.e., steady-state visual evoked potentials, SSVEPs) with electroencephalography (EEG) hyperscanning. Within dyads, mothers either communicatively showed the images to their infant or watched the images without communicative engagement. Communicative cues increased infants' and mothers' SSVEPs at central-occipital-parietal and central electrode sites, respectively. Infants showed significantly more gaze behaviour toward images during communicative engagement. Dyadic neural synchrony (SSVEP amplitude envelope correlations, AECs) was not modulated by communicative cues. Taken together, maternal communicative cues in joint attention increase infants' neural responses to objects and shape mothers' own attention processes. We show that communicative cues enhance cortical visual processing and thus play an essential role in social learning. Future studies need to elucidate the effect of communicative cues on neural synchrony during joint attention. Finally, our study introduces RVS to study infant-caregiver neural dynamics in social contexts.
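The synchrony measure named here, amplitude envelope correlation (AEC), reduces to a compact computation: extract each signal's Hilbert amplitude envelope, then correlate the envelopes. A minimal sketch with synthetic signals; the 6 Hz carrier, 0.2 Hz shared modulation, noise level, and sampling rate are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.signal import hilbert

def envelope_correlation(x, y):
    """Pearson correlation between Hilbert amplitude envelopes (AEC)."""
    return np.corrcoef(np.abs(hilbert(x)), np.abs(hilbert(y)))[0, 1]

fs = 500.0                       # Hz; assumed EEG sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# A shared slow, attention-like modulation of each person's SSVEP carrier
modulation = 1 + 0.5 * np.sin(2 * np.pi * 0.2 * t)
infant = modulation * np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
mother = modulation * np.sin(2 * np.pi * 6 * t + 0.8) + 0.1 * rng.standard_normal(t.size)

print(f"amplitude envelope correlation: {envelope_correlation(infant, mother):.2f}")
```

Because AEC discards phase and keeps only the slow power fluctuations, the two signals correlate strongly here despite their fixed carrier phase offset.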
Affiliation(s)
- Anna Bánki
- University of Vienna, Faculty of Psychology, Vienna, Austria.
- Moritz Köster
- University of Regensburg, Institute for Psychology, Regensburg, Germany; Freie Universität Berlin, Faculty of Education and Psychology, Berlin, Germany
- Stefanie Hoehl
- University of Vienna, Faculty of Psychology, Vienna, Austria
7
Hakim U, De Felice S, Pinti P, Zhang X, Noah JA, Ono Y, Burgess PW, Hamilton A, Hirsch J, Tachtsidis I. Quantification of inter-brain coupling: A review of current methods used in haemodynamic and electrophysiological hyperscanning studies. Neuroimage 2023; 280:120354. [PMID: 37666393 DOI: 10.1016/j.neuroimage.2023.120354] [Citation(s) in RCA: 16] [Impact Index Per Article: 16.0] [Received: 07/08/2023] [Revised: 08/25/2023] [Accepted: 08/28/2023] [Indexed: 09/06/2023]
Abstract
Hyperscanning is a form of neuroimaging experiment where the brains of two or more participants are imaged simultaneously whilst they interact. Within the domain of social neuroscience, hyperscanning is increasingly used to measure inter-brain coupling (IBC) and explore how brain responses change in tandem during social interaction. In addition to cognitive research, some have suggested that quantification of the interplay between interacting participants can be used as a biomarker for a variety of cognitive mechanisms, as well as to investigate mental health and developmental conditions including schizophrenia, social anxiety and autism. However, many different methods have been used to quantify brain coupling, which raises questions about comparability across studies and reduces research reproducibility. Here, we review methods for quantifying IBC and suggest some ways forward. Following the PRISMA guidelines, we reviewed 215 hyperscanning studies across four different brain imaging modalities: functional near-infrared spectroscopy (fNIRS), functional magnetic resonance imaging (fMRI), electroencephalography (EEG) and magnetoencephalography (MEG). Overall, the review identified a total of 27 different methods used to compute IBC. The most common hyperscanning modality is fNIRS, used by 119 studies, 89 of which adopted wavelet coherence. Based on the results of this literature survey, we first report summary statistics of the hyperscanning field, followed by a brief overview of each signal that is obtained from each neuroimaging modality used in hyperscanning. We then discuss the rationale, assumptions and suitability of each method to different modalities which can be used to investigate IBC. Finally, we discuss issues surrounding the interpretation of each method.
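The spectral family of IBC measures surveyed in reviews like this one can be illustrated with ordinary magnitude-squared coherence between two recordings, a simpler cousin of the wavelet coherence most fNIRS studies adopt. A minimal sketch on synthetic data, using SciPy's `signal.coherence`; the sampling rate, noise level, and band limits are illustrative assumptions, not the review's own pipeline:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 10.0                       # Hz; an assumed fNIRS sampling rate
t = np.arange(0, 120, 1 / fs)   # two minutes of recording

# Two synthetic "brains" sharing a slow 0.1 Hz haemodynamic rhythm
shared = np.sin(2 * np.pi * 0.1 * t)
brain_a = shared + 0.5 * rng.standard_normal(t.size)
brain_b = shared + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence: 1 = perfect linear coupling at a frequency
f, cxy = coherence(brain_a, brain_b, fs=fs, nperseg=256)

band = (f >= 0.05) & (f <= 0.2)  # slow band typically examined in fNIRS hyperscanning
print(f"peak coherence in 0.05-0.2 Hz band: {cxy[band].max():.2f}")
```

Coherence peaks near the shared 0.1 Hz component and stays low at frequencies where the two signals contain only independent noise, which is the basic logic behind frequency-resolved IBC measures.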
Affiliation(s)
- U Hakim
- Department of Medical Physics and Biomedical Engineering, University College London, Malet Place Engineering Building, Gower Street, London WC1E 6BT, United Kingdom.
- S De Felice
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom; Department of Psychology, University of Cambridge, United Kingdom
- P Pinti
- Department of Medical Physics and Biomedical Engineering, University College London, Malet Place Engineering Building, Gower Street, London WC1E 6BT, United Kingdom; Centre for Brain and Cognitive Development, Birkbeck, University of London, London, United Kingdom
- X Zhang
- Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- J A Noah
- Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- Y Ono
- Department of Electronics and Bioinformatics, School of Science and Technology, Meiji University, Kawasaki, Kanagawa, Japan
- P W Burgess
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- A Hamilton
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- J Hirsch
- Department of Medical Physics and Biomedical Engineering, University College London, Malet Place Engineering Building, Gower Street, London WC1E 6BT, United Kingdom; Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States; Departments of Neuroscience and Comparative Medicine, Yale School of Medicine, New Haven, CT, United States; Yale University, Wu Tsai Institute, New Haven, CT, United States
- I Tachtsidis
- Department of Medical Physics and Biomedical Engineering, University College London, Malet Place Engineering Building, Gower Street, London WC1E 6BT, United Kingdom
8
Chuang C, Hsu H. Pseudo-mutual gazing enhances interbrain synchrony during remote joint attention tasking. Brain Behav 2023; 13:e3181. [PMID: 37496332 PMCID: PMC10570487 DOI: 10.1002/brb3.3181] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Received: 04/06/2023] [Revised: 06/29/2023] [Accepted: 07/13/2023] [Indexed: 07/28/2023]
Abstract
INTRODUCTION
Mutual gaze enables people to share attention and increase engagement during social interactions through intentional and implicit messages. Although previous studies have explored gaze behaviors and neural mechanisms underlying in-person eye contact, the growing prevalence of remote communication has raised questions about how to establish mutual gaze remotely and how the brains of interacting individuals synchronize.
METHODS
To address these questions, we conducted a study using eye trackers to create a pseudo-mutual gaze channel that mirrors the gazes of each interacting dyad on their respective remote screens. To demonstrate fluctuations in coupling across brains, we incorporated electroencephalographic hyperscanning techniques to simultaneously record the brain activity of interacting dyads engaged in a joint attention task in player-observer, collaborative, and competitive modes.
RESULTS
Our results indicated that mutual gaze could improve the efficiency of joint attention activities among remote partners. Moreover, by employing the phase locking value, we could estimate interbrain synchrony (IBS) and observe low-frequency couplings in the frontal and temporal regions that varied based on the interaction mode. While dyadic gender composition significantly affected gaze patterns, it did not impact the IBS.
CONCLUSION
These results provide insight into the neurological mechanisms underlying remote interaction through the pseudo-mutual gaze channel and have significant implications for developing effective online communication environments.
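The phase locking value (PLV) used above to estimate IBS has a compact standard form: the magnitude of the time-averaged unit phasor of the instantaneous phase difference between two signals. A self-contained sketch on synthetic narrow-band signals; the 6 Hz rhythm, sampling rate, and noise are illustrative, and the study's preprocessing and channel pairing are not reproduced:

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase locking value: |mean(exp(i*(phi_x - phi_y)))|, in [0, 1]."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

fs = 250.0                                   # Hz; assumed EEG sampling rate
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(2)

theta_a = np.sin(2 * np.pi * 6 * t)          # 6 Hz rhythm in brain A
theta_b = np.sin(2 * np.pi * 6 * t + 1.0)    # same rhythm, constant phase lag
noise = rng.standard_normal(t.size)          # unrelated broadband activity

print(f"phase-locked pair: {plv(theta_a, theta_b):.2f}")  # close to 1
print(f"against noise    : {plv(theta_a, noise):.2f}")    # much lower
```

Note that PLV is insensitive to the size of a constant lag: a fixed phase offset still yields a value near 1, which is why it detects coupling even when two brains oscillate out of phase.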
Affiliation(s)
- Chun‐Hsiang Chuang
- Research Center for Education and Mind Sciences, College of Education, National Tsing Hua University, Hsinchu, Taiwan
- Institute of Information Systems and Applications, College of Electrical Engineering and Computer Science, National Tsing Hua University, Hsinchu, Taiwan
- Hao‐Che Hsu
- Research Center for Education and Mind Sciences, College of Education, National Tsing Hua University, Hsinchu, Taiwan
- Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu, Taiwan
- Department of Computer Science and Engineering, National Taiwan Ocean University, Keelung, Taiwan
9
Park J, Shin J, Lee J, Jeong J. Inter-Brain Synchrony Pattern Investigation on Triadic Board Game Play-Based Social Interaction: An fNIRS Study. IEEE Trans Neural Syst Rehabil Eng 2023; 31:2923-2932. [PMID: 37410649 DOI: 10.1109/tnsre.2023.3292844] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Indexed: 07/08/2023]
Abstract
Recent advances in functional neuroimaging techniques, including methodologies such as fNIRS, have enabled the evaluation of inter-brain synchrony (IBS) induced by interpersonal interactions. However, the social interactions assumed in existing dyadic hyperscanning studies do not sufficiently emulate polyadic social interactions in the real world. Therefore, we devised an experimental paradigm that incorporates the Korean folk board game "Yut-nori" to reproduce social interactions that emulate social activities in the real world. We recruited 72 participants aged 25.2 ± 3.9 years (mean ± standard deviation) and divided them into 24 triads to play Yut-nori, following the standard or modified rules. The participants either competed against an opponent (standard rule) or cooperated with an opponent (modified rule) to achieve a goal efficiently. Three different fNIRS devices were employed to record cortical hemodynamic activations in the prefrontal cortex both individually and simultaneously. Wavelet transform coherence (WTC) analyses were performed to assess prefrontal IBS within a frequency range of 0.05-0.2 Hz. We observed that cooperative interactions increased prefrontal IBS across all frequency bands of interest. We also found that different purposes for cooperation generated different spectral characteristics of IBS depending on the frequency band. Moreover, IBS in the frontopolar cortex (FPC) reflected the influence of verbal interactions. The findings of our study suggest that future hyperscanning studies should consider polyadic social interactions to reveal the properties of IBS in real-world interactions.
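Full wavelet transform coherence needs a dedicated wavelet library, but the band-limited synchrony it quantifies can be approximated with a much simpler stand-in: band-pass both signals to the 0.05-0.2 Hz range named above, then correlate them in sliding windows. This is a rough proxy only, not the study's WTC analysis; the sampling rate, window length, and signal content are all assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def windowed_band_synchrony(x, y, fs, win_s=30.0):
    """Band-limit to 0.05-0.2 Hz, then Pearson-correlate sliding windows."""
    b, a = butter(2, [0.05, 0.2], btype="bandpass", fs=fs)
    xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
    win = int(win_s * fs)
    starts = range(0, len(xf) - win + 1, win)
    return np.array([np.corrcoef(xf[i:i + win], yf[i:i + win])[0, 1] for i in starts])

fs = 8.0                                   # Hz; assumed fNIRS sampling rate
t = np.arange(0, 300, 1 / fs)              # five minutes of game play
rng = np.random.default_rng(3)
shared = np.sin(2 * np.pi * 0.1 * t)       # shared slow haemodynamic component
player_a = shared + rng.standard_normal(t.size)
player_b = shared + rng.standard_normal(t.size)

scores = windowed_band_synchrony(player_a, player_b, fs)
print(f"median windowed synchrony: {np.median(scores):.2f}")
```

The windowing yields a time course of coupling rather than one number, which mirrors how WTC localises synchrony in both time and frequency.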
10
Parker TC, Zhang X, Noah JA, Tiede M, Scassellati B, Kelley M, McPartland JC, Hirsch J. Neural and visual processing of social gaze cueing in typical and ASD adults. medRxiv 2023:2023.01.30.23284243. [PMID: 36778502 PMCID: PMC9915835 DOI: 10.1101/2023.01.30.23284243] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 02/05/2023]
Abstract
Atypical eye gaze in joint attention is a clinical characteristic of autism spectrum disorder (ASD). Despite this documented symptom, neural processing of joint attention tasks in real-life social interactions is not understood. To address this knowledge gap, functional near-infrared spectroscopy (fNIRS) and eye-tracking data were acquired simultaneously as ASD and typically developed (TD) individuals engaged in a gaze-directed joint attention task with a live human and robot partner. We test the hypothesis that face processing deficits in ASD are greater for interactive faces than for simulated (robot) faces. Consistent with prior findings, neural responses during human gaze cueing modulated by face visual dwell time resulted in increased activity of ventral frontal regions in ASD and dorsal parietal systems in TD participants. Hypoactivity of the right dorsal parietal area during live human gaze cueing was correlated with autism spectrum symptom severity: Brief Observations of Symptoms of Autism (BOSA) scores (r = −0.86). In contrast, neural activity in response to robot gaze cueing modulated by visual acquisition factors activated dorsal parietal systems in ASD, and this neural activity was not related to autism symptom severity (r = 0.06). These results are consistent with the hypothesis that altered encoding of incoming facial information to the dorsal parietal cortex is specific to live human faces in ASD. These findings open new directions for understanding joint attention difficulties in ASD by providing a connection between superior parietal lobule activity and live interaction with human faces.
Lay Summary
Little is known about why it is so difficult for autistic individuals to make eye contact with other people. We find that in a live face-to-face viewing task with a robot, the brains of autistic participants were similar to typical participants but not when the partner was a live human. Findings suggest that difficulties in real-life social situations for autistic individuals may be specific to live social interaction rather than to face gaze in general.
|
11
|
Yin X. Influences of eye gaze cues on memory and its mechanisms: The function and evolution of social attention. Front Psychol 2022; 13:1036530. [DOI: 10.3389/fpsyg.2022.1036530] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2022] [Accepted: 09/29/2022] [Indexed: 11/13/2022] Open
Abstract
During evolution, humans have formed a priority perceptual preference for others’ gazes. The gaze direction of others is called the gaze cue, conveying environmental information, a critical non-verbal communication in early humans. Recently, empirical evidence has indicated that gaze cues can affect high-level cognitive processes, such as memory. Unlike non-social cues (e.g., arrows), gaze cues elicit special social attention. Research determining the underlying mechanisms suggests that social intention influences observers’ visual attention and influences their memory. This article provides a brief review of the current state of research on the relationship between gaze cues and memory. Future studies should focus on multiple gaze cues, the social nature of gaze cues, and clinical research.
|
12
|
Ayaz H, Baker WB, Blaney G, Boas DA, Bortfeld H, Brady K, Brake J, Brigadoi S, Buckley EM, Carp SA, Cooper RJ, Cowdrick KR, Culver JP, Dan I, Dehghani H, Devor A, Durduran T, Eggebrecht AT, Emberson LL, Fang Q, Fantini S, Franceschini MA, Fischer JB, Gervain J, Hirsch J, Hong KS, Horstmeyer R, Kainerstorfer JM, Ko TS, Licht DJ, Liebert A, Luke R, Lynch JM, Mesquida J, Mesquita RC, Naseer N, Novi SL, Orihuela-Espina F, O’Sullivan TD, Peterka DS, Pifferi A, Pollonini L, Sassaroli A, Sato JR, Scholkmann F, Spinelli L, Srinivasan VJ, St. Lawrence K, Tachtsidis I, Tong Y, Torricelli A, Urner T, Wabnitz H, Wolf M, Wolf U, Xu S, Yang C, Yodh AG, Yücel MA, Zhou W. Optical imaging and spectroscopy for the study of the human brain: status report. NEUROPHOTONICS 2022; 9:S24001. [PMID: 36052058 PMCID: PMC9424749 DOI: 10.1117/1.nph.9.s2.s24001] [Citation(s) in RCA: 55] [Impact Index Per Article: 27.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/14/2023]
Abstract
This report is the second part of a comprehensive two-part series aimed at reviewing an extensive and diverse toolkit of novel methods to explore brain health and function. While the first report focused on neurophotonic tools mostly applicable to animal studies, here, we highlight optical spectroscopy and imaging methods relevant to noninvasive human brain studies. We outline current state-of-the-art technologies and software advances, explore the most recent impact of these technologies on neuroscience and clinical applications, identify the areas where innovation is needed, and provide an outlook for the future directions.
Affiliation(s)
- Hasan Ayaz
- Drexel University, School of Biomedical Engineering, Science, and Health Systems, Philadelphia, Pennsylvania, United States
- Drexel University, College of Arts and Sciences, Department of Psychological and Brain Sciences, Philadelphia, Pennsylvania, United States
| | - Wesley B. Baker
- Children’s Hospital of Philadelphia, Division of Neurology, Philadelphia, Pennsylvania, United States
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, United States
| | - Giles Blaney
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
| | - David A. Boas
- Boston University Neurophotonics Center, Boston, Massachusetts, United States
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
| | - Heather Bortfeld
- University of California, Merced, Departments of Psychological Sciences and Cognitive and Information Sciences, Merced, California, United States
| | - Kenneth Brady
- Lurie Children’s Hospital, Northwestern University Feinberg School of Medicine, Department of Anesthesiology, Chicago, Illinois, United States
| | - Joshua Brake
- Harvey Mudd College, Department of Engineering, Claremont, California, United States
| | - Sabrina Brigadoi
- University of Padua, Department of Developmental and Social Psychology, Padua, Italy
| | - Erin M. Buckley
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
- Emory University School of Medicine, Department of Pediatrics, Atlanta, Georgia, United States
| | - Stefan A. Carp
- Massachusetts General Hospital, Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, Massachusetts, United States
| | - Robert J. Cooper
- University College London, Department of Medical Physics and Bioengineering, DOT-HUB, London, United Kingdom
| | - Kyle R. Cowdrick
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
| | - Joseph P. Culver
- Washington University School of Medicine, Department of Radiology, St. Louis, Missouri, United States
| | - Ippeita Dan
- Chuo University, Faculty of Science and Engineering, Tokyo, Japan
| | - Hamid Dehghani
- University of Birmingham, School of Computer Science, Birmingham, United Kingdom
| | - Anna Devor
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
| | - Turgut Durduran
- ICFO – The Institute of Photonic Sciences, The Barcelona Institute of Science and Technology, Castelldefels, Barcelona, Spain
- Institució Catalana de Recerca I Estudis Avançats (ICREA), Barcelona, Spain
| | - Adam T. Eggebrecht
- Washington University in St. Louis, Mallinckrodt Institute of Radiology, St. Louis, Missouri, United States
| | - Lauren L. Emberson
- University of British Columbia, Department of Psychology, Vancouver, British Columbia, Canada
| | - Qianqian Fang
- Northeastern University, Department of Bioengineering, Boston, Massachusetts, United States
| | - Sergio Fantini
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
| | - Maria Angela Franceschini
- Massachusetts General Hospital, Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, Massachusetts, United States
| | - Jonas B. Fischer
- ICFO – The Institute of Photonic Sciences, The Barcelona Institute of Science and Technology, Castelldefels, Barcelona, Spain
| | - Judit Gervain
- University of Padua, Department of Developmental and Social Psychology, Padua, Italy
- Université Paris Cité, CNRS, Integrative Neuroscience and Cognition Center, Paris, France
| | - Joy Hirsch
- Yale School of Medicine, Department of Psychiatry, Neuroscience, and Comparative Medicine, New Haven, Connecticut, United States
- University College London, Department of Medical Physics and Biomedical Engineering, London, United Kingdom
| | - Keum-Shik Hong
- Pusan National University, School of Mechanical Engineering, Busan, Republic of Korea
- Qingdao University, School of Automation, Institute for Future, Qingdao, China
| | - Roarke Horstmeyer
- Duke University, Department of Biomedical Engineering, Durham, North Carolina, United States
- Duke University, Department of Electrical and Computer Engineering, Durham, North Carolina, United States
- Duke University, Department of Physics, Durham, North Carolina, United States
| | - Jana M. Kainerstorfer
- Carnegie Mellon University, Department of Biomedical Engineering, Pittsburgh, Pennsylvania, United States
- Carnegie Mellon University, Neuroscience Institute, Pittsburgh, Pennsylvania, United States
| | - Tiffany S. Ko
- Children’s Hospital of Philadelphia, Division of Cardiothoracic Anesthesiology, Philadelphia, Pennsylvania, United States
| | - Daniel J. Licht
- Children’s Hospital of Philadelphia, Division of Neurology, Philadelphia, Pennsylvania, United States
| | - Adam Liebert
- Polish Academy of Sciences, Nalecz Institute of Biocybernetics and Biomedical Engineering, Warsaw, Poland
| | - Robert Luke
- Macquarie University, Department of Linguistics, Sydney, New South Wales, Australia
- Macquarie University Hearing, Australia Hearing Hub, Sydney, New South Wales, Australia
| | - Jennifer M. Lynch
- Children’s Hospital of Philadelphia, Division of Cardiothoracic Anesthesiology, Philadelphia, Pennsylvania, United States
| | - Jaume Mesquida
- Parc Taulí Hospital Universitari, Critical Care Department, Sabadell, Spain
| | - Rickson C. Mesquita
- University of Campinas, Institute of Physics, Campinas, São Paulo, Brazil
- Brazilian Institute of Neuroscience and Neurotechnology, Campinas, São Paulo, Brazil
| | - Noman Naseer
- Air University, Department of Mechatronics and Biomedical Engineering, Islamabad, Pakistan
| | - Sergio L. Novi
- University of Campinas, Institute of Physics, Campinas, São Paulo, Brazil
- Western University, Department of Physiology and Pharmacology, London, Ontario, Canada
| | | | - Thomas D. O’Sullivan
- University of Notre Dame, Department of Electrical Engineering, Notre Dame, Indiana, United States
| | - Darcy S. Peterka
- Columbia University, Zuckerman Mind Brain Behaviour Institute, New York, United States
| | | | - Luca Pollonini
- University of Houston, Department of Engineering Technology, Houston, Texas, United States
| | - Angelo Sassaroli
- Tufts University, Department of Biomedical Engineering, Medford, Massachusetts, United States
| | - João Ricardo Sato
- Federal University of ABC, Center of Mathematics, Computing and Cognition, São Bernardo do Campo, São Paulo, Brazil
| | - Felix Scholkmann
- University of Bern, Institute of Complementary and Integrative Medicine, Bern, Switzerland
- University of Zurich, University Hospital Zurich, Department of Neonatology, Biomedical Optics Research Laboratory, Zürich, Switzerland
| | - Lorenzo Spinelli
- National Research Council (CNR), IFN – Institute for Photonics and Nanotechnologies, Milan, Italy
| | - Vivek J. Srinivasan
- University of California Davis, Department of Biomedical Engineering, Davis, California, United States
- NYU Langone Health, Department of Ophthalmology, New York, New York, United States
- NYU Langone Health, Department of Radiology, New York, New York, United States
| | - Keith St. Lawrence
- Lawson Health Research Institute, Imaging Program, London, Ontario, Canada
- Western University, Department of Medical Biophysics, London, Ontario, Canada
| | - Ilias Tachtsidis
- University College London, Department of Medical Physics and Biomedical Engineering, London, United Kingdom
| | - Yunjie Tong
- Purdue University, Weldon School of Biomedical Engineering, West Lafayette, Indiana, United States
| | - Alessandro Torricelli
- Politecnico di Milano, Dipartimento di Fisica, Milan, Italy
- National Research Council (CNR), IFN – Institute for Photonics and Nanotechnologies, Milan, Italy
| | - Tara Urner
- Georgia Institute of Technology, Wallace H. Coulter Department of Biomedical Engineering, Atlanta, Georgia, United States
| | - Heidrun Wabnitz
- Physikalisch-Technische Bundesanstalt (PTB), Berlin, Germany
| | - Martin Wolf
- University of Zurich, University Hospital Zurich, Department of Neonatology, Biomedical Optics Research Laboratory, Zürich, Switzerland
| | - Ursula Wolf
- University of Bern, Institute of Complementary and Integrative Medicine, Bern, Switzerland
| | - Shiqi Xu
- Duke University, Department of Biomedical Engineering, Durham, North Carolina, United States
| | - Changhuei Yang
- California Institute of Technology, Department of Electrical Engineering, Pasadena, California, United States
| | - Arjun G. Yodh
- University of Pennsylvania, Department of Physics and Astronomy, Philadelphia, Pennsylvania, United States
| | - Meryem A. Yücel
- Boston University Neurophotonics Center, Boston, Massachusetts, United States
- Boston University, College of Engineering, Department of Biomedical Engineering, Boston, Massachusetts, United States
| | - Wenjun Zhou
- University of California Davis, Department of Biomedical Engineering, Davis, California, United States
- China Jiliang University, College of Optical and Electronic Technology, Hangzhou, Zhejiang, China
| |
|
13
|
Luft CDB, Zioga I, Giannopoulos A, Di Bona G, Binetti N, Civilini A, Latora V, Mareschal I. Social synchronization of brain activity increases during eye-contact. Commun Biol 2022; 5:412. [PMID: 35508588 PMCID: PMC9068716 DOI: 10.1038/s42003-022-03352-6] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2021] [Accepted: 04/11/2022] [Indexed: 11/23/2022] Open
Abstract
Humans make eye-contact to extract information about other people’s mental states, recruiting dedicated brain networks that process information about the self and others. Recent studies show that eye-contact increases the synchronization between two brains but do not consider its effects on activity within single brains. Here we investigate how eye-contact affects the frequency and direction of the synchronization within and between two brains and the corresponding network characteristics. We also evaluate the functional relevance of eye-contact networks by comparing inter- and intra-brain networks of friends vs. strangers and the direction of synchronization between leaders and followers. We show that eye-contact increases inter- and intra-brain synchronization in the gamma frequency band. Network analysis reveals that some brain areas serve as hubs linking within- and between-brain networks. During eye-contact, friends show higher inter-brain synchronization than strangers. Dyads with clear leader/follower roles demonstrate higher synchronization from leader to follower in the alpha frequency band. Importantly, eye-contact affects synchronization between brains more than within brains, demonstrating that eye-contact is an inherently social signal. Future work should elucidate the causal mechanisms behind eye-contact induced synchronization.
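Band-specific synchronization of the kind described above is commonly quantified with phase-based metrics. As a minimal illustrative sketch (the abstract does not specify the authors' exact metric, and the phase series below are synthetic stand-ins), a phase-locking value (PLV) between two participants' phase time series can be computed as:

```python
import cmath

def phase_locking_value(phases_a, phases_b):
    """Phase-locking value between two phase series (radians).

    PLV = |mean over t of exp(i * (phi_a(t) - phi_b(t)))|: 1.0 means a
    perfectly constant phase lag; values near 0 mean no consistent phase
    relation between the two signals.
    """
    n = len(phases_a)
    acc = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(acc) / n

# Two "brains" oscillating at the same rate with a constant lag: PLV = 1.
locked_a = [0.10 * t for t in range(200)]
locked_b = [0.10 * t + 0.7 for t in range(200)]

# A partner drifting at a different rate: PLV falls toward 0.
drifting_b = [0.23 * t for t in range(200)]

print(round(phase_locking_value(locked_a, locked_b), 3))  # 1.0
print(phase_locking_value(locked_a, drifting_b) < 0.2)    # True
```

In practice the instantaneous phase in a given band (e.g., gamma or alpha) would be obtained from a Hilbert or wavelet transform of each participant's EEG before computing the metric.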
Affiliation(s)
- Caroline Di Bernardi Luft
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London, E1 4NS, United Kingdom.
| | - Ioanna Zioga
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London, E1 4NS, United Kingdom
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
| | - Anastasios Giannopoulos
- School of Electrical and Computer Engineering, National Technical University of Athens (NTUA), Athens, Greece
| | - Gabriele Di Bona
- School of Mathematical Sciences, Queen Mary University of London, London, E1 4NS, United Kingdom
| | - Nicola Binetti
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London, E1 4NS, United Kingdom
| | - Andrea Civilini
- School of Mathematical Sciences, Queen Mary University of London, London, E1 4NS, United Kingdom
| | - Vito Latora
- School of Mathematical Sciences, Queen Mary University of London, London, E1 4NS, United Kingdom
- Dipartimento di Fisica ed Astronomia, Università di Catania and INFN, I-95123, Catania, Italy
- The Alan Turing Institute, The British Library, London, NW1 2DB, United Kingdom
- Complexity Science Hub, Josefstädter Strasse 39, A 1080, Vienna, Austria
| | - Isabelle Mareschal
- School of Biological and Behavioural Sciences, Queen Mary, University of London, London, E1 4NS, United Kingdom
| |
|
14
|
Mundy P, Bullen J. The Bidirectional Social-Cognitive Mechanisms of the Social-Attention Symptoms of Autism. Front Psychiatry 2022; 12:752274. [PMID: 35173636 PMCID: PMC8841840 DOI: 10.3389/fpsyt.2021.752274] [Citation(s) in RCA: 13] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/02/2021] [Accepted: 12/20/2021] [Indexed: 11/13/2022] Open
Abstract
Differences in social attention development begin to be apparent in the 6th to 12th month of development in children with Autism Spectrum Disorder (ASD) and theoretically reflect important elements of its neurodevelopmental endophenotype. This paper examines alternative conceptual views of these early social attention symptoms and hypotheses about the mechanisms involved in their development. One model emphasizes mechanisms involved in the spontaneous allocation of attention to faces, or social orienting. Alternatively, another model emphasizes mechanisms involved in the coordination of attention with other people, or joint attention, and the socially bi-directional nature of its development. This model raises the possibility that atypical responses of children to the attention or the gaze of a social partner directed toward themselves may be as important in the development of social attention symptoms as differences in the development of social orienting. Another model holds that symptoms of social attention may be important to early development, but may not impact older individuals with ASD. The alternative model is that the social attention symptoms in infancy (social orienting and joint attention) and the social cognitive symptoms in childhood and adulthood share common neurodevelopmental substrates. Therefore, differences in early social attention and later social cognition constitute a developmentally continuous axis of symptom presentation in ASD. However, symptoms in older individuals may be best measured with in vivo measures of the efficiency of social attention and social cognition in social interactions rather than the accuracy of response on analog tests used with younger children. Finally, a third model suggests that the social attention symptoms may not truly be a symptom of ASD. Rather, they may be best conceptualized as stemming from differences in domain-general attention and motivation mechanisms.
The alternative argued for here is that infant social attention symptoms meet all the criteria of a unique dimension of the phenotype of ASD and that the bi-directional phenomena involved in social attention cannot be fully explained in terms of domain-general aspects of attention development.
Affiliation(s)
- Peter Mundy
- Department of Learning and Mind Sciences, School of Education, University of California, Davis, Davis, CA, United States
- Department of Psychiatry and Behavioral Science and The MIND Institute, UC Davis School of Medicine, Sacramento, CA, United States
| | - Jenifer Bullen
- Department of Human Development, School of Human Ecology, University of California, Davis, Davis, CA, United States
| |
|
15
|
Park J, Shin J, Jeong J. Inter-Brain Synchrony Levels According to Task Execution Modes and Difficulty Levels: an fNIRS/GSR Study. IEEE Trans Neural Syst Rehabil Eng 2022; 30:194-204. [PMID: 35041606 DOI: 10.1109/tnsre.2022.3144168] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
Hyperscanning is a brain imaging technique that measures brain synchrony caused by social interactions. Recent research on hyperscanning has revealed substantial inter-brain synchrony (IBS), but little is known about the link between IBS and mental workload. To study this link, we conducted an experiment consisting of button-pressing tasks of three different difficulty levels for the cooperation and competition modes with 56 participants aged 23.7±3.8 years (mean±standard deviation). We attempted to observe IBS using functional near-infrared spectroscopy (fNIRS) and galvanic skin response (GSR) to assess the activities of the human autonomic nervous system. We found that the IBS levels increased in a frequency band of 0.075-0.15 Hz, which was unrelated to the task repetition frequency in the cooperation mode according to the task difficulty level. Significant relative inter-brain synchrony (RIBS) increases were observed in three and 10 channels out of 15 for the hard tasks compared to the normal and easy tasks, respectively. We observed that the average GSR values increased with increasing task difficulty levels for the competition mode only. Thus, our results suggest that the IBS revealed by fNIRS and GSR is not related to the hemodynamic changes induced by mental workload, simple behavioral synchrony such as button-pressing timing, or autonomic nervous system activity. IBS is thus explicitly caused by social interactions such as cooperation.
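Coupling between two participants' hemodynamic signals in a narrow band such as the 0.075-0.15 Hz range reported above is, at its simplest, a correlation between band-limited traces. The sketch below is only a toy proxy (the study's actual IBS metric is not detailed in this abstract, and real pipelines typically use coherence or wavelet methods on filtered fNIRS data); the sampling rate and signals are assumptions for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation: the simplest proxy for coupling between two traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

FS = 10.0      # Hz; an assumed fNIRS sampling rate for this toy example
F_TASK = 0.10  # Hz; inside the 0.075-0.15 Hz band reported above
t = [i / FS for i in range(600)]  # 60 s of samples

# Two synthetic hemodynamic traces sharing the same band-limited component,
# differing only in gain and offset (which Pearson r ignores).
brain_a = [math.sin(2 * math.pi * F_TASK * ti) for ti in t]
brain_b = [0.8 * math.sin(2 * math.pi * F_TASK * ti) + 0.05 for ti in t]

print(round(pearson_r(brain_a, brain_b), 3))  # 1.0
```

Because Pearson r is invariant to amplitude scaling and offsets, two dyad members need not have identical response magnitudes to show high synchrony in a given band.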
|
16
|
Wyser DG, Kanzler CM, Salzmann L, Lambercy O, Wolf M, Scholkmann F, Gassert R. Characterizing reproducibility of cerebral hemodynamic responses when applying short-channel regression in functional near-infrared spectroscopy. NEUROPHOTONICS 2022; 9:015004. [PMID: 35265732 PMCID: PMC8901194 DOI: 10.1117/1.nph.9.1.015004] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/11/2021] [Accepted: 02/11/2022] [Indexed: 05/06/2023]
Abstract
Significance: Functional near-infrared spectroscopy (fNIRS) enables the measurement of brain activity noninvasively. Optical neuroimaging with fNIRS has been shown to be reproducible on the group level and hence is an excellent research tool, but the reproducibility on the single-subject level is still insufficient, challenging the use for clinical applications. Aim: We investigated the effect of short-channel regression (SCR) as an approach to obtain fNIRS measurements with higher reproducibility on a single-subject level. SCR simultaneously considers contributions from long- and short-separation channels and removes confounding physiological changes through the regression of the short-separation channel information. Approach: We performed a test-retest study with a hand grasping task in 15 healthy subjects using a wearable fNIRS device, optoHIVE. Relevant brain regions were localized with transcranial magnetic stimulation to ensure correct placement of the optodes. Reproducibility was assessed by intraclass correlation, correlation analysis, mixed effects modeling, and classification accuracy of the hand grasping task. Further, we characterized the influence of SCR on reproducibility. Results: We found a high reproducibility of fNIRS measurements on a single-subject level (ICC(single) = 0.81; correlation r = 0.81). SCR increased the reproducibility (ICC(single)) from 0.64 to 0.81 but did not affect classification (85% overall accuracy). Significant intersubject variability in the reproducibility was observed and was explained by Mayer wave oscillations and low raw signal strength. The raw signal-to-noise ratio (threshold at 40 dB) allowed for distinguishing between persons with weak and strong activations. Conclusions: We report, for the first time, that fNIRS measurements are reproducible on a single-subject level using our optoHIVE fNIRS system and that SCR improves reproducibility.
In addition, we give a benchmark to easily assess the ability of a subject to elicit sufficiently strong hemodynamic responses. With these insights, we pave the way for the reliable use of fNIRS neuroimaging in single subjects for neuroscientific research and clinical applications.
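The core of short-channel regression is to fit the short-separation channel (which samples mostly scalp and systemic physiology) to the long-separation channel and subtract the fit. A minimal single-regressor sketch of that idea is shown below; the signal shapes, sampling rate, and scalar-OLS formulation are illustrative assumptions, not the paper's full pipeline (which operates on HbO/HbR and typically embeds SCR in a GLM):

```python
import math

def short_channel_regression(long_ch, short_ch):
    """Remove scalp/systemic physiology from a long-separation channel by
    regressing out the short-separation channel (scalar OLS on demeaned
    signals). Returns corrected(t) = long(t) - beta * short(t), demeaned.
    """
    n = len(long_ch)
    ml, ms = sum(long_ch) / n, sum(short_ch) / n
    l = [v - ml for v in long_ch]
    s = [v - ms for v in short_ch]
    beta = sum(a * b for a, b in zip(l, s)) / sum(b * b for b in s)
    return [a - beta * b for a, b in zip(l, s)]

t = [i / 10.0 for i in range(300)]  # 30 s at an assumed 10 Hz sampling rate
# 0.1 Hz Mayer-wave-like systemic oscillation, as flagged in the abstract.
systemic = [0.5 * math.sin(2 * math.pi * 0.1 * ti) for ti in t]
# Boxcar "activation" between 10 s and 20 s as the true brain signal.
neural = [1.0 if 10.0 <= ti < 20.0 else 0.0 for ti in t]

short_ch = systemic                                   # short channel: scalp only
long_ch = [a + b for a, b in zip(neural, systemic)]   # long channel: brain + scalp
cleaned = short_channel_regression(long_ch, short_ch)

# The systemic oscillation is regressed away; the boxcar response survives.
print(round(cleaned[105] - cleaned[5], 3))  # 1.0 (activation minus baseline)
```

The same logic explains the paper's observation that Mayer waves hurt reproducibility: whatever systemic variance the short channel fails to capture remains in the long channel as structured noise.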
Affiliation(s)
- Dominik G. Wyser
- ETH Zurich, Rehabilitation Engineering Laboratory, Department of Health Sciences and Technology, Zurich, Switzerland
- University Hospital Zurich, University of Zurich, Biomedical Optics Research Laboratory, Department of Neonatology, Zurich, Switzerland
| | - Christoph M. Kanzler
- ETH Zurich, Rehabilitation Engineering Laboratory, Department of Health Sciences and Technology, Zurich, Switzerland
| | - Lena Salzmann
- ETH Zurich, Rehabilitation Engineering Laboratory, Department of Health Sciences and Technology, Zurich, Switzerland
| | - Olivier Lambercy
- ETH Zurich, Rehabilitation Engineering Laboratory, Department of Health Sciences and Technology, Zurich, Switzerland
| | - Martin Wolf
- University Hospital Zurich, University of Zurich, Biomedical Optics Research Laboratory, Department of Neonatology, Zurich, Switzerland
| | - Felix Scholkmann
- University Hospital Zurich, University of Zurich, Biomedical Optics Research Laboratory, Department of Neonatology, Zurich, Switzerland
- University of Bern, Institute of Complementary and Integrative Medicine, Bern, Switzerland
| | - Roger Gassert
- ETH Zurich, Rehabilitation Engineering Laboratory, Department of Health Sciences and Technology, Zurich, Switzerland
| |
|
17
|
Gilmore N, Yücel MA, Li X, Boas DA, Kiran S. Investigating Language and Domain-General Processing in Neurotypicals and Individuals With Aphasia - A Functional Near-Infrared Spectroscopy Pilot Study. Front Hum Neurosci 2021; 15:728151. [PMID: 34602997 PMCID: PMC8484538 DOI: 10.3389/fnhum.2021.728151] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2021] [Accepted: 08/25/2021] [Indexed: 11/29/2022] Open
Abstract
Brain reorganization patterns associated with language recovery after stroke have long been debated. Studying mechanisms of spontaneous and treatment-induced language recovery in post-stroke aphasia requires a network-based approach given the potential for recruitment of perilesional left hemisphere language regions, homologous right hemisphere language regions, and/or spared bilateral domain-general regions. Recent hardware, software, and methodological advances in functional near-infrared spectroscopy (fNIRS) make it well-suited to examine this question. fNIRS is cost-effective with minimal contraindications, making it a robust option to monitor treatment-related brain activation changes over time. Establishing clear activation patterns in neurotypical adults during language and domain-general cognitive processes via fNIRS is an important first step. Some fNIRS studies have investigated key language processes in healthy adults, yet findings are challenging to interpret in the context of methodological limitations. This pilot study used fNIRS to capture brain activation during language and domain-general processing in neurotypicals and individuals with aphasia. These findings will serve as a reference when interpreting treatment-related changes in brain activation patterns in post-stroke aphasia in the future. Twenty-four young healthy controls, seventeen older healthy controls, and six individuals with left hemisphere stroke-induced aphasia completed two language tasks (i.e., semantic feature, picture naming) and one domain-general cognitive task (i.e., arithmetic) twice during fNIRS. The probe covered bilateral frontal, parietal, and temporal lobes and included short-separation detectors for scalp signal nuisance regression. 
Younger and older healthy controls activated core language regions during semantic feature processing (e.g., left inferior frontal gyrus pars opercularis) and lexical retrieval (e.g., left inferior frontal gyrus pars triangularis) and domain-general regions (e.g., bilateral middle frontal gyri) during hard versus easy arithmetic as expected. Consistent with theories of post-stroke language recovery, individuals with aphasia activated areas outside the traditional networks: left superior frontal gyrus and left supramarginal gyrus during semantic feature judgment; left superior frontal gyrus and right precentral gyrus during picture naming; and left inferior frontal gyrus pars opercularis during arithmetic processing. The preliminary findings in the stroke group highlight the utility of using fNIRS to study language and domain-general processing in aphasia.
Affiliation(s)
- Natalie Gilmore
- Department of Speech Language & Hearing Sciences, Sargent College of Health and Rehabilitation Sciences, Boston University, Boston, MA, United States
| | - Meryem Ayse Yücel
- Neurophotonics Center, Biomedical Engineering, Boston University, Boston, MA, United States
| | - Xinge Li
- Neurophotonics Center, Biomedical Engineering, Boston University, Boston, MA, United States
- Department of Psychology, College of Liberal Arts and Social Sciences, University of Houston, Houston, TX, United States
| | - David A Boas
- Neurophotonics Center, Biomedical Engineering, Boston University, Boston, MA, United States
| | - Swathi Kiran
- Department of Speech Language & Hearing Sciences, Sargent College of Health and Rehabilitation Sciences, Boston University, Boston, MA, United States
| |
|
18
|
Laskowitz S, Griffin JW, Geier CF, Scherf KS. Cracking the Code of Live Human Social Interactions in Autism: A Review of the Eye-Tracking Literature. PROCEEDINGS OF MACHINE LEARNING RESEARCH 2021; 173:242-264. [PMID: 36540356 PMCID: PMC9762806] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Subscribe] [Scholar Register] [Indexed: 06/17/2023]
Abstract
Human social interaction involves a complex, dynamic exchange of verbal and non-verbal information. Over the last decade, eye-tracking technology has afforded unique insight into the way eye gaze information, including both holding gaze and shifting gaze, organizes live human interactions. For example, while playing a social game together, speakers end their turn by directing gaze at the listener, who begins to speak with averted gaze (Ho et al., 2015). These findings reflect how eye gaze can be used to signal important turn-taking transitions in social interactions. Deficits in conversational turn-taking are a core feature of autism spectrum disorders. Individuals on the autism spectrum also have notable difficulties processing eye gaze information (Griffin & Scherf, 2020). A central hypothesis in the literature is that the difficulties in processing eye gaze information are foundational to the social communication deficits that make social interactions so challenging for individuals on the autism spectrum. Although eye-tracking technology has been used extensively to assess the way individuals on the spectrum attend to stimuli presented on computer screens (for review see Papagiannopoulou et al., 2014), it has rarely been used to evaluate the critical question of whether and how autistic individuals process non-verbal social cues from their partners during live social interactions. Here, we review this emerging literature with a focus on characterizing the experimental paradigms and eye-tracking procedures to understand the scope (and limitations) of research questions and findings. We discuss the theoretical implications of the findings from this review and provide recommendations for future work that will be essential to understand whether and how fundamental difficulties in perceiving and processing information about eye gaze cues interfere with social communication skills in autism.
|
19
|
Kelley MS, Noah JA, Zhang X, Scassellati B, Hirsch J. Comparison of Human Social Brain Activity During Eye-Contact With Another Human and a Humanoid Robot. Front Robot AI 2021; 7:599581. [PMID: 33585574 PMCID: PMC7879449 DOI: 10.3389/frobt.2020.599581] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2020] [Accepted: 12/07/2020] [Indexed: 01/17/2023] Open
Abstract
Robot design to simulate interpersonal social interaction is an active area of research with applications in therapy and companionship. Neural responses to eye-to-eye contact in humans have recently been employed to determine the neural systems that are active during social interactions. Whether eye contact with a social robot engages the same neural systems remains an open question. Here, we employ a similar approach to compare human-human and human-robot social interactions. We assume that if human-human and human-robot eye contact elicit similar neural activity in the human, then the perceptual and cognitive processing is also the same for human and robot; that is, the robot is processed similarly to the human. However, if the neural effects are different, then the perceptual and cognitive processing is assumed to be different. In this study, neural activity was compared for human-to-human and human-to-robot conditions using near-infrared spectroscopy for neural imaging, and a robot (Maki) with eyes that blink and move right and left. Eye contact was confirmed by eye-tracking for both conditions. Increased neural activity was observed in human social systems, including the right temporal parietal junction and the dorsolateral prefrontal cortex, during human-human eye contact but not human-robot eye contact. This suggests that the type of human-robot eye contact used here is not sufficient to engage the right temporoparietal junction in the human. This study establishes a foundation for future research into human-robot eye contact to determine how elements of robot design and behavior impact human social processing within this type of interaction, and may offer a method for capturing difficult-to-quantify components of human-robot interaction, such as social engagement.
Affiliation(s)
- Megan S. Kelley: Interdepartmental Neuroscience Program, Yale School of Medicine, New Haven, CT, United States; Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- J. Adam Noah: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- Xian Zhang: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- Brian Scassellati: Social Robotics Laboratory, Department of Computer Science, Yale University, New Haven, CT, United States
- Joy Hirsch: Interdepartmental Neuroscience Program, Yale School of Medicine, New Haven, CT, United States; Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States; Departments of Neuroscience and Comparative Medicine, Yale School of Medicine, New Haven, CT, United States; Department of Medical Physics and Biomedical Engineering, University College London, London, United Kingdom
|
20
|
Hirsch J, Tiede M, Zhang X, Noah JA, Salama-Manteau A, Biriotti M. Interpersonal Agreement and Disagreement During Face-to-Face Dialogue: An fNIRS Investigation. Front Hum Neurosci 2021; 14:606397. [PMID: 33584223 PMCID: PMC7874076 DOI: 10.3389/fnhum.2020.606397] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2020] [Accepted: 12/15/2020] [Indexed: 01/03/2023] Open
Abstract
Although the neural systems that underlie spoken language are well known, how they adapt to evolving social cues during natural conversations remains an unanswered question. In this work, we investigate the neural correlates of face-to-face conversations between two individuals using functional near-infrared spectroscopy (fNIRS) and acoustical analyses of concurrent audio recordings. Nineteen pairs of healthy adults engaged in live discussions on two controversial topics where their opinions were either in agreement or disagreement. Participants were matched according to their a priori opinions on these topics as assessed by questionnaire. Acoustic measures of the recorded speech, including the fundamental frequency range, median fundamental frequency, syllable rate, and acoustic energy, were elevated during disagreement relative to agreement. Consistent with both the a priori opinion ratings and the acoustic findings, neural activity associated with long-range functional networks, rather than the canonical language areas, was also differentiated by the two conditions. Specifically, the frontoparietal system, including bilateral dorsolateral prefrontal cortex, left supramarginal gyrus, angular gyrus, and superior temporal gyrus, showed increased activity while talking during disagreement. In contrast, talking during agreement was characterized by increased activity in a social and attention network including right supramarginal gyrus, bilateral frontal eye fields, and left frontopolar regions. Further, these social and visual attention networks were more synchronous across brains during agreement than disagreement. Rather than localized modulation of the canonical language system, these findings are most consistent with a model of distributed and adaptive language-related processes, including cross-brain neural coupling, that serves dynamic verbal exchanges.
Affiliation(s)
- Joy Hirsch: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States; Department of Neuroscience, Yale School of Medicine, New Haven, CT, United States; Department of Comparative Medicine, Yale School of Medicine, New Haven, CT, United States; Haskins Laboratories, New Haven, CT, United States; Department of Medical Physics and Biomedical Engineering, University College London, London, United Kingdom
- Mark Tiede: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States; Haskins Laboratories, New Haven, CT, United States
- Xian Zhang: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- J Adam Noah: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- Alexandre Salama-Manteau: Brain Function Laboratory, Department of Psychiatry, Yale School of Medicine, New Haven, CT, United States
- Maurice Biriotti: Faculty of Arts and Humanities, University College London, London, United Kingdom
|