1
Sato W, Shimokawa K, Uono S, Minato T. Mentalistic attention orienting triggered by android eyes. Sci Rep 2024; 14:23143. [PMID: 39367157] [PMCID: PMC11452688] [DOI: 10.1038/s41598-024-75063-3] [Received: 05/31/2024] [Accepted: 10/01/2024] [Indexed: 10/06/2024]
Abstract
The eyes play a special role in human communication. Previous psychological studies have reported reflexive attention orienting in response to another individual's eyes during live interactions. Although robots are expected to collaborate with humans in various social situations, it remains unclear whether robot eyes can trigger attention orienting similarly to human eyes, specifically based on mental attribution. We investigated this issue in a series of experiments using a live gaze-cueing paradigm with an android. In Experiment 1, the non-predictive cue was the eyes and head of an android placed in front of human participants. Light-emitting diodes in the periphery served as target signals. Reaction times (RTs) to localize validly cued targets were faster than those for invalidly cued targets for both types of cues. In Experiment 2, the gaze direction of the android's eyes changed before the peripheral target lights appeared, with or without barriers that made the targets non-visible, such that the android could not attend to them. RTs were faster for validly cued targets only when there were no barriers. In Experiment 3, the targets were changed from lights to sounds, which the android could attend to even in the presence of barriers. RTs to the target sounds were faster with valid cues, irrespective of the presence of barriers. These results suggest that android eyes may automatically induce attention orienting in humans based on mental state attribution.
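The validity effect this abstract reports is the standard gaze-cueing contrast: mean RT on invalidly cued trials minus mean RT on validly cued trials, with a positive difference indicating orienting toward the gazed-at location. A minimal sketch of that computation (not the authors' analysis code; the trial data and field names are illustrative):

```python
from statistics import mean

def cueing_effect(trials):
    """Mean RT (ms) on invalid-cue trials minus mean RT on valid-cue
    trials; positive values indicate orienting toward the cued side."""
    valid = [t["rt"] for t in trials if t["valid"]]
    invalid = [t["rt"] for t in trials if not t["valid"]]
    return mean(invalid) - mean(valid)

# Illustrative trials: faster localization when the android's gaze
# matched the target side (valid) than when it did not (invalid).
trials = [
    {"valid": True, "rt": 310}, {"valid": True, "rt": 325},
    {"valid": False, "rt": 355}, {"valid": False, "rt": 340},
]
print(cueing_effect(trials))  # → 30.0
```

The same contrast applies whether the cue is a human face or, as here, android eyes; only the cue stimulus changes.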
Affiliation(s)
- Wataru Sato
- Psychological Process Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 619-0288, Japan
- Koh Shimokawa
- Psychological Process Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 619-0288, Japan
- Shota Uono
- Division of Disability Sciences, Institute of Human Sciences, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, 305-8572, Ibaraki, Japan
- Takashi Minato
- Interactive Robot Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 619-0288, Japan
2
Cavadini T, Riviere E, Gentaz E. An Eye-Tracking Study on Six Early Social-Emotional Abilities in Children Aged 1 to 3 Years. Children (Basel) 2024; 11:1031. [PMID: 39201965] [PMCID: PMC11352975] [DOI: 10.3390/children11081031] [Received: 06/18/2024] [Revised: 07/29/2024] [Accepted: 08/12/2024] [Indexed: 09/03/2024]
Abstract
BACKGROUND The experimental evaluation of young children's socio-emotional abilities is limited by the lack of specific measures for this population and by the relative difficulty of adapting measures designed for the general population. METHODS This study examined six early social-emotional abilities in 86 typically developing children aged 1 to 3 years using an eye-tracking-based experimental paradigm that combined visual preference tasks adapted from pre-existing infant studies. OBJECTIVES The aim of this study was to obtain developmental norms for six early social-emotional abilities in typically developing children aged 1 to 3 years, which would be promising for the understanding of disorders of mental development. These developmental standards are essential for comparative assessments with children with atypical development, such as children with Profound Intellectual and Multiple Disabilities (PIMD). RESULTS The participants had greater spontaneous visual preferences for biological (vs. non-biological) motion, socially salient (vs. non-social) stimuli, the eye (vs. mouth) area of emotional expressions, angry (vs. happy) faces, and objects of joint attention (vs. non-looked-at ones). Interestingly, although the prosocial (vs. antisocial) scene of the socio-moral task was preferred, the helper and hinderer characters were gazed at equally. Finally, correlational analyses revealed that performances were related neither to participants' age nor to one another, arguing against a single common underpinning process. CONCLUSION Our revised experimental paradigm is feasible with children aged 1 to 3 years and thus provides additional evidence for the direct assessment of these six socio-emotional abilities in this population.
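Visual-preference paradigms of the kind described are typically scored as the proportion of total looking time spent on one member of a stimulus pair, with 0.5 indicating no preference. The paper's exact scoring formula is not given in this abstract, so the following is a generic sketch with illustrative numbers:

```python
def preference_score(looking_ms_target, looking_ms_other):
    """Proportion of total looking time spent on the target stimulus
    of a paired visual-preference display; 0.5 means no preference."""
    total = looking_ms_target + looking_ms_other
    if total == 0:
        return None  # no usable gaze data on this trial
    return looking_ms_target / total

# Illustrative: 3200 ms on the eye region vs 1800 ms on the mouth
# region of an emotional face suggests a preference for the eyes.
print(preference_score(3200, 1800))  # → 0.64
```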
Affiliation(s)
- Thalia Cavadini
- Department of Psychology, University of Geneva, 1205 Geneva, Switzerland
- Elliot Riviere
- Department of Psychology, University of Geneva, 1205 Geneva, Switzerland
- Univ. Lille, ULR 4072–PSITEC–Psychologie: Interactions Temps Emotions Cognition, F-59000 Lille, France
- Edouard Gentaz
- Department of Psychology, University of Geneva, 1205 Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, 1205 Geneva, Switzerland
- Centre National de la Recherche Scientifique, F-38400 Grenoble, France
3
Fu X, Franchak JM, MacNeill LA, Gunther KE, Borjon JI, Yurkovic-Harding J, Harding S, Bradshaw J, Pérez-Edgar KE. Implementing mobile eye tracking in psychological research: A practical guide. Behav Res Methods 2024:10.3758/s13428-024-02473-6. [PMID: 39147949] [DOI: 10.3758/s13428-024-02473-6] [Accepted: 06/20/2024] [Indexed: 08/17/2024]
Abstract
Eye tracking provides direct, temporally and spatially sensitive measures of eye gaze. It can capture visual attention patterns from infancy through adulthood. However, commonly used screen-based eye tracking (SET) paradigms are limited in their depiction of how individuals process information as they interact with the environment in "real life". Mobile eye tracking (MET) records participant-perspective gaze in the context of active behavior. Recent technological developments in MET hardware enable researchers to capture egocentric vision as early as infancy and across the lifespan. However, challenges remain in MET data collection, processing, and analysis. The present paper aims to provide an introduction and practical guide for researchers new to the field, to facilitate the use of MET in psychological research with a wide range of age groups. First, we provide a general introduction to MET. Next, we briefly review MET studies in adults and children that provide new insights into attention and its roles in cognitive and socioemotional functioning. We then discuss technical issues relating to MET data collection and provide guidelines for data quality inspection, gaze annotations, data visualization, and statistical analyses. Lastly, we conclude by discussing future directions for MET implementation. Open-source programs for MET data quality inspection, data visualization, and analysis are shared publicly.
Affiliation(s)
- Xiaoxue Fu
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- John M Franchak
- Department of Psychology, University of California Riverside, Riverside, CA, USA
- Leigha A MacNeill
- Department of Medical Social Sciences, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Institute for Innovations in Developmental Sciences, Northwestern University, Evanston, IL, USA
- Kelley E Gunther
- Neuroscience and Cognitive Science Program, University of Maryland, College Park, MD, USA
- Jeremy I Borjon
- Department of Psychology, University of Houston, Houston, TX, USA
- Texas Institute for Measurement, Evaluation, and Statistics, University of Houston, Houston, TX, USA
- Texas Center for Learning Disorders, University of Houston, Houston, TX, USA
- Samuel Harding
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Jessica Bradshaw
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Koraly E Pérez-Edgar
- Department of Psychology, The Pennsylvania State University, University Park, PA, USA
4
Tepencelik ON, Wei W, Luo M, Cosman P, Dey S. Behavioral Intervention for Adults With Autism on Distribution of Attention in Triadic Conversations: A/B-Tested Pre-Post Study. JMIR Form Res 2024; 8:e55339. [PMID: 39133914] [PMCID: PMC11347890] [DOI: 10.2196/55339] [Received: 12/10/2023] [Revised: 04/17/2024] [Accepted: 05/27/2024] [Indexed: 08/30/2024]
Abstract
BACKGROUND Cross-neurotype differences in social communication patterns contribute to high unemployment rates among adults with autism. Adults with autism can be unsuccessful in job searches or terminated from employment due to mismatches between their social attention behaviors and society's expectations on workplace communication. OBJECTIVE We propose a behavioral intervention concerning distribution of attention in triadic (three-way) conversations. Specifically, the objective is to determine whether providing personalized feedback to each individual with autism based on an analysis of their attention distribution behavior during an initial conversation session would cause them to modify their orientation behavior in a subsequent conversation session. METHODS Our system uses an unobtrusive head orientation estimation model to track the focus of attention of each individual. Head orientation sequences from a conversation session are analyzed based on five statistical domains (eg, maximum exclusion duration and average contact duration) representing different types of attention distribution behavior. An intervention is provided to a participant if they exceeded the nonautistic average for that behavior by at least 2 SDs. The intervention uses data analysis and video modeling along with a constructive discussion about the targeted behaviors. Twenty-four individuals with autism with no intellectual disabilities participated in the study. The participants were divided into test and control groups of 12 participants each. RESULTS Based on their attention distribution behavior in the initial conversation session, 11 of the 12 participants in the test group received an intervention in at least one domain. Of the 11 participants who received the intervention, 10 showed improvement in at least one domain on which they received feedback. Independent t tests for larger test groups (df>15) confirmed that the group improvements are statistically significant compared with the corresponding controls (P<.05). Crawford-Howell t tests confirmed that 78% of the interventions resulted in significant improvements when compared individually against corresponding controls (P<.05). Additional t tests comparing the first conversation sessions of the test and control groups and comparing the first and second conversation sessions of the control group resulted in nonsignificant differences, pointing to the intervention being the main effect behind the behavioral changes displayed by the test group, as opposed to confounding effects or group differences. CONCLUSIONS Our proposed behavioral intervention offers a useful framework for practicing social attention behavior in multiparty conversations that are common in social and professional settings.
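The intervention trigger described above (exceeding the nonautistic average for a behavioral domain by at least 2 SDs) is a simple z-score threshold. A minimal sketch of that decision rule, with entirely illustrative norms and scores (the paper's actual domain values are not reproduced here):

```python
def domains_to_target(scores, norms, z_threshold=2.0):
    """Return the behavioral domains in which a participant exceeds
    the comparison-group mean by at least `z_threshold` SDs."""
    flagged = []
    for domain, value in scores.items():
        mu, sd = norms[domain]
        if value >= mu + z_threshold * sd:
            flagged.append(domain)
    return flagged

# Illustrative norms: (nonautistic mean, SD) per domain, in seconds.
norms = {"max_exclusion_duration": (8.0, 2.0),
         "avg_contact_duration": (3.0, 1.0)}
scores = {"max_exclusion_duration": 14.5,  # 3.25 SDs above the mean
          "avg_contact_duration": 3.5}     # within 2 SDs
print(domains_to_target(scores, norms))  # → ['max_exclusion_duration']
```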
Affiliation(s)
- Onur Necip Tepencelik
- Electrical and Computer Engineering, University of California San Diego, La Jolla, CA, United States
- Wenchuan Wei
- Electrical and Computer Engineering, University of California San Diego, La Jolla, CA, United States
- Mirabel Luo
- Carlsbad High School, Carlsbad, CA, United States
- Pamela Cosman
- Electrical and Computer Engineering, University of California San Diego, La Jolla, CA, United States
- Sujit Dey
- Electrical and Computer Engineering, University of California San Diego, La Jolla, CA, United States
5
Kroczek LOH, Lingnau A, Schwind V, Wolff C, Mühlberger A. Observers predict actions from facial emotional expressions during real-time social interactions. Behav Brain Res 2024; 471:115126. [PMID: 38950784] [DOI: 10.1016/j.bbr.2024.115126] [Received: 10/30/2023] [Revised: 06/07/2024] [Accepted: 06/19/2024] [Indexed: 07/03/2024]
Abstract
In face-to-face social interactions, emotional expressions provide insights into the mental state of an interactive partner. This information can be crucial for inferring action intentions and reacting to another person's actions. Here we investigate how facial emotional expressions impact subjective experience and physiological and behavioral responses to social actions during real-time interactions. Thirty-two participants interacted with virtual agents while fully immersed in Virtual Reality. Agents displayed an angry or happy facial expression before directing an appetitive (fist bump) or aversive (punch) social action towards the participant. Participants responded to these actions either by reciprocating the fist bump or by defending against the punch. For all interactions, subjective experience was measured using ratings. In addition, physiological responses (electrodermal activity, electrocardiogram) and participants' response times were recorded. Aversive actions were judged to be more arousing and less pleasant than appetitive actions. In addition, angry expressions increased heart rate relative to happy expressions. Crucially, interaction effects between facial emotional expression and action were observed. Angry expressions reduced pleasantness more strongly for appetitive than for aversive actions. Furthermore, skin conductance responses to aversive actions were larger for happy than for angry expressions, and reaction times to aversive actions were faster than to appetitive actions when agents showed an angry expression. These results indicate that observers used facial emotional expressions to generate expectations about particular actions. Consequently, the present study demonstrates that observers integrate information from facial emotional expressions with actions during social interactions.
Affiliation(s)
- Leon O H Kroczek
- Department of Psychology, Clinical Psychology and Psychotherapy, University of Regensburg, Regensburg, Germany
- Angelika Lingnau
- Department of Psychology, Cognitive Neuroscience, University of Regensburg, Regensburg, Germany
- Valentin Schwind
- Human Computer Interaction, Frankfurt University of Applied Sciences, Frankfurt am Main, Germany
- Department of Media Informatics, University of Regensburg, Regensburg, Germany
- Christian Wolff
- Department of Media Informatics, University of Regensburg, Regensburg, Germany
- Andreas Mühlberger
- Department of Psychology, Clinical Psychology and Psychotherapy, University of Regensburg, Regensburg, Germany
6
Msika EF, Despres M, Piolino P, Narme P. Dynamic and/or multimodal assessments for social cognition in neuropsychology: Results from a systematic literature review. Clin Neuropsychol 2024; 38:922-962. [PMID: 37904259] [DOI: 10.1080/13854046.2023.2266172] [Received: 03/21/2023] [Accepted: 09/27/2023] [Indexed: 11/01/2023]
Abstract
Objective: Despite the prevalence of socio-cognitive disturbances, and their important diagnostic/therapeutic implications, the assessment of these disturbances remains scarce. This systematic review aims to identify available social cognition tools for adult assessment that use multimodal and/or dynamic social cues, specifying their strengths and limitations (e.g. from a methodological, psychometric, ecological, and clinical perspective). Method: An electronic search was conducted in the PubMed, PsycINFO, Embase and Scopus databases for articles published up to the 3rd of January 2023, along with the first 200 Google Scholar results on the same date. The PRISMA methodology was applied: 3884 studies were screened based on title and abstract, and 329 full texts were screened. Articles using pseudo-dynamic methodologies (e.g. morphing), reporting only subjective or self-reported measures, or investigating only physiological or brain activity responses were excluded. Results: In total, 149 works were included in this review, representing 65 assessment tools (48% studying emotion recognition (n = 31), 32% Theory of Mind (n = 21), 5% empathy (n = 3), 1.5% moral cognition/social reasoning (n = 1), and 14% being multimodal (n = 9)). For each study, the tool's main characteristics, psychometric properties, ecological validity indicators and available norms are reported. The tools are presented according to the social-cognitive process assessed and the communication channels used. Conclusions: This study highlights the lack of validated and standardized tools. A few tools appear to partially meet some clinical needs. The development of methodologies using a first-person paradigm and taking into account the multidimensional nature of social cognition seems a relevant research endeavour for greater ecological validity.
Affiliation(s)
- Eva-Flore Msika
- Laboratoire Mémoire, Cerveau et Cognition, Université Paris Cité, Boulogne-Billancourt, France
- Mathilde Despres
- Laboratoire Mémoire, Cerveau et Cognition, Université Paris Cité, Boulogne-Billancourt, France
- Pascale Piolino
- Laboratoire Mémoire, Cerveau et Cognition, Université Paris Cité, Boulogne-Billancourt, France
- Pauline Narme
- Laboratoire Mémoire, Cerveau et Cognition, Université Paris Cité, Boulogne-Billancourt, France
7
Masters-Waage TC, Kinias Z, Argueta-Rivera J, Stewart D, Ivany R, King E, Hebl M. Social inattentional blindness to idea stealing in meetings. Sci Rep 2024; 14:8060. [PMID: 38580682] [PMCID: PMC10997580] [DOI: 10.1038/s41598-024-56905-6] [Received: 08/24/2023] [Accepted: 03/11/2024] [Indexed: 04/07/2024]
Abstract
Using a virtual reality social experiment, participants (N = 154) experienced being at the table during a decision-making meeting and identified the best solutions generated. During the meeting, one meeting participant repeated another participant's idea, presenting it as his own. Although this idea stealing was clearly visible and audible, only 30% of participants correctly identified who shared the idea first. Subsequent analyses suggest that the social environment affected this novel form of inattentional blindness. Although there was no experimental effect of team diversity on noticing, there was correlational evidence of an indirect effect of perceived team status on noticing via attentional engagement. In sum, this paper extends the inattentional blindness phenomenon to a realistic professional interaction and demonstrates how features of the social environment can reduce social inattention.
8
Martinez-Cedillo AP, Foulsham T. Don't look now! Social elements are harder to avoid during scene viewing. Vision Res 2024; 216:108356. [PMID: 38184917] [DOI: 10.1016/j.visres.2023.108356] [Received: 06/27/2023] [Revised: 11/09/2023] [Accepted: 12/28/2023] [Indexed: 01/09/2024]
Abstract
Regions of social importance (i.e., other people) attract attention in real world scenes, but it is unclear how automatic this bias is and how it might interact with other guidance factors. To investigate this, we recorded eye movements while participants were explicitly instructed to avoid looking at one of two objects in a scene (either a person or a non-social object). The results showed that, while participants could follow these instructions, they still made errors (especially on the first saccade). Crucially, there were about twice as many erroneous looks towards the person than there were towards the other object. This indicates that it is hard to suppress the prioritization of social information during scene viewing, with implications for how quickly and automatically this information is perceived and attended to.
Affiliation(s)
- A P Martinez-Cedillo
- Department of Psychology, University of York, York YO10 5DD, England
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, Essex CO4 3SQ, England
- T Foulsham
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, Essex CO4 3SQ, England
9
Moreno-Verdú M, Hamoline G, Van Caenegem EE, Waltzing BM, Forest S, Valappil AC, Khan AH, Chye S, Esselaar M, Campbell MJ, McAllister CJ, Kraeutner SN, Poliakoff E, Frank C, Eaves DL, Wakefield C, Boe SG, Holmes PS, Bruton AM, Vogt S, Wright DJ, Hardwick RM. Guidelines for reporting action simulation studies (GRASS): Proposals to improve reporting of research in motor imagery and action observation. Neuropsychologia 2024; 192:108733. [PMID: 37956956] [DOI: 10.1016/j.neuropsychologia.2023.108733] [Received: 05/15/2023] [Revised: 10/10/2023] [Accepted: 11/08/2023] [Indexed: 11/21/2023]
Abstract
Researchers from multiple disciplines have studied the simulation of actions through motor imagery, action observation, or their combination. Procedures used in these studies vary considerably between research groups, and no standardized approach to reporting experimental protocols has been proposed. This has led to under-reporting of critical details, impairing the assessment, replication, synthesis, and potential clinical translation of effects. We provide an overview of issues related to the reporting of information in action simulation studies, and discuss the benefits of standardized reporting. We propose a series of checklists that identify key details of research protocols to include when reporting action simulation studies. Each checklist comprises A) essential methodological details, B) essential details that are relevant to a specific mode of action simulation, and C) further points that may be useful on a case-by-case basis. We anticipate that the use of these guidelines will improve the understanding, reproduction, and synthesis of studies using action simulation, and enhance the translation of research using motor imagery and action observation to applied and clinical settings.
Affiliation(s)
- Marcos Moreno-Verdú
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
- Department of Radiology, Rehabilitation and Physiotherapy, Complutense University of Madrid, Spain
- Gautier Hamoline
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
- Elise E Van Caenegem
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
- Baptiste M Waltzing
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
- Sébastien Forest
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
- Ashika C Valappil
- Simulating Movements to Improve Learning and Execution (SMILE) Research Group, School of Life and Health Sciences, University of Roehampton, UK
- Adam H Khan
- Simulating Movements to Improve Learning and Execution (SMILE) Research Group, School of Life and Health Sciences, University of Roehampton, UK
- Samantha Chye
- Simulating Movements to Improve Learning and Execution (SMILE) Research Group, School of Life and Health Sciences, University of Roehampton, UK
- Maaike Esselaar
- Research Centre for Musculoskeletal Science and Sports Medicine, Department of Sport and Exercise Sciences, Faculty of Science and Engineering, Manchester Metropolitan University, UK
- Mark J Campbell
- Lero Esports Science Research Lab, Physical Education & Sport Sciences Department & Lero the Science Foundation Ireland Centre for Software Research, University of Limerick, Ireland
- Craig J McAllister
- Centre for Human Brain Health, School of Sport Exercise and Rehabilitation Sciences, University of Birmingham, UK
- Sarah N Kraeutner
- Neuroplasticity, Imagery, And Motor Behaviour Laboratory, Department of Psychology & Djavad Mowafaghian Centre for Brain Health, University of British Columbia, Okanagan, Canada
- Ellen Poliakoff
- Body Eyes and Movement (BEAM) Laboratory, Faculty of Biology, Medicine and Health, University of Manchester, UK
- Cornelia Frank
- Cognition, Imagery and Learning in Action Laboratory, Department of Sports and Movement Science, School of Educational and Cultural Studies, Osnabrueck University, Germany
- Daniel L Eaves
- Biomedical, Nutritional and Sport Sciences, Faculty of Medical Sciences, Newcastle University, UK
- Shaun G Boe
- Laboratory for Brain Recovery and Function, School of Physiotherapy and Department of Psychology and Neuroscience, Dalhousie University, Canada
- Paul S Holmes
- Research Centre for Health, Psychology and Communities, Department of Psychology, Faculty of Health and Education, Manchester Metropolitan University, UK
- Adam M Bruton
- Simulating Movements to Improve Learning and Execution (SMILE) Research Group, School of Life and Health Sciences, University of Roehampton, UK
- Centre for Cognitive and Clinical Neuroscience, College of Health, Medicine and Life Sciences, Brunel University London, UK
- Stefan Vogt
- Perception and Action Group, Department of Psychology, Lancaster University, UK
- David J Wright
- Research Centre for Health, Psychology and Communities, Department of Psychology, Faculty of Health and Education, Manchester Metropolitan University, UK
- Robert M Hardwick
- Brain, Action, And Skill Laboratory, Institute of Neuroscience (Cognition and Systems Division), UC Louvain, Belgium
10
Fernandes EG, Tatler BW, Slessor G, Phillips LH. Age Differences in Gaze Following: Older Adults Follow Gaze More than Younger Adults When Free-Viewing Scenes. Exp Aging Res 2024; 50:84-101. [PMID: 36572660] [DOI: 10.1080/0361073x.2022.2156760] [Received: 06/29/2022] [Accepted: 11/22/2022] [Indexed: 12/28/2022]
Abstract
Previous research investigated age differences in gaze following with an attentional cueing paradigm where participants view a face with averted gaze, and then respond to a target appearing in a location congruent or incongruent with the gaze cue. However, this paradigm is far removed from the way we use gaze cues in everyday settings. Here we recorded the eye movements of younger and older adults while they freely viewed naturalistic scenes where a person looked at an object or location. Older adults were more likely to fixate and made more fixations to the gazed-at location, compared to younger adults. Our findings suggest that, contrary to what was observed in the traditional gaze-cueing paradigm, in a non-constrained task that uses contextualized stimuli older adults follow gaze as much as or even more than younger adults.
Affiliation(s)
- Eunice G Fernandes
- Department of Foreign Languages and Translation, Universitet i Agder, Kristiansand, Norway
- School of Psychology, University of Aberdeen, Aberdeen, UK
11
Watson MR, Traczewski N, Dunghana S, Boroujeni KB, Neumann A, Wen X, Womelsdorf T. A Multi-task Platform for Profiling Cognitive and Motivational Constructs in Humans and Nonhuman Primates. bioRxiv [Preprint] 2023:2023.11.09.566422. [PMID: 38014107] [PMCID: PMC10680597] [DOI: 10.1101/2023.11.09.566422] [Indexed: 11/29/2023]
Abstract
Background: Understanding the neurobiological substrates of psychiatric disorders requires comprehensive evaluations of cognitive and motivational functions in preclinical research settings. The translational validity of such evaluations will be supported by (1) tasks with high construct validity that are engaging and easy to teach to human and nonhuman participants, (2) software that enables efficient switching between multiple tasks in single sessions, (3) software that supports tasks across a broad range of physical experimental setups, and (4) platform architectures that are easily extendable and customizable to encourage future optimization and development. New Method: We describe the Multi-task Universal Suite for Experiments (M-USE), a software platform designed to meet these requirements. It leverages the Unity video game engine and C# programming language to (1) support immersive and engaging tasks for humans and nonhuman primates, (2) allow experimenters or participants to switch between multiple tasks within a session, (3) generate builds that function across computers, tablets, and websites, and (4) be freely available online with documentation and tutorials for users and developers. M-USE includes a task library with seven pre-existing tasks assessing the cognitive and motivational constructs of perception, attention, working memory, cognitive flexibility, motivational and affective self-control, relational long-term memory, and visuo-spatial problem solving. Results: M-USE was used to test NHPs on up to six tasks per session, all available as part of the Task Library, and to extract performance metrics for all major cognitive and motivational constructs spanning the Research Domain Criteria (RDoC) of the National Institute of Mental Health. Comparison with Existing Methods: Other experiment design and control systems exist, but do not provide the full range of features available in M-USE, including a pre-existing task library for cross-species assessments; the ability to switch seamlessly between tasks in individual sessions; cross-platform build capabilities; license-free availability; and its leveraging of video-engine capabilities used to gamify tasks. Conclusions: The new multi-task platform facilitates cross-species translational research for understanding the neurobiological substrates of higher cognitive and motivational functions.
12
Pan H, Chen Z, Jospe K, Gao Q, Sheng J, Gao Z, Perry A. Mood congruency affects physiological synchrony but not empathic accuracy in a naturalistic empathy task. Biol Psychol 2023; 184:108720. [PMID: 37952694] [DOI: 10.1016/j.biopsycho.2023.108720] [Received: 11/21/2022] [Revised: 11/05/2023] [Accepted: 11/08/2023] [Indexed: 11/14/2023]
Abstract
Empathy is a crucial aspect of our daily lives, as it enhances our wellbeing and is a proxy for prosocial behavior. It encompasses two related but partially distinct components: cognitive and affective empathy. Both are susceptible to context, biases and an individual's physiological state. Few studies have explored the effects of a person's mood on these empathy components, and results are mixed. The current study takes advantage of an ecological, naturalistic empathy task - the empathic accuracy (EA) task - in combination with physiological measurements to examine and differentiate between the effects of one's mood on both empathy components. Participants were induced with positive or negative mood and presented videos of targets narrating autobiographical negative stories, selected from a Chinese empathy dataset that we developed (now publicly available). The stories were conveyed in audio-only, visual-only and full-video formats. Participants rated the target's emotional state while watching or listening to their stories, and physiological measures were taken throughout the process. Importantly, similar measures were taken from the targets when they narrated the stories, allowing a comparison between participants' and targets' measures. We found that in audio-only and visual-only conditions, participants whose moods were congruent with the target showed higher physiological synchrony than those with incongruent mood, implying a mood-congruency effect on affective empathy. However, there was no mood effect on empathic accuracy (reflecting cognitive empathy), suggesting a different influence of mood on the two empathy components.
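The two measures contrasted in this abstract reduce to correlations between time series: empathic accuracy is commonly operationalized as the agreement between a participant's continuous emotion ratings and the target's own ratings, and physiological synchrony as windowed correlation between two physiological signals. The sketch below shows one common way to compute such measures; the function names, the window length, and the exact metric are illustrative assumptions, not the authors' actual pipeline.

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx > 0 and sy > 0 else 0.0

def empathic_accuracy(participant_ratings, target_ratings):
    # Cognitive-empathy proxy: agreement between the participant's continuous
    # ratings of the target's emotion and the target's own ratings.
    return pearson(participant_ratings, target_ratings)

def physiological_synchrony(signal_a, signal_b, window=50):
    # Affective-empathy proxy: average windowed correlation between two
    # physiological time series; the window length here is arbitrary.
    rs = [pearson(signal_a[i:i + window], signal_b[i:i + window])
          for i in range(0, len(signal_a) - window + 1, window)]
    return sum(rs) / len(rs) if rs else 0.0
```

Windowing matters for the synchrony measure because physiological coupling is typically transient; a single whole-recording correlation would wash out brief episodes of alignment.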
Affiliation(s)
- Hanxi Pan
- Department of Psychology and Behavioral Sciences, Zhejiang University, China
- Zhiyun Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University, China
- Karine Jospe
- Department of Psychology, The Hebrew University of Jerusalem, Israel; Department of Psychology, Tel-Aviv University, Israel
- Qi Gao
- Department of Psychology and Behavioral Sciences, Zhejiang University, China
- Jinyou Sheng
- Department of Psychology and Behavioral Sciences, Zhejiang University, China
- Zaifeng Gao
- Department of Psychology and Behavioral Sciences, Zhejiang University, China
- Anat Perry
- Department of Psychology, The Hebrew University of Jerusalem, Israel
13
López B, Gregory NJ, Freeth M. Social attention patterns of autistic and non-autistic adults when viewing real versus reel people. Autism 2023; 27:2372-2383. [PMID: 36995032 PMCID: PMC10576900 DOI: 10.1177/13623613231162156] [Indexed: 03/31/2023]
Abstract
LAY ABSTRACT Early research suggested that autistic adults do not attend to faces as much as non-autistic adults. However, some recent studies in which autistic people are placed in scenarios with real people reveal that they attend to faces as much as non-autistic people. This study compared attention to faces in two situations. In one, autistic and non-autistic adults watched a pre-recorded video. In the other, they watched what they thought were two people in a room in the same building, viewed via a live webcam, when in fact exactly the same video was shown in both situations. We report the results of 32 autistic adults and 33 non-autistic adults. Autistic adults did not differ in any way from non-autistic adults when they watched what they believed were people interacting in real time. However, when they thought they were watching a video, non-autistic participants showed higher levels of attention to faces than autistic participants. We conclude that attention to social stimuli is the result of a combination of two processes: one innate, which seems to be different in autism, and one that is influenced by social norms, which works in the same way in autistic adults without learning disabilities. The results suggest that social attention is not as different in autism as first thought. Specifically, the study helps to dispel long-standing deficit models of social attention in autism, as it points to subtle differences in the use of social norms rather than impairments.
14
Skripkauskaite S, Mihai I, Koldewyn K. Attentional bias towards social interactions during viewing of naturalistic scenes. Q J Exp Psychol (Hove) 2023; 76:2303-2311. [PMID: 36377819 PMCID: PMC10503253 DOI: 10.1177/17470218221140879] [Received: 06/01/2022] [Revised: 09/30/2022] [Accepted: 11/04/2022] [Indexed: 09/16/2023]
Abstract
Human visual attention is readily captured by the social information in scenes. Multiple studies have shown that social areas of interest (AOIs) such as faces and bodies attract more attention than non-social AOIs (e.g., objects or background). However, whether this attentional bias is moderated by the presence (or absence) of a social interaction remains unclear. Here, the gaze of 70 young adults was tracked during the free viewing of 60 naturalistic scenes. All photographs depicted two people, who were either interacting or not. Analyses of dwell time revealed that more attention was spent on human than background AOIs in the interactive pictures. In non-interactive pictures, however, dwell time did not differ between AOI types. In the time-to-first-fixation analysis, humans always captured attention before other elements of the scene, although this difference was slightly larger in interactive than in non-interactive scenes. These findings confirm the existence of a bias towards social information in attentional capture and suggest that our attention values social interactions beyond the mere presence of two people.
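The two eye-tracking measures used here, dwell time and time to first fixation, are simple aggregations over a trial's fixation record. A minimal sketch follows; the tuple format and AOI labels are illustrative assumptions, not the authors' actual data format.

```python
def dwell_time(fixations, aoi):
    """Total fixation duration (ms) on one area of interest (AOI).
    `fixations` is a list of (aoi_label, onset_ms, duration_ms) tuples."""
    return sum(duration for label, _, duration in fixations if label == aoi)

def time_to_first_fixation(fixations, aoi):
    """Onset (ms) of the earliest fixation on the AOI, or None if never fixated."""
    onsets = [onset for label, onset, _ in fixations if label == aoi]
    return min(onsets) if onsets else None

# Toy fixation record for a single trial on one scene:
trial = [("human", 120, 300), ("background", 420, 200), ("human", 620, 500)]
```

For this toy trial, the human AOI accumulates 800 ms of dwell time (300 + 500) and is first fixated at 120 ms, illustrating how the two measures can dissociate: dwell time indexes sustained interest, time to first fixation indexes attentional capture.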
Affiliation(s)
- Simona Skripkauskaite
- School of Psychology, Bangor University, Bangor, UK
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Ioana Mihai
- School of Psychology, Bangor University, Bangor, UK
15
Pasqualette L, Kulke L. Effects of emotional content on social inhibition of gaze in live social and non-social situations. Sci Rep 2023; 13:14151. [PMID: 37644088 PMCID: PMC10465544 DOI: 10.1038/s41598-023-41154-w] [Received: 04/13/2021] [Accepted: 08/22/2023] [Indexed: 08/31/2023] Open
Abstract
In real-life interactions, it is crucial that humans respond adequately to others' emotional expressions. To date, emotion perception has mainly been studied in highly controlled laboratory tasks. However, recent research suggests that attention and gaze behaviour differ significantly between watching a person on a laboratory screen and interacting with them in the real world. The current study therefore investigated the effects of emotional expression on participants' gaze in social and non-social situations. We compared looking behaviour towards a confederate showing positive, neutral or negative facial expressions between live social and non-social waiting-room situations. Participants looked more often, and for longer, at the confederate on the screen than at the confederate who was physically present in the room. The expressions displayed by the confederate and participants' individual traits (social anxiety and autistic traits) did not reliably relate to gaze behaviour. Indications of covert attention also occurred more often, and lasted longer, during the non-social than during the social condition. The findings indicate that social norms are a strong factor modulating gaze behaviour in social contexts. PROTOCOL REGISTRATION: The stage 1 protocol for this Registered Report was accepted in principle on September 13, 2021. The protocol, as accepted by the journal, can be found at: https://doi.org/10.6084/m9.figshare.16628290 .
Affiliation(s)
- Laura Pasqualette
- Department of Neurocognitive Developmental Psychology, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Developmental Psychology with Educational Psychology, University of Bremen, Bremen, Germany
- Louisa Kulke
- Department of Neurocognitive Developmental Psychology, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Developmental Psychology with Educational Psychology, University of Bremen, Bremen, Germany
16
Alreja A, Ward MJ, Ma Q, Russ BE, Bickel S, Van Wouwe NC, González-Martínez JA, Neimat JS, Abel TJ, Bagić A, Parker LS, Richardson RM, Schroeder CE, Morency LP, Ghuman AS. A new paradigm for investigating real-world social behavior and its neural underpinnings. Behav Res Methods 2023; 55:2333-2352. [PMID: 35877024 PMCID: PMC10841340 DOI: 10.3758/s13428-022-01882-9] [Accepted: 05/15/2022] [Indexed: 11/08/2022]
Abstract
Eye tracking and other behavioral measurements collected from patient-participants in their hospital rooms afford a unique opportunity to study natural behavior for basic and clinical translational research. We describe an immersive social and behavioral paradigm implemented in patients undergoing evaluation for surgical treatment of epilepsy, with electrodes implanted in the brain to determine the source of their seizures. Our studies entail collecting eye tracking with other behavioral and psychophysiological measurements from patient-participants during unscripted behavior, including social interactions with clinical staff, friends, and family in the hospital room. This approach affords a unique opportunity to study the neurobiology of natural social behavior, though it requires carefully addressing distinct logistical, technical, and ethical challenges. Collecting neurophysiological data synchronized to behavioral and psychophysiological measures allows us to study the relationship between behavior and physiology. Combining across these rich data sources while participants eat, read, converse with friends and family, etc., enables clinical-translational research aimed at understanding the participants' disorders and clinician-patient interactions, as well as basic research into natural, real-world behavior. We discuss the data acquisition, quality control, annotation, and analysis pipelines that our studies require. We also discuss the clinical, logistical, ethical, and privacy considerations critical to working in the hospital setting.
Affiliation(s)
- Arish Alreja
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, USA
- Neuroscience Institute, Carnegie Mellon University, Pittsburgh, USA
- Machine Learning Department, Carnegie Mellon University, Pittsburgh, USA
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, USA
- Michael J Ward
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, USA
- David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
- Qianli Ma
- Language Technologies Institute, Carnegie Mellon University, Pittsburgh, USA
- Brian E Russ
- Nathan Kline Institute for Psychiatric Research, Orangeburg, USA
- Stephan Bickel
- Department of Neurosurgery and Neurology, Northwell Health, The Feinstein Institutes for Medical Research, Manhasset, USA
- Nelleke C Van Wouwe
- Department of Neurological Surgery, University of Louisville, Louisville, USA
- Joseph S Neimat
- Department of Neurological Surgery, University of Louisville, Louisville, USA
- Taylor J Abel
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, USA
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, USA
- Brain Institute, University of Pittsburgh, Pittsburgh, USA
- Anto Bagić
- Department of Neurology, University of Pittsburgh, Pittsburgh, USA
- Lisa S Parker
- School of Public Health, University of Pittsburgh, Pittsburgh, USA
- R Mark Richardson
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, USA
- Department of Neurosurgery, Harvard Medical School and Massachusetts General Hospital, Boston, USA
- Charles E Schroeder
- Nathan Kline Institute for Psychiatric Research, Orangeburg, USA
- Departments of Neurosurgery and Psychiatry, Columbia University, New York, USA
- Avniel Singh Ghuman
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, USA
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, USA
- Brain Institute, University of Pittsburgh, Pittsburgh, USA
- Departments of Psychology, Neurobiology, and Psychiatry, University of Pittsburgh, Pittsburgh, USA
17
Ortega J, Chen Z, Whitney D. Inferential Emotion Tracking reveals impaired context-based emotion processing in individuals with high Autism Quotient scores. Sci Rep 2023; 13:8093. [PMID: 37208368 DOI: 10.1038/s41598-023-35371-6] [Received: 02/09/2023] [Accepted: 05/17/2023] [Indexed: 05/21/2023] Open
Abstract
Emotion perception is essential for successful social interactions and for maintaining long-term relationships with friends and family. Individuals with autism spectrum disorder (ASD) experience social communication deficits and have reported difficulties in facial expression recognition. However, emotion recognition depends on more than just processing facial expression; context is critically important for correctly inferring the emotions of others. Whether context-based emotion processing is impacted in those with autism remains unclear. Here, we used a recently developed context-based emotion perception task, called Inferential Emotion Tracking (IET), to investigate whether individuals who scored high on the Autism Spectrum Quotient (AQ) had deficits in context-based emotion perception. Using 34 videos (including Hollywood movies, home videos, and documentaries), we tested 102 participants as they continuously tracked the affect (valence and arousal) of a blurred-out, invisible character. We found that individual differences in AQ scores were more strongly correlated with IET task accuracy than with performance on traditional face emotion perception tasks. This correlation remained significant even when controlling for potential covarying factors, including general intelligence and performance on traditional face perception tasks. These findings suggest that individuals with ASD may have impaired perception of contextual information; they underscore the importance of developing ecologically relevant emotion perception tasks in order to better assess and treat ASD, and they provide a new direction for further research on context-based emotion perception deficits in ASD.
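A correlation that "remains significant when controlling for" a covariate is commonly computed as a partial correlation: regress the covariate out of both variables and correlate the residuals. The sketch below shows this for a single covariate; it is a generic illustration of the technique, and the study itself may have used a different statistical procedure.

```python
def pearson(xs, ys):
    """Pearson correlation (0.0 if either sequence is constant)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx > 0 and vy > 0 else 0.0

def residualize(ys, zs):
    """Residuals of ys after removing its linear dependence on zs (simple OLS)."""
    n = len(ys)
    my, mz = sum(ys) / n, sum(zs) / n
    beta = (sum((z - mz) * (y - my) for z, y in zip(zs, ys))
            / sum((z - mz) ** 2 for z in zs))
    return [y - (my + beta * (z - mz)) for z, y in zip(zs, ys)]

def partial_correlation(xs, ys, zs):
    """Correlation between xs and ys with covariate zs regressed out of both."""
    return pearson(residualize(xs, zs), residualize(ys, zs))
```

The logic generalizes to several covariates by residualizing on each in turn (or by multiple regression), which is how a score-accuracy correlation can be tested net of general intelligence and face-task performance.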
Affiliation(s)
- Jefferson Ortega
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- Zhimin Chen
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- David Whitney
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- Vision Science Program, University of California, Berkeley, CA, 94720, USA
- Helen Wills Neuroscience Institute, University of California, Berkeley, CA, 94720, USA
18
Thorsson M, Galazka MA, Åsberg Johnels J, Hadjikhani N. A novel end-to-end dual-camera system for eye gaze synchrony assessment in face-to-face interaction. Atten Percept Psychophys 2023:10.3758/s13414-023-02679-4. [PMID: 37099200 DOI: 10.3758/s13414-023-02679-4] [Accepted: 02/11/2023] [Indexed: 04/27/2023]
Abstract
Quantification of face-to-face interaction can provide highly relevant information in cognitive and psychological science research. Current commercial glint-dependent solutions suffer from several disadvantages and limitations when applied in face-to-face interaction, including data loss, parallax errors, the inconvenience and distracting effect of wearables, and/or the need for several cameras to capture each person. Here we present a novel eye-tracking solution, consisting of a dual-camera system used in conjunction with an individually optimized deep learning approach that aims to overcome some of these limitations. Our data show that this system can accurately classify gaze location within different areas of the face of two interlocutors, and capture subtle differences in interpersonal gaze synchrony between two individuals during a (semi-)naturalistic face-to-face interaction.
Affiliation(s)
- Max Thorsson
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Martyna A Galazka
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Jakob Åsberg Johnels
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Section of Speech and Language Pathology, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Nouchine Hadjikhani
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
19
Freeth M, Morgan EJ. I see you, you see me: the impact of social presence on social interaction processes in autistic and non-autistic people. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210479. [PMID: 36871584 PMCID: PMC9985964 DOI: 10.1098/rstb.2021.0479] [Received: 05/31/2022] [Accepted: 12/23/2022] [Indexed: 03/07/2023] Open
Abstract
Environments that require social interaction are complex, challenging and sometimes experienced as overwhelming by autistic people. However, all too often, theories of social interaction processes are created, and interventions proposed, on the basis of data from studies that neither involve genuine social encounters nor consider the perception of social presence as a potentially influential factor. In this review, we begin by considering why face-to-face interaction research is important in this field. We then discuss how the perception of social agency and social presence can influence conclusions about social interaction processes. We then outline some insights gained from face-to-face interaction research conducted with both autistic and non-autistic people. We finish by considering the impact of social presence on cognitive processes more broadly, including theory of mind. Overall, we demonstrate that the choice of stimuli in studies assessing social interaction processes can substantially alter the conclusions drawn. Ecological validity matters, and social presence in particular is a critical factor that fundamentally shapes social interaction processes in both autistic and non-autistic people. This article is part of a discussion meeting issue 'Face2face: advancing the science of social interaction'.
Affiliation(s)
- Megan Freeth
- Department of Psychology, The University of Sheffield, Sheffield S1 2LT, UK
- Emma J. Morgan
- Department of Psychology, The University of Sheffield, Sheffield S1 2LT, UK
20
Burack JA, Friedman S, Lessage M, Brodeur D. Re-visiting the 'mysterious myth of attention deficit': A systematic review of the recent evidence. J Intellect Disabil Res 2023; 67:271-288. [PMID: 36437709 DOI: 10.1111/jir.12994] [Received: 07/15/2021] [Accepted: 11/03/2022] [Indexed: 06/16/2023]
Abstract
Based on the inclusive and methodologically rigorous framework provided by Ed Zigler's developmental approach, we previously challenged what we called 'the mysterious myth of attention deficit': the fallacy of attention as a universal deficit among persons with intellectual disability (ID). In this latest update, we conducted a systematic review of studies of essential components of attention among persons with ID published in the interim since the last iteration of the mysterious myth narrative was submitted for publication approximately a decade ago. We searched the databases PubMed and PsycINFO for English-language peer-reviewed studies published from 1 January 2011 through 5 February 2021. In keeping with the developmental approach, the two essential methodological criteria were that the groups of persons with ID were aetiologically homogeneous and that the comparisons with persons with average IQs (or with available norms) were based on an appropriate index of developmental level, or mental age. Stringent use of these criteria for inclusion served to control for bias in article selection. Articles were then categorised based on aetiological group studied and component of visual attention. Based on these criteria, 18 articles were selected for inclusion out of the 2837 that were identified. The included studies involved 547 participants: 201 participants with Down syndrome, 214 participants with Williams syndrome and 132 participants with fragile X syndrome. The findings from these articles call attention to the complexities and nuances in understanding attentional functioning across homogeneous aetiological groups and highlight that functioning must be considered in relation to aetiology; factors associated with the individual, such as developmental level, motivation, styles and biases; and factors associated with the task, such as context, focus, social and emotional implications, and levels of environmental complexity.
Affiliation(s)
- J A Burack
- Department of Educational & Counseling Psychology, McGill University, Montreal, Canada
- S Friedman
- Department of Psychology, Temple University, Philadelphia, PA, USA
- M Lessage
- Department of Psychology, University of Toronto, Toronto, Canada
- D Brodeur
- Department of Psychology, Acadia University, Wolfville, Canada
21
Forby L, Anderson NC, Cheng JT, Foulsham T, Karstadt B, Dawson J, Pazhoohi F, Kingstone A. Reading the room: Autistic traits, gaze behaviour, and the ability to infer social relationships. PLoS One 2023; 18:e0282310. [PMID: 36857369 PMCID: PMC9977004 DOI: 10.1371/journal.pone.0282310] [Received: 12/09/2021] [Accepted: 01/11/2023] [Indexed: 03/02/2023] Open
Abstract
Individuals high in autistic traits can have difficulty understanding verbal and non-verbal cues, and may display atypical gaze behaviour during social interactions. The aim of this study was to examine differences among neurotypical individuals with high and low levels of autistic traits with regard to their gaze behaviour and their ability to assess peers' social status accurately. Fifty-four university students who completed the 10-item Autism Quotient (AQ-10) were eye-tracked as they watched six 20-second video clips of people ("targets") involved in a group decision-making task. Simulating natural, everyday social interactions, the video clips included moments of debate, humour, interruptions, and cross talk. Results showed that high-scorers on the AQ-10 (i.e., those with more autistic traits) did not differ from the low-scorers in either gaze behaviour or assessing the targets' relative social status. The results based on this neurotypical group of participants suggest that the ability of individuals high in autistic traits to read social cues may be preserved in certain tasks crucial to navigating day-to-day social relationships. These findings are discussed in terms of their implications for theory of mind, weak central coherence, and social motivation theories of autism.
Affiliation(s)
- Leilani Forby
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Nicola C. Anderson
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Joey T. Cheng
- Department of Psychology, York University, Toronto, Ontario, Canada
- Tom Foulsham
- Department of Psychology, University of Essex, Colchester, Essex, England
- Bradley Karstadt
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Jessica Dawson
- Department of Psychology, University of Essex, Colchester, Essex, England
- Farid Pazhoohi
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Alan Kingstone
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
22
Greig PR, Zolger D, Onwochei DN, Thurley N, Higham H, Desai N. Cognitive aids in the management of clinical emergencies: a systematic review. Anaesthesia 2023; 78:343-355. [PMID: 36517981 PMCID: PMC10107924 DOI: 10.1111/anae.15939] [Accepted: 11/22/2022] [Indexed: 12/23/2022]
Abstract
Clinical emergencies can be defined as unpredictable events that necessitate immediate intervention. Safety critical industries have acknowledged the difficulties of responding to such crises. Strategies to improve human performance and mitigate its limitations include the provision and use of cognitive aids, a family of tools that includes algorithms, checklists and decision aids. This systematic review evaluates the usefulness of cognitive aids in clinical emergencies. Following a systematic search of the electronic databases, we included 13 randomised controlled trials, reported in 16 publications. Each compared cognitive aids with usual care in the context of an anaesthetic, medical, surgical or trauma emergency involving adults. Most trials used only clinicians in the development and testing of the cognitive aids, and only some trials provided familiarisation with the cognitive aids before they were deployed. The primary outcome was the completeness of care delivered to the patient. Cognitive aids were associated with a reduction in the incidence of missed care steps from 43.3% to 11% (RR (95%CI) 0.29 (0.15-0.16); p < 0.001), and the quality of evidence was rated as moderate. The use of cognitive aids was related to decreases in the incidence of errors, increases in the rate of correctly performed steps and improvement in the clinical teamwork skills scores, non-technical skills scores, subjective conflict resolution scores and the global assessment of team performance. Cognitive aids had an inconsistent influence on the time to first intervention and time to complete care of the patient's condition. It is possible that this was a reflection of how common or rare the crisis in question was as well as the experience and expertise of the clinicians and team. Sufficient thought should be applied to the development of the content and design of cognitive aids, with consideration of the pre-existing guideline ecosystem. 
Cognitive aids should be tested before their deployment with adequate clinician and team training.
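The headline effect above (missed care steps falling from 43.3% to 11%) is expressed as a risk ratio (RR). As a sketch of how an RR and its 95% confidence interval are conventionally computed on the log scale (the Katz method), consider the code below; the event counts are made up for illustration, and the review's own meta-analysis will have pooled across studies rather than used raw counts like these.

```python
import math

def risk_ratio(events_treat, n_treat, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio with a confidence interval on the log scale (Katz method)."""
    rr = (events_treat / n_treat) / (events_ctrl / n_ctrl)
    # Standard error of log(RR) from the four cell counts.
    log_se = math.sqrt(1 / events_treat - 1 / n_treat
                       + 1 / events_ctrl - 1 / n_ctrl)
    lower = math.exp(math.log(rr) - z * log_se)
    upper = math.exp(math.log(rr) + z * log_se)
    return rr, lower, upper

# Hypothetical counts: 11 of 100 steps missed with an aid, 43 of 100 without.
rr, lo, hi = risk_ratio(11, 100, 43, 100)
```

With these invented counts the RR comes out to roughly 0.26, with a CI of roughly 0.14 to 0.47; the interval excludes 1, which is what "fewer missed steps with cognitive aids" means statistically.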
Affiliation(s)
- P R Greig
- Department of Anaesthesia, Guy's and St Thomas' NHS Foundation Trust, London, UK; Nuffield Department of Clinical Neurosciences, University of Oxford, UK
- D Zolger
- Department of Anaesthesia, Guy's and St Thomas' NHS Foundation Trust, London, UK
- D N Onwochei
- Department of Anaesthesia, Guy's and St Thomas' NHS Foundation Trust, London, UK; King's College London, UK
- N Thurley
- Bodleian Library, University of Oxford, UK
- H Higham
- Nuffield Department of Clinical Neurosciences, Oxford University Hospitals NHS Foundation Trust, Oxford, UK; Nuffield Department of Anaesthesia, Oxford University Hospitals NHS Foundation Trust, Oxford, UK
- N Desai
- Department of Anaesthesia, Guy's and St Thomas' NHS Foundation Trust, London, UK; King's College London, UK
23
Putnam OC, Sasson N, Parish-Morris J, Harrop C. Effects of social complexity and gender on social and non-social attention in male and female autistic children: A comparison of four eye-tracking paradigms. Autism Res 2023; 16:315-326. [PMID: 36408851 DOI: 10.1002/aur.2851] [Received: 12/03/2021] [Accepted: 10/31/2022] [Indexed: 11/22/2022]
Abstract
Eye tracking has long been used to characterize differences in social attention between autistic and non-autistic children, but recent work has shown that these patterns may vary widely according to the biological sex of the participants and the social complexity and gender-typicality of the eye-tracking stimuli (e.g., Barbies vs. Transformers). To better understand the effects of sex, social complexity, and object gender-typicality on social and non-social gaze behavior in autism, we compared the visual attention patterns of 67 autistic (ASD) and non-autistic (NA) males (M) and females (F) (ASD M = 21; ASD F = 18; NA M = 14; NA F = 14) across four eye-tracking paradigms varying in social complexity and object gender-typicality. We found consistency across paradigms in terms of overall attention and attention to social stimuli, but attention to objects varied when paradigms considered gender in their stimulus design. Children attended more to gendered objects, particularly when the gender-typicality of the object matched their assigned sex. These results demonstrate that visual social attention in autism is affected by interactions between a child's biological sex, social scene complexity, and object gender-typicality, and they have broad implications for the design and interpretation of eye-tracking studies.
Affiliation(s)
- Orla C Putnam
- Division of Health Sciences, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
| | - Noah Sasson
- School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, Texas, USA
| | - Julia Parish-Morris
- Department of Child and Adolescent Psychiatry, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA.,Department of Psychiatry, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, USA
| | - Clare Harrop
- Division of Health Sciences, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.,TEACCH Autism Program, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
| |
24
Berlijn AM, Hildebrandt LK, Gamer M. Idiosyncratic viewing patterns of social scenes reflect individual preferences. J Vis 2022; 22:10. [PMID: 36583910 PMCID: PMC9807181 DOI: 10.1167/jov.22.13.10] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2022] [Accepted: 11/16/2022] [Indexed: 12/31/2022] Open
Abstract
In general, humans preferentially look at conspecifics in naturalistic images. However, such group-based effects might conceal systematic individual differences concerning the preference for social information. Here, we investigated to what degree fixations on social features occur consistently within observers and whether this preference generalizes to other measures of social prioritization in the laboratory as well as the real world. Participants carried out a free viewing task and a relevance taps task that required them to actively select image regions that are crucial for understanding a given scene, and they were asked to freely take photographs outside the laboratory that were later classified regarding their social content. We observed stable individual differences in the fixation and active selection of human heads and faces that were correlated across tasks and partly predicted the social content of self-taken photographs. Such a relationship was not observed for human bodies, indicating that different social elements need to be dissociated. These findings suggest that idiosyncrasies in the visual exploration and interpretation of social features exist and predict real-world behavior. Future studies should further characterize these preferences and elucidate how they shape perception and interpretation of social contexts in healthy participants and patients with mental disorders that affect social functioning.
Affiliation(s)
- Adam M Berlijn
- Department of Experimental Psychology, Heinrich-Heine-University Düsseldorf, Düsseldorf, Germany
- Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany
- Institute of Neuroscience and Medicine (INM-1), Research Centre Jülich, Jülich, Germany
- Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
- Lea K Hildebrandt
- Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
- Matthias Gamer
- Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
25
Zeng G, Maylott SE, Leung TS, Messinger DS, Wang J, Simpson EA. Infant temperamental fear, pupil dilation, and gaze aversion from smiling strangers. Dev Psychobiol 2022; 64:e22324. [PMID: 36282740 DOI: 10.1002/dev.22324] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2021] [Revised: 07/06/2022] [Accepted: 08/17/2022] [Indexed: 01/27/2023]
Abstract
In childhood, higher levels of temperamental fear (an early-emerging proclivity to distress in the face of novelty) are associated with lower social responsivity and greater social anxiety. While the early emergence of temperamental fear in infancy is poorly understood, it is theorized to be driven by individual differences in reactivity and self-regulation to novel stimuli. The current study used eye tracking to capture infants' (N = 124) reactions to a video of a smiling stranger (a common social encounter), including infant gaze aversions from the stranger's face (indexing arousal regulation) and pupil dilation (indexing physiological reactivity), longitudinally at 2, 4, 6, and 8 months of age. Multilevel mixed-effects models indicated that more fearful infants took more time to look away from a smiling stranger's face than less fearful infants, suggesting that high-fear infants may have slower arousal regulation. At 2 and 4 months, more fearful infants also exhibited greater and faster pupil dilation before gaze aversions, consistent with greater physiological reactivity. Together, these findings suggest that individual differences in infants' gaze aversions and pupil dilation can index the development of fearful temperament in early infancy, facilitating the identification of, and interventions for, risk factors for social disruption.
Affiliation(s)
- Guangyu Zeng
- Department of Psychology, University of Miami, Coral Gables, Florida, USA
- Sarah E Maylott
- Department of Psychology, University of Miami, Coral Gables, Florida, USA; Department of Psychology, University of Utah, Salt Lake City, Utah, USA; Department of Psychiatry & Behavioral Sciences, Duke University, Durham, North Carolina, USA
- Tiffany S Leung
- Department of Psychology, University of Miami, Coral Gables, Florida, USA
- Daniel S Messinger
- Department of Psychology, University of Miami, Coral Gables, Florida, USA; Departments of Pediatrics, Music Engineering, Electrical and Computer Engineering, University of Miami, Coral Gables, Florida, USA
- Jue Wang
- Department of Psychology, University of Science and Technology of China, Hefei, China
26
Großekathöfer JD, Seis C, Gamer M. Reality in a sphere: A direct comparison of social attention in the laboratory and the real world. Behav Res Methods 2022; 54:2286-2301. [PMID: 34918223 PMCID: PMC9579106 DOI: 10.3758/s13428-021-01724-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/08/2021] [Indexed: 11/24/2022]
Abstract
Humans often show reduced social attention in real situations, a finding rarely replicated in controlled laboratory studies. Virtual reality is supposed to allow for ecologically valid and at the same time highly controlled experiments. This study aimed to provide initial insights into the reliability and validity of using spherical videos viewed via a head-mounted display (HMD) to assess social attention. We chose five public places in the city of Würzburg and measured eye movements of 44 participants for 30 s at each location twice: once in a real environment with mobile eye-tracking glasses and once in a virtual environment playing a spherical video of the location in an HMD with an integrated eye tracker. As hypothesized, participants demonstrated reduced social attention, with less exploration of passersby, in the real environment as compared to the virtual one. This is in line with earlier studies showing social avoidance in interactive situations. Furthermore, we only observed consistent gaze proportions on passersby across locations in virtual environments. These findings highlight that the potential for social interactions and adherence to social norms are essential modulators of viewing behavior in social situations and cannot be easily simulated in laboratory contexts. However, spherical videos might be helpful for supplementing the range of methods in social cognition research and other fields. Data and analysis scripts are available at https://osf.io/hktdu/.
Affiliation(s)
- Jonas D Großekathöfer
- Department of Psychology, Julius Maximilian University of Würzburg, Würzburg, Germany
- Christian Seis
- Department of Psychology, Julius Maximilian University of Würzburg, Würzburg, Germany
- Matthias Gamer
- Department of Psychology, Julius Maximilian University of Würzburg, Würzburg, Germany
27
Deliberate control of facial expressions in a go/no-go task: An ERP study. Acta Psychol (Amst) 2022; 230:103773. [DOI: 10.1016/j.actpsy.2022.103773] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2022] [Revised: 08/11/2022] [Accepted: 10/11/2022] [Indexed: 11/21/2022] Open
28
Eye gaze and visual attention as a window into leadership and followership: A review of empirical insights and future directions. THE LEADERSHIP QUARTERLY 2022. [DOI: 10.1016/j.leaqua.2022.101654] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
29
Eye contact avoidance in crowds: A large wearable eye-tracking study. Atten Percept Psychophys 2022; 84:2623-2640. [PMID: 35996058 PMCID: PMC9630249 DOI: 10.3758/s13414-022-02541-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/08/2022] [Indexed: 11/30/2022]
Abstract
Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated.
30
Lau WK, Sauter M, Huckauf A. Small Pupils Lead to Lower Judgements of a Person’s Characteristics for Exaggerated, but Not for Realistic Pupils. Behav Sci (Basel) 2022; 12:bs12080283. [PMID: 36004854 PMCID: PMC9405288 DOI: 10.3390/bs12080283] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2022] [Revised: 08/03/2022] [Accepted: 08/10/2022] [Indexed: 11/16/2022] Open
Abstract
Our eyes convey information about a person. The pupils may provide information regarding our emotional states when presented along with different emotional expressions. We examined the effects of pupil size and vergence on inferring other people’s characteristics in neutral expression eyes. Pupil sizes were manipulated by overlaying black disks onto the pupils of the original eye images. The disk area was then changed to create small, medium, and large pupils. Vergence was simulated by shifting the medium-sized disks nasally in one eye. Pupil sizes were exaggerated for Experiment 1 and followed values from the literature for Experiment 2. The first Purkinje image from the eye photos in Experiment 2 was kept to preserve image realism. The characteristics measured were sex, age, attractiveness, trustworthiness, intelligence, valence, and arousal. Participants completed one of two online experiments and rated eight eye pictures with differently sized pupils and with vergence eyes. Both experiments were identical except for the stimuli designs. Results from Experiment 1 revealed rating differences between pupil sizes for all characteristics except sex, age, and arousal. Specifically, eyes with extremely small pupil sizes and artificial vergence received the lowest ratings compared to medium and large pupil sizes. Results from Experiment 2 only indicated weak effects of pupil size and vergence, particularly for intelligence ratings. We conclude that the pupils can influence how characteristics of another person are perceived and may be regarded as important social signals in subconscious social interaction processes. However, the effects may be rather small for neutral expressions.
31
Ridley E, Arnott B, Riby DM, Burt DM, Hanley M, Leekam SR. The Quality of Everyday Eye Contact in Williams Syndrome: Insights From Cross-Syndrome Comparisons. AMERICAN JOURNAL ON INTELLECTUAL AND DEVELOPMENTAL DISABILITIES 2022; 127:293-312. [PMID: 36122327 DOI: 10.1352/1944-7558-127.4.293] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/08/2020] [Accepted: 08/31/2021] [Indexed: 06/15/2023]
Abstract
Past research shows that individuals with Williams syndrome (WS) have heightened and prolonged eye contact. Using parent report measures, we examined not only the presence of eye contact but also its qualitative features. Study 1 included individuals with WS (n = 22, ages 6.0-36.3). Study 2 included children with different neurodevelopmental (ND) conditions (WS, autism spectrum condition, fragile X syndrome, attention-deficit/hyperactivity disorder) and children with neurotypical development (NT; n = 262, ages 4.0-17.11). Unusual eye contact features, including staring, were found in approximately half of the WS samples. However, other features such as brief glances were frequently found in WS and in all ND conditions, but not NT. Future research in ND conditions should focus on qualitative as well as quantitative features of eye contact.
Affiliation(s)
- Ellen Ridley
- Ellen Ridley, Centre for Neurodiversity & Development, Durham University, and Department of Psychology, Durham University, Durham, DH1 3LE, UK
- Bronia Arnott
- Bronia Arnott, Population Health Sciences Institute, Newcastle University, Newcastle upon Tyne, NE2 4AX, UK
- Deborah M Riby
- Deborah M. Riby, Centre for Neurodiversity & Development, Durham University, and Department of Psychology, Durham University, Durham, DH1 3LE, UK
- D Michael Burt
- D. Michael Burt, Department of Psychology, Durham University, Durham, DH1 3LE, UK
- Mary Hanley
- Mary Hanley, Centre for Neurodiversity & Development, Durham University, and Department of Psychology, Durham University, Durham, DH1 3LE, UK
- Susan R Leekam
- Susan R. Leekam, Cardiff University Centre for Developmental Science, Cardiff University, Park Place, Cardiff, Wales, CF10 3AT, UK
32
Recio G, Surdzhiyska Y, Bagherzadeh-Azbari S, Hilpert P, Rostami HN, Xu Q, Sommer W. Deliberate control over facial expressions in motherhood. Evidence from a Stroop-like task. Acta Psychol (Amst) 2022; 228:103652. [PMID: 35753142 DOI: 10.1016/j.actpsy.2022.103652] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2021] [Revised: 03/10/2022] [Accepted: 06/14/2022] [Indexed: 11/01/2022] Open
Abstract
The deliberate control of facial expressions is an important ability in human interactions, in particular for mothers with prelinguistic infants. Because research on this topic is still scarce, we investigated the control over facial expressions in a Stroop-like paradigm. Mothers of 2- to 6-month-old infants and nulliparous women produced smiles and frowns in response to verbal commands written on distractor faces of adults or infants showing expressions of happiness or anger/distress. Analyses of video recordings with a machine classifier for facial expressions revealed pronounced effects of congruency between the expressions required of the participants and those displayed by the face stimuli on the onset latencies of the deliberate facial expressions. With adult distractor faces, this Stroop effect was similar whether participants smiled or frowned. With infant distractor faces, mothers and non-mothers showed indistinguishable Stroop effects on smile responses; however, for frown responses, the Stroop effect in mothers was smaller than in non-mothers. We suggest that for frown responses in mothers when facing infants, the effect of mimicry or stimulus-response compatibility, leading to the Stroop effect, is offset by a caregiving response or empathy.
Affiliation(s)
- Qiang Xu
- Humboldt Universität zu Berlin, Germany; Ningbo University, China
33
Bo O’Connor B, Lee K, Campbell D, Young L. Moral psychology from the lab to the wild: Relief registries as a paradigm for studying real-world altruism. PLoS One 2022; 17:e0269469. [PMID: 35696389 PMCID: PMC9191725 DOI: 10.1371/journal.pone.0269469] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2021] [Accepted: 05/20/2022] [Indexed: 11/19/2022] Open
Abstract
Experimental psychology's recent shift toward low-effort, high-volume methods (e.g., self-reports, online studies) and away from the more effortful study of naturalistic behavior raises concerns about the ecological validity of findings from these fields, concerns that have become particularly apparent in the field of moral psychology. To help address these concerns, we introduce a method allowing researchers to investigate an important, widespread form of altruistic behavior-charitable donations-in a manner balancing competing concerns about internal validity, ecological validity, and ease of implementation: relief registries, which leverage existing online gift registry platforms to allow research subjects to choose among highly needed donation items to ship directly to charitable organizations. Here, we demonstrate the use of relief registries in two experiments exploring the ecological validity of the finding from our own research that people are more willing to help others after having imagined themselves doing so. In this way, we sought to provide a blueprint for researchers seeking to enhance the ecological validity of their own research in a narrow sense (i.e., by using the relief registry method we introduce) and in broader terms by adapting methods that take advantage of modern technology to directly impact others' lives outside the lab.
Affiliation(s)
- Brendan Bo O’Connor
- Department of Psychology, University at Albany, State University of New York, Albany, New York, United States of America
- Karen Lee
- Department of Psychology, Boston College, Chestnut Hill, Massachusetts, United States of America
- Dylan Campbell
- Department of Psychology, University at Albany, State University of New York, Albany, New York, United States of America
- Liane Young
- Department of Psychology, Boston College, Chestnut Hill, Massachusetts, United States of America
34
Tsutsuse KS, Vibell J, Sinnett S. EXPRESS: Multisensory Perception of Natural Versus Unnatural Motion. Q J Exp Psychol (Hove) 2022; 76:1233-1244. [PMID: 35658653 DOI: 10.1177/17470218221108251] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
Previous research has shown that visual perception is influenced by Newtonian constraints. Kominsky et al. (2017) showed that humans detect unnatural motion, in which objects break Newtonian constraints by moving faster after colliding with another object than before, more quickly than collisions that do not violate Newtonian constraints. These findings show that the perceptual system distinguishes between realistic and unrealistic causal events. However, real-world collisions are rarely silent. The present study extends this research by including sound at the collision point between two objects to evaluate how multisensory integration influences the perception of natural versus unnatural colliding events. Participants viewed an array of three simultaneous videos, each depicting two objects moving in a horizontal back-and-forth motion. Two of the videos showed the objects moving at the same speed, while the third video was an oddball that either moved faster before the collision and slower after (natural target), or slower before the collision and faster after (unnatural target). A brief click was presented at the collision point of one or none of the videos. Participants were asked to indicate the oddball video via keypress. Replicating Kominsky et al. (2017), participants were faster when identifying unnatural target motion events than natural target motion events, both with and without sound. The findings also demonstrated lower accuracy rates for unnatural events compared to natural events, especially when a sound was added. These findings suggest that the addition of a sound could be distracting to participants, possibly due to limitations in attentional resources.
Affiliation(s)
- Kayla Soma Tsutsuse
- Department of Psychology, University of Hawaii at Manoa, 2530 Dole Street, Sakamaki D412, Honolulu, HI 96822-3949, USA
- Jonas Vibell
- Department of Psychology, University of Hawaii at Manoa, 2530 Dole Street, Sakamaki D412, Honolulu, HI 96822-3949, USA
- Scott Sinnett
- Department of Psychology, University of Hawaii at Manoa, 2530 Dole Street, Sakamaki D412, Honolulu, HI 96822-3949, USA
35
Morgan EJ, Carroll DJ, Chow CKC, Freeth M. The Effect of Social Presence on Mentalizing Behavior. Cogn Sci 2022; 46:e13126. [PMID: 35411971 PMCID: PMC9287020 DOI: 10.1111/cogs.13126] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2021] [Revised: 02/22/2022] [Accepted: 02/22/2022] [Indexed: 11/29/2022]
Abstract
Our behavior is frequently influenced by those around us. However, the majority of social cognition research is conducted using socially isolated paradigms, without the presence of real people (i.e., without a "social presence"). The current study aimed to test the influence of social presence upon a measure of mentalizing behavior in adults. Study 1 used a first-order theory of mind task, and Study 2 used a second-order theory of mind task. Both studies included two conditions: live, where the task protagonists were physically present acting out the task, or recorded, where the same task protagonists demonstrated the task in a video recording. In both experiments, participants were affected by the social presence and demonstrated significantly different patterns of behavior in response to the presence of real people. This study therefore highlights the critical importance of understanding the effect of a social presence in mentalizing research, and suggests that the inclusion of a social presence needs to be given strong consideration across social cognition paradigms.
Affiliation(s)
- Emma J Morgan
- Department of Psychology, The University of Sheffield
- Megan Freeth
- Department of Psychology, The University of Sheffield
36
López B. Commentary on Autism and the double-empathy problem: Implications for development and mental health. BRITISH JOURNAL OF DEVELOPMENTAL PSYCHOLOGY 2022; 40:368-370. [PMID: 35338788 PMCID: PMC9310842 DOI: 10.1111/bjdp.12410] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2022] [Accepted: 03/08/2022] [Indexed: 11/29/2022]
Affiliation(s)
- Beatriz López
- Department of Psychology, University of Portsmouth, Portsmouth, UK
37
Mundy P, Bullen J. The Bidirectional Social-Cognitive Mechanisms of the Social-Attention Symptoms of Autism. Front Psychiatry 2022; 12:752274. [PMID: 35173636 PMCID: PMC8841840 DOI: 10.3389/fpsyt.2021.752274] [Citation(s) in RCA: 13] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/02/2021] [Accepted: 12/20/2021] [Indexed: 11/13/2022] Open
Abstract
Differences in social attention development begin to be apparent in the 6th to 12th month of development in children with Autism Spectrum Disorder (ASD) and theoretically reflect important elements of its neurodevelopmental endophenotype. This paper examines alternative conceptual views of these early social attention symptoms and hypotheses about the mechanisms involved in their development. One model emphasizes mechanisms involved in the spontaneous allocation of attention to faces, or social orienting. Alternatively, another model emphasizes mechanisms involved in the coordination of attention with other people, or joint attention, and the socially bi-directional nature of its development. This model raises the possibility that atypical responses of children to the attention or the gaze of a social partner directed toward themselves may be as important in the development of social attention symptoms as differences in the development of social orienting. Another model holds that symptoms of social attention may be important to early development but may not impact older individuals with ASD. The alternative model is that the social attention symptoms in infancy (social orienting and joint attention) and the social cognitive symptoms in childhood and adulthood share common neurodevelopmental substrates. Therefore, differences in early social attention and later social cognition constitute a developmentally continuous axis of symptom presentation in ASD. However, symptoms in older individuals may be best measured with in vivo measures of the efficiency of social attention and social cognition in social interactions rather than the accuracy of responses on analog tests used with younger children. Finally, a third model suggests that the social attention symptoms may not truly be a symptom of ASD; rather, they may be best conceptualized as stemming from differences in domain-general attention and motivation mechanisms.
The alternative argued for here is that infant social attention symptoms meet all the criteria of a unique dimension of the phenotype of ASD and that the bi-directional phenomena involved in social attention cannot be fully explained in terms of domain-general aspects of attention development.
Affiliation(s)
- Peter Mundy
- Department of Learning and Mind Sciences, School of Education, University of California, Davis, Davis, CA, United States
- Department of Psychiatry and Behavioral Science and The MIND Institute, UC Davis School of Medicine, Sacramento, CA, United States
- Jenifer Bullen
- Department of Human Development, School of Human Ecology, University of California, Davis, Davis, CA, United States
38
Holleman GA, Hooge ITC, Huijding J, Deković M, Kemner C, Hessels RS. Gaze and speech behavior in parent–child interactions: The role of conflict and cooperation. CURRENT PSYCHOLOGY 2021. [DOI: 10.1007/s12144-021-02532-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
A primary mode of human social behavior is face-to-face interaction. In this study, we investigated the characteristics of gaze and its relation to speech behavior during video-mediated face-to-face interactions between parents and their preadolescent children. Eighty-one parent–child dyads engaged in conversations about cooperative and conflictive family topics. We used a dual-eye tracking setup that is capable of concurrently recording eye movements, frontal video, and audio from two conversational partners. Our results show that children spoke more in the cooperation scenario whereas parents spoke more in the conflict scenario. Parents gazed slightly more at the eyes of their children in the conflict scenario compared to the cooperation scenario. Both parents and children looked more at the other's mouth region while listening compared to while speaking. Results are discussed in terms of the role that parents and children take during cooperative and conflictive interactions and how gaze behavior may support and coordinate such interactions.
39
Doom JR, Rozenman M, Fox KR, Phu T, Subar AR, Seok D, Rivera KM. The Transdiagnostic Origins of Anxiety and Depression During the Pediatric Period: Linking NIMH Research Domain Criteria (RDoC) Constructs to Ecological Systems. Dev Psychopathol 2021; 33:1599-1619. [PMID: 35281333 PMCID: PMC8916713 DOI: 10.1017/s0954579421000559] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
In the last decade, an abundance of research has utilized the NIMH Research Domain Criteria (RDoC) framework to examine mechanisms underlying anxiety and depression in youth. However, relatively little work has examined how these mechanistic intrapersonal processes intersect with context during childhood and adolescence. The current paper covers reviews and meta-analyses that have linked RDoC-relevant constructs to ecological systems in internalizing problems in youth. Specifically, cognitive, biological, and affective factors within the RDoC framework were examined. Based on these reviews and some of the original empirical research they cover, we highlight the integral role of ecological factors to the RDoC framework in predicting onset and maintenance of internalizing problems in youth. Specific recommendations are provided for researchers using the RDoC framework to inform future research integrating ecological systems and development. We advocate for future research and research funding to focus on better integration of the environment and development into the RDoC framework.
40
The Multiple Object Avoidance (MOA) task measures attention for action: Evidence from driving and sport. Behav Res Methods 2021; 54:1508-1529. [PMID: 34786653 PMCID: PMC9170642 DOI: 10.3758/s13428-021-01679-2] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/02/2021] [Indexed: 11/08/2022]
Abstract
Performance in everyday tasks, such as driving and sport, requires allocation of attention to task-relevant information and the ability to inhibit task-irrelevant information. Yet there are individual differences in this attentional ability. This research investigates a novel task for measuring attention for action, called the Multiple Object Avoidance (MOA) task, in relation to the everyday tasks of driving and sport. The aim of Study 1 was to explore the efficacy of the MOA task in predicting simulated driving behaviour and hazard perception, while also investigating its test-retest reliability and its correlation with self-report driving measures. We found that superior performance in the MOA task predicted simulated driving performance in complex environments, and that the MOA task was superior to the Useful Field of View task at predicting this performance. We found moderate test-retest reliability and a correlation with the attentional lapses subscale of the Driving Behaviour Questionnaire. Study 2 investigated the discriminative power of the MOA in sport by exploring performance differences between those who do and do not play sports. We also investigated whether the MOA shares attentional elements with other measures of visual attention commonly attributed to sporting expertise: Multiple Object Tracking (MOT) and cognitive processing speed. We found that those who played sports exhibited superior MOA performance, and we found a positive relationship between MOA performance and both Multiple Object Tracking performance and cognitive processing speed. Collectively, this research highlights the utility of the MOA when investigating visual attention in everyday contexts.
Collapse
|
41
|
Fairchild GT, Marini F, Snow JC. Graspability Modulates the Stronger Neural Signature of Motor Preparation for Real Objects vs. Pictures. J Cogn Neurosci 2021; 33:2477-2493. [PMID: 34407193 PMCID: PMC9946154 DOI: 10.1162/jocn_a_01771] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The cognitive and neural bases of visual perception are typically studied using pictures rather than real-world stimuli. Unlike pictures, real objects are actionable solids that can be manipulated with the hands. Recent evidence from human brain imaging suggests that neural responses to real objects differ from responses to pictures; however, little is known about the neural mechanisms that drive these differences. Here, we tested whether brain responses to real objects versus pictures are differentially modulated by the "in-the-moment" graspability of the stimulus. In human dorsal cortex, electroencephalographic responses show a "real object advantage" in the strength and duration of mu (μ) and low beta (β) rhythm desynchronization-well-known neural signatures of visuomotor action planning. We compared desynchronization for real tools versus closely matched pictures of the same objects, when the stimuli were positioned unoccluded versus behind a large transparent barrier that prevented immediate access to the stimuli. We found that, without the barrier in place, real objects elicited stronger μ and β desynchronization compared to pictures, both during stimulus presentation and after stimulus offset, replicating previous findings. Critically, however, with the barrier in place, this real object advantage was attenuated during the period of stimulus presentation, whereas the amplification in later periods remained. These results suggest that the "real object advantage" is driven initially by immediate actionability, whereas later differences perhaps reflect other, more inherent properties of real objects. The findings showcase how the use of richer multidimensional stimuli can provide a more complete and ecologically valid understanding of object vision.
Collapse
|
42
|
Asbee J, Parsons TD. Effects of Transcranial Direct Current Stimulation on Cognitive and Affective Outcomes Using Virtual Stimuli: A Systematic Review. CYBERPSYCHOLOGY, BEHAVIOR AND SOCIAL NETWORKING 2021; 24:699-714. [PMID: 33625878 DOI: 10.1089/cyber.2020.0301] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Transcranial direct current stimulation (tDCS) is a noninvasive form of brain stimulation used to influence neural activity. While early tDCS studies primarily used static stimuli, there is growing interest in dynamic stimulus presentations using virtual environments (VEs). This review attempts to convey the state of the field. This is not a quantitative meta-analysis as there are not yet enough studies following consistent protocols and/or reporting adequate data. In addition to reviewing the state of the literature, this review includes an exploratory analysis of the available data. Following preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines, studies were culled from several databases. Results from this review reveal differences between online and offline stimulation. While offline stimulation did not influence affective and cognitive outcomes, online stimulation led to small changes in affect and cognition. Future studies should include randomized controlled trials with larger samples. Furthermore, greater care needs to be applied to full data reporting (e.g., means, standard deviations, and data for their nonsignificant findings) to improve our understanding of the combined effects of virtual stimuli with tDCS.
Collapse
Affiliation(s)
- Justin Asbee
- Department of Psychology, University of North Texas, Denton, Texas, USA
- Computational Neuropsychology & Simulation (CNS) Laboratory, University of North Texas, Denton, Texas, USA
| | - Thomas D Parsons
- Computational Neuropsychology & Simulation (CNS) Laboratory, University of North Texas, Denton, Texas, USA
- College of Information, University of North Texas, Denton, Texas, USA
| |
Collapse
|
43
|
Stagg S, Tan LH, Kodakkadan F. Emotion Recognition and Context in Adolescents with Autism Spectrum Disorder. J Autism Dev Disord 2021; 52:4129-4137. [PMID: 34617238 DOI: 10.1007/s10803-021-05292-2] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/09/2021] [Indexed: 10/16/2022]
Abstract
Emotion recognition research in autism has provided conflicting results and has ignored the role of context. We examined whether autistic adolescents use context to identify displayed and felt emotion. Twenty adolescents with autism and 20 age-matched neurotypical adolescents identified emotions from a standardised set of images. The groups also viewed video scenes in which actors displayed a feigned emotion masking their true feelings. Participants identified the displayed and felt emotions. Both groups identified emotions from static images equally well. In the video condition, the autism group was unable to distinguish between the displayed and felt emotions. Emotion research is often divorced from context. Our findings suggest that autistic individuals have difficulty integrating contextual cues when processing emotions.
Collapse
Affiliation(s)
- Steven Stagg
- Anglia Ruskin University, East Road, Cambridge, CB1 1PT, UK.
| | - Li-Huan Tan
- Anglia Ruskin University, East Road, Cambridge, CB1 1PT, UK
| |
Collapse
|
44
|
Tracking developmental differences in real-world social attention across adolescence, young adulthood and older adulthood. Nat Hum Behav 2021; 5:1381-1390. [PMID: 33986520 PMCID: PMC7611872 DOI: 10.1038/s41562-021-01113-9] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2020] [Accepted: 04/12/2021] [Indexed: 02/03/2023]
Abstract
Detecting and responding appropriately to social information in one's environment is a vital part of everyday social interactions. Here, we report two preregistered experiments that examine how social attention develops across the lifespan, comparing adolescents (10-19 years old), young (20-40 years old) and older (60-80 years old) adults. In two real-world tasks, participants were immersed in different social interaction situations-a face-to-face conversation and navigating an environment-and their attention to social and non-social content was recorded using eye-tracking glasses. The results revealed that, compared with young adults, adolescents and older adults attended less to social information (that is, the face) during face-to-face conversation, and to people when navigating the real world. Thus, we provide evidence that real-world social attention undergoes age-related change, and these developmental differences might be a key mechanism that influences theory of mind among adolescents and older adults, with potential implications for predicting successful social interactions in daily life.
Collapse
|
45
|
Hartz A, Guth B, Jording M, Vogeley K, Schulte-Rüther M. Temporal Behavioral Parameters of On-Going Gaze Encounters in a Virtual Environment. Front Psychol 2021; 12:673982. [PMID: 34421731 PMCID: PMC8377250 DOI: 10.3389/fpsyg.2021.673982] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2021] [Accepted: 06/29/2021] [Indexed: 11/25/2022] Open
Abstract
To navigate the social world, humans heavily rely on gaze for non-verbal communication, as it conveys information in a highly dynamic and complex, yet concise manner: for instance, humans effortlessly use gaze to direct and infer the attention of a possible interaction partner. Many traditional paradigms in social gaze research, however, rely on static methods of assessing gaze interaction, e.g., by using images or prerecorded videos as stimulus material. Emerging gaze-contingent paradigms, in which algorithmically controlled virtual characters can respond flexibly to the gaze behavior of humans, provide high ecological validity. Ideally, these are based on models of human behavior that allow for precise, parameterized characterization of behavior, and they should include variable interactive settings and different communicative states of the interacting agents. The present study provides a complete definition and empirical description of a behavioral parameter space of human gaze behavior in extended gaze encounters. To this end, we (i) modeled a shared 2D virtual environment on a computer screen in which a human could interact via gaze with an agent while objects were simultaneously presented to create instances of joint attention, and (ii) determined quantitatively the free model parameters (temporal and probabilistic) of behavior within this environment to provide a first complete, detailed description of the behavioral parameter space governing joint attention. This knowledge is essential for modeling interacting agents with a high degree of ecological validity, be it for cognitive studies or applications in human-robot interaction.
Collapse
Affiliation(s)
- Arne Hartz
- Molecular Neuroscience and Neuroimaging (INM-11), Institute of Neuroscience and Medicine, Jülich Research Center, Jülich, Germany.,Translational Brain Research, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH Aachen, Aachen, Germany
| | - Björn Guth
- Translational Brain Research, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH Aachen, Aachen, Germany
| | - Mathis Jording
- Department of Psychiatry and Psychotherapy, University Hospital Cologne, Cologne, Germany.,Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Jülich Research Center, Jülich, Germany
| | - Kai Vogeley
- Department of Psychiatry and Psychotherapy, University Hospital Cologne, Cologne, Germany.,Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Jülich Research Center, Jülich, Germany
| | - Martin Schulte-Rüther
- Molecular Neuroscience and Neuroimaging (INM-11), Institute of Neuroscience and Medicine, Jülich Research Center, Jülich, Germany.,Translational Brain Research, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH Aachen, Aachen, Germany.,Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Jülich Research Center, Jülich, Germany.,Department of Child and Adolescent Psychiatry and Psychotherapy, University Medical Center Göttingen, Göttingen, Germany
| |
Collapse
|
46
|
Smith ME, Loschky LC, Bailey HR. Knowledge guides attention to goal-relevant information in older adults. COGNITIVE RESEARCH-PRINCIPLES AND IMPLICATIONS 2021; 6:56. [PMID: 34406505 PMCID: PMC8374018 DOI: 10.1186/s41235-021-00321-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/27/2020] [Accepted: 07/31/2021] [Indexed: 11/18/2022]
Abstract
How does viewers’ knowledge guide their attention while they watch everyday events, how does it affect their memory, and does it change with age? Older adults have diminished episodic memory for everyday events, but intact semantic knowledge. Indeed, research suggests that older adults may rely on their semantic memory to offset impairments in episodic memory, and when relevant knowledge is lacking, older adults’ memory can suffer. Yet the mechanism by which prior knowledge guides attentional selection when watching dynamic activity is unclear. To address this, we studied the influence of knowledge on attention and memory for everyday events in young and older adults by tracking their eyes while they watched videos. The videos depicted activities that older adults perform more frequently than young adults (balancing a checkbook, planting flowers) or activities that young adults perform more frequently than older adults (installing a printer, setting up a video game). Participants completed free recall, recognition, and order memory tests after each video. We found age-related memory deficits when older adults had little knowledge of the activities, but memory did not differ between age groups when older adults had relevant knowledge of and experience with the activities. Critically, results showed that knowledge influenced where viewers fixated when watching the videos. Older adults fixated goal-relevant information less than young adults did when watching young adult activities, but they fixated goal-relevant information similarly to young adults when watching older adult activities. Finally, results showed that fixating goal-relevant information predicted free recall of the everyday activities for both age groups. Thus, older adults may use relevant knowledge to more effectively infer the goals of actors, which guides their attention to goal-relevant actions and thereby improves their episodic memory for everyday activities.
Collapse
Affiliation(s)
- Maverick E Smith
- Department of Psychological Sciences, Kansas State University, 471 Bluemont Hall, 1100 Mid-campus Dr., Manhattan, KS, 66506, USA.
| | - Lester C Loschky
- Department of Psychological Sciences, Kansas State University, 471 Bluemont Hall, 1100 Mid-campus Dr., Manhattan, KS, 66506, USA
| | - Heather R Bailey
- Department of Psychological Sciences, Kansas State University, 471 Bluemont Hall, 1100 Mid-campus Dr., Manhattan, KS, 66506, USA
| |
Collapse
|
47
|
Gregory SEA. Investigating facilitatory versus inhibitory effects of dynamic social and non-social cues on attention in a realistic space. PSYCHOLOGICAL RESEARCH 2021; 86:1578-1590. [PMID: 34374844 PMCID: PMC9177496 DOI: 10.1007/s00426-021-01574-7] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2021] [Accepted: 07/29/2021] [Indexed: 11/25/2022]
Abstract
This study aimed to investigate the facilitatory versus inhibitory effects of dynamic, non-predictive central cues presented in a realistic environment. Realistic human avatars initiated eye contact and then dynamically looked to the left, right or centre of a table. A moving stick served as a non-social control cue, and participants localised (Experiment 1) or discriminated (Experiment 2) a contextually relevant target (teapot/teacup). The cue's movement took 500 ms, and stimulus onset asynchronies (SOAs; 150 ms/300 ms/500 ms/1000 ms) were measured from movement initiation. Similar cuing effects were seen for the social avatar and the non-social stick cue across tasks. Results showed facilitatory processes without inhibition, though there was some variation by SOA and task. This is the first time facilitatory versus inhibitory processes have been directly investigated where eye contact is initiated prior to the gaze shift. These dynamic stimuli allow a better understanding of how attention might be cued in more realistic environments.
Collapse
Affiliation(s)
- Samantha E A Gregory
- Aston Institute of Health and Neurodevelopment, Aston University, Birmingham, B4 7ET, UK.
| |
Collapse
|
48
|
Hietanen JK, Peltola MJ. The eye contact smile: The effects of sending and receiving a direct gaze. VISUAL COGNITION 2021. [DOI: 10.1080/13506285.2021.1915904] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Affiliation(s)
- Jari K. Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
| | - Mikko J. Peltola
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
| |
Collapse
|
49
|
Kask A, Põldver N, Ausmees L, Kreegipuu K. Subjectively different emotional schematic faces not automatically discriminated from the brain's bioelectrical responses. Conscious Cogn 2021; 93:103150. [PMID: 34051391 DOI: 10.1016/j.concog.2021.103150] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2020] [Revised: 05/03/2021] [Accepted: 05/05/2021] [Indexed: 10/21/2022]
Abstract
The present study investigates how the brain automatically discriminates emotional schematic faces, as indicated by mismatch responses, and how reliable these brain responses are. Thirty-three healthy volunteers participated in a visual mismatch negativity (vMMN) EEG experiment with four experimental sets differing from each other by the type of standard (an object with scrambled face features) and the type of deviant (Angry, Happy and Neutral schematic faces) presented. Conscious subjective evaluations of the valence, arousal and attention-catching quality of the same stimuli showed clear differentiation of emotional expressions. Deviant faces elicited rather similar vMMN at frontal and occipital sites. Bayesian analyses suggest that vMMN does not differ between angry and happy faces. Neutral faces, however, did not yield statistically significant vMMN at occipital leads. Pearson's correlation and intra-class correlation analyses showed that the brain's reactions to the stimuli were highly stable within individuals across the experimental sets, whereas the mismatch responses were much more variable.
Collapse
Affiliation(s)
- Annika Kask
- Institute of Psychology, University of Tartu, Tartu, Estonia; Doctoral School of Behavioural, Social and Health Sciences, Tartu, Estonia
| | - Nele Põldver
- Institute of Psychology, University of Tartu, Tartu, Estonia
| | - Liisi Ausmees
- Institute of Psychology, University of Tartu, Tartu, Estonia
| | - Kairi Kreegipuu
- Institute of Psychology, University of Tartu, Tartu, Estonia.
| |
Collapse
|
50
|
Haensel JX, Smith TJ, Senju A. Cultural differences in mutual gaze during face-to-face interactions: A dual head-mounted eye-tracking study. VISUAL COGNITION 2021. [DOI: 10.1080/13506285.2021.1928354] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Affiliation(s)
- Jennifer X. Haensel
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
| | - Tim J. Smith
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
| | - Atsushi Senju
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
| |
Collapse
|