1
Hsu CW, Gross J, Colombo M, Hayne H. Look into my eyes: a "faceless" avatar interviewer lowers reporting threshold for adult eyewitnesses. Mem Cognit 2023;51:1761-1773. [PMID: 37072575] [PMCID: PMC10638134] [DOI: 10.3758/s13421-023-01424-4]
Abstract
Evidential interviewing is often used to gather important information, which can determine the outcome of a criminal case. An interviewer's facial features, however, may impact reporting during this task. Here, we investigated adults' interview performance using a novel tool, a faceless avatar interviewer, designed to minimize the impact of an interviewer's visual communication signals and thereby potentially enhance memory performance. Adults were interviewed about the details of a video by (1) a human-appearing avatar or a human interviewer (Experiment 1; N = 105) or (2) a human-appearing avatar or a faceless avatar interviewer (Experiment 2; N = 109). Participants assigned to the avatar interviewer condition were (1) asked whether they thought the interviewer was computer or human operated (Experiment 1) or (2) explicitly told that the interviewer was either computer or human operated (Experiment 2). Adults' memory performance was statistically equivalent when they were interviewed by a human-appearing avatar or a human interviewer, but, relative to the human-appearing avatar, adults who were interviewed by a faceless avatar reported more correct (but also more incorrect) details in response to free-recall questions. Participants who indicated that the avatar interviewer was computer operated, as opposed to human operated, provided more accurate memory reports, but specifically telling participants that the avatar was computer operated or human operated had no influence on their memory reports. The present study introduced a novel interviewing tool and highlighted the possible cognitive and social influences of an interviewer's facial features on adults' reports of a witnessed event.
Affiliation(s)
- Che-Wei Hsu
- Department of Psychology, University of Otago, Dunedin, New Zealand.
- Department of Psychological Medicine, University of Otago, PO Box 54, Dunedin, 9054, New Zealand.
- Julien Gross
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Marea Colombo
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Harlene Hayne
- Department of Psychology, University of Otago, Dunedin, New Zealand
- School of Population Health, Curtin University, Perth, Australia
2
Clin E, Kissine M. Neurotypical, but not autistic, adults might experience distress when looking at someone avoiding eye contact: A live face-to-face paradigm. Autism 2023;27:1949-1959. [PMID: 36688307] [DOI: 10.1177/13623613221148553]
Abstract
LAY ABSTRACT What is already known about the topic? Autistics are usually reported to make less eye contact with their interlocutors than neurotypicals do. However, the reason why autistics might pay less attention to eyes looking at them is still unknown: some autistics report being hyper-aroused by eye contact, while some eye-tracking studies suggest that eye contact is associated with hypo-arousal in autism. What this paper adds? This study is based on a highly controlled live face-to-face paradigm, combining a wearable eye-tracker (to study eye behaviours) with electrodermal activity sensors (to assess potential stress). We draw a nuanced picture of social attention in autism, as our autistic participants did not differ from our neurotypical group in their eye behaviours or their skin conductance responses. However, we found that neurotypicals, compared to autistics, seemed to be much more distressed when their interlocutor did not gaze at them during the experiment. Implications for practice, research or policy: Our study encourages researchers to consider social interaction difficulties in autism as a relational issue rather than an individual deficit. A first step might be taken in research, by implementing paradigms sensitive to the experimenter's role and attitude.
3
Clin E, Kissine M. Listener- Versus Speaker-Oriented Disfluencies in Autistic Adults: Insights From Wearable Eye-Tracking and Skin Conductance Within a Live Face-to-Face Paradigm. J Speech Lang Hear Res 2023:1-19. [PMID: 37418752] [DOI: 10.1044/2023_jslhr-23-00002]
Abstract
PURPOSE Our study addresses three main questions: (a) Do autistics and neurotypicals produce different patterns of disfluencies, depending on the experimenter's direct versus averted gaze? (b) Are these patterns correlated with gender, skin conductance responses, fixations on the experimenter's face, alexithymia, or social anxiety scores? Lastly, (c) can eye-tracking and electrodermal activity data be used to distinguish listener- versus speaker-oriented disfluencies? METHOD Within a live face-to-face paradigm combining a wearable eye-tracker with electrodermal activity sensors, 80 adults (40 autistics, 40 neurotypicals) defined words in front of an experimenter who was either staring at their eyes (direct gaze condition) or looking elsewhere (averted gaze condition). RESULTS Autistics produce fewer listener-oriented (uh, um) and more speaker-oriented (prolongations, breath) disfluencies than neurotypicals. In both groups, men produce um less often than women. Both autistics' and neurotypicals' speech is influenced by whether their interlocutor systematically looks them in the eyes or not, but their reactions go in opposite directions. Disfluencies seem to be primarily linguistic phenomena, as experienced stress, social attention, alexithymia, and social anxiety scores do not influence any of the reported results. Finally, eye-tracking and electrodermal activity data suggest that laughter could be a listener-oriented disfluency. CONCLUSIONS This article studies disfluencies in a fine-grained way in autistic and neurotypical adults while controlling for social attention, experienced stress, and experimental condition (direct vs. averted gaze). It adds to the current literature by (a) enlarging our knowledge of speech in autism, (b) opening new perspectives on disfluency patterns as important signals in social interaction, (c) addressing theoretical issues on the dichotomy between listener- and speaker-oriented disfluencies, and (d) considering understudied phenomena, such as laughter and breath, as potential disfluencies. SUPPLEMENTAL MATERIAL https://doi.org/10.23641/asha.23549550.
Affiliation(s)
- Elise Clin
- ACTE, LaDisco and ULB Neuroscience Institute, Université Libre de Bruxelles, Brussels, Belgium
- Mikhail Kissine
- ACTE, LaDisco and ULB Neuroscience Institute, Université Libre de Bruxelles, Brussels, Belgium
4
Li K, Lu A, Deng R, Yi H. The Unique Cost of Human Eye Gaze in Cognitive Control: Being Human-Specific and Body-Related? Psichologija 2022. [DOI: 10.15388/psichol.2022.59]
Abstract
This study investigated the eye-gaze cost in cognitive control and whether it is human-specific and body-related. In Experiment 1, we explored whether there was a cost of human eye gaze in cognitive control, and extended this question by examining the role of emotion in the cost. The Stroop effect was larger in the eye-gaze condition than in the vertical-grating condition, and it was comparable across positive, negative, and neutral trials. In Experiment 2, we explored whether the eye-gaze cost in cognitive control was limited to human eyes. The Stroop effect was not larger in the feline eye-gaze condition, nor was it modulated by emotion. In Experiment 3, we explored whether the mouth could elicit a cost in the Stroop effect. The Stroop effect was not significantly larger in the mouth condition than in the vertical-grating condition, nor did it differ across positive, negative, and neutral conditions. The results suggest that (1) there is a robust cost of eye gaze in cognitive control; (2) this cost is specific to human eyes and does not extend to animal eyes; and (3) the cost is elicited by human eyes but not by the human mouth. This study supports the notion that the presentation of social cues, such as human eyes, can influence attentional processing, and it provides preliminary evidence that the human eye plays an important role in cognitive processing.
5
Darfler M, Cruz-Garza JG, Kalantari S. An EEG-Based Investigation of the Effect of Perceived Observation on Visual Memory in Virtual Environments. Brain Sci 2022;12:269. [PMID: 35204033] [PMCID: PMC8870655] [DOI: 10.3390/brainsci12020269]
Abstract
The presence of external observers has been shown to affect performance on cognitive tasks, but the parameters of this impact for different types of tasks, and the underlying neural dynamics, are less understood. The current study examined the behavioral and brain-activity effects of perceived observation on participants' visual working memory (VWM) in a virtual reality (VR) classroom setting, using the task format as a moderating variable. Participants (n = 21) were equipped with a 57-channel EEG cap, and neural data were collected as they completed two VWM tasks under two observation conditions (observed and not observed) in a within-subjects experimental design. The "observation" condition was operationalized through the addition of a static human avatar in the VR classroom. The avatar's presence significantly extended task response time, but no effect was found on task accuracy. This outcome may have been due to a ceiling effect, as the mean participant task scores were quite high. EEG data analysis supported the behavioral findings by showing consistent differences between the no-observation and observation conditions for one of the VWM tasks only. These neural differences were identified in the dorsolateral prefrontal cortex (dlPFC) and occipital cortex (OC) regions, with higher theta-band activity occurring in the dlPFC during stimulus encoding and in the OC during response selection when the "observing" avatar was present. These findings provide evidence that perceived observation can inhibit performance during visual tasks by altering attentional focus, even in virtual contexts.
6
Abstract
Mourning constitutes an important human emotion, which, when it lasts too long, can contribute to major depressive symptoms, among other things. To date, no study has investigated whether mourning is related to specific psychophysiological activation patterns. Therefore, we examined physiological reactions induced by iconographic mourning-related stimuli, in comparison to neutral and attachment stimuli, in healthy adults (N = 77, mean age: 21.9). We evaluated pupillometric and eye-tracking parameters as well as heart rate variability (HRV) and skin conductance (EDA). Eye-tracking revealed a more strongly dilated pupil during mourning in comparison to the neutral, but not the attachment, condition; furthermore, fixation patterns revealed fewer fixations on mourning stimuli. While high-frequency HRV was reduced during mourning and attachment, we found no differences in EDA parameters between conditions. The results suggest specific eye-movement and pupil adaptations during representations of mourning, which might point toward inward cognition or avoidance, but no specific physiological pattern concerning HRV and EDA.
7
Tracking developmental differences in real-world social attention across adolescence, young adulthood and older adulthood. Nat Hum Behav 2021;5:1381-1390. [PMID: 33986520] [PMCID: PMC7611872] [DOI: 10.1038/s41562-021-01113-9]
Abstract
Detecting and responding appropriately to social information in one's environment is a vital part of everyday social interactions. Here, we report two preregistered experiments that examine how social attention develops across the lifespan, comparing adolescents (10-19 years old), young (20-40 years old) and older (60-80 years old) adults. In two real-world tasks, participants were immersed in different social interaction situations (a face-to-face conversation and navigating an environment) and their attention to social and non-social content was recorded using eye-tracking glasses. The results revealed that, compared with young adults, adolescents and older adults attended less to social information (that is, the face) during face-to-face conversation, and to people when navigating the real world. Thus, we provide evidence that real-world social attention undergoes age-related change, and these developmental differences might be a key mechanism that influences theory of mind among adolescents and older adults, with potential implications for predicting successful social interactions in daily life.
8
Nash A, Ridout N, Nash RA. Facing away from the interviewer: Evidence of little benefit to eyewitnesses' memory performance. Appl Cogn Psychol 2020. [DOI: 10.1002/acp.3723]
Affiliation(s)
- Alena Nash
- Department of Psychology, School of Life and Health Sciences, Aston University, UK
- Nathan Ridout
- Department of Psychology, School of Life and Health Sciences, Aston University, UK
- Robert A. Nash
- Department of Psychology, School of Life and Health Sciences, Aston University, UK
9
Desideri L, Bonifacci P, Croati G, Dalena A, Gesualdo M, Molinario G, Gherardini A, Cesario L, Ottaviani C. The Mind in the Machine: Mind Perception Modulates Gaze Aversion During Child–Robot Interaction. Int J Soc Robot 2020. [DOI: 10.1007/s12369-020-00656-7]
10
Intentionally distracting: Working memory is disrupted by the perception of other agents attending to you - even without eye-gaze cues. Psychon Bull Rev 2019;26:951-957. [PMID: 30324506] [DOI: 10.3758/s13423-018-1530-x]
Abstract
Of all the visual stimuli you can perceive, perhaps the most important are other people's eyes. And this is especially true when those eyes are looking at you: direct gaze has profound influences, even at the level of basic cognitive processes such as working memory. For example, memory for the properties of simple geometric shapes is disrupted by the presence of other eyes gazing at you. But are such effects really specific to direct gaze per se? Seeing eyes is undoubtedly important, but presumably only because of what it tells us about the "mind behind the eyes" - i.e., about others' attention and intentions. This suggests that the same effects might arise even without eyes, as long as an agent's directed attention is conveyed by other means. Here we tested the impact on working memory of simple "mouth" shapes - which in no way resemble eyes, yet can still be readily seen as intentionally facing you (or not). Just as with gaze cues, the ability to detect changes in geometric shapes was impaired by direct (compared to averted) mouths - but not in very similar control stimuli that were not perceived as intentional. We conclude that this disruption of working memory reflects a general phenomenon of "mind contact," rather than a specific effect of eye contact.
11
Kamermans KL, Pouw W, Mast FW, Paas F. Reinterpretation in visual imagery is possible without visual cues: a validation of previous research. Psychol Res 2019;83:1237-1250. [PMID: 29242975] [PMCID: PMC6647238] [DOI: 10.1007/s00426-017-0956-5]
Abstract
Is visual reinterpretation of bistable figures (e.g., the duck/rabbit figure) in visual imagery possible? Current consensus suggests that it is in principle possible because of converging evidence of quasi-pictorial functioning of visual imagery. Yet, studies that have directly tested and found evidence for reinterpretation in visual imagery allow for the possibility that reinterpretation was already achieved during memorization of the figure(s). One study resolved this issue, providing evidence for reinterpretation in visual imagery (Mast and Kosslyn, Cognition 86:57-70, 2002). However, participants in that study performed reinterpretations with the aid of visual cues. Hence, reinterpretation was not performed with mental imagery alone. Therefore, in this study we assessed the possibility of reinterpretation without visual support. We further explored the possible role of haptic cues to assess the multimodal nature of mental imagery. Fifty-three participants were consecutively presented with three to-be-remembered bistable 2-D figures (reinterpretable when rotated 180°), two of which were visually inspected and one of which was explored haptically. After memorization of the figures, a visually bistable exemplar figure was presented to ensure understanding of the concept of visual bistability. During recall, 11 participants (out of 36; 30.6%) who did not spot the bistability during memorization successfully performed reinterpretations when instructed to mentally rotate their visual image, but additional haptic cues during mental imagery did not inflate reinterpretation ability. This study validates previous findings that reinterpretation in visual imagery is possible.
Collapse
Affiliation(s)
- Kevin L Kamermans
- Department of Psychology, Education and Child Studies, Erasmus University Rotterdam, Rotterdam, The Netherlands
- Wim Pouw
- Department of Psychology, Education and Child Studies, Erasmus University Rotterdam, Rotterdam, The Netherlands.
- Department of Psychological Sciences, University of Connecticut, Storrs, USA.
- Fred W Mast
- Department of Psychology, University of Bern, Bern, Switzerland
- Fred Paas
- Department of Psychology, Education and Child Studies, Erasmus University Rotterdam, Rotterdam, The Netherlands
- Early Start Research Institute, University of Wollongong, Wollongong, Australia
12
Taylor DA, Dando CJ. Eyewitness Memory in Face-to-Face and Immersive Avatar-to-Avatar Contexts. Front Psychol 2018;9:507. [PMID: 29719520] [PMCID: PMC5913374] [DOI: 10.3389/fpsyg.2018.00507]
Abstract
Technological advances offer possibilities for innovation in the way eyewitness testimony is elicited. Typically, this occurs face-to-face. We investigated whether a virtual environment, in which interviewer and eyewitness communicate as avatars, might confer advantages by attenuating the social and situational demands of a face-to-face interview, releasing more cognitive resources for invoking episodic retrieval mode. In conditions of intentional encoding, eyewitnesses were interviewed 48 h later, either face-to-face or in a virtual environment (N = 38). Participants in the virtual environment significantly outperformed those interviewed face-to-face on all episodic performance measures: improved correct reporting, reduced errors, and increased accuracy. Participants reported finding it easier to admit not remembering event information to the avatar, and finding the avatar easier to talk to. These novel findings and our pattern of retrieval results indicate the potential of avatar-to-avatar communication in virtual environments, and provide impetus for further research investigating eyewitness cognition in contemporary retrieval contexts.
Affiliation(s)
- Coral J. Dando
- Department of Psychology, University of Westminster, London, United Kingdom
13
Benedek M, Stoiser R, Walcher S, Körner C. Eye Behavior Associated with Internally versus Externally Directed Cognition. Front Psychol 2017;8:1092. [PMID: 28713304] [PMCID: PMC5491649] [DOI: 10.3389/fpsyg.2017.01092]
Abstract
What do our eyes do when we are focused on internal representations, such as during imagination or planning? Evidence from mind wandering research suggests that spontaneous shifts from externally directed cognition (EDC) to internally directed cognition (IDC) involve oculomotor changes indicative of visual disengagement. In the present study, we investigated potential differences in eye behavior between goal-directed forms of IDC and EDC. To this end, we manipulated the focus of attention (internal versus external) in two demanding cognitive tasks (anagram and sentence generation). IDC was associated with fewer and longer fixations and higher variability in pupil diameter and eye vergence compared to EDC, suggesting reduced visual scanning and higher spontaneous eye activity. IDC was further related to longer blinks, lower microsaccade frequency, and a lower angle of eye vergence. These latter changes appear conducive to attenuating visual input and thereby shielding ongoing internal processes from external distraction. Together, these findings suggest that IDC is accompanied by characteristic eye behavior that reflects a decoupling of attention from external events and serves to gate out visual input.
Affiliation(s)
- Robert Stoiser
- Institute of Psychology, University of Graz, Graz, Austria
- Sonja Walcher
- Institute of Psychology, University of Graz, Graz, Austria
14
When we cannot speak: Eye contact disrupts resources available to cognitive control processes during verb generation. Cognition 2016;157:352-357. [PMID: 27750156] [DOI: 10.1016/j.cognition.2016.10.002]
Abstract
Although eye contact and verbal processing appear independent, people frequently avert their eyes from interlocutors during conversation. This suggests that there is interference between the two processes. We hypothesized that such interference occurs because both processes share the cognitive resources of a domain-general system, and we explored the influence of eye contact on simultaneous verb generation processes (i.e., retrieval and selection). In the present experiment, viewing a movie of faces with eyes directed toward the viewer delayed verbal generation more than a movie of faces with averted eyes; however, this effect was present only when both retrieval and selection demands were high. The results support the hypothesis that eye contact shares domain-general cognitive resources with verb generation. This further indicates that a full understanding of functional and dysfunctional communication must consider the interaction and interference of verbal and non-verbal channels.
15
Vredeveldt A, Perfect TJ. Reduction of environmental distraction to facilitate cognitive performance. Front Psychol 2014;5:860. [PMID: 25147535] [PMCID: PMC4123724] [DOI: 10.3389/fpsyg.2014.00860]
Affiliation(s)
- Annelies Vredeveldt
- Department of Criminal Law and Criminology, VU University Amsterdam, Amsterdam, Netherlands
16
Craik FIM. Effects of distraction on memory and cognition: a commentary. Front Psychol 2014;5:841. [PMID: 25120527] [PMCID: PMC4114291] [DOI: 10.3389/fpsyg.2014.00841]
Abstract
This commentary reviews the findings and ideas reported in the preceding nine articles on the effects of distraction on aspects of cognitive performance. The articles themselves deal with the disruptive effects of distraction on recall of words, objects and events, as well as on visual processing, category formation and other cognitive tasks. The commentary assesses the part played by "domain-general" suppression of distracting information and the "domain-specific" competition arising when tasks and distraction involve very similar material. Some forms of distraction are meaningfully relevant to the ongoing task, and Treisman's (1964) model of selective attention is invoked to provide an account of findings in this area. Finally, individual differences in vulnerability to distraction are discussed; older adults are particularly affected by distracting stimuli, although the failure to suppress distraction can sometimes prove beneficial to later cognitive performance.
Affiliation(s)
- Fergus I M Craik
- Rotman Research Institute of Baycrest, Baycrest Centre, Toronto, ON, Canada