1
Korda Ž, Walcher S, Körner C, Benedek M. Decoupling of the pupillary light response during internal attention: The modulating effect of luminance intensity. Acta Psychol (Amst) 2024; 242:104123. PMID: 38181698. DOI: 10.1016/j.actpsy.2023.104123.
Abstract
In a world full of sensory stimuli, attention guides us between the external environment and our internal thoughts. While external attention involves processing sensory stimuli, internal attention is devoted to self-generated representations such as planning or spontaneous mind wandering. Both draw on common cognitive resources, so simultaneous engagement in both often leads to interference between processes. To maintain internal focus, an attentional mechanism known as perceptual decoupling takes effect. This mechanism supports internal cognition by decoupling attention from the perception of sensory information. Two previous studies from our lab investigated to what extent perceptual decoupling is evident in voluntary eye movements. Findings showed that the effect is mediated by the internal task modality and workload (visuospatial > arithmetic and high > low, respectively). However, it remains unclear whether it extends to involuntary eye behavior, which may not share cognitive resources with internal activities. The present experiment therefore aimed to further elucidate attentional dynamics by examining whether internal attention affects the pupillary light response (PLR). We consistently observed that the workload and task modality of the internal task reduced the PLR to luminance changes of medium intensity. However, the PLR to strong luminance changes was less affected, or not affected at all, by the internal task. These results suggest that perceptual decoupling effects may be less consistent in involuntary eye behavior, particularly in the context of a salient visual stimulus.
Affiliation(s)
- Živa Korda
- Department of Psychology, University of Graz, Graz, Austria.
- Sonja Walcher
- Department of Psychology, University of Graz, Graz, Austria
2
Magliacano A, Catalano L, Sagliano L, Estraneo A, Trojano L. Spontaneous eye blinking during an auditory, an interoceptive and a visual task: The role of the sensory modality and the attentional focus. Cortex 2023; 168:49-61. PMID: 37659289. DOI: 10.1016/j.cortex.2023.07.006.
Abstract
Previous evidence suggested that spontaneous eye blinking changes as a function of the attentional focus. In particular, eye blink rate (EBR) tends to increase when attention is directed to internal versus environmental processing. Most studies on this issue compared eye blinking during visual and mental imagery tasks, and interpreted the increase in EBR as a mechanism to focus cognitive resources on internal processing by disengaging attention from interfering information. However, since eye blinking also depends on the sensory modality of the task, the findings might be influenced by a modality-specific effect. In the present Registered Report, we aimed to investigate whether an environmental versus internal attentional focus can affect spontaneous blinking behaviour in non-visual tasks as well, in conditions where visual stimuli are not relevant. In a within-subject design, healthy participants performed an interoceptive task (i.e., heartbeat counting) and an auditory task in which pre-recorded heartbeats were presented aurally; during both tasks irrelevant visual stimuli were also presented. In a further control condition with the same auditory and visual stimuli, the participants were required to focus their attention on the visual stimuli. Participants' EBR was recorded during each task by means of an eye-tracking system. We found that, although the interoceptive task was more difficult than the auditory and visual tasks, participants' EBR decreased to a comparable degree in all tasks with respect to a rest condition, with no differences between internal versus environmental conditions. The present findings do not support the idea that EBR is modulated by an internal versus external focus of attention, at least in the presence of controlled visual stimulation.
Affiliation(s)
- Laura Catalano
- Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy
- Laura Sagliano
- Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy
- Luigi Trojano
- Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy.
3
Yule S, Robertson JM, Mormann B, Smink DS, Lipsitz S, Abahuje E, Kennedy-Metz L, Park S, Miccile C, Pozner CN, Doyle T, Musson D, Dias RD. Crew Autonomy During Simulated Medical Event Management on Long Duration Space Exploration Missions. Hum Factors 2023; 65:1221-1234. PMID: 35430922. PMCID: PMC10466940. DOI: 10.1177/00187208211067575.
Abstract
OBJECTIVE Our primary aim was to investigate crew performance during medical emergencies with and without ground support from a flight surgeon located at mission control. BACKGROUND There are gaps in knowledge regarding the potential for unanticipated in-flight medical events to affect crew health and capacity, and potentially compromise mission success. Additionally, ground support may be impaired or periodically absent during long duration missions. METHOD We reviewed video recordings of 16 three-person flight crews, each managing four unique medical events in a fully immersive spacecraft simulator. Crews were randomized to two conditions: with and without telemedical flight surgeon (FS) support. We assessed differences in technical performance, behavioral skills, and cognitive load between groups. RESULTS Crews with FS support performed better clinically, were rated higher on technical skills, and completed more clinical tasks from the medical checklists than crews without FS support. Crews with FS support also had better behavioral/non-technical skills (information exchange) and reported significantly lower cognitive demand during the medical event scenarios on the NASA-TLX scale, particularly in mental demand and temporal demand. There was no significant difference between groups in time to treat or in objective measures of cognitive demand derived from heart rate variability and electroencephalography. CONCLUSION Medical checklists are necessary but not sufficient to support high levels of autonomous crew performance in the absence of real-time flight surgeon support. APPLICATION Potential applications of this research include developing ground-based and in-flight training countermeasures, informing policy regarding autonomous spaceflight, and designing autonomous clinical decision support systems.
Affiliation(s)
- Steven Yule
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA; Center for Surgery & Public Health, Brigham & Women's Hospital, Boston, MA, USA; Department of Surgery, Brigham & Women's Hospital/Harvard Medical School, Boston, MA, USA; Department of Clinical Surgery, The University of Edinburgh, Edinburgh, UK
- Jamie M Robertson
- Department of Surgery, Brigham & Women's Hospital/Harvard Medical School, Boston, MA, USA
- Benjamin Mormann
- Department of Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Douglas S Smink
- Center for Surgery & Public Health, Brigham & Women's Hospital, Boston, MA, USA; Department of Surgery, Brigham & Women's Hospital/Harvard Medical School, Boston, MA, USA
- Stuart Lipsitz
- Center for Surgery & Public Health, Brigham & Women's Hospital, Boston, MA, USA
- Egide Abahuje
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA; Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Lauren Kennedy-Metz
- Department of Surgery, Brigham & Women's Hospital/Harvard Medical School, Boston, MA, USA; Medical Robotics and Computer Assisted Surgery Laboratory, Division of Cardiac Surgery, U.S. Veterans Affairs Boston Healthcare System, Boston, MA, USA
- Sandra Park
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA
- Christian Miccile
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA
- Charles N Pozner
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA; Department of Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Thomas Doyle
- Department of Electrical and Computer Engineering, McMaster University, Hamilton, ON, Canada
- David Musson
- Faculty of Health Science, Northern Ontario School of Medicine, Thunder Bay, ON, Canada
- Roger D Dias
- STRATUS Center for Medical Simulation, Brigham and Women's Hospital, Boston, MA, USA; Department of Emergency Medicine, Harvard Medical School, Boston, MA, USA
4
Schirm J, Gómez-Vargas AR, Perusquía-Hernández M, Skarbez RT, Isoyama N, Uchiyama H, Kiyokawa K. Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality. Sensors (Basel) 2023; 23:6667. PMID: 37571449. PMCID: PMC10422404. DOI: 10.3390/s23156667.
Abstract
Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which recently became more standard in VR and is often used for content-dependent analyses. This research is an endeavor to utilize content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We generated mental load independently from visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to measure the phenomenon of "staring into the distance" without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native and two foreign language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size on average increased 0.512 SDs (0.118 mm), with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results are in agreement with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that hardware and algorithms will be developed in the future to further increase tracking stability.
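The abstract reports pupil-size effects in standard-deviation units (e.g., an increase of 0.512 SDs, 0.118 mm). The paper's exact normalization is not given here; as an illustrative sketch, a baseline-standardized ("relative") pupil metric could be computed per participant like this (function name and data are hypothetical):

```python
import numpy as np

def relative_pupil_size(baseline_mm, task_mm):
    """Express mean pupil size during a task in SD units of the
    participant's own baseline (a z-score-style normalization)."""
    baseline = np.asarray(baseline_mm, dtype=float)
    task = np.asarray(task_mm, dtype=float)
    mu, sd = baseline.mean(), baseline.std(ddof=1)
    return (task.mean() - mu) / sd

# Toy example: pupil dilates slightly under load
baseline = [3.0, 3.1, 2.9, 3.0, 3.05]
task = [3.2, 3.3, 3.25, 3.15]
print(round(relative_pupil_size(baseline, task), 2))  # ≈ 2.9 SDs above baseline
```

Standardizing within participant makes effects comparable across people with different baseline pupil sizes, which is presumably why results like these are reported in SD units alongside millimeters.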
Affiliation(s)
- Johannes Schirm
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Andrés Roberto Gómez-Vargas
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Monica Perusquía-Hernández
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Richard T. Skarbez
- Department of Computer Science and Information Technology, School of Computing, Engineering and Mathematical Sciences, La Trobe University, Melbourne Campus, Melbourne, VIC 3086, Australia
- Naoya Isoyama
- Faculty of Social Information Studies, Otsuma Women's University, Tokyo 102-8357, Japan
- Hideaki Uchiyama
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Kiyoshi Kiyokawa
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
5
Effects of internally directed cognition on smooth pursuit eye movements: A systematic examination of perceptual decoupling. Atten Percept Psychophys 2023; 85:1159-1178. PMID: 36922477. PMCID: PMC10167146. DOI: 10.3758/s13414-023-02688-3.
Abstract
Eye behavior differs between internally and externally directed cognition and thus is indicative of an internal versus external attention focus. Recent work implicated perceptual decoupling (i.e., eye behavior becoming less determined by the sensory environment) as one of the key mechanisms involved in these attention-related eye movement differences. However, it is not yet understood how perceptual decoupling depends on the characteristics of the internal task. Therefore, we systematically examined the effects of varying internal task demands on smooth pursuit eye movements. Specifically, we evaluated the effects of internal workload (control vs. low vs. high) and of internal task modality (arithmetic vs. visuospatial). The results of multilevel modelling showed that effects of perceptual decoupling were stronger for higher workload, and more pronounced for the visuospatial modality. Effects also followed a characteristic time-course relative to internal operations. The findings provide further support for the perceptual decoupling mechanism by showing that it is sensitive to the degree of interference between external and internal information.
6
Abstract
The dynamic creativity framework (DCF) represents a new theoretical perspective for studying the creativity construct. This framework is based on the dynamic definition of creativity, and it has both theoretical and empirical implications. From a theoretical point of view, we review the characteristics of the dynamic creative process and its extension into the dynamic universal creative process, encompassing creativity at different layers of complexity. We discuss the key concept of creative potential, considering individual, sociocultural, and material viewpoints, and we show how the DCF is instrumental in clarifying the relationship between creativity and intelligence, between creativity and anticipation, as well as in introducing the concept of 'organic creativity'. From the empirical perspective, we focus on the dynamic creative process broken down into four phases: i) drive, ii) information, iii) idea generation, iv) idea evaluation. For each phase, we review results obtained through investigations accounting for the dynamic interplay between emotional and cognitive components defining creative performance. Experiments were conducted to measure the role of emotions and attention in driving the dynamic process, considering the processing of apparently irrelevant information and the interaction between idea generation and idea evaluation, always taking into account individual differences as measured through personality traits, performance variables, or lifetime achievement. Neurophysiological evidence is considered in discussing dynamic effects in divergent thinking, such as the serial order effect, as well as the possibility of enhancing creative potential through neurofeedback. Finally, we report on the effects of different environments on the creative process, highlighting the dynamics produced by context-embeddedness.
Affiliation(s)
- Giovanni Emanuele Corazza
- DEI-Marconi Institute for Creativity, University of Bologna, Italy
- Université Paris Cité and University Gustave Eiffel, LaPEA, Boulogne-Billancourt, France
- Sergio Agnoli
- DEI-Marconi Institute for Creativity, University of Bologna, Italy
- Serena Mastria
- DEI-Marconi Institute for Creativity, University of Bologna, Italy
7
Vortmann LM, Putze F. Combining Implicit and Explicit Feature Extraction for Eye Tracking: Attention Classification Using a Heterogeneous Input. Sensors (Basel) 2021; 21:8205. PMID: 34960295. PMCID: PMC8707750. DOI: 10.3390/s21248205.
Abstract
Statistical measurements of eye movement-specific properties, such as fixations, saccades, blinks, or pupil dilation, are frequently utilized as input features for machine learning algorithms applied to eye tracking recordings. These characteristics are intended to be interpretable aspects of eye gaze behavior. However, prior research has demonstrated that when trained on implicit representations of raw eye tracking data, neural networks outperform these traditional techniques. To leverage the strengths and information of both feature sets, in this work we integrated implicit and explicit eye tracking features in one classification approach. A neural network was adapted to process the heterogeneous input and predict the internally and externally directed attention of 154 participants. We compared the accuracies reached by the implicit and combined features for different window lengths and evaluated the approaches in terms of person- and task-independence. The results indicate that combining implicit and explicit feature extraction techniques for eye tracking data significantly improves classification results for attentional state detection. The attentional state was correctly classified during new tasks with an accuracy better than chance, and person-independent classification even outperformed person-dependently trained classifiers for some settings. For future experiments and applications that require eye tracking data classification, we suggest considering implicit data representations in addition to interpretable explicit features.
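The "explicit" features contrasted in this abstract are descriptive statistics computed over eye movement characteristics within an analysis window. A minimal sketch of such a feature extractor, under an assumed input format of per-sample pupil diameters and blink flags (all names and the data layout are hypothetical, not the authors' pipeline):

```python
import numpy as np

def explicit_eye_features(pupil_mm, is_blink):
    """Summary-statistic ('explicit') features over one analysis window,
    of the kind typically fed to classical classifiers."""
    pupil = np.asarray(pupil_mm, dtype=float)
    blink = np.asarray(is_blink, dtype=bool)
    valid = pupil[~blink]  # pupil samples are unreliable during blinks
    # Count blink onsets: transitions from eyes-open to eyes-closed
    blink_onsets = int(np.sum(~blink[:-1] & blink[1:]))
    return {
        "mean_pupil": float(valid.mean()),
        "std_pupil": float(valid.std()),
        "blink_count": blink_onsets,
        "blink_fraction": float(blink.mean()),
    }

# Toy window: one blink episode in the middle of the recording
window_pupil = [3.1, 3.2, 0.0, 0.0, 3.3, 3.2, 3.1]
window_blink = [False, False, True, True, False, False, False]
feats = explicit_eye_features(window_pupil, window_blink)
print(feats["blink_count"])  # 1
```

In a combined ("heterogeneous input") approach, a feature vector like this would be concatenated with, or fed to a separate branch alongside, the raw sample sequence that the implicit network branch consumes.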
8
Vortmann LM, Knychalla J, Annerer-Walcher S, Benedek M, Putze F. Imaging Time Series of Eye Tracking Data to Classify Attentional States. Front Neurosci 2021; 15:664490. PMID: 34121994. PMCID: PMC8193942. DOI: 10.3389/fnins.2021.664490.
Abstract
Several previous studies have shown that conclusions about the human mental state can be drawn from eye gaze behavior. For this reason, eye tracking recordings are suitable as input data for attentional state classifiers. In current state-of-the-art studies, the extracted eye tracking feature set usually consists of descriptive statistics about specific eye movement characteristics (i.e., fixations, saccades, blinks, vergence, and pupil dilation). We suggest an Imaging Time Series approach for eye tracking data, followed by classification using a convolutional neural net, to improve the classification accuracy. We compared multiple algorithms that used the one-dimensional statistical summary feature set as input with two different implementations of the newly suggested method for three different data sets that target different aspects of attention. The results show that our two-dimensional image features with the convolutional neural net outperform the classical classifiers for most analyses, especially regarding generalization over participants and tasks. We conclude that current attentional state classifiers based on eye tracking can be optimized by adjusting the feature set while requiring less feature engineering; our future work will focus on a more detailed and better suited investigation of this approach for other scenarios and data sets.
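The specific imaging transform used by the authors is not detailed in this abstract. One common way to turn a 1-D time series into a 2-D image for a convolutional net is the Gramian Angular Summation Field, sketched here as an illustrative assumption rather than the paper's exact method:

```python
import numpy as np

def gramian_angular_field(series):
    """Map a 1-D signal to a 2-D image (Gramian Angular Summation Field):
    rescale to [-1, 1], encode each sample as an angle phi = arccos(x),
    then form G[i, j] = cos(phi_i + phi_j)."""
    x = np.asarray(series, dtype=float)
    # Min-max rescale into [-1, 1] so arccos is defined
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])

signal = np.sin(np.linspace(0, 2 * np.pi, 32))  # stand-in for a pupil/gaze trace
image = gramian_angular_field(signal)
print(image.shape)  # (32, 32) image suitable as CNN input
```

The resulting matrix is symmetric and preserves temporal dependencies along its diagonal, which is what lets a 2-D convolutional net pick up patterns that a flat statistical summary discards.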
Affiliation(s)
- Lisa-Marie Vortmann
- Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Jannes Knychalla
- Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Mathias Benedek
- Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Felix Putze
- Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
9
Annerer-Walcher S, Ceh SM, Putze F, Kampen M, Körner C, Benedek M. How Reliably Do Eye Parameters Indicate Internal Versus External Attentional Focus? Cogn Sci 2021; 45:e12977. DOI: 10.1111/cogs.12977.
Affiliation(s)
- Felix Putze
- Department of Mathematics and Computer Science, University of Bremen
- Marvin Kampen
- Department of Mathematics and Computer Science, University of Bremen
10
Ceh SM, Annerer-Walcher S, Körner C, Rominger C, Kober SE, Fink A, Benedek M. Neurophysiological indicators of internal attention: An electroencephalography-eye-tracking coregistration study. Brain Behav 2020; 10:e01790. PMID: 32816400. PMCID: PMC7559625. DOI: 10.1002/brb3.1790.
Abstract
INTRODUCTION Many goal-directed and spontaneous everyday activities (e.g., planning, mind wandering) rely on an internal focus of attention. Internally directed cognition (IDC) was shown to differ from externally directed cognition in a range of neurophysiological indicators such as electroencephalogram (EEG) alpha activity and eye behavior. METHODS In this EEG-eye-tracking coregistration study, we investigated effects of attention direction on EEG alpha activity and various relevant eye parameters. We used an established paradigm to manipulate internal attention demands in the visual domain within tasks by means of conditional stimulus masking. RESULTS Consistent with previous research, IDC involved relatively higher EEG alpha activity (lower alpha desynchronization) at posterior cortical sites. Moreover, IDC was characterized by greater pupil diameter (PD), fewer microsaccades, fixations, and saccades. These findings show that internal versus external cognition is associated with robust differences in several indicators at the neural and perceptual level. In a second line of analysis, we explored the intrinsic temporal covariation between EEG alpha activity and eye parameters during rest. This analysis revealed a positive correlation of EEG alpha power with PD especially in bilateral parieto-occipital regions. CONCLUSION Together, these findings suggest that EEG alpha activity and PD represent time-sensitive indicators of internal attention demands, which may be involved in a neurophysiological gating mechanism serving to shield internal cognition from irrelevant sensory information.
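The covariation analysis described in this abstract correlates EEG alpha power with pupil diameter (PD) over time. A toy sketch of such an analysis on synthetic signals (window length, band limits, sampling rate, and the signals themselves are all illustrative assumptions, not the study's parameters):

```python
import numpy as np

def alpha_power(eeg, fs, lo=8.0, hi=12.0):
    """Alpha-band (8-12 Hz) power of one EEG window via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].mean()

def windowed_correlation(eeg, pupil, fs, win_s=2.0):
    """Correlate alpha power with mean pupil diameter across time windows."""
    n = int(win_s * fs)
    k = len(eeg) // n
    alpha = [alpha_power(eeg[i * n:(i + 1) * n], fs) for i in range(k)]
    pd_mean = [pupil[i * n:(i + 1) * n].mean() for i in range(k)]
    return np.corrcoef(alpha, pd_mean)[0, 1]

# Synthetic 60 s recording: a 10 Hz rhythm whose amplitude slowly waxes
# and wanes, with pupil size co-varying with that amplitude
fs = 256
t = np.arange(0, 60, 1 / fs)
amplitude = 1 + 0.5 * np.sin(2 * np.pi * t / 30)
eeg = amplitude * np.sin(2 * np.pi * 10 * t)
pupil = 3.0 + 0.2 * amplitude
print(windowed_correlation(eeg, pupil, fs))  # strongly positive by construction
```

With real data, the same windowing logic would be applied per participant and electrode site, which is how a parieto-occipital specificity like the one reported here could emerge.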
Affiliation(s)
- Andreas Fink
- Institute of Psychology, University of Graz, Graz, Austria