1. Walcher S, Korda Ž, Körner C, Benedek M. How workload and availability of spatial reference shape eye movement coupling in visuospatial working memory. Cognition 2024; 249:105815. PMID: 38761645. DOI: 10.1016/j.cognition.2024.105815.
Abstract
Eyes are active in memory recall and visual imagination, yet our grasp of the qualities and factors underlying these internally coupled eye movements is limited. To explore this, we studied 50 participants, examining how workload, spatial reference availability, and imagined movement direction influence the internal coupling of eye movements. We designed a visuospatial working memory task in which participants mentally moved a black patch along a path within a matrix; each trial involved one step along this path (presented via speakers: up, down, left, or right). We varied workload by adjusting matrix size (3 × 3 vs. 5 × 5), manipulated the availability of a spatial frame of reference by presenting either a blank screen (requiring participants to rely solely on their mental representation of the matrix) or a spatial reference in the form of an empty matrix, and contrasted active task performance with two control conditions involving only active or passive listening. Our findings show that eye movements consistently matched the imagined movement of the patch in the matrix and were not driven solely by auditory or semantic cues. While workload influenced pupil diameter, perceived demand, and performance, it had no observable impact on internal coupling. The availability of a spatial reference enhanced the coupling of eye movements, leading to more frequent and more precise saccades that were more resilient against noise and bias. The absence of workload effects on coupled saccades, combined with the relatively high degree of coupling observed even in the invisible-matrix condition, indicates that eye movements align with shifts of attention across both visually and internally represented information. This suggests that coupled eye movements are not merely strategic efforts to reduce workload, but rather a natural response to where attention is directed.
Affiliation(s)
- Sonja Walcher, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Živa Korda, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Christof Körner, Cognitive Psychology & Neuroscience, Institute of Psychology, University of Graz, Graz, Austria
- Mathias Benedek, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
2. Lebuda I, Benedek M. A meta-perspective on the creative metacognition framework. Reply to comments on "A systematic framework of creative metacognition". Phys Life Rev 2024; 50:66-71. PMID: 38970863. DOI: 10.1016/j.plrev.2024.06.012.
Affiliation(s)
- Izabela Lebuda, Institute of Psychology, University of Wrocław, Dawida 1, 50-527 Wroclaw, Poland
- Mathias Benedek, Institute of Psychology, University of Graz, Universitätsplatz 2, 8010 Graz, Austria
3. Korda Ž, Walcher S, Körner C, Benedek M. Decoupling of the pupillary light response during internal attention: The modulating effect of luminance intensity. Acta Psychol (Amst) 2024; 242:104123. PMID: 38181698. DOI: 10.1016/j.actpsy.2023.104123.
Abstract
In a world full of sensory stimuli, attention guides us between the external environment and our internal thoughts. While external attention involves processing sensory stimuli, internal attention is devoted to self-generated representations such as planning or spontaneous mind wandering. Both draw on common cognitive resources, so simultaneous engagement in both often leads to interference between processes. To maintain internal focus, an attentional mechanism known as perceptual decoupling takes effect. This mechanism supports internal cognition by decoupling attention from the perception of sensory information. Two previous studies from our lab investigated to what extent perceptual decoupling is evident in voluntary eye movements. Findings showed that the effect is mediated by the internal task modality and workload (visuospatial > arithmetic and high > low, respectively). However, it remains unclear whether it extends to involuntary eye behavior, which may not share cognitive resources with internal activities. The present experiment therefore aimed to further elucidate attentional dynamics by examining whether internal attention affects the pupillary light response (PLR). We consistently observed that the workload and task modality of the internal task reduced the PLR to luminance changes of medium intensity. However, the PLR to strong luminance changes was less affected, or not affected at all, by the internal task. These results suggest that perceptual decoupling effects may be less consistent in involuntary eye behavior, particularly in the context of a salient visual stimulus.
Affiliation(s)
- Živa Korda, Department of Psychology, University of Graz, Graz, Austria
- Sonja Walcher, Department of Psychology, University of Graz, Graz, Austria
4. Yamashita J, Takimoto Y, Oishi H, Kumada T. How do personality traits modulate real-world gaze behavior? Generated gaze data shows situation-dependent modulations. Front Psychol 2024; 14:1144048. PMID: 38268808. PMCID: PMC10805946. DOI: 10.3389/fpsyg.2023.1144048.
Abstract
It has both scientific and practical benefits to substantiate the theoretical prediction that personality (Big Five) traits systematically modulate gaze behavior in various real-world (working) situations. Nevertheless, previous methods, which required controlled situations and large numbers of participants, could not support personality modulation analysis in the real world. One cause of this research gap is the mixing of the effects of individual attributes (e.g., the accumulated attributes of age, gender, and degree of measurement noise) and personality traits in gaze data. Previous studies may have used larger sample sizes to average out the possible concentration of specific individual attributes in some personality traits, and may have imposed controlled situations to prevent unexpected interactions between these possibly biased individual attributes and complex, realistic situations. Therefore, we generated and analyzed real-world gaze behavior in which the effects of personality traits are separated from individual attributes. In Experiment 1, we provided a methodology for generating such sensor data on head and eye movements for a small sample of participants who performed realistic nonsocial (data-entry) and social (conversation) work tasks (the first contribution). In Experiment 2, we evaluated the effectiveness of the generated gaze behavior for real-world personality modulation analysis. We showed how openness systematically modulates the autocorrelation coefficients of the sensor data, which reflect the periodicity of head and eye movements in the data-entry and conversation tasks (the second contribution). We found different openness modulations in the autocorrelation coefficients of the generated sensor data for the two tasks. These modulations could not be detected using the real sensor data because of contamination by individual attributes. In conclusion, our method is a potentially powerful tool for understanding theoretically expected, systematic, situation-specific personality modulation of real-world gaze behavior.
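The openness analysis described above hinges on autocorrelation coefficients of head- and eye-movement sensor traces, which capture the periodicity of those movements. As a rough illustration of that measure only (not the authors' code; the synthetic gaze trace and its 20-sample periodicity below are made up), the coefficients for lags 1 to 40 can be computed as:

```python
import numpy as np

def autocorr(x, max_lag):
    """Autocorrelation coefficients of a 1-D sensor trace for lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / var for k in range(1, max_lag + 1)])

# Synthetic horizontal gaze trace with a ~20-sample periodicity, standing in
# for the paper's head/eye sensor data (illustrative only).
rng = np.random.default_rng(0)
t = np.arange(600)
gaze_x = np.sin(2 * np.pi * t / 20) + 0.3 * rng.standard_normal(t.size)

coeffs = autocorr(gaze_x, max_lag=40)
print(coeffs[19])  # lag 20: near the dominant period, autocorrelation is high
```

A trace with a strong movement period shows a pronounced peak at that lag (and a trough at half the period), which is the kind of signature the study relates to openness.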
Affiliation(s)
- Jumpei Yamashita, NTT Access Network Service Systems Laboratories, Nippon Telegraph and Telephone Corporation, Tokyo, Japan; Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Yoshiaki Takimoto, NTT Human Informatics Laboratories, Nippon Telegraph and Telephone Corporation, Kanagawa, Japan
- Haruo Oishi, NTT Access Network Service Systems Laboratories, Nippon Telegraph and Telephone Corporation, Tokyo, Japan
5. Magliacano A, Catalano L, Sagliano L, Estraneo A, Trojano L. Spontaneous eye blinking during an auditory, an interoceptive and a visual task: The role of the sensory modality and the attentional focus. Cortex 2023; 168:49-61. PMID: 37659289. DOI: 10.1016/j.cortex.2023.07.006.
Abstract
Previous evidence suggested that spontaneous eye blinking changes as a function of the attentional focus. In particular, eye blink rate (EBR) tends to increase when attention is directed to internal versus environmental processing. Most studies on this issue compared eye blinking during visual and mental imagery tasks and interpreted the increase in EBR as a mechanism to focus cognitive resources on internal processing by disengaging attention from interfering information. However, since eye blinking also depends on the sensory modality of the task, these findings might be influenced by a modality-specific effect. In the present Registered Report we investigated whether an environmental versus internal attentional focus can affect spontaneous blinking behaviour in non-visual tasks as well, in conditions where visual stimuli are not relevant. In a within-subject design, healthy participants performed an interoceptive task (i.e., heartbeat counting) and an auditory task in which pre-recorded heartbeats were presented aurally; during both tasks irrelevant visual stimuli were also presented. In a further control condition with the same auditory and visual stimuli, participants were required to focus their attention on the visual stimuli. Participants' EBR was recorded during each task by means of an eye-tracking system. We found that, although the interoceptive task was more difficult than the auditory and visual tasks, participants' EBR decreased to a comparable degree in all tasks relative to a rest condition, with no differences between internal and environmental conditions. The present findings do not support the idea that EBR is modulated by an internal versus external focus of attention, at least in the presence of controlled visual stimulation.
Affiliation(s)
- Laura Catalano, Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy
- Laura Sagliano, Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy
- Luigi Trojano, Department of Psychology, University of Campania "Luigi Vanvitelli", Caserta, Italy
6. Walcher S, Korda Ž, Körner C, Benedek M. The effects of type and workload of internal tasks on voluntary saccades in a target-distractor saccade task. PLoS One 2023; 18:e0290322. PMID: 37616320. PMCID: PMC10449167. DOI: 10.1371/journal.pone.0290322.
Abstract
When we engage in internally directed cognition, like doing mental arithmetic or mind wandering, fewer cognitive resources are assigned to other activities like reacting to perceptual input, an effect termed perceptual decoupling. However, the exact conditions under which perceptual decoupling occurs and its underlying cognitive mechanisms are still unclear. Hence, the present study systematically manipulated the task type (arithmetic, visuospatial) and workload (control, low, high) of the internal task in a within-subject design and tested its effects on voluntary saccades in a target-distractor saccade task. As expected, engagement in internal tasks delayed saccades to the target. This effect was moderated by time, task, and workload: the delay was largest right after internal task onset and then decreased, potentially reflecting the intensity of internal task demands. Saccades were also more delayed in the high than in the low workload condition in the arithmetic task, whereas both workload conditions had similarly strong effects in the visuospatial task. These findings suggest that perceptual decoupling of eye behavior gradually increases with internal demands on general resources and that it is specifically sensitive to internal demands on visuospatial resources. The latter may be mediated by interference from eye behavior elicited by the internal task itself. Internal tasks did not affect the saccade latency-deviation trade-off, indicating that while the internal tasks delayed the execution of the saccade, the perception of the saccade stimuli and the spatial planning of the saccade continued unaffected in parallel to the internal tasks. Together, these findings shed further light on the specific mechanisms underlying perceptual decoupling by suggesting that perceptual decoupling of eye behavior increases as internal demands on cognitive resources overlap more strongly with the demands of the external task.
Affiliation(s)
- Sonja Walcher, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Živa Korda, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Christof Körner, Cognitive Psychology & Neuroscience, Institute of Psychology, University of Graz, Graz, Austria
- Mathias Benedek, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
7. Effects of internally directed cognition on smooth pursuit eye movements: A systematic examination of perceptual decoupling. Atten Percept Psychophys 2023; 85:1159-1178. PMID: 36922477. PMCID: PMC10167146. DOI: 10.3758/s13414-023-02688-3.
Abstract
Eye behavior differs between internally and externally directed cognition and thus is indicative of an internal versus external attention focus. Recent work implicated perceptual decoupling (i.e., eye behavior becoming less determined by the sensory environment) as one of the key mechanisms involved in these attention-related eye movement differences. However, it is not yet understood how perceptual decoupling depends on the characteristics of the internal task. Therefore, we systematically examined effects of varying internal task demands on smooth pursuit eye movements. Specifically, we evaluated effects of internal workload (control vs. low vs. high) and of internal task modality (arithmetic vs. visuospatial). The results of multilevel modelling showed that perceptual decoupling effects were stronger for higher workload and more pronounced for the visuospatial modality. Effects also followed a characteristic time course relative to internal operations. The findings provide further support for the perceptual decoupling mechanism by showing that it is sensitive to the degree of interference between external and internal information.
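Multilevel modelling of this kind can be sketched as a random-intercept model in which trials are nested within participants and workload enters as a fixed effect. The following is an illustrative sketch only, fitted with statsmodels on synthetic pursuit-error data with an assumed positive workload effect; it is not the authors' analysis code, and the variable names and effect sizes are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic per-trial pursuit error for 30 participants under three workload
# levels (0 = control, 1 = low, 2 = high). Higher workload -> larger pursuit
# error, i.e., stronger perceptual decoupling (numbers are illustrative).
rng = np.random.default_rng(42)
rows = []
for pid in range(30):
    baseline = rng.normal(1.0, 0.2)       # participant-specific intercept
    for workload in (0, 1, 2):
        for _ in range(20):               # 20 trials per condition
            err = baseline + 0.15 * workload + rng.normal(0, 0.3)
            rows.append({"pid": pid, "workload": workload, "error": err})
df = pd.DataFrame(rows)

# Random-intercept model: error ~ workload, with participants as groups.
model = smf.mixedlm("error ~ workload", df, groups=df["pid"]).fit()
print(model.params["workload"])  # recovers a positive workload slope
```

The random intercept absorbs stable between-participant differences, so the workload slope is estimated from within-participant variation, which is the main reason such designs use multilevel rather than ordinary regression.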
8. Yu W, Zhao F, Ren Z, Jin D, Yang X, Zhang X. Mining attention distribution paradigm: Discover gaze patterns and their association rules behind the visual image. Comput Methods Programs Biomed 2023; 230:107330. PMID: 36603232. DOI: 10.1016/j.cmpb.2022.107330.
Abstract
BACKGROUND AND OBJECTIVE: Attention allocation reflects how humans filter and organize information. On the one hand, different task scenarios strongly shape the rules by which humans distribute attention; on the other hand, visual attention reflects cognitive and psychological processes. Most previous studies of visual attention allocation are based on cognitive models, predictive models, or statistical analysis of eye movement data or visual images. However, these methods provide little insight into the gaze behavior underlying attention distribution patterns within a scenario context, and they seldom study the association rules of these patterns. We therefore adopted a big-data mining approach to discover paradigms of visual attention distribution.
METHODS: We applied data mining to extract gaze patterns and discover regularities of attention distribution behavior within the scenario context. The proposed method consists of three components: task scenario segmentation and clustering, gaze pattern mining, and mining of association rules among frequent patterns.
RESULTS: The proposed approach was tested on an operation platform. The complex operation task was simultaneously segmented and clustered with the TICC-based method and evaluated by the BCI index. The operators' frequent eye movement patterns and their association rules were discovered. The results demonstrate that our method can associate eye-tracking data with task-oriented scene data.
DISCUSSION: The proposed method makes it possible to explicitly express and quantitatively analyze people's visual attention patterns. It can be applied not only in aerospace medicine and aviation psychology, but potentially also as a computer-aided diagnosis and follow-up tool for neurological and cognitive-impairment-related conditions such as ADHD (attention deficit hyperactivity disorder), neglect syndrome, and atypical social attention in ASD (autism spectrum disorder).
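The third component, mining association rules over frequent gaze patterns, follows the classic support/confidence scheme from frequent-pattern mining. A minimal sketch of that scheme (the toy "transactions" of cockpit areas of interest, the AOI names, and the thresholds below are hypothetical stand-ins, not the paper's data or parameters):

```python
from itertools import combinations

# Toy "transactions": the set of areas of interest (AOIs) fixated in each
# task segment. AOI names and data are illustrative only.
transactions = [
    {"altimeter", "horizon", "speed"},
    {"altimeter", "horizon"},
    {"horizon", "speed"},
    {"altimeter", "horizon", "speed"},
    {"altimeter", "speed"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

min_support, min_conf = 0.4, 0.7
items = sorted({i for t in transactions for i in t})

# Frequent 2-itemsets and the association rules (lhs -> rhs) they license.
rules = []
for a, b in combinations(items, 2):
    pair = {a, b}
    if support(pair) >= min_support:
        for lhs, rhs in ((a, b), (b, a)):
            conf = support(pair) / support({lhs})
            if conf >= min_conf:
                rules.append((lhs, rhs, round(conf, 2)))
print(rules)
```

Each rule reads "segments whose gaze covered the left-hand AOI also covered the right-hand AOI with the stated confidence"; real pipelines use an Apriori- or FP-growth-style search instead of this exhaustive pair scan.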
Affiliation(s)
- Weiwei Yu, School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an, 710072, China; Unmanned System Research Institute, Northwestern Polytechnical University, Xi'an, 710072, China
- Feng Zhao, School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an, 710072, China
- Zhijun Ren, School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an, 710072, China
- Dian Jin, School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an, 710072, China
- Xinliang Yang, School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an, 710072, China; Chinese Flight Test Establishment, Xi'an, 710089, China
- Xiaokun Zhang, School of Computing and Information Systems, Athabasca University, Canada
9. Huang T, Zhou C, Luo X, Kaner J. Study of Ageing in Complex Interface Interaction Tasks: Based on Combined Eye-Movement and HRV Bioinformatic Feedback. Int J Environ Res Public Health 2022; 19(24):16937. PMID: 36554822. PMCID: PMC9779224. DOI: 10.3390/ijerph192416937.
Abstract
Human-computer interaction is becoming increasingly intelligent, driven by technological innovation. However, there is a digital divide caused by the usage barriers that older users face when interacting with complex tasks. To better help elderly users efficiently complete complex interactions, the interface of a smart home operating system is used as an example to explore the usage characteristics of elderly users of different genders. This study uses multi-signal physiological acquisition (eye movement and heart rate variability) as its criterion measures. The results of the study showed that: (1) Older users are more attracted to iconic information than textual information. (2) When searching in complex tasks, female users are more likely to browse the whole page before locating the task. (3) Female users are more likely to browse from top to bottom when searching in complex tasks. (4) Female users are more likely to concentrate when performing complex tasks than male users. (5) Males are more likely to be nervous than females when performing complex tasks.
Affiliation(s)
- Ting Huang, College of Furnishings and Industrial Design, Nanjing Forestry University, Nanjing 210037, China; Jiangsu Co-Innovation Center of Efficient Processing and Utilization of Forest Resources, Nanjing 210037, China
- Chengmin Zhou, College of Furnishings and Industrial Design, Nanjing Forestry University, Nanjing 210037, China; Jiangsu Co-Innovation Center of Efficient Processing and Utilization of Forest Resources, Nanjing 210037, China
- Xin Luo, College of Furnishings and Industrial Design, Nanjing Forestry University, Nanjing 210037, China
- Jake Kaner, School of Art and Design, Nottingham Trent University, Nottingham NG1 4FQ, UK
10. Vortmann LM, Weidenbach P, Putze F. AtAwAR Translate: Attention-Aware Language Translation Application in Augmented Reality for Mobile Phones. Sensors (Basel) 2022; 22(16):6160. PMID: 36015922. PMCID: PMC9412445. DOI: 10.3390/s22166160.
Abstract
As lightweight, low-cost EEG headsets emerge, the feasibility of consumer-oriented brain-computer interfaces (BCI) increases. The combination of portable smartphones and easy-to-use EEG dry-electrode headbands offers intriguing new applications and methods of human-computer interaction. Previous research has identified augmented reality (AR) scenarios as likely to profit from additional user state information, such as that provided by a BCI. In this work, we implemented a system that integrates user attentional state awareness into a smartphone application for an AR written-language translator. The attentional state of the user is classified as internally or externally directed attention using the Muse 2 electroencephalography headband with four frontal electrodes. The classification results are used to adapt the behavior of the translation app, which uses the smartphone's camera to display translated text as augmented reality elements. We present the first mobile BCI system that uses a smartphone and a low-cost EEG device with few electrodes to provide attention awareness to an AR application. Our case study with 12 participants did not fully support the assumption that the BCI improves usability. However, we were able to show that the classification accuracy and ease of setup are promising paths toward mobile consumer-oriented BCI usage. In future studies, other use cases, applications, and adaptations of this setup will be tested to further explore its usability.
11. Vortmann LM, Ceh S, Putze F. Multimodal EEG and Eye Tracking Feature Fusion Approaches for Attention Classification in Hybrid BCIs. Front Comput Sci 2022. DOI: 10.3389/fcomp.2022.780580.
Abstract
Often, various modalities capture distinct aspects of particular mental states or activities. While machine learning algorithms can reliably predict numerous aspects of human cognition and behavior using a single modality, they can benefit from the combination of multiple modalities. This is why hybrid BCIs are gaining popularity. However, it is not always straightforward to combine features from a multimodal dataset. Along with the method for generating the features, one must decide when the modalities should be combined during the classification process. In this study, we compare unimodal EEG and eye tracking classification of internally and externally directed attention to multimodal approaches with early, middle, and late fusion. On a binary dataset with a chance level of 0.5, late fusion of the data achieves the highest classification accuracy of 0.609–0.675 (95% confidence interval). In general, the results indicate that for these modalities, middle or late fusion approaches are better suited than early fusion approaches. Additional validation of the observed trend will require further datasets, alternative feature generation mechanisms, decision rules, and neural network designs. We conclude with a set of premises that need to be considered when deciding on a multimodal attentional state classification approach.
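The early-versus-late fusion contrast examined above can be sketched as follows: early fusion concatenates the modalities' feature vectors before training one classifier, while late fusion trains one classifier per modality and combines their predicted probabilities. The data, feature dimensions, and choice of logistic regression below are illustrative assumptions, not the study's pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for EEG and eye tracking feature matrices with binary
# internal/external attention labels (values are illustrative only).
rng = np.random.default_rng(7)
n = 400
y = rng.integers(0, 2, n)
eeg = rng.standard_normal((n, 16)) + y[:, None] * 0.4   # weakly informative
eye = rng.standard_normal((n, 8)) + y[:, None] * 0.4

Xe_tr, Xe_te, Xy_tr, Xy_te, y_tr, y_te = train_test_split(
    eeg, eye, y, test_size=0.25, random_state=0)

# Early fusion: concatenate features, train one classifier.
early = LogisticRegression(max_iter=1000).fit(np.hstack([Xe_tr, Xy_tr]), y_tr)
acc_early = early.score(np.hstack([Xe_te, Xy_te]), y_te)

# Late fusion: one classifier per modality, average predicted probabilities.
clf_eeg = LogisticRegression(max_iter=1000).fit(Xe_tr, y_tr)
clf_eye = LogisticRegression(max_iter=1000).fit(Xy_tr, y_tr)
proba = (clf_eeg.predict_proba(Xe_te)[:, 1]
         + clf_eye.predict_proba(Xy_te)[:, 1]) / 2
acc_late = np.mean((proba > 0.5) == y_te)
print(acc_early, acc_late)
```

On real hybrid-BCI data, which strategy wins depends on how correlated and how differently scaled the modalities are, which is exactly the trade-off the study evaluates.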
12. Vortmann LM, Putze F. Combining Implicit and Explicit Feature Extraction for Eye Tracking: Attention Classification Using a Heterogeneous Input. Sensors (Basel) 2021; 21(24):8205. PMID: 34960295. PMCID: PMC8707750. DOI: 10.3390/s21248205.
Abstract
Statistical measurements of eye movement-specific properties, such as fixations, saccades, blinks, or pupil dilation, are frequently used as input features for machine learning algorithms applied to eye tracking recordings. These characteristics are intended to be interpretable aspects of gaze behavior. However, prior research has demonstrated that neural networks trained on implicit representations of raw eye tracking data outperform these traditional techniques. To leverage the strengths and information of both feature sets, in this work we integrated implicit and explicit eye tracking features in one classification approach. A neural network was adapted to process the heterogeneous input and predict the internally and externally directed attention of 154 participants. We compared the accuracies reached by the implicit and combined features for different window lengths and evaluated the approaches in terms of person- and task-independence. The results indicate that combining implicit and explicit feature extraction techniques for eye tracking data significantly improves classification results for attentional state detection. The attentional state was correctly classified during new tasks with an accuracy better than chance, and person-independent classification even outperformed person-dependently trained classifiers in some settings. For future experiments and applications that require eye tracking data classification, we suggest considering implicit data representations in addition to interpretable explicit features.
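The implicit/explicit distinction above can be made concrete by building both representations for a single gaze window: explicit features are interpretable summary statistics, while the implicit representation is the raw sample sequence itself. The features, threshold, and data below are hypothetical stand-ins; the study fed such heterogeneous input to an adapted neural network rather than simply concatenating it for a linear model:

```python
import numpy as np

def explicit_features(gaze):
    """Interpretable summary statistics for one gaze window of (x, y) samples."""
    velocity = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    return np.array([
        velocity.mean(),            # average gaze speed
        velocity.max(),             # peak speed (rough saccade proxy)
        (velocity < 0.01).mean(),   # fraction of near-still samples (fixation proxy)
    ])

# One 120-sample gaze window (illustrative random walk, not real data).
rng = np.random.default_rng(3)
window = np.cumsum(rng.normal(0, 0.02, size=(120, 2)), axis=0)

explicit = explicit_features(window)   # 3 interpretable values
implicit = window.flatten()            # raw representation, 240 values
combined = np.concatenate([explicit, implicit])
print(combined.shape)  # (243,)
```

The heterogeneous-input network in the paper processes the two parts through different branches before merging them, which preserves the interpretability of the explicit part while letting the network learn from the raw sequence.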
13. Ceh SM, Annerer-Walcher S, Koschutnig K, Körner C, Fink A, Benedek M. Neurophysiological indicators of internal attention: An fMRI-eye-tracking coregistration study. Cortex 2021; 143:29-46. PMID: 34371378. DOI: 10.1016/j.cortex.2021.07.005.
Abstract
Many goal-directed as well as spontaneous everyday activities (e.g., planning, mind-wandering) rely on an internal focus of attention. This fMRI-eye-tracking coregistration study investigated brain mechanisms and eye behavior related to internally versus externally directed cognition. Building on an established paradigm, we manipulated internal attention demands within tasks using conditional stimulus masking. Internally directed cognition involved bilateral activation of the lingual gyrus and inferior parietal lobe areas as well as widespread deactivation of visual networks. Moreover, internally directed cognition was related to greater pupil diameter, pupil diameter variance, blink duration, and fixation disparity variance, and to fewer microsaccades. fMRI-eye-tracking covariation analyses further revealed that larger pupil diameter was related to increased activation of the basal ganglia and lingual gyrus. It can be concluded that internally and externally directed cognition are characterized by distinct neurophysiological signatures. The observed neurophysiological differences indicate that internally directed cognition is associated with reduced processing of task-irrelevant information and increased mental load. These findings shed further light on the interplay between neural and perceptual mechanisms contributing to an internal focus of attention.
Affiliation(s)
- Simon Majed Ceh, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
- Sonja Annerer-Walcher, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
- Karl Koschutnig, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
- Christof Körner, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
- Andreas Fink, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
- Mathias Benedek, Institute of Psychology, University of Graz, Universitätsplatz 2, BioTechMed, Graz, Austria
14. Vortmann LM, Knychalla J, Annerer-Walcher S, Benedek M, Putze F. Imaging Time Series of Eye Tracking Data to Classify Attentional States. Front Neurosci 2021; 15:664490. PMID: 34121994. PMCID: PMC8193942. DOI: 10.3389/fnins.2021.664490.
Abstract
Several previous studies have shown that conclusions about the human mental state can be drawn from eye gaze behavior. For this reason, eye tracking recordings are suitable as input data for attentional state classifiers. In current state-of-the-art studies, the extracted eye tracking feature set usually consists of descriptive statistics about specific eye movement characteristics (i.e., fixations, saccades, blinks, vergence, and pupil dilation). We suggest an imaging time series approach for eye tracking data, followed by classification using a convolutional neural net, to improve classification accuracy. We compared multiple algorithms that used the one-dimensional statistical summary feature set as input with two different implementations of the newly suggested method on three data sets that target different aspects of attention. The results show that our two-dimensional image features with the convolutional neural net outperform the classical classifiers for most analyses, especially regarding generalization over participants and tasks. We conclude that current eye tracking-based attentional state classifiers can be optimized by adjusting the feature set while requiring less feature engineering. Our future work will focus on a more detailed and better suited investigation of this approach for other scenarios and data sets.
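One common imaging-time-series encoding is the Gramian angular summation field (GASF), which turns a 1-D trace into a 2-D image suitable for a convolutional net. Whether this matches the authors' exact encoding is an assumption, and the pupil trace below is synthetic:

```python
import numpy as np

def gramian_angular_field(series):
    """Encode a 1-D series as a 2-D Gramian angular summation field image."""
    x = np.asarray(series, dtype=float)
    # Rescale to [-1, 1], then map each value to an angle.
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1, 1))
    # GASF[i, j] = cos(phi_i + phi_j): every pairwise temporal relation
    # becomes one pixel.
    return np.cos(phi[:, None] + phi[None, :])

pupil = np.sin(np.linspace(0, 4 * np.pi, 64))  # toy pupil-dilation trace
image = gramian_angular_field(pupil)
print(image.shape)  # (64, 64) image, ready for a 2-D convolutional net
```

Because every pixel encodes a pairwise relation between two time points, the convolutional net can pick up temporal structure that one-dimensional summary statistics discard, which is the motivation for imaging time series in the first place.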
Affiliation(s)
- Lisa-Marie Vortmann, Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Jannes Knychalla, Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Mathias Benedek, Creative Cognition Lab, Institute of Psychology, University of Graz, Graz, Austria
- Felix Putze, Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
15. Using Brain Activity Patterns to Differentiate Real and Virtual Attended Targets during Augmented Reality Scenarios. Information 2021; 12(6):226. DOI: 10.3390/info12060226.
Abstract
Augmented reality is the fusion of virtual components and our real surroundings. The simultaneous visibility of generated and natural objects often requires users to direct their selective attention to a specific target that is either real or virtual. In this study, we used machine learning techniques to classify whether the attended target was real or virtual, based on electroencephalographic (EEG) and eye tracking data collected in augmented reality scenarios. A shallow convolutional neural net classified 3-second EEG data windows from 20 participants in a person-dependent manner with an average accuracy above 70% when the testing data and training data came from different trials. This accuracy could be significantly increased to 77% using a multimodal late fusion approach that included the recorded eye tracking data. Person-independent EEG classification was possible above chance level for 6 out of 20 participants. Thus, the reliability of such a brain-computer interface is high enough for it to be treated as a useful input mechanism for augmented reality applications.