1
Byrne SA, Nyström M, Maquiling V, Kasneci E, Niehorster DC. Precise localization of corneal reflections in eye images using deep learning trained on synthetic data. Behav Res Methods 2024; 56:3226-3241. PMID: 38114880; PMCID: PMC11133043; DOI: 10.3758/s13428-023-02297-w.
Abstract
We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) trained solely on synthetic data. Using only synthetic data completely sidesteps the time-consuming manual annotation required for supervised training on real eye images. To systematically evaluate the accuracy of our method, we first tested it on images with synthetic CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on two datasets consisting of high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images, reducing spatial precision error by 3-41.5% across datasets, and performed on par with the state of the art on synthetic images in terms of spatial accuracy. We conclude that our method provides precise CR center localization and offers a solution to the data availability problem, one of the major roadblocks in the development of deep learning models for gaze estimation. Owing to its superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
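The key idea in this abstract, training on synthetic images so that ground-truth labels come for free, can be illustrated with a minimal sketch. This is a toy renderer, not the authors' pipeline: the Gaussian-blob CR model, background level, and noise parameters are all assumptions.

```python
import numpy as np

def synthetic_cr_image(size=64, cr_xy=(30.3, 25.7), cr_sigma=1.5,
                       cr_amp=0.8, noise_sd=0.05, rng=None):
    """Render a toy eye image: a uniform backdrop plus a small Gaussian
    'corneal reflection' at a known sub-pixel (x, y) center, plus noise.
    The known center serves as a free ground-truth label for training."""
    rng = rng or np.random.default_rng(0)
    y, x = np.mgrid[0:size, 0:size]
    background = 0.2 * np.ones((size, size))  # iris-like uniform backdrop
    cx, cy = cr_xy
    cr = cr_amp * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * cr_sigma ** 2))
    img = background + cr + rng.normal(0.0, noise_sd, (size, size))
    return np.clip(img, 0.0, 1.0), np.array(cr_xy)

img, label = synthetic_cr_image()
```

Because the CR center is chosen by the generator, arbitrarily many labeled training pairs can be produced without any manual annotation.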
Affiliation(s)
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Virmarie Maquiling
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Enkelejda Kasneci
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Diederick C Niehorster
- MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy
- Department of Psychology, Lund University, Lund, Sweden
2
Romeo A, Leonovych O, Solé Puig M, Supèr H. Cognitive Vergence Recorded with a Webcam-Based Eye-Tracker during an Oddball Task in an Elderly Population. Sensors (Basel) 2024; 24:888. PMID: 38339605; PMCID: PMC10857309; DOI: 10.3390/s24030888.
Abstract
(1) Background: Our previous research provides evidence that vergence eye movements may significantly influence cognitive processing and could serve as a reliable measure of cognitive issues. The rise of consumer-grade eye tracking technology, which uses sophisticated imaging techniques in the visible light spectrum to determine gaze position, is noteworthy. In our study, we explored the feasibility of using webcam-based eye tracking to monitor the vergence eye movements of patients with Mild Cognitive Impairment (MCI) during a visual oddball paradigm. (2) Methods: We simultaneously recorded eye positions using a remote infrared-based pupil eye tracker. (3) Results: Both tracking methods effectively captured vergence eye movements and demonstrated robust cognitive vergence responses, where participants exhibited larger vergence eye movement amplitudes in response to targets versus distractors. (4) Conclusions: In summary, the use of a consumer-grade webcam to record cognitive vergence shows potential. This method could lay the groundwork for future research aimed at creating an affordable screening tool for mental health care.
Affiliation(s)
- August Romeo
- Department Cognition, Development and Educational Psychology, University of Barcelona, 08035 Barcelona, Spain
- Oleksii Leonovych
- Bioinformatics and Biomedical Signals Laboratory, Polytechnical University of Catalonia, 08028 Barcelona, Spain
- Maria Solé Puig
- Assessment Unit of Cognition and Attention in Learning, Psychology Clinic, 08035 Barcelona, Spain
- Hans Supèr
- Braingaze S.L., 08302 Mataró, Spain
- Institute of Neurosciences (UBNeuro), University of Barcelona, 08035 Barcelona, Spain
- Institut de Recerca Sant Joan de Déu (IRSJD), 08950 Barcelona, Spain
- Catalan Institution for Research and Advanced Studies (ICREA), 08010 Barcelona, Spain
3
Huang Z, Duan X, Zhu G, Zhang S, Wang R, Wang Z. Assessing the data quality of AdHawk MindLink eye-tracking glasses. Behav Res Methods 2024. PMID: 38168041; DOI: 10.3758/s13428-023-02310-2.
Abstract
Most commercially available eye-tracking devices rely on video cameras and image processing algorithms to track gaze. Nevertheless, emerging technologies are entering the field, making high-speed, cameraless eye tracking more accessible. In this study, a series of tests was conducted to compare the data quality of MEMS-based eye-tracking glasses (AdHawk MindLink) with three widely used camera-based eye-tracking devices (EyeLink Portable Duo, Tobii Pro Glasses 2, and SMI Eye Tracking Glasses 2). The data quality measures assessed in these tests were accuracy, precision, data loss, and system latency. The results suggest that, overall, the data quality of the eye-tracking glasses was lower than that of the desktop EyeLink Portable Duo eye tracker. Among the eye-tracking glasses, the accuracy and precision of the MindLink were on par with or better than those of the Tobii Pro Glasses 2 and SMI Eye Tracking Glasses 2. The system latency of the MindLink was approximately 9 ms, substantially lower than that of the camera-based eye-tracking devices found in VR goggles. These results suggest that the MindLink eye-tracking glasses show promise for research applications in which high sampling rates and low latency are preferred.
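The data quality measures named in this abstract have standard operational definitions, which can be sketched in a few lines. This is a generic illustration under common assumptions (gaze and target in degrees, NaN marking lost samples), not the authors' or AdHawk's code.

```python
import numpy as np

def data_quality(gaze_deg, target_deg):
    """Standard per-fixation data-quality measures for eye-tracking data:
    accuracy  = mean angular offset from the fixation target,
    precision = RMS of sample-to-sample differences (RMS-S2S),
    data loss = fraction of invalid (NaN) samples.
    Note: NaNs are dropped before differencing, so the RMS spans gaps."""
    gaze = np.asarray(gaze_deg, dtype=float)
    valid = gaze[~np.isnan(gaze)]
    accuracy = float(np.mean(np.abs(valid - target_deg)))
    rms_s2s = float(np.sqrt(np.mean(np.diff(valid) ** 2)))
    data_loss = float(np.mean(np.isnan(gaze)))
    return accuracy, rms_s2s, data_loss

# Five samples while fixating a target at 10 deg, one sample lost:
acc, prec, loss = data_quality([10.1, 10.3, np.nan, 10.2, 10.0], target_deg=10.0)
```

System latency, the fourth measure, requires an external timing reference (e.g., an artificial saccade generator) and cannot be computed from the gaze signal alone.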
Affiliation(s)
- Zehao Huang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Xiaoting Duan
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Gancheng Zhu
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Shuai Zhang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Rong Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Zhiguo Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
4
Kaduk T, Goeke C, Finger H, König P. Webcam eye tracking close to laboratory standards: Comparing a new webcam-based system and the EyeLink 1000. Behav Res Methods 2023. PMID: 37821751; DOI: 10.3758/s13428-023-02237-8.
Abstract
This paper aims to compare a new webcam-based eye-tracking system, integrated into the Labvanced platform for online experiments, to a "gold standard" lab-based eye tracker (EyeLink 1000 - SR Research). Specifically, we simultaneously recorded data with both eye trackers in five different tasks and analyzed their real-time performance. These tasks were a subset of a standardized test battery for eye trackers, including a Large Grid task, Smooth Pursuit eye movements, viewing natural images, and two Head Movements tasks (roll, yaw). The results show that the webcam-based system achieved an overall accuracy of 1.4° and a precision of 1.1° (standard deviation (SD) across subjects), an error about 0.5° larger than that of the EyeLink system. Interestingly, both accuracy (1.3°) and precision (0.9°) were slightly better for centrally presented targets, the region of interest in many psychophysical experiments. Remarkably, the correlation of raw gaze samples between the EyeLink and the webcam-based system was about 90% for the Large Grid task and about 80% for Free View and Smooth Pursuit. Overall, these results put the performance of the webcam-based system roughly on par with mobile eye-tracking devices (Ehinger et al. PeerJ, 7, e7086, 2019; Tonsen et al., 2020) and demonstrate substantial improvement compared to existing webcam eye-tracking solutions (Papoutsaki et al., 2017).
Affiliation(s)
- Tobiasz Kaduk
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Research and Development Division, Scicovery GmbH, Paderborn, Germany
- Caspar Goeke
- Research and Development Division, Scicovery GmbH, Paderborn, Germany
- Holger Finger
- Research and Development Division, Scicovery GmbH, Paderborn, Germany
- Peter König
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
5
Poletti M. An eye for detail: Eye movements and attention at the foveal scale. Vision Res 2023; 211:108277. PMID: 37379763; PMCID: PMC10528557; DOI: 10.1016/j.visres.2023.108277.
Abstract
Human vision relies on a tiny region of the retina, the 1-deg foveola, to achieve high spatial resolution. Foveal vision is of paramount importance in daily activities, yet its study is challenging, as eye movements incessantly displace stimuli across this region. Here I will review work that, building on recent advances in eye-tracking and gaze-contingent display, examines how attention and eye movements operate at the foveal level. This research highlights how exploration of fine spatial detail unfolds following visuomotor strategies reminiscent of those occurring at larger scales. It shows that, together with highly precise control of attention, this motor activity is linked to non-homogenous processing within the foveola and selectively modulates sensitivity both in space and time. Overall, the picture emerges of a highly dynamic foveal perception in which fine spatial vision, rather than simply being the result of placing a stimulus at the center of gaze, is the result of a finely tuned and orchestrated synergy of motor, cognitive, and attentional processes.
Affiliation(s)
- Martina Poletti
- Department of Brain and Cognitive Sciences, University of Rochester, United States
- Center for Visual Science, University of Rochester, United States
- Department of Neuroscience, University of Rochester, United States
6
Becker W, Behler A, Vintonyak O, Kassubek J. Patterns of small involuntary fixation saccades (SIFSs) in different neurodegenerative diseases: the role of noise. Exp Brain Res 2023. PMID: 37247026; DOI: 10.1007/s00221-023-06633-6.
Abstract
During the attempt to fixate steadily at a single spot, sequences of small involuntary fixation saccades (SIFSs, also known as microsaccades or intrusions) occur, forming spatio-temporal patterns such as square wave jerks (SWJs), a pattern characterised by alternating centrifugal and centripetal movements of similar magnitude. In many neurodegenerative disorders, SIFSs exhibit elevated amplitudes and frequencies. Elevated SIFS amplitudes have been shown to favour the occurrence of SWJs ("SWJ coupling"). We analysed SIFSs in different subject groups comprising both healthy controls (CTR) and patients with amyotrophic lateral sclerosis (ALS) and progressive supranuclear palsy (PSP), i.e. two neurodegenerative diseases with completely different neuropathological bases and different clinical phenotypes. We show that, across these groups, the relations between SIFS amplitude and the relative frequency of SWJ-like patterns and other SIFS characteristics follow a common law. As an explanation, we propose that physiological and technical noise comprises a small, amplitude-independent component that has little effect on large SIFSs, but causes considerable deviations from the intended amplitude and direction of small ones. Therefore, in contrast to large SIFSs, successive small SIFSs have a lower chance of meeting the SWJ similarity criteria. In principle, every measurement of SIFSs is affected by an amplitude-independent noise background. Therefore, the dependence of SWJ coupling on SIFS amplitude will probably be encountered in almost any group of subjects. In addition, we find a positive correlation between SIFS amplitude and frequency in ALS, but none in PSP, suggesting that the elevated amplitudes might arise at different sites in the two disorders.
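The SWJ similarity criteria referred to here are commonly operationalized by pairing successive saccades of similar amplitude and roughly opposite direction occurring within a short interval. A sketch with illustrative thresholds follows; the numeric criteria are assumptions for demonstration, not the ones used in this study.

```python
def find_swj_pairs(saccades, max_gap=0.5, amp_ratio=0.5, min_opposite=150.0):
    """Pair successive small saccades into square-wave-jerk (SWJ) candidates.
    saccades: list of (onset_s, amplitude_deg, direction_deg) tuples, in order.
    A pair qualifies when the second saccade starts within `max_gap` seconds,
    the amplitudes match within `amp_ratio` (relative difference), and the
    directions are nearly opposite (at least `min_opposite` degrees apart)."""
    pairs = []
    for a, b in zip(saccades, saccades[1:]):
        gap = b[0] - a[0]
        rel_amp = abs(a[1] - b[1]) / max(a[1], b[1])
        angle = abs(a[2] - b[2]) % 360.0
        angle = min(angle, 360.0 - angle)  # smallest angle between directions
        if gap <= max_gap and rel_amp <= amp_ratio and angle >= min_opposite:
            pairs.append((a, b))
    return pairs

# A centrifugal saccade followed 0.2 s later by a near-equal return saccade
# forms an SWJ candidate; the later, unrelated saccade does not:
swjs = find_swj_pairs([(0.0, 0.4, 10.0), (0.2, 0.35, 185.0), (1.5, 0.6, 90.0)])
```

Under the paper's noise argument, amplitude-independent noise perturbs small saccades proportionally more, so small pairs fail the amplitude- and direction-similarity tests more often.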
Affiliation(s)
- Wolfgang Becker
- Section of Neurophysiology, Department of Neurology, University of Ulm, Ulm, Germany
- Anna Behler
- Section of Neurophysiology, Department of Neurology, University of Ulm, Ulm, Germany
- Department of Neurology, University of Ulm, Ulm, Germany
- Olga Vintonyak
- Section of Neurophysiology, Department of Neurology, University of Ulm, Ulm, Germany
- Jan Kassubek
- Section of Neurophysiology, Department of Neurology, University of Ulm, Ulm, Germany
- Department of Neurology, University of Ulm, Ulm, Germany
7
Wu RJ, Clark AM, Cox MA, Intoy J, Jolly PC, Zhao Z, Rucci M. High-resolution eye-tracking via digital imaging of Purkinje reflections. J Vis 2023; 23:4. PMID: 37140912; PMCID: PMC10166114; DOI: 10.1167/jov.23.5.4.
Abstract
Reliably measuring eye movements and determining where the observer looks are fundamental needs in vision science. A classical approach to achieving high-resolution oculomotor measurements is the so-called dual Purkinje image (DPI) method, a technique that relies on the relative motion of the reflections generated by two distinct surfaces in the eye, the cornea and the back of the lens. This technique has traditionally been implemented in fragile, difficult-to-operate analog devices, which have remained in the exclusive use of specialized oculomotor laboratories. Here we describe progress on the development of a digital DPI, a system that builds on recent advances in digital imaging to enable fast, highly precise eye tracking without the complications of previous analog devices. This system integrates an optical setup with no moving components, a digital imaging module, and dedicated software on a fast processing unit. Data from both artificial and human eyes demonstrate subarcminute resolution at 1 kHz. Furthermore, when coupled with previously developed gaze-contingent calibration methods, this system enables localization of the line of sight within a few arcminutes.
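The DPI measurement principle lends itself to a compact illustration: eye translation shifts the first (corneal, P1) and fourth (lens-back, P4) Purkinje reflections together, while eye rotation changes their separation, so gaze rotation can be read from the P4 - P1 vector. The linear gain and pixel coordinates below are assumptions for illustration; a real system requires per-subject calibration and is not reducible to this sketch.

```python
def dpi_gaze_deg(p1_px, p4_px, gain_deg_per_px):
    """Dual-Purkinje-image principle: eye translation moves P1 and P4
    together, whereas eye rotation changes their separation. Gaze rotation
    is therefore estimated from the P4 - P1 vector in the camera image,
    scaled by a (hypothetical) calibration gain in deg/pixel."""
    dx = p4_px[0] - p1_px[0]
    dy = p4_px[1] - p1_px[1]
    return dx * gain_deg_per_px, dy * gain_deg_per_px

# A common-mode shift of both reflections (head translation) leaves the
# estimated gaze unchanged, which is what makes DPI translation-tolerant:
g0 = dpi_gaze_deg((100.0, 80.0), (112.0, 80.0), gain_deg_per_px=0.25)
g1 = dpi_gaze_deg((105.0, 83.0), (117.0, 83.0), gain_deg_per_px=0.25)
```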
Affiliation(s)
- Ruei-Jr Wu
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Ashley M Clark
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Michele A Cox
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Janis Intoy
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Paul C Jolly
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Zhetuo Zhao
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
- Michele Rucci
- Department of Brain & Cognitive Sciences and Center for Visual Science, University of Rochester, 310 Meliora Hall, Rochester, NY, USA
8
Nyström M, Niehorster DC, Andersson R, Hessels RS, Hooge ITC. The amplitude of small eye movements can be accurately estimated with video-based eye trackers. Behav Res Methods 2023; 55:657-669. PMID: 35419703; PMCID: PMC10027793; DOI: 10.3758/s13428-021-01780-6.
Abstract
Estimating gaze direction with a digital video-based pupil and corneal reflection (P-CR) eye tracker is challenging, partly because a video camera is limited in spatial and temporal resolution, and because the captured eye images contain noise. Through computer simulation, we evaluated the localization accuracy of pupil and CR centers in the eye image for small eye rotations (≪ 1 deg). The results highlight how inaccuracies in center localization are related to (1) how many pixels the pupil and CR span in the eye camera image, (2) the method used to compute the centers of the pupil and CRs, and (3) the level of image noise. Our results provide a possible explanation for why the amplitude of small saccades may not be accurately estimated by many currently used video-based eye trackers. We conclude that eye movements with arbitrarily small amplitudes can be accurately estimated using the P-CR eye-tracking principle, provided that the level of image noise is low and the pupil and CR span enough pixels in the eye camera, or that localization of the CR is based on the intensity values in the eye image instead of a binary representation.
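The contrast drawn in the final sentence, binary versus intensity-based CR localization, can be sketched on a toy Gaussian blob whose true center lies between pixels. This is an illustration of the two centroid methods, not the simulation code used in the paper.

```python
import numpy as np

def binary_centroid(img, thresh):
    """Center of mass of pixels above threshold (binary representation):
    every selected pixel counts equally, so the estimate is quantized."""
    ys, xs = np.nonzero(img > thresh)
    return xs.mean(), ys.mean()

def intensity_centroid(img):
    """Intensity-weighted center of mass: the grey values themselves act
    as weights, which supports sub-pixel localization of a small CR."""
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    w = img.astype(float)
    return (xx * w).sum() / w.sum(), (yy * w).sum() / w.sum()

# A tiny Gaussian 'CR' whose true center (5.3, 4.6) lies between pixels:
y, x = np.mgrid[0:10, 0:10]
cr = np.exp(-((x - 5.3) ** 2 + (y - 4.6) ** 2) / 2.0)
bx, by = binary_centroid(cr, 0.5)       # snaps toward the pixel grid
ix, iy = intensity_centroid(cr)         # recovers the sub-pixel center
```

On this noise-free blob the binary centroid lands on the symmetric grid point (5.5, 4.5), while the intensity-weighted centroid recovers the true sub-pixel center, mirroring the paper's conclusion about intensity-based localization.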
Affiliation(s)
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Box 201, SE-221 00, Lund, Sweden
- Diederick C Niehorster
- Lund University Humanities Lab, Lund University, Box 201, SE-221 00, Lund, Sweden
- Department of Psychology, Lund University, Box 201, SE-221 00, Lund, Sweden
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, The Netherlands
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, The Netherlands
9
Holmqvist K, Örbom SL, Hooge ITC, Niehorster DC, Alexander RG, Andersson R, Benjamins JS, Blignaut P, Brouwer AM, Chuang LL, Dalrymple KA, Drieghe D, Dunn MJ, Ettinger U, Fiedler S, Foulsham T, van der Geest JN, Hansen DW, Hutton SB, Kasneci E, Kingstone A, Knox PC, Kok EM, Lee H, Lee JY, Leppänen JM, Macknik S, Majaranta P, Martinez-Conde S, Nuthmann A, Nyström M, Orquin JL, Otero-Millan J, Park SY, Popelka S, Proudlock F, Renkewitz F, Roorda A, Schulte-Mecklenbeck M, Sharif B, Shic F, Shovman M, Thomas MG, Venrooij W, Zemblys R, Hessels RS. Eye tracking: empirical foundations for a minimal reporting guideline. Behav Res Methods 2023; 55:364-416. PMID: 35384605; PMCID: PMC9535040; DOI: 10.3758/s13428-021-01762-8.
Abstract
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
Affiliation(s)
- Kenneth Holmqvist
- Department of Psychology, Nicolaus Copernicus University, Torun, Poland
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Department of Psychology, Regensburg University, Regensburg, Germany
- Saga Lee Örbom
- Department of Psychology, Regensburg University, Regensburg, Germany
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Robert G Alexander
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Jeroen S Benjamins
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Social, Health and Organizational Psychology, Utrecht University, Utrecht, The Netherlands
- Pieter Blignaut
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Lewis L Chuang
- Department of Ergonomics, Leibniz Institute for Working Environments and Human Factors, Dortmund, Germany
- Institute of Informatics, LMU Munich, Munich, Germany
- Denis Drieghe
- School of Psychology, University of Southampton, Southampton, UK
- Matt J Dunn
- School of Optometry and Vision Sciences, Cardiff University, Cardiff, UK
- Susann Fiedler
- Vienna University of Economics and Business, Vienna, Austria
- Tom Foulsham
- Department of Psychology, University of Essex, Essex, UK
- Dan Witzner Hansen
- Machine Learning Group, Department of Computer Science, IT University of Copenhagen, Copenhagen, Denmark
- Enkelejda Kasneci
- Human-Computer Interaction, University of Tübingen, Tübingen, Germany
- Paul C Knox
- Department of Eye and Vision Science, Institute of Life Course and Medical Sciences, University of Liverpool, Liverpool, UK
- Ellen M Kok
- Department of Education and Pedagogy, Division Education, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands
- Department of Online Learning and Instruction, Faculty of Educational Sciences, Open University of the Netherlands, Heerlen, The Netherlands
- Helena Lee
- University of Southampton, Southampton, UK
- Joy Yeonjoo Lee
- School of Health Professions Education, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Jukka M Leppänen
- Department of Psychology and Speech-Language Pathology, University of Turku, Turku, Finland
- Stephen Macknik
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Päivi Majaranta
- TAUCHI Research Center, Computing Sciences, Faculty of Information Technology and Communication Sciences, Tampere University, Tampere, Finland
- Susana Martinez-Conde
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Antje Nuthmann
- Institute of Psychology, University of Kiel, Kiel, Germany
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Jacob L Orquin
- Department of Management, Aarhus University, Aarhus, Denmark
- Center for Research in Marketing and Consumer Psychology, Reykjavik University, Reykjavik, Iceland
- Jorge Otero-Millan
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Soon Young Park
- Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Medical University of Vienna, Vienna, Austria
- Stanislav Popelka
- Department of Geoinformatics, Palacký University Olomouc, Olomouc, Czech Republic
- Frank Proudlock
- The University of Leicester Ulverscroft Eye Unit, Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK
- Frank Renkewitz
- Department of Psychology, University of Erfurt, Erfurt, Germany
- Austin Roorda
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Bonita Sharif
- School of Computing, University of Nebraska-Lincoln, Lincoln, Nebraska, USA
- Frederick Shic
- Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, WA, USA
- Department of General Pediatrics, University of Washington School of Medicine, Seattle, WA, USA
- Mark Shovman
- Eyeviation Systems, Herzliya, Israel
- Department of Industrial Design, Bezalel Academy of Arts and Design, Jerusalem, Israel
- Mervyn G Thomas
- The University of Leicester Ulverscroft Eye Unit, Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK
- Ward Venrooij
- Electrical Engineering, Mathematics and Computer Science (EEMCS), University of Twente, Enschede, The Netherlands
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
10
Wu JY, Ching CTS, Wang HMD, Liao LD. Emerging Wearable Biosensor Technologies for Stress Monitoring and Their Real-World Applications. Biosensors (Basel) 2022; 12:1097. PMID: 36551064; PMCID: PMC9776100; DOI: 10.3390/bios12121097.
Abstract
Wearable devices are being developed faster and applied more widely. Wearables have been used to monitor movement-related physiological indices, including heartbeat, movement, and other exercise metrics, for health purposes. People are also paying more attention to mental health issues, such as stress management. Wearable devices can be used to monitor emotional status and provide preliminary diagnoses and guided training functions. The nervous system responds to stress, which directly affects eye movements and sweat secretion. Therefore, the changes in brain potential, eye potential, and cortisol content in sweat could be used to interpret emotional changes, fatigue levels, and physiological and psychological stress. To better assess users, stress-sensing devices can be integrated with applications to improve cognitive function, attention, sports performance, learning ability, and stress release. These application-related wearables can be used in medical diagnosis and treatment, such as for attention-deficit hyperactivity disorder (ADHD), traumatic stress syndrome, and insomnia, thus facilitating precision medicine. However, many factors contribute to data errors and incorrect assessments, including the various wearable devices, sensor types, data reception methods, data processing accuracy and algorithms, application reliability and validity, and actual user actions. Therefore, in the future, medical platforms for wearable devices and applications should be developed, and product implementations should be evaluated clinically to confirm product accuracy and perform reliable research.
Affiliation(s)
- Ju-Yu Wu
- Institute of Biomedical Engineering and Nanomedicine, National Health Research Institutes, Zhunan Township, Miaoli County 35053, Taiwan
- Program in Tissue Engineering and Regenerative Medicine, National Chung Hsing University, South District, Taichung City 402, Taiwan
- Congo Tak-Shing Ching
- Graduate Institute of Biomedical Engineering, National Chung Hsing University, South District, Taichung City 402, Taiwan
- Department of Electrical Engineering, National Chi Nan University, No. 1 University Road, Puli Township, Nantou County 545301, Taiwan
- Hui-Min David Wang
- Program in Tissue Engineering and Regenerative Medicine, National Chung Hsing University, South District, Taichung City 402, Taiwan
- Graduate Institute of Biomedical Engineering, National Chung Hsing University, South District, Taichung City 402, Taiwan
- Lun-De Liao
- Institute of Biomedical Engineering and Nanomedicine, National Health Research Institutes, Zhunan Township, Miaoli County 35053, Taiwan
- Program in Tissue Engineering and Regenerative Medicine, National Chung Hsing University, South District, Taichung City 402, Taiwan
11
Hofmann J, Domdei L, Jainta S, Harmening WM. Assessment of binocular fixational eye movements including cyclotorsion with split-field binocular scanning laser ophthalmoscopy. J Vis 2022; 22:5. PMID: 36069941; PMCID: PMC9465939; DOI: 10.1167/jov.22.10.5.
Abstract
Fixational eye movements are a hallmark of human gaze behavior, yet little is known about how they interact between fellow eyes. Here, we designed, built and validated a split-field binocular scanning laser ophthalmoscope to record high-resolution eye motion traces from both eyes of six observers during fixation in different binocular vergence conditions. In addition to microsaccades and drift, torsional eye motion could be extracted, with a spatial measurement error of less than 1 arcmin. Microsaccades were strongly coupled between fellow eyes under all conditions. No monocular microsaccade occurred and no significant delay between microsaccade onsets across fellow eyes could be detected. Cyclotorsion was also firmly coupled between both eyes, occurring typically in conjugacy, with gradual changes during drift and abrupt changes during saccades.
Affiliation(s)
- Julia Hofmann
- Rheinische Friedrich-Wilhelms-Universität Bonn, University Eye Hospital, Bonn, Germany
- Fraunhofer Institute for Optronics, System Technologies and Image Exploitation IOSB, Karlsruhe, Germany (https://www.iosb.fraunhofer.de/en.html)
- Lennart Domdei
- Rheinische Friedrich-Wilhelms-Universität Bonn, University Eye Hospital, Bonn, Germany (https://ao.ukbonn.de/)
- Stephanie Jainta
- SRH University of Applied Sciences in North Rhine-Westphalia, Hamm, Germany (https://www.srh-hochschule-nrw.de/)
- Wolf M Harmening
- Rheinische Friedrich-Wilhelms-Universität Bonn, University Eye Hospital, Bonn, Germany (https://ao.ukbonn.de/)
12
Holmqvist K, Örbom SL, Zemblys R. Small head movements increase and colour noise in data from five video-based P-CR eye trackers. Behav Res Methods 2022; 54:845-863. PMID: 34357538; PMCID: PMC8344338; DOI: 10.3758/s13428-021-01648-9.
Abstract
We empirically investigate the role of small, almost imperceptible balance and breathing movements of the head on the level and colour of noise in data from five commercial video-based P-CR eye trackers. By comparing noise from recordings with completely static artificial eyes to noise from recordings where the artificial eyes are worn by humans, we show that very small head movements increase levels and colouring of the noise in data recorded from all five eye trackers in this study. This increase of noise levels is seen not only in the gaze signal, but also in the P and CR signals of the eye trackers that provide these camera image features. The P and CR signals of the SMI eye trackers correlate strongly during small head movements, but less so or not at all when the head is completely still, indicating that head movements are registered by the P and CR images in the eye camera. By recording with artificial eyes, we can also show that the pupil size artefact has no major role in increasing and colouring noise. Our findings add to and replicate the observation by Niehorster et al. (2021) that lowpass filters in video-based P-CR eye trackers colour the data. Irrespective of source, filters or head movements, coloured noise can be confused for oculomotor drift. We also find that usage of the default head restriction in the EyeLink 1000+, the EyeLink II and the HiSpeed240 results in noisier data compared to less head restriction. Researchers investigating data quality in eye trackers should consider not using the Gen 2 artificial eye from SR Research / EyeLink. Data recorded with this artificial eye are much noisier than data recorded with other artificial eyes, on average 2.2-14.5 times worse for the five eye trackers.
Affiliation(s)
- Kenneth Holmqvist
- Institute of Psychology, Nicolaus Copernicus University in Torun, Torun, Poland
- Department of Psychology, Regensburg University, Regensburg, Germany
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Saga Lee Örbom
- Department of Psychology, Regensburg University, Regensburg, Germany
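As an aside, the precision measure underlying this entry's noise comparisons, RMS-S2S (root mean square of sample-to-sample distances), can be sketched in a few lines of NumPy. This is a generic illustration of the standard measure, not the authors' implementation; the function name is ours.

```python
import numpy as np

def rms_s2s(x, y):
    """Root mean square of sample-to-sample distances (RMS-S2S),
    a common precision measure for gaze position signals recorded
    during fixation. x and y are gaze coordinates in, e.g., degrees."""
    dx = np.diff(np.asarray(x, dtype=float))
    dy = np.diff(np.asarray(y, dtype=float))
    return np.sqrt(np.mean(dx**2 + dy**2))
```

For a perfectly static artificial eye this measure approaches zero; the small head movements studied above raise it in human recordings.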
|
13
|
Bowers NR, Gautier J, Lin S, Roorda A. Fixational eye movements in passive versus active sustained fixation tasks. J Vis 2021; 21:16. [PMID: 34677574 PMCID: PMC8556553 DOI: 10.1167/jov.21.11.16] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/22/2023] Open
Abstract
Human fixational eye movements are so small and precise that high-speed, accurate tools are needed to fully reveal their properties and functional roles. Where the fixated image lands on the retina and how it moves for different levels of visually demanding tasks is the subject of the current study. An Adaptive Optics Scanning Laser Ophthalmoscope (AOSLO) was used to image, track and present a variety of fixation targets (Maltese cross, disk, concentric circles, Vernier and tumbling-E letter) to healthy subjects. During these different passive (static) or active (discriminating) tasks under natural eye motion, the landing position of the target on the retina was tracked in space and time over the retinal image directly with high spatial (<1 arcmin) and temporal (960 Hz) resolution. We computed both the eye motion and the exact trajectory of the fixated target's motion over the retina. We confirmed that compared to passive tasks, active tasks elicited a partial inhibition of microsaccades, leading to longer drift periods compensated by larger corrective saccades. Consequently, the overall fixation stability during active tasks was on average 57% larger than during passive tasks. The preferred retinal locus of fixation was the same for each task and did not coincide with the location of the peak cone density.
Affiliation(s)
- Norick R Bowers
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Josselin Gautier
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Samantha Lin
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Austin Roorda
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
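The fixation-stability comparison reported in this entry is typically quantified with the bivariate contour ellipse area (BCEA), the area of an ellipse containing a given proportion of gaze samples. A minimal sketch under a bivariate-normal assumption follows; this is our illustration of the standard formula, not the authors' code.

```python
import numpy as np

def bcea(x, y, p=0.682):
    """Bivariate contour ellipse area (BCEA): area of the ellipse
    containing proportion p of gaze samples, assuming the scatter of
    fixation positions is bivariate normal. Units are the square of
    the input units (e.g., arcmin^2)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    k = -np.log(1.0 - p)                 # chi-square scaling for coverage p
    rho = np.corrcoef(x, y)[0, 1]        # correlation between x and y
    return 2.0 * k * np.pi * x.std() * y.std() * np.sqrt(1.0 - rho**2)
```

Larger fixation instability (e.g., the roughly 57% increase during active tasks above) translates directly into a larger BCEA.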
|
14
|
Vergence eye movements during figure-ground perception. Conscious Cogn 2021; 92:103138. [PMID: 34022640 DOI: 10.1016/j.concog.2021.103138] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2020] [Revised: 04/19/2021] [Accepted: 04/20/2021] [Indexed: 11/24/2022]
Abstract
Figure-ground segregation, that is, the segmentation of visual information into objects and their surrounding backgrounds, provides structure for visual attention. Recent evidence shows a novel role of vergence eye movements in visual attention. In the present work, vergence responses during figure-ground segregation tasks are investigated psychophysically. We show that during a figure-ground detection task, subjects converge their eyes. Vergence eye movements are larger in figure trials than in ground trials. In trials where the figure was detected, vergence responses are stronger than in trials where the figure went unnoticed. Moreover, in figure trials, vergence responses are stronger to low-contrast figures than to high-contrast figures. We argue that these discriminative vergence responses play a role in figure-ground perception.
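The vergence signal analyzed in work like this is, in essence, the difference between the two eyes' horizontal gaze angles. A minimal sketch is shown below; it is our illustration under a rightward-positive angle convention, not the paper's method.

```python
import numpy as np

def vergence(theta_left, theta_right):
    """Horizontal vergence as the difference between the left and right
    eyes' horizontal gaze angles (same units, rightward positive).
    Under this convention, an increase indicates convergence."""
    return np.asarray(theta_left, dtype=float) - np.asarray(theta_right, dtype=float)
```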
|
15
|
Abstract
Due to its reported high sampling frequency and precision, the Tobii Pro Spectrum is of potential interest to researchers who want to study small eye movements during fixation. We test how suitable the Tobii Pro Spectrum is for research on microsaccades by computing data-quality measures and common properties of microsaccades and comparing these to the currently most used system in this field: the EyeLink 1000 Plus. Results show that the EyeLink data provide higher RMS precision and microsaccade rates compared with data acquired with the Tobii Pro Spectrum. However, both systems provide microsaccades with similar directions and shapes, as well as rates consistent with previous literature. Data acquired at 1200 Hz with the Tobii Pro Spectrum provide results that are more similar to the EyeLink, compared to data acquired at 600 Hz. We conclude that the Tobii Pro Spectrum is a useful tool for researchers investigating microsaccades.
|
16
|
Abstract
Eye trackers are sometimes used to study the miniature eye movements such as drift that occur while observers fixate a static location on a screen. Specifically, analysis of such eye-tracking data can be performed by examining the temporal spectrum composition of the recorded gaze position signal, allowing one to assess its color. However, not only rotations of the eyeball but also filters in the eye tracker may affect the signal's spectral color. Here, we therefore ask whether colored, as opposed to white, signal dynamics in eye-tracking recordings reflect fixational eye movements, or whether they are instead largely due to filters. We recorded gaze position data with five eye trackers from four pairs of human eyes performing fixation sequences, and also from artificial eyes. We examined the spectral color of the gaze position signals produced by the eye trackers, both with their filters switched on, and for unfiltered data. We found that while filtered data recorded from both human and artificial eyes were colored for all eye trackers, for most eye trackers the signal was white when examining both unfiltered human and unfiltered artificial eye data. These results suggest that color in the eye-movement recordings was due to filters for all eye trackers except the most precise eye tracker, where it may partly reflect fixational eye movements. As such, researchers studying fixational eye movements should be careful to examine the properties of the filters in their eye tracker to ensure they are studying eyeball rotation and not filter properties.
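The spectral "color" discussed in this abstract is commonly summarized as the slope of the power spectrum on log-log axes: approximately 0 for white noise, increasingly negative for lowpass-colored signals. A rough periodogram-based sketch follows; it is illustrative only and not the authors' analysis code.

```python
import numpy as np

def spectral_slope(signal, fs):
    """Estimate the log-log slope of a signal's power spectrum via a
    simple periodogram. ~0 for white noise; negative slopes indicate
    'colored' (e.g., lowpass-filtered) signal dynamics."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                       # remove DC offset
    psd = np.abs(np.fft.rfft(sig))**2            # periodogram
    freqs = np.fft.rfftfreq(sig.size, d=1.0/fs)
    keep = freqs > 0                             # skip DC before taking logs
    slope, _ = np.polyfit(np.log10(freqs[keep]),
                          np.log10(psd[keep] + 1e-30), 1)
    return slope
```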
|
17
|
Abstract
The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker’s data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
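One standard way to synthesize noise of a chosen spectral color, in the spirit of (but not identical to) the method this paper presents, is to shape white Gaussian noise in the frequency domain. The sketch below is our generic illustration; the function name and parameterization are assumptions.

```python
import numpy as np

def colored_noise(n, alpha, rng=None):
    """Generate n samples of unit-variance noise whose power spectrum
    falls off as 1/f**alpha (alpha=0: white; larger alpha: more
    low-frequency power), by shaping white noise in the frequency domain."""
    rng = np.random.default_rng() if rng is None else rng
    spec = np.fft.rfft(rng.standard_normal(n))   # spectrum of white noise
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                          # avoid division by zero at DC
    spec *= freqs ** (-alpha / 2.0)              # shape amplitude spectrum
    noise = np.fft.irfft(spec, n)
    return noise / noise.std()                   # normalize to unit variance
```

Scaling the output then sets the noise magnitude independently of its signal type, as the measures discussed above require.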
|
18
|
Evaluating three approaches to binary event-level agreement scoring. A reply to Friedman (2020). Behav Res Methods 2020; 53:325-334. [PMID: 32705657 PMCID: PMC7880951 DOI: 10.3758/s13428-020-01425-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|