1. Huang Z, Duan X, Zhu G, Zhang S, Wang R, Wang Z. Assessing the data quality of AdHawk MindLink eye-tracking glasses. Behav Res Methods 2024; 56:5771-5787. [PMID: 38168041] [DOI: 10.3758/s13428-023-02310-2]
Abstract
Most commercially available eye-tracking devices rely on video cameras and image processing algorithms to track gaze. Despite this, emerging technologies are entering the field, making high-speed, cameraless eye-tracking more accessible. In this study, a series of tests were conducted to compare the data quality of MEMS-based eye-tracking glasses (AdHawk MindLink) with three widely used camera-based eye-tracking devices (EyeLink Portable Duo, Tobii Pro Glasses 2, and SMI Eye Tracking Glasses 2). The data quality measures assessed in these tests included accuracy, precision, data loss, and system latency. The results suggest that, overall, the data quality of the eye-tracking glasses was lower compared to that of a desktop EyeLink Portable Duo eye-tracker. Among the eye-tracking glasses, the accuracy and precision of the MindLink eye-tracking glasses were either higher or on par with those of Tobii Pro Glasses 2 and SMI Eye Tracking Glasses 2. The system latency of MindLink was approximately 9 ms, significantly lower than that of camera-based eye-tracking devices found in VR goggles. These results suggest that the MindLink eye-tracking glasses show promise for research applications where high sampling rates and low latency are preferred.
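For readers reproducing such comparisons, accuracy and precision are conventionally computed from gaze samples recorded while a participant fixates a known target: accuracy as the mean angular offset from the target, precision as the RMS of sample-to-sample deviations. Below is a minimal sketch with hypothetical sample values; it is not the authors' pipeline or data.

```python
import numpy as np

def accuracy_deg(gaze_x, gaze_y, target_x, target_y):
    """Mean angular offset (deg) between gaze samples and a known fixation target."""
    offsets = np.hypot(np.asarray(gaze_x) - target_x, np.asarray(gaze_y) - target_y)
    return float(offsets.mean())

def precision_rms_deg(gaze_x, gaze_y):
    """RMS of successive sample-to-sample distances (deg), a common precision measure."""
    dx = np.diff(gaze_x)
    dy = np.diff(gaze_y)
    return float(np.sqrt(np.mean(dx ** 2 + dy ** 2)))

# Hypothetical gaze samples, already converted to degrees of visual angle
gx = np.array([0.12, 0.15, 0.11, 0.14, 0.13])
gy = np.array([-0.20, -0.18, -0.22, -0.19, -0.21])
print(accuracy_deg(gx, gy, 0.0, 0.0))   # offset from a target at (0, 0)
print(precision_rms_deg(gx, gy))
```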
Affiliation(s)
- Zehao Huang: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Xiaoting Duan: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Gancheng Zhu: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Shuai Zhang: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Rong Wang: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Zhiguo Wang: Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
2. Feng Y, Chen Y, Zhang J, Tian C, Ren R, Han T, Proctor RW. Human-centred design of next generation transportation infrastructure with connected and automated vehicles: a system-of-systems perspective. Theoretical Issues in Ergonomics Science 2023. [DOI: 10.1080/1463922x.2023.2182003]
Affiliation(s)
- Yiheng Feng: Lyles School of Civil Engineering, Purdue University, West Lafayette, IN, USA
- Yunfeng Chen: School of Construction Management Technology, Purdue University, West Lafayette, IN, USA
- Jiansong Zhang: School of Construction Management Technology, Purdue University, West Lafayette, IN, USA
- Chi Tian: School of Construction Management Technology, Purdue University, West Lafayette, IN, USA
- Ran Ren: School of Construction Management Technology, Purdue University, West Lafayette, IN, USA
- Tianfang Han: Department of Psychological Sciences, Purdue University, West Lafayette, IN, USA
- Robert W. Proctor: Department of Psychological Sciences, Purdue University, West Lafayette, IN, USA
3. Enhancing the Sense of Attention from an Assistance Mobile Robot by Improving Eye-Gaze Contact from Its Iconic Face Displayed on a Flat Screen. Sensors 2022; 22:4282. [PMID: 35684903] [PMCID: PMC9185237] [DOI: 10.3390/s22114282]
Abstract
One direct way to express a sense of attention in human interaction is through gaze. This paper presents the enhancement of the sense of attention conveyed by the face of a human-sized mobile robot during an interaction. The robot was designed as an assistance mobile robot and uses a flat screen at its top to display an iconic (simplified) face with big round eyes and a single line as a mouth. Implementing eye-gaze contact with this iconic face is difficult because real 3D spherical eyes must be simulated in a 2D image while accounting for the perspective of the person interacting with the mobile robot. The perception of eye-gaze contact was improved by manually calibrating the gaze of the robot relative to the location of the face of the person interacting with it. The sense of attention was further enhanced by implementing cyclic face explorations with saccades in the gaze and by performing blinking and small movements of the mouth.
4
|
Keller-Hamilton B, Fioritto M, Klein EG, Brinkman MC, Pennell ML, Nini P, Patterson JG, Ferketich AK. Visual attention to blu's parody warnings and the FDA's warning on e-cigarette advertisements. Addict Behav 2022; 125:107169. [PMID: 34768058 PMCID: PMC8629956 DOI: 10.1016/j.addbeh.2021.107169] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2021] [Revised: 10/06/2021] [Accepted: 10/26/2021] [Indexed: 02/03/2023]
Abstract
OBJECTIVES In 2017, the e-cigarette brand, blu, released advertisements featuring large, boxed, positively-framed messages. These messages mimicked the format of FDA-mandated warnings that would appear on e-cigarette advertisements in the United States in 2018. We compared attention to blu's parody warnings and the FDA-mandated warning appearing on blu advertisements. METHODS N = 73 young adults who had used tobacco participated in an eye-tracking study. Participants viewed three blu e-cigarette advertisements in random order: one with a parody warning and two with the FDA-mandated warning (one with a model's face and one without). Areas of interest (AOIs) were the parody or FDA-mandated warning. We compared dwell time on AOIs between the three advertisements. RESULTS Participants viewed parody warnings longer than each FDA-mandated warning on average (254 and 608 ms longer; p's < 0.02). Comparing the advertisements with FDA-mandated warnings revealed that participants spent less time looking at the warning in the advertisement with a model's face (354 fewer milliseconds; p = 0.001). CONCLUSIONS Parody warnings attracted more visual attention than FDA-mandated warnings, and the presence of a face in the advertisement drew attention away from the FDA-mandated warning. Results underscore the need for advertisement regulations that support increased attention to health warnings.
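Dwell time on an AOI is typically obtained by summing the durations of fixations that land inside the AOI's bounds. The sketch below illustrates the computation with hypothetical fixation data and AOI coordinates; it is not the authors' analysis code.

```python
import numpy as np

def dwell_time_ms(fix_x, fix_y, fix_dur_ms, aoi):
    """Sum fixation durations (ms) for fixations falling inside a rectangular AOI.
    aoi = (x_min, y_min, x_max, y_max) in the same coordinate system as the fixations."""
    x_min, y_min, x_max, y_max = aoi
    fix_x, fix_y = np.asarray(fix_x), np.asarray(fix_y)
    inside = (fix_x >= x_min) & (fix_x <= x_max) & (fix_y >= y_min) & (fix_y <= y_max)
    return float(np.sum(np.asarray(fix_dur_ms)[inside]))

# Hypothetical fixation report for one advertisement (pixel coordinates, ms durations)
fx = np.array([120, 640, 660, 300])
fy = np.array([80, 900, 910, 400])
dur = np.array([220, 310, 180, 250])
warning_aoi = (600, 850, 1024, 980)          # hypothetical warning region
print(dwell_time_ms(fx, fy, dur, warning_aoi))
```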
Affiliation(s)
- Brittney Keller-Hamilton: College of Medicine, The Ohio State University, Columbus, OH, USA; Center for Tobacco Research, The Ohio State University Comprehensive Cancer Center, Columbus, OH, USA
- Makala Fioritto: Environmental, Health, and Safety, Textron Inc., Providence, RI, USA
- Elizabeth G Klein: College of Public Health, The Ohio State University, Columbus, OH, USA
- Marielle C Brinkman: Center for Tobacco Research, The Ohio State University Comprehensive Cancer Center, Columbus, OH, USA; College of Public Health, The Ohio State University, Columbus, OH, USA
- Michael L Pennell: College of Public Health, The Ohio State University, Columbus, OH, USA
- Paul Nini: College of Arts and Sciences, The Ohio State University, Columbus, OH, USA
- Amy K Ferketich: College of Public Health, The Ohio State University, Columbus, OH, USA
5. Attard-Johnson J, Vasilev MR, Ó Ciardha C, Bindemann M, Babchishin KM. Measurement of Sexual Interests with Pupillary Responses: A Meta-Analysis. Archives of Sexual Behavior 2021; 50:3385-3411. [PMID: 34557971] [PMCID: PMC8604861] [DOI: 10.1007/s10508-021-02137-y]
Abstract
Objective measures of sexual interest are important for research on human sexuality. There has been a resurgence in research examining pupil dilation as a potential index of sexual orientation. We carried out a meta-analytic review of studies published between 1965 and 2020 (Mdn year = 2016) measuring pupil responses to visual stimuli of adult men and women to assess sexual interest. Separate meta-analyses were performed for six sexual orientation categories. In the final analysis, 15 studies were included for heterosexual men (N = 550), 5 studies for gay men (N = 65), 4 studies for bisexual men (N = 124), 13 studies for heterosexual women (N = 403), and 3 studies for lesbian women (N = 132). Only heterosexual and gay men demonstrated discrimination in pupillary responses that was clearly in line with their sexual orientation, with greater pupil dilation to female and male stimuli, respectively. Bisexual men showed greater pupil dilation to male stimuli. Although heterosexual women exhibited larger pupils to male stimuli compared to female stimuli, the magnitude of the effect was small and non-significant. Finally, lesbian women displayed greater pupil dilation to male stimuli. Three methodological moderators were identified: the sexual explicitness of stimulus materials, the measurement technique of pupillary response, and inclusion of self-report measures of sexual interest. These meta-analyses are based on a limited number of studies and are therefore preliminary. However, the results suggest that pupillary measurement of sexual interest is promising for men and that standardization is essential to gain a better understanding of the validity of this measurement technique for sexual interest.
Affiliation(s)
- Janice Attard-Johnson: Department of Psychology, Faculty of Science and Technology, Bournemouth University, Dorset, BH12 5BB, UK
- Martin R Vasilev: Department of Psychology, Faculty of Science and Technology, Bournemouth University, Dorset, BH12 5BB, UK
6. Aust J, Mitrovic A, Pons D. Assessment of the Effect of Cleanliness on the Visual Inspection of Aircraft Engine Blades: An Eye Tracking Study. Sensors 2021; 21:6135. [PMID: 34577343] [PMCID: PMC8473167] [DOI: 10.3390/s21186135]
Abstract
Background-The visual inspection of aircraft parts such as engine blades is crucial to ensure safe aircraft operation. There is a need to understand the reliability of such inspections and the factors that affect the results. In this study, the factor 'cleanliness' was analysed among other factors. Method-Fifty industry practitioners of three expertise levels inspected 24 images of parts with a variety of defects in clean and dirty conditions, resulting in a total of N = 1200 observations. The data were analysed statistically to evaluate the relationships between cleanliness and inspection performance. Eye tracking was applied to understand the search strategies of different levels of expertise for various part conditions. Results-The results show an inspection accuracy of 86.8% and 66.8% for clean and dirty blades, respectively. The statistical analysis showed that cleanliness and defect type influenced the inspection accuracy, while expertise was surprisingly not a significant factor. In contrast, inspection time was affected by expertise along with other factors, including cleanliness, defect type and visual acuity. Eye tracking revealed that inspectors (experts) apply a more structured and systematic search with fewer fixations and revisits compared to other groups. Conclusions-Cleaning prior to inspection leads to better results. Eye tracking revealed that inspectors used an underlying search strategy characterised by edge detection and differentiation between surface deposits and other types of damage, which contributed to better performance.
Affiliation(s)
- Jonas Aust: Department of Mechanical Engineering, University of Canterbury, Christchurch 8041, New Zealand
- Antonija Mitrovic: Department of Computer Science and Software Engineering, University of Canterbury, Christchurch 8041, New Zealand
- Dirk Pons: Department of Mechanical Engineering, University of Canterbury, Christchurch 8041, New Zealand
7
|
Zandi B, Lode M, Herzog A, Sakas G, Khanh TQ. PupilEXT: Flexible Open-Source Platform for High-Resolution Pupillometry in Vision Research. Front Neurosci 2021; 15:676220. [PMID: 34220432 PMCID: PMC8249868 DOI: 10.3389/fnins.2021.676220] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2021] [Accepted: 04/28/2021] [Indexed: 12/12/2022] Open
Abstract
Human pupil behavior has gained increased attention due to the discovery of the intrinsically photosensitive retinal ganglion cells and the afferent pupil control path's role as a biomarker for cognitive processes. Diameter changes in the range of 10⁻² mm are of interest, requiring reliable and characterized measurement equipment to accurately detect neurocognitive effects on the pupil. Pupillometry mostly relies on commercial measurement devices, which entail high investment costs. Moreover, commercial systems rely on closed software, restricting conclusions about the pupil-tracking algorithms used. Here, we developed an open-source pupillometry platform consisting of hardware and software competitive with high-end commercial stereo eye-tracking systems. Our goal was to make a professional remote pupil measurement pipeline for laboratory conditions accessible for everyone. This work's core outcome is an integrated cross-platform (macOS, Windows and Linux) pupillometry software called PupilEXT, featuring a user-friendly graphical interface covering the relevant requirements of professional pupil response research. We offer a selection of six state-of-the-art open-source pupil detection algorithms (Starburst, Swirski, ExCuSe, ElSe, PuRe and PuReST) to perform the pupil measurement. The developed 120-fps pupillometry demo system achieved a calibration accuracy of 0.003 mm and an averaged temporal pupil measurement detection accuracy of 0.0059 mm in stereo mode. The PupilEXT software has extended features in pupil detection, measurement validation, image acquisition, data acquisition, offline pupil measurement, camera calibration, stereo vision, data visualization and system independence, all combined in a single open-source interface, available at https://github.com/openPupil/Open-PupilEXT.
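As a rough illustration of what a pupil-detection step does, the sketch below thresholds the dark pupil region in a grayscale eye image and fits an ellipse with OpenCV. This naive approach is only a stand-in under assumed image properties; it is not one of the six algorithms bundled with PupilEXT.

```python
import cv2

def naive_pupil_diameter_px(eye_image_gray, threshold=40):
    """Very naive pupil sizing: threshold the dark pupil region, take the largest
    contour, and fit an ellipse. Illustration only; PupilEXT ships dedicated
    detectors (Starburst, Swirski, ExCuSe, ElSe, PuRe, PuReST)."""
    _, mask = cv2.threshold(eye_image_gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:                         # fitEllipse needs at least 5 points
        return None
    (_, _), (axis_a, axis_b), _ = cv2.fitEllipse(largest)
    return (axis_a + axis_b) / 2.0               # mean axis length in pixels

# Usage with a hypothetical file name:
# diameter = naive_pupil_diameter_px(cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE))
```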
Affiliation(s)
- Babak Zandi: Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Moritz Lode: Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Alexander Herzog: Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Georgios Sakas: Interactive Graphic Systems, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
- Tran Quoc Khanh: Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
8. Dai W, Selesnick I, Rizzo JR, Rucker J, Hudson T. Detection of normal and slow saccades using implicit piecewise polynomial approximation. J Vis 2021; 21:8. [PMID: 34125160] [PMCID: PMC8212426] [DOI: 10.1167/jov.21.6.8]
Abstract
The quantitative analysis of saccades in eye movement data unveils information associated with intention, cognition, and health status. Abnormally slow saccades are indicative of neurological disorders and often imply a specific pathological disturbance. However, conventional saccade detection algorithms are not designed to detect slow saccades, and are correspondingly unreliable when saccades are unusually slow. In this article, we propose an algorithm that is effective for the detection of both normal and slow saccades. The proposed algorithm is partly based on modeling saccadic waveforms as piecewise-quadratic signals. The algorithm first decreases noise in acquired eye-tracking data using optimization to minimize a prescribed objective function, then uses velocity thresholding to detect saccades. Using both simulated saccades and real saccades generated by healthy subjects and patients, we evaluate the performance of the proposed algorithm and 10 other detection algorithms. We show the proposed algorithm is more accurate in detecting both normal and slow saccades than other algorithms.
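In its simplest form, the detection step described above reduces to thresholding eye velocity computed from a denoised position trace. The sketch below substitutes plain moving-average smoothing for the paper's piecewise-quadratic optimization, so it is only a rough stand-in for the proposed algorithm; the sampling rate and threshold are assumptions.

```python
import numpy as np

def detect_saccades(position_deg, fs, vel_threshold=30.0):
    """Return (onset, offset) sample indices where smoothed eye velocity exceeds
    a fixed threshold (deg/s). Plain velocity thresholding, not the implicit
    piecewise-polynomial method proposed in the paper."""
    kernel = np.ones(5) / 5.0                           # light moving-average smoothing
    smoothed = np.convolve(position_deg, kernel, mode="same")
    velocity = np.gradient(smoothed) * fs               # deg/s
    is_saccade = np.abs(velocity) > vel_threshold
    edges = np.diff(is_saccade.astype(int))
    onsets = np.where(edges == 1)[0] + 1
    offsets = np.where(edges == -1)[0] + 1
    return list(zip(onsets, offsets))

# Hypothetical 500-Hz trace with one injected saccade-like step at sample 500
t = np.arange(1000)
pos = np.where(t < 500, 0.0, 8.0) + np.random.normal(0, 0.05, 1000)
print(detect_saccades(pos, fs=500))
```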
Affiliation(s)
- Weiwei Dai: Department of Electrical and Computer Engineering, Tandon School of Engineering, New York University, Brooklyn, NY, USA
- Ivan Selesnick: Department of Electrical and Computer Engineering, Tandon School of Engineering, New York University, Brooklyn, NY, USA
- John-Ross Rizzo: Department of Neurology, School of Medicine, New York University, New York, NY, USA
- Janet Rucker: Department of Neurology, School of Medicine, New York University, New York, NY, USA
- Todd Hudson: Department of Neurology, School of Medicine, New York University, New York, NY, USA
9. Spiller M, Liu YH, Hossain MZ, Gedeon T, Geissler J, Nürnberger A. Predicting Visual Search Task Success from Eye Gaze Data as a Basis for User-Adaptive Information Visualization Systems. ACM Transactions on Interactive Intelligent Systems 2021. [DOI: 10.1145/3446638]
Abstract
Information visualizations are an efficient means to support the users in understanding large amounts of complex, interconnected data; user comprehension, however, depends on individual factors such as their cognitive abilities. The research literature provides evidence that user-adaptive information visualizations positively impact the users’ performance in visualization tasks. This study attempts to contribute toward the development of a computational model to predict the users’ success in visual search tasks from eye gaze data and thereby drive such user-adaptive systems. State-of-the-art deep learning models for time series classification have been trained on sequential eye gaze data obtained from 40 study participants’ interaction with a circular and an organizational graph. The results suggest that such models yield higher accuracy than a baseline classifier and previously used models for this purpose. In particular, a Multivariate Long Short Term Memory Fully Convolutional Network shows encouraging performance for its use in online user-adaptive systems. Given this finding, such a computational model can infer the users’ need for support during interaction with a graph and trigger appropriate interventions in user-adaptive information visualization systems. This facilitates the design of such systems since further interaction data like mouse clicks is not required.
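The sketch below shows what an LSTM-FCN style classifier for multivariate gaze sequences can look like in PyTorch: an LSTM branch and a 1D-convolutional branch whose features are concatenated before a linear output layer. It is a simplified stand-in under assumed input dimensions, not the exact architecture or hyperparameters used in the study.

```python
import torch
import torch.nn as nn

class LSTMFCN(nn.Module):
    """Minimal LSTM-FCN style classifier for multivariate gaze time series."""
    def __init__(self, n_channels, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=8, padding="same"), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding="same"), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                 # global average pooling over time
        )
        self.fc = nn.Linear(hidden + 128, n_classes)

    def forward(self, x):                            # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)                     # final hidden state of the LSTM branch
        conv_feat = self.conv(x.transpose(1, 2)).squeeze(-1)
        return self.fc(torch.cat([h[-1], conv_feat], dim=1))

# Hypothetical batch: 8 trials, 500 samples, 4 gaze channels (x, y, pupil, velocity)
model = LSTMFCN(n_channels=4, n_classes=2)
logits = model(torch.randn(8, 500, 4))
```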
Affiliation(s)
- Moritz Spiller: INKA—Innovation Laboratory for Image Guided Therapy, Health Campus Immunology Infectiology and Inflammation (GC-I3), Otto-von-Guericke-University, Germany
- Tom Gedeon: The Australian National University, Australia
10. When assistive eye tracking fails: Communicating with a brainstem-stroke patient through the pupillary accommodative response – A case study. Biomed Signal Process Control 2021. [DOI: 10.1016/j.bspc.2021.102515]
11. Di Stasi LL, Diaz-Piedra C, Morales JM, Kurapov A, Tagliabue M, Bjärtå A, Megias A, Bernhardsson J, Paschenko S, Romero S, Cándido A, Catena A. A cross-cultural comparison of visual search strategies and response times in road hazard perception testing. Accident Analysis and Prevention 2020; 148:105785. [PMID: 33161370] [DOI: 10.1016/j.aap.2020.105785]
Abstract
Road hazard perception is considered the most prominent higher-order cognitive skill related to traffic-accident involvement. Regional cultures and social rules that govern acceptable behavior may influence drivers' interpretation of a traffic situation and, consequently, the correct identification of potentially hazardous situations. Here, we aimed to compare hazard perception skills among four European countries that differ in their traffic culture, policies to reduce traffic risks, and fatal crashes: Ukraine, Italy, Spain, and Sweden. We developed a static hazard perception test in which driving scenes with different levels of braking affordance were presented while drivers' gaze was recorded. The test required drivers to indicate the action they would undertake: to brake vs. to keep driving. We assessed 218 young adult drivers. Multilevel models revealed that the scenes' levels of braking affordance (i.e., road hazard) modulated drivers' behavior. As the levels of braking affordance increased, drivers' responses became faster and their gaze entropy decreased (i.e., visual search strategy became less erratic). The country of origin influenced these effects. Ukrainian drivers were the fastest and Swedish drivers were the slowest to respond. For all countries, the decrement in response times was less marked in the case of experienced drivers. Also, Spanish drivers showed the most structured (least erratic) visual search strategy, whereas the Italians had the most rigid (most constant) one. These results suggest that road hazard perception can be defined cross-culturally, with cultural factors (e.g., traffic climate, legislation) modulating response times and visual search strategies. Our results also support the idea that a multimodal assessment methodology is possible for mass testing of road hazard perception and its outcomes would be relevant to understand how different traffic cultures shape driving behavior.
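Gaze entropy of the kind reported above can be illustrated with a minimal Shannon-entropy computation over how fixations are distributed across areas of interest; lower entropy corresponds to a more structured, less erratic search. The AOI labels below are hypothetical, not the study's coding scheme.

```python
import numpy as np

def stationary_gaze_entropy(aoi_labels):
    """Shannon entropy (bits) of the distribution of fixations over AOIs."""
    _, counts = np.unique(aoi_labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Hypothetical AOI sequence for one driving scene
print(stationary_gaze_entropy(["road", "mirror", "road", "road", "sign", "road"]))
```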
Affiliation(s)
- Leandro L Di Stasi: Mind, Brain, and Behavior Research Center, University of Granada, Granada, Spain
- Carolina Diaz-Piedra: Mind, Brain, and Behavior Research Center, University of Granada, Granada, Spain; College of Nursing and Health Innovation, Arizona State University, Phoenix, AZ, USA
- José M Morales: Mind, Brain, and Behavior Research Center, University of Granada, Granada, Spain; Department of Computer Architecture and Technology, University of Granada, Granada, Spain
- Anton Kurapov: Faculty of Psychology, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
- Anna Bjärtå: Department of Psychology and Social Work, Mid Sweden University, Östersund, Sweden
- Alberto Megias: Department of Basic Psychology, Faculty of Psychology, University of Malaga, Malaga, Spain
- Jens Bernhardsson: Department of Psychology and Social Work, Mid Sweden University, Östersund, Sweden
- Svitlana Paschenko: Faculty of Psychology, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
- Samuel Romero: Department of Computer Architecture and Technology, University of Granada, Granada, Spain
- Antonio Cándido: Mind, Brain, and Behavior Research Center, University of Granada, Granada, Spain
- Andrés Catena: Mind, Brain, and Behavior Research Center, University of Granada, Granada, Spain
12. Thierfelder P, Durantin G, Wigglesworth G. The Effect of Word Predictability on Phonological Activation in Cantonese Reading: A Study of Eye-Fixations and Pupillary Response. Journal of Psycholinguistic Research 2020; 49:779-801. [PMID: 32556719] [DOI: 10.1007/s10936-020-09713-8]
Abstract
This study aimed to investigate the effects of contextual predictability on orthographic and phonological activation during Chinese sentence reading by Cantonese-speaking readers using the error disruption paradigm. Participants' eye fixations and pupil sizes were recorded while they silently read Chinese sentences containing homophonic, orthographic, and unrelated errors. Sentences had varying amounts of contextual information leading up to target words such that some targets were more predictable than others. Results of the fixation time analysis indicated that orthographic effects were significant in first fixation and gaze duration, while phonological effects emerged later in total reading time. However, interactions between predictability and the homophonic condition were found in gaze duration. These results suggest that, while Cantonese readers activate word meanings primarily through orthography in early processing, early phonological activation can occur when facilitated by semantics in high-constraint sentence contexts. Analysis of pupillary response measurements revealed that participants' pupil sizes became larger when they read words containing orthographic errors, suggesting that orthographic error recovery processes significantly increase cognitive load.
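The fixation-time measures named above (first fixation duration, gaze duration, total reading time) can be derived from a fixation report once fixations are assigned to words and reading passes. Below is a minimal pandas sketch with hypothetical column names and values, not the study's data or analysis code.

```python
import pandas as pd

def reading_time_measures(fixations):
    """First fixation duration, gaze duration (sum of first-pass fixations), and
    total reading time per word, from a fixation report with columns
    'word', 'duration', and 'pass' (1 = first pass). Rows assumed in temporal order."""
    first_pass = fixations[fixations["pass"] == 1]
    return pd.DataFrame({
        "first_fixation": first_pass.groupby("word")["duration"].first(),
        "gaze_duration": first_pass.groupby("word")["duration"].sum(),
        "total_reading_time": fixations.groupby("word")["duration"].sum(),
    })

# Hypothetical fixation report
fix = pd.DataFrame({
    "word": ["target", "target", "next", "target"],
    "duration": [210, 180, 230, 260],
    "pass": [1, 1, 1, 2],
})
print(reading_time_measures(fix))
```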
Affiliation(s)
- Philip Thierfelder: ARC Centre of Excellence for the Dynamics of Language, The University of Melbourne, Melbourne, Australia
- Gautier Durantin: ARC Centre of Excellence for the Dynamics of Language, The University of Queensland, Brisbane, Australia
- Gillian Wigglesworth: ARC Centre of Excellence for the Dynamics of Language, The University of Melbourne, Melbourne, Australia
13. Carter BT, Luke SG. Best practices in eye tracking research. Int J Psychophysiol 2020; 155:49-62. [PMID: 32504653] [DOI: 10.1016/j.ijpsycho.2020.05.010]
Abstract
This guide describes best practices in using eye tracking technology for research in a variety of disciplines. A basic outline of the anatomy and physiology of the eyes and of eye movements is provided, along with a description of the sorts of research questions eye tracking can address. We then explain how eye tracking technology works and what sorts of data it generates, and provide guidance on how to select and use an eye tracker as well as selecting appropriate eye tracking measures. Challenges to the validity of eye tracking studies are described, along with recommendations for overcoming these challenges. We then outline correct reporting standards for eye tracking studies.
14. Development and validation of a high-speed video system for measuring saccadic eye movement. Behav Res Methods 2020; 51:2302-2309. [PMID: 30706347] [DOI: 10.3758/s13428-019-01197-2]
Abstract
Laboratory-based retroreflective and magnetic scleral search-coil technologies are the current standards for collecting saccadometric data, but such equipment is costly and cumbersome. We have validated a novel, portable, high-speed video camera-based system (Exilim EX-FH20, Casio, Tokyo, Japan) for measuring saccade reaction time (RT) and error rate in a well-lit environment. This system would enable measurements of pro- and antisaccades in athletes, which is important because antisaccade metrics provide a valid tool for concussion diagnosis and determining an athlete's safe return to play. A total of 529 trials collected from 15 participants were used to compare saccade RT and error rate measurements of the high-speed camera system to a retroreflective video-based eye tracker (Eye-Trac 6: Applied Sciences Laboratories, Bedford, MA). Bland-Altman analysis revealed that the RT measurements made by the high-speed video system were 11 ms slower than those made by the retroreflective system. Error rate measurements were identical between the two systems. An excellent degree of reliability was found between the system measurements and in the ratings of independent researchers examining the video data. A strong association (r = .97) between the RTs determined via the retroreflective and high-speed camera systems was observed across all trials. Our high-speed camera system is portable and easily set up, does not require extensive equipment calibration, and can be used in a well-lit environment. Accordingly, the camera-based capture of saccadometric data may provide a valuable tool for neurological assessment following a concussive event and for the continued monitoring of recovery.
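Bland-Altman agreement between two measurement systems comes down to the mean paired difference (bias) and its 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical paired reaction times, not the study's data:

```python
import numpy as np

def bland_altman(rt_system_a, rt_system_b):
    """Mean difference (bias) and 95% limits of agreement between paired measurements."""
    a = np.asarray(rt_system_a, dtype=float)
    b = np.asarray(rt_system_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired saccade RTs (ms): high-speed camera vs. reference tracker
cam = [212, 198, 240, 225, 205]
ref = [200, 190, 228, 215, 196]
print(bland_altman(cam, ref))
```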
15.
Abstract
Pupil dilation is an effective indicator of cognitive and affective processes. Although several eyetracker systems on the market can provide effective solutions for pupil dilation measurement, there is a lack of tools for processing and analyzing the data provided by these systems. For this reason, we developed CHAP: open-source software written in MATLAB. This software provides a user-friendly graphical user interface for processing and analyzing pupillometry data. Our software creates uniform conventions for the preprocessing and analysis of pupillometry data and provides a quick and easy-to-use tool for researchers interested in pupillometry. To download CHAP or join our mailing list, please visit CHAP's website: http://in.bgu.ac.il/en/Labs/CNL/chap .
16. Ehinger BV, Groß K, Ibs I, König P. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000. PeerJ 2019; 7:e7086. [PMID: 31328028] [PMCID: PMC6625505] [DOI: 10.7717/peerj.7086]
Abstract
Eye-tracking experiments rely heavily on good data quality of eye-trackers. Unfortunately, it is often the case that only the spatial accuracy and precision values are available from the manufacturers. These two values alone are not sufficient to serve as a benchmark for an eye-tracker: Eye-tracking quality deteriorates during an experimental session due to head movements, changing illumination or calibration decay. Additionally, different experimental paradigms require the analysis of different types of eye movements; for instance, smooth pursuit movements, blinks or microsaccades, which themselves cannot readily be evaluated by using spatial accuracy or precision alone. To obtain a more comprehensive description of properties, we developed an extensive eye-tracking test battery. In 10 different tasks, we evaluated eye-tracking related measures such as the decay of accuracy, fixation durations, pupil dilation, smooth pursuit movement, microsaccade classification, blink classification, or the influence of head motion. For some measures, true theoretical values exist. For others, a relative comparison to a reference eye-tracker is needed. Therefore, we collected our gaze data simultaneously from a remote EyeLink 1000 eye-tracker as the reference and compared it with the mobile Pupil Labs glasses. As expected, the average spatial accuracy of 0.57° for the EyeLink 1000 eye-tracker was better than the 0.82° for the Pupil Labs glasses (N = 15). Furthermore, we classified fewer fixations and measured shorter saccade durations for the Pupil Labs glasses. Similarly, we found fewer microsaccades using the Pupil Labs glasses. The accuracy over time decayed only slightly for the EyeLink 1000, but strongly for the Pupil Labs glasses. Finally, we observed that the measured pupil diameters differed between eye-trackers on the individual subject level but not on the group level. To conclude, our eye-tracking test battery offers 10 tasks that allow us to benchmark the many parameters of interest in stereotypical eye-tracking situations and addresses a common source of confounds in measurement errors (e.g., yaw and roll head movements). All recorded eye-tracking data (including Pupil Labs' eye videos), the stimulus code for the test battery, and the modular analysis pipeline are freely available (https://github.com/behinger/etcomp).
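Microsaccade classification of the kind included in the battery is often done with a median-based velocity threshold in the style of Engbert and Kliegl. The sketch below marks samples exceeding an elliptic threshold of λ times a robust per-axis velocity spread; it is a simplified illustration with synthetic velocities, not the authors' pipeline.

```python
import numpy as np

def microsaccade_mask(vx, vy, lam=6.0):
    """Mark samples whose 2D velocity exceeds an elliptic threshold of lam times a
    median-based estimate of the velocity spread per axis (Engbert & Kliegl style)."""
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    return (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0

# Hypothetical horizontal/vertical eye velocities (deg/s) during one fixation period
vx = np.random.normal(0, 3, 1000); vx[500:505] += 40   # injected microsaccade-like burst
vy = np.random.normal(0, 3, 1000)
print(np.where(microsaccade_mask(vx, vy))[0])
```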
Affiliation(s)
- Benedikt V. Ehinger: Institute of Cognitive Science, Osnabrück University, Osnabrück, Germany; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Katharina Groß: Institute of Cognitive Science, Osnabrück University, Osnabrück, Germany
- Inga Ibs: Institute of Cognitive Science, Osnabrück University, Osnabrück, Germany
- Peter König: Institute of Cognitive Science, Osnabrück University, Osnabrück, Germany; Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
17. Diego-Mas JA, Garzon-Leal D, Poveda-Bautista R, Alcaide-Marzal J. User-interfaces layout optimization using eye-tracking, mouse movements and genetic algorithms. Applied Ergonomics 2019; 78:197-209. [PMID: 31046951] [DOI: 10.1016/j.apergo.2019.03.004]
Abstract
Establishing the best layout configuration for software-generated interfaces and control panels is a complex problem when they include many controls and indicators. Several methods have been developed for arranging the interface elements; however, the results are usually conceptual designs that must be manually adjusted to obtain layouts valid for real situations. Based on these considerations, in this work we propose a new automated procedure to obtain optimal layouts for software-based interfaces. Eye-tracking and mouse-tracking data collected during the use of the interface are used to obtain the best configuration for its elements. The solutions are generated using a slicing-tree-based genetic algorithm. This algorithm is able to obtain directly applicable configurations that respect the geometrical restrictions of elements in the interface. Results show that this procedure increases the effectiveness, efficiency, and satisfaction of users when they interact with the obtained interfaces.
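To make the optimization idea concrete, the sketch below runs a tiny permutation-based genetic algorithm whose fitness is the transition-frequency-weighted distance between interface elements. It deliberately simplifies the paper's slicing-tree encoding to a fixed grid of candidate positions; all names and data are hypothetical.

```python
import random

def layout_cost(order, transitions, positions):
    """Sum over element pairs of transition frequency times the distance between
    their assigned positions (a simplified fitness, not the slicing-tree encoding)."""
    pos = {elem: positions[i] for i, elem in enumerate(order)}
    cost = 0.0
    for (a, b), freq in transitions.items():
        (xa, ya), (xb, yb) = pos[a], pos[b]
        cost += freq * ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
    return cost

def genetic_layout(elements, transitions, positions, pop=60, gens=200, mut=0.2):
    """Tiny permutation GA: elitism, cut-and-fill crossover, and swap mutation."""
    population = [random.sample(elements, len(elements)) for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda o: layout_cost(o, transitions, positions))
        nxt = scored[:10]                                 # keep the best layouts
        while len(nxt) < pop:
            p1, p2 = random.sample(scored[:30], 2)
            cut = random.randrange(1, len(elements))
            child = p1[:cut] + [e for e in p2 if e not in p1[:cut]]
            if random.random() < mut:                     # swap mutation
                i, j = random.sample(range(len(elements)), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        population = nxt
    return min(population, key=lambda o: layout_cost(o, transitions, positions))

# Hypothetical example: four controls, gaze/mouse transition counts, 2x2 grid slots
elements = ["A", "B", "C", "D"]
transitions = {("A", "B"): 12, ("B", "C"): 3, ("A", "D"): 7, ("C", "D"): 1}
positions = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(genetic_layout(elements, transitions, positions))
```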
Affiliation(s)
- Jose Antonio Diego-Mas: I3B, Institute for Research and Innovation in Bioengineering, Universitat Politecnica de Valencia, Camino de Vera s/n, 46022, Valencia, Spain
- Rocio Poveda-Bautista: INGENIO (CSIC-UPV), Universitat Politecnica de Valencia, Camino de Vera s/n, Valencia, 46022, Spain
- Jorge Alcaide-Marzal: I3B, Institute for Research and Innovation in Bioengineering, Universitat Politecnica de Valencia, Camino de Vera s/n, 46022, Valencia, Spain
18. Hassoumi A, Peysakhovich V, Hurter C. Improving eye-tracking calibration accuracy using symbolic regression. PLoS One 2019; 14:e0213675. [PMID: 30875387] [PMCID: PMC6420251] [DOI: 10.1371/journal.pone.0213675]
Abstract
Eye tracking systems have recently experienced a diversity of novel calibration procedures, including smooth pursuit and vestibulo-ocular reflex based calibrations. These approaches allowed collecting more data compared to the standard 9-point calibration. However, the mapping function that converts pupil features into planar gaze positions is mostly computed with polynomial regressions, and little work has investigated alternative approaches. This paper fills this gap by providing a new calibration computation method based on symbolic regression. Instead of making prior assumptions on the polynomial transfer function between input and output records, symbolic regression seeks an optimal model among different types of functions and their combinations. This approach offers an interesting perspective in terms of flexibility and accuracy. Therefore, we designed two experiments in which we collected ground truth data to compare vestibulo-ocular and smooth pursuit calibrations based on symbolic regression, each using either a marker or a finger as the target, resulting in four different calibrations. As a result, we improved calibration accuracy by more than 30%, with reasonable extra computation time.
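For contrast with the symbolic-regression approach, the polynomial baseline mentioned above amounts to a least-squares fit of a low-order polynomial from pupil features to screen coordinates. A minimal sketch for one screen axis, with hypothetical calibration data:

```python
import numpy as np

def fit_poly_calibration(pupil_xy, screen_coord):
    """Least-squares fit of a second-order polynomial mapping from pupil features
    (px, py) to one screen coordinate. Standard polynomial baseline, not the
    symbolic-regression method proposed in the paper."""
    px, py = np.asarray(pupil_xy, dtype=float).T
    design = np.column_stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2])
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(screen_coord, dtype=float), rcond=None)
    return coeffs

# Hypothetical calibration data: pupil-centre positions and known target x-coordinates
pupil = [(0.1, 0.2), (0.4, 0.2), (0.7, 0.3), (0.2, 0.6), (0.5, 0.7), (0.8, 0.6), (0.3, 0.9)]
target_x = [100, 400, 700, 180, 480, 760, 300]
print(fit_poly_calibration(pupil, target_x))
```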
Affiliation(s)
- Almoctar Hassoumi: DEVI, French Civil Aviation University - ENAC, Toulouse, France; DCAS, ISAE-SUPAERO, Université de Toulouse, Toulouse, France