1. Felinska EA, Fuchs TE, Kogkas A, Chen ZW, Otto B, Kowalewski KF, Petersen J, Müller-Stich BP, Mylonas G, Nickel F. Telestration with augmented reality improves surgical performance through gaze guidance. Surg Endosc 2023; 37:3557-3566. [PMID: 36609924 PMCID: PMC10156835 DOI: 10.1007/s00464-022-09859-7]
Abstract
BACKGROUND In minimally invasive surgery (MIS), trainees need to learn how to interpret the operative field displayed on the laparoscopic screen. Experts currently guide trainees mainly verbally during laparoscopic procedures. A newly developed telestration system with augmented reality (iSurgeon) allows the instructor to display hand gestures on the laparoscopic screen in real time to provide visual expert guidance (telestration). This study analysed the effect of telestration-guided instruction on gaze behaviour during MIS training. METHODS In a randomized controlled crossover study, 40 MIS-naive medical students performed 8 laparoscopic tasks with telestration or with verbal instructions only. Pupil Core eye-tracking glasses were used to capture the instructor's and trainees' gaze. Gaze behaviour measures for tasks 1-7 were gaze latency, gaze convergence and collaborative gaze convergence. Performance measures included the number of errors in tasks 1-7 and trainees' ratings on structured and standardized performance scores in task 8 (ex vivo porcine laparoscopic cholecystectomy). RESULTS With iSurgeon instruction there was a significant improvement in tasks 1-7 in gaze latency [F(1,39) = 762.5, p < 0.01, ηp2 = 0.95], gaze convergence [F(1,39) = 482.8, p < 0.01, ηp2 = 0.93] and collaborative gaze convergence [F(1,39) = 408.4, p < 0.01, ηp2 = 0.91]. The number of errors was significantly lower in tasks 1-7 (0.18 ± 0.56 vs. 1.94 ± 1.80, p < 0.01) and score ratings for laparoscopic cholecystectomy were significantly higher with telestration (global OSATS: 29 ± 2.5 vs. 25 ± 5.5, p < 0.01; task-specific OSATS: 60 ± 3 vs. 50 ± 6, p < 0.01). CONCLUSIONS Telestration with augmented reality successfully improved surgical performance. Trainees' gaze behaviour improved: the time from instruction to fixation on targets decreased, and the convergence of the instructor's and the trainees' gaze increased. The convergence of trainee gaze and target areas also increased with telestration. This confirms that augmented reality-based telestration works by means of gaze guidance in MIS and could be used to improve training outcomes.
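The reported effect sizes can be checked directly against the F statistics, since for a contrast with df1 numerator and df2 denominator degrees of freedom, partial eta squared is ηp² = F·df1 / (F·df1 + df2). A minimal Python sketch (not part of the study) reproducing the reported values:

```python
# Convert a reported ANOVA F statistic into partial eta squared:
# eta_p^2 = F * df1 / (F * df1 + df2).
def partial_eta_squared(f_value, df1, df2):
    return (f_value * df1) / (f_value * df1 + df2)

# Reproduce the effect sizes reported for the F(1, 39) tests:
print(round(partial_eta_squared(762.5, 1, 39), 2))  # gaze latency -> 0.95
print(round(partial_eta_squared(482.8, 1, 39), 2))  # gaze convergence -> 0.93
print(round(partial_eta_squared(408.4, 1, 39), 2))  # collaborative gaze convergence -> 0.91
```

All three computed values match the ηp2 figures quoted in the abstract.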
Affiliation(s)
- Eleni Amelia Felinska: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
- Thomas Ewald Fuchs: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
- Alexandros Kogkas: Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK; Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, London, SW7 2AZ, UK
- Zi-Wei Chen: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
- Benjamin Otto: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
- Karl-Friedrich Kowalewski: Department of Urology and Urological Surgery, University Medical Center Mannheim, Heidelberg University, 68167 Mannheim, Germany
- Jens Petersen: Department of Medical Image Computing, German Cancer Research Center, 69120 Heidelberg, Germany
- Beat Peter Müller-Stich: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
- George Mylonas: Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK; Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, London, SW7 2AZ, UK
- Felix Nickel: Department of General, Visceral and Transplant Surgery, Heidelberg University Hospital, 69120 Heidelberg, Germany
2. Niederhauser L, Gunser S, Waser M, Mast FW, Caversaccio M, Anschuetz L. Training and proficiency level in endoscopic sinus surgery change residents' eye movements. Sci Rep 2023; 13:79. [PMID: 36596830 PMCID: PMC9810736 DOI: 10.1038/s41598-022-25518-2]
Abstract
Nose surgery is challenging and requires extensive training for safe and efficient treatment. Eye tracking can provide an objective assessment of residents' learning curves. The aim of the current study was to assess residents' fixation duration and other dependent variables over the course of a dedicated training in functional endoscopic sinus surgery (FESS). Sixteen residents performed a FESS training over 18 sessions, split into three surgical steps. Eye movements, in terms of percent fixation on the screen and average fixation duration, were measured in addition to residents' completion time, cognitive load, and surgical performance. Results indicated performance improvements in terms of completion time and surgical performance. Cognitive load and average fixation duration showed a significant change within the last step of training. Percent fixation on screen increased within the first step and then stagnated. Results showed that eye movements and cognitive load differed between residents of different proficiency levels. In conclusion, eye tracking is a helpful objective measurement tool in FESS. It provides additional insight into training level and changes with increasing performance. Expert-like gaze was obtained after half of the training sessions, and increased proficiency in FESS was associated with increased fixation duration.
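The two gaze measures used in this study, average fixation duration and percent fixation on the screen, are simple aggregates over the fixation events an eye tracker emits. A minimal illustrative sketch, assuming a made-up representation of fixations as (duration in ms, on-screen flag) tuples and toy data:

```python
# Hypothetical helper (not from the paper): compute average fixation
# duration and percent of fixation time spent on the screen from a
# list of (duration_ms, on_screen) fixation events.
def fixation_measures(fixations):
    total = sum(d for d, _ in fixations)          # total fixation time
    on_screen = sum(d for d, on in fixations if on)
    avg_duration = total / len(fixations)         # mean fixation duration (ms)
    pct_on_screen = 100.0 * on_screen / total     # share of time on the screen
    return avg_duration, pct_on_screen

avg, pct = fixation_measures([(250, True), (180, True), (90, False), (320, True)])
print(avg, round(pct, 2))  # 210.0 89.29
```

Real analyses would first segment raw gaze samples into fixations (e.g. with a dispersion- or velocity-based filter), which is omitted here.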
Affiliation(s)
- Laura Niederhauser: Department of Psychology, University of Bern, Bern, Switzerland
- Sandra Gunser: Department of Otorhinolaryngology, Head and Neck Surgery, Inselspital, University Hospital, Freiburgstrasse 18, University of Bern, 3010 Bern, Switzerland
- Manuel Waser: Department of Otorhinolaryngology, Head and Neck Surgery, Inselspital, University Hospital, Freiburgstrasse 18, University of Bern, 3010 Bern, Switzerland
- Fred W. Mast: Department of Psychology, University of Bern, Bern, Switzerland
- Marco Caversaccio: Department of Otorhinolaryngology, Head and Neck Surgery, Inselspital, University Hospital, Freiburgstrasse 18, University of Bern, 3010 Bern, Switzerland
- Lukas Anschuetz: Department of Otorhinolaryngology, Head and Neck Surgery, Inselspital, University Hospital, Freiburgstrasse 18, University of Bern, 3010 Bern, Switzerland
3. EyeT4Empathy: Dataset of foraging for visual information, gaze typing and empathy assessment. Sci Data 2022; 9:752. [PMID: 36463232 PMCID: PMC9719458 DOI: 10.1038/s41597-022-01862-w]
Abstract
We present a dataset of eye-movement recordings collected from 60 participants, along with their levels of empathy towards people with movement impairments. During each round of gaze recording, participants were divided into two groups, each completing one task. One group performed a task of free exploration of structureless images, and the second group performed a gaze-typing task, i.e. writing sentences using eye-gaze movements on a cardboard. The eye-tracking data recorded from both tasks are stored in two datasets which, besides gaze position, also include pupil diameter measurements. The empathy levels of participants towards non-verbal movement-impaired people were assessed twice through a questionnaire, before and after each task. The questionnaire comprises forty questions, extending an established questionnaire of cognitive and affective empathy. Finally, our dataset presents an opportunity for analysing and evaluating, among others, the statistical features of eye-gaze trajectories in free viewing, as well as how empathy is reflected in eye features.
4. Eye Tracking Use in Surgical Research: A Systematic Review. J Surg Res 2022; 279:774-787. [PMID: 35944332 DOI: 10.1016/j.jss.2022.05.024]
Abstract
INTRODUCTION Eye tracking (ET) is a popular tool to study what factors affect the visual behaviour of surgical team members. To our knowledge, there have been no reviews to date that evaluate the broad use of ET in surgical research. This review aims to identify and assess the quality of this evidence, to synthesize how ET can be used to inform surgical practice, and to provide recommendations to improve future ET surgical studies. METHODS In line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic literature review was conducted. An electronic search was performed in MEDLINE, Cochrane Central, Embase, and Web of Science databases up to September 2020. Included studies used ET to measure the visual behaviour of members of the surgical team during surgery or surgical tasks. The included studies were assessed by two independent reviewers. RESULTS A total of 7614 studies were identified, and 111 were included for data extraction. Eleven applications were identified; the four most common were skill assessment (41%), visual attention assessment (22%), workload measurement (17%), and skills training (10%). A summary was provided of the various ways ET could be used to inform surgical practice, and three areas were identified for the improvement of future ET studies in surgery. CONCLUSIONS This review provided a comprehensive summary of the various applications of ET in surgery and how ET could be used to inform surgical practice, including how to use ET to improve surgical education. The information provided in this review can also aid in the design and conduct of future ET surgical studies.
5. Heinrich F, Huettl F, Schmidt G, Paschold M, Kneist W, Huber T, Hansen C. HoloPointer: a virtual augmented reality pointer for laparoscopic surgery training. Int J Comput Assist Radiol Surg 2021; 16:161-168. [PMID: 33095424 PMCID: PMC7822765 DOI: 10.1007/s11548-020-02272-2]
Abstract
PURPOSE In laparoscopic surgery training, experts guide novice physicians to desired instrument positions or indicate relevant areas of interest. These instructions are usually given via verbal communication or using physical pointing devices. To facilitate a sterile workflow and to improve training, new guiding methods are needed. This work proposes to use optical see-through augmented reality to visualize an interactive virtual pointer on the laparoscopic screen. METHODS After an interdisciplinary development process, the pointer's applicability and feasibility for training were evaluated, and it was compared to a standard condition based on verbal and gestural communication only. In this study, ten surgical trainees were guided by an experienced trainer during cholecystectomies on a laparoscopic training simulator. All trainees completed a virtual cholecystectomy with and without the interactive virtual pointer, in alternating order. Measures included procedure time, economy of movement and error rates. RESULTS Results of standardized variables revealed significantly improved economy of movement (p = 0.047) and error rates (p = 0.047), as well as overall improved user performance (total z-score; p = 0.031) in conditions using the proposed method. CONCLUSION The proposed HoloPointer is a feasible and applicable tool for laparoscopic surgery training. It improved objective performance metrics without prolonging task completion time in this pre-clinical setup.
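A composite "total z-score" like the one reported here is a common way to combine heterogeneous performance metrics on one scale. The paper does not spell out its exact standardization, so the following is only an assumed sketch: standardize each metric across participants, flip the sign of metrics where lower raw values mean better performance, then average per participant.

```python
from statistics import mean, stdev

def z_scores(values):
    # Standardize one metric across participants (sample standard deviation).
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def total_z(metric_columns, lower_is_better):
    # metric_columns: one list per metric, each with one value per participant.
    # lower_is_better: flags marking metrics where a smaller raw value is better
    # (e.g. error count, path length), so their z-scores are sign-flipped.
    standardized = []
    for col, flip in zip(metric_columns, lower_is_better):
        zs = z_scores(col)
        standardized.append([-z if flip else z for z in zs])
    # Average the standardized metrics for each participant.
    return [mean(vals) for vals in zip(*standardized)]

# Toy data: 3 participants, errors and procedure time (both lower-is-better).
print(total_z([[1, 2, 3], [10, 20, 30]], [True, True]))  # [1.0, 0.0, -1.0]
```

With this convention a higher total z-score means better overall performance, which matches the direction of the reported result.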
Affiliation(s)
- Florian Heinrich: Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany; Research Campus STIMULATE, Magdeburg, Germany
- Florentine Huettl: Department of General, Visceral and Transplant Surgery, University Medicine of the Johannes Gutenberg University Mainz, Mainz, Germany
- Gerd Schmidt: Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany; Research Campus STIMULATE, Magdeburg, Germany
- Markus Paschold: Department of General, Visceral and Transplant Surgery, University Medicine of the Johannes Gutenberg University Mainz, Mainz, Germany; Department of Surgery, Hospital St. Marienwörth, Bad Kreuznach, Germany
- Werner Kneist: Department of General, Visceral and Transplant Surgery, University Medicine of the Johannes Gutenberg University Mainz, Mainz, Germany; Department of General and Visceral Surgery, St. George Clinic Eisenach, Eisenach, Germany
- Tobias Huber: Department of General, Visceral and Transplant Surgery, University Medicine of the Johannes Gutenberg University Mainz, Mainz, Germany
- Christian Hansen: Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany; Research Campus STIMULATE, Magdeburg, Germany