1
Bui DT, Barnett T, Hoang H, Chinthammit W. Usability of Augmented Reality Technology in Situational Telementorship for Managing Clinical Scenarios: Quasi-Experimental Study. JMIR Medical Education 2023; 9:e47228. [PMID: 37782533 PMCID: PMC10580139 DOI: 10.2196/47228] [Received: 03/14/2023] [Revised: 07/24/2023] [Accepted: 08/10/2023] [Indexed: 10/03/2023]
Abstract
BACKGROUND Telementorship provides a way to maintain the professional skills of isolated rural health care workers. The incorporation of augmented reality (AR) technology into telementoring systems could allow health care professionals to be mentored remotely across different clinical situations. OBJECTIVE This study aims to evaluate the usability of AR technology in telementorship for managing clinical scenarios in a simulation laboratory. METHODS This study used a quasi-experimental design. Experienced health professionals and novice health practitioners were recruited for the roles of mentors and mentees, respectively, and then trained in the use of the AR setup. In the experiment, each mentee wearing an AR headset was asked to respond to 4 different clinical scenarios: acute coronary syndrome (ACS), acute myocardial infarction (AMI), pneumonia with a severe reaction to antibiotics (PSRA), and hypoglycemic emergency (HE). Their mentor used a laptop to provide remote guidance, following the treatment protocols developed for each scenario. Rating scales were used to measure the usability of the AR system, mentorship effectiveness, and mentees' self-confidence and skill performance. RESULTS A total of 4 mentors and 15 mentees participated in this study. Mentors and mentees were positive about using the AR technology, despite some technical issues and the time required to become familiar with the technology. The experience of telementorship was rated highly (mean 4.8, SD 0.414 for mentees and mean 4.25, SD 0.5 for mentors on the 5-point Likert scale). Mentees' confidence in managing each of the 4 scenarios improved after telementoring (P=.001 for the ACS, AMI, and PSRA scenarios and P=.002 for the HE scenario). Mentees' individual skill performance rates ranged from 97% in the AMI, PSRA, and HE scenarios to 98% in the ACS scenario. CONCLUSIONS This study provides evidence about the usability of AR technology in telementorship for managing clinical scenarios.
The findings suggest the potential for this technology to be used to support health workers in real-world clinical environments and point to new directions of research.
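The pre/post confidence comparisons reported above are typical paired analyses of ordinal Likert ratings. As a minimal illustration of how such a comparison can be run, the sketch below applies a Wilcoxon signed-rank test to invented pre/post ratings for 15 mentees; the numbers are hypothetical, not the study's data, and SciPy is assumed to be available.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical pre/post telementoring confidence ratings (1-5 Likert scale)
# for 15 mentees; invented for illustration only.
pre  = np.array([2, 3, 2, 3, 2, 3, 3, 2, 3, 2, 3, 3, 2, 3, 2])
post = np.array([4, 5, 4, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 4])

# Paired, non-parametric test suited to ordinal ratings.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.5f}")
```

A Wilcoxon signed-rank test is chosen here over a paired t-test because Likert ratings are ordinal rather than interval-scaled.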
Affiliation(s)
- Dung T Bui: Centre for Rural Health, School of Health Sciences, College of Health and Medicine, University of Tasmania, Launceston, Australia
- Tony Barnett: Centre for Rural Health, School of Health Sciences, College of Health and Medicine, University of Tasmania, Launceston, Australia
- Ha Hoang: Centre for Rural Health, School of Health Sciences, College of Health and Medicine, University of Tasmania, Launceston, Australia
- Winyu Chinthammit: Human Interface Technology Laboratory, School of Information and Communication Technology, College of Sciences and Engineering, University of Tasmania, Launceston, Australia
2
Kim JT, Cha YH, Yoo JI, Park CH. Touchless Control of Picture Archiving and Communication System in Operating Room Environment: A Comparative Study of Input Methods. Clin Orthop Surg 2021; 13:436-446. [PMID: 34484637 PMCID: PMC8380534 DOI: 10.4055/cios20004] [Received: 01/08/2020] [Revised: 01/12/2021] [Accepted: 01/12/2021] [Indexed: 11/06/2022]
Abstract
Background Touchless input devices could help maximize the potential of computer information technology in operating rooms. Three methods of controlling a picture archiving and communication system (PACS) were compared: a touchless input device (LMC-GW), relaying verbal guidance to another person controlling a mouse, and directly controlling a mouse. Methods Participants (n = 34; mean age, 29.6 years) were prospectively enrolled and given 9 scenarios to compare the 3 methods. Each scenario consisted of 8 tasks, which required 6 essential functions of PACS. Time elapsed and measurement values were recorded for objective evaluation, while subjective evaluation was conducted with a questionnaire. Results In all 8 tasks, manipulation using the mouse took significantly less time than the other methods (all p < 0.05). Study selection, panning, zooming, scrolling, distance measuring, and leg length measurement took significantly less time with the LMC-GW than with relaying to another person (all p < 0.01), whereas there were no significant differences in time required for measuring angles and windowing. Although the touchless input device provided higher accessibility and lower contamination risk, it was more difficult to handle than the other input methods (all p < 0.01). Conclusions The touchless input device provided superior or equal performance to verbal instruction in the operating room environment. Surgeons agreed that the device would be helpful for manipulating PACS in operating rooms, with less contamination risk and disturbance of workflow. The touchless input device can become an alternative to direct mouse manipulation in operating rooms in the future.
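The three input methods above were compared on the same participants, a repeated-measures design. A hedged sketch of one common analysis for such data, a Friedman test across the three conditions, is shown below; the task times are invented for illustration and SciPy is assumed.

```python
from scipy.stats import friedmanchisquare

# Invented task-completion times (seconds) for 6 participants performing the
# same task with each input method; not the study's data.
mouse  = [12.1, 10.4, 11.8, 13.0, 12.5, 11.1]
lmc_gw = [18.9, 17.2, 19.5, 20.1, 18.4, 17.9]
verbal = [25.3, 24.1, 26.8, 27.0, 25.9, 24.6]

# Friedman test: non-parametric repeated-measures comparison of >= 3 conditions.
stat, p = friedmanchisquare(mouse, lmc_gw, verbal)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```

A significant Friedman result would typically be followed by pairwise post hoc comparisons (e.g., Wilcoxon tests with correction), which is where per-task contrasts like "mouse vs LMC-GW" come from.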
Affiliation(s)
- Jung-Taek Kim: Department of Orthopedic Surgery, Ajou University School of Medicine, Ajou Medical Center, Suwon, Korea
- Yong-Han Cha: Department of Orthopedic Surgery, Eulji University Hospital, Daejeon, Korea
- Jun-Il Yoo: Department of Orthopaedic Surgery, Gyeongsang National University Hospital, Jinju, Korea
- Chan-Ho Park: Department of Orthopedic Surgery, Yeungnam University Hospital, Daegu, Korea
3
Feng Y, Uchidiuno UA, Zahiri HR, George I, Park AE, Mentis H. Comparison of Kinect and Leap Motion for Intraoperative Image Interaction. Surg Innov 2020; 28:33-40. [PMID: 32812838 DOI: 10.1177/1553350620947206] [Indexed: 11/15/2022]
Abstract
Background. Touchless interaction devices have increasingly garnered attention for intraoperative image interaction, but there are limited recommendations on which touchless interaction mechanisms should be implemented in the operating room. The objective of this study was to evaluate the efficiency, accuracy, and satisfaction of 2 current touchless interaction mechanisms, hand motion and body motion, for intraoperative image interaction. Methods. We used the TedCas plugin for the ClearCanvas DICOM viewer to display and manipulate CT images. Ten surgeons performed 5 image interaction tasks (step-through, pan, zoom, circle measure, and line measure) on 3 input devices: the Microsoft Kinect, the Leap Motion, and a mouse. Results. The Kinect had accuracy similar to the Leap Motion for most of the tasks but an increased error rate in the step-through task. The Leap Motion led to shorter task completion times than the Kinect and was preferred by the surgeons, especially for the measure tasks. Discussion. Our study suggests that hand-tracking devices, such as the Leap Motion, should be used for intraoperative image manipulation tasks that require high precision.
Affiliation(s)
- Yuanyuan Feng: University of Maryland Baltimore County, Baltimore, MD, USA
- Ivan George: Anne Arundel Medical Center, Baltimore, MD, USA
- Helena Mentis: University of Maryland Baltimore County, Baltimore, MD, USA
4
Homayoon B, Chung J, Gandhi RT, Liu DM. Early clinical experience with a touchless image navigation interface for the endovascular suite. Minim Invasiv Ther 2019; 29:146-153. [PMID: 31066595 DOI: 10.1080/13645706.2019.1612440] [Indexed: 10/26/2022]
Abstract
Introduction: The lack of comprehensive access in the sterile field to all pertinent imaging data related to an endovascular procedure is an unmet need, and its absence may have deleterious effects on decision-making, outcomes, and workflow. Current image navigation solutions rely on traditional personal computing interfaces, which are difficult to use in the sterile field. Innovative technological solutions are needed to address this gap. Material and methods: Using novel hardware and software integration, a human-computer interaction (HCI) based platform was developed through an iterative design and development process that allows intuitive real-time access to imaging data in the sterile field. Following validation and preclinical testing, the platform was introduced to the endovascular suite for clinical use. Results: Three prospective case-based observational reviews are presented that demonstrate the utility of touchless image navigation in the sterile field in facilitating decision-making and resource utilization during endovascular procedures, while avoiding the cognitive and workflow disturbances inherent in leaving the sterile field or involving non-scrubbed third persons in the image navigation process. Conclusion: Physician engagement and needs-based technological innovation are required to improve human-computer interaction in the endovascular suite, in hopes of positively affecting procedural decision-making, outcomes, and workflow.
Affiliation(s)
- Behrang Homayoon: Department of Radiology, Surrey Memorial Hospital, Surrey, Canada
- John Chung: Department of Radiology, Vancouver General Hospital, University of British Columbia, Vancouver, Canada
- Ripal T Gandhi: Miami Vascular Specialists, Miami Cardiac and Vascular Institute, Miami, FL, USA
- David M Liu: Department of Radiology, Vancouver General Hospital, University of British Columbia, Vancouver, Canada
5
Experimental Assessment of a Novel Touchless Interface for Intraprocedural Imaging Review. Cardiovasc Intervent Radiol 2019; 42:1192-1198. [PMID: 31044296 DOI: 10.1007/s00270-019-02207-8] [Received: 01/09/2019] [Accepted: 03/14/2019] [Indexed: 10/26/2022]
Abstract
PURPOSE To examine the feasibility of a novel technology platform that enables real-time touchless interaction with radiology images in both a simulated and an actual clinical setting. MATERIALS AND METHODS This platform offers three different modes for image interaction. The gesture recognition mode uses a depth camera to detect the user's hand gestures, which are translated into image manipulation commands. The light projection mode uses the same camera to detect finger point-and-tap movements above icons projected on a surface to activate the commands. The capacitive sensing mode is enabled by a handheld, portable device whose capacitive sensors detect finger movements to control the image review. Following initial feedback, the light projection and capacitive sensing modes were selected for further testing, in which they were compared with the conventional mode of image interaction in time trials for a series of standardized image manipulation tasks. Finally, the usability of the technology platform was examined in actual clinical procedures. RESULTS In the time trials, the light projection and capacitive sensing modes exhibited 60% and 71% reductions in time, respectively, relative to the control mode (p < 0.001). Clinical feasibility for this platform was demonstrated in three actual interventional radiology cases. CONCLUSION Accessing, navigating, and extracting relevant information from patient images intraprocedurally are cumbersome and time-consuming tasks that affect safety, efficiency, and decision-making during image-guided procedures. This study demonstrated that the novel technology addressed this issue by allowing touchless interaction with these images in the sterile field.
6
Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Use of Commercial Off-The-Shelf Devices for the Detection of Manual Gestures in Surgery: Systematic Literature Review. J Med Internet Res 2019; 21:e11925. [PMID: 31066679 PMCID: PMC6533048 DOI: 10.2196/11925] [Received: 08/12/2018] [Revised: 01/04/2019] [Accepted: 01/25/2019] [Indexed: 01/08/2023]
Abstract
Background The increasingly pervasive presence of technology in the operating room raises the need to study the interaction between the surgeon and computer system. A new generation of tools known as commercial off-the-shelf (COTS) devices enabling touchless gesture–based human-computer interaction is currently being explored as a solution in surgical environments. Objective The aim of this systematic literature review was to provide an account of the state of the art of COTS devices in the detection of manual gestures in surgery and to identify their use as a simulation tool for motor skills teaching in minimally invasive surgery (MIS). Methods For this systematic literature review, a search was conducted in PubMed, Excerpta Medica dataBASE, ScienceDirect, Espacenet, OpenGrey, and the Institute of Electrical and Electronics Engineers databases. Articles published between January 2000 and December 2017 on the use of COTS devices for gesture detection in surgical environments and in simulation for surgical skills learning in MIS were evaluated and selected. Results A total of 3180 studies were identified, 86 of which met the search selection criteria. Microsoft Kinect (Microsoft Corp) and the Leap Motion Controller (Leap Motion Inc) were the most widely used COTS devices. The most common intervention was image manipulation in surgical and interventional radiology environments, followed by interaction with virtual reality environments for educational or interventional purposes. The possibility of using this technology to develop portable low-cost simulators for skills learning in MIS was also examined. 
As most of the articles identified in this systematic review were proof-of-concept, prototype user testing, or feasibility testing studies, we concluded that the field is still in the exploratory phase in areas requiring touchless manipulation within environments and settings that must adhere to asepsis and antisepsis protocols, such as angiography suites and operating rooms. Conclusions COTS devices applied to hand and instrument gesture-based interfaces in the field of simulation for skills learning and training in MIS could open up a promising field to achieve ubiquitous training and presurgical warm-up.
Affiliation(s)
- Fernando Alvarez-Lopez: Faculty of Health Sciences, Universitat Oberta de Catalunya, Barcelona, Spain; Faculty of Health Sciences, Universidad de Manizales, Caldas, Colombia
- Marcelo Fabián Maina: Faculty of Psychology and Education Sciences, Universitat Oberta de Catalunya, Barcelona, Spain
7
Jurewicz KA, Neyens DM, Catchpole K, Reeves ST. Developing a 3D Gestural Interface for Anesthesia-Related Human-Computer Interaction Tasks Using Both Experts and Novices. Human Factors 2018; 60:992-1007. [PMID: 29906400 DOI: 10.1177/0018720818780544] [Indexed: 06/08/2023]
Abstract
OBJECTIVE The purpose of this research was to compare gesture-function mappings for experts and novices using a 3D, vision-based, gestural input system when exposed to the same context of anesthesia tasks in the operating room (OR). BACKGROUND 3D, vision-based, gestural input systems can serve as a natural way to interact with computers and are potentially useful in sterile environments (e.g., ORs) to limit the spread of bacteria. Anesthesia providers' hands have been linked to bacterial transfer in the OR, but a gestural input system for anesthetic tasks has not been investigated. METHODS A repeated-measures study was conducted with two cohorts: anesthesia providers (i.e., experts) (N = 16) and students (i.e., novices) (N = 30). Participants chose gestures for 10 anesthetic functions across three blocks to determine intuitive gesture-function mappings. Reaction time was collected as a complementary measure for understanding the mappings. RESULTS The two gesture-function mapping sets showed some similarities and differences. The gesture mappings of the anesthesia providers showed a relationship to physical components in the anesthesia environment that was not seen in the students' gestures. The students also exhibited longer reaction times than the anesthesia providers. CONCLUSION Domain expertise is influential when creating gesture-function mappings. However, both experts and novices should be able to use a gesture system intuitively, so development methods need to be refined to consider the needs of different user groups. APPLICATION The development of a touchless interface for perioperative anesthesia may reduce bacterial contamination and eventually offer a reduced risk of infection to patients.
8
Bachmann D, Weichert F, Rinkenauer G. Review of Three-Dimensional Human-Computer Interaction with Focus on the Leap Motion Controller. Sensors (Basel, Switzerland) 2018; 18:E2194. [PMID: 29986517 PMCID: PMC6068627 DOI: 10.3390/s18072194] [Received: 05/31/2018] [Revised: 06/30/2018] [Accepted: 07/02/2018] [Indexed: 11/16/2022]
Abstract
Modern hardware and software development has led to an evolution of user interfaces from command-line to natural user interfaces for virtual immersive environments. Gestures imitating real-world interaction tasks increasingly replace classical two-dimensional interfaces based on Windows/Icons/Menus/Pointers (WIMP) or touch metaphors. The purpose of this paper is therefore to survey state-of-the-art Human-Computer Interaction (HCI) techniques with a focus on the special field of three-dimensional interaction. This includes an overview of currently available interaction devices, their areas of application, and the underlying methods for gesture design and recognition. The focus is on interfaces based on the Leap Motion Controller (LMC) and corresponding methods of gesture design and recognition. Further, a review of evaluation methods for the proposed natural user interfaces is given.
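Gesture recognition of the kind surveyed here often reduces to matching a tracked point path against stored templates. The sketch below is a minimal template-matching classifier in the spirit of resample-and-compare recognizers (it is not code from the review or the LMC SDK): paths are resampled to a fixed number of points, normalized for position and scale, and matched to the nearest template by mean point-to-point distance.

```python
import numpy as np

def resample(path, n=32):
    """Resample a 2D point path to n points equally spaced along its arc length."""
    path = np.asarray(path, float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    d = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, d[-1], n)
    return np.column_stack([np.interp(t, d, path[:, 0]),
                            np.interp(t, d, path[:, 1])])

def normalize(path):
    """Center on the centroid and scale by the largest bounding-box side."""
    p = resample(path)
    p -= p.mean(axis=0)
    s = np.ptp(p, axis=0).max()
    return p / s if s else p

def classify(path, templates):
    """Return the label of the template with the smallest mean point distance."""
    q = normalize(path)
    return min(templates,
               key=lambda name: np.linalg.norm(normalize(templates[name]) - q,
                                               axis=1).mean())

# Hypothetical gesture templates; a near-horizontal query should match "swipe".
templates = {"swipe": [(0.0, 0.0), (1.0, 0.0)],
             "vee":   [(0.0, 1.0), (0.5, 0.0), (1.0, 1.0)]}
print(classify([(0.0, 0.01), (0.5, 0.0), (1.0, 0.02)], templates))
```

Production recognizers add rotation invariance, temporal segmentation, and rejection thresholds, but the resample-normalize-match core is the same.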
Affiliation(s)
- Daniel Bachmann: Department of Computer Science VII, TU Dortmund University, 44221 Dortmund, Germany
- Frank Weichert: Department of Computer Science VII, TU Dortmund University, 44221 Dortmund, Germany
- Gerhard Rinkenauer: Leibniz Research Centre for Working Environment and Human Factors, 44139 Dortmund, Germany
9
Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction. Int J Comput Assist Radiol Surg 2017; 13:37-45. [PMID: 29079993 DOI: 10.1007/s11548-017-1677-3] [Received: 07/20/2017] [Accepted: 10/13/2017] [Indexed: 10/18/2022]
Abstract
PURPOSE The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. Auditory display provides feedback concerning the selected input into the eye-tracking system as well as a confirmation of the system response. METHODS An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures. RESULTS When using auditory display to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibited reduced reaction time compared to using visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings. CONCLUSION Due to the absence of tactile feedback for eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.
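Parameter-mapping sonification, as used in this system's auditory display, maps a continuous input value onto a sound parameter such as pitch. The following sketch is an illustrative implementation, not the paper's: a normalized parameter is mapped exponentially onto a frequency range and rendered as a short enveloped sine tone.

```python
import numpy as np

def sonify(value, duration=0.15, sr=44100, f_lo=220.0, f_hi=880.0):
    """Map a normalized parameter in [0, 1] to pitch and render a sine tone.

    Returns (frequency_hz, samples). The exponential mapping makes equal
    parameter steps sound like equal musical intervals."""
    f = f_lo * (f_hi / f_lo) ** float(np.clip(value, 0.0, 1.0))
    t = np.arange(int(duration * sr)) / sr
    env = np.hanning(t.size)  # fade in/out to avoid audible clicks
    return f, env * np.sin(2.0 * np.pi * f * t)

f, tone = sonify(0.5)
print(f"{f:.1f} Hz, {tone.size} samples")
```

Earcons, by contrast, are short fixed motifs associated with discrete events; a real system would combine both, as the paper describes.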
10
Sánchez-Margallo FM, Sánchez-Margallo JA, Moyano-Cuevas JL, Pérez EM, Maestre J. Use of natural user interfaces for image navigation during laparoscopic surgery: initial experience. Minim Invasiv Ther 2017; 26:253-261. [PMID: 28349758 DOI: 10.1080/13645706.2017.1304964] [Indexed: 10/19/2022]
Abstract
BACKGROUND Surgical environments require special aseptic conditions for direct interaction with preoperative images. We aimed to test the feasibility of using a set of gesture control sensors combined with voice control to interact in a sterile manner with preoperative information and an integrated operating room (OR) during laparoscopic surgery. MATERIAL AND METHODS Two hepatectomies and two partial nephrectomies were performed by three experienced surgeons in a porcine model. The Kinect, Leap Motion, and MYO armband in combination with voice control were used as natural user interfaces (NUIs). After surgery, surgeons completed a questionnaire about their experience. RESULTS Surgeons required <10 min of training with each NUI. They stated that the NUIs improved access to preoperative patient information and kept them more focused on the surgical site. The Kinect system was reported as the most physically demanding NUI, and the MYO armband in combination with voice commands as the most intuitive and accurate. The need to release one of the laparoscopic instruments in order to use the NUIs was identified as the main limitation. CONCLUSIONS The presented NUIs make it feasible to interact directly, and in a more intuitive and sterile manner, with preoperative images and integrated OR functionalities during laparoscopic surgery.
Affiliation(s)
- Juan A Sánchez-Margallo: Bioengineering and Health Technologies Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
- José L Moyano-Cuevas: Bioengineering and Health Technologies Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
- Eva María Pérez: Department of Surgery, University of Extremadura, Cáceres, Spain
- Juan Maestre: General Surgery Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
11
Comparison of gesture and conventional interaction techniques for interventional neuroradiology. Int J Comput Assist Radiol Surg 2017; 12:1643-1653. [DOI: 10.1007/s11548-017-1523-7] [Received: 09/08/2016] [Accepted: 01/06/2017] [Indexed: 01/09/2023]
12
Mewes A, Hensen B, Wacker F, Hansen C. Touchless interaction with software in interventional radiology and surgery: a systematic literature review. Int J Comput Assist Radiol Surg 2016; 12:291-305. [PMID: 27647327 DOI: 10.1007/s11548-016-1480-6] [Received: 03/14/2016] [Accepted: 08/31/2016] [Indexed: 11/25/2022]
Abstract
PURPOSE In this article, we systematically examine the current state of research on systems that focus on touchless human-computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and underline promising technologies for future development. METHODS A systematic literature search was performed for scientific papers that deal with touchless control of medical software in the immediate environment of the operating room and interventional radiology suite. This includes methods for touchless gesture interaction, voice control, and eye tracking. RESULTS Fifty-five research papers were identified and analyzed in detail, including 33 journal publications. Most of the identified literature (62%) deals with the control of medical image viewers. The others present interaction techniques for laparoscopic assistance (13%), telerobotic assistance and operating room control (9% each), and robotic operating room assistance and intraoperative registration (3.5% each). Only 8 systems (14.5%) were tested in a real clinical environment, and 7 (12.7%) were not evaluated at all. CONCLUSION In the last 10 years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to cope with the current limitations of touchless software interfaces in clinical environments. The main challenges for future research are the improvement and evaluation of the usability and intuitiveness of touchless human-computer interaction, full integration into productive systems, the reduction of necessary interaction steps, and further development of hands-free interaction.
Affiliation(s)
- André Mewes: Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany
- Bennet Hensen: Institute for Diagnostic and Interventional Radiology, Medical School Hanover, Hanover, Germany
- Frank Wacker: Institute for Diagnostic and Interventional Radiology, Medical School Hanover, Hanover, Germany
- Christian Hansen: Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany
13
Di Tommaso L, Aubry S, Godard J, Katranji H, Pauchot J. [A new human machine interface in neurosurgery: The Leap Motion®. Technical note regarding a new touchless interface]. Neurochirurgie 2016; 62:178-81. [PMID: 27234915 DOI: 10.1016/j.neuchi.2016.01.006] [Received: 05/03/2015] [Revised: 12/30/2015] [Accepted: 01/26/2016] [Indexed: 11/25/2022]
Abstract
Currently, cross-sectional imaging viewing is used in routine practice, but interacting with it during a surgical procedure requires physical contact with an interface (mouse or touch-sensitive screen). This contact carries a risk of breaching asepsis and causes loss of time. The recent appearance of devices such as the Leap Motion® (Leap Motion Inc, San Francisco, USA), a sensor that enables interaction with the computer without any physical contact, is of major interest in the field of surgery. However, its configuration and ergonomics pose key challenges in adapting it to the practitioner's requirements, the imaging software, and the surgical environment. This article suggests an easy configuration of the Leap Motion® in neurosurgery on a PC for optimized use with Carestream® Vue PACS v11.3.4 (Carestream Health, Inc., Rochester, USA) using a plug-in (to download at: https://drive.google.com/?usp=chrome_app#folders/0B_F4eBeBQc3ybElEeEhqME5DQkU) and a video tutorial (https://www.youtube.com/watch?v=yVPTgxg-SIk).
Affiliation(s)
- L Di Tommaso: Service de neurochirurgie, CHU Jean-Minjoz, 25030 Besançon, France
- S Aubry: Service d'imagerie musculo-squelettique, CHU Jean-Minjoz, 25030 Besançon, France; EA 4268I4S IFR 133 Inserm, unité de recherche, 25030 Besançon, France; Université de Franche-Comté, 25000 Besançon, France
- J Godard: Service de neurochirurgie, CHU Jean-Minjoz, 25030 Besançon, France
- H Katranji: Service de neurochirurgie, CHU Jean-Minjoz, 25030 Besançon, France
- J Pauchot: EA 4268I4S IFR 133 Inserm, unité de recherche, 25030 Besançon, France; Université de Franche-Comté, 25000 Besançon, France; Service de chirurgie orthopédique, traumatologique, plastique, esthétique, reconstructrice et assistance main, CHU Jean-Minjoz, 25030 Besançon, France
14
Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Natural User Interfaces: Is It a Solution to Accomplish Ubiquitous Training in Minimally Invasive Surgery? Surg Innov 2016; 23:429-30. [PMID: 27009688 DOI: 10.1177/1553350616639145] [Indexed: 11/16/2022]
Affiliation(s)
- Fernando Alvarez-Lopez: Universitat Oberta de Catalunya, Barcelona, Spain; Universidad de Manizales, Manizales, Colombia
15
Li Y, Zhao Y, Zhang J, Zhang Z, Dong G, Wang Q, Liu L, Yu X, Xu B, Chen X. Low-Cost Interactive Image-Based Virtual Endoscopy for the Diagnosis and Surgical Planning of Suprasellar Arachnoid Cysts. World Neurosurg 2015; 88:76-82. [PMID: 26732948 DOI: 10.1016/j.wneu.2015.12.038] [Received: 08/07/2015] [Accepted: 12/07/2015] [Indexed: 11/28/2022]
Abstract
OBJECTIVE To investigate the feasibility and reliability of virtual endoscopy (VE) as a rapid, low-cost, and interactive tool for the diagnosis and surgical planning of suprasellar arachnoid cysts (SACs). METHODS Eighteen patients with SACs treated with endoscopic ventriculocystostomy were recruited, and 18 patients treated with endoscopic third ventriculostomy were randomly selected as a VE reconstruction control group. After their DICOM data were loaded into the free 3D Slicer software, VE reconstruction was independently performed by 3 blinded clinicians, and the time required for each reconstruction was recorded. Another 3 blinded senior neurosurgeons interactively graded the visibility of the VE by watching video recordings of the endoscopic procedures. Based on the visibility scores, receiver operating characteristic (ROC) curve analysis was used to investigate the reliability of VE for diagnosing SACs, and Bland-Altman plots were used to assess the reliability of VE for surgical planning. In addition, the intraclass correlation coefficient (ICC) was calculated to estimate the consistency among the results of the 3 reconstruction performers. RESULTS All 3 independent reconstruction performers successfully completed the VE simulation for all cases, and the average reconstruction time was 10.2 ± 9.7 minutes. The area under the ROC curve for the cyst visibility score was 0.96, implying its diagnostic value for SACs. The Bland-Altman plot indicated good agreement between the VE and intraoperative views, suggesting the anatomic accuracy of the VE for surgical planning. In addition, the ICC was 0.81, which revealed excellent interperformer consistency of our simulation method. CONCLUSIONS This study substantiated the feasibility and reliability of VE as a rapid, low-cost, and interactive modality for diagnosis and surgical planning of SACs.
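The interperformer consistency reported above (ICC = 0.81) is a standard intraclass correlation. As an illustration, the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measurement) from a subjects-by-raters score matrix using the usual mean-square decomposition; the formula follows Shrout and Fleiss, the example scores are invented, and the function name is our own.

```python
import numpy as np

def icc_2_1(X):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    X is an (n subjects x k raters) score matrix; mean squares follow the
    classic Shrout-Fleiss decomposition."""
    X = np.asarray(X, float)
    n, k = X.shape
    grand = X.mean()
    row_m = X.mean(axis=1)   # per-subject means
    col_m = X.mean(axis=0)   # per-rater means
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)   # between-subject MS
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)   # between-rater MS
    sse = ((X - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                    # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented visibility scores: 5 cases each rated by 3 performers (not study data).
scores = [[4, 4, 5], [2, 3, 2], [5, 5, 5], [3, 3, 4], [1, 2, 1]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```

ICC(2,1) is appropriate when each subject is rated by the same set of raters, treated as a random sample, and absolute agreement (not just consistency) matters.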
Affiliation(s)
- Ye Li
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China; School of Medicine, Nankai University, Tianjin, China; Surgical Planning Laboratory, Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts, USA
- Yining Zhao
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China; School of Medicine, Nankai University, Tianjin, China
- Jiashu Zhang
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Zhizhong Zhang
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Guojun Dong
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Qun Wang
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Lei Liu
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Xinguang Yu
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Bainan Xu
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China
- Xiaolei Chen
- Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China.
|
16
|
Partridge RW, Brown FS, Brennan PM, Hennessey IAM, Hughes MA. The LEAP™ Gesture Interface Device and Take-Home Laparoscopic Simulators: A Study of Construct and Concurrent Validity. Surg Innov 2015; 23:70-7. [PMID: 26178693 DOI: 10.1177/1553350615594734] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
AIM To assess the potential of the LEAP™ infrared motion tracking device to map laparoscopic instrument movement in a simulated environment. Simulator training is optimized when augmented by objective performance feedback. We explore the potential of LEAP to provide this in a way compatible with affordable take-home simulators. METHOD LEAP and the previously validated InsTrac visual tracking tool mapped expert and novice performances of a standardized simulated laparoscopic task. Ability to distinguish between the 2 groups (construct validity) and correlation between techniques (concurrent validity) were the primary outcome measures. RESULTS Forty-three expert and 38 novice performances demonstrated significant differences in LEAP-derived metrics for instrument path distance (P < .001), speed (P = .002), acceleration (P < .001), motion smoothness (P < .001), and distance between the instruments (P = .019). Only instrument path distance demonstrated a correlation between the LEAP and InsTrac tracking methods (novices: r = .663, P < .001; experts: r = .536, P < .001). Consistency of LEAP tracking was poor (hands not tracked for an average of 31.9% of task time). CONCLUSION The LEAP motion device is able to track the movement of hands using instruments in a laparoscopic box simulator. Construct validity is demonstrated by its ability to distinguish novice from expert performances. However, only time and instrument path distance demonstrated concurrent validity with an existing tracking method. A number of limitations to the tracking method used by LEAP have been identified. These need to be addressed before it can be considered an alternative to visual tracking for the delivery of objective performance metrics in take-home laparoscopic simulators.
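The kinematic metrics compared in this study (instrument path distance, speed, acceleration) are all derived from a stream of tracked 3D positions. A minimal sketch, assuming positions arrive as (x, y, z) tuples at a fixed sampling interval; the function and parameter names are illustrative, not from either tracking tool:

```python
import math

def path_metrics(samples, dt):
    """Compute path distance, mean speed, and mean absolute
    acceleration from (x, y, z) positions sampled every dt seconds."""
    # Euclidean distance covered between consecutive samples
    dists = [math.dist(a, b) for a, b in zip(samples, samples[1:])]
    path = sum(dists)
    # Finite-difference speed per interval, then acceleration between intervals
    speeds = [d / dt for d in dists]
    accels = [abs(v2 - v1) / dt for v1, v2 in zip(speeds, speeds[1:])]
    mean_speed = path / (dt * (len(samples) - 1))
    mean_accel = sum(accels) / len(accels) if accels else 0.0
    return path, mean_speed, mean_accel
```

Smoothness measures (not shown) typically extend the same finite-difference chain one step further, to jerk. The gaps the study reports (hands untracked 31.9% of the time) would appear here as missing samples, which is why dropout handling matters before such metrics are trustworthy.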
|
17
|
Pauchot J, Di Tommaso L, Lounis A, Benassarou M, Mathieu P, Bernot D, Aubry S. Leap Motion Gesture Control With Carestream Software in the Operating Room to Control Imaging: Installation Guide and Discussion. Surg Innov 2015; 22:615-20. [PMID: 26002115 DOI: 10.1177/1553350615587992] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Nowadays, routine cross-sectional imaging viewing during a surgical procedure requires physical contact with an interface (mouse or touch-sensitive screen). Such contact risks compromising aseptic conditions and wastes time. Devices such as the recently introduced Leap Motion (Leap Motion Society, San Francisco, CA), which enables interaction with the computer without any physical contact, are of wide interest in the field of surgery, but configuration and ergonomics are key challenges for the practitioner, imaging software, and surgical environment. This article aims to suggest an easy configuration of Leap Motion on a PC for optimized use with Carestream Vue PACS v11.3.4 (Carestream Health, Inc, Rochester, NY) using a plug-in (to download at https://drive.google.com/open?id=0B_F4eBeBQc3yNENvTXlnY09qS00&authuser=0) and a video tutorial (https://www.youtube.com/watch?v=yVPTgxg-SIk). Videos of the surgical procedure and a discussion of innovative gesture control technology and its various configurations are provided in this article.
Affiliation(s)
- Julien Pauchot
- Orthopedic, Traumatology, Aesthetic, Plastic, Reconstructive and Hand Surgery Unit, University Hospital of Besançon, Besançon, France
- Laetitia Di Tommaso
- Neurosurgery Department, University Hospital of Besançon, University of Franche-Comté, Besançon, France
- Ahmed Lounis
- Department of Musculoskeletal Imaging, University Hospital of Besançon, University of Franche-Comté, Besançon, France
- Mourad Benassarou
- MaxilloFacial and Stomatology Department, University Hospital of Besançon, University of Franche-Comté, Besançon, France
- Pierre Mathieu
- Liver Transplantation and Digestive Surgery Unit, University Hospital of Besançon, University of Franche-Comté, Besançon, France
- Dominique Bernot
- Informatics Department, University Hospital of Besançon, University of Franche-Comté, Besançon, France
- Sébastien Aubry
- Department of Musculoskeletal Imaging, University Hospital of Besançon, University of Franche-Comté, Besançon, France
|
18
|
Mewes A, Saalfeld P, Riabikin O, Skalej M, Hansen C. A gesture-controlled projection display for CT-guided interventions. Int J Comput Assist Radiol Surg 2015; 11:157-64. [DOI: 10.1007/s11548-015-1215-0] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2015] [Accepted: 04/21/2015] [Indexed: 11/24/2022]
|