1. Bian D, Lin Z, Lu H, Zhong Q, Wang K, Tang X, Zang J. The application of extended reality technology-assisted intraoperative navigation in orthopedic surgery. Front Surg 2024; 11:1336703. PMID: 38375409; PMCID: PMC10875025; DOI: 10.3389/fsurg.2024.1336703.
Abstract
Extended reality (XR) technology refers to any situation where real-world objects are enhanced with computer technology, including virtual reality, augmented reality, and mixed reality. Augmented reality and mixed reality technologies have been widely applied in orthopedic clinical practice, including in teaching, preoperative planning, intraoperative navigation, and surgical outcome evaluation. The primary goal of this narrative review is to summarize the effectiveness and superiority of XR-technology-assisted intraoperative navigation in the fields of trauma, joint, spine, and bone tumor surgery, as well as to discuss the current shortcomings in intraoperative navigation applications. We reviewed the titles of more than 200 studies obtained from PubMed with the following search terms: extended reality, mixed reality, augmented reality, virtual reality, intraoperative navigation, and orthopedic surgery; of those 200 studies, 69 related papers were selected for abstract review. Finally, the full text of 55 studies was analyzed and reviewed. They were classified into four groups according to their content: trauma, joint, spine, and bone tumor surgery. Most of the studies that we reviewed showed that XR-technology-assisted intraoperative navigation can effectively improve the accuracy of implant placement, such as that of screws and prostheses; reduce postoperative complications caused by inaccurate implantation; facilitate the achievement of tumor-free surgical margins; shorten the surgical duration; reduce radiation exposure for patients and surgeons; minimize further damage caused by the need for visual exposure during surgery; and provide richer and more efficient intraoperative communication, thereby facilitating academic exchange, medical assistance, and the implementation of remote healthcare.
Affiliation(s)
- Dongxiao Bian, Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
- Zhipeng Lin, State Key Laboratory of Virtual Reality Technology and Systems, Beihang University, Beijing, China
- Hao Lu, Traumatic Orthopedic Department, Peking University People’s Hospital, Beijing, China
- Qunjie Zhong, Arthritis Clinic and Research Center, Peking University People’s Hospital, Beijing, China
- Kaifeng Wang, Spinal Surgery Department, Peking University People’s Hospital, Beijing, China
- Xiaodong Tang, Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
- Jie Zang, Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
2. Shahbaz M, Miao H, Farhaj Z, Gong X, Weikai S, Dong W, Jun N, Shuwei L, Yu D. Mixed reality navigation training system for liver surgery based on a high-definition human cross-sectional anatomy data set. Cancer Med 2023; 12:7992-8004. PMID: 36607128; PMCID: PMC10134360; DOI: 10.1002/cam4.5583.
Abstract
OBJECTIVES This study aimed to use a three-dimensional (3D) mixed-reality model of the liver, encompassing its complex intrahepatic systems, to study the anatomical structures in depth and to support the training, diagnosis, and treatment of liver diseases. METHODS Vascular-perfused human specimens were processed by thin-layer frozen milling to obtain liver cross-sections. A 104-megapixel high-definition cross-sectional data set was established and registered to enable structure identification and manual segmentation. A digital model was reconstructed, and the data were used to print a 3D hepatic model. The model was combined with HoloLens mixed reality technology to convey the complex relationships of the intrahepatic systems. We simulated 3D patient-specific anatomy for identification and preoperative planning, conducted a questionnaire survey, and evaluated the results. RESULTS The 3D digital model and the 1:1 transparent, colored liver model faithfully reflected the intrahepatic vessels and their complex relationships. The reconstructed model imported into HoloLens could be accurately matched with the 3D model. Only 7.7% of participants could identify accessory hepatic veins. The depth and spatial relationships of intrahepatic structures were better understood by 92%. In addition, 100%, 84.6%, 69%, and 84% believed the 3D models were useful for planning, safer surgical paths, reducing intraoperative complications, and training young surgeons, respectively. CONCLUSIONS A detailed 3D model can be reconstructed from a high-quality cross-sectional anatomical data set. Combined with 3D printing and HoloLens technology, it creates a novel hybrid-reality navigation and training system for liver surgery. Based on the questionnaire evaluation, mixed reality training is a worthwhile way to provide 3D information to clinicians, with possible application in surgery: surgeons with extensive operative experience perceived that the technology might be useful in liver surgery and could help with precise preoperative planning, accurate intraoperative identification, and reduction of hepatic injury.
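To make the slice-stacking and segmentation steps described above concrete, here is a minimal, hypothetical Python sketch: it stacks synthetic 2D cross-sections into a voxel volume, applies a simple intensity threshold as a stand-in for the study's manual segmentation, and estimates the segmented volume. The array sizes, voxel spacing, and threshold are illustrative assumptions, not values from the data set described in the abstract.

```python
import numpy as np

# Hypothetical example: synthetic "cross-sections" stand in for the frozen-milling slices.
num_slices, height, width = 64, 128, 128
slice_thickness_mm, pixel_size_mm = 0.5, 0.25  # assumed spacings, not from the paper

# Stack the 2D slices into a single 3D voxel volume with axis order (z, y, x).
slices = [np.random.rand(height, width) for _ in range(num_slices)]
volume = np.stack(slices, axis=0)

# Crude stand-in for segmentation: label voxels above an intensity threshold.
vessel_mask = volume > 0.95

# Estimate the physical volume of the segmented structure from the voxel count.
voxel_volume_mm3 = slice_thickness_mm * pixel_size_mm ** 2
segmented_volume_mm3 = vessel_mask.sum() * voxel_volume_mm3
print(f"Segmented voxels: {vessel_mask.sum()}, approx. volume: {segmented_volume_mm3:.1f} mm^3")
```

In practice, each slice would come from the high-definition cross-sectional images, and the resulting mask would feed a surface-reconstruction step before 3D printing or HoloLens display.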
Affiliation(s)
- Muhammad Shahbaz, Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China; Research Center for Sectional and Imaging Anatomy, Digital Human Institute, School of Basic Medical Science, Shandong University, Jinan, Shandong, China; Department of General Surgery, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Huachun Miao, Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Zeeshan Farhaj, Department of Cardiovascular Surgery, Shandong Qianfoshan Hospital, Cheeloo College of Medicine, Shandong University, Jinan, Shandong, China
- Xin Gong, Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Sun Weikai, Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Wenqing Dong, Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Niu Jun, Department of General Surgery, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Liu Shuwei, Research Center for Sectional and Imaging Anatomy, Digital Human Institute, School of Basic Medical Science, Shandong University, Jinan, Shandong, China
- Dexin Yu, Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China
3. Arpaia P, De Benedetto E, De Paolis L, D’Errico G, Donato N, Duraccio L. Performance and Usability Evaluation of an Extended Reality Platform to Monitor Patient’s Health during Surgical Procedures. Sensors 2022; 22:3908. PMID: 35632317; PMCID: PMC9143436; DOI: 10.3390/s22103908.
Abstract
An extended-reality (XR) platform for real-time monitoring of patients’ health during surgical procedures is proposed. The proposed system provides real-time access to a comprehensive set of patient information, which is made promptly available to the surgical team in the operating room (OR). In particular, the XR platform supports the medical staff by automatically acquiring the patient’s vitals from the operating room instrumentation and displaying them in real time directly on an XR headset. Furthermore, information regarding the patient’s clinical record is also shown upon request. Finally, the XR-based monitoring platform allows the video stream coming directly from the endoscope to be displayed in XR. The innovative aspects of the proposed XR-based monitoring platform lie in the comprehensiveness of the available information, its modularity and flexibility (in terms of adaptation to different sources of data), its ease of use, and, most importantly, its reliable communication, which are critical requirements for the healthcare field. To validate the proposed system, experimental tests were conducted using instrumentation typically available in the operating room (i.e., a respiratory ventilator, a patient monitor for intensive care, and an endoscope). The overall results showed (i) a data communication accuracy greater than 99%, (ii) an average time response below ms, and (iii) satisfying feedback from the SUS questionnaires filled out by the physicians after intensive use.
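The platform described above is, at its core, a modular pipeline that acquires vitals from OR instrumentation and pushes them to an XR headset with low latency. As a rough illustration of that producer/consumer structure only, here is a minimal, hypothetical Python sketch; the vital-sign fields, update period, and function names are assumptions for illustration and are not taken from the paper, which does not disclose its implementation in this listing.

```python
import asyncio
import json
import random
import time

# Hypothetical sketch only: device fields, update rate, and function names are
# illustrative assumptions, not details from the paper.

async def read_vitals() -> dict:
    """Simulate polling a patient monitor for a snapshot of vital signs."""
    return {
        "timestamp": round(time.time(), 2),
        "heart_rate_bpm": random.randint(55, 110),
        "spo2_percent": random.randint(92, 100),
        "resp_rate_per_min": random.randint(10, 24),
    }

async def acquire_vitals(queue: asyncio.Queue, period_s: float = 1.0) -> None:
    """Producer: acquire vitals at a fixed period and enqueue them for display."""
    while True:
        await queue.put(await read_vitals())
        await asyncio.sleep(period_s)

async def headset_display(queue: asyncio.Queue, n_updates: int = 5) -> None:
    """Consumer: stand-in for the XR headset client rendering each update."""
    for _ in range(n_updates):
        vitals = await queue.get()
        print("HUD update:", json.dumps(vitals))

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    producer = asyncio.create_task(acquire_vitals(queue))
    await headset_display(queue)  # a real system would run indefinitely
    producer.cancel()

if __name__ == "__main__":
    asyncio.run(main())
```

A real deployment would replace the simulated reader with the monitor's data interface and the print call with the headset's rendering client, and would add the communication-reliability checks the authors emphasize.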
Affiliation(s)
- Pasquale Arpaia (corresponding author), Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy; Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy
- Egidio De Benedetto, Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy; Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy
- Lucio De Paolis, Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy
- Giovanni D’Errico, Department of Applied Science and Technology, Polytechnic University of Turin, 10129 Turin, Italy
- Nicola Donato, Department of Engineering, University of Messina, 98122 Messina, Italy
- Luigi Duraccio, Department of Electronics and Telecommunications, Polytechnic University of Turin, 10129 Turin, Italy
4. Wong KC, Sun YE, Kumta SM. Review and Future/Potential Application of Mixed Reality Technology in Orthopaedic Oncology. Orthop Res Rev 2022; 14:169-186. PMID: 35601186; PMCID: PMC9121991; DOI: 10.2147/orr.s360933.
Abstract
In orthopaedic oncology, surgical planning and intraoperative execution errors may result in positive tumor resection margins that increase the risk of local recurrence and adversely affect patients’ survival. Computer navigation and 3D-printed resection guides have been reported to address surgical inaccuracy by replicating the surgical plans in complex cases. However, limitations of computer navigation surgery include the surgeon’s attention shifting from the operative field to the navigation monitor and the expense of navigation facilities. Practical concerns include the lack of real-time visual feedback of preoperative images and the lead time in manufacturing 3D-printed objects. Mixed Reality (MR) is a technology that merges real and virtual worlds to produce new environments with enhanced visualizations, where physical and digital objects coexist and users can interact with both in real time. The unique MR features of enhanced medical image visualization and interaction with holograms give surgeons real-time, on-demand medical information and remote assistance in their immediate working environment. Early applications of MR technology have been reported in surgical procedures, but its role in orthopaedic oncology remains unclear. This review aims to provide orthopaedic tumor surgeons with up-to-date knowledge of the emerging MR technology. The paper presents its essential features and clinical workflow, reviews the current literature and potential clinical applications, and discusses the limitations and future development in orthopaedic oncology. The emerging MR technology adds a new dimension to digital assistive tools, offering a more accessible and less costly alternative in orthopaedic oncology. The MR head-mounted display and hands-free control may enable clinical point-of-care use inside or outside the operating room and improve service efficiency and patient safety. However, the lack of accurate hologram-to-patient matching, of an MR platform dedicated to orthopaedic oncology, and of clinical results may hinder its wide adoption. Industry-academic partnerships are essential to advance the technology, with its clinical role to be determined through future clinical studies.
Video abstract: https://youtu.be/t4hl_Anh_kM
Affiliation(s)
- Kwok Chuen Wong (corresponding author), Department of Orthopaedics and Traumatology, Prince of Wales Hospital, the Chinese University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China
- Yan Edgar Sun, New Territories, Hong Kong Special Administrative Region, People’s Republic of China
- Shekhar Madhukar Kumta, Department of Orthopaedics and Traumatology, Prince of Wales Hospital, the Chinese University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China
5. Ha J, Parekh P, Gamble D, Masters J, Jun P, Hester T, Daniels T, Halai M. Opportunities and challenges of using augmented reality and heads-up display in orthopaedic surgery: A narrative review. J Clin Orthop Trauma 2021; 18:209-215. PMID: 34026489; PMCID: PMC8131920; DOI: 10.1016/j.jcot.2021.04.031.
Abstract
BACKGROUND & AIM Utilization of augmented reality (AR) and heads-up displays (HUD) to aid orthopaedic surgery has the potential to benefit surgeons and patients alike through improved accuracy, safety, and education. With the COVID-19 pandemic, the opportunity for adoption of novel technology is more relevant than ever. The aims are to assess the technology available, to understand the current evidence regarding its benefit, and to consider challenges to implementation in clinical practice. METHODS & RESULTS PRISMA guidelines were used to filter the literature. Of 1004 articles returned, the following exclusion criteria were applied: (1) reviews/commentaries, (2) studies unrelated to orthopaedic surgery, and (3) use of AR wearables other than visual aids, leaving 42 papers for review. This review illustrates benefits including enhanced accuracy, reduced operating time, reduced radiation exposure, and educational value. CONCLUSION Whilst there are obstacles to overcome, there are already reports of the technology being used. As with all novel technologies, a greater understanding of the learning curve is crucial, in addition to shielding our patients from this learning curve. Improvements in usability and attention to surgeons' specific needs should increase uptake.
Affiliation(s)
- Joon Ha (corresponding author), Queen Elizabeth Hospital, London, UK
- James Masters, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), UK
- Peter Jun, University of Alberta, Edmonton, Canada
- Mansur Halai, St Michael's Hospital, University of Toronto, Canada
6. Ergonomic effects of medical augmented reality glasses in video-assisted surgery. Surg Endosc 2021; 36:988-998. PMID: 33638103; DOI: 10.1007/s00464-021-08363-8.
Abstract
BACKGROUND The aim of this study was to objectively compare medical augmented reality glasses (ARG) and conventional monitors in video-assisted surgery and to systematically analyze the ergonomic benefits. METHODS Three surgeons (thoracic, laparoscopic, and thyroid surgeons) participated in the study. Six thoracoscopic metastasectomies, six subtotal laparoscopic gastrectomies, and six thyroidectomies were performed with and without ARG. The subjective experience was evaluated using a questionnaire-based NASA Task Load Index (NASA-TLX). Postures during surgery were recorded, and the risk of musculoskeletal disorders associated with video-assisted surgery was assessed using the rapid entire body assessment (REBA). Surface electromyography (EMG) was recorded, and muscle fatigue was objectively measured. RESULTS NASA-TLX scores of the three surgeons were lower when ARG was used compared with the conventional monitor (66.4 versus 82.7), and less workload during surgery was reported with ARG. The laparoscopic surgeon exhibited a substantial decrease in mental and physical demand (-21.1 and 12.5%), as did the thyroid surgeon (-40.0 and -66.7%). Total REBA scores decreased with ARG (from 8 to 3.6). The risk of musculoskeletal disorders improved in the neck and shoulder regions. The root mean square (RMS) of the EMG signal decreased from 0.347 ± 0.150 to 0.286 ± 0.130 (p = 0.010) with ARG use; a decrease was observed in all surgeons. The greatest RMS decrease was observed in the trapezius and sternocleidomastoid muscles, whereas the decrease in the brachioradialis muscle was small. CONCLUSION ARG helped correct poor posture in surgeons during video-assisted surgery and reduced muscular fatigue of the upper body. This study highlights the superior ergonomic efficiency of ARG in video-assisted surgery.
7. Tornari C, Tedla M, Surda P. Rhinology: Simulation Training (Part 1). Current Otorhinolaryngology Reports 2020. DOI: 10.1007/s40136-020-00272-z.
Abstract
Purpose of Review
Recently, there has been an expansion of novel technologies in simulation training. Different models target different aspects of training. The aim of this review was to examine existing evidence about training simulators in rhinology, their incorporation into real training programmes and translation of these skills into the operating room. The first part focuses on the virtual and augmented reality simulators. The second part describes the role of physical (i.e. non-computer-based) models of endoscopic sinus surgery.
Recent Findings
Virtual reality simulators are still evolving and facing challenges due to their inherent cost and lack of realism in terms of the type of haptic feedback they provide. On the other hand, augmented reality seems to be a promising platform with a growing number of applications in preoperative planning, intraoperative navigation and education. Limitations in validity, registration error and level of evidence prevent the adoption of augmented reality on a wider scale or in clinical practice.
Summary
Simulation training is a maturing field that shows reasonable evidence for a number of models. The incorporation of these models into real training programmes requires further evaluation to ensure that training opportunities are being maximized.
8. Carrera JF. A Systematic Review of the Use of Google Glass in Graduate Medical Education. J Grad Med Educ 2019; 11:637-648. PMID: 31871562; PMCID: PMC6919184; DOI: 10.4300/jgme-d-19-00148.1.
Abstract
BACKGROUND Graduate medical education (GME) has emphasized the assessment of trainee competencies and milestones; however, sufficient in-person assessment is often constrained. Using mobile hands-free devices, such as Google Glass (GG), for telemedicine allows for remote supervision, education, and assessment of residents. OBJECTIVE We reviewed the available literature on the use of GG in GME in the clinical learning environment, its use for resident supervision and education, and its clinical utility and technical limitations. METHODS We conducted a systematic review in accordance with the 2009 PRISMA guidelines. Applicable studies were identified through a review of the PubMed, MEDLINE, and Web of Science databases for articles published from January 2013 to August 2018. Two reviewers independently screened titles, abstracts, and full-text articles that reported using GG in GME and assessed the quality of the studies. A systematic review of these studies appraised the literature for descriptions of its utility in GME. RESULTS Following our search and review process, 37 studies were included. The majority evaluated GG in surgical specialties (n = 23) for the purpose of surgical/procedural skills training or supervision. GG was predominantly used for video teleconferencing and for photo and video capture. Highlighted positive aspects of GG use included point-of-view broadcasting and capacity for 2-way communication. Most studies cited drawbacks that included suboptimal battery life and HIPAA concerns. CONCLUSIONS GG shows some promise as a device capable of enhancing GME. Studies evaluating GG in GME are limited by small sample sizes and limited quantitative data. Overall, experience with the use of GG in GME is generally positive.
9. Ramsingh D, Ma M, Le DQ, Davis W, Ringer M, Austin B, Ricks C. Feasibility Evaluation of Commercially Available Video Conferencing Devices to Technically Direct Untrained Nonmedical Personnel to Perform a Rapid Trauma Ultrasound Examination. Diagnostics (Basel) 2019; 9:188. PMID: 31739422; PMCID: PMC6963664; DOI: 10.3390/diagnostics9040188.
Abstract
Introduction: Point-of-care ultrasound (POCUS) is a rapidly expanding discipline that has proven to be a valuable modality in the hospital setting. Recent evidence has demonstrated the utility of commercially available video conferencing technologies, namely FaceTime (Apple Inc, Cupertino, CA, USA) and Google Glass (Google Inc, Mountain View, CA, USA), to allow an expert POCUS examiner to remotely guide a novice medical professional. However, few studies have evaluated the ability of these teleultrasound technologies to guide a nonmedical novice to perform an acute care POCUS examination for cardiac, pulmonary, and abdominal assessments. Additionally, few studies have shown the ability of a POCUS-trained cardiac anesthesiologist to perform the role of an expert instructor. This study sought to evaluate the ability of a POCUS-trained anesthesiologist to remotely guide a nonmedically trained participant to perform an acute care POCUS examination. Methods: A total of 21 nonmedically trained undergraduate students with no prior ultrasound experience were recruited to perform a three-part ultrasound examination on a standardized patient with the guidance of a remote expert, a POCUS-trained cardiac anesthesiologist. The examination included the following acute care POCUS topics: (1) cardiac function via parasternal long/short axis views, (2) pneumothorax assessment via a pleural sliding exam using anterior lung views, and (3) abdominal free fluid exam via the right upper quadrant abdominal view. Each examiner was given a handout with static images of probe placement and actual ultrasound images for the three views. After a brief 8-minute tutorial on the teleultrasound technologies, a connection was established with the expert, and the participants were guided through the acute care POCUS exam. Each view was deemed complete when the expert sonographer was satisfied with the obtained image or determined that the image could not be obtained after 5 min. Image quality was scored on a previously validated 0-to-4 grading scale. The entire session was recorded, and image quality was scored during the exam by the remote expert instructor as well as by a separate, blinded, POCUS-trained expert anesthesiologist. Results: A total of 21 subjects completed the study. The average total time for the exam was 8.5 min (standard deviation = 4.6). A comparison between the live expert examiner and the blinded post-exam reviewer showed 100% agreement between image interpretations. A review of the exams rated 3 or higher demonstrated that 87% of abdominal, 90% of cardiac, and 95% of pulmonary exams achieved this level of image quality. A satisfaction survey of the novice users demonstrated greater ease of following commands for the cardiac and pulmonary exams compared with the abdominal exam. Conclusions: The results from this pilot study demonstrate that nonmedically trained individuals can be guided to complete a relevant ultrasound examination within a short period. The use of telemedicine technologies to promote POCUS warrants further evaluation.
Affiliation(s)
- Davinder Ramsingh (corresponding author), Department of Anesthesiology, Loma Linda University Health, 11234 Anderson St. MC-2532, Loma Linda, CA 92354, USA
- Michael Ma, Department of Anesthesiology, UCI Medical Center, Orange, CA 92868, USA
- Danny Quy Le, David Geffen School of Medicine at UCLA, Los Angeles, CA 90095, USA
- Warren Davis, Department of Anesthesiology, St. Joseph Medical Center, 7601 Osler Drive, Towson, MD 21204, USA
- Mark Ringer, Loma Linda University School of Medicine, Loma Linda, CA 92350, USA
- Briahnna Austin, Department of Anesthesiology, Loma Linda University Health, 11234 Anderson St. MC-2532, Loma Linda, CA 92354, USA
- Cameron Ricks, Department of Anesthesiology, UCI Medical Center, Orange, CA 92868, USA
10. Sharma P, Vleugels RA, Nambudiri VE. Augmented reality in dermatology: Are we ready for AR? J Am Acad Dermatol 2019; 81:1216-1222. PMID: 31302186; DOI: 10.1016/j.jaad.2019.07.008.
Abstract
Augmented reality (AR) refers to a group of technologies that capture, analyze, and superimpose digital information onto the real world. This information gives health care providers unique and useful perspectives that can enhance patient care. AR has been utilized in selected scenarios in health care for several decades, notably laparoscopic surgery and vein finding. In recent years, improved wireless technologies, computing power, and analytics are leading to rapid growth in the AR industry. Novel health care-specific use cases are rapidly being introduced with the potential to widely affect clinical care, particularly in dermatology because of the visual nature of the field. In this article, we define AR, profile clinical and educational uses of AR in dermatology, and discuss key policy considerations for the safe and appropriate use of this emerging technology.
Affiliation(s)
- Priyank Sharma, Department of Dermatology, Brigham and Women's Hospital, Boston, Massachusetts
- Ruth Ann Vleugels, Department of Dermatology, Brigham and Women's Hospital, Boston, Massachusetts
- Vinod E Nambudiri, Department of Dermatology, Brigham and Women's Hospital, Boston, Massachusetts
11. Nikouline A, Jimenez MC, Okrainec A. Feasibility of remote administration of the fundamentals of laparoscopic surgery (FLS) skills test using Google wearable device. Surg Endosc 2019; 34:443-449. DOI: 10.1007/s00464-019-06788-w.
12. McCullough MC, Kulber L, Sammons P, Santos P, Kulber DA. Google Glass for Remote Surgical Tele-proctoring in Low- and Middle-income Countries: A Feasibility Study from Mozambique. Plast Reconstr Surg Glob Open 2018; 6:e1999. PMID: 30656104; PMCID: PMC6326622; DOI: 10.1097/gox.0000000000001999.
Abstract
BACKGROUND Untreated surgical conditions account for one-third of the total global burden of disease, and a lack of trained providers is a significant contributor to the paucity of surgical care in low- and middle-income countries (LMICs). Wearable technology with real-time tele-proctoring has been demonstrated in high-resource settings to be an innovative method of advancing surgical education and connecting providers, but its application to LMICs has not been well described. METHODS Google Glass with live-stream capability was utilized to facilitate tele-proctoring between a surgeon in Mozambique and a reconstructive surgeon in the United States over a 6-month period. At the completion of the pilot period, a survey was administered regarding the acceptability of the image quality as well as the overall educational benefit of the technology in different surgical contexts. RESULTS Twelve surgical procedures were remotely proctored using the technology. No complications occurred in any patient. Both participants reported moderate visual impairment due to image distortion and light over-exposure. Video-stream latency and connection disruption were also cited as limitations. Overall, both participants reported that the technology was highly useful as a training tool in both the intraoperative and perioperative setting. CONCLUSIONS Our experience in Mozambique demonstrates the feasibility of wearable technology to enhance the reach and availability of specialty surgical training in LMICs. Despite shortcomings in the technology and logistical challenges inherent to international collaborations, this educational model holds promise for connecting surgeons across the globe and expanding access to education and mentorship in areas with limited opportunities for surgical trainees.
Affiliation(s)
- Meghan C McCullough, Division of Plastic Surgery, Department of Surgery, Keck School of Medicine of the University of Southern California, Los Angeles, Calif
- Patrick Sammons, Division of Plastic Surgery, Department of Surgery, Keck School of Medicine of the University of Southern California, Los Angeles, Calif
- Pedro Santos, Department of Surgery, Matola Hospital, Matola, Mozambique
- David A Kulber, Division of Plastic Surgery, Department of Surgery, Keck School of Medicine of the University of Southern California, Los Angeles, Calif; Department of Plastic and Reconstructive Surgery, Cedars Sinai Hospital, Los Angeles, Calif
13. Yoon JW, Chen RE, Kim EJ, Akinduro OO, Kerezoudis P, Han PK, Si P, Freeman WD, Diaz RJ, Komotar RJ, Pirris SM, Brown BL, Bydon M, Wang MY, Wharen RE, Quinones-Hinojosa A. Augmented reality for the surgeon: Systematic review. Int J Med Robot 2018; 14:e1914. DOI: 10.1002/rcs.1914.
Affiliation(s)
- Jang W. Yoon, Department of Neurological Surgery, Mayo Clinic, Jacksonville, Florida, USA
- Robert E. Chen, Emory University School of Medicine, Atlanta, Georgia, USA; Georgia Institute of Technology, Atlanta, Georgia, USA
- Phong Si, Georgia Institute of Technology, Atlanta, Georgia, USA
- Roberto J. Diaz, Department of Neurosurgery and Neurology, Montreal Neurological Institute and Hospital, McGill University, Montreal, Quebec, Canada
- Ricardo J. Komotar, Department of Neurological Surgery, University of Miami Miller School of Medicine, University of Miami Hospital, University of Miami Brain Tumor Initiative, Miami, Florida, USA
- Stephen M. Pirris, Department of Neurological Surgery, Mayo Clinic, Jacksonville, Florida, USA; St. Vincent's Spine and Brain Institute, Jacksonville, Florida, USA
- Benjamin L. Brown, Department of Neurological Surgery, Mayo Clinic, Jacksonville, Florida, USA
- Mohamad Bydon, Department of Neurological Surgery, Mayo Clinic, Rochester, Minnesota, USA
- Michael Y. Wang, Department of Neurological Surgery, University of Miami Miller School of Medicine, University of Miami Hospital, University of Miami Brain Tumor Initiative, Miami, Florida, USA
- Robert E. Wharen, Department of Neurological Surgery, Mayo Clinic, Jacksonville, Florida, USA
14. García-Cruz E, Bretonnet A, Alcaraz A. Testing Smart Glasses in urology: Clinical and surgical potential applications. Actas Urol Esp 2018; 42:207-211. PMID: 29037757; DOI: 10.1016/j.acuro.2017.06.007.
Abstract
OBJECTIVES We aimed to explore the potential benefits of using smart glasses (wearable computer optical devices with touch-less command features) in the operating room and in outpatient care settings in urology. MATERIALS AND METHODS Between April and November 2015, 80 urologists were invited to use Google Glass in their daily surgical and clinical practice and to share the devices with other urologists. Participants rated the usefulness of smart glasses on a 10-point scale and provided insights on their potential benefits in a telephone interview. RESULTS During the testing period, 240 urologists used smart glasses, and the 80 initially invited rated their usefulness. Mean scores for usefulness in the operating room and in outpatient clinics were 7.4 and 5.4, respectively. The interviews revealed that the applications of smart glasses considered most promising in surgery were live video streaming and static image playback, augmented reality, laparoscopic navigation, and digital checklists for safety verification. In outpatient settings, participants considered the glasses useful as a viewing platform for sharing test results, for browsing a digital vademecum, and for checking medical records in emergency situations. CONCLUSIONS Urologists engaged in our experience identified various uses of smart glasses with potential benefits for physicians' daily practice, particularly in the urological surgery setting. Further quantitative studies are needed to explore the actual possibilities of smart glasses and to address the technical limitations to their safe use in clinical and surgical practice.
Affiliation(s)
- E García-Cruz, Departamento de Urología, Hospital Plató, Barcelona, Spain; Departamento de Urología, Hospital Clínic de Barcelona, Barcelona, Spain; EAU Young Academic Urologists Men's Health Group, Barcelona, Spain
- A Bretonnet, Healthcare Innovation, Soft for You, Barcelona, Spain
- A Alcaraz, Departamento de Urología, Hospital Clínic de Barcelona, Barcelona, Spain
15. Wei NJ, Dougherty B, Myers A, Badawy SM. Using Google Glass in Surgical Settings: Systematic Review. JMIR Mhealth Uhealth 2018; 6:e54. PMID: 29510969; PMCID: PMC5861300; DOI: 10.2196/mhealth.9409.
Abstract
Background In recent years, wearable devices have become increasingly attractive, and the health care industry has been especially drawn to Google Glass because of its ability to serve as a head-mounted wearable device. The use of Google Glass in surgical settings is of particular interest due to the hands-free device's potential to streamline workflow and maintain sterile conditions in an operating room environment. Objective The aim is to conduct a systematic evaluation of the literature on the feasibility and acceptability of using Google Glass in surgical settings and to assess the potential benefits and limitations of its application. Methods The literature was searched for articles published between January 2013 and May 2017. The search included the following databases: PubMed MEDLINE, Embase, Cumulative Index to Nursing and Allied Health Literature, PsycINFO (EBSCO), and IEEE Xplore. Two reviewers independently screened titles and abstracts and assessed full-text articles. Original research articles that evaluated the feasibility, usability, or acceptability of using Google Glass in surgical settings were included. This review was completed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Results Of the 520 records obtained, 31 met all predefined criteria and were included in this review. Google Glass was used in various surgical specialties. Most studies were conducted in the United States (23/31, 74%), and all were conducted in hospital settings: 29 in adult hospitals (29/31, 94%) and two in children's hospitals (2/31, 7%). Sample sizes of participants who wore Google Glass ranged from 1 to 40. Of the 31 studies, 25 (81%) were conducted under real-time conditions or in actual clinical care settings, whereas the other six (19%) were conducted in simulated environments. Twenty-six studies were pilot or feasibility studies (84%), three were case studies (10%), and two were randomized controlled trials (6%). The majority of studies examined the potential use of Google Glass as an intraoperative intervention (27/31, 87%), whereas others observed its potential use in preoperative (4/31, 13%) and postoperative settings (5/31, 16%). Google Glass was utilized as a videography and photography device (21/31, 68%), a vital sign monitor (6/31, 19%), a surgical navigation display (5/31, 16%), and a videoconferencing tool to communicate with remote surgeons intraoperatively (5/31, 16%). Most studies reported moderate or high acceptability of using Google Glass in surgical settings. The main reported limitations of Google Glass were short battery life (8/31, 26%) and difficulty with hands-free features (5/31, 16%). Conclusions There are promising feasibility and usability data on the use of Google Glass in surgical settings, with particular benefits for surgical education and training. Despite existing technical limitations, Google Glass was generally well received, and several studies in surgical settings acknowledged its potential for training, consultation, patient monitoring, and audiovisual recording.
Affiliation(s)
- Nancy J Wei, Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
- Bryn Dougherty, Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
- Aundria Myers, Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
- Sherif M Badawy, Division of Hematology, Oncology and Stem Cell Transplant, Ann & Robert H Lurie Children's Hospital of Chicago, Chicago, IL, United States; Department of Pediatrics, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States; Department of Pediatrics, Division of Hematology and Oncology, Faculty of Medicine, Zagazig University, Zagazig, Egypt
16.
Abstract
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
17. Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room. Plast Reconstr Surg 2017; 140:1066-1070. PMID: 29068946; DOI: 10.1097/prs.0000000000003802.
Abstract
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons intraoperative hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
18. Siebert JN, Ehrler F, Gervaix A, Haddad K, Lacroix L, Schrurs P, Sahin A, Lovis C, Manzano S. Adherence to AHA Guidelines When Adapted for Augmented Reality Glasses for Assisted Pediatric Cardiopulmonary Resuscitation: A Randomized Controlled Trial. J Med Internet Res 2017; 19:e183. PMID: 28554878; PMCID: PMC5468544; DOI: 10.2196/jmir.7379.
Abstract
Background The American Heart Association (AHA) guidelines for cardiopulmonary resuscitation (CPR) are now recognized as the world's most authoritative resuscitation guidelines. Adherence to these guidelines optimizes the management of critically ill patients and increases their chances of survival after cardiac arrest. Despite their availability, suboptimal quality of CPR is still common. Currently, the median hospital survival rate after pediatric in-hospital cardiac arrest is 36%, whereas it falls below 10% for out-of-hospital cardiac arrest. Among emerging information technologies and devices able to support caregivers during resuscitation and increase adherence to AHA guidelines, augmented reality (AR) glasses have not yet been assessed. In order to assess their potential, we adapted the AHA Pediatric Advanced Life Support (PALS) guidelines for AR glasses. Objective The study aimed to determine whether adapting AHA guidelines for AR glasses increased adherence by reducing deviation and time to initiation of critical life-saving maneuvers during pediatric CPR when compared with the use of PALS pocket reference cards. Methods We conducted a randomized controlled trial with two parallel groups of voluntary pediatric residents, comparing AR glasses to PALS pocket reference cards during a simulation-based pediatric cardiac arrest scenario of pulseless ventricular tachycardia (pVT). The primary outcome was the elapsed time in seconds in each allocation group from onset of pVT to the first defibrillation attempt. Secondary outcomes were the time elapsed to (1) initiation of chest compressions, (2) subsequent defibrillation attempts, and (3) administration of drugs, as well as the time intervals between defibrillation attempts and drug doses, shock doses, and the number of shocks. All these outcomes were assessed for deviation from AHA guidelines. Results Twenty residents were randomized into 2 groups. Time to first defibrillation attempt (mean: 146 s) and adherence to AHA guidelines in terms of time to other critical resuscitation endpoints and drug dose delivery were not improved using AR glasses. However, errors and deviations in defibrillation doses were significantly reduced when compared with the use of the PALS pocket reference cards. In a total of 40 defibrillation attempts, residents not wearing AR glasses used wrong doses in 65% (26/40) of cases, including 21 shock overdoses >100 J, for a cumulative defibrillation dose of 18.7 joules per kg. These errors were reduced by 53% (21/40, P<.001) and the cumulative defibrillation dose by 37% (5.14/14, P=.001) with AR glasses. Conclusions AR glasses did not decrease time to first defibrillation attempt or other critical resuscitation endpoints when compared with PALS pocket cards. However, they improved adherence and performance among residents in terms of administering the defibrillation doses set by the AHA.
Affiliation(s)
- Johan N Siebert, Geneva Children's Hospital, Department of Pediatric Emergency Medicine, University Hospitals of Geneva, Geneva, Switzerland
- Frederic Ehrler, Division of Medical Information Sciences, Department of Radiology and Medical Informatics, University Hospitals of Geneva, Geneva, Switzerland
- Alain Gervaix, Geneva Children's Hospital, Department of Pediatric Emergency Medicine, University Hospitals of Geneva, Geneva, Switzerland
- Kevin Haddad, Geneva Children's Hospital, Department of Pediatric Emergency Medicine, University Hospitals of Geneva, Geneva, Switzerland
- Laurence Lacroix, Geneva Children's Hospital, Department of Pediatric Emergency Medicine, University Hospitals of Geneva, Geneva, Switzerland
- Philippe Schrurs, Geneva Medical Center, University Hospitals of Geneva, Geneva, Switzerland
- Ayhan Sahin, Geneva Medical Center, University Hospitals of Geneva, Geneva, Switzerland
- Christian Lovis, Division of Medical Information Sciences, Department of Radiology and Medical Informatics, University Hospitals of Geneva, Geneva, Switzerland
- Sergio Manzano, Geneva Children's Hospital, Department of Pediatric Emergency Medicine, University Hospitals of Geneva, Geneva, Switzerland
19. Schmutz T, Braun F. Lunettes connectées : médecins régulateurs, ouvrez les yeux ! [Connected glasses: dispatch physicians, open your eyes!]. Annales Françaises de Médecine d'Urgence 2016. DOI: 10.1007/s13341-016-0669-1.