51. Li CG. Application of three-dimensional reconstruction and virtual reality technology in liver surgery. Shijie Huaren Xiaohua Zazhi 2020; 28:515-518. [DOI: 10.11569/wcjd.v28.i13.515]
Abstract
After software-based three-dimensional (3D) reconstruction of the two-dimensional information obtained from routine computed tomography or magnetic resonance imaging examinations of the liver, surgeons can examine liver volume, anatomical variations, the course of the intrahepatic vessels, the location of the tumor, and its relationship to the surrounding vessels more intuitively, vividly, and from multiple angles. Preoperative 3D reconstruction and virtual reality technology allow liver volume to be measured and hepatectomy to be simulated, which further clarifies the extent of resection and ensures that the residual liver volume and function will meet the patient's postoperative needs. Virtual operation and image navigation before and during surgery can also prevent injury to important intrahepatic blood vessels and bile ducts, significantly shorten operating time, reduce intraoperative bleeding, and reduce postoperative complications such as liver dysfunction, bile leakage, and bleeding.
Affiliation(s)
- Cheng-Gang Li: Second Department of Hepatobiliary Surgery, Chinese People's Liberation Army General Hospital, Beijing 100853, China
52. Schmelzle M, Krenzien F, Schöning W, Pratschke J. Laparoscopic liver resection: indications, limitations, and economic aspects. Langenbecks Arch Surg 2020; 405:725-735. [PMID: 32607841 PMCID: PMC7471173 DOI: 10.1007/s00423-020-01918-8]
Abstract
Background Minimally invasive techniques have increasingly found their way into liver surgery in recent years. A multitude of mostly retrospective analyses suggests several advantages of laparoscopic over open liver surgery. Owing to the speed and variety of simultaneous technical and strategic developments, it is difficult to maintain an overview of the current status and perspectives of laparoscopic liver surgery. Purpose This review highlights up-to-date aspects of laparoscopic liver surgery. We discuss established indications with regard to their development over time as well as the continuing limitations of the applied techniques. We give an assessment based on the current literature and our own center's experience, not least with regard to the highly topical discussion of costs. Conclusions While initially mainly benign tumors were operated on laparoscopically, liver metastases and hepatocellular carcinoma are now among the most frequent indications. Technical limitations remain and should be evaluated with the overall aim of not endangering the quality standards of open surgery. Financial aspects cannot be neglected, given the necessity of cost-covering reimbursement.
Affiliation(s)
- Moritz Schmelzle, Felix Krenzien, Wenzel Schöning, Johann Pratschke: Department of Surgery, Campus Charité Mitte and Campus Virchow-Klinikum, Charité - Universitätsmedizin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, Augustenburger Platz 1, 13353 Berlin, Germany
53. Hilt AD, Mamaqi Kapllani K, Hierck BP, Kemp AC, Albayrak A, Melles M, Schalij MJ, Scherptong RWC. Perspectives of Patients and Professionals on Information and Education After Myocardial Infarction With Insight for Mixed Reality Implementation: Cross-Sectional Interview Study. JMIR Hum Factors 2020; 7:e17147. [PMID: 32573464 PMCID: PMC7381062 DOI: 10.2196/17147]
Abstract
BACKGROUND Patient education is crucial in the secondary prevention of cardiovascular disease. Novel technologies such as augmented reality and mixed reality expand the possibilities for providing visual support in this process. Mixed reality creates interactive digital three-dimensional (3D) projections that overlay virtual objects on the real-world environment; whereas augmented reality only overlays virtual objects, mixed reality anchors them to the real world. However, research on this technology in the patient domain is scarce. OBJECTIVE The aim of this study was to understand how patients perceive the information provided after myocardial infarction and to examine whether mixed reality can support this process. METHODS In total, 12 patients who had experienced myocardial infarction and 6 health care professionals were enrolled in the study. Clinical, demographic, and qualitative data were obtained through semistructured interviews, with a main focus on patient experiences within the hospital and the knowledge they gained about their disease. These data were then used to map a susceptible timeframe and to identify how mixed reality can contribute to patient information and education. RESULTS Knowledge transfer after myocardial infarction was perceived by patients as too extensive, impersonal, and inconsistent. Notably, knowledge of anatomy and medication was minimal and was not recognized as crucial by patients, whereas professionals stated the opposite. Patient journey analysis indicated four critical phases of knowledge transfer: at hospital discharge, at the first outpatient visit, during rehabilitation, and during all follow-up outpatient visits. Important patient goals were understanding the event in relation to daily life and its implications for resuming daily life. During follow-up, understanding physical limitations and coping with the condition and medication side effects in daily life emerged as the most important patient goals. The professionals' goals were to improve recovery, enhance medication adherence, and offer coping support. CONCLUSIONS There is a remarkable difference between patients' and professionals' goals regarding information and education after myocardial infarction. Mixed reality may be a practical tool to unite the perspectives of patients and professionals on the disease in a more even manner and thus optimize knowledge transfer after myocardial infarction. Improving medication knowledge seems to be a feasible target for mixed reality. However, further research is needed to create durable methods for education on medication through mixed reality interventions.
Affiliation(s)
- Alexander D Hilt: Department of Cardiology, Leiden University Medical Center, Leiden, Netherlands
- Kevin Mamaqi Kapllani: Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands
- Beerend P Hierck: Department of Anatomy and Embryology, Leiden University Medical Center, Leiden, Netherlands
- Anne C Kemp: Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands
- Armagan Albayrak: Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands
- Marijke Melles: Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands
- Martin J Schalij: Department of Cardiology, Leiden University Medical Center, Leiden, Netherlands
54. Asgar-Deen D, Carriere J, Wiebe E, Peiris L, Duha A, Tavakoli M. Augmented Reality Guided Needle Biopsy of Soft Tissue: A Pilot Study. Front Robot AI 2020; 7:72. [PMID: 33501239 PMCID: PMC7806065 DOI: 10.3389/frobt.2020.00072]
Abstract
Percutaneous biopsies are popular for extracting suspicious tissue formations (primarily for cancer diagnosis) because of their relatively low cost, minimal invasiveness, quick procedure times, and low risk for the patient. Despite these advantages, poor needle and tumor visualization is a problem that can result in clinicians classifying a malignant tumor as benign (a false negative). The system developed by the authors aims to address this concern through two visualization setups. The system is designed to track and visualize the needle and tumor in three-dimensional space using an electromagnetic tracking system. User trials were conducted in which 10 participants, who were not medically trained, each performed a total of 6 tests guiding the biopsy needle to the desired location. The users guided the biopsy needle to the desired point on an artificial spherical tumor (diameters of 30, 20, and 10 mm) using a 3D augmented reality (AR) overlay for three trials and a projection on a second monitor (TV) for the other three trials. From the randomized trials, it was found that the participants were able to guide the needle tip to 6.5 ± 3.3 mm from the desired position with an angle deviation of 1.96 ± 1.10° in the AR trials, compared with 4.5 ± 2.3 mm and 2.70 ± 1.67° in the TV trials. The results indicate that, for simple stationary surgical procedures, an AR display is non-inferior to a TV display.
Affiliation(s)
- David Asgar-Deen: Electrical and Computer Engineering, University of Alberta, Edmonton, AB, Canada
- Jay Carriere: Electrical and Computer Engineering, University of Alberta, Edmonton, AB, Canada
- Ericka Wiebe: Oncology, Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Lashan Peiris: Surgery, Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Aalo Duha: Radiology and Diagnostic Imaging, University of Alberta, Edmonton, AB, Canada
- Mahdi Tavakoli: Electrical and Computer Engineering, University of Alberta, Edmonton, AB, Canada
55. Tuladhar S, AlSallami N, Alsadoon A, Prasad PWC, Alsadoon OH, Haddad S, Alrubaie A. A recent review and a taxonomy for hard and soft tissue visualization-based mixed reality. Int J Med Robot 2020; 16:1-22. [PMID: 32388923 DOI: 10.1002/rcs.2120]
Abstract
BACKGROUND Mixed reality (MR) visualization is gaining popularity in image-guided surgery (IGS) systems, especially for hard and soft tissue surgeries. However, only a few MR systems have been implemented in real time. Several factors limit MR technology and make it difficult to set up and evaluate MR systems in real environments, including that end users are not considered, that the operating room imposes constraints, and that medical images are not fully unified with the operative interventions. METHODOLOGY The purpose of this article is to use the Data, Visualization processing, and View (DVV) taxonomy to evaluate current MR systems. DVV includes all the components that need to be considered and validated for MR used in hard and soft tissue surgeries. This taxonomy helps developers and end users, such as researchers and surgeons, to enhance MR systems for the surgical field. RESULTS We evaluated, validated, and verified the taxonomy on the basis of system comparison, completeness, and acceptance criteria. Around 24 state-of-the-art solutions related to MR visualization were selected and used to demonstrate and validate the taxonomy. The results showed that most of the findings were evaluated and the others were validated. CONCLUSION The DVV taxonomy is a valuable resource for MR visualization in IGS. State-of-the-art solutions are classified, evaluated, validated, and verified to elaborate the process of MR visualization during surgery. The DVV taxonomy benefits end users and guides future improvements in MR.
Affiliation(s)
- Selina Tuladhar: School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia
- Nada AlSallami: Computer Science Department, Worcester State University, Worcester, Massachusetts, USA
- Abeer Alsadoon: School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia; Department of Information Technology, Study Group Australia, Sydney, New South Wales, Australia
- P W C Prasad: School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia
- Omar H Alsadoon: Department of Islamic Sciences, Al Iraqia University, Baghdad, Iraq
- Sami Haddad: Department of Oral and Maxillofacial Services, Greater Western Sydney Area Health Services, Sydney, New South Wales, Australia; Department of Oral and Maxillofacial Services, Central Coast Area Health, Gosford, New South Wales, Australia
- Ahmad Alrubaie: Faculty of Medicine, University of New South Wales, Sydney, New South Wales, Australia
56.
Abstract
OBJECTIVE The aim of this study was to investigate the potential of an intraoperative 3D hologram, a computer graphics model of the liver, used with mixed reality techniques in liver surgery. SUMMARY BACKGROUND DATA The merits of using a hologram for surgical support are: 1) no sterilized display monitor is needed; 2) better spatial awareness; and 3) 3D images that can be shared by all surgeons. METHODS 3D polygon data generated from preoperative computed tomography data were installed on HoloLens head-mounted displays (Microsoft Corporation, Redmond, WA). RESULTS In a Wi-Fi-enabled operating room, several surgeons wearing HoloLens succeeded in sharing the same hologram and moving it from their respective operating angles with simple gesture handling and without any monitors. The intraoperative hologram contributed to a better appreciation of tumor locations and to determining the parenchymal dissection line in hepatectomy for patients with more than 20 colorectal liver metastases. In another case, the hologram enabled a safe Glissonean pedicle approach for hepatocellular carcinoma with a hilar anatomical anomaly; surgeons could easily compare the real patient's anatomy with that of the hologram just before the hepatic hilar procedure. CONCLUSIONS This initial experience suggests that an intraoperative hologram with mixed reality techniques contributes to "last-minute simulation" rather than "navigation." The intraoperative hologram may be a new, next-generation operation-supportive tool in terms of spatial awareness, sharing, and simplicity.
57.
58. Cartucho J, Shapira D, Ashrafian H, Giannarou S. Multimodal mixed reality visualisation for intraoperative surgical guidance. Int J Comput Assist Radiol Surg 2020; 15:819-826. [PMID: 32333360 PMCID: PMC7261260 DOI: 10.1007/s11548-020-02165-4]
Abstract
Purpose In the last decade, there has been a great effort to bring mixed reality (MR) into the operating room to assist surgeons intraoperatively. However, progress towards this goal is still at an early stage. The aim of this paper is to propose an MR visualisation platform which projects multiple imaging modalities to assist intraoperative surgical guidance. Methodology In this work, an MR visualisation platform has been developed for the Microsoft HoloLens. The platform contains three visualisation components, namely a 3D organ model, volumetric data, and tissue morphology captured with intraoperative imaging modalities. Furthermore, a set of novel interactive functionalities has been designed, including scrolling through volumetric data and adjusting the transparency of the virtual objects. A pilot user study was conducted to evaluate the usability of the proposed platform in the operating room. The participants were allowed to interact with the visualisation components and test the different functionalities, and each surgeon answered a questionnaire on the usability of the platform and provided feedback and suggestions. Results The analysis of the surgeons' scores showed that the 3D model is the most popular MR visualisation component and that neurosurgery is the most relevant speciality for this platform. The majority of the surgeons found the proposed visualisation platform intuitive and would use it in their operating rooms for intraoperative surgical guidance. Our platform has several promising potential clinical applications, including vascular neurosurgery. Conclusion The presented pilot study verified the potential of the proposed visualisation platform and its usability in the operating room. Our future work will focus on enhancing the platform by incorporating the surgeons' suggestions and conducting extensive evaluation with a large group of surgeons.
Affiliation(s)
- João Cartucho: The Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK
- David Shapira: The Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK; Product Development Group Zurich, ETH Zürich, 8092 Zürich, Switzerland
- Hutan Ashrafian: The Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK
- Stamatia Giannarou: The Hamlyn Centre for Robotic Surgery, Imperial College London, London, SW7 2AZ, UK
59. Lei PF, Su SL, Kong LY, Wang CG, Zhong D, Hu YH. Mixed Reality Combined with Three-Dimensional Printing Technology in Total Hip Arthroplasty: An Updated Review with a Preliminary Case Presentation. Orthop Surg 2020; 11:914-920. [PMID: 31663276 PMCID: PMC6819179 DOI: 10.1111/os.12537]
Abstract
Three-dimensional (3D) printing, virtual reality, and augmented reality technologies have been used to help surgeons complete complex total hip arthroplasty, but their respective shortcomings limit further application. With continued technical development, mixed reality (MR) has been applied to improve the success rate of complicated hip arthroplasty because of its unique advantages. We present the case of a 59-year-old man with an intertrochanteric fracture of the left femur who had previously undergone left hip fusion. After admission to our hospital, left total hip arthroplasty was performed using a combination of MR and 3D printing technology. Before surgery, a 3D reconstruction of a bony landmark exposed in the surgical area was performed. A veneer part was then designed to fit this bony landmark and connected, through a connecting rod, to a reference registration landmark outside the body. This assembly was manufactured as a single reference registration instrument using 3D printing, and the patient's data for bone and surrounding tissue, along with the digital 3D information of the reference registration instrument, were imported into the head-mounted display (HMD). During the operation, the disinfected reference registration instrument was installed on the selected bony landmark, and automatic real-time registration was achieved by the HMD through recognition of the registration landmark on the instrument, so that the patient's virtual bone and other anatomical structures were quickly and accurately superimposed on the patient's real body. To the best of our knowledge, this is the first report of MR combined with 3D printing technology in total hip arthroplasty.
Affiliation(s)
- Peng-Fei Lei: Department of Orthopaedics, Xiangya Hospital, Central South University, Changsha, China
- Shi-Long Su: Department of Orthopaedics, Xiangya Hospital, Central South University, Changsha, China
- Ling-Yu Kong: Department of Radiology, Xiangya Hospital, Central South University, Changsha, China
- Cheng-Gong Wang: Department of Orthopaedics, Xiangya Hospital, Central South University, Changsha, China
- Da Zhong: Department of Orthopaedics, Xiangya Hospital, Central South University, Changsha, China
- Yi-He Hu: Department of Orthopaedics, Xiangya Hospital, Central South University, Changsha, China
60. Application of Image Fusion in Diagnosis and Treatment of Liver Cancer. Applied Sciences (Basel) 2020. [DOI: 10.3390/app10031171]
Abstract
With the accelerated development of medical imaging equipment and techniques, image fusion technology has been effectively applied in diagnosis, biopsy, and radiofrequency ablation, especially for liver tumors. Tumor treatment that relies on a single medical imaging modality can face challenges due to the deep position of the lesions, prior operations, and the specific background condition of the liver disease. Image fusion technology has been employed to address these challenges: it provides real-time anatomical imaging superimposed with functional images of the same plane to facilitate the diagnosis and treatment of liver tumors. This paper reviews the key principles of image fusion technology and its application in tumor treatment, particularly for liver tumors, and concludes with a discussion of the limitations and prospects of the technology.
61. Kumar RP, Pelanis E, Bugge R, Brun H, Palomar R, Aghayan DL, Fretland ÅA, Edwin B, Elle OJ. Use of mixed reality for surgery planning: Assessment and development workflow. J Biomed Inform 2020; 112S:100077. [PMID: 34417006 DOI: 10.1016/j.yjbinx.2020.100077]
Abstract
Meticulous preoperative planning is an important part of any surgery to achieve high levels of precision and avoid complications. Conventional medical 2D images and their corresponding three-dimensional (3D) reconstructions are the main components of an efficient planning system. However, these systems still use flat screens for visualisation of 3D information, thus losing the depth information that is crucial for 3D spatial understanding. Cutting-edge mixed reality systems have now been shown to be a worthy alternative for providing 3D information to clinicians. In this work, we describe the development of the different steps in the workflow for the clinical use of mixed reality, including results from a qualitative user evaluation and clinical use cases in laparoscopic liver surgery and heart surgery. Our findings indicate very high general acceptance of mixed reality devices with our applications, which were consistently rated highly for the device, visualisation, and interaction areas of our questionnaire. Furthermore, our clinical use cases demonstrate that the surgeons perceived the HoloLens to be useful and recommendable to other surgeons, and that it provided a definitive answer at a multi-disciplinary team meeting.
Affiliation(s)
- Rahul Prasanna Kumar: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway
- Egidijus Pelanis: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, 0450 Oslo, Norway
- Robin Bugge: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Department of Diagnostic Physics, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway
- Henrik Brun: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Department for Pediatric Cardiology, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway
- Rafael Palomar: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Department of Computer Science, NTNU, Teknologiveien 22, 2815 Gjøvik, Norway
- Davit L Aghayan: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, 0450 Oslo, Norway; Department of Surgery N1, Yerevan State Medical University, Yerevan, Armenia
- Åsmund Avdem Fretland: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, 0450 Oslo, Norway; Department of Hepatopancreatobiliary Surgery, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway
- Bjørn Edwin: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, 0450 Oslo, Norway; Department of Hepatopancreatobiliary Surgery, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway
- Ole Jakob Elle: The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, 0372 Oslo, Norway; Department of Informatics, University of Oslo, Gaustadallèen 23 B, 0373 Oslo, Norway
62. Response to Comment on "Mixed Reality in Visceral Surgery: Development of a Suitable Workflow and Evaluation of Intraoperative Usecases". Ann Surg 2019; 269:e54-e55. [PMID: 30845017 DOI: 10.1097/sla.0000000000002913]
63. Real-time navigation for laparoscopic hepatectomy using image fusion of preoperative 3D surgical plan and intraoperative indocyanine green fluorescence imaging. Surg Endosc 2019; 34:3449-3459. [DOI: 10.1007/s00464-019-07121-1]
64. Wu X, Liu R, Xu S, Yang C, Yang S, Shao Z, Li S, Ye Z. Feasibility of mixed reality-based intraoperative three-dimensional image-guided navigation for atlanto-axial pedicle screw placement. Proc Inst Mech Eng H 2019; 233:1310-1317. [PMID: 31617820 DOI: 10.1177/0954411919881255]
Abstract
This study aimed to evaluate the safety and accuracy of mixed reality-based intraoperative three-dimensional navigated pedicle screw placement in three-dimensional printed models of the fractured upper cervical spine. A total of 27 cervical models from patients with upper cervical spine fractures formed the study group. All C1 and C2 pedicle screws were inserted using a mixed reality-based intraoperative three-dimensional image-guided navigation system. The accuracy and safety of pedicle screw placement were evaluated on the basis of postoperative computed tomography scans. A total of 108 pedicle screws were properly inserted into the cervical three-dimensional models under mixed reality-based navigation, comprising 54 C1 and 54 C2 pedicle screws. Analysis of the dimensional parameters of each pedicle at the C1/C2 level showed no statistically significant differences between the ideal and actual entry points, inclined angles, and tailed angles. No screw was misplaced outside the pedicle of the three-dimensional printed model, and no ionizing X-ray radiation was used during screw placement under navigation. Placing C1/C2 pedicle screws under MR surgical navigation is easy and safe, and mixed reality-based navigation is feasible for upper cervical spinal fractures, with improved safety and accuracy of C1/C2 pedicle screw insertion.
Affiliation(s)
- Xinghuo Wu: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Rong Liu: Department of Orthopaedic Surgery, Puren Hospital of Wuhan, Wuhan University of Science and Technology, Wuhan, China
- Song Xu: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Cao Yang: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Shuhua Yang: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Zengwu Shao: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Suyun Li: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Zhewei Ye: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
65. Dorweiler B, Vahl CF, Ghazy A. Zukunftsperspektiven digitaler Visualisierungstechnologien in der Gefäßchirurgie [Future prospects of digital visualization technologies in vascular surgery]. Gefässchirurgie 2019. [DOI: 10.1007/s00772-019-00570-x]
66. Gao Y, Lin L, Chai G, Xie L. A feasibility study of a new method to enhance the augmented reality navigation effect in mandibular angle split osteotomy. J Craniomaxillofac Surg 2019; 47:1242-1248. [DOI: 10.1016/j.jcms.2019.04.005]
67. Zhang ZY, Duan WC, Chen RK, Zhang FJ, Yu B, Zhan YB, Li K, Zhao HB, Sun T, Ji YC, Bai YH, Wang YM, Zhou JQ, Liu XZ. Preliminary application of mixed reality in neurosurgery: Development and evaluation of a new intraoperative procedure. J Clin Neurosci 2019; 67:234-238. [PMID: 31221576 DOI: 10.1016/j.jocn.2019.05.038]
Abstract
During neurological surgery, neurosurgeons have to transform two-dimensional (2D) sectional images into three-dimensional (3D) structures at the cognitive level. The complexity of intracranial structures increases the difficulty and risk of neurosurgery. Mixed reality (MR) applications reduce the obstacles in the transformation from 2D images to 3D visualization of the anatomical structures of the central nervous system. In this study, holographic images were created by MR using computed tomography (CT), computed tomography angiography (CTA), and magnetic resonance imaging (MRI) data from patients. The surgeon's field of vision was superimposed with the 3D model of the patient's intracranial structures displayed on a mixed reality head-mounted display (MR-HMD). The neurosurgeons practiced and evaluated the feasibility of this technique in neurosurgical cases. We developed segmentation image masks and texture mapping for brain tissue, intracranial vessels, nerves, and tumors, including their relative positions, using MR technologies. The results showed that the three-dimensional imaging remained stable in the operating room, with no significant flutter or blur, and the neurosurgeons' feedback on the comfort of the equipment and the practicality of the technology was satisfactory. In conclusion, MR technology can holographically construct a 3D digital model of a patient's lesions and improve the anatomical perception of neurosurgeons during craniotomy. The feasibility of MR-HMD application in neurosurgery is confirmed.
Affiliation(s)
- Zhen-Yu Zhang, Wen-Chao Duan, Ruo-Kun Chen, Feng-Jiang Zhang, Bin Yu, Yun-Bo Zhan, Ke Li, Hai-Biao Zhao, Tao Sun, Yu-Chen Ji, Ya-Hui Bai, Yan-Min Wang, Jin-Qiao Zhou, Xian-Zhi Liu: Department of Neurosurgery, The First Affiliated Hospital of Zhengzhou University, Jian She Dong Road 1, Zhengzhou, Henan 450000, China
68. Mixed Reality-Based Preoperative Planning for Training of Percutaneous Transforaminal Endoscopic Discectomy: A Feasibility Study. World Neurosurg 2019; 129:e767-e775. [PMID: 31203062 DOI: 10.1016/j.wneu.2019.06.020]
Abstract
OBJECTIVE To explore the effect of preoperative planning using mixed reality (MR) on training for percutaneous transforaminal endoscopic discectomy (PTED). METHODS Before the training, we invited an experienced chief physician to plan the puncture path of PTED on X-ray films of the lumbar spine model and on the 3D Slicer platform, respectively, and used these plans as the standard to guide trainees. In total, 60 young residents were randomly divided into Group A (N = 30) and Group B (N = 30). Group A learned the two-dimensional standard planning route, whereas Group B learned the standard route planning based on MR through the 3D Slicer platform. Trainees were then asked to perform PTED puncture on a lumbar spine model. Questionnaires were distributed to trainees before and after the training, and puncture times, operating time (minutes), and fluoroscopy times were recorded during the training. RESULTS After the training, more trainees expressed recognition of MR, believing that it could help with preoperative planning and training for PTED, and their high satisfaction indicated the success of the training. Moreover, puncture times, operating time (minutes), and fluoroscopy times in Group B were significantly lower than those in Group A. CONCLUSIONS MR technology contributes to preoperative planning for PTED and is beneficial in PTED training. It significantly reduces puncture times and fluoroscopy times, providing a standardized method for the training of PTED.
69. Pelanis E, Kumar RP, Aghayan DL, Palomar R, Fretland ÅA, Brun H, Elle OJ, Edwin B. Use of mixed reality for improved spatial understanding of liver anatomy. Minim Invasive Ther Allied Technol 2019; 29:154-160. [PMID: 31116053 DOI: 10.1080/13645706.2019.1616558]
Abstract
Introduction: In liver surgery, medical images from preoperative computed tomography and magnetic resonance imaging are the basis for the decision-making process. These images are used in surgery planning and guidance, especially for parenchyma-sparing hepatectomies. Though medical images are commonly visualized in two dimensions (2D), surgeons need to mentally reconstruct this information in three dimensions (3D) for a spatial understanding of the anatomy. The aim of this work is to investigate whether the use of a 3D model visualized in mixed reality with Microsoft HoloLens increases the spatial understanding of the liver compared with the conventional way of using 2D images. Material and methods: In this study, clinicians had to identify liver segments associated with lesions. Results: Twenty-eight clinicians with varying medical experience were recruited for the study. Of a total of 150 lesions, 89 were correctly assigned, without a significant difference between the modalities. The median time for correct identification was 23.5 [4-138] s using the magnetic resonance images and 6.00 [1-35] s using the HoloLens (p < 0.001). Conclusions: The use of 3D liver models in mixed reality significantly decreases the time for tasks requiring a spatial understanding of the organ. This may significantly decrease operating time and improve the use of resources.
Affiliation(s)
- Egidijus Pelanis: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Institute of Clinical Medicine, Faculty of Medicine, University of Oslo, Oslo, Norway
- Rahul P Kumar: The Intervention Centre, Oslo University Hospital, Oslo, Norway
- Davit L Aghayan: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Institute of Clinical Medicine, Faculty of Medicine, University of Oslo, Oslo, Norway; Department of Surgery N1, Yerevan State Medical University after M. Heratsi, Yerevan, Armenia
- Rafael Palomar: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Department of Computer Science, NTNU, Gjøvik, Norway
- Åsmund A Fretland: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Institute of Clinical Medicine, Faculty of Medicine, University of Oslo, Oslo, Norway; Department of HPB Surgery, Oslo University Hospital - Rikshospitalet, Oslo, Norway
- Henrik Brun: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Clinic for Pediatric Cardiology, Oslo University Hospital - Rikshospitalet, Oslo, Norway
- Ole Jakob Elle: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Department of Informatics, The Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Bjørn Edwin: The Intervention Centre, Oslo University Hospital, Oslo, Norway; Institute of Clinical Medicine, Faculty of Medicine, University of Oslo, Oslo, Norway; Department of HPB Surgery, Oslo University Hospital - Rikshospitalet, Oslo, Norway
70. Hu HZ, Feng XB, Shao ZW, Xie M, Xu S, Wu XH, Ye ZW. Application and Prospect of Mixed Reality Technology in Medical Field. Curr Med Sci 2019; 39:1-6. [PMID: 30868484 DOI: 10.1007/s11596-019-1992-8]
Abstract
Mixed reality (MR) technology is a new digital holographic imaging technology that emerged in the field of graphics after virtual reality (VR) and augmented reality (AR), and it constitutes a new interdisciplinary frontier. As a new generation of technology, MR has attracted great attention from clinicians in recent years. The emergence of MR will bring revolutionary changes to medical education and training, medical research, medical communication, and clinical treatment. At present, MR has become a popular frontline information technology for medical applications, and with the spread of digital technology in the medical field, its development prospects are immense. The purpose of this review article is to introduce the application of MR technology in the medical field and to consider its future trends.
Affiliation(s)
- Hong-Zhi Hu, Xiao-Bo Feng, Zeng-Wu Shao, Mao Xie, Song Xu, Xing-Huo Wu, Zhe-Wei Ye: Department of Orthopaedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430022, China
71. Comment on "Mixed Reality in Visceral Surgery: Development of a Suitable Workflow and Evaluation of Intraoperative Usecases". Ann Surg 2019; 269:e53. [PMID: 30845016 DOI: 10.1097/sla.0000000000002905]
72. Fang C, Zhang P, Qi X. Digital and intelligent liver surgery in the new era: Prospects and dilemmas. EBioMedicine 2019; 41:693-701. [PMID: 30773479 PMCID: PMC6442371 DOI: 10.1016/j.ebiom.2019.02.017]
Abstract
Despite tremendous advances in traditional imaging technology over the past few decades, the intraoperative identification of lesions is still based on naked-eye observation or preoperative image evaluation. These two-dimensional image data cannot objectively reflect the complex anatomical structure of the liver or the detailed morphological features of a lesion, which directly limits their clinical value in surgery, since they cannot improve the curative efficacy of the operation or the prognosis of the patient. This traditional mode of diagnosis and treatment is being changed by digital medical imaging technology, which enables accurate and efficient diagnosis, selection of reasonable treatment schemes, improved rates of radical resection, and reduced surgical risk. In this paper, we review the latest applications of digital intelligent diagnosis and treatment technology related to liver surgery, in the hope that it may help to achieve accurate treatment of liver surgical diseases.
Affiliation(s)
- Chihua Fang: CHESS, The First Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou 510282, China
- Peng Zhang: CHESS, The First Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou 510282, China
- Xiaolong Qi: CHESS Frontier Center Working Party, The First Hospital of Lanzhou University, Lanzhou University, Lanzhou 730000, China
73.
Abstract
INTRODUCTION Head-mounted mixed-reality technologies may enable advanced intraoperative visualization during visceral surgery. In this technical note, we describe an innovative use of real-time mixed reality during robotic-assisted transanal total mesorectal excision. TECHNIQUE Video signals from the robotic console and the video endoscopic transanal approach were displayed on a virtual monitor using a head-up display. The surgeon, assistant, and a surgical trainee used this technique during abdominal and transanal robotic-assisted total mesorectal excision. We evaluated the feasibility and usability of this approach with the use of validated scales. RESULTS The technical feasibility of the real-time visualization provided by the current setup was demonstrated for both the robotic and transanal parts of the surgery. The surgeon, assistant, and trainee used the mixed-reality device for 15, 55, and 35 minutes, respectively. All participants handled the device intuitively and reported a high level of comfort during the surgery. The task load was easily manageable (task load index: <4/21), although the surgeon and assistant both noted a short delay in the real-time video. CONCLUSION The implementation of head-mounted mixed-reality technology during robotic-assisted transanal total mesorectal excision can benefit the operating surgeon, assistant, and surgical trainee. Further improvements in display quality, connectivity, and systems integration are necessary.
74. Wu X, Liu R, Yu J, Xu S, Yang C, Yang S, Shao Z, Ye Z. Mixed Reality Technology Launches in Orthopedic Surgery for Comprehensive Preoperative Management of Complicated Cervical Fractures. Surg Innov 2019; 25:421-422. [PMID: 30012077 DOI: 10.1177/1553350618761758]
Affiliation(s)
- Xinghuo Wu: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Rong Liu: Department of Orthopaedic Surgery, Puren Hospital of Wuhan, Wuhan University of Science and Technology, Wuhan, China
- Jie Yu: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Song Xu: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Cao Yang: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Shuhua Yang: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Zengwu Shao: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Zhewei Ye: Department of Orthopaedic Surgery, Wuhan Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
75. Moosburner S, Remde C, Tang P, Queisner M, Haep N, Pratschke J, Sauer IM. Real world usability analysis of two augmented reality headsets in visceral surgery. Artif Organs 2019; 43:694-698. [DOI: 10.1111/aor.13396]
Affiliation(s)
- Simon Moosburner: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany
- Christopher Remde: Image Knowledge Gestaltung, Humboldt-Universität zu Berlin, Berlin, Germany
- Peter Tang: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany
- Moritz Queisner: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany; Image Knowledge Gestaltung, Humboldt-Universität zu Berlin, Berlin, Germany
- Nils Haep: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany
- Johann Pratschke: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany
- Igor M. Sauer: Department of Surgery, Experimental Surgery, Charité - Universitätsmedizin Berlin, Campus Charité Mitte and Campus Virchow-Klinikum, Berlin, Germany
76. Fida B, Cutolo F, di Franco G, Ferrari M, Ferrari V. Augmented reality in open surgery. Updates Surg 2018; 70:389-400. [PMID: 30006832 DOI: 10.1007/s13304-018-0567-8]
Abstract
Augmented reality (AR) has successfully provided surgeons with extensive visual information on surgical anatomy to assist them throughout procedures. AR allows surgeons to view the surgical field through a superimposed 3D virtual model of the relevant anatomical details. Open surgery, however, presents particular challenges. This study provides a comprehensive overview of the available literature regarding the use of AR in open surgery, in both clinical and simulated settings. In this way, we aim to analyze current trends and solutions to help developers and end users discuss and understand the benefits and shortcomings of these systems in open surgery. We performed a PubMed search of the available literature, updated to January 2018, using the terms (1) "augmented reality" AND "open surgery", (2) "augmented reality" AND "surgery" NOT "laparoscopic" NOT "laparoscope" NOT "robotic", (3) "mixed reality" AND "open surgery", (4) "mixed reality" AND "surgery" NOT "laparoscopic" NOT "laparoscope" NOT "robotic". The aspects evaluated were the following: real data source, virtual data source, visualization processing modality, tracking modality, registration technique, and AR display type. The initial search yielded 502 studies. After removing duplicates and screening abstracts, a total of 13 relevant studies were chosen. In 1 of the 13 studies, in vitro experiments were performed, while the rest were carried out in clinical settings including pancreatic, hepatobiliary, and urogenital surgery. AR systems in open surgery appear to be versatile and reliable tools in the operating room. However, some technological limitations need to be addressed before they are implemented in routine practice.
Affiliation(s)
- Benish Fida: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
- Fabrizio Cutolo: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
- Gregorio di Franco: General Surgery Unit, Department of Surgery, Translational and New Technologies, University of Pisa, Pisa, Italy
- Mauro Ferrari: Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, Pisa, Italy
- Vincenzo Ferrari: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
77. Gibby JT, Swenson SA, Cvetko S, Rao R, Javan R. Head-mounted display augmented reality to guide pedicle screw placement utilizing computed tomography. Int J Comput Assist Radiol Surg 2018; 14:525-535. [PMID: 29934792 DOI: 10.1007/s11548-018-1814-7]
Abstract
PURPOSE Augmented reality has the potential to enhance surgical navigation and visualization. We determined whether head-mounted display augmented reality (HMD-AR) with superimposed computed tomography (CT) data could allow the wearer to percutaneously guide pedicle screw placement in an opaque lumbar model with no real-time fluoroscopic guidance. METHODS CT imaging was obtained of a phantom composed of L1-L3 Sawbones vertebrae in opaque silicone. Preprocedural planning was performed by creating virtual trajectories of appropriate angle and depth for an ideal approach into the pedicle, and these data were integrated into the Microsoft HoloLens using the Novarad OpenSight application, allowing the user to view the virtual trajectory guides and CT images superimposed on the phantom in two and three dimensions. Spinal needles were inserted following the virtual trajectories to the point of contact with bone. Repeat CT revealed the actual needle trajectories, allowing comparison with the ideal preprocedural paths. RESULTS Registration of AR to the phantom showed a roughly circular deviation with a maximum average radius of 2.5 mm. Users took an average of 200 s to place a needle. Extrapolation of needle trajectories into the pedicle showed that of 36 needles placed, 35 (97%) would have remained within the pedicles. Needles were placed at a mean distance of 4.69 mm in the mediolateral direction and 4.48 mm in the craniocaudal direction from the pedicle bone edge. CONCLUSION To our knowledge, this is the first peer-reviewed report and evaluation of HMD-AR with superimposed 3D guidance utilizing CT for spinal pedicle guide placement for the purpose of cannulation without the use of fluoroscopy.
Affiliation(s)
- Jacob T Gibby: School of Medicine and Health Sciences, George Washington University, 2300 I St NW, Washington, DC, 20052, USA
- Samuel A Swenson: School of Medicine and Health Sciences, George Washington University, 2300 I St NW, Washington, DC, 20052, USA
- Steve Cvetko: Novarad Corporation, 752 East 1180 South, Suite 200, American Fork, UT, 84003, USA
- Raj Rao: School of Medicine and Health Sciences, George Washington University, 2300 I St NW, Washington, DC, 20052, USA; Department of Orthopedic Surgery, George Washington University Hospital, 900 23rd St NW, Washington, DC, 20037, USA
- Ramin Javan: School of Medicine and Health Sciences, George Washington University, 2300 I St NW, Washington, DC, 20052, USA; Department of Neuroradiology, George Washington University Hospital, 900 23rd St NW, Suite G2092, Washington, DC, 20037, USA