1. Han Z, Dou Q. A review on organ deformation modeling approaches for reliable surgical navigation using augmented reality. Comput Assist Surg (Abingdon) 2024;29:2357164. PMID: 39253945. DOI: 10.1080/24699322.2024.2357164.
Abstract
Augmented Reality (AR) holds the potential to revolutionize surgical procedures by allowing surgeons to visualize critical structures within the patient's body. This is achieved by superimposing preoperative organ models onto the actual anatomy. Challenges arise from dynamic deformations of organs during surgery, making preoperative models inadequate for faithfully representing intraoperative anatomy. To enable reliable navigation in augmented surgery, modeling of intraoperative deformation to obtain an accurate alignment of the preoperative organ model with the intraoperative anatomy is indispensable. Despite the various methods proposed to model intraoperative organ deformation, few literature reviews have systematically categorized and summarized these approaches. This review aims to fill that gap by providing a comprehensive, technically oriented overview of modeling methods for intraoperative organ deformation in augmented reality in surgery. Through a systematic search and screening process, 112 closely relevant papers were included in this review. By presenting the current status of organ deformation modeling methods and their clinical applications, this review seeks to enhance the understanding of organ deformation modeling in AR-guided surgery and to discuss potential topics for future advancements.
Affiliation(s)
- Zheng Han
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
- Qi Dou
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
2. Khang S, Park T, Lee J, Kim KW, Song H, Lee J. Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method. Diagnostics (Basel) 2022;12:3123. PMID: 36553130. PMCID: PMC9777271. DOI: 10.3390/diagnostics12123123.
Abstract
This study proposes a markerless Augmented Reality (AR) surgical framework for breast lesion removal using a depth sensor and 3D breast Computed Tomography (CT) images. A patient mesh in the real coordinate system is acquired through a patient 3D scan using a depth sensor for registration. The patient mesh in the virtual coordinate system is obtained by contrast-based skin segmentation of a 3D mesh generated from breast CT scans. The nipple area is then detected based on the gradient in the segmented skin area, and a region of interest (ROI) is set from the detection result to select the vertices in the virtual coordinate system. The meshes in the real and virtual coordinate systems are first aligned by matching their centers of mass, and the Iterative Closest Point (ICP) method is applied for more precise registration. Experimental results on data from 20 patients showed 98.35 ± 0.71% skin segmentation accuracy in terms of Dice Similarity Coefficient (DSC), 2.79 ± 1.54 mm nipple detection error, and 4.69 ± 1.95 mm registration error. Experiments using phantom and patient data also confirmed high accuracy in AR visualization. The proposed method showed that 3D AR visualization of medical data on the patient's body is possible using a single depth sensor, without markers.
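The skin-segmentation accuracy above is reported as a Dice Similarity Coefficient (DSC). As a quick reference, a minimal NumPy sketch of how DSC is computed over binary masks (an illustration of the standard metric, not code from the paper):

```python
import numpy as np

def dice_similarity(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks; 1.0 = perfect overlap."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:  # both masks empty: define as perfect agreement
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two 4x4 masks, 3 foreground pixels each, overlapping in 2
pred = np.zeros((4, 4), dtype=bool); pred[0, 0:3] = True
ref  = np.zeros((4, 4), dtype=bool); ref[0, 1:4] = True
print(round(dice_similarity(pred, ref), 3))  # → 0.667
```

The same formula applies unchanged to 3D voxel masks, as used for the CT skin segmentation here.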
Affiliation(s)
- Seungwoo Khang
- School of Computer Science and Engineering, Soongsil University, 369 Sangdo-ro, Dongjak-gu, Seoul 06978, Republic of Korea
- Taeyong Park
- Department of Biomedical Informatics, Hallym University Medical Center, 22 Gwanpyeong-ro 170beon-gil, Dongan-gu, Anyang-si 14068, Republic of Korea
- Junwoo Lee
- Ewha Womans University Mokdong Hospital, 1071 Anyangcheon-ro, Yangcheon-gu, Seoul 07985, Republic of Korea
- Kyung Won Kim
- Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 88 Olympic-ro, 43-gil, Songpa-gu, Seoul 05505, Republic of Korea
- Hyunjoo Song
- School of Computer Science and Engineering, Soongsil University, 369 Sangdo-ro, Dongjak-gu, Seoul 06978, Republic of Korea
- Jeongjin Lee
- School of Computer Science and Engineering, Soongsil University, 369 Sangdo-ro, Dongjak-gu, Seoul 06978, Republic of Korea
3. Görtz M, Byczkowski M, Rath M, Schütz V, Reimold P, Gasch C, Simpfendörfer T, März K, Seitel A, Nolden M, Ross T, Mindroc-Filimon D, Michael D, Metzger J, Onogur S, Speidel S, Mündermann L, Fallert J, Müller M, von Knebel Doeberitz M, Teber D, Seitz P, Maier-Hein L, Duensing S, Hohenfellner M. A Platform and Multisided Market for Translational, Software-Defined Medical Procedures in the Operating Room (OP 4.1): Proof-of-Concept Study. JMIR Med Inform 2022;10:e27743. PMID: 35049510. PMCID: PMC8814925. DOI: 10.2196/27743.
Abstract
Background Although digital and data-based technologies are widespread in various industries in the context of Industry 4.0, the use of smart connected devices in health care is still in its infancy. Innovative solutions for the medical environment are affected by difficult access to medical device data and high barriers to market entry because of proprietary systems. Objective In the proof-of-concept project OP 4.1, we show the business viability of connecting and augmenting medical devices and data through software add-ons by giving companies a technical and commercial platform for the development, implementation, distribution, and billing of innovative software solutions. Methods The creation of a central platform prototype requires the collaboration of several independent market contenders, including medical users, software developers, medical device manufacturers, and platform providers. A dedicated consortium of clinical and scientific partners as well as industry partners was set up. Results We demonstrate the successful development of the prototype of a user-centric, open, and extensible platform for the intelligent support of processes starting with the operating room. By connecting heterogeneous data sources and medical devices from different manufacturers and making them accessible for software developers and medical users, the cloud-based platform OP 4.1 enables the augmentation of medical devices and procedures through software-based solutions. The platform also allows for the demand-oriented billing of apps and medical devices, thus permitting software-based solutions to fast-track their economic development and become commercially successful. Conclusions The technology and business platform OP 4.1 creates a multisided market for the successful development, implementation, distribution, and billing of new software solutions in the operating room and in the health care sector in general. 
Consequently, software-based medical innovation can be translated into clinical routine quickly, efficiently, and cost-effectively, optimizing the treatment of patients through smartly assisted procedures.
Affiliation(s)
- Magdalena Görtz
- Department of Urology, Heidelberg University Hospital, Heidelberg, Germany
- Mathias Rath
- Department of Urology, Heidelberg University Hospital, Heidelberg, Germany
- Viktoria Schütz
- Department of Urology, Heidelberg University Hospital, Heidelberg, Germany
- Philipp Reimold
- Department of Urology, Heidelberg University Hospital, Heidelberg, Germany
- Claudia Gasch
- Department of Urology, Heidelberg University Hospital, Heidelberg, Germany
- Keno März
- German Cancer Research Center, Heidelberg, Germany
- Marco Nolden
- German Cancer Research Center, Heidelberg, Germany
- Tobias Ross
- German Cancer Research Center, Heidelberg, Germany
- Sinan Onogur
- German Cancer Research Center, Heidelberg, Germany
- Magnus von Knebel Doeberitz
- Department of Applied Tumor Biology, Institute of Pathology, Heidelberg University Hospital, Heidelberg, Germany
- Dogu Teber
- Department of Urology, Städtisches Klinikum Karlsruhe, Karlsruhe, Germany
- Stefan Duensing
- Section of Molecular Urooncology, Department of Urology, University of Heidelberg School of Medicine, Heidelberg, Germany
4. Azargoshasb S, Houwing KHM, Roos PR, van Leeuwen SI, Boonekamp M, Mazzone E, Bauwens K, Dell'Oglio P, van Leeuwen FWB, van Oosterom MN. Optical Navigation of the Drop-In γ-Probe as a Means to Strengthen the Connection Between Robot-Assisted and Radioguided Surgery. J Nucl Med 2021;62:1314-1317. PMID: 33419942. PMCID: PMC8882900. DOI: 10.2967/jnumed.120.259796.
Abstract
With translation of the Drop-In γ-probe, radioguidance has advanced into laparoscopic robot-assisted surgery. Global-positioning-system-like navigation can further enhance the symbiosis between nuclear medicine and surgery. Therefore, we developed a fluorescence-video-based tracking method that integrates the Drop-In with navigated robotic surgery. Methods: Fluorescent markers, integrated into the Drop-In, were automatically detected using a daVinci Firefly laparoscope. Subsequently, a declipseSPECT-navigation platform calculated the Drop-In location within the surgical field. Using a phantom (n = 3), we pursued robotic navigation on SPECT/CT, whereas intraoperative feasibility was validated during porcine surgery (n = 4). Results: Video-based tracking allowed for navigation of the Drop-In toward all lesions detected on SPECT/CT (external iliac and common iliac artery regions). Augmented-reality visualization in the surgical console indicated the distance to these lesions in real time, confirmed by the Drop-In readout. Porcine surgery underlined the feasibility of the concept. Conclusion: Optical navigation of the Drop-In probe provides a next step toward connecting nuclear medicine with robotic surgery.
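The real-time "distance to lesion" readout described above reduces, once probe and lesion are registered into a common frame, to a pose transform plus a Euclidean distance. A minimal sketch of that computation (function names, frames, and units are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def to_hom(points):
    """Append a homogeneous 1 to each 3D point (N x 3 -> N x 4)."""
    pts = np.atleast_2d(np.asarray(points, dtype=float))
    return np.hstack([pts, np.ones((pts.shape[0], 1))])

def probe_to_lesion_distance(T_cam_probe, tip_in_probe, lesion_in_cam):
    """Distance (same units as the inputs, e.g. mm) between the tracked probe
    tip and a lesion already registered into the camera/laparoscope frame.
    T_cam_probe is the 4x4 pose of the probe in the camera frame."""
    tip_cam = (T_cam_probe @ to_hom(tip_in_probe).T).T[0, :3]
    return float(np.linalg.norm(tip_cam - np.asarray(lesion_in_cam, dtype=float)))

# Toy example: probe frame translated by (10, 0, 0) mm, tip at probe origin,
# lesion at (10, 0, 5) mm in the camera frame -> distance is 5 mm
T = np.eye(4); T[:3, 3] = [10.0, 0.0, 0.0]
print(probe_to_lesion_distance(T, [0.0, 0.0, 0.0], [10.0, 0.0, 5.0]))  # → 5.0
```

In the paper's setup, `T_cam_probe` would come from the fluorescence-video tracking of the Drop-In markers and the lesion position from the SPECT/CT registration.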
Affiliation(s)
- Samaneh Azargoshasb
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Krijn H M Houwing
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Paul R Roos
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Sven I van Leeuwen
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Michael Boonekamp
- Instrumentele Zaken Ontwikkeling, Facilitair Bedrijf, Leiden University Medical Center, Leiden, The Netherlands
- Elio Mazzone
- Department of Urology and Division of Experimental Oncology, URI, Urological Research Institute, IRCCS San Raffaele Scientific Institute, Milan, Italy
- Orsi Academy, Melle, Belgium
- Paolo Dell'Oglio
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology and Division of Experimental Oncology, URI, Urological Research Institute, IRCCS San Raffaele Scientific Institute, Milan, Italy
- Department of Urology, ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Fijs W B van Leeuwen
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Orsi Academy, Melle, Belgium
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Matthias N van Oosterom
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
5. Laparoscopic augmented reality registration for oncological resection site repair. Int J Comput Assist Radiol Surg 2021;16:1577-1586. PMID: 33797689. PMCID: PMC8354909. DOI: 10.1007/s11548-021-02336-x.
Abstract
PURPOSE Resection site repair during laparoscopic oncological surgery (e.g. laparoscopic partial nephrectomy) poses some unique challenges and opportunities for augmented reality (AR) navigation support. This work introduces an AR registration workflow that addresses the time pressure that is present during resection site repair. METHODS We propose a two-step registration process: the AR content is registered as accurately as possible prior to the tumour resection (the primary registration). This accurate registration is used to apply artificial fiducials to the physical organ and the virtual model. After the resection, these fiducials can be used for rapid re-registration (the secondary registration). We tested this pipeline in a simulated-use study with [Formula: see text] participants. We compared the registration accuracy and speed for our method and for landmark-based registration as a reference. RESULTS Acquisition of and, thereby, registration with the artificial fiducials were significantly faster than the initial use of anatomical landmarks. Our method also had a trend to be more accurate in cases in which the primary registration was successful. The accuracy loss between the elaborate primary registration and the rapid secondary registration could be quantified with a mean target registration error increase of 2.35 mm. CONCLUSION This work introduces a registration pipeline for AR navigation support during laparoscopic resection site repair and provides a successful proof-of-concept evaluation thereof. Our results indicate that the concept is better suited than landmark-based registration during this phase, but further work is required to demonstrate clinical suitability and applicability.
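The accuracy loss above is quantified as a mean target registration error (TRE). As a reference for the metric (a standard definition, not code from the paper): TRE is the mean distance between target points mapped through the estimated transform and their known reference positions.

```python
import numpy as np

def mean_tre(T_est, targets_src, targets_ref):
    """Mean target registration error: average Euclidean distance between
    targets mapped through the estimated 4x4 rigid transform and their
    known reference positions (same units as the point coordinates)."""
    src = np.asarray(targets_src, dtype=float)
    ref = np.asarray(targets_ref, dtype=float)
    src_h = np.hstack([src, np.ones((src.shape[0], 1))])  # homogeneous coords
    mapped = (T_est @ src_h.T).T[:, :3]
    return float(np.linalg.norm(mapped - ref, axis=1).mean())

# Toy example: estimated transform off by a 1 mm translation in z -> TRE = 1 mm
T_est = np.eye(4); T_est[2, 3] = 1.0
pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
print(mean_tre(T_est, pts, pts))  # → 1.0
```

Crucially, the targets used for TRE must be points not used to estimate the registration, which is why it is reported separately from fiducial fitting error.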
6. Zaffino P, Moccia S, De Momi E, Spadea MF. A Review on Advances in Intra-operative Imaging for Surgery and Therapy: Imagining the Operating Room of the Future. Ann Biomed Eng 2020;48:2171-2191. PMID: 32601951. DOI: 10.1007/s10439-020-02553-6.
Abstract
With the advent of Minimally Invasive Surgery (MIS), intra-operative imaging has become crucial for surgery and therapy guidance, helping to compensate for the lack of information typical of MIS. This paper reviews the advancements in both classical (i.e. ultrasound, X-ray, optical coherence tomography and magnetic resonance imaging) and more recent (i.e. multispectral, photoacoustic and Raman imaging) intra-operative imaging modalities. Each imaging modality was analyzed, focusing on benefits and disadvantages in terms of compatibility with the operating room, costs, acquisition time and image characteristics. Tables are included to summarize this information. New generations of hybrid surgical rooms and algorithms for real-time/in-room image processing were also investigated. Each imaging modality has its own (site- and procedure-specific) peculiarities in terms of spatial and temporal resolution, field of view and contrasted tissues. Besides the benefits that each technique offers for guidance, considerations about operator and patient risk, costs, and extra time required for surgical procedures have to be taken into account. The current trend is to equip surgical rooms with multimodal imaging systems, so as to integrate multiple information sources for real-time data extraction and computer-assisted processing. The future of surgery is to enhance the surgeon's eye to minimize intra- and post-operative adverse events and to provide surgeons with all possible support to objectify and optimize the care-delivery process.
Affiliation(s)
- Paolo Zaffino
- Department of Experimental and Clinical Medicine, Università della Magna Graecia, Catanzaro, Italy
- Sara Moccia
- Department of Information Engineering (DII), Università Politecnica delle Marche, via Brecce Bianche 12, 60131 Ancona, Italy
- Elena De Momi
- Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano, Italy
- Maria Francesca Spadea
- Department of Experimental and Clinical Medicine, Università della Magna Graecia, Catanzaro, Italy
7. Luo H, Yin D, Zhang S, Xiao D, He B, Meng F, Zhang Y, Cai W, He S, Zhang W, Hu Q, Guo H, Liang S, Zhou S, Liu S, Sun L, Guo X, Fang C, Liu L, Jia F. Augmented reality navigation for liver resection with a stereoscopic laparoscope. Comput Methods Programs Biomed 2020;187:105099. PMID: 31601442. DOI: 10.1016/j.cmpb.2019.105099.
Abstract
OBJECTIVE Understanding the three-dimensional (3D) spatial position and orientation of vessels and tumor(s) is vital in laparoscopic liver resection procedures. Augmented reality (AR) techniques can help surgeons see the patient's internal anatomy in conjunction with laparoscopic video images. METHOD In this paper, we present an AR-assisted navigation system for liver resection based on a rigid stereoscopic laparoscope. The stereo image pairs from the laparoscope are used by an unsupervised convolutional neural network (CNN) framework to estimate depth and generate an intraoperative 3D liver surface. Meanwhile, 3D models of the patient's surgical field are segmented from preoperative CT images using a V-Net architecture for volumetric image data in an end-to-end predictive style. A globally optimal iterative closest point (Go-ICP) algorithm is adopted to register the pre- and intraoperative models into a unified coordinate space; then, the preoperative 3D models are superimposed on the live laparoscopic images to provide the surgeon with detailed information about the subsurface of the patient's anatomy, including tumors, their resection margins and vessels. RESULTS The proposed navigation system was tested on four ex vivo porcine livers in the laboratory and in five in vivo porcine experiments in the operating theatre to validate its accuracy. The ex vivo and in vivo reprojection errors (RPE) were 6.04 ± 1.85 mm and 8.73 ± 2.43 mm, respectively. CONCLUSION AND SIGNIFICANCE Both the qualitative and quantitative results indicate that our AR-assisted navigation system shows promise and has the potential to be highly useful in clinical practice.
Affiliation(s)
- Huoling Luo
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Dalong Yin
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Shugeng Zhang
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Deqiang Xiao
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Baochun He
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Fanzheng Meng
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Yanfang Zhang
- Department of Interventional Radiology, Shenzhen People's Hospital, Shenzhen, China
- Wei Cai
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Shenghao He
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Wenyu Zhang
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Qingmao Hu
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Hongrui Guo
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuhang Liang
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuo Zhou
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuxun Liu
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Linmao Sun
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Xiao Guo
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Chihua Fang
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Lianxin Liu
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Fucang Jia
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
8. Eppenga R, Kuhlmann K, Ruers T, Nijkamp J. Accuracy assessment of target tracking using two 5-degrees-of-freedom wireless transponders. Int J Comput Assist Radiol Surg 2019;15:369-377. PMID: 31724113. PMCID: PMC6989619. DOI: 10.1007/s11548-019-02088-9.
Abstract
Purpose Surgical navigation systems are generally only applied to targets in rigid areas. For non-rigid areas, real-time tumor tracking can be included to compensate for anatomical changes. The only clinically cleared system using a wireless electromagnetic tracking technique is the Calypso® System (Varian Medical Systems Inc., USA), designed for radiotherapy. It is limited to tracking at most three wireless 5-degrees-of-freedom (DOF) transponders, all used for tumor tracking. For surgical navigation, a surgical tool has to be tracked as well. In this study, we evaluated whether accurate 6DOF tumor tracking is possible using only two 5DOF transponders, leaving one transponder to track a tool. Methods Two methods were defined to derive 6DOF information from two 5DOF transponders. The first method uses the vector information of both transponders (TTV), and the second combines the vector information of one transponder with the distance vector between the transponders (OTV). The accuracy of tracking a rotating object was assessed for each method, mimicking clinically relevant and worst-case configurations. Accuracy was compared to using all three transponders to derive 6DOF (Default method). An optical tracking system was used as a reference for accuracy. Results The TTV method performed best and was as accurate as the Default method for almost all transponder configurations (median errors < 0.5°, 95% confidence interval < 3°). Only when the angle between the transponders was less than 2° was the TTV method inaccurate; in that case, the OTV method may be preferred. The accuracy of both methods was independent of the angle of rotation, and only the OTV method was sensitive to the plane of rotation. Conclusion These results indicate that accurate 6DOF tumor tracking is possible using only two 5DOF transponders. This encourages further development of a wireless EM surgical navigation approach using a readily available clinical system.
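One way to read the TTV idea above: each 5DOF transponder contributes a position plus a direction vector (no roll), and two non-parallel direction vectors suffice to fix a full 3D rotation, e.g. via Gram-Schmidt. A sketch under that interpretation (an illustrative reconstruction, not the authors' implementation); note the degenerate case mirrors the paper's finding that TTV breaks down when the transponder angle drops below about 2°:

```python
import numpy as np

def frame_from_two_vectors(v1, v2):
    """Build a right-handed orthonormal frame (3x3 rotation matrix) from two
    non-parallel direction vectors, e.g. the axes of two 5DOF transponders.
    Columns: x = v1 direction, y = part of v2 orthogonal to v1, z = x × y."""
    x = np.asarray(v1, dtype=float)
    x = x / np.linalg.norm(x)
    y = np.asarray(v2, dtype=float)
    y = y - (y @ x) * x  # Gram-Schmidt: remove the component along x
    n = np.linalg.norm(y)
    if n < 1e-9:
        raise ValueError("vectors are (near-)parallel: orientation is ambiguous")
    y = y / n
    z = np.cross(x, y)
    return np.column_stack([x, y, z])

R = frame_from_two_vectors([1.0, 0.0, 0.0], [1.0, 1.0, 0.0])
print(np.allclose(R, np.eye(3)))  # → True
```

The OTV variant would substitute the inter-transponder distance vector for `v2`, which is why only OTV is sensitive to the plane of rotation.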
Affiliation(s)
- Roeland Eppenga
- Department of Surgical Oncology, The Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam, The Netherlands
- Koert Kuhlmann
- Department of Surgical Oncology, The Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam, The Netherlands
- Theo Ruers
- Department of Surgical Oncology, The Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam, The Netherlands
- Nanobiophysics Group, Faculty TNW, University of Twente, Enschede, The Netherlands
- Jasper Nijkamp
- Department of Surgical Oncology, The Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam, The Netherlands
9. Computer-assisted surgery: virtual- and augmented-reality displays for navigation during urological interventions. Curr Opin Urol 2019;28:205-213. PMID: 29278582. DOI: 10.1097/mou.0000000000000478.
Abstract
PURPOSE OF REVIEW To provide an overview of the developments made for virtual- and augmented-reality navigation procedures in urological interventions/surgery. RECENT FINDINGS Navigation efforts have demonstrated potential in the field of urology by supporting guidance for various disorders. The navigation approaches differ between the individual indications, but seem interchangeable to a certain extent. An increasing number of pre- and intra-operative imaging modalities have been used to create detailed surgical roadmaps, namely: (cone-beam) computed tomography, MRI, ultrasound, and single-photon emission computed tomography. Registration of these surgical roadmaps with the real-life surgical view has occurred in different forms (e.g. electromagnetic, mechanical, vision, or near-infrared optical-based), whereby the combination of approaches was suggested to provide superior outcomes. Soft-tissue deformations demand the use of confirmatory interventional (imaging) modalities. This has resulted in the introduction of new intraoperative modalities such as drop-in US, transurethral US, (drop-in) gamma probes and fluorescence cameras. These noninvasive modalities provide an alternative to invasive technologies that expose the patients to X-ray doses. Whereas some reports have indicated that navigation setups provide equal or better results than conventional approaches, most trials have been performed in relatively small patient groups and clear follow-up data are missing. SUMMARY The reported computer-assisted surgery research concepts provide a glimpse into the future application of navigation technologies in the field of urology.
10. Zhang X, Wang J, Wang T, Ji X, Shen Y, Sun Z, Zhang X. A markerless automatic deformable registration framework for augmented reality navigation of laparoscopy partial nephrectomy. Int J Comput Assist Radiol Surg 2019;14:1285-1294. PMID: 31016562. DOI: 10.1007/s11548-019-01974-6.
Abstract
Purpose Video see-through augmented reality (VST-AR) navigation for laparoscopic partial nephrectomy (LPN) can enhance surgeons' intraoperative perception by visualizing surgical targets and critical structures of the kidney tissue. Image registration is the main challenge in the procedure. Existing registration methods in laparoscopic navigation systems suffer from limitations such as manual alignment, invasive external marker fixation, reliance on external tracking devices with bulky tracking sensors, and lack of deformation compensation. To address these issues, we present a markerless automatic deformable registration framework for LPN VST-AR navigation. METHOD Dense stereo matching and 3D reconstruction, automatic segmentation and surface stitching are combined to obtain a larger dense intraoperative point cloud of the renal surface. A coarse-to-fine deformable registration is performed to achieve a precise automatic registration between the intraoperative point cloud and the preoperative model, using the iterative closest point algorithm followed by the coherent point drift algorithm. Kidney phantom experiments and in vivo experiments were performed to evaluate the accuracy and effectiveness of our approach. RESULTS The average accuracy rate of the automatic segmentation was 94.9%. The mean target registration error of the phantom experiments was 1.28 ± 0.68 mm (root mean square error). In vivo experiments showed that the tumor location was identified successfully by superimposing the tumor model on the laparoscopic view. CONCLUSION Experimental results demonstrated that the proposed framework can accurately and automatically overlay comprehensive preoperative models on deformable soft organs in a VST-AR manner, without extra intraoperative imaging modalities or external tracking devices, and indicated its potential for clinical use.
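The coarse rigid stage of such a pipeline is typically ICP: alternate nearest-neighbour matching with a closed-form (Kabsch/SVD) pose update. A self-contained sketch of that alternation (illustrative only; the paper's deformable refinement with coherent point drift is not shown):

```python
import numpy as np

def kabsch(src, dst):
    """Closed-form least-squares rotation R and translation t such that
    dst ≈ src @ R.T + t, for paired point sets (N x 3)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Basic point-to-point ICP: match each src point to its nearest dst
    point, then update the rigid pose with Kabsch; repeat."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (fine for small clouds)
        idx = np.argmin(((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = kabsch(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

# Toy check: recover a small translation of a 3x3x3 grid cloud
g = np.array([[x, y, z] for x in range(3) for y in range(3) for z in range(3)], float)
aligned = icp(g + np.array([0.1, -0.05, 0.08]), g, iters=10)
print(np.abs(aligned - g).max() < 1e-6)  # → True
```

Like the paper's coarse step, this only recovers a rigid pose; a non-rigid method (CPD in the paper) is then needed to compensate for soft-tissue deformation.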
Affiliation(s)
- Xiaohui Zhang
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China
- Junchen Wang
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing, 100083, China
- Tianmiao Wang
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing, 100083, China
- Xuquan Ji
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China
- Yu Shen
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing, 100083, China
- Zhen Sun
- School of Mechanical Engineering and Automation, Beihang University, Beijing, 100191, China
- Xuebin Zhang
- Department of Urology, Peking Union Medical College Hospital, Peking Union Medical College and Chinese Academy of Medical Sciences, Beijing, 100730, China
11
|
Bertolo R, Hung A, Porpiglia F, Bove P, Schleicher M, Dasgupta P. Systematic review of augmented reality in urological interventions: the evidences of an impact on surgical outcomes are yet to come. World J Urol 2019; 38:2167-2176. [PMID: 30826888 DOI: 10.1007/s00345-019-02711-z] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2018] [Accepted: 02/26/2019] [Indexed: 01/12/2023] Open
Abstract
PURPOSE To perform a systematic literature review on the clinical impact of augmented reality (AR) for urological interventions. METHODS As of June 21, 2018, a systematic literature review was performed via the Medline, Embase and Cochrane databases in accordance with the PRISMA guidelines and registered at PROSPERO (CRD42018102194). Only full-text articles in English were included, without time restrictions. Articles were considered if they reported on the use of AR during a urological intervention and its impact on surgical outcomes. The risk of bias and the quality of each included study were independently assessed using the standard Cochrane Collaboration risk-of-bias tool and the Risk Of Bias In Non-randomised Studies of Interventions (ROBINS-I) tool. RESULTS 131 articles were identified. 102 remained after duplicate removal and were critically reviewed for evidence synthesis. 20 studies reporting on the outcomes of the use of AR during urological interventions in a clinical setting were considered. Given the mostly non-comparative design of the identified studies, the evidence synthesis was performed in a descriptive and narrative manner. Only one comparative study was found; the remaining 19 were single-arm observational studies. Based on the existing evidence, we are unable to state that AR improves the outcomes of urological interventions. The major limitation of AR-assisted surgery is inaccuracy in registration, which translates into poor navigation precision. CONCLUSIONS To date, there is limited evidence showing superior therapeutic benefit of AR-guided surgery compared with the conventional surgical approach to the respective disease.
Affiliation(s)
- Riccardo Bertolo
- Glickman Urological and Kidney Institute, Cleveland Clinic, 2050 E 96th St, Q Building, Cleveland, OH, 44195, USA
- Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy
- Andrew Hung
- Center for Robotic Simulation and Education, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Francesco Porpiglia
- Division of Urology, Department of Oncology, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Pierluigi Bove
- Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy
- Mary Schleicher
- Floyd D. Loop Alumni Library, Cleveland Clinic, Cleveland, OH, USA
|
12
|
Eppenga R, Kuhlmann K, Ruers T, Nijkamp J. Accuracy assessment of wireless transponder tracking in the operating room environment. Int J Comput Assist Radiol Surg 2018; 13:1937-1948. [PMID: 30099659 DOI: 10.1007/s11548-018-1838-z] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2018] [Accepted: 07/27/2018] [Indexed: 01/23/2023]
Abstract
PURPOSE To evaluate the applicability of the Calypso® wireless transponder tracking system (Varian Medical Systems Inc., USA) for real-time tumor motion tracking during surgical procedures on tumors in non-rigid target areas. An accuracy assessment was performed for an extended electromagnetic field of view (FoV) of 27.5 × 27.5 × 22.5 cm (which includes the standard FoV of 14 × 14 × 19 cm) in which 5DOF wireless Beacon® transponders can be tracked. METHODS Using a custom-made measurement setup, we assessed single-transponder relative accuracy, absolute accuracy and jitter throughout the extended FoV at 1440 locations, spaced 2.5 cm apart in each orthogonal direction. The NDI Polaris Spectra optical tracking system (OTS) was used as a reference. Measurements were taken in a room without surrounding distorting factors and repeated in an operating room (OR). In the OR, the influence of a carbon fiber and a regular stainless steel OR tabletop was investigated. RESULTS The calibration of the OTS and transponder system resulted in an average root-mean-square error (RMSE) vector of 0.03 cm. For both the standard and extended FoV, all accuracy measures depended on the transponder-to-tracking-array (TA) distance, and the absolute accuracy also depended on the TA-to-OR-tabletop distance. The latter influence was reproducible and, after calibration, the residual error was below 0.1 cm RMSE within the entire standard FoV. Within the extended FoV, this residual RMSE did not exceed 0.1 cm for transponder-to-TA distances up to 25 cm. CONCLUSION This study shows that transponder tracking is promising for accurate tumor tracking in the operating room. This applies when using the standard FoV, but also when using the extended FoV up to 25 cm above the TA, substantially increasing flexibility.
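The reproducible, distance-dependent bias described in the Results can in principle be removed with a simple calibration fit. The sketch below is illustrative only — the numbers are hypothetical, not from the study — and assumes the bias varies smoothly (here quadratically) with the TA-to-tabletop distance:

```python
import numpy as np

# Hypothetical measured bias (cm) of the absolute error at several
# tabletop-to-tracking-array distances (cm); values are illustrative only.
dist = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
bias = 0.0005 * dist**2 + 0.002 * dist

# Fit a quadratic bias model to the calibration measurements
coeff = np.polyfit(dist, bias, 2)

def corrected_error(d, measured_error):
    """Subtract the predicted distance-dependent bias from a measurement."""
    return measured_error - np.polyval(coeff, d)
```

After a calibration of this kind, what remains at any distance in the fitted range is only the non-reproducible error component, which the study reports as below 0.1 cm RMSE.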
Affiliation(s)
- Roeland Eppenga
- Department of Surgical Oncology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Koert Kuhlmann
- Department of Surgical Oncology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Theo Ruers
- Department of Surgical Oncology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Nanobiophysics Group, MIRA Institute, University of Twente, Enschede, The Netherlands
- Jasper Nijkamp
- Department of Surgical Oncology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Department of Surgery, The Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam, The Netherlands
|
13
|
Hettig J, Engelhardt S, Hansen C, Mistelbauer G. AR in VR: assessing surgical augmented reality visualizations in a steerable virtual reality environment. Int J Comput Assist Radiol Surg 2018; 13:1717-1725. [DOI: 10.1007/s11548-018-1825-4] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2018] [Accepted: 07/05/2018] [Indexed: 12/28/2022]
|
14
|
Robu MR, Ramalhinho J, Thompson S, Gurusamy K, Davidson B, Hawkes D, Stoyanov D, Clarkson MJ. Global rigid registration of CT to video in laparoscopic liver surgery. Int J Comput Assist Radiol Surg 2018; 13:947-956. [PMID: 29736801 PMCID: PMC5974008 DOI: 10.1007/s11548-018-1781-z] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2018] [Accepted: 04/27/2018] [Indexed: 11/09/2022]
Abstract
PURPOSE Image-guidance systems have the potential to aid in laparoscopic interventions by providing sub-surface structure information and tumour localisation. The registration of a preoperative 3D image with the intraoperative laparoscopic video feed is an important component of image guidance, which should be fast, robust and cause minimal disruption to the surgical procedure. Most methods for rigid and non-rigid registration require a good initial alignment. However, in most research systems for abdominal surgery, the user has to manually rotate and translate the models, which is usually difficult to perform quickly and intuitively. METHODS We propose a fast, global method for the initial rigid alignment between a 3D mesh derived from a preoperative CT of the liver and a surface reconstruction of the intraoperative scene. We formulate the shape matching problem as a quadratic assignment problem which minimises the dissimilarity between feature descriptors while enforcing geometrical consistency between all the feature points. We incorporate a novel constraint based on the liver contours which deals specifically with the challenges introduced by laparoscopic data. RESULTS We validate the proposed method on synthetic data, on a liver phantom and on retrospective clinical data acquired during a laparoscopic liver resection. We show robustness to reduced partial-surface coverage and to increasing levels of deformation. Our results on the phantom and on the real data show good initial alignment, which can successfully converge to the correct position using fine alignment techniques. Furthermore, since we can pre-process the CT scan before surgery, the proposed method runs faster than current algorithms. CONCLUSION The proposed shape matching method can provide a fast, global initial registration, which can be further refined by fine alignment methods. This approach will lead to a more usable and intuitive image-guidance system for laparoscopic liver surgery.
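The quadratic assignment formulation above — descriptor dissimilarity plus a pairwise geometrical-consistency term — can be illustrated with a brute-force toy version. This is a sketch under assumptions, not the paper's solver (which scales to real surface data and adds the liver-contour constraint); exhaustive search over permutations is only tractable for a handful of feature points.

```python
import numpy as np
from itertools import permutations

def qap_match(feat_p, feat_q, pts_p, pts_q, lam=1.0):
    """Match points P to Q by minimising descriptor dissimilarity plus
    pairwise distance inconsistency (a toy quadratic assignment)."""
    # pairwise distance matrices within each point set
    dp = np.linalg.norm(pts_p[:, None] - pts_p[None, :], axis=2)
    dq = np.linalg.norm(pts_q[:, None] - pts_q[None, :], axis=2)
    best, best_cost = None, np.inf
    for perm in permutations(range(len(pts_p))):
        a = list(perm)
        # unary term: how well each descriptor matches its assigned partner
        unary = np.linalg.norm(feat_p - feat_q[a], axis=1).sum()
        # quadratic term: rigid motions preserve pairwise distances,
        # so a geometrically consistent assignment keeps this small
        pairwise = np.abs(dp - dq[np.ix_(a, a)]).sum()
        cost = unary + lam * pairwise
        if cost < best_cost:
            best, best_cost = a, cost
    return best
```

Because pairwise distances are invariant under rigid motion, the geometric term alone rules out assignments that descriptors cannot distinguish, which is the essence of the formulation.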
Affiliation(s)
- Maria R Robu
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
- João Ramalhinho
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
- Stephen Thompson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
- Kurinchi Gurusamy
- Division of Surgery and Interventional Science, University College London, London, UK
- Brian Davidson
- Division of Surgery and Interventional Science, University College London, London, UK
- David Hawkes
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
- Matthew J Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Centre for Medical Image Computing, University College London, London, UK
|
15
|
van Oosterom M, den Houting D, van de Velde C, van Leeuwen F. Navigating surgical fluorescence cameras using near-infrared optical tracking. J Biomed Opt 2018; 23:1-10. [PMID: 29745131 DOI: 10.1117/1.jbo.23.5.056003] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/07/2017] [Accepted: 04/09/2018] [Indexed: 05/24/2023]
Abstract
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera within reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. To support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one light-emitting diode setup simulating an OTS. Following the outcome of these measurements, OTS settings were optimized. The measurements indicated that the OTS interference was caused by: (1) spectral overlap between the OTS light and the camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimizing OTS and camera compatibility can thus support navigated fluorescence guidance concepts.
Affiliation(s)
- Fijs van Leeuwen
- Leiden Univ. Medical Ctr., Netherlands
- The Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Netherlands
|
16
|
Edgcumbe P, Singla R, Pratt P, Schneider C, Nguan C, Rohling R. Follow the light: projector-based augmented reality intracorporeal system for laparoscopic surgery. J Med Imaging (Bellingham) 2018; 5:021216. [PMID: 29487888 DOI: 10.1117/1.jmi.5.2.021216] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2017] [Accepted: 01/22/2018] [Indexed: 01/20/2023] Open
Abstract
A projector-based augmented reality intracorporeal system (PARIS) is presented that includes a miniature tracked projector, a tracked marker, and a laparoscopic ultrasound (LUS) transducer. PARIS was developed to improve the efficacy and safety of laparoscopic partial nephrectomy (LPN). In particular, it has been demonstrated to effectively assist in the identification of tumor boundaries during surgery and to improve the surgeon's understanding of the underlying anatomy. PARIS achieves this by displaying the orthographic projection of the cancerous tumor on the kidney's surface. The performance of PARIS was evaluated in a user study with two surgeons who performed 32 simulated robot-assisted partial nephrectomies: 16 with PARIS for guidance and 16 with only an LUS transducer for guidance. With PARIS, there was a significant reduction [30% ([Formula: see text])] in the amount of healthy tissue excised, and a trend toward a more accurate dissection around the tumor and more negative margins. The combined point-tracking and reprojection root-mean-square error of PARIS was 0.8 mm. PARIS's proven ability to improve key metrics of LPN surgery and qualitative feedback from surgeons support the hypothesis that it is an effective surgical navigation tool.
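The key rendering step described above — displaying the orthographic projection of the tumor on the kidney's surface — amounts to flattening the tumor model along the projector's view direction. A minimal geometric sketch follows; the local-plane simplification of the kidney surface and all names here are our assumptions, not the PARIS implementation:

```python
import numpy as np

def orthographic_outline(points, plane_point, plane_normal):
    """Project 3D tumor-model points along `plane_normal` onto the plane
    through `plane_point` (approximating the kidney surface locally)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    depth = (points - plane_point) @ n   # signed distance to the plane
    return points - depth[:, None] * n   # drop the depth component
```

The projected outline is what the tracked projector would draw onto the organ, letting the surgeon see the sub-surface tumor boundary at its surface location.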
Affiliation(s)
- Philip Edgcumbe
- University of British Columbia, MD/PhD Program, Vancouver, Canada
- Rohit Singla
- University of British Columbia, Department of Electrical and Computer Engineering, Vancouver, Canada
- Philip Pratt
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Caitlin Schneider
- University of British Columbia, Department of Electrical and Computer Engineering, Vancouver, Canada
- Christopher Nguan
- University of British Columbia, Department of Urological Sciences, Vancouver, Canada
- Robert Rohling
- University of British Columbia, Department of Electrical and Computer Engineering, Vancouver, Canada
- University of British Columbia, Department of Mechanical Engineering, Vancouver, Canada
|
17
|
Singla R, Edgcumbe P, Pratt P, Nguan C, Rohling R. Intra-operative ultrasound-based augmented reality guidance for laparoscopic surgery. Healthc Technol Lett 2017; 4:204-209. [PMID: 29184666 PMCID: PMC5683195 DOI: 10.1049/htl.2017.0063] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/24/2017] [Accepted: 07/28/2017] [Indexed: 01/20/2023] Open
Abstract
In laparoscopic surgery, the surgeon must operate with a limited field of view and reduced depth perception. This makes spatial understanding of critical structures, such as an endophytic tumour in a partial nephrectomy, difficult. Such tumours yield a high complication rate of 47%, and excising them increases the risk of cutting into the kidney's collecting system. To overcome these challenges, an augmented reality guidance system is proposed. Using intra-operative ultrasound, a single navigation aid, and surgical instrument tracking, four augmentations of guidance information are provided during tumour excision. Qualitative and quantitative system benefits are measured in simulated robot-assisted partial nephrectomies. Robot-to-camera calibration achieved a total registration error of 1.0 ± 0.4 mm, while the total system error is 2.5 ± 0.5 mm. The system significantly reduced the healthy tissue excised from an average (± standard deviation) of 30.6 ± 5.5 to 17.5 ± 2.4 cm3 (p < 0.05) and reduced the depth from the tumour underside to the cut from an average (± standard deviation) of 10.2 ± 4.1 to 3.3 ± 2.3 mm (p < 0.05). Further evaluation is required in vivo, but the system has promising potential to reduce the amount of healthy parenchymal tissue excised.
Affiliation(s)
- Rohit Singla
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada V6T 1Z4
- Philip Edgcumbe
- MD/PhD Program, University of British Columbia, Vancouver, Canada V6T 1Z4
- Philip Pratt
- Department of Surgery and Cancer, Imperial College London, UK, SW7 2BX
- Christopher Nguan
- Department of Urological Sciences, University of British Columbia, Vancouver, Canada V6T 1Z4
- Robert Rohling
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada V6T 1Z4
- Department of Mechanical Engineering, University of British Columbia, Vancouver, Canada V6T 1Z4
|
18
|
Detmer FJ, Hettig J, Schindele D, Schostak M, Hansen C. Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review. IEEE Rev Biomed Eng 2017; 10:78-94. [PMID: 28885161 DOI: 10.1109/rbme.2017.2749527] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
PURPOSE Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. METHODS A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems developed or evaluated solely for training purposes. RESULTS In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. Main challenges remaining for future research include the consideration of organ movement and deformation, human-factor issues, and the conduct of large clinical studies. CONCLUSION Augmented and virtual reality systems have the potential to improve the safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with current limitations of virtual and augmented reality assistance in clinical environments.
|
19
|
Recent Development of Augmented Reality in Surgery: A Review. J Healthc Eng 2017; 2017:4574172. [PMID: 29065604 PMCID: PMC5585624 DOI: 10.1155/2017/4574172] [Citation(s) in RCA: 156] [Impact Index Per Article: 22.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/12/2017] [Accepted: 07/03/2017] [Indexed: 12/11/2022]
Abstract
Introduction The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency and safety, reduce cost, and enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. Methods We performed a review of available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms “augmented reality” and “surgery.” Results The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. By reading the abstracts, 91 relevant studies were chosen for inclusion; 11 further references were gathered by cross-referencing, for a total of 102 studies included in this review. Conclusions The present literature suggests an increasing interest of surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality can be implemented in routine practice.
|