1
Cannon PC, Setia SA, Klein-Gardner S, Kavoussi NL, Webster RJ, Herrell SD. Are 3D Image Guidance Systems Ready for Use? A Comparative Analysis of 3D Image Guidance Implementations in Minimally Invasive Partial Nephrectomy. J Endourol 2024; 38:395-407. [PMID: 38251637] [PMCID: PMC10979686] [DOI: 10.1089/end.2023.0059]
Abstract
Introduction: Three-dimensional image-guided surgical (3D-IGS) systems for minimally invasive partial nephrectomy (MIPN) can potentially improve the efficiency and accuracy of intraoperative anatomical localization and tumor resection. This review analyzes the current state of research on 3D-IGS, including the evaluation of clinical outcomes, system functionality, and qualitative insights into the impact of 3D-IGS on surgical procedures. Methods: We systematically reviewed the clinical literature on 3D-IGS deployed for MIPN. For inclusion, studies had to produce a patient-specific 3D anatomical model from two-dimensional imaging. Data extracted from the studies included clinical results, the registration method used (alignment of the 3D model to the surgical scene), limitations, and the data types reported. A subset of studies was qualitatively analyzed through an inductive coding approach to identify major themes and subthemes. Results: Twenty-five studies were included in the review. Eight (32%) reported clinical results suggesting that 3D-IGS improves multiple surgical outcomes. Manual registration was the most commonly used method (48%). Soft tissue deformation was the most frequently cited limitation. Many studies reported qualitative statements about improved surgeon accuracy, but quantitative surgeon accuracy data were not reported. In the qualitative analysis, six major themes emerged across the nine applicable studies: 3D-IGS is necessary, 3D-IGS improved surgical outcomes, researcher/surgeon confidence in the 3D-IGS system, enhanced surgeon ability/accuracy, anatomical explanation for qualitative assessment, and claims without data or references to support them. Conclusions: Currently, clinical outcomes are the main source of quantitative data supporting the efficacy of 3D-IGS. However, the literature qualitatively suggests the benefit of accurate 3D-IGS for robotic partial nephrectomy.
Affiliation(s)
- Piper C. Cannon
- Department of Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Shaan A. Setia
- Department of Urologic Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Stacy Klein-Gardner
- Department of Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Nicholas L. Kavoussi
- Department of Urologic Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Robert J. Webster
- Department of Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- S. Duke Herrell
- Department of Urologic Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
2
Makiyama K, Komeya M, Tatenuma T, Noguchi G, Ohtake S. Patient-specific simulations and navigation systems for partial nephrectomy. Int J Urol 2023; 30:1087-1095. [PMID: 37622340] [DOI: 10.1111/iju.15287]
Abstract
Partial nephrectomy (PN) is the standard treatment for T1 renal cell carcinoma. PN is affected more by surgical variation and requires greater surgical experience than radical nephrectomy. Patient-specific simulations and navigation systems may help to reduce the surgical experience required for PN. Recent advances in three-dimensional (3D) virtual reality (VR) imaging and 3D printing technology have enabled accurate patient-specific simulations and navigation systems. We reviewed previous studies of patient-specific simulations and navigation systems for PN. Image reconstruction technology has advanced in recent years, and commercial software that converts two-dimensional images into 3D images has become available, so many urologists can now view 3DVR images when preparing for PN. Surgical simulations based on 3DVR images can change surgical plans, improve surgical outcomes, and are useful during patient consultations. Patient-specific simulators capable of simulating surgical procedures, the gold-standard form of patient-specific simulation, have also been reported. Besides VR, 3D printing is also useful for conveying patient-specific information, and some studies have reported simulation and navigation systems for PN based on solid 3D models. Patient-specific simulations are a form of preoperative preparation, whereas patient-specific navigation is used intraoperatively. Navigation-assisted PN procedures using 3DVR images have become increasingly common, especially in robotic surgery, and some studies found that these systems improved surgical outcomes. Once its accuracy has been confirmed, this technology is expected to spread further and become more widely adopted.
Affiliation(s)
- Kazuhide Makiyama
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Mitsuru Komeya
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Tomoyuki Tatenuma
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Go Noguchi
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Shinji Ohtake
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
3
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681] [DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization falls into two categories, in situ visualization and non-in situ visualization, and the rendered content varies widely. Registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. Tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. We also describe the applications of AR across surgical fields. However, most AR applications have been evaluated only through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in surgery, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
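As background on the point-based registration methods this review surveys, the standard rigid alignment of corresponding fiducial points can be sketched as below. This is an illustrative Kabsch/Umeyama-style least-squares solution, not code from any reviewed system; the function names are our own.

```python
import numpy as np

def point_based_registration(fixed, moving):
    """Rigid point-based registration (Kabsch/Umeyama method).

    Given corresponding 3D points (rows of (N, 3) arrays), find the
    rotation R and translation t minimizing || R @ m_i + t - f_i ||.
    """
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)
    # Centre both point sets on their centroids.
    cf, cm = fixed.mean(axis=0), moving.mean(axis=0)
    F, M = fixed - cf, moving - cm
    # SVD of the 3x3 cross-covariance matrix yields the optimal rotation.
    U, _, Vt = np.linalg.svd(M.T @ F)
    # Guard against a reflection (determinant -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cf - R @ cm
    return R, t

def fiducial_registration_error(fixed, moving, R, t):
    """RMS distance between registered moving points and fixed points."""
    fixed = np.asarray(fixed, dtype=float)
    aligned = (R @ np.asarray(moving, dtype=float).T).T + t
    return float(np.sqrt(((aligned - fixed) ** 2).sum(axis=1).mean()))
```

In an AR navigation context, `moving` would be fiducial positions in the preoperative image space and `fixed` the same fiducials localized intraoperatively; the residual is the fiducial registration error reported in accuracy studies.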
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
4
A survey of augmented reality methods to guide minimally invasive partial nephrectomy. World J Urol 2023; 41:335-343. [PMID: 35776173] [DOI: 10.1007/s00345-022-04078-0]
Abstract
INTRODUCTION Minimally invasive partial nephrectomy (MIPN) has become the standard of care for localized kidney tumors over the past decade. The characteristics of each tumor, in particular its size and its relationship to the excretory tract and vessels, allow surgeons to judge its complexity and attempt to predict the risk of complications. The recent development of virtual 3D model reconstruction and computer vision has opened the way to image-guided surgery and augmented reality (AR). OBJECTIVE Our objective was to perform a systematic review listing and describing the different AR techniques proposed to support partial nephrectomy. MATERIALS AND METHODS The systematic review of the literature was performed on 12/04/22, using the keywords "nephrectomy" and "augmented reality" on Embase and Medline. Articles were considered if they reported surgical outcomes when using AR with virtual image overlay on real vision during ex vivo or in vivo MIPN. We classified them according to the registration technique used. RESULTS We found 16 articles describing an AR technique during MIPN procedures that met the eligibility criteria. A moderate to high risk of bias was recorded for all the studies. We classified registration methods into three main families, of which the most promising appears to be surface-based registration. CONCLUSION Despite promising results, no studies yet show an improvement in clinical outcomes with AR. The ideal AR technique is probably yet to be established, as several designs are still being actively explored. More clinical data will be required to establish the potential contribution of this technology to MIPN.
5
Wu B, Liu P, Xiong C, Li C, Zhang F, Shen S, Shao P, Yao P, Niu C, Xu R. Stereotactic co-axial projection imaging for augmented reality neuronavigation: a proof-of-concept study. Quant Imaging Med Surg 2022; 12:3792-3802. [PMID: 35782260] [PMCID: PMC9246757] [DOI: 10.21037/qims-21-1144]
Abstract
BACKGROUND Lack of intuitiveness and poor hand-eye coordination present a major technical challenge in neurosurgical navigation. METHODS We developed an integrated dexterous stereotactic co-axial projection imaging (sCPI) system featuring orthotopic image projection for augmented reality (AR) neurosurgical navigation. The performance characteristics of the sCPI system, including projection resolution and navigation accuracy, were quantitatively verified. The resolution of the sCPI was tested with a USAF 1951 resolution test chart. The stereotactic navigation accuracy of the sCPI was measured using a calibration panel with a 7×7 circle-array pattern. In benchtop validation, the navigation accuracy of the sCPI and the BrainLab Kick Navigation Station was compared using a skull phantom with 8 intracranial targets. Finally, we demonstrated the potential clinical application of sCPI in a clinical trial. RESULTS The resolution test showed that the resolution of the sCPI was 1.3 mm. In the stereotactic navigation accuracy test, the maximum and minimum errors of the sCPI were 2.9 and 0.3 mm, respectively, and the mean error was 1.5 mm. The test also showed that the navigation error of the sCPI increased with the pitch and yaw angle, but there was no obvious difference in navigation error between yaw directions, indicating that the error was unbiased across directions. The benchtop validation showed that the average navigation errors for the sCPI system and the Kick Navigation Station were 1.4±0.8 and 1.8±0.7 mm, the medians were 1.3 and 1.9 mm, and the average preparation times were 3 min 24 sec and 6 min 8 sec, respectively. The clinical feasibility of sCPI-assisted neurosurgical navigation was demonstrated in a clinical study. Compared with the BrainLab device, the sCPI system required less preoperative preparation time and enhanced the clinician experience in intraoperative visualization and navigation. CONCLUSIONS The sCPI technique can potentially be used in many surgical applications for intuitive visualization of medical information and intraoperative guidance of surgical trajectories.
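The accuracy figures above (mean ± SD, median, min/max error over a set of targets) follow the standard way navigation accuracy is summarized. A minimal sketch of how such statistics are computed from per-target localization errors (illustrative only, not code from the study; the function name is our own):

```python
import numpy as np

def navigation_error_stats(measured, reference):
    """Summary statistics for per-target navigation error, as commonly
    reported in accuracy studies (mean, sample SD, median, min, max).

    measured, reference: (N, 3) arrays of localized vs. ground-truth
    target positions, in millimetres.
    """
    errors = np.linalg.norm(
        np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float),
        axis=1)  # Euclidean distance per target
    return {
        "mean": float(errors.mean()),
        "sd": float(errors.std(ddof=1)),  # sample SD, as in "1.4 ± 0.8 mm"
        "median": float(np.median(errors)),
        "min": float(errors.min()),
        "max": float(errors.max()),
    }
```

For the benchtop validation described above, `measured` would hold the 8 target positions indicated by the navigation system and `reference` their ground-truth phantom coordinates.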
Affiliation(s)
- Bingxuan Wu
- Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Peng Liu
- Suzhou Institute for Advanced Research, University of Science and Technology of China, Suzhou, China
- Chi Xiong
- Department of Neurosurgery, The First Affiliated Hospital of USTC, Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China
- Chenmeng Li
- Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Fan Zhang
- Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Shuwei Shen
- Suzhou Institute for Advanced Research, University of Science and Technology of China, Suzhou, China
- Pengfei Shao
- Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Peng Yao
- Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Chaoshi Niu
- Department of Neurosurgery, The First Affiliated Hospital of USTC, Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China
- Ronald Xu
- Suzhou Institute for Advanced Research, University of Science and Technology of China, Suzhou, China
6
Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022; 74:528-537. [PMID: 35383432] [DOI: 10.23736/s2724-6051.22.04726-7]
Abstract
INTRODUCTION Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intraoperative images onto the operative field. AR has been used increasingly across surgical specialties, including urology. This study reviews advances in the use of AR to improve urologic outcomes. EVIDENCE ACQUISITION We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology through March 2021. The MEDLINE, Scopus, and Web of Science databases were searched. Study selection followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. We limited included studies to those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS A total of 60 studies were identified and included in the present analysis. Overall, 19 studies were descriptive/validity/phantom studies of specific AR methodologies, 4 were case reports, and 37 were clinical prospective/retrospective comparative studies. CONCLUSIONS Advances in AR have led to increasing registration accuracy as well as an increased ability to identify anatomic landmarks and improve outcomes during urologic procedures such as robot-assisted radical prostatectomy (RARP) and robot-assisted partial nephrectomy.
Affiliation(s)
- Sidney Roberts
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Aditya Desai
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Enrico Checcucci
- School of Medicine, Division of Urology, Department of Oncology, San Luigi Hospital, University of Turin, Orbassano, Turin, Italy; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Stefano Puliatti
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, University of Modena and Reggio Emilia, Modena, Italy; Department of Urology, OLV, Aalst, Belgium; ORSI Academy, Melle, Belgium
- Mark Taratkin
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russia
- Karl-Friedrich Kowalewski
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Virgen Macarena University Hospital, Seville, Spain; Department of Urology and Urosurgery, University Hospital of Mannheim, Mannheim, Germany
- Juan Gomez Rivas
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Clinico San Carlos University Hospital, Madrid, Spain
- Ines Rivero
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology and Nephrology, Virgen del Rocío University Hospital, Seville, Spain
- Domenico Veneziano
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Riuniti Hospital, Reggio Calabria, Reggio Calabria, Italy
- Francesco Porpiglia
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Inderbir S Gill
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Giovanni E Cacciamani
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA; Keck School of Medicine, Department of Radiology, University of Southern California, Los Angeles, CA, USA
7
von Haxthausen F, Böttger S, Wulff D, Hagenah J, García-Vázquez V, Ipsen S. Medical Robotics for Ultrasound Imaging: Current Systems and Future Trends. Curr Robot Rep 2021; 2:55-71. [PMID: 34977593] [PMCID: PMC7898497] [DOI: 10.1007/s43154-020-00037-y]
Abstract
Purpose of Review
This review provides an overview of the most recent robotic ultrasound systems that have emerged over the past five years, highlighting their status and future directions. The systems are categorized based on their level of robot autonomy (LORA).
Recent Findings
Teleoperating systems show the highest level of technical maturity. Collaborative assisting and autonomous systems are still in the research phase, with a focus on ultrasound image processing and force adaptation strategies. However, clinical studies and appropriate safety strategies are still lacking. Future research will likely focus on artificial intelligence and virtual/augmented reality to improve image understanding and ergonomics.
Summary
A review of robotic ultrasound systems is presented, in which technical specifications are first outlined. The literature of the past five years is then subdivided into teleoperation, collaborative assistance, and autonomous systems based on LORA. Finally, future trends for robotic ultrasound systems are reviewed, with a focus on artificial intelligence and virtual/augmented reality.
Affiliation(s)
- Felix von Haxthausen
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Sven Böttger
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Daniel Wulff
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Jannis Hagenah
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Verónica García-Vázquez
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Svenja Ipsen
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
8
Qian L, Wu JY, DiMaio SP, Navab N, Kazanzides P. A Review of Augmented Reality in Robotic-Assisted Surgery. IEEE Trans Med Robot Bionics 2020. [DOI: 10.1109/tmrb.2019.2957061]
9
Joeres F, Schindele D, Luz M, Blaschke S, Russwinkel N, Schostak M, Hansen C. How well do software assistants for minimally invasive partial nephrectomy meet surgeon information needs? A cognitive task analysis and literature review study. PLoS One 2019; 14:e0219920. [PMID: 31318919] [PMCID: PMC6638947] [DOI: 10.1371/journal.pone.0219920]
Abstract
INTRODUCTION Intraoperative software assistance is gaining importance in laparoscopic and robot-assisted surgery. Within the user-centred development process of such systems, the first question to ask is: what information does the surgeon need, and when does he or she need it? In this article, we present an approach to investigating these surgeon information needs for minimally invasive partial nephrectomy and compare them to the relevant surgical computer assistance literature. MATERIALS AND METHODS First, we conducted a literature-based hierarchical task analysis of the surgical procedure. This task analysis formed the basis of a qualitative in-depth interview study with nine experienced surgical urologists, which employed a cognitive task analysis method to elicit surgeons' information needs during minimally invasive partial nephrectomy. Finally, a systematic literature search was conducted to review proposed software assistance solutions for minimally invasive partial nephrectomy, focusing on what information the solutions present to the surgeon and what phase of the surgery they aim to support. RESULTS The task analysis yielded a workflow description for minimally invasive partial nephrectomy. In the subsequent interview study, we identified three challenging phases of the procedure that may particularly benefit from software assistance: I. hilar and vascular management, II. tumour excision, and III. repair of the renal defects. Across these phases, 25 individual challenges were found that define the surgeon information needs. The literature review identified 34 relevant publications, all of which aim to support the surgeon in hilar and vascular management (phase I) or tumour excision (phase II). CONCLUSION The work presented in this article identified unmet surgeon information needs in minimally invasive partial nephrectomy. Our results suggest that future solutions should address the repair of renal defects (phase III) and put more focus on the renal collecting system as a critical anatomical structure.
Affiliation(s)
- Fabian Joeres
- Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Daniel Schindele
- Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Maria Luz
- Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Simon Blaschke
- Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Nele Russwinkel
- Department of Cognitive Modelling in Dynamic Human-Machine Systems, Technische Universität Berlin, Berlin, Germany
- Martin Schostak
- Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Christian Hansen
- Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
10
Camara M, Mayer E, Darzi A, Pratt P. Intraoperative ultrasound for improved 3D tumour reconstruction in robot-assisted surgery: An evaluation of feedback modalities. Int J Med Robot 2018; 15:e1973. [PMID: 30485641] [DOI: 10.1002/rcs.1973]
Abstract
BACKGROUND Intraoperative ultrasound scanning deforms the tissue in the absence of a feedback modality, resulting in a 3D tumour reconstruction that does not directly represent the real anatomy. METHODS A biomechanical model with different feedback modalities (haptic, visual, or auditory) was implemented in a simulation environment. A user study with 20 clinicians assessed which modality produced the 3D tumour volume reconstruction most closely resembling the reference configuration from the respective computed tomography (CT) scans. RESULTS Integrating a feedback modality significantly improved scanning performance across all participants and data sets. The optimal feedback modality varied depending on the evaluation; nonetheless, guidance with any feedback was always preferable to none. CONCLUSIONS The results demonstrate the need to integrate a feedback-modality framework into clinical practice to improve scanning performance. Furthermore, this framework enabled an evaluation that cannot be performed in vivo.
Affiliation(s)
- Mafalda Camara
- Department of Surgery and Cancer, Imperial College London, United Kingdom
- Erik Mayer
- Department of Surgery and Cancer, Imperial College London, United Kingdom
- Ara Darzi
- Department of Surgery and Cancer, Imperial College London, United Kingdom
- Philip Pratt
- Department of Surgery and Cancer, Imperial College London, United Kingdom