1
Doornbos MCJ, Peek JJ, Maat APWM, Ruurda JP, De Backer P, Cornelissen BMW, Mahtab EAF, Sadeghi AH, Kluin J. Augmented Reality Implementation in Minimally Invasive Surgery for Future Application in Pulmonary Surgery: A Systematic Review. Surg Innov 2024; 31:646-658. [PMID: 39370802] [PMCID: PMC11475712] [DOI: 10.1177/15533506241290412]
Abstract
OBJECTIVE This systematic review investigates Augmented Reality (AR) systems used in minimally invasive surgery of deformable organs, focusing on initial registration, dynamic tracking, and visualization. The objective is to acquire a comprehensive understanding of the knowledge, applications, and challenges associated with existing AR techniques, and to leverage these insights for developing a dedicated AR workflow for pulmonary Video- or Robot-Assisted Thoracic Surgery (VATS/RATS). METHODS A systematic search was conducted within Embase, Medline (Ovid) and Web of Science on April 16, 2024, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The search focused on intraoperative AR applications and intraoperative navigational purposes for deformable organs. Quality assessment was performed and studies were categorized according to their initial registration and dynamic tracking methods. RESULTS 33 articles were included, of which one involved pulmonary surgery. Studies used both manual and (semi-)automatic registration methods, established through anatomical landmark-based, fiducial-based, or surface-based techniques. Diverse outcome measures were considered, including surgical outcomes and registration accuracy. The majority of studies that reached a registration accuracy below 5 mm applied surface-based registration. CONCLUSIONS AR can potentially aid surgeons with real-time navigation and decision making during anatomically complex minimally invasive procedures. Future research for pulmonary applications should focus on exploring surface-based registration methods, given their non-invasive, marker-less nature and promising accuracy. Vascular-labeling-based methods are also worth exploring, given the importance and relative stability of broncho-vascular anatomy in pulmonary VATS/RATS. Assessing the clinical feasibility of these approaches is crucial, particularly concerning registration accuracy and potential impact on surgical outcomes.
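The landmark- and fiducial-based registration families discussed in this abstract reduce to the same core computation: a closed-form least-squares rigid transform between paired 3D points, whose residual is the registration error reported in millimetres. As a generic illustration (not code from any reviewed study), a minimal NumPy sketch of that step could look like:

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid transform (Kabsch/Procrustes) mapping paired
    landmarks `moving` -> `fixed`; both are (N, 3) arrays."""
    mu_f, mu_m = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - mu_m).T @ (fixed - mu_f)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                              # rotation, det = +1
    t = mu_f - R @ mu_m                             # translation
    return R, t

def registration_error_mm(fixed, moving, R, t):
    """Root-mean-square fiducial registration error, in the units of the
    input coordinates (mm if landmarks are given in mm)."""
    residual = fixed - (moving @ R.T + t)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

# Toy example: recover a known rotation + translation from 6 landmarks.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 100.0, size=(6, 3))          # landmarks in mm
theta = np.deg2rad(20.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
fixed = pts @ R_true.T + t_true
R, t = rigid_register(fixed, pts)
err = registration_error_mm(fixed, pts, R, t)
```

For noise-free landmarks, as in this toy example, the residual is essentially zero; with real intraoperative data, the root-mean-square residual is the kind of registration-accuracy figure (e.g., below 5 mm) that studies in this review report.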
Affiliation(s)
- Marie-Claire J. Doornbos
- Department of Cardiothoracic Surgery, Thoraxcenter, Erasmus MC, Rotterdam, The Netherlands
- Educational Program Technical Medicine, Leiden University Medical Center, Delft University of Technology & Erasmus University Medical Center Rotterdam, Leiden, The Netherlands
- Jette J. Peek
- Department of Cardiothoracic Surgery, Thoraxcenter, Erasmus MC, Rotterdam, The Netherlands
- Jelle P. Ruurda
- Department of Surgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Edris A. F. Mahtab
- Department of Cardiothoracic Surgery, Thoraxcenter, Erasmus MC, Rotterdam, The Netherlands
- Department of Cardiothoracic Surgery, Leiden University Medical Center, Leiden, The Netherlands
- Amir H. Sadeghi
- Department of Cardiothoracic Surgery, Thoraxcenter, Erasmus MC, Rotterdam, The Netherlands
- Department of Cardiothoracic Surgery, University Medical Center Utrecht, The Netherlands
- Jolanda Kluin
- Department of Cardiothoracic Surgery, Thoraxcenter, Erasmus MC, Rotterdam, The Netherlands
2
Altunhan A, Soyturk S, Guldibi F, Tozsin A, Aydın A, Aydın A, Sarica K, Guven S, Ahmed K. Artificial intelligence in urolithiasis: a systematic review of utilization and effectiveness. World J Urol 2024; 42:579. [PMID: 39417840] [DOI: 10.1007/s00345-024-05268-8]
Abstract
PURPOSE Mirroring global trends, artificial intelligence is advancing in medicine, notably in urolithiasis. It promises accurate diagnoses, effective treatments, and forecasting of epidemiological risks and stone passage. This systematic review aims to identify the types of AI models utilised in urolithiasis studies and evaluate their effectiveness. METHODS The study was registered with PROSPERO. Pubmed, EMBASE, Google Scholar, and Cochrane Library databases were searched for relevant literature, using keywords such as 'urology,' 'artificial intelligence,' and 'machine learning.' Only original AI studies on urolithiasis were included, excluding reviews, unrelated studies, and non-English articles. PRISMA guidelines were followed. RESULTS Out of 4851 studies initially identified, 71 were included for comprehensive analysis of the application of AI in urolithiasis. AI showed notable proficiency in stone composition analysis in 12 studies, achieving an average precision of 88.2% (range 0.65-1). In the domain of stone detection, the average precision reached a remarkable 96.9%. AI's accuracy in predicting spontaneous ureteral stone passage averaged 87%, while its performance in treatment modalities such as PCNL and SWL achieved average accuracy rates of 82% and 83%, respectively. These AI models were generally superior to traditional diagnostic and treatment methods. CONCLUSION The consolidated data underscore AI's increasing significance in urolithiasis management. Across various dimensions (diagnosis, monitoring, and treatment), AI outperformed conventional methodologies. High precision and accuracy rates indicate that AI is not only effective but also poised for integration into routine clinical practice. Further research is warranted to establish AI's long-term utility and to validate its role as a standard tool in urological care.
Affiliation(s)
- Abdullah Altunhan
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Selim Soyturk
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Furkan Guldibi
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Atinc Tozsin
- School of Medicine, Urology Department, Trakya University, Edirne, Türkiye
- Abdullatif Aydın
- Department of Urology, King's College Hospital NHS Foundation Trust, London, UK
- MRC Centre for Transplantation, King's College London, London, UK
- Arif Aydın
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Kemal Sarica
- Department of Urology, Health Sciences University, Prof. Dr. Ilhan Varank Education and Training Hospital, Istanbul, Türkiye
- Department of Urology, Biruni University Medical School, Istanbul, Türkiye
- Selcuk Guven
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Kamran Ahmed
- Meram School of Medicine, Urology Department, Necmettin Erbakan University, Konya, Türkiye
- Department of Urology, King's College Hospital NHS Foundation Trust, London, UK
- Sheikh Khalifa Medical City, Abu Dhabi, UAE
- Khalifa University, Abu Dhabi, UAE
3
Prasad K, Fassler C, Miller A, Aweeda M, Pruthi S, Fusco JC, Daniel B, Miga M, Wu JY, Topf MC. More than meets the eye: Augmented reality in surgical oncology. J Surg Oncol 2024; 130:405-418. [PMID: 39155686] [DOI: 10.1002/jso.27790]
Abstract
BACKGROUND AND OBJECTIVES In the field of surgical oncology, there has been a desire for innovative techniques to improve tumor visualization, resection, and patient outcomes. Augmented reality (AR) technology superimposes digital content onto the real-world environment, enhancing the user's experience by blending digital and physical elements. A thorough examination of AR technology in surgical oncology has yet to be performed. METHODS A scoping review of intraoperative AR in surgical oncology was conducted according to the guidelines and recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) framework. All original articles examining the use of intraoperative AR during the surgical management of cancer were included. Exclusion criteria were virtual reality applications only, preoperative use only, fluorescence, AR not specific to surgical oncology, and study design (reviews, commentaries, abstracts). RESULTS A total of 2735 articles were identified, of which 83 were included. Most studies (52) were performed on animals or phantom models, while the remainder included patients. A total of 1112 intraoperative AR surgical cases were performed across the studies. The most common anatomic site was the brain (20 articles), followed by the liver (16), kidney (9), and head and neck (8). AR was most often used for intraoperative navigation or anatomic visualization of tumors or critical structures, but was also used to identify osteotomy or craniotomy planes. CONCLUSIONS AR technology has been applied across the field of surgical oncology to aid in the localization and resection of tumors.
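Once a pre-operative model is registered to the camera, superimposing it on the surgical video reduces to projecting its 3D points into the image. The sketch below shows the standard pinhole projection with hypothetical intrinsics; it is a generic illustration of the overlay step, not the pipeline of any study in this review:

```python
import numpy as np

def project_points(pts_world, K, R, t):
    """Project 3D model points (N, 3) to pixel coordinates (N, 2) with a
    pinhole camera: intrinsics K (3x3) and world-to-camera pose (R, t)."""
    pts_cam = pts_world @ R.T + t          # world frame -> camera frame
    uvw = pts_cam @ K.T                    # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]        # perspective divide

# Hypothetical intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)              # identity pose for illustration
# A point 100 mm ahead on the optical axis projects to the principal point.
uv = project_points(np.array([[0.0, 0.0, 100.0]]), K, R, t)
```

Overlay accuracy therefore depends on both the pose estimate (R, t) from registration and the camera calibration K, which is why misalignment is a recurring concern in the studies reviewed.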
Affiliation(s)
- Kavita Prasad
- Department of Otolaryngology-Head & Neck Surgery, Beth Israel Deaconess Medical Center, Boston, Massachusetts, USA
- Carly Fassler
- Department of Otolaryngology-Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Alexis Miller
- Department of Otolaryngology-Head & Neck Surgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Marina Aweeda
- Department of Otolaryngology-Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Sumit Pruthi
- Department of Radiology, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Joseph C Fusco
- Department of Pediatric Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Bruce Daniel
- Department of Radiology, Stanford Health Care, Palo Alto, California, USA
- Michael Miga
- Department of Biomedical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Jie Ying Wu
- Department of Computer Science, Vanderbilt University, Nashville, Tennessee, USA
- Michael C Topf
- Department of Otolaryngology-Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
4
Laterza V, Marchegiani F, Aisoni F, Ammendola M, Schena CA, Lavazza L, Ravaioli C, Carra MC, Costa V, De Franceschi A, De Simone B, de’Angelis N. Smart Operating Room in Digestive Surgery: A Narrative Review. Healthcare (Basel) 2024; 12:1530. [PMID: 39120233] [PMCID: PMC11311806] [DOI: 10.3390/healthcare12151530]
Abstract
The introduction of new technologies into current digestive surgical practice is progressively reshaping the operating room, defining the fourth surgical revolution. The implementation of black boxes and control towers aims at streamlining workflow and reducing surgical error through early identification and analysis, while augmented reality and artificial intelligence augment surgeons' perceptual and technical skills by superimposing three-dimensional models onto real-time surgical images. Moreover, the operating room architecture is transitioning toward an integrated digital environment to improve efficiency and, ultimately, patient outcomes. This narrative review describes the most recent evidence regarding the role of these technologies in transforming current digestive surgical practice, underlining their potential benefits and drawbacks in terms of efficiency and patient outcomes, in an attempt to foresee the digestive surgical practice of tomorrow.
Affiliation(s)
- Vito Laterza
- Department of Digestive Surgical Oncology and Liver Transplantation, University Hospital of Besançon, 3 Boulevard Alexandre Fleming, 25000 Besancon, France
- Francesco Marchegiani
- Unit of Colorectal and Digestive Surgery, DIGEST Department, Beaujon University Hospital, AP-HP, University of Paris Cité, Clichy, 92110 Paris, France
- Filippo Aisoni
- Unit of Emergency Surgery, Department of Surgery, Ferrara University Hospital, 44124 Ferrara, Italy
- Michele Ammendola
- Digestive Surgery Unit, Health of Science Department, University Hospital “R.Dulbecco”, 88100 Catanzaro, Italy
- Carlo Alberto Schena
- Unit of Robotic and Minimally Invasive Surgery, Department of Surgery, Ferrara University Hospital, 44124 Ferrara, Italy
- Luca Lavazza
- Hospital Network Coordinator of Azienda Ospedaliero, Universitaria and Azienda USL di Ferrara, 44121 Ferrara, Italy
- Cinzia Ravaioli
- Azienda Ospedaliero, Universitaria di Ferrara, 44121 Ferrara, Italy
- Maria Clotilde Carra
- Rothschild Hospital (AP-HP), 75012 Paris, France
- INSERM-Sorbonne Paris Cité, Epidemiology and Statistics Research Centre, 75004 Paris, France
- Vittore Costa
- Unit of Orthopedics, Humanitas Hospital, 24125 Bergamo, Italy
- Belinda De Simone
- Department of Emergency Surgery, Academic Hospital of Villeneuve St Georges, 91560 Villeneuve St. Georges, France
- Nicola de’Angelis
- Unit of Robotic and Minimally Invasive Surgery, Department of Surgery, Ferrara University Hospital, 44124 Ferrara, Italy
- Department of Translational Medicine, University of Ferrara, 44121 Ferrara, Italy
5
Zattoni F, Carletti F, Randazzo G, Tuminello A, Betto G, Novara G, Dal Moro F. Potential Applications of New Headsets for Virtual and Augmented Reality in Urology. Eur Urol Focus 2023:S2405-4569(23)00295-X. [PMID: 38160172] [DOI: 10.1016/j.euf.2023.12.003]
Abstract
Virtual and augmented reality (VR/AR) technologies hold great promise in various medical fields. The release of a new generation of headsets for medically enhanced VR/AR (MER) opens new possibilities for applications in medicine, particularly in urology, improving accessibility for everyone. These innovative headsets offer deep immersion without requiring a controller, which represents a novel approach to VR/AR engagement. The potential of these headsets extends to all aspects of urology, including surgical training, virtual meetings, communication between health care providers, patient counseling, telemedicine, delivering patient advice, and pain control. MER has the potential to improve operative planning and enhance intraoperative navigation and spatial awareness. The surgeon's overall experience can be significantly enhanced via improved guidance and visualization, ultimately leading to greater precision and safety. This cutting-edge technology has the potential to reshape urology practice, communication methods, and medical procedures, and ultimately to improve patients' experience of their urological condition. PATIENT SUMMARY: This mini review explores how a new generation of headsets for medically enhanced virtual reality could revolutionize urology by improving surgical planning, assistance during procedures, and medical education. Patients can benefit from better pain management and a deeper understanding of their conditions. However, challenges such as cost, accuracy, and ethical concerns must be addressed. This technology holds promise for transforming urological practice and patient care.
Affiliation(s)
- Fabio Zattoni
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Filippo Carletti
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Gianmarco Randazzo
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Arianna Tuminello
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Giovanni Betto
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Giacomo Novara
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
- Fabrizio Dal Moro
- Department of Surgery, Oncology, and Gastroenterology, Urological Unit, University of Padova, Padova, Italy
6
Ramalhinho J, Yoo S, Dowrick T, Koo B, Somasundaram M, Gurusamy K, Hawkes DJ, Davidson B, Blandford A, Clarkson MJ. The value of Augmented Reality in surgery - A usability study on laparoscopic liver surgery. Med Image Anal 2023; 90:102943. [PMID: 37703675] [PMCID: PMC10958137] [DOI: 10.1016/j.media.2023.102943]
Abstract
Augmented Reality (AR) is considered to be a promising technology for the guidance of laparoscopic liver surgery. By overlaying pre-operative 3D information of the liver and internal blood vessels on the laparoscopic view, surgeons can better understand the location of critical structures. In an effort to enable AR, several authors have focused on the development of methods to obtain an accurate alignment between the laparoscopic video image and the pre-operative 3D data of the liver, without assessing the benefit that the resulting overlay can provide during surgery. In this paper, we present a study that aims to assess quantitatively and qualitatively the value of an AR overlay in laparoscopic surgery during a simulated surgical task on a phantom setup. We design a study in which participants are asked to physically localise pre-operative tumours in a liver phantom using three image guidance conditions - a baseline condition without any image guidance, a condition where the 3D surfaces of the liver are aligned to the video and displayed on a black background, and a condition where video see-through AR is displayed on the laparoscopic video. Using data collected from a cohort of 24 participants, including 12 surgeons, we observe that compared to the baseline, AR decreases the median localisation error of surgeons on non-peripheral targets from 25.8 mm to 9.2 mm. Using subjective feedback, we also identify that AR introduces usability improvements in the surgical task and increases the perceived confidence of the users. Between the two tested displays, the majority of participants preferred to use the AR overlay instead of the navigated view of the 3D surfaces on a separate screen. We conclude that AR has the potential to improve performance and decision making in laparoscopic surgery, and that improvements in overlay alignment accuracy and depth perception should be pursued in the future.
Affiliation(s)
- João Ramalhinho
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Soojeong Yoo
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom; UCL Interaction Centre, University College London, London, United Kingdom
- Thomas Dowrick
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Bongjin Koo
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Murali Somasundaram
- Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- Kurinchi Gurusamy
- Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- David J Hawkes
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Brian Davidson
- Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- Ann Blandford
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom; UCL Interaction Centre, University College London, London, United Kingdom
- Matthew J Clarkson
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
7
Makiyama K, Komeya M, Tatenuma T, Noguchi G, Ohtake S. Patient-specific simulations and navigation systems for partial nephrectomy. Int J Urol 2023; 30:1087-1095. [PMID: 37622340] [DOI: 10.1111/iju.15287]
Abstract
Partial nephrectomy (PN) is the standard treatment for T1 renal cell carcinoma. PN is affected more by surgical variations and requires greater surgical experience than radical nephrectomy. Patient-specific simulations and navigation systems may help to reduce the surgical experience required for PN. Recent advances in three-dimensional (3D) virtual reality (VR) imaging and 3D printing technology have allowed accurate patient-specific simulations and navigation systems. We reviewed previous studies on patient-specific simulations and navigation systems for PN. Recently, image reconstruction technology has advanced, and commercial software that converts two-dimensional images into 3D images has become available. Many urologists are now able to view 3DVR images when preparing for PN. Surgical simulations based on 3DVR images can change surgical plans and improve surgical outcomes, and are useful during patient consultations. Patient-specific simulators that are capable of simulating surgical procedures, the gold-standard form of patient-specific simulation, have also been reported. Besides VR, 3D printing is also useful for understanding patient-specific information. Some studies have reported simulation and navigation systems for PN based on solid 3D models. Patient-specific simulations are a form of preoperative preparation, whereas patient-specific navigation is used intraoperatively. Navigation-assisted PN procedures using 3DVR images have become increasingly common, especially in robotic surgery. Some studies found that these systems produced improvements in surgical outcomes. Once its accuracy has been confirmed, it is hoped that this technology will spread further and become more widely adopted.
Affiliation(s)
- Kazuhide Makiyama
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Mitsuru Komeya
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Tomoyuki Tatenuma
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Go Noguchi
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Shinji Ohtake
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
8
A survey of augmented reality methods to guide minimally invasive partial nephrectomy. World J Urol 2023; 41:335-343. [PMID: 35776173] [DOI: 10.1007/s00345-022-04078-0]
Abstract
INTRODUCTION Minimally invasive partial nephrectomy (MIPN) has become the standard of care for localized kidney tumors over the past decade. The characteristics of each tumor, in particular its size and relationship with the excretory tract and vessels, make it possible to judge its complexity and to attempt to predict the risk of complications. The recent development of virtual 3D model reconstruction and computer vision has opened the way to image-guided surgery and augmented reality (AR). OBJECTIVE Our objective was to perform a systematic review to list and describe the different AR techniques proposed to support PN. MATERIALS AND METHODS The systematic review of the literature was performed on 12/04/22, using the keywords "nephrectomy" and "augmented reality" on Embase and Medline. Articles were considered if they reported surgical outcomes when using AR with virtual image overlay on real vision during ex vivo or in vivo MIPN. We classified them according to the registration technique used. RESULTS We found 16 articles describing an AR technique during MIPN procedures that met the eligibility criteria. A moderate to high risk of bias was recorded for all the studies. We classified registration methods into three main families, of which the most promising seems to be surface-based registration. CONCLUSION Despite promising results, no studies have yet shown an improvement in clinical outcomes using AR. The ideal AR technique is probably yet to be established, as several designs are still being actively explored. More clinical data will be required to establish the potential contribution of this technology to MIPN.
9
Jiang J, Zhang J, Sun J, Wu D, Xu S. User's image perception improved strategy and application of augmented reality systems in smart medical care: A review. Int J Med Robot 2023; 19:e2497. [PMID: 36629798] [DOI: 10.1002/rcs.2497]
Abstract
BACKGROUND Augmented reality (AR) is a new human-computer interaction technology that combines virtual reality, computer vision, and computer networks. With the rapid advancement of the medical field towards intelligence and data visualisation, AR systems are becoming increasingly popular in medicine because they can provide doctors with sufficiently clear medical images and accurate image navigation in practical applications. However, different display types of AR systems affect doctors' perception of the image after virtual-real fusion in different ways during actual medical use. If doctors cannot correctly perceive the image, they may be unable to correctly match the virtual information with the real world, which has a significant impact on their ability to recognise complex structures. METHODS This paper uses CiteSpace, a literature analysis tool, to visualise and analyse the research hotspots in the application of AR systems in the medical field. RESULTS A visual analysis of the 1163 articles retrieved from the Web of Science Core Collection database reveals that display technology and visualisation technology are currently the key research directions for AR systems. CONCLUSION This paper categorises AR systems based on their display principles, reviews current image perception optimisation schemes for the various system types, and analyses and compares different display types of AR systems based on their practical applications in the field of smart medical care, so that doctors can select the appropriate display type for different application scenarios. Finally, the future development direction of AR display technology is anticipated so that AR technology can be applied more effectively in smart medical care. The advancement of display technology is critical for the use of AR systems in the medical field, and the advantages and disadvantages of the various display types should be weighed in each application scenario to select the best AR system.
Affiliation(s)
- Jingang Jiang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Robotics & Its Engineering Research Center, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jiawei Zhang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jianpeng Sun
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Dianhao Wu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Shuainan Xu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
10
Chahine J, Mascarenhas L, George SA, Bartos J, Yannopoulos D, Raveendran G, Gurevich S. Effects of a Mixed-Reality Headset on Procedural Outcomes in the Cardiac Catheterization Laboratory. Cardiovasc Revasc Med 2022; 45:3-8. [PMID: 35995656] [DOI: 10.1016/j.carrev.2022.08.009]
Abstract
BACKGROUND Mixed reality head-mounted displays (MR-HMD) are a novel and emerging tool in healthcare. There is a paucity of data on the safety and efficacy of MR-HMD use in the cardiac catheterization laboratory (CCL). We sought to analyze and compare fluoroscopy time, procedure time, and complication rates for right heart catheterizations (RHCs) and coronary angiographies (CAs) performed with MR-HMD versus standard LCD medical displays. METHODS This is a non-randomized trial that included patients who underwent RHC and CA with MR-HMD between August 2019 and January 2020. Their outcomes were compared to a control group during the same time period. The primary endpoints were procedure time, fluoroscopy time, and dose area product (DAP). The secondary endpoints were contrast volume and the intra- and postprocedural complication rate. RESULTS 50 patients were enrolled in the trial, of whom 33 underwent RHC and 29 underwent diagnostic CA. They were compared to 232 patients in the control group. The use of MR-HMD was associated with a significantly lower procedure time (20 min (IQR 14-30) vs. 25 min (IQR 18-36), p = 0.038). There were no significant differences in median fluoroscopy time (1.5 min (IQR 0.7-4.9) in the study group vs. 1.3 min (IQR 0.8-3.1), p = 0.84) or median DAP (165.4 mGy·cm2 (IQR 13-15,583) in the study group vs. 913 mGy·cm2 (IQR 24-6291), p = 0.17). There was no significant increase in intra- or post-procedure complications. CONCLUSION MR-HMD use is safe and feasible and may decrease procedure time in the CCL.
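Procedure times reported as medians with IQRs, as above, are typically compared with a rank-based test such as the Mann-Whitney U (the abstract does not state which test the authors used). The sketch below implements the U statistic with a normal-approximation two-sided p-value and runs it on simulated, hypothetical samples whose sizes and medians loosely mirror the abstract; the samples are not the trial's data:

```python
import math
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x vs y, with a two-sided
    p-value from the normal approximation (no tie correction, which is
    adequate for continuous measurements such as procedure times)."""
    n1, n2 = len(x), len(y)
    combined = np.concatenate([x, y])
    ranks = np.empty(n1 + n2)
    ranks[combined.argsort()] = np.arange(1, n1 + n2 + 1)
    u1 = ranks[:n1].sum() - n1 * (n1 + 1) / 2     # U for the first sample
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return u1, p

# Hypothetical right-skewed procedure-time samples (minutes), with
# group sizes (50 vs. 232) and medians (~20 vs. ~25) echoing the abstract.
rng = np.random.default_rng(42)
mr_hmd = rng.lognormal(np.log(20.0), 0.4, size=50)
control = rng.lognormal(np.log(25.0), 0.4, size=232)
u, p = mann_whitney_u(mr_hmd, control)
```

A rank-based test is the natural choice here because procedure times are skewed, which is also why the abstract summarises them with medians and IQRs rather than means.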
Affiliation(s)
- Johnny Chahine: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
- Lorraine Mascarenhas: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
- Jason Bartos: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
- Demetris Yannopoulos: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
- Ganesh Raveendran: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
- Sergey Gurevich: Department of Medicine, University of Minnesota, Minneapolis, MN, United States of America
11
The intraoperative use of augmented and mixed reality technology to improve surgical outcomes: A systematic review. Int J Med Robot 2022; 18:e2450. [DOI: 10.1002/rcs.2450]
12
A 3D Image Registration Method for Laparoscopic Liver Surgery Navigation. Electronics 2022. [DOI: 10.3390/electronics11111670]
Abstract
Laparoscopic augmented reality (AR) navigation is now applied in minimally invasive abdominal surgery, where it helps surgeons see the location of blood vessels and tumors within organs and thus operate more precisely. Image registration is the process of optimally mapping one or more images onto a target image, and it is the core of laparoscopic AR navigation; the key challenges are shortening registration time and optimizing registration accuracy. We studied three-dimensional (3D) image registration for laparoscopic liver surgery navigation and propose a new method combining rough and fine registration. First, the adaptive fireworks algorithm (AFWA) is applied for rough registration, and then an optimized iterative closest point (ICP) algorithm is applied for fine registration. The method was validated on the computed tomography (CT) dataset 3D-IRCADb-01. Experimental results show that it outperforms other registration methods based on stochastic optimization algorithms in both registration time and accuracy.
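The rough-plus-fine pipeline this abstract describes can be sketched in outline. The snippet below is a minimal illustration, not the authors' implementation: it substitutes a simple centroid alignment for the adaptive-fireworks rough stage and uses a basic point-to-point ICP with an SVD (Kabsch) rigid fit for the fine stage; all function names are ours.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50, tol=1e-8):
    """Point-to-point ICP refining a rough (centroid-based) alignment."""
    cur = src + (dst.mean(axis=0) - src.mean(axis=0))  # rough registration stand-in
    prev_err = np.inf
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
        err = np.sqrt(d2.min(axis=1)).mean()  # mean distance before this update
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return cur
```

A real AR pipeline would replace the brute-force correspondence search with a k-d tree and handle partial overlap, but the rough-then-fine structure is the same.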
13
Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022; 74:528-537. [PMID: 35383432] [DOI: 10.23736/s2724-6051.22.04726-7]
Abstract
INTRODUCTION Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intraoperative images onto the operative field. AR has been used increasingly in myriad surgical specialties, including urology. This study reviews advances in the use of AR to improve urologic outcomes. EVIDENCE ACQUISITION We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology up to March 2021. The MEDLINE, Scopus, and Web of Science databases were searched. Study selection followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Included studies were limited to those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS A total of 60 studies were identified and included in the present analysis. Overall, 19 were descriptive/validity/phantom studies of specific AR methodologies, 4 were case reports, and 37 were clinical prospective/retrospective comparative studies. CONCLUSIONS Advances in AR have increased registration accuracy and the ability to identify anatomic landmarks, improving outcomes during urologic procedures such as RARP and robot-assisted partial nephrectomy.
Affiliation(s)
- Sidney Roberts: Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Aditya Desai: Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Enrico Checcucci: School of Medicine, Division of Urology, Department of Oncology, San Luigi Hospital, University of Turin, Orbassano, Turin, Italy; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Stefano Puliatti: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, University of Modena and Reggio Emilia, Modena, Italy; Department of Urology, OLV, Aalst, Belgium; ORSI Academy, Melle, Belgium
- Mark Taratkin: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russia
- Karl-Friedrich Kowalewski: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Virgen Macarena University Hospital, Seville, Spain; Department of Urology and Urosurgery, University Hospital of Mannheim, Mannheim, Germany
- Juan Gomez Rivas: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Clinico San Carlos University Hospital, Madrid, Spain
- Ines Rivero: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology and Nephrology, Virgen del Rocío University Hospital, Seville, Spain
- Domenico Veneziano: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Riuniti Hospital, Reggio Calabria, Reggio Calabria, Italy
- Francesco Porpiglia: European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Inderbir S Gill: Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Giovanni E Cacciamani: Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA; Keck School of Medicine, Department of Radiology, University of Southern California, Los Angeles, CA, USA
14
Zhu W, Xiong S, Xu C, Zhu Z, Li Z, Zhang L, Guan H, Huang Y, Zhang P, Zhu H, Lin J, Li X, Zhou L. Initial experiences with preoperative three-dimensional image reconstruction technology in laparoscopic pyeloplasty for ureteropelvic junction obstruction. Transl Androl Urol 2022; 10:4142-4151. [PMID: 34984180] [PMCID: PMC8661249] [DOI: 10.21037/tau-21-590]
Abstract
Background To explore the clinical value of three-dimensional image reconstruction technology (3DIT) for preoperative surgical planning and perioperative outcomes in laparoscopic pyeloplasty (LP). Methods Data from 25 patients with ureteropelvic junction obstruction (UPJO) admitted to our hospital from January 2018 to January 2019 were analyzed retrospectively. All patients underwent preoperative enhanced computed tomography (CT) scanning. For the 12 cases in the 3DIT group, preoperative planning involved virtual operation and morphometry based on reconstruction of the CT data into three-dimensional (3D) images. Surgery in the other 13 cases was performed with traditional CT examination. Demographic, surgical outcome, and postoperative parameters were compared between the two groups. Results Reconstructed 3D images clearly showed the spatial structural relationships between the UPJO and surrounding blood vessels. In all 25 cases surgery was completed with no conversion to open surgery. Preoperative 3DIT was associated with significant improvements in mean operation time (107.76 vs. 141.58 min, P=0.024), mean time to dissociate the ureteropelvic junction (UPJ) (11.26 vs. 19.40 min, P=0.020), and mean estimated blood loss (23.84 vs. 49.16 mL, P=0.028). There were no statistically significant differences in perioperative complications, postoperative hospital stay, or postoperative drainage time. Conclusions 3DIT based on enhanced CT scans is of clinical value in the treatment of UPJO: it provides accurate anatomical information and reliable guidance for preoperative planning, and it facilitates image-guided LP.
Affiliation(s)
- Weijie Zhu: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Shengwei Xiong: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Chunru Xu: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Zhenpeng Zhu: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Zhihua Li: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Lei Zhang: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Hua Guan: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Yanbo Huang: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Peng Zhang: Department of Urology, Emergency General Hospital, Beijing, China
- Hongjian Zhu: Department of Urology, Beijing Jiangong Hospital, Beijing, China
- Jian Lin: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Xuesong Li: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Liqun Zhou: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
15
Shiozaki K, Kawanishi Y, Sasaki Y, Daizumoto K, Tsuda M, Izumi K, Kusuhara Y, Fukawa T, Yamamoto Y, Yamaguchi K, Takahashi M, Kanayama H. Clinical application of virtual imaging guided Robot-assisted partial nephrectomy. The Journal of Medical Investigation 2022; 69:237-243. [DOI: 10.2152/jmi.69.237]
Affiliation(s)
- Keito Shiozaki: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Yasuo Kawanishi: Department of Urology, Takamatsu Red Cross Hospital, Takamatsu, Japan
- Yutaro Sasaki: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Kei Daizumoto: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Megumi Tsuda: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Kazuyoshi Izumi: Department of Urology, Takamatsu Red Cross Hospital, Takamatsu, Japan
- Yoshito Kusuhara: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Tomoya Fukawa: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Yasuyo Yamamoto: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Kunihisa Yamaguchi: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Masayuki Takahashi: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
- Hiroomi Kanayama: Department of Urology, Tokushima University Graduate School of Biomedical Sciences, Tokushima, Japan
16
Singh SP, Borthwick KG, Qureshi FM. Commentary: 3D Laparoscopy-Assisted Operation to Adult Intussusceptions During Perioperative Period of Liver Transplantation: Case Report and Literature Review. Front Surg 2021; 8:764741. [PMID: 34746226] [PMCID: PMC8564035] [DOI: 10.3389/fsurg.2021.764741]
Affiliation(s)
- Som P Singh: Department of Biomedical Sciences, University of Missouri-Kansas City School of Medicine, Kansas, MO, United States
- Kiera G Borthwick: Department of Neurosciences, Washington and Lee University, Lexington, VA, United States
- Fahad M Qureshi: Department of Biomedical Sciences, University of Missouri-Kansas City School of Medicine, Kansas, MO, United States
17
Zhu W, Zheng M, Xiong S, Han G, Meng C, Li Z, Zhang L, Xiong G, Guan H, Huang Y, Zhu H, Li X, Wang G, Zhou L. Modified Takazawa anatomical classification of renal pelvicalyceal system based on three-dimensional virtual reconstruction models. Transl Androl Urol 2021; 10:2944-2952. [PMID: 34430397] [PMCID: PMC8350222] [DOI: 10.21037/tau-21-309]
Abstract
Background Previous classifications of renal pelvicalyceal anatomy can be difficult to understand intuitively and impractical for endourological surgery. We aimed to propose a modified Takazawa anatomical classification of the renal pelvicalyceal system, based on three-dimensional (3D) virtual reconstruction models, for endourological surgery. Methods We retrospectively collected data on 225 patients (320 kidneys) between Apr. 2017 and Dec. 2020. The spatial anatomical structure of the renal pelvis and calyces was modeled, and corresponding morphological parameters were measured, after 3D virtual reconstruction of computed tomography urography (CTU). The modified Takazawa classification was developed from renal pelvicalyceal morphological parameters [bifurcated branches of the renal pelvis, cross-sectional area of the renal pelvis and ureteropelvic junction (UPJ), infundibuloureteral angle (IUA), and lower pole infundibular calyceal length (IL)] derived from the 3D models, and morphological parameters were compared to evaluate differences between classification types. Results The renal pelvis and calyces were divided into two main types (Type A and Type B) according to renal pelvic branch patterns. A single pelvis without a bifurcated branch was classified as Type A (62%) and subclassified into Type A1 (22%, the slimline pelvis), Type A2 (27%, the typical pelvis), and Type A3 (13%, the broad pelvis). A divided pelvis with bifurcated branches was classified as Type B (38%) and subclassified into Type B1 (15%), with a wide and flat lower calyx branch, and Type B2 (23%), with a narrow and steep lower calyx branch.
Conclusions Whereas previous studies visualized and classified renal pelvicalyceal anatomy by endocast, autopsy, ultrasonography, and excretory urography, the modified Takazawa classification based on 3D virtual reconstruction models standardizes the varied anatomical morphology of the renal pelvicalyceal system and provides intuitive, concise anatomical information, which may improve treatment selection.
Affiliation(s)
- Weijie Zhu: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Mengmeng Zheng: Department of Urology, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Shengwei Xiong: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Guanpeng Han: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Chang Meng: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Zhihua Li: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Lei Zhang: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Gengyan Xiong: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Hua Guan: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Yanbo Huang: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Hongjian Zhu: Department of Urology, Beijing Jiangong Hospital, Beijing, China
- Xuesong Li: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Gang Wang: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
- Liqun Zhou: Department of Urology, Peking University First Hospital, Institute of Urology, Peking University, National Urological Cancer Centre, Beijing, China
18
Zhang W, Zhu W, Yang J, Xiang N, Zeng N, Hu H, Jia F, Fang C. Augmented Reality Navigation for Stereoscopic Laparoscopic Anatomical Hepatectomy of Primary Liver Cancer: Preliminary Experience. Front Oncol 2021; 11:663236. [PMID: 33842378] [PMCID: PMC8027474] [DOI: 10.3389/fonc.2021.663236]
Abstract
Background Accurate determination of intrahepatic anatomy remains challenging for laparoscopic anatomical hepatectomy (LAH). Laparoscopic augmented reality navigation (LARN) is expected to facilitate LAH of primary liver cancer (PLC) by identifying the exact location of tumors and vessels. This study evaluated the safety and effectiveness of our independently developed LARN system in LAH of PLC. Methods From May 2018 to July 2020, the study included 85 PLC patients who underwent three-dimensional (3D) LAH. According to whether LARN was performed during the operation, patients were divided into the intraoperative navigation (IN) group and the non-intraoperative navigation (NIN) group. We compared preoperative data, perioperative results, and postoperative complications between the two groups, and describe our preliminary experience with this novel technology in LAH. Results There were 44 and 41 PLC patients in the IN and NIN groups, respectively. No significant differences were found in preoperative characteristics or in any resection-related complications between the two groups (all P > 0.05). Compared with the NIN group, the IN group had significantly less operative bleeding (P = 0.002), lower delta Hb% (P = 0.039), a lower blood transfusion rate (P < 0.001), and a shorter postoperative hospital stay (P = 0.003). In the IN group, successful fusion of the simulated surgical plan with the operative scene helped determine the extent of resection. Conclusions LARN contributed to the identification of important anatomical structures during LAH of PLC. It reduced vascular injury and accelerated postoperative recovery, showing promising prospects for application in liver surgery.
Affiliation(s)
- Weiqi Zhang: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Wen Zhu: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Jian Yang: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Nan Xiang: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Ning Zeng: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Haoyu Hu: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
- Fucang Jia: Research Laboratory for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Chihua Fang: Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China; Guangdong Provincial Clinical and Engineering Center of Digital Medicine, Guangzhou, China
19
Kavoussi NL, Pitt B, Ferguson JM, Granna J, Remirez A, Nimmagadda N, Melnyk R, Ghazi A, Barth EJ, Webster RJ, Herrell SD. Accuracy of Touch-Based Registration During Robotic Image-Guided Partial Nephrectomy Before and After Tumor Resection in Validated Phantoms. J Endourol 2021; 35:362-368. [PMID: 33040602] [PMCID: PMC7987368] [DOI: 10.1089/end.2020.0363]
Abstract
Aim: Image-guided surgery (IGS) allows for accurate, real-time localization of subsurface critical structures during surgery. No prior IGS systems have described a feasible method of intraoperative reregistration after manipulation of the kidney during robotic partial nephrectomy (PN). We present a method for seamless reregistration during IGS and evaluate accuracy before and after tumor resection in two validated kidney phantoms. Materials and Methods: We performed robotic PN on two validated kidney phantoms (one with an endophytic tumor and one with an exophytic tumor) with our IGS system utilizing the da Vinci Xi robot. Intraoperatively, the kidney phantoms' surfaces were digitized with the da Vinci robotic manipulator via a touch-based method and registered to a three-dimensional segmented model created from cross-sectional CT imaging of the phantoms. Fiducial points were marked with a surgical marking pen and identified after the initial registration using the robotic manipulator. Segmented images were displayed via picture-in-picture in the surgeon console as tumor resection was performed. After resection, reregistration was performed by reidentifying the fiducial points. The accuracy of the initial registration and reregistration was compared. Results: The root mean square (RMS) averages of target registration error (TRE) were 2.53 and 4.88 mm for the endophytic and exophytic phantoms, respectively. IGS enabled resection along preplanned contours. Specifically, the RMS averages of the normal TRE over the entire resection surface were 0.75 and 2.15 mm for the endophytic and exophytic phantoms, respectively. Both tumors were resected with grossly negative margins. Point-based reregistration enabled instantaneous reregistration with minimal impact on RMS TRE compared with the initial registration (from 1.34 to 1.70 mm preresection and from 1.60 to 2.10 mm postresection).
Conclusions: We present a novel and accurate registration and reregistration framework for use during IGS for PN with the da Vinci Xi surgical system. The technology is easily integrated into the surgical workflow and does not require additional hardware.
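The accuracy figures above rest on a standard definition: target registration error (TRE) is the Euclidean distance between each registered point and its ground-truth position, summarized as a root-mean-square (RMS) average. A minimal sketch with our own helper names, not the authors' code:

```python
import numpy as np

def target_registration_error(registered, truth):
    """Per-point TRE: Euclidean distance between registered and ground-truth targets."""
    return np.linalg.norm(registered - truth, axis=1)

def rms(errors):
    """Root-mean-square summary, as commonly reported for TRE (e.g. in mm)."""
    return float(np.sqrt(np.mean(np.square(errors))))
```

Note that RMS weights large per-point errors more heavily than a plain mean, which is why it is the conventional summary for registration accuracy.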
Affiliation(s)
- Nicholas L. Kavoussi: Department of Urology, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Bryn Pitt: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- James M. Ferguson: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Josephine Granna: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Andria Remirez: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Naren Nimmagadda: Department of Urology, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Rachel Melnyk: Simulation Innovation Laboratory, Department of Urology, University of Rochester, Rochester, New York, USA
- Ahmed Ghazi: Department of Urology, University of Rochester Medical Center, Rochester, New York, USA
- Eric J. Barth: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Robert J. Webster: Department of Mechanical Engineering, School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Stanley Duke Herrell: Department of Urology, Vanderbilt University Medical Center, Nashville, Tennessee, USA
20
Chen J, Wan Z, Zhang J, Li W, Chen Y, Li Y, Duan Y. Medical image segmentation and reconstruction of prostate tumor based on 3D AlexNet. Computer Methods and Programs in Biomedicine 2021; 200:105878. [PMID: 33308904] [DOI: 10.1016/j.cmpb.2020.105878]
Abstract
BACKGROUND Prostate cancer has a high tumor incidence in men. Because of its long latency and insidious course, early diagnosis, and imaging diagnosis in particular, is difficult. In clinical practice, segmentation is mainly performed manually by medical experts, which is time-consuming, labor-intensive, and heavily dependent on expert experience and skill. Rapid, accurate, and repeatable segmentation of the prostate region remains challenging, so automated segmentation of prostate images based on a 3D AlexNet network is worth exploring. METHOD Taking prostate cancer medical imaging as the entry point, three-dimensional data are fed into a deep convolutional neural network. This paper proposes a 3D AlexNet method for automatic segmentation of prostate cancer magnetic resonance images and compares its performance against the general-purpose networks ResNet-50 and Inception-V4. RESULTS Based on training samples of magnetic resonance images from 500 prostate cancer patients, a 3D AlexNet with a simple structure and excellent performance was established through adaptive improvement of the classic AlexNet. The accuracy was 0.921, the specificity 0.896, the sensitivity 0.902, and the area under the receiver operating characteristic curve (AUC) 0.964. The mean absolute distance (MAD) between the segmentation result and the medical experts' gold standard was 0.356 mm, the Hausdorff distance (HD) 1.024 mm, and the Dice similarity coefficient 0.9768. CONCLUSION The improved 3D AlexNet can automatically perform structured segmentation of prostate magnetic resonance images. Compared with traditional and deep-learning segmentation methods, the 3D AlexNet network is superior in training time, parameter count, and network performance, demonstrating the effectiveness of this method.
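The evaluation metrics quoted above (sensitivity, specificity, Dice similarity coefficient) have standard voxel-wise definitions on binary masks. The sketch below uses hypothetical helper names, not the paper's code:

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A|+|B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0

def sensitivity_specificity(pred, truth):
    """Voxel-wise sensitivity (true-positive rate) and specificity (true-negative rate)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    tn = np.logical_and(~pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    return tp / (tp + fn), tn / (tn + fp)
```

Surface metrics such as MAD and the Hausdorff distance additionally require extracting mask boundaries and computing point-to-surface distances, which is omitted here for brevity.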
Affiliation(s)
- Jun Chen: Department of Urology, The Second Affiliated Hospital of Zhejiang Chinese Medical University, No.318 Chaowang Road, Gongshu District, Hangzhou 310005, China
- Zhechao Wan: Department of Urology, Zhuji Central Hospital, No.98 Zhugong Road, Jiyang Street, Zhuji City, 311800, Zhejiang Province, China
- Jiacheng Zhang: The 2nd Clinical Medical College, Zhejiang Chinese Medical University, 548 Bin Wen Road, Hangzhou 310053, China
- Wenhua Li: Department of Radiology, Xinhua Hospital affiliated to Shanghai Jiao Tong University School of Medicine, 1665 Kong Jiang Road, Shanghai 200092, China
- Yanbing Chen: Computer Application Technology, School of Applied Sciences, Macao Polytechnic Institute, Macao SAR 999078, China
- Yuebing Li: Department of Anaesthesiology, The Second Affiliated Hospital of Zhejiang Chinese Medical University, No.318 Chaowang Road, Gongshu District, Hangzhou 310005, China
- Yue Duan: Department of Urology, The Second Affiliated Hospital of Zhejiang Chinese Medical University, No.318 Chaowang Road, Gongshu District, Hangzhou 310005, China
21
Yamamoto M, Oyama S, Otsuka S, Murakami Y, Yokota H, Hirata H. Experimental pilot study for augmented reality-enhanced elbow arthroscopy. Sci Rep 2021; 11:4650. [PMID: 33633227] [PMCID: PMC7907139] [DOI: 10.1038/s41598-021-84062-7]
Abstract
The purpose of this study was to develop and evaluate a novel elbow arthroscopy system with superimposed bone and nerve visualization using preoperative computed tomography (CT) and magnetic resonance imaging (MRI) data. We obtained bone and nerve segmentation data by CT and MRI, respectively, of the elbow of a healthy human volunteer and of a cadaveric Japanese monkey. A life-size 3-dimensional (3D) model of the human organs and frame was constructed using a stereolithographic 3D printer. Elbow arthroscopy was performed on the elbow of a cadaveric Japanese monkey. The augmented reality (AR) range of error during rotation of the arthroscope was examined at a 20 mm scope-object distance. We successfully performed AR arthroscopy on the life-size 3D elbow model and on the elbow of the cadaveric Japanese monkey by making anteromedial and posterior portals. The target registration error was 1.63 ± 0.49 mm (range 1-2.7 mm) with respect to the rotation angle of the lens cylinder from 40° to −40°. We attained reasonable accuracy and demonstrated the operation of the designed system. Given the multiple applications of AR-enhanced arthroscopic visualization, it has the potential to be a next-generation technology for arthroscopy. This technique will contribute to reducing the serious complications associated with elbow arthroscopy.
Affiliation(s)
- Michiro Yamamoto, Department of Hand Surgery, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, Nagoya, 466-8550, Japan
- Shintaro Oyama, Department of Hand Surgery, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, Nagoya, 466-8550, Japan
- Syuto Otsuka, Department of Mechanical Engineering, Tokyo University of Science, Noda, Japan; Image Processing Research Team, RIKEN Center for Advanced Photonics, Wako, Japan
- Yukimi Murakami, Image Processing Research Team, RIKEN Center for Advanced Photonics, Wako, Japan
- Hideo Yokota, Image Processing Research Team, RIKEN Center for Advanced Photonics, Wako, Japan
- Hitoshi Hirata, Department of Hand Surgery, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, Nagoya, 466-8550, Japan

22

23
Pelanis E, Teatini A, Eigl B, Regensburger A, Alzaga A, Kumar RP, Rudolph T, Aghayan DL, Riediger C, Kvarnström N, Elle OJ, Edwin B. Evaluation of a novel navigation platform for laparoscopic liver surgery with organ deformation compensation using injected fiducials. Med Image Anal 2020; 69:101946. [PMID: 33454603 DOI: 10.1016/j.media.2020.101946] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2020] [Revised: 11/28/2020] [Accepted: 12/15/2020] [Indexed: 12/11/2022]
Abstract
In laparoscopic liver resection, surgeons conventionally rely on anatomical landmarks detected through a laparoscope, preoperative volumetric images and laparoscopic ultrasound to compensate for the challenges of minimally invasive access. Image guidance using optical tracking and registration procedures is a promising tool, although often undermined by its inaccuracy. This study evaluates a novel surgical navigation solution that can compensate for liver deformations using an accurate and effective registration method. The proposed solution relies on a robotic C-arm to perform registration to preoperative CT/MRI image data and allows for intraoperative updates during resection using fluoroscopic images. Navigation is offered both as a 3D liver model with real-time instrument visualization, as well as an augmented reality overlay on the laparoscope camera view. Testing was conducted through a pre-clinical trial which included four porcine models. Accuracy of the navigation system was measured through two evaluation methods: liver surface fiducials reprojection and a comparison between planned and navigated resection margins. Target Registration Error with the fiducials evaluation shows that the accuracy in the vicinity of the lesion was 3.78±1.89 mm. Resection margin evaluations resulted in an overall median accuracy of 4.44 mm with a maximum error of 9.75 mm over the four subjects. The presented solution is accurate enough to be potentially clinically beneficial for surgical guidance in laparoscopic liver surgery.
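The accuracy measure used in this study, target registration error (TRE), is simply the Euclidean distance between corresponding points after registration, summarized as mean ± standard deviation. A hedged sketch (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def target_registration_error(mapped_pts, reference_pts):
    """Per-fiducial Euclidean error after registration, with mean/std summary.

    mapped_pts: fiducial positions reprojected through the registration, shape (N, 3).
    reference_pts: measured ground-truth positions of the same fiducials, shape (N, 3).
    """
    mapped = np.asarray(mapped_pts, float)
    reference = np.asarray(reference_pts, float)
    errors = np.linalg.norm(mapped - reference, axis=1)  # one distance per fiducial
    return errors, errors.mean(), errors.std()
```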
Affiliation(s)
- Egidijus Pelanis, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway; Institute of Clinical Medicine, University of Oslo 1072, Oslo, Norway
- Andrea Teatini, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway; Department of Informatics, University of Oslo 1072, Oslo, Norway
- Rahul Prasanna Kumar, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway
- Davit L Aghayan, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway; Institute of Clinical Medicine, University of Oslo 1072, Oslo, Norway; Department of Surgery N1, Yerevan State Medical University, 0025 Yerevan, Armenia
- Carina Riediger, University Hospital Carl Gustav Carus, Technische Universität Dresden, 01307 Dresden, Germany
- Ole Jakob Elle, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway; Department of Informatics, University of Oslo 1072, Oslo, Norway
- Bjørn Edwin, The Intervention Centre, Oslo University Hospital Rikshospitalet 0424, Oslo, Norway; Institute of Clinical Medicine, University of Oslo 1072, Oslo, Norway; Department of Hepato-Pancreatic-Biliary Surgery 0424, Oslo University Hospital, Oslo, Norway

24
Schoeb DS, Rassweiler J, Sigle A, Miernik A, Engels C, Goezen AS, Teber D. [Robotics and intraoperative navigation]. Urologe A 2020; 60:27-38. [PMID: 33320305 DOI: 10.1007/s00120-020-01405-4] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/18/2020] [Indexed: 10/22/2022]
Abstract
Urology has always been closely linked to technological progress. In the last few decades, we have witnessed the increasing implementation of various technologies and innovations in the subdisciplines of urology. While conventional laparoscopy is increasingly being replaced by robot-assisted procedures and the introduction of new robotic systems from various manufacturers will continue for years, the field of endourology is still not dominated by robotic systems. However, new systems (e.g., autonomous, robot-controlled aquablation of the prostate) are becoming increasingly popular, and numerous development projects will probably also change clinical care in the coming years. In addition, further advances can be expected in combining robotics with intraoperative navigation through the integration of imaging, augmented reality (AR) and virtual reality (VR) technology. This combination of navigation and robotic technology is already being used successfully in prostate biopsy.
Affiliation(s)
- D S Schoeb, Medizinische Fakultät, Klinik für Urologie, Universitätsklinikum Freiburg, Freiburg, Germany
- J Rassweiler, Klinik für Urologie, SLK-Kliniken Heilbronn GmbH, Heilbronn, Germany
- A Sigle, Medizinische Fakultät, Klinik für Urologie, Universitätsklinikum Freiburg, Freiburg, Germany
- A Miernik, Medizinische Fakultät, Klinik für Urologie, Universitätsklinikum Freiburg, Freiburg, Germany
- C Engels, Urologische Klinik, Städtisches Klinikum Karlsruhe, Moltkestr. 90, 76133, Karlsruhe, Germany
- A S Goezen, Klinik für Urologie, SLK-Kliniken Heilbronn GmbH, Heilbronn, Germany
- D Teber, Urologische Klinik, Städtisches Klinikum Karlsruhe, Moltkestr. 90, 76133, Karlsruhe, Germany

25
Elsayed M, Kadom N, Ghobadi C, Strauss B, Al Dandan O, Aggarwal A, Anzai Y, Griffith B, Lazarow F, Straus CM, Safdar NM. Virtual and augmented reality: potential applications in radiology. Acta Radiol 2020; 61:1258-1265. [PMID: 31928346 DOI: 10.1177/0284185119897362] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The modern-day radiologist must be adept at image interpretation, and those who most successfully leverage new technologies may provide the highest value to patients, clinicians, and trainees. Applications of virtual reality (VR) and augmented reality (AR) have the potential to revolutionize how imaging information is applied in clinical practice and how radiologists practice. This review provides an overview of VR and AR, highlights current applications and future developments, and discusses the limitations hindering adoption.
Affiliation(s)
- Mohammad Elsayed, Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Nadja Kadom, Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Comeron Ghobadi, Department of Radiology, The University of Chicago Pritzker School of Medicine, IL, USA
- Benjamin Strauss, Department of Radiology, The University of Chicago Pritzker School of Medicine, IL, USA
- Omran Al Dandan, Department of Radiology, Imam Abdulrahman Bin Faisal University College of Medicine, Dammam, Eastern Province, Saudi Arabia
- Abhimanyu Aggarwal, Department of Radiology, Eastern Virginia Medical School, Norfolk, VA, USA
- Yoshimi Anzai, Department of Radiology and Imaging Sciences, University of Utah School of Medicine, Salt Lake City, Utah, USA
- Brent Griffith, Department of Radiology, Henry Ford Health System, Detroit, MI, USA
- Frances Lazarow, Department of Radiology, Eastern Virginia Medical School, Norfolk, VA, USA
- Christopher M Straus, Department of Radiology, The University of Chicago Pritzker School of Medicine, IL, USA
- Nabile M Safdar, Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, GA, USA

26
Abstract
This paper maps the state of the art of recent medical simulators that provide evaluation and guidance for surgical procedures. The systems are reviewed and compared from the viewpoint of the technology used, force feedback, learning evaluation, didactic and visual aid, guidance, data collection and storage, and type of solution (commercial or non-commercial). The works were assessed to identify whether (1) current applications can provide assistance and track performance in training, and (2) virtual environments are more suitable for practicing than physical ones. Automatic analysis of the papers was performed to minimize subjective bias. Some works limit themselves to recording session data for internal evaluation, while others assess the data and provide immediate user feedback. However, few works currently implement guidance, aid during sessions, and assessment. Current trends suggest that automating the evaluation process could reduce the workload of experts and let them focus on improving the curriculum covered in medical education. Lastly, this paper draws several conclusions and observations per area and offers suggestions for future work.
27
Ferguson JM, Pitt EB, Remirez AA, Siebold MA, Kuntz A, Kavoussi NL, Barth EJ, Herrell SD, Webster RJ. Toward Practical and Accurate Touch-Based Image Guidance for Robotic Partial Nephrectomy. IEEE Trans Med Robot Bionics 2020; 2:196-205. [DOI: 10.1109/tmrb.2020.2989661] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
28
Singh P, Alsadoon A, Prasad P, Venkata HS, Ali RS, Haddad S, Alrubaie A. A novel augmented reality to visualize the hidden organs and internal structure in surgeries. Int J Med Robot 2020; 16:e2055. [DOI: 10.1002/rcs.2055] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2018] [Revised: 10/27/2019] [Accepted: 10/28/2019] [Indexed: 11/08/2022]
Affiliation(s)
- P. Singh, School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia
- Abeer Alsadoon, School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia
- P.W.C. Prasad, School of Computing and Mathematics, Charles Sturt University, Sydney, New South Wales, Australia
- Rasha S. Ali, Department of Computer Techniques Engineering, AL Nisour University College, Baghdad, Iraq
- Sami Haddad, Department of Oral and Maxillofacial Services, Greater Western Sydney Area Health Services, New South Wales, Australia; Department of Oral and Maxillofacial Services, Central Coast Area Health, Gosford, New South Wales, Australia
- Ahmad Alrubaie, Faculty of Medicine, University of New South Wales, Sydney, New South Wales, Australia

29
A case study: impact of target surface mesh size and mesh quality on volume-to-surface registration performance in hepatic soft tissue navigation. Int J Comput Assist Radiol Surg 2020; 15:1235-1245. [PMID: 32221798 PMCID: PMC7351822 DOI: 10.1007/s11548-020-02123-0] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/22/2019] [Accepted: 02/10/2020] [Indexed: 11/30/2022]
Abstract
Purpose Soft tissue deformation severely impacts the registration of pre- and intra-operative image data during computer-assisted navigation in laparoscopic liver surgery. However, quantifying the impact of target surface size, surface orientation, and mesh quality on non-rigid registration performance remains an open research question. This paper aims to uncover how these affect volume-to-surface registration performance. Methods To find such evidence, we design three experiments that are evaluated using a three-step pipeline: (1) volume-to-surface registration using the physics-based shape matching method or PBSM, (2) voxelization of the deformed surface to a 1024³ voxel grid, and (3) computation of similarity (e.g., mutual information), distance (i.e., Hausdorff distance), and classical metrics (i.e., mean squared error or MSE). Results Using the Hausdorff distance, we report a statistical significance for the different partial surfaces. We found that removing non-manifold geometry and noise improved registration performance, and a target surface size of only 16.5% was necessary. Conclusion By investigating three different factors and improving registration results, we defined a generalizable evaluation pipeline and automatic post-processing strategies that were deemed helpful. All source code, reference data, models, and evaluation results are openly available for download: https://github.com/ghattab/EvalPBSM/.
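Of the distance measures in step (3), the Hausdorff distance is the one most often implemented from scratch: it is the worst-case nearest-neighbour error between two point sets, taken symmetrically in both directions. A brute-force sketch for small point sets (illustrative only, not the paper's implementation; builds the full O(N·M) pairwise matrix):

```python
import numpy as np

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between point sets a (N, d) and b (M, d)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    # Pairwise Euclidean distances, shape (N, M), via broadcasting.
    pairwise = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # Directed distances: the worst nearest-neighbour error in each direction.
    a_to_b = pairwise.min(axis=1).max()
    b_to_a = pairwise.min(axis=0).max()
    return max(a_to_b, b_to_a)
```

For large surface meshes, a k-d tree (e.g., SciPy's spatial module) avoids materializing the full distance matrix.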
30

31
Abstract
Artificial intelligence (AI) - the ability of a machine to perform cognitive tasks to achieve a particular goal based on provided data - is revolutionizing and reshaping our health-care systems. The current availability of ever-increasing computational power, highly developed pattern recognition algorithms and advanced image processing software working at very high speeds has led to the emergence of computer-based systems that are trained to perform complex tasks in bioinformatics, medical imaging and medical robotics. Accessibility to 'big data' enables the 'cognitive' computer to scan billions of bits of unstructured information, extract the relevant information and recognize complex patterns with increasing confidence. Computer-based decision-support systems based on machine learning (ML) have the potential to revolutionize medicine by performing complex tasks that are currently assigned to specialists to improve diagnostic accuracy, increase efficiency of throughputs, improve clinical workflow, decrease human resource costs and improve treatment choices. These characteristics could be especially helpful in the management of prostate cancer, with growing applications in diagnostic imaging, surgical interventions, skills training and assessment, digital pathology and genomics. Medicine must adapt to this changing world, and urologists, oncologists, radiologists and pathologists, as high-volume users of imaging and pathology, need to understand this burgeoning science and acknowledge that the development of highly accurate AI-based decision-support applications of ML will require collaboration between data scientists, computer researchers and engineers.
32
Dubrovin V, Egoshin A, Rozhentsov A, Batuhtin D, Eruslanov R, Chernishov D, Furman Y, Baev A. Virtual simulation, preoperative planning and intraoperative navigation during laparoscopic partial nephrectomy. Cent European J Urol 2019; 72:247-251. [PMID: 31720025 PMCID: PMC6830493 DOI: 10.5173/ceju.2019.1632] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/28/2017] [Revised: 02/18/2018] [Accepted: 04/23/2019] [Indexed: 12/26/2022] Open
Abstract
Introduction The use of computer navigation systems is a new and actively explored approach in surgical procedures on the abdominal and retroperitoneal organs. In this paper, we propose an original hardware-software complex that forms a virtual body model from preoperative computed tomography data and transmits it to the operating room monitor via a surgical navigation system with a mechanical digitizer. Material and methods During laparoscopic procedures, a three-dimensional (3D) model of the kidney with its tumor was used to provide additional information on the primary or secondary monitor, or to combine the virtual model with the video image on the main or additional monitor in the operating room. The method was applied in laparoscopic partial nephrectomy: twelve patients with a mean age of 45.4 (38–54) years and clear cell renal cell carcinoma of mean size 27.08 (15–40) mm were operated on. Results All patients successfully underwent laparoscopic partial nephrectomy with intraoperative navigation. The mean operative time was 97.2 (80–155) minutes and the warm ischemia time 18.0 (12–25) minutes. Selective clamping of segmental renal arteries was performed in 7 (58.3%) cases; in the remaining 5 (41.6%) cases the renal artery was clamped. There were no serious complications. The average hospital stay was 7.0 (5–10) days. Conclusions Preliminary results of our clinical study show that 3D modeling is successful for qualitative visualization of kidney tumors during surgical intervention, helping both the surgeon and the patient to understand the nature of the pathological process.
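Navigation setups like the one above, which align a preoperative model with landmark points touched by a mechanical digitizer, typically rest on paired-point rigid registration. The paper does not publish its algorithm, but the classic least-squares solution (the Kabsch/Umeyama method via SVD) can be sketched as follows; all names here are illustrative:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping point set src onto dst.

    src, dst: corresponding points, shape (N, d), N >= 3 non-collinear for d = 3.
    Returns rotation matrix R and translation vector t such that dst ≈ src @ R.T + t.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Given at least three digitized anatomical landmarks and their counterparts in the CT model, this yields the transform used to project the virtual model into the tracker's coordinate frame.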
Affiliation(s)
- Alexey Rozhentsov, Volga State University of Technology, Yoshkar-Ola, Russian Federation
- Dmitrii Batuhtin, Volga State University of Technology, Yoshkar-Ola, Russian Federation
- Ruslan Eruslanov, Volga State University of Technology, Yoshkar-Ola, Russian Federation
- Yacov Furman, Volga State University of Technology, Yoshkar-Ola, Russian Federation
- Alexey Baev, Volga State University of Technology, Yoshkar-Ola, Russian Federation

33
Checcucci E, Amparore D, Pecoraro A, Peretti D, Aimar R, DE Cillis S, Piramide F, Volpi G, Piazzolla P, Manfrin D, Manfredi M, Fiori C, Porpiglia F. 3D mixed reality holograms for preoperative surgical planning of nephron-sparing surgery: evaluation of surgeons' perception. Minerva Urol Nephrol 2019; 73:367-375. [PMID: 31486325 DOI: 10.23736/s2724-6051.19.03610-5] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
BACKGROUND 3D reconstructions are gaining wide diffusion in nephron-sparing surgery (NSS) planning. They have usually been studied on common 2D flat supports, which limit real depth comprehension and interaction. It is now possible to visualize kidney 3D reconstructions as holograms in a "mixed reality" (MR) setting. The aim of this study was to test the face and content validity of this technology, and to assess the role of 3D holograms in aiding preoperative planning for highly complex renal tumors amenable to NSS. METHODS We evaluated surgeons' perception of mixed reality for partial nephrectomy during an international urological meeting organized at our Institution in January 2019. From preoperative CT images, hyper-accuracy 3D (HA3D™) reconstructions were performed. A virtual environment was then created in which the models could be manipulated in a mixed reality setting using HoloLens. All attendees received a questionnaire, scored on a Likert scale (1-10), about their opinion on the use and application of MR. Attendees also had the chance to gain first-hand MR experience and were then asked to choose their clamping and resection approach. RESULTS Overall, 172 questionnaires were collected. The scores for surgical planning (8/10) and anatomical accuracy (9/10) were very positive. High satisfaction with the potential role of this technology in surgical planning and in understanding surgical complexity (both scored 9/10) was expressed. After first-hand experience with HoloLens and MR, 64.4% and 44.4% of the surgeons changed their clamping and resection approach, respectively, compared with CT image visualization only, choosing a more selective one. CONCLUSIONS Our study suggests that surgeons perceive holograms and MR as a useful and interesting tool for the preoperative setting before partial nephrectomy, moving in the direction of ever more precise surgery.
Affiliation(s)
- Enrico Checcucci, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Daniele Amparore, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Angela Pecoraro, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Dario Peretti, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Roberta Aimar, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Sabrina DE Cillis, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Federico Piramide, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Gabriele Volpi, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Pietro Piazzolla, Department of Management and Production Engineering, Polytechnic University of Turin, Turin, Italy
- Diego Manfrin, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Matteo Manfredi, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Cristian Fiori, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy
- Francesco Porpiglia, Division of Urology, Department of Oncology, School of Medicine, San Luigi Gonzaga Hospital, University of Turin, Orbassano, Turin, Italy

34
Joeres F, Schindele D, Luz M, Blaschke S, Russwinkel N, Schostak M, Hansen C. How well do software assistants for minimally invasive partial nephrectomy meet surgeon information needs? A cognitive task analysis and literature review study. PLoS One 2019; 14:e0219920. [PMID: 31318919 PMCID: PMC6638947 DOI: 10.1371/journal.pone.0219920] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2019] [Accepted: 07/04/2019] [Indexed: 12/30/2022] Open
Abstract
INTRODUCTION Intraoperative software assistance is gaining increasing importance in laparoscopic and robot-assisted surgery. Within the user-centred development process of such systems, the first question to be asked is: What information does the surgeon need and when does he or she need it? In this article, we present an approach to investigate these surgeon information needs for minimally invasive partial nephrectomy and compare these needs to the relevant surgical computer assistance literature. MATERIALS AND METHODS First, we conducted a literature-based hierarchical task analysis of the surgical procedure. This task analysis was taken as a basis for a qualitative in-depth interview study with nine experienced surgical urologists. The study employed a cognitive task analysis method to elicit surgeons' information needs during minimally invasive partial nephrectomy. Finally, a systematic literature search was conducted to review proposed software assistance solutions for minimally invasive partial nephrectomy. The review focused on what information the solutions present to the surgeon and what phase of the surgery they aim to support. RESULTS The task analysis yielded a workflow description for minimally invasive partial nephrectomy. During the subsequent interview study, we identified three challenging phases of the procedure, which may particularly benefit from software assistance. These phases are I. Hilar and vascular management, II. Tumour excision, and III. Repair of the renal defects. Between these phases, 25 individual challenges were found which define the surgeon information needs. The literature review identified 34 relevant publications, all of which aim to support the surgeon in hilar and vascular management (phase I) or tumour excision (phase II). CONCLUSION The work presented in this article identified unmet surgeon information needs in minimally invasive partial nephrectomy. 
Namely, our results suggest that future solutions should address the repair of renal defects (phase III) or put more focus on the renal collecting system as a critical anatomical structure.
Affiliation(s)
- Fabian Joeres, Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Daniel Schindele, Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Maria Luz, Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Simon Blaschke, Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Nele Russwinkel, Department of Cognitive Modelling in Dynamic Human-Machine Systems, Technische Universität Berlin, Berlin, Germany
- Martin Schostak, Clinic of Urology and Paediatric Urology, University Hospital of Magdeburg, Magdeburg, Germany
- Christian Hansen, Department of Simulation and Graphics, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany

35
Yan Y, Xia HZ, Li XS, He W, Zhu XH, Zhang ZY, Xiao CL, Liu YQ, Huang H, He LH, Lu J. [Application of U-shaped convolutional neural network in auto segmentation and reconstruction of 3D prostate model in laparoscopic prostatectomy navigation]. JOURNAL OF PEKING UNIVERSITY. HEALTH SCIENCES 2019; 51:596-601. [PMID: 31209437 DOI: 10.19723/j.issn.1671-167x.2019.03.033] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
OBJECTIVE To investigate the efficacy of intraoperative cognitive navigation in laparoscopic radical prostatectomy using 3D prostate models created by a U-shaped convolutional neural network (U-net) and reconstructed through the Medical Image Interaction Tool Kit (MITK) platform. METHODS A total of 5000 manually annotated prostate cancer magnetic resonance (MR) images from a discovery set were used to train a modified U-net, yielding a clinically demand-oriented, stable and efficient fully convolutional neural network algorithm. The MR images were cropped and segmented automatically using the modified U-net, and the segmentation data were automatically reconstructed on the MITK platform according to our own protocols. The models were exported in STL format and displayed on an Android tablet during the operation to support cognitive navigation. RESULTS Based on the original U-net architecture, we established a modified U-net from a 201-case MR imaging training set. The network's performance was tested and compared with human segmentations and with other segmentation networks on a common test data set. Automatic segmentation of multiple structures (prostate, prostate tumors, seminal vesicles, rectum, neurovascular bundles and dorsal venous complex) was successfully achieved, followed by automatic 3D reconstruction through the MITK platform. During surgery, 3D models of the prostatic area were displayed on an Android tablet, and cognitive navigation was successfully achieved. Intraoperative organ visualization demonstrated the structural relationships among the key structures in great detail, and the degree of tumor invasion could be visualized directly. CONCLUSION The modified U-net achieved automatic segmentation of the important structures of the prostatic area. Subsequent 3D model reconstruction and display provide intraoperative visualization of vital structures, helping surgeons achieve cognitive fusion navigation. These techniques could ultimately reduce positive surgical margin rates and may improve the efficacy and oncological outcomes of laparoscopic prostatectomy.
Affiliation(s)
- Y Yan, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- H Z Xia, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- X S Li, Institute of Electronic and Information, Tongji University, Shanghai 400047, China
- W He, Department of Radiology, Peking University Third Hospital, Beijing 100191, China
- X H Zhu, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- Z Y Zhang, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- C L Xiao, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- Y Q Liu, Department of Urology, Peking University Third Hospital, Beijing 100191, China
- H Huang, School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China
- L H He, Institute of Electronic and Information, Tongji University, Shanghai 400047, China
- J Lu, Department of Urology, Peking University Third Hospital, Beijing 100191, China
36
Kobayashi S, Cho B, Huaulmé A, Tatsugami K, Honda H, Jannin P, Hashizume M, Eto M. Assessment of surgical skills by using surgical navigation in robot-assisted partial nephrectomy. Int J Comput Assist Radiol Surg 2019; 14:1449-1459. [PMID: 31119486 DOI: 10.1007/s11548-019-01980-8] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2019] [Accepted: 04/16/2019] [Indexed: 01/15/2023]
Abstract
PURPOSE To assess surgical skills in robot-assisted partial nephrectomy (RAPN) with and without surgical navigation (SN). METHODS We employed an SN system that synchronizes the real-time endoscopic image with a virtual reality three-dimensional (3D) model for RAPN and evaluated the skills of two expert surgeons with regard to the identification and dissection of the renal artery (non-SN group, n = 21 [first surgeon n = 9, second surgeon n = 12]; SN group, n = 32 [first surgeon n = 11, second surgeon n = 21]). We converted all movements of the robotic forceps during RAPN into a dedicated vocabulary. Using RAPN videos, we classified all movements of the robotic forceps into direct action (defined as movements of the robotic forceps that directly affect tissues) and connected motion (defined as movements that link actions). In addition, we analyzed the frequency, duration, and occupancy rate of the connected motion. RESULTS In the SN group, the R.E.N.A.L nephrometry score was lower (7 vs. 6, P = 0.019) and the time to identify and dissect the renal artery (16 vs. 9 min, P = 0.008) was significantly shorter. The connected motions of inefficient "insert," "pull," and "rotate" motions were significantly improved by SN. SN significantly improved the frequency, duration, and occupancy rate of connected motions of the right hand of the first surgeon and of both hands of the second surgeon. The improvements in connected motions were positively associated with SN for both surgeons. CONCLUSION This is the first study to investigate SN for nephron-sparing surgery. SN with 3D models might help improve the connected motions of expert surgeons to ensure efficient RAPN.
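The occupancy-rate analysis described above reduces, conceptually, to summing the time spent in connected motions over the total procedure time. A toy sketch (hypothetical data and function names, not the authors' software):

```python
def connected_motion_stats(movements):
    """Frequency, total duration and occupancy rate of connected motions.

    `movements` is a sequence of (motion_name, duration_s, kind) tuples,
    where kind is "action" (directly affects tissue) or "connected"
    (a movement that links actions).
    """
    connected = [m for m in movements if m[2] == "connected"]
    total_time = sum(d for _, d, _ in movements)
    conn_time = sum(d for _, d, _ in connected)
    return {
        "frequency": len(connected),               # count of connected motions
        "duration": conn_time,                     # total connected-motion time (s)
        "occupancy_rate": conn_time / total_time,  # share of procedure time
    }

sequence = [("grasp", 4.0, "action"), ("pull", 1.0, "connected"),
            ("cut", 3.0, "action"), ("rotate", 2.0, "connected")]
print(connected_motion_stats(sequence))  # frequency 2, duration 3.0, occupancy 0.3
```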
Affiliation(s)
- Satoshi Kobayashi, Department of Advanced Medical Initiatives, Faculty of Medical Sciences, Kyushu University, Fukuoka, Japan; Department of Urology, Kyushu University, Fukuoka, Japan
- Byunghyun Cho, Department of Advanced Medical Initiatives, Faculty of Medical Sciences, Kyushu University, Fukuoka, Japan
- Arnaud Huaulmé, Faculty of Medicine, National Institute of Health and Scientific Research, University of Rennes 1, Rennes, France
- Hiroshi Honda, Department of Radiology, Kyushu University, Fukuoka, Japan
- Pierre Jannin, Faculty of Medicine, National Institute of Health and Scientific Research, University of Rennes 1, Rennes, France
- Makoto Hashizume, Department of Advanced Medical Initiatives, Faculty of Medical Sciences, Kyushu University, Fukuoka, Japan
- Masatoshi Eto, Department of Advanced Medical Initiatives, Faculty of Medical Sciences, Kyushu University, Fukuoka, Japan; Department of Urology, Kyushu University, Fukuoka, Japan
37
Alismail A, Thomas J, Daher NS, Cohen A, Almutairi W, Terry MH, Huang C, Tan LD. Augmented reality glasses improve adherence to evidence-based intubation practice. ADVANCES IN MEDICAL EDUCATION AND PRACTICE 2019; 10:279-286. [PMID: 31191075 PMCID: PMC6511613 DOI: 10.2147/amep.s201640] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/16/2019] [Accepted: 04/06/2019] [Indexed: 06/09/2023]
Abstract
Background: The risk of failed or delayed endotracheal intubation in critically ill patients has commonly been associated with inadequate procedure preparation. Clinicians and trainees in simulation courses for tracheal intubation are encouraged to recall the steps of how to intubate in order to mitigate the risk of a failed intubation. The purpose of this study was to assess the effectiveness of using optical head-mounted display augmented reality (AR) glasses as an assistance tool during a simulated intubation procedure. Methods: A total of 32 subjects (mean age 30±7.8 years; 22 males, 68.7%) were randomly assigned into two groups: an AR group (n=15) and a non-augmented reality (non-AR) group (n=17). Both groups reviewed a video on how to intubate following the New England Journal of Medicine (NEJM) intubation guidelines. The AR group intubated using the AR glasses head-mounted display, whereas the non-AR group performed regular intubation. Results: The AR group took a longer median (min, max) time in seconds to ventilate than the non-AR group (280 (130, 740) vs 205 (100, 390); η²=1.0, p=0.005). Adherence to the NEJM intubation checklist was also higher in the AR group (100% vs 82.4% in the non-AR group; η²=1.8, p<0.001). Conclusion: The AR glasses showed promise in assisting health care professionals with endotracheal intubation simulation. Participants in the AR group took a longer time to ventilate but scored 100% on the checklist developed from the NEJM protocol. These findings show that AR technology can be used in a simulation setting; further study is required before clinical use.
Affiliation(s)
- Abdullah Alismail, Cardiopulmonary Sciences Department, School of Allied Health Professions, Loma Linda University, Loma Linda, CA, USA
- Jonathan Thomas, Zapara School of Business, La Sierra University, Riverside, CA, USA
- Noha S Daher, Allied Health Studies, School of Allied Health Professions, Loma Linda University, Loma Linda, CA, USA
- Avi Cohen, Division of Pulmonary, Critical Care, Hyperbaric and Sleep Medicine, Department of Internal Medicine, Loma Linda University Medical Center, Loma Linda, CA, USA
- Waleed Almutairi, Cardiopulmonary Sciences Department, School of Allied Health Professions, Loma Linda University, Loma Linda, CA, USA
- Michael H Terry, Cardiopulmonary Sciences Department, School of Allied Health Professions, Loma Linda University, Loma Linda, CA, USA; Department of Respiratory Care, Loma Linda University Medical Center, Loma Linda, CA, USA
- Cynthia Huang, Division of Pulmonary, Critical Care, Hyperbaric and Sleep Medicine, Department of Internal Medicine, Loma Linda University Medical Center, Loma Linda, CA, USA
- Laren D Tan, Cardiopulmonary Sciences Department, School of Allied Health Professions, Loma Linda University, Loma Linda, CA, USA; Division of Pulmonary, Critical Care, Hyperbaric and Sleep Medicine, Department of Internal Medicine, Loma Linda University Medical Center, Loma Linda, CA, USA
38
Zhang X, Wang J, Wang T, Ji X, Shen Y, Sun Z, Zhang X. A markerless automatic deformable registration framework for augmented reality navigation of laparoscopy partial nephrectomy. Int J Comput Assist Radiol Surg 2019; 14:1285-1294. [PMID: 31016562 DOI: 10.1007/s11548-019-01974-6] [Citation(s) in RCA: 25] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2018] [Accepted: 04/05/2019] [Indexed: 01/03/2023]
Abstract
PURPOSE Video see-through augmented reality (VST-AR) navigation for laparoscopic partial nephrectomy (LPN) can enhance surgeons' intraoperative perception by visualizing surgical targets and critical structures of the kidney tissue. Image registration is the main challenge in the procedure. Existing registration methods in laparoscopic navigation systems suffer from limitations such as manual alignment, invasive external marker fixation, reliance on external tracking devices with bulky tracking sensors, and lack of deformation compensation. To address these issues, we present a markerless automatic deformable registration framework for LPN VST-AR navigation. METHODS Dense stereo matching and 3D reconstruction, automatic segmentation, and surface stitching are combined to obtain a large dense intraoperative point cloud of the renal surface. A coarse-to-fine deformable registration is then performed to achieve precise automatic registration between the intraoperative point cloud and the preoperative model, using the iterative closest point algorithm followed by the coherent point drift algorithm. Kidney phantom experiments and in vivo experiments were performed to evaluate the accuracy and effectiveness of our approach. RESULTS The accuracy rate of the automatic segmentation averaged 94.9%. The mean target registration error of the phantom experiments was 1.28 ± 0.68 mm (root mean square error). In vivo experiments showed that the tumor location was identified successfully by superimposing the tumor model on the laparoscopic view. CONCLUSION The experimental results demonstrate that the proposed framework can accurately and automatically overlay comprehensive preoperative models on deformable soft organs in a VST-AR manner, without extra intraoperative imaging modalities or external tracking devices, indicating its potential for clinical use.
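The coarse rigid stage of such a coarse-to-fine pipeline can be illustrated with a bare-bones point-to-point iterative closest point (ICP) loop: nearest-neighbour correspondences alternated with a closed-form SVD (Kabsch) update. This is a simplified sketch, not the paper's implementation, and it omits the coherent point drift deformable refinement:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form least-squares rotation R and translation t mapping src -> dst
    (Kabsch algorithm), for row-wise point arrays with known correspondences."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0] * (src.shape[1] - 1) + [d]) @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=30):
    """Rigidly align `source` to `target` by iterating nearest-neighbour
    matching (brute force) and the closed-form rigid update."""
    src = source.copy()
    for _ in range(iters):
        # nearest target point for every source point
        nn = target[np.argmin(((src[:, None] - target[None]) ** 2).sum(-1), axis=1)]
        R, t = best_rigid_transform(src, nn)
        src = src @ R.T + t
    return src
```

With a small initial misalignment the nearest-neighbour matches are correct and the loop converges in a few iterations; in practice the coarse result would then seed a deformable method such as coherent point drift.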
Affiliation(s)
- Xiaohui Zhang, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Junchen Wang, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100083, China
- Tianmiao Wang, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100083, China
- Xuquan Ji, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Yu Shen, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100083, China
- Zhen Sun, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Xuebin Zhang, Department of Urology, Peking Union Medical College Hospital, Peking Union Medical College and Chinese Academy of Medical Sciences, Beijing 100730, China
39
Uppot RN, Laguna B, McCarthy CJ, De Novi G, Phelps A, Siegel E, Courtier J. Implementing Virtual and Augmented Reality Tools for Radiology Education and Training, Communication, and Clinical Care. Radiology 2019; 291:570-580. [PMID: 30990383 DOI: 10.1148/radiol.2019182210] [Citation(s) in RCA: 72] [Impact Index Per Article: 14.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Advances in virtual immersive and augmented reality technology, commercially available for the entertainment and gaming industry, hold potential for education and clinical use in medicine and the field of medical imaging. Radiology departments have begun exploring the use of these technologies to help with radiology education and clinical care. The purpose of this review article is to summarize how three institutions have explored using virtual and augmented reality for radiology.
Affiliation(s)
- Raul N Uppot, Benjamin Laguna, Colin J McCarthy, Gianluca De Novi, Andrew Phelps, Eliot Siegel, Jesse Courtier
- From the Department of Radiology, Division of Interventional Radiology, Massachusetts General Hospital, 55 Fruit St, Gray 290, Boston, MA 02114 (R.N.U., C.J.M., G.D.N.); Department of Radiology and Biomedical Imaging, University of California San Francisco Medical Center, San Francisco, Calif (B.L., A.P., J.C.); and Department of Radiology, University of Maryland Medical Center, Baltimore, Md (E.S.)
40
Bertolo R, Hung A, Porpiglia F, Bove P, Schleicher M, Dasgupta P. Systematic review of augmented reality in urological interventions: the evidences of an impact on surgical outcomes are yet to come. World J Urol 2019; 38:2167-2176. [PMID: 30826888 DOI: 10.1007/s00345-019-02711-z] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2018] [Accepted: 02/26/2019] [Indexed: 01/12/2023] Open
Abstract
PURPOSE To perform a systematic literature review on the clinical impact of augmented reality (AR) for urological interventions. METHODS As of June 21, 2018, a systematic literature review was performed via the Medline, Embase and Cochrane databases in accordance with the PRISMA guidelines and registered at PROSPERO (CRD42018102194). Only full-text articles in English were included, without time restrictions. Articles were considered if they reported on the use of AR during urological intervention and its impact on surgical outcomes. The risk of bias and the quality of each included study were independently assessed using the standard Cochrane Collaboration risk of bias tool and the Risk Of Bias In Non-randomised Studies-of Interventions tool (ROBINS-I). RESULTS 131 articles were identified, of which 102 remained after duplicate removal and were critically reviewed for evidence synthesis. 20 studies reporting on the outcomes of AR use during urological interventions in a clinical setting were considered. Given the mostly non-comparative design of the studies identified, the evidence synthesis was performed in a descriptive and narrative manner. Only one comparative study was found, the remaining 19 being single-arm observational studies. Based on the existing evidence, we are unable to state that AR improves the outcomes of urological interventions. The major limitation of AR-assisted surgery is inaccuracy in registration, which translates into poor navigation precision. CONCLUSIONS To date, there is limited evidence showing superior therapeutic benefit of AR-guided surgery compared with the conventional surgical approach to the respective disease.
Affiliation(s)
- Riccardo Bertolo, Glickman Urological and Kidney Institute, Cleveland Clinic, 2050 E 96th St, Q Building, Cleveland, OH 44195, USA; Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy
- Andrew Hung, Center for Robotic Simulation and Education, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Francesco Porpiglia, Division of Urology, Department of Oncology, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Pierluigi Bove, Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy
- Mary Schleicher, Floyd D. Loop Alumni Library, Cleveland Clinic, Cleveland, OH, USA
41
Wang J, Shen Y, Yang S. A practical marker-less image registration method for augmented reality oral and maxillofacial surgery. Int J Comput Assist Radiol Surg 2019; 14:763-773. [PMID: 30825070 DOI: 10.1007/s11548-019-01921-5] [Citation(s) in RCA: 31] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2018] [Accepted: 01/29/2019] [Indexed: 12/12/2022]
Abstract
BACKGROUND Image registration lies at the core of augmented reality (AR), aligning the virtual scene with reality. In AR surgical navigation, the performance of image registration is vital to the surgical outcome. METHODS This paper presents a practical marker-less image registration method for AR-guided oral and maxillofacial surgery, in which a virtual scene is generated and mixed with reality to guide surgical operation or provide surgical outcome visualization via video see-through overlay. An intraoral 3D scanner is employed to acquire the patient's teeth shape model intraoperatively. The shape model is then registered with a custom-made stereo camera system using a novel 3D stereo matching algorithm, and with the patient's CT-derived 3D model using an iterative closest point scheme. By leveraging the intraoral 3D scanner, the CT space and the stereo camera space are associated so that surrounding anatomical models and virtual implants can be overlaid on the camera's view to achieve AR surgical navigation. RESULTS Jaw phantom experiments were performed to evaluate the target registration error of the overlay, which yielded an average error of less than 0.50 mm at a time cost of less than 0.5 s. A volunteer trial was also conducted to show clinical feasibility. CONCLUSIONS The proposed registration method does not rely on any external fiducial markers attached to the patient. It performs automatically so as to maintain a correct AR scene, overcoming the misalignment caused by the patient's movement. It is therefore noninvasive and practical for oral and maxillofacial surgery.
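Target registration error, the accuracy measure reported above, is simply the distance between registered target points and their ground-truth positions; a minimal sketch (toy data, not the authors' evaluation code):

```python
import numpy as np

def target_registration_error(R, t, targets_src, targets_ref):
    """Mean Euclidean distance between target points mapped through the
    estimated registration (R, t) and their reference positions."""
    mapped = targets_src @ R.T + t
    return float(np.linalg.norm(mapped - targets_ref, axis=1).mean())

# Identity rotation with a 0.5 mm residual offset on one axis -> TRE of 0.5 mm
R, t = np.eye(3), np.array([0.0, 0.0, 0.5])
src = np.zeros((4, 3))
ref = np.zeros((4, 3))
print(target_registration_error(R, t, src, ref))  # 0.5
```

The key point is that TRE is measured at clinically relevant target points, not at the fiducials (or surfaces) used to compute the registration itself.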
Affiliation(s)
- Junchen Wang, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100086, China
- Yu Shen, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100086, China
- Shuo Yang, Stomatological Hospital, Southern Medical University, Guangzhou, China
42
Glybochko PV, Alyaev YG, Khokhlachev SB, Fiev DN, Shpot EV, Petrovsky NV, Zhang D, Proskura AV, Yurova M, Matz EL, Wang X, Atala A, Zhang Y, Butnaru DV. 3D reconstruction of CT scans aid in preoperative planning for sarcomatoid renal cancer: A case report and mini-review. JOURNAL OF X-RAY SCIENCE AND TECHNOLOGY 2019; 27:389-395. [PMID: 30689600 DOI: 10.3233/xst-180387] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Contrast-enhanced multi-slice computed tomography (MSCT) is commonly used in the diagnosis of complex malignant tumours. This technology provides comprehensive and accurate information about tumour size and shape in relation to solid tumours and the affected adjacent organs and tissues. This case report demonstrates the benefit of using MSCT 3D imaging for preoperative planning in a patient with late-stage (T4) sarcomatoid renal cell carcinoma, a rare renal malignant tumour. The surgical margin on the liver was negative, and no metastases to veins, lungs or other organs were detected by abdominal and chest contrast-enhanced CT. Although sarcomatoid histology is considered to be a poor prognostic factor, the patient is alive and well 17 months after surgery. The MSCT imaging modality enables 3D rendering of an area of interest, which assists surgical decision-making in cases of advanced renal tumours. In this case, as a result of MSCT 3D reconstruction, the patient received justified surgical treatment without compromising oncological principles.
Affiliation(s)
- Petr V Glybochko, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Yuriy G Alyaev, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Sergey B Khokhlachev, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Dmitriy N Fiev, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Evgeniy V Shpot, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Nikolay V Petrovsky, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Deying Zhang, Department of Urology, Children's Hospital of Chongqing Medical University, Chongqing, China
- Alexandra V Proskura, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Maria Yurova, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
- Ethan Lester Matz, Institute for Regenerative Medicine, Wake Forest University, Winston-Salem, NC, USA
- Xisheng Wang, Department of Urology, Shenzhen Longhua District Central Hospital, Shenzhen, China
- Anthony Atala, Institute for Regenerative Medicine, Wake Forest University, Winston-Salem, NC, USA
- Yuanyuan Zhang, Institute for Regenerative Medicine, Wake Forest University, Winston-Salem, NC, USA
- Denis V Butnaru, Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russian Federation
43
Intraoperative Imaging Techniques to Support Complete Tumor Resection in Partial Nephrectomy. Eur Urol Focus 2018; 4:960-968. [DOI: 10.1016/j.euf.2017.04.008] [Citation(s) in RCA: 44] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/03/2017] [Accepted: 04/29/2017] [Indexed: 12/22/2022]
44
Vision for the future on urolithiasis: research, management, education and training—some personal views. Urolithiasis 2018; 47:401-413. [DOI: 10.1007/s00240-018-1086-2] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/18/2018] [Accepted: 10/23/2018] [Indexed: 12/17/2022]
45
Robu MR, Ramalhinho J, Thompson S, Gurusamy K, Davidson B, Hawkes D, Stoyanov D, Clarkson MJ. Global rigid registration of CT to video in laparoscopic liver surgery. Int J Comput Assist Radiol Surg 2018; 13:947-956. [PMID: 29736801 PMCID: PMC5974008 DOI: 10.1007/s11548-018-1781-z] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2018] [Accepted: 04/27/2018] [Indexed: 11/09/2022]
Abstract
PURPOSE Image-guidance systems have the potential to aid in laparoscopic interventions by providing sub-surface structure information and tumour localisation. The registration of a preoperative 3D image with the intraoperative laparoscopic video feed is an important component of image guidance, which should be fast, robust and cause minimal disruption to the surgical procedure. Most methods for rigid and non-rigid registration require a good initial alignment. However, in most research systems for abdominal surgery, the user has to manually rotate and translate the models, which is usually difficult to perform quickly and intuitively. METHODS We propose a fast, global method for the initial rigid alignment between a 3D mesh derived from a preoperative CT of the liver and a surface reconstruction of the intraoperative scene. We formulate the shape matching problem as a quadratic assignment problem which minimises the dissimilarity between feature descriptors while enforcing geometrical consistency between all the feature points. We incorporate a novel constraint based on the liver contours which deals specifically with the challenges introduced by laparoscopic data. RESULTS We validate our proposed method on synthetic data, on a liver phantom and on retrospective clinical data acquired during a laparoscopic liver resection. We show robustness over reduced partial size and increasing levels of deformation. Our results on the phantom and on the real data show good initial alignment, which can successfully converge to the correct position using fine alignment techniques. Furthermore, since we can pre-process the CT scan before surgery, the proposed method runs faster than current algorithms. CONCLUSION The proposed shape matching method can provide a fast, global initial registration, which can be further refined by fine alignment methods. This approach will lead to a more usable and intuitive image-guidance system for laparoscopic liver surgery.
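The quadratic assignment formulation described above can be illustrated with a toy brute-force version: over all one-to-one matchings between tiny point sets, minimise descriptor dissimilarity plus a pairwise geometric-consistency penalty. This is a sketch of the idea only, with hypothetical scalar descriptors; the paper solves the problem with a proper optimisation scheme, not enumeration:

```python
from itertools import permutations

def match_qap(desc_a, desc_b, pts_a, pts_b, lam=1.0):
    """Brute-force the tiny quadratic assignment: return the permutation pi
    (a-index i -> b-index pi[i]) minimising the sum of a unary descriptor
    cost and a pairwise term penalising mismatched inter-point distances."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    n = len(desc_a)
    best, best_cost = None, float("inf")
    for pi in permutations(range(n)):
        cost = sum(abs(desc_a[i] - desc_b[pi[i]]) for i in range(n))     # unary
        cost += lam * sum(abs(dist(pts_a[i], pts_a[j]) -
                              dist(pts_b[pi[i]], pts_b[pi[j]]))
                          for i in range(n) for j in range(i + 1, n))    # pairwise
        if cost < best_cost:
            best, best_cost = pi, cost
    return best
```

Enumeration is factorial in the number of points, which is why real systems relax the assignment problem; the sketch only shows what the objective rewards: matching features that both look alike and preserve geometry.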
Affiliation(s)
- Maria R Robu, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
- João Ramalhinho, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
- Stephen Thompson, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
- Kurinchi Gurusamy, Division of Surgery and Interventional Science, University College London, London, UK
- Brian Davidson, Division of Surgery and Interventional Science, University College London, London, UK
- David Hawkes, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
- Danail Stoyanov, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
- Matthew J Clarkson, Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK; Centre for Medical Image Computing, University College London, London, UK
46
Dey A, Billinghurst M, Lindeman RW, Swan JE. A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Front Robot AI 2018; 5:37. [PMID: 33500923 PMCID: PMC7805955 DOI: 10.3389/frobt.2018.00037] [Citation(s) in RCA: 66] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2017] [Accepted: 03/19/2018] [Indexed: 11/13/2022] Open
Abstract
Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review 10 years of the most influential AR user studies, from 2005 to 2014. A total of 291 papers with 369 individual user studies have been reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research, and to provide a high-level view of how that landscape has changed. We summarize the high-level contributions from each category of papers, and present examples of the most influential user studies. We also identify areas where there have been few user studies, and opportunities for future research. Among other things, we find that there is a growing trend toward handheld AR user studies, and that most studies are conducted in laboratory settings and do not involve pilot testing. This research will be useful for AR researchers who want to follow best practices in designing their own AR user studies.
Affiliation(s)
- Arindam Dey, Empathic Computing Laboratory, University of South Australia, Mawson Lakes, SA, Australia
- Mark Billinghurst, Empathic Computing Laboratory, University of South Australia, Mawson Lakes, SA, Australia
- Robert W Lindeman, Human Interface Technology Lab New Zealand (HIT Lab NZ), University of Canterbury, Christchurch, New Zealand
- J Edward Swan, Mississippi State University, Starkville, MS, United States
47
Intraoperative utilization of advanced imaging modalities in a complex kidney stone case: a pilot case study. World J Urol 2018; 36:733-743. [DOI: 10.1007/s00345-018-2260-4] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2017] [Accepted: 03/07/2018] [Indexed: 10/17/2022] Open
48
Singla R, Edgcumbe P, Pratt P, Nguan C, Rohling R. Intra-operative ultrasound-based augmented reality guidance for laparoscopic surgery. Healthc Technol Lett 2017; 4:204-209. [PMID: 29184666 PMCID: PMC5683195 DOI: 10.1049/htl.2017.0063] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/24/2017] [Accepted: 07/28/2017] [Indexed: 01/20/2023] Open
Abstract
In laparoscopic surgery, the surgeon must operate with a limited field of view and reduced depth perception. This makes spatial understanding of critical structures difficult, such as an endophytic tumour in a partial nephrectomy. Such tumours yield a high complication rate of 47%, and excising them increases the risk of cutting into the kidney's collecting system. To overcome these challenges, an augmented reality guidance system is proposed. Using intra-operative ultrasound, a single navigation aid, and surgical instrument tracking, four augmentations of guidance information are provided during tumour excision. Qualitative and quantitative system benefits are measured in simulated robot-assisted partial nephrectomies. Robot-to-camera calibration achieved a total registration error of 1.0 ± 0.4 mm, while the total system error was 2.5 ± 0.5 mm. The system significantly reduced the healthy tissue excised from an average (±standard deviation) of 30.6 ± 5.5 to 17.5 ± 2.4 cm³ (p < 0.05) and reduced the depth of cut below the tumour underside from an average (±standard deviation) of 10.2 ± 4.1 to 3.3 ± 2.3 mm (p < 0.05). Further evaluation is required in vivo, but the system has promising potential to reduce the amount of healthy parenchymal tissue excised.
Affiliation(s)
- Rohit Singla, Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada V6T 1Z4
- Philip Edgcumbe, MD/PhD Program, University of British Columbia, Vancouver, Canada V6T 1Z4
- Philip Pratt, Department of Surgery and Cancer, Imperial College London, UK, SW7 2BX
- Christopher Nguan, Department of Urological Sciences, University of British Columbia, Vancouver, Canada V6T 1Z4
- Robert Rohling, Department of Electrical and Computer Engineering and Department of Mechanical Engineering, University of British Columbia, Vancouver, Canada V6T 1Z4
|
49
|
Detmer FJ, Hettig J, Schindele D, Schostak M, Hansen C. Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review. IEEE Rev Biomed Eng 2017; 10:78-94. [PMID: 28885161 DOI: 10.1109/rbme.2017.2749527] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
PURPOSE Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. METHODS A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems solely developed or evaluated for training purposes. RESULTS In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. Main challenges remaining for future research include the consideration of organ movement and deformation, human factor issues, and the conduct of large clinical studies. CONCLUSION Augmented and virtual reality systems have the potential to improve safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with current limitations of virtual and augmented reality assistance in clinical environments.
|
50
|
Augmented reality in a tumor resection model. Surg Endosc 2017; 32:1192-1201. [DOI: 10.1007/s00464-017-5791-7] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2017] [Accepted: 07/28/2017] [Indexed: 01/20/2023]
|