1
Mahmud M, Sari DCR, Sari D, Arfian N, Zucha MA. The application of augmented reality for improving clinical skills: a scoping review. Korean J Med Educ 2024;36:65-79. [PMID: 38462243] [PMCID: PMC10925804] [DOI: 10.3946/kjme.2024.285]
Abstract
Augmented reality technology has developed rapidly in recent years and has been applied in many fields, including medical education, where it has the potential to improve students' knowledge and skills. This scoping review aims to elaborate on current studies of the implementation of augmented reality for advancing clinical skills. The electronic databases PubMed, Embase, and Web of Science were searched in June 2022 for articles focusing on the use of augmented reality for improving clinical skills. The Rayyan website was used to screen articles against the inclusion criterion: the application of augmented reality as a learning method in medical education. A total of 37 articles met the inclusion criteria. These publications suggest that using augmented reality can improve clinical skills. Laparoscopic surgery skills and ophthalmology were the most studied topics. The research methods applied in the articles fall into two main categories: randomized controlled trials (RCTs) (29.3%) and non-RCTs (70.3%). Augmented reality has the potential to be integrated into medical education, particularly to boost clinical skills. Given the limited number of databases searched, however, further studies on the implementation of augmented reality as a method to enhance skills in medical education are needed.
Affiliation(s)
- Mahmud Mahmud
- Department of Anesthesiology & Intensive Care Therapy, Sardjito General Hospital, Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Dwi Cahyani Ratna Sari
- Department of Anatomy, Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Djayanti Sari
- Department of Anesthesiology & Intensive Care Therapy, Sardjito General Hospital, Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Nur Arfian
- Department of Anatomy, Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Muhammad Ary Zucha
- Department of Obstetrics and Gynecology, Sardjito General Hospital, Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
2
Pulumati A, Algarin YA, Jaalouk D, Hirsch M, Nouri K. Exploring the potential role for extended reality in Mohs micrographic surgery. Arch Dermatol Res 2024;316:67. [PMID: 38194123] [DOI: 10.1007/s00403-023-02804-1]
Abstract
Mohs micrographic surgery (MMS) is a cornerstone of dermatological practice. Virtual reality (VR) and augmented reality (AR) technologies, initially used for entertainment, have entered healthcare, offering real-time data overlaying a surgeon's view. This paper explores potential applications of VR and AR in MMS, emphasizing their advantages and limitations. We aim to identify research gaps to facilitate innovation in dermatological surgery. We conducted a PubMed search using the following: "augmented reality" OR "virtual reality" AND "Mohs", or "augmented reality" OR "virtual reality" AND "surgery". Inclusion criteria were peer-reviewed articles in English discussing these technologies in medical settings. We excluded non-peer-reviewed sources, non-English articles, and those not addressing these technologies in a medical context. VR alleviates patient anxiety and enhances patient satisfaction while serving as an educational tool. It also aids physicians by providing realistic surgical simulations. AR, on the other hand, assists in real-time lesion analysis, optimizing incision planning, and refining margin control during surgery. Both technologies offer remote guidance for trainee residents, enabling real-time learning and oversight and facilitating synchronous teleconsultations. These technologies may transform dermatologic surgery, making it more accessible and efficient. However, further research is needed to validate their effectiveness, address potential challenges, and optimize seamless integration. In sum, AR and VR enhance real-world environments with digital data, offering real-time surgical guidance and medical insights. By exploring the potential integration of these technologies in MMS, our study identifies avenues for further research to thoroughly understand how these technologies can redefine dermatologic surgery, elevating precision, surgical outcomes, and patient experiences.
Affiliation(s)
- Anika Pulumati
- University of Missouri-Kansas City School of Medicine, Kansas City, MO, USA
- Department of Dermatology and Cutaneous Surgery, University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
- Dana Jaalouk
- Florida State University College of Medicine, Tallahassee, FL, USA
- Melanie Hirsch
- University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
- Keyvan Nouri
- Department of Dermatology and Cutaneous Surgery, University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
3
Chiou SY, Liu LS, Lee CW, Kim DH, Al-Masni MA, Liu HL, Wei KC, Yan JL, Chen PY. Augmented Reality Surgical Navigation System Integrated with Deep Learning. Bioengineering (Basel) 2023;10:617. [PMID: 37237687] [DOI: 10.3390/bioengineering10050617]
Abstract
Most current surgical navigation methods rely on optical navigators with images displayed on an external screen. However, minimizing distractions during surgery is critical, and the spatial information displayed in this arrangement is non-intuitive. Previous studies have proposed combining optical navigation systems with augmented reality (AR) to provide surgeons with intuitive imaging during surgery through the use of planar and three-dimensional imagery. However, these studies have mainly focused on visual aids and have paid relatively little attention to real surgical guidance aids. Moreover, the use of augmented reality reduces system stability and accuracy, and optical navigation systems are costly. Therefore, this paper proposes an augmented reality surgical navigation system based on image positioning that achieves the desired system advantages with low cost, high stability, and high accuracy. The system also provides intuitive guidance for the surgical target point, entry point, and trajectory. Once the surgeon uses the navigation stick to indicate the position of the surgical entry point, the connection between the surgical target and the surgical entry point is immediately displayed on the AR device (tablet or HoloLens glasses), and a dynamic auxiliary line is shown to assist with incision angle and depth. Clinical trials were conducted for EVD (extra-ventricular drainage) surgery, and surgeons confirmed the system's overall benefit. A "virtual object automatic scanning" method is proposed to achieve a high accuracy of 1 ± 0.1 mm for the AR-based system. Furthermore, a deep learning-based U-Net segmentation network is incorporated to enable automatic identification of the hydrocephalus location by the system. The system achieves recognition accuracy, sensitivity, and specificity of 99.93%, 93.85%, and 95.73%, respectively, representing a significant improvement over previous studies.
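As a side note on the figures reported above: recognition accuracy, sensitivity, and specificity are standard confusion-matrix metrics for a binary segmentation such as a U-Net hydrocephalus mask. A minimal illustrative sketch, not the authors' code; the toy masks below are invented for demonstration:

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Accuracy, sensitivity, and specificity for binary masks."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)    # voxels correctly labeled positive
    tn = np.sum(~pred & ~truth)  # voxels correctly labeled negative
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    accuracy = (tp + tn) / pred.size
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    return accuracy, sensitivity, specificity

# Toy 8-voxel masks, purely for illustration
pred  = [1, 1, 0, 0, 1, 0, 0, 0]
truth = [1, 1, 1, 0, 0, 0, 0, 0]
acc, sens, spec = segmentation_metrics(pred, truth)  # 0.75, ~0.667, 0.8
```

In practice the same computation would run over the full 3D voxel grid of the segmentation output.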
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Li-Sheng Liu
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Chia-Wei Lee
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Dong-Hyun Kim
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Mohammed A Al-Masni
- Department of Artificial Intelligence, College of Software & Convergence Technology, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Kuo-Chen Wei
- New Taipei City Tucheng Hospital, Tao-Yuan, Tucheng, New Taipei City 236, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
4
von Ende E, Ryan S, Crain MA, Makary MS. Artificial Intelligence, Augmented Reality, and Virtual Reality Advances and Applications in Interventional Radiology. Diagnostics (Basel) 2023;13:892. [PMID: 36900036] [PMCID: PMC10000832] [DOI: 10.3390/diagnostics13050892]
Abstract
Artificial intelligence (AI) uses computer algorithms to process and interpret data as well as perform tasks, while continuously redefining itself. Machine learning, a subset of AI, is based on reverse training, in which evaluation and extraction of data occur from exposure to labeled examples. AI is capable of using neural networks to extract more complex, high-level data, even from unlabeled data sets, and better emulate, or even exceed, the human brain. Advances in AI have revolutionized, and will continue to revolutionize, medicine, especially the field of radiology. Compared to the field of interventional radiology (IR), AI innovations in the field of diagnostic radiology are more widely understood and used, although significant potential and growth remain on the horizon. Additionally, AI is closely related to, and often incorporated into, the technology and programming of augmented reality, virtual reality, and radiogenomic innovations, which have the potential to enhance the efficiency and accuracy of radiological diagnoses and treatment planning. Many barriers limit the application of artificial intelligence in the clinical practice and dynamic procedures of interventional radiology. Despite these barriers to implementation, artificial intelligence in IR continues to advance, and the continued development of machine learning and deep learning places interventional radiology in a unique position for exponential growth. This review describes the current and possible future applications of artificial intelligence, radiogenomics, and augmented and virtual reality in interventional radiology, while also describing the challenges and limitations that must be addressed before these applications can be fully implemented in common clinical practice.
5
Chiou SY, Zhang ZY, Liu HL, Yan JL, Wei KC, Chen PY. Augmented Reality Surgical Navigation System for External Ventricular Drain. Healthcare (Basel) 2022;10:1815. [PMID: 36292263] [PMCID: PMC9601392] [DOI: 10.3390/healthcare10101815]
Abstract
Augmented reality surgery systems are playing an increasing role in the operating room, but applying such systems to neurosurgery presents particular challenges. In addition to using augmented reality technology to display the position of the surgical target in 3D in real time, the application must also display the scalpel entry point and scalpel orientation, with accurate superposition on the patient. To improve the intuitiveness, efficiency, and accuracy of extra-ventricular drain surgery, this paper proposes an augmented reality surgical navigation system which accurately superimposes the surgical target position, scalpel entry point, and scalpel direction on a patient's head and displays this information on a tablet. The accuracy of the optical measurement system (NDI Polaris Vicra) was first independently tested, and then complemented by the design of functions to help the surgeon quickly identify the surgical target position and determine the preferred entry point. A tablet PC was used to display the superimposed images of the surgical target, entry point, and scalpel on top of the patient, allowing for correct scalpel orientation. Digital Imaging and Communications in Medicine (DICOM) data from the patient's computed tomography were used to create a phantom and its associated AR model. This model was then imported into the application, which was executed on the tablet. In the preoperative phase, the technician first spent 5–7 min superimposing the virtual image of the head and the scalpel. The surgeon then took 2 min to identify the intended target position and entry point on the tablet, which then dynamically displayed the superimposed image of the head, target position, entry point position, and scalpel (including the scalpel tip and scalpel orientation). Multiple experiments were successfully conducted on the phantom, along with six practical trials of clinical neurosurgical EVD. In the 2D-plane-superposition model, the optical measurement system (NDI Polaris Vicra) provided highly accurate visualization (2.01 ± 1.12 mm). In hospital-based clinical trials, the average technician preparation time was 6 min, while the surgeon required an average of 3.5 min to set the target and entry-point positions and accurately overlay the orientation with an NDI surgical stick. In the preparation phase, the average time required for the DICOM-formatted image processing and program import was 120 ± 30 min. The accuracy of the designed augmented reality optical surgical navigation system met clinical requirements and can provide a visual and intuitive guide for neurosurgeons. The surgeon can use the tablet application to obtain real-time DICOM-formatted images of the patient, change the position of the surgical entry point, and instantly obtain an updated surgical path and surgical angle. The proposed design can be used as the basis for various augmented reality brain surgery navigation systems in the future.
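For context, superposition accuracy figures such as 2.01 ± 1.12 mm are typically obtained by measuring the Euclidean distance between corresponding landmarks in the overlaid virtual model and the physical phantom. A hedged sketch of that computation, with made-up point coordinates rather than the study's data:

```python
import numpy as np

def registration_errors(overlaid, reference):
    """Per-landmark Euclidean distance (mm) between overlaid and reference points."""
    overlaid = np.asarray(overlaid, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return np.linalg.norm(overlaid - reference, axis=1)

# Hypothetical coordinates in mm for three fiducials on a phantom
overlaid  = [[10.0, 0.0, 0.0], [0.0, 22.0, 0.0], [0.0, 0.0, 31.0]]
reference = [[12.0, 0.0, 0.0], [0.0, 20.0, 0.0], [0.0, 0.0, 30.0]]
err = registration_errors(overlaid, reference)  # [2.0, 2.0, 1.0]
summary = (err.mean(), err.std(ddof=1))         # reported as mean ± sample SD
```

The mean ± SD pair computed this way is the same shape of summary statistic the abstract reports.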
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Zhi-Yue Zhang
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Kuo-Chen Wei
- Department of Neurosurgery, New Taipei City TuCheng Hospital, New Taipei City 236, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- School of Medicine, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Correspondence: Tel.: +886-2-2431-3131
6
Paraboschi I, Mantica G, Minoli DG, De Marco EA, Gnech M, Bebi C, Manzoni G, Berrettini A. Fluorescence-Guided Surgery and Novel Innovative Technologies for Improved Visualization in Pediatric Urology. Int J Environ Res Public Health 2022;19:11194. [PMID: 36141458] [PMCID: PMC9517607] [DOI: 10.3390/ijerph191811194]
Abstract
Fluorescence-guided surgery (FGS), three-dimensional (3D) imaging technologies, and other innovative devices are rapidly revolutionizing the field of urology, providing surgeons with powerful tools for a more complete understanding of patient-specific anatomy. Today, several new intraoperative imaging technologies and cutting-edge devices are available in adult urology to assist surgeons in delivering personalized interventions. Their applications are also gradually growing in general pediatric surgery, where the detailed visualization of normal and pathological structures has the potential to significantly minimize perioperative complications and improve surgical outcomes. In the field of pediatric urology, FGS, 3D reconstructions and printing technologies, augmented reality (AR) devices, contrast-enhanced ultrasound (CEUS), and intraoperative magnetic resonance imaging (iMRI) have been increasingly adopted for a more realistic understanding of the normal and abnormal anatomy, providing a valuable insight to deliver customized treatments in real time. This narrative review aims to illustrate the main applications of these new technologies and imaging devices in the clinical setting of pediatric urology by selecting, with a strict methodology, the most promising articles published in the international scientific literature on this topic. The purpose is to favor early adoption and stimulate more research on this topic for the benefit of children.
Affiliation(s)
- Irene Paraboschi
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
- Guglielmo Mantica
- Department of Urology, Policlinico San Martino Hospital, University of Genoa, 16132 Genoa, Italy
- Dario Guido Minoli
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
- Erika Adalgisa De Marco
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
- Michele Gnech
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
- Carolina Bebi
- Department of Urology, Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, Università degli Studi di Milano, 20122 Milan, Italy
- Gianantonio Manzoni
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
- Alfredo Berrettini
- Department of Pediatric Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, 20122 Milan, Italy
7
Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022;74:528-537. [PMID: 35383432] [DOI: 10.23736/s2724-6051.22.04726-7]
Abstract
INTRODUCTION: Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intraoperative images onto the operative field. Augmented reality has been increasingly used in myriad surgical specialties, including urology. The following study reviews advances in the use of AR for improving urologic outcomes. EVIDENCE ACQUISITION: We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology up until March 2021. The MEDLINE, Scopus, and Web of Science databases were used for the literature search. Study selection was conducted according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Included studies were limited to those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS: A total of 60 studies were identified and included in the present analysis. Overall, 19 studies were descriptive/validity/phantom studies of specific AR methodologies, 4 were case reports, and 37 were clinical prospective/retrospective comparative studies. CONCLUSIONS: Advances in AR have led to increasing registration accuracy as well as an increased ability to identify anatomic landmarks and improve outcomes during urologic procedures such as robot-assisted radical prostatectomy (RARP) and robot-assisted partial nephrectomy.
Affiliation(s)
- Sidney Roberts
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Aditya Desai
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Enrico Checcucci
- School of Medicine, Division of Urology, Department of Oncology, San Luigi Hospital, University of Turin, Orbassano, Turin, Italy
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Stefano Puliatti
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, University of Modena and Reggio Emilia, Modena, Italy
- Department of Urology, OLV, Aalst, Belgium
- ORSI Academy, Melle, Belgium
- Mark Taratkin
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russia
- Karl-Friedrich Kowalewski
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Virgen Macarena University Hospital, Seville, Spain
- Department of Urology and Urosurgery, University Hospital of Mannheim, Mannheim, Germany
- Juan Gomez Rivas
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, Clinico San Carlos University Hospital, Madrid, Spain
- Ines Rivero
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology and Nephrology, Virgen del Rocío University Hospital, Seville, Spain
- Domenico Veneziano
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, Riuniti Hospital, Reggio Calabria, Reggio Calabria, Italy
- Francesco Porpiglia
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Inderbir S Gill
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Giovanni E Cacciamani
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Keck School of Medicine, Department of Radiology, University of Southern California, Los Angeles, CA, USA
8
Yu J, Xie H, Wang S. The effectiveness of augmented reality assisted technology on LPN: a systematic review and meta-analysis. Minim Invasive Ther Allied Technol 2022;31:981-991. [DOI: 10.1080/13645706.2022.2051190]
Affiliation(s)
- Jiaqi Yu
- School of Medical Instrument and Food Engineering, University of Shanghai for Science and Technology, Shanghai, China
- Hua Xie
- Department of Urology, Children’s Hospital of Shanghai Jiaotong University, Shanghai, China
- Shuyi Wang
- School of Medical Instrument and Food Engineering, University of Shanghai for Science and Technology, Shanghai, China
9
Paraboschi I, Gnech M, De Marco EA, Minoli DG, Bebi C, Zanetti SP, Manzoni G, Montanari E, Berrettini A. Pediatric Urolithiasis: Current Surgical Strategies and Future Perspectives. Front Pediatr 2022;10:886425. [PMID: 35757114] [PMCID: PMC9218273] [DOI: 10.3389/fped.2022.886425]
Abstract
New technological innovations and cutting-edge techniques have led to important changes in the surgical management of pediatric urolithiasis. Miniaturized technologies and minimally invasive approaches have been increasingly used in children with urinary stones to minimize surgical complications and improve patient outcomes. Moreover, the new computer technologies of the digital era are opening new horizons for the preoperative planning and surgical treatment of children with urinary calculi. Three-dimensional modeling reconstructions and virtual, augmented, and mixed reality are rapidly approaching surgical practice, equipping surgeons with powerful instruments to enhance the real-time intraoperative visualization of normal and pathological structures. The broad range of possibilities offered by these technological innovations in the adult population finds increasing application in pediatrics, offering a more detailed visualization of small anatomical structures. This review illustrates the most promising techniques and devices for enhancing the surgical treatment of urolithiasis in children, aiming to favor early adoption and stimulate more research on this topic.
Affiliation(s)
- Irene Paraboschi
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
- Michele Gnech
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
- Erika Adalgisa De Marco
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
- Dario Guido Minoli
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
- Carolina Bebi
- Department of Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Università degli Studi di Milano, Milan, Italy
- Stefano Paolo Zanetti
- Department of Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Università degli Studi di Milano, Milan, Italy
- Gianantonio Manzoni
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
- Emanuele Montanari
- Department of Urology, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Università degli Studi di Milano, Milan, Italy
- Alfredo Berrettini
- Pediatric Urology Unit, Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy
10
Porpiglia F, Checcucci E, Amparore D, Peretti D, Piramide F, De Cillis S, Piana A, Niculescu G, Verri P, Manfredi M, Poggio M, Stura I, Migliaretti G, Cossu M, Fiori C. Percutaneous Kidney Puncture with Three-dimensional Mixed-reality Hologram Guidance: From Preoperative Planning to Intraoperative Navigation. Eur Urol 2021;81:588-597. [PMID: 34799199] [DOI: 10.1016/j.eururo.2021.10.023]
Abstract
BACKGROUND: Despite technical and technological innovations, percutaneous puncture still represents the most challenging step when performing percutaneous nephrolithotomy. This maneuver is characterized by the steepest learning curve and a risk of injury to surrounding organs and kidney damage. OBJECTIVE: To evaluate the feasibility of three-dimensional mixed-reality (3D MR) holograms in establishing the access point and guiding the needle during percutaneous kidney puncture. DESIGN, SETTING, AND PARTICIPANTS: This prospective study included ten patients who underwent 3D MR endoscopic combined intrarenal surgery (ECIRS) for kidney stones from July 2019 to January 2020. A retrospective series of patients who underwent a standard procedure was selected for matched-pair analysis. SURGICAL PROCEDURE: For patients who underwent 3D MR ECIRS, holograms were overlapped on the real anatomy to guide the surgeon during percutaneous puncture. In the standard group, the procedures were guided by ultrasound and fluoroscopy only. MEASUREMENTS: Differences in preoperative and postoperative patient characteristics between the groups were tested using a χ2 test for categorical variables and a Kruskal-Wallis test for continuous variables. Results are reported as the median and interquartile range for continuous variables and as the frequency and percentage for categorical variables. RESULTS AND LIMITATIONS: Ten patients underwent 3D MR ECIRS. In all cases, the inferior calyx was punctured correctly, as planned using the overlapping hologram. The median puncture and radiation exposure times were 27 min and 120 s, respectively. No intraoperative or major postoperative complications occurred. Matched-pair analysis with the standard ECIRS group revealed a significantly shorter radiation exposure time for the 3D MR group (p < 0.001), even though the puncture time was longer in comparison to the standard group (p < 0.001). Finally, use of 3D MR led to a higher success rate for renal puncture at the first attempt (100% vs 50%; p = 0.032). The main limitations of the study are the small sample size and the manual overlapping of the rigid hologram models. CONCLUSIONS: Our experience demonstrates that 3D MR guidance for renal puncture is feasible and safe. The procedure proved to be effective, with the inferior calyx correctly punctured in all cases, and was associated with a low intraoperative radiation exposure time because of the MR guidance. PATIENT SUMMARY: Three-dimensional virtual models visualized as holograms and intraoperatively overlapped on the patient's real anatomy seem to be a valid new tool for guiding puncture of the kidney through the skin for minimally invasive treatment.
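The abstract reports continuous outcomes (e.g. puncture time) as median and interquartile range. A minimal sketch of that summary statistic; the sample values below are illustrative, not the trial's data:

```python
import numpy as np

def median_iqr(values):
    """Median and interquartile range (Q1, Q3) for a continuous variable."""
    q1, med, q3 = np.percentile(values, [25, 50, 75])
    return med, (q1, q3)

# Hypothetical puncture times in minutes for a ten-patient series
times = [20, 22, 25, 26, 27, 27, 28, 30, 33, 40]
med, (q1, q3) = median_iqr(times)  # 27.0, (25.25, 29.5)
```

Median/IQR is preferred over mean/SD here because small surgical series are rarely normally distributed.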
Affiliation(s)
- Francesco Porpiglia
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Enrico Checcucci
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy; Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Candiolo, Turin, Italy; Uro-technology and Social Media Working Group of the Young Academic Urologists of the European Association of Urology, Arnhem, The Netherlands.
- Daniele Amparore
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Dario Peretti
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Federico Piramide
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Sabrina De Cillis
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Alberto Piana
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Gabriel Niculescu
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Paolo Verri
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Matteo Manfredi
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Massimiliano Poggio
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Ilaria Stura
- Department of Public Health and Pediatric Sciences, School of Medicine, University of Turin, Turin, Italy
- Giuseppe Migliaretti
- Department of Public Health and Pediatric Sciences, School of Medicine, University of Turin, Turin, Italy
- Marco Cossu
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Cristian Fiori
- Division of Urology, Department of Oncology, School of Medicine, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
11
Checcucci E, Amparore D, Volpi G, Piramide F, De Cillis S, Piana A, Alessio P, Verri P, Piscitello S, Carbonaro B, Meziere J, Zamengo D, Tsaturyan A, Cacciamani G, Rivas JG, De Luca S, Manfredi M, Fiori C, Liatsikos E, Porpiglia F. Percutaneous puncture during PCNL: new perspective for the future with virtual imaging guidance. World J Urol 2021; 40:639-650. [PMID: 34468886] [DOI: 10.1007/s00345-021-03820-4]
Abstract
CONTEXT Large and complex renal stones are usually treated with percutaneous nephrolithotomy (PCNL). One of the crucial steps in this procedure is access to the collecting system via percutaneous puncture, a maneuver that carries a risk of injury to vessels and neighboring organs. In recent years, virtual image-guided surgery has become widespread, including in this specific field. OBJECTIVES To provide a short overview of the most recent evidence on current applications of virtual imaging guidance for PCNL. EVIDENCE ACQUISITION A non-systematic review of the literature was performed. Medline, PubMed, the Cochrane Database and Embase were screened for studies on the use of virtual imaging guidance for PCNL. EVIDENCE SYNTHESIS 3D virtual navigation technology for PCNL was first used in urology for surgical training and surgical planning; subsequently, surgical navigation with different modalities (from cognitive guidance to augmented reality and mixed reality) was explored. Finally, anecdotal preliminary experiences explored the potential application of artificial intelligence guidance for percutaneous puncture. CONCLUSION Many studies have demonstrated the potential benefit of virtual guidance for surgical simulation and training. In surgery, this tool has proved useful both for surgical planning, enabling better surgical performance, and for surgical navigation, with augmented reality and mixed reality systems assisting the surgeon in real time during the intervention.
Affiliation(s)
- E Checcucci
- Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Strada Provinciale 142, km 3,95, 10060, Candiolo, Turin, Italy.
- Uro-Technology and SoMe Working Group of the Young Academic Urologists (YAU) Working Party of the European Association of Urology (EAU), Arnhem, The Netherlands.
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy.
- D Amparore
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- G Volpi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- F Piramide
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S De Cillis
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Piana
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Alessio
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Verri
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S Piscitello
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- B Carbonaro
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- J Meziere
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- D Zamengo
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Tsaturyan
- Department of Urology, University Hospital of Patras, Patras, Greece
- G Cacciamani
- USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Juan Gomez Rivas
- Department of Urology, La Paz University Hospital, Madrid, Spain
- S De Luca
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- M Manfredi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- C Fiori
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- E Liatsikos
- Department of Urology, University Hospital of Patras, Patras, Greece
- Department of Urology, Medical University of Vienna, Vienna, Austria
- F Porpiglia
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
12
Evaluation of a Wearable AR Platform for Guiding Complex Craniotomies in Neurosurgery. Ann Biomed Eng 2021; 49:2590-2605. [PMID: 34297263] [DOI: 10.1007/s10439-021-02834-8]
Abstract
Today, neuronavigation is widely used in daily clinical routine to perform safe and efficient surgery. Augmented reality (AR) interfaces can provide anatomical models and preoperative planning contextually blended with the real surgical scenario, overcoming the limitations of traditional neuronavigators. This study aims to demonstrate the reliability of a new-concept AR headset in navigating complex craniotomies, and to prove the efficacy of a patient-specific, template-based methodology for fast, non-invasive, and fully automatic planning-to-patient registration. The AR platform's navigation performance was assessed with an in-vitro study whose goal was twofold: to measure the real-to-virtual 3D target visualization error (TVE), and to assess navigation accuracy through a user study involving 10 subjects tracing a complex craniotomy. The feasibility of the template-based registration was preliminarily tested on a volunteer. The TVE mean and standard deviation were 1.3 and 0.6 mm. The results of the user study, over 30 traced craniotomies, showed that 97% of the trajectory length was traced within an error margin of 1.5 mm, and 92% within a margin of 1 mm. The in-vivo test confirmed the feasibility and reliability of the patient-specific registration template. The proposed AR headset allows ergonomic and intuitive use of the preoperative plan, and it can represent a valid option to support neurosurgical tasks.
13
Ferraguti F, Minelli M, Farsoni S, Bazzani S, Bonfe M, Vandanjon A, Puliatti S, Bianchi G, Secchi C. Augmented Reality and Robotic-Assistance for Percutaneous Nephrolithotomy. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.3002216]
14
Extracorporeal shock-wave lithotripsy: is it still valid in the era of robotic endourology? Can it be more efficient? Curr Opin Urol 2020; 30:120-129. [PMID: 31990816] [DOI: 10.1097/mou.0000000000000732]
Abstract
PURPOSE OF REVIEW The aim of this article is to evaluate the current role of extracorporeal shock-wave lithotripsy (ESWL) in the management of urolithiasis in light of recent developments in flexible ureterorenoscopy (FURS) and percutaneous nephrolithotomy (PCNL). RECENT FINDINGS In Western Europe, there has been a significant shift in the techniques used for the treatment of renal stones, with an increase in FURS and a decrease in ESWL. The reasons for this include changing indications and technical improvements in the endourologic armamentarium, including robotic assistance. Most relevant is the introduction of digital reusable and single-use flexible ureterorenoscopes, whereas micro-PCNL has been abandoned. Some companies have stopped production of lithotripters, and novel ideas to improve the efficacy of shock waves have not been implemented in current systems. Promising shock-wave technologies include burst shock-wave lithotripsy (SWL) and high-frequency ESWL; their main advantage would be very fast pulverization of the stone, as shown in in-vitro models. SUMMARY The role of ESWL in the management of urolithiasis is decreasing, whereas FURS is constantly progressing. The quality and safety of intracorporeal shock-wave lithotripsy using a holmium:YAG laser under endoscopic control clearly outweigh the advantages of noninvasive ESWL. To regain ground, new technologies such as burst SWL or high-frequency ESWL have to be implemented in new systems.
15
Pérez-Pachón L, Poyade M, Lowe T, Gröning F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. Adv Exp Med Biol 2020; 1260:175-195. [PMID: 33211313] [DOI: 10.1007/978-3-030-47483-6_10]
Abstract
Augmented Reality (AR) applied to surgical guidance is gaining relevance in clinical practice. AR-based image overlay surgery (i.e. the accurate overlay of patient-specific virtual images onto the body surface) helps surgeons to transfer image data produced during the planning of the surgery (e.g. the correct resection margins of tissue flaps) to the operating room, thus increasing accuracy and reducing surgery times. We systematically reviewed 76 studies published between 2004 and August 2018 to explore which existing tracking and registration methods and technologies allow healthcare professionals and researchers to develop and implement these systems in-house. Most studies used non-invasive markers to automatically track a patient's position, as well as customised algorithms, tracking libraries or software development kits (SDKs) to compute the registration between patient-specific 3D models and the patient's body surface. Few studies combined the use of holographic headsets, SDKs and user-friendly game engines, and described portable and wearable systems that combine tracking, registration, hands-free navigation and direct visibility of the surgical site. Most accuracy tests included a low number of subjects and/or measurements and did not normally explore how these systems affect surgery times and success rates. We highlight the need for more procedure-specific experiments with a sufficient number of subjects and measurements and including data about surgical outcomes and patients' recovery. Validation of systems combining the use of holographic headsets, SDKs and game engines is especially interesting as this approach facilitates an easy development of mobile AR applications and thus the implementation of AR-based image overlay surgery in clinical practice.
Affiliation(s)
- Laura Pérez-Pachón
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK.
- Matthieu Poyade
- School of Simulation and Visualisation, Glasgow School of Art, Glasgow, UK
- Terry Lowe
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Head and Neck Oncology Unit, Aberdeen Royal Infirmary (NHS Grampian), Aberdeen, UK
- Flora Gröning
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
16
Off-Line Camera-Based Calibration for Optical See-Through Head-Mounted Displays. Appl Sci (Basel) 2019. [DOI: 10.3390/app10010193]
Abstract
In recent years, the entry into the market of self-contained optical see-through headsets with integrated multi-sensor capabilities has led the way to innovative, technology-driven augmented reality applications and has encouraged the adoption of these devices across highly challenging medical and industrial settings. Despite this, the display calibration process of consumer-level systems is still sub-optimal, particularly for applications that require high accuracy in the spatial alignment between computer-generated elements and the real-world scene. State-of-the-art manual and automated calibration procedures designed to estimate all the projection parameters are too complex for real application cases outside laboratory environments. This paper describes a fast off-line calibration procedure that only requires a camera to observe a planar pattern displayed on the see-through display. The camera that replaces the user’s eye must be placed within the eye-motion-box of the see-through display. The method exploits standard camera calibration and computer vision techniques to estimate the projection parameters of the display model for a generic position of the camera. At execution time, the projection parameters can then be refined through a planar homography that encapsulates the shift and scaling effect associated with the estimated relative translation from the old camera position to the current user’s eye position. Compared to classical SPAAM techniques, which still rely on the human element, and to other camera-based calibration procedures, the proposed technique is flexible and easy to replicate in both laboratory environments and real-world settings.
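The run-time refinement step above applies a planar homography to the display's projection. As an illustration of how such a homography is estimated from point correspondences, here is a minimal numpy sketch of the standard direct linear transform (DLT); the coordinates are hypothetical and this is not the paper's implementation.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 planar homography H with dst ~ H @ src via the DLT."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # h is the right singular vector for the smallest singular value of A
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]  # fix the arbitrary scale so H[2,2] == 1

def apply_h(h, pts):
    """Apply a homography to Nx2 points in inhomogeneous coordinates."""
    p = np.column_stack([pts, np.ones(len(pts))]) @ h.T
    return p[:, :2] / p[:, 2:3]

# Four hypothetical pattern/display correspondences
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
h_true = np.array([[1.2, 0.1, 5.0], [0.05, 0.9, -3.0], [0.001, 0.002, 1.0]])
dst = apply_h(h_true, src)
h_est = fit_homography(src, dst)
print(np.allclose(h_est, h_true, atol=1e-6))  # exact up to scale → True
```

Four non-collinear correspondences are the minimum; with more, the same SVD gives the least-squares solution (typically after Hartley normalization of the coordinates, omitted here).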
17
Chang F, Laguna B, Uribe J, Vu L, Zapala MA, Devincent C, Courtier J. Evaluating the Performance of Augmented Reality in Displaying Magnetic Resonance Imaging-Derived Three-Dimensional Holographic Models. J Med Imaging Radiat Sci 2019; 51:95-102. [PMID: 31862176] [DOI: 10.1016/j.jmir.2019.10.006]
Abstract
INTRODUCTION/BACKGROUND Establishing accuracy and precision of magnetic resonance (MR)-derived augmented reality (AR) models is critical before clinical utilization, particularly in preoperative planning. We investigate the performance of an AR application in representing and displaying MR-derived three-dimensional holographic models. METHODS Thirty gold standard (GS) measurements were obtained on a magnetic resonance imaging (MRI) phantom (six interfiducial distances and five configurations). Four MRI pulse sequences were obtained for each of the five configurations, and distances measured in Picture Archiving and Communication System (PACS). Digital imaging and communications in medicine files were translated into three-dimensional models and then loaded onto a novel AR platform. Measurements were also obtained with the software's AR caliper tool. Significant differences among the three groups (GS, PACS, and AR) were assessed with the Kruskal-Wallis test and nonsample median test. Accuracy analysis of GS vs. AR was performed. Precision (percent deviation) of the AR-based caliper tool was also assessed. RESULTS No statistically significant difference existed between AR and GS measurements (P = .6208). PACS demonstrated mean squared error (MSE) of 0.29%. AR digital caliper demonstrated an MSE of 0.3%. Three-dimensional T2 CUBE AR measurements using the platform's AR caliper tool demonstrated an MSE of 8.6%. Percent deviation of AR software caliper tool ranged between 1.9% and 3.9%. DISCUSSION AR demonstrated a high degree of accuracy in comparison to GS, comparable to PACS-based measurements. AR caliper tool demonstrated overall lower accuracy than with physical calipers, although with MSE <10% and greatest measured difference from GS measuring <5 mm. AR-based caliper demonstrated a high degree of precision. CONCLUSION There was no statistically significant difference between GS measurements and three-dimensional AR measurements in MRI phantom models.
Affiliation(s)
- Frank Chang
- UCSF Department of Radiology and Biomedical Imaging, Masters of Science in Biomedical Imaging Program, San Francisco, California, USA
- Ben Laguna
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
- Jesus Uribe
- UCSF School of Medicine, San Francisco, California, USA
- Lan Vu
- Division of Pediatric Surgery, UCSF Department of Surgery, San Francisco, California, USA
- Matthew A Zapala
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
- Craig Devincent
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
- Jesse Courtier
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA.
18
Zhou H, Zhang T, Jagadeesan J. Re-weighting and 1-Point RANSAC-Based PnP Solution to Handle Outliers. IEEE Trans Pattern Anal Mach Intell 2019; 41:3022-3033. [PMID: 31689179] [PMCID: PMC6857708] [DOI: 10.1109/tpami.2018.2871832]
Abstract
The ability to handle outliers is essential for performing the perspective-n-point (PnP) approach in practical applications, but conventional RANSAC+P3P or P4P methods have high time complexities. We propose a fast PnP solution named R1PPnP that handles outliers by utilizing a soft re-weighting mechanism and the 1-point RANSAC scheme. We first present a PnP algorithm, which serves as the core of R1PPnP, for solving the PnP problem in outlier-free situations. The core algorithm is an optimization process that minimizes an objective function built around a random control point. Then, to reduce the impact of outliers, we propose a reprojection-error-based re-weighting method and integrate it into the core algorithm. Finally, we employ the 1-point RANSAC scheme to try different control points. Experiments with synthetic and real-world data demonstrate that R1PPnP is accurate and faster than RANSAC+P3P or P4P methods, especially when the percentage of outliers is large. Moreover, comparisons with outlier-free synthetic data show that R1PPnP is among the most accurate and fastest PnP solutions, which usually serve as the final refinement step of RANSAC+P3P or P4P. Compared with REPPnP, the state-of-the-art PnP algorithm with an explicit outlier-handling mechanism, R1PPnP is slower but does not suffer from REPPnP's limitation on the percentage of outliers.
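The soft re-weighting idea above can be illustrated on a much simpler problem: iteratively re-weighted least squares for robust line fitting, where a point's weight shrinks as its residual grows, so outliers lose influence. This is only an analogy to the paper's mechanism, not its PnP solver; the data and threshold are invented.

```python
import numpy as np

def robust_line_fit(x, y, threshold=1.0, iters=20):
    """Fit y = a*x + b by iteratively re-weighted least squares.
    Weights are 1 for small residuals and decay as threshold/|r| beyond it,
    mirroring the soft re-weighting scheme described in the abstract."""
    w = np.ones_like(y)
    a = b = 0.0
    A = np.column_stack([x, np.ones_like(x)])
    for _ in range(iters):
        W = np.diag(w)
        a, b = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)  # weighted normal equations
        r = np.abs(y - (a * x + b))
        w = np.minimum(1.0, threshold / np.maximum(r, 1e-12))
    return a, b

x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[3] += 30.0   # inject gross outliers
y[7] -= 25.0
a, b = robust_line_fit(x, y)
print(a, b)    # stays close to the true (2, 1) despite the outliers
```

An ordinary least-squares fit on the same data would be pulled far off by the two corrupted points; the re-weighting caps each point's influence at roughly the threshold, much as the reprojection-error weights cap outlier influence in the pose estimate.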
Affiliation(s)
- Haoyin Zhou
- Surgical Planning Laboratory, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, 02115, USA
- Tao Zhang
- Department of Automation, School of Information Science and Technology, Tsinghua University, Beijing 100086, China
- Jayender Jagadeesan
- Surgical Planning Laboratory, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, 02115, USA
19
Artificial Intelligence in Interventional Radiology: A Literature Review and Future Perspectives. J Oncol 2019; 2019:6153041. [PMID: 31781215] [PMCID: PMC6874978] [DOI: 10.1155/2019/6153041]
Abstract
The term “artificial intelligence” (AI) includes computational algorithms that can perform tasks considered typical of human intelligence, with partial to complete autonomy, to produce new beneficial outputs from specific inputs. The development of AI is largely based on the introduction of artificial neural networks (ANN), which enabled the concepts of “computational learning models,” machine learning (ML) and deep learning (DL). AI applications appear promising for radiology, potentially improving lesion detection, segmentation, and interpretation, with recent applications also in interventional radiology (IR) practice, including the ability of AI to offer prognostic information to both patients and physicians about interventional oncology procedures. This article integrates evidence from the reported literature with experience-based perceptions to assist not only residents and fellows training in interventional radiology but also practicing colleagues approaching locoregional minimally invasive treatments.
20
Accuracy assessment for the co-registration between optical and VIVE head-mounted display tracking. Int J Comput Assist Radiol Surg 2019; 14:1207-1215. [PMID: 31069642] [DOI: 10.1007/s11548-019-01992-4]
Abstract
PURPOSE We report on the development and accuracy assessment of a hybrid tracking system that integrates optical spatial tracking into a video pass-through head-mounted display. METHODS The hybrid system uses a dual-tracked co-calibration apparatus to provide a co-registration between the origins of an optical dynamic reference frame and the VIVE Pro controller through a point-based registration. This registration provides the location of optically tracked tools with respect to the VIVE controller's origin and thus the VIVE's tracking system. RESULTS The positional accuracy was assessed using a CNC machine to collect a grid of points with 25 samples per location. The positional trueness and precision for the hybrid tracking system were [Formula: see text] and [Formula: see text], respectively. The rotational accuracy was assessed through inserting a stylus tracked by all three systems into a hemispherical phantom with cylindrical openings at known angles and collecting 25 samples per cylinder for each system. The rotational trueness and precision for the hybrid tracking system were [Formula: see text] and [Formula: see text], respectively. The difference in position and rotational trueness between the OTS and the hybrid tracking system was [Formula: see text] and [Formula: see text], respectively. CONCLUSIONS We developed a hybrid tracking system that allows the pose of optically tracked surgical instruments to be known within a first-person HMD visualization system, achieving submillimeter accuracy. This research validated the positional and rotational accuracy of the hybrid tracking system and subsequently the optical tracking and VIVE tracking systems. This work provides a method to determine the position of an optically tracked surgical tool with a surgically acceptable accuracy within a low-cost commercial-grade video pass-through HMD. 
The hybrid tracking system provides the foundation for the continued development of virtual reality or augmented virtuality surgical navigation systems for training or practicing surgical techniques.
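The co-registration described above maps points between the optical tracker's frame and the VIVE's frame. The classical closed-form solver for such point-based rigid registration is the Arun/Kabsch SVD method; below is a minimal numpy sketch on hypothetical fiducial coordinates, not the authors' implementation (which also involves the dual-tracked co-calibration apparatus).

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    via SVD of the cross-covariance (Arun/Kabsch method)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t

# Hypothetical fiducial points (mm) seen in the optical tracker's frame
src = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100], [50, 50, 50]], float)
theta = np.deg2rad(30.0)
r_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([10.0, -5.0, 2.0])
dst = src @ r_true.T + t_true                    # same points in the other frame
r_est, t_est = rigid_register(src, dst)
fre = np.linalg.norm(src @ r_est.T + t_est - dst, axis=1).mean()
print(np.allclose(r_est, r_true), fre)           # exact recovery on noise-free data
```

With real measurements the residual (the mean fiducial registration error here) is non-zero, and it is that residual that accuracy assessments of this kind characterize.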
21
Nguyen DD, Luo JW, Tailly T, Bhojani N. Percutaneous Nephrolithotomy Access: A Systematic Review of Intraoperative Assistive Technologies. J Endourol 2019; 33:358-368. [DOI: 10.1089/end.2019.0085]
Affiliation(s)
- Jack W. Luo
- Faculty of Medicine, McGill University, Montreal, Canada
- Thomas Tailly
- Urology Department, University Hospital Ghent, Ghent, Belgium
- Naeem Bhojani
- Division of Urology, University of Montreal Health Center (CHUM), Montreal, Canada
22
Parkhomenko E, O'Leary M, Safiullah S, Walia S, Owyong M, Lin C, James R, Okhunov Z, Patel RM, Kaler KS, Landman J, Clayman R. Pilot Assessment of Immersive Virtual Reality Renal Models as an Educational and Preoperative Planning Tool for Percutaneous Nephrolithotomy. J Endourol 2019; 33:283-288. [DOI: 10.1089/end.2018.0626]
Affiliation(s)
- Egor Parkhomenko
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Mitchell O'Leary
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Shoaib Safiullah
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Sartaaj Walia
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Michael Owyong
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Cyrus Lin
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Ryan James
- Department of Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington
- Zhamshid Okhunov
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Roshan M. Patel
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Kamaljot S. Kaler
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Department of Surgery, Section of Urology, University of Calgary, Calgary, Canada
- Jaime Landman
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
- Ralph Clayman
- Department of Urology, School of Medicine, University of California, Irvine, Orange, California
23
Bertolo R, Hung A, Porpiglia F, Bove P, Schleicher M, Dasgupta P. Systematic review of augmented reality in urological interventions: the evidences of an impact on surgical outcomes are yet to come. World J Urol 2019; 38:2167-2176. [PMID: 30826888] [DOI: 10.1007/s00345-019-02711-z]
Abstract
PURPOSE To perform a systematic literature review on the clinical impact of augmented reality (AR) for urological interventions. METHODS As of June 21, 2018, systematic literature review was performed via Medline, Embase and Cochrane databases in accordance with the PRISMA guidelines and registered at PROSPERO (CRD42018102194). Only full text articles in English were included, without time restrictions. Articles were considered if they reported on the use of AR during urological intervention and the impact on the surgical outcomes. The risk of bias and the quality of each study included were independently assessed using the standard Cochrane Collaboration risk of bias tool and the Risk Of Bias In Non-randomised Studies-of Interventions Tool (ROBINS-I). RESULTS 131 articles were identified. 102 remained after duplicate removal and were critically reviewed for evidence synthesis. 20 studies reporting on the outcomes of the use of AR during urological interventions in a clinical setting were considered. Given the mostly non-comparative design of the studies identified, the evidence synthesis was performed in a descriptive and narrative manner. Only one comparative study was found, with the remaining 19 items being single-arm observational studies. Based on the existing evidence, we are unable to state that AR improves the outcomes of urological interventions. The major limitation of AR-assisted surgery is inaccuracy in registration, translating into a poor navigation precision. CONCLUSIONS To date, there is limited evidence showing superior therapeutic benefits of AR-guided surgery when compared with the conventional surgical approach to the respective disease.
Affiliation(s)
- Riccardo Bertolo
- Glickman Urological and Kidney Institute, Cleveland Clinic, 2050 E 96th St, Q Building, Cleveland, OH, 44195, USA; Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy.
- Andrew Hung
- Center for Robotic Simulation and Education, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Francesco Porpiglia
- Division of Urology, Department of Oncology, University of Turin, San Luigi Hospital, Orbassano, Turin, Italy
- Pierluigi Bove
- Urology Department, "San Carlo di Nancy" Hospital, Rome, Italy
- Mary Schleicher
- Floyd D. Loop Alumni Library, Cleveland Clinic, Cleveland, OH, USA
24
Wetzl M, Weller M, Heiss R, Schrüfer E, Wuest W, Thierfelder C, Lerch D, Cavallaro A, Amarteifio P, Uder M, May MS. Mobile Workflow in Computed Tomography of the Chest. J Med Syst 2018; 43:14. [PMID: 30535865] [PMCID: PMC6290687] [DOI: 10.1007/s10916-018-1131-2]
Abstract
A CT system with a tablet as a mobile user interface and a wireless remote control for positioning and radiation release has recently been presented. Our aim was to evaluate the effects of a mobile CT examination workflow on the radiographers' performance compared with conventional examinations. A prototype of a radiation protection cabin was installed beside the gantry of a CT system. The CT system was equipped with a simplified user interface on a portable tablet and a mobile remote control. 98 patients with an indication for CT of the chest were randomly assigned to examination using the mobile devices (study group, n = 47) or the conventional stationary workflow on the console (reference group, n = 51). Three ceiling-mounted fisheye cameras were used for motion tracking of the radiographers, two in the examination room and one in the control room. Relative density of detection heat-maps and area counts were assessed using a dedicated software tool to quantify the radiographers' movements. The duration of each task of the examination was recorded manually with a stopwatch. In the reference group, 25% of the area counts were located inside the examination room, compared with 48% in the study group. The time spent in the same room as the patient increased from 3:06 min (29%) to 6:01 min (57%) with the mobile workflow (p < 0.05); of this, 0:59 min (9%) was spent in moderate separation, with maintained voice and visual contact, in the radiation protection cabin. Heat-maps showed an increase in the radiographers' working area, indicating greater freedom of movement. The total duration of the examination was slightly shorter in the study group, without statistical significance (median time: study 10:36, reference 10:50 min; p = 0.29). A mobile CT examination workflow transfers the radiographers' interaction with the scanner from the control room into the examination room. There, radiographers' freedom of movement is greater, without any trade-offs regarding examination duration.
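As an illustrative aside, the "area count" metric above (the share of tracked positions that fall inside a given room) can be sketched in a few lines. The room geometry, function name, and sample track below are hypothetical, not taken from the study's software:

```python
# Illustrative sketch: fraction of a radiographer's tracked floor positions
# that fall inside an axis-aligned room rectangle, analogous to the
# "area counts" derived from ceiling-camera heat-maps in the study.

def area_count_fraction(positions, room_bounds):
    """Return the fraction of (x, y) positions inside a rectangular room."""
    (xmin, ymin), (xmax, ymax) = room_bounds
    inside = sum(1 for x, y in positions
                 if xmin <= x <= xmax and ymin <= y <= ymax)
    return inside / len(positions)

# Hypothetical geometry: examination room spans x, y in [0, 5] metres.
exam_room = ((0.0, 0.0), (5.0, 5.0))
track = [(1.2, 3.4), (4.9, 0.5), (6.3, 2.0), (7.1, 4.4)]  # metres
print(area_count_fraction(track, exam_room))  # 0.5
```

A real implementation would operate on the camera-derived detection heat-maps rather than raw coordinates, but the proportion reported per room reduces to this kind of count.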
Affiliation(s)
- Matthias Wetzl
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Melanie Weller
- Siemens Healthcare GmbH, Siemensstr. 3, 91301, Forchheim, Germany
- Rafael Heiss
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Eleni Schrüfer
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Wolfgang Wuest
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Daniel Lerch
- Siemens Healthcare GmbH, Siemensstr. 3, 91301, Forchheim, Germany
- Alexander Cavallaro
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Imaging Science Institute, University Hospital Erlangen, Ulmenweg 18, 91054, Erlangen, Germany
- Patrick Amarteifio
- Imaging Science Institute, University Hospital Erlangen, Ulmenweg 18, 91054, Erlangen, Germany
- Siemens Healthcare GmbH, Henkestr. 127, 91052, Erlangen, Germany
- Michael Uder
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Imaging Science Institute, University Hospital Erlangen, Ulmenweg 18, 91054, Erlangen, Germany
- Matthias Stefan May
- Department of Radiology, University Hospital Erlangen, Maximiliansplatz 3, 91054, Erlangen, Germany
- Imaging Science Institute, University Hospital Erlangen, Ulmenweg 18, 91054, Erlangen, Germany

25

Viglialoro R, Esposito N, Condino S, Cutolo F, Guadagni S, Gesi M, Ferrari M, Ferrari V. Augmented Reality to Improve Surgical Simulation. Lessons Learned Towards the Design of a Hybrid Laparoscopic Simulator for Cholecystectomy. IEEE Trans Biomed Eng 2018; 66:2091-2104. [PMID: 30507490] [DOI: 10.1109/tbme.2018.2883816]
Abstract
Hybrid surgical simulators based on Augmented Reality (AR) combine the advantages of box trainers and Virtual Reality simulators. This paper reports the results of a long development stage of a hybrid simulator for laparoscopic cholecystectomy that integrates real and virtual components. We first outline the specifications of the AR simulator and then explain the implementation strategy, which is based on a careful selection of the simulated anatomical components and characterized by real-time tracking of both a target anatomy and the laparoscope. The former is tracked by means of an electromagnetic field generator, while the latter requires an additional camera for video tracking. The new system was evaluated in terms of AR visualization accuracy, realism and hardware robustness. The results show that the accuracy of the AR visualization is adequate for training purposes, and the qualitative evaluation confirms the robustness and realism of the simulator. The AR simulator satisfies all the initial specifications in terms of anatomical appearance, modularity, reusability, minimization of spare-parts cost, and the ability to record surgical errors and to track Calot's triangle and the laparoscope in real time. The proposed system could be an effective training tool for learning the identification and isolation of Calot's triangle in laparoscopic cholecystectomy. Moreover, the presented strategy could be applied to simulate other surgical procedures involving the identification and isolation of generic tubular structures, such as blood vessels, the biliary tree and nerves, which are not directly visible.
26

Akand M, Civcik L, Buyukaslan A, Altintas E, Kocer E, Koplay M, Erdogru T. Feasibility of a novel technique using 3-dimensional modeling and augmented reality for access during percutaneous nephrolithotomy in two different ex-vivo models. Int Urol Nephrol 2018; 51:17-25. [PMID: 30474783] [DOI: 10.1007/s11255-018-2037-0]
Abstract
PURPOSE We describe a novel technique that uses mathematical calculation software, 3-dimensional (3D) modeling and augmented reality (AR) technology for access during percutaneous nephrolithotomy (PCNL), and report our first preliminary results in two different ex-vivo models. METHODS Novel software was created to calculate the access point and angle from pre-operative computed tomography (CT) scans obtained in 50 patients. Two scans, 27 s and 10 min after injection of contrast agent, were taken in the prone PCNL position. Using DICOM objects, mathematical and software functions were developed to measure the distance of the stone from reference electrodes. Vectoral 3D modeling was performed to calculate the access point, direction angle and access angle. With specific programs and AR, the 3D model was placed virtually onto the real object, and the calculated access point and an access needle oriented according to the calculated direction and access angles were displayed virtually on the object on the tablet screen. RESULTS The system was tested twice on each of two models (a stone placed in a gel cushion, and a stone inserted in a bovine kidney placed inside a chicken), and the correct access point and angle were achieved every time. The accuracy of needle insertion was verified by feeling crepitation on the stone surface and by observing the tip of the needle touching the stone on a control CT scan. CONCLUSIONS This novel device, which uses software-based mathematical calculation, 3D modeling and AR, appears to ensure a correct access point and angle for PCNL. Further research is required to test its accuracy and safety in humans.
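To illustrate the kind of vectoral calculation the abstract describes, a minimal sketch follows. The coordinates, function name, and the choice of the vertical axis as the angle reference are assumptions for illustration, not the authors' actual software:

```python
import math

# Illustrative sketch: given a skin entry point and the stone position in a
# CT-derived coordinate frame (mm), compute the needle vector's length
# (puncture depth) and its angle against the vertical z-axis, as a vectoral
# 3D model of the access trajectory would.

def access_geometry(entry, stone):
    """Return (depth_mm, angle_deg) of the entry-to-stone needle vector."""
    v = [s - e for e, s in zip(entry, stone)]
    depth = math.sqrt(sum(c * c for c in v))
    # Angle between the needle vector and the vertical axis (0, 0, 1).
    angle = math.degrees(math.acos(v[2] / depth))
    return depth, angle

# Hypothetical coordinates in mm.
depth, angle = access_geometry(entry=(0.0, 0.0, 0.0), stone=(30.0, 0.0, 40.0))
print(round(depth, 1), round(angle, 1))  # 50.0 36.9
```

In the actual system these quantities would be derived from the two contrast-enhanced CT scans and overlaid on the live camera image of the tablet.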
Affiliation(s)
- Murat Akand
- School of Medicine, Department of Urology, Selcuk University, Konya, Turkey
- Selçuk Üniversitesi, Alaeddin Keykubat Kampüsü, Tıp Fakültesi Hastanesi, E-Blok, Kat:1, Üroloji Polikliniği, Selçuklu, 42075, Konya, Turkey
- Levent Civcik
- Higher School of Vocational and Technical Sciences, Department of Computer Technologies, Selcuk University, Konya, Turkey
- Emre Altintas
- School of Medicine, Department of Urology, Selcuk University, Konya, Turkey
- Erdinc Kocer
- Technical Education Faculty, Department of Electronic and Computer Education, Selcuk University, Konya, Turkey
- Mustafa Koplay
- School of Medicine, Department of Radiology, Selcuk University, Konya, Turkey
- Tibet Erdogru
- UroKlinik - Center of Excellence in Urology, Istanbul, Turkey

27

Klein JT, Rassweiler J, Rassweiler-Seyfried MC. Validation of a Novel Cost Effective Easy to Produce and Durable In Vitro Model for Kidney-Puncture and Percutaneous Nephrolitholapaxy-Simulation. J Endourol 2018; 32:871-876. [DOI: 10.1089/end.2017.0834]
Affiliation(s)
- Jan-Thorsten Klein
- Department of Urology and Pediatric Urology, Ulm University Medical Centre, Ulm, Germany
- Jens Rassweiler
- Department of Urology, SLK-Klinikum Heilbronn, University of Heidelberg, Heilbronn, Germany

28

Kenngott HG, Preukschas AA, Wagner M, Nickel F, Müller M, Bellemann N, Stock C, Fangerau M, Radeleff B, Kauczor HU, Meinzer HP, Maier-Hein L, Müller-Stich BP. Mobile, real-time, and point-of-care augmented reality is robust, accurate, and feasible: a prospective pilot study. Surg Endosc 2018; 32:2958-2967. [PMID: 29602988] [DOI: 10.1007/s00464-018-6151-y]
Abstract
BACKGROUND Augmented reality (AR) systems are currently being explored by a broad spectrum of industries, mainly for improving point-of-care access to data and images. In surgery especially, and particularly for timely decisions in emergency cases, fast and comprehensive access to images at the patient bedside is mandatory. Currently, imaging data are accessed at a distance from the patient both in time and space, i.e., at a specific workstation. Mobile technology and 3-dimensional (3D) visualization of radiological imaging data promise to overcome these restrictions by making bedside AR feasible. METHODS In this project, AR was realized in a surgical setting by fusing a 3D representation of structures of interest with live camera images on a tablet computer using marker-based registration. The intent of this study was a thorough evaluation of AR. Feasibility, robustness, and accuracy were thus evaluated consecutively in a phantom model and a porcine model. Additionally, feasibility was evaluated in one male volunteer. RESULTS In the phantom model (n = 10), AR visualization was feasible in 84% of the visualization space with high accuracy (mean reprojection error ± standard deviation (SD): 2.8 ± 2.7 mm; 95th percentile = 6.7 mm). In the porcine model (n = 5), AR visualization was feasible in 79%, also with high accuracy (mean reprojection error ± SD: 3.5 ± 3.0 mm; 95th percentile = 9.5 mm). Furthermore, AR proved feasible in a male volunteer. CONCLUSIONS Mobile, real-time, and point-of-care AR for clinical purposes proved feasible, robust, and accurate in the phantom, animal, and single-trial human models in this study. Consequently, a similar AR implementation is robust and accurate enough to be evaluated in clinical trials assessing accuracy and robustness in clinical reality, as well as integration into the clinical workflow. If these further studies prove successful, AR might revolutionize data access at the patient bedside.
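The accuracy figures above (mean ± SD and 95th percentile of the reprojection error) can be reproduced from raw per-point errors with a short sketch; the function name and sample error values below are hypothetical, not the study's data:

```python
import math
import statistics

# Illustrative sketch: summarise per-point reprojection errors the way the
# paper reports them, i.e. mean +/- sample SD and the 95th percentile
# (nearest-rank) of the error distribution.

def error_summary(errors_mm):
    """Return (mean, sample SD, nearest-rank 95th percentile) in mm."""
    mean = statistics.mean(errors_mm)
    sd = statistics.stdev(errors_mm)
    ranked = sorted(errors_mm)
    p95 = ranked[math.ceil(0.95 * len(ranked)) - 1]
    return mean, sd, p95

mean, sd, p95 = error_summary([1.0, 2.0, 3.0, 4.0, 10.0])
print(round(mean, 1), round(sd, 2), p95)  # 4.0 3.54 10.0
```

Each error value here would be the distance, in the image plane, between a projected virtual landmark and its real counterpart.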
Affiliation(s)
- Hannes Götz Kenngott
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany
- Anas Amin Preukschas
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany
- Martin Wagner
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany
- Felix Nickel
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany
- Michael Müller
- Division of Medical and Biological Informatics, German Cancer Research Center, Heidelberg, Germany
- Nadine Bellemann
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Heidelberg, Germany
- Christian Stock
- Institute for Medical Biometry and Informatics, Heidelberg University, Heidelberg, Germany
- Markus Fangerau
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Heidelberg, Germany
- Boris Radeleff
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Heidelberg, Germany
- Hans-Ulrich Kauczor
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Heidelberg, Germany
- Hans-Peter Meinzer
- Division of Medical and Biological Informatics, German Cancer Research Center, Heidelberg, Germany
- Lena Maier-Hein
- Division of Medical and Biological Informatics, German Cancer Research Center, Heidelberg, Germany
- Beat Peter Müller-Stich
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany

29

Léger É, Drouin S, Collins DL, Popa T, Kersten-Oertel M. Quantifying attention shifts in augmented reality image-guided neurosurgery. Healthc Technol Lett 2017; 4:188-192. [PMID: 29184663] [PMCID: PMC5683248] [DOI: 10.1049/htl.2017.0062]
Abstract
Image-guided surgery (IGS) has allowed for more minimally invasive procedures, leading to better patient outcomes, reduced risk of infection, less pain, shorter hospital stays and faster recoveries. One drawback that has emerged with IGS is that the surgeon must shift their attention from the patient to the monitor for guidance, yet both cognitive and motor tasks are negatively affected by attention shifts. Augmented reality (AR), which merges the real-world surgical scene with preoperative virtual patient images and plans, has been proposed as a solution to this drawback. In this work, we studied the impact of two different types of AR IGS set-ups (mobile AR and desktop AR) and of traditional navigation on attention shifts for the specific task of craniotomy planning. We found a significant difference between traditional navigation and the AR set-ups in the time taken to perform the task and in attention shifts, but no significant difference between the two AR set-ups. With mobile AR, however, users felt that the system was easier to use and that their performance was better. These results suggest that regardless of where the AR visualisation is shown to the surgeon, AR may reduce attention shifts, leading to more streamlined and focused procedures.
Affiliation(s)
- Étienne Léger
- Department of Computer Science and Software Engineering & Perform Centre, Concordia University, Montreal, Canada
- Simon Drouin
- McConnell Brain Imaging Centre, Montreal Neuro, McGill University, Montréal, Canada
- D. Louis Collins
- McConnell Brain Imaging Centre, Montreal Neuro, McGill University, Montréal, Canada
- Tiberiu Popa
- Department of Computer Science and Software Engineering & Perform Centre, Concordia University, Montreal, Canada
- Marta Kersten-Oertel
- Department of Computer Science and Software Engineering & Perform Centre, Concordia University, Montreal, Canada

30

Detmer FJ, Hettig J, Schindele D, Schostak M, Hansen C. Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review. IEEE Rev Biomed Eng 2017; 10:78-94. [PMID: 28885161] [DOI: 10.1109/rbme.2017.2749527]
Abstract
PURPOSE Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. METHODS A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems developed or evaluated solely for training purposes. RESULTS In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. The main challenges remaining for future research include the consideration of organ movement and deformation, human-factors issues, and the conduct of large clinical studies. CONCLUSION Augmented and virtual reality systems have the potential to improve the safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems that are already applied in clinical practice. Further research is required to cope with the current limitations of virtual and augmented reality assistance in clinical environments.
31

Recent Development of Augmented Reality in Surgery: A Review. J Healthc Eng 2017; 2017:4574172. [PMID: 29065604] [PMCID: PMC5585624] [DOI: 10.1155/2017/4574172]
Abstract
Introduction The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety and cost, and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. Methods We performed a review of the available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms "augmented reality" and "surgery." Results The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. By reading the abstracts, 91 relevant studies were chosen for inclusion, and 11 further references were gathered by cross-referencing, for a total of 102 studies included in this review. Conclusions The present literature suggests an increasing interest of surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality can be implemented in routine practice.
32

Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery. Robotics 2017. [DOI: 10.3390/robotics6020013]
33

Tailly T, Denstedt J. Innovations in percutaneous nephrolithotomy. Int J Surg 2016; 36:665-672. [DOI: 10.1016/j.ijsu.2016.11.007]
34

MITK-OpenIGTLink for combining open-source toolkits in real-time computer-assisted interventions. Int J Comput Assist Radiol Surg 2016; 12:351-361. [DOI: 10.1007/s11548-016-1488-y]
35

Mewes A, Hensen B, Wacker F, Hansen C. Touchless interaction with software in interventional radiology and surgery: a systematic literature review. Int J Comput Assist Radiol Surg 2016; 12:291-305. [PMID: 27647327] [DOI: 10.1007/s11548-016-1480-6]
Abstract
PURPOSE In this article, we systematically examine the current state of research on systems for touchless human-computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and underline promising technologies for future development. METHODS A systematic literature search was performed for scientific papers dealing with touchless control of medical software in the immediate environment of the operating room and interventional radiology suite. This includes methods for touchless gesture interaction, voice control and eye tracking. RESULTS Fifty-five research papers were identified and analyzed in detail, including 33 journal publications. Most of the identified literature (62%) deals with the control of medical image viewers. The others present interaction techniques for laparoscopic assistance (13%), telerobotic assistance and operating room control (9% each), and robotic operating room assistance and intraoperative registration (3.5% each). Only 8 systems (14.5%) were tested in a real clinical environment, and 7 (12.7%) were not evaluated at all. CONCLUSION In the last 10 years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to cope with the current limitations of touchless software interfaces in clinical environments. The main challenges for future research are the improvement and evaluation of the usability and intuitiveness of touchless human-computer interaction, full integration into productive systems, the reduction of necessary interaction steps, and further development of hands-free interaction.
Affiliation(s)
- André Mewes
- Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany
- Bennet Hensen
- Institute for Diagnostic and Interventional Radiology, Medical School Hanover, Hanover, Germany
- Frank Wacker
- Institute for Diagnostic and Interventional Radiology, Medical School Hanover, Hanover, Germany
- Christian Hansen
- Faculty of Computer Science, University of Magdeburg, Magdeburg, Germany

36

Robust and Accurate Algorithm for Wearable Stereoscopic Augmented Reality with Three Indistinguishable Markers. Electronics 2016. [DOI: 10.3390/electronics5030059]
37

Ghani KR, Andonian S, Bultitude M, Desai M, Giusti G, Okhunov Z, Preminger GM, de la Rosette J. Percutaneous Nephrolithotomy: Update, Trends, and Future Directions. Eur Urol 2016; 70:382-96. [DOI: 10.1016/j.eururo.2016.01.047]
38

Javali T, Pathade A, Nagaraj HK. A novel method of ensuring safe and accurate dilatation during percutaneous nephrolithotomy. Int Braz J Urol 2016; 41:1014-9. [PMID: 26689529] [PMCID: PMC4756980] [DOI: 10.1590/s1677-5538.ibju.2015.0007]
Abstract
Objective: To report our technique for guiding the guidewire into the ureter, enabling safe dilatation during PCNL. Materials and Methods: Cases in which the guidewire failed to pass into the ureter following successful puncture of the desired calyx were subjected to this technique. A second guidewire was passed through the outer sheath of a 9 Fr metallic dilator cannula, passed over the first guidewire. The cannula and outer sheath were removed, followed by percutaneous passage of a 6/7.5 Fr ureteroscope between the two guidewires, monitoring its progress on both the endoscopic and fluoroscopic monitors. Once the stone was visualized in the calyx, a guidewire was passed through the working channel and maneuvered past the stone into the pelvis and ureter under direct endoscopic vision. This was followed by routine tract dilatation. Results: This technique was employed in 85 of 675 PCNL cases carried out at our institute between January 2010 and June 2014. The mean time required for the technique, measured from the introduction of the ureteroscope until the successful passage of the guidewire into the ureter, was 95 seconds. There were no intraoperative or postoperative complications as a result of this technique. The guidewire could be successfully passed into the ureter in 82 of 85 cases. Conclusions: Use of a ureteroscope introduced percutaneously through the puncture site during PCNL is a safe and effective technique that helps maneuver the guidewire down into the ureter, subsequently enabling safe dilatation.
39

Loy Rodas N, Barrera F, Padoy N. See It With Your Own Eyes: Markerless Mobile Augmented Reality for Radiation Awareness in the Hybrid Room. IEEE Trans Biomed Eng 2016; 64:429-440. [PMID: 27164565] [DOI: 10.1109/tbme.2016.2560761]
Abstract
GOAL We present an approach for providing awareness of the harmful ionizing radiation generated during X-ray-guided minimally invasive procedures. METHODS A hand-held screen is used to display, directly in the user's view, information related to radiation safety in a mobile augmented reality (AR) manner. Instead of using markers, we propose a method to track the observer's viewpoint that relies on multiple RGB-D sensors and combines equipment detection for tracking initialization with a KinectFusion-like approach for frame-to-frame tracking. Two of the sensors are ceiling-mounted and a third is attached to the hand-held screen. The ceiling cameras keep an updated model of the room's layout, which is used to exploit context information and improve the relocalization procedure. RESULTS The system is evaluated on a multicamera dataset generated inside an operating room (OR) and containing ground-truth poses of the AR display. This dataset includes a wide variety of sequences with different scene configurations, occlusions, motion in the scene, and abrupt viewpoint changes. Qualitative results illustrating the different AR visualization modes for radiation awareness provided by the system are also presented. CONCLUSION Our approach gives the user a large AR visualization area and permits recovery from tracking failure caused by large motions or changes in the scene simply by looking at a piece of equipment. SIGNIFICANCE The system enables the user to see the 3-D propagation of radiation, the medical staff's exposure, and/or the doses deposited on the patient's surface as seen through their own eyes.
40

Simpfendörfer T, Hatiboglu G, Hadaschik BA, Wild E, Maier-Hein L, Rassweiler MC, Rassweiler J, Hohenfellner M, Teber D. [Navigation in urological surgery: Possibilities and limits of current techniques]. Urologe A 2016; 54:709-15. [PMID: 25572970] [DOI: 10.1007/s00120-014-3709-8]
Abstract
Surgical navigation describes the concept of real-time processing and presentation of preoperative and intraoperative data from different sources in order to provide surgeons with additional cognitive support during the operation. Imaging methods such as 3D ultrasound, magnetic resonance imaging (MRI) and computed tomography (CT), together with data from optical, electromagnetic or mechanical tracking methods, are used. The resulting information is presented by the navigation system using visual methods, mostly virtual reality or augmented reality visualization. Different guidance systems have been introduced for various disciplines, mostly operating on rigid structures (bone, brain). For soft-tissue navigation, motion compensation and deformation detection are necessary; therefore, marker-based tracking methods are used in several urological application examples. However, the systems are often still under development and have not yet arrived in the clinical routine.
Affiliation(s)
- T Simpfendörfer
- Urologische Universitätsklinik Heidelberg, Im Neuenheimer Feld 110, 69120, Heidelberg, Germany

41

New simple image overlay system using a tablet PC for pinpoint identification of the appropriate site for anastomosis in peripheral arterial reconstruction. Surg Today 2016; 46:1387-1393. [PMID: 26988854] [DOI: 10.1007/s00595-016-1326-4]
Abstract
PURPOSE To evaluate the accuracy and utility of a new image overlay system using a tablet PC for patients undergoing peripheral arterial reconstruction. METHODS Eleven limbs treated with distal bypass surgery were studied. Three-dimensional images obtained by processing a preoperative contrast-enhanced computed tomography scan were superimposed onto the back-camera images of a tablet PC. We used this system to pinpoint a planned distal anastomotic site preoperatively and to make a precise incision directly above it during surgery. We used a branch artery near the distal anastomotic site as a reference point and the accuracy of the system was validated by comparing its results with the intraoperative findings. The precision of the system was also compared with that of a preoperative ultrasonographic examination. RESULTS Both the image overlay system and ultrasonography (US) accurately identified the target branch artery in all except one limb. In that limb, which had a very small reference branch artery, preoperative US wrongly identified another branch, whereas the image overlay system located the target branch with an error of 10 mm. CONCLUSIONS Our image overlay system was easy to use and allowed us to precisely identify a target artery preoperatively. Therefore, this system could be helpful for pinpointing the most accurate incision site during surgery.
42

43
Mobasheri MH, Johnston M, Syed UM, King D, Darzi A. The uses of smartphones and tablet devices in surgery: a systematic review of the literature. Surgery 2015; 158:1352-71. [DOI: 10.1016/j.surg.2015.03.029]
44
Ferrari V, Viglialoro RM, Nicoli P, Cutolo F, Condino S, Carbone M, Siesto M, Ferrari M. Augmented reality visualization of deformable tubular structures for surgical simulation. Int J Med Robot 2015; 12:231-40. [PMID: 26149832] [DOI: 10.1002/rcs.1681]
Abstract
BACKGROUND Surgical simulation based on augmented reality (AR), mixing the benefits of physical and virtual simulation, represents a step forward in surgical training. However, available systems are unable to update the virtual anatomy following deformations impressed on actual anatomy. METHODS A proof-of-concept solution is described providing AR visualization of hidden deformable tubular structures using nitinol tubes sensorized with electromagnetic sensors. This system was tested in vitro on a setup comprised of sensorized cystic, left and right hepatic, and proper hepatic arteries. In the trial session, the surgeon deformed the tubular structures with surgical forceps in 10 positions. RESULTS The mean, standard deviation, and maximum misalignment between virtual and real arteries were 0.35, 0.22, and 0.99 mm, respectively. CONCLUSION The alignment accuracy obtained demonstrates the feasibility of the approach, which can be adopted in advanced AR simulations, in particular as an aid to the identification and isolation of tubular structures. Copyright © 2015 John Wiley & Sons, Ltd.
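The reported accuracy figures are simple statistics over per-landmark distances between corresponding points on the virtual and real structures. A hedged sketch of that evaluation; the landmark coordinates below are made up for illustration:

```python
import numpy as np

def misalignment_stats(virtual_pts, real_pts):
    """Euclidean distance between corresponding virtual and real landmarks
    (N x 3 arrays, same units); returns (mean, std, max)."""
    d = np.linalg.norm(np.asarray(virtual_pts) - np.asarray(real_pts), axis=1)
    return float(d.mean()), float(d.std()), float(d.max())

# Three landmark pairs, offset by 0.3, 0.4 and 0.5 mm along one axis each.
virtual = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
real = virtual + np.array([[0.3, 0.0, 0.0], [0.0, 0.4, 0.0], [0.0, 0.0, 0.5]])
mean_mm, std_mm, max_mm = misalignment_stats(virtual, real)
```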
Affiliation(s)
- Vincenzo Ferrari
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy; Information Engineering Department, University of Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, Pisa, Italy
- Rosanna Maria Viglialoro
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Paola Nicoli
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Fabrizio Cutolo
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Sara Condino
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Marina Carbone
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Mentore Siesto
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
- Mauro Ferrari
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, Pisa, Italy

45
Kenngott HG, Wagner M, Nickel F, Wekerle AL, Preukschas A, Apitz M, Schulte T, Rempel R, Mietkowski P, Wagner F, Termer A, Müller-Stich BP. Computer-assisted abdominal surgery: new technologies. Langenbecks Arch Surg 2015; 400:273-81. [PMID: 25701196] [DOI: 10.1007/s00423-015-1289-8]
Abstract
BACKGROUND Computer-assisted surgery is a wide field of technologies with the potential to enable the surgeon to improve efficiency and efficacy of diagnosis, treatment, and clinical management. PURPOSE This review provides an overview of the most important new technologies and their applications. METHODS A MEDLINE database search was performed revealing a total of 1702 references. All references were considered for information on six main topics, namely image guidance and navigation, robot-assisted surgery, human-machine interface, surgical processes and clinical pathways, computer-assisted surgical training, and clinical decision support. Further references were obtained through cross-referencing the bibliography cited in each work. Based on their respective field of expertise, the authors chose 64 publications relevant for the purpose of this review. CONCLUSION Computer-assisted systems are increasingly used not only in experimental studies but also in clinical studies. Although computer-assisted abdominal surgery is still in its infancy, the number of studies is constantly increasing, and clinical studies start showing the benefits of computers used not only as tools of documentation and accounting but also for directly assisting surgeons during diagnosis and treatment of patients. Further developments in the field of clinical decision support even have the potential of causing a paradigm shift in how patients are diagnosed and treated.
Affiliation(s)
- H G Kenngott
- Department of General, Abdominal and Transplant Surgery, Ruprecht-Karls-University, Heidelberg, Germany

46
Mobile markerless augmented reality and its application in forensic medicine. Int J Comput Assist Radiol Surg 2014; 10:573-86. [PMID: 25149272] [DOI: 10.1007/s11548-014-1106-9]
Abstract
PURPOSE During autopsy, forensic pathologists today mostly rely on visible indication, tactile perception and experience to determine the cause of death. Although computed tomography (CT) data is often available for the bodies under examination, these data are rarely used due to the lack of radiological workstations in the pathological suite. The data may prevent the forensic pathologist from damaging evidence by allowing him to associate, for example, external wounds to internal injuries. To facilitate this, we propose a new multimodal approach for intuitive visualization of forensic data and evaluate its feasibility. METHODS A range camera is mounted on a tablet computer and positioned in a way such that the camera simultaneously captures depth and color information of the body. A server estimates the camera pose based on surface registration of CT and depth data to allow for augmented reality visualization of the internal anatomy directly on the tablet. Additionally, projection of color information onto the CT surface is implemented. RESULTS We validated the system in a postmortem pilot study using fiducials attached to the skin for quantification of a mean target registration error of [Formula: see text] mm. CONCLUSIONS The system is mobile, markerless, intuitive and real-time capable with sufficient accuracy. It can support the forensic pathologist during autopsy with augmented reality and textured surfaces. Furthermore, the system enables multimodal documentation for presentation in court. Despite its preliminary prototype status, it has high potential due to its low price and simplicity.
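The fiducial-based validation reported here boils down to two generic steps: estimate a rigid transform aligning the two coordinate frames, then measure how far mapped target points land from their true positions (the target registration error). A sketch of that idea using the standard Kabsch least-squares method on synthetic points, not the authors' surface-registration pipeline:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch algorithm): returns (R, t)
    such that dst ~ src @ R.T + t, given matched N x 3 point sets."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def target_registration_error(R, t, targets_src, targets_dst):
    """Mean distance between registered targets and their true positions."""
    return float(np.linalg.norm(targets_src @ R.T + t - targets_dst, axis=1).mean())

# Synthetic check: fiducials related by a 90-degree rotation plus a shift.
rng = np.random.default_rng(0)
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
shift = np.array([1.0, 2.0, 3.0])
fid_ct = rng.random((6, 3))
fid_cam = fid_ct @ Rz.T + shift
R, t = rigid_register(fid_ct, fid_cam)
targets = rng.random((4, 3))
tre = target_registration_error(R, t, targets, targets @ Rz.T + shift)
```

With exact (noise-free) correspondences the recovered transform is exact and the TRE is numerically zero; real fiducial measurements carry noise, which is why the study reports a nonzero mean error.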
47
Affiliation(s)
- Bartosz Dybowski
- Department of Urology, Medical University of Warsaw, Warsaw, Poland
48
Ungi T, Beiko D, Fuoco M, King F, Holden MS, Fichtinger G, Siemens DR. Tracked ultrasonography snapshots enhance needle guidance for percutaneous renal access: a pilot study. J Endourol 2014; 28:1040-5. [PMID: 24745550] [DOI: 10.1089/end.2014.0011]
Abstract
BACKGROUND AND PURPOSE Although ultrasonography-guided percutaneous nephrostomy is relatively safe, a number of factors make it challenging for inexperienced operators. A computerized needle navigation technique using tracked ultrasonography snapshots was investigated to determine whether performance of percutaneous nephrostomy by inexperienced users could be improved. METHODS Ten operators performed the procedure on a phantom model with alternating needle guidance between conventional ultrasonography and tracked ultrasonography snapshots. The needle was reinserted until fluid backflow confirmed calyceal access. Needle trajectories were recorded using the real time needle navigation system for offline evaluation of operator performance. Recorded needle trajectories were used to measure needle motion path length inside the phantom tissue, number of reinsertions, total procedure time, and needle insertion time as end points of this study. RESULTS Needle path length measured inside the phantom tissue was significantly lower with ultrasonography snapshots guidance (295.0±23.1 mm, average±standard error of the mean) compared with control procedures (977.9±144.4 mm, P<0.01). This was associated with a significantly lower number of needle insertion attempts with ultrasonography snapshots (average 1.27±0.10 vs 2.83±0.31, P<0.01). The total procedure time and the needle insertion time were also significantly lower with ultrasonography snapshots guidance. CONCLUSION Tracked ultrasonography snapshots appear to improve the performance of percutaneous nephrostomy in these preliminary investigations, justifying further validation studies. The presented navigation system is reproducible because of commercially available hardware and open-source software components, facilitating its potential role in clinical practice.
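The path-length end point is straightforward to compute from a tracked trajectory: sum the Euclidean distances between consecutive tip positions, so that back-and-forth motion counts in full. A small illustrative sketch with synthetic coordinates, not study data:

```python
import numpy as np

def path_length(trajectory):
    """Total length of a recorded needle-tip trajectory (N x 3 positions):
    sum of Euclidean distances between consecutive samples."""
    return float(np.linalg.norm(np.diff(trajectory, axis=0), axis=1).sum())

# A tip that advances 10 mm, backs up 5 mm, then advances 5 mm covers
# 20 mm of motion even though its net displacement is only 10 mm.
traj_mm = np.array([[0.0, 0.0,  0.0],
                    [0.0, 0.0, 10.0],
                    [0.0, 0.0,  5.0],
                    [0.0, 0.0, 10.0]])
length_mm = path_length(traj_mm)
```

This is why path length discriminates between guidance methods better than insertion depth alone: hesitation and reinsertion inflate it even when the final needle position is identical.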
Affiliation(s)
- Tamas Ungi
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, Ontario, Canada

49
Rassweiler J, Rassweiler MC, Frede T, Alken P. Extracorporeal shock wave lithotripsy: an opinion on its future. Indian J Urol 2014; 30:73-9. [PMID: 24497687] [PMCID: PMC3897058] [DOI: 10.4103/0970-1591.124211]
Abstract
The development of miniaturized nephroscopes, which allow one-stage stone clearance with minimal morbidity, has brought the role of shock wave lithotripsy (SWL) in stone management into question. Design innovations in SWL machines over the last decade have attempted to address this problem. We reviewed the recent literature on SWL using a MEDLINE/PubMed search. For commenting on the future of SWL, we took the subjective opinions of two senior urologists, one mid-level expert, and an upcoming junior fellow. There have been a number of recent changes in lithotripter design and technique, including the use of multiple-focus machines and improved coupling designs, as well as better localization and real-time monitoring. The main goal of stone treatment today seems to be to clear the stone in one session rather than treating the patient multiple times non-invasively. Stone treatment in the future will be individualized by genetic screening of stone formers, with improved SWL devices reserved for small stones. However, there is still no consensus about the design of the ideal lithotripter. Innovative concepts such as emergency SWL for ureteric stones may be implemented in clinical routine.
Affiliation(s)
- Thomas Frede
- Department of Urology, Helios Kliniken Müllheim, Germany
- Peter Alken
- Department of Urology, Medical School of Mannheim, University of Heidelberg, Germany

50