1
Arapi V, Hardt-Stremayr A, Weiss S, Steinbrener J. Bridging the simulation-to-real gap for AI-based needle and target detection in robot-assisted ultrasound-guided interventions. Eur Radiol Exp 2023; 7:30. [PMID: 37332035] [DOI: 10.1186/s41747-023-00344-x]
Abstract
BACKGROUND: Artificial intelligence (AI)-powered, robot-assisted, ultrasound (US)-guided interventional radiology has the potential to increase the efficacy and cost-efficiency of interventional procedures while improving postsurgical outcomes and reducing the burden on medical personnel.
METHODS: To overcome the lack of clinical data available for training state-of-the-art AI models, we propose a novel approach for generating synthetic ultrasound data from real, clinical, preoperative three-dimensional (3D) data of different imaging modalities. With the synthetic data, we trained a deep learning-based detection algorithm for the localization of the needle tip and target anatomy in US images. We validated our models on real, in vitro US data.
RESULTS: The resulting models generalize well to unseen synthetic data and experimental in vitro data, making the proposed approach a promising method for creating AI-based models for needle and target detection in minimally invasive US-guided procedures. Moreover, we show that, after a one-time calibration of the US and robot coordinate frames, our tracking algorithm can be used to accurately fine-position the robot within reach of the target based on 2D US images alone.
CONCLUSIONS: The proposed data generation approach is sufficient to bridge the simulation-to-real gap and has the potential to overcome data paucity challenges in interventional radiology. The proposed AI-based detection algorithm shows very promising results in terms of accuracy and frame rate.
RELEVANCE STATEMENT: This approach can facilitate the development of next-generation AI algorithms for patient anatomy detection and needle tracking in US and their application to robotics.
KEY POINTS:
• AI-based methods show promise for needle and target detection in US-guided interventions.
• Publicly available, annotated datasets for training AI models are limited.
• Synthetic, clinical-like US data can be generated from magnetic resonance or computed tomography data.
• Models trained with synthetic US data generalize well to real in vitro US data.
• Target detection with an AI model can be used for fine positioning of the robot.
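The fine-positioning result above depends on a one-time calibration between the US image frame and the robot frame. A minimal sketch of how such a calibrated mapping could look, assuming a known 4x4 homogeneous transform and a known pixel spacing (the function name and all numeric values here are illustrative, not taken from the paper):

```python
import numpy as np

def pixel_to_robot(u, v, spacing_mm, T_robot_us):
    """Map a 2D US pixel (u, v) to robot-frame coordinates.

    spacing_mm: (sx, sy), physical size of one pixel in mm.
    T_robot_us: 4x4 homogeneous transform from the US image frame
                to the robot base frame (from a one-time calibration).
    """
    # Pixel -> metric point in the US image plane (z = 0).
    p_us = np.array([u * spacing_mm[0], v * spacing_mm[1], 0.0, 1.0])
    p_robot = T_robot_us @ p_us
    return p_robot[:3]

# Illustrative calibration: US image plane translated 100 mm along robot x.
T = np.eye(4)
T[0, 3] = 100.0
tip_robot = pixel_to_robot(50, 20, (0.2, 0.2), T)  # x=110, y=4, z=0 (mm)
```

Once the transform is fixed, every detected tip or target pixel yields a robot-frame goal directly, which is what enables fine positioning from 2D US images alone.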
Affiliation(s)
- Visar Arapi
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria.
- Alexander Hardt-Stremayr
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Stephan Weiss
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Jan Steinbrener
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
2
Yan W, Ding Q, Chen J, Yan K, Tang RSY, Cheng SS. Learning-based needle tip tracking in 2D ultrasound by fusing visual tracking and motion prediction. Med Image Anal 2023; 88:102847. [PMID: 37307759] [DOI: 10.1016/j.media.2023.102847]
Abstract
Visual trackers are the most commonly adopted approach for needle tip tracking in ultrasound (US)-based procedures. However, they often perform unsatisfactorily in biological tissues due to significant background noise and anatomical occlusion. This paper presents a learning-based needle tip tracking system that consists of not only a visual tracking module but also a motion prediction module. In the visual tracking module, two sets of masks are designed to improve the tracker's discriminability, and a template update submodule keeps up to date with the needle tip's current appearance. In the motion prediction module, a Transformer network-based prediction architecture estimates the target's current position from its historical position data, to tackle the target's temporary disappearance. A data fusion module then integrates the results from the visual tracking and motion prediction modules to provide robust and accurate tracking results. Our proposed tracking system showed distinct improvement over other state-of-the-art trackers in motorized needle insertion experiments in both gelatin phantom and biological tissue environments (e.g. 78% against <60% tracking success rate in the most challenging "in-plane-static" scenario of the tissue experiments). Its robustness was also verified in manual needle insertion experiments under varying needle velocities and directions, with occasional temporary needle tip disappearance; its tracking success rate was >18% higher than that of the second-best-performing tracking system. The proposed tracking system, with its computational efficiency, tracking robustness, and tracking accuracy, will lead to safer targeting during existing clinical practice of US-guided needle operations and can potentially be integrated into a tissue biopsy robotic system.
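The paper's fusion module is learned; as a rough illustration of the underlying idea, here is a minimal hand-written sketch that blends a visual-tracker position estimate with a motion-model prediction and falls back to the prediction when the tracker's confidence is low (the function name, the threshold, and the blending rule are assumptions for illustration only):

```python
import numpy as np

def fuse_estimates(p_visual, conf_visual, p_predicted, conf_min=0.5):
    """Fuse a visual-tracker position with a motion-model prediction.

    If the tracker's confidence is at or above conf_min, blend the two
    estimates, weighting the visual one by its confidence; otherwise
    rely on the motion prediction alone (e.g. the tip is occluded or
    has temporarily left the image plane).
    """
    p_visual = np.asarray(p_visual, dtype=float)
    p_predicted = np.asarray(p_predicted, dtype=float)
    if conf_visual >= conf_min:
        w = conf_visual
        return w * p_visual + (1.0 - w) * p_predicted
    # Tracker unreliable: use the motion model only.
    return p_predicted
```

The design point is the same as in the paper: the motion model carries the track through frames where the appearance-based tracker has nothing reliable to lock onto.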
Affiliation(s)
- Wanquan Yan
- Department of Mechanical and Automation Engineering and T Stone Robotics Institute, The Chinese University of Hong Kong, Hong Kong
- Qingpeng Ding
- Department of Mechanical and Automation Engineering and T Stone Robotics Institute, The Chinese University of Hong Kong, Hong Kong
- Jianghua Chen
- Department of Mechanical and Automation Engineering and T Stone Robotics Institute, The Chinese University of Hong Kong, Hong Kong
- Kim Yan
- Department of Mechanical and Automation Engineering and T Stone Robotics Institute, The Chinese University of Hong Kong, Hong Kong
- Raymond Shing-Yan Tang
- Department of Medicine and Therapeutics and Institute of Digestive Disease, The Chinese University of Hong Kong, Hong Kong
- Shing Shin Cheng
- Department of Mechanical and Automation Engineering and T Stone Robotics Institute, The Chinese University of Hong Kong, Hong Kong; Institute of Medical Intelligence and XR, Multi-scale Medical Robotics Center, and Shun Hing Institute of Advanced Engineering, The Chinese University of Hong Kong, Hong Kong
3
Checcucci E, Amparore D, Volpi G, Piramide F, De Cillis S, Piana A, Alessio P, Verri P, Piscitello S, Carbonaro B, Meziere J, Zamengo D, Tsaturyan A, Cacciamani G, Rivas JG, De Luca S, Manfredi M, Fiori C, Liatsikos E, Porpiglia F. Percutaneous puncture during PCNL: new perspective for the future with virtual imaging guidance. World J Urol 2021; 40:639-650. [PMID: 34468886] [DOI: 10.1007/s00345-021-03820-4]
Abstract
CONTEXT: Large and complex renal stones are usually treated with percutaneous nephrolithotomy (PCNL). One of the crucial steps in this procedure is access to the collecting system via percutaneous puncture, a maneuver that carries a risk of injury to vascular structures and neighboring organs. In recent years, virtual image-guided surgery has gained wide diffusion in this field.
OBJECTIVES: To provide a short overview of the most recent evidence on current applications of virtual imaging guidance for PCNL.
EVIDENCE ACQUISITION: A non-systematic review of the literature was performed. Medline, PubMed, the Cochrane Database, and Embase were screened for studies on the use of virtual imaging guidance for PCNL.
EVIDENCE SYNTHESIS: 3D virtual navigation technology for PCNL was first used in urology for surgical training and surgical planning; subsequently, surgical navigation with different modalities (from cognitive guidance to augmented reality and mixed reality) has been explored. Finally, anecdotal preliminary experiences have explored the potential of artificial intelligence guidance for percutaneous puncture.
CONCLUSION: Many experiences have demonstrated the potential benefit of virtual guidance for surgical simulation and training. In surgery itself, the tool has proved useful both for surgical planning, enabling better surgical performance, and for surgical navigation, with augmented reality and mixed reality systems assisting the surgeon in real time during the intervention.
Affiliation(s)
- E Checcucci
- Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Strada Provinciale 142, km 3,95, 10060, Candiolo, Turin, Italy.
- Uro-Technology and SoMe Working Group of the Young Academic Urologists (YAU) Working Party of the European Association of Urology (EAU), Arnhem, The Netherlands.
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy.
- D Amparore
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- G Volpi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- F Piramide
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S De Cillis
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Piana
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Alessio
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Verri
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S Piscitello
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- B Carbonaro
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- J Meziere
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- D Zamengo
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Tsaturyan
- Department of Urology, University Hospital of Patras, Patras, Greece
- G Cacciamani
- USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Juan Gomez Rivas
- Department of Urology, La Paz University Hospital, Madrid, Spain
- S De Luca
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- M Manfredi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- C Fiori
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- E Liatsikos
- Department of Urology, University Hospital of Patras, Patras, Greece
- Department of Urology, Medical University of Vienna, Vienna, Austria
- F Porpiglia
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
4
Time-aware deep neural networks for needle tip localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2021; 16:819-827. [PMID: 33840037] [DOI: 10.1007/s11548-021-02361-w]
Abstract
PURPOSE: Accurate placement of the needle is critical in interventions such as biopsies and regional anesthesia, during which incorrect needle insertion can lead to procedure failure and complications. Ultrasound guidance is therefore widely used to improve needle placement accuracy. However, at steep and deep insertions, the visibility of the needle is lost. Computational methods for automatic needle tip localization could improve the clinical success rate in these scenarios.
METHODS: We propose a novel algorithm for needle tip localization during challenging ultrasound-guided insertions when the shaft may be invisible and the tip has low intensity. There are two key steps in our approach. First, we enhance the needle tip features in consecutive ultrasound frames using a detection scheme that recognizes subtle intensity variations caused by needle tip movement. We then employ a hybrid deep neural network comprising a convolutional neural network and long short-term memory recurrent units. The input to the network is a sequence of fused enhanced frames together with the corresponding original B-mode frames, and this spatiotemporal information is used to predict the needle tip location.
RESULTS: We evaluate our approach on an ex vivo dataset collected with in-plane and out-of-plane insertion of 17G and 22G needles in bovine, porcine, and chicken tissue, acquired using two different ultrasound systems. We train the model with 5000 frames from 42 video sequences. Evaluation on 600 frames from 30 sequences yields a tip localization error of [Formula: see text] mm and an overall inference time of 0.064 s (15 fps). Comparison against prior art on challenging datasets reveals a 30% improvement in tip localization accuracy.
CONCLUSION: The proposed method automatically models the temporal dynamics associated with needle tip motion and is more accurate than state-of-the-art methods. It therefore has the potential to improve needle tip localization in challenging ultrasound-guided interventions.
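The enhancement step described above keys on subtle intensity changes between consecutive frames. A much-simplified sketch of that idea using plain frame differencing (the paper's actual detection scheme is more elaborate; the function name and threshold are illustrative):

```python
import numpy as np

def enhance_tip_motion(prev_frame, curr_frame, thresh=10):
    """Highlight pixels whose intensity changed between consecutive
    B-mode frames, as a crude proxy for needle-tip motion.

    Returns a map that is zero where the frames agree and keeps the
    current intensity where they differ by more than `thresh`.
    """
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(curr_frame.astype(np.int32) - prev_frame.astype(np.int32))
    mask = diff > thresh
    return np.where(mask, curr_frame, 0).astype(curr_frame.dtype)
```

In the paper, maps of this kind are fused with the original B-mode frames and fed to the CNN+LSTM network, so that the network sees both where the tissue is and where something just moved.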
5
Beigi P, Salcudean SE, Ng GC, Rohling R. Enhancement of needle visualization and localization in ultrasound. Int J Comput Assist Radiol Surg 2020; 16:169-178. [PMID: 32995981] [DOI: 10.1007/s11548-020-02227-7]
Abstract
PURPOSE: This scoping review covers needle visualization and localization techniques in ultrasound, where localization-based approaches mostly aim to compute the needle shaft (and tip) location while potentially enhancing its visibility as well.
METHODS: A literature review of state-of-the-art techniques was conducted; the techniques can be divided into five categories: (1) signal- and image-processing-based techniques to augment the needle, (2) modifications to the needle and insertion to help with needle-transducer alignment and visibility, (3) changes to ultrasound image formation, (4) motion-based analysis, and (5) machine learning.
RESULTS: Advantages, limitations, and challenges of representative examples in each category are discussed. Evaluation techniques used in ex vivo, phantom, and in vivo studies are discussed and summarized.
CONCLUSION: The greatest limitation of the majority of the literature is reliance on the original visibility of the needle in the static image. The need for additional or improved apparatus is the greatest barrier to clinical utility in practice.
SIGNIFICANCE: Ultrasound-guided needle placement is performed in many clinical applications, including biopsies, treatment injections, and anesthesia. Despite the wide range and long history of these techniques, an ongoing challenge is needle visibility in ultrasound. A robust technique that enhances ultrasonic needle visibility, especially for steeply inserted hand-held needles, while maintaining clinical utility requirements is needed.
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada.
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
- Robert Rohling
- Electrical and Computer Engineering Department and Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
6
Gillies DJ, Rodgers JR, Gyacskov I, Roy P, Kakani N, Cool DW, Fenster A. Deep learning segmentation of general interventional tools in two-dimensional ultrasound images. Med Phys 2020; 47:4956-4970. [DOI: 10.1002/mp.14427]
Affiliation(s)
- Derek J. Gillies
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Jessica R. Rodgers
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Igor Gyacskov
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Priyanka Roy
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Nirmal Kakani
- Department of Radiology, Manchester Royal Infirmary, Manchester M13 9WL, UK
- Derek W. Cool
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
- Aaron Fenster
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
7
Lee JY, Islam M, Woh JR, Washeem TSM, Ngoh LYC, Wong WK, Ren H. Ultrasound needle segmentation and trajectory prediction using excitation network. Int J Comput Assist Radiol Surg 2020; 15:437-443. [PMID: 31960247] [DOI: 10.1007/s11548-019-02113-x]
Abstract
PURPOSE: Ultrasound (US)-guided percutaneous kidney biopsy is a challenge for interventionists, as US artefacts prevent accurate viewing of the biopsy needle tip. Automatic needle tracking and trajectory prediction can increase operator confidence in performing biopsies, reduce procedure time, minimize the risk of inadvertent biopsy bleeding, and enable future image-guided robotic procedures.
METHODS: We propose a tracking-by-segmentation model with spatial and channel "Squeeze and Excitation" (scSE) for US needle detection and trajectory prediction. We adopt a light deep learning architecture (LinkNet) as our segmentation baseline network and integrate the scSE module to learn spatial information for better prediction. The proposed model is trained with US images from anonymized kidney biopsy clips of 8 patients. The needle contour is obtained using the border-following algorithm and its area is calculated using Green's formula. Trajectory prediction is made by extrapolating from the smallest bounding box that can capture the contour.
RESULTS: We train and test our model on a total of 996 images extracted from 102 short videos at a rate of 3 frames per second from each video. A set of 794 images is used for training and 202 images for testing. Our model achieved an IoU of 41.01%, a Dice accuracy of 56.65%, an F1-score of 36.61%, and a root-mean-square angle error of 13.3[Formula: see text]. We are thus able to predict and extrapolate the trajectory of the biopsy needle with decent accuracy, helping interventionists better perform biopsies.
CONCLUSION: Our novel model combining LinkNet and scSE shows promising results for the kidney biopsy application, implying potential for other ultrasound-guided biopsies that require needle tracking and trajectory prediction.
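For a closed pixel contour, the Green's-formula area computation mentioned above reduces to the shoelace formula. A minimal sketch, assuming the contour is given as an ordered list of vertices (as produced by a border-following algorithm):

```python
def contour_area(points):
    """Area of a closed polygonal contour via Green's theorem,
    i.e. the shoelace formula:
        A = 0.5 * |sum_i (x_i * y_{i+1} - x_{i+1} * y_i)|
    `points` is an ordered list of (x, y) vertices; the polygon is
    closed implicitly from the last vertex back to the first.
    """
    n = len(points)
    s = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

# Unit square -> area 1.0
print(contour_area([(0, 0), (1, 0), (1, 1), (0, 1)]))  # -> 1.0
```

Taking the absolute value makes the result independent of whether the border-following algorithm emits the contour clockwise or counter-clockwise.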
Affiliation(s)
- Jia Yi Lee
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Mobarakol Islam
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
- NUS Graduate School for Integrative Sciences and Engineering (NGS), NUS, Singapore, Singapore
- Jing Ru Woh
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- T S Mohamed Washeem
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
- Lee Ying Clara Ngoh
- Division of Nephrology, National University Hospital, Singapore, Singapore
- Department of Medicine, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore
- Weng Kin Wong
- Division of Nephrology, National University Hospital, Singapore, Singapore
- Hongliang Ren
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
8
Learning needle tip localization from digital subtraction in 2D ultrasound. Int J Comput Assist Radiol Surg 2019; 14:1017-1026. [DOI: 10.1007/s11548-019-01951-z]
9
10
Mwikirize C, Nosher JL, Hacihaliloglu I. Convolution neural networks for real-time needle detection and localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2018; 13:647-657. [PMID: 29512006] [DOI: 10.1007/s11548-018-1721-y]
Abstract
PURPOSE: We propose a framework for automatic and accurate detection of steeply inserted needles in 2D ultrasound data using convolutional neural networks, and demonstrate its application to needle trajectory estimation and tip localization.
METHODS: Our approach consists of a unified network comprising a fully convolutional network (FCN) and a fast region-based convolutional neural network (R-CNN). The FCN proposes candidate regions, which are then fed to the fast R-CNN for finer needle detection. We leverage a transfer learning paradigm in which the network weights are initialized by training with non-medical images and fine-tuned with ex vivo ultrasound scans collected during insertion of a 17G epidural needle into freshly excised porcine and bovine tissue, at depth settings up to 9 cm and [Formula: see text]-[Formula: see text] insertion angles. Needle detection results are used to estimate the needle trajectory from intensity-invariant needle features and to localize the needle tip via an intensity search along the trajectory.
RESULTS: Our needle detection model was trained and validated on 2500 ex vivo ultrasound scans. The detection system runs at 25 fps on a GPU and achieves 99.6% precision, a 99.78% recall rate, and an [Formula: see text] score of 0.99. Validation for needle localization was performed on 400 scans collected using a different imaging platform over a bovine/porcine lumbosacral spine phantom. A shaft localization error of [Formula: see text], a tip localization error of [Formula: see text] mm, and a total processing time of 0.58 s were achieved.
CONCLUSION: The proposed method is fully automatic and provides robust needle localization in challenging scanning conditions. The accurate and robust results, coupled with real-time detection and sub-second total processing, make the method promising for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.
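The tip-localization step above searches image intensity along the estimated trajectory. A much-simplified sketch of that two-step idea, fitting a least-squares line to detected needle pixels and taking the brightest sample along it as the tip (this is not the paper's exact feature pipeline; the function name is illustrative):

```python
import numpy as np

def localize_tip(image, needle_xy):
    """Estimate a needle trajectory and tip from detected needle pixels.

    needle_xy: (N, 2) array of (x, y) pixel coordinates on the shaft.
    Fits a least-squares line y = m*x + b, samples image intensity
    along that line, and returns the brightest sample as the tip.
    """
    x = needle_xy[:, 0].astype(float)
    y = needle_xy[:, 1].astype(float)
    m, b = np.polyfit(x, y, 1)  # trajectory estimate
    xs = np.arange(int(x.min()), int(x.max()) + 1)
    ys = np.clip(np.round(m * xs + b).astype(int), 0, image.shape[0] - 1)
    vals = image[ys, xs]        # intensity profile along the trajectory
    k = int(np.argmax(vals))    # brightest sample -> tip candidate
    return (m, b), (xs[k], ys[k])
```

In practice the search would extend beyond the last detected shaft pixel, since the tip is typically the distal bright point along the estimated line rather than one of the detected shaft pixels themselves.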
Affiliation(s)
- Cosmas Mwikirize
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA.
- John L Nosher
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
- Ilker Hacihaliloglu
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA