1
Watanabe H, Fukuda H, Ezawa Y, Matsuyama E, Kondo Y, Hayashi N, Ogura T, Shimosegawa M. Automated angular measurement for puncture angle using a computer-aided method in ultrasound-guided peripheral insertion. Phys Eng Sci Med 2024; 47:679-689. [PMID: 38358620] [DOI: 10.1007/s13246-024-01397-x]
Abstract
Ultrasound guidance has become the gold standard for obtaining vascular access. Angle information, which indicates the entry angle of the needle into the vein, is required to ensure puncture success. Although various image processing-based methods, such as deep learning, have recently been applied to improve needle visibility, these methods have limitations in that the puncture angle to the target organ is not measured. We aim to detect the target vessel and puncture needle and to derive the puncture angle by combining deep learning with conventional image processing methods such as the Hough transform. Median cubital vein US images were obtained from 20 healthy volunteers, and images of simulated blood vessels and needles were obtained during puncture of a simulated blood vessel in four phantoms. The U-Net architecture was used to segment images of blood vessels and needles, and various image processing methods were employed to automatically measure angles. The experimental results indicated that the mean Dice coefficients of median cubital veins, simulated blood vessels, and needles were 0.826, 0.931, and 0.773, respectively. The quantitative angular measurements showed good agreement between expert and automatic measurements of the puncture angle, with a correlation of 0.847. Our findings indicate that the proposed method achieves high segmentation accuracy and automated angular measurement. The proposed method reduces the variability and time required for manual angle measurement and may allow the operator to concentrate on the delicate techniques related to the direction of the needle.
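The two headline quantities of this entry, Dice overlap and a puncture angle recovered from a segmentation mask, are easy to sketch. The snippet below is an illustrative reconstruction, not the authors' code: it fits the needle axis by least squares rather than the Hough transform used in the paper, and all names are assumptions.

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def needle_angle_deg(mask):
    """Needle insertion angle in degrees, from a least-squares line fit
    to the segmented needle pixels (image rows grow downward)."""
    rows, cols = np.nonzero(mask)
    slope = np.polyfit(cols, rows, 1)[0]
    return np.degrees(np.arctan(slope))

# Synthetic 45-degree needle mask: pixels along the main diagonal
mask = np.zeros((64, 64), dtype=np.uint8)
idx = np.arange(10, 50)
mask[idx, idx] = 1
angle = needle_angle_deg(mask)  # ~45 degrees
```

In practice the mask would come from the U-Net output; the angle between the fitted needle line and the vessel axis would then give the puncture angle.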
Affiliation(s)
- Haruyuki Watanabe
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Hironori Fukuda
- Department of Radiology, Cardiovascular Hospital of Central Japan, Shibukawa, Japan
- Yuina Ezawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Eri Matsuyama
- Faculty of Informatics, The University of Fukuchiyama, Fukuchiyama, Japan
- Yohan Kondo
- Graduate School of Health Sciences, Niigata University, Niigata, Japan
- Norio Hayashi
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Toshihiro Ogura
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Masayuki Shimosegawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
2
Hui X, Rajendran P, Ling T, Dai X, Xing L, Pramanik M. Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth. Photoacoustics 2023; 34:100575. [PMID: 38174105] [PMCID: PMC10761306] [DOI: 10.1016/j.pacs.2023.100575]
Abstract
Accurate needle guidance is crucial for safe and effective clinical diagnosis and treatment procedures. Conventional ultrasound (US)-guided needle insertion often encounters challenges in consistently and precisely visualizing the needle, necessitating the development of reliable methods to track it. As a powerful tool in image processing, deep learning has shown promise for enhancing needle visibility in US images, although its dependence on manual annotation or simulated data as ground truth can introduce bias or hinder generalization to real US images. Photoacoustic (PA) imaging has demonstrated its capability for high-contrast needle visualization. In this study, we explore the potential of PA imaging as a reliable ground truth for deep learning network training without the need for expert annotation. Our network (UIU-Net), trained on ex vivo tissue image datasets, has shown remarkable precision in localizing needles within US images. The evaluation of needle segmentation performance extends across previously unseen ex vivo data and in vivo human data (collected from an open-source data repository). Specifically, for human data, the Modified Hausdorff Distance (MHD) is approximately 3.73 and the targeting error is around 2.03, indicating strong similarity and small orientation deviation between the predicted and actual needle locations. A key advantage of our method is its applicability beyond US images captured from specific imaging systems, extending to images from other US imaging systems.
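The Modified Hausdorff Distance reported here is a standard set-to-set metric (Dubuisson and Jain, 1994). A minimal NumPy sketch, with hypothetical variable names, for comparing a predicted needle centerline against ground truth:

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff Distance between two point sets of shape (N, 2):
    the larger of the two mean nearest-neighbour distances."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

# Two short needle centerlines offset by 3 px in the column direction
pred = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
truth = pred + np.array([0.0, 3.0])
mhd = modified_hausdorff(pred, truth)  # 3.0
```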
Affiliation(s)
- Xie Hui
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- Praveenbalaji Rajendran
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Tong Ling
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 637459, Singapore
- Xianjin Dai
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Lei Xing
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Manojit Pramanik
- Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50011, United States
3
Malamal G, Schwab HM, Panicker MR. Enhanced Needle Visualization With Reflection Tuned Apodization Based on the Radon Transform for Ultrasound Imaging. IEEE Trans Ultrason Ferroelectr Freq Control 2023; 70:1482-1493. [PMID: 37721881] [DOI: 10.1109/tuffc.2023.3316284]
Abstract
In ultrasound (US)-guided interventions, accurately tracking and visualizing needles during in-plane insertions are significant challenges due to strong directional specular reflections. These reflections violate the geometrical delay and apodization estimations in conventional delay and sum beamforming (DASB), degrading the visualization of needles. This study proposes a novel reflection tuned apodization (RTA) to address this issue and facilitate needle enhancement through DASB. The method leverages both temporal and angular information derived from the Radon transforms of the radio frequency (RF) data from plane-wave imaging to filter the specular reflections from the needle and their directivity. The directivity information is translated into apodization center maps through time-to-space mapping in the Radon domain, which is subsequently integrated into DASB. We assess the influence of needle angulations, projection angles in the Radon transform, needle gauge sizes, and the presence of multiple specular interfaces on the approach. The analysis shows that the method surpasses conventional DASB in enhancing the image quality of needle interfaces while preserving the diffuse scattering from the surrounding tissues, without significant computational overhead. The work offers promising prospects for improved outcomes in US-guided interventions and better insights into characterizing US reflections with Radon transforms.
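The RTA beamformer itself operates on RF channel data and cannot be reproduced here. The following toy sketch only illustrates the underlying idea of reading a specular reflector's orientation out of the Radon domain, via a brute-force projection over candidate angles (all function and variable names are assumptions, not the paper's implementation):

```python
import numpy as np

def dominant_angle(image, angles_deg):
    """For each candidate angle, histogram the signed distances of bright
    pixels from a line at that angle through the image centre; a linear
    (specular) reflector collapses into one sharp peak exactly when the
    candidate angle matches its orientation."""
    h, w = image.shape
    ys, xs = np.nonzero(image)
    vals = image[ys, xs].astype(float)
    yc, xc = ys - h / 2.0, xs - w / 2.0
    best_angle, best_peak = None, -np.inf
    for theta in angles_deg:
        t = np.radians(theta)
        rho = xc * np.sin(t) - yc * np.cos(t)  # perpendicular offset
        hist, _ = np.histogram(rho, bins=64, weights=vals)
        if hist.max() > best_peak:
            best_peak, best_angle = hist.max(), theta
    return best_angle

# A synthetic 45-degree "needle" of bright pixels
img = np.zeros((64, 64))
r = np.arange(8, 56)
img[r, r] = 1.0
est = dominant_angle(img, range(0, 180, 5))  # 45
```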
4
Park S, Beom DG, Bae EH, Kim SW, Kim DJ, Kim CS. Model-Based Needle Identification Using Image Analysis and Needle Library Matching for Ultrasound-Guided Kidney Biopsy: A Feasibility Study. Ultrasound Med Biol 2023; 49:1699-1708. [PMID: 37137741] [DOI: 10.1016/j.ultrasmedbio.2023.03.009]
Abstract
OBJECTIVE The aim of the work described here was to determine the feasibility of using a novel biopsy needle detection technique that achieves high sensitivity and specificity in a trade-off of resolution, detectability and depth of imaging. METHODS The proposed needle detection method consists of a model-based image analysis, temporal needle projection and needle library matching: (i) Image analysis was formulated under the signal decomposition framework; (ii) temporal projection converted the time-resolved needle dynamics into a single image of the desired needle; and (iii) the enhanced needle structure was spatially refined by matching a long, straight linear object in the needle library. The efficacy was examined with respect to different needle visibility. RESULTS Our method effectively eliminated confounding effects of the background tissue artifacts more robustly than conventional methods, thus improving needle visibility even with the low contrast between the needle and tissue. The improvement in needle structure further resulted in an improvement in estimation performance for the trajectory angle and tip position. CONCLUSION Our three-step needle detection method can reliably detect needle position without the need for external devices, increasing the needle conspicuity and reducing motion sensitivity.
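Step (ii), temporal projection of the time-resolved needle dynamics into a single image, can be approximated by a background-subtracted maximum-intensity projection. This is a hypothetical simplification of the paper's signal-decomposition framework, not the published algorithm:

```python
import numpy as np

def temporal_needle_projection(frames):
    """Collapse a time-resolved stack (T, H, W) into one needle-enhanced
    image: subtract the per-pixel temporal median (quasi-static tissue),
    then max-project so transient needle echoes survive."""
    frames = frames.astype(float)
    residual = np.clip(frames - np.median(frames, axis=0), 0.0, None)
    return residual.max(axis=0)

# Static background (intensity 10) plus a bright echo moving diagonally
T, H, W = 8, 16, 16
stack = np.full((T, H, W), 10.0)
for t in range(T):
    stack[t, t, t] = 100.0
proj = temporal_needle_projection(stack)  # trajectory kept, tissue cancelled
```

The resulting single image would then be matched against a library of long, straight linear templates, as described in step (iii).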
Affiliation(s)
- Suhyung Park
- Department of Computer Engineering, Chonnam National University, Gwangju, Republic of Korea; Department of ICT Convergence System Engineering, Chonnam National University, Gwangju, Republic of Korea
- Dong Gyu Beom
- Department of Computer Engineering, Chonnam National University, Gwangju, Republic of Korea
- Eun Hui Bae
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
- Soo Wan Kim
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
- Dong Joon Kim
- Department of Anesthesiology and Pain Medicine, Chosun University Medical School, Gwangju, Republic of Korea; Department of Anesthesiology and Pain Medicine, Chosun University Hospital, Gwangju, Republic of Korea
- Chang Seong Kim
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
5
Arapi V, Hardt-Stremayr A, Weiss S, Steinbrener J. Bridging the simulation-to-real gap for AI-based needle and target detection in robot-assisted ultrasound-guided interventions. Eur Radiol Exp 2023; 7:30. [PMID: 37332035] [DOI: 10.1186/s41747-023-00344-x]
Abstract
BACKGROUND Artificial intelligence (AI)-powered, robot-assisted, and ultrasound (US)-guided interventional radiology has the potential to increase the efficacy and cost-efficiency of interventional procedures while improving postsurgical outcomes and reducing the burden for medical personnel. METHODS To overcome the lack of available clinical data needed to train state-of-the-art AI models, we propose a novel approach for generating synthetic ultrasound data from real, clinical preoperative three-dimensional (3D) data of different imaging modalities. With the synthetic data, we trained a deep learning-based detection algorithm for the localization of needle tip and target anatomy in US images. We validated our models on real, in vitro US data. RESULTS The resulting models generalize well to unseen synthetic data and experimental in vitro data, making the proposed approach a promising method to create AI-based models for applications of needle and target detection in minimally invasive US-guided procedures. Moreover, we show that by one-time calibration of the US and robot coordinate frames, our tracking algorithm can be used to accurately fine-position the robot in reach of the target based on 2D US images alone. CONCLUSIONS The proposed data generation approach is sufficient to bridge the simulation-to-real gap and has the potential to overcome data paucity challenges in interventional radiology. The proposed AI-based detection algorithm shows very promising results in terms of accuracy and frame rate. RELEVANCE STATEMENT This approach can facilitate the development of next-generation AI algorithms for patient anatomy detection and needle tracking in US and their application to robotics. KEY POINTS
- AI-based methods show promise for needle and target detection in US-guided interventions.
- Publicly available, annotated datasets for training AI models are limited.
- Synthetic, clinical-like US data can be generated from magnetic resonance or computed tomography data.
- Models trained with synthetic US data generalize well to real in vitro US data.
- Target detection with an AI model can be used for fine positioning of the robot.
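The "one-time calibration of the US and robot coordinate frames" mentioned in the results is classically solved with a least-squares rigid registration on paired points (Kabsch algorithm). The sketch below is under that assumption; the paper does not publish its calibration code:

```python
import numpy as np

def rigid_calibration(us_pts, robot_pts):
    """Least-squares rigid transform (Kabsch algorithm) mapping ultrasound
    coordinates to robot coordinates from paired (N, 3) points."""
    mu_a, mu_b = us_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (us_pts - mu_a).T @ (robot_pts - mu_b)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T      # guard against reflections
    t = mu_b - R @ mu_a
    return R, t                                  # robot_pt ~= R @ us_pt + t

# Recover a known 90-degree rotation about z plus a translation
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
us = np.random.default_rng(0).normal(size=(10, 3))
robot = us @ Rz.T + t_true
R, t = rigid_calibration(us, robot)
```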
Affiliation(s)
- Visar Arapi
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Alexander Hardt-Stremayr
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Stephan Weiss
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Jan Steinbrener
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
6
Yang H, Shan C, Kolen AF, de With PHN. Medical instrument detection in ultrasound: a review. Artif Intell Rev 2022. [DOI: 10.1007/s10462-022-10287-1]
Abstract
Medical instrument detection is essential for computer-assisted interventions, since it helps clinicians find instruments efficiently and interpret images better, thereby improving clinical outcomes. This article reviews image-based medical instrument detection methods for ultrasound-guided (US-guided) operations. Literature was selected based on an exhaustive search in different sources, including Google Scholar, PubMed, and Scopus. We first discuss the key clinical applications of medical instrument detection in US, including delivering regional anesthesia, biopsy taking, prostate brachytherapy, and catheterization. Then, we present a comprehensive review of instrument detection methodologies, including non-machine-learning and machine-learning methods. Conventional non-machine-learning methods were extensively studied before the era of machine learning. The principal issues and potential research directions for future studies are summarized for the computer-assisted intervention community. In conclusion, although promising results have been obtained by current (non-)machine learning methods for different clinical applications, thorough clinical validations are still required.
7
Daoud MI, Abu-Hani AF, Shtaiyat A, Ali MZ, Alazrai R. Needle detection using ultrasound B-mode and power Doppler analyses. Med Phys 2022; 49:4999-5013. [PMID: 35608237] [DOI: 10.1002/mp.15725]
Abstract
BACKGROUND Ultrasound is employed in needle interventions to visualize the anatomical structures and track the needle. Nevertheless, needle detection in ultrasound images is a difficult task, particularly at steep insertion angles. PURPOSE A new method is presented to enable effective needle detection using ultrasound B-mode and power Doppler analyses. METHODS A small buzzer is used to excite the needle, and an ultrasound system is utilized to acquire B-mode and power Doppler images of the needle. The B-mode and power Doppler images are processed using the Radon transform and local phase analysis to initially detect the axis of the needle. The detection of the needle axis is improved by processing the power Doppler image using alpha shape analysis to define a region of interest (ROI) that contains the needle. Also, a set of feature maps is extracted from the ROI in the B-mode image. The feature maps are processed using a machine learning classifier to construct a likelihood image that visualizes the posterior needle likelihoods of the pixels. The Radon transform is applied to the likelihood image to achieve an improved needle axis detection. Additionally, the region in the B-mode image surrounding the needle axis is analyzed to identify the needle tip using a custom-made probabilistic approach. Our method was used to detect needles inserted in ex vivo animal tissues at shallow [20°-40°), moderate [40°-60°), and steep [60°-85°] angles. RESULTS Our method detected the needles with a failure rate of 0% and mean angle, axis, and tip errors less than or equal to 0.7°, 0.6 mm, and 0.7 mm, respectively. Additionally, our method achieved favorable results compared to two recently introduced needle detection methods. CONCLUSIONS The results indicate the potential of applying our method to achieve effective needle detection in ultrasound images.
Affiliation(s)
- Mohammad I Daoud
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Ayah F Abu-Hani
- Department of Electrical and Computer Engineering, Technical University of Munich, Munich, 80333, Germany
- Ahmad Shtaiyat
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Mostafa Z Ali
- Department of Computer Information Systems, Jordan University of Science and Technology, Irbid, 22110, Jordan
- Rami Alazrai
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
8
Sánchez-Margallo JA, Tas L, Moelker A, van den Dobbelsteen JJ, Sánchez-Margallo FM, Langø T, van Walsum T, van de Berg NJ. Block-matching-based registration to evaluate ultrasound visibility of percutaneous needles in liver-mimicking phantoms. Med Phys 2021; 48:7602-7612. [PMID: 34665885] [PMCID: PMC9298012] [DOI: 10.1002/mp.15305]
Abstract
Purpose To present a novel methodical approach to compare visibility of percutaneous needles in ultrasound images. Methods A motor-driven rotation platform was used to gradually change the needle angle while capturing image data. Data analysis was automated using block-matching-based registration, with a tracking and refinement step. Every 25 frames, a Hough transform was used to improve needle alignments after large rotations. The method was demonstrated by comparing three commercial needles (14G radiofrequency ablation, RFA; 18G Trocar; 22G Chiba) and six prototype needles with different sizes, materials, and surface conditions (polished, sand-blasted, and kerfed), within polyvinyl alcohol phantom tissue and ex vivo bovine liver models. For each needle and angle, a contrast-to-noise ratio (CNR) was determined to quantify visibility. CNR values are presented as a function of needle type and insertion angle. In addition, the normalized area under the (CNR-angle) curve was used as a summary metric to compare needles. Results In phantom tissue, the first kerfed needle design had the largest normalized area of visibility and the polished 1 mm diameter stainless steel needle the smallest (0.704 ± 0.199 vs. 0.154 ± 0.027, p < 0.01). In the ex vivo model, the second kerfed needle design had the largest normalized area of visibility, and the sand-blasted stainless steel needle the smallest (0.470 ± 0.190 vs. 0.127 ± 0.047, p < 0.001). As expected, the analysis showed needle visibility peaks at orthogonal insertion angles. For acute or obtuse angles, needle visibility was similar or reduced. Overall, the variability in needle visibility was considerably higher in livers. Conclusion The best overall visibility was found with kerfed needles and the commercial RFA needle. The presented methodical approach to quantify ultrasound visibility allows comparisons of (echogenic) needles, as well as other technological innovations aiming to improve ultrasound visibility of percutaneous needles, such as coatings, material treatments, and beam steering approaches.
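The two quantities used here to rank needles, CNR per angle and the normalized area under the CNR-angle curve, can be sketched as follows. Trapezoidal integration is an assumption on my part; the paper does not specify the quadrature rule, and all names are illustrative:

```python
import numpy as np

def cnr(needle_px, background_px):
    """Contrast-to-noise ratio between needle and background samples."""
    return abs(needle_px.mean() - background_px.mean()) / background_px.std()

def normalized_visibility_area(angles_deg, cnr_values, cnr_max):
    """Trapezoidal area under the CNR-angle curve, normalized so a needle
    pinned at cnr_max across the whole angle sweep scores 1.0."""
    steps = np.diff(angles_deg)
    area = np.sum(0.5 * (cnr_values[1:] + cnr_values[:-1]) * steps)
    return area / (cnr_max * (angles_deg[-1] - angles_deg[0]))

needle = np.array([20.0])
background = np.array([8.0, 12.0])  # mean 10, std 2
c = cnr(needle, background)         # 5.0
angles = np.linspace(0.0, 90.0, 10)
nav = normalized_visibility_area(angles, np.full(10, 2.0), cnr_max=4.0)  # 0.5
```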
Affiliation(s)
- Juan A Sánchez-Margallo
- Bioengineering and Health Technologies Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
- Lisette Tas
- Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- Adriaan Moelker
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Theo van Walsum
- Biomedical Imaging Group Rotterdam, Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, The Netherlands
- Nick J van de Berg
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
9
Checcucci E, Amparore D, Volpi G, Piramide F, De Cillis S, Piana A, Alessio P, Verri P, Piscitello S, Carbonaro B, Meziere J, Zamengo D, Tsaturyan A, Cacciamani G, Rivas JG, De Luca S, Manfredi M, Fiori C, Liatsikos E, Porpiglia F. Percutaneous puncture during PCNL: new perspective for the future with virtual imaging guidance. World J Urol 2021; 40:639-650. [PMID: 34468886] [DOI: 10.1007/s00345-021-03820-4]
Abstract
CONTEXT Large and complex renal stones are usually treated with percutaneous nephrolithotomy (PCNL). One of the crucial steps in this procedure is access to the collecting system via the percutaneous puncture, a maneuver that carries a risk of injury to vascular structures and neighboring organs. In recent years, the application of virtual image-guided surgery has gained wide diffusion in this specific field. OBJECTIVES To provide a short overview of the most recent evidence on current applications of virtual imaging guidance for PCNL. EVIDENCE ACQUISITION A non-systematic review of the literature was performed. Medline, PubMed, the Cochrane Database and Embase were screened for studies regarding the use of virtual imaging guidance for PCNL. EVIDENCE SYNTHESIS 3D virtual navigation technology for PCNL was first used in urology for surgical training and surgical planning; subsequently, surgical navigation with different modalities (from cognitive to augmented reality or mixed reality) was explored. Finally, anecdotal preliminary experiences explored the potential application of artificial intelligence guidance for the percutaneous puncture. CONCLUSION Many experiences have demonstrated the potential benefit of virtual guidance for surgical simulation and training. In surgery, this tool has proved useful both for surgical planning, enabling better surgical performance, and for surgical navigation, with augmented reality and mixed reality systems assisting the surgeon in real time during the intervention.
Affiliation(s)
- E Checcucci
- Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Strada Provinciale 142, km 3,95, 10060, Candiolo, Turin, Italy
- Uro-Technology and SoMe Working Group of the Young Academic Urologists (YAU) Working Party of the European Association of Urology (EAU), Arnhem, The Netherlands
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- D Amparore
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- G Volpi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- F Piramide
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S De Cillis
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Piana
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Alessio
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- P Verri
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- S Piscitello
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- B Carbonaro
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- J Meziere
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- D Zamengo
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- A Tsaturyan
- Department of Urology, University Hospital of Patras, Patras, Greece
- G Cacciamani
- USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Juan Gomez Rivas
- Department of Urology, La Paz University Hospital, Madrid, Spain
- S De Luca
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- M Manfredi
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- C Fiori
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
- E Liatsikos
- Department of Urology, University Hospital of Patras, Patras, Greece
- Department of Urology, Medical University of Vienna, Vienna, Austria
- F Porpiglia
- Department of Oncology, Division of Urology, University of Turin, Turin, Italy
10
Wijata A, Andrzejewski J, Pyciński B. An Automatic Biopsy Needle Detection and Segmentation on Ultrasound Images Using a Convolutional Neural Network. Ultrason Imaging 2021; 43:262-272. [PMID: 34180737] [DOI: 10.1177/01617346211025267]
Abstract
Needle visualization in the ultrasound image is essential to successfully perform ultrasound-guided core needle biopsy. Automatic needle detection can significantly reduce the procedure time and false-negative rate, and greatly improve diagnosis. In this paper, we present a CNN-based, fully automatic method for detection of the core needle in 2D ultrasound images. The network is trained with the adaptive moment estimation (Adam) optimizer, and the Radon transform is applied to locate the needle. The model was trained and tested on a total of 619 2D images from 91 cases of breast cancer. It achieved an average weighted intersection over union (weighted Jaccard index) of 0.986, an F1 score of 0.768, and an angle RMSE of 3.73°. These results exceed other solutions by at least 0.27 in F1 score and 7° in angle RMSE. Finally, the needle is detected in a single frame in 21.6 ms on average on a modern PC.
Affiliation(s)
- Agata Wijata
- Faculty of Biomedical Engineering, Silesian University of Technology, Zabrze, Poland
- Jacek Andrzejewski
- Faculty of Biomedical Engineering, Silesian University of Technology, Zabrze, Poland
- Bartłomiej Pyciński
- Faculty of Biomedical Engineering, Silesian University of Technology, Zabrze, Poland
11
Time-aware deep neural networks for needle tip localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2021; 16:819-827. [PMID: 33840037] [DOI: 10.1007/s11548-021-02361-w]
Abstract
PURPOSE Accurate placement of the needle is critical in interventions like biopsies and regional anesthesia, during which incorrect needle insertion can lead to procedure failure and complications. Therefore, ultrasound guidance is widely used to improve needle placement accuracy. However, at steep and deep insertions, the visibility of the needle is lost. Computational methods for automatic needle tip localization could improve the clinical success rate in these scenarios. METHODS We propose a novel algorithm for needle tip localization during challenging ultrasound-guided insertions when the shaft may be invisible and the tip has a low intensity. There are two key steps in our approach. First, we enhance the needle tip features in consecutive ultrasound frames using a detection scheme which recognizes subtle intensity variations caused by needle tip movement. We then employ a hybrid deep neural network comprising a convolutional neural network and long short-term memory recurrent units. The input to the network is a sequence of fused enhanced frames and the corresponding original B-mode frames, and this spatiotemporal information is used to predict the needle tip location. RESULTS We evaluate our approach on an ex vivo dataset collected with in-plane and out-of-plane insertion of 17G and 22G needles in bovine, porcine, and chicken tissue, acquired using two different ultrasound systems. We train the model with 5000 frames from 42 video sequences. Evaluation on 600 frames from 30 sequences yields a tip localization error of [Formula: see text] mm and an overall inference time of 0.064 s (15 fps). Comparison against prior art on challenging datasets reveals a 30% improvement in tip localization accuracy. CONCLUSION The proposed method automatically models temporal dynamics associated with needle tip motion and is more accurate than state-of-the-art methods. Therefore, it has the potential to improve needle tip localization in challenging ultrasound-guided interventions.
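The enhancement step, highlighting subtle tip-motion intensity variations between consecutive frames before feeding the CNN-LSTM, can be illustrated with a thresholded frame difference stacked with the B-mode frame. This is a plausible reading of the described scheme, not the authors' implementation, and the threshold value is arbitrary:

```python
import numpy as np

def enhance_tip_motion(prev_frame, curr_frame, thresh=5.0):
    """Highlight subtle intensity increases between consecutive B-mode
    frames: keep positive differences above a speckle threshold, and stack
    the motion map with the current frame as a 2-channel network input."""
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    motion = np.where(diff > thresh, diff, 0.0)
    return np.stack([curr_frame.astype(float), motion])  # (2, H, W)

prev = np.zeros((8, 8))
curr = np.zeros((8, 8))
curr[4, 4] = 50.0  # the advancing tip brightens one pixel
fused = enhance_tip_motion(prev, curr)
```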
12
Yang H, Shan C, Kolen AF, de With PHN. Efficient Medical Instrument Detection in 3D Volumetric Ultrasound Data. IEEE Trans Biomed Eng 2021; 68:1034-1043. [PMID: 32746017] [DOI: 10.1109/tbme.2020.2999729]
Abstract
Ultrasound-guided procedures have been applied in many clinical therapies, such as cardiac catheterization and regional anesthesia. Medical instrument detection in 3D Ultrasound (US) is highly desired, but the existing approaches are far from real-time performance. Our objective is to investigate an efficient instrument detection method in 3D US for practical clinical use. We propose a novel Multi-dimensional Mixed Network for efficient instrument detection in 3D US, which extracts the discriminating features at 3D full-image level by a 3D encoder, and then applies a specially designed dimension reduction block to reduce the spatial complexity of the feature maps by projecting from 3D space into 2D space. A 2D decoder is adopted to detect the instrument along the specified axes. By projecting the predicted 2D outputs, the instrument is detected or visualized in the 3D volume. Furthermore, to enable the network to better learn the discriminative information, we propose a multi-level loss function to capture both pixel- and image-level differences. We carried out extensive experiments on two datasets for two tasks: (1) catheter detection for cardiac RF-ablation and (2) needle detection for regional anesthesia. Our experiments show that our proposed method achieves a detection error of 2-3 voxels with an efficiency of about 0.12 sec per 3D US volume. The proposed method is 3-8 times faster than the state-of-the-art methods, leading to real-time performance. The results show that our proposed method has significant clinical value for real-time 3D US-guided intervention.
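The dimension-reduction block described above is learned, but its effect, collapsing a 3D feature volume into 2D maps along specified axes, can be illustrated with a fixed maximum-intensity projection (a simplified stand-in, not the paper's network):

```python
import numpy as np

def project_volume(vol: np.ndarray, axis: int) -> np.ndarray:
    """Collapse one spatial axis of a 3D volume into a 2D map by
    maximum-intensity projection -- a non-learned analogue of a
    3D-to-2D dimension-reduction block."""
    return vol.max(axis=axis)

vol = np.zeros((4, 5, 6), dtype=np.float32)
vol[2, 3, 1] = 1.0  # a single bright "instrument" voxel
xy = project_volume(vol, axis=2)  # shape (4, 5)
yz = project_volume(vol, axis=0)  # shape (5, 6)
# For this single bright voxel, intersecting the peak positions of two
# orthogonal projections recovers its 3D coordinates.
i, j = np.unravel_index(xy.argmax(), xy.shape)
_, k = np.unravel_index(yz.argmax(), yz.shape)
```

This mirrors the paper's strategy of detecting in projected 2D views and back-projecting the detections into the 3D volume.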
13
Efficient and Robust Instrument Segmentation in 3D Ultrasound Using Patch-of-Interest-FuseNet with Hybrid Loss. Med Image Anal 2020; 67:101842. [PMID: 33075639 DOI: 10.1016/j.media.2020.101842] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/16/2019] [Revised: 09/11/2020] [Accepted: 09/24/2020] [Indexed: 11/20/2022]
Abstract
Instrument segmentation plays a vital role in 3D ultrasound (US)-guided cardiac intervention. Efficient and accurate segmentation during the operation is highly desired, since it can facilitate the procedure, reduce operational complexity, and thereby improve the outcome. Nevertheless, current image-based instrument segmentation methods are neither efficient nor accurate enough for clinical use. Recently, fully convolutional neural networks (FCNs), including 2D and 3D FCNs, have been used in different volumetric segmentation tasks. However, a 2D FCN cannot exploit the 3D contextual information in the volumetric data, while a 3D FCN requires high computation cost and a large amount of training data. Moreover, with limited computation resources, a 3D FCN is commonly applied with a patch-based strategy, which is therefore not efficient for clinical applications. To address these issues, we propose POI-FuseNet, which consists of a patch-of-interest (POI) selector and a FuseNet. The POI selector can efficiently select the regions of interest containing the instrument, while FuseNet can make use of 2D and 3D FCN features to hierarchically exploit contextual information. Furthermore, we propose a hybrid loss function, consisting of a contextual loss and a class-balanced focal loss, to improve the segmentation performance of the network. On a challenging ex vivo dataset of RF-ablation catheters, our method achieved a Dice score of 70.5%, superior to the state-of-the-art methods. In addition, starting from the model pre-trained on the ex vivo dataset, our method can be adapted to an in vivo guidewire dataset from a different cardiac operation, achieving a Dice score of 66.5%. More crucially, with the POI-based strategy, segmentation time is reduced to around 1.3 seconds per volume, which shows that the proposed method is promising for clinical use.
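A class-balanced focal loss like the one in this hybrid loss down-weights easy background pixels so the thin instrument dominates the gradient. A sketch of one common formulation (after Lin et al.; the paper's exact weighting may differ):

```python
import numpy as np

def focal_loss(p: np.ndarray, y: np.ndarray,
               alpha: float = 0.25, gamma: float = 2.0) -> float:
    """Class-balanced focal loss for a binary mask: predictions p in
    (0, 1), labels y in {0, 1}. Sketch of a common formulation."""
    eps = 1e-7
    p = np.clip(p, eps, 1.0 - eps)
    pt = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    at = np.where(y == 1, alpha, 1.0 - alpha)  # class-balance weight
    # Easy pixels (pt near 1) are suppressed by the (1 - pt)^gamma factor.
    return float(np.mean(-at * (1.0 - pt) ** gamma * np.log(pt)))

# A confidently-correct background pixel contributes far less than a
# confidently-wrong one.
easy = focal_loss(np.array([0.01]), np.array([0]))
hard = focal_loss(np.array([0.99]), np.array([0]))
```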
14
Beigi P, Salcudean SE, Ng GC, Rohling R. Enhancement of needle visualization and localization in ultrasound. Int J Comput Assist Radiol Surg 2020; 16:169-178. [PMID: 32995981 DOI: 10.1007/s11548-020-02227-7] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Accepted: 07/06/2020] [Indexed: 12/12/2022]
Abstract
PURPOSE This scoping review covers needle visualization and localization techniques in ultrasound, where localization-based approaches mostly aim to compute the needle shaft (and tip) location while potentially enhancing its visibility as well. METHODS A literature review was conducted on state-of-the-art techniques, which can be divided into five categories: (1) signal and image processing-based techniques to augment the needle, (2) modifications to the needle and insertion procedure to help with needle-transducer alignment and visibility, (3) changes to ultrasound image formation, (4) motion-based analysis, and (5) machine learning. RESULTS Advantages, limitations, and challenges of representative examples in each category are discussed. Evaluation techniques used in ex vivo, phantom, and in vivo studies are discussed and summarized. CONCLUSION The greatest limitation of the majority of the literature is reliance on the original visibility of the needle in the static image, and the need for additional or improved apparatus is the greatest barrier to clinical utility in practice. SIGNIFICANCE Ultrasound-guided needle placement is performed in many clinical applications, including biopsies, treatment injections, and anesthesia. Despite the wide range and long history of this technique, needle visibility in ultrasound remains an ongoing challenge. A robust technique that enhances ultrasonic needle visibility, especially for steeply inserted hand-held needles, while maintaining clinical utility requirements is needed.
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada.
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
- Robert Rohling
- Electrical and Computer Engineering Department and Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
15
Gillies DJ, Rodgers JR, Gyacskov I, Roy P, Kakani N, Cool DW, Fenster A. Deep learning segmentation of general interventional tools in two‐dimensional ultrasound images. Med Phys 2020; 47:4956-4970. [DOI: 10.1002/mp.14427] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2020] [Revised: 07/05/2020] [Accepted: 07/21/2020] [Indexed: 12/18/2022] Open
Affiliation(s)
- Derek J. Gillies
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Jessica R. Rodgers
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Igor Gyacskov
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Priyanka Roy
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Nirmal Kakani
- Department of Radiology, Manchester Royal Infirmary, Manchester M13 9WL, UK
- Derek W. Cool
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
- Aaron Fenster
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
16
Machine Learning and Artificial Intelligence in Pediatric Research: Current State, Future Prospects, and Examples in Perioperative and Critical Care. J Pediatr 2020; 221S:S3-S10. [PMID: 32482232 DOI: 10.1016/j.jpeds.2020.02.039] [Citation(s) in RCA: 29] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/18/2019] [Revised: 02/12/2020] [Accepted: 02/19/2020] [Indexed: 01/21/2023]
17
Daoud MI, Abu-Hani AF, Alazrai R. Reliable and accurate needle localization in curvilinear ultrasound images using signature-based analysis of ultrasound beamformed radio frequency signals. Med Phys 2020; 47:2356-2379. [PMID: 32160309 DOI: 10.1002/mp.14126] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2019] [Revised: 12/30/2019] [Accepted: 02/21/2020] [Indexed: 01/26/2023] Open
Abstract
PURPOSE Ultrasound imaging is used in many minimally invasive needle insertion procedures to track the advancing needle, but localizing the needle in ultrasound images can be challenging, particularly at steep insertion angles. Previous methods have been introduced to localize the needle in ultrasound images, but the majority of these methods are based on ultrasound B-mode image analysis, which is affected by needle visibility. To address this limitation, we propose a two-phase, signature-based method to achieve reliable and accurate needle localization in curvilinear ultrasound images based on the beamformed radio frequency (RF) signals that are acquired using conventional ultrasound imaging systems. METHODS In the first phase of our proposed method, the beamformed RF signals are divided into overlapping segments, and these segments are processed to extract needle-specific features that identify the needle echoes. The features are analyzed using a support vector machine classifier to synthesize a quantitative image that highlights the needle. The quantitative image is processed using the Radon transform to achieve a reliable and accurate signature-based estimation of the needle axis. In the second phase, the accuracy of the needle axis estimation is improved by processing the RF samples located around the signature-based estimate of the needle axis using local phase analysis combined with the Radon transform. Moreover, a probabilistic approach is employed to identify the needle tip. The proposed method is used to localize needles of two different sizes inserted in ex vivo animal tissue specimens at various insertion angles. RESULTS Our proposed method achieved reliable and accurate needle localization for an extended range of needle insertion angles, with a failure rate of 0% and mean angle, axis, and tip errors smaller than or equal to 0.7°, 0.6 mm, and 0.7 mm, respectively. Moreover, our proposed method outperformed a recently introduced needle localization method based on B-mode image analysis. CONCLUSIONS These results suggest the potential of employing our signature-based method to achieve reliable and accurate needle localization during ultrasound-guided needle insertion procedures.
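The Radon-transform step used above to estimate the needle axis is closely related to a Hough accumulator over line parameters (theta, rho). A minimal sketch on a binary needle mask (illustrative only, not the paper's RF-signature pipeline):

```python
import numpy as np

def estimate_axis(mask: np.ndarray, n_angles: int = 180):
    """Estimate the dominant straight line in a binary 'needle' mask
    with a minimal Hough accumulator over (theta, rho), the same idea
    as a Radon-transform peak search. Illustrative sketch."""
    ys, xs = np.nonzero(mask)
    thetas = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    diag = int(np.ceil(np.hypot(*mask.shape)))
    acc = np.zeros((n_angles, 2 * diag + 1), dtype=np.int32)
    for t_idx, t in enumerate(thetas):
        # Each foreground pixel votes for rho = x*cos(t) + y*sin(t).
        rho = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc[t_idx], rho, 1)
    t_idx, r_idx = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t_idx], r_idx - diag  # (theta, rho) of the dominant line

# A horizontal run of pixels at row 10 should give theta ~ 90 deg, rho = 10.
mask = np.zeros((32, 32), dtype=bool)
mask[10, 5:25] = True
theta, rho = estimate_axis(mask)
```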
Affiliation(s)
- Mohammad I Daoud
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Ayah F Abu-Hani
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Rami Alazrai
- Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
18
Lee JY, Islam M, Woh JR, Washeem TSM, Ngoh LYC, Wong WK, Ren H. Ultrasound needle segmentation and trajectory prediction using excitation network. Int J Comput Assist Radiol Surg 2020; 15:437-443. [PMID: 31960247 DOI: 10.1007/s11548-019-02113-x] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/07/2019] [Accepted: 12/30/2019] [Indexed: 10/25/2022]
Abstract
PURPOSE Ultrasound (US)-guided percutaneous kidney biopsy is a challenge for interventionists because US artefacts prevent accurate viewing of the biopsy needle tip. Automatic needle tracking and trajectory prediction can increase operator confidence in performing biopsies, reduce procedure time, minimize the risk of inadvertent biopsy bleeding, and enable future image-guided robotic procedures. METHODS In this paper, we propose a tracking-by-segmentation model with spatial and channel "Squeeze and Excitation" (scSE) for US needle detection and trajectory prediction. We adopt a light deep learning architecture (LinkNet) as our segmentation baseline network and integrate the scSE module to learn spatial information for better prediction. The proposed model is trained on US images from anonymized kidney biopsy clips of 8 patients. The needle contour is obtained using the border-following algorithm and its area is calculated using Green's formula. Trajectory prediction is made by extrapolating from the smallest bounding box that can capture the contour. RESULTS We train and test our model on a total of 996 images extracted from 102 short videos at a rate of 3 frames per second from each video. A set of 794 images is used for training and 202 images for testing. Our model achieves an IoU of 41.01%, a Dice score of 56.65%, an F1-score of 36.61%, and a root-mean-square angle error of 13.3[Formula: see text]. We are thus able to predict and extrapolate the trajectory of the biopsy needle with reasonable accuracy for interventionists to better perform biopsies. CONCLUSION Our novel model combining LinkNet and scSE shows promising results for the kidney biopsy application, which implies potential for other similar ultrasound-guided biopsies that require needle tracking and trajectory prediction.
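The IoU and Dice figures reported above are standard overlap metrics for binary segmentation masks. For reference, they can be computed as:

```python
import numpy as np

def iou_and_dice(pred: np.ndarray, gt: np.ndarray):
    """Intersection-over-union and Dice coefficient for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    iou = inter / union if union else 1.0
    dice = 2 * inter / total if total else 1.0
    return float(iou), float(dice)

# Two 2-pixel masks that overlap on exactly 1 pixel.
pred = np.zeros((4, 4), dtype=bool); pred[0, :2] = True
gt = np.zeros((4, 4), dtype=bool); gt[0, 1:3] = True
iou, dice = iou_and_dice(pred, gt)  # intersection = 1, union = 3
```

Note that Dice is always at least as large as IoU (Dice = 2·IoU/(1+IoU)), consistent with the 41% IoU versus 57% Dice reported above.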
Affiliation(s)
- Jia Yi Lee
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Mobarakol Islam
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
- NUS Graduate School for Integrative Sciences and Engineering (NGS), NUS, Singapore, Singapore
- Jing Ru Woh
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- T S Mohamed Washeem
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
- Lee Ying Clara Ngoh
- Division of Nephrology, National University Hospital, Singapore, Singapore
- Department of Medicine, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore
- Weng Kin Wong
- Division of Nephrology, National University Hospital, Singapore, Singapore
- Hongliang Ren
- Faculty of Engineering, National University of Singapore, Singapore, Singapore
- Department of Biomedical Engineering, National University of Singapore, Singapore, Singapore
19
Learning needle tip localization from digital subtraction in 2D ultrasound. Int J Comput Assist Radiol Surg 2019; 14:1017-1026. [DOI: 10.1007/s11548-019-01951-z] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2019] [Accepted: 03/18/2019] [Indexed: 12/19/2022]
20
Classification of Liver Diseases Based on Ultrasound Image Texture Features. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9020342] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/29/2023]
Abstract
This paper discusses using computer-aided diagnosis (CAD) to distinguish between hepatocellular carcinoma (HCC), i.e., the most common type of primary liver malignancy and a leading cause of death in people with cirrhosis worldwide, and liver abscess based on ultrasound image texture features and a support vector machine (SVM) classifier. Among 79 cases of liver diseases including 44 cases of liver cancer and 35 cases of liver abscess, this research extracts 96 features including 52 features of the gray-level co-occurrence matrix (GLCM) and 44 features of the gray-level run-length matrix (GLRLM) from the regions of interest (ROIs) in ultrasound images. Three feature selection models—(i) sequential forward selection (SFS), (ii) sequential backward selection (SBS), and (iii) F-score—are adopted to distinguish the two liver diseases. Finally, the developed system can classify liver cancer and liver abscess by SVM with an accuracy of 88.875%. The proposed methods for CAD can provide diagnostic assistance while distinguishing these two types of liver lesions.
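A gray-level co-occurrence matrix like the ones used above can be computed directly. A minimal sketch for a single offset, with the contrast statistic (real CAD pipelines aggregate many offsets and many statistics, as in the 52 GLCM features here):

```python
import numpy as np

def glcm(img: np.ndarray, levels: int, dx: int = 1, dy: int = 0) -> np.ndarray:
    """Normalized gray-level co-occurrence matrix for one pixel offset
    (dx, dy). Minimal sketch of the GLCM idea."""
    g = np.zeros((levels, levels), dtype=np.float64)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            # Count how often gray level a co-occurs with b at the offset.
            g[img[y, x], img[y + dy, x + dx]] += 1
    return g / g.sum()

def contrast(g: np.ndarray) -> float:
    """GLCM contrast: sum of p(i, j) * (i - j)^2."""
    i, j = np.indices(g.shape)
    return float(np.sum(g * (i - j) ** 2))

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]], dtype=np.int64)
g = glcm(img, levels=4)
c = contrast(g)  # low contrast: most horizontal neighbours are equal
```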
21
Daoud MI, Shtaiyat A, Zayadeen AR, Alazrai R. Accurate Needle Localization Using Two-Dimensional Power Doppler and B-Mode Ultrasound Image Analyses: A Feasibility Study. SENSORS (BASEL, SWITZERLAND) 2018; 18:E3475. [PMID: 30332743 PMCID: PMC6209937 DOI: 10.3390/s18103475] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/20/2018] [Revised: 09/30/2018] [Accepted: 10/09/2018] [Indexed: 01/07/2023]
Abstract
Curvilinear ultrasound transducers are commonly used in various needle insertion interventions, but localizing the needle in curvilinear ultrasound images is usually challenging. In this paper, a new method is proposed to localize the needle in curvilinear ultrasound images by exciting the needle using a piezoelectric buzzer and imaging the excited needle using a curvilinear ultrasound transducer to acquire a power Doppler image and a B-mode image. The needle-induced Doppler responses that appear in the power Doppler image are analyzed to estimate the needle axis initially and identify the candidate regions that are expected to include the needle. The candidate needle regions in the B-mode image are analyzed to improve the localization of the needle axis. The needle tip is determined by analyzing the intensity variations of the power Doppler and B-mode images around the needle axis. The proposed method is employed to localize different needles that are inserted in three ex vivo animal tissue types at various insertion angles, and the results demonstrate the capability of the method to achieve automatic, reliable and accurate needle localization. Furthermore, the proposed method outperformed two existing needle localization methods.
Affiliation(s)
- Mohammad I Daoud
- Department of Computer Engineering, German Jordanian University, Amman 11180, Jordan.
- Ahmad Shtaiyat
- Department of Computer Engineering, German Jordanian University, Amman 11180, Jordan
- Adnan R Zayadeen
- Ultrasound Section, Jordanian Royal Medical Services, Amman 11180, Jordan
- Rami Alazrai
- Department of Computer Engineering, German Jordanian University, Amman 11180, Jordan
22
Mwikirize C, Nosher JL, Hacihaliloglu I. Convolution neural networks for real-time needle detection and localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2018; 13:647-657. [PMID: 29512006 DOI: 10.1007/s11548-018-1721-y] [Citation(s) in RCA: 37] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2018] [Accepted: 02/28/2018] [Indexed: 12/28/2022]
Abstract
PURPOSE We propose a framework for automatic and accurate detection of steeply inserted needles in 2D ultrasound data using convolution neural networks. We demonstrate its application in needle trajectory estimation and tip localization. METHODS Our approach consists of a unified network, comprising a fully convolutional network (FCN) and a fast region-based convolutional neural network (R-CNN). The FCN proposes candidate regions, which are then fed to a fast R-CNN for finer needle detection. We leverage a transfer learning paradigm, where the network weights are initialized by training with non-medical images, and fine-tuned with ex vivo ultrasound scans collected during insertion of a 17G epidural needle into freshly excised porcine and bovine tissue at depth settings up to 9 cm and [Formula: see text]-[Formula: see text] insertion angles. Needle detection results are used to accurately estimate needle trajectory from intensity invariant needle features and perform needle tip localization from an intensity search along the needle trajectory. RESULTS Our needle detection model was trained and validated on 2500 ex vivo ultrasound scans. The detection system has a frame rate of 25 fps on a GPU and achieves 99.6% precision, 99.78% recall rate and an [Formula: see text] score of 0.99. Validation for needle localization was performed on 400 scans collected using a different imaging platform, over a bovine/porcine lumbosacral spine phantom. Shaft localization error of [Formula: see text], tip localization error of [Formula: see text] mm, and a total processing time of 0.58 s were achieved. CONCLUSION The proposed method is fully automatic and provides robust needle localization results in challenging scanning conditions. The accurate and robust results coupled with real-time detection and sub-second total processing make the proposed method promising in applications for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.
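The tip-localization step above performs an intensity search along the estimated needle trajectory. A simplified sketch of that idea (the threshold and the line parametrization are assumptions, not the paper's values):

```python
import numpy as np

def tip_along_trajectory(img: np.ndarray, slope: float, intercept: float,
                         thresh: float = 100.0):
    """Walk an estimated needle trajectory y = slope*x + intercept and
    return the deepest (largest-x) point whose intensity exceeds a
    threshold -- a simplified intensity search for the tip."""
    h, w = img.shape
    tip = None
    for x in range(w):
        y = int(round(slope * x + intercept))
        if 0 <= y < h and img[y, x] > thresh:
            tip = (x, y)  # keep updating: last bright point is the tip
    return tip

# Toy image: a bright needle along y = x + 1 from x = 3 to x = 10.
img = np.zeros((16, 16), dtype=np.float32)
for x in range(3, 11):
    img[x + 1, x] = 180.0
tip = tip_along_trajectory(img, slope=1.0, intercept=1.0)
```

In the paper, the trajectory itself comes from the FCN/R-CNN detections rather than being given, and the search is combined with statistical refinement.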
Affiliation(s)
- Cosmas Mwikirize
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA.
- John L Nosher
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
- Ilker Hacihaliloglu
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
23
Mwikirize C, Nosher JL, Hacihaliloglu I. Signal attenuation maps for needle enhancement and localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2018; 13:363-374. [PMID: 29294213 DOI: 10.1007/s11548-017-1698-y] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2017] [Accepted: 12/20/2017] [Indexed: 10/18/2022]
Abstract
PURPOSE We propose a novel framework for enhancement and localization of steeply inserted hand-held needles under in-plane 2D ultrasound guidance. METHODS Depth-dependent attenuation and non-axial specular reflection hinder visibility of steeply inserted needles. Here, we model signal transmission maps representative of the attenuation probability within the image domain. The maps are employed in a contextual regularization framework to recover needle shaft and tip information. The needle tip is automatically localized by line-fitting along the local-phase-directed trajectory, followed by statistical optimization. RESULTS The proposed method was tested on 300 ex vivo ultrasound scans collected during insertion of an epidural needle into freshly excised porcine and bovine tissue. A tip localization accuracy of [Formula: see text] was achieved. CONCLUSION The proposed method could be useful in challenging procedures where needle shaft and tip are inconspicuous. Improved needle localization results compared to previously proposed methods suggest that the proposed method is promising for further clinical evaluation.
Affiliation(s)
- Cosmas Mwikirize
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA.
- John L Nosher
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
- Ilker Hacihaliloglu
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
24
Pourtaherian A, Scholten HJ, Kusters L, Zinger S, Mihajlovic N, Kolen AF, Zuo F, Ng GC, Korsten HHM, de With PHN. Medical Instrument Detection in 3-Dimensional Ultrasound Data Volumes. IEEE TRANSACTIONS ON MEDICAL IMAGING 2017; 36:1664-1675. [PMID: 28410101 DOI: 10.1109/tmi.2017.2692302] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
Ultrasound-guided medical interventions are broadly applied in diagnostics and therapy, e.g., regional anesthesia or ablation. A guided intervention using 2-D ultrasound is challenging due to the poor instrument visibility, limited field of view, and the multi-fold coordination of the medical instrument and ultrasound plane. Recent 3-D ultrasound transducers can improve the quality of the image-guided intervention if an automated detection of the needle is used. In this paper, we present a novel method for detecting medical instruments in 3-D ultrasound data that is solely based on image processing techniques and validated on various ex vivo and in vivo data sets. In the proposed procedure, the physician is placing the 3-D transducer at the desired position, and the image processing will automatically detect the best instrument view, so that the physician can entirely focus on the intervention. Our method is based on the classification of instrument voxels using volumetric structure directions and robust approximation of the primary tool axis. A novel normalization method is proposed for the shape and intensity consistency of instruments to improve the detection. Moreover, a novel 3-D Gabor wavelet transformation is introduced and optimally designed for revealing the instrument voxels in the volume, while remaining generic to several medical instruments and transducer types. Experiments on diverse data sets, including in vivo data from patients, show that for a given transducer and an instrument type, high detection accuracies are achieved with position errors smaller than the instrument diameter in the 0.5-1.5-mm range on average.
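The 3D Gabor wavelet transform introduced above extends the familiar 2D Gabor filter, a Gaussian-windowed sinusoid tuned to an orientation. A 2D kernel can be built as follows (sketch only; the paper designs a full 3D filter bank optimized for instrument voxels):

```python
import numpy as np

def gabor_kernel(size: int, wavelength: float, theta: float,
                 sigma: float) -> np.ndarray:
    """Real part of a 2D Gabor kernel: a Gaussian envelope multiplied
    by a sinusoid oriented at angle theta. 2D analogue of the 3D
    transform described above."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the carrier runs along direction theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier

k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
```

Convolving an image with a bank of such kernels at several orientations gives strong responses where elongated structures (such as an instrument shaft) align with the filter.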
25
Scholten HJ, Pourtaherian A, Mihajlovic N, Korsten HHM, Bouwman RA. Improving needle tip identification during ultrasound-guided procedures in anaesthetic practice. Anaesthesia 2017; 72:889-904. [DOI: 10.1111/anae.13921] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/23/2017] [Indexed: 12/16/2022]
Affiliation(s)
- H. J. Scholten
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- A. Pourtaherian
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- H. H. M. Korsten
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- R. A. Bouwman
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
26
Beigi P, Rohling R, Salcudean SE, Ng GC. Spectral analysis of the tremor motion for needle detection in curvilinear ultrasound via spatiotemporal linear sampling. Int J Comput Assist Radiol Surg 2016; 11:1183-92. [PMID: 27059024 DOI: 10.1007/s11548-016-1402-7] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2016] [Accepted: 03/23/2016] [Indexed: 11/24/2022]
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada.
- Robert Rohling
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
27

28
Hadjerci O, Hafiane A, Conte D, Makris P, Vieyres P, Delbos A. Computer-aided detection system for nerve identification using ultrasound images: A comparative study. INFORMATICS IN MEDICINE UNLOCKED 2016. [DOI: 10.1016/j.imu.2016.06.003] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022] Open
29
Shen D, Wu G, Zhang D, Suzuki K, Wang F, Yan P. Machine learning in medical imaging. Comput Med Imaging Graph 2015; 41:1-2. [DOI: 10.1016/j.compmedimag.2015.02.001] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]