1
Grube S, Latus S, Behrendt F, Riabova O, Neidhardt M, Schlaefer A. Needle tracking in low-resolution ultrasound volumes using deep learning. Int J Comput Assist Radiol Surg 2024. [PMID: 39002100] [DOI: 10.1007/s11548-024-03234-8]
Abstract
PURPOSE Clinical needle insertion into tissue, commonly assisted by 2D ultrasound imaging for real-time navigation, faces the challenge of precisely aligning the needle and the probe to reduce out-of-plane movement. Recent studies investigate 3D ultrasound imaging together with deep learning to overcome this problem, focusing on acquiring high-resolution images to create optimal conditions for needle tip detection. However, high resolution also requires considerable time for image acquisition and processing, which limits real-time capability. Therefore, we aim to maximize the US volume rate at the cost of low image resolution. We propose a deep learning approach to directly extract the 3D needle tip position from sparsely sampled US volumes. METHODS We design an experimental setup with a robot inserting a needle into water and chicken liver tissue. In contrast to manual annotation, we derive the needle tip position from the known robot pose. During insertion, we acquire a large data set of low-resolution volumes using a 16 × 16 element matrix transducer at a volume rate of 4 Hz. We compare the performance of our deep learning approach with conventional needle segmentation. RESULTS Our experiments in water and liver show that deep learning outperforms the conventional approach while achieving sub-millimeter accuracy, with mean position errors of 0.54 mm in water and 1.54 mm in liver. CONCLUSION Our study underlines the strength of deep learning in predicting 3D needle positions from low-resolution ultrasound volumes. This is an important milestone for real-time needle navigation, simplifying the alignment of needle and ultrasound probe and enabling 3D motion analysis.
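The abstract does not specify the network architecture, but the core idea of regressing a 3D tip position directly from a sparsely sampled volume can be sketched as a small 3D CNN with a three-unit regression head. All layer sizes and the 16 × 16 × 64 input shape below are illustrative assumptions, not the authors' design:

```python
import torch
import torch.nn as nn

class TipRegressor(nn.Module):
    """Minimal 3D CNN mapping a low-resolution US volume to an (x, y, z)
    needle-tip position. Architecture is a placeholder sketch only."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling over the volume
        )
        self.head = nn.Linear(32, 3)  # regress (x, y, z), e.g. in mm

    def forward(self, vol):
        return self.head(self.features(vol).flatten(1))

model = TipRegressor()
vol = torch.randn(2, 1, 16, 16, 64)  # batch of sparsely sampled volumes
tip = model(vol)                     # shape: (2, 3)
```

Training such a model against tip positions derived from the robot pose (rather than manual annotation) would use a standard regression loss such as mean squared error.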
Affiliation(s)
- Sarah Grube
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Sarah Latus
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Finn Behrendt
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Oleksandra Riabova
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Maximilian Neidhardt
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Alexander Schlaefer
  - Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
2
Amiri Tehrani Zade A, Jalili Aziz M, Majedi H, Mirbagheri A, Ahmadian A. Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study. Int J Comput Assist Radiol Surg 2023; 18:1373-1382. [PMID: 36745339] [DOI: 10.1007/s11548-022-02812-y]
Abstract
PURPOSE Accurate needle placement at the target point is critical for ultrasound-guided interventions such as biopsies and epidural injections. However, aligning the needle with the thin imaging plane of the transducer is challenging, and misalignment causes the needle's visibility to the naked eye to decay. We have therefore developed a CNN-based framework to track the needle using spatiotemporal features of the speckle dynamics. METHODS Three key techniques optimize the network for our application. First, we used Gunnar-Farneback (GF) optical flow, a traditional motion field estimation technique, to augment the model input with spatiotemporal features extracted from a stack of consecutive frames. Second, we designed an efficient network based on the state-of-the-art Yolo framework (nYolo). Lastly, an Assisted Excitation (AE) module was added at the neck of the network to handle the class imbalance problem. RESULTS Fourteen freehand ultrasound sequences were collected by steeply inserting an injection needle into Ultrasound Compatible Lumbar Epidural Simulator and Femoral Vascular Access Ezono test phantoms. We divided the dataset into two sub-categories. In the second, more challenging category, in which the needle is totally invisible, the angle and tip localization errors were 2.43 ± 1.14° and 2.3 ± 1.76 mm using Yolov3+GF+AE, and 2.08 ± 1.18° and 2.12 ± 1.43 mm using nYolo+GF+AE. CONCLUSION The proposed method has the potential to track the needle more reliably than other state-of-the-art methods and can accurately localize it in 2D B-mode US images in real time, allowing it to be used in current ultrasound intervention procedures.
Affiliation(s)
- Amin Amiri Tehrani Zade
  - Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
  - Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Maryam Jalili Aziz
  - Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
  - Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Hossein Majedi
  - Pain Research Center, Neuroscience Institute, Tehran University of Medical Sciences, Tehran, Iran
  - Department of Anesthesiology, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Alireza Mirbagheri
  - Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
  - Robotic Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Alireza Ahmadian
  - Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
  - Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
3
Peng C, Cai Q, Chen M, Jiang X. Recent Advances in Tracking Devices for Biomedical Ultrasound Imaging Applications. Micromachines 2022; 13:1855. [PMID: 36363876] [PMCID: PMC9695235] [DOI: 10.3390/mi13111855]
Abstract
With the rapid advancement of tracking technologies, the applications of tracking systems in ultrasound imaging have expanded across a wide range of fields. In this review article, we discuss the basic tracking principles, system components, performance analyses, and main sources of error for popular tracking technologies used in ultrasound imaging. In light of the growing demand for object tracking, this article explores both the potential and the challenges of different tracking technologies applied to various ultrasound imaging applications, including freehand 3D ultrasound imaging, ultrasound image fusion, and ultrasound-guided intervention and treatment. Recent developments in tracking technology have increased the accuracy and intuitiveness of ultrasound imaging and navigation while reducing the reliance on operator skill, thereby benefiting medical diagnosis and treatment. Although commercially available tracking systems can achieve sub-millimeter resolution for positional tracking and sub-degree resolution for orientational tracking, they suffer from a number of disadvantages, including high cost and time-consuming calibration procedures. While some emerging tracking technologies are still at the research stage, their potential has been demonstrated in terms of compactness, light weight, and easy integration with existing standard or portable ultrasound machines.
Affiliation(s)
- Chang Peng
  - School of Biomedical Engineering, ShanghaiTech University, Shanghai 201210, China
- Qianqian Cai
  - Department of Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA
- Mengyue Chen
  - Department of Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA
- Xiaoning Jiang
  - Department of Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA
4
Daoud MI, Abu-Hani AF, Shtaiyat A, Ali MZ, Alazrai R. Needle detection using ultrasound B-mode and power Doppler analyses. Med Phys 2022; 49:4999-5013. [PMID: 35608237] [DOI: 10.1002/mp.15725]
Abstract
BACKGROUND Ultrasound is employed in needle interventions to visualize anatomical structures and track the needle. Nevertheless, needle detection in ultrasound images is a difficult task, particularly at steep insertion angles. PURPOSE A new method is presented to enable effective needle detection using ultrasound B-mode and power Doppler analyses. METHODS A small buzzer excites the needle, and an ultrasound system acquires B-mode and power Doppler images of it. The B-mode and power Doppler images are processed using the Radon transform and local phase analysis to obtain an initial detection of the needle axis. This detection is improved by processing the power Doppler image with alpha-shape analysis to define a region of interest (ROI) that contains the needle. A set of feature maps is then extracted from the ROI in the B-mode image and processed by a machine learning classifier to construct a likelihood image that visualizes the posterior needle likelihood of each pixel. The Radon transform is applied to the likelihood image to refine the needle axis detection. Finally, the region of the B-mode image surrounding the needle axis is analyzed to identify the needle tip using a custom probabilistic approach. Our method was used to detect needles inserted in ex vivo animal tissues at shallow [20°, 40°), moderate [40°, 60°), and steep [60°, 85°] angles. RESULTS Our method detected the needles with a failure rate of 0% and mean angle, axis, and tip errors of at most 0.7°, 0.6 mm, and 0.7 mm, respectively. Additionally, our method achieved favorable results compared to two recently introduced needle detection methods. CONCLUSIONS The results indicate the potential of our method to achieve effective needle detection in ultrasound images.
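The Radon-transform step, recovering a straight needle axis as the dominant line in a likelihood image, can be sketched with scikit-image. The synthetic image and the plain argmax over the sinogram below are illustrative simplifications of the paper's pipeline:

```python
import numpy as np
from skimage.transform import radon

def needle_axis_angle(likelihood):
    """Estimate the dominant line orientation in a needle-likelihood
    image via the Radon transform: the (offset, angle) sinogram bin
    with the highest integral corresponds to the brightest straight
    structure in the image."""
    theta = np.arange(0.0, 180.0)              # projection angles in degrees
    sinogram = radon(likelihood, theta=theta)  # shape: (offsets, angles)
    _, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    return theta[angle_idx]

# synthetic likelihood image containing one bright horizontal line
img = np.zeros((64, 64))
img[32, :] = 1.0
angle = needle_axis_angle(img)  # ~90 degrees under skimage's convention
```

The offset index of the same argmax locates the line's perpendicular distance from the image center, so the full axis (angle plus offset) falls out of one transform.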
Affiliation(s)
- Mohammad I Daoud
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Ayah F Abu-Hani
  - Department of Electrical and Computer Engineering, Technical University of Munich, Munich, 80333, Germany
- Ahmad Shtaiyat
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Mostafa Z Ali
  - Department of Computer Information Systems, Jordan University of Science and Technology, Irbid, 22110, Jordan
- Rami Alazrai
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
5
Zhang Y, He X, Tian Z, Jeong JJ, Lei Y, Wang T, Zeng Q, Jani AB, Curran WJ, Patel P, Liu T, Yang X. Multi-Needle Detection in 3D Ultrasound Images Using Unsupervised Order-Graph Regularized Sparse Dictionary Learning. IEEE Trans Med Imaging 2020; 39:2302-2315. [PMID: 31985414] [PMCID: PMC7370243] [DOI: 10.1109/tmi.2020.2968770]
Abstract
Accurate and automatic multi-needle detection in three-dimensional (3D) ultrasound (US) is a key step of treatment planning for US-guided brachytherapy. However, most current studies concentrate on single-needle detection using only a small number of images that contain a needle, disregarding the massive database of US images without needles. In this paper, we propose a workflow for multi-needle detection that treats the needle-free images as auxiliary data. Concretely, we train position-specific dictionaries on 3D overlapping patches of the auxiliary images, using an enhanced sparse dictionary learning method that integrates the spatial continuity of 3D US, dubbed order-graph regularized dictionary learning. Using the learned dictionaries, target images are reconstructed to obtain residual pixels, which are then clustered in every slice to yield centers. From these centers, regions of interest (ROIs) are constructed by seeking cylinders. Finally, we detect a needle in each ROI using the random sample consensus (RANSAC) algorithm and locate its tip by finding the sharp intensity drop along the detected axis. Extensive experiments were conducted on a phantom dataset and a prostate dataset of 70/21 patients without/with needles. Visualizations and quantitative results show the effectiveness of our proposed workflow. Specifically, our method correctly detects 95% of needles with a tip location error of 1.01 mm on the prostate dataset. This technique provides accurate multi-needle detection for US-guided HDR prostate brachytherapy, facilitating the clinical workflow.
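The per-ROI RANSAC step can be sketched as a basic two-point line-hypothesis loop over candidate residual points in 3D. This is a minimal sketch; the iteration count and inlier tolerance are placeholder assumptions:

```python
import numpy as np

def ransac_line_3d(points, n_iter=200, tol=1.0, rng=None):
    """Fit a 3D line to candidate needle points with a basic RANSAC loop:
    repeatedly pick two points, count inliers within `tol` of the line
    through them, and keep the hypothesis with the most inliers."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm == 0:
            continue
        d = d / norm
        v = points - p
        # perpendicular distance of every point to the hypothesized line
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# synthetic ROI: 40 on-needle points plus 10 scattered outliers
data_rng = np.random.default_rng(1)
t = np.linspace(0, 20, 40)
line = np.stack([t, 2 * t, 0.5 * t], axis=1) + 0.05 * data_rng.standard_normal((40, 3))
pts = np.vstack([line, 20 * data_rng.random((10, 3))])
mask = ransac_line_3d(pts, rng=0)  # mask[:40] flags nearly all line points
```

The tip would then be found by sorting the inliers along the fitted direction and scanning the image intensity along that axis for the sharp drop the abstract describes.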
6
Zhang Y, Lei Y, Qiu RLJ, Wang T, Wang H, Jani AB, Curran WJ, Patel P, Liu T, Yang X. Multi-needle Localization with Attention U-Net in US-guided HDR Prostate Brachytherapy. Med Phys 2020; 47:2735-2745. [PMID: 32155666] [DOI: 10.1002/mp.14128]
Abstract
PURPOSE Ultrasound (US)-guided high-dose-rate (HDR) prostate brachytherapy requires clinicians to place HDR needles (catheters) into the prostate gland under transrectal US (TRUS) guidance in the operating room. The quality of the subsequent radiation treatment plan is largely dictated by the needle placements, which vary with the experience level of the clinicians and the procedure protocols. A real-time plan dose distribution, if available, could be a vital tool for more objective assessment of the needle placements, potentially improving the radiation plan quality and the treatment outcome. However, due to the low signal-to-noise ratio (SNR) of US imaging, real-time multi-needle segmentation in 3D TRUS, the major obstacle to real-time dose mapping, has not been realized to date. In this study, we propose a deep learning-based method that enables accurate and real-time digitization of multiple needles in the 3D TRUS images of HDR prostate brachytherapy. METHODS A deep learning model based on the U-Net architecture was developed to segment multiple needles in 3D TRUS images. Attention gates were incorporated into the model to improve the prediction of small needle points. Furthermore, the spatial continuity of needles was encoded into the model with total variation (TV) regularization. The combined network was trained on 3D TRUS patches with a deep supervision strategy, with binary needle annotation images provided as ground truth. The trained network was then used to localize and segment the HDR needles in a new patient's TRUS images. We evaluated our proposed method using needle shaft and tip errors against manually defined ground truth and compared it with other state-of-the-art methods (U-Net and deeply supervised attention U-Net). RESULTS Our method detected 96% of the 339 needles from 23 HDR prostate brachytherapy patients, with a shaft error of 0.290 ± 0.236 mm and a tip error of 0.442 ± 0.831 mm. For shaft localization, our method achieved 96% of localizations with less than 0.8 mm error (needle diameter is 1.67 mm), while for tip localization it yielded 75% of needles with 0 mm error and 21% with 2 mm error (TRUS image slice thickness is 2 mm). No significant difference is observed (P = 0.83) in tip localization between our results and the ground truth. Compared with U-Net and deeply supervised attention U-Net, the proposed method delivers a significant improvement in both shaft error and tip error (P < 0.05). CONCLUSIONS We proposed a new segmentation method to precisely localize the tips and shafts of multiple needles in 3D TRUS images of HDR prostate brachytherapy. The 3D rendering of the needles can help clinicians evaluate the needle placements. It paves the way for real-time plan dose assessment tools that can further elevate the quality and outcome of HDR prostate brachytherapy.
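The total variation term used to encode spatial continuity can be sketched as an anisotropic TV penalty added to a standard segmentation loss. The exact formulation and weight are assumptions, since the abstract does not give them:

```python
import torch

def tv_loss_3d(pred):
    """Anisotropic total-variation penalty on a predicted needle-probability
    volume of shape (B, C, D, H, W): sums absolute differences between
    neighbouring voxels along each axis, which discourages isolated
    false-positive voxels and favours spatially continuous (needle-like)
    segmentations. Normalized by the voxel count."""
    dd = (pred[:, :, 1:] - pred[:, :, :-1]).abs().sum()
    dh = (pred[:, :, :, 1:] - pred[:, :, :, :-1]).abs().sum()
    dw = (pred[:, :, :, :, 1:] - pred[:, :, :, :, :-1]).abs().sum()
    return (dd + dh + dw) / pred.numel()

# combined training objective: segmentation loss + lambda * TV (lambda assumed)
pred = torch.sigmoid(torch.randn(1, 1, 8, 32, 32))
target = torch.zeros(1, 1, 8, 32, 32)
bce = torch.nn.functional.binary_cross_entropy(pred, target)
loss = bce + 0.1 * tv_loss_3d(pred)
```

A perfectly uniform prediction incurs zero TV penalty, so the term only pushes back against spatial discontinuities, not against the segmentation loss itself.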
Affiliation(s)
- Yupei Zhang
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Yang Lei
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Richard L J Qiu
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Tonghe Wang
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Hesheng Wang
  - Department of Radiation Oncology, New York University, New York, NY, USA
- Ashesh B Jani
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Walter J Curran
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Pretesh Patel
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Tian Liu
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
- Xiaofeng Yang
  - Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
7
Daoud MI, Abu-Hani AF, Alazrai R. Reliable and accurate needle localization in curvilinear ultrasound images using signature-based analysis of ultrasound beamformed radio frequency signals. Med Phys 2020; 47:2356-2379. [PMID: 32160309] [DOI: 10.1002/mp.14126]
Abstract
PURPOSE Ultrasound imaging is used in many minimally invasive needle insertion procedures to track the advancing needle, but localizing the needle in ultrasound images can be challenging, particularly at steep insertion angles. Previous methods have been introduced to localize the needle, but the majority are based on ultrasound B-mode image analysis, which is affected by the needle's visibility. To address this limitation, we propose a two-phase, signature-based method to achieve reliable and accurate needle localization in curvilinear ultrasound images based on the beamformed radio frequency (RF) signals acquired by conventional ultrasound imaging systems. METHODS In the first phase of our proposed method, the beamformed RF signals are divided into overlapping segments, which are processed to extract needle-specific features that identify the needle echoes. The features are analyzed using a support vector machine classifier to synthesize a quantitative image that highlights the needle. The quantitative image is processed using the Radon transform to achieve a reliable and accurate signature-based estimate of the needle axis. In the second phase, the accuracy of this estimate is improved by processing the RF samples located around it using local phase analysis combined with the Radon transform. Moreover, a probabilistic approach is employed to identify the needle tip. The proposed method was used to localize needles of two different sizes inserted in ex vivo animal tissue specimens at various insertion angles. RESULTS Our proposed method achieved reliable and accurate needle localization over an extended range of insertion angles, with a failure rate of 0% and mean angle, axis, and tip errors smaller than or equal to 0.7°, 0.6 mm, and 0.7 mm, respectively. Moreover, it outperformed a recently introduced needle localization method based on B-mode image analysis. CONCLUSIONS These results suggest the potential of employing our signature-based method to achieve reliable and accurate needle localization during ultrasound-guided needle insertion procedures.
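The first-phase pipeline, features from overlapping RF segments fed to an SVM classifier, can be sketched as follows. The features, segment lengths, and toy data are illustrative stand-ins for the paper's needle-specific features:

```python
import numpy as np
from sklearn.svm import SVC

def segment_features(rf_line, seg_len=64, hop=32):
    """Split one beamformed RF line into overlapping segments and compute
    simple per-segment features: energy, peak amplitude, and zero-crossing
    rate. These are generic stand-in features, not the paper's."""
    feats = []
    for start in range(0, len(rf_line) - seg_len + 1, hop):
        seg = rf_line[start:start + seg_len]
        energy = np.mean(seg ** 2)
        peak = np.max(np.abs(seg))
        zcr = np.mean(np.abs(np.diff(np.sign(seg))) > 0)
        feats.append([energy, peak, zcr])
    return np.array(feats)

rng = np.random.default_rng(0)
# toy data: "needle" RF lines carry a strong specular echo, "tissue" lines do not
tissue = rng.normal(0, 0.1, (20, 256))
needle = rng.normal(0, 0.1, (20, 256))
needle[:, 100:120] += 2.0  # bright echo region
X = np.vstack([segment_features(r).mean(axis=0)
               for r in np.vstack([tissue, needle])])
y = np.array([0] * 20 + [1] * 20)
clf = SVC().fit(X, y)  # classify lines as needle vs. tissue
```

In the actual method the classifier scores would be assembled per segment into the quantitative image that the Radon transform then scans for the needle axis.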
Affiliation(s)
- Mohammad I Daoud
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Ayah F Abu-Hani
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan
- Rami Alazrai
  - Department of Computer Engineering, German Jordanian University, Amman, 11180, Jordan