1
Wen M, Shcherbakov P, Xu Y, Li J, Hu Y, Zhou Q, Liang H, Yuan L, Zhang X. A temporal enhanced semi-supervised training framework for needle segmentation in 3D ultrasound images. Phys Med Biol 2024;69:115023. PMID: 38684166. DOI: 10.1088/1361-6560/ad450b.
Abstract
Objective. Automated biopsy needle segmentation in 3D ultrasound images can be used for biopsy navigation, but it is quite challenging due to low ultrasound image resolution and interference with an appearance similar to the needle. Deep learning networks such as convolutional neural networks and transformers have been investigated for 3D medical image segmentation. However, these segmentation methods require large amounts of labeled training data, have difficulty meeting real-time segmentation requirements, and involve high memory consumption. Approach. In this paper, we propose a temporal information-based semi-supervised training framework for fast and accurate needle segmentation. First, a novel circle transformer module based on static and dynamic features is placed after the encoders to extract and fuse temporal information. Then, consistency constraints between the outputs before and after combining temporal information provide semi-supervision for unlabeled volumes. Finally, the model is trained using a loss function that combines cross-entropy and Dice similarity coefficient (DSC) based segmentation losses with a mean square error based consistency loss. The trained model, taking a single ultrasound volume as input, is applied to needle segmentation. Main results. Experimental results on three needle ultrasound datasets acquired during beagle biopsy show that our approach is superior to the most competitive mainstream temporal segmentation model and semi-supervised method, providing a higher DSC (77.1% versus 76.5%) and smaller needle tip position (1.28 mm versus 1.87 mm) and length (1.78 mm versus 2.19 mm) errors on the kidney dataset, as well as a higher DSC (78.5% versus 76.9%) and smaller needle tip position (0.86 mm versus 1.12 mm) and length (1.01 mm versus 1.26 mm) errors on the prostate dataset. Significance. The proposed method significantly enhances needle segmentation accuracy by training with sequential images at no additional cost, which may further improve the effectiveness of biopsy navigation systems.
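As a rough illustration of the loss described above, the following numpy sketch combines cross-entropy and Dice segmentation terms with an MSE consistency term between the outputs with and without temporal fusion. The function names and the equal weighting (`lam`) are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2|P∩T| / (|P| + |T|)."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def bce_loss(pred, target, eps=1e-6):
    """Binary cross-entropy averaged over voxels."""
    p = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(target * np.log(p) + (1 - target) * np.log(1 - p)))

def consistency_loss(out_plain, out_temporal):
    """MSE between predictions before and after temporal fusion."""
    return float(np.mean((out_plain - out_temporal) ** 2))

def total_loss(pred, target, out_plain, out_temporal, lam=1.0):
    """Supervised (CE + Dice) loss plus weighted unsupervised consistency loss."""
    return (bce_loss(pred, target) + dice_loss(pred, target)
            + lam * consistency_loss(out_plain, out_temporal))
```

For unlabeled volumes only the consistency term contributes, which is how this kind of framework extracts a training signal from unannotated data.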
Affiliation(s)
- Mingwei Wen
- Department of Biomedical Engineering, College of Life Science and Technology, Huazhong University of Science and Technology, No 1037, Luoyu Road, Wuhan 430074, People's Republic of China
- Pavel Shcherbakov
- Institute for Control Science, Russian Academy of Sciences, 65, Profsoyuznaya str., Moscow 117997, Russia
- Yang Xu
- Department of Biomedical Engineering, College of Life Science and Technology, Huazhong University of Science and Technology, No 1037, Luoyu Road, Wuhan 430074, People's Republic of China
- Hubei Medical Devices Quality Supervision and Test Institute, Wuhan 430075, People's Republic of China
- Jing Li
- Hubei Medical Devices Quality Supervision and Test Institute, Wuhan 430075, People's Republic of China
- Yi Hu
- Hubei Medical Devices Quality Supervision and Test Institute, Wuhan 430075, People's Republic of China
- Quan Zhou
- Department of Biomedical Engineering, College of Life Science and Technology, Huazhong University of Science and Technology, No 1037, Luoyu Road, Wuhan 430074, People's Republic of China
- Huageng Liang
- Department of Urology, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, No 13, Hangkong Road, Wuhan 430022, People's Republic of China
- Li Yuan
- Department of Ultrasound Imaging, Wuhan Children's Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, People's Republic of China
- Xuming Zhang
- Department of Biomedical Engineering, College of Life Science and Technology, Huazhong University of Science and Technology, No 1037, Luoyu Road, Wuhan 430074, People's Republic of China
2
Hui X, Rajendran P, Ling T, Dai X, Xing L, Pramanik M. Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth. Photoacoustics 2023;34:100575. PMID: 38174105. PMCID: PMC10761306. DOI: 10.1016/j.pacs.2023.100575.
Abstract
Accurate needle guidance is crucial for safe and effective clinical diagnosis and treatment procedures. Conventional ultrasound (US)-guided needle insertion often encounters challenges in consistently and precisely visualizing the needle, necessitating the development of reliable methods to track it. As a powerful tool in image processing, deep learning has shown promise for enhancing needle visibility in US images, although its dependence on manual annotation or simulated data as ground truth can introduce bias or hinder generalization to real US images. Photoacoustic (PA) imaging has demonstrated its capability for high-contrast needle visualization. In this study, we explore the potential of PA imaging as a reliable ground truth for deep learning network training without the need for expert annotation. Our network (UIU-Net), trained on ex vivo tissue image datasets, shows remarkable precision in localizing needles within US images. Needle segmentation performance was evaluated on previously unseen ex vivo data and on in vivo human data collected from an open-source data repository. Specifically, for human data, the Modified Hausdorff Distance (MHD) value is approximately 3.73 and the targeting error is around 2.03, indicating strong similarity and small orientation deviation between the predicted and actual needle locations. A key advantage of our method is its applicability beyond US images captured from a specific imaging system, extending to images from other US imaging systems.
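The Modified Hausdorff Distance used for evaluation above can be computed as follows. This is the standard Dubuisson-Jain formulation, sketched with numpy; the paper's exact evaluation code is not shown, so treat this as a generic reference implementation.

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff Distance between two point sets a and b.

    d(A, B) = mean over points of A of the distance to the nearest point of B;
    MHD = max(d(A, B), d(B, A)).
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Pairwise Euclidean distances between the two point sets.
    dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    d_ab = dists.min(axis=1).mean()  # forward mean-of-min distance
    d_ba = dists.min(axis=0).mean()  # backward mean-of-min distance
    return max(d_ab, d_ba)
```

Applied to predicted and ground-truth needle pixel coordinates, a smaller MHD indicates closer agreement between the two needle shapes.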
Affiliation(s)
- Xie Hui
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- Praveenbalaji Rajendran
- Department of Radiation Oncology, Stanford University, Stanford, California 94305, United States
- Tong Ling
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 637459, Singapore
- Xianjin Dai
- Department of Radiation Oncology, Stanford University, Stanford, California 94305, United States
- Lei Xing
- Department of Radiation Oncology, Stanford University, Stanford, California 94305, United States
- Manojit Pramanik
- Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50011, United States
3
Lin X, Shi H, Fan X, Wang J, Fu Z, Chen Y, Chen S, Chen X, Chen M. Handheld interventional ultrasound/photoacoustic puncture needle navigation based on deep learning segmentation. Biomed Opt Express 2023;14:5979-5993. PMID: 38021141. PMCID: PMC10659795. DOI: 10.1364/boe.504999.
Abstract
Interventional ultrasound (US) faces challenges in accurately localizing the puncture needle due to intrinsic acoustic interferences, which lead to blurred, indistinct, and even invisible needles in handheld linear-array-transducer-based US navigation, with incorrect needle tip positioning being especially problematic. Photoacoustic (PA) imaging can provide complementary image contrast without additional data acquisition. Herein, we propose internal illumination to light up only the needle tip in PA imaging. Deep-learning-based feature segmentation then alleviates acoustic interferences, enhancing needle shaft and tip visibility. Further, needle shaft-tip compensation aligns the needle shaft in the US image with the needle tip in the PA image. Experiments were piloted on a phantom, ex vivo chicken breast, preclinical radiofrequency ablation, and in vivo biopsy of sentinel lymph nodes. The target registration error reaches the submillimeter level, achieving precise puncture needle tracking with in-plane US/PA navigation.
Affiliation(s)
- Xiangwei Lin
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Hongji Shi
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Xiaozhou Fan
- Department of Ultrasound, Air Force Medical Center, Air Force Medical University, 30 Fucheng Road, Beijing 100142, China
- Jiaxin Wang
- School of Chinese Materia Medica, Beijing University of Chinese Medicine, 11 Huandong Road, Beijing 102488, China
- Zhenyu Fu
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Yuqing Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Siping Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Xin Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Mian Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
4
Lavenir L, Santos JC, Zemiti N, Kaderbay A, Venail F, Poignet P. Miniaturized Endoscopic 2D US Transducer for Volumetric Ultrasound Imaging of the Auditory System. IEEE Trans Biomed Eng 2023;70:2624-2635. PMID: 37027277. DOI: 10.1109/tbme.2023.3260683.
Abstract
OBJECTIVE In this paper, we focus on the implementation and validation of minimally invasive three-dimensional (3D) ultrasound (US) imaging of the auditory system, based on a new miniaturized endoscopic 2D US transducer. METHODS This unique probe consists of an 18 MHz, 24-element curved array transducer with a distal diameter of 4 mm, so it can be inserted into the external auditory canal. A typical acquisition is achieved by rotating the transducer around its own axis using a robotic platform. A US volume is then reconstructed from the set of B-scans acquired during the rotation using scan conversion. The accuracy of the reconstruction procedure is evaluated using a dedicated phantom that includes a set of wires as reference geometry. RESULTS Twelve acquisitions obtained from different probe poses are compared to a micro-computed tomographic model of the phantom, yielding a maximum error of 0.20 mm. Additionally, acquisitions with a cadaveric head highlight the clinical applicability of this setup. Structures of the auditory system such as the ossicles and the round window can be identified in the obtained 3D volumes. CONCLUSION These results confirm that our technique enables accurate imaging of the middle and inner ear without having to deteriorate the surrounding bone. SIGNIFICANCE Since US is a real-time, widely available, and non-ionizing imaging modality, our acquisition setup could facilitate minimally invasive diagnosis and surgical navigation for otology in a fast, cost-effective, and safe way.
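The rotational acquisition and scan-conversion step can be sketched as a nearest-neighbour resampling from the acquired B-scan planes into a Cartesian volume. This is a simplified illustration assuming flat planes that contain the rotation axis; grid size, interpolation, and the curved-array geometry of the real probe are abstracted away, and all names are illustrative.

```python
import numpy as np

def scan_convert(b_scans, angles_deg, out_size=65, half_width=1.0):
    """Nearest-neighbour scan conversion: B-scans acquired while rotating the
    transducer about its own axis are resampled into a Cartesian volume.

    b_scans: (n_angles, n_depth, n_lateral) array; B-scan k lies in the plane
    containing the rotation axis at rotation angle angles_deg[k].
    Returns an (out_size, out_size, n_depth) volume; unsampled voxels stay 0.
    """
    n_ang, n_dep, n_lat = b_scans.shape
    angles = np.deg2rad(np.asarray(angles_deg, dtype=float)) % np.pi
    vol = np.zeros((out_size, out_size, n_dep))
    xs = np.linspace(-half_width, half_width, out_size)
    for i, x in enumerate(xs):
        for j, y in enumerate(xs):
            rho = np.hypot(x, y)
            if rho > half_width:
                continue  # outside the swept cylinder
            phi = np.arctan2(y, x)
            plane = phi % np.pi                        # which B-scan plane
            side = 1.0 if 0 <= phi < np.pi else -1.0   # which half of that plane
            k = int(np.argmin(np.abs(angles - plane))) # nearest acquired plane
            # Map the signed lateral position onto an element index.
            m = int(round((side * rho + half_width) / (2 * half_width) * (n_lat - 1)))
            vol[i, j, :] = b_scans[k, :, m]
    return vol
```

A production implementation would interpolate between neighbouring planes and handle the angular wrap-around near 180°, but the inverse mapping from voxel to (plane, lateral, depth) coordinates is the core of the reconstruction.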
5
Park S, Beom DG, Bae EH, Kim SW, Kim DJ, Kim CS. Model-Based Needle Identification Using Image Analysis and Needle Library Matching for Ultrasound-Guided Kidney Biopsy: A Feasibility Study. Ultrasound Med Biol 2023;49:1699-1708. PMID: 37137741. DOI: 10.1016/j.ultrasmedbio.2023.03.009.
Abstract
OBJECTIVE The aim of the work described here was to determine the feasibility of a novel biopsy needle detection technique that achieves high sensitivity and specificity despite the trade-off among resolution, detectability, and imaging depth. METHODS The proposed needle detection method consists of model-based image analysis, temporal needle projection, and needle library matching: (i) image analysis was formulated under the signal decomposition framework; (ii) temporal projection converted the time-resolved needle dynamics into a single image of the desired needle; and (iii) the enhanced needle structure was spatially refined by matching a long, straight linear object in the needle library. Efficacy was examined across different levels of needle visibility. RESULTS Our method eliminated the confounding effects of background tissue artifacts more robustly than conventional methods, improving needle visibility even at low contrast between the needle and tissue. The improved needle structure in turn improved estimation of the trajectory angle and tip position. CONCLUSION Our three-step needle detection method can reliably detect needle position without the need for external devices, increasing needle conspicuity and reducing motion sensitivity.
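Step (ii), temporal needle projection, can be illustrated with a minimal sketch: frame differencing highlights needle motion, and a maximum projection over time collapses the track into a single image. The differencing scheme here is an assumption for illustration, not the paper's exact signal decomposition.

```python
import numpy as np

def temporal_projection(frames):
    """Collapse time-resolved needle dynamics into one enhancement image.

    Frame-to-frame absolute differences highlight the moving needle, and a
    per-pixel maximum projection over time accumulates the needle track.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))  # motion between consecutive frames
    return diffs.max(axis=0)                 # per-pixel max over time
```

Static background cancels in the differences, so only pixels the needle crossed retain a strong response in the projected image.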
Affiliation(s)
- Suhyung Park
- Department of Computer Engineering, Chonnam National University, Gwangju, Republic of Korea; Department of ICT Convergence System Engineering, Chonnam National University, Gwangju, Republic of Korea
- Dong Gyu Beom
- Department of Computer Engineering, Chonnam National University, Gwangju, Republic of Korea
- Eun Hui Bae
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
- Soo Wan Kim
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
- Dong Joon Kim
- Department of Anesthesiology and Pain Medicine, Chosun University Medical School, Gwangju, Republic of Korea; Department of Anesthesiology and Pain Medicine, Chosun University Hospital, Gwangju, Republic of Korea
- Chang Seong Kim
- Department of Internal Medicine, Chonnam National University Medical School, Gwangju, Republic of Korea; Department of Internal Medicine, Chonnam National University Hospital, Gwangju, Republic of Korea
6
Ciocan RA, Graur F, Ciocan A, Cismaru CA, Pintilie SR, Berindan-Neagoe I, Hajjar NA, Gherman CD. Robot-Guided Ultrasonography in Surgical Interventions. Diagnostics (Basel) 2023;13:2456. PMID: 37510199. PMCID: PMC10378616. DOI: 10.3390/diagnostics13142456.
Abstract
INTRODUCTION The introduction of robot-guided procedures into surgical techniques has increased the accuracy and control of resections. Surgery has evolved since the development of laparoscopy, which added visualisation of the peritoneal cavity from a different perspective. Multi-armed robots combined with real-time intraoperative imaging devices bring important improvements in manoeuvrability and dexterity in certain surgical fields. MATERIALS AND METHODS The present study synthesises the development of imaging techniques, with a focus on ultrasonography in robotic surgery, over the last ten years of abdominal surgical interventions. RESULTS All studies involved abdominal surgery. Of the seven studies, two were performed as clinical trials. The other five were performed on organs or simulators and attempted to develop a hybrid surgical technique combining ultrasonography and robotic surgery. Most studies aim to surgically identify both blood vessels and nerve structures through this combined technique (surgery and imaging). CONCLUSIONS Ultrasonography is often used in minimally invasive surgical techniques. It aids the visualisation of blood vessels, the correct identification of tumour margins, and the localisation of surgical instruments in the tissue. The development of ultrasound technology from 2D to 3D and 4D has brought improvements in minimally invasive and robotic surgical techniques, and it should be further studied to bring surgery to a higher level.
Affiliation(s)
- Răzvan Alexandru Ciocan
- Department of Surgery-Practical Abilities, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Marinescu Street, No. 23, 400337 Cluj-Napoca, Romania
- Florin Graur
- Department of Surgery, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Croitorilor Street, No. 19-21, 400162 Cluj-Napoca, Romania
- Andra Ciocan
- Department of Surgery, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Croitorilor Street, No. 19-21, 400162 Cluj-Napoca, Romania
- Cosmin Andrei Cismaru
- Research Center for Functional Genomics, Biomedicine and Translational Medicine, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Victor Babeș Street, No. 8, 400347 Cluj-Napoca, Romania
- Sebastian Romeo Pintilie
- "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Victor Babeș Street, No. 8, 400347 Cluj-Napoca, Romania
- Ioana Berindan-Neagoe
- Research Center for Functional Genomics, Biomedicine and Translational Medicine, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Victor Babeș Street, No. 8, 400347 Cluj-Napoca, Romania
- Nadim Al Hajjar
- Department of Surgery, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Croitorilor Street, No. 19-21, 400162 Cluj-Napoca, Romania
- Claudia Diana Gherman
- Department of Surgery-Practical Abilities, "Iuliu Hațieganu" University of Medicine and Pharmacy Cluj-Napoca, Marinescu Street, No. 23, 400337 Cluj-Napoca, Romania
7
Shi M, Zhao T, West SJ, Desjardins AE, Vercauteren T, Xia W. Improving needle visibility in LED-based photoacoustic imaging using deep learning with semi-synthetic datasets. Photoacoustics 2022;26:100351. PMID: 35495095. PMCID: PMC9048160. DOI: 10.1016/j.pacs.2022.100351.
Abstract
Photoacoustic imaging has shown great potential for guiding minimally invasive procedures through accurate identification of critical tissue targets and invasive medical devices (such as metallic needles). The use of light emitting diodes (LEDs) as excitation light sources accelerates its clinical translation owing to their high affordability and portability. However, needle visibility in LED-based photoacoustic imaging is compromised, primarily due to the low optical fluence. In this work, we propose a deep learning framework based on U-Net to improve the visibility of clinical metallic needles with an LED-based photoacoustic and ultrasound imaging system. To address the difficulty of capturing ground truth for real data and the poor realism of purely simulated data, the framework includes the generation of semi-synthetic training datasets that combine simulated data, representing features of the needles, with in vivo measurements for the tissue background. The trained neural network was evaluated with needle insertions into blood-vessel-mimicking phantoms, pork joint tissue ex vivo, and measurements on human volunteers. Compared to conventional reconstruction, this deep learning-based framework substantially improved needle visibility in photoacoustic imaging in vivo by suppressing background noise and image artefacts, achieving 5.8- and 4.5-fold improvements in signal-to-noise ratio and modified Hausdorff distance, respectively. Thus, the proposed framework could help reduce complications during percutaneous needle insertions through accurate identification of clinical needles in photoacoustic imaging.
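The semi-synthetic dataset generation described above can be sketched as blending a simulated needle signal into a real background frame, with the needle mask serving as the training label. The additive blending model and Gaussian noise are illustrative assumptions; the paper's actual simulation of needle features is more involved.

```python
import numpy as np

def make_semi_synthetic(background, needle_mask, amplitude=2.0, noise_std=0.05, rng=None):
    """Compose one semi-synthetic training pair.

    A simulated needle signal (here: a scaled binary mask) is added to a real
    background frame, plus Gaussian noise; the mask itself is the ground-truth
    label, so no manual annotation is needed.
    """
    rng = np.random.default_rng(rng)
    image = background + amplitude * needle_mask
    image = image + rng.normal(0.0, noise_std, background.shape)
    return image, needle_mask
```

Generating many such pairs with varying needle poses and amplitudes gives a labelled dataset whose backgrounds retain the statistics of real in vivo measurements.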
Affiliation(s)
- Mengjie Shi
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Tianrui Zhao
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Simeon J. West
- Department of Anaesthesia, University College Hospital, London NW1 2BU, United Kingdom
- Adrien E. Desjardins
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, United Kingdom
- Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, United Kingdom
- Tom Vercauteren
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Wenfeng Xia
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
8
Liu D, Tupor S, Singh J, Chernoff T, Leong N, Sadikov E, Amjad A, Zilles S. The Challenges Facing Deep Learning based Catheter Localization for Ultrasound Guided High-Dose-Rate Prostate Brachytherapy. Med Phys 2022;49:2442-2451. PMID: 35118676. DOI: 10.1002/mp.15522.
Abstract
BACKGROUND Automated catheter localization for ultrasound-guided high-dose-rate prostate brachytherapy faces challenges relating to imaging noise and artifacts. To date, catheter reconstruction during the clinical procedure is performed manually. Deep learning has been successfully applied to a wide variety of complex tasks and has the potential to tackle the unique challenges associated with localizing multiple catheters on ultrasound. Such a task is well suited for automation, with the potential to improve productivity and reliability. PURPOSE We developed a deep learning model for automated catheter reconstruction and investigated potential factors influencing model performance. The model was designed to integrate into a clinical workflow, with a proposed reconstruction confidence metric to aid planner verification. METHODS Datasets from 242 patients treated from 2016 to 2020 were collected retrospectively. The anonymized dataset comprises 31,000 transverse images reconstructed from 3D sagittal ultrasound acquisitions and 3,500 implanted catheters manually localized by the planner. Each catheter was retrospectively ranked by the severity of imaging artifacts affecting reconstruction difficulty. The U-Net deep learning architecture was trained to localize implanted catheters on transverse images. A five-fold cross-validation method was used, allowing evaluation over the entire dataset. Post-processing software combined the predictions with patient-specific implant information to reconstruct catheters in 3D space, uniquely matched to the implanted grid positions. A reconstruction confidence metric was calculated based on the number and probability of localized predictions per catheter. For each patient, deep learning prediction and post-processing reconstruction were completed in under two minutes on a standard, non-high-performance PC. RESULTS Overall, 80% of catheter reconstructions were accurate, within 2 mm along 90% of the length. The catheter tip was often not detected and required extrapolation during reconstruction. Reconstruction accuracy was 89% for the easiest catheter ranking and decreased to 13% for the highest difficulty ranking, for which the aid of live ultrasound would have been recommended. Even when limited to the easiest-ranked catheters, reconstruction accuracy decreased at distal grid positions, down to 50%. Individual implantation style was found to influence the frequency of severe artifacts, slightly impacting model accuracy. The reconstruction confidence metric identified the difficult catheters, removed the observed individual variation, and increased overall accuracy to 91% while excluding 27% of the reconstructions. CONCLUSIONS The deep learning model localized implanted catheters over a large clinical dataset with overall promising results. The model faced challenges due to ultrasound artifacts and image degradation distal to the probe, underlining the continued importance of maintaining image quality and minimizing artifacts. A potential workflow for integration into the clinical procedure was demonstrated, including the use of a confidence metric to predict low-accuracy reconstructions. Comparisons between models evaluated on different datasets should also consider underlying differences, such as the frequency and severity of imaging artifacts.
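A confidence metric based on the number and probability of localized predictions per catheter might look like the following sketch. The exact formula and the threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def reconstruction_confidence(probs, expected_n, min_prob=0.5):
    """Heuristic confidence score in [0, 1] for one reconstructed catheter.

    probs: per-slice detection probabilities for this catheter. The score
    combines how many slices produced a confident detection (coverage) with
    how strong those detections were (strength).
    """
    probs = np.asarray(probs, dtype=float)
    confident = probs[probs >= min_prob]
    coverage = len(confident) / expected_n              # fraction of slices detected
    strength = confident.mean() if len(confident) else 0.0
    return coverage * strength

def filter_catheters(all_probs, expected_n, threshold=0.6):
    """Flag catheters whose confidence falls below threshold for manual review."""
    return [reconstruction_confidence(p, expected_n) >= threshold for p in all_probs]
```

Excluding low-confidence reconstructions from automatic acceptance is the mechanism by which the paper reports raising accuracy from 80% to 91% at the cost of deferring 27% of catheters to the planner.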
Affiliation(s)
- Derek Liu
- Dept of Medical Physics, Allan Blair Cancer Centre, Regina, Saskatchewan, S4T 7T1, Canada; Dept of Oncology, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 5E5, Canada
- Shayantonee Tupor
- Dept of Computer Science, University of Regina, Regina, Saskatchewan, S4S 0A2, Canada
- Jaskaran Singh
- College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 5E5, Canada
- Trey Chernoff
- Dept of Physics, University of Regina, Regina, Saskatchewan, S4S 0A2, Canada
- Nelson Leong
- Dept of Oncology, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 5E5, Canada; Dept of Radiation Oncology, Allan Blair Cancer Centre, Regina, Saskatchewan, S4T 7T1, Canada
- Evgeny Sadikov
- Dept of Oncology, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 5E5, Canada; Dept of Radiation Oncology, Allan Blair Cancer Centre, Regina, Saskatchewan, S4T 7T1, Canada
- Asim Amjad
- Dept of Oncology, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 5E5, Canada; Dept of Radiation Oncology, Allan Blair Cancer Centre, Regina, Saskatchewan, S4T 7T1, Canada
- Sandra Zilles
- Dept of Computer Science, University of Regina, Regina, Saskatchewan, S4S 0A2, Canada
9
Sahu SK, Sozer C, Rosa B, Tamadon I, Renaud P, Menciassi A. Shape Reconstruction Processes for Interventional Application Devices: State of the Art, Progress, and Future Directions. Front Robot AI 2021;8:758411. PMID: 34869615. PMCID: PMC8640970. DOI: 10.3389/frobt.2021.758411.
Abstract
Soft and continuum robots are transforming medical interventions thanks to their flexibility, miniaturization, and multidirectional movement abilities. Although flexibility enables reaching targets in unstructured and dynamic environments, it also creates challenges for control, especially due to interactions with the anatomy. Thus, in recent years many efforts have been devoted to the development of shape reconstruction methods, with advances in kinematic models, sensors, and imaging techniques. These methods can improve the performance of the control action as well as provide the tip position of robotic manipulators relative to the anatomy. Each method, however, has its advantages and disadvantages and can be worthwhile in different situations. For example, electromagnetic (EM) and Fiber Bragg Grating (FBG) sensor-based shape reconstruction methods can be used in small-scale robots thanks to their miniaturization, fast response, and high sensitivity; yet electromagnetic interference in the case of EM sensors, and poor response to high strains in the case of FBG sensors, need to be considered. To help the reader make a suitable choice, this paper presents a review of recent progress on shape reconstruction methods, based on a systematic literature search that excludes pure kinematic models. Methods are classified into two categories. First, sensor-based techniques are presented, covering the use of sensors such as FBG, EM, and passive stretchable sensors for reconstructing the shape of the robots. Second, imaging-based methods are discussed, which utilize images from systems such as fluoroscopy, endoscopic cameras, and ultrasound for shape reconstruction. The applicability, benefits, and limitations of each method are discussed. Finally, the paper draws some promising future directions for enhancing shape reconstruction methods by discussing open questions and alternative approaches.
Affiliation(s)
- Sujit Kumar Sahu
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- ICube, CNRS, INSA Strasbourg, University of Strasbourg, Strasbourg, France
- Canberk Sozer
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Benoit Rosa
- ICube, CNRS, INSA Strasbourg, University of Strasbourg, Strasbourg, France
- Izadyar Tamadon
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Pierre Renaud
- ICube, CNRS, INSA Strasbourg, University of Strasbourg, Strasbourg, France
- Arianna Menciassi
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
10
Konh B, Padasdao B, Batsaikhan Z, Ko SY. Integrating robot-assisted ultrasound tracking and 3D needle shape prediction for real-time tracking of the needle tip in needle steering procedures. Int J Med Robot 2021;17:e2272. PMID: 33951748. DOI: 10.1002/rcs.2272.
Abstract
BACKGROUND Needle insertions are used in several minimally invasive procedures for diagnostic and therapeutic purposes. The real-time position of the needle tip is important information in needle steering systems. METHODS This work introduces a robot-assisted ultrasound tracking (R-AUST) system integrated with a needle shape prediction method to provide the 3D position of the needle tip. The tracking system is evaluated in phantom and ex vivo beef liver tissues. RESULTS An average error of 0.60 mm was found for needle insertion tests in the phantom tissue. R-AUST integrated with shape prediction in the beef liver tissue tracked the needle tip with average and maximum errors of 0.37 and 0.67 mm, respectively. The average error reported in this work is within the mean allowable needle placement error (<2.7 mm) for targeted procedures. CONCLUSIONS Integrating the R-AUST tracking method with needle shape prediction results in reasonably accurate real-time tracking suitable for ultrasound-guided needle insertions.
Affiliation(s)
- Bardia Konh: Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Blayton Padasdao: Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Zolboo Batsaikhan: Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Seong Young Ko: School of Mechanical Engineering, Chonnam National University, Gwangju, South Korea
11. Zheng Y, Jiang S, Yang Z, Wei L. Automatic needle detection using improved random sample consensus in CT image-guided lung interstitial brachytherapy. J Appl Clin Med Phys 2021; 22:121-131. [PMID: 33764659] [PMCID: PMC8035571] [DOI: 10.1002/acm2.13231]
Abstract
Purpose To develop a method for automatically detecting needles from CT images, which can be used in image-guided lung interstitial brachytherapy to assist needle placement assessment and dose distribution optimization. Material and Methods Based on evaluation of previous model parameters, local optimization combining local random sample consensus, and principal component analysis, the needle shaft was detected quickly, accurately, and robustly through a modified random sample consensus algorithm. By tracing intensities along the axis, the needle tip was determined. Furthermore, multiple needles in a single slice were segmented at once using successive inlier deletion. Results The simulation data show that the segmentation efficiency is much higher than the original random sample consensus while maintaining stable submillimeter accuracy. Experiments with a physical phantom demonstrate that the segmentation accuracy of the described algorithm depends on the needle insertion depth into the CT image. Application to permanent lung brachytherapy images is also validated, with manual segmentations as the counterparts of the estimated needle shapes. Conclusions From the results, the mean errors in determining needle orientation and endpoint are within 2° and 1 mm, respectively. The average segmentation time is 0.238 s per needle.
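As an aside on this method family: shaft detection by random sample consensus can be illustrated with a plain RANSAC line fit on candidate needle voxels. This is a minimal sketch under assumed tolerances, not the paper's modified algorithm; the function name and synthetic data are invented for illustration.

```python
import numpy as np

def ransac_line_3d(points, n_iters=200, inlier_tol=1.0, seed=0):
    """Fit a 3D line to candidate needle voxels with plain RANSAC.

    Returns (point_on_line, unit_direction, inlier_mask).
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        d = points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        # perpendicular distance of every point to the candidate line
        diffs = points - points[i]
        dists = np.linalg.norm(diffs - np.outer(diffs @ d, d), axis=1)
        inliers = dists < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refine the direction with an SVD (PCA) fit on the inlier set
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - centroid)
    return centroid, vt[0], best_inliers

# synthetic needle: noisy points along the z axis plus random clutter
rng = np.random.default_rng(1)
t = np.linspace(0.0, 50.0, 100)
shaft = np.stack([np.full_like(t, 5.0), np.full_like(t, 5.0), t], axis=1)
shaft += rng.normal(0.0, 0.2, shaft.shape)
clutter = rng.uniform(0.0, 50.0, (30, 3))
points = np.vstack([shaft, clutter])
center, direction, inliers = ransac_line_3d(points)
```

On the synthetic data, the recovered direction lies close to the z axis along which the shaft was generated, despite 30 clutter points.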
Affiliation(s)
- Yongnan Zheng: School of Mechanical Engineering, Tianjin University, Tianjin, China
- Shan Jiang: School of Mechanical Engineering, Tianjin University, Tianjin, China
- Zhiyong Yang: School of Mechanical Engineering, Tianjin University, Tianjin, China
- Lin Wei: School of Mechanical Engineering, Tianjin University, Tianjin, China
12. Zhang Y, Tian Z, Lei Y, Wang T, Patel P, Jani AB, Curran WJ, Liu T, Yang X. Automatic multi-needle localization in ultrasound images using large margin mask RCNN for ultrasound-guided prostate brachytherapy. Phys Med Biol 2020; 65:205003. [DOI: 10.1088/1361-6560/aba410]
13. Rodgers JR, Hrinivich WT, Surry K, Velker V, D'Souza D, Fenster A. A semiautomatic segmentation method for interstitial needles in intraoperative 3D transvaginal ultrasound images for high-dose-rate gynecologic brachytherapy of vaginal tumors. Brachytherapy 2020; 19:659-668. [PMID: 32631651] [DOI: 10.1016/j.brachy.2020.05.006]
Abstract
PURPOSE The purpose of this study was to evaluate the use of a semiautomatic algorithm to simultaneously segment multiple high-dose-rate (HDR) gynecologic interstitial brachytherapy (ISBT) needles in three-dimensional (3D) transvaginal ultrasound (TVUS) images, with the aim of providing a clinically useful tool for intraoperative implant assessment. METHODS AND MATERIALS A needle segmentation algorithm previously developed for HDR prostate brachytherapy was adapted and extended to 3D TVUS images from gynecologic ISBT patients with vaginal tumors. Two patients were used for refining/validating the modified algorithm and five patients (8-12 needles/patient) were reserved as an unseen test data set. The images were filtered to enhance needle edges, intensity peaks were used to generate feature points, and the randomized 3D Hough transform was leveraged to identify candidate needle trajectories. Algorithmic segmentations were compared against manual segmentations, and calculated dwell positions were evaluated. RESULTS All 50 test data set needles were successfully segmented, with 96% of algorithmically segmented needles having angular differences <3° compared with manually segmented needles and a maximum Euclidean distance <2.1 mm. The median distance between corresponding dwell positions was 0.77 mm, with 86% of needles having maximum differences <3 mm. The mean segmentation time using the algorithm was <30 s/patient. CONCLUSIONS We successfully segmented multiple needles simultaneously in intraoperative 3D TVUS images from gynecologic HDR-ISBT patients with vaginal tumors and demonstrated the robustness of the algorithmic approach to image artifacts. This method provided accurate segmentations within a clinically efficient timeframe, providing the potential to be translated into intraoperative clinical use for implant assessment.
MESH Headings
- Adenocarcinoma, Clear Cell/radiotherapy
- Adenocarcinoma, Clear Cell/secondary
- Aged
- Aged, 80 and over
- Algorithms
- Brachytherapy/instrumentation
- Brachytherapy/methods
- Carcinoma, Endometrioid/radiotherapy
- Carcinoma, Endometrioid/secondary
- Carcinoma, Squamous Cell/pathology
- Carcinoma, Squamous Cell/radiotherapy
- Carcinoma, Squamous Cell/secondary
- Endometrial Neoplasms/pathology
- Female
- Humans
- Image Processing, Computer-Assisted
- Imaging, Three-Dimensional/methods
- Middle Aged
- Needles
- Ovarian Neoplasms/pathology
- Prostate/diagnostic imaging
- Radiotherapy Planning, Computer-Assisted
- Ultrasonography/methods
- Vaginal Neoplasms/pathology
- Vaginal Neoplasms/radiotherapy
- Vaginal Neoplasms/secondary
Affiliation(s)
- Jessica Robin Rodgers: School of Biomedical Engineering and Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- William Thomas Hrinivich: Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD
- Kathleen Surry: Department of Medical Physics, London Regional Cancer Program, London, Ontario, Canada
- Vikram Velker: Department of Radiation Oncology, London Regional Cancer Program, London, Ontario, Canada
- David D'Souza: Department of Radiation Oncology, London Regional Cancer Program, London, Ontario, Canada
- Aaron Fenster: School of Biomedical Engineering and Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
14. Gillies DJ, Rodgers JR, Gyacskov I, Roy P, Kakani N, Cool DW, Fenster A. Deep learning segmentation of general interventional tools in two-dimensional ultrasound images. Med Phys 2020; 47:4956-4970. [DOI: 10.1002/mp.14427]
Affiliation(s)
- Derek J. Gillies: Department of Medical Biophysics and Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Jessica R. Rodgers: Robarts Research Institute and School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Igor Gyacskov: Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Priyanka Roy: Department of Medical Biophysics and Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Nirmal Kakani: Department of Radiology, Manchester Royal Infirmary, Manchester M13 9WL, UK
- Derek W. Cool: Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
- Aaron Fenster: Department of Medical Biophysics, Robarts Research Institute, School of Biomedical Engineering, and Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
15. Li X, Young AS, Raman SS, Lu DS, Lee YH, Tsao TC, Wu HH. Automatic needle tracking using Mask R-CNN for MRI-guided percutaneous interventions. Int J Comput Assist Radiol Surg 2020; 15:1673-1684. [PMID: 32676870] [DOI: 10.1007/s11548-020-02226-8]
Abstract
PURPOSE Accurate needle tracking provides essential information for MRI-guided percutaneous interventions. Passive needle tracking using MR images is challenged by variations of the needle-induced signal void feature in different situations. This work aimed to develop an automatic needle tracking algorithm for MRI-guided interventions based on the Mask Region Proposal-Based Convolutional Neural Network (R-CNN). METHODS Mask R-CNN was adapted and trained to segment the needle feature using 250 intra-procedural images from 85 MRI-guided prostate biopsy cases and 180 real-time images from MRI-guided needle insertion in ex vivo tissue. The segmentation masks were passed into the needle feature localization algorithm to extract the needle feature tip location and axis orientation. The proposed algorithm was tested using 208 intra-procedural images from 40 MRI-guided prostate biopsy cases and 3 real-time MRI datasets in ex vivo tissue. The algorithm results were compared with human-annotated references. RESULTS In the prostate datasets, the proposed algorithm achieved a median needle feature tip localization error (dxy, Euclidean distance) of 0.71 mm and a median axis orientation angle difference (dθ) of 1.28°. In the 3 real-time MRI datasets, the proposed algorithm achieved consistent dynamic needle feature tracking performance with a processing time of 75 ms/image: (a) median dxy = 0.90 mm, median dθ = 1.53°; (b) median dxy = 1.31 mm, median dθ = 1.9°; (c) median dxy = 1.09 mm, median dθ = 0.91°. CONCLUSIONS The proposed algorithm using Mask R-CNN can accurately track the needle feature tip and axis on MR images from in vivo intra-procedural prostate biopsy cases and ex vivo real-time MRI experiments with a range of different conditions. The algorithm achieved pixel-level tracking accuracy in real time and has potential to assist MRI-guided percutaneous interventions.
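The localization step described above, turning a segmentation mask into a tip location and axis orientation, can be sketched with a principal-component fit. The convention used here (the tip is the mask pixel farthest along the axis, with the needle assumed to enter from the low-row image edge) is an assumption for illustration, not the paper's exact rule.

```python
import numpy as np

def needle_tip_and_axis(mask):
    """Estimate needle axis and tip from a binary 2D segmentation mask.

    Axis: first principal component of the mask pixel coordinates.
    Tip: the mask pixel farthest along the axis, assuming (for this
    illustration) that the needle enters from the low-row image edge.
    """
    coords = np.argwhere(mask)            # (N, 2) row/col coordinates
    centroid = coords.mean(axis=0)
    _, _, vt = np.linalg.svd(coords - centroid)
    axis = vt[0]
    if axis[0] < 0:                       # orient axis toward increasing row
        axis = -axis
    proj = (coords - centroid) @ axis
    tip = coords[np.argmax(proj)]
    return axis, tip

mask = np.zeros((10, 10), dtype=bool)
mask[2:9, 5] = True                       # vertical needle, rows 2..8, col 5
axis, tip = needle_tip_and_axis(mask)
```

For the vertical toy needle, the axis aligns with the row direction and the tip lands on the deepest mask pixel.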
Affiliation(s)
- Xinzhou Li: Department of Radiological Sciences and Department of Bioengineering, University of California Los Angeles, Los Angeles, CA, USA
- Adam S Young: Department of Radiological Sciences, University of California Los Angeles, Los Angeles, CA, USA
- Steven S Raman: Department of Radiological Sciences, University of California Los Angeles, Los Angeles, CA, USA
- David S Lu: Department of Radiological Sciences, University of California Los Angeles, Los Angeles, CA, USA
- Yu-Hsiu Lee: Department of Mechanical and Aerospace Engineering, University of California Los Angeles, Los Angeles, CA, USA
- Tsu-Chin Tsao: Department of Mechanical and Aerospace Engineering, University of California Los Angeles, Los Angeles, CA, USA
- Holden H Wu: Department of Radiological Sciences and Department of Bioengineering, University of California Los Angeles, Los Angeles, CA, USA
16. Zhang Y, He X, Tian Z, Jeong JJ, Lei Y, Wang T, Zeng Q, Jani AB, Curran WJ, Patel P, Liu T, Yang X. Multi-needle detection in 3D ultrasound images using unsupervised order-graph regularized sparse dictionary learning. IEEE Trans Med Imaging 2020; 39:2302-2315. [PMID: 31985414] [PMCID: PMC7370243] [DOI: 10.1109/tmi.2020.2968770]
Abstract
Accurate and automatic multi-needle detection in three-dimensional (3D) ultrasound (US) is a key step of treatment planning for US-guided brachytherapy. However, most current studies are concentrated on single-needle detection by only using a small number of images with a needle, regardless of the massive database of US images without needles. In this paper, we propose a workflow for multi-needle detection by considering the images without needles as auxiliary. Concretely, we train position-specific dictionaries on 3D overlapping patches of auxiliary images, where we develop an enhanced sparse dictionary learning method by integrating spatial continuity of 3D US, dubbed order-graph regularized dictionary learning. Using the learned dictionaries, target images are reconstructed to obtain residual pixels which are then clustered in every slice to yield centers. With the obtained centers, regions of interest (ROIs) are constructed via seeking cylinders. Finally, we detect needles by using the random sample consensus algorithm per ROI and then locate the tips by finding the sharp intensity drops along the detected axis for every needle. Extensive experiments were conducted on a phantom dataset and a prostate dataset of 70/21 patients without/with needles. Visualization and quantitative results show the effectiveness of our proposed workflow. Specifically, our method can correctly detect 95% of needles with a tip location error of 1.01 mm on the prostate dataset. This technique provides accurate multi-needle detection for US-guided HDR prostate brachytherapy, facilitating the clinical workflow.
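The tip-location step quoted above, finding the sharp intensity drop along the detected axis, can be sketched as a steepest-gradient search on a smoothed intensity profile; the smoothing width and the steepest-gradient criterion are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def tip_index(profile, smooth=3):
    """Locate the needle tip as the steepest intensity drop along the
    detected axis. `profile` holds image intensities sampled along the
    shaft; box smoothing suppresses single dark speckle samples.
    """
    kernel = np.ones(smooth) / smooth
    smoothed = np.convolve(profile, kernel, mode="valid")
    # the most negative gradient marks the shaft-to-background transition
    return int(np.argmin(np.diff(smoothed)))

profile = np.array([100.0] * 20 + [10.0] * 10)  # bright shaft, dark beyond
tip = tip_index(profile)
```

On the toy profile, the detected index sits at the shaft-to-background boundary near sample 19 (shifted slightly by the smoothing window).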
17. Dai X, Lei Y, Zhang Y, Qiu RLJ, Wang T, Dresser SA, Curran WJ, Patel P, Liu T, Yang X. Automatic multi-catheter detection using deeply supervised convolutional neural network in MRI-guided HDR prostate brachytherapy. Med Phys 2020; 47:4115-4124. [PMID: 32484573] [DOI: 10.1002/mp.14307]
Abstract
PURPOSE High-dose-rate (HDR) brachytherapy is an established technique used as a monotherapy option or focal boost in conjunction with external beam radiation therapy (EBRT) for treating prostate cancer. Radiation source path reconstruction is a critical procedure in HDR treatment planning. Manually identifying the source path is labor intensive and time inefficient. In recent years, magnetic resonance imaging (MRI) has become a valuable imaging modality for image-guided HDR prostate brachytherapy due to its superb soft-tissue contrast for target delineation and normal tissue contouring. The purpose of this study is to investigate a deep-learning-based method to automatically reconstruct multiple catheters in MRI for prostate cancer HDR brachytherapy treatment planning. METHODS An attention-gated U-Net model incorporating total variation (TV) regularization was developed for multi-catheter segmentation in MRI. The attention gates were used to improve the accuracy of identifying small catheter points, while TV regularization was adopted to encode the natural spatial continuity of catheters into the model. The model was trained using binary catheter annotation images provided by experienced physicists as ground truth, paired with the original MRI images. After the network was trained, MR images of a new prostate cancer patient receiving HDR brachytherapy were fed into the model to predict the locations and shapes of all the catheters. Quantitative assessments of our proposed method were based on catheter shaft and tip errors compared to the ground truth. RESULTS Our method detected 299 catheters from 20 patients receiving HDR prostate brachytherapy with a catheter tip error of 0.37 ± 1.68 mm and a catheter shaft error of 0.93 ± 0.50 mm. For detection of catheter tips, our method localized 87% of the catheter tips within an error of less than ± 2.0 mm, and more than 71% of the tips within an absolute error of no more than 1.0 mm.
For catheter shaft localization, 97% of catheters were detected with an error of <2.0 mm, while 63% were within 1.0 mm. CONCLUSIONS In this study, we proposed a novel multi-catheter detection method to precisely localize the tips and shafts of catheters in three-dimensional MRI images of HDR prostate brachytherapy. It paves the way for elevating the quality and outcome of MRI-guided HDR prostate brachytherapy.
Affiliation(s)
- All authors: Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30332, USA
18. Zhang Y, Lei Y, Qiu RLJ, Wang T, Wang H, Jani AB, Curran WJ, Patel P, Liu T, Yang X. Multi-needle localization with attention U-Net in US-guided HDR prostate brachytherapy. Med Phys 2020; 47:2735-2745. [PMID: 32155666] [DOI: 10.1002/mp.14128]
Abstract
PURPOSE Ultrasound (US)-guided high dose rate (HDR) prostate brachytherapy requires clinicians to place HDR needles (catheters) into the prostate gland under transrectal US (TRUS) guidance in the operating room. The quality of the subsequent radiation treatment plan is largely dictated by the needle placements, which vary with the experience level of the clinicians and the procedure protocols. Real-time plan dose distribution, if available, could be a vital tool to provide more objective assessment of the needle placements, hence potentially improving the radiation plan quality and the treatment outcome. However, due to the low signal-to-noise ratio (SNR) of US imaging, real-time multi-needle segmentation in 3D TRUS, which is the major obstacle for real-time dose mapping, has not been realized to date. In this study, we propose a deep learning-based method that enables accurate and real-time digitization of the multiple needles in the 3D TRUS images of HDR prostate brachytherapy. METHODS A deep learning model based on the U-Net architecture was developed to segment multiple needles in the 3D TRUS images. Attention gates were included in our model to improve the prediction on the small needle points. Furthermore, the spatial continuity of needles was encoded into our model with total variation (TV) regularization. The combined network was trained on 3D TRUS patches with a deep supervision strategy, where the binary needle annotation images were provided as ground truth. The trained network was then used to localize and segment the HDR needles in a new patient's TRUS images. We evaluated our proposed method based on the needle shaft and tip errors against manually defined ground truth and compared our method with other state-of-the-art methods (U-Net and deeply supervised attention U-Net). RESULTS Our method detected 96% of 339 needles from 23 HDR prostate brachytherapy patients with a shaft error of 0.290 ± 0.236 mm and a tip error of 0.442 ± 0.831 mm.
For shaft localization, our method resulted in 96% of localizations with less than 0.8 mm error (needle diameter is 1.67 mm), while for tip localization, our method resulted in 75% of needles with 0 mm error and 21% of needles with 2 mm error (TRUS image slice thickness is 2 mm). No significant difference was observed (P = 0.83) in tip localization between our results and the ground truth. Compared with U-Net and deeply supervised attention U-Net, the proposed method delivers a significant improvement in both shaft and tip error (P < 0.05). CONCLUSIONS We proposed a new segmentation method to precisely localize the tips and shafts of multiple needles in 3D TRUS images of HDR prostate brachytherapy. The 3D rendering of the needles could help clinicians to evaluate the needle placements. It paves the way for the development of real-time plan dose assessment tools that can further elevate the quality and outcome of HDR prostate brachytherapy.
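The total variation (TV) regularization used here to encode the spatial continuity of needles has a simple closed form: the sum of absolute differences between neighbouring voxels along each axis. A minimal NumPy sketch follows (the paper applies an analogous term inside the network's training loss, not this standalone function):

```python
import numpy as np

def total_variation_3d(p):
    """Anisotropic total variation of a 3D probability volume: the sum of
    absolute differences between neighbouring voxels along each axis.
    Scaled and added to the segmentation loss, it penalises spatially
    discontinuous needle predictions.
    """
    return sum(np.abs(np.diff(p, axis=a)).sum() for a in range(3))

flat = np.zeros((4, 4, 4))      # constant volume: zero TV
step = np.zeros((2, 2, 2))
step[1] = 1.0                   # one unit jump across a 2x2 face: TV = 4
```

A perfectly smooth prediction contributes nothing, while every spatial discontinuity adds its jump height times its surface area.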
Affiliation(s)
- Hesheng Wang: Department of Radiation Oncology, New York University, New York, NY, USA
- All other authors: Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, USA
19. Chaudhury A, Barron JL. 3D phenotyping of plants. In: 3D Imaging, Analysis and Applications. 2020:699-732. [DOI: 10.1007/978-3-030-44070-1_14]
20. Sonographic visibility of cannulas using convex ultrasound transducers. Biomed Tech (Berl) 2019; 64:691-698. [DOI: 10.1515/bmt-2018-0174]
Abstract
The key to safe ultrasound (US)-guided punctures is good visibility of the cannula. When using convex transducers for deep punctures, the incident angle between the US beam and the cannula varies along the cannula, leading to a complex visibility pattern. Here, we present a method to systematically investigate the visibility throughout the US image. For this, different objective criteria were defined and applied to measurement series with varying puncture angles and depths of the cannula. It is shown that, when using convex transducers, the visibility depends not only on the puncture angle but also on the location of the cannula in the US image. In some image regions, unexpectedly good visibility was observed even for steep puncture angles. The systematic evaluation of cannula visibility is of fundamental interest to sensitise physicians to the handling of convex transducers and to evaluate new techniques for further improvement.
21. Zaffino P, Pernelle G, Mastmeyer A, Mehrtash A, Zhang H, Kikinis R, Kapur T, Francesca Spadea M. Fully automatic catheter segmentation in MRI with 3D convolutional neural networks: application to MRI-guided gynecologic brachytherapy. Phys Med Biol 2019; 64:165008. [PMID: 31272095] [DOI: 10.1088/1361-6560/ab2f47]
Abstract
External-beam radiotherapy followed by high dose rate (HDR) brachytherapy is the standard-of-care for treating gynecologic cancers. The enhanced soft-tissue contrast provided by magnetic resonance imaging (MRI) makes it a valuable imaging modality for diagnosing and treating these cancers. However, in contrast to computed tomography (CT) imaging, the appearance of the brachytherapy catheters, through which radiation sources are later inserted to reach the cancerous tissue, is often variable across images. This paper reports, for the first time, a new deep-learning-based method for fully automatic segmentation of multiple closely spaced brachytherapy catheters in intraoperative MRI. Represented in the data are 50 gynecologic cancer patients treated by MRI-guided HDR brachytherapy. For each patient, a single intraoperative MRI was used. 826 catheters in the images were manually segmented by an expert radiation physicist who is also a trained radiation oncologist. The number of catheters per patient ranged between 10 and 35. A deep 3D convolutional neural network (CNN) model was developed and trained. To make the learning process more robust, the network was trained 5 times, each time using a different combination of the training patients. Finally, each test case was processed by the five networks and the final segmentation was generated by voting on the five candidate segmentations. 4-fold validation was executed and all the patients were segmented. An average distance error of 2.0 ± 3.4 mm was achieved. False positive and false negative catheters were 6.7% and 1.5%, respectively. The average Dice score was 0.60 ± 0.17. The algorithm is available in the open source software platform 3D Slicer, allowing for wide-scale testing and research discussion.
In conclusion, to the best of our knowledge, fully automatic segmentation of multiple closely spaced catheters from intraoperative MR images was achieved for the first time in gynecological brachytherapy.
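The five-network ensembling described above reduces to a per-voxel majority vote over the candidate segmentations; a minimal sketch:

```python
import numpy as np

def vote_segmentation(masks):
    """Fuse an ensemble's candidate segmentations by per-voxel majority
    vote: a voxel is foreground when more than half the models agree.
    """
    stack = np.stack([np.asarray(m, dtype=np.int32) for m in masks])
    return (stack.sum(axis=0) * 2 > len(masks)).astype(np.uint8)

# three toy 2x2 candidate masks; each voxel keeps the majority label
m1 = [[1, 0], [1, 1]]
m2 = [[1, 0], [0, 1]]
m3 = [[0, 1], [0, 1]]
fused = vote_segmentation([m1, m2, m3])
```

With five models, as in the paper, a voxel must be predicted foreground by at least three networks to survive the vote.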
Affiliation(s)
- Paolo Zaffino: Department of Experimental and Clinical Medicine, Magna Graecia University, 88100, Catanzaro, Italy (corresponding author)
22. Antico M, Sasazawa F, Wu L, Jaiprakash A, Roberts J, Crawford R, Pandey AK, Fontanarosa D. Ultrasound guidance in minimally invasive robotic procedures. Med Image Anal 2019; 54:149-167. [DOI: 10.1016/j.media.2019.01.002]
23. Gillies DJ, Awad J, Rodgers JR, Edirisinghe C, Cool DW, Kakani N, Fenster A. Three-dimensional therapy needle applicator segmentation for ultrasound-guided focal liver ablation. Med Phys 2019; 46:2646-2658. [PMID: 30994191] [DOI: 10.1002/mp.13548]
Abstract
PURPOSE Minimally invasive procedures, such as microwave ablation, are becoming first-line treatment options for early-stage liver cancer due to lower complication rates and shorter recovery times than conventional surgical techniques. Although these procedures are promising, one reason preventing widespread adoption is inadequate local tumor ablation leading to observations of higher local cancer recurrence compared to conventional procedures. Poor ablation coverage has been associated with two-dimensional (2D) ultrasound (US) guidance of the therapy needle applicators and has stimulated investigation into the use of three-dimensional (3D) US imaging for these procedures. We have developed a supervised 3D US needle applicator segmentation algorithm using a single user input to augment the addition of 3D US to the current focal liver tumor ablation workflow with the goals of identifying and improving needle applicator localization efficiency. METHODS The algorithm is initialized by creating a spherical search space of line segments around a manually chosen seed point that is selected by a user on the needle applicator visualized in a 3D US image. The most probable trajectory is chosen by maximizing the count and intensity of threshold voxels along a line segment and is filtered using the Otsu method to determine the tip location. Homogeneous tissue mimicking phantom images containing needle applicators were used to optimize the parameters of the algorithm prior to a four-user investigation on retrospective 3D US images of patients who underwent microwave ablation for liver cancer. Trajectory, axis localization, and tip errors were computed based on comparisons to manual segmentations in 3D US images. 
RESULTS Segmentation of needle applicators in ten phantom 3D US images was optimized to median (Q1, Q3) trajectory, axis, and tip errors of 2.1 (1.1, 3.6)°, 1.3 (0.8, 2.1) mm, and 1.3 (0.7, 2.5) mm, respectively, with a mean ± SD segmentation computation time of 0.246 ± 0.007 s. Use of the segmentation method with a 16 in vivo 3D US patient dataset resulted in median (Q1, Q3) trajectory, axis, and tip errors of 4.5 (2.4, 5.2)°, 1.9 (1.7, 2.1) mm, and 5.1 (2.2, 5.9) mm based on all users. CONCLUSIONS Segmentation of needle applicators in 3D US images during minimally invasive liver cancer therapeutic procedures could provide a utility that enables enhanced needle applicator guidance, placement verification, and improved clinical workflow. A semi-automated 3D US needle applicator segmentation algorithm used in vivo demonstrated localization of the visualized trajectory and tip with less than 5° and 5.2 mm errors, respectively, in less than 0.31 s. This offers the ability to assess and adjust needle applicator placements intraoperatively to potentially decrease the observed liver cancer recurrence rates associated with current ablation procedures. Although optimized for deep and oblique angle needle applicator insertions, this proposed workflow has the potential to be altered for a variety of image-guided minimally invasive procedures to improve localization and verification of therapy needle applicators intraoperatively.
Affiliation(s)
- Derek J Gillies
- Department of Medical Biophysics, Western University, London, ON, N6A 3K7, Canada; Robarts Research Institute, Western University, London, ON, N6A 3K7, Canada
- Joseph Awad
- Centre for Imaging Technology Commercialization, London, ON, N6G 4X8, Canada
- Jessica R Rodgers
- Robarts Research Institute, Western University, London, ON, N6A 3K7, Canada; School of Biomedical Engineering, Western University, London, ON, N6A 3K7, Canada
- Derek W Cool
- Department of Medical Imaging, Western University, London, ON, N6A 3K7, Canada
- Nirmal Kakani
- Department of Radiology, Manchester Royal Infirmary, Manchester, M13 9WL, UK
- Aaron Fenster
- Department of Medical Biophysics, Western University, London, ON, N6A 3K7, Canada; Robarts Research Institute, Western University, London, ON, N6A 3K7, Canada; Centre for Imaging Technology Commercialization, London, ON, N6G 4X8, Canada; School of Biomedical Engineering, Western University, London, ON, N6A 3K7, Canada; Department of Medical Imaging, Western University, London, ON, N6A 3K7, Canada
24
Mehrtash A, Ghafoorian M, Pernelle G, Ziaei A, Heslinga FG, Tuncali K, Fedorov A, Kikinis R, Tempany CM, Wells WM, Abolmaesumi P, Kapur T. Automatic Needle Segmentation and Localization in MRI With 3-D Convolutional Neural Networks: Application to MRI-Targeted Prostate Biopsy. IEEE TRANSACTIONS ON MEDICAL IMAGING 2019; 38:1026-1036. [PMID: 30334789 PMCID: PMC6450731 DOI: 10.1109/tmi.2018.2876796] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/05/2023]
Abstract
Image guidance improves tissue sampling during biopsy by allowing the physician to visualize the tip and trajectory of the biopsy needle relative to the target in MRI, CT, ultrasound, or other relevant imagery. This paper reports a system for fast automatic needle tip and trajectory localization and visualization in MRI that has been developed and tested in the context of an active clinical research program in prostate biopsy. To the best of our knowledge, this is the first reported system for this clinical application, and also the first reported system that leverages deep neural networks for segmentation and localization of needles in MRI across biomedical applications. Needle tip and trajectory were annotated on 583 T2-weighted intra-procedural MRI scans acquired after needle insertion for 71 patients who underwent transperineal MRI-targeted biopsy procedures at our institution. The images were divided into two independent training-validation and test sets at the patient level. A deep 3-D fully convolutional neural network model was developed, trained, and deployed on these samples. The accuracy of the proposed method, as tested on previously unseen data, was 2.80 mm on average for needle tip detection and 0.98° for needle trajectory angle. An observer study was designed in which independent annotations by a second observer, blinded to the original observer, were compared with the output of the proposed method. The resultant error was comparable to the measured inter-observer concordance, reinforcing the clinical acceptability of the proposed method. The proposed system has the potential for deployment in clinical routine.
Affiliation(s)
- Alireza Mehrtash
- Department of Electrical and Computer Engineering, The University of British Columbia, Vancouver, BC, V6T 1Z4, Canada
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Alireza Ziaei
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Friso G. Heslinga
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Kemal Tuncali
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Andriy Fedorov
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Ron Kikinis
- Department of Computer Science, University of Bremen, Bremen, Germany
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Clare M. Tempany
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- William M. Wells
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
- Purang Abolmaesumi
- Department of Electrical and Computer Engineering, The University of British Columbia, Vancouver, BC, V6T 1Z4, Canada
- Tina Kapur
- Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, 02115, USA
25
Yang H, Shan C, Pourtaherian A, Kolen AF, de With PHN. Catheter segmentation in three-dimensional ultrasound images by feature fusion and model fitting. J Med Imaging (Bellingham) 2019; 6:015001. [PMID: 30662926 DOI: 10.1117/1.jmi.6.1.015001] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2018] [Accepted: 12/14/2018] [Indexed: 11/14/2022] Open
Abstract
Ultrasound (US) has been increasingly used during interventions, such as cardiac catheterization. To accurately identify the catheter inside US images, extra training for physicians and sonographers is needed. As a consequence, automated segmentation of the catheter in US images and optimized presentation viewing to the physician can be beneficial, improving the efficiency, safety, and outcome of interventions. For cardiac catheterization, a three-dimensional (3-D) US image is potentially attractive because it involves no ionizing radiation and provides richer spatial information. However, due to the limited spatial resolution of 3-D cardiac US and complex anatomical structures inside the heart, image-based catheter segmentation is challenging. We propose a cardiac catheter segmentation method in 3-D US data through image processing techniques. Our method first applies a voxel-based classification through newly designed multiscale and multidefinition features, which provide a robust catheter voxel segmentation in 3-D US. Second, a modified catheter model fitting is applied to segment the curved catheter in 3-D US images. The proposed method is validated with extensive experiments, using different in-vitro, ex-vivo, and in-vivo datasets. The proposed method can segment the catheter with an average tip-point error that is smaller than the catheter diameter (1.9 mm) in the volumetric images. Based on automated catheter segmentation and combined with optimal viewing, physicians do not have to interpret US images and can focus on the procedure itself, improving the quality of cardiac intervention.
Affiliation(s)
- Hongxu Yang
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
- Caifeng Shan
- Philips Research, In-Body Systems, Eindhoven, The Netherlands
- Arash Pourtaherian
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
- Peter H N de With
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
26
Daoud MI, Alshalalfah AL, Ait Mohamed O, Alazrai R. A hybrid camera- and ultrasound-based approach for needle localization and tracking using a 3D motorized curvilinear ultrasound probe. Med Image Anal 2018; 50:145-166. [PMID: 30336383 DOI: 10.1016/j.media.2018.09.006] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2018] [Revised: 08/11/2018] [Accepted: 09/25/2018] [Indexed: 10/28/2022]
Abstract
Three-dimensional (3D) motorized curvilinear ultrasound probes provide an effective, low-cost tool to guide needle interventions, but localizing and tracking the needle in 3D ultrasound volumes is often challenging. In this study, a new method is introduced to localize and track the needle using 3D motorized curvilinear ultrasound probes. In particular, a low-cost camera mounted on the probe is employed to estimate the needle axis. The camera-estimated axis is used to identify a volume of interest (VOI) in the ultrasound volume that enables high needle visibility. This VOI is analyzed using local phase analysis and the random sample consensus algorithm to refine the camera-estimated needle axis. The needle tip is determined by searching the localized needle axis using a probabilistic approach. Dynamic needle tracking in a sequence of 3D ultrasound volumes is enabled by iteratively applying a Kalman filter to estimate the VOI that includes the needle in the successive ultrasound volume and limiting the localization analysis to this VOI. A series of ex vivo animal experiments are conducted to evaluate the accuracy of needle localization and tracking. The results show that the proposed method can localize the needle in individual ultrasound volumes with maximum error rates of 0.7 mm for the needle axis, 1.7° for the needle angle, and 1.2 mm for the needle tip. Moreover, the proposed method can track the needle in a sequence of ultrasound volumes with maximum error rates of 1.0 mm for the needle axis, 2.0° for the needle angle, and 1.7 mm for the needle tip. These results suggest the feasibility of applying the proposed method to localize and track the needle using 3D motorized curvilinear ultrasound probes.
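The Kalman-filter stage described above, which predicts the volume of interest (VOI) containing the needle in the next ultrasound volume, can be illustrated with a minimal constant-velocity filter run independently per tip coordinate. This is a generic textbook sketch under assumed noise parameters, not the paper's exact state model:

```python
class TipKalman1D:
    """Minimal constant-velocity Kalman filter for one coordinate of the
    needle tip; run one instance per axis to predict the VOI centre in the
    next volume. State is (position, velocity); q and r are assumed
    process and measurement noise variances."""

    def __init__(self, x0, q=0.01, r=1.0):
        self.x, self.v = x0, 0.0
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q, self.r = q, r

    def predict(self):
        # x' = x + v (unit time step); P' = F P F^T + Q for F = [[1,1],[0,1]]
        self.x += self.v
        P = self.P
        self.P = [[P[0][0] + P[0][1] + P[1][0] + P[1][1] + self.q,
                   P[0][1] + P[1][1]],
                  [P[1][0] + P[1][1], P[1][1] + self.q]]
        return self.x  # predicted VOI centre along this axis

    def update(self, z):
        # Scalar position measurement z with H = [1, 0]
        S = self.P[0][0] + self.r
        k0, k1 = self.P[0][0] / S, self.P[1][0] / S
        y = z - self.x
        self.x += k0 * y
        self.v += k1 * y
        P = self.P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        return self.x
```

Feeding the filter the tip position localized in each volume and calling `predict()` gives the axis coordinate around which to centre the next VOI, which is the role the filter plays in the tracking loop described in the abstract.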
Affiliation(s)
- Mohammad I Daoud
- Department of Computer Engineering, German Jordanian University, Amman, Jordan
- Otmane Ait Mohamed
- Department of Electrical and Computer Engineering, Concordia University, Montreal, Quebec, Canada
- Rami Alazrai
- Department of Computer Engineering, German Jordanian University, Amman, Jordan
27
Automatic Robotic Steering of Flexible Needles from 3D Ultrasound Images in Phantoms and Ex Vivo Biological Tissue. Ann Biomed Eng 2018; 46:1385-1396. [PMID: 29845413 DOI: 10.1007/s10439-018-2061-3] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2017] [Accepted: 05/25/2018] [Indexed: 12/25/2022]
Abstract
Robotic control of needle bending aims at increasing the precision of percutaneous procedures. Ultrasound feedback is preferable for its clinical ease of use, cost, and compactness, but raises needle detection issues. In this paper, we propose a complete system dedicated to robotized guidance of a flexible needle under 3D ultrasound imaging. This system includes a medical robot dedicated to transperineal needle positioning and insertion, a rapid path planning for needle steering using the natural curvature of a bevel-tip needle in tissue, and an ultrasound-based automatic needle detection algorithm. Since ultrasound-based automatic needle steering is often made difficult by needle localization in biological tissue, we quantify the benefit of using flexible echogenic needles for robotized guidance under 3D ultrasound. The "echogenic" term refers to the etching of microstructures on the needle shaft. We show that these structures improve needle visibility and detection robustness in ultrasound images. We finally present promising results when reaching targets using needle steering. The experiments were conducted with various needles in different media (synthetic phantoms and ex vivo biological tissue). For instance, with nitinol needles the mean accuracy is 1.2 mm in phantoms and 3.8 mm in biological tissue.
28
Czajkowska J, Pyciński B, Juszczyk J, Pietka E. Biopsy needle tracking technique in US images. Comput Med Imaging Graph 2018; 65:93-101. [DOI: 10.1016/j.compmedimag.2017.07.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2017] [Revised: 06/23/2017] [Accepted: 07/18/2017] [Indexed: 11/28/2022]
29
Assessment of Homodyned K Distribution Modeling Ultrasonic Speckles from Scatterers with Varying Spatial Organizations. JOURNAL OF HEALTHCARE ENGINEERING 2017; 2017:8154780. [PMID: 29312656 PMCID: PMC5605812 DOI: 10.1155/2017/8154780] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/06/2017] [Revised: 06/09/2017] [Accepted: 06/22/2017] [Indexed: 11/17/2022]
Abstract
Objective This paper assesses the physical meanings of the parameters and the goodness of fit of the homodyned K (HK) distribution when modeling ultrasonic speckle from scatterer distributions with widely varying spatial organizations. Methods A set of 3D scatterer phantoms based on gamma distributions is built to span clustered, random, and uniform scatterer distributions continuously. The model parameters are obtained by maximum likelihood estimation (MLE) from statistical histograms of the ultrasonic envelope data and then compared with those of the optimally fitting models chosen from three single distributions. Results show that the parameters of the HK distribution still present their respective physical meanings of independent contributions in the scatterer distributions. Moreover, the HK distribution presents better goodness of fit, with a maximum relative MLE difference of 6.23% for random or clustered scatterers with a well-organized periodic structure. Experiments based on ultrasonic envelope data from common carotid arterial B-mode images of human subjects validate the modeling performance of the HK distribution. Conclusion The HK model for ultrasonic speckle is a better choice for characterizing tissue with a wide variety of spatial organizations, especially when goodness of fit matters in practical applications.
30
Pourtaherian A, Scholten HJ, Kusters L, Zinger S, Mihajlovic N, Kolen AF, Zuo F, Ng GC, Korsten HHM, de With PHN. Medical Instrument Detection in 3-Dimensional Ultrasound Data Volumes. IEEE TRANSACTIONS ON MEDICAL IMAGING 2017; 36:1664-1675. [PMID: 28410101 DOI: 10.1109/tmi.2017.2692302] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
Ultrasound-guided medical interventions are broadly applied in diagnostics and therapy, e.g., regional anesthesia or ablation. A guided intervention using 2-D ultrasound is challenging due to the poor instrument visibility, limited field of view, and the multi-fold coordination of the medical instrument and ultrasound plane. Recent 3-D ultrasound transducers can improve the quality of the image-guided intervention if an automated detection of the needle is used. In this paper, we present a novel method for detecting medical instruments in 3-D ultrasound data that is solely based on image processing techniques and validated on various ex vivo and in vivo data sets. In the proposed procedure, the physician is placing the 3-D transducer at the desired position, and the image processing will automatically detect the best instrument view, so that the physician can entirely focus on the intervention. Our method is based on the classification of instrument voxels using volumetric structure directions and robust approximation of the primary tool axis. A novel normalization method is proposed for the shape and intensity consistency of instruments to improve the detection. Moreover, a novel 3-D Gabor wavelet transformation is introduced and optimally designed for revealing the instrument voxels in the volume, while remaining generic to several medical instruments and transducer types. Experiments on diverse data sets, including in vivo data from patients, show that for a given transducer and an instrument type, high detection accuracies are achieved with position errors smaller than the instrument diameter in the 0.5-1.5-mm range on average.
31
Beigi P, Rohling R, Salcudean SE, Ng GC. CASPER: computer-aided segmentation of imperceptible motion-a learning-based tracking of an invisible needle in ultrasound. Int J Comput Assist Radiol Surg 2017. [PMID: 28647883 DOI: 10.1007/s11548-017-1631-4] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Abstract
PURPOSE This paper presents a new micro-motion-based approach to track a needle in ultrasound images captured by a handheld transducer. METHODS We propose a novel learning-based framework to track a handheld needle by detecting microscale variations of motion dynamics over time. The current state of the art in using motion analysis for needle detection relies on absolute motion and hence works well only when the transducer is static. We have introduced and evaluated novel spatiotemporal and spectral features, obtained from the phase image, in a self-supervised tracking framework to improve the detection accuracy in subsequent frames using incremental training. Our proposed tracking method involves volumetric feature selection and differential flow analysis to incorporate the neighboring pixels and mitigate the effects of the subtle tremor motion of a handheld transducer. To evaluate the detection accuracy, the method is tested on porcine tissue in vivo, during needle insertion into the biceps femoris muscle. RESULTS Experimental results show mean, standard deviation, and root-mean-square errors of [Formula: see text], [Formula: see text] and [Formula: see text] in the insertion angle, and 0.82, 1.21, and 1.47 mm in the needle tip, respectively. CONCLUSIONS Compared to appearance-based detection approaches, the proposed method is especially suitable for needles with ultrasonic characteristics that are imperceptible in the static image and to the naked eye.
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Robert Rohling
- Electrical and Computer Engineering Department and Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
32
Ikhsan M, Tan KK, Putra AS. Assistive technology for ultrasound-guided central venous catheter placement. J Med Ultrason (2001) 2017; 45:41-57. [DOI: 10.1007/s10396-017-0789-2] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2016] [Accepted: 03/30/2017] [Indexed: 11/28/2022]
33
Hrinivich WT, Hoover DA, Surry K, Edirisinghe C, Montreuil J, D'Souza D, Fenster A, Wong E. Simultaneous automatic segmentation of multiple needles using 3D ultrasound for high-dose-rate prostate brachytherapy. Med Phys 2017; 44:1234-1245. [PMID: 28160517 DOI: 10.1002/mp.12148] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2016] [Revised: 01/10/2017] [Accepted: 01/29/2017] [Indexed: 12/31/2022] Open
Abstract
PURPOSE Sagittally reconstructed 3D (SR3D) ultrasound imaging shows promise for improved needle localization for high-dose-rate prostate brachytherapy (HDR-BT); however, needles must be manually segmented intraoperatively while the patient is anesthetized to create a treatment plan. The purpose of this article was to describe and validate an automatic needle segmentation algorithm designed for HDR-BT, specifically capable of simultaneously segmenting all needles in an HDR-BT implant using a single SR3D image with ~5 mm interneedle spacing. MATERIALS AND METHODS The segmentation algorithm involves regularized feature point classification and line trajectory identification based on the randomized 3D Hough transform modified to handle multiple straight needles in a single image simultaneously. Needle tips are identified based on peaks in the derivative of the signal intensity profile along the needle trajectory. For algorithm validation, 12 prostate cancer patients underwent HDR-BT during which SR3D images were acquired with all needles in place. Needles present in each of the 12 images were segmented manually, providing a gold standard for comparison, and using the algorithm. Tip errors were assessed in terms of the 3D Euclidean distance between needle tips, and trajectory error was assessed in terms of 2D distance in the axial plane and angular deviation between trajectories. RESULTS In total, 190 needles were investigated. Mean execution time of the algorithm was 11.0 s per patient, or 0.7 s per needle. The algorithm identified 82% and 85% of needle tips with 3D errors ≤3 mm and ≤5 mm, respectively, 91% of needle trajectories with 2D errors in the axial plane ≤3 mm, and 83% of needle trajectories with angular errors ≤3°. The largest tip error component was in the needle insertion direction. 
CONCLUSIONS Previous work has indicated HDR-BT needles may be manually segmented using SR3D images with insertion depth errors ≤3 mm and ≤5 mm for 83% and 92% of needles, respectively. The algorithm shows promise for reducing the time required for the segmentation of straight HDR-BT needles, and future work involves improving needle tip localization performance through improved image quality and modeling curvilinear trajectories.
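The tip-identification step above (finding peaks in the derivative of the signal intensity profile along the needle trajectory) can be sketched with a simple 1-D stand-in. The function name and the choice of the most negative derivative as the tip marker are illustrative assumptions, not the authors' exact criterion:

```python
def tip_index(profile):
    """Locate a candidate needle tip along an axis-intensity profile as the
    index of the sharpest intensity change, i.e. the extremal value of the
    first derivative (a stand-in for the peak-derivative criterion)."""
    derivative = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    # Bright needle voxels give way to darker tissue past the tip, so the
    # most negative derivative marks the candidate tip sample.
    return min(range(len(derivative)), key=lambda i: derivative[i])
```

In practice the profile would be sampled along the trajectory recovered by the Hough-transform stage, and the derivative would typically be smoothed before peak-picking to suppress speckle noise.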
Affiliation(s)
- William Thomas Hrinivich
- Department of Medical Biophysics, University of Western Ontario, London, Ontario, N6A 5C1, Canada; Imaging Research Laboratories, Robarts Research Institute, University of Western Ontario, London, Ontario, N6A 5K8, Canada
- Douglas A Hoover
- Department of Medical Biophysics, University of Western Ontario, London, Ontario, N6A 5C1, Canada; Department of Oncology, University of Western Ontario, London, Ontario, N6A 4L6, Canada; London Regional Cancer Program, London, Ontario, N6A 5W9, Canada
- Kathleen Surry
- Department of Medical Biophysics, University of Western Ontario, London, Ontario, N6A 5C1, Canada; Department of Oncology, University of Western Ontario, London, Ontario, N6A 4L6, Canada; London Regional Cancer Program, London, Ontario, N6A 5W9, Canada
- Chandima Edirisinghe
- Imaging Research Laboratories, Robarts Research Institute, University of Western Ontario, London, Ontario, N6A 5K8, Canada
- Jacques Montreuil
- Imaging Research Laboratories, Robarts Research Institute, University of Western Ontario, London, Ontario, N6A 5K8, Canada
- David D'Souza
- Department of Oncology, University of Western Ontario, London, Ontario, N6A 4L6, Canada; London Regional Cancer Program, London, Ontario, N6A 5W9, Canada
- Aaron Fenster
- Department of Medical Biophysics, University of Western Ontario, London, Ontario, N6A 5C1, Canada; Imaging Research Laboratories, Robarts Research Institute, University of Western Ontario, London, Ontario, N6A 5K8, Canada; Department of Oncology, University of Western Ontario, London, Ontario, N6A 4L6, Canada; Department of Physics and Astronomy, University of Western Ontario, London, Ontario, N6A 3K7, Canada
- Eugene Wong
- Department of Medical Biophysics, University of Western Ontario, London, Ontario, N6A 5C1, Canada; Department of Oncology, University of Western Ontario, London, Ontario, N6A 4L6, Canada; London Regional Cancer Program, London, Ontario, N6A 5W9, Canada; Department of Physics and Astronomy, University of Western Ontario, London, Ontario, N6A 3K7, Canada
34
Multi-scale RANSAC algorithm for needle localization in 3D ultrasound guided puncture surgery. Annu Int Conf IEEE Eng Med Biol Soc 2016; 2016:4113-4116. [PMID: 28269187 DOI: 10.1109/embc.2016.7591631] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Correct localization of the needle is of vital importance to guarantee successful puncture. The complexity of real US data increases the difficulty. A multi-scale random sample consensus (MS-RANSAC) algorithm is proposed in this paper to locate the needle in complicated 3D US data. The algorithm uses the radius difference between the needle and other tubular human tissues to extract the correct needle location. The performance of classic RANSAC and MS-RANSAC is compared using three different datasets. Results show that MS-RANSAC can locate the needle correctly in complicated conditions where classic RANSAC cannot. A parallel framework of the algorithm is designed and implemented using CUDA, making it usable online in real time.
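As a baseline for the comparison above, a classic RANSAC line fit over candidate needle voxels can be sketched as follows. Names and parameters are hypothetical, and the multi-scale radius filtering that distinguishes MS-RANSAC is not shown:

```python
import math
import random


def fit_line_ransac(points, n_iter=200, tol=1.0, seed=0):
    """Classic RANSAC 3-D line fit: repeatedly sample two candidate voxels,
    count points within `tol` of the line through them, and keep the
    consensus-maximising line as (point, unit direction)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(n_iter):
        p, q = rng.sample(points, 2)
        d = [q[i] - p[i] for i in range(3)]
        norm = math.sqrt(sum(c * c for c in d))
        if norm == 0:
            continue  # degenerate sample, skip
        d = [c / norm for c in d]
        inliers = 0
        for r in points:
            w = [r[i] - p[i] for i in range(3)]
            t = sum(w[i] * d[i] for i in range(3))  # projection onto the line
            dist2 = sum((w[i] - t * d[i]) ** 2 for i in range(3))
            if dist2 <= tol * tol:
                inliers += 1
        if inliers > best_inliers:
            best, best_inliers = (p, d), inliers
    return best, best_inliers
```

The multi-scale variant would additionally reject candidate lines whose supporting structure has a radius inconsistent with the needle, which is how it separates the needle from vessel-like tissue.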
35
Zhao Y, Shen Y, Bernard A, Cachard C, Liebgott H. Evaluation and comparison of current biopsy needle localization and tracking methods using 3D ultrasound. ULTRASONICS 2017; 73:206-220. [PMID: 27668998 DOI: 10.1016/j.ultras.2016.09.006] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/30/2015] [Revised: 08/21/2016] [Accepted: 09/07/2016] [Indexed: 06/06/2023]
Abstract
This article compares four different biopsy needle localization algorithms in both 3D and 4D situations to evaluate their accuracy and execution time. The localization algorithms were: principal component analysis (PCA), random Hough transform (RHT), parallel integral projection (PIP) and ROI-RK (ROI-based RANSAC and Kalman filter). To enhance the contrast between the biopsy needle and background tissue, a line filtering pre-processing step was implemented. To make the PCA, RHT and PIP algorithms comparable with the ROI-RK method, a region of interest (ROI) strategy was added. Simulated and ex-vivo data were used to evaluate the performance of the different biopsy needle localization algorithms. The resolutions of the sectorial and cylindrical volumes were 0.3 mm × 0.4 mm × 0.6 mm and 0.1 mm × 0.1 mm × 0.2 mm (axial × lateral × azimuthal), respectively. As far as the simulation and experimental results show, the ROI-RK method successfully located and tracked the biopsy needle in both 3D and 4D situations. The tip localization error was within 1.5 mm and the axis accuracy was within 1.6 mm. To the best of our knowledge, considering both localization accuracy and execution time, the ROI-RK was the most stable and time-saving method. Normally, accuracy comes at the expense of time. However, the ROI-RK method was able to locate the biopsy needle with high accuracy in real time, which makes it a promising method for clinical applications.
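Of the four algorithms compared above, the PCA approach is the simplest to sketch: center the candidate voxel coordinates, build their covariance matrix, and take the leading eigenvector as the needle axis. The following is an illustrative pure-Python version using power iteration, not the authors' code:

```python
import math


def principal_axis(points, n_iter=100):
    """Estimate the dominant axis of a 3-D point cloud: centre the points,
    form the 3x3 covariance matrix, and extract its leading eigenvector by
    power iteration. Returns (centroid, unit axis direction)."""
    n = len(points)
    mean = [sum(p[i] for p in points) / n for i in range(3)]
    cov = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in points) / n
            for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(n_iter):
        # Repeated multiplication by the covariance matrix converges to the
        # eigenvector with the largest eigenvalue (the needle direction).
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = math.sqrt(sum(x * x for x in w)) or 1.0
        v = [x / norm for x in w]
    return mean, v
```

PCA of this kind works well when the candidate voxels are dominated by the needle; the ROI strategy mentioned in the abstract restricts the input cloud so that this assumption holds.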
Affiliation(s)
- Yue Zhao
- Control Theory and Engineering, School of Astronautics, Harbin Institute of Technology, China
- Yi Shen
- Control Theory and Engineering, School of Astronautics, Harbin Institute of Technology, China
- Adeline Bernard
- CREATIS, Université de Lyon, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, France
- Christian Cachard
- CREATIS, Université de Lyon, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, France
- Hervé Liebgott
- CREATIS, Université de Lyon, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, France
36
Waine M, Rossa C, Sloboda R, Usmani N, Tavakoli M. Needle Tracking and Deflection Prediction for Robot-Assisted Needle Insertion Using 2D Ultrasound Images. J Med Robot Res 2016. [DOI: 10.1142/s2424905x16400018] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
In many types of percutaneous needle insertion surgeries, tissue deformation and needle deflection can create significant difficulties for accurate needle placement. In this paper, we present a method for automatic needle tracking in 2D ultrasound (US) images, which is used in a needle–tissue interaction model to estimate current and future needle tip deflection. This is demonstrated using a semi-automatic needle steering system. The US probe can be controlled to follow the needle tip, or it can be stopped at an appropriate position to avoid tissue deformation of the target area. US images are used to fully parameterize the needle-tissue model. Once the needle deflection reaches a pre-determined threshold, the robot rotates the needle to correct the tip's trajectory. Experimental results show that the final needle tip deflection can be estimated with average accuracies between 0.7 mm and 1.0 mm for insertions with and without rotation. The proposed method provides surgeons with improved US feedback of the needle tip deflection and minimizes the motion of the US probe to reduce tissue deformation of the target area.
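The threshold-triggered correction described above reduces to a simple decision rule: keep inserting while the estimated deflection is small, and rotate the bevel once it crosses the threshold. A toy sketch follows; the action names and threshold value are illustrative, and the real system rotates based on the model's predicted rather than instantaneous deflection:

```python
def steering_actions(deflections, threshold=2.0):
    """For each estimated tip deflection (mm), decide whether to keep
    advancing the needle or to command a 180-degree bevel rotation so that
    subsequent bending counteracts the accumulated error. A sketch of the
    rotate-on-threshold rule, not the paper's controller."""
    actions = []
    for d in deflections:
        if abs(d) >= threshold:
            actions.append("rotate_180")  # bevel flip corrects the trajectory
        else:
            actions.append("advance")     # deflection acceptable, keep inserting
    return actions
```

The value of the needle-tissue model in the paper is precisely that it can predict the future deflection, letting the rotation be commanded before, rather than after, the error budget is spent.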
Affiliation(s)
- Michael Waine
- Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB T6G 2V4, Canada
- Carlos Rossa
- Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB T6G 2V4, Canada
- Ron Sloboda
- Department of Oncology, University of Alberta, Edmonton, AB T6G 1Z2, Canada
- Nawaid Usmani
- Department of Oncology, University of Alberta, Edmonton, AB T6G 1Z2, Canada
- Mahdi Tavakoli
- Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB T6G 2V4, Canada
37
Khadem M, Rossa C, Sloboda RS, Usmani N, Tavakoli M. Ultrasound-Guided Model Predictive Control of Needle Steering in Biological Tissue. J Med Robot Res 2016. [DOI: 10.1142/s2424905x16400079] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
In needle-based medical procedures, beveled-tip flexible needles are steered inside soft tissue to reach the desired target locations. In this paper, we have developed an autonomous image-guided needle steering system that enhances targeting accuracy in needle insertion while minimizing tissue trauma. The system has three main components. The first is a novel mechanics-based needle steering model that predicts needle deflection and accepts needle tip rotation as an input for needle steering. The second is a needle tip tracking system that determines needle deflection from the ultrasound images. The needle steering model employs the estimated needle deflection at the present time to predict the needle tip trajectory in future steps. The third component is a nonlinear model predictive controller (NMPC) that steers the needle inside the tissue by rotating the needle's beveled tip. The NMPC controller calculates control decisions based on iterative optimization of the predictions of the needle steering model. To validate the proposed ultrasound-guided needle steering system, needle insertion experiments in biological tissue phantoms are performed in two cases: with and without an obstacle. The results demonstrate that our needle steering strategy guides the needle to the desired targets with a maximum error of 2.85 mm.
Affiliation(s)
- Mohsen Khadem, Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada
- Carlos Rossa, Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada
- Ron S. Sloboda, Cross Cancer Institute and the Department of Oncology, University of Alberta, Edmonton, Canada
- Nawaid Usmani, Cross Cancer Institute and the Department of Oncology, University of Alberta, Edmonton, Canada
- Mahdi Tavakoli, Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada
38
Chaudhury A, Brophy M, Barron JL. Junction-Based Correspondence Estimation of Plant Point Cloud Data Using Subgraph Matching. IEEE Geosci Remote Sens Lett 2016:1-5. [DOI: 10.1109/lgrs.2016.2571121]
39
Daoud MI, Rohling RN, Salcudean SE, Abolmaesumi P. Needle detection in curvilinear ultrasound images based on the reflection pattern of circular ultrasound waves. Med Phys 2015; 42:6221-33. [DOI: 10.1118/1.4932214]
Affiliation(s)
- Mohammad I. Daoud, Department of Computer Engineering, German Jordanian University, Amman 11180, Jordan
- Robert N. Rohling, Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada
- Septimiu E. Salcudean, Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada
- Purang Abolmaesumi, Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada
40
Waine M, Rossa C, Sloboda R, Usmani N, Tavakoli M. Three-Dimensional Needle Shape Estimation in TRUS-Guided Prostate Brachytherapy Using 2-D Ultrasound Images. IEEE J Biomed Health Inform 2015; 20:1621-1631. [PMID: 26372660] [DOI: 10.1109/jbhi.2015.2477829]
Abstract
In this paper, we propose an automated method to reconstruct the three-dimensional (3-D) needle shape during needle insertion procedures using only 2-D transverse ultrasound (US) images. Given a set of transverse US images, image processing and random sample consensus (RANSAC) are used to locate the needle within each image and estimate the needle shape. The method is validated with an in vitro needle insertion setup and a transparent tissue phantom, where two orthogonal cameras capture the true 3-D needle shape for verification. Results showed that using at least three images obtained at 75% of the maximum insertion depth or greater keeps the maximum needle shape estimation error below 2 mm. In addition, the needle shape can be calculated consistently as long as the needle can be identified in 30% of the transverse US images obtained. An application to permanent prostate brachytherapy is also presented, where the estimated needle shape is compared to manual segmentation and sagittal US images. Our method is intended to help assess needle placement during manual or robot-assisted needle insertion procedures after the needle has been inserted.
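The RANSAC step in methods like this one amounts to fitting a 3-D line (the needle axis) to noisy per-slice detections contaminated by outliers. Here is a hedged sketch; the synthetic data, iteration count, and inlier threshold are invented for illustration, not taken from the paper.

```python
# RANSAC line fitting in 3-D: repeatedly hypothesize a line from two random
# points and keep the hypothesis with the largest consensus (inlier) set.
import math
import random

def point_line_dist(p, a, d):
    """Distance from point p to the line through a with unit direction d."""
    v = [p[i] - a[i] for i in range(3)]
    t = sum(v[i] * d[i] for i in range(3))           # projection onto the line
    return math.dist(v, [t * d[i] for i in range(3)])

def ransac_line(points, iters=200, thresh=0.5, seed=0):
    """Return the best (anchor, direction) model and its inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        a, b = rng.sample(points, 2)
        d = [b[i] - a[i] for i in range(3)]
        norm = math.sqrt(sum(c * c for c in d))
        if norm == 0.0:
            continue                                  # degenerate sample
        d = [c / norm for c in d]
        inliers = [p for p in points if point_line_dist(p, a, d) < thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, d), inliers
    return best_model, best_inliers
```

With a few dozen collinear detections and a handful of gross outliers, the consensus set recovers the axis while the outliers are ignored, which is exactly why RANSAC suits low-quality US detections.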
41
Mignon P, Poignet P, Troccaz J. Using rotation for steerable needle detection in 3D color-Doppler ultrasound images. Annu Int Conf IEEE Eng Med Biol Soc 2015; 2015:1544-7. [PMID: 26736566] [DOI: 10.1109/embc.2015.7318666]
Abstract
This paper demonstrates a new way to detect needles in 3D color-Doppler volumes of biological tissue. It uses rotation, via an existing robotic brachytherapy system, to generate vibrations of the needle. The detection results for color-Doppler and B-mode ultrasound are compared to a needle location reference given by robot odometry and robot-ultrasound calibration. Average errors between detection and reference are 5.8 mm at the needle tip for B-mode images and 2.17 mm for color-Doppler images. These results show that color-Doppler imaging leads to more robust needle detection in noisy environments with poor needle visibility, or when the needle interacts with other objects.
42
Wu X, Housden J, Ma Y, Razavi B, Rhode K, Rueckert D. Fast catheter segmentation from echocardiographic sequences based on segmentation from corresponding X-ray fluoroscopy for cardiac catheterization interventions. IEEE Trans Med Imaging 2015; 34:861-76. [PMID: 25291790] [DOI: 10.1109/tmi.2014.2360988]
Abstract
Echocardiography is a potential alternative to X-ray fluoroscopy in cardiac catheterization given its richness in soft tissue information and its lack of ionizing radiation. However, its small field of view and acoustic artifacts make direct automatic segmentation of catheters very challenging. In this study, a fast catheter segmentation framework for echocardiographic imaging, guided by the segmentation of corresponding X-ray fluoroscopic imaging, is proposed. The complete framework consists of: 1) catheter initialization in the first X-ray frame; 2) catheter tracking in the rest of the X-ray sequence; 3) fast registration of corresponding X-ray and ultrasound frames; and 4) catheter segmentation in ultrasound images guided by the results of both X-ray tracking and fast registration. The main contributions include: 1) a Kalman filter-based growing strategy with more clinical data evaluation; 2) a SURF detector applied in a constrained search space for catheter segmentation in ultrasound images; 3) a two-layer hierarchical graph model to integrate and smooth catheter fragments into a complete catheter; and 4) the integration of these components into a system for clinical applications. This framework is evaluated on five sequences of porcine data and four sequences of patient data comprising more than 3000 X-ray frames and more than 1000 ultrasound frames. The results show that our algorithm is able to track the catheter in ultrasound images at 1.3 s per frame, with an error of less than 2 mm. Although this accuracy may satisfy visualization purposes and the method is fast, the algorithm still needs to be further accelerated for real-time clinical applications.
43
Enhanced needle localization in ultrasound using beam steering and learning-based segmentation. Comput Med Imaging Graph 2015; 41:46-54. [DOI: 10.1016/j.compmedimag.2014.06.016]
44
Adebar TK, Fletcher AE, Okamura AM. 3-D ultrasound-guided robotic needle steering in biological tissue. IEEE Trans Biomed Eng 2014; 61:2899-910. [PMID: 25014948] [DOI: 10.1109/tbme.2014.2334309]
Abstract
Robotic needle steering systems have the potential to greatly improve medical interventions, but they require new methods for medical image guidance. Three-dimensional (3-D) ultrasound is a widely available, low-cost imaging modality that may be used to provide real-time feedback to needle steering robots. Unfortunately, the poor visibility of steerable needles in standard grayscale ultrasound makes automatic segmentation of the needles impractical. A new imaging approach is proposed, in which high-frequency vibration of a steerable needle makes it visible in ultrasound Doppler images. Experiments demonstrate that segmentation from this Doppler data is accurate to within 1-2 mm. An image-guided control algorithm that incorporates the segmentation data as feedback is also described. In experimental tests in ex vivo bovine liver tissue, a robotic needle steering system implementing this control scheme was able to consistently steer a needle tip to a simulated target with an average error of 1.57 mm. Implementation of 3-D ultrasound-guided needle steering in biological tissue represents a significant step toward the clinical application of robotic needle steering.
45
Qiu W, Yuchi M, Ding M. Phase grouping-based needle segmentation in 3-D trans-rectal ultrasound-guided prostate trans-perineal therapy. Ultrasound Med Biol 2014; 40:804-816. [PMID: 24462163] [DOI: 10.1016/j.ultrasmedbio.2013.11.004]
Abstract
A robust and efficient needle segmentation method used to localize and track the needle in 3-D trans-rectal ultrasound (TRUS)-guided prostate therapy is proposed. The algorithmic procedure begins by cropping the 3-D US image containing a needle; then all voxels in the cropped 3-D image are grouped into different line support regions (LSRs) based on the outer product of the adjacent voxels' gradient vector. Two different needle axis extraction methods in the candidate LSR are presented: least-squares fitting and 3-D randomized Hough transform. Subsequent local optimization refines the position of the needle axis. Finally, the needle endpoint is localized by finding an intensity drop along the needle axis. The proposed methods were validated with 3-D TRUS tissue-mimicking agar phantom images, chicken breast phantom images and patient images obtained during prostate cryotherapy. The results of the in vivo test indicate that our method can localize the needle accurately and robustly with a needle endpoint localization accuracy <1.43 mm and detection accuracy >84%, which are favorable for 3-D TRUS-guided prostate trans-perineal therapy.
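The final step above, localizing the needle endpoint by finding an intensity drop along the fitted axis, can be sketched in a few lines; the intensity profile and the drop threshold below are invented for illustration, not the authors' values.

```python
def find_tip(profile, drop=50):
    """Return the index along the needle axis where the sampled intensity
    first falls by at least `drop` relative to the previous sample; if no
    such drop exists, assume the needle spans the whole profile."""
    for i in range(1, len(profile)):
        if profile[i - 1] - profile[i] >= drop:
            return i
    return len(profile) - 1

# A bright needle shaft followed by darker background: the tip sits at the drop.
profile = [210, 205, 208, 200, 202, 60, 55, 58]
```

In practice the profile would be sampled from the 3-D US volume along the axis recovered by the line-support-region fit, with some smoothing before thresholding to suppress speckle.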
Affiliation(s)
- Wu Qiu, Department of Biomedical Engineering, School of Life Science and Technology, Huazhong University of Science and Technology, Wuhan, Hubei, China
- Ming Yuchi, Department of Biomedical Engineering, School of Life Science and Technology, Huazhong University of Science and Technology, Wuhan, Hubei, China
- Mingyue Ding, Department of Biomedical Engineering, School of Life Science and Technology, Huazhong University of Science and Technology, Wuhan, Hubei, China
46

47

48
Uherčík M, Kybic J, Zhao Y, Cachard C, Liebgott H. Line filtering for surgical tool localization in 3D ultrasound images. Comput Biol Med 2013; 43:2036-45. [DOI: 10.1016/j.compbiomed.2013.09.020]
49
Diarra B, Robini M, Tortoli P, Cachard C, Liebgott H. Design of Optimal 2-D Nongrid Sparse Arrays for Medical Ultrasound. IEEE Trans Biomed Eng 2013; 60:3093-102. [DOI: 10.1109/tbme.2013.2267742]
50
Zhao Y, Cachard C, Liebgott H. Automatic needle detection and tracking in 3D ultrasound using an ROI-based RANSAC and Kalman method. Ultrason Imaging 2013; 35:283-306. [PMID: 24081726] [DOI: 10.1177/0161734613502004]
Abstract
This article proposes a robust technique for needle detection and tracking using three-dimensional ultrasound (3D US). It is difficult for radiologists to detect and follow the position of micro-tools, such as biopsy needles, inserted in human tissues under 3D US guidance. To overcome this difficulty, we propose a method that automatically reduces the processed volume to a limited region of interest (ROI), which simultaneously increases the calculation speed and the robustness of the technique. First, a line-filter method that enhances the contrast of the needle against the background is used to initialize the ROI from the tubularness information of the complete US volume. Then, the random sample consensus (RANSAC) and Kalman filter (RK) algorithm is applied in the ROI to detect and track the precise position of the needle. A series of numerical inhomogeneous phantoms with a simulated needle, derived from real 3D US volumes, is used to evaluate our method. The results show that the proposed method is much more robust than the RANSAC algorithm alone when detecting the needle, regardless of whether the insertion axis corresponds to an acquisition plane in the 3D US volume. The possibility of failure is also discussed in this article.
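The Kalman half of such an RK pipeline can be illustrated with a scalar constant-position filter smoothing one coordinate of per-volume tip detections; the process and measurement noise values q and r below are assumed tuning values, not the article's.

```python
def kalman_track(measurements, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Constant-position Kalman filter over a sequence of noisy detections."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                # predict: uncertainty grows by process noise q
        k = p / (p + r)       # Kalman gain: how much to trust the new detection
        x += k * (z - x)      # correct the state toward the detection
        p *= 1.0 - k          # shrink uncertainty after the update
        estimates.append(x)
    return estimates
```

Fed detections that jitter around a fixed tip coordinate, the estimate settles near the true value even though each individual detection stays noticeably off, which is the benefit of temporal tracking over per-volume detection alone.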
Affiliation(s)
- Yue Zhao, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, Université de Lyon, France