1. Watanabe H, Fukuda H, Ezawa Y, Matsuyama E, Kondo Y, Hayashi N, Ogura T, Shimosegawa M. Automated angular measurement for puncture angle using a computer-aided method in ultrasound-guided peripheral insertion. Phys Eng Sci Med 2024; 47:679-689. PMID: 38358620. DOI: 10.1007/s13246-024-01397-x.
Abstract
Ultrasound guidance has become the gold standard for obtaining vascular access. Angle information, which indicates the entry angle of the needle into the vein, is required to ensure puncture success. Although various image processing-based methods, such as deep learning, have recently been applied to improve needle visibility, these methods have limitations, in that the puncture angle to the target organ is not measured. We aim to detect the target vessel and puncture needle and to derive the puncture angle by combining deep learning and conventional image processing methods such as the Hough transform. Median cubital vein US images were obtained from 20 healthy volunteers, and images of simulated blood vessels and needles were obtained during the puncture of a simulated blood vessel in four phantoms. The U-Net architecture was used to segment images of blood vessels and needles, and various image processing methods were employed to automatically measure angles. The experimental results indicated that the mean Dice coefficients of median cubital veins, simulated blood vessels, and needles were 0.826, 0.931, and 0.773, respectively. The quantitative results of angular measurement showed good agreement between expert and automatic measurements of the puncture angle, with a correlation of 0.847. Our findings indicate that the proposed method achieves extremely high segmentation accuracy and automated angular measurement. The proposed method reduces the variability and time required in manual angle measurements and allows the operator to concentrate on the delicate techniques related to the direction of the needle.
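As an illustration of the conventional image processing half of such a pipeline, the sketch below shows how a Hough transform can recover the dominant line angle from the pixels of a segmented needle mask. This is a minimal, pure-Python illustration of the general technique, not the authors' implementation; the synthetic point set and the 1-degree/1-pixel bin resolution are arbitrary assumptions.

```python
import math

def hough_line_angle(points, angle_steps=180):
    """Return the dominant line angle (degrees from horizontal) of a point
    set, using a minimal Hough transform over (theta, rho) space."""
    if not points:
        raise ValueError("empty point set")
    acc = {}
    for step in range(angle_steps):
        theta = math.pi * step / angle_steps
        c, s = math.cos(theta), math.sin(theta)
        for x, y in points:
            rho = round(x * c + y * s)  # quantize rho into 1-pixel bins
            acc[(step, rho)] = acc.get((step, rho), 0) + 1
    (best_step, _), _ = max(acc.items(), key=lambda kv: kv[1])
    theta = math.pi * best_step / angle_steps
    # theta is the line's normal angle; the line direction is 90 deg away
    return (math.degrees(theta) + 90.0) % 180.0

# Synthetic "needle" mask pixels along a 30-degree line
pts = [(x, x * math.tan(math.radians(30))) for x in range(50)]
angle = hough_line_angle(pts)
```

In practice the point set would come from the U-Net needle mask, and the vessel angle would be estimated analogously so the puncture angle is the difference of the two.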
Affiliation(s)
- Haruyuki Watanabe
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan.
- Hironori Fukuda
- Department of Radiology, Cardiovascular Hospital of Central Japan, Shibukawa, Japan
- Yuina Ezawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Eri Matsuyama
- Faculty of Informatics, The University of Fukuchiyama, Fukuchiyama, Japan
- Yohan Kondo
- Graduate School of Health Sciences, Niigata University, Niigata, Japan
- Norio Hayashi
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Toshihiro Ogura
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Masayuki Shimosegawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
2. Duan Y, Ling J, Feng Z, Ye T, Sun T, Zhu Y. A Survey of Needle Steering Approaches in Minimally Invasive Surgery. Ann Biomed Eng 2024; 52:1492-1517. PMID: 38530535. DOI: 10.1007/s10439-024-03494-0.
Abstract
By virtue of a curved insertion path inside tissues, needle steering techniques have revealed their potential with the assistance of medical robots and imaging. The superiority of this technique has been preliminarily verified with several maneuvers: target realignment, obstacle circumvention, and multi-target access. However, the momentum of needle steering approaches in the past decade leads to an open question: "How to choose an applicable needle steering approach for a specific clinical application?" This survey discusses this question in terms of design choices and clinical considerations, respectively. In view of design choices, this survey proposes a hierarchical taxonomy of current needle steering approaches. Needle steering approaches of different manipulations and designs are classified to systematically review the design choices and their influences on clinical treatments. In view of clinical considerations, this survey discusses the steerability and acceptability of the current needle steering approaches. On this basis, the pros and cons of the current needle steering approaches are weighed and their suitable applications are summarized. Finally, this survey concludes with an outlook on needle steering techniques, including potential clinical applications and future developments in mechanical design.
Affiliation(s)
- Yuzhou Duan
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 210016, China
- Jie Ling
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 210016, China.
- Zhao Feng
- School of Power and Mechanical Engineering, Wuhan University, Wuhan, 430072, China
- Wuhan University Shenzhen Research Institute, Shenzhen, 518057, China
- Tingting Ye
- Industrial and Systems Engineering Department, The Hong Kong Polytechnic University, Hong Kong SAR, 999077, China
- Tairen Sun
- School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai, 200093, China
- Yuchuan Zhu
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 210016, China
3. Che H, Qin J, Chen Y, Ji Z, Yan Y, Yang J, Wang Q, Liang C, Wu J. Improving Needle Tip Tracking and Detection in Ultrasound-Based Navigation System Using Deep Learning-Enabled Approach. IEEE J Biomed Health Inform 2024; 28:2930-2942. PMID: 38215329. DOI: 10.1109/jbhi.2024.3353343.
Abstract
Ultrasound-guided percutaneous interventions have numerous advantages over traditional techniques. Accurate needle placement in the target anatomy is crucial for successful intervention, and reliable visual information is essential to achieve this. However, previous studies have revealed several challenges, such as the variability in needle echogenicity and the common misalignment of the ultrasound beam and the needle. Advanced techniques have been developed to optimize needle visualization, including hardware-based and image-processing-based methods. This paper proposes a novel strategy of integrating ultrasound-based deep learning approaches into an optical navigation system to enhance needle visualization and improve tip positioning accuracy. Both the tracking and detection algorithms are optimized utilizing optical tracking information. The information is introduced into the tracking network to define the search patch update strategy and form a trajectory reference to correct tracking results. In the detection network, the original image is processed according to the needle insertion position and current position given by the optical localization system to locate a coarse region, and the depth-score criterion is adopted to optimize detection results. Extensive experiments demonstrate that our approach achieves promising tip tracking and detection performance with tip localization errors of 1.11 ± 0.59 mm and 1.17 ± 0.70 mm, respectively. Moreover, we establish a paired dataset consisting of ultrasound images and their corresponding spatial tip coordinates acquired from the optical tracking system and conduct real puncture experiments to verify the effectiveness of the proposed methods. Our approach significantly improves needle visualization and provides physicians with visual guidance for posture adjustment.
4. Wang R, Tan G, Liu X. Robust tip localization under continuous spatial and temporal constraints during 2D ultrasound-guided needle puncture. Int J Comput Assist Radiol Surg 2023; 18:2233-2242. PMID: 37160581. DOI: 10.1007/s11548-023-02894-2.
Abstract
PURPOSE During ultrasound-guided (US-guided) needle puncture for minimally invasive procedures, automated needle tip localization can help clinicians capture small tips in US images easily and precisely, providing them with clear tip indicators on the screen and giving them more confidence during the procedures. However, automated needle tip localization in US images is challenging due to serious interference from all kinds of echoes. METHODS We propose a method that localizes needle tips under continuous spatial and temporal constraints in the real-time US frame stream. A temporal constraint is first acquired by detecting translational tip motion in motion-enhanced US images with a deep learning-based (DL-based) detector. A spatial constraint and candidate tip locations are obtained by detecting needle shafts and tips in the raw grayscale B-mode images with another DL-based detector. To provide continuous constraints, the tip velocity estimated from the acquired temporal constraint is used to predict tip locations in frames where no temporal or spatial constraint is detected. Finally, tip coordinates are precisely localized among candidate tips under the spatial and temporal constraints. RESULTS Experimental results evaluated on 1121 US images from porcine organ punctures and 895 images from human thyroid punctures demonstrate that the proposed method is effective and efficient, surpassing existing methods. On the porcine organ data, a 97.2% recall rate, a 91.9% precision rate on tip detection, and a 0.88 ± 0.70 mm root-mean-square error (RMSE) on tip localization were achieved. On the human thyroid data, which was not involved in training, 86.5% recall, 84.3% precision, and 0.92 ± 0.78 mm RMSE were achieved. A running speed of 14.5 frames per second was achieved using only a CPU.
CONCLUSION The proposed method provides a more reliable solution for automated needle tip localization during US-guided needle puncture, being more robust to interference. Its fast running speed makes it practical for real-time US streams.
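The velocity-based bridging step described above can be sketched in a few lines: when neither the temporal nor the spatial constraint fires in a frame, the last estimated tip velocity extrapolates the tip position. This is a generic constant-velocity sketch, not the paper's code; the track format (a list of per-frame tip coordinates) is an assumption.

```python
def predict_tip(track, n_ahead=1):
    """Extrapolate future tip positions with a constant-velocity model.

    track: list of (x, y) tip locations from frames where the tip was
    actually detected; the last two fix the velocity estimate.
    """
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = x1 - x0, y1 - y0  # pixels per frame
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, n_ahead + 1)]

# Bridge two frames with no detection, after tips at (10, 5) and (12, 6)
bridged = predict_tip([(10, 5), (12, 6)], n_ahead=2)
```

Once a detection fires again, the candidate closest to the extrapolated position would be accepted and the velocity re-estimated.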
Affiliation(s)
- Ruixin Wang
- College of Computer and Information, Hohai University, Nanjing, 210098, China
- Guoping Tan
- College of Computer and Information, Hohai University, Nanjing, 210098, China.
- Xiaohui Liu
- The First People's Hospital of Kunshan, Affiliated Kunshan Hospital of Jiangsu University, Kunshan, 215300, China
5. Lin X, Shi H, Fan X, Wang J, Fu Z, Chen Y, Chen S, Chen X, Chen M. Handheld interventional ultrasound/photoacoustic puncture needle navigation based on deep learning segmentation. Biomed Opt Express 2023; 14:5979-5993. PMID: 38021141. PMCID: PMC10659795. DOI: 10.1364/boe.504999.
Abstract
Interventional ultrasound (US) faces challenges in accurate localization of the puncture needle due to intrinsic acoustic interference, which leads to blurred, indistinct, or even invisible needles in handheld linear-array-transducer-based US navigation, and especially to incorrect needle tip positioning. Photoacoustic (PA) imaging can provide complementary image contrast without additional data acquisition. Herein, we proposed internal illumination to light up only the needle tip in PA imaging. Deep-learning-based feature segmentation then alleviates acoustic interference, enhancing the visibility of the needle shaft and tip. Further, needle shaft-tip compensation aligned the needle shaft in the US image with the needle tip in the PA image. Experiments on phantom, ex vivo chicken breast, preclinical radiofrequency ablation, and in vivo biopsy of sentinel lymph nodes were conducted. The target registration error reached the submillimeter level, demonstrating precise puncture needle tracking with in-plane US/PA navigation.
Affiliation(s)
- Xiangwei Lin
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Hongji Shi
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Xiaozhou Fan
- Department of Ultrasound, Air Force Medical Center, Air Force Medical University, 30 Fucheng Road, Beijing 100142, China
- Jiaxin Wang
- School of Chinese Materia Medica, Beijing University of Chinese Medicine, 11 Huandong Road, Beijing 102488, China
- Zhenyu Fu
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Yuqing Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Siping Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Xin Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
- Mian Chen
- School of Biomedical Engineering, Shenzhen University, 1066 Xueyuan Ave, Shenzhen 518057, China
6. Arapi V, Hardt-Stremayr A, Weiss S, Steinbrener J. Bridging the simulation-to-real gap for AI-based needle and target detection in robot-assisted ultrasound-guided interventions. Eur Radiol Exp 2023; 7:30. PMID: 37332035. DOI: 10.1186/s41747-023-00344-x.
Abstract
BACKGROUND Artificial intelligence (AI)-powered, robot-assisted, and ultrasound (US)-guided interventional radiology has the potential to increase the efficacy and cost-efficiency of interventional procedures while improving postsurgical outcomes and reducing the burden on medical personnel. METHODS To overcome the lack of available clinical data needed to train state-of-the-art AI models, we propose a novel approach for generating synthetic ultrasound data from real, clinical preoperative three-dimensional (3D) data of different imaging modalities. With the synthetic data, we trained a deep learning-based detection algorithm for the localization of needle tip and target anatomy in US images. We validated our models on real, in vitro US data. RESULTS The resulting models generalize well to unseen synthetic data and experimental in vitro data, making the proposed approach a promising method for creating AI-based models for needle and target detection in minimally invasive US-guided procedures. Moreover, we show that with a one-time calibration of the US and robot coordinate frames, our tracking algorithm can be used to accurately fine-position the robot within reach of the target based on 2D US images alone. CONCLUSIONS The proposed data generation approach is sufficient to bridge the simulation-to-real gap and has the potential to overcome data paucity challenges in interventional radiology. The proposed AI-based detection algorithm shows very promising results in terms of accuracy and frame rate. RELEVANCE STATEMENT This approach can facilitate the development of next-generation AI algorithms for patient anatomy detection and needle tracking in US and their application to robotics. KEY POINTS
- AI-based methods show promise for needle and target detection in US-guided interventions.
- Publicly available, annotated datasets for training AI models are limited.
- Synthetic, clinical-like US data can be generated from magnetic resonance or computed tomography data.
- Models trained with synthetic US data generalize well to real in vitro US data.
- Target detection with an AI model can be used for fine positioning of the robot.
Affiliation(s)
- Visar Arapi
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria.
- Alexander Hardt-Stremayr
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Stephan Weiss
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
- Jan Steinbrener
- Control of Networked Systems Research Group, Institute of Smart Systems Technologies, University of Klagenfurt, Klagenfurt, Austria
7. Magana-Salgado U, Namburi P, Feigin-Almon M, Pallares-Lopez R, Anthony B. A comparison of point-tracking algorithms in ultrasound videos from the upper limb. Biomed Eng Online 2023; 22:52. PMID: 37226240. DOI: 10.1186/s12938-023-01105-y.
Abstract
Tracking points in ultrasound (US) videos can be especially useful to characterize tissues in motion. Tracking algorithms that analyze successive video frames, such as variations of Optical Flow and Lucas-Kanade (LK), exploit frame-to-frame temporal information to track regions of interest. In contrast, convolutional neural-network (CNN) models process each video frame independently of neighboring frames. In this paper, we show that frame-to-frame trackers accumulate error over time. We propose three interpolation-like methods to combat error accumulation and show that all three methods reduce tracking errors in frame-to-frame trackers. On the neural-network end, we show that a CNN-based tracker, DeepLabCut (DLC), outperforms all four frame-to-frame trackers when tracking tissues in motion. DLC is more accurate than the frame-to-frame trackers and less sensitive to variations in types of tissue movement. The only caveat found with DLC comes from its non-temporal tracking strategy, leading to jitter between consecutive frames. Overall, when tracking points in videos of moving tissue, we recommend using DLC when prioritizing accuracy and robustness across movements in videos, and using LK with the proposed error-correction methods for small movements when tracking jitter is unacceptable.
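The error-accumulation behavior of frame-to-frame trackers noted above is easy to reproduce numerically: if each frame contributes an independent matching error, the tracked position performs a random walk whose expected drift grows roughly as sigma * sqrt(N). This is a toy simulation under that i.i.d.-error assumption, not an analysis of the paper's data.

```python
import random

def chained_error(n_frames, sigma, seed):
    """Final position error of a frame-to-frame tracker whose per-frame
    matching errors are i.i.d. Gaussian: the errors sum into a random walk."""
    rng = random.Random(seed)
    return abs(sum(rng.gauss(0.0, sigma) for _ in range(n_frames)))

def mean_drift(n_frames, sigma=0.1, trials=200):
    """Average absolute drift over many runs; it grows roughly as
    sigma * sqrt(n_frames) rather than staying bounded."""
    return sum(chained_error(n_frames, sigma, s) for s in range(trials)) / trials

short_drift, long_drift = mean_drift(25), mean_drift(400)
```

A tracker seeded once at the first frame thus drifts without bound, which is why interpolation-like corrections, or a non-temporal per-frame tracker such as DLC, bound the error instead.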
Affiliation(s)
- Uriel Magana-Salgado
- Department of Mechanical Engineering, MIT, Cambridge, MA, 02139, USA
- Mechanical Engineering Graduate Program, MIT, Cambridge, MA, 02139, USA
- Praneeth Namburi
- Institute for Medical Engineering and Science, Massachusetts Institute of Technology, 77 Massachusetts Ave, 12-3211, Cambridge, MA, 02139, USA.
- MIT.Nano Immersion Lab, MIT, Cambridge, MA, 02139, USA.
- Roger Pallares-Lopez
- Department of Mechanical Engineering, MIT, Cambridge, MA, 02139, USA
- Mechanical Engineering Graduate Program, MIT, Cambridge, MA, 02139, USA
- Brian Anthony
- Department of Mechanical Engineering, MIT, Cambridge, MA, 02139, USA
- Institute for Medical Engineering and Science, Massachusetts Institute of Technology, 77 Massachusetts Ave, 12-3211, Cambridge, MA, 02139, USA
- MIT.Nano Immersion Lab, MIT, Cambridge, MA, 02139, USA
8. Malamal G, Panicker MR. On the physics of ultrasound transmission for in-plane needle tracking in guided interventions. Biomed Phys Eng Express 2023; 9. PMID: 36898145. DOI: 10.1088/2057-1976/acc338.
Abstract
Objective. In ultrasound (US) guided interventions, the accurate visualization and tracking of needles is a critical challenge, particularly during in-plane insertions. Inaccurate identification and localization of needles leads to severe inadvertent complications and increased procedure times. This is due to the inherent specular reflections from the needle, whose directivity depends on the angle of incidence of the US beam and on the needle inclination. Approach. Though several methods have been proposed for improved needle visualization, a detailed study emphasizing the physics of the specular reflections resulting from the interaction of the transmitted US beam with the needle remains to be explored. In this work, we discuss the properties of specular reflections from planar and spherical wave US transmissions, respectively through multi-angle plane wave (PW) and synthetic transmit aperture (STA) techniques, for in-plane needle insertion angles between 15° and 50°. Main Results. The qualitative and quantitative results from simulations and experiments reveal that spherical waves enable better visualization and characterization of needles than planar wavefronts. Needle visibility in PW transmissions is more severely degraded by the receive aperture weighting during image reconstruction than in STA, due to greater deviation in reflection directivity. It is also observed that the spherical wave characteristics start to alter to planar characteristics due to wave divergence at large needle insertion depths. Significance. The study highlights that synergistic transmit-receive imaging schemes addressing the physical properties of reflections from the transmit wavefronts are imperative for precise imaging of needle interfaces, and hence have strong potential to elevate the quality of outcomes from US guided interventional practices.
Affiliation(s)
- Gayathri Malamal
- Center for Computational Imaging, Dept. of Electrical Engineering, Indian Institute of Technology Palakkad, India
9. Automatic and accurate needle detection in 2D ultrasound during robot-assisted needle insertion process. Int J Comput Assist Radiol Surg 2021; 17:295-303. PMID: 34677747. DOI: 10.1007/s11548-021-02519-6.
Abstract
PURPOSE Robot-assisted needle insertion guided by 2D ultrasound (US) can effectively improve the accuracy and success rate of clinical puncture. To this end, automatic and accurate needle-tracking methods are important for monitoring puncture processes, avoiding deviation of the needle from the intended path, and reducing the risk of injury to surrounding tissues. This work aims to develop a framework for automatic and accurate detection of an inserted needle in 2D US images during the insertion process. METHODS We propose a novel convolutional neural network architecture comprising a two-channel encoder and a single-channel decoder for needle segmentation, using needle motion information extracted from two adjacent US image frames. Based on this network, we further propose an automatic needle detection framework. According to the prediction result of the previous frame, a region of interest around the needle in the US image is extracted and fed into the proposed network to achieve finer and faster continuous needle localization. RESULTS The performance of our method was evaluated on 1000 pairs of US images extracted from robot-assisted needle insertions into freshly excised bovine and porcine tissues. The needle segmentation network achieved 99.7% accuracy, 86.2% precision, 89.1% recall, and an F1-score of 0.87. The needle detection framework successfully localized the needle with a mean tip error of 0.45 ± 0.33 mm and a mean orientation error of 0.42° ± 0.34°, and achieved a total processing time of 50 ms per image. CONCLUSION The proposed framework demonstrated the capability to realize robust, accurate, and real-time needle localization during robot-assisted needle insertion processes. It has promising applications in tracking the needle and ensuring the safety of robot-assisted automatic puncture during challenging US-guided minimally invasive procedures.
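The segmentation scores above can be cross-checked: the F1-score is the harmonic mean of precision and recall, and computing it from the reported 86.2% precision and 89.1% recall gives approximately the reported 0.87. A quick check:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Precision and recall as reported in the abstract above
f1 = f1_score(0.862, 0.891)
```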
10. Sánchez-Margallo JA, Tas L, Moelker A, van den Dobbelsteen JJ, Sánchez-Margallo FM, Langø T, van Walsum T, van de Berg NJ. Block-matching-based registration to evaluate ultrasound visibility of percutaneous needles in liver-mimicking phantoms. Med Phys 2021; 48:7602-7612. PMID: 34665885. PMCID: PMC9298012. DOI: 10.1002/mp.15305.
Abstract
Purpose To present a novel methodical approach to compare visibility of percutaneous needles in ultrasound images. Methods A motor-driven rotation platform was used to gradually change the needle angle while capturing image data. Data analysis was automated using block-matching-based registration, with a tracking and refinement step. Every 25 frames, a Hough transform was used to improve needle alignments after large rotations. The method was demonstrated by comparing three commercial needles (14G radiofrequency ablation, RFA; 18G Trocar; 22G Chiba) and six prototype needles with different sizes, materials, and surface conditions (polished, sand-blasted, and kerfed), within polyvinyl alcohol phantom tissue and ex vivo bovine liver models. For each needle and angle, a contrast-to-noise ratio (CNR) was determined to quantify visibility. CNR values are presented as a function of needle type and insertion angle. In addition, the normalized area under the (CNR-angle) curve was used as a summary metric to compare needles. Results In phantom tissue, the first kerfed needle design had the largest normalized area of visibility and the polished 1 mm diameter stainless steel needle the smallest (0.704 ± 0.199 vs. 0.154 ± 0.027, p < 0.01). In the ex vivo model, the second kerfed needle design had the largest normalized area of visibility, and the sand-blasted stainless steel needle the smallest (0.470 ± 0.190 vs. 0.127 ± 0.047, p < 0.001). As expected, the analysis showed needle visibility peaks at orthogonal insertion angles. For acute or obtuse angles, needle visibility was similar or reduced. Overall, the variability in needle visibility was considerably higher in livers. Conclusion The best overall visibility was found with kerfed needles and the commercial RFA needle. The presented methodical approach to quantify ultrasound visibility allows comparisons of (echogenic) needles, as well as other technological innovations aiming to improve ultrasound visibility of percutaneous needles, such as coatings, material treatments, and beam steering approaches.
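The visibility metrics used above are straightforward to compute. The sketch below shows a common CNR definition and a trapezoidal area under the CNR-angle curve; the paper's exact normalization is not spelled out here, so normalizing by an ideal needle holding cnr_max over the full angular range is an assumption made for illustration.

```python
import statistics

def cnr(signal, background):
    """Contrast-to-noise ratio: |mean(signal) - mean(background)| / std(background)."""
    return (abs(statistics.fmean(signal) - statistics.fmean(background))
            / statistics.pstdev(background))

def normalized_auc(angles_deg, cnrs, cnr_max):
    """Trapezoidal area under the CNR-angle curve, normalized by the area of
    a hypothetical needle holding cnr_max across the whole angular range."""
    area = sum(0.5 * (c0 + c1) * (a1 - a0)
               for (a0, c0), (a1, c1) in zip(zip(angles_deg, cnrs),
                                             zip(angles_deg[1:], cnrs[1:])))
    return area / (cnr_max * (angles_deg[-1] - angles_deg[0]))
```

With this convention, a needle whose CNR sits at half the reference value across all angles scores 0.5.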
Affiliation(s)
- Juan A Sánchez-Margallo
- Bioengineering and Health Technologies Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
- Lisette Tas
- Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- Adriaan Moelker
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Theo van Walsum
- Biomedical Imaging Group Rotterdam, Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, The Netherlands
- Nick J van de Berg
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
11. Jeng GS, Wang YA, Liu PY, Li PC. Laser-Generated Leaky Acoustic Wave Imaging for Interventional Guidewire Guidance. IEEE Trans Ultrason Ferroelectr Freq Control 2021; 68:2496-2506. PMID: 33780337. DOI: 10.1109/tuffc.2021.3069474.
Abstract
Ultrasound (US) is widely used to visualize both tissue and the positions of surgical instruments in real time during surgery. Previously, we proposed a method to exploit US imaging and laser-generated leaky acoustic waves (LAWs) for needle visualization. Although successful, that method only detects the position of the needle tip, with the location of the entire needle deduced from knowing that the needle is straight. The purpose of the current study was to develop a beamforming-based method for the direct visualization of objects. The approach can be applied to objects with arbitrary shapes, such as the guidewires commonly used in interventional guidance. With this method, illumination by a short laser pulse generates photoacoustic waves at the top of the guidewire that propagate down its metal surface. These waves then leak into the surrounding tissue and can be detected by a US array transducer. The time of flight consists of two parts: 1) the propagation time of the guided waves on the guidewire and 2) the propagation time of the US that leaks into the tissue. In principle, an image of the guidewire can be formed by array beamforming that takes the propagation time on the metal into consideration. Furthermore, we introduced directional filtering and a matched filter to compress the dispersive signal associated with long propagation times. The results showed that guidewires could be detected at depths of at least 70 mm. The maximum detectable angle was 56.3°. LAW imaging with a 1268-mm-long guidewire was also demonstrated. The proposed method has considerable potential in new clinical applications.
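The two-part time of flight described above translates directly into a beamforming delay rule. A minimal sketch, assuming nominal sound speeds; the guided-wave speed on the wire (here 3100 m/s) is an illustrative value, not the paper's measured dispersion behavior.

```python
import math

def law_arrival_us(d_guide_mm, leak_xz_mm, elem_x_mm,
                   c_guide=3100.0, c_tissue=1540.0):
    """One-way arrival time (in microseconds) at one array element:
    guided propagation along the wire to the leak point, plus the
    leaked ultrasound path through tissue to the element at depth 0."""
    x, z = leak_xz_mm
    d_leak_mm = math.hypot(x - elem_x_mm, z)
    # mm / (m/s) gives milliseconds; scale to microseconds
    return (d_guide_mm / c_guide + d_leak_mm / c_tissue) * 1e3

# Leak point 70 mm down the wire, at (20 mm, 50 mm); element at x = 0
t_us = law_arrival_us(70.0, (20.0, 50.0), 0.0)
```

Evaluating this delay for every pixel/element pair and summing the matched channel samples is the delay-and-sum beamforming idea the study builds on, with its dispersion compensation layered on top.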
12. Konh B, Padasdao B, Batsaikhan Z, Ko SY. Integrating robot-assisted ultrasound tracking and 3D needle shape prediction for real-time tracking of the needle tip in needle steering procedures. Int J Med Robot 2021; 17:e2272. PMID: 33951748. DOI: 10.1002/rcs.2272.
Abstract
BACKGROUND Needle insertions have been used in several minimally invasive procedures for diagnostic and therapeutic purposes. The real-time position of the needle tip is important information in needle steering systems. METHODS This work introduces a robot-assisted ultrasound tracking (R-AUST) system integrated with a needle shape prediction method to provide the 3D position of the needle tip. The tracking system is evaluated in phantom and ex vivo beef liver tissues. RESULTS An average error of 0.60 mm was found for needle insertion tests inside the phantom tissue. The R-AUST integrated with shape prediction in the beef liver tissue was able to track the needle tip with average and maximum errors of 0.37 and 0.67 mm, respectively. The average error reported in this work is within the mean allowable needle placement error (<2.7 mm) in targeted procedures. CONCLUSIONS Integration of the R-AUST tracking method with needle shape prediction results in reasonably accurate real-time tracking suitable for ultrasound-guided needle insertions.
Affiliation(s)
- Bardia Konh
- Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Blayton Padasdao
- Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Zolboo Batsaikhan
- Department of Mechanical Engineering, University of Hawaii at Manoa, Honolulu, Hawaii, USA
- Seong Young Ko
- School of Mechanical Engineering, Chonnam National University, Gwangju, South Korea
|
13
|
Beigi P, Salcudean SE, Ng GC, Rohling R. Enhancement of needle visualization and localization in ultrasound. Int J Comput Assist Radiol Surg 2020; 16:169-178. [PMID: 32995981 DOI: 10.1007/s11548-020-02227-7] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Accepted: 07/06/2020] [Indexed: 12/12/2022]
Abstract
PURPOSE This scoping review covers needle visualization and localization techniques in ultrasound, where localization-based approaches mostly aim to compute the needle shaft (and tip) location while potentially enhancing its visibility as well. METHODS A literature review is conducted on the state-of-the-art techniques, which can be divided into five categories: (1) signal and image processing-based techniques to augment the needle, (2) modifications to the needle and insertion that help with needle-transducer alignment and visibility, (3) changes to ultrasound image formation, (4) motion-based analysis and (5) machine learning. RESULTS Advantages, limitations and challenges of representative examples in each of the categories are discussed. Evaluation techniques performed in ex vivo, phantom and in vivo studies are discussed and summarized. CONCLUSION The greatest limitation of the majority of the literature is its reliance on the original visibility of the needle in the static image; the need for additional or improved apparatus is the greatest limitation to clinical utility in practice. SIGNIFICANCE Ultrasound-guided needle placement is performed in many clinical applications, including biopsies, treatment injections and anesthesia. Despite the wide range and long history of this technique, an ongoing challenge is needle visibility in ultrasound. A robust technique is needed to enhance ultrasonic needle visibility, especially for steeply inserted hand-held needles, while maintaining clinical utility requirements.
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
- Robert Rohling
- Electrical and Computer Engineering Department and Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
|
14
|
Rodgers JR, Hrinivich WT, Surry K, Velker V, D'Souza D, Fenster A. A semiautomatic segmentation method for interstitial needles in intraoperative 3D transvaginal ultrasound images for high-dose-rate gynecologic brachytherapy of vaginal tumors. Brachytherapy 2020; 19:659-668. [PMID: 32631651 DOI: 10.1016/j.brachy.2020.05.006] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2020] [Revised: 05/22/2020] [Accepted: 05/28/2020] [Indexed: 11/24/2022]
Abstract
PURPOSE The purpose of this study was to evaluate the use of a semiautomatic algorithm to simultaneously segment multiple high-dose-rate (HDR) gynecologic interstitial brachytherapy (ISBT) needles in three-dimensional (3D) transvaginal ultrasound (TVUS) images, with the aim of providing a clinically useful tool for intraoperative implant assessment. METHODS AND MATERIALS A needle segmentation algorithm previously developed for HDR prostate brachytherapy was adapted and extended to 3D TVUS images from gynecologic ISBT patients with vaginal tumors. Two patients were used for refining and validating the modified algorithm, and five patients (8-12 needles/patient) were reserved as an unseen test data set. The algorithm filtered the images to enhance needle edges, used intensity peaks to generate feature points, and leveraged the randomized 3D Hough transform to identify candidate needle trajectories. Algorithmic segmentations were compared against manual segmentations, and calculated dwell positions were evaluated. RESULTS All 50 test data set needles were successfully segmented, with 96% of algorithmically segmented needles having angular differences <3° compared with manually segmented needles, and a maximum Euclidean distance <2.1 mm. The median distance between corresponding dwell positions was 0.77 mm, with 86% of needles having maximum differences <3 mm. The mean segmentation time using the algorithm was <30 s/patient. CONCLUSIONS We successfully segmented multiple needles simultaneously in intraoperative 3D TVUS images from gynecologic HDR-ISBT patients with vaginal tumors and demonstrated the robustness of the algorithmic approach to image artifacts. This method provided accurate segmentations within a clinically efficient timeframe and has the potential to be translated into intraoperative clinical use for implant assessment.
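The randomized 3D Hough step can be illustrated with a toy version: repeatedly sample pairs of feature points, treat each pair as a candidate line, and keep the line supported by the most inliers. The point data, tolerance and trial count below are made up for illustration and are not from the paper.

```python
import math
import random

def randomized_hough_line(points, trials=2000, tol=1.0, seed=0):
    """Toy randomized Hough transform for one 3D line: sample two feature
    points, treat them as a candidate trajectory, score it by counting points
    within `tol` of the line, and return the best-supported candidate."""
    rng = random.Random(seed)
    best_count, best_line = 0, None
    for _ in range(trials):
        p, q = rng.sample(points, 2)
        d = [qi - pi for pi, qi in zip(p, q)]
        norm = math.sqrt(sum(c * c for c in d))
        if norm == 0.0:
            continue
        d = [c / norm for c in d]  # unit direction of the candidate line
        count = 0
        for x in points:
            v = [xi - pi for pi, xi in zip(p, x)]
            t = sum(vi * di for vi, di in zip(v, d))  # projection onto the line
            dist2 = sum((vi - t * di) ** 2 for vi, di in zip(v, d))
            if dist2 <= tol * tol:
                count += 1
        if count > best_count:
            best_count, best_line = count, (p, d)
    return best_count, best_line

# Ten collinear "needle voxels" along the x-axis plus two off-line outliers:
pts = [(float(i), 0.0, 0.0) for i in range(10)] + [(5.0, 9.0, 3.0), (2.0, 7.0, 8.0)]
n_inliers, line = randomized_hough_line(pts)
```

In the paper's setting this voting runs over edge-derived feature points and must return several needle candidates at once; the sketch keeps only the single best-supported line.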
MESH Headings
- Adenocarcinoma, Clear Cell/radiotherapy
- Adenocarcinoma, Clear Cell/secondary
- Aged
- Aged, 80 and over
- Algorithms
- Brachytherapy/instrumentation
- Brachytherapy/methods
- Carcinoma, Endometrioid/radiotherapy
- Carcinoma, Endometrioid/secondary
- Carcinoma, Squamous Cell/pathology
- Carcinoma, Squamous Cell/radiotherapy
- Carcinoma, Squamous Cell/secondary
- Endometrial Neoplasms/pathology
- Female
- Humans
- Image Processing, Computer-Assisted
- Imaging, Three-Dimensional/methods
- Middle Aged
- Needles
- Ovarian Neoplasms/pathology
- Prostate/diagnostic imaging
- Radiotherapy Planning, Computer-Assisted
- Ultrasonography/methods
- Vaginal Neoplasms/pathology
- Vaginal Neoplasms/radiotherapy
- Vaginal Neoplasms/secondary
Affiliation(s)
- Jessica Robin Rodgers
- School of Biomedical Engineering, The University of Western Ontario, London, Ontario, Canada; Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- William Thomas Hrinivich
- Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD
- Kathleen Surry
- Department of Medical Physics, London Regional Cancer Program, London, Ontario, Canada
- Vikram Velker
- Department of Radiation Oncology, London Regional Cancer Program, London, Ontario, Canada
- David D'Souza
- Department of Radiation Oncology, London Regional Cancer Program, London, Ontario, Canada
- Aaron Fenster
- School of Biomedical Engineering, The University of Western Ontario, London, Ontario, Canada; Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
|
15
|
Gillies DJ, Rodgers JR, Gyacskov I, Roy P, Kakani N, Cool DW, Fenster A. Deep learning segmentation of general interventional tools in two‐dimensional ultrasound images. Med Phys 2020; 47:4956-4970. [DOI: 10.1002/mp.14427] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2020] [Revised: 07/05/2020] [Accepted: 07/21/2020] [Indexed: 12/18/2022] Open
Affiliation(s)
- Derek J. Gillies
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Jessica R. Rodgers
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Igor Gyacskov
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Priyanka Roy
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- Nirmal Kakani
- Department of Radiology, Manchester Royal Infirmary, Manchester M13 9WL, UK
- Derek W. Cool
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
- Aaron Fenster
- Department of Medical Biophysics, Western University, London, Ontario N6A 3K7, Canada
- Robarts Research Institute, Western University, London, Ontario N6A 3K7, Canada
- School of Biomedical Engineering, Western University, London, Ontario N6A 3K7, Canada
- Department of Medical Imaging, Western University, London, Ontario N6A 3K7, Canada
|
16
|
Tip Estimation Method in Phantoms for Curved Needle Using 2D Transverse Ultrasound Images. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9245305] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
Abstract
Flexible needles have been widely used in minimally invasive surgeries, especially in percutaneous interventions. In these interventions, the tip position of the curved needle is very important, since it directly affects the success of the surgery. In this paper, we present a method to estimate the tip position of a long curved needle by using 2D transverse ultrasound images from a robotic ultrasound system. Ultrasound is first used to detect the cross section of the long flexible needle. A new imaging approach is proposed based on selecting a number of pixels with the highest gray levels, which directly removes lower-gray-level content to highlight the needle. After that, a needle shape tracking method is proposed that combines image processing with a Kalman filter operating on 3D needle positions, yielding a robust needle tracking procedure for scan intervals from 1 mm to 8 mm. Shape reconstruction is then achieved using a curve fitting method. Finally, the needle tip position is estimated based on the curve fitting result. Experimental results showed that the estimation error of the tip position is less than 1 mm for scan intervals within 4 mm. The advantage of the proposed method is that the shape and tip position can be estimated by scanning the needle's cross sections at intervals along the insertion direction, without detecting the tip directly.
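The curve-fitting and tip-extrapolation steps can be sketched as a least-squares polynomial fit to the detected cross-section centers, with the tip obtained by evaluating the fitted curve beyond the last scan. The depths, deflections and quadratic shape model below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical cross-section detections from transverse scans: insertion depth z
# (mm) versus lateral deflection x (mm) of the detected needle center.
z = np.array([0.0, 4.0, 8.0, 12.0, 16.0, 20.0])
x = 0.01 * z ** 2  # a gently curving needle, made up for this sketch

coeffs = np.polyfit(z, x, 2)              # quadratic shape model fitted to the scans
tip_z = 24.0                              # depth of the unscanned tip
tip_x = float(np.polyval(coeffs, tip_z))  # extrapolated tip deflection
```

The paper additionally stabilizes the per-scan detections with a Kalman filter before fitting; the sketch skips that step and fits the raw centers directly.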
|
17
|
Learning needle tip localization from digital subtraction in 2D ultrasound. Int J Comput Assist Radiol Surg 2019; 14:1017-1026. [DOI: 10.1007/s11548-019-01951-z] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2019] [Accepted: 03/18/2019] [Indexed: 12/19/2022]
|
18
|
Mwikirize C, Nosher JL, Hacihaliloglu I. Convolution neural networks for real-time needle detection and localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2018; 13:647-657. [PMID: 29512006 DOI: 10.1007/s11548-018-1721-y] [Citation(s) in RCA: 37] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2018] [Accepted: 02/28/2018] [Indexed: 12/28/2022]
Abstract
PURPOSE We propose a framework for automatic and accurate detection of steeply inserted needles in 2D ultrasound data using convolution neural networks. We demonstrate its application in needle trajectory estimation and tip localization. METHODS Our approach consists of a unified network, comprising a fully convolutional network (FCN) and a fast region-based convolutional neural network (R-CNN). The FCN proposes candidate regions, which are then fed to a fast R-CNN for finer needle detection. We leverage a transfer learning paradigm, where the network weights are initialized by training with non-medical images, and fine-tuned with ex vivo ultrasound scans collected during insertion of a 17G epidural needle into freshly excised porcine and bovine tissue at depth settings up to 9 cm and [Formula: see text]-[Formula: see text] insertion angles. Needle detection results are used to accurately estimate needle trajectory from intensity invariant needle features and perform needle tip localization from an intensity search along the needle trajectory. RESULTS Our needle detection model was trained and validated on 2500 ex vivo ultrasound scans. The detection system has a frame rate of 25 fps on a GPU and achieves 99.6% precision, 99.78% recall rate and an [Formula: see text] score of 0.99. Validation for needle localization was performed on 400 scans collected using a different imaging platform, over a bovine/porcine lumbosacral spine phantom. Shaft localization error of [Formula: see text], tip localization error of [Formula: see text] mm, and a total processing time of 0.58 s were achieved. CONCLUSION The proposed method is fully automatic and provides robust needle localization results in challenging scanning conditions. The accurate and robust results coupled with real-time detection and sub-second total processing make the proposed method promising in applications for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.
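The final tip-localization step, an intensity search along the estimated trajectory, can be sketched independently of the CNN: walk along the detected needle line and keep the last sample whose intensity clears a threshold. The intensity function, step size and threshold here are assumptions for illustration, not the paper's parameters.

```python
def tip_along_trajectory(intensity, origin, direction, step=1.0, thresh=0.5, max_steps=200):
    """Walk from `origin` along `direction` and return the farthest sample whose
    intensity is above `thresh`, a simple stand-in for the tip search."""
    tip = None
    x0, y0 = origin
    dx, dy = direction
    for k in range(max_steps):
        x, y = x0 + k * step * dx, y0 + k * step * dy
        if intensity(x, y) >= thresh:
            tip = (x, y)  # keep updating: the last bright sample is the tip
    return tip

# Synthetic image: the shaft is bright up to x = 30 along the trajectory.
bright_shaft = lambda x, y: 1.0 if x <= 30.0 else 0.0
tip = tip_along_trajectory(bright_shaft, (0.0, 0.0), (1.0, 0.0))
```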
Affiliation(s)
- Cosmas Mwikirize
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA
- John L Nosher
- Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
- Ilker Hacihaliloglu
- Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, 08854, USA; Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, 08901, USA
|
19
|
Beigi P, Rohling R, Salcudean T, Lessoway VA, Ng GC. Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features. ULTRASONICS 2017; 78:18-22. [PMID: 28279882 DOI: 10.1016/j.ultras.2017.02.010] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/23/2016] [Revised: 02/11/2017] [Accepted: 02/13/2017] [Indexed: 06/06/2023]
Abstract
We propose a novel learning-based approach to detect an imperceptible hand-held needle in ultrasound images using its natural tremor motion. The minute tremor induced on the needle, however, is also transferred to the tissue in contact with the needle, making accurate needle detection a challenging task. The proposed learning-based framework is based on temporal analysis of the phase variations of pixels to classify them according to their motion characteristics. In addition to the classification, we also obtain a probability map of the segmented pixels by cross-validation. A Hough transform is then applied to the probability map to localize the needle using the segmented pixels and the posterior probability estimate. The two-step probability-weighted localization of the segmented needle in a learning framework is the key innovation, which yields improved localization and adaptability to specific clinical applications. The method was tested in vivo for a standard 17 gauge needle inserted at 50-80° insertion angles and 40-60 mm depths. The results showed average accuracies of (2.12°, 1.69 mm) and 81%±4% for localization and classification, respectively.
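Hough voting on a probability map can be sketched directly: each segmented pixel votes for every line through it, weighted by its posterior probability, so confidently classified pixels dominate the peak. The pixel coordinates, probabilities and bin resolutions below are invented for illustration.

```python
import math
from collections import Counter

def weighted_hough_peak(pixels, probs, n_theta=180, rho_res=1.0):
    """Probability-weighted Hough voting: each pixel adds its probability to
    every (theta, rho) bin of a line passing through it; the heaviest bin
    gives the needle line in normal form x*cos(theta) + y*sin(theta) = rho."""
    acc = Counter()
    for (x, y), p in zip(pixels, probs):
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = round((x * math.cos(theta) + y * math.sin(theta)) / rho_res)
            acc[(i, rho)] += p
    (i, rho), _ = max(acc.items(), key=lambda kv: kv[1])
    return math.degrees(math.pi * i / n_theta), rho * rho_res

# Pixels along the line y = x with high probabilities, plus a low-weight stray:
pix = [(0.0, 0.0), (10.0, 10.0), (20.0, 20.0), (30.0, 30.0), (7.0, 25.0)]
prb = [0.9, 0.9, 0.9, 0.9, 0.2]
theta_deg, rho = weighted_hough_peak(pix, prb)
```

The weighting is what distinguishes this from a plain Hough transform on a binary segmentation: a mislabeled pixel with low posterior probability barely shifts the peak.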
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Robert Rohling
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada; Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Tim Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
|
20
|
Beigi P, Rohling R, Salcudean SE, Ng GC. CASPER: computer-aided segmentation of imperceptible motion-a learning-based tracking of an invisible needle in ultrasound. Int J Comput Assist Radiol Surg 2017. [PMID: 28647883 DOI: 10.1007/s11548-017-1631-4] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Abstract
PURPOSE This paper presents a new micro-motion-based approach to track a needle in ultrasound images captured by a handheld transducer. METHODS We propose a novel learning-based framework to track a handheld needle by detecting microscale variations of motion dynamics over time. The current state of the art in using motion analysis for needle detection relies on absolute motion and hence works well only when the transducer is static. We have introduced and evaluated novel spatiotemporal and spectral features, obtained from the phase image, in a self-supervised tracking framework to improve the detection accuracy in subsequent frames using incremental training. Our proposed tracking method involves volumetric feature selection and differential flow analysis to incorporate the neighboring pixels and mitigate the effects of the subtle tremor motion of a handheld transducer. To evaluate the detection accuracy, the method was tested on porcine tissue in vivo during needle insertion into the biceps femoris muscle. RESULTS Experimental results show mean, standard deviation and root-mean-square errors of [Formula: see text], [Formula: see text] and [Formula: see text] in the insertion angle, and 0.82, 1.21 and 1.47 mm in the needle tip, respectively. CONCLUSIONS Compared with appearance-based detection approaches, the proposed method is especially suitable for needles whose ultrasonic characteristics are imperceptible in the static image and to the naked eye.
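A spectral feature of the kind the tracker extracts from each pixel's temporal phase signal can be sketched with a naive DFT magnitude peak. The signal, sampling rate and tremor frequency below are invented for illustration and are not the paper's features or parameters.

```python
import math

def dominant_frequency(signal, fs):
    """Naive DFT magnitude peak over one pixel's temporal signal, a stand-in
    for the spectral features extracted from the phase image over time."""
    n = len(signal)
    best_k, best_mag = 0, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(s * math.cos(2.0 * math.pi * k * t / n) for t, s in enumerate(signal))
        im = sum(s * math.sin(2.0 * math.pi * k * t / n) for t, s in enumerate(signal))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n  # frequency of the strongest bin, in Hz

# A pixel oscillating at 5 Hz, sampled at 100 frames/s for 1 s:
sig = [math.sin(2.0 * math.pi * 5.0 * t / 100.0) for t in range(100)]
freq = dominant_frequency(sig, fs=100.0)
```

Pixels on the needle would show a consistent tremor-band peak across frames, while tissue pixels driven only indirectly by the tremor show weaker, less stable spectra.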
Affiliation(s)
- Parmida Beigi
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Robert Rohling
- Electrical and Computer Engineering Department and Mechanical Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Septimiu E Salcudean
- Electrical and Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
- Gary C Ng
- Philips Ultrasound, Bothell, WA, USA
|
21
|
Scholten HJ, Pourtaherian A, Mihajlovic N, Korsten HHM, Bouwman RA. Improving needle tip identification during ultrasound-guided procedures in anaesthetic practice. Anaesthesia 2017; 72:889-904. [DOI: 10.1111/anae.13921] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/23/2017] [Indexed: 12/16/2022]
Affiliation(s)
- H. J. Scholten
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- A. Pourtaherian
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- H. H. M. Korsten
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- R. A. Bouwman
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Catharina Hospital, Eindhoven, the Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
|
22
|
Chuang BI, Hsu JH, Kuo LC, Jou IM, Su FC, Sun YN. Tendon-motion tracking in an ultrasound image sequence using optical-flow-based block matching. Biomed Eng Online 2017; 16:47. [PMID: 28427411 PMCID: PMC5399340 DOI: 10.1186/s12938-017-0335-x] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2016] [Accepted: 03/30/2017] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Tendon motion, which is commonly observed using ultrasound imaging, is one of the most important features used in tendinopathy diagnosis. However, speckle noise and out-of-plane issues make the tracking process difficult. Manual tracking is usually time-consuming and often yields inconsistent results between users. METHODS To automatically track tendon motion in ultrasound images, we developed a new method that combines the advantages of optical flow and multi-kernel block matching. For every pair of adjacent image frames, the optical flow is computed and used to estimate the accumulated displacement. The proposed method selects the frame interval adaptively based on this displacement. Multi-kernel block matching is then computed on the two selected frames and, to reduce tracking errors, the detailed displacements of the frames in between are interpolated based on the optical flow results. RESULTS In the experiments, cadaver data were used to evaluate the tracking results. The mean absolute error was less than 0.05 mm. The proposed method also tracked tendon motion in vivo, which provides useful information for clinical diagnosis. CONCLUSION The proposed method provides a new index for adaptively determining the frame interval. Compared with other methods, the proposed method yields tracking results that are significantly more accurate.
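The block-matching kernel at the core of this approach can be sketched as an exhaustive sum-of-absolute-differences (SAD) search over a small displacement window. The frames, block size and search radius below are toy assumptions; the paper's multi-kernel variant and optical-flow-driven interval selection are omitted.

```python
def block_match(prev, curr, top, left, size=4, search=3):
    """Exhaustive SAD matching: locate the block of `prev` anchored at
    (top, left) inside a small search window of `curr`, returning the
    (dy, dx) displacement with the lowest SAD."""
    def sad(dy, dx):
        return sum(abs(curr[top + dy + i][left + dx + j] - prev[top + i][left + j])
                   for i in range(size) for j in range(size))
    candidates = [(dy, dx) for dy in range(-search, search + 1)
                           for dx in range(-search, search + 1)]
    return min(candidates, key=lambda d: sad(*d))

# Synthetic frames: a bright 4x4 "tendon" patch moves from (4, 4) to (5, 6).
prev = [[0] * 12 for _ in range(12)]
curr = [[0] * 12 for _ in range(12)]
for i in range(4):
    for j in range(4):
        prev[4 + i][4 + j] = 9
        curr[5 + i][6 + j] = 9
motion = block_match(prev, curr, 4, 4)
```

In the paper, optical flow between adjacent frames decides how far apart the two matched frames should be, so that the block displacement stays inside the search window.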
Affiliation(s)
- Bo-I Chuang
- Department of Computer Science and Information Engineering, 1 University Road, Tainan, 701, Taiwan
- Jian-Han Hsu
- Department of Computer Science and Information Engineering, 1 University Road, Tainan, 701, Taiwan
- Li-Chieh Kuo
- Department of Occupational Therapy, 1 University Road, Tainan, 701, Taiwan
- I-Ming Jou
- Department of Orthopedics, E-Da Hospital, I-Shou University, 1 E-Da Road, Jiao-Shu Village, Yan-Chao District, Kaohsiung City, 82445, Taiwan
- Fong-Chin Su
- Department of Biomedical Engineering, National Cheng Kung University, 1 University Road, Tainan, 701, Taiwan
- Yung-Nien Sun
- Department of Computer Science and Information Engineering, 1 University Road, Tainan, 701, Taiwan
|