1
Wang C, Guo L, Zhu J, Zhu L, Li C, Zhu H, Song A, Lu L, Teng GJ, Navab N, Jiang Z. Review of robotic systems for thoracoabdominal puncture interventional surgery. APL Bioeng 2024; 8:021501. [PMID: 38572313] [PMCID: PMC10987197] [DOI: 10.1063/5.0180494]
Abstract
Cancer, with its high morbidity and mortality, is one of the major burdens threatening human health globally. Interventional procedures via percutaneous puncture have been widely adopted by physicians because of their minimally invasive approach. However, traditional manual puncture intervention depends on personal experience and faces challenges in terms of puncture precision, learning curve, safety, and efficacy. The development of puncture interventional surgery robotic (PISR) systems could alleviate these problems to a certain extent. This paper reviews the current status and prospects of PISR systems for thoracic and abdominal applications. The key technologies involved, including spatial registration, positioning navigation, puncture guidance feedback, respiratory motion compensation, and motion control, are discussed in detail.
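Of the key technologies listed, spatial registration has the most standard algorithmic core: aligning image coordinates with robot coordinates from paired fiducial points. The sketch below is purely illustrative (not taken from the review) and implements the classic SVD-based rigid registration of Arun et al.; the fiducial coordinates are hypothetical.

```python
# Illustrative sketch (not from the review): rigid spatial registration between
# image space and robot space via paired fiducials, using the SVD-based
# least-squares method of Arun et al. All point values below are hypothetical.
import numpy as np

def register_rigid(image_pts: np.ndarray, robot_pts: np.ndarray):
    """Find R, t minimizing ||R @ image_pt + t - robot_pt|| over Nx3 pairs."""
    ci, cr = image_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (image_pts - ci).T @ (robot_pts - cr)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = cr - R @ ci
    return R, t

# Hypothetical fiducials: points in image space and the same points touched
# by the robot end-effector (a 90-degree rotation about z plus a translation).
img = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 0], [0, 0, 40.]])
rob = img @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.]]).T + [100, 20, 5]
R, t = register_rigid(img, rob)
fre = np.linalg.norm(img @ R.T + t - rob, axis=1)  # fiducial registration error
print(R, t, fre.max())
```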
Affiliation(s)
- Cheng Wang
- Hanglok-Tech Co. Ltd., Hengqin 519000, People's Republic of China
- Li Guo
- Hanglok-Tech Co. Ltd., Hengqin 519000, People's Republic of China
- Lifeng Zhu
- State Key Laboratory of Digital Medical Engineering, Jiangsu Key Lab of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, People's Republic of China
- Chichi Li
- School of Computer Science and Engineering, Macau University of Science and Technology, Macau 999078, People's Republic of China
- Haidong Zhu
- Center of Interventional Radiology and Vascular Surgery, Department of Radiology, Zhongda Hospital, Medical School, Southeast University, Nanjing 210009, People's Republic of China
- Aiguo Song
- State Key Laboratory of Digital Medical Engineering, Jiangsu Key Lab of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, People's Republic of China
- Gao-Jun Teng
- Center of Interventional Radiology and Vascular Surgery, Department of Radiology, Zhongda Hospital, Medical School, Southeast University, Nanjing 210009, People's Republic of China
- Zhongliang Jiang
- Computer Aided Medical Procedures, Technical University of Munich, Munich 80333, Germany
2
Watanabe H, Fukuda H, Ezawa Y, Matsuyama E, Kondo Y, Hayashi N, Ogura T, Shimosegawa M. Automated angular measurement for puncture angle using a computer-aided method in ultrasound-guided peripheral insertion. Phys Eng Sci Med 2024; 47:679-689. [PMID: 38358620] [DOI: 10.1007/s13246-024-01397-x]
Abstract
Ultrasound guidance has become the gold standard for obtaining vascular access. Angle information, which indicates the entry angle of the needle into the vein, is required to ensure puncture success. Although various image processing-based methods, such as deep learning, have recently been applied to improve needle visibility, these methods are limited in that they do not measure the puncture angle relative to the target organ. We aim to detect the target vessel and puncture needle and to derive the puncture angle by combining deep learning with conventional image processing methods such as the Hough transform. Median cubital vein US images were obtained from 20 healthy volunteers, and images of simulated blood vessels and needles were obtained during the puncture of a simulated blood vessel in four phantoms. The U-Net architecture was used to segment images of blood vessels and needles, and various image processing methods were employed to automatically measure angles. The experimental results indicated that the mean Dice coefficients of median cubital veins, simulated blood vessels, and needles were 0.826, 0.931, and 0.773, respectively. The quantitative results showed good agreement between expert and automatic measurements of the puncture angle, with a correlation of 0.847. Our findings indicate that the proposed method achieves extremely high segmentation accuracy and automated angular measurement. It reduces the variability and time required by manual angle measurement and may allow the operator to concentrate on the delicate techniques related to the direction of the needle.
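For readers wanting the flavor of the pipeline's second stage, the sketch below shows how a Hough transform can recover a puncture angle from the kind of binary masks a U-Net would output. It is a minimal illustration, not the authors' code; the toy masks, thresholds, and longest-segment selection rule are all assumptions.

```python
# Hedged sketch of the angle-measurement stage: assuming segmentation has
# already produced binary masks for the vessel and the needle, a probabilistic
# Hough transform recovers each structure's dominant line, and the puncture
# angle is the difference between the two line orientations.
import cv2
import numpy as np

def dominant_angle_deg(mask: np.ndarray) -> float:
    """Orientation of the longest Hough line segment in a uint8 binary mask."""
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180,
                            threshold=30, minLineLength=20, maxLineGap=10)
    if lines is None:
        raise ValueError("no line found in mask")
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return np.degrees(np.arctan2(y2 - y1, x2 - x1))

# Toy masks standing in for segmentation outputs: a horizontal vessel wall
# and a needle inserted at roughly 20 degrees.
vessel = np.zeros((200, 300), np.uint8)
needle = np.zeros((200, 300), np.uint8)
cv2.line(vessel, (10, 150), (290, 150), 255, 2)
cv2.line(needle, (20, 40), (220, 113), 255, 2)
angle = abs(dominant_angle_deg(needle) - dominant_angle_deg(vessel))
print(f"puncture angle = {angle:.1f} deg")  # approximately 20 deg
```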
Affiliation(s)
- Haruyuki Watanabe
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan.
- Hironori Fukuda
- Department of Radiology, Cardiovascular Hospital of Central Japan, Shibukawa, Japan
- Yuina Ezawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Eri Matsuyama
- Faculty of Informatics, The University of Fukuchiyama, Fukuchiyama, Japan
- Yohan Kondo
- Graduate School of Health Sciences, Niigata University, Niigata, Japan
- Norio Hayashi
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Toshihiro Ogura
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
- Masayuki Shimosegawa
- School of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
3
Yang L, Duan S, Zhang Y, Hao L, Wang S, Zou Z, Hu Y, Chen S, Hu Y, Zhang L. Feasibility and Safety of Percutaneous Puncture Guided by a 5G-Based Telerobotic Ultrasound System: An Experimental Study. Cardiovasc Intervent Radiol 2024; 47:812-819. [PMID: 38592415] [DOI: 10.1007/s00270-024-03681-5]
Abstract
PURPOSE To evaluate the feasibility and safety of percutaneous puncture guided by a 5th-generation mobile communication technology (5G)-based telerobotic ultrasound system in phantom and animal experiments. MATERIALS AND METHODS In the phantom experiment, 10 simulated lesions were each punctured once at each of two angles, both under the guidance of the telerobotic ultrasound system and by ultrasound-guided freehand puncture. Student's t test was used to compare the two methods in terms of puncture accuracy, total operation duration, and puncture duration. In the animal experiment, under the guidance of the telerobotic ultrasound system, an 18G puncture needle was used to puncture three target steel beads placed in the liver, right kidney, and right gluteal muscle, respectively; this experiment had no freehand ultrasound-guided control group. After puncture, a CT scan was performed to verify the position of the puncture needle relative to the target, and complications, puncture duration, and related parameters were recorded. RESULTS In the phantom experiment, the mean puncture accuracies under telerobotic ultrasound guidance and conventional ultrasound guidance were 1.8 ± 0.3 mm and 1.6 ± 0.3 mm, respectively (P = 0.09); there was thus no significant difference in accuracy between the two guidance methods. In the animal experiment, the first-attempt puncture success rate (needle tip close to the target) was 93%. Polypnea occurred during one puncture; no other intraoperative or postoperative complications were observed. CONCLUSION Puncture guided by a 5G-based telerobotic ultrasound system showed good feasibility and safety in phantom and animal experiments.
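The phantom comparison is an ordinary two-sample Student's t test. A minimal sketch of that analysis, on hypothetical accuracy samples consistent with the reported means (20 punctures per method: 10 lesions at two angles each), might look like the following; it is not the authors' code.

```python
# Sketch of the reported statistical comparison: two-sample Student's t test
# on puncture-accuracy measurements. The samples are simulated stand-ins,
# drawn to match the reported 1.8 +/- 0.3 mm vs 1.6 +/- 0.3 mm summary stats.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
telerobotic_mm = rng.normal(1.8, 0.3, 20)  # hypothetical sample, n = 20
freehand_mm = rng.normal(1.6, 0.3, 20)     # hypothetical sample, n = 20

t, p = stats.ttest_ind(telerobotic_mm, freehand_mm, equal_var=True)
print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05 -> no significant difference
```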
Affiliation(s)
- Lanling Yang
- Zhengzhou University People's Hospital, Henan Provincial People's Hospital, Zhengzhou, 450003, Henan, China
- Shaobo Duan
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Department of Ultrasound, Henan Key Laboratory for Ultrasound Molecular Imaging and Artificial Intelligence Medicine, Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, No.7, Weiwu Road, Jinshui District, Zhengzhou, 450003, Henan, China
- Ye Zhang
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Liuwei Hao
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Shuaiyang Wang
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Zhi Zou
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Yanshan Hu
- Zhengzhou University People's Hospital, Henan Provincial People's Hospital, Zhengzhou, 450003, Henan, China
- Si Chen
- Zhengzhou University People's Hospital, Henan Provincial People's Hospital, Zhengzhou, 450003, Henan, China
- Yiwen Hu
- Henan University People's Hospital, Henan Provincial People's Hospital, Zhengzhou, 450003, Henan, China
- Lianzhong Zhang
- Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Department of Rehabilitation, Henan Rehabilitation Clinical Medicine Research Center, Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, Zhengzhou, 450003, Henan, China
- Department of Ultrasound, Henan Key Laboratory for Ultrasound Molecular Imaging and Artificial Intelligence Medicine, Henan Provincial People's Hospital, Zhengzhou University People's Hospital, Henan University People's Hospital, No.7, Weiwu Road, Jinshui District, Zhengzhou, 450003, Henan, China
4
Sang T, Yu F, Zhao J, Wu B, Ding X, Shen C. A novel deep learning method to segment parathyroid glands on intraoperative videos of thyroid surgery. Front Surg 2024; 11:1370017. [PMID: 38708363] [PMCID: PMC11066234] [DOI: 10.3389/fsurg.2024.1370017]
Abstract
Introduction The utilization of artificial intelligence (AI) augments intraoperative safety and surgical training. Recognition of parathyroid glands (PGs) is difficult for inexperienced surgeons. The aim of this study was to determine whether deep learning could be used to assist in the identification of PGs on intraoperative videos of patients undergoing thyroid surgery. Methods In this retrospective study, 50 patients undergoing thyroid surgery between 2021 and 2023 were randomly assigned (7:3 ratio) to a training cohort (n = 35) and a validation cohort (n = 15). The combined datasets included 98 videos with 9,944 annotated frames. An independent test cohort included 15 videos (1,500 frames) from an additional 15 patients. We developed a deep-learning model, Video-Trans-U-HRNet, to segment parathyroid glands in surgical videos, comparing it with three advanced medical AI methods on the internal validation cohort. Additionally, we assessed its performance against four surgeons (two senior and two junior) on the independent test cohort, calculating precision and recall metrics for the model. Results Our model demonstrated superior performance compared with the other AI models on the internal validation cohort. The DICE and accuracy achieved by our model were 0.760 and 74.7%, respectively, surpassing Video-TransUnet (0.710, 70.1%), Video-SwinUnet (0.754, 73.6%), and TransUnet (0.705, 69.4%). On the external test, our method achieved 89.5% precision, 77.3% recall, and 70.8% accuracy. In the statistical analysis, our model produced results comparable to those of the senior surgeons (senior surgeon 1: χ2 = 0.989, p = 0.320; senior surgeon 2: χ2 = 1.373, p = 0.241) and outperformed the two junior surgeons (junior surgeon 1: χ2 = 3.889, p = 0.048; junior surgeon 2: χ2 = 4.763, p = 0.029). Discussion We introduce an innovative method for identifying PGs in intraoperative video, highlighting the potential of AI advances in the surgical domain. The segmentation method offers surgeons supplementary guidance in locating real PGs, and it may facilitate training and shorten the learning curve associated with this technology.
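The DICE score reported here is the standard overlap metric between a predicted and an annotated mask. The following minimal sketch shows how it is computed per frame; the toy masks are illustrative, not the study's data.

```python
# Hedged sketch of the evaluation metric: Dice overlap between a predicted
# segmentation mask and an expert annotation (both binary, same shape).
import numpy as np

def dice(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """2*|A & B| / (|A| + |B|), with eps guarding the empty-mask case."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

# Toy frame-level masks standing in for model output vs. expert annotation.
gt = np.zeros((64, 64), np.uint8); gt[20:40, 20:40] = 1
pr = np.zeros((64, 64), np.uint8); pr[24:44, 22:42] = 1
print(f"DICE = {dice(pr, gt):.3f}")
```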
Affiliation(s)
- Tian Sang
- School of Computer Engineering and Science, Shanghai University, Shanghai, China
- Fan Yu
- Department of Nuclear Medicine, Shanghai Sixth People’s Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Junjuan Zhao
- School of Computer Engineering and Science, Shanghai University, Shanghai, China
- Bo Wu
- Department of Thyroid, Breast and Hernia Surgery, Shanghai Sixth People’s Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xuehai Ding
- School of Computer Engineering and Science, Shanghai University, Shanghai, China
- Chentian Shen
- Department of Nuclear Medicine, Shanghai Sixth People’s Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
5
Wang R, Tan G, Liu X. TipDet: A multi-keyframe motion-aware framework for tip detection during ultrasound-guided interventions. Comput Methods Programs Biomed 2024; 247:108109. [PMID: 38460346] [DOI: 10.1016/j.cmpb.2024.108109]
Abstract
BACKGROUND AND OBJECTIVE Automatic needle tip detection is important in real-time ultrasound (US) imaging used to guide interventional needle puncture procedures in clinical settings. However, tip detection in US images is challenging because of spatial indiscernibility: severe background interference combines with tip characteristics such as small size, grayscale appearance, and indistinctive patterns. METHODS To achieve precise tip detection in US images despite spatial indiscernibility, a novel multi-keyframe motion-aware framework called TipDet is proposed. It identifies tips based on their short-term spatial-temporal pattern and long-term motion pattern. In TipDet, first, an adaptive keyframe model (AKM) is proposed to decide whether a frame is informative enough to serve as a keyframe for long-term motion pattern learning. Second, candidate tip detection is conducted using a two-stream backbone (TSB) based on the short-term spatial-temporal pattern. Third, to identify the true tip among the candidates, a novel method for learning the long-term motion pattern of tips is proposed, based on the proposed optical-flow-aware multi-head cross-attention (OFA-MHCA). RESULTS On the clinical human puncture dataset, which includes 4,195 B-mode images, the experimental results show that the proposed TipDet achieves precise tip detection despite the spatial indiscernibility problem, reaching 78.7% AP0.1:0.5, an 8.9% improvement over the base detector, at approximately 20 FPS. Moreover, a tip localization error of 1.3 ± 0.6% is achieved, surpassing the existing method. CONCLUSIONS The proposed TipDet can facilitate wider and easier application of US-guided interventional procedures by providing robust and precise needle tip localization. The code and data are available at https://github.com/ResonWang/TipDet.
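The abstract does not specify the internals of OFA-MHCA, so the following is only a speculative sketch of how optical-flow tokens could serve as keys and values in multi-head cross-attention over candidate-tip features, using torch.nn.MultiheadAttention. The module name, dimensions, and flow-injection scheme are assumptions, not the authors' design.

```python
# Speculative sketch of optical-flow-aware cross-attention: candidate-tip
# features (queries) attend to projected optical-flow vectors (keys/values),
# so motion history can disambiguate the true tip among candidates.
import torch
import torch.nn as nn

class FlowAwareCrossAttention(nn.Module):
    def __init__(self, dim: int = 256, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.flow_proj = nn.Linear(2, dim)  # embed (dx, dy) flow vectors

    def forward(self, tip_feats: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
        """tip_feats: (B, N, dim) candidate-tip features from keyframes;
        flow: (B, M, 2) sampled optical-flow vectors along candidate tracks."""
        kv = self.flow_proj(flow)              # motion tokens as keys/values
        out, _ = self.attn(tip_feats, kv, kv)  # queries attend to motion
        return out + tip_feats                 # residual connection

# Hypothetical shapes: 4 candidate tips, 16 flow samples per sequence.
m = FlowAwareCrossAttention()
fused = m(torch.randn(1, 4, 256), torch.randn(1, 16, 2))
print(fused.shape)  # torch.Size([1, 4, 256])
```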
Affiliation(s)
- Ruixin Wang
- College of Computer and Information, Hohai University, Nanjing 210098, China
- Guoping Tan
- College of Computer and Information, Hohai University, Nanjing 210098, China
- Xiaohui Liu
- The First People's Hospital of Kunshan, Affiliated Kunshan Hospital of Jiangsu University, Kunshan 215300, China
6
Gómez FM, Van der Reijd DJ, Panfilov IA, Baetens T, Wiese K, Haverkamp-Begemann N, Lam SW, Runge JH, Rice SL, Klompenhouwer EG, Maas M, Helmberger T, Beets-Tan RG. Imaging in interventional oncology, the better you see, the better you treat. J Med Imaging Radiat Oncol 2023; 67:895-902. [PMID: 38062853] [DOI: 10.1111/1754-9485.13610]
Abstract
Imaging and image processing form the fundamental pillar of interventional oncology, sustaining diagnosis, procedure planning, treatment, and follow-up. Knowing the full range of possibilities that the different imaging modalities offer is essential to selecting the most appropriate and accurate guidance for interventional procedures. Although physicians' preferences and the availability of imaging modalities vary widely, it is important to recognize the advantages and limitations of each. In this review, we aim to provide an overview of the imaging modalities most frequently used to guide interventional procedures and their current and future applications, including angiography, computed tomography (CT) and spectral CT, magnetic resonance imaging, ultrasound, and hybrid systems. Finally, we summarize the possible role of image-based artificial intelligence in patient selection, treatment, and follow-up.
Affiliation(s)
- Fernando M Gómez
- Grupo de Investigación Biomédica en Imagen, Instituto de Investigación Sanitaria La Fe, Valencia, Spain
- Área Clínica de Imagen Médica, Hospital Universitario y Politécnico La Fe, Valencia, Spain
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Ilia A Panfilov
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Tarik Baetens
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Kevin Wiese
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Siu W Lam
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Jurgen H Runge
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Samuel L Rice
- Radiology, Interventional Radiology Section, UT Southwestern Medical Center, Dallas, TX, USA
- Monique Maas
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Thomas Helmberger
- Institut für Radiologie, Neuroradiologie und Minimal-Invasive Therapie, München Klinik Bogenhausen, Munich, Germany
- Regina Gh Beets-Tan
- Department of Radiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- GROW School for Oncology and Developmental Biology, University of Maastricht, Maastricht, The Netherlands
7
Amiri Tehrani Zade A, Jalili Aziz M, Majedi H, Mirbagheri A, Ahmadian A. Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study. Int J Comput Assist Radiol Surg 2023; 18:1373-1382. [PMID: 36745339] [DOI: 10.1007/s11548-022-02812-y]
Abstract
PURPOSE Accurate needle placement at the target point is critical for ultrasound interventions such as biopsies and epidural injections. However, aligning the needle with the thin plane of the transducer is challenging, since misalignment degrades the needle's visibility to the naked eye. We have therefore developed a CNN-based framework that tracks the needle using the spatiotemporal features of speckle dynamics. METHODS Three key techniques were used to optimize the network for this application. First, we used Gunnar-Farneback (GF), a traditional motion-field estimation technique, to augment the model input with spatiotemporal features extracted from a stack of consecutive frames. Second, we designed an efficient network based on the state-of-the-art Yolo framework (nYolo). Lastly, an Assisted Excitation (AE) module was added at the neck of the network to handle the class-imbalance problem. RESULTS Fourteen freehand ultrasound sequences were collected by inserting an injection needle steeply into the Ultrasound Compatible Lumbar Epidural Simulator and Femoral Vascular Access Ezono test phantoms. We divided the dataset into two sub-categories. In the second category, in which the situation is more challenging and the needle is totally invisible, the angle and tip localization errors were 2.43 ± 1.14° and 2.3 ± 1.76 mm using Yolov3+GF+AE and 2.08 ± 1.18° and 2.12 ± 1.43 mm using nYolo+GF+AE. CONCLUSION The proposed method can track the needle more reliably than other state-of-the-art methods and can accurately localize it in 2D B-mode US images in real time, allowing it to be used in current ultrasound intervention procedures.
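The GF augmentation step described in the methods has a direct OpenCV counterpart. The sketch below computes dense Farneback flow between consecutive B-mode frames and stacks it with the current frame as extra input channels; the synthetic frames and parameter values are illustrative, not the paper's code.

```python
# Minimal sketch of the Gunnar-Farneback (GF) input-augmentation step:
# dense optical flow between two consecutive grayscale frames, stacked with
# the current frame so a detector sees both appearance and speckle motion.
import cv2
import numpy as np

def gf_augmented_input(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Return an HxWx3 float32 array: grayscale frame + (dx, dy) Farneback flow."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_frame, curr_frame, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    return np.dstack([curr_frame.astype(np.float32), flow])

# Two synthetic speckle-like frames, the second shifted to mimic needle motion.
rng = np.random.default_rng(1)
f0 = (rng.random((128, 128)) * 255).astype(np.uint8)
f1 = np.roll(f0, shift=2, axis=1)  # 2-pixel lateral shift
stacked = gf_augmented_input(f0, f1)
print(stacked.shape, stacked[..., 1].mean())  # mean horizontal flow, ~2 px expected
```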
Affiliation(s)
- Amin Amiri Tehrani Zade
- Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
- Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Maryam Jalili Aziz
- Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
- Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Hossein Majedi
- Pain Research Center, Neuroscience Institute, Tehran University of Medical Sciences, Tehran, Iran
- Department of Anesthesiology, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Alireza Mirbagheri
- Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
- Robotic Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran
- Alireza Ahmadian
- Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
- Image-Guided Surgery Group, Research Centre for Biomedical Technologies and Robotics (RCBTR), Tehran University of Medical Sciences, Tehran, Iran