1
Cartucho J, Weld A, Tukra S, Xu H, Matsuzaki H, Ishikawa T, Kwon M, Jang YE, Kim KJ, Lee G, Bai B, Kahrs LA, Boecking L, Allmendinger S, Müller L, Zhang Y, Jin Y, Bano S, Vasconcelos F, Reiter W, Hajek J, Silva B, Lima E, Vilaça JL, Queirós S, Giannarou S. SurgT challenge: Benchmark of soft-tissue trackers for robotic surgery. Med Image Anal 2024; 91:102985. PMID: 37844472; DOI: 10.1016/j.media.2023.102985.
Abstract
This paper introduces the "SurgT: Surgical Tracking" challenge, organized in conjunction with the 25th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2022). The challenge had two purposes: (1) to establish the first standardized benchmark for the research community to assess soft-tissue trackers; and (2) to encourage the development of unsupervised deep learning methods, given the lack of annotated data in surgery. A dataset of 157 stereo endoscopic videos from 20 clinical cases, along with stereo camera calibration parameters, was provided. Participants were tasked with developing algorithms to track the movement of soft tissues, represented by bounding boxes, in stereo endoscopic videos. At the end of the challenge, the developed methods were assessed on a previously hidden test subset, using benchmarking metrics purpose-built for this challenge to verify the efficacy of unsupervised deep learning algorithms in tracking soft tissue. The metric used for ranking was the Expected Average Overlap (EAO) score, which measures the average overlap between a tracker's bounding boxes and the ground truth. The challenge was won by the deep learning submission from ICVS-2Ai, with the highest EAO score of 0.617. This method employs ARFlow to estimate unsupervised dense optical flow from cropped images, using photometric and regularization losses. The runner-up, Jmees (EAO 0.583), adds deep learning for surgical tool segmentation on top of a non-deep-learning baseline method, CSRT; CSRT by itself scores a similar EAO of 0.563. The results from this challenge show that non-deep-learning methods currently remain competitive. The dataset and benchmarking tool created for this challenge have been made publicly available at https://surgt.grand-challenge.org/. This challenge is expected to contribute to the development of autonomous robotic surgery and other digital surgical technologies.
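The ranking metric can be illustrated with a simplified sketch. The official SurgT benchmark is more involved (it also handles tracking failures and sequence lengths); the function names below are illustrative only, and the boxes use a hypothetical (x, y, w, h) convention:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) bounding boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def average_overlap(predicted, ground_truth):
    """Mean per-frame overlap between tracker output and annotation."""
    overlaps = [iou(p, g) for p, g in zip(predicted, ground_truth)]
    return sum(overlaps) / len(overlaps)
```

For example, a tracker that matches the annotation exactly on one frame and drifts 5 pixels on the next scores the mean of 1.0 and 1/3.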
Affiliation(s)
- João Cartucho
- The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom.
- Alistair Weld
- The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom
- Samyakh Tukra
- The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom
- Haozheng Xu
- The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom
- Minjun Kwon
- Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
- Yong Eun Jang
- Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
- Kwang-Ju Kim
- Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
- Gwang Lee
- Ajou University, Gyeonggi-do, South Korea
- Bizhe Bai
- Medical Computer Vision and Robotics Lab, University of Toronto, Canada
- Lueder A Kahrs
- Medical Computer Vision and Robotics Lab, University of Toronto, Canada
- Yitong Zhang
- Surgical Robot Vision, University College London, United Kingdom
- Yueming Jin
- Surgical Robot Vision, University College London, United Kingdom
- Sophia Bano
- Surgical Robot Vision, University College London, United Kingdom
- Bruno Silva
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal; 2Ai - School of Technology, IPCA, Barcelos, Portugal
- Estevão Lima
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal
- João L Vilaça
- 2Ai - School of Technology, IPCA, Barcelos, Portugal
- Sandro Queirós
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal
- Stamatia Giannarou
- The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom
2
Optical force estimation for interactions between tool and soft tissues. Sci Rep 2023; 13:506. PMID: 36627354; PMCID: PMC9831996; DOI: 10.1038/s41598-022-27036-7.
Abstract
Robotic assistance in minimally invasive surgery offers numerous advantages for both patient and surgeon. However, the lack of force feedback in robotic surgery is a major limitation, and accurately estimating tool-tissue interaction forces remains a challenge. Image-based force estimation offers a promising solution without the need to integrate sensors into surgical tools. In this indirect approach, interaction forces are derived from the observed deformation, with learning-based methods improving accuracy and real-time capability. However, the relationship between deformation and force is determined by the stiffness of the tissue. Consequently, both deformation and local tissue properties must be observed for an approach applicable to heterogeneous tissue. In this work, we use optical coherence tomography, which can combine the detection of tissue deformation with shear wave elastography in a single modality. We present a multi-input deep learning network for processing of local elasticity estimates and volumetric image data. Our results demonstrate that accounting for elastic properties is critical for accurate image-based force estimation across different tissue types and properties. Joint processing of local elasticity information yields the best performance throughout our phantom study. Furthermore, we test our approach on soft tissue samples that were not present during training and show that generalization to other tissue properties is possible.
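The core argument above, that the same observed deformation maps to different forces depending on the local tissue stiffness, can be illustrated with a toy linear-elastic model. This is a deliberate oversimplification (the paper uses a learned multi-input network on OCT volumes and elastography, not Hooke's law), and the numerical values are hypothetical:

```python
def force_from_deformation(deformation_mm, stiffness_n_per_mm):
    """Toy Hookean model: force = stiffness * deformation.
    Illustrates why deformation alone cannot determine force."""
    return stiffness_n_per_mm * deformation_mm

# Identical 2 mm indentation on soft vs stiff tissue (hypothetical stiffness values):
soft_force = force_from_deformation(2.0, 0.05)   # 0.1 N
stiff_force = force_from_deformation(2.0, 0.50)  # 1.0 N
```

A vision-only estimator that sees the same 2 mm deformation in both cases would need the elasticity input to distinguish a 0.1 N from a 1.0 N interaction, which is precisely why the network jointly processes elasticity estimates and image data.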
3
Huang B, Nguyen A, Wang S, Wang Z, Mayer E, Tuch D, Vyas K, Giannarou S, Elson DS. Simultaneous Depth Estimation and Surgical Tool Segmentation in Laparoscopic Images. IEEE Trans Med Robot Bionics 2022; 4:335-338. PMID: 36148137; PMCID: PMC7613616; DOI: 10.1109/tmrb.2022.3170215.
Abstract
Surgical instrument segmentation and depth estimation are crucial steps towards improving autonomy in robotic surgery. Most recent works treat these problems separately, which makes deployment challenging. In this paper, we propose a unified framework for depth estimation and surgical tool segmentation in laparoscopic images. The network has an encoder-decoder architecture and comprises two branches that simultaneously perform depth estimation and segmentation. To train the network end to end, we propose a new multi-task loss function that learns to estimate depth in an unsupervised manner while requiring only semi-ground-truth labels for surgical tool segmentation. We conducted extensive experiments on different datasets to validate the framework. The results show that the end-to-end network improves the state of the art on both tasks while reducing deployment complexity.
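The multi-task idea, one weighted objective shared by both branches, can be sketched in NumPy. The loss terms and weighting below are generic placeholders standing in for the paper's formulation, not its exact losses:

```python
import numpy as np

def photometric_loss(reconstructed, target):
    """Unsupervised depth term: L1 difference between the view synthesised
    using the predicted depth and the actual image."""
    return np.abs(reconstructed - target).mean()

def segmentation_loss(probs, labels, eps=1e-7):
    """Supervised tool-segmentation term: pixelwise binary cross-entropy."""
    probs = np.clip(probs, eps, 1 - eps)
    return -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs)).mean()

def multi_task_loss(reconstructed, target, probs, labels, seg_weight=0.5):
    """Single end-to-end objective: weighted sum over both branches."""
    return photometric_loss(reconstructed, target) + seg_weight * segmentation_loss(probs, labels)
```

With a perfect reconstruction and a maximally uncertain segmentation (all probabilities 0.5), the combined loss reduces to `seg_weight * ln(2)`, so the weight directly controls how much the supervised branch dominates training.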
Affiliation(s)
- Baoru Huang
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Anh Nguyen
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Computer Science, University of Liverpool, UK
- Siyao Wang
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Ziyang Wang
- Department of Computer Science, University of Oxford, UK
- Erik Mayer
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Stamatia Giannarou
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Daniel S Elson
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
4
Tukra S, Lidströmer N, Ashrafian H, Giannarou S. AI in Surgical Robotics. Artif Intell Med 2022. DOI: 10.1007/978-3-030-64573-1_323.
5
Hadi Hosseinabadi AH, Salcudean SE. Force sensing in robot-assisted keyhole endoscopy: A systematic survey. Int J Rob Res 2021. DOI: 10.1177/02783649211052067.
Abstract
Instrument–tissue interaction forces in minimally invasive surgery (MIS) provide valuable information that can be used to provide haptic perception, monitor tissue trauma, develop training guidelines, and evaluate the skill level of novice and expert surgeons. Force and tactile sensing is lost in many robot-assisted surgery (RAS) systems. Therefore, many researchers have focused on recovering this information through sensing systems and estimation algorithms. This article provides a comprehensive systematic review of the current force sensing research aimed at RAS and, more generally, keyhole endoscopy, in which instruments enter the body through small incisions. Articles published between January 2011 and May 2020 are considered, following the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. The literature search resulted in 110 papers on different force estimation algorithms and sensing technologies, sensor design specifications, and fabrication techniques.
Affiliation(s)
- Amir Hossein Hadi Hosseinabadi
- Robotics and Controls Laboratory (RCL), Electrical and Computer Engineering Department, University of British Columbia, Vancouver, British Columbia, Canada
- Septimiu E. Salcudean
- Robotics and Controls Laboratory (RCL), Electrical and Computer Engineering Department, University of British Columbia, Vancouver, British Columbia, Canada
6
Tukra S, Lidströmer N, Ashrafian H, Giannarou S. AI in Surgical Robotics. Artif Intell Med 2021. DOI: 10.1007/978-3-030-58080-3_323-1.
7
Vision-Based Suture Tensile Force Estimation in Robotic Surgery. Sensors 2020; 21:110. PMID: 33375388; PMCID: PMC7796030; DOI: 10.3390/s21010110.
Abstract
Compared to laparoscopy, robot-assisted minimally invasive surgery lacks force feedback, which is important for preventing suture breakage. To compensate, surgeons infer the suture force from their proprioception and the 2D image by comparing them with their training experience. Based on this idea, a deep-learning-based method is proposed that estimates the tensile force on sutures from a single image and the robot position, without a force sensor. A neural network combining a modified Inception-ResNet-V2 with Long Short-Term Memory (LSTM) networks is used to estimate the suture pulling force. The feasibility of the proposed network is verified on a generated database recording the interaction with two different artificial skins in two different situations (in vivo and in vitro) at 13 viewing angles, with varying tool positions collected from the master-slave robotic system. In the evaluation, the proposed learning models successfully estimated the tensile force at 10 viewing angles unseen during training.
8
Edwards PJ, Colleoni E, Sridhar A, Kelly JD, Stoyanov D. Visual kinematic force estimation in robot-assisted surgery - application to knot tying. Comput Methods Biomech Biomed Eng Imaging Vis 2020. DOI: 10.1080/21681163.2020.1833368.
Affiliation(s)
- Emanuele Colleoni
- Department of Computer Science, Surgical Robot Vision Group, WEISS, UCL, London, UK
- Ashwin Sridhar
- Urology Department, Westmoreland Street Hospital, UCLH, London, UK
- John D. Kelly
- Urology Department, Westmoreland Street Hospital, UCLH, London, UK
- Danail Stoyanov
- Department of Computer Science, Surgical Robot Vision Group, WEISS, UCL, London, UK
9
A recurrent convolutional neural network approach for sensorless force estimation in robotic surgery. Biomed Signal Process Control 2019. DOI: 10.1016/j.bspc.2019.01.011.
10
Miyashita K, Oude Vrielink T, Mylonas G. A cable-driven parallel manipulator with force sensing capabilities for high-accuracy tissue endomicroscopy. Int J Comput Assist Radiol Surg 2018. PMID: 29516353; PMCID: PMC5953980; DOI: 10.1007/s11548-018-1717-7.
Abstract
PURPOSE: Endomicroscopy (EM) provides high-resolution, non-invasive histological tissue information and can be used to scan large areas of tissue to assess cancerous and pre-cancerous lesions and their margins. However, current robotic solutions do not provide the accuracy and force sensitivity required for safe and accurate tissue scanning.
METHODS: A new surgical instrument has been developed that uses a cable-driven parallel mechanism (CDPM) to manipulate an EM probe. End-effector forces are determined by measuring the tension in each cable. As a result, the instrument can accurately apply a contact force to the tissue while offering high-resolution, highly repeatable probe movement.
RESULTS: Force sensitivities of 0.2 N and 0.6 N were found for the 1-DoF and 2-DoF image acquisition methods, respectively. A back-stepping technique can be used when higher force sensitivity is required to acquire high-quality tissue images. This method successfully acquired images on ex vivo liver tissue.
CONCLUSION: The proposed approach offers the high force sensitivity and precise control essential for robotic EM. Its technical benefits can also serve other surgical robotic applications, including safe autonomous control, haptic feedback, and palpation.
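The sensing principle described in the methods, recovering the end-effector force from the measured cable tensions, can be sketched as a sum of tension magnitudes along each cable's direction. The planar two-cable geometry and numerical values below are hypothetical, not the instrument's actual design:

```python
import numpy as np

def end_effector_force(unit_directions, tensions):
    """Net force the cables apply to the end-effector: the sum of each
    tension magnitude along its cable's unit direction vector
    (pointing from the end-effector towards the cable anchor)."""
    u = np.asarray(unit_directions, dtype=float)  # shape (n_cables, 2)
    t = np.asarray(tensions, dtype=float)         # shape (n_cables,)
    return u.T @ t                                # shape (2,)

# Two antagonistic cables along +x and -x (hypothetical 1-DoF planar case):
f = end_effector_force([[1.0, 0.0], [-1.0, 0.0]], [1.5, 1.0])
# Net force: 0.5 N along +x
```

Because cables can only pull, antagonistic pairs are needed per axis; the contact force on tissue is then the reaction to this net cable force, which is how tension measurement doubles as force sensing.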
Affiliation(s)
- Kiyoteru Miyashita
- HARMS Lab, Department of Surgery and Cancer, Imperial College London, 3rd Floor Paterson Wing, 20 South Wharf Road, W2 1PF, London, UK
- Timo Oude Vrielink
- HARMS Lab, Department of Surgery and Cancer, Imperial College London, 3rd Floor Paterson Wing, 20 South Wharf Road, W2 1PF, London, UK
- George Mylonas
- HARMS Lab, Department of Surgery and Cancer, Imperial College London, 3rd Floor Paterson Wing, 20 South Wharf Road, W2 1PF, London, UK
11
Leibrandt K, Yang GZ. Efficient Proximity Queries for Continuum Robots on Parallel Computing Hardware. IEEE Robot Autom Lett 2017. DOI: 10.1109/lra.2017.2668466.