1
Jurosch F, Wagner L, Jell A, Islertas E, Wilhelm D, Berlet M. Extra-abdominal trocar and instrument detection for enhanced surgical workflow understanding. Int J Comput Assist Radiol Surg 2024;19:1939-1945. PMID: 39008232; PMCID: PMC11442558; DOI: 10.1007/s11548-024-03220-0.
Abstract
PURPOSE Video-based intra-abdominal instrument tracking for laparoscopic surgery is a common research area, but tracking is limited to instruments actually visible in the laparoscopic image. Using extra-abdominal cameras to detect trocars and classify their occupancy state yields additional information about instrument location, namely whether an instrument is still inside the abdomen. This can enhance laparoscopic workflow understanding and enrich existing intra-abdominal solutions. METHODS A data set of four laparoscopic surgeries recorded with two time-synchronized extra-abdominal 2D cameras was generated. The preprocessed and annotated data were used to train a deep learning-based architecture consisting of a trocar detector, a centroid tracker and a temporal model that provides the occupancy state of all trocars throughout the surgery. RESULTS The trocar detection model achieves an F1 score of 95.06 ± 0.88%. The prediction of the occupancy state yields an F1 score of 89.29 ± 5.29%, providing a first step towards enhanced surgical workflow understanding. CONCLUSION The current method shows promising results for the extra-abdominal tracking of trocars and their occupancy state. Future work includes enlarging the data set and incorporating intra-abdominal imaging to facilitate accurate assignment of instruments to trocars.
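The pipeline described in this abstract hinges on associating per-frame trocar detections with persistent identities before the temporal model classifies occupancy. A minimal centroid-tracker sketch of that association step (an illustration only, not the authors' implementation; the function name and distance threshold are assumptions):

```python
import math

def update_tracks(tracks, detections, max_dist=50.0):
    """Greedily associate new trocar detections with existing tracks
    by nearest centroid; unmatched detections open new tracks.

    tracks: dict of track id -> (x, y) last known centroid
    detections: list of (x, y) centroids from the detector
    """
    next_id = max(tracks, default=-1) + 1
    unmatched = list(detections)
    for tid, centroid in list(tracks.items()):
        if not unmatched:
            break
        # closest remaining detection for this track
        dist, best = min((math.dist(centroid, d), d) for d in unmatched)
        if dist <= max_dist:
            tracks[tid] = best
            unmatched.remove(best)
    for det in unmatched:
        tracks[next_id] = det  # a trocar newly appearing in view
        next_id += 1
    return tracks
```

Each tracked trocar identity would then feed a temporal model that decides whether the trocar is occupied by an instrument.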
Affiliation(s)
- Franziska Jurosch
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
- Lars Wagner
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
- Alissa Jell
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
  - Department of Surgery, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
- Esra Islertas
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
- Dirk Wilhelm
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
  - Department of Surgery, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
- Maximilian Berlet
  - Research Group MITI, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
  - Department of Surgery, Klinikum rechts der Isar, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany
2
Xu S, Hu B, Liu R, Zhao X, Sun M. Liquid-Driven Microinjection System for Precise Fundus Injection. Sensors (Basel) 2024;24:2140. PMID: 38610350; PMCID: PMC11014097; DOI: 10.3390/s24072140.
Abstract
Microinjection is commonly used to treat retinal disorders such as retinal vein cannulation and displaced submacular hemorrhage. Currently, the procedure is usually performed with the viscous fluid control of a standard vitrectomy system, which applies a fixed air pressure through foot pedal activation. With fixed pressure, the injection process is uncontrollable and lacks feedback; the resulting high flow rate of the injected drug may damage the fundus tissue. In this paper, a liquid-driven microinjection system with a flow sensor is designed and developed specifically for fundus injection. In addition, a PID sliding mode control (SMC) method is proposed to achieve precise injection with this system. Experimental results from simulated fundus injections demonstrate that the microinjection system meets the requirements of fundus injection and reduces the impact of the injection process on the fundus tissue.
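A PID sliding mode controller of the kind this abstract names typically augments a PID law with a switching term on a sliding surface. A hedged one-step sketch (gains, surface definition, and boundary-layer width are illustrative assumptions, not the paper's tuned design):

```python
def pid_smc_step(err, err_prev, err_int, dt,
                 kp=1.0, ki=0.1, kd=0.05, ks=0.5, eps=0.01):
    """One control step for a flow-rate error signal: PID terms plus a
    saturated sliding-mode switching term that drives the error to the
    surface s = kp*err + kd*d(err)/dt while limiting chattering.

    Returns the control output and the updated error integral.
    """
    derr = (err - err_prev) / dt
    err_int = err_int + err * dt
    s = kp * err + kd * derr              # sliding surface
    sat = max(-1.0, min(1.0, s / eps))    # boundary-layer saturation
    u = kp * err + ki * err_int + kd * derr + ks * sat
    return u, err_int
```

In a real system the output `u` would drive the liquid actuator, with `err` taken as the difference between the target and the measured flow rate from the flow sensor.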
Affiliation(s)
- Shiyu Xu
  - National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Tianjin Key Laboratory of Intelligent Robotics, Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China
  - Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
- Bo Hu
  - National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Tianjin Key Laboratory of Intelligent Robotics, Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China
  - Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
- Rongxin Liu
  - National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Tianjin Key Laboratory of Intelligent Robotics, Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China
  - Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
- Xin Zhao
  - National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Tianjin Key Laboratory of Intelligent Robotics, Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China
  - Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
- Mingzhu Sun
  - National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Tianjin Key Laboratory of Intelligent Robotics, Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China
  - Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
3
Birch J, Da Cruz L, Rhode K, Bergeles C. Trocar localisation for robot-assisted vitreoretinal surgery. Int J Comput Assist Radiol Surg 2024;19:191-198. PMID: 37354219; PMCID: PMC10838829; DOI: 10.1007/s11548-023-02987-y.
Abstract
PURPOSE Robot-assisted vitreoretinal surgery provides precise and consistent operations on the back of the eye. To perform this safely, knowledge of the surgical instrument's remote centre of motion (RCM) and of the insertion point into the eye (trocar) is required. This enables the robot to align both positions so that the instrument pivots about the trocar, preventing damaging lateral forces from being exerted. METHODS Building on a system developed in previous work, this study presents a trocar localisation method that uses a micro-camera mounted on a vitreoretinal surgical forceps to track two ArUco markers attached on either side of a trocar. The trocar position is estimated as the midpoint between the markers. RESULTS Experimental evaluation of the trocar localisation showed an RMSE of 1.82 mm for the localisation of the markers and an RMSE of 1.24 mm for the trocar localisation. CONCLUSIONS The proposed camera-based trocar localisation is reasonably consistent and accurate and improves on other current methods. Optimum accuracy for this application would require a 1.4 mm absolute error margin, corresponding to the trocar's radius. The trocar localisation results fall within this margin, but the marker localisation requires further refinement to stay consistently within it. Future work will refine these position estimates and ensure the error remains within this boundary.
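The midpoint estimate and RMSE evaluation described in this abstract are simple to express once the marker positions are recovered in a common 3D frame. A minimal sketch under that assumption (function names are hypothetical, and the ArUco pose estimation itself is not shown):

```python
import math

def trocar_from_markers(m1, m2):
    """Estimate the trocar centre as the midpoint of the two ArUco
    marker positions (any consistent unit, e.g. mm)."""
    return tuple((a + b) / 2.0 for a, b in zip(m1, m2))

def rmse(estimates, ground_truth):
    """Root-mean-square error over paired 3D point estimates."""
    sq = [math.dist(e, g) ** 2 for e, g in zip(estimates, ground_truth)]
    return math.sqrt(sum(sq) / len(sq))
```

Because the trocar estimate averages two independent marker localisations, part of the marker error cancels, which is consistent with the trocar RMSE (1.24 mm) being lower than the marker RMSE (1.82 mm).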
Affiliation(s)
- Jeremy Birch
  - School of Biomedical Engineering and Imaging Sciences, King's College London, Strand, London, WC2R 2LS, UK
- Lyndon Da Cruz
  - Moorfields Eye Hospital, 162 City Rd, London, EC1V 2PD, UK
- Kawal Rhode
  - School of Biomedical Engineering and Imaging Sciences, King's College London, Strand, London, WC2R 2LS, UK
- Christos Bergeles
  - School of Biomedical Engineering and Imaging Sciences, King's College London, Strand, London, WC2R 2LS, UK
4
Alikhani A, Osner S, Dehghani S, Busam B, Inagaki S, Maier M, Navab N, Nasseri MA. RCIT: A Robust Catadioptric-Based Instrument 3D Tracking Method for Microsurgical Instruments in a Single-Camera System. Annu Int Conf IEEE Eng Med Biol Soc 2023;2023:1-5. PMID: 38083453; DOI: 10.1109/embc40787.2023.10340955.
Abstract
The field of robotic microsurgery and micro-manipulation has evolved profoundly in recent years, particularly with regard to accuracy, precision, versatility, and dexterity. These advancements have the potential to revolutionize high-precision biomedical procedures such as neurosurgery, vitreoretinal surgery, and cell micro-manipulation. However, a critical challenge in developing micron-precision robotic systems is accurately verifying the end-effector motion in 3D. Such verification is complicated by environmental vibrations, inaccuracies in mechanical assembly, and other physical uncertainties. To overcome these challenges, this paper proposes a novel single-camera framework that uses mirrors with known geometric parameters to estimate the 3D position of the microsurgical instrument. The Euclidean distance between the points reconstructed by the algorithm and the robot movement recorded by highly accurate encoders is taken as the error. The method achieves a mean absolute error of 0.044 mm when tested on a 23G surgical cannula with a diameter of 0.640 mm, operating at a resolution of 4024 × 3036 at 30 frames per second.
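The core idea behind a catadioptric setup like the one in this abstract is that a mirror with known geometry gives the single camera a virtual second viewpoint: a scene point's mirror image is its reflection across the mirror plane. A small sketch of that reflection step, assuming a known unit plane normal (an illustration of the geometry only, not the paper's full reconstruction pipeline):

```python
def reflect_point(p, n, d):
    """Reflect a 3D point p across the mirror plane n . x = d, where n
    is the unit plane normal. In a catadioptric system, the camera's
    view of the mirror image is equivalent to a virtual camera viewing
    this reflected point, enabling triangulation from one real camera."""
    signed = sum(pi * ni for pi, ni in zip(p, n)) - d  # signed distance to plane
    return tuple(pi - 2.0 * signed * ni for pi, ni in zip(p, n))
```

Triangulating the direct view against the mirrored view then yields the instrument tip's 3D position, which can be compared against encoder readings to compute the reported error.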
5
Dehghani S, Sommersperger M, Zhang P, Martin-Gomez A, Busam B, Gehlbach P, Navab N, Nasseri MA, Iordachita I. Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing. IEEE Int Conf Robot Autom (ICRA) 2023;2023:4724-4731. PMID: 38125032; PMCID: PMC10732544; DOI: 10.1109/icra48891.2023.10160372.
Abstract
In the last decade, various robotic platforms have been introduced that can support delicate retinal surgeries. Concurrently, recent advances have enabled microscope-integrated intraoperative Optical Coherence Tomography (iOCT) with high-resolution 3D imaging at near video rate, providing semantic understanding of the surgical area. The combination of robotics and semantic understanding enables task autonomy in robotic retinal surgery, such as for subretinal injection, a procedure that requires precise needle insertion for the best treatment outcomes. However, merging robotic systems with iOCT introduces new challenges, including, but not limited to, high demands on data processing rates and dynamic registration of these systems during the procedure. In this work, we propose a framework for autonomous robotic navigation for subretinal injection based on intelligent real-time processing of iOCT volumes. Our method consists of an instrument pose estimation method, an online registration between the robotic and iOCT systems, and trajectory planning tailored for navigation to an injection target. We also introduce intelligent virtual B-scans, a volume slicing approach for rapid instrument pose estimation enabled by Convolutional Neural Networks (CNNs). Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method. Finally, we discuss challenges identified in this work and suggest potential solutions to further the development of such systems.
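A virtual B-scan is, at its simplest, a 2D resampling of the iOCT volume along an arbitrary plane. A nearest-neighbour sketch of that resampling under this simplification (the paper's CNN-driven slice selection and instrument pose estimation are not reproduced here; all names are illustrative):

```python
def virtual_bscan(volume, origin, u_dir, v_dir, width, height):
    """Sample a virtual B-scan (2D slice) from a 3D volume stored as
    nested lists volume[z][y][x]. The slice plane starts at
    origin = (x, y, z) and is spanned by direction vectors u_dir and
    v_dir (in voxel units). Nearest-neighbour sampling; out-of-bounds
    voxels read as 0."""
    def voxel(x, y, z):
        xi, yi, zi = round(x), round(y), round(z)
        if (0 <= zi < len(volume) and 0 <= yi < len(volume[0])
                and 0 <= xi < len(volume[0][0])):
            return volume[zi][yi][xi]
        return 0
    return [
        [voxel(origin[0] + i * u_dir[0] + j * v_dir[0],
               origin[1] + i * u_dir[1] + j * v_dir[1],
               origin[2] + i * u_dir[2] + j * v_dir[2])
         for i in range(width)]
        for j in range(height)
    ]
```

Choosing `origin`, `u_dir`, and `v_dir` so the slice plane passes through the estimated instrument axis lets a 2D network operate on a single informative slice instead of the full volume, which is the speed advantage such slicing aims for.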
Affiliation(s)
- Shervin Dehghani
  - Department of Computer Science, Technische Universität München, München 85748, Germany
  - Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Michael Sommersperger
  - Department of Computer Science, Technische Universität München, München 85748, Germany
- Peiyao Zhang
  - Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Alejandro Martin-Gomez
  - Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Benjamin Busam
  - Department of Computer Science, Technische Universität München, München 85748, Germany
- Peter Gehlbach
  - Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD, USA
- Nassir Navab
  - Computer Aided Medical Procedures & Augmented Reality, Technical University of Munich, 85748 Munich, Germany; adjunct professor, Whiting School of Engineering, Johns Hopkins University, Baltimore, MD, USA
- M. Ali Nasseri
  - Department of Computer Science, Technische Universität München, München 85748, Germany
  - Augenklinik und Poliklinik, Klinikum rechts der Isar der Technischen Universität München, München 81675, Germany
- Iulian Iordachita
  - Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA