1. Su B, Zhang Q, Gong Y, Xiu W, Gao Y, Xu L, Li H, Wang Z, Yu S, Hu YD, Yao W, Wang J, Li C, Tang J, Gao L. Deep learning-based classification and segmentation for scalpels. Int J Comput Assist Radiol Surg 2023;18:855-864. [PMID: 36602643] [DOI: 10.1007/s11548-022-02825-7]
Abstract
PURPOSE Scalpels are typical cutting tools in surgery, and the surgical tray is one of the places where scalpels are found during an operation. However, no method has been reported for the classification and segmentation of multiple types of scalpels. This paper presents a multi-type scalpel dataset together with a classification and segmentation method; the method serves as a first step for validating scalpel segmentation, and further applications include distinguishing scalpels from other tools in different clinical scenarios. METHODS The proposed dataset contains 6400 images with labeled information for 10 types of scalpels. A classification and segmentation model for multiple scalpel types was obtained by training Mask R-CNN on this dataset. The article concludes with an analysis and evaluation of network performance, verifying the feasibility of the approach. RESULTS A multi-type scalpel dataset was established, and classification and segmentation models were obtained by training Mask R-CNN on it. Average accuracy and average recall reached 94.19% and 96.61%, respectively, in the classification task, and 93.30% and 95.14% in the segmentation task. CONCLUSION This is the first dataset covering multiple types of scalpels, and the classification and segmentation of multiple scalpel types are realized for the first time. The study achieves classification and segmentation of scalpels in a surgical-tray scene, providing a potential solution for scalpel recognition, localization, and tracking.
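The abstract gives no implementation details, but for orientation, a minimal sketch of the kind of Mask R-CNN fine-tuning it describes might look like the following. This assumes torchvision's off-the-shelf Mask R-CNN and 10 scalpel classes plus background; the image size, box, and mask are placeholder values, not the authors' setup.

```python
# Hedged sketch: fine-tuning torchvision's Mask R-CNN for 10 scalpel
# classes (plus background), as the abstract describes. Not the authors'
# code; all dataset details are placeholders.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 11  # 10 scalpel types + background

def build_model(num_classes: int = NUM_CLASSES):
    # Start from COCO-pretrained weights, then swap both heads so they
    # predict the scalpel classes instead of the COCO categories.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, num_classes)
    return model

model = build_model()
model.train()

# One illustrative training step: torchvision expects a list of images and
# a list of target dicts with "boxes", "labels", and "masks".
image = torch.rand(3, 512, 512)
mask = torch.zeros(1, 512, 512, dtype=torch.uint8)
mask[0, 120:200, 100:300] = 1  # placeholder instance mask
target = {
    "boxes": torch.tensor([[100.0, 120.0, 300.0, 200.0]]),
    "labels": torch.tensor([1]),  # scalpel type 1
    "masks": mask,
}
loss_dict = model([image], [target])  # classification, box, and mask losses
total_loss = sum(loss_dict.values())
total_loss.backward()
```

Reported averages (e.g., the 94.19%/96.61% classification figures) would then come from evaluating such a model's predictions on a held-out split of the 6400-image dataset.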
Affiliation(s)
- Baiquan Su
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Qingqian Zhang
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Yi Gong
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Wei Xiu
- Chinese Institute of Electronics, Beijing, China
- Yang Gao
- Chinese Institute of Electronics, Beijing, China
- Lixin Xu
- Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing, China
- Han Li
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Zehao Wang
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Shi Yu
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing, China
- Yida David Hu
- Brigham and Women's Hospital, Harvard Medical School, Boston, USA
- Wei Yao
- Gastroenterology Department, Peking University Third Hospital, Beijing, China
- Junchen Wang
- School of Mechanical Engineering and Automation, Beihang University, Beijing, China
- Changsheng Li
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Jie Tang
- Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing, China
- Li Gao
- Department of Periodontology, National Stomatological Center, Peking University School and Hospital of Stomatology, Beijing, China
- National Clinical Research Center for Oral Diseases, Beijing, China
- National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
- Beijing Key Laboratory of Digital Stomatology, Beijing, China
2. Gumbs AA, Grasso V, Bourdel N, Croner R, Spolverato G, Frigerio I, Illanes A, Abu Hilal M, Park A, Elyan E. The Advances in Computer Vision That Are Enabling More Autonomous Actions in Surgery: A Systematic Review of the Literature. Sensors 2022;22:4918. [PMID: 35808408] [PMCID: PMC9269548] [DOI: 10.3390/s22134918]
Abstract
This review focuses on advances in, and current limitations of, computer vision (CV), and on how CV can help us achieve more autonomous actions in surgery. It is a follow-up to an article we previously published in Sensors, “Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery?” Whereas that article also discussed machine learning, deep learning, and natural language processing, this review delves deeper into the field of CV. Additionally, non-visual forms of data that can aid computerized robots in performing more autonomous actions, such as instrument priors and audio haptics, are highlighted. Furthermore, the current existential crisis for surgeons, endoscopists, and interventional radiologists regarding greater autonomy during procedures is discussed. In summary, this paper discusses how to harness the power of CV to keep the doctors who perform interventions in the loop.
Affiliation(s)
- Andrew A. Gumbs
- Departement de Chirurgie Digestive, Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 78300 Poissy, France
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Correspondence: ; Tel.: +33-139274873
- Vincent Grasso
- Family Christian Health Center, 31 West 155th St., Harvey, IL 60426, USA
- Nicolas Bourdel
- Gynecological Surgery Department, CHU Clermont-Ferrand, 1 Place Lucie-Aubrac, 63100 Clermont-Ferrand, France
- EnCoV, Institut Pascal, UMR6602 CNRS, UCA, Clermont-Ferrand University Hospital, 63000 Clermont-Ferrand, France
- SurgAR-Surgical Augmented Reality, 63000 Clermont-Ferrand, France
- Roland Croner
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Gaya Spolverato
- Department of Surgical, Oncological and Gastroenterological Sciences, University of Padova, 35122 Padova, Italy
- Isabella Frigerio
- Department of Hepato-Pancreato-Biliary Surgery, Pederzoli Hospital, 37019 Peschiera del Garda, Italy
- Alfredo Illanes
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Mohammad Abu Hilal
- Unità Chirurgia Epatobiliopancreatica, Robotica e Mininvasiva, Fondazione Poliambulanza Istituto Ospedaliero, Via Bissolati, 57, 25124 Brescia, Italy
- Adrian Park
- Anne Arundel Medical Center, Johns Hopkins University, Annapolis, MD 21401, USA
- Eyad Elyan
- School of Computing, Robert Gordon University, Aberdeen AB10 7JG, UK
3. Object Detection and Distance Measurement in Teleoperation. Machines 2022;10:402. [DOI: 10.3390/machines10050402]
Abstract
In recent years, teleoperation has developed rapidly, and numerous applications in diverse areas have been reported. Among teleoperation-related components, computer vision (CV) is treated as a must-have technology because it allows users to observe remote scenes; it can further help the user identify and track desired targets within complex scenes. Efficient CV methods have been shown to significantly improve operation accuracy and to relieve the user's physical and mental fatigue. A deeper understanding of CV techniques and a review of the latest research outcomes are therefore valuable to teleoperation designers, and this review article was composed in that context.
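The review's title pairs object detection with distance measurement; as a hedged illustration (not drawn from the review itself), the simplest monocular baseline combines a detector's bounding box with the pinhole camera model Z = f·W/w, where f is the focal length in pixels, W the known physical width of the target, and w the detected box width in pixels. All values below are made-up examples.

```python
# Hedged sketch: pinhole-model distance estimate from a detected bounding
# box, assuming a calibrated camera and a target of known physical width.
def estimate_distance(focal_px: float, real_width_m: float, bbox_width_px: float) -> float:
    """Return distance Z = f * W / w in meters."""
    if bbox_width_px <= 0:
        raise ValueError("bounding-box width must be positive")
    return focal_px * real_width_m / bbox_width_px

# Example: a 0.40 m wide target imaged 160 px wide by a camera with an
# 800 px focal length is roughly 2.0 m away.
print(estimate_distance(800.0, 0.40, 160.0))  # -> 2.0
```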
4. Wang J, Yue C, Wang G, Gong Y, Li H, Yao W, Kuang S, Liu W, Wang J, Su B. Task Autonomous Medical Robot for Both Incision Stapling and Staples Removal. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3141452]