1
Kim M, Zhang Y, Jin S. Soft tissue surgical robot for minimally invasive surgery: a review. Biomed Eng Lett 2023; 13:561-569. PMID: 37872994; PMCID: PMC10590359; DOI: 10.1007/s13534-023-00326-3.
Abstract
Purpose The current state of soft tissue surgical robots is surveyed and the key technologies underlying their success are analyzed; state-of-the-art technologies are introduced and future directions are discussed. Methods Relevant literature is explored, analyzed, and summarized. Results Soft tissue surgical robots have spread rapidly in the field of laparoscopic surgery, enabled by multi-degree-of-freedom movement of intra-abdominal surgical tools and by stereoscopic imaging, neither of which is possible in conventional surgery. The three key technologies behind their success are wire-driven mechanisms for multi-degree-of-freedom movement, master devices for intuitive remote control, and stereoscopic imaging. Recently, human-robot interaction technologies have been applied to user interfaces such as vision assistance and haptic feedback, and research on autonomous surgery has begun. Conclusion Robotic surgery not only replaces conventional laparoscopic surgery but also enables complex operations that are not possible laparoscopically. It has, however, been criticized for its high cost and for the lack of demonstrated clinical superiority or patient benefit over conventional laparoscopic surgery. As more robots compete in the market, the cost of surgical robots is expected to fall. Surgical robots are expected to continue evolving, driven by the need to reduce the workload of medical staff and to meet the level of care demanded by patients.
2
Shahkoo AA, Abin AA. Deep reinforcement learning in continuous action space for autonomous robotic surgery. Int J Comput Assist Radiol Surg 2023; 18:423-431. PMID: 36383302; DOI: 10.1007/s11548-022-02789-8.
Abstract
PURPOSE Reinforcement learning has shown promising results for automating sub-tasks in robotic surgery systems. With these methods, surgical robots have achieved good performance, allowing their use in complex, high-risk settings such as surgical pattern cutting to reduce stress and pressure on the surgeon and to increase surgical accuracy. This study aims to provide a deep reinforcement learning-based approach to controlling the gripper arm in a continuous action space when cutting soft tissue. METHODS Surgical soft tissue cutting is performed by controlling the gripper arm in a continuous action space with a grid observation space. Using deep reinforcement learning, the proposed method finds an optimal tensioning policy in the continuous action space that increases the cutting accuracy for a predetermined pattern. RESULTS Simulation results demonstrated that, in cutting many complex patterns, the proposed method outperforms methods in which tensioning is performed in a discrete action space and the observation space is modeled as a partial, random representation. CONCLUSION We introduced a deep reinforcement learning-based method for obtaining the optimal tensioning policy in a continuous action space when cutting a predetermined pattern, and showed that it outperforms the state-of-the-art method in the soft pattern cutting task with respect to accuracy.
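The core idea of a continuous-action tensioning policy can be sketched with a toy REINFORCE agent: a Gaussian policy over a single continuous action (the applied tension) whose mean is updated from sampled rewards. The environment, reward shape, and every parameter value below are illustrative assumptions, not the paper's simulator or algorithm:

```python
import random

random.seed(0)

# Toy stand-in for a pattern-cutting environment (an assumption, not the
# authors' simulator): reward peaks when the continuous tensioning action
# matches an unknown optimal tension.
OPTIMAL_TENSION = 0.7

def reward(action: float) -> float:
    return -(action - OPTIMAL_TENSION) ** 2

def train(episodes: int = 5000, lr: float = 0.02, sigma: float = 0.2) -> float:
    """REINFORCE with a fixed-variance Gaussian policy over one continuous action."""
    mu, baseline = 0.0, 0.0  # policy mean (learnable) and running reward baseline
    for _ in range(episodes):
        action = random.gauss(mu, sigma)   # sample from the continuous action space
        r = reward(action)
        baseline += 0.05 * (r - baseline)  # running average as a variance-reduction baseline
        # gradient of log N(action; mu, sigma) w.r.t. mu is (action - mu) / sigma^2
        mu += lr * (r - baseline) * (action - mu) / sigma ** 2
    return mu

learned = train()  # converges near OPTIMAL_TENSION
```

A discrete variant would instead pick from a fixed grid of tension values; the continuous policy can settle on any intermediate value, which is the advantage the abstract reports.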
3
Chadebecq F, Lovat LB, Stoyanov D. Artificial intelligence and automation in endoscopy and surgery. Nat Rev Gastroenterol Hepatol 2023; 20:171-182. PMID: 36352158; DOI: 10.1038/s41575-022-00701-y.
Abstract
Modern endoscopy relies on digital technology, from high-resolution imaging sensors and displays to electronics connecting configurable illumination and actuation systems for robotic articulation. In addition to enabling more effective diagnostic and therapeutic interventions, the digitization of the procedural toolset enables video capture of the internal human anatomy at unprecedented levels. Interventional video data encapsulate functional and structural information about a patient's anatomy as well as events, activity and action logs of the surgical process. This detailed but difficult-to-interpret record of endoscopic procedures can be linked to preoperative and postoperative records or patient imaging information. Rapid advances in artificial intelligence, especially in supervised deep learning, can exploit these data to build systems for computer-assisted interventions, enabling better navigation during procedures, automated image interpretation and robotically assisted tool manipulation. In this Perspective, we summarize state-of-the-art artificial intelligence for computer-assisted interventions in gastroenterology and surgery.
4
Nillahoot N, Pillai BM, Sharma B, Wilasrusmee C, Suthakorn J. Interactive 3D Force/Torque Parameter Acquisition and Correlation Identification during Primary Trocar Insertion in Laparoscopic Abdominal Surgery: 5 Cases. Sensors (Basel) 2022; 22:8970. PMID: 36433567; PMCID: PMC9698636; DOI: 10.3390/s22228970.
Abstract
Laparoscopic procedures have become indispensable in gastrointestinal surgery. As minimally invasive procedures, they begin with primary trocar insertion. However, this step risks injury to the gastrointestinal tract and blood vessels, so understanding the insertion process is crucial to the development of robot-assisted and automated surgery. To support robotic development, this study examines the interactive force/torque (F/T) behavior between the trocar and the abdomen during insertion. For F/T data acquisition, surgeons performed the insertion with a trocar interfaced to a six-axis F/T sensor. The study covered five abdominal hernia cases in the Department of Surgery, Faculty of Medicine, Ramathibodi Hospital, Mahidol University, and the real-time F/T data were processed and analyzed. The F/T parameters fluctuated substantially, with peak force ranging from 16.83 N to 61.86 N and peak torque from 0.552 Nm to 1.76 Nm. Force was observed to correlate positively with procedural time, while torque correlated negatively. Although surgeons applied force and torque in multiple axes during the procedure, a push-and-turn motion in a single axis was observed to be sufficient for a robotic system, and a system with low push force and high torque was observed to be advantageous for minimal tissue damage in less procedural time. These findings will benefit the development of computer-assisted and robotic technology to improve the outcome of primary trocar insertion.
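The correlation analysis described above reduces to computing Pearson coefficients between per-case peaks and procedural time. The per-case values below are hypothetical placeholders chosen to fall inside the reported peak ranges (they are not the study's raw data); only the correlation computation itself is shown:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-case values: peak force [N], peak torque [Nm],
# and procedural time [s] for five cases.
peak_force = [16.8, 25.0, 34.1, 48.9, 61.9]
peak_torque = [1.76, 1.40, 1.10, 0.80, 0.55]
proc_time = [40.0, 55.0, 70.0, 95.0, 120.0]

r_force = pearson(peak_force, proc_time)    # positive, matching the reported trend
r_torque = pearson(peak_torque, proc_time)  # negative, matching the reported trend
```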
5
Zhu L, Shen J, Yang S, Song A. Robot-Assisted Retraction for Transoral Surgery. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3211491.
6
Surgical Tool Datasets for Machine Learning Research: A Survey. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01640-6.
Abstract
This paper is a comprehensive survey of datasets for surgical tool detection and of related surgical data science and machine learning techniques and algorithms. The survey offers a high-level perspective of current research in this area, analyses the taxonomy of approaches adopted by researchers using surgical tool datasets, and addresses key areas of research, such as the datasets used, the evaluation metrics applied and the deep learning techniques utilised. Our presentation and taxonomy provide a framework that facilitates greater understanding of current work and highlights the challenges and opportunities for further innovative and useful research.
7
Li L, Li X, Ding S, Fang Z, Xu M, Ren H, Yang S. SIRNet: Fine-Grained Surgical Interaction Recognition. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3148454.
8
Nagy TD, Haidegger T. Performance and Capability Assessment in Surgical Subtask Automation. Sensors (Basel) 2022; 22:2501. PMID: 35408117; PMCID: PMC9002652; DOI: 10.3390/s22072501.
Abstract
Robot-Assisted Minimally Invasive Surgery (RAMIS) has reshaped standard clinical practice over the past two decades. Many believe that the next big step in the advancement of RAMIS will be partial autonomy, which may reduce the surgeon's fatigue and cognitive load by performing the monotonous, time-consuming subtasks of a procedure autonomously. Although considerable research effort is devoted to this area worldwide, standard evaluation methods, metrics, and benchmarking techniques have yet to be established. This article aims to fill that void by proposing standard methodologies for performance evaluation in surgical subtask automation. A novel characterization model for surgical automation is presented; current metrics for performance evaluation and comparison are reviewed and analyzed; and a workflow model is introduced to help researchers identify and apply their choice of metrics. Existing systems and setups that serve, or could serve, as benchmarks are also introduced, and the need for standard benchmarks in the field is articulated. Finally, Human-Machine Interface (HMI) quality, robustness, and the related legal and ethical issues are discussed.
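Kinematic metrics of the kind such evaluations typically draw on, for example subtask completion time and tool-tip path length, are straightforward to compute from logged trajectories. The trajectory below is hypothetical, and these two metrics are common examples rather than the specific set the article proposes:

```python
import math

def path_length(traj):
    """Total tool-tip distance travelled over a list of (x, y, z) samples."""
    return sum(math.dist(p, q) for p, q in zip(traj, traj[1:]))

def completion_time(timestamps):
    """Elapsed subtask time from first to last sample."""
    return timestamps[-1] - timestamps[0]

# Hypothetical logged execution of an automated subtask (metres, seconds).
trajectory = [(0.0, 0.0, 0.0), (0.03, 0.0, 0.0), (0.03, 0.04, 0.0)]
timestamps = [0.0, 1.5, 3.0]

length = path_length(trajectory)        # 0.07 m
duration = completion_time(timestamps)  # 3.0 s
```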
9
Bardozzo F, Collins T, Forgione A, Hostettler A, Tagliaferri R. StaSiS-Net: a stacked and siamese disparity estimation network for depth reconstruction in modern 3D laparoscopy. Med Image Anal 2022; 77:102380. DOI: 10.1016/j.media.2022.102380.
10
Moglia A, Georgiou K, Georgiou E, Satava RM, Cuschieri A. A systematic review on artificial intelligence in robot-assisted surgery. Int J Surg 2021; 95:106151. PMID: 34695601; DOI: 10.1016/j.ijsu.2021.106151.
Abstract
BACKGROUND Despite the extensive published literature on the significant potential of artificial intelligence (AI), there are no reports on its efficacy in improving patient safety in robot-assisted surgery (RAS). The purposes of this work are to systematically review the published literature on AI in RAS and to identify and discuss current limitations and challenges. MATERIALS AND METHODS A literature search was conducted on PubMed, Web of Science, Scopus, and IEEE Xplore according to the PRISMA 2020 statement. Eligible articles were peer-reviewed studies published in English from January 1, 2016 to December 31, 2020. AMSTAR 2 was used for quality assessment, risk of bias was evaluated with the Newcastle-Ottawa quality assessment tool, and study data were presented in tables using the SPIDER tool. RESULTS Thirty-five publications, representing 3436 patients, met the search criteria and were included in the analysis. The selected reports concern motion analysis (n = 17), urology (n = 12), gynecology (n = 1), other specialties (n = 1), training (n = 3), and tissue retraction (n = 1). Precision for surgical tool detection varied from 76.0% to 90.6%. Mean absolute error in predicting urinary continence after robot-assisted radical prostatectomy (RARP) ranged from 85.9 to 134.7 days. Accuracy in predicting length of stay after RARP was 88.5%, and accuracy in recognizing the next surgical task during robot-assisted partial nephrectomy (RAPN) reached 75.7%. CONCLUSION The reviewed studies were of low quality, and the findings are limited by the small size of the datasets. Comparison between studies on the same topic was restricted by the heterogeneity of algorithms and datasets. There is no proof that AI can currently identify the critical tasks of RAS operations that determine patient outcome. There is an urgent need for studies on large datasets and for external validation of the AI algorithms used. Furthermore, the results should be transparent and meaningful to surgeons, enabling them to inform patients in layman's terms. REGISTRATION Review Registry Unique Identifying Number: reviewregistry1225.
11
Ren Q, Zhu W, Feng Z, Liang W. Learning-Based Force Control of a Surgical Robot for Tool-Soft Tissue Interaction. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3093018.
12
Qin Y, Allan M, Burdick JW, Azizian M. Autonomous Hierarchical Surgical State Estimation During Robot-Assisted Surgery Through Deep Neural Networks. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3091728.
13
Tagliabue E, Dall'Alba D, Pfeiffer M, Piccinelli M, Marin R, Castellani U, Speidel S, Fiorini P. Data-Driven Intra-Operative Estimation of Anatomical Attachments for Autonomous Tissue Dissection. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3060655.
14
Qin Y, Allan M, Yue Y, Burdick JW, Azizian M. Learning Invariant Representation of Tasks for Robust Surgical State Estimation. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3063014.
15
Su B, Yu S, Li X, Gong Y, Li H, Ren Z, Xia Y, Wang H, Zhang Y, Yao W, Wang J, Tang J. Autonomous Robot for Removing Superficial Traumatic Blood. IEEE J Transl Eng Health Med 2021; 9:2600109. PMID: 33598368; PMCID: PMC7880304; DOI: 10.1109/jtehm.2021.3056618.
Abstract
Objective: Removing blood from an incision to locate the bleeding spot is a key task during surgery; excessive discharge of blood endangers the patient's life. However, repetitive manual blood removal imposes a considerable workload that contributes to surgeon fatigue, so a robotic system that automatically removes blood from the incision surface would be valuable. Methods: We design a robotic system to perform the surgical task of blood removal. The system consists of a pair of dual cameras, a 6-DoF robotic arm, an aspirator whose handle is fixed to the arm, and a pump connected to the aspirator. A path-planning algorithm generates the path the aspirator tip should follow to remove blood. Results: In a group of simulated bleeding experiments on ex vivo porcine tissue, the contour of the blood region was detected and the spatial coordinates of the detected contour were reconstructed. The BRR robot thoroughly cleaned the blood running out of the incision. Conclusions: This study contributes the first result on designing an autonomous blood-removal medical robot. The blood-removal skill, currently performed manually by surgeons, can alternatively be carried out by the proposed BRR robot.
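The abstract does not specify the path-planning algorithm; a common choice for sweeping an aspirator tip over a detected region is a boustrophedon (lawnmower) pass over the region's bounding box. The sketch below, including the bounding-box coordinates and step size, is an illustrative assumption rather than the authors' method:

```python
def boustrophedon_path(bbox, step):
    """Lawnmower-style sweep over the axis-aligned bounding box of a
    detected blood region; bbox = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = bbox
    path, y, left_to_right = [], ymin, True
    while y <= ymax:
        row = [(xmin, y), (xmax, y)]
        path.extend(row if left_to_right else row[::-1])  # alternate sweep direction
        left_to_right = not left_to_right
        y += step
    return path

# Hypothetical bounding box of a reconstructed blood contour (mm), 5 mm row spacing.
waypoints = boustrophedon_path((0.0, 0.0, 20.0, 10.0), step=5.0)
```

In practice the waypoints would be projected onto the reconstructed tissue surface before being sent to the arm, since the sweep above is planar.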