1
Neri A, Coduri M, Penza V, Santangelo A, Oliveri A, Turco E, Pizzirani M, Trinceri E, Soriero D, Boero F, Ricci S, Mattos LS. A novel affordable user interface for robotic surgery training: design, development and usability study. Front Digit Health 2024;6:1428534. PMID: 39139587; PMCID: PMC11319275; DOI: 10.3389/fdgth.2024.1428534.
Abstract
Introduction The use of robotic systems in the surgical domain has been groundbreaking for patients and surgeons over the last decades. As the annual number of robotic surgical procedures continues to increase rapidly, it is essential to provide surgeons with innovative training courses alongside the standard specialization path. To this end, simulators play a fundamental role. Currently, the high cost of the leading VR simulators limits their accessibility to educational institutions. The challenge lies in balancing high-fidelity simulation with cost-effectiveness; however, few cost-effective options exist for robotic surgery training. Methods This paper proposes the design, development and user-centered usability study of an affordable user interface to control a surgical robot simulator. It consists of a cart equipped with two haptic interfaces, a VR visor and two pedals. The simulations were created using Unity, which offers the versatility to expand the simulator to more complex scenes. Intuitive teleoperation control of the simulated robotic instruments is achieved through a high-level control strategy. Results and Discussion The system's affordability and resemblance to real surgeon consoles make it well suited for implementing robotic surgery training programs in medical schools, broadening access to such training. This is demonstrated by the results of a usability study involving expert surgeons who use surgical robots regularly, expert surgeons without robotic surgery experience, and a control group. The results of the study, based on a traditional peg-board exercise and a camera control task, demonstrate the simulator's high usability and intuitive control across diverse user groups, including those with limited experience. This offers evidence that this affordable system is a promising solution for expanding robotic surgery training.
Affiliation(s)
- Alberto Neri
- Biomedical Robotics Lab, Advanced Robotics, Istituto Italiano di Tecnologia, Genoa, Italy
- Department of Computer Science, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, Genoa, Italy
- Mara Coduri
- Department of Computer Science, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, Genoa, Italy
- Simulation and Advanced Education Center, University of Genoa, Genoa, Italy
- Veronica Penza
- Biomedical Robotics Lab, Advanced Robotics, Istituto Italiano di Tecnologia, Genoa, Italy
- Andrea Santangelo
- Biomedical Robotics Lab, Advanced Robotics, Istituto Italiano di Tecnologia, Genoa, Italy
- Alessandra Oliveri
- Department of Computer Science, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, Genoa, Italy
- Enrico Turco
- Humanoid and Human Centred Mechatronics (HHCM), Istituto Italiano di Tecnologia, Genoa, Italy
- Department of Information Engineering and Mathematics, University of Siena, Siena, Italy
- Domenico Soriero
- Unit of Surgical Oncology, IRCCS Policlinico San Martino, Genoa, Italy
- Serena Ricci
- Department of Computer Science, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, Genoa, Italy
- Simulation and Advanced Education Center, University of Genoa, Genoa, Italy
- Leonardo S. Mattos
- Biomedical Robotics Lab, Advanced Robotics, Istituto Italiano di Tecnologia, Genoa, Italy
2
van Amsterdam B, Clarkson MJ, Stoyanov D. Gesture Recognition in Robotic Surgery: A Review. IEEE Trans Biomed Eng 2021;68:2021-2035. PMID: 33497324; DOI: 10.1109/tbme.2021.3054828.
Abstract
OBJECTIVE Surgical activity recognition is a fundamental step in computer-assisted interventions. This paper reviews the state of the art in methods for automatic recognition of fine-grained gestures in robotic surgery, focusing on recent data-driven approaches, and outlines open questions and future research directions. METHODS An article search was performed on 5 bibliographic databases with the following search terms: robotic, robot-assisted, JIGSAWS, surgery, surgical, gesture, fine-grained, surgeme, action, trajectory, segmentation, recognition, parsing. Selected articles were classified based on the level of supervision required for training and divided into groups representing the major frameworks for time series analysis and data modelling. RESULTS A total of 52 articles were reviewed. The research field is expanding rapidly, with the majority of articles published in the last 4 years. Deep-learning-based temporal models with discriminative feature extraction and multi-modal data integration have demonstrated promising results on small surgical datasets. Currently, unsupervised methods perform significantly worse than supervised approaches. CONCLUSION The development of large and diverse open-source datasets of annotated demonstrations is essential for the development and validation of robust solutions for surgical gesture recognition. While new strategies for discriminative feature extraction and knowledge transfer, or unsupervised and semi-supervised approaches, can mitigate the need for data and labels, they have not yet been demonstrated to achieve comparable performance. Important future research directions include the detection and forecasting of gesture-specific errors and anomalies. SIGNIFICANCE This paper is a comprehensive and structured analysis of surgical gesture recognition methods, aiming to summarize the status of this rapidly evolving field.
3
Bielsa VF. Virtual reality simulation in plastic surgery training. Literature review. J Plast Reconstr Aesthet Surg 2021;74:2372-2378. PMID: 33972199; DOI: 10.1016/j.bjps.2021.03.066.
Abstract
Major changes in the medical environment have led to an evolution from traditional residency programmes to competency-based ones. Virtual reality (VR) represents a promising simulation resource for surgical training. Several types of VR simulator can be considered, depending on the level of immersion they offer. The goal of this article is to review the progress of VR simulation in plastic surgery (PS) training. A systematic search of the literature was performed on PUBMED/MEDLINE with the following key words: (Simulation OR Virtual Reality) AND (Education OR Training) AND Plastic Surgery, from January 1998 to September 2019. A total of 244 results were found, of which 80 were selected for abstract review and 64 for complete reading. Several attempts have been made to create VR simulators, most of them non-immersive or partially immersive; their main conclusions are summarized. VR simulation has been shown to have a role in PS training, offering many advantages, and it can also be used for safety training, team interaction and decision-making education. Validation is a key point for the acceptance of simulators. Further efforts are required to include simulation in PS curricula.
Affiliation(s)
- V Fuertes Bielsa
- Plastic Surgery Department University Hospital Miguel Servet Paseo Isabel la Católica, 1-3, 50009 Zaragoza, Spain.
4
Abstract
Just as laparoscopic surgery provided a giant leap in safety and recovery for patients over open surgery, robotic-assisted surgery (RAS) is now doing the same for laparoscopic surgery. The first laparoscopic-RAS systems to be commercialized were the Intuitive Surgical, Inc. (Sunnyvale, CA, USA) da Vinci and the Computer Motion Zeus. These systems were similar in many aspects, which led to a patent dispute between the two companies. Before the dispute was settled in court, Intuitive Surgical bought Computer Motion and thus came to own critical patents for laparoscopic-RAS. Recently, the patents held by Intuitive Surgical have begun to expire, leading to many new laparoscopic-RAS systems being developed and entering the market. In this study, we review the newly commercialized and prototype laparoscopic-RAS systems, comparing the features of the imaging and display technology, surgeon's console and patient cart of the reviewed systems. We also briefly discuss the future directions of laparoscopic-RAS surgery. With new laparoscopic-RAS systems now commercially available, we should see RAS adopted more widely in surgical interventions and the costs of RAS procedures decrease in the near future.