1
Olsen RG, Svendsen MBS, Tolsgaard MG, Konge L, Røder A, Bjerrum F. Surgical gestures can be used to assess surgical competence in robot-assisted surgery: A validity investigating study of simulated RARP. J Robot Surg 2024; 18:47. PMID: 38244130; PMCID: PMC10799775; DOI: 10.1007/s11701-023-01807-4.
Abstract
The aim was to collect validity evidence for the assessment of surgical competence through the classification of general surgical gestures during a simulated robot-assisted radical prostatectomy (RARP). We used 165 video recordings of novice and experienced RARP surgeons performing three parts of the RARP procedure on the RobotiX Mentor. We annotated the surgical tasks with different surgical gestures: dissection, hemostatic control, application of clips, needle handling, and suturing. The gestures were analyzed in terms of idle time (periods with minimal instrument movement) and active time (whenever a surgical gesture was annotated). The distribution of surgical gestures was visualized as a one-dimensional heat map, a "snail track". All surgeons had a similar percentage of idle time, but novices had longer idle phases (mean time: 21 vs. 15 s, p < 0.001). Novices used a higher total number of surgical gestures (number of phases: 45 vs. 35, p < 0.001), and each of their phases was longer than those of the experienced surgeons (mean time: 10 vs. 8 s, p < 0.001). Novices and experienced surgeons also showed different gesture patterns, reflected in a different distribution of phases. General surgical gestures can be used to assess surgical competence in simulated RARP and can be displayed as a visual tool to show how performance is improving. The established pass/fail level may be used to ensure the competence of residents before they proceed to supervised real-life surgery. The next step is to investigate whether the developed tool can optimize automated feedback during simulator training.
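The idle/active decomposition described in this abstract is straightforward to reproduce from interval annotations. The following is a minimal Python sketch, not the authors' code: the `Annotation` structure, gesture labels, and example timings are illustrative assumptions. Active phases are the annotated gesture intervals; idle phases are the gaps between them.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    gesture: str   # e.g., "dissection", "suturing" (labels assumed here)
    start: float   # seconds from task start
    end: float     # seconds from task start

def active_and_idle_phases(annotations, task_end):
    """Return (active, idle) phase lists. Idle phases are the gaps
    between annotated gestures, plus any gap before/after them."""
    active = sorted((a.start, a.end) for a in annotations)
    idle, cursor = [], 0.0
    for start, end in active:
        if start > cursor:
            idle.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < task_end:
        idle.append((cursor, task_end))
    return active, idle

def mean_phase_length(phases):
    return sum(end - start for start, end in phases) / len(phases) if phases else 0.0

# Illustrative example: two gestures separated by a 20 s idle gap.
video = [Annotation("dissection", 0.0, 12.0), Annotation("suturing", 32.0, 45.0)]
active, idle = active_and_idle_phases(video, task_end=60.0)
print(mean_phase_length(active), mean_phase_length(idle))  # 12.5 17.5
```

Sorting the active phases by start time and then sweeping a cursor over them is what makes a one-dimensional "snail track" rendering trivial: the phase list is already the ordered sequence of colored segments.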
Affiliation(s)
- Rikke Groth Olsen
- Copenhagen Academy for Medical Education and Simulation (CAMES), Center for HR & Education, The Capital Region of Denmark, Ryesgade 53B, 2100, Copenhagen, Denmark.
- Department of Urology, Copenhagen Prostate Cancer Center, Copenhagen University Hospital - Rigshospitalet, Copenhagen, Denmark.
- Faculty of Health and Medical Sciences, University of Copenhagen, Copenhagen, Denmark.
- Morten Bo Søndergaard Svendsen
- Copenhagen Academy for Medical Education and Simulation (CAMES), Center for HR & Education, The Capital Region of Denmark, Ryesgade 53B, 2100, Copenhagen, Denmark
- Department of Computer Science, University of Copenhagen, Copenhagen, Denmark
- Martin G Tolsgaard
- Copenhagen Academy for Medical Education and Simulation (CAMES), Center for HR & Education, The Capital Region of Denmark, Ryesgade 53B, 2100, Copenhagen, Denmark
- Lars Konge
- Copenhagen Academy for Medical Education and Simulation (CAMES), Center for HR & Education, The Capital Region of Denmark, Ryesgade 53B, 2100, Copenhagen, Denmark
- Faculty of Health and Medical Sciences, University of Copenhagen, Copenhagen, Denmark
- Andreas Røder
- Department of Urology, Copenhagen Prostate Cancer Center, Copenhagen University Hospital - Rigshospitalet, Copenhagen, Denmark
- Faculty of Health and Medical Sciences, University of Copenhagen, Copenhagen, Denmark
- Flemming Bjerrum
- Copenhagen Academy for Medical Education and Simulation (CAMES), Center for HR & Education, The Capital Region of Denmark, Ryesgade 53B, 2100, Copenhagen, Denmark
- Department of Gastrointestinal and Hepatic Diseases, Copenhagen University Hospital - Herlev and Gentofte, Herlev, Denmark
2
Hutchinson K, Reyes I, Li Z, Alemzadeh H. COMPASS: a formal framework and aggregate dataset for generalized surgical procedure modeling. Int J Comput Assist Radiol Surg 2023; 18:2143-2154. PMID: 37145250; DOI: 10.1007/s11548-023-02922-1.
Abstract
PURPOSE: We propose a formal framework for the modeling and segmentation of minimally invasive surgical tasks using a unified set of motion primitives (MPs) to enable more objective labeling and the aggregation of different datasets.
METHODS: We model dry-lab surgical tasks as finite state machines, representing how the execution of MPs as the basic surgical actions changes the surgical context, which characterizes the physical interactions among tools and objects in the surgical environment. We develop methods for labeling surgical context based on video data and for automatically translating context to MP labels. We then use our framework to create the COntext and Motion Primitive Aggregate Surgical Set (COMPASS), which includes six dry-lab surgical tasks from three publicly available datasets (JIGSAWS, DESK, and ROSMA) with kinematic and video data and context and MP labels.
RESULTS: Our context labeling method achieves near-perfect agreement between consensus labels from crowd-sourcing and expert surgeons. Segmenting tasks into MPs yields the COMPASS dataset, which nearly triples the amount of data available for modeling and analysis and enables the generation of separate transcripts for the left and right tools.
CONCLUSION: The proposed framework yields high-quality labeling of surgical data based on context and fine-grained MPs. Modeling surgical tasks with MPs enables the aggregation of different datasets and the separate analysis of the left and right hands for bimanual coordination assessment. Our formal framework and aggregate dataset can support the development of explainable and multi-granularity models for improved surgical process analysis, skill assessment, error detection, and autonomy.
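The context-to-MP translation can be pictured as a rule that compares consecutive context frames and emits a primitive per tool, which is also what makes separate left/right transcripts possible. The sketch below is a toy illustration under an assumed context encoding (which object each tool holds or contacts); the actual COMPASS context states and translation grammar are defined in the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Context:
    """Assumed encoding: what each tool is holding / in contact with."""
    left_hold: Optional[str]
    right_hold: Optional[str]
    left_contact: Optional[str]
    right_contact: Optional[str]

def context_to_mps(prev: Context, curr: Context) -> list:
    """Emit motion-primitive labels per tool from a context change
    (toy rules: Grasp/Release from hold changes, Touch/Untouch from contact)."""
    mps = []
    for side in ("left", "right"):
        prev_hold = getattr(prev, f"{side}_hold")
        curr_hold = getattr(curr, f"{side}_hold")
        if prev_hold is None and curr_hold is not None:
            mps.append(f"{side}: Grasp({curr_hold})")
        elif prev_hold is not None and curr_hold is None:
            mps.append(f"{side}: Release({prev_hold})")
        prev_contact = getattr(prev, f"{side}_contact")
        curr_contact = getattr(curr, f"{side}_contact")
        if prev_contact is None and curr_contact is not None:
            mps.append(f"{side}: Touch({curr_contact})")
        elif prev_contact is not None and curr_contact is None:
            mps.append(f"{side}: Untouch({prev_contact})")
    return mps

# The right tool grasps a needle while the left keeps holding tissue.
before = Context(left_hold="tissue", right_hold=None,
                 left_contact=None, right_contact="needle")
after = Context(left_hold="tissue", right_hold="needle",
                left_contact=None, right_contact="needle")
print(context_to_mps(before, after))  # ['right: Grasp(needle)']
```

Because each emitted label is keyed by tool side, filtering the output by side yields the two independent transcripts used for bimanual coordination analysis.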
Affiliation(s)
- Kay Hutchinson
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA, 22903, USA.
- Ian Reyes
- Department of Computer Science, University of Virginia, Charlottesville, VA, 22903, USA
- IBM, RTP, Durham, NC, 27709, USA
- Zongyu Li
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA, 22903, USA
- Homa Alemzadeh
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA, 22903, USA
- Department of Computer Science, University of Virginia, Charlottesville, VA, 22903, USA
3
Ma R, Ramaswamy A, Xu J, Trinh L, Kiyasseh D, Chu TN, Wong EY, Lee RS, Rodriguez I, DeMeo G, Desai A, Otiato MX, Roberts SI, Nguyen JH, Laca J, Liu Y, Urbanova K, Wagner C, Anandkumar A, Hu JC, Hung AJ. Surgical gestures as a method to quantify surgical performance and predict patient outcomes. NPJ Digit Med 2022; 5:187. PMID: 36550203; PMCID: PMC9780308; DOI: 10.1038/s41746-022-00738-y.
Abstract
How well a surgery is performed impacts a patient's outcomes; however, objective quantification of performance remains an unsolved challenge. Deconstructing a procedure into discrete instrument-tissue "gestures" is an emerging way to understand surgery. To establish this paradigm in a procedure where performance is the most important factor for patient outcomes, we identify 34,323 individual gestures performed in 80 nerve-sparing robot-assisted radical prostatectomies from two international medical centers. Gestures are classified into nine distinct dissection gestures (e.g., hot cut) and four supporting gestures (e.g., retraction). Our primary aim is to identify factors impacting a patient's 1-year erectile function (EF) recovery after radical prostatectomy. We find that less use of hot cut and more use of peel/push are statistically associated with a better chance of 1-year EF recovery. Our results also show interactions between surgeon experience and gesture types: similar gesture selection resulted in different EF recovery rates depending on surgeon experience. To further validate this framework, two teams independently constructed distinct machine learning models using gesture sequences vs. traditional clinical features to predict 1-year EF. In both cases, gesture sequences predicted 1-year EF better (Team 1: AUC 0.77, 95% CI 0.73-0.81; Team 2: AUC 0.68, 95% CI 0.66-0.70) than traditional clinical features (Team 1: AUC 0.69, 95% CI 0.65-0.73; Team 2: AUC 0.65, 95% CI 0.62-0.68). Our results suggest that gestures provide a granular method to objectively assess surgical performance and its relation to outcomes. Applying this methodology to other surgeries may reveal ways to improve them.
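To make the model comparison concrete, a gesture sequence can be reduced to features and scored with AUC against a binary outcome. The sketch below is a toy stand-in, not either team's model: the gesture vocabulary, simulated data, and bag-of-gestures features are assumptions (the paper's teams used sequence models), and the synthetic outcome merely mimics the reported hot cut / peel-push directions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
GESTURES = ["hot_cut", "cold_cut", "peel_push", "retraction", "clip"]

def gesture_counts(seq):
    """Bag-of-gestures features: count of each gesture in a case."""
    return np.array([seq.count(g) for g in GESTURES], dtype=float)

# Simulate 80 cases; the synthetic EF-recovery probability falls with
# hot_cut use and rises with peel_push use, echoing the paper's finding.
X, y = [], []
for _ in range(80):
    seq = list(rng.choice(GESTURES, size=rng.integers(300, 500)))
    counts = gesture_counts(seq)
    p_recover = 1.0 / (1.0 + np.exp(0.02 * counts[0] - 0.02 * counts[2]))
    X.append(counts)
    y.append(rng.random() < p_recover)
X, y = np.array(X), np.array(y, dtype=int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("gesture-feature AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```

The same scoring pipeline, run once on gesture-derived features and once on clinical features, gives the paired AUCs that the abstract compares.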
Affiliation(s)
- Runzhuo Ma
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Ashwin Ramaswamy
- Department of Urology, Weill Cornell Medicine, New York, NY, USA
- Jiashu Xu
- Computer Science Department, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, USA
- Loc Trinh
- Computer Science Department, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, USA
- Dani Kiyasseh
- Department of Computing & Mathematical Sciences, California Institute of Technology, Pasadena, CA, USA
- Timothy N. Chu
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Elyssa Y. Wong
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Ryan S. Lee
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Ivan Rodriguez
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Gina DeMeo
- Department of Urology, Weill Cornell Medicine, New York, NY, USA
- Aditya Desai
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Maxwell X. Otiato
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Sidney I. Roberts
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Jessica H. Nguyen
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Jasper Laca
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
- Yan Liu
- Computer Science Department, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, USA
- Katarina Urbanova
- Department of Urology and Urologic Oncology, St. Antonius-Hospital, Gronau, Germany
- Christian Wagner
- Department of Urology and Urologic Oncology, St. Antonius-Hospital, Gronau, Germany
- Animashree Anandkumar
- Department of Computing & Mathematical Sciences, California Institute of Technology, Pasadena, CA, USA
- Jim C. Hu
- Department of Urology, Weill Cornell Medicine, New York, NY, USA
- Andrew J. Hung
- Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA