1. Shofler D, Cooperman S, Shibata E, Duffin E, Shapiro J. Development and Evaluation of a Surgical Direct Assessment Tool for Resident Training. Clin Podiatr Med Surg 2020;37:391-400. PMID: 32146991. DOI: 10.1016/j.cpm.2019.12.014.
Abstract
In podiatric residency training, minimum activity volume numbers are used to assess surgical competency. The purpose of this study was to develop a standardized direct assessment form as a complement to minimum activity volume numbers. Sixteen attending physicians completed 121 direct assessment forms, evaluating six podiatric medicine and surgery residents. Evaluation scores were highly correlated with residency year. Resident feedback was positive, with the open-response portion identified as especially useful. Although further efforts may help refine this approach, the use of standardized, competency-based direct assessment has the potential to improve the training of podiatric medicine and surgery residents.
Affiliation(s)
- David Shofler, Steven Cooperman, Emily Shibata, Eric Duffin, Jarrod Shapiro: Department of Podiatric Medicine, Surgery, and Biomechanics, Western University College of Podiatric Medicine, 309 East 2nd Street, Pomona, CA 91766, USA
2. Thanawala RM, Jesneck JL, Seymour NE. Education Management Platform Enables Delivery and Comparison of Multiple Evaluation Types. J Surg Educ 2019;76:e209-e216. PMID: 31515199. DOI: 10.1016/j.jsurg.2019.08.017.
Abstract
OBJECTIVE The purpose of this study was to determine whether an automated platform for evaluation selection and delivery would increase participation from surgical teaching faculty in submitting resident operative performance evaluations. DESIGN We built a HIPAA-compliant, web-based platform to track resident operative assignments and to link embedded evaluation instruments to procedure type. The platform matched appropriate evaluations to surgeons' scheduled procedures and delivered multiple evaluation types, including Ottawa Surgical Competency Operating Room Evaluation (O-Score) evaluations and Operative Performance Rating System (OPRS) evaluations. Prompts to complete evaluations were made through a system of automatic electronic notifications. We compared the time spent in the platform to achieve evaluation completion. As a metric for the platform's effect on faculty participation, we considered a task that would typically be infeasible without workflow optimization: the evaluator could choose to complete multiple, complementary evaluations for the same resident in the same case. For cases with multiple evaluations, correlation was analyzed by Spearman rank test. Evaluation data were compared between PGY levels using repeated measures ANOVA. SETTING The study took place at 4 general surgery residency programs: the University of Massachusetts Medical School-Baystate, the University of Connecticut School of Medicine, the University of Iowa Carver College of Medicine, and Maimonides Medical Center. PARTICIPANTS From March 2017 to February 2019, the study included 70 surgical teaching faculty and 101 general surgery residents. RESULTS Faculty completed 1230 O-Score evaluations and 106 OPRS evaluations. Evaluations were completed quickly, with a median time of 36 ± 18 seconds for O-Score evaluations and 53 ± 51 seconds for OPRS evaluations.
89% of O-Score and 55% of OPRS evaluations were completed without optional comments within 1 minute, and 99% of O-Score and 82% of OPRS evaluations were completed within 2 minutes. For cases eligible for both evaluation types, attendings completed both evaluations in 74 of 221 (33%) eligible cases. These paired evaluations strongly correlated on resident performance (Spearman coefficient = 0.84, p < 0.00001). Both evaluation types stratified operative skill level by program year (p < 0.00001). CONCLUSIONS Evaluation initiatives can be hampered by the challenge of making multiple surgical evaluation instruments available when needed for appropriate clinical situations, including specific case types. As a test of the optimized evaluation workflow, and to lay the groundwork for future data-driven design of evaluations, we tested the impact of simultaneously delivering 2 evaluation instruments via a secure web-based education platform. We measured the evaluation completion rates of faculty surgeon evaluators when rating resident operative performance, and how effectively the results of evaluation could be analyzed and compared, taking advantage of a highly integrated management of the evaluative information.
Affiliation(s)
- Ruchi M Thanawala, Jonathan L Jesneck, Neal E Seymour: University of Massachusetts Medical School-Baystate, Springfield, Massachusetts; University of Iowa Carver College of Medicine, Iowa City, Iowa
3. Thanawala R, Jesneck J, Seymour NE. Novel Educational Information Management Platform Improves the Surgical Skill Evaluation Process of Surgical Residents. J Surg Educ 2018;75:e204-e211. PMID: 30077701. DOI: 10.1016/j.jsurg.2018.06.004.
Abstract
OBJECTIVE We sought to increase compliance and timeliness of surgery resident operative evaluation by providing faculty and residents with a Platform linking evaluation to analytics and machine-learning-facilitated case logging. DESIGN We built a HIPAA-compliant web-based Platform for comprehensive management of resident education information, including resident operative performance evaluations. To assess evaluation timeliness, we compared the lag time for Platform-based evaluations to that of end-of-rotation evaluations. We also assessed evaluation compliance, based on a time threshold of 5 days for Platform evaluations and 2 weeks for end-of-rotation evaluations. SETTING University of Massachusetts, Baystate Medical Center, General Surgery Residency. PARTICIPANTS Twenty-three attendings and 43 residents for the Platform cohort; 15 services and 45 residents for the end-of-rotation cohort. RESULTS Three hundred fifty-eight Platform evaluations were completed by 23 attendings for 43 residents from March through October 2017. Six hundred ten end-of-rotation evaluations by 15 attendings for 45 residents were used for comparison (September 2015 through June 2017). Of Platform evaluations, 41.3% were completed within 24 hours of the operation (16.5% within 6 hours, 33.3% within 12 hours, and 62.2% within 48 hours), with 24.3% of evaluations completed within 3 hours after e-mail reminders. In the first 6 weeks (March 1 through April 12), 4.5 ± 3.7 evaluations were completed per week, compared to 18.8 ± 5.8 in the last 6 weeks (September 18 through October 31). Evaluation lag times improved with the use of the Platform, with the median lag 35 days earlier (1 ± 1.5 days Platform vs 36 ± 28.2 days traditional, p < 0.0001) and the mean lag 41 days earlier (3.0 ± 4.7 days Platform vs 44.0 ± 32.6 days traditional, p < 0.0001).
CONCLUSIONS Our comprehensive Platform facilitated faculty compliance with evaluation requirements and timeliness of availability of performance information (often in near real time) for both residents and residency leadership. The added value of the Platform's integration of evaluations with resident and attending case logging may account for the rapidly increasing number of operative skill evaluations over the short time span since implementation.
Affiliation(s)
- Ruchi Thanawala, Jonathan Jesneck, Neal E Seymour: Department of Surgery, University of Massachusetts - Baystate Medical Center, Springfield, Massachusetts
4. Black H, Sheppard G, Metcalfe B, Stone-McLean J, McCarthy H, Dubrowski A. Expert Facilitated Development of an Objective Assessment Tool for Point-of-Care Ultrasound Performance in Undergraduate Medical Education. Cureus 2016;8:e636. PMID: 27433415. PMCID: PMC4938628. DOI: 10.7759/cureus.636.
Abstract
BACKGROUND With the various applications of point-of-care ultrasound (PoCUS) steadily increasing, many medical schools across North America are incorporating PoCUS training into their undergraduate curricula. The Faculty of Medicine at Memorial University also intends to introduce PoCUS training into its own undergraduate medical program. The proposed approach is to introduce a PoCUS curriculum focusing on anatomy and physiology while developing cognitive and psychomotor skills that are later transferred into clinical applications. This has been the common approach taken by most undergraduate ultrasound programs in the United States. This project highlights the development of, and the challenges involved in creating, an objective assessment tool that meets the unique needs of this proposed undergraduate ultrasound curriculum. METHODS After a thorough review of existing literature and input from experts in PoCUS, the researchers created a prototype global rating scale (GRS) and three exam-specific checklists: an aorta exam, a subxiphoid cardiac exam, and a focused abdominal exam. A panel of 18 emergency room physicians certified in PoCUS was recruited to evaluate the GRS and three checklists using a modified Delphi technique. The items were rated on a 5-point Likert scale. If an item received a mean score of less than 4, it was deemed unimportant for the assessment of PoCUS performance in undergraduate medical learners and was excluded. Experts were also encouraged to provide comments and suggest further items to be added to the GRS or checklists. Items were modified according to these comments, and all edits were then sent back to the experts for revision. RESULTS A consensus was achieved after three rounds of surveys, with the final GRS containing nine items. The final aorta checklist contained nine items, and the subxiphoid cardiac and focused abdominal checklists each contained 11 items.
CONCLUSION By using a modified Delphi technique, we developed a single GRS and three checklists. A panel of independent PoCUS practitioners supports the content validity of these tools. Research is currently ongoing to evaluate their validity for assessing PoCUS competency in undergraduate medical students.
Affiliation(s)
- Holly Black: Emergency Medicine, Memorial University of Newfoundland
- Brian Metcalfe: Faculty of Medicine, Memorial University of Newfoundland
- Adam Dubrowski: Emergency Medicine, Pediatrics, Memorial University of Newfoundland; Marine Institute, Memorial University of Newfoundland
5. Huang E, Wyles SM, Chern H, Kim E, O'Sullivan P. From novice to master surgeon: improving feedback with a descriptive approach to intraoperative assessment. Am J Surg 2015;212:180-7. PMID: 26611717. DOI: 10.1016/j.amjsurg.2015.04.026.
Abstract
BACKGROUND A developmental and descriptive approach to assessing trainee intraoperative performance was explored. METHODS Semistructured interviews with 20 surgeon educators were recorded, transcribed, deidentified, and analyzed using a grounded theory approach to identify emergent themes. Two researchers independently coded the transcripts. Emergent themes were also compared to existing theories of skill acquisition. RESULTS Surgeon educators characterized intraoperative surgical performance as an integrated practice of multiple skill categories, including anticipating, planning for contingencies, monitoring progress, self-efficacy, and "working knowledge." Comments concerning progression through stages, broadly characterized as "technician," "anatomist," "anticipator," "strategist," and "executive," formed a narrative about each stage of development. CONCLUSIONS The developmental trajectory, with narrative, descriptive profiles of surgeons working toward mastery, provides a standardized vocabulary for communicating feedback while fostering reflection on trainee progress. Viewing surgical performance as an integrated practice, rather than a conglomerate of isolated skills, enhances authentic assessment.
Affiliation(s)
- Emily Huang, Susannah M Wyles, Hueylan Chern, Edward Kim: Department of Surgery, University of California, San Francisco, 513 Parnassus Avenue, S-321, San Francisco, CA 94143-0470, USA
- Patricia O'Sullivan: Department of Medicine, University of California, San Francisco, San Francisco, CA, USA
6. 2014 ACAPS Congress: Abstracts. Plast Reconstr Surg Glob Open 2015;3:e347. PMID: 25878923. PMCID: PMC4387169.
7. Development of an Operative Performance Rating System for Plastic Surgery Residents. Plast Reconstr Surg Glob Open 2015;3:e367. PMID: 30805272. PMCID: PMC6373567. DOI: 10.1097/gox.0000000000000317.
8. Dougherty PJ. CORR curriculum - orthopaedic education: Faculty development begins at home. Clin Orthop Relat Res 2014;472:3637-43. PMID: 25298280. PMCID: PMC4397787. DOI: 10.1007/s11999-014-3986-y.
Affiliation(s)
- Paul J. Dougherty: Detroit Medical Center, 4201 St. Antoine, Suite 4G, Detroit, MI 48201, USA
9. Chen XP, Williams RG, Smink DS. Do residents receive the same OR guidance as surgeons report? Difference between residents' and surgeons' perceptions of OR guidance. J Surg Educ 2014;71:e79-e82. PMID: 24931416. DOI: 10.1016/j.jsurg.2014.04.010.
Abstract
PURPOSE Operating room (OR) guidance is important for surgical residents' performance and, ultimately, for the development of independence and autonomy. This study explores the differences in surgical residents' and attending surgeons' perceptions of OR guidance in prerecorded surgical cases. METHODS A total of 9 attending surgeons and 8 surgical residents observed 8 prerecorded surgical cases and were asked to identify both the presence and the type of attending surgeons' OR guidance. Each recorded case was observed by 2 attending surgeons and 1 resident. A previously developed taxonomy of OR guidance types was applied to analyze the data. Events in which both attending surgeons agreed on the presence and the type of OR guidance served as the concordant guidance behaviors against which the residents' responses were compared. RESULTS Overall, 116 OR guidance events were identified. Attending surgeons agreed on the presence of guidance in 80 of 116 (69.8%) events and consistently identified the type of OR guidance in 91.4% (73/80, Cohen κ = 0.874) of them. However, surgical residents agreed with attending surgeons on the presence of guidance in only 61.25% (49/80) of events. In addition, there was significant disagreement (Cohen κ = 0.319) between surgical residents and attending surgeons on the type of OR guidance; residents placed only 54.8% (40/73) of concordant guidance behaviors in the same guidance category as both surgeons did. Among the types of OR guidance, residents and attending surgeons were most likely to agree on teaching guidance (66.67%) and least likely to agree on assisting guidance (36.84%). CONCLUSIONS Surgical residents and attending surgeons have different perceptions of both the presence and the type of OR guidance. This difference in perception has important implications for the efficiency of training surgical residents in the OR and, ultimately, for residents' development of independence and autonomy.
Affiliation(s)
| | - Reed G Williams
- Department of Surgery, School of Medicine, Indiana University, Indianapolis, Indiana
| | - Douglas S Smink
- Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts
| |