1. Montgomery KB, Mellinger JD, McLeod MC, Jones A, Zmijewski P, Sarosi GA, Brasel KJ, Klingensmith ME, Minter RM, Buyske J, Lindeman B. Decision-Making Confidence of Clinical Competency Committees for Entrustable Professional Activities. JAMA Surg 2024; 159:801-808. PMID: 38717759; PMCID: PMC11079788; DOI: 10.1001/jamasurg.2024.0809.
Abstract
Importance: A competency-based assessment framework using entrustable professional activities (EPAs) was endorsed by the American Board of Surgery following a 2-year feasibility pilot study. Pilot study programs' clinical competency committees (CCCs) rated residents on EPA entrustment semiannually using this newly developed assessment tool, but factors associated with their decision-making are not yet known.
Objective: To identify factors associated with variation in decision-making confidence of CCCs in EPA summative entrustment decisions.
Design, Setting, and Participants: This cohort study used deidentified data from the EPA Pilot Study, with participating sites at 28 general surgery residency programs, prospectively collected from July 1, 2018, to June 30, 2020. Data were analyzed from September 27, 2022, to February 15, 2023.
Exposure: Microassessments of resident entrustment for pilot EPAs (gallbladder disease, inguinal hernia, right lower quadrant pain, trauma, and consultation) collected in the course of routine clinical care across four 6-month study cycles. Summative entrustment ratings were then determined by program CCCs for each study cycle.
Main Outcomes and Measures: The primary outcome was the CCC decision-making confidence rating (high, moderate, slight, or no confidence) for summative entrustment decisions; the secondary outcome was the number of EPA microassessments received per summative entrustment decision. Bivariate tests and mixed-effects regression modeling were used to evaluate factors associated with CCC confidence.
Results: Among 565 residents receiving at least 1 EPA microassessment, 1765 summative entrustment decisions were reported. Overall, 72.5% (1279 of 1765) of summative entrustment decisions were made with moderate or high confidence. Confidence ratings increased with the mean number of EPA microassessments: 1.7 (95% CI, 1.4-2.0) at no confidence, 1.9 (95% CI, 1.7-2.1) at slight confidence, 2.9 (95% CI, 2.6-3.2) at moderate confidence, and 4.1 (95% CI, 3.8-4.4) at high confidence. An increasing number of EPA microassessments was associated with an increased likelihood of higher CCC confidence for all but 1 EPA phase after controlling for program effects (odds ratio range: 1.21 [95% CI, 1.07-1.37] for intraoperative EPA-4 to 2.93 [95% CI, 1.64-5.85] for postoperative EPA-2); for preoperative EPA-3, there was no association.
Conclusions and Relevance: In this cohort study, CCC confidence in EPA summative entrustment decisions increased as the number of EPA microassessments increased, and CCCs endorsed moderate to high confidence in most entrustment decisions. These findings provide early validity evidence for this novel assessment framework and may inform program practices as EPAs are implemented nationally.
Affiliation(s)
John D. Mellinger
- American Board of Surgery, Philadelphia, Pennsylvania
- Department of Surgery, Southern Illinois University, Springfield
Andrew Jones
- American Board of Surgery, Philadelphia, Pennsylvania
Karen J. Brasel
- Department of Surgery, Oregon Health & Science University, Portland
Mary E. Klingensmith
- American Board of Surgery, Philadelphia, Pennsylvania
- Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Department of Surgery, Washington University in St Louis, St Louis, Missouri
Jo Buyske
- American Board of Surgery, Philadelphia, Pennsylvania
- Department of Surgery, University of Pennsylvania, Philadelphia
2. Montgomery KB, Mellinger JD, Lindeman B. Entrustable Professional Activities in Surgery: A Review. JAMA Surg 2024; 159:571-577. PMID: 38477902; PMCID: PMC11260519; DOI: 10.1001/jamasurg.2023.8107.
Abstract
Importance: Entrustable professional activities (EPAs) compose a competency-based education (CBE) assessment framework that has been increasingly adopted across medical specialties as a workplace-based assessment tool. EPAs focus on directly observed behaviors to determine the level of entrustment a trainee has for a given activity of that specialty. In this narrative review, we highlight the rationale for EPAs in general surgery, describe current evidence supporting their use, and outline practical considerations for EPAs among residency programs, faculty, and trainees.
Observations: An expanding evidence base for EPAs in general surgery has provided moderate validity evidence for their use as well as practical recommendations for implementation across residency programs. Challenges to EPA use include garnering buy-in from individual faculty and residents to complete EPA microassessments and engage in timely, specific feedback after a case or clinical encounter. When successfully integrated into a program's workflow, EPAs can provide a more accurate picture of residents' competence for a fundamental surgical task or activity than other assessment methods.
Conclusions and Relevance: EPAs represent the next significant shift in the evaluation of general surgery residents as part of the overarching progression toward CBE among all US residency programs. While pragmatic challenges to the implementation of EPAs remain, the best practices from EPA and other CBE assessment literature summarized in this review may assist individuals and programs in implementing EPAs. As EPAs become more widely used in general surgery resident training, further analysis of barriers and facilitators to successful and sustainable EPA implementation will be needed to continue to optimize and advance this new assessment framework.
3. Goldenberg MG. Surgical Artificial Intelligence in Urology: Educational Applications. Urol Clin North Am 2024; 51:105-115. PMID: 37945096; DOI: 10.1016/j.ucl.2023.06.003.
Abstract
Surgical education has seen immense change recently. Increased demand for iterative evaluation of trainees from medical school to independent practice has led to the generation of an overwhelming amount of data related to an individual's competency. Artificial intelligence has been proposed as a solution to automate and standardize the ability of stakeholders to assess the technical and nontechnical abilities of a surgical trainee. In both the simulation and clinical environments, evidence supports the use of machine learning algorithms to both evaluate trainee skill and provide real-time and automated feedback, enabling a shortened learning curve for many key procedural skills and ensuring patient safety.
Affiliation(s)
Mitchell G Goldenberg
- Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, 1441 Eastlake Avenue, Suite 7416, Los Angeles, CA 90033, USA.
4. Jogerst KM, Park YS, Anteby R, Sinyard R, Coe TM, Cassidy D, McKinley SK, Petrusa E, Phitayakorn R, Mohapatra A, Gee DW. Impact of Rater Training on Residents' Technical Skill Assessments: A Randomized Trial. J Surg Educ 2022; 79:e225-e234. PMID: 36333174; DOI: 10.1016/j.jsurg.2022.09.013.
Abstract
OBJECTIVE: The ACS/APDS Resident Skills Curriculum's Objective Structured Assessment of Technical Skills (OSATS) consists of task-specific checklists and a global rating scale (GRS) completed by raters. Prior work demonstrated a need for rater training. This study evaluates the impact of a rater-training curriculum on scoring discrimination, consistency, and validity for handsewn bowel anastomosis (HBA) and vascular anastomosis (VA).
DESIGN/METHODS: A rater-training video model was developed, which included a GRS orientation and anchoring performances representing the range of potential scores. Faculty raters were randomized to rater training or no rater training and were asked to score videos of resident HBA/VA. Consensus scores were assigned to each video using a modified Delphi process (Gold Score). Trained and untrained scores were analyzed for discrimination and score spread and compared to the Gold Score for relative agreement.
RESULTS: Eight general surgery and eight vascular surgery faculty were randomized to score 24 HBA/VA videos. Rater training increased rater discrimination and decreased rating-scale shrinkage for both VA (mean trained score: 2.83, variance 1.88; mean untrained score: 3.1, variance 1.14; p = 0.007) and HBA (mean trained score: 3.52, variance 1.44; mean untrained score: 3.42, variance 0.96; p = 0.033). On validity analyses, comparison of each rater group vs the Gold Score revealed a moderate training impact for VA (trained κ = 0.65 vs untrained κ = 0.57) and no impact for HBA (R1 κ = 0.71 vs R2 κ = 0.73).
CONCLUSION: A rater-training curriculum improved raters' ability to differentiate performance levels and use a wider range of the scoring scale. However, despite rater training, there was persistent disagreement among faculty GRS scores, with no group reaching the agreement threshold for formative assessment. If technical skill exams are incorporated into high-stakes assessments, consensus ratings via a standard-setting process are likely a more valid option than individual faculty ratings.
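As an illustration of the agreement statistic reported above, here is a minimal sketch of Cohen's kappa, the chance-corrected agreement between a rater group and a consensus Gold Score. The `cohens_kappa` helper and the ratings below are invented for illustration; this is not the study's code or data.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(a) == len(b)
    n = len(a)
    # Observed proportion of exact agreement.
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical 5-point GRS ratings on 8 videos: consensus vs one rater group.
gold = [1, 2, 3, 3, 4, 5, 2, 4]
trained = [1, 2, 3, 4, 4, 5, 2, 3]
print(round(cohens_kappa(gold, trained), 2))
```

Kappa discounts the agreement two raters would reach by guessing from their own rating distributions, which is why it is preferred over raw percent agreement for ordinal scales like the GRS.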
Affiliation(s)
Kristen M Jogerst
- Department of General Surgery, Mayo Clinic Arizona, Phoenix, Arizona; Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Yoon Soo Park
- Department of Emergency Medicine, Massachusetts General Hospital, Boston, Massachusetts
Roi Anteby
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Robert Sinyard
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Taylor M Coe
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Douglas Cassidy
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Sophia K McKinley
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Emil Petrusa
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Roy Phitayakorn
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Abhisekh Mohapatra
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
Denise W Gee
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
5. Andersen SAW, Nayahangan LJ, Park YS, Konge L. Use of Generalizability Theory for Exploring Reliability of and Sources of Variance in Assessment of Technical Skills: A Systematic Review and Meta-Analysis. Acad Med 2021; 96:1609-1619. PMID: 33951677; DOI: 10.1097/acm.0000000000004150.
Abstract
PURPOSE: Competency-based education relies on the validity and reliability of assessment scores. Generalizability (G) theory is well suited to explore the reliability of assessment tools in medical education but has only been applied to a limited extent. This study aimed to systematically review the literature using G-theory to explore the reliability of structured assessment of medical and surgical technical skills and to assess the relative contributions of different factors to variance.
METHOD: In June 2020, 11 databases, including PubMed, were searched from inception through May 31, 2020. Eligible studies included the use of G-theory to explore reliability in the context of assessment of medical and surgical technical skills. Descriptive information on the study, assessment context, assessment protocol, participants being assessed, and G-analyses was extracted. Data were used to map uses of G-theory and explore variance components analyses. A meta-analysis was conducted to synthesize the extracted data on the sources of variance and reliability.
RESULTS: Forty-four studies were included; of these, 39 had sufficient data for meta-analysis. The total pool included 35,284 unique assessments of 31,496 unique performances by 4,154 participants. Person variance had a pooled effect of 44.2% (95% confidence interval [CI], 36.8%-51.5%). Only assessment tool type (Objective Structured Assessment of Technical Skills-type vs task-based checklist-type) had a significant effect on person variance. The pooled reliability (G-coefficient) was 0.65 (95% CI, 0.59-0.70). Most studies included decision studies (39, 88.6%) and generally seemed to require higher ratios of performances to assessors to achieve a sufficiently reliable assessment.
CONCLUSIONS: G-theory is increasingly being used to examine the reliability of technical skills assessment in medical education, but more rigor in reporting is warranted. Contextual factors can potentially affect variance components and thereby reliability estimates and should be considered, especially in high-stakes assessment. Reliability analysis should be best practice when developing assessments of technical skills.
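As an illustration of the reliability quantity pooled above, here is a brief sketch of a decision (D) study under a simple persons-by-raters design. The `g_coefficient` helper and the variance components are invented for illustration (the person variance of 0.44 merely echoes the pooled 44.2% estimate); this is not the review's code or data.

```python
def g_coefficient(var_person, var_residual, n_raters):
    """Relative G-coefficient for a persons-by-raters design:
    Erho^2 = sigma2_person / (sigma2_person + sigma2_residual / n_raters).
    Averaging over more raters shrinks the error term, raising reliability."""
    return var_person / (var_person + var_residual / n_raters)

# D study: how reliability changes as raters (or observed performances) are added.
for n in (1, 2, 4):
    print(n, round(g_coefficient(0.44, 0.56, n), 2))
```

This is the logic behind the decision studies noted in the abstract: given estimated variance components, a program can compute how many rated performances are needed before the G-coefficient clears a target such as 0.70 or 0.80.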
Affiliation(s)
Steven Arild Wuyts Andersen
- S.A.W. Andersen is postdoctoral researcher, Copenhagen Academy for Medical Education and Simulation (CAMES), Center for Human Resources and Education, Capital Region of Denmark, and Department of Otolaryngology, The Ohio State University, Columbus, Ohio, and resident in otorhinolaryngology, Department of Otorhinolaryngology-Head & Neck Surgery, Rigshospitalet, Copenhagen, Denmark; ORCID: https://orcid.org/0000-0002-3491-9790
Leizl Joy Nayahangan
- L.J. Nayahangan is researcher, CAMES, Center for Human Resources and Education, Capital Region of Denmark, Copenhagen, Denmark; ORCID: https://orcid.org/0000-0002-6179-1622
Yoon Soo Park
- Y.S. Park is director of health professions education research, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts; ORCID: https://orcid.org/0000-0001-8583-4335
Lars Konge
- L. Konge is professor of medical education, University of Copenhagen, and head of research, CAMES, Center for Human Resources and Education, Capital Region of Denmark, Copenhagen, Denmark; ORCID: https://orcid.org/0000-0002-1258-5822
6.
Abstract
There are myriad types of problem learners in surgical residency, and most have difficulty in more than 1 competency. Programs that use a standard curriculum of study and assessment are most successful in identifying struggling learners early. Many problem learners lack appropriate systems for study. A multidisciplinary educational team that is separate from the team that evaluates the success of remediation is critical. Struggling residents who require formal remediation benefit from performance improvement plans that clearly outline the issues of concern, describe the steps required for remediation, define success of remediation, and outline consequences for failure to remediate appropriately.
Affiliation(s)
Lilah F Morris-Wiseman
- University of Arizona, Department of Surgery, Division of Surgical Oncology, 1501 N. Campbell Avenue, PO Box 245058, Tucson, AZ 85724-5058, USA
Valentine N Nfonsam
- University of Arizona, Department of Surgery, Division of Surgical Oncology, 1501 N. Campbell Avenue, PO Box 245058, Tucson, AZ 85724-5058, USA.
7. Steinemann S, Korndorffer J, Dent D, Rucinski J, Newman RW, Blair P, Lupi LK, Sachdeva AK. Defining the need for faculty development in assessment. Am J Surg 2021; 222:679-684. PMID: 34226039; DOI: 10.1016/j.amjsurg.2021.06.010.
Abstract
BACKGROUND: High-quality workplace-based assessments are essential for competency-based surgical education. We explored education leaders' perceptions regarding faculty competence in assessment.
METHODS: Surgical education leaders were surveyed regarding the areas in which faculty needed improvement and their knowledge of assessment tools. Respondents were queried on specific skills regarding (a) importance in resident/medical student education and (b) competence of faculty in assessment and feedback.
RESULTS: Surveys (n = 636) were emailed and 103 responded. Most indicated faculty needed improvement in: verbal (86%) and written (83%) feedback, assessing operative skill (49%), and assessing preparation for procedures (50%). Cholecystectomy, trauma laparotomy, and inguinal herniorrhaphy were rated "very-extremely important" in resident education (99%), but 21-24% thought faculty were "moderately to not-at-all" competent in their assessment. This gap was larger for non-technical skills. Regarding assessment tools, 56% used OSATS and 49% used Zwisch; most were unfamiliar with all non-technical tools.
SUMMARY: These data demonstrate a significant perceived gap in faculty competence in assessment and feedback, as well as unfamiliarity with assessment tools. This can inform faculty development to support competency-based surgical education.
Affiliation(s)
Susan Steinemann
- Department of Surgery, University of Hawaii John A. Burns School of Medicine, 651 Ilalo Street, MEB223H, Honolulu, HI, 96813, USA.
James Korndorffer
- Department of Surgery, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA, 94305, USA.
Daniel Dent
- Department of Surgery, University of Texas Health Science Center at San Antonio, 4502 Medical, San Antonio, TX, 78229, USA.
James Rucinski
- Department of Surgery, New York-Presbyterian Brooklyn Methodist Hospital, 506 6th Street, Brooklyn, NY, 11215, USA.
Rachel Williams Newman
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA
Patrice Blair
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA
Linda K Lupi
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA
Ajit K Sachdeva
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA
8. Nicolas JD, Huang R, Teitelbaum EN, Bilimoria KY, Hu YY. Constructing Learning Curves to Benchmark Operative Performance of General Surgery Residents Against a National Cohort of Peers. J Surg Educ 2020; 77:e94-e102. PMID: 33109492; DOI: 10.1016/j.jsurg.2020.10.001.
Abstract
OBJECTIVE: No method or data exist to allow surgical trainees or their programs to contextualize their technical progress. The objective of this study was to create peer benchmarks for cumulative sum (CUSUM) charts based upon operative evaluations from a national cohort of general surgery residents.
DESIGN, SETTING, PARTICIPANTS: In 2016-2018, faculty from 26 general surgery residency programs nationwide rated 328 residents' operative performance on a case-by-case basis using a validated 5-point Likert scale. An individual case was considered a "misstep" if it scored below the national median score for that procedure in that postgraduate year (PGY). We constructed 2-sided observed-expected CUSUM charts to capture each resident's cumulative performance over time relative to the national medians. Upper (failure) and lower (positive outlier) benchmarks were established based on the PGY-specific 75th percentile and median misstep rates; consistent/repeated missteps are reflected by crossing of the upper boundary. Procedures with ≤10 observations and residents who were evaluated <10 times in a given PGY were excluded.
RESULTS: In total, 8,161 evaluations on 76 procedure types were analyzed. The individual misstep rate was lowest among PGY-3s at 13.3% and highest among PGY-4s at 28.6%. No interns had curves that crossed the failure boundary; 8.7% of PGY-2s and 8.9% of PGY-3s finished the year past the failure boundary. PGY-2s had the most positive outliers, with 28.3% demonstrating outlying success beyond the lower boundary at least once. PGY-5s most frequently failed, with 16.7% ever crossing the upper boundary and 11.1% remaining above it at graduation.
CONCLUSIONS: CUSUM is a valid statistical approach for benchmarking individual residents' operative performance against national peers in real time as they progress through the year. With further validation, CUSUM could be used to set progression and/or graduation standards and to objectively identify residents who might benefit from remediation.
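As an illustration of the charting approach described above, here is a minimal observed-expected CUSUM sketch. The `cusum_observed_expected` helper, the outcomes, and the expected rate are invented for illustration; this is not the study's code or its benchmark values.

```python
def cusum_observed_expected(outcomes, expected_rate):
    """Cumulative sum of (observed misstep - expected misstep rate).

    outcomes: iterable of 0/1 flags (1 = misstep, i.e. a case scored below
              the national median for that procedure and PGY level).
    expected_rate: the benchmark misstep rate (e.g. the PGY-specific median).
    Returns the running O-E curve; a sustained upward drift signals
    repeated missteps, a downward drift signals outlying success.
    """
    curve, total = [], 0.0
    for o in outcomes:
        total += o - expected_rate
        curve.append(total)
    return curve

# Hypothetical resident with missteps on cases 2 and 5 of 6,
# against an expected misstep rate of 0.25.
curve = cusum_observed_expected([0, 1, 0, 0, 1, 0], 0.25)
print([round(c, 2) for c in curve])
```

In the benchmarking scheme the abstract describes, this O-E curve would then be compared against upper and lower control boundaries derived from the PGY-specific 75th-percentile and median misstep rates.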
Affiliation(s)
Joseph D Nicolas
- Surgical Outcomes and Quality Improvement Center (SOQIC), Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
Reiping Huang
- Surgical Outcomes and Quality Improvement Center (SOQIC), Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
Ezra N Teitelbaum
- Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
Karl Y Bilimoria
- Surgical Outcomes and Quality Improvement Center (SOQIC), Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
Yue-Yung Hu
- Surgical Outcomes and Quality Improvement Center (SOQIC), Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; Division of Pediatric Surgery, Ann and Robert H. Lurie Children's Hospital, Chicago, Illinois.
9. Perkins SQ, Dabaja A, Atiemo H. Best Approaches to Evaluation and Feedback in Post-Graduate Medical Education. Curr Urol Rep 2020; 21:36. PMID: 32789759; DOI: 10.1007/s11934-020-00991-2.
Abstract
PURPOSE OF REVIEW: The objectives of this literature review are to appraise current approaches and assess new technologies that have been utilized for evaluation and feedback of residents, with a focus on surgical trainees.
RECENT FINDINGS: In 1999, the Accreditation Council for Graduate Medical Education (ACGME) introduced the Milestone system as a tool for summative evaluation. The organization allows individual programs autonomy in how evaluation and feedback are performed. In the past, questionnaire evaluations and informal verbal feedback were employed. With the advent of technology, however, these have taken a different shape in the form of crowdsourcing, mobile platforms, and simulation. Limited data are available on new methods, but studies show promise, citing low cost and positive impact on resident education. No one "best approach" exists for evaluation and feedback; however, it is apparent that a multimodal approach based on the ACGME Milestones can be effective and aid in guiding programs.
Affiliation(s)
Sara Q Perkins
- Henry Ford Health System, 2799 W Grand Blvd, K9, Detroit, MI, 48202, USA
Ali Dabaja
- Henry Ford Health System, 2799 W Grand Blvd, K9, Detroit, MI, 48202, USA
Humphrey Atiemo
- Henry Ford Health System, 2799 W Grand Blvd, K9, Detroit, MI, 48202, USA.
10. Oh DD, Bains H, Agostinho N, Young CJ, Storey D, Hong JS. Utility of digitally supported surgical competency assessments in a work-based setting: a systematic review of the literature. ANZ J Surg 2019; 90:970-977. DOI: 10.1111/ans.15472.
Affiliation(s)
Daniel D. Oh
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Central Clinical School, The University of Sydney, Sydney, New South Wales, Australia
Harinder Bains
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
Nelson Agostinho
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
Christopher J. Young
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Central Clinical School, The University of Sydney, Sydney, New South Wales, Australia
David Storey
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Central Clinical School, The University of Sydney, Sydney, New South Wales, Australia
Jonathan S. Hong
- Division of Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, New South Wales, Australia
- Central Clinical School, The University of Sydney, Sydney, New South Wales, Australia
11. Thanawala RM, Jesneck JL, Seymour NE. Education Management Platform Enables Delivery and Comparison of Multiple Evaluation Types. J Surg Educ 2019; 76:e209-e216. PMID: 31515199; DOI: 10.1016/j.jsurg.2019.08.017.
Abstract
OBJECTIVE: The purpose of this study was to determine whether an automated platform for evaluation selection and delivery would increase participation from surgical teaching faculty in submitting resident operative performance evaluations.
DESIGN: We built a HIPAA-compliant, web-based platform to track resident operative assignments and to link embedded evaluation instruments to procedure type. The platform matched appropriate evaluations to surgeons' scheduled procedures and delivered multiple evaluation types, including Ottawa Surgical Competency Operating Room Evaluation (O-Score) evaluations and Operative Performance Rating System (OPRS) evaluations. Prompts to complete evaluations were made through a system of automatic electronic notifications. We compared the time spent in the platform to achieve evaluation completion. As a metric for the platform's effect on faculty participation, we considered a task that would typically be infeasible without workflow optimization: the evaluator could choose to complete multiple, complementary evaluations for the same resident in the same case. For cases with multiple evaluations, correlation was analyzed by Spearman rank test. Evaluation data were compared between PGY levels using repeated-measures ANOVA.
SETTING: The study took place at 4 general surgery residency programs: the University of Massachusetts Medical School-Baystate, the University of Connecticut School of Medicine, the University of Iowa Carver College of Medicine, and Maimonides Medical Center.
PARTICIPANTS: From March 2017 to February 2019, the study included 70 surgical teaching faculty and 101 general surgery residents.
RESULTS: Faculty completed 1230 O-Score evaluations and 106 OPRS evaluations. Evaluations were completed quickly, with a median time of 36 ± 18 seconds for O-Score evaluations and 53 ± 51 seconds for OPRS evaluations. Within one minute, 89% of O-Score and 55% of OPRS evaluations were completed without optional comments; 99% of O-Score and 82% of OPRS evaluations were completed within 2 minutes. For cases eligible for both evaluation types, attendings completed both evaluations on 74 of 221 (33%) cases. These paired evaluations strongly correlated on resident performance (Spearman coefficient = 0.84, p < 0.00001). Both evaluation types stratified operative skill level by program year (p < 0.00001).
CONCLUSIONS: Evaluation initiatives can be hampered by the challenge of making multiple surgical evaluation instruments available when needed for appropriate clinical situations, including specific case types. As a test of the optimized evaluation workflow, and to lay the groundwork for future data-driven design of evaluations, we tested the impact of simultaneously delivering 2 evaluation instruments via a secure web-based education platform. We measured the evaluation completion rates of faculty surgeon evaluators when rating resident operative performance, and how effectively the results could be analyzed and compared, taking advantage of highly integrated management of the evaluative information.
Affiliation(s)
Ruchi M Thanawala
- University of Massachusetts Medical School-Baystate, Springfield, Massachusetts; University of Iowa Health Care, Carver College of Medicine, Iowa City, Iowa
Jonathan L Jesneck
- University of Massachusetts Medical School-Baystate, Springfield, Massachusetts; University of Iowa Health Care, Carver College of Medicine, Iowa City, Iowa
Neal E Seymour
- University of Massachusetts Medical School-Baystate, Springfield, Massachusetts; University of Iowa Health Care, Carver College of Medicine, Iowa City, Iowa.
12. George BC, Bohnen JD, Schuller MC, Fryer JP. Using smartphones for trainee performance assessment: A SIMPL case study. Surgery 2019; 167:903-906. PMID: 31668358; DOI: 10.1016/j.surg.2019.09.011.
Abstract
Workplace-based assessments are used by raters to evaluate observed performance of trainees in actual clinical practice. These types of assessments are of growing interest, especially because observed performance is prioritized within the larger competency-based medical educational movement. Implementation of workplace-based assessments has, however, been challenging. This article describes the motivations and implications for workplace-based assessments that leverage smartphone technology. It does so in reference to an app called SIMPL (System for Improving and Measuring Procedural Learning) in order to highlight some of the challenges and benefits one might encounter during implementation of similar systems.
Affiliation(s)
- Brian C George: Center for Surgical Training and Research, Michigan Medicine, University of Michigan, Ann Arbor, MI
- Jordan D Bohnen: Department of Surgery, Massachusetts General Hospital, Boston, MA
- Mary C Schuller: Department of Surgery, Northwestern University, Evanston, IL

13
Holmstrom AL, Meyerson SL. Obtaining Meaningful Assessment in Thoracic Surgery Education. Thorac Surg Clin 2019; 29:239-247. [PMID: 31235292 DOI: 10.1016/j.thorsurg.2019.03.002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
Training in thoracic surgery has evolved immensely over the past decade due to the advent of integrated programs, technological innovations, and regulations on resident duty hours that decrease the time trainees have to learn. These changes have made assessment of thoracic surgical trainees even more important. Shifts in medical education have increasingly emphasized competency, leading to novel competency-based tools for clinical and operative assessment. These tools take advantage of simulation and modern technology to provide more frequent and comprehensive assessment of the surgical trainee to ensure competence.
Affiliation(s)
- Amy L Holmstrom: Department of Surgery, Northwestern University Feinberg School of Medicine, 676 North Saint Clair Street, Suite 2320, Chicago, IL 60611, USA
- Shari L Meyerson: Department of Surgery, University of Kentucky, 740 South Limestone, Suite A301, Lexington, KY 40536, USA

14
Mackenzie CF, Tisherman SA, Shackelford S, Sevdalis N, Elster E, Bowyer MW. Efficacy of Trauma Surgery Technical Skills Training Courses. JOURNAL OF SURGICAL EDUCATION 2019; 76:832-843. [PMID: 30827743 DOI: 10.1016/j.jsurg.2018.10.004] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/03/2018] [Accepted: 10/07/2018] [Indexed: 06/09/2023]
Abstract
OBJECTIVE Because open surgical skills training for trauma is limited in clinical practice, trauma skills training courses were developed to fill this gap. The aim of this report is to find supporting evidence for the efficacy of these courses. The questions addressed are: What courses are available, and is there robust evidence of benefit? DESIGN We performed a systematic review of the literature on open trauma surgery procedural skills courses for surgeons, using Kirkpatrick's framework for evaluating complex educational interventions. Courses were identified using PubMed, Google Scholar, and other databases. SETTING AND PARTICIPANTS The review was carried out at the University of Maryland, Baltimore, with input from civilian and military trauma surgeons, all of whom have taught and/or developed trauma skills courses. RESULTS We found 32 course reports that met search criteria, including 21 trauma-skills training courses. Courses were of variable duration, content, cost, and scope. There were no prospective randomized clinical trials of course impact. For most courses, efficacy was supported only by Kirkpatrick level 1 and 2 evidence from self-evaluations with small numbers of respondents. Few courses assessed skill retention with longitudinal data before and after training. Three courses, namely Advanced Trauma Life Support (ATLS), Advanced Surgical Skills for Exposure in Trauma (ASSET), and Advanced Trauma Operative Management (ATOM), have Kirkpatrick level 2-3 evidence of efficacy. Components of these 3 courses are included in several other courses, but many skills courses have little published evidence of training efficacy or durability of skills retention.
CONCLUSIONS Large variations in course content, duration, didactics, operative models, resource requirements and cost suggest that standardization of content, duration, and development of metrics for open surgery skills would be beneficial, as would translation into improved trauma patient outcomes. Surgeons at all levels of training and experience should participate in these trauma skills courses, because these procedures are rarely performed in routine clinical practice. Faculty running courses without evidence of training benefit should be encouraged to study outcomes to show their course improves technical skills and subsequently patient outcomes. Obtaining Kirkpatrick's level 3 and 4 evidence for benefits of ASSET, ATOM, ATLS and for other existing courses should be a high priority.
Affiliation(s)
- Colin F Mackenzie: Shock Trauma Anesthesiology Research Center, Baltimore, Maryland; University of Maryland School of Medicine, Baltimore, Maryland
- Nick Sevdalis: Center for Implementation Science, Kings College, London, UK
- Eric Elster, Mark W Bowyer: Department of Surgery, The Uniformed Services University of Health Sciences and the Walter Reed National Military Medical Center, Bethesda, Maryland

15
Bjorklund KA, Sommer N, Neumeister MW, Kasten SJ. Establishing Validity Evidence for an Operative Performance Rating System for Plastic Surgery Residents. JOURNAL OF SURGICAL EDUCATION 2019; 76:529-539. [PMID: 30253984 DOI: 10.1016/j.jsurg.2018.08.016] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/12/2017] [Revised: 08/16/2018] [Accepted: 08/18/2018] [Indexed: 05/22/2023]
Abstract
OBJECTIVE The aim of this study was to describe an operative performance rating system for plastic surgery residents and provide validity evidence for the instrument. METHODS Three plastic surgery residents (PGY levels 1, 5, and 6) from Southern Illinois University School of Medicine (SIUSOM) performed a carpal tunnel release with audio video recording. The 3 videos were reviewed by 8 expert hand surgeons and 3 SIUSOM faculty using the operative performance rating system instrument to assess resident operative performance. Validity evidence including content, internal structure, and relationship to other variables was collected. RESULTS Inter-rater reliability was consistently fair to moderate (weighted Cohen's Kappa 0.44-0.84 for experts, 0.24-0.55 for SIUSOM raters), and all assessment items were highly correlated (Cronbach's alpha of 0.9867). Local SIUSOM faculty routinely demonstrated higher overall scores for PGY 1 and PGY 6 residents compared to expert raters. CONCLUSIONS Although limited by small numbers, this pilot study suggests that potential bias based upon PGY year, identity, and performance history may exist and independent assessment by unbiased raters or comparison to national operative norms may be valuable. Our study provides baseline validity evidence for a resident operative performance assessment tool that can be integrated into practice in plastic surgery training programs.
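The internal-consistency figure cited above (Cronbach's alpha) is computed from the item variances relative to the variance of the total score. A minimal sketch with hypothetical ratings (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    # items: one inner list per scale item; each inner list holds that item's
    # scores across the same set of observed performances
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per performance
    item_var = sum(pvariance(item) for item in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# hypothetical 5-point ratings: 3 assessment items scored on 5 performances
items = [
    [4, 5, 3, 4, 4],
    [4, 4, 3, 5, 4],
    [5, 5, 3, 4, 4],
]
print(round(cronbach_alpha(items), 3))  # high internal consistency
```

Values near 1, like the 0.9867 reported above, indicate that the assessment items move together and are effectively measuring a single underlying construct.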
Affiliation(s)
- Kim A Bjorklund: Section of Plastic and Reconstructive Surgery, Department of Surgery, Nationwide Children's Hospital, The Ohio State University College of Medicine, Columbus, Ohio
- Nicole Sommer, Michael W Neumeister: Department of Surgery, Southern Illinois University School of Medicine, Springfield, Illinois
- Steven J Kasten: Section of Plastic Surgery, Department of Surgery, University of Michigan, Ann Arbor, Michigan

16
How Many Observations are Needed to Assess a Surgical Trainee's State of Operative Competency? Ann Surg 2019; 269:377-382. [DOI: 10.1097/sla.0000000000002554] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/23/2023]
17
Mapping the landscape of cataract surgery teaching assessment in Canadian residency programs. CANADIAN JOURNAL OF OPHTHALMOLOGY 2018; 54:155-158. [PMID: 30975336 DOI: 10.1016/j.jcjo.2018.04.023] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/03/2018] [Revised: 04/18/2018] [Accepted: 04/25/2018] [Indexed: 11/20/2022]
Abstract
OBJECTIVE The Royal College of Physicians and Surgeons of Canada has mandated a shift in postgraduate residency education in Canada towards a competency-based model. Within this context, it is unclear how residents' competence in cataract surgery is currently being assessed for both formative and summative purposes. Therefore, we conducted a national survey to evaluate the current landscape of cataract surgery teaching in Canadian ophthalmology programs. METHODS The opportunity to participate in an online survey was extended to all Canadian ophthalmology program directors and residents. Between July and September 2017, data were collected on demographics (name of program, level of training), the current framework of assessment, and any other contexts for cataract surgery assessment being used (e.g., wet labs or surgical simulators). RESULTS We received a total of 32 responses, including 7 program directors (22%), 14 senior residents (44%), and 10 junior residents (34%). The assessments used varied greatly; none of the residency programs used a published assessment tool for assessing skill in cataract surgery. The majority of programs (9 of 11; 82%) used locally designed assessments, and two programs (18%) did not use any standardized forms or tools. All schools were using a wet lab to augment surgical teaching, and simulators were being used by 5 of 11 programs (45%). CONCLUSION A variety of approaches are being used to assess competence in cataract surgery. Many programs share some similarities, and a framework for designing assessment is suggested to guide future efforts at competency-based training and assessment.
18
Comprehensive Multicenter Graduate Surgical Education Initiative Incorporating Entrustable Professional Activities, Continuous Quality Improvement Cycles, and a Web-Based Platform to Enhance Teaching and Learning. J Am Coll Surg 2018; 227:64-76. [PMID: 29551697 DOI: 10.1016/j.jamcollsurg.2018.02.014] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2018] [Revised: 02/03/2018] [Accepted: 02/05/2018] [Indexed: 11/23/2022]
19
Sharma G, Aycart MA, O'Mara L, Havens J, Nehs M, Shimizu N, Smink DS, Gravereaux E, Gates JD, Askari R. A cadaveric procedural anatomy simulation course improves video-based assessment of operative performance. J Surg Res 2017; 223:64-71. [PMID: 29433887 DOI: 10.1016/j.jss.2017.05.067] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/16/2016] [Revised: 05/04/2017] [Accepted: 05/19/2017] [Indexed: 10/18/2022]
Abstract
BACKGROUND Inadequate anatomic knowledge has been cited as a major contributor to declining surgical resident operative competence. We analyzed the impact of a comprehensive, procedurally oriented cadaveric anatomy dissection laboratory on the operative performance of surgery residents, hypothesizing that trainees' performance of surgical procedures would improve after such a dissection course. MATERIALS AND METHODS Midlevel general surgery residents (n = 9) participated in an 8-wk, 16-h, surgery faculty-led, procedurally oriented cadaver simulation course. Both before and after completion of the course, residents participated in a practical examination, in which they were randomized to perform one of nine Surgical Council on Resident Education-designated "essential" procedures. The procedures were recorded using wearable video technology. Videos were deidentified before evaluation by six faculty raters blinded to examinee identity and to whether performances occurred before or after an examinee had taken the course. Raters used the validated Operative Performance Rating System and Objective Structured Assessment of Technical Skill scales. RESULTS After the course, residents had higher procedure-specific scores (median, 4.0 versus 2.4, P < 0.0001) as well as higher instrument-handling (4.0 versus 3.0, P = 0.006), respect-for-tissue (4.0 versus 3.0, P = 0.0004), time-and-motion (3.0 versus 2.0, P = 0.0007), operation-flow (3.0 versus 2.0, P = 0.0005), procedural-knowledge (4.0 versus 2.0, P = 0.0001), and overall performance scores (4.0 versus 2.0, P < 0.0001). Operative Performance Rating System and Objective Structured Assessment of Technical Skill scores averaged by the number of items in each scale were also higher (3.2 versus 2.0, P = 0.0002 and 3.1 versus 2.2, P = 0.002, respectively). CONCLUSIONS A cadaveric procedural anatomy simulation course covering a broad range of open general surgery procedures was associated with significant improvements in trainees' operative performance.
Affiliation(s)
- Gaurav Sharma, Mario A Aycart, Lynne O'Mara, Joaquim Havens, Matthew Nehs, Naomi Shimizu, Douglas S Smink, Edwin Gravereaux, Jonathan D Gates, Reza Askari: Department of Surgery, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts

20

21
Köhler TS. Assessing Competence in Surgical Training and Becoming a Better Educator. J Sex Med 2017; 14:761-764. [PMID: 28583336 DOI: 10.1016/j.jsxm.2017.04.663] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2017] [Revised: 04/10/2017] [Accepted: 04/11/2017] [Indexed: 10/19/2022]
Affiliation(s)
- Tobias S Köhler: Division of Urology, Southern Illinois University School of Medicine, Springfield, IL, USA

22
Kozin ED, Bohnen JD, George BC, Justicz N, Colaianni CA, Duarte M, Gray ST. Novel Mobile App Allows for Fast and Validated Intraoperative Assessment of Otolaryngology Residents. OTO Open 2017; 1:2473974X16685705. [PMID: 30480172 PMCID: PMC6239054 DOI: 10.1177/2473974x16685705] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2016] [Accepted: 12/02/2016] [Indexed: 11/17/2022] Open
Abstract
Evaluation of resident operative skills is challenging in the fast-paced operating room environment and is limited by a lack of validated assessment metrics. We describe a smartphone-based app that enables rapid assessment of operative skills. Accreditation Council for Graduate Medical Education (ACGME) otolaryngology taxonomy surgical procedures (n = 593) were uploaded to the software platform. The app was piloted over 1 month. Outcomes included (1) completion of evaluation, (2) time spent completing the evaluation, and (3) quantification of case complexity, operative autonomy, and performance. During the study, 12 of 12 procedures were evaluated in paired fashion by the resident/attending dyad. Mean ± SD time of evaluation completion was 98.0 ± 24.2 and 123.0 ± 14.0 seconds for the resident and attending, respectively. Mean time between resident and attending evaluation completion was 27.9 ± 26.8 seconds. Resident and attending scores for case complexity, operative autonomy, and performance were strongly correlated (P < .0001). Rapid evaluation of resident intraoperative performance is feasible using smartphone-based technology.
Affiliation(s)
- Elliott D Kozin: Department of Otolaryngology, Massachusetts Eye and Ear Infirmary, Boston, Massachusetts, USA; Department of Otolaryngology, Harvard Medical School, Boston, Massachusetts, USA
- Jordan D Bohnen: Department of General Surgery, Massachusetts General Hospital, Boston, Massachusetts, USA
- Brian C George: Department of General Surgery, Massachusetts General Hospital, Boston, Massachusetts, USA; Department of Surgery, University of Michigan Health System, Ann Arbor, Michigan, USA
- Natalie Justicz, C Alessandra Colaianni, Maria Duarte, Stacey T Gray: Department of Otolaryngology, Massachusetts Eye and Ear Infirmary, Boston, Massachusetts, USA; Department of Otolaryngology, Harvard Medical School, Boston, Massachusetts, USA

23
Mellinger JD, Williams RG, Sanfey H, Fryer JP, DaRosa D, George BC, Bohnen JD, Schuller MC, Sandhu G, Minter RM, Gardner AK, Scott DJ. Teaching and assessing operative skills: From theory to practice. Curr Probl Surg 2016; 54:44-81. [PMID: 28212782 DOI: 10.1067/j.cpsurg.2016.11.007] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2016] [Accepted: 11/22/2016] [Indexed: 11/22/2022]
Affiliation(s)
- John D Mellinger: Department of Surgery, Southern Illinois University School of Medicine, Springfield, IL
- Reed G Williams: Department of Surgery, Southern Illinois University School of Medicine, Springfield, IL; Department of Surgery, Indiana University School of Medicine, Indianapolis, IN
- Hilary Sanfey: Department of Surgery, Southern Illinois University School of Medicine, Springfield, IL; American College of Surgeons, Chicago, IL
- Jonathan P Fryer: Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, IL
- Debra DaRosa: Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, IL
- Brian C George: Department of Surgery, University of Michigan, Ann Arbor, MI
- Jordan D Bohnen: Department of General Surgery, Massachusetts General Hospital and Harvard University, Boston, MA
- Mary C Schuller: Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, IL
- Gurjit Sandhu: Department of Surgery, University of Michigan, Ann Arbor, MI; Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI
- Rebecca M Minter: Department of Surgery, University of Texas Southwestern Medical Center, Dallas, TX
- Aimee K Gardner, Daniel J Scott: Department of Surgery, University of Texas Southwestern Medical Center, Dallas, TX; UT Southwestern Simulation Center, University of Texas Southwestern Medical Center, Dallas, TX

24
Dwyer T, Slade Shantz J, Kulasegaram KM, Chahal J, Wasserstein D, Schachar R, Devitt B, Theodoropoulos J, Hodges B, Ogilvie-Harris D. Use of an Objective Structured Assessment of Technical Skill After a Sports Medicine Rotation. Arthroscopy 2016; 32:2572-2581.e3. [PMID: 27474104 DOI: 10.1016/j.arthro.2016.05.037] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/15/2016] [Revised: 04/10/2016] [Accepted: 05/05/2016] [Indexed: 02/02/2023]
Abstract
PURPOSE The purpose of this study was to determine whether an Objective Structured Assessment of Technical Skill (OSATS) using dry models would be a valid method of assessing residents' ability to perform sports medicine procedures after training in a competency-based model. METHODS Over 18 months, 27 residents (19 junior [postgraduate year (PGY) 1-3] and 8 senior [PGY 4-5]) sat the OSATS after their rotation, in addition to 14 sports medicine staff and fellows. Each resident was provided a list of 10 procedures in which they were expected to show competence. At the end of the rotation, each resident undertook an OSATS composed of 6 stations sampled from the 10 procedures using dry models. Faculty used the Arthroscopic Surgical Skill Evaluation Tool (ASSET), task-specific checklists, and an overall 5-point global rating scale (GRS) to score each resident. Each procedure was videotaped for blinded review. RESULTS The overall reliability of the OSATS (0.9) and the inter-rater reliability (0.9) were both high. A significant difference by year in training was seen for the overall GRS, the total ASSET score, and the total checklist score, as well as for each technical procedure (P < .001). Further analysis revealed a significant difference in the total ASSET score between junior residents (mean 18.4, 95% confidence interval [CI] 16.8 to 19.9) and senior residents (24.2, 95% CI 22.7 to 25.6), between senior residents and fellows (30.1, 95% CI 28.2 to 31.9), and between fellows and faculty (37, 95% CI 36.1 to 27.8) (P < .05). CONCLUSIONS The results of this study show that an OSATS using dry models shows evidence of validity when used to assess performance of technical procedures after a sports medicine rotation. However, junior residents were not able to perform as well as senior residents, suggesting that overall surgical experience is as important as intensive teaching.
CLINICAL RELEVANCE As postgraduate medical training shifts to a competency-based model, methods of assessing performance of technical procedures become necessary.
Affiliation(s)
- Tim Dwyer: Women's College Hospital, Toronto, Ontario, Canada; Mt. Sinai Hospital, Toronto, Ontario, Canada
- John Theodoropoulos: Women's College Hospital, Toronto, Ontario, Canada; Mt. Sinai Hospital, Toronto, Ontario, Canada

25
Williams RG, Kim MJ, Dunnington GL. Practice Guidelines for Operative Performance Assessments. Ann Surg 2016; 264:934-948. [DOI: 10.1097/sla.0000000000001685] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
26
Bohnen JD, George BC, Williams RG, Schuller MC, DaRosa DA, Torbeck L, Mullen JT, Meyerson SL, Auyang ED, Chipman JG, Choi JN, Choti MA, Endean ED, Foley EF, Mandell SP, Meier AH, Smink DS, Terhune KP, Wise PE, Soper NJ, Zwischenberger JB, Lillemoe KD, Dunnington GL, Fryer JP. The Feasibility of Real-Time Intraoperative Performance Assessment With SIMPL (System for Improving and Measuring Procedural Learning): Early Experience From a Multi-institutional Trial. JOURNAL OF SURGICAL EDUCATION 2016; 73:e118-e130. [PMID: 27886971 DOI: 10.1016/j.jsurg.2016.08.010] [Citation(s) in RCA: 133] [Impact Index Per Article: 16.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/02/2016] [Revised: 07/12/2016] [Accepted: 08/18/2016] [Indexed: 06/06/2023]
Abstract
PURPOSE Intraoperative performance assessment of residents is of growing interest to trainees, faculty, and accreditors. Current approaches to collect such assessments are limited by low participation rates and long delays between procedure and evaluation. We deployed an innovative, smartphone-based tool, SIMPL (System for Improving and Measuring Procedural Learning), to make real-time intraoperative performance assessment feasible for every case in which surgical trainees participate, and hypothesized that SIMPL could be feasibly integrated into surgical training programs. METHODS Between September 1, 2015 and February 29, 2016, 15 U.S. general surgery residency programs were enrolled in an institutional review board-approved trial. SIMPL was made available after 70% of faculty and residents completed a 1-hour training session. Descriptive and univariate statistics analyzed multiple dimensions of feasibility, including training rates, volume of assessments, response rates/times, and dictation rates. The 20 most active residents and attendings were evaluated in greater detail. RESULTS A total of 90% of eligible users (1267/1412) completed training, and 13 of 15 programs began using SIMPL. In total, 6024 assessments were completed by 254 categorical general surgery residents (n = 3555 assessments) and 259 attendings (n = 2469 assessments), and 3762 unique operations were assessed. There was significant heterogeneity in participation within and between programs. The mean percentages (ranges) of users who completed ≥1, ≥5, and ≥20 assessments were 62% (21%-96%), 34% (5%-75%), and 10% (0%-32%) across all programs, and 96%, 75%, and 32% in the most active program. Overall, the response rate was 70%, the dictation rate was 24%, and the mean response time was 12 hours. Assessments increased from 357 (September 2015) to 1146 (February 2016). The 20 most active residents each received a mean of 46 assessments by 10 attendings for 20 different procedures.
CONCLUSIONS SIMPL can be feasibly integrated into surgical training programs to enhance the frequency and timeliness of intraoperative performance assessment. We believe SIMPL could help facilitate a national competency-based surgical training system, although local and systemic challenges still need to be addressed.
Affiliation(s)
- Jordan D Bohnen: Department of Surgery, Harvard Medical School, Massachusetts General Hospital, Boston, Massachusetts
- Brian C George: Department of Surgery, Harborview Medical Center, University of Washington, Seattle, Washington
- Reed G Williams: Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Mary C Schuller: Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Debra A DaRosa: Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Laura Torbeck: Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- John T Mullen: Department of Surgery, Harvard Medical School, Massachusetts General Hospital, Boston, Massachusetts
- Shari L Meyerson: Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Edward D Auyang: Department of Surgery, University of New Mexico School of Medicine, Albuquerque, New Mexico
- Jeffrey G Chipman: Department of Surgery, University of Minnesota Medical School, Minneapolis, Minnesota
- Jennifer N Choi: Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Michael A Choti: Department of Surgery, University of Texas Southwestern Medical Center, Dallas, Texas
- Eric D Endean: Department of Surgery, University of Kentucky College of Medicine, Lexington, Kentucky
- Eugene F Foley: Department of Surgery, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- Samuel P Mandell: Department of Surgery, Harborview Medical Center, University of Washington, Seattle, Washington
- Andreas H Meier: Department of Surgery, State University of New York Upstate Medical University, Syracuse, New York
- Douglas S Smink: Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts
- Kyla P Terhune: Department of Surgery, Vanderbilt University Medical Center, Nashville, Tennessee
- Paul E Wise: Department of Surgery, Washington University School of Medicine, St. Louis, Missouri
- Nathaniel J Soper: Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Keith D Lillemoe: Department of Surgery, Harvard Medical School, Massachusetts General Hospital, Boston, Massachusetts
- Gary L Dunnington: Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Jonathan P Fryer: Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois

27
Validated Assessment Tools and Maintenance of Certification in Plastic Surgery. Plast Reconstr Surg 2016; 137:1327-1333. [DOI: 10.1097/prs.0000000000002038] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
28
Jeyarajah DR, Berman RS, Doyle MB, Geevarghese SK, Posner MC, Farmer D, Minter RM. Consensus Conference on North American Training in Hepatopancreaticobiliary Surgery: A Review of the Conference and Presentation of Consensus Statements. Am J Transplant 2016; 16:1086-93. [PMID: 26928942 DOI: 10.1111/ajt.13675] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2015] [Revised: 11/29/2015] [Accepted: 11/29/2015] [Indexed: 01/25/2023]
Abstract
The findings and recommendations of the North American consensus conference on training in hepatopancreaticobiliary (HPB) surgery held in October 2014 are presented. The conference was hosted by the Society for Surgical Oncology (SSO), the Americas Hepato-Pancreatico-Biliary Association (AHPBA), and the American Society of Transplant Surgeons (ASTS). The current state of training in HPB surgery in North America was defined through three pathways: HPB, surgical oncology, and solid organ transplant fellowships. Consensus regarding programmatic requirements included establishment of minimum case volumes and inclusion of quality metrics. Formative assessment, using milestones as a framework and inclusive of both operative and nonoperative skills, must be present. Specific core HPB cases should be defined and used for evaluation of operative skills. The conference concluded with a focus on the optimal means to perform summative assessment to evaluate the individual fellow completing a fellowship in HPB surgery. Presentations from the hospital perspective and the American Board of Surgery led to consensus that summative assessment was desired by the public and the hospital systems and should occur in a uniform but possibly modular manner for all HPB fellowship pathways. A task force composed of representatives of the SSO, AHPBA, and ASTS is charged with implementation of the consensus statements emanating from this consensus conference.
Affiliation(s)
- D R Jeyarajah, Department of Surgery, Methodist Dallas Medical Center, Dallas, TX
- R S Berman, Department of Surgery, Division of Surgical Oncology, New York University, New York, NY
- M B Doyle, Department of Abdominal Transplantation, Washington University School of Medicine, St Louis, MO
- S K Geevarghese, Division of Hepatobiliary Surgery and Liver Transplantation, Vanderbilt University Medical Center, Nashville, TN
- M C Posner, Section of General Surgery and Surgical Oncology, University of Chicago Medicine, Chicago, IL
- D Farmer, Department of Transplantation, UCLA Medical Center, Los Angeles, CA
- R M Minter, Department of Surgery, Division of Hepatopancreatobiliary Surgery, University of Texas Southwestern Medical Center, Dallas, TX
|
29
|
Jeyarajah DR, Berman RS, Doyle M, Geevarghese SK, Posner MC, Farmer D, Minter RM. Consensus Conference on North American Training in Hepatopancreaticobiliary Surgery: A Review of the Conference and Presentation of Consensus Statements. Ann Surg Oncol 2016; 23:2153-60. [DOI: 10.1245/s10434-016-5111-9]
|
30
|
Bilgic E, Watanabe Y, McKendy K, Munshi A, Ito YM, Fried GM, Feldman LS, Vassiliou MC. Reliable assessment of operative performance. Am J Surg 2016; 211:426-30. [DOI: 10.1016/j.amjsurg.2015.10.008]
|
31
|
Williams RG, Verhulst S, Mellinger JD, Dunnington GL. Is a Single-Item Operative Performance Rating Sufficient? J Surg Educ 2015; 72:e212-e217. [PMID: 26610357 DOI: 10.1016/j.jsurg.2015.05.002]
Abstract
OBJECTIVE A valid measure of resident operative performance ability requires direct observation and accurate rating of multiple resident performances under the normal range of operating conditions. The challenge is to create an operative performance rating (OPR) system that is easy to use, encourages completion of many ratings immediately after performances, and minimally disrupts supervising surgeons' workdays. The purpose of this study was to determine whether a score based on a single-item overall OPR provides a valid and stable appraisal of resident operative performances. DESIGN A retrospective comparison of a single-item OPR with a gold-standard rating based on multiple procedure-specific and general OPR items. SETTING Data were collected in the general surgery residency program at Southern Illinois University from 2001 through 2012. PARTICIPANTS Assessments of 1033 operative performances (3 common procedures: 2 laparoscopic and 1 open) by general surgery residents were collected. OPRs based on single-item overall performance scale scores were compared with gold-standard ratings for the same performances. RESULTS Differences in performance scores using the 2 scales averaged 0.02 points (5-point scale). Correlations of the single-item and gold-standard scale scores averaged 0.95. Based on generalizability analyses of laparoscopic cholecystectomy ratings, each instrument required 5 observations to achieve a reliability of 0.80 and 11 observations to achieve a reliability of 0.90. Only 4.4% of single-item ratings misclassified the performance when compared with the gold-standard rating, and all misclassifications were near misses. For 80% of misclassified ratings, single-item ratings were lower. CONCLUSIONS Single-item operative performance measures produced ratings that were virtually identical to gold-standard scale ratings. Misclassifications occurred infrequently and were minor in magnitude. Ratings using the single-item scale take less time to complete, should increase the sample of procedures rated, and encourage attending surgeons to complete ratings immediately after observing performances. Face-to-face and written comments and suggestions should continue to be used to provide the granular feedback residents need to improve subsequent performances.
Affiliation(s)
- Reed G Williams, Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Steven Verhulst, Department of Medical Education, Southern Illinois University School of Medicine, Springfield, Illinois
- John D Mellinger, Department of Surgery, Southern Illinois University School of Medicine, Springfield, Illinois
- Gary L Dunnington, Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
|
32
|
Watanabe Y, Bilgic E, Lebedeva E, McKendy KM, Feldman LS, Fried GM, Vassiliou MC. A systematic review of performance assessment tools for laparoscopic cholecystectomy. Surg Endosc 2015; 30:832-44. [PMID: 26092014 DOI: 10.1007/s00464-015-4285-8]
Abstract
BACKGROUND Multiple tools are available to assess clinical performance of laparoscopic cholecystectomy (LC), but there are no guidelines on how best to implement and interpret them in educational settings. The purpose of this systematic review was to identify and critically appraise LC assessment tools and their measurement properties, in order to make recommendations for their implementation in surgical training. METHODS A systematic search (1989-2013) was conducted in MEDLINE, Embase, Scopus, Cochrane, and grey literature sources. Evidence for validity (content, response process, internal structure, relations to other variables, and consequences) and the conditions in which the evidence was obtained were evaluated. RESULTS A total of 54 articles were included for qualitative synthesis. Fifteen technical skills and two non-technical skills assessment tools were identified. The 17 tools were used for recorded procedures (nine tools), direct observation (five tools), or both (three tools). Fourteen tools (82%) reported inter-rater reliability, and one reported a Generalizability Theory coefficient. Nine (53%) had evidence for validity based on clinical experience, and 11 (65%) compared scores to other assessments. Consequences of scores, educational impact, applications to residency training, and how raters were trained were not clearly reported. No studies mentioned cost. CONCLUSIONS The most commonly reported validity evidence was inter-rater reliability and relationships to other known variables. Consequences of assessments and rater training were not clearly reported. These data and the evidence for validity should be taken into consideration when deciding how to select and implement a tool to assess performance of LC, and especially how to interpret the results.
Affiliation(s)
- Yusuke Watanabe, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, 1650 Cedar Avenue, L9.316, Montreal, QC, H3G 1A4, Canada; and Department of Gastroenterological Surgery II, Hokkaido University Graduate School of Medicine, Sapporo, Hokkaido, Japan
- Elif Bilgic, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, Montreal, QC, Canada
- Ekaterina Lebedeva, The Henry K.M. De Kuyper Education Centre, McGill University Health Centre, Montreal, QC, Canada
- Katherine M McKendy, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, Montreal, QC, Canada
- Liane S Feldman, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, Montreal, QC, Canada
- Gerald M Fried, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, Montreal, QC, Canada
- Melina C Vassiliou, Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation, McGill University Health Centre, Montreal, QC, Canada
|
33
|
Gardner AK, Scott DJ, Hebert JC, Mellinger JD, Frey-Vogel A, Ten Eyck RP, Davis BR, Sillin LF, Sachdeva AK. Gearing up for milestones in surgery: Will simulation play a role? Surgery 2015; 158:1421-7. [PMID: 26013987 DOI: 10.1016/j.surg.2015.03.039]
Abstract
BACKGROUND The Consortium of American College of Surgeons-Accredited Education Institutes was created to promote patient safety through the use of simulation, develop new education and technologies, identify best practices, and encourage research and collaboration. METHODS During the 7th Annual Meeting of the Consortium, leaders from a variety of specialties discussed how simulation is playing a role in the assessment of resident performance within the context of the Milestones of the Accreditation Council for Graduate Medical Education as part of the Next Accreditation System. CONCLUSION This report presents experiences from several viewpoints and supports the utility of simulation for this purpose.
Affiliation(s)
- James C Hebert, University of Vermont College of Medicine, Residency Review Committee for Surgery, Burlington, VT
- John D Mellinger, Southern Illinois University School of Medicine, Springfield, IL
- Lelan F Sillin, Lahey Center for Professional Development and Simulation, Burlington, MA
|
34
|
Abstract
OBJECTIVE The purpose of this study was to create a technical skills assessment toolbox for 35 basic and advanced skills/procedures that comprise the American College of Surgeons (ACS)/Association of Program Directors in Surgery (APDS) surgical skills curriculum and to provide a critical appraisal of the included tools, using the contemporary framework of validity. BACKGROUND Competency-based training has become the predominant model in surgical education, and assessment of performance is an essential component. Assessment methods must produce valid results to accurately determine the level of competency. METHODS A search was performed, using PubMed and Google Scholar, to identify tools that have been developed for assessment of the targeted technical skills. RESULTS A total of 23 assessment tools for the 35 ACS/APDS skills modules were identified. Some tools, such as the Objective Structured Assessment of Technical Skill (OSATS) and the Operative Performance Rating System (OPRS), have been tested for more than 1 procedure. Therefore, 30 modules had at least 1 assessment tool, with some common surgical procedures being addressed by several tools. Five modules had none. Only 3 studies used Messick's framework to design their validity studies. The remaining studies used an outdated framework on the basis of "types of validity." When analyzed using the contemporary framework, few of these studies demonstrated validity for content, internal structure, and relationship to other variables. CONCLUSIONS This study provides an assessment toolbox for common surgical skills/procedures. Our review shows that few authors have used the contemporary unitary concept of validity for development of their assessment tools. As we progress toward competency-based training, future studies should provide evidence for various sources of validity using the contemporary framework.
|
35
|
Dougherty PJ. CORR curriculum - orthopaedic education: Faculty development begins at home. Clin Orthop Relat Res 2014; 472:3637-43. [PMID: 25298280 PMCID: PMC4397787 DOI: 10.1007/s11999-014-3986-y]
Affiliation(s)
- Paul J Dougherty, Detroit Medical Center, 4201 St. Antoine, Suite 4G, Detroit, MI 48201, USA
|
36
|
Chen XP, Williams RG, Smink DS. Do residents receive the same OR guidance as surgeons report? Difference between residents' and surgeons' perceptions of OR guidance. J Surg Educ 2014; 71:e79-e82. [PMID: 24931416 DOI: 10.1016/j.jsurg.2014.04.010]
Abstract
PURPOSE Operating room (OR) guidance is important for surgical residents' performance and, ultimately, for the development of independence and autonomy. This study explores the differences in surgical residents' and attending surgeons' perceptions of OR guidance in prerecorded surgical cases. METHODS A total of 9 attending surgeons and 8 surgical residents observed 8 prerecorded surgical cases and were asked to identify both the presence and the type of attending surgeons' OR guidance. Each recorded case was observed by 2 attending surgeons and 1 resident. A previously developed taxonomy for types of OR guidance was applied to analyze the data to explore the difference. Agreement by both attending surgeons on the presence and the type of OR guidance served as the concordant guidance behaviors to which the responses of the residents were compared. RESULTS Overall, 116 OR guidance events were identified. Attending surgeons agreed on the presence of guidance in 80 of 116 (69.8%) events and consistently identified the type of OR guidance in 91.4% (73/80, Cohen κ = 0.874) of them. However, surgical residents only agreed with attending surgeons on the presence of guidance in 61.25% (49/80) of the events. In addition, there was significant disagreement (Cohen κ = 0.319) between surgical residents and attending surgeons in the type of OR guidance; the residents only identified 54.8% (40/73) of concordant guidance behaviors in the same guidance category as both the surgeons. Among the types of OR guidance, residents and attending surgeons were most likely to agree on the teaching guidance (66.67%) and least likely to agree on the assisting guidance (36.84%). CONCLUSIONS Surgical residents and attending surgeons have different perceptions of both the presence and the type of OR guidance. 
This difference in perception of OR guidance has important implications for the efficiency of training surgical residents in the OR and, ultimately, for residents' development of independence and autonomy.
Affiliation(s)
- Reed G Williams, Department of Surgery, School of Medicine, Indiana University, Indianapolis, Indiana
- Douglas S Smink, Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts
|
37
|
Minter RM, Dunnington GL, Sudan R, Terhune KP, Dent DL, Lentz AK. Can This Resident Be Saved? Identification and Early Intervention for Struggling Residents. J Am Coll Surg 2014; 219:1088-95. [DOI: 10.1016/j.jamcollsurg.2014.06.013]
|
38
|
Williams RG, Chen XP, Sanfey H, Markwell SJ, Mellinger JD, Dunnington GL. The measured effect of delay in completing operative performance ratings on clarity and detail of ratings assigned. J Surg Educ 2014; 71:e132-e138. [PMID: 25088368 DOI: 10.1016/j.jsurg.2014.06.015]
Abstract
PURPOSE Operative performance ratings (OPRs) need adequate clarity and detail to support self-directed learning and valid progress decisions. This study was designed to determine (1) the elapsed time between observing operative performances and completing performance ratings under field conditions and (2) the effect of increased elapsed time on rating clarity and detail. METHODS Overall, 895 OPRs by 19 faculty members for 37 general surgery residents were the focus of this study. The elapsed time between observing the performance and completing the evaluation was recorded. No-delay comparison data included 45 additional ratings of 8 performances collected under controlled conditions immediately following the performance by 17 surgeons whose sole responsibility was to observe and rate the performances. Item-to-item OPR variation and the presence and nature of comments were indicators of evaluation clarity, detail, and quality. RESULTS Elapsed times between observing and evaluating performances under field conditions were as follows: 1 day or less, 116 performances (13%); 2 to 3 days, 178 performances (20%); 4 to 14 days, 377 performances (42%); and more than 14 days, 224 performances (25%). Overall, 87% of performances rated more than 14 days after observation had no item-to-item ratings variation compared with 62% rated with a delay of 4 to 14 days, 41% rated with a delay of 2 to 3 days, 42% rated within 1 day, and 2% rated immediately. In addition, 70% of ratings completed more than 14 days after observation had no written comments, compared with 49% for those completed with a delay of 4 to 14 days, 45% for those completed in 2 to 3 days, and 46% for those completed within 1 day. Moreover, 47% of comments submitted after more than 14 days were exclusively global comments (less instructionally useful) compared with 7% for those completed with a delay of 4 to 14 days and 5% for those completed in 1 to 3 days.
CONCLUSIONS The elapsed time between observation and rating of operative performances should be recorded. Immediate ratings should be encouraged. Ratings completed more than 3 days after observation should be discouraged and discounted, as they lack clarity and detail about the performance.
Affiliation(s)
- Reed G Williams, Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Hilary Sanfey, Department of Surgery, Southern Illinois University School of Medicine, Springfield, Illinois
- Stephen J Markwell, Department of Surgery, Southern Illinois University School of Medicine, Springfield, Illinois
- John D Mellinger, Department of Surgery, Southern Illinois University School of Medicine, Springfield, Illinois
- Gary L Dunnington, Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
|
39
|
Schatz A, Kogan B, Feustel P. Assessing resident surgical competency in urology using a global rating scale. J Surg Educ 2014; 71:790-797. [PMID: 24862244 DOI: 10.1016/j.jsurg.2014.03.012]
Abstract
OBJECTIVE Training programs must ensure residents are competent to practice independently. For surgical fields, this is generally done by the faculty who graduate the residents, but there has been no accepted methodology for this process. DESIGN As part of a generalized survey, attending physicians performing an operation were asked to assess resident competency to perform the operation independently in an average patient, using a single global question. Residents, in a blinded manner, were asked to answer the same question. SETTING Urology Residency Program, Albany Medical College, Albany, NY. PARTICIPANTS Participants included 12 resident physicians and 10 attending physicians. RESULTS There is a large variation in attending physician assessment of resident surgical competency, and the assessment varies by attending physician and by resident. Generally, attending physicians rated residents lower than the residents rated themselves. The discrepancy was largest for residents early in training and lessened as resident experience increased. Assessments also tended to converge toward the attending physician assessment as competency increased. Assessments had less variability when involving a single, high-volume procedure for a single resident. CONCLUSIONS Assessing resident surgical competency with a standardized global question is feasible, but complex. Attending physicians and residents differ significantly in their assessment of resident competence. The trend of residents' perceptions approaching attending physician estimates as training and competence increases supports the current concept that program directors should use attending physician assessments as the primary measure.
Affiliation(s)
- Adam Schatz, Department of Urology, Albany Medical College, Albany, New York
- Barry Kogan, Department of Urology, Albany Medical College, Albany, New York
- Paul Feustel, Department of Urology, Albany Medical College, Albany, New York
|
40
|
Wagner JP, Chen DC, Donahue TR, Quach C, Hines OJ, Hiatt JR, Tillou A. Assessment of resident operative performance using a real-time mobile Web system: preparing for the milestone age. J Surg Educ 2014; 71:e41-e46. [PMID: 25037504 DOI: 10.1016/j.jsurg.2014.06.008]
Abstract
OBJECTIVE To satisfy trainees' operative competency requirements while improving feedback validity and timeliness using a mobile Web-based platform. DESIGN The Southern Illinois University Operative Performance Rating Scale (OPRS) was embedded into a website formatted for mobile devices. From March 2013 to February 2014, faculty members were instructed to complete the OPRS form while providing verbal feedback to the operating resident at the conclusion of each procedure. Submitted data were compiled automatically within a secure Web-based spreadsheet. Conventional end-of-rotation performance (CERP) evaluations filed 2006 to 2013 and OPRS performance scores were compared by year of training using serial and independent-samples t tests. The mean CERP scores and OPRS overall resident operative performance scores were directly compared using a linear regression model. OPRS mobile site analytics were reviewed using a Web-based reporting program. SETTING Large university-based general surgery residency program. PARTICIPANTS General Surgery faculty used the mobile Web OPRS system to rate resident performance. Residents and the program director reviewed evaluations semiannually. RESULTS Over the study period, 18 faculty members and 37 residents logged 176 operations using the mobile OPRS system. There were 334 total OPRS website visits. Median time to complete an evaluation was 45 minutes from the end of the operation, and faculty spent an average of 134 seconds on the site to enter 1 assessment. In the 38,506 CERP evaluations reviewed, mean performance scores showed a positive linear trend of 2% change per year of training (p = 0.001). OPRS overall resident operative performance scores showed a significant linear (p = 0.001), quadratic (p = 0.001), and cubic (p = 0.003) trend of change per year of clinical training, reflecting the resident operative experience in our training program. 
Differences between postgraduate year-1 and postgraduate year-5 overall performance scores were greater with the OPRS (mean = 0.96, CI: 0.55-1.38) than with CERP measures (mean = 0.37, CI: 0.34-0.41). Additionally, there were consistent increases in each of the OPRS subcategories. CONCLUSIONS In contrast to CERPs, the OPRS fully satisfies the Accreditation Council for Graduate Medical Education and American Board of Surgery operative assessment requirements. The mobile Web platform provides a convenient interface, broad accessibility, automatic data compilation, and compatibility with common database and statistical software. Our mobile OPRS system encourages candid feedback dialog and generates a comprehensive review of individual and group-wide operative proficiency in real time.
Affiliation(s)
- Justin P Wagner, David C Chen, Timothy R Donahue, Chi Quach, O Joe Hines, Jonathan R Hiatt, and Areti Tillou, David Geffen School of Medicine, University of California, Los Angeles, California
|
41
|
Abstract
BACKGROUND The patient safety imperative has raised expectations regarding the responsibility of medical educators and decision makers to ensure that physicians are competent. Ensuring that trainees are ready for independent practice upon graduation is challenged by reduced work hours, such that trainees spend less time in the OR and perform fewer cases than desirable. METHODS The literature on the assessment of technical and nontechnical operative skills and professionalism was reviewed in order to identify barriers to evaluation and make recommendations. DISCUSSION Barriers to documenting performance deficiencies include uncertainty as to what should be documented and concerns about the negative impact of critical evaluations on faculty popularity. Additional challenges include a lack of clear standards for performance and effective remediation options. CONCLUSIONS Trainee performance should be evaluated in a rigorous, reliable, and meaningful way to ensure that graduates have the skills necessary for safe, independent practice.
|
42
|
Sachdeva AK, Flynn TC, Brigham TP, Dacey RG, Napolitano LM, Bass BL, Philibert I, Blair PG, Lupi LK. Interventions to address challenges associated with the transition from residency training to independent surgical practice. Surgery 2014; 155:867-82. [DOI: 10.1016/j.surg.2013.12.027]
|
43
|
Recognizing Residents with a Deficiency in Operative Performance as a Step Closer to Effective Remediation. J Am Coll Surg 2013; 216:114-22. [DOI: 10.1016/j.jamcollsurg.2012.09.008]
|