1
Doster D, Hunt ML, Thomas CM, Krusing MB, Miller PM, Choi J, Stefanidis D, Ritter EM. Using ACGME General Surgery Milestones to Define the Competent Foundational Surgical Resident. J Surg Educ 2024;81:973-982. [PMID: 38749820] [DOI: 10.1016/j.jsurg.2024.03.016]
Abstract
OBJECTIVE In transitioning to competency-based surgical training, the need to clearly define competency is paramount. The purpose of this study is to define the well-prepared foundational resident using the ACGME General Surgery Milestones as our conceptual framework. DESIGN Participants reflected on their expectations of a well-prepared resident at the end of PGY1, then assigned milestone levels reflecting this level of competence for General Surgery Milestones 1.0 and 2.0. Subcompetency scores were averaged among residents and faculty. The level of the well-prepared foundational resident was determined as the highest level within one standard deviation of the faculty, resident, and total group averages. SETTING This took place during a dedicated education retreat at a single, large academic general surgery residency program. PARTICIPANTS Key faculty stakeholders and a representative sample of residents (PGY 1-5) within our institution participated. RESULTS Eight faculty and five residents completed Milestones 1.0 and 2.0 scoring. Mean scores between faculty and residents were compared. For 1.0, mean scores for Practice-Based Learning and Improvement 3 (PBLI 3) and Interpersonal Communication Skills 3 (ICS 3) were discernibly lower for residents than for faculty (PBLI 3: 1.3 (0.3) vs 0.9 (0.2), p = 0.01; ICS 3: 1.6 (0.6) vs 1.1 (1.0), p = 0.01). Milestones 2.0 scores were comparable across all subcompetency domains. Given this broad agreement, Milestone-based competency standards were determined. Descriptive narratives of the knowledge, skills, and attitudes (KSAs) were created for each subcompetency, combining the determined Milestones 1.0 and 2.0 levels. CONCLUSIONS We were able to clearly define the competent foundational resident using the ACGME Milestones as a conceptual framework. These Milestone levels reflect the culture and expectations in our department, providing a foundation upon which to build a program of assessment. This methodology can be readily replicated in other programs to reflect program-specific expectations within the larger ACGME competency frameworks.
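The selection rule described in this abstract (the highest milestone level lying within one standard deviation of the faculty, resident, and combined group means) can be sketched as follows. The ratings and the candidate-level grid here are hypothetical illustrations, not the study's data:

```python
import statistics

def competent_level(faculty, residents, levels=(1.0, 1.5, 2.0, 2.5, 3.0)):
    """Return the highest milestone level within one standard deviation of
    the faculty, resident, and combined group means for one subcompetency
    (a sketch of the rule described in the abstract)."""
    groups = [faculty, residents, faculty + residents]
    candidates = [
        lvl for lvl in levels
        if all(abs(lvl - statistics.mean(g)) <= statistics.stdev(g) for g in groups)
    ]
    return max(candidates) if candidates else None

# hypothetical subcompetency ratings from 5 faculty and 4 residents
faculty = [2.0, 2.5, 2.0, 1.5, 2.0]
residents = [1.5, 2.0, 2.0, 2.5]
print(competent_level(faculty, residents))  # → 2.0
```

Ties and empty candidate sets would need a program-specific policy; the sketch simply returns `None` when no level satisfies all three groups.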
Affiliation(s)
- Dominique Doster
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Maya L Hunt
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Christopher M Thomas
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Madeline B Krusing
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Payton M Miller
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Jennifer Choi
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- Dimitrios Stefanidis
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
- E Matthew Ritter
- Department of Surgery, Indiana University School of Medicine, Indianapolis, Indiana
2
Pradarelli AA, Park YS, Healy MG, Phitayakorn R, Petrusa E. National Profile of the ACGME Milestones 1.0 and 2.0 within General Surgery: A Seven-Year National Study from 2014 to 2021. J Surg Educ 2024;81:626-638. [PMID: 38555246] [DOI: 10.1016/j.jsurg.2024.01.016]
Abstract
PURPOSE The Accreditation Council for Graduate Medical Education (ACGME) introduced General Surgery Milestones 1.0 in 2014 and Milestones 2.0 in 2020 as steps toward competency-based training. Analysis of milestone ratings can inform residency programs' curriculum development, assessment, feedback, and faculty development. This study describes the distributions and trends of Milestones 1.0 and 2.0 ratings and the proportion of residents not achieving the level 4.0 graduation target. METHODS A deidentified dataset of milestone ratings for all ACGME-accredited General Surgery residency programs in the United States was used. Medians and interquartile ranges (IQR) were reported for milestone ratings at each PGY level. Percentages of PGY-5s receiving final-year ratings of less than 4.0 were calculated. Wilcoxon rank sum tests were used to compare 1.0 and 2.0 median ratings. Kruskal-Wallis and Bonferroni post-hoc tests were used to compare median ratings across time periods and PGY levels. Chi-squared tests were used to compare the proportion of level 4.0 nonachievement under both systems. RESULTS Milestones 1.0 data comprised 13,866 residents and Milestones 2.0 data comprised 7,633 residents. For both 1.0 and 2.0, all competency domain median ratings were higher for subsequent years of training. Milestones 2.0 had significantly higher median ratings at all PGY levels for all competency domains except Medical Knowledge. Percentages of PGY-5 residents not achieving the graduation target ranged from 27% to 42% in Milestones 1.0 and from 5% to 13% in 2.0. For Milestones 1.0, all subcompetencies showed an increased number of residents achieving the graduation target from 2014 to 2019. CONCLUSIONS This study of General Surgery Milestones 1.0 and 2.0 data uncovered significant increases in average ratings and significantly fewer residents not achieving the graduation target under the 2.0 system.
We hypothesize that these findings may be related more to rating bias given the change in rating scales, rather than a true increase in resident ability.
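The rank-based comparison named in the methods (a Wilcoxon rank-sum test of rating distributions) can be sketched without external dependencies using the normal approximation. The ratings below are simulated, not the study's data, and the variance term omits the tie correction, so this is only a rough illustration:

```python
import math
from statistics import NormalDist

def ranksum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation
    (midranks for tied values; no tie or continuity correction)."""
    combined = sorted(x + y)
    ranks, i = {}, 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1] == combined[i]:
            j += 1                       # group tied values
        ranks[combined[i]] = (i + j) / 2 + 1  # shared midrank
        i = j + 1
    w = sum(ranks[v] for v in x)         # rank sum of the first group
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    return 2 * (1 - NormalDist().cdf(abs(z)))

# hypothetical PGY-5 ratings under Milestones 1.0 vs 2.0
v1 = [3.0, 3.5, 3.5, 4.0, 4.0, 3.0, 3.5, 4.0, 4.5, 3.5]
v2 = [4.0, 4.5, 4.0, 5.0, 4.5, 4.0, 4.5, 4.0, 3.5, 4.5]
p = ranksum_p(v1, v2)
print(round(p, 4))
```

In practice one would use a library implementation (e.g. `scipy.stats.mannwhitneyu`) with exact tie handling rather than this approximation.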
Affiliation(s)
- Alyssa A Pradarelli
- Medical Education Design Lab, Department of Surgery, University of Michigan, Ann Arbor, Michigan; Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Yoon Soo Park
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Michael G Healy
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Roy Phitayakorn
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Emil Petrusa
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
3
Baynouna AlKetbi L, Nagelkerke N, AlZarouni AA, AlKuwaiti MM, AlDhaheri R, AlNeyadi AM, AlAlawi SS, AlKuwaiti MH. Assessing the impact of adopting a competency-based medical education framework and ACGME-I accreditation on educational outcomes in a family medicine residency program in Abu Dhabi Emirate, United Arab Emirates. Front Med (Lausanne) 2024;10:1257213. [PMID: 38259827] [PMCID: PMC10802161] [DOI: 10.3389/fmed.2023.1257213]
Abstract
Background Competency-Based Medical Education (CBME) is now mandated by many graduate and undergraduate accreditation standards. Evaluating CBME is essential for quantifying its impact, finding supporting evidence for the efforts invested in accreditation processes, and determining future steps. The Ambulatory Healthcare Services (AHS) family medicine residency program has been accredited by the Accreditation Council for Graduate Medical Education-International (ACGME-I) since 2013. This study aims to report the Abu Dhabi program's experience in implementing CBME and accreditation. Objectives (1) Compare the performance of the two resident cohorts pre- and post-ACGME-I accreditation; (2) study the biannually reported milestones as a prognostic tool for graduating residents' performance. Methods All residents in the program from 2008 to 2019 were included. Cohort one comprised the 2008-2012 intake, before ACGME-I accreditation; Cohort two comprised the 2013-2019 intake, after accreditation, when the milestones were in use. The mandatory annual in-training exam was used as an indicator of the change in competency between the two cohorts. Within Cohort two, the biannual milestone data were studied to find the correlation between residents' early and graduating milestones. Results A total of 112 residents were included: 36 in Cohort one and 76 in Cohort two. In Cohort one, before ACGME-I accreditation, no significant associations were identified between residents' graduation in-training exam scores and their early performance indicators, while in Cohort two there were significant correlations between almost all performance metrics. Early milestones correlated with the graduation in-training exam score; linear regression confirmed this relationship after controlling for residents' undergraduate grade point average (GPA). Competency development continued even after residents completed training at postgraduate year (PGY) 4, as achievement in PGY5 continued to improve. Conclusion Improved achievement of residents after the introduction of ACGME-I accreditation is evident. Additionally, the correlation of the graduation in-training exam and graduation milestones with earlier milestones suggests a possible use of early milestones in predicting outcomes.
Affiliation(s)
- Nico Nagelkerke
- Community Medicine Department, UAEU, Al Ain, United Arab Emirates
- Amal A. AlZarouni
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Mariam M. AlKuwaiti
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Ruwaya AlDhaheri
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Amna M. AlNeyadi
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Shamma S. AlAlawi
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Mouza H. AlKuwaiti
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
4
Weaver ML, Carter T, Yamazaki K, Hamstra SJ, Holmboe E, Chaer R, Park YS, Smith BK. The Association of ACGME Milestones With Performance on American Board of Surgery Assessments: A National Investigation of Surgical Trainees. Ann Surg 2024;279:180-186. [PMID: 37436889] [DOI: 10.1097/sla.0000000000005998]
Abstract
OBJECTIVE To determine the relationship between, and predictive utility of, milestone ratings and subsequent American Board of Surgery (ABS) vascular surgery in-training examination (VSITE), vascular qualifying examination (VQE), and vascular certifying examination (VCE) performance in a national cohort of vascular surgery trainees. BACKGROUND Specialty board certification is an important indicator of physician competence. However, predicting future board certification examination performance during training continues to be challenging. METHODS This is a national longitudinal cohort study examining relational and predictive associations between Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings and performance on VSITE, VQE, and VCE for all vascular surgery trainees from 2015 to 2021. Predictive associations between milestone ratings and VSITE were assessed using cross-classified random-effects regression. Cross-classified random-effects logistic regression was used to identify predictive associations between milestone ratings and VQE and VCE. RESULTS Milestone ratings were obtained for all residents and fellows (n = 1,118) from 164 programs during the study period (July 2015 to June 2021), comprising 145,959 total trainee assessments. Medical knowledge (MK) and patient care (PC) milestone ratings were strongly predictive of VSITE performance across all postgraduate years (PGYs) of training, with MK ratings demonstrating a slightly stronger predictive association overall (MK coefficient 17.26 to 35.76, β = 0.15 to 0.23). All core competency ratings were predictive of VSITE performance in PGYs 4 and 5. PGY 5 MK was highly predictive of VQE performance [OR 4.73 (95% CI, 3.87-5.78), P < 0.001]. PC subcompetencies were also highly predictive of VQE performance in the final year of training [OR 4.14 (95% CI, 3.17-5.41), P < 0.001]. All other competencies were also significantly predictive of first-attempt VQE pass, with ORs of 1.53 and higher. PGY 4 interpersonal and communication skills (ICS) ratings [OR 4.0 (95% CI, 3.06-5.21), P < 0.001] emerged as the strongest predictor of VCE first-attempt pass. Again, all subcompetency ratings remained significant predictors of first-attempt VCE pass, with ORs of 1.48 and higher. CONCLUSIONS ACGME Milestone ratings are highly predictive of future VSITE performance and of first-attempt pass achievement on the VQE and VCE in a national cohort of surgical trainees.
Affiliation(s)
- M Libby Weaver
- Division of Vascular and Endovascular Surgery, University of Virginia, Charlottesville, VA
- Taylor Carter
- Department of Surgery, University of North Carolina, Chapel Hill, NC
- Kenji Yamazaki
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Stanley J Hamstra
- Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, IL
- Eric Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Rabih Chaer
- Division of Vascular Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA
- Yoon Soo Park
- Massachusetts General Hospital, Harvard Medical School, Boston, MA
- Brigitte K Smith
- Division of Vascular Surgery, University of Utah, Salt Lake City, UT
5
Kendrick DE, Thelen AE, Chen X, Gupta T, Yamazaki K, Krumm AE, Bandeh-Ahmadi H, Clark M, Luckoscki J, Fan Z, Wnuk GM, Ryan AM, Mukherjee B, Hamstra SJ, Dimick JB, Holmboe ES, George BC. Association of Surgical Resident Competency Ratings With Patient Outcomes. Acad Med 2023;98:813-820. [PMID: 36724304] [DOI: 10.1097/acm.0000000000005157]
Abstract
PURPOSE Accurate assessment of clinical performance is essential to ensure graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestones competency ratings and their subsequent early career clinical outcomes has not been established. It is important to examine the association between milestones competency ratings of U.S. general surgical residents and those surgeons' patient outcomes in early career practice. METHOD A retrospective, cross-sectional study was conducted using a sample of national Medicare claims for 23 common, high-risk inpatient general surgical procedures performed between July 1, 2015, and November 30, 2018 (n = 12,400 cases) by nonfellowship-trained U.S. general surgeons. Milestone ratings collected during those surgeons' last year of residency (n = 701 residents) were compared with their risk-adjusted rates of mortality, any complication, or severe complication within 30 days of index operation during their first 2 years of practice. RESULTS There were no associations between mean milestone competency ratings of graduating general surgery residents and their subsequent early career patient outcomes, including any complication (23% proficient vs 22% not yet proficient; relative risk [RR], 0.97, [95% CI, 0.88-1.08]); severe complication (9% vs 9%, respectively; RR, 1.01, [95% CI, 0.86-1.19]); and mortality (5% vs 5%; RR, 1.07, [95% CI, 0.88-1.30]). Secondary analyses yielded no associations between patient outcomes and milestone ratings specific to technical performance, or between patient outcomes and composites of operative performance, professionalism, or leadership milestones ratings ( P ranged .32-.97). 
CONCLUSIONS Milestone ratings of graduating general surgery residents were not associated with the patient outcomes of those surgeons when they performed common, higher-risk procedures in a Medicare population. Efforts to improve how milestone ratings are generated might strengthen their association with early career outcomes.
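The comparisons above are reported as relative risks with 95% confidence intervals. As a rough illustration only, a minimal sketch of an unadjusted relative risk with a Katz log-normal confidence interval (the study's estimates were risk-adjusted; the counts here are hypothetical):

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Unadjusted relative risk of two event proportions (a/n1 vs b/n2)
    with a log-normal (Katz) 95% confidence interval."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# hypothetical complication counts by milestone group
rr, lo, hi = relative_risk(230, 1000, 237, 1077)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval spanning 1.0, as in this hypothetical example, corresponds to the null associations the study reports.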
Affiliation(s)
- Daniel E Kendrick
- D.E. Kendrick is assistant professor, Department of Surgery, University of Minnesota, Minneapolis, Minnesota
- Angela E Thelen
- A.E. Thelen is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Xilin Chen
- X. Chen is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Tanvi Gupta
- T. Gupta is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Kenji Yamazaki
- K. Yamazaki is senior data analyst, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Andrew E Krumm
- A.E. Krumm is assistant professor, Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- Hoda Bandeh-Ahmadi
- H. Bandeh-Ahmadi is project manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Michael Clark
- M. Clark is a biostatistician, Consulting for Statistics, Computing, and Analytics Research, University of Michigan, Ann Arbor, Michigan
- John Luckoscki
- J. Luckoscki is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Zhaohui Fan
- Z. Fan is research analyst, Center for Healthcare Outcomes and Policy, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Greg M Wnuk
- G.M. Wnuk is program manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Andrew M Ryan
- A.M. Ryan is professor, Department of Health Management and Policy, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Bhramar Mukherjee
- B. Mukherjee is professor and chair, Division of Biostatistics, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Stanley J Hamstra
- S.J. Hamstra is professor, Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Justin B Dimick
- J.B. Dimick is professor and chair, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Eric S Holmboe
- E.S. Holmboe is chief research, milestone development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Brian C George
- B.C. George is director, Center for Surgical Training and Research, and assistant professor, Department of Surgery, University of Michigan, Ann Arbor, Michigan
6
Shirkhodaie C, Avila S, Seidel H, Gibbons RD, Arora VM, Farnan JM. The Association Between USMLE Step 2 Clinical Knowledge Scores and Residency Performance: A Systematic Review and Meta-Analysis. Acad Med 2023;98:264-273. [PMID: 36512984] [DOI: 10.1097/acm.0000000000005061]
Abstract
PURPOSE With the change in Step 1 score reporting, Step 2 Clinical Knowledge (CK) may become a pivotal factor in resident selection. This systematic review and meta-analysis seeks to synthesize existing observational studies that assess the relationship between Step 2 CK scores and measures of resident performance. METHOD The authors searched MEDLINE, Web of Science, and Scopus databases using terms related to Step 2 CK in 2021. Two researchers identified studies investigating the association between Step 2 CK and measures of resident performance and included studies if they contained a bivariate analysis examining Step 2 CK scores' association with an outcome of interest: in-training examination (ITE) scores, board certification examination scores, select Accreditation Council for Graduate Medical Education core competency assessments, overall resident performance evaluations, or other subjective measures of performance. For outcomes that were investigated by 3 or more studies, pooled effect sizes were estimated with random-effects models. RESULTS Among 1,355 potential studies, 68 met inclusion criteria and 43 were able to be pooled. There was a moderate positive correlation between Step 2 CK and ITE scores (0.52, 95% CI 0.45-0.59, P < .01). There was a moderate positive correlation between Step 2 CK and ITE scores for both nonsurgical (0.59, 95% CI 0.51-0.66, P < .01) and surgical specialties (0.41, 95% CI 0.33-0.48, P < .01). There was a very weak positive correlation between Step 2 CK scores and subjective measures of resident performance (0.19, 95% CI 0.13-0.25, P < .01). CONCLUSIONS This study found Step 2 CK scores have a statistically significant moderate positive association with future examination scores and a statistically significant weak positive correlation with subjective measures of resident performance. These findings are increasingly relevant as Step 2 CK scores will likely become more important in resident selection.
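Pooling correlations with a random-effects model, as described in the methods above, is typically done by Fisher z-transforming each study's correlation and combining with DerSimonian-Laird weights. A minimal sketch with hypothetical study values (not the review's data):

```python
import math

def pool_correlations(rs, ns):
    """DerSimonian-Laird random-effects pooling of Pearson correlations
    via the Fisher z transform; returns (pooled r, CI low, CI high)."""
    zs = [math.atanh(r) for r in rs]
    vs = [1 / (n - 3) for n in ns]            # within-study variance of z
    w = [1 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)  # between-study variance
    w_re = [1 / (v + tau2) for v in vs]       # random-effects weights
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return tuple(math.tanh(z_re + s * 1.96 * se) for s in (0, -1, 1))

# hypothetical per-study correlations and sample sizes
r, lo, hi = pool_correlations([0.45, 0.55, 0.50, 0.60], [80, 120, 60, 150])
print(f"pooled r = {r:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

When the heterogeneity statistic Q is below its degrees of freedom, tau-squared truncates to zero and the estimate coincides with the fixed-effect pooling.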
Affiliation(s)
- Camron Shirkhodaie
- C. Shirkhodaie is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4279-3251
- Santiago Avila
- S. Avila is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-3633-4304
- Henry Seidel
- H. Seidel is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7364-1365
- Robert D Gibbons
- R.D. Gibbons is professor, Center for Health Statistics and Departments of Medicine and Public Health Sciences, University of Chicago, Chicago, Illinois
- Vineet M Arora
- V.M. Arora is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4745-7599
- Jeanne M Farnan
- J.M. Farnan is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-1138-9416
7
Maranich AM, Hemmer PA, Uijtdehaage S, Battista A. ACGME Milestones in the Real World: A Qualitative Study Exploring Response Process Evidence. J Grad Med Educ 2022;14:201-209. [PMID: 35463179] [PMCID: PMC9017262] [DOI: 10.4300/jgme-d-21-00546.1]
Abstract
BACKGROUND Since the Accreditation Council for Graduate Medical Education (ACGME) introduced the Milestones in 2013, the body of validity evidence supporting their use has grown, but there is a gap with regard to response process. OBJECTIVE The purpose of this study is to qualitatively explore validity evidence pertaining to the response process of individual Clinical Competency Committee (CCC) members when assigning Milestone ratings to a resident. METHODS Using a constructivist paradigm, we conducted a thematic analysis of semi-structured interviews with 8 Transitional Year (TY) CCC members from 4 programs immediately following a CCC meeting between November and December 2020. Participants were queried about their response process in their application of Milestone assessment. Analysis was iterative, including coding, constant comparison, and theming. RESULTS Participant interviews identified an absence of formal training and a perception that Milestones are a tool for resident assessment without recognizing their role in program evaluation. In describing their thought process, participants reported comparing averaged assessment data to peers and time in training to generate Milestone ratings. Meaningful narrative comments, when available, differentiated resident performance from peers. When assessment data were absent, participants assumed an average performance. CONCLUSIONS Our study found that the response process used by TY CCC members was not always consistent with the dual purpose of the Milestones to improve educational outcomes at the levels of residents and the program.
Affiliation(s)
All authors are with the Uniformed Services University.
- Ashley M. Maranich, MD, is Assistant Dean for Clinical Sciences and Associate Professor of Pediatrics
- Paul A. Hemmer, MD, MPH, is Professor of Medicine and Professor of Health Professions Education
- Sebastian Uijtdehaage, PhD, is Professor of Medicine and Professor of Health Professions Education, and Associate Editor, Journal of Graduate Medical Education
- Alexis Battista, PhD, is Assistant Professor of Medicine
8
Abstract
Introduction: "Traditional teaching" models often fail to engage millennial residents. Multiple modern didactic methods have been employed. The most frequently used objective measure to assess the effectiveness of didactic formats has been American Board of Surgery In-Training Examination (ABSITE) performance. Methods: A literature search was conducted in PubMed, EMBASE, and JAMA Network from June 2011 to June 2021, in accordance with the PRISMA guidelines. Searches were performed for the terms "ABSITE" and "American Board of Surgery In-Training Examination." Only studies discussing didactic structures were included. Results: A final 16 studies were included. Modern methods such as the "flipped classroom," team-based learning (TBL), and "gamification" have all shown increased engagement and significantly improved ABSITE performance. Structured biostatistics reviews may be used to supplement research and statistics content, which is often missed by other resources. Discussion: Programs have a duty to promote excellent resident education. In addition to fostering individual study habits, didactics and program structures should be optimized for resident development. Rather than focusing on the sheer amount of scheduled protected time, programs may instead consider the quality of the didactic format used; modern didactic methods may be beneficial.
Affiliation(s)
- David Ray Velez
- Department of Surgery, University of North Dakota School of Medicine & Health Sciences, Grand Forks, ND, USA
9
Velez DR, Johnson SW, Sticca RP. How to Prepare for the American Board of Surgery In-Training Examination (ABSITE): A Systematic Review. J Surg Educ 2022;79:216-228. [PMID: 34429278] [DOI: 10.1016/j.jsurg.2021.08.004]
Abstract
INTRODUCTION Performance on the ABSITE is an important factor when monitoring resident progress; it predicts future performance and has lasting effects. Understanding the highest-yield preparation strategies can help residents focus their study efforts and optimize performance. METHODS A literature search was conducted in PubMed, EMBASE, and JAMA Network in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Searches were performed for the terms "ABSITE" and "American Board of Surgery In-Training Examination". Only studies discussing individual study habits from May 2011 to May 2021 were included. RESULTS Nineteen studies were included in the qualitative synthesis. Year-round clinical study showed no significant correlation with ABSITE performance, although year-round ABSITE review was more consistently correlated. During a dedicated study period, increased time and increased total practice questions completed were associated with improved performance. Individual resources such as ABSITE review books, textbooks, audio podcasts, and ABSITE preparatory courses were not demonstrated to improve ABSITE performance. CONCLUSIONS Residents should optimize study strategies based on methods that have consistently been shown to improve performance. Recommendations for best preparation strategies are provided.
Affiliation(s)
- David Ray Velez
- University of North Dakota School of Medicine and Health Sciences, Department of Surgery, Grand Forks, North Dakota
- Stefan Walter Johnson
- University of North Dakota School of Medicine and Health Sciences, Department of Surgery, Grand Forks, North Dakota
- Robert Peter Sticca
- University of North Dakota School of Medicine and Health Sciences, Department of Surgery, Grand Forks, North Dakota
10
Panda N, Bahdila D, Abdullah A, Ghosh AJ, Lee SY, Feldman WB. Association Between USMLE Step 1 Scores and In-Training Examination Performance: A Meta-Analysis. Acad Med 2021;96:1742-1754. [PMID: 34323860] [DOI: 10.1097/acm.0000000000004227]
Abstract
PURPOSE On February 12, 2020, the sponsors of the United States Medical Licensing Examination announced that Step 1 will transition to pass/fail scoring in 2022. Step 1 performance has historically carried substantial weight in the evaluation of residency applicants and as a predictor of subsequent subject-specific medical knowledge. Using a systematic review and meta-analysis, the authors sought to determine the association between Step 1 scores and in-training examination (ITE) performance, which is often used to assess knowledge acquisition during residency. METHOD The authors systematically searched Medline, EMBASE, and Web of Science for observational studies published from 1992 through May 10, 2020. Observational studies reporting associations between Step 1 and ITE scores, regardless of medical or surgical specialty, were eligible for inclusion. Pairs of researchers screened all studies, evaluated quality assessment using a modified Newcastle-Ottawa Scale, and extracted data in a standardized fashion. The primary endpoint was the correlation of Step 1 and ITE scores. RESULTS Of 1,432 observational studies identified, 49 were systematically reviewed and 37 were included in the meta-analysis. Overall study quality was low to moderate. The pooled estimate of the correlation coefficient was 0.42 (95% confidence interval [CI]: 0.36, 0.48; P < .001), suggesting a weak-to-moderate positive correlation between Step 1 and ITE scores. The random-effects meta-regression found the association between Step 1 and ITE scores was weaker for surgical (versus medical) specialties (beta -0.25 [95% CI: -0.41, -0.09; P = .003]) and fellowship (versus residency) training programs (beta -0.25 [95% CI: -0.47, -0.03; P = .030]). CONCLUSIONS The authors identified a weak-to-moderate positive correlation between Step 1 and ITE scores based on a meta-analysis of low-to-moderate quality observational data. 
With Step 1 scoring transitioning to pass/fail, the undergraduate and graduate medical education communities should continue to develop better tools for evaluating medical students.
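The pooled correlation reported above (r = 0.42) is obtained by combining study-level correlations on Fisher's z scale and back-transforming. A minimal sketch of that pooling step, using simplified fixed-effect inverse-variance weights (the meta-analysis itself fit a random-effects model, and the study correlations and sample sizes below are hypothetical):

```python
import math

def pool_correlations(rs, ns):
    """Pool Pearson correlations across studies via Fisher's z
    transformation, with fixed-effect weights w_i = n_i - 3."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]  # Fisher z
    ws = [n - 3 for n in ns]                              # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)  # weighted mean on z scale
    # back-transform the pooled z to a correlation coefficient
    return (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)

# Three hypothetical studies: correlations and sample sizes
print(round(pool_correlations([0.35, 0.45, 0.50], [120, 80, 60]), 3))
```

A random-effects pooling, as used in the paper, would additionally estimate a between-study variance and add it to each study's weight denominator.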
Affiliation(s)
- Nikhil Panda: clinical fellow of surgery, Massachusetts General Hospital and Harvard Medical School, and postdoctoral researcher, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- Dania Bahdila: doctoral candidate, Department of Oral Health Policy and Epidemiology, Harvard School of Dental Medicine, Boston, Massachusetts, and Department of Preventive Dental Sciences, Faculty of Dentistry, King Abdulaziz University, Jeddah, Saudi Arabia
- Abeer Abdullah: doctoral candidate, Department of Oral Health Policy and Epidemiology, Harvard School of Dental Medicine, Boston, Massachusetts, and Department of Preventive Dental Sciences, Faculty of Dentistry, King Abdulaziz University, Jeddah, Saudi Arabia
- Auyon J Ghosh: clinical fellow of medicine and postdoctoral researcher, Division of Pulmonary and Critical Care Medicine, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts
- Sun Yeop Lee: research assistant, Department of Epidemiology, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- William B Feldman: associate physician and research fellow, Division of Pulmonary and Critical Care Medicine and the Program On Regulation, Therapeutics, And Law (PORTAL), Division of Pharmacoepidemiology and Pharmacoeconomics, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts
11
Velez DR. Prospective Factors that Predict American Board of Surgery In-Training Examination Performance: A Systematic Review. Am Surg 2021; 87:1867-1878. [PMID: 34763542 DOI: 10.1177/00031348211058626] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023]
Abstract
INTRODUCTION American Board of Surgery In-Training Examination (ABSITE) performance has become an important factor when monitoring resident progress. Understanding which prospective factors predict performance can help identify residents at risk. METHODS A literature search of PubMed, EMBASE, and JAMA Network was conducted from June 2011 to June 2021, in accordance with the PRISMA guidelines. Searches were performed for the terms "ABSITE" and "American Board of Surgery In-Training Examination." Prospective factors such as prior examination performance, clinical evaluations, and demographics were evaluated. RESULTS A final 35 studies were included. The prospective factor most consistently found to predict ABSITE performance was performance on prior knowledge-based examinations such as the USMLE Step exams. The ACGME Medical Knowledge 1 milestone evaluation also appears to correlate with ABSITE performance, although clinical evaluations in general do not. Demographics have no significant correlation with ABSITE performance. DISCUSSION Using performance on prior knowledge-based examinations, programs may be able to identify residents at risk of failing the ABSITE, making it possible to initiate early intervention before poor performance rather than relying only on remediation after it.
Affiliation(s)
- David R Velez: Department of Surgery, University of North Dakota School of Medicine & Health Sciences, Grand Forks, ND, USA
12
Cassidy DJ, Chakraborty S, Panda N, McKinley SK, Mansur A, Hamdi I, Mullen J, Petrusa E, Phitayakorn R, Gee D. The Surgical Knowledge "Growth Curve": Predicting ABSITE Scores and Identifying "At-Risk" Residents. JOURNAL OF SURGICAL EDUCATION 2021; 78:50-59. [PMID: 32694087 DOI: 10.1016/j.jsurg.2020.06.038] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/23/2020] [Revised: 06/09/2020] [Accepted: 06/28/2020] [Indexed: 06/11/2023]
Abstract
OBJECTIVE Resident performance on the American Board of Surgery In-Training Examination (ABSITE) is used for evaluation of surgical knowledge and guides resident selection for institutional remediation programs. Remediation thresholds have historically been based on ABSITE percentile scores; however, this does not account for predictors that can impact a resident's exam performance. We sought to identify predictors of yearly ABSITE performance to help identify residents "at-risk" for performing below their expected growth trajectory. DESIGN The knowledge of the residents, as measured by standardized ABSITE scores, was modeled as a function of the corresponding postgraduate year via a linear mixed effects regression model. Additional model covariates included written USMLE-1-3 examination scores, gender, number of practice questions completed, and percentage correct of practice questions. For each resident, the predicted ABSITE standard score along with a 95% bootstrap prediction interval was obtained. Both resident-specific and population-level predictions for ABSITE standard scores were also estimated. SETTING The study was conducted at a single, large academic medical center (Massachusetts General Hospital, Boston, MA). PARTICIPANTS Six years of general surgery resident score reports at a single institution between 2014 and 2019 were deidentified and analyzed. RESULTS A total of 376 score reports from 130 residents were analyzed. Covariates that had a significant effect on the model included USMLE-1 score (PGY1: p = 0.013; PGY2: p = 0.007; PGY3: p = 0.011), USMLE-2 score (PGY1: p < 0.001; PGY2: p < 0.001; PGY3: p < 0.001; PGY4: p < 0.001; PGY5: p = 0.032), male gender (PGY1: p = 0.003; PGY2: p < 0.001; PGY3: p < 0.001; PGY4: p = 0.008), and number of practice questions completed (p=0.003). Five residents were identified as having "fallen off" their predicted knowledge curve, including a single resident on 2 occasions. 
Population prediction curves were obtained at 7 different covariate percentile levels (5%, 10%, 25%, 50%, 75%, 90%, and 95%) that could be used to plot predicted resident knowledge progress. CONCLUSION Performance on USMLE-1 and -2 examinations, male gender, and number of practice questions completed were positive predictors of ABSITE performance. Creating residency-wide knowledge growth curves as well as individualized predictive ABSITE performance models allows for more efficient identification of residents potentially at risk for poor ABSITE performance and structured monitoring of surgical knowledge progression.
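The growth-curve approach described above, predicting a resident's standardized ABSITE score from postgraduate year with a resident-specific random effect and flagging anyone who scores below a 95% bootstrap prediction interval, can be sketched as follows. All coefficients here are hypothetical placeholders for illustration, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear mixed-model coefficients: intercept, PGY slope,
# resident random-intercept SD, and residual SD (illustrative only).
beta0, beta1, sigma_u, sigma_e = 500.0, 10.0, 15.0, 8.0

def predict_interval(pgy, n_boot=2000):
    """95% bootstrap prediction interval for a new resident's
    standardized ABSITE score at a given postgraduate year."""
    sims = (beta0 + beta1 * pgy
            + rng.normal(0, sigma_u, n_boot)   # resident random intercept
            + rng.normal(0, sigma_e, n_boot))  # residual noise
    return np.percentile(sims, [2.5, 97.5])

lo, hi = predict_interval(pgy=3)
# A resident scoring below the lower bound has "fallen off" the curve
# and could be flagged for structured monitoring or early intervention.
print(f"PGY3 prediction interval: ({lo:.0f}, {hi:.0f})")
```

In practice the coefficients would be estimated from historical score reports (e.g., with a mixed-effects regression including USMLE scores and practice-question covariates, as the study did), and population curves at covariate percentiles follow by evaluating the fitted model at those percentile values.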
Affiliation(s)
- Douglas J Cassidy: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Saptarshi Chakraborty: Department of Epidemiology and Biostatistics, Memorial Sloan Kettering Cancer Center, New York, New York
- Nikhil Panda: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Sophia K McKinley: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Arian Mansur: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Isra Hamdi: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- John Mullen: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Emil Petrusa: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Roy Phitayakorn: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Denise Gee: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
13
Goshtasbi K, Abouzari M, Tjoa T, Malekzadeh S, Bhandarkar ND. The Effects of Pass/Fail USMLE Step 1 Scoring on the Otolaryngology Residency Application Process. Laryngoscope 2020; 131:E738-E743. [PMID: 32880975 DOI: 10.1002/lary.29072] [Citation(s) in RCA: 42] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2020] [Revised: 08/04/2020] [Accepted: 08/13/2020] [Indexed: 11/10/2022]
Abstract
OBJECTIVES To investigate how the decision to report United States Medical Licensing Examination (USMLE) Step 1 score as pass/fail will influence future otolaryngology residency application and match processes. STUDY DESIGN Survey study. METHODS An anonymous and voluntary survey approved by the Otolaryngology Program Directors Organization was administered to academic faculty members from April 24, 2020 through May 19, 2020. RESULTS Two hundred fifty-seven surveys were received from department chairs (17.5%), program directors (24.1%), associate program directors (12.5%), and department faculty (45.9%). USMLE Step 1 has been the most heavily weighted metric for offering interviews (44.0%), and it has correlated with residents' medical knowledge (77.0%) and in-service performance (79.8%) but not with surgical skills (57.6%) or patient care (47.1%). In total, 68.1% disagreed with the decision to make USMLE Step 1 pass/fail. This change is anticipated to lead to an increase in significance of USMLE Step 2 CK (89.1%), core clerkship grades (80.9%), elective rotation at the respective institutions (65.7%), Alpha Omega Alpha and other awards (64.6%), and letters of recommendation (63.8%). The new scoring is also anticipated to especially benefit students from top-ranked schools (70.8%), increase medical students' anxiety/uncertainty regarding obtaining interview invites (59.1%), and negatively affect international (51.4%), doctor of osteopathic medicine (45.9%), and underrepresented students (36.9%). Indication that USMLE Step 2 CK will significantly increase in weight varied according to department position (P = .049), geographic region (P = .047), years of practice (P < .001), and residency program size (P = .002). CONCLUSION Most academic otolaryngologists disagreed with changing USMLE Step 1 scoring to pass/fail and believe that it will increase other objective/subjective metrics' weight and put certain student populations at a disadvantage. LEVEL OF EVIDENCE N/A. 
Affiliation(s)
- Khodayar Goshtasbi: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California, U.S.A.
- Mehdi Abouzari: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California, U.S.A.
- Tjoson Tjoa: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California, U.S.A.
- Sonya Malekzadeh: Department of Otolaryngology-Head and Neck Surgery, Georgetown University Medical Center, Washington, District of Columbia, U.S.A.
- Naveen D Bhandarkar: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California, U.S.A.