1
Nowicki KD, Balboni IM, Cidon MJ, Dhanrajani AD, Driest KD, Fair DC, Imundo LF, Mehta JJ, Tarvin SE, Walters HM, Woolnough LC, Edgar LC, Curran ML. Assessing Pediatric Rheumatology Fellow Competence in the Milestone Era: Past, Present, and Future. Arthritis Care Res (Hoboken) 2024; 76:600-607. [PMID: 38108087] [DOI: 10.1002/acr.25276]
Abstract
Starting in 2015, pediatric rheumatology fellowship training programs were required by the Accreditation Council for Graduate Medical Education to assess fellows' academic performance within 21 subcompetencies falling under six competency domains. Each subcompetency had four or five milestone levels describing developmental progression of knowledge and skill acquisition. Milestones were standardized across all pediatric subspecialties. As part of the Milestones 2.0 revision project, the Accreditation Council for Graduate Medical Education convened a workgroup in 2022 to write pediatric rheumatology-specific milestones. Using adult rheumatology's Milestones 2.0 as a starting point, the workgroup revised the patient care and medical knowledge subcompetencies and milestones to reflect the requirements and nuances of pediatric rheumatology care. Milestones within the four remaining competency domains (professionalism, interpersonal and communication skills, practice-based learning and improvement, and systems-based practice) remained standardized across all pediatric subspecialties and were therefore not revised. The workgroup created a supplemental guide with explanations of the intent of each subcompetency, 25 in total, and examples for each milestone level. The new milestones are an important step forward for competency-based medical education in pediatric rheumatology. However, challenges remain. Milestone level assignment is meant to be informed by the results of multiple assessment methods, but the lack of pediatric rheumatology-specific assessment tools typically results in clinical competency committees determining trainee milestone levels without such collated results as the foundation of their assessments. Although further advances in pediatric rheumatology fellowship competency-based medical education are needed, Milestones 2.0 establishes the first pediatric rheumatology-specific Milestones to assess fellow performance during training and to help measure readiness for independent practice.
Affiliation(s)
- Katherine D Nowicki
- University of Colorado, Denver, and Children's Hospital Colorado, Aurora, Colorado
- Michal J Cidon
- Children's Hospital Los Angeles, Los Angeles, California
- Anita D Dhanrajani
- The University of Mississippi Medical Center, Jackson, Mississippi, Tulane University School of Medicine, New Orleans, and Children's Hospital of New Orleans, New Orleans, Louisiana
- Kyla D Driest
- Nationwide Children's Hospital and The Ohio State University, Columbus
- Lisa F Imundo
- Columbia University Medical Center, New York City, New York
- Jay J Mehta
- Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Stacey E Tarvin
- Riley Hospital for Children at Indiana University, Indianapolis
- Heather M Walters
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, and Cohen Children's Medical Center of New York, New Hyde Park
- Laura C Edgar
- Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Megan L Curran
- University of Colorado, Denver, and Children's Hospital Colorado, Aurora, Colorado
2
Pradarelli AA, Park YS, Healy MG, Phitayakorn R, Petrusa E. National Profile of the ACGME Milestones 1.0 and 2.0 Within General Surgery: A Seven-Year National Study From 2014 to 2021. J Surg Educ 2024; 81:626-638. [PMID: 38555246] [DOI: 10.1016/j.jsurg.2024.01.016]
Abstract
PURPOSE The Accreditation Council for Graduate Medical Education (ACGME) introduced General Surgery Milestones 1.0 in 2014 and Milestones 2.0 in 2020 as steps toward competency-based training. Analysis of these ratings can inform residency programs' curriculum development, assessment, feedback, and faculty development. This study describes the distributions and trends of Milestones 1.0 and 2.0 ratings and the proportion of residents not achieving the level 4.0 graduation target. METHODS A deidentified dataset of milestone ratings for all ACGME-accredited General Surgery residency programs in the United States was used. Medians and interquartile ranges (IQR) were reported for milestone ratings at each PGY level. Percentages of PGY-5s receiving final-year ratings of less than 4.0 were calculated. Wilcoxon rank sum tests were used to compare 1.0 and 2.0 median ratings. Kruskal-Wallis and Bonferroni post-hoc tests were used to compare median ratings across time periods and PGY levels. Chi-squared tests were used to compare the proportion of level 4.0 nonachievement under both systems. RESULTS Milestones 1.0 data consisted of 13,866 residents and Milestones 2.0 data consisted of 7,633 residents. For both 1.0 and 2.0, median ratings in all competency domains were higher in each subsequent year of training. Milestones 2.0 had significantly higher median ratings at all PGY levels for all competency domains except Medical Knowledge. Percentages of PGY-5 residents not achieving the graduation target ranged from 27% to 42% in Milestones 1.0 and from 5% to 13% in 2.0. For Milestones 1.0, all subcompetencies showed an increased number of residents achieving the graduation target from 2014 to 2019. CONCLUSIONS This study of General Surgery Milestones 1.0 and 2.0 data uncovered significant increases in average ratings and significantly fewer residents not achieving the graduation target under the 2.0 system. We hypothesize that these findings may reflect rating bias associated with the change in rating scales rather than a true increase in resident ability.
Affiliation(s)
- Alyssa A Pradarelli
- Medical Education Design Lab, Department of Surgery, University of Michigan, Ann Arbor, Michigan; Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts.
- Yoon Soo Park
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Michael G Healy
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Roy Phitayakorn
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Emil Petrusa
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
3
Weaver ML, Carter T, Yamazaki K, Hamstra SJ, Holmboe E, Chaer R, Park YS, Smith BK. The Association of ACGME Milestones With Performance on American Board of Surgery Assessments: A National Investigation of Surgical Trainees. Ann Surg 2024; 279:180-186. [PMID: 37436889] [DOI: 10.1097/sla.0000000000005998]
Abstract
OBJECTIVE To determine the relationship between, and predictive utility of, milestone ratings and subsequent American Board of Surgery (ABS) vascular surgery in-training examination (VSITE), vascular qualifying examination (VQE), and vascular certifying examination (VCE) performance in a national cohort of vascular surgery trainees. BACKGROUND Specialty board certification is an important indicator of physician competence. However, predicting future board certification examination performance during training continues to be challenging. METHODS This is a national longitudinal cohort study examining relational and predictive associations between Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings and performance on the VSITE, VQE, and VCE for all vascular surgery trainees from 2015 to 2021. Predictive associations between milestone ratings and VSITE performance were estimated using cross-classified random-effects regression. Cross-classified random-effects logistic regression was used to identify predictive associations between milestone ratings and the VQE and VCE. RESULTS Milestone ratings were obtained for all residents and fellows (n = 1,118) from 164 programs during the study period (July 2015 to June 2021), comprising 145,959 total trainee assessments. Medical knowledge (MK) and patient care (PC) milestone ratings were strongly predictive of VSITE performance across all postgraduate years (PGYs) of training, with MK ratings demonstrating a slightly stronger predictive association overall (MK coefficient 17.26 to 35.76, β = 0.15 to 0.23). All core competency ratings were predictive of VSITE performance in PGYs 4 and 5. PGY 5 MK was highly predictive of VQE performance [OR 4.73 (95% CI, 3.87-5.78), P < 0.001]. PC subcompetencies were also highly predictive of VQE performance in the final year of training [OR 4.14 (95% CI, 3.17-5.41), P < 0.001]. All other competencies were significantly predictive of first-attempt VQE pass, with ORs of 1.53 and higher. PGY 4 interpersonal and communication skills (ICS) ratings [OR 4.0 (95% CI, 3.06-5.21), P < 0.001] emerged as the strongest predictor of VCE first-attempt pass. Again, all subcompetency ratings remained significant predictors of first-attempt pass on the VCE, with ORs of 1.48 and higher. CONCLUSIONS ACGME Milestone ratings are highly predictive of future VSITE performance and of first-attempt pass on the VQE and VCE in a national cohort of surgical trainees.
Affiliation(s)
- M Libby Weaver
- Division of Vascular and Endovascular Surgery, University of Virginia, Charlottesville, VA
- Taylor Carter
- Department of Surgery, University of North Carolina, Chapel Hill, NC
- Kenji Yamazaki
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Stanley J Hamstra
- Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, IL
- Eric Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Rabih Chaer
- Division of Vascular Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA
- Yoon Soo Park
- Massachusetts General Hospital, Harvard Medical School, Boston, MA
- Brigitte K Smith
- Division of Vascular Surgery, University of Utah, Salt Lake City, UT
4
Kendrick DE, Thelen AE, Chen X, Gupta T, Yamazaki K, Krumm AE, Bandeh-Ahmadi H, Clark M, Luckoscki J, Fan Z, Wnuk GM, Ryan AM, Mukherjee B, Hamstra SJ, Dimick JB, Holmboe ES, George BC. Association of Surgical Resident Competency Ratings With Patient Outcomes. Acad Med 2023; 98:813-820. [PMID: 36724304] [DOI: 10.1097/acm.0000000000005157]
Abstract
PURPOSE Accurate assessment of clinical performance is essential to ensure graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestone competency ratings and their subsequent early career clinical outcomes has not been established. This study examined the association between milestone competency ratings of U.S. general surgery residents and those surgeons' patient outcomes in early career practice. METHOD A retrospective, cross-sectional study was conducted using a sample of national Medicare claims for 23 common, high-risk inpatient general surgical procedures performed between July 1, 2015, and November 30, 2018 (n = 12,400 cases) by non-fellowship-trained U.S. general surgeons. Milestone ratings collected during those surgeons' last year of residency (n = 701 residents) were compared with their risk-adjusted rates of mortality, any complication, or severe complication within 30 days of the index operation during their first 2 years of practice. RESULTS There were no associations between mean milestone competency ratings of graduating general surgery residents and their subsequent early career patient outcomes, including any complication (23% proficient vs 22% not yet proficient; relative risk [RR], 0.97 [95% CI, 0.88-1.08]); severe complication (9% vs 9%; RR, 1.01 [95% CI, 0.86-1.19]); and mortality (5% vs 5%; RR, 1.07 [95% CI, 0.88-1.30]). Secondary analyses yielded no associations between patient outcomes and milestone ratings specific to technical performance, or between patient outcomes and composites of operative performance, professionalism, or leadership milestone ratings (P ranged from .32 to .97). CONCLUSIONS Milestone ratings of graduating general surgery residents were not associated with the patient outcomes of those surgeons when they performed common, higher-risk procedures in a Medicare population. Efforts to improve how milestone ratings are generated might strengthen their association with early career outcomes.
Affiliation(s)
- Daniel E Kendrick
- D.E. Kendrick is assistant professor, Department of Surgery, University of Minnesota, Minneapolis, Minnesota
- Angela E Thelen
- A.E. Thelen is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Xilin Chen
- X. Chen is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Tanvi Gupta
- T. Gupta is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Kenji Yamazaki
- K. Yamazaki is senior data analyst, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Andrew E Krumm
- A.E. Krumm is assistant professor, Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- Hoda Bandeh-Ahmadi
- H. Bandeh-Ahmadi is project manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Michael Clark
- M. Clark is a biostatistician, Consulting for Statistics, Computing, and Analytics Research, University of Michigan, Ann Arbor, Michigan
- John Luckoscki
- J. Luckoscki is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Zhaohui Fan
- Z. Fan is research analyst, Center for Healthcare Outcomes and Policy, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Greg M Wnuk
- G.M. Wnuk is program manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Andrew M Ryan
- A.M. Ryan is professor, Department of Health Management and Policy, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Bhramar Mukherjee
- B. Mukherjee is professor and chair, Division of Biostatistics, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Stanley J Hamstra
- S.J. Hamstra is professor, Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Justin B Dimick
- J.B. Dimick is professor and chair, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Eric S Holmboe
- E.S. Holmboe is chief research, milestone development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Brian C George
- B.C. George is director, Center for Surgical Training and Research, and assistant professor, Department of Surgery, University of Michigan, Ann Arbor, Michigan
5
Smith BK, Yamazaki K, Tekian A, Holmboe E, Hamstra SJ, Mitchell EL, Park YS. The Use of Learning Analytics to Enable Detection of Underperforming Trainees: An Analysis of National Vascular Surgery Trainee ACGME Milestones Assessment Data. Ann Surg 2023; 277:e971-e977. [PMID: 35129524] [DOI: 10.1097/sla.0000000000005243]
Abstract
OBJECTIVE This study investigates whether at-risk scores derived from semiannual Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings predict vascular surgery trainees' final achievement of competency targets. SUMMARY BACKGROUND DATA ACGME Milestone assessments have been collected for vascular surgery since 2015. It is unclear whether milestone ratings throughout training predict achievement of recommended performance targets upon graduation. METHODS National ACGME Milestones data were utilized for analyses. All trainees completing 2-year vascular surgery fellowships in June 2018 and 5-year integrated vascular surgery residencies in June 2019 were included. A generalized estimating equations model was used to obtain at-risk scores for each of the 31 subcompetencies by semiannual review period, estimating the probability of trainees achieving the recommended graduation target based on their previous ratings. RESULTS A total of 122 vascular surgery fellows (VSFs) (95.3%) and 52 integrated vascular surgery residents (IVSRs) (100%) were included. Across subcompetencies, 1.6% to 25.4% of VSFs and IVSRs did not achieve the level 4.0 competency target, with no significant difference between the 2 groups for any subcompetency (P = 0.161-0.999). Trainees were at greater risk of not achieving competency targets when lower milestone ratings were assigned and when those ratings occurred at later time points in training. At a milestone rating of 2.5, with 1 year remaining before graduation, the at-risk score for not achieving the target level 4.0 milestone ranged from 2.9% to 77.9% for VSFs and 33.3% to 75.0% for IVSRs. CONCLUSION The ACGME Milestones provide early diagnostic and predictive information for vascular surgery trainees' achievement of competence at completion of training.
Affiliation(s)
- Brigitte K Smith
- Division of Vascular Surgery, Department of Surgery, University of Utah, School of Medicine, Salt Lake City, UT
- Kenji Yamazaki
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Ara Tekian
- Department of Medical Education, University of Illinois at Chicago, Chicago, IL
- Eric Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Stanley J Hamstra
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Erica L Mitchell
- Department of Surgery, University of Tennessee Health Science Center, Vascular and Endovascular Surgery, Regional One Health Medical Center, Memphis, TN
- Yoon Soo Park
- Massachusetts General Hospital, Harvard Medical School, Boston, MA
6
Competency Assessment in Physical Medicine and Rehabilitation Resident Education: A Systematic Review. Am J Phys Med Rehabil 2022; 101:1111-1116. [PMID: 35121682] [DOI: 10.1097/phm.0000000000001983]
Abstract
OBJECTIVE The aim of this systematic review was to examine the scope and quality of research in physical medicine and rehabilitation resident education as it pertains to the six core competencies defined by the Accreditation Council for Graduate Medical Education. DESIGN All indexed years of Medline, Embase, and ERIC were searched using key words related to physical medicine and rehabilitation and medical education. Data were extracted on core competencies, content categories, teaching interventions, and study quality. RESULTS From a sample of 2544 articles, 62 studies were included in this review. Frequencies of core competencies studied were: patient care 62.9%, medical knowledge 56.5%, systems-based practice 22.6%, practice-based learning and improvement 14.5%, professionalism 25.8%, and interpersonal and communication skills 22.6%. Musculoskeletal and pain medicine was the most frequently studied content category (33.9%). There was no significant difference in quality of studies between the six core competency groups (P = 0.31). CONCLUSIONS Available research is highly concentrated in patient care and medical knowledge competencies and in the musculoskeletal and pain medicine content category. This systematic review outlines the current state of education literature and highlights areas for further inquiry. This is an important step toward the translation of research into evidence-based educational practices.
7
Yamazaki K, Holmboe ES, Hamstra SJ. An Empirical Investigation Into Milestones Factor Structure Using National Data Derived From Clinical Competency Committees. Acad Med 2022; 97:569-576. [PMID: 34192718] [DOI: 10.1097/acm.0000000000004218]
Abstract
PURPOSE To investigate whether milestone data obtained from clinical competency committee (CCC) ratings in a single specialty reflected the 6 general competency domains framework. METHOD The authors examined milestone ratings from all 275 U.S. Accreditation Council for Graduate Medical Education-accredited categorical obstetrics and gynecology (OBGYN) programs from July 1, 2018, to June 30, 2019. The sample size ranged from 1,371 to 1,438 residents from 275 programs across 4 postgraduate years (PGYs), each with 2 assessment periods. The OBGYN milestones reporting form consisted of 28 subcompetencies under the 6 general competency domains. Milestone ratings were determined by each program's CCC. Intraclass correlations (ICCs) and design effects were calculated for each subcompetency by PGY and assessment period. A multilevel confirmatory factor analysis (CFA) perspective was used, and the pooled within-program covariance matrix was obtained to compare the fit of the 6-domain factor model against 3 other plausible models. RESULTS Milestone ratings from 5,618 OBGYN residents were examined. Moderate to high ICCs and design effects greater than 2.0 were prevalent among all subcompetencies for both assessment periods, warranting the use of the multilevel approach in applying CFA to the milestone data. The theory-aided split-patient care (PC) factor model, which used the 6 general competency domains but also included 3 factors within the PC domain (obstetric technical skills, gynecology technical skills, and ambulatory care), was consistently the best-fitting model across all PGY-by-assessment-period conditions except one. CONCLUSIONS The findings indicate that in addition to using the 6 general competency domains framework in their rating process, CCCs may have further distinguished the PC competency domain into 3 meaningful factors. This study provides internal structure validity evidence for the milestones within a single specialty and may shed light on CCCs' understanding of the distinctive content embedded within the milestones.
Affiliation(s)
- Kenji Yamazaki
- K. Yamazaki is senior analyst, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7039-4717
- Eric S Holmboe
- E.S. Holmboe is chief research, milestone development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- Stanley J Hamstra
- S.J. Hamstra is research consultant, Accreditation Council for Graduate Medical Education, Chicago, Illinois, professor, Department of Surgery, University of Toronto, Toronto, Ontario, Canada, and adjunct professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-0680-366X
8
Hamstra SJ, Yamazaki K. A Validity Framework for Effective Analysis and Interpretation of Milestones Data. J Grad Med Educ 2021; 13:75-80. [PMID: 33936537] [PMCID: PMC8078069] [DOI: 10.4300/jgme-d-20-01039.1]
Affiliation(s)
- Stanley J. Hamstra
- At the time of research, Stanley J. Hamstra, PhD, was Vice President, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education (ACGME), and is now Professor, Department of Surgery, University of Toronto, Adjunct Professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, and Research Consultant, ACGME
- Kenji Yamazaki
- Kenji Yamazaki, PhD, is Senior Analyst, Milestones Research and Evaluation, ACGME