1. Drake CB, Rhee DW, Panigrahy N, Heery L, Iturrate E, Stern DT, Sartori DJ. Toward precision medical education: Characterizing individual residents' clinical experiences throughout training. J Hosp Med 2024. [PMID: 39103985] [DOI: 10.1002/jhm.13471]
Abstract
BACKGROUND Despite the central role of experiential learning in residency training, the actual clinical experiences residents participate in are not well characterized. A better understanding of the type, volume, and variation in residents' clinical experiences is essential to support precision medical education strategies. OBJECTIVE We sought to characterize the full range of clinical experiences of individual internal medicine residents throughout their time in training. METHODS We evaluated the clinical experiences of medicine residents (n = 51) who completed training at NYU Grossman School of Medicine's Brooklyn campus between 2020 and 2023. Residents' inpatient and outpatient experiences were identified using notes written, orders placed, and care team sign-ins; the principal ICD-10 code for each encounter was converted into a medical content category using a previously described crosswalk tool. RESULTS Of 152,426 clinical encounters with available ICD-10 codes, 132,284 were mapped to medical content categories (94.5% capture). Residents' clinical experiences were particularly enriched in infectious and cardiovascular disease; most had very little exposure to allergy, dermatology, oncology, or rheumatology. Some trainees saw twice as many cases in a given content area as did others. There was little concordance between the actual frequency of clinical experience and the expected content frequency on the ABIM certification exam. CONCLUSIONS Individual residents' clinical experiences in training vary widely, both in number and in type. Characterizing these experiences paves the way for exploring the relationships between clinical exposure and educational outcomes, and for implementing precision education strategies that fill residents' experiential gaps and complement strengths with targeted educational interventions.
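The crosswalk approach described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' actual tool: the `CROSSWALK` dictionary, its category names, and the prefix-based matching are all hypothetical stand-ins for the published crosswalk.

```python
# Illustrative sketch: map each encounter's principal ICD-10 code to a medical
# content category via a (hypothetical) crosswalk, then tally exposure counts
# per resident. Unmapped codes are excluded, mirroring the partial capture
# reported in the abstract.
from collections import Counter

# Hypothetical crosswalk from ICD-10 chapter prefixes to content categories
CROSSWALK = {
    "A": "infectious disease", "B": "infectious disease",
    "I": "cardiovascular disease",
    "L": "dermatology",
    "M": "rheumatology",
}

def categorize(icd10_code):
    """Return the content category for an ICD-10 code, or None if unmapped."""
    return CROSSWALK.get(icd10_code[:1])

def exposure_profile(encounters):
    """Count mapped encounters per content category for one resident."""
    counts = Counter()
    for code in encounters:
        category = categorize(code)
        if category is not None:
            counts[category] += 1
    return counts

profile = exposure_profile(["A41.9", "I50.9", "I21.4", "B18.2", "Z00.0"])
print(profile)
```

In a real analysis the crosswalk would map full codes (or code ranges) rather than single-letter chapter prefixes, but the tally-and-compare structure is the same.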
Affiliation(s)
- Carolyn B Drake
- Division of Hospital Medicine, Department of Medicine, Internal Medicine Residency Program, NYU Grossman School of Medicine, New York, New York, USA
- David W Rhee
- Leon H. Charney Division of Cardiology, Department of Medicine, NYU Grossman School of Medicine, New York, New York, USA
- Neha Panigrahy
- NYU Grossman School of Medicine, New York, New York, USA
- Lauren Heery
- NYU Grossman School of Medicine, New York, New York, USA
- Eduardo Iturrate
- Division of Hospital Medicine, Department of Medicine, DataCore, Enterprise Research Informatics and Epic Analytics, NYU Grossman School of Medicine, New York, New York, USA
- David T Stern
- Department of Medicine, Education and Faculty Affairs, NYU Grossman School of Medicine, New York, New York, USA
- Margaret Cochran Corbin VA Medical Center, New York, New York, USA
- Daniel J Sartori
- Division of Hospital Medicine, Department of Medicine, Internal Medicine Residency Program, NYU Grossman School of Medicine, New York, New York, USA
2. Plewa MC, Ledrick DJ, Jenkins K, Orqvist A, McCrea M. Can USMLE and COMLEX-USA Scores Predict At-Risk Emergency Medicine Residents' Performance on In-Training Examinations? Cureus 2024; 16:e58684. [PMID: 38651085] [PMCID: PMC11033967] [DOI: 10.7759/cureus.58684]
Abstract
PURPOSE The United States Medical Licensing Examination (USMLE) and Comprehensive Osteopathic Medical Licensing Examination (COMLEX) scores are standard methods used to assess residency candidates' medical knowledge. The authors were interested in using USMLE and COMLEX part 2 scores in our emergency medicine (EM) residency program to identify at-risk residents who may have difficulty on the in-training exam (ITE) and to determine the cutoff values below which an intern could be given an individualized study plan to ensure medical knowledge competency. METHODS The authors abstracted the USMLE and COMLEX part 2 scores and the American Board of Emergency Medicine (ABEM) ITE scores for a cohort of first-year EM residents from graduating years 2010-2022 and converted raw scores to percentiles. Part 2 and ABEM ITE scores were compared with Pearson's correlation, a Bland-Altman analysis of bias and 95% limits of agreement, and ROC analysis to determine the optimal cutoff values for predicting ABEM ITE < 50th percentile and the estimated test characteristics. RESULTS Scores were available for 152 residents, including 93 USMLE and 88 COMLEX exams. The correlations between part 2 scores and ABEM ITE were r = 0.36 (95% CI: 0.17, 0.52; p < 0.001) for USMLE and r = 0.50 (95% CI: 0.33, 0.64; p < 0.001) for COMLEX. Bias and limits of agreement in predicting ABEM ITE scores were -14 ± 63% for USMLE and 13 ± 50% for COMLEX. USMLE < 37th percentile and COMLEX < 53rd percentile identified 42% (N = 39) and 27% (N = 24) of EM residents, respectively, as at risk, with sensitivities of 61% and 49% and specificities of 71% and 92%, respectively. CONCLUSION USMLE and COMLEX part 2 scores have a very limited role in identifying those at risk of low ITE performance, suggesting that other factors should be considered to identify interns in need of medical knowledge remediation.
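The two agreement statistics this study relies on can be computed directly from paired percentile scores. A minimal sketch, assuming the percentile lists are already in hand (the sample data here are made up, not the study's):

```python
# Pearson's r and Bland-Altman bias / 95% limits of agreement for two sets of
# paired scores, computed from first principles with the standard library.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of paired samples x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def bland_altman(x, y):
    """Bias (mean difference) and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

The ROC cutoff search would then sweep candidate thresholds over the part 2 percentiles and pick the one maximizing, e.g., Youden's index against the ITE < 50th percentile outcome.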
Affiliation(s)
- Michael C Plewa
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- David J Ledrick
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- Kenneth Jenkins
- Emergency Medicine, Ohio University Heritage College of Osteopathic Medicine, Athens, USA
- Aaron Orqvist
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- Michael McCrea
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
3. Seaberg PH, Kling JM, Klanderman MC, Mead-Harvey C, Williams KE, Labonte HR, Jain A, Taylor GE, Blair JE. Resident factors associated with American Board of Internal Medicine certification exam failure. Med Educ Online 2023; 28:2152162. [PMID: 36443907] [PMCID: PMC9718560] [DOI: 10.1080/10872981.2022.2152162]
Abstract
INTRODUCTION Performance on certifying examinations such as the American Board of Internal Medicine Certification Exam (ABIM-CE) is of great interest to residents and their residency programs. Identification of factors associated with certification exam results may allow residency programs to recognize and intervene for residents at risk of failing. Despite this, residency programs have few evidence-based predictors of certification exam outcome, and the change to pass-or-fail score reporting of the United States Medical Licensing Examination (USMLE) Step 1 removes one such predictor. MATERIALS AND METHODS We performed a retrospective study of residents from a medium-sized internal medicine residency program who graduated from 1998 through 2017, using univariate tests of association between ABIM-CE result and various demographic and scholastic factors. RESULTS Of 166 graduates, 14 (8.4%) failed the ABIM-CE on the first attempt. Failing the first attempt of the ABIM-CE was associated with older median age on entering residency (29 vs 27 years; P = 0.01); lower percentile rank on the Internal Medicine In-Training Examination (IM-ITE) in each of the first, second, and third years of training (P < 0.001 for all); and lower scores on USMLE Steps 1, 2 Clinical Knowledge, and 3 (P < 0.05 for all). No association was seen between a variety of other scholastic or demographic factors and first-attempt ABIM-CE result. DISCUSSION Although USMLE Step 1 has changed to a pass-or-fail reporting structure, other characteristics remain that allow residency programs to identify residents at risk of first-time ABIM-CE failure who may benefit from intervention.
Affiliation(s)
- Preston H. Seaberg
- Department of Internal Medicine Charleston Division, West Virginia University School of Medicine, Charleston, West Virginia, USA
- Juliana M. Kling
- Division of Women’s Health Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Molly C. Klanderman
- Division of Clinical Trials and Biostatistics, Mayo Clinic, Scottsdale, Arizona, USA
- Carolyn Mead-Harvey
- Division of Clinical Trials and Biostatistics, Mayo Clinic, Scottsdale, Arizona, USA
- Helene R. Labonte
- Division of Community Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Atul Jain
- Division of General Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Gretchen E. Taylor
- Division of Hospital Internal Medicine, Mayo Clinic, Phoenix, Arizona, USA
- Janis E. Blair
- Division of Infectious Diseases, Mayo Clinic, Phoenix, Arizona, USA
4. Fujihashi A, Patel OU, Yadav I, Burge K, Haynes W, Zaniewski R, Van Wagoner N, Grant MB. Ophthalmology Residency Program Director Survey on Pass/Fail U.S. Medical Licensing Exam Step 1 Scoring. J Acad Ophthalmol 2023; 15:e243-e247. [PMID: 38021032] [PMCID: PMC10645543] [DOI: 10.1055/s-0043-1771034]
Abstract
Background Beginning January 26, 2022, the U.S. Medical Licensing Exam (USMLE) Step 1 changed from a numerical score to pass/fail (P/F). The purpose of this study was to determine the perspective of ophthalmology program directors regarding this change in evaluating applicants. Methods After institutional review board approval, a survey was sent out to program directors of all 125 ophthalmology programs accredited by the Accreditation Council for Graduate Medical Education. Survey questions asked for program demographics, the utility of USMLE Step 1 and 2 Clinical Knowledge scores in assessing applicants, and the importance of 16 different applicant metrics before and after Step 1 becomes P/F. The metrics examined were: letters of recommendation; clerkship grades; class ranking; Alpha Omega Alpha Membership; Gold Humanism Honor Society Membership; Dean's Letter; involvement and leadership; personal statement; number of abstracts, presentations, and publications; mean number of research experiences in the specialty; Step 2 Clinical Knowledge score; volunteering; preclinical grades; away rotation in the specialty; the applicant having another graduate degree; and graduation from a top 40 National Institutes of Health-funded program. Data were analyzed using nonoverlapping 95% confidence intervals. Results The survey was completed by 50 (40%) program directors. Sixty-eight percent of respondents stated a student's ranking would be considered more after USMLE Step 1 scores become P/F, and 60% stated medical schools should share clerkship shelf exam scores with residency programs. There were no significant differences in program directors' rankings of applicant metrics following the transition to P/F Step 1. Conclusion Based on our data, program directors will likely not place a greater emphasis on Step 2 scores, despite it being the only remaining objective measure for all applicants following the switch to a P/F Step 1. 
Nevertheless, program directors expressed an interest in receiving other objective measures, such as shelf exam scores and class ranking, as part of the application process. Notably, we found no significant changes in the rankings of various applicant metrics before and after the transition to P/F Step 1, indicating that the metrics that were important to program directors prior to the change remain just as critical in the new era of admissions.
Affiliation(s)
- Ayaka Fujihashi
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Om U. Patel
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Ishant Yadav
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Kaitlin Burge
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- William Haynes
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Ryan Zaniewski
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Nicholas Van Wagoner
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Department of Medicine, Division of Infectious Diseases, University of Alabama at Birmingham, Birmingham, Alabama
- Maria B. Grant
- Marnix E. Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama
- Department of Ophthalmology and Visual Sciences, University of Alabama at Birmingham, Birmingham, Alabama
5. Shirkhodaie C, Avila S, Seidel H, Gibbons RD, Arora VM, Farnan JM. The Association Between USMLE Step 2 Clinical Knowledge Scores and Residency Performance: A Systematic Review and Meta-Analysis. Acad Med 2023; 98:264-273. [DOI: 10.1097/acm.0000000000005061]
Abstract
PURPOSE With the change in Step 1 score reporting, Step 2 Clinical Knowledge (CK) may become a pivotal factor in resident selection. This systematic review and meta-analysis seeks to synthesize existing observational studies that assess the relationship between Step 2 CK scores and measures of resident performance. METHOD The authors searched MEDLINE, Web of Science, and Scopus databases using terms related to Step 2 CK in 2021. Two researchers identified studies investigating the association between Step 2 CK and measures of resident performance and included studies if they contained a bivariate analysis examining Step 2 CK scores' association with an outcome of interest: in-training examination (ITE) scores, board certification examination scores, select Accreditation Council for Graduate Medical Education core competency assessments, overall resident performance evaluations, or other subjective measures of performance. For outcomes that were investigated by 3 or more studies, pooled effect sizes were estimated with random-effects models. RESULTS Among 1,355 potential studies, 68 met inclusion criteria and 43 were able to be pooled. There was a moderate positive correlation between Step 2 CK and ITE scores (0.52, 95% CI 0.45-0.59, P < .01). There was a moderate positive correlation between Step 2 CK and ITE scores for both nonsurgical (0.59, 95% CI 0.51-0.66, P < .01) and surgical specialties (0.41, 95% CI 0.33-0.48, P < .01). There was a very weak positive correlation between Step 2 CK scores and subjective measures of resident performance (0.19, 95% CI 0.13-0.25, P < .01). CONCLUSIONS This study found Step 2 CK scores have a statistically significant moderate positive association with future examination scores and a statistically significant weak positive correlation with subjective measures of resident performance. These findings are increasingly relevant as Step 2 CK scores will likely become more important in resident selection.
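Pooling correlation coefficients with a random-effects model is typically done on the Fisher z scale. A hedged sketch of that procedure (DerSimonian-Laird heterogeneity estimate), illustrative only and not the authors' exact code; the input correlations and sample sizes are placeholders:

```python
# Random-effects pooling of correlation coefficients via the Fisher z transform.
import math

def pool_correlations(rs, ns):
    """Pool per-study correlations rs with sample sizes ns; return pooled r."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]        # Fisher z
    vs = [1.0 / (n - 3) for n in ns]                            # within-study variance
    w = [1.0 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)                    # DerSimonian-Laird tau^2
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)                                      # back-transform to r

pooled = pool_correlations([0.45, 0.55, 0.60], [120, 200, 80])
```

Working on the z scale keeps the sampling variance approximately independent of the true correlation; the pooled z is back-transformed to r at the end.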
Affiliation(s)
- Camron Shirkhodaie
- C. Shirkhodaie is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4279-3251
- Santiago Avila
- S. Avila is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-3633-4304
- Henry Seidel
- H. Seidel is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7364-1365
- Robert D Gibbons
- R.D. Gibbons is professor, Center for Health Statistics and Departments of Medicine and Public Health Sciences, University of Chicago, Chicago, Illinois
- Vineet M Arora
- V.M. Arora is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4745-7599
- Jeanne M Farnan
- J.M. Farnan is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-1138-9416
6. Morgan DE. Use of Attending Radiologist Reviews of Resident Clinical Performance to Predict Outcomes on the American Board of Radiology Qualifying (Core) Exam: A Call to Action. Acad Radiol 2022; 29:1727-1729. [PMID: 36050263] [DOI: 10.1016/j.acra.2022.07.024]
Affiliation(s)
- Desiree E Morgan
- University of Alabama at Birmingham, Department of Radiology, JTN456, 619 South 19th Street, Birmingham, AL 35249.
7. Best Practices for Remediation in Pulmonary and Critical Care Medicine Fellowship Training. ATS Sch 2022; 3:485-500. [PMID: 36312805] [PMCID: PMC9590524] [DOI: 10.34197/ats-scholar.2022-0007re]
Abstract
Background Remediation of struggling learners in pulmonary and critical care fellowship programs is a challenge, even for experienced medical educators. Objective This evidence-based narrative review provides a framework program leaders may use to address fellows having difficulty achieving competency during fellowship training. Methods The relevant evidence for approaches based on each learner’s needs is reviewed and interpreted in the context of fellowship training in pulmonary medicine and critical care. Issues addressed include bias in fellow assessments and remediation, the impacts of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic, the specific challenges of pulmonary and critical care fellowship programs, a brief review of relevant legal issues, guidance on building and leveraging program resources, and a discussion of learner outcomes. Results The review yields a concise, evidence-based toolkit for program leaders built around four pillars: early identification, fellow assessment, collaborative intervention, and reassessment. Important concepts also include the need for documentation, clear and written communication, and fellow-directed approaches to the creation of achievable goals. Conclusion Evidence-based remediation helps struggling learners in pulmonary and critical care fellowship programs improve their ability to meet Accreditation Council for Graduate Medical Education (ACGME) milestones.
8. Klein R, Koch J, Snyder ED, Volerman A, Simon W, Jassal SK, Cosco D, Cioletti A, Ufere NN, Burnett-Bowie SAM, Palamara K, Schaeffer S, Julian KA, Thompson V. Association of Gender and Race/Ethnicity with Internal Medicine In-Training Examination Performance in Graduate Medical Education. J Gen Intern Med 2022; 37:2194-2199. [PMID: 35710653] [PMCID: PMC9296734] [DOI: 10.1007/s11606-022-07597-z]
Abstract
BACKGROUND Disparities in objective assessments in graduate medical education, such as the In-Training Examination (ITE), that disadvantage women and those self-identifying with races/ethnicities underrepresented in medicine (URiM) are of concern. OBJECTIVE To examine ITE trends longitudinally across post-graduate year (PGY) by gender and race/ethnicity. DESIGN Longitudinal analysis of resident ITE metrics at 7 internal medicine residency programs, 2014-2019. ITE trends across PGY for women and URiM residents were compared to those of non-URiM men via ANOVA. Residents with ITE scores associated with less than a 90% probability of passing the American Board of Internal Medicine certification exam (ABIM-CE) were identified, and the odds of being identified as at risk were compared between groups with chi-square tests. PARTICIPANTS A total of 689 IM residents, including 330 women and URiM residents (48%). MAIN MEASURES ITE score. KEY RESULTS There was a significant difference in ITE score across PGY for women and URiM residents compared to non-URiM men (F(2, 1321) = 4.46, p = 0.011). Adjusting for program, calendar year, and baseline ITE, women and URiM residents had smaller ITE score gains (adjusted mean change in score between PGY1 and PGY3 (SE): non-URiM men 13.1 (0.25) vs women and URiM residents 11.4 (0.28), p < 0.001). Women and URiM residents had greater odds of being at potential risk for not passing the ABIM-CE (OR 1.75, 95% CI 1.10 to 2.78), with the greatest odds in PGY3 (OR 3.13, 95% CI 1.54 to 6.37). CONCLUSION Differences in ITE performance over training were associated with resident gender and race/ethnicity. Women and URiM residents had smaller ITE score gains across PGY, translating into greater odds of being seen as at risk of not passing the ABIM-CE. These differences may reflect differences in the training experiences of women and URiM residents and may lead to further disparities.
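The odds ratios reported in studies like this one come from a 2x2 table of at-risk vs. not-at-risk counts in each group. A minimal sketch of the standard calculation (Woolf logit method for the confidence interval); the counts in the example are hypothetical, not the study's data:

```python
# Odds ratio and 95% CI from a 2x2 table:
#   group 1: a at-risk, b not at-risk
#   group 2: c at-risk, d not at-risk
import math

def odds_ratio_ci(a, b, c, d):
    """Return (OR, lower, upper) using the log-odds standard error."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - 1.96 * se_log)
    upper = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lower, upper

# Hypothetical counts: 10/20 at risk in one group vs 5/15 in the comparison
result = odds_ratio_ci(10, 10, 5, 10)
```

An interval whose lower bound exceeds 1 (as with the study's OR 1.75, 95% CI 1.10 to 2.78) indicates a statistically significant difference in odds between groups.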
Affiliation(s)
- Robin Klein
- Department of Medicine, Division of General Internal Medicine and Geriatrics, Emory University School of Medicine, 49 Jesse Hill Jr Dr, Atlanta, GA, 30303, USA.
- Jennifer Koch
- Department of Medicine, University of Louisville, Louisville, KY, USA
- Erin D Snyder
- Department of Medicine, University of Alabama Birmingham School of Medicine, Birmingham, AL, USA
- Anna Volerman
- Departments of Medicine and Pediatrics, University of Chicago, Chicago, IL, USA
- Wendy Simon
- Department of Medicine, University of California, Los Angeles, Los Angeles, USA
- Simerjot K Jassal
- Department of Medicine, VA San Diego Healthcare System, University of California, San Diego, San Diego, USA
- Dominique Cosco
- Department of Medicine, Washington University St. Louis, St. Louis, USA
- Anne Cioletti
- Department of Medicine, University of Utah, Salt Lake City, USA
- Nneka N Ufere
- Department of Medicine, Massachusetts General Hospital, Boston, MA, USA
- Kerri Palamara
- Department of Medicine, Massachusetts General Hospital, Boston, MA, USA
- Sarah Schaeffer
- Department of Medicine, University of California, San Francisco, San Francisco, CA, USA
- Katherine A Julian
- Department of Medicine, University of California, San Francisco, San Francisco, CA, USA
- Vanessa Thompson
- Department of Medicine, University of California, San Francisco, San Francisco, CA, USA
9. Jewell C, Kraut A, Miller D, Ray K, Werley E, Schnapp B. Metrics of Resident Achievement for Defining Program Aims. West J Emerg Med 2022; 23:1-8. [PMID: 35060852] [PMCID: PMC8782131] [DOI: 10.5811/westjem.2021.12.53554]
Abstract
Introduction Resident achievement data are a powerful but underutilized means of program evaluation, allowing programs to empirically measure whether they are meeting their program aims, refine curricula, and improve resident recruitment efforts. Our goal was to provide an overview of available metrics of resident achievement and how these metrics can be used to inform program aims. Methods A literature search was performed using PubMed and Google Scholar between May and November of 2020. Publications were eligible for inclusion if they discussed or assessed “excellence” or “success” during residency training. A narrative review structure was chosen because the intent was to examine the literature on available resident achievement metrics. Results Fifty-seven publications met inclusion criteria and were included in the review. Metrics of excellence were grouped into larger categories, including success defined by program factors, academics, national competencies, employer factors, and possible new metrics. Conclusions Programs can best evaluate whether they are meeting their program aims by creating a list of important resident-level metrics based on their stated goals and values, using one or more of the published definitions as a foundation. Each program must define which metrics align best with its individual program aims and mission.
Affiliation(s)
- Corlin Jewell
- University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Aaron Kraut
- University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Danielle Miller
- University of Colorado School of Medicine, Department of Emergency Medicine, Aurora, Colorado
- Kaitlin Ray
- University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Elizabeth Werley
- PennState College of Medicine, Department of Emergency Medicine, Hershey, Pennsylvania
- Benjamin Schnapp
- University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
10. Panda N, Bahdila D, Abdullah A, Ghosh AJ, Lee SY, Feldman WB. Association Between USMLE Step 1 Scores and In-Training Examination Performance: A Meta-Analysis. Acad Med 2021; 96:1742-1754. [PMID: 34323860] [DOI: 10.1097/acm.0000000000004227]
Abstract
PURPOSE On February 12, 2020, the sponsors of the United States Medical Licensing Examination announced that Step 1 will transition to pass/fail scoring in 2022. Step 1 performance has historically carried substantial weight in the evaluation of residency applicants and as a predictor of subsequent subject-specific medical knowledge. Using a systematic review and meta-analysis, the authors sought to determine the association between Step 1 scores and in-training examination (ITE) performance, which is often used to assess knowledge acquisition during residency. METHOD The authors systematically searched Medline, EMBASE, and Web of Science for observational studies published from 1992 through May 10, 2020. Observational studies reporting associations between Step 1 and ITE scores, regardless of medical or surgical specialty, were eligible for inclusion. Pairs of researchers screened all studies, evaluated quality assessment using a modified Newcastle-Ottawa Scale, and extracted data in a standardized fashion. The primary endpoint was the correlation of Step 1 and ITE scores. RESULTS Of 1,432 observational studies identified, 49 were systematically reviewed and 37 were included in the meta-analysis. Overall study quality was low to moderate. The pooled estimate of the correlation coefficient was 0.42 (95% confidence interval [CI]: 0.36, 0.48; P < .001), suggesting a weak-to-moderate positive correlation between Step 1 and ITE scores. The random-effects meta-regression found the association between Step 1 and ITE scores was weaker for surgical (versus medical) specialties (beta -0.25 [95% CI: -0.41, -0.09; P = .003]) and fellowship (versus residency) training programs (beta -0.25 [95% CI: -0.47, -0.03; P = .030]). CONCLUSIONS The authors identified a weak-to-moderate positive correlation between Step 1 and ITE scores based on a meta-analysis of low-to-moderate quality observational data. 
With Step 1 scoring transitioning to pass/fail, the undergraduate and graduate medical education communities should continue to develop better tools for evaluating medical students.
Affiliation(s)
- Nikhil Panda
- N. Panda is a clinical fellow of surgery, Massachusetts General Hospital and Harvard Medical School, and a postdoctoral researcher, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- Dania Bahdila
- D. Bahdila is a doctoral candidate, Department of Oral Health Policy and Epidemiology, Harvard School of Dental Medicine, Boston, Massachusetts, and Department of Preventive Dental Sciences, Faculty of Dentistry, King Abdulaziz University, Jeddah, Saudi Arabia
- Abeer Abdullah
- A. Abdullah is a doctoral candidate, Department of Oral Health Policy and Epidemiology, Harvard School of Dental Medicine, Boston, Massachusetts, and Department of Preventive Dental Sciences, Faculty of Dentistry, King Abdulaziz University, Jeddah, Saudi Arabia
- Auyon J Ghosh
- A.J. Ghosh is a clinical fellow of medicine and postdoctoral researcher, Division of Pulmonary and Critical Care Medicine, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts
- Sun Yeop Lee
- S.Y. Lee is research assistant, Department of Epidemiology, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- William B Feldman
- W.B. Feldman is associate physician and research fellow, Division of Pulmonary and Critical Care Medicine and the Program On Regulation, Therapeutics, And Law (PORTAL), Division of Pharmacoepidemiology and Pharmacoeconomics, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts
11. Novel Strategies for Evaluating and Improving Plastic Surgery Applicant Selection. Plast Reconstr Surg 2021; 148:1040e-1046e. [PMID: 34705807] [DOI: 10.1097/prs.0000000000008572]
Abstract
SUMMARY Residency applicant evaluation and selection is a critical part of developing and maintaining a high-quality plastic surgery residency program. Currently, many programs rely on objective measures such as the United States Medical Licensing Exam scores, number of research publications, grade point average, Alpha Omega Alpha Honor Medical Society status, or a combination of these objective metrics. However, there is a growing body of literature suggesting that the current means of residency applicant evaluation and selection may not be the best predictive factors of future resident success. The aim of this study was to identify nontraditional means of evaluating plastic surgery residency candidates and discuss how these means have been implemented at the authors' institution. After reviewing industry hiring practices, the authors propose that standardized interviewing and personality testing can help evaluate some of the previously intangible parts of an applicant that may play a role in teamwork, commitment, and dedication to patient care.
12
Muganlinskaya N, Mollaeian A, Karpman M. Learning style preferences of internal medicine residents and in-training examination scores: is there a correlation? J Community Hosp Intern Med Perspect 2021; 11:608-611. [PMID: 34567449] [PMCID: PMC8462833] [DOI: 10.1080/20009666.2021.1944018]
Abstract
The internal medicine in-training examination (IM-ITE) has traditionally been used as a tool to evaluate the knowledge base of residents in internal medicine residency programs across the US. Multiple interventions have been applied and studied to increase the first-time ABIM passing rate, as it is an indicator of each residency program's performance and ranking. Additionally, studies have demonstrated that different learning styles and preferences are predictors of exam results; however, it is not well known whether certain preferred learning styles correlate with certain IM-ITE results. The primary objective of our study was to assess the correlation between residents' preferred learning style, based on the Kolb learning style inventory, and the difference between their PGY1 and PGY2 IM-ITE performance scores. The secondary objective was to assess the correlation between PGY2s' IM-ITE scores and their preferred learning styles based on the Kolb learning style inventory. Mean PGY1 and PGY2 IM-ITE scores were compared in each learning style group, as was the mean difference between the PGY1 and PGY2 IM-ITE scores for each group. The analysis of the mean IM-ITE score from PGY1 to PGY2 between groups revealed a statistically significant improvement in all groups, with a larger difference in one of the groups.
Affiliation(s)
- Nargiz Muganlinskaya
- Department of Medicine, Luminis Health Anne Arundel Medical Center, Annapolis, MD, USA
- Arash Mollaeian
- Department of Medicine, Medstar Health Internal Medicine Residency Program, Baltimore, MD, USA
- Mitchell Karpman
- Department of Medicine, Luminis Health Anne Arundel Medical Center, Annapolis, MD, USA
13
Han R, Keith J, Slodkowska E, Nofech-Mozes S, Djordjevic B, Parra-Herran C, Shachar S, Mirkovic J, Sherman C, Hsieh E, Ismiil N, Lu FI. Hot Seat Diagnosis: Competency-Based Tool Is Superior to Time-Based Tool for the Formative In-Service Assessment of Pathology Trainees. Arch Pathol Lab Med 2021; 146:123-131. [PMID: 34133708] [DOI: 10.5858/arpa.2020-0702-ep]
Abstract
CONTEXT.— Competency-based medical education relies on frequent formative in-service assessments to ascertain trainee progression. Currently at our institution, trainees receive a summative end-of-rotation In-Training Evaluation Report based on feedback collected from staff pathologists; there is no method of simulating report sign-out. OBJECTIVE.— To develop a formative in-service assessment tool that can simulate report sign-out and provide case-by-case feedback to trainees, and to compare time- versus competency-based assessment models. DESIGN.— Twenty-one pathology trainees were assessed over 20 months. Hot Seat Diagnoses by trainees and trainee assessments by pathologists were recorded in the Laboratory Information System. In the first iteration, trainees were assessed using a time-based assessment scale on their ability to diagnose, report, use ancillary testing, comment on clinical implications, provide intraoperative consultation, and/or gross cases. The second iteration used a competency-based assessment scale. Trainees and pathologists completed surveys on the effectiveness of the In-Training Evaluation Report versus the Hot Seat Diagnosis tool. RESULTS.— Scores from both iterations correlated significantly with other assessment tools, including the Resident In-Service Examination (r = 0.93, P = .04 and r = 0.87, P = .03). The competency-based model was better able than the time-based model to demonstrate improvement over time and to stratify junior versus senior trainees. Trainees and pathologists rated Hot Seat Diagnosis as significantly more objective, detailed, and timely than the In-Training Evaluation Report, and as effective at simulating report sign-out. CONCLUSIONS.— Hot Seat Diagnosis is an effective tool for the formative in-service assessment of pathology trainees and for the simulation of report sign-out, with the competency-based model outperforming the time-based model.
Affiliation(s)
- Rachel Han
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Julia Keith
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Elzbieta Slodkowska
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Sharon Nofech-Mozes
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Bojana Djordjevic
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Carlos Parra-Herran
- The Department of Pathology, Brigham and Women's Hospital, Boston, Massachusetts (Parra-Herran)
- Sade Shachar
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Jelena Mirkovic
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Christopher Sherman
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
- Eugene Hsieh
- The Department of Pathology, Dynacare, Brampton, Ontario, Canada (Hsieh)
- Nadia Ismiil
- The Department of Pathology, Lakeridge Health Ajax Pickering Hospital, Ajax, Ontario, Canada (Ismiil)
- Fang-I Lu
- From the Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada (Han, Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu).,The Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada (Keith, Slodkowska, Nofech-Mozes, Djordjevic, Shachar, Mirkovic, Sherman, Lu)
14
Mun F, Scott AR, Cui D, Lehman EB, Jeong S, Chisty A, Juliano PJ, Hennrikus WL, Hennrikus EF. A comparison of orthopaedic surgery and internal medicine perceptions of USMLE Step 1 pass/fail scoring. BMC Med Educ 2021; 21:255. [PMID: 33941167] [PMCID: PMC8091716] [DOI: 10.1186/s12909-021-02699-4]
Abstract
BACKGROUND United States Medical Licensing Examination Step 1 will transition from numeric grading to pass/fail sometime after January 2022. The aim of this study was to compare how program directors in orthopaedics and internal medicine perceive that a pass/fail Step 1 will impact the residency application process. METHODS A 27-item survey was distributed through REDCap to 161 U.S. orthopaedic residency program directors and 548 U.S. internal medicine residency program directors. Program director emails were obtained from the American Medical Association's Fellowship and Residency Electronic Interactive Database. RESULTS We received 58 (36.0%) orthopaedic and 125 (22.8%) internal medicine program director responses. The majority of both groups disagreed with the change to pass/fail and felt that the decision was not transparent. Both groups believe that the Step 2 Clinical Knowledge exam and clerkship grades will take on more importance. Compared with internal medicine PDs, orthopaedic PDs were significantly more likely to emphasize research, letters of recommendation from known faculty, Alpha Omega Alpha membership, leadership/extracurricular activities, audition elective rotations, and personal knowledge of the applicant. Both groups believe that allopathic students from less prestigious medical schools, osteopathic students, and international medical graduates will be disadvantaged. Orthopaedic and internal medicine program directors agree that medical schools should adopt a graded pre-clinical curriculum and that there should be a cap on the number of residency applications a student can submit. CONCLUSION Orthopaedic and internal medicine program directors disagree with the change of Step 1 to pass/fail. They also believe that this transition will make the match process more difficult and disadvantage students from less highly regarded medical schools. Both groups will rely more heavily on the Step 2 Clinical Knowledge exam score, but orthopaedics will place more importance on research, letters of recommendation, Alpha Omega Alpha membership, leadership/extracurricular activities, personal knowledge of the applicant, and audition electives.
Affiliation(s)
- Frederick Mun
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA.
- Alyssa R Scott
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- David Cui
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Erik B Lehman
- Public Health Sciences at Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Seongho Jeong
- Department of Orthopaedics and Rehabilitation, Yale School of Medicine, Yale New Haven Hospital, New Haven, CT, USA
- Alia Chisty
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Internal Medicine, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Paul J Juliano
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Orthopaedics and Rehabilitation, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- William L Hennrikus
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Orthopaedics and Rehabilitation, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Eileen F Hennrikus
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Internal Medicine, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
15
Mun F, Scott AR, Cui D, Chisty A, Hennrikus WL, Hennrikus EF. Internal medicine residency program director perceptions of USMLE Step 1 pass/fail scoring: A cross-sectional survey. Medicine (Baltimore) 2021; 100:e25284. [PMID: 33847625] [PMCID: PMC8052063] [DOI: 10.1097/md.0000000000025284]
Abstract
The United States Medical Licensing Examination Step 1 will transition to a pass/fail exam starting no earlier than January 2022, and internal medicine residency programs will need to adapt to this change. The purpose of this study was to investigate (1) internal medicine residency program directors' perceptions of the change of Step 1 to a pass/fail exam and (2) the impact on other factors considered in internal medicine residency selection. A validated REDCap survey was sent to 548 program directors at active Accreditation Council for Graduate Medical Education internal medicine residency programs, using contact information from the American Medical Association's Fellowship and Residency Electronic Interactive Database. The survey had 123 respondents (22.4%). Most internal medicine program directors do not support the pass/fail change. Greater importance will be placed on the Step 2 Clinical Knowledge exam, personal knowledge of the applicant, clerkship grades, and audition electives. Allopathic students from less highly regarded medical schools, as well as osteopathic and international students, will be disadvantaged. About half believe that schools should adopt a graded pre-clinical curriculum (51.2%) and that there should be residency application caps (54.5%). Internal medicine program directors mostly disagree with the pass/fail Step 1 transition. Residency programs will need to reevaluate how applicants are assessed; other factors, such as the Step 2 Clinical Knowledge score, personal knowledge of the applicant, clerkship grades, and audition rotations, will now be emphasized more heavily.
Affiliation(s)
- David Cui
- Pennsylvania State University College of Medicine
- Alia Chisty
- Pennsylvania State University College of Medicine
- Department of Internal Medicine
- William L. Hennrikus
- Pennsylvania State University College of Medicine
- Bone and Joint Institute, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Eileen F. Hennrikus
- Pennsylvania State University College of Medicine
- Department of Internal Medicine
16
Greenky D, Reddy P, George P. Rethinking the Initial Board Certification Exam. Med Sci Educ 2021; 31:889-891. [PMID: 33462556] [PMCID: PMC7806195] [DOI: 10.1007/s40670-021-01209-x]
Affiliation(s)
- David Greenky
- Division of Pediatric Emergency Medicine, Emory University School of Medicine, 1547 Clifton Road, Atlanta, GA 30322 USA
- Pranav Reddy
- Department of Obstetrics and Gynecology, Yale University School of Medicine, 333 Cedar Street, New Haven, CT 06510 USA
- Paul George
- Family Medicine and Medical Science, the Warren Alpert Medical School, Brown University, 222 Richmond St., Providence, RI 02903 USA
17
Ost SR, Wells D, Goedecke PJ, Tolley EA, Kleinman M, Thompson NS. Relationship Between Standardized Test Scores and Board Certification Exams in a Combined Internal Medicine/Pediatrics Residency Program. Cureus 2021; 13:e13567. [PMID: 33815979] [PMCID: PMC8008765] [DOI: 10.7759/cureus.13567]
Abstract
Background Combined Internal Medicine/Pediatrics (Med/Peds) residencies rely on categorical program data to predict pass rates for the American Board of Internal Medicine Certifying Exam (ABIM-CE) and the American Board of Pediatrics Certifying Exam (ABP-CE). There is insufficient literature describing what best predicts a Med/Peds resident passing board exams. In this study, we aimed to determine how standardized test scores predict performance on ABIM-CE and ABP-CE for Med/Peds residents. Methodology We analyzed prior exam scores for 91/96 (95%) residents in a Med/Peds program from 2008 to 2017. Scores from the United States Medical Licensing Examination (USMLE) Steps 1 and 2 Clinical Knowledge (CK) and In-Training Exams in Internal Medicine (ITE-IM) and Pediatrics (ITE-P) were analyzed with the corresponding ABIM-CE and ABP-CE first-time scores. Linear and logistic regression were applied to predict board scores/passage. Results USMLE 1 and 2 CK, ITE-IM, and ITE-P scores had a linear relationship with both ABIM-CE and ABP-CE scores. In the linear regression, adjusted R2 values showed low-to-moderate predictive ability (R2 = 0.11-0.35), with the highest predictor of ABIM-CE and ABP-CE being USMLE Step 1 (0.35) and Postgraduate Year 1 (PGY-1) ITE-IM (0.33), respectively. Logistic regression showed odds ratios of passing board certifications ranging from 1.05 to 1.53 per point increase on the prior exam score. The PGY-3 ITE-IM was the best predictor of passing both certifying exams. Conclusions In one Med/Peds program, USMLE Steps 1 and 2 and all ITE-IM and ITE-P scores predicted certifying exam scores and passage. This provides Med/Peds-specific data to allow individualized resident counseling and guide programmatic improvements targeted to board performance.
Affiliation(s)
- Shelley R Ost
- General Internal Medicine, University of Tennessee Health Science Center College of Medicine, Memphis, USA
- Daniel Wells
- General Pediatrics, University of Tennessee Health Science Center College of Medicine, Memphis, USA
- Patricia J Goedecke
- Preventive Medicine, University of Tennessee Health Science Center College of Medicine, Memphis, USA
- Elizabeth A Tolley
- Preventive Medicine, University of Tennessee Health Science Center College of Medicine, Memphis, USA
- Michael Kleinman
- General Internal Medicine, University of Tennessee Health Science Center College of Medicine, Memphis, USA
- Natascha S Thompson
- General Internal Medicine, University of Tennessee Health Science Center College of Medicine, Memphis, USA
18
Sanchez AN, Martinez CI, Stampas A, Pedroza C, Escalon MX, Silver JK, Frontera J, Verduzco-Gutierrez M. Ethnic and Racial Diversity in Academic Physical Medicine and Rehabilitation Compared with All Other Medical Specialties. Am J Phys Med Rehabil 2021; 100:S12-S16. [PMID: 32487973] [DOI: 10.1097/phm.0000000000001486]
Abstract
The primary aim of this study was to compare ethnic/racial diversity in academic physical medicine and rehabilitation (PM&R) with that in all other medical specialties in academia. The secondary aim was to characterize the ethnic/racial diversity of current PM&R program directors. Self-reported ethnicity/race information was collected from the Association of American Medical Colleges and the Accreditation Council for Graduate Medical Education. Ethnicity/race was categorized as white, Asian, African American, Hispanic, and other. Odds ratios (ORs) and Fisher's exact tests were used to compare ethnic/racial differences at each career level between specialties. In 2017, in PM&R, compared with whites, the odds were decreased by 89% for African Americans (OR, 0.11), 90% for Hispanics (OR, 0.10), 62% for Asians (OR, 0.38), and 73% for other (OR, 0.27) (all P < 0.001). This disparity increased among full professors: 99% (OR, 0.01), 96% (OR, 0.04), 87% (OR, 0.13), and 90% (OR, 0.10), respectively (all P < 0.001). In 2019, most PM&R program directors identified as white (51%), compared with Hispanic (4%) and African American (2%). Overall, representation of ethnic/racial minorities underrepresented in medicine decreased with increasing academic rank. Therefore, more robust initiatives must be implemented to improve the exposure, recruitment, and retention of ethnic/racial underrepresented minorities at all levels of PM&R academia.
Affiliation(s)
- Ashley N Sanchez
- From the McGovern Medical School, The University of Texas Health Science Center at Houston, Houston, Texas (ANS, CIM, CP); Department of Physical Medicine and Rehabilitation, McGovern Medical School, The University of Texas Health Science Center at Houston, Houston, Texas (AS, JF); Department of Rehabilitation, and Human Performance, The Mount Sinai Hospital/Icahn School of Medicine at Mount Sinai, New York, New York (MXE); Department of Physical Medicine and Rehabilitation, Harvard Medical School, Spaulding Rehabilitation Hospital, Massachusetts General Hospital, Brigham and Women's Hospital, Boston, Massachusetts (JKS); and Department of Rehabilitation Medicine, Long School of Medicine at the University of Texas Health Science Center at San Antonio, San Antonio, Texas (MV-G)
19
McCrary HC, Colbert-Getz JM, Poss WB, Smith BK. A Systematic Review of the Relationship Between In-Training Examination Scores and Specialty Board Examination Scores. J Grad Med Educ 2021; 13:43-57. [PMID: 33680301] [PMCID: PMC7901636] [DOI: 10.4300/jgme-d-20-00111.1]
Abstract
BACKGROUND In-training examinations (ITEs) are intended for low-stakes, formative assessment of residents' knowledge, but are increasingly used for high-stakes purposes, such as predicting board examination failures. OBJECTIVE The aim of this review was to investigate the relationship between performance on ITEs and board examination performance across medical specialties. METHODS A search of the literature for studies assessing the strength of the relationship between ITE and board examination performance from January 2000 to March 2019 was completed. Results were categorized based on the type of statistical analysis used to determine the relationship between ITE performance and board examination performance. RESULTS Of 1407 articles initially identified, 89 underwent full-text review and 32 were included in this review. There was a moderate to strong relationship between ITE and board examination performance, and ITE scores significantly predicted board examination scores in the majority of studies. Performing well on an ITE predicts a passing outcome on the board examination, but there is less evidence that performing poorly on an ITE will result in failing the associated specialty board examination. CONCLUSIONS There is a moderate to strong correlation between ITE performance and subsequent performance on board examinations. That the predictive value for passing the board examination is stronger than the predictive value for failing calls into question the "common wisdom" that ITE scores can be used to identify "at risk" residents. The graduate medical education community should continue to exercise caution and restraint in using ITE scores for moderate- to high-stakes decisions.
20
Ali SA, Riaz Q, Mushtaq ZM, Awan S, Tariq M. Low performance of internal medicine senior residents in in-service examinations. Postgrad Med J 2021; 98:246-250. [PMID: 33452159] [DOI: 10.1136/postgradmedj-2020-138476]
Abstract
We observed an unprecedented and consistent low performance of senior residents compared with juniors in monthly examinations. This compelled us to systematically evaluate and compare the scores of senior residents with those of their junior colleagues. This retrospective observational study was conducted in April 2020 among internal medicine residents. Residents in the first and second years of training were labelled junior residents; residents in the third or fourth year were labelled senior residents. Mean scores for each resident level were compared separately for both the monthly formative multiple-choice question tests and the summative end-of-term examinations, and possible reasons were discussed. There were 67 residents in 2018 and 69 in 2019. There was no significant difference between monthly examination scores in 2018 and 2019 among residents of each level. Rather, in March and December 2018, junior residents performed better than senior residents (p = 0.01 and 0.04, respectively), whereas in February and September 2019, senior residents performed better than junior residents (p = 0.02). Similarly, there was no significant difference between resident levels in end-of-term examination scores in 2018 and 2019 (p = 0.18 and 0.25, respectively). The in-service examination performance of senior residents in our residency programme is relatively low compared with that of their junior colleagues, and the reasons for this warrant evaluation.
Affiliation(s)
- Qamar Riaz
- Department of Educational Development, Faculty of Health Sciences, The Aga Khan University, Karachi, Pakistan
- Safia Awan
- Medicine, Aga Khan University, Karachi, Sind, Pakistan
21
Fliotsos MJ, Zafar S, Dharssi S, Srikumaran D, Chow J, Singman EL, Woreta FA. Objective Resident Characteristics Associated with Performance on the Ophthalmic Knowledge Assessment Program Examination. J Acad Ophthalmol 2021; 13:e40-e45. [PMID: 37389170] [PMCID: PMC9928084] [DOI: 10.1055/s-0040-1722311]
Abstract
Background To determine objective resident characteristics that correlate with Ophthalmic Knowledge Assessment Program (OKAP) performance, and to correlate OKAP performance with Accreditation Council for Graduate Medical Education (ACGME) milestone assessments, written qualifying examination (WQE) scores, and oral board pass rates. Methods Review of administrative records at an ACGME-accredited ophthalmology residency training program at an urban, tertiary academic medical center. Results The study included data from a total of 50 resident physicians who completed training from 2012 to 2018. Mean (standard deviation) OKAP percentile performance was 60.90 (27.51), 60.46 (28.12), and 60.55 (27.43) for the Year 1, 2, and 3 examinations, respectively. There were no statistically significant differences based on sex, marital status, having children, MD/PhD degree, other additional degree, number of publications, number of first-author publications, or grades on medical school medicine and surgery rotations. OKAP percentile scores were significantly associated with United States Medical Licensing Examination (USMLE) Step 1 scores (linear regression coefficient 0.88 [0.54-1.18], p = 0.008). Finally, continuous OKAP scores were significantly correlated with WQE (r_s = 0.292, p = 0.049) and oral board (r_s = 0.49, p = 0.001) scores. Conclusion Higher OKAP performance is correlated with passing both the WQE and oral board examinations on the first attempt. USMLE Step 1 score is the preresidency academic factor with the strongest association with success on the OKAP examination. Programs can utilize this information to identify those who may benefit from additional OKAP, WQE, and oral board preparation assistance.
Affiliation(s)
- Michael J. Fliotsos
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Sidra Zafar
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Shazia Dharssi
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Divya Srikumaran
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Jessica Chow
- Department of Ophthalmology & Visual Science, Yale Eye Center, Yale University School of Medicine, New Haven, Connecticut
- Eric L. Singman
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Fasika A. Woreta
- Wilmer Eye Institute, Department of Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland
22
Longyhore DS. Pharmacy residency directors' concerns with implementing in-training examinations in pharmacy residencies. J Am Coll Clin Pharm 2020. [DOI: 10.1002/jac5.1328]
23
Hahn B, Waring ED, Chacko J, Trovato G, Tice A, Greenstein J. Assessment of Written Feedback for Emergency Medicine Residents. South Med J 2020; 113:451-456. [PMID: 32885265] [DOI: 10.14423/smj.0000000000001142]
Abstract
OBJECTIVES An essential component of resident growth is a learning environment with high-quality feedback. Jackson et al developed criteria for characterizing and assessing the quality of written feedback for internal medicine residents. Our primary goal was to describe feedback characteristics and assess the quality of written feedback for emergency medicine (EM) residents. Our secondary goal was to evaluate the relation between feedback quality and objective outcome measures. METHODS This retrospective study was conducted between July 1, 2016 and July 1, 2018. EM residents with an Accreditation Council for Graduate Medical Education composite score (ACS), an in-service score, and written evaluations completed by an attending physician or EM resident in each of the 2 years of the study period were included. RESULTS Overall, most of the evaluations contained 1 (21%), 2 (23%), or 3 (17%) feedback items. Feedback tended to be positive (82%), and the quality of the evaluations was more likely to be high (44%). There was an association between feedback quality and ACS change (P < 0.0001), but not in-service score change (P = 0.63). Resident evaluations were more likely than attending evaluations to correlate with ACS change (P < 0.00001). CONCLUSIONS The written evaluations contained few individual feedback items. Evaluations generally focused on the feedback characteristics of professionalism and interpersonal communication. The overall quality of evaluations tended to be high and correlated with an increase in ACSs.
Affiliation(s)
- Barry Hahn
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
- Elizabeth D Waring
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
- Jerel Chacko
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
- Gabriella Trovato
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
- Amanda Tice
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
- Josh Greenstein
- From the Department of Emergency Medicine, Staten Island University Hospital, Northwell Health, Staten Island, New York
24
Wallach SL, Williams C, Chow RT, Jadhav N, Kuehl S, Raj JM, Alweis R. Internal medicine resident perspectives on scoring USMLE as pass/fail. J Community Hosp Intern Med Perspect 2020; 10:381-385. [PMID: 33235666 PMCID: PMC7671726 DOI: 10.1080/20009666.2020.1796366] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022] Open
Abstract
Background The scoring rubric on the USMLE Step 1 examination will be changing to pass/fail in January 2022. This study elicits internal medicine resident perspectives on USMLE pass/fail scoring at the national level. Objective To assess internal medicine resident opinions regarding USMLE pass/fail scoring and examine how variables such as gender, scores on USMLE Steps 1 and 2, PGY status, and type of medical school are associated with these results. Methods In the fall of 2019, the authors surveyed current internal medicine residents via an online tool distributed through their program directors. Respondents indicated their Step 1 and Step 2 Clinical Knowledge scores from five categorical ranges. Questions on medical school type, year of training, and gender were included. The results were analyzed utilizing Pearson chi-square testing and multivariable logistic regression. Results 4012 residents responded, reflecting 13% of internal medicine residents currently training in the USA. Fifty-five percent of respondents disagreed/strongly disagreed with pass/fail scoring and 34% agreed/strongly agreed. Group-based differences were significant for gender, PGY level, Step 1 score, and medical school type; a higher percentage of males, those training at the PGY-1 level, and graduates of international medical schools (IMGs) disagreed with pass/fail reporting. In addition, high scorers on Step 1 were more likely to disagree with pass/fail reporting than low-scoring residents. Conclusion Our results suggest that a majority of internal medicine residents currently training in the USA prefer that USMLE numerical scoring be retained and not changed to pass/fail.
Affiliation(s)
- Sara L Wallach
- Department of Medicine, St. Francis Medical Center, Trenton, NJ, USA; Department of Medicine, Hackensack Meridian School of Medicine, Nutley, NJ, USA; Medicine, Drexel University College of Medicine, Philadelphia, PA, USA
- Christopher Williams
- Department of Behavioral and Community Health, School of Public Health, University of Maryland, College Park, MD, USA
- Robert T Chow
- Department of Medicine, The University of Maryland Medical Center Midtown Campus, Baltimore, MD, USA; Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
- Nagesh Jadhav
- Department of Medicine, Rochester General Hospital Internal Medicine Residency Program, Rochester, NY, USA; Medicine, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Sapna Kuehl
- Medicine, University of Maryland School of Medicine, Baltimore, MD, USA; Department of Medicine, Ascension Saint Agnes Hospital, Baltimore, MD, USA
- Jaya M Raj
- Department of Medicine, Creighton University School of Medicine-Phoenix, St. Joseph's Hospital and Medical Center, Phoenix, AZ, USA
- Richard Alweis
- Medicine, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA; Education, Rochester Regional Health, Rochester, NY, USA; Health Sciences, Rochester Institute of Technology, Rochester, NY, USA
25
Peterson LE, Boulet JR, Clauser B. Associations Between Medical Education Assessments and American Board of Family Medicine Certification Examination Score and Failure to Obtain Certification. Acad Med 2020; 95:1396-1403. [PMID: 32271228 DOI: 10.1097/acm.0000000000003344] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
PURPOSE Family medicine residency programs can be cited for low pass or take rates on the American Board of Family Medicine (ABFM) certification examination, and the relationships among standardized medical education assessments and performance on board certification examinations and eventual board certification have not been comprehensively studied. The objective of this study was to evaluate the associations of all required standardized examinations in medical education with ABFM certification examination scores and eventual ABFM certification. METHOD All graduates of U.S. MD-granting family medicine residency programs from 2008 to 2012 were included. Data on ABFM certification examination score, ABFM certification status (as of December 31, 2014), Medical College Admission Test (MCAT) section scores, undergraduate grade point average, all United States Medical Licensing Examination (USMLE) Step scores, and all ABFM in-training examination scores were linked. Nested logistic and linear regression models, controlling for clustering by residency program, determined associations between assessments and both certification examination scores and board certification status. As many international medical graduates (IMGs) do not take the MCAT, separate models for U.S. medical graduates (USMGs) and IMGs were run. RESULTS The study sample was 15,902 family medicine graduates, of whom 92.1% (14,648/15,902) obtained board certification. In models for both IMGs and USMGs, the addition of more recent assessments weakened the associations of earlier assessments. USMLE Step 2 Clinical Knowledge was predictive of certification examination scores and certification status in all models in which it was included. CONCLUSIONS For family medicine residents, more recent assessments generally have stronger associations with board certification score and status than earlier assessments. Solely using medical school admissions (grade point average and MCAT) and licensure (USMLE) scores for resident selection may not adequately predict ultimate board certification.
Affiliation(s)
- Lars E Peterson
- L.E. Peterson is vice president of research, American Board of Family Medicine, and associate professor, Department of Family and Community Medicine, University of Kentucky, Lexington, Kentucky; ORCID: http://orcid.org/0000-0003-4853-3108
- John R Boulet
- J.R. Boulet is vice president, Research and Data Resources, Foundation for Advancement of International Medical Education and Research, Philadelphia, Pennsylvania
- Brian Clauser
- B. Clauser is vice president, Center for Advanced Assessment, National Board of Medical Examiners, Philadelphia, Pennsylvania
26
McDonald FS, Jurich D, Duhigg LM, Paniagua M, Chick D, Wells M, Williams A, Alguire P. Correlations Between the USMLE Step Examinations, American College of Physicians In-Training Examination, and ABIM Internal Medicine Certification Examination. Acad Med 2020; 95:1388-1395. [PMID: 32271224 DOI: 10.1097/acm.0000000000003382] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
PURPOSE To assess the correlations between United States Medical Licensing Examination (USMLE) performance, American College of Physicians Internal Medicine In-Training Examination (IM-ITE) performance, American Board of Internal Medicine Internal Medicine Certification Exam (IM-CE) performance, and other medical knowledge and demographic variables. METHOD The study included 9,676 postgraduate year (PGY)-1, 11,424 PGY-2, and 10,239 PGY-3 internal medicine (IM) residents from any Accreditation Council for Graduate Medical Education-accredited IM residency program who took the IM-ITE (2014 or 2015) and the IM-CE (2015-2018). USMLE scores, IM-ITE percent correct scores, and IM-CE scores were analyzed using multiple linear regression, and IM-CE pass/fail status was analyzed using multiple logistic regression, controlling for USMLE Step 1, Step 2 Clinical Knowledge, and Step 3 scores; averaged medical knowledge milestones; age at IM-ITE; gender; and medical school location (United States or Canada vs international). RESULTS All variables were significant predictors of passing the IM-CE with IM-ITE scores having the strongest association and USMLE Step scores being the next strongest predictors. Prediction curves for the probability of passing the IM-CE based solely on IM-ITE score for each PGY show that residents must score higher on the IM-ITE with each subsequent administration to maintain the same estimated probability of passing the IM-CE. CONCLUSIONS The findings from this study should support residents and program directors in their efforts to more precisely identify and evaluate knowledge gaps for both personal learning and program improvement. While no individual USMLE Step score was as strongly predictive of IM-CE score as IM-ITE score, the combined relative contribution of all 3 USMLE Step scores was of a magnitude similar to that of IM-ITE score.
Affiliation(s)
- Furman S McDonald
- F.S. McDonald is senior vice president for academic and medical affairs, American Board of Internal Medicine, Philadelphia, Pennsylvania, adjunct professor of medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota, adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, and clinical associate, J. Edwin Wood Clinic, Pennsylvania Hospital, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-7952-3776
- Daniel Jurich
- D. Jurich is senior psychometrician, National Board of Medical Examiners, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0002-1870-2436
- Lauren M Duhigg
- L.M. Duhigg is senior research associate, American Board of Internal Medicine, Philadelphia, Pennsylvania
- Miguel Paniagua
- M. Paniagua is medical advisor, National Board of Medical Examiners, and adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0003-2307-4873
- Davoren Chick
- D. Chick is senior vice president of medical education, American College of Physicians, and adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: http://orcid.org/0000-0003-4477-1272
- Margaret Wells
- M. Wells is director of assessment and education programs, American College of Physicians, Philadelphia, Pennsylvania
- Amber Williams
- A. Williams is manager, Relationship Development, National Board of Medical Examiners, Philadelphia, Pennsylvania
- Patrick Alguire
- P. Alguire is senior vice president emeritus, medical education, American College of Physicians, Philadelphia, Pennsylvania
27
Huq S, Khalafallah AM, Botros D, Jimenez AE, Lam S, Huang J, Mukherjee D. Perceived impact of USMLE Step 1 pass/fail scoring change on neurosurgery: program director survey. J Neurosurg 2020; 133:928-935. [PMID: 32559749 DOI: 10.3171/2020.4.jns20748] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
28
Cullen MW, Beckman TJ, Baldwin KM, Engstler GJ, Mandrekar J, Scott CG, Klarich KW. Predicting Quality of Clinical Performance From Cardiology Fellowship Applications. Tex Heart Inst J 2020; 47:258-264. [PMID: 33472223 DOI: 10.14503/thij-18-6851] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
Abstract
Variables in cardiology fellowship applications have not been objectively analyzed against applicants' subsequent clinical performance. We investigated possible correlations in a retrospective cohort study of 65 cardiology fellows at the Mayo Clinic (Rochester, Minn) who began 2 years of clinical training from July 2007 through July 2013. Application variables included the strength of comparative statements in recommendation letters and the authors' academic ranks, membership status in the Alpha Omega Alpha Honor Medical Society, awards earned, volunteer activities, United States Medical Licensing Examination (USMLE) scores, advanced degrees, publications, and completion of a residency program ranked in the top 6 in the United States. The outcome was clinical performance as measured by a mean of faculty evaluation scores during clinical training. The overall mean evaluation score was 4.07 ± 0.18 (scale, 1-5). After multivariable analysis, evaluation scores were associated with Alpha Omega Alpha designation (β=0.13; 95% CI, 0.01-0.25; P=0.03), residency program reputation (β=0.13; 95% CI, 0.05-0.21; P=0.004), and strength of comparative statements in recommendation letters (β=0.08; 95% CI, 0.01-0.15; P=0.02), particularly in letters from residency program directors (β=0.05; 95% CI, 0.01-0.08; P=0.009). Objective factors to consider in the cardiology fellowship application include Alpha Omega Alpha membership, residency program reputation, and comparative statements from residency program directors.
Affiliation(s)
- Michael W Cullen
- Department of Cardiovascular Medicine, Mayo Clinic, Rochester, Minnesota 55905
- Thomas J Beckman
- Department of Internal Medicine, Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota 55905
- Kristine M Baldwin
- Department of Cardiovascular Medicine, Mayo Clinic, Rochester, Minnesota 55905
- Gregory J Engstler
- Department of Information Services, Mayo Clinic, Rochester, Minnesota 55905
- Jay Mandrekar
- Department of Health Sciences Research, Division of Biomedical Statistics and Informatics, Mayo Clinic, Rochester, Minnesota 55905
- Christopher G Scott
- Department of Health Sciences Research, Division of Biomedical Statistics and Informatics, Mayo Clinic, Rochester, Minnesota 55905
- Kyle W Klarich
- Department of Cardiovascular Medicine, Mayo Clinic, Rochester, Minnesota 55905
29
Brateanu A, Switzer B, Scott SC, Ramsey J, Thomascik J, Nowacki AS, Colbert CY. Higher Grit Scores Associated With Less Burnout in a Cohort of Internal Medicine Residents. Am J Med Sci 2020; 360:357-362. [PMID: 32631577 DOI: 10.1016/j.amjms.2020.05.045] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/12/2020] [Revised: 04/08/2020] [Accepted: 05/28/2020] [Indexed: 11/30/2022]
Abstract
BACKGROUND The association between grit, defined as perseverance and passion for long-term goals, and professional burnout has not been studied in internal medicine residents. Our objective was to examine whether internal medicine residents' scores on a grit scale were associated with various measures of burnout. METHODS All residents from a single internal medicine program were invited to participate in a study of grit and burnout. Grit and burnout were measured using the Short Grit Scale and modified Maslach Burnout Inventory, respectively. In addition, demographics, last In-Training Examination (ITE) score, and interest in a subspecialty were captured. RESULTS A total of 139 of 168 eligible residents (83%) participated. Emotional exhaustion and depersonalization (i.e., burnout) were identified in 63% and 42% of residents, respectively. Endorsement of emotional exhaustion was higher for residents living with family members, for postgraduate year (PGY)-1 and PGY-2 residents compared with PGY-3 residents, and for residents scoring above the 50th percentile on the last ITE. Grit scores were higher for residents not reporting emotional exhaustion. As grit score increased, the odds of reporting emotional exhaustion significantly decreased, after adjustment for demographics, ITE scores, type of medical school, PGY level, and interest in a subspecialty (odds ratio = 0.36, 95% CI 0.15-0.84). CONCLUSIONS Grit appeared to be an independent predictor of burnout in internal medicine residents in this sample, with lower grit scores associated with higher burnout scores. By measuring grit early in residency, programs can potentially identify residents at risk for symptoms of burnout, specifically emotional exhaustion, and implement targeted interventions.
Affiliation(s)
- Andrei Brateanu
- Department of Internal Medicine, Cleveland Clinic (AB, BS, SCS), Cleveland, Ohio; Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (AB, JR, ASN, CYC), Cleveland, Ohio
- Benjamin Switzer
- Department of Internal Medicine, Cleveland Clinic (AB, BS, SCS), Cleveland, Ohio
- Susan C Scott
- Department of Internal Medicine, Cleveland Clinic (AB, BS, SCS), Cleveland, Ohio
- Jennifer Ramsey
- Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (AB, JR, ASN, CYC), Cleveland, Ohio; Department of Palliative Medicine, Taussig Cancer Institute, Cleveland Clinic (JR), Cleveland, Ohio
- James Thomascik
- Department of Quality, Cleveland Clinic (JT), Cleveland, Ohio
- Amy S Nowacki
- Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (AB, JR, ASN, CYC), Cleveland, Ohio; Department of Quantitative Health Sciences, Cleveland Clinic (ASN), Cleveland, Ohio
- Colleen Y Colbert
- Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (AB, JR, ASN, CYC), Cleveland, Ohio; Education Institute, Cleveland Clinic (CYC), Cleveland, Ohio
30
Separate but Equal? The Sorting of USMDs and Non-USMDs in Internal Medicine Residency Programs. J Gen Intern Med 2020; 35:1458-1464. [PMID: 31823308 PMCID: PMC7210370 DOI: 10.1007/s11606-019-05573-8] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/11/2019] [Revised: 10/03/2019] [Accepted: 11/21/2019] [Indexed: 10/25/2022]
Abstract
BACKGROUND The US internal medicine (IM) workforce relies on international medical graduates (IMGs) and osteopathic medical graduates (DOs) to fill gaps in residency. Little is known about the extent to which IMGs, DOs, and graduates of US MD-granting schools (USMDs) concentrate in different types of IM programs, or about the impact of that concentration. OBJECTIVE To determine the extent to which USMDs, DOs, and IMGs concentrate in different types of IM programs and to compare Board pass rates by program concentration. DESIGN, SETTINGS, AND PARTICIPANTS This survey study used data from the AMA's FREIDA database for 476 non-military IM programs in 2017-2018, and 2016-2018 ABIM exam pass rates for 388 accredited programs. MEASUREMENTS Outcomes were (1) program concentration based on the percentage of residents who were USMDs, IMGs, and DOs in 2017-2018 and (2) 2016-2018 program ABIM pass rates as proxies for program quality. Key independent variables were hospital type (community-based, community-based university-affiliated, or university-based) when program concentration was the outcome, and program concentration when Board pass rates were the outcome. RESULTS Twenty-five percent of programs were "USMD-dominated," 17% were "DO-dominated," 42% were "IMG-dominated," and 16% were "integrated." The chances that a university hospital was USMD-dominated were 32 percentage points higher than those for a community hospital (AME = 0.32, baseline probability = 0.11, 95% CI, 0.17-0.46, P < .001). USMD-dominated programs also had significantly higher pass rates, by 4.0 percentage points (AME = 0.04, baseline proportion = 0.90, 95% CI, 0.02-0.06, P < .001), than integrated programs, while DO-dominated programs had significantly lower pass rates (AME = -0.1, baseline proportion = 0.90, 95% CI, -0.15 to -0.04, P < .001). CONCLUSION USMDs and non-USMDs systematically cluster in certain types of residency programs, and their training may not be equal, as measured by board pass rates.
31
Al-Mohammed A, Al Mohanadi D, Rahil A, Elhiday AH, Al khal A, Suliman S. Evaluation of Progress of an ACGME-International Accredited Residency Program in Qatar. Qatar Med J 2020; 2020:6. [PMID: 32300550 PMCID: PMC7147266 DOI: 10.5339/qmj.2020.6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2019] [Accepted: 10/21/2019] [Indexed: 11/24/2022] Open
Abstract
Background: The American College of Physicians' (ACP) Internal Medicine In-Training Examination (IM-ITE) is designed to evaluate the cognitive knowledge of residents to aid them and program directors in evaluating the training experience. Objective: To determine the impact of the curriculum reform that accompanied Accreditation Council for Graduate Medical Education International (ACGME-I) alignment and accreditation on the internal medicine residency program (IMRP), using residents' performance on the ACP's ITE from 2008 to 2016, and to determine where the IMRP stands in comparison to all ACGME- and ACGME-I-accredited programs. Methods: This is a descriptive study conducted at a hospital-based IMRP in Doha, Qatar from 2008 to 2016. The study population was 1052 residents at all levels of training in the IMRP. The ACP-generated ITE results of all United States and ACGME-I-accredited programs were compared with IM-ITE results in Qatar. These results were expressed as the total program average and the ranking percentile. Results: There was progressive improvement in resident performance in Qatar, as shown by the rise in total average program score from 52% in 2008 to 72% in 2016 and the sharp rise in percentile rank from the 3rd percentile in 2008 to the 93rd percentile in 2016, with a dramatic increase from 2013 to 2014 (from the 32nd to the 73rd percentile), which represents the period of ACGME-I accreditation. None of the factors examined (ethnicity, USMLE, or year of residency) was statistically significant (p > 0.05; standardized coefficients -0.017 to 0.495). There was negligible correlation between USMLE test scores and residents' ITE scores (p = 0.023; Pearson correlation r = 0.097). Conclusion: The initial ACGME-I alignment followed by accreditation, together with a whole-curriculum redesign to a structured, competency-based program starting in 2008, led to an improvement in ITE scores in the IMRP. This conclusion is supported by the fact that residency entry selection criteria did not change over the study period.
32
Rayamajhi S, Dhakal P, Wang L, Rai MP, Shrotriya S. Do USMLE steps, and ITE score predict the American Board of Internal Medicine Certifying Exam results? BMC Med Educ 2020; 20:79. [PMID: 32183789 PMCID: PMC7079442 DOI: 10.1186/s12909-020-1974-3] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/12/2019] [Accepted: 02/20/2020] [Indexed: 05/25/2023]
Abstract
BACKGROUND To evaluate whether United States Medical Licensing Examination (USMLE) Step 1, USMLE Step 2 CK, USMLE Step 3, and residency third-year in-service training exam (ITE) scores predict the results of the American Board of Internal Medicine Certifying Exam (ABIM-CE). METHODS We performed a retrospective review of USMLE Step 1, USMLE Step 2 CK, USMLE Step 3, and third-year residency ITE scores and ABIM-CE results of IM residents at our residency program from 2004 through 2017. Statistical analysis was performed using the Pearson correlation coefficient and logistic regression to assess the relationship between USMLE Step 1, USMLE Step 2 CK, USMLE Step 3, and third-year ITE scores and ABIM-CE results. We used multivariable logistic regression to predict pass or fail results on the ABIM-CE based on USMLE and third-year ITE test scores, controlling for other covariates. RESULTS Among 114 Internal Medicine MD residents included in the study, 92% (n = 105) passed the ABIM-CE. The OR of passing the ABIM-CE was 2.70 (95% CI = 1.38-5.29), 2.31 (95% CI = 1.33-4.01), and 1.63 (95% CI = 0.81-3.29) with a ten-point increase in USMLE Step 1, USMLE Step 2 CK, and USMLE Step 3 scores, respectively. The OR of passing the ABIM-CE was 2.96 (95% CI = 0.95-9.20) with a ten-point increase in the average score of the above three exams. A 5% increase in ITE percentage raised the likelihood of passing the ABIM-CE (OR 2.92, 95% CI 1.15-7.38). All residents who failed the ABIM-CE had Step 1 scores < 220. Among 31 residents with Step 2 CK scores < 220, 20% (n = 6) failed the ABIM-CE. Similarly, 9% of residents with USMLE Step 3 scores < 220 failed the ABIM-CE; all residents who failed had scored < 220. The probability curve predicted that the chance of passing the ABIM-CE was around 80% with USMLE scores greater than 200 and increased to almost 100% with USMLE scores of 250 or more. CONCLUSION USMLE Step 1, USMLE Step 2 CK, and third-year ITE scores can predict the chances of passing the ABIM-CE. The third-year ITE score has higher predictive value than the USMLE Step 1 and Step 2 CK scores, and USMLE Step 1 scores are more predictive of ABIM-CE results than USMLE Step 2 CK scores. Thus, residency programs can identify internal medicine residents at risk of failing the ABIM-CE and formulate interventions at an early stage during residency training. Measures such as enrolling them in question banks or board review courses can be helpful in improving their chances of passing the ABIM-CE.
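The per-ten-point odds ratios this abstract reports follow directly from a logistic regression model: a ten-point score increase multiplies the odds of passing by exp(10·β). A minimal sketch in Python, using hypothetical coefficients chosen only so the illustration roughly matches the reported Step 1 OR of 2.70 and the shape of the described probability curve (these are not the study's fitted values or data):

```python
import math

def pass_probability(score: float, beta0: float, beta1: float) -> float:
    """Predicted probability of passing under a logistic model:
    log-odds of passing = beta0 + beta1 * score."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * score)))

# Hypothetical coefficients for illustration only (not the study's fit):
beta0, beta1 = -18.0, 0.0993

# A ten-point increase in score multiplies the odds by exp(10 * beta1),
# which is how a "per ten-point" odds ratio is obtained from beta1.
or_per_10_points = math.exp(10 * beta1)
print(round(or_per_10_points, 2))

# The resulting curve rises steeply over the score range the abstract
# describes: high probability near a score of 200, near-certain by 250.
print(round(pass_probability(200, beta0, beta1), 3))
print(round(pass_probability(250, beta0, beta1), 3))
```

With these illustrative coefficients the per-ten-point odds ratio comes out near 2.70, and the predicted probability climbs from roughly the high-80s percent at a score of 200 to nearly 100% at 250, mirroring the qualitative pattern reported above.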
Affiliation(s)
- Supratik Rayamajhi
- Department of Medicine, Michigan State University, 788 Service Road, Room B301 Clinical Center, East Lansing, MI, 48824, USA
- Prajwal Dhakal
- Division of Oncology and Hematology, Department of Internal Medicine, University of Nebraska Medical Center, Omaha, NE, USA
- Fred and Pamela Buffett Cancer Center, University of Nebraska Medical Center, Omaha, NE, USA
- Ling Wang
- Department of Medicine, Michigan State University, 788 Service Road, Room B301 Clinical Center, East Lansing, MI, 48824, USA
- Manoj P Rai
- Department of Medicine, Michigan State University, 788 Service Road, Room B301 Clinical Center, East Lansing, MI, 48824, USA
- Shiva Shrotriya
- Department of Medicine, Michigan State University, 788 Service Road, Room B301 Clinical Center, East Lansing, MI, 48824, USA
33
Guilbault RW, Lee SW, Lian B, Choi J. Predictors of USMLE Step 1 Outcomes: Charting Successful Study Habits. Med Sci Educ 2020; 30:103-106. [PMID: 34457646 PMCID: PMC8368851 DOI: 10.1007/s40670-019-00907-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
BACKGROUND The United States Medical Licensing Examination Step 1 is a test that affects many aspects of medical students' careers. The aim of this study was to assess the predictive value of various studying habits and academic traits. ACTIVITY A survey concerning Step 1 study habits and scores was collected and analyzed. RESULTS AND DISCUSSION Study results showed that preclinical curriculum grades, practice test scores, and the number of practice questions completed were positively correlated with Step 1 scores. The strongest predictor of Step 1 scores was preclinical curriculum grades: each unit increase in a letter grade was associated with a 12-point increase in Step 1 scores.
Affiliation(s)
- Ryan W.R. Guilbault
- Department of Orthopaedic Surgery, Johns Hopkins University, 601 North Caroline Street, Baltimore, MD 21205 USA
- Sang W. Lee
- Department of General Surgery, Medical College of Georgia, 1120 15th Street, Augusta, GA 30912 USA
- Brad Lian
- Department of Community Medicine, Mercer University School of Medicine, 1501 Mercer University Drive, Macon, GA USA
- Jaehwa Choi
- Department of Biomedical Sciences, Mercer University School of Medicine, 1501 Mercer University Drive, Macon, GA 31207 USA
34
Chapman CH, Hwang WT, Wang X, Deville C. Factors that predict for representation of women in physician graduate medical education. Med Educ Online 2019; 24:1624132. [PMID: 31199206 PMCID: PMC6586104 DOI: 10.1080/10872981.2019.1624132] [Citation(s) in RCA: 31] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/16/2019] [Revised: 05/13/2019] [Accepted: 05/20/2019] [Indexed: 05/27/2023]
Abstract
Background/Objective: To identify factors associated with underrepresentation of women in the largest medical specialties. Methods: The authors obtained specialty-specific data from the Association of American Medical Colleges, the National Resident Matching Program (NRMP), and the Journal of the American Medical Association Graduate Medical Education Supplement from 2014 on the gender of trainees and faculty members, residency program director-rated importance of interview selection and rank list formation criteria, and characteristics of matched NRMP participants. They used linear regression to evaluate whether these factors were associated with the representation of female trainees in the 18 largest specialties that participated in the NRMP. They hypothesized that factors representing lower student exposure or higher research requirements would be associated with lower representation of women. Results: In 2014, representation of women among trainees ranged from 13.7% in Orthopedic Surgery to 82.5% in OB/GYN. On multivariable analysis, the factors associated with specialties having lower percentages of female trainees were: not being part of the third-year core (slope = 0.141, p = 0.002), having lower specialty mean Step 1 scores (slope = 0.007, p = 0.017), and having lower percentages of female faculty members. For each 1% increase in female faculty, the percentage of female trainees increased by 1.45% (p < 0.001). Conclusions: Two exposure-related factors, the percentage of female faculty members and being part of the third-year core, were associated with underrepresentation of women as trainees. Future research could examine whether these are causal associations. Medical schools and training specialties should investigate whether strategies to enhance mentorship and increase exposure to non-core specialties will increase the proportion of women in fields in which they are underrepresented.
Collapse
Affiliation(s)
- Christina H. Chapman
- Department of Radiation Oncology, University of Michigan, Ann Arbor, MI, USA
- Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, MI, USA
- Wei-Ting Hwang
- Department of Biostatistics, Epidemiology and Informatics, University of Pennsylvania, Philadelphia, PA, USA
- Xingmei Wang
- Center for Clinical Epidemiology and Biostatistics, University of Pennsylvania, Philadelphia, PA, USA
- Curtiland Deville
- Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins, Baltimore, MD, USA
|
35
|
Carmody JB, Sarkany D, Heitkamp DE. The USMLE Step 1 Pass/Fail Reporting Proposal: Another View. Acad Radiol 2019; 26:1403-1406. [PMID: 31296373 DOI: 10.1016/j.acra.2019.06.002] [Citation(s) in RCA: 22] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2019] [Revised: 06/13/2019] [Accepted: 06/13/2019] [Indexed: 11/28/2022]
Abstract
The Association of Program Directors in Radiology recently issued a statement endorsing continued reporting of results of the United States Medical Licensing Examination (USMLE) as a three-digit score. While this position was approved by the Association of Program Directors in Radiology Board of Directors, it does not reflect the opinions of all radiology program directors. Here, we present an argument in support of reporting USMLE results as pass/fail. As a psychometric instrument, the USMLE Step 1 is designed to assess basic science knowledge and intended to inform a binary decision on licensure. Due to a steadily increasing burden of applications to review, program directors have increasingly relied upon scores for candidate screening. Such use has multiple adverse consequences. Student focus on Step 1 systematically devalues educational content not evaluated on the exam, and the reliance on Step 1 scores almost certainly works against efforts to increase workforce diversity. Moreover, the increasing pressure of "Step 1 Mania" has negative consequences for trainee mental health and wellness. Despite the widespread use of Step 1 scores to select applicants, there are few data correlating scores with meaningful outcomes related to patient care or clinical practice. We find the current situation untenable, and believe a necessary first step toward reform is making Step 1 a pass/fail-only examination.
Affiliation(s)
- J Bryan Carmody
- Eastern Virginia Medical School, Department of Pediatrics, Division of Nephrology, Norfolk, Virginia
- David Sarkany
- Staten Island University Hospital, Northwell Health, Department of Radiology, 475 Seaview Avenue, Staten Island, NY 10305.
- Darel E Heitkamp
- Advent Health Orlando, Department of Radiology, Orlando, Florida
|
36
|
Sabharwal S, Chiodo AE, Raddatz MM. Administration and performance on the Spinal Cord Injury Medicine Certification Examination over a 10-year period. J Spinal Cord Med 2019; 42:606-612. [PMID: 29902393 PMCID: PMC6758686 DOI: 10.1080/10790268.2018.1475995] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/14/2022] Open
Abstract
Context/Objective: The examination for Spinal Cord Injury (SCI) Medicine subspecialty certification has been administered since 1998, but published information about exam performance or administration is limited. Design: Retrospective review. Setting/Participants: We examined de-identified information from the American Board of Physical Medicine and Rehabilitation (ABPMR) database for characteristics and performance of candidates (n = 566) who completed the SCI Medicine Examination over a 10-year period (2005-2014), during which the exam outline and passing standard remained consistent. Interventions: Not applicable. Outcome Measures: We analysed candidate performance by candidate track, primary specialty, number of attempts, and domains being tested. We also examined candidate perception of the SCI Medicine Exam by analysing responses to a survey taken after exam completion. Results: Thirty-six percent of candidates who completed the exam during the study period took it for initial certification (23% in the fellowship track and 13% in the practice track offered during the initial "grandfathering" period) and 64% took it for maintenance of certification (MOC) in SCI Medicine. Factors associated with better exam performance included primary specialty certification in Physical Medicine and Rehabilitation (PM&R) and first attempt at passing the exam. For PM&R candidates, ABPMR Part I Examination scores and SCI Medicine Examination scores were strongly correlated. Candidate feedback about the exam was largely positive, with 97% agreeing or strongly agreeing that it was relevant to the field and 90% that it was a good test of their knowledge. Conclusion: This study can inform prospective candidates for the SCI Medicine Examination as well as those guiding them. It may also provide useful information for future exam development.
Affiliation(s)
- Sunil Sabharwal
- Harvard Medical School, Boston, Massachusetts, USA; VA Boston Health Care System, Boston, Massachusetts, USA. Correspondence to: Sunil Sabharwal, MD, Spinal Cord Injury Service (SCI-128), VA Boston Health Care System, 1400 VFW Parkway, West Roxbury, MA 02132, USA; Ph: 857-203-6574.
- Anthony E. Chiodo
- Physical Medicine and Rehabilitation, University of Michigan, Ann Arbor, Michigan, USA
- Mikaela M. Raddatz
- American Board of Physical Medicine & Rehabilitation, Rochester, Minnesota, USA
|
37
|
Calisi N, Gondi KT, Asmar J, Singhal H, Andresen K. Predictors of Success on the ABR Core Examination. J Am Coll Radiol 2019; 16:1193-1200. [DOI: 10.1016/j.jacr.2019.03.007] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2018] [Revised: 03/09/2019] [Accepted: 03/12/2019] [Indexed: 11/30/2022]
|
38
|
Sharma A, Schauer DP, Kelleher M, Kinnear B, Sall D, Warm E. USMLE Step 2 CK: Best Predictor of Multimodal Performance in an Internal Medicine Residency. J Grad Med Educ 2019; 11:412-419. [PMID: 31440335 PMCID: PMC6699543 DOI: 10.4300/jgme-d-19-00099.1] [Citation(s) in RCA: 40] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/05/2019] [Revised: 04/26/2019] [Accepted: 06/04/2019] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Internal medicine (IM) residency programs receive information about applicants via academic transcripts, but studies demonstrate wide variability in satisfaction with and usefulness of this information. In addition, many studies compare application materials to only 1 or 2 assessment metrics, usually standardized test scores and work-based observational faculty assessments. OBJECTIVE We sought to determine which application materials best predict performance across a broad array of residency assessment outcomes generated by standardized testing and a yearlong IM residency ambulatory long block. METHODS In 2019, we analyzed available Electronic Residency Application Service data for 167 categorical IM residents, including advanced degree status, research experience, failures during medical school, undergraduate medical education award status, and United States Medical Licensing Examination (USMLE) scores. We compared these with post-match residency multimodal performance, including standardized test scores and faculty member, peer, allied health professional, and patient-level assessment measures. RESULTS In multivariate analyses, USMLE Step 2 Clinical Knowledge (CK) scores were most predictive of performance across all residency performance domains measured. Having an advanced degree was associated with higher patient-level assessments (eg, physician listens, physician explains, etc). USMLE Step 1 scores were associated with in-training examination scores only. None of the other measured application materials predicted performance. CONCLUSIONS USMLE Step 2 CK scores were the strongest predictors of residency performance across a broad array of performance measurements generated by standardized testing and an IM residency ambulatory long block.
|
39
|
Radabaugh CL, Hawkins RE, Welcher CM, Mejicano GC, Aparicio A, Kirk LM, Skochelak SE. Beyond the United States Medical Licensing Examination Score: Assessing Competence for Entering Residency. Acad Med 2019; 94:983-989. [PMID: 30920448 DOI: 10.1097/acm.0000000000002728] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Assessments of physician learners during the transition from undergraduate to graduate medical education generate information that may inform their learning and improvement needs, determine readiness to move along the medical education continuum, and predict success in their residency programs. To achieve a constructive transition for the learner, residency program, and patients, high-quality assessments should provide meaningful information regarding applicant characteristics, academic achievement, and competence that lead to a suitable match between the learner and the residency program's culture and focus. The authors discuss alternative assessment models that may correlate with resident physician clinical performance and patient care outcomes. Currently, passing the United States Medical Licensing Examination Step examinations provides one element of reliable assessment data that could inform judgments about a learner's likelihood for success in residency. Yet, learner capabilities in areas beyond those traditionally valued in future physicians, such as life experiences, community engagement, language skills, and leadership attributes, are not afforded the same level of influence when candidate selections are made. While promising new methods of screening and assessment, such as objective structured clinical examinations, holistic assessments, and competency-based assessments, have attracted increased attention in the medical education community, currently they may be expensive, be less psychometrically sound, lack a national comparison group, or be complicated to administer. Future research and experimentation are needed to establish measures that can best meet the needs of programs, faculty, staff, students, and, more importantly, patients.
Affiliation(s)
- Carrie L Radabaugh
- C.L. Radabaugh is vice president, governance and board relations, American Board of Medical Specialties, Chicago, Illinois. R.E. Hawkins is president and chief executive officer, American Board of Medical Specialties, Chicago, Illinois. C.M. Welcher is senior policy analyst, Medical Education Programs, American Medical Association, Chicago, Illinois. G.C. Mejicano is professor and senior associate dean for education, School of Medicine, Oregon Health & Science University, Portland, Oregon. A. Aparicio is director, Medical Education Programs, American Medical Association, Chicago, Illinois. L.M. Kirk is professor, Internal Medicine/Family & Community Medicine, Southwestern Medical School, University of Texas Southwestern Medical Center, Dallas, Texas. S.E. Skochelak is chief academic officer and medical education group vice president, American Medical Association, Chicago, Illinois
|
40
|
Bynum D, Colford C, Contarino M. Licensing examination scores and fellowship selection. Clin Teach 2019; 16:269-271. [DOI: 10.1111/tct.12917] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Affiliation(s)
- Debra Bynum
- Department of Internal Medicine, University of North Carolina, Chapel Hill, North Carolina, USA
- Cristin Colford
- Department of Internal Medicine, University of North Carolina, Chapel Hill, North Carolina, USA
- Michael Contarino
- Department of Medicine, WakeMed Health and Hospitals, Raleigh, North Carolina, USA
|
41
|
Hartman ND. A Narrative Review of the Evidence Supporting Factors Used by Residency Program Directors to Select Applicants for Interviews. J Grad Med Educ 2019; 11:268-273. [PMID: 31210855 PMCID: PMC6570461 DOI: 10.4300/jgme-d-18-00979.3] [Citation(s) in RCA: 52] [Impact Index Per Article: 10.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/19/2018] [Revised: 01/23/2019] [Accepted: 03/31/2019] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Residency applicants feel increasing pressure to maximize their chances of successfully matching into the program of their choice, and are applying to more programs than ever before. OBJECTIVE In this narrative review, we examined the most common and highly rated factors used to select applicants for interviews. We also examined the literature surrounding those factors to illuminate the advantages and disadvantages of using them as differentiating elements in interviewee selection. METHODS Using the 2018 NRMP Program Director Survey as a framework, we examined the last 10 years of literature to ascertain how residency directors are using these common factors to grant residency interviews, and whether these factors are predictive of success in residency. RESULTS Residency program directors identified 12 factors that contribute substantially to the decision to invite applicants for interviews. Although United States Medical Licensing Examination (USMLE) Step 1 is often used as a comparative factor, most studies do not demonstrate its predictive value for resident performance, except in the case of test failure. We also found that structured letters of recommendation from within a specialty carry increased benefit when compared with generic letters. Failing USMLE Step 1 or 2 and unprofessional behavior predicted lower performance in residency. CONCLUSIONS We found that the evidence basis for the factors most commonly used by residency directors is decidedly mixed in terms of predicting success in residency and beyond. Given these limitations, program directors should be skeptical of making summative decisions based on any one factor.
|
42
|
Prober CG. The Match: To Thine Own Self Be True. Acad Med 2019; 94:317-320. [PMID: 30540566 DOI: 10.1097/acm.0000000000002557] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
The residency match process, culminating with the Match Day celebration, plays out in medical schools across the United States and Canada every year. The process may seem strange and mysterious for observers outside of medicine. The notion that each graduating student's employer for the next several years is first revealed to thousands of people, all at the same moment, through the opening of an envelope is surreal. The emotional reactions accompanying the process range from jubilance to deep disappointment. Much attention and care have been given to developing the algorithm underpinning the Match, and the process seems just: Optimization favors applicants over training programs. Witnessing students as they progress to their next stage of medical training is special for those involved in medical education. Faculty are filled with pride. But the process is far from perfect. The author of this Invited Commentary notes several concerns about the Match: the arduous process that students undergo to maximize their chances of success; the costs attendant to the travel and related expenses of multiple, geographically dispersed interviews; and the metrics that students and their medical schools use to judge the outcomes. The author worries that for some students, the "ideal" match may not be the one driven by their dreams and aspirations but, rather, by an amalgamation of those of many well-meaning friends, family members, and faculty. Medical students should seek advice and guidance, but the author hopes that, ultimately, students follow their own drumbeat and are true first to themselves.
Affiliation(s)
- Charles G Prober
- C.G. Prober is professor of pediatrics, professor of microbiology and immunology, and senior associate vice provost for health education, Stanford University, Stanford, California
|
43
|
Predicting American Board of Emergency Medicine Qualifying Examination Passage Using United States Medical Licensing Examination Step Scores. Ochsner J 2018; 18:204-208. [PMID: 30275782 DOI: 10.31486/toj.17.0101] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
Background The objective of the current study was to determine whether emergency medicine residents' United States Medical Licensing Examination (USMLE) scores are significantly associated with first-attempt passage of the American Board of Emergency Medicine (ABEM) qualifying (written) examination. We hypothesized that USMLE Step 2 Clinical Knowledge (CK) scores would be useful in predicting students who passed the ABEM qualifying examination on their first attempt. Methods For this retrospective cohort study, we examined the data of residents who successfully completed training at two emergency medicine residency programs between the years 2002-2013. Because scores on the USMLE Step examinations varied greatly across years, we obtained means and standard deviations from the National Board of Medical Examiners. We subtracted the mean score for the year each resident took the examination from the resident's examination score, creating centered Step 1 and centered Step 2 CK scores. Results A multivariate logistic regression analysis indicated that centered Step 2 CK scores could be used to predict the odds of passing the ABEM qualifying examination (odds ratio = 1.05 [95% confidence interval 1.02 to 1.08, P < 0.001]). Using a Step 2 CK score cutoff of 7 points lower than the mean yielded 64% sensitivity and 81% specificity for predicting passage of the ABEM written examination on the first attempt. Conclusion Program directors and selection committees may wish to consider whether applicants' Step 2 CK scores are near the national average when making ranking decisions, as this variable is highly predictive of passing the ABEM qualifying examination on the initial attempt.
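The abstract above reports a per-point odds ratio of 1.05 for centered Step 2 CK score, which compounds multiplicatively across score points. A minimal sketch of that scaling (the helper name is hypothetical; only the OR value comes from the abstract):

```python
# Per-point odds ratio for centered USMLE Step 2 CK score, as reported
# in the abstract (OR = 1.05 per point relative to the national mean).
OR_PER_POINT = 1.05

def odds_multiplier(delta_points: float) -> float:
    """Multiplicative change in the odds of first-attempt ABEM qualifying
    exam passage for a candidate scoring `delta_points` above the mean."""
    return OR_PER_POINT ** delta_points

# Example: a candidate 10 points above the national mean has roughly
# 1.05**10 ~ 1.63 times the odds of a candidate scoring at the mean.
print(round(odds_multiplier(10), 2))
```

This is why even the modest-looking OR of 1.05 yields the abstract's usable screening cutoff: differences of several points translate into substantially different passage odds.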
|
44
|
Gelinne A, Zuckerman S, Benzil D, Grady S, Callas P, Durham S. United States Medical Licensing Exam Step I Score as a Predictor of Neurosurgical Career Beyond Residency. Neurosurgery 2018; 84:1028-1034. [DOI: 10.1093/neuros/nyy313] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2018] [Accepted: 06/14/2018] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND United States Medical Licensing Exam (USMLE) Step I score is cited as one of the most important factors when applying to neurosurgery residencies. No studies have documented a correlation between USMLE Step I score and metrics of neurosurgical career trajectory beyond residency. OBJECTIVE To determine whether USMLE Step I exam scores are predictive of neurosurgical career beyond residency, as defined by American Board of Neurological Surgery (ABNS) certification status, practice type, academic rank, and research productivity. METHODS A database of neurosurgery residency applicants who matched into neurosurgery from 1997 to 2007 was utilized that included USMLE Step I score. Online databases were used to determine h-index, National Institutes of Health (NIH) grant funding, academic rank, practice type, and ABNS certification status of each applicant. Linear regression and nonparametric testing determined associations between USMLE Step I scores and these variables. RESULTS USMLE Step I scores were higher for neurosurgeons in academic positions (237) when compared to community practice (234) and non-neurosurgeons (233, P < .01). USMLE Step I score was not different between neurosurgeons of different academic rank (P = .21) or ABNS certification status (P = .78). USMLE Step I score was not correlated with h-index for academic neurosurgeons (R2 = 0.002, P = .36). CONCLUSION USMLE Step I score has little utility in predicting the future careers of neurosurgery resident applicants. A career in academic neurosurgery is associated with a slightly higher USMLE Step I score. However, USMLE Step I score does not predict academic rank or productivity (h-index or NIH funding), nor does USMLE Step I score predict ABNS certification status.
Affiliation(s)
- Aaron Gelinne
- Department of Neurological Surgery, University of Vermont Medical Center, Burlington, Vermont
- Scott Zuckerman
- Department of Neurological Surgery, Vanderbilt University Medical Center, Nashville, Tennessee
- Deborah Benzil
- Department of Neurological Surgery, Mount Sinai Health System, Mount Kisco, New York
- Sean Grady
- Department of Neurological Surgery, University of Pennsylvania Medicine, Philadelphia, Pennsylvania
- Peter Callas
- Department of Mathematics & Statistics, University of Vermont, Burlington, Vermont
- Susan Durham
- Department of Neurological Surgery, University of Vermont Medical Center, Burlington, Vermont
|
45
|
Durham SR, Donaldson K, Grady MS, Benzil DL. Analysis of the 1990-2007 neurosurgery residency match: does applicant gender affect neurosurgery match outcome? J Neurosurg 2018; 129:282-289. [PMID: 29882698 DOI: 10.3171/2017.11.jns171831] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
OBJECTIVE With nearly half of graduating US medical students being female, it is imperative to understand why females typically make up less than 20% of the neurosurgery applicant pool, a number that has changed very slowly over the past several decades. Organized neurosurgery has strongly indicated the desire to overcome the underrepresentation of women, and it is critical to explore whether females are at a disadvantage during the residency application process, one of the first steps in a neurosurgical career. To date, there are no published studies on specific applicant characteristics, including gender, that are associated with match outcome among neurosurgery resident applicants. The purpose of this study is to determine which characteristics of neurosurgery residency applicants, including gender, are associated with a successful match outcome. METHODS De-identified neurosurgical resident applicant data obtained from the San Francisco Fellowship and Residency Matching Service for the years 1990-2007 were analyzed. Applicant characteristics including gender, medical school attended, year of application, United States Medical Licensing Exam (USMLE) Step 1 score, Alpha Omega Alpha (AOA) status, and match outcome were available for study. RESULTS Of the total 3426 applicants studied, 473 (13.8%) applicants were female and 2953 (86.2%) were male. Two thousand four hundred forty-eight (71.5%) applicants successfully matched. USMLE Step 1 score was the strongest predictor of match outcome with scores > 245 having an OR of 20.84 (95% CI 10.31-42.12) compared with those scoring < 215. The mean USMLE Step 1 score for applicants who successfully matched was 233.2 and was 210.8 for those applicants who did not match (p < 0.001). Medical school rank was also associated with match outcome (p < 0.001). AOA status was not significantly associated with match outcome. 
Female gender was associated with significantly lower odds of matching in both simple (OR 0.59, 95% CI 0.48-0.72) and multivariate analyses (OR 0.57, 95% CI 0.34-0.94). USMLE Step 1 scores were significantly lower for females compared to males, with a mean score of 230.1 for males and 221.5 for females (p < 0.001). There was no significant difference in medical school ranking or AOA status when stratified by applicant gender. CONCLUSIONS The limited historical applicant data from 1990-2007 suggest that USMLE Step 1 score is the best predictor of match outcome, although applicant gender may also play a role.
Affiliation(s)
- Susan R Durham
- Division of Neurosurgery, University of Vermont College of Medicine, Burlington, Vermont
- M Sean Grady
- Department of Neurosurgery, The University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania
- Deborah L Benzil
- Department of Neurological Surgery, Columbia University, Mt. Kisco, New York
|
46
|
Nishizaki Y, Shinozaki T, Kinoshita K, Shimizu T, Tokuda Y. Awareness of Diagnostic Error among Japanese Residents: a Nationwide Study. J Gen Intern Med 2018; 33:445-448. [PMID: 29256086 PMCID: PMC5880762 DOI: 10.1007/s11606-017-4248-y] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/21/2017] [Revised: 10/24/2017] [Accepted: 11/27/2017] [Indexed: 11/25/2022]
Abstract
BACKGROUND Residents' understanding of diagnostic error may differ between countries. We sought to explore the relationship between diagnostic error knowledge and self-study, clinical knowledge, and experience. METHODS Our nationwide study involved postgraduate year 1 and 2 (PGY-1 and -2) Japanese residents. The Diagnostic Error Knowledge Assessment Test (D-KAT) and General Medicine In-Training Examination (GM-ITE) were administered at the end of the 2014 academic year. D-KAT scores were compared with the benchmark scores of US residents. Associations between D-KAT score and gender, PGY, emergency department (ED) rotations per month, mean number of inpatients handled at any given time, and mean daily minutes of self-study were also analyzed, both with and without adjusting for GM-ITE scores. Student's t test was used for comparisons, with linear mixed models and structural equation models (SEM) used to explore associations with D-KAT or GM-ITE scores. RESULTS The mean D-KAT score among Japanese PGY-2 residents was significantly lower than that of their US PGY-2 counterparts (6.2 vs. 8.3, p < 0.001). GM-ITE scores correlated with ED rotations (≥6 rotations: 2.14; 0.16-4.13; p = 0.03), inpatient caseloads (5-9 patients: 1.79; 0.82-2.76; p < 0.001), and average daily minutes of self-study (≥91 min: 2.05; 0.56-3.53; p = 0.01). SEM revealed that D-KAT scores were directly associated with GM-ITE scores (β = 0.37, 95% CI: 0.34-0.41) and indirectly associated with ED rotations (β = 0.06, 95% CI: 0.02-0.10), inpatient caseload (β = 0.04, 95% CI: 0.003-0.08), and average daily minutes of study (β = 0.13, 95% CI: 0.09-0.17). CONCLUSIONS Knowledge regarding diagnostic error among Japanese residents was poor compared with that among US residents. D-KAT scores correlated strongly with GM-ITE scores, and the latter scores were positively associated with a greater number of ED rotations, larger caseload (though only up to 15 patients), and more time spent studying.
Affiliation(s)
- Yuji Nishizaki
- Medical Technology Innovation Center, Juntendo University, Tokyo, Japan
- Tomohiro Shinozaki
- Department of Biostatistics, Graduate School of Medicine, University of Tokyo, Tokyo, Japan
- Kensuke Kinoshita
- Department of Medicine, Mito Kyodo General Hospital, University of Tsukuba, Tsukuba, Japan
- Taro Shimizu
- Diagnostic and Generalist Medicine, Dokkyo Medical University, Mibu, Japan
- Yasuharu Tokuda
- Muribushi Okinawa for Teaching Hospitals, Urasoe City, Okinawa, Japan
|
47
|
Redmann AJ, Tawfik KO, Myer CM. The impact of a resident-run review curriculum and USMLE scores on the Otolaryngology in-service exam. Int J Pediatr Otorhinolaryngol 2018; 104:25-28. [PMID: 29287874 DOI: 10.1016/j.ijporl.2017.10.031] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/29/2017] [Revised: 10/19/2017] [Accepted: 10/20/2017] [Indexed: 10/18/2022]
Abstract
OBJECTIVE Describe the association of USMLE Step 1 scores and the institution of a dedicated board review curriculum with resident performance on the Otolaryngology training examination. STUDY DESIGN Retrospective cross sectional study. METHODS We reviewed American Board of Otolaryngology Training Examination (OTE) scores for an otolaryngology residency program between 2005 and 2016. USMLE Step 1 scores were collected. In 2011 a resident-run OTE review curriculum was instituted with the goal of improving test preparation. Scores were compared before and after curriculum institution. Linear regression was performed to identify predictors of OTE scores. RESULTS 47 residents were evaluated, 24 before and 23 after instituting the curriculum. There was a moderate correlation between USMLE step 1 scores and OTE scores for all years. For PGY-2 residents, mean OTE scores improved from 25th percentile to 41st percentile after institution of the review curriculum (p = 0.05). PGY 3-5 residents demonstrated no significant improvement. On multivariate linear regression, after controlling for USMLE step 1 scores, a dedicated board review curriculum predicted a 23-point percentile improvement in OTE scores for PGY-2 residents (p = 0.003). For other post-graduate years, the review curriculum did not predict score improvement. CONCLUSION USMLE step 1 scores are moderately correlated with OTE performance. A dedicated OTE review curriculum may improve OTE scores for PGY-2 residents, but such a curriculum may have less benefit for intermediate- and senior-level residents. LEVEL OF EVIDENCE 4.
Affiliation(s)
- Andrew J Redmann
- Department of Otolaryngology - Head & Neck Surgery, University of Cincinnati College of Medicine, Cincinnati, OH, United States
- Kareem O Tawfik
- Department of Otolaryngology - Head & Neck Surgery, University of Cincinnati College of Medicine, Cincinnati, OH, United States
- Charles M Myer
- Department of Otolaryngology - Head & Neck Surgery, University of Cincinnati College of Medicine, Cincinnati, OH, United States; Division of Pediatric Otolaryngology - Head & Neck Surgery, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, United States.
|
48
|
Driscoll SW, Massagli TL, McMahon MA, Raddatz MM, Pruitt DW, Murphy KP. Performance of Pediatric Rehabilitation Medicine Candidates on the Subspecialty Board Certification Examination from 2003 to 2015. PM R 2017; 10:391-397. [PMID: 29024755 DOI: 10.1016/j.pmrj.2017.09.010] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2017] [Revised: 09/26/2017] [Accepted: 09/29/2017] [Indexed: 11/25/2022]
Abstract
BACKGROUND Pediatric rehabilitation medicine (PRM) physicians enter the field via several pathways. It is unknown whether different training pathways affect performance on the American Board of Physical Medicine and Rehabilitation (ABPMR) PRM Examination and Maintenance of Certification (MOC) Examination. OBJECTIVES To describe the examination performance of candidates on the ABPMR PRM Examination according to their type of training (physiatrists with a clinical PRM focus, accredited or unaccredited fellowship training, separate pediatric and physical medicine and rehabilitation residencies, or combined pediatrics/physical medicine and rehabilitation residencies) and to compare candidates' performance on the PRM Examination with their initial ABPMR certification and MOC Examinations. DESIGN A retrospective cohort study. SETTING American Board of Physical Medicine and Rehabilitation office. PARTICIPANTS A total of 250 candidates taking the PRM subspecialty certification examination from 2003 to 2015. METHODS Scaled scores on the PRM Examination were compared with examinees' initial certification scores as well as their admissibility criteria. Pass rates and scaled scores also were compared for those taking their initial PRM certification versus MOC. MAIN OUTCOME MEASUREMENTS Board pass rates and mean scaled scores for the initial PRM Examination and MOC. RESULTS The 250 physiatrists who took the subspecialty PRM Examination had an overall first-time pass rate of 89%. There was no significant difference in first-time PRM pass rates or mean scaled scores between individuals who completed an Accreditation Council for Graduate Medical Education-accredited fellowship and those who did not. First-time PRM pass rates were greatest among those who were also certified by the American Board of Pediatrics (100%). Performance on Parts I and II of the initial ABPMR Certification Examination significantly predicted PRM Examination scores. There was no difference in mean scaled scores for initial PRM certification versus taking the PRM Examination for MOC. CONCLUSIONS Several pathways to admissibility to the PRM Examination afforded similar opportunity for diplomates to gain the knowledge necessary to pass the PRM Examination. Once certified, physicians taking the PRM Examination for MOC have a high success rate of passing again in years 7-10 of their certification cycle. LEVEL OF EVIDENCE III.
Affiliation(s)
- Sherilyn W Driscoll, Teresa L Massagli, Mary A McMahon, Mikaela M Raddatz, David W Pruitt, Kevin P Murphy
- Mayo Clinic, Mayo Clinic Children's Center, 200 1st Street SW, Rochester, MN 55905; University of Washington, Seattle Children's Hospital, Seattle, WA; University of Cincinnati, Cincinnati Children's Hospital Medical Center, Cincinnati, OH; American Board of Physical Medicine and Rehabilitation, Rochester, MN; Sanford Health Systems, Bismarck, ND; Gillette Specialty Healthcare, Northern Minnesota Clinics, Duluth, MN
|
49
|
United States Medical Licensing Examination and American Board of Pediatrics Certification Examination Results: Does the Residency Program Contribute to Trainee Achievement? J Pediatr 2017. [PMID: 28629684 DOI: 10.1016/j.jpeds.2017.05.057] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
OBJECTIVE To determine whether training site or prior examinee performance on the US Medical Licensing Examination (USMLE) step 1 and step 2 might predict pass rates on the American Board of Pediatrics (ABP) certifying examination. STUDY DESIGN Data from graduates of pediatric residency programs completing the ABP certifying examination between 2009 and 2013 were obtained. For each graduate, results of the initial ABP certifying examination were obtained, as well as results on the National Board of Medical Examiners (NBME) step 1 and step 2 examinations. Hierarchical linear modeling was used to nest first-time ABP results within training programs to isolate program contribution to ABP results while controlling for USMLE step 1 and step 2 scores. Stepwise linear regression was then used to determine which of these examinations was a better predictor of ABP results. RESULTS A total of 1110 graduates of 15 programs had complete testing results and were included in the analysis. Mean ABP scores for these programs ranged from 186.13 to 214.32. The hierarchical linear model suggested that the interaction of step 1 and step 2 scores predicted ABP performance (F[1,1007.70] = 6.44, P = .011). In a multilevel model by training program, both USMLE step examinations predicted first-time ABP results (b = .002, t = 2.54, P = .011). Linear regression analyses indicated that step 2 results were a better predictor of ABP performance than step 1 or a combination of the two USMLE scores. CONCLUSIONS Performance on the USMLE examinations, especially step 2, predicts performance on the ABP certifying examination. The contribution of training site to ABP performance was statistically significant, though modest compared with that of prior USMLE scores.
|
50
|
Collichio FA, Hess BJ, Muchmore EA, Duhigg L, Lipner RS, Haist S, Hawley JL, Morrison CA, Clayton CP, Raymond MJ, Kayoumi KM, Gitlin SD. Medical Knowledge Assessment by Hematology and Medical Oncology In-Training Examinations Are Better Than Program Director Assessments at Predicting Subspecialty Certification Examination Performance. J Cancer Educ 2017; 32:647-654. [PMID: 26897634 DOI: 10.1007/s13187-016-0993-6] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
The Accreditation Council for Graduate Medical Education's Next Accreditation System requires training programs to demonstrate that fellows are achieving competence in medical knowledge (MK), as part of a global assessment of clinical competency. Passing American Board of Internal Medicine (ABIM) certification examinations is recognized as a metric of MK competency. This study examines several in-training MK assessment approaches and their ability to predict performance on the ABIM Hematology or Medical Oncology Certification Examinations. Results of a Hematology In-Service Examination (ISE) and an Oncology In-Training Examination (ITE), program director (PD) ratings, demographic variables, the United States Medical Licensing Examination (USMLE), and the ABIM Internal Medicine (IM) Certification Examination were compared. Stepwise multiple regression and logistic regression analyses evaluated these assessment approaches as predictors of performance on the Hematology or Medical Oncology Certification Examinations. Hematology ISE scores were the strongest predictor of Hematology Certification Examination scores (β = 0.41) (passing odds ratio [OR], 1.012; 95% confidence interval [CI], 1.008-1.015), and Oncology ITE scores were the strongest predictor of Medical Oncology Certification Examination scores (β = 0.45) (passing OR, 1.013; 95% CI, 1.011-1.016). PD rating of MK was the weakest predictor of Medical Oncology Certification Examination scores (β = 0.07) and was not significantly predictive of Hematology Certification Examination scores. The Hematology ISE and Oncology ITE are better predictors of certification examination performance than PD ratings of MK, reinforcing the effectiveness of in-training examinations for competency-based assessment of MK.
Affiliation(s)
- Frances A Collichio
- Division of Hematology and Oncology, University of North Carolina-Chapel Hill, Physicians Office Building, 3rd Floor 170 Manning Drive, CB# 7305, Chapel Hill, NC, 27599, USA
| | - Brian J Hess
- Hess Consulting, 272 Rue du Replat, Lévis, Quebec, G7A 5E4, Canada
| | - Elaine A Muchmore
- Division of Hematology/Oncology, University of California-San Diego School of Medicine and Veterans Affairs San Diego Healthcare System, 9500 Gilman Drive #9111-E, La Jolla, CA, 92093, USA
| | - Lauren Duhigg
- American Board of Internal Medicine, 510 Walnut St, Suite 1700, Philadelphia, PA, 19106, USA
| | - Rebecca S Lipner
- American Board of Internal Medicine, 510 Walnut St, Suite 1700, Philadelphia, PA, 19106, USA
| | - Steven Haist
- Test Development Services, National Board of Medical Examiners, 3750 Market St, Philadelphia, PA, 19104, USA
| | - Janine L Hawley
- National Board of Medical Examiners, 3750 Market St, Philadelphia, PA, 19104, USA
| | - Carol A Morrison
- National Board of Medical Examiners, 3750 Market St, Philadelphia, PA, 19104, USA
| | - Charles P Clayton
- Education and Training, American Society of Hematology, 2021 L Street NW, Suite 900, Washington, DC, 20036, USA
| | - Marilyn J Raymond
- American Society of Clinical Oncology, 2318 Mill Road, Suite 800, Alexandria, VA, 22314, USA
| | - Karen M Kayoumi
- Education and Training, American Society of Hematology, 2021 L Street NW, Suite 900, Washington, DC, 20036, USA
| | - Scott D Gitlin
- Division of Hematology/Oncology, University of Michigan Health System and Veterans Affairs Ann Arbor Health System, C345 Med Inn Building/SPC 5848, 1500 E. Medical Center Drive, Ann Arbor, MI, 48109, USA
| |
|