1. Swails JL, Angus S, Barone MA, Bienstock J, Burk-Rafel J, Roett MA, Hauer KE. The Undergraduate to Graduate Medical Education Transition as a Systems Problem: A Root Cause Analysis. Acad Med 2023;98:180-187. PMID: 36538695. DOI: 10.1097/acm.0000000000005065.
Abstract
The transition from undergraduate medical education (UME) to graduate medical education (GME) constitutes a complex system with important implications for learner progression and patient safety. The transition is currently dysfunctional, requiring students and residency programs to spend significant time, money, and energy on the process. Applications and interviews continue to increase despite stable match rates. Although many in the medical community acknowledge the problems with the UME-GME transition and learners have called for prompt action to address these concerns, the underlying causes are complex and have defied easy fixes. This article describes the work of the Coalition for Physician Accountability's Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) to apply a quality improvement approach and systems thinking to explore the underlying causes of dysfunction in the UME-GME transition. The UGRC performed a root cause analysis using the 5 whys and an Ishikawa (or fishbone) diagram to deeply explore problems in the UME-GME transition. The root causes of problems identified include culture, costs and limited resources, bias, systems, lack of standards, and lack of alignment. Using the principles of systems thinking (components, connections, and purpose), the UGRC considered interactions among the root causes and developed recommendations to improve the UME-GME transition. Several of the UGRC's recommendations stemming from this work are explained. Sustained monitoring will be necessary to ensure interventions move the process forward to better serve applicants, programs, and the public good.
Affiliation(s)
- Jennifer L Swails
- J.L. Swails is residency program director, codirector of interprofessional education, and associate professor, Department of Medicine, McGovern Medical School, University of Texas Health Science Center, Houston, Texas; ORCID: http://orcid.org/0000-0002-6102-831X
- Steven Angus
- S. Angus is designated institutional official, vice-chair for education, and professor, Department of Medicine, University of Connecticut School of Medicine, Farmington, Connecticut
- Michael A Barone
- M.A. Barone is vice president of competency-based assessment, NBME, Philadelphia, Pennsylvania, and adjunct associate professor of pediatrics, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Jessica Bienstock
- J. Bienstock is professor of gynecology and obstetrics, associate dean for graduate medical education, and designated institutional official, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Jesse Burk-Rafel
- J. Burk-Rafel is assistant professor of medicine and assistant director of UME-GME innovation, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, New York, New York
- Michelle A Roett
- M.A. Roett is professor and chair, Department of Family Medicine, Georgetown University Medical Center, and clinical chief of family medicine, MedStar Georgetown University Hospital, Washington, DC
- Karen E Hauer
- K.E. Hauer is associate dean for competency assessment and professional standards and professor, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California
2. Everett GD, Maharam E, Yi F. National Resident Matching Program Rank Order and Performance in an Internal Medicine Residency. South Med J 2021;114:657-661. PMID: 34599345. DOI: 10.14423/smj.0000000000001301.
Abstract
OBJECTIVES Prospective first-year house staff and residency program leaders spend substantial time, effort, and expense preparing a rank order list for the National Resident Matching Program (NRMP). Previous studies have mostly shown minimal or no relation between rank order and subsequent resident performance, raising questions about the value of this process. Furthermore, no previous studies have been done in Internal Medicine residencies. As such, the purpose of this study was to compare NRMP rank order to multiple objective outcomes of an Internal Medicine residency. METHODS A retrospective cohort of Internal Medicine residents from five consecutive graduating classes, trained between July 1, 2013 and July 31, 2020, was evaluated for five objective outcomes: Accreditation Council for Graduate Medical Education (ACGME) milestones, faculty rankings of quality, National In-Training Examination scores, chief resident attainment, and fellowship attainment. Outcomes were analyzed in relation to eight potential predictors: NRMP rank, medical school type and grades, immigration status, added qualifications, sex, age, and US Medical Licensing Examination (USMLE) scores, using univariate and multivariate analyses. RESULTS From a cohort of 61 residents, 56 were eligible. All eligible residents' data were included, for a participation rate of 100% (56 of 56). There were no statistically significant univariate or multivariate predictors for the endpoint of fellowship attainment. Higher USMLE scores were predictive of chief resident status in univariate analysis only. NRMP rank was significantly correlated with ACGME milestones in the univariate analysis. The multivariate analysis revealed that a higher USMLE score was statistically significantly predictive of more favorable milestones, faculty ranking, and National In-Training Examination score. CONCLUSIONS A higher USMLE score was statistically significantly associated with multiple favorable objective residency outcomes in an Internal Medicine residency. A better NRMP rank was correlated with favorable ACGME milestones in univariate analysis, but USMLE score emerged as the strongest predictor in multivariate analysis.
Affiliation(s)
- George D Everett
- From the Internal Medicine Department and the Research Institute, AdventHealth Orlando, Orlando, Florida
- Edward Maharam
- From the Internal Medicine Department and the Research Institute, AdventHealth Orlando, Orlando, Florida
- Fanchao Yi
- From the Internal Medicine Department and the Research Institute, AdventHealth Orlando, Orlando, Florida
3. Burkhardt JC, Parekh KP, Gallahue FE, London KS, Edens MA, Humbert AJ, Pillow MT, Santen SA, Hopson LR. A Critical Disconnect: Residency Selection Factors Lack Correlation With Intern Performance. J Grad Med Educ 2020;12:696-704. PMID: 33391593. PMCID: PMC7771600. DOI: 10.4300/jgme-d-20-00013.1.
Abstract
BACKGROUND Emergency medicine (EM) residency programs want to employ a selection process that will rank the best possible applicants for admission into the specialty. OBJECTIVE We tested whether application data are associated with resident performance as measured by EM milestone assessments. We hypothesized that a weak correlation would exist between some selection factors and milestone outcomes. METHODS Using data from 5 collaborating residency programs, we performed a secondary analysis of residents trained from 2013 to 2018. Factors in the model were gender, underrepresented in medicine status, United States Medical Licensing Examination Step 1 and Step 2 Clinical Knowledge (CK) scores, Alpha Omega Alpha (AOA) membership, grades (EM, medicine, surgery, pediatrics), advanced degree, Standardized Letter of Evaluation global assessment, rank list position, and controls for year assessed and program. The primary outcomes were the milestone levels achieved in the core competencies. Multivariate linear regression models were fitted for each of the 23 competencies, with comparisons made between each model's results. RESULTS For the most part, academic performance in medical school (Step 1, Step 2 CK, grades, AOA) was not associated with residency clinical performance on milestones. Isolated correlations were found for specific milestones (eg, a higher surgery grade was associated with a higher wound care score), but most selection factors had no correlation with residency performance. CONCLUSIONS Our study did not find consistent, meaningful correlations between the most common selection factors and milestones at any point in training. This may indicate that our current selection process cannot consistently identify the medical students who are most likely to be high performers as residents.
Affiliation(s)
- John C Burkhardt
- Assistant Professor, Departments of Emergency Medicine and Learning Health Sciences, University of Michigan Medical School
- Kendra P Parekh
- Associate Professor, Department of Emergency Medicine, Vanderbilt University School of Medicine
- Fiona E Gallahue
- Residency Program Director and Associate Professor, Department of Emergency Medicine, University of Washington
- Kory S London
- Associate Residency Program Director, Director of Clinical Operations, Jefferson Methodist ED, Associate Director of Quality Assurance and Practice Improvement, and Assistant Professor, Department of Emergency Medicine, Thomas Jefferson University
- Mary A Edens
- Residency Program Director and Associate Professor, Department of Emergency Medicine, Louisiana State University Health Sciences Center Shreveport
- A J Humbert
- Residency Program Director and Associate Professor of Clinical Emergency Medicine, Indiana University School of Medicine
- M Tyson Pillow
- Vice Chair of Education and Associate Professor, Department of Emergency Medicine, Baylor College of Medicine
- Sally A Santen
- Senior Associate Dean for Assessment, Evaluation and Scholarship, and Professor, Department of Emergency Medicine, Virginia Commonwealth University School of Medicine
- Laura R Hopson
- Associate Chair of Education, Emergency Medicine Residency Program, and Associate Professor of Emergency Medicine, University of Michigan Medical School
4. Application Factors Associated With Clinical Performance During Pediatric Internship. Acad Pediatr 2020;20:1007-1012. PMID: 32268217. DOI: 10.1016/j.acap.2020.03.010.
Abstract
OBJECTIVE Our goal was to identify aspects of residency applications predictive of subsequent performance during pediatric internship. METHODS We conducted a retrospective cohort study of graduates of US medical schools who began pediatric internship in a large pediatric residency program in the summers of 2013 to 2017. The primary outcome was the weighted average of subjects' Accreditation Council for Graduate Medical Education pediatric Milestone scores at the end of pediatric internship. To determine factors independently associated with performance, we constructed multivariate linear mixed-effects models controlling for match year and Milestone grading committee as random effects and the following application factors as fixed effects: letter of recommendation strength, clerkship grades, medical school reputation, master's or PhD degrees, gender, US Medical Licensing Examination Step 1 score, Alpha Omega Alpha membership, private medical school, and interview score. RESULTS Our study population included 195 interns. In multivariate analyses, the aspects of applications significantly associated with composite Milestone scores at the end of internship were letter of recommendation strength (est. 0.09, 95% confidence interval [CI]: 0.04 to 0.15), number of clerkship honors (est. 0.05, 95% CI: 0.01 to 0.09), medical school ranking (est. -0.04, 95% CI: -0.08 to -0.01), having a master's degree (est. 0.19, 95% CI: 0.03 to 0.36), and not having a PhD (est. 0.14, 95% CI: 0.02 to 0.26). Overall, the final model explained 18% of the variance in Milestone scoring. CONCLUSIONS Letter of recommendation strength, clerkship grades, medical school ranking, and having obtained a master's degree were significantly associated with higher clinical performance during pediatric internship.
5. Resident selection for emergency medicine specialty training in Canada: A survey of existing practice with recommendations for programs, applicants, and references. CJEM 2020;22:829-835. PMID: 32838823. DOI: 10.1017/cem.2020.457.
Abstract
OBJECTIVE Little is known about how Canadian Royal College emergency medicine (RCEM) residency programs select their residents. This creates uncertainty regarding alignment between current selection processes and known best practices. We sought to describe the current selection processes of Canadian RCEM programs. METHODS An online survey was distributed to all RCEM program directors and assistant directors. The survey instrument included 22 questions and sought both qualitative and quantitative data from the following six domains: application file, letters of reference, elective selection, interview, rank order, and selection process evaluation. RESULTS We received responses from 13 of 14 programs, for an aggregate response rate of 92.9%. A candidate's letters of reference were identified as the most important criterion from the paper application (38.5%). Having a high level of familiarity with the applicant was rated the most important characteristic of a reference letter author (46.2%). In determining rank order, 53.8% of programs weighed the interview more heavily than the paper application. Once final candidate scores were established following the interview stage, all program respondents indicated that further adjustment is made to the final rank order list. Only 1 of 13 program respondents reported ever having completed a formal evaluation of their selection process. CONCLUSION We have identified elements of the selection process that will inform recommendations for programs, students, and referees. We encourage programs to conduct regular reviews of their selection process going forward to ensure alignment with best practices.
6. Yang A, Gilani C, Saadat S, Murphy L, Toohey S, Boysen-Osborn M. Which Applicant Factors Predict Success in Emergency Medicine Training Programs? A Scoping Review. AEM Educ Train 2020;4:191-201. PMID: 32704588. PMCID: PMC7369487. DOI: 10.1002/aet2.10411.
Abstract
BACKGROUND Program directors (PDs) in emergency medicine (EM) receive an abundance of applications for very few residency training spots. It is unclear which selection strategies will yield the most successful residents. Many authors have attempted to determine which items in an applicant's file predict future performance in EM. OBJECTIVES The purpose of this scoping review is to examine the breadth of evidence related to the predictive value of selection factors for performance in EM residency. METHODS The authors systematically searched four databases and websites for peer-reviewed and gray literature related to EM admissions published between 1992 and February 2019. Two reviewers screened titles and abstracts for articles that met the inclusion criteria, according to the scoping review study protocol. The authors included studies if they specifically examined selection factors and whether those factors predicted performance in EM residency training in the United States. RESULTS After screening 23,243 records, the authors selected 60 for full review. From these, the authors selected 15 published manuscripts, one unpublished manuscript, and 11 abstracts for inclusion in the review. These studies examined the United States Medical Licensing Examination (USMLE), Standardized Letters of Evaluation, Medical Student Performance Evaluation, medical school attended, clerkship grades, membership in honor societies, and other less common factors and their association with future EM residency training performance. CONCLUSIONS The USMLE was the most common factor studied. It unreliably predicts clinical performance, but more reliably predicts performance on licensing examinations. All other factors were less commonly studied and, similar to the USMLE, yielded mixed results.
Affiliation(s)
- Allen Yang
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Chris Gilani
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Soheil Saadat
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Linda Murphy
- Health Science Library Orange, University of California, Irvine, Irvine, CA
- Shannon Toohey
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Megan Boysen-Osborn
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- School of Medicine, University of California, Irvine, Irvine, CA
7. Egan DJ, Husain A, Bond MC, Caputo W, Cygan L, Van Dermark J, Shoenberger JM, Li I, Krauss W, Bronner J, White M, Chung AS, Shah KH, Taylor T, Silver M, Ardolic B, Weizberg M. Standardized Video Interviews Do Not Correlate to United States Medical Licensing Examination Step 1 and Step 2 Scores. West J Emerg Med 2019;20:87-91. PMID: 30643606. PMCID: PMC6324696. DOI: 10.5811/westjem.2018.11.39730.
Abstract
Introduction In 2017, the Standardized Video Interview (SVI) was required for applicants to emergency medicine (EM). The SVI contains six questions highlighting professionalism and interpersonal communication skills, with responses scored from 6 to 30. Because the SVI is a new metric, no information is available on the correlation between SVI scores and other application data. The purpose of this study was to determine whether a correlation exists between applicants' United States Medical Licensing Examination (USMLE) and SVI scores. We hypothesized that numeric USMLE Step 1 and Step 2 Clinical Knowledge (CK) scores would not correlate with the SVI score, but that performance on the Step 2 Clinical Skills (CS) portion might correlate with the SVI, since both test communication skills. Methods Nine EM residency sites participated in the study, with data exported from an Electronic Residency Application Service (ERAS®) report. All applicants with both SVI and USMLE scores were included. We studied the correlation between SVI scores and USMLE scores. Predetermined subgroup analyses were performed based on applicants' USMLE Step 1 and Step 2 CK scores (≤200, 201–220, 221–240, 241–260, >260). We used linear regression, the Kruskal-Wallis test, and the Mann-Whitney U test for statistical analyses. Results A total of 1,325 applicants had both Step 1 and SVI scores available, with no correlation between the overall scores (p=0.58) and no correlation between the scores across all Step 1 score ranges (p=0.29). Both Step 2 CK and SVI scores were available for 1,275 applicants, with no correlation between the overall scores (p=0.56) and no correlation across all ranges (p=0.10). USMLE Step 2 CS and SVI scores were available for 1,000 applicants. Four applicants failed the CS test, without any correlation to the SVI score (p=0.08). Conclusion We found no correlation between the scores on any portion of the USMLE and the SVI; the SVI therefore provides new information to application screeners.
Affiliation(s)
- Daniel J Egan
- Columbia University Vagelos College of Physicians and Surgeons, Department of Emergency Medicine, New York, New York
- Abbas Husain
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
- Michael C Bond
- University of Maryland School of Medicine, Department of Emergency Medicine, Baltimore, Maryland
- William Caputo
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
- Lukasz Cygan
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
- Jeff Van Dermark
- University of Texas Southwestern Medical Center, Department of Emergency Medicine, Dallas, Texas
- Jan M Shoenberger
- University of Southern California, Keck School of Medicine, Department of Emergency Medicine, Los Angeles, California
- Ida Li
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
- William Krauss
- Kaiser Permanente San Diego Medical Center, Department of Emergency Medicine, San Diego, California
- Jonathan Bronner
- University of Kentucky, Department of Emergency Medicine, Lexington, Kentucky
- Melissa White
- Emory University, Department of Emergency Medicine, Atlanta, Georgia
- Arlene S Chung
- Maimonides Medical Center, Department of Emergency Medicine, Brooklyn, New York
- Kaushal H Shah
- Icahn School of Medicine at Mount Sinai Hospital, Department of Emergency Medicine, New York, New York
- Todd Taylor
- Emory University, Department of Emergency Medicine, Atlanta, Georgia
- Matthew Silver
- Kaiser Permanente San Diego Medical Center, Department of Emergency Medicine, San Diego, California
- Brahim Ardolic
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
- Moshe Weizberg
- Staten Island University Hospital - Northwell, Department of Emergency Medicine, Staten Island, New York
8. Dubosh NM, Jordan J, Yarris LM, Ullman E, Kornegay J, Runde D, Juve AM, Fisher J. Critical Appraisal of Emergency Medicine Educational Research: The Best Publications of 2016. AEM Educ Train 2019;3:58-73. PMID: 30680348. PMCID: PMC6339548. DOI: 10.1002/aet2.10203.
Abstract
OBJECTIVES The objectives were to critically appraise the emergency medicine (EM) medical education literature published in 2016 and review the highest-quality quantitative and qualitative studies. METHODS A search of the English-language literature published in 2016, querying MEDLINE, Scopus, Education Resources Information Center (ERIC), and PsycINFO, identified 510 papers related to medical education in EM. Two reviewers independently screened all of the publications using previously established exclusion criteria. The 25 top-scoring quantitative studies (based on methodology) and all six qualitative studies were scored by all reviewers using criteria adapted from previous installments of this series. The top-scoring articles were highlighted and trends in medical education research described. RESULTS Seventy-five manuscripts met inclusion criteria and were scored. Eleven quantitative papers and one qualitative paper were the highest scoring and are summarized in this article. CONCLUSION This annual critical appraisal series highlights the best EM education research articles published in 2016.
Affiliation(s)
- Nicole M. Dubosh
- Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA
- Jaime Jordan
- University of California Los Angeles School of Medicine, Torrance, CA
- Edward Ullman
- Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA
- Jonathan Fisher
- University of Arizona College of Medicine Phoenix, Maricopa Medical Center, Phoenix, AZ
9. Meyer NB, Gaetke-Udager K, Shampain KL, Spencer A, Cohan RH, Davenport MS. (Lack of) Measurable Clinical or Knowledge Gains From Resident Participation in Noon Conference. Acad Radiol 2018;25:719-726. PMID: 29751859. DOI: 10.1016/j.acra.2017.12.032.
Abstract
RATIONALE AND OBJECTIVES The objective of this study was to determine whether noon conference attendance by diagnostic radiology residents is predictive of measurable performance. METHODS This single-center retrospective Health Insurance Portability and Accountability Act (HIPAA)-compliant cross-sectional study was considered "not regulated" by the institutional review board. All diagnostic radiology residents who began residency training from 2008 to 2012 were included (N = 54). Metrics of clinical performance and knowledge were collected, including junior and senior precall test results, American Board of Radiology scores (z-score transformed), American College of Radiology in-training scores (years 1-3), on-call "great call" and minor and major discrepancy rates, on-call and daytime case volumes, and training rotation scores. Multivariate regression models were constructed to determine whether conference attendance, match rank order, or starting year could predict these outcomes. Pearson bivariate correlations were calculated. RESULTS Senior precall test results were moderately correlated with American Board of Radiology (r = 0.41) and American College of Radiology (r = 0.38 to 0.48) test results and mean rotation scores (r = 0.41), indicating moderate internal validity. However, conference attendance, match rank order, and year of training did not correlate with (r = -0.16 to 0.16) or predict (P > .05) measurable resident knowledge. On multivariate analysis, neither match rank order (P = .14 to .96) nor conference attendance (P = .10 to .88) predicted measurable clinical efficiency or accuracy. The year residents started training predicted greater cross-sectional case volume (P < .0001, β = 0.361 to 0.516) and less faculty-to-resident feedback (P < .0001, β = -0.628 to -0.733). CONCLUSIONS Residents with lower conference attendance are indistinguishable from those who attend more frequently across a wide range of clinical and knowledge-based performance assessments, suggesting that required attendance may not be necessary to gain certain measurable core competencies.
Affiliation(s)
- Nathaniel B Meyer
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
- Kara Gaetke-Udager
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109; Michigan Radiology Quality Collaborative, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
- Kimberly L Shampain
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
- Amy Spencer
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
- Richard H Cohan
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
- Matthew S Davenport
- Department of Radiology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109; Michigan Radiology Quality Collaborative, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109; Department of Urology, Michigan Medicine, 1500 E Medical Center Dr, B2 A209P, Ann Arbor, MI 48109
10. Jandrey KE, Goggs R, Kerl M, Guillaumin J, Kent MS. Analysis of the first-time pass rate of the American College of Veterinary Emergency and Critical Care certifying examination (2010-2015). J Vet Emerg Crit Care (San Antonio) 2018;28:187-191. PMID: 29631327. DOI: 10.1111/vec.12715.
Abstract
OBJECTIVES To disseminate information regarding the annual pass rates for the American College of Veterinary Emergency and Critical Care (ACVECC) certifying examination, and to compare the first-time pass rates (FTPR) of ACVECC residents trained in academic and private practice settings. DESIGN Retrospective study. SETTING ACVECC examination. ANIMALS None. INTERVENTIONS None. MEASUREMENTS AND MAIN RESULTS Anonymized ACVECC examination performance data from 2010-2015 inclusive were analyzed. Overall pass rates and FTPR were calculated for all candidates and categorized by type of residency training program. The overall pass rate for all candidates was 64.3%. The median annual pass rate for the 6-year period was 63.8% (IQR 59.3-67.3%). The FTPR for residents trained in academic programs was significantly higher than for residents trained in private practice (77.1% vs 47.2%, P < 0.0001). When residents were subdivided by the species focus of their training program, there was no significant difference between academic and private practice training programs for large-animal candidates (P = 0.2), but a significant difference remained for small-animal candidates (P < 0.0001). CONCLUSIONS Between 2010 and 2015, residents trained in academic programs were significantly more likely to pass the ACVECC certifying examination than those trained in private practice programs. The causes of this difference are uncertain, likely multifactorial, and warrant further investigation.
Affiliation(s)
- Karl E Jandrey
- Departments of Surgical and Radiological Sciences, School of Veterinary Medicine, University of California-Davis, Davis, CA 95616
- Robert Goggs
- Department of Clinical Sciences, College of Veterinary Medicine, Cornell University, Ithaca, NY 14853
- Marie Kerl
- Department of Veterinary Medicine and Surgery, University of Missouri, Columbia, MO 65211
- Julien Guillaumin
- Department of Veterinary Clinical Sciences, College of Veterinary Medicine, The Ohio State University, Columbus, OH 43210
- Michael S Kent
- Departments of Surgical and Radiological Sciences, School of Veterinary Medicine, University of California-Davis, Davis, CA 95616
11. Jordan J, Linden JA, Maculatis MC, Hern HG, Schneider JI, Wills CP, Marshall JP, Friedman A, Yarris LM. Identifying the Emergency Medicine Personality: A Multisite Exploratory Pilot Study. AEM Educ Train 2018;2:91-99. PMID: 30051075. PMCID: PMC6001604. DOI: 10.1002/aet2.10078.
Abstract
OBJECTIVES This study aimed to understand the personality characteristics of emergency medicine (EM) residents and to assess consistency and variation among residency programs. METHODS In this cross-sectional study, a convenience sample of residents (N = 140) at five EM residency programs in the United States completed three personality assessments: the Hogan Personality Inventory (HPI), describing usual tendencies; the Hogan Development Survey (HDS), describing tendencies under stress or fatigue; and the Motives, Values, and Preferences Inventory (MVPI), describing motivators. Differences between EM residents and a normative population of U.S. physicians were examined with one-sample t-tests. Differences among EM residents by program were analyzed using one-way analysis of variance. RESULTS One hundred forty (100%), 124 (88.6%), and 121 (86.4%) residents completed the HPI, HDS, and MVPI, respectively. For the HPI, residents scored lower than the norms on the adjustment, ambition, learning approach, inquisitive, and prudence scales. For the HDS, residents scored higher than the norms on the cautious, excitable, reserved, and leisurely scales, but lower on the bold, diligent, and imaginative scales. For the MVPI, residents scored higher than the physician population norms on the altruistic, hedonistic, and aesthetics scales, but lower on the security and tradition scales. Residents at the five programs were similar on 22 of 28 scales, differing on one of 11 scales of the HPI (interpersonal sensitivity), two of 11 scales of the HDS (leisurely, bold), and three of 10 scales of the MVPI (aesthetics, commerce, and recognition). CONCLUSIONS Our findings suggest that the personality characteristics of EM residents differ considerably from the norm for physicians, which may have implications for medical students' choice of specialty. Additionally, results indicated that EM residents at different programs are comparable in many areas, but moderate variation in personality characteristics exists. These results may help to inform future research incorporating personality assessment into the resident selection process and the training environment.
Affiliation(s)
- Jaime Jordan
- Department of Emergency Medicine, David Geffen School of Medicine at UCLA; Department of Emergency Medicine, Harbor-UCLA Medical Center, Torrance, CA
- Judith A. Linden
- Department of Emergency Medicine, Boston University School of Medicine, Boston Medical Center, Boston, MA
- H. Gene Hern
- Department of Emergency Medicine, UCSF School of Medicine, Oakland, CA
- Alameda Health System–Highland Hospital, Oakland, CA
- Jeffrey I. Schneider
- Department of Emergency Medicine, Boston University School of Medicine, Boston Medical Center, Boston, MA
- Charlotte P. Wills
- Department of Emergency Medicine, UCSF School of Medicine, Oakland, CA
- Alameda Health System–Highland Hospital, Oakland, CA
- John P. Marshall
- Department of Emergency Medicine, Maimonides Medical Center, Brooklyn, NY
- Lalena M. Yarris
- Department of Emergency Medicine, Oregon Health and Science University, Portland, OR
12. Agarwal V, Bump GM, Heller MT, Chen LW, Branstetter BF, Amesur NB, Hughes MA. Do Residency Selection Factors Predict Radiology Resident Performance? Acad Radiol 2018;25:397-402. PMID: 29239834. DOI: 10.1016/j.acra.2017.09.020.
Abstract
RATIONALE AND OBJECTIVES The purpose of our study was to determine what information in medical student residency applications predicts radiology residency success, as defined by objective clinical performance data. MATERIALS AND METHODS We performed a retrospective cohort study of residents who entered our institution's residency program through the National Resident Matching Program as postgraduate year 2 residents and completed the program over the past 2 years. Medical school grades, selection to the Alpha Omega Alpha (AOA) Honor Society, United States Medical Licensing Examination (USMLE) scores, publication in peer-reviewed journals, and whether the applicant was from a peer institution were the variables examined. Clinical performance was determined by calculating each resident's cumulative major discordance rate for on-call cases that the resident read and for which the resident gave a preliminary interpretation. A major discordance was defined as a difference between the preliminary resident interpretation and the final attending interpretation that could immediately impact the care of the patient. A multivariate logistic regression was performed to identify significant variables. RESULTS Twenty-seven residents provided preliminary reports on call for 67,145 studies. The mean major discordance rate was 1.08% (range 0.34%-2.54%). Higher USMLE Step 1 scores, publication before residency, and election to the AOA Honor Society were all statistically significant predictors of lower major discordance rates (P values 0.01, 0.01, and <0.001, respectively). CONCLUSIONS Overall resident performance was excellent. There are predictors that help select the better-performing residents, namely higher USMLE Step 1 scores, one to two publications during medical school, and election to AOA in the junior year of medical school.