1. Collins S, Baker EB. Resident Recruitment in a New Era. Int Anesthesiol Clin 2024;62:35-46. PMID: 38855840. DOI: 10.1097/aia.0000000000000447.
Abstract
This chapter focuses on resident recruitment, recent changes to the US National Resident Matching Program, and their impact on the evaluation and ranking of applicants within the specialty of anesthesiology. Recruitment challenges are examined, along with program strategies and potential future directions. Diversity, equity, and inclusion (DEI) initiatives within the recruitment process are also discussed.
Affiliation(s)
- Stephen Collins: Department of Anesthesiology, University of Virginia Health, Charlottesville, Virginia
- E Brooke Baker: Division of Regional Anesthesiology and Acute Pain Medicine, Department of Anesthesiology and Critical Care Medicine; Chief, Faculty Affairs and DEI; Executive Physician for Claims Management, UNM Hospital System
2. Toale C, Morris M, Gross S, O'Keeffe DA, Ryan DM, Boland F, Doherty EM, Traynor OJ, Kavanagh DO. Performance in Irish Selection and Future Performance in Surgical Training. JAMA Surg 2024;159:538-545. PMID: 38446454. PMCID: PMC10918576. DOI: 10.1001/jamasurg.2024.0034.
Abstract
Importance: Selection processes for surgical training should aim to identify candidates who will become competent independent practitioners, and should aspire to high standards of reliability and validity.
Objective: To determine the association between measured candidate factors at the time of selection in Ireland and assessment outcomes in surgical training, examined via rate of progression to Higher Specialist Training (HST), attrition rates, and performance as assessed through a multimodal framework of workplace-based and simulation-based assessments.
Design, Setting, and Participants: This retrospective observational cohort study included data from all successful applicants to the Royal College of Surgeons in Ireland (RCSI) national Core Surgical Training (CST) program. Participants included all trainees recruited to dedicated postgraduate surgical training from 2016 to 2020. These data were analyzed from July 11, 2016, through July 10, 2022.
Exposures: Selection decisions were based on a composite score derived from technical aptitude assessments, undergraduate academic performance, and a 4-station multiple mini-interview.
Main Outcomes and Measures: Assessment data, attrition rates, and rates of progression to HST were recorded for each trainee. CST performance was assessed using workplace-based and simulation-based technical and nontechnical skill assessments. Potential associations between selection and assessment measures were explored using Pearson correlation, logistic regression, and multiple linear regression analyses.
Results: Data were available for 303 trainees. Composite scores were positively associated with progression to HST (odds ratio [OR], 1.09; 95% CI, 1.05-1.13). There was a weak positive correlation, ranging from 0.23 to 0.34, between scores and performance across all CST assessments. Multivariable linear regression analysis showed that technical aptitude scores at application were associated with future operative performance assessment scores, both in the workplace (β = 0.31; 95% CI, 0.14-0.48) and in simulated environments (β = 0.57; 95% CI, 0.33-0.81). There was evidence that the interpersonal-skills interview station was associated with future performance in simulated communication skill assessments (β = 0.55; 95% CI, 0.22-0.87).
Conclusions and Relevance: In this study, performance at the time of national selection in Ireland, measured across technical and nontechnical domains in a multimodal fashion, was associated with future performance in the workplace and in simulated environments. Future studies will be required to explore the consequential validity of selection, including potential unintended effects of selection and ranking on candidate performance.
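Correlation figures like those above come from a plain Pearson computation over paired scores. A minimal sketch in Python, using made-up numbers rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical composite selection scores and a later assessment score.
scores = [55, 60, 62, 70, 71, 75, 80, 82, 88, 90]
assess = [50, 64, 58, 66, 72, 60, 74, 70, 69, 80]
r = pearson_r(scores, assess)
```

The same helper underlies the weak (0.23-0.34) correlations reported; only the magnitude of `r` differs with the data.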
Affiliation(s)
- Conor Toale, Marie Morris, Sara Gross, Dara A O'Keeffe, Donncha M Ryan, Eva M Doherty, Oscar J Traynor: Department of Surgical Affairs, RCSI University of Medicine and Health Sciences, Royal College of Surgeons in Ireland, Dublin, Ireland
- Fiona Boland: Data Science Centre, RCSI University of Medicine and Health Sciences, Royal College of Surgeons in Ireland, Dublin, Ireland
3. Gazit N, Ben-Gal G, Eliashar R. Development and validation of an objective virtual reality tool for assessing technical aptitude among potential candidates for surgical training. BMC Med Educ 2024;24:286. PMID: 38486166. PMCID: PMC10941473. DOI: 10.1186/s12909-024-05228-1.
Abstract
BACKGROUND: Good technical skills are crucial for surgeons. Yet although surgical training programs strive to assess technical aptitude when selecting surgical residents, valid assessments of such aptitude are still lacking. Surgical simulators have been proposed as a potentially effective tool for this purpose. The current study aims to develop a technical aptitude test using a virtual reality surgical simulator, and to validate its use for the selection of surgical residents.
METHODS: The study had three phases. In Phase 1, we developed an initial version of the technical aptitude test using the Lap-X-VR laparoscopic simulator. In Phases 2 and 3 we refined the test and collected empirical data to evaluate four main sources of validity evidence (content, response process, internal structure, and relationships with other variables), as well as the feasibility and acceptability of the test. Specifically, Phase 2 comprised a review of the test by 30 senior surgeons, and in Phase 3 a revised version of the test was administered to 152 interns to determine its psychometric properties.
RESULTS: Both the surgeons and the interns rated the test as highly relevant for selecting surgical residents. Analyses of the data obtained from the trial administration of the test supported the appropriateness of the score calculation process and showed good psychometric properties, including reliability (α = 0.83) and task discrimination (mean discrimination = 0.5, SD = 0.1). The correlations between test scores and background variables revealed significant correlations with gender, surgical simulator experience, and video game experience (all p < 0.001). These variables, however, together explained only 10% of the variance in test scores.
CONCLUSIONS: We describe the systematic development of an innovative virtual reality test for assessing technical aptitude in candidates for surgical training, and present evidence for its validity, feasibility, and acceptability. Further validation is required to support the use of the test for selection, and to discern the impact of gender, surgical simulator experience, and video game experience on the fairness of test results. However, the test appears to be a promising tool that may help training programs assess the suitability of candidates for surgical training.
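The reliability figure (α = 0.83) is a Cronbach's alpha over the test's task scores. A minimal sketch of that computation, on hypothetical item-score data (not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns
    (one list per item, same respondents in the same order)."""
    k = len(items)

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three hypothetical simulator tasks scored for five candidates.
tasks = [
    [4, 3, 5, 2, 4],
    [5, 3, 4, 2, 5],
    [4, 2, 5, 3, 4],
]
alpha = cronbach_alpha(tasks)  # ≈ 0.886 for this toy data
```

Higher inter-task consistency pushes alpha toward 1; the study's 0.83 indicates good internal consistency across tasks.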
Affiliation(s)
- Noa Gazit: Department of Prosthodontics, Hadassah Medical Center, Faculty of Dental Medicine, Hebrew University of Jerusalem, Jerusalem, Israel; Department of Otolaryngology/HNS, Hadassah Medical Center, Faculty of Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
- Gilad Ben-Gal: Department of Prosthodontics, Hadassah Medical Center, Faculty of Dental Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
- Ron Eliashar: Department of Otolaryngology/HNS, Hadassah Medical Center, Faculty of Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
4. Tooley AA, Law J, Lelli GJ, Sun G, Godfrey KJ, Tran AQ, Kim E, Solomon JM, Chen JJ, Khan AR, Wayman L, Olson JH, Lee MS, Harrison AR, Espinoza GM, Davitt BV, Tao J, Hodge DO, Barkmeier AJ. Predictors of Ophthalmology Resident Performance From Medical Student Application Materials. J Surg Educ 2024;81:151-160. PMID: 38036387. DOI: 10.1016/j.jsurg.2023.10.003.
Abstract
OBJECTIVE: To determine whether elements in ophthalmology residency applications are predictors of future resident performance.
DESIGN: This multi-institutional, cross-sectional, observational study retrospectively reviewed the residency application materials of ophthalmology residents who graduated from residency from 2006 through 2018. Resident performance was scored by 2 faculty reviewers in 4 domains (clinical, surgical, academic, and global performance). Correlation between specific elements of the residency application and resident performance was assessed by Spearman correlation coefficients (univariate) and linear regression (multivariate) for continuous variables, and by logistic regression (multivariate) for categorical variables.
SETTING: Seven ophthalmology residency programs in the US.
PARTICIPANTS: Ophthalmology residents who graduated from their residency program.
RESULTS: High-performing residents were a diverse group in terms of sex, ethnicity, visa status, and educational background. Residents with United States Medical Licensing Examination Step 1 scores higher than the national average for that year had significantly higher scores in all 4 performance domains than those who scored at or below the mean (all domains P < 0.05). Residents who had honors in at least 4 core clerkships and who were members of the Alpha Omega Alpha Medical Honor Society also had higher scores in all 4 performance domains (all domains P ≤ 0.04). Step 1 score (ρ = 0.26, P < 0.001) and the difference between the Step 1 score and the national average for that year (ρ = 0.19, P = 0.009) positively correlated with total resident performance scores. Residents who passed the American Board of Ophthalmology Written Qualifying Examination or Oral Examination on their first attempt had significantly higher Step 1/2 scores (P ≤ 0.005), Ophthalmology Knowledge Assessment Program scores (P = 0.001), and resident performance scores (P ≤ 0.004).
CONCLUSIONS: In this new landscape of increasing numbers of applicants to residency programs and the change of Step 1 scoring to pass/fail, our findings may help guide selection committees as they holistically review applicants to select exceptional future residents in ophthalmology.
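The univariate ρ values above are Spearman coefficients, i.e. correlations computed on ranks rather than raw scores. A minimal sketch using the classic no-ties formula, on hypothetical numbers:

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation via the classic no-ties formula:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical Step 1 scores vs. total performance scores for eight residents.
step1 = [231, 245, 252, 238, 260, 249, 227, 255]
perf = [3.1, 3.4, 3.9, 3.0, 3.8, 3.2, 2.9, 3.6]
rho = spearman_rho(step1, perf)
```

With tied values a mid-rank correction is needed; the sketch omits it for brevity.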
Affiliation(s)
- Andrea A Tooley, John J Chen, Amir R Khan: Department of Ophthalmology, Mayo Clinic, Rochester, Minnesota
- Janice Law, Laura Wayman: Department of Ophthalmology, Vanderbilt University, Nashville, Tennessee
- Gary J Lelli, Grace Sun, Kyle J Godfrey, Ann Q Tran: Department of Ophthalmology, Weill Cornell Medical College, New York, New York
- Eleanore Kim, Joel M Solomon: Department of Ophthalmology, New York University, New York, New York
- Joshua H Olson, Michael S Lee, Andrew R Harrison: Department of Ophthalmology and Visual Neurosciences, University of Minnesota, Minneapolis, Minnesota
- Bradley V Davitt: Department of Ophthalmology, Saint Louis University, St. Louis, Missouri
- Jeremiah Tao: Gavin Herbert Eye Institute, University of California Irvine, Irvine, California
- David O Hodge: Division of Clinical Trials and Biostatistics, Mayo Clinic, Jacksonville, Florida
5. Teo JH, Chow C. How do current paediatrics residency selection criteria correlate with residency performance? Ann Acad Med Singap 2023;52:553-555. PMID: 38920207. DOI: 10.47102/annals-acadmedsg.2023157.
Abstract
The selection process for potential residents needs to be reviewed regularly and assessed for its effectiveness in selecting the best-fit residents who can achieve academic and professional excellence. Objective measures must take precedence over subjective criteria to reduce selection bias while ensuring transparency and accountability. However, the predictors of an ideal resident and of performance during residency training have been a great challenge to identify as part of the selection process. The use of results from medical school examinations and licensing examinations such as the United States Medical Licensing Examination [1-3], as well as structured interviews [4], has been reported to correlate positively with doctors' performance. A Canadian study also reported that the presence of scholarly activity did not affect match outcome, though this varies across programmes [5]. Competitive programmes like paediatrics have a vested interest in selecting the most suitable applicants who will excel as paediatric residents and emerge as holistic, high-performing paediatricians in their field [6].
Affiliation(s)
- Jia Hui Teo: Neurology Service, Department of Paediatric Medicine, KK Women's and Children's Hospital, Singapore
- Cristelle Chow: Paediatric Homecare Services, Department of Paediatric Medicine, KK Women's and Children's Hospital, Singapore
6. Schafer KR, Sood L, King CJ, Alexandraki I, Aronowitz P, Cohen M, Chretien K, Pahwa A, Shen E, Williams D, Hauer KE. The Grade Debate: Evidence, Knowledge Gaps, and Perspectives on Clerkship Assessment Across the UME to GME Continuum. Am J Med 2023;136:394-398. PMID: 36632923. DOI: 10.1016/j.amjmed.2023.01.001.
Affiliation(s)
- Katherine R Schafer, E Shen, Donna Williams: Department of Internal Medicine, Wake Forest University School of Medicine, Winston-Salem, NC
- Lonika Sood: Elson S. Floyd College of Medicine, Washington State University, Spokane
- Christopher J King: Division of Hospital Medicine, Department of Medicine, University of Colorado School of Medicine, Aurora
- Margot Cohen: Department of Medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia
- Amit Pahwa: Johns Hopkins University School of Medicine, Baltimore, Md
7. De Rosa P, Takacs EB, Wendt L, Tracy CR. Effect of Holistic Review, Interview Blinding, and Structured Questions in Resident Selection: Can we Predict Who Will Do Well in a Residency Interview? Urology 2023;173:41-47. PMID: 36603653. DOI: 10.1016/j.urology.2022.11.047.
Abstract
OBJECTIVE: To examine the urology residency application process, particularly the interview. Historically, the residency interview has been vulnerable to bias and has not been shown to predict future residency performance. Our goal is to determine the relationship between pre-interview metrics and post-interview ranking using best practices for urology resident selection, including holistic review, blinded interviews, and structured behaviorally anchored questions.
METHODS: Applications were assessed on cognitive attributes (Alpha Omega Alpha, class rank, junior-year clinical clerkship grades) and non-cognitive attributes (letters of recommendation [LOR], personal statement [PS], demographics, research, personal characteristics) by reviewers blinded to USMLE scores and photograph. Interviewers were blinded to the application other than the PS and LORs. Interviews consisted of a structured behaviorally anchored question (SBI) and an unstructured interview (UI). Odds ratios were determined comparing pre-interview and interview impressions.
RESULTS: Fifty-one applicants were included in the analysis. USMLE Step 1 score (average 245) was associated with Alpha Omega Alpha, class rank, junior-year clinical clerkship grades, and the PS. The UI score was associated with the LOR (P = .04), whereas SBI scores were not (P = .5). Faculty rank was associated with SBI, UI, and overall interview (OI) scores (P < .001). Faculty rank was also associated with the LOR. Resident impressions of interviewees were associated with faculty interview scores (P = .001) and faculty rank (P < .001).
CONCLUSION: Traditional interviews may be biased toward application materials and may be balanced with behavioral questions. While the Step 1 score does not offer additional information over other pre-interview (PI) metrics, blinded interviews may offer discriminant validity over a PI rubric.
Affiliation(s)
- Paige De Rosa, Elizabeth B Takacs, Chad R Tracy: Department of Urology, University of Iowa Hospitals & Clinics, Iowa City, Iowa
- Linder Wendt: Department of Statistics, University of Iowa, Iowa City, Iowa
8. Shirkhodaie C, Avila S, Seidel H, Gibbons RD, Arora VM, Farnan JM. The Association Between USMLE Step 2 Clinical Knowledge Scores and Residency Performance: A Systematic Review and Meta-Analysis. Acad Med 2023;98:264-273. PMID: 36512984. DOI: 10.1097/acm.0000000000005061.
Abstract
PURPOSE: With the change in Step 1 score reporting, Step 2 Clinical Knowledge (CK) may become a pivotal factor in resident selection. This systematic review and meta-analysis seeks to synthesize existing observational studies that assess the relationship between Step 2 CK scores and measures of resident performance.
METHOD: The authors searched the MEDLINE, Web of Science, and Scopus databases in 2021 using terms related to Step 2 CK. Two researchers identified studies investigating the association between Step 2 CK and measures of resident performance, and included studies if they contained a bivariate analysis examining the association of Step 2 CK scores with an outcome of interest: in-training examination (ITE) scores, board certification examination scores, select Accreditation Council for Graduate Medical Education core competency assessments, overall resident performance evaluations, or other subjective measures of performance. For outcomes investigated by 3 or more studies, pooled effect sizes were estimated with random-effects models.
RESULTS: Among 1,355 potential studies, 68 met inclusion criteria and 43 could be pooled. There was a moderate positive correlation between Step 2 CK and ITE scores (0.52; 95% CI, 0.45-0.59; P < .01), for both nonsurgical (0.59; 95% CI, 0.51-0.66; P < .01) and surgical specialties (0.41; 95% CI, 0.33-0.48; P < .01). There was a very weak positive correlation between Step 2 CK scores and subjective measures of resident performance (0.19; 95% CI, 0.13-0.25; P < .01).
CONCLUSIONS: This study found that Step 2 CK scores have a statistically significant, moderate positive association with future examination scores and a statistically significant, weak positive correlation with subjective measures of resident performance. These findings are increasingly relevant as Step 2 CK scores will likely become more important in resident selection.
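Correlation coefficients are conventionally pooled after a Fisher z-transform, weighting each study by inverse variance (var(z) = 1/(n - 3) for a correlation). The study used random-effects models; the simpler fixed-effect sketch below, on invented study data, shows only the core pooling step:

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooling of Pearson correlations via Fisher's z.
    `studies` is a list of (r, n) pairs; returns the pooled r."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)        # Fisher z-transform
        w = n - 3                # inverse of var(z) = 1/(n - 3)
        num += w * z
        den += w
    return math.tanh(num / den)  # back-transform the weighted mean z

# Hypothetical Step 2 CK vs. in-training exam correlations from three studies.
pooled = pool_correlations([(0.55, 120), (0.48, 200), (0.60, 80)])
```

A random-effects model additionally adds a between-study variance term (τ²) to each study's weight, widening the interval when studies disagree.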
Affiliation(s)
- Camron Shirkhodaie: medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4279-3251
- Santiago Avila: medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-3633-4304
- Henry Seidel: medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7364-1365
- Robert D Gibbons: professor, Center for Health Statistics and Departments of Medicine and Public Health Sciences, University of Chicago, Chicago, Illinois
- Vineet M Arora: professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4745-7599
- Jeanne M Farnan: professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-1138-9416
9. Gazit N, Ben-Gal G, Eliashar R. Using Job Analysis for Identifying the Desired Competencies of 21st-Century Surgeons for Improving Trainees Selection. J Surg Educ 2023;80:81-92. PMID: 36175291. DOI: 10.1016/j.jsurg.2022.08.015.
Abstract
OBJECTIVE: The current selection for surgical training is based on ineffective methods. To identify or develop more valid selection tools, it is first necessary to define which competencies are most important for success in contemporary surgery. The current study therefore aims to identify the competencies required for success as a surgeon in the 21st century and to evaluate their relative importance for selection for surgical training.
METHODS: A job analysis was conducted using a mixed-methods design. First, 104 senior surgeons from all surgical fields at various hospitals in Israel were interviewed about their perceptions of the competencies associated with success as a surgeon. Their answers were coded and analyzed to create a list of important competencies. Next, a larger sample of 1,102 surgeons and residents from all surgical fields completed a questionnaire in which they rated the importance of each competency on the list for success as a surgeon and for selection for surgical training in the 21st century.
RESULTS: Twenty-four competencies (5 technical skills, 6 cognitive abilities, and 13 personality characteristics) were identified in the interview analysis. Analysis of the questionnaire data revealed that all 24 competencies were perceived as important both for success as a surgeon in the 21st century and for selection for surgical training. The perceived importance of personality characteristics was higher than that of cognitive abilities (p < 0.001) and technical skills (p < 0.001). The results did not differ between surgical fields.
CONCLUSIONS: Twenty-four competencies were identified as important for 21st-century surgeons and for selection for surgical training. Although all competencies were perceived as important, personality characteristics were perceived as more important than technical skills and cognitive abilities. This updated definition of required competencies may aid in developing more valid methods for selecting candidates for surgical training.
Affiliation(s)
- Noa Gazit: Department of Prosthodontics, Hadassah Medical Center, Faculty of Dental Medicine, Hebrew University of Jerusalem, Jerusalem, Israel; Department of Otolaryngology/HNS, Hadassah Medical Center, Faculty of Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
- Gilad Ben-Gal: Department of Prosthodontics, Hadassah Medical Center, Faculty of Dental Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
- Ron Eliashar: Department of Otolaryngology/HNS, Hadassah Medical Center, Faculty of Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
10. Berk GA, Ho TD, Stack-Pyle TJ, Zeatoun A, Kong KA, Chaskes MB, Thorp BD, Ebert CS, DeMason CE, Kimple AJ, Senior BA. The next step: Replacing step 1 as a metric for residency application. Laryngoscope Investig Otolaryngol 2022;7:1756-1761. PMID: 36544915. PMCID: PMC9764748. DOI: 10.1002/lio2.947.
Abstract
Objective: As of January 2022, USMLE Step 1 scores are reported as pass/fail. Historically, Step 1 scores have been a critical component of residency applications, representing one of the few metrics standardized across all applicants independent of the school they attended. Competitive specialties such as otolaryngology routinely receive 100+ applicants for each residency spot and have used Step 1 as a screening tool. This study seeks to assess quantifiable metrics in the application that highly competitive residency programs could use for screening in place of Step 1 scores.
Methods: Otolaryngology applications to an academic medical center for the 2019-20 and 2020-21 ERAS cycles were reviewed. Board scores and quantitative research data were extracted. The relationships between Step 1 score and the other metrics were examined by computing Pearson's correlation coefficients and building regression models. Similar analyses were done separately for three score tiers defined by Step 1 cutoffs at 220 and 250 points.
Results: Step 2 score was the only variable that had a meaningful correlation with Step 1 score (R = .67, p < 2.2e-16). No other objective metric, such as journal articles, posters, or oral presentations, correlated with Step 1 scores.
Conclusion: Step 1 scores were moderately correlated with Step 2 scores; however, using a Step 2 cutoff instead of a Step 1 cutoff would identify a different cohort of applicants for interview. No other quantifiable application metric had a positive correlation. In future match cycles, highly competitive residency programs will need to adopt new methods to screen candidates.
Level of Evidence: 3.
Affiliation(s)
- Garrett A. Berk, Tiffany D. Ho, Taylor J. Stack-Pyle, Abdullah Zeatoun, Keonho A. Kong, Mark B. Chaskes, Brian D. Thorp, Charles S. Ebert, Christine E. DeMason, Adam J. Kimple, Brent A. Senior: Department of Otolaryngology—Head & Neck Surgery, The University of North Carolina, Chapel Hill, North Carolina, USA
11. Lund S, D'Angelo JD, Baloul M, Yeh VJH, Stulak J, Rivera M. Simulation as Soothsayer: Simulated Surgical Skills MMIs During Residency Interviews are Associated With First Year Residency Performance. J Surg Educ 2022;79:e235-e241. PMID: 35725725. DOI: 10.1016/j.jsurg.2022.06.002.
Abstract
OBJECTIVE The main consideration during residency recruitment is identifying applicants who will succeed during residency. However, few studies have identified applicant characteristics that are associated with competency development during residency, such as the Accreditation Council for Graduate Medical Education milestones. As mini multiple interviews (MMIs) can be used to assess various competencies, we aimed to determine if simulated surgical skills MMI scores during a general surgery residency interview were associated with Accreditation Council for Graduate Medical Education milestone ratings at the conclusion of intern year. DESIGN Retrospective cohort study. Interns' Step 1 and 2 clinical knowledge (CK) scores, interview day simulated surgical skills MMI overall score, traditional faculty interview scores, average overall milestone ratings in the spring of residency, and intern American Board of Surgery In-Training Examination (ABSITE) percentile scores were gathered. Two multiple linear regression were performed analyzing the association between Step 1, Step 2 CK, MMI, and traditional faculty interview scores with (1) average overall milestone rating and (2) ABSITE percentile scores, controlling for categorical/preliminary intern classification. SETTING One academic medical center PARTICIPANTS: General surgery interns matriculating in 2020-2021 RESULTS: Nineteen interns were included. Multiple linear regression revealed that higher overall simulated surgical skills MMI score was associated with higher average milestone ratings (β = .45, p = 0.03) and higher ABSITE score (β = .43, p = 0.02) while neither Step 1, Step 2 CK, nor faculty interview scores were significantly associated with average milestone ratings. CONCLUSIONS Surgical residency programs invest a tremendous amount of effort into training residents, thus metrics for predicting applicants that will succeed are needed. 
Higher scores on simulated surgical skills MMIs are associated with higher milestone ratings 1 year into residency and with higher intern ABSITE percentiles. These results highlight simulated surgical skills MMIs as an additional metric that may help select residents who will have early success in residency.
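The analysis described above — a regression of an outcome on several differently scaled selection metrics, reported as standardized betas — can be sketched as follows. This is an illustrative reconstruction on simulated data, not the study's actual data or code; every variable name and value below is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intern cohort: four selection-time predictors and one outcome.
n = 19                                   # cohort size matching the study's n
mmi = rng.normal(70, 10, n)              # simulated MMI overall scores
step1 = rng.normal(235, 15, n)           # simulated Step 1 scores
step2 = rng.normal(245, 15, n)           # simulated Step 2 CK scores
interview = rng.normal(80, 8, n)         # simulated faculty interview scores
milestone = 0.04 * mmi + rng.normal(0, 0.5, n)  # outcome driven by MMI by construction

def zscore(x):
    return (x - x.mean()) / x.std()

# Standardizing predictors and outcome makes the fitted coefficients
# standardized betas, comparable across differently scaled predictors.
X = np.column_stack([zscore(v) for v in (mmi, step1, step2, interview)])
y = zscore(milestone)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["MMI", "Step 1", "Step 2 CK", "Interview"], beta):
    print(f"{name}: standardized beta = {b:+.2f}")
```

A real analysis would also include an intercept and the categorical/preliminary indicator and report p-values (e.g., via a statistics package); this sketch only shows why standardized betas allow differently scaled predictors to be compared.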
Affiliation(s)
- Sarah Lund
- Mayo Clinic Department of Surgery, Rochester, Minnesota
- Vicky J-H Yeh
- Mayo Clinic Department of Surgery, Rochester, Minnesota
- John Stulak
- Mayo Clinic Department of Cardiovascular Surgery, Rochester, Minnesota
- Mariela Rivera
- Mayo Clinic Division of Trauma, Critical Care, and General Surgery, Rochester, Minnesota

12
Wu JW, Cheng HM, Huang SS, Liang JF, Huang CC, Shulruf B, Yang YY, Chen CH, Hou MC, Huey-Herng Sheu W. Medical school grades may predict future clinical competence. J Chin Med Assoc 2022; 85:909-914. [PMID: 36150103 DOI: 10.1097/jcma.0000000000000782]
Abstract
BACKGROUND In real-world medical education, there is a lack of reliable predictors of future clinical competencies. We therefore aimed to identify factors associated with clinical competencies and to construct a prediction model to identify "improvement required" trainees. METHODS We analyzed data from medical students who graduated from National Yang-Ming University with clerkship training and participated in the postgraduate year (PGY) interview at Taipei Veterans General Hospital. Clinical competencies were evaluated using grades on national objective structured clinical examinations (OSCEs). Medical students who graduated in July 2018 served as the derivation cohort (n = 50) and those who graduated in July 2020 (n = 56) as the validation cohort. RESULTS Medical school grades were associated with performance on the national OSCEs (Pearson r = 0.34, p = 0.017), whereas the grades of the structured PGY interviews were only marginally associated with the national OSCEs (Pearson r = 0.268, p = 0.06). A prediction model was constructed to identify "improvement required" trainees, defined as trainees with the lowest 25% of scores on the national OSCEs. According to this model, trainees in the lowest 25% of medical school grades had a higher risk of "improvement required" clinical performance (Q1-Q3 vs Q4 = 15% vs 60%; odds ratio = 8.5; 95% confidence interval = 1.8-39.4; p = 0.029). In the validation cohort, our prediction model correctly classified 76.7% of students as "improvement required" or "nonimprovement required". CONCLUSION Our study suggests that interventions for students with unsatisfactory medical school grades are warranted to improve their clinical competencies.
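The quartile-based risk comparison reported above can be illustrated with a minimal 2×2-table odds ratio on simulated data. The study itself used a regression-based prediction model; the cohort values below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated cohort: medical-school grades and national OSCE scores,
# correlated by construction (all values are invented for illustration).
grades = rng.normal(75, 8, 200)
osce = 0.5 * grades + rng.normal(0, 5, 200)

# "Improvement required" = lowest 25% of OSCE scores; the exposure is
# being in the lowest 25% of medical-school grades.
low_osce = osce <= np.quantile(osce, 0.25)
low_grades = grades <= np.quantile(grades, 0.25)

# 2x2 table and odds ratio: odds of a poor OSCE among bottom-quartile
# students versus everyone else.
a = np.sum(low_grades & low_osce)    # exposed, outcome present
b = np.sum(low_grades & ~low_osce)   # exposed, outcome absent
c = np.sum(~low_grades & low_osce)   # unexposed, outcome present
d = np.sum(~low_grades & ~low_osce)  # unexposed, outcome absent
odds_ratio = (a * d) / (b * c)
print(f"odds ratio = {odds_ratio:.1f}")
```

With grades and OSCE scores positively correlated, the bottom grade quartile is over-represented among "improvement required" trainees, so the odds ratio comes out well above 1, mirroring the direction of the study's finding.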
Affiliation(s)
- Jr-Wei Wu
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Clinical Innovation Center, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- Hao-Min Cheng
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Center for Evidence-based Medicine, Taipei Veterans General Hospital, ROC
- Shiau-Shian Huang
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Jen-Feng Liang
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Chia-Chang Huang
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Division of Clinical Skills Training Center, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- Boaz Shulruf
- University of New South Wales, Sydney, Australia
- Ying-Ying Yang
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Clinical Innovation Center, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- Division of Clinical Skills Training Center, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- Chen-Huan Chen
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Ming-Chih Hou
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Division of Gastroenterology and Hepatology, Department of Medicine, Taipei Veterans General Hospital, Taipei, ROC
- Wayne Huey-Herng Sheu
- College of Medicine, National Yang Ming Tung University, Taipei, Taiwan, ROC
- Section of Endocrinology and Metabolism, Department of Medicine, Taipei Veterans General Hospital, Taipei, Taiwan, ROC
- Institute of Medical Technology, College of Life Science, National Chung-Hsing University, Taichung, Taiwan, ROC

13
Mun F, Jeong S, Juliano PJ, Hennrikus WL. Perceptions of USMLE Step 1 Pass/Fail Score Reporting Among Orthopedic Surgery Residency Program Directors. Orthopedics 2022; 45:e30-e34. [PMID: 34846244 DOI: 10.3928/01477447-20211124-08]
Abstract
The United States Medical Licensing Examination (USMLE) Step 1 examination will transition from graded to pass/fail scoring starting no earlier than January 2022, and orthopedic surgery residency programs will need to adapt to this change. The goal of this study was to investigate the perceptions of orthopedic surgery residency program directors on the change of Step 1 from a graded to a pass/fail examination. We also investigated how the change would affect the other factors typically considered in the selection of orthopedic surgery residents. A survey was distributed to 161 directors of allopathic orthopedic surgery programs; contact information was obtained from a national database. Of those contacted, 75 (46.6%) program directors responded. Most (85.3%) did not support the pass/fail change. Most believe that greater importance will be placed on the Step 2 Clinical Knowledge examination (96.0%), audition electives with their department (84.0%), personal knowledge of the applicant (78.7%), grades (74.7%), letters of recommendation from recognizable orthopedic surgeons (74.7%), and Alpha Omega Alpha status (69.3%). Most also believe that this change will advantage allopathic students who attend highly regarded schools (58.7%). Most program directors support a graded preclinical curriculum (69.3%) and caps on the number of orthopedic surgery residency applications (70.7%). Although most orthopedic surgery program directors disagree with the change to a pass/fail Step 1 examination, residency programs will need to reevaluate how they screen applicants for an interview once the scored Step 1 is no longer available. With this change, other factors, such as the Step 2 score, audition rotations, and clerkship grades, will be weighted more heavily.
14
Rahayu GR, Findyartini A, Riskiyana R, Thadeus MS, Meidianawaty V, Sari SM, Puspadewi N, Bekti RS, Hermasari BK, Sudarso S, Utami AE, Kusumawati W. Stakeholders' Views and Confidence Towards Indonesian Medical Doctor National Competency Examination: A Qualitative Study. J Multidiscip Healthc 2021; 14:3411-3420. [PMID: 34938080 PMCID: PMC8685446 DOI: 10.2147/jmdh.s336965]
Abstract
Introduction Indonesia is a large country of 33 provinces with different characteristics, and there are 83 medical schools across Indonesia with different accreditation statuses. The Indonesia Medical Doctor National Competency Examination (IMDNCE) was established to control the quality of medical school graduates, and its implementation needs to be evaluated to determine its impact. To date, no research in Indonesia has explored stakeholders' perceptions of the IMDNCE. This study aimed to explore how stakeholders in Indonesia perceived the impact of the IMDNCE on the performance of medical school graduates in clinical practice. Methods and Study Participants A qualitative study with a phenomenological approach was conducted to investigate the perceptions of stakeholders, including representatives from consumer organizations, the National Health Coverage, the Ministry of Health, and the Indonesian Medical Association, as well as employers (hospital and health center directors), clinical supervisors, and patients across Indonesia. Data were obtained through focus group discussions (FGDs) and interviews and analyzed thematically. Results A total of 90 participants took part in the study, including 10 representatives of consumer watchdog organizations, the National Health Coverage, the Ministry of Health, and the Indonesian Medical Association; 31 employers; 32 professionals; and 17 patients. The study found three general themes representing stakeholders' perceptions of the performance of medical school graduates in clinical practice: the IMDNCE as an effort to standardize doctor graduates in Indonesia, IMDNCE results as a means of reflecting the quality of medical education in Indonesia, and the IMDNCE as an effort to improve health services in Indonesia through the quality of graduates.
Conclusion In general, the stakeholders perceived that the IMDNCE was able to standardize graduates from various medical schools across Indonesia. However, the IMDNCE needs to be further developed to maximize its potential in improving the competencies of Indonesian medical students.
Affiliation(s)
- Gandes Retno Rahayu
- Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Rilani Riskiyana
- Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Vivi Meidianawaty
- Faculty of Medicine, Universitas Swadaya Gunung Jati, Cirebon, Indonesia
- Natalia Puspadewi
- School of Medicine and Health Sciences, Atma Jaya Catholic University of Indonesia, Jakarta, Indonesia
- Wiwik Kusumawati
- Faculty of Medicine and Health Science, Universitas Muhammadiyah Yogyakarta, Yogyakarta, Indonesia

15
Lu M, Farhat JH, Beck Dallaghan GL. Enhanced Learning and Retention of Medical Knowledge Using the Mobile Flash card Application Anki. Med Sci Educ 2021; 31:1975-1981. [PMID: 34956708 PMCID: PMC8651966 DOI: 10.1007/s40670-021-01386-9]
Abstract
INTRODUCTION As medical schools condense the basic science phase of undergraduate medical education, it has become increasingly important to identify methods and tools that facilitate learning, mastery, and application of medical knowledge. One increasingly popular tool that promotes engagement with content is Anki, a web-based flash card system. Using Anki, medical students can access pre-made flash cards specifically tailored to prepare them for the United States Medical Licensing Exam (USMLE) Step 1 exam. The objective of this study was to characterize Anki use and its association with USMLE Step 1 performance. METHODS In March 2020, medical students in years 2, 3, and 4 who had completed USMLE Step 1 were administered a survey measuring Anki usage. The survey was locally developed and reviewed by survey experts on campus. Survey responses were paired with USMLE Step 1 results, and descriptive and inferential statistics were used for analysis. RESULTS Anki usage was associated with higher USMLE Step 1 scores. Additionally, among those who used Anki, more consistent use was associated with higher USMLE Step 1 scores and higher perceived knowledge retention. CONCLUSIONS This research suggests that Anki is an effective educational tool that should be recommended to medical students alongside other evidence-based study tools, such as the popular question bank USMLE World. Future research should attempt to identify a relationship between Anki usage and future clinical performance to demonstrate the implications that Anki has for clinical skills. SUPPLEMENTARY INFORMATION The online version contains supplementary material available at 10.1007/s40670-021-01386-9.
Affiliation(s)
- Matthew Lu
- University of North Carolina School of Medicine, Chapel Hill, NC, USA
- John H. Farhat
- University of North Carolina School of Medicine, Chapel Hill, NC, USA
- Gary L. Beck Dallaghan
- University of North Carolina School of Medicine, 108 Taylor Hall, CB 7321, Chapel Hill, NC 27599, USA

16
Pershing S, Stell L, Fisher AC, Goldberg JL. Implicit Bias and the Association of Redaction of Identifiers With Residency Application Screening Scores. JAMA Ophthalmol 2021; 139:1274-1282. [PMID: 34673889 DOI: 10.1001/jamaophthalmol.2021.4323]
Abstract
Importance Diversity in the ophthalmology profession is important when providing care for an increasingly diverse patient population. However, implicit bias may inadvertently disadvantage underrepresented applicants during resident recruitment and selection. Objective To evaluate the association of the redaction of applicant identifiers with review scores on ophthalmology residency applications, as an intervention to address implicit bias. Design, Setting, and Participants In this quality improvement study, 46 faculty members reviewed randomized sets of 462 redacted and unredacted applications from a single academic institution during the 2019-2020 ophthalmology residency application cycle. Interventions Applications were electronically redacted for applicant identifiers, including name, sex or gender, race and ethnicity, and related terms. Main Outcomes and Measures The main outcome was the distribution of scores on redacted and unredacted applications, stratified by applicant sex, underrepresented in medicine (URiM; traditionally comprising American Indian or Alaskan Native, Black, and Hispanic individuals) status, and international medical graduate (IMG) status; β coefficients for redaction and for applicant and reviewer characteristics were calculated. Applications were scored on a scale of 1 to 9, where 1 was the best score and 9 the worst. Scores were evaluated for a significant difference based on redaction among female, URiM, and IMG applicants. Linear regression was used to evaluate the adjusted association of redaction, self-reported applicant characteristics, and reviewer characteristics with application scores. Results In this study, 277 applicants (60.0%) were male and 71 (15.4%) had URiM status; 32 faculty reviewers (69.6%) were male and 2 (4.3%) had URiM status.
The distribution of scores was similar for redacted vs unredacted applications, with no difference based on sex, URiM status, or IMG status. Applicant sex, URiM status, and IMG status had no association with scores in multivariable analysis (sex, β = -0.08; 95% CI, -0.32 to 0.15; P = .26; URiM status, β = -0.03; 95% CI, -0.36 to 0.30; P = .94; and IMG status, β = 0.39; 95% CI, -0.24 to 1.02; P = .35). In adjusted regression, redaction was not associated with differences in scores (β = -0.06 points on a 1-9 scale; 95% CI, -0.22 to 0.10 points; P = .48). Factors most associated with better scores were attending a top 20 medical school (β = -1.06; 95% CI, -1.37 to -0.76; P < .001), holding an additional advanced degree (β = -0.86; 95% CI, -1.22 to -0.50; P < .001), and a higher United States Medical Licensing Examination Step 1 score (β = -0.35 per 10-point increase; 95% CI, -0.45 to -0.26; P < .001). Conclusions and Relevance This quality improvement study did not detect an association between redaction of applicant characteristics on ophthalmology residency applications and application review scores among underrepresented candidates at this institution. Although the study may not have been powered adequately to find a difference, these findings suggest that approaches beyond redaction may be needed to enhance diversity, including pipeline programs, implicit bias training, diversity-centered culture and priorities, and targeted applicant outreach. Programs may adapt this study design to probe their own application screening biases and track them over time, before and after bias-related interventions.
Affiliation(s)
- Suzann Pershing
- Byers Eye Institute, Department of Ophthalmology, Stanford University School of Medicine, Palo Alto, California
- Ophthalmology and Eye Care Services, Veterans Affairs Palo Alto Health Care System, Palo Alto, California
- Laurel Stell
- Byers Eye Institute, Department of Ophthalmology, Stanford University School of Medicine, Palo Alto, California
- Biomedical Data Science, Stanford University School of Medicine, Palo Alto, California
- A Caroline Fisher
- Byers Eye Institute, Department of Ophthalmology, Stanford University School of Medicine, Palo Alto, California
- Jeffrey L Goldberg
- Byers Eye Institute, Department of Ophthalmology, Stanford University School of Medicine, Palo Alto, California
- Ophthalmology and Eye Care Services, Veterans Affairs Palo Alto Health Care System, Palo Alto, California

17
Withers C, Noble C, Brandenburg C, Glasziou PP, Stehlik P. Selection criteria for Australian and New Zealand medical specialist training programs: another under-recognised driver of research waste. Med J Aust 2021; 215:336-336.e1. [PMID: 34494269 DOI: 10.5694/mja2.51250]
Affiliation(s)
- Caitlyn Withers
- Gold Coast Hospital and Health Service, Gold Coast, QLD
- Griffith University, Gold Coast, QLD
- Paul P Glasziou
- Institute for Evidence-Based Healthcare, Bond University, Gold Coast, QLD
- Paulina Stehlik
- Gold Coast Hospital and Health Service, Gold Coast, QLD
- Institute for Evidence-Based Healthcare, Bond University, Gold Coast, QLD

18
Kortz MW, Vegas A, Moore SP, McCray E, Mureb MC, Bernstein JE, May J, Bishop B, Frydenlund M, Dobson JR. National Resident Matching Program Performance Among US MD and DO Seniors in the Early Single Accreditation Graduate Medical Education Era. Cureus 2021; 13:e17319. [PMID: 34557365 PMCID: PMC8449856 DOI: 10.7759/cureus.17319]
Abstract
Introduction: As of the 2020 National Resident Matching Program (NRMP) cycle, nearly all applicants are evaluated together for graduate medical education (GME) candidacy. We set out to characterize US MD and DO Senior residency match performance in the single-accreditation GME era. Methods: A retrospective study was conducted in 2021 using data collected from the 2018 and 2020 NRMP Charting Outcomes in the Match publications, aggregated and subdivided into three groups based on specialty competitiveness: low (LC), moderate (MC), and high (HC). Nonparametric analysis was performed using chi-square tests, or Fisher exact tests when counts were less than five. Significance was set at p < 0.05. Results: A total of 46,853 candidates were included, with 36,194 (77.3%) US MD and 10,659 (22.7%) DO Seniors. Match rates for US DO Seniors were lower than for US MD Seniors across all competitiveness strata (p < 0.0001). Research item production, national licensing examination scores, and mean number of contiguous programs ranked were lower for matched US DO Seniors than for matched US MD Seniors, with significant differences depending on competitiveness group. Conclusions: With recent changes to GME and its application process, understanding how various groups compare will be increasingly important. US DO Seniors have lower first-rank match rates at all specialty competitiveness levels, which may be due to lower research output or nuanced specialty selection. This study could aid GME stakeholders in allocating resources more effectively and in better preparing residency candidates.
Affiliation(s)
- Michael W Kortz
- Neurosurgery, University of Colorado School of Medicine, Aurora, USA
- Osteopathic Medicine, Kansas City University College of Osteopathic Medicine, Kansas City, USA
- Austin Vegas
- Osteopathic Medicine, Campbell University School of Osteopathic Medicine, Buies Creek, USA
- Sean P Moore
- Osteopathic Medicine, Kansas City University College of Osteopathic Medicine, Kansas City, USA
- Edwin McCray
- Osteopathic Medicine, Campbell University School of Osteopathic Medicine, Buies Creek, USA
- Monica C Mureb
- Neurosurgery, New York Medical College Westchester Medical Center, Westchester, USA
- Jacob E Bernstein
- Neurosurgery, Riverside University Health System Medical Center, Moreno Valley, USA
- Joshua May
- Osteopathic Medicine, Kansas City University College of Osteopathic Medicine, Kansas City, USA
- Brandon Bishop
- Osteopathic Medicine, Kansas City University College of Osteopathic Medicine, Kansas City, USA
- John R Dobson
- Pathology, Kansas City University College of Osteopathic Medicine, Kansas City, USA

19
Hughes RH, Kleinschmidt S, Sheng AY. Using structured interviews to reduce bias in emergency medicine residency recruitment: Worth a second look. AEM Educ Train 2021; 5:S130-S134. [PMID: 34616987 PMCID: PMC8480396 DOI: 10.1002/aet2.10562]
Affiliation(s)
- Alexander Y. Sheng
- Department of Emergency Medicine, Boston Medical Center, Boston, MA, USA
- Boston University School of Medicine, Boston, MA, USA

20
A transparent and defensible process for applicant selection within a Canadian emergency medicine residency program. Can J Emerg Med 2021; 22:215-223. [PMID: 31941560 DOI: 10.1017/cem.2019.460]
Abstract
OBJECTIVES The Canadian Resident Matching Service (CaRMS) selection process has come under scrutiny due to the increasing number of unmatched medical graduates. In response, we outline our residency program's selection process, including how we have incorporated best practices and novel techniques. METHODS We selected file reviewers and interviewers to mitigate gender bias and increase diversity. Four residents and two attending physicians rated each file using a standardized, cloud-based file review template that allowed simultaneous rating. We interviewed applicants using four standardized stations with two or three interviewers per station. We used heat maps to review rating discrepancies and removed between-rater variance using Z-scores. The number of person-hours required to conduct our selection process was quantified, and the process outcomes were described statistically and graphically. RESULTS We received between 75 and 90 CaRMS applications during each application cycle between 2017 and 2019. Our overall process required 320 person-hours annually, excluding attendance at social events and administrative assistant duties. Our preliminary interview and rank lists were developed using weighted Z-scores and modified through an organized discussion informed by heat-mapped data. The difference between the Z-scores of applicants surrounding the interview invitation threshold was 0.18-0.3 standard deviations. Interview performance significantly impacted the final rank list. CONCLUSIONS We describe a rigorous resident selection process for our emergency medicine training program that incorporated simultaneous cloud-based rating, Z-scores, and heat maps. This standardized approach could inform other programs looking to adopt a rigorous selection process while providing applicants guidance and reassurance of a fair assessment.
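The Z-score standardization described above — converting each reviewer's raw scores to Z-scores so that rater leniency and spread drop out, then combining file and interview components with program-chosen weights — can be sketched as follows. The weights, cohort sizes, and scores are hypothetical, not the program's actual values.

```python
import numpy as np

rng = np.random.default_rng(2)
n_reviewers, n_applicants = 4, 8

# Raw file-review scores: each reviewer row gets its own leniency offset,
# so raw means differ between reviewers.
file_scores = rng.normal(7, 1, (n_reviewers, n_applicants)) \
    + rng.normal(0, 1.5, (n_reviewers, 1))

# Z-score within each reviewer (axis=1), removing between-reviewer
# differences in mean and spread, then average per applicant.
z_within = (file_scores - file_scores.mean(axis=1, keepdims=True)) \
    / file_scores.std(axis=1, keepdims=True)
file_z = z_within.mean(axis=0)

# Assume interview-station scores have been standardized the same way.
interview_z = rng.normal(0, 1, n_applicants)

# Hypothetical weighting: 60% file review, 40% interview.
composite = 0.6 * file_z + 0.4 * interview_z
rank_order = np.argsort(-composite)  # applicant indices, best first
print(rank_order)
```

Because each reviewer's scores are centered and scaled before averaging, a systematically lenient or harsh reviewer no longer shifts any applicant's composite, which is the point of the Z-score step described in the abstract.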
21
Tseng JR, Kang YS, Youm J, Pandit R. Radiology resident selection factors predict resident performance. Clin Imaging 2021; 80:225-228. [PMID: 34352495 DOI: 10.1016/j.clinimag.2021.07.008]
Abstract
PURPOSE To determine selection factors that predict radiology resident performance. METHODS 59 consecutive radiology residents from 2002 to 2015 were ranked on performance during residency. Correlation and multiple regression analyses were performed to predict resident performance from the following selection factors: United States Medical Licensing Exam (USMLE) Step 1 score, medical school rank, Alpha Omega Alpha (AOA) membership, honors in clinical rotations, Medical Student Performance Evaluation (MSPE), and interview score. Results were compared against predictions from Match rank position. RESULTS Five selection factors showed significant or marginally significant correlations with resident performance (r = 0.2 to 0.3); the interview score was not significantly correlated. A multiple regression model comprising the USMLE Step 1 score, medical school rank, AOA membership, and interview score predicted resident performance, with an adjusted R2 of 0.19. The interview score was included in the model but did not achieve statistical significance. Match rank did not predict resident performance, with an R2 of 0.01. CONCLUSIONS A multiple regression model comprising the USMLE Step 1 score, medical school rank, and AOA membership predicted radiology resident performance and may assist with resident selection.
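As a worked illustration of the reported adjusted R², the sketch below fits an ordinary least-squares model on simulated stand-in data and applies the adjusted-R² correction for the number of predictors. None of the numbers correspond to the study's data; the cohort size and predictor count merely echo the model described above.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 59, 4  # 59 residents, 4 selection-factor predictors, as in the model above

# Simulated stand-in data: predictors and a performance outcome.
X = rng.normal(size=(n, p))
y = X @ np.array([0.3, 0.2, 0.2, 0.1]) + rng.normal(0, 1, n)

# Ordinary least squares with an intercept column.
X1 = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
resid = y - X1 @ beta

# R^2, then the adjustment that penalizes each added predictor.
r2 = 1 - resid.var() / y.var()
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.2f}, adjusted R^2 = {adj_r2:.2f}")
```

Adjusted R² is always at most R² (strictly less when R² < 1), which is why it is the fairer figure to report for a multi-predictor selection model like the one above.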
Affiliation(s)
- Jeffrey R Tseng
- Santa Clara Valley Medical Center, Department of Radiology, 751 South Bascom Avenue, San Jose, CA 95128, United States of America
- Young S Kang
- Santa Clara Valley Medical Center, Department of Radiology, 751 South Bascom Avenue, San Jose, CA 95128, United States of America
- Jiwon Youm
- Santa Clara Valley Medical Center, Department of Radiology, 751 South Bascom Avenue, San Jose, CA 95128, United States of America
- Rajul Pandit
- Santa Clara Valley Medical Center, Department of Radiology, 751 South Bascom Avenue, San Jose, CA 95128, United States of America

22
Rahil A, Hamamyh T, Al-Mohammed A, Kamel A, Abubeker I, Abu-Raddad L, Dargham S, Suliman S, Al Mohanadi D, Al Khal A. Do the selection criteria of internal medicine residency program predict resident performance? Qatar Med J 2021; 2021:20. [PMID: 34189112 PMCID: PMC8216212 DOI: 10.5339/qmj.2021.20]
Abstract
BACKGROUND Well-performing physicians reflect the success of a residency program in selecting the best candidates for training. This study aimed to evaluate the selection criteria, mainly United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge (CK) results and applicants' status as international or locally trained, used by the medical education department and the internal medicine residency program at Hamad Medical Corporation in Qatar to predict residents' performance during training. METHODS A retrospective chart review was performed for three batches of graduates who started residency training in 2011, 2012, and 2013, each of which completed 4 years of training. The USMLE Step 2 CK status of the applicant, in-training exam (ITE) scores, formative evaluation scores, Arab Board written and clinical exam pass rates, and other indicators were analyzed. Statistical analysis included chi-square tests and independent t-tests to identify associations; multivariable analyses were conducted using logistic and linear regression to test for adjusted associations. RESULTS The study included 118 internal medicine residents (81 international, 37 locally trained). The ITE score correlated positively with the USMLE Step 2 CK score across the 4 years of training (r = 0.621, 0.587, 0.576, and 0.571; p < 0.001) and among international compared with locally trained applicants (p < 0.001). The rate of passing the Arab Board part 1 and part 2 written exams was higher among international than local applicants, whereas performance on the Arab Board clinical exam and the formative evaluations was not associated with any selection criterion. CONCLUSIONS A higher USMLE Step 2 CK score correlated with better performance on the ITE but not with other performance indicators, and international applicants did better on both the ITE and the Arab Board written exams than local applicants. These variables may provide reasonable predictors of well-performing physicians.
Affiliation(s)
- Ali Rahil
- Hamad General Hospital, Doha, Qatar
- Laith Abu-Raddad
- Biomathematics Research Core, Weill Cornell Medical College, Qatar
- Soha Dargham
- Biomathematics Research Core, Weill Cornell Medical College, Qatar

23
Mun F, Scott AR, Cui D, Lehman EB, Jeong S, Chisty A, Juliano PJ, Hennrikus WL, Hennrikus EF. A comparison of orthopaedic surgery and internal medicine perceptions of USMLE Step 1 pass/fail scoring. BMC Med Educ 2021; 21:255. [PMID: 33941167 PMCID: PMC8091716 DOI: 10.1186/s12909-021-02699-4]
Abstract
BACKGROUND United States Medical Licensing Examination Step 1 will transition from numeric grading to pass/fail, sometime after January 2022. The aim of this study was to compare how program directors in orthopaedics and internal medicine perceive a pass/fail Step 1 will impact the residency application process. METHODS A 27-item survey was distributed through REDCap to 161 U.S. orthopaedic residency program directors and 548 U.S. internal medicine residency program directors. Program director emails were obtained from the American Medical Association's Fellowship and Residency Electronic Interactive Database. RESULTS We received 58 (36.0%) orthopaedic and 125 (22.8%) internal medicine program director responses. The majority of both groups disagree with the change to pass/fail, and felt that the decision was not transparent. Both groups believe that the Step 2 Clinical Knowledge exam and clerkship grades will take on more importance. Compared to internal medicine PDs, orthopaedic PDs were significantly more likely to emphasize research, letters of recommendation from known faculty, Alpha Omega Alpha membership, leadership/extracurricular activities, audition elective rotations, and personal knowledge of the applicant. Both groups believe that allopathic students from less prestigious medical schools, osteopathic students, and international medical graduates will be disadvantaged. Orthopaedic and internal medicine program directors agree that medical schools should adopt a graded pre-clinical curriculum, and that there should be a cap on the number of residency applications a student can submit. CONCLUSION Orthopaedic and internal medicine program directors disagree with the change of Step 1 to pass/fail. They also believe that this transition will make the match process more difficult, and disadvantage students from less highly-regarded medical schools. 
Both groups will rely more heavily on the Step 2 Clinical Knowledge exam score, but orthopaedic program directors will place more importance on research, letters of recommendation, Alpha Omega Alpha membership, leadership/extracurricular activities, personal knowledge of the applicant, and audition electives.
Collapse
Affiliation(s)
- Frederick Mun
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA.
| | - Alyssa R Scott
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - David Cui
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - Erik B Lehman
- Public Health Sciences at Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - Seongho Jeong
- Department of Orthopaedics and Rehabilitation, Yale School of Medicine, Yale New Haven Hospital, New Haven, CT, USA
| | - Alia Chisty
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Internal Medicine, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - Paul J Juliano
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Orthopaedics and Rehabilitation, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - William L Hennrikus
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Orthopaedics and Rehabilitation, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - Eileen F Hennrikus
- Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
- Department of Internal Medicine, Penn State College of Medicine, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| |
Collapse
|
24
|
Mun F, Scott AR, Cui D, Chisty A, Hennrikus WL, Hennrikus EF. Internal medicine residency program director perceptions of USMLE Step 1 pass/fail scoring: A cross-sectional survey. Medicine (Baltimore) 2021; 100:e25284. [PMID: 33847625 PMCID: PMC8052063 DOI: 10.1097/md.0000000000025284] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/07/2020] [Accepted: 03/08/2021] [Indexed: 01/04/2023] Open
Abstract
The United States Medical Licensing Examination Step 1 will transition to a pass/fail exam starting no earlier than January 2022. Internal medicine residency programs will need to adapt to these changes. The purpose of this study was to investigate: 1. internal medicine residency program directors’ perceptions of the change of Step 1 to a pass/fail exam, and 2. the impact on other factors considered for internal medicine residency selection. A validated REDCap survey was sent to 548 program directors at active Accreditation Council for Graduate Medical Education internal medicine residency programs. Contact information from the American Medical Association's Fellowship and Residency Electronic Interactive Database was used. The survey had 123 respondents (22.4%). Most internal medicine program directors do not support the pass/fail change. Greater importance will be placed on the Step 2 Clinical Knowledge exam, personal knowledge of the applicant, clerkship grades, and audition electives. Allopathic students from less highly regarded medical schools, as well as osteopathic and international students, will be disadvantaged. About half believe that schools should adopt a graded pre-clinical curriculum (51.2%) and that there should be residency application caps (54.5%). Internal medicine program directors mostly disagree with the pass/fail Step 1 transition. Residency programs will need to reconsider how applicants are evaluated. Other factors, such as the Step 2 Clinical Knowledge score, personal knowledge of the applicant, clerkship grades, and audition rotations, will now be emphasized more heavily.
Collapse
Affiliation(s)
| | | | - David Cui
- Pennsylvania State University College of Medicine
| | - Alia Chisty
- Pennsylvania State University College of Medicine
- Department of Internal Medicine
| | - William L. Hennrikus
- Pennsylvania State University College of Medicine
- Bone and Joint Institute, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
| | - Eileen F. Hennrikus
- Pennsylvania State University College of Medicine
- Department of Internal Medicine
| |
Collapse
|
25
|
Upchurch DA, Renberg WC. A survey of attributes of surgical resident applicants deemed important to American College of Veterinary Surgeons board-certified surgeons involved in resident selection and methods used to evaluate these attributes. Vet Surg 2021; 50:485-493. [PMID: 33645852 DOI: 10.1111/vsu.13596] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2020] [Revised: 11/24/2020] [Accepted: 12/28/2020] [Indexed: 11/29/2022]
Abstract
OBJECTIVE To determine which attributes of residency applicants were most commonly assessed by large and small animal American College of Veterinary Surgeons diplomates and to determine which evaluation methods were perceived to assess those attributes. STUDY DESIGN Online survey. SAMPLE POPULATION American College of Veterinary Surgeons board-certified surgeons as of March 2019. METHODS An online survey was sent to eligible individuals. Respondents rated the importance of 23 attributes assessed by the Veterinary Internship and Residency Matching Program (VIRMP) application as well as the usefulness of interviews, conversations with people knowledgeable about the applicants, and review of the VIRMP packet for evaluating each of these attributes. Responses were compared between large and small animal practitioners and between individuals involved in residency selection (supervisors) and individuals not involved in residency selection (nonsupervisors). RESULTS Surveys were completed by 221 individuals (14.6% response rate). Seventeen of the 23 attributes were considered important by most respondents. Grade point average (GPA) and class rank were used as screening tools by 73% and 65% of supervisors, respectively. Letters of reference (LOR) were ranked as the most important part of the VIRMP packet. Conversations with people knowledgeable about the applicant were the only method judged by most respondents to be appropriate for evaluating all 23 attributes. Responses were similar between large and small animal supervisors and nonsupervisors. CONCLUSION Respondents considered conversations with people knowledgeable about the applicant to be the most useful method for assessing a resident applicant, but LOR, GPA, and class rank were also important. IMPACT Resident applicants and mentors can use this information to strengthen applications.
Collapse
Affiliation(s)
- David A Upchurch
- Department of Clinical Sciences, College of Veterinary Medicine, Kansas State University, Manhattan, Kansas
| | - Walter C Renberg
- Department of Clinical Sciences, College of Veterinary Medicine, Kansas State University, Manhattan, Kansas
| |
Collapse
|
26
|
Busha ME, McMillen B, Greene J, Gibson K, Milnes C, Ziemkowski P. One Institution's evaluation of family medicine residency applicant data for academic predictors of success. BMC MEDICAL EDUCATION 2021; 21:84. [PMID: 33530993 PMCID: PMC7851804 DOI: 10.1186/s12909-021-02518-w] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/25/2020] [Accepted: 01/08/2021] [Indexed: 05/10/2023]
Abstract
BACKGROUND Family Medicine residencies are navigating recruitment in a changing environment. The consolidation of accreditation for allopathic and osteopathic programs, the high volume of applicants, and the forthcoming transition of the United States Medical Licensing Examination (USMLE) Step 1 to pass/fail reporting all contribute. This retrospective cohort study evaluated which components of a student's academic history best predict readiness for residency. METHODS In 2020, we analyzed applicant data and initial residency data for program graduates at a single residency program between 2013 and 2020. This included undergraduate education characteristics, medical school academic performance, medical school academic problems (including professionalism), USMLE Step examination scores, location of medical school, and assessments during the first 6 months of residency. Of 110 matriculating residents, assessment data were available for 97 (88%). RESULTS Pre-matriculation USMLE data had a positive correlation with initial American Board of Family Medicine (ABFM) in-training exams. Pre-matriculation exam data did not have a positive correlation with resident assessment across any of the six Accreditation Council for Graduate Medical Education (ACGME) competency domains. A defined cohort of residents with a history of academic struggles during medical school or failure on a USMLE exam performed statistically similarly to residents with no such history on assessments across the six ACGME competency domains. CONCLUSIONS Applicants with a history of academic problems perform similarly in the clinical environment to those without. While a positive correlation between pre-matriculation exams and the ABFM in-training exam was found, this did not extend to clinical assessments across the ACGME competency domains.
Collapse
Affiliation(s)
- Michael E Busha
- Western Michigan University Homer Stryker M.D. School of Medicine, 1000 Oakland Drive, Kalamazoo, MI, 49008, USA.
| | - Brock McMillen
- Indiana University School of Medicine, 1520 North Senate, Indianapolis, IN, 46202, USA
| | - Jeffrey Greene
- Western Michigan University Homer Stryker M.D. School of Medicine, 1000 Oakland Drive, Kalamazoo, MI, 49008, USA
| | - Kristine Gibson
- Western Michigan University Homer Stryker M.D. School of Medicine, 1000 Oakland Drive, Kalamazoo, MI, 49008, USA
| | - Charlotte Milnes
- Western Michigan University Homer Stryker M.D. School of Medicine, 1000 Oakland Drive, Kalamazoo, MI, 49008, USA
| | - Peter Ziemkowski
- Western Michigan University Homer Stryker M.D. School of Medicine, 1000 Oakland Drive, Kalamazoo, MI, 49008, USA
| |
Collapse
|
27
|
Cook AK, Creevy KE, Levine J, Arthur W. Small Animal Resident Selection Processes at a University Teaching Hospital: An Analysis and Recommendations for Improvement. JOURNAL OF VETERINARY MEDICAL EDUCATION 2021; 48:1-7. [PMID: 32163023 DOI: 10.3138/jvme.2019-0052] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Concerns regarding resident performance within a small animal department prompted a review of selection practices, with the intent of improving validity and efficiency. Information was gathered from semi-structured interviews and descriptions of current processes; emphasis was placed on determining how the Veterinary Internship and Residency Matching Program application was used. Processes were found to lack standardization and rely heavily on arbitrary judgments. In addition, faculty members expressed concerns regarding their reliability and the time spent generating candidate rankings. Suggestions for improvement were based on current practices in personnel psychology and human resource management. The need for standardization within and across specialty groups was emphasized, along with a multiple-hurdle approach in which a substantial deficit or red flag in any component results in candidate disqualification. Comprehensive recommendations were made for the selection process as follows: Each application undergoes initial administrative screening for employment eligibility and academic cut-offs; eligible applications are scored by 2-3 faculty members using defined ratings on four equally weighted pre-interview criteria (i.e., veterinary education, post-graduation experiences, personal statement, and standardized letters of reference); phone calls to colleagues with knowledge of the applicant follow specific guidelines and a rating scale; veterinary-situational structured interview questions with appropriate rating scales are used to assess candidates' standing on specified competencies identified as important for success; and the interview score is weighted equally and added to the four pre-interview components to determine the final rank. It is hoped this new approach will take less time and facilitate the selection of successful residents.
Collapse
|
28
|
Prystowsky MB, Cadoff E, Lo Y, Hebert TM, Steinberg JJ. Prioritizing the Interview in Selecting Resident Applicants: Behavioral Interviews to Determine Goodness of Fit. Acad Pathol 2021; 8:23742895211052885. [PMID: 34722866 PMCID: PMC8552388 DOI: 10.1177/23742895211052885] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/12/2021] [Revised: 08/02/2021] [Accepted: 08/07/2021] [Indexed: 12/30/2022] Open
Abstract
From our initial screening of applications, we assess that the 10% to 15% of applicants whom we will interview are all academically qualified to complete our residency training program. This initial screening to select applicants to interview includes a personality assessment provided by the personal statement, Dean's letter, and letters of recommendation that, taken together, begin our evaluation of the applicant's cultural fit for our program. While the numerical scoring ranks applicants preinterview, the final ranking into best-fit categories is determined solely on the interview day at a consensus conference by faculty and residents. We analyzed data on 819 applicants from 2005 to 2017. Most candidates were US medical graduates (62.5%), with 23.7% international medical graduates, 11.7% Doctors of Osteopathic Medicine (DO), and 2.1% Caribbean medical graduates. Given that personality assessment began with application review, there was excellent correlation between the preinterview composite score and the final categorical ranking in all 4 categories. For most comparisons, higher scores and categorical rankings were associated with applicants subsequently working in academia versus private practice. We encountered no problems using our 3-step process with virtual interviews during the COVID-19 pandemic.
Collapse
Affiliation(s)
| | - Evan Cadoff
- Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
| | - Yungtai Lo
- Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
| | - Tiffany M. Hebert
- Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
| | - Jacob J. Steinberg
- Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
| |
Collapse
|
29
|
Saudek K, Treat R, Rogers A, Hahn D, Lauck S, Saudek D, Weisgerber M. A novel faculty development tool for writing a letter of recommendation. PLoS One 2020; 15:e0244016. [PMID: 33326489 PMCID: PMC7743943 DOI: 10.1371/journal.pone.0244016] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2020] [Accepted: 11/18/2020] [Indexed: 11/29/2022] Open
Abstract
Objective Based on a national survey of program directors, we developed a letter of recommendation (LOR) scoring rubric (SR) to assess LORs submitted to a pediatric residency program. The objective was to use the SR to analyze the consistency of LOR ratings across raters and the LOR components that contributed to the impression of the LOR and of the candidate. Methods We graded 30 LORs submitted to a pediatric residency program that were evenly distributed based on final rank by our program. The SR contained 3 sections (letter features, phrases, and applicant abilities) and 2 questions, rated on a 5-point Likert scale, about the quality of the LOR (LORQ) and the impression of the candidate (IC) after reading the LOR. Inter-rater reliability was calculated with intraclass correlation coefficients (ICC(2,1)). Pearson (r) correlations and stepwise multivariate linear regression modeling predicted LORQ and IC. Mean scores of phrases, features, and applicant abilities were analyzed with ANOVA and Bonferroni correction. Results Phrases (ICC(2,1) = 0.82, p<0.001) and features (ICC(2,1) = 0.60, p<0.001) were rated consistently, while applicant abilities were not (ICC(2,1) = 0.28, p<0.001). For features, LORQ (R2 = 0.75, p<0.001) and IC (R2 = 0.58, p<0.001) were best predicted by: writing about candidates’ abilities, strength of recommendation, and depth of interaction with the applicant. For abilities, LORQ (R2 = 0.47, p<0.001) and IC (R2 = 0.51, p<0.001) were best predicted by: clinical reasoning, leadership, and communication skills (0.2). There were significant differences for phrases and features (p<0.05). Conclusions The SR was consistent across raters and correlated with the impression of LORQ and IC. This rubric has potential as a faculty development tool for writing LORs.
Collapse
Affiliation(s)
- Kris Saudek
- Division of Neonatology, Department of Pediatrics, Medical College of Wisconsin, Milwaukee, Wisconsin, United States of America
| | - Robert Treat
- Division of Neonatology, Department of Pediatrics, Medical College of Wisconsin, Milwaukee, Wisconsin, United States of America
| | - Amanda Rogers
- Department of Pediatrics, Division of Hospital Medicine, Milwaukee, Wisconsin, United States of America
| | - Danita Hahn
- Department of Pediatrics, Division of Hospital Medicine, Milwaukee, Wisconsin, United States of America
| | - Sara Lauck
- Department of Pediatrics, Division of Hospital Medicine, Milwaukee, Wisconsin, United States of America
| | - David Saudek
- Department of Pediatrics, Division of Cardiology, Milwaukee, Wisconsin, United States of America
| | - Michael Weisgerber
- Department of Pediatrics, Division of Hospital Medicine, Milwaukee, Wisconsin, United States of America
| |
Collapse
|
30
|
Burkhardt JC, Parekh KP, Gallahue FE, London KS, Edens MA, Humbert AJ, Pillow MT, Santen SA, Hopson LR. A Critical Disconnect: Residency Selection Factors Lack Correlation With Intern Performance. J Grad Med Educ 2020; 12:696-704. [PMID: 33391593 PMCID: PMC7771600 DOI: 10.4300/jgme-d-20-00013.1] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/03/2020] [Revised: 05/30/2020] [Accepted: 08/01/2020] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Emergency medicine (EM) residency programs want to employ a selection process that will rank best possible applicants for admission into the specialty. OBJECTIVE We tested if application data are associated with resident performance using EM milestone assessments. We hypothesized that a weak correlation would exist between some selection factors and milestone outcomes. METHODS Utilizing data from 5 collaborating residency programs, a secondary analysis was performed on residents trained from 2013 to 2018. Factors in the model were gender, underrepresented in medicine status, United States Medical Licensing Examination Step 1 and 2 Clinical Knowledge (CK), Alpha Omega Alpha (AOA), grades (EM, medicine, surgery, pediatrics), advanced degree, Standardized Letter of Evaluation global assessment, rank list position, and controls for year assessed and program. The primary outcomes were milestone level achieved in the core competencies. Multivariate linear regression models were fitted for each of the 23 competencies with comparisons made between each model's results. RESULTS For the most part, academic performance in medical school (Step 1, 2 CK, grades, AOA) was not associated with residency clinical performance on milestones. Isolated correlations were found between specific milestones (eg, higher surgical grade increased wound care score), but most had no correlation with residency performance. CONCLUSIONS Our study did not find consistent, meaningful correlations between the most common selection factors and milestones at any point in training. This may indicate our current selection process cannot consistently identify the medical students who are most likely to be high performers as residents.
Collapse
Affiliation(s)
- John C Burkhardt
- Assistant Professor, Departments of Emergency Medicine and Learning Health Sciences, University of Michigan Medical School
| | - Kendra P Parekh
- Associate Professor, Department of Emergency Medicine, Vanderbilt University School of Medicine
| | - Fiona E Gallahue
- Residency Program Director and Associate Professor, Department of Emergency Medicine, University of Washington
| | - Kory S London
- Associate Residency Program Director, Director of Clinical Operations, Jefferson Methodist ED, Associate Director of Quality Assurance and Practice Improvement, and Assistant Professor, Department of Emergency Medicine, Thomas Jefferson University
| | - Mary A Edens
- Residency Program Director and Associate Professor, Department of Emergency Medicine, Louisiana State University Health Sciences Center Shreveport
| | - A J Humbert
- Residency Program Director and Associate Professor of Clinical Emergency Medicine, Indiana University School of Medicine
| | - M Tyson Pillow
- Vice Chair of Education, and Associate Professor, Department of Emergency Medicine, Baylor College of Medicine
| | - Sally A Santen
- Senior Associate Dean for Assessment, Evaluation and Scholarship, and Professor, Department of Emergency Medicine, Virginia Commonwealth University School of Medicine
| | - Laura R Hopson
- Associate Chair of Education, Emergency Medicine Residency Program, and Associate Professor of Emergency Medicine, University of Michigan Medical School
| |
Collapse
|
31
|
Poremski D, Tan GMY, Lau BJ, Lee YW, Sim K. Selection of New Psychiatry Residents Within a National Program: a Qualitative Study of Faculty Perspectives on Competencies and Attributes. ACADEMIC PSYCHIATRY : THE JOURNAL OF THE AMERICAN ASSOCIATION OF DIRECTORS OF PSYCHIATRIC RESIDENCY TRAINING AND THE ASSOCIATION FOR ACADEMIC PSYCHIATRY 2020; 44:545-553. [PMID: 32705571 DOI: 10.1007/s40596-020-01282-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/24/2020] [Accepted: 07/01/2020] [Indexed: 06/11/2023]
Abstract
OBJECTIVE Admission committees use multiple sources of information to select residents. However, the way in which faculty members use each data source remains unclear and highly context-specific. The present study seeks to understand how faculty members use various sources of information about candidates to make admission decisions to a National Psychiatry Residency Program. METHODS The theory of core competencies was used as a foundation for this qualitative study. Framework analysis was used to structure the project and data presentation. Twenty key informants from the faculty were purposefully sampled in accordance with the initial theory. Open-ended semi-structured interviews were conducted to obtain their views about the essential competencies of psychiatrists and the ways in which these competencies could be reliably gauged. RESULTS Participants described numerous competencies that they believed were essential to becoming competent psychiatrists. These competencies fell within the six core competencies of the Accreditation Council for Graduate Medical Education framework. However, several non-competency attributes (such as perseverance, empathy, and compassion) were also relevant in the selection process. To reduce the impact of self-presentation bias, to which these attributes were vulnerable, the faculty relied heavily on sources of information obtained from third parties, such as feedback from co-workers with first-hand experience of the candidate during their clinical placements. CONCLUSION Faculty members place importance on informal informant-derived information about a candidate's non-competency attributes in addition to core competencies when deciding whether or not to select a candidate for admission into a residency training program.
Collapse
Affiliation(s)
| | | | - Boon Jia Lau
- Institute of Mental Health, Singapore, Singapore
| | - Yu Wei Lee
- Institute of Mental Health, Singapore, Singapore
| | - Kang Sim
- National Healthcare Group, Singapore, Singapore
| |
Collapse
|
32
|
Cherpak L, Chan J, Verma R, McInnes MDF, Hibbert R. Delivering CaRMS Transparency: Applicant Review and Selection Process of a Single-Center Diagnostic Radiology Residency Training Program. Can Assoc Radiol J 2020; 72:628-636. [PMID: 32960078 DOI: 10.1177/0846537120957621] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
PURPOSE To report the current application review and selection process in our Canadian diagnostic radiology program at the University of Ottawa for both Canadian and international medical graduates. APPLICATION REVIEW AND SELECTION PROCESS Submitted applications fulfilling institutional requirements were selected for a detailed file review after preliminary screening. A diverse group of file reviewers and interviewers was selected. Interviews were offered based on file review score sheet outcomes. Each interviewer generated a postinterview rank list. Applicants were reviewed and discussed from highest to lowest rank based on a preliminary compiled rank list generated from the average of the postinterview rank lists. Group discussion and a consensus model were used to create a final applicant rank list. CONCLUSIONS We outlined our systematic, consistent selection process which aligns with current best practices. This description may inform other programs wishing to adopt or optimize strategies to improve candidate assessments and selection processes.
Collapse
Affiliation(s)
- Lindsay Cherpak
- Department of Radiology, Faculty of Medicine, University of Ottawa, Ontario, Canada
| | - Jason Chan
- Department of Radiology, Faculty of Medicine, University of Ottawa, Ontario, Canada
| | - Raman Verma
- Department of Radiology, Faculty of Medicine, University of Ottawa, Ontario, Canada
| | - Matthew D F McInnes
- Department of Radiology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, University of Ottawa, Ontario, Canada
| | - Rebecca Hibbert
- Department of Radiology, Faculty of Medicine, University of Ottawa, Ontario, Canada
Collapse
|
33
|
Pershing S, Co JPT, Katznelson L. The New USMLE Step 1 Paradigm: An Opportunity to Cultivate Diversity of Excellence. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:1325-1328. [PMID: 32433311 DOI: 10.1097/acm.0000000000003512] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
The February 2020 announcement that United States Medical Licensing Examination (USMLE) Step 1 results will be reported as pass/fail instead of numerical scores has been controversial. Step 1 scores have played a key role in residency selection, including screening for interviews. Although Step 1 scores are viewed as an objective criterion, they have been shown to disadvantage female and underrepresented minority applicants, cause student anxiety and financial burden, and affect student well-being. Furthermore, Step 1 scores incompletely predict applicants' overall residency performance. With this paradigm shift in Step 1 score reporting, residency programs will have fewer objective, standardized metrics for selection decisions, which may lead to greater emphasis on USMLE Step 2 Clinical Knowledge scores or yield unintended consequences, including shifting weight to metrics such as medical school reputation.Yet, greater breadth in residency selection metrics will better serve both applicants and programs. Some students excel in coursework, others in research or leadership. All factors should be recognized, and broader metrics should be implemented to promote and recognize diversity of excellence. Given the need for metrics for residency selection as well as for a more holistic approach to evaluating residency applicants, assessment during medical school should be revisited and made more meaningful. Another opportunity may involve use of situational judgment tests to predict professionalism and performance on other competencies. It will be important to evaluate the impact of the new Step 1 paradigm and related initiatives going forward. Residency application overload must also be addressed.
Collapse
Affiliation(s)
- Suzann Pershing
- S. Pershing is assistant professor, Department of Ophthalmology, Stanford University School of Medicine, and chief of ophthalmology and eye care services, Veterans Affairs Palo Alto Health Care System, Stanford, California
| | - John Patrick T Co
- J.P.T. Co is designated institutional official, Brigham and Women's and Massachusetts General Hospitals, Partners HealthCare, and associate professor of pediatrics, Harvard Medical School, Boston, Massachusetts
| | - Laurence Katznelson
- L. Katznelson is associate dean of graduate medical education and professor of neurosurgery and medicine (endocrinology and metabolism), Stanford University School of Medicine, Stanford, California
| |
Collapse
|
34
|
McDonald FS, Jurich D, Duhigg LM, Paniagua M, Chick D, Wells M, Williams A, Alguire P. Correlations Between the USMLE Step Examinations, American College of Physicians In-Training Examination, and ABIM Internal Medicine Certification Examination. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:1388-1395. [PMID: 32271224 DOI: 10.1097/acm.0000000000003382] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
PURPOSE To assess the correlations between United States Medical Licensing Examination (USMLE) performance, American College of Physicians Internal Medicine In-Training Examination (IM-ITE) performance, American Board of Internal Medicine Internal Medicine Certification Exam (IM-CE) performance, and other medical knowledge and demographic variables. METHOD The study included 9,676 postgraduate year (PGY)-1, 11,424 PGY-2, and 10,239 PGY-3 internal medicine (IM) residents from any Accreditation Council for Graduate Medical Education-accredited IM residency program who took the IM-ITE (2014 or 2015) and the IM-CE (2015-2018). USMLE scores, IM-ITE percent correct scores, and IM-CE scores were analyzed using multiple linear regression, and IM-CE pass/fail status was analyzed using multiple logistic regression, controlling for USMLE Step 1, Step 2 Clinical Knowledge, and Step 3 scores; averaged medical knowledge milestones; age at IM-ITE; gender; and medical school location (United States or Canada vs international). RESULTS All variables were significant predictors of passing the IM-CE with IM-ITE scores having the strongest association and USMLE Step scores being the next strongest predictors. Prediction curves for the probability of passing the IM-CE based solely on IM-ITE score for each PGY show that residents must score higher on the IM-ITE with each subsequent administration to maintain the same estimated probability of passing the IM-CE. CONCLUSIONS The findings from this study should support residents and program directors in their efforts to more precisely identify and evaluate knowledge gaps for both personal learning and program improvement. While no individual USMLE Step score was as strongly predictive of IM-CE score as IM-ITE score, the combined relative contribution of all 3 USMLE Step scores was of a magnitude similar to that of IM-ITE score.
Affiliation(s)
- Furman S McDonald: senior vice president for academic and medical affairs, American Board of Internal Medicine, Philadelphia, Pennsylvania; adjunct professor of medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota; adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; and clinical associate, J. Edwin Wood Clinic, Pennsylvania Hospital, Philadelphia, Pennsylvania. ORCID: https://orcid.org/0000-0001-7952-3776
- Daniel Jurich: senior psychometrician, National Board of Medical Examiners, Philadelphia, Pennsylvania. ORCID: https://orcid.org/0000-0002-1870-2436
- Lauren M Duhigg: senior research associate, American Board of Internal Medicine, Philadelphia, Pennsylvania
- Miguel Paniagua: medical advisor, National Board of Medical Examiners, and adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania. ORCID: https://orcid.org/0000-0003-2307-4873
- Davoren Chick: senior vice president of medical education, American College of Physicians, and adjunct professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania. ORCID: http://orcid.org/0000-0003-4477-1272
- Margaret Wells: director of assessment and education programs, American College of Physicians, Philadelphia, Pennsylvania
- Amber Williams: manager, Relationship Development, National Board of Medical Examiners, Philadelphia, Pennsylvania
- Patrick Alguire: senior vice president emeritus, medical education, American College of Physicians, Philadelphia, Pennsylvania
35
Resident selection for emergency medicine specialty training in Canada: A survey of existing practice with recommendations for programs, applicants, and references. Can J Emerg Med 2020; 22:829-835. [PMID: 32838823] [DOI: 10.1017/cem.2020.457]
Abstract
OBJECTIVE Little is known about how Canadian Royal College emergency medicine (RCEM) residency programs select their residents. This creates uncertainty regarding alignment between current selection processes and known best practices. We sought to describe the current selection processes of Canadian RCEM programs.

METHODS An online survey was distributed to all RCEM program directors and assistant directors. The survey instrument included 22 questions and sought both qualitative and quantitative data in the following six domains: application file, letters of reference, elective selection, interview, rank order, and selection process evaluation.

RESULTS We received responses from 13 of 14 programs, for an aggregate response rate of 92.9%. A candidate's letters of reference were identified as the most important criterion in the paper application (38.5%). A high level of familiarity with the applicant was the most important characteristic of a reference letter author (46.2%). In determining rank order, 53.8% of programs weighed the interview more heavily than the paper application. Once final candidate scores are established following the interview stage, all program respondents indicated that further adjustment is made to the final rank order list. Only 1 of 13 program respondents reported ever having completed a formal evaluation of their selection process.

CONCLUSION We have identified elements of the selection process that will inform recommendations for programs, students, and referees. We encourage programs to conduct regular reviews of their selection process to ensure alignment with best practices.
36
Fagan R, Harkin E, Wu K, Salazar D, Schiff A. The Lack of Standardization of Allopathic and Osteopathic Medical School Grading Systems and Transcripts. J Surg Educ 2020; 77:69-73. [PMID: 31302034] [DOI: 10.1016/j.jsurg.2019.06.016]
Abstract
OBJECTIVE The purpose of this study is to assess the variability in grading systems used by US allopathic and osteopathic medical schools across all 4 years of medical school coursework.

DESIGN Transcripts from all participating allopathic and osteopathic medical schools were reviewed, for all 4 years of coursework, for grading system type; the presence or absence of a key or guide; the inclusion of grade distribution within class year; the inclusion of class rank; and the summary statements or evaluation systems used by the institution within the Medical Student Performance Evaluation to evaluate overall performance.

SETTING Loyola University Medical Center, Maywood, IL.

PARTICIPANTS Transcripts were reviewed for 144 of the 147 existing allopathic medical schools (97.9%) and 37 of the 39 existing osteopathic medical schools (94.8%).

RESULTS For allopathic schools, the grading system distribution for the preclinical years was 41.6% Pass/Fail, 40.3% Honors, and 13.2% Letter, while the distribution for the clinical years was 78.5% Honors and 15.9% Letter. Only 35.4% of allopathic medical schools used the same system for all 4 years; the remaining schools used different systems for the preclinical and clinical years. For osteopathic medical schools, the grading system distribution for the preclinical years was 45.9% Letter, 32.4% Honors, and 13.5% Pass/Fail, while the distribution for the clinical years was 59.5% Honors and 29.7% Letter (Table 4). Overall, 56.7% of osteopathic programs used the same system for all 4 years, while the remaining schools used a different system for the preclinical years than for the clinical years. Variability also existed within each of these broader grading system categories (Table 1, Table 3).

CONCLUSIONS Our results highlight the variability in grading systems used by medical schools, both among programs and between preclinical and clinical years. From the residency program perspective, the lack of consistent, objective comparisons between school transcripts makes comparing applicants from different institutions difficult.
Affiliation(s)
- Richard Fagan: Loyola University Chicago, Stritch School of Medicine, Maywood, Illinois
- Elizabeth Harkin: Department of Orthopedic Surgery and Rehabilitation, Loyola University Medical Center, Maywood, Illinois
- Karen Wu: Department of Orthopedic Surgery and Rehabilitation, Loyola University Medical Center, Maywood, Illinois
- Dane Salazar: Department of Orthopedic Surgery and Rehabilitation, Loyola University Medical Center, Maywood, Illinois
- Adam Schiff: Department of Orthopedic Surgery and Rehabilitation, Loyola University Medical Center, Maywood, Illinois
37
Nasreddine AY, Gallo R. Applying to Orthopaedic Residency and Matching Rates: Analysis and Review of the Past 25 Years. J Bone Joint Surg Am 2019; 101:e134. [PMID: 31567661] [DOI: 10.2106/jbjs.18.00371]
Abstract
BACKGROUND The competitiveness of orthopaedics and recent changes in the residency application process have resulted in increased costs to both applicants and programs. Our purpose was to investigate changes in the orthopaedic residency application process between 1992 and 2017. We also aimed to determine an ideal number of applications that each student can submit without jeopardizing his or her probability of matching into an orthopaedic residency slot, while reducing the excessive number of applications received by program selection committees.

METHODS Retrospective data from both the Electronic Residency Application Service (ERAS) and the National Resident Matching Program (NRMP) were collected and analyzed for changes in the characteristics of applications, applicants, and programs over the study period. Using these data, the probability of matching into orthopaedics over the years was calculated and compared in order to propose an ideal number of applications for a medical student to submit to match into an orthopaedic residency.

RESULTS Over the 25-year study period, the number of residency positions offered increased and the number of applicants per offered position among U.S. senior medical students decreased. Nonetheless, the average number of submitted applications per applicant increased significantly from 1992 to 2017, from 28 to 80 applications (p < 0.001). As a result, the overall costs to apply and to review applications have also increased. There was no association between the increased number of submitted applications and the match rate. Our analysis showed that 50 applications per student is the most effective limit, as it does not compromise the overall applicant match rate.

CONCLUSIONS Based on these data, we suggest encouraging students to limit the number of applications they submit. This limit could reduce costs for both applicants and programs while likely maintaining the current match rate and competitiveness of the specialty.
Affiliation(s)
- Adam Y Nasreddine: Department of Orthopaedics and Rehabilitation, Yale School of Medicine, New Haven, Connecticut; Penn State Hershey College of Medicine, Hershey, Pennsylvania
- Robert Gallo: Penn State Hershey College of Medicine, Hershey, Pennsylvania; Penn State Hershey Medical Center, Bone and Joint Institute, Hershey, Pennsylvania
38
Rozenshtein A, Mullins ME, Marx MV. The USMLE Step 1 Pass/Fail Reporting Proposal: The APDR Position. Acad Radiol 2019; 26:1400-1402. [PMID: 31383545] [DOI: 10.1016/j.acra.2019.06.004]
Abstract
BACKGROUND The National Board of Medical Examiners (NBME) and the United States Medical Licensing Examination (USMLE) program convened a conference of "key stakeholders" on March 11-12, 2019, to consider reporting the results of the USMLE Step 1 as pass/fail.

DISCUSSION While the original purpose of the USMLE Step 1 was to provide an objective basis for medical licensing, the score is increasingly used in residency applicant screening and selection because it is an objective, nationally recognized metric that allows comparison across medical schools in and outside the United States. Excessive reliance on the Step 1 score in the matching process has led to a "Step 1 culture" that drives medical schools to "teach to the test," increases medical student anxiety, and disadvantages minorities, who have been shown to score lower on the USMLE Step 1 examination. The outsize role of the USMLE Step 1 score in resident selection is due to the lack of standardization in medical school transcripts, grade inflation, and the absence of class standing in many summative assessments. Furthermore, the numeric score allows initial Electronic Residency Application Service filtering, commonly used by programs to limit the number of residency applications to review.

CONCLUSION The Association of Program Directors in Radiology (APDR) is concerned that pass/fail reporting of the USMLE Step 1 would take away an objective measure of medical students' knowledge and the incentive to acquire as much of it as possible. Although the APDR is not in favor of the Step 1 examination being used as a screening tool, in the absence of an equal or better metric for applicant comparison the APDR opposes the change in Step 1 reporting from a numeric score to pass/fail.
Affiliation(s)
- Anna Rozenshtein: Department of Radiology, Westchester Medical Center, New York Medical College, 100 Woods Road, Valhalla, NY 10595
- Mark E Mullins: Department of Radiology and Imaging Sciences, Emory University, Atlanta, Georgia
- M Victoria Marx: Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, California
39
Sharma A, Schauer DP, Kelleher M, Kinnear B, Sall D, Warm E. USMLE Step 2 CK: Best Predictor of Multimodal Performance in an Internal Medicine Residency. J Grad Med Educ 2019; 11:412-419. [PMID: 31440335] [PMCID: PMC6699543] [DOI: 10.4300/jgme-d-19-00099.1]
Abstract
BACKGROUND Internal medicine (IM) residency programs receive information about applicants via academic transcripts, but studies demonstrate wide variability in satisfaction with and usefulness of this information. In addition, many studies compare application materials to only 1 or 2 assessment metrics, usually standardized test scores and work-based observational faculty assessments.

OBJECTIVE We sought to determine which application materials best predict performance across a broad array of residency assessment outcomes generated by standardized testing and a yearlong IM residency ambulatory long block.

METHODS In 2019, we analyzed available Electronic Residency Application Service data for 167 categorical IM residents, including advanced degree status, research experience, failures during medical school, undergraduate medical education award status, and United States Medical Licensing Examination (USMLE) scores. We compared these with post-match residency multimodal performance, including standardized test scores and faculty member, peer, allied health professional, and patient-level assessment measures.

RESULTS In multivariate analyses, USMLE Step 2 Clinical Knowledge (CK) scores were the most predictive of performance across all residency performance domains measured. Having an advanced degree was associated with higher patient-level assessments (eg, physician listens, physician explains). USMLE Step 1 scores were associated with in-training examination scores only. None of the other measured application materials predicted performance.

CONCLUSIONS USMLE Step 2 CK scores were the strongest predictors of residency performance across a broad array of performance measurements generated by standardized testing and an IM residency ambulatory long block.
40
Deng F, Wesevich A. More on the Role of USMLE Step 1 in Resident Selection. Acad Med 2019; 94:921. [PMID: 31241563] [DOI: 10.1097/acm.0000000000002725]
Affiliation(s)
- Francis Deng: radiology resident, Massachusetts General Hospital, Boston, Massachusetts. ORCID: https://orcid.org/0000-0003-3117-5076
- A Wesevich: internal medicine and pediatrics resident, Duke University Medical Center, Durham, North Carolina. ORCID: https://orcid.org/0000-0001-5202-1231
41
Radabaugh CL, Hawkins RE, Welcher CM, Mejicano GC, Aparicio A, Kirk LM, Skochelak SE. Beyond the United States Medical Licensing Examination Score: Assessing Competence for Entering Residency. Acad Med 2019; 94:983-989. [PMID: 30920448] [DOI: 10.1097/acm.0000000000002728]
Abstract
Assessments of physician learners during the transition from undergraduate to graduate medical education generate information that may inform their learning and improvement needs, determine readiness to move along the medical education continuum, and predict success in their residency programs. To achieve a constructive transition for the learner, residency program, and patients, high-quality assessments should provide meaningful information regarding applicant characteristics, academic achievement, and competence that lead to a suitable match between the learner and the residency program's culture and focus.

The authors discuss alternative assessment models that may correlate with resident physician clinical performance and patient care outcomes. Currently, passing the United States Medical Licensing Examination Step examinations provides one element of reliable assessment data that could inform judgments about a learner's likelihood for success in residency. Yet learner capabilities in areas beyond those traditionally valued in future physicians, such as life experiences, community engagement, language skills, and leadership attributes, are not afforded the same level of influence when candidate selections are made.

While promising new methods of screening and assessment, such as objective structured clinical examinations, holistic assessments, and competency-based assessments, have attracted increased attention in the medical education community, they may currently be expensive, be less psychometrically sound, lack a national comparison group, or be complicated to administer. Future research and experimentation are needed to establish measures that can best meet the needs of programs, faculty, staff, students, and, more importantly, patients.
Affiliation(s)
- Carrie L Radabaugh: vice president, governance and board relations, American Board of Medical Specialties, Chicago, Illinois
- R.E. Hawkins: president and chief executive officer, American Board of Medical Specialties, Chicago, Illinois
- C.M. Welcher: senior policy analyst, Medical Education Programs, American Medical Association, Chicago, Illinois
- G.C. Mejicano: professor and senior associate dean for education, School of Medicine, Oregon Health & Science University, Portland, Oregon
- A. Aparicio: director, Medical Education Programs, American Medical Association, Chicago, Illinois
- L.M. Kirk: professor, Internal Medicine/Family & Community Medicine, Southwestern Medical School, University of Texas Southwestern Medical Center, Dallas, Texas
- S.E. Skochelak: chief academic officer and medical education group vice president, American Medical Association, Chicago, Illinois
42
Hopson LR, Dorfsman ML, Branzetti J, Gisondi MA, Hart D, Jordan J, Cranford JA, Williams SR, Regan L. Comparison of the Standardized Video Interview and Interview Assessments of Professionalism and Interpersonal Communication Skills in Emergency Medicine. AEM Educ Train 2019; 3:259-268. [PMID: 31360819] [PMCID: PMC6637001] [DOI: 10.1002/aet2.10346]
Abstract
OBJECTIVES The Association of American Medical Colleges Standardized Video Interview (SVI) was recently added as a component of emergency medicine (EM) residency applications to provide additional information about interpersonal communication skills (ICS) and knowledge of professionalism (PROF) behaviors. Our objective was to ascertain the correlation between the SVI and residency interviewer assessments of PROF and ICS. Secondary objectives included examination of 1) inter- and intrainstitutional assessments of ICS and PROF, 2) correlation of SVI scores with rank order list (ROL) positions, and 3) the potential influence of gender on interview day assessments.

METHODS We conducted an observational study using prospectively collected data from seven EM residency programs during 2017 and 2018 using a standardized instrument. Correlations between interview day PROF/ICS scores and the SVI were tested. A one-way analysis of variance was used to analyze the association of SVI and ROL position. Gender differences were assessed with independent-groups t-tests.

RESULTS A total of 1,264 interview-day encounters from 773 unique applicants resulted in 4,854 interviews conducted by 151 interviewers. Both PROF and ICS demonstrated a small positive correlation with the SVI score (r = 0.16 and r = 0.17, respectively). ROL position was associated with SVI score (p < 0.001), with mean SVI scores for top-, middle-, and bottom-third applicants being 20.9, 20.5, and 19.8, respectively. No group differences with gender were identified on assessments of PROF or ICS.

CONCLUSIONS Interview assessments of PROF and ICS have a small, positive correlation with SVI scores. These residency selection tools may be measuring related, but not redundant, applicant characteristics. We did not identify gender differences in interview assessments.
Affiliation(s)
- Laura R. Hopson: Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, MI
- Michele L. Dorfsman: Department of Emergency Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA
- Jeremy Branzetti: Ronald O. Perelman Department of Emergency Medicine, New York University School of Medicine, New York, NY
- Danielle Hart: Department of Emergency Medicine, University of Minnesota Medical School, St. Paul, MN
- Jaime Jordan: Department of Emergency Medicine, David Geffen School of Medicine at UCLA, Los Angeles, CA
- Sarah R. Williams: Department of Emergency Medicine, Stanford University School of Medicine, Stanford, CA
- Linda Regan: Department of Emergency Medicine, Johns Hopkins University School of Medicine, Baltimore, MD
43
Hartman ND. A Narrative Review of the Evidence Supporting Factors Used by Residency Program Directors to Select Applicants for Interviews. J Grad Med Educ 2019; 11:268-273. [PMID: 31210855] [PMCID: PMC6570461] [DOI: 10.4300/jgme-d-18-00979.3]
Abstract
BACKGROUND Residency applicants feel increasing pressure to maximize their chances of successfully matching into the program of their choice, and are applying to more programs than ever before.

OBJECTIVE In this narrative review, we examined the most common and highly rated factors used to select applicants for interviews. We also examined the literature surrounding those factors to illuminate the advantages and disadvantages of using them as differentiating elements in interviewee selection.

METHODS Using the 2018 NRMP Program Director Survey as a framework, we examined the last 10 years of literature to ascertain how residency directors use these common factors to grant residency interviews, and whether these factors are predictive of success in residency.

RESULTS Residency program directors identified 12 factors that contribute substantially to the decision to invite applicants for interviews. Although United States Medical Licensing Examination (USMLE) Step 1 is often used as a comparative factor, most studies do not demonstrate its predictive value for resident performance, except in the case of test failure. We also found that structured letters of recommendation from within a specialty carry increased benefit compared with generic letters. Failing USMLE Step 1 or 2 and unprofessional behavior predicted lower performance in residency.

CONCLUSIONS We found that the evidence base for the factors most commonly used by residency directors is decidedly mixed in terms of predicting success in residency and beyond. Given these limitations, program directors should be skeptical of making summative decisions based on any one factor.
44
Hudson KM. Association Between Performance on COMLEX-USA and the American College of Osteopathic Family Physicians In-Service Examination. J Grad Med Educ 2018; 10:543-547. [PMID: 30386480] [PMCID: PMC6194890] [DOI: 10.4300/jgme-d-17-00997.1]
Abstract
BACKGROUND The primary goal of residency programs is to select and educate qualified candidates to become competent physicians. Program directors often use performance on licensure examinations to evaluate the ability of candidates during the resident application process. The American College of Osteopathic Family Physicians (ACOFP) administers an in-service examination (ISE) to residents annually. There are few prior studies of the relationship between the Comprehensive Osteopathic Medical Licensing Examination of the United States of America (COMLEX-USA) series and formative assessments of residents in training.

OBJECTIVE We explored the relationship between performance on COMLEX-USA and the ACOFP in-service examination to inform the use of licensing examinations in resident selection.

METHODS In 2016, performance data from the COMLEX-USA and the ISE were matched for 3 resident cohorts (2011-2013, inclusive; N = 1384). Correlations were calculated to examine the relationship between COMLEX-USA and ISE scores. Multiple linear regression models were used to determine whether performance on COMLEX-USA significantly predicted third-year ISE (ISE-3) scores.

RESULTS Findings indicated that correlations between performance on COMLEX-USA and the ISE were statistically significant (all P < .001), and there was strong intercorrelation between COMLEX-USA Level 3 and ISE-1 performance (r = 0.57, P < .001). Performance on the COMLEX-USA Level 1 and Level 2-Cognitive Evaluation significantly predicted performance on the ISE-3 (F(2,1381) = 228.8, P < .001).

CONCLUSIONS The results support using COMLEX-USA as a part of resident selection in family medicine. Additionally, program directors may use performance on COMLEX-USA to predict success on the ISE-3.
45
McInnes MD. Canadian program directors lack data to select residency candidates. CMAJ 2018; 190:E1114. [PMID: 30224445] [PMCID: PMC6141249] [DOI: 10.1503/cmaj.69695]
Affiliation(s)
- Matthew D McInnes: associate professor of radiology and epidemiology, University of Ottawa, Ottawa, Ont.; The Ottawa Hospital Research Institute, Ottawa, Ont.
46
Price T, Lynn N, Coombes L, Roberts M, Gale T, de Bere SR, Archer J. The International Landscape of Medical Licensing Examinations: A Typology Derived From a Systematic Review. Int J Health Policy Manag 2018; 7:782-790. [PMID: 30316226] [PMCID: PMC6186476] [DOI: 10.15171/ijhpm.2018.32]
Abstract
BACKGROUND National licensing examinations (NLEs) are large-scale examinations usually taken by medical doctors close to the point of graduation from medical school. Where NLEs are used, success is usually required to obtain a license for full practice. Approaches to national licensing, and the evidence that supports their use, vary significantly across the globe. This paper aims to develop a typology of NLEs, based on candidacy, to explore the implications of different examination types for workforce planning.

METHODS A systematic review of the published literature and medical licensing body websites, an electronic survey of all medical licensing bodies in highly developed nations, and a survey of medical regulators.

RESULTS The evidence gleaned through this systematic review highlights four approaches to NLEs: (1) graduating medical students wishing to practice in their national jurisdiction must pass a national licensing exam before they are granted a license to practice; (2) all prospective doctors, whether from the national jurisdiction or international medical graduates, must pass a national licensing exam in order to practice within that jurisdiction; (3) international medical graduates must pass a licensing exam if their qualifications are not acknowledged to be comparable with those of students from the national jurisdiction; and (4) no NLEs are in operation. This typology facilitates comparison across systems and highlights the implications of different licensing systems for workforce planning.

CONCLUSION The issue of national licensing cannot be viewed in isolation from workforce planning; future research on the efficacy of national licensing systems to drive up standards should be integrated with research on the implications of such systems for the mobility of doctors across borders.
Affiliation(s)
- Tristan Price: Collaboration for the Advancement of Medical Education Research & Assessment (CAMERA), University of Plymouth, Plymouth, UK; Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
- Nick Lynn: CAMERA and Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
- Lee Coombes: School of Medicine, Cardiff University, Wales, UK
- Martin Roberts: CAMERA and Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
- Tom Gale: CAMERA and Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
- Sam Regan de Bere: CAMERA and Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
- Julian Archer: CAMERA and Peninsula Schools of Medicine & Dentistry, University of Plymouth, Plymouth, UK
47
Stover W, Gill S, Schenarts K, Chahine AA. Defining the Applicant Pool for Postgraduate Year-2 Categorical General Surgery Positions. J Surg Educ 2018; 75:870-876. [PMID: 29242045] [DOI: 10.1016/j.jsurg.2017.11.006]
Abstract
OBJECTIVE In the spring of 2010, a categorical general surgery postgraduate year (PGY)-2 position became available at our academic medical center secondary to attrition of a PGY-1 resident. We sought to study the unique characteristics of applicants to that position and to describe the selection process with hopes to stimulate additional studies about the unique challenges of recruiting applicants into advanced standing positions. DESIGN Applications were received via e-mail and reviewed to characterize the applicant pool. An Excel spreadsheet was used to organize data. Characteristics assessed included United States Medical Licensing Examination (USMLE) scores, Educational Commission for Foreign Medical Graduates status, Alpha Omega Alpha Honor Society status, sex, academic performance, number of case logs, volunteer and job experience, leadership roles, research experience including submissions, and advanced degrees. These characteristics were compared to those of the PGY-1 applicants through the Match that year. SETTING Academic medical center. PARTICIPANTS Applicants for a categorical general surgery PGY-2 position in 2010. RESULTS A total of 129 applicants provided the requested documents. There were 104 males, 25 females, no Alpha Omega Alpha Honor Society candidates, and 82 international candidates. Of all, 46 candidates experienced academic difficulties. Quantitative averages include USMLE 1: 214.17, USMLE 2: 215.74, American Board of Surgery In Training Examination (ABSITE) percentile = 51.96, ABSITE 2 = 46.00, grand total case log: 192.10. Advanced degrees included 2 MBAs, 6 MPHs, and 7 nonphysiology MSs. The selection process to fill the position started on 3/25/2010 when the announcement was published and ended on 5/11/2010 when the offer of acceptance was sent. The selected applicant integrated well with the peers and just graduated from our residency as one of the leaders of the graduating class. 
CONCLUSIONS Although the attrition rate in general surgery remains high, there is a dearth of literature on how best to replace residents. The difficulty of replacing residents highlights the importance of studying this group to improve the recruitment process and the quality of replacement residents. The selection process was time-consuming and presented its own challenges given the lack of a computerized screening system. It lasted nearly 7 weeks, requiring substantial faculty time to mine application data and e-mails, correspond with applicants, conduct interviews, and ultimately select an applicant for the position. This is the first study to investigate the applicant pool for advanced-standing positions in general surgery, and we present it as a pilot study to stimulate further research efforts.
Collapse
Affiliation(s)
- Weston Stover
- Georgetown University School of Medicine, Washington, DC; Department of Surgery, UT Southwestern Medical Center, Dallas, Texas
| | - Sujata Gill
- Department of Surgery, MedStar Georgetown University Hospital, Washington, DC
| | - Kim Schenarts
- Department of Surgery, University of Nebraska Medical Center, Omaha, Nebraska
| | - A Alfred Chahine
- Department of Surgery, MedStar Georgetown University Hospital, Washington, DC; Department of Surgery, Children's National Medical System, Washington, DC; Department of Surgery, The George Washington University School of Medicine and Health Sciences, Washington, DC.
| |
Collapse
|
48
|
Gupta R, Norris ML, Barrowman N, Writer H. Pre-residency publication and its association with paediatric residency match outcome-a retrospective analysis of a national database. PERSPECTIVES ON MEDICAL EDUCATION 2017; 6:388-395. [PMID: 29134620 PMCID: PMC5732106 DOI: 10.1007/s40037-017-0383-8] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/14/2023]
Abstract
INTRODUCTION Scholarly activity is considered valuable in the resident selection process by candidates and program directors alike, despite existing literature suggesting that applicants with scholarly work do not perform better in the match. These studies, however, are limited in that they measured only whether candidates successfully matched. To reconcile this disconnect in the value of pre-residency scholarly activity, we sought to deepen understanding by investigating whether pre-residency publication is associated with higher rank-order list match achievement. METHODS Anonymized data were collected from the Canadian Residency Matching Service for individuals matched to paediatric programs from 2007-2012. The primary analysis assessed whether documentation of ≥1 pre-residency publication was associated with achieving a first-choice match. Secondary analyses evaluated associations between multiple pre-residency publications, academic presentations, or a graduate degree and match outcome. RESULTS Of 843 matched individuals, 406 (48.2%) listed ≥1 pre-residency publication and 494 (58.6%) matched to their first-choice program. Possession of ≥1 pre-residency publication was not associated with matching to a candidate's first-choice program (odds ratio = 0.94 [95% confidence interval = 0.71-1.24], p = 0.66). Similarly, listing ≥2 publications, ≥3 publications, a graduate degree, or an academic presentation was not associated with achieving a first-choice match. CONCLUSIONS The results provide increased support for the notion that, in aggregate, candidate scholarly activity does not influence match outcome. Accordingly, it is recommended that medical student research activities be fostered with the goal of improving students' skills as scientists, and not simply to achieve a better residency match outcome.
Collapse
Affiliation(s)
- Ronish Gupta
- Department of Pediatrics, Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada.
| | - Mark L Norris
- Department of Pediatrics, Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada
| | - Nicholas Barrowman
- Department of Pediatrics, Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada
- Clinical Research Unit, Children's Hospital of Eastern Ontario Research Institute, Ottawa, Ontario, Canada
| | - Hilary Writer
- Department of Pediatrics, Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada
| |
Collapse
|
49
|
Shipper ES, Mazer LM, Merrell SB, Lin DT, Lau JN, Melcher ML. Pilot evaluation of the Computer-Based Assessment for Sampling Personal Characteristics test. J Surg Res 2017; 215:211-218. [DOI: 10.1016/j.jss.2017.03.054] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2016] [Revised: 03/12/2017] [Accepted: 03/29/2017] [Indexed: 11/25/2022]
|
50
|
|