1. Barkovich EJ, Heekin M, Barkovich MJ, Lichtenberger JP. Beyond milestones and percentiles: Revisiting non-cognitive and non-interpretive skills in radiology. Curr Probl Diagn Radiol 2024:S0363-0188(24)00153-1. PMID: 39138113. DOI: 10.1067/j.cpradiol.2024.08.004.
Abstract
While there is no precise formula for a great radiology resident, certain attributes and achievements may herald success during training. We briefly review prior works exploring predictive factors and evaluation metrics of top resident performance, noting that those focusing on non-cognitive attributes are over twenty years old. As radiology practice and education have substantially evolved in the interim, we revisit this topic from a contemporary perspective. Inspired by the literature and our own personal experiences, we suggest that the following non-cognitive traits are invaluable for radiology trainees: communication expertise, workplace adaptability, self-awareness, tech savvy, and genuine interest in one's individual work and greater community. These characteristics should be highlighted by applicants, sought by selection committees, cultivated by mentors, evaluated by programs, and valued by colleagues.
Affiliation(s)
- Emil J Barkovich
- Department of Radiology, Massachusetts General Hospital, 55 Fruit St, Gray 2, Room 273A, Boston, MA 02114, United States.
- Mary Heekin
- Department of Radiology, George Washington University School of Medicine and Health Sciences, 2300 I St NW, Washington, DC 20052, United States
- Matthew J Barkovich
- Department of Radiology and Biomedical Imaging, University of California, 505 Parnassus Ave, Room L352, San Francisco, CA 94158, United States
- John P Lichtenberger
- Department of Radiology, George Washington University Medical Faculty Associates, 900 23rd St NW, Washington, DC 20037, United States
2. Payne DL, Purohit K, Borrero WM, Chung K, Hao M, Mpoy M, Jin M, Prasanna P, Hill V. Performance of GPT-4 on the American College of Radiology In-training Examination: Evaluating Accuracy, Model Drift, and Fine-tuning. Acad Radiol 2024; 31:3046-3054. PMID: 38653599. DOI: 10.1016/j.acra.2024.04.006.
Abstract
RATIONALE AND OBJECTIVES In our study, we evaluate GPT-4's performance on the American College of Radiology (ACR) 2022 Diagnostic Radiology In-Training Examination (DXIT). We perform multiple experiments across time points to assess for model drift, as well as after fine-tuning to assess for differences in accuracy. MATERIALS AND METHODS Questions were sequentially input into GPT-4 with a standardized prompt. Each answer was recorded, and overall accuracy, logic-adjusted accuracy, and accuracy on image-based questions were calculated. This experiment was repeated several months later to assess for model drift, then again after fine-tuning to assess for changes in GPT-4's performance. RESULTS GPT-4 achieved 58.5% overall accuracy, lower than the PGY-3 average (61.9%) but higher than the PGY-2 average (52.8%). Adjusted accuracy was 52.8%. GPT-4 showed significantly higher (p = 0.012) confidence for correct answers (87.1%) than for incorrect ones (84.0%). Performance on image-based questions was significantly poorer (p < 0.001) at 45.4% compared to text-only questions (80.0%), with adjusted accuracy for image-based questions of 36.4%. When the questions were repeated, GPT-4 chose a different answer 25.5% of the time, with no change in accuracy. Fine-tuning did not improve accuracy. CONCLUSION GPT-4 performed between the PGY-2 and PGY-3 levels on the 2022 DXIT, performed significantly more poorly on image-based questions, and showed large variability in answer choices across time points. Exploratory experiments in fine-tuning did not improve performance. This study underscores the potential and risks of using minimally prompted general AI models in interpreting radiologic images as a diagnostic tool. Implementers of general AI radiology systems should exercise caution given the possibility of spurious yet confident responses.
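The repeated-query experiment above (same questions, different time points) amounts to measuring answer consistency. A minimal sketch of that measurement, assuming answers are recorded as per-question choice lists; the function name and data are illustrative, not from the study:

```python
def answer_drift(run_a, run_b):
    """Fraction of questions on which the model chose a different answer
    between two runs of the same question set."""
    if len(run_a) != len(run_b):
        raise ValueError("runs must cover the same questions")
    changed = sum(a != b for a, b in zip(run_a, run_b))
    return changed / len(run_a)
```

The study's observation (25.5% drift with no net change in accuracy) is why point accuracy alone can overstate a model's stability.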
Affiliation(s)
- David L Payne
- Stony Brook University Hospital Department of Radiology, 101 Nicolls Road, Stony Brook, New York 11794, USA (D.L.P., K.P., W.M.B., K.C., M.H., M.M., M.J.); Stony Brook University Department of Biomedical Informatics, 1 Lauterbur Drive, Stony Brook, New York 11794, USA (D.L.P., P.P.).
- Kush Purohit, Walter Morales Borrero, Katherine Chung, Max Hao, Mutshipay Mpoy, Michael Jin
- Stony Brook University Hospital Department of Radiology, 101 Nicolls Road, Stony Brook, New York 11794, USA
- Prateek Prasanna
- Stony Brook University Department of Biomedical Informatics, 1 Lauterbur Drive, Stony Brook, New York 11794, USA
- Virginia Hill
- Northwestern University Feinberg School of Medicine Department of Radiology, 676 North Clair Street, Chicago, Illinois 60611, USA
3. Pfiffner S, Albazi E, Musa A, Altinok G, Johnson SC, Harb A. The New Diagnostic Radiology Oral Exam: Challenges, Opportunities, and Future Directions. Acad Radiol 2024; 31:2190-2191. PMID: 38184415. DOI: 10.1016/j.acra.2023.12.026.
Affiliation(s)
- Samantha Pfiffner
- School of Medicine, Wayne State University, Detroit, Michigan, 48201, USA (S.P.)
- Evien Albazi
- College of Human Medicine, Michigan State University, East Lansing, Michigan, 48824, USA
- Arif Musa, Gulcin Altinok, Samuel C Johnson, Ali Harb
- Department of Radiology, Wayne State University / Detroit Medical Center, Detroit, Michigan, 48201, USA
4. Zhu GG, Xie AY, Elahi F, Asumu H, Chakraborty A, Stoddard GJ, Al-Dulaimi R, Wiggins RH. Perspectives From the RadDiscord Annual Survey: Overview of the Top Study Tools and Evaluation of Study Time and Various Resources. Acad Radiol 2024; 31:399-408. PMID: 38401985. PMCID: PMC10897967. DOI: 10.1016/j.acra.2023.12.011.
Abstract
RATIONALE AND OBJECTIVES Each year, senior radiology residents take the American Board of Radiology Qualifying (Core) exam to evaluate competency. Approximately 10% of first-time examinees fail this exam (1). Understanding the factors that contribute to success will help residency program directors and trainees prepare for future exams. RadDiscord (www.raddiscord.org), an international radiology educational community, is uniquely positioned to evaluate different study materials and resources. The goal of this paper is to report the results of the RadDiscord survey and analyze the factors that correlate with higher exam performance and passing. MATERIALS AND METHODS Following the February 2021, June 2021, and June 2022 exams, RadDiscord members were provided an anonymous survey collecting information on study resources and exam scores. The collected data were analyzed with both descriptive and inferential statistical methods. RESULTS A total of 318 residents responded (95% passed). Significant variability in Qualifying (Core) exam performance and in the perceived quality of internal didactics existed between program types. Residents who completed fewer than 2000 practice questions scored lower on the exam. The Diagnostic Radiology In-Training (DXIT) exam was the most predictive measure of passing and performance. Qualifying (Core) exam performance correlated negatively with study time, though certain residents did receive some benefit from study time. CONCLUSION Many factors correlate with passing and with Qualifying (Core) exam performance. Residency programs with fewer resources should consider alternative ways to support residents beyond offering study time. Residents who complete at least 2000 practice questions are more likely to pass, and DXIT results can be a useful gauge of exam readiness.
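The 2000-question finding above is, at its simplest, a comparison of pass rates on either side of a cutoff. A hedged sketch of that comparison; the function and sample data are hypothetical, not the survey's actual analysis:

```python
def pass_rates_by_threshold(questions_done, passed, threshold=2000):
    """Pass rate for respondents at/above vs below a practice-question
    threshold (illustrative sketch, not the study's statistical method)."""
    def rate(group):
        return sum(group) / len(group) if group else float("nan")
    above = [ok for q, ok in zip(questions_done, passed) if q >= threshold]
    below = [ok for q, ok in zip(questions_done, passed) if q < threshold]
    return rate(above), rate(below)
```

A formal analysis would add a significance test on the two proportions; this sketch only shows the descriptive split.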
Affiliation(s)
- Grace G Zhu
- Department of Radiology, University of Utah Health, Salt Lake City, Utah (G.G.Z., R.H.W.).
- Fatima Elahi
- Department of Radiology, Hospital of the University of Pennsylvania, Philadelphia, Pennsylvania
- Hazel Asumu
- Baylor University Medical Center, Dallas, Texas
- Gregory J Stoddard
- Department of Internal Medicine, University of Utah, Salt Lake City, Utah
- Richard H Wiggins
- Department of Radiology, University of Utah Health, Salt Lake City, Utah
5. Newbury A, Cerniglia CA, DeBenedectis CM, Harman A, Lo HS. Radiology program director's perspective on a novel hands-on advanced elective. Clin Imaging 2023; 102:93-97. PMID: 37657275. DOI: 10.1016/j.clinimag.2023.07.006.
Abstract
With the USMLE Step 1 exam moving to pass/fail scoring, Radiology Program Directors (PDs) and Associate Program Directors (APDs) need alternative methods of identifying interested and engaged medical students applying to their programs. Additionally, undergraduate radiology medical education in the United States varies widely from institution to institution, with no universal mandatory radiology component. To address these problems, we implemented an advanced fourth-year hands-on radiology elective in which students were treated as first-year radiology residents (R1s), giving them resident-level access to the Picture Archiving and Communication System (PACS) and dictation software and allowing them to perform entry-level procedures with appropriate supervision. After implementation of the elective, a 5-question online survey was sent to 298 PDs and APDs via the Association of Program Directors in Radiology (APDR) listserv; 72 responses were compiled, yielding a response rate of 24%. The survey focused on how a hands-on medical student elective would help in assessing prospective candidates and predicting R1 performance. Most respondents felt that interest in radiology, motivation, and interpersonal skills would be better assessed after such an elective, and the vast majority felt a hands-on advanced elective would be at least slightly predictive of first-year resident performance. Based on this information, we believe implementing a hands-on advanced radiology elective would substantially address the passive nature of traditional radiology electives, providing valuable information to PDs and APDs and giving the best possible radiology experience to our medical students.
Affiliation(s)
- Alex Newbury
- University of Massachusetts Chan Medical School, Department of Radiology, Worcester, MA 01655, United States of America.
- Christopher A Cerniglia, Carolynn M DeBenedectis, Aaron Harman, Hao S Lo
- University of Massachusetts Chan Medical School, Department of Radiology, Worcester, MA 01655, United States of America
6. Gunderman P, Gunderman D. There Is No Optimal Case Count for Passing the ABR Core Exam. Acad Radiol 2023; 30:1010. PMID: 36882353. DOI: 10.1016/j.acra.2023.01.032.
7. Virtual interventional radiology education increases confidence in American Board of Radiology Core Exam Preparation. Clin Imaging 2023; 95:90-91. PMID: 36682181. DOI: 10.1016/j.clinimag.2022.12.013.
8. Shirkhodaie C, Avila S, Seidel H, Gibbons RD, Arora VM, Farnan JM. The Association Between USMLE Step 2 Clinical Knowledge Scores and Residency Performance: A Systematic Review and Meta-Analysis. Acad Med 2023; 98:264-273. PMID: 36512984. DOI: 10.1097/acm.0000000000005061.
Abstract
PURPOSE With the change in Step 1 score reporting, Step 2 Clinical Knowledge (CK) may become a pivotal factor in resident selection. This systematic review and meta-analysis seeks to synthesize existing observational studies that assess the relationship between Step 2 CK scores and measures of resident performance. METHOD The authors searched MEDLINE, Web of Science, and Scopus databases using terms related to Step 2 CK in 2021. Two researchers identified studies investigating the association between Step 2 CK and measures of resident performance and included studies if they contained a bivariate analysis examining Step 2 CK scores' association with an outcome of interest: in-training examination (ITE) scores, board certification examination scores, select Accreditation Council for Graduate Medical Education core competency assessments, overall resident performance evaluations, or other subjective measures of performance. For outcomes that were investigated by 3 or more studies, pooled effect sizes were estimated with random-effects models. RESULTS Among 1,355 potential studies, 68 met inclusion criteria and 43 were able to be pooled. There was a moderate positive correlation between Step 2 CK and ITE scores (0.52, 95% CI 0.45-0.59, P < .01). There was a moderate positive correlation between Step 2 CK and ITE scores for both nonsurgical (0.59, 95% CI 0.51-0.66, P < .01) and surgical specialties (0.41, 95% CI 0.33-0.48, P < .01). There was a very weak positive correlation between Step 2 CK scores and subjective measures of resident performance (0.19, 95% CI 0.13-0.25, P < .01). CONCLUSIONS This study found Step 2 CK scores have a statistically significant moderate positive association with future examination scores and a statistically significant weak positive correlation with subjective measures of resident performance. These findings are increasingly relevant as Step 2 CK scores will likely become more important in resident selection.
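Pooling correlations across studies, as in the meta-analysis above, is typically done on the Fisher-z scale. A minimal fixed-effect sketch; the review itself used random-effects models, which add a between-study variance term, so this simplified version is illustrative only:

```python
import math

def pool_correlations(rs, ns):
    """Inverse-variance pooled correlation via Fisher's z transform.
    rs: per-study correlation coefficients; ns: per-study sample sizes."""
    zs = [math.atanh(r) for r in rs]   # Fisher z transform of each r
    ws = [n - 3 for n in ns]           # weight = 1 / Var(z) = n - 3
    z_pooled = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_pooled)         # back-transform to the r scale
```

Larger studies get proportionally more weight, which is why a pooled r of 0.52 can sit closer to the bigger studies' estimates than to a simple average.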
Affiliation(s)
- Camron Shirkhodaie
- C. Shirkhodaie is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4279-3251
- Santiago Avila
- S. Avila is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-3633-4304
- Henry Seidel
- H. Seidel is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7364-1365
- Robert D Gibbons
- R.D. Gibbons is professor, Center for Health Statistics and Departments of Medicine and Public Health Sciences, University of Chicago, Chicago, Illinois
- Vineet M Arora
- V.M. Arora is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4745-7599
- Jeanne M Farnan
- J.M. Farnan is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-1138-9416
9. Morgan DE. Use of Attending Radiologist Reviews of Resident Clinical Performance to Predict Outcomes on the American Board of Radiology Qualifying (Core) Exam: A Call to Action. Acad Radiol 2022; 29:1727-1729. PMID: 36050263. DOI: 10.1016/j.acra.2022.07.024.
Affiliation(s)
- Desiree E Morgan
- University of Alabama at Birmingham, Department of Radiology, JTN456, 619 South 19th Street, Birmingham, AL 35249.
10. Horn GL, Masood I, Heymann JC, Saleem A, Nguyen QD. Attending Reviews of Residents Correlate with ABR Qualifying (Core) Examination Failure. Acad Radiol 2022; 29:1723-1726. PMID: 35232656. DOI: 10.1016/j.acra.2022.01.002.
Abstract
RATIONALE AND OBJECTIVES Since the American Board of Radiology (ABR) instituted its new system of board certification, there has been much discussion of the test's validity. We evaluated whether subjective evaluation of resident performance correlated with ABR Qualifying (Core) Examination performance at a single institution. MATERIALS AND METHODS Attending physicians' evaluation scores and board examination outcomes were gathered for the 42 residents who took the ABR Qualifying (Core) Examination from 2013 through 2019, eight of whom failed on their first attempt. A univariate analysis comparing scores with first-attempt passage or failure of the examination was performed, along with analyses correcting for class year only and for class year and number of evaluations. RESULTS The non-weighted average evaluation score over years 1, 2, and 3 was 80.24% for those who failed the ABR Qualifying (Core) Examination and 83.71% for those who passed. On univariate analysis and on the analyses correcting for class year only and for class year and number of evaluations, decreased evaluation scores averaged over the three years of residency were significantly correlated with failure of the examination (p = 0.0102, p = 0.003, and p = 0.0043, respectively). The statistical significance held for the average numerical score in each individual year of training in all analyses except year 1 of the univariate analysis (p = 0.1264). CONCLUSION At the studied institution, there was a statistically significant correlation between lower subjective faculty evaluation scores and failure of the ABR Qualifying (Core) Examination.
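The association above, between a continuous evaluation score and a binary pass/fail outcome, can be summarized with a point-biserial correlation (Pearson's r computed against a 0/1 outcome). A sketch with made-up numbers, not the study's actual data or analysis:

```python
import math

def point_biserial(scores, passed):
    """Pearson correlation between continuous scores and a binary outcome
    (True = passed). Positive r means higher scores accompany passing."""
    n = len(scores)
    y = [1.0 if ok else 0.0 for ok in passed]
    mean_x = sum(scores) / n
    mean_y = sum(y) / n
    cov = sum((s - mean_x) * (t - mean_y) for s, t in zip(scores, y)) / n
    sd_x = math.sqrt(sum((s - mean_x) ** 2 for s in scores) / n)
    sd_y = math.sqrt(sum((t - mean_y) ** 2 for t in y) / n)
    return cov / (sd_x * sd_y)
```

The study's regression analyses additionally adjust for class year and number of evaluations, which this unadjusted sketch does not attempt.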
Affiliation(s)
- Gary Lloyd Horn
- Baylor College of Medicine Department of Radiology, One Baylor Plaza, Houston, Texas 77573
- Irfan Masood, John C Heymann, Arsalan Saleem, Quan Dang Nguyen
- University of Texas Medical Branch at Galveston Department of Radiology, Galveston, Texas
11. Zhang RV, Awan OA, Resnik CS, Hossain R. Potential Impact of a Pass or Fail United States Medical Licensing Exam Step 1 Scoring System on Radiology Residency Applications. Acad Radiol 2022; 29:158-165. PMID: 33162317. DOI: 10.1016/j.acra.2020.10.016.
Abstract
RATIONALE AND OBJECTIVES To assess the anticipated impact of the change in United States Medical Licensing Examination Step 1 scoring from numerical to pass or fail on the future selection of radiology residents. MATERIALS AND METHODS An anonymous electronic 14-item survey was distributed to 308 members of the Association of Program Directors in Radiology and included questions regarding the anticipated importance of various application metrics when Step 1 becomes pass or fail. Secondary analyses compared responses based on the current use of a Step 1 scoring screen. RESULTS Eighty-eight respondents (28.6% [88/308]) completed the survey. Most (64% [56/88]) noted that the United States Medical Licensing Examination Step 2 Clinical Knowledge (CK) score will likely be one of the top three most important factors in assessing applicants, followed by class ranking or quartile (51% [45/88]) and the Medical Student Performance Evaluation/Dean's Letter (42% [37/88]). Over 90% (82/88) of respondents anticipate potentially or definitively requiring Step 2 CK scores before application review, and 50% (44/88) of respondents anticipate extending interview invitations at a later date in order to receive Step 2 CK scores. These results did not significantly differ between programs that currently use a Step 1 scoring screen and those that do not. CONCLUSION As Step 1 transitions from a numerical score to pass or fail, radiology residency program directors will likely rely on Step 2 CK scores as an objective and standardized metric to screen applicants. Further investigation is needed to identify other objective metrics to evaluate applicants before Step 1 becomes pass or fail.
12. Goggs R, Kerl M, Jandrey KE, Guillaumin J. Prospective investigation of factors associated with success on the American College of Veterinary Emergency and Critical Care certification examination (2016-2018). J Vet Emerg Crit Care (San Antonio) 2021; 32:196-206. PMID: 34714977. DOI: 10.1111/vec.13153.
Abstract
OBJECTIVE To assess the association of candidate attributes and residency training factors with success on the American College of Veterinary Emergency and Critical Care (ACVECC) board certification examination and to develop multivariable models of first-attempt success. DESIGN Prospective survey-based study. SETTING Post-assessment ACVECC examination candidates. ANIMALS None. INTERVENTIONS None. MEASUREMENTS AND MAIN RESULTS Comprehensive surveys were distributed to ACVECC examination candidates in 2016 to 2018 after completion of their assessments, but prior to publication of examination results. Unique anonymous candidate identification numbers were used to match survey responses to outcome data from the office of the ACVECC Executive Secretary. After curation to retain only the first response from each candidate, there were 97 unique candidate responses available for analysis. Univariate analyses identified multiple factors associated with first-attempt success and multiple differences between academic and private practice residency programs. Multivariable logistic regression modeling suggested that 5 factors were independently associated with first-attempt success on the ACVECC examination, specifically younger age, more weeks of study prior to the examination, training at a facility with more ACVECC Diplomates, training at a facility with more ACVECC residents, and having no requirement to manage both Emergency Room (ER) and Critical Care (CC) cases simultaneously. CONCLUSIONS Numerous resident and training center factors are associated with success in the ACVECC board certification examination. Residents and training centers might be able to use these data to enhance training, but caution must be exercised because these data are associative only.
Affiliation(s)
- Robert Goggs
- Department of Clinical Sciences, Cornell University College of Veterinary Medicine, Ithaca, New York, USA
- Marie Kerl
- Regional Operations, Heartland Group, VCA Inc., Los Angeles, California, USA; Department of Veterinary Medicine and Surgery, College of Veterinary Medicine, University of Missouri, Columbia, Missouri, USA
- Karl E Jandrey
- Department of Surgical and Radiological Sciences, University of California, Davis, California, USA
- Julien Guillaumin
- Department of Clinical Science, College of Veterinary Medicine and Biomedical Sciences, Colorado State University, Fort Collins, Colorado, USA
13. Patel K, Patel A. The New Radiology Resident: Quick Tips for Success. J Am Coll Radiol 2021; 18:1666-1667. PMID: 34637775. DOI: 10.1016/j.jacr.2021.08.019.
Affiliation(s)
- Kirang Patel
- Chief Radiology Resident, University of Missouri at Kansas City, Kansas City, Missouri.
- Amy Patel
- Assistant Professor, Department of Radiology, University of Missouri at Kansas City, Kansas City, Missouri; Medical Director, Breast Care Center, Liberty Hospital, Liberty, Missouri
14. Chung CY, Jiang L, Balthazar P. The American Board of Radiology's First Remote Core Examination: A Trainee's Perspective - Radiology In Training. Radiology 2021; 300:E293-E295. PMID: 34003052. DOI: 10.1148/radiol.2021210846.
Affiliation(s)
- Charlotte Y Chung
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd NE, Suite BG03, Atlanta, Ga 30322
- Liwei Jiang
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Mass
- Patricia Balthazar
- Department of Radiology, Massachusetts General Hospital, Harvard Medical School, Boston, Mass
15. Ballard DH, Summers D, Hoegger MJ, Salter A, Gould JE. Results of the 2019 Survey of the American Alliance of Academic Chief Residents in Radiology. Acad Radiol 2021; 28:1018-1028. PMID: 32546338. DOI: 10.1016/j.acra.2020.04.042.
Abstract
RATIONALE AND OBJECTIVES An annual survey of chief residents in accredited North American radiology programs is conducted by the American Alliance of Academic Chief Residents in Radiology (A3CR2). The purpose of this study is to summarize the 2019 A3CR2 chief resident survey. MATERIALS AND METHODS An online survey was distributed to chief residents from 194 Accreditation Council for Graduate Medical Education-accredited radiology residencies. Questions were designed to gather information about residency program details, call and weekend coverage, interventional radiology training, fellowship, social media use, healthcare reform, artificial intelligence, and job market status. RESULTS One hundred and forty-two unique responses from 99 programs were provided, yielding a 51% program response rate. There was a mean of 7.3 women per residency with a mean program size of 28 residents (26% women). Only 3 of the 99 (3%) programs had a proportion of women that was 50% or higher. The proportion of women in radiology residencies is unchanged since 2014 (p = 0.93) and is significantly lower than the proportion of women among 2019 graduating medical students (49.3%; p < 0.001). Thirty-five percent of programs had 24/7 attending coverage and 40% of programs had extended-hours attending shifts. Of programs without 24/7 attending coverage, the proportion without face-to-face readout has increased from 34% in 2014 to 55% in 2019 (p = 0.015). The majority (67%) of respondents had no concerns about the radiology job market, compared to 2014, when only 4% had no concerns (p < 0.001). CONCLUSION Women remain underrepresented in radiology, face-to-face readout is decreasing, and there has been a shift towards a positive job market outlook.
|
16
|
The Relationship Between ACR Diagnostic Radiology In-Training Examination Scores and ABR Core Examination Outcome and Performance: A Multi-Institutional Study. J Am Coll Radiol 2020; 17:1663-1669. [DOI: 10.1016/j.jacr.2020.04.032] [Received: 04/02/2020] [Revised: 04/24/2020] [Accepted: 04/29/2020] [Indexed: 11/17/2022]
|
17
|
Toy D, Escalon JG, Groner LK, Naeger DM. Preparation for the ABR Core Exam: Resident Study Habits and the Value of Case Conferences. J Am Coll Radiol 2020; 18:615-619. [PMID: 33242480] [DOI: 10.1016/j.jacr.2020.10.019] [Received: 09/01/2020] [Revised: 10/27/2020] [Accepted: 10/29/2020] [Indexed: 11/29/2022]
Affiliation(s)
- Dennis Toy, Department of Radiology, New York-Presbyterian/Weill Cornell Medical Center, New York, New York
- Joanna G Escalon, Department of Radiology, New York-Presbyterian/Weill Cornell Medical Center, New York, New York
- Lauren K Groner, Department of Radiology, New York-Presbyterian/Weill Cornell Medical Center, New York, New York
- David M Naeger, Director of Radiology, Denver Health, Denver, Colorado; Vice Chair, Department of Radiology, University of Colorado, Denver, Colorado
|
18
|
Maxfield CM, Grimm LJ. The Value of Numerical USMLE Step 1 Scores in Radiology Resident Selection. Acad Radiol 2020; 27:1475-1480. [PMID: 31445825] [DOI: 10.1016/j.acra.2019.08.007] [Received: 07/11/2019] [Revised: 08/08/2019] [Accepted: 08/08/2019] [Indexed: 11/30/2022]
Abstract
RATIONALE AND OBJECTIVES In response to a recent proposal to change scoring on the United States Medical Licensing Examination (USMLE) Step 1 to pass/fail, we sought to determine the value of numerical Step 1 scores in predicting success in our radiology residency program. MATERIALS AND METHODS Residency applications for 157 residents entering the program between 2005 and 2017 were retrospectively reviewed. Biographical (gender, sports participation, advanced degree), undergraduate (school, major), and medical school (grades, USMLE Step 1 score, Alpha Omega Alpha membership, letters of recommendation, publications) data were recorded. Multivariate regression analysis was used to examine the relationship between these application factors and subsequent performance as a radiology resident, as determined by completion of the program without requiring corrective action, select Accreditation Council for Graduate Medical Education milestones, and selection as chief resident. RESULTS Corrective action was required for 7% (n = 12) of residents. Of the predictor variables, only Step 1 score was associated with the need for corrective action (p < 0.001). The interpretation-of-exams milestone was associated with higher Step 1 scores (p = 0.001) and number of medical school clerkship honors (p = 0.008). Selection as chief resident was associated with sports participation (p = 0.04) and clerkship honors (p = 0.02). CONCLUSION Numerical USMLE Step 1 scores are predictive of successful completion of radiology residency training without the need for corrective action, and of accelerated competence in the interpretation-of-exams milestone. Continued reporting of numerical Step 1 scores would be valuable in the selection of radiology residents.
Affiliation(s)
- Charles M Maxfield, Department of Radiology, Duke University Medical Center, Durham, North Carolina
- Lars J Grimm, Department of Radiology, Duke University Medical Center, Box 3808, Durham, NC 27710
|
19
|
Patel MD, Tomblinson CM, Benefield T, Ali K, DeBenedectis CM, England E, Gaviola GC, Ho CP, Jay AK, Milburn JM, Ong S, Robbins JB, Sarkany DS, Heitkamp DE, Jordan SG. The Relationship Between US Medical Licensing Examination Step Scores and ABR Core Examination Outcome and Performance: A Multi-institutional Study. J Am Coll Radiol 2020; 17:1037-1045. [PMID: 32220580] [DOI: 10.1016/j.jacr.2020.02.017] [Received: 01/28/2020] [Revised: 02/25/2020] [Accepted: 02/27/2020] [Indexed: 11/17/2022]
Abstract
PURPOSE We analyzed multi-institutional data to understand the relationship of US Medical Licensing Examination (USMLE) Step scores to ABR Core examination performance and to identify Step score tiers that stratify radiology residents into different Core performance groups. METHODS We collected USMLE Step scores and ABR Core examination outcomes and scores for anonymized residents from 13 different diagnostic radiology residency programs taking the ABR Core examination between 2013 and 2019. USMLE scores were grouped into noniles using z scores and then aggregated into three tiers based on similar Core examination pass-or-fail outcomes. Core performance was grouped using standard deviation from the mean and then measured by the percent of residents with scores below the mean. Differences between Step tiers in Core outcome and Core performance were statistically evaluated (P < .05 considered significant). RESULTS Differences between Step 1 terciles in Core failure rates (45.9%, 11.9%, and 3.0%, from lowest to highest tier; n = 416) and in below-mean Core performance (83.8%, 54.1%, and 21.1%, respectively; n = 402) were significant. Differences between Step 2 tiers in Core failure rates (30.0%, 10.6%, and 2.0%, from lowest to highest tier; n = 387) and in below-mean Core performance (80.0%, 43.7%, and 14.0%, respectively; n = 380) were significant. Step 2 results modified Core outcome and performance predictions for residents within Step 1 terciles, with varying statistical significance. CONCLUSIONS Tiered scoring of USMLE Step results has value in predicting radiology resident performance on the ABR Core examination; effective stratification of radiology residency applicants can be done without reporting numerical Step scores.
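The tiering this abstract describes (z scores cut into noniles, noniles aggregated into three tiers) can be sketched as follows. The scores, the rank-based nonile cut, and the fixed nonile-to-tier grouping are illustrative assumptions; the study aggregated noniles by similar pass-or-fail outcomes, which requires the outcome data.

```python
# Hypothetical sketch: standardize scores, cut into 9 equal-count bins
# (noniles), then collapse bins 0-2, 3-5, 6-8 into three tiers.
from statistics import mean, stdev

def to_tiers(scores, n_bins=9, n_tiers=3):
    mu, sd = mean(scores), stdev(scores)
    z = [(s - mu) / sd for s in scores]
    # Rank-based noniles: equal-count bins over the sorted z scores.
    order = sorted(range(len(z)), key=lambda i: z[i])
    nonile = [0] * len(z)
    for rank, i in enumerate(order):
        nonile[i] = min(rank * n_bins // len(z), n_bins - 1)
    # Assumed grouping: three consecutive noniles per tier.
    return [nonile[i] // (n_bins // n_tiers) for i in range(len(z))]

scores = [200, 210, 215, 220, 225, 230, 235, 240, 250, 255, 260, 265]
tiers = to_tiers(scores)  # tier 0 = lowest tercile, tier 2 = highest
```

Each resident's Core outcome could then be tabulated by tier to reproduce failure-rate comparisons like those reported above.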
Affiliation(s)
- Maitray D Patel, Executive Board, Society of Radiologists in Ultrasound; Department of Radiology, Mayo Clinic Arizona, Phoenix, Arizona
- Courtney M Tomblinson, Associate Program Director, Diagnostic Radiology Residency; Associate Director, Women in Radiology; Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, Nashville, Tennessee
- Thad Benefield, Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, North Carolina
- Kamran Ali, Program Director, Diagnostic Radiology Residency; President, Radiology Group; Department of Radiology, University of Kansas School of Medicine, Wichita, Kansas
- Carolynn M DeBenedectis, Vice Chair for Education; Program Director, Radiology Residency Program; Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts
- Eric England, Vice Chair of Education; Program Director, Diagnostic Radiology Residency; Jerome F. Wiot Endowed Chair of Radiology Residency Education; Department of Radiology, University of Cincinnati Medical Center, Cincinnati, Ohio
- Glenn C Gaviola, Program Director, Diagnostic Radiology Residency, Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Christopher P Ho, Program Director, Diagnostic Radiology Residency, Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia
- Ann K Jay, Vice Chair of Education and Program Director, Diagnostic Radiology Residency, Department of Radiology, MedStar Georgetown University Hospital, Washington, DC
- James M Milburn, Vice Chair of Radiology; Section Head, Neuroradiology; Program Director, Diagnostic Radiology Residency; ACR Louisiana State Councilor; Department of Radiology, Ochsner Clinic Foundation, New Orleans, Louisiana
- Seng Ong, Program Director, Diagnostic Radiology Residency, Department of Radiology, University of Chicago Medical Center, Chicago, Illinois
- Jessica B Robbins, Vice Chair of Faculty Development and Enrichment; Associate Program Director, Diagnostic Radiology and Integrated Diagnostic/Interventional Radiology Residencies; University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- David S Sarkany, Diagnostic Radiology Program Director, Department of Radiology, Staten Island University Hospital Northwell Health, Staten Island, New York
- Sheryl G Jordan, Education Director, Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, North Carolina
|
20
|
ACGME Case Log Values Correlate with Performance on ABR Core Exam. Acad Radiol 2020; 27:274-275. [PMID: 31759797] [DOI: 10.1016/j.acra.2019.10.014] [Received: 10/23/2019] [Accepted: 10/23/2019] [Indexed: 11/22/2022]
|
21
|
|
22
|
Hardy SM, Donnelly EF, Bruno MA. Reliance on Multiple-Choice Board Examinations Will Have Trade-Offs. J Am Coll Radiol 2019; 16:1634-1635. [PMID: 31639362] [DOI: 10.1016/j.jacr.2019.09.010] [Received: 09/09/2019] [Revised: 09/22/2019] [Accepted: 09/23/2019] [Indexed: 10/25/2022]
Affiliation(s)
- Seth M Hardy, Department of Radiology, Penn State Health Milton S. Hershey Medical Center, 500 University Dr, PO Box 500, Hershey, PA 17033
- Edwin F Donnelly, Department of Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee
- Michael A Bruno, Department of Radiology, Penn State Health Milton S. Hershey Medical Center, Hershey, Pennsylvania
|
23
|
Carmody JB, Sarkany D, Heitkamp DE. The USMLE Step 1 Pass/Fail Reporting Proposal: Another View. Acad Radiol 2019; 26:1403-1406. [PMID: 31296373] [DOI: 10.1016/j.acra.2019.06.002] [Received: 05/27/2019] [Revised: 06/13/2019] [Accepted: 06/13/2019] [Indexed: 11/28/2022]
Abstract
The Association of Program Directors in Radiology recently issued a statement endorsing continued reporting of results of the United States Medical Licensing Examination (USMLE) as a three-digit score. While this position was approved by the Association of Program Directors in Radiology Board of Directors, it does not reflect the opinions of all radiology program directors. Here, we present an argument in support of reporting USMLE results as pass/fail. As a psychometric instrument, the USMLE Step 1 is designed to assess basic science knowledge and is intended to inform a binary decision on licensure. Due to a steadily increasing burden of applications to review, program directors have increasingly relied upon scores for candidate screening. Such use has multiple adverse consequences. Student focus on Step 1 systematically devalues educational content not evaluated on the exam, and the reliance on Step 1 scores almost certainly works against efforts to increase workforce diversity. Moreover, the increasing pressure of "Step 1 Mania" has negative consequences for trainee mental health and wellness. Despite the widespread use of Step 1 scores to select applicants, there are few data correlating scores with meaningful outcomes related to patient care or clinical practice. We find the current situation untenable, and believe a necessary first step toward reform is making Step 1 a pass/fail-only examination.
Affiliation(s)
- J Bryan Carmody, Eastern Virginia Medical School, Department of Pediatrics, Division of Nephrology, Norfolk, Virginia
- David Sarkany, Staten Island University Hospital, Northwell Health, Department of Radiology, 475 Seaview Avenue, Staten Island, NY 10305
- Darel E Heitkamp, Advent Health Orlando, Department of Radiology, Orlando, Florida
|