1. Beyer RS, Hatter MJ, Brown NJ, Oh MY. Letter to the Editor. The USMLE examination scoring change: uncertainty and implications for dual-degree applicants. J Neurosurg 2022;137:607-608. PMID: 35426832. DOI: 10.3171/2022.3.jns22531.
2. Bigach SD, Winkelman RD, Savakus JC, Papp KK. A novel USMLE Step 1 projection model using a single comprehensive basic science self-assessment taken during a brief intense study period. Med Sci Educ 2021;31:67-73. PMID: 34457866. PMCID: PMC8368818. DOI: 10.1007/s40670-020-01097-7.
Abstract
BACKGROUND Comprehensive Basic Science Self-Assessments (CBSSAs) offered by the National Board of Medical Examiners (NBME) are used by students to gauge preparedness for the United States Medical Licensing Examination (USMLE) Step 1. Because residency programs value Step 1 scores, students expend many resources attempting to score highly on this exam. We sought to generate a predicted Step 1 score from a single CBSSA taken several days before a planned exam date to inform student testing and study plans. METHODS 2016 and 2017 Step 1 test takers at one US medical school were surveyed. The average daily score improvement from CBSSA to Step 1 during the 2016 study period was calculated and used to generate a predicted Step 1 score as well as mean absolute prediction errors (MAPEs). The predictive model was validated on 2017 data. RESULTS In total, 43 of 61 respondents totaling 141 CBSSAs in 2016 and 37 of 43 respondents totaling 122 CBSSAs in 2017 were included. The final prediction model was [Predicted Step 1 = 292 - (292 - CBSSA score) * 0.987527 ^ (number of days out)]. In 2016, the average difference between predicted and actual scores was -0.81 (10.2) and the MAPE was 7.8. In 2017, 88 (72.1%) and 118 (96.7%) of true Step 1 scores fell within one and two standard deviations, respectively, of a student's predicted score, with a MAPE of 7.7. Practice form used (p = 0.19, 0.07) and how far out from the actual Step 1 the CBSSA was taken (p = 0.82, 0.38) were not significant in either year of study. CONCLUSION This projection model is a reasonable way for students to gauge their readiness for Step 1 while it remains a scored exam, and it provides a framework for future predictive model generation as the landscape of standardized testing in medical education changes.
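As a quick numeric illustration, the published projection formula can be evaluated directly. This is only a sketch of the reported equation; the function name and the example inputs are illustrative, not taken from the paper:

```python
def predict_step1(cbssa_score: float, days_out: int) -> float:
    """Project a Step 1 score from a single CBSSA result using the
    model reported above: the gap between the CBSSA score and the
    model's asymptote (292) shrinks by a factor of 0.987527 per day
    of remaining study time.
    """
    return 292 - (292 - cbssa_score) * 0.987527 ** days_out

# Example: a CBSSA score of 230 taken 14 days before the exam
# projects to roughly 240.
print(round(predict_step1(230, 14)))
```

Note that with zero days remaining the prediction equals the CBSSA score itself, consistent with the model treating the self-assessment as the current knowledge state.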
Affiliation(s)
- Stephen D. Bigach: Department of Orthopaedic Surgery, McGaw Medical Center of Northwestern University, Chicago, IL, USA; School of Medicine, Case Western Reserve University, Cleveland, OH, USA
- Jonathan C. Savakus: School of Medicine, Case Western Reserve University, Cleveland, OH, USA; Department of Orthopaedic Surgery, Vanderbilt University Medical Center, Nashville, TN, USA
- Klara K. Papp: School of Medicine, Case Western Reserve University, Cleveland, OH, USA
3. Wallach SL, Williams C, Chow RT, Jadhav N, Kuehl S, Raj JM, Alweis R. Internal medicine resident perspectives on scoring USMLE as pass/fail. J Community Hosp Intern Med Perspect 2020;10:381-385. PMID: 33235666. PMCID: PMC7671726. DOI: 10.1080/20009666.2020.1796366.
Abstract
Background The scoring rubric for the USMLE Step 1 examination will change to pass/fail in January 2022. This study elicits internal medicine resident perspectives on USMLE pass/fail scoring at the national level. Objective To assess internal medicine resident opinions regarding USMLE pass/fail scoring and examine how variables such as gender, scores on USMLE Steps 1 and 2, PGY status, and type of medical school are associated with these opinions. Methods In the fall of 2019, the authors surveyed current internal medicine residents via an online tool distributed through their program directors. Respondents indicated their Step 1 and Step 2 Clinical Knowledge scores from five categorical ranges. Questions on medical school type, year of training, and gender were included. The results were analyzed using Pearson chi-square testing and multivariable logistic regression. Results 4012 residents responded, reflecting 13% of internal medicine residents currently training in the USA. Fifty-five percent of respondents disagreed/strongly disagreed with pass/fail scoring and 34% agreed/strongly agreed. Group-based differences were significant for gender, PGY level, Step 1 score, and medical school type; a higher percentage of males, those training at the PGY1 level, and graduates of international medical schools (IMGs) disagreed with pass/fail reporting. In addition, high scorers on Step 1 were more likely to disagree with pass/fail reporting than low-scoring residents. Conclusion Our results suggest that a majority of internal medicine residents currently training in the USA prefer that USMLE numerical scoring be retained rather than changed to pass/fail.
Affiliation(s)
- Sara L Wallach: Department of Medicine, St. Francis Medical Center, Trenton, NJ, USA; Department of Medicine, Hackensack Meridian School of Medicine, Nutley, NJ, USA; Medicine, Drexel University College of Medicine, Philadelphia, PA, USA
- Christopher Williams: Department of Behavioral and Community Health, School of Public Health, University of Maryland, College Park, MD, USA
- Robert T Chow: Department of Medicine, The University of Maryland Medical Center Midtown Campus, Baltimore, MD, USA; Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
- Nagesh Jadhav: Department of Medicine, Rochester General Hospital Internal Medicine Residency Program, Rochester, NY, USA; Medicine, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Sapna Kuehl: Medicine, University of Maryland School of Medicine, Baltimore, MD, USA; Department of Medicine, Ascension Saint Agnes Hospital, Baltimore, MD, USA
- Jaya M Raj: Department of Medicine, Creighton University School of Medicine-Phoenix, St. Joseph's Hospital and Medical Center, Phoenix, AZ, USA
- Richard Alweis: Medicine, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA; Education, Rochester Regional Health, Rochester, NY, USA; Health Sciences, Rochester Institute of Technology, Rochester, NY, USA
4. Lewis CE, Hiatt JR, Wilkerson L, Tillou A, Parker NH, Hines OJ. Numerical versus pass/fail scoring on the USMLE: what do medical students and residents want and why? J Grad Med Educ 2011;3:59-66. PMID: 22379524. PMCID: PMC3186267. DOI: 10.4300/jgme-d-10-00121.1.
Abstract
BACKGROUND Although the primary purpose of the US Medical Licensing Examination (USMLE) is assessment for licensure, USMLE scores are often used for other purposes, most prominently resident selection. The Committee to Evaluate the USMLE Program is currently considering a number of substantial changes, including conversion to pass/fail scoring. METHODS A survey was administered to third-year (MS3) and fourth-year (MS4) medical students and residents at a single institution to evaluate opinions regarding pass/fail scoring on the USMLE. RESULTS The response rate was 59% (n = 732 of 1249). The reported score distribution for Step 1 was 30% for <220, 38% for 220-240, and 32% for >240, with no difference between MS3s, MS4s, and residents (P = .89). The score distribution for Step 2 Clinical Knowledge (CK) was similar. Only 26% of respondents agreed that Step 1 should be pass/fail; 38% agreed with pass/fail scoring for Step 2 CK. Numerical scoring on Step 1 was preferred by respondents who: (1) agreed that the examination gave an accurate estimate of knowledge (odds ratio [OR], 4.23; confidence interval [CI], 2.41-7.43; P < .001); (2) scored >240 (OR, 4.0; CI, 1.92-8.33; P < .001); and (3) felt that acquisition of knowledge might decrease if the examination were pass/fail (OR, 10.15; CI, 3.32-31.02; P < .001). For Step 2 CK, numerical scoring was preferred by respondents who: (1) believed they gained a large amount of knowledge preparing for the examination (OR, 2.63; CI, 1.52-4.76; P < .001); (2) scored >240 (OR, 4.76; CI, 2.86-8.33; P < .001); (3) felt that the amount of knowledge acquired might decrease if it were pass/fail (OR, 28.16; CI, 7.31-108.43; P < .001); and (4) believed their Step 2 CK score was important when applying for residency (OR, 2.37; CI, 1.47-3.84; P < .001).
CONCLUSIONS Students and residents prefer the ongoing use of numerical scoring because they believe that scores are important in residency selection, that residency applicants are advantaged by examination scores, and that scores provide an important impetus to review and solidify medical knowledge.
Affiliation(s)
- O. Joe Hines: Corresponding author. David Geffen School of Medicine at UCLA, 10833 Le Conte Avenue, 72-170 CHS, Box 956904, Los Angeles, CA 90095-6904. 310.206.0441.
5. Laatsch L. Evaluation and treatment of students with difficulties passing the Step examinations. Acad Med 2009;84:677-683. PMID: 19704209. DOI: 10.1097/acm.0b013e31819faae1.
Abstract
PURPOSE The author designed this retrospective case series study both to systematically examine characteristics of individuals referred for treatment after multiple failures on the United States Medical Licensing Examination (USMLE) Step 1 or Step 2, administered by the National Board of Medical Examiners, and to evaluate treatment effectiveness in a uniform sample. METHOD Six medical students referred to rehabilitation psychology met selection criteria. All students completed the requisite neuropsychological, academic, and psychological testing to identify cognitive and emotional strengths and weaknesses. All six underwent individualized cognitive rehabilitation (CR) with a primary focus on reading fluency and accuracy. RESULTS All participants improved on a quantitative measure of reading speed and accuracy, and five of the six passed their next USMLE Step examination despite past failures. CONCLUSIONS Medical students with identified difficulties in reading fluency, but no history of a learning disability, may benefit from systematic CR that addresses cognitive weaknesses related to test-taking abilities. The strong relationship between language and reading skills and performance on the USMLE Step examinations suggests that some students may fail these examinations because of a relative weakness in language processing and reading fluency that impedes their successful completion of the Step examinations.
Affiliation(s)
- Linda Laatsch: Rehabilitation Psychology, Department of Neurology and Rehabilitation, University of Illinois College of Medicine, Chicago, Illinois, USA
6. Hendrix D, Hasman L. A survey of collection development for United States Medical Licensing Examination (USMLE) and National Board Dental Examination (NBDE) preparation material. J Med Libr Assoc 2008;96:207-216. PMID: 18654641. DOI: 10.3163/1536-5050.96.3.006.
Abstract
OBJECTIVE The research sought to ascertain medical and dental libraries' collection development policies, evaluation methods, purchase decisions, and issues relating to print and electronic United States Medical Licensing Examination (USMLE) and National Board Dental Examination (NBDE) preparation materials. METHODS The investigators surveyed librarians supporting Association of American Medical Colleges (AAMC)-accredited medical schools (n = 58/125) on the USMLE and librarians supporting American Dental Association (ADA)-accredited dental schools (n = 23/56) on the NBDE. The investigators analyzed the data by cross-tabulating and filtering the results using EFM Continuum web survey software. Investigators also surveyed print and electronic USMLE and NBDE preparation materials from 2004-2007 to determine the number of publications and the existence of reviews. RESULTS A majority of responding AAMC libraries (62%, n = 58) provide at least 1 electronic or online USMLE preparation resource and buy an average of 11.6 print USMLE titles annually. Due to a paucity of NBDE print and electronic resources, ADA libraries bought significantly fewer print resources, and only 1 subscribed to an electronic resource. The most often reported evaluation methods for both populations were feedback from medical or dental students, feedback from medical or dental faculty, and online trials. Some AAMC (10%, n = 58) and ADA (39%, n = 23) libraries reported that no evaluation of these materials occurred at their libraries. CONCLUSIONS From 2004-2007, publishers produced 45 USMLE preparation resources (total n = 546) for every 1 NBDE preparation resource (total n = 12). Users' needs, institutional missions and goals, financial status, and official collection policies most often underlie decisions to collect or not collect examination preparation materials. Evaluating the quality of examination preparation materials can be problematic due to a lack of published reviews, a lack of usability testing by libraries, and librarians' and library users' unfamiliarity with the actual content of the examinations. Libraries must integrate faculty and students into the purchase process to ensure that examination preparation resources of the highest quality are purchased.
Affiliation(s)
- Dean Hendrix: Health Sciences Library, University at Buffalo, Buffalo, NY 14214, USA
7. Dyrbye LN, Thomas MR, Natt N, Rohren CH. Prolonged delays for research training in medical school are associated with poorer subsequent clinical knowledge. J Gen Intern Med 2007;22:1101-1106. PMID: 17492473. PMCID: PMC2305740. DOI: 10.1007/s11606-007-0200-x.
Abstract
BACKGROUND Complementary degree programs and research training are important alternative tracks in medical school that typically interrupt the traditional MD curriculum. OBJECTIVE To examine the effects of such a break on clinical knowledge after reentry into the MD curriculum. DESIGN Retrospective cohort study. PARTICIPANTS Three hundred two graduates of Mayo Medical School. MAIN MEASUREMENTS Years of delay between the second and third years of medical school were compared with third-year clerkship grades, National Board of Medical Examiners (NBME) Subject Examination scores, and United States Medical Licensing Examination (USMLE) Step 2 scores. MAIN RESULTS 258, 13, and 31 students spent 0, 1, or ≥3 years, respectively, pursuing research between the second and third years. Baseline measures of knowledge before matriculation and before the third year were similar between groups. Whereas a 1-year delay had no significant effect, a ≥3-year delay was associated with fewer clerkship honors and lower NBME Medicine, Pediatrics, and Psychiatry percentiles compared with no delay (all p < .05). Students with a ≥3-year delay had a 77% reduction in the odds of honors in Medicine. For each year of delay beyond 3, students' third-year NBME Medicine, Neurology, Obstetrics and Gynecology, and Psychiatry scores decreased, as did USMLE Step 2 scores (r = -.38 to -.50, p < .05). CONCLUSIONS Delays of ≥3 years between the second and third years of medical school are associated with lower grades and scores on clinical knowledge tests. Further research is needed to determine the optimal timing of research training and to develop effective interventions to facilitate reentry into the medical school curriculum.
Affiliation(s)
- Liselotte N Dyrbye: Department of Medicine, Mayo Clinic College of Medicine, 200 First Street SW, Rochester, MN 55906, USA
8.
Affiliation(s)
- Joseph H Flaherty: Division of Geriatric Medicine, Saint Louis University School of Medicine, Missouri 63104, USA
9. McLay R, Klingsberg R, Florez L, Bhattacharjee M, Garcia C, Sutton C, Crawford B. A web page to teach neurology and neuropathology to medical students. Neuropathol Appl Neurobiol 2001;27:142-144. PMID: 11437995. DOI: 10.1046/j.1365-2990.2001.00320.x.
10. Edelstein RA, Reid HM, Usatine R, Wilkes MS. A comparative study of measures to evaluate medical students' performance. Acad Med 2000;75:825-833. PMID: 10965862. DOI: 10.1097/00001888-200008000-00016.
Abstract
PURPOSE To assess how new National Board of Medical Examiners (NBME) performance examinations, computer-based case simulations (CBX) and standardized patient examinations (SPX), compare with each other and with traditional internal and external measures of medical students' performance. A secondary objective was to examine students' attitudes toward new and traditional evaluation modalities. METHOD Fourth-year students (n = 155) at the University of California, Los Angeles, School of Medicine (including joint programs at Charles R. Drew University of Medicine and Science and the University of California, Riverside) were assigned two days of performance examinations (eight SPXs, ten CBXs, and a self-administered attitudinal survey). The CBX was scored by the NBME and the SPX by an NBME/Macy consortium. Scores were linked to the survey and correlated with archival student data, including traditional performance indicators (licensing board scores, grade-point averages, etc.). RESULTS Of the 155 students, 95% completed the testing. The CBX and the SPX had low to moderate statistically significant correlations with each other and with traditional measures of performance. Traditional measures were intercorrelated at higher levels than with the CBX or SPX. Students' perceptions of the various evaluation methods varied by assessment; these findings are consistent with the theoretical construct underlying the development of performance examinations. For example, students rated the CBX best for assessing clinical decision making, while they rated multiple-choice examinations best for assessing knowledge. CONCLUSION Examination results and student perception studies provide converging evidence that performance examinations measure different physician competency domains and support the use of multipronged assessment approaches.
Affiliation(s)
- R A Edelstein: Department of Family Medicine, Charles R. Drew University of Medicine and Science, College of Medicine, Academic Affairs, Los Angeles, CA 90059, USA