1. Tolsma R, Shebrain S, Berry SD, Miller L. Medical student perceptions of assessments of clinical reasoning in a general surgery clerkship. BMC Med Educ 2024;24:211. [PMID: 38429706] [PMCID: PMC10908043] [DOI: 10.1186/s12909-024-05184-w]
Abstract
BACKGROUND Components factoring into general surgery clerkship grades vary by institution, and while evaluators attempt to remain unbiased when evaluating medical student performance, subjectivity and implicit bias remain an issue. Our institution recently implemented a case-based structured oral examination to provide the general surgery clerkship director objective insight into students' clinical reasoning skills. We hypothesized that medical students believe this exam, along with graded clinical documentation and the Observed Standardized Clinical Encounter (OSCE), are fair assessments that increase students' awareness of their clinical reasoning skills. METHODS A survey was sent to third-year medical students in the classes of 2023 and 2024 at our institution who had completed their general surgery clerkship. Students rated five grading assessments (preceptor evaluations, the oral examination, clinical documentation, the OSCE, and the shelf exam) on fairness and on how much insight each gave them into their clinical reasoning, using a five-point Likert scale (1 = strongly agree, 5 = strongly disagree). RESULTS One hundred ten of 162 (67.9%) students responded to the survey. The shelf examination was the most highly regarded assessment tool, followed by the oral examination. Seventy-three percent agreed or strongly agreed that the oral exam was a fair assessment, and 80% agreed or strongly agreed that it gave them insight into their clinical reasoning skills. By contrast, only 41.8% of students agreed or strongly agreed that preceptor evaluations were fair assessments, and 42.7% agreed or strongly agreed that they gave them insight into their clinical reasoning. CONCLUSIONS Third-year medical students on a general surgery clerkship favor the shelf examination and a case-based oral examination over other assessment tools with respect to fairness and perceived insight into their clinical reasoning. This type of examination can provide general surgery clerkship directors with additional objective data to assess medical students more fairly and improve students' clinical reasoning.
Affiliation(s)
- Rachael Tolsma: Department of Orthopaedic Surgery, University of Wisconsin-Madison, 1685 Highland Ave, Madison, WI 53705, USA
- Saad Shebrain: Department of General Surgery, Western Michigan University Homer Stryker MD School of Medicine, Kalamazoo, MI, USA
- Shamsi Daneshvari Berry: Department of Biomedical Informatics, Western Michigan University Homer Stryker MD School of Medicine, Kalamazoo, MI, USA
- Lisa Miller: Department of General Surgery, Western Michigan University Homer Stryker MD School of Medicine, Kalamazoo, MI, USA
2. Chang O, Holbrook AM, Lohit S, Deng J, Xu J, Lee M, Cheng A. Comparability of Objective Structured Clinical Examinations (OSCEs) and Written Tests for Assessing Medical School Students' Competencies: A Scoping Review. Eval Health Prof 2023;46:213-224. [PMID: 36959750] [PMCID: PMC10443966] [DOI: 10.1177/01632787231165797]
Abstract
Objective Structured Clinical Examinations (OSCEs) and written tests are commonly used to assess health professional students, but it remains unclear whether the additional human resources and expenses required for OSCEs, both in-person and online, are worthwhile for assessing competencies. This scoping review summarized literature identified by searching MEDLINE and EMBASE comparing 1) OSCEs and written tests and 2) in-person and online OSCEs for assessing health professional trainees' competencies. For Q1, 21 studies satisfied inclusion criteria. Medical trainees were the most frequently examined group (19, 90.5%), the comparison was most frequently OSCEs versus multiple-choice questions (MCQs) (18, 85.7%), and 18 (87.5%) studies examined the same competency domain. Most (77.5%) total score correlation coefficients between testing methods were weak (r < 0.40). For Q2, 13 articles were included. In-person and online OSCEs were most often used for medical trainees (9, 69.2%), checklists were the most prevalent evaluation scheme (7, 63.6%), and 14/17 overall score comparisons were not statistically significantly different. Generally low correlations between MCQ and OSCE scores provide insufficient evidence as to whether OSCEs offer enough value to be worth their additional cost. Online OSCEs may be a viable alternative to in-person OSCEs for certain competencies where technical challenges can be met.
Affiliation(s)
- Oswin Chang, Simran Lohit, Jiawen Deng, Janice Xu, Alan Cheng: Clinical Pharmacology and Toxicology Research, St Joseph's Healthcare Hamilton; Faculty of Health Sciences, McMaster University
- Anne M. Holbrook: Clinical Pharmacology and Toxicology Research, St Joseph's Healthcare Hamilton; Division of Clinical Pharmacology and Toxicology, McMaster University
- Munil Lee: Schulich School of Medicine and Dentistry, University of Western Ontario
3. Huerta CT, Cohen BL, Hernandez AE, Saberi RA, Thorson CM, Hui VW, Rodgers SE, Sands LR. Examination Scores but not Clinical Performance Correlate With Duration of Preclinical Didactic Time: A Synchronous Comparison of Second- Versus Third-Year Medical Students on the Surgery Clerkship. J Surg Educ 2023;80:957-964. [PMID: 37277232] [DOI: 10.1016/j.jsurg.2023.05.001]
Abstract
OBJECTIVE Numerous institutions have reduced preclinical didactic time to facilitate earlier clinical exposure during the second year of medical education. However, the effects that shortened preclinical education may have on performance in the surgery clerkship are unclear. This study aims to compare the clinical and examination performance of second- (MS2) and third-year (MS3) students synchronously completing an identical surgery clerkship. DESIGN All students completing the surgery clerkship (identical didactics, examinations, clinical rotations, etc.) were included. MS3s received 24 months of preclinical education, whereas MS2s received 14 months. Performance outcomes included weekly quizzes based on lectures, NBME Surgery Shelf Exam, numeric clinical evaluations, objective structured clinical examination (OSCE) scores, and overall clerkship grades. SETTING University of Miami Miller School of Medicine. PARTICIPANTS All second- (MS2) and third-year (MS3) medical students completing the Surgery Clerkship over 1 year (n = 395). RESULTS There were 199 MS3 (50%) and 196 MS2 (50%) students. MS3s demonstrated higher median shelf exam scores (77% vs 72% MS2s), weekly quiz score averages (87% vs 80% MS2s), clinical evaluations (96% vs 95% MS2s), and overall clerkship grades (89% vs 87% MS2s), all p < 0.020. There was no difference in median OSCE performance (both 92%; p = 0.499). A greater proportion of MS3 students performed in the highest 50% of weekly quiz scores (57% vs 43% MS2), NBME shelf exams (59% vs 39% MS2), and overall clerkship grades (45% vs 37% MS2), all p < 0.010. No significant difference was observed in the proportion of students placing in the top 50% of clinical parameters, including the OSCE (48% MS3 vs 46% MS2; p = 0.106) and clinical evaluations (45% MS3 vs 38% MS2; p = 0.185). CONCLUSIONS Although the duration of preclerkship education may correspond to examination scores, MS2s and MS3s perform similarly on clinical metrics. Future strategies to enhance available preclinical didactic time and preparation for examinations are needed.
Affiliation(s)
- All authors: Division of Pediatric Surgery, DeWitt Daughtry Family Department of Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
4. Huerta CT, Saberi RA, Thorson CM, Hui VW, Rodgers SE, Sands LR. Effects of Recorded versus Live Teleconference Didactic Lectures on Medical Student Performance in the Surgery Clerkship. J Surg Educ 2023;80:228-234. [PMID: 36241483] [PMCID: PMC9551991] [DOI: 10.1016/j.jsurg.2022.09.017]
Abstract
OBJECTIVE Due to the COVID-19 pandemic, numerous institutions converted medical education didactics to electronic formats including both live teleconference didactics and recorded faculty lectures. This study aims to compare the effect of recorded versus live teleconference didactic lectures on medical student examination scores during the surgery clerkship. DESIGN Medical students completing the Surgery Clerkship received a weekly series of didactic lectures taught by faculty via a teleconference (2020-2021 academic year) or recorded format (2021-2022 academic year). Performance outcomes included weekly quizzes, National Board of Medical Examiners (NBME) Surgery Shelf Exam, and clerkship Objective Structured Clinical Examination (OSCE) scores. SETTING University of Miami Miller School of Medicine. PARTICIPANTS All second- (MS2) and third-year (MS3) medical students completing the Surgery Clerkship over two academic years (n = 312). RESULTS Students who received live teleconference lectures (n = 156) demonstrated higher average scores on weekly quizzes (89%) and the NBME shelf exam (76%) compared to those receiving recorded lectures (n = 156; 71% quiz, 70% shelf exam), both p < 0.001. There was a significant association with performance in the highest quartile (Q1) of weekly quiz scores and receiving live lectures (40% vs. recorded lectures 1%, p < 0.001). Comparing only MS3 students, mean weekly quiz scores and Q1 achievement were significantly higher (both p < 0.001) in the teleconference cohort with no significant difference in NBME shelf exam performance (p = 0.971). No difference in OSCE performance was observed between groups. CONCLUSION These results suggest that synchronous teleconferences may be more effective than recorded lectures for achieving institutional learning objectives on the surgery clerkship without any negative impact on NBME shelf exam or clinical evaluation parameters. This information should be used to inform future institutional clerkship design and educational initiatives.
Affiliation(s)
- All authors: DeWitt Daughtry Family Department of Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
5. The Lake Wobegon effect is real: All general surgery residents appear to be better than average. Surg Pract Sci 2022. [DOI: 10.1016/j.sipas.2022.100134]
6. Concordance of Narrative Comments with Supervision Ratings Provided During Entrustable Professional Activity Assessments. J Gen Intern Med 2022;37:2200-2207. [PMID: 35710663] [PMCID: PMC9296736] [DOI: 10.1007/s11606-022-07509-1]
Abstract
BACKGROUND Use of EPA-based entrustment-supervision ratings to determine a learner's readiness to assume patient care responsibilities is expanding. OBJECTIVE In this study, we investigate the correlation between narrative comments and supervision ratings assigned during ad hoc assessments of medical students' performance of EPA tasks. DESIGN Data from assessments completed for students enrolled in the clerkship phase over 2 academic years were used to extract a stratified random sample of 100 narrative comments for review by an expert panel. PARTICIPANTS A review panel comprising faculty with specific expertise related to their roles within the EPA program provided a "gold standard" supervision rating using the comments provided by the original assessor. MAIN MEASURES Interrater reliability (IRR) between members of the review panel and correlation coefficients (CC) between expert ratings and supervision ratings from original assessors. KEY RESULTS IRR among members of the expert panel ranged from .536 for comments associated with focused history taking to .833 for complete physical exam. CC (Kendall's W) between panel members' supervision ratings and the ratings provided by the original assessors for history taking, physical examination, and oral presentation comments were .668, .697, and .735, respectively. The supervision ratings of the expert panel correlated most strongly with ratings provided by master assessors, faculty trained to assess students across clinical contexts. Correlation between supervision ratings provided with the narrative comments at the time of observation and supervision ratings assigned by the expert panel differed by clinical discipline, perhaps reflecting the value placed on, and comfort level with, assessment of the task in a given specialty. CONCLUSIONS To realize the full educational and catalytic effect of EPA assessments, assessors must apply established performance expectations and provide high-quality narrative comments aligned with the criteria.
7. The Surgical Skills and Technology Elective Program and Medical Student Career Choice. J Surg Res 2022;273:127-131. [DOI: 10.1016/j.jss.2021.12.020]
8. Surgical clerkship: Do examination scores correlate with clinical performance? Am J Surg 2021;222:1163-1166. [PMID: 34602278] [DOI: 10.1016/j.amjsurg.2021.09.016]
Abstract
BACKGROUND This study aims to determine if there are correlations between clinical performance and objective grading parameters for medical students in the third-year surgery clerkship. METHODS Clerkship grades were compiled from 2016 to 2020. Performance on clinical rotations, NBME shelf exam, oral exam, and weekly quizzes were reviewed. Students were divided into quartiles (Q1-Q4) based on clinical performance. Standard statistical analysis was performed. RESULTS There were 625 students included in the study. Students in Q1+Q2 were more likely than those in Q3+Q4 to score in the top quartile on the shelf exam (29% vs. 19%, p = 0.002), oral exam (24% vs. 17%, p = 0.032), and quizzes (22% vs. 15%, p = 0.024). However, there was negligible correlation between clinical performance and performance on objective measures: shelf exam (R2 = 0.027, p < 0.001), oral exam (R2 = 0.021, p < 0.001), and weekly quizzes (R2 = 0.053, p = 0.092). CONCLUSIONS Clinical performance does not correlate with objective grading parameters for medical students in the third-year surgery clerkship.
9. Brallier I, Mahmood S, Grotkowski K, Taylor J, Zdon M. Does surgical observed structured clinical exam (OSCE) predict clerkship grade, shelf exam scores, and preceptor clinical evaluation? Am J Surg 2021;222:1167-1171. [PMID: 34511199] [DOI: 10.1016/j.amjsurg.2021.08.038]
Abstract
BACKGROUND Clinical evaluation of medical student performance has been criticized as variable and subjective. The aim of this study was to assess the correlation of a summative surgical OSCE with clinical faculty evaluations, surgery shelf exam scores, and final grades. METHODS The performance of 392 students who completed the surgical clerkship between 2017 and 2019 was assessed via Pearson coefficients comparing OSCE grades, clinical evaluations of Medical Knowledge and Patient Care, Communication and Professionalism, the National Board of Medical Examiners (NBME) surgical subject (shelf) exam, and final clerkship grade. RESULTS Results demonstrate a statistically significant positive relationship between the OSCE and the shelf score and grade, final clerkship grade, and all clinical evaluations except Communication skills. The greatest correlation occurred between the OSCE and shelf scores and grades. Although significant, the degree of correlation with clinical observation was markedly lower. CONCLUSION This study demonstrates that a surgical OSCE has a small positive correlation with clinical knowledge as measured by the NBME shelf exam, with an equal correlation with medical knowledge standards and the OSCE better predicting NBME shelf outcome. The lower correlation with clinical assessment suggests either that clinical grades capture elements not detected on an OSCE or that variability in clinical grades reflects a significant degree of subjectivity.
Affiliation(s)
- Ian Brallier, Sabah Mahmood, Jessica Taylor, Michael Zdon: Chicago Medical School at Rosalind Franklin University, 3333 Green Bay Road, North Chicago, IL 60064, USA
- Karolina Grotkowski: Psychology Doctorate Program, Rosalind Franklin University College of Health Professions, 3333 Green Bay Road, North Chicago, IL 60064, USA
10. Jaber J, Keric N, Kang P, Feinstein AJ. Predicting success: A comparative analysis of student performance on the surgical clerkship and the NBME surgery subject exam. Surg Open Sci 2020;1:86-89. [PMID: 32754698] [PMCID: PMC7391907] [DOI: 10.1016/j.sopen.2019.07.002]
Abstract
Background The National Board of Medical Examiners surgery shelf is a well-established terminal measure of student medical knowledge. No study has explored the correlation between intraclerkship quizzes and shelf exam performance. Methods Weekly quiz and National Board of Medical Examiners scores were collected from 156 third-year students who participated in a 12-week surgical clerkship from 2015 to 2017. Kruskal-Wallis, Wilcoxon rank sum, and linear regression analyses were completed. Results Trauma/Burns, Esophagus/Anorectal, and Wound/Intensive Care Unit quiz content corresponded with increased National Board of Medical Examiners performance, with β-coefficients of 1.57 (P < .001), 1.42 (P < .001), and 1.38 (P < .001), respectively. Wound/Intensive Care Unit and Cardio/Vascular content corresponded with a decreased likelihood of scoring < 70 points on the National Board of Medical Examiners exam (OR: 0.75 (P = .03) and 0.68 (P = .02)). Aggregate quiz scores stratified by academic block were 67 (IQR 64-69.5), 77 (IQR 74.5-80), 76.5 (IQR 67-89.5), and 83 (IQR 76-85) for academic blocks 1, 2, 3, and 4, respectively (P < .001). Conclusion Modeling National Board of Medical Examiners outcomes as a function of weekly quizzes taken during a 12-week surgery clerkship is a viable concept. Intraclerkship quiz scores correlate with NBME surgical shelf performance. Wound/ICU, esophagus/anorectal, and trauma/burns content was "high-yield." Wound/ICU and Cardiac/Vascular quizzes were predictive of NBME scores. Clerkship participation later in the third year correlates with increased performance.
Affiliation(s)
- Jamil Jaber, Paul Kang: University of Arizona College of Medicine Phoenix
11. Warner DO, Isaak RS, Peterson-Layne C, Lien CA, Sun H, Menzies AO, Cole DJ, Dainer RJ, Fahy BG, Macario A, Suresh S, Harman AE. Development of an Objective Structured Clinical Examination as a Component of Assessment for Initial Board Certification in Anesthesiology. Anesth Analg 2020;130:258-264. [PMID: 31688077] [DOI: 10.1213/ane.0000000000004496]
Abstract
With its first administration of an Objective Structured Clinical Examination (OSCE) in 2018, the American Board of Anesthesiology (ABA) became the first US medical specialty certifying board to incorporate this type of assessment into its high-stakes certification examination system. The fundamental rationale for the ABA's introduction of the OSCE is to include an assessment that allows candidates for board certification to demonstrate what they actually "do" in domains relevant to clinical practice. Inherent in this rationale is that the OSCE will capture competencies not well assessed in the current written and oral examinations, competencies that will allow the ABA to judge more properly whether a candidate meets the standards expected for board certification. This special article describes the ABA's journey from initial conceptualization through first administration of the OSCE, including the format of the OSCE, the process for scenario development, the standardized patient program that supports OSCE administration, examiner training, scoring, and future assessment of reliability, validity, and impact of the OSCE. This information will be beneficial both to those involved in the initial certification process, such as residency graduate candidates and program directors, and to others contemplating the use of high-stakes summative OSCE assessments.
Affiliation(s)
- David O Warner: Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Robert S Isaak: Department of Anesthesiology, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
- Cynthia A Lien: Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Huaping Sun, Anna O Menzies, Ann E Harman: The American Board of Anesthesiology, Raleigh, North Carolina
- Daniel J Cole: Department of Anesthesiology and Perioperative Medicine, University of California, Los Angeles, Los Angeles, California
- Rupa J Dainer: Department of Ambulatory Surgery, Pediatric Specialists of Virginia, Fairfax, Virginia
- Brenda G Fahy: Department of Anesthesiology, University of Florida, Gainesville, Florida
- Alex Macario: Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California
- Santhanam Suresh: Department of Pediatric Anesthesiology, Ann & Robert H. Lurie Children's Hospital of Chicago, Northwestern University, Chicago, Illinois
12. Schilling DC. Using the Clerkship Shelf Exam Score as a Qualification for an Overall Clerkship Grade of Honors: A Valid Practice or Unfair to Students? Acad Med 2019;94:328-332. [PMID: 30188368] [DOI: 10.1097/acm.0000000000002438]
Abstract
Most clerkships require medical students to pass the National Board of Medical Examiners (NBME) subject (shelf) exam to pass the clerkship. Many use the NBME's recommended honors cut score on the shelf exam to determine medical student eligibility for an overall clerkship grade of honors. This use of a conjunctive scoring model for determining honors is inconsistent with the logic behind the intended use of this model for making pass-fail determinations. Further, many clerkships use grading systems that employ both this conjunctive model for honors eligibility and a compensatory scoring model for determining the overall clerkship grade. For students who fall short of the shelf exam honors cut score, such a grading system effectively increases the weighting of shelf exam performance and decreases the clerkship's transparency about the weighting of performance on other assessments toward the clerkship composite score and overall grade. It may also lead to contradictory grading results in which student B has a higher composite score than student A, yet student B receives a lower overall grade. The author illustrates how to calculate a weight for shelf exam performance that would be fairer to students and would help create a more transparent weighting scheme for the grading system. The author recommends that clerkships restructure their grading systems so that shelf exam honors-eligibility cut scores are not used as conjunctive criteria for determining overall clerkship grades of honors. A reexamination of the NBME's practice of suggesting honors-eligibility cut scores for shelf exams is also recommended.
Affiliation(s)
- David C Schilling: associate professor and psychiatry clerkship director, Loyola University Chicago Stritch School of Medicine, Maywood, Illinois; ORCID: http://orcid/000-0001-8553-6186
13. Variation of surgery clerkship grades in US medical schools. Am J Surg 2019;217:329-334. [DOI: 10.1016/j.amjsurg.2018.09.024]
15. Karmali RJ, Siu JM, You DZ, Spano S, Winthrop AL, Rudan JF, Reznick RK, Sanfilippo AT, Belliveau P. The Surgical Skills and Technology Elective Program (SSTEP): A comprehensive simulation-based surgical skills initiative for preclerkship medical students. Am J Surg 2018;216:375-381. [DOI: 10.1016/j.amjsurg.2017.09.012]
16. Cardenas Lara F, Naik ND, Pandian TK, Gas BL, Strubel S, Cadeliña R, Heller SF, Farley DR. A Comparison of Objective Assessment Data for the United States and International Medical Graduates in a General Surgery Residency. J Surg Educ 2017;74:e1-e7. [PMID: 28869159] [DOI: 10.1016/j.jsurg.2017.08.003]
Abstract
OBJECTIVE To compare objective assessment scores between international medical graduates (IMGs) and United States medical graduates (USMGs). Scores of residents who completed a preliminary year and later matched into a categorical position were compared to those of residents who matched directly into a categorical position at the Mayo Clinic, Rochester. DESIGN Postgraduate year (PGY) 1 to 5 residents participate in a biannual multistation, OSCE-style assessment event as part of our surgical training program. Assessment data were retrospectively reviewed and analyzed from 2008 to 2016 for PGY-1 residents and from 2013 to 2016 for PGY 2 to 5 categorical residents. SETTING Academic medical center. PARTICIPANTS Categorical PGY 1 to 5 General Surgery (GS) residents at Mayo Clinic, Rochester, MN. RESULTS A total of 86 GS residents were identified. Twenty-one residents (1 USMG and 20 IMGs) who completed a preliminary GS year before matching into a categorical position were compared with 68 residents (58 USMGs and 10 IMGs) who matched directly into a categorical position. Mean scores (%) for the summer and winter multistation assessments were higher for PGY-1 trainees with a preliminary year than for those without (summer: 59 vs. 37, p < 0.001; winter: 69 vs. 61, p = 0.05). Summer and winter PGY-2 scores followed the same pattern (74 vs. 64, p < 0.01; 85 vs. 71, p < 0.01). For the PGY 3 to 5 assessments, differences in scores between these groups were not observed. IMGs and USMGs scored equivalently on all assessments. Overall, junior residents showed greater score improvement between tests than their senior colleagues (mean score increase: PGY 1-2 = 18 vs. PGY 3-5 = 3, p < 0.001). CONCLUSIONS Residents with a previous preliminary GS year at our institution scored higher on initial assessments than trainees with no prior GS training at our institution. The scoring advantage of an added preliminary year decreased as trainees progressed through residency.
Affiliation(s)
- Nimesh D Naik, T K Pandian, Becca L Gas, Suzanne Strubel, Rachel Cadeliña, Stephanie F Heller, David R Farley: Department of Surgery, Mayo Clinic College of Medicine, Rochester, Minnesota
17. Sridhar J, Shahlaee A, Mehta S, Rahimy E, Garg SJ, Finklea BD, Dunn JP, Chiang A. Usefulness of Structured Video Indirect Ophthalmoscope-Guided Education in Improving Resident Ophthalmologist Confidence and Ability. 2017;1:282-287. [DOI: 10.1016/j.oret.2016.12.010]