1
Brenner JM, Fulton TB, Kruidering M, Bird JB, Willey J, Qua K, Olvet DM. What have we learned about constructed response short-answer questions from students and faculty? A multi-institutional study. Med Teach 2024; 46:349-358. [PMID: 37688773] [DOI: 10.1080/0142159x.2023.2249209]
Abstract
PURPOSE The purpose of this study was to enrich understanding of the perceived benefits and drawbacks of constructed response short-answer questions (CR-SAQs) in preclerkship assessment, using Norcini's criteria for good assessment as a framework. METHODS This multi-institutional study surveyed students and faculty at three institutions. A survey using Likert-scale and open-ended questions was developed to evaluate faculty and student perceptions of CR-SAQs against the criteria of good assessment. Descriptive statistics and Chi-square analyses are presented, and open responses were analyzed using directed content analysis to describe the benefits and drawbacks of CR-SAQs. RESULTS A total of 260 students (19%) and 57 faculty (48%) completed the survey. Students and faculty reported that the benefits of CR-SAQs are authenticity, deeper learning (educational effect), and receiving feedback (catalytic effect). Drawbacks included feasibility, construct validity, and scoring reproducibility. Students and faculty found CR-SAQs to be both acceptable (students can show their reasoning and earn partial credit) and unacceptable (stressful, not USMLE format). CONCLUSIONS CR-SAQs are a method of aligning innovative curricula with assessment and could enrich the assessment toolkit for medical educators.
Affiliation(s)
- Judith M Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Tracy B Fulton
- Department of Biochemistry and Biophysics, University of California San Francisco, San Francisco, California, USA
- Marieke Kruidering
- Department of Cellular and Molecular Pharmacology, University of California San Francisco, San Francisco, California, USA
- Jeffrey B Bird
- Department of Science Education, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Joanne Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Kelli Qua
- Center for Medical Education, Case Western Reserve University School of Medicine, Cleveland, Ohio, USA
- Doreen M Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
2
LeClair RJ, Binks AP, Gambala CT, Brenner JM, Willey JM. The Impact of Changing Step 1 to Pass/Fail Reporting on Anxiety, Learning Approaches, and Curiosity. Med Sci Educ 2023; 33:1197-1204. [PMID: 37886271] [PMCID: PMC10597890] [DOI: 10.1007/s40670-023-01878-w]
Abstract
Purpose Given the significance of the US Medical Licensing Exam (USMLE) Step 1 score moving from a 3-digit value to pass/fail, the authors investigated the impact of the change on students' anxiety, approach to learning, and curiosity. Method Two cohorts of pre-clerkship medical students at three medical schools completed a composite of four instruments: the State-Trait Anxiety Inventory (STAI), the revised two-factor Study Process Questionnaire, the Interest/Deprivation Type Epistemic Curiosity Scale, and the Short Grit Scale, prior to taking either the last 3-digit scored Step 1 in 2021 or the first pass/fail scored Step 1 in 2022. Responses of 3-digit and pass/fail exam takers were compared (Mann-Whitney U), and multiple regression path analysis was performed to determine the factors that significantly impacted learning strategies. Results There was no difference between 3-digit (n = 86) and pass/fail exam takers (n = 154) in anxiety (STAI scores, 50 vs. 49, p = 0.85), shallow learning strategies (22 vs. 23, p = 0.84), or interest curiosity scores (median scores 15 vs. 15, p = 0.07). However, pass/fail exam takers had lower deprivation curiosity scores (median 12 vs. 11, p = 0.03) and showed a decline in deep learning strategies (30 vs. 27, p = 0.0012). Path analysis indicated the decline in deep learning strategies was due to the change in exam scoring (β = -2.0428, p < 0.05). Conclusions Counter to the stated hypothesis and intentions, the change to pass/fail grading for USMLE Step 1 initially failed to reduce learner anxiety, and it reduced curiosity and deep learning strategies. Supplementary Information The online version contains supplementary material available at 10.1007/s40670-023-01878-w.
Affiliation(s)
- Renée J. LeClair
- Department of Basic Science Education, Virginia Tech Carilion School of Medicine, Roanoke, VA, USA
- Andrew P. Binks
- Department of Basic Science Education, Virginia Tech Carilion School of Medicine, Roanoke, VA, USA
- Cecilia T. Gambala
- Office of Academic Affairs, Tulane University School of Medicine, New Orleans, LA, USA
- Judith M. Brenner
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- New York University Grossman Long Island School of Medicine, Mineola, NY, USA
- Joanne M. Willey
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
3
Olvet DM, Bird JB, Fulton TB, Kruidering M, Papp KK, Qua K, Willey JM, Brenner JM. A Multi-institutional Study of the Feasibility and Reliability of the Implementation of Constructed Response Exam Questions. Teach Learn Med 2023; 35:609-622. [PMID: 35989668] [DOI: 10.1080/10401334.2022.2111571]
Abstract
PROBLEM Some medical schools have incorporated constructed response short answer questions (CR-SAQs) into their assessment toolkits. Although CR-SAQs carry benefits for medical students and educators, the faculty perception that the amount of time required to create and score CR-SAQs is not feasible, along with concerns about reliable scoring, may impede the use of this assessment type in medical education. INTERVENTION Three US medical schools collaborated to write and score CR-SAQs based on a single vignette. Study participants included faculty question writers (N = 5) and three groups of scorers: faculty content experts (N = 7), faculty non-content experts (N = 6), and fourth-year medical students (N = 7). Structured interviews were performed with question writers, and an online survey was administered to scorers to gather information about their process for creating and scoring CR-SAQs. A content analysis was performed on the qualitative data using Bowen's model of feasibility as a framework. To examine inter-rater reliability between the content expert and other scorers, a random selection of fifty student responses from each site was scored by each site's faculty content experts, faculty non-content experts, and student scorers. A holistic rubric (6-point Likert scale) was used by two schools and an analytic rubric (3-4 point checklist) was used by one school. Cohen's weighted kappa (κw) was used to evaluate inter-rater reliability. CONTEXT This research study was implemented at three US medical schools that are nationally dispersed and have been administering CR-SAQ summative exams as part of their programs of assessment for at least five years. The study exam question was included in an end-of-course summative exam during the first year of medical school. IMPACT Five question writers (100%) participated in the interviews and twelve scorers (60% response rate) completed the survey. Qualitative comments revealed three aspects of feasibility: practicality (time, institutional culture, teamwork), implementation (steps in the question writing and scoring process), and adaptation (feedback, rubric adjustment, continuous quality improvement). The scorers described their experience in terms of the need for outside resources, concern about lack of expertise, and value gained through scoring. Inter-rater reliability between the faculty content expert and student scorers was fair/moderate (κw = .34-.53, holistic rubrics) or substantial (κw = .67-.76, analytic rubric), but much lower between faculty content and non-content experts (κw = .18-.29, holistic rubrics; κw = .59-.66, analytic rubric). LESSONS LEARNED Our findings show that from the faculty perspective it is feasible to include CR-SAQs in summative exams, and we provide practical information for medical educators creating and scoring CR-SAQs. We also learned that CR-SAQs can be reliably scored by faculty without content expertise or senior medical students using an analytic rubric, or by senior medical students using a holistic rubric, which provides options to alleviate the faculty burden associated with grading CR-SAQs.
Affiliation(s)
- Doreen M Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- Jeffrey B Bird
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- Tracy B Fulton
- Department of Biochemistry and Biophysics, University of California San Francisco School of Medicine, San Francisco, California, USA
- Marieke Kruidering
- Department of Cellular and Molecular Pharmacology, University of California San Francisco School of Medicine, San Francisco, California, USA
- Klara K Papp
- Center for Medical Education, Case Western Reserve University School of Medicine, Cleveland, Ohio, USA
- Kelli Qua
- Research and Evaluation, Case Western Reserve University School of Medicine, Cleveland, Ohio, USA
- Joanne M Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- Judith M Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
4
Bird JB, Olvet DM, Orner D, Willey JM, Brenner JM. Exploring the impact of postponing core clerkships on future performance. Med Educ Online 2022; 27:2114864. [PMID: 36062838] [PMCID: PMC9448398] [DOI: 10.1080/10872981.2022.2114864]
Abstract
Despite the many clerkship models of medical education, all can be considered a form of experiential learning. Experiential learning is a complex pedagogical approach involving the development of cognitive skills in an environment with a unique culture and multiple stakeholders, which may impact learner motivation, confidence, and other noncognitive drivers of success. Students may delay the transition to the clerkship year for myriad reasons, and the intricate nature of experiential learning suggested this delay may impact student performance. This retrospective, observational study investigated the impact of clerkship postponement by measuring subsequent clerkship performance. Pre-clerkship and third-year clerkship performance were analyzed for three cohorts of students (classes of 2018, 2019, and 2020, N = 274) in which students had the option to delay the start of their clerkship year. A mixed analysis of variance (ANOVA) and paired t-tests were conducted to compare academic performance over time among students who did and did not delay. Across the three cohorts, 12% of students delayed the start of the clerkship year (N = 33). Regardless of prior academic performance, these students experienced a significant reduction in clerkship grades compared to their non-delaying peers. Delaying the start of the clerkship year may have durable negative effects on future academic performance. This information should be kept in mind for student advisement.
Affiliation(s)
- Jeffrey B. Bird
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Doreen M. Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- David Orner
- Office of Academic Affairs, Northwell Health, New Hyde Park, NY, USA
- Joanne M. Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Judith M. Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
5
Bird JB, Olvet DM, Willey JM, Brenner JM. A Generalizable Approach to Predicting Performance on USMLE Step 2 CK. Adv Med Educ Pract 2022; 13:939-944. [PMID: 36039184] [PMCID: PMC9419904] [DOI: 10.2147/amep.s373300]
Abstract
INTRODUCTION The elimination of the USMLE Step 1 three-digit score has created a deficit in standardized performance metrics for undergraduate medical educators and residency program directors. It is likely that there will be greater emphasis on USMLE Step 2 CK, an exam found to be associated with later clinical performance in residents and physicians. Because many previous models relied on Step 1 scores to predict student performance on Step 2 CK, we developed a model using other metrics. MATERIALS AND METHODS Assessment data for 228 students in three cohorts (classes of 2018, 2019, and 2020) were collected, including the Medical College Admission Test (MCAT), NBME Customized Assessment Service (CAS) exams, and NBME Subject exams. A linear regression model was conducted to predict Step 2 CK scores at five time-points: at the end of years one and two and at three trimester intervals in year three. An additional cohort (class of 2021) was used to validate the model. RESULTS Significant models were found at all five time-points in the curriculum, and predictability increased as students progressed: end of year 1 (adj R2 = 0.29), end of year 2 (adj R2 = 0.34), clerkship trimester 1 (adj R2 = 0.52), clerkship trimester 2 (adj R2 = 0.58), clerkship trimester 3 (adj R2 = 0.62). Including Step 1 scores did not significantly improve the final model. Using metrics from the class of 2021, the model predicted Step 2 CK performance within a mean square error (MSE) of 8.3 points (SD = 6.8) at the end of year 1, improving incrementally to within a mean of 5.4 points (SD = 4.1) by the end of year 3. CONCLUSION This model is highly generalizable and enables medical educators to predict student performance on Step 2 CK in the absence of Step 1 quantitative data as early as the end of the first year of medical education, with increasingly strong predictions as students progress through the clerkship year.
Affiliation(s)
- Jeffrey B Bird
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Doreen M Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Joanne M Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Judith M Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Department of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
6
Bird JB, Friedman KA, Arayssi T, Olvet DM, Conigliaro RL, Brenner JM. Review of the Medical Student Performance Evaluation: analysis of the end-users' perspective across the specialties. Med Educ Online 2021; 26:1876315. [PMID: 33606615] [PMCID: PMC7899642] [DOI: 10.1080/10872981.2021.1876315]
Abstract
The Medical Student Performance Evaluation (MSPE) is an important tool of communication used by program directors to make decisions in the residency application process. This study sought to understand the perspective and usage of the MSPE across multiple medical specialties, both now and in anticipation of the planned changes in USMLE Step 1 score reporting. A survey instrument including quantitative and qualitative measures was developed and piloted. The final survey was distributed to residency programs across 28 specialties in 2020 via the main contact on the ACGME listserv. Of the 28 specialties surveyed, at least one response was received from 26 (93%). Eight percent of all programs (364/4675) responded to the survey, with most respondents being program directors. Usage of the MSPE varied among specialties. Approximately one-third of end-users stated that the MSPE is very or extremely influential in their initial screening process. Slightly less than half agreed or strongly agreed that they trust the information to be an accurate representation of applicants, though slightly more than half agreed that the MSPE will become more influential once USMLE Step 1 becomes pass/fail. Professionalism was rated as the most important component and noteworthy characteristics among the least important in the decision-making process. Performance in the internal medicine clerkship was rated as the most influential, while neurology and psychiatry performances were rated as less influential. Overwhelmingly, respondents suggested that including comparative performance and/or class rank would make the MSPE more useful once USMLE Step 1 becomes pass/fail. MSPE end-users across a variety of specialties utilize this complex document in different ways and value it differentially in their decision-making processes. Despite this, continued mistrust of the MSPE persists.
A better understanding of end-users' perceptions of the MSPE offers the UME community an opportunity to transform the MSPE into a highly valued, trusted document of communication.
Affiliation(s)
- Jeffrey B. Bird
- Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Karen A. Friedman
- Vice Chair for Education, Department of Medicine, Northwell Health, Manhasset, NY, and Professor of Medicine, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Thurayya Arayssi
- Senior Associate Dean for Medical Education and CPD, Weill Cornell Medicine-Qatar, and Professor of Clinical Medicine, Weill Cornell Medicine, New York, NY, USA
- Doreen M. Olvet
- Assistant Professor and Medical Education Project Manager, Department of Science Education, Donald and Barbara Zucker School of Medicine, Hofstra/Northwell, Hempstead, New York, USA
- Judith M. Brenner
- Associate Dean for Educational Data and Analytics, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hofstra University, Hempstead, NY, USA
7
Olvet DM, Bird JB, Fulton TB, Kruidering M, Papp KK, Qua K, Willey JM, Brenner JM. Can Content Experts Rely on Others to Reliably Score Open-Ended Questions on Summative Exams? Acad Med 2021; 96:S210. [PMID: 34705711] [DOI: 10.1097/acm.0000000000004278]
Affiliation(s)
- Doreen M Olvet
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Jeffrey B Bird
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Tracy B Fulton
- University of California, San Francisco School of Medicine
- Marieke Kruidering
- University of California, San Francisco School of Medicine
- Klara K Papp
- Case Western Reserve University School of Medicine
- Kelli Qua
- Case Western Reserve University School of Medicine
- Joanne M Willey
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Judith M Brenner
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
8
Brenner JM, Bird JB, Brenner J, Orner D, Friedman K. Current State of the Medical Student Performance Evaluation: A Tool for Reflection for Residency Programs. J Grad Med Educ 2021; 13:576-580. [PMID: 34434519] [PMCID: PMC8370358] [DOI: 10.4300/jgme-d-20-01373.1]
Abstract
BACKGROUND The Medical Student Performance Evaluation (MSPE) provides important information to residency programs. Despite recent recommendations for standardization, it is not clear how much variation exists in MSPE content among schools. OBJECTIVES We describe the current section content of the MSPE in US allopathic medical schools, with a particular focus on variations in the presentation of student performance. METHODS A representative MSPE was obtained from 95.3% (143 of 150) of allopathic US medical schools through residency applications to the Zucker School of Medicine at Hofstra/Northwell in select programs for the 2019-2020 academic year. A manual data abstraction tool was piloted in 2018-2019. After training, it was used to code all portions of the MSPE in this study. The results were analyzed, and descriptive statistics were reported. RESULTS In preclinical years, 30.8% of MSPEs reported data regarding performance of students beyond achieving "passes" in a pass/fail curriculum. Only half referenced performance in the fourth year including electives, acting internships, or both. About two-thirds of schools included an overall descriptor of comparative performance in the final paragraph. Among these schools, a majority provided adjectives such as "outstanding/excellent/very good/good," while one-quarter reported numerical data categories. Regarding clerkship grades, there were numerous nomenclature systems used. CONCLUSIONS This analysis demonstrates the existence of extreme variability in the content of MSPEs submitted by US allopathic medical schools in the 2019-2020 cycle, including the components and nomenclature of grades and descriptors of comparative performance, display of data, and inclusion of data across all years of the medical education program.
Affiliation(s)
- Judith M. Brenner
- Judith M. Brenner, MD, is Associate Dean for Curricular Integration and Assessment, and Associate Professor of Science Education and Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Jeffrey B. Bird
- Jeffrey B. Bird, MA, is Educational Research & Strategic Assessment Analyst, and Assistant Professor of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Jason Brenner
- Jason Brenner, BS, is a Volunteer Research Assistant, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, and Student, University of Michigan
- David Orner
- David Orner, MPH, is a Research Assistant, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Karen Friedman
- Karen Friedman, MS, MD, is Vice Chair for Education, Department of Medicine, Northwell Health, and Professor of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
9
Binks AP, LeClair RJ, Willey JM, Brenner JM, Pickering JD, Moore JS, Huggett KN, Everling KM, Arnott JA, Croniger CM, Zehle CH, Krane NK, Schwartzstein RM. Changing Medical Education, Overnight: The Curricular Response to COVID-19 of Nine Medical Schools. Teach Learn Med 2021; 33:334-342. [PMID: 33706632] [DOI: 10.1080/10401334.2021.1891543]
Abstract
Issue: Calls to change medical education have been frequent, persistent, and generally limited to alterations in content or structural re-organization. Self-imposed barriers have prevented adoption of more radical pedagogical approaches, so recent predictions of the 'inevitability' of medical education transitioning to online delivery seemed unlikely. Then in March 2020 the COVID-19 pandemic forced medical schools to overcome established barriers overnight and make the most rapid curricular shift in medical education's history. We share the collated reports of nine medical schools and postulate how recent responses may influence future medical education. Evidence: While extraneous pandemic-related factors make it impossible to scientifically distinguish the impact of the curricular changes, some themes emerged. The rapid transition to online delivery was made possible by all schools having learning management systems and key electronic resources already blended into their curricula; we were closer to online delivery than anticipated. Student engagement with online delivery varied with the different pedagogies used, and the importance of social learning, interaction, and autonomy in learning was apparent. These are factors known to enhance online learning, and the student-centered modalities (e.g. problem-based learning) that included them appeared to be more engaging. Assumptions that the new online environment would be easily adopted and embraced by 'technophilic' students did not always hold true. Achieving true distance medical education will take longer than this 'overnight' response, but adhering to best practices for online education may open a new realm of possibilities. Implications: While this experience did not confirm that online medical education is really 'inevitable,' it revealed that it is possible.
Thoughtfully blending more online components into a medical curriculum will allow us to take advantage of this environment's strengths such as efficiency and the ability to support asynchronous and autonomous learning that engage and foster intrinsic learning in our students. While maintaining aspects of social interaction, online learning could enhance pre-clinical medical education by allowing integration and collaboration among classes of medical students, other health professionals, and even between medical schools. What remains to be seen is whether COVID-19 provided the experience, vision and courage for medical education to change, or whether the old barriers will rise again when the pandemic is over.
Affiliation(s)
- Andrew P Binks
- Department of Basic Science Education, Virginia Tech Carilion School of Medicine, Roanoke, Virginia, USA
- Renée J LeClair
- Department of Basic Science Education, Virginia Tech Carilion School of Medicine, Roanoke, Virginia, USA
- Joanne M Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- Judith M Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- James D Pickering
- Division of Anatomical Education, School of Medicine, University of Leeds, Leeds, UK
- Jesse S Moore
- Department of Surgery, Larner College of Medicine, University of Vermont, Burlington, Vermont, USA
- Kathryn N Huggett
- Department of Medicine, Larner College of Medicine, University of Vermont, Burlington, Vermont, USA
- Kathleen M Everling
- Office of Educational Development, School of Medicine at University of Texas Medical Branch, Galveston, Texas, USA
- John A Arnott
- Department of Medical Education, Geisinger Commonwealth School of Medicine, Scranton, Pennsylvania, USA
- Colleen M Croniger
- Department of Nutrition, Case Western Reserve University School of Medicine, Cleveland, Ohio, USA
- Christa H Zehle
- Department of Pediatrics, Larner College of Medicine, University of Vermont, Burlington, Vermont, USA
- N Kevin Krane
- Department of Medicine, Tulane University School of Medicine, New Orleans, Louisiana, USA
10
Abstract
In response to the need for physician leaders, the Donald and Barbara Zucker School of Medicine at Hofstra/Northwell developed the Klar Leadership Development and Innovation Management program. This novel program leverages its partnership with a large Northeast health system to longitudinally provide students with leadership fundamentals and mentored experiences.
Affiliation(s)
- Tiffany M. Jordan
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, 500 Hofstra University, Hempstead, NY 11549, USA
- Dual Degree Programs and Grants Management, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Joanne M. Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, 500 Hofstra University, Hempstead, NY 11549, USA
- Biomedical Sciences, Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Judith M. Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, 500 Hofstra University, Hempstead, NY 11549, USA
- Curricular Integration and Assessment, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
11
Hauer KE, Boscardin C, Brenner JM, van Schaik SM, Papp KK. Twelve tips for assessing medical knowledge with open-ended questions: Designing constructed response examinations in medical education. Med Teach 2020; 42:880-885. [PMID: 31282798] [DOI: 10.1080/0142159x.2019.1629404]
Abstract
Medical knowledge examinations employing open-ended (constructed response) items can be useful to assess medical students' factual and conceptual understanding. Modern day curricula that emphasize active learning in small groups and other interactive formats lend themselves to an assessment format that prompts students to share conceptual understanding, explain, and elaborate. The open-ended question examination format can provide faculty with insights into learners' abilities to apply information to clinical or scientific problems, and reveal learners' misunderstandings about essential content. To implement formative or summative assessments with open-ended questions in a rigorous manner, educators must design systems for exam creation and scoring. This includes systems for constructing exam blueprints, items and scoring rubrics, and procedures for scoring and standard setting. Information gained through review of students' responses can guide future educational sessions and curricular changes in a cycle of continuous improvement.
Affiliation(s)
- Karen E Hauer
- Department of Medicine, School of Medicine, University of California, San Francisco, CA, USA
- Christy Boscardin
- Department of Medicine, School of Medicine, University of California, San Francisco, CA, USA
- Judith M Brenner
- Department of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Sandrijn M van Schaik
- Department of Pediatrics, University of California, San Francisco School of Medicine, San Francisco, CA, USA
- Klara K Papp
- Division of General Medical Sciences, Case Western Reserve University School of Medicine, Cleveland, OH, USA
12
Willey JM, Olvet DM, Bird JB, Brenner JM. Pandemics Past and Present: A Guided Inquiry Approach. J Med Educ Curric Dev 2020; 7:2382120520976957. [PMID: 33294621] [PMCID: PMC7705775] [DOI: 10.1177/2382120520976957]
Abstract
BACKGROUND COVID-19 exposed undergraduate medical education curricular gaps in exploring historical pandemics, how to critically consume scientific literature and square it with the lay press, and how to grapple with emerging ethical issues. In addition, as medical students were dismissed from clinical environments, their capacity to build community and promote professional identity formation was compromised. METHODS A synchronous, online course entitled Life Cycle of a Pandemic was developed using a modified guided inquiry approach. Students met daily for 2 weeks in groups of 15 to 18 with a process facilitator. During the first week, students reported on lessons learned from past pandemics; in the second week, students discussed ethical concerns surrounding COVID-19 clinical trials, heard from physicians who provided patient care in the HIV and COVID-19 pandemics, and concluded with an opportunity for reflection. Following the course, students were asked to complete an anonymous, voluntary survey to assess their perceptions of the course. RESULTS With a response rate of 69%, an overwhelming majority of students agreed or strongly agreed that learning about historical pandemics helped them understand COVID-19 (72, 99%). The course successfully helped students understand current and potential COVID-19 management strategies as 66 (90%) agreed or strongly agreed they developed a better understanding of nonpharmacological interventions and new pharmacological treatments. Students also gained insight into the experiences of healthcare providers who cared for patients with HIV and COVID-19. Qualitative analysis of the open-ended comments yielded 5 main themes: critical appraisal of resources, responsibility of the physician, humanism, knowledge related to pandemics, and learning from history. CONCLUSIONS The onset of the COVID-19 crisis illustrated curricular gaps that could be remedied by introducing the history and biology of pandemics earlier in the curriculum. It was also apparent that learners need more practice in critically reviewing literature and comparing scientific literature with the lay press. The flexible format of the course promotes the development of future iterations that could cover evolving topics related to COVID-19. The course could also be repurposed for a graduate or continuing medical education audience.
Affiliation(s)
- Joanne M Willey
- Leo A. Guthart Professor of Biomedical Sciences, Chair, Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, 500 Hofstra University, Hempstead, NY 11549, USA
13
Brenner JM. The Revised Medical School Performance Evaluation: Does It Meet the Needs of Its Readers? J Grad Med Educ 2019; 11:475-478. [PMID: 31440345] [PMCID: PMC6699531] [DOI: 10.4300/jgme-d-19-00089.1]
Abstract
BACKGROUND The Medical School Performance Evaluation (MSPE) is an important factor for application to residency programs. Many medical schools are incorporating recent recommendations from the Association of American Medical Colleges MSPE Task Force into their letters. To date, there has been no feedback from the graduate medical education community on the impact of this effort. OBJECTIVE We surveyed individuals involved in residency candidate selection for internal medicine programs to understand their perceptions on the new MSPE format. METHODS A survey was distributed in March and April 2018 using the Association of Program Directors in Internal Medicine listserv, which comprises 4220 individuals from 439 residency programs. Responses were analyzed, and themes were extracted from open-ended questions. RESULTS A total of 140 individuals, predominantly program directors and associate program directors, from across the United States completed the survey. Most were aware of the existence of the MSPE Task Force. Respondents read a median of 200 to 299 letters each recruitment season. The majority reported observing evidence of adoption of the new format in more than one quarter of all medical schools. Among respondents, nearly half reported the new format made the MSPE more important in decision-making about a candidate. Within the MSPE, respondents recognized the following areas as most influential: academic progress, summary paragraph, graphic representation of class performance, academic history, and overall adjective of performance indicator (rank). CONCLUSIONS The internal medicine graduate medical education community finds value in many components of the new MSPE format, while recognizing there are further opportunities for improvement.
14
Brenner JM. Web Exclusive. Annals Story Slam - Telling the Untold Story. Ann Intern Med 2019; 170:SS1. [PMID: 30977766] [DOI: 10.7326/w19-0001]
15
Brenner JM, Bird JB, Willey JM. Formative Assessment in an Integrated Curriculum: Identifying At-Risk Students for Poor Performance on USMLE Step 1 Using NBME Custom Exam Questions. Acad Med 2017; 92:S21-S25. [PMID: 29065019] [DOI: 10.1097/acm.0000000000001914]
Abstract
PURPOSE The Hofstra Northwell School of Medicine (HNSOM) uses an essay-based assessment system. Recognizing the emphasis graduate medical education places on the United States Medical Licensing Examination (USMLE) Step exams, the authors developed a method to predict students at risk for lower performance on USMLE Step 1. METHOD Beginning with the inaugural class (2015), HNSOM administered National Board of Medical Examiners (NBME) Customized Assessment Service (CAS) examinations as formative assessment at the end of each integrated course in the first two years of medical school. Using preadmission data and NBME score deviations from the national test takers' mean on the first two courses in the educational program, a statistical model was built to predict students who scored below the Step 1 national mean. RESULTS A regression equation using the highest Medical College Admission Test (MCAT) score and NBME score deviation predicted student Step 1 scores. The MCAT alone accounted for 21% of the variance. Adding the NBME score deviation from the first and second courses increased the variance explained to 40% and 50%, respectively. Adding NBME exams from later courses increased the variance explained to 52% and 64% by the end of years one and two, respectively. Cross-validation demonstrated the model successfully predicted 63% of at-risk students by the end of the fifth month of medical school. CONCLUSIONS The model identified students at risk for lower performance on Step 1 using the NBME CAS. This model is applicable to schools reforming their curriculum delivery and assessment programs toward an integrated model.
Affiliation(s)
- Judith M Brenner
- J.M. Brenner is associate dean of curricular integration and assessment, Hofstra Northwell School of Medicine, Hempstead, New York. J.B. Bird is assessment and evaluation analyst, Hofstra Northwell School of Medicine, Hempstead, New York. J.M. Willey is professor and chair, Department of Science Education, Hofstra Northwell School of Medicine, Hempstead, New York.
16
Dimmock JR, Brenner JM, Phillips OA. Evaluation of some benzenesulphonylhydrazones of aryl aldehydes and ketones as antiepileptic agents. Pharmazie 1987; 42:376-8. [PMID: 3671456]
Abstract
Benzaldehyde benzenesulphonamide was shown to have anticonvulsant activity comparable to valproic acid in the maximal electroshock seizure test. Molecular modification of this compound was undertaken principally to explore whether a correlation between planarity and anticonvulsant activity could be discerned. The interplanar angle (theta) between an aryl ring and the adjacent azomethine group of some representative compounds was measured by electronic absorption spectroscopy. However, the structural alterations of benzaldehyde benzenesulphonamide undertaken in this work led to compounds bereft of anticonvulsant activity.
Collapse
Affiliation(s)
- J R Dimmock
- College of Pharmacy, University of Saskatchewan, Saskatoon, Canada