1
Parasram M, Loiseau SY, Yoo AS, Stone JB, Ch'ang JH, Robbins MS. Curriculum Innovation: A Resident-Created Multiple-Choice Question of the Week to Augment Case-Based Learning. Neurology: Education 2024; 3:e200119. [PMID: 39360150 PMCID: PMC11441749 DOI: 10.1212/ne9.0000000000200119]
Abstract
Introduction and Problem Statement: Morning report (MR) has been a foundation of learning in many neurology residency programs. However, the high-yield learning points of MR cases may be reinforced with supplementary educational initiatives that promote effective long-term retention and test-enhanced learning. Objectives: During the 2020-2021 academic year, chief residents of our neurology training program sought to implement neurology certification board-style multiple-choice questions (MCQs) based on cases presented at MR to enhance case-based learning. Methods and Curriculum Description: A chief resident was selected weekly to write an MCQ based on an instructive case presented in MR during the prior week. The National Board of Medical Examiners item-writing guide and online tutorial were used as guidelines for constructing MCQs. Each MCQ featured a clinical vignette in the question stem, and images were added to augment select cases. The MCQs were distributed using Qualtrics, which generated a web link and tracked anonymous answers. The Qualtrics link was added to the departmental weekly newsletter and labeled the question of the week (QOW). Detailed explanations for each QOW were provided. A feedback survey was sent to the departmental education committee after study completion. Results and Assessment of Data: Forty MCQs were written by the chief residents, and 1 question was distributed weekly in the departmental newsletter. After week 24, the QOW was restructured to enhance visibility. The mean number of residents who completed the MCQ was 13 (of 29 neurology residents; range 4-29). The overall median response rate was 38%. When stratified into weeks 1-24 and weeks 25-40 to account for the QOW reformatting, the median response rates were 24% and 55%, respectively (p = 0.0013). In a poststudy survey sent to the education committee, 90% of respondents felt that resident-created MCQs were similar to board-style questions and added educational value to resident learning. Discussion and Lessons Learned: A chief resident QOW initiative was feasible and led to neurology resident academic engagement and enrichment, supplementing case-based learning through a test-enhanced learning approach. Resident participation increased significantly when the QOW was made more visible in the weekly emails compared with the hyperlink format.
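The abstract reports a significant difference in median weekly response rates between the two periods (24% vs 55%, p = 0.0013) but does not name the statistical test used. As a minimal sketch only, a nonparametric comparison of per-week response rates, such as a Mann-Whitney U test, is one plausible approach; the weekly respondent counts below are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch only: compares per-week QOW response rates between two
# periods with a Mann-Whitney U test. The test choice and the counts are
# assumptions; the abstract reports only the medians and the p-value.
from scipy.stats import mannwhitneyu

TOTAL_RESIDENTS = 29  # denominator reported in the abstract

# Hypothetical respondent counts per week for weeks 1-24 and weeks 25-40.
weeks_1_24 = [7, 5, 9, 4, 8, 6, 7, 10, 5, 6, 8, 7, 6, 9, 5, 7, 8, 6, 7, 9, 5, 8, 6, 7]
weeks_25_40 = [16, 14, 18, 15, 17, 20, 13, 16, 19, 15, 17, 14, 18, 16, 15, 17]

rates_early = [n / TOTAL_RESIDENTS for n in weeks_1_24]
rates_late = [n / TOTAL_RESIDENTS for n in weeks_25_40]

stat, p_value = mannwhitneyu(rates_early, rates_late, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```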
Affiliation(s)
- Melvin Parasram
- Shamelia Y Loiseau
- Andrea S Yoo
- Jacqueline B Stone
- Judy H Ch'ang
- Matthew S Robbins
- From the Department of Neurology (M.P., S.Y.L., A.S.Y., J.H.C., M.S.R.), Weill Cornell Medicine, New York; and Department of Neurology (M.P., S.Y.L., A.S.Y., J.B.S.), Memorial Sloan Kettering Cancer Center, Manhattan, NY
2
Al Ameer AY. Assessment of the Quality of Multiple-Choice Questions in the Surgery Course for an Integrated Curriculum, University of Bisha College of Medicine, Saudi Arabia. Cureus 2023; 15:e50441. [PMID: 38222171 PMCID: PMC10785735 DOI: 10.7759/cureus.50441]
Abstract
INTRODUCTION: Multiple-choice questions (MCQs) have been recognized as reliable assessment tools, and incorporating clinical scenarios in MCQ stems has enhanced their effectiveness in evaluating knowledge and understanding. Item analysis is used to assess the reliability and consistency of MCQs, indicating their suitability as an assessment tool. This study also aims to help ensure the competence of graduates in serving the community and to establish an examination bank for the surgery course. OBJECTIVE: This study aims to assess the quality and acceptability of MCQs in the surgery course at the University of Bisha College of Medicine (UBCOM). METHODS: A psychometric study evaluated the quality of MCQs used in surgery examinations from 2019 to 2023 at UBCOM in Saudi Arabia. The MCQs/items were analyzed and categorized by their difficulty index (DIF), discrimination index (DI), and distractor efficiency (DE). Fifth-year MBBS students undergo a rotation in the department and are assessed at the end of 12 weeks; the assessment includes 60 MCQs/items and written items. Data were collected and analyzed using SPSS version 24. RESULTS: A total of 189 students were examined across five test sessions, with 300 MCQ items. Student scores ranged from 28.33% to 90.0%, with an average score of 64.6% ± 4.35. The 300 MCQ items had a total of 900 distractors. The DIF for the items was 75.3%, and 63.3% of the items showed good discrimination. No item had a negative point-biserial correlation. The mean number of functional distractors per test item was 2.19 ± 1.007, with 34% of the items having three functional distractors. CONCLUSION: The psychometric indices used to evaluate the MCQs in this study were encouraging, with acceptable DIF, distractor efficiencies, and item reliability. Providing robust faculty training and capacity building is recommended to enhance item-development skills.
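The item-analysis indices named in this abstract (DIF, DI, DE) are standard psychometric metrics, but the paper's formulas are not reproduced here, so the sketch below relies on the conventional definitions as assumptions: DIF as the proportion of examinees answering the item correctly, DI as the difference in proportion correct between the top and bottom 27% of total scorers, and a distractor counted as functional when chosen by at least 5% of examinees.

```python
# Minimal sketch of conventional item-analysis indices for one MCQ item.
# The 27% split and the 5% functional-distractor threshold are common
# rules of thumb, assumed here rather than taken from the cited paper.
import numpy as np

def item_analysis(responses, key, total_scores, options=("A", "B", "C", "D")):
    """responses: chosen option per examinee for ONE item; key: correct option;
    total_scores: each examinee's total test score (same order as responses)."""
    responses = np.asarray(responses)
    n = len(responses)

    dif = np.mean(responses == key)                    # difficulty index (proportion correct)

    order = np.argsort(total_scores)                   # examinees ranked by total score
    k = max(1, int(round(0.27 * n)))
    low, high = order[:k], order[-k:]
    di = np.mean(responses[high] == key) - np.mean(responses[low] == key)  # discrimination index

    distractors = [o for o in options if o != key]
    functional = sum(np.mean(responses == d) >= 0.05 for d in distractors)
    de = functional / len(distractors)                 # distractor efficiency
    return dif, di, de
```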
Affiliation(s)
- Ahmed Y Al Ameer
- Department of Surgery, College of Medicine, University of Bisha, Bisha, SAU
3
Smith EB, Gellatly M, Schwartz CJ, Jordan S. Training Radiology Residents, Bloom Style. Acad Radiol 2021; 28:1626-1630. [PMID: 32921568 DOI: 10.1016/j.acra.2020.08.013]
Abstract
Bloom's Taxonomy, an integral component of learning theory since its inception, describes cognitive skill levels in increasing complexity (Remember, Understand, Apply, Analyze, Evaluate, and Create). Considering Bloom's Taxonomy when writing learning objectives and lecture material, teaching residents at the workstation, and creating multiple choice questions can increase an educator's effectiveness. Incorporating higher Bloom levels helps cultivate the critical thinking skills vital to image interpretation and patient care, and becomes increasingly important as the radiologist's role evolves with the continued development of artificial intelligence. Following established tenets of multiple choice question writing, involving trainees in the question-writing process, and incorporating audience response systems into lectures are all strategies through which higher Bloom-level skills can be exercised.
Affiliation(s)
- Elana B Smith
- R. Adams Cowley Shock Trauma Center, University of Maryland Medical Center, Department of Radiology, 22 S. Greene St., Baltimore, MD 21201
- Matthew Gellatly
- University of North Carolina School of Medicine, Chapel Hill, North Carolina
- Cody J Schwartz
- University of North Carolina School of Medicine, Department of Radiology, Chapel Hill, North Carolina
- Sheryl Jordan
- University of North Carolina School of Medicine, Department of Radiology, Chapel Hill, North Carolina
4
A simple eye model for objectively assessing the competency of direct ophthalmoscopy. Eye (Lond) 2021; 36:1789-1794. [PMID: 34373614 PMCID: PMC8351584 DOI: 10.1038/s41433-021-01730-8]
Abstract
Background: Direct ophthalmoscopy is an important investigative technique not only for ophthalmologists but also for general practitioners and other specialists. The purpose of this study was to develop a simple and robust eye model for effective and objective assessment of ophthalmoscopic competency. Methods: A series of eye models were assembled using commonly available materials, including 26-mm-diameter double-hemispherical brown plastic balls and convex lenses. A 6-mm circular opening was drilled in one hemisphere as a pupil, behind which the lens was glued to provide the refractive component. Ten letters were placed on the inner surface of the other hemisphere. Ophthalmoscopic skills of ophthalmology residents were first assessed subjectively by two tutors using a checklist and then objectively using the eye models. The discrimination index was calculated to evaluate the effectiveness of each assessment. Finally, a feedback questionnaire was completed. Results: In total, 76 residents were recruited. The checklist score was 9.25 ± 0.47, with a discrimination index of 0.11. The model-assessment score was 4.24 ± 3.10, with a discrimination index of 0.79. There was no correlation between the checklist scores and model scores (r = 0.133, P = 0.251). Two-thirds of the participants agreed or strongly agreed that the model assessment could reflect the ability to visualize the fundus. Conclusions: We have developed simple eye models to assess the competency of ophthalmoscopy, with excellent discriminatory power to differentiate competence levels of ophthalmology residents.
5
Salih KEMA, Jibo A, Ishaq M, Khan S, Mohammed OA, Al-Shahrani AM, Abbas M. Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia. J Family Med Prim Care 2020; 9:3663-3668. [PMID: 33102347 PMCID: PMC7567208 DOI: 10.4103/jfmpc.jfmpc_358_20]
Abstract
Background and Aims: Worldwide, medical education and assessment of medical students are evolving. Psychometric analysis of the adopted assessment methods is thus necessary for an efficient, reliable, valid, and evidence-based approach to student assessment. The objective of this study was to determine the pattern of psychometric analysis of our courses conducted in the academic year 2018-2019, in an innovative curriculum. Methods: This was a cross-sectional study involving review of examination items over one academic session (2018-2019). Item analyses for all courses completed within the three phases of the year were analyzed using SPSS V20 statistical software. Results: There were 24 courses conducted during the academic year 2018-2019, across the three academic phases. There were 1073 examination items in total, with 3219 distractors, in a one-best-answer, four-option multiple choice question (MCQ) format. The item analysis showed that the mean difficulty index (DIF I) was 79.1 ± 3.3. Items with good discrimination had a mean of 65 ± 11.2, and distractor efficiency was 80.9%. The reliability index (KR-20) across all exams in the three phases was 0.75. There was a significant difference within the examination item blocks (F = 12.31, F critical = 3.33, P < 0.05) across all the phases of the courses taken by the students. Similarly, significant differences existed among the three phases of the courses taken (F ratio = 12.44, F critical = 4.10, P < 0.05). Conclusion: The psychometric analysis showed that the quality of examination questions was valid and reliable. Although differences in item quality were observed between different phases of study as well as within courses of study, quality generally remained consistent throughout the session. Channeling more effort toward improving item quality in the future is recommended.
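The reliability index reported here (KR-20 = 0.75) refers to the Kuder-Richardson formula 20. The abstract does not show the computation, so the sketch below assumes the textbook formula and a hypothetical examinee-by-item matrix of dichotomous (0/1) scores.

```python
# Minimal sketch of the Kuder-Richardson 20 (KR-20) reliability coefficient.
# The formula is the standard one; the input matrix is hypothetical, not the
# cited study's data.
import numpy as np

def kr20(scores):
    """scores: 2-D array, rows = examinees, columns = items, values 0/1."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    p = scores.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinees' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)
```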
Affiliation(s)
- Karim Eldin M A Salih
- Department of Pediatrics, College of Medicine, University of Bisha, Saudi Arabia; Medical Education, College of Medicine, University of Bisha, Saudi Arabia
- Abubakar Jibo
- Family and Community Medicine, College of Medicine, University of Bisha, Saudi Arabia
- Masoud Ishaq
- Medical Education, College of Medicine, University of Bisha, Saudi Arabia
- Sameer Khan
- Physiology, College of Medicine, University of Bisha, Saudi Arabia
- Osama A Mohammed
- Pharmacology, College of Medicine, University of Bisha, Saudi Arabia
- Mohammed Abbas
- Department of Pediatrics, College of Medicine, University of Bisha, Saudi Arabia; Medical Education, College of Medicine, University of Bisha, Saudi Arabia