1
Dell KA, DeVries DM. Effect of changing multiple choice questions from "all of the above" to "select all that apply". Currents in Pharmacy Teaching & Learning 2024;16:174-177. [PMID: 38218657] [DOI: 10.1016/j.cptl.2023.12.034]
Abstract
INTRODUCTION The purpose of this study was to describe the effect on question performance of converting multiple choice questions (MCQs) that include an "all of the above" (AOTA) answer option to a "select all that apply" (SATA) question type. METHODS A summative assessment at the end of the first professional pharmacy year comprised approximately 50 multiple choice questions covering material from all courses taught. Eight questions contained AOTA answer options and were converted to SATA items in the subsequent year by eliminating the AOTA option and including the words "select all that apply" in the stem. The majority of the other questions included on the exam remained the same between the two years. Item difficulty, item discrimination, point biserial, and distractor efficiency were used to compare the MCQs on exams in the two years. RESULTS The AOTA questions were significantly easier and less discriminating than the SATA items. The performance of the remaining questions on the exam did not differ between the years. The distractor efficiency increased significantly when the questions were converted to SATA items. CONCLUSIONS MCQs with AOTA answer options are discouraged because their poor construction results in poor discrimination between high- and low-performing students. AOTA questions are easily converted to the SATA format. The conversion yields a more difficult, more discriminating question in which every answer option attracts responses, preventing students from easily guessing the correct answer.
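The item statistics this abstract compares (difficulty, discrimination, point biserial, distractor efficiency) follow standard textbook definitions. A minimal sketch, with invented response data; the 27% upper/lower split and the 5% functional-distractor threshold are common conventions assumed here, not values taken from the paper:

```python
from statistics import mean, pstdev

def item_difficulty(scores):
    """Proportion of examinees answering the item correctly (0/1 scores)."""
    return sum(scores) / len(scores)

def item_discrimination(scores, totals, frac=0.27):
    """Difficulty gap between top and bottom groups ranked by total score.

    The 27% group size is a common convention, not taken from the paper.
    """
    k = max(1, round(len(scores) * frac))
    ranked = [s for _, s in sorted(zip(totals, scores))]
    return mean(ranked[-k:]) - mean(ranked[:k])

def point_biserial(scores, totals):
    """Correlation between item correctness (0/1) and total exam score."""
    m1 = mean(t for s, t in zip(scores, totals) if s == 1)
    m0 = mean(t for s, t in zip(scores, totals) if s == 0)
    p = item_difficulty(scores)
    return (m1 - m0) / pstdev(totals) * (p * (1 - p)) ** 0.5

def distractor_efficiency(choice_counts, key, threshold=0.05):
    """Fraction of distractors chosen by at least `threshold` of examinees."""
    n = sum(choice_counts.values())
    distractors = [c for opt, c in choice_counts.items() if opt != key]
    return sum(1 for c in distractors if c / n >= threshold) / len(distractors)
```

With real exam data, a difficulty near 1.0 combined with discrimination near 0 would flag the easy, poorly discriminating AOTA items the study describes; a rise in distractor efficiency after conversion to SATA means more answer options attract responses.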
Affiliation(s)
- Kamila A Dell
- Department of Pharmacotherapeutics and Clinical Research, University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd., MDC 30, Tampa, FL 33612, United States.
- Davina M DeVries
- Learning and Development Manager, University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd., MDC 30, Tampa, FL 33612, United States.
2
Gutiérrez-Velilla E, Pérez-Sánchez IN, Alvarado-de la Barrera C, Ávila-Ríos S, Caballero-Suárez NP. Assessing HIV knowledge in Mexican people living with HIV: development and validation of CC-VIH questionnaire. Health Promotion International 2023;38:daad164. [PMID: 38041806] [DOI: 10.1093/heapro/daad164]
Abstract
The level of knowledge that people living with human immunodeficiency virus (HIV) have about their disease can affect their adherence to treatment. The aim of this study was to develop a tool to assess knowledge about HIV among people receiving treatment at a specialized clinic in Mexico City. To establish content validity, expert judges were invited to conceptualize the tool and propose items for the defined dimensions. A total of 490 individuals living with HIV completed the 91-item questionnaire; 82.2% were male, and the mean age was 36.1 years. We conducted an exploratory factor analysis, resulting in a reduced questionnaire of 45 questions. A three-factor solution explained 36.2% of the variance in HIV knowledge. The total scale had a reliability coefficient of 0.937, and the subscales had reliabilities of 0.828, 0.856 and 0.859. Lower educational level (F(336) = 8.488, p < 0.001) and female gender (t(399) = 2.003, p = 0.046) were associated with lower scores on the HIV knowledge questionnaire. This tool appears suitable for measuring HIV knowledge in people living with HIV, although future studies are required to confirm its structure and reduce its length.
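The reliability coefficients this abstract reports are of the kind usually computed as Cronbach's alpha. A minimal sketch of that standard formula, with an invented response matrix; the abstract does not specify the coefficient used, so this is an assumption:

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Cronbach's alpha; rows is a list of per-respondent item-score lists.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(rows[0])
    item_vars = [pvariance([r[i] for r in rows]) for i in range(k)]
    total_var = pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Perfectly consistent items yield alpha = 1.0, while items that vary independently of the total push alpha toward 0; the 0.8-0.9 values reported above fall in the range conventionally read as good subscale reliability.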
Affiliation(s)
- Ester Gutiérrez-Velilla
- Centro de Investigación en Enfermedades Infecciosas (CIENI), Instituto Nacional de Enfermedades Respiratorias (INER), Calzada de Tlalpan 4502, Sección XVI, Tlalpan, Mexico City 14080, Mexico
- Ivonne Nalliely Pérez-Sánchez
- Centro de Investigación en Enfermedades Infecciosas (CIENI), Instituto Nacional de Enfermedades Respiratorias (INER), Calzada de Tlalpan 4502, Sección XVI, Tlalpan, Mexico City 14080, Mexico
- Claudia Alvarado-de la Barrera
- Centro de Investigación en Enfermedades Infecciosas (CIENI), Instituto Nacional de Enfermedades Respiratorias (INER), Calzada de Tlalpan 4502, Sección XVI, Tlalpan, Mexico City 14080, Mexico
- Santiago Ávila-Ríos
- Centro de Investigación en Enfermedades Infecciosas (CIENI), Instituto Nacional de Enfermedades Respiratorias (INER), Calzada de Tlalpan 4502, Sección XVI, Tlalpan, Mexico City 14080, Mexico
- Nancy Patricia Caballero-Suárez
- Centro de Investigación en Enfermedades Infecciosas (CIENI), Instituto Nacional de Enfermedades Respiratorias (INER), Calzada de Tlalpan 4502, Sección XVI, Tlalpan, Mexico City 14080, Mexico
3
Adeosun SO. Differences in Multiple-Choice Questions of Opposite Stem Orientations Based on a Novel Item Quality Measure. American Journal of Pharmaceutical Education 2023;87:ajpe8934. [PMID: 35470171] [PMCID: PMC10159516] [DOI: 10.5688/ajpe8934]
Abstract
Objective. To determine whether there are differences in the performance and quality of multiple-choice items with opposite stem orientations (positive or negative), based on a novel item quality measure and conventional psychometric parameters. Methods. A retrospective study was conducted on multiple-choice assessment items used in years two and three of pharmacy school for pharmacotherapy and related courses administered between August 2018 and December 2019. Conventional psychometric parameters (difficulty and discrimination indices), average response time, nonfunctional distractor percentage, and a novel measure of item quality of negatively worded items were compared with those of control items, namely positively worded items (n=103 each). This novel measure uses difficulty and discrimination in tandem for the decision to reject, review, or retain items in an assessment. Statistical analyses were performed on continuous and categorical variables, on the relationship between difficulty and discrimination, and on differences in correlation coefficients between positively and negatively worded items. Results. Stem orientation was not significantly associated with the novel measure of item quality. Also, there were no significant differences between positively and negatively worded items in any of the psychometric parameters. There were significant, negative correlations between difficulty and discrimination indices in both groups, and the correlation coefficients were significantly stronger in positively versus negatively worded items. Conclusion. Items with opposite stem orientations show no differences in the novel item quality measure or in conventional measures of performance and quality, except in difficulty-discrimination relationships. This suggests that negatively worded items may be used when necessary, but cautiously.
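The abstract describes using difficulty and discrimination "in tandem" to reject, review, or retain items but does not state the rule itself. A common classroom heuristic of that shape looks like the sketch below; the cut-offs (0.30-0.90 difficulty, 0.20 discrimination) are assumptions, not the paper's measure:

```python
def triage_item(difficulty, discrimination):
    """Return 'retain', 'review', or 'reject' for one MCQ item.

    difficulty: proportion correct (0-1); discrimination: e.g. point biserial.
    The cut-offs below are illustrative conventions, not the paper's values.
    """
    if discrimination < 0:  # negative discrimination often signals a miskeyed item
        return "reject"
    if 0.30 <= difficulty <= 0.90 and discrimination >= 0.20:
        return "retain"
    return "review"
```

Using the two indices jointly, rather than either alone, avoids discarding a hard item that still discriminates well or keeping an easy item that does not.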
4
Roni MA, Berrocal Y, Tapping R. Improving Summative Assessment Through a Resource-Efficient Faculty Review Process. Medical Science Educator 2022;32:979-983. [PMID: 36276766] [PMCID: PMC9584026] [DOI: 10.1007/s40670-022-01631-9]
Abstract
Committee reviews improve the quality of multiple-choice question (MCQ) exams; however, such review processes are typically highly resource-intensive and time-consuming. We report a review process that requires limited faculty time and administrative resources. A small committee reviewed selected items (14-20%) of the final exams of six independent block courses taken by first- and second-year medical students. This process resulted in a significant increase in the item discrimination of reviewed questions in all exams. Our findings support the utility of a review process and may offer health profession educators a more practical and efficient approach for improving the quality of in-house MCQ exams. Supplementary Information The online version contains supplementary material available at 10.1007/s40670-022-01631-9.
Affiliation(s)
- Monzurul A. Roni
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
- Yerko Berrocal
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
- Richard Tapping
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
5
Richardson CL, Chapman S, White S. Experiencing a virtual patient to practice patient counselling skills. Currents in Pharmacy Teaching & Learning 2021;13:1593-1601. [PMID: 34895668] [DOI: 10.1016/j.cptl.2021.09.048]
Abstract
INTRODUCTION Virtual patients (VPs) are a safe and standardised method of simulating clinical environments, but few studies have explored health care professionals' experiences of learning via a VP. This study explored how users experienced and used a VP that aimed to teach the user to deliver non-vitamin K oral anticoagulant patient education. METHODS The study used semi-structured interviews with pharmacists and pre-registration trainees from a wider research study. Interview topics were based on key areas concerning VP use. Interviews were audio-recorded and transcribed verbatim before being analysed using the framework approach to thematic analysis. Ethical approval was granted by Keele University. RESULTS There was variation in the type and nature of use of the VP and in the reported learning, which included reinforcement of knowledge, an opportunity to promote reflection, and acquisition and application of knowledge to clinical, patient-facing interactions. The VP was seen as an adjunct to other education and training. The majority of users indicated that they used the VP more than once. Some users seemed to have gamified their learning, driven to achieve perfect feedback rather than truly engaging with the learning, whereas for others the learning appeared to be deep, with a reflective focus. CONCLUSIONS The VP offered educational value as experiential learning, although users experienced it differently; most commonly, the VP facilitated learning via reinforcement of pre-existing knowledge. Users reported that the VP had value as an adjunct to other education and training resources.
Affiliation(s)
- Stephen Chapman
- School of Pharmacy and Bioengineering, Keele University, ST5 5BG, UK.
- Simon White
- School of Pharmacy and Bioengineering, Keele University, ST5 5BG, UK.
6
Abstract
Multiple-choice tests are the most widely used method of assessment in medical education. However, there is limited literature in medical education and psychiatry to inform best practices in writing good-quality multiple-choice questions. Moreover, few physicians and psychiatrists have received training or have experience in writing them. This article highlights strategies for writing high-quality multiple-choice items and discusses some common flaws that can impact the validity and reliability of assessment examinations.
Affiliation(s)
- Vikas Gupta
- South Carolina Department of Mental Health, 2715 Colonial Drive, Suite 200-A, Colonial Drive, Columbia, SC 29201, USA
- Eric R Williams
- University of South Carolina School of Medicine, 6311 Garners Ferry Road, Suite 126, Columbia, SC 29209, USA.
- Roopma Wadhwa
- South Carolina Department of Mental Health, 2715 Colonial Drive, Suite 200-A, Colonial Drive, Columbia, SC 29201, USA
7
Reynolds QJ, Gilliland KO, Smith K, Walker JA, Beck Dallaghan GL. Differences in medical student performance on examinations: exploring score variance between Kolb's Learning Style Inventory classifications. BMC Medical Education 2020;20:423. [PMID: 33176776] [PMCID: PMC7661198] [DOI: 10.1186/s12909-020-02353-5]
Abstract
BACKGROUND Kolb's Cycle of Learning Theory acts as a foundational framework for the evolution of knowledge gained by learners throughout their education. Through Kolb's cycle of experiential learning, one's preferred way of learning could impact academic achievement in the pre-clinical years of medical education. METHODS The medical student classes of 2020 and 2021 at a public university in the southeastern U.S. were invited to complete Kolb's Learning Style Inventory (LSI). For those participants completing the LSI, examination results for their pre-clinical blocks were obtained and matched to the LSI results. Examination scores (locally-developed examinations and customized National Board of Medical Examiners (NBME) final examinations) were compared by LSI classification for each examination using the Kruskal-Wallis test. RESULTS Of 360 possible participants, 314 (87.2%) completed the Learning Style Inventory. Convergers and Assimilators made up 84.1% of the sample [Convergers (n = 177, 56.4%), Assimilators (n = 87, 27.7%)]; Accommodators (n = 25, 7.9%) and Divergers (n = 25, 7.9%) made up the remainder. Accommodators' scores were significantly lower on locally-developed examinations in Principles of Medicine, Hematology, and Gastrointestinal System. The only NBME examination that demonstrated a significant difference across learning styles was from the Cardiovascular block. CONCLUSIONS Our study indicated that the customized NBME examinations minimized the variance in performance across learning styles compared to locally-developed examinations. The lack of variance across learning styles for all but one NBME final examination appears to provide a more equitable assessment strategy.
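The Kruskal-Wallis test this study uses compares score distributions across the four learning-style groups without assuming normality. A minimal sketch of the H statistic with invented scores; the tie-correction term is omitted for brevity (library implementations such as scipy.stats.kruskal include it):

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction) for k samples."""
    pooled = sorted(x for g in groups for x in g)
    # Assign each distinct value the average rank over its run of ties.
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n = len(pooled)
    rank_term = sum(sum(ranks[x] for x in g) ** 2 / len(g) for g in groups)
    return 12 / (n * (n + 1)) * rank_term - 3 * (n + 1)
```

H near zero means the groups' score distributions are similar (the pattern the NBME examinations showed for all but the Cardiovascular block); a large H, referred to a chi-squared distribution with k-1 degrees of freedom, indicates a significant difference.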
Affiliation(s)
- Quentin J Reynolds
- Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Kurt O Gilliland
- Department of Cell Biology and Physiology, University of North Carolina School of Medicine at Chapel Hill, Chapel Hill, NC, USA
- Katie Smith
- Office of Medical Education, University of North Carolina School of Medicine at Chapel Hill, Chapel Hill, NC, USA
- Joshua A Walker
- University of North Carolina School of Medicine at Chapel Hill, 108 Taylor Hall, CB 7321, Chapel Hill, NC 27599, USA
- Gary L Beck Dallaghan
- Office of Medical Education, University of North Carolina School of Medicine at Chapel Hill, Chapel Hill, NC, USA.
8
Danh T, Desiderio T, Herrmann V, Lyons HM, Patrick F, Wantuch GA, Dell KA. Evaluating the quality of multiple-choice questions in a NAPLEX preparation book. Currents in Pharmacy Teaching & Learning 2020;12:1188-1193. [PMID: 32739055] [DOI: 10.1016/j.cptl.2020.05.006]
Abstract
INTRODUCTION There is a plethora of preparatory books and guides available to help study for the North American Pharmacist Licensure Examination (NAPLEX). However, the quality of the questions they include has not been scrutinized. Our objective was to evaluate the quality of multiple-choice question (MCQ) construction in a commonly used NAPLEX preparatory book. METHODS Five students and two faculty members reviewed MCQs from the RxPrep 2018 edition course book. Item structure and utilization of case-based questions were evaluated using best practices for item construction. The frequency of item writing flaws (IWF) and the utilization of cases for case-based questions were identified. RESULTS A total of 298 questions were reviewed. Twenty-seven (9.1%) questions met all best practices for item construction. Flawed questions contained an average of 2.53 IWF per MCQ. The most commonly identified best practice violations were answer choices of differing length and verb tense (21%) and question stems containing too little or too much of the information necessary to eliminate distractors (16.6%). Of the case-based questions, the majority (61.9%) did not require utilization of the provided case. CONCLUSIONS This pilot analysis identified that a majority of MCQs in one NAPLEX preparatory source contained IWF. These results align with previous evaluations of test banks in published books outside of pharmacy. Further evaluation of other preparatory materials, expanding on the findings from this pilot analysis, is needed to assess the pervasiveness of IWF in preparatory materials and the effect of flawed questions on the utility of study materials.
Affiliation(s)
- Tina Danh
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
- Tamara Desiderio
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
- Victoria Herrmann
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
- Heather M Lyons
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
- Frankie Patrick
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
- Gwendolyn A Wantuch
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States.
- Kamila A Dell
- University of South Florida Taneja College of Pharmacy, 12901 Bruce B. Downs Blvd, MDC 30, Tampa, FL 33612, United States
9
Dangprapai Y, Ngamskulrungroj P, Senawong S, Ungprasert P, Harun A. Development of a New Scoring System To Accurately Estimate Learning Outcome Achievements via Single, Best-Answer, Multiple-Choice Questions for Preclinical Students in a Medical Microbiology Course. Journal of Microbiology & Biology Education 2020;21:21.1.4. [PMID: 32148605] [PMCID: PMC7048397] [DOI: 10.1128/jmbe.v21i1.1773]
Abstract
During the preclinical years, single-best-answer multiple-choice questions (SBA-MCQs) are often used to test the higher-order cognitive processes of medical students (such as application and analysis) while simultaneously assessing lower-order processes (like knowledge and comprehension). Consequently, it can be difficult to pinpoint which learning outcome has been achieved or needs improvement. We developed a new scoring system for SBA-MCQs using a step-by-step methodology to evaluate each learning outcome independently. Enrolled in this study were third-year medical students (n = 316) who had registered in the basic microbiology course at the Faculty of Medicine, Siriraj Hospital, Mahidol University during the academic year 2017. A step-by-step SBA-MCQ with a new scoring system was created and used as a tool to evaluate the validity of the traditional SBA-MCQs that assess two separate outcomes simultaneously. The scores for the two methods, in percentages, were compared using two different questions (SBA-MCQ1 and SBA-MCQ2). SBA-MCQ1 tested the students' knowledge of the causative agent of a specific infectious disease and the basic characteristics of the microorganism, while SBA-MCQ2 tested their knowledge of the causative agent of a specific infectious disease and the pathogenic mechanism of the microorganism. The mean score obtained with the traditional SBA-MCQs was significantly lower than that obtained with the step-by-step SBA-MCQs (85.9% for the traditional approach versus 90.9% for step-by-step SBA-MCQ1; p < 0.001; and 81.5% for the traditional system versus 87.4% for step-by-step SBA-MCQ2; p < 0.001). Moreover, 65.8% and 87.8% of the students scored lower with the traditional SBA-MCQ1 and the traditional SBA-MCQ2, respectively, than with the corresponding sets of step-by-step SBA-MCQ questions. 
These results suggest that traditional SBA-MCQ scores need to be interpreted with caution because they have the potential to underestimate the learning achievement of students. Therefore, the step-by-step SBA-MCQ is preferable to the traditional SBA-MCQs and is recommended for use in examinations during the preclinical years.
Affiliation(s)
- Yodying Dangprapai
- Department of Physiology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand
- Popchai Ngamskulrungroj
- Department of Microbiology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand
- Sansnee Senawong
- Department of Immunology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand
- Patompong Ungprasert
- Clinical Epidemiology Unit, Department of Research and Development, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand
- Azian Harun
- Department of Medical Microbiology and Parasitology, School of Medical Sciences, Universiti Sains Malaysia, 16150 Kubang Kerian, Kelantan, Malaysia