1
Abu-Ghazalah RM, Dubins DN, Poon GMK. Dissecting knowledge, guessing, and blunder in multiple choice assessments. Applied Measurement in Education 2023; 36:80-98. [PMID: 37223404] [PMCID: PMC10201919] [DOI: 10.1080/08957347.2023.2172017]
Abstract
Multiple choice results are inherently probabilistic outcomes, as correct responses reflect a combination of knowledge and guessing, while incorrect responses additionally reflect blunder, a confidently committed mistake. To objectively resolve knowledge from responses in an MC test structure, we evaluated probabilistic models that explicitly account for guessing, knowledge and blunder using eight assessments (>9,000 responses) from an undergraduate biotechnology curriculum. A Bayesian implementation of the models, aimed at assessing their robustness to prior beliefs in examinee knowledge, showed that explicit estimators of knowledge are markedly sensitive to prior beliefs with scores as sole input. To overcome this limitation, we examined self-ranked confidence as a proxy knowledge indicator. For our test set, three levels of confidence resolved test performance. Responses rated as least confident were correct more frequently than expected from random selection, reflecting partial knowledge, but were balanced by blunder among the most confident responses. By translating evidence-based guessing and blunder rates to pass marks that statistically qualify a desired level of examinee knowledge, our approach finds practical utility in test analysis and design.
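The closing idea above, translating guessing and blunder rates into pass marks, can be illustrated with a toy scoring model. This is a hedged sketch under an assumed functional form, not the authors' published model: take k as the fraction of items an examinee truly knows, b as the blunder rate on known items, and g as the chance of guessing an unknown item correctly.

```python
# Illustrative model (an assumption for this sketch, not the paper's
# exact formulation): an examinee knows a fraction k of the items,
# blunders on a fraction b of those, and guesses the rest at rate g.
def expected_score(k: float, g: float, b: float) -> float:
    """Expected proportion correct under the assumed model."""
    return k * (1.0 - b) + (1.0 - k) * g

def pass_mark(k_target: float, g: float, b: float) -> float:
    """Score cutoff corresponding to a desired knowledge level k_target,
    given evidence-based estimates of guess rate g and blunder rate b."""
    return expected_score(k_target, g, b)

# Pure random guessing on 4-option items (g = 0.25), no blunder:
# certifying 60% knowledge requires a cutoff of 0.6 + 0.4 * 0.25 = 0.70.
cutoff = pass_mark(0.60, 0.25, 0.0)
```

Raising the blunder rate lowers the expected score of knowledgeable examinees, so the same knowledge target maps to a lower cutoff; the paper's Bayesian treatment estimates the guessing and blunder rates from response data rather than fixing them.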
Affiliation(s)
- Rashid M. Abu-Ghazalah
- W. Booth School of Engineering Practice and Technology, Faculty of Engineering, McMaster University, Hamilton, Ontario, Canada
- David N. Dubins
- Leslie Dan Faculty of Pharmacy, University of Toronto, Toronto, Ontario, Canada
- Gregory M. K. Poon
- Departments of Chemistry and Nutrition, Georgia State University, Atlanta, USA
2
Roni MA, Berrocal Y, Tapping R. Improving Summative Assessment Through a Resource-Efficient Faculty Review Process. Medical Science Educator 2022; 32:979-983. [PMID: 36276766] [PMCID: PMC9584026] [DOI: 10.1007/s40670-022-01631-9]
Abstract
Committee reviews improve the quality of multiple-choice question (MCQ) exams; however, such review processes are typically highly resource-intensive and time-consuming. We report a review process that requires limited faculty time and administrative resources. A small committee reviewed selected items (14-20%) of the final exams of six independent block courses taken by first- and second-year medical students. This process resulted in a significant increase in the item discrimination of reviewed questions in all exams. Our findings support the utility of a review process and may offer health profession educators a more practical and efficient approach for improving the quality of in-house MCQ exams. Supplementary Information: The online version contains supplementary material available at 10.1007/s40670-022-01631-9.
Affiliation(s)
- Monzurul A. Roni
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
- Yerko Berrocal
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
- Richard Tapping
- Department of Health Sciences Education and Pathology, University of Illinois College of Medicine, Peoria, IL 61605 USA
3
Abstract
Multiple-choice tests are the most widely used method of assessment in medical education. However, the literature in medical education and psychiatry on best practices for writing good-quality multiple-choice questions is limited, and few physicians and psychiatrists have received training or have experience in writing them. This article highlights strategies for writing high-quality multiple-choice items and discusses common flaws that can undermine the validity and reliability of assessment examinations.
Affiliation(s)
- Vikas Gupta
- South Carolina Department of Mental Health, 2715 Colonial Drive, Suite 200-A, Columbia, SC 29201, USA
- Eric R Williams
- University of South Carolina School of Medicine, 6311 Garners Ferry Road, Suite 126, Columbia, SC 29209, USA
- Roopma Wadhwa
- South Carolina Department of Mental Health, 2715 Colonial Drive, Suite 200-A, Columbia, SC 29201, USA
4
Arooj M, Mukhtar K, Khan RA, Azhar T. Assessing the educational impact of cognitive level of MCQ and SEQ on learning approaches of dental students. Pak J Med Sci 2021; 37:445-449. [PMID: 33679929] [PMCID: PMC7931318] [DOI: 10.12669/pjms.37.2.3475]
Abstract
Objectives: MCQs and SEQs are the most widely used assessment tools in dental colleges across Pakistan. This study explores the impact of these assessment tools on the learning approaches of dental students and identifies correlations between the tools and deep and surface learning approaches in integrated and discipline-based curricula. Methods: A quantitative correlational study was conducted in 2018 on 2nd- and 4th-year dental students using the pre-validated Revised Study Process Questionnaire. Spearman's rho correlation coefficient and the Wilcoxon signed-ranks test were applied to determine the relationship between learning approaches and assessment tools; internal consistency was calculated with Cronbach's alpha. Results: Ninety-six of one hundred and fifty students completed the questionnaire. The surface approach correlated significantly with MCQs (0.73) but not with SEQs (-0.14), while the deep approach correlated strongly and significantly with SEQs (0.80) but not with MCQs (0.056). Conclusion: The assessment tool has an impact on the learning approaches students use. Students preferred a deep learning approach while preparing for SEQs, which were designed at a higher cognitive level, and a surface approach while preparing for MCQs, which were developed at a lower cognitive order.
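The study's internal-consistency measure, Cronbach's alpha, is computable directly from an item-by-examinee score matrix. A minimal sketch (the function name and data layout are illustrative assumptions, not taken from the paper):

```python
from statistics import pvariance

def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """Cronbach's alpha: item_scores[i][j] is the score of examinee j
    on item i. alpha = k/(k-1) * (1 - sum(item variances) / total variance).
    """
    k = len(item_scores)  # number of items
    sum_item_vars = sum(pvariance(item) for item in item_scores)
    totals = [sum(col) for col in zip(*item_scores)]  # per-examinee totals
    return (k / (k - 1)) * (1.0 - sum_item_vars / pvariance(totals))
```

Items that rise and fall together across examinees give alpha near 1; population variance is used throughout so the ratio stays scale-consistent.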
Affiliation(s)
- Mahwish Arooj
- Mahwish Arooj, MBBS, MME, M. Phil, PHD Physiology. Professor of Physiology and Director, DME University College of Medicine and Dentistry, Lahore, Pakistan
- Khadijah Mukhtar
- Khadijah Mukhtar, BDS, MME, Assistant Professor, DME, University College of Medicine and Dentistry, Lahore, Pakistan
- Rehan Ahmed Khan
- Rehan Ahmed Khan, MBBS, FCPS, FRCS, MHPE, Professor of Surgery, Assistant Dean Medical Education, Riphah International University
- Tayyaba Azhar
- Tayyaba Azhar, MBBS, MME, Assistant Professor, DME, University College of Medicine and Dentistry, Lahore, Pakistan
5
Shigli K, Nayak SS, Gali S, Sankeshwari B, Fulari D, Shyam Kishore K, Upadhya P N, Jirge V. Are Multiple Choice Questions for Post Graduate Dental Entrance Examinations Spot On? Item Analysis of MCQs in Prosthodontics in India. J Natl Med Assoc 2017; 110:455-458. [PMID: 30129514] [DOI: 10.1016/j.jnma.2017.11.001]
Abstract
BACKGROUND Constructing appropriate test items is a challenge in preparing quality multiple choice questions. Item analysis provides valuable feedback on the validity of multiple choice questions. The present study evaluated the difficulty index, discrimination index and distracter efficiency of multiple choice questions from post graduate dental entrance examinations. METHODS A list of 20 MCQs on an introductory topic was taken from entrance exam MCQ books and administered to 104 undergraduate students. RESULTS 15% of the MCQs, related to the impression-making procedure, were difficult, with a difficulty index (p) below 30%; 15% were poor discriminators; and 55% had at least one non-functional distracter. CONCLUSION Item analysis of MCQs from post graduate entrance examinations demonstrated low difficulty index, discrimination index and distracter efficiency. We therefore propose a strong need for faculty training in test construction and post-validation of items.
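The three indices reported here are standard classical test theory quantities and can be computed directly from examinee responses. A minimal sketch (function names are mine; the top/bottom-27% grouping and the 5% distracter threshold are common conventions assumed here, not details taken from the paper):

```python
def difficulty_index(correct_flags: list[int]) -> float:
    """Proportion of examinees answering the item correctly (p)."""
    return sum(correct_flags) / len(correct_flags)

def discrimination_index(correct_flags: list[int], totals: list[float],
                         frac: float = 0.27) -> float:
    """Upper-group minus lower-group proportion correct, using the
    conventional top and bottom 27% of examinees by total test score."""
    n = max(1, round(frac * len(totals)))
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    lower, upper = order[:n], order[-n:]
    p_upper = sum(correct_flags[i] for i in upper) / n
    p_lower = sum(correct_flags[i] for i in lower) / n
    return p_upper - p_lower

def nonfunctional_distracters(choice_counts: dict[str, int],
                              key: str, threshold: float = 0.05) -> list[str]:
    """Distracters selected by fewer than 5% of examinees."""
    total = sum(choice_counts.values())
    return [opt for opt, c in choice_counts.items()
            if opt != key and c / total < threshold]
```

An item everyone in the top group answers correctly and everyone in the bottom group misses gets a discrimination index of 1.0; a distracter drawing under 5% of responses is flagged as non-functional.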
Affiliation(s)
- Kamal Shigli
- Department of Prosthodontics, D.Y. Patil Dental School, Lohegaon, Pune, Maharashtra, India
- Sivaranjani Gali
- Department of Prosthodontics, Faculty of Dental Sciences, Ramaiah University of Applied Sciences, Bengaluru, Karnataka, India
- Banashree Sankeshwari
- Department of Prosthodontics, Bharati Vidyapeeth Deemed University Dental College and Hospital, Sangli, Maharashtra, India
- Deepti Fulari
- Department of Prosthodontics, Bharati Vidyapeeth Deemed University Dental College and Hospital, Sangli, Maharashtra, India
- K Shyam Kishore
- Department of Anatomy, Seth G. S. Medical College and K. E. M. Hospital, Parel, Mumbai, India
- Nagaraja Upadhya P
- Department of Dental Materials, Manipal College of Dental Sciences, Manipal, Karnataka, India
- Vasanti Jirge
- Department of Oral Medicine and Radiology, KLE VK Institute of Dental Sciences, Belgaum, Karnataka, India
6
Abstract
Objective To analyze the psychometric indices of Anatomy question items in a modular system of assessment. Methods A quantitative study was conducted to determine the quality of MCQs and to analyze the performance of 100 first-year MBBS students. Each module covers different subjects of the MBBS curriculum, but psychometric analysis was performed on the subject of Anatomy only. The assessment results of three modules were examined by item analysis, with mean differences between modules tested using ANOVA and post hoc comparisons made with the Tukey HSD test. Results A total of 140 one-best (OB) Anatomy MCQ items were evaluated for difficulty index, discrimination index and reliability. The difficulty index was higher in module I than in modules II and III, the discrimination index was comparatively higher in module II, and the reliability of module III was significantly higher than that of the other modules. Results were considered significant at p ≤ 0.05. Conclusions The psychometric analysis of Anatomy MCQs showed average difficulty, good discrimination and good reliability.
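The between-module comparison described above rests on a one-way ANOVA. As a sketch of the arithmetic only (the grouping and numbers below are hypothetical example data, not the study's results):

```python
from statistics import mean

def one_way_anova_f(groups: list[list[float]]) -> float:
    """F statistic for a one-way ANOVA: between-group mean square over
    within-group mean square, e.g. for item difficulty indices grouped
    by module."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                  # number of groups (modules)
    n = sum(len(g) for g in groups)  # total number of items
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F, referred to the F(k-1, n-k) distribution, is what licenses the post hoc Tukey HSD comparisons the study then applies.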
Affiliation(s)
- Zia Ul Islam
- Prof. Dr. Zia ul Islam, M.Phil. Prof. and Head, Department of Anatomy, Liaquat National Hospital and Medical College, Karachi, Pakistan
- Ambreen Usmani
- Prof. Dr. Ambreen Usmani, M.Phil, MCPS (HPE), PGD Bioethics, PhD Anatomy, Prof. and Head, Department of Anatomy, Bahria University Medical and Dental College, Karachi, Pakistan
7
Dell KA, Wantuch GA. How-to-guide for writing multiple choice questions for the pharmacy instructor. Currents in Pharmacy Teaching & Learning 2017; 9:137-144. [PMID: 29180146] [DOI: 10.1016/j.cptl.2016.08.036]
Abstract
BACKGROUND Writing multiple choice questions (MCQs) takes considerable practice, and pharmacy practitioners often lack the training to write effective MCQs. Sources of instruction in effective MCQ writing can be overwhelming, with numerous suggestions of what should and should not be done. PURPOSE The following guide is intended as a succinct reference for the creation and revision of MCQs by both novice and seasoned pharmacy faculty practitioners. METHODS The literature is summarized into 12 best practices for writing effective MCQs, with pharmacy-specific examples that demonstrate violations of the best practices and how they can be corrected. IMPLICATIONS The guide can serve as a primer for writing new MCQs, a reference for revising previously created questions, or a guide to peer review of MCQs.
Affiliation(s)
- Kamila A Dell
- College of Pharmacy, University of South Florida, Tampa, FL; College of Medicine, University of South Florida, Tampa, FL
- Gwendolyn A Wantuch
- College of Pharmacy, University of South Florida, Tampa, FL; College of Medicine, University of South Florida, Tampa, FL
8
Abdulghani HM, Ahmad F, Irshad M, Khalil MS, Al-Shaikh GK, Syed S, Aldrees AA, Alrowais N, Haque S. Faculty development programs improve the quality of Multiple Choice Questions items' writing. Sci Rep 2015; 5:9556. [PMID: 25828516] [PMCID: PMC4381327] [DOI: 10.1038/srep09556]
Abstract
The aim of this study was to assess the utility of long-term faculty development programs (FDPs) in improving the quality of multiple choice question (MCQ) item writing. This quasi-experimental study was conducted with newly joined faculty members. MCQ items from test courses of the respiratory, cardiovascular and renal blocks were analyzed for difficulty index, discrimination index, reliability, Bloom's cognitive levels, item-writing flaws (IWFs) and nonfunctioning distractors (NFDs). Significant improvement was found in difficulty index values from pre- to post-training (p = 0.003), and MCQs with moderate difficulty and higher discrimination were more frequent in the post-training tests in all three courses; easy questions decreased from 36.7% to 22.5%. Discrimination indices also improved from 92.1% to 95.4% after training (p = 0.132). More items at higher cognitive levels of Bloom's taxonomy were found in the post-training tests (p < 0.0001), and NFDs and IWFs were less frequent in post-training items (p < 0.02). MCQs written by faculty who have not participated in FDPs are usually of low quality; this study suggests that newly joined faculty need active participation in FDPs, as these programs support improvement in the quality of MCQ item writing.
Affiliation(s)
- Farah Ahmad
- Department of Medical Education, King Saud University, Riyadh-11321, Saudi Arabia
- Mohammad Irshad
- Department of Medical Education, King Saud University, Riyadh-11321, Saudi Arabia
- Mahmoud Salah Khalil
- Department of Medical Education, King Saud University, Riyadh-11321, Saudi Arabia
- Sadiqa Syed
- Department of Basic Sciences, The Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia
- Norah Alrowais
- Department of Family & Community Medicine, King Saud University, Riyadh-11321, Saudi Arabia
- Shafiul Haque
- Research and Scientific Studies Unit, College of Nursing and Allied Health Sciences, Jazan University, Jazan-45142, Saudi Arabia