1. Eldakhakhny B, Elsamanoudy AZ. Discrimination Power of Short Essay Questions Versus Multiple Choice Questions as an Assessment Tool in Clinical Biochemistry. Cureus 2023;15:e35427. PMID: 36987482; PMCID: PMC10040235; DOI: 10.7759/cureus.35427.
Abstract
Assessment is fundamental to the educational process. Multiple choice questions (MCQs) and short essay questions (SEQs) are the most widely used assessment methods in medical school. The current study evaluated the discriminating value of SEQs compared to MCQs as assessment tools in clinical biochemistry and correlated undergraduate students' SEQ scores with their overall scores during the academic years 2021-2022 and 2022-2023. This is a descriptive-analytical study in which MCQ and SEQ papers of clinical biochemistry were analyzed. The mean SEQ score was 66.7 ± 1.2 (SEM) for males and 64.0 ± 1.1 for females (p = 0.09); the mean MCQ score was 68.5 ± 0.9 for males and 72.6 ± 0.8 for females. When the difficulty index (DI) and discrimination factor (DF) of the questions were analyzed, MCQs had a mean DI of 0.70 ± 0.01 and a DF ranging from 0.05 to 0.6, whereas SEQs had a mean DI of 0.73 ± 0.03 and a DF of 0.68 ± 0.01; the difference between the DF of MCQs and SEQs was significant (p < 0.0001). Furthermore, there was a significant difference between SEQs and MCQs when students were categorized by score, except for A-scored students. According to the current study, SEQs have a higher discriminating ability than MCQs and help differentiate high-achieving from low-achieving students.
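The difficulty index and discrimination factor reported in this abstract are standard item-analysis statistics. A minimal sketch of one common way to compute them (using the upper/lower scoring-group method; the function name, data layout, and 27% group fraction are illustrative assumptions, not taken from the paper):

```python
def item_analysis(responses, frac=0.27):
    """Item analysis for one question.

    responses: list of (total_exam_score, item_correct) pairs,
    where item_correct is 1 if the student got this item right.
    Returns (difficulty_index, discrimination_factor).
    """
    # Rank students by total exam score, best first
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    n = max(1, int(len(ranked) * frac))
    upper, lower = ranked[:n], ranked[-n:]
    # Difficulty index: proportion of all students answering correctly
    di = sum(c for _, c in responses) / len(responses)
    # Discrimination factor: proportion correct in the top group
    # minus proportion correct in the bottom group
    df = (sum(c for _, c in upper) - sum(c for _, c in lower)) / n
    return di, df
```

For example, an item answered correctly only by the higher scorers yields a DF near 1, while an item everyone answers the same way yields a DF near 0, which matches the paper's reading of DF as discriminating ability.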
Affiliations:
- Basmah Eldakhakhny: Clinical Biochemistry, King Abdulaziz University Faculty of Medicine, Jeddah, Saudi Arabia
- Ayman Z Elsamanoudy: Clinical Biochemistry, King Abdulaziz University Faculty of Medicine, Jeddah, Saudi Arabia; Medical Biochemistry and Molecular Biology, Faculty of Medicine, Mansoura University, Mansoura, Egypt
2. Erturk S, van Tilburg WAP, Igou ER. Off the mark: Repetitive marking undermines essay evaluations due to boredom. Motivation and Emotion 2022. DOI: 10.1007/s11031-022-09929-2.
Abstract
Essay-style assessment is widespread in education. Nonetheless, research shows that this tool can suffer from low reliability and validity. We attribute this problem partly to the boredom that marking multiple essays causes. Specifically, we propose that boredom in markers is associated with systematically lower marks on essays. To test this, we asked participants (N = 100) with an undergraduate degree to mark essays. The majority of these participants had at least some experience with marking. After marking each essay, participants indicated how bored they were. We found an increase in boredom over time and that higher boredom was associated with lower marks. Furthermore, offering a marking rubric did not prevent this problematic impact of boredom. These findings have implications for the validity of essays as an assessment tool and raise concerns about repetitive marking practices in general.
3. Wilhelm J, Mattingly S, Gonzalez VH. Perceptions, satisfactions, and performance of undergraduate students during Covid-19 emergency remote teaching. Anatomical Sciences Education 2022;15:42-56. PMID: 34859608; PMCID: PMC9011711; DOI: 10.1002/ase.2161.
Abstract
Due to the Covid-19 pandemic, the education system worldwide faced sudden and unforeseen challenges. Many academic institutions closed their doors, forcing both educators and students to transition to Emergency Remote Teaching (ERT) for the remainder of the semester. This transition eliminated hands-on experiences, increased workload, and altered curricula. However, these aspects, as well as students' perceptions, study habits, and performance in response to ERT remain poorly documented. This contribution describes changes in the curriculum of an undergraduate cadaver-based laboratory, and explores students' performance, self-perceived learning, and overall satisfaction during this educational crisis. Online content delivery for this course included both asynchronous instruction and synchronous discussion sessions. While formative assessments remained the same, online spotter examinations included short answer, multiple choice, multiple answer, ordering, and true and false questions. Despite examination grades improving 20% during ERT, students reported lower levels of learning, confidence, and engagement with the course materials when compared to the face-to-face portion of the class. The most prevalent challenges identified by students were those related to the loss of access to cadaver-based learning, including difficulty identifying and visualizing structures in three dimensions, and the loss of context and sensorial cues. Flexibility in taking examinations and learning the material at their own pace were recognized as positive outcomes of the ERT transition. While the resulting student perceptions and performances are unsurprising, they offer insight into the challenges of fostering a productive learning environment in a future threatened by epidemic outbreak and economic uncertainty.
Affiliations:
- Jessica Wilhelm: Department of Ecology and Evolutionary Biology, College of Liberal Arts and Sciences, University of Kansas, Lawrence, Kansas, USA
- Spencer Mattingly: Department of Ecology and Evolutionary Biology, College of Liberal Arts and Sciences, University of Kansas, Lawrence, Kansas, USA
- Victor H. Gonzalez: Department of Ecology and Evolutionary Biology and Undergraduate Biology Program, College of Liberal Arts and Sciences, University of Kansas, Lawrence, Kansas, USA
4. Farooqui F, Saeed N, Aaraj S, Sami MA, Amir M. A Comparison Between Written Assessment Methods: Multiple-choice and Short Answer Questions in End-of-clerkship Examinations for Final Year Medical Students. Cureus 2018;10:e3773. PMID: 30820392; PMCID: PMC6389017; DOI: 10.7759/cureus.3773.
Abstract
Introduction: An important aspect of a modern academic curriculum is assessment, which can be clinical or written. Written assessment includes both multiple-choice questions (MCQs) and short answer questions (SAQs), and debate continues as to which is more reliable. It is important to assess the correlation between the two written formats, especially in the clinical subjects, which differ from the basic science subjects; moreover, data on this correlation in the clinical subjects are lacking. We therefore conducted this study to examine the correlation between MCQs and SAQs in end-of-clerkship examinations for final-year medical students. Materials and methods: This was a retrospective correlational analytical study conducted at Shifa Tameer-e-Millat University, Islamabad, from 2013 to 2017. The end-of-clerkship written assessment results of four disciplines (medicine, surgery, gynecology, and pediatrics) were included. Data were analyzed using IBM SPSS Statistics for Windows, version 23.0 (IBM Corp., Armonk, NY); means, standard deviations, Pearson coefficients, and p-values were calculated for both MCQs and SAQs. Results: A total of 481 students were involved in our study. The mean percentage scores of MCQs and SAQs were most similar in medicine and most disparate in obstetrics and gynecology. Standard deviations were wider for SAQs than for MCQs. Pearson correlations were 0.49, 0.47, 0.23, and 0.38 for medicine, surgery, gynecology, and pediatrics, respectively. Conclusion: While we found a mild to moderate significant correlation between MCQs and SAQs for final-year medical students, further investigation is required to explore this correlation and enhance the validity of our written assessments.
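The per-discipline Pearson correlations this study reports can be reproduced from paired percentage scores. A minimal sketch of the coefficient itself (the function name and example data are illustrative, not the study's):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two paired score lists,
    e.g. each student's MCQ percentage vs. SAQ percentage."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5
```

A value near 0.49 (as found for medicine) means the two formats rank students similarly but far from identically, which is why the authors call the correlation mild to moderate.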
Affiliations:
- Nadia Saeed: Internal Medicine, Shifa Tameer-e-Millat University, Islamabad, Pakistan
- Sahira Aaraj: Pediatrics, Shifa Tameer-e-Millat University, Islamabad, Pakistan
- Muneeza A Sami: Medical Education and Simulation, Shifa Tameer-e-Millat University, Islamabad, Pakistan
- Muhammad Amir: Surgery, Shifa Tameer-e-Millat University, Islamabad, Pakistan
5. Tariq S, Tariq S, Maqsood S, Jawed S, Baig M. Evaluation of Cognitive Levels and Item Writing Flaws in Medical Pharmacology Internal Assessment Examinations. Pak J Med Sci 2017;33:866-870. PMID: 29067055; PMCID: PMC5648954; DOI: 10.12669/pjms.334.12887.
Abstract
OBJECTIVES: This study aimed to evaluate the cognitive levels of multiple choice questions (MCQs) and short answer questions (SAQs), and the types of item writing flaws (IWFs) in MCQs, in medical pharmacology internal assessment examinations. METHODS: This descriptive study was conducted over six months, from December 2015 to May 2016, and evaluated six internal assessment examinations comprising SAQs and MCQs. A total of 150 MCQs and 43 SAQs, administered to third-year medical students in 2015, were analyzed. All SAQs were reviewed for their cognitive levels; MCQs were reviewed for cognitive levels as well as for IWFs. Items were classified as flawed if they contained one or more flaws. The cognitive level of each question was determined using modified Bloom's taxonomy. RESULTS: The proportion of flawed items across the six exams ranged from 16% to 52%, and 28% of all 150 items were flawed. The most common flaws were implausible distractors (26; 19.69%), extra detail in the correct option (24; 18.18%), vague terms (13; 9.85%), unfocused stem (12; 9.09%), and absolute terms (12; 9.09%). Nearly two-thirds of MCQs (97; 64.67%) assessed recall of information, while 29 (19.33%) assessed interpretation of data and 24 (16%) assessed problem-solving skills. The majority of SAQs (90.7%) assessed recall of information and only 9.3% assessed interpretation of data; none assessed problem-solving skills. CONCLUSIONS: The cognitive level of the assessment tools (SAQs and MCQs) is low, and IWFs are common in the MCQs. Faculty should therefore be encouraged and trained to design problem-solving questions that are free of flaws.
Affiliations:
- Saba Tariq, MBBS, M.Phil: Assistant Professor, Pharmacology, University Medical & Dental College, Faisalabad, Pakistan
- Sundus Tariq, MBBS, M.Phil: Assistant Professor, Physiology, University Medical & Dental College, Faisalabad, Pakistan
- Sadia Maqsood, MBBS, M.Phil: Senior Demonstrator, Pharmacology, Shaikh Zayed Postgraduate Medical Institute, Shaikh Zayed Hospital, Lahore, Pakistan
- Shireen Jawed, MBBS, M.Phil: Assistant Professor, Physiology, Aziz Fatima Medical College, Faisalabad, Pakistan
- Mukhtiar Baig, MBBS, M.Phil, PhD: Professor of Clinical Biochemistry, Faculty of Medicine, Rabigh, King Abdulaziz University, Jeddah, Saudi Arabia
6. Baig M, Ali SK, Ali S, Huda N. Evaluation of Multiple Choice and Short Essay Question Items in Basic Medical Sciences. Pak J Med Sci 2014;30:3-6. PMID: 24639820; PMCID: PMC3955531; DOI: 10.12669/pjms.301.4458.
Abstract
Objectives: To evaluate multiple choice and short essay question items in basic medical sciences by determining the item writing flaws (IWFs) of MCQs along with the cognitive level of each item in both methods. Methods: This analytical study evaluated the quality of the assessment tools used for the first batch of students in a newly established medical college in Karachi, Pakistan. The first and sixth module assessment tools in Biochemistry during 2009-2010 were analyzed. The cognitive levels of MCQs and SEQs were noted, and MCQs were also evaluated for item writing flaws. Results: A total of 36 SEQs and 150 four-option MCQs were analyzed. The cognitive level of 83.33% of SEQs was at recall, while the remaining 16.67% assessed interpretation of data. Seventy-six percent of MCQs were at recall level, while the remaining 24% were at interpretation. Sixty-nine IWFs were found in the 150 MCQs; the commonest were implausible distracters (30.43%), unfocused stem (27.54%), and unnecessary information in the stem (24.64%). Conclusion: There is a need to review the quality, including the content, of the assessment tools. A structured faculty development program is recommended for developing improved assessment tools that align with learning outcomes and measure the competency of medical students.
Affiliations:
- Mukhtiar Baig, PhD, MHPE: Professor of Clinical Biochemistry, Head of Assessment Unit, Faculty of Medicine, Rabigh, King Abdulaziz University, Jeddah, Saudi Arabia
- Syeda Kauser Ali, PhD: Associate Professor, Department of Educational Development, Aga Khan University, Karachi, Pakistan
- Sobia Ali, MHPE: Assistant Professor, Department of Medical Education, Liaquat National Medical College, Karachi, Pakistan
- Nighat Huda, MS in Ed: Associate Professor, Medical Education Department, Bahria University Medical & Dental College, Karachi, Pakistan
7. Adeniyi OS, Ogli SA, Ojabo CO, Musa DI. The impact of various assessment parameters on medical students' performance in first professional examination in physiology. Niger Med J 2014;54:302-5. PMID: 24403705; PMCID: PMC3883227; DOI: 10.4103/0300-1652.122330.
Abstract
Background: This study was carried out to assess the relationship between the various assessment parameters, viz. continuous assessment (CA), multiple choice questions (MCQ), essay, practical, and oral, and the overall performance in the first professional examination in Physiology. Materials and Methods: The results of all 244 students who sat the examination over 4 years were used. The CA, MCQ, essay, practical, oral, and overall performance scores were obtained; all scores were scaled to a maximum of 100% to give each parameter equal weighting. Results: The average overall performance was 50.8 ± 5.3. The best average performance was in practical (55.5 ± 9.1), while the worst was in MCQ (44.1 ± 7.8). In all, 81.1% of students passed orals, 80.3% passed practical, 72.5% passed CA, 58.6% passed essay, 22.5% passed MCQ, and 71.7% of students passed on overall performance. All assessment parameters correlated significantly with overall performance: continuous assessment had the strongest correlation (r = 0.801, P < 0.001), while oral had the weakest (r = 0.277, P < 0.001). Essay was the best predictor of overall performance (β = 0.421, P < 0.001), followed by MCQ (β = 0.356, P < 0.001), while practical was the weakest predictor (β = 0.162, P < 0.001). Conclusion: We suggest that the department uphold the principle of continuous assessment and that more effort be made in the design of MCQs so that performance can improve.
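The standardized β coefficients this abstract reports come from regressing overall performance on the component scores. A minimal sketch of how such betas are typically obtained (OLS on z-scored variables; the function name and synthetic data are illustrative assumptions, not the study's):

```python
import numpy as np

def standardized_betas(X, y):
    """OLS on z-scored predictors and outcome; the fitted
    coefficients are the standardized betas (no intercept is
    needed because z-scoring centers every variable at zero)."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return beta

# Synthetic illustration: an overall score built mostly from the
# essay component should yield a larger beta for essay than for MCQ
rng = np.random.default_rng(0)
essay = rng.normal(50, 10, 200)
mcq = rng.normal(45, 8, 200)
overall = 0.6 * essay + 0.4 * mcq + rng.normal(0, 2, 200)
betas = standardized_betas(np.column_stack([essay, mcq]), overall)
```

Because the betas are on a common standardized scale, comparing them (as the authors do: essay 0.421 vs. practical 0.162) is a reasonable way to rank predictors' relative contributions.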
Affiliations:
- Danladi Ibrahim Musa: Department of Human Kinetics and Health Education, Benue State University, Makurdi, Nigeria
9. Hsieh C, Mache M, Knudson D. Does student learning style affect performance on different formats of biomechanics examinations? Sports Biomech 2012;11:108-19. DOI: 10.1080/14763141.2011.637128.