1. Toroujeni SMH. Computerized testing in reading comprehension skill: investigating score interchangeability, item review, age and gender stereotypes, ICT literacy and computer attitudes. Education and Information Technologies 2021; 27:1771-1810. PMID: 34366694; PMCID: PMC8329632; DOI: 10.1007/s10639-021-10584-2.
Abstract
Score interchangeability between Computerized Fixed-Length Linear Testing (henceforth CFLT) and Paper-and-Pencil-Based Testing (henceforth PPBT) has become a controversial issue over the last decade, as technology has meaningfully restructured methods of educational assessment. Given this controversy, published guidelines on computerized testing can be used to investigate the interchangeability of CFLT and PPBT mean scores and to determine whether test takers' performance is influenced by the administration mode; specifically, whether the validity and reliability of the two versions of the same test are affected. This research was conducted not only to probe score interchangeability across testing modes but also to explore the role of age and gender stereotypes, item review, ICT literacy and attitudes towards computer use as moderator variables in test takers' reading achievement in CFLT. Fifty-eight EFL learners, homogeneous in both general English and reading skills and assigned to a single testing group, participated in this study. Three different versions of the TOEFL reading comprehension test, the Computer Attitude Scale (CAS), and the ICT Literacy Scale of TOEFL Examinees were used to collect data in this crossover quasi-controlled empirical study with a common-person, pretest-posttest design. The findings demonstrated that although test takers' reading scores were interchangeable between the CFLT and PPBT versions with respect to administration mode, they differed with respect to item review. Furthermore, no significant interaction was found between age, gender, or ICT literacy and CFLT performance. However, attitudes towards computer use led to a significant change in testing achievement on CFLT.
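The common-person, crossover design described above compares the same examinees' scores under two administration modes, which reduces to a paired-samples comparison. A minimal sketch with invented scores (not the study's data), using only the Python standard library:

```python
import math
from statistics import mean, stdev

def paired_t(scores_a, scores_b):
    """Paired-samples t statistic for the same examinees under two modes."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    # t = mean of the paired differences / standard error of those differences
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical reading scores for five examinees (PPBT vs. CFLT)
ppbt = [24, 27, 22, 29, 25]
cflt = [25, 26, 23, 28, 26]
t = paired_t(ppbt, cflt)
# A |t| well below the critical value (about 2.78 for df=4, alpha=.05)
# is consistent with score interchangeability across modes.
print(round(t, 3))
```

A paired test is the natural choice here because each examinee takes both versions, so between-person variability cancels out of the comparison.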
2. Karibyan A, Sabnis G. Students' perceptions of computer-based testing using ExamSoft. Currents in Pharmacy Teaching & Learning 2021; 13:935-944. PMID: 34294257; DOI: 10.1016/j.cptl.2021.06.018.
Abstract
INTRODUCTION In fall 2017, West Coast University School of Pharmacy implemented ExamSoft for testing. Three courses in each didactic year employed ExamSoft; prior to this, courses used Scantron-based exams. We surveyed students to assess their perception of ExamSoft, hypothesizing that students' inherent bias towards technology affected that perception. METHODS To assess this hypothesis, we conducted a survey of all students. The survey contained questions about comfort with technology and nine questions on students' perceptions of ExamSoft and its usefulness. RESULTS Survey responses were stratified according to respondents' preference towards technology and its use in exams, yielding three groups: tech-embracers, tech-skeptics, and neutral. Respondents classified as tech-skeptics tended to have a more negative view of ExamSoft and its perceived impact on their grades than respondents classified as tech-embracers or neutral. CONCLUSIONS Our study suggests that students' inherent bias towards technology plays an important role in their perception of computer-based testing. Assessing incoming students' comfort with technology, together with orientation activities that acquaint students with new technology, could improve acceptance of educational technology used for testing.
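Stratifying respondents by stated comfort with technology, as in the study above, is a simple bucketing step. A sketch with made-up Likert averages; the cutoffs and field names are illustrative assumptions, not the survey's:

```python
from collections import defaultdict

def stratum(likert_avg):
    """Map a 1-5 comfort-with-technology average to a stratum (illustrative cutoffs)."""
    if likert_avg >= 4.0:
        return "tech-embracer"
    if likert_avg <= 2.5:
        return "tech-skeptic"
    return "neutral"

# Hypothetical (respondent_id, average comfort score) pairs
responses = [("r1", 4.6), ("r2", 2.1), ("r3", 3.2), ("r4", 4.0), ("r5", 2.5)]

groups = defaultdict(list)
for rid, score in responses:
    groups[stratum(score)].append(rid)

# Perception items can then be compared across the three strata
print(dict(groups))
```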
Affiliation(s)
- Anna Karibyan: West Coast University, School of Pharmacy, 590 N. Vermont Ave, Los Angeles, CA 90004, United States.
- Gauri Sabnis: Department of Pharmaceutical Sciences, West Coast University, School of Pharmacy, 590 N. Vermont Ave, Room 332, Los Angeles, CA 90004, United States.
3. Armstrong SM, Nixon P, Hojilla CV. Pathology resident evaluation during the pandemic: testing and implementation of a comprehensive online pathology exam. Academic Pathology 2021; 8:23742895211013533. PMID: 34027056; PMCID: PMC8120522; DOI: 10.1177/23742895211013533.
Abstract
Despite global digitization, evaluating pathology trainees by paper exams remains the norm. As new social distancing practices require new ways of administering exams, we assessed the viability of an online format for in-house exams from the resident and examiner perspectives. First, pathology residents participated in a practice exam, while staff experienced in creating exams were given an online exam-creation demonstration. Subsequently, residents completed a formal 3-hour online exam, comprising multiple-choice, matching, and short-answer questions with whole slide images, in place of the paper exam regularly used to evaluate trainees. The experience of the participants was evaluated by surveys. Eighteen residents completed the practice exam; 67% were receptive to the new format and 94% were in favor of moving to digital exams. Seven staff evaluated the digital format and six were in favor of it. For the formal online in-house exam, 20 residents participated and 14 completed the survey. Feedback was generally positive, the most common issue being slow-loading digital slides. Exam scores stratified by postgraduate training year in a statistically significant manner, showing a positive correlation with resident training level. The online exam format was preferred over paper exams by trainees, with support from both staff and trainees for a permanent transition. Online exams have clear advantages, but technical issues should be addressed before widespread implementation. Our study demonstrates that online exams are a feasible alternative for trainee assessment, especially in socially distanced environments.
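The reported stratification of scores by training level amounts to a positive score-by-PGY correlation. A sketch computing Pearson's r over invented data (not the study's scores), standard library only:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (postgraduate year, exam score) data
pgy    = [1, 1, 2, 3, 4, 5]
scores = [52, 58, 61, 70, 74, 83]

r = pearson_r(pgy, scores)
# r close to +1 means scores rise with training level
print(round(r, 2))
```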
Affiliation(s)
- Susan M Armstrong: Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada
- Paula Nixon: Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada
- Carlo V Hojilla: Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario, Canada; Department of Pathology and Laboratory Medicine, Mount Sinai Hospital, Toronto, Ontario, Canada
4. Egarter S, Mutschler A, Tekian A, Norcini J, Brass K. Medical assessment in the age of digitalisation. BMC Medical Education 2020; 20:101. PMID: 32234051; PMCID: PMC7110637; DOI: 10.1186/s12909-020-02014-7.
Abstract
BACKGROUND Digital assessment is becoming increasingly popular within medical education. To analyse the dimensions of this digital trend, we investigated how exam questions (items) are created and designed for use in digital medical assessments in Germany, exploring whether different types of media are used for item creation and whether a digital trend in medical assessment can be observed. METHODS In a cross-sectional descriptive study, we examined data from 30 German medical faculties stored within a common assessment platform. More precisely, 23,008 exams containing 847,137 items were analysed with respect to exam type (paper-, computer- or tablet-based) and media content (picture, video and/or audio). Of these, 5252 electronic exams with 12,214 questions were evaluated, and the media types per individual question were quantified. RESULTS The number of computer- and tablet-based exams increased rapidly from 2012 to 2018. Computer- and tablet-based written exams contained media in 45% and 66% of cases, respectively, a higher percentage than paper-based exams (33%). Analysis at the level of individual questions showed that 90.8% of questions had a single picture; the remaining questions contained more than one picture (2.9%), video (2.7%), audio (0.2%), or both picture and video (3.3%). The main question types used for items with one picture were TypeA (54%) and Long_Menu (31%). In contrast, only 11% of questions with video content were TypeA questions, whereas Long_Menu accounted for 66%; nearly all questions containing both picture and video were Long_Menu questions. CONCLUSIONS Digital assessment formats are indeed on the rise. Our data indicate that electronic assessment formats make it easier to embed media in items and thus show a higher frequency of media use. We also identified the use of different media types in the same question; this innovative item design could be a useful feature for the creation of medical assessments. Moreover, the choice of media type appears to depend on the question type.
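The per-question media analysis above is essentially a tally of which media types each item carries. A sketch over hypothetical item records; the field names and categories mirror the abstract but are assumptions, not the platform's schema:

```python
from collections import Counter

# Hypothetical items: each record lists the media attached to one question
items = [
    {"id": 1, "media": ["picture"]},
    {"id": 2, "media": ["picture"]},
    {"id": 3, "media": ["picture", "picture"]},   # more than one picture
    {"id": 4, "media": ["video"]},
    {"id": 5, "media": ["picture", "video"]},     # mixed media in one item
]

def classify(media):
    """Bucket an item by its media content, mirroring the categories above."""
    kinds = set(media)
    if kinds == {"picture"}:
        return "single picture" if len(media) == 1 else "multiple pictures"
    if kinds == {"picture", "video"}:
        return "picture + video"
    return " + ".join(sorted(kinds))

tally = Counter(classify(item["media"]) for item in items)
share = {k: round(100 * v / len(items), 1) for k, v in tally.items()}
print(share)
```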
Affiliation(s)
- Saskia Egarter: Institute for Communication and Assessment Research, Wieblinger Weg 92A, Heidelberg, Germany
- Anna Mutschler: Institute for Communication and Assessment Research, Wieblinger Weg 92A, Heidelberg, Germany; Center of Excellence Assessment in Medicine, University of Heidelberg, Heidelberg, Germany
- Ara Tekian: Department of Medical Education, University of Illinois, Chicago, USA
- John Norcini: Foundation for the Advancement of International Medical Education Research, Philadelphia, PA, USA
- Konstantin Brass: Institute for Communication and Assessment Research, Wieblinger Weg 92A, Heidelberg, Germany
5. Stauffer R, Pitlick J, Challen L. Impact of an electronic-based assessment on student pharmacist performance in a required therapeutics course. Currents in Pharmacy Teaching & Learning 2020; 12:287-290. PMID: 32273064; DOI: 10.1016/j.cptl.2019.12.005.
Abstract
INTRODUCTION The use of technology in the classroom has continued to grow, and with the advancement of classroom management systems and online exam software, there are opportunities to administer exams electronically. This study assessed the impact of electronic-based assessments on examination scores in a required therapeutics course. METHODS This was a retrospective, single-center, observational study including second professional year pharmacy students enrolled in a required, one-semester therapeutics course. Four assessments were administered each semester. Lecture content and exam format, a mixture of multiple-choice questions and free-response written cases, did not differ significantly between years. Assessments administered during the first two years were printed on paper, while assessments administered during the third and fourth years were entirely electronic, submitted through a classroom management system. Following institutional review board approval, the change in mean overall examination scores between paper and electronic-based assessments was analyzed. RESULTS Among the 948 students included, there was no difference in overall mean scores between paper and electronic-based assessments (74.8% vs. 73.8%). In addition, there was no difference in mean scores between paper and electronic versions of Exams 1 through 4, or in overall multiple-choice or free-response scores between paper and electronic-based assessments. CONCLUSIONS Scores did not differ between paper and electronic-based assessments; test method does not appear to impact exam results.
Affiliation(s)
- Rebecca Stauffer: St. Louis College of Pharmacy, 4588 Parkview Place, St. Louis, MO 63110, United States.
- Jamie Pitlick: St. Louis College of Pharmacy, 4588 Parkview Place, St. Louis, MO 63110, United States.
- Laura Challen: St. Louis College of Pharmacy, 4588 Parkview Place, St. Louis, MO 63110, United States.
6. Mahaffey AL. Interfacing virtual and face-to-face teaching methods in an undergraduate human physiology course for health professions students. Advances in Physiology Education 2018; 42:477-481. PMID: 30035633; DOI: 10.1152/advan.00097.2018.
Abstract
Human physiology is a core physical sciences course for health professions students, such as nursing and exercise science majors, and its concepts lay the foundation for health professions courses such as pathophysiology. The National Council Licensure Examination for registered nurses (a timed nursing licensure exam) and the American College of Sports Medicine's timed licensure exams for exercise science students are framed around human physiology concepts and are computer adaptive testing (CAT) assessments. This makes a case for electronic testing in the undergraduate classroom as a preparatory measure for CAT licensing exams. Case studies have also illustrated higher information retention among students completing online rather than paper homework. Additionally, in recent years, virtual laboratories for non-physical-science majors have been described as safer and effective for educating students in laboratory techniques and experimental measures. Lastly, a learning approach used successfully by museums, "touch learning" (tactile learning), has been found effective in younger students as well. It is also important to note that student discussions and the face-to-face teaching dynamic play a critical role in undergraduate education. As such, the teaching methodology discussed here combines e-learning, virtual laboratories, tactile learning, and face-to-face didactic instruction of human physiology to engage undergraduate health professions students, increase retention of course material, and simultaneously prepare students for the CAT assessments that serve as licensing exams.
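Computer adaptive testing, mentioned above as the format of the NCLEX and ACSM licensure exams, selects each next item to match the examinee's current ability estimate. A toy sketch of one common selection rule, maximum Fisher information under a one-parameter logistic (Rasch) model; the item bank is invented:

```python
import math

def item_information(ability, difficulty):
    """Fisher information of a 1PL (Rasch) item at the given ability estimate."""
    p = 1.0 / (1.0 + math.exp(-(ability - difficulty)))
    return p * (1.0 - p)  # maximized when difficulty matches ability

def next_item(ability, bank, administered):
    """Pick the unadministered item with maximum information at the current estimate."""
    candidates = [i for i in bank if i not in administered]
    return max(candidates, key=lambda i: item_information(ability, bank[i]))

# Invented item bank: item id -> difficulty on the ability scale
bank = {"q1": -2.0, "q2": -0.5, "q3": 0.1, "q4": 1.2, "q5": 2.5}

# With the current ability estimate at 0.0 and q3 already given,
# the most informative remaining item is the one nearest 0.0
choice = next_item(ability=0.0, bank=bank, administered={"q3"})
print(choice)
```

In a real CAT, the ability estimate is re-fit after each response and the loop repeats until a stopping rule (precision or item count) is met.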
Affiliation(s)
- Angela L Mahaffey: Marcella Niehoff School of Nursing, Loyola University Chicago, Chicago, Illinois