1. Stretton B, Montagu A, Kunnel A, Louise J, Behrendt N, Kovoor J, Bacchi S, Thomas J, Davies E. Perceived and actual value of Student-led Objective Structured Clinical Examinations. Clinical Teacher 2024;21:e13754. PMID: 38429878; DOI: 10.1111/tct.13754.
Abstract
INTRODUCTION Student-led Objective Structured Clinical Examinations (OSCEs) provide formative learning opportunities prior to Faculty-led OSCEs. It is important to undertake quality assurance of peer-led assessments because, if they prove unreliable or invalid, they may have detrimental impacts. The objectives of this study were to explore and evaluate Student-led OSCEs hosted by fifth-year medical students.
METHODS Student-led OSCE results were analysed for reliability (Cronbach's alpha). The relationship between Student-led and Faculty-led OSCE results was evaluated using linear regression. Qualitative data were acquired by survey and semi-structured interviews and were analysed using an inductive content analysis approach.
RESULTS In total, 85 (94%) of 91 eligible students consented to study participation. Student-led OSCEs had low-to-moderate reliability [Cronbach's alpha = 0.47 (primary care) and 0.61 (human reproduction/paediatrics, HRH)]. A statistically significant positive relationship between Student-led and Faculty-led OSCE results was observed: Faculty-led OSCE grades increased by 0.49 (95% CI: 0.18, 0.80) to 1.09 (95% CI: 0.67, 1.52) for each percentage increase in Student-led OSCE result. Student-led OSCE participants highly valued the authentic peer-assessed experience. Reported benefits included reduced perceived stress and anxiety prior to Faculty-led OSCEs, recognition of learning gaps, contribution to overall clinical competency, and facilitation of collaboration between peers.
DISCUSSION Student-led OSCEs are moderately reliable and can predict Faculty-led OSCE performance. This form of near-peer assessment encourages the metacognitive process of reflective practice and can be implemented effectively to direct further study. Faculties should collaborate with their student bodies to facilitate Student-led OSCEs and offer assistance to improve the quality and benefits of these endeavours.
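The reliability figures above are Cronbach's alpha, computed from station-level score variances relative to the total-score variance. A minimal sketch with made-up station scores (not this study's data or analysis code):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x stations) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of stations/items
    item_var = scores.var(axis=0, ddof=1)        # per-station sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_var.sum() / total_var)

# Hypothetical scores: five examinees across four OSCE stations
demo = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(demo), 2))  # prints 0.94
```

Values around 0.5-0.6, as reported for the Student-led OSCEs, indicate that station scores hang together only loosely.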
Affiliation(s)
- Brandon Stretton: Adelaide Medical School, The University of Adelaide, Adelaide, South Australia, Australia
- Adam Montagu: Adelaide Health Simulation, The University of Adelaide, Adelaide, South Australia, Australia
- Aline Kunnel: Biostatistics Unit, South Australian Health and Medical Research Institute, Adelaide, South Australia, Australia
- Jenni Louise: Biostatistics Unit, South Australian Health and Medical Research Institute, Adelaide, South Australia, Australia
- Nathan Behrendt: Adelaide Medical School, The University of Adelaide, Adelaide, South Australia, Australia
- Joshua Kovoor: Adelaide Medical School, The University of Adelaide, Adelaide, South Australia, Australia; Ballarat Base Hospital, Grampians Health Ballarat, Ballarat, Victoria, Australia
- Stephen Bacchi: Adelaide Medical School, The University of Adelaide, Adelaide, South Australia, Australia
- Josephine Thomas: Adelaide Medical School, The University of Adelaide, Adelaide, South Australia, Australia
- Ellen Davies: Adelaide Health Simulation, The University of Adelaide, Adelaide, South Australia, Australia
2. Bowers RD, Baker CN, Becker KK, Hamilton JN, Trotta K. Comparison of peer, self, and faculty objective structured clinical examination evaluations in a PharmD nonprescription therapeutics course. Currents in Pharmacy Teaching & Learning 2024;16:102159. PMID: 39089218; DOI: 10.1016/j.cptl.2024.102159.
Abstract
PURPOSE Objective structured clinical examinations (OSCEs) are a valuable assessment within healthcare education because they allow students to demonstrate clinical competency, but they can be resource-intensive when faculty graders must be provided. The purpose of this study was to determine how overall OSCE scores compared between faculty, peer, and self-evaluations within a Doctor of Pharmacy (PharmD) curriculum.
METHODS This study was conducted during the required nonprescription therapeutics course. Seventy-seven first-year PharmD students were included, with six faculty members grading 10-15 students each. Each student was evaluated by three graders: self, peer, and faculty, all using the same rubric. The primary endpoint was the comparison of overall scores between groups. Secondary endpoints included interrater reliability and quantification of feedback type by evaluator group.
RESULTS The maximum possible OSCE score was 50 points; mean scores for self, peer, and faculty evaluations were 43.3, 43.5, and 41.7 points, respectively. No statistically significant difference was found between the self and peer raters; however, differences were significant for self versus faculty (p = 0.005) and peer versus faculty (p < 0.001). When scores were converted to letter grades (A, B, C or less), higher grades showed greater similarity among raters than lower scores. Despite differences in scoring, the interrater reliability (W score) on overall letter grade was 0.79, which is considered strong agreement.
CONCLUSIONS This study demonstrated that peer and self-evaluation of an OSCE provide a comparable alternative to traditional faculty grading, especially for higher-performing students. However, given the differences in overall grades, this strategy should be reserved for low-stakes assessments and basic skill evaluations.
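The "W score" used above for interrater reliability is Kendall's coefficient of concordance. As an illustrative sketch with hypothetical scores (not the study's analysis code, and assuming no tied scores within any single rater):

```python
import numpy as np

def kendalls_w(ratings: np.ndarray) -> float:
    """Kendall's coefficient of concordance for a (raters x subjects)
    score matrix. Assumes no tied scores within any single rater."""
    m, n = ratings.shape
    # Convert each rater's scores to ranks 1..n (double argsort trick)
    ranks = ratings.argsort(axis=1).argsort(axis=1) + 1
    rank_sums = ranks.sum(axis=0)                    # per-subject rank totals
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()  # squared deviations
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical: three raters (self, peer, faculty) scoring four students
ratings = np.array([
    [43, 41, 45, 40],   # self
    [44, 40, 46, 39],   # peer
    [41, 42, 45, 38],   # faculty
])
print(round(kendalls_w(ratings), 2))  # prints 0.91
```

W runs from 0 (no agreement among raters' rankings) to 1 (identical rankings), so 0.79 on the overall letter grade is substantial concordance.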
Affiliation(s)
- Riley D Bowers: Campbell University College of Pharmacy & Health Sciences, PO Box 1090, Buies Creek, NC 27506, USA
- Carrie N Baker: Campbell University College of Pharmacy & Health Sciences, PO Box 1090, Buies Creek, NC 27506, USA
- Kaitlyn K Becker: Campbell University College of Pharmacy & Health Sciences, PO Box 1090, Buies Creek, NC 27506, USA
- Jessica N Hamilton: Campbell University College of Pharmacy & Health Sciences, PO Box 1090, Buies Creek, NC 27506, USA
- Katie Trotta: Campbell University College of Pharmacy & Health Sciences, PO Box 1090, Buies Creek, NC 27506, USA
3. El Sherif R, Shrier I, Paul-Tellier P, Rodriguez C. What do we know about Objective Structured Clinical Examination in Sport and Exercise Medicine? A scoping review. Canadian Medical Education Journal 2024;15:57-72. PMID: 39114782; PMCID: PMC11302755; DOI: 10.36834/cmej.77841.
Abstract
Background and objectives Despite the importance of the Objective Structured Clinical Examination (OSCE) in Sport and Exercise Medicine, the literature on the topic is fragmented and underdeveloped. The goal of this review was to map current knowledge about how the OSCE is used in Sport and Exercise Medicine and to identify knowledge gaps for future research.
Method The authors conducted a scoping review. They searched PubMed and Scopus for articles using key terms related to 'OSCE' and 'sport medicine', with no limit on search start date and up to July 2022. Retrieved records were imported, abstracts were screened, and full-text articles were reviewed. Forward and backward citation tracking was conducted. Data were extracted and a qualitative meta-summary of the studies was conducted.
Results A total of 469 records were screened, and 22 studies were included. The studies used OSCEs to assess knowledge/skills after a training program (n = 11), to assess an intervention (n = 8), and to assess and improve the OSCE itself (n = 3). Thirteen studies reported validity and/or reliability of the OSCE.
Conclusion Despite the widespread use of OSCEs in the examination of Sport and Exercise Medicine trainees, only a handful of scholarly works have been published. More research is needed to support the use of the OSCE in Sport and Exercise Medicine for its intended purpose. We highlight avenues for future research, such as a deeper exploration of the relationship between candidate characteristics and OSCE scores.
Affiliation(s)
- Reem El Sherif: Department of Family Medicine, McGill University, Quebec, Canada
- Ian Shrier: Department of Family Medicine, McGill University, Quebec, Canada; Centre for Clinical Epidemiology, Lady Davis Institute, Jewish General Hospital, Montreal, Quebec, Canada
- Charo Rodriguez: Department of Family Medicine, McGill University, Quebec, Canada
4. Teichgräber U, Ingwersen M, Sturm MJ, Giesecke J, Allwang M, Herzog I, von Gierke F, Schellong P, Kolleg M, Lange K, Wünsch D, Gugel K, Wünsch A, Zöllkau J, Petruschke I, Häseler-Ouart K, Besteher B, Philipp S, Mille U, Ouart D, Jünger J. Objective structured clinical examination to teach competency in planetary health care and management - a prospective observational study. BMC Medical Education 2024;24:308. PMID: 38504289; PMCID: PMC10953132; DOI: 10.1186/s12909-024-05274-9.
Abstract
BACKGROUND Health professionals are increasingly called upon, and willing, to engage in planetary health care and management. So far, however, this topic is rarely covered in medical curricula. Because the need for professional communication is particularly high in this subject area, this study aimed to evaluate whether the objective structured clinical examination (OSCE) could be used as an accompanying teaching tool.
METHODS During the winter semester 2022/2023, 20 third- and fifth-year medical students voluntarily participated in a self-directed online course, three workshops, and a formal eight-station OSCE on planetary health care and management. Each examinee also took a turn as a shadower, tasked with providing feedback. Experienced examiners rated students' performance using a tablet-supported scoring system. Examiners and shadowers provided timely feedback on candidates' performance in the OSCE. Immediately after the OSCE, students were asked about their experience via a nine-point Likert-scale survey and a videotaped group interview. Quantitative analysis comprised the proportional distribution of student responses to the survey and box plots of percentages of maximum OSCE scores. The student group interview was analysed qualitatively.
RESULTS Depending on the sub-theme, 60%-100% of students rated the subject of planetary health as likely to be useful in their professional lives. Similar proportions (57%-100%) were in favour of integrating planetary health into required courses. Students perceived learning success from the OSCE experience and feedback as higher than that from the online course and workshops. Even shadowers learned from observation and the feedback discussions. Examiners assessed students' OSCE performance at a median of 80% (interquartile range: 77%-83%) of the maximum score.
CONCLUSIONS The OSCE can be used as an accompanying teaching tool for advanced students on the topic of planetary health care and management. It supports learning outcomes, particularly communication skills for sensitising and empowering dialogue partners and for initiating adaptation steps at the level of individual patients and local communities.
Affiliation(s)
- Ulf Teichgräber: Office of the Dean, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany; Department of Radiology, Department of Diagnostic and Interventional Radiology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747 Jena, Germany
- Maja Ingwersen: Department of Radiology, Department of Diagnostic and Interventional Radiology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747 Jena, Germany
- Max-Johann Sturm: Student Representatives, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Jan Giesecke: Student Representatives, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Manuel Allwang: Student Representatives, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Ida Herzog: Student Representatives, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Paul Schellong: Institute of Infection Medicine and Hospital Hygiene, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Matthias Kolleg: Department of Internal Medicine IV, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Kathleen Lange: Department of Internal Medicine IV, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Daniel Wünsch: Clinic of Anaesthesiology and Intensive Care Medicine, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Katrin Gugel: Clinic of Anaesthesiology and Intensive Care Medicine, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Anne Wünsch: Department of Obstetrics, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Janine Zöllkau: Department of Obstetrics, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Inga Petruschke: Institute of General Practice and Family Medicine, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Kristin Häseler-Ouart: Department of Internal Medicine II, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Bianca Besteher: Department of Psychiatry and Psychotherapy, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Swetlana Philipp: Department of Psychosocial Medicine, Psychotherapy, and Psychooncology, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Urte Mille: SkillsLab Jena, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Dominique Ouart: Office of the Dean, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Jana Jünger: Institute for Communication and Assessment Research, Heidelberg, Germany; Program of Master of Medical Education (MME), Medical Faculty Heidelberg, Heidelberg University, Heidelberg, Germany
5. McKay A, McCall J, Cairns AM. Peer assessment: development and delivery of the OSCE. European Journal of Dental Education 2023;27:234-239. PMID: 35263022; DOI: 10.1111/eje.12796.
Abstract
INTRODUCTION Healthcare professionals are expected to display competence in teaching, assessment, and providing feedback. This development begins with formative peer-assisted learning and teaching in the undergraduate environment. Using peers or near-peers (in this case, assessors with one year more experience than the examination cohort) to provide assessment in summative exams remains unexplored. This study investigates how marking by near-peers compares with marking by academic staff in a summative OSCE.
MATERIALS AND METHODS BDS4 peer assessors (PAs) developed an OSCE question and marking schedule. Each PA (n = 3) was paired with an academic staff assessor (ASA) (n = 3), and peer and academic assessors marked the candidates independently. Two years later, the process was repeated on the same cohort of candidates, with the PAs now one year post-qualification. Statistical analysis compared the scores awarded by PAs during each timeframe and against the marks awarded by the ASAs.
RESULTS During round 1, 28 students (62.2%) were awarded the same score by PA and ASA; on 17 occasions (37.8%) there was a discrepancy, with bias skewed in favour of PAs scoring higher (mean difference of differences -0.0667). During round 2, 27 students (55.1%) were awarded the same score by PA and ASA; on 22 occasions (44.9%) there was a discrepancy, with bias skewed in favour of ASAs scoring higher (mean difference of differences 0.0612).
DISCUSSION Levels of agreement between PAs and ASAs were strong. Our results show that PAs marked more leniently as undergraduates and less leniently at one year post-graduation.
CONCLUSIONS Peer assessors are able to write OSCE stations, produce marking schemes, and effectively assess their near-peers.
Affiliation(s)
- Amy McKay: Glasgow Dental Hospital and School, Glasgow, UK
- John McCall: Glasgow Dental Hospital and School, Glasgow, UK
6. Bhattacharya SB, Sabata D, Gibbs H, Jernigan S, Marchello N, Zwahlen D, Yang FM, Bhattacharya RK, Burkhardt C. The SPEER: an interprofessional team behavior rubric to optimize geriatric clinical care. Gerontology & Geriatrics Education 2023;44:316-328. PMID: 34872460; DOI: 10.1080/02701960.2021.2002854.
Abstract
Geriatric patients with complex health care needs can benefit from interprofessional (IP) care; however, a major gap in health professional education is determining how to prepare future providers for IP collaboration. Effective IP team behavior assessment tools are needed to teach, implement, and evaluate IP practice skills. After a review of IP evaluation tools, the Standardized Patient Encounter Evaluation Rubric (SPEER) was created to evaluate team dynamics in IP practice sites.
Independent-sample t-tests between faculty and learner SPEER scores showed that learners scored themselves 15 points higher than their faculty scores (p < .001). Cronbach's α showed high internal consistency (α = 0.91). Paired t-tests found that learners identified improvements in the team's ability to address the patient's education needs and to allow patients to voice their expectations. Faculty identified improvements in the teams' ability to make recommendations, and faculty evaluations of learner teams showed improvements in raw ratings on all but two items. Qualitative analysis of emergent themes showed that learners desired feedback on team functioning and on how teamwork could improve to provide optimal IP care.
In conclusion, the SPEER can help faculty and learners identify growth in their teams' ability to perform key IP skills in clinical sites.
Affiliation(s)
- Shelley B Bhattacharya: Department of Family & Community Medicine, University of Kansas School of Medicine, Kansas City, Kansas, USA
- Dory Sabata: School of Health Professions, University of Kansas Medical Center, Kansas City, Kansas, USA
- Heather Gibbs: Department of Dietetics & Nutrition, University of Kansas Medical Center, Kansas City, KS, USA
- Stephen Jernigan: Department of Physical Therapy, Rehabilitation Science and Athletic Training, University of Kansas Medical Center, Kansas City, Kansas, USA
- Nicholas Marchello: Department of Dietetics and Nutrition, University of Central Missouri, Warrensburg, Missouri, USA
- Denise Zwahlen: Department of Family & Community Medicine, University of Kansas School of Medicine, Kansas City, Kansas, USA
- Frances M Yang: Department of Occupational Therapy Education, University of Kansas School of Health Professions, Kansas, KS, USA
- Rajib K Bhattacharya: Department of Internal Medicine, University of Kansas School of Medicine, Kansas City, Kansas, USA
7. Cidoncha G, Muñoz-Corcuera M, Sánchez V, Pardo Monedero MJ, Antoranz A. The Objective Structured Clinical Examination (OSCE) in periodontology with simulated patient: the most realistic approach to clinical practice in dentistry. International Journal of Environmental Research and Public Health 2023;20:2661. PMID: 36768027; PMCID: PMC9916374; DOI: 10.3390/ijerph20032661.
Abstract
The objective structured clinical examination (OSCE) is becoming an increasingly established assessment in dental schools. The use of simulated patients in the OSCE makes the stations more similar to clinical practice: the student can demonstrate technical and clinical knowledge as well as the ability to manage the patient. Tests of this kind, in which simulated patients can be included, can be used before the student starts clinical practice with patients and/or at the end of the degree. The objective of this work was to describe how a periodontology station using a simulated patient was developed for fifth-year dentistry students taking an OSCE. Furthermore, a questionnaire was created to capture the students' perception of this station and its characteristics. The fifth-year students at the European University of Madrid evaluated this station positively and reported preferring a simulated patient in their tests over stations based on clinical cases, images, X-rays, and presentations. It is essential that, once the OSCE has been completed, the student receives feedback to learn where they have failed and, therefore, be able to improve any of the aspects evaluated at the station.
Affiliation(s)
- Gema Cidoncha: Department of Clinical Dentistry, Faculty of Biomedical and Health Sciences, Universidad Europea de Madrid, 28670 Madrid, Spain
- Marta Muñoz-Corcuera: Department of Clinical Dentistry, Faculty of Biomedical and Health Sciences, Universidad Europea de Madrid, 28670 Madrid, Spain
- Virginia Sánchez: Department of Clinical Dentistry, Faculty of Biomedical and Health Sciences, Universidad Europea de Madrid, 28670 Madrid, Spain
- María Jesús Pardo Monedero: Department of Preclinical Dentistry, Faculty of Biomedical and Health Sciences, Universidad Europea de Madrid, 28670 Madrid, Spain
- Ana Antoranz: Department of Clinical Dentistry, Faculty of Biomedical and Health Sciences, Universidad Europea de Madrid, 28670 Madrid, Spain
8. Buléon C, Mattatia L, Minehart RD, Rudolph JW, Lois FJ, Guillouet E, Philippon AL, Brissaud O, Lefevre-Scelles A, Benhamou D, Lecomte F, group TSAWS, Bellot A, Crublé I, Philippot G, Vanderlinden T, Batrancourt S, Boithias-Guerot C, Bréaud J, de Vries P, Sibert L, Sécheresse T, Boulant V, Delamarre L, Grillet L, Jund M, Mathurin C, Berthod J, Debien B, Gacia O, Der Sahakian G, Boet S, Oriot D, Chabot JM. Simulation-based summative assessment in healthcare: an overview of key principles for practice. Advances in Simulation 2022;7:42. PMID: 36578052; PMCID: PMC9795938; DOI: 10.1186/s41077-022-00238-9.
Abstract
BACKGROUND Healthcare curricula need summative assessments that are relevant to and representative of clinical situations to best select and train learners. Simulation provides multiple benefits, with a growing literature base proving its utility for training in a formative context. Advancing to the next step, the use of simulation for summative assessment, requires rigorous and evidence-based development, because any summative assessment is high stakes for participants, trainers, and programs. The first step of this process is to identify the baseline from which we can start.
METHODS First, using a modified nominal group technique, a task force of 34 panelists defined topics to clarify the why, how, what, when, and who of using simulation-based summative assessment (SBSA). Second, each topic was explored by a group of panelists using state-of-the-art literature review techniques, with a snowball method to identify further references. Our goal was to identify current knowledge and potential recommendations for future directions. Results were cross-checked among groups and reviewed by an independent expert committee.
RESULTS Seven topics were selected by the task force: "What can be assessed in simulation?", "Assessment tools for SBSA", "Consequences of undergoing the SBSA process", "Scenarios for SBSA", "Debriefing, video, and research for SBSA", "Trainers for SBSA", and "Implementation of SBSA in healthcare". Together, these seven explorations provide an overview of what is known and can be done with relative certainty, and what is unknown and probably needs further investigation. Based on this work, we highlighted the trustworthiness of different summative assessment-related conclusions, the remaining important problems and questions, and their consequences for participants and institutions in how SBSA is conducted.
CONCLUSION Our results identified, among the seven topics, one area with robust evidence in the literature ("What can be assessed in simulation?"), three areas where the evidence requires guidance by expert opinion ("Assessment tools for SBSA", "Scenarios for SBSA", "Implementation of SBSA in healthcare"), and three areas with weak or emerging evidence ("Consequences of undergoing the SBSA process", "Debriefing for SBSA", "Trainers for SBSA"). Using SBSA holds much promise, with increasing demand for this application. Due to the important stakes involved, it must be rigorously conducted and supervised. Guidelines for good practice should be formalized to help with conduct and implementation. We believe this baseline can direct future investigation and the development of guidelines.
Affiliation(s)
- Clément Buléon: Department of Anesthesiology, Intensive Care and Perioperative Medicine, Caen Normandy University Hospital, 6th Floor, Caen, France; Medical School, University of Caen Normandy, Caen, France; Center for Medical Simulation, Boston, MA, USA
- Laurent Mattatia: Department of Anesthesiology, Intensive Care and Perioperative Medicine, Nîmes University Hospital, Nîmes, France
- Rebecca D. Minehart: Center for Medical Simulation, Boston, MA, USA; Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Jenny W. Rudolph: Center for Medical Simulation, Boston, MA, USA; Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Fernande J. Lois: Department of Anesthesiology, Intensive Care and Perioperative Medicine, Liège University Hospital, Liège, Belgium
- Erwan Guillouet: Department of Anesthesiology, Intensive Care and Perioperative Medicine, Caen Normandy University Hospital, 6th Floor, Caen, France; Medical School, University of Caen Normandy, Caen, France
- Anne-Laure Philippon: Department of Emergency Medicine, Pitié Salpêtrière University Hospital, APHP, Paris, France
- Olivier Brissaud: Department of Pediatric Intensive Care, Pellegrin University Hospital, Bordeaux, France
- Antoine Lefevre-Scelles: Department of Emergency Medicine, Rouen University Hospital, Rouen, France
- Dan Benhamou: Department of Anesthesiology, Intensive Care and Perioperative Medicine, Kremlin Bicêtre University Hospital, APHP, Paris, France
- François Lecomte: Department of Emergency Medicine, Cochin University Hospital, APHP, Paris, France
9. Alsaif F, Alkuwaiz L, Alhumud M, Bin Idris R, Neel L, Aljabry M, Soliman M. Evaluation of the experience of peer-led mock objective structured practical examination for first- and second-year medical students. Advances in Medical Education and Practice 2022;13:987-992. PMID: 36059924; PMCID: PMC9438775; DOI: 10.2147/amep.s359669.
Abstract
BACKGROUND The objective structured practical examination (OSPE) is used as an assessment tool for laboratory practical sessions. This study describes the design and implementation of a peer-led mock OSPE for first- and second-year medical students, and investigates students' perceptions of the peer-led mock OSPE and the impact of attendance on performance.
METHODS This was a cross-sectional study. Two mock OSPEs, each involving six stations, were designed and conducted by third-, fourth-, and fifth-year medical students for first- and second-year students; 33 medical students facilitated them. The mock OSPEs were conducted prior to the summative end-of-block exams. Afterwards, an online survey was sent to participants to assess their satisfaction and the perceived quality and benefits of the mock OSPE. The study also evaluated the impact of the mock OSPE on students' performance.
RESULTS Of 313 first-year students, 279 (89.1%) attended the mock OSPE, and of 298 second-year students, 213 (71.5%) attended. A total of 192 (68.8%) first-year and 102 (47.9%) second-year medical students completed the questionnaire. There was no significant difference in summative OSPE performance between students who attended the mock OSPE and those who did not. The majority of students felt more confident and less anxious, and reported lower stress levels, after attending the mock OSPE. More than half felt that attending eased the exam steps, supported better preparation, provided sufficient orientation, explained the materials well, and helped them understand the format of the final OSPE. The majority of students found the mock OSPE stimulating.
CONCLUSION Attending the mock OSPE did not affect students' performance in the summative OSPE. However, the peer-assessed mock OSPE improved medical students' confidence and lowered the anxiety associated with the OSPE.
Affiliation(s)
- Faisal Alsaif: College of Medicine, King Saud University, Riyadh, Saudi Arabia
- Lamia Alkuwaiz: College of Medicine, King Saud University, Riyadh, Saudi Arabia
- Reem Bin Idris: College of Medicine, King Saud University, Riyadh, Saudi Arabia
- Lina Neel: College of Medicine, Alfaisal University, Riyadh, Saudi Arabia
- Mansour Aljabry: Pathology Department, College of Medicine, King Saud University, Riyadh, Saudi Arabia
- Mona Soliman: Medical Education & Physiology, Medical Education Department, College of Medicine, King Saud University, Riyadh, Saudi Arabia
10
Sader J, Cerutti B, Meynard L, Geoffroy F, Meister V, Paignon A, Junod Perron N. The pedagogical value of near-peer feedback in online OSCEs. BMC MEDICAL EDUCATION 2022; 22:572. [PMID: 35879752] [PMCID: PMC9310367] [DOI: 10.1186/s12909-022-03629-8]
Abstract
PURPOSE OF THE ARTICLE During the Covid-19 pandemic, formative OSCEs were transformed into online OSCEs, and senior students (near peers) substituted for experienced clinical teachers. The aims of the study were to evaluate the quality of the feedback given by near peers during online OSCEs and to explore the experience of near-peer feedback from both the learner's and the near peer's perspectives. MATERIALS AND METHODS All 2nd-year medical students (n = 158) attended an online OSCE under the supervision of twelve senior medical students. Outcome measures were: 1) students' perception of the quality of the feedback, assessed through an online survey (Likert 1-5); 2) objective assessment of the quality of the feedback, focusing on both process and content, using a feedback scale (Likert 1-5); and 3) the experience of near-peer feedback, explored in two focus groups. RESULTS One hundred and six medical students answered the questionnaire and had their feedback session videotaped. The mean perceived overall quality of senior students' feedback was 4.75 (SD 0.52). Students especially valued self-evaluation (mean 4.80, SD 0.67), balanced feedback (mean 4.93, SD 0.29) and provision of the simulated patient's feedback (mean 4.97, SD 0.17). The overall objective assessment of feedback quality was 3.73 (SD 0.38); highly scored skills were subjectivity (mean 3.95, SD 1.12) and taking the student's self-evaluation into account (mean 3.71, SD 0.87). Senior students mainly addressed history-taking issues (mean items 3.53, SD 2.37) and communication skills (mean items 4.89, SD 2.43) during feedback. Participants reported that near-peer feedback was less stressful and more tailored to learning needs; challenges for senior students included remaining objective and providing negative feedback. CONCLUSION Increased involvement of near peers in teaching activities is strongly supported for formative OSCEs and should be maintained in parallel even when experienced teachers are again involved in such teaching activities. However, it requires training not only in feedback skills but also in the specific content of the formative OSCE.
Affiliation(s)
- Julia Sader: Unit of Development and Research in Medical Education, Faculty of Medicine, University of Geneva, Rue Michel-Servet 1 - CMU 5-6, Geneva, Switzerland
- Bernard Cerutti: Unit of Development and Research in Medical Education, Faculty of Medicine, University of Geneva, Rue Michel-Servet 1 - CMU 5-6, Geneva, Switzerland
- Louise Meynard: Interprofessional Centre of Simulation - CIS, Geneva, Switzerland
- Frédéric Geoffroy: Unit of Development and Research in Medical Education, Faculty of Medicine, University of Geneva, Rue Michel-Servet 1 - CMU 5-6, Geneva, Switzerland
- Adeline Paignon: Interprofessional Centre of Simulation - CIS, Geneva, Switzerland; HES-SO University of Applied Sciences and Arts of Western Switzerland, School of Health Sciences Geneva, Geneva, Switzerland
- Noëlle Junod Perron: Unit of Development and Research in Medical Education, Faculty of Medicine, University of Geneva, Rue Michel-Servet 1 - CMU 5-6, Geneva, Switzerland
11
Lam A, Lam L, Blacketer C, Parnis R, Franke K, Wagner M, Wang D, Tan Y, Oakden-Rayner L, Gallagher S, Perry SW, Licinio J, Symonds I, Thomas J, Duggan P, Bacchi S. Professionalism and clinical short answer question marking with machine learning. Intern Med J 2022; 52:1268-1271. [PMID: 35879236] [DOI: 10.1111/imj.15839]
Abstract
Machine learning may assist in medical student evaluation. This study involved the automated scoring of short answer questions administered at three centres. Bidirectional encoder representations from transformers (BERT) models were particularly effective for professionalism question scoring (accuracy ranging from 41.6% to 92.5%). In the scoring of 3-mark professionalism questions, machine learning had a lower classification accuracy than for clinical questions (P < 0.05). The role of machine learning in medical professionalism evaluation warrants further investigation.
Affiliation(s)
- Antoinette Lam: University of Adelaide, Adelaide, South Australia, Australia
- Lydia Lam: University of Adelaide, Adelaide, South Australia, Australia
- Charlotte Blacketer: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
- Roger Parnis: University of Adelaide, Adelaide, South Australia, Australia; Royal Darwin Hospital, Darwin, Northern Territory, Australia
- Kyle Franke: University of Adelaide, Adelaide, South Australia, Australia
- Morganne Wagner: State University of New York (SUNY) Upstate Medical University, Syracuse, New York, USA
- David Wang: University of Otago, Dunedin, New Zealand
- Yiran Tan: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
- Lauren Oakden-Rayner: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
- Seth W Perry: State University of New York (SUNY) Upstate Medical University, Syracuse, New York, USA
- Julio Licinio: State University of New York (SUNY) Upstate Medical University, Syracuse, New York, USA
- Ian Symonds: University of Adelaide, Adelaide, South Australia, Australia
- Josephine Thomas: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
- Paul Duggan: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
- Stephen Bacchi: University of Adelaide, Adelaide, South Australia, Australia; Royal Adelaide Hospital, Adelaide, South Australia, Australia
12
Volino LR, Brust-Sisti LA, Patel S, Yeh D, Liu MT, Cogan-Drew T, Parrott JS. Evaluation of Interprofessional Education on Effective Communication Between Pharmacy and Physician Assistant Students. J Physician Assist Educ 2022; 33:114-118. [PMID: 35511459] [DOI: 10.1097/jpa.0000000000000432]
Abstract
PURPOSE The purpose of this study was to assess the effect of Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS) on student self-perceived competencies and perceptions of interprofessional (IP) communication and teamwork in a clinical case review activity. TeamSTEPPS is an evidence-based curriculum that is used to enhance and support IP healthcare communication. METHODS A repeated-measures, pretest/posttest study evaluated physician assistant students' and student pharmacists' perceptions of TeamSTEPPS. Students completed Performance Assessment for Communication and Teamwork (PACT) surveys, evaluating teamwork, knowledge, attitudes, and skills perceptions before and after a TeamSTEPPS lecture and associated activity with peer feedback. RESULTS Overall, 87.4% (n = 429) completed pre- and post-PACT surveys. Apart from the Mutual Support domain (p = .898), all changes were significantly positive (p < .004), with the greatest improvements occurring in the Attitudes and Perceived Skills domains. CONCLUSION TeamSTEPPS IP education, application, and peer feedback improved students' perceptions of multiple domains, including effective communication. Using TeamSTEPPS tools in IP formats enabled the students to safely practice and collaborate in preparation for clinical practice.
Affiliation(s)
- Lucio R. Volino, PharmD, is a clinical associate professor in the Ernest Mario School of Pharmacy at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- Lindsay A. Brust-Sisti, PharmD, is a clinical assistant professor in the Ernest Mario School of Pharmacy at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- Sarah Patel, DScPAS, PA-C, MBA, is a lecturer in the Department of Physician Assistant Studies and Practice at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- Dipali Yeh, MS, PA-C, is an assistant professor and coordinator for Simulation and Interprofessional Education, Department of Physician Assistant Studies and Practice, at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- Mei T. Liu, PharmD, is a clinical assistant professor in the Ernest Mario School of Pharmacy at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- Thea Cogan-Drew, MMSc, PA-C, is a lecturer in the Department of Physician Assistant Studies and Practice at Rutgers, The State University of New Jersey, in Piscataway, New Jersey
- J. Scott Parrott, PhD, is the director of the School of Health Professions Methodology and Statistics Support Team, a professor in the Department of Interdisciplinary Studies at Rutgers School of Health Professions, and an adjunct professor in the Department of Biostatistics & Epidemiology at Rutgers School of Public Health in Blackwood, New Jersey
13
Macauley K, Laprino S, Brudvig T. Perceptions of Physical Therapy Students on their Psychomotor Examinations: a Qualitative Study. MEDICAL SCIENCE EDUCATOR 2022; 32:349-360. [PMID: 35528290] [PMCID: PMC9054959] [DOI: 10.1007/s40670-022-01514-z]
Abstract
INTRODUCTION Practical examinations are necessary to demonstrate learning in the psychomotor, cognitive, and affective domains. Student perceptions of the organization and execution of practical examinations are an important consideration in their development. REVIEW OF THE LITERATURE Multiple other health professions have investigated students' perceptions of objective structured clinical examinations (OSCEs). There is little in the physical therapy literature on student perceptions of proctor presence during practical examinations or OSCEs. SUBJECTS The participants were members of the classes of 2019-2021 in a Doctor of Physical Therapy (DPT) program at a New England university. METHODS A qualitative thematic approach was applied to de-identified transcripts of student focus group interviews. Independently coded themes were identified, discussed, and refined iteratively. RESULTS AND DISCUSSION Four themes emerged, each with multiple subthemes: impact of proctor presence; a realistic, patient-focused experience; preparation for the practical; and stress. Students valued preparation that included clear expectations, utilization of formative assessments, and peer feedback prior to the practical. They also noted that a distraction-free testing space, having no proctor present in the room, recording the practical, and the OSCE format decreased stress and improved performance. CONCLUSIONS These findings add to the body of knowledge in physical therapy and provide guidance to faculty as they plan and organize practical examinations.
14
Grover S, Pandya M, Ranasinghe C, Ramji SP, Bola H, Raj S. Assessing the utility of virtual OSCE sessions as an educational tool: a national pilot study. BMC MEDICAL EDUCATION 2022; 22:178. [PMID: 35292001] [PMCID: PMC8923093] [DOI: 10.1186/s12909-022-03248-3]
Abstract
BACKGROUND Objective Structured Clinical Examinations (OSCEs) are a common form of assessment used across medical schools in the UK to assess clinical competence and practical skills, and are traditionally held in an in-person format. In the past, medical students have often prepared for such exams through in-person peer-assisted learning (PAL); however, due to the Covid-19 pandemic, many in-person teaching sessions transitioned to online formats. There is currently a paucity of research on the utility of virtual PAL OSCE sessions, so we carried out a national pilot study to determine the feasibility of virtual OSCE teaching via feedback from participants and examiners. METHODS A total of 85 students from 19 UK-based medical schools, together with eight students based internationally, attended the series of online OSCE workshops delivered via Zoom®. All students and examiners completed a feedback questionnaire at the end of each session, which included questions on pre- and post-workshop confidence in three OSCE domains: history-taking, communication and data interpretation. A five-item Likert scale was used to self-report confidence, and the results were analysed using the Mann-Whitney U test after assessing for normality with the Shapiro-Wilk test. RESULTS Student feedback showed an increase in confidence in all three OSCE domains after each event (p < 0.001), with 69.4% agreeing or strongly agreeing that online OSCE sessions could sufficiently prepare them for in-person exams. Questionnaire feedback revealed that 97.6% of students and 86.7% of examiners agreed that virtual OSCE teaching would be useful for preparing for in-person OSCE examinations after the pandemic. CONCLUSION Most participants in the virtual OSCE sessions reported an improvement in their confidence in history-taking, communication and data interpretation skills. Of the participants and examiners who had also experienced in-person OSCE examinations, the majority reported that they found virtual OSCE sessions to be as engaging and interactive as in-person teaching. This study has demonstrated that virtual OSCE workshops are a feasible option with the potential to be beneficial beyond the pandemic. However, more studies are required to assess the overall impact on student learning and to determine the value of virtual OSCE workshops for exam performance.
Affiliation(s)
- Sarika Grover: GKT School of Medicine, King’s College London, London, UK
- Maharsh Pandya: Centre for Medical Education, School of Medicine, Cardiff University, Cardiff, UK
- Chavini Ranasinghe: University College London Medical School, University College London, London, UK
- Harroop Bola: Imperial College School of Medicine, Imperial College London, London, UK
- Siddarth Raj: GKT School of Medicine, King’s College London, London, UK
15
Lavery J. Observed structured clinical examination as a means of assessing clinical skills competencies of ANPs. BRITISH JOURNAL OF NURSING (MARK ALLEN PUBLISHING) 2022; 31:214-220. [PMID: 35220736] [DOI: 10.12968/bjon.2022.31.4.214]
Abstract
Observed structured clinical examinations (OSCEs) are a common method of assessment within higher education in preparation for the advanced nurse practitioner (ANP) role. This article reviews a wide range of literature relating to OSCE assessment in the healthcare arena from ANP, interprofessional and advanced clinical practice perspectives. Theories underpinning OSCEs and advanced nursing roles are explored, with relevant supporting literature indicating how established OSCEs can be integrated with other methods to improve outcomes for this level of practice. Alternative assessments are explored with specific reference to the context of the education delivered and their suitability for higher education today.
Affiliation(s)
- Joanna Lavery: Senior Lecturer in Adult Nursing, Liverpool John Moores University, Liverpool
16
Cosker E, Favier V, Gallet P, Raphael F, Moussier E, Tyvaert L, Braun M, Feigerlova E. Tutor-Student Partnership in Practice OSCE to Enhance Medical Education. MEDICAL SCIENCE EDUCATOR 2021; 31:1803-1812. [PMID: 34956698] [PMCID: PMC8651844] [DOI: 10.1007/s40670-021-01421-9]
Abstract
BACKGROUND Training of examiners is essential to ensure the quality of the objective structured clinical examination (OSCE). We aimed to study the perceived effectiveness of a tutor-student partnership in a practice OSCE module, as reported by novice OSCE tutors and medical students. METHOD We implemented a practice OSCE at a medical faculty in France with novice tutors and third-year medical students as partners. Each tutor (n = 44) served as a partner for a group of five students in the conception of the scenario and as an evaluator of the tutored station. Students (n = 303) were involved in the conception of a case and in the roles of physician, evaluator and simulated patient. Data were obtained through self-assessment questionnaires. Descriptive statistics were used to analyze questionnaire items, and free-form answers were coded and analyzed thematically. RESULTS A total of 36 tutors (82%) and 185 students (61%) responded to the questionnaires. The intervention was well received. Thirty-two percent of the tutors reported some difficulties in assessing student performance and were willing to receive further training. Fifty-five percent of the students considered participation in OSCE case development appropriate to their level of knowledge, and 70% perceived it as beneficial in allowing them to set their learning goals. CONCLUSION This initiative provides a relevant method beneficial to OSCE tutors, medical students and the faculty. Tutors learn how to assess student performance according to expected achievement levels, and students are engaged as partners in the co-creation of learning and teaching. SUPPLEMENTARY INFORMATION The online version contains supplementary material available at 10.1007/s40670-021-01421-9.
Affiliation(s)
- Eve Cosker: Pôle Hospitalo-Universitaire de psychiatrie d’adultes et d’addictologie du Grand Nancy, Centre Psychothérapique De Nancy, Laxou, F-54520 France; Unité de Physiopathologie et Médecine Translationnelle, INSERM U1114, Université de Strasbourg, Strasbourg, F-67000 France
- Valentin Favier: Centre hospitalier régional et universitaire de Nancy, Otorhinolaryngology, Université de Lorraine, Nancy, F-54000 France
- Patrice Gallet: Centre hospitalier régional et universitaire de Nancy, Otorhinolaryngology, Université de Lorraine, Nancy, F-54000 France; Centre universitaire d’enseignement par simulation (CUESiM), Hôpital virtuel de Lorraine, Faculté de médecine, Nancy, F-54000 France; Faculté de Médecine, Université de Lorraine, Nancy, F-54000 France
- Francis Raphael: Faculté de Médecine, Université de Lorraine, Nancy, F-54000 France
- Louise Tyvaert: Faculté de Médecine, Université de Lorraine, Nancy, F-54000 France; Centre hospitalier régional et universitaire de Nancy, Department of Neurology, Université de Lorraine, Nancy, F-54000 France
- Marc Braun: Centre universitaire d’enseignement par simulation (CUESiM), Hôpital virtuel de Lorraine, Faculté de médecine, Nancy, F-54000 France; Faculté de Médecine, Université de Lorraine, Nancy, F-54000 France
- Eva Feigerlova: Centre universitaire d’enseignement par simulation (CUESiM), Hôpital virtuel de Lorraine, Faculté de médecine, Nancy, F-54000 France; Faculté de Médecine, Université de Lorraine, Nancy, F-54000 France; Centre hospitalier régional et universitaire de Nancy, Department of Endocrinology, Université de Lorraine, Nancy, F-54000 France; Université de Lorraine, Inserm, UMR S 1116 - DCAC, Nancy, F-54000 France
17
Vasli P, Shahsavari A, Estebsari F, AsadiParvar-Masouleh H. The predictors of nursing students' clinical competency in pre-internship objective structured clinical examination: The roles of exam anxiety and academic success. NURSE EDUCATION TODAY 2021; 107:105148. [PMID: 34600185] [DOI: 10.1016/j.nedt.2021.105148]
Abstract
BACKGROUND Identifying the predictors of nursing students' clinical competency in the objective structured clinical examination (OSCE) is of utmost importance. The present study therefore investigated the predictive roles of exam anxiety and academic success in nursing students' clinical competency in a pre-internship OSCE. METHODS In this descriptive-analytical study, 102 nursing students who met the inclusion criteria and were enrolled in the sixth semester (third year) of a graduate nursing program in Iran were selected by census. The pre-internship OSCE was implemented at eight stations based on a pre-designed schedule template. Data were collected through a demographic-academic characteristics questionnaire, the State-Trait Anxiety Inventory (exam anxiety score), the nursing program grade point average (GPA, reflecting academic success) and the OSCE score (clinical competency). Analyses were performed at a significance level of 0.05. RESULTS The linear regression model, into which exam anxiety, nursing program GPA and the demographic-academic characteristics were entered, explained 33.52% of the variance in the students' clinical competency in the pre-internship OSCE (R2 = 0.616). Of the variables examined, only nursing program GPA was a significant predictor of clinical competency scores: each one-unit increase in GPA added 0.8 points to the OSCE clinical competency score (p < 0.001, β = 0.717). No significant relationship was observed between exam anxiety and clinical competency in the pre-internship OSCE.
CONCLUSIONS The results support the use of the OSCE to assess nursing students' clinical competency and the implementation of learning strategies that strengthen academic success.
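The reported effect can be read as an ordinary least-squares slope. A minimal sketch with hypothetical GPA/OSCE-score pairs (illustrative numbers constructed for this example, not the study's data) shows how such a per-unit coefficient is estimated:

```python
import numpy as np

# Hypothetical GPA and OSCE-score pairs, constructed so that each one-unit
# GPA increase corresponds to a 0.8-point OSCE gain (not the study's data)
gpa = np.array([14.0, 15.0, 16.0, 17.0, 18.0])
osce = np.array([19.2, 20.0, 20.8, 21.6, 22.4])

# Ordinary least-squares fit of osce = slope * gpa + intercept
slope, intercept = np.polyfit(gpa, osce, 1)
print(round(slope, 3))  # slope of ~0.8 points per GPA unit
```

On noisy real data the slope would come with a confidence interval and p-value, which is what the abstract reports.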
Affiliation(s)
- Parvaneh Vasli
- Department of Nursing, School of Nursing and Midwifery, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Arezoo Shahsavari
- Department of Nursing, School of Nursing and Midwifery, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Fatemeh Estebsari
- Department of Nursing, School of Nursing and Midwifery, Shahid Beheshti University of Medical Sciences, Tehran, Iran

18
Koo JH, Ong KY, Yap YT, Tham KY. The role of training in student examiner rating performance in a student-led mock OSCE. Perspectives on Medical Education 2021;10:293-298. PMID: 33351173. PMCID: PMC8505586. DOI: 10.1007/s40037-020-00643-8.
Abstract
INTRODUCTION Peer assessments are increasingly prevalent in medical education, including student-led mock Objective Structured Clinical Examinations (OSCEs). While there is some evidence that examiner training may improve OSCE assessments, few students undergo training before becoming examiners. We sought to evaluate an examiner training programme in the setting of a student-led mock OSCE. METHODS A year-2 mock OSCE comprising history taking (Hx) and physical examination (PE) stations was conducted with 35 year-3 (Y3) student examiners and 21 year-5 (Y5) student examiners who acted as reference examiners. Twelve Y3 student examiners attended an OSCE examiner training programme conducted by senior faculty. During the OSCE, Y3 and Y5 student examiners were randomly paired to grade the same candidates, and their scores were compared. Checklist rating (CR) and global rating (GR) scores were assigned for both Hx and PE stations. RESULTS There was moderate to excellent correlation between Y3 and Y5 student examiners for both Hx (ICC 0.71-0.96) and PE stations (ICC 0.71-0.88) across all domains. For both station types, the GR domain correlated more poorly than the CR domain. Examiner training resulted in better correlations for PE but not Hx stations. Effect sizes were smaller than the minimum detectable effect (MDE) sizes for all comparisons. DISCUSSION Y3 student examiners are effective substitutes for Y5 student examiners in a Y2 mock OSCE. Our findings suggest that examiner training may further improve marking behaviour, especially for PE stations. Studies with larger samples are required to further evaluate the effects of dedicated examiner training.
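The ICC values above index agreement between paired examiners scoring the same candidates. As a rough sketch (not the authors' code), a two-way, absolute-agreement, single-rater ICC can be computed from the ANOVA mean squares of a candidates-by-examiners score matrix; the function name and toy matrix are my own:

```python
import numpy as np

def icc2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an (n candidates) x (k examiners) array of ratings.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-candidate means
    col_means = scores.mean(axis=0)   # per-examiner means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between candidates
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between examiners
    sse = np.sum((scores - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Toy data: a constant one-point leniency difference between two examiners
# lowers the absolute-agreement ICC below 1 even though ranking is identical
biased = icc2_1([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]])
```

Because ICC(2,1) penalizes systematic rater bias, it is stricter than a plain correlation between the two examiners' scores.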
Affiliation(s)
- Jian Hui Koo
- Singapore General Hospital, Singapore, Singapore
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- Kim Yao Ong
- Tan Tock Seng Hospital, Singapore, Singapore
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- Yun Ting Yap
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- Kum Ying Tham
- Tan Tock Seng Hospital, Singapore, Singapore
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore

19
Yu JH, Lee MJ, Kim SS, Yang MJ, Cho HJ, Noh CK, Lee GH, Lee SK, Song MR, Lee JH, Kim M, Jung YJ. Assessment of medical students' clinical performance using high-fidelity simulation: comparison of peer and instructor assessment. BMC Medical Education 2021;21:506. PMID: 34563180. PMCID: PMC8467013. DOI: 10.1186/s12909-021-02952-w.
Abstract
BACKGROUND High-fidelity simulators are highly useful in assessing clinical competency because they enable reliable and valid evaluation. The importance of peer assessment has recently been highlighted in healthcare education, and studies in fields such as medicine, nursing, dentistry and pharmacy have examined its value. This study aimed to analyse inter-rater reliability between peers and instructors, and to examine differences in their scores, in high-fidelity-simulation-based assessment of medical students' clinical performance. METHODS This study analysed the results of two clinical performance assessments of 34 groups of fifth-year students at Ajou University School of Medicine in 2020, using a modified Queen's Simulation Assessment Tool measuring four categories: primary assessment, diagnostic actions, therapeutic actions and communication. Inter-rater reliability was estimated with the intraclass correlation coefficient, agreement between raters was analysed with the Bland-Altman method, and differences in assessment scores between peers and instructors were analysed with the independent t-test. RESULTS Overall inter-rater reliability of the clinical performance assessments was high. In addition, there were no significant differences between peer and instructor scores in primary assessment, diagnostic actions, therapeutic actions or communication. CONCLUSIONS The results indicate that peer assessment can be as reliable as instructor assessment when evaluating clinical competency using high-fidelity simulators.
Efforts should be made to enable medical students to participate actively as fellow assessors in high-fidelity-simulation-based assessment of clinical performance in situations resembling real clinical settings.
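The Bland-Altman method mentioned above summarizes rater agreement as a mean difference (bias) plus 95% limits of agreement. A minimal sketch, assuming paired peer and instructor scores; the function name and toy scores are illustrative, not from the study:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two raters' paired scores."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    diff = a - b
    bias = diff.mean()        # mean rater difference
    sd = diff.std(ddof=1)     # SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy scores: the peer rates exactly one point higher than the instructor
peer = [8.0, 7.0, 9.0, 6.0]
instructor = [7.0, 6.0, 8.0, 5.0]
bias, lower, upper = bland_altman(peer, instructor)
```

A narrow interval around a near-zero bias is what "no significant difference between peers and instructors" looks like on a Bland-Altman plot.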
Affiliation(s)
- Ji Hye Yu
- Office of Medical Education, Ajou University School of Medicine, Suwon, South Korea
- Mi Jin Lee
- Department of Medical Humanities and Social Medicine, Ajou University School of Medicine, Suwon, South Korea
- Soon Sun Kim
- Department of Gastroenterology, Ajou University School of Medicine, Suwon, South Korea
- Min Jae Yang
- Department of Gastroenterology, Ajou University School of Medicine, Suwon, South Korea
- Hyo Jung Cho
- Department of Gastroenterology, Ajou University School of Medicine, Suwon, South Korea
- Choong Kyun Noh
- Department of Gastroenterology, Ajou University School of Medicine, Suwon, South Korea
- Gil Ho Lee
- Department of Gastroenterology, Ajou University School of Medicine, Suwon, South Korea
- Su Kyung Lee
- Ajou Center for Clinical Excellence, Ajou University School of Medicine, Suwon, South Korea
- Mi Ryoung Song
- Office of Medical Education, Ajou University School of Medicine, Suwon, South Korea
- Jang Hoon Lee
- Department of Pediatrics, Ajou University School of Medicine, Suwon, South Korea
- Miran Kim
- Department of Obstetrics & Gynecology, Ajou University School of Medicine, Suwon, South Korea
- Yun Jung Jung
- Department of Pulmonary and Critical Care Medicine, Ajou University School of Medicine, Suwon, South Korea

20
Maggio LA, Larsen K, Thomas A, Costello JA, Artino AR. Scoping reviews in medical education: A scoping review. Medical Education 2021;55:689-700. PMID: 33300124. PMCID: PMC8247025. DOI: 10.1111/medu.14431.
Abstract
OBJECTIVES Over the last two decades, the number of scoping reviews in core medical education journals has increased by 4200%. Despite this growth, research on scoping reviews provides limited information about their nature, including how they are conducted and why medical educators undertake this type of knowledge synthesis. This gap makes it difficult to know where the field stands and may hamper attempts to improve the conduct, reporting and utility of scoping reviews. This review therefore characterises the nature of medical education scoping reviews to identify areas for improvement and highlight future research opportunities. METHOD The authors searched PubMed for scoping reviews published between January 1999 and April 2020 in 14 medical education journals. The authors extracted and summarised key bibliometric data, the rationales given for conducting a scoping review, the research questions and key reporting elements as described in the PRISMA-ScR. Rationales and research questions were mapped to Arksey and O'Malley's reasons for conducting a scoping review. RESULTS One hundred and one scoping reviews were included. On average, 10.1 scoping reviews (SD = 13.1, median = 4) were published annually, with the most reviews published in 2019 (n = 42). Authors described multiple reasons for undertaking scoping reviews, the most prevalent being to summarise and disseminate research findings (n = 77). In 11 reviews, the rationales for the scoping review and the research questions aligned. No review addressed all elements of the PRISMA-ScR, with few authors publishing a protocol (n = 2) or including stakeholders (n = 20). Authors identified shortcomings of scoping reviews, including lack of critical appraisal. CONCLUSIONS Scoping reviews are increasingly conducted in medical education and published by most core journals.
Scoping reviews aim to map the depth and breadth of emerging topics; as such, they have the potential to play a critical role in the practice, policy and research of medical education. However, these results suggest improvements are needed for this role to be fully realised.
Affiliation(s)
- Lauren A. Maggio
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Kelsey Larsen
- Department of Politics, Security, and International Affairs, University of Central Florida, Orlando, FL, USA
- Aliki Thomas
- School of Physical and Occupational Therapy, Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, QC, Canada
- Anthony R. Artino
- Department of Health, Human Function, and Rehabilitation Sciences, The George Washington University School of Medicine and Health Sciences, Washington, DC, USA

21
Talwalkar JS, Fortin AH, Morrison LJ, Kliger A, Rosenthal DI, Murtha T, Ellman MS. An Advanced Communication Skills Workshop Using Standardized Patients for Senior Medical Students. MedEdPORTAL 2021;17:11163. PMID: 34124349. PMCID: PMC8155077. DOI: 10.15766/mep_2374-8265.11163.
Abstract
INTRODUCTION Medical students often lack training in advanced communication skills for emotionally fraught situations and those in which an intense emotional response is expected. Such skills are required in clinical situations encountered during residency. We created and evaluated an advanced communication skills workshop (ACSW) using standardized patients for senior medical students. The workshop emphasized communication skills for four scenarios (strong emotion, goals of care, medical error and palliative care assessment) and utilized formative peer assessment and feedback. METHODS We created the four ACSW cases, each with a case-specific communication behavior checklist and a common modified Master Interview Rating Scale, within a Capstone Course for senior medical students. In groups of three, students rotated through three of the four stations. At each station, one student conducted the interview while the other two completed the checklists and provided verbal feedback. We performed one-way analyses of variance on Likert responses and content analysis on open responses from a post-ACSW survey. RESULTS Ninety-one students completed the ACSW and survey. Students assigned high value to all four ACSW student roles: interviewer, observer, feedback recipient and feedback provider. Students rated the experience above average to excellent on nearly all survey items. Open-response themes included "liked the opportunity to give or receive peer feedback" (46%) and "found the checklists helpful" (45%). DISCUSSION Feasible and well received by senior medical students, our ACSW offers an opportunity to practice and observe advanced communication skills and peer feedback. As a peer-assisted, formative learning model, the ACSW efficiently addresses a key aspect of residency preparation.
Affiliation(s)
- Jaideep S. Talwalkar
- Associate Professor, Departments of Medicine and Pediatrics, and Director of Clinical Skills, Yale School of Medicine
- Auguste H. Fortin
- Professor, Department of Medicine, and Director of Communication Skills Education, Yale School of Medicine
- Laura J. Morrison
- Associate Professor, Department of Medicine, and Director of Hospice and Palliative Medicine Fellowship, Yale School of Medicine
- Alan Kliger
- Clinical Professor, Department of Medicine, Yale School of Medicine
- David I. Rosenthal
- Assistant Professor, Department of Medicine, and Director of Capstone Course, Yale School of Medicine
- Tanya Murtha
- Assistant Professor, Department of Pediatrics (Critical Care Medicine), Columbia University
- Matthew S. Ellman
- Professor, Department of Medicine, and Director of Medical Student Palliative and End-of-Life Care Education, Yale School of Medicine

22
Rhind SM, MacKay J, Brown AJ, Mosley CJ, Ryan JM, Hughes KJ, Boyd S. Developing Miller's Pyramid to Support Students' Assessment Literacy. Journal of Veterinary Medical Education 2021;48:158-162. PMID: 32149588. DOI: 10.3138/jvme.2019-0058.
Abstract
Assessment literacy is increasingly recognized as an important concept to consider when developing assessment strategies for courses and programs. Assessment literacy approaches support students in their understanding of assessment expectations and help them both understand and optimize their performance in assessment. In this teaching tip, a model for assessment literacy that builds on the well-known Miller's Pyramid model for assessment in clinical disciplines is proposed and contextualized. The model progresses thinking from assessment methods themselves to consideration of the activities that need to be built into curricula to ensure that assessment literacy is addressed at each level of the pyramid. The teaching tip provides specific examples at each of the levels. Finally, the relevance of this work to overall curriculum design is emphasized.
23
Jeong J, Park SY, Sun KH. Agreement between medical students' peer assessments and faculty assessments in advanced resuscitation skills examinations in South Korea. Journal of Educational Evaluation for Health Professions 2021;18:4. PMID: 33761737. PMCID: PMC8089466. DOI: 10.3352/jeehp.2021.18.4.
Abstract
PURPOSE In medical education, peer assessment is considered an effective learning strategy. Although several studies have examined agreement between peer and faculty assessments of basic life support (BLS), few have done so for advanced resuscitation skills (ARS) such as intubation and defibrillation. This study therefore aimed to determine the degree of agreement between medical students' and faculty assessments in ARS examinations. METHODS This retrospective exploratory study was conducted during the emergency medicine clinical clerkship of fourth-year medical students from April to July 2020. A faculty assessor (FA) and a peer assessor (PA) assessed each examinee's resuscitation skills (BLS, intubation and defibrillation) using a checklist of 20 binary items (performed or not performed) and one global proficiency rating on a 5-point Likert scale. After receiving feedback and training, each examinee then served as the PA for the next examinee. All 54 students participated in peer assessment. The assessments of 44 FA/PA pairs were analyzed using the intraclass correlation coefficient (ICC) and Gwet's first-order agreement coefficient. RESULTS PA scores were higher than FA scores (mean ± standard deviation, 20.2 ± 2.5 [FA] vs. 22.3 ± 2.4 [PA]; P < 0.001). Agreement was poor to moderate for the overall checklist (ICC, 0.55; 95% confidence interval [CI], 0.31 to 0.73; P < 0.01), BLS (ICC, 0.19; 95% CI, -0.11 to 0.46; P < 0.10), intubation (ICC, 0.51; 95% CI, 0.26 to 0.70; P < 0.01) and defibrillation (ICC, 0.49; 95% CI, 0.23 to 0.68; P < 0.01). CONCLUSION Senior medical students' assessments showed unreliable agreement with faculty assessments of ARS. If peer assessment is planned in skills education, comprehensive preparation and sufficient assessor training should be provided in advance.
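Gwet's first-order agreement coefficient (AC1), used above alongside the ICC, is designed to be more stable than kappa when item prevalence is skewed, as with mostly-"performed" checklist items. A rough sketch for two raters scoring binary items; the function name and toy checklists are my own, not the study's:

```python
import numpy as np

def gwet_ac1(r1, r2):
    """Gwet's AC1 for two raters scoring the same binary (0/1) items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    pa = np.mean(r1 == r2)                  # observed agreement
    pi = (np.mean(r1) + np.mean(r2)) / 2    # mean prevalence of 'performed'
    pe = 2 * pi * (1 - pi)                  # chance agreement under AC1
    return (pa - pe) / (1 - pe)

# Toy 20-item checklists: a faculty assessor marks 15 items performed,
# a (more lenient) peer assessor marks 17, with 18/20 items agreeing
fa = np.array([1] * 15 + [0] * 5)
peer = np.array([1] * 17 + [0] * 3)
ac1 = gwet_ac1(fa, peer)
```

Because AC1's chance-agreement term shrinks as prevalence moves away from 0.5, it rewards raters for agreeing on common items less harshly than kappa does.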
Affiliation(s)
- Jinwoo Jeong
- Department of Emergency Medicine, Dong-A University College of Medicine, Busan, Korea
- Song Yi Park
- Department of Emergency Medicine, Dong-A University College of Medicine, Busan, Korea
- Department of Medical Education, Dong-A University College of Medicine, Busan, Korea
- Kyung Hoon Sun
- Department of Emergency Medicine, Chosun University College of Medicine, Gwangju, Korea

24
Abstract
OBJECTIVES Formative peer assessment focuses on learning and the development of the student learning process: students take responsibility for assessing the work of their peers by giving and receiving feedback. The aim was to compile research on formative peer assessment in higher healthcare education, focusing on the rationale, the interventions, the experiences of students and teachers, and the outcomes of formative assessment interventions. DESIGN A scoping review. DATA SOURCES Searches were conducted until May 2019 in PubMed, Cumulative Index to Nursing and Allied Health Literature, Education Research Complete and Education Research Centre. Grey literature was searched in Library Search, Google Scholar and Science Direct. ELIGIBILITY CRITERIA Studies addressing formative peer assessment in higher education, focusing on medicine, nursing, midwifery, dentistry, physical or occupational therapy and radiology, published in peer-reviewed articles or grey literature. DATA EXTRACTION AND SYNTHESIS Of 1452 studies, 37 met the inclusion criteria and were critically appraised using the relevant Critical Appraisal Skills Programme, Joanna Briggs Institute and Mixed Methods Appraisal Tool instruments. The pertinent data were analysed using thematic analysis. RESULTS The critical appraisal resulted in 18 included studies of high or moderate quality. The rationale for using formative peer assessment relates to giving and receiving constructive feedback as a means to promote learning. The experiences and outcomes of formative peer assessment interventions, from the perspectives of students and teachers, are presented within three themes: (1) organisation and structure of the formative peer assessment activities, (2) personal attributes and consequences for oneself and relationships, and (3) experience and outcome of feedback and learning.
CONCLUSION Healthcare education must consider preparing and introducing students to collaborative learning, and thus develop well-designed learning activities aligned with the learning outcomes. Since peer collaboration seems to affect students' and teachers' experiences of formative peer assessment, empirical investigations exploring collaboration between students are of utmost importance.
Affiliation(s)
- Marie Stenberg
- Department of Care Science, Faculty of Health and Society, Malmö University, Malmö, Sweden
- Elisabeth Mangrio
- Department of Care Science, Faculty of Health and Society, Malmö University, Malmö, Sweden
- Mariette Bengtsson
- Department of Care Science, Faculty of Health and Society, Malmö University, Malmö, Sweden
- Elisabeth Carlson
- Department of Care Science, Faculty of Health and Society, Malmö University, Malmö, Sweden

25
Möltner A, Lehmann M, Wachter C, Kurczyk S, Schwill S, Loukanova S. Formative assessment of practical skills with peer-assessors: quality features of an OSCE in general medicine at the Heidelberg Medical Faculty. GMS Journal for Medical Education 2020;37:Doc42. PMID: 32685670. PMCID: PMC7346287. DOI: 10.3205/zma001335.
Abstract
Background: Objective Structured Clinical Examinations (OSCEs) have become an established examination format at German medical faculties. Medical experts routinely use them summatively to evaluate practical and communicative skills, while the use of the OSCE format with student examiners, as a formative examination, remains limited. Objective: The formative OSCE programme of the Department of General Practice and Implementation Research at the Heidelberg Medical Faculty, which is conducted and evaluated by peer tutors, was examined with regard to its quality criteria and compared with summative OSCEs from other departments. Methods: The difficulty and discriminatory power of individual testing stations were determined for the formative as well as the summative OSCEs and compared with each other. To assess measurement reliability, the data were analysed using generalizability theory. In addition, the assessments of student examiners were compared with second assessments by medical experts. Results: The stations of the formative OSCE showed difficulties similar to those of the summative comparison OSCEs (P_form = 0.882; P_sum = 0.845-0.902). With respect to measurement reliability, there were no differences between the OSCE in general medicine and those in the other subjects. The assessments of student examiners and medical experts correlated highly (r = 0.888). Conclusion: The formative OSCE in general medicine is comparable to the summative comparison formats in terms of its quality criteria. Student examiners can be a reliable alternative to medical experts in formative OSCEs.
Affiliation(s)
- Andreas Möltner
- University Heidelberg, Baden-Württemberg Center of Excellence for Assessment in Medicine, Heidelberg, Germany
- Mirijam Lehmann
- University Heidelberg, Baden-Württemberg Center of Excellence for Assessment in Medicine, Heidelberg, Germany
- Cornelia Wachter
- University Heidelberg, Medical Faculty, Department of General Practice and Implementation Research, Heidelberg, Germany
- Sonia Kurczyk
- University Heidelberg, Medical Faculty, Department of General Practice and Implementation Research, Heidelberg, Germany
- Simon Schwill
- University Heidelberg, Medical Faculty, Department of General Practice and Implementation Research, Heidelberg, Germany
- Svetla Loukanova
- University Heidelberg, Medical Faculty, Department of General Practice and Implementation Research, Heidelberg, Germany

26
Talwalkar JS, Murtha TD, Prozora S, Fortin AH, Morrison LJ, Ellman MS. Assessing Advanced Communication Skills via Objective Structured Clinical Examination: A Comparison of Faculty Versus Self, Peer, and Standardized Patient Assessors. Teaching and Learning in Medicine 2020;32:294-307. PMID: 32141335. DOI: 10.1080/10401334.2019.1704763.
Abstract
Construct: The construct addressed in this study is assessment of advanced communication skills among senior medical students. Background: The question of who should assess participants during objective structured clinical examinations (OSCEs) has been debated, and options discussed in the literature include peer, self, standardized patient and faculty assessment models. What is not known is whether same-level peer-assisted learning can be utilized for formative assessment of advanced communication skills when no faculty, standardized patients or other trained assessors are involved in providing feedback. If successful, such an educational model would optimize resource utilization and broaden the scope of topics that could be covered in formative OSCEs. Approach: The investigators developed a 4-station formative OSCE focused on advanced communication skills for senior medical students and evaluated the concordance of assessments by same-level peers, self, standardized patients and faculty for 45 students. After each station, examinees completed a self-assessment checklist and received checklist-based assessment and verbal feedback from same-level peers only. Standardized patients completed checklist-based assessments outside the room, and faculty did so after the OSCE via video review; neither group provided direct feedback to examinees. The investigators assessed inter-rater agreement and mean difference scores on the checklists, using the faculty score as the gold standard. Findings: There was fair to good overall agreement among self, same-level peer, standardized patient and faculty assessments of advanced communication skills. Relative to faculty, peer and standardized patient assessors overestimated advanced communication skills, while self-assessments underestimated them.
Conclusions: Self and same-level peer assessment may be a viable alternative to faculty assessment for a formative OSCE on advanced communication skills for senior medical students.
Affiliation(s)
- Jaideep S Talwalkar
- Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut, USA
- Department of Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA
- Tanya D Murtha
- Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut, USA
- Stephanie Prozora
- Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut, USA
- Auguste H Fortin
- Department of Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA
- Laura J Morrison
- Department of Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA
- Matthew S Ellman
- Department of Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA

27
Khan R, Chahine S, Macaluso S, Viana R, Cassidy C, Miller T, Bartley D, Payne M. Impressions on Reliability and Students' Perceptions of Learning in a Peer-Based OSCE. Medical Science Educator 2020;30:429-437. PMID: 34457686. PMCID: PMC8368308. DOI: 10.1007/s40670-020-00923-2.
Abstract
BACKGROUND Peer assessment of performance in the objective structured clinical examination (OSCE) is emerging as a learning instrument. While peers can provide reliable scores, there may be a trade-off with students' learning. The purpose of this study was to evaluate a peer-based OSCE as a viable assessment instrument, assess its potential to promote learning and explore the interplay between these two roles. METHODS A total of 334 medical students completed an 11-station OSCE from 2015 to 2016. Each station had one or two peer examiners (PE) and one faculty examiner (FE). Examinees were rated on a 7-point scale across five dimensions: Look, Feel, Move, Special Tests and Global Impression. Students participated in voluntary focus groups in 2016 to provide qualitative feedback on the OSCE. The authors analysed the assessment data and transcripts of the focus group discussions. RESULTS Overall, PEs awarded higher ratings than FEs; sources of variance were similar across the two years, with unique variance consistently the largest source; and reliability (r_phi) was generally low. Focus group analysis revealed four themes: Conferring with Faculty Examiners, Difficulty Rating Peers, Insider Knowledge, and Observing and Scoring. CONCLUSIONS While peer assessment was not reliable for evaluating OSCE performance, PEs perceived that it was beneficial for their learning. Insight gained into exam technique and self-appraisal of skills allows students to understand expectations in clinical situations and plan approaches to self-assessment of competence.
Affiliation(s)
- Rishad Khan
- Department of Medicine, Schulich School of Medicine and Dentistry, Western University, 1151 Richmond Street North, London, ON N6A 3K7, Canada
- Saad Chahine
- Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, 1151 Richmond Street North, London, ON N6A 3K7, Canada
- Steven Macaluso
- Department of Physical Medicine and Rehabilitation, Schulich School of Medicine and Dentistry, Western University, London, ON N6A 3K7, Canada
- Ricardo Viana
- Department of Physical Medicine and Rehabilitation, Schulich School of Medicine and Dentistry, Western University, London, ON N6A 3K7, Canada
- Caitlin Cassidy
- Department of Physical Medicine and Rehabilitation, Schulich School of Medicine and Dentistry, Western University, London, ON N6A 3K7, Canada
- Thomas Miller
- Department of Physical Medicine and Rehabilitation, Schulich School of Medicine and Dentistry, Western University, London, ON N6A 3K7, Canada
- Debra Bartley
- Department of Surgery, Schulich School of Medicine and Dentistry, Western University, 1151 Richmond Street North, London, ON N6A 3K7, Canada
- Department of Paediatrics, Schulich School of Medicine and Dentistry, Western University, 1151 Richmond Street North, London, ON N6A 3K7, Canada
- Michael Payne
- Department of Physical Medicine and Rehabilitation, Schulich School of Medicine and Dentistry, Western University, London, ON N6A 3K7, Canada

28
Schwill S, Fahrbach-Veeser J, Moeltner A, Eicher C, Kurczyk S, Pfisterer D, Szecsenyi J, Loukanova S. Peers as OSCE assessors for junior medical students - a review of routine use: a mixed methods study. BMC Medical Education 2020;20:17. PMID: 31948425. PMCID: PMC6966898. DOI: 10.1186/s12909-019-1898-y.
Abstract
BACKGROUND Peer-assisted learning is well established in medical education; however, peer tutors rarely act as assessors for the OSCE. In the compulsory, near-peer teaching programme covering basic medical skills at the University of Heidelberg, peer tutors serve as assessors on a formative OSCE. This study aimed to investigate the feasibility and acceptance of peer assessors and to survey the perceived advantages and disadvantages of their use. METHODS In 2016 and 2017 all OSCE peer assessors (third to sixth-year medical students) and all of the peer-assessed students in 2017 (second-year-medical students) were invited to participate in a survey. Both groups were asked to complete a tablet-based questionnaire immediately after the OSCE. Peer assessors were asked to rate eight statements and the peer-assessed students to rate seven statements on a five-point Likert scale. Both were asked to comment on the advantages and disadvantages of peer-assessors. RESULTS Overall, 74 of 76 peer assessors and 307 of 308 peer-assessed students participated in the study. 94% (67/74) of peer assessors and 90% (276/307) of the peer-assessed group thought that it is important to have peer tutors as assessors. Of the peer assessors, 92% (68/74) felt confident in giving structured feedback during the OSCE and 66% (49/74) felt they had improved their teaching skills. Of the peer-assessed students, 99% (306/307) were satisfied with their peers as OSCE assessors and 96% (292/307) considered the peer feedback during the OSCE as helpful. The participants mentioned structural benefits, such as lower costs, and suggested the quality of the OSCE was higher due to the use of peer assessors. The use of peer assessors was found to be beneficial for the learners in the form of high-quality feedback and an overall reduction in stress. Furthermore, the use of peer assessors was found to be beneficial for the peer assessors (improved teaching and clinical skills). 
CONCLUSION From the learners' perspective, the use of peer assessors for a formative OSCE within a near-peer teaching programme aimed at junior medical students is favourable for all involved. Broader implementation of peer assessment in formative OSCEs should be encouraged, alongside investigation of its effects on quality and stress reduction.
Affiliation(s)
- Simon Schwill, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- Johanna Fahrbach-Veeser, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- Andreas Moeltner, Competence Center Assessment in Medical Education, University of Heidelberg, Heidelberg, Germany
- Christiane Eicher, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- Sonia Kurczyk, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- David Pfisterer, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- Joachim Szecsenyi, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
- Svetla Loukanova, Department of General Practice and Health Services Research, University Hospital Heidelberg, Heidelberg, Germany
|
29
|
Hegg RM, Ivan KF, Tone J, Morten A. Comparison of peer assessment and faculty assessment in an interprofessional simulation-based team training program. Nurse Educ Pract 2020; 42:102666. [PMID: 31734516 DOI: 10.1016/j.nepr.2019.102666] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2018] [Revised: 06/25/2019] [Accepted: 11/08/2019] [Indexed: 11/28/2022]
Abstract
Challenges related to limited clinical sites and a shortage of clinical instructors may reduce the quality of clinical experiences, leading to increased demand for simulation-based training programs in the curricula of educational institutions. However, simulation-based training programs in health education place great demands on faculty resources. It is therefore of interest to investigate peers' contributions to formal assessment, and how these compare with faculty assessment. This paper reports the results of a comparison between direct observation by peer observers who had received brief rater training and post-hoc video-based assessment by trained facilitators. An observation form with six learning outcomes was used to rate team performance. Altogether, 262 postgraduate nursing students, bachelor of nursing students and medical students participated, organized into 44 interprofessional teams. A total of 84 peers and two facilitators rated team performance. The sum score across all six learning outcomes showed that facilitators were more lenient than peer observers (p = .014). Inter-rater reliability varied considerably when comparing scores from peer observers of the three different professions with those of the facilitators. The results indicate that peer assessment may support, but not replace, faculty assessment.
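The leniency finding above (facilitators vs. peer observers, p = .014) rests on comparing paired sum scores for the same teams. As an illustrative sketch only (the abstract does not specify the exact test or software used, and obtaining the p-value would additionally require the t distribution's CDF), a paired t statistic on per-team scores can be computed as:

```python
from math import sqrt

def paired_t_statistic(peer_scores, facilitator_scores):
    """Paired t statistic on per-team sum scores; positive values mean
    the facilitators scored higher (were more lenient) on average."""
    diffs = [f - p for p, f in zip(peer_scores, facilitator_scores)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the paired differences (n - 1 denominator)
    sample_var = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / sqrt(sample_var / n)

# Hypothetical per-team sum scores, for illustration only
peers = [10, 10, 10, 10]
facilitators = [11, 12, 13, 12]
t = paired_t_statistic(peers, facilitators)
```

The statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p-value.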
Affiliation(s)
- Reime Marit Hegg, Department of Health and Caring Sciences, Faculty of Health and Social Sciences, Western Norway University of Applied Science, Inndalsveien 28, 5063, Bergen, Norway
- Kvam Fred Ivan, Department of Health and Caring Sciences, Faculty of Health and Social Sciences, Western Norway University of Applied Science, Inndalsveien 28, 5063, Bergen, Norway
- Johnsgaard Tone, Department of Health and Caring Sciences, Faculty of Health and Social Sciences, Western Norway University of Applied Science, Inndalsveien 28, 5063, Bergen, Norway
- Aarflot Morten, Department of Health and Caring Sciences, Faculty of Health and Social Sciences, Western Norway University of Applied Science, Inndalsveien 28, 5063, Bergen, Norway
|
30
|
Madrazo L, Lee CB, McConnell M, Khamisa K, Pugh D. No observed effect from a student-led mock objective structured clinical examination on subsequent performance scores in medical students in Canada. JOURNAL OF EDUCATIONAL EVALUATION FOR HEALTH PROFESSIONS 2019; 16:14. [PMID: 31129947 PMCID: PMC6609294 DOI: 10.3352/jeehp.2019.16.14] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/06/2019] [Accepted: 05/27/2019] [Indexed: 06/01/2023]
Abstract
Student-led peer-assisted mock objective structured clinical examinations (MOSCEs) have been used in different settings to help students prepare for subsequent higher-stakes, faculty-run OSCEs. MOSCE participants generally value feedback from peers and report benefits to learning. Our study investigated whether participation in a peer-assisted MOSCE affects subsequent OSCE performance. To determine whether mean OSCE scores differed depending on whether medical students participated in the MOSCE, we conducted a between-subjects analysis of variance (ANOVA), with cohort (2016 vs. 2017) and MOSCE participation (MOSCE vs. no MOSCE) as independent variables and mean OSCE score as the dependent variable. Participation in the MOSCE had no influence on mean OSCE scores (P=0.19). There was a significant correlation between mean MOSCE scores and mean OSCE scores (Pearson's r = 0.52, P<0.001). Whereas previous studies report self-perceived benefits from participation in student-led MOSCEs, participation was not associated with objective performance benefits in this study.
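The headline association above is a Pearson product-moment correlation (r = 0.52) between paired MOSCE and OSCE means. As an illustrative sketch (not the authors' actual analysis pipeline; the sample data below are hypothetical), the coefficient can be computed directly from its definition:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired mean scores (MOSCE, OSCE), for illustration only
mosce = [62.0, 70.5, 55.0, 81.0]
osce = [68.0, 74.0, 60.5, 79.0]
r = pearson_r(mosce, osce)
```

Note that a strong correlation between the two scores is compatible with no mean difference between participants and non-participants, which is what the ANOVA result reflects.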
Affiliation(s)
- Lorenzo Madrazo, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Claire Bo Lee, Department of Medicine, McGill University, Montreal, QC, Canada
- Meghan McConnell, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada; Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Karima Khamisa, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Debra Pugh, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada; Department of Medicine, University of Ottawa, Ottawa, ON, Canada; Medical Council of Canada, Ottawa, ON, Canada
|
31
|
Lyngå P, Masiello I, Karlgren K, Joelsson-Alm E. Experiences of using an OSCE protocol in clinical examinations of nursing students - A comparison of student and faculty assessments. Nurse Educ Pract 2019; 35:130-134. [PMID: 30802783 DOI: 10.1016/j.nepr.2019.02.004] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2018] [Revised: 10/26/2018] [Accepted: 02/12/2019] [Indexed: 11/18/2022]
Abstract
Peer assessment using an OSCE protocol is an increasingly common educational activity in nursing education that complements assessment by teachers. However, little is known about how students' and teachers' assessments correspond. This study aimed to compare OSCE assessments made by student examiners and faculty examiners during examinations of clinical skills in undergraduate nursing education. Four cohorts of third-year nursing students participated between 2014 and 2016. The students underwent a clinical examination of the management of central venous catheters and totally implantable venous access devices. Students performing the examinations were observed by both a faculty examiner and a student examiner, each completing the same OSCE protocol independently. The OSCE protocols from faculty and student examiners were then reviewed and compared. Total agreement between the student and faculty examiner was reached in 127 of 135 (94%) paired protocols. The level of agreement was substantial, with a kappa value of 0.79 (95% CI 0.65-0.93). The conclusion was that the level of agreement between student and faculty examiners was high when using an OSCE protocol in clinical examinations of two different clinical skill tasks. The structured checklist (OSCE protocol) was easy for the student examiners to use despite their lack of prior experience or training.
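The agreement statistic reported above is Cohen's kappa, which corrects raw percent agreement (here 94%) for the agreement expected by chance from each rater's marginal rating frequencies. A minimal sketch, not the authors' actual tooling, with hypothetical pass/fail rating vectors:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same set of categorical judgements.
    Returns 1.0 for perfect agreement, 0.0 for chance-level agreement."""
    n = len(r1)
    categories = set(r1) | set(r2)
    # Observed proportion of exact agreements
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal category frequencies
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical pass (1) / fail (0) judgements, for illustration only
student_examiner = [1, 1, 1, 0, 1, 0, 1, 1]
faculty_examiner = [1, 1, 1, 0, 1, 1, 1, 1]
kappa = cohens_kappa(student_examiner, faculty_examiner)
```

This is why a high percent agreement (94%) maps to a kappa of 0.79 rather than 0.94: part of the raw agreement is attributable to both raters passing most candidates.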
Affiliation(s)
- Patrik Lyngå, Department of Clinical Science and Education, Södersjukhuset, Karolinska Institutet, Stockholm, Sweden
- Italo Masiello, Department of Clinical Science and Education, Södersjukhuset, Karolinska Institutet, Stockholm, Sweden; Department of Research, Education, Development and Innovation, Södersjukhuset, Sweden
- Klas Karlgren, Department of Research, Education, Development and Innovation, Södersjukhuset, Sweden; Department of Learning, Informatics, Management and Ethics, Karolinska Institutet, Stockholm, Sweden
- Eva Joelsson-Alm, Department of Clinical Science and Education, Södersjukhuset, Karolinska Institutet, Stockholm, Sweden
|
32
|
Freytag J, Stroben F, Hautz WE, Schauber SK, Kämmer JE. Rating the quality of teamwork-a comparison of novice and expert ratings using the Team Emergency Assessment Measure (TEAM) in simulated emergencies. Scand J Trauma Resusc Emerg Med 2019; 27:12. [PMID: 30736821 PMCID: PMC6368771 DOI: 10.1186/s13049-019-0591-9] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2018] [Accepted: 11/14/2018] [Indexed: 01/01/2023] Open
Abstract
Background Training in teamwork behaviour improves technical resuscitation performance. However, its effect on patient outcome is less clear, partly because teamwork behaviour is difficult to measure. Furthermore, it is unknown who should evaluate it. In clinical practice, experts are obliged to participate in resuscitation efforts and are thus unavailable to assess teamwork quality. Consequently, we sought to determine whether raters with little clinical experience and experts provide comparable evaluations of teamwork behaviour. Methods Novice and expert raters judged teamwork behaviour during 6 emergency medicine simulations using the Team Emergency Assessment Measure (TEAM). Ratings of both groups were analysed descriptively and compared with U and t tests. We used a mixed effects model to identify the proportion of variance in TEAM scores attributable to rater status and other sources. Results Twelve raters evaluated 7 teams rotating through 6 cases, for a total of 84 observations. We found no significant difference between expert and novice ratings for 7 of the 11 items of the TEAM or in the sums of all item scores. Novices rated teamwork behaviour higher on the remaining 4 items and overall. Rater status accounted for 11.1% of the total variance in scores. Conclusions Experts' and novices' ratings were similarly distributed, implying that raters with limited experience can provide reliable data on teamwork behaviour. Novices show a consistent, but slightly more lenient, rating behaviour. Clinical studies and real-life teams may thus employ novices using a structured observational tool such as TEAM to inform their performance review and improvement.
Affiliation(s)
- Julia Freytag, Simulated Patients Program, Office of the Vice Dean for Teaching and Learning, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Fabian Stroben, Lernzentrum, Office of the Vice Dean for Teaching and Learning, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany; AG Progress Test Medizin, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Wolf E Hautz, Department of Emergency Medicine, Inselspital, Bern University Hospital, University of Bern, Freiburgstrasse 4, 3010, Bern, Switzerland; Centre for Health Sciences Education, University of Oslo, Gaustadalléen 30, 0373, Oslo, Norway
- Stefan K Schauber, Centre for Health Sciences Education, University of Oslo, Gaustadalléen 30, 0373, Oslo, Norway
- Juliane E Kämmer, AG Progress Test Medizin, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany; Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195, Berlin, Germany
|
33
|
Lee CB, Madrazo L, Khan U, Thangarasa T, McConnell M, Khamisa K. A student-initiated objective structured clinical examination as a sustainable cost-effective learning experience. MEDICAL EDUCATION ONLINE 2018; 23:1440111. [PMID: 29480155 PMCID: PMC5827782 DOI: 10.1080/10872981.2018.1440111] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/11/2017] [Accepted: 02/07/2018] [Indexed: 05/22/2023]
Abstract
BACKGROUND The objective structured clinical examination (OSCE) has gained widespread use as a form of performance assessment. However, opportunities for students to participate in practice OSCEs are limited by the financial, faculty and administrative investments required. OBJECTIVES To determine the feasibility and acceptability of a student-run mock OSCE (MOSCE) as a learning experience for medical students across all 4 years. DESIGN We conducted a five-station MOSCE for third-year students, with fourth-year students as examiners and first- and second-year students as standardized patients (SPs). Each examiner scored examinees using a checklist and global rating scale while providing written and verbal feedback. MOSCE stations and checklists were designed by students and reviewed by a faculty supervisor. Following the MOSCE, participants completed surveys eliciting their perceptions of the roles they took during the MOSCE. RESULTS Fifty examinees participated in the MOSCE. Of these, 42 (84%) consented to participate in the study and submitted completed questionnaires. Twenty-four examiners took part in the MOSCE and consented to participate in the study, with 22 (92%) submitting completed questionnaires. Fifty-three of 60 SPs (88%) agreed to take part in this study, and 51 (85%) completed questionnaires. The internal consistency of the five-station MOSCE, calculated as Cronbach's alpha, was 0.443. Students commented positively on having the opportunity to network, engage in mentorship activities and reinforce clinical concepts. CONCLUSIONS Examinees, examiners, and SPs all perceived the MOSCE to be a beneficial learning experience. We found the MOSCE to be a feasible and acceptable means of providing additional OSCE practice to students prior to higher-stakes evaluations.
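Cronbach's alpha, the internal-consistency statistic reported above (0.443), relates the sum of per-station score variances to the variance of examinees' total scores. A minimal sketch assuming complete per-station score lists (illustrative only, not the authors' computation; the score data below are hypothetical):

```python
def cronbach_alpha(station_scores):
    """Cronbach's alpha for a station-by-examinee score matrix.
    station_scores: one inner list per station, each the same length
    (one score per examinee)."""
    k = len(station_scores)       # number of stations (items)
    n = len(station_scores[0])    # number of examinees

    def pvar(xs):
        # Population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each examinee's total score across all stations
    totals = [sum(station[i] for station in station_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(pvar(s) for s in station_scores) / pvar(totals))

# Hypothetical five-station scores for six examinees, for illustration only
scores = [
    [7, 8, 6, 9, 5, 7],
    [6, 9, 5, 8, 6, 7],
    [8, 7, 6, 9, 4, 6],
    [5, 8, 7, 7, 6, 8],
    [7, 9, 5, 8, 5, 7],
]
alpha = cronbach_alpha(scores)
```

A low alpha such as 0.443 indicates that station scores do not strongly covary across examinees, which is common for short OSCEs that deliberately sample distinct clinical skills.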
Affiliation(s)
- Claire B. Lee (contact author), Department of Medicine, McGill University, Montreal, QC, Canada
- Lorenzo Madrazo, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Usman Khan, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Karima Khamisa, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
|