1
Sahi N, Humphrey-Murto S, Brennan EE, O'Brien M, Hall AK. Current use of simulation for EPA assessment in emergency medicine. Can J Emerg Med 2024; 26:179-187. [PMID: 38374281] [DOI: 10.1007/s43678-024-00649-9]
Abstract
OBJECTIVE Approximately five years ago, Royal College emergency medicine (EM) programs in Canada implemented a competency-based paradigm and introduced Entrustable Professional Activities (EPAs), units of professional activity used to assess trainees. Many competency-based medical education (CBME) curricula involve assessing for entrustment through observation of EPAs. While EPAs are frequently assessed in clinical settings, simulation is also used. This study aimed to characterize the use of simulation for EPA assessment. METHODS An interview guide was developed jointly by all study authors, following best practices for survey development. National interviews were conducted with program directors (PDs) or assistant program directors from all Royal College emergency medicine programs in Canada. Interviews were conducted, recorded, and transcribed using Microsoft Teams. Sample transcripts were analyzed for theme development, and themes were then reviewed by co-authors to ensure they were representative of participants' views. RESULTS A 64.7% response rate was achieved. Simulation has been widely adopted by EM training programs. All interviewees supported the use of simulation for EPA assessment for a variety of reasons; however, PDs acknowledged limitations, and thematic analysis revealed tensions in its use. Six major themes emerged: widespread support for the use of simulation for EPA assessment, concern that EPA assessment could become a "tick-box" exercise, logistical barriers limiting its use, varied perceptions of its authenticity, the potential for simulation-based EPA assessment to compromise learner psychological safety, and suggestions for optimizing its use. CONCLUSIONS Our findings offer insight for other programs and specialties into how simulation can best be used for EPA assessment and should inform programs considering this approach.
Affiliation(s)
- Nidhi Sahi
- Department of Innovation in Medical Education (DIME), University of Ottawa, Ottawa, ON, Canada.
- Susan Humphrey-Murto
- Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Tier 2 Research Chair in Medical Education and Fellowship Director, Medical Education Research, University of Ottawa, Ottawa, ON, Canada
- Erin E Brennan
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Michael O'Brien
- Emergency Medicine, The Ottawa Hospital, Ottawa, ON, Canada
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, ON, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
2
Seed JD, Gauthier S, Zevin B, Hall AK, Chaplin T. Simulation vs workplace-based assessment in resuscitation: a cross-specialty descriptive analysis and comparison. Canadian Medical Education Journal 2023; 14:92-98. [PMID: 37465738] [PMCID: PMC10351640] [DOI: 10.36834/cmej.73692]
Abstract
Background Simulation-based assessment can complement workplace-based assessment of rare or difficult-to-assess Entrustable Professional Activities (EPAs). We aimed to compare the use of simulation-based assessment for resuscitation-focused EPAs in three postgraduate medical training programs and to describe faculty perceptions of simulation-based assessment. Methods EPA assessment scores and setting (simulation or workplace) were extracted from 2017-2020 for internal medicine, emergency medicine, and surgical foundations residents at the Transition to Discipline and Foundations of Discipline stages. A questionnaire was distributed to clinical competency committee members. Results Eleven percent of EPA assessments were simulation-based. The proportion of simulation-based assessment did not differ between programs but differed between the Transition (38%) and Foundations (4%) stages within surgical foundations only. Entrustment scores differed between settings in emergency medicine at the Transition level only (simulation: 4.82 ± 0.60; workplace: 3.74 ± 0.93). Seventy percent of committee members (n = 20) completed the questionnaire. Of those who use simulation-based assessments, 45% interpret them differently from workplace-based assessments; 73% and 100% trust simulation for high-stakes and low-stakes assessment, respectively. Conclusions The proportion of simulation-based assessment for resuscitation-focused EPAs did not differ between three postgraduate medical training programs. Interpretation of simulation-based assessment data was inconsistent between committee members. All respondents trust simulation-based assessment for low-stakes assessment, and the majority trust it for high-stakes assessment. These findings have practical implications for the integration of simulation into programs of assessment.
Affiliation(s)
- Jeremy D Seed
- Department of Emergency Medicine, Queen's University, Ontario, Canada
- Boris Zevin
- Department of Surgery, Queen's University, Ontario, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ontario, Canada
- Timothy Chaplin
- Department of Emergency Medicine, Queen's University, Ontario, Canada
3
Mackenzie MJ, Hagel C, Lin Y, Hall AK, Grant VJ, Doshi S. The Reliability of the Resuscitation Assessment Tool (RAT) in Assessing Emergency Medicine Resident Competence in Pediatric Resuscitation Scenarios: A Prospective Observational Pilot Study. Cureus 2023; 15:e35869. [PMID: 37033538] [PMCID: PMC10079254] [DOI: 10.7759/cureus.35869]
Abstract
Introduction Emergency medicine (EM) postgraduate medical education in Canada has transitioned from traditional time-based training to competency-based medical education (CBME). To promote residents through the stages of training, simulated assessments are needed to evaluate residents in high-stakes but low-frequency medical emergencies. There remains a gap in the literature on the use of evaluative tools in simulation, such as the Resuscitation Assessment Tool (RAT), within the new CBME curriculum design. Methods We completed a pilot study of resident physicians in one Canadian EM training program to evaluate the effectiveness and reliability of a simulation-based RAT for pediatric resuscitation. We recorded 10 EM trainees completing simulated scenarios and had nine EM physicians use the RAT to evaluate their performances. Generalizability theory was used to evaluate the reliability of the RAT. Results The mean RAT scores for the management of pediatric myocarditis, cardiac arrest, and septic shock (appendicitis) across raters were 3.70, 3.73, and 4.50, respectively. The overall generalizability coefficient for testing simulated pediatric performance competency was 0.77 for internal consistency and 0.75 for absolute agreement. The performance of senior participants was superior to that of junior participants in the management of pediatric myocarditis (p = 0.01), but the difference was not statistically significant in the management of pediatric septic shock (p = 0.77) or cardiac arrest (p = 0.61). Conclusion Overall, our findings suggest that, with appropriately chosen simulated scenarios, the RAT can be used effectively in simulation of high-stakes, low-frequency scenarios to enhance the new CBME curriculum in emergency medicine training programs.
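The coefficients reported above come from generalizability (G) theory. For reference only, the sketch below shows how relative and absolute generalizability coefficients are typically estimated for a fully crossed trainee × rater design; it uses toy data and a hypothetical function name, and it is not the authors' actual analysis or data.

```python
import numpy as np

def g_coefficients(scores: np.ndarray):
    """Illustrative sketch: relative (G) and absolute (Phi) coefficients
    for a fully crossed persons x raters design (one observation per cell)."""
    n_p, n_r = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)

    # Mean squares from a two-way ANOVA without replication
    ms_p = n_r * np.sum((person_means - grand) ** 2) / (n_p - 1)
    ms_r = n_p * np.sum((rater_means - grand) ** 2) / (n_r - 1)
    resid = scores - person_means[:, None] - rater_means[None, :] + grand
    ms_e = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))

    # Variance components (negative estimates truncated at zero)
    var_e = ms_e
    var_p = max((ms_p - ms_e) / n_r, 0.0)
    var_r = max((ms_r - ms_e) / n_p, 0.0)

    # Relative coefficient ignores rater main effects; absolute includes them
    g_relative = var_p / (var_p + var_e / n_r)
    phi_absolute = var_p / (var_p + (var_r + var_e) / n_r)
    return g_relative, phi_absolute

# Toy data only: 10 trainees rated by 9 raters on a 1-5 scale
rng = np.random.default_rng(0)
ability = rng.normal(4.0, 0.5, size=(10, 1))    # trainee effect
severity = rng.normal(0.0, 0.2, size=(1, 9))    # rater leniency/severity
noise = rng.normal(0.0, 0.4, size=(10, 9))
toy_scores = np.clip(ability + severity + noise, 1, 5)
print(g_coefficients(toy_scores))
```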
4
Spencer M, Sherbino J, Hatala R. Examining the validity argument for the Ottawa Surgical Competency Operating Room Evaluation (OSCORE): a systematic review and narrative synthesis. Advances in Health Sciences Education: Theory and Practice 2022; 27:659-689. [PMID: 35511356] [DOI: 10.1007/s10459-022-10114-w]
Abstract
The Ottawa Surgical Competency Operating Room Evaluation (OSCORE) is an assessment tool that has gained prominence in postgraduate competency-based training programs. We undertook a systematic review and narrative synthesis to articulate the underlying validity argument in support of this tool. Although the OSCORE was originally developed to assess readiness for independent performance of a procedure, contemporary implementations also use it for entrustment-supervision decisions. We used systematic review methodology to search, identify, appraise, and abstract relevant articles published from 2005 to September 2020 across the MEDLINE, EMBASE, and Google Scholar databases. Nineteen original, English-language, quantitative or qualitative articles addressing the use of the OSCORE for health professionals' assessment were included. We organized and synthesized the validity evidence according to Kane's framework, articulating the validity argument and identifying evidence gaps. We demonstrate a reasonable validity argument for the OSCORE in surgical specialties, based on assessing surgical competence as readiness for independent performance of a given procedure, which relates to ad hoc, retrospective entrustment-supervision decisions. The scoring, generalization, and extrapolation inferences are well supported. However, there is a notable lack of implications evidence focused on the impact of the OSCORE on summative decision-making within surgical training programs. In non-surgical specialties, the interpretation/use argument for the OSCORE has not been clearly articulated: the OSCORE has been reduced to a single-item global rating scale, and there is limited validity evidence to support its use in workplace-based assessment. Widespread adoption of the OSCORE must be informed by concurrent data collection in more diverse settings and specialties.
Affiliation(s)
- Martha Spencer
- The University of British Columbia, Vancouver, BC, Canada.
- Rose Hatala
- The University of British Columbia, Vancouver, BC, Canada
5
Pandya A, Patocka C, Huffman J. Simulation for assessment of Entrustable Professional Activities in an emergency medicine residency program. Can J Emerg Med 2022; 24:84-87. [PMID: 34780048] [DOI: 10.1007/s43678-021-00209-5]
Abstract
In 2018, Canadian postgraduate Emergency Medicine (EM) programs transitioned to Competence by Design, and residents are now assessed using Entrustable Professional Activities (EPAs). We developed and implemented simulation-based assessment to mitigate anticipated challenges with residents completing the required number of observations of resuscitation-based EPAs. Our survey of trainees who participated in these sessions suggests that simulation may be a feasible and acceptable method of EPA assessment.
Affiliation(s)
- Anjli Pandya
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada.
- Catherine Patocka
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada
- James Huffman
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada
6
Robinson TJG, Wagner N, Szulewski A, Dudek N, Cheung WJ, Hall AK. Exploring the use of rating scales with entrustment anchors in workplace-based assessment. Medical Education 2021; 55:1047-1055. [PMID: 34060651] [DOI: 10.1111/medu.14573]
Abstract
PURPOSE Competency-based medical education (CBME) has prompted widespread implementation of workplace-based assessment (WBA) tools using entrustment anchors. This study aimed to identify factors that influence faculty members' rating choices immediately following assessment and to explore their experiences using WBAs with entrustment anchors, specifically the Ottawa Surgical Competency Operating Room Evaluation scale. METHOD Fifty semi-structured interviews were conducted with a convenience sample of Emergency Medicine (EM) physicians from a single Canadian hospital between July and August 2019. All interviews occurred within two hours of faculty completing a WBA of a trainee. Faculty were asked what they considered when rating the trainee's performance and whether they had considered an alternate rating. Two team members independently analysed interview transcripts using conventional content analysis with line-by-line coding to identify themes. RESULTS Interviews captured interactions between 70% (26/37) of full-time EM faculty and 86% (19/22) of EM trainees. Faculty most commonly identified the amount of guidance the trainee required as influencing their rating. Other variables, such as clinical context, trainee experience, past experiences with the trainee, perceived competence, and confidence, were also identified. While most faculty did not struggle to assign ratings, some had difficulty interpreting the language of entrustment anchors, were unsure whether their assessment should be retrospective or prospective in nature, and questioned whether and how the assessment should change depending on whether they were 'in the room'. CONCLUSIONS By going to the frontline during WBA encounters, this study captured authentic and honest reflections from physicians immediately engaged in assessment using entrustment anchors. While many of the factors identified are consistent with previous retrospective work, we highlight how some faculty consider factors outside the prescribed approach and struggle with the language of entrustment anchors. These results further our understanding of 'in-the-moment' assessments using entrustment anchors and may facilitate effective faculty development regarding WBA in CBME.
Affiliation(s)
- Natalie Wagner
- Department of Biomedical & Molecular Sciences, Queen's University, Kingston, ON, Canada
- Office of Professional Development & Educational Scholarship, Queen's University, Kingston, ON, Canada
- Adam Szulewski
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Department of Psychology, Queen's University, Kingston, ON, Canada
- Nancy Dudek
- Department of Medicine and The Ottawa Hospital, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Warren J Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
7
Breaking down the silos in simulation-based education: Exploring, refining, and standardizing. Can J Emerg Med 2020; 22:733-734. [DOI: 10.1017/cem.2020.471]