1. Alharbi NS. Evaluating competency-based medical education: a systematized review of current practices. BMC Medical Education 2024; 24:612. PMID: 38831271; PMCID: PMC11149276; DOI: 10.1186/s12909-024-05609-6.
Abstract
BACKGROUND Few published articles provide a comprehensive overview of the available evidence on evaluating competency-based medical education (CBME) curricula. The purpose of this review is therefore to synthesize the available evidence on the evaluation practices for competency-based curricula employed in schools and programs for undergraduate and postgraduate health professionals. METHODS This systematized review followed the systematic review approach with minor modifications to synthesize the findings of published studies that examined the evaluation of CBME undergraduate and postgraduate programs for health professionals. RESULTS Thirty-eight articles met the inclusion criteria and reported evaluation practices in CBME curricula from various countries and regions worldwide, such as Canada, China, Turkey, and West Africa. Of the evaluated programs, 57% were at the postgraduate level and 71% were in the field of medicine. The results revealed variation in reporting evaluation practices, with numerous studies failing to clarify the evaluations' objectives, approaches, tools, and standards, as well as how evaluations were reported and communicated. Questionnaires were the primary tool employed for evaluating programs, often combined with interviews or focus groups. Furthermore, the evaluation standards used drew on well-known competency frameworks, specialized association guidelines, and accreditation criteria. CONCLUSION This review calls attention to the importance of ensuring that reports of evaluation experiences include certain essential elements of evaluation to better inform theory and practice.
Affiliation(s)
- Nouf Sulaiman Alharbi
- Department of Medical Education, College of Medicine, King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia.
- King Abdullah International Medical Research Center, Riyadh, Saudi Arabia.
- Ministry of the National Guard - Health Affairs, Riyadh, Saudi Arabia.
2. Thoma B, Bernard J, Wang S, Yilmaz Y, Bandi V, Woods RA, Cheung WJ, Choo E, Card A, Chan TM. Deidentifying Narrative Assessments to Facilitate Data Sharing in Medical Education. Academic Medicine 2024; 99:513-517. PMID: 38113414; DOI: 10.1097/acm.0000000000005596.
Abstract
PROBLEM Narrative assessments are commonly incorporated into competency-based medical education programs. However, efforts to share competency-based medical education assessment data among programs to support the evaluation and improvement of assessment systems have been limited in part because of security concerns. Deidentifying assessment data mitigates these concerns, but deidentifying narrative assessments is time-consuming, resource intensive, and error prone. The authors developed and tested a tool to automate the deidentification of narrative assessments and facilitate their review. APPROACH The authors met throughout 2021 and 2022 to iteratively design, test, and refine the deidentification algorithm and data review interface. Preliminary testing of the prototype deidentification algorithm was performed using narrative assessments from the University of Saskatchewan emergency medicine program. The algorithm's accuracy was assessed by the authors using the review interface designed for this purpose. Formal testing included 2 rounds of deidentification and review by members of the authorship team. Both the algorithm and data review interface were refined during the testing process. OUTCOMES Authors from 3 institutions, including 3 emergency medicine programs, an anesthesia program, and a surgical program, participated in formal testing. In the final round of review, 99.4% of the narrative assessments were fully deidentified (names, nicknames, and pronouns removed). The results were comparable for each institution and specialty. The data review interface was improved with feedback obtained after each round of review and found to be intuitive. NEXT STEPS This innovation has demonstrated viability evidence of an algorithmic approach to the deidentification of assessment narratives while reinforcing that a small number of errors are likely to persist. Future steps include the refinement of both the algorithm to improve its accuracy and the data review interface to support additional data set formats.
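The name-, nickname-, and pronoun-removal step described in this abstract can be approximated in a few lines. The sketch below is only an illustration of one possible approach, not the authors' actual algorithm; the placeholder tokens, pronoun list, and example name are assumptions.

```python
import re

# Third-person pronouns to redact (an assumed, non-exhaustive list).
PRONOUNS = {"he", "him", "his", "she", "her", "hers",
            "they", "them", "their", "theirs"}

def deidentify(comment, names):
    """Replace known trainee names/nicknames and third-person pronouns
    with placeholder tokens."""
    # Redact each known name or nickname as a whole word, case-insensitively.
    for name in names:
        comment = re.sub(rf"\b{re.escape(name)}\b", "[NAME]",
                         comment, flags=re.IGNORECASE)

    # Redact pronouns token by token, leaving all other words untouched.
    def redact(match):
        word = match.group(0)
        return "[PRONOUN]" if word.lower() in PRONOUNS else word

    return re.sub(r"\b\w+\b", redact, comment)

# Hypothetical example (the name is invented for illustration):
print(deidentify("Sarah managed her patient well; she escalated care appropriately.",
                 ["Sarah"]))
# -> [NAME] managed [PRONOUN] patient well; [PRONOUN] escalated care appropriately.
```

A production pipeline would also need to catch names absent from any known list (e.g., via named-entity recognition), which is where the small number of residual errors the authors report would tend to arise.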
3. Frank JR, Karpinski J, Sherbino J, Snell LS, Atkinson A, Oswald A, Hall AK, Cooke L, Dojeiji S, Richardson D, Cheung WJ, Cavalcanti RB, Dalseg TR, Thoma B, Flynn L, Gofton W, Dudek N, Bhanji F, Wong BMF, Razack S, Anderson R, Dubois D, Boucher A, Gomes MM, Taber S, Gorman LJ, Fulford J, Naik V, Harris KA, St. Croix R, van Melle E. Competence By Design: a transformational national model of time-variable competency-based postgraduate medical education. Perspectives on Medical Education 2024; 13:201-223. PMID: 38525203; PMCID: PMC10959143; DOI: 10.5334/pme.1096.
Abstract
Postgraduate medical education (PGME) is an essential societal enterprise that prepares highly skilled physicians for the health workforce. In recent years, PGME systems have been criticized worldwide for problems with variable graduate abilities, concerns about patient safety, and issues with teaching and assessment methods. In response, competency-based medical education approaches, with an emphasis on graduate outcomes, have been proposed as the direction for 21st century health professions education. However, there are few published models of large-scale implementation of these approaches. We describe the rationale and design for a national, time-variable, competency-based, multi-specialty system of postgraduate medical education called Competence by Design. Fourteen innovations were bundled to create this new system, using the Van Melle Core Components of competency-based medical education as the basis for the transformation. The successful execution of this transformational training system shows that competency-based medical education can be implemented at scale. The lessons learned in the early implementation of Competence by Design can inform competency-based medical education innovation efforts across professions worldwide.
Affiliation(s)
- Jason R. Frank
- Centre for Innovation in Medical Education and Professor, Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, ON, Canada
- Jolanta Karpinski
- Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Competency Based Medical Education, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Linda S. Snell
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Medicine and Health Sciences Education, McGill University, Montreal, QC, Canada
- Adelle Atkinson
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Paediatrics, Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
- Andrew K. Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Lara Cooke
- Division of Neurology, Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada
- Susan Dojeiji
- Physical Medicine and Rehabilitation, University of Ottawa, Ottawa, ON, Canada
- Denyse Richardson
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Physical Medicine and Rehabilitation, Queen's University, Kingston, ON, Canada
- Warren J. Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, University of Toronto, Toronto, ON, Canada
- Rodrigo B. Cavalcanti
- Department of Medicine, University of Toronto, Toronto, ON, Canada
- HoPingKong Centre, University Health Network, Toronto, ON, Canada
- Timothy R. Dalseg
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Division of Emergency Medicine, University of Toronto, Toronto, ON, Canada
- Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Emergency Medicine, University of Saskatchewan, Saskatoon, SK, Canada
- Leslie Flynn
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Departments of Psychiatry and Family Medicine, and Co-Director, Master of Health Sciences Education, Queen's University, Kingston, ON, Canada
- Wade Gofton
- Department of Surgery (Division of Orthopedic Surgery), The Ottawa Hospital and University of Ottawa, Ottawa, ON, Canada
- Nancy Dudek
- Department of Medicine (Division of Physical Medicine & Rehabilitation) and The Ottawa Hospital, University of Ottawa, Ottawa, ON, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Brian M.-F. Wong
- Centre for Quality Improvement and Patient Safety, University of Toronto, Toronto, ON, Canada
- Saleem Razack
- Centre for Health Education Scholarship, University of British Columbia and BC Children's Hospital, Vancouver, BC, Canada
- Robert Anderson
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Northern Ontario School of Medicine University, Sudbury, ON, Canada
- Daniel Dubois
- Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrée Boucher
- Department of Medicine (Division of Endocrinology), Université de Montréal, Montréal, QC, Canada
- Marcio M. Gomes
- Department of Pathology and Laboratory Medicine, University of Ottawa, Ottawa, ON, Canada
- Sarah Taber
- Office of Standards and Assessment, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Lisa J. Gorman
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jane Fulford
- Canadian Internet Registration Authority, Canada
- Viren Naik
- Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Medical Council of Canada, Ottawa, ON, Canada
- Kenneth A. Harris
- Royal College of Physicians and Surgeons of Canada, Canada
- Emeritus, Western University, Canada
- Rhonda St. Croix
- Learning and Connecting, Royal College of Physicians and Surgeons of Canada, Canada
- Elaine van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Family Medicine, Queen's University, Kingston, ON, Canada
4. Ibrahim H, Juve AM, Amin A, Railey K, Andolsek KM. Expanding the Study of Bias in Medical Education Assessment. J Grad Med Educ 2023; 15:623-626. PMID: 38045936; PMCID: PMC10686652; DOI: 10.4300/jgme-d-23-00027.1.
Affiliation(s)
- Halah Ibrahim
- Halah Ibrahim, MD, MEHP, is Associate Professor, Department of Medicine, Khalifa University College of Medicine and Health Sciences, Abu Dhabi, United Arab Emirates, and Associate Editor, Journal of Graduate Medical Education (JGME)
- Amy Miller Juve
- Amy Miller Juve, EdD, is Professor of Anesthesiology and Perioperative Medicine and Professional Development, and Program Improvement Specialist for Graduate Medical Education, Oregon Health & Science University, Portland, Oregon, USA
- Alpesh Amin
- Alpesh Amin, MD, MBA, MACP, is Professor, Department of Medicine, University of California Irvine, Irvine, California, USA
- Kenyon Railey
- Kenyon Railey, MD, is Associate Professor, Department of Family Medicine and Community Health, Duke University School of Medicine, Durham, North Carolina, USA
- Kathryn M. Andolsek
- Kathryn M. Andolsek, MD, MPH, is Assistant Dean for Premedical Education and Professor, Department of Family Medicine and Community Health, Duke University School of Medicine, Durham, North Carolina, USA, and Associate Editor, JGME
5. Costello LL, Cho DD, Daniel RC, Dida J, Pritchard J, Pardhan K. Emergency medicine resident perceptions of simulation-based training and assessment in competence by design. Can J Emerg Med 2023; 25:828-835. PMID: 37665550; DOI: 10.1007/s43678-023-00577-0.
Abstract
OBJECTIVES With the launch of competence by design (CBD) in emergency medicine (EM) in Canada, there are growing recommendations on the use of simulation for the training and assessment of residents. Many of these recommendations have been made by educational leaders and often exclude the resident stakeholder. This study sought to explore resident experiences and perceptions of simulation in CBD. METHODS Qualitative data were collected from November 2020 to May 2021 at McMaster University and the University of Toronto after ethics approval was received from both sites. Eligible participants were EM residents, who were interviewed by a trained interviewer using a semi-structured interview guide. All interviews were recorded, transcribed, coded, and collapsed into themes. Data analysis was guided by constructivist grounded theory. RESULTS A total of seventeen residents participated. Thematic analysis revealed three major themes: 1) the impact of CBD on resident views of simulation; 2) simulation's role in obtaining entrustable professional activities (EPAs) and filling educational gaps; and 3) conflicting feelings on the use of high-stakes simulation-based assessment in CBD. CONCLUSIONS EM residents strongly support using simulation in CBD and acknowledge its ability to bridge educational gaps and fulfill specific EPAs. However, this study suggests some unintended consequences of CBD and conflicting views around simulation-based assessment that challenge resident perceptions of simulation as a safe learning space. As CBD evolves, educational leaders should consider these impacts when making future curricular changes or recommendations.
Affiliation(s)
- Lorne L Costello
- Division of Emergency Medicine, Department of Medicine, University of Toronto, Toronto, ON, Canada
- Department of Emergency Services, Sunnybrook Health Sciences Centre, Toronto, ON, Canada
- Dennis D Cho
- Division of Emergency Medicine, Department of Medicine, University of Toronto, Toronto, ON, Canada
- Department of Emergency Medicine, University Health Network, Toronto, ON, Canada
- Ryan C Daniel
- Department of Otolaryngology-Head & Neck Surgery, University of Toronto, Toronto, ON, Canada
- Joana Dida
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON, Canada
- Jodie Pritchard
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Kaif Pardhan
- Division of Emergency Medicine, Department of Medicine, University of Toronto, Toronto, ON, Canada
- Department of Emergency Services, Sunnybrook Health Sciences Centre, Toronto, ON, Canada
- Division of Pediatric Emergency Medicine, Department of Pediatrics, McMaster University, Hamilton, ON, Canada
6. Seed JD, Gauthier S, Zevin B, Hall AK, Chaplin T. Simulation vs workplace-based assessment in resuscitation: a cross-specialty descriptive analysis and comparison. Canadian Medical Education Journal 2023; 14:92-98. PMID: 37465738; PMCID: PMC10351640; DOI: 10.36834/cmej.73692.
Abstract
Background Simulation-based assessment can complement workplace-based assessment of rare or difficult-to-assess Entrustable Professional Activities (EPAs). We aimed to compare the use of simulation-based assessment for resuscitation-focused EPAs in three postgraduate medical training programs and to describe faculty perceptions of simulation-based assessment. Methods EPA assessment scores and setting (simulation or workplace) were extracted from 2017-2020 for internal medicine, emergency medicine, and surgical foundations residents at the transition to discipline and foundations of discipline stages. A questionnaire was distributed to clinical competency committee members. Results Eleven percent of EPA assessments were simulation-based. The proportion of simulation-based assessment did not differ between programs but differed between the transition (38%) and foundations (4%) stages within surgical foundations only. Entrustment scores differed between settings in emergency medicine at the transition level only (simulation: 4.82 ± 0.60; workplace: 3.74 ± 0.93). Seventy percent of committee members (n = 20) completed the questionnaire. Of those who use simulation-based assessment, 45% interpret it differently than workplace-based assessment. 73% and 100% trust simulation for high-stakes and low-stakes assessment, respectively. Conclusions The proportion of simulation-based assessment for resuscitation-focused EPAs did not differ between three postgraduate medical training programs. Interpretation of simulation-based assessment data between committee members was inconsistent. All respondents trust simulation-based assessment for low-stakes assessment, and the majority trust it for high-stakes assessment. These findings have practical implications for the integration of simulation into programs of assessment.
Affiliation(s)
- Jeremy D Seed
- Department of Emergency Medicine, Queen's University, Ontario, Canada
- Boris Zevin
- Department of Surgery, Queen's University, Ontario, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ontario, Canada
- Timothy Chaplin
- Department of Emergency Medicine, Queen's University, Ontario, Canada
7. Paterson QS, Alrimawi H, Sample S, Bouwsema M, Anjum O, Vincent M, Cheung WJ, Hall A, Woods R, Martin LJ, Chan T. Examining enablers and barriers to entrustable professional activity acquisition using the theoretical domains framework: A qualitative framework analysis study. AEM Education and Training 2023; 7:e10849. PMID: 36994315; PMCID: PMC10041073; DOI: 10.1002/aet2.10849.
Abstract
Background Without a clear understanding of the factors contributing to the effective acquisition of high-quality entrustable professional activity (EPA) assessments, trainees, supervising faculty, and training programs may lack appropriate strategies for successful EPA implementation and utilization. The purpose of this study was to identify barriers and facilitators to acquiring high-quality EPA assessments in Canadian emergency medicine (EM) training programs. Methods We conducted a qualitative framework analysis study utilizing the Theoretical Domains Framework (TDF). Semistructured interviews of EM resident and faculty participants were audio recorded, deidentified, and coded line-by-line by two authors to extract themes and subthemes across the domains of the TDF. Results From 14 interviews (eight faculty and six residents), we identified, within the 14 TDF domains, major themes and subthemes for barriers and facilitators to EPA acquisition for both faculty and residents. The two most cited domains (and their frequencies) among residents and faculty were environmental context and resources (56) and behavioral regulation (48). Example strategies for improving EPA acquisition include orienting residents to the competency-based medical education (CBME) paradigm, recalibrating expectations relating to "low ratings" on EPAs, engaging in continuous faculty development to ensure familiarity and fluency with EPAs, and implementing longitudinal coaching programs between residents and faculty to encourage repeated longitudinal interactions and high-quality specific feedback. Conclusions We identified key strategies to support residents, faculty, programs, and institutions in overcoming barriers and improving EPA assessment processes. This is an important step toward ensuring the successful implementation of CBME and the effective operationalization of EPAs within EM training programs.
Affiliation(s)
- Quinten S. Paterson
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Hussein Alrimawi
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Spencer Sample
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Melissa Bouwsema
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Omar Anjum
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Maggie Vincent
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Andrew Hall
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Lynsey J. Martin
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Teresa Chan
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
8. Woods R, Singh S, Thoma B, Patocka C, Cheung W, Monteiro S, Chan TM. Validity evidence for the Quality of Assessment for Learning score: a quality metric for supervisor comments in Competency Based Medical Education. Canadian Medical Education Journal 2022; 13:19-35. PMID: 36440075; PMCID: PMC9684040; DOI: 10.36834/cmej.74860.
Abstract
BACKGROUND Competency based medical education (CBME) relies on the supervisor narrative comments contained within entrustable professional activity (EPA) assessments for programmatic assessment, but the quality of these supervisor comments goes unassessed. There is validity evidence supporting the QuAL (Quality of Assessment for Learning) score for rating the usefulness of short narrative comments in direct observation. OBJECTIVE We sought to establish validity evidence for the QuAL score to rate the quality of supervisor narrative comments contained within an EPA by surveying the key end-users of EPA narrative comments: residents, academic advisors, and competence committee members. METHODS In 2020, the authors randomly selected 52 de-identified narrative comments from two emergency medicine EPA databases using purposeful sampling. Six collaborators (two residents, two academic advisors, and two competence committee members) were recruited from each of four EM residency programs (Saskatchewan, McMaster, Ottawa, and Calgary) to rate these comments with a utility score and the QuAL score. Correlations between the utility and QuAL scores were calculated using Pearson's correlation coefficient. Sources of variance and reliability were calculated using a generalizability study. RESULTS All collaborators (n = 24) completed the full study. The QuAL score had a high positive correlation with the utility score amongst the residents (r = 0.80) and academic advisors (r = 0.75) and a moderately high correlation amongst competence committee members (r = 0.68). The generalizability study found that the major source of variance was the comment itself, indicating that the tool performs well across raters. CONCLUSION The QuAL score may serve as an outcome measure for program evaluation of supervisors and as a resource for faculty development.
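The correlations reported in this abstract (e.g., r = 0.80 for residents) use the standard Pearson formula over paired ratings of the same comments. A minimal sketch follows; the score lists are invented for illustration and are not the study's data.

```python
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired lists of ratings."""
    mx, my = mean(x), mean(y)
    # Covariance numerator and the two standard-deviation terms (unnormalized).
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented ratings of five narrative comments by one rater (illustration only):
qual_scores = [5, 3, 4, 1, 2]      # QuAL score per comment
utility_scores = [4, 3, 5, 1, 2]   # utility rating per comment

print(round(pearson_r(qual_scores, utility_scores), 2))  # -> 0.9
```

A value near 1 indicates that raters who find a comment useful also tend to score it highly on QuAL, which is the relationship the study quantified for each end-user group.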
Affiliation(s)
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Sim Singh
- College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Catherine Patocka
- Department of Emergency Medicine, University of Calgary, Alberta, Canada
- Warren Cheung
- Department of Emergency Medicine, University of Ottawa, Ontario, Canada
- Sandra Monteiro
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Ontario, Canada
- Teresa M Chan
- Division of Emergency Medicine and Education & Innovation, Department of Medicine, McMaster University, Ontario, Canada
9. McKenzie-White J, Mubuuke AG, Westergaard S, Munabi IG, Bollinger RC, Opoka R, Mbalinda SN, Katete D, Manabe YC, Kiguli S. Evaluation of a competency based medical curriculum in a Sub-Saharan African medical school. BMC Medical Education 2022; 22:724. PMID: 36242004; PMCID: PMC9569118; DOI: 10.1186/s12909-022-03781-1.
Abstract
BACKGROUND Medical schools in Sub-Saharan Africa have adopted competency based medical education (CBME) to improve the quality of the graduates they train. In 2015, Makerere University College of Health Sciences (MakCHS) implemented CBME for the Bachelor of Medicine and Bachelor of Surgery (MBChB) programme in order to produce doctors with the required attributes to address community health needs. However, no formal evaluation of the curriculum had been conducted to determine whether all established competencies were being assessed. OBJECTIVE To evaluate whether assessment methods within the MBChB curriculum address the stated competencies. METHODS The evaluation adopted a cross-sectional study design in which the MBChB curriculum was evaluated using an Essential Course Evidence Form (ECEF) developed to collect information about each assessment used for each course. Information was collected on: (1) assessment title, (2) description, (3) competency domain, (4) sub-competency addressed, (5) student instructions, and (6) grading method/details. Data were entered into a structured Access database. In addition, face-to-face interviews were conducted with faculty course coordinators. RESULTS The MBChB curriculum consisted of 62 courses over 5 years, focusing on preclinical skills in years 1-2 and clinical skills in years 3-5. Fifty-nine competencies were identified and aggregated into 9 domains. Fifty-eight competencies were assessed at least once in the curriculum. Faculty cited limited training in assessment as well as large student numbers as hindrances to designing robust assessments for the competencies. CONCLUSION CBME was successfully implemented, as evidenced by all but one of the 59 competencies within the nine established domains being assessed within the MBChB curriculum at MakCHS. Faculty interviewed were largely aware of CBME but indicated the need for more training in competency-based assessment to improve its implementation.
Affiliation(s)
- Jane McKenzie-White
- Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Aloysius G Mubuuke
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Sara Westergaard
- Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Ian G Munabi
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Robert C Bollinger
- Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Robert Opoka
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Scovia N Mbalinda
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- David Katete
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Yukari C Manabe
- Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Sarah Kiguli
- College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
10. Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. Canadian Medical Education Journal 2022; 13:14-27. PMID: 36310899; PMCID: PMC9588183; DOI: 10.36834/cmej.73554.
Abstract
BACKGROUND Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires the assessment of entrustable professional activities (EPAs). Dashboards could be used to track the completion of EPAs to support program evaluation. METHODS Using a design-based research process, we identified program evaluation needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) that meet these needs. We interviewed leaders from the emergency medicine program and the postgraduate medical education office at the University of Saskatchewan. Two investigators thematically analyzed interview transcripts to identify program evaluation needs, which were audited by two additional investigators. Identified needs were described using quotes, analytics, and visualizations. RESULTS Between July 1, 2019, and April 6, 2021, we conducted 17 interviews with six participants (two program leaders and four institutional leaders). Four needs emerged as themes: tracking changes in overall assessment metrics, comparing metrics to the assessment plan, evaluating rotation performance, and engagement with the assessment metrics. We addressed these needs by presenting analytics and visualizations within a dashboard. CONCLUSIONS We identified program evaluation needs related to EPA assessments and designed dashboard elements to meet them. This work will inform the development of other CBME assessment dashboards designed to support program evaluation.
Affiliation(s)
- Yusuf Yilmaz
- Continuing Professional Development Office, and McMaster program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Department of Medical Education, Ege University, Turkey
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Continuing Professional Development Office, and McMaster program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine at McMaster University
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Royal College of Physicians and Surgeons of Canada, Ontario, Canada
11
Gisondi MA, Michael S, Li-Sauerwine S, Brazil V, Caretta-Weyer HA, Issenberg B, Giordano J, Lineberry M, Olson AS, Burkhardt JC, Chan TM. The Purpose, Design, and Promise of Medical Education Research Labs. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2022; 97:1281-1288. [PMID: 35612923 DOI: 10.1097/acm.0000000000004746] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
Medical education researchers are often subject to challenges that include lack of funding, collaborators, study subjects, and departmental support. The construct of a research lab provides a framework that can be employed to overcome these challenges and effectively support the work of medical education researchers; however, labs are relatively uncommon in the medical education field. Using case examples, the authors describe the organization and mission of medical education research labs contrasted with those of larger research team configurations, such as research centers, collaboratives, and networks. They discuss several key elements of education research labs: the importance of lab identity, the signaling effect of a lab designation, required infrastructure, and the training mission of a lab. The need for medical education researchers to be visionary and strategic when designing their labs is emphasized, start-up considerations and the likelihood of support for medical education labs are considered, and the degree to which department leaders should support such labs is questioned.
Affiliation(s)
- Michael A Gisondi
- M.A. Gisondi is associate professor and vice chair for education, Department of Emergency Medicine, and principal, Precision Education and Assessment Research Lab (PEARL), Stanford University School of Medicine, Stanford, California; ORCID: https://orcid.org/0000-0002-6800-3932
- Sarah Michael
- S. Michael is assistant professor, Department of Emergency Medicine, University of Colorado Anschutz Medical Campus, Aurora, Colorado; ORCID: https://orcid.org/0000-0003-0077-6282
- Simiao Li-Sauerwine
- S. Li-Sauerwine is assistant professor and assistant program director, Department of Emergency Medicine, The Ohio State University, Columbus, Ohio, and chief academic officer, Academic Life in Emergency Medicine Education Research Lab and Incubator; ORCID: https://orcid.org/0000-0003-3445-6404
- Victoria Brazil
- V. Brazil is professor of emergency medicine and director, Translational Simulation Collaborative, Bond University, Gold Coast, Queensland, Australia; ORCID: https://orcid.org/0000-0001-9103-2507
- Holly A Caretta-Weyer
- H.A. Caretta-Weyer is assistant professor and associate program director, Department of Emergency Medicine, and senior scientist, PEARL, Stanford University School of Medicine, Stanford, California; ORCID: https://orcid.org/0000-0002-9783-5797
- Barry Issenberg
- B. Issenberg is professor of medicine, professor of medical education, the Michael S. Gordon Chair of Medical Education, University of Miami Miller School of Medicine, and senior associate dean for research in medical education and director, Gordon Center for Simulation and Innovation in Medical Education, University of Miami Miller School of Medicine, Miami, Florida; ORCID: https://orcid.org/0000-0002-2524-4736
- Jonathan Giordano
- J. Giordano is assistant professor, director of undergraduate medical education, codirector of medical education fellowship, Department of Emergency Medicine, and lead, Texas Innovation and Educational Research Lab, McGovern Medical School at University of Texas, Houston, Texas
- Matthew Lineberry
- M. Lineberry is director of simulation research, assessment, and outcomes, Zamierowski Institute for Experiential Learning, and associate professor of population health, University of Kansas Medical Center and Health System, Kansas City, Kansas; ORCID: https://orcid.org/0000-0002-0177-5305
- Adriana Segura Olson
- A.S. Olson is assistant professor and assistant program director, Section of Emergency Medicine, Department of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-2585-0971
- John C Burkhardt
- J.C. Burkhardt is assistant professor, Departments of Emergency Medicine and Learning Health Sciences, University of Michigan Medical School, and principal investigator, Policy Analysis, Research, and Innovation in Medical Education Collective, University of Michigan, Ann Arbor, Michigan; ORCID: https://orcid.org/0000-0001-6273-8762
- Teresa M Chan
- T.M. Chan is associate dean for continuing professional development, Faculty of Health Sciences, associate professor, Divisions of Emergency Medicine and of Education & Innovation, Department of Medicine, clinician scientist, McMaster Education Research, Innovation, and Theory Program, McMaster University, Hamilton, Ontario, Canada, and founder, Technology, Education and Collaboration in Healthcare Hub; ORCID: https://orcid.org/0000-0001-6104-462X
12
Spencer M, Sherbino J, Hatala R. Examining the validity argument for the Ottawa Surgical Competency Operating Room Evaluation (OSCORE): a systematic review and narrative synthesis. ADVANCES IN HEALTH SCIENCES EDUCATION : THEORY AND PRACTICE 2022; 27:659-689. [PMID: 35511356 DOI: 10.1007/s10459-022-10114-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Accepted: 04/02/2022] [Indexed: 06/14/2023]
Abstract
The Ottawa Surgical Competency Operating Room Evaluation (OSCORE) is an assessment tool that has gained prominence in postgraduate competency-based training programs. We undertook a systematic review and narrative synthesis to articulate the underlying validity argument in support of this tool. Although originally developed to assess readiness for independent performance of a procedure, contemporary implementation includes using the OSCORE for entrustment supervision decisions. We used systematic review methodology to search, identify, appraise and abstract relevant articles from 2005 to September 2020, across MEDLINE, EMBASE and Google Scholar databases. Nineteen original, English-language, quantitative or qualitative articles addressing the use of the OSCORE for health professionals' assessment were included. We organized and synthesized the validity evidence according to Kane's framework, articulating the validity argument and identifying evidence gaps. We demonstrate a reasonable validity argument for the OSCORE in surgical specialties, based on assessing surgical competence as readiness for independent performance for a given procedure, which relates to ad hoc, retrospective, entrustment supervision decisions. The scoring, generalization and extrapolation inferences are well-supported. However, there is a notable lack of implications evidence focused on the impact of the OSCORE on summative decision-making within surgical training programs. In non-surgical specialties, the interpretation/use argument for the OSCORE has not been clearly articulated. The OSCORE has been reduced to a single-item global rating scale, and there is limited validity evidence to support its use in workplace-based assessment. Widespread adoption of the OSCORE must be informed by concurrent data collection in more diverse settings and specialties.
Affiliation(s)
- Martha Spencer
- The University of British Columbia, Vancouver, BC, Canada.
- Rose Hatala
- The University of British Columbia, Vancouver, BC, Canada
13
Landreville JM, Wood TJ, Frank JR, Cheung WJ. Does direct observation influence the quality of workplace-based assessment documentation? AEM EDUCATION AND TRAINING 2022; 6:e10781. [PMID: 35903424 PMCID: PMC9305723 DOI: 10.1002/aet2.10781] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/14/2022] [Revised: 06/07/2022] [Accepted: 06/08/2022] [Indexed: 05/30/2023]
Abstract
BACKGROUND A key component of competency-based medical education (CBME) is direct observation of trainees. Direct observation has been emphasized as integral to workplace-based assessment (WBA) yet previously identified challenges may limit its successful implementation. Given these challenges, it is imperative to fully understand the value of direct observation within a CBME program of assessment. Specifically, it is not known whether the quality of WBA documentation is influenced by observation type (direct or indirect). METHODS The objective of this study was to determine the influence of observation type (direct or indirect) on quality of entrustable professional activity (EPA) assessment documentation within a CBME program. EPA assessments were scored by four raters using the Quality of Assessment for Learning (QuAL) instrument, a previously published three-item quantitative measure of the quality of written comments associated with a single clinical performance score. An analysis of variance was performed to compare mean QuAL scores among the direct and indirect observation groups. The reliability of the QuAL instrument for EPA assessments was calculated using a generalizability analysis. RESULTS A total of 244 EPA assessments (122 direct observation, 122 indirect observation) were rated for quality using the QuAL instrument. No difference in mean QuAL score was identified between the direct and indirect observation groups (p = 0.17). The reliability of the QuAL instrument for EPA assessments was 0.84. CONCLUSIONS Observation type (direct or indirect) did not influence the quality of EPA assessment documentation. This finding raises the question of how direct and indirect observation truly differ and the implications for meta-raters such as competence committees responsible for making judgments related to trainee promotion.
Affiliation(s)
- Timothy J. Wood
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
14
Chan TM, Dowling S, Tastad K, Chin A, Thoma B. Integrating training, practice, and reflection within a new model for Canadian medical licensure: a concept paper prepared for the Medical Council of Canada. CANADIAN MEDICAL EDUCATION JOURNAL 2022; 13:68-81. [PMID: 36091730 PMCID: PMC9441128 DOI: 10.36834/cmej.73717] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
In 2020 the Medical Council of Canada created a task force to make recommendations on the modernization of its practices for granting licensure to medical trainees. This task force solicited papers on this topic from subject matter experts. As outlined within this Concept Paper, our proposal would shift licensure away from the traditional focus on high-stakes summative exams in a way that integrates training, clinical practice, and reflection. Specifically, we propose a model of graduated licensure that would have three stages including: a trainee license for trainees that have demonstrated adequate medical knowledge to begin training as a closely supervised resident, a transition to practice license for trainees that have compiled a reflective educational portfolio demonstrating the clinical competence required to begin independent practice with limitations and support, and a fully independent license for unsupervised practice for attendings that have demonstrated competence through a reflective portfolio of clinical analytics. This proposal was reviewed by a diverse group of 30 trainees, practitioners, and administrators in medical education. Their feedback was analyzed and summarized to provide an overview of the likely reception that this proposal would receive from the medical education community.
Affiliation(s)
- Teresa M Chan
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Division of Education & Innovation, Department of Medicine
- Faculty of Health Sciences, McMaster University; McMaster Education Research, Innovation, and Theory (MERIT) program
- Office of Continuing Professional Development; Faculty of Health Sciences, McMaster University
- Shawn Dowling
- Department of Emergency Medicine, Cumming School of Medicine, University of Calgary
- Kara Tastad
- Royal College Emergency Medicine Training Program, University of Toronto
- Alvin Chin
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan
15
Yilmaz Y, Jurado Nunez A, Ariaeinejad A, Lee M, Sherbino J, Chan TM. Harnessing Natural Language Processing to Support Decisions Around Workplace-Based Assessment: Machine Learning Study of Competency-Based Medical Education. JMIR MEDICAL EDUCATION 2022; 8:e30537. [PMID: 35622398 PMCID: PMC9187970 DOI: 10.2196/30537] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 05/19/2021] [Revised: 12/05/2021] [Accepted: 04/30/2022] [Indexed: 06/15/2023]
Abstract
BACKGROUND Residents receive a numeric performance rating (eg, 1-7 scoring scale) along with a narrative (ie, qualitative) feedback based on their performance in each workplace-based assessment (WBA). Aggregated qualitative data from WBA can be overwhelming to process and fairly adjudicate as part of a global decision about learner competence. Current approaches with qualitative data require a human rater to maintain attention and appropriately weigh various data inputs within the constraints of working memory before rendering a global judgment of performance. OBJECTIVE This study explores natural language processing (NLP) and machine learning (ML) applications for identifying trainees at risk using a large WBA narrative comment data set associated with numerical ratings. METHODS NLP was performed retrospectively on a complete data set of narrative comments (ie, text-based feedback to residents based on their performance on a task) derived from WBAs completed by faculty members from multiple hospitals associated with a single, large, residency program at McMaster University, Canada. Narrative comments were vectorized to quantitative ratings using the bag-of-n-grams technique with 3 input types: unigram, bigrams, and trigrams. Supervised ML models using linear regression were trained with the quantitative ratings, performed binary classification, and output a prediction of whether a resident fell into the category of at risk or not at risk. Sensitivity, specificity, and accuracy metrics are reported. RESULTS The database comprised 7199 unique direct observation assessments, containing both narrative comments and a rating between 3 and 7 in imbalanced distribution (scores 3-5: 726 ratings; and scores 6-7: 4871 ratings). A total of 141 unique raters from 5 different hospitals and 45 unique residents participated over the course of 5 academic years. 
When comparing the 3 different input types for diagnosing if a trainee would be rated low (ie, 1-5) or high (ie, 6 or 7), our accuracy for trigrams was 87%, bigrams 86%, and unigrams 82%. We also found that all 3 input types had better prediction accuracy when using a bimodal cut (eg, lower or higher) compared with predicting performance along the full 7-point rating scale (50%-52%). CONCLUSIONS The ML models can accurately identify underperforming residents via narrative comments provided for WBAs. The words generated in WBAs can be a worthy data set to augment human decisions for educators tasked with processing large volumes of narrative assessments.
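The bag-of-n-grams vectorization and binary at-risk classification described in this abstract can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the study's code: the comments and labels below are synthetic, and scikit-learn's `CountVectorizer` and `LogisticRegression` stand in for the paper's n-gram pipeline and regression-based classifiers.

```python
# Illustrative sketch (synthetic data, not the study's pipeline) of
# bag-of-n-grams vectorization plus a supervised binary classifier
# that flags "at risk" trainees from narrative WBA comments.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

comments = [
    "excellent independent management of the resuscitation",
    "needed significant guidance with the airway plan",
    "strong differential and clear disposition plan",
    "required prompting throughout the patient encounter",
]
at_risk = [0, 1, 0, 1]  # 1 = low rating ("at risk"), 0 = high rating

# ngram_range=(1, 3) pools unigrams, bigrams, and trigrams, the three
# input types compared in the study.
vectorizer = CountVectorizer(ngram_range=(1, 3))
X = vectorizer.fit_transform(comments)

clf = LogisticRegression().fit(X, at_risk)
# Classify an unseen, guidance-heavy comment.
pred = clf.predict(vectorizer.transform(["needed guidance throughout"]))
print(int(pred[0]))
```

In practice the model would be trained on thousands of rated comments (the study used 7,199 assessments) and evaluated for sensitivity and specificity rather than raw accuracy alone.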
Affiliation(s)
- Yusuf Yilmaz
- McMaster Education Research, Innovation, and Theory Program, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Department of Medical Education, Ege University, Izmir, Turkey
- Program for Faculty Development, Office of Continuing Professional Development, McMaster University, Hamilton, ON, Canada
- Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Alma Jurado Nunez
- Department of Medicine and Masters in eHealth Program, McMaster University, Hamilton, ON, Canada
- Ali Ariaeinejad
- Department of Medicine and Masters in eHealth Program, McMaster University, Hamilton, ON, Canada
- Mark Lee
- McMaster Education Research, Innovation, and Theory Program, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Jonathan Sherbino
- McMaster Education Research, Innovation, and Theory Program, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Division of Education and Innovation, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Teresa M Chan
- McMaster Education Research, Innovation, and Theory Program, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Program for Faculty Development, Office of Continuing Professional Development, McMaster University, Hamilton, ON, Canada
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
- Division of Education and Innovation, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada
16
Chan TM, Sebok-Syer SS, Yilmaz Y, Monteiro S. The Impact of Electronic Data to Capture Qualitative Comments in a Competency-Based Assessment System. Cureus 2022; 14:e23480. [PMID: 35494923 PMCID: PMC9038604 DOI: 10.7759/cureus.23480] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/24/2022] [Indexed: 11/23/2022] Open
Abstract
Introduction Digitalizing workplace-based assessments (WBA) holds the potential for facilitating feedback and performance review, wherein we can easily record, store, and analyze data in real time. When digitizing assessment systems, however, it is unclear what is gained and lost in the message as a result of the change in medium. This study evaluates the quality of comments generated in paper vs. electronic media and the influence of an assessor’s seniority. Methods Using a realist evaluation framework, a retrospective database review was conducted with paper-based and electronic medium comments. A sample of assessments was examined to determine any influence of the medium on the word count and the Quality of Assessment for Learning (QuAL) score. A correlation analysis evaluated the relationship between word count and QuAL score. Separate univariate analyses of variance (ANOVAs) were used to examine the influence of the assessor's seniority and medium on word count, QuAL score, and WBA scores. Results The analysis included a total of 1,825 records. The average word count for the electronic comments (M=16) was significantly higher than the paper version (M=12; p=0.01). Longer comments positively correlated with QuAL score (r=0.2). Paper-based comments received lower QuAL scores (0.41) compared to electronic (0.51; p<0.01). Years in practice was negatively correlated with QuAL score (r=-0.08; p<0.001) as was word count (r=-0.2; p<0.001). Conclusion Digitization of WBAs increased the length of comments and did not appear to jeopardize the quality of WBAs; these results indicate higher-quality assessment data. True digital transformation may be possible by harnessing trainee data repositories and repurposing them to analyze for faculty-relevant metrics.
17
Thoma B, Monteiro S, Pardhan A, Waters H, Chan T. Replacing high-stakes summative examinations with graduated medical licensure in Canada. CMAJ 2022; 194:E168-E170. [PMID: 35131756 PMCID: PMC8900762 DOI: 10.1503/cmaj.211816] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/20/2023] Open
Affiliation(s)
- Brent Thoma
- Department of Emergency Medicine (Thoma), University of Saskatchewan, Saskatoon, Sask.; Royal College of Physicians and Surgeons (Thoma), Ottawa, Ont.; Centre for Simulation-Based Learning (Monteiro); McMaster Education Research, Innovation, and Theory (MERIT) Program (Monteiro, Chan); Division of Education & Innovation, Department of Medicine (Monteiro, Chan); Division of Emergency Medicine (Pardhan), Department of Medicine and Pediatrics, and Fellow of the Royal College of Physicians of Canada Emergency Medicine Program (Pardhan); Department of Family Medicine (Waters); McMaster Office of Postgraduate Medical Education Advisory Board (Waters); Division of Emergency Medicine (Chan), Department of Medicine, Faculty of Health Sciences; Office of Continuing Professional Development (Chan), Faculty of Health Sciences, McMaster University, Hamilton, Ont.
- Sandra Monteiro
- Alim Pardhan
- Heather Waters
- Teresa Chan
18
Pandya A, Patocka C, Huffman J. Simulation for assessment of Entrustable Professional Activities in an emergency medicine residency program. CAN J EMERG MED 2022; 24:84-87. [PMID: 34780048 DOI: 10.1007/s43678-021-00209-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2021] [Accepted: 08/18/2021] [Indexed: 11/28/2022]
Abstract
In 2018, Canadian post-graduate Emergency Medicine (EM) programs transitioned to Competence-by-Design. Residents are now assessed using Entrustable Professional Activities (EPAs). We developed and implemented simulation for assessment to mitigate anticipated challenges with residents completing the required number of observations of resuscitation-based EPAs. Our survey of trainees who participated in these sessions suggests that it may be a feasible and acceptable method for EPA assessment.
Affiliation(s)
- Anjli Pandya
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada.
- Catherine Patocka
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada
- James Huffman
- Department of Emergency Medicine, Rm C231, Foothills Medical Centre, University of Calgary, 1403 29th Street NW, Calgary, AB, T2N 2T9, Canada
19
Laureano M, Mithoowani S, Tseng EK, Zeller MP. Improving Medical Education in Hematology and Transfusion Medicine in Canada: Standards and Limitations. ADVANCES IN MEDICAL EDUCATION AND PRACTICE 2021; 12:1153-1163. [PMID: 34675742 PMCID: PMC8504712 DOI: 10.2147/amep.s247159] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/16/2021] [Accepted: 09/04/2021] [Indexed: 06/13/2023]
Abstract
The paradigm of medical education is evolving with the introduction of competency-based medical education (CBME), and it is crucial that residency programs adapt. In this paper, we provide an overview of the current status of medical education in Hematology in Canada including models of training, assessment methods, anticipated challenges, and the effects of the COVID-19 pandemic. We will also discuss additional training that can be pursued after a Hematology residency, with a particular focus on Transfusion Medicine as it was one of the first programs to implement a competency-based curriculum. Finally, we explore the future directions of medical education in Hematology and Transfusion Medicine.
Affiliation(s)
- Marissa Laureano
- Department of Medicine, Department of Pathology and Molecular Medicine, McMaster University and Canadian Blood Services, Hamilton, ON, Canada
- Siraj Mithoowani
- Division of Hematology & Thromboembolism, Department of Medicine, McMaster University, Hamilton, ON, Canada
- Eric K Tseng
- Division of Hematology/Oncology, St. Michael’s Hospital, University of Toronto, Toronto, ON, Canada
- Michelle P Zeller
- Division of Hematology & Thromboembolism, McMaster Centre for Transfusion Research and Canadian Blood Services, Hamilton, ON, Canada
20
Robinson TJG, Wagner N, Szulewski A, Dudek N, Cheung WJ, Hall AK. Exploring the use of rating scales with entrustment anchors in workplace-based assessment. MEDICAL EDUCATION 2021; 55:1047-1055. [PMID: 34060651 DOI: 10.1111/medu.14573] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/01/2020] [Revised: 04/07/2021] [Accepted: 05/26/2021] [Indexed: 06/12/2023]
Abstract
PURPOSE Competency-based medical education (CBME) has prompted widespread implementation of workplace-based assessment (WBA) tools using entrustment anchors. This study aimed to identify factors that influence faculty's rating choices immediately following assessment and explore their experiences using WBAs with entrustment anchors, specifically the Ottawa Surgical Competency Operating Room Evaluation scale. METHOD A convenience sample of 50 semi-structured interviews with Emergency Medicine (EM) physicians from a single Canadian hospital was conducted between July and August 2019. All interviews occurred within two hours of faculty completing a WBA of a trainee. Faculty were asked what they considered when rating the trainee's performance and whether they considered an alternate rating. Two team members independently analysed interview transcripts using conventional content analysis with line-by-line coding to identify themes. RESULTS Interviews captured interactions between 70% (26/37) of full-time EM faculty and 86% (19/22) of EM trainees. Faculty most commonly identified the amount of guidance the trainee required as influencing their rating. Other variables such as clinical context, trainee experience, past experiences with the trainee, perceived competence and confidence were also identified. While most faculty did not struggle to assign ratings, some had difficulty interpreting the language of entrustment anchors, being unsure whether their assessment should be retrospective or prospective in nature, and if/how the assessment should change depending on whether they were 'in the room' or not. CONCLUSIONS By going to the frontline during WBA encounters, this study captured authentic and honest reflections from physicians immediately engaged in assessment using entrustment anchors.
While many of the factors identified are consistent with previous retrospective work, we highlight how some faculty consider factors outside the prescribed approach and struggle with the language of entrustment anchors. These results further our understanding of 'in-the-moment' assessments using entrustment anchors and may facilitate effective faculty development regarding WBA in CBME.
Affiliation(s)
- Natalie Wagner
- Department of Biomedical & Molecular Sciences, Queen's University, Kingston, ON, Canada
- Office of Professional Development & Educational Scholarship, Queen's University, Kingston, ON, Canada
- Adam Szulewski
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Department of Psychology, Queen's University, Kingston, ON, Canada
- Nancy Dudek
- Department of Medicine and The Ottawa Hospital, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Warren J Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
21
Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. CANADIAN MEDICAL EDUCATION JOURNAL 2021; 12:48-64. [PMID: 34567305 PMCID: PMC8463237 DOI: 10.36834/cmej.72067] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
BACKGROUND Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires frequent assessments of entrustable professional activities (EPAs). Faculty struggle to provide helpful feedback and assign appropriate entrustment scores. CBME faculty development initiatives rarely incorporate teaching metrics. Dashboards could be used to visualize faculty assessment data to support faculty development. METHODS Using a design-based research process, we identified faculty development needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. Data was collected within the emergency medicine residency program at the University of Saskatchewan through interviews with program leaders, faculty development experts, and faculty participating in development sessions. Two investigators thematically analyzed interview transcripts to identify faculty needs; the identified needs were audited by a third investigator. The needs were described using representative quotes and the dashboard elements designed to address them. RESULTS Between July 1, 2019 and December 11, 2020 we conducted 15 interviews with nine participants (two program leaders, three faculty development experts, and four faculty members). Three needs emerged as themes from the analysis: analysis of assessments, contextualization of assessments, and accessible reporting. We addressed these needs by designing an accessible dashboard to present contextualized quantitative and narrative assessment data for each faculty member. CONCLUSIONS We identified faculty development needs related to EPA assessments and designed dashboard elements to meet them. The resulting dashboard was used for faculty development sessions. This work will inform the development of CBME assessment dashboards for faculty.
Affiliation(s)
- Yusuf Yilmaz
- Continuing Professional Development Office and McMaster Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
- Department of Medical Education, Ege University, Izmir, Turkey
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Continuing Professional Development Office and McMaster Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
- Emergency Medicine, Department of Medicine, McMaster University, Ontario, Canada
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
22
Sample S, Al Rimawi H, Bérczi B, Chorley A, Pardhan A, Chan TM. Seeing potential opportunities for teaching (SPOT): Evaluating a bundle of interventions to augment entrustable professional activity acquisition. AEM EDUCATION AND TRAINING 2021; 5:e10631. [PMID: 34471797 PMCID: PMC8381386 DOI: 10.1002/aet2.10631] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/14/2021] [Revised: 05/10/2021] [Accepted: 06/02/2021] [Indexed: 06/13/2023]
Abstract
INTRODUCTION Within the Canadian competency-based medical education system, entrustable professional activities (EPAs) are used to assess residents on performed clinical duties. This study aimed to determine whether implementing a bundle of two interventions (a case-based discussion intervention and a rotation-based nudging system) could increase the number of EPA assessments completed for our trainees. METHODS The authors designed an intervention bundle with two components: 1) a case-based workshop where trainees discussed which EPAs could be assessed with multiple cases and 2) a nudging system wherein each trainee was reminded of EPAs that would be useful to them on each rotation in their first year. We conducted a retrospective program evaluation to compare the intervention cohort (2019) to two historical cohorts using similar EPAs (2017, 2018). RESULTS Data from 22 trainees (seven in 2017, eight in 2018, and seven in 2019) were analyzed. There was a marked increase in the total number of EPA assessments acquired in the 2019 cohort (average per resident = 285.7, 95% confidence interval [CI] = 256.1 to 312.3, range = 195-350) compared to the two other years (2018 [average = 132.4, 95% CI = 107.5 to 157.0, range = 107-167] and 2017 [average = 70.1, 95% CI = 45.3 to 91.0, range = 49-95]), yielding an effect size of Cohen's d = 4.02 for the intervention bundle. CONCLUSIONS Within the limitations of a small sample size, there was a strong effect of the two interventions (a case-based orientation and a nudging system) on EPA assessment numbers among PGY-1 residents. These strategies may be useful to others seeking to improve EPA assessment numbers in other specialties and clinical environments.
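As a rough illustration of how an effect size like the Cohen's d reported above is computed from two group means and a pooled standard deviation: the abstract reports only means, confidence intervals, and ranges, so the standard deviations in this sketch are hypothetical placeholders, not values from the study.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical example using the reported cohort means (2019 vs. 2018)
# and invented SDs of 40.0 and 35.0; the cohort sizes (7 and 8) are from the abstract.
d = cohens_d(285.7, 40.0, 7, 132.4, 35.0, 8)  # with these hypothetical SDs, d ≈ 4.1
```

A d above 0.8 is conventionally considered a large effect, which is why the reported d = 4.02, while striking, must be read against the very small cohort sizes noted in the conclusions.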
Affiliation(s)
- Spencer Sample
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Hussein Al Rimawi
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Beatrix Bérczi
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Alexander Chorley
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- McMaster Education Research, Innovation, and Theory (MERIT), Hamilton, Ontario, Canada
- Alim Pardhan
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Teresa M. Chan
- Emergency Medicine Postgraduate Training Program, McMaster Royal College of Physicians and Surgeons of Canada, Hamilton, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- McMaster Education Research, Innovation, and Theory (MERIT), Hamilton, Ontario, Canada
- Division of Education and Innovation, Department of Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
23
Hall AK, Schumacher DJ, Thoma B, Caretta-Weyer H, Kinnear B, Gruppen L, Cooke LJ, Frank JR, Van Melle E. Outcomes of competency-based medical education: A taxonomy for shared language. MEDICAL TEACHER 2021; 43:788-793. [PMID: 34038673 DOI: 10.1080/0142159x.2021.1925643] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
As the global transformation of postgraduate medical training continues, there are persistent calls for program evaluation efforts to understand the impact and outcomes of competency-based medical education (CBME) implementation. The measurement of a complex educational intervention such as CBME is challenging because of the multifaceted nature of activities and outcomes. What is needed, therefore, is an organizational taxonomy to both conceptualize and categorize multiple outcomes. In this manuscript we propose a taxonomy that builds on preceding works to organize CBME outcomes across three domains: focus (educational, clinical), level (micro, meso, macro), and timeline (training, transition to practice, practice). We also provide examples of how to conceptualize outcomes of educational interventions across medical specialties using this taxonomy. By proposing a shared language for outcomes of CBME, we hope that this taxonomy will help organize ongoing evaluation work and catalyze those seeking to engage in the evaluation effort to help understand the impact and outcomes of CBME.
Affiliation(s)
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
- Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Larry Gruppen
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
- Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Division of Neurology, Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Jason R Frank
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
24
Thoma B, Ellaway RH, Chan TM. From Utopia Through Dystopia: Charting a Course for Learning Analytics in Competency-Based Medical Education. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S89-S95. [PMID: 34183609 DOI: 10.1097/acm.0000000000004092] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
The transition to the assessment of entrustable professional activities as part of competency-based medical education (CBME) has substantially increased the number of assessments completed on each trainee. Many CBME programs are having difficulty synthesizing the increased amount of assessment data. Learning analytics are a way of addressing this by systematically drawing inferences from large datasets to support trainee learning, faculty development, and program evaluation. Early work in this field has tended to emphasize the significant potential of analytics in medical education. However, concerns have been raised regarding data security, data ownership, validity, and other issues that could transform these dreams into nightmares. In this paper, the authors explore these contrasting perspectives by alternately describing utopian and dystopian futures for learning analytics within CBME. Seeing learning analytics as an important way to maximize the value of CBME assessment data for organizational development, they argue that their implementation should continue within the guidance of an ethical framework.
Affiliation(s)
- Brent Thoma
- B. Thoma is associate professor, Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1124-5786
- Rachel H Ellaway
- R.H. Ellaway is professor, Department of Community Health Sciences, and director, Office of Health and Medical Education Scholarship, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada; ORCID: https://orcid.org/0000-0002-3759-6624
- Teresa M Chan
- T.M. Chan is associate professor, Division of Emergency Medicine, Department of Medicine, assistant dean, Program for Faculty Development, Faculty of Health Sciences, and adjunct scientist, McMaster Education Research, Innovation, and Theory (MERIT) program, McMaster University, Hamilton, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6104-462X
25
Caretta-Weyer HA, Chan T, Bigham BL, Kinnear B, Huwendiek S, Schumacher DJ. If we could turn back time: Imagining time-variable, competency-based medical education in the context of COVID-19. MEDICAL TEACHER 2021; 43:774-779. [PMID: 34027813 DOI: 10.1080/0142159x.2021.1925641] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
The COVID-19 pandemic has exposed a paradox in historical models of medical education: organizations responsible for applying consistent standards for progression have needed to adapt to training environments marked by inconsistency and change. Although some institutions have maintained their traditional requirements, others have accelerated their programs to rush nearly graduated trainees to the front lines. One interpretation of the unplanned shortening of the duration of training programs during a crisis is that standards have been lowered. But it is also possible that these trainees were examined according to the same standards as usual and were judged to have already met them. This paper discusses the impacts of the COVID-19 pandemic on the current workforce, provides an analysis of how competency-based medical education (CBME) in the context of the pandemic might have mitigated wide-scale disruption, and identifies structural barriers to achieving an ideal state. The paper further calls upon universities, health centres, governments, certifying bodies, regulatory authorities, and health care professionals to work collectively on a truly time-variable model of CBME. The pandemic has made clear that time variability in medical education already exists and should be adopted widely and formally. If our systems today had used a framework of outcome competencies, sequenced progression, tailored learning, focused instruction, and programmatic assessment, we may have been even more nimble in changing our systems to care for our patients with COVID-19.
Affiliation(s)
- Teresa Chan
- Department of Medicine, McMaster University, Hamilton, Canada
- McMaster Program for Education Research, Innovation, and Theory (MERIT), Hamilton, Canada
- Blair L Bigham
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada
- Benjamin Kinnear
- Division of Hospital Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Sören Huwendiek
- Department for Assessment and Evaluation, Institute for Medical Education, University of Bern, Bern, Switzerland
- Daniel J Schumacher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Division of Emergency Medicine, Cincinnati Children's Hospital, Cincinnati, OH, USA
26
Crickmer M, Lam T, Tavares W, Meshkat N. Do PGY-1 residents in Emergency Medicine have enough experiences in resuscitations and other clinical procedures to meet the requirements of a Competence by Design curriculum? CANADIAN MEDICAL EDUCATION JOURNAL 2021; 12:100-104. [PMID: 34249195 PMCID: PMC8263047 DOI: 10.36834/cmej.70921] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
BACKGROUND With the transition to a Competence by Design (CBD) curriculum, Fellow of the Royal College of Physicians in Emergency Medicine (FRCP-EM) training has created guidelines on experiences residents should have before progressing. We sought to quantify adult medical resuscitations and clinical procedures completed by PGY1 FRCP-EM residents and compare them to CBD requirements, with the aim of identifying areas of limited exposure requiring curriculum revisions prior to nation-wide CBD implementation. METHODS Twenty-two PGY1 residents from four FRCP-EM programs recorded their activities from July 2017 to June 2018 in an online log that tracked resuscitations and procedures along with the role assumed, supervision, and level of comfort. RESULTS In total, 515 resuscitations were logged, with a median of 15 per resident (range 0 to 98). The most frequent resuscitation was altered mental status and the least frequent was unstable dysrhythmia. A total of 557 procedures were logged, with a median of 75 per resident (range 8 to 273). The most frequent procedure was simple laceration repair and the least frequent was intraosseous access. CONCLUSIONS Unstable dysrhythmias and cardiorespiratory arrest, along with intraosseous access and arthrocentesis, are low-frequency clinical exposures. In the era of CBD, the misalignment of entrustable professional activity (EPA) targets and curriculum delivery should be monitored and reviewed to ensure expectations are realistic and that sufficient exposures are available.
Affiliation(s)
- Michael Crickmer
- Department of Emergency Medicine, University of Toronto, Ontario, Canada
- Tobi Lam
- Wilson Center, University of Toronto, Ontario, Canada
- Nazanin Meshkat
- Department of Emergency Medicine, University of Toronto, Ontario, Canada
27
Landreville JM, Frank JR, Cheung WJ. Does direct observation happen early in a new competency-based residency program? AEM EDUCATION AND TRAINING 2021; 5:e10591. [PMID: 33842816 PMCID: PMC8019151 DOI: 10.1002/aet2.10591] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/11/2020] [Revised: 02/16/2021] [Accepted: 02/23/2021] [Indexed: 06/01/2023]
Abstract
BACKGROUND A key component of competency-based medical education is workplace-based assessment, which includes observation (direct or indirect) of residents. Direct observation has been emphasized as an ideal form of assessment, yet challenges have been identified that may limit its adoption. At present, it remains unclear how often direct and indirect observation are being used within the clinical setting. The objective of this study was to describe patterns of observation in an emergency medicine competency-based program 2 years postimplementation. METHODS Emergency medicine residents (n = 19) recorded the type of observation they received (direct or indirect) following workplace-based entrustable professional activity (EPA) assessments from December 15, 2019, to April 30, 2020. Assessment forms were reviewed and analyzed to describe patterns of observation. RESULTS Assessments were collected on all 19 eligible residents (100% participation). A total of 1,070 EPA assessments were completed during the study period, of which 798 (74.6%) had the type of observation recorded. Of these recorded observations, 546 (68.4%) were directly observed and 252 (31.6%) were indirectly observed. The length of written comments contained within assessments following direct and indirect observation did not differ significantly. There was no significant association between resident gender and observation type or resident stage of training and observation type. Certain EPA assessments showed a clear preference toward either direct or indirect observation. CONCLUSIONS To the best of our knowledge, this study is the first to report patterns of observation in a competency-based residency program. The results suggest that direct observation can be quickly adopted as the primary means of workplace-based assessment. Indirect observation comprised a sizeable minority of observations and may be an underrecognized contributor to workplace-based assessment.
The preference toward either direct or indirect observation for certain EPA assessments suggests that the entrustable professional activity itself may influence the type of observation.
Affiliation(s)
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
28
Carey R, Wilson G, Bandi V, Mondal D, Martin LJ, Woods R, Chan T, Thoma B. Developing a dashboard to meet the needs of residents in a competency-based training program: A design-based research project. CANADIAN MEDICAL EDUCATION JOURNAL 2020; 11:e31-e45. [PMID: 33349752 PMCID: PMC7749685 DOI: 10.36834/cmej.69682] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/01/2023]
Abstract
BACKGROUND Canadian specialty programs are implementing Competence By Design, a competency-based medical education (CBME) program which requires frequent assessments of entrustable professional activities. To be used for learning, the large amount of assessment data needs to be interpreted by residents, but little work has been done to determine how visualizing and interacting with this data can be supported. Within the University of Saskatchewan emergency medicine residency program, we sought to determine how our residents' CBME assessment data should be presented to support their learning and to develop a dashboard that meets our residents' needs. METHODS We utilized a design-based research process to identify and address resident needs surrounding the presentation of their assessment data. Data was collected within the emergency medicine residency program at the University of Saskatchewan via four resident focus groups held over 10 months. Focus group discussions were analyzed using a grounded theory approach to identify resident needs. This guided the development of a dashboard which contained elements (data, analytics, and visualizations) that support their interpretation of the data. The identified needs are described using quotes from the focus groups as well as visualizations of the dashboard elements. RESULTS Resident needs were classified under three themes: (1) Provide guidance through the assessment program, (2) Present workplace-based assessment data, and (3) Present other assessment data. Seventeen dashboard elements were designed to address these needs. CONCLUSIONS Our design-based research process identified resident needs and developed dashboard elements to meet them. This work will inform the creation and evolution of CBME assessment dashboards designed to support resident learning.
Affiliation(s)
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Grayson Wilson
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Lynsey J. Martin
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa Chan
- Division of Emergency Medicine, Department of Medicine, McMaster University, Ontario, Canada
- McMaster Program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
29
Breaking down the silos in simulation-based education: Exploring, refining, and standardizing. CAN J EMERG MED 2020; 22:733-734. [DOI: 10.1017/cem.2020.471] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]