1
Birman NA, Vashdi DR, Miller-Mor Atias R, Riskin A, Zangen S, Litmanovitz I, Sagi D. Unveiling the paradoxes of implementing post graduate competency based medical education programs. MEDICAL TEACHER 2024:1-8. [PMID: 38803298] [DOI: 10.1080/0142159x.2024.2356826] [Received: 11/05/2023] [Accepted: 05/14/2024] [Indexed: 05/29/2024]
Abstract
PURPOSE Competency-based medical education (CBME) has gained prominence as an innovative model for postgraduate medical education, yet its implementation poses significant challenges, especially with regard to its sustainability. Drawing on paradox theory, we suggest that revealing the paradoxes underlying these challenges may contribute to our understanding of postgraduate competency-based medical education (PGCBME) implementation processes and serve as a first step toward improving implementation. The purpose of the current study is therefore to identify the paradoxes associated with PGCBME implementation. METHOD A qualitative study was conducted, as part of a larger action research project, using in-depth semi-structured interviews with fellows and educators in eight neonatal wards. RESULTS Analysis revealed that the PGCBME program examined in this study involves three different levels of standardization, each serving as one side of a paradoxical tension: (1) a paradox between the need for standardized assessment tools and the need for free-flowing, flexible assessment tools; (2) a paradox between the need for a standardized implementation process across all wards and the need for unique implementation protocols in each ward; and (3) a paradox between the need for a standardized meaning of competency proficiency and the need for flexible, personal competency achievement indicators. CONCLUSIONS Implementing PGCBME programs involves many challenges, some of which are paradoxical, i.e., two contradictory challenges in which solving one exacerbates the other. Revealing these paradoxes is important to navigating them successfully.
Affiliation(s)
- Noa A Birman
- University of Haifa, The Herta and Paul Amir Faculty of Social Sciences, School of Political Science, Department of Public Administration, Haifa, Israel
- Dana R Vashdi
- University of Haifa, The Herta and Paul Amir Faculty of Social Sciences, School of Political Science, Department of Public Administration, Haifa, Israel
- Rotem Miller-Mor Atias
- University of Haifa, The Herta and Paul Amir Faculty of Social Sciences, School of Political Science, Department of Public Administration, Haifa, Israel
- Arieh Riskin
- Technion Israel Institute of Technology, The Ruth and Bruce Rappaport Faculty of Medicine, Haifa, Israel
- Shmuel Zangen
- Ben-Gurion University of the Negev, Faculty of Health Sciences, Be'er-Sheva, Israel
- Ita Litmanovitz
- Tel Aviv University, Faculty of Medicine & Health Sciences, Tel-Aviv, Israel
- Doron Sagi
- The Israel Center for Medical Simulation, Sheba Medical Center, Tel-Hashomer, Ramat-Gan, Israel
2
Kalun P, Braund H, McGuire N, McEwen L, Mann S, Trier J, Schultz K, Curtis R, McGuire A, Pereira I, Dagnone D. Was it all worth it? A graduating resident perspective on CBME. MEDICAL TEACHER 2024:1-9. [PMID: 38742827] [DOI: 10.1080/0142159x.2024.2339408] [Received: 09/05/2023] [Accepted: 04/02/2024] [Indexed: 05/16/2024]
Abstract
BACKGROUND Our institution simultaneously transitioned all postgraduate specialty training programs to competency-based medical education (CBME) curricula. We explored experiences of CBME-trained residents graduating from five-year programs to inform the continued evolution of CBME in Canada. METHODS We utilized qualitative description to explore residents' experiences and inform continued CBME improvement. Data were collected from fifteen residents from various specialties through focus groups, interviews, and written responses. The data were analyzed inductively, using conventional content analysis. RESULTS We identified five overarching themes. Three themes provided insight into residents' experiences with CBME, describing discrepancies between the intentions of CBME and how it was enacted, challenges with implementation, and variation in residents' experiences. Two themes - adaptations and recommendations - could inform meaningful refinements for CBME going forward. CONCLUSIONS Residents graduating from CBME training programs offered a balanced perspective, including criticism and recognition of the potential value of CBME when implemented as intended. Their experiences provide a better understanding of residents' needs within CBME curricula, including greater balance and flexibility within programs of assessment and curricula. Many challenges that residents faced with CBME could be alleviated by greater accountability at program, institutional, and national levels. We conclude with actionable recommendations for addressing residents' needs in CBME.
Affiliation(s)
- Portia Kalun
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Heather Braund
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Natalie McGuire
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Laura McEwen
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Steve Mann
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Jessica Trier
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Department of Physical Medicine and Rehabilitation, Queen's University, Kingston, Canada
- Providence Care Hospital, Kingston, Canada
- Karen Schultz
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Rachel Curtis
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Department of Ophthalmology, Queen's University, Kingston, Canada
- Andrew McGuire
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Ian Pereira
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Damon Dagnone
- Queen's Health Sciences, Queen's University, Kingston, Canada
- Department of Emergency Medicine, Queen's University, Kingston, Canada
3
Frank JR, Hall AK, Oswald A, Dagnone JD, Brand PLP, Reznick R. From Competence by Time to Competence by Design: Lessons From A National Transformation Initiative. PERSPECTIVES ON MEDICAL EDUCATION 2024; 13:224-228. [PMID: 38550713] [PMCID: PMC10976982] [DOI: 10.5334/pme.1342] [Received: 02/29/2024] [Accepted: 03/06/2024] [Indexed: 04/02/2024]
Affiliation(s)
- Jason R. Frank
- Centre for Innovation in Medical Education, and Professor, Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K. Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Competency Based Medical Education, and Professor, Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- J. Damon Dagnone
- Department of Emergency Medicine, Queen’s University, Kingston, ON, Canada
- Standards and Accreditation, Royal College of Physicians & Surgeons of Canada, Ottawa, ON, Canada
- Paul L. P. Brand
- Clinical Medical Education, University Medical Centre and University of Groningen, The Netherlands
- Medical Education and Faculty Development, Isala Hospital, Zwolle, The Netherlands
- Richard Reznick
- Queen’s University, Immediate Past President, Royal College of Physicians and Surgeons of Canada, Canada
4
Sahi N, Humphrey-Murto S, Brennan EE, O'Brien M, Hall AK. Current use of simulation for EPA assessment in emergency medicine. CAN J EMERG MED 2024; 26:179-187. [PMID: 38374281] [DOI: 10.1007/s43678-024-00649-9] [Received: 01/04/2023] [Accepted: 01/12/2024] [Indexed: 02/21/2024]
Abstract
OBJECTIVE Approximately five years ago, the Royal College emergency medicine programs in Canada implemented a competency-based paradigm and introduced Entrustable Professional Activities (EPAs), units of professional activity used to assess trainees. Many competency-based medical education (CBME) curricula involve assessing for entrustment through observations of EPAs. While EPAs are frequently assessed in clinical settings, simulation is also used. This study aimed to characterize the use of simulation for EPA assessment. METHODS An interview guide was jointly developed by all study authors following best practices for survey development. National interviews were conducted with program directors or assistant program directors across all Royal College emergency medicine programs in Canada. Interviews were conducted over Microsoft Teams and were recorded and transcribed using the Microsoft Teams transcription service. Sample transcripts were analyzed for theme development, and themes were then reviewed by co-authors to ensure they were representative of the participants' views. RESULTS A 64.7% response rate was achieved. Simulation has been widely adopted by EM training programs. All interviewees supported the use of simulation for EPA assessment for many reasons; however, program directors acknowledged limitations, and thematic analysis revealed certain themes and tensions in using simulation for EPA assessment.
Thematic analysis revealed six major themes: widespread support for the use of simulation for EPA assessment, concerns regarding the potential for EPA assessment to become a "tick-box" exercise, logistical barriers limiting the use of simulation for EPA assessment, varied perceptions about the authenticity of using simulation for EPA assessment, the potential for simulation for EPA assessment to compromise learner psychological safety, and suggestions for optimizing the use of simulation for EPA assessment. CONCLUSIONS Our findings offer insight for other programs and specialties on how simulation for EPA assessment can best be utilized. Programs should draw on these findings when considering simulation for EPA assessment.
Affiliation(s)
- Nidhi Sahi
- Department of Innovation in Medical Education (DIME), University of Ottawa, Ottawa, ON, Canada
- Susan Humphrey-Murto
- Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Tier 2 Research Chair in Medical Education and Fellowship Director, Medical Education Research, University of Ottawa, Ottawa, ON, Canada
- Erin E Brennan
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Michael O'Brien
- Emergency Medicine, The Ottawa Hospital, Ottawa, ON, Canada
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, ON, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
5
Hall AK, Oswald A, Frank JR, Dalseg T, Cheung WJ, Cooke L, Gorman L, Brzezina S, Selvaratnam S, Wagner N, Hamstra SJ, Van Melle E. Evaluating Competence by Design as a Large System Change Initiative: Readiness, Fidelity, and Outcomes. PERSPECTIVES ON MEDICAL EDUCATION 2024; 13:95-107. [PMID: 38343556] [PMCID: PMC10854467] [DOI: 10.5334/pme.962] [Received: 03/08/2023] [Accepted: 08/17/2023] [Indexed: 02/15/2024]
Abstract
Program evaluation is an essential, but often neglected, activity in any transformational educational change. Competence by Design was a large-scale change initiative to implement a competency-based time-variable educational system in Canadian postgraduate medical education. A program evaluation strategy was an integral part of the build and implementation plan for CBD from the beginning, providing insights into implementation progress, challenges, unexpected outcomes, and impact. The Competence by Design program evaluation strategy was built upon a logic model and three pillars of evaluation: readiness to implement, fidelity and integrity of implementation, and outcomes of implementation. The program evaluation strategy drew on both internally driven studies and those performed by partners and invested others. A dashboard for the program evaluation strategy was created to transparently display a real-time view of Competence by Design implementation and facilitate continuous adaptation and improvement. The findings of the program evaluation for Competence by Design drove changes to all aspects of the Competence by Design implementation, aided engagement of partners, supported change management, and deepened our understanding of the journey required for transformational educational change in a complex national postgraduate medical education system. The program evaluation strategy for Competence by Design provides a framework for program evaluation for any large-scale change in health professions education.
Affiliation(s)
- Andrew K. Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Jason R. Frank
- Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Tim Dalseg
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, ON, Canada
- Warren J. Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Lara Cooke
- Neurology, Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada
- Lisa Gorman
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Stacey Brzezina
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Natalie Wagner
- Queen’s Health Sciences Office of Professional Development and Educational Scholarship, Queen’s University, Kingston, ON, Canada
- Stanley J. Hamstra
- Department of Surgery, University of Toronto, Toronto, ON, Canada
- Department of Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Family Medicine, Queen’s University, Kingston, ON, Canada
6
Braund H, Hanmore T, Dalgarno N, Baxter S. Using a rapid-cycle approach to evaluate implementation of competency-based medical education in ophthalmology. CANADIAN JOURNAL OF OPHTHALMOLOGY 2024; 59:40-45. [PMID: 36372134] [DOI: 10.1016/j.jcjo.2022.10.011] [Received: 04/15/2022] [Revised: 08/15/2022] [Accepted: 10/15/2022] [Indexed: 11/13/2022]
Abstract
OBJECTIVE As competency-based medical education is being implemented across Canada, there is an increasing need to evaluate the progress to date, including identification of strengths and weaknesses, to inform program development. Ophthalmology is preparing for a national launch in coming years. The purpose of this study was to describe key stakeholders' lived experiences in the competency-based medical education foundation-of-discipline stage in one ophthalmology department. DESIGN Using a case-study approach, a qualitative rapid-cycle evaluation was conducted during the 2018-2019 academic year. PARTICIPANTS Residents, faculty, academic advisors, competence committee members, the program director, the program administrator, and the educational consultant were invited to participate in the program evaluation. METHODS The rapid-cycle evaluation consisted of 2 evaluation cycles, with the first round of interviews and focus groups occurring in October 2018 and the second round in March 2019. Recommendations were implemented in November 2019 and June 2019. All data were analyzed thematically using NVivo. RESULTS Three main themes emerged across all data sets: developing a shared understanding (e.g., role expectations and changes to assessment), refining assessment processes and tools (e.g., the need for streamlining and clarification), and feedback (e.g., perceived benefits and value of narrative comments). CONCLUSIONS Exploring lived experiences in this study resulted in positive and immediate improvements to the residency program. The recommendations and approach will be useful to other Canadian departments and institutions as they prepare for Competence by Design.
Affiliation(s)
- Heather Braund
- Office of Professional Development and Educational Scholarship, Faculty of Health Sciences, Queen's University, Kingston, ON
- Tessa Hanmore
- Departments of Ophthalmology, Physical Medicine and Rehabilitation, and Psychiatry, Queen's University, Kingston, ON
- Nancy Dalgarno
- Office of Professional Development and Educational Scholarship, Faculty of Health Sciences, Queen's University, Kingston, ON
- Stephanie Baxter
- Department of Ophthalmology, Queen's University and Kingston Health Sciences Centre, Kingston, ON
7
Carter B, Sidrak J, Wagner B, Travis C, Nehler M, Christian N. Preliminary Development of a Program ABSITE Dashboard (PAD) to Guide Curriculum Innovation. JOURNAL OF SURGICAL EDUCATION 2024; 81:226-242. [PMID: 38195275] [DOI: 10.1016/j.jsurg.2023.10.014] [Received: 09/14/2023] [Accepted: 10/20/2023] [Indexed: 01/11/2024]
Abstract
PURPOSE General surgery residents' medical knowledge is assessed by the American Board of Surgery In-Training Examination (ABSITE). ABSITE score reports contain many metrics residency directors can use to assess resident progress and perform program evaluation. The purpose of this study was to develop a framework for evaluating program effectiveness in teaching specific subtest and subtopic areas of the ABSITE, using ABSITE score reports as an indicator. The aim is to demonstrate the identification of topic areas of weakness in program-wide ABSITE performance to guide proposed modifications of the general surgery residency curriculum, and to initiate development of a data-visualization dashboard to communicate these metrics. METHODS A single-institution retrospective study was performed using ABSITE score reports from general surgery residents at a large academic training program from 2017 to 2020. ABSITE performance metrics from 320 unique records were entered into a database; statistical analyses for linear trends and variance were conducted for standard scores, subtest standard scores, and incorrect subtest topics. Deviations from national average scores were calculated by subtracting the national average score from each subtest score for each trainee. Data are presented as medians or proportions and displayed to optimize visualization as a proof of concept for the development of a program dashboard. RESULTS Trends and variance in general surgery program and cohort performance on various elements of the ABSITE were visualized using figures and tables that represent a prototype for a program dashboard. Figure A1 demonstrates one example, in which a heatmap displays the median deviation from national average scores for each subtest by program year. Boxplots show the distribution of the deviation from the national average, the range of national average scores, and the recorded scores for each subtest by program year.
Trends in median deviation from the national average scores are displayed for each program year paneled by subtest, or for each exam year paneled by cohort. Median change in overall test scores from one program year to the next within a cohort is visualized as a table. Bar graphs show the most often missed topics across all program years, and heatmaps show the proportion of times each topic was missed for each subtest and exam year. CONCLUSIONS We demonstrate the use of ABSITE reports to identify specific thematic areas of opportunity for curriculum modification and innovation as an element of program evaluation. Through data analysis and visualization, we demonstrate the feasibility of creating a Program ABSITE Dashboard (PAD) that enhances the use of ABSITE reports for formative program evaluation and can guide modifications to surgery program curricula and educational practices.
Affiliation(s)
- Brian Carter
- University of Colorado School of Medicine, Aurora, Colorado
- Jason Sidrak
- University of Colorado School of Medicine, Aurora, Colorado
- Brandie Wagner
- Department of Biostatistics and Informatics, Colorado School of Public Health, Aurora, Colorado
- Claire Travis
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
- Mark Nehler
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
- Nicole Christian
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
8
Schumacher DJ, Kinnear B, Carraccio C, Holmboe E, Busari JO, van der Vleuten C, Lingard L. Competency-based medical education: The spark to ignite healthcare's escape fire. MEDICAL TEACHER 2024; 46:140-146. [PMID: 37463405] [DOI: 10.1080/0142159x.2023.2232097] [Indexed: 07/20/2023]
Abstract
High-value care is what patients deserve and what healthcare professionals should deliver. However, it is not what happens much of the time. Quality improvement master Dr. Don Berwick argued more than two decades ago that American healthcare needs an escape fire, which is a new way of seeing and acting in a crisis situation. While coined in the U.S. context, the analogy applies in other Western healthcare contexts as well. Therefore, in this paper, the authors revisit Berwick's analogy, arguing that medical education can, and should, provide the spark for such an escape fire across the globe. They assert that medical education can achieve this by fully embracing competency-based medical education (CBME) as a way to place medicine's focus on the patient. CBME targets training outcomes that prepare graduates to optimize patient care. The authors use the escape fire analogy to argue that medical educators must drop long-held approaches and tools; treat CBME implementation as an adaptive challenge rather than a technical fix; demand genuine, rich discussions and engagement about the path forward; and, above all, center the patient in all they do.
Affiliation(s)
- Daniel J Schumacher
- Pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Benjamin Kinnear
- Pediatrics and Internal Medicine, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Carol Carraccio
- Vice President of Competency-Based Medical Education, American Board of Pediatrics, Chapel Hill, North Carolina, USA
- Eric Holmboe
- Milestones Development and Evaluation Officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois, USA
- Jamiu O Busari
- Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Cees van der Vleuten
- Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, School of Health Professions Education, Maastricht University, Maastricht, The Netherlands
- Lorelei Lingard
- Department of Medicine, and Center for Education Research & Innovation, Schulich School of Medicine and Dentistry at Western University, London, Ontario, Canada
9
Moffatt-Bruce SD, Harris K, Rubens FD, Villeneuve PJ, Sundaresan RS. Competency-based training: Canadian cardiothoracic surgery. J Thorac Cardiovasc Surg 2024; 167:407-410. [PMID: 36702679] [DOI: 10.1016/j.jtcvs.2023.01.007] [Received: 10/03/2022] [Revised: 11/25/2022] [Accepted: 01/05/2023] [Indexed: 01/11/2023]
Affiliation(s)
- Susan D Moffatt-Bruce
- The Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- Department of Surgery, University of Ottawa, Ottawa, Ontario, Canada
- Ken Harris
- Department of Surgery, University of Western Ontario, London, Ontario, Canada
- Fraser D Rubens
- Department of Surgery, University of Ottawa, Ottawa, Ontario, Canada
10
Acker A, Leifso K, Crawford L, Braund H, Hawksby E, Hall AK, McEwen L, Dalgarno N, Dagnone JD. Lessons learned and new strategies for success: Evaluating the Implementation of Competency-Based Medical Education in Queen's Pediatrics. Paediatr Child Health 2023; 28:463-467. [PMID: 38638538] [PMCID: PMC11022870] [DOI: 10.1093/pch/pxad021] [Received: 08/30/2022] [Accepted: 04/17/2023] [Indexed: 04/20/2024]
Abstract
Objectives In 2017, Queen's University launched Competency-Based Medical Education (CBME) across 29 programs simultaneously. Two years post-implementation, we asked key stakeholders (faculty, residents, and program leaders) within the Pediatrics program for their perspectives on and experiences with CBME so far. Methods Program leadership explicitly described the intended outcomes of implementing CBME. Focus groups and interviews were conducted with all stakeholders to describe the enacted implementation. The intended and enacted implementations were compared to provide insight into adaptations needed for program improvement. Results Overall, stakeholders saw value in the concept of CBME. Residents felt they received more specific feedback, and found monthly Competence Committee (CC) meetings and Academic Advisors helpful. Conversely, all stakeholders noted that the increased expectations had led to a feeling of assessment fatigue. Faculty noted that direct observation and not knowing a resident's previous performance information were challenging. Residents wanted to see faculty initiate assessments, and wanted improved transparency around progress and promotion decisions. Discussion The results provided insight into how well the intended outcomes had been achieved as well as areas for improvement. Proposed adaptations included a need for increased direct observation and exploration of faculty access to residents' previous performance information. Education was provided on the performance expectations of residents and how progress and promotion decisions are made. As well, "flex blocks" were created to help residents customize their training experience to meet their learning needs. The results of this study can be used to inform and guide implementation and adaptations in other programs and institutions.
Affiliation(s)
- Amy Acker
- Department of Pediatrics, Queen’s University, Kingston, Canada
- Kirk Leifso
- Department of Pediatrics, Queen’s University, Kingston, Canada
- Heather Braund
- Scholarship and Simulation Education, Office of Professional Development and Educational Scholarship, Faculty of Health Sciences, Queen’s University, Kingston, Canada
- Emily Hawksby
- Department of Pediatrics, Queen’s University, Kingston, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Laura McEwen
- Department of Pediatrics, Queen’s University, Kingston, Canada
- Postgraduate Medical Education, Queen’s University, Kingston, Canada
- Nancy Dalgarno
- Education Scholarship, Office of Professional Development and Educational Scholarship, Kingston, Canada
- Department of Biomedical and Molecular Sciences, Faculty of Health Sciences, Queen’s University, Kingston, Canada
- Jeffrey Damon Dagnone
- Postgraduate Medical Education, Queen’s University, Kingston, Canada
- Department of Emergency Medicine, Queen’s University, Kingston, Canada
11
Szulewski A, Braund H, Dagnone DJ, McEwen L, Dalgarno N, Schultz KW, Hall AK. The Assessment Burden in Competency-Based Medical Education: How Programs Are Adapting. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:1261-1267. [PMID: 37343164] [DOI: 10.1097/acm.0000000000005305] [Indexed: 06/23/2023]
Abstract
Residents and faculty have described a burden of assessment related to the implementation of competency-based medical education (CBME), which may undermine its benefits. Although this concerning signal has been identified, little has been done to identify adaptations to address this problem. Grounded in an analysis of an early Canadian pan-institutional CBME adopter's experience, this article describes postgraduate programs' adaptations related to the challenges of assessment in CBME. From June 2019-September 2022, 8 residency programs underwent a standardized Rapid Evaluation guided by the Core Components Framework (CCF). Sixty interviews and 18 focus groups were held with invested partners. Transcripts were analyzed abductively using CCF, and ideal implementation was compared with enacted implementation. These findings were then shared back with program leaders, adaptations were subsequently developed, and technical reports were generated for each program. Researchers reviewed the technical reports to identify themes related to the burden of assessment with a subsequent focus on identifying adaptations across programs. Three themes were identified: (1) disparate mental models of assessment processes in CBME, (2) challenges in workplace-based assessment processes, and (3) challenges in performance review and decision making. Theme 1 included entrustment interpretation and lack of shared mindset for performance standards. Adaptations included revising entrustment scales, faculty development, and formalizing resident membership. Theme 2 involved direct observation, timeliness of assessment completion, and feedback quality. Adaptations included alternative assessment strategies beyond entrustable professional activity forms and proactive assessment planning. Theme 3 related to resident data monitoring and competence committee decision making. Adaptations included adding resident representatives to the competence committee and assessment platform enhancements. 
These adaptations represent responses to the concerning signal of significant burden of assessment within CBME being experienced broadly. The authors hope other programs may learn from their institution's experience and navigate the CBME-related assessment burden their invested partners may be facing.
Affiliation(s)
- Adam Szulewski
- A. Szulewski is associate professor, Departments of Emergency Medicine and Psychology, and educational scholarship lead, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-3076-6221
- Heather Braund
- H. Braund is associate director of scholarship and simulation education, Office of Professional Development and Educational Scholarship, and assistant (adjunct) professor, Department of Biomedical and Molecular Sciences and School of Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-9749-7193
- Damon J Dagnone
- D.J. Dagnone is associate professor, Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6963-7948
- Laura McEwen
- L. McEwen is director of assessment and evaluation of postgraduate medical education and assistant professor, Department of Pediatrics, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-2457-5311
- Nancy Dalgarno
- N. Dalgarno is director of education scholarship, Office of Professional Development and Educational Scholarship, and assistant professor (adjunct), Department of Biomedical and Molecular Sciences and Master of Health Professions Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-7932-9949
- Karen W Schultz
- K.W. Schultz is professor, Department of Family Medicine, and associate dean of postgraduate medical education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-0208-3981
- Andrew K Hall
- A.K. Hall is associate professor and vice chair of education, Department of Emergency Medicine, University of Ottawa, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1227-5397
12
Ott M, Apramian T, Cristancho S, Roth K. Unintended consequences of technology in competency-based education: a qualitative study of lessons learned in an OtoHNS program. J Otolaryngol Head Neck Surg 2023;52:55. [PMID: 37612760 PMCID: PMC10463791 DOI: 10.1186/s40463-023-00649-2]
Abstract
BACKGROUND Formative feedback and entrustment ratings on assessments of entrustable professional activities (EPAs) are intended to support learner self-regulation and inform entrustment decisions in competency-based medical education. Technology platforms have been developed to facilitate these goals, but little is known about their effects on these new assessment practices. This study investigates how users interacted with an e-portfolio in an OtoHNS surgery program transitioning to a Canadian approach to competency-based assessment, Competence by Design. METHODS We employed a sociomaterial perspective on technology and grounded theory methods of iterative data collection and analysis to study this OtoHNS program's use of an e-portfolio for assessment purposes. All residents (n = 14) and competency committee members (n = 7) participated in the study; data included feedback in resident portfolios, observation of use of the e-portfolio in a competency committee meeting, and a focus group with residents to explore how they used the e-portfolio and visualize interfaces that would better meet their needs. RESULTS Use of the e-portfolio to document, access, and interpret assessment data was problematic for both residents and faculty, but the residents faced more challenges. While faculty were slowed in making entrustment decisions, formative assessments were not actionable for residents. Workarounds to these barriers resulted in a "numbers game" residents played to acquire EPAs. Themes prioritized needs for searchable, contextual, visual, and mobile aspects of technology design to support use of assessment data for resident learning. CONCLUSION Best practices of technology design begin by understanding user needs. Insights from this study support recommendations for improved technology design centred on learner needs to provide OtoHNS residents a more formative experience of competency-based training.
Affiliation(s)
- Mary Ott
- Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Canada
- Tavis Apramian
- Division of Palliative Care, Department of Family & Community Medicine, University of Toronto, Toronto, Canada
- Sayra Cristancho
- Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Canada
- Kathryn Roth
- Department of Otolaryngology-Head and Neck Surgery, Schulich School of Medicine and Dentistry, Western University, London, Canada
13
Holmboe ES, Osman NY, Murphy CM, Kogan JR. The Urgency of Now: Rethinking and Improving Assessment Practices in Medical Education Programs. Acad Med 2023;98:S37-S49. [PMID: 37071705 DOI: 10.1097/acm.0000000000005251]
Abstract
Assessment is essential to professional development. Assessment provides the information needed to give feedback, support coaching and the creation of individualized learning plans, inform progress decisions, determine appropriate supervision levels, and, most importantly, help ensure patients and families receive high-quality, safe care in the training environment. While the introduction of competency-based medical education has catalyzed advances in assessment, much work remains to be done. First, becoming a physician (or other health professional) is primarily a developmental process, and assessment programs must be designed using a developmental and growth mindset. Second, medical education programs must have integrated programs of assessment that address the interconnected domains of implicit, explicit and structural bias. Third, improving programs of assessment will require a systems-thinking approach. In this paper, the authors first address these overarching issues as key principles that must be embraced so that training programs may optimize assessment to ensure all learners achieve desired medical education outcomes. The authors then explore specific needs in assessment and provide suggestions to improve assessment practices. This paper is by no means inclusive of all medical education assessment challenges or possible solutions. However, there is a wealth of current assessment research and practice that medical education programs can use to improve educational outcomes and help reduce the harmful effects of bias. The authors' goal is to help improve and guide innovation in assessment by catalyzing further conversations.
Affiliation(s)
- Eric S Holmboe
- E.S. Holmboe is chief, Research, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- Nora Y Osman
- N.Y. Osman is associate professor of medicine, Harvard Medical School, and director of undergraduate medical education, Brigham and Women's Hospital Department of Medicine, Boston, Massachusetts; ORCID: https://orcid.org/0000-0003-3542-1262
- Christina M Murphy
- C.M. Murphy is a fourth-year medical student and president, Medical Student Government at Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0003-3966-5264
- Jennifer R Kogan
- J.R. Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
14
Bentley H, Darras KE, Forster BB, Probyn L, Sedlic A, Hague CJ. Knowledge and Perceptions of Competency-Based Medical Education in Diagnostic Radiology Post-Graduate Medical Education: Identifying Priorities and Developing a Framework for Professional Development Activities. Can Assoc Radiol J 2023;74:487-496. [PMID: 36384331 DOI: 10.1177/08465371221137087]
Abstract
Introduction: We evaluated knowledge and perceptions of an established Competency-Based Medical Education (CBME) model developed by the Royal College of Physicians and Surgeons of Canada, Competence by Design (CBD), and identified evidence-informed priorities for professional development activities (PDAs). Materials and Methods: Teaching faculty and residents at a single, large diagnostic radiology post-graduate medical education (PGME) program were eligible to participate in this cross-sectional, survey-based study. Knowledge of CBD was evaluated through multiple choice questions (MCQs), which assessed participants' understanding of major principles and terms associated with CBD. Participants' perceptions of the anticipated impact of CBD on resident education and patient care were evaluated and priorities for PDAs were identified, which informed a framework for CBD PDAs. Results: Fifty faculty and residents participated. The faculty and resident response rates were 11.6% (n = 29/249) and 55.3% (n = 21/38), respectively. The mean ± standard deviation overall score on MCQs was 39.0% ± 20.4%. The majority of participants perceived the impact of CBD on resident education to be equivocal and to not impact patient care. Knowledge of CBD was not statistically significantly associated with participants' perceptions of the impact of CBD on either resident education or patient care (P > .05). Delivery of high-quality feedback was the greatest priority identified for PDAs. Discussion: Our results and proposed CBD PDAs framework may help to guide diagnostic radiology PGME programs in designing evidence-informed PDAs, which may meaningfully contribute to the successful implementation of CBD in diagnostic radiology PGME. As diagnostic radiology PGME programs throughout the world increasingly implement CBME models, evidence-informed PDAs will become of increasing importance.
Affiliation(s)
- Helena Bentley
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- Kathryn E Darras
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Bruce B Forster
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Linda Probyn
- Department of Medical Imaging, Faculty of Medicine, University of Toronto, Toronto, ON, Canada
- Anto Sedlic
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Cameron J Hague
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- Department of Radiology, St Paul's Hospital, Vancouver, BC, Canada
15
Chin M, Pack R, Cristancho S. "A whole other competence story": exploring faculty perspectives on the process of workplace-based assessment of entrustable professional activities. Adv Health Sci Educ Theory Pract 2023;28:369-385. [PMID: 35997910 DOI: 10.1007/s10459-022-10156-0]
Abstract
The centrality of entrustable professional activities (EPAs) in competency-based medical education (CBME) is predicated on the assumption that low-stakes, high-frequency workplace-based assessments used in a programmatic approach will result in accurate and defensible judgments of competence. While there have been conversations in the literature regarding the potential of this approach, only recently has the conversation begun to explore the actual experiences of clinical faculty in this process. The purpose of this qualitative study was to explore the process of EPA assessment for faculty in everyday practice. We conducted 18 semi-structured interviews with Anesthesia faculty at a Canadian academic center. Participants were asked to describe how they engage in EPA assessment in daily practice and the factors they considered. Interviews were audio-recorded, transcribed, and analysed using the constant comparative method of grounded theory. Participants in this study perceived two sources of tension in the EPA assessment process that influenced their scoring on official forms: the potential constraints of the assessment forms and the potential consequences of their assessment outcome. This was particularly salient in circumstances of uncertainty regarding the learner's level of competence. Ultimately, EPA assessment in CBME may be experienced as higher-stakes by faculty than officially recognized due to these tensions, suggesting a layer of discomfort and burden in the process that may potentially interfere with the goal of assessment for learning. Acknowledging and understanding the nature of this burden and identifying strategies to mitigate it are critical to achieving the assessment goals of CBME.
Affiliation(s)
- Melissa Chin
- Department of Anesthesia and Perioperative Medicine, London Health Sciences Centre, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Rachael Pack
- Center for Education Research and Innovation, University of Western Ontario, London, ON, Canada
- Sayra Cristancho
- Center for Education Research and Innovation, University of Western Ontario, London, ON, Canada
16
Paterson QS, Alrimawi H, Sample S, Bouwsema M, Anjum O, Vincent M, Cheung WJ, Hall A, Woods R, Martin LJ, Chan T. Examining enablers and barriers to entrustable professional activity acquisition using the theoretical domains framework: A qualitative framework analysis study. AEM Educ Train 2023;7:e10849. [PMID: 36994315 PMCID: PMC10041073 DOI: 10.1002/aet2.10849]
Abstract
Background Without a clear understanding of the factors contributing to the effective acquisition of high-quality entrustable professional activity (EPA) assessments, trainees, supervising faculty, and training programs may lack appropriate strategies for successful EPA implementation and utilization. The purpose of this study was to identify barriers and facilitators to acquiring high-quality EPA assessments in Canadian emergency medicine (EM) training programs. Methods We conducted a qualitative framework analysis study utilizing the Theoretical Domains Framework (TDF). Semistructured interviews with EM resident and faculty participants were audio recorded, deidentified, and coded line by line by two authors to extract themes and subthemes across the domains of the TDF. Results From 14 interviews (eight faculty and six residents) we identified, within the 14 TDF domains, major themes and subthemes for barriers and facilitators to EPA acquisition for both faculty and residents. The two most cited domains (and their frequencies) among residents and faculty were environmental context and resources (56) and behavioral regulation (48). Example strategies to improve EPA acquisition include orienting residents to the competency-based medical education (CBME) paradigm, recalibrating expectations relating to "low ratings" on EPAs, engaging in continuous faculty development to ensure familiarity and fluency with EPAs, and implementing longitudinal coaching programs between residents and faculty to encourage repeated longitudinal interactions and high-quality, specific feedback. Conclusions We identified key strategies to support residents, faculty, programs, and institutions in overcoming barriers and improving EPA assessment processes. This is an important step toward ensuring the successful implementation of CBME and the effective operationalization of EPAs within EM training programs.
Affiliation(s)
- Quinten S. Paterson
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Hussein Alrimawi
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Spencer Sample
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Melissa Bouwsema
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Omar Anjum
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Maggie Vincent
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Andrew Hall
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Lynsey J. Martin
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Teresa Chan
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
17
Lavoie P, Boyer L, Pepin J, Déry J, Lavoie-Tremblay M, Paquet M, Bolduc J. Multicentre implementation of a nursing competency framework at a provincial scale: A qualitative description of facilitators and barriers. J Eval Clin Pract 2023;29:263-271. [PMID: 36099281 DOI: 10.1111/jep.13760]
Abstract
RATIONALE Nurses are responsible for engaging in continuing professional development throughout their careers. This implies that they use tools such as competency frameworks to assess their level of development, identify their learning needs, and plan actions to achieve their learning goals. Although multiple competency frameworks and guidelines for their development have been proposed, the literature on their implementation in clinical settings is sparser. If the complexity of practice creates a need for context-sensitive competency frameworks, their implementation may also be subject to various facilitators and barriers. AIMS AND OBJECTIVES To document the facilitators and barriers to implementing a nursing competency framework on a provincial scale. METHODS This multicentre study was part of a provincial project to implement a nursing competency framework in Quebec, Canada, using a three-step process based on evidence from implementation science. Nurses' participation consisted of self-assessing their competencies using the framework. For this qualitative descriptive study, 58 stakeholders from 12 organizations involved in the first wave of implementation participated in group interviews to discuss their experience with the implementation process and their perceptions of facilitators and barriers. Data were subjected to thematic analysis. RESULTS Analysis of the data yielded five themes: finding the 'right unit' despite an unfavourable context; taking and protecting time for self-assessment; creating value around competency assessment; bringing the project as close to the nurses as possible; making the framework accessible. CONCLUSION This study was one of the first to document the large-scale, multi-site implementation of a nursing competency framework in clinical settings.
This project represented a unique challenge because it involved two crucial changes: adopting a competency-based approach focused on educational outcomes and accountability to the public and valorizing a learning culture where nurses become active stakeholders in their continuing professional development.
Affiliation(s)
- Patrick Lavoie
- Faculty of Nursing, Université de Montréal, Montreal, Québec, Canada
- Montreal Heart Institute Research Center, Montreal, Québec, Canada
- Louise Boyer
- Faculty of Nursing, Université de Montréal, Montreal, Québec, Canada
- Jacinthe Pepin
- Faculty of Nursing, Université de Montréal, Montreal, Québec, Canada
- Johanne Déry
- Faculty of Nursing, Université de Montréal, Montreal, Québec, Canada
- Maxime Paquet
- Department of Psychology, Université de Montréal, Montreal, Québec, Canada
- Jolianne Bolduc
- Faculty of Nursing, Université de Montréal, Montreal, Québec, Canada
18
Safavi AH, Sienna J, Strang BK, Hann C. Competency-Based Medical Education in Canadian Radiation Oncology Residency Training: an Institutional Implementation Pilot Study. J Cancer Educ 2023;38:274-284. [PMID: 34859361 DOI: 10.1007/s13187-021-02112-0]
Abstract
Canadian radiation oncology (RO) residency programs transitioned to a competency-based medical education (CBME) training model named Competence by Design (CBD) in July 2019. Prior to this, CBD was piloted in a single RO training program to characterize assessment completion and challenges of implementation. Six residents and seven staff participated in a mixed-methods study and were oriented to CBD. Four Entrustable Professional Activities were assessed over a 4-week-long block and documented using online assessment forms. Anonymized assessments were analyzed to characterize completion. Post-pilot surveys were completed by 4/6 residents and 5/7 staff. Semi-structured post-pilot focus groups were conducted with all residents. Assessments were requested and documented on a weekly basis. Narrative comments were found in 68.1% of assessments, of which 26.7% described specific examples of observed competence or recommendations for improvement. Three of five staff believed that assessments have a negative impact on clinical workflow. Three themes were identified: (1) direct observation is the most challenging aspect of CBD to implement; (2) feedback content can be improved; and (3) staff attitude, clinical workflow, and inaccessibility of assessment forms are the primary barriers to completing assessments. This study demonstrates that CBD assessments can be completed regularly in an outpatient radiation oncology setting and that implementation challenges include improving feedback quality, promoting direct observation, and continuing faculty development to improve perceptions of this assessment model. Further study is required to identify best practices and expectations for the discipline in the era of CBME.
Affiliation(s)
- Amir H Safavi
- Department of Radiation Oncology, University of Toronto, 700 University Ave 7th Floor, Toronto, Ontario, M5G 2M9, Canada
- Julianna Sienna
- Department of Oncology, Juravinski Cancer Centre, McMaster University, 699 Concession Street 3rd Floor, Hamilton, ON, L8V 5C2, Canada
- Barbara K Strang
- Department of Oncology, Juravinski Cancer Centre, McMaster University, 699 Concession Street 3rd Floor, Hamilton, ON, L8V 5C2, Canada
- Crystal Hann
- Department of Oncology, Juravinski Cancer Centre, McMaster University, 699 Concession Street 3rd Floor, Hamilton, ON, L8V 5C2, Canada
19
Stoffman JM. Overcoming the barriers to implementation of competence-based medical education in post-graduate medical education: a narrative literature review. Med Educ Online 2022;27:2112012. [PMID: 35959887 PMCID: PMC9377243 DOI: 10.1080/10872981.2022.2112012]
Abstract
To ensure that residents are equipped with the necessary skills for practice, competence-based medical education (CBME) represents a transformative change in postgraduate medical education, which is being progressively introduced across Canadian specialty residency programs. Successful implementation will require adjustments to curriculum, assessment, and evaluation, with careful attention to the unique needs in the local context, including resident and faculty development. This narrative review of the literature aimed to determine the potential barriers to the successful implementation of CBME and the strategies by which they can be addressed, with a specific consideration of the author's program in pediatrics in Manitoba. Eleven articles were identified with a specific focus on the implementation of CBME in the post-graduate setting, and 10 were included in the review after critical appraisal. Three key themes emerged from the articles: the value of broad stakeholder engagement and leadership, the importance of faculty and resident development, and the development of specific support systems for the educational curriculum. Different strategies were considered and contrasted for addressing these important themes. This review provides important insights and practical approaches to the barriers that should be useful as programs prepare for the implementation of CBME.
Affiliation(s)
- Jayson M. Stoffman
- Department of Pediatrics and Child Health, Director, Pediatric Postgraduate Medical Education, University of Manitoba, Winnipeg, MB, Canada
20
Bentley H, Darras KE, Forster BB, Sedlic A, Hague CJ. Review of Challenges to the Implementation of Competence by Design in Post-Graduate Medical Education: What Can Diagnostic Radiology Learn from the Experience of Other Specialty Disciplines? Acad Radiol 2022;29:1887-1896. [PMID: 35094947 DOI: 10.1016/j.acra.2021.11.025]
Abstract
Competence by Design (CBD) is a medical education initiative instituted by the Royal College of Physicians and Surgeons of Canada to improve the training of resident physicians in specialty disciplines. CBD integrates competency-based medical education with traditional specialty discipline post-graduate medical education (PGME) training through the application of an organizational framework of competencies. Various specialty disciplines in Canada have transitioned to CBD since 2017 in a staggered approach. Diagnostic radiology PGME programs in Canada are expected to transition to CBD in 2022 for the incoming resident physician cohort. This article reviews potential challenges to the implementation of CBD in diagnostic radiology PGME programs and proposes evidence-informed, targeted strategies and solutions to address these challenges. It is important for diagnostic radiology PGME programs to understand these challenges so that they can successfully implement this or similar medical education initiatives. Moreover, as radiology subspecialty PGME programs, such as nuclear medicine, interventional radiology, neuroradiology, and pediatric radiology, likewise transition to CBD, and as diagnostic radiology PGME programs internationally increasingly implement other competency-based medical education models, understanding these implementation challenges will become increasingly important.
Affiliation(s)
- Helena Bentley
- Department of Radiology, Faculty of Medicine, Gordon & Leslie Diamond Health Care Centre, University of British Columbia, 11th Floor, 2775 Laurel Street, Vancouver, BC V5Z 1M9, Canada
- Kathryn E Darras
- Department of Radiology, Faculty of Medicine, Gordon & Leslie Diamond Health Care Centre, University of British Columbia, 11th Floor, 2775 Laurel Street, Vancouver, BC V5Z 1M9, Canada; Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Bruce B Forster
- Department of Radiology, Faculty of Medicine, Gordon & Leslie Diamond Health Care Centre, University of British Columbia, 11th Floor, 2775 Laurel Street, Vancouver, BC V5Z 1M9, Canada; Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Anto Sedlic
- Department of Radiology, Faculty of Medicine, Gordon & Leslie Diamond Health Care Centre, University of British Columbia, 11th Floor, 2775 Laurel Street, Vancouver, BC V5Z 1M9, Canada; Department of Radiology, Vancouver General Hospital, Vancouver, BC, Canada
- Cameron J Hague
- Department of Radiology, Faculty of Medicine, Gordon & Leslie Diamond Health Care Centre, University of British Columbia, 11th Floor, 2775 Laurel Street, Vancouver, BC V5Z 1M9, Canada; Department of Radiology, St Paul's Hospital, Vancouver, BC, Canada
21
Read EK, Maxey C, Hecker KG. Longitudinal assessment of competency development at The Ohio State University using the competency-based veterinary education (CBVE) model. Front Vet Sci 2022;9:1019305. [PMID: 36387400 PMCID: PMC9642912 DOI: 10.3389/fvets.2022.1019305]
Abstract
With the development of the American Association of Veterinary Medical Colleges' Competency-Based Veterinary Education (CBVE) model, veterinary schools are reorganizing curricula and assessment guidelines, especially within the clinical rotation training elements. Specifically, programs are utilizing both competencies and entrustable professional activities (EPAs) as opportunities for gathering information about student development within and across clinical rotations. However, what evidence exists that use of the central tenets of the CBVE model (competency framework, milestones, and EPAs) improves our assessment practices and captures reliable and valid data to track competency development of students as they progress through their clinical year? Here, we report on validity evidence to support the use of scores from in-training evaluation report forms (ITERs) and workplace-based assessments of EPAs to evaluate competency progression within and across domains described in the CBVE, during the final-year clinical training period of The Ohio State University's College of Veterinary Medicine (OSU-CVM) program. The ITER, used at the conclusion of each rotation, was modified to include the CBVE competencies that were assessed by identifying the stage of student development on a series of descriptive milestones (from pre-novice to competent). Workplace-based assessments containing entrustment scales were used to assess EPAs from the CBVE model within each clinical rotation. Competency progression and entrustment scores were evaluated on each of the 31 rotations offered, and high-stakes decisions regarding student performance were determined by a collective review of all the ITERs and EPAs recorded for each learner across each semester and the entire year. Results from the class of 2021, collected on approximately 190 students across 31 rotations, are reported, comprising more than 55,299 total competency assessments combined with milestone placement and 2,799 complete EPAs.
Approximately 10% of the class was identified for remediation and received additional coaching support. Data collected longitudinally through the ITER on milestones provides initial validity evidence to support using the scores in higher stakes contexts such as identifying students for remediation and for determining whether students have met the necessary requirements to successfully complete the program. Data collected on entrustment scores did not, however, support such decision making. Implications are discussed.
Affiliation(s)
- Emma K. Read, College of Veterinary Medicine, The Ohio State University, Columbus, OH, United States
- Connor Maxey, Faculty of Veterinary Medicine, University of Calgary, Calgary, AB, Canada
- Kent G. Hecker, Faculty of Veterinary Medicine, University of Calgary, Calgary, AB, Canada; International Council for Veterinary Assessment, Bismarck, ND, United States
22
McKenzie-White J, Mubuuke AG, Westergaard S, Munabi IG, Bollinger RC, Opoka R, Mbalinda SN, Katete D, Manabe YC, Kiguli S. Evaluation of a competency based medical curriculum in a Sub-Saharan African medical school. BMC MEDICAL EDUCATION 2022; 22:724. [PMID: 36242004 PMCID: PMC9569118 DOI: 10.1186/s12909-022-03781-1]
Abstract
BACKGROUND Medical schools in Sub-Saharan Africa have adopted competency based medical education (CBME) to improve the quality of the graduates they train. In 2015, Makerere University College of Health Sciences (MaKCHS) implemented CBME for the Bachelor of Medicine and Bachelor of Surgery (MBChB) programme in order to produce doctors with the attributes required to address community health needs. However, no formal evaluation of the curriculum had been conducted to determine whether all established competencies are being assessed. OBJECTIVE To evaluate whether assessment methods within the MBChB curriculum address the stated competencies. METHODS The evaluation adopted a cross-sectional study design in which the MBChB curriculum was evaluated using an Essential Course Evidence Form (ECEF), developed to collect information about each assessment used in each course. Information was collected on: (1) Assessment title, (2) Description, (3) Competency domain, (4) Sub-competency addressed, (5) Student instructions, and (6) Grading method/details. Data were entered into a structured Access database. In addition, face-to-face interviews were conducted with faculty course coordinators. RESULTS The MBChB curriculum consisted of 62 courses over 5 years, focusing on preclinical skills in years 1-2 and clinical skills in years 3-5. Fifty-nine competencies were identified and aggregated into 9 domains. Fifty-eight competencies were assessed at least once in the curriculum. Faculty cited limited training in assessment, as well as large student numbers, as hindrances to designing robust assessments for the competencies. CONCLUSION CBME was successfully implemented, as evidenced by all but one of the 59 competencies within the nine established domains being assessed within the MBChB curriculum at MaKCHS. Faculty interviewed were largely aware of CBME; however, they indicated the need for more training in competency-based assessment to improve its implementation.
Affiliation(s)
- Jane McKenzie-White, Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Aloysius G Mubuuke, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Sara Westergaard, Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Ian G Munabi, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Robert C Bollinger, Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Robert Opoka, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Scovia N Mbalinda, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- David Katete, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
- Yukari C Manabe, Division of Infectious Diseases, School of Medicine, Johns Hopkins, Baltimore, USA
- Sarah Kiguli, College of Health Sciences, Makerere University, P.O. Box 7072, Kampala, Uganda
23
Ott MC, Pack R, Cristancho S, Chin M, Van Koughnett JA, Ott M. "The Most Crushing Thing": Understanding Resident Assessment Burden in a Competency-Based Curriculum. J Grad Med Educ 2022; 14:583-592. [PMID: 36274774 PMCID: PMC9580312 DOI: 10.4300/jgme-d-22-00050.1]
Abstract
BACKGROUND Competency-based medical education (CBME) was expected to increase the workload of assessment for graduate training programs in order to support the development of competence. Learning conditions were anticipated to improve through the provision of tailored learning experiences and more frequent, low-stakes assessments. Canada has adopted an approach to CBME called Competence by Design (CBD). However, in the process of implementation, learner anxiety and assessment burden have increased unexpectedly. To mitigate this unintended consequence, we need a stronger understanding of how resident assessment burdens emerge and function. OBJECTIVE This study investigates contextual factors leading to assessment burden on residents within the framework of CBD. METHODS Using constructivist grounded theory, residents were interviewed about their experiences of assessment. Participants (n=21) were a purposive sample from operative and perioperative training programs, recruited from 6 Canadian medical schools between 2019 and 2020. Self-determination theory was used as a sensitizing concept to categorize findings on types of assessment burden. RESULTS Nine assessment burdens were identified and organized by threats to the psychological needs for autonomy, relatedness, and competence. Burdens included: missed opportunities for self-regulated learning, lack of situational control, comparative assessment, lack of trust, constraints on time and resources, disconnects between teachers and learners, lack of clarity, unrealistic expectations, and limitations of assessment forms for providing meaningful feedback. CONCLUSIONS This study contributes a contextual understanding of how assessment burdens emerge as unmet psychological needs for autonomy, relatedness, and competence, with unintended consequences for learner well-being and intrinsic motivation.
Affiliation(s)
All authors are with Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada.
- Mary C. Ott, PhD, is Research Associate, Centre for Education Research and Innovation
- Rachael Pack, PhD, is Research Associate, Centre for Education Research and Innovation
- Sayra Cristancho, PhD, is Scientist, Centre for Education Research and Innovation
- Melissa Chin, MD, MHS, FRCPC, is CBME Lead, Department of Anesthesia and Perioperative Medicine
- Julie Ann Van Koughnett, MD, MEd, FRCSC, is Program Director, General Surgery, Department of Surgery
- Michael Ott
24
Bentley H, Lee J, Supersad A, Yu H, Dobson JL, Wong SA, Stewart M, Vatturi SS, Lebel K, Crivellaro P, Khatchikian AD, Hague CJ, Taylor J, Probyn L. Preparedness of Residents and Medical Students for the Transition to Competence by Design in Diagnostic Radiology Post-Graduate Medical Education. Can Assoc Radiol J 2022; 74:241-250. [PMID: 36083291 DOI: 10.1177/08465371221119139]
Abstract
Introduction: This needs assessment evaluated residents' and medical students' knowledge of Competence by Design (CBD), perceived benefits of and challenges or barriers to the transition to CBD for residents, and perceived overall preparedness for the transition to CBD in diagnostic radiology. Materials and Methods: All diagnostic radiology residents and medical students in Canada were eligible to participate in this national cross-sectional, questionnaire-based needs assessment. Knowledge of CBD was evaluated through participants' self-reported rating of their knowledge of CBD on a 5-point Likert scale. Perceived benefits of and challenges or barriers to the transition to CBD for residents were rank ordered. Participants' overall self-reported preparedness for the transition to CBD was assessed on a 5-point Likert scale. Data were summarized by descriptive statistics, and bivariate analyses were conducted as appropriate. Results: Ninety-four participants, comprising residents (n = 77) and medical students (n = 17), took part in this needs assessment. Participants' mean ± standard deviation self-reported rating of their overall knowledge of CBD was 2.86 ± 0.94. Provision of meaningful feedback to learners and learners' ability to identify their own educational needs were among the highest-ranked perceived benefits of the transition to CBD, while demands on time and increased frequency of evaluation were among the highest-ranked perceived challenges or barriers. Few participants reported being either "prepared" (4.7%) or "somewhat prepared" (14.0%) for the transition to CBD. Conclusion: Preparedness for the transition to CBD in diagnostic radiology may be improved. Targeted interventions to augment the preparedness of residents and medical students should be considered.
Affiliation(s)
- Helena Bentley, Department of Radiology, University of British Columbia, Vancouver, BC, Canada
- Juvel Lee, Department of Radiology, University of Ottawa, Ottawa, ON, Canada
- Alanna Supersad, Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, AB, Canada
- Hang Yu, Department of Radiology, University of Manitoba, Winnipeg, MB, Canada
- Jessica L Dobson, Department of Radiology, Dalhousie University, Halifax, NS, Canada
- Scott A Wong, Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, AB, Canada
- Matthew Stewart, Department of Radiology, University of Saskatchewan, Saskatoon, SK, Canada
- Kiana Lebel, Department of Radiology, University of Montreal, Montreal, QC, Canada
- Cameron J Hague, Department of Radiology, University of British Columbia, Vancouver, BC, Canada; Department of Radiology, St Paul's Hospital, Vancouver, BC, Canada
- Jana Taylor, Department of Radiology, McGill University, Montreal, QC, Canada
- Linda Probyn, Department of Radiology, University of Toronto, Toronto, ON, Canada
25
Hall AK, Oswald A. Optimising prospective entrustment: Defaulting on default progression. MEDICAL EDUCATION 2022; 56:870-872. [PMID: 35701709 DOI: 10.1111/medu.14856]
Affiliation(s)
- Andrew K Hall, Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada; Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- Anna Oswald, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; Division of Rheumatology, Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
26
Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. CANADIAN MEDICAL EDUCATION JOURNAL 2022; 13:14-27. [PMID: 36310899 PMCID: PMC9588183 DOI: 10.36834/cmej.73554]
Abstract
BACKGROUND Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires the assessment of entrustable professional activities (EPAs). Dashboards could be used to track the completion of EPAs to support program evaluation. METHODS Using a design-based research process, we identified program evaluation needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. We interviewed leaders from the emergency medicine program and the postgraduate medical education office at the University of Saskatchewan. Two investigators thematically analyzed interview transcripts to identify program evaluation needs, and the analysis was audited by two additional investigators. Identified needs were described using quotes, analytics, and visualizations. RESULTS Between July 1, 2019 and April 6, 2021, we conducted 17 interviews with six participants (two program leaders and four institutional leaders). Four needs emerged as themes: tracking changes in overall assessment metrics, comparing metrics to the assessment plan, evaluating rotation performance, and engagement with the assessment metrics. We addressed these needs by presenting analytics and visualizations within a dashboard. CONCLUSIONS We identified program evaluation needs related to EPA assessments and designed dashboard elements to meet them. This work will inform the development of other CBME assessment dashboards designed to support program evaluation.
Affiliation(s)
- Yusuf Yilmaz, Continuing Professional Development Office and McMaster Program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada; Department of Medical Education, Ege University, Turkey
- Robert Carey, Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan, Continuing Professional Development Office and McMaster Program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada; Division of Emergency Medicine, Department of Medicine, McMaster University
- Venkat Bandi, Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang, Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods, Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal, Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma, Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada; Royal College of Physicians and Surgeons of Canada, Ontario, Canada
27
Mueller V, Morais M, Lee M, Sherbino J. Implementation of Entrustable Professional Activities assessments in a Canadian obstetrics and gynecology residency program: a mixed methods study. CANADIAN MEDICAL EDUCATION JOURNAL 2022; 13:77-81. [PMID: 36310902 PMCID: PMC9588190 DOI: 10.36834/cmej.72567]
Abstract
BACKGROUND Since the implementation of competency-based medical education (CBME) across residency training programs in Canada, there has been limited research on how entrustable professional activity (EPA) assessments are used by faculty supervisors and residents. OBJECTIVE This study examines how EPA assessments are used in an Obstetrics and Gynecology residency program and the impact of their implementation on both groups. METHODS A mixed methods study design was used. Part one involved the aggregation of descriptive data on EPA assessment completion for postgraduate year 1 and 2 residents from July 2019 to May 2020. Part two involved a thematic analysis of semi-structured interviews with residents and faculty. RESULTS There was significant uptake of EPA assessments across community and teaching hospitals, with widespread contribution of assessment data from faculty. However, both residents and faculty reported that EPA assessments are not experienced as they were designed: as low-stakes assessments providing formative feedback. Residents and faculty also noted the increased administrative burden and related perceived stress among the resident group. CONCLUSIONS The implementation of EPA assessments is feasible across a variety of sites. However, previous measurement challenges remain. Neither residents nor faculty perceive that EPAs improve feedback, despite their intended purpose.
Affiliation(s)
- Valerie Mueller, Department of Obstetrics and Gynecology, McMaster University, Ontario, Canada
- Michelle Morais, Department of Obstetrics and Gynecology, McMaster University, Ontario, Canada
- Mark Lee, McMaster for Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
- Jonathan Sherbino, Department of Obstetrics and Gynecology, McMaster University, Ontario, Canada; McMaster for Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
28
Cheung WJ, Hall AK, Skutovich A, Brzezina S, Dalseg TR, Oswald A, Cooke LJ, Van Melle E, Hamstra SJ, Frank JR. Ready, set, go! Evaluating readiness to implement competency-based medical education. MEDICAL TEACHER 2022; 44:886-892. [PMID: 36083123 DOI: 10.1080/0142159x.2022.2041585]
Abstract
PURPOSE Organizational readiness is critical for the successful implementation of an innovation. We evaluated program readiness to implement Competence by Design (CBD), a model of competency-based medical education (CBME), among Canadian postgraduate training programs. METHODS A survey of program directors was distributed 1 month prior to CBD implementation in 2019. Questions were informed by the R = MC2 framework of organizational readiness and addressed: program motivation, general capacity for change, and innovation-specific capacity. An overall readiness score was calculated, and an ANOVA was conducted to compare overall readiness between disciplines. RESULTS The survey response rate was 42% (n = 79). The mean overall readiness score was 74% (range 30-98%). There was no difference in scores between disciplines. The majority of respondents agreed that successful implementation of CBD was a priority (74%) and that their leadership (94%) and their faculty and residents (87%) were supportive of change. Fewer perceived that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). Curriculum mapping, competence committees, and programmatic assessment activities were completed by >90% of programs, while <50% had engaged off-service disciplines. CONCLUSION Our study highlights important areas where programs excelled in their preparation for CBD, as well as common challenges that serve as targets for future intervention to improve program readiness for CBD implementation.
Affiliation(s)
- Warren J Cheung, Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada; Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- Andrew K Hall, Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada; Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- Stacey Brzezina, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- Timothy R Dalseg, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, Ontario, Canada
- Anna Oswald, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
- Lara J Cooke, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada
- Elaine Van Melle, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; Department of Family Medicine, Queen's University, Kingston, Ontario, Canada
- Stanley J Hamstra, Department of Surgery, University of Toronto, Toronto, Ontario, Canada; Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada; Department of Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois, USA; Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA
- Jason R Frank, Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada; Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
29
Rachul C, Collins B, Chan MK, Srinivasan G, Hamilton J. Rivalries for attention: insights from a realist evaluation of a postgraduate competency-based medical education implementation in Canada. BMC MEDICAL EDUCATION 2022; 22:583. [PMID: 35906632 PMCID: PMC9336173 DOI: 10.1186/s12909-022-03661-8]
Abstract
BACKGROUND Implementing competency-based medical education (CBME) in post-graduate medical education (PGME) is a complex process that requires multiple systemic changes in a complex system simultaneously engaged in multiple initiatives. These initiatives often compete for attention during the implementation of CBME and produce unintended and unanticipated consequences. Understanding the impact of this context is necessary for evaluating the effectiveness of CBME. The purpose of this study was to identify factors, such as contexts and processes, that contribute to the implementation of CBME. METHODS We conducted a realist evaluation using data collected from 15 programs through focus groups with residents (2 groups, n = 16) and faculty (one group, n = 8), and semi-structured interviews with program directors (n = 18) and program administrators (n = 12), from 2018 to 2021. Data were analyzed using a template analysis based on a coding framework developed from a sample of transcripts, the context-mechanism-outcomes framework for realist evaluations, and the core components of CBME. RESULTS The findings demonstrate that simultaneous initiatives in the academic health sciences system create a key context for CBME implementation - rivalries for attention - and specifically, the introduction of curricular management systems (CMS) concurrent with, but separate from, the implementation of CBME. This context influenced participants' participation, communication, and adaptation during CBME implementation, which led to change fatigue and unmet expectations for the collection and use of assessment data. CONCLUSIONS Rival initiatives, such as the concurrent implementation of a new CMS, can affect how programs implement CBME and greatly affect its outcomes. Mitigating the effects of rivals for attention through flexibility, clear communication, and training can facilitate effective implementation of CBME.
Affiliation(s)
- Christen Rachul, Office of Innovation and Scholarship in Medical Education, Max Rady College of Medicine, University of Manitoba, S204, Medical Services Building, 750 Bannatyne Ave, Winnipeg, MB, R3E 0W2, Canada; Department of Psychiatry, Max Rady College of Medicine, University of Manitoba, Winnipeg, Canada
- Benjamin Collins, Department of Anthropology, University of Manitoba, Winnipeg, Canada
- Ming-Ka Chan, Department of Pediatrics and Child Health, Max Rady College of Medicine, University of Manitoba, Winnipeg, Canada
- Ganesh Srinivasan, Department of Pediatrics and Child Health, Max Rady College of Medicine, University of Manitoba, Winnipeg, Canada
- Joanne Hamilton, Office of Innovation and Scholarship in Medical Education, Max Rady College of Medicine, University of Manitoba, S204, Medical Services Building, 750 Bannatyne Ave, Winnipeg, MB, R3E 0W2, Canada
30
Reimagining the Clinical Competency Committee to Enhance Education and Prepare for Competency-Based Time-Variable Advancement. J Gen Intern Med 2022; 37:2280-2290. [PMID: 35445932 PMCID: PMC9021365 DOI: 10.1007/s11606-022-07515-3]
Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized, although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective, evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs' CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident's developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans is emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
31
Hamza DM, Grierson L. Monitoring the integrity and usability of policy evaluation tools within an evolving sociocultural context: A demonstration of reflexivity using the CFPC Family Medicine Longitudinal Survey. J Eval Clin Pract 2022; 28:468-474. [PMID: 34904770 DOI: 10.1111/jep.13646]
Abstract
RATIONALE, AIMS, AND OBJECTIVES Over the last decade, policy changes have prompted Canadian medical education to emphasize a transformation to competency-based education and the subsequent development of evaluation tools. The pandemic provides a unique opportunity to emphasize the value of reflexive monitoring, a cyclical and iterative process of appraisal and adaptation, since tools are influenced by the social and cultural factors relevant at the time of their development. METHODS Deductive content analysis of documents and resources about the advancement of primary care, and reflexive monitoring of the Family Medicine Longitudinal Survey (FMLS), an evaluation tool for physician training. RESULTS The FMLS tool does not explore all training experiences that are currently relevant, including incorporating technology, infection control and safety, referrals to public health services, patient preferences for care modality, and trauma-informed, culturally safe care. CONCLUSION The results illustrate that reflection promotes the validity and usefulness of the data collected to inform policy performance and other initiatives.
Affiliation(s)
- Deena M Hamza, Postgraduate Medical Education, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Alberta, Canada
- Lawrence Grierson, Department of Family Medicine, McMaster University, Hamilton, Ontario, Canada
32
Do Resident Archetypes Influence the Functioning of Programs of Assessment? EDUCATION SCIENCES 2022. [DOI: 10.3390/educsci12050293]
Abstract
While most case studies consider how programs of assessment may influence residents' achievement, we engaged in a qualitative, multiple case study to model how resident engagement and performance can reciprocally influence the program of assessment. We conducted virtual focus groups with program leaders from four residency training programs from different disciplines (internal medicine, emergency medicine, neurology, and rheumatology) and institutions. We facilitated discussion with live screen-sharing to (1) improve upon a previously derived model of programmatic assessment and (2) explore how different resident archetypes (sample profiles) may influence their program of assessment. Participants agreed that differences in resident engagement and performance can influence their programs of assessment in some (mal)adaptive ways. For residents who are disengaged and weakly performing (of which there are a few), significantly more time is spent making sense of problematic evidence, arriving at a decision, and generating recommendations, whereas for residents who are engaged and performing strongly (the vast majority), significantly less effort is thought to be spent on discussion and formalized recommendations. These findings motivate us to fulfill the potential of programmatic assessment by more intentionally and strategically challenging those who are engaged and strongly performing, and by anticipating ways that weakly performing residents may strain existing processes.

33
Brown DR, Moeller JJ, Grbic D, Biskobing DM, Crowe R, Cutrer WB, Green ML, Obeso VT, Wagner DP, Warren JB, Yingling SL, Andriole DA. Entrustment Decision Making in the Core Entrustable Professional Activities: Results of a Multi-Institutional Study. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2022; 97:536-543. [PMID: 34261864 DOI: 10.1097/acm.0000000000004242] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
PURPOSE In 2014, the Association of American Medical Colleges defined 13 Core Entrustable Professional Activities (EPAs) that all graduating students should be ready to do with indirect supervision upon entering residency and commissioned a 10-school, 5-year pilot to test implementing the Core EPAs framework. In 2019, pilot schools convened trained entrustment groups (TEGs) to review assessment data and render theoretical summative entrustment decisions for class of 2019 graduates. Results were examined to determine the extent to which entrustment decisions could be made and the nature of these decisions. METHOD For each EPA considered (4-13 per student), TEGs recorded an entrustment determination (ready, progressing but not yet ready, evidence against student progressing, could not make a decision); confidence in that determination (none, low, moderate, high); and the number of workplace-based assessments (WBAs) considered (0->15) per determination. These individual student-level data were de-identified and merged into a multischool database; chi-square analysis tested the significance of associations between variables. RESULTS The 2,415 EPA-specific determinations (for 349 students by 4 participating schools) resulted in a decision of ready (n = 997/2,415; 41.3%), progressing but not yet ready (n = 558/2,415; 23.1%), or evidence against student progression (n = 175/2,415; 7.2%). No decision could be made for the remaining 28.4% (685/2,415), generally for lack of data. Entrustment determinations' distribution varied across EPAs (chi-square P < .001) and, for 10/13 EPAs, WBA availability was associated with making (vs not making) entrustment decisions (each chi-square P < .05). CONCLUSIONS TEGs were able to make many decisions about readiness for indirect supervision; yet less than half of determinations resulted in a decision of readiness to perform this EPA with indirect supervision. 
More work is needed at the 10 schools to enable authentic summative entrustment in the Core EPAs framework.
Affiliation(s)
- David R Brown
- D.R. Brown is professor, chief, Division of Family and Community Medicine, and interim chair, Department of Humanities, Health, and Society, Florida International University Herbert Wertheim College of Medicine, Miami, Florida; ORCID: http://orcid.org/0000-0002-5361-6664
- Jeremy J Moeller
- J.J. Moeller is associate professor and residency program director, Department of Neurology, Yale University School of Medicine, New Haven, Connecticut; ORCID: https://orcid.org/0000-0002-6135-5572
- Douglas Grbic
- D. Grbic is lead research analyst, Medical Education Research, Association of American Medical Colleges, Washington, DC
- Diane M Biskobing
- D.M. Biskobing is professor of medicine and associate dean of medical education, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Ruth Crowe
- R. Crowe is director of integrated clinical skills, director of practice of medicine, Office of Medical Education, and associate professor of medicine, New York University Grossman School of Medicine, New York, New York
- William B Cutrer
- W.B. Cutrer is associate dean for undergraduate medical education and associate professor of pediatrics (critical care medicine), Vanderbilt University School of Medicine, Nashville, Tennessee; ORCID: https://orcid.org/0000-0003-1538-9779
- Michael L Green
- M.L. Green is professor of medicine and director of student assessment, Teaching and Learning Center, Yale University School of Medicine, New Haven, Connecticut
- Vivian T Obeso
- V.T. Obeso is associate dean for curriculum and medical education and associate professor, Division of Internal Medicine, Department of Translational Medicine, Florida International University Herbert Wertheim College of Medicine, Miami, Florida
- Dianne P Wagner
- D.P. Wagner is associate dean for undergraduate medical education and professor of medicine, Michigan State University College of Human Medicine, East Lansing, Michigan
- Jamie B Warren
- J.B. Warren is associate professor, Division of Neonatology, and clinical vice chair, Department of Pediatrics, Oregon Health & Science University, Portland, Oregon; ORCID: https://orcid.org/0000-0003-4422-1502
- Sandra L Yingling
- S.L. Yingling is associate dean for educational planning and quality improvement, University of Illinois College of Medicine (Chicago, Peoria, Rockford, and Urbana), Chicago, Illinois
- Dorothy A Andriole
- D.A. Andriole is senior director, Medical Education Research, Association of American Medical Colleges, Washington, DC; ORCID: https://orcid.org/0000-0001-8902-1227

34
The Importance of Professional Development in a Programmatic Assessment System: One Medical School’s Experience. EDUCATION SCIENCES 2022. [DOI: 10.3390/educsci12030220] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
The Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (CCLCM) was created in 2004 as a 5-year undergraduate medical education program with a mission to produce future physician-investigators. CCLCM’s assessment system aligns with the principles of programmatic assessment. The curriculum is organized around nine competencies, where each competency has milestones that students use to self-assess their progress and performance. Throughout the program, students receive low-stakes feedback from a myriad of assessors across courses and contexts. With support of advisors, students construct portfolios to document their progress and performance. A separate promotion committee makes high-stakes promotion decisions after reviewing students’ portfolios. This case study describes a systematic approach to provide both student and faculty professional development essential for programmatic assessment. Facilitators, barriers, lessons learned, and future directions are discussed.

35
Nelson MR, Smith AR, Lawrence MG. The continuum of Allergy-Immunology Fellowship Training and continuing certification embraces competency based medical education. Ann Allergy Asthma Immunol 2022; 128:236-237. [PMID: 35216743 DOI: 10.1016/j.anai.2021.12.019] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2021] [Revised: 12/21/2021] [Accepted: 12/27/2021] [Indexed: 10/19/2022]
Affiliation(s)
- Michael R Nelson
- University of Virginia School of Medicine, Charlottesville, VA; American Board of Allergy and Immunology, Philadelphia, PA.
- Anna R Smith
- University of Virginia School of Medicine, Charlottesville, VA

36
An adaptation-focused evaluation of Canada's first competency-based medical education implementation in radiology. Eur J Radiol 2021; 147:110109. [PMID: 34968900 DOI: 10.1016/j.ejrad.2021.110109] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/16/2021] [Accepted: 12/09/2021] [Indexed: 11/22/2022]
Abstract
OBJECTIVES Systematic program evaluation of the Queen's University diagnostic radiology residency program following transition to a competency-based medical education (CBME) curriculum. METHODS Rapid Evaluation methodology and the Core Components Framework were utilized to measure CBME implementation. A combination of interviews and focus groups were held with program leaders (n = 6), faculty (n = 10), both CBME stream and traditional stream residents (n = 6), and program staff (n = 2). Interviews and focus groups were transcribed and analyzed abductively. The study team met with program leaders to review common themes and plan potential adaptations. RESULTS Strengths of CBME implementation included more frequent and timely feedback as well as the role of the Academic Advisor. However, frontline faculty felt insufficiently supported with regards to the theory and practical implementation of the new curriculum and found assessment tools unintuitive. The circumstances surrounding the curricular implementation also resulted in some negative sentiment. Additional faculty and resident education workshops were identified as areas for improvement, as were changes to assessment tools for increased clarity. Residents overall viewed the changes favorably, with traditional stream residents indicating that they also had a desire for increased feedback. CONCLUSIONS Rapid Evaluation is an effective method for program assessment following curricular change in diagnostic radiology. A departmental champion driving enthusiasm for change from within may be valuable. Adequate resident and faculty education is key to maximize change and smooth the transition. Advances in knowledge: This study provides insights for other radiology training programs transitioning to a CBME framework and provides a structure for programmatic assessment.

37
Batt A, Williams B, Rich J, Tavares W. A Six-Step Model for Developing Competency Frameworks in the Healthcare Professions. Front Med (Lausanne) 2021; 8:789828. [PMID: 34970566 PMCID: PMC8713730 DOI: 10.3389/fmed.2021.789828] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2021] [Accepted: 11/23/2021] [Indexed: 11/13/2022] Open
Abstract
Competency frameworks are developed for a variety of purposes, including describing professional practice and informing education and assessment frameworks. Despite the volume of competency frameworks developed in the healthcare professions, guidance remains unclear and is inconsistently adhered to (perhaps in part due to a lack of organizing frameworks); there is variability in methodological choices, inconsistently reported outputs, and a lack of evaluation of frameworks. As such, we proposed the need for improved guidance. In this paper, we outline a six-step model for developing competency frameworks that is designed to address some of these shortcomings. The six steps comprise [1] identifying purpose, intended uses, scope, and stakeholders; [2] theoretically informed ways of identifying the contexts of complex, "real-world" professional practice, which includes [3] aligned methods and means by which practice can be explored; [4] the identification and specification of competencies required for professional practice, [5] how to report the process and outputs of identifying such competencies, and [6] built-in strategies to continuously evaluate, update and maintain competency framework development processes and outputs. The model synthesizes and organizes existing guidance and literature, and furthers this existing guidance by highlighting the need for a theoretically-informed approach to describing and exploring practice that is appropriate, as well as offering guidance for developers on reporting the development process and outputs, and planning for the ongoing maintenance of frameworks.
Affiliation(s)
- Alan Batt
- Department of Paramedicine, Monash University, Frankston, VIC, Australia
- McNally Project for Paramedicine Research, Toronto, ON, Canada
- Brett Williams
- Department of Paramedicine, Monash University, Frankston, VIC, Australia
- Jessica Rich
- Assessment and Evaluation, Faculty of Education, Queen's University, Kingston, ON, Canada
- Walter Tavares
- Department of Paramedicine, Monash University, Frankston, VIC, Australia
- McNally Project for Paramedicine Research, Toronto, ON, Canada
- The Wilson Centre, University of Toronto, Toronto, ON, Canada
- Post Graduate Medical Education and Continuing Professional Development, Faculty of Medicine, University of Toronto, Toronto, ON, Canada

38
Mador B, Daniels VJ, Oswald A, Turner SR. Learner Phenotypes in Competency-Based Medical Education. MEDICAL SCIENCE EDUCATOR 2021; 31:2061-2064. [PMID: 34956713 PMCID: PMC8651902 DOI: 10.1007/s40670-021-01380-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 08/14/2021] [Indexed: 06/14/2023]
Abstract
With the launch of competency-based medical education internationally, the practical realities of implementation have failed to live up to many of the proposed theoretical benefits. Specifically, as educators we have observed a number of assessment challenges that seem directly related to identified learner phenotypes. This commentary seeks to describe these specific learner phenotypes, along with actionable recommendations for programs and their competence committees in order to overcome the associated obstacles in assessment. We describe strategies related to both the individual and program level, which can be utilized for both short-term adjustments and long-term programmatic transformation.
Affiliation(s)
- Brett Mador
- Department of Surgery, University of Alberta, Edmonton, AB, Canada
- Vijay J. Daniels
- Department of Medicine, University of Alberta, Edmonton, AB, Canada
- Anna Oswald
- Department of Medicine, University of Alberta, Edmonton, AB, Canada
- Simon R. Turner
- Department of Surgery, University of Alberta, Edmonton, AB, Canada

39
Soukoulis V, Martindale J, Bray MJ, Bradley E, Gusic ME. The use of EPA assessments in decision-making: Do supervision ratings correlate with other measures of clinical performance? MEDICAL TEACHER 2021; 43:1323-1329. [PMID: 34242113 DOI: 10.1080/0142159x.2021.1947480] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
BACKGROUND Entrustable professional activities (EPAs) have been introduced as a framework for teaching and assessment in competency-based educational programs. With growing use, has come a call to examine the validity of EPA assessments. We sought to explore the correlation of EPA assessments with other clinical performance measures to support use of supervision ratings in decisions about medical students' curricular progression. METHODS Spearman rank coefficients were used to determine correlation of supervision ratings from EPA assessments with scores on clerkship evaluations and performance on an end-of-clerkship-year Objective Structured Clinical Examination (CPX). RESULTS Both overall clinical evaluation items score (rho 0.40; n = 166) and CPX patient encounter domain score (rho 0.31; n = 149) showed significant correlation with students' overall mean EPA supervision rating during the clerkship year. There was significant correlation between mean supervision rating for EPA assessments of history, exam, note, and oral presentation skills with scores for these skills on clerkship evaluations; less so on the CPX. CONCLUSIONS Correlation of EPA supervision ratings with commonly used clinical performance measures offers support for their use in undergraduate medical education. Data supporting the validity of EPA assessments promotes stakeholders' acceptance of their use in summative decisions about students' readiness for increased patient care responsibility.
Affiliation(s)
- Victor Soukoulis
- Division of Cardiovascular Medicine, University of Virginia School of Medicine, Charlottesville, VA, USA
- James Martindale
- Center for Medical Education Research and Scholarly Innovation, University of Virginia School of Medicine, Charlottesville, VA, USA
- Megan J Bray
- Center for Medical Education Research and Scholarly Innovation and Department of Obstetrics and Gynecology, University of Virginia School of Medicine, Charlottesville, VA, USA
- Elizabeth Bradley
- Center for Medical Education Research and Scholarly Innovation, University of Virginia School of Medicine, Charlottesville, VA, USA
- Maryellen E Gusic
- Center for Medical Education Research and Scholarly Innovation and Department of Pediatrics, University of Virginia School of Medicine, Charlottesville, VA, USA

40
Hamza DM, Regehr G. Eco-Normalization: Evaluating the Longevity of an Innovation in Context. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S48-S53. [PMID: 34348375 DOI: 10.1097/acm.0000000000004318] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
PURPOSE When initiating an educational innovation, successful implementation and meaningful, lasting change can be elusive. This elusiveness stems from the difficulty of introducing changes into complex ecosystems. Program evaluation models that focus on implementation fidelity examine the inner workings of an innovation in the real-world context. However, the methods by which fidelity is typically examined may inadvertently limit thinking about the trajectory of an innovation over time. Thus, a new approach is needed, one that focuses on whether the conditions observed during the implementation phase of an educational innovation represent a foundation for meaningful, long-lasting change. METHOD Through a critical review, authors examined relevant models from implementation science and developed a comprehensive framework that shifts the focus of program evaluation from exploring snapshots in time to assessing the trajectory of an innovation beyond the implementation phase. RESULTS Durable and meaningful "normalization" of an innovation is rooted in how the local aspirations and practices of the institutional system and the people doing the work interact with the grand aspirations and features of the innovation. Borrowing from Normalization Process Theory, the Consolidated Framework for Implementation Research, and Reflexive Monitoring in Action, the authors developed a framework, called Eco-Normalization, that highlights 6 critical questions to be considered when evaluating the potential longevity of an innovation. CONCLUSIONS When evaluating an educational innovation, the Eco-Normalization model focuses our attention on the ecosystem of change and the features of the ecosystem that may contribute to (or hinder) the longevity of innovations in context.
Affiliation(s)
- Deena M Hamza
- D.M. Hamza is an implementation scientist and the research and evaluation lead for postgraduate medical education, University of Alberta, Edmonton, Canada; ORCID: https://orcid.org/0000-0001-8943-2165
- Glenn Regehr
- G. Regehr is professor, Department of Surgery, and scientist, Centre for Health Education Scholarship, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada; ORCID: https://orcid.org/0000-0002-3144-331X

41
Laureano M, Mithoowani S, Tseng EK, Zeller MP. Improving Medical Education in Hematology and Transfusion Medicine in Canada: Standards and Limitations. ADVANCES IN MEDICAL EDUCATION AND PRACTICE 2021; 12:1153-1163. [PMID: 34675742 PMCID: PMC8504712 DOI: 10.2147/amep.s247159] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/16/2021] [Accepted: 09/04/2021] [Indexed: 06/13/2023]
Abstract
The paradigm of medical education is evolving with the introduction of competency-based medical education (CBME), and it is crucial that residency programs adapt. In this paper, we provide an overview of the current status of medical education in Hematology in Canada, including models of training, assessment methods, anticipated challenges, and the effects of the COVID-19 pandemic. We will also discuss additional training that can be pursued after a Hematology residency, with a particular focus on Transfusion Medicine, as it was one of the first programs to implement a competency-based curriculum. Finally, we explore the future directions of medical education in Hematology and Transfusion Medicine.
Affiliation(s)
- Marissa Laureano
- Department of Medicine, Department of Pathology and Molecular Medicine, McMaster University and Canadian Blood Services, Hamilton, ON, Canada
- Siraj Mithoowani
- Division of Hematology & Thromboembolism, Department of Medicine, McMaster University, Hamilton, ON, Canada
- Eric K Tseng
- Division of Hematology/Oncology, St. Michael’s Hospital, University of Toronto, Toronto, ON, Canada
- Michelle P Zeller
- Division of Hematology & Thromboembolism, McMaster Centre for Transfusion Research and Canadian Blood Services, Hamilton, ON, Canada

42
Robinson TJG, Wagner N, Szulewski A, Dudek N, Cheung WJ, Hall AK. Exploring the use of rating scales with entrustment anchors in workplace-based assessment. MEDICAL EDUCATION 2021; 55:1047-1055. [PMID: 34060651 DOI: 10.1111/medu.14573] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/01/2020] [Revised: 04/07/2021] [Accepted: 05/26/2021] [Indexed: 06/12/2023]
Abstract
PURPOSE Competency-based medical education (CBME) has prompted widespread implementation of workplace-based assessment (WBA) tools using entrustment anchors. This study aimed to identify factors that influence faculty's rating choices immediately following assessment and explore their experiences using WBAs with entrustment anchors, specifically the Ottawa Surgical Competency Operating Room Evaluation scale. METHOD A convenience sample of 50 semi-structured interviews with Emergency Medicine (EM) physicians from a single Canadian hospital were conducted between July and August 2019. All interviews occurred within two hours of faculty completing a WBA of a trainee. Faculty were asked what they considered when rating the trainee's performance and whether they considered an alternate rating. Two team members independently analysed interview transcripts using conventional content analysis with line-by-line coding to identify themes. RESULTS Interviews captured interactions between 70% (26/37) of full-time EM faculty and 86% (19/22) of EM trainees. Faculty most commonly identified the amount of guidance the trainee required as influencing their rating. Other variables such as clinical context, trainee experience, past experiences with the trainee, perceived competence and confidence were also identified. While most faculty did not struggle to assign ratings, some had difficulty interpreting the language of entrustment anchors, being unsure whether their assessment should be retrospective or prospective in nature, and if/how the assessment should change whether they were 'in the room' or not. CONCLUSIONS By going to the frontline during WBA encounters, this study captured authentic and honest reflections from physicians immediately engaged in assessment using entrustment anchors. 
While many of the factors identified are consistent with previous retrospective work, we highlight how some faculty consider factors outside the prescribed approach and struggle with the language of entrustment anchors. These results further our understanding of 'in-the-moment' assessments using entrustment anchors and may facilitate effective faculty development regarding WBA in CBME.
Affiliation(s)
- Natalie Wagner
- Department of Biomedical & Molecular Sciences, Queen's University, Kingston, ON, Canada
- Office of Professional Development & Educational Scholarship, Queen's University, Kingston, ON, Canada
- Adam Szulewski
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Department of Psychology, Queen's University, Kingston, ON, Canada
- Nancy Dudek
- Department of Medicine and The Ottawa Hospital, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Warren J Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada

43
Competency-Based Medical Education in Canadian Radiation Oncology Residency Training: An Institutional Implementation Pilot Study. Int J Radiat Oncol Biol Phys 2021. [DOI: 10.1016/j.ijrobp.2021.05.176] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]

44
Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. CANADIAN MEDICAL EDUCATION JOURNAL 2021; 12:48-64. [PMID: 34567305 PMCID: PMC8463237 DOI: 10.36834/cmej.72067] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
BACKGROUND Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires frequent assessments of entrustable professional activities (EPAs). Faculty struggle to provide helpful feedback and assign appropriate entrustment scores. CBME faculty development initiatives rarely incorporate teaching metrics. Dashboards could be used to visualize faculty assessment data to support faculty development. METHODS Using a design-based research process, we identified faculty development needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. Data was collected within the emergency medicine residency program at the University of Saskatchewan through interviews with program leaders, faculty development experts, and faculty participating in development sessions. Two investigators thematically analyzed interview transcripts to identify faculty needs that were audited by a third investigator. The needs were described using representative quotes and the dashboard elements designed to address them. RESULTS Between July 1, 2019 and December 11, 2020 we conducted 15 interviews with nine participants (two program leaders, three faculty development experts, and four faculty members). Three needs emerged as themes from the analysis: analysis of assessments, contextualization of assessments, and accessible reporting. We addressed these needs by designing an accessible dashboard to present contextualized quantitative and narrative assessment data for each faculty member. CONCLUSIONS We identified faculty development needs related to EPA assessments and designed dashboard elements to meet them. The resulting dashboard was used for faculty development sessions. This work will inform the development of CBME assessment dashboards for faculty.
Affiliation(s)
- Yusuf Yilmaz
- Continuing Professional Development Office and McMaster Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
- Department of Medical Education, Ege University, Izmir, Turkey
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Continuing Professional Development Office and McMaster Education Research, Innovation, and Theory (MERIT) Program, McMaster University, Ontario, Canada
- Emergency Medicine, Department of Medicine, McMaster University, Ontario, Canada
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada

45
Van Melle E, Hall AK, Schumacher DJ, Kinnear B, Gruppen L, Thoma B, Caretta-Weyer H, Cooke LJ, Frank JR. Capturing outcomes of competency-based medical education: The call and the challenge. MEDICAL TEACHER 2021; 43:794-800. [PMID: 34121596 DOI: 10.1080/0142159x.2021.1925640] [Citation(s) in RCA: 27] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
There is an urgent need to capture the outcomes of the ongoing global implementation of competency-based medical education (CBME). However, the measurement of downstream outcomes following educational innovations, such as CBME is fraught with challenges stemming from the complexities of medical training, the breadth and variability of inputs, and the difficulties attributing outcomes to specific educational elements. In this article, we present a logic model for CBME to conceptualize an impact pathway relating to CBME and facilitate outcomes evaluation. We further identify six strategies to mitigate the challenges of outcomes measurement: (1) clearly identify the outcome of interest, (2) distinguish between outputs and outcomes, (3) carefully consider attribution versus contribution, (4) connect outcomes to the fidelity and integrity of implementation, (5) pay attention to unanticipated outcomes, and (6) embrace methodological pluralism. Embracing these challenges, we argue that careful and thoughtful evaluation strategies will move us forward in answering the all-important question: Are the desired outcomes of CBME being achieved?
Collapse
Affiliation(s)
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
| | - Andrew K Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, Queen's University, Kingston,Canada
| | - Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
| | - Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
| | - Larry Gruppen
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
| | - Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
| | - Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
| | - Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Clinical Neurosciences, Division of Neurology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Jason R Frank
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
46
Hall AK, Schumacher DJ, Thoma B, Caretta-Weyer H, Kinnear B, Gruppen L, Cooke LJ, Frank JR, Van Melle E. Outcomes of competency-based medical education: A taxonomy for shared language. MEDICAL TEACHER 2021; 43:788-793. [PMID: 34038673 DOI: 10.1080/0142159x.2021.1925643] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Indexed: 06/12/2023]
Abstract
As the global transformation of postgraduate medical training continues, there are persistent calls for program evaluation efforts to understand the impact and outcomes of competency-based medical education (CBME) implementation. The measurement of a complex educational intervention such as CBME is challenging because of the multifaceted nature of activities and outcomes. What is needed, therefore, is an organizational taxonomy to both conceptualize and categorize multiple outcomes. In this manuscript we propose a taxonomy that builds on preceding works to organize CBME outcomes across three domains: focus (educational, clinical), level (micro, meso, macro), and timeline (training, transition to practice, practice). We also provide examples of how to conceptualize outcomes of educational interventions across medical specialties using this taxonomy. By proposing a shared language for outcomes of CBME, we hope that this taxonomy will help organize ongoing evaluation work and catalyze those seeking to engage in the evaluation effort to help understand the impact and outcomes of CBME.
Affiliation(s)
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
- Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Larry Gruppen
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
- Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Division of Neurology, Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Jason R Frank
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
47
Thoma B, Ellaway RH, Chan TM. From Utopia Through Dystopia: Charting a Course for Learning Analytics in Competency-Based Medical Education. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S89-S95. [PMID: 34183609 DOI: 10.1097/acm.0000000000004092] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Indexed: 06/13/2023]
Abstract
The transition to the assessment of entrustable professional activities as part of competency-based medical education (CBME) has substantially increased the number of assessments completed on each trainee. Many CBME programs are having difficulty synthesizing the increased amount of assessment data. Learning analytics are a way of addressing this by systematically drawing inferences from large datasets to support trainee learning, faculty development, and program evaluation. Early work in this field has tended to emphasize the significant potential of analytics in medical education. However, concerns have been raised regarding data security, data ownership, validity, and other issues that could transform these dreams into nightmares. In this paper, the authors explore these contrasting perspectives by alternately describing utopian and dystopian futures for learning analytics within CBME. Seeing learning analytics as an important way to maximize the value of CBME assessment data for organizational development, they argue that their implementation should continue within the guidance of an ethical framework.
Affiliation(s)
- Brent Thoma
- B. Thoma is associate professor, Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1124-5786
- Rachel H Ellaway
- R.H. Ellaway is professor, Department of Community Health Sciences, and director, Office of Health and Medical Education Scholarship, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada; ORCID: https://orcid.org/0000-0002-3759-6624
- Teresa M Chan
- T.M. Chan is associate professor, Division of Emergency Medicine, Department of Medicine, assistant dean, Program for Faculty Development, Faculty of Health Sciences, and adjunct scientist, McMaster Education Research, Innovation, and Theory (MERIT) program, McMaster University, Hamilton, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6104-462X
48
Ross S, Hauer KE, Wycliffe-Jones K, Hall AK, Molgaard L, Richardson D, Oswald A, Bhanji F. Key considerations in planning and designing programmatic assessment in competency-based medical education. MEDICAL TEACHER 2021; 43:758-764. [PMID: 34061700 DOI: 10.1080/0142159x.2021.1925099] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Indexed: 06/12/2023]
Abstract
Programmatic assessment as a concept is still novel for many in clinical education, and there may be a disconnect between the academics who publish about programmatic assessment and the front-line clinical educators who must put theory into practice. In this paper, we clearly define programmatic assessment and present high-level guidelines about its implementation in competency-based medical education (CBME) programs. The guidelines are informed by literature and by lessons learned from established programmatic assessment approaches. We articulate five steps to consider when implementing programmatic assessment in CBME contexts: articulate the purpose of the program of assessment, determine what must be assessed, choose tools fit for purpose, consider the stakes of assessments, and define processes for interpreting assessment data. In the process, we seek to offer a helpful guide or template for front-line clinical educators. We dispel some myths about programmatic assessment to help training programs as they look to design, or redesign, programs of assessment. In particular, we highlight the notion that programmatic assessment is not 'one size fits all'; rather, it is a system of assessment that results when shared common principles are considered and applied by individual programs as they plan and design their own bespoke model of programmatic assessment for CBME in their unique context.
Affiliation(s)
- Shelley Ross
- Department of Family Medicine, University of Alberta, Edmonton, Canada
- Canadian Association for Medical Education, Edmonton, Canada
- Keith Wycliffe-Jones
- Department of Family Medicine, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Laura Molgaard
- University of Minnesota College of Veterinary Medicine, St. Paul, MN, USA
- Denyse Richardson
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Division of Physiatry, Department of Medicine, University of Toronto, Toronto, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine and CBME lead for the Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Pediatrics at McGill University, Montreal, Canada
49
Thoma B, Caretta-Weyer H, Schumacher DJ, Warm E, Hall AK, Hamstra SJ, Cavalcanti R, Chan TM. Becoming a deliberately developmental organization: Using competency-based assessment data for organizational development. MEDICAL TEACHER 2021; 43:801-809. [PMID: 34033512 DOI: 10.1080/0142159x.2021.1925100] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Indexed: 06/12/2023]
Abstract
Medical education is situated within health care and educational organizations that frequently lag in their use of data to learn, develop, and improve performance. How might we leverage competency-based medical education (CBME) assessment data at the individual, program, and system levels, with the goal of redefining CBME from an initiative that supports the development of physicians to one that also fosters the development of the faculty, administrators, and programs within our organizations? In this paper we review the Deliberately Developmental Organization (DDO) framework proposed by Robert Kegan and Lisa Lahey, a theoretical framework that explains how organizations can foster the development of their people. We then describe the DDO's conceptual alignment with CBME and outline how CBME assessment data could be used to spur the transformation of health care and educational organizations into digitally integrated DDOs. A DDO-oriented use of CBME assessment data will require intentional investment into both the digitalization of assessment data and the development of the people within our organizations. By reframing CBME in this light, we hope that educational and health care leaders will see their investments in CBME as an opportunity to spur the evolution of a developmental culture.
Affiliation(s)
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Daniel J Schumacher
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric Warm
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Andrew K Hall
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Stanley J Hamstra
- Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Faculty of Education, University of Ottawa, Ottawa, Canada
- Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Rodrigo Cavalcanti
- Department of Medicine, University of Toronto, Toronto, Canada
- HoPingKong Centre for Excellence in Education and Practice, UHN, Toronto, Canada
- Teresa M Chan
- Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada
- McMaster program for Education Research, Innovation, and Theory (MERIT), Hamilton, Canada
50
Dagnone JD, Bandiera G, Harris K. Re-examining the value proposition for Competency-Based Medical Education. CANADIAN MEDICAL EDUCATION JOURNAL 2021; 12:155-158. [PMID: 34249202 PMCID: PMC8263028 DOI: 10.36834/cmej.68245] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Indexed: 05/12/2023]
Abstract
The adoption of competency-based medical education (CBME) by Canadian postgraduate training programs has created a storm of excitement and controversy. Implementing the system-wide Competency by Design (CBD) project initiated by the Royal College of Physicians & Surgeons of Canada (RCPSC) is an ambitious transformative change challenge. Not surprisingly, tensions have arisen across the country around the theoretical underpinnings of CBME and the practicalities of implementation, resulting in calls for evidence justifying its value. Assumptions have been made on both sides of the argument, contributing to an atmosphere of unhealthy protection of the status quo, premature conclusions about CBME's worth, and an oversimplification of the risks and costs to participants. We feel that a renewed effort to find a shared vision of medical education and the true value proposition of CBME is required to recreate a growth-oriented mindset. Also, the aspirational assertion of a direct link between CBME and improved patient outcomes requires deferral until further implementation and study have occurred. However, we perceive that the more concrete and immediate value of CBME arises from the societal contract physicians have, the connection to maintaining self-regulation, and the potential customization of training for learners.
Affiliation(s)
- Jeffrey Damon Dagnone
- Emergency Medicine, Queen's University, Ontario, Canada
- Correspondence to: J Damon Dagnone;
- Glenn Bandiera
- Emergency Medicine and Post-Graduate Medical Education, University of Toronto, Ontario, Canada
- Kenneth Harris
- Royal College of Physicians & Surgeons of Canada, Ontario, Canada