1. Mahan JD, Kaczmarczyk JM, Miller Juve AK, Cymet T, Shah BJ, Daniel R, Edgar L. Clinician Educator Milestones: Assessing and Improving Educators' Skills. Academic Medicine 2024;99:592-598. PMID: 38442199. DOI: 10.1097/acm.0000000000005684.
Abstract
The importance of the clinician educator (CE) role in the delivery of competency-based medical education is well recognized. There is, however, no formal mechanism to identify when faculty have the knowledge, skills, and attitudes to be successful CEs. In 2020, the Accreditation Council for Graduate Medical Education, Accreditation Council for Continuing Medical Education, Association of American Medical Colleges, and American Association of Colleges of Osteopathic Medicine convened a workgroup of 18 individuals representing multiple medical specialties and diverse institutions in the United States, including nonphysician educators, a medical student, and a resident, to develop a set of competencies, subcompetencies, and milestones for CEs.

A 5-step process was used to create the Clinician Educator Milestones (CEMs). In step 1, the workgroup developed an initial CEM draft. Through brainstorming, 141 potential education-related CE tasks were identified. Descriptive statements for each competency and developmental trajectories for each subcompetency were developed and confirmed by consensus. The workgroup then created a supplemental guide, assessment tools, and additional resources. In step 2, a diverse group of CEs was surveyed in 2021 and provided feedback on the CEMs. In step 3, the workgroup used this feedback to refine the CEMs. In step 4, the second draft of the CEMs was submitted for public comment, and the CEMs were finalized. In step 5, the final CEMs were released for public use in 2022.

The CEMs consist of 1 foundational domain that focuses on commitment to lifelong learning, 4 additional domains of competence for CEs in the learning environment, and 20 subcompetencies. These milestones have many potential uses for CEs, including self-assessment, constructing learning and improvement plans, and designing systematic faculty development efforts. The CEMs will continue to evolve as they are applied in practice and as the role of CEs continues to grow and develop.
2. Thoma B, Bernard J, Wang S, Yilmaz Y, Bandi V, Woods RA, Cheung WJ, Choo E, Card A, Chan TM. Deidentifying Narrative Assessments to Facilitate Data Sharing in Medical Education. Academic Medicine 2024;99:513-517. PMID: 38113414. DOI: 10.1097/acm.0000000000005596.
Abstract
PROBLEM Narrative assessments are commonly incorporated into competency-based medical education programs. However, efforts to share competency-based medical education assessment data among programs to support the evaluation and improvement of assessment systems have been limited, in part because of security concerns. Deidentifying assessment data mitigates these concerns, but deidentifying narrative assessments is time-consuming, resource intensive, and error prone. The authors developed and tested a tool to automate the deidentification of narrative assessments and facilitate their review.

APPROACH The authors met throughout 2021 and 2022 to iteratively design, test, and refine the deidentification algorithm and data review interface. Preliminary testing of the prototype deidentification algorithm was performed using narrative assessments from the University of Saskatchewan emergency medicine program. The algorithm's accuracy was assessed by the authors using the review interface designed for this purpose. Formal testing included 2 rounds of deidentification and review by members of the authorship team. Both the algorithm and the data review interface were refined during testing.

OUTCOMES Authors from 3 institutions, including 3 emergency medicine programs, an anesthesia program, and a surgical program, participated in formal testing. In the final round of review, 99.4% of the narrative assessments were fully deidentified (names, nicknames, and pronouns removed). The results were comparable for each institution and specialty. The data review interface was improved with feedback obtained after each round of review and was found to be intuitive.

NEXT STEPS This innovation demonstrates the viability of an algorithmic approach to the deidentification of assessment narratives while reinforcing that a small number of errors are likely to persist. Future steps include refining the algorithm to improve its accuracy and extending the data review interface to support additional data set formats.
3. Frank JR, Karpinski J, Sherbino J, Snell LS, Atkinson A, Oswald A, Hall AK, Cooke L, Dojeiji S, Richardson D, Cheung WJ, Cavalcanti RB, Dalseg TR, Thoma B, Flynn L, Gofton W, Dudek N, Bhanji F, Wong BMF, Razack S, Anderson R, Dubois D, Boucher A, Gomes MM, Taber S, Gorman LJ, Fulford J, Naik V, Harris KA, St. Croix R, van Melle E. Competence By Design: a transformational national model of time-variable competency-based postgraduate medical education. Perspectives on Medical Education 2024;13:201-223. PMID: 38525203. PMCID: PMC10959143. DOI: 10.5334/pme.1096.
Abstract
Postgraduate medical education (PGME) is an essential societal enterprise that prepares highly skilled physicians for the health workforce. In recent years, PGME systems have been criticized worldwide for problems with variable graduate abilities, concerns about patient safety, and issues with teaching and assessment methods. In response, competency-based medical education approaches, with an emphasis on graduate outcomes, have been proposed as the direction for 21st-century health professions education. However, there are few published models of large-scale implementation of these approaches. We describe the rationale and design of a national, time-variable, competency-based, multi-specialty system of postgraduate medical education called Competence by Design. Fourteen innovations were bundled to create this new system, using the Van Melle Core Components of competency-based medical education as the basis for the transformation. The successful execution of this transformational training system shows that competency-based medical education can be implemented at scale. The lessons learned in the early implementation of Competence by Design can inform competency-based medical education innovation efforts across professions worldwide.
Affiliation(s)
- Jason R. Frank
  - Centre for Innovation in Medical Education and Professor, Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, ON, Canada
- Jolanta Karpinski
  - Department of Medicine, University of Ottawa, Ottawa, ON, Canada
  - Competency Based Medical Education, University of Ottawa, Ottawa, ON, Canada
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Linda S. Snell
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Medicine and Health Sciences Education, McGill University, Montreal, QC, Canada
- Adelle Atkinson
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Paediatrics, Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada
- Anna Oswald
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
  - Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
- Andrew K. Hall
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Lara Cooke
  - Division of Neurology, Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada
- Susan Dojeiji
  - Physical Medicine and Rehabilitation, University of Ottawa, Ottawa, ON, Canada
- Denyse Richardson
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Physical Medicine and Rehabilitation, Queen's University, Kingston, ON, Canada
- Warren J. Cheung
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Medicine, University of Toronto, Toronto, ON, Canada
- Rodrigo B. Cavalcanti
  - Department of Medicine, University of Toronto, Toronto, ON, Canada
  - HoPingKong Centre, University Health Network, Toronto, ON, Canada
- Timothy R. Dalseg
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Division of Emergency Medicine, University of Toronto, Toronto, ON, Canada
- Brent Thoma
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Emergency Medicine, University of Saskatchewan, Saskatoon, SK, Canada
- Leslie Flynn
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Departments of Psychiatry and Family Medicine, and Co-Director, Master of Health Sciences Education, Queen's University, Kingston, ON, Canada
- Wade Gofton
  - Department of Surgery (Division of Orthopedic Surgery), The Ottawa Hospital and University of Ottawa, Ottawa, ON, Canada
- Nancy Dudek
  - Department of Medicine (Division of Physical Medicine & Rehabilitation) and The Ottawa Hospital, University of Ottawa, Ottawa, ON, Canada
- Farhan Bhanji
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Brian M.-F. Wong
  - Centre for Quality Improvement and Patient Safety, University of Toronto, Toronto, ON, Canada
- Saleem Razack
  - Centre for Health Education Scholarship, University of British Columbia and BC Children's Hospital, Vancouver, BC, Canada
- Robert Anderson
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Northern Ontario School of Medicine University, Sudbury, ON, Canada
- Daniel Dubois
  - Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrée Boucher
  - Department of Medicine (Division of Endocrinology), Université de Montréal, Montréal, QC, Canada
- Marcio M. Gomes
  - Department of Pathology and Laboratory Medicine, University of Ottawa, Ottawa, ON, Canada
- Sarah Taber
  - Office of Standards and Assessment, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Lisa J. Gorman
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jane Fulford
  - Canadian Internet Registration Authority, Canada
- Viren Naik
  - Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
  - Medical Council of Canada, Ottawa, ON, Canada
- Kenneth A. Harris
  - Royal College of Physicians and Surgeons of Canada, Canada
  - Emeritus, Western University, Canada
- Rhonda St. Croix
  - Learning and Connecting, Royal College of Physicians and Surgeons of Canada, Canada
- Elaine van Melle
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Family Medicine, Queen's University, Kingston, ON, Canada
4. Dalseg TR, Thoma B, Wycliffe-Jones K, Frank JR, Taber S. Enabling Implementation of Competency Based Medical Education through an Outcomes-Focused Accreditation System. Perspectives on Medical Education 2024;13:75-84. PMID: 38343559. PMCID: PMC10854411. DOI: 10.5334/pme.963.
Abstract
Competency-based medical education is being adopted around the world. Accreditation plays a vital role as an enabler in the adoption and implementation of competency-based medical education, but little has been published about how the design of an accreditation system facilitates this transformation. The Canadian postgraduate medical education environment has recently transitioned to an outcomes-based accreditation system in parallel with the adoption of competency-based medical education. Using the Canadian example, we characterize four features of an accreditation system that can facilitate the implementation of competency-based medical education: theoretical underpinning, quality focus, accreditation standards, and accreditation processes. Alignment of the underlying educational theories within the accreditation system and the educational paradigm drives change in a consistent and desired direction. An accreditation system that prioritizes quality improvement over quality assurance promotes educational system development and progressive change. Accreditation standards that achieve the difficult balance of being sufficiently detailed yet flexible foster a high fidelity of implementation without stifling innovation. Finally, accreditation processes that recognize the change process, encourage program development, and are not overly punitive all enable the implementation of competency-based medical education. We also discuss the ways in which accreditation can simultaneously hinder the implementation of this approach. As education bodies adopt competency-based medical education, particular attention should be paid to the role that accreditation plays in successful implementation.
Affiliation(s)
- Timothy R. Dalseg
  - Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, ON, Canada
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Toronto General Hospital, 200 Elizabeth Street, R. Fraser Elliott Building, Ground Floor, Room 480, Toronto, ON M5G 2C4, Canada
- Brent Thoma
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK, Canada
- Keith Wycliffe-Jones
  - Department of Family Medicine, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada
- Jason R. Frank
  - Centre for Innovation in Medical Education, University of Ottawa, Ottawa, ON, Canada
- Sarah Taber
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
5. Oswald A, Dubois D, Snell L, Anderson R, Karpinski J, Hall AK, Frank JR, Cheung WJ. Implementing Competence Committees on a National Scale: Design and Lessons Learned. Perspectives on Medical Education 2024;13:56-67. PMID: 38343555. PMCID: PMC10854462. DOI: 10.5334/pme.961.
Abstract
Competence committees (CCs) are a recent innovation to improve assessment decision making in health professions education. CCs enable a group of trained, dedicated educators to review a portfolio of observations about a learner's progress toward competence and make systematic assessment decisions. CCs are aligned with competency-based medical education (CBME) and programmatic assessment. While there is an emerging literature on CCs, little has been published on their system-wide implementation. National-scale implementation of CCs is complex, owing to the culture change that underlies this shift in assessment paradigm and the logistics and skills needed to enable it. We present the Royal College of Physicians and Surgeons of Canada's experience implementing a national CC model, the challenges the Royal College faced, and some strategies to address them. With large-scale CC implementation, managing the tension between standardization and flexibility is a fundamental issue that needs to be anticipated and addressed, with careful consideration of individual program needs, resources, and engagement of invested groups. If implementation is to take place in a wide variety of contexts, an approach that uses multiple engagement and communication strategies to allow for local adaptations is needed. Large-scale implementation of CCs, like any transformative initiative, does not occur at a single point but is an evolutionary process requiring both upfront resources and ongoing support. As such, it is important to consider embedding a plan for program evaluation at the outset. We hope these shared lessons will be of value to other educators who are considering a large-scale CBME CC implementation.
Affiliation(s)
- Anna Oswald
  - Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
  - Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - 8-130 Clinical Sciences Building, 11350-83 Avenue, Edmonton, AB, Canada
- Daniel Dubois
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Linda Snell
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Institute of Health Sciences Education and Department of Medicine, McGill University, Montreal, QC, Canada
- Robert Anderson
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Northern Ontario School of Medicine University, Sudbury, ON, Canada
- Jolanta Karpinski
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K. Hall
  - Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
  - Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Jason R. Frank
  - Centre for Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Warren J. Cheung
  - Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
  - Royal College of Physicians and Surgeons of Canada, 1053 Carling Avenue, Rm F660, Ottawa, ON, Canada
6. Caretta-Weyer HA, Smirnova A, Barone MA, Frank JR, Hernandez-Boussard T, Levinson D, Lombarts KMJMH, Lomis KD, Martini A, Schumacher DJ, Turner DA, Schuh A. The Next Era of Assessment: Building a Trustworthy Assessment System. Perspectives on Medical Education 2024;13:12-23. PMID: 38274558. PMCID: PMC10809864. DOI: 10.5334/pme.1110.
Abstract
Assessment in medical education has evolved through a sequence of eras, each centering on distinct views and values. These eras include measurement (e.g., knowledge exams, objective structured clinical examinations), then judgments (e.g., workplace-based assessments, entrustable professional activities), and most recently systems or programmatic assessment, in which multiple types and sources of data are collected over time and combined by competency committees to ensure individual learners are ready to progress to the next stage in their training. Significantly less attention has been paid to the social context of assessment, which has led to an overall erosion of trust in assessment by a variety of stakeholders, including learners and frontline assessors. To meaningfully move forward, the authors assert that the reestablishment of trust should be foundational to the next era of assessment. In our actions and interventions, it is imperative that medical education leaders address and build trust in assessment at a systems level. To that end, the authors first review tenets on the social contextualization of assessment and its linkage to trust and discuss the consequences should the current state of low trust continue. The authors then posit that trusting and trustworthy relationships can exist at individual as well as organizational and systems levels. Finally, the authors propose a framework to build trust at multiple levels in a future assessment system: one that invites and supports professional and human growth and has the potential to position assessment as a fundamental component of renegotiating the social contract between medical education and the health of the public.
Affiliation(s)
- Holly A. Caretta-Weyer
  - Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California, USA
- Alina Smirnova
  - Department of Family Medicine, University of Calgary, Calgary, Alberta, Canada
  - Kern Institute for the Transformation of Medical Education, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Michael A. Barone
  - NBME, Philadelphia, Pennsylvania, USA
  - Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Jason R. Frank
  - Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Dana Levinson
  - Josiah Macy Jr Foundation, Philadelphia, Pennsylvania, USA
- Kiki M. J. M. H. Lombarts
  - Department of Medical Psychology, Amsterdam University Medical Centers, University of Amsterdam, the Netherlands
  - Amsterdam Public Health research institute, Amsterdam, the Netherlands
- Kimberly D. Lomis
  - Undergraduate Medical Education Innovations, American Medical Association, Chicago, Illinois, USA
- Abigail Martini
  - Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio, USA
- Daniel J. Schumacher
  - Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- David A. Turner
  - American Board of Pediatrics, Chapel Hill, North Carolina, USA
- Abigail Schuh
  - Division of Emergency Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
7. Busari JO, Diffey L, Hauer KE, Lomis KD, Amiel JM, Barone MA, Schultz K, Chen HC, Damodaran A, Turner DA, Jones B, Oandasan I, Chan MK. Advancing anti-oppression and social justice in healthcare through competency-based medical education (CBME). Medical Teacher 2024:1-8. PMID: 38215046. DOI: 10.1080/0142159x.2023.2298763.
Abstract
Competency-based medical education (CBME) focuses on preparing physicians to improve the health of patients and populations. In the context of ongoing health disparities worldwide, medical educators must implement CBME in ways that advance social justice and anti-oppression. In this article, authors describe how CBME can be implemented to promote equity pedagogy, an approach to education in which curricular design, teaching, assessment strategies, and learning environments support learners from diverse groups to be successful. The five core components of CBME programs - outcomes competency framework, progressive sequencing of competencies, learning experiences tailored to learners' needs, teaching focused on competencies, and programmatic assessment - enable individualization of learning experiences and teaching and encourage learners to partner with their teachers in driving their learning. These educational approaches appreciate each learner's background, experiences, and strengths. Using an exemplar case study, the authors illustrate how CBME can afford opportunities to enhance anti-oppression and social justice in medical education and promote each learner's success in meeting the expected outcomes of training. The authors provide recommendations for individuals and institutions implementing CBME to enact equity pedagogy.
Affiliation(s)
- Jamiu O Busari
  - Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
  - Department of Pediatrics, Dr. Horacio Oduber Hospital, Oranjestad, Aruba
- Linda Diffey
  - Community Health Sciences, Max Rady College of Medicine, University of Manitoba, Winnipeg, Canada
- Karen E Hauer
  - University of California, San Francisco School of Medicine, San Francisco, CA, USA
- Jonathan M Amiel
  - Office of Innovation in Health Professions Education and Department of Psychiatry, Columbia University Vagelos College of Physicians and Surgeons, New York, NY, USA
- Michael A Barone
  - NBME, Philadelphia, PA, USA
  - Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Karen Schultz
  - PGME, Queen's University, Kingston, Canada
  - Department of Family Medicine, Queen's University, Kingston, Canada
- H Carrie Chen
  - Georgetown University School of Medicine, Washington, DC, USA
- Arvin Damodaran
  - School of Clinical Medicine, Faculty of Medicine and Health, UNSW Sydney, Sydney, Australia
- David A Turner
  - Department of Pediatrics, Division of Pediatric Critical Care, Duke Health System, Durham, NC, USA
  - Competency-Based Medical Education, American Board of Pediatrics, Chapel Hill, NC, USA
- Benjamin Jones
  - Health Systems Collaborative, Nuffield Department of Medicine, Oxford, UK
- Ivy Oandasan
  - Toronto General Hospital Research Institute (TGHRI), Toronto, Canada
- Ming-Ka Chan
  - Department of Pediatrics & Child Health, Office of Leadership Education, Rady Faculty of Health Sciences and Equity, Diversity, Inclusivity and Social Justice Lead, University of Manitoba and The Children's Hospital of Winnipeg, Winnipeg, Canada
8. Szulewski A, Braund H, Dagnone DJ, McEwen L, Dalgarno N, Schultz KW, Hall AK. The Assessment Burden in Competency-Based Medical Education: How Programs Are Adapting. Academic Medicine 2023;98:1261-1267. PMID: 37343164. DOI: 10.1097/acm.0000000000005305.
Abstract
Residents and faculty have described a burden of assessment related to the implementation of competency-based medical education (CBME), which may undermine its benefits. Although this concerning signal has been identified, little has been done to identify adaptations to address the problem. Grounded in an analysis of an early Canadian pan-institutional CBME adopter's experience, this article describes postgraduate programs' adaptations to the challenges of assessment in CBME. From June 2019 to September 2022, 8 residency programs underwent a standardized Rapid Evaluation guided by the Core Components Framework (CCF). Sixty interviews and 18 focus groups were held with invested partners. Transcripts were analyzed abductively using CCF, and ideal implementation was compared with enacted implementation. These findings were then shared with program leaders, adaptations were subsequently developed, and technical reports were generated for each program. Researchers reviewed the technical reports to identify themes related to the burden of assessment, with a subsequent focus on identifying adaptations across programs. Three themes were identified: (1) disparate mental models of assessment processes in CBME, (2) challenges in workplace-based assessment processes, and (3) challenges in performance review and decision making. Theme 1 included entrustment interpretation and lack of a shared mindset for performance standards; adaptations included revising entrustment scales, faculty development, and formalizing resident membership. Theme 2 involved direct observation, timeliness of assessment completion, and feedback quality; adaptations included alternative assessment strategies beyond entrustable professional activity forms and proactive assessment planning. Theme 3 related to resident data monitoring and competence committee decision making; adaptations included adding resident representatives to the competence committee and assessment platform enhancements. These adaptations represent responses to the concerning signal of a significant burden of assessment within CBME being experienced broadly. The authors hope other programs may learn from their institution's experience and navigate the CBME-related assessment burden their invested partners may be facing.
Affiliation(s)
- Adam Szulewski
  - A. Szulewski is associate professor, Departments of Emergency Medicine and Psychology, and educational scholarship lead, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-3076-6221
- Heather Braund
  - H. Braund is associate director of scholarship and simulation education, Office of Professional Development and Educational Scholarship, and assistant (adjunct) professor, Department of Biomedical and Molecular Sciences and School of Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-9749-7193
- Damon J Dagnone
  - D.J. Dagnone is associate professor, Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6963-7948
- Laura McEwen
  - L. McEwen is director of assessment and evaluation of postgraduate medical education and assistant professor, Department of Pediatrics, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-2457-5311
- Nancy Dalgarno
  - N. Dalgarno is director of education scholarship, Office of Professional Development and Educational Scholarship, and assistant professor (adjunct), Department of Biomedical and Molecular Sciences and Master of Health Professions Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-7932-9949
- Karen W Schultz
  - K.W. Schultz is professor, Department of Family Medicine, and associate dean of postgraduate medical education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-0208-3981
- Andrew K Hall
  - A.K. Hall is associate professor and vice chair of education, Department of Emergency Medicine, University of Ottawa, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1227-5397
9. Miller KA, Nagler J, Wolff M, Schumacher DJ, Pusic MV. It Takes a Village: Optimal Graduate Medical Education Requires a Deliberately Developmental Organization. Perspectives on Medical Education 2023;12:282-293. PMID: 37520509. PMCID: PMC10377742. DOI: 10.5334/pme.936.
Abstract
Coaching is proposed as a means of improving the learning culture of medicine. By fostering trusting teacher-learner relationships, learners are encouraged to embrace feedback and make the most of failure. This paper posits that a cultural shift is necessary to fully harness the potential of coaching in graduate medical education. We introduce the deliberately developmental organization framework, a conceptual model focusing on three core dimensions: developmental communities, developmental aspirations, and developmental practices. These dimensions broaden the scope of coaching interactions. Implementing this organizational change within graduate medical education might be challenging, yet we argue that embracing deliberately developmental principles can embed coaching into everyday interactions and foster a culture in which discussing failure to maximize learning becomes acceptable. By applying the dimensions of developmental communities, aspirations, and practices, we present a six-principle roadmap towards transforming graduate medical education training programs into deliberately developmental organizations.
Affiliation(s)
- Kelsey A. Miller
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Joshua Nagler
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Margaret Wolff
- Emergency Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Daniel J. Schumacher
- Cincinnati Children’s Hospital Medical Center and the University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Martin V. Pusic
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
10
Yilmaz Y, Chan MK, Richardson D, Atkinson A, Bassilious E, Snell L, Chan TM. Defining new roles and competencies for administrative staff and faculty in the age of competency-based medical education. Medical Teacher 2023; 45:395-403. [PMID: 36471921 DOI: 10.1080/0142159x.2022.2136517]
Abstract
PURPOSE These authors sought to define the new roles and competencies required of administrative staff and faculty in the age of CBME. METHOD A modified Delphi process was used to define the new CBME roles and competencies needed by faculty and administrative staff. We invited international experts in CBME (volunteers from the ICBME Collaborative email list), as well as faculty members and trainees identified via social media to help us determine the new competencies required of faculty and administrative staff in the CBME era. RESULTS Thirteen new roles were identified. The faculty-specific roles were: National Leader/Facilitator in CBME; Institutional/University lead for CBME; Assessment Process & Systems Designer; Local CBME Leads; CBME-specific Faculty Developers or Trainers; Competence Committee Chair; Competence Committee Faculty Member; Faculty Academic Coach/Advisor or Support Person; Frontline Assessor; Frontline Coach. The staff-specific roles were: Information Technology Lead; CBME Analytics/Data Support; Competence Committee Administrative Assistant. CONCLUSIONS The authors present a new set of faculty and staff roles that are relevant to the CBME context. While some of these new roles may be incorporated into existing roles, it may be prudent to examine how best to ensure that all of them are supported within all CBME contexts in some manner.
Affiliation(s)
- Yusuf Yilmaz
- McMaster Education Research, Innovation, and Theory (MERIT), and Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Department of Medical Education, Faculty of Medicine, Ege University, Izmir, Turkey
- Ming-Ka Chan
- Department of Pediatrics and Child Health, University of Manitoba, Winnipeg, Canada
- Denyse Richardson
- Department of Medicine, Dalla Lana School of Public Health, University of Toronto, Toronto, Canada
- Adelle Atkinson
- Department of Pediatrics, University of Toronto, Toronto, Canada
- Ereny Bassilious
- Department of Pediatrics, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Linda Snell
- Medicine and Health Sciences Education, Faculty of Medicine and Health Sciences, McGill University, Montreal, Canada
- Teresa M Chan
- McMaster Education Research, Innovation, and Theory (MERIT), and Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada
11
Pusic MV, Birnbaum RJ, Thoma B, Hamstra SJ, Cavalcanti RB, Warm EJ, Janssen A, Shaw T. Frameworks for Integrating Learning Analytics With the Electronic Health Record. Journal of Continuing Education in the Health Professions 2023; 43:52-59. [PMID: 36849429 PMCID: PMC9973448 DOI: 10.1097/ceh.0000000000000444]
Abstract
The information systems designed to support clinical care have evolved separately from those that support health professions education. This has resulted in a considerable digital divide between patient care and education, one that poorly serves practitioners and organizations, even as learning becomes ever more important to both. In this perspective, we advocate for the enhancement of existing health information systems so that they intentionally facilitate learning. We describe three well-regarded frameworks for learning that can point toward how health care information systems can best evolve to support learning. The Master Adaptive Learner model suggests ways that the individual practitioner can best organize their activities to ensure continual self-improvement. The Plan-Do-Study-Act (PDSA) cycle similarly proposes actions for improvement but at a health care organization's workflow level. Senge's Five Disciplines of the Learning Organization, a more general framework from the business literature, serves to further inform how disparate information and knowledge flows can be managed for continual improvement. Our main thesis holds that these types of learning frameworks should inform the design and integration of information systems serving the health professions. An underutilized mediator of educational improvement is the ubiquitous electronic health record. The authors list learning analytic opportunities, including potential modifications of learning management systems and the electronic health record, that would enhance health professions education and support the shared goal of delivering high-quality evidence-based health care.
12
Leclair R, Ho JSS, Braund H, Kouzmina E, Bruzzese S, Awad S, Mann S, Zevin B. Exploring the Quality of Narrative Feedback Provided to Residents During Ambulatory Patient Care in Medicine and Surgery. Journal of Medical Education and Curricular Development 2023; 10:23821205231175734. [PMID: 37216002 PMCID: PMC10192660 DOI: 10.1177/23821205231175734]
Abstract
OBJECTIVES The transition to competency-based medical education (CBME) has increased the volume of residents' assessment data; however, the quality of the narrative feedback is yet to be used as feedback-on-feedback for faculty. Our objectives were (1) to explore and compare the quality and content of narrative feedback provided to residents in medicine and surgery during ambulatory patient care and (2) to use the Deliberately Developmental Organization framework to identify strengths, weaknesses, and opportunities to improve quality of feedback within CBME. METHODS We conducted a convergent mixed-methods study with residents from the Departments of Surgery (DoS; n = 7) and Medicine (DoM; n = 9) at Queen's University. We used thematic analysis and the Quality of Assessment for Learning (QuAL) tool to analyze the content and quality of narrative feedback documented in entrustable professional activity (EPA) assessments for ambulatory care. We also examined the association between the basis of assessment, time to provide feedback, and the quality of narrative feedback. RESULTS Forty-one EPA assessments were included in the analysis. Three major themes arose from thematic analysis: Communication, Diagnostics/Management, and Next Steps. Quality of the narrative feedback varied; 46% had sufficient evidence about residents' performance; 39% provided a suggestion for improvement; and 11% provided a connection between the suggestion and the evidence. There were significant differences between DoM and DoS in quality of feedback scores for the evidence (2.1 [1.3] vs. 1.3 [1.1]; p < 0.01) and connection (0.4 [0.5] vs. 0.1 [0.3]; p = 0.04) domains of the QuAL tool. Feedback quality was not associated with the basis of assessment or time taken to provide feedback.
CONCLUSION The quality of the narrative feedback provided to residents during ambulatory patient care was variable with the greatest gap in providing connections between suggestions and evidence about residents' performance. There is a need for ongoing faculty development to improve the quality of narrative feedback provided to residents.
Affiliation(s)
- Rebecca Leclair
- School of Medicine, Queen's University, Kingston, ON, Canada
- Heather Braund
- Faculty of Health Sciences, Queen's University, Kingston, ON, Canada
- Office of Professional Development and Educational Scholarship, Queen's University, Kingston, ON, Canada
- Ekaterina Kouzmina
- School of Medicine, Queen's University, Kingston, ON, Canada
- Division of General Surgery, Department of Surgery, Kingston Health Sciences Center, Kingston, ON, Canada
- Samantha Bruzzese
- School of Medicine, Queen's University, Kingston, ON, Canada
- Division of Internal Medicine, Department of Medicine, Kingston Health Sciences Center, Kingston, ON, Canada
- Sara Awad
- School of Medicine, Queen's University, Kingston, ON, Canada
- Division of Endocrinology and Metabolism, Department of Medicine, Kingston Health Sciences Center, Kingston, ON, Canada
- Steve Mann
- School of Medicine, Queen's University, Kingston, ON, Canada
- Division of Orthopaedic Surgery, Department of Surgery, Kingston Health Sciences Center, Kingston, ON, Canada
- Boris Zevin
- School of Medicine, Queen's University, Kingston, ON, Canada
- Division of General Surgery, Department of Surgery, Kingston Health Sciences Center, Kingston, ON, Canada
13
Woods R, Singh S, Thoma B, Patocka C, Cheung W, Monteiro S, Chan TM. Validity evidence for the Quality of Assessment for Learning score: a quality metric for supervisor comments in competency-based medical education. Canadian Medical Education Journal 2022; 13:19-35. [PMID: 36440075 PMCID: PMC9684040 DOI: 10.36834/cmej.74860]
Abstract
BACKGROUND Competency-based medical education (CBME) relies on supervisor narrative comments contained within entrustable professional activity (EPA) assessments for programmatic assessment, but the quality of these supervisor comments is unassessed. There is validity evidence supporting the QuAL (Quality of Assessment for Learning) score for rating the usefulness of short narrative comments in direct observation. OBJECTIVE We sought to establish validity evidence for the QuAL score to rate the quality of supervisor narrative comments contained within an EPA by surveying the key end-users of EPA narrative comments: residents, academic advisors, and competence committee members. METHODS In 2020, the authors randomly selected 52 de-identified narrative comments from two emergency medicine EPA databases using purposeful sampling. Six collaborators (two residents, two academic advisors, and two competence committee members) were recruited from each of four EM residency programs (Saskatchewan, McMaster, Ottawa, and Calgary) to rate these comments with a utility score and the QuAL score. Correlation between the utility and QuAL scores was calculated using Pearson's correlation coefficient. Sources of variance and reliability were calculated using a generalizability study. RESULTS All collaborators (n = 24) completed the full study. The QuAL score had a high positive correlation with the utility score amongst the residents (r = 0.80) and academic advisors (r = 0.75) and a moderately high correlation amongst competence committee members (r = 0.68). The generalizability study found that the major source of variance was the comment itself, indicating that the tool performs well across raters. CONCLUSION The QuAL score may serve as an outcome measure for program evaluation of supervisors, and as a resource for faculty development.
Affiliation(s)
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Sim Singh
- College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Catherine Patocka
- Department of Emergency Medicine, University of Calgary, Alberta, Canada
- Warren Cheung
- Department of Emergency Medicine, University of Ottawa, Ontario, Canada
- Sandra Monteiro
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Ontario, Canada
- Teresa M Chan
- Division of Emergency Medicine and Education & Innovation, Department of Medicine, McMaster University, Ontario, Canada
14
Lam AC, Tang B, Lalwani A, Verma AA, Wong BM, Razak F, Ginsburg S. Methodology paper for the General Medicine Inpatient Initiative Medical Education Database (GEMINI MedED): a retrospective cohort study of internal medicine resident case-mix, clinical care and patient outcomes. BMJ Open 2022; 12:e062264. [PMID: 36153026 PMCID: PMC9511606 DOI: 10.1136/bmjopen-2022-062264]
Abstract
INTRODUCTION Unwarranted variation in patient care among physicians is associated with negative patient outcomes and increased healthcare costs. Care variation likely also exists for resident physicians. Despite the global movement towards outcomes-based and competency-based medical education, current assessment strategies in residency do not routinely incorporate clinical outcomes. The widespread use of electronic health records (EHRs) may enable the implementation of in-training assessments that incorporate clinical care and patient outcomes. METHODS AND ANALYSIS The General Medicine Inpatient Initiative Medical Education Database (GEMINI MedED) is a retrospective cohort study of senior residents (postgraduate year 2/3) enrolled in the University of Toronto Internal Medicine (IM) programme between 1 April 2010 and 31 December 2020. This study focuses on senior IM residents and patients they admit overnight to four academic hospitals. Senior IM residents are responsible for overseeing all overnight admissions; thus, care processes and outcomes for these clinical encounters can be at least partially attributed to the care they provide. Call schedules from each hospital, which list the date, location and senior resident on-call, will be used to link senior residents to EHR data of patients admitted during their on-call shifts. Patient data will be derived from the GEMINI database, which contains administrative (eg, demographic and disposition) and clinical data (eg, laboratory and radiological investigation results) for patients admitted to IM at the four academic hospitals. Overall, this study will examine three domains of resident practice: (1) case-mix variation across residents, hospitals and academic year, (2) resident-sensitive quality measures (EHR-derived metrics that are partially attributable to resident care) and (3) variations in patient outcomes across residents and factors that contribute to such variation. 
ETHICS AND DISSEMINATION GEMINI MedED was approved by the University of Toronto Ethics Board (RIS#39339). Results from this study will be presented in academic conferences and peer-reviewed journals.
Affiliation(s)
- Andrew Cl Lam
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Brandon Tang
- Department of Medicine, Division of General Internal Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Anushka Lalwani
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Amol A Verma
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Division of General Internal Medicine, Unity Health Toronto, Toronto, Ontario, Canada
- Brian M Wong
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Division of General Internal Medicine, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Fahad Razak
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Division of General Internal Medicine, Unity Health Toronto, Toronto, Ontario, Canada
- Shiphra Ginsburg
- Department of Medicine, Division of Respirology, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Division of Respirology, Sinai Health System, Toronto, Ontario, Canada
15
Miller S, Caretta-Weyer H, Chan T. Beyond competence: rethinking continuing professional development in the age of competence-based medical education. CJEM 2022; 24:563-565. [PMID: 36071320 DOI: 10.1007/s43678-022-00372-3]
Affiliation(s)
- Stephen Miller
- Faculty of Medicine, Dalhousie University, Halifax, Nova Scotia, Canada
- Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California, USA
- Teresa Chan
- Division of Emergency Medicine, McMaster University, Hamilton, Ontario, Canada
- Division of Education and Innovation, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Department of Health Research Methods, Evidence, and Impact (HEI), McMaster University, Hamilton, Ontario, Canada
- McMaster Education Research, Innovation, and Theory (MERIT) Unit, McMaster University, Hamilton, Ontario, Canada
- Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Adjunct Associate Professor (Honorary), The Chinese University of Hong Kong - Shenzhen campus, Shenzhen, China
16
Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. Canadian Medical Education Journal 2022; 13:14-27. [PMID: 36310899 PMCID: PMC9588183 DOI: 10.36834/cmej.73554]
Abstract
BACKGROUND Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires the assessment of entrustable professional activities (EPAs). Dashboards could be used to track the completion of EPAs to support program evaluation. METHODS Using a design-based research process, we identified program evaluation needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. We interviewed leaders from the emergency medicine program and postgraduate medical education office at the University of Saskatchewan. Two investigators thematically analyzed interview transcripts to identify program evaluation needs that were audited by two additional investigators. Identified needs were described using quotes, analytics, and visualizations. RESULTS Between July 1, 2019 and April 6, 2021, we conducted 17 interviews with six participants (two program leaders and four institutional leaders). Four needs emerged as themes: tracking changes in overall assessment metrics, comparing metrics to the assessment plan, evaluating rotation performance, and engagement with the assessment metrics. We addressed these needs by presenting analytics and visualizations within a dashboard. CONCLUSIONS We identified program evaluation needs related to EPA assessments and designed dashboard elements to meet them. This work will inform the development of other CBME assessment dashboards designed to support program evaluation.
Affiliation(s)
- Yusuf Yilmaz
- Continuing Professional Development Office, and McMaster program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Department of Medical Education, Ege University, Turkey
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Continuing Professional Development Office, and McMaster program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, McMaster University, Ontario, Canada
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Royal College of Physicians and Surgeons of Canada, Ontario, Canada
17
Warm EJ, Carraccio C, Kelleher M, Kinnear B, Schumacher DJ, Santen S. The education passport: connecting programmatic assessment across learning and practice. Canadian Medical Education Journal 2022; 13:82-91. [PMID: 36091737 PMCID: PMC9441115 DOI: 10.36834/cmej.73871]
Abstract
Competency-based medical education (CBME) shifts us from static assessment of learning to developmental assessment for learning. However, implementation challenges associated with CBME remain a major hurdle, especially after training and into practice. The full benefit of developmental assessment for learning over time requires collaboration, cooperation, and trust among learners, regulators, and the public that transcends each individual phase. The authors introduce the concept of an "Education Passport" that provides evidence of readiness to travel across the boundaries between undergraduate medical education, graduate medical education, and the expanse of practice. The Education Passport uses programmatic assessment, a process of collecting numerous low-stakes assessments from multiple sources over time, judging these data using criterion-referencing, and enhancing this with coaching and competency committees to understand, process, and accelerate growth without end. Information in the Passport is housed on a cloud-based server controlled by the student/physician over the course of training and practice. These data are mapped to various educational frameworks, such as Entrustable Professional Activities or milestones, for ease of longitudinal performance tracking. At each stage of education and practice, the student/physician grants Passport access to all entities that can provide data on performance. Database managers use learning analytics to connect and display information over time that are then used by the student/physician, their assigned or chosen coaches, and review committees to maintain or improve performance. Global information is also collected and analyzed to improve the entire system of learning and care. Developing a true continuum that embraces performance and growth will be a long-term adaptive challenge across many organizations and jurisdictions and will require coordination from regulatory and national agencies.
An Education Passport could also serve as an organizing tool and will require research and high-value communication strategies to maximize public trust in the work.
Affiliation(s)
- Eric J Warm
- Department of Internal Medicine, University of Cincinnati College of Medicine, Ohio, USA
- Correspondence to: Eric J. Warm
- Matthew Kelleher
- Department of Internal Medicine, University of Cincinnati College of Medicine, Ohio, USA
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Ohio, USA
- Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center and the University of Cincinnati College of Medicine, Ohio, USA
- Sally Santen
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center and the University of Cincinnati College of Medicine, Ohio, USA
- Virginia Commonwealth University, Richmond, Virginia, USA
18
Kealey A, Naik VN. Competency-Based Medical Training in Anesthesiology: Has It Delivered on the Promise of Better Education? Anesth Analg 2022; 135:223-229. [PMID: 35839492 DOI: 10.1213/ane.0000000000006091]
Affiliation(s)
- Alayne Kealey
- Department of Anesthesia, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Department of Anesthesiology and Pain Medicine, University of Toronto, Toronto, Ontario, Canada
- Viren N Naik
- Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, Ontario, Canada
19
Chan TM, Dowling S, Tastad K, Chin A, Thoma B. Integrating training, practice, and reflection within a new model for Canadian medical licensure: a concept paper prepared for the Medical Council of Canada. Canadian Medical Education Journal 2022; 13:68-81. [PMID: 36091730 PMCID: PMC9441128 DOI: 10.36834/cmej.73717]
Abstract
In 2020, the Medical Council of Canada created a task force to make recommendations on the modernization of its practices for granting licensure to medical trainees. This task force solicited papers on this topic from subject matter experts. As outlined within this Concept Paper, our proposal would shift licensure away from the traditional focus on high-stakes summative exams in a way that integrates training, clinical practice, and reflection. Specifically, we propose a model of graduated licensure with three stages: a trainee license for trainees who have demonstrated adequate medical knowledge to begin training as a closely supervised resident; a transition-to-practice license for trainees who have compiled a reflective educational portfolio demonstrating the clinical competence required to begin independent practice with limitations and support; and a fully independent license for unsupervised practice for attendings who have demonstrated competence through a reflective portfolio of clinical analytics. This proposal was reviewed by a diverse group of 30 trainees, practitioners, and administrators in medical education. Their feedback was analyzed and summarized to provide an overview of the likely reception that this proposal would receive from the medical education community.
Affiliation(s)
- Teresa M Chan
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Division of Education & Innovation, Department of Medicine, McMaster University
- McMaster Education Research, Innovation, and Theory (MERIT) program, Faculty of Health Sciences, McMaster University
- Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University
- Shawn Dowling
- Department of Emergency Medicine, Cumming School of Medicine, University of Calgary
- Kara Tastad
- Royal College Emergency Medicine Training Program, University of Toronto
- Alvin Chin
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan
20
Cheung WJ, Wagner N, Frank JR, Oswald A, Van Melle E, Skutovich A, Dalseg TR, Cooke LJ, Hall AK. Implementation of competence committees during the transition to CBME in Canada: A national fidelity-focused evaluation. Medical Teacher 2022; 44:781-789. [PMID: 35199617 DOI: 10.1080/0142159x.2022.2041191]
Abstract
PURPOSE This study evaluated the fidelity of competence committee (CC) implementation in Canadian postgraduate specialist training programs during the transition to competency-based medical education (CBME). METHODS A national survey of CC chairs was distributed to all CBME training programs in November 2019. Survey questions were derived from guiding documents published by the Royal College of Physicians and Surgeons of Canada reflecting intended processes and design. RESULTS Response rate was 39% (113/293) with representation from all eligible disciplines. Committee size ranged from 3 to 20 members, 42% of programs included external members, and 20% included a resident representative. Most programs (72%) reported that a primary review and synthesis of resident assessment data occurs prior to the meeting, with some data reviewed collectively during meetings. When determining entrustable professional activity (EPA) achievement, most programs followed the national specialty guidelines closely with some exceptions (53%). Documented concerns about professionalism, EPA narrative comments, and EPA entrustment scores were most highly weighted when determining resident progress decisions. CONCLUSIONS Heterogeneity in CC implementation likely reflects local adaptations, but may also explain some of the variable challenges faced by programs during the transition to CBME. Our results offer educational leaders important fidelity data that can help inform the larger evaluation and transformation of CBME.
Affiliation(s)
- Warren J Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Natalie Wagner
- Office of Professional Development & Educational Scholarship and Department of Biomedical & Molecular Sciences, Queen's University, Kingston, Canada
- Jason R Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, University of Alberta, Edmonton, Canada
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
- Timothy R Dalseg
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, Canada
- Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
| |
21
Chan TM, Sebok-Syer SS, Yilmaz Y, Monteiro S. The Impact of Electronic Data to Capture Qualitative Comments in a Competency-Based Assessment System. Cureus 2022; 14:e23480. [PMID: 35494923] [PMCID: PMC9038604] [DOI: 10.7759/cureus.23480] [Accepted: 03/24/2022] [Indexed: 11/23/2022] Open
Abstract
Introduction Digitizing workplace-based assessments (WBAs) holds the potential to facilitate feedback and performance review, since data can easily be recorded, stored, and analyzed in real time. When digitizing assessment systems, however, it is unclear what is gained and lost in the message as a result of the change in medium. This study evaluates the quality of comments generated on paper vs. electronic media and the influence of an assessor's seniority. Methods Using a realist evaluation framework, a retrospective database review was conducted of paper-based and electronic comments. A sample of assessments was examined to determine any influence of the medium on word count and Quality of Assessment for Learning (QuAL) score. A correlation analysis evaluated the relationship between word count and QuAL score. Separate univariate analyses of variance (ANOVAs) examined the influence of the assessor's seniority and the medium on word count, QuAL score, and WBA scores. Results The analysis included a total of 1,825 records. The average word count of electronic comments (M=16) was significantly higher than that of paper comments (M=12; p=0.01). Longer comments correlated positively with QuAL score (r=0.2). Paper-based comments received lower QuAL scores (0.41) than electronic comments (0.51; p<0.01). Years in practice was negatively correlated with QuAL score (r=-0.08; p<0.001), as was word count (r=-0.2; p<0.001). Conclusion Digitization of WBAs increased the length of comments and did not appear to jeopardize their quality; these results indicate higher-quality assessment data. True digital transformation may be possible by harnessing trainee data repositories and repurposing them to analyze faculty-relevant metrics.
22
Frank JR, Snell LS, Oswald A, Hauer KE. Further on the journey in a complex adaptive system: Elaborating CBME. MEDICAL TEACHER 2021; 43:734-736. [PMID: 34097832] [DOI: 10.1080/0142159x.2021.1931083] [Indexed: 06/12/2023]
Affiliation(s)
- Jason R Frank
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Linda S Snell
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, McGill University, Montreal, Canada
- Anna Oswald
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, University of Alberta, Edmonton, Canada
- Karen E Hauer
- Department of Medicine, University of California, San Francisco (UCSF) School of Medicine, San Francisco, CA, USA