1. Woodworth GE, Goldstein ZT, Ambardekar AP, Arthur ME, Bailey CF, Booth GJ, Carney PA, Chen F, Duncan MJ, Fromer IR, Hallman MR, Hoang T, Isaak R, Klesius LL, Ladlie BL, Mitchell SA, Miller Juve AK, Mitchell JD, McGrath BJ, Shepler JA, Sims CR, Spofford CM, Tanaka PP, Maniker RB. Development and Pilot Testing of a Programmatic System for Competency Assessment in US Anesthesiology Residency Training. Anesth Analg 2024; 138:1081-1093. [PMID: 37801598] [DOI: 10.1213/ane.0000000000006667]
Abstract
BACKGROUND: In 2018, a set of entrustable professional activities (EPAs) and procedural skills assessments was developed for anesthesiology training, but they did not assess all of the Accreditation Council for Graduate Medical Education (ACGME) milestones. The aims of this study were to (1) remap the 2018 EPA and procedural skills assessments to the revised ACGME Anesthesiology Milestones 2.0, (2) develop new assessments that, combined with the original assessments, create a system of assessment addressing all level 1 to 4 milestones, and (3) provide evidence for the validity of the assessments.
METHODS: Using a modified Delphi process, a panel of anesthesiology education experts remapped the original assessments developed in 2018 to the Anesthesiology Milestones 2.0 and developed new assessments to create a system that assessed all level 1 through 4 milestones. Following a 24-month pilot at 7 institutions, the number of EPA and procedural skills assessments and mean scores were computed at the end of the academic year. Milestone achievement and subcompetency data for assessments from a single institution were compared to scores assigned by the institution's clinical competency committee (CCC).
RESULTS: New assessment development, 2 months of testing and feedback, and revisions resulted in 5 new EPAs, 11 nontechnical skills assessments (NTSAs), and 6 objective structured clinical examinations (OSCEs). Combined with the original 20 EPAs and procedural skills assessments, the new system of assessment addresses 99% of level 1 to 4 Anesthesiology Milestones 2.0. During the 24-month pilot, aggregate mean EPA and procedural skills scores increased significantly with year in training. System subcompetency scores correlated significantly with 15 of 23 (65.2%) corresponding CCC scores at a single institution, but 8 correlations (36.4%) were <0.30, indicating poor correlation.
CONCLUSIONS: A panel of experts developed a set of EPAs, procedural skills assessments, NTSAs, and OSCEs to form a programmatic system of assessment for anesthesiology residency training in the United States. The method used to develop and pilot test the assessments, the progression of assessment scores with time in training, and the correlation of assessment scores with CCC scoring of milestone achievement provide evidence for the validity of the assessments.
Affiliation(s)
- Glenn E Woodworth
- Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, Oregon
- Zachary T Goldstein
- Department of Anesthesiology, Cedars Sinai Medical Center, Los Angeles, California
- Aditee P Ambardekar
- Department of Anesthesiology and Pain Management, University of Texas Southwestern Medical Center, Dallas, Texas
- Mary E Arthur
- Department of Anesthesiology and Perioperative Medicine, Medical College of Georgia at Augusta University, Augusta, Georgia
- Caryl F Bailey
- Department of Anesthesiology and Perioperative Medicine, Medical College of Georgia at Augusta University, Augusta, Georgia
- Gregory J Booth
- Uniformed Services University of the Health Sciences, Department of Anesthesiology and Pain Medicine, Naval Medical Center Portsmouth, Portsmouth, Virginia
- Patricia A Carney
- Division of Hospital Medicine, Department of Family Medicine and Internal Medicine, Oregon Health & Science University, Portland, Oregon
- Fei Chen
- Department of Anesthesiology, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, North Carolina
- Michael J Duncan
- Department of Anesthesiology, University of Missouri-Kansas City, Kansas City, Missouri
- Ilana R Fromer
- Department of Anesthesiology, University of Minnesota, Minneapolis, Minnesota
- Matthew R Hallman
- Department of Anesthesiology and Pain Medicine, University of Washington, Seattle, Washington
- Thomas Hoang
- Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, Oregon
- Robert Isaak
- Department of Anesthesiology, University of North Carolina, Chapel Hill, North Carolina
- Lisa L Klesius
- Department of Anesthesiology, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- Beth L Ladlie
- Department of Anesthesiology, Mayo Clinic, Rochester, Minnesota
- Amy K Miller Juve
- Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, Oregon
- John D Mitchell
- Department of Anesthesiology, Critical Care, and Perioperative Medicine, Henry Ford Health, Detroit, Michigan
- Brian J McGrath
- Department of Anesthesiology, University of Florida College of Medicine-Jacksonville, Jacksonville, Florida
- John A Shepler
- Department of Anesthesiology, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- Charles R Sims
- Department of Anesthesiology & Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Christina M Spofford
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Pedro P Tanaka
- Department of Anesthesiology, Stanford University, Stanford, California
- Robert B Maniker
- Department of Anesthesiology, Columbia University, New York, New York
2. Janssens O, Andreou V, Embo M, Valcke M, De Ruyck O, Robbrecht M, Haerens L. The identification of requirements for competency development during work-integrated learning in healthcare education. BMC Medical Education 2024; 24:427. [PMID: 38649850] [PMCID: PMC11034030] [DOI: 10.1186/s12909-024-05428-9]
Abstract
BACKGROUND: Work-integrated learning (WIL) is widely accepted as necessary for healthcare students to attain the competencies essential to their future workplaces. Yet competency-based education (CBE) remains complex to implement. During WIL, the focus often falls on daily practice, which puts continuous competency development at stake; moreover, the fact that competencies need to develop continuously is often neglected.
OBJECTIVES: To ultimately contribute to the optimization of CBE in healthcare education, this study examined how competency development during WIL could be optimized, both before and after graduation.
METHODS: Fourteen semi-structured interviews with 16 experts in competency development and WIL were carried out. Eight healthcare disciplines were included: associate degree nursing, audiology, family medicine, nursing (bachelor), occupational therapy, podiatry, pediatrics, and speech therapy. Two independent experts from outside the healthcare domain were also included to broaden the perspectives on competency development. A qualitative research approach based on inductive thematic analysis was used in NVivo 12, with 'in vivo' codes clustered into sub-themes and themes.
RESULTS: The analysis revealed eight types of requirements for effective and continuous competency development, relating to (1) competency frameworks, (2) reflection and feedback, (3) assessment, (4) the continuity of competency development, (5) mentor involvement, (6) ePortfolios, (7) competency development visualizations, and (8) competency development after graduation. Notably, certain requirements were fulfilled in one educational program but absent in another, underscoring the large differences in how competency-based education takes shape across educational programs and internship contexts. Nevertheless, all educational programs seemed to recognize the importance of ongoing competency development.
CONCLUSION: The results of this study indicate that identifying and meeting the requirements for effective and continuous competency development is essential to optimize competency development during practice in healthcare education.
Affiliation(s)
- Oona Janssens
- Department of Educational Studies, Faculty of Psychology and Educational Sciences, Ghent University, H. Dunantlaan 2, Ghent, 9000, Belgium
- Department of Movement and Sports Sciences, Faculty of Medicine and Health Sciences, Ghent University, Ghent, 9000, Belgium
- Vasiliki Andreou
- Department of Public Health and Primary Care, Academic Center for General Practice, KU Leuven, Kapucijnenvoer 7, Leuven, 3000, Belgium
- Mieke Embo
- Department of Educational Studies, Faculty of Psychology and Educational Sciences, Ghent University, H. Dunantlaan 2, Ghent, 9000, Belgium
- Expertise Network Health and Care, Artevelde University of Applied Sciences, Voetweg 66, Ghent, 9000, Belgium
- Martin Valcke
- Department of Educational Studies, Faculty of Psychology and Educational Sciences, Ghent University, H. Dunantlaan 2, Ghent, 9000, Belgium
- Olivia De Ruyck
- Imec-mict-UGent, Miriam Makebaplein 1, Ghent, 9000, Belgium
- Department of Industrial Systems Engineering and Product Design, Faculty of Engineering and Architecture, Ghent University, Campus Kortrijk, Graaf Karel de Goedelaan 5, Kortrijk, 8500, Belgium
- Department of Communication Sciences, Ghent University, Campus Ufo, Technicum T1, Sint-Pietersnieuwstraat 41, Ghent, 9000, Belgium
- Marieke Robbrecht
- Department of Internal Medicine and Pediatrics, Faculty of Medicine and Health Sciences, Ghent University, C. Heymanslaan 10, Ghent, 9000, Belgium
- Leen Haerens
- Department of Movement and Sports Sciences, Faculty of Medicine and Health Sciences, Ghent University, Ghent, 9000, Belgium
3. Paternotte E, Dijksterhuis M, Goverde A, Ezzat H, Scheele F. Comparison of OBGYN postgraduate curricula and assessment methods between Canada and the Netherlands: an auto-ethnographic study. Front Med (Lausanne) 2024; 11:1363222. [PMID: 38601119] [PMCID: PMC11004340] [DOI: 10.3389/fmed.2024.1363222]
Abstract
Introduction: Although the Dutch and the Canadian postgraduate Obstetrics and Gynecology (OBGYN) medical education systems are similar in their foundations [programmatic assessment, competency based, involving CanMEDS roles and entrustable professional activities (EPAs)] and comparable in healthcare outcomes, their program structures and assessment methods differ considerably.
Materials and methods: We compared both countries' postgraduate educational blueprints and used an auto-ethnographic method to gain insight into how training program structure and assessment methods affect the way trainees work. The research questions for this study were as follows: what are the differences in program structure and assessment program in OBGYN postgraduate medical education in the Netherlands and Canada, and how do these differences affect the trainee's advancement to higher competency?
Results: We found four main differences. The first two are the duration of training and the number of EPAs defined in the curricula. The most significant difference, however, is the way EPAs are entrusted. In Canada, supervision is given regardless of EPA competence, whereas in the Netherlands, being competent means being entrusted, resulting in meaningful and practical independence in the workplace. Another difference is that Canadian OBGYN trainees must pass a summative written and oral exit examination. This difference in the assessment program is largely explained by cultural and legal aspects of postgraduate training, leading to differences in licensing practice.
Discussion: Although programmatic assessment is the foundation for assessment in medical education in both Canada and the Netherlands, the significance of entrustment differs. Trainees struggle to differentiate between formative and summative assessments, experiencing both as a judgement of their competence and progress. Based on this auto-ethnographic study, the potential for further harmonization of OBGYN PGME in Canada and the Netherlands remains limited.
Affiliation(s)
- Emma Paternotte
- Department of Obstetrics and Gynaecology, Gelre Hospitals, Apeldoorn, Netherlands
- Marja Dijksterhuis
- Department of Obstetrics and Gynaecology, Amphia Ziekenhuis, Breda, Netherlands
- Angelique Goverde
- Department of Obstetrics and Gynaecology, University Medical Center Utrecht, Utrecht, Netherlands
- Hanna Ezzat
- Division of General Gynaecology and Obstetrics, University of British Columbia, Vancouver, BC, Canada
- Fedde Scheele
- Department of Obstetrics and Gynaecology, Onze Lieve Vrouwe Gasthuis (OLVG), Amsterdam, Netherlands
4. Richardson D, Landreville JM, Trier J, Cheung WJ, Bhanji F, Hall AK, Frank JR, Oswald A. Coaching in Competence by Design: A New Model of Coaching in the Moment and Coaching Over Time to Support Large Scale Implementation. Perspectives on Medical Education 2024; 13:33-43. [PMID: 38343553] [PMCID: PMC10854464] [DOI: 10.5334/pme.959]
Abstract
Coaching is an increasingly popular means to provide individualized, learner-centered, developmental guidance to trainees in competency-based medical education (CBME) curricula. Aligned with CBME's core components, coaching can assist in leveraging the full potential of this educational approach. With its focus on growth and improvement, coaching helps trainees develop clinical acumen and self-regulated learning skills. Developing a shared mental model for coaching in the medical education context is crucial to facilitate integration and subsequent evaluation of success. This paper describes the Royal College of Physicians and Surgeons of Canada's coaching model, one that is theory based, evidence informed, principle driven, and iteratively developed by a multidisciplinary team. The coaching model was specifically designed to be fit for purpose in the postgraduate medical education (PGME) context and was implemented as part of Competence by Design (CBD), a new competency-based PGME program. This coaching model differentiates two coaching roles, which reflect the different contexts in which postgraduate trainees learn and develop skills. Both roles are supported by the RX-OCR process: developing Relationship/Rapport, setting eXpectations, Observing, a Coaching conversation, and Recording/Reflecting. The CBD Coaching Model and its associated RX-OCR faculty development tool support the implementation of coaching in CBME. Coaching in the moment and coaching over time offer important mechanisms by which CBD brings value to trainees. For sustained change to occur and for learners and coaches to experience the model's intended benefits, ongoing professional development efforts are needed. Early post-implementation reflections and lessons learned are provided.
Affiliation(s)
- Denyse Richardson
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jessica Trier
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Warren J. Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Education, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Andrew K. Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Jason R. Frank
- University of Ottawa Faculty of Medicine, Ottawa, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
5. Oswald A, Dubois D, Snell L, Anderson R, Karpinski J, Hall AK, Frank JR, Cheung WJ. Implementing Competence Committees on a National Scale: Design and Lessons Learned. Perspectives on Medical Education 2024; 13:56-67. [PMID: 38343555] [PMCID: PMC10854462] [DOI: 10.5334/pme.961]
Abstract
Competence committees (CCs) are a recent innovation to improve assessment decision-making in health professions education. CCs enable a group of trained, dedicated educators to review a portfolio of observations about a learner's progress toward competence and make systematic assessment decisions. CCs are aligned with competency-based medical education (CBME) and programmatic assessment. While there is an emerging literature on CCs, little has been published on their system-wide implementation. National-scale implementation of CCs is complex, owing to the culture change that underlies this shift in assessment paradigm and the logistics and skills needed to enable it. We present the Royal College of Physicians and Surgeons of Canada's experience implementing a national CC model, the challenges the Royal College faced, and some strategies to address them. With large-scale CC implementation, managing the tension between standardization and flexibility is a fundamental issue that needs to be anticipated and addressed, with careful consideration of individual program needs, resources, and engagement of invested groups. If implementation is to take place in a wide variety of contexts, an approach that uses multiple engagement and communication strategies to allow for local adaptations is needed. Large-scale implementation of CCs, like any transformative initiative, does not occur at a single point but is an evolutionary process requiring both upfront resources and ongoing support. As such, it is important to consider embedding a plan for program evaluation at the outset. We hope these shared lessons will be of value to other educators who are considering a large-scale CBME CC implementation.
Affiliation(s)
- Anna Oswald
- Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- 8-130 Clinical Sciences Building, 11350-83 Avenue, Edmonton, AB, Canada
- Daniel Dubois
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Anesthesiology and Pain Medicine, University of Ottawa, Ottawa, ON, Canada
- Linda Snell
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Institute of Health Sciences Education and Department of Medicine, McGill University, Montreal, QC, Canada
- Robert Anderson
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Northern Ontario School of Medicine University, Sudbury, ON, Canada
- Jolanta Karpinski
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Andrew K. Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Jason R. Frank
- Centre for Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, 1053 Carling Avenue, Rm F660, Ottawa, ON, Canada
6. Cheung WJ, Bhanji F, Gofton W, Hall AK, Karpinski J, Richardson D, Frank JR, Dudek N. Design and Implementation of a National Program of Assessment Model - Integrating Entrustable Professional Activity Assessments in Canadian Specialist Postgraduate Medical Education. Perspectives on Medical Education 2024; 13:44-55. [PMID: 38343554] [PMCID: PMC10854461] [DOI: 10.5334/pme.956]
Abstract
Traditional approaches to assessment in health professions education, which have generally focused on the summative function of assessment through the development and episodic use of individual high-stakes examinations, may no longer be appropriate in an era of competency-based medical education. Contemporary assessment programs should not only ensure collection of high-quality performance data to support robust decision-making on learners' achievement and competence development but also facilitate the provision of meaningful feedback to learners to support reflective practice and performance improvement. Programmatic assessment is a specific approach to designing assessment systems through the intentional selection and combination of a variety of assessment methods and activities, embedded within an educational framework, to simultaneously optimize the decision-making and learning functions of assessment. It is a core component of competency-based medical education and is aligned with the goals of promoting assessment for learning and coaching learners to achieve predefined levels of competence. In Canada, postgraduate specialist medical education has undergone a transformative change to a competency-based model centred on entrustable professional activities (EPAs). In this paper, we describe and reflect on the large-scale, national implementation of a program of assessment model designed to guide learning and ensure that robust data are collected to support defensible decisions about EPA achievement and progress through training. Reflecting on the design and implications of this assessment system may help others who want to incorporate a competency-based approach in their own country.
Affiliation(s)
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, 1053 Carling Avenue, Rm F660, Ottawa, ON K1Y 4E9, Canada
- Farhan Bhanji
- Department of Pediatrics (Critical Care), Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Wade Gofton
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Surgery, Division of Orthopaedic Surgery, University of Ottawa, Ottawa, ON, Canada
- Andrew K. Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jolanta Karpinski
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Denyse Richardson
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Jason R. Frank
- Department of Emergency Medicine and Centre for Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Nancy Dudek
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Medicine, Division of Physical Medicine and Rehabilitation, University of Ottawa, Ottawa, ON, Canada
7. Caretta-Weyer HA, Smirnova A, Barone MA, Frank JR, Hernandez-Boussard T, Levinson D, Lombarts KMJMH, Lomis KD, Martini A, Schumacher DJ, Turner DA, Schuh A. The Next Era of Assessment: Building a Trustworthy Assessment System. Perspectives on Medical Education 2024; 13:12-23. [PMID: 38274558] [PMCID: PMC10809864] [DOI: 10.5334/pme.1110]
Abstract
Assessment in medical education has evolved through a sequence of eras, each centering on distinct views and values. These eras include measurement (e.g., knowledge exams, objective structured clinical examinations), then judgments (e.g., workplace-based assessments, entrustable professional activities), and most recently systems or programmatic assessment, in which multiple types and sources of data are collected over time and combined by competency committees to ensure individual learners are ready to progress to the next stage of their training. Significantly less attention has been paid to the social context of assessment, which has led to an overall erosion of trust in assessment by a variety of stakeholders, including learners and frontline assessors. To move forward meaningfully, the authors assert that the reestablishment of trust should be foundational to the next era of assessment. In their actions and interventions, it is imperative that medical education leaders address and build trust in assessment at a systems level. To that end, the authors first review tenets on the social contextualization of assessment and its linkage to trust and discuss the consequences should the current state of low trust continue. The authors then posit that trusting and trustworthy relationships can exist at individual as well as organizational and systems levels. Finally, the authors propose a framework for building trust at multiple levels in a future assessment system: one that invites and supports professional and human growth and has the potential to position assessment as a fundamental component of renegotiating the social contract between medical education and the health of the public.
Affiliation(s)
- Holly A. Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California, USA
- Alina Smirnova
- Department of Family Medicine, University of Calgary, Calgary, Alberta, Canada
- Kern Institute for the Transformation of Medical Education, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Michael A. Barone
- NBME, Philadelphia, Pennsylvania, USA
- Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Dana Levinson
- Josiah Macy Jr Foundation, Philadelphia, Pennsylvania, USA
- Kiki M. J. M. H. Lombarts
- Department of Medical Psychology, Amsterdam University Medical Centers, University of Amsterdam, Netherlands
- Amsterdam Public Health research institute, Amsterdam, Netherlands
- Kimberly D. Lomis
- Undergraduate Medical Education Innovations, American Medical Association, Chicago, Illinois, USA
- Abigail Martini
- Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio, USA
- Daniel J. Schumacher
- Division of Emergency Medicine, Cincinnati Children’s Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- David A. Turner
- American Board of Pediatrics, Chapel Hill, North Carolina, USA
- Abigail Schuh
- Division of Emergency Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
8. Szulewski A, Braund H, Dagnone DJ, McEwen L, Dalgarno N, Schultz KW, Hall AK. The Assessment Burden in Competency-Based Medical Education: How Programs Are Adapting. Academic Medicine 2023; 98:1261-1267. [PMID: 37343164] [DOI: 10.1097/acm.0000000000005305]
Abstract
Residents and faculty have described a burden of assessment related to the implementation of competency-based medical education (CBME) that may undermine its benefits. Although this concerning signal has been identified, little has been done to identify adaptations that address the problem. Grounded in an analysis of an early Canadian pan-institutional CBME adopter's experience, this article describes postgraduate programs' adaptations related to the challenges of assessment in CBME. From June 2019 to September 2022, 8 residency programs underwent a standardized Rapid Evaluation guided by the Core Components Framework (CCF). Sixty interviews and 18 focus groups were held with invested partners. Transcripts were analyzed abductively using the CCF, and ideal implementation was compared with enacted implementation. These findings were then shared with program leaders, adaptations were subsequently developed, and technical reports were generated for each program. Researchers reviewed the technical reports to identify themes related to the burden of assessment, with a subsequent focus on identifying adaptations across programs. Three themes were identified: (1) disparate mental models of assessment processes in CBME, (2) challenges in workplace-based assessment processes, and (3) challenges in performance review and decision making. Theme 1 included entrustment interpretation and the lack of a shared mindset for performance standards; adaptations included revising entrustment scales, faculty development, and formalizing resident membership. Theme 2 involved direct observation, timeliness of assessment completion, and feedback quality; adaptations included alternative assessment strategies beyond entrustable professional activity forms and proactive assessment planning. Theme 3 related to resident data monitoring and competence committee decision making; adaptations included adding resident representatives to the competence committee and assessment platform enhancements. These adaptations represent responses to the significant burden of assessment being experienced broadly within CBME. The authors hope other programs may learn from their institution's experience in navigating the CBME-related assessment burden their invested partners may be facing.
Affiliation(s)
- Adam Szulewski
- A. Szulewski is associate professor, Departments of Emergency Medicine and Psychology, and educational scholarship lead, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-3076-6221
- Heather Braund
- H. Braund is associate director of scholarship and simulation education, Office of Professional Development and Educational Scholarship, and assistant (adjunct) professor, Department of Biomedical and Molecular Sciences and School of Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-9749-7193
- Damon J Dagnone
- D.J. Dagnone is associate professor, Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6963-7948
- Laura McEwen
- L. McEwen is director of assessment and evaluation of postgraduate medical education and assistant professor, Department of Pediatrics, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-2457-5311
- Nancy Dalgarno
- N. Dalgarno is director of education scholarship, Office of Professional Development and Educational Scholarship, and assistant professor (adjunct), Department of Biomedical and Molecular Sciences and Master of Health Professions Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-7932-9949
- Karen W Schultz
- K.W. Schultz is professor, Department of Family Medicine, and associate dean of postgraduate medical education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-0208-3981
- Andrew K Hall
- A.K. Hall is associate professor and vice chair of education, Department of Emergency Medicine, University of Ottawa, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1227-5397
9
Yilmaz Y, Chan MK, Richardson D, Atkinson A, Bassilious E, Snell L, Chan TM. Defining new roles and competencies for administrative staff and faculty in the age of competency-based medical education. MEDICAL TEACHER 2023; 45:395-403. [PMID: 36471921 DOI: 10.1080/0142159x.2022.2136517]
Abstract
PURPOSE These authors sought to define the new roles and competencies required of administrative staff and faculty in the age of CBME. METHOD A modified Delphi process was used to define the new CBME roles and competencies needed by faculty and administrative staff. We invited international experts in CBME (volunteers from the ICBME Collaborative email list), as well as faculty members and trainees identified via social media to help us determine the new competencies required of faculty and administrative staff in the CBME era. RESULTS Thirteen new roles were identified. The faculty-specific roles were: National Leader/Facilitator in CBME; Institutional/University lead for CBME; Assessment Process & Systems Designer; Local CBME Leads; CBME-specific Faculty Developers or Trainers; Competence Committee Chair; Competence Committee Faculty Member; Faculty Academic Coach/Advisor or Support Person; Frontline Assessor; Frontline Coach. The staff-specific roles were: Information Technology Lead; CBME Analytics/Data Support; Competence Committee Administrative Assistant. CONCLUSIONS The authors present a new set of faculty and staff roles that are relevant to the CBME context. While some of these new roles may be incorporated into existing roles, it may be prudent to examine how best to ensure that all of them are supported within all CBME contexts in some manner.
Affiliation(s)
- Yusuf Yilmaz
- McMaster Education Research, Innovation, and Theory (MERIT), and Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Department of Medical Education, Faculty of Medicine, Ege University, Izmir, Turkey
- Ming-Ka Chan
- Department of Pediatrics and Child Health, University of Manitoba, Winnipeg, Canada
- Denyse Richardson
- Department of Medicine, Dalla Lana School of Public Health, University of Toronto, Toronto, Canada
- Adelle Atkinson
- Department of Pediatrics, University of Toronto, Toronto, Canada
- Ereny Bassilious
- Department of Pediatrics, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Linda Snell
- Medicine and Health Sciences Education, Faculty of Medicine and Health Sciences, McGill University, Montreal, Canada
- Teresa M Chan
- McMaster Education Research, Innovation, and Theory (MERIT), and Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada
- Divisions of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada
10
Jamieson S. State of the science: Quality improvement of medical curricula-How should we approach it? MEDICAL EDUCATION 2023; 57:49-56. [PMID: 35950304 PMCID: PMC10087231 DOI: 10.1111/medu.14912]
Abstract
INTRODUCTION Quality improvement (QI) of the medical curriculum is generally regarded as a continuous process of evaluating whether the specific curriculum meets relevant educational and professional standards, implementing new activities or other measures to address perceived deficiencies, and subsequently re-evaluating the quality of the curriculum. QI is of consequence to medical learners, educators, patients, carers, specific disciplines and specialties, regulators and funders. METHODS To address how we should approach QI of medical curricula, a narrative review was undertaken, drawing mainly on medical/health professions education literature, identified through searches of the MEDLINE, EMBASE, PUBMED and ERIC databases, and also on exemplar curricular frameworks and evaluation reports. Assumptions and practices in QI of medical curricula were explored critically. RESULTS The review compares alternative conceptualisations of QI; asks questions about priorities and perspectives in what we choose to evaluate; reflects on standards used to guide QI; critically discusses methods, models and theoretical approaches to the generation of evaluation data; and considers ownership of, and engagement with, QI of medical curricula. CONCLUSIONS Recommendations for curriculum teams include that discourse is necessary to achieve transparency and a shared understanding of continuous QI in a particular curricular context. Continuous QI requires data collection methods aligned to specific evaluation questions/foci; multiple methods for data collection, from different stakeholders; and appropriate evaluation models and theory to provide a framework for QI. Embracing a quality culture approach may increase the sense of ownership experienced by stakeholders. Mechanisms include creating democratic-collegiate cultures for multiple stakeholders to collaborate in QI; engaging stakeholders in QI activities and projects (e.g. SoTL) that contribute to holistic continuous QI; and proactively embedding quality in the (co-)creation of curriculum components and resources.
Affiliation(s)
- Susan Jamieson
- School of Medicine, Dentistry & Nursing, University of Glasgow, Glasgow, UK
11
Wright C, Matthews K. An intentional approach to the development and implementation of meaningful assessment in advanced radiation therapy practice curricula. Tech Innov Patient Support Radiat Oncol 2022; 24:13-18. [PMID: 36124225 PMCID: PMC9482137 DOI: 10.1016/j.tipsro.2022.08.010]
Abstract
Intentional assessment design is essential for advanced practitioner courses. Using programmatic assessment enables learner agency and individualised learning. Aligning assessment longitudinally facilitates transparent achievement of advanced practitioner capabilities.
Creating meaningful assessment for advanced radiation therapy practice training programs is a challenge. This is because it requires a balance of formative and summative assessments, which meet the academic and professional needs of the practitioner, as well as the requirements of local service delivery, educational and professional standards. This paper discusses educational strategies and models used to integrate assessment into theoretical and clinical curricula, allowing practitioners to demonstrate higher order cognitive knowledge, advanced level clinical performance and attitudes/values associated with advanced practice. The discussion draws upon concepts of constructive alignment and programmatic approaches to assessment, which use Bloom’s taxonomy, Benner’s beginner to competent model of skill development, and Miller’s pyramid of clinical competence. These models are analysed with respect to an advanced practice program in adaptive radiation therapy to provide context.
12
Brown DR, Moeller JJ, Grbic D, Andriole DA, Cutrer WB, Obeso VT, Hormann MD, Amiel JM. Comparing Entrustment Decision-Making Outcomes of the Core Entrustable Professional Activities Pilot, 2019-2020. JAMA Netw Open 2022; 5:e2233342. [PMID: 36156144 PMCID: PMC9513644 DOI: 10.1001/jamanetworkopen.2022.33342]
Abstract
IMPORTANCE Gaps in readiness for indirect supervision have been identified for essential responsibilities encountered early in residency, presenting risks to patient safety. Core Entrustable Professional Activities (EPAs) for entering residency have been proposed as a framework to address these gaps and strengthen the transition from medical school to residency. OBJECTIVE To assess progress in developing an entrustment process in the Core EPAs framework. DESIGN, SETTING, AND PARTICIPANTS In this quality improvement study in the Core EPAs for Entering Residency Pilot, trained faculty made theoretical entrustment determinations and recorded the number of workplace-based assessments (WBAs) available for each determination in 2019 and 2020. Four participating schools attempted entrustment decision-making for all graduating students or a randomly selected subset of students. Deidentified, individual-level data were merged into a multischool database. INTERVENTIONS Schools implemented EPA-related curriculum, WBAs, and faculty development; developed systems to compile and display data; and convened groups to make theoretical summative entrustment determinations. MAIN OUTCOMES AND MEASURES On an EPA-specific basis, the percentage of students for whom an entrustment determination could be made, the percentage of students ready for indirect supervision, and the volume of WBAs available were recorded. RESULTS Four participating schools made 4525 EPA-specific readiness determinations (2296 determinations in 2019 and 2229 determinations in 2020) for 732 graduating students (349 students in 2019 and 383 students in 2020). 
Across all EPAs, the proportion of determinations of "ready for indirect supervision" increased from 2019 to 2020 (997 determinations [43.4%] vs 1340 determinations [60.1%]; 16.7 percentage point increase; 95% CI, 13.8-19.6 percentage points; P < .001), as did the proportion of determinations for which there were 4 or more WBAs (456 of 2295 determinations with WBA data [19.9%] vs 938 [42.1%]; 22.2 percentage point increase; 95% CI, 19.6-24.8 percentage points; P < .001). The proportion of EPA-specific data sets considered for which an entrustment determination could be made increased from 1731 determinations (75.4%) in 2019 to 2010 determinations (90.2%) in 2020 (14.8 percentage point increase; 95% CI, 12.6-16.9 percentage points; P < .001). On an EPA-specific basis, there were 5 EPAs (EPA 4 [orders], EPA 8 [handovers], EPA 10 [urgent care], EPA 11 [informed consent], and EPA 13 [patient safety]) for which few students were deemed ready for indirect supervision and for which there were few WBAs available per student in either year. For example, for EPA 13, 0 of 125 students were deemed ready in 2019 and 0 of 127 students were deemed ready in 2020, while 0 determinations in either year included 4 or more WBAs. CONCLUSIONS AND RELEVANCE These findings suggest that there was progress in WBA data collected, the extent to which entrustment determinations could be made, and proportions of entrustment determinations reported as ready for indirect supervision. However, important gaps remained, particularly for a subset of Core EPAs.
Affiliation(s)
- David R. Brown
- Division of Family and Community Medicine, Department of Humanities, Health, and Society, Florida International University Herbert Wertheim College of Medicine, Miami
- Jeremy J. Moeller
- Department of Neurology, Yale University School of Medicine, New Haven, Connecticut
- Douglas Grbic
- Medical Education Research, Association of American Medical Colleges, Washington, District of Columbia
- Dorothy A. Andriole
- Medical Education Research, Association of American Medical Colleges, Washington, District of Columbia
- William B. Cutrer
- Department of Pediatrics, Division of Critical Care Medicine at Vanderbilt University School of Medicine, Nashville, Tennessee
- Vivian T. Obeso
- Division of Internal Medicine, Department of Translational Medicine, Florida International University Herbert Wertheim College of Medicine, Miami
- Mark D. Hormann
- Division of Community and General Pediatrics, Department of Pediatrics, McGovern Medical School at UTHealth, Houston, Texas
- Jonathan M. Amiel
- Dean’s Office, Columbia University Vagelos College of Physicians and Surgeons, New York, New York
- Department of Psychiatry, Columbia University Vagelos College of Physicians and Surgeons, New York, New York
13
Torre D, Schuwirth L, Van der Vleuten C, Heeneman S. An international study on the implementation of programmatic assessment: Understanding challenges and exploring solutions. MEDICAL TEACHER 2022; 44:928-937. [PMID: 35701165 DOI: 10.1080/0142159x.2022.2083487]
Abstract
INTRODUCTION Programmatic assessment is an approach to assessment aimed at optimizing the learning and decision function of assessment. It involves a set of key principles and ground rules that are important for its design and implementation. However, despite its intuitive appeal, its implementation remains a challenge. The purpose of this paper is to gain a better understanding of the factors that affect the implementation process of programmatic assessment and how specific implementation challenges are managed across different programs. METHODS An explanatory multiple case (collective) approach was used for this study. We identified 6 medical programs that had implemented programmatic assessment with variation regarding health profession disciplines, level of education and geographic location. We conducted interviews with a key faculty member from each of the programs and analyzed the data using inductive thematic analysis. RESULTS We identified two major factors in managing the challenges and complexity of the implementation process: knowledge brokers and a strategic opportunistic approach. Knowledge brokers were the people who drove and designed the implementation process acting by translating evidence into practice allowing for real-time management of the complex processes of implementation. These knowledge brokers used a 'strategic opportunistic' or agile approach to recognize new opportunities, secure leadership support, adapt to the context and take advantage of the unexpected. Engaging in an overall curriculum reform process was a critical factor for a successful implementation of programmatic assessment. DISCUSSION The study contributes to the understanding of the intricacies of implementation processes of programmatic assessment across different institutions. Managing opportunities, adaptive planning, awareness of context, were all critical aspects of thinking strategically and opportunistically in the implementation of programmatic assessment. 
Future research is needed to provide a more in-depth understanding of values and beliefs that underpin the assessment culture of an organization, and how such values may affect implementation.
Affiliation(s)
- Dario Torre
- Director of Assessment and Professor of Medicine, University of Central Florida College of Medicine, Orlando, FL, USA
- Lambert Schuwirth
- College of Medicine and Public Health, Flinders University, Adelaide, Australia
- Cees Van der Vleuten
- Department of Educational Development and Research, School of Health Profession Education, Maastricht University, Maastricht, The Netherlands
- Sylvia Heeneman
- Department of Pathology, School of Health Profession Education, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht, The Netherlands
14
Concordance of Narrative Comments with Supervision Ratings Provided During Entrustable Professional Activity Assessments. J Gen Intern Med 2022; 37:2200-2207. [PMID: 35710663 PMCID: PMC9296736 DOI: 10.1007/s11606-022-07509-1]
Abstract
BACKGROUND Use of EPA-based entrustment-supervision ratings to determine a learner's readiness to assume patient care responsibilities is expanding. OBJECTIVE In this study, we investigate the correlation between narrative comments and supervision ratings assigned during ad hoc assessments of medical students' performance of EPA tasks. DESIGN Data from assessments completed for students enrolled in the clerkship phase over 2 academic years were used to extract a stratified random sample of 100 narrative comments for review by an expert panel. PARTICIPANTS A review panel, comprised of faculty with specific expertise related to their roles within the EPA program, provided a "gold standard" supervision rating using the comments provided by the original assessor. MAIN MEASURES Interrater reliability (IRR) between members of review panel and correlation coefficients (CC) between expert ratings and supervision ratings from original assessors. KEY RESULTS IRR among members of the expert panel ranged from .536 for comments associated with focused history taking to .833 for complete physical exam. CC (Kendall's correlation coefficient W) between panel members' assignment of supervision ratings and the ratings provided by the original assessors for history taking, physical examination, and oral presentation comments were .668, .697, and .735 respectively. The supervision ratings of the expert panel had the highest degree of correlation with ratings provided during assessments done by master assessors, faculty trained to assess students across clinical contexts. Correlation between supervision ratings provided with the narrative comments at the time of observation and supervision ratings assigned by the expert panel differed by clinical discipline, perhaps reflecting the value placed on, and perhaps the comfort level with, assessment of the task in a given specialty. 
CONCLUSIONS To realize the full educational and catalytic effect of EPA assessments, assessors must apply established performance expectations and provide high-quality narrative comments aligned with the criteria.
15
Seifman MA, Young AB, Nestel D. Simulation in plastic and reconstructive surgery: a scoping review. Simul Healthc 2022. [DOI: 10.54531/hnpw7177]
Abstract
Since the origins of surgery, simulation has played an important role in surgical education, particularly in plastic and reconstructive surgery. This has greater relevance in contemporary settings, where reduced clinical exposure limits work-based learning opportunities. With changing surgical curricula, it is prescient to examine the role of simulation in plastic and reconstructive surgery.
A scoping review protocol was used to identify relevant studies, with an iterative process identifying, reviewing and charting the data to derive reported outcomes and themes.
Of the 554 studies identified, 52 studies were included in this review. The themes identified included simulator modalities, curriculum elements targeted and relevant surgical competencies. There was a predominance of synthetically based simulators, targeting technical skills largely associated with microsurgery, paediatric surgery and craniomaxillofacial surgery.
Existing simulators largely address high-complexity procedures. There are multiple under-represented areas, including low-complexity procedures and simulation activities addressing communication, collaboration, management and leadership. There are many opportunities for simulation in surgical education, which requires a contextual appreciation of educational theory. Simulation may be used both as a learning method and as an assessment tool.
This review describes the literature relating to simulation in plastic and reconstructive surgery and proposes opportunities for incorporating simulation more broadly into the surgical curriculum.
Affiliation(s)
- Marc A Seifman
- Plastic, Reconstructive and Hand Surgery Unit, Peninsula Health, Frankston, Australia
- Abby B Young
- Plastic, Reconstructive and Hand Surgery Unit, Peninsula Health, Frankston, Australia
- Debra Nestel
- Faculty of Medicine, Dentistry and Health Sciences, The University of Melbourne, Melbourne, Australia
16
Do Resident Archetypes Influence the Functioning of Programs of Assessment? EDUCATION SCIENCES 2022. [DOI: 10.3390/educsci12050293]
Abstract
While most case studies consider how programs of assessment may influence residents’ achievement, we engaged in a qualitative, multiple case study to model how resident engagement and performance can reciprocally influence the program of assessment. We conducted virtual focus groups with program leaders from four residency training programs from different disciplines (internal medicine, emergency medicine, neurology, and rheumatology) and institutions. We facilitated discussion with live screen-sharing to (1) improve upon a previously-derived model of programmatic assessment and (2) explore how different resident archetypes (sample profiles) may influence their program of assessment. Participants agreed that differences in resident engagement and performance can influence their programs of assessment in some (mal)adaptive ways. For residents who are disengaged and weakly performing (of which there are a few), significantly more time is spent to make sense of problematic evidence, arrive at a decision, and generate recommendations. Whereas for residents who are engaged and performing strongly (the vast majority), significantly less effort is thought to be spent on discussion and formalized recommendations. These findings motivate us to fulfill the potential of programmatic assessment by more intentionally and strategically challenging those who are engaged and strongly performing, and by anticipating ways that weakly performing residents may strain existing processes.
17
The Importance of Professional Development in a Programmatic Assessment System: One Medical School’s Experience. EDUCATION SCIENCES 2022. [DOI: 10.3390/educsci12030220]
Abstract
The Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (CCLCM) was created in 2004 as a 5-year undergraduate medical education program with a mission to produce future physician-investigators. CCLCM’s assessment system aligns with the principles of programmatic assessment. The curriculum is organized around nine competencies, where each competency has milestones that students use to self-assess their progress and performance. Throughout the program, students receive low-stakes feedback from a myriad of assessors across courses and contexts. With support of advisors, students construct portfolios to document their progress and performance. A separate promotion committee makes high-stakes promotion decisions after reviewing students’ portfolios. This case study describes a systematic approach to provide both student and faculty professional development essential for programmatic assessment. Facilitators, barriers, lessons learned, and future directions are discussed.
18
Sirajuddin S, Grubb KJ. Commentary: Transcatheter cardiac surgery training: What to teach? J Thorac Cardiovasc Surg 2021; 165:2163-2164. [PMID: 34503842 DOI: 10.1016/j.jtcvs.2021.08.048]
Affiliation(s)
- Sarah Sirajuddin
- Division of Cardiothoracic Surgery, Structural Heart and Valve Center, Emory University, Atlanta, Ga.
- Kendra J Grubb
- Division of Cardiothoracic Surgery, Structural Heart and Valve Center, Emory University, Atlanta, Ga.
19
Frank JR, Snell LS, Oswald A, Hauer KE. Further on the journey in a complex adaptive system: Elaborating CBME. MEDICAL TEACHER 2021; 43:734-736. [PMID: 34097832 DOI: 10.1080/0142159x.2021.1931083]
Affiliation(s)
- Jason R Frank
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Linda S Snell
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, McGill University, Montreal, Canada
- Anna Oswald
- Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, University of Alberta, Edmonton, Canada
- Karen E Hauer
- Department of Medicine, University of California, San Francisco (UCSF) School of Medicine, San Francisco, CA, USA