1. Sibicky SL, Daugherty KK, Chen AMH, Rhoney D, Nawarskas J. Enabling Factors for the Implementation of Competency-Based Curricula in Colleges and Schools of Pharmacy. American Journal of Pharmaceutical Education 2024; 88:100681. [PMID: 38460599] [DOI: 10.1016/j.ajpe.2024.100681]
Abstract
OBJECTIVES: To review the implementation drivers of competency-based pharmacy education (CBPE) and provide recommendations for enablers.
FINDINGS: Competency-based education is an emerging model in the health professions, focusing on time-variable competency development and achievement rather than the time-bound, course-based, traditional model. CBPE is an outcomes-based, organized framework of competencies that enables pharmacists to meet health care and societal needs. However, challenges must be recognized and overcome for successful implementation of CBPE. Competency drivers include defining the competencies and roles of stakeholders, developing transparent learning trajectories and aligned assessments, and establishing lifetime development programs for stakeholders. Organization drivers include developing support systems for stakeholders; facilitating connections between all educational experiences; and having transparent assessment plans, policies, and procedures that align with core CBPE precepts, including the sustainability of time-variability. Leadership drivers include establishing a growth mindset, facilitating a culture of connection between workplace and educational environments, securing program advocacy from institutional leaders, accepting failures as part of the process, shifting the organizational culture away from learner differentiation toward competence, and maintaining sufficient administrative capability to support CBPE.
SUMMARY: Successful implementation of CBPE involves enabling the competency, organization, and leadership drivers that lead to program success. More research is needed on the creation, implementation, and assessment of CBPE to determine success in this model. We have reviewed and provided recommendations to enable the drivers of successful CBPE implementation.
Affiliation(s)
- Stephanie L Sibicky: Northeastern University School of Pharmacy and Pharmaceutical Sciences, Boston, MA, USA
- Kimberly K Daugherty: Sullivan University College of Pharmacy and Health Sciences, Louisville, KY, USA
- Aleda M H Chen: Cedarville University School of Pharmacy, Cedarville, OH, USA
- Denise Rhoney: University of North Carolina Eshelman School of Pharmacy, Chapel Hill, NC, USA
- James Nawarskas: University of New Mexico College of Pharmacy, Albuquerque, NM, USA

2. de Heer MH, Driessen EW, Teunissen PW, Scheele F. Lessons learned spanning 17 years of experience with three consecutive nationwide competency based medical education training plans. Front Med (Lausanne) 2024; 11:1339857. [PMID: 38455473] [PMCID: PMC10917951] [DOI: 10.3389/fmed.2024.1339857]
Abstract
Introduction: Curricula for postgraduate medical education have transformed since the introduction of competency-based medical education (CBME). Postgraduate training plans offer broader training with different competencies and an outcome-based approach, in addition to the medical technical aspects of training. However, CBME also has its challenges. Over the past years, critical views have been shared on the potential drawbacks of CBME, such as assessment burden and conflicts with practicality in the workplace. Recent studies identified a need for a better understanding of how the evolving concept of CBME has been translated to curriculum design and implemented in the practice of postgraduate training. The aim of this study was to describe the development of CBME translations to curriculum design, based on three consecutive postgraduate training programs spanning 17 years.
Method: We performed a document analysis of three consecutive Dutch gynecology and obstetrics training plans that were implemented in 2005, 2013, and 2021. We used template analysis to identify changes over time.
Results: Over time, CBME-based curriculum design changed in several domains. Assessment changed from a model focused on summative decisions to one emphasizing formative, low-stakes assessments aimed at supporting learning. The training plans evolved in parallel with evolving educational insights, e.g., by placing increasing emphasis on personal development. The curricula focused on a competency-based concept by introducing training modules and personalized authorization based on feedback rather than on a set duration of internships. There was increasing freedom in personalized training trajectories in the training plans, together with increasing trust towards the resident.
Conclusion: The way CBME was translated into training plans has evolved over 17 years of experience with CBME-based education. The main areas of change were the structure of the training plans, which became increasingly open; the degree to which learning outcomes were mandatory; and the way these outcomes were assessed.
Affiliation(s)
- Merel H. de Heer: Amsterdam UMC Location Vrije Universiteit Amsterdam, Research in Education, Amsterdam, Netherlands
- Erik W. Driessen: School of Health Professions Education (SHE), Faculty of Health Medicine and Life Sciences (FHML), Maastricht University, Maastricht, Netherlands
- Pim W. Teunissen: School of Health Professions Education (SHE), Faculty of Health Medicine and Life Sciences (FHML), Maastricht University, Maastricht, Netherlands; Department of Obstetrics and Gynecology, Maastricht University Medical Center (MUMC+), Maastricht, Netherlands
- Fedde Scheele: Amsterdam UMC Location Vrije Universiteit Amsterdam, Research in Education, Amsterdam, Netherlands; Athena Institute, Faculty of Science, VU, Amsterdam, Netherlands

3. Ross S, Lawrence K, Bethune C, van der Goes T, Pélissier-Simard L, Donoff M, Crichton T, Laughlin T, Dhillon K, Potter M, Schultz K. Development, Implementation, and Meta-Evaluation of a National Approach to Programmatic Assessment in Canadian Family Medicine Residency Training. Academic Medicine 2023; 98:188-198. [PMID: 35671407] [PMCID: PMC9855760] [DOI: 10.1097/acm.0000000000004750]
Abstract
The growing international adoption of competency-based medical education has created a desire for descriptions of innovative assessment approaches that generate appropriate and sufficient information to allow for informed, defensible decisions about learner progress. In this article, the authors provide an overview of the development and implementation of the approach to programmatic assessment in postgraduate family medicine training programs in Canada, called Continuous Reflective Assessment for Training (CRAFT). CRAFT is a principles-guided, high-level approach to workplace-based assessment that was intentionally designed to be adaptable to local contexts, including size of program, resources available, and structural enablers and barriers. CRAFT has been implemented in all 17 Canadian family medicine residency programs, with each program taking advantage of the high-level nature of the CRAFT guidelines to create bespoke assessment processes and tools appropriate for their local contexts. Similarities and differences in CRAFT implementation between 5 different family medicine residency training programs, representing both English- and French-language programs from both Western and Eastern Canada, are described. Despite the intentional flexibility of the CRAFT guidelines, notable similarities in assessment processes and procedures across the 5 programs were seen. A meta-evaluation of findings from programs that have published evaluation information supports the value of CRAFT as an effective approach to programmatic assessment. While CRAFT is currently in place in family medicine residency programs in Canada, given its adaptability to different contexts as well as promising evaluation data, the CRAFT approach shows promise for application in other training environments.
Affiliation(s)
- Shelley Ross: professor and director, Research and Innovation, Teaching and Assessment Support Program, Department of Family Medicine, University of Alberta, Edmonton, Alberta, Canada; ORCID: http://orcid.org/0000-0001-9581-3191
- Kathrine Lawrence: associate professor, assessment director, and provincial head, Family Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Cheri Bethune: professor, Northern Ontario School of Medicine; clinical professor, Memorial University of Newfoundland, Newfoundland, Canada; clinician educator, College of Family Physicians of Canada; ORCID: http://orcid.org/0000-0002-6230-6262
- Theresa van der Goes: family physician (retired), medical educator, and director of assessment, University of British Columbia Family Medicine Residency Program, Vancouver, British Columbia, Canada
- Luce Pélissier-Simard: associate professor, Department of Family Medicine and Emergency Medicine, and associate academic director, Centre de Développement Professionnel, Faculté de médecine et des sciences de la santé de l’Université de Sherbrooke, Sherbrooke, Québec, Canada; ORCID: http://orcid.org/0000-0002-9402-1798
- Michel Donoff: family physician, professor, and associate chair, Department of Family Medicine, University of Alberta, Edmonton, Alberta, Canada
- Thomas Crichton: family physician and senior advisor, Postgraduate Medical Education, Northern Ontario School of Medicine, Thunder Bay, Ontario, Canada
- Thomas Laughlin: associate professor, Department of Family Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; clinical associate professor, Discipline of Family Medicine, Memorial University of Newfoundland, Newfoundland, Canada
- Kiran Dhillon: clinical lecturer, Department of Family Medicine, University of Alberta, Edmonton, Alberta, Canada; member, Certification Process and Assessment Committee, College of Family Physicians of Canada
- Martin Potter: assistant professor, Family Medicine and Emergency Department, Université de Montréal, Montréal, Québec, Canada
- Karen Schultz: professor and assessment director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada; chair, Certification Process and Assessment Committee, College of Family Physicians of Canada; ORCID: http://orcid.org/0000-0003-0208-3981

4. The Importance of Professional Development in a Programmatic Assessment System: One Medical School’s Experience. Education Sciences 2022. [DOI: 10.3390/educsci12030220]
Abstract
The Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (CCLCM) was created in 2004 as a 5-year undergraduate medical education program with a mission to produce future physician-investigators. CCLCM’s assessment system aligns with the principles of programmatic assessment. The curriculum is organized around nine competencies, where each competency has milestones that students use to self-assess their progress and performance. Throughout the program, students receive low-stakes feedback from a myriad of assessors across courses and contexts. With support of advisors, students construct portfolios to document their progress and performance. A separate promotion committee makes high-stakes promotion decisions after reviewing students’ portfolios. This case study describes a systematic approach to provide both student and faculty professional development essential for programmatic assessment. Facilitators, barriers, lessons learned, and future directions are discussed.

5. Anderson HL, Kurtz J, West DC. Implementation and Use of Workplace-Based Assessment in Clinical Learning Environments: A Scoping Review. Academic Medicine 2021; 96:S164-S174. [PMID: 34406132] [DOI: 10.1097/acm.0000000000004366]
Abstract
PURPOSE: Workplace-based assessment (WBA) serves a critical role in supporting competency-based medical education (CBME) by providing assessment data to inform competency decisions and support learning. Many WBA systems have been developed, but little is known about how to effectively implement WBA. Filling this gap is important for creating suitable and beneficial assessment processes that support large-scale use of CBME. As a step toward filling this gap, the authors describe what is known about WBA implementation and use to identify knowledge gaps and future directions.
METHOD: The authors used Arksey and O'Malley's 6-stage scoping review framework to conduct the review, including: (1) identifying the research question; (2) identifying relevant studies; (3) study selection; (4) charting the data; (5) collating, summarizing, and reporting the results; and (6) consulting with relevant stakeholders.
RESULTS: In 2019-2020, the authors searched and screened 726 papers for eligibility using defined inclusion and exclusion criteria. One hundred sixty-three met inclusion criteria. The authors identified 5 themes in their analysis: (1) many WBA tools and programs have been implemented, and barriers are common across fields and specialties; (2) theoretical perspectives emphasize the need for data-driven implementation strategies; (3) user perceptions of WBA vary and are often dependent on implementation factors; (4) technology solutions could provide useful tools to support WBA; and (5) many areas of future research and innovation remain.
CONCLUSIONS: Knowledge of WBA as an implemented practice to support CBME remains constrained. To remove these constraints, future research should aim to generate generalizable knowledge on WBA implementation and use, address implementation factors, and investigate remaining knowledge gaps.
Affiliation(s)
- Hannah L Anderson: research associate, Department of Pediatrics, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; ORCID: http://orcid.org/0000-0002-9435-1535
- Joshua Kurtz: first-year resident, Department of Pediatrics, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Daniel C West: professor of pediatrics, The Perelman School of Medicine at the University of Pennsylvania; associate chair for education and senior director of medical education, Department of Pediatrics, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; ORCID: http://orcid.org/0000-0002-0909-4213

6. Bray MJ, Bradley EB, Martindale JR, Gusic ME. Implementing Systematic Faculty Development to Support an EPA-Based Program of Assessment: Strategies, Outcomes, and Lessons Learned. Teaching and Learning in Medicine 2021; 33:434-444. [PMID: 33331171] [DOI: 10.1080/10401334.2020.1857256]
Abstract
Problem: Development of a novel, competency-based program of assessment requires creation of a plan to measure the processes that enable successful implementation. The principles of implementation science outline the importance of considering key drivers that support and sustain transformative change within an educational program. The introduction of Entrustable Professional Activities (EPAs) as a framework for assessment has underscored the need to create a structured plan to prepare assessors to engage in a new paradigm of assessment. Although approaches to rater training for workplace-based assessments have been described, specific strategies to prepare assessors to apply standards related to the level of supervision a student needs have not been documented.
Intervention: We describe our systematic approach to preparing assessors (faculty and postgraduate trainees) to complete EPA assessments for medical students during the clerkship phase of our curriculum. This institution-wide program is designed to build assessors' skills in direct observation of learners during authentic patient encounters. Assessors apply new knowledge and practice skills in using established performance expectations to determine the level of supervision a learner needs to perform clinical tasks. Assessors also learn to provide feedback and narrative comments to coach students and promote their ongoing clinical development. Data visualizations for assessors facilitate reinforcement of the tenets learned during training. Collaborative learning and peer feedback during faculty development sessions promote the formation of a community of practice among assessors.
Context: Faculty development for assessors was delivered in advance of the implementation of the EPA program. Assessors in the program include residents/fellows who work closely with students, faculty with discipline-specific expertise, and a group of experienced clinicians (the Master Assessors) selected to serve as experts in competency-based EPA assessments. Training focused on creating a shared understanding of the application of criteria used to evaluate student performance. EPA assessments based on the AAMC's Core Entrustable Professional Activities for Entering Residency were completed in nine core clerkships. EPA assessments included a supervision rating based on a scale modified for use in undergraduate medical education.
Impact: Data from EPA assessments completed during the first year of the program were analyzed to evaluate the effectiveness of the faculty development activities implemented to prepare assessors to consistently apply standards for assessment. A systematic approach to training, with attention to the critical drivers that enabled institution-wide implementation, led to consistency in the supervision rating for students' first EPA assessment completed by any type of assessor, in ratings by assessors within a specific clinical context, and in ratings assigned by a group of specific assessors across clinical settings.
Lessons learned: A systematic approach to faculty development, with a willingness to be flexible and to reach potential participants using existing infrastructure, can facilitate assessors' engagement in a new culture of assessment. Interaction among participants during training sessions not only promotes learning but also contributes to community building. A leadership group responsible for overseeing faculty development can ensure that the needs of stakeholders are addressed and that a change in assessment culture is sustained.
Affiliation(s)
- Megan J Bray: Department of Obstetrics and Gynecology, Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- Elizabeth B Bradley: Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- James R Martindale: Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- Maryellen E Gusic: Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, Department of Pediatrics, University of Virginia School of Medicine, Charlottesville, Virginia, USA

7. Thoma B, Caretta-Weyer H, Schumacher DJ, Warm E, Hall AK, Hamstra SJ, Cavalcanti R, Chan TM. Becoming a deliberately developmental organization: Using competency based assessment data for organizational development. Medical Teacher 2021; 43:801-809. [PMID: 34033512] [DOI: 10.1080/0142159x.2021.1925100]
Abstract
Medical education is situated within health care and educational organizations that frequently lag in their use of data to learn, develop, and improve performance. How might we leverage competency-based medical education (CBME) assessment data at the individual, program, and system levels, with the goal of redefining CBME from an initiative that supports the development of physicians to one that also fosters the development of the faculty, administrators, and programs within our organizations? In this paper we review the Deliberately Developmental Organization (DDO) framework proposed by Robert Kegan and Lisa Lahey, a theoretical framework that explains how organizations can foster the development of their people. We then describe the DDO's conceptual alignment with CBME and outline how CBME assessment data could be used to spur the transformation of health care and educational organizations into digitally integrated DDOs. A DDO-oriented use of CBME assessment data will require intentional investment into both the digitalization of assessment data and the development of the people within our organizations. By reframing CBME in this light, we hope that educational and health care leaders will see their investments in CBME as an opportunity to spur the evolution of a developmental culture.
Affiliation(s)
- Brent Thoma: Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada; Royal College of Physicians and Surgeons of Canada, Saskatoon, Canada
- Holly Caretta-Weyer: Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Daniel J Schumacher: Department of Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric Warm: Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Andrew K Hall: Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada; Royal College of Physicians and Surgeons of Canada, Saskatoon, Canada
- Stanley J Hamstra: Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, IL, USA; Faculty of Education, University of Ottawa, Ottawa, Canada; Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Rodrigo Cavalcanti: Department of Medicine, University of Toronto, Toronto, Canada; HoPingKong Centre for Excellence in Education and Practice, UHN, Toronto, Canada
- Teresa M Chan: Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, Canada; Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada; McMaster Program for Education Research, Innovation, and Theory (MERIT), Hamilton, Canada

8. Touchie C, Kinnear B, Schumacher D, Caretta-Weyer H, Hamstra SJ, Hart D, Gruppen L, Ross S, Warm E, Ten Cate O. On the validity of summative entrustment decisions. Medical Teacher 2021; 43:780-787. [PMID: 34020576] [DOI: 10.1080/0142159x.2021.1925642]
Abstract
Health care revolves around trust. Patients are often in a position that gives them no other choice than to trust the people taking care of them. Educational programs thus have the responsibility to develop physicians who can be trusted to deliver safe and effective care, ultimately making a final decision to entrust trainees to graduate to unsupervised practice. Such entrustment decisions deserve to be scrutinized for their validity. This end-of-training entrustment decision is arguably the most important one, although earlier entrustment decisions, for smaller units of professional practice, should also be scrutinized for their validity. Validity of entrustment decisions implies a defensible argument that can be analyzed in components that together support the decision. According to Kane, building a validity argument is a process designed to support inferences of scoring, generalization across observations, extrapolation to new instances, and implications of the decision. A lack of validity can be caused by inadequate evidence in terms of, according to Messick, content, response process, internal structure (coherence) and relationship to other variables, and in misinterpreted consequences. These two leading frameworks (Kane and Messick) in educational and psychological testing can be well applied to summative entrustment decision-making. The authors elaborate the types of questions that need to be answered to arrive at defensible, well-argued summative decisions regarding performance to provide a grounding for high-quality safe patient care.
Affiliation(s)
- Claire Touchie: Medical Council of Canada, Ottawa, Canada; The University of Ottawa, Ottawa, Canada
- Benjamin Kinnear: Internal Medicine and Pediatrics, University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Daniel Schumacher: Pediatrics, Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Holly Caretta-Weyer: Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Stanley J Hamstra: University of Toronto, Toronto, Ontario, Canada; Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Danielle Hart: Emergency Medicine, Hennepin Healthcare and the University of Minnesota, Minneapolis, MN, USA
- Larry Gruppen: Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Shelley Ross: Department of Family Medicine, University of Alberta, Edmonton, AB, Canada
- Eric Warm: University of Cincinnati College of Medicine Center, Cincinnati, OH, USA
- Olle Ten Cate: Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands

9. Chan TM, Sebok-Syer SS, Cheung WJ, Pusic M, Stehman C, Gottlieb M. Workplace-based Assessment Data in Emergency Medicine: A Scoping Review of the Literature. AEM Education and Training 2021; 5:e10544. [PMID: 34099992] [PMCID: PMC8166307] [DOI: 10.1002/aet2.10544]
Abstract
OBJECTIVE: In the era of competency-based medical education (CBME), accrediting bodies such as the Accreditation Council for Graduate Medical Education and the Royal College of Physicians and Surgeons of Canada are mandating the collection of ever more trainee data. However, few efforts have been made to synthesize the literature on the current issues surrounding workplace-based assessment (WBA) data. This scoping review seeks to synthesize the landscape of literature on data collection and utilization for trainees' WBAs in emergency medicine (EM).
METHODS: The authors conducted a scoping review in the style of Arksey and O'Malley, seeking to synthesize and map literature on collecting, aggregating, and reporting WBA data. The authors extracted, mapped, and synthesized literature that describes, supports, and substantiates effective data collection and utilization in the context of the CBME movement within EM.
RESULTS: The literature search retrieved 189 potentially relevant references (after removing duplicates), which were screened down to 29 abstracts and papers relevant to collecting, aggregating, and reporting WBAs. The analysis shows an increasing temporal trend of contributions on these topics, with the majority of the papers (16/29) published in the past 3 years alone.
CONCLUSION: There is increasing interest in data collection and utilization in the age of CBME. The field, however, is only beginning to emerge, leaving more work that can and should be done in this area.
Affiliation(s)
- Teresa M. Chan: Department of Medicine, Division of Emergency Medicine and the Division of Education & Innovation, McMaster University, Hamilton, Ontario, Canada; Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada; McMaster Program for Education Research, Innovation, and Theory, McMaster University, Hamilton, Ontario, Canada
- Warren J. Cheung: Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Martin Pusic: Department of Pediatrics, Harvard Medical School, Boston, MA, USA
- Michael Gottlieb: Department of Emergency Medicine, Rush University Medical Center, Chicago, IL, USA

10. Dart J, Twohig C, Anderson A, Bryce A, Collins J, Gibson S, Kleve S, Porter J, Volders E, Palermo C. The Value of Programmatic Assessment in Supporting Educators and Students to Succeed: A Qualitative Evaluation. J Acad Nutr Diet 2021; 121:1732-1740. [PMID: 33612437] [DOI: 10.1016/j.jand.2021.01.013]
Abstract
BACKGROUND: Programmatic assessment has been proposed as the way forward for competency-based assessment, yet there is a dearth of literature describing the implementation and evaluation of programmatic assessment approaches.
OBJECTIVE: To evaluate the implementation of a programmatic assessment approach and explore its ability to support students and assessors.
DESIGN: A qualitative evaluation of programmatic assessment was employed.
PARTICIPANTS/SETTING: Interviews with graduates (n = 8) and preceptors (n = 12), together with focus groups with faculty assessors (n = 9), from one Australian university explored experiences of the programmatic approach, the role of assessment in learning, and the defensibility of assessment decisions in determining competence.
ANALYSIS PERFORMED: Data were analyzed into key themes using framework analysis.
RESULTS: The programmatic assessment increased confidence in the defensibility of assessment decisions, reduced the emotional burden of assessment, increased the value of assessment, and identified and remediated at-risk students earlier when philosophical and practical shifts in approaches to assessment were embraced.
CONCLUSIONS: Programmatic assessment supports a holistic approach to competency development and assessment and has multiple benefits for learners and assessors.
11
European Section/Board of Anaesthesiology/European Society of Anaesthesiology consensus statement on competency-based education and training in anaesthesiology. Eur J Anaesthesiol 2020; 37:421-434. [DOI: 10.1097/eja.0000000000001201] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/16/2022]
12
van der Aa JE, Aabakke AJM, Ristorp Andersen B, Settnes A, Hornnes P, Teunissen PW, Goverde AJ, Scheele F. From prescription to guidance: a European framework for generic competencies. Adv Health Sci Educ Theory Pract 2020; 25:173-187. [PMID: 31451981 PMCID: PMC7018687 DOI: 10.1007/s10459-019-09910-8] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/28/2019] [Accepted: 08/09/2019] [Indexed: 06/01/2023]
Abstract
In postgraduate medical education, required competencies are described in detail in existing competency frameworks. This study proposes an alternative strategy for competency-based medical education design, which is supported by change management theories. We demonstrate the value of allowing room for re-invention and creative adaptation of innovations. This new strategy was explored for the development of a new generic competency framework for a harmonised European curriculum in Obstetrics and Gynaecology. The generic competency framework was developed through action research. Data were collected by four European stakeholder groups (patients, nurses, midwives and hospital boards), using a variety of methods. Subsequently, the data were analysed further in consensus discussions with European specialists and trainees in Obstetrics and Gynaecology. These discussions ensured that the framework provides guidance, is specialty-specific, and that implementation in all European countries could be feasible. The presented generic competency framework identifies four domains: 'Patient-centred care', 'Teamwork', 'System-based practice' and 'Personal and professional development'. For each of these four domains, guiding competencies were defined. The new generic competency framework is supported by European specialists and trainees in Obstetrics and Gynaecology, as well as by their European stakeholders. According to change management theories, it seems vital to allow room for re-invention and creative adaptation of the competency framework by medical professionals. Therefore, the generic competency framework offers guidance rather than prescription. The presented strategy for competency framework development offers leads for implementation of competency-based medical education as well as for development of innovations in postgraduate medical education in general.
Affiliation(s)
- Jessica E van der Aa
- Department of Research and Education, OLVG Hospital, Amsterdam, The Netherlands
- Athena Institute, Faculty of Science, VU, Amsterdam, The Netherlands
- Anna J M Aabakke
- Department of Obstetrics and Gynaecology, Herlev University Hospital, Herlev, Denmark
- European Network of Trainees in Obstetrics and Gynaecology (ENTOG), Brussels, Belgium
- Betina Ristorp Andersen
- Department of Gynaecology and Obstetrics, North Zealand Hospital, University of Copenhagen, Copenhagen, Denmark
- Annette Settnes
- Department of Gynaecology and Obstetrics, North Zealand Hospital, University of Copenhagen, Copenhagen, Denmark
- Peter Hornnes
- Department of Gynaecology and Obstetrics, North Zealand Hospital, University of Copenhagen, Copenhagen, Denmark
- European Board and College of Obstetrics and Gynaecology (EBCOG), Brussels, Belgium
- Pim W Teunissen
- Department of Obstetrics and Gynaecology, Amsterdam UMC, VU University Medical Centre, Amsterdam, The Netherlands
- School of Health Professions Education (SHE), Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Angelique J Goverde
- European Board and College of Obstetrics and Gynaecology (EBCOG), Brussels, Belgium
- Department of Reproductive Medicine and Gynaecology, University Medical Centre, Utrecht, The Netherlands
- Fedde Scheele
- Department of Research and Education, OLVG Hospital, Amsterdam, The Netherlands
- Athena Institute, Faculty of Science, VU, Amsterdam, The Netherlands
- European Board and College of Obstetrics and Gynaecology (EBCOG), Brussels, Belgium
- Department of Obstetrics and Gynaecology, Amsterdam UMC, VU University Medical Centre, Amsterdam, The Netherlands
13
Dauphinee WD. Building a core competency assessment program for all stakeholders: the design and building of sailing ships can inform core competency frameworks. Adv Health Sci Educ Theory Pract 2020; 25:189-193. [PMID: 32030572 DOI: 10.1007/s10459-020-09962-1] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/30/2019] [Accepted: 01/21/2020] [Indexed: 06/10/2023]
Abstract
When educators are developing an effective and workable assessment program in graduate medical education by employing action research and stakeholder mapping to identify core competency domains and directives, the multi-stage process can be guided and informed by utilizing the story of designing, building and sea-testing sailing ships as a metaphor. However, the current challenge of physician burnout demands additional attention when formulating medical training frameworks, assessment guidelines and mentoring programs in 2020. The possibility of job-crafting is raised for consideration by designers of core competency frameworks in the health professions.
Affiliation(s)
- W Dale Dauphinee
- Clinical and Health Informatics Research Group, Division of Clinical Epidemiology, Department of Medicine, McGill University, 1140 Pine Avenue West, Montreal, QC, H3A 1A3, Canada
- Foundation for Advancement of International Medical Education and Research, Philadelphia, PA, USA
14
Pinilla S, Lenouvel E, Strik W, Klöppel S, Nissen C, Huwendiek S. Entrustable Professional Activities in Psychiatry: A Systematic Review. Acad Psychiatry 2020; 44:37-45. [PMID: 31732885 DOI: 10.1007/s40596-019-01142-7] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2019] [Accepted: 10/31/2019] [Indexed: 06/10/2023]
Abstract
OBJECTIVE Entrustable professional activities (EPAs) represent discrete clinical tasks that can be entrusted to trainees in psychiatry. They are increasingly being used as an educational framework in several countries. However, the available empirical evidence has not been synthesized in the field of psychiatry. Therefore, the authors conducted a systematic review to summarize and evaluate the available evidence on EPAs in undergraduate and graduate medical education in psychiatry. METHODS The authors searched PubMed, Cochrane Library, ERIC, Embase, PsycINFO, all Ovid journals, Scopus, Web of Science, MedEdPORTAL, and the archives of Academic Psychiatry for articles reporting quantitative and qualitative research, as well as educational case reports, on EPAs in undergraduate and graduate psychiatry education published since 2005. All included articles were assessed for content (development, implementation, and assessment of EPAs) and quality using the Quality Assessment Tool for Studies with Diverse Designs (QATSDD). RESULTS The authors screened 2807 records and included a total of 20 articles in the final data extraction. Most studies were expert consensus reports (n = 6, 30%) and predominantly conducted in English-speaking countries (n = 17, 85%). Papers mainly reported EPA development and/or EPA implementation studies (n = 14, 70%), whereas EPA assessment studies were less frequent (n = 6, 30%). Publications per year showed an increasing trend in both quantity (from 1 in 2011 to 7 in 2018) and quality (from a QATSDD score of 27 in 2011 to an average score of 39 in 2018). The main focus of the articles was the development of individual EPAs for different levels of training in psychiatry or curricular frameworks based on EPAs in psychiatry (n = 10, 50%). The lack of empirical controlled studies does not currently allow for meta-analyses of educational outcomes. CONCLUSIONS EPA-based curricula seem to be an increasingly present focus in the specialty of psychiatry, in both UME and GME. The lack of empirical research in this context is an important limitation for educational practice recommendations. Currently, only preliminary but promising data are available on the use of EPAs with regard to educational outcomes. EPAs seem to be effectively used from a curriculum design perspective for UME and GME in psychiatry.
Affiliation(s)
- Severin Pinilla
- University Hospital of Psychiatry and Psychotherapy, University of Bern, Bern, Switzerland
- Eric Lenouvel
- University Hospital of Old Age Psychiatry and Psychotherapy, University of Bern, Bern, Switzerland
- Werner Strik
- University Hospital of Psychiatry and Psychotherapy, University of Bern, Bern, Switzerland
- Stefan Klöppel
- University Hospital of Old Age Psychiatry and Psychotherapy, University of Bern, Bern, Switzerland
- Christoph Nissen
- University Hospital of Psychiatry and Psychotherapy, University of Bern, Bern, Switzerland
- Sören Huwendiek
- Institute for Medical Education, University of Bern, Bern, Switzerland
15
Hatala R, Ginsburg S, Hauer KE, Gingerich A. Entrustment Ratings in Internal Medicine Training: Capturing Meaningful Supervision Decisions or Just Another Rating? J Gen Intern Med 2019; 34:740-743. [PMID: 30993616 PMCID: PMC6502893 DOI: 10.1007/s11606-019-04878-y] [Citation(s) in RCA: 22] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
The implementation of Entrustable Professional Activities has led to the simultaneous development of assessment based on a supervisor's entrustment of a learner to perform these activities without supervision. While entrustment may be intuitive when we consider the direct observation of a procedural task, the current implementation of rating scales for internal medicine's non-procedural tasks, based on entrustability, may not translate into meaningful learner assessment. In this Perspective, we outline a number of potential concerns with ad hoc entrustability assessments in internal medicine post-graduate training: differences in the scope of procedural vs. non-procedural tasks, acknowledgement of the type of clinical oversight common within internal medicine, and the limitations of entrustment language. We point towards potential directions for inquiry that would require us to clarify the purpose of the entrustability assessment, reconsider each of the fundamental concepts of entrustment in internal medicine supervision, and explore the use of descriptive rather than numeric assessment approaches.
Affiliation(s)
- Rose Hatala
- Department of Medicine, University of British Columbia, Vancouver, Canada
- St. Paul's Hospital, Suite 5907 Burrard Bldg, 1081 Burrard St., Vancouver, BC, V6Z 1Y6, Canada
- Shiphra Ginsburg
- Department of Medicine, Faculty of Medicine, University of Toronto, Toronto, Canada
- Karen E Hauer
- Department of Medicine, University of California at San Francisco, San Francisco, CA, USA
- Andrea Gingerich
- Northern Medical Program, University of Northern British Columbia, Prince George, Canada