1
Choi T, Sarkar M, Bonham M, Brock T, Brooks IA, Diug B, Ilic D, Kumar A, Lau WM, Lindley J, Morphet J, Simmons M, Volders E, White PJ, Wright C, Palermo C. Using contribution analysis to evaluate health professions and health sciences programs. Front Med (Lausanne) 2023; 10:1146832. [PMID: 37849488 PMCID: PMC10577286 DOI: 10.3389/fmed.2023.1146832]
Abstract
Introduction/background Course evaluation in health education is common practice, yet few comprehensive evaluations exist that measure the impact and outcomes these programs have on developing health graduate capabilities.
Aim/objectives To explore how curricula contribute to health graduate capabilities and what factors contribute to the development of these capabilities.
Methods Using contribution analysis, a six-step iterative evaluation process, key stakeholders in the six selected courses were engaged in a theory-driven evaluation. The researchers collectively developed a postulated theory of change. Evidence was then extracted from existing relevant documents using documentary analysis. Collated findings were presented to academic staff, industry representatives and graduates, and additional data were sought through focus group discussions, one for each discipline. The focus group data were used to validate the theory of change. Data analysis was conducted iteratively, refining the theory of change from one course to the next.
Results The findings highlighted the complexity of teaching and learning, shaped by human, organizational and curriculum factors. Advances in knowledge, skills, attitudes and graduate capabilities are non-linear and integrated into the curriculum. Work-integrated learning contributes significantly to knowledge consolidation and the formation of professional identity in health professional courses. Workplace culture and educators' passion affect the quality of teaching and learning, yet are rarely considered as evidence of impact.
Discussion Capturing episodic and contextual learning moments is important for describing success and for reflection and improvement. Evidence of the impact of course elements on future graduate capabilities was limited, with evaluation data focused on satisfaction.
Conclusion Contribution analysis has been a useful evaluation method to explore the complexity of the factors in learning and teaching that influence graduate capabilities in health-related courses.
Affiliation(s)
- Tammie Choi
- Monash Centre for Scholarship in Health Education, Monash University, Melbourne, VIC, Australia
- Department of Nutrition, Dietetics and Food, Monash University, Melbourne, VIC, Australia
- Mahbub Sarkar
- Monash Centre for Scholarship in Health Education, Monash University, Melbourne, VIC, Australia
- Maxine Bonham
- Department of Nutrition, Dietetics and Food, Monash University, Melbourne, VIC, Australia
- Tina Brock
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Melbourne, VIC, Australia
- Ingrid Ann Brooks
- School of Nursing and Midwifery, Monash University, Melbourne, VIC, Australia
- Basia Diug
- School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Dragan Ilic
- School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Arunaz Kumar
- Department of Obstetrics and Gynaecology, School of Clinical Sciences, Monash University, Melbourne, VIC, Australia
- Wee-Ming Lau
- Jeffrey Cheah School of Medicine and Health Sciences, Monash University Malaysia, Subang Jaya, Selangor, Malaysia
- Jennifer Lindley
- Monash Centre for Scholarship in Health Education, Monash University, Melbourne, VIC, Australia
- Julia Morphet
- School of Nursing and Midwifery, Monash University, Melbourne, VIC, Australia
- Margaret Simmons
- Monash Rural Health, Monash University, Melbourne, VIC, Australia
- Evelyn Volders
- Department of Nutrition, Dietetics and Food, Monash University, Melbourne, VIC, Australia
- Paul J. White
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Melbourne, VIC, Australia
- Caroline Wright
- Department of Medical Imaging and Radiation Sciences, School of Primary and Allied Health, Monash University, Melbourne, VIC, Australia
- Claire Palermo
- Monash Centre for Scholarship in Health Education, Monash University, Melbourne, VIC, Australia
2
Sebok-Syer SS, Lingard L, Panza M, Van Hooren TA, Rassbach CE. Supportive and collaborative interdependence: Distinguishing residents' contributions within health care teams. Medical Education 2023; 57:921-931. [PMID: 36822577 DOI: 10.1111/medu.15064]
Abstract
INTRODUCTION Individual assessments disregard team contributions, while team assessments disregard an individual's contributions. Interdependence has been put forth as a conceptual bridge between our educational traditions of assessing individual performance and our imminent challenge of assessing team-based performance without losing sight of the individual. The purpose of this study was to develop a more refined conceptualisation of interdependence to inform the creation of measures that can assess the interdependence of residents within health care teams.
METHODS Following a constructivist grounded theory approach, we conducted 49 semi-structured interviews with various members of health care teams (e.g. physicians, nurses, pharmacists, social workers and patients) across two clinical specialties (Emergency Medicine and Paediatrics) at two separate sites. Data collection and analysis occurred iteratively. Constant comparative inductive analysis was used, and coding consisted of three stages: initial, focused and theoretical.
RESULTS We asked participants to reflect upon interdependence and describe how it exists in their clinical setting. All participants acknowledged the existence of interdependence, but they did not view it as part of a linear spectrum where interdependence becomes independence. Our analysis refined the conceptualisation of interdependence to include two types: supportive and collaborative. Supportive interdependence occurs within health care teams when one member demonstrates insufficient expertise to perform within their scope of practice. Collaborative interdependence, on the other hand, was triggered not by a lack of experience or expertise within an individual's scope of practice, but by recognition that patient care requires contributions from other team members.
CONCLUSION To assess a team's collective performance without losing sight of the individual, we need to capture interdependent performances and characterise the nature of such interdependence. Moving away from a linear trajectory where independence is seen as the end goal can also help support efforts to measure an individual's competence as an interdependent member of a health care team.
Affiliation(s)
- Lorelei Lingard
- Department of Medicine and Centre for Education Research and Innovation, Schulich School of Medicine & Dentistry, Western University, London, Ontario, Canada
- Michael Panza
- Centre for Education Research and Innovation, Western University, London, Ontario, Canada
- Tamara A Van Hooren
- Department of Pediatrics, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
3
Holmboe ES, Osman NY, Murphy CM, Kogan JR. The Urgency of Now: Rethinking and Improving Assessment Practices in Medical Education Programs. Academic Medicine 2023; 98:S37-S49. [PMID: 37071705 DOI: 10.1097/acm.0000000000005251]
Abstract
Assessment is essential to professional development. Assessment provides the information needed to give feedback, support coaching and the creation of individualized learning plans, inform progress decisions, determine appropriate supervision levels, and, most importantly, help ensure patients and families receive high-quality, safe care in the training environment. While the introduction of competency-based medical education has catalyzed advances in assessment, much work remains to be done. First, becoming a physician (or other health professional) is primarily a developmental process, and assessment programs must be designed using a developmental and growth mindset. Second, medical education programs must have integrated programs of assessment that address the interconnected domains of implicit, explicit and structural bias. Third, improving programs of assessment will require a systems-thinking approach. In this paper, the authors first address these overarching issues as key principles that must be embraced so that training programs may optimize assessment to ensure all learners achieve desired medical education outcomes. The authors then explore specific needs in assessment and provide suggestions to improve assessment practices. This paper is by no means inclusive of all medical education assessment challenges or possible solutions. However, there is a wealth of current assessment research and practice that medical education programs can use to improve educational outcomes and help reduce the harmful effects of bias. The authors' goal is to help improve and guide innovation in assessment by catalyzing further conversations.
Affiliation(s)
- Eric S Holmboe
- E.S. Holmboe is chief, Research, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- Nora Y Osman
- N.Y. Osman is associate professor of medicine, Harvard Medical School, and director of undergraduate medical education, Brigham and Women's Hospital Department of Medicine, Boston, Massachusetts; ORCID: https://orcid.org/0000-0003-3542-1262
- Christina M Murphy
- C.M. Murphy is a fourth-year medical student and president, Medical Student Government at Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0003-3966-5264
- Jennifer R Kogan
- J.R. Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
4
Burk-Rafel J, Sebok-Syer SS, Santen SA, Jiang J, Caretta-Weyer HA, Iturrate E, Kelleher M, Warm EJ, Schumacher DJ, Kinnear B. TRainee Attributable & Automatable Care Evaluations in Real-time (TRACERs): A Scalable Approach for Linking Education to Patient Care. Perspectives on Medical Education 2023; 12:149-159. [PMID: 37215538 PMCID: PMC10198229 DOI: 10.5334/pme.1013]
Abstract
Competency-based medical education (CBME) is an outcomes-based approach to education and assessment that focuses on what competencies trainees need to learn in order to provide effective patient care. Despite this goal of providing quality patient care, trainees rarely receive measures of their clinical performance. This is problematic because defining a trainee's learning progression requires measuring their clinical performance. Traditional clinical performance measures (CPMs) are often met with skepticism from trainees given their poor individual-level attribution. Resident-sensitive quality measures (RSQMs) are attributable to individuals, but lack the expeditiousness needed to deliver timely feedback and can be difficult to automate at scale across programs. In this eye opener, the authors present a conceptual framework for a new type of measure - TRainee Attributable & Automatable Care Evaluations in Real-time (TRACERs) - attuned to both automation and trainee attribution as the next evolutionary step in linking education to patient care. TRACERs have five defining characteristics: meaningful (for patient care and trainees), attributable (sufficiently to the trainee of interest), automatable (minimal human input once fully implemented), scalable (across electronic health records [EHRs] and training environments), and real-time (amenable to formative educational feedback loops). Ideally, TRACERs optimize all five characteristics to the greatest degree possible. TRACERs are uniquely focused on measures of clinical performance that are captured in the EHR, whether routinely collected or generated using sophisticated analytics, and are intended to complement (not replace) other sources of assessment data. TRACERs have the potential to contribute to a national system of high-density, trainee-attributable, patient-centered outcome measures.
Affiliation(s)
- Jesse Burk-Rafel
- Division of Hospital Medicine, NYU Langone Health, and assistant director of Precision Medical Education, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, New York, USA
- Stefanie S. Sebok-Syer
- Department of Emergency Medicine, Stanford University School of Medicine, Stanford, California, USA
- Sally A. Santen
- University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Joshua Jiang
- University of California Los Angeles, Los Angeles, California. At the time of this work he was a medical student, NYU Grossman School of Medicine, New York, USA
- Holly A. Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Stanford, California, USA
- Matthew Kelleher
- Internal Medicine and Pediatrics, Department of Pediatrics, Cincinnati Children’s Hospital Medical Center, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Eric J. Warm
- University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Daniel J. Schumacher
- Department of Pediatrics, director of Education Research Unit, Cincinnati Children’s Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Benjamin Kinnear
- Internal Medicine and Pediatrics, Department of Pediatrics, Cincinnati Children’s Hospital Medical Center, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
5
Price DW. To Effectively Address Complex Healthcare Problems, Continuing Professional Development Must Evolve. The Journal of Continuing Education in the Health Professions 2023; 43:S59-S63. [PMID: 38054493 DOI: 10.1097/ceh.0000000000000537]
Abstract
Continuing professional development aims to provide health professionals with the knowledge, skills, and competencies needed to improve care. Physicians and other clinicians increasingly practice within complex health care delivery organizations that aim to improve the care of populations of patients with multiple problems and differing needs. These organizations are composed of local units in different departments and venues; these teams, and the patients they care for, change over time. Improving outcomes within constantly changing complex organizations delivering population care takes time and persistence. It takes time to equip critical masses of clinicians and other personnel with the knowledge and skills to effect change. Although some changes might be simple, those involving new workflows require implementation support. Not all change will be smooth; individuals need opportunities to learn from and adjust their early intervention efforts, measure the effectiveness of change, and sustain successful practices. Longitudinal support is necessary to effect change across complex organizations. This essay proposes that, to be more supportive of and valuable to health care delivery organizations, continuing professional development needs to participate intentionally in longitudinal, collaborative, context-specific, team-based interventions. An expanded menu of evaluation approaches will better describe the role of continuing professional development in helping health care professionals and organizations address increasingly complex health care delivery problems and improve patient and population outcomes. Selected concepts to achieve these ends are introduced at a high level in this article. Readers are invited to explore concepts that resonate with their current situation in further detail.
Affiliation(s)
- David W Price
- Dr. Price: Department of Family Medicine, University of Colorado Anschutz School of Medicine, Aurora, CO; and American Board of Family Medicine, Lexington, KY
6
Emery M, Wolff M, Merritt C, Ellinas H, McHugh D, Zaher M, Semiao ML, Gruppen LD. An outcomes research perspective on medical education: Has anything changed in the last 18 years? Medical Teacher 2022; 44:1400-1407. [PMID: 35856851 DOI: 10.1080/0142159x.2022.2099259]
Abstract
PURPOSE Medical education research focused on patient-centered outcomes holds the promise of improved decision-making by medical educators. In 2001, Prystowsky and Bordage demonstrated that patient-centered outcomes were evaluated in fewer than one percent of studies published in a survey of major medical education journals. Though many have called for increased inclusion of patient-centered outcomes in the medical education literature, it remains uncertain to what degree this need has been addressed systematically.
METHODS Using the same data sources as the original report (Academic Medicine, Medical Education, and Teaching and Learning in Medicine), we sought to replicate Prystowsky and Bordage's study. We extracted data from original empirical research reports published in these three journals in the years 2014-2016, selecting 652 articles that met the inclusion criteria for further analysis.
RESULTS Study participants were largely trainees (64% of studies) or faculty (25% of studies). Only 2% of studies included patients as active or passive participants. Study outcomes reported were satisfaction (40% of studies), performance (39%), professionalism (20%), and cost (1%).
CONCLUSIONS These results do not differ significantly from the original 2001 study. The medical education literature, as represented in these three prominent journals, has made little progress in placing a greater focus on patient-centered outcomes.
Affiliation(s)
- Matt Emery
- Department of Emergency Medicine, Michigan State University College of Human Medicine, Grand Rapids, MI, USA
- Margaret Wolff
- Department of Emergency Medicine and Pediatrics, University of Michigan, Ann Arbor, MI, USA
- Chris Merritt
- Department of Emergency Medicine & Pediatrics, Alpert Medical School of Brown University, Providence, RI, USA
- Herodotos Ellinas
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, WI, USA
- Douglas McHugh
- Frank H. Netter MD School of Medicine, Quinnipiac University, North Haven, CT, USA
- Mohammad Zaher
- Department of Academic and Training Affairs, Prince Mohammad Bin AbdulAziz Hospital, Riyadh, Saudi Arabia
- Meghan L Semiao
- Manager, Medical Simulation & Education Standardized Patient Program, Inova Health System, Falls Church, VA, USA
- Larry D Gruppen
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
7
Price DW, Davis DA, Filerman GL. "Systems-Integrated CME": The Implementation and Outcomes Imperative for Continuing Medical Education in the Learning Health Care Enterprise. NAM Perspect 2021; 2021:202110a. [PMID: 34901778 PMCID: PMC8654469 DOI: 10.31478/202110a]
Affiliation(s)
- David W Price
- University of Colorado Anschutz School of Medicine and the American Board of Family Medicine
- David A Davis
- AXDEV Group, University of Toronto and Mohammed Bin Rashid University of Medicine & the Health Sciences
8
Sebok-Syer SS, Shaw JM, Asghar F, Panza M, Syer MD, Lingard L. A scoping review of approaches for measuring 'interdependent' collaborative performances. Medical Education 2021; 55:1123-1130. [PMID: 33825192 DOI: 10.1111/medu.14531]
Abstract
INTRODUCTION Individual assessment disregards the team aspect of clinical work; team assessment collapses the individual into the group. Neither is sufficient for medical education, where measures need to attend to the individual while also accounting for interactions with others. Valid and reliable measures of interdependence are critical within medical education given the collaborative manner in which patient care is provided, yet medical education currently lacks a consistent approach to measuring the performance of individuals working together as part of a larger healthcare team. This review's objective was to identify existing approaches to measuring this interdependence.
METHODS Following Arksey and O'Malley's methodology, we conducted a scoping review in 2018 and updated it to 2020. A search strategy involving five databases located >12 000 citations. At least two reviewers independently screened titles and abstracts, screened full texts (n = 161) and performed data extraction on the twenty-seven included articles. Interviews were also conducted with key informants to check whether any literature was missing and to confirm that our interpretations made sense.
RESULTS Eighteen of the twenty-seven articles were empirical; nine were conceptual with an empirical illustration. Eighteen were quantitative; nine used mixed methods. The articles spanned five disciplines and various application contexts, from online learning to sports performance. Only two of the included articles were from the field of medical education. The articles conceptualised interdependence of a group, using theoretical constructs such as collaboration synergy; of a network, using constructs such as degree centrality; and of a dyad, using constructs such as synchrony. Both descriptive (e.g. social network analysis) and inferential (e.g. multi-level modelling) approaches were described.
CONCLUSION Efforts to measure interdependence are scarce and scattered across disciplines. Multiple theoretical concepts and inconsistent terminology may be limiting programmatic work. This review motivates the need for further study of measurement techniques, particularly those combining multiple approaches, to capture interdependence in medical education.
Affiliation(s)
- Jennifer M Shaw
- Women's Studies, Faculty of Arts and Humanities, Western University, London, ON, Canada
- Farah Asghar
- Pharmacy, University of Toronto, Toronto, ON, Canada
- Michael Panza
- Centre for Education Research and Innovation, Schulich School of Medicine & Dentistry, Western University, London, ON, Canada
- Mark D Syer
- Computing, Queen's University, Kingston, ON, Canada
- Lorelei Lingard
- Department of Medicine, University of Western Ontario, London, ON, Canada
9
Sebok-Syer SS, Gingerich A, Holmboe ES, Lingard L, Turner DA, Schumacher DJ. Distant and Hidden Figures: Foregrounding Patients in the Development, Content, and Implementation of Entrustable Professional Activities. Academic Medicine 2021; 96:S76-S80. [PMID: 34183606 DOI: 10.1097/acm.0000000000004094]
Abstract
Entrustable professional activities (EPAs) describe activities that qualified professionals must be able to perform to deliver safe and effective care to patients. The entrustable aspect of EPAs can be used to assess learners through documentation of entrustment decisions, while the professional activity aspect can be used to map curricula. When used as an assessment framework, the entrustment decisions reflect supervisory judgments that combine trainees' relational autonomy and patient safety considerations. Thus, the design of EPAs incorporates the supervisor, trainee, and patient in a way that uniquely offers a link between educational outcomes and patient outcomes. However, achieving a patient-centered approach to education amidst curricular and assessment obligations, educational and patient outcomes, and a supervisor-trainee-patient triad is neither simple nor guaranteed. As medical educators continue to advance EPAs as part of their approach to competency-based medical education, the authors share a critical discussion of how patients are currently positioned in EPAs. In this article, the authors examine EPAs and discuss how their development, content, and implementation can result in emphasizing the trainee and/or supervisor while unintentionally distancing or hiding the patient. They consider creative possibilities for how EPAs might better integrate the patient, as finding ways to foreground the patient in EPAs holds promise for aligning educational outcomes and patient outcomes.
Affiliation(s)
- Stefanie S Sebok-Syer
- S.S. Sebok-Syer is instructor, Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California; ORCID: http://orcid.org/0000-0002-3572-5971
- Andrea Gingerich
- A. Gingerich is assistant professor, Division of Medical Sciences, University of Northern British Columbia, Prince George, British Columbia, Canada; ORCID: http://orcid.org/0000-0001-5765-3975
- Eric S Holmboe
- E.S. Holmboe is chief research, milestones development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: http://orcid.org/0000-0003-0108-6021
- Lorelei Lingard
- L. Lingard is professor, Department of Medicine and Faculty of Education, and senior scientist, Centre for Education, Research, and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada; ORCID: http://orcid.org/0000-0002-1524-0723
- David A Turner
- D.A. Turner is vice president for competency-based medical education, American Board of Pediatrics, Chapel Hill, North Carolina
- Daniel J Schumacher
- D.J. Schumacher is associate professor, Department of Pediatrics, Cincinnati Children's Hospital Medical Center and the University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: http://orcid.org/0000-0001-5507-8452
10
Van Melle E, Hall AK, Schumacher DJ, Kinnear B, Gruppen L, Thoma B, Caretta-Weyer H, Cooke LJ, Frank JR. Capturing outcomes of competency-based medical education: The call and the challenge. Medical Teacher 2021; 43:794-800. [PMID: 34121596 DOI: 10.1080/0142159x.2021.1925640]
Abstract
There is an urgent need to capture the outcomes of the ongoing global implementation of competency-based medical education (CBME). However, the measurement of downstream outcomes following educational innovations, such as CBME, is fraught with challenges stemming from the complexities of medical training, the breadth and variability of inputs, and the difficulty of attributing outcomes to specific educational elements. In this article, we present a logic model for CBME to conceptualize an impact pathway relating to CBME and facilitate outcomes evaluation. We further identify six strategies to mitigate the challenges of outcomes measurement: (1) clearly identify the outcome of interest, (2) distinguish between outputs and outcomes, (3) carefully consider attribution versus contribution, (4) connect outcomes to the fidelity and integrity of implementation, (5) pay attention to unanticipated outcomes, and (6) embrace methodological pluralism. Embracing these challenges, we argue that careful and thoughtful evaluation strategies will move us forward in answering the all-important question: Are the desired outcomes of CBME being achieved?
Affiliation(s)
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
- Andrew K Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, Queen's University, Kingston, Canada
- Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Larry Gruppen
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
- Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Canada
- Holly Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Clinical Neurosciences, Division of Neurology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Jason R Frank
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
11
ten Cate O. Health professions education scholarship: The emergence, current status, and future of a discipline in its own right. FASEB Bioadv 2021; 3:510-522. [PMID: 34258520 PMCID: PMC8255850 DOI: 10.1096/fba.2021-00011]
Abstract
Medical education, as a domain of scholarly pursuit, has enjoyed remarkably rapid development in the past 70 years and is now more commonly known as health professions education (HPE) scholarship. Evidenced by a solid increase in publications, numbers of specialized journals, professional associations, national and international conferences, academies for medical educators, masters and doctoral courses, and the establishment of many units of HPE scholarship, the domain has matured into a scholarly discipline in its own right. In this contribution, the author reviews the development of the field against Boyer's four criteria of scholarship: discovery, integration, application, and teaching. Born in the mid-20th century, and in its first decades developed predominantly in the area of physician education, HPE scholarship has matured, with increasing breadth, depth, and volume of scholars, publications, conferences, and dedicated centers for research and development. The author concludes that, given the infrastructure that has emerged, HPE can arguably be considered a discipline in its own right. This academic question may not matter hugely for practices of scholarly work in the domain, and any stance in this debate inevitably reflects a personal view, but the author supports the view of health professions scholarship as a unique niche, dependent on medical and other health professional sciences on the one hand and on the social sciences, including educational sciences, on the other.
Affiliation(s)
- Olle ten Cate
- Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
12
Touchie C, Kinnear B, Schumacher D, Caretta-Weyer H, Hamstra SJ, Hart D, Gruppen L, Ross S, Warm E, Ten Cate O. On the validity of summative entrustment decisions. MEDICAL TEACHER 2021; 43:780-787. [PMID: 34020576 DOI: 10.1080/0142159x.2021.1925642] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Health care revolves around trust. Patients are often in a position that gives them no other choice than to trust the people taking care of them. Educational programs thus have the responsibility to develop physicians who can be trusted to deliver safe and effective care, ultimately making a final decision to entrust trainees to graduate to unsupervised practice. Such entrustment decisions deserve to be scrutinized for their validity. This end-of-training entrustment decision is arguably the most important one, although earlier entrustment decisions, for smaller units of professional practice, should also be scrutinized for their validity. Validity of entrustment decisions implies a defensible argument that can be analyzed in components that together support the decision. According to Kane, building a validity argument is a process designed to support inferences of scoring, generalization across observations, extrapolation to new instances, and implications of the decision. A lack of validity can be caused by inadequate evidence in terms of, according to Messick, content, response process, internal structure (coherence) and relationship to other variables, and in misinterpreted consequences. These two leading frameworks (Kane and Messick) in educational and psychological testing can be well applied to summative entrustment decision-making. The authors elaborate the types of questions that need to be answered to arrive at defensible, well-argued summative decisions regarding performance to provide a grounding for high-quality safe patient care.
Affiliation(s)
- Claire Touchie
- Medical Council of Canada, Ottawa, Canada
- The University of Ottawa, Ottawa, Canada
- Benjamin Kinnear
- Internal Medicine and Pediatrics, University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Daniel Schumacher
- Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Holly Caretta-Weyer
- Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Stanley J Hamstra
- University of Toronto, Toronto, Ontario, Canada
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Danielle Hart
- Emergency Medicine, Hennepin Healthcare and the University of Minnesota, Minneapolis, MN, USA
- Larry Gruppen
- Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Shelley Ross
- Department of Family Medicine, University of Alberta, Edmonton, AB, Canada
- Eric Warm
- University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Olle Ten Cate
- Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
13
Kinnear B, Warm EJ, Caretta-Weyer H, Holmboe ES, Turner DA, van der Vleuten C, Schumacher DJ. Entrustment Unpacked: Aligning Purposes, Stakes, and Processes to Enhance Learner Assessment. ACADEMIC MEDICINE: JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S56-S63. [PMID: 34183603 DOI: 10.1097/acm.0000000000004108] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
Educators use entrustment, a common framework in competency-based medical education, in multiple ways, including frontline assessment instruments, learner feedback tools, and group decision making within promotions or competence committees. Within these multiple contexts, entrustment decisions can vary in purpose (i.e., intended use), stakes (i.e., perceived risk or consequences), and process (i.e., how entrustment is rendered). Each of these characteristics can be conceptualized as having 2 distinct poles: (1) purpose has formative and summative, (2) stakes has low and high, and (3) process has ad hoc and structured. For each characteristic, entrustment decisions often do not fall squarely at one pole or the other, but rather lie somewhere along a spectrum. While distinct, these continua can, and sometimes should, influence one another, and can be manipulated to optimally integrate entrustment within a program of assessment. In this article, the authors describe each of these continua and depict how key alignments between them can help optimize value when using entrustment in programmatic assessment within competency-based medical education. As they think through these continua, the authors will begin and end with a case study to demonstrate the practical application as it might occur in the clinical learning environment.
Affiliation(s)
- Benjamin Kinnear
- B. Kinnear is associate professor of internal medicine and pediatrics, Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0003-0052-4130
- Eric J Warm
- E.J. Warm is professor of internal medicine and program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0002-6088-2434
- Holly Caretta-Weyer
- H. Caretta-Weyer is assistant professor of emergency medicine, Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California; ORCID: https://orcid.org/0000-0002-9783-5797
- Eric S Holmboe
- E.S. Holmboe is chief, research, milestones development and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- David A Turner
- D.A. Turner is vice president, Competency-Based Medical Education, American Board of Pediatrics, Chapel Hill, North Carolina
- Cees van der Vleuten
- C. van der Vleuten is professor of education, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands; ORCID: https://orcid.org/0000-0001-6802-3119
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0001-5507-8452
14
Kinnear B, Kelleher M, Sall D, Schauer DP, Warm EJ, Kachelmeyer A, Martini A, Schumacher DJ. Development of Resident-Sensitive Quality Measures for Inpatient General Internal Medicine. J Gen Intern Med 2021; 36:1271-1278. [PMID: 33105001 PMCID: PMC8131459 DOI: 10.1007/s11606-020-06320-0] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/18/2020] [Revised: 07/20/2020] [Accepted: 10/14/2020] [Indexed: 11/28/2022]
Abstract
BACKGROUND Graduate medical education (GME) training has long-lasting effects on patient care quality. Despite this, few GME programs use clinical care measures as part of resident assessment. Furthermore, there is no gold standard to identify clinical care measures that are reflective of resident care. Resident-sensitive quality measures (RSQMs), defined as "measures that are meaningful in patient care and are most likely attributable to resident care," have been developed using consensus methodology and piloted in pediatric emergency medicine. However, this approach has not been tested in internal medicine (IM). OBJECTIVE To develop RSQMs for a general internal medicine (GIM) inpatient residency rotation using previously described consensus methods. DESIGN The authors used two consensus methods, nominal group technique (NGT) and a subsequent Delphi method, to generate RSQMs for a GIM inpatient rotation. RSQMs were generated for specific clinical conditions found on a GIM inpatient rotation, as well as for general care on a GIM ward. PARTICIPANTS NGT participants included nine IM and medicine-pediatrics (MP) residents and six IM and MP faculty members. The Delphi group included seven IM and MP residents and seven IM and MP faculty members. MAIN MEASURES The number and description of RSQMs generated during this process. KEY RESULTS Consensus methods resulted in 89 RSQMs with the following breakdown by condition: GIM general care-21, diabetes mellitus-16, hyperkalemia-14, COPD-13, hypertension-11, pneumonia-10, and hypokalemia-4. All RSQMs were process measures, with 48% relating to documentation and 51% relating to orders. Fifty-eight percent of RSQMs were related to the primary admitting diagnosis, while 42% could also be related to chronic comorbidities that require management during an admission. CONCLUSIONS Consensus methods resulted in 89 RSQMs for a GIM inpatient service. While all RSQMs were process measures, they may still hold value in learner assessment, formative feedback, and program evaluation.
Affiliation(s)
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Matthew Kelleher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Dana Sall
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel P Schauer
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Andrea Kachelmeyer
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Abigail Martini
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel J Schumacher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
15
Sebok‐Syer SS, Shepherd L, McConnell A, Dukelow AM, Sedran R, Lingard L. "EMERGing" Electronic Health Record Data Metrics: Insights and Implications for Assessing Residents' Clinical Performance in Emergency Medicine. AEM EDUCATION AND TRAINING 2021; 5:e10501. [PMID: 33898906 PMCID: PMC8052996 DOI: 10.1002/aet2.10501] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/04/2020] [Revised: 07/02/2020] [Accepted: 07/07/2020] [Indexed: 05/30/2023]
Abstract
OBJECTIVES Competency-based medical education requires that residents are provided with frequent opportunities to demonstrate competence as well as receive effective feedback about their clinical performance. To meet this goal, we investigated how data collected by the electronic health record (EHR) might be used to assess emergency medicine (EM) residents' independent and interdependent clinical performance and how such information could be represented in an EM resident report card. METHODS Following constructivist grounded theory methodology, individual semistructured interviews were conducted in 2017 with 10 EM faculty and 11 EM residents across all 5 postgraduate years. In addition to open-ended questions, participants were presented with an emerging list of EM practice metrics and asked to comment on how valuable each would be in assessing resident performance. Additionally, we asked participants the extent to which each metric captured independent or interdependent performance. Data collection and analysis were iterative; analysis employed constant comparative inductive methods. RESULTS Participants refined and eliminated metrics as well as added new metrics specific to the assessment of EM residents (e.g., time between signup and first orders). These clinical practice metrics based on data from our EHR database were organized along a spectrum of independent/interdependent performance. We conclude with discussions about the relationship among these metrics, issues in interpretation, and implications of using EHR for assessment purposes. CONCLUSIONS Our findings document a systematic approach for developing EM resident assessments, based on EHR data, which incorporate the perspectives of both clinical faculty and residents. Our work has important implications for capturing residents' contributions to clinical performances and distinguishing between independent and interdependent metrics in collaborative workplace-based settings.
Affiliation(s)
- Stefanie S. Sebok‐Syer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Lisa Shepherd
- Division of Emergency Medicine, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Allison McConnell
- Division of Emergency Medicine, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Adam M. Dukelow
- Division of Emergency Medicine, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Robert Sedran
- Division of Emergency Medicine, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard
- Department of Medicine, Faculty of Education, and the Centre for Education, Research, and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada