1
Thelen AE, George BC, Burkhardt JC, Khamees D, Haas MRC, Weinstein D. Improving Graduate Medical Education by Aggregating Data Across the Medical Education Continuum. Acad Med 2024; 99:139-145. [PMID: 37406284 DOI: 10.1097/acm.0000000000005313]
Abstract
Meaningful improvements to graduate medical education (GME) have been achieved in recent decades, yet many GME improvement pilots have been small trials without rigorous outcome measures and with limited generalizability. Thus, lack of access to large-scale data is a key barrier to generating empiric evidence to improve GME. In this article, the authors examine the potential of a national GME data infrastructure to improve GME, review the output of 2 national workshops on this topic, and propose a path toward achieving this goal. The authors envision a future where medical education is shaped by evidence from rigorous research powered by comprehensive, multi-institutional data. To achieve this goal, premedical education, undergraduate medical education, GME, and practicing physician data must be collected using a common data dictionary and standards and longitudinally linked using unique individual identifiers. The envisioned data infrastructure could provide a foundation for evidence-based decisions across all aspects of GME and help optimize the education of individual residents. Two workshops hosted by the National Academies of Sciences, Engineering, and Medicine Board on Health Care Services explored the prospect of better using GME data to improve education and its outcomes. There was broad consensus about the potential value of a longitudinal data infrastructure to improve GME.
Significant obstacles were also noted. Suggested next steps outlined by the authors include producing a more complete inventory of data already being collected and managed by key medical education leadership organizations, pursuing a grass-roots data sharing pilot among GME-sponsoring institutions, and formulating the technical and governance frameworks needed to aggregate data across organizations. The power and potential of big data is evident across many disciplines, and the authors believe that harnessing the power of big data in GME is the best next step toward advancing evidence-based physician education.
2
Gates RS, Marcotte K, Moreci R, Krumm AE, Lynch KA, Bailey C, George BC. An Ideal System of Assessment to Support Competency-Based Graduate Medical Education: Key Attributes and Proposed Next Steps. J Surg Educ 2024; 81:172-177. [PMID: 38158276 DOI: 10.1016/j.jsurg.2023.10.006]
Abstract
Competency-based medical education (CBME) is the future of medical education and relies heavily on high quality assessment. However, the current assessment practices employed by many general surgery graduate medical education training programs are subpar. Assessments often lack reliability and validity evidence, have low faculty engagement, and differ from program to program. Given the importance of assessment in CBME, it is critical that we build a better assessment system for measuring trainee competency. We propose that an ideal system of assessment is standardized, evidence-based, comprehensive, integrated, and continuously improving. In this article, we explore these characteristics and propose next steps to achieve such a system of assessment in general surgery.
Affiliation(s)
- Rebecca S Gates
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- Kayla Marcotte
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan; Department of Learning and Health Sciences, University of Michigan, Ann Arbor, Michigan
- Rebecca Moreci
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- Andrew E Krumm
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan; Department of Learning and Health Sciences, University of Michigan, Ann Arbor, Michigan
- Kenneth A Lynch
- Department of Surgery, Brown University, Providence, Rhode Island
- Christina Bailey
- Department of Surgery, Vanderbilt University, Nashville, Tennessee
- Brian C George
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
3
Reilly JB, Kim JG, Cooney R, DeWaters AL, Holmboe ES, Mazotti L, Gonzalo JD. Breaking Down Silos Between Medical Education and Health Systems: Creating an Integrated Multilevel Data Model to Advance the Systems-Based Practice Competency. Acad Med 2024; 99:146-152. [PMID: 37289829 DOI: 10.1097/acm.0000000000005294]
Abstract
The complexity of improving health in the United States and the rising call for outcomes-based physician training present unique challenges and opportunities for both graduate medical education (GME) and health systems. GME programs have been particularly challenged to implement systems-based practice (SBP) as a core physician competency and educational outcome. Disparate definitions and educational approaches to SBP, as well as limited understanding of the complex interactions between GME trainees, programs, and their health system settings, contribute to current suboptimal educational outcomes related to SBP. To advance SBP competence at individual, program, and institutional levels, the authors present the rationale for an integrated multilevel systems approach to assess and evaluate SBP, propose a conceptual multilevel data model that integrates health system and educational SBP performance, and explore the opportunities and challenges of using multilevel data to promote an empirically driven approach to residency education. The development, study, and adoption of multilevel analytic approaches to GME are imperative to the successful operationalization of SBP and thereby imperative to GME's social accountability in meeting societal needs for improved health. The authors call for the continued collaboration of national leaders toward producing integrated and multilevel datasets that link health systems and their GME-sponsoring institutions to evolve SBP.
4
Montgomery KB, Lindeman B. Using Graduating Surgical Resident Milestone Ratings to Predict Patient Outcomes: A Blunt Instrument for a Complex Problem. Acad Med 2023; 98:765-768. [PMID: 36745875 PMCID: PMC10329982 DOI: 10.1097/acm.0000000000005165]
Abstract
In 2013, U.S. general surgery residency programs implemented a milestones assessment framework in an effort to incorporate more competency-focused evaluation methods. Developed by a group of surgical education leaders and other stakeholders working with the Accreditation Council for Graduate Medical Education and recently updated in a version 2.0, the surgery milestones framework is centered around 6 "core competencies": patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. While prior work has focused on the validity of milestones as a measure of resident performance, associations between general surgery resident milestone ratings and their post-training patient outcomes have only recently been explored in an analysis in this issue of Academic Medicine by Kendrick et al. Despite their well-designed efforts to tackle this complex problem, no relationships were identified. This accompanying commentary discusses the broader implications for the use of milestone ratings beyond their intended application, alternative assessment methods, and the challenges of developing predictive assessments in the complex setting of surgical care. Although milestone ratings have not been shown to provide the specificity needed to predict clinical outcomes in the complex settings studied by Kendrick et al, hope remains that utilization of other outcomes, assessment frameworks, and data analytic tools could augment these models and further our progress toward a predictive assessment in surgical education. Evaluation of residents in general surgery residency programs has grown both more sophisticated and complicated in the setting of increasing patient and case complexity, constraints on time, and regulation of resident supervision in the operating room. 
Over the last decade, surgical education research efforts related to resident assessment have focused on measuring performance through accurate and reproducible methods with evidence for their validity, as well as on attempting to refine decision making about resident preparedness for unsupervised practice.
Affiliation(s)
- Kelsey B Montgomery
- K.B. Montgomery is a general surgery resident, Department of Surgery, University of Alabama at Birmingham, Birmingham, Alabama; ORCID: https://orcid.org/0000-0002-1284-1830
- Brenessa Lindeman
- B. Lindeman is associate professor, Department of Surgery, and assistant dean, Graduate Medical Education, University of Alabama at Birmingham, Birmingham, Alabama
5
Kendrick DE, Thelen AE, Chen X, Gupta T, Yamazaki K, Krumm AE, Bandeh-Ahmadi H, Clark M, Luckoscki J, Fan Z, Wnuk GM, Ryan AM, Mukherjee B, Hamstra SJ, Dimick JB, Holmboe ES, George BC. Association of Surgical Resident Competency Ratings With Patient Outcomes. Acad Med 2023; 98:813-820. [PMID: 36724304 DOI: 10.1097/acm.0000000000005157]
Abstract
PURPOSE Accurate assessment of clinical performance is essential to ensure graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestones competency ratings and their subsequent early career clinical outcomes has not been established. This study examined the association between milestones competency ratings of U.S. general surgery residents and those surgeons' patient outcomes in early career practice. METHOD A retrospective, cross-sectional study was conducted using a sample of national Medicare claims for 23 common, high-risk inpatient general surgical procedures performed between July 1, 2015, and November 30, 2018 (n = 12,400 cases) by nonfellowship-trained U.S. general surgeons. Milestone ratings collected during those surgeons' last year of residency (n = 701 residents) were compared with their risk-adjusted rates of mortality, any complication, or severe complication within 30 days of index operation during their first 2 years of practice. RESULTS There were no associations between mean milestone competency ratings of graduating general surgery residents and their subsequent early career patient outcomes, including any complication (23% proficient vs 22% not yet proficient; relative risk [RR], 0.97 [95% CI, 0.88-1.08]); severe complication (9% vs 9%, respectively; RR, 1.01 [95% CI, 0.86-1.19]); and mortality (5% vs 5%; RR, 1.07 [95% CI, 0.88-1.30]). Secondary analyses yielded no associations between patient outcomes and milestone ratings specific to technical performance, or between patient outcomes and composites of operative performance, professionalism, or leadership milestones ratings (P ranged .32-.97).
CONCLUSIONS Milestone ratings of graduating general surgery residents were not associated with the patient outcomes of those surgeons when they performed common, higher-risk procedures in a Medicare population. Efforts to improve how milestones ratings are generated might strengthen their association with early career outcomes.
Affiliation(s)
- Daniel E Kendrick
- D.E. Kendrick is assistant professor, Department of Surgery, University of Minnesota, Minneapolis, Minnesota
- Angela E Thelen
- A.E. Thelen is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Xilin Chen
- X. Chen is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Tanvi Gupta
- T. Gupta is research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Kenji Yamazaki
- K. Yamazaki is senior data analyst, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Andrew E Krumm
- A.E. Krumm is assistant professor, Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- Hoda Bandeh-Ahmadi
- H. Bandeh-Ahmadi is project manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Michael Clark
- M. Clark is a biostatistician, Consulting for Statistics, Computing, and Analytics Research, University of Michigan, Ann Arbor, Michigan
- John Luckoski
- J. Luckoski is research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Zhaohui Fan
- Z. Fan is research analyst, Center for Healthcare Outcomes and Policy, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Greg M Wnuk
- G.M. Wnuk is program manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Andrew M Ryan
- A.M. Ryan is professor, Department of Health Management and Policy, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Bhramar Mukherjee
- B. Mukherjee is professor and chair, Division of Biostatistics, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Stanley J Hamstra
- S.J. Hamstra is professor, Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Justin B Dimick
- J.B. Dimick is professor and chair, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Eric S Holmboe
- E.S. Holmboe is chief, Research, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Brian C George
- B.C. George is director, Center for Surgical Training and Research, and assistant professor, Department of Surgery, University of Michigan, Ann Arbor, Michigan
6
Degutis LC, Kellermann AL, Jackson K, Middleton A, Harris RS. Graduate Medical Education in the Military Health System: Strategic Analysis and Options. Mil Med 2023; 188:1-7. [PMID: 36882032 DOI: 10.1093/milmed/usac325]
Abstract
INTRODUCTION At the request of then-Assistant Secretary of Defense for Health Affairs, Dr. Jonathan Woodson, Defense Health Horizons (DHH) examined options for shaping graduate medical education (GME) in the Military Health System (MHS) in order to achieve the goals of a medically ready force and a ready medical force. MATERIALS AND METHODS DHH interviewed service GME directors, key designated institutional officials, and subject-matter experts on GME in the military and civilian health care systems. RESULTS This report proposes numerous short- and long-term courses of action in three areas:
1. Balancing the allocation of GME resources to suit the needs of active duty and garrisoned troops. We recommend developing a clear, tri-service mission and vision for GME in the MHS and expanding collaborations with outside institutions in order to prepare an optimal mix of physicians and ensure that trainees meet requirements for clinical experience.
2. Improving the recruitment and tracking of GME students, as well as the management of accessions. We recommend several measures to improve the quality of incoming students, to track the performance of students and medical schools, and to foster a tri-service approach to accessions.
3. Aligning the MHS with the tenets of the Clinical Learning Environment Review to advance a culture of safety and to help the MHS become a high reliability organization (HRO). We recommend several actions to strengthen patient care and residency training and to develop a systematic approach to MHS management and leadership.
CONCLUSION GME is vital to producing the future physician workforce and medical leadership of the MHS, and it provides the MHS with clinically skilled manpower. GME research sows the seeds for future discoveries to improve combat casualty care and other priority objectives of the MHS.
Although readiness is the MHS's top mission, GME is also vital to meeting the other three components of the quadruple aim (better health, better care, and lower costs). Properly managed and adequately resourced GME can accelerate the transformation of the MHS into an HRO. Based on our analysis, DHH believes that there are numerous opportunities for MHS leadership to strengthen GME so it is more integrated, jointly coordinated, efficient, and productive. All physicians emerging from military GME should understand and embrace team-based practice, patient safety, and a systems-oriented focus. This will ensure that those we prepare to be the military physicians of the future are prepared to meet the needs of the line, to protect the health and safety of deployed warfighters, and to provide expert and compassionate care to garrisoned service members, families, and military retirees.
Affiliation(s)
- Arthur L Kellermann
- Office of the Senior Vice President, Virginia Commonwealth University Health Sciences, Richmond, VA 23298, USA
- Kevin Jackson
- School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD 20814, USA
- Allen Middleton
- Office of the President, Uniformed Services University of the Health Sciences, Bethesda, MD 20814, USA
7
Kim JG, Mazotti L, McDonald KM, Holmboe E, Kanter MH. Rowing Together: Publicly Reported Quality of Care Measures, US Graduate Medical Education Accountability, and Patient Outcomes. Jt Comm J Qual Patient Saf 2023; 49:174-178. [PMID: 36653211 DOI: 10.1016/j.jcjq.2022.12.005]
8
Haas MRC, Davis MG, Harvey CE, Huang R, Scott KW, George BC, Wnuk GM, Burkhardt J. Implementation of the SIMPL (Society for Improving Medical Professional Learning) performance assessment tool in the emergency department: A pilot study. AEM Educ Train 2023; 7:e10842. [PMID: 36777102 PMCID: PMC9899600 DOI: 10.1002/aet2.10842]
Abstract
Background Feedback and assessment are difficult to provide in the emergency department (ED) setting despite their critical importance for competency-based education, and traditional end-of-shift evaluations (ESEs) alone may be inadequate. The SIMPL (Society for Improving Medical Professional Learning) mobile application has been successfully implemented and studied in the operative setting for surgical training programs as a point-of-care tool that incorporates three assessment scales in addition to dictated feedback. SIMPL may represent a viable tool for enhancing workplace-based feedback and assessment in emergency medicine (EM). Methods We implemented SIMPL at a 4-year EM residency program during a pilot study from March to June 2021 for observable activities such as medical resuscitations and related procedures. Faculty and residents underwent formal rater training prior to launch and were asked to complete surveys regarding the SIMPL app's content, usability, and future directions at the end of the pilot. Results A total of 36 of 58 faculty (62%) completed at least one evaluation, for a total of 190 evaluations and an average of three evaluations per faculty member. Faculty initiated 130 of 190 evaluations (68%) and residents initiated 60 of 190 (32%). Ninety-one percent included dictated feedback. A total of 45 of 54 residents (83%) received at least one evaluation, with an average of 3.5 evaluations per resident. Residents generally agreed that SIMPL increased the quality of feedback received and that they valued dictated feedback. Residents generally did not value the numerical feedback provided from SIMPL. Relative to the residents, faculty overall responded more positively toward SIMPL. The pilot generated several suggestions to inform the optimization of the next version of SIMPL for EM training programs.
Conclusions The SIMPL app, originally developed for use in surgical training programs, can be implemented for use in EM residency programs, has positive support from faculty, and may provide important adjunct information beyond current ESEs.
Affiliation(s)
- Mary R. C. Haas
- Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Mallory G. Davis
- Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Carrie E. Harvey
- Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Rob Huang
- Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Kirstin W. Scott
- University of Michigan Emergency Medicine Residency Program, Ann Arbor, Michigan, USA
- Brian C. George
- Center for Surgical Training and Research, Department of Surgery, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Gregory M. Wnuk
- Center for Surgical Training and Research, Department of Surgery, University of Michigan Medical School, Ann Arbor, Michigan, USA
- John Burkhardt
- Departments of Emergency Medicine and Learning Health Sciences, University of Michigan Medical School, Ann Arbor, Michigan, USA
9
Lam AC, Tang B, Lalwani A, Verma AA, Wong BM, Razak F, Ginsburg S. Methodology paper for the General Medicine Inpatient Initiative Medical Education Database (GEMINI MedED): a retrospective cohort study of internal medicine resident case-mix, clinical care and patient outcomes. BMJ Open 2022; 12:e062264. [PMID: 36153026 PMCID: PMC9511606 DOI: 10.1136/bmjopen-2022-062264]
Abstract
INTRODUCTION Unwarranted variation in patient care among physicians is associated with negative patient outcomes and increased healthcare costs. Care variation likely also exists for resident physicians. Despite the global movement towards outcomes-based and competency-based medical education, current assessment strategies in residency do not routinely incorporate clinical outcomes. The widespread use of electronic health records (EHRs) may enable the implementation of in-training assessments that incorporate clinical care and patient outcomes. METHODS AND ANALYSIS The General Medicine Inpatient Initiative Medical Education Database (GEMINI MedED) is a retrospective cohort study of senior residents (postgraduate year 2/3) enrolled in the University of Toronto Internal Medicine (IM) programme between 1 April 2010 and 31 December 2020. This study focuses on senior IM residents and patients they admit overnight to four academic hospitals. Senior IM residents are responsible for overseeing all overnight admissions; thus, care processes and outcomes for these clinical encounters can be at least partially attributed to the care they provide. Call schedules from each hospital, which list the date, location and senior resident on-call, will be used to link senior residents to EHR data of patients admitted during their on-call shifts. Patient data will be derived from the GEMINI database, which contains administrative (eg, demographic and disposition) and clinical data (eg, laboratory and radiological investigation results) for patients admitted to IM at the four academic hospitals. Overall, this study will examine three domains of resident practice: (1) case-mix variation across residents, hospitals and academic year, (2) resident-sensitive quality measures (EHR-derived metrics that are partially attributable to resident care) and (3) variations in patient outcomes across residents and factors that contribute to such variation. 
ETHICS AND DISSEMINATION GEMINI MedED was approved by the University of Toronto Ethics Board (RIS#39339). Results from this study will be presented in academic conferences and peer-reviewed journals.
Affiliation(s)
- Andrew CL Lam
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Brandon Tang
- Department of Medicine, Division of General Internal Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Anushka Lalwani
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Amol A Verma
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Division of General Internal Medicine, Unity Health Toronto, Toronto, Ontario, Canada
- Brian M Wong
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Division of General Internal Medicine, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Fahad Razak
- Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Ontario, Canada
- Division of General Internal Medicine, Unity Health Toronto, Toronto, Ontario, Canada
- Shiphra Ginsburg
- Department of Medicine, Division of Respirology, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
- Division of Respirology, Sinai Health System, Toronto, Ontario, Canada
10
Phillips RL, George BC, Holmboe ES, Bazemore AW, Westfall JM, Bitton A. Measuring Graduate Medical Education Outcomes to Honor the Social Contract. Acad Med 2022; 97:643-648. [PMID: 35020616 PMCID: PMC9028305 DOI: 10.1097/acm.0000000000004592]
Abstract
The graduate medical education (GME) system is heavily subsidized by the public in return for producing physicians who meet society's needs. Under the terms of this implicit social contract, decisions about how this funding is allocated are deferred to the individual training sites. Institutions receiving public funding face potential conflicts of interest, which have at times prioritized institutional purposes and needs over societal needs, highlighting that there is little public accountability for how such funding is used. The cost and institutional burden of assessing many fundamental GME outcomes, such as specialty, geographic physician distribution, training-imprinted cost behaviors, and populations served, could be mitigated as data sources and methods for assessing GME outcomes and guiding training improvement already exist. This new capacity to assess system-level outcomes could help institutions and policymakers strategically address the greatest public needs. Measurement of educational outcomes can also be used to guide training improvement at every level of the educational system (i.e., the individual trainee, individual teaching institution, and collective GME system levels). There are good examples of institutions, states, and training consortia that are already assessing and using GME outcomes in these ways. The ultimate outcome could be a GME system that better meets the needs of society and better honors what is now only an implicit social contract.
Affiliation(s)
- Robert L. Phillips
- R.L. Phillips Jr is executive director, Center for Professionalism & Value in Health Care, American Board of Family Medicine Foundation, Washington, DC; ORCID: https://orcid.org/0000-0001-7882-1560
- Brian C. George
- B.C. George is director, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan, executive director, Society for Improving Medical Professional Learning, Boston, Massachusetts, and senior scholar, Center for Professionalism & Value in Health Care, American Board of Family Medicine Foundation, Washington, DC
- Eric S. Holmboe
- E.S. Holmboe is chief, Research, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Andrew W. Bazemore
- A.W. Bazemore is senior vice president for research and policy, American Board of Family Medicine, and co-director, Center for Professionalism & Value in Health Care, American Board of Family Medicine Foundation, Washington, DC
- John M. Westfall
- J.M. Westfall is director, Robert Graham Center, American Academy of Family Physicians, Washington, DC
- Asaf Bitton
- A. Bitton is executive director, Ariadne Labs, associate professor of medicine, Division of General Medicine, Brigham and Women's Hospital, associate professor of health care policy, Harvard Medical School, Boston, Massachusetts, and part-time senior advisor for primary care policy, Center for Medicare & Medicaid Innovation, Baltimore, Maryland
12
Luckoski J, Jean D, Thelen A, Mazer L, George B, Kendrick DE. How Do Programs Measure Resident Performance? A Multi-Institutional Inventory of General Surgery Assessments. J Surg Educ 2021; 78:e189-e195. [PMID: 34593329 DOI: 10.1016/j.jsurg.2021.08.024]
Abstract
OBJECTIVE To perform an inventory of assessment tools in use at surgical residency programs and their alignment with the Milestone Competencies. DESIGN We conducted an inventory of all assessment tools from a sample of general surgery training programs participating in a multi-center study of resident operative development in the United States. Each instrument was categorized using a data extraction tool designed to identify criteria for effective assessment in competency-based education and according to which Milestone Competency was being evaluated. Tabulations of each category were then analyzed using descriptive statistics. Interviews with program directors and assessment coordinators were conducted to understand each instrument's intended use within each program. SETTING Multi-institutional review of general surgery assessment programs. PARTICIPANTS We identified assessment tools used by 10 general surgery programs during the 2019 to 2020 academic year. Programs were selected from a cohort already participating in a separate research study of resident operative development in the United States. RESULTS We identified 42 unique assessment tools in use. Each program used an average of 7.2 (range 4-13) unique assessment instruments to measure performance, of which only 5 (11.9%) were used by at least 1 other program in our sample. Of all assessments, 59.5% were used monthly or less frequently. The majority (66.7%) of instruments were retrospective global assessments, rather than discrete observed performances. Only 4 (9.5%) instruments had established reliability or validity evidence. Across programs there was also significant variation in the volume of assessment used to evaluate residents, with the median total number of evaluations per trainee across all Milestone Competencies being 217 (IQR 78) per year. Patient care was the most frequently evaluated Milestone Competency.
CONCLUSIONS General surgical assessment systems predominantly employ non-standardized global assessment tools that lack reliability or validity evidence. This variability makes it challenging to interpret and compare competency standards across programs. A standardized assessment toolkit with established reliability and validity evidence would allow training programs to measure the competence of their trainees more uniformly and understand where improvements in our training system can be made.
Affiliation(s)
- John Luckoski: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Danielle Jean: University of Michigan Medical School, Ann Arbor, Michigan
- Angela Thelen: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Laura Mazer: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Brian George: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Daniel E Kendrick: Department of Surgery, University of Minnesota, Minneapolis, Minnesota

13
Touchie C, Kinnear B, Schumacher D, Caretta-Weyer H, Hamstra SJ, Hart D, Gruppen L, Ross S, Warm E, Ten Cate O. On the validity of summative entrustment decisions. MEDICAL TEACHER 2021; 43:780-787. [PMID: 34020576 DOI: 10.1080/0142159x.2021.1925642] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Health care revolves around trust. Patients are often in a position that leaves them no choice but to trust the people taking care of them. Educational programs therefore have a responsibility to develop physicians who can be trusted to deliver safe and effective care, ultimately making a final decision to entrust trainees to graduate to unsupervised practice. Such entrustment decisions deserve to be scrutinized for their validity. The end-of-training entrustment decision is arguably the most important one, although earlier entrustment decisions, for smaller units of professional practice, should also be scrutinized. Validity of entrustment decisions implies a defensible argument that can be analyzed in components that together support the decision. According to Kane, building a validity argument is a process designed to support inferences of scoring, generalization across observations, extrapolation to new instances, and implications of the decision. According to Messick, a lack of validity can stem from inadequate evidence regarding content, response process, internal structure (coherence), and relationships to other variables, as well as from misinterpreted consequences. These two leading frameworks in educational and psychological testing (Kane's and Messick's) apply well to summative entrustment decision-making. The authors elaborate the types of questions that must be answered to arrive at defensible, well-argued summative decisions about performance, thereby grounding high-quality, safe patient care.
Affiliation(s)
- Claire Touchie: Medical Council of Canada, Ottawa, Canada; The University of Ottawa, Ottawa, Canada
- Benjamin Kinnear: Internal Medicine and Pediatrics, University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Daniel Schumacher: Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Holly Caretta-Weyer: Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, USA
- Stanley J Hamstra: University of Toronto, Toronto, Ontario, Canada; Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Danielle Hart: Emergency Medicine, Hennepin Healthcare and the University of Minnesota, Minneapolis, MN, USA
- Larry Gruppen: Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Shelley Ross: Department of Family Medicine, University of Alberta, Edmonton, AB, Canada
- Eric Warm: University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Olle Ten Cate: Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands

14
Kinnear B, Kelleher M, Sall D, Schauer DP, Warm EJ, Kachelmeyer A, Martini A, Schumacher DJ. Development of Resident-Sensitive Quality Measures for Inpatient General Internal Medicine. J Gen Intern Med 2021; 36:1271-1278. [PMID: 33105001 PMCID: PMC8131459 DOI: 10.1007/s11606-020-06320-0] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/18/2020] [Revised: 07/20/2020] [Accepted: 10/14/2020] [Indexed: 11/28/2022]
Abstract
BACKGROUND Graduate medical education (GME) training has long-lasting effects on patient care quality. Despite this, few GME programs use clinical care measures as part of resident assessment. Furthermore, there is no gold standard to identify clinical care measures that are reflective of resident care. Resident-sensitive quality measures (RSQMs), defined as "measures that are meaningful in patient care and are most likely attributable to resident care," have been developed using consensus methodology and piloted in pediatric emergency medicine. However, this approach has not been tested in internal medicine (IM). OBJECTIVE To develop RSQMs for a general internal medicine (GIM) inpatient residency rotation using previously described consensus methods. DESIGN The authors used two consensus methods, nominal group technique (NGT) and a subsequent Delphi method, to generate RSQMs for a GIM inpatient rotation. RSQMs were generated for specific clinical conditions found on a GIM inpatient rotation, as well as for general care on a GIM ward. PARTICIPANTS NGT participants included nine IM and medicine-pediatrics (MP) residents and six IM and MP faculty members. The Delphi group included seven IM and MP residents and seven IM and MP faculty members. MAIN MEASURES The number and description of RSQMs generated during this process. KEY RESULTS Consensus methods resulted in 89 RSQMs with the following breakdown by condition: GIM general care-21, diabetes mellitus-16, hyperkalemia-14, COPD-13, hypertension-11, pneumonia-10, and hypokalemia-4. All RSQMs were process measures, with 48% relating to documentation and 51% relating to orders. Fifty-eight percent of RSQMs were related to the primary admitting diagnosis, while 42% could also be related to chronic comorbidities that require management during an admission. CONCLUSIONS Consensus methods resulted in 89 RSQMs for a GIM inpatient service. 
While all RSQMs were process measures, they may still hold value in learner assessment, formative feedback, and program evaluation.
Affiliation(s)
- Benjamin Kinnear: Department of Pediatrics and Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Matthew Kelleher: Department of Pediatrics and Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Dana Sall: Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel P Schauer: Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm: Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Andrea Kachelmeyer: Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Abigail Martini: Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel J Schumacher: Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA

15
Grischkan JA, Friedman AB, Chandra A. Financing of US Graduate Medical Education-Reply. JAMA 2021; 325:586. [PMID: 33560316 DOI: 10.1001/jama.2020.23988] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Affiliation(s)
- Ari B Friedman: Hospital of the University of Pennsylvania, Leonard Davis Institute of Health Economics, Philadelphia
- Amitabh Chandra: Harvard Kennedy School and Harvard Business School, Cambridge, Massachusetts

16
Schumacher DJ, Dornoff E, Carraccio C, Busari J, van der Vleuten C, Kinnear B, Kelleher M, Sall DR, Warm E, Martini A, Holmboe E. The Power of Contribution and Attribution in Assessing Educational Outcomes for Individuals, Teams, and Programs. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:1014-1019. [PMID: 31833856 DOI: 10.1097/acm.0000000000003121] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Recent discussions have brought attention to the utility of contribution analysis for evaluating the effectiveness and outcomes of medical education programs, especially for complex initiatives such as competency-based medical education. Contribution analysis focuses on the extent to which different entities contribute to an outcome. Given that health care is provided by teams, contribution analysis is well suited to evaluating the outcomes of care delivery. Furthermore, contribution analysis plays an important role in analyzing program- and system-level outcomes that inform program evaluation and program-level improvements for the future. Equally important in health care, however, is the role of the individual. In the overall contribution of a team to an outcome, some aspects of this outcome can be attributed to individual team members. For example, a recently discharged patient with an unplanned return to the emergency department to seek care may not have understood the discharge instructions given by the nurse or may not have received any discharge guidance from the resident physician. In this example, if it is the nurse's responsibility to provide discharge instructions, that activity is attributed to him or her. This and other activities attributed to different individuals (e.g., nurse, resident) combine to contribute to the outcome for the patient. Determining how to tease out such attributions is important for several reasons. First, it is physicians, not teams, that graduate and are granted certification and credentials for medical practice. Second, incentive-based payment models focus on the quality of care provided by an individual. Third, an individual can use data about his or her performance on the team to help drive personal improvement. 
In this article, the authors explore how attribution and contribution analyses can be used in a complementary fashion to discern which outcomes can and should be attributed to individuals, which to teams, and which to programs.
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. E. Dornoff is a medical student, University of Cincinnati College of Medicine, Cincinnati, Ohio. C. Carraccio is vice president of competency-based assessment, American Board of Pediatrics, Chapel Hill, North Carolina. J. Busari is consultant pediatrician and associate professor of medical education, Maastricht University, Maastricht, the Netherlands. C. van der Vleuten is professor of education, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, and scientific director, School of Health Professions Education, Maastricht University, Maastricht, the Netherlands. B. Kinnear is assistant professor of pediatrics and internal medicine, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. M. Kelleher is assistant professor of pediatrics and internal medicine, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. D.R. Sall is assistant professor of internal medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. E. Warm is professor of medicine and internal medicine program director, University of Cincinnati College of Medicine, Cincinnati, Ohio. A. Martini is a clinical research coordinator, Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio. E. Holmboe is senior vice president for milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
17
Mai MV, Orenstein EW, Manning JD, Luberti AA, Dziorny AC. Attributing Patients to Pediatric Residents Using Electronic Health Record Features Augmented with Audit Logs. Appl Clin Inform 2020; 11:442-451. [PMID: 32583389 DOI: 10.1055/s-0040-1713133] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/09/2023] Open
Abstract
OBJECTIVE Patient attribution, or the process of attributing patient-level metrics to specific providers, attempts to capture real-life provider-patient interactions (PPI). Attribution holds wide-ranging importance, particularly for outcomes in graduate medical education, but remains a challenge. We developed and validated an algorithm using electronic health record (EHR) data to identify pediatric resident PPIs (rPPIs). METHODS We prospectively surveyed residents in three care settings to collect self-reported rPPIs. Participants were surveyed at the end of primary care clinic, emergency department (ED), and inpatient shifts, shown a patient census list, asked to mark the patients with whom they interacted, and encouraged to provide a short rationale behind the marked interaction. We extracted routine EHR data elements, including audit logs, note contribution, order placement, care team assignment, and chart closure, and applied a logistic regression classifier to the data to predict rPPIs in each care setting. We also performed a comment analysis of the resident-reported rationales in the inpatient care setting to explore perceived patient interactions in a complicated workflow. RESULTS We surveyed 81 residents over 111 shifts and identified 579 patient interactions. Among EHR-extracted data, time-in-chart was the best predictor in all three care settings (primary care clinic: odds ratio [OR] = 19.36, 95% confidence interval [CI]: 4.19-278.56; ED: OR = 19.06, 95% CI: 9.53-41.65; inpatient: OR = 2.95, 95% CI: 2.23-3.97). Primary care clinic and ED specific models had c-statistic values > 0.98, while the inpatient-specific model had greater variability (c-statistic = 0.89). Of 366 inpatient rPPIs, residents provided rationales for 90.1%, which were focused on direct involvement in a patient's admission or transfer, or care as the front-line ordering clinician (55.6%). CONCLUSION Classification models based on routinely collected EHR data predict resident-defined rPPIs across care settings.
While specific to pediatric residents in this study, the approach may be generalizable to other provider populations and scenarios in which accurate patient attribution is desirable.
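The classifier the abstract describes can be sketched with routinely available tooling. This is an illustrative reconstruction on synthetic data, not the study's code; the feature names (time in chart, note authorship, order placement) are simplified stand-ins for the EHR elements listed above.

```python
# Illustrative sketch only: a logistic regression over synthetic EHR-style
# features, mirroring the kind of rPPI classifier the abstract describes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
time_in_chart = rng.exponential(5.0, n)   # minutes spent in a patient's chart
wrote_note = rng.integers(0, 2, n)        # contributed a note (0/1)
placed_order = rng.integers(0, 2, n)      # placed an order (0/1)
X = np.column_stack([time_in_chart, wrote_note, placed_order])

# Synthetic labels: interactions grow more likely with chart time and notes
logits = 0.8 * time_in_chart + 1.5 * wrote_note - 4.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, clf.predict_proba(X)[:, 1])  # the "c-statistic"
```

As in the study, the c-statistic summarizes how well the footprint features separate true interactions from non-interactions.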
Affiliation(s)
- Mark V Mai: Department of Anesthesia and Critical Care Medicine and Department of Biomedical and Health Informatics, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, United States
- Evan W Orenstein: Department of Pediatrics, Children's Healthcare of Atlanta, Atlanta, Georgia, United States
- John D Manning: Department of Emergency Medicine, Atrium Health's Carolinas Medical Center, Charlotte, North Carolina, United States
- Anthony A Luberti: Department of Biomedical and Health Informatics and Department of Pediatrics, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, United States
- Adam C Dziorny: Department of Anesthesia and Critical Care Medicine and Department of Biomedical and Health Informatics, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, United States

18
Holmboe ES, Yamazaki K, Nasca TJ, Hamstra SJ. Using Longitudinal Milestones Data and Learning Analytics to Facilitate the Professional Development of Residents: Early Lessons From Three Specialties. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:97-103. [PMID: 31348058 PMCID: PMC6924938 DOI: 10.1097/acm.0000000000002899] [Citation(s) in RCA: 48] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
PURPOSE To investigate the effectiveness of using national, longitudinal milestones data to provide formative assessments to identify residents at risk of not achieving recommended competency milestone goals by residency completion. The investigators hypothesized that specific, lower milestone ratings at earlier time points in residency would be predictive of not achieving recommended Level (L) 4 milestones by graduation. METHOD In 2018, the investigators conducted a longitudinal cohort study of emergency medicine (EM), family medicine (FM), and internal medicine (IM) residents who completed their residency programs from 2015 to 2018. They calculated predictive values and odds ratios, adjusting for nesting within programs, for specific milestone rating thresholds at 6-month intervals for all subcompetencies within each specialty. They used final milestones ratings (May-June 2018) as the outcome variables, setting L4 as the ideal educational outcome. RESULTS The investigators included 1,386 (98.9%) EM residents, 3,276 (98.0%) FM residents, and 7,399 (98.0%) IM residents in their analysis. The percentage of residents not reaching L4 by graduation ranged from 11% to 31% in EM, 16% to 53% in FM, and 5% to 15% in IM. Using a milestone rating of L2.5 or lower at the end of post-graduate year 2, the predictive probability of not attaining the L4 milestone graduation goal ranged from 32% to 56% in EM, 32% to 67% in FM, and 15% to 36% in IM. CONCLUSIONS Longitudinal milestones ratings may provide educationally useful, predictive information to help individual residents address potential competency gaps, but the predictive power of the milestones ratings varies by specialty and subcompetency within these 3 adult care specialties.
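The predictive values the study reports reduce to a conditional probability over a cohort. A minimal sketch with hypothetical counts (not the study's data):

```python
# Minimal sketch of a threshold-based predictive value: P(resident does not
# reach Level 4 by graduation | milestone rating <= 2.5 at end of PGY-2).
# The counts below are hypothetical, not taken from the study.
def predictive_value(missed_goal_with_low_rating: int, low_rating_total: int) -> float:
    """Fraction of low-rated residents who went on to miss the L4 goal."""
    return missed_goal_with_low_rating / low_rating_total

# Hypothetical cohort: 120 residents rated <= L2.5 at PGY-2 end, 54 missed L4
p_miss = predictive_value(54, 120)
```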
Affiliation(s)
- E.S. Holmboe is chief research, milestones development and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- K. Yamazaki is senior analyst, Milestones, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- T.J. Nasca is president and chief executive officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois, professor of medicine and molecular physiology, Sidney Kimmel Medical College, Thomas Jefferson University, Philadelphia, Pennsylvania, and senior scholar, Department of Education, University of Illinois at Chicago School of Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0811-5462
- S.J. Hamstra is vice president, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois, adjunct professor, Faculty of Education, University of Ottawa, Ottawa, Ontario, Canada, and adjunct professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-0680-366X

19
Abid MB, Frank MO. Becoming a physician: towards a more robust physician recruitment process. Postgrad Med J 2019; 96:123-124. [PMID: 31792114 DOI: 10.1136/postgradmedj-2018-136321] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2018] [Revised: 10/21/2019] [Accepted: 11/17/2019] [Indexed: 11/04/2022]
Affiliation(s)
- Michael O Frank: Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin, USA

20
Rosenberg ME, Gauer JL, Smith B, Calhoun A, Olson APJ, Melcher E. Building a Medical Education Outcomes Center: Development Study. JMIR MEDICAL EDUCATION 2019; 5:e14651. [PMID: 31674919 PMCID: PMC6856860 DOI: 10.2196/14651] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/08/2019] [Revised: 06/28/2019] [Accepted: 06/28/2019] [Indexed: 05/28/2023]
Abstract
BACKGROUND Medical education outcomes and clinical data exist in multiple unconnected databases, resulting in 3 problems: (1) it is difficult to connect learner outcomes with patient outcomes, (2) learners cannot be easily tracked over time through the education-training-practice continuum, and (3) no standard methodology ensures quality and privacy of the data. OBJECTIVE The purpose of this study was to develop a Medical Education Outcomes Center (MEOC) to integrate education data and to build a framework to standardize the intake and processing of requests for using these data. METHODS An inventory of over 100 data sources owned or utilized by the medical school was conducted, and nearly 2 dozen of these data sources have been vetted and integrated into the MEOC. In addition, the American Medical Association (AMA) Physician Masterfile data of the University of Minnesota Medical School (UMMS) graduates were linked to the data from the National Provider Identifier (NPI) registry to develop a mechanism to connect alumni practice data to education data. RESULTS Over 160 data requests have been fulfilled, culminating in a range of outcomes analyses, including support of accreditation efforts. The MEOC received data on 13,092 UMMS graduates in the AMA Physician Masterfile and could link 10,443 with NPI numbers and began to explore their practice demographics. The technical and operational work to expand the MEOC continues. Next steps are to link the educational data to the clinical practice data through NPI numbers to assess the effectiveness of our medical education programs by the clinical outcomes of our graduates. CONCLUSIONS The MEOC provides a replicable framework to allow other schools to more effectively operate their programs and drive innovation.
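The linkage step described (matching graduate records to NPI registry entries so education data can connect to practice data) is, at its core, a keyed join. A minimal sketch with hypothetical identifiers, not AMA Masterfile or NPI registry data:

```python
# Illustrative sketch of record linkage by a shared key: graduates with an
# entry in the registry lookup get an NPI; the rest remain unlinked (None).
# All identifiers below are made up.
graduates = [
    {"grad_id": 1, "name": "A"},
    {"grad_id": 2, "name": "B"},
    {"grad_id": 3, "name": "C"},
]
npi_by_grad = {1: "1234567890", 3: "1098765432"}  # hypothetical registry lookup

linked = [dict(g, npi=npi_by_grad.get(g["grad_id"])) for g in graduates]
n_linked = sum(1 for rec in linked if rec["npi"] is not None)
```

The unlinked remainder (here, one of three records) is what drives the gap the study reports between 13,092 graduates and 10,443 linked NPI numbers.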
Affiliation(s)
- Mark E Rosenberg: Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN, United States
- Jacqueline L Gauer: Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN, United States
- Barbara Smith: Office of Health Sciences Technology, University of Minnesota, Minneapolis, MN, United States
- Austin Calhoun: Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN, United States
- Andrew P J Olson: Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN, United States
- Emily Melcher: Office of Health Sciences Technology, University of Minnesota, Minneapolis, MN, United States

21
Schumacher DJ, Wu DTY, Meganathan K, Li L, Kinnear B, Sall DR, Holmboe E, Carraccio C, van der Vleuten C, Busari J, Kelleher M, Schauer D, Warm E. A Feasibility Study to Attribute Patients to Primary Interns on Inpatient Ward Teams Using Electronic Health Record Data. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2019; 94:1376-1383. [PMID: 31460936 DOI: 10.1097/acm.0000000000002748] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
PURPOSE To inform graduate medical education (GME) outcomes at the individual resident level, this study sought a method for attributing care for individual patients to individual interns based on "footprints" in the electronic health record (EHR). METHOD Primary interns caring for patients on an internal medicine inpatient service were recorded daily by five attending physicians of record at University of Cincinnati Medical Center in August 2017 and January 2018. These records were considered gold standard identification of primary interns. The following EHR variables were explored to determine representation of primary intern involvement in care: postgraduate year, progress note author, discharge summary author, physician order placement, and logging clicks in the patient record. These variables were turned into quantitative attributes (e.g., progress note author: yes/no), and informative attributes were selected and modeled using a decision tree algorithm. RESULTS A total of 1,511 access records were generated; 116 were marked as having a primary intern assigned. All variables except discharge summary author displayed at least some level of importance in the models. The best model achieved 78.95% sensitivity, 97.61% specificity, and an area under the receiver operating characteristic curve of approximately 91%. CONCLUSIONS This study successfully predicted primary interns caring for patients on inpatient teams using EHR data with excellent model performance. This provides a foundation for attributing patients to primary interns for the purposes of determining the patient diagnoses and complexity the interns see as well as supporting continuous quality improvement efforts in GME.
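A decision tree over binary "footprint" attributes, of the kind the abstract describes, can be sketched as follows. The data, features, and learned thresholds here are synthetic stand-ins, not the study's:

```python
# Illustrative sketch only: a shallow decision tree that flags the primary
# intern from hypothetical EHR footprint attributes.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import recall_score

rng = np.random.default_rng(1)
n = 1500
wrote_note = rng.integers(0, 2, n)     # progress note author (0/1)
placed_orders = rng.integers(0, 2, n)  # physician order placement (0/1)
click_count = rng.poisson(20, n)       # logged clicks in the patient record
X = np.column_stack([wrote_note, placed_orders, click_count])

# Synthetic gold standard: primary interns write the note and click often
y = ((wrote_note == 1) & (click_count > 15)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
pred = tree.predict(X)
sensitivity = recall_score(y, pred)
specificity = recall_score(1 - y, 1 - pred)
```

Sensitivity and specificity are reported separately, as in the study, because attribution errors in the two directions (missing the primary intern vs. crediting the wrong one) have different consequences.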
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. D.T.Y. Wu is assistant professor of biomedical informatics and pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio. K. Meganathan is senior clinical data analyst, Center for Health Informatics, University of Cincinnati College of Medicine, Cincinnati, Ohio. L. Li is research associate, Center for Health Informatics, University of Cincinnati College of Medicine, Cincinnati, Ohio. B. Kinnear is assistant professor of pediatrics and internal medicine, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. D.R. Sall is assistant professor of internal medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. E. Holmboe is senior vice president for milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois. C. Carraccio is vice president of competency-based assessment, American Board of Pediatrics, Chapel Hill, North Carolina. C. van der Vleuten is professor of education, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, and scientific director, School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands. J. Busari is consultant pediatrician and associate professor of medical education, Maastricht University, Maastricht, The Netherlands. M. Kelleher is assistant professor of pediatrics and internal medicine, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. D. Schauer is associate professor of internal medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. E. Warm is professor of medicine and internal medicine program director, University of Cincinnati College of Medicine, Cincinnati, Ohio
22
van Schaik SM, Reeves SA, Headrick LA. Exemplary Learning Environments for the Health Professions: A Vision. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2019; 94:975-982. [PMID: 30844927 DOI: 10.1097/acm.0000000000002689] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
In this article, the authors propose a vision for exemplary learning environments in which everyone involved in health professions education and health care collaborates toward optimal health for individuals, populations, and communities. Learning environments in the health professions can be conceptualized as complex adaptive systems, defined as a collection of individual agents whose actions are interconnected and follow a set of shared "simple rules." Using principles from complex adaptive systems as a guiding framework for the proposed vision, the authors postulate that exemplary learning environments will follow four such simple rules: Health care and health professions education share a goal of improving health for individuals, populations, and communities; in exemplary learning environments, learning is work and work is learning; exemplary learning environments recognize that collaboration with integration of diverse perspectives is essential for success; and the organizations and agents in the learning environments learn about themselves and the greater system they are part of in order to achieve continuous improvement and innovation. For each of the simple rules, the authors describe the details of the vision and how the current state diverges from this vision. They provide actionable ideas about how to reach the vision using specific examples from the literature. In addition, they identify potential targets for assessment to monitor the success of learning environments, including outcome measures at the individual, team, institutional, and societal levels. Such measurements can ensure optimal alignment between health professions education and health care and inform ongoing improvement of learning environments.
Affiliation(s)
- Sandrijn M van Schaik
- S.M. van Schaik is professor of pediatrics and Baum Family Presidential Chair for Experiential Learning, University of California, San Francisco, San Francisco, California. S.A. Reeves is professor and dean, School of Nursing and Health Professions, Colby-Sawyer College, New London, New Hampshire, and chief nurse executive, Dartmouth-Hitchcock Health, Lebanon, New Hampshire. L.A. Headrick is professor emerita of medicine, University of Missouri School of Medicine, Columbia, Missouri
23
Gauer JL, van den Hoogenhof S, Rosenberg ME. Treating and teaching: using publicly available data to explore the relationship between student and patient evaluations of teaching hospitals. ADVANCES IN MEDICAL EDUCATION AND PRACTICE 2019; 10:405-409. [PMID: 31354377 PMCID: PMC6580117 DOI: 10.2147/amep.s192304] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 10/26/2018] [Accepted: 04/11/2019] [Indexed: 06/10/2023]
Abstract
Introduction: Treating patients and teaching medical students are parallel activities that occur at teaching hospitals. However, the relationship between these activities is poorly understood. There have been multiple calls for assessing the quality of medical education by examining publicly available clinical data, but there is minimal evidence linking these variables. Method: In this proof-of-principle study, the authors examined publicly available Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS®) data collected during calendar year 2013 to explore the relationship between patient evaluations of their hospital experience and medical student evaluations of the educational experience at that site. Results: Pearson product-moment correlation coefficients were calculated for multiple variables. Patient ratings of doctor-patient communication correlated with student ratings of organization (R=0.882, p=0.048), educational value (R=0.882, p=0.048), teaching (R=0.963, p=0.008), and evaluation and feedback (R=0.920, p=0.027). Conclusion: These findings provide preliminary evidence for a relationship between patient experiences and the quality of education at that site. Further studies linking clinical and education outcomes are needed to explore this relationship in more depth. The contributions of specific hospital locations, providers, or clerkships need to be evaluated. Studies examining these relationships have the potential to improve both patient care and medical education.
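The analysis reported is a Pearson product-moment correlation over hospital-level rating pairs. A self-contained sketch with made-up numbers (not the study's data):

```python
# Illustrative sketch: Pearson product-moment correlation between paired
# hospital-level ratings. All values below are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson r: covariance normalized by the two standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired ratings for five teaching hospitals
patient_scores = [78.0, 82.0, 85.0, 88.0, 91.0]  # doctor-patient communication
student_scores = [3.1, 3.4, 3.3, 3.9, 4.2]       # student teaching ratings
r = pearson_r(patient_scores, student_scores)
```

With only a handful of hospitals per comparison, as in the study, such correlations are sensitive to single data points, which is why the authors frame the results as proof-of-principle.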
Affiliation(s)
- Jacqueline L Gauer
- Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN 55455, USA
- Mark E Rosenberg
- Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN 55455, USA
24
Smirnova A, Sebok-Syer SS, Chahine S, Kalet AL, Tamblyn R, Lombarts KMJMH, van der Vleuten CPM, Schumacher DJ. Defining and Adopting Clinical Performance Measures in Graduate Medical Education: Where Are We Now and Where Are We Going? ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2019; 94:671-677. [PMID: 30720528 DOI: 10.1097/acm.0000000000002620] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Assessment and evaluation of trainees' clinical performance measures are needed to ensure safe, high-quality patient care. These measures also aid in the development of reflective, high-performing clinicians and hold graduate medical education (GME) accountable to the public. Although clinical performance measures hold great potential, the challenges of defining, extracting, and measuring clinical performance in this way hinder their use for educational and quality improvement purposes. This article provides a way forward by identifying and articulating how clinical performance measures can be used to enhance GME by linking educational objectives with relevant clinical outcomes. The authors explore four key challenges: defining clinical performance measures, measuring them, using electronic health record and clinical registry data to capture clinical performance, and bridging the silos of medical education and health care quality improvement. The authors also propose solutions to showcase the value of clinical performance measures and conclude with a research and implementation agenda. Developing a common taxonomy of uniform specialty-specific clinical performance measures, linking these measures to large-scale GME databases, and applying both quantitative and qualitative methods to create a rich understanding of how GME affects quality of care and patient outcomes is important, the authors argue. The focus of this article is primarily GME, yet similar challenges and solutions will be applicable to other areas of medical and health professions education as well.
Affiliation(s)
- Alina Smirnova
- A. Smirnova is a PhD researcher, School of Health Professions Education, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands, and Professional Performance Research Group, Department of Medical Psychology, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands. S.S. Sebok-Syer is instructor, Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California. S. Chahine is assistant professor and scientist, Centre for Educational Research and Innovation (CERI), Western University, London, Ontario, Canada. A.L. Kalet is professor of medicine and surgery, director of research on medical education outcomes (ROMEO), Unit of the Division of General Internal Medicine and Clinical Innovation, Department of Medicine, and director of research, Program on Medical Education and Technology, NYU School of Medicine, New York, New York. R. Tamblyn is professor, Department of Medicine and Department of Epidemiology and Biostatistics, McGill University, medical scientist, McGill University Health Center Research Institute, scientific director, Clinical and Health Informatics Research Group, McGill University, and scientific director, Canadian Institutes of Health Research-Institute of Health Services and Policy Research, Montreal, Quebec, Canada. K.M.J.M.H. Lombarts is professor and lead investigator, Professional Performance Research Group, Department of Medical Psychology, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands. C.P.M. van der Vleuten is professor and scientific director, School of Health Professions Education, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands. D.J. Schumacher is associate professor, Division of Emergency Medicine, and pediatric emergency physician, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio
25
Weinstein DF, Thibault GE. Illuminating Graduate Medical Education Outcomes in Order to Improve Them. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2018; 93:975-978. [PMID: 29642105 DOI: 10.1097/acm.0000000000002244] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Optimizing clinician education is an essential step toward enhancing health outcomes, and graduate medical education (GME), as the pipeline for producing the nation's physicians, is an appropriate target for improvement. This Invited Commentary focuses on the need to clarify the specific goals of GME and to measure achievement of those goals using consistent metrics. The authors report on an October 2017 National Academies of Sciences, Engineering, and Medicine (NASEM) workshop focused on this agenda. A broadly representative group of participants reflected strong consensus in support of using GME outcomes data to develop better approaches to education and related policy. Implementation challenges include identifying meaningful metrics, minimizing administrative burden, addressing privacy concerns, and recognizing variability in institutional mission and capabilities. The authors recommend creating a national inventory of current data sources and initiating a pilot program to collect and share common metrics, while advancing a national effort via a "neutral" convener, such as the NASEM. The authors assert that measuring and reporting GME outcomes is a professional responsibility that must now be tackled.
Affiliation(s)
- Debra F Weinstein
- D.F. Weinstein is associate professor of medicine, Harvard Medical School, and vice president for graduate medical education, Partners HealthCare, Boston, Massachusetts. G.E. Thibault is president, Josiah Macy Jr. Foundation, New York, New York
26
Triola MM, Hawkins RE, Skochelak SE. The Time Is Now: Using Graduates' Practice Data to Drive Medical Education Reform. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2018; 93:826-828. [PMID: 29443719 DOI: 10.1097/acm.0000000000002176] [Citation(s) in RCA: 34] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Medical educators are not yet taking full advantage of the publicly available clinical practice data published by federal, state, and local governments, which can be attributed to individual physicians and evaluated in the context of where they attended medical school and residency training. Understanding how graduates fare in actual practice, both in terms of the quality of the care they provide and the clinical challenges they face, can aid educators in taking an evidence-based approach to medical education. Although in their infancy, efforts to link clinical outcomes data to educational process data hold the potential to accelerate medical education research and innovation. This approach will enable unprecedented insight into the long-term impact of each stage of medical education on graduates' future practice. More work is needed to determine best practices, but the barrier to using these public data is low, and the potential for early results is immediate. Using practice data to evaluate medical education programs can transform how the future physician workforce is trained and better align continuously learning medical education and health care systems.
Affiliation(s)
- Marc M Triola
- M.M. Triola is associate professor of medicine, associate dean for educational informatics, and founding director, Institute for Innovations in Medical Education, NYU School of Medicine, New York, New York; ORCID: https://orcid.org/0000-0002-6303-3112. R.E. Hawkins is president and chief executive officer, American Board of Medical Specialties, Chicago, Illinois. He was vice president for medical education outcomes, American Medical Association, Chicago, Illinois, at the time of writing. S.E. Skochelak is group vice president for medical education, American Medical Association, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-9522-4888