1. Cheney-Peters D, Liveright E, Shusted C, Sinnott JF, Diemer G, Jaffe R. A Learning Community Supporting Experiential Education to Learn About Healthcare Equity Quality Improvement. J Gen Intern Med 2023; 38:3060-3064. PMID: 37488367; PMCID: PMC10593695; DOI: 10.1007/s11606-023-08314-0.
Abstract
BACKGROUND Quality improvement (QI) for healthcare equity (HCE) is an important aspect of graduate medical education (GME), but there is limited published research on educational programs teaching this topic. AIM To describe and evaluate a novel curriculum and learning community for HCE QI. SETTING Academic institution. PARTICIPANTS Forty-eight participants: 32 learners and 16 faculty. PROGRAM DESCRIPTION This novel, longitudinal curriculum used a virtual hub-and-spoke learning community. Five interdepartmental teams of learners and faculty (spokes) used QI methods to address an existing institutional healthcare inequity (HCI). A team of experts (the hub) led monthly group meetings to foster the learning community and guide teams. PROGRAM EVALUATION Retrospective pre-post curricular surveys assessed participant satisfaction, knowledge, and skills in applying QI methods to address HCIs. The response rate was 33%. The majority of participants (92.4%) reported an increase in knowledge and skills in conducting QI for HCIs. All participants reported an increased likelihood of future engagement in HCE QI. The average QIPAT-7 score for final QI projects was 25.8 (SD = 4.93), consistent with "meets expectations" in most categories. DISCUSSION This program is a feasible model to teach GME learners and faculty about HCE QI and may be adopted by other institutions.
Affiliation(s)
- Dianna Cheney-Peters
- Division of Hospital Medicine, Department of Medicine, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Elizabeth Liveright
- Department of Obstetrics and Gynecology, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Christine Shusted
- The Jane and Leonard Korman Respiratory Institute, Division of Pulmonary and Critical Care Medicine, Department of Medicine, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Jacqueline F Sinnott
- Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Gretchen Diemer
- Division of Hospital Medicine, Department of Medicine, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Rebecca Jaffe
- Division of Hospital Medicine, Department of Medicine, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
2. Mendoza J, Hampton E, Singleton L. A theoretical and practical approach to quality improvement education. Curr Probl Pediatr Adolesc Health Care 2023; 53:101459. PMID: 37980237; DOI: 10.1016/j.cppeds.2023.101459.
Abstract
Quality improvement (QI) knowledge and skills are required at all levels of physician training. System improvement efforts need to include an understanding of health disparities and the design of interventions to reduce those disparities, so health equity needs to be integrated into QI education. The emphasis that payors, accreditation bodies, and health systems place on QI creates a need for QI curricula that meet the needs of diverse learners. This article presents a theoretical background and practical tools for designing, implementing, and evaluating a QI educational program across the spectrum of physician training, with an emphasis on competency-based education and a goal of continuous practice improvement. Practice-based learning and improvement and systems-based practice are two core domains of competencies for readiness to practice. These competencies can be met through the health systems science framework for studying improvement in patient care and health care delivery, coupled with QI science. Curricula should incorporate interactive learning of QI theory and principles as well as mentored, experiential QI project work with multidisciplinary teams. QI projects often develop ideas and implement changes but are inconsistent in studying intervention impacts or in reaching the level of patient outcomes. Curriculum design should incorporate adult learning principles, competency-based medical education, environmental and audience factors, and formats for content delivery. Key QI topics, how they fit into the clinical environment, and teaching resources are provided, as well as options for faculty development. Approaches to evaluation are presented, along with tools for assessing learners' beliefs and attitudes, knowledge and application of QI principles, project evaluation, and competency and curriculum evaluation. If the goal is to empower the next generation of change agents, there remains a need for development of scientific methodology and scholarly work, as well as faculty development and support by institutions.
Affiliation(s)
- Joanne Mendoza
- Department of Pediatrics, Eastern Virginia Medical School, Children's Hospital of The King's Daughters, Virginia, USA
- Elisa Hampton
- Department of Pediatrics, University of Virginia School of Medicine, University of Virginia Children's, Virginia, USA
- Lori Singleton
- Department of Pediatrics, Morehouse School of Medicine, Children's Healthcare of Atlanta, Georgia, USA
3. Levy KL, Grzyb K, Heidemann LA, Paliani DB, Grondin C, Solomon G, Spranger E, Ellies T, Ratz D, Houchens N. Enhancing Resident Education by Embedding Improvement Specialists Into a Quality and Safety Curriculum. J Grad Med Educ 2023; 15:348-355. PMID: 37363669; PMCID: PMC10286907; DOI: 10.4300/jgme-d-22-00456.1.
Abstract
Background Quality improvement and patient safety (QIPS) curricula are critical in graduate medical education, yet barriers limit the educational experience and project outcomes. Objective To explore the impact of QIPS curricular enhancements and the integration of continuous improvement specialists (CIS) by examining the A3 document, the primary project product and a surrogate for project quality. Methods Since 2009, University of Michigan internal medicine and medicine-pediatrics residents have participated in a QIPS curriculum, which includes a 4-week group project. In 2016, residency leaders collaborated with CIS staff, non-clinical experts in QIPS with backgrounds in engineering and business, to improve the curriculum. Informed by a needs assessment, the intervention was implemented in 2017 and consisted of a set of enhancements, including integration of CIS staff into groups as co-facilitators. In this retrospective cohort study, a blinded reviewer evaluated all available A3 documents before and after the intervention using a quantitative analysis tool. Results All residents participated in the curriculum during the pre-intervention (July 2009 to June 2016, n=351) and post-intervention (July 2017 to June 2020, n=148) periods. A total of 23 of 84 (27%) pre-intervention and 31 of 34 (91%) post-intervention A3 documents were available for review. Scores improved significantly for 17 of 23 (74%) A3 items and for 7 of 8 (88%) sections. Mean A3 total scores increased from 29.0 to 47.0 (95% CI 12.6-23.4; P<.001) out of a possible 69.0. Conclusions Embedding CIS experts into residency QIPS curricula is associated with improved A3 document quality.
Affiliation(s)
- Kathryn L. Levy
- Kathryn L. Levy, MD, is Assistant Professor, Departments of Internal Medicine and Pediatrics, and Associate Program Director, Internal Medicine and Pediatrics Residency, University of Michigan
- Katie Grzyb
- Katie Grzyb, BSE, MHSA, is Continuous Improvement Specialist, Department of Internal Medicine, University of Michigan
- Lauren A. Heidemann
- Lauren A. Heidemann, MD, MHPE, is Associate Professor, Department of Internal Medicine, University of Michigan and Veterans Affairs Ann Arbor Healthcare System
- Debra Burke Paliani
- Debra Burke Paliani, MSME, is Continuous Improvement Specialist, Quality Department, University of Michigan Health System
- Christopher Grondin
- Christopher Grondin, MD, is Assistant Professor, Department of Internal Medicine, University of Michigan and Veterans Affairs Ann Arbor Healthcare System
- Gabriel Solomon
- Gabriel Solomon, MD, is Assistant Professor, Department of Internal Medicine, University of Michigan and Veterans Affairs Ann Arbor Healthcare System
- Elizabeth Spranger
- Elizabeth Spranger, BA, is Continuous Improvement Specialist, Department of Internal Medicine, University of Michigan
- Tammy Ellies
- Tammy Ellies, MBA, PMP, is Continuous Improvement Specialist, Department of Internal Medicine, University of Michigan
- David Ratz
- David Ratz, MS, is Statistician, Center for Clinical Management Research, Veterans Affairs Ann Arbor Healthcare System
- Nathan Houchens
- Nathan Houchens, MD, is Associate Professor, Department of Internal Medicine, and Assistant Program Director, Internal Medicine Residency Program, University of Michigan and Veterans Affairs Ann Arbor Healthcare System
4. Struessel TS, Sleddens NM, Jones KJ. Quality Improvement Content in Physical Therapist Education: A Scoping Review. Phys Ther 2022; 102:6596552. PMID: 35648123; DOI: 10.1093/ptj/pzac012.
Abstract
OBJECTIVE The purpose of this study was to systematically review the literature on teaching quality improvement (QI) in physical therapist education, based on the Institute of Medicine's 6-element definition of QI, and to describe QI educational activities in physical therapist professional education curricula, their developmental stage, and their level of evaluation. METHODS Keywords related to physical therapist students and QI educational activities were used to search studies indexed in PubMed, CINAHL, and ERIC published from 2004 through November 2020. This search yielded 118 studies. After inclusion and exclusion criteria were applied, 13 studies underwent full-text review, conducted independently by 2 reviewers. The University of Toronto framework was used to assess developmental stage, and Kirkpatrick's taxonomy was used to assess the evaluation level of the 4 retained studies. RESULTS The scope of QI educational activities in the 4 retained studies was limited to 3 of the 6 elements of QI: identifying opportunities for improvement, designing and testing interventions, and identifying errors and hazards in care. None of the studies included educational activities to teach understanding and measuring quality of care. Three of the 4 studies spanned the first 2 stages of the University of Toronto framework (exposure and immersion); 1 study was limited to exposure. None of the studies assessed competence in QI methods. Evaluation of QI education was limited to Kirkpatrick levels 1 (reaction) or 2 (learning); none of the studies evaluated activities at level 3 (transfer of new behaviors) or level 4 (results). CONCLUSION Education in QI methods in professional physical therapist curricula may be limited in scope due to constraints in physical therapist education and the profession's strategic objective of differentiating itself from other professions. IMPACT Entry-level physical therapists might not be educated to fully participate in interprofessional teams that use QI methods to continuously improve the quality of patient-centered care.
Affiliation(s)
- Tamara S Struessel
- Department of Physical Medicine and Rehabilitation, University of Colorado Physical Therapy Program, Anschutz Medical Campus, Aurora, Colorado, USA
- Nicole M Sleddens
- Division of Physical Therapy Education, College of Allied Health Professions, University of Nebraska Medical Center, Omaha, Nebraska, USA
5. Mayo AL, Wong BM. Starting off on the right foot: providing timely feedback to learners in quality improvement education. BMJ Qual Saf 2021; 31:263-266. PMID: 34551994; DOI: 10.1136/bmjqs-2021-013251.
Affiliation(s)
- Amanda L Mayo
- Division of Physical Medicine and Rehabilitation, Department of Medicine, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada; Centre for Quality Improvement and Patient Safety (CQuIPS), Temerty Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
- Brian M Wong
- Centre for Quality Improvement and Patient Safety (CQuIPS), Temerty Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada; Division of General Internal Medicine, Department of Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario, Canada
6. Singh MK, Gullett HL, Thomas PA. Using Kern's 6-Step Approach to Integrate Health Systems Science Curricula Into Medical Education. Acad Med 2021; 96:1282-1290. PMID: 33951679; DOI: 10.1097/acm.0000000000004141.
Abstract
The term "health systems science" (HSS) has recently emerged as a unifying label for competencies in health care delivery and in population and community health. Despite strong evidence that HSS competencies are needed in the current and future health care workforce, the integration of HSS into medical education has so far been slow or fragmented, due in part to a lack of evidence that these curricula improve education or population outcomes. The recent COVID-19 pandemic and the national reckoning with racial inequities in the United States further highlight the time-sensitive imperative to integrate HSS content across the medical education continuum. While acknowledging challenges, the authors highlight the unique opportunities inherent in an HSS curriculum and present an elaborated curricular framework for incorporating health care delivery and population health into undergraduate medical education. This framework includes competencies previously left out of medical education, increases the scope of faculty development, and allows for evidence of effectiveness beyond traditional learner-centric metrics. The authors apply a widely adopted 6-step approach to curriculum development to address the unique challenges of incorporating HSS. Two examples, a module on quality improvement (health care delivery) and an introductory course on health equity (population and community health), illustrate how the 6-step approach can be used to build HSS curricula. The Supplemental Digital Appendix (at http://links.lww.com/ACADMED/B106) outlines this approach and provides specific examples and resources. Adapting these resources within local environments to build HSS curricula will allow medical educators to ensure future graduates have the expertise and commitment necessary to effect health systems change and to advocate for their communities, while also building the much-needed evidence for such curricula.
Affiliation(s)
- Mamta K Singh
- M.K. Singh is professor of medicine, Jerome Kowal, MD Designated Professor for Geriatric Health Education, Veterans Affairs Northeast Ohio Healthcare System, and former assistant dean, Health Systems Science, Case Western Reserve University School of Medicine, Cleveland, Ohio; ORCID: https://orcid.org/0000-0001-8235-4272
- Heidi L Gullett
- H.L. Gullett is associate professor and Charles Kent Smith, MD and Patricia Hughes Moore, MD Professor in Medical Student Education in Family Medicine, Center for Community Health Integration, Case Western Reserve University School of Medicine, Cleveland, Ohio; ORCID: https://orcid.org/0000-0002-3984-517X
- Patricia A Thomas
- P.A. Thomas was, when this was written, professor of medicine, Amasa B. Ford Professor of Geriatrics, and vice dean, Medical Education, Case Western Reserve University School of Medicine, Cleveland, Ohio; she is currently professor of medicine emerita, Johns Hopkins University School of Medicine, Baltimore, Maryland; ORCID: https://orcid.org/0000-0003-4528-9891
7. Myers JS, Kin JM, Billi JE, Burke KG, Harrison RV. Development and validation of an A3 problem-solving assessment tool and self-instructional package for teachers of quality improvement in healthcare. BMJ Qual Saf 2021; 31:287-296. PMID: 33771908; DOI: 10.1136/bmjqs-2020-012105.
Abstract
PURPOSE A3 problem solving is part of the Lean management approach to quality improvement (QI). However, few tools are available to assess A3 problem-solving skills. The authors sought to develop an assessment tool for problem-solving A3s with an accompanying self-instruction package and to test agreement in assessments made by individuals who teach A3 problem solving. METHODS After reviewing relevant literature, the authors developed an A3 assessment tool and self-instruction package over five improvement cycles. Lean experts and individuals from two institutions with QI proficiency and experience teaching QI provided iterative feedback on the materials. Tests of inter-rater agreement were conducted in cycles 3, 4, and 5. The final assessment tool was tested in a study involving 12 raters assessing 23 items on six A3s that were modified to enable testing a range of scores. RESULTS The intraclass correlation coefficient (ICC) for overall assessment of an A3 (rater's mean on 23 items per A3, compared across 12 raters and 6 A3s) was 0.89 (95% CI 0.75 to 0.98), indicating excellent reliability. For the 20 items with appreciable variation in scores across A3s, ICCs ranged from 0.41 to 0.97, indicating fair to excellent reliability. Raters from the two institutions scored items similarly (mean ratings of 2.10 and 2.13, p=0.57). Physicians provided marginally higher ratings than QI professionals (mean ratings of 2.17 and 2.00, p=0.003). On average, raters completed the self-instruction package in 1.5 hours and then rated the six A3s in 2.0 hours. CONCLUSION This study provides evidence of the reliability of a tool to assess healthcare QI project proposals that use the A3 problem-solving approach. The tool also demonstrated evidence of measurement, content, and construct validity. QI educators and practitioners can use the free online materials to assess learners' A3s, provide formative and summative feedback on QI project proposals, and enhance their teaching.
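Several entries in this list report inter-rater reliability as an intraclass correlation coefficient (ICC). For readers unfamiliar with the statistic, here is a minimal sketch of one common variant, ICC(2,1) (two-way random effects, absolute agreement, single rater), in Python; the ratings matrix is invented for illustration and is not data from any of the studies above.

```python
# Sketch of ICC(2,1) from a two-way ANOVA decomposition (Shrout & Fleiss).
# All numbers below are hypothetical.

def icc_2_1(ratings):
    """ratings: n subjects (rows) x k raters (columns)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(x for row in ratings for x in row) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)  # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)  # between raters
    ms_r = ss_rows / (n - 1)                                # subjects mean square
    ms_c = ss_cols / (k - 1)                                # raters mean square
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical scores: 4 project documents, each scored by 2 raters.
scores = [[1, 2], [3, 3], [5, 4], [7, 8]]
print(round(icc_2_1(scores), 3))  # → 0.944, i.e. near-perfect agreement
```

Values near 1 indicate that most score variance comes from real differences between the rated documents rather than from rater disagreement, which is the sense in which the study's ICC of 0.89 denotes "excellent reliability".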
Affiliation(s)
- Jennifer S Myers
- Medicine, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania, USA
- Jeanne M Kin
- Quality, Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- John E Billi
- Medicine and Learning Health Sciences, Michigan School of Medicine, University of Michigan, Ann Arbor, Michigan, USA; Health Management and Policy, School of Public Health, University of Michigan, Ann Arbor, Michigan, USA; Integrative Systems and Design, College of Engineering, University of Michigan, Ann Arbor, Michigan, USA
- Kathleen G Burke
- Biobehavioral Health, University of Pennsylvania School of Nursing, Philadelphia, Pennsylvania, USA; Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Richard Van Harrison
- Learning Health Sciences, University of Michigan Health System, Ann Arbor, Michigan, USA
8. Peiris-John R, Selak V, Robb G, Kool B, Wells S, Sadler L, Wise MR. The State of Quality Improvement Teaching in Medical Schools: A Systematic Review. J Surg Educ 2020; 77:889-904. PMID: 32057742; DOI: 10.1016/j.jsurg.2020.01.003.
Abstract
INTRODUCTION While teaching patient safety and quality improvement (QI) skills to medical students is endorsed as being important, best practice for achieving learner outcomes in QI is particularly unclear. We systematically reviewed QI curricula for medical students to identify approaches to QI training that are associated with positive learner outcomes. METHODS We searched databases (Medline, EMBASE, and Scopus) and article bibliographies for studies published from 2009 to 2018. Studies evaluating QI teaching for medical students in any setting and reporting learner outcomes were included. Educational content, teaching format, achievement of learning outcomes, and methodological features were abstracted. Outcomes assessed were learners' satisfaction, attitudes, knowledge and skills, changes in behavior and clinical processes, and benefits to patients. RESULTS Twenty of 25 curricula targeted medical students exclusively. Most curricula were well accepted by students (11/13 studies), increased their confidence in QI (9/11), and led to knowledge acquisition (17/20). Overall, positive learner outcomes (Kirkpatrick Levels 1 to 4A) were demonstrated across a range of curricular content and teaching modalities. In particular, 2 curricula demonstrated positive changes in learners' behavior (Kirkpatrick Level 3); both incorporated a clinical audit or QI project based in hospitals, supplemented by didactic lectures. Seven curricula were associated with improvements in processes of care (Kirkpatrick Level 4A), all of which were set in a clinical setting and supplemented by didactic lectures and/or small group sessions. None of the curricula evaluated patient benefits (Kirkpatrick Level 4B). CONCLUSIONS Whilst there is heterogeneity in educational content and teaching methods, most curricula were well accepted and led to learners' knowledge acquisition. Although there is limited evidence for the impact of QI curricula on learner behavior and benefit to patients, and for interprofessional QI curricula, teaching QI in the clinical setting leads to better learner outcomes, with location potentially serving as a surrogate for clinical experience.
Affiliation(s)
- Roshini Peiris-John
- Section of Epidemiology and Biostatistics, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand; Department of Obstetrics and Gynaecology, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
- Vanessa Selak
- Section of Epidemiology and Biostatistics, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
- Gillian Robb
- Section of Epidemiology and Biostatistics, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
- Bridget Kool
- Section of Epidemiology and Biostatistics, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
- Susan Wells
- Section of Epidemiology and Biostatistics, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
- Lynn Sadler
- Women's Health, Auckland City Hospital, Auckland, New Zealand
- Michelle R Wise
- Department of Obstetrics and Gynaecology, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
9. Quatrara B, Brashers V, Baernholdt M, Novicoff W, Schlag K, Haizlip J, Plews-Ogan M, Kennedy C. Enhancing interprofessional education through patient safety and quality improvement team-training: A pre-post evaluation. Nurse Educ Today 2019; 79:105-110. PMID: 31112845; DOI: 10.1016/j.nedt.2019.05.011.
Affiliation(s)
- Beth Quatrara
- University of Virginia School of Nursing and Center for Academic Strategic Partnerships for Interprofessional Research and Education, 225 Jeanette Lancaster Way, Charlottesville, VA 22903-3388, United States of America
- Valentina Brashers
- University of Virginia School of Nursing and Center for Academic Strategic Partnerships for Interprofessional Research and Education, 225 Jeanette Lancaster Way, Charlottesville, VA 22903-3388, United States of America
- Marianne Baernholdt
- Virginia Commonwealth University School of Nursing, Langston Center for Quality, Safety and Innovation, 1100 East Leigh Street, Richmond, VA 23298-0567, United States of America
- Wendy Novicoff
- Public Health Sciences at the University of Virginia, and Research in Quality and Patient Safety, 100 Hospital Drive, Charlottesville, VA 22908, United States of America
- Katherine Schlag
- Department of Medicine Quality Program, University of Virginia School of Medicine, 100 Hospital Drive, Charlottesville, VA 22908, United States of America
- Julie Haizlip
- University of Virginia School of Nursing, 225 Jeanette Lancaster Way, Charlottesville, VA 22903-3388, United States of America; University of Virginia School of Medicine and Center for Academic Strategic Partnerships for Interprofessional Research and Education, 100 Hospital Drive, Charlottesville, VA 22908, United States of America
- Margaret Plews-Ogan
- Department of General Medicine at the University of Virginia School of Medicine, 100 Hospital Drive, Charlottesville, VA 22908, United States of America
- Christine Kennedy
- Academic Programs at the University of Virginia School of Nursing, 225 Jeanette Lancaster Way, Charlottesville, VA 22903-3388, United States of America
10. Steele EM, Butcher R, Carluzzo KL, Watts BV. Development of a Tool to Assess Trainees’ Ability to Design and Conduct Quality Improvement Projects. Am J Med Qual 2019; 35:125-132. DOI: 10.1177/1062860619853880.
Abstract
An important competency for residents developing skills in quality improvement (QI) and patient safety (PS) is to independently carry out an improvement project. The authors describe the development and reliability testing of the Quality Improvement Project Evaluation Rubric (QIPER) for use in rating project presentations in the Department of Veterans Affairs Chief Resident in Quality and Safety Program. QIPER contains 19 items across 6 domains to assess competence in designing, implementing, analyzing results of, and reporting on a QI/PS project. Interrater reliability of the instrument was calculated using the intraclass correlation coefficient (ICC). QIPER scores ranged from 28 to 72 out of a possible 76. QIPER demonstrated good reliability overall (ICC = 0.63). Although further testing is warranted, QIPER shows promise as a tool to assess a comprehensive set of skills involved in conducting QI/PS projects; it is sensitive enough to detect varied competence and useful for providing learner feedback.
11. Brown A, Nidumolu A, McConnell M, Hecker K, Grierson L. Development and psychometric evaluation of an instrument to measure knowledge, skills, and attitudes towards quality improvement in health professions education: The Beliefs, Attitudes, Skills, and Confidence in Quality Improvement (BASiC-QI) Scale. Perspect Med Educ 2019; 8:167-176. PMID: 31098982; PMCID: PMC6565662; DOI: 10.1007/s40037-019-0511-8.
Abstract
INTRODUCTION Health professionals are increasingly expected to foster and lead initiatives to improve the quality and safety of healthcare. Consequently, health professions education has begun to integrate formal quality improvement (QI) training into curricula. Few instruments in the literature adequately and reliably assess QI-related competencies in learners without requiring multiple, trained raters in the context of healthcare. This paper describes the development and psychometric evaluation of the Beliefs, Attitudes, Skills, and Confidence in Quality Improvement (BASiC-QI) instrument, a 30-item self-assessment tool designed to assess knowledge, skills, and attitudes towards QI. METHODS Sixty first-year medical students completed the BASiC-QI and the Quality Improvement Knowledge Application Tool (QIKAT-R) prior to and immediately following a QI program that challenged learners to engage with QI concepts in the context of their own medical education. Measurement properties of the BASiC-QI tool were explored through an exploratory factor analysis and a generalizability study. Convergent validity was examined through correlations between BASiC-QI and QIKAT-R scores. RESULTS Psychometric evaluation of BASiC-QI indicated reliability and validity evidence based on internal structure. Analyses also revealed that BASiC-QI scores were positively correlated with scores from the QIKAT-R, which stands as an indicator of convergent validity. CONCLUSION BASiC-QI is a multidimensional self-assessment tool that may be used to assess beliefs, attitudes, skills, and confidence towards QI. In comparison with existing instruments, BASiC-QI does not require multiple raters or scoring rubrics, serving as an efficient, reliable assessment instrument for educators to examine the impact of QI curricula on learners.
Affiliation(s)
- Allison Brown
- Michael G. DeGroote School of Medicine, McMaster University, Hamilton, ON, Canada; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON, Canada
- Aditya Nidumolu
- Michael G. DeGroote School of Medicine, McMaster University, Hamilton, ON, Canada
- Meghan McConnell
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, ON, Canada
- Kent Hecker
- Department of Community Health Sciences, University of Calgary, Calgary, AB, Canada
- Lawrence Grierson
- Michael G. DeGroote School of Medicine, McMaster University, Hamilton, ON, Canada; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON, Canada; Department of Family Medicine, McMaster University, Hamilton, ON, Canada; McMaster Education Research, Innovation & Theory, McMaster University, Hamilton, ON, Canada
12
|
Shaheen AW, Fedoriw KB, Khachaturyan S, Steiner B, Golding J, Byerley JS, Helgeson ES, Beck Dallaghan GL. Students Adding Value: Improving Patient Care Measures While Learning Valuable Population Health Skills. Am J Med Qual 2019; 35:70-78. [PMID: 31055936 DOI: 10.1177/1062860619845482] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Medical students are potential resources for ambulatory primary care practices if learning goals can align with clinical needs. The authors introduced a quality improvement (QI) curriculum in the ambulatory clinical rotation that matched student learning expectations with practice needs. In 2016-2017, 128 students were assigned to academic, university-affiliated, community health, and private practices. Student project measures were matched with appropriate outcome measures on monthly practice dashboards. Binomial mixed effects models were used to model QI measures. For university collaborative practices with student involvement, the estimated odds of a patient being screened for breast cancer in March 2017 were approximately 2 times greater than in 2016. This odds ratio was 36.2% greater than the comparable odds ratio for collaborative practices without student involvement (95% confidence interval = 22.7% to 51.2% greater). When student curriculum and assignments align with practice needs, practice metrics improve and students contribute to improvements in real-world settings.
|
13
|
Bowe SN, McCormick ME. Resident and Fellow Engagement in Safety and Quality. Otolaryngol Clin North Am 2019; 52:55-62. [DOI: 10.1016/j.otc.2018.08.010] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/28/2022]
|
14
|
Jamal N, Bowe SN, Brenner MJ, Balakrishnan K, Bent JP. Impact of a Formal Patient Safety and Quality Improvement Curriculum: A Prospective, Controlled Trial. Laryngoscope 2018; 129:1100-1106. [DOI: 10.1002/lary.27527] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/31/2018] [Indexed: 11/09/2022]
Affiliation(s)
- Nausheen Jamal
- Department of Otolaryngology–Head and Neck Surgery, Lewis Katz School of Medicine at Temple University, Philadelphia, Pennsylvania
- Sarah N. Bowe
- Department of Otolaryngology–Head and Neck Surgery, San Antonio Uniformed Services Health Education Consortium (SAUSHEC), Ft. Sam Houston, TX
- Michael J. Brenner
- Department of Otolaryngology–Head and Neck Surgery, University of Michigan School of Medicine, Ann Arbor, Michigan
- Karthik Balakrishnan
- Mayo Clinic Children's Center and Department of Otorhinolaryngology, Mayo Clinic, Rochester, Minnesota
- John P. Bent
- Department of Otorhinolaryngology–Head and Neck Surgery, Albert Einstein College of Medicine at Montefiore Medical Center, Bronx, New York, USA
|
15
|
Turrentine FE, Hanks JB, Tracci MC, Jones RS, Schirmer BD, Smith PW. Resident-Specific Morbidity Reduced Following ACS NSQIP Data-Driven Quality Program. JOURNAL OF SURGICAL EDUCATION 2018; 75:1558-1565. [PMID: 29674110 DOI: 10.1016/j.jsurg.2018.04.001] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/03/2017] [Revised: 03/30/2018] [Accepted: 04/01/2018] [Indexed: 06/08/2023]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education Milestone Project for general surgery provided a more robust method for developing and tracking residents' competence. This framework enhanced systematic and progressive development of residents' competencies in surgical quality improvement. STUDY DESIGN A 22-month interactive, educational program based on resident-specific surgical outcomes data culminated in a quality improvement project for postgraduate year 4 surgery residents. Self-assessment, quality knowledge test, and resident-specific American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) Quality In-Training Initiative morbidity were compared before and after the intervention. RESULTS Quality In-Training Initiative morbidity decreased from 25% (82/325) to 18% (93/517), p = 0.015, despite residents performing more complex cases. All participants achieved level 4 competency (4/4) within the improvement of care, practice-based learning and improvement competency of the general surgery milestones. Institutional ACS NSQIP general surgery morbidity improved from the ninth to the sixth decile. Quality assessment and improvement self-assessment postintervention scores (M = 23.80, SD = 4.97) were not significantly higher than preintervention scores (M = 19.20, SD = 5.26), p = 0.061. Quality Improvement Knowledge Application Tool postintervention test scores (M = 17.4, SD = 4.88) were not significantly higher than pretest scores (M = 13.2, SD = 1.92), p = 0.12. CONCLUSION Sharing validated resident-specific clinical data with participants was associated with improved surgical outcomes. Participating fourth-year surgical residents achieved the highest score, level 4, in the practice-based learning and improvement competency of the improvement of care practice domain, and observed significantly reduced surgical morbidity for cases in which they participated.
Affiliation(s)
- Florence E Turrentine
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia.
- John B Hanks
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia
- Megan C Tracci
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia
- R Scott Jones
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia
- Bruce D Schirmer
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia
- Philip W Smith
- Department of Surgery, School of Medicine, University of Virginia, Charlottesville, Virginia
|
16
|
Brenner M, Cramer J, Cohen S, Balakrishnan K. Leveraging Quality Improvement and Patient Safety Initiatives to Enhance Value and Patient-Centered Care in Otolaryngology. CURRENT OTORHINOLARYNGOLOGY REPORTS 2018. [DOI: 10.1007/s40136-018-0209-1] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
|
17
|
Rosenbluth G. Development of a Multi-Domain Assessment Tool for Quality Improvement Projects. J Grad Med Educ 2017; 9:473-478. [PMID: 28824761 PMCID: PMC5559243 DOI: 10.4300/jgme-d-17-00041.1] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/17/2017] [Revised: 04/06/2017] [Accepted: 04/26/2017] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Improving the quality of health care and education has become a mandate at all levels within the medical profession. While several published quality improvement (QI) assessment tools exist, all have limitations in addressing the range of QI projects undertaken by learners in undergraduate medical education, graduate medical education, and continuing medical education. OBJECTIVE We developed and validated a tool to assess QI projects with learner engagement across the educational continuum. METHODS After reviewing existing tools, we interviewed local faculty who taught QI to understand how learners were engaged and what these faculty wanted in an ideal assessment tool. We then developed a list of competencies associated with QI, established items linked to these competencies, revised the items using an iterative process, and collected validity evidence for the tool. RESULTS The resulting Multi-Domain Assessment of Quality Improvement Projects (MAQIP) rating tool contains 9 items, with criteria that may be completely fulfilled, partially fulfilled, or not fulfilled. Interrater reliability was 0.77. Untrained local faculty were able to use the tool with minimal guidance. CONCLUSIONS The MAQIP is a 9-item, user-friendly tool that can be used to assess QI projects at various stages and to provide formative and summative feedback to learners at all levels.
|
18
|
McNab D, McKay J, Bowie P. Quality improvement training for core medical and general practice trainees: a pilot study of project participation, completion and journal publication. Scott Med J 2015; 60:208-13. [DOI: 10.1177/0036933015606586] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Background and objectives Small-scale quality improvement projects are expected to make a significant contribution towards improving the quality of healthcare. Enabling doctors-in-training to design and lead quality improvement projects is important preparation for independent practice. Participation is mandatory in speciality training curricula. However, provision of training and ongoing support in quality improvement methods and practice is variable. We aimed to design and deliver a quality improvement training package to core medical and general practice specialty trainees and evaluate impact in terms of project participation, completion and publication in a healthcare journal. Method A quality improvement training package was developed and delivered to core medical trainees and general practice specialty trainees in the west of Scotland encompassing a 1-day workshop and mentoring during completion of a quality improvement project over 3 months. A mixed methods evaluation was undertaken and data collected via questionnaire surveys, knowledge assessment, and formative assessment of project proposals, completed quality improvement projects and publication success. Results Twenty-three participants attended the training day with 20 submitting a project proposal (87%). Ten completed quality improvement projects (43%), eight were judged as satisfactory (35%), and four were submitted and accepted for journal publication (17%). Knowledge and confidence in aspects of quality improvement improved during the pilot, while early feedback on project proposals was valued (85.7%). Conclusion This small study reports modest success in training core medical trainees and general practice specialty trainees in quality improvement. Many gained knowledge of, confidence in and experience of quality improvement, while journal publication was shown to be possible. The development of educational resources to aid quality improvement project completion and mentoring support is necessary if expectations for quality improvement are to be realised.
Affiliation(s)
- Duncan McNab
- Medical Directorate, NHS Education for Scotland, Glasgow, UK
- John McKay
- Medical Directorate, NHS Education for Scotland, Glasgow, UK
- Paul Bowie
- Medical Directorate, NHS Education for Scotland, Glasgow, UK
- Institute of Health and Wellbeing, University of Glasgow, UK
|
20
|
McCormick ME, Stadler ME, Shah RK. Embedding Quality and Safety in Otolaryngology–Head and Neck Surgery Education. Otolaryngol Head Neck Surg 2014; 152:778-82. [DOI: 10.1177/0194599814561601] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2014] [Accepted: 11/06/2014] [Indexed: 11/17/2022]
Abstract
Education in patient safety (PS) and quality improvement (QI) helps both medical students and residents understand the health care environment in the United States, where these concepts are now incorporated into virtually every aspect of patient care. The Accreditation Council of Graduate Medical Education has made PS/QI a mandatory component of resident education, and a number of specialties have published their experiences with incorporating PS/QI into their training programs. In otolaryngology–head and neck surgery, a strong curriculum can be built by teaching residents about the principles of PS/QI through both didactic and experiential learning, and morbidity and mortality and QI conferences can serve as the cornerstone of this curriculum. Understanding the potential challenges in PS/QI education can allow training programs to plan their strategy effectively for successful incorporation into their existing curricula.
Affiliation(s)
- Michael E. McCormick
- Department of Otolaryngology and Communication Sciences, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Michael E. Stadler
- Department of Otolaryngology and Communication Sciences, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Rahul K. Shah
- Division of Pediatric Otolaryngology, Children’s National Medical Center, George Washington University, Washington, DC, USA
|
21
|
Wittich CM, Reed DA, Ting HH, Berger RA, Nowicki KM, Blachman MJ, Mandrekar JN, Beckman TJ. Measuring reflection on participation in quality improvement activities for maintenance of certification. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2014; 89:1392-1397. [PMID: 24892403 DOI: 10.1097/acm.0000000000000323] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
PURPOSE To validate a measure of reflection on participation in quality improvement (QI) activities and to identify associations with characteristics of QI projects, participants, and teams. METHOD This was a prospective validation study of all Mayo Clinic team participants who submitted QI projects for maintenance of certification (MOC) credit from 2010 to 2012. The authors developed a measure of reflection on participation in QI activities and explored associations between participants' overall reflection scores and characteristics of projects, participants, and teams. RESULTS A total of 922 participants (567 physicians) on 118 teams completed QI projects and reflections. Factor analysis revealed a two-dimensional model with good internal consistency reliabilities (Cronbach alpha) for high (0.85) and low (0.81) reflection. Reflection scores (mean [standard deviation]) were associated with projects that changed practice (yes: 4.30 [0.51]; no: 3.71 [0.57]; P < .0001), changed the health care system (yes: 4.25 [0.54]; no: 4.03 [0.62]; P < .0001), and impacted patient safety (P < .0001). Physicians' reflection scores (4.27 [0.57]) were higher than support staff scores (4.07 [0.55]; P = .0005). A positive association existed between reflection scores and the number of QI roles per participant (P < .0001). There were no associations with participant gender, team size, or team diversity. CONCLUSIONS The authors identified associations between participant reflection and the impact of QI projects, participants' professional roles, and participants' involvement with projects. With further study, the authors anticipate that the new measure of reflection will be useful for determining meaningful engagement in MOC.
Affiliation(s)
- Christopher M Wittich
- Dr. Wittich is associate professor of medicine, Department of Internal Medicine, Division of General Internal Medicine, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Dr. Reed is associate professor of medicine, Department of Internal Medicine, Division of Primary Care Internal Medicine, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Dr. Ting is professor of medicine and associate dean of continuous professional development, Department of Internal Medicine, Division of Cardiovascular Diseases, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Dr. Berger is professor of orthopedics and dean of continuous professional development, Department of Orthopedic Surgery, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Ms. Nowicki is administrator, Mayo Clinic School of Continuous Professional Development, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Dr. Blachman is clinical professor and associate dean of continuous professional development and strategic affairs, University of South Carolina School of Medicine, Columbia, South Carolina. Dr. Mandrekar is professor of biostatistics, Department of Health Sciences Research, Division of Biomedical Statistics and Informatics, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota. Dr. Beckman is professor of medicine and medical education, Department of Internal Medicine, Division of General Internal Medicine, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota
|
22
|
Singh MK, Ogrinc G, Cox KR, Dolansky M, Brandt J, Morrison LJ, Harwood B, Petroski G, West A, Headrick LA. The Quality Improvement Knowledge Application Tool Revised (QIKAT-R). ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2014; 89:1386-91. [PMID: 25119555 DOI: 10.1097/acm.0000000000000456] [Citation(s) in RCA: 88] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/16/2023]
Abstract
PURPOSE Quality improvement (QI) has been part of medical education for over a decade. Assessment of QI learning remains challenging. The Quality Improvement Knowledge Application Tool (QIKAT), developed a decade ago, is widely used despite its subjective nature and inconsistent reliability. From 2009 to 2012, the authors developed and assessed the validation of a revised QIKAT, the "QIKAT-R." METHOD Phase 1: Using an iterative, consensus-building process, a national group of QI educators developed a scoring rubric with defined language and elements. Phase 2: Five scorers pilot tested the QIKAT-R to assess validity and inter- and intrarater reliability using responses to four scenarios, each with three different levels of response quality: "excellent," "fair," and "poor." Phase 3: Eighteen scorers from three countries used the QIKAT-R to assess the same sets of student responses. RESULTS Phase 1: The QI educators developed a nine-point scale that uses dichotomous answers (yes/no) for each of three QIKAT-R subsections: Aim, Measure, and Change. Phase 2: The QIKAT-R showed strong discrimination between "poor" and "excellent" responses, and the intra- and interrater reliability were strong. Phase 3: The discriminative validity of the instrument remained strong between excellent and poor responses. The intraclass correlation was 0.66 for the total nine-point scale. CONCLUSIONS The QIKAT-R is a user-friendly instrument that maintains the content and construct validity of the original QIKAT but provides greatly improved interrater reliability. The clarity within the key subsections aligns the assessment closely with QI knowledge application for students and residents.
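The scoring structure described above (three subsections, dichotomous yes/no items, a nine-point total) can be sketched as follows. The subsection names Aim, Measure, and Change and the nine-point dichotomous total come from the abstract; the individual item labels below are hypothetical placeholders for illustration, not the published rubric wording.

```python
# Sketch of a QIKAT-R-style tally: dichotomous (yes/no) items under each of
# three subsections (Aim, Measure, Change), summing to a 0-9 total.
# Item labels are hypothetical, not the actual rubric language.
RUBRIC = {
    "Aim": ["specific", "measurable", "system-focused"],
    "Measure": ["relevant to aim", "feasible", "interpretable"],
    "Change": ["linked to aim", "specific", "feasible"],
}

def qikat_r_score(ratings):
    """ratings: {subsection: {item: bool}}; returns the 0-9 total."""
    return sum(
        int(ratings[sub][item])
        for sub, items in RUBRIC.items()
        for item in items
    )

# One hypothetical student response scored by a rater:
example = {
    "Aim": {"specific": True, "measurable": True, "system-focused": False},
    "Measure": {"relevant to aim": True, "feasible": True, "interpretable": True},
    "Change": {"linked to aim": True, "specific": False, "feasible": True},
}
print(qikat_r_score(example))  # 7 of 9
```

Because each item is a yes/no judgment against defined language rather than a subjective scale, two raters applying the same rubric tend to agree more often, which is consistent with the improved interrater reliability the abstract reports.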
Affiliation(s)
- Mamta K Singh
- Dr. Singh is associate professor of medicine, Division of General Medicine, Louis Stokes Veterans Affairs Medical Center, Case Western Reserve University, Cleveland, Ohio. Dr. Ogrinc is associate professor of community and family medicine and of medicine, VA Medical Center, White River Junction, Vermont, and Geisel School of Medicine, Hanover, New Hampshire. Dr. Cox is manager, Quality Improvement, Office of Clinical Effectiveness, University of Missouri Health Care, Columbia, Missouri. Dr. Dolansky is associate professor, Frances Payne Bolton School of Nursing, Case Western Reserve University, Cleveland, Ohio. Dr. Brandt is associate director of quality improvement, School of Medicine, University of Missouri, Columbia, Missouri. Dr. Morrison is currently director of palliative medicine education, Department of Medicine, Yale University School of Medicine, New Haven, Connecticut, but was at Baylor College of Medicine in the Division of Geriatrics at the time of this study. Ms. Harwood is research associate, Geisel School of Medicine, Hanover, New Hampshire. Dr. Petroski is assistant professor of biostatistics, School of Medicine, University of Missouri, Columbia, Missouri. Dr. West is biostatistician, Department of Veterans Affairs, VA Medical Center, White River Junction, Vermont. Dr. Headrick is senior associate dean for education and professor of medicine, School of Medicine, University of Missouri, Columbia, Missouri
|
23
|
Medbery RL, Sellers MM, Ko CY, Kelz RR. The unmet need for a national surgical quality improvement curriculum: a systematic review. JOURNAL OF SURGICAL EDUCATION 2014; 71:613-631. [PMID: 24813341 DOI: 10.1016/j.jsurg.2013.12.004] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/05/2013] [Revised: 12/15/2013] [Accepted: 12/19/2013] [Indexed: 06/03/2023]
Abstract
INTRODUCTION The Accreditation Council for Graduate Medical Education Next Accreditation System will require general surgery training programs to demonstrate outstanding clinical outcomes and education in quality improvement (QI). The American College of Surgeons-National Surgical Quality Improvement Project Quality In-Training Initiative reports the results of a systematic review of the literature investigating the availability of a QI curriculum. METHODS Using defined search terms, a systematic review was conducted in Embase, PubMed, and Google Scholar (January 2000-March 2013) to identify a surgical QI curriculum. Bibliographies from selected articles and other relevant materials were also hand searched. Curriculum was defined as an organized program of learning complete with content, instruction, and assessment for use in general surgical residency programs. Two independent observers graded surgical articles on quality of curriculum presented. RESULTS Overall, 50 of 1155 references had information regarding QI in graduate medical education. Most (n = 24, 48%) described QI education efforts in nonsurgical fields. A total of 31 curricular blueprints were identified; 6 (19.4%) were specific to surgery. Targeted learners were most often post graduate year-2 residents (29.0%); only 6 curricula (19.4%) outlined a course for all residents within their respective programs. Plan, Do, Study, Act (n = 10, 32.3%), and Root Cause Analysis (n = 5, 16.1%) were the most common QI content presented, the majority of instruction was via lecture/didactics (n = 26, 83.9%), and only 7 (22.6%) curricula used validated tool kits for assessment. CONCLUSION Elements of QI curriculum for surgical education exist; however, comprehensive content is lacking. The American College of Surgeons-National Surgical Quality Improvement Project Quality In-Training Initiative will build on the high-quality components identified in our review and develop data-centered QI content to generate a comprehensive national QI curriculum for use in graduate surgical education.
Affiliation(s)
- Rachel L Medbery
- Department of Surgery, Emory University School of Medicine, Atlanta, Georgia
- Morgan M Sellers
- Department of Surgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Clifford Y Ko
- Division of Research and Optimal Patient Care, American College of Surgeons, Chicago, Illinois
- Rachel R Kelz
- Department of Surgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
|
24
|
Martinez J, Phillips E, Harris C. Where do we go from here? Moving from systems-based practice process measures to true competency via developmental milestones. MEDICAL EDUCATION ONLINE 2014; 19:24441. [PMID: 24974832 PMCID: PMC4074604 DOI: 10.3402/meo.v19.24441] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/25/2014] [Revised: 05/28/2014] [Accepted: 05/28/2014] [Indexed: 05/28/2023]
Abstract
For many educators it has been challenging to meet the Accreditation Council for Graduate Medical Education's requirements for teaching systems-based practice (SBP). An additional layer of complexity for educators is evaluating competency in SBP, despite milestones and entrustable professional activities (EPAs). In order to address this challenge, the authors present the results of a literature review of how SBP is currently being taught and a series of recommendations on how to achieve competency in SBP for graduate medical trainees with the use of milestones. The literature review included 29 articles and demonstrated that only 28% of the articles taught more than one of the six core principles of SBP in a meaningful way. Only 7% of the articles received the highest grade of A. The authors summarize four guiding principles for creating a competency-based curriculum that is in alignment with the Next Accreditation System (NAS): 1) the curriculum needs to include all of the core principles in that competency, 2) the objectives of the curriculum should be driven by clinical outcomes, 3) the teaching modalities need to be interactive and clinically relevant, and 4) the evaluation process should be able to measure competency and be directly reflective of pertinent milestones and/or EPAs. This literature review and the provided guiding principles can guide other residency educators in their development of competency-based curricula that meet the standards of the NAS.
Affiliation(s)
- Johanna Martinez
- Department of Medicine, Weill Medical College of Cornell University, New York, NY, USA
- Erica Phillips
- Department of Medicine, Weill Medical College of Cornell University, New York, NY, USA
- Christina Harris
- David Geffen School of Medicine, University of California, Los Angeles, CA, USA
|
25
|
Glissmeyer EW, Ziniel SI, Moses J. Use of the Quality Improvement (QI) Knowledge Application Tool in Assessing Pediatric Resident QI Education. J Grad Med Educ 2014; 6:284-91. [PMID: 24949133 PMCID: PMC4054728 DOI: 10.4300/jgme-d-13-00221.1] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/13/2013] [Revised: 11/18/2013] [Accepted: 01/12/2014] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Assessing the effectiveness of quality improvement curricula is important to improving this area of resident education. OBJECTIVE To assess the ability of the Quality Improvement Knowledge Application Tool (QIKAT) to differentiate between residents who were provided instruction in QI and those who were not, when scored by individuals not involved in designing the QIKAT, its scoring rubric, or QI curriculum instruction. METHODS The QIKAT and a 9-item self-assessment of QI proficiency were administered to an intervention and a control group. The intervention was a longitudinal curriculum consisting of 8 hours of didactic QI training and 6 workshops providing just-in-time training for resident QI projects. Two uninvolved faculty scored the QIKAT. RESULTS A total of 33 residents in the intervention group and 27 in the control group completed the baseline and postcurriculum QIKAT and self-assessment. QIKAT mean intervention group scores were significantly higher than mean control group scores postcurriculum (P < .001). Absolute QIKAT differences were small (on a 15-point scale, the intervention group's mean score improved from 12.8 to 13.2). Interrater agreement as measured by kappa test was low (0.09). Baseline self-assessment showed no differences, and after instruction, the intervention group felt more proficient in QI knowledge than controls in 4 of 9 domains tested. CONCLUSIONS The QIKAT detected a statistically significant improvement postintervention, but the absolute differences were small. Self-reported gain in QI knowledge and proficiency agreed with the results of the QIKAT. However, QIKAT limitations include poor interrater agreement and a scoring rubric that lacks specificity. Programs considering using QIKAT to assess curricula should understand these limitations.
|
26
|
Vitek CR, Dale JC, Homburger HA, Bryant SC, Saenger AK, Karon BS. Development and initial validation of a project-based rubric to assess the systems-based practice competency of residents in the clinical chemistry rotation of a pathology residency. Arch Pathol Lab Med 2014; 138:809-13. [PMID: 24878020 DOI: 10.5858/arpa.2013-0046-oa] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
CONTEXT Systems-based practice (SBP) is 1 of 6 core competencies required in all resident training programs accredited by the Accreditation Council for Graduate Medical Education. Reliable methods of assessing resident competency in SBP have not been described in the medical literature. OBJECTIVE To develop and validate an analytic grading rubric to assess pathology residents' analyses of SBP problems in clinical chemistry. DESIGN Residents were assigned an SBP project based upon unmet clinical needs in the clinical chemistry laboratories. Using an iterative method, we created an analytic grading rubric based on critical thinking principles. Four faculty raters used the SBP project evaluation rubric to independently grade 11 residents' projects during their clinical chemistry rotations. Interrater reliability and Cronbach α were calculated to determine the reliability and validity of the rubric. Project mean scores and range were also assessed to determine whether the rubric differentiated resident critical thinking skills related to the SBP projects. RESULTS Overall project scores ranged from 6.56 to 16.50 out of a possible 20 points. Cronbach α ranged from 0.91 to 0.96, indicating that the 4 rubric categories were internally consistent without significant overlap. Intraclass correlation coefficients ranged from 0.63 to 0.81, indicating moderate to strong interrater reliability. CONCLUSIONS We report development and statistical analysis of a novel SBP project evaluation rubric. The results indicate the rubric can be used to reliably assess pathology residents' critical thinking skills in SBP.
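Several instruments in this listing report Cronbach's alpha as their internal-consistency statistic (0.91 to 0.96 for the rubric above). As a minimal sketch of how that statistic is computed, using made-up rubric scores rather than any study's data:

```python
# Cronbach's alpha: the ratio-based internal-consistency statistic reported
# for multi-item rubrics. The scores below are toy illustration data only.
def cronbach_alpha(items):
    """items: one list of scores per item, aligned across the same respondents.
    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(it) for it in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Four rubric items scored for six hypothetical respondents:
scores = [
    [3, 4, 4, 2, 5, 3],
    [3, 5, 4, 2, 4, 3],
    [2, 4, 5, 3, 5, 2],
    [3, 4, 4, 2, 4, 3],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # high for this toy data, since the items move together
```

Values near 1 indicate that rubric categories rise and fall together across respondents; the abstract's caveat that its categories are "internally consistent without significant overlap" reflects exactly this trade-off.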
Affiliation(s)
- Carolyn R Vitek
- From the Center for Individualized Medicine (Ms Vitek), Emeritus Faculty (Drs Dale and Homburger), the Division of Biostatistics and Informatics (Ms Bryant), and the Department of Laboratory Medicine and Pathology (Drs Saenger and Karon), Mayo Clinic, Rochester, Minnesota
|
27
|
Mann KJ, Craig MS, Moses JM. Quality improvement educational practices in pediatric residency programs: survey of pediatric program directors. Acad Pediatr 2014; 14:23-8. [PMID: 24369866 DOI: 10.1016/j.acap.2012.11.003] [Citation(s) in RCA: 40] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/07/2012] [Revised: 11/05/2012] [Accepted: 11/15/2012] [Indexed: 01/12/2023]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education requires residents to learn quality improvement (QI) methods to analyze, change, and improve their practice. Little is known about how pediatric residency programs design, implement, and evaluate QI curricula to achieve this goal. We sought to describe current QI educational practices, evaluation methods, and program director perceptions through a national survey. METHODS A survey of QI curricula was developed, pilot tested, approved by the Association of Pediatric Program Directors (APPD), and distributed to pediatric program directors. Descriptive statistics were used to analyze the data. RESULTS The response rate was 53% (104 of 197). Most respondents reported the presence of a QI curriculum (85%, 88 of 104), including didactic sessions (83%) and resident QI projects (88%). Continuous process improvement was the most common methodology addressed (65%). The most frequently taught topics were "Making a Case for QI" (68%), "PDSA [plan-do-study-act] Cycles" (66%), and "Measurement in QI" (60%). Projects were most frequently designed to improve clinical care (90%), hospital operations (65%), and the residency (61%). Only 35% evaluated patient outcomes, and 17% had no formal evaluation. Programs had a mean of 6 faculty members (standard deviation 4.4, range 2-20) involved in teaching residents QI. Programs with more faculty involved were more likely to have had a resident submit an abstract about their QI project to a professional meeting (<5 faculty, 38%; 5-9, 64%; >9, 92%; P = .003). Barriers to teaching QI included time (66%), funding constraints (39%), and absent local QI expertise (33%). Most pediatric program directors (65%) believed that resident input in hospital QI was important, but only 24% reported resident involvement. Critical factors for success included an experiential component (56%) and faculty with QI expertise (50%). CONCLUSIONS QI curricular practices vary greatly across pediatric residency programs. Although pediatric residency programs commit a fair number of resources to QI education and believe that resident involvement in QI is important, fundamental QI topics are overlooked in many programs, and evaluation of existing curricula is limited. Success as perceived by pediatric program directors appears to be related to the inclusion of a QI project and the availability of faculty mentors.
Affiliation(s)
- Keith J Mann
- Children's Mercy Hospitals and Clinics, University of Missouri-Kansas City School of Medicine, MO.
- Mark S Craig
- Department of Pediatrics, University of Rochester, Rochester, NY
- James M Moses
- Department of Pediatrics, Boston University School of Medicine, Boston, Mass
28
Craig MS, Garfunkel LC, Baldwin CD, Mann KJ, Moses JM, Co JPT, Blumkin AK, Szilagyi PG. Pediatric resident education in quality improvement (QI): a national survey. Acad Pediatr 2014; 14:54-61. [PMID: 24369869 DOI: 10.1016/j.acap.2013.10.004]
Abstract
OBJECTIVE To assess pediatric residents' perceptions of their quality improvement (QI) education and training, including factors that facilitate learning QI and self-efficacy in QI activities. METHODS A 22-question survey was developed with expert-identified key topics and iterative pretesting of questions. Third-year pediatric residents were recruited from 45 residency programs, drawn from a random sample of 120 programs. Data were analyzed by descriptive statistics, chi-square tests, and qualitative content analysis. RESULTS Respondents included 331 residents, for a response rate of 47%. Demographic characteristics resembled the national profile of pediatric residents. Over 70% of residents reported that their QI training was well organized and met their needs. Three quarters felt ready to use QI methods in practice. Those with QI training before residency were significantly more confident than those without prior QI training. However, fewer than half of respondents used standard QI methods such as PDSA cycles and run charts in projects. Residents identified faculty support, a structured curriculum, hands-on projects, and dedicated project time as key strengths of their QI educational experiences. A strong QI culture was also considered important and was reported to be present in most programs sampled. CONCLUSIONS Overall, third-year pediatric residents reported positive QI educational experiences with strong faculty support and sufficient time for QI projects. However, a third of residents thought that the QI curricula in their programs needed improvement, and a quarter lacked self-efficacy in conducting future QI activities. Continuing curricular improvement, including faculty development, is warranted.
Affiliation(s)
- Mark S Craig
- Department of Pediatrics, University of Rochester Medical Center, Rochester, NY; Department of Pediatrics, Madigan Army Medical Center, Tacoma, Wash.
- Lynn C Garfunkel
- Department of Pediatrics, University of Rochester School of Medicine and Dentistry, and Rochester General Hospital, Rochester, NY
- Constance D Baldwin
- Department of Pediatrics, University of Rochester Medical Center, Rochester, NY
- Keith J Mann
- Department of Pediatrics, Children's Mercy Hospitals and Clinics, and the University of Missouri-Kansas City School of Medicine, Kansas City, MO
- James M Moses
- Department of Pediatrics, Boston University School of Medicine, and Boston Medical Center, Boston, Mass
- John Patrick T Co
- Office of Graduate Medical Education, Partners HealthCare, and Department of Pediatrics, Massachusetts General Hospital/Harvard Medical School, Boston, Mass
- Aaron K Blumkin
- Department of Pediatrics, University of Rochester Medical Center, Rochester, NY
- Peter G Szilagyi
- Department of Pediatrics, University of Rochester Medical Center, Rochester, NY
29
Wieland ML, Jaeger TM, Bundrick JB, Mauck KF, Post JA, Thomas MR, Thomas KG. Resident physician perspectives on outpatient continuity of care. J Grad Med Educ 2013; 5:668-73. [PMID: 24455021 PMCID: PMC3886471 DOI: 10.4300/jgme-05-04-40]
Abstract
BACKGROUND The outpatient continuity clinic is an essential component of internal medicine residency programs, yet continuity of patient care in these clinics is suboptimal. Reasons for this discontinuity have been inadequately explored. OBJECTIVE We sought to assess perceived factors contributing to discontinuity in trainee ambulatory clinics. METHODS The study encompassed 112 internal medicine residents at a large academic medical center in the Midwest. We conducted 2 hours of facilitated discussion with 18 small groups of residents. Residents were asked to reflect on factors that pose barriers to continuity in their ambulatory practice and potential mechanisms to reduce these barriers. Resident comments were transcribed and inductive analysis was performed to develop themes. We used these themes to derive recommendations for improving continuity of care in a resident ambulatory clinic. RESULTS Key themes included an imbalance of clinic scheduling that favors access for patients with acute symptoms over continuity, clinic triage scripts that deemphasize continuity, inadequate communication among residents and faculty regarding shared patients, residents' inefficient use of nonphysician care resources, and a lack of shared values between patients and providers regarding continuity of care. CONCLUSIONS The results offer important information that may be applied in iterative program changes to enhance continuity of care in resident clinics.
30
Heflin MT, Pinheiro SO, Konrad TR, Egerton EO, Thornlow DK, White HK, McConnell EJ. Design and evaluation of a prelicensure interprofessional course on improving care transitions. Gerontol Geriatr Educ 2013; 35:41-63. [PMID: 24279889 DOI: 10.1080/02701960.2013.831349]
Abstract
Effective management of care transitions for older adults requires the coordinated expertise of an interprofessional team. Unfortunately, different health care professions are rarely educated together or trained in teamwork skills. To address this issue, a team of professionally diverse faculty from the Duke University Geriatric Education Center designed an interprofessional course focused on improving transitions of care for older adults. This innovative prelicensure course provided interactive teaching sessions designed to promote critical thinking and foster effective communication among health care professionals, caregivers, and patients. Students were assessed by in-class and online participation, performance on individual assignments, and team-based proposals to improve care transitions for older patients with congestive heart failure. Twenty students representing six professions completed the course; 18 completed all self-efficacy and course evaluation surveys. Students rated their self-efficacy in several domains before and after the course and reported gains in teamwork skills (p < .001), transitions of care (p < .001), quality improvement (p < .001), and cultural competence (p < .001). Learner feedback emphasized the importance of enthusiastic and well-prepared faculty, interactive learning experiences, and engagement in relevant work. This course offers a promising approach to shifting the paradigm of health professions education to empower graduates to promote quality improvement through team-based care.
Affiliation(s)
- Mitchell T Heflin
- Division of Geriatrics, Department of Medicine, Duke University School of Medicine; Duke University Center for the Study of Aging and Human Development; and Durham Veterans Affairs Medical Center, Geriatrics Research, Education, and Clinical Center (GRECC), Durham, North Carolina, USA
31
Mookherjee S, Ranji S, Neeman N, Sehgal N. An advanced quality improvement and patient safety elective. Clin Teach 2013; 10:368-73. [DOI: 10.1111/tct.12047]
32
Development of an Instrument to Evaluate Residents’ Confidence in Quality Improvement. Jt Comm J Qual Patient Saf 2013; 39:502-10. [DOI: 10.1016/s1553-7250(13)39066-7]
33
Vinci LM, Oyler J, Arora VM. The Quality and Safety Track: Training Future Physician Leaders. Am J Med Qual 2013; 29:277-83. [PMID: 23956340 DOI: 10.1177/1062860613498264]
Abstract
Future physician leaders will need the knowledge and skills necessary to improve systems of care. To address this need, Pritzker School of Medicine implemented a 4-year scholarly track in quality and patient safety for medical students. The Quality and Safety Track (QST) includes an intensive elective that teaches basic quality-improvement skills, an individual mentored scholarly project, and engagement in the Institute for Healthcare Improvement Open School. The first-year elective incorporates a group project that allows students to apply basic process improvement skills. Institutional quality and safety leaders also present their work, giving students context for how these skills are used. To date, 23 students have completed the elective, and 11 chose to pursue QST throughout their medical school experience. Students who completed the elective reported improved confidence in using core quality improvement skills. QST is a feasible and innovative program to develop future health care leaders in quality and safety.
Affiliation(s)
- Lisa M Vinci
- Pritzker School of Medicine, University of Chicago, Chicago, IL
- Julie Oyler
- Pritzker School of Medicine, University of Chicago, Chicago, IL
- Vineet M Arora
- Pritzker School of Medicine, University of Chicago, Chicago, IL
34
Wetzel AP. Factor analysis methods and validity evidence: a review of instrument development across the medical education continuum. Acad Med 2012; 87:1060-9. [PMID: 22722361 DOI: 10.1097/acm.0b013e31825d305d]
Abstract
PURPOSE Instrument development consistent with best practices is necessary for effective assessment and evaluation of learners and programs across the medical education continuum. The author explored the extent to which current factor analytic methods and other techniques for establishing validity are consistent with best practices. METHOD The author conducted electronic and hand searches of the English-language medical education literature published January 2006 through December 2010. To describe and assess current practices, she systematically abstracted reliability and validity evidence as well as factor analysis methods, data analysis, and reported evidence from instrument development articles reporting the application of exploratory factor analysis and principal component analysis. RESULTS Sixty-two articles met eligibility criteria. They described 64 instruments and 95 factor analyses. Most studies provided at least one source of evidence based on test content. Almost all reported internal consistency, providing evidence based on internal structure. Evidence based on response process and relationships with other variables was reported less often, and evidence based on consequences of testing was not identified. Factor analysis findings suggest common method selection errors and critical omissions in reporting. CONCLUSIONS Given the limited reliability and validity evidence provided for the reviewed instruments, educators should carefully consider the available supporting evidence before adopting and applying published instruments. Researchers should design for, test, and report additional evidence to strengthen the argument for reliability and validity of these measures for research and practice.
Affiliation(s)
- Angela P Wetzel
- Department of Foundations of Education, Virginia Commonwealth University School of Education, Richmond, VA 23284-2020, USA.
35
Shaikh U, Natale JE, Nettiksimmons J, Li STT. Improving Pediatric Health Care Delivery by Engaging Residents in Team-Based Quality Improvement Projects. Am J Med Qual 2012; 28:120-6. [DOI: 10.1177/1062860612448927]
Affiliation(s)
- Ulfat Shaikh
- University of California Davis School of Medicine, Sacramento, CA
- JoAnne E. Natale
- University of California Davis School of Medicine, Sacramento, CA
- Su-Ting T. Li
- University of California Davis School of Medicine, Sacramento, CA
36
Levitt DS, Hauer KE, Poncelet A, Mookherjee S. An innovative quality improvement curriculum for third-year medical students. Med Educ Online 2012; 17:18391. [PMID: 22611330 PMCID: PMC3355381 DOI: 10.3402/meo.v17i0.18391]
Abstract
BACKGROUND Competence in quality improvement (QI) is a priority for medical students. We describe a self-directed QI skills curriculum for medical students in a 1-year longitudinal integrated third-year clerkship: an ideal context to learn and practice QI. METHODS Two groups of four students identified a quality gap, described existing efforts to address the gap, made quantifying measures, and proposed a QI intervention. The program was assessed with knowledge and attitude surveys and a validated tool for rating trainee QI proposals. Reaction to the curriculum was assessed by survey and focus group. RESULTS Knowledge of QI concepts did not improve (mean knowledge score±SD): pre: 5.9±1.5 vs. post: 6.6±1.3, p=0.20. There were significant improvements in attitudes (mean topic attitude score±SD) toward the value of QI (pre: 9.9±1.8 vs. post: 12.6±1.9, p=0.03) and confidence in QI skills (pre: 13.4±2.8 vs. post: 16.1±3.0, p=0.05). Proposals lacked sufficient analysis of interventions and evaluation plans. Reaction was mixed, including appreciation for the experience and frustration with finding appropriate mentorship. CONCLUSION Clinical-year students were able to conduct a self-directed QI project. Lack of improvement in QI knowledge suggests that self-directed learning in this domain may be insufficient without targeted didactics. Higher order skills such as developing measurement plans would benefit from explicit instruction and mentorship. Lessons from this experience will allow educators to better target QI curricula to medical students in the clinical years.
Affiliation(s)
- David Stern Levitt
- Department of Medicine, University of Washington, Seattle, Washington, USA
- Karen E. Hauer
- Department of Medicine, University of California, San Francisco, CA, USA
- Ann Poncelet
- Department of Neurology, University of California, San Francisco, CA, USA
- Somnath Mookherjee
- Department of Medicine, University of California, San Francisco, CA, USA
37
Moses J, Shore P, Mann KJ. Quality improvement curricula in pediatric residency education: obstacles and opportunities. Acad Pediatr 2011; 11:446-50. [PMID: 21967722 DOI: 10.1016/j.acap.2011.08.007]
Affiliation(s)
- James Moses
- Department of Pediatrics, Boston University School of Medicine, Boston Medical Center, USA
38
Lipstein EA, Kronman MP, Richmond C, White KN, Shugerman RP, McPhillips HA. Addressing core competencies through hospital quality improvement activities: attitudes and engagement. J Grad Med Educ 2011; 3:315-9. [PMID: 22942955 PMCID: PMC3179206 DOI: 10.4300/jgme-d-10-00179.1]
Abstract
BACKGROUND Hospital quality improvement initiatives are becoming increasingly common. Little is known about the influence of these initiatives on resident learning and attitudes. Our objective was to assess whether training in a hospital committed to involving residents in hospital-initiated, continuous quality improvement (CQI), and to participation in such activities, would influence residents' attitudes toward CQI and engagement in the hospital community. METHODS We surveyed Seattle Children's Hospital pediatric residents, from residency graduation years 2002-2009. We included questions about participation in quality improvement activities during residency and measures of attitude toward CQI and of workplace engagement. We used descriptive statistics to assess trends in resident participation in hospital CQI activities, attitudes toward CQI and workplace engagement. RESULTS The overall response rate was 84% (162 of 194). Among graduated residents, there was a significant trend toward increased participation in CQI activities (P = .03). We found no difference in attitude toward CQI between those who had and those who had not participated in such activities nor between residents who began training before and those who began after the hospital formally committed to CQI. Sixty-three percent of residents (25 of 40) who participated in CQI activities were engaged in the hospital community compared with 53% (57 of 107) who did not participate in CQI activities (P = .21). CONCLUSIONS Training in a hospital committed to involving residents in CQI was associated with a high rate of participation in CQI activities. Although such training and participation in CQI were not associated with resident attitudes toward CQI or hospital engagement, it may allow residents to learn skills for practice-based learning and improvement and systems-based practice.
Affiliation(s)
- Ellen A Lipstein
- Corresponding author: Ellen A. Lipstein, MD, MPH, Cincinnati Children's Hospital and University of Cincinnati College of Medicine, 3333 Burnet Avenue, MLC 7027, Cincinnati, OH 45229, 513.803.1626.
39
Colbert CY, Ogden PE, Ownby AR, Bowe C. Systems-based practice in graduate medical education: systems thinking as the missing foundational construct. Teach Learn Med 2011; 23:179-85. [PMID: 21516607 DOI: 10.1080/10401334.2011.561758]
Abstract
BACKGROUND Since 2001, residencies have struggled with teaching and assessing systems-based practice (SBP). One major obstacle may be that the competency alone is not sufficient to support assessment. We believe the foundational construct underlying SBP is systems thinking, absent from the current Accreditation Council for Graduate Medical Education competency language. SUMMARY Systems thinking is defined as the ability to analyze systems as a whole. The purpose of this article is to describe psychometric issues that constrain assessment of SBP and elucidate the role of systems thinking in teaching and assessing SBP. CONCLUSION Residency programs should incorporate systems thinking models into their curricula. Trainees should be taught to understand systems at an abstract level, in order to analyze their own healthcare systems, and participate in quality and patient safety activities. We suggest that a developmental trajectory for systems thinking be developed, similar to the model described by Dreyfus and Dreyfus.
Affiliation(s)
- Colleen Y Colbert
- Scott & White Healthcare and Internal Medicine, Texas A&M University System Health Science Center College of Medicine, Temple, Texas 76508, USA.
40
Tomolo AM, Lawrence RH, Watts B, Augustine S, Aron DC, Singh MK. Pilot study evaluating a practice-based learning and improvement curriculum focusing on the development of system-level quality improvement skills. J Grad Med Educ 2011; 3:49-58. [PMID: 22379523 PMCID: PMC3186260 DOI: 10.4300/jgme-d-10-00104.1]
Abstract
BACKGROUND We developed a practice-based learning and improvement (PBLI) curriculum to address important gaps in components of content and experiential learning activities through didactics and participation in systems-level quality improvement projects that focus on making changes in health care processes. METHODS We evaluated the impact of our curriculum on resident PBLI knowledge, self-efficacy, and application skills. A quasi-experimental design assessed the impact of a curriculum (PBLI quality improvement systems compared with non-PBLI) on internal medicine residents' learning during a 4-week ambulatory block. We measured application skills, self-efficacy, and knowledge by using the Systems Quality Improvement Training and Assessment Tool. Exit evaluations assessed time invested and experiences related to the team projects and suggestions for improving the curriculum. RESULTS The 2 groups showed differences in change scores. Relative to the comparison group, residents in the PBLI curriculum demonstrated a significant increase in the belief about their ability to implement a continuous quality improvement project (P = .020), comfort level in developing data collection plans (P = .010), and total knowledge scores (P < .001), after adjusting for prior PBLI experience. Participants in the PBLI curriculum also demonstrated significant improvement in providing a more complete aim statement for a proposed project after adjusting for prior PBLI experience (P = .001). Exit evaluations were completed by 96% of PBLI curriculum participants, who reported high satisfaction with team performance. CONCLUSION Residents in our curriculum showed gains in areas fundamental for PBLI competency. The observed improvements were related to fundamental quality improvement knowledge, with limited gain in application skills. This suggests that while heading in the right direction, we need to conceptualize and structure PBLI training in a way that integrates it throughout the residency program and fosters the application of this knowledge and these skills.
Affiliation(s)
- Anne M Tomolo
- Corresponding author: Anne M. Tomolo, MD, MPH, 1670 Clairmont Road, Atlanta, GA 30033, 404.321.6111, extension 4602.
41
Lawrence RH, Tomolo AM. Development and preliminary evaluation of a practice-based learning and improvement tool for assessing resident competence and guiding curriculum development. J Grad Med Educ 2011; 3:41-8. [PMID: 22379522 PMCID: PMC3186261 DOI: 10.4300/jgme-d-10-00102.1]
Abstract
BACKGROUND Although practice-based learning and improvement (PBLI) is now recognized as a fundamental and necessary skill set, we are still in need of tools that yield specific information about gaps in knowledge and application to help nurture the development of quality improvement (QI) skills in physicians in a proficient and proactive manner. We developed a questionnaire and coding system as an assessment tool to evaluate and provide feedback regarding PBLI self-efficacy, knowledge, and application skills for residency programs and related professional requirements. METHODS Five nationally recognized QI experts/leaders reviewed and completed our questionnaire. Through an iterative process, a coding system based on identifying key variables needed for ideal responses was developed to score project proposals. The coding system comprised 14 variables related to the QI projects, and an additional 30 variables related to the core knowledge concepts related to PBLI. A total of 86 residents completed the questionnaire, and 2 raters coded their open-ended responses. Interrater reliability was assessed by percentage agreement and Cohen κ for individual variables and Lin concordance correlation for total scores for knowledge and application. Discriminative validity (t test to compare known groups) and coefficient of reproducibility as an indicator of construct validity (item difficulty hierarchy) were also assessed. RESULTS Interrater reliability estimates were good (percentage of agreements, above 90%; κ, above 0.4 for most variables; concordances for total scores were R = .88 for knowledge and R = .98 for application). CONCLUSION Despite the residents' limited range of experiences in the group with prior PBLI exposure, our tool met our goal of differentiating between the 2 groups in our preliminary analyses. Correcting for chance agreement identified some variables that are potentially problematic. Although additional evaluation is needed, our tool may prove helpful and provide detailed information about trainees' progress and the curriculum.
Affiliation(s)
- Renée H Lawrence
- Corresponding author: Renée H. Lawrence, PhD, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, 10701 East Boulevard, 111W, Cleveland, OH 44106, 216.791.3800.
42
Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. J Gen Intern Med 2011; 26:221-5. [PMID: 21053089 PMCID: PMC3019318 DOI: 10.1007/s11606-010-1547-y]
Abstract
INTRODUCTION Although sustainability is a key component in the evaluation of continuous quality improvement (CQI) projects, medicine resident CQI projects are often evaluated by immediate improvements in targeted areas without addressing sustainability. AIM/SETTING: To assess the sustainability of resident CQI projects in an ambulatory university-based clinic. PROGRAM DESCRIPTION During their ambulatory rotation, all second year internal medicine residents use the American Board of Internal Medicine's Clinical Preventive Services (CPS) Practice Improvement Modules (PIM) to complete chart reviews, patient surveys, and a system survey. The residents then develop a group CQI project and collect early post data. Third year residents return to evaluate their original CQI project during an ambulatory rotation two to six months later and complete four plan-do-study-act (PDSA) cycles on each CQI project. PROGRAM EVALUATION From July 2006 to June 2009, 64 (100%) medicine residents completed the CQI curriculum. Residents completed six group projects and examined their success using early (2 to 6 weeks) and late (2 to 6 months) post-intervention data. Three of the projects demonstrated sustainable improvement in the resident continuity clinic. DISCUSSION When residents are taught principles of sustainability and spread and asked to complete multiple PDSA cycles, they are able to identify common themes that may contribute to success of QI projects over time.
43
Wittich CM, Reed DA, Drefahl MM, West CP, McDonald FS, Thomas KG, Halvorsen AJ, Beckman TJ. Relationship between critical reflection and quality improvement proposal scores in resident doctors. Med Educ 2011; 45:149-54. [PMID: 21166692 DOI: 10.1111/j.1365-2923.2010.03860.x]
Abstract
OBJECTIVES Transformative learning theory supports the idea that reflection on quality improvement (QI) opportunities and the ability to develop successful QI projects may be fundamentally linked. We used validated methods to explore associations between resident doctors' reflections on QI opportunities and the quality of their QI project proposals. METHODS Eighty-six residents completed written reflections on practice improvement opportunities and developed QI proposals. Two faculty members assessed residents' reflections using the 18-item Mayo Evaluation of Reflection on Improvement Tool (MERIT), and assessed residents' QI proposals using the seven-item Quality Improvement Project Assessment Tool (QIPAT-7). Both instruments have been validated in previous work. Associations between MERIT and QIPAT-7 scores were determined. Internal consistency reliabilities of QIPAT-7 and MERIT scores were calculated. RESULTS There were no significant associations between MERIT overall and domain scores and QIPAT-7 overall and item scores. The internal consistency of MERIT and QIPAT-7 item groups was acceptable (Cronbach's α 0.76-0.94). CONCLUSIONS The lack of association between MERIT and QIPAT-7 scores indicates a distinction between resident doctors' skills at reflection on QI opportunities and their abilities to develop QI projects. These findings suggest that practice-based reflection and QI project development are separate constructs, and that skilful reflection may not predict the ability to design meaningful QI initiatives. Future QI curricula should consider teaching and assessing QI reflection and project development as distinct components.
Affiliation(s)
- Christopher M Wittich
- Division of General Internal Medicine, Department of Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota 55905, USA.
|
44
|
Wittich CM, Reed DA, McDonald FS, Varkey P, Beckman TJ. Perspective: Transformative learning: a framework using critical reflection to link the improvement competencies in graduate medical education. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2010; 85:1790-1793. [PMID: 20881823 DOI: 10.1097/acm.0b013e3181f54eed] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/29/2023]
Abstract
Quality improvement (QI) in health care involves activities ranging from enhancing personal practice to reforming the larger health care system. The Accreditation Council for Graduate Medical Education recognizes this broad definition of QI in its requirement that physicians-in-training demonstrate competence in practice-based learning and improvement (PBLI) and systems-based practice (SBP). Creative metaphors have been used to teach the PBLI and SBP competencies, but conceptual frameworks describing the relationship between these competencies are needed. Transformative learning is an adult education theory that states individuals must critically reflect on life events in order to change their beliefs or behaviors. The authors propose that critical reflection during transformative learning can conceptually link PBLI and SBP. Reflection on personal experience with suboptimal patient care leads to recognizing personal or system limitations. Addressing personal limitations improves individual practice (PBLI), whereas applying QI methodologies leads to large-scale improvements (SBP). Educators who adopt the transformative learning framework should be able to design meaningful QI curricula that encourage residents to be reflective and empower them with QI skills.
Affiliation(s)
- Christopher M Wittich
- Department of Internal Medicine, Division of General Internal Medicine, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota 55905, USA.
|
45
|
Colbert CY, Ogden PE, Lowe D, Moffitt MJ. Students learn systems-based care and facilitate system change as stakeholders in a free clinic experience. ADVANCES IN HEALTH SCIENCES EDUCATION : THEORY AND PRACTICE 2010; 15:533-545. [PMID: 20039122 DOI: 10.1007/s10459-009-9216-9] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/03/2009] [Accepted: 12/16/2009] [Indexed: 05/28/2023]
Abstract
Systems-based practice (SBP) is rarely taught or evaluated during medical school, yet is one of the required competencies once students enter residency. We believe Texas A&M College of Medicine students learn about systems issues informally, as they care for patients at a free clinic in Temple, TX. The mandatory free clinic rotation is part of the Internal Medicine clerkship and does not include formal instruction in SBP. During 2008-2009, a sample of students (n = 31) on the Internal Medicine clerkship's free clinic rotation participated in a program evaluation/study regarding their experiences. Focus groups (M = 5 students/group) were held at the end of each outpatient rotation. Students were asked: "Are you aware of any system issues which can affect either the delivery of or access to care at the free clinic?" Data saturation was reached after six focus groups, when investigators noted a repetition of responses. Based upon investigator consensus opinion, data collection was discontinued. Based upon a content analysis, six themes were identified: limited access to specialists, including OB-GYN; cost containment; lack of resources affecting delivery of care; delays in care due to lack of insurance; understanding of the larger healthcare system and the free clinic's role; and delays in tests due to language barriers. Medical students were able to learn about SBP issues during free clinic rotations. Students experienced how SBP issues affected the health care of uninsured individuals. We believe these findings may be transferable to medical schools with mandatory free clinic rotations.
Affiliation(s)
- Colleen Y Colbert
- Internal Medicine Department, Scott & White Memorial Hospital, Temple, TX 76508, USA.
|
46
|
Szostek JH, Wieland ML, Loertscher LL, Nelson DR, Wittich CM, McDonald FS, Kolars JC, Reed DA. A systems approach to morbidity and mortality conference. Am J Med 2010; 123:663-8. [PMID: 20609691 DOI: 10.1016/j.amjmed.2010.03.010] [Citation(s) in RCA: 59] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/08/2010] [Revised: 03/19/2010] [Accepted: 03/22/2010] [Indexed: 10/19/2022]
|
47
|
Green ML, Holmboe E. Perspective: the ACGME toolbox: half empty or half full? ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2010; 85:787-90. [PMID: 20520026 DOI: 10.1097/acm.0b013e3181d737a6] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/12/2023]
Abstract
The Accreditation Council for Graduate Medical Education Outcome Project changed the currency of accreditation from process and structure to outcomes. Residency program directors must document their residents' competence in six general dimensions of practice. A recent systematic review, published in the March 2009 issue of Academic Medicine, concluded that the instruments currently available are psychometrically inadequate for evaluating residents in five of the six competencies. In this perspective, the authors refute the findings of this earlier review. They demonstrate that the review's search strategy was limited, failing to capture many important evaluation studies. They also question the appropriateness of the analysis of the included articles, which focused, to the exclusion of other important properties, on an instrument's ability to discriminate among residents' performance in the six competencies. Finally, the authors argue that the problem is not the lack of adequate evaluation instruments but, rather, the inconsistent use and interpretation of such instruments by unskilled faculty. They urge the graduate medical education community, if it is to realize the promise of competency-based education, to invest in training for faculty evaluators rather than waiting for new instruments.
Affiliation(s)
- Michael L Green
- Department of Internal Medicine, Yale University School of Medicine, New Haven, Connecticut, USA.
|
48
|
Wittich CM, Beckman TJ, Drefahl MM, Mandrekar JN, Reed DA, Krajicek BJ, Haddad RM, McDonald FS, Kolars JC, Thomas KG. Validation of a method to measure resident doctors' reflections on quality improvement. MEDICAL EDUCATION 2010; 44:248-55. [PMID: 20444055 DOI: 10.1111/j.1365-2923.2009.03591.x] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/17/2023]
Abstract
OBJECTIVES Resident reflection on the clinical learning environment is prerequisite to identifying quality improvement (QI) opportunities and demonstrating competence in practice-based learning. However, residents' abilities to reflect on QI opportunities are unknown. Therefore, we developed and determined the validity of the Mayo Evaluation of Reflection on Improvement Tool (MERIT) for assessing resident reflection on QI opportunities. METHODS The content of MERIT, which consists of 18 items structured on 4-point scales, was based on existing literature and input from national experts. Using MERIT, six faculty members rated 50 resident reflections. Factor analysis was used to examine the dimensionality of MERIT instrument scores. Inter-rater and internal consistency reliabilities were calculated. RESULTS Factor analysis revealed three factors (eigenvalue; number of items): Reflection on Personal Characteristics of QI (8.5; 7); Reflection on System Characteristics of QI (1.9; 6); and Problem of Merit (1.5; 5). Inter-rater reliability was very good (intraclass correlation coefficient range: 0.73-0.89). Internal consistency reliability was excellent (Cronbach's alpha 0.93 overall and 0.83-0.91 for factors). Item mean scores were highest for Problem of Merit (3.29) and lowest for Reflection on System Characteristics of QI (1.99). CONCLUSIONS Validity evidence supports MERIT as a meaningful measure of resident reflection on QI opportunities. Our findings suggest that dimensions of resident reflection on QI opportunities may include personal, system and Problem of Merit factors. Additionally, residents may be more effective at reflecting on 'problems of merit' than personal and systems factors.
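Internal consistency in this abstract is summarized with Cronbach's alpha. A minimal sketch of the standard formula, α = k/(k−1) × (1 − Σ item variances / variance of total scores), applied to made-up ratings (not MERIT data):

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha; rows = one list of item scores per respondent."""
    k = len(rows[0])  # number of items on the instrument
    # Sample variance of each item column across respondents
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    # Sample variance of each respondent's total score
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Made-up 4-point ratings for four respondents on three items
ratings = [[1, 2, 1], [2, 2, 3], [3, 4, 3], [4, 3, 4]]
alpha = cronbach_alpha(ratings)  # 0.875 for these made-up ratings
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is how the reported range of 0.83-0.91 for the MERIT factors is interpreted.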
Affiliation(s)
- Christopher M Wittich
- Department of Internal Medicine, Mayo Clinic College of Medicine, Mayo Clinic, Rochester, Minnesota 55905, USA.
|
49
|
Holland R, Meyers D, Hildebrand C, Bridges AJ, Roach MA, Vogelman B. Creating champions for health care quality and safety. Am J Med Qual 2009; 25:102-8. [PMID: 19966115 DOI: 10.1177/1062860609352108] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Patient safety and quality of care are public concerns that demand personal responsibility at all levels of the health care organization. Senior residents in our graduate medical education program took responsibility for a capstone quality improvement project designed to transform them into champions for health care quality. Residents (n = 26) participated alone or in pairs in a 1-month faculty-mentored rotation at the Veterans Administration Hospital during the 2007-2008 academic year. They completed a Web-based curriculum, identified a quality-of-care issue, applied Plan-Do-Study-Act cycles, authored a report, and engaged colleagues in their innovations during a department-wide presentation. Results indicated that residents demonstrated significantly enhanced knowledge and attitudes about patient safety and quality improvement and provided consistently positive faculty and rotation evaluations. In addition, residents generated 20 quality improvement project proposals with a 50% rate of hospital-wide implementation, leading to meaningful changes in the systems that affect patient care.
Affiliation(s)
- Robert Holland
- William S. Middleton Memorial Veterans Administration Hospital, Madison, WI 53705, USA.
|
50
|
Patow CA, Karpovich K, Riesenberg LA, Jaeger J, Rosenfeld JC, Wittenbreer M, Padmore JS. Residents' engagement in quality improvement: a systematic review of the literature. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2009; 84:1757-64. [PMID: 19940586 DOI: 10.1097/acm.0b013e3181bf53ab] [Citation(s) in RCA: 85] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/20/2023]
Abstract
PURPOSE Residents are being asked to participate in quality improvement (QI) initiatives in hospitals and clinics with increasing frequency; however, the effectiveness of improving patient care through residents' participation in QI initiatives is unknown. METHOD A thorough, systematic review of the English-language medical literature published between 1987 and October 2008 was performed to identify clinical QI initiatives in which there was active engagement of residents. Multiple search strategies were employed using PubMed, EMBASE, CINAHL, and ERIC. Articles were excluded in which residents played a passive or peripheral role in the QI initiative. RESULTS Twenty-eight articles were identified that documented residents' active leadership, development, or participation in a clinical QI initiative, such as curriculum change, clinical guideline implementation, or involvement with a clinical QI team. The role and participation of residents varied widely. Measures of patient health are described as outcomes in the QI initiatives of 5 of the 28 articles. Twenty-three articles described process improvements in patient care or residents' education as the outcome measure. CONCLUSION There are few articles that describe the clinical or educational effectiveness of residents' participation in QI efforts; the authors describe barriers that may be partly responsible. They conclude that there is a great need for additional research on the effectiveness of residents' participation in QI initiatives, particularly as they affect patient health outcomes.
Affiliation(s)
- Carl A Patow
- HealthPartners Institute for Medical Education, and Regions Hospital, University of Minnesota Medical School, Minneapolis, Minnesota 55425, USA.
|