1
Kersey E, Li J, Kay J, Adler-Milstein J, Yazdany J, Schmajuk G. Development and application of Breadth-Depth-Context (BDC), a conceptual framework for measuring technology engagement with a qualified clinical data registry. JAMIA Open 2024; 7:ooae061. PMID: 39070967; PMCID: PMC11278873; DOI: 10.1093/jamiaopen/ooae061.
Abstract
Objectives: Despite the proliferation of dashboards that display performance data derived from Qualified Clinical Data Registries (QCDRs), the degree to which clinicians and practices engage with such dashboards has not been well described. We aimed to develop a conceptual framework for assessing user engagement with dashboard technology and to demonstrate its application to a rheumatology QCDR.
Materials and Methods: We developed the BDC (Breadth-Depth-Context) framework, which includes the concepts of breadth (derived from dashboard sessions), depth (derived from dashboard actions), and context (derived from practice characteristics). We demonstrated its application using user log data from the American College of Rheumatology's Rheumatology Informatics System for Effectiveness (RISE) registry to define engagement profiles and to characterize practice-level factors associated with different profiles.
Results: We applied the BDC framework to 213 ambulatory practices from the RISE registry in 2020-2021 and classified practices into 4 engagement profiles: not engaged (8%), minimally engaged (39%), moderately engaged (34%), and most engaged (19%). Practices with more patients and with specific electronic health record vendors (eClinicalWorks and eMDs) had a higher likelihood of being in the most engaged group, even after adjusting for other factors.
Discussion: We developed the BDC framework to characterize user engagement with a registry dashboard and demonstrated its use in a specialty QCDR. Applying the framework revealed a wide range of breadth and depth of use and showed that specific contextual factors were associated with the nature of engagement.
Conclusion: Going forward, the BDC framework can be used to study engagement with similar dashboards.
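The breadth and depth concepts above lend themselves to a simple log-derived classifier. The sketch below is purely illustrative: the thresholds, inputs, and profile cut-points are hypothetical and are not the definitions used in the RISE study.

```python
# Hypothetical sketch of the BDC idea: classify a practice's dashboard
# engagement from log data. Thresholds and profile cut-points are invented
# for illustration; they are not the definitions used in the RISE study.

def classify_engagement(sessions: int, actions: int) -> str:
    """Combine breadth (session count) and depth (actions per session)."""
    if sessions == 0:
        return "not engaged"
    depth = actions / sessions
    if sessions < 4 or depth < 2:
        return "minimally engaged"
    if sessions < 12 or depth < 5:
        return "moderately engaged"
    return "most engaged"

# Example practices: (sessions, actions) over a reporting period
for practice, usage in {"A": (0, 0), "B": (2, 3), "C": (8, 30), "D": (20, 150)}.items():
    print(practice, classify_engagement(*usage))
```

Context, the third BDC component, would enter as practice-level covariates (size, EHR vendor) in a model over these profiles rather than in the classifier itself.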
Affiliation(s)
- Emma Kersey: Department of Medicine, Division of Rheumatology, University of California San Francisco, San Francisco, CA 94143, United States
- Jing Li: Department of Medicine, Division of Rheumatology, University of California San Francisco, San Francisco, CA 94143, United States
- Julia Kay: Department of Medicine, Division of Rheumatology, University of California San Francisco, San Francisco, CA 94143, United States
- Julia Adler-Milstein: Institute for Health Policy Studies, University of California San Francisco, San Francisco, CA 94158, United States; Department of Medicine, Division of Clinical Informatics and Digital Transformation, University of California San Francisco, San Francisco, CA 94143, United States
- Jinoos Yazdany: Department of Medicine, Division of Rheumatology, University of California San Francisco, San Francisco, CA 94143, United States; Institute for Health Policy Studies, University of California San Francisco, San Francisco, CA 94158, United States
- Gabriela Schmajuk: Department of Medicine, Division of Rheumatology, University of California San Francisco, San Francisco, CA 94143, United States; Institute for Health Policy Studies, University of California San Francisco, San Francisco, CA 94158, United States; San Francisco Veterans Affairs Medical Center, San Francisco, CA 94121, United States
2
Conlin M, Hamard M, Agrinier N, Birgand G. Assessment of implementation strategies adopted for antimicrobial stewardship interventions in long-term care facilities: a systematic review. Clin Microbiol Infect 2024; 30:431-444. PMID: 38141820; DOI: 10.1016/j.cmi.2023.12.020.
Abstract
BACKGROUND: The implementation of antimicrobial stewardship (AMS) interventions in long-term care facilities (LTCFs) is influenced by multi-level factors (resident, organizational, and external), making their effectiveness sensitive to the implementation context.
OBJECTIVES: This study assessed the strategies adopted for the implementation of AMS interventions in LTCFs, whether they considered organizational characteristics, and their effectiveness.
DATA SOURCES: Electronic databases searched until April 2022.
STUDY ELIGIBILITY CRITERIA: Articles covering implementation of AMS interventions in LTCFs.
ASSESSMENT OF RISK OF BIAS: Mixed Methods Appraisal Tool for empirical studies.
METHODS OF DATA SYNTHESIS: Data were collected on AMS interventions and context characteristics (e.g. type of facility, staffing, and residents). Implementation strategies and outcomes were mapped according to the Expert Recommendations for Implementing Change (ERIC) framework and a validated taxonomy for implementation outcomes. Implementation and clinical effectiveness were assessed according to the primary and secondary outcome results reported in each study.
RESULTS: Among the 48 studies included in the analysis, 19 (40%) used implementation strategies corresponding to one to three ERIC domains, including education and training (n = 36/48, 75%), evaluative and iterative strategies (n = 24/48, 50%), and supporting clinicians (n = 23/48, 48%). Only 8/48 (17%) studies made use of implementation theories, frameworks, or models. Fidelity and sustainability were reported in 21 (70%) and 3 (10%), respectively, of the 27 studies providing implementation outcomes. The implementation strategy was considered effective in 11/27 (41%) studies, mainly those including actions to improve use (n = 6/11, 54%) and education (n = 4/11, 36%). Of the 42 interventions, 18 (43%) were deemed clinically effective. Among the 21 clinically effective studies, implementation was deemed effective in four and partially effective in five. Two studies were clinically effective despite having non-effective implementation.
CONCLUSIONS: The effectiveness of AMS interventions in LTCFs differed largely according to the interventions' content and the implementation strategies adopted. Implementation frameworks should be considered to adapt and tailor interventions and strategies to the local context.
Affiliation(s)
- Michèle Conlin: Regional Center for Infection Prevention and Control Pays de la Loire, Centre Hospitalier Universitaire de Nantes, Nantes, France
- Marie Hamard: Unité de gériatrie Aiguë, Hôpital Bichat-Claude Bernard, Paris, France
- Nelly Agrinier: Université de Lorraine, Inserm, INSPIIRE, F-54000 Nancy, France; CHRU-Nancy, Inserm, Université de Lorraine, CIC, Epidémiologie clinique, Nancy, France
- Gabriel Birgand: Regional Center for Infection Prevention and Control Pays de la Loire, Centre Hospitalier Universitaire de Nantes, Nantes, France; National Institute for Health Research Health Protection Research Unit in Healthcare Associated Infections and Antimicrobial Resistance at Imperial College London, London, UK
3
Dash D, Moser A, Feldman S, Saliba D, Bakaev I, Smalbrugge M, Robert B, Karuza J, Heckman G, Katz PR, Costa AP. Focusing on Provider Quality Measurement: Continued Consensus and Feasibility Testing of Practice-Based Quality Measures for Primary Care Providers in Long-Term Care. J Am Med Dir Assoc 2024; 25:189-194. PMID: 38101456; DOI: 10.1016/j.jamda.2023.10.024.
Abstract
Medical providers in long-term care (LTC) use a unique skillset in delivering comprehensive resident care. Publicly reported quality measures (QMs) do not directly emphasize medical provider competency or providers' role in care; the impact of providers is understudied and, to a large extent, unknown. Our objective was to define, test, and validate QMs to pragmatically measure the practice-based quality of medical providers in a pilot study. We included 7 North American LTC homes with data from medical providers caring for LTC residents. We engaged in a 4-phased approach. In phase 1, experts rated 95 candidate QMs using 5 pragmatic-focused criteria in a RAND-modified Delphi process. Phase 2 involved specifying 37 QMs for collection (4 QMs were dropped during pilot testing); we created an abstraction manual and a data collection tool for all QMs. Phase 3 involved a retrospective chart review of 33 QMs in 7 LTC homes with trained data abstractors; data were sufficient to analyze performance for 26 QMs. Lastly, in phase 4, results and psychometric properties were reviewed with an expert panel, which ranked the tested measures for validity and feasibility for use by a nonphysician auditor to evaluate medical provider performance based on medical record review. In total, we examined data from 343 resident charts from 7 LTC homes and 49 providers. Our process yielded 10 QMs that were specified for measurement, feasible to collect, and showed good test performance. This is the only study to systematically identify a subset of QMs feasible for collection from the medical record by various data collectors. This pragmatic approach to measuring practice-based quality and quantifying select medical provider competencies allows for the evaluation of individual- and facility-level performance and facilitates quality improvement initiatives. Future work should perform broader testing and validate and refine the operationalized QMs.
Affiliation(s)
- Darly Dash: Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON, Canada
- Andrea Moser: Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada; Senior Services and Long-Term Care Division, City of Toronto, Toronto, ON, Canada
- Sid Feldman: Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada; Baycrest Health Sciences, Toronto, ON, Canada
- Debra Saliba: University of California Los Angeles, Borun Center at David Geffen School of Medicine, Los Angeles, CA, USA; Geriatric Research, Education, and Clinical Centers, Veterans Administration, Los Angeles, CA, USA; RAND Corporation, Santa Monica, CA, USA
- Innokentiy Bakaev: Department of Medicine, Hebrew SeniorLife, Boston, MA, USA; Harvard Medical School, Harvard University, Boston, MA, USA
- Martin Smalbrugge: Department of Medicine for Older People, Amsterdam University Medical Centers, Amsterdam, the Netherlands
- Benoît Robert: Perley Health, Ottawa, ON, Canada; Faculty of Medicine, University of Ottawa, ON, Canada
- Jurgis Karuza: Division of Geriatrics, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- George Heckman: School of Public Health Sciences, Faculty of Health, University of Waterloo, Waterloo, ON, Canada
- Paul R Katz: Department of Geriatrics, College of Medicine, Florida State University, Tallahassee, FL, USA
- Andrew P Costa: Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON, Canada; Centre for Integrated Care, St. Joseph's Health System, Hamilton, ON, Canada; Department of Medicine, McMaster University, Hamilton, ON, Canada
4
Correia RH, Dash D, Jones A, Vanstone M, Aryal K, Siu HYH, Gopaul A, Costa AP. Primary care quality for older adults: Practice-based quality measures derived from a RAND/UCLA appropriateness method study. PLoS One 2024; 19:e0297505. PMID: 38241388; PMCID: PMC10798529; DOI: 10.1371/journal.pone.0297505.
Abstract
We established consensus on practice-based metrics that characterize the quality of care for older primary care patients and that can be examined using secondary health administrative data. We conducted a two-round RAND/UCLA Appropriateness Method (RAM) study and recruited 10 Canadian clinicians and researchers with expertise relevant to the primary care of elderly patients. Informed by a literature review, the first RAM round evaluated the appropriateness and importance of candidate quality measures in an online questionnaire. Technical definitions were developed for each endorsed indicator to specify how the indicator could be operationalized using health administrative data. In a virtual synchronous meeting, the expert panel offered feedback on the technical specifications for the endorsed indicators. Panelists then completed a second (final) questionnaire to rate each indicator and its corresponding technical definition on the same criteria (appropriateness and importance). We combined the technical expert panelists' judgements using statistical integration and analyzed open-ended survey responses using content analysis. Our literature search and internal screening resulted in 61 practice-based quality indicators for rating. We developed technical definitions for the indicators endorsed in the first questionnaire (n = 55). Following the virtual synchronous meeting and second questionnaire, we achieved consensus on 12 practice-based quality measures across four Priority Topics in Care of the Elderly. The endorsed indicators provide a framework to characterize practice- and population-level encounters of family physicians delivering care to older patients and will offer insights into the outcomes of their care provision. This study presented a case of soliciting expert feedback to develop measurable practice-based quality indicators that can be examined using administrative data to understand quality of care within population-based data holdings. Future work will refine and operationalize the technical definitions established through this process to examine primary care provision for older adults in a particular context (Ontario, Canada).
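As a rough illustration of RAND/UCLA-style panel rating, the sketch below scores 9-point appropriateness ratings. The decision rule here (median tertile plus a simplified disagreement test) and the ratings are hypothetical; the study's own criteria and data are not reproduced.

```python
# Illustrative RAND/UCLA-style decision rule on a 9-point appropriateness
# scale: "appropriate" if the panel median falls in 7-9 without disagreement.
# The disagreement test (ratings in both extreme tertiles) is a deliberate
# simplification; rule and ratings are hypothetical, not from the study.
from statistics import median

def rate_indicator(ratings: list[int]) -> str:
    med = median(ratings)
    disagreement = any(r <= 3 for r in ratings) and any(r >= 7 for r in ratings)
    if disagreement:
        return "uncertain (disagreement)"
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

print(rate_indicator([7, 8, 8, 9, 7, 8, 9, 8, 7, 8]))  # → appropriate
```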
Affiliation(s)
- Rebecca H. Correia: Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Darly Dash: Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Aaron Jones: Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Meredith Vanstone: Department of Family Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Komal Aryal: Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Henry Yu-Hin Siu: Department of Family Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Aquila Gopaul: Department of Family Medicine, Western University, London, Ontario, Canada
- Andrew P. Costa: Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
5
An analysis of a novel Canadian pilot health information exchange to improve transitions between hospital and long-term care/skilled nursing facility. Journal of Integrated Care 2022. DOI: 10.1108/jica-03-2022-0022.
Abstract
Purpose: The purpose of the article is to assess the effectiveness, compliance, adoption, and lessons learnt from the pilot implementation of a data integration solution between an acute care hospital information system (HIS) and a long-term care (LTC) home electronic medical record, through a case report.
Design/methodology/approach: Utilization statistics of the data integration solution were captured at one month post-implementation and again one year later for both the emergency department (ED) and the LTC home. Clinician feedback from surveys and structured interviews was obtained from ED physicians and a multidisciplinary LTC group.
Findings: The authors successfully exchanged health information between a HIS and the electronic medical record (EMR) of an LTC facility in Canada. Perceived time savings were acknowledged by ED physicians, and actual time savings as high as 45 minutes were reported by LTC staff when completing medication reconciliation. Barriers to adoption included awareness, training efficacy and delivery models, workflow integration within existing practice, and the limited number of facilities participating in the pilot. Future directions include broader staff involvement, expanding the number of sites, and re-evaluating impacts.
Practical implications: A data integration solution to exchange clinical information can make patient transfers more efficient, reduce data transcription errors, and improve the visibility of essential patient information across the continuum of care.
Originality/value: Although there has been a large effort to integrate health data across care levels in the United States and internationally, the groundwork for such integrations between interoperable systems has only just begun in Canada. The implementation of the integration between an enterprise LTC electronic medical record system and an HIS described herein is the first of its kind in Canada. Benefits and lessons learnt from this pilot will be useful for further hospital-to-LTC home interoperability work.
6
Daneman N, Lee S, Bai H, Bell CM, Bronskill SE, Campitelli MA, Dobell G, Fu L, Garber G, Ivers N, Kumar M, Lam JMC, Langford B, Laur C, Morris AM, Mulhall CL, Pinto R, Saxena FE, Schwartz KL, Brown KA. Behavioral Nudges to Improve Audit and Feedback Report Opening among Antibiotic Prescribers: A Randomized Controlled Trial. Open Forum Infect Dis 2022; 9:ofac111. PMID: 35392461; PMCID: PMC8982784; DOI: 10.1093/ofid/ofac111.
Abstract
Background
Peer comparison audit and feedback has demonstrated effectiveness in improving antibiotic prescribing practices, but only a minority of prescribers view their reports. We rigorously tested three behavioral nudging techniques delivered by email to improve report opening.
Methods
We conducted a pragmatic randomized controlled trial among Ontario long-term care (LTC) prescribers enrolled in an ongoing peer comparison audit and feedback program, which includes data on their antibiotic prescribing patterns. Physicians were randomized to 1 of 8 possible sequences of intervention/control allocation to 3 different behavioral email nudges: a social peer comparison nudge (January 2020), a maintenance of professional certification incentive nudge (October 2020), and a prior participation nudge (January 2021). The primary outcome was feedback report opening; the primary analysis pooled the effects of all 3 nudging interventions.
Results
The trial included 421 physicians caring for more than 28,000 residents at 450 facilities. In the pooled analysis, physicians opened only 29.6% of intervention and 23.9% of control reports (odds ratio [OR] 1.51, 95% CI 1.10-2.07, p=0.011); this difference remained significant after accounting for physician characteristics and clustering (adjusted OR [aOR] 1.74, 95% CI 1.24-2.45, p=0.0014). Of the individual nudging techniques, the prior participation nudge was associated with a significant increase in report opening (OR 1.62, 95% CI 1.06-2.47, p=0.026; aOR 2.16, 95% CI 1.33-3.50, p=0.0018). In the pooled analysis, nudges were also associated with accessing more report pages (aOR 1.28, 95% CI 1.14-1.43, p<0.001).
Conclusions
Enhanced nudging strategies modestly improved report opening, but more work is needed to optimize physician engagement with audit and feedback.
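The pooled comparison above reduces to an odds ratio on report-opening counts. A minimal sketch follows, with invented counts chosen only to roughly mimic the reported opening percentages; the trial's OR of 1.51 came from its own data and adjusted models.

```python
# Unadjusted odds ratio for report opening, intervention vs control.
# Counts below are invented for illustration (roughly 29.6% vs 24% opening);
# they are not the trial's data.

def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    odds_a = events_a / (total_a - events_a)   # odds of opening, group A
    odds_b = events_b / (total_b - events_b)   # odds of opening, group B
    return odds_a / odds_b

print(round(odds_ratio(148, 500, 120, 500), 2))  # → 1.33
```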
Affiliation(s)
- Nick Daneman: Department of Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario, Canada; Public Health Ontario, Ontario, Canada; ICES, Ontario, Canada; Institute of Health Policy, Management and Evaluation and Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Chaim M Bell: ICES, Ontario, Canada; Institute of Health Policy, Management and Evaluation and Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, Sinai Health System, University of Toronto, Toronto, Ontario, Canada
- Susan E Bronskill: ICES, Ontario, Canada; Institute of Health Policy, Management and Evaluation and Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Women's College Hospital, University of Toronto, Toronto, Ontario, Canada
- Gary Garber: Department of Medicine, University of Ottawa, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Noah Ivers: ICES, Ontario, Canada; Institute of Health Policy, Management and Evaluation and Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada; Women's College Hospital, University of Toronto, Toronto, Ontario, Canada; Department of Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada
- Celia Laur: Women's College Hospital, University of Toronto, Toronto, Ontario, Canada
- Andrew M Morris: Department of Medicine, Sinai Health System, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, University Health Network, University of Toronto, Toronto, Ontario, Canada
- Ruxandra Pinto: Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Kevin L Schwartz: Public Health Ontario, Ontario, Canada; ICES, Ontario, Canada; Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada
- Kevin A Brown: Public Health Ontario, Ontario, Canada; ICES, Ontario, Canada
7
Bucalon B, Shaw T, Brown K, Kay J. State-of-the-art Dashboards on Clinical Indicator Data to Support Reflection on Practice: Scoping Review. JMIR Med Inform 2022; 10:e32695. PMID: 35156928; PMCID: PMC8887640; DOI: 10.2196/32695.
Abstract
Background: There is increasing interest in using routinely collected eHealth data to support reflective practice and long-term professional learning. Studies have evaluated the impact of dashboards on clinician decision-making, task completion time, user satisfaction, and adherence to clinical guidelines.
Objective: This scoping review aims to summarize the literature on dashboards based on patient administrative, medical, and surgical data that clinicians can use to support reflective practice.
Methods: A scoping review was conducted using the Arksey and O'Malley framework. A search of 5 electronic databases (MEDLINE, Embase, Scopus, ACM Digital Library, and Web of Science) identified studies that met the inclusion criteria. Study selection and characterization were performed by 2 independent reviewers (BB and CP). One reviewer extracted the data, which were analyzed descriptively to map the available evidence.
Results: A total of 18 dashboards from 8 countries were assessed. The dashboards were designed for performance improvement (10/18, 56%), to support quality and safety initiatives (6/18, 33%), and for management and operations (4/18, 22%). Data visualizations were primarily designed for team use (12/18, 67%) rather than for individual clinicians (4/18, 22%). Evaluation methods included asking the clinicians directly (11/18, 61%), observing user behavior through clinical indicators and use log data (14/18, 78%), and usability testing (4/18, 22%). The studies reported high scores on standard usability questionnaires, favorable surveys, and positive interview feedback. Improvements to underlying clinical indicators were observed in 78% (7/9) of the studies, whereas 22% (2/9) of the studies reported no significant changes in performance.
Conclusions: This scoping review maps the current literature on dashboards based on routinely collected clinical indicator data. Although common data visualization techniques and clinical indicators were used across studies, the dashboards and their evaluations were diverse. Design processes were not documented in enough detail for reproducibility, and interface features to support clinicians in making sense of and reflecting on their personal performance data were lacking.
Affiliation(s)
- Bernard Bucalon: Human Centred Technology Cluster, School of Computer Science, The University of Sydney, Darlington, Australia; Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
- Tim Shaw: Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia; Research in Implementation Science and e-Health Group, Faculty of Medicine and Health, The University of Sydney, Sydney, Australia
- Kerri Brown: Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia; Professional Practice Directorate, The Royal Australasian College of Physicians, Sydney, Australia
- Judy Kay: Human Centred Technology Cluster, School of Computer Science, The University of Sydney, Darlington, Australia; Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
8
Daneman N, Lee SM, Bai H, Bell CM, Bronskill SE, Campitelli MA, Dobell G, Fu L, Garber G, Ivers N, Lam JMC, Langford BJ, Laur C, Morris A, Mulhall C, Pinto R, Saxena FE, Schwartz KL, Brown KA. Population-Wide Peer Comparison Audit and Feedback to Reduce Antibiotic Initiation and Duration in Long-Term Care Facilities with Embedded Randomized Controlled Trial. Clin Infect Dis 2021; 73:e1296-e1304. PMID: 33754632; DOI: 10.1093/cid/ciab256.
Abstract
BACKGROUND: Antibiotic overprescribing in long-term care settings is driven by prescriber preferences and is associated with preventable harms for residents. We aimed to determine whether peer comparison audit and feedback reporting for physicians reduces antibiotic overprescribing among residents.
METHODS: We employed a province-wide, difference-in-differences study of antibiotic prescribing audit and feedback, with an embedded pragmatic randomized controlled trial (RCT), across all long-term care facilities in Ontario, Canada, in 2019. The study year included 1238 physicians caring for 96,185 residents. In total, 895 (72%) physicians received no feedback; 343 (28%) were enrolled to receive audit and feedback and randomized 1:1 to static or dynamic reports. The primary outcomes were the proportion of residents initiated on an antibiotic and the proportion of antibiotics prolonged beyond 7 days per quarter.
RESULTS: Among all residents, between the first quarter of 2018 and the last quarter of 2019, there were temporal declines in antibiotic initiation (28.4% to 21.3%) and prolonged duration (34.4% to 29.0%). Difference-in-differences analysis confirmed that feedback was associated with a greater decline in prolonged antibiotics (adjusted difference -2.65%, 95% confidence interval [CI] -4.93% to -0.28%, P = .026), but there was no significant difference in antibiotic initiation. The reduction in antibiotic durations corresponded to 335,912 fewer days of treatment. The embedded RCT detected no differences in outcomes between the dynamic and static reports.
CONCLUSIONS: Peer comparison audit and feedback is a pragmatic intervention that can generate small relative reductions in the use of antibiotics for prolonged durations, which translate to large reductions in antibiotic days of treatment across populations.
Clinical Trials Registration: NCT03807466.
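The headline estimate above comes from a difference-in-differences design: the feedback effect is the pre-to-post change in the feedback group minus the pre-to-post change in the no-feedback group. A toy version of that arithmetic follows; the percentages are invented, and the study's adjusted -2.65% came from regression models, not this simple subtraction.

```python
# Toy difference-in-differences calculation. The effect estimate is the
# change over time in the treated group net of the change in the control
# group. Percentages below are invented for illustration.

def did(pre_treated: float, post_treated: float,
        pre_control: float, post_control: float) -> float:
    return (post_treated - pre_treated) - (post_control - pre_control)

# e.g. prolonged-duration rate: 34.0% -> 27.5% with feedback,
# 34.8% -> 31.0% without feedback
print(round(did(34.0, 27.5, 34.8, 31.0), 2))  # → -2.7
```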
Affiliation(s)
- Nick Daneman: Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada; Public Health Ontario, Toronto, Ontario, Canada; Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada; Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, University of Toronto, Toronto, Ontario, Canada
- Samantha M Lee: Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada
- Heming Bai: Ontario Health, Toronto, Ontario, Canada
- Chaim M Bell: Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada; Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, Mount Sinai Hospital, Toronto, Ontario, Canada
- Susan E Bronskill: Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada; Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada; Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, University of Toronto, Toronto, Ontario, Canada; Women's College Hospital Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, Ontario, Canada
- Longdi Fu: Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada
- Gary Garber: Public Health Ontario, Toronto, Ontario, Canada; Department of Medicine, University of Ottawa, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Noah Ivers: Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada; Department of Medicine, University of Toronto, Toronto, Ontario, Canada; Women's College Hospital Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, Ontario, Canada
- Celia Laur: Women's College Hospital Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, Ontario, Canada
- Andrew Morris: Department of Medicine, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, Mount Sinai Hospital, Toronto, Ontario, Canada
- Ruxandra Pinto: Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada
- Farah E Saxena: Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada
- Kevin L Schwartz: Public Health Ontario, Toronto, Ontario, Canada; Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada
- Kevin A Brown: Public Health Ontario, Toronto, Ontario, Canada; Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, Canada
9
Vanstone JR, Patel S, Degelman ML, Abubakari IW, McCann S, Parker R, Ross T. Development and implementation of a clinician report to reduce unnecessary urine drug screen testing in the ED: a quality improvement initiative. Emerg Med J 2021; 39:471-478. PMID: 33980661; PMCID: PMC9132872; DOI: 10.1136/emermed-2020-210009.
Abstract
Background: Unnecessary testing is a problem facing healthcare systems around the world striving to achieve sustainable care. Despite knowing this problem exists, clinicians continue to order tests that do not contribute to patient care. Behavioural and implementation science can help address this problem. Locally, audit and feedback are used to provide information to clinicians about their performance on relevant metrics; however, this is often done without evidence-based methods to optimise uptake. Our objective was to improve the appropriate use of laboratory tests in the ED using evidence-based audit and feedback and behaviour change techniques.
Methods: Using the behaviour change wheel, we implemented an audit and feedback tool that provided information to ED physicians about their use of laboratory tests; specifically, we focused on education and review of the appropriate use of urine drug screen tests. The report was designed in collaboration with end users to help maximise engagement. Following development of the report, audit and feedback sessions were delivered over an 18-month period.
Results: Data on urine drug screen testing were collected continually throughout the intervention period and showed a sustained decrease among ED physicians. Test use dropped from a monthly departmental average of 26 urine drug screen tests per 1000 patient visits to only 8 tests per 1000 patient visits following the initiation of the audit and feedback intervention.
Conclusion: Audit and feedback reduced unnecessary urine drug screen testing in the ED. Regular feedback sessions continuously engaged physicians in the intervention and allowed the implementation team to react to changing priorities and feedback from the clinical group. Including end users in the design of audit and feedback tools was important for maximising physician engagement: it helps physicians adopt a sense of ownership over which metrics to review and provides a key component of the motivation aspect of behaviour change. Departmental leadership is also critical to implementing a successful audit and feedback initiative and achieving sustained behaviour change.
Affiliation(s)
- Jason Robert Vanstone: Stewardship and Clinical Appropriateness, Saskatchewan Health Authority, Regina, Saskatchewan, Canada
- Shivani Patel: Stewardship and Clinical Appropriateness, Saskatchewan Health Authority, Regina, Saskatchewan, Canada
- Michelle L Degelman: Stewardship and Clinical Appropriateness, Saskatchewan Health Authority, Regina, Saskatchewan, Canada
- Ibrahim W Abubakari: Digital Health, Saskatchewan Health Authority, Saskatoon, Saskatchewan, Canada
- Shawn McCann: eHealth Saskatchewan, Regina, Saskatchewan, Canada
- Robert Parker: Stewardship and Clinical Appropriateness, Saskatchewan Health Authority, Regina, Saskatchewan, Canada
- Terry Ross: Emergency Medicine, Saskatchewan Health Authority, Regina, Saskatchewan, Canada