1
Tran G, Kelly B, Hammersley M, Norman J, Okely A. The utility of website-based quality improvement tools for health professionals: a systematic review. Int J Qual Health Care 2024; 36:mzae068. [PMID: 38985665] [PMCID: PMC11277856] [DOI: 10.1093/intqhc/mzae068]
Abstract
As technology continues to advance, it is important to understand how website-based tools can support quality improvement. Website-based tools refer to resources, such as toolkits, that users can access and use autonomously through a dedicated website. This review examined how website-based tools can support healthcare professionals with quality improvement, including the optimal processes for developing tools and the elements of an effective tool. A systematic search of seven databases was conducted for articles published between January 2012 and January 2024. Articles were included if they were peer reviewed, written in English, based in health settings, and reported the development or evaluation of a quality improvement website-based tool for professionals. A narrative synthesis was conducted using NVivo, and risk of bias was assessed using the Mixed Methods Appraisal Tool. All papers were independently screened and coded by two authors using Braun and Clarke's six-phase conceptual framework. Eighteen studies met the inclusion criteria. The themes identified were tool development processes, quality improvement mechanisms, and barriers and facilitators to tool usage. Digitalizing existing quality improvement processes (n = 7), identifying gaps in practice (n = 6), and contributing to professional development (n = 3) were common quality improvement aims. Tools were associated with reported improvements in the accuracy and efficiency of clinical tasks, adherence to guidelines, facilitation of reflective practice, and provision of tailored feedback for continuous quality improvement. Common features were educational resources (n = 7) and assisting users to assess current practices against standards or recommendations (n = 6), which supported professionals in achieving better clinical outcomes, increased professional satisfaction, and streamlined workflows in various settings.
Studies reported facilitators to tool usage, including relevance to practice, accessibility, and support for multidisciplinary action, which made these tools practical and time-efficient for healthcare. However, barriers were also reported, including tools being time-consuming, irrelevant to practice, or difficult to use, and a lack of organizational engagement. Almost all tools were co-developed with stakeholders, although the co-design approaches varied, reflecting different levels of stakeholder engagement and adoption of co-design methodologies. Of note, the quality of the included studies was low. These findings offer valuable insights for the future development of quality improvement website-based tools in healthcare. Recommendations include co-developing tools with healthcare professionals, focusing on practical usability, and addressing common barriers to enhance engagement and effectiveness in improving healthcare quality. Randomized controlled trials are warranted to provide objective evidence of tool efficacy.
Affiliation(s)
- Georgie Tran
- Early Start, Faculty of the Arts, Social Sciences and Humanities, University of Wollongong, Wollongong, NSW 2522, Australia
- Bridget Kelly
- Early Start, Faculty of the Arts, Social Sciences and Humanities, University of Wollongong, Wollongong, NSW 2522, Australia
- Megan Hammersley
- Early Start, Faculty of the Arts, Social Sciences and Humanities, University of Wollongong, Wollongong, NSW 2522, Australia
- Jennifer Norman
- Health Promotion Service, Illawarra Shoalhaven Local Health District, Warrawong, NSW 2502, Australia
- Anthony Okely
- Early Start, Faculty of the Arts, Social Sciences and Humanities, University of Wollongong, Wollongong, NSW 2522, Australia
2
Zhuge R, Ruzieva A, Chang N, Wang X, Qi X, Wang Q, Wang Y, Kang Z, Liu J, Wu Q. Evaluation of emergency drills effectiveness by center of disease prevention and control staff in Heilongjiang Province, China: an empirical study using the logistic-ISM model. Front Public Health 2024; 12:1305426. [PMID: 38481835] [PMCID: PMC10936000] [DOI: 10.3389/fpubh.2024.1305426]
Abstract
Introduction: Emergency drills are critical practices that can improve preparedness for crisis situations. This study aims to understand how staff at the Centers for Disease Control and Prevention (CDC) in Heilongjiang Province, China, evaluate the effectiveness of emergency drills, and to identify factors that could influence personnel's appraisal of outcomes throughout the emergency drill process. Methods: A cross-sectional survey was conducted among public health professionals from various CDCs in Heilongjiang, a northeastern Chinese province. Binary logistic regression analysis identified the factors associated with CDC staff's assessment of emergency drill efficacy, while Interpretative Structural Modeling (ISM) elucidated the hierarchical structure among the influencing factors. Results: 53.3% (95% CI = 50.6-55.4) of participants perceived the effectiveness of the emergency drills as low. Binary logistic regression analysis revealed that the following adverse factors increased the risk of a lower evaluation: lack of equipment and poor facilities (OR = 2.324, 95% CI = 1.884-2.867), poor training quality (OR = 1.765, 95% CI = 1.445-2.115), low leadership focus (OR = 1.585, 95% CI = 1.275-1.971), insufficient training frequency (OR = 1.539, 95% CI = 1.258-1.882), low skill in designing emergency drill plans (OR = 1.494, 95% CI = 1.180-1.890), lack of funding (OR = 1.407, 95% CI = 1.111-1.781), and poor coordination between departments (OR = 1.335, 95% CI = 1.085-1.641). The ISM revealed the hierarchical relationships among the influential factors, which were classified into three levels: Surface, Middle, and Bottom. The Surface Level factors were training frequency, training quality, leaders' focus, and inter-departmental coordination; the Middle Level factors were equipment availability and skill in designing emergency drill plans; and the Bottom Level factor was funding guarantee.
Discussion: This survey revealed that over half of the CDC staff rated the effectiveness of public health emergency drills as low. The logistic-ISM model results indicated that the evaluation of drill effectiveness was negatively influenced by insufficient facility and equipment support, financial constraints, lack of departmental coordination, and inadequate leadership attention. Among these factors, funding guarantee was the most fundamental. These findings call for strategic decisions to increase funding for equipment, leadership training support, and effective emergency coordination.
Affiliation(s)
- Ruiqian Zhuge
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Adelina Ruzieva
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Na Chang
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Xing Wang
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Xinye Qi
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Qunkai Wang
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Yuxuan Wang
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Zheng Kang
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
- Jingjing Liu
- School of Public Health, Anhui University of Science and Technology, Huainan, Anhui, China
- Key Laboratory of Industrial Dust Prevention and Control, Occupational Safety and Health, Ministry of Education, Anhui University of Science and Technology, Huainan, Anhui, China
- Anhui Institute of Occupational Safety and Health, Anhui University of Science and Technology, Huainan, Anhui, China
- Joint Research Center of Occupational Medicine and Health, Institute of Grand Health, Hefei Comprehensive National Science Center, Anhui University of Science and Technology, Huainan, Anhui, China
- Qunhong Wu
- Department of Social Medicine, Health Management College, Harbin Medical University, Harbin, China
3
Developing Public Health Emergency Response Leaders in Incident Management: A Scoping Review of Educational Interventions. Disaster Med Public Health Prep 2021; 16:2149-2178. [PMID: 34462032] [DOI: 10.1017/dmp.2021.164]
Abstract
During emergency responses, public health leaders frequently serve in incident management roles that differ from their routine job functions. Leaders' familiarity with incident management principles and functions can influence response outcomes; therefore, training and exercises in incident management are often required for public health leaders. To describe existing methods of incident management training and exercises in the literature, we queried 6 English-language databases and found 786 relevant articles. Five themes emerged: (1) experiential learning is an established approach to foster engaging and interactive learning environments and optimize training design; (2) technology-aided decision support tools are increasingly common for crisis decision-making; (3) integration of leadership training across the education continuum is needed to develop public health response leaders; (4) equal emphasis on competency and character is needed to develop capable and adaptable leaders; and (5) consistent evaluation methodologies and metrics are needed to assess the effectiveness of educational interventions. These findings offer important strategic and practical considerations for improving the design and delivery of educational interventions to develop public health emergency response leaders. This review, together with ongoing real-world events, could facilitate further exploration of current practices, emerging trends, and challenges for continuous improvement in developing public health emergency response leaders.
4
Tools for Assessment of Country Preparedness for Public Health Emergencies: A Critical Review. Disaster Med Public Health Prep 2020; 15:431-441. [PMID: 32366350] [PMCID: PMC8532124] [DOI: 10.1017/dmp.2020.13]
Abstract
Recent international communicable disease crises have highlighted the need for countries to assure their preparedness to respond effectively to public health emergencies. The objective of this study was to critically review existing tools that support a country's assessment of its health emergency preparedness. We developed a framework to analyze the expected effectiveness and utility of these tools and, through mixed search strategies, identified 12 tools with relevance to public health emergencies. There was considerable consensus concerning the critical preparedness system elements to be assessed, although their relative emphasis and means of assessment and measurement varied considerably. Several of the tools identified appeared to have reporting requirements as their primary aim, rather than self-assessment by the countries and states using them. Few tools attempted to give an account of their underlying evidence base, and only some were available in a user-friendly electronic modality or included quantitative measures to support the monitoring of system preparedness over time. We conclude that there is still a need for improvement in the tools available for assessing country preparedness for public health emergencies, and for applied research to identify system measures that are valid indicators of system response capability.
5
Public Health Emergency Preparedness System Evaluation Criteria and Performance Metrics: A Review of Contributions of the CDC-Funded Preparedness and Emergency Response Research Centers. Disaster Med Public Health Prep 2018; 13:626-638. [PMID: 30419972] [DOI: 10.1017/dmp.2018.110]
Abstract
OBJECTIVES The US Centers for Disease Control and Prevention (CDC)-funded Preparedness and Emergency Response Research Centers (PERRCs) conducted research from 2008 to 2015 aimed at improving the complex public health emergency preparedness and response (PHEPR) system. This paper summarizes PERRC studies that addressed the development and assessment of criteria for evaluating PHEPR and metrics for measuring its efficiency and effectiveness. METHODS We reviewed 171 PERRC publications indexed in PubMed between 2009 and 2016, deriving from 34 PERRC research projects. We identified publications that addressed the development or assessment of criteria and metrics pertaining to PHEPR systems and describe the evaluation methods used, the tools developed, the system domains evaluated, and the metrics developed or assessed. RESULTS We identified 29 publications from 12 of the 34 PERRC projects that addressed PHEPR system evaluation criteria and metrics. We grouped each study into 1 of 3 system domains, based on the metrics developed or assessed: (1) organizational characteristics (n = 9), (2) emergency response performance (n = 12), and (3) workforce capacity or capability (n = 8). These studies addressed PHEPR system activities including responses to the 2009 H1N1 pandemic and the 2011 tsunami, as well as emergency exercise performance, situational awareness, and workforce willingness to respond. PERRC studies developed or assessed both process and outcome metrics for the PHEPR system. CONCLUSIONS PERRC researchers developed and evaluated a range of PHEPR system evaluation criteria and metrics that should be considered by system partners interested in assessing the efficiency and effectiveness of their activities. Nonetheless, the monitoring and measurement problem in PHEPR is far from solved; the lack of standard measures that can be readily obtained or computed at local levels remains a challenge for the public health preparedness field.
(Disaster Med Public Health Preparedness. 2019;13:626-638).
6
Qari SH, Leinhos MR, Thomas TN, Carbone EG. Overview of the Translation, Dissemination, and Implementation of Public Health Preparedness and Response Research and Training Initiative. Am J Public Health 2018; 108:S355-S362. [PMID: 30260695] [DOI: 10.2105/ajph.2018.304709]
Abstract
We provide an overview of a Centers for Disease Control and Prevention-funded public health preparedness and response (PHPR) research and training initiative to improve public health practice. Our objectives were to accelerate the translation, dissemination, and implementation (TDI) of promising PHPR evidence-based tools and trainings developed by the Preparedness and Emergency Response Research Centers (PERRC) or the Preparedness and Emergency Response Learning Centers (PERLC) between 2008 and 2015. Nine competitive awards were made to seven academic centers to achieve predetermined TDI objectives. The outputs attained by the initiative included: user-friendly online repositories of PERRC and PERLC tools and trainings; training courses that addressed topics; a community resilience manual to synthesize, translate, and implement evidence-based programs; and Web applications that supported legal preparedness, exercise evaluation, and immunization education. The evaluation identified several best practices and potential barriers to implementation. As illustrated by the work in this supplement, the broader awareness and implementation of PERRC preparedness products and PERLC trainings and the continued evaluation of their impact could enhance the PHPR capacity and capability of the nation, which could lead to improved health security.
Affiliation(s)
- Shoukat H Qari, Mary R Leinhos, Tracy N Thomas, Eric G Carbone
- All of the authors are with the Centers for Disease Control and Prevention, Office of Public Health Preparedness and Response, Atlanta, GA
7
Setting Foundations for Developing Disaster Response Metrics. Disaster Med Public Health Prep 2017; 11:505-509. [DOI: 10.1017/dmp.2016.173]
Abstract
There are few reported efforts to define universal disaster response performance measures. Careful examination of responses to past disasters can inform the development of such measures. As a first step toward this goal, we conducted a literature review to identify key factors in responses to 3 recent events with significant loss of human life and economic impact: the 2003 Bam, Iran, earthquake; the 2004 Indian Ocean tsunami; and the 2010 Haiti earthquake. Using the PubMed (National Library of Medicine, Bethesda, MD) database, we identified 710 articles and retained 124 after applying inclusion and exclusion criteria. Seventy-two articles pertained to the Haiti earthquake, 38 to the Indian Ocean tsunami, and 14 to the Bam earthquake. On the basis of this review, we developed an organizational framework for disaster response performance measurement with 5 key disaster response categories: (1) personnel, (2) supplies and equipment, (3) transportation, (4) timeliness and efficiency, and (5) interagency cooperation. Under each of these, and again informed by the literature, we identified subcategories and specific items that could be developed into standardized performance measures. The validity and comprehensiveness of these measures can be tested by applying them to other recent and future disaster responses, after which standardized performance measures can be developed through a consensus process. (Disaster Med Public Health Preparedness. 2017;11:505–509)
8
Development of an Online Toolkit for Measuring Performance in Health Emergency Response Exercises. Prehosp Disaster Med 2015; 30:503-8. [DOI: 10.1017/s1049023x15005117]
Abstract
Introduction: Exercises that simulate emergency scenarios are widely accepted as an essential component of a robust Emergency Preparedness program. Unfortunately, the variability in the quality of the exercises conducted, and the lack of standardized processes to measure performance, has limited the value of exercises in measuring preparedness. Methods: To help health organizations improve the quality and standardization of the performance data they collect during simulated emergencies, a model online exercise evaluation toolkit was developed using performance measures tested in over 60 Emergency Preparedness exercises. The toolkit contains three major components: (1) a database of measures that can be used to assess performance during an emergency response exercise; (2) a standardized data collection tool (form); and (3) a program that populates the data collection tool with the measures selected by the user from the database. The toolkit was pilot tested from January through September 2014 in collaboration with 14 partnering organizations representing 10 public health agencies and four health care agencies from eight states across the US. Exercise planners from the partnering organizations were asked to use the toolkit for their exercise evaluation process and were interviewed to provide feedback on the use of the toolkit, the generated evaluation tool, and the usefulness of the data being gathered for the development of the exercise after-action report. Results: Ninety-three percent (93%) of exercise planners reported that they found the online database of performance measures appropriate for creating exercise evaluation forms, and they stated that they would use it again for future exercises. Seventy-two percent (72%) liked the exercise evaluation form generated from the toolkit, and 93% reported that the data collected with the evaluation form were useful in gauging their organization's performance during the exercise. Seventy-nine percent (79%) of exercise planners preferred the evaluation form generated by the toolkit to other evaluation forms. Conclusion: The results of this project show that users found the newly developed toolkit to be user-friendly and more relevant to measuring specific public health and health care capabilities than other tools currently available. The toolkit may contribute to the further development of a valid approach to exercise performance measurement.
Agboola F, Bernard D, Savoia E, Biddinger PD. Development of an online toolkit for measuring performance in health emergency response exercises. Prehosp Disaster Med. 2015;30(5):503–508.