1. Douglas S, Page AC, Moltu C, Kyron M, Satterthwaite T. The Connections Matter: Bi-Directional Learning in Program Evaluation and Practice-Oriented Research. Administration and Policy in Mental Health and Mental Health Services Research 2024; 51:318-335. [PMID: 37768486] [DOI: 10.1007/s10488-023-01304-8]
Abstract
Intended for researchers and clinical leaders, this article suggests that embedded program evaluation is a good fit with the desired features of practice-oriented research. The systematic nature of evaluation that is built into the operational workflow of a practice setting may increase the diversity of methods available to explore processes and outcomes of interest. We propose a novel conceptual framework that uses a human-centered systems lens to foster such embedded evaluation in clinical routine. This approach emphasizes the evaluator-practitioner partnership to build confidence in the bi-directional learning of practice-based evidence with evidence-based practice. The iterative cycles inherent to design thinking are aimed at developing better evaluation questions. The attention to structure and context inherent to systems thinking is intended to support meaningful perspectives in the naturally complex world of health care. Importantly, the combined human-centered systems lens can create greater awareness of the influence of individual and systemic biases that exist in any endeavor or institution that involves people. Recommended tools and strategies include systems mapping, program theory development, and visual facilitation using a logic model to represent the complexity of mental health treatment for communication, shared understanding, and connection to the broader evidence base. To illustrate elements of the proposed conceptual framework, two case examples are drawn from routine outcome monitoring (ROM) and progress feedback. We conclude with questions for future collaboration and research that may strengthen the partnership of evaluators and practitioners as a community of learners in service of local and system-level improvement.
Affiliation(s)
- Susan Douglas: Department of Leadership, Policy and Organizations, Vanderbilt University, Peabody College, 230 Appleton Place PMB #414, Nashville, TN 37203-5721, USA
- Andrew C Page: School of Psychological Science and WA Mental Health Research Centre, University of Western Australia, Perth, Australia
- Christian Moltu: District General Hospital of Førde, Førde, Norway; Department of Health and Caring Science, Western Norway University of Applied Science, Førde, Norway
- Michael Kyron: School of Psychological Science and WA Mental Health Research Centre, University of Western Australia, Perth, Australia
2. Woods O, MacDonell R, Brennan J, Prihodova L, Cushen B, Costello RW, McDonnell TJ. The Irish national chronic obstructive pulmonary disease quality improvement collaborative: an adaptive learning collaborative. BMJ Open Qual 2024; 13:e002356. [PMID: 38191216] [PMCID: PMC10806582] [DOI: 10.1136/bmjoq-2023-002356]
Abstract
BACKGROUND Chronic obstructive pulmonary disease (COPD) is the most common disease-specific cause of adult emergency hospital admissions in Ireland. Preliminary groundwork indicated that treatment of acute exacerbations of COPD (AECOPD) in Ireland is not standardised between public hospitals. Applying Institute for Healthcare Improvement Breakthrough Series and Model for Improvement methodologies, the Royal College of Physicians of Ireland designed and conducted a novel, flexible and adaptive quality improvement (QI) collaborative which, using embedded evaluation, aimed to deliver QI teaching to enable teams to implement bespoke, locally applicable changes to improve and standardise acute COPD care at the presentation, admission and discharge stages within their hospitals. METHODS Eighteen teams from 19 hospitals across Ireland participated over 13 months. QI teaching was facilitated through in-person learning sessions, site visits, and programme manager and coaching support. Teams submitted monthly anonymised patient data (n=10) for 22 measures of AECOPD care for ongoing QI evaluation. A mixed-methods survey was administered at the final learning session to retrospectively evaluate participants' experiences of QI learning and patient care changes. RESULTS Participants reported that they learnt QI and improved patient care during the collaborative. Barriers included increased workload and lack of stakeholder buy-in. Statistically significant improvements (mean±SD) were seen for 'documented dyspnoea, eosinopenia, consolidation, acidaemia and atrial fibrillation (DECAF) assessment' (7.3 (±14.4)% in month (M)1 (n=15 sites); 49.6 (±37.7)% in M13 (n=16 sites); p<0.001, 95% CI 14.3 to 66.7), 'documented diagnosis - spirometry' (42.5 (±30.0)% M1 (n=16 sites); 69.1 (±29.9)% M13 (n=16 sites); p=0.0176, 95% CI 5.0 to 48.2) and 'inhaler technique review completed' (45.6 (±34.1)% M1 (n=16 sites); 76.3 (±33.7)% M13 (n=16 sites); p=0.0131, 95% CI 10.0 to 65.0). 'First respiratory review' demonstrated improved standardisation. CONCLUSION This flexible QI collaborative provided adaptive collaborative learning that enabled participating teams to improve AECOPD patient care based on the unique context of their own hospitals. Findings indicate that involvement in the QI collaborative facilitated teams in achieving their improvements.
Affiliation(s)
- Orla Woods: Research, Royal College of Physicians of Ireland, Dublin 2, Ireland
- Rachel MacDonell: Quality Improvement, Royal College of Physicians of Ireland, Dublin 2, Ireland
- John Brennan: Quality Improvement, Royal College of Physicians of Ireland, Dublin 2, Ireland
- Lucia Prihodova: Research, Royal College of Physicians of Ireland, Dublin 2, Ireland
- Breda Cushen: Respiratory Medicine, Beaumont Hospital, Dublin 9, Ireland
- Timothy J McDonnell: National Clinical Programme for Respiratory, Health Service Executive, Dublin 8, Ireland
3. Dusabe C, Abimpaye M, Kabarungi N, Uwamahoro MD. Monitoring, evaluation and accountability evidence use for design, adaptation, and scale-up of an early childhood development program in Rwanda. Front Public Health 2023; 11:1165353. [PMID: 37588121] [PMCID: PMC10426743] [DOI: 10.3389/fpubh.2023.1165353]
Abstract
Introduction The first three years of a child's life are the most critical for child development and have an impact on the child's future achievement. Young children's healthy development depends on nurturing care that ensures health, nutrition, responsive caregiving, safety, and security. Parents and other adult caregivers play a critical role in moderating children's early experiences, which have a lasting impact, positive or negative, on children's futures. Parenting education programs are proven to improve parental skills, capacity, and efficacy in ways that support improved child development outcomes. Yet most parents in low- and middle-income countries such as Rwanda lack access to information and skills on how to support their children's holistic development. In response, Save the Children implemented the First Steps "Intera za Mbere" holistic parenting education project in Rwanda from 2014 to 2021. This paper reflects on how monitoring, evaluation, accountability, and learning (MEAL) approaches were applied throughout the project cycle and their impact on program improvement and on national policy and advocacy. It also explores how aspirations for measurement for change, considerations for innovation uptake, and frameworks for learning about improvement are reflected in this project. Methods The project utilized qualitative and quantitative MEAL across the program cycle. Action research at the start of the project identified promoters and inhibitors of high-quality nurturing care and program delivery modalities. The project utilized a randomized controlled trial to provide insight into the components that work best for parenting education. Evidence from surveys conducted remotely via phones was used to inform COVID-19 adaptations of the program. Results The application of MEAL evidence led to the successful development and improvement of the program. At the policy level, evidence from the project influenced the review of the 2016 National Integrated ECD policy and the development of the national parenting education framework. Conclusion The regular use of evidence from MEAL is critical for program improvement, scale-up, and policy influence.
Affiliation(s)
- Caroline Dusabe: Save the Children Australia, Melbourne, VIC, Australia; Save the Children (Rwanda), Kigali, Rwanda
4. Wurz A, Bean C, Shaikh M, Culos-Reed SN, Jung ME. From laboratory to community: Three examples of moving evidence-based physical activity into practice in Canada. Health & Social Care in the Community 2022; 30:e1690-e1700. [PMID: 34623004] [DOI: 10.1111/hsc.13596]
Abstract
Physical activity (PA) is important for enhancing and sustaining people's health and well-being. Although a number of efficacious PA interventions have been developed, few have been translated from research into practice. Consequently, the knowledge-to-practice gap continues to grow, leaving many individuals unable to access evidence-based PA opportunities. This gap may be particularly relevant for those who grapple with poor health due to intrapersonal, interpersonal, cultural and system-level barriers that limit their access to evidence-based PA opportunities. Implementation efforts designed to bring research into real-world settings may bridge the knowledge-to-practice gap. Yet, cultivating quality partnerships and ensuring effectiveness, methodological rigour and scalability in real-world settings can be difficult. Furthermore, researchers seldom publish examples of how they addressed these challenges and translated their evidence-based PA opportunities into practice. Herein, we describe three cases of successful PA implementation among diverse populations: (a) individuals affected by cancer, (b) adults living with prediabetes, and (c) children from under-resourced communities. Commonalities across cases include guiding theories and frameworks, the strategies to facilitate and maintain partnerships, and scalability and sustainability plans. Practical tips and recommendations are provided to spur research and translation efforts that consider implementation from the outset, ultimately ensuring that people receive the benefits PA can confer.
Affiliation(s)
- Amanda Wurz: Faculty of Kinesiology, University of Calgary, Calgary, Canada
- Corliss Bean: Department of Recreation & Leisure Studies, Brock University, St. Catharines, Canada
- Majidullah Shaikh: School of Human Kinetics, Faculty of Health Sciences, University of Ottawa, Ottawa, Canada
- S Nicole Culos-Reed: Faculty of Kinesiology, University of Calgary, Calgary, Canada; Department of Oncology, Cumming School of Medicine, University of Calgary, Calgary, Canada; Department of Psychosocial Resources, Tom Baker Cancer Centre, Calgary, Canada
- Mary E Jung: School of Health and Exercise Sciences, The University of British Columbia, Kelowna, Canada
5. Cullen J, Childerhouse P, McBain L. Contextual antecedents of quality improvement: a comparative case study in rural, urban and Kaupapa Māori general practice. J Prim Health Care 2022; 14:179-186. [PMID: 35771707] [DOI: 10.1071/hc22012]
Abstract
Introduction The impact of contextual factors on primary health-care quality improvement is significant. In-depth research is required to identify the key contextual factors that influence quality improvement initiatives, in order to develop high-performing primary health-care systems. Aim This research seeks to answer two questions: what are the contextual factors influencing primary care improvement initiatives, and how do contextual factors, the quality improvement initiative and the implementation process influence one another and the overall improvement outcomes? Methods A multi-case study methodology was used to explore the complexities of the phenomena in situ. Three sites where successful quality improvement had occurred were selected by purposeful theoretical sampling to provide a sample of rural, urban and Kaupapa Māori general practice settings typical of the New Zealand environment. Semi-structured interviews were conducted with team members and triangulated with secondary data provided by the organisations. Results The quality improvement topic and the approach taken were intrinsically linked to context. Sites reported success in achieving the desired outcomes, benefitting patients, the practice and staff. Teams did not use formal improvement methods, instead relying on established relationships and elements of change management methods. In all three cases, culture was a large component of why and how these initiatives were successful. Discussion Intrinsic motivation was generated by community connections and networks. This, combined with a learning climate generated by distributed leadership and teamwork, enabled success. Iterative reflection and sensemaking processes were able to deliver quality improvement success in primary care without the use of formal improvement methods.
Affiliation(s)
- Jane Cullen: Massey University, Palmerston North, New Zealand
- Paul Childerhouse: Massey University, Palmerston North, New Zealand; Department of Supply Chain Management, College of Business and Law, RMIT, Melbourne, Australia
- Lynn McBain: Department of Primary Health Care and General Practice, University of Otago, Wellington, New Zealand
6. Carnahan E, Gurley N, Asiimwe G, Chilundo B, Duber HC, Faye A, Kamya C, Mpanya G, Nagasha S, Phillips D, Salisbury N, Shearer J, Shelley K. Lessons Learned From Implementing Prospective, Multicountry Mixed-Methods Evaluations for Gavi and the Global Fund. Global Health: Science and Practice 2020; 8:771-782. [PMID: 33361241] [PMCID: PMC7784079] [DOI: 10.9745/ghsp-d-20-00126]
Abstract
Lessons learned from implementing evaluations for Gavi, the Vaccine Alliance and the Global Fund to Fight AIDS, Tuberculosis and Malaria can help inform the design and implementation of ongoing or future evaluations of complex interventions. We share 5 lessons distilled from over 7 years of experience implementing evaluations in 7 countries. Introduction: As global health programs have become increasingly complex, corresponding evaluations must be designed to assess the full complexity of these programs. Gavi and the Global Fund have commissioned 2 such evaluations to assess the full spectrum of their investments using a prospective mixed-methods approach. We aim to describe lessons learned from implementing these evaluations. Methods: This article presents a synthesis of lessons learned based on the Gavi and Global Fund prospective mixed-methods evaluations, with each evaluation considered a case study. The lessons are based on the evaluation team's experience from over 7 years (2013-2020) implementing these evaluations. The Centers for Disease Control and Prevention Framework for Evaluation in Public Health was used to ground the identification of lessons learned. Results: We identified 5 lessons learned that build on existing evaluation best practices and include a mix of practical and conceptual considerations. The lessons cover the importance of (1) including an inception phase to engage stakeholders and inform a relevant, useful evaluation design; (2) aligning on the degree to which the evaluation is embedded in the program implementation; (3) monitoring programmatic, organizational, or contextual changes and adapting the evaluation accordingly; (4) hiring evaluators with mixed-methods expertise and using tools and approaches that facilitate mixing methods; and (5) contextualizing recommendations and clearly communicating their underlying strength of evidence. Conclusion: Global health initiatives, particularly those leveraging complex interventions, should consider embedding evaluations to understand how and why the programs are working. These initiatives can learn from the lessons presented here to inform the design and implementation of such evaluations.
Affiliation(s)
- Herbert C Duber: Institute for Health Metrics and Evaluation, University of Washington, Seattle, WA, USA; Department of Emergency Medicine, University of Washington, Seattle, WA, USA
- Adama Faye: Institut de Santé et Développement/University Cheikh Anta Diop, Dakar, Senegal
- Carol Kamya: Infectious Diseases Research Collaboration, Kampala, Uganda
- David Phillips: Institute for Health Metrics and Evaluation, University of Washington, Seattle, WA, USA
7. Lamé G, Crowe S, Barclay M. "What's the evidence?"—Towards more empirical evaluations of the impact of OR interventions in healthcare. Health Syst (Basingstoke) 2020; 11:59-67. [PMID: 35127059] [PMCID: PMC8812794] [DOI: 10.1080/20476965.2020.1857663]
Abstract
Despite an increasing number of papers reporting applications of operational research (OR) to problems in healthcare, there remains little empirical evidence of OR improving healthcare delivery in practice. Without such evidence it is harder both to justify the usefulness of OR to a healthcare audience and to learn and continuously improve our approaches. To progress, we need to build the evidence-base on whether and how OR improves healthcare delivery through careful empirical evaluation. This position paper reviews evaluation standards in healthcare improvement research and dispels some common myths about evaluation. It highlights the current lack of robust evaluation of healthcare OR and makes the case for addressing this. It then proposes possible ways for building better empirical evaluations of OR interventions in healthcare.
Affiliation(s)
- Guillaume Lamé: The Healthcare Improvement Studies Institute (THIS Institute), University of Cambridge, Cambridge, UK; Laboratoire Génie Industriel, Université Paris-Saclay, CentraleSupélec, Gif-sur-Yvette, France
- Sonya Crowe: Clinical Operational Research Unit, University College London, London, UK
- Matthew Barclay: The Healthcare Improvement Studies Institute (THIS Institute), University of Cambridge, Cambridge, UK
8.
Abstract
PURPOSE The embedded researcher is a healthcare-academic partnership model in which the researcher is engaged as a core member of the healthcare organisation. While this model has potential to support evidence translation, there is a paucity of evidence in relation to the specific challenges and strengths of the model. The aim of this study was to map the barriers and enablers of the model from the perspective of embedded researchers in Australian healthcare settings, and to compare the responses of embedded researchers with a primary healthcare versus a primary academic affiliation. DESIGN/METHODOLOGY/APPROACH 104 embedded researchers from Australian healthcare organisations completed an online survey. Both purposive and snowball sampling strategies were used to identify current and former embedded researchers. This paper reports on responses to the open-ended questions in relation to barriers and enablers of the role, the available support, and recommendations for change. Thematic analysis was used to describe and interpret the breadth and depth of responses and common themes. FINDINGS Key barriers to being an embedded researcher in a public hospital included a lack of research infrastructure and funding in the healthcare organisation, a culture that does not value research, a lack of leadership and support to undertake research, limited access to mentoring and career progression, and issues associated with having a dual affiliation. Key enablers included supportive colleagues and executive leaders, personal commitment to research, and research collaboration including formal health-academic partnerships. RESEARCH LIMITATIONS/IMPLICATIONS To support the embedded researcher model, broader system changes are required, including greater investment in research infrastructure and healthcare-academic partnerships with formal agreements. Significant changes are required so that healthcare organisations appreciate the value of research and support both clinicians and researchers to engage in research that is important to their local population. ORIGINALITY/VALUE This is the first study to systematically investigate the enablers and challenges of the embedded researcher model.
9. Coates D, Mickan S. The embedded researcher model in Australian healthcare settings: comparison by degree of "embeddedness". Transl Res 2020; 218:29-42. [PMID: 31759948] [DOI: 10.1016/j.trsl.2019.10.005]
Abstract
The embedded researcher model is a health-academic partnership where researchers are core members of a healthcare organization, with an aim to support evidence translation. The purpose of this study was to describe the characteristics and experiences of embedded researchers in Australian healthcare settings, and investigate how the model is experienced differently based on the level of "embeddedness." This exploratory study utilized a purpose-designed online survey. Responses were described using Word and Excel and analyzed using SPSS. To investigate how the model was experienced based on the level of "embeddedness," we tested for differences in responses between respondents with primary academic vs healthcare affiliations. A total of 104 embedded researchers from nursing and midwifery, allied health and medicine completed the survey, with equal numbers reporting a primary academic vs primary healthcare affiliation. Most indicated that research is a strategic objective of the healthcare organization (85.9%) yet almost a third (31%) reported that research outputs were not measured. While 60% agreed that clinical practice informed by research was valued, only 28% reported having adequate resources. Of those with a formal dual affiliation over a quarter reported conflict between expectations of the healthcare and academic organizations. Respondents with a primary academic affiliation were older, more qualified, had more research experience, had been in the role longer, and had more positive perceptions of the research culture of healthcare organizations. This study provides a starting point for healthcare organizations and academic institutions to partner in the further development and implementation of this model.
Affiliation(s)
- Dominiek Coates: University of Technology Sydney, Faculty of Health, Sydney, New South Wales, Australia
- Sharon Mickan: Griffith University, Griffith Health, Brisbane, Australia
10. Doyle AM, Mulhern E, Rosen J, Appleford G, Atchison C, Bottomley C, Hargreaves JR, Weinberger M. Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360. Gates Open Res 2019; 3:1472. [PMID: 31363715] [PMCID: PMC6635668] [DOI: 10.12688/gatesopenres.12998.2]
Abstract
Adolescents 360 (A360) is a four-year initiative (2016-2020) to increase 15-19-year-old girls' use of modern contraception in Nigeria, Ethiopia and Tanzania. The innovative A360 approach is led by human-centred design (HCD), combined with social marketing, developmental neuroscience, public health, sociocultural anthropology and youth engagement 'lenses', and aims to create context-specific, youth-driven solutions that respond to the needs of adolescent girls. The A360 external evaluation includes a process evaluation, a quasi-experimental outcome evaluation, and a cost-effectiveness study. We reflect on evaluation opportunities and challenges associated with measuring the application and impact of this novel HCD-led design approach. For the process evaluation, participant observations were key to capturing the depth of the fast-paced, highly iterative HCD process and to understanding decision-making within the design process. The evaluation team had to be flexible and align closely with the work plan of the implementers. The HCD process meant that key information, such as intervention components, settings, and eligible populations, was unclear and changed during outcome evaluation and cost-effectiveness protocol development. This resulted in a more time-consuming and resource-intensive study design process. Because considerable time and resources went into the creation of a new design approach, separating one-off "creation" costs from the costs associated with actually implementing the programme was challenging. Opportunities included the potential to inform programmatic decision-making in real time to ensure that interventions adequately met the contextualized needs in targeted areas. Robust evaluation of interventions designed using HCD, a promising and increasingly popular approach, is warranted yet challenging. Future HCD-based initiatives should consider a phased evaluation, focusing initially on programme theory refinement and process evaluation and then, when the intervention programme details are clearer, following with outcome evaluation and cost-effectiveness analysis. A phased approach would delay the availability of evaluation findings but would allow for a more appropriate and tailored evaluation design.
Affiliation(s)
- Aoife M. Doyle: London School of Hygiene & Tropical Medicine, London, WC1E 7HT, UK
11. Hargreaves S, Rustage K, Nellums LB, Bardfield JE, Agins B, Barker P, Massoud MR, Ford NP, Doherty M, Dougherty G, Singh S. Do Quality Improvement Initiatives Improve Outcomes for Patients in Antiretroviral Programs in Low- and Middle-Income Countries? A Systematic Review. J Acquir Immune Defic Syndr 2019; 81:487-496. [PMID: 31149954] [PMCID: PMC6738622] [DOI: 10.1097/qai.0000000000002085]
Abstract
BACKGROUND There have been a range of quality improvement (QI) and quality assurance initiatives in low- and middle-income countries to improve antiretroviral therapy (ART) treatment outcomes for people living with HIV. To date, these initiatives have not been systematically assessed and little is known about how effective, cost-effective, or sustainable these strategies are in improving clinical outcomes. METHODS We conducted a systematic review adhering to PRISMA guidelines (PROSPERO ID: CRD42017071848), searching PubMed, MEDLINE, Embase, Web of Science, and the Cochrane database of controlled trials for articles reporting on the effectiveness of QI and quality assurance initiatives in HIV programs in low- and middle-income countries in relation to ART uptake, retention in care, adherence, viral load suppression, mortality, and other outcomes including cost-effectiveness and long-term sustainability. RESULTS One thousand eight hundred sixty articles were found, of which 29 were included. QI approaches were categorized as follows: (1) health system approaches using QI methods; (2) QI learning networks including collaboratives; (3) standard-based methods that use QI tools to improve performance gaps; and (4) campaigns using QI methods. The greatest improvements were seen in ART uptake [median increase of 14.0%; interquartile range (IQR) -9.0 to 29.3], adherence [median increase of 22.0% (IQR -7.0 to 25.0)], and viral load suppression [median increase 26.0% (IQR -8.0 to 26.0)]. CONCLUSIONS QI interventions can be effective in improving clinical outcomes; however, there was significant variability, making it challenging to identify which aspects of interventions lead to clinical improvements. Standardizing reporting and assessment of QI initiatives is needed, supported by national quality policies and directorates, and robust research.
Affiliation(s)
- Sally Hargreaves: The Institute for Infection and Immunity, St George's, University of London, London, United Kingdom; International Health Unit, Section of Infectious Diseases and Immunity, Imperial College London, London, United Kingdom
- Keiran Rustage: The Institute for Infection and Immunity, St George's, University of London, London, United Kingdom; International Health Unit, Section of Infectious Diseases and Immunity, Imperial College London, London, United Kingdom
- Laura B. Nellums: The Institute for Infection and Immunity, St George's, University of London, London, United Kingdom; International Health Unit, Section of Infectious Diseases and Immunity, Imperial College London, London, United Kingdom
- Joshua E. Bardfield: Healthqual, Institute for Global Health Sciences, University of California San Francisco, San Francisco, CA
- Bruce Agins: Healthqual, Institute for Global Health Sciences, University of California San Francisco, San Francisco, CA
- Pierre Barker: URC, Chevy Chase, MD; Institute for Healthcare Improvement, Boston, MA
- Nathan P. Ford: Department of HIV/AIDS, World Health Organization, Geneva, Switzerland
- Meg Doherty: Department of HIV/AIDS, World Health Organization, Geneva, Switzerland
- Satvinder Singh: Department of HIV/AIDS, World Health Organization, Geneva, Switzerland
12. Facilitators and barriers to implementing task shifting: Expanding the scope of practice of clinical technologists in the Netherlands. Health Policy 2019; 123:1076-1082. [PMID: 31443982] [DOI: 10.1016/j.healthpol.2019.07.003]
Abstract
Despite recent studies confirming that task shifting is both safe and effective, its implementation has proven difficult in practice. This is also the case in the Netherlands, where legal barriers enforcing strict professional boundaries have historically limited task shifting. In recent years, Dutch policymakers have experimented with temporary expanded scopes of practice (ESP) for several professional groups, with the aim of facilitating task shifting in order to increase the overall effectiveness and efficiency of health care. The Clinical Technologist (CT) is an emerging professional group that has received such a temporary ESP pending an evaluation. This paper reports the qualitative findings on the implementation process of providing CTs with a temporary ESP. Data collection consisted of 69 semi-structured interviews, 3 focus group interviews, and 9 participant observations, conducted between September 2015 and October 2017. Analysis followed an 'editing analysis style', whereby data were categorized using the conceptual framework of Grol and Wensing's implementation model. The study suggests that social features are of great importance when implementing task shifting. In situations with few social barriers, organizational and administrative barriers seem to be less dominant, thereby expediting the overall implementation process. Consequently, we recommend that policymakers prioritize social features over organizational features when implementing task shifting.
13
Churruca K, Ludlow K, Taylor N, Long JC, Best S, Braithwaite J. The time has come: Embedded implementation research for health care improvement. J Eval Clin Pract 2019; 25:373-380. [PMID: 30632246 DOI: 10.1111/jep.13100] [Citation(s) in RCA: 63] [Impact Index Per Article: 12.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/30/2018] [Revised: 12/19/2018] [Accepted: 12/22/2018] [Indexed: 12/23/2022]
Abstract
RATIONALE, AIMS, AND OBJECTIVES The field of implementation science has developed in response to slow and inconsistent translation of evidence into practice. Despite utilizing increasingly sophisticated approaches to implementation, including applying a complexity science lens and conducting realist evaluations, challenges remain in achieving the outcomes hoped for from implementation efforts. These include gaining access and buy-in from those implementing the change and accounting for the influence of local context. One emerging approach to address these challenges is embedded implementation research: a collaborative, adaptive approach to improvement in which researchers and implementers work together in situ from the outset of, and throughout, an implementation project. Both groups can benefit from the collaboration: it increases the rigor of evaluation, provides opportunities to improve the intervention through direct feedback, and promotes better on-the-ground understanding of the change process. We aimed to examine the potential benefits, and some of the challenges, of increased embeddedness. METHOD We performed a multi-case analysis of implementation research projects that varied by degree of embeddedness. RESULTS Embedded implementation research may offer a range of advantages over dichotomized research-practice designs, including better understanding of local context and direct feedback to improve the implementation along the way. We present a model that spans four approaches: dichotomized research-practice, collaborative linking-up, partially embedded, and deep immersion. CONCLUSION Embedded implementation research holds promise in comparison to traditional dichotomized research-practice designs, in which the research is external to the implementation and provides a summative evaluation. We are only beginning to understand how such partnerships operate in practice and what makes them successful. Our analysis suggests the time has come to consider such approaches.
Affiliation(s)
- Kate Churruca
- Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
- Kristiana Ludlow
- Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
- Janet C Long
- Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
- Stephanie Best
- Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
- Murdoch Children's Research Institute, Melbourne, Australia
- Jeffrey Braithwaite
- Centre for Healthcare Resilience and Implementation Science, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
14
Doyle AM, Mulhern E, Rosen J, Appleford G, Atchison C, Bottomley C, Hargreaves JR, Weinberger M. Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360. Gates Open Res 2019; 3:1472. [PMID: 31363715 PMCID: PMC6635668 DOI: 10.12688/gatesopenres.12998.1] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 05/10/2019] [Indexed: 11/20/2022] Open
Abstract
Adolescents 360 (A360) is a four-year initiative (2016-2020) to increase 15-19-year-old girls' use of modern contraception in Nigeria, Ethiopia and Tanzania. The innovative A360 approach is led by human-centred design (HCD), combined with social marketing, developmental neuroscience, public health, sociocultural anthropology and youth engagement 'lenses', and aims to create context-specific, youth-driven solutions that respond to the needs of adolescent girls. The A360 external evaluation includes a process evaluation, a quasi-experimental outcome evaluation, and a cost-effectiveness study. We reflect on evaluation opportunities and challenges associated with measuring the application and impact of this novel HCD-led design approach. For the process evaluation, participant observations were key to capturing the depth of the fast-paced, highly iterative HCD process and to understanding decision-making within the design process. The evaluation team had to be flexible and align closely with the work plan of the implementers. The HCD process meant that key information, such as intervention components, settings, and eligible populations, was unclear and changed during the development of the outcome evaluation and cost-effectiveness protocols. This resulted in a more time-consuming and resource-intensive study design process. Because substantial time and resources went into creating a new design approach, separating one-off 'creation' costs from the costs associated with actually implementing the programme was challenging. Opportunities included the potential to inform programmatic decision-making in real time to ensure that interventions adequately met the contextualized needs of targeted areas. Robust evaluation of interventions designed using HCD, a promising and increasingly popular approach, is warranted yet challenging. Future HCD-based initiatives should consider a phased evaluation, focusing initially on programme theory refinement and process evaluation and then, when the intervention programme details are clearer, following with outcome evaluation and cost-effectiveness analysis. A phased approach would delay the availability of evaluation findings but would allow for a more appropriate and tailored evaluation design.
Affiliation(s)
- Aoife M. Doyle
- London School of Hygiene & Tropical Medicine, London, WC1E 7HT, UK