51. Smith JD, Li DH, Rafferty MR. The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci 2020; 15:84. [PMID: 32988389; PMCID: PMC7523057; DOI: 10.1186/s13012-020-01041-8]
Abstract
BACKGROUND Numerous models, frameworks, and theories exist for specific aspects of implementation research, including determinants, strategies, and outcomes. However, implementation research projects often fail to provide a coherent rationale or justification for how these aspects are selected and tested in relation to one another. Despite the need to better specify the conceptual linkages between the core elements of a project, few tools or methods have been developed to aid in this task. The Implementation Research Logic Model (IRLM) was created for this purpose and to enhance the rigor and transparency of describing the often-complex processes of improving the adoption of evidence-based interventions in healthcare delivery systems. METHODS The IRLM structure and guiding principles were developed through a series of preliminary activities with multiple investigators representing implementation research projects that were diverse in context, research design, and the implementation strategies being evaluated. The utility of the IRLM was evaluated in the course of a 2-day training with over 130 implementation researchers and healthcare delivery system partners. RESULTS Preliminary work with the IRLM produced a core structure and multiple variations for common implementation research designs and situations, as well as guiding principles and suggestions for use. Survey results indicated high utility of the IRLM for multiple purposes, such as improving the rigor and reproducibility of projects; serving as a "roadmap" for how a project is to be carried out; clearly reporting and specifying how a project is to be conducted; and understanding the connections between determinants, strategies, mechanisms, and outcomes. CONCLUSIONS The IRLM is a semi-structured, principle-guided tool designed to improve the specification, rigor, and reproducibility of implementation research projects and the testability of their causal pathways. The IRLM can also aid implementation researchers and implementation partners in the planning and execution of practice change initiatives. Adaptation and refinement of the IRLM are ongoing, as is the development of resources for use and applications to diverse projects, to address the challenges of this complex scientific field.
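Editorial aside: the IRLM's core linkage (determinants → strategies → mechanisms → outcomes) can be pictured as rows of a table. The minimal sketch below is illustrative only; every field value is a hypothetical example, not content from the cited article.

```python
from dataclasses import dataclass, field

@dataclass
class IRLMRow:
    """One row of a simplified Implementation Research Logic Model:
    a determinant linked to the strategies, mechanisms, and outcomes
    selected to address it (all values are hypothetical examples)."""
    determinant: str
    strategies: list = field(default_factory=list)
    mechanisms: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)

# Hypothetical example row, not drawn from the article
row = IRLMRow(
    determinant="Low clinician awareness of the intervention",
    strategies=["Conduct educational meetings", "Distribute educational materials"],
    mechanisms=["Increased knowledge"],
    outcomes=["Adoption", "Fidelity"],
)
```

Laying the model out row by row like this makes the claimed causal pathway for each determinant explicit, which is the specification gap the abstract describes.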
Affiliation(s)
- Justin D Smith
- Department of Population Health Sciences, University of Utah School of Medicine, Salt Lake City, Utah, USA
- Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences, Department of Preventive Medicine, Department of Medical Social Sciences, and Department of Pediatrics, Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA
- Dennis H Li
- Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences, Feinberg School of Medicine; Institute for Sexual and Gender Minority Health and Wellbeing, Northwestern University, Chicago, Illinois, USA
- Miriam R Rafferty
- Shirley Ryan AbilityLab and Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences and Department of Physical Medicine and Rehabilitation, Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA
52. Nguyen MXB, Chu AV, Powell BJ, Tran HV, Nguyen LH, Dao ATM, Pham MD, Vo SH, Bui NH, Dowdy DW, Latkin CA, Lancaster KE, Pence BW, Sripaipan T, Hoffman I, Miller WC, Go VF. Comparing a standard and tailored approach to scaling up an evidence-based intervention for antiretroviral therapy for people who inject drugs in Vietnam: study protocol for a cluster randomized hybrid type III trial. Implement Sci 2020; 15:64. [PMID: 32771017; PMCID: PMC7414564; DOI: 10.1186/s13012-020-01020-z]
Abstract
Background People who inject drugs (PWID) bear a disproportionate burden of HIV infection and experience poor outcomes. A randomized trial demonstrated the efficacy of an integrated System Navigation and Psychosocial Counseling (SNaP) intervention in improving HIV outcomes, including uptake of antiretroviral therapy (ART) and medications for opioid use disorder (MOUD), viral suppression, and mortality. There is limited evidence about how to effectively scale up such interventions. This protocol presents a hybrid type III effectiveness-implementation trial comparing two approaches for scaling up SNaP. We will evaluate the effectiveness and cost of the SNaP implementation approaches, as well as the characteristics of HIV testing sites that achieve successful or unsuccessful implementation of SNaP in Vietnam. Methods Design: In this cluster randomized controlled trial, two approaches to scaling up SNaP for PWID in Vietnam will be compared. HIV testing sites (n = 42) were randomized 1:1 to the standard approach or the tailored approach. Intervention mapping was used to develop implementation strategies for both arms. The standard arm will receive a uniform package of these strategies, while implementation strategies for the tailored arm will be designed to address site-specific needs. Participants: HIV-positive PWID (n = 6200) will be recruited for medical record assessment at baseline; of those, 1500 will be enrolled for detailed assessments at baseline, 12, and 24 months. Site directors and staff at each of the 42 HIV testing sites will complete surveys at baseline, 12, and 24 months. Outcomes: Implementation outcomes (fidelity, penetration, acceptability) and effectiveness outcomes (ART uptake, MOUD uptake, viral suppression) will be compared between the arms. To measure incremental costs, we will conduct an empirical costing study of each arm and the actual process of implementation from a societal perspective. Qualitative and quantitative site-level data will be used to explore key characteristics of HIV testing sites that successfully or unsuccessfully implement the intervention in each arm. Discussion Scaling up evidence-based interventions poses substantial challenges. The proposed trial contributes to the field of implementation science by applying a systematic approach to designing and tailoring implementation strategies, conducting a rigorous comparison of two promising implementation approaches, and assessing their incremental costs. Our study will provide critical guidance to ministries of health worldwide regarding the most effective, cost-efficient approach to SNaP implementation. Trial registration NCT03952520 on ClinicalTrials.gov. Registered 16 May 2019.
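Editorial aside: the 1:1 allocation of the 42 sites described above can be sketched minimally as follows. The site labels, the seed, and the use of simple complete randomization are illustrative assumptions, not details taken from the protocol (which may have used a constrained or stratified scheme).

```python
import random

def randomize_sites(site_ids, seed=2019):
    """Allocate sites 1:1 to 'standard' and 'tailored' arms by simple
    complete randomization: shuffle once, then split in half."""
    rng = random.Random(seed)   # fixed seed for a reproducible allocation
    shuffled = list(site_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"standard": shuffled[:half], "tailored": shuffled[half:]}

# 42 hypothetical site labels, matching the trial's n = 42 testing sites
arms = randomize_sites([f"site-{i:02d}" for i in range(1, 43)])
# Each arm receives 21 sites, with no site in both arms
```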
Affiliation(s)
- Minh X B Nguyen
- Department of Health Behavior, Gillings School of Global Public Health, 135 Dauer Dr, Chapel Hill, NC, 27599, USA
- Department of Epidemiology, Institute of Preventive Medicine and Public Health, 1 Ton That Tung St., Dong Da, Hanoi, Vietnam
- Anh V Chu
- University of North Carolina Project Vietnam, Lot E2 Duong Dinh Nghe St., Cau Giay, Hanoi, Vietnam
- Byron J Powell
- Brown School, Washington University in St. Louis, One Brookings Drive, St. Louis, MO, 63130, USA
- Ha V Tran
- Department of Health Behavior, Gillings School of Global Public Health, 135 Dauer Dr, Chapel Hill, NC, 27599, USA
- University of North Carolina Project Vietnam, Lot E2 Duong Dinh Nghe St., Cau Giay, Hanoi, Vietnam
- Long H Nguyen
- Vietnam Authority of HIV/AIDS Control, Land 8 That Thuyet St., Ba Dinh, Hanoi, Vietnam
- An T M Dao
- Department of Epidemiology, Institute of Preventive Medicine and Public Health, 1 Ton That Tung St., Dong Da, Hanoi, Vietnam
- Manh D Pham
- Vietnam Authority of HIV/AIDS Control, Land 8 That Thuyet St., Ba Dinh, Hanoi, Vietnam
- Son H Vo
- Vietnam Authority of HIV/AIDS Control, Land 8 That Thuyet St., Ba Dinh, Hanoi, Vietnam
- Ngoc H Bui
- Department of Epidemiology, Institute of Preventive Medicine and Public Health, 1 Ton That Tung St., Dong Da, Hanoi, Vietnam
- David W Dowdy
- Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, 615 N Wolfe St, Baltimore, MD, 21205, USA
- Carl A Latkin
- Department of Health, Behavior and Society, Johns Hopkins Bloomberg School of Public Health, 615 N Wolfe St, Baltimore, MD, 21205, USA
- Kathryn E Lancaster
- Department of Epidemiology, College of Public Health, Ohio State University, 250 Cunz Hall, 1841 Neil Ave, Columbus, OH, 43210, USA
- Brian W Pence
- Department of Epidemiology, Gillings School of Global Public Health, 135 Dauer Dr, Chapel Hill, NC, 27599, USA
- Teerada Sripaipan
- Department of Health Behavior, Gillings School of Global Public Health, 135 Dauer Dr, Chapel Hill, NC, 27599, USA
- Irving Hoffman
- Division of Infectious Diseases, UNC School of Medicine, 321 S Columbia St, Chapel Hill, NC, 27516, USA
- William C Miller
- Department of Epidemiology, College of Public Health, Ohio State University, 250 Cunz Hall, 1841 Neil Ave, Columbus, OH, 43210, USA
- Vivian F Go
- Department of Health Behavior, Gillings School of Global Public Health, 135 Dauer Dr, Chapel Hill, NC, 27599, USA
53. Allen P, Pilar M, Walsh-Bailey C, Hooley C, Mazzucca S, Lewis CC, Mettert KD, Dorsey CN, Purtle J, Kepper MM, Baumann AA, Brownson RC. Quantitative measures of health policy implementation determinants and outcomes: a systematic review. Implement Sci 2020; 15:47. [PMID: 32560661; PMCID: PMC7304175; DOI: 10.1186/s13012-020-01007-w]
Abstract
BACKGROUND Public policy has tremendous impacts on population health. While policy development has been extensively studied, policy implementation research is newer and relies largely on qualitative methods. Quantitative measures are needed to disentangle the differential impacts of policy implementation determinants (i.e., barriers and facilitators) and outcomes to ensure intended benefits are realized. Implementation outcomes include acceptability, adoption, appropriateness, compliance/fidelity, feasibility, penetration, sustainability, and costs. This systematic review identified quantitative measures used to assess health policy implementation determinants and outcomes and evaluated the quality of these measures. METHODS Three frameworks guided the review: the Implementation Outcomes Framework (Proctor et al.), the Consolidated Framework for Implementation Research (Damschroder et al.), and the Policy Implementation Determinants Framework (Bullock et al.). Six databases were searched: Medline, CINAHL Plus, PsycInfo, PAIS, ERIC, and Worldwide Political. Searches were limited to English-language, peer-reviewed journal articles published January 1995 to April 2019. Search terms addressed four levels: health, public policy, implementation, and measurement. Empirical studies of public policies addressing physical or behavioral health, with quantitative self-report or archival measures of policy implementation containing at least two items assessing implementation outcomes or determinants, were included. Consensus scoring of the Psychometric and Pragmatic Evidence Rating Scale assessed the quality of measures. RESULTS Database searches yielded 8417 non-duplicate studies, of which 870 (10.3%) underwent full-text screening, yielding 66 included studies. From these studies, 70 unique measures were identified that quantitatively assess implementation outcomes and/or determinants. Acceptability, feasibility, appropriateness, and compliance were the most commonly measured implementation outcomes. Common determinants in the identified measures were organizational culture, implementation climate, and readiness for implementation, each an aspect of the internal setting. Pragmatic quality ranged from adequate to good, with most measures freely available, brief, and at a high school reading level. Few psychometric properties were reported. CONCLUSIONS Well-tested quantitative measures of the implementation internal setting were under-utilized in policy studies. Further development and testing of external context measures are warranted. This review is intended to stimulate measure development and high-quality assessment of health policy implementation outcomes and determinants to help practitioners and researchers spread evidence-informed policies to improve population health. REGISTRATION Not registered.
Affiliation(s)
- Peg Allen
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Meagan Pilar
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Callie Walsh-Bailey
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Cole Hooley
- School of Social Work, Brigham Young University, 2190 FJSB, Provo, UT 84602 USA
- Stephanie Mazzucca
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Cara C. Lewis
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
- Kayne D. Mettert
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
- Caitlin N. Dorsey
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
- Jonathan Purtle
- Department of Health Management & Policy, Drexel University Dornsife School of Public Health, Nesbitt Hall, 3215 Market St, Philadelphia, PA 19104 USA
- Maura M. Kepper
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Ana A. Baumann
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Ross C. Brownson
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, 4921 Parkview Place, Saint Louis, MO 63110 USA
54. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a Compilation of Implementation Strategies to Advance School-Based Implementation Research and Practice. Prev Sci 2020; 20:914-935. [PMID: 31152328; DOI: 10.1007/s11121-019-01017-1]
Abstract
Schools, like other service sectors, are confronted with an implementation gap: the slow adoption and uneven implementation of evidence-based practices (EBPs) as part of routine service delivery, which undermines efforts to promote better youth behavioral health outcomes. Implementation researchers have undertaken systematic efforts to publish taxonomies of implementation strategies (i.e., methods or techniques used to facilitate the uptake, use, and sustainment of EBPs), such as the Expert Recommendations for Implementing Change (ERIC) Project. The 73-strategy ERIC compilation was developed in the context of healthcare and largely informed by research and practice experts who operate in that service sector. Thus, the comprehensibility, contextual appropriateness, and utility of the existing compilation in other service sectors, such as education, remain unknown. The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project to iteratively adapt the ERIC compilation to the educational sector. A seven-step adaptation process resulted in 75 school-adapted strategies. Surface-level changes were made to the majority of the original ERIC strategies (52 out of 73), while five strategies required deeper modifications for adaptation to the school context. Six strategies were deleted and seven new strategies were added based on existing school-based research. The implications of this study's findings for prevention scientists engaged in implementation research (e.g., creating a common nomenclature for implementation strategies) and limitations are discussed.
Affiliation(s)
- Clayton R Cook
- University of Minnesota, Twin Cities, 56 East River Road, Educational Sciences Building, Minneapolis, MN, USA
- Aaron R Lyon
- University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA, 98115, USA
- Jill Locke
- University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA, 98115, USA
- Thomas Waltz
- Eastern Michigan University, 301D Science Complex, Ypsilanti, MI, 48197, USA
- Byron J Powell
- Brown School, Washington University in St. Louis, St. Louis, MO, USA
55. Danish A, Chouinard MC, Aubrey-Bassler K, Burge F, Doucet S, Ramsden VR, Bisson M, Cassidy M, Condran B, Lambert M, Penney C, Sabourin V, Warren M, Hudon C. Protocol for a mixed-method analysis of implementation of case management in primary care for frequent users of healthcare services with chronic diseases and complex care needs. BMJ Open 2020; 10:e038241. [PMID: 32487584; PMCID: PMC7265033; DOI: 10.1136/bmjopen-2020-038241]
Abstract
INTRODUCTION Case management (CM) in a primary care setting is a promising approach to integrating and improving healthcare services and outcomes for patients with chronic conditions and complex care needs who frequently use healthcare services. Despite evidence supporting CM and interest in implementing it in Canada, little is known about how to do this. This research aims to identify the barriers and facilitators to the implementation of a CM intervention in different primary care contexts (objective 1) and to explain the influence of the clinical context on the degree of implementation (objective 2) and on the outcomes of the intervention (objective 3). METHODS AND ANALYSIS A multiple-case embedded mixed-methods study will be conducted on CM implemented in ten primary care clinics across five Canadian provinces. Each clinic will represent a subunit of analysis, detailed through a case history. Cases will be compared and contrasted using multiple analytical approaches. Qualitative data (objectives 1 and 2) from individual semistructured interviews (n=130), focus group discussions (n=20), and participant observation of each clinic (36 hours) will be compared and integrated with quantitative (objective 3) clinical data on services use (n=300) and patient questionnaires (n=300). An evaluation of intervention fidelity will be integrated into the data analysis. ETHICS AND DISSEMINATION This project received approval from the CIUSSS de l'Estrie - CHUS Research Ethics Board (project number MP-31-2019-2830). Results will provide the opportunity to refine the CM intervention and to facilitate effective evaluation, replication, and scale-up. This research provides knowledge on how to respond to the needs of individuals with chronic conditions and complex care needs in a cost-effective way that improves patient-reported outcomes and healthcare use, while ensuring care team well-being. Dissemination of results is planned and executed based on the needs of the various stakeholders involved in the research.
Affiliation(s)
- Alya Danish
- Department of Family Medicine and Emergency Medicine, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Kris Aubrey-Bassler
- Primary Healthcare Research Unit, Memorial University, St. John's, Newfoundland and Labrador, Canada
- Fred Burge
- Department of Family Medicine, Dalhousie University, Halifax, Nova Scotia, Canada
- Shelley Doucet
- Department of Nursing and Health Sciences, University of New Brunswick, Fredericton, New Brunswick, Canada
- Vivian R Ramsden
- Department of Academic Family Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Mathieu Bisson
- Department of Family Medicine and Emergency Medicine, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Monique Cassidy
- Department of Nursing and Health Sciences, University of New Brunswick, Fredericton, New Brunswick, Canada
- Brian Condran
- Department of Family Medicine, Dalhousie University, Halifax, Nova Scotia, Canada
- Mireille Lambert
- Centre intégré universitaire de santé et de services sociaux du Saguenay-Lac-Saint-Jean, Chicoutimi, Quebec, Canada
- Carla Penney
- Primary Healthcare Research Unit, Memorial University, St. John's, Newfoundland and Labrador, Canada
- Mike Warren
- NL-SPOR Support Unit, St. John's, Newfoundland and Labrador, Canada
- Catherine Hudon
- Department of Family Medicine and Emergency Medicine, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Centre hospitalier universitaire de Sherbrooke Research Centre, Sherbrooke, Québec, Canada
56. Hasson H, Leviton L, von Thiele Schwarz U. A typology of useful evidence: approaches to increase the practical value of intervention research. BMC Med Res Methodol 2020; 20:133. [PMID: 32460833; PMCID: PMC7254642; DOI: 10.1186/s12874-020-00992-2]
Abstract
BACKGROUND Too often, studies of evidence-based interventions (EBIs) in preventive, community, and health care are not sufficiently useful to end users (typically practitioners, patients, policymakers, or other researchers). The ways in which intervention studies are conventionally conducted and reported mean that there is often a shortage of information when an EBI is used in practice. This paper invites the research community to consider ways to optimize not only the trustworthiness but also the usefulness of intervention studies. It does so by proposing a typology that offers intervention researchers several approaches to making EBIs more useful. The approaches originate from different research fields and are summarized to highlight their potential benefits from a usefulness perspective. MAIN MESSAGE The typology consists of research approaches that increase the usefulness of EBIs by improving the reporting of four features of intervention studies: (1) the interventions themselves, including core components and appropriate adaptations; (2) strategies to support high-quality implementation of the interventions; (3) generalizations about the evidence in a variety of contexts; and (4) outcomes based on end users' preferences and knowledge. The research approaches fall into three levels: Description, Analysis, and Design. The first level, Description, outlines what types of information about the intervention and its implementation, context, and outcomes can be helpful for end users. Approaches under Analysis offer alternative ways of analyzing data, increasing the precision of the information provided to end users. Approaches under Design involve more radical changes and far-reaching implications for how research can provide more useful information. These approaches partly flip the order of efficacy and effectiveness, focusing not on whether an intervention works in highly controlled and optimal circumstances, but first and foremost on whether it can be implemented and lead to anticipated outcomes in everyday practice. CONCLUSIONS The research community, as well as the end users of research, are invited to consider ways to optimize research's usefulness as well as its trustworthiness. Many of the research approaches in the typology are not new, and their contributions to quality have been described for generations, but their contributions to useful knowledge need more attention.
Affiliation(s)
- Henna Hasson
- Procome research group, Department of Learning, Informatics, Management and Ethics, Medical Management Centre, Karolinska Institutet, SE 171 77, Stockholm, Sweden
- Unit for Implementation and Evaluation, Centre for Epidemiology and Community Medicine (CES), Stockholm County Council, SE 171 29, Stockholm, Sweden
- Ulrica von Thiele Schwarz
- Procome research group, Department of Learning, Informatics, Management and Ethics, Medical Management Centre, Karolinska Institutet, SE 171 77, Stockholm, Sweden
- School of Health, Care and Social Welfare, Mälardalen University, Mälardalen, Sweden
57. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci 2020; 15:28. [PMID: 32370752; PMCID: PMC7201568; DOI: 10.1186/s13012-020-00993-1]
Abstract
BACKGROUND Implementation strategies increase the adoption of evidence-based practices, but they require resources. Although information about implementation costs is critical for decision-makers with budget constraints, cost information is not typically reported in the literature. This is at least partly due to a need for clearly defined, standardized costing methods that can be integrated into implementation effectiveness evaluation efforts. METHODS We present a pragmatic approach to systematically estimating the detailed, specific resource use and costs of implementation strategies. It combines time-driven activity-based costing (TDABC), a business accounting method based on process mapping and known for its practicality, with a leading implementation science framework developed by Proctor and colleagues that guides the specification and reporting of implementation strategies. We illustrate the application of this method using a case study with synthetic data. RESULTS This step-by-step method produces a clear map of the implementation process by specifying the names, actions, actors, and temporality of each implementation strategy; determining the frequency and duration of each action associated with individual strategies; and assigning a dollar value to the resources that each action consumes. The method provides transparent and granular cost estimation, allowing a cost comparison of different implementation strategies. The resulting data allow researchers and stakeholders to understand how specific components of an implementation strategy influence its overall cost. CONCLUSION TDABC can serve as a pragmatic method for estimating the resource use and costs associated with distinct implementation strategies and their individual components. Our use of the Proctor framework for the process-mapping stage of TDABC provides a way to incorporate cost estimation into implementation evaluation and may reduce the burden associated with economic evaluations in implementation science.
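Editorial aside: the costing arithmetic described in the abstract (frequency and duration of each action, with a dollar value assigned to the resources consumed) reduces to a simple sum. The sketch below is illustrative only; the action names, actors, and rates are hypothetical and not taken from the article's case study.

```python
def tdabc_strategy_cost(actions):
    """Total cost of one implementation strategy under TDABC:
    for each action, frequency (times performed) x duration
    (minutes per occurrence) x the actor's cost rate ($/minute)."""
    return sum(a["frequency"] * a["minutes"] * a["rate_per_min"] for a in actions)

# Hypothetical strategy "clinician training" with two specified actions
training_actions = [
    {"action": "deliver workshop", "actor": "trainer",
     "frequency": 2, "minutes": 120, "rate_per_min": 1.50},   # 2 x 120 x $1.50 = $360
    {"action": "attend workshop", "actor": "clinician",
     "frequency": 20, "minutes": 120, "rate_per_min": 1.00},  # 20 x 120 x $1.00 = $2400
]
total_cost = tdabc_strategy_cost(training_actions)  # $2760.00
```

Because each action carries its own frequency, duration, and rate, the same structure supports the component-level cost comparisons the abstract describes.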
Affiliation(s)
- Zuleyha Cidav
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, USA
- David Mandell
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, USA
- Jeffrey Pyne
- Center for Mental Healthcare and Outcomes Research, Central Arkansas Veterans Healthcare System, North Little Rock, AR, USA
- South Central Mental Illness Research, Education and Clinical Center, Central Arkansas Veterans Healthcare System, North Little Rock, AR, USA
- Division of Health Services Research, Department of Psychiatry, College of Medicine, University of Arkansas for Medical Sciences, Little Rock, AR, USA
- Rinad Beidas
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, USA
- Department of Medical Ethics and Health Policy, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Department of Medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Penn Implementation Science Center, Leonard Davis Institute of Health Economics, Philadelphia, USA
- Geoffrey Curran
- Departments of Pharmacy Practice and Psychiatry, University of Arkansas for Medical Sciences, Little Rock, AR, USA
- Center for Implementation Research, University of Arkansas for Medical Sciences, Little Rock, AR, USA
- Central Arkansas Veterans Healthcare System, Little Rock, AR, USA
- Steven Marcus
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, USA
- School of Social Policy and Practice, University of Pennsylvania, Philadelphia, PA, USA
58. Powell BJ, Haley AD, Patel SV, Amaya-Jackson L, Glienke B, Blythe M, Lengnick-Hall R, McCrary S, Beidas RS, Lewis CC, Aarons GA, Wells KB, Saldana L, McKay MM, Weinberger M. Improving the implementation and sustainment of evidence-based practices in community mental health organizations: a study protocol for a matched-pair cluster randomized pilot study of the Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST-IS). Implement Sci Commun 2020; 1. [PMID: 32391524; PMCID: PMC7207049; DOI: 10.1186/s43058-020-00009-5]
Abstract
Background Implementing and sustaining evidence-based programs with fidelity may require multiple implementation strategies tailored to address multi-level, context-specific barriers and facilitators. Ideally, selecting and tailoring implementation strategies should be guided by theory, evidence, and input from relevant stakeholders; however, methods to guide the selection and tailoring of strategies are not well developed. There is a need for more rigorous methods for assessing and prioritizing implementation determinants (barriers and facilitators) and for linking implementation strategies to determinants. The Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST-IS) is an intervention designed to increase the effectiveness of evidence-based practice implementation and sustainment. COAST-IS will enable organizational leaders and clinicians to use Intervention Mapping to select and tailor implementation strategies to address their site-specific needs. Intervention Mapping is a multi-step process that incorporates theory, evidence, and stakeholder perspectives to ensure that implementation strategies effectively address key determinants of change. Methods COAST-IS will be piloted with community mental health organizations that are working to address the needs of children and youth who experience trauma-related emotional or behavioral difficulties by engaging in a learning collaborative to implement an evidence-based psychosocial intervention (trauma-focused cognitive behavioral therapy). Organizations will be matched and then randomized to participate in the learning collaborative only (control) or to receive additional support through COAST-IS. The primary aims of this study are to (1) assess the acceptability, appropriateness, feasibility, and perceived utility of COAST-IS; (2) evaluate the organizational stakeholders' fidelity to the core elements of COAST-IS; and (3) demonstrate the feasibility of testing COAST-IS in a larger effectiveness trial. Discussion COAST-IS is a systematic method that integrates theory, evidence, and stakeholder perspectives to improve the effectiveness and precision of implementation strategies. If effective, COAST-IS has the potential to improve the implementation and sustainment of a wide range of evidence-based practices in mental health and other sectors. Trial registration This study was registered in ClinicalTrials.gov (NCT03799432) on January 10, 2019 (last updated August 5, 2019).
Affiliation(s)
- Byron J Powell
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130, USA; Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Amber D Haley
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Sheila V Patel
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Lisa Amaya-Jackson
- Department of Psychiatry & Behavioral Sciences, Duke University School of Medicine, Durham, NC, USA; National Center for Child Traumatic Stress, Durham, NC, USA; North Carolina Child Treatment Program, Center for Child and Family Health, Durham, NC, USA
- Beverly Glienke
- North Carolina Child Treatment Program, Center for Child and Family Health, Durham, NC, USA
- Mellicent Blythe
- North Carolina Child Treatment Program, Center for Child and Family Health, Durham, NC, USA; School of Social Work, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Rebecca Lengnick-Hall
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130, USA
- Stacey McCrary
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130, USA
- Rinad S Beidas
- Department of Psychiatry, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA; Department of Medical Ethics and Health Policy, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA; Penn Implementation Science Center at the Leonard Davis Institute of Health Economics (PISCE@LDI), University of Pennsylvania, Philadelphia, PA, USA
- Cara C Lewis
- MacColl Center for Health Care Innovation, Kaiser Permanente Washington Health Research Institute, Seattle, WA, USA
- Gregory A Aarons
- Department of Psychiatry, Child and Adolescent Services Research Center, University of California San Diego School of Medicine, San Diego, CA, USA
- Kenneth B Wells
- Department of Psychiatry and Behavioral Sciences, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, USA; The Jane and Terry Semel Institute for Neuroscience and Human Behavior, University of California Los Angeles, Los Angeles, CA, USA
- Mary M McKay
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130, USA
- Morris Weinberger
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
|
59
|
Saunders EFH, Rice A. Between Scylla and Charybdis: a Primer for Navigating a Department of Psychiatry in the Increasingly Complex Waters of Academic Medicine. Acad Psychiatry 2020; 44:106-110. [PMID: 31732884 DOI: 10.1007/s40596-019-01137-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/30/2019] [Accepted: 10/22/2019] [Indexed: 06/10/2023]
Affiliation(s)
- Erika F H Saunders
- The Pennsylvania State University College of Medicine, Hershey, PA, USA
- Penn State Health Milton S. Hershey Medical Center, Hershey, PA, USA
- Ashley Rice
- Penn State Health Milton S. Hershey Medical Center, Hershey, PA, USA
|
60
|
Kirchner JE, Smith JL, Powell BJ, Waltz TJ, Proctor EK. Getting a clinical innovation into practice: An introduction to implementation strategies. Psychiatry Res 2020; 283:112467. [PMID: 31488332 PMCID: PMC7239693 DOI: 10.1016/j.psychres.2019.06.042] [Citation(s) in RCA: 109] [Impact Index Per Article: 27.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/29/2019] [Revised: 06/28/2019] [Accepted: 06/30/2019] [Indexed: 12/31/2022]
Abstract
Just as there is a robust science that supports development and rigorous testing of clinical innovations, the emerging field of implementation science is developing new theory-based knowledge regarding a growing portfolio of meticulously tested implementation strategies that seek to improve uptake of evidence-based practices by targeting barriers at multiple levels within health care settings. Studying and documenting implementation strategies associated with uptake during the development and trial of a clinical innovation could subsequently position the researcher for a more seamless transition and handoff of the innovation to clinical and operational leaders. The objective of this manuscript is to introduce the concept of implementation strategies: what they are; the rigor with which they are defined and applied to address barriers to clinical innovation adoption; how strategy selection may vary based on contextual, innovation, and recipient factors; how to document the application of strategies over the course of an implementation study; and how testing their effectiveness is the focus of implementation research trials.
Affiliation(s)
- JoAnn E Kirchner
- Department of Veterans Affairs Quality Enhancement Research Initiative (QUERI) for Team-Based Behavioral Health, Central Arkansas Veterans Healthcare System, North Little Rock, AR, United States; Psychiatric Research Institute, University of Arkansas for Medical Sciences, Little Rock, AR, United States
- Jeffrey L Smith
- Department of Veterans Affairs Quality Enhancement Research Initiative (QUERI) for Team-Based Behavioral Health, Central Arkansas Veterans Healthcare System, North Little Rock, AR, United States; Psychiatric Research Institute, University of Arkansas for Medical Sciences, Little Rock, AR, United States
- Byron J Powell
- University of North Carolina, Chapel Hill, NC, United States
- Thomas J Waltz
- Department of Psychology, Eastern Michigan University, Ypsilanti, MI, United States; The VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, MI, United States
- Enola K Proctor
- Brown School, Washington University in St. Louis, St. Louis, MO, United States
|
61
|
Assessing Implementation Strategy Reporting in the Mental Health Literature: A Narrative Review. Adm Policy Ment Health 2019. [PMID: 31482489 DOI: 10.1007/s10488-019-00965-8] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Subscribe] [Scholar Register] [Indexed: 01/31/2023]
Abstract
Inadequate implementation strategy reporting restricts research synthesis and replicability. We explored the implementation strategy reporting quality of a sample of mental health articles using Proctor et al.'s (Implement Sci 8:139, 2013) reporting recommendations. We conducted a narrative review to generate the sample of articles and assigned a reporting quality score to each article. The mean article reporting score was 54% (range 17-100%). The most reported domains were: name (100%), action (82%), target (80%), and actor (67%). The least reported domains included definition (6%), temporality (26%), justification (34%), and outcome (37%). We discuss limitations and provide recommendations to improve reporting.
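The reporting-quality score described above can be sketched as a simple proportion of reported domains. This is an illustrative reconstruction, not the review's actual instrument: the domain list is the eight domains named in the abstract, and the example article is hypothetical.

```python
# Domains named in the abstract, following Proctor et al.'s reporting
# recommendations (illustrative; the review may score additional domains).
DOMAINS = ["name", "actor", "action", "target", "definition",
           "temporality", "justification", "outcome"]

def reporting_score(reported):
    """Fraction of the recommended domains that an article reports."""
    return len(set(reported) & set(DOMAINS)) / len(DOMAINS)

# A hypothetical article reporting only the four most-reported domains:
score = reporting_score(["name", "action", "target", "actor"])  # 4/8 = 0.5
```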
|
63
|
Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci 2019; 14:42. [PMID: 31036028 PMCID: PMC6489173 DOI: 10.1186/s13012-019-0892-4] [Citation(s) in RCA: 389] [Impact Index Per Article: 77.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/20/2018] [Accepted: 04/12/2019] [Indexed: 12/21/2022] Open
Abstract
BACKGROUND A fundamental challenge of implementation is identifying contextual determinants (i.e., barriers and facilitators) and determining which implementation strategies will address them. Numerous conceptual frameworks (e.g., the Consolidated Framework for Implementation Research; CFIR) have been developed to guide the identification of contextual determinants, and compilations of implementation strategies (e.g., the Expert Recommendations for Implementing Change compilation; ERIC) have been developed which can support selection and reporting of implementation strategies. The aim of this study was to identify which ERIC implementation strategies would best address specific CFIR-based contextual barriers. METHODS Implementation researchers and practitioners were recruited to participate in an online series of tasks involving matching specific ERIC implementation strategies to specific implementation barriers. Participants were presented with brief descriptions of barriers based on CFIR construct definitions. They were asked to rank up to seven implementation strategies that would best address each barrier. Barriers were presented in a random order, and participants had the option to respond to the barrier or skip to another barrier. Participants were also asked about considerations that most influenced their choices. RESULTS Four hundred thirty-five invitations were emailed and 169 (39%) individuals participated. Respondents had considerable heterogeneity in opinions regarding which ERIC strategies best addressed each CFIR barrier. Across the 39 CFIR barriers, an average of 47 different ERIC strategies (SD = 4.8, range 35 to 55) was endorsed at least once for each, as being one of seven strategies that would best address the barrier. A tool was developed that allows users to specify high-priority CFIR-based barriers and receive a prioritized list of strategies based on endorsements provided by participants. 
CONCLUSIONS The wide heterogeneity of endorsements obtained in this study's task suggests that there are relatively few consistent relationships between CFIR-based barriers and ERIC implementation strategies. Despite this heterogeneity, a tool aggregating endorsements across multiple barriers can support taking a structured approach to consider a broad range of strategies given those barriers. This study's results point to the need for a more detailed evaluation of the underlying determinants of barriers and how these determinants are addressed by strategies as part of the implementation planning process.
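The tool described above aggregates participant endorsements across a user's high-priority barriers to produce a prioritized strategy list. A minimal sketch of that aggregation, with made-up barrier names, strategy names, and endorsement counts (not the study's data):

```python
from collections import Counter

# Hypothetical endorsement counts: for each CFIR-based barrier, how many
# participants ranked each ERIC strategy among their top seven.
endorsements = {
    "available resources": Counter({"access new funding": 120,
                                    "facilitation": 45,
                                    "alter incentive structures": 38}),
    "leadership engagement": Counter({"identify and prepare champions": 95,
                                      "facilitation": 60,
                                      "inform local opinion leaders": 52}),
}

def prioritize_strategies(priority_barriers, endorsements):
    """Sum endorsements across the user's high-priority barriers and
    return strategies ranked by total endorsement count."""
    totals = Counter()
    for barrier in priority_barriers:
        totals += endorsements[barrier]
    return totals.most_common()

ranked = prioritize_strategies(["available resources", "leadership engagement"],
                               endorsements)
# "facilitation" is endorsed for both barriers, so it accumulates 45 + 60 = 105.
```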
Affiliation(s)
- Thomas J Waltz
- Eastern Michigan University, Ypsilanti, USA
- Ann Arbor VA Center for Clinical Management Research, P.O. Box 130170, Ann Arbor, MI, 48113-0170, USA
- Byron J Powell
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, USA
- María E Fernández
- Center for Health Promotion and Prevention Research, School of Public Health, University of Texas Health Science Center at Houston, Houston, USA
- Laura J Damschroder
- Ann Arbor VA Center for Clinical Management Research, P.O. Box 130170, Ann Arbor, MI, 48113-0170, USA
|
64
|
Rogal SS, Yakovchenko V, Waltz TJ, Powell BJ, Gonzalez R, Park A, Chartier M, Ross D, Morgan TR, Kirchner JE, Proctor EK, Chinman MJ. Longitudinal assessment of the association between implementation strategy use and the uptake of hepatitis C treatment: Year 2. Implement Sci 2019; 14:36. [PMID: 30961615 PMCID: PMC6454775 DOI: 10.1186/s13012-019-0881-7] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2018] [Accepted: 03/25/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND To increase the uptake of evidence-based treatments for hepatitis C (HCV), the Department of Veterans Affairs (VA) established the Hepatitis Innovation Team (HIT) Collaborative. Teams of providers were tasked with choosing implementation strategies to improve HCV care. The aim of the current evaluation was to assess how site-level implementation strategies were associated with HCV treatment initiation and how the use of implementation strategies and their association with HCV treatment changed over time. METHODS A key HCV provider at each VA site (N = 130) was asked in two consecutive fiscal years (FYs) to complete an online survey examining the use of 73 implementation strategies organized into nine clusters as described by the Expert Recommendations for Implementing Change (ERIC) study. The number of Veterans initiating treatment for HCV, or "treatment starts," at each site was captured using national data. Providers reported whether the use of each implementation strategy was due to the HIT Collaborative. RESULTS Of 130 sites, 80 (62%) responded in Year 1 (FY15) and 105 (81%) responded in Year 2 (FY16). Respondents endorsed a median of 27 (IQR 19-38) strategies in Year 2. The strategies significantly more likely to be chosen in Year 2 included tailoring strategies to deliver HCV care, promoting adaptability, sharing knowledge between sites, and using mass media. The total number of treatment starts was significantly positively correlated with total number of strategies endorsed in both years. In Years 1 and 2, respectively, 28 and 26 strategies were significantly associated with treatment starts; 12 strategies overlapped both years, 16 were unique to Year 1, and 14 were unique to Year 2. Strategies significantly associated with treatment starts shifted between Years 1 and 2. 
Pre-implementation strategies in the "training/educating," "interactive assistance," and "building stakeholder interrelationships" clusters were more likely to be significantly associated with treatment starts in Year 1, while strategies in the "evaluative and iterative" and "adapting and tailoring" clusters were more likely to be associated with treatment starts in Year 2. Approximately half of all strategies were attributed to the HIT Collaborative. CONCLUSIONS These results suggest that measuring implementation strategies over time is a useful way to catalog implementation of an evidence-based practice over time and across settings.
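The reported positive correlation between the number of endorsed strategies and treatment starts can be illustrated with a small Pearson-correlation sketch. The site-level numbers below are toy values for illustration, not the evaluation's data.

```python
import math

# Toy site-level data: strategies endorsed per site, and HCV treatment starts.
strategies = [12, 19, 27, 33, 38, 45]
starts = [40, 55, 90, 120, 150, 170]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(strategies, starts)  # strongly positive for this toy data
```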
Affiliation(s)
- Shari S Rogal
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, University Drive, Pittsburgh, PA, USA; Department of Surgery, University of Pittsburgh, Pittsburgh, PA, USA; Division of Gastroenterology, Hepatology, and Nutrition, University of Pittsburgh, Pittsburgh, PA, USA
- Vera Yakovchenko
- Center for Healthcare Organization and Implementation Research, Edith Nourse Rogers Memorial VA Hospital, Bedford, MA, USA
- Thomas J Waltz
- Department of Psychology, Eastern Michigan University, Ypsilanti, MI, USA; VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, MI, USA
- Byron J Powell
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Rachel Gonzalez
- Gastroenterology Section, VA Long Beach Healthcare System, Long Beach, CA, USA
- Angela Park
- Office of Strategic Integration
- Veterans Engineering Resource Center, Washington, DC, USA
- Maggie Chartier
- HIV, Hepatitis and Related Conditions Programs, Office of Specialty Care Services, Veterans Health Administration, Washington, DC, USA
- David Ross
- HIV, Hepatitis and Related Conditions Programs, Office of Specialty Care Services, Veterans Health Administration, Washington, DC, USA
- Timothy R Morgan
- Gastroenterology Section, VA Long Beach Healthcare System, Long Beach, CA, USA
- JoAnn E Kirchner
- Department of Veterans Affairs Medical Center, HSR&D and Behavioral Health Quality Enhancement Research Initiative (QUERI), Central Arkansas Veterans Healthcare System, Little Rock, AR, USA
- Enola K Proctor
- Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Matthew J Chinman
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, University Drive, Pittsburgh, PA, USA; RAND Corporation, Pittsburgh, PA, USA
|
66
|
Perry CK, Damschroder LJ, Hemler JR, Woodson TT, Ono SS, Cohen DJ. Specifying and comparing implementation strategies across seven large implementation interventions: a practical application of theory. Implement Sci 2019; 14:32. [PMID: 30898133 PMCID: PMC6429753 DOI: 10.1186/s13012-019-0876-4] [Citation(s) in RCA: 123] [Impact Index Per Article: 24.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2019] [Accepted: 02/28/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND The use of implementation strategies is an active and purposive approach to translate research findings into routine clinical care. The Expert Recommendations for Implementing Change (ERIC) identified and defined discrete implementation strategies, and Proctor and colleagues have made recommendations for specifying operationalization of each strategy. We use empirical data to test how the ERIC taxonomy applies to a large dissemination and implementation initiative aimed at taking cardiac prevention to scale in primary care practice. METHODS EvidenceNOW is an Agency for Healthcare Research and Quality initiative that funded seven cooperatives across seven regions in the USA. Cooperatives implemented multi-component interventions to improve heart health and build quality improvement capacity, and used a range of implementation strategies to foster practice change. We used ERIC to identify cooperatives' implementation strategies and specified the actor, action, target, dose, temporality, justification, and expected outcome for each. We mapped and compiled a matrix of the specified ERIC strategies across the cooperatives, and used consensus to resolve mapping differences. We then grouped implementation strategies by outcomes and justifications, which led to insights regarding the use of and linkages between ERIC strategies in real-world scale-up efforts. RESULTS Thirty-three ERIC strategies were used by cooperatives. We identified a range of revisions to the ERIC taxonomy to improve the practical application of these strategies. These proposed changes include revisions to four strategy names and 12 definitions. We suggest adding three new strategies because they encapsulate distinct actions that were not described in the existing ERIC taxonomy. In addition, we organized ERIC implementation strategies into four functional groupings based on the way we observed them being applied in practice. 
These groupings show how ERIC strategies are, out of necessity, interconnected to achieve the work involved in rapidly taking evidence to scale. CONCLUSIONS Findings of our work suggest revisions to the ERIC implementation strategies to reflect their utilization in real-world dissemination and implementation efforts. The functional groupings of the ERIC implementation strategies that emerged from on-the-ground implementers will help guide others in choosing among and linking multiple implementation strategies when planning small- and large-scale implementation efforts. TRIAL REGISTRATION Registered as an observational study at www.clinicaltrials.gov (NCT02560428).
Affiliation(s)
- Cynthia K Perry
- School of Nursing, Oregon Health & Science University, 3455 SW US Veterans Hospital Rd, Portland, OR, 97239, USA
- Laura J Damschroder
- Implementation Pathways, LLC, Ann Arbor, MI, USA; VA Center for Clinical Management Research, Ann Arbor, MI, USA
- Jennifer R Hemler
- Department of Family Medicine and Community Health, Rutgers University-Robert Wood Johnson Medical School, 112 Paterson Street, New Brunswick, NJ, 08901, USA
- Tanisha T Woodson
- Department of Family Medicine, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Portland, OR, 97239, USA
- Sarah S Ono
- Department of Family Medicine, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Portland, OR, 97239, USA
- Deborah J Cohen
- Department of Family Medicine, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Portland, OR, 97239, USA
|
67
|
Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation. Implement Sci 2019; 14:9. [PMID: 30691480 PMCID: PMC6348649 DOI: 10.1186/s13012-019-0855-9] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2018] [Accepted: 01/08/2019] [Indexed: 12/21/2022] Open
Abstract
BACKGROUND National leaders recommend documenting social determinants of health and actions taken to address social determinants of health in electronic health records, and a growing body of evidence suggests the health benefits of doing so. However, little evidence exists to guide implementation of social determinants of health documentation/action. METHODS This paper describes a 5-year, mixed-methods, stepped-wedge trial with realist evaluation, designed to test the impact of providing 30 community health centers with step-by-step guidance on implementing electronic health record-based social determinants of health documentation. This guidance will entail 6 months of tailored support from an interdisciplinary team, including training and technical assistance. We will report on tailored support provided at each of five implementation steps; impact of tailored implementation support; a method for tracking such tailoring; and context-specific pathways through which these tailored strategies effect change. We will track the competencies and resources needed to support the study clinics' implementation efforts. DISCUSSION Results will inform how to tailor implementation strategies to meet local needs in real-world practice settings. Secondary analyses will assess impacts of social determinants of health documentation and referral-making on diabetes outcomes. By learning whether and how scalable, tailored implementation strategies help community health centers adopt social determinants of health documentation and action, this study will yield timely guidance to primary care providers. We are not aware of previous studies exploring implementation strategies that support adoption of social determinants of health documentation and action using electronic health records, despite the pressing need for such guidance. TRIAL REGISTRATION clinicaltrials.gov, NCT03607617, registration date: 7/31/2018 (retrospectively registered).
|
68
|
Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM, Weiner BJ. Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda. Front Public Health 2019; 7:3. [PMID: 30723713 PMCID: PMC6350272 DOI: 10.3389/fpubh.2019.00003] [Citation(s) in RCA: 339] [Impact Index Per Article: 67.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2018] [Accepted: 01/04/2019] [Indexed: 01/10/2023] Open
Abstract
The field of implementation science was developed to better understand the factors that facilitate or impede implementation and generate evidence for implementation strategies. In this article, we briefly review progress in implementation science, and suggest five priorities for enhancing the impact of implementation strategies. Specifically, we suggest the need to: (1) enhance methods for designing and tailoring implementation strategies; (2) specify and test mechanisms of change; (3) conduct more effectiveness research on discrete, multi-faceted, and tailored implementation strategies; (4) increase economic evaluations of implementation strategies; and (5) improve the tracking and reporting of implementation strategies. We believe that pursuing these priorities will advance implementation science by helping us to understand when, where, why, and how implementation strategies improve implementation effectiveness and subsequent health outcomes.
Affiliation(s)
- Byron J Powell
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
- Cecil G. Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
- Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
- Maria E Fernandez
- Center for Health Promotion and Prevention Research, School of Public Health, University of Texas Health Science Center at Houston, Houston, TX, United States
- Gregory A Aarons
- Department of Psychiatry, University of California, San Diego, La Jolla, CA, United States
- Rinad S Beidas
- Department of Psychiatry, Center for Mental Health, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
- Department of Medical Ethics and Health Policy, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, United States
- Cara C Lewis
- MacColl Center for Healthcare Innovation, Kaiser Permanente Washington Health Research Institute, Seattle, WA, United States
- Sheena M McHugh
- School of Public Health, University College Cork, Cork, Ireland
- Bryan J Weiner
- Department of Global Health, Department of Health Services, University of Washington, Seattle, WA, United States
69
Wolk CB, Beidas RS. The Intersection of Implementation Science and Behavioral Health: An Introduction to the Special Issue. Behav Ther 2018; 49:477-480. [PMID: 29937251 DOI: 10.1016/j.beth.2018.03.004] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/28/2018] [Revised: 03/08/2018] [Accepted: 03/09/2018] [Indexed: 12/26/2022]
Affiliation(s)
- Courtney Benjamin Wolk
- Center for Mental Health Policy and Services Research, Perelman School of Medicine, University of Pennsylvania
- Rinad S Beidas
- Center for Mental Health Policy and Services Research, Perelman School of Medicine, University of Pennsylvania
70
Huynh AK, Hamilton AB, Farmer MM, Bean-Mayberry B, Stirman SW, Moin T, Finley EP. A Pragmatic Approach to Guide Implementation Evaluation Research: Strategy Mapping for Complex Interventions. Front Public Health 2018; 6:134. [PMID: 29868542 PMCID: PMC5968102 DOI: 10.3389/fpubh.2018.00134] [Citation(s) in RCA: 45] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2018] [Accepted: 04/20/2018] [Indexed: 12/31/2022] Open
Abstract
INTRODUCTION Greater specification of implementation strategies is a challenge for implementation science, but there is little guidance for delineating the use of multiple strategies involved in complex interventions. The Cardiovascular (CV) Toolkit project entails implementation of a toolkit designed to reduce CV risk by increasing women's engagement in appropriate services. The CV Toolkit project follows an enhanced version of Replicating Effective Programs (REP), an evidence-based implementation strategy, to implement the CV Toolkit across four phases: pre-conditions, pre-implementation, implementation, and maintenance and evolution. Our current objective is to describe a method for mapping implementation strategies used in real time as part of the CV Toolkit project. This method supports description of the timing and content of bundled strategies and provides a structured process for developing a plan for implementation evaluation. METHODS We conducted a process of strategy mapping to apply Proctor and colleagues' rubric for specification of implementation strategies, constructing a matrix in which we identified each implementation strategy, its conceptual group, and the corresponding REP phase(s) in which it occurs. For each strategy, we also specified the actors involved, actions undertaken, action targets, dose of the implementation strategy, and anticipated outcome addressed. We iteratively refined the matrix with the implementation team, including use of simulation to provide initial validation. RESULTS Mapping revealed patterns in the timing of implementation strategies within REP phases. Most implementation strategies involving the development of stakeholder interrelationships and training and educating stakeholders were introduced during the pre-conditions or pre-implementation phases. Strategies introduced in the maintenance and evolution phase emphasized communication, re-examination, and audit and feedback. 
In addition to its value for producing valid and reliable process evaluation data, mapping implementation strategies has informed development of a pragmatic blueprint for implementation and longitudinal analyses and evaluation activities. DISCUSSION We update recent recommendations on specification of implementation strategies by considering the implications for multi-strategy frameworks and propose an approach for mapping the use of implementation strategies within complex, multi-level interventions, in support of rigorous evaluation. Developing pragmatic tools to aid in operationalizing the conduct of implementation and evaluation activities is essential to enacting sound implementation research.
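The strategy-mapping matrix described in this abstract records, for each implementation strategy, its conceptual group, REP phase(s), actors, actions, action targets, dose, and anticipated outcome. A minimal sketch of such a matrix as a data structure (the field names follow the abstract's specification rubric; all strategy values below are hypothetical examples, not the CV Toolkit project's actual strategies):

```python
from dataclasses import dataclass

@dataclass
class StrategyRow:
    """One row of a strategy-mapping matrix per the specification rubric."""
    strategy: str
    conceptual_group: str
    rep_phases: list          # REP phase(s) in which the strategy occurs
    actors: str
    actions: str
    action_targets: str
    dose: str
    anticipated_outcome: str

# Hypothetical example rows, for illustration only.
matrix = [
    StrategyRow(
        strategy="Conduct educational meetings",
        conceptual_group="Train and educate stakeholders",
        rep_phases=["pre-conditions", "pre-implementation"],
        actors="Implementation team",
        actions="Deliver toolkit orientation sessions",
        action_targets="Clinic staff",
        dose="Two 1-hour sessions",
        anticipated_outcome="Adoption",
    ),
    StrategyRow(
        strategy="Audit and provide feedback",
        conceptual_group="Use evaluative and iterative strategies",
        rep_phases=["maintenance and evolution"],
        actors="Evaluation team",
        actions="Report toolkit use back to sites",
        action_targets="Site leadership",
        dose="Quarterly",
        anticipated_outcome="Sustainment",
    ),
]

# Group strategy names by REP phase to surface the timing patterns the
# mapping exercise revealed (e.g., training clustered in early phases).
by_phase = {}
for row in matrix:
    for phase in row.rep_phases:
        by_phase.setdefault(phase, []).append(row.strategy)
```

Structuring the matrix this way makes it straightforward to ask timing questions of the kind reported above, such as which strategies are introduced in which phase.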
Affiliation(s)
- Alexis K. Huynh
- VA Greater Los Angeles HSR&D Center for the Study of Healthcare Innovation, Implementation and Policy, Los Angeles, CA, United States
- Alison B. Hamilton
- VA Greater Los Angeles HSR&D Center for the Study of Healthcare Innovation, Implementation and Policy, Los Angeles, CA, United States
- David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Melissa M. Farmer
- VA Greater Los Angeles HSR&D Center for the Study of Healthcare Innovation, Implementation and Policy, Los Angeles, CA, United States
- Bevanne Bean-Mayberry
- VA Greater Los Angeles HSR&D Center for the Study of Healthcare Innovation, Implementation and Policy, Los Angeles, CA, United States
- David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Shannon Wiltsey Stirman
- Department of Psychiatry and Behavioral Sciences, Stanford University, Palo Alto, CA, United States
- VA Palo Alto Healthcare System, Menlo Park, CA, United States
- Tannaz Moin
- VA Greater Los Angeles HSR&D Center for the Study of Healthcare Innovation, Implementation and Policy, Los Angeles, CA, United States
- David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Erin P. Finley
- South Texas Veterans Healthcare System, San Antonio, TX, United States
- UT Health Science Center, San Antonio, TX, United States
71
Lewis CC, Scott K, Marriott BR. A methodology for generating a tailored implementation blueprint: an exemplar from a youth residential setting. Implement Sci 2018; 13:68. [PMID: 29769096 PMCID: PMC5956960 DOI: 10.1186/s13012-018-0761-6] [Citation(s) in RCA: 40] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2017] [Accepted: 05/04/2018] [Indexed: 11/18/2022] Open
Abstract
BACKGROUND Tailored implementation approaches are touted as more likely to support the integration of evidence-based practices. However, to our knowledge, few methodologies for tailoring implementations exist. This manuscript will apply a model-driven, mixed methods approach to a needs assessment to identify the determinants of practice, and pilot a modified conjoint analysis method to generate an implementation blueprint using a case example of a cognitive behavioral therapy (CBT) implementation in a youth residential center. METHODS Our proposed methodology contains five steps to address two goals: (1) identify the determinants of practice and (2) select and match implementation strategies to address the identified determinants (focusing on barriers). Participants in the case example included mental health therapists and operations staff in two programs of Wolverine Human Services. For step 1, the needs assessment, they completed surveys (clinician N = 10; operations staff N = 58; other N = 7) and participated in focus groups (clinician N = 15; operations staff N = 38) guided by the domains of the Framework for Diffusion [1]. For step 2, the research team conducted mixed methods analyses following the QUAN + QUAL structure for the purpose of convergence and expansion in a connecting process, revealing 76 unique barriers. Step 3 consisted of a modified conjoint analysis. For step 3a, agency administrators prioritized the identified barriers according to feasibility and importance. For step 3b, strategies were selected from a published compilation and rated for feasibility and likelihood of impacting CBT fidelity. For step 4, sociometric surveys informed implementation team member selection and a meeting was held to identify officers and clarify goals and responsibilities. For step 5, blueprints for each of pre-implementation, implementation, and sustainment phases were generated. 
RESULTS Forty-five unique strategies were prioritized across the 5 years and three phases representing all nine categories. CONCLUSIONS Our novel methodology offers a relatively low burden collaborative approach to generating a plan for implementation that leverages advances in implementation science including measurement, models, strategy compilations, and methods from other fields.
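The modified conjoint analysis in steps 3a/3b amounts to rating candidate items on feasibility and importance and ranking them by a combined score. A minimal sketch of that prioritization step, with hypothetical barriers and an assumed multiplicative scoring rule (the paper does not publish its exact combination rule, so this is illustrative only):

```python
# Hypothetical barriers with 1-5 ratings; the multiplicative combined
# score below is an illustrative assumption, not the authors' method.
ratings = {
    "Limited staff time": {"importance": 5, "feasibility": 2},
    "Low CBT knowledge among staff": {"importance": 4, "feasibility": 5},
    "Competing agency initiatives": {"importance": 3, "feasibility": 3},
}

def score(r):
    # Favor barriers that are both important and feasible to address,
    # so highly important but intractable barriers do not dominate.
    return r["importance"] * r["feasibility"]

ranked = sorted(ratings, key=lambda b: score(ratings[b]), reverse=True)
```

The same rate-and-rank step can then be repeated over candidate implementation strategies (step 3b's feasibility and likelihood-of-impact ratings) to arrive at the phased blueprint described above.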
Affiliation(s)
- Cara C. Lewis
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Suite 1600, Seattle, WA 98101 USA
- Department of Psychological and Brain Sciences, Indiana University, 1101 E. 10th St, Bloomington, IN 47405 USA
- Department of Psychiatry and Behavioral Sciences, Harborview Medical Center, School of Medicine, University of Washington, Box 359911, 325 9th Ave, Seattle, WA 98104 USA
- Kelli Scott
- Department of Psychological and Brain Sciences, Indiana University, 1101 E. 10th St, Bloomington, IN 47405 USA
- Brigid R. Marriott
- Department of Psychological Sciences, University of Missouri, 315 Psychology Building, Columbia, MO 65211 USA