1. Chambers DA, Neta GI. Charting Progress in the Science of Technical Assistance for Implementation of Evidence-Based Interventions. Eval Health Prof 2024:1632787241293447. PMID: 39422156. DOI: 10.1177/01632787241293447.
Abstract
Technical assistance (TA) has long been a strategy used to support implementation of a range of different evidence-based interventions within clinical, community, and other service settings. Great progress has come in extending the evidence base to support TA's use across multiple contexts, the result of more extensive categorizing of implementation strategies to support systematic studies of their effectiveness in facilitating successful implementation. This commentary builds on that progress to suggest several opportunities for future investigation and collaborative activity among researchers, practitioners, policymakers, and other key decision-makers in hopes of continuing to build the success highlighted in this special issue and elsewhere. The authors call for increased attention to operationalization and tailoring of TA, considering how TA services can be sustained over time and how to weigh externally provided TA against TA housed within an organization. In addition, the commentary suggests a few key areas for capacity-building that can increase the quality, reach, and impact of TA for the future.
Affiliation(s)
- David A Chambers
- Division of Cancer Control and Population Sciences, National Cancer Institute, National Institutes of Health, USA
- Gila I Neta
- Division of Cancer Control and Population Sciences, National Cancer Institute, National Institutes of Health, USA
2. Lewis CC, Frank HE, Cruden G, Kim B, Stahmer AC, Lyon AR, Albers B, Aarons GA, Beidas RS, Mittman BS, Weiner BJ, Williams NJ, Powell BJ. A research agenda to advance the study of implementation mechanisms. Implement Sci Commun 2024; 5:98. PMID: 39285504. PMCID: PMC11403843. DOI: 10.1186/s43058-024-00633-5.
Abstract
BACKGROUND Implementation science scholars have made significant progress identifying factors that enable or obstruct the implementation of evidence-based interventions, and testing strategies that may modify those factors. However, little research sheds light on how or why strategies work, in what contexts, and for whom. Studying implementation mechanisms - the processes responsible for change - is crucial for advancing the field of implementation science and enhancing its value in facilitating equitable policy and practice change. The Agency for Healthcare Research and Quality funded a conference series to achieve two aims: (1) develop a research agenda on implementation mechanisms, and (2) actively disseminate the research agenda to research, policy, and practice audiences. This article presents the resulting research agenda, including priorities and actions to encourage its execution.
METHODS Building on prior concept mapping work, in a semi-structured, 3-day, in-person working meeting, 23 US-based researchers used a modified nominal group process to generate priorities and actions for addressing challenges to studying implementation mechanisms. During each of the three 120-min sessions, small groups responded to the prompt: "What actions need to be taken to move this research forward?" The groups brainstormed actions, which were then shared with the full group and discussed with the support of facilitators trained in structured group processes. Facilitators grouped critical and novel ideas into themes. Attendees voted on six themes they prioritized to discuss in a fourth, 120-min session, during which small groups operationalized prioritized actions. Subsequently, all ideas were collated, combined, and revised for clarity by a subset of the authorship team.
RESULTS From this multistep process, 150 actions emerged across 10 priority areas, which together constitute the research agenda. Actions included discrete activities, projects, or products, and ways to shift how research is conducted to strengthen the study of implementation mechanisms.
CONCLUSIONS This research agenda elevates actions to guide the selection, design, and evaluation of implementation mechanisms. By delineating recommended actions to address the challenges of studying implementation mechanisms, this research agenda facilitates expanding the field of implementation science, beyond studying what works to how and why strategies work, in what contexts, for whom, and with which interventions.
Affiliation(s)
- Cara C Lewis
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Avenue, Suite 1600, Seattle, WA, 98101, USA
- Hannah E Frank
- The Warren Alpert Medical School, Brown University, Box G-BH, Providence, RI, 02912, USA
- Gracelyn Cruden
- Chestnut Health System, Lighthouse Institute - OR Group, 1255 Pearl St, Ste 101, Eugene, OR 97401, USA
- Bo Kim
- Center for Healthcare Organization and Implementation Research, VA Boston Healthcare System, 150 South Huntington Avenue, Boston, MA, 02130, USA
- Department of Psychiatry, Harvard Medical School, 25 Shattuck Street, Boston, MA, 02115, USA
- Aubyn C Stahmer
- UC Davis MIND Institute, 2825 50th St, Sacramento, CA, 95819, USA
- Aaron R Lyon
- Department of Psychiatry and Behavioral Sciences, University of Washington, 1959 NE Pacific Street Box 356560, Seattle, WA, 98195-6560, USA
- Bianca Albers
- Institute for Implementation Science in Health Care, University of Zurich, Zürich, Switzerland
- Gregory A Aarons
- Department of Psychiatry, University of California San Diego, 9500 Gilman Drive, La Jolla, CA, 92093, USA
- Rinad S Beidas
- Department of Medical Social Sciences, Feinberg School of Medicine, Northwestern University, 625 N Michigan Avenue, Evanston, IL, 60661, USA
- Brian S Mittman
- Division of Health Services Research & Implementation Science, Department of Research & Evaluation, Kaiser Permanente Southern California, 100 S Los Robles Ave, Pasadena, CA, 91101, USA
- Bryan J Weiner
- Department of Global Health, School of Public Health, Box 357965, Seattle, WA, 98195, USA
- Nate J Williams
- School of Social Work, Boise State University, Boise, ID, 83725, USA
- Byron J Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
3. Smith NR, Levy DE, Falbe J, Purtle J, Chriqui JF. Design considerations for developing measures of policy implementation in quantitative evaluations of public health policy. Front Health Serv 2024; 4:1322702. PMID: 39076770. PMCID: PMC11285065. DOI: 10.3389/frhs.2024.1322702.
Abstract
Typical quantitative evaluations of public policies treat policies as a binary condition, without further attention to how policies are implemented. However, policy implementation plays an important role in how the policy impacts behavioral and health outcomes. The field of policy-focused implementation science is beginning to consider how policy implementation may be conceptualized in quantitative analyses (e.g., as a mediator or moderator), but less work has considered how to measure policy implementation for inclusion in quantitative work. To help address this gap, we discuss four design considerations for researchers interested in developing or identifying measures of policy implementation using three independent NIH-funded research projects studying e-cigarette, food, and mental health policies. Mini case studies of these considerations were developed via group discussions; we used the implementation research logic model to structure our discussions. Design considerations include (1) clearly specifying the implementation logic of the policy under study, (2) developing an interdisciplinary team consisting of policy practitioners and researchers with expertise in quantitative methods, public policy and law, implementation science, and subject matter knowledge, (3) using mixed methods to identify, measure, and analyze relevant policy implementation determinants and processes, and (4) building flexibility into project timelines to manage delays and challenges due to the real-world nature of policy. By applying these considerations in their own work, researchers can better identify or develop measures of policy implementation that fit their needs. The experiences of the three projects highlighted in this paper reinforce the need for high-quality and transferable measures of policy implementation, an area where collaboration between implementation scientists and policy experts could be particularly fruitful. These measurement practices provide a foundation for the field to build on as attention to incorporating measures of policy implementation into quantitative evaluations grows and will help ensure that researchers are developing a more complete understanding of how policies impact health outcomes.
Affiliation(s)
- Natalie Riva Smith
- Department of Social and Behavioral Sciences, Harvard TH Chan School of Public Health, Boston, MA, United States
- Douglas E. Levy
- Mongan Institute Health Policy Research Center, Massachusetts General Hospital, Boston, MA, United States
- Harvard Medical School, Boston, MA, United States
- Jennifer Falbe
- Human Development and Family Studies Program, Department of Human Ecology, University of California, Davis, CA, United States
- Jonathan Purtle
- Department of Public Health Policy & Management, Global Center for Implementation Science, New York University School of Global Public Health, New York, NY, United States
- Jamie F. Chriqui
- Institute for Health Research and Policy, University of Illinois Chicago, Chicago, IL, United States
- Department of Health Policy and Administration, School of Public Health, University of Illinois Chicago, Chicago, IL, United States
4. Finney Rutten LJ, Ridgeway JL, Griffin JM. Advancing Translation of Clinical Research Into Practice and Population Health Impact Through Implementation Science. Mayo Clin Proc 2024; 99:665-676. PMID: 38569814. DOI: 10.1016/j.mayocp.2023.02.005.
Abstract
Translational and implementation sciences aim to prioritize and guide efforts to create greater efficiency and speed of scientific innovation across the translational science continuum to improve patient and population health. Key principles and practices rooted in translational and implementation science may be incorporated into clinical trials research, particularly pragmatic trials, to improve the relevance and impact of scientific innovation. This thematic review aims to raise awareness of the value of translational and implementation science in clinical research and to encourage its use in designing and implementing clinical trials across the translational research continuum. Herein, we describe the gap in translating research findings into clinical practice, introduce translational and implementation science, and describe the principles and practices from implementation science that can be used in clinical trial research across the translational continuum to inform clinical practice, to improve population health impact, and to address health care inequities.
Affiliation(s)
- Jennifer L Ridgeway
- Robert D. and Patricia E. Kern Center for the Science of Health Care Delivery, Mayo Clinic, Rochester, MN
- Joan M Griffin
- Robert D. and Patricia E. Kern Center for the Science of Health Care Delivery, Mayo Clinic, Rochester, MN
5. Patel-Syed Z, Becker S, Olson M, Rinella H, Scott K. What do you think it means? Using cognitive interviewing to improve measurement in implementation science: description and case example. Implement Sci Commun 2024; 5:14. PMID: 38355677. PMCID: PMC10865651. DOI: 10.1186/s43058-024-00549-0.
Abstract
Pragmatic measures are essential to evaluate the implementation of evidence-based interventions. Cognitive interviewing, a qualitative method that collects partner feedback throughout measure development, is particularly useful for developing pragmatic implementation measures. Measure developers can use cognitive interviewing to increase a measure's fit within a particular implementation context. However, cognitive interviewing is underused in implementation research, where most measures remain "homegrown" and used for single studies. We provide a rationale for using cognitive interviewing in implementation science studies and illustrate its use through a case example employing cognitive interviewing to inform development of a measurement-based care protocol for implementation in opioid treatment programs. Applications of cognitive interviewing, including developing a common language with partners and collecting multi-level feedback on assessment procedures, to improve measurement in implementation science are discussed.
Affiliation(s)
- Zabin Patel-Syed
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Sara Becker
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Miranda Olson
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Hailey Rinella
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Kelli Scott
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
6. Sowan A, Chinman M. Model for Doctor of Nursing Practice Projects Based on Cross-Fertilization Between Improvement and Implementation Sciences: Protocol for Quality Improvement and Program Evaluation Studies. JMIR Res Protoc 2024; 13:e54213. PMID: 38294860. PMCID: PMC10867758. DOI: 10.2196/54213.
Abstract
BACKGROUND Hundreds of nursing professionals graduate each year from Doctor of Nursing Practice (DNP) programs, entrusted with roles as practice scholars and leaders. Graduates are tasked to lead multidisciplinary knowledge implementation projects to improve safety, quality, and key performance metrics. Nevertheless, there is a continued lack of agreement and faculty dissatisfaction with the format, focus, and results of the DNP graduation projects. The use of a wide range of models and methodologies from different sciences for knowledge implementation introduces challenges to DNP students; affects the scientific rigor of the projects; and results in the overuse, superficial use, or misuse of the models. Quality improvement (QI) and program evaluation studies are substantial investments that may lead to waste and even harm if not well conducted. Traditional QI methodologies, commonly used in DNP projects, were found to be uncertain in improving health care outcomes. The complexity of health care systems calls for cross-fertilization between improvement and implementation sciences to improve health care outcomes.
OBJECTIVE This study describes the development, implementation, and evaluation of a hybrid model for QI and program evaluation studies to guide scholarship in the DNP program.
METHODS The hybrid model was based on cross-fertilization between improvement and implementation sciences. The model adapted the Getting To Outcomes (GTO) and Knowledge to Action (KTA) models as the overarching process models for knowledge implementation. Within each phase of the GTO and KTA models, expected barriers and facilitators for the implementation and adoption of innovation were identified based on the CFIR (Consolidated Framework for Implementation Research). Accordingly, strategies to facilitate the implementation and adoption of innovations were identified based on a refined list of implementation strategies and QI tools. The choice of these models was based on the top 5 criteria for selecting implementation science theories and frameworks. Seven DNP students used the hybrid model to conduct QI projects. Students evaluated their experiences by responding to a Qualtrics survey.
RESULTS The hybrid model encouraged a comprehensive systematic way of thinking, provided tools essential to implementation success, emphasized the need for adaptability in implementation, maintained rigor in QI, and guided the sustainability of change initiatives. Some of the challenges faced by students included finding reliable and valid measures, attaining and maintaining staff buy-in, and competing organizational priorities.
CONCLUSIONS Cross-fertilization between improvement and implementation sciences provided a roadmap and systematic thinking for successful QI projects in the DNP program. The integration of the CFIR with the GTO or KTA process models, enforced by the use of evidence-based implementation strategies and QI tools, reflected the complexity of health care systems and emphasized the need for adaptability in implementation.
INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID) RR1-10.2196/54213
Affiliation(s)
- Azizeh Sowan
- School of Nursing, The University of Texas Health Science Center at San Antonio, San Antonio, TX, United States
- Matthew Chinman
- RAND Corporation, Santa Monica, CA, United States
- VA Pittsburgh Healthcare System, Pittsburgh, PA, United States
7. Yang LH, Bass JK, Le PD, Singh R, Gurung D, Velasco PR, Grivel MM, Susser E, Cleland CM, Muñoz RA, Kohrt BA, Bhana A. A Case Study of the Development of a Valid and Pragmatic Implementation Science Measure: The Barriers and Facilitators in Implementation of Task-Sharing Mental Health Interventions (BeFITS-MH) Measure. Res Sq 2024:rs.3.rs-3877031. PMID: 38343864. PMCID: PMC10854285. DOI: 10.21203/rs.3.rs-3877031/v1.
Abstract
Background Few implementation science (IS) measures have been evaluated for validity, reliability, and utility - the last referring to whether a measure captures meaningful aspects of implementation contexts. In this case study, we describe the process of developing an IS measure that aims to assess Barriers and Facilitators in Implementation of Task-Sharing in Mental Health services (BeFITS-MH), and the procedures we implemented to enhance its utility.
Methods We summarize conceptual and empirical work that informed the development of the BeFITS-MH measure, including a description of the Delphi process, detailed translation and local adaptation procedures, and concurrent pilot testing. As validity and reliability are key aspects of measure development, we also report on our process of assessing the measure's construct validity and utility for the implementation outcomes of acceptability, appropriateness, and feasibility.
Results Continuous stakeholder involvement and concurrent pilot testing resulted in several adaptations of the BeFITS-MH measure's structure, scaling, and format to enhance contextual relevance and utility. Adaptations of broad terms such as "program," "provider type," and "type of service" were necessary due to the heterogeneous nature of interventions, type of task-sharing providers employed, and clients served across the three global sites. Item selection benefited from the iterative process, enabling identification of the relevance of key aspects of identified barriers and facilitators, and of what aspects were common across sites. Program implementers' conceptions of utility regarding the measure's acceptability, appropriateness, and feasibility were seen to cluster across several common categories.
Conclusions This case study provides a rigorous, multi-step process for developing a pragmatic IS measure. The process and lessons learned will aid in the teaching, practice, and research of IS measurement development. The importance of including experiences and knowledge from different types of stakeholders in different global settings was reinforced and resulted in a more globally useful measure while allowing for locally relevant adaptation. To increase the relevance of the measure it is important to target actionable domains that predict markers of utility (e.g., successful uptake) per program implementers' preferences. With this case study, we provide a detailed roadmap for others seeking to develop and validate IS measures that maximize local utility and impact.
Affiliation(s)
- Lawrence H Yang
- New York University School of Global Public Health, Department of Social and Behavioral Sciences
- Judy K Bass
- Johns Hopkins Bloomberg School of Public Health, Department of Mental Health
- PhuongThao Dinh Le
- New York University School of Global Public Health, Department of Social and Behavioral Sciences
- Ritika Singh
- George Washington University, Division of Global Mental Health, Department of Psychiatry and Behavioral Sciences
- Dristy Gurung
- Transcultural Psychosocial Organization (TPO) Nepal; King's College London, Denmark Hill Campus
- Paola R Velasco
- Universidad O'Higgins; Universidad Católica de Chile; Universidad de Chile
- Margaux M Grivel
- New York University School of Global Public Health, Department of Social and Behavioral Sciences
- Ezra Susser
- Columbia University Mailman School of Public Health; New York State Psychiatric Institute
- Charles M Cleland
- New York University Grossman School of Medicine, Department of Population Health
- Brandon A Kohrt
- George Washington University, Division of Global Mental Health, Department of Psychiatry and Behavioral Sciences
- Arvin Bhana
- University of KwaZulu-Natal, Centre for Rural Health; South African Medical Research Council, Health Systems Research Unit
8. Woodward EN, Castillo AIM, True G, Willging C, Kirchner JE. Challenges and promising solutions to engaging patients in healthcare implementation in the United States: an environmental scan. BMC Health Serv Res 2024; 24:29. PMID: 38178131. PMCID: PMC10768202. DOI: 10.1186/s12913-023-10315-y.
Abstract
BACKGROUND One practice in healthcare implementation is patient engagement in quality improvement and systems redesign. Implementers in healthcare systems include clinical leadership, middle managers, quality improvement personnel, and others facilitating changes or adoption of new interventions. Patients provide input into different aspects of health research. However, there is little attention to involving patients in implementing interventions, especially in the United States (U.S.), and this might be essential to reduce inequities. Implementers need clear strategies to overcome challenges, and might be able to learn from countries outside the U.S.
METHODS We wanted to understand existing work on how patients are being included in implementation activities in real-world U.S. healthcare settings. We conducted an environmental scan of three data sources: webinars, published articles, and interviews with implementers who engaged patients in implementation activities in U.S. healthcare settings. We extracted, categorized, and triangulated from these data sources the key activities, recurring challenges, and promising solutions using a coding template.
RESULTS We found 27 examples of patient engagement in U.S. healthcare implementation across four webinars, 11 published articles, and seven interviews; most arranged patient engagement through groups and established processes that shaped how engaged patients were able to be. Participants rarely specified whether they were engaging a population experiencing healthcare inequities. Participants described eight recurring challenges; the two most frequently identified were: (1) recruiting patients representative of those served in the healthcare system; and (2) ensuring processes for equitable communication among all. We matched recurring challenges to promising solutions, such as logistical solutions for arranging meetings to enhance engagement or training in inclusivity and power-sharing.
CONCLUSION We clarified how some U.S. implementers are engaging patients in healthcare implementation activities using less and more intensive engagement. It was unclear whether reducing inequities was a goal. Patient engagement in redesigning U.S. healthcare service delivery appears similar to or less intense than in countries with more robust infrastructure for this, such as Canada and the United Kingdom. Challenges were common across jurisdictions, including retaining patients in the design/delivery of implementation activities. Implementers in any region can learn from those in other places.
Affiliation(s)
- Eva N Woodward
- VA Center for Mental Healthcare and Outcomes Research, 2200 Fort Roots Drive, Building 11, North Little Rock, AR, 72114, USA
- Department of Psychiatry, University of Arkansas for Medical Sciences, 4301 W Markham St, Little Rock, AR, 72205, USA
- Andrea Isabel Melgar Castillo
- VA Center for Mental Healthcare and Outcomes Research, 2200 Fort Roots Drive, Building 11, North Little Rock, AR, 72114, USA
- Graduate School, University of Arkansas for Medical Sciences, 4301 W Markham St, Little Rock, AR, 72205, USA
- Gala True
- South Central Mental Illness Research Education and Clinical Center, Southeast Louisiana Veterans Health Care System, 2400 Canal St, New Orleans, LA, 70119, USA
- Section on Community and Population Medicine, School of Medicine, Louisiana State University, 2400 Canal St (11F), New Orleans, LA, USA
- Cathleen Willging
- Pacific Institute for Research and Evaluation, 851 University Boulevard, Suite 101, Albuquerque, NM, 87106, USA
- JoAnn E Kirchner
- Department of Psychiatry, University of Arkansas for Medical Sciences, 4301 W Markham St, Little Rock, AR, 72205, USA
- Behavioral Health Quality Enhancement Research Initiative (QUERI), Central Arkansas Veterans Healthcare System, 2200 Fort Roots Drive, Building 11, North Little Rock, AR, 72114, USA
9. Choy-Brown M, Williams NJ, Ramirez N, Esp S. Psychometric evaluation of a pragmatic measure of clinical supervision as an implementation strategy. Implement Sci Commun 2023; 4:39. PMID: 37024945. PMCID: PMC10080877. DOI: 10.1186/s43058-023-00419-1.
Abstract
BACKGROUND Valid and reliable measurement of implementation strategies is essential to advancing implementation science; however, this area lags behind the measurement of implementation outcomes and determinants. Clinical supervision is a promising and highly feasible implementation strategy in behavioral healthcare for which pragmatic measures are lacking. This research aimed to develop and psychometrically evaluate a pragmatic measure of clinical supervision conceptualized in terms of two broadly applicable, discrete clinical supervision techniques shown to improve providers' implementation of evidence-based psychosocial interventions: (1) audit and feedback and (2) active learning.
METHODS Items were generated based on a systematic review of the literature and administered to a sample of 154 outpatient mental health clinicians serving youth and 181 community-based mental health providers serving adults. Scores were evaluated for evidence of reliability, structural validity, construct-related validity, and measurement invariance across the two samples.
RESULTS In sample 1, confirmatory factor analysis (CFA) supported the hypothesized two-factor structure of scores on the Evidence-Based Clinical Supervision Strategies (EBCSS) scale (χ2 = 5.89, df = 4, p = 0.208; RMSEA = 0.055, CFI = 0.988, SRMR = 0.033). In sample 2, CFA replicated the EBCSS factor structure and provided discriminant validity evidence relative to an established supervisory alliance measure (χ2 = 36.12, df = 30, p = 0.204; RMSEA = 0.034; CFI = 0.990; SRMR = 0.031). Construct-related validity evidence was provided by theoretically concordant associations between EBCSS subscale scores and agency climate for evidence-based practice implementation in sample 1 (d = .47 and .55) as well as measures of the supervision process in sample 2. Multiple group CFA supported the configural, metric, and partial scalar invariance of scores on the EBCSS across the two samples.
CONCLUSIONS Scores on the EBCSS provide a valid basis for inferences regarding the extent to which behavioral health providers experience audit and feedback and active learning as part of their clinical supervision in both clinic- and community-based behavioral health settings.
TRIAL REGISTRATION ClinicalTrials.gov NCT04096274. Registered on 19 September 2019.
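The RMSEA values in this abstract can be recovered from the reported χ², degrees of freedom, and sample size alone. A minimal sketch, assuming the common n − 1 denominator (software packages differ on whether n or n − 1 is used, so rounding may vary slightly):

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation from a chi-square fit statistic.

    Uses sqrt(max(chi2 - df, 0) / (df * (n - 1))); when the model chi-square
    does not exceed its degrees of freedom, RMSEA is defined as 0.
    """
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Sample 1 from the EBCSS evaluation: chi2 = 5.89, df = 4, n = 154 clinicians.
value = rmsea(5.89, 4, 154)
print(round(value, 3))  # close to the reported RMSEA of 0.055
```

Plugging in sample 1's values gives roughly 0.056, consistent with the reported 0.055 given rounding and denominator conventions; CFI and SRMR cannot be recomputed this way because they also require the baseline model and residual correlations.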
Affiliation(s)
- Mimi Choy-Brown
- University of Minnesota, Twin Cities, 1404 Gortner Avenue, St. Paul, MN 55108 USA
- Nathaniel J. Williams
- Boise State University, 1910 University Drive, Education Suite 717, Boise, ID 83725-1940 USA
- Nallely Ramirez
- Boise State University, 1910 University Drive, Education Suite 717, Boise, ID 83725-1940 USA
- Susan Esp
- Boise State University, 1910 University Drive, Education Suite 717, Boise, ID 83725-1940 USA
10. Randall CL. Dissemination and implementation research for oral and craniofacial health: Background, a review of literature and future directions. Community Dent Oral Epidemiol 2023; 51:119-132. PMID: 36744988. PMCID: PMC10364974. DOI: 10.1111/cdoe.12841.
Abstract
Oral conditions are highly prevalent globally and have profound consequences for individuals and communities. Clinical (e.g. dental treatments, behavioural counselling) and non-clinical (e.g. community-based programming, water fluoridation, oral health policy) evidence-based interventions have been identified, recommended and applied at the clinic, community and policy levels. Still, the burden of oral conditions persists, with inequitable distribution across populations. A major driver of this lack of progress is poor translation of research findings, which results in an evidence-to-practice gap. Dissemination and implementation science (DIS) has emerged to address this gap. DIS is a relatively new field, and its application represents an important avenue for achieving good dental, oral and craniofacial health for all. The goal of this introductory article is to provide a brief background on DIS relevant to researchers in dentistry and oral health. The problem of knowledge translation, basic concepts and terminology in DIS, and approaches to doing dissemination and implementation research, including implementation strategies, key outcomes, and implementation theories, models and frameworks, are discussed. Additionally, the article reviews literature applying DIS to dentistry and oral health. Results of published studies and their implications for the field are presented. Drawing on the literature review and contemporary thinking in DIS, current gaps, opportunities and future directions are discussed. Resources for understanding and applying DIS are provided throughout. This article serves as a primer on DIS for dental and oral health researchers of all types working across a range of contexts; it also serves as a call to action for increased application of DIS to address the burden of oral conditions globally.
Affiliation(s)
- Cameron L Randall, Department of Oral Health Sciences, University of Washington School of Dentistry, Seattle, Washington, USA

11
Neta G, Pan W, Ebi K, Buss DF, Castranio T, Lowe R, Ryan SJ, Stewart-Ibarra AM, Hapairai LK, Sehgal M, Wimberly MC, Rollock L, Lichtveld M, Balbus J. Advancing climate change health adaptation through implementation science. Lancet Planet Health 2022;6:e909-e918. PMID: 36370729; PMCID: PMC9669460; DOI: 10.1016/s2542-5196(22)00199-1.
Abstract
To date, there are few examples of implementation science studies that help guide climate-related health adaptation. Implementation science is the study of methods to promote the adoption and integration of evidence-based tools, interventions, and policies into practice to improve population health. These studies can provide the needed empirical evidence to prioritise and inform implementation of health adaptation efforts. This Personal View discusses five case studies in which disease early warning systems were deployed around the world. These case studies illustrate challenges to deploying early warning systems and guide recommendations for implementation science approaches to enhance future research. We propose theory-informed approaches to understand multilevel barriers, design strategies to overcome those barriers, and analyse the ability of those strategies to advance the uptake and scale-up of climate-related health interventions. These findings build upon previous theoretical work by grounding implementation science recommendations and guidance in the context of real-world practice, as detailed in the case studies.
Affiliation(s)
- Gila Neta, Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
- William Pan, Duke Global Health Institute and Environmental Science and Policy, Duke University, Durham, NC, USA
- Kristie Ebi, Center for Health and the Global Environment, University of Washington, Seattle, WA, USA
- Daniel F Buss, Climate Change and Health, Pan American Health Organization, Washington, DC, USA
- Trisha Castranio, Global Environmental Health Program, National Institute of Environmental Health Science, Durham, NC, USA
- Rachel Lowe, Barcelona Supercomputing Center (BSC), Barcelona, Spain; Catalan Institution for Research and Advanced Studies (ICREA), Barcelona, Spain; Centre on Climate Change and Planetary Health and Centre for Mathematical Modelling of Infectious Diseases, London School of Hygiene & Tropical Medicine, London, UK
- Sadie J Ryan, Department of Geography and the Emerging Pathogens Institute, University of Florida, Gainesville, FL, USA
- Limb K Hapairai, Pacific Island Health Officers Association, Honolulu, HI, USA
- Meena Sehgal, Environment and Health, The Energy and Resources Institute, New Delhi, India
- Michael C Wimberly, Department of Geography and Environmental Sustainability, University of Oklahoma, Norman, OK, USA
- Maureen Lichtveld, Environmental and Occupational Health, University of Pittsburgh School of Public Health, Pittsburgh, PA, USA
- John Balbus, Global Environmental Health Program, National Institute of Environmental Health Science, Washington, DC, USA

12
Mielke J, Leppla L, Valenta S, Zullig LL, Zúñiga F, Staudacher S, Teynor A, De Geest S. Unraveling implementation context: the Basel Approach for coNtextual ANAlysis (BANANA) in implementation science and its application in the SMILe project. Implement Sci Commun 2022;3:102. PMID: 36183141; PMCID: PMC9526967; DOI: 10.1186/s43058-022-00354-7.
Abstract
BACKGROUND Designing intervention and implementation strategies with careful consideration of context is essential for successful implementation science projects. Although the importance of context has been emphasized and methodology for its analysis is emerging, researchers have little guidance on how to plan, perform, and report contextual analysis. Therefore, our aim was to describe the Basel Approach for coNtextual ANAlysis (BANANA) and to demonstrate its application on an ongoing multi-site, multiphase implementation science project to develop/adapt, implement, and evaluate an integrated care model in allogeneic SteM cell transplantatIon facILitated by eHealth (the SMILe project). METHODS BANANA builds on guidance for assessing context by Stange and Glasgow (Contextual factors: the importance of considering and reporting on context in research on the patient-centered medical home, 2013). Based on a literature review, BANANA was developed in ten discussion sessions with implementation science experts and a medical anthropologist to guide the SMILe project's contextual analysis. BANANA's theoretical basis is the Context and Implementation of Complex Interventions (CICI) framework. Working from an ecological perspective, CICI acknowledges contextual dynamics and distinguishes between context and setting (the implementation's physical location). RESULTS BANANA entails six components: (1) choose a theory, model, or framework (TMF) to guide the contextual analysis; (2) use empirical evidence derived from primary and/or secondary data to identify relevant contextual factors; (3) involve stakeholders throughout contextual analysis; (4) choose a study design to assess context; (5) determine contextual factors' relevance to implementation strategies/outcomes and intervention co-design; and (6) report findings of contextual analysis following appropriate reporting guidelines. 
Partly run simultaneously, the first three components form a basis both for the identification of relevant contextual factors and for the next components of the BANANA approach. DISCUSSION Understanding of context is indispensable for a successful implementation science project. BANANA provides much-needed methodological guidance for contextual analysis. In subsequent phases, it helps researchers apply the results to intervention development/adaptation and to the choice of contextually tailored implementation strategies. For future implementation science projects, BANANA's principles will guide researchers first to gather relevant information on their target context, then to inform all subsequent phases of their implementation science project, strengthening every part of their work and helping them fulfill their implementation goals.
Affiliation(s)
- Juliane Mielke, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Lynn Leppla, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland; Department of Medicine I, Faculty of Medicine, Medical Center University of Freiburg, Freiburg im Breisgau, Germany
- Sabine Valenta, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland; Department of Hematology, University Hospital Basel, Basel, Switzerland
- Leah L. Zullig, Center for Innovation to Accelerate Discovery and Practice Transformation (ADAPT), Durham Veterans Affairs Health Care System, and Department of Population Health Sciences, School of Medicine, Duke University, Durham, NC, USA
- Franziska Zúñiga, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Sandra Staudacher, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland; Department of Health Services Research, Care and Public Health Research Institute, Maastricht University, Maastricht, The Netherlands
- Alexandra Teynor, University of Applied Sciences Augsburg, Faculty of Computer Science, Augsburg, Germany
- Sabina De Geest, Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland; Academic Center for Nursing and Midwifery, Department of Public Health and Primary Care, KU Leuven, Leuven, Belgium

13
Baumann A, Vázquez A, Macchione A, Lima A, Coelho A, Juras M, Ribeiro M, Kohlsdorf M, Carothers B. Translation and validation of the evidence-based practice attitude scale (EBPAS-15) to Brazilian Portuguese: Examining providers' perspective about evidence-based parent intervention. Child Youth Serv Rev 2022;136:106421. PMID: 35431379; PMCID: PMC9012479; DOI: 10.1016/j.childyouth.2022.106421.
Abstract
Background Few existing evidence-based parent interventions (EBPIs) for prevention and treatment of child and youth mental health disorders are implemented in low- and middle-income countries. This study aimed to translate the Evidence-Based Practice Attitude Scale (EBPAS-15) survey into Brazilian Portuguese and confirm its factor structure, with the goal of examining providers' perspectives about EBPIs. Methods We translated and back-translated the EBPAS-15 from English to Brazilian Portuguese. Participants were recruited via snowball sampling, and data were collected using an online survey from July 2018 through January 2020. A confirmatory factor analysis was conducted to determine if the scale retained its original structure. Open-ended questions about providers' perspectives on their own clinical practice were coded using the Theoretical Domains Framework (TDF). Analyses included data from 362 clinicians (318 women, 41 men) from 20 of the 27 states of Brazil. Participants on average were 26.7 years old, held specialist degrees in the field of psychology, actively worked as therapists, and practiced in private clinics. Results The translation of the EBPAS to Brazilian Portuguese retained the same four-factor structure as the English version, except that one item was dropped from the Divergence domain. When asked about the challenges in their practices, providers generally described parents as clients with limited skills for disciplining their children and little knowledge of child development. Discussion The Brazilian version of the EBPAS-15 is promising, but future research should consider using quantitative data alongside qualitative information to better understand providers' attitudes about evidence-based interventions to inform implementation efforts. Trial registration N/A.
Affiliation(s)
- A.A. Baumann, Division of Public Health Sciences, Department of Surgery, Washington University in St. Louis, MO, USA
- A.C. Macchione, Centro Paradigma de Ciências do Comportamento, São Paulo, Brazil
- A. Lima, Sam Houston State University, TX, USA
- A.F. Coelho, Universidade de Brasilia, Brasília-DF, Brazil
- M. Juras, Florida Gulf Coast University, USA
- M. Ribeiro, Aiutare Instituto de Psicologia, Brasília-DF, Brazil
- M. Kohlsdorf, Centro Universitario UniCEUB, Brasília-DF, Brazil
- B.J. Carothers, Brown School, Washington University in St. Louis, Missouri, United States

14
Le PD, Eschliman EL, Grivel MM, Tang J, Cho YG, Yang X, Tay C, Li T, Bass J, Yang LH. Barriers and facilitators to implementation of evidence-based task-sharing mental health interventions in low- and middle-income countries: a systematic review using implementation science frameworks. Implement Sci 2022;17:4. PMID: 35022081; PMCID: PMC8756725; DOI: 10.1186/s13012-021-01179-z.
Abstract
BACKGROUND Task-sharing is a promising strategy to expand mental healthcare in low-resource settings, especially in low- and middle-income countries (LMICs). Research on how to best implement task-sharing mental health interventions, however, is hampered by an incomplete understanding of the barriers and facilitators to their implementation. This review aims to systematically identify implementation barriers and facilitators in evidence-based task-sharing mental health interventions using an implementation science lens, organizing factors across a novel, integrated implementation science framework. METHODS PubMed, PsycINFO, CINAHL, and Embase were used to identify English-language, peer-reviewed studies using search terms for three categories: "mental health," "task-sharing," and "LMIC." Articles were included if they: focused on mental disorders as the main outcome(s); included a task-sharing intervention using or based on an evidence-based practice; were implemented in an LMIC setting; and included assessment or data-supported analysis of barriers and facilitators. An initial conceptual model and coding framework derived from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework was developed and iteratively refined to create an integrated conceptual framework, the Barriers and Facilitators in Implementation of Task-Sharing Mental Health Interventions (BeFITS-MH), which specifies 37 constructs across eight domains: (I) client characteristics, (II) provider characteristics, (III) family and community factors, (IV) organizational characteristics, (V) societal factors, (VI) mental health system factors, (VII) intervention characteristics, and (VIII) stigma. RESULTS Of the 26,935 articles screened (title and abstract), 192 articles underwent full-text review, yielding 37 articles representing 28 unique intervention studies that met the inclusion criteria.
The most prevalent facilitators occur in domains that are more amenable to adaptation (i.e., the intervention and provider characteristics domains), while salient barriers occur in domains that are more challenging to modulate or intervene on; these include constructs in the client characteristics domain as well as at broader societal and structural levels of influence (i.e., the organizational and mental health system domains). Other notable trends include constructs in the family and community domains occurring as barriers and as facilitators roughly equally, and stigma constructs acting exclusively as barriers. CONCLUSIONS Using the BeFITS-MH model we developed based on implementation science frameworks, this systematic review provides a comprehensive identification and organization of barriers and facilitators to evidence-based task-sharing mental health interventions in LMICs. These findings have important implications for ongoing and future implementation of this critically needed intervention strategy, including the promise of leveraging task-sharing intervention characteristics as sites of continued innovation, the importance of but relative lack of engagement with constructs in macro-level domains (e.g., organizational characteristics, stigma), and the need for more delineation of strategies for task-sharing mental health interventions that researchers and implementers can employ to enhance implementation in and across levels. TRIAL REGISTRATION PROSPERO CRD42020161357.
Affiliation(s)
- PhuongThao D. Le, Department of Social and Behavioral Sciences, New York University School of Global Public Health, 708 Broadway, New York, NY 10012, USA
- Evan L. Eschliman, Department of Health, Behavior and Society, Johns Hopkins University Bloomberg School of Public Health, 615 North Wolfe St., Baltimore, MD 21205, USA
- Margaux M. Grivel, Department of Social and Behavioral Sciences, New York University School of Global Public Health, 708 Broadway, New York, NY 10012, USA
- Jeffrey Tang, Department of Psychology, New York University Graduate School of Arts and Science, One-Half Fifth Avenue, New York, NY 10003, USA
- Young G. Cho, New York State Psychiatric Institute, Columbia University, 1051 Riverside Dr., New York, NY 10032, USA
- Xinyu Yang, Department of Epidemiology, Columbia University Mailman School of Public Health, 722 West 168th St., New York, NY 10032, USA
- Charisse Tay, Columbia University Teachers College, 525 West 120th Street, New York, NY 10027, USA
- Tingyu Li, Columbia University Teachers College, 525 West 120th Street, New York, NY 10027, USA
- Judith Bass, Department of Mental Health, Johns Hopkins University Bloomberg School of Public Health, Hampton House, 8th Floor, 624 N. Broadway, Baltimore, MD 21205, USA
- Lawrence H. Yang, Department of Social and Behavioral Sciences, New York University School of Global Public Health, 708 Broadway, New York, NY 10012, USA; Department of Epidemiology, Columbia University Mailman School of Public Health, 722 West 168th St., New York, NY 10032, USA

15
Pilar M, Jost E, Walsh-Bailey C, Powell BJ, Mazzucca S, Eyler A, Purtle J, Allen P, Brownson RC. Quantitative measures used in empirical evaluations of mental health policy implementation: A systematic review. Implement Res Pract 2022;3:26334895221141116. PMID: 37091091; PMCID: PMC9924289; DOI: 10.1177/26334895221141116.
Abstract
Background Mental health is a critical component of wellness. Public policies present an opportunity for large-scale mental health impact, but policy implementation is complex and can vary significantly across contexts, making it crucial to evaluate implementation. The objective of this study was to (1) identify quantitative measurement tools used to evaluate the implementation of public mental health policies; (2) describe implementation determinants and outcomes assessed in the measures; and (3) assess the pragmatic and psychometric quality of identified measures. Method Guided by the Consolidated Framework for Implementation Research, Policy Implementation Determinants Framework, and Implementation Outcomes Framework, we conducted a systematic review of peer-reviewed journal articles published in 1995-2020. Data extracted included study characteristics, measure development and testing, implementation determinants and outcomes, and measure quality using the Psychometric and Pragmatic Evidence Rating Scale. Results We identified 34 tools from 25 articles, which were designed for mental health policies or used to evaluate constructs that impact implementation. Many measures lacked information regarding measurement development and testing. The most assessed implementation determinants were readiness for implementation, which encompassed training (n = 20, 57%) and other resources (n = 12, 34%), actor relationships/networks (n = 15, 43%), and organizational culture and climate (n = 11, 31%). Fidelity was the most prevalent implementation outcome (n = 9, 26%), followed by penetration (n = 8, 23%) and acceptability (n = 7, 20%). Apart from internal consistency and sample norms, psychometric properties were frequently unreported. Most measures were accessible and brief, though minimal information was provided regarding interpreting scores, handling missing data, or training needed to administer tools. 
Conclusions This work contributes to the nascent field of policy-focused implementation science by providing an overview of existing measurement tools used to evaluate mental health policy implementation and recommendations for measure development and refinement. To advance this field, more valid, reliable, and pragmatic measures are needed to evaluate policy implementation and close the policy-to-practice gap. Plain Language Summary Mental health is a critical component of wellness, and public policies present an opportunity to improve mental health on a large scale. Policy implementation is complex because it involves action by multiple entities at several levels of society. Policy implementation is also challenging because it can be impacted by many factors, such as political will, stakeholder relationships, and resources available for implementation. Because of these factors, implementation can vary between locations, such as states or countries. Because it is crucial to evaluate policy implementation, we conducted a systematic review to identify and evaluate the quality of measurement tools used in mental health policy implementation studies. Our search and screening procedures resulted in 34 measurement tools. We rated their quality to determine if these tools were practical to use and would yield consistent (i.e., reliable) and accurate (i.e., valid) data. These tools most frequently assessed whether implementing organizations complied with policy mandates and whether organizations had the training and other resources required to implement a policy. Though many were relatively brief and available at little-to-no cost, these findings highlight that more reliable, valid, and practical measurement tools are needed to assess and inform mental health policy implementation. Findings from this review can guide future efforts to select or develop policy implementation measures.
Affiliation(s)
- Meagan Pilar, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA; Department of Infectious Diseases, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Eliot Jost, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Callie Walsh-Bailey, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Byron J. Powell, Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA; Division of Infectious Diseases, John T. Milliken Department of Medicine, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Stephanie Mazzucca, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Amy Eyler, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Jonathan Purtle, Department of Public Health Policy & Management, New York University School of Global Public Health, Global Center for Implementation Science, New York University, New York, NY, USA
- Peg Allen, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Ross C. Brownson, Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA; Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA

16
Sokol RL, Mehdipanah R, Bess K, Mohammed L, Miller AL. When Families Do Not Request Help: Assessing a Social Determinants of Health Screening Tool in Practice. J Pediatr Health Care 2021;35:471-478. PMID: 34116869; DOI: 10.1016/j.pedhc.2021.05.002.
Abstract
INTRODUCTION Using pediatric social determinants of health screening data from a large medical system, we explored social needs disclosures and identified which needs were associated with resource connection requests. METHOD Data came from records of outpatient pediatric patients (0-18 years) seen between October 2018 and March 2020 (39,251 encounters). We assessed the percent of encounters where families (1) indicated a social need, and (2) requested a resource connection. We conducted multivariable logistic regression to identify which needs were associated with resource connection requests. RESULTS Among all encounters, 8% indicated a need and 2% requested a resource connection. Among families indicating a need, needs associated with resource requests included: housing (odds ratio [OR], 3.49 [2.42-5.03]), employment (OR, 3.15 [2.21-4.50]), food (OR, 1.89 [1.41-2.52]), and transportation (OR, 1.82 [1.30-2.56]). DISCUSSION Families seldom requested resource connections to address social needs. Better understanding families' interest in receiving assistance is an important next step in developing pediatric social determinants of health screening systems.
17
Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: Implementation Science Centers in Cancer Control. Transl Behav Med 2021;11:669-675. PMID: 32145023; DOI: 10.1093/tbm/ibaa018.
Abstract
The National Cancer Institute (NCI) Cancer Moonshot initiative seeks to accelerate cancer research for the USA. One of the scientific priorities identified by the Moonshot's Blue Ribbon Panel (BRP) of scientific experts was the implementation of evidence-based approaches. In September 2019, the NCI launched the Implementation Science Centers in Cancer Control (ISC3 or "Centers") initiative to advance this Moonshot priority. The vision of the ISC3 is to promote the development of research centers to build capacity and research in high-priority areas of cancer control implementation science (e.g., scale-up and spread, sustainability and adaptation, and precision implementation), build implementation laboratories within community and clinical settings, improve the state of measurement and methods, and improve the adoption, implementation, and sustainment of evidence-based cancer control interventions. This paper highlights the research agenda, vision, and strategic direction for these Centers and encourages transdisciplinary scientists to learn more about opportunities to collaborate with these Centers.
Affiliation(s)
- April Oh, Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
- Cynthia A Vinson, Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
- David A Chambers, Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA

18
Abstract
INTRODUCTION A 2011 paper proposed a working taxonomy of implementation outcomes, their conceptual distinctions and a two-pronged research agenda on their role in implementation success. Since then, over 1100 papers citing the manuscript have been published. Our goal is to compare the field's progress to the originally proposed research agenda, and outline recommendations for the next 10 years. To accomplish this, we are conducting the scoping review described here. METHODS AND ANALYSIS Our approach is informed by Arksey and O'Malley's methodological framework for conducting scoping reviews. We will adhere to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews. We first aim to assess the degree to which each implementation outcome has been investigated in the literature, including healthcare settings, clinical populations and innovations represented. We next aim to describe the relationship between implementation strategies and outcomes. Our last aim is to identify studies that empirically assess relationships among implementation and/or service and client outcomes. We will use a forward citation tracing approach to identify all literature that cited the 2011 paper in the Web of Science (WOS) and will supplement this with citation alerts sent to the second author for a 6-month period coinciding with the WOS citation search. Our review will focus on empirical studies that are designed to assess at least one of the identified implementation outcomes in the 2011 taxonomy and are published in peer-reviewed journals. We will generate descriptive statistics from extracted data and organise results by these research aims. ETHICS AND DISSEMINATION No human research participants will be involved in this review.
We plan to share findings through a variety of means including peer-reviewed journal publications, national conference presentations, invited workshops and webinars, email listservs affiliated with our institutions and professional associations, and academic social media.
Collapse
Affiliation(s)
| | - Enola K Proctor
- The Brown School, Washington University in St Louis, St Louis, Missouri, USA
| | - Alicia C Bunger
- College of Social Work, The Ohio State University, Columbus, Ohio, USA
| | - Donald R Gerke
- Graduate School of Social Work, University of Denver, Denver, Colorado, USA
19
Patel ZS, Jensen-Doss A, Zopluoglu C. Illustrating the Applicability of IRT to Implementation Science: Examining an Instrument of Therapist Attitudes. ADMINISTRATION AND POLICY IN MENTAL HEALTH AND MENTAL HEALTH SERVICES RESEARCH 2021; 48:921-935. [PMID: 33929639 DOI: 10.1007/s10488-021-01139-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/22/2021] [Indexed: 10/21/2022]
Abstract
Pragmatic instruments with psychometric support are important to advance dissemination and implementation (D&I) research, but few well-researched D&I instruments exist. Item response theory (IRT), an approach that is underutilized in D&I, can help with the development of actionable and brief instruments. This paper provides an overview of IRT for D&I researchers and examines an instrument of therapist attitudes using IRT measurement models. Eight items of the Attitudes Towards Individualized Assessment-Monitoring and Feedback (AIA-MF) Clinical Utility scale were fit to the Graded Response Model in a national sample of master's level therapists. Various IRT model characteristics including item threshold and discrimination parameters, information, and item and person fit were examined. Discrimination and threshold parameters showed significant variability across the eight items. Item information curves also showed that each item contributed variably to the total test information: items 4 and 5 reliably measured therapist attitudes across the latent continuum, whereas items 3 and 6 warrant further investigation. Results suggest that IRT models can help D&I researchers examine existing instruments with greater specificity than traditional measurement methods, thus increasing measurement precision while lowering response burden, both important considerations for the field.
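As a concrete illustration of the Graded Response Model discussed above, the sketch below computes category response probabilities for two hypothetical 4-category items. The discrimination (a) and threshold (b) values are invented for illustration; they are not the AIA-MF estimates reported in the paper.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Category probabilities P(X = k | theta) under the graded response model.

    a: item discrimination; b: ascending thresholds (K-1 values for a
    K-category item)."""
    b = np.asarray(b, dtype=float)
    # Cumulative curves P*(X >= k); by convention P*(X >= 0) = 1, P*(X >= K) = 0
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    return p_star[:-1] - p_star[1:]

# Two hypothetical 4-category items with identical thresholds but different
# discrimination (invented values, not the AIA-MF estimates)
theta = 0.5
strong = grm_category_probs(theta, a=2.2, b=[-1.0, 0.0, 1.0])
weak = grm_category_probs(theta, a=0.6, b=[-1.0, 0.0, 1.0])
```

At any θ the category probabilities sum to 1; the more discriminating item concentrates probability mass in fewer categories, which is what yields higher item information along the latent continuum.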
Affiliation(s)
- Zabin S Patel
- Department of Psychology, University of Miami, Coral Gables, FL, 33124, USA.
| | - Amanda Jensen-Doss
- Department of Psychology, University of Miami, Coral Gables, FL, 33124, USA
| | - Cengiz Zopluoglu
- Department of Educational Methodology, Policy, and Leadership, University of Oregon, Eugene, OR, 97403, USA
20
Santesson AHE, Bäckström M, Holmberg R, Perrin S, Jarbin H. Confirmatory factor analysis of the Evidence-Based Practice Attitude Scale (EBPAS) in a large and representative Swedish sample: is the use of the total scale and subscale scores justified? BMC Med Res Methodol 2020; 20:254. [PMID: 33054717 PMCID: PMC7557010 DOI: 10.1186/s12874-020-01126-4] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2020] [Accepted: 09/18/2020] [Indexed: 11/24/2022] Open
Abstract
Background There is a call for valid and reliable instruments to evaluate implementation of evidence-based practices (EBP). The 15-item Evidence-Based Practice Attitude Scale (EBPAS) measures attitudes toward EBP, incorporating four lower-order factor subscales (Appeal, Requirements, Openness, and Divergence) and a Total scale (General Attitudes). It is one of a few measures of EBP attitudes evaluated for its psychometric properties. The reliability of the Total scale has been repeatedly supported, as has the multidimensionality of the inventory. However, whether all of the items contribute to the EBPAS Total beyond their subscales has yet to be demonstrated. In addition, the Divergence subscale has been questioned because of its low correlation with the other subscales and low inter-item correlations. The EBPAS is widely used to tailor and evaluate implementation efforts, but a Swedish version has not yet been validated. This study aimed to contribute to the development and cross-validation of the EBPAS by examining the factor structure of a Swedish-language version in a large sample of mental health professionals. Methods The EBPAS was translated into Swedish and completed by 570 mental health professionals working in child and adolescent psychiatry settings across Sweden. The factor structure was examined using first-order, second-order and bifactor confirmatory factor analytic (CFA) models. Results Results suggested adequate fit for all CFA models. The EBPAS Total was strongly supported in the Swedish version. Support for the hierarchical second-order model was also strong, while the bifactor model gave mixed support for the subscales. The Openness and Requirements subscales performed best, while there were problems with both the Appeal subscale (e.g. not distinct from the General Attitudes factor) and the Divergence subscale (e.g. low reliability).
Conclusions Overall, the psychometric properties were on par with the English version, and the total score appears to be a valid measure of general attitudes towards EBP. This is the first study to support this General Attitudes factor on the basis of a bifactor model. Although the subscales were comparatively better supported in this Swedish sample, we conclude that use of the EBPAS subscale scores may result in misleading conclusions. Practical implications and future directions are discussed.
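The bifactor reasoning above can be made concrete with coefficient omega. Omega hierarchical (the share of total-score variance attributable to the general factor) relative to omega total shows why a strong general factor supports a total score while weak group factors undermine subscale scores. The loadings below are invented for illustration; they are not the EBPAS estimates.

```python
import numpy as np

# Hypothetical standardized bifactor loadings for 8 items: one general factor
# plus two group factors of 4 items each (invented values, not EBPAS results)
general = np.array([0.60, 0.65, 0.55, 0.70, 0.60, 0.50, 0.45, 0.60])
group = np.array([0.30, 0.25, 0.35, 0.20, 0.40, 0.45, 0.50, 0.30])
unique_var = 1.0 - general**2 - group**2        # unique (error) variances

g_sq = general.sum() ** 2                       # (sum of general loadings)^2
s_sq = group[:4].sum() ** 2 + group[4:].sum() ** 2  # per group factor
total_var = g_sq + s_sq + unique_var.sum()      # model-implied total-score variance

omega_total = (g_sq + s_sq) / total_var   # reliability of the total score
omega_h = g_sq / total_var                # share due to the general factor alone
```

A high omega hierarchical relative to omega total is the kind of evidence that supports interpreting a total score as general attitudes; small group-factor contributions are what make subscale scores risky to interpret on their own.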
Affiliation(s)
- Martin Bäckström
- Department of Psychology, Faculty of Social Sciences, Lund University, Lund, Sweden
| | - Robert Holmberg
- Department of Psychology, Faculty of Social Sciences, Lund University, Lund, Sweden
| | - Sean Perrin
- Department of Psychology, Faculty of Social Sciences, Lund University, Lund, Sweden
| | - Håkan Jarbin
- Department of Clinical Sciences, Faculty of Medicine, Lund University, BMC F12, S-, 221 84, Lund, Sweden
21
Khadjesari Z, Boufkhed S, Vitoratou S, Schatte L, Ziemann A, Daskalopoulou C, Uglik-Marucha E, Sevdalis N, Hull L. Implementation outcome instruments for use in physical healthcare settings: a systematic review. Implement Sci 2020; 15:66. [PMID: 32811517 PMCID: PMC7433178 DOI: 10.1186/s13012-020-01027-6] [Citation(s) in RCA: 21] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2020] [Accepted: 07/29/2020] [Indexed: 01/05/2023] Open
Abstract
BACKGROUND Implementation research aims to facilitate the timely and routine implementation and sustainment of evidence-based interventions and services. A glaring gap in this endeavour is the capability of researchers, healthcare practitioners and managers to quantitatively evaluate implementation efforts using psychometrically sound instruments. To encourage and support the use of precise and accurate implementation outcome measures, this systematic review aimed to identify and appraise studies that assess the measurement properties of quantitative implementation outcome instruments used in physical healthcare settings. METHOD The following data sources were searched from inception to March 2019, with no language restrictions: MEDLINE, EMBASE, PsycINFO, HMIC, CINAHL and the Cochrane library. Studies that evaluated the measurement properties of implementation outcome instruments in physical healthcare settings were eligible for inclusion. Proctor et al.'s taxonomy of implementation outcomes was used to guide the inclusion of implementation outcomes: acceptability, appropriateness, feasibility, adoption, penetration, implementation cost and sustainability. Methodological quality of the included studies was assessed using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Psychometric quality of the included instruments was assessed using the Contemporary Psychometrics checklist (ConPsy). Usability was determined by number of items per instrument. RESULTS Fifty-eight publications reporting on the measurement properties of 55 implementation outcome instruments (65 scales) were identified. The majority of instruments assessed acceptability (n = 33), followed by appropriateness (n = 7), adoption (n = 4), feasibility (n = 4), penetration (n = 4) and sustainability (n = 3) of evidence-based practice. 
The methodological quality of individual scales was low, with few studies rated as 'excellent' for reliability (6/62) and validity (7/63), and both studies that assessed responsiveness rated as 'poor' (2/2). The psychometric quality of the scales was also low, with 12/65 scales scoring 7 or more out of 22, indicating greater psychometric strength. Six scales (6/65) rated as 'excellent' for usability. CONCLUSION Investigators assessing implementation outcomes quantitatively should select instruments based on their methodological and psychometric quality to promote consistent and comparable implementation evaluations. Rather than developing ad hoc instruments, we encourage further psychometric testing of instruments with promising methodological and psychometric evidence. SYSTEMATIC REVIEW REGISTRATION PROSPERO 2017 CRD42017065348.
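The review's selection advice can be operationalized as a simple filter over instrument records. Everything below is illustrative: the instrument names and scores are invented, and only the 7-of-22 psychometric cut-off and the use of item count as a usability proxy come from the review.

```python
# Illustrative instrument records; names and scores are invented, but the
# fields mirror the review's criteria: a psychometric score out of 22
# (>= 7 indicating greater strength) and item count as a usability proxy.
instruments = [
    {"name": "Scale A", "outcome": "acceptability", "conpsy": 9, "items": 4},
    {"name": "Scale B", "outcome": "acceptability", "conpsy": 5, "items": 12},
    {"name": "Scale C", "outcome": "acceptability", "conpsy": 11, "items": 6},
    {"name": "Scale D", "outcome": "feasibility", "conpsy": 8, "items": 10},
]

def shortlist(records, outcome, min_conpsy=7):
    """Keep instruments for one implementation outcome with adequate
    psychometric evidence, ordered from shortest (most usable) up."""
    hits = [r for r in records if r["outcome"] == outcome and r["conpsy"] >= min_conpsy]
    return sorted(hits, key=lambda r: r["items"])

picks = shortlist(instruments, "acceptability")
```

A filter like this captures the review's recommendation: prefer psychometrically tested instruments over ad hoc ones, then favor brevity.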
Affiliation(s)
- Zarnie Khadjesari
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK.
- Behavioural and Implementation Science research group, School of Health Sciences, University of East Anglia, Edith Cavell Building, Norwich Research Park, Norwich, NR4 7TJ, UK.
| | - Sabah Boufkhed
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Silia Vitoratou
- Psychometrics and Measurement Lab, Biostatistics and Health Informatics Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Laura Schatte
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Alexandra Ziemann
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Centre for Healthcare Innovation Research, City, University of London, Northampton Square, London, EC1V 0HB, UK
| | - Christina Daskalopoulou
- Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Eleonora Uglik-Marucha
- Psychometrics and Measurement Lab, Biostatistics and Health Informatics Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Nick Sevdalis
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
| | - Louise Hull
- Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
22
What Is Dissemination and Implementation Science?: An Introduction and Opportunities to Advance Behavioral Medicine and Public Health Globally. Int J Behav Med 2020; 27:3-20. [PMID: 32060805 DOI: 10.1007/s12529-020-09848-x] [Citation(s) in RCA: 56] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
Abstract
There has been a well-documented gap between research (e.g., evidence-based programs, interventions, practices, policies, guidelines) and practice (e.g., what is routinely delivered in real-world community and clinical settings). Dissemination and implementation (D&I) science has emerged to address this research-to-practice gap and accelerate the speed with which translation and real-world uptake and impact occur. In recent years, there has been tremendous development in the field and growing global interest, but much of the introductory literature has been U.S.-centric. This piece provides an introduction to D&I science, summarizes key concepts and progress in the field for a global audience, presents two case studies that highlight examples of D&I research globally, and identifies opportunities and innovations for advancing D&I research globally.
23
Pascoe M, Mahura O, Dean J. Health resources for South Africa: A scoping review. Health SA 2020; 25:1378. [PMID: 32832107 PMCID: PMC7433232 DOI: 10.4102/hsag.v25i0.1378] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2019] [Accepted: 06/07/2020] [Indexed: 11/24/2022] Open
Abstract
BACKGROUND Healthcare is more effective when people are treated in their own language, with respect for their culture. However, information about the availability and nature of health resources is fragmented, and studies suggest there are few assessments, screening tools, or other health resources in many of South Africa's languages. AIM This scoping review identified health resources written in the eleven official languages of South Africa for health professionals to use in patient assessment and management. METHODS Databases were searched and information about resources collated and analysed. RESULTS Two hundred and fifty-two unique resources were found (444 items if different language versions of the same resource were counted separately). All official languages were represented. The most common languages (excluding English) were Afrikaans (118 resources), isiXhosa (80) and isiZulu (55). CONCLUSION Development of more health resources and critical evaluation of their validity and reliability remain important. This study contributes a preliminary database for South African health professionals, ultimately promoting improved service delivery.
Affiliation(s)
- Michelle Pascoe
- Department of Health and Rehabilitation Sciences/Child Language Africa, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
| | - Olebeng Mahura
- Department of Health and Rehabilitation Sciences/Child Language Africa, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
| | - Jessica Dean
- Department of Health and Rehabilitation Sciences/Child Language Africa, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
24
Kerner JF, Kavanaugh-Lynch MHE, Baezconde-Garbanati L, Politis C, Prager A, Brownson RC. Doing What We Know, Knowing What to Do: Californians Linking Action with Science for Prevention of Breast Cancer (CLASP-BC). INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2020; 17:E5050. [PMID: 32674312 PMCID: PMC7399883 DOI: 10.3390/ijerph17145050] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 06/01/2020] [Revised: 07/06/2020] [Accepted: 07/07/2020] [Indexed: 12/24/2022]
Abstract
Given the lack of progress in breast cancer prevention, the California Breast Cancer Research Program (CBCRP) plans to apply current scientific knowledge about breast cancer to primary prevention at the population level. This paper describes the first phase of Californians Linking Action with Science for Prevention of Breast Cancer (CLASP-BC). The foci of Phase 1 are coalition building and coalition capacity building through community engagement in community-based participatory research (CBPR) and dissemination and implementation (D&I) research training. Based on the successful implementation and evaluation of Phase 1, the foci of Phase 2 (presented separately in this special issue) will be to translate the overarching goal of the California Breast Cancer Prevention Plan and its specific intervention goals for 23 breast cancer risk and protective factors into evidence-informed interventions (EIIs) that are disseminated and implemented across California. CLASP-BC is designed to identify, disseminate and implement high-impact, population-based prevention approaches by funding large-scale EIIs, through multi-jurisdictional actions, with the intent of decreasing the risk of breast cancer and other chronic diseases (which share common risk factors), particularly among racial/ethnic minorities and medically underserved populations in California.
Collapse
Affiliation(s)
- Jon F. Kerner
- California Breast Cancer Research Program, Bethesda, MD 20186, USA
| | - Marion H. E. Kavanaugh-Lynch
- California Breast Cancer Research Program, University of California Office of the President, Oakland, CA 94612, USA;
| | - Lourdes Baezconde-Garbanati
- Preventive Medicine, Community Initiatives, Keck School of Medicine (KSOM), University of Southern California, Los Angeles, CA 90033, USA;
- Community Engagement, Norris Comprehensive Cancer Center, University of Southern California, Los Angeles, CA 90033, USA
- Center for Health Equity in the Americas, KSOM, University of Southern California, Los Angeles, CA 90007, USA
| | - Christopher Politis
- Cancer Screening, Canadian Partnership Against Cancer, Toronto, ON M5H 1J8, Canada;
| | - Aviva Prager
- California Pan-Ethnic Health Network, Oakland, CA 94612, USA;
| | - Ross C. Brownson
- Brown School, Washington University in St. Louis, St. Louis, MO 63130, USA;
- Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, School of Medicine, Washington University, St. Louis, MO 63110, USA
25
Allen P, Pilar M, Walsh-Bailey C, Hooley C, Mazzucca S, Lewis CC, Mettert KD, Dorsey CN, Purtle J, Kepper MM, Baumann AA, Brownson RC. Quantitative measures of health policy implementation determinants and outcomes: a systematic review. Implement Sci 2020; 15:47. [PMID: 32560661 PMCID: PMC7304175 DOI: 10.1186/s13012-020-01007-w] [Citation(s) in RCA: 63] [Impact Index Per Article: 15.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2020] [Accepted: 06/05/2020] [Indexed: 01/02/2023] Open
Abstract
BACKGROUND Public policy has tremendous impacts on population health. While policy development has been extensively studied, policy implementation research is newer and relies largely on qualitative methods. Quantitative measures are needed to disentangle the differential impacts of policy implementation determinants (i.e., barriers and facilitators) and outcomes to ensure intended benefits are realized. Implementation outcomes include acceptability, adoption, appropriateness, compliance/fidelity, feasibility, penetration, sustainability, and costs. This systematic review identified quantitative measures that are used to assess health policy implementation determinants and outcomes and evaluated the quality of these measures. METHODS Three frameworks guided the review: the Implementation Outcomes Framework (Proctor et al.), the Consolidated Framework for Implementation Research (Damschroder et al.), and the Policy Implementation Determinants Framework (Bullock et al.). Six databases were searched: Medline, CINAHL Plus, PsycInfo, PAIS, ERIC, and Worldwide Political Science Abstracts. Searches were limited to English-language, peer-reviewed journal articles published January 1995 to April 2019. Search terms addressed four levels: health, public policy, implementation, and measurement. Empirical studies of public policies addressing physical or behavioral health were included if they used quantitative self-report or archival measures of policy implementation with at least two items assessing implementation outcomes or determinants. Consensus scoring with the Psychometric and Pragmatic Evidence Rating Scale assessed the quality of measures. RESULTS Database searches yielded 8417 non-duplicate studies, of which 870 (10.3%) underwent full-text screening, yielding 66 included studies. From the included studies, 70 unique measures were identified that quantitatively assess implementation outcomes and/or determinants.
Acceptability, feasibility, appropriateness, and compliance were the most commonly measured implementation outcomes. Common determinants in the identified measures were organizational culture, implementation climate, and readiness for implementation, each an aspect of the internal setting. Pragmatic quality ranged from adequate to good, with most measures freely available, brief, and at a high school reading level. Few psychometric properties were reported. CONCLUSIONS Well-tested quantitative measures of the implementation internal setting were under-utilized in policy studies. Further development and testing of external context measures are warranted. This review is intended to stimulate measure development and high-quality assessment of health policy implementation outcomes and determinants to help practitioners and researchers spread evidence-informed policies to improve population health. REGISTRATION Not registered.
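The pragmatic-quality rating step described above can be sketched as a rubric applied to each measure. The criteria and point values below are illustrative stand-ins, not the actual Psychometric and Pragmatic Evidence Rating Scale rubric.

```python
def pragmatic_score(measure):
    """Toy pragmatic rubric: one point each for free availability, brevity,
    and readability (criteria invented for illustration, not the published
    rating scale)."""
    points = 0
    points += 1 if measure["freely_available"] else 0
    points += 1 if measure["n_items"] <= 10 else 0
    points += 1 if measure["reading_grade_level"] <= 12 else 0
    return points

# A hypothetical measure that is free, brief, and readable
example = {"freely_available": True, "n_items": 8, "reading_grade_level": 10}
```

Summing simple criteria like these is what lets a review rank many heterogeneous instruments on a common pragmatic scale.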
Affiliation(s)
- Peg Allen
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Meagan Pilar
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Callie Walsh-Bailey
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Cole Hooley
- School of Social Work, Brigham Young University, 2190 FJSB, Provo, UT 84602 USA
| | - Stephanie Mazzucca
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Cara C. Lewis
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
| | - Kayne D. Mettert
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
| | - Caitlin N. Dorsey
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Ave, Seattle, WA 98101 USA
| | - Jonathan Purtle
- Department of Health Management & Policy, Drexel University Dornsife School of Public Health, Nesbitt Hall, 3215 Market St, Philadelphia, PA 19104 USA
| | - Maura M. Kepper
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Ana A. Baumann
- Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
| | - Ross C. Brownson
- Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA
- Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, 4921 Parkview Place, Saint Louis, MO 63110 USA
26
Smith JD, Rafferty MR, Heinemann AW, Meachum MK, Villamar J, Lieber RL, Brown CH. Pragmatic adaptation of implementation research measures for a novel context and multiple professional roles: a factor analysis study. BMC Health Serv Res 2020; 20:257. [PMID: 32228572 PMCID: PMC7106795 DOI: 10.1186/s12913-020-05118-4] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/28/2018] [Accepted: 03/18/2020] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Although some advances have been made in recent years, the lack of measures remains a major challenge in the field of implementation research. This results in frequent adaptation of implementation measures for different contexts, including different types of respondents or professional roles, than those for which they were originally developed and validated. The psychometric properties of these adapted measures are often not rigorously evaluated or reported. In this study, we examined the internal consistency, factor structure, and structural invariance of four well-validated measures of inner setting factors across four groups of respondents. The items in these measures were adapted as part of an evaluation of a large-scale organizational change in a rehabilitation hospital, which involved transitioning to a new building and a new model of patient care, facilitated by a significant redesign of patient care and research spaces. METHODS Items were tailored to the context and perspective of different respondent groups and shortened for pragmatism. Confirmatory factor analysis was then used to test study hypotheses related to fit, internal consistency, and invariance across groups. RESULTS The survey was administered to approximately 1208 employees; 785 responded (a 65% response rate) across the roles of clinician, researcher, leader, support staff, and dual clinician-researcher. For each of the four scales, confirmatory factor analysis demonstrated adequate fit that largely replicated the original measure. However, a few items loaded poorly and were removed from the final models. Internal consistencies of the final scales were acceptable. For scales administered to multiple professional roles, factor structures were not statistically different across groups, indicating structural invariance. CONCLUSIONS The four inner setting measures were robust for use in this new context and across the multiple stakeholder groups surveyed.
Shortening these measures did not significantly impair their measurement properties; however, as this study was cross-sectional, future studies are required to evaluate the predictive validity and test-retest reliability of these measures. The successful use of adapted measures across contexts, across and between respondent groups, and with fewer items is encouraging, given the current emphasis on designing pragmatic implementation measures.
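The invariance conclusion above rests on comparing nested CFA models, typically via a chi-square difference (likelihood-ratio) test between a model with parameters constrained equal across groups and a freely estimated one. A minimal sketch, using hypothetical fit statistics rather than the study's values:

```python
import math

def chi2_sf_even_df(x, df):
    """Survival function P(X > x) for a chi-square variable with even df
    (closed form; sufficient for this sketch without SciPy)."""
    assert df % 2 == 0 and df > 0
    half = x / 2.0
    return math.exp(-half) * sum(half**i / math.factorial(i) for i in range(df // 2))

def chisq_difference_test(chisq_free, df_free, chisq_constrained, df_constrained):
    """Compare nested CFA models; a non-significant difference is consistent
    with invariance of the constrained parameters across groups."""
    d_chisq = chisq_constrained - chisq_free
    d_df = df_constrained - df_free
    return d_chisq, d_df, chi2_sf_even_df(d_chisq, d_df)

# Hypothetical fit statistics (not values from the study)
d_chisq, d_df, p = chisq_difference_test(210.4, 96, 222.1, 104)
```

Here the constrained model does not fit significantly worse than the free model, the pattern that supports a claim of structural invariance.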
Affiliation(s)
- Justin D Smith
- Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA.
- Department of Preventive Medicine, Northwestern University Feinberg School of Medicine, Chicago, IL, USA.
- Department of Pediatrics, Northwestern University Feinberg School of Medicine, Chicago, IL, USA.
- Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA.
| | - Miriam R Rafferty
- Center for Education in Health Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
| | - Allen W Heinemann
- Department of Physical Medicine and Rehabilitation, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Shirley Ryan AbilityLab, Chicago, IL, USA
| | - Mariah K Meachum
- Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
| | - Juan Villamar
- Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
| | - Richard L Lieber
- Shirley Ryan AbilityLab, Chicago, IL, USA
- Departments of Physiology, Biomedical Engineering and Physical Medicine and Rehabilitation, Northwestern University, Chicago, IL, USA
| | - C Hendricks Brown
- Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Department of Preventive Medicine, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
27
Hwang S, Birken SA, Melvin CL, Rohweder CL, Smith JD. Designs and methods for implementation research: Advancing the mission of the CTSA program. J Clin Transl Sci 2020; 4:159-167. [PMID: 32695483 PMCID: PMC7348037 DOI: 10.1017/cts.2020.16] [Citation(s) in RCA: 36] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2019] [Revised: 02/15/2020] [Accepted: 02/20/2020] [Indexed: 12/25/2022] Open
Abstract
INTRODUCTION The US National Institutes of Health (NIH) established the Clinical and Translational Science Award (CTSA) program in response to the challenges of translating biomedical and behavioral interventions from discovery to real-world use. To address the challenge of translating evidence-based interventions (EBIs) into practice, the field of implementation science has emerged as a distinct discipline. With the distinction between EBI effectiveness research and implementation research comes differences in study design and methodology, shifting focus from clinical outcomes to the systems that support adoption and delivery of EBIs with fidelity. METHODS Implementation research designs share many of the foundational elements and assumptions of efficacy/effectiveness research. Designs and methods currently applied in implementation research include experimental, quasi-experimental, observational, hybrid effectiveness-implementation, simulation modeling, and configurational comparative methods. RESULTS Examples of specific research designs and methods illustrate their use in implementation science. We propose that the CTSA program take advantage of the momentum of the field's capacity building in three ways: 1) integrate state-of-the-science implementation methods and designs into its existing body of research; 2) position itself at the forefront of implementation science by collaborating with other NIH institutes that share the goal of advancing the field; and 3) provide adequate training in implementation science. CONCLUSIONS As implementation methodologies mature, both implementation science and the CTSA program would greatly benefit from cross-fertilizing expertise and shared infrastructures that aim to advance healthcare in the USA and around the world.
Affiliation(s)
- Soohyun Hwang
- Department of Health Policy and Management, UNC Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
| | - Sarah A. Birken
- Department of Health Policy and Management, UNC Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
| | - Cathy L. Melvin
- Department of Public Health Sciences, Medical University of South Carolina, Charleston, SC, USA
| | - Catherine L. Rohweder
- UNC Center for Health Promotion and Disease Prevention, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
| | - Justin D. Smith
- Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
28
Miake-Lye IM, Delevan DM, Ganz DA, Mittman BS, Finley EP. Unpacking organizational readiness for change: an updated systematic review and content analysis of assessments. BMC Health Serv Res 2020; 20:106. [PMID: 32046708 PMCID: PMC7014613 DOI: 10.1186/s12913-020-4926-z] [Citation(s) in RCA: 53] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2019] [Accepted: 01/23/2020] [Indexed: 12/11/2022] Open
Abstract
BACKGROUND Organizational readiness assessments have a history of being developed as important support tools for successful implementation. However, it remains unclear how best to operationalize readiness across varied projects or settings. We conducted a synthesis and content analysis of published readiness instruments to compare how investigators have operationalized the concept of organizational readiness for change. METHODS We identified readiness assessments using a systematic review and update search. We mapped individual assessment items to the Consolidated Framework for Implementation Research (CFIR), which identifies five domains affecting implementation (outer setting, inner setting, intervention characteristics, characteristics of individuals, and implementation process) and multiple constructs within each domain. RESULTS Of 1370 survey items, 897 (65%) mapped to the CFIR domain of inner setting, most commonly related to constructs of readiness for implementation (n = 220); networks and communication (n = 207); implementation climate (n = 204); structural characteristics (n = 139); and culture (n = 93). Two hundred forty-two items (18%) mapped to characteristics of individuals (mainly other personal attributes [n = 157] and self-efficacy [n = 52]); 80 (6%) mapped to outer setting; 51 (4%) mapped to implementation process; 40 (3%) mapped to intervention characteristics; and 60 (4%) did not map to CFIR constructs. Instruments were typically tailored to specific interventions or contexts. DISCUSSION Available readiness instruments predominantly focus on contextual factors within the organization and characteristics of individuals, but the specificity of most assessment items suggests a need to tailor items to the specific scenario in which an assessment is fielded. Readiness assessments must bridge the gap between measuring a theoretical construct and factors of importance to a particular implementation.
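The domain-level percentages in the abstract above can be re-derived from the reported item counts. A minimal Python sketch (the counts come from the abstract; the dictionary and function names are illustrative):

```python
# Tally of survey items mapped to CFIR domains, as reported in the abstract.
domain_counts = {
    "inner setting": 897,
    "characteristics of individuals": 242,
    "outer setting": 80,
    "implementation process": 51,
    "intervention characteristics": 40,
    "unmapped": 60,
}

total_items = sum(domain_counts.values())  # 1370 items in total

def domain_shares(counts):
    """Return each domain's share of total items, rounded to whole percent."""
    total = sum(counts.values())
    return {domain: round(100 * n / total) for domain, n in counts.items()}

shares = domain_shares(domain_counts)
```

Each share is simply the domain count divided by the 1370-item total, rounded to the nearest whole percent.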
Collapse
Affiliation(s)
- Isomi M Miake-Lye
- VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA. .,University of California, Los Angeles, Los Angeles, CA, USA.
| | | | - David A Ganz
- VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA.,University of California, Los Angeles, Los Angeles, CA, USA
| | - Brian S Mittman
- VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA.,Kaiser Permanente Research, Pasadena, CA, USA
| | - Erin P Finley
- South Texas Veterans Health Care System, San Antonio, TX, USA.,The University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
| |
Collapse
|
29
|
Williams NJ, Wolk CB, Becker-Haimes EM, Beidas RS. Testing a theory of strategic implementation leadership, implementation climate, and clinicians' use of evidence-based practice: a 5-year panel analysis. Implement Sci 2020; 15:10. [PMID: 32033575 PMCID: PMC7006179 DOI: 10.1186/s13012-020-0970-7] [Citation(s) in RCA: 62] [Impact Index Per Article: 15.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2019] [Accepted: 01/31/2020] [Indexed: 01/15/2023] Open
Abstract
BACKGROUND Implementation theory suggests that first-level leaders, sometimes referred to as middle managers, can increase clinicians' use of evidence-based practice (EBP) in healthcare settings by enacting specific leadership behaviors (i.e., proactive, knowledgeable, supportive, perseverant with regard to implementation) that develop an EBP implementation climate within the organization; however, longitudinal and quasi-experimental studies are needed to test this hypothesis. METHODS Using data collected at three waves over a 5-year period from a panel of 30 outpatient children's mental health clinics employing 496 clinicians, we conducted a quasi-experimental difference-in-differences study to test whether within-organization change in implementation leadership predicted within-organization change in EBP implementation climate, and whether change in EBP implementation climate predicted within-organization change in clinicians' use of EBP. At each wave, clinicians reported on their first-level leaders' implementation leadership, their organization's EBP implementation climate, and their use of both EBP and non-EBP psychotherapy techniques for childhood psychiatric disorders. Hypotheses were tested using econometric two-way fixed effects regression models at the organization level which controlled for all stable organizational characteristics, population trends in the outcomes over time, and time-varying covariates. RESULTS Organizations that improved from low to high levels of implementation leadership experienced significantly greater increases in their level of EBP implementation climate (d = .92, p = .017) and within-organization increases in implementation leadership accounted for 11% of the variance in improvement in EBP implementation climate beyond all other covariates. 
In turn, organizations that improved from low to high levels of EBP implementation climate experienced significantly greater increases in their clinicians' average EBP use (d = .55, p = .007) and within-organization improvement in EBP implementation climate accounted for 14% of the variance in increased clinician EBP use. Mediation analyses indicated that improvement in implementation leadership had a significant indirect effect on clinicians' EBP use via improvement in EBP implementation climate (d = .26, 95% CI [.02 to .59]). CONCLUSIONS When first-level leaders increase their frequency of implementation leadership behaviors, organizational EBP implementation climate improves, which in turn contributes to increased EBP use by clinicians. Trials are needed to test strategies that target this implementation leadership-EBP implementation climate mechanism.
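The two-way fixed effects approach described above (organization and wave fixed effects absorbing stable organizational characteristics and population time trends) can be illustrated on simulated panel data. This is a hedged sketch of the general within-estimator, not the authors' analysis code; all names and parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_orgs, n_waves = 30, 3  # mirrors a 30-organization, 3-wave panel

# Simulate: outcome = org effect + wave effect + beta * leadership + noise
org_fe = rng.normal(size=n_orgs)
wave_fe = np.array([0.0, 0.5, 1.0])
beta_true = 0.8

orgs = np.repeat(np.arange(n_orgs), n_waves)   # org id per observation
waves = np.tile(np.arange(n_waves), n_orgs)    # wave id per observation
leadership = rng.normal(size=n_orgs * n_waves)
y = (org_fe[orgs] + wave_fe[waves] + beta_true * leadership
     + 0.1 * rng.normal(size=n_orgs * n_waves))

def demean_two_way(x, orgs, waves):
    """Subtract org and wave means (adding back the grand mean).

    Exact two-way demeaning for a balanced panel with 0..G-1 group labels."""
    x = x.astype(float)
    org_means = np.array([x[orgs == g].mean() for g in np.unique(orgs)])
    wave_means = np.array([x[waves == t].mean() for t in np.unique(waves)])
    return x - org_means[orgs] - wave_means[waves] + x.mean()

y_w = demean_two_way(y, orgs, waves)
x_w = demean_two_way(leadership, orgs, waves)

# Within estimator: fixed effects drop out after demeaning, leaving OLS on
# the demeaned variables to recover the leadership effect.
beta_hat = (x_w @ y_w) / (x_w @ x_w)
```

Because demeaning removes both sets of fixed effects exactly in a balanced panel, `beta_hat` recovers the true within-organization effect of the predictor.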
Collapse
Affiliation(s)
- Nathaniel J Williams
- School of Social Work, Boise State University, Boise, ID, USA. .,Institute for the Study of Behavioral Health and Addiction, Boise State University, Boise, ID, USA. .,School of Social Work, Boise State University, Room 711, 1910 University Drive, Boise, ID, 83725, USA.
| | - Courtney Benjamin Wolk
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
| | - Emily M Becker-Haimes
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA.,Hall Mercer Community Mental Health Center, Pennsylvania Hospital, Philadelphia, PA, USA
| | - Rinad S Beidas
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA.,Department of Medical Ethics and Health Policy, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA.,Department of Medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA.,Penn Implementation Science Center at the Leonard Davis Institute of Health Economics (PISCE@LDI), University of Pennsylvania, Philadelphia, PA, USA
| |
Collapse
|
30
|
Quantitative approaches for the evaluation of implementation research studies. Psychiatry Res 2020; 283:112521. [PMID: 31473029 PMCID: PMC7176071 DOI: 10.1016/j.psychres.2019.112521] [Citation(s) in RCA: 25] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/30/2019] [Revised: 08/14/2019] [Accepted: 08/16/2019] [Indexed: 01/10/2023]
Abstract
Implementation research necessitates a shift from clinical trial methods in both the conduct of the study and in the way that it is evaluated, given the focus on the impact of implementation strategies, that is, the methods or techniques used to support the adoption and delivery of a clinical or preventive intervention, program, or policy. As strategies target one or more levels within the service delivery system, evaluating their impact needs to follow suit. This article discusses the methods and practices involved in quantitative evaluations of implementation research studies. We focus on evaluation methods that characterize and quantify the overall impacts of an implementation strategy on various outcomes. It also reviews available measurement methods for the common quantitative implementation outcomes involved in such an evaluation (adoption, fidelity, implementation cost, reach, and sustainment) and the sources of data for these metrics, using established taxonomies and frameworks. Last, we present an example of a quantitative evaluation from an ongoing randomized rollout implementation trial of the Collaborative Care Model for depression management in a large primary healthcare system.
Collapse
|
31
|
McKay H, Naylor PJ, Lau E, Gray SM, Wolfenden L, Milat A, Bauman A, Race D, Nettlefold L, Sims-Gould J. Implementation and scale-up of physical activity and behavioural nutrition interventions: an evaluation roadmap. Int J Behav Nutr Phys Act 2019; 16:102. [PMID: 31699095 PMCID: PMC6839114 DOI: 10.1186/s12966-019-0868-4] [Citation(s) in RCA: 74] [Impact Index Per Article: 14.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2019] [Accepted: 10/22/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Interventions that work must be effectively delivered at scale to achieve population level benefits. Researchers must choose among a vast array of implementation frameworks (> 60) that guide design and evaluation of implementation and scale-up processes. Therefore, we sought to recommend conceptual frameworks that can be used to design, inform, and evaluate implementation of physical activity (PA) and nutrition interventions at different stages of the program life cycle. We also sought to recommend a minimum data set of implementation outcome and determinant variables (indicators) as well as measures and tools deemed most relevant for PA and nutrition researchers. METHODS We adopted a five-round modified Delphi methodology. For rounds 1, 2, and 3 we administered online surveys to PA and nutrition implementation scientists to generate a rank order list of the most commonly used: i) implementation and scale-up frameworks, ii) implementation indicators, and iii) implementation and scale-up measures and tools. Measures and tools were excluded after round 2 as input from participants was very limited. For rounds 4 and 5, we conducted two in-person meetings with an expert group to create a shortlist of implementation and scale-up frameworks, identify a minimum data set of indicators, and discuss the application and relevance of frameworks and indicators to the field of PA and nutrition. RESULTS The two most commonly referenced implementation frameworks were the Framework for Effective Implementation and the Consolidated Framework for Implementation Research. We provide the 25 most highly ranked implementation indicators reported by those who participated in rounds 1-3 of the survey. From these, the expert group created a recommended minimum data set of implementation determinants (n = 10) and implementation outcomes (n = 5) and reconciled differences in commonly used terms and definitions.
CONCLUSIONS Researchers are confronted with myriad options when conducting implementation and scale-up evaluations. Thus, we identified and prioritized a list of frameworks and a minimum data set of indicators that have potential to improve the quality and consistency of evaluating implementation and scale-up of PA and nutrition interventions. Advancing our science is predicated upon increased efforts to develop a common 'language' and adaptable measures and tools.
Collapse
Affiliation(s)
- Heather McKay
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada. .,Department of Family Practice, University of British Columbia, 3rd Floor David Strangway Building, 5950 University Boulevard, Vancouver, BC, V6T 1Z3, Canada.
| | - Patti-Jean Naylor
- School of Exercise Science, Physical Health and Education, Faculty of Education, University of Victoria, PO Box 3015 STN CSC, Victoria, BC, V8W 3P1, Canada
| | - Erica Lau
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada.,Department of Family Practice, University of British Columbia, 3rd Floor David Strangway Building, 5950 University Boulevard, Vancouver, BC, V6T 1Z3, Canada
| | - Samantha M Gray
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada
| | - Luke Wolfenden
- School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, 2308, Australia.,Hunter New England Population Health, Wallsend, New South Wales, 2287, Australia
| | - Andrew Milat
- The New South Wales Ministry of Health, North Sydney, New South Wales, 2059, Australia.,Sydney School of Public Health, University of Sydney, Charles Perkins Centre, Building D17, Sydney, New South Wales, 2006, Australia
| | - Adrian Bauman
- Sydney School of Public Health, University of Sydney, Charles Perkins Centre, Building D17, Sydney, New South Wales, 2006, Australia
| | - Douglas Race
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada
| | - Lindsay Nettlefold
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada
| | - Joanie Sims-Gould
- Centre for Hip Health and Mobility, Vancouver Coastal Health Research Centre, 7th Floor Robert H.N. Ho Research Centre, 795-2635 Laurel St, Vancouver, BC, V5Z 1M9, Canada.,Department of Family Practice, University of British Columbia, 3rd Floor David Strangway Building, 5950 University Boulevard, Vancouver, BC, V6T 1Z3, Canada
| |
Collapse
|
32
|
Haroz EE, Bolton P, Nguyen AJ, Lee C, Bogdanov S, Bass J, Singh NS, Doty SB, Murray L. Measuring implementation in global mental health: validation of a pragmatic implementation science measure in eastern Ukraine using an experimental vignette design. BMC Health Serv Res 2019; 19:262. [PMID: 31036002 PMCID: PMC6489318 DOI: 10.1186/s12913-019-4097-y] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2018] [Accepted: 04/12/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND There is mounting evidence supporting the effectiveness of task-shifted mental health interventions in low- and middle-income countries (LMIC). However, there has been limited systematic scale-up or sustainability of these programs, indicating a need to study implementation. One barrier to progress is a lack of locally relevant and valid implementation measures. We adapted an existing brief dissemination and implementation (D&I) measure, which includes scales for acceptability, appropriateness, feasibility, and accessibility, for local use and studied its validity and reliability among a sample of consumers in Ukraine. METHODS Local qualitative data informed adaptation of the measure and development of vignettes to test the reliability and validity. Participants were veterans and internally displaced persons (IDPs) recruited as part of a separate validity study of adapted mental health instruments. We examined internal consistency reliability, test-retest reliability, and construct and criterion validity for each scale on the measure. We randomly assigned half the participants to respond to a vignette depicting existing local psychiatric services which we knew were not well regarded, while the other half was randomized to a vignette describing a potentially more well-implemented mental health service. Criterion validity was assessed by comparing scores on each scale by vignette and by overall summary ratings of the programs described in the vignettes. RESULTS N = 169 participated in the qualitative study and N = 153 participated in the validity study. Qualitative findings suggested the addition of several items to the measure and indicated the importance of addressing professionalism/competency of providers in both the scales and the vignettes. Internal consistency reliabilities ranged from α = 0.85 for feasibility to α = 0.91 for appropriateness. Test-retest reliabilities were acceptable to good for all scales (rho: 0.61-0.79).
All scales demonstrated substantial and significant differences in average scores by vignette assignment (ORs: 2.21-5.6) and overall ratings (ORs: 5.1-14.47), supporting criterion validity. CONCLUSIONS This study represents an innovative mixed-methods approach to testing an implementation science measure in contexts outside the United States. Results support the reliability and validity of most scales for consumers in Ukraine. Challenges included large amounts of missing data due to participants' difficulties responding to questions about a hypothetical program.
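The internal consistency values quoted above (α = 0.85 to α = 0.91) are Cronbach's alpha coefficients. As a generic illustration of how the statistic is computed from an item-response matrix (the data below are simulated, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated 5-item scale: items share a common factor, so alpha is high.
rng = np.random.default_rng(1)
common = rng.normal(size=(200, 1))
responses = common + 0.5 * rng.normal(size=(200, 5))
alpha = cronbach_alpha(responses)
```

Alpha rises toward 1 as the items covary more strongly relative to their individual variances, which is why scales built from items tapping one construct score high.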
Collapse
Affiliation(s)
- E E Haroz
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA.
| | - P Bolton
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA.,Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
| | - A J Nguyen
- University of Virginia Curry School of Education, Virginia, USA
| | - C Lee
- Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
| | - S Bogdanov
- Center for Mental Health and Psychosocial Support National University of Kyiv-Mohyla, Kyiv-Mohyla, Ukraine
| | - J Bass
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
| | - N S Singh
- Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
| | - S B Doty
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
| | - L Murray
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
| |
Collapse
|
33
|
Walker TJ, Rodriguez SA, Vernon SW, Savas LS, Frost EL, Fernandez ME. Validity and reliability of measures to assess constructs from the inner setting domain of the consolidated framework for implementation research in a pediatric clinic network implementing HPV programs. BMC Health Serv Res 2019; 19:205. [PMID: 30925870 PMCID: PMC6441163 DOI: 10.1186/s12913-019-4021-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2018] [Accepted: 03/18/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Accurate and valid measures for implementation constructs are critical to advance research and guide implementation efforts. However, there is a continued need for valid and reliable measures for implementation research. The purpose of this study was to assess the psychometric properties of measures for the Inner Setting domain of the Consolidated Framework for Implementation Research (CFIR) in a network of pediatric clinics. METHODS This study used cross-sectional survey data collected from physicians, advanced practice providers, clinic managers, and clinical staff (n = 546) working in a pediatric clinic network (n = 51). Surveys included measures assessing Inner Setting constructs from CFIR (culture, learning climate, leadership engagement, and available resources). We used a series of multilevel confirmatory factor analysis (CFA) models to assess factorial validity. We also examined measure correlations to test discriminant validity and intraclass correlation coefficients, ICC(1) and ICC(2), to assess inter-rater reliability. RESULTS Factor loadings were high (≥ 0.60) for all but one of the measurement items. Most CFA models for respective constructs demonstrated adequate or good model fit (CFI > 0.90, TLI > 0.90, RMSEA < 0.08, and SRMR < 0.08). The measures also demonstrated good discriminant validity (correlations < 0.90) aside from some evidence of overlap between leadership engagement and learning climate at the clinic level (0.91). The ICC(1) values ranged from 0.05-0.16 while the ICC(2) values ranged from 0.34-0.67. CONCLUSIONS The measures demonstrated good validity and adequate reliability with the exception of available resources, which had some evidence of lower than desired reliability and validity at the clinic level. Our findings extend previous work by providing additional psychometric evidence to support the use of these Inner Setting measures in pediatric clinics implementing human papillomavirus programs.
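The ICC(1) and ICC(2) statistics reported above are conventionally derived from a one-way ANOVA of individual ratings nested in clinics: ICC(1) estimates the reliability of a single rater's score, ICC(2) the reliability of the clinic mean. A minimal sketch on simulated balanced data (all names and values illustrative, not the study's):

```python
import numpy as np

def icc1_icc2(scores, groups):
    """ICC(1) and ICC(2) from a one-way ANOVA with balanced groups."""
    scores = np.asarray(scores, dtype=float)
    labels = np.unique(groups)
    k = len(scores) // len(labels)  # common group size (balanced design)
    grand = scores.mean()
    group_means = np.array([scores[groups == g].mean() for g in labels])
    ms_between = k * ((group_means - grand) ** 2).sum() / (len(labels) - 1)
    ms_within = sum(((scores[groups == g] - scores[groups == g].mean()) ** 2).sum()
                    for g in labels) / (len(scores) - len(labels))
    icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc2 = (ms_between - ms_within) / ms_between
    return icc1, icc2

# Simulated ratings: 50 clinics, 10 raters each, modest clinic-level variance.
rng = np.random.default_rng(2)
clinic_effects = rng.normal(scale=0.4, size=50)
groups = np.repeat(np.arange(50), 10)
scores = clinic_effects[groups] + rng.normal(size=500)
icc1, icc2 = icc1_icc2(scores, groups)
```

ICC(2) always exceeds ICC(1) here because averaging several raters per clinic cancels rater-level noise, which is the same reason the study's ICC(2) range (0.34-0.67) sits above its ICC(1) range (0.05-0.16).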
Collapse
Affiliation(s)
- Timothy J. Walker
- Center for Health Promotion and Prevention Research, Department of Health Promotion & Behavioral Sciences, University of Texas Health Science Center at Houston School of Public Health, 7000 Fannin St., Houston, TX 77030 USA
| | - Serena A. Rodriguez
- Department of Population and Data Sciences, University of Texas Southwestern Medical Center, 5323 Harry Hines Blvd., Dallas, TX 75390 USA
| | - Sally W. Vernon
- Center for Health Promotion and Prevention Research, Department of Health Promotion & Behavioral Sciences, University of Texas Health Science Center at Houston School of Public Health, 7000 Fannin St., Houston, TX 77030 USA
| | - Lara S. Savas
- Center for Health Promotion and Prevention Research, Department of Health Promotion & Behavioral Sciences, University of Texas Health Science Center at Houston School of Public Health, 7000 Fannin St., Houston, TX 77030 USA
| | - Erica L. Frost
- Center for Health Promotion and Prevention Research, Department of Health Promotion & Behavioral Sciences, University of Texas Health Science Center at Houston School of Public Health, 7000 Fannin St., Houston, TX 77030 USA
| | - Maria E. Fernandez
- Center for Health Promotion and Prevention Research, Department of Health Promotion & Behavioral Sciences, University of Texas Health Science Center at Houston School of Public Health, 7000 Fannin St., Houston, TX 77030 USA
| |
Collapse
|
34
|
Woodward EN, Matthieu MM, Uchendu US, Rogal S, Kirchner JE. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci 2019; 14:26. [PMID: 30866982 PMCID: PMC6417278 DOI: 10.1186/s13012-019-0861-y] [Citation(s) in RCA: 204] [Impact Index Per Article: 40.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2018] [Accepted: 01/25/2019] [Indexed: 12/19/2022] Open
Abstract
BACKGROUND Researchers could benefit from methodological advances that support uptake of new treatments while also reducing healthcare disparities. A comprehensive determinants framework for healthcare disparity implementation challenges is essential to accurately understand an implementation problem and select implementation strategies. METHODS We integrated and modified two conceptual frameworks, one from implementation science and one from healthcare disparities research, to develop the Health Equity Implementation Framework. We applied the Health Equity Implementation Framework to a historical healthcare disparity challenge: hepatitis C virus (HCV) and its treatment among Black patients seeking care in the US Department of Veterans Affairs (VA). A specific implementation assessment at the patient level was needed to understand any barriers to increasing uptake of HCV treatment, independent of cost. We conducted a preliminary study to assess how feasible it was for researchers to use the Health Equity Implementation Framework. We applied the framework to design the qualitative interview guide and interpret results. Using quantitative data to screen potential participants, this preliminary study consisted of semi-structured interviews with a purposively selected sample of Black, rural-dwelling, older adult VA patients (N = 12), living with HCV, from VA medical clinics in the Southern part of the USA. RESULTS The Health Equity Implementation Framework was feasible for implementation researchers. Barriers and facilitators were identified at all levels, including the patient, provider (recipients), patient-provider interaction (clinical encounter), characteristics of treatment (innovation), and healthcare system (inner and outer context). Some barriers reflected general implementation issues (e.g., poor care coordination after testing positive for HCV).
Other barriers were related to healthcare disparities and likely unique to racial minority patients (e.g., testimonials from Black peers about racial discrimination at VA). We identified several facilitators, including patient enthusiasm to obtain treatment because of its high cure rates, and VA clinics that offset HCV stigma by protecting patient confidentiality. CONCLUSION The Health Equity Implementation Framework showcases one way to modify an implementation framework to better assess health equity determinants as well. Researchers may be able to optimize the scientific yield of research inquiries by identifying and addressing factors that promote or impede implementation of novel treatments in addition to eliminating healthcare disparities.
Collapse
Affiliation(s)
- Eva N. Woodward
- Center for Mental Healthcare & Outcomes Research, Central Arkansas Veterans Healthcare System, U.S. Department of Veterans Affairs, 2200 Fort Roots Drive, 152 NLR, North Little Rock, AR 72114 USA
- Department of Psychiatry, University of Arkansas for Medical Sciences, Little Rock, AR USA
| | - Monica M. Matthieu
- Center for Mental Healthcare & Outcomes Research, Central Arkansas Veterans Healthcare System, U.S. Department of Veterans Affairs, 2200 Fort Roots Drive, 152 NLR, North Little Rock, AR 72114 USA
- College for Public Health and Social Justice, School of Social Work, Saint Louis University, St. Louis, MO USA
| | | | - Shari Rogal
- VA Pittsburgh Healthcare System, Center for Health Equity Research and Promotion, Pittsburgh, PA USA
- Department of Surgery, University of Pittsburgh, Pittsburgh, PA USA
- Division of Gastroenterology, Hepatology, and Nutrition, University of Pittsburgh, Pittsburgh, PA USA
| | - JoAnn E. Kirchner
- Center for Mental Healthcare & Outcomes Research, Central Arkansas Veterans Healthcare System, U.S. Department of Veterans Affairs, 2200 Fort Roots Drive, 152 NLR, North Little Rock, AR 72114 USA
- Department of Psychiatry, University of Arkansas for Medical Sciences, Little Rock, AR USA
- VA Team-Based Behavioral Health QUERI, U.S. Department of Veterans Affairs, North Little Rock, AR USA
| |
Collapse
|
35
|
Conway A, Dowling M, Devane D. Implementing an initiative to promote evidence-informed practice: part 2-healthcare professionals' perspectives of the evidence rounds programme. BMC MEDICAL EDUCATION 2019; 19:75. [PMID: 30841872 PMCID: PMC6402168 DOI: 10.1186/s12909-019-1488-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/18/2018] [Accepted: 02/08/2019] [Indexed: 06/09/2023]
Abstract
BACKGROUND The translation of research into clinical practice is a key component of evidence-informed decision making. We implemented a multi-component dissemination and implementation strategy for healthcare professionals (HCPs) called Evidence Rounds. We report the findings of focus groups and interviews with HCPs to explore their perceptions of Evidence Rounds and help inform the implementation of future similar initiatives. This is the second paper in a two-part series. METHODS We employed total population, purposive sampling by targeting all of the health care professionals who attended or presented at group sessions exploring the evidence on clinical questions or topics chosen and presented by the HCPs. We conducted and audio-recorded in-person focus groups and one-to-one interviews, which were then transcribed verbatim. Two authors independently coded transcripts. NVivo software was used to collate the primary data and codes. We analysed data guided by the five steps involved in framework analysis: 1) familiarization, 2) identifying a thematic framework, 3) indexing, 4) charting, and 5) mapping and interpretation. RESULTS Thirteen HCPs participated, of which 6 were medical doctors and 7 were nursing or midwifery staff. We identified the following key domains: organisational readiness for change, barriers and facilitators to attendance, barriers and facilitators to presenting, communication and dissemination of information, and sustainability. During focus groups and interviews HCPs reported that Evidence Rounds had a positive impact on their continuing education and clinical practice. They also provided insights into how future initiatives could be optimised to support and enable them to narrow the gap between research evidence and practice. CONCLUSIONS Individual, departmental and organisational level contextual factors can play a major role in implementation within complex health services.
HCPs highlighted how implementation of evidence could be increased when combined with clinical guideline development. Further research after a longer period of implementation could investigate how initiatives might be optimised to promote the uptake of evidence, improve implementation and expedite behaviour change.
Collapse
Affiliation(s)
- Aislinn Conway
- Health Research Board Trials Methodology Research Network, School of Nursing and Midwifery, National University of Ireland Galway, Galway, Ireland
| | - Maura Dowling
- School of Nursing and Midwifery, National University of Ireland Galway, Galway, Ireland
| | - Declan Devane
- Health Research Board Trials Methodology Research Network, School of Nursing and Midwifery, National University of Ireland Galway, Galway, Ireland
| |
Collapse
|
36
|
Huebschmann AG, Leavitt IM, Glasgow RE. Making Health Research Matter: A Call to Increase Attention to External Validity. Annu Rev Public Health 2019; 40:45-63. [PMID: 30664836 DOI: 10.1146/annurev-publhealth-040218-043945] [Citation(s) in RCA: 36] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Most of the clinical research conducted with the goal of improving health is not generalizable to nonresearch settings. In addition, scientists often fail to replicate each other's findings due, in part, to lack of attention to contextual factors accounting for their relative effectiveness or failure. To address these problems, we review the literature on assessment of external validity and summarize approaches to designing for generalizability. When investigators conduct systematic reviews, a critical need is often unmet: to evaluate the pragmatism and context of interventions, as well as their effectiveness. Researchers, editors, and grant reviewers can implement key changes in how they consider and report on external validity issues. For example, the recently published expanded CONSORT figure may aid scientists and potential program adopters in summarizing participation in and representativeness of a program across different settings, staff, and patients. Greater attention to external validity is needed to increase reporting transparency, improve program dissemination, and reduce failures to replicate research.
Collapse
Affiliation(s)
- Amy G Huebschmann
- Division of General Internal Medicine, Center for Women's Health Research, School of Medicine, University of Colorado, Aurora, Colorado 80045, USA; Dissemination and Implementation Science Program of Adult and Child Consortium for Outcomes Research and Delivery Science (ACCORDS), School of Medicine, University of Colorado, Aurora, Colorado 80045, USA
| | - Ian M Leavitt
- Department of Social and Behavioral Sciences, T.H. Chan School of Public Health, Harvard University, Boston, Massachusetts 02115, USA;
| | - Russell E Glasgow
- Dissemination and Implementation Science Program of Adult and Child Consortium for Outcomes Research and Delivery Science (ACCORDS), School of Medicine, University of Colorado, Aurora, Colorado 80045, USA; Department of Family Medicine, School of Medicine, University of Colorado, Aurora, Colorado 80045, USA;
| |
Collapse
|
37
|
Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci 2019; 14:1. [PMID: 30611302 PMCID: PMC6321673 DOI: 10.1186/s13012-018-0842-6] [Citation(s) in RCA: 515] [Impact Index Per Article: 103.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2018] [Accepted: 11/26/2018] [Indexed: 11/29/2022] Open
Abstract
Background Effective implementation of evidence-based practices (EBPs) remains a significant challenge. Numerous existing models and frameworks identify key factors and processes to facilitate implementation. However, there is a need to better understand how individual models and frameworks are applied in research projects, how they can support the implementation process, and how they might advance implementation science. This systematic review examines and describes the research application of a widely used implementation framework, the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Methods A systematic literature review was performed to identify and evaluate the use of the EPIS framework in implementation efforts. Citation searches in PubMed, Scopus, PsycINFO, ERIC, Web of Science, Social Sciences Index, and Google Scholar databases were undertaken. Data extraction included the objective, language, country, setting, sector, EBP, study design, methodology, level(s) of data collection, unit(s) of analysis, use of EPIS (i.e., purpose), implementation factors and processes, EPIS stages, implementation strategy, implementation outcomes, and overall depth of EPIS use (rated on a 1–5 scale). Results In total, 762 full-text articles were screened by four reviewers, resulting in inclusion of 67 articles, representing 49 unique research projects. All included projects were conducted in public sector settings. The majority of projects (73%) investigated the implementation of a specific EBP. The majority of projects (90%) examined inner context factors, 57% examined outer context factors, 37% examined innovation factors, and 31% examined bridging factors (i.e., factors that cross or link the outer system and inner organizational context). On average, projects measured EPIS factors across two of the EPIS phases (M = 2.02), with the most frequent phase being Implementation (73%). On average, the overall depth of EPIS inclusion was moderate (2.8 out of 5).
Conclusion This systematic review enumerated multiple settings and ways the EPIS framework has been applied in implementation research projects, and summarized promising characteristics and strengths of the framework, illustrated with examples. Recommendations for future use include more precise operationalization of factors, increased depth and breadth of application, development of aligned measures, and broadening of user networks. Additional resources supporting the operationalization of EPIS are available. Electronic supplementary material The online version of this article (10.1186/s13012-018-0842-6) contains supplementary material, which is available to authorized users.
Collapse
Affiliation(s)
- Joanna C Moullin
- Faculty of Health Sciences, School of Pharmacy and Biomedical Sciences, Curtin University, Kent Street, Bentley, Perth, 6102, Western Australia; Child and Adolescent Services Research Center, 3665 Kearny Villa Rd., Suite 200N, San Diego, CA, 92123, USA
| | - Kelsey S Dickson
- Child and Adolescent Services Research Center, 3665 Kearny Villa Rd., Suite 200N, San Diego, CA, 92123, USA; Department of Child and Family Development, San Diego State University, 5500 Campanile Drive, San Diego, CA, 92182, USA
| | - Nicole A Stadnick
- Child and Adolescent Services Research Center, 3665 Kearny Villa Rd., Suite 200N, San Diego, CA, 92123, USA; Department of Psychiatry, University of California San Diego, 9500 Gilman Drive (0812), La Jolla, San Diego, CA, 92093-0812, USA
| | - Borsika Rabin
- Department of Family Medicine and Public Health, University of California San Diego, 9500 Gilman Drive (0725), La Jolla, San Diego, CA, 92093-0812, USA
| | - Gregory A Aarons
- Child and Adolescent Services Research Center, 3665 Kearny Villa Rd., Suite 200N, San Diego, CA, 92123, USA; Department of Psychiatry, University of California San Diego, 9500 Gilman Drive (0812), La Jolla, San Diego, CA, 92093-0812, USA.
| |
Collapse
|
38
|
Willmeroth T, Wesselborg B, Kuske S. Implementation Outcomes and Indicators as a New Challenge in Health Services Research: A Systematic Scoping Review. Inquiry 2019; 56:46958019861257. [PMID: 31347418 PMCID: PMC6661793 DOI: 10.1177/0046958019861257] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/13/2019] [Revised: 05/19/2019] [Accepted: 06/12/2019] [Indexed: 11/25/2022]
Abstract
The aim of this systematic scoping review was to identify and analyze indicators that address implementation quality or success in health care services and to deduce recommendations for further indicator development. This review was conducted according to the Joanna Briggs Manual and the PRISMA Statement. CINAHL, EMBASE, MEDLINE, and PsycINFO were searched. Studies or reviews published between August 2008 and 2018 that reported monitoring of the quality or the implementation success in health care services by using indicators based on continuous variables and proportion-based, ratio-based, standardized ratio-based, or rate-based variables or indices were included. The records were screened by title and abstract, and the full-text articles were also independently double-screened by 3 reviewers for eligibility. In total, 4376 records were identified that resulted in 10 eligible studies, including 67 implementation indicators. There was heterogeneity regarding the theoretical backgrounds, designs, objectives, settings, and implementation indicators among the publications. None of the indicators addressed the implementation outcomes of appropriateness or sustainability. Service implementation efficiency was identified as an additional outcome. Achieving consensus in framing implementation outcomes and indicators will be a new challenge in health services research. Considering the new debates regarding health care complexity, the further development of indicators based on complementary qualitative and quantitative approaches is needed.
Collapse
|
39
|
Estabrooks PA, Brownson RC, Pronk NP. Dissemination and Implementation Science for Public Health Professionals: An Overview and Call to Action. Prev Chronic Dis 2018; 15:E162. [PMID: 30576272 PMCID: PMC6307829 DOI: 10.5888/pcd15.180525] [Citation(s) in RCA: 80] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
Affiliation(s)
- Paul A Estabrooks
- Department of Health Promotion, College of Public Health, University of Nebraska Medical Center, 984365 Nebraska Medical Center, Omaha, NE 68198.
| | - Ross C Brownson
- Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis, St. Louis, Missouri; Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, Missouri
| | - Nicolaas P Pronk
- HealthPartners Institute, Bloomington, Minnesota; Harvard T.H. Chan School of Public Health, Department of Social and Behavioral Sciences, Boston, Massachusetts
| |
Collapse
|
40
|
Stover AM, Basch EM. Implementation of Symptom Questionnaires Into Oncology Workflow. J Oncol Pract 2018; 12:859-862. [PMID: 27601508 DOI: 10.1200/jop.2016.015610] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022] Open
Affiliation(s)
- Angela M Stover
- University of North Carolina at Chapel Hill, Chapel Hill, NC
| | - Ethan M Basch
- University of North Carolina at Chapel Hill, Chapel Hill, NC
| |
Collapse
|
41
|
Lane-Fall MB, Cobb BT, Cené CW, Beidas RS. Implementation Science in Perioperative Care. Anesthesiol Clin 2018; 36:1-15. [PMID: 29425593 DOI: 10.1016/j.anclin.2017.10.004] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/16/2022]
Abstract
There is a 17-year gap between the initial publication of scientific evidence and its uptake into widespread practice in health care. The field of implementation science (IS) emerged in the 1990s as an answer to this "evidence-to-practice gap." In this article, we present an overview of implementation science, focusing on the application of IS principles to perioperative care. We describe opportunities for additional training and discuss strategies for funding and publishing IS work. The objective is to demonstrate how IS can improve perioperative patient care, while highlighting perioperative IS studies and identifying areas in need of additional investigation.
Collapse
Affiliation(s)
- Meghan B Lane-Fall
- Penn Center for Perioperative Outcomes Research and Transformation, Perelman School of Medicine, University of Pennsylvania, 423 Guardian Drive, 333 Blockley Hall, Philadelphia, PA 19104, USA; Leonard Davis Institute of Health Economics, University of Pennsylvania, Colonial Penn Center, 3641 Locust Walk Philadelphia, PA 19104-6218; Department of Anesthesiology and Critical Care, Perelman School of Medicine, University of Pennsylvania, 3400 Spruce Street, 680 Dulles (Anesthesia), Philadelphia, PA 19104, USA.
| | - Benjamin T Cobb
- Department of Anesthesiology and Critical Care, Perelman School of Medicine, University of Pennsylvania, 3400 Spruce Street, 680 Dulles (Anesthesia), Philadelphia, PA 19104, USA; National Clinician Scholar Program, University of Pennsylvania, 423 Guardian Drive, 1310 Blockley Hall, Philadelphia, PA 19104, USA
| | - Crystal Wiley Cené
- Division of General Internal Medicine, School of Medicine, University of North Carolina at Chapel Hill, 101 Manning Drive #1050, Chapel Hill, NC 27514, USA
| | - Rinad S Beidas
- Department of Psychiatry, University of Pennsylvania, 3535 Market Street, Suite 3015, Philadelphia, PA 19104, USA
| |
Collapse
|
42
|
[Systematic translation and cross-validation of defined implementation outcomes in health care services]. Z Evid Fortbild Qual Gesundhwes 2018; 135-136:72-80. [PMID: 30057171 DOI: 10.1016/j.zefq.2018.06.005] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Revised: 06/03/2018] [Accepted: 06/22/2018] [Indexed: 11/20/2022]
Abstract
OBJECTIVE To validate a German translation of construct-validated implementation outcomes of Proctor et al. (2011). METHODS A systematic translation process and a cross-validation based on Beaton et al. (2000) were performed. RESULTS Semantic challenges arose regarding the definitions of "adoption" and "fidelity". Consistent formulation was established. CONCLUSION The validated definitions are a starting point for developing a comprehensive concept to measure implementation effectiveness and efficacy of interventions in health services research.
Collapse
|
43
|
Lewis CC, Mettert KD, Dorsey CN, Martinez RG, Weiner BJ, Nolen E, Stanick C, Halko H, Powell BJ. An updated protocol for a systematic review of implementation-related measures. Syst Rev 2018; 7:66. [PMID: 29695295 PMCID: PMC5918558 DOI: 10.1186/s13643-018-0728-3] [Citation(s) in RCA: 47] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/20/2017] [Accepted: 04/11/2018] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Implementation science is the study of strategies used to integrate evidence-based practices into real-world settings (Eccles and Mittman, Implement Sci. 1(1):1, 2006). Central to the identification of replicable, feasible, and effective implementation strategies is the ability to assess the impact of contextual constructs and intervention characteristics that may influence implementation, but several measurement issues make this work quite difficult. For instance, it is unclear which constructs have no measures and which measures have any evidence of psychometric properties like reliability and validity. As part of a larger set of studies to advance implementation science measurement (Lewis et al., Implement Sci. 10:102, 2015), we will complete systematic reviews of measures that map onto the Consolidated Framework for Implementation Research (Damschroder et al., Implement Sci. 4:50, 2009) and the Implementation Outcomes Framework (Proctor et al., Adm Policy Ment Health. 38(2):65-76, 2011), the protocol for which is described in this manuscript. METHODS Our primary databases will be PubMed and Embase. Our search strings will comprise five levels: (1) the outcome or construct term; (2) terms for measure; (3) terms for evidence-based practice; (4) terms for implementation; and (5) terms for mental health. Two trained research specialists will independently review all titles and abstracts followed by full-text review for inclusion. The research specialists will then conduct measure-forward searches using the "cited by" function to identify all published empirical studies using each measure. The measure and associated publications will be compiled in a packet for data extraction. Data relevant to our Psychometric and Pragmatic Evidence Rating Scale (PAPERS) will be independently extracted and then rated using a worst score counts methodology reflecting "poor" to "excellent" evidence.
DISCUSSION We will build a centralized, accessible, searchable repository through which researchers, practitioners, and other stakeholders can identify psychometrically and pragmatically strong measures of implementation contexts, processes, and outcomes. By facilitating the employment of psychometrically and pragmatically strong measures identified through this systematic review, the repository would enhance the cumulativeness, reproducibility, and applicability of research findings in the rapidly growing field of implementation science.
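The five-level search-string structure described in the Methods (OR-combining terms within each level, then AND-combining the levels) can be sketched programmatically. This is a minimal illustration only; the term lists below are hypothetical placeholders, not the protocol's actual search terms.

```python
def build_search_string(levels):
    """OR the terms within each level, then AND the levels together,
    mirroring the five-level structure described in the protocol."""
    clauses = ["(" + " OR ".join(f'"{t}"' for t in terms) + ")" for terms in levels]
    return " AND ".join(clauses)

# Hypothetical placeholder terms for the five levels
levels = [
    ["acceptability"],                    # (1) outcome or construct term
    ["measure", "scale", "instrument"],   # (2) terms for measure
    ["evidence-based practice"],          # (3) terms for evidence-based practice
    ["implementation"],                   # (4) terms for implementation
    ["mental health"],                    # (5) terms for mental health
]
query = build_search_string(levels)
```

Because the levels are AND-combined, each level narrows the result set: a record must match at least one term from every level to be retrieved.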
Collapse
Affiliation(s)
- Cara C. Lewis
- Kaiser Permanente Washington Health Research Institute, MacColl Center for Health Care Innovation, 1730 Minor Avenue, Suite 1600, Seattle, WA 98101 USA
- Department of Psychological and Brain Sciences, Indiana University, 1101 E 10th Street, Bloomington, IN 47405 USA
- Department of Psychiatry and Behavioral Sciences, Harborview Medical Center, University of Washington, 325 9th Ave, Box 354946, Seattle, WA 98104 USA
| | - Kayne D. Mettert
- Kaiser Permanente Washington Health Research Institute, MacColl Center for Health Care Innovation, 1730 Minor Avenue, Suite 1600, Seattle, WA 98101 USA
| | - Caitlin N. Dorsey
- Kaiser Permanente Washington Health Research Institute, MacColl Center for Health Care Innovation, 1730 Minor Avenue, Suite 1600, Seattle, WA 98101 USA
| | - Ruben G. Martinez
- Psychology Department, Virginia Commonwealth University, 806 W. Franklin St, Box 842018, Richmond, VA 23284 USA
| | - Bryan J. Weiner
- Department of Global Health, University of Washington, 1510 San Juan Road, Box 357965, Seattle, WA 98195 USA
| | - Elspeth Nolen
- Department of Global Health, University of Washington, 1510 San Juan Road, Box 357965, Seattle, WA 98195 USA
| | - Cameo Stanick
- Hathaway-Sycamores Child and Family Services, 210 S DeLacey Ave, Suite 110, Pasadena, CA 91105-2074 USA
| | - Heather Halko
- Department of Psychology, University of Montana, 32 Campus Drive, Missoula, MT 59812 USA
| | - Byron J. Powell
- Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, 135 Dauer Drive, Chapel Hill, NC 27599 USA
| |
Collapse
|
44
|
Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, Frank JW, Glasgow RE. Systematic, Multimethod Assessment of Adaptations Across Four Diverse Health Systems Interventions. Front Public Health 2018; 6:102. [PMID: 29686983 PMCID: PMC5900443 DOI: 10.3389/fpubh.2018.00102] [Citation(s) in RCA: 81] [Impact Index Per Article: 13.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2018] [Accepted: 03/23/2018] [Indexed: 11/25/2022] Open
Abstract
Background Many health outcomes and implementation science studies have demonstrated the importance of tailoring evidence-based care interventions to local context to improve fit. By adapting to local culture, history, resources, characteristics, and priorities, interventions are more likely to lead to improved outcomes. However, it is unclear how best to adapt evidence-based programs and promising innovations. There are few guides or examples of how to best categorize or assess health-care adaptations, and even fewer that are brief and practical for use by non-researchers. Materials and methods This study describes the importance and potential of assessing adaptations before, during, and after the implementation of health systems interventions. We present a promising multilevel and multimethod approach developed and being applied across four different health systems interventions. Finally, we discuss implications and opportunities for future research. Results The four case studies are diverse in the conditions addressed, interventions, and implementation strategies. They include two nurse coordinator-based transition of care interventions, a data and training-driven multimodal pain management project, and a cardiovascular patient-reported outcomes project, all of which are using audit and feedback. We used the same modified adaptation framework to document changes made to the interventions and implementation strategies. To create the modified framework, we started with the adaptation and modification model developed by Stirman and colleagues and expanded it by adding concepts from the RE-AIM framework. Our assessments address the intuitive domains of Who, How, When, What, and Why to classify and organize adaptations. For each case study, we discuss how the modified framework was operationalized, the multiple methods used to collect data, results to date and approaches utilized for data analysis. 
These methods include a real-time tracking system and structured interviews at key times during the intervention. We provide descriptive data on the types and categories of adaptations made and discuss lessons learned. Conclusion The multimethod approaches demonstrate utility across diverse health systems interventions. The modified adaptations model adequately captures adaptations across the various projects and content areas. We recommend systematic documentation of adaptations in future clinical and public health research and have made our assessment materials publicly available.
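The Who, How, When, What, and Why domains used above to classify adaptations can be pictured as a simple record type with a tally over any one domain. This is a hypothetical sketch of such a tracking structure, not the authors' actual instrument, and the example field values are invented.

```python
from dataclasses import dataclass

@dataclass
class AdaptationRecord:
    """One logged adaptation, classified by the five intuitive domains."""
    who: str   # who made or requested the adaptation
    how: str   # how the intervention or strategy was modified
    when: str  # when in the project the change occurred
    what: str  # what component was changed
    why: str   # rationale for the change

def tally(records, domain):
    """Count adaptations by one domain, e.g. how many occurred per phase."""
    counts = {}
    for record in records:
        key = getattr(record, domain)
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Summarizing the log by domain (for example, `tally(records, "when")`) yields the kind of descriptive data on types and categories of adaptations reported in the case studies.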
Collapse
Affiliation(s)
- Borsika A Rabin
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Family Medicine and Public Health, School of Medicine, University of California San Diego, La Jolla, CA, United States; Adult and Child Consortium for Health Outcomes Research and Delivery Science, School of Medicine, University of Colorado, Aurora, CO, United States; Department of Family Medicine, School of Medicine, University of Colorado, Aurora, CO, United States
| | - Marina McCreight
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States
| | - Catherine Battaglia
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Health System Management and Policy, Colorado School of Public Health, University of Colorado, Aurora, CO, United States
| | - Roman Ayele
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Health System Management and Policy, Colorado School of Public Health, University of Colorado, Aurora, CO, United States
| | - Robert E Burke
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Medicine, School of Medicine, University of Colorado, Aurora, CO, United States
| | - Paul L Hess
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Medicine, School of Medicine, University of Colorado, Aurora, CO, United States
| | - Joseph W Frank
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Department of Medicine, School of Medicine, University of Colorado, Aurora, CO, United States
| | - Russell E Glasgow
- Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care (COIN), Denver VHA Medical Center, Denver, CO, United States; Adult and Child Consortium for Health Outcomes Research and Delivery Science, School of Medicine, University of Colorado, Aurora, CO, United States; Department of Family Medicine, School of Medicine, University of Colorado, Aurora, CO, United States
| |
Collapse
|
45
|
Berman M, Bozsik F, Shook RP, Meissen-Sebelius E, Markenson D, Summar S, DeWit E, Carlson JA. Evaluation of the Healthy Lifestyles Initiative for Improving Community Capacity for Childhood Obesity Prevention. Prev Chronic Dis 2018; 15:E24. [PMID: 29470168 PMCID: PMC5833312 DOI: 10.5888/pcd15.170306] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
Abstract
PURPOSE AND OBJECTIVES Policy, systems, and environmental approaches are recommended for preventing childhood obesity. The objective of our study was to evaluate the Healthy Lifestyles Initiative, which aimed to strengthen community capacity for policy, systems, and environmental approaches to healthy eating and active living among children and families. INTERVENTION APPROACH The Healthy Lifestyles Initiative was developed through a collaborative process and facilitated by community organizers at a local children's hospital. The initiative supported 218 partners from 170 community organizations through training, action planning, coalition support, one-on-one support, and the dissemination of materials and sharing of resources. EVALUATION METHODS Eighty initiative partners completed a brief online survey on implementation strategies engaged in, materials used, and policy, systems, and environmental activities implemented. In accordance with frameworks for implementation science, we assessed associations among the constructs by using linear regression to identify whether and which of the implementation strategies were associated with materials used and implementation of policy, systems, and environmental activities targeted by the initiative. RESULTS Each implementation strategy was engaged in by 30% to 35% of the 80 survey respondents. The most frequently used materials were educational handouts (76.3%) and posters (66.3%). The most frequently implemented activities were developing or continuing partnerships (57.5%) and reviewing organizational wellness policies (46.3%). Completing an action plan and the number of implementation strategies engaged in were positively associated with implementation of targeted activities (action plan, effect size = 0.82; number of strategies, effect size = 0.51) and materials use (action plan, effect size = 0.59; number of strategies, effect size = 0.52). 
Materials use was positively associated with implementation of targeted activities (effect size = 0.35). IMPLICATIONS FOR PUBLIC HEALTH Community-capacity-building efforts can be effective in supporting community organizations to engage in policy, systems, and environmental activities for healthy eating and active living. Multiple implementation strategies are likely needed, particularly strategies that involve a high level of engagement, such as training community organizations and working with them on structured action plans.
Collapse
Affiliation(s)
- Marcie Berman
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Frances Bozsik
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Robin P Shook
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Emily Meissen-Sebelius
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Deborah Markenson
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Shelly Summar
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Emily DeWit
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, Missouri
| | - Jordan A Carlson
- Center for Children's Healthy Lifestyles and Nutrition, Children's Mercy Kansas City, 610 E 22nd St, Kansas City, MO 64108.
| |
Collapse
|
46
|
Shelton RC, Cooper BR, Stirman SW. The Sustainability of Evidence-Based Interventions and Practices in Public Health and Health Care. Annu Rev Public Health 2018; 39:55-76. [PMID: 29328872 DOI: 10.1146/annurev-publhealth-040617-014731] [Citation(s) in RCA: 358] [Impact Index Per Article: 59.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
There is strong interest in implementation science to address the gap between research and practice in public health. Research on the sustainability of evidence-based interventions has been growing rapidly. Sustainability has been defined as the continued use of program components at sufficient intensity for the sustained achievement of desirable program goals and population outcomes. This understudied area has been identified as one of the most significant translational research problems. Adding to this challenge is uncertainty regarding the extent to which intervention adaptation and evolution are necessary to address the needs of populations that differ from those in which interventions were originally tested or implemented. This review critically examines and discusses conceptual and methodological issues in studying sustainability, summarizes the multilevel factors that have been found to influence the sustainability of interventions in a range of public health and health care settings, and highlights key areas for future research.
Collapse
Affiliation(s)
- Rachel C Shelton
- Department of Sociomedical Sciences, Mailman School of Public Health, Columbia University, New York, NY 10032, USA;
| | - Brittany Rhoades Cooper
- Department of Human Development, Washington State University, Pullman, Washington 99164, USA;
| | - Shannon Wiltsey Stirman
- Dissemination and Training Division, National Center for PTSD and Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, California 94024, USA;
| |
Collapse
|
47
|
Chambers DA. Commentary: Increasing the Connectivity Between Implementation Science and Public Health: Advancing Methodology, Evidence Integration, and Sustainability. Annu Rev Public Health 2017; 39:1-4. [PMID: 29272164 DOI: 10.1146/annurev-publhealth-110717-045850] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Gaps remain between the outcomes of biomedical research and their application within clinical and community settings. The field of implementation science, also referred to as dissemination and implementation research, is intended to improve the adoption, uptake, and sustainability of evidence-based health interventions. The articles in this volume's symposium on implementation science and public health identify important directions in the effort to maximize the impact of research on public and population health. Leading researchers present reviews of the use of quasi-experimental designs in implementation science, the movement toward enhancing evidence-based public health, and intervention sustainability. Each article presents lessons learned from prior research and recommendations for the next generation of studies. Collectively, the symposium offers a road map for future implementation science that seeks to optimize public health.
Collapse
Affiliation(s)
- David A Chambers
- Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, Maryland 20850, USA;
| |
Collapse
|
48
|
King AA, Baumann AA. Sickle cell disease and implementation science: A partnership to accelerate advances. Pediatr Blood Cancer 2017; 64:10.1002/pbc.26649. [PMID: 28556441 PMCID: PMC6026013 DOI: 10.1002/pbc.26649] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/17/2017] [Revised: 04/05/2017] [Accepted: 04/24/2017] [Indexed: 12/11/2022]
Abstract
Sickle cell disease (SCD) results in end organ damage and a shortened lifespan. Both the pathophysiology of the disease and the social determinants of health affect patient outcomes. Randomized controlled trials have been completed among this population and resulted in medical advances; however, the gestation of these advances and the lack of penetrance into clinical practice have limited advancements in clinical improvements for many people with SCD. We discuss the role of implementation science in SCD and highlight the need for this science to shorten the length of time to implement evidence-based care for more people with SCD.
Collapse
Affiliation(s)
- Allison A. King
- Program in Occupational Therapy, Washington University School of Medicine, St. Louis, Missouri
- Division of Pediatric Hematology/Oncology, Department of Pediatrics, Washington University School of Medicine, St. Louis, Missouri
- Division of Public Health Sciences, Department of Surgery, Washington University School of Medicine, St. Louis, Missouri
- Division of Hematology, Department of Medicine, Washington University School of Medicine, St. Louis, Missouri
- Ana A. Baumann
- Brown School, Washington University, St. Louis, Missouri
49
Carpenter CR, Pinnock H. StaRI Aims to Overcome Knowledge Translation Inertia: The Standards for Reporting Implementation Studies (StaRI) Guidelines. Acad Emerg Med 2017; 24:1027-1029. [PMID: 28574631 DOI: 10.1111/acem.13235] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/01/2022]
Affiliation(s)
- Christopher R. Carpenter
- Division of Emergency Medicine and Emergency Care Research Core, Washington University in St. Louis School of Medicine, St. Louis, MO
- Hilary Pinnock
- University of Edinburgh Medical School, Edinburgh, Scotland, UK
50
Implementation Science: A Neglected Opportunity to Accelerate Improvements in the Safety and Quality of Surgical Care. Ann Surg 2017; 265:1104-1112. [PMID: 27735828 DOI: 10.1097/sla.0000000000002013] [Citation(s) in RCA: 47] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
Abstract
OBJECTIVE: The aim of this review was to emphasize the importance of implementation science in understanding why efforts to integrate evidence-based interventions into surgical practice frequently fail to replicate the improvements reported in early research studies.
SUMMARY OF BACKGROUND DATA: Over the past 2 decades, numerous patient safety initiatives have been developed to improve the quality and safety of surgical care. The surgical community is now faced with translating "promising" initiatives from the research environment into clinical practice. The World Health Organization (WHO) has described this task as one of the greatest challenges facing the global health community and has identified the importance of implementation science in scaling up evidence-based interventions.
METHODS: Using the WHO surgical safety checklist, a prominent example of a rapidly and widely implemented surgical safety intervention of the past decade, a review of literature spanning surgery and implementation science was conducted to identify and describe a broad range of factors affecting implementation success, including contextual factors, implementation strategies, and implementation outcomes.
RESULTS: Our current approach to conceptualizing and measuring the "effectiveness" of interventions has led to the neglect of factors critical to the successful implementation of surgical safety interventions.
CONCLUSION: Improvements in the safety and quality of surgical care can be accelerated by drawing more heavily upon implementation science. Until this rapidly evolving field becomes more firmly embedded in surgical research and implementation efforts, our understanding of why interventions such as the checklist "work" in some settings and appear "not to work" in others will remain limited.