1. Navas C, Minton AP, Rodriguez-Leboeuf AM. The Role of Patient-Reported Outcomes to Measure Treatment Satisfaction in Drug Development. Patient 2024. [PMID: 38976224] [DOI: 10.1007/s40271-024-00702-w]
Abstract
Treatment satisfaction is a person's rating of his or her treatment experience, including processes and outcomes. It is directly related to treatment adherence, which may be predictive of treatment effectiveness in clinical and real-world research. Consequently, patient-reported outcome (PRO) instruments have been developed to incorporate patient experience throughout various stages of drug development and routine care. PRO instruments enable clinicians and researchers to evaluate and compare treatment satisfaction data in different clinical settings. It is important to select fit-for-purpose PRO instruments that have demonstrated adequate levels of reliability, validity, and sensitivity to change to support their use. Some of these instruments are unidimensional while others are multidimensional; some are generic and can be applied across different therapeutic areas, while others have been developed for use in a specific treatment modality or condition. This article describes the role of treatment satisfaction in drug development as well as in regulatory and Health Technology Assessment (HTA) decision making, and calls for more widespread use of carefully selected treatment satisfaction PRO instruments in early- and late-phase drug development.
2. Rojas-Andrade R, Agudelo-Hernández F. Validation of an instrument to guide the implementation of strategies for mental health care in Colombia. Rev Panam Salud Publica 2024; 48:e10. [PMID: 38410358] [PMCID: PMC10896121] [DOI: 10.26633/rpsp.2024.10]
Abstract
OBJECTIVES To validate the implementation drivers scale among first-level mental health care professionals in Colombia. The scale is designed as a tool to guide the implementation of strategies that effectively reduce gaps in mental health care. METHODS We adopted the Active Implementation Framework, a widely used model for measuring implementation. The participants included 380 individuals (55.56% men): 349 health personnel trained in the Mental Health Gap Action Programme (mhGAP) and 31 territorial personnel in charge of planning mental health strategies at the territorial level in Colombia. To assess the critical dimensions of mhGAP implementation, we developed an 18-item scale based on the Active Implementation Framework. We conducted content validity assessments and exploratory factor analysis to evaluate the scale, using the Organizational Readiness for Knowledge Translation scale as a comparative standard. RESULTS The implementation drivers scale identified four dimensions: system enablers for implementation, accessibility of the strategy, adaptability and acceptability, and strategy training and supervision. These dimensions had Cronbach alpha values of 0.914, 0.868, 0.927, and 0.725, respectively, indicating high internal consistency. In addition, all dimensions demonstrated adequate correlation with the Organizational Readiness for Knowledge Translation scale. CONCLUSION The implementation drivers scale effectively determines the adaptability and implementation of various components of mental health programs, particularly those focusing on community-based approaches and primary care settings. As such, this scale can contribute to the more effective implementation of strategies outlined by global and local policy frameworks, thus improving mental health care.
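The Cronbach alpha values reported for each dimension summarize internal consistency, computed from the item variances and the variance of the respondents' total scores. A minimal sketch of that computation in plain Python (the function name and toy scores below are illustrative, not data from the study):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for one scale dimension.

    items: one list per item, each holding that item's score for every
    respondent (all lists the same length). Population variances are used.
    """
    k = len(items)
    item_var_sum = sum(pvariance(scores) for scores in items)
    # Each respondent's total score across the k items.
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - item_var_sum / pvariance(totals))

# Toy 3-item dimension rated by 5 respondents on a 1-5 scale.
dimension = [
    [4, 5, 3, 2, 4],
    [4, 4, 3, 2, 5],
    [5, 5, 3, 1, 4],
]
print(round(cronbach_alpha(dimension), 3))
```

An alpha near 1 means the items within a dimension vary together across respondents; values like the 0.725 reported for the fourth dimension above are more moderate but still conventionally acceptable.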
Affiliation(s)
- Rodrigo Rojas-Andrade
- University Santiago de Chile, Santiago, Chile.
3. Patel-Syed Z, Becker S, Olson M, Rinella H, Scott K. What do you think it means? Using cognitive interviewing to improve measurement in implementation science: description and case example. Implement Sci Commun 2024; 5:14. [PMID: 38355677] [PMCID: PMC10865651] [DOI: 10.1186/s43058-024-00549-0]
Abstract
Pragmatic measures are essential to evaluate the implementation of evidence-based interventions. Cognitive interviewing, a qualitative method that collects partner feedback throughout measure development, is particularly useful for developing pragmatic implementation measures. Measure developers can use cognitive interviewing to increase a measure's fit within a particular implementation context. However, cognitive interviewing is underused in implementation research, where most measures remain "homegrown" and are used for single studies. We provide a rationale for using cognitive interviewing in implementation science studies and illustrate its use through a case example in which cognitive interviewing informed the development of a measurement-based care protocol for implementation in opioid treatment programs. We also discuss applications of cognitive interviewing that can improve measurement in implementation science, including developing a common language with partners and collecting multi-level feedback on assessment procedures.
Affiliation(s)
- Zabin Patel-Syed
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Sara Becker
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Miranda Olson
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Hailey Rinella
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
- Kelli Scott
- Northwestern University Feinberg School of Medicine, Institute for Public Health and Medicine, Center for Dissemination and Implementation Science, Chicago, USA
4. Orchowski LM, Paszek C, Lopez RM, Oesterle DW, Pearlman DN, Rizzo CJ, Elwy ARG, Berkowitz AD, Malone S, Fortson BL. School partner perspectives on the implementation of the Your Voice Your View sexual assault prevention program for high school students. J Community Psychol 2023; 51:2906-2926. [PMID: 37148561] [PMCID: PMC10494965] [DOI: 10.1002/jcop.23050]
Abstract
Despite the high risk for sexual assault among adolescents, few sexual assault prevention programs designed for implementation in high schools have sustained rigorous evaluation. The present study sought to better understand the factors that influenced the implementation of Your Voice Your View (YVYV), a four-session sexual assault prevention program for 10th grade students, which includes a teacher "Lunch and Learn" training as well as a 4-week school-specific social norms poster campaign. Following program implementation, eight school partners (i.e., health teachers, guidance counselors, teachers, and principals) participated in an interview to provide feedback on the process of program implementation. The Consolidated Framework for Implementation Research was utilized to examine site-specific determinants of program implementation. Participants discussed the importance of the design quality and packaging of the program, as well as the relative advantage of offering students a violence prevention program led by an outside team, as opposed to teachers in the school. School partners highlighted the importance of intensive preplanning before implementation, clear communication between staff, the utility of engaging a specific champion to coordinate programming, and the value of offering incentives for participation. Having resources to support implementation, a desire to address sexual violence in the school, and a positive classroom climate in which to administer the small-group sessions were seen as school-specific facilitators of program implementation. These findings can help to support the subsequent implementation of the YVYV program, as well as other sexual assault prevention programs in high schools.
Affiliation(s)
- Lindsay M. Orchowski
- Department of Psychiatry and Behavioral Health, Rhode Island Hospital, Providence, Rhode Island, USA
- Department of Psychiatry and Human Behavior, Warren Alpert Medical School of Brown University, Providence, Rhode Island, USA
- Claudia Paszek
- Johns Hopkins University, Bloomberg School of Public Health, Baltimore, Maryland, USA
- Richard M. Lopez
- Department of Psychiatry and Behavioral Health, Rhode Island Hospital, Providence, Rhode Island, USA
- Daniel W. Oesterle
- Department of Psychology, Purdue University, West Lafayette, Indiana, USA
- Deborah N. Pearlman
- Department of Epidemiology, School of Public Health, Brown University, Providence, Rhode Island, USA
- Christie J. Rizzo
- Department of Applied Psychology, Northeastern University, Boston, Massachusetts, USA
- Anashua Rani Ghose Elwy
- Department of Psychiatry and Human Behavior, Warren Alpert Medical School of Brown University, Providence, Rhode Island, USA
- Alan D. Berkowitz
- Independent Researcher and Practitioner, Mount Shasta, California, USA
- Sandra Malone
- Day One of Rhode Island, Providence, Rhode Island, USA
- Beverly L. Fortson
- Division of Violence Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, Georgia, USA
5. Beasley JM, Johnston EA, Costea D, Sevick MA, Rogers ES, Jay M, Zhong J, Chodosh J. Adapting the Diabetes Prevention Program for Older Adults: Descriptive Study. JMIR Form Res 2023; 7:e45004. [PMID: 37642989] [PMCID: PMC10498315] [DOI: 10.2196/45004]
Abstract
BACKGROUND Prediabetes affects 26.4 million people aged 65 years or older (48.8%) in the United States. Although older adults respond well to the evidence-based Diabetes Prevention Program, they are a heterogeneous group with differing physiological, biomedical, and psychosocial needs who can benefit from additional support to accommodate age-related changes in sensory and motor function. OBJECTIVE The purpose of this paper is to describe adaptations of the Centers for Disease Control and Prevention's Diabetes Prevention Program aimed at preventing diabetes among older adults (ages ≥65 years), along with findings from a pilot of 2 virtual sessions of the adapted program that evaluated the acceptability of the content. METHODS The research team adapted the program by incorporating additional resources necessary for older adults. A certified lifestyle coach delivered 2 sessions of the adapted content via videoconference to 189 older adults. RESULTS The first session had a 34.9% (38/109) survey response rate, and the second had a 34% (30/88) response rate. Over three-quarters (50/59, 85%) of respondents agreed that they liked the virtual program, and 82% (45/55) agreed that they would recommend it to a family member or a friend. CONCLUSIONS These data will be used to inform intervention delivery in a randomized controlled trial comparing in-person versus virtual delivery of the adapted program.
Affiliation(s)
- Jeannette M Beasley
- Department of Nutrition and Food Studies, New York University Steinhardt School of Culture, Education, and Human Development, New York, NY, United States
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Emily A Johnston
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Denisa Costea
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Mary Ann Sevick
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
- Erin S Rogers
- Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
- Melanie Jay
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
- VA New York Harbor Healthcare System, New York, NY, United States
- Judy Zhong
- Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
- Joshua Chodosh
- Department of Medicine, New York University Grossman School of Medicine, New York, NY, United States
- Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
- VA New York Harbor Healthcare System, New York, NY, United States
6. Jiyed O, Alami A, Maskour L, El Batri B, Benjelloun N, Zaki M. Students' approaches to learning (SALs): Validation and psychometric properties of a tool measurement. J Educ Health Promot 2023; 12:228. [PMID: 37727427] [PMCID: PMC10506742] [DOI: 10.4103/jehp.jehp_203_23]
Abstract
BACKGROUND Deep learning is an important outcome of higher education and is largely determined by students' approaches to learning (SALs). The latest version of the Study Process Questionnaire (SPQ) is one of the most widely used instruments for assessing SALs. Many studies from various contexts have validated or used this tool, but none, to the best of our knowledge, stems from the Moroccan tertiary context. The current study fills this gap, first by producing a local translation of the questionnaire following a standardized methodological process, and second by updating the validity and psychometric properties of the construct. MATERIALS AND METHODS An Arabic back-translation was performed. Data were collected from tertiary science students. Descriptive statistics, Cronbach's coefficient alpha, and confirmatory factor analysis were carried out using SPSS version 22. RESULTS A strong fit was found for the two-factor construct (deep and surface), whereas the hierarchical models fit poorly. CONCLUSIONS Following psychometric validation standards, this Arabic version should be used only as a first-order factor model to evaluate the deep and surface approaches in Moroccan tertiary education.
Affiliation(s)
- Omar Jiyed
- LIMOME, Department of Chemistry, Faculty of Sciences Dhar Mahraz, Sidi Mohammed Ben Abdellah University, Fez, Morocco
- Anouar Alami
- LIMOME, Department of Chemistry, Faculty of Sciences Dhar Mahraz, Sidi Mohammed Ben Abdellah University, Fez, Morocco
- Lhoussaine Maskour
- LRST, High School of Education and Training (ESEF), Ibn Zohr University, Agadir, Morocco
- Bouchta El Batri
- Regional Center for Education and Training Professions (CRMEF Fez-Meknes), Fez, Morocco
- Nadia Benjelloun
- LISAC, Departments of Physics and Mathematics, Faculty of Sciences Dhar Mahraz, Sidi Mohammed Ben Abdellah University, Fez, Morocco
- Moncef Zaki
- LISAC, Departments of Physics and Mathematics, Faculty of Sciences Dhar Mahraz, Sidi Mohammed Ben Abdellah University, Fez, Morocco
7. Gelman R, Whelan J, Spiteri S, Duric D, Oakhill W, Cassar S, Love P. Adoption, implementation, and sustainability of early childhood feeding, nutrition and active play interventions in real-world settings: a systematic review. Int J Behav Nutr Phys Act 2023; 20:32. [PMID: 36941649] [PMCID: PMC10029282] [DOI: 10.1186/s12966-023-01433-1]
Abstract
BACKGROUND Instilling healthy dietary habits and active play in early childhood is an important public health focus. Interventions supporting the establishment of nutrition and active play behaviours in the first years of life have shown positive outcomes and long-term cost-effectiveness; however, most are research trials, with limited evidence regarding real-world application. Implementation science theories, models and frameworks (TMFs) can guide the process of research translation from trial to real-world intervention. The application of TMFs within nutrition and active play intervention studies in early childhood (< 5 years) is currently unknown. This systematic review identified the use of TMFs and the barriers/enablers associated with intervention adoption, implementation, and sustainability in early childhood nutrition and active play interventions implemented under real-world conditions. METHODS Six databases were searched for peer-reviewed publications between 2000-2021. Studies were included if primary outcomes reported improvement in diet, physical activity or sedentary behaviours among children aged < 5 years and interventions were delivered under real-world conditions within a community and/or healthcare setting. Two reviewers extracted and evaluated studies, cross-checked by a third and verified by all authors. Quality assessment of included studies was completed by two authors using the Mixed Methods Appraisal Tool (MMAT). RESULTS Eleven studies comprising eleven unique interventions were included. Studies represented low-, middle- and high-income countries, and were conducted across a range of settings. Five TMFs were identified, representing four of Nilsen's implementation model categories, predominantly 'evaluation models'. Ninety-nine barriers/facilitators were extracted across the three intervention phases: implementation (n = 33 barriers; 33 facilitators), sustainability (n = 19 barriers; 9 facilitators), and adoption (n = 2 barriers; 3 facilitators). Identified barriers/facilitators were mapped to the five domains of the Durlak and DuPre framework, with 'funding', 'compatibility' and 'integration of new programming' common across the three intervention phases. CONCLUSIONS Findings demonstrate that there is no systematic application of TMFs in the planning, implementation and/or evaluation of early childhood nutrition and active play interventions in real-world settings; application of TMFs across the intervention lifespan is selective and sporadic. This limited uptake of TMFs is a missed opportunity to enhance real-world implementation success. TRIAL REGISTRATION PROSPERO (CRD42021243841).
Affiliation(s)
- Rivka Gelman
- School of Exercise and Nutrition Science, Deakin University, Geelong, VIC, 3220, Australia
- Jillian Whelan
- School of Medicine, Institute of Health Transformation, Deakin University, Geelong, VIC, 3220, Australia
- Sheree Spiteri
- Institute for Physical Activity and Nutrition, School of Exercise and Nutrition Science, Deakin University, Geelong, VIC, 3220, Australia
- Danijela Duric
- School of Exercise and Nutrition Science, Deakin University, Geelong, VIC, 3220, Australia
- Winnie Oakhill
- School of Exercise and Nutrition Science, Deakin University, Geelong, VIC, 3220, Australia
- Samuel Cassar
- Centre for Youth Mental Health, University of Melbourne, Melbourne, VIC, 3052, Australia
- Penelope Love
- Institute for Physical Activity and Nutrition, School of Exercise and Nutrition Science, Deakin University, Geelong, VIC, 3220, Australia
8. Ramly E, Brown HW. Beyond Effectiveness: Implementation Science 101 for Clinicians and Clinical Researchers. Urogynecology 2023; 29:307-312. [PMID: 36808925] [PMCID: PMC10171038] [DOI: 10.1097/spv.0000000000001322]
Affiliation(s)
- Edmond Ramly
- University of Wisconsin School of Medicine and Public Health, Madison, WI, USA
9. Robinson CH, Damschroder LJ. A pragmatic context assessment tool (pCAT): using a Think Aloud method to develop an assessment of contextual barriers to change. Implement Sci Commun 2023; 4:3. [PMID: 36631914] [PMCID: PMC9835384] [DOI: 10.1186/s43058-022-00380-5]
Abstract
BACKGROUND The Consolidated Framework for Implementation Research (CFIR) is a determinant framework that can be used to guide context assessment prior to implementing change. Though a few quantitative measurement instruments have been developed based on the CFIR, most assessments using the CFIR have relied on qualitative methods. One challenge to measurement is to translate conceptual constructs, which are often described in highly abstract, technical language, into lay language that is clear, concise, and meaningful. The purpose of this paper is to document the methods used to develop a freely available pragmatic context assessment tool (pCAT). The pCAT is based on the CFIR and designed for frontline quality improvement teams as an abbreviated assessment of local facilitators and barriers in a clinical setting. METHODS Twenty-seven interviews using the Think Aloud method (asking participants to verbalize thoughts as they respond to assessment questions) were conducted with frontline employees to improve a pilot version of the pCAT. Interviews were recorded and transcribed verbatim; the CFIR guided coding and analyses. RESULTS Participants identified several areas where language in the pCAT needed to be modified, clarified, or allow more nuance to increase its usefulness for frontline employees. Participants found it easier to respond to questions when they had a recent, specific project in mind. Potential barriers and facilitators tend to be unique to each specific improvement. Participants also identified concepts that were missing or conflated, leading to refinements that made the pCAT more understandable, accurate, and useful. CONCLUSIONS The pCAT is designed to be practical, using everyday language familiar to frontline employees. The pCAT is short (14 items), freely available, and requires no research expertise or experience. It is designed to draw on the knowledge of individuals most familiar with their own clinical context. The pCAT has been available online for approximately two years and has generated a relatively high level of interest, indicating the potential usefulness of the tool.
Affiliation(s)
- Claire H Robinson
- VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, 2215 Fuller Road (152), Ann Arbor, MI, 48105, USA
- Laura J Damschroder
- VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, 2215 Fuller Road (152), Ann Arbor, MI, 48105, USA
10. Sweetnam C, Goulding L, Davis RE, Khadjesari Z, Boaz A, Healey A, Sevdalis N, Bakolis I, Hull L. Development and psychometric evaluation of the Implementation Science Research Project Appraisal Criteria (ImpResPAC) tool: a study protocol. BMJ Open 2022; 12:e061209. [PMID: 36526311] [PMCID: PMC9764655] [DOI: 10.1136/bmjopen-2022-061209]
Abstract
INTRODUCTION The need for quantitative criteria to appraise the quality of implementation research has recently been highlighted as a way to improve methodological rigour. The Implementation Science Research Development (ImpRes) tool and supplementary guide provide methodological guidance and recommendations on how to design high-quality implementation research. This protocol reports on the development of the Implementation Science Research Project Appraisal Criteria (ImpResPAC) tool, a quantitative appraisal tool based on the structure and content of the ImpRes tool and supplementary guide, for evaluating the conceptual and methodological quality of implementation research. METHODS AND ANALYSIS This study employs a three-stage sequential mixed-methods design. During stage 1, the research team will map core domains of the ImpRes tool, and guidance and recommendations contained in the supplementary guide and within the literature, to ImpResPAC. In stage 2, an international multidisciplinary expert group, recruited through purposive sampling, will inform the refinement of ImpResPAC, including its content, scoring system and user instructions. In stage 3, an extensive psychometric evaluation of ImpResPAC, as created in stage 1 and refined in stage 2, will be conducted. The scaling assumptions (inter-item and item-total correlations), reliability (internal consistency, inter-rater) and validity (construct and convergent validity) will be investigated by applying ImpResPAC to 50 protocols published in Implementation Science. We envisage that developing ImpResPAC in this way will provide implementation research stakeholders, primarily grant reviewers and educators, with a comprehensive, transparent and fair appraisal of the conceptual and methodological quality of implementation research, increasing the likelihood of funding research that will generate knowledge and contribute to the advancement of the field. ETHICS AND DISSEMINATION This study will involve human participants. It has been registered, and minimal-risk ethical clearance has been granted, by the Research Ethics Office, King's College London (reference number MRA-20/21-20807). Participants will receive written information on the study via email and will provide e-consent if they wish to participate. We will use traditional academic modalities of dissemination (eg, conferences and publications).
Affiliation(s)
- Chloe Sweetnam
- Neurology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Lucy Goulding
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Rachel E Davis
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Zarnie Khadjesari
- Behavioural and Implementation Science Research Group, School of Health Sciences, University of East Anglia, Norwich, UK
- Annette Boaz
- Department of Health Services Research and Policy, London School of Hygiene & Tropical Medicine, London, UK
- Andy Healey
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- King's Health Economics, Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Nick Sevdalis
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Ioannis Bakolis
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Department of Biostatistics and Health Informatics, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Louise Hull
- Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
11. Hall A, Shoesmith A, Doherty E, McEvoy B, Mettert K, Lewis CC, Wolfenden L, Yoong S, Kingsland M, Shelton RC, Wiltsey Stirman S, Imad N, Sutherland R, Nathan N. Evaluation of measures of sustainability and sustainability determinants for use in community, public health, and clinical settings: a systematic review. Implement Sci 2022; 17:81. [PMID: 36514059] [PMCID: PMC9746194] [DOI: 10.1186/s13012-022-01252-1]
Abstract
BACKGROUND Sustainability is concerned with the long-term delivery and subsequent benefits of evidence-based interventions. To further this field, we require a strong understanding, and thus measurement, of sustainability and of what impacts sustainability (i.e., sustainability determinants). This systematic review aimed to evaluate the quality and empirical application of measures of sustainability and sustainability determinants for use in clinical, public health, and community settings. METHODS Seven electronic databases, reference lists of relevant reviews, online repositories of implementation measures, and the grey literature were searched. Publications were included if they reported on the development, psychometric evaluation, or empirical use of a multi-item, quantitative measure of sustainability or sustainability determinants. Eligibility was not restricted by language or date. Eligibility screening and data extraction were conducted independently by two members of the research team. Content coverage of each measure was assessed by mapping measure items to relevant constructs of sustainability and sustainability determinants. The pragmatic and psychometric properties of included measures were assessed using the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). The empirical use of each measure was descriptively analyzed. RESULTS A total of 32,782 articles were screened from the database search, of which 37 were eligible. An additional 186 publications were identified from the grey literature search. The 223 included articles represented 28 individual measures, of which two assessed sustainability as an outcome, 25 covered sustainability determinants, and one explicitly assessed both. Psychometric and pragmatic quality was variable, with PAPERS scores ranging from 14 to 35 out of a possible 56 points. The Provider Report of Sustainment Scale had the highest PAPERS score among measures of sustainability as an outcome. The School-wide Universal Behaviour Sustainability Index-School Teams had the highest PAPERS score (29) among measures of sustainability determinants. CONCLUSIONS This review can be used to guide selection of the most psychometrically robust, pragmatic, and relevant measures of sustainability and sustainability determinants. It also highlights that future research is needed to improve the psychometric and pragmatic quality of current measures in this field. TRIAL REGISTRATION This review was prospectively registered with Research Registry (reviewregistry1097), March 2021.
Affiliation(s)
- Alix Hall
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Adam Shoesmith
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Emma Doherty
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Brydie McEvoy
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Kayne Mettert
- Kaiser Permanente Washington Health Research Institute, Seattle, USA
| | - Cara C. Lewis
- Kaiser Permanente Washington Health Research Institute, Seattle, USA
- Department of Psychology, University of Washington, Seattle, USA
| | - Luke Wolfenden
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Serene Yoong
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
- School of Health Sciences and Social Development, Deakin University, Melbourne, Victoria, Australia
| | - Melanie Kingsland
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Rachel C. Shelton
- Mailman School of Public Health, Department of Sociomedical Sciences, Columbia University, New York, NY, USA
| | - Shannon Wiltsey Stirman
- Dissemination and Training Division, National Center for PTSD and Department of Psychiatry and Behavioural Sciences, Stanford Medicine, Stanford University, Palo Alto, CA, USA
| | - Noor Imad
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
- School of Health Sciences, Department of Nursing and Allied Health, Swinburne University of Technology, Hawthorn, Victoria, Australia
| | - Rachel Sutherland
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| | - Nicole Nathan
- School of Medicine and Public Health, The University of Newcastle, Locked Bag 10 Wallsend, Callaghan, NSW, Australia
- Priority Research Centre for Health Behaviour, The University of Newcastle, Callaghan, NSW, Australia
- Hunter Medical Research Institute, New Lambton Heights, NSW, Australia
- Hunter New England Population Health, Hunter New England Local Health District, Wallsend, NSW, Australia
| |
12
Hull L, Boulton R, Jones F, Boaz A, Sevdalis N. Defining, conceptualizing and evaluating pragmatic qualities of quantitative instruments measuring implementation determinants and outcomes: a scoping and critical review of the literature and recommendations for future research. Transl Behav Med 2022; 12:1049-1064. [PMID: 36318228 PMCID: PMC9677469 DOI: 10.1093/tbm/ibac064] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022] Open
Abstract
The pragmatic (i.e., practical) quality of quantitative implementation measures has received increased attention in the implementation science literature in recent years. Implementation measures that are judged to be pragmatic by implementation stakeholders are thought to be more likely to be applied in research and practice. Despite the need for pragmatic implementation measures, ambiguity and uncertainty regarding what constitutes a pragmatic measure remain. This study sought to identify and critically appraise the published literature to understand (i) how pragmatism is defined as a measurement construct/quality of implementation determinant and outcome instruments; (ii) how pragmatic qualities of instruments are evaluated; (iii) key gaps and limitations of the current evidence base; and (iv) recommendations for future research. We conducted a scoping review of the literature, also employing methods of critical review. PubMed and PsycINFO databases, using the OVID interface, were searched for relevant articles published between January 2010 and September 2020. Articles that contained a definition and/or described characteristics of "pragmatism" as a measurement construct of quantitative implementation outcomes (as defined by Proctor's Implementation Outcomes taxonomy) and/or implementation determinants were eligible for inclusion. Nine articles met inclusion criteria. A degree of overlap in the definitions and terms used to describe the pragmatic qualities of quantitative implementation determinant and outcome instruments was found. The most frequently cited descriptors of pragmatism were "not burdensome", "brief", "reliable", "valid" and "sensitive to change". Three of the nine included articles involved international implementation stakeholders in defining and conceptualizing pragmatism and employed specific methods to do so, including a systematic literature review, stakeholder interviews, concept mapping, and a Delphi process.
All other articles defined pragmatism, with or without citing relevant literature. One article objectively assessed the pragmatic qualities, above and beyond the psychometric qualities, of implementation measures, using the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). The evidence base within the implementation instrumentation literature on what pragmatism is and how it might be assessed is limited. Some of the research identified in the review provides a strong foundation to build upon, by testing its applicability in other settings (including healthcare areas and countries) and among a more diverse group of stakeholders. We discuss directions for further development of the concept of pragmatism relating to the measurement of implementation determinants and outcomes.
Affiliation(s)
| | - Richard Boulton
- Centre for Health and Social Care, St George’s, University of London and Kingston University, UK
| | - Fiona Jones
- Centre for Health and Social Care, St George’s, University of London and Kingston University, UK
| | - Annette Boaz
- Faculty of Public Health & Policy, London School of Hygiene & Tropical Medicine, London, UK
| | - Nick Sevdalis
- Centre for Implementation Science, Health Service and Population Research Department, King’s College London, London, UK
| |
13
Daniels SI, Cheng H, Gray C, Kim B, Stave CD, Midboe AM. A scoping review of implementation of health-focused interventions in vulnerable populations. Transl Behav Med 2022; 12:935-944. [DOI: 10.1093/tbm/ibac025] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/25/2023] Open
Abstract
Vulnerable populations face significant challenges in getting the healthcare they need. A growing body of implementation science literature has examined factors, including facilitators and barriers, relevant to accessing healthcare in these populations. The purpose of this scoping review was to identify themes relevant for improving implementation of healthcare practices and programs for vulnerable populations. This scoping review relied on the methodological framework set forth by Arksey and O’Malley, and the Consolidated Framework for Implementation Research (CFIR) to evaluate and structure our findings. A framework analytic approach was used to code studies. Of the five CFIR Domains, the Inner Setting and Outer Setting were the most frequently examined in the 81 studies included. Themes that were pertinent to each domain are as follows—Inner Setting: organizational culture, leadership engagement, and integration of the intervention; Outer Setting: networks, external policies, and patients’ needs and resources; Characteristics of the Individual: knowledge and beliefs about the intervention, self-efficacy, as well as stigma (i.e., other attributes); Intervention Characteristics: complexities with staffing, cost, and adaptations; and Process: staff and patient engagement, planning, and ongoing reflection and evaluation. Key themes, including barriers and facilitators, are highlighted here as relevant to implementation of practices for vulnerable populations. These findings can inform tailoring of implementation strategies and health policies for vulnerable populations, thereby supporting more equitable healthcare.
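The domain-frequency finding above (Inner and Outer Setting most frequently examined) is the kind of result a framework-analytic coding pass produces: each study is coded against CFIR domains and the codes are tallied. A minimal sketch; the study identifiers and code assignments are hypothetical:

```python
# Sketch of tallying CFIR domain codes across studies in a framework analysis.
# Study IDs and their domain codes below are hypothetical.
from collections import Counter

study_codes = {
    "study_01": ["Inner Setting", "Outer Setting"],
    "study_02": ["Inner Setting", "Process"],
    "study_03": ["Outer Setting", "Inner Setting"],
}

# Count how often each domain is coded across all studies.
domain_counts = Counter(d for domains in study_codes.values() for d in domains)
most_examined = domain_counts.most_common(2)  # top two domains by frequency
```

In practice the codes would come from qualitative coding software or a structured extraction sheet; the tally itself is this simple.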
Affiliation(s)
- Sarah I Daniels
- Center for Innovation to Implementation (Ci2i), VA Palo Alto Health Care System, Menlo Park, CA 94025, USA
| | - Hannah Cheng
- Center for Innovation to Implementation (Ci2i), VA Palo Alto Health Care System, Menlo Park, CA 94025, USA
| | - Caroline Gray
- Center for Innovation to Implementation (Ci2i), VA Palo Alto Health Care System, Menlo Park, CA 94025, USA
| | - Bo Kim
- Center for Healthcare Organization and Implementation Research, VA Boston Healthcare System, Boston, MA 02114, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA 02115, USA
| | | | - Amanda M Midboe
- Center for Innovation to Implementation (Ci2i), VA Palo Alto Health Care System, Menlo Park, CA 94025, USA
- Stanford University School of Medicine, Stanford, CA 94305, USA
| |
14
Hoy S, Helgadóttir B, Norman Å. Quantitative Measurements for Factors Influencing Implementation in School Settings: Protocol for A Systematic Review and A Psychometric and Pragmatic Analysis. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2022; 19:12726. [PMID: 36232024 PMCID: PMC9564866 DOI: 10.3390/ijerph191912726] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/31/2022] [Revised: 10/01/2022] [Accepted: 10/02/2022] [Indexed: 06/16/2023]
Abstract
INTRODUCTION In order to address the effectiveness and sustainability of school-based interventions, there is a need to consider the factors affecting implementation success. The rapidly growing field of implementation-focused research is struggling to determine how to assess and measure implementation-relevant constructs. Earlier research has identified the need for strong psychometric and pragmatic measures. The aims of this review are therefore to (i) systematically review the literature to identify measurements of factors influencing implementation that have been developed or adapted in school settings; (ii) describe each measurement's psychometric and pragmatic properties; and (iii) describe the alignment between each measurement and the corresponding domain and/or construct of the Consolidated Framework for Implementation Research (CFIR). METHODS Six databases (Medline, ERIC, PsycInfo, Cinahl, Embase, and Web of Science) will be searched for peer-reviewed articles reporting on school settings, published from the year 2000. The identified measurements will be mapped against the CFIR and analyzed for their psychometric and pragmatic properties. DISCUSSION By identifying measurements in the field with strong psychometric and pragmatic properties, this review will contribute to the identification of feasible, effective, and sustainable implementation strategies for future research in school settings.
Affiliation(s)
- Sara Hoy
- Department of Movement, Culture, and Society, The Swedish School of Sport and Health Sciences (GIH), 114 86 Stockholm, Sweden
| | - Björg Helgadóttir
- Department of Physical Activity and Health, The Swedish School of Sport and Health Sciences (GIH), 114 33 Stockholm, Sweden
- Department of Clinical Neuroscience, Karolinska Institute, Tomtebodavägen 18A, 171 77 Stockholm, Sweden
| | - Åsa Norman
- Department of Clinical Neuroscience, Karolinska Institute, Tomtebodavägen 18A, 171 77 Stockholm, Sweden
- Department of Psychology, Stockholm University, 106 91 Stockholm, Sweden
| |
15
Mielke J, Leppla L, Valenta S, Zullig LL, Zúñiga F, Staudacher S, Teynor A, De Geest S. Unraveling implementation context: the Basel Approach for coNtextual ANAlysis (BANANA) in implementation science and its application in the SMILe project. Implement Sci Commun 2022; 3:102. [PMID: 36183141 PMCID: PMC9526967 DOI: 10.1186/s43058-022-00354-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2022] [Accepted: 09/20/2022] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Designing intervention and implementation strategies with careful consideration of context is essential for successful implementation science projects. Although the importance of context has been emphasized and methodology for its analysis is emerging, researchers have little guidance on how to plan, perform, and report contextual analysis. Therefore, our aim was to describe the Basel Approach for coNtextual ANAlysis (BANANA) and to demonstrate its application in an ongoing multi-site, multiphase implementation science project to develop/adapt, implement, and evaluate an integrated care model in allogeneic SteM cell transplantatIon facILitated by eHealth (the SMILe project). METHODS BANANA builds on guidance for assessing context by Stange and Glasgow (Contextual factors: the importance of considering and reporting on context in research on the patient-centered medical home, 2013). Based on a literature review, BANANA was developed in ten discussion sessions with implementation science experts and a medical anthropologist to guide the SMILe project's contextual analysis. BANANA's theoretical basis is the Context and Implementation of Complex Interventions (CICI) framework. Working from an ecological perspective, CICI acknowledges contextual dynamics and distinguishes between context and setting (the implementation's physical location). RESULTS BANANA entails six components: (1) choose a theory, model, or framework (TMF) to guide the contextual analysis; (2) use empirical evidence derived from primary and/or secondary data to identify relevant contextual factors; (3) involve stakeholders throughout contextual analysis; (4) choose a study design to assess context; (5) determine contextual factors' relevance to implementation strategies/outcomes and intervention co-design; and (6) report findings of contextual analysis following appropriate reporting guidelines.
Partly run simultaneously, the first three components form a basis both for the identification of relevant contextual factors and for the subsequent components of the BANANA approach. DISCUSSION Understanding of context is indispensable for a successful implementation science project. BANANA provides much-needed methodological guidance for contextual analysis. In subsequent phases, it helps researchers apply the results to intervention development/adaptation and to choices of contextually tailored implementation strategies. For future implementation science projects, BANANA's principles will guide researchers first to gather relevant information on their target context, then to use it to inform all subsequent phases of their implementation science project, strengthening every part of their work and helping fulfill their implementation goals.
Affiliation(s)
- Juliane Mielke
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
| | - Lynn Leppla
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Department of Medicine I, Faculty of Medicine, Medical Center University of Freiburg, Freiburg im Breisgau, Germany
| | - Sabine Valenta
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Department of Hematology, University Hospital Basel, Basel, Switzerland
| | - Leah L. Zullig
- Center for Innovation to Accelerate Discovery and Practice Transformation (ADAPT), Durham Veterans Affairs Health Care System, Durham, NC, USA
- Department of Population Health Sciences, School of Medicine, Duke University, Durham, NC, USA
| | - Franziska Zúñiga
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
| | - Sandra Staudacher
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Department of Health Services Research, Care and Public Health Research Institute, Maastricht University, Maastricht, The Netherlands
| | - Alexandra Teynor
- University of Applied Sciences Augsburg, Faculty of Computer Science, Augsburg, Germany
| | - Sabina De Geest
- Institute of Nursing Science (INS), Department Public Health (DPH), Faculty of Medicine, University of Basel, Bernoullistrasse 28, CH-4056 Basel, Switzerland
- Academic Center for Nursing and Midwifery, Department of Public Health and Primary Care, KU Leuven, Leuven, Belgium
| |
16
Kenny L, McIntosh A, Jardine K, Suna J, Versluis K, Slee N, Lloyd G, Justo R, Merlo G, Wilson M, Reddan T, Powell J, Venugopal P, Betts K, Alphonso N. Vocal cord dysfunction after pediatric cardiac surgery: A prospective implementation study. JTCVS OPEN 2022; 11:398-411. [PMID: 36172446 PMCID: PMC9510869 DOI: 10.1016/j.xjon.2022.06.003] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/01/2021] [Revised: 05/18/2022] [Accepted: 06/01/2022] [Indexed: 11/09/2022]
Abstract
Objective To determine the incidence and outcomes of, and evaluate diagnostic modalities for, postoperative vocal cord dysfunction (VCD) following cardiothoracic surgery in children. Methods A prospective mixed-methods study using principles of implementation science was completed. All patients undergoing surgery involving the aortic arch, ductus, or ligamentum arteriosum and vascular rings from September 2019 to December 2020 were enrolled. Patients underwent speech pathology assessment, laryngeal ultrasound, and flexible direct laryngoscopy. Results Ninety-five patients were eligible for inclusion. The incidence of VCD ranged from 18% to 56% and varied according to procedure group. VCD occurred in 42% of neonates. Repair of hypoplastic aortic arch was associated with increased risk of VCD (57%; P = .002). There was no significant difference in duration of intubation, pediatric intensive care unit stay, or hospital stay. Forty percent of children were able to achieve full oral feeding. Children with VCD were more likely to require nasogastric supplementary feeding at discharge (60% vs 36%; P = .044). Sixty-eight percent of patients demonstrated complete resolution of VCD at a median of 97 days postoperatively. Laryngeal ultrasound and speech pathology assessment combined had a sensitivity of 91% in comparison to flexible direct laryngoscopy. Conclusions VCD occurred in one-third and resolved in two-thirds of patients at a median of 3 months following cardiac surgery. Aortic arch repair carried the highest risk of VCD. VCD adversely influenced feeding. Forty percent of patients achieved full oral feeding before discharge. VCD did not delay intensive care unit or hospital discharge. Speech pathology assessment and laryngeal ultrasound combined were reliable for diagnosis in most patients and more patient-friendly than flexible direct laryngoscopy.
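The 91% sensitivity figure quoted for the combined assessment follows the standard definition: true positives over all reference-standard positives. A minimal sketch of that calculation; the case counts below are illustrative, not the study's actual data:

```python
# Sensitivity of an index test against a reference standard (here:
# speech pathology assessment + laryngeal ultrasound vs. flexible
# direct laryngoscopy). Counts below are illustrative only.

def sensitivity(true_positives, false_negatives):
    """Proportion of reference-standard positives detected by the index test."""
    return true_positives / (true_positives + false_negatives)

# Illustrative: 29 of 32 laryngoscopy-confirmed VCD cases detected
# by the combined assessment gives a sensitivity of about 0.91.
combined_sensitivity = sensitivity(true_positives=29, false_negatives=3)
```

Note that sensitivity says nothing about false positives; a full evaluation would also report specificity from the true-negative and false-positive counts.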
17
Mielke J, De Geest S, Zúñiga F, Brunkert T, Zullig LL, Pfadenhauer LM, Staudacher S. Understanding dynamic complexity in context-Enriching contextual analysis in implementation science from a constructivist perspective. FRONTIERS IN HEALTH SERVICES 2022; 2:953731. [PMID: 36925847 PMCID: PMC10012673 DOI: 10.3389/frhs.2022.953731] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/26/2022] [Accepted: 07/01/2022] [Indexed: 01/24/2023]
Abstract
Context in implementation science includes not only characteristics of a setting in which an intervention will be delivered, but also social systems (e.g., interrelationships). Context is dynamic and interacts with both the intervention and its implementation. Therefore, contextual analysis is recognized as an indispensable part of implementation science methodology: it provides the foundation for successful and sustainable implementation projects. Yet, driven by the prevailing post-positivist understanding of context, contextual analysis typically focuses on individual characteristics of context; i.e., contextual dynamics and interactions go unnoticed. Conducting contextual analysis from a constructivist perspective promotes a multilayered approach, building a more comprehensive understanding of context and thus facilitating successful implementation. In this article, we highlight the limitations of prevailing perspectives on context and approaches to contextual analysis. We then describe how contextual analysis can be enriched by working from a constructivist perspective. We finish with a discussion of the methodological and practical implications the proposed changes would entail. Emerging literature attempts to address both the concept of context and methods for contextual analysis. Various theories, models and frameworks consider context; however, many of these are reductionistic and do not acknowledge the dynamic nature of context or interactions within it. To complement recent conceptualizations of context, we suggest considering the following five constructivist concepts: 1) social space; 2) social place; 3) agency; 4) sensation; and 5) embodiment. We demonstrate the value of these concepts using COVID-19 vaccination uptake as an example and integrate the concepts into the Context and Implementation of Complex Interventions (CICI) framework, an implementation science framework that pays ample attention to context.
To study context from a constructivist perspective, we also suggest additional considerations regarding methodologies for data collection and analysis, e.g., rapid ethnographic methods. A constructivist perspective contributes to a stronger conceptualization of contextual analysis. Considering the five constructivist concepts helps to overcome contextual analysis' current shortcomings, while revealing complex dynamics that usually go unnoticed. Thus, a more comprehensive understanding of context can be developed to inform subsequent phases of an implementation project, thereby maximizing an intervention's uptake and sustainability.
Affiliation(s)
- Juliane Mielke
- Institute of Nursing Science, Department Public Health, University of Basel, Basel, Switzerland
| | - Sabina De Geest
- Institute of Nursing Science, Department Public Health, University of Basel, Basel, Switzerland
- Academic Center for Nursing and Midwifery, Department of Public Health and Primary Care, KU Leuven, Leuven, Belgium
| | - Franziska Zúñiga
- Institute of Nursing Science, Department Public Health, University of Basel, Basel, Switzerland
| | - Thekla Brunkert
- Institute of Nursing Science, Department Public Health, University of Basel, Basel, Switzerland
- University Department of Geriatric Medicine FELIX PLATTER, Basel, Switzerland
| | - Leah L. Zullig
- Center for Innovation to Accelerate Discovery and Practice Transformation (ADAPT), Durham Veterans Affairs Health Care System, Durham, NC, United States
- Department of Population Health Sciences, School of Medicine, Duke University, Durham, NC, United States
| | - Lisa M. Pfadenhauer
- Institute for Medical Information Processing, Biometry and Epidemiology, Ludwig Maximilian University of Munich, Munich, Germany
- Pettenkofer School of Public Health, Ludwig Maximilian University of Munich, Munich, Germany
| | - Sandra Staudacher
- Institute of Nursing Science, Department Public Health, University of Basel, Basel, Switzerland
- Department of Health Services Research, Care and Public Health Research Institute, Maastricht University, Maastricht, Netherlands
| |
18
Lamarche L, Clark RE, Parascandalo F, Mangin D. The implementation and validation of the NoMAD during a complex primary care intervention. BMC Med Res Methodol 2022; 22:175. [PMID: 35718763 PMCID: PMC9206734 DOI: 10.1186/s12874-022-01655-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2021] [Accepted: 06/08/2022] [Indexed: 11/10/2022] Open
Abstract
Background Normalization process theory (NPT) has been widely used to better understand how new interventions are implemented and embedded. The NoMAD (Normalization Measurement Development questionnaire) is a 23-item instrument based on NPT. As the NoMAD is a relatively new instrument, the objectives of this paper are: to describe the experience of implementing the NoMAD, to describe it being used as a feedback mechanism to gain insight into the normalization process of a complex health intervention, and to further explore the psychometric properties of the instrument. Methods Health TAPESTRY was implemented in six Family Health Teams (a total of seven sites) across Ontario. Healthcare team members at each site were invited to complete the NoMAD, and three general questions about normalization, six times over a 12-month period. Each site was then provided a visual traffic light summary (TLS) reflecting the implementation of Health TAPESTRY. The internal consistency of each subscale and the validity of the NoMAD were assessed. Learnings from the implementation of the NoMAD and the subsequent feedback mechanism (TLS) are reported descriptively. Results In total, 56 diverse health care team members from six implementation sites completed the NoMAD. Each used it at least once during the 12-month study period. The implementation of the NoMAD and TLS was time-consuming, with multiple collection (and feedback) points. Most (60%) internal consistency values of the four subscales (pooled across sites) across each collection point were satisfactory. Among the NoMAD subscales, all correlations were positive and most (86%) were statistically significant. All but one correlation between the NoMAD subscales and the general questions were positive, and most (72%) were significant. Generally, scores on the subscales were higher at 12 months than at baseline, although they did not follow a linear pattern of change across implementation.
Generally, scores were higher for experienced sites compared to first-time implementors. Conclusion Our experience would suggest fewer collection points; three timepoints spaced out by several months are adequate, if repeated administration of the NoMAD is used for feedback loops. We provide additional evidence of the psychometric properties of the NoMAD. Trial Registration Registered at ClinicalTrials.gov: NCT03397836.
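The internal-consistency checks reported for the NoMAD subscales rest on Cronbach's alpha, which can be computed directly from a respondents-by-items matrix. A minimal sketch using the standard formula; the tiny response matrix below is fabricated for illustration:

```python
# Cronbach's alpha for one subscale:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# The 3-respondent, 2-item data below are fabricated for illustration.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array-like, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                           # number of items
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

alpha = cronbach_alpha([[1, 1], [2, 2], [3, 4]])
```

A common (if rough) rule of thumb treats alpha of at least 0.7-0.8 as satisfactory, which is the kind of threshold behind the "60% of values were satisfactory" statement above.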
Affiliation(s)
- Larkin Lamarche
- Department of Family Medicine, McMaster University, David Braley Health Sciences Centre, 100 Main Street West, 5th Floor, Hamilton, ON, L8P 1H6, Canada
| | - Rebecca E Clark
- Department of Family Medicine, McMaster University, David Braley Health Sciences Centre, 100 Main Street West, 5th Floor, Hamilton, ON, L8P 1H6, Canada
| | - Fiona Parascandalo
- Department of Family Medicine, McMaster University, David Braley Health Sciences Centre, 100 Main Street West, 5th Floor, Hamilton, ON, L8P 1H6, Canada
| | - Dee Mangin
- Department of Family Medicine, McMaster University, David Braley Health Sciences Centre, 100 Main Street West, 5th Floor, Hamilton, ON, L8P 1H6, Canada.
| |
19
Aldridge LR, Kemp CG, Bass JK, Danforth K, Kane JC, Hamdani SU, Marsch LA, Uribe-Restrepo JM, Nguyen AJ, Bolton PA, Murray LK, Haroz EE. Psychometric performance of the Mental Health Implementation Science Tools (mhIST) across six low- and middle-income countries. Implement Sci Commun 2022; 3:54. [PMID: 35590428 PMCID: PMC9118868 DOI: 10.1186/s43058-022-00301-6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2021] [Accepted: 04/26/2022] [Indexed: 01/18/2023] Open
Abstract
BACKGROUND Existing implementation measures developed in high-income countries may have limited appropriateness for use within low- and middle-income countries (LMIC). In response, researchers at Johns Hopkins University began developing the Mental Health Implementation Science Tools (mhIST) in 2013 to assess priority implementation determinants and outcomes across four key stakeholder groups (consumers, providers, organization leaders, and policy makers), with dedicated versions of the scales for each group. These were field tested and refined in several contexts, and criterion validity was established in Ukraine. The Consumer and Provider mhIST have since grown in popularity in mental health research, outpacing psychometric evaluation. Our objective was to establish the cross-context psychometric properties of these versions and inform future revisions. METHODS We compiled secondary data from seven studies across six LMIC (Colombia, Myanmar, Pakistan, Thailand, Ukraine, and Zambia) to evaluate the psychometric performance of the Consumer and Provider mhIST. We used exploratory factor analysis to identify dimensionality, factor structure, and item loadings for each scale within each stakeholder version. We also used alignment analysis (i.e., multi-group confirmatory factor analysis) to estimate measurement invariance and differential item functioning of the Consumer scales across the six countries. RESULTS All but one scale within the Provider and Consumer versions had Cronbach's alpha greater than 0.8. Exploratory factor analysis indicated most scales were multidimensional, with factors generally aligning with a priori subscales for the Provider version; the Consumer version has no predefined subscales. Alignment analysis of the Consumer mhIST indicated a range of measurement invariance for scales across settings (R2 0.46 to 0.77). Several items were identified for potential revision due to participant nonresponse or low or cross-factor loadings.
We found only one item, which asked consumers whether their intervention provider was available when needed, to have differential item functioning in both intercept and loading. CONCLUSION We provide evidence that the Consumer and Provider versions of the mhIST are internally valid and reliable across diverse contexts and stakeholder groups for mental health research in LMIC. We recommend the instrument be revised based on these analyses and future research examine instrument utility by linking measurement to other outcomes of interest.
Affiliation(s)
- Luke R Aldridge
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
- Christopher G Kemp
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
- Judith K Bass
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
- Kristen Danforth
- University of Washington Department of Global Health, Seattle, USA
- Jeremy C Kane
- Columbia University Mailman School of Public Health, New York, USA
- Syed U Hamdani
- University of Liverpool Institute of Population Health, Liverpool, UK
- Lisa A Marsch
- Dartmouth Center for Technology & Behavioral Health, Lebanon, USA
- José M Uribe-Restrepo
- Pontificia Universidad Javeriana Department of Psychiatry and Mental Health, Bogota, Colombia
- Amanda J Nguyen
- University of Virginia School of Education and Human Development, Charlottesville, USA
- Paul A Bolton
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
- Laura K Murray
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
- Emily E Haroz
- Johns Hopkins University Bloomberg School of Public Health, Baltimore, USA
20
Pilar M, Jost E, Walsh-Bailey C, Powell BJ, Mazzucca S, Eyler A, Purtle J, Allen P, Brownson RC. Quantitative measures used in empirical evaluations of mental health policy implementation: A systematic review. Implementation Research and Practice 2022; 3:26334895221141116. [PMID: 37091091; PMCID: PMC9924289; DOI: 10.1177/26334895221141116]
Abstract
Background Mental health is a critical component of wellness. Public policies present an opportunity for large-scale mental health impact, but policy implementation is complex and can vary significantly across contexts, making it crucial to evaluate implementation. The objectives of this study were to (1) identify quantitative measurement tools used to evaluate the implementation of public mental health policies; (2) describe the implementation determinants and outcomes assessed in the measures; and (3) assess the pragmatic and psychometric quality of the identified measures. Method Guided by the Consolidated Framework for Implementation Research, the Policy Implementation Determinants Framework, and the Implementation Outcomes Framework, we conducted a systematic review of peer-reviewed journal articles published from 1995 to 2020. Data extracted included study characteristics, measure development and testing, implementation determinants and outcomes, and measure quality using the Psychometric and Pragmatic Evidence Rating Scale. Results We identified 34 tools from 25 articles, which were designed for mental health policies or used to evaluate constructs that affect implementation. Many measures lacked information regarding measure development and testing. The most frequently assessed implementation determinants were readiness for implementation, which encompassed training (n = 20, 57%) and other resources (n = 12, 34%), actor relationships/networks (n = 15, 43%), and organizational culture and climate (n = 11, 31%). Fidelity was the most prevalent implementation outcome (n = 9, 26%), followed by penetration (n = 8, 23%) and acceptability (n = 7, 20%). Apart from internal consistency and sample norms, psychometric properties were frequently unreported. Most measures were accessible and brief, though minimal information was provided on interpreting scores, handling missing data, or the training needed to administer the tools.
Conclusions This work contributes to the nascent field of policy-focused implementation science by providing an overview of existing measurement tools used to evaluate mental health policy implementation, together with recommendations for measure development and refinement. To advance the field, more valid, reliable, and pragmatic measures are needed to evaluate policy implementation and close the policy-to-practice gap. Plain Language Summary Mental health is a critical component of wellness, and public policies present an opportunity to improve mental health on a large scale. Policy implementation is complex because it involves action by multiple entities at several levels of society. It is also challenging because it can be affected by many factors, such as political will, stakeholder relationships, and the resources available for implementation. Because of these factors, implementation can vary between locations, such as states or countries. It is crucial to evaluate policy implementation, so we conducted a systematic review to identify and evaluate the quality of measurement tools used in mental health policy implementation studies. Our search and screening procedures yielded 34 measurement tools. We rated their quality to determine whether these tools were practical to use and would produce consistent (i.e., reliable) and accurate (i.e., valid) data. The tools most frequently assessed whether implementing organizations complied with policy mandates and whether organizations had the training and other resources required to implement a policy. Though many were relatively brief and available at little to no cost, our findings highlight that more reliable, valid, and practical measurement tools are needed to assess and inform mental health policy implementation. Findings from this review can guide future efforts to select or develop policy implementation measures.
Affiliation(s)
- Meagan Pilar
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Department of Infectious Diseases, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Eliot Jost
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Callie Walsh-Bailey
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Byron J. Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Stephanie Mazzucca
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Amy Eyler
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Jonathan Purtle
- Department of Public Health Policy & Management, New York University School of Global Public Health, Global Center for Implementation Science, New York University, New York, NY, USA
- Peg Allen
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Ross C. Brownson
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
21
Carlson MA, Morris S, Day F, Dadich A, Ryan A, Fradgley EA, Paul C. Psychometric properties of leadership scales for health professionals: a systematic review. Implement Sci 2021; 16:85. [PMID: 34454567; PMCID: PMC8403357; DOI: 10.1186/s13012-021-01141-z]
Abstract
Background The important role of leaders in the translation of health research is acknowledged in the implementation science literature. However, the accurate measurement of leadership traits and behaviours in health professionals has not been directly addressed. This review aimed to identify whether scales measuring leadership traits and behaviours have been found to be reliable and valid for use with health professionals. Methods A systematic review was conducted. MEDLINE, EMBASE, PsycINFO, Cochrane, CINAHL, Scopus, ABI/INFORMIT and Business Source Ultimate were searched to identify publications reporting original research that tested the reliability, validity or acceptability of a leadership-related scale with health professionals. Results Of 2814 records, 39 studies met the inclusion criteria, from which 33 scales were identified as having undergone some form of psychometric testing with health professionals. The most commonly used were the Implementation Leadership Scale (n = 5) and the Multifactor Leadership Questionnaire (n = 3). Most of the 33 scales were validated in English-speaking countries, including the USA (n = 15) and Canada (n = 4), with some translations and use in Europe and Asia, predominantly with samples of nurses (n = 27) or allied health professionals (n = 10). Only two validation studies included physicians. Content validity and internal consistency were evident for most scales (n = 30 and 29, respectively). Only 20 of the 33 scales satisfied acceptable thresholds for good construct validity. Very limited testing occurred in relation to test-retest reliability, responsiveness, acceptability, cross-cultural revalidation, convergent validity, discriminant validity and criterion validity. Conclusions Seven scales may be sufficiently sound to be used with health professionals, primarily with nurses. There is an absence of validation of leadership scales with regard to physicians.
Given that physicians, along with nurses and allied health professionals have a leadership role in driving the implementation of evidence-based healthcare, this constitutes a clear gap in the psychometric testing of leadership scales for use in healthcare implementation research and practice. Trial registration This review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) (see Additional File 1) (PLoS Medicine. 6:e1000097, 2009) and the associated protocol has been registered with the PROSPERO International Prospective Register of Systematic Reviews (Registration Number CRD42019121544). Supplementary Information The online version contains supplementary material available at 10.1186/s13012-021-01141-z.
Affiliation(s)
- Melissa A Carlson
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia
- Sarah Morris
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia
- Fiona Day
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia; Calvary Mater Newcastle, Waratah, New South Wales, Australia
- Ann Dadich
- Centre for Oncology Education and Research Translation (CONCERT), Western Sydney University, Penrith, Australia
- Annika Ryan
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia
- Elizabeth A Fradgley
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia
- Christine Paul
- Hunter Cancer Research Alliance, Newcastle, New South Wales, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia
22
Allen JD, Shelton RC, Kephart L, Tom LS, Leyva B, Ospino H, Cuevas AG. Examining the external validity of the CRUZA study, a randomized trial to promote implementation of evidence-based cancer control programs by faith-based organizations. Transl Behav Med 2021; 10:213-222. [PMID: 30496532; DOI: 10.1093/tbm/iby099]
Abstract
The CRUZA trial tested the efficacy of an organizational-level intervention to increase capacity among Catholic parishes to implement evidence-based interventions (EBIs) for cancer control. This paper examines the external generalizability of the CRUZA study findings by comparing characteristics of parishes that agreed to participate in the intervention trial versus those that declined participation. Sixty-five Roman Catholic parishes that offered Spanish-language mass in Massachusetts were invited to complete a four-part survey assessing organization-level characteristics that, based on the Consolidated Framework for Implementation Research (CFIR), may be associated with EBI implementation. Forty-nine parishes (75%) completed the survey and were invited to participate in the CRUZA trial, which randomized parishes to either a "capacity enhancement intervention" or a "standard dissemination" group. Of these 49 parishes, 31 (63%) agreed to participate in the trial, whereas 18 parishes (37%) declined participation. Parishes that participated in the CRUZA intervention trial were similar to those that did not participate with respect to "inner organizational setting" characteristics of the CFIR, including innovation and values fit, implementation climate, and organizational culture. Change commitment, a submeasure of organizational readiness that reflects the shared resolve of organizational members to implement an innovation, was significantly higher among the participating parishes (mean = 3.93, SD = 1.08) as compared to nonparticipating parishes (mean = 3.27, SD = 1.08) (Z = -2.16, p = .03). Parishes that agreed to participate in the CRUZA intervention trial were similar to those that declined participation with regard to organizational characteristics that may predict implementation of EBIs. 
Pragmatic tools to assess external generalizability in community-based implementation trials, and to promote readiness among faith-based organizations to implement EBIs, are needed to enhance the reach and impact of public health research. Clinical trial information: ClinicalTrials.gov identifier NCT01740219.
Affiliation(s)
- Laura S Tom
- Community Health, Tufts University, Medford, MA
- Bryan Leyva
- Community Health, Tufts University, Medford, MA
23
Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: Implementation Science Centers in Cancer Control. Transl Behav Med 2021; 11:669-675. [PMID: 32145023; DOI: 10.1093/tbm/ibaa018]
Abstract
The National Cancer Institute (NCI) Cancer Moonshot initiative seeks to accelerate cancer research for the USA. One of the scientific priorities identified by the Moonshot's Blue Ribbon Panel (BRP) of scientific experts was the implementation of evidence-based approaches. In September 2019, the NCI launched the Implementation Science Centers in Cancer Control (ISC3 or "Centers") initiative to advance this Moonshot priority. The vision of the ISC3 is to promote the development of research centers to build capacity and research in high-priority areas of cancer control implementation science (e.g., scale-up and spread, sustainability and adaptation, and precision implementation), build implementation laboratories within community and clinical settings, improve the state of measurement and methods, and improve the adoption, implementation, and sustainment of evidence-based cancer control interventions. This paper highlights the research agenda, vision, and strategic direction for these Centers and encourages transdisciplinary scientists to learn more about opportunities to collaborate with these Centers.
Affiliation(s)
- April Oh
- Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
- Cynthia A Vinson
- Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
- David A Chambers
- Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, USA
24
Kalbarczyk A, Rodriguez DC, Mahendradhata Y, Sarker M, Seme A, Majumdar P, Akinyemi OO, Kayembe P, Alonge OO. Barriers and facilitators to knowledge translation activities within academic institutions in low- and middle-income countries. Health Policy Plan 2021; 36:728-739. [PMID: 33661285; PMCID: PMC8173595; DOI: 10.1093/heapol/czaa188]
Abstract
The barriers and facilitators of conducting knowledge translation (KT) activities are well established, but less is known about the institutional forces that drive these barriers, particularly in low-resource settings. Organizational readiness assessments have been used to understand and address such barriers, but they have largely been employed in high-income countries. We conducted a qualitative study to describe the institutional needs and barriers in KT specific to academic institutions in low- and middle-income countries. We reviewed the grey and published literature to identify country health priorities and established barriers and facilitators for KT. Key informant interviews (KIIs) were conducted to elicit perceptions of institutional readiness to conduct KT, including experiences with KT and views on motivation and capacity building. Participants included representatives from academic institutions and Ministries of Health in six countries (Bangladesh, Democratic Republic of the Congo, Ethiopia, India, Indonesia, Nigeria). We conducted 18 KIIs: 11 with members of academic institutions and 7 with policymakers. KIIs were analysed using a deductive and inductive coding approach. Our findings support many well-documented barriers, including lack of time, skills, and institutional support to conduct KT. Three additional institutional drivers emerged, concerning soft skills and the complexity of the policy process, alignment of incentives and institutional missions, and the role of networks. Participants reflected on the soft skills, often lacking, that researchers need to engage policy makers. Continuous engagement was viewed as a challenge given competing demands on time (for both researchers and policy makers) and the lack of institutional incentives to conduct KT. Strong networks, both within and between institutions, were described as important for conducting KT but difficult to establish and maintain.
Attention to the cross-cutting themes representing barriers and facilitators for both individuals and institutions can inform the development of capacity building strategies that meet readiness needs.
Affiliation(s)
- Anna Kalbarczyk
- Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Yodi Mahendradhata
- Faculty of Medicine, Public Health and Nursing, Universitas Gadjah Mada, Bulaksumur Yogyakarta, Indonesia
- Malabika Sarker
- BRAC James P. Grant School of Public Health, BRAC University, Dhaka, Bangladesh; Heidelberg Institute of Global Health (HIGH), Heidelberg University, Heidelberg, Germany
- Assefa Seme
- Addis Ababa University School of Public Health, Ethiopia
- Piyusha Majumdar
- Indian Institute of Health Management Research, Bengaluru, India
- Oluwaseun O Akinyemi
- Department of Health Policy and Management, College of Medicine, University of Ibadan, Ibadan, Nigeria
- Patrick Kayembe
- School of Public Health, University of Kinshasa, Kinshasa, Democratic Republic of the Congo
25
Zlateva I, Schiessl A, Khalid N, Bamrick K, Flinter M. Development and validation of the Readiness to Train Assessment Tool (RTAT). BMC Health Serv Res 2021; 21:396. [PMID: 33910561; PMCID: PMC8082650; DOI: 10.1186/s12913-021-06406-3]
Abstract
BACKGROUND In recent years, health centers in the United States have embraced the opportunity to train the next generation of health professionals. The uniqueness of health centers as teaching settings underscores the need to determine whether health professions training programs align with health center priorities, and what adjustments would be needed to implement a training program successfully. We sought to address this need by developing and validating a new survey that measures organizational readiness constructs important for the implementation of health professions training programs at health centers, where the primary role of the organizations and individuals is healthcare delivery. METHODS The study incorporated several methodological steps for developing and validating a measure of health center readiness to engage with health professions programs. A conceptual framework was developed based on a literature review and later validated by 20 experts in two focus groups. A pool of survey items was generated, mapped to the conceptual framework, and further refined and validated by 13 experts in three modified Delphi rounds. The survey items were pilot-tested with 212 health center employees. The final survey structure was derived through exploratory factor analysis. The internal consistency reliability of the scale and subscales was evaluated using Cronbach's alpha. RESULTS The exploratory factor analysis revealed a 41-item, 7-subscale solution for the survey structure, with 72% of total variance explained. Cronbach's alphas (.79-.97) indicated high internal consistency reliability. The survey measures: readiness to engage, evidence strength and quality of the health professions training program, relative advantage of the program, financial resources, additional resources, implementation team, and implementation plan. CONCLUSIONS The final survey, the Readiness to Train Assessment Tool (RTAT), is theoretically based, valid, and reliable.
It provides an opportunity to evaluate health centers' readiness to implement health professions programs. When followed with appropriate change strategies, the readiness evaluations could make the implementation of health professions training programs, and their spread across the United States, more efficient and cost-effective. While developed specifically for health centers, the survey may be useful to other healthcare organizations willing to assess their readiness to implement education and training programs.
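The internal consistency statistic reported throughout these validation studies, Cronbach's alpha, can be computed directly from an item-response matrix. A minimal sketch (illustrative only; the scoring data below are invented and not from any cited study):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = len(items[0])                                   # items in the subscale
    cols = list(zip(*items))                            # transpose to per-item columns
    item_var_sum = sum(variance(col) for col in cols)   # sum of sample item variances
    total_var = variance([sum(row) for row in items])   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Five hypothetical respondents answering three related items on a 1-5 scale;
# strongly correlated items push alpha toward 1.
scores = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 3, 3], [1, 2, 1]]
alpha = cronbach_alpha(scores)  # ≈ 0.97, above the common 0.7-0.8 threshold
```

Subscale alphas such as the RTAT's .79-.97 result from applying this formula separately to each subscale's block of items.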
Affiliation(s)
- Ianita Zlateva
- Weitzman Institute, Community Health Center, Inc., Middletown, CT, USA
- Amanda Schiessl
- Weitzman Institute, Community Health Center, Inc., Middletown, CT, USA
- Nashwa Khalid
- Weitzman Institute, Community Health Center, Inc., Middletown, CT, USA
- Kerry Bamrick
- Weitzman Institute, Community Health Center, Inc., Middletown, CT, USA
- Margaret Flinter
- Weitzman Institute, Community Health Center, Inc., Middletown, CT, USA
26
Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implement Sci 2021; 16:34. [PMID: 33794956; PMCID: PMC8015179; DOI: 10.1186/s13012-021-01099-y]
Abstract
BACKGROUND Bridging factors are relational ties, formal arrangements, and processes that connect outer system and inner organizational contexts. They may be critical drivers of evidence-based practice (EBP) implementation and sustainment. Yet, the complex interplay between outer and inner contexts is often not considered. Bridging factors were recently defined in the updated Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Further identification and specification of this construct will advance implementation models, measures, and methods. Our goal is to advance bridging factor research by identifying relevant dimensions and exemplifying these dimensions through illustrative case studies. METHODS We used a multiple case study design. Each case (n = 10) represented different contexts, EBPs, and bridging factor types. Inclusion criteria were the presence of clearly distinguishable outer and inner contexts, identifiable bridging factor, sufficient information to describe how the bridging factor affected implementation, and variation from other cases. We used an iterative qualitative inquiry process to develop and refine a list of dimensions. Case data were entered into a matrix. Dimensions comprised the rows and case details comprised the columns. After a review of all cases, we collectively considered and independently coded each dimension as function or form. RESULTS We drew upon the concepts of functions and forms, a distinction originally proposed in the complex health intervention literature. Function dimensions help define the bridging factor and illustrate its purpose as it relates to EBP implementation. Form dimensions describe the specific structures and activities that illustrate why and how the bridging factor has been customized to a local implementation experience. 
Function dimensions can help researchers and practitioners identify the presence and purpose of bridging factors, whereas form dimensions can help us understand how the bridging factor may be designed or modified to support EBP implementation in a specific context. We propose five function and three form bridging factor dimensions. CONCLUSIONS Bridging factors are described in many implementation models and studies, but without explicit reference or investigation. Bridging factors are an understudied and critical construct that requires further attention to facilitate implementation research and practice. We present specific recommendations for a bridging factors research agenda.
Affiliation(s)
- Nicole A. Stadnick
- Department of Psychiatry, University of California-San Diego, La Jolla, USA
- UC San Diego Altman Clinical and Translational Research Institute Dissemination and Implementation Science Center, La Jolla, CA, USA
- Child and Adolescent Services Research Center, San Diego, CA, USA
- Kelsey S. Dickson
- Child and Adolescent Services Research Center, San Diego, CA, USA
- College of Education, San Diego State University, San Diego, CA, USA
- Joanna C. Moullin
- Child and Adolescent Services Research Center, San Diego, CA, USA
- Curtin Medical School, Curtin University, Perth, Western Australia
- Gregory A. Aarons
- Department of Psychiatry, University of California-San Diego, La Jolla, USA
- UC San Diego Altman Clinical and Translational Research Institute Dissemination and Implementation Science Center, La Jolla, CA, USA
- Child and Adolescent Services Research Center, San Diego, CA, USA
27
Wolfenden L, Foy R, Presseau J, Grimshaw JM, Ivers NM, Powell BJ, Taljaard M, Wiggers J, Sutherland R, Nathan N, Williams CM, Kingsland M, Milat A, Hodder RK, Yoong SL. Designing and undertaking randomised implementation trials: guide for researchers. BMJ 2021; 372:m3721. [PMID: 33461967; PMCID: PMC7812444; DOI: 10.1136/bmj.m3721]
Abstract
Implementation science is the study of methods to promote the systematic uptake of evidence based interventions into practice and policy to improve health. Despite the need for high quality evidence from implementation research, randomised trials of implementation strategies often have serious limitations. These limitations include high risks of bias, limited use of theory, a lack of standard terminology to describe implementation strategies, narrowly focused implementation outcomes, and poor reporting. This paper aims to improve the evidence base in implementation science by providing guidance on the development, conduct, and reporting of randomised trials of implementation strategies. Established randomised trial methods from seminal texts and recent developments in implementation science were consolidated by an international group of researchers, health policy makers, and practitioners. This article provides guidance on the key components of randomised trials of implementation strategies, including articulation of trial aims, trial recruitment and retention strategies, randomised design selection, use of implementation science theory and frameworks, measures, sample size calculations, ethical review, and trial reporting. It also focuses on topics requiring special consideration or adaptation for implementation trials. We propose this guide as a resource for researchers, healthcare and public health policy makers or practitioners, research funders, and journal editors with the goal of advancing rigorous conduct and reporting of randomised trials of implementation strategies.
Affiliation(s)
- Luke Wolfenden: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Robbie Foy: Leeds Institute of Health Sciences, University of Leeds, Leeds, UK
- Justin Presseau: Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON, Canada
- Jeremy M Grimshaw: Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, Canada; Department of Medicine, University of Ottawa, Ottawa, ON, Canada
- Noah M Ivers: Women's College Research Institute, Women's College Hospital, Toronto, ON, Canada; Institute for Health Systems Solutions and Virtual Care, Women's College Hospital, Toronto, ON, Canada; Department of Family Medicine and Community Medicine, Faculty of Medicine, University of Toronto, Toronto, ON, Canada; Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada
- Byron J Powell: Brown School and School of Medicine, Washington University in St Louis, St Louis, MO, USA
- Monica Taljaard: Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON, Canada
- John Wiggers: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Rachel Sutherland: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Nicole Nathan: Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Christopher M Williams: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia; School of Public Health, Faculty of Medicine and Health, University of Sydney, Sydney, NSW, Australia
- Melanie Kingsland: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Andrew Milat: School of Public Health, Faculty of Medicine and Health, University of Sydney, Sydney, NSW, Australia
- Rebecca K Hodder: School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Callaghan, NSW, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Sze Lin Yoong: School of Health Sciences, Faculty of Health, Arts and Design, Swinburne University of Technology, Hawthorn, VIC, Australia
28
Pearson N, Naylor PJ, Ashe MC, Fernandez M, Yoong SL, Wolfenden L. Guidance for conducting feasibility and pilot studies for implementation trials. Pilot Feasibility Stud 2020; 6:167. [PMID: 33292770 PMCID: PMC7603668 DOI: 10.1186/s40814-020-00634-w] [Citation(s) in RCA: 123] [Impact Index Per Article: 30.8] [Received: 01/08/2020] [Accepted: 06/18/2020] [Indexed: 12/21/2022] Open
Abstract
BACKGROUND Implementation trials aim to test the effects of implementation strategies on the adoption, integration or uptake of an evidence-based intervention within organisations or settings. Feasibility and pilot studies can assist with building and testing effective implementation strategies by helping to address uncertainties around design and methods, assessing potential implementation strategy effects and identifying potential causal mechanisms. This paper aims to provide broad guidance for the conduct of feasibility and pilot studies for implementation trials. METHODS We convened a group with a mutual interest in the use of feasibility and pilot trials in implementation science including implementation and behavioural science experts and public health researchers. We conducted a literature review to identify existing recommendations for feasibility and pilot studies, as well as publications describing formative processes for implementation trials. In the absence of previous explicit guidance for the conduct of feasibility or pilot implementation trials specifically, we used the effectiveness-implementation hybrid trial design typology proposed by Curran and colleagues as a framework for conceptualising the application of feasibility and pilot testing of implementation interventions. We discuss and offer guidance regarding the aims, methods, design, measures, progression criteria and reporting for implementation feasibility and pilot studies. CONCLUSIONS This paper provides a resource for those undertaking preliminary work to enrich and inform larger scale implementation trials.
Affiliation(s)
- Nicole Pearson: School of Medicine and Public Health, University of Newcastle, University Drive, Callaghan, NSW 2308, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Patti-Jean Naylor: School of Exercise Science, Physical and Health Education, Faculty of Education, University of Victoria, PO Box 3015 STN CSC, Victoria, BC, V8W 3P1, Canada
- Maureen C Ashe: Department of Family Practice, University of British Columbia (UBC) and Centre for Hip Health and Mobility, University Boulevard, Vancouver, BC, V6T 1Z3, Canada
- Maria Fernandez: Center for Health Promotion and Prevention Research, University of Texas Health Science Center at Houston School of Public Health, Houston, TX, 77204, USA
- Sze Lin Yoong: School of Medicine and Public Health, University of Newcastle, University Drive, Callaghan, NSW 2308, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
- Luke Wolfenden: School of Medicine and Public Health, University of Newcastle, University Drive, Callaghan, NSW 2308, Australia; Hunter New England Population Health, Locked Bag 10, Wallsend, NSW 2287, Australia
29
Ginsburg LR, Hoben M, Easterbrook A, Andersen E, Anderson RA, Cranley L, Lanham HJ, Norton PG, Weeks LE, Estabrooks CA. Examining fidelity in the INFORM trial: a complex team-based behavioral intervention. Implement Sci 2020; 15:78. [PMID: 32938481 PMCID: PMC7493316 DOI: 10.1186/s13012-020-01039-2] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Received: 03/18/2020] [Accepted: 08/31/2020] [Indexed: 11/11/2022] Open
Abstract
Background Fidelity in complex behavioral interventions is underexplored. This study examines the fidelity of the INFORM trial and explores the relationship between fidelity, study arm, and the trial’s primary outcome—care aide involvement in formal team communications about resident care. Methods A concurrent process evaluation of implementation fidelity was conducted in 33 nursing homes in Western Canada (Alberta and British Columbia). Study participants were from 106 clinical care units clustered in 33 nursing homes randomized to the Basic and Enhanced-Assisted Feedback arms of the INFORM trial. Results Fidelity of the INFORM intervention was moderate to high, with fidelity delivery and receipt higher than fidelity enactment for both study arms. Higher enactment teams experienced a significantly larger improvement in formal team communications between baseline and follow-up than lower enactment teams (F(1, 70) = 4.27, p = .042). Conclusions Overall fidelity enactment was associated with improvements in formal team communications, but the study arm was not. This suggests that the intensity with which an intervention is offered and delivered may be less important than the intensity with which intervention participants enact the core components of an intervention. Greater attention to fidelity assessment and publication of fidelity results through studies such as this one is critical to improving the utility of published trials.
Affiliation(s)
- Liane R Ginsburg: School of Health Policy & Management, Faculty of Health, York University, Toronto, Ontario, M3J 1P3, Canada
- Matthias Hoben: Faculty of Nursing, University of Alberta, Edmonton, Alberta, T6G 1C9, Canada
- Adam Easterbrook: Faculty of Nursing, University of Alberta, Edmonton, Alberta, T6G 1C9, Canada
- Elizabeth Andersen: School of Nursing, Thompson Rivers University, Kamloops, British Columbia, V2C 0C8, Canada
- Ruth A Anderson: School of Nursing, University of North Carolina, Chapel Hill, North Carolina, 27599-7460, USA
- Lisa Cranley: Lawrence S Bloomberg Faculty of Nursing, University of Toronto, Toronto, Ontario, M5T 1P8, Canada
- Holly J Lanham: University of Texas Health Science Center San Antonio, University of Texas, San Antonio, Texas, 78229, USA
- Peter G Norton: Cumming School of Medicine, University of Calgary, Calgary, Alberta, T2N 4N1, Canada
- Lori E Weeks: School of Nursing, Faculty of Health, Dalhousie University, Halifax, Nova Scotia, B3H 4R2, Canada
- Carole A Estabrooks: Faculty of Nursing, University of Alberta, Edmonton, Alberta, T6G 1C9, Canada
30
Khadjesari Z, Boufkhed S, Vitoratou S, Schatte L, Ziemann A, Daskalopoulou C, Uglik-Marucha E, Sevdalis N, Hull L. Implementation outcome instruments for use in physical healthcare settings: a systematic review. Implement Sci 2020; 15:66. [PMID: 32811517 PMCID: PMC7433178 DOI: 10.1186/s13012-020-01027-6] [Citation(s) in RCA: 21] [Impact Index Per Article: 5.3] [Received: 04/15/2020] [Accepted: 07/29/2020] [Indexed: 01/05/2023] Open
Abstract
BACKGROUND Implementation research aims to facilitate the timely and routine implementation and sustainment of evidence-based interventions and services. A glaring gap in this endeavour is the capability of researchers, healthcare practitioners and managers to quantitatively evaluate implementation efforts using psychometrically sound instruments. To encourage and support the use of precise and accurate implementation outcome measures, this systematic review aimed to identify and appraise studies that assess the measurement properties of quantitative implementation outcome instruments used in physical healthcare settings. METHOD The following data sources were searched from inception to March 2019, with no language restrictions: MEDLINE, EMBASE, PsycINFO, HMIC, CINAHL and the Cochrane library. Studies that evaluated the measurement properties of implementation outcome instruments in physical healthcare settings were eligible for inclusion. Proctor et al.'s taxonomy of implementation outcomes was used to guide the inclusion of implementation outcomes: acceptability, appropriateness, feasibility, adoption, penetration, implementation cost and sustainability. Methodological quality of the included studies was assessed using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Psychometric quality of the included instruments was assessed using the Contemporary Psychometrics checklist (ConPsy). Usability was determined by number of items per instrument. RESULTS Fifty-eight publications reporting on the measurement properties of 55 implementation outcome instruments (65 scales) were identified. The majority of instruments assessed acceptability (n = 33), followed by appropriateness (n = 7), adoption (n = 4), feasibility (n = 4), penetration (n = 4) and sustainability (n = 3) of evidence-based practice. The methodological quality of individual scales was low, with few studies rated as 'excellent' for reliability (6/62) and validity (7/63), and both studies that assessed responsiveness rated as 'poor' (2/2). The psychometric quality of the scales was also low, with 12/65 scales scoring 7 or more out of 22, indicating greater psychometric strength. Six scales (6/65) rated as 'excellent' for usability. CONCLUSION Investigators assessing implementation outcomes quantitatively should select instruments based on their methodological and psychometric quality to promote consistent and comparable implementation evaluations. Rather than developing ad hoc instruments, we encourage further psychometric testing of instruments with promising methodological and psychometric evidence. SYSTEMATIC REVIEW REGISTRATION PROSPERO 2017 CRD42017065348.
Affiliation(s)
- Zarnie Khadjesari: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK; Behavioural and Implementation Science Research Group, School of Health Sciences, University of East Anglia, Edith Cavell Building, Norwich Research Park, Norwich, NR4 7TJ, UK
- Sabah Boufkhed: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Silia Vitoratou: Psychometrics and Measurement Lab, Biostatistics and Health Informatics Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Laura Schatte: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Alexandra Ziemann: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK; Centre for Healthcare Innovation Research, City, University of London, Northampton Square, London, EC1V 0HB, UK
- Christina Daskalopoulou: Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Eleonora Uglik-Marucha: Psychometrics and Measurement Lab, Biostatistics and Health Informatics Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Nick Sevdalis: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
- Louise Hull: Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King's College London, 16 De Crespigny Park, London, SE5 8AF, UK
31
Using an implementation science approach to implement and evaluate patient-reported outcome measures (PROM) initiatives in routine care settings. Qual Life Res 2020; 30:3015-3033. [PMID: 32651805 PMCID: PMC8528754 DOI: 10.1007/s11136-020-02564-9] [Citation(s) in RCA: 120] [Impact Index Per Article: 30.0] [Accepted: 06/16/2020] [Indexed: 11/04/2022]
Abstract
Purpose Patient-reported outcome and experience measures (PROMs/PREMs) are well established in research for many health conditions, but barriers persist for implementing them in routine care. Implementation science (IS) offers a potential way forward, but its application has been limited for PROMs/PREMs. Methods We compare similarities and differences for widely used IS frameworks and their applicability for implementing PROMs/PREMs through case studies. Three case studies implemented PROMs: (1) pain clinics in Canada; (2) oncology clinics in Australia; and (3) pediatric/adult clinics for chronic conditions in the Netherlands. The fourth case study is planning PREMs implementation in Canadian primary care clinics. We compare case studies on barriers, enablers, implementation strategies, and evaluation. Results Case studies used IS frameworks to systematize barriers, to develop implementation strategies for clinics, and to evaluate implementation effectiveness. Across case studies, consistent PROM/PREM implementation barriers were technology, uncertainty about how or why to use PROMs/PREMs, and competing demands from established clinical workflows. Enabling factors in clinics were context specific. Implementation support strategies changed during pre-implementation, implementation, and post-implementation stages. Evaluation approaches were inconsistent across case studies, and thus, we present example evaluation metrics specific to PROMs/PREMs. Conclusion Multilevel IS frameworks are necessary for PROM/PREM implementation given the complexity. In cross-study comparisons, barriers to PROM/PREM implementation were consistent across patient populations and care settings, but enablers were context specific, suggesting the need for tailored implementation strategies based on clinic resources. Theoretically guided studies are needed to clarify how, why, and in what circumstances IS principles lead to successful PROM/PREM integration and sustainability. Electronic supplementary material: The online version of this article (10.1007/s11136-020-02564-9) contains supplementary material, which is available to authorized users.
32
Glasgow RE, Battaglia C, McCreight M, Ayele RA, Rabin BA. Making Implementation Science More Rapid: Use of the RE-AIM Framework for Mid-Course Adaptations Across Five Health Services Research Projects in the Veterans Health Administration. Front Public Health 2020; 8:194. [PMID: 32528921 PMCID: PMC7266866 DOI: 10.3389/fpubh.2020.00194] [Citation(s) in RCA: 50] [Impact Index Per Article: 12.5] [Received: 01/07/2020] [Accepted: 04/29/2020] [Indexed: 11/18/2022] Open
Abstract
Introduction: Implementation science frameworks have helped advance translation of research to practice. They have been widely used for planning and post-hoc evaluation, but seldom to inform and guide mid-course adjustments to intervention and implementation strategies. Materials and Methods: This study developed an innovative methodology using the RE-AIM framework and related tools to guide mid-course assessments and adaptations across five diverse health services improvement projects in the Veterans Health Administration (VA). Using a semi-structured guide, project team members were asked to assess the importance of and progress on each RE-AIM dimension (i.e., reach, effectiveness, adoption, implementation, maintenance) at the current phase of their project. Based on these ratings, each team identified one or two RE-AIM dimensions for focused attention. Teams developed proximal goals and implementation strategies to improve progress on their selected dimension(s). A follow-up meeting with each team occurred approximately 6 weeks after the goal setting meeting to evaluate the usefulness of the iterative process. Results were evaluated using both descriptive quantitative analyses and qualitative assessments from interviews and meeting notes. Results: A median of seven team members participated in the two meetings. Qualitative and descriptive data revealed that the process was feasible, understandable and useful to teams in adjusting their interventions and implementation strategies. The RE-AIM dimensions identified as most important were adoption and effectiveness, and the dimension that had the largest gap between importance and rated progress was reach. The dimensions most frequently selected for improvement were reach and adoption. Examples of action plans were summarizing stakeholder interviews for leadership, revising exclusion criteria, and conducting in-service trainings. Follow-up meetings indicated that teams found the process very useful and were able to implement the action plans they set. Discussion: The iterative use of RE-AIM to support adjustments during project implementation proved feasible and useful across diverse projects in the VA setting. Building on this and related examples, future research should replicate these findings and further develop the methodology, as well as explore the optimal frequency and timing for these iterative applications of RE-AIM. More generally, greater focus on more rapid and iterative use of implementation science frameworks is encouraged to facilitate successful translation of research to practice.
Affiliation(s)
- Russell E Glasgow: Department of Family Medicine, School of Medicine, University of Colorado, Aurora, CO, United States; Director, Dissemination and Implementation Science Program, The Adult and Child Consortium for Health Outcomes Research and Delivery Science, School of Medicine, University of Colorado, Aurora, CO, United States
- Catherine Battaglia: School of Medicine, University of Colorado, Aurora, CO, United States; Independent researcher, Aurora, CO, United States; Department of Health System/Management and Policy, Colorado School of Public Health, University of Colorado Denver, Aurora, CO, United States
- Marina McCreight: Veterans Health Administration (VHA), Washington, DC, United States
- Roman Aydiko Ayele: Seattle-Denver Center of Innovation, VA Eastern Colorado Health Care System, Denver, CO, United States
- Borsika Adrienn Rabin: Department of Family Medicine and Public Health, School of Medicine, University of California, San Diego, San Diego, CA, United States; Seattle-Denver Center of Innovation, VA Eastern Colorado Health Care System, Denver, CO, United States; Dissemination and Implementation Science Program, The Adult and Child Consortium for Health Outcomes Research and Delivery Science, School of Medicine, University of Colorado, Aurora, CO, United States
33
Lane SD. Comment on “At-risk drinking and current cannabis use among medical students: a multivariable analysis of the role of personality traits”. Braz J Psychiatry 2020; 42:122-123. [PMID: 32348436 PMCID: PMC7115438 DOI: 10.1590/1516-4446-2020-0007] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 03/06/2020] [Accepted: 03/07/2020] [Indexed: 12/01/2022]
Affiliation(s)
- Scott D. Lane
- University of Texas Health Science Center at Houston (UTHealth), USA
34
Gäbler G, Coenen M, Fohringer K, Trauner M, Stamm TA. Towards a nationwide implementation of a standardized nutrition and dietetics terminology in clinical practice: a pre-implementation focus group study including a pretest and using the consolidated framework for implementation research. BMC Health Serv Res 2019; 19:920. [PMID: 31783855 PMCID: PMC6884883 DOI: 10.1186/s12913-019-4600-5] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Received: 09/10/2018] [Accepted: 10/03/2019] [Indexed: 02/08/2023] Open
Abstract
Background & Aims In order to assure high quality of nutrition and dietetic care as well as research, the implementation of a standardized terminology, such as the World Health Organization (WHO) International Classification of Functioning, Disability and Health for Dietetics (ICF-Dietetics), is indispensable. The aim of this study was to explore the clinical practicability and applicability of the ICF-Dietetics in the field of nutrition and dietetic practice prior to the implementation in order to develop criteria (points to consider) for a targeted implementation strategy. Methods A focus group study including a pretest of the ICF-Dietetics was conducted. Subsequently, facilitators and barriers for a nationwide implementation of the ICF-Dietetics in clinical nutrition and dietetic practice were identified and linked to interventions (combining theory-based and group-based approaches) using the Consolidated Framework for Implementation Research (CFIR) to organize and represent data, and summarized in a logic model. Results In the pretest, 55 clinical documentations comprising 248 different ICF-Dietetics categories were received. In four focus groups with 22 health professionals, 66 relevant higher-level themes and implementation strategy criteria (points to consider) were identified. These themes referred to all five domains of the CFIR, namely intervention characteristics, inner setting, outer setting, characteristics of individuals and implementation process, and contained important barriers and facilitators that were linked to six implementation objectives as well as six context requirements and five main actors. Conclusions This study provides facilitators and barriers to be addressed when implementing the ICF-Dietetics in clinical practice and shows potential interventions based on this analysis. A nationwide implementation was mainly seen as a great advantage for enhancing quality and continuity of care and for providing comparable data. However, it requires further refinements and a multifaceted implementation strategy in which the engagement of institutional leadership plays a crucial role. These results have provided a foundation for a targeted implementation strategy to increase the success, reproducibility and comparability.
Affiliation(s)
- Gabriele Gäbler: Section for Outcomes Research, Center for Medical Statistics, Informatics, and Intelligent Systems, Medical University of Vienna, Spitalgasse 23, 1090, Vienna, Austria
- Michaela Coenen: LMU Munich, Department of Medical Information Processing, Biometry and Epidemiology (IBE), Chair of Public Health and Health Services Research, Marchioninistr. 17, 81377, Munich, Germany; Pettenkofer School of Public Health, Munich, Germany
- Katrin Fohringer: Department of Medicine III, Division of Gastroenterology and Hepatology, Medical Nutrition Therapy and Dietetics, Vienna General Hospital, Währinger Gürtel 18-20, 1090, Vienna, Austria
- Michael Trauner: Department of Medicine III, Division of Gastroenterology and Hepatology, Medical University of Vienna, Währinger Gürtel 18-20, 1090, Vienna, Austria
- Tanja A Stamm: Section for Outcomes Research, Center for Medical Statistics, Informatics, and Intelligent Systems, Medical University of Vienna, Spitalgasse 23, 1090, Vienna, Austria
35
The Clinician Guideline Determinants Questionnaire was developed and validated to support tailored implementation planning. J Clin Epidemiol 2019; 113:129-136. [DOI: 10.1016/j.jclinepi.2019.05.024] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Received: 10/05/2018] [Revised: 04/04/2019] [Accepted: 05/14/2019] [Indexed: 01/15/2023]
36
Hull L, Goulding L, Khadjesari Z, Davis R, Healey A, Bakolis I, Sevdalis N. Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide. Implement Sci 2019; 14:80. [PMID: 31412887 PMCID: PMC6693182 DOI: 10.1186/s13012-019-0897-z] [Citation(s) in RCA: 57] [Impact Index Per Article: 11.4] [Received: 11/08/2018] [Accepted: 04/15/2019] [Indexed: 12/26/2022] Open
Abstract
Background Designing implementation research can be a complex and daunting task, especially for applied health researchers who have not received specialist training in implementation science. We developed the Implementation Science Research Development (ImpRes) tool and supplementary guide to address this challenge and provide researchers with a systematic approach to designing implementation research. Methods A multi-method and multi-stage approach was employed. An international, multidisciplinary expert panel engaged in an iterative brainstorming and consensus-building process to generate core domains of the ImpRes tool, representing core implementation science principles and concepts that researchers should consider when designing implementation research. Simultaneously, an iterative process of reviewing the literature and expert input informed the development and content of the tool. Once consensus had been reached, specialist expert input was sought on involving and engaging patients/service users; and economic evaluation. ImpRes was then applied to 15 implementation and improvement science projects across the National Institute of Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London, a research organisation in London, UK. Researchers who applied the ImpRes tool completed an 11-item questionnaire evaluating its structure, content and usefulness. Results Consensus was reached on ten implementation science domains to be considered when designing implementation research. These include implementation theories, frameworks and models, determinants of implementation, implementation strategies, implementation outcomes and unintended consequences. Researchers who used the ImpRes tool found it useful for identifying project areas where implementation science is lacking (median 5/5, IQR 4–5) and for improving the quality of implementation research (median 4/5, IQR 4–5) and agreed that it contained the key components that should be considered when designing implementation research (median 4/5, IQR 4–4). Qualitative feedback from researchers who applied the ImpRes tool indicated that a supplementary guide was needed to facilitate use of the tool. Conclusions We have developed a feasible and acceptable tool, and supplementary guide, to facilitate consideration and incorporation of core principles and concepts of implementation science in applied health implementation research. Future research is needed to establish whether application of the tool and guide has an effect on the quality of implementation research.
Affiliation(s)
- Louise Hull: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Lucy Goulding: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Zarnie Khadjesari: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK; School of Health Sciences, University of East Anglia, Norwich Research Park, Norwich, UK
- Rachel Davis: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
- Andy Healey: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK; King's Health Economics, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Ioannis Bakolis: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK; Department of Biostatistics and Health Informatics, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Nick Sevdalis: Centre for Implementation Science, Health Service and Population Research Department, King's College London, London, UK
37
Haroz EE, Bolton P, Nguyen AJ, Lee C, Bogdanov S, Bass J, Singh NS, Doty SB, Murray L. Measuring implementation in global mental health: validation of a pragmatic implementation science measure in eastern Ukraine using an experimental vignette design. BMC Health Serv Res 2019; 19:262. [PMID: 31036002 PMCID: PMC6489318 DOI: 10.1186/s12913-019-4097-y] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2018] [Accepted: 04/12/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND: There is mounting evidence supporting the effectiveness of task-shifted mental health interventions in low- and middle-income countries (LMIC). However, there has been limited systematic scale-up or sustainability of these programs, indicating a need to study implementation. One barrier to progress is a lack of locally relevant and valid implementation measures. We adapted an existing brief dissemination and implementation (D&I) measure, which includes scales for acceptability, appropriateness, feasibility and accessibility, for local use, and studied its validity and reliability among a sample of consumers in Ukraine.
METHODS: Local qualitative data informed adaptation of the measure and development of vignettes to test its reliability and validity. Participants were veterans and internally displaced persons (IDPs) recruited as part of a separate validity study of adapted mental health instruments. We examined internal consistency reliability, test-retest reliability, and construct and criterion validity for each scale on the measure. We randomly assigned half the participants to respond to a vignette depicting existing local psychiatric services, which we knew were not well regarded, while the other half was randomized to a vignette describing a potentially better-implemented mental health service. Criterion validity was assessed by comparing scores on each scale by vignette and by overall summary ratings of the programs described in the vignettes.
RESULTS: N = 169 participated in the qualitative study and N = 153 participated in the validity study. Qualitative findings suggested the addition of several items to the measure and indicated the importance of addressing the professionalism/competency of providers in both the scales and the vignettes. Internal consistency reliabilities ranged from α = 0.85 for feasibility to α = 0.91 for appropriateness. Test-retest reliabilities were acceptable to good for all scales (rho: 0.61-0.79). All scales demonstrated substantial and significant differences in average scores by vignette assignment (ORs: 2.21-5.6) and overall ratings (ORs: 5.1-14.47), supporting criterion validity.
CONCLUSIONS: This study represents an innovative mixed-methods approach to testing an implementation science measure in contexts outside the United States. Results support the reliability and validity of most scales for consumers in Ukraine. Challenges included large amounts of missing data due to participants' difficulties responding to questions about a hypothetical program.
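Test-retest coefficients like the rho values above are Spearman rank correlations between scores at two administrations. A minimal pure-Python sketch using the classic rank-difference formula; it assumes no tied scores, and the data are made up for illustration:

```python
def spearman_rho(x, y):
    """Spearman rank correlation: rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)).

    Assumes no tied values, which keeps the ranking step trivial.
    """
    def ranks(values):
        order = sorted(range(len(values)), key=values.__getitem__)
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Hypothetical scale scores at test and retest for five respondents
rho = spearman_rho([10, 12, 15, 18, 20], [11, 10, 16, 21, 19])  # -> 0.8
```

With tied scores (common for Likert data), the usual approach is instead to compute a Pearson correlation on mid-ranks, as library implementations such as scipy.stats.spearmanr do.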
Affiliation(s)
- E E Haroz
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
- P Bolton
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA; Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
- A J Nguyen
- University of Virginia Curry School of Education, Virginia, USA
- C Lee
- Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
- S Bogdanov
- Center for Mental Health and Psychosocial Support, National University of Kyiv-Mohyla, Kyiv-Mohyla, Ukraine
- J Bass
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
- N S Singh
- Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
- S B Doty
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
- L Murray
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, 624 N. Broadway 8th fl, Baltimore, MD, 21205, USA
38
Morris JH, Bernhardsson S, Bird ML, Connell L, Lynch E, Jarvis K, Kayes NM, Miller K, Mudge S, Fisher R. Implementation in rehabilitation: a roadmap for practitioners and researchers. Disabil Rehabil 2019; 42:3265-3274. [PMID: 30978129 DOI: 10.1080/09638288.2019.1587013] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
Purpose: Despite growth in rehabilitation research, implementing research findings into rehabilitation practice has been slow. This creates inequities for patients and is an ethical issue. However, methods to investigate and facilitate evidence implementation are being developed. This paper aims to make these methods relevant and accessible for rehabilitation researchers and practitioners.
Methods: Rehabilitation practice is varied and complex and occurs within multilevel healthcare systems. Using a "road map" analogy, we describe how implementation concepts and theories can inform implementation strategies in rehabilitation. The roadmap involves a staged journey that considers: the nature of evidence; the context for implementation; navigation tools for implementation; strategies to facilitate implementation; evaluation of implementation outcomes; and sustainability of implementation. We have developed a model to illustrate the journey, and four case studies exemplify implementation stages in rehabilitation settings.
Results and Conclusions: Effective implementation strategies for the complex world of rehabilitation are urgently required. The journey we describe unpacks that complexity to provide a template for effective implementation, to facilitate translation of the growing evidence base in rehabilitation into improved patient outcomes. It emphasizes the importance of understanding context and applying relevant theory, and highlights areas that should be targeted in new implementation research in rehabilitation.
Implications for rehabilitation:
- Effective implementation of research evidence into rehabilitation practice has many interconnected steps, and a roadmap analogy is helpful in defining them.
- Understanding the context for implementation is critically important, and using theory can facilitate development of that understanding.
- Research methods for implementation in rehabilitation should be carefully selected, and outcomes should evaluate implementation success as well as clinical change.
- Sustainability requires regular revisiting of the interconnected steps.
Affiliation(s)
- Jacqui H Morris
- School of Nursing and Health Sciences, University of Dundee, Dundee, UK
- Susanne Bernhardsson
- Närhälsan Research and Development Primary Health Care, Gothenburg, Sweden; The Sahlgrenska Academy Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Marie-Louise Bird
- Department of Physical Therapy, University of British Columbia, Vancouver, Canada
- Louise Connell
- School of Health Sciences, University of Central Lancashire, Preston, UK
- Elizabeth Lynch
- Adelaide Nursing School, University of Adelaide, Adelaide, Australia; Stroke Division, The Florey Institute of Neuroscience and Mental Health, Victoria, Australia; NHMRC Centre of Research Excellence in Stroke Rehabilitation and Brain Recovery, Victoria, Australia
- Kathryn Jarvis
- School of Health Sciences, University of Central Lancashire, Preston, UK
- Nicola M Kayes
- Centre for Person Centred Research, Auckland University of Technology, Auckland, New Zealand
- Kim Miller
- Evidence Centre, Sunny Hill Health Centre for Children, Vancouver, Canada; Faculty of Health Sciences, Simon Fraser University, Burnaby, Canada
- Suzie Mudge
- Centre for Person Centred Research, Auckland University of Technology, Auckland, New Zealand
- Rebecca Fisher
- School of Medicine, University of Nottingham, Nottingham, UK
39
Vis C, Ruwaard J, Finch T, Rapley T, de Beurs D, van Stel H, van Lettow B, Mol M, Kleiboer A, Riper H, Smit J. Toward an Objective Assessment of Implementation Processes for Innovations in Health Care: Psychometric Evaluation of the Normalization Measure Development (NoMAD) Questionnaire Among Mental Health Care Professionals. J Med Internet Res 2019; 21:e12376. [PMID: 30785402 PMCID: PMC6401675 DOI: 10.2196/12376] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2018] [Revised: 12/21/2018] [Accepted: 01/20/2019] [Indexed: 11/13/2022] Open
Abstract
Background: Successfully implementing eMental health (eMH) interventions in routine mental health care constitutes a major challenge. Reliable instruments to assess implementation progress are essential. The Normalization MeAsure Development (NoMAD) study developed a brief self-report questionnaire that could be helpful in measuring implementation progress. Based on Normalization Process Theory, this instrument focuses on 4 generative mechanisms involved in implementation processes: coherence, cognitive participation, collective action, and reflexive monitoring.
Objective: The aim of this study was to translate the NoMAD questionnaire into Dutch and to confirm the factor structure in Dutch mental health care settings.
Methods: Dutch mental health care professionals involved in eMH implementation were invited to complete the translated NoMAD questionnaire. Confirmatory factor analysis (CFA) was conducted to verify the interpretability of scale scores for 3 models: (1) the theoretical 4-factor structure, (2) a unidimensional model, and (3) a hierarchical model. Potential improvements were explored, and scale scores were correlated with 3 control questions to assess convergent validity.
Results: A total of 262 professionals from mental health care settings in the Netherlands completed the questionnaire (female: 81.7%; mean age: 45 years [SD 11]). The internal consistency of the 20-item questionnaire was acceptable (.62≤alpha≤.85). The theorized 4-factor model fitted the data slightly better in the CFA than the hierarchical model (Comparative Fit Index=0.90, Tucker-Lewis Index=0.88, Root Mean Square Error of Approximation=0.10, Standardized Root Mean Square Residual=0.12, χ²(2)=22.5, P≤.05). However, the difference is small and possibly does not outweigh the practical relevance of a total score and subscale scores combined in one hierarchical model. One item was identified as weak (λCA.2=0.10). A moderate-to-strong convergent validity with the 3 control questions was found for the Collective Participation scale (.47≤r≤.54, P≤.05).
Conclusions: NoMAD's theoretical factor structure was confirmed in Dutch mental health settings to acceptable standards, but with room for improvement. The hierarchical model might prove useful in increasing the practical utility of the NoMAD questionnaire by combining a total score with information on the 4 generative mechanisms. Future research should assess the predictive value and responsiveness over time and elucidate the conceptual interpretability of NoMAD in eMH implementation practices.
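Internal-consistency figures such as the alpha range above come from Cronbach's alpha, computed from item-level scores. A minimal sketch, standard library only; the response matrix is hypothetical, not the study's data:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).

    item_scores: one row per respondent, one column per questionnaire item.
    """
    k = len(item_scores[0])
    columns = list(zip(*item_scores))      # per-item score lists
    sum_item_var = sum(variance(col) for col in columns)
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical responses: 4 respondents x 3 items, perfectly consistent
alpha = cronbach_alpha([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]])
```

Because alpha is a ratio of variances, using sample or population variance consistently throughout gives the same result.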
Affiliation(s)
- Christiaan Vis
- Department of Clinical, Neuro-, & Developmental Psychology, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, Netherlands; Mental Health, Amsterdam Public Health Research Institute, Amsterdam, Netherlands
- Jeroen Ruwaard
- Amsterdam UMC, Vrije Universiteit, Psychiatry, Amsterdam Public Health Research Institute, Amsterdam, Netherlands; Research and Innovation, GGZ inGeest Specialized Mental Healthcare, Amsterdam, Netherlands
- Tracy Finch
- Department of Nursing, Midwifery & Health, Northumbria University, Northumbria, United Kingdom
- Tim Rapley
- Department of Social Work, Education & Community Wellbeing, Northumbria University, Northumbria, United Kingdom
- Derek de Beurs
- Mental Health, Netherlands Institute for Health Services Research (NIVEL), Utrecht, Netherlands
- Henk van Stel
- Julius Center Research Program Methodology, Department of Public Health, Healthcare Innovation & Evaluation and Medical Humanities, University Medical Center Utrecht, Utrecht, Netherlands
- Mayke Mol
- Amsterdam UMC, Vrije Universiteit, Psychiatry, Amsterdam Public Health Research Institute, Amsterdam, Netherlands; Research and Innovation, GGZ inGeest Specialized Mental Healthcare, Amsterdam, Netherlands
- Annet Kleiboer
- Department of Clinical, Neuro-, & Developmental Psychology, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, Netherlands; Mental Health, Amsterdam Public Health Research Institute, Amsterdam, Netherlands
- Heleen Riper
- Department of Clinical, Neuro-, & Developmental Psychology, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, Netherlands; Mental Health, Amsterdam Public Health Research Institute, Amsterdam, Netherlands; Research and Innovation, GGZ inGeest Specialized Mental Healthcare, Amsterdam, Netherlands
- Jan Smit
- Amsterdam UMC, Vrije Universiteit, Psychiatry, Amsterdam Public Health Research Institute, Amsterdam, Netherlands; Research and Innovation, GGZ inGeest Specialized Mental Healthcare, Amsterdam, Netherlands
40
Acharya S, Werts N. Toward the Design of an Engagement Tool for Effective Electronic Health Record Adoption. PERSPECTIVES IN HEALTH INFORMATION MANAGEMENT 2019; 16:1g. [PMID: 30766458 PMCID: PMC6341416] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
As healthcare systems continue to expand their use of electronic health records (EHRs), barriers to robust and successful engagement with such systems by stakeholders remain tenacious. To this effect, this research presents the results of a survey tool utilizing both original and modified constructs from the Consolidated Framework for Implementation Research to assess key points of engagement barriers and potential points of intervention for stakeholders of EHRs in a large-scale healthcare organization (500-bed level II regional trauma center). Based on the extensive assessment, the paper presents recommendations for the utility of engagement process modeling and discusses how intervention opportunities can be used to mitigate engagement barriers.
Affiliation(s)
- Subrata Acharya
- Department of Computer and Information Sciences at Towson University in Towson, MD
- Niya Werts
- Department of Health Sciences at Towson University in Towson, MD
41
Kien C, Schultes MT, Szelag M, Schoberberger R, Gartlehner G. German language questionnaires for assessing implementation constructs and outcomes of psychosocial and health-related interventions: a systematic review. Implement Sci 2018; 13:150. [PMID: 30541590 PMCID: PMC6292038 DOI: 10.1186/s13012-018-0837-3] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2018] [Accepted: 11/12/2018] [Indexed: 11/29/2022] Open
Abstract
Background: Over the past years, implementation science has gained more and more importance in German-speaking countries. Reliable and valid questionnaires are needed for evaluating the implementation of evidence-based practices. On an international level, several initiatives have focused on identifying questionnaires used in English-speaking countries, but limited their search processes to mental health and public health settings. Our aim was to identify questionnaires used in German-speaking countries that measure the implementation of interventions in public health and health care settings in general, and to assess their psychometric properties.
Methods: We searched five bibliographic databases (from 1985 to August 2017) and used several other search strategies (e.g., reference lists, forward citation) to obtain our data. We assessed the instruments, identified in an independent dual review process, using 12 psychometric rating criteria. Finally, we mapped the instruments' scales and subscales to the constructs of the Consolidated Framework for Implementation Research (CFIR) and the Implementation Outcome Framework (IOF).
Results: We identified 31 unique instruments available for the assessment of implementation science constructs. Hospitals and other health care settings were those most often investigated (23 instruments), while education and childcare settings, workplace settings, and community settings lacked published instruments. Internal consistency, face and content validity, usability, and structural validity were the aspects most often described. However, most studies did not report on test-retest reliability, known-groups validity, predictive criterion validity, or responsiveness. Overall, the majority of studies did not reveal high-quality instruments, especially regarding the psychometric criteria internal consistency, structural validity, and criterion validity. In addition, we seldom detected instruments operationalizing the CFIR domains intervention characteristics, outer setting, and process, or the IOF constructs adoption, fidelity, penetration, and sustainability.
Conclusions: Overall, a sustained and continuous effort is needed to improve the reliability and validity of existing instruments and to develop new ones. Instruments applicable to the assessment of implementation constructs in public health and community settings are urgently needed.
Trial registration: The systematic review protocol was registered in PROSPERO on October 19, 2017, under the number CRD42017075208.
Electronic supplementary material: The online version of this article (10.1186/s13012-018-0837-3) contains supplementary material, which is available to authorized users.
Affiliation(s)
- Christina Kien
- Department for Evidence-based Medicine and Clinical Epidemiology, Danube-University Krems, Dr.-Karl-Dorrek Strasse 30, 3500 Krems a.d. Donau, Austria; Center for Public Health, Department of Social and Preventive Medicine, Medical University Vienna, Kinderspitalgasse 15, 1090 Wien, Austria
- Marie-Therese Schultes
- Department of Applied Psychology: Work, Education, Economy, Faculty of Psychology, University of Vienna, Universitaetsstrasse 7, 1010 Vienna, Austria; Department of Maternal and Child Health, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, CB #7445 Rosenau, Chapel Hill, NC, 27599-7445, USA
- Monika Szelag
- Department for Evidence-based Medicine and Clinical Epidemiology, Danube-University Krems, Dr.-Karl-Dorrek Strasse 30, 3500 Krems a.d. Donau, Austria
- Rudolf Schoberberger
- Center for Public Health, Department of Social and Preventive Medicine, Medical University Vienna, Kinderspitalgasse 15, 1090 Wien, Austria
- Gerald Gartlehner
- Department for Evidence-based Medicine and Clinical Epidemiology, Danube-University Krems, Dr.-Karl-Dorrek Strasse 30, 3500 Krems a.d. Donau, Austria; RTI International-University of North Carolina at Chapel Hill Evidence-based Practice Center, Chapel Hill, NC, 27599-7445, USA
42
Finch TL, Girling M, May CR, Mair FS, Murray E, Treweek S, McColl E, Steen IN, Cook C, Vernazza CR, Mackintosh N, Sharma S, Barbery G, Steele J, Rapley T. Improving the normalization of complex interventions: part 2 - validation of the NoMAD instrument for assessing implementation work based on normalization process theory (NPT). BMC Med Res Methodol 2018; 18:135. [PMID: 30442094 PMCID: PMC6238372 DOI: 10.1186/s12874-018-0591-x] [Citation(s) in RCA: 110] [Impact Index Per Article: 18.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2017] [Accepted: 10/29/2018] [Indexed: 11/10/2022] Open
Abstract
INTRODUCTION: Successful implementation and embedding of new health care practices relies on the co-ordinated, collective behaviour of individuals working within the constraints of health care settings. Normalization Process Theory (NPT) provides a theory of implementation that emphasises collective action in explaining, and shaping, the embedding of new practices. To extend the practical utility of NPT for improving implementation success, an instrument (NoMAD) was developed and validated.
METHODS: Descriptive analysis and psychometric testing of an instrument developed by the authors, through an iterative process that included item generation, consensus methods, item appraisal, and cognitive testing. A 46-item questionnaire was tested in 6 sites implementing health-related interventions, using paper and online completion. Participants were staff directly involved in working with the interventions. Descriptive analysis and consensus methods were used to remove redundancy, reducing the final tool to 23 items. Data were subject to confirmatory factor analysis, which sought to confirm the theoretical structure within the sample.
RESULTS: We obtained 831 completed questionnaires, an average response rate of 39% (range: 22-77%). Full completion of items was 50% (n = 413). The confirmatory factor analysis showed the model achieved acceptable fit (CFI = 0.95, TLI = 0.93, RMSEA = 0.08, SRMR = 0.03). Construct validity of the four theoretical constructs of NPT was supported, and internal consistency (Cronbach's alpha) was as follows: Coherence (4 items, α = 0.71); Collective Action (7 items, α = 0.78); Cognitive Participation (4 items, α = 0.81); Reflexive Monitoring (5 items, α = 0.65). The normalisation scale overall was highly reliable (20 items, α = 0.89).
CONCLUSIONS: The NoMAD instrument has good face validity, construct validity and internal consistency for assessing staff perceptions of factors relevant to embedding interventions that change their work practices. Uses in evaluating and guiding implementation are proposed.
Affiliation(s)
- Tracy L. Finch
- Department of Nursing, Midwifery and Health, Northumbria University, Coach Lane, Newcastle-upon-Tyne, NE7 7XA, UK
- Melissa Girling
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Carl R. May
- Faculty of Public Health and Policy, London School of Hygiene and Tropical Medicine, 15-17 Tavistock Place, London, WC1H 9SH, UK
- Frances S. Mair
- General Practice and Primary Care, Institute of Health and Wellbeing, University of Glasgow, 1 Horselethill Road, Glasgow, G12 9LX, UK
- Elizabeth Murray
- Research Department of Primary Care and Population Health, University College London, Upper Floor 3, Royal Free Hospital, Rowland Hill Street, London, NW3 2PF, UK
- Shaun Treweek
- Health Services Research Unit, University of Aberdeen, 3rd Floor, Health Sciences Building, Foresterhill, Aberdeen, AB25 2ZD, UK
- Elaine McColl
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Ian Nicholas Steen
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Clare Cook
- School of Law and Business, University of Northumbria, City Campus East 1, Newcastle upon Tyne, NE1 8ST, UK
- Christopher R. Vernazza
- Centre for Oral Health Research, Newcastle University, Framlington Place, Newcastle upon Tyne, NE2 4BW, UK
- Nicola Mackintosh
- Department of Health Sciences, College of Medicine, Biological Sciences and Psychology, University of Leicester, Centre for Medicine, University Road, Leicester, LE1 7RH, UK
- Samridh Sharma
- Centre for Oral Health Research, Newcastle University, Framlington Place, Newcastle upon Tyne, NE2 4BW, UK
- Gaery Barbery
- International Business and Asian Studies, Griffith University, Gold Coast, QLD 4222, Australia
- Jimmy Steele
- Centre for Oral Health Research, Newcastle University, Framlington Place, Newcastle upon Tyne, NE2 4BW, UK
- Tim Rapley
- Department of Social Work, Education and Community Wellbeing, Northumbria University, Coach Lane, Newcastle-upon-Tyne, NE7 7XA, UK
43
Rapley T, Girling M, Mair FS, Murray E, Treweek S, McColl E, Steen IN, May CR, Finch TL. Improving the normalization of complex interventions: part 1 - development of the NoMAD instrument for assessing implementation work based on normalization process theory (NPT). BMC Med Res Methodol 2018; 18:133. [PMID: 30442093 PMCID: PMC6238361 DOI: 10.1186/s12874-018-0590-y] [Citation(s) in RCA: 79] [Impact Index Per Article: 13.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2017] [Accepted: 10/29/2018] [Indexed: 11/23/2022] Open
Abstract
BACKGROUND: Understanding and measuring implementation processes is a key challenge for implementation researchers. This study draws on Normalization Process Theory (NPT) to develop an instrument that can be applied to assess, monitor or measure factors likely to affect normalization from the perspective of implementation participants.
METHODS: An iterative process of instrument development was undertaken using the following methods: theoretical elaboration, item generation and item reduction (team workshops); item appraisal (QAS-99); cognitive testing with complex intervention teams; theory re-validation with NPT experts; and pilot testing of the instrument.
RESULTS: We initially generated 112 potential questionnaire items; these were then reduced to 47 through team workshops and item appraisal. No concerns about item wording and construction were raised through the item appraisal process. We undertook three rounds of cognitive interviews with professionals (n = 30) involved in the development, evaluation, delivery or reception of complex interventions. We identified minor issues around the wording of some items; universal issues around how to engage with people at different time points in an intervention; and conceptual issues around the types of people for whom the instrument should be designed. We managed these by adding extra items (n = 6) and including a new set of response options: 'not relevant at this stage', 'not relevant to my role' and 'not relevant to this intervention', and decided to design an instrument explicitly for those people either delivering or receiving an intervention. This version of the instrument had 53 items. Twenty-three people with a good working knowledge of NPT reviewed the items for theoretical drift. Items that displayed poor alignment with NPT sub-constructs were removed (n = 8) and others revised or combined (n = 6). The final instrument, with 43 items, was successfully piloted with five people, with a 100% completion rate of items.
CONCLUSION: The process of moving through cycles of theoretical translation, item generation, cognitive testing, and theoretical (re)validation was essential for maintaining a balance between the theoretical integrity of the NPT concepts and the ease with which intended respondents could answer the questions. The final instrument could be easily understood and completed, while retaining theoretical validity. NoMAD represents a measure that can be used to understand implementation participants' experiences. It is intended as a measure that can be used alongside instruments that measure other dimensions of implementation activity, such as implementation fidelity, adoption, and readiness.
Affiliation(s)
- Tim Rapley
- Department of Social Work, Education and Community Wellbeing, Northumbria University, Coach Lane Campus West, Newcastle upon Tyne, NE7 7XA, UK
- Melissa Girling
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Frances S. Mair
- Institute of Health and Wellbeing, University of Glasgow, 1 Horselethill Road, Glasgow, G12 9LX, UK
- Elizabeth Murray
- Research Department of Primary Care and Population Health, University College London, Upper Floor 3, Royal Free Hospital, Rowland Hill Street, London, NW3 2PF, UK
- Shaun Treweek
- Health Services Research Unit, University of Aberdeen, 3rd Floor, Health Sciences Building, Foresterhill, Aberdeen, AB25 2ZD, UK
- Elaine McColl
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Ian Nicholas Steen
- Institute of Health & Society, Newcastle University, Baddiley-Clark Building, Richardson Road, Newcastle-upon-Tyne, NE2 4AX, UK
- Carl R. May
- Faculty of Public Health and Policy, London School of Hygiene and Tropical Medicine, 15-17 Tavistock Place, London, WC1H 9SH, UK
- Tracy L. Finch
- Department of Nursing, Midwifery and Health, Northumbria University, Coach Lane Campus West, Newcastle upon Tyne, NE7 7XA, UK
44
'Why do we need a policy?' Administrators' perceptions on breast-feeding-friendly childcare. Public Health Nutr 2018; 22:553-563. [PMID: 30394255 DOI: 10.1017/s1368980018002914] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
OBJECTIVE: Mothers' return to work and childcare providers' support for feeding expressed human milk are associated with breast-feeding duration rates in the USA, where most infants are regularly under non-parental care. The objective of the present study was to explore Florida-based childcare centre administrators' awareness and perceptions of the Florida Breastfeeding Friendly Childcare Initiative.
DESIGN: Semi-structured interviews were based on the Consolidated Framework for Implementation Research and analysed using applied thematic analysis.
SETTING: Childcare centre administrators in Tampa Bay, FL, USA, interviewed in 2015.
PARTICIPANTS: Twenty-eight childcare centre administrators: female (100%) and Non-Hispanic White (61%), with a mean age of 50 years and 13 years of experience.
RESULTS: Most administrators perceived potential implementation of the Florida Breastfeeding Friendly Childcare Initiative as simple and beneficial. Tension for change and a related construct (perceived consumer need for the initiative) were low, seemingly due to formula-feeding being normative. Perceived financial costs and relative priority varied. Some centres had facilitating structural characteristics, but none had formal breast-feeding policies.
CONCLUSIONS: A cultural shift, facilitated by state and national breast-feeding-friendly childcare policies and regulations, may be important for increasing tension for change and thereby increasing access to breast-feeding-friendly childcare. Similar to efforts surrounding the rapid growth of the Baby Friendly Hospital Initiative, national comprehensive evidence-based policies, regulations, metrics and technical assistance are needed to strengthen state-level breast-feeding-friendly childcare initiatives.
|
45
|
Xu J, Anders S, Pruttianan A, France D, Lau N, Adams JA, Weinger MB. Human performance measures for the evaluation of process control human-system interfaces in high-fidelity simulations. Appl Ergon 2018; 73:151-165. [PMID: 30098630 DOI: 10.1016/j.apergo.2018.06.008] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/15/2017] [Revised: 06/19/2018] [Accepted: 06/26/2018] [Indexed: 06/08/2023]
Abstract
We reviewed the available literature on measuring human performance to evaluate human-system interfaces (HSIs), focusing on high-fidelity simulations of industrial process control systems, to identify best practices and future directions for research and operations. We searched the literature and then conducted in-depth review, structured coding, and analysis of 49 articles, which described 42 studies. Human performance measures were classified across six dimensions: task performance, workload, situation awareness, teamwork/collaboration, plant performance, and other cognitive performance indicators. Many studies measured performance in more than one dimension, but few studies addressed more than three dimensions. Only a few measures demonstrated acceptable levels of reliability, validity, and sensitivity in the reviewed studies. More research is required to assess the measurement qualities of the commonly used measures. The results can provide guidance to direct future research and practice for human performance measurement in process control HSI design and deployment.
Affiliation(s)
- Jie Xu
- Center for Research and Innovation in Systems Safety, Institute for Medicine and Public Health and the Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA; Center for Psychological Sciences, Zhejiang University, Hangzhou, Zhejiang Province, PR China.
- Shilo Anders
- Center for Research and Innovation in Systems Safety, Institute for Medicine and Public Health and the Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
- Arisa Pruttianan
- Grado Department of Industrial and Systems Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Daniel France
- Center for Research and Innovation in Systems Safety, Institute for Medicine and Public Health and the Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
- Nathan Lau
- Grado Department of Industrial and Systems Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Julie A Adams
- Collaborative Robotics and Intelligent Systems Institute, School of Electrical Engineering and Computer Science, Oregon State University, Corvallis, OR, USA
- Matthew B Weinger
- Center for Research and Innovation in Systems Safety, Institute for Medicine and Public Health and the Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Civil and Environmental Engineering (Risk and Reliability Group), Vanderbilt University School of Engineering, Nashville, TN, USA
|
46
|
[Systematic translation and cross-validation of defined implementation outcomes in health care services]. Z Evid Fortbild Qual Gesundhwes 2018; 135-136:72-80. [PMID: 30057171 DOI: 10.1016/j.zefq.2018.06.005] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Revised: 06/03/2018] [Accepted: 06/22/2018] [Indexed: 11/20/2022]
Abstract
OBJECTIVE To validate a German translation of construct-validated implementation outcomes of Proctor et al. (2011). METHODS A systematic translation process and a cross-validation based on Beaton et al. (2000) were performed. RESULTS Semantic challenges arose regarding the definitions of "adoption" and "fidelity". Consistent formulation was established. CONCLUSION The validated definitions are a starting point for developing a comprehensive concept to measure implementation effectiveness and efficacy of interventions in health services research.
|
47
|
Mosson R, von Thiele Schwarz U, Hasson H, Lundmark R, Richter A. How do iLead? Validation of a scale measuring active and passive implementation leadership in Swedish healthcare. BMJ Open 2018; 8:e021992. [PMID: 29961033 PMCID: PMC6042620 DOI: 10.1136/bmjopen-2018-021992] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
OBJECTIVES This study aims to describe the creation of a scale (the iLead scale) through adaptations of existing domain-specific scales that measure active and passive implementation leadership, and to describe the psychometric properties of this scale. METHODS Data collected from a leadership intervention were used in this validation study. Respondents were 336 healthcare professionals (90% female and 10% male; mean age 47 years) whose first-line and second-line managers participated in the intervention. The data were collected in the Stockholm regional healthcare organisation, which offers primary, psychiatric, rehabilitation and acute hospital care, among other areas. The items for measuring implementation leadership were based on existing research and the full-range leadership model. Confirmatory factor analysis was performed to evaluate the dimensionality of the scale, followed by tests for reliability and convergent, discriminant and criterion-related validity using correlations and multilevel regression analyses. RESULTS The final scale consists of 16 items clustered into four subscales representing active implementation leadership, and one scale signifying passive implementation leadership. Findings showed that the hypothesised model had an acceptable model fit (χ2(99)=382.864**, Comparative Fit Index=0.935, Tucker-Lewis Index=0.911, root mean square error of approximation=0.059). The internal consistency and convergent, discriminant and criterion-related validity were all satisfactory. CONCLUSIONS The iLead scale is a valid measure of implementation leadership and is a tool for understanding how active and passive leader behaviours influence an implementation process. This brief scale may be particularly valuable to apply in training focusing on facilitating implementation, and in evaluating leader training. Moreover, the scale can be useful in evaluating various leader behaviours associated with implementation success or failure.
Affiliation(s)
- Rebecca Mosson
- Department of Learning, Informatics, Management and Ethics, Procome Research Group, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden
- Unit for Implementation and Evaluation, Centre for Epidemiology and Community Medicine (CES), Stockholm County Council, Stockholm, Sweden
- Ulrica von Thiele Schwarz
- Department of Learning, Informatics, Management and Ethics, Procome Research Group, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden
- School of Health, Care and Social Welfare, Mälardalen University, Västerås, Sweden
- Henna Hasson
- Department of Learning, Informatics, Management and Ethics, Procome Research Group, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden
- Unit for Implementation and Evaluation, Centre for Epidemiology and Community Medicine (CES), Stockholm County Council, Stockholm, Sweden
- Robert Lundmark
- Department of Learning, Informatics, Management and Ethics, Procome Research Group, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden
- Anne Richter
- Department of Learning, Informatics, Management and Ethics, Procome Research Group, Medical Management Centre, Karolinska Institutet, Stockholm, Sweden
- Unit for Implementation and Evaluation, Centre for Epidemiology and Community Medicine (CES), Stockholm County Council, Stockholm, Sweden
|
48
|
Valverde PA, Calhoun E, Esparza A, Wells KJ, Risendal BC. The early dissemination of patient navigation interventions: results of a respondent-driven sample survey. Transl Behav Med 2018; 8:456-467. [PMID: 29800405 DOI: 10.1093/tbm/ibx080] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
Patient navigators (PNs) coordinate medical services and connect patients with resources to improve outcomes and satisfaction and to reduce costs. Little national information is available to inform workforce development. We analyzed 819 responses from an online PN survey conducted in 2009-2010. Study variables were mapped to the five Consolidated Framework for Implementation Research (CFIR) constructs to explore program variations by type of PN. Five logistic regression models compared each PN type to all others while adjusting for covariates. Thirty-five percent of respondents were nurse navigators, 28% lay navigators, 20% social work (SW)/counselor navigators, 7% allied health navigators, and 10% were "other" types of PNs. Most were non-Hispanic White (71%), female (94%), and at least college educated (70%). The primary differences were observed among: the core intervention tasks; position structure; work setting; health conditions navigated; navigator race/ethnicity; personal cancer experiences; navigation training; and patient populations served. Lay PNs had lower odds of identifying as Hispanic, working in rural settings, and assisting underserved populations compared with others. Nurse navigators had greater odds of having clinical responsibilities and working in hospital or government settings, and lower odds of navigating minority populations, compared with others. SW/counselor navigators also had additional duties, provided greater assistance to Medicare patient populations, and had lower odds of navigating underserved populations than others. In summary, our survey indicates that the type of PN utilized is an indicator of other substantial differences in program implementation. CFIR provides a robust method to compare differences and should incorporate care coordination outcomes in future PN research.
Affiliation(s)
- Patricia A Valverde
- Department of Community and Behavioral Health, School of Public Health, Aurora, CO
- Elizabeth Calhoun
- University of Arizona, Office of the Senior Vice President for Health Sciences, Vice President for Population Health Sciences, Executive Director, Center for Population Science and Discovery, Roy P. Drachman Hall, Tucson, AZ
- Angelina Esparza
- Executive Staff Analyst/Chief Program Officer, Houston Department for Health and Human Services, Houston, TX
- Kristen J Wells
- Department of Psychology, San Diego State University, San Diego, CA
- Betsy C Risendal
- Department of Community and Behavioral Health, Colorado School of Public Health, Aurora, CO
|
49
|
Fernandez ME, Walker TJ, Weiner BJ, Calo WA, Liang S, Risendal B, Friedman DB, Tu SP, Williams RS, Jacobs S, Herrmann AK, Kegler MC. Developing measures to assess constructs from the Inner Setting domain of the Consolidated Framework for Implementation Research. Implement Sci 2018; 13:52. [PMID: 29587804 PMCID: PMC5870186 DOI: 10.1186/s13012-018-0736-7] [Citation(s) in RCA: 105] [Impact Index Per Article: 17.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2017] [Accepted: 03/05/2018] [Indexed: 01/13/2023] Open
Abstract
Background Scientists and practitioners alike need reliable, valid measures of contextual factors that influence implementation. Yet, few existing measures demonstrate reliability or validity. To meet this need, we developed and assessed the psychometric properties of measures of several constructs within the Inner Setting domain of the Consolidated Framework for Implementation Research (CFIR). Methods We searched the literature for existing measures of the 7 Inner Setting domain constructs (Culture Overall, Culture Stress, Culture Effort, Implementation Climate, Learning Climate, Leadership Engagement, and Available Resources). We adapted items for the healthcare context, pilot-tested the adapted measures in 4 Federally Qualified Health Centers (FQHCs), and implemented the revised measures in 78 FQHCs in 7 states (N = 327 respondents) with a focus on colorectal cancer (CRC) screening practices. To psychometrically assess our measures, we fitted confirmatory factor analysis (CFA) models (structural validity), assessed inter-item consistency (reliability), computed scale correlations (discriminant validity), and calculated inter-rater reliability and agreement (organization-level construct reliability and validity). Results CFAs for most constructs exhibited good model fit (CFI > 0.90, TLI > 0.90, SRMR < 0.08, RMSEA < 0.08), with almost all factor loadings exceeding 0.40. Scale reliabilities ranged from good (0.7 ≤ α < 0.9) to excellent (α ≥ 0.9). Scale correlations fell below 0.90, indicating discriminant validity. Inter-rater reliability and agreement were sufficiently high to justify measuring constructs at the clinic level. Conclusions Our findings provide psychometric evidence in support of the CFIR Inner Setting measures. Our findings also suggest the Inner Setting measures from individuals can be aggregated to represent the clinic level. Measurement of the Inner Setting constructs can be useful in better understanding and predicting implementation in FQHCs and can be used to identify targets of strategies to accelerate and enhance implementation efforts in FQHCs.
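The scale reliabilities this abstract reports as Cronbach's α (good: 0.7 ≤ α < 0.9; excellent: α ≥ 0.9) follow directly from item and total-score variances. As a minimal sketch of that computation (not the authors' actual analysis code, and using invented Likert-scale data purely for illustration):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items in the scale
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented data: 6 respondents x 4 Likert items (1-5)
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Real psychometric software adds item-deletion diagnostics and confidence intervals, but the thresholds quoted in the abstract apply to exactly this statistic.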
Affiliation(s)
- Maria E Fernandez
- University of Texas Health Science Center at Houston, Center for Health Promotion and Prevention Research, School of Public Health, 7000 Fannin St, Houston, TX, 77030, USA.
- Timothy J Walker
- University of Texas Health Science Center at Houston, Center for Health Promotion and Prevention Research, School of Public Health, 7000 Fannin St, Houston, TX, 77030, USA
- Bryan J Weiner
- Department of Global Health, University of Washington, Box 357965, 1510 San Juan Road, Seattle, WA, 98195, USA
- William A Calo
- Department of Public Health Sciences, Penn State College of Medicine, Mail Code CH69, 500 University Drive, Hershey, PA, 17033, USA
- Shuting Liang
- Emory Prevention Research Center, Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University, 1518 Clifton Road NE, Atlanta, GA, 30033, USA
- Betsy Risendal
- Department of Community and Behavioral Health, Colorado School of Public Health, University of Colorado Comprehensive Cancer Center, 13001 E. 17th Place, MSF538, Aurora, CO, 80045, USA
- Daniela B Friedman
- Department of Health Promotion, Education, and Behavior and the Statewide Cancer Prevention and Control Program, Arnold School of Public Health, University of South Carolina, 915 Greene Street, Columbia, SC, 29208, USA
- Shin Ping Tu
- Department of Internal Medicine, University of California Davis, Suite 2400, 4150 V Street, Sacramento, CA, 95817, USA
- Rebecca S Williams
- Center for Health Promotion and Disease Prevention, Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill, CB 7424, Chapel Hill, NC, 27599, USA
- Sara Jacobs
- Public Health Research Division, RTI International, 3040 East Cornwallis Road, Research Triangle Park, Durham, NC, 27709-2194, USA
- Alison K Herrmann
- UCLA Kaiser Permanente Center for Health Equity, Fielding School of Public Health and Jonsson Comprehensive Cancer Center, 650 Charles E. Young Dr. S., A2-125 CHS, Box 690015, Los Angeles, CA, 90095-6900, USA
- Michelle C Kegler
- Emory Prevention Research Center, Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University, 1518 Clifton Road NE, Atlanta, GA, 30033, USA
|
50
|
Watson DP, Adams EL, Shue S, Coates H, McGuire A, Chesher J, Jackson J, Omenka OI. Defining the external implementation context: an integrative systematic literature review. BMC Health Serv Res 2018; 18:209. [PMID: 29580251 PMCID: PMC5870506 DOI: 10.1186/s12913-018-3046-5] [Citation(s) in RCA: 37] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2017] [Accepted: 03/20/2018] [Indexed: 11/27/2022] Open
Abstract
Background Proper implementation of evidence-based interventions is necessary for their full impact to be realized. However, the majority of research to date has overlooked facilitators and barriers existing outside the boundaries of the implementing organization(s). Better understanding and measurement of the external implementation context would be particularly beneficial in light of complex health interventions that extend into and interact with the larger environment in which they are embedded. We conducted an integrative systematic literature review to identify external context constructs likely to impact implementation of complex evidence-based interventions. Methods The review process was iterative due to our goal to inductively develop the identified constructs. Data collection occurred in four primary stages: (1) an initial set of key literature across disciplines was identified and used to inform (2) journal and (3) author searches that, in turn, informed the design of the final (4) database search. Additionally, (5) we conducted citation searches of relevant literature reviews identified in each stage. We carried out an inductive thematic content analysis with the goal of developing homogenous, well-defined, and mutually exclusive categories. Results We identified eight external context constructs: (1) professional influences, (2) political support, (3) social climate, (4) local infrastructure, (5) policy and legal climate, (6) relational climate, (7) target population, and (8) funding and economic climate. Conclusions This is the first study to our knowledge to use a systematic review process to identify empirically observed external context factors documented to impact implementation. Comparison with four widely-utilized implementation frameworks supports the exhaustiveness of our review process. Future work should focus on the development of more stringent operationalization and measurement of these external constructs.
Electronic supplementary material The online version of this article (10.1186/s12913-018-3046-5) contains supplementary material, which is available to authorized users.
Affiliation(s)
- Dennis P Watson
- Department of Social and Behavioral Sciences, Indiana University Richard M. Fairbanks School of Public Health, 1050 Wishard Blvd, Indianapolis, IN, 46202, USA.
- Erin L Adams
- Department of Psychology, Indiana University Purdue University-Indianapolis, 420 N Blackford St, Indianapolis, IN, 46202, USA
- Sarah Shue
- Indiana University-Purdue University Indianapolis, School of Health and Rehabilitation Sciences, 1050 Wishard Blvd, Indianapolis, IN, 46202, USA
- Heather Coates
- Indiana University-Purdue University Indianapolis, University Library, Center for Digital Scholarship, 755 W. Michigan St, Indianapolis, IN, 46202, USA
- Alan McGuire
- Richard L. Roudebush VA, 1481 W. 10th St, Indianapolis, IN, 46202, USA
- Jeremy Chesher
- Department of Environmental Health Sciences, Indiana University Richard M. Fairbanks School of Public Health, 1050 Wishard Blvd, Indianapolis, IN, 46202, USA
- Joanna Jackson
- Department of Health Policy and Management, Indiana University Richard M. Fairbanks School of Public Health, 1050 Wishard Blvd, Indianapolis, IN, 46202, USA
- Ogbonnaya I Omenka
- Department of Health Policy and Management, Indiana University Richard M. Fairbanks School of Public Health, 1050 Wishard Blvd, Indianapolis, IN, 46202, USA
|