1
Scott K, Guigayoma J, Palinkas LA, Beaudoin FL, Clark MA, Becker SJ. The measurement-based care to opioid treatment programs project (MBC2OTP): a study protocol using rapid assessment procedure informed clinical ethnography. Addict Sci Clin Pract 2022; 17:44. [PMID: 35986380] [PMCID: PMC9389829] [DOI: 10.1186/s13722-022-00327-0]
Abstract
Background: Psychosocial interventions are needed to enhance patient engagement and retention in medication treatment within opioid treatment programs. Measurement-based care (MBC), an evidence-based intervention structure that involves ongoing monitoring of treatment progress over time to assess the need for treatment modifications, has been recommended as a flexible and low-cost intervention for opioid treatment program use. The MBC2OTP Project is a two-phase pilot hybrid type 1 effectiveness-implementation trial with three specific aims: (1) to employ Rapid Assessment Procedure Informed Clinical Ethnography (RAPICE) to collect mixed methods data to inform MBC implementation; (2) to use RAPICE data to adapt an MBC protocol; and (3) to conduct a hybrid type 1 trial to evaluate MBC's preliminary effectiveness and implementation potential in opioid treatment programs.
Methods: This study will be conducted in two phases. Phase 1 will include RAPICE site visits, qualitative interviews (N = 32–48 total), and quantitative surveys (N = 64–80 total) with staff at eight programs to build community partnerships and evaluate contextual factors impacting MBC implementation. Mixed methods data will be analyzed using immersion/crystallization and thematic analysis to inform MBC adaptation and site selection. Four programs selected for Phase 2 will participate in MBC electronic medical record integration, training, and ongoing support. Chart reviews will be completed in the 6 months prior to and following MBC integration (N = 160 charts; 80 pre and 80 post) to evaluate effectiveness (patient opioid abstinence and treatment engagement) and implementation outcomes (counselor MBC exposure and fidelity).
Discussion: This study is among the first to take forward recommendations to implement and evaluate MBC in opioid treatment programs. It will also employ an innovative RAPICE approach to enhance the quality and rigor of data collection and to inform the development of an MBC protocol best matched to opioid treatment programs. Overall, this work seeks to enhance treatment provision and clinical outcomes for patients with opioid use disorder.
Trial registration: This study will be registered with ClinicalTrials.gov within 21 days of first participant enrollment in Phase 2. Phase 1 (RAPICE) does not qualify as a clinical trial, and because all elements of Phase 2 depend on Phase 1 outcomes, Phase 2 clinical trial registration has not yet been pursued.
Supplementary Information: The online version contains supplementary material available at 10.1186/s13722-022-00327-0.
2
Lyon AR, Liu FF, Connors EH, King KM, Coifman JI, Cook H, McRee E, Ludwig K, Law A, Dorsey S, McCauley E. How low can you go? Examining the effects of brief online training and post-training consultation dose on implementation mechanisms and outcomes for measurement-based care. Implement Sci Commun 2022; 3:79. [PMID: 35869500] [PMCID: PMC9306246] [DOI: 10.1186/s43058-022-00325-y]
Abstract
BACKGROUND: Initial training and ongoing post-training consultation (i.e., ongoing support following training, provided by an expert) are among the most common implementation strategies used to change clinician practice. However, extant research has not experimentally investigated the optimal dosages of consultation necessary to produce desired outcomes. Moreover, the degree to which training and consultation engage theoretical implementation mechanisms, such as provider knowledge, skills, and attitudes, is not well understood. This study examined the effects of a brief online training and varying dosages of post-training consultation (BOLT+PTC) on implementation mechanisms and outcomes for measurement-based care (MBC) practices delivered in the context of education sector mental health services.
METHODS: A national sample of 75 clinicians who provide mental health interventions to children and adolescents in schools were randomly assigned to BOLT+PTC or control (services as usual). Those in BOLT+PTC were further randomized to 2-, 4-, or 8-week consultation conditions. Self-reported MBC knowledge, skills, attitudes, and use (including standardized assessment, individualized assessment, and assessment-informed treatment modification) were collected for 32 weeks. Multilevel models were used to examine main effects of BOLT+PTC versus control on MBC use at the end of consultation and over time, as well as comparisons among PTC dosage conditions and theorized mechanisms (skills, attitudes, knowledge).
RESULTS: There was a significant linear effect of BOLT+PTC over time on standardized assessment use (b = .02, p < .01) and a significant quadratic effect of BOLT+PTC over time on individualized assessment use (b = .04, p < .001), but no significant effect on treatment modification. BOLT plus any level of PTC resulted in higher MBC knowledge and larger growth in MBC skill over the intervention period as compared to control. PTC dosage levels were inconsistently predictive of outcomes, providing no clear evidence for added benefit of higher PTC dosage.
CONCLUSIONS: Online training and consultation in MBC had effects on standardized and individualized assessment use among clinicians as compared to services as usual, with no consistent benefit detected for increased consultation dosage. Continued research investigating optimal dosages and mechanisms of these established implementation strategies is needed to ensure training and consultation resources are deployed efficiently to impact clinician practices.
TRIAL REGISTRATION: ClinicalTrials.gov NCT05041517. Retrospectively registered on 10 September 2021.
Affiliation(s)
- Aaron R. Lyon
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Freda F. Liu
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Elizabeth H. Connors
- Department of Psychiatry, Yale University, 389 Whitney Avenue, Office 106, New Haven, CT 06511, USA
- Kevin M. King
- Department of Psychology, University of Washington, Guthrie Hall, Box 351525, Seattle, WA 98195, USA
- Jessica I. Coifman
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Heather Cook
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Erin McRee
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Kristy Ludwig
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
- Amy Law
- Graduate Medical Education, University of Washington, Learning Gateway, Box 358220, Seattle, WA 98109, USA
- Shannon Dorsey
- Department of Psychology, University of Washington, Guthrie Hall, Box 351525, Seattle, WA 98195, USA
- Elizabeth McCauley
- Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
3
Improving the Quality of Children's Mental Health Care with Progress Measures: A Mixed-Methods Study of PCIT Therapist Attitudes. Adm Policy Ment Health 2021; 49:182-196. [PMID: 34363566] [PMCID: PMC8850255] [DOI: 10.1007/s10488-021-01156-0]
Abstract
Progress measures are an evidence-based technique for improving the quality of mental health care; however, clinicians rarely incorporate them into treatment. Research into how measure type impacts clinician preference has been recommended to help improve measure implementation. Parent–Child Interaction Therapy (PCIT) is an assessment-driven treatment that serves as an ideal intervention through which to investigate measure preferences, given its routine use of two types of assessment: a behavioral observation (the Dyadic Parent–Child Interaction Coding System; DPICS) and a parent-report measure (the Eyberg Child Behavior Inventory; ECBI). This study investigated PCIT therapist attitudes towards progress measures used within PCIT and within children's mental health treatment generally. A mixed-method (QUAN + QUAL) study design examined PCIT therapist attitudes towards two types of progress measures and measures used in two contexts (PCIT and general practice). Multilevel modeling of a survey distributed to 324 PCIT therapists identified predictors of therapist attitudes towards measures, while qualitative interviews with 23 therapists expanded and clarified the rationale for differing perceptions. PCIT therapists reported more positive attitudes towards the behavioral observation measure (the DPICS) than the parent-report measure (the ECBI), and towards measures used in PCIT than in general practice. Clinician race/ethnicity was significantly related to measure-specific attitudes. Qualitative interviews highlighted how perceptions of measure reliability, type of data offered, ease of use, utility in guiding sessions and motivating clients, and embeddedness in the treatment protocol shape therapist preferences. Efforts to implement progress monitoring should consider preferences for particular types of measures, as well as how therapists are trained to embed measures in treatment.
4
Lyon AR, Koerner K, Chung J. Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI): A methodology for assessing complex intervention implementability. Implement Res Pract 2020; 1:2633489520932924. [PMID: 37089126] [PMCID: PMC9924253] [DOI: 10.1177/2633489520932924]
Abstract
Background: Most evidence-based practices in mental health are complex psychosocial interventions, but little research has focused on assessing and addressing the characteristics of these interventions, such as design quality and packaging, that serve as intra-intervention determinants (i.e., barriers and facilitators) of implementation outcomes. Usability—the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction—is a key indicator of design quality. Drawing from the field of human-centered design, this article presents a novel methodology for evaluating the usability of complex psychosocial interventions and describes an example "use case" application to an exposure protocol for the treatment of anxiety disorders with one user group.
Method: The Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI) methodology comprises four steps: (1) identify users for testing; (2) define and prioritize EBPI components (i.e., tasks and packaging); (3) plan and conduct the evaluation; and (4) organize and prioritize usability issues. In the example, clinicians were selected for testing from among the identified user groups of the exposure protocol (e.g., clients, system administrators). Clinicians with differing levels of experience with exposure therapies (novice, n = 3; intermediate, n = 4; advanced, n = 3) were sampled. Usability evaluation included Intervention Usability Scale (IUS) ratings and individual user testing sessions with clinicians, as well as heuristic evaluations conducted by design experts. After testing, discrete usability issues were organized within the User Action Framework (UAF) and prioritized via independent ratings (1–3 scale) by members of the research team.
Results: The average IUS rating (80.5; SD = 9.56 on a 100-point scale) indicated good usability, with room for improvement. Ratings for novice and intermediate participants were comparable (77.5), with higher ratings for advanced users (87.5). Heuristic evaluations suggested similar usability (mean overall rating = 7.33; SD = 0.58 on a 10-point scale). Testing with individual users revealed 13 distinct usability issues, which reflected all four phases of the UAF and a range of priority levels.
Conclusion: Findings suggested that USE-EBPI is useful for evaluating the usability of complex psychosocial interventions and for informing subsequent intervention redesign (in the context of broader development frameworks) to enhance implementation. Future research goals are discussed, including applying USE-EBPI with a broader range of interventions and user groups (e.g., clients).
Plain language abstract: Characteristics of evidence-based psychosocial interventions (EBPIs) that impact the extent to which they can be implemented in real world mental health service settings have received far less attention than the characteristics of individuals (e.g., clinicians) or settings (e.g., community mental health centers) where EBPI implementation occurs. No methods exist to evaluate the usability of EBPIs, which can be a critical barrier or facilitator of implementation success. The current article describes a new method, the Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI), which uses techniques drawn from the field of human-centered design to evaluate EBPI usability. An example application to an intervention protocol for anxiety problems among adults is included to illustrate the value of the new approach.
Affiliation(s)
- Julie Chung
- Evidence Based Practice Institute, Seattle, WA, USA
5
Changes in provider fidelity after introducing a new model of intervention. Curr Psychol 2020; 41:3906-3915. [PMID: 32837130] [PMCID: PMC7340763] [DOI: 10.1007/s12144-020-00910-1]
Abstract
Given the impact of implementation fidelity on community-based outcomes, it is important to understand how fidelity may change over time as providers learn an intervention. Attachment and Biobehavioral Catch-up is an evidence-based early intervention that assesses fidelity during weekly supervision. Providers are first trained in the infant model, with toddler model training considered a separate, specialized opportunity. The current study examined changes in fidelity, measured by "in-the-moment" commenting, as providers moved from infant to toddler certification. An initial drop, with a subsequent increase, in commenting fidelity over the training year was expected. Results were consistent with our hypotheses, demonstrating a main effect of time, with most indices of commenting data initially decreasing and then increasing. These findings are consistent with research suggesting that fluctuation in fidelity is typical within community dissemination, and they suggest that ongoing supervision after the initial training is useful in facilitating successful skill development.
6
Stoll RD, Pina AA, Schleider J. Brief, Non-Pharmacological, Interventions for Pediatric Anxiety: Meta-Analysis and Evidence Base Status. J Clin Child Adolesc Psychol 2020; 49:435-459. [PMID: 32285692] [PMCID: PMC7473445] [DOI: 10.1080/15374416.2020.1738237]
Abstract
In 1998, Öst published "One-session treatment of specific phobias: a rapid and effective method" (in Swedish), giving rise to the idea that brief, intensive, and concentrated psychosocial interventions could exhibit public health impact. At this juncture, and per criteria of the Society for Clinical Child and Adolescent Psychology, there are data supporting that brief, non-pharmacological intervention "prescriptions" for pediatric anxiety can be considered well-established or probably efficacious. In addition, data from 76 randomized controlled trials (N = 17,203 youth) yield an overall mean pre-post effect size of 0.19 on pediatric anxiety outcomes. Note, however, that effect sizes vary significantly. These data point to the capacity for clinical change coming from in-vivo exposures for specific phobias (~3 h, one session), CBT with social skills training (~3 h, six sessions for indicated prevention and early intervention), and CBT-based parent training (~6 h, eight digital modules with clinician support). Given such evidence, we recommend efforts be made to position such treatment innovations for rapid deployment, facilitated by high-quality training, monitoring, technical assistance, and ongoing disclosures.
Affiliation(s)
- Ryan D Stoll
- Department of Psychology, Arizona State University
7
Hogue A, Bobek M, MacLean A, Porter N, Jensen-Doss A, Henderson CE. Measurement training and feedback system for implementation of evidence-based treatment for adolescent externalizing problems: protocol for a randomized trial of pragmatic clinician training. Trials 2019; 20:700. [PMID: 31822294] [PMCID: PMC6905067] [DOI: 10.1186/s13063-019-3783-8]
Abstract
BACKGROUND: Innovations in clinical training and support that enhance fidelity to evidence-based treatment (EBT) for adolescent behavior problems are sorely needed. This study will develop an online training system to address this gap: Measurement Training and Feedback System for Implementation (MTFS-I). Using procedures intended to be practical and sustainable, MTFS-I is designed to increase two aspects of therapist behavior that are fundamental to boosting EBT fidelity: therapist self-monitoring of EBT delivery, and therapist utilization of core techniques of EBTs in treatment sessions. This version of MTFS-I focuses on two empirically supported treatment approaches for adolescent conduct and substance use problems: family therapy and cognitive behavioral therapy (CBT).
METHODS/DESIGN: MTFS-I expands on conventional measurement feedback systems for client outcomes by adding training in observational coding to promote EBT self-monitoring and focusing on implementation of EBT treatment techniques. It has two primary components. (1) The training component, delivered weekly in two connected parts, involves self-monitored learning modules containing brief clinical descriptions of core EBT techniques and mock session coding exercises based on 5-8 min video segments that illustrate delivery of core techniques. (2) The feedback component summarizes aggregated therapist-reported data on EBT techniques used with their active caseloads. MTFS-I is hosted online and requires approximately 20 min per week to complete for each treatment approach. This randomized trial will first collect data on existing delivery of family therapy and CBT techniques for youth in outpatient behavioral health sites (Baseline phase). It will then randomize site clinicians to two study conditions (Implementation phase): Training Only versus Training + Feedback + Consultation. Therapists will choose whether to train in family therapy, CBT, or both. Study aims will compare clinician performance across study phase and between study conditions on MTFS-I uptake, reliability and accuracy in EBT self-monitoring, and utilization of EBT techniques in treatment sessions (based on observer coding of audiotapes).
DISCUSSION: Study contributions to implementation science and considerations of MTFS-I sustainability are discussed.
TRIAL REGISTRATION: ClinicalTrials.gov, NCT03722654. Registered on 29 October 2018.
Affiliation(s)
- Craig E. Henderson
- Department of Psychology, Sam Houston State University, Huntsville, TX, USA
8
Lyon AR, Munson SA, Renn BN, Atkins DC, Pullmann MD, Friedman E, Areán PA. Use of Human-Centered Design to Improve Implementation of Evidence-Based Psychotherapies in Low-Resource Communities: Protocol for Studies Applying a Framework to Assess Usability. JMIR Res Protoc 2019; 8:e14990. [PMID: 31599736] [PMCID: PMC6819011] [DOI: 10.2196/14990]
Abstract
BACKGROUND: This paper presents the protocol for the National Institute of Mental Health (NIMH)-funded University of Washington ALACRITY (Advanced Laboratories for Accelerating the Reach and Impact of Treatments for Youth and Adults with Mental Illness) Center (UWAC), which uses human-centered design (HCD) methods to improve the implementation of evidence-based psychosocial interventions (EBPIs). We propose that usability (the degree to which interventions and implementation strategies can be used with ease, efficiency, effectiveness, and satisfaction) is a fundamental, yet poorly understood, determinant of implementation.
OBJECTIVE: We present a novel Discover, Design/Build, and Test (DDBT) framework to study usability as an implementation determinant. DDBT will be applied across Center projects to develop scalable and efficient implementation strategies (eg, training tools), modify existing EBPIs to enhance usability, and create usable and nonburdensome decision support tools for quality delivery of EBPIs.
METHODS: Stakeholder participants will be implementation practitioners/intermediaries, mental health clinicians, and patients with mental illness in nonspecialty mental health settings in underresourced communities. Three preplanned projects and 12 pilot studies will employ the DDBT model to (1) identify usability challenges in implementing EBPIs in underresourced settings; (2) iteratively design solutions to overcome these challenges; and (3) compare the solution to the original version of the EBPI or implementation strategy on usability, quality of care, and patient-reported outcomes. The final products from the center will be a streamlined modification and redesign model that will improve the usability of EBPIs and implementation strategies (eg, tools to support EBPI education and decision making); a matrix of modification targets (ie, usability issues) that are both common and unique to EBPIs, strategies, settings, and patient populations; and a compilation of redesign strategies and the relative effectiveness of each redesigned solution compared to the original EBPI or strategy.
RESULTS: The UWAC received institutional review board approval for the three separate studies in March 2018 and was funded in May 2018.
CONCLUSIONS: The outcomes from this center will inform the implementation of EBPIs by identifying cross-cutting features of EBPIs and implementation strategies that influence the use and acceptability of these interventions, actively involving stakeholder clinicians and implementation practitioners in the design of EBPI modifications and implementation strategy solutions, and identifying the impact of HCD-informed modifications and solutions on intervention effectiveness and quality.
TRIAL REGISTRATION: ClinicalTrials.gov NCT03515226 (https://clinicaltrials.gov/ct2/show/NCT03515226), NCT03514394 (https://clinicaltrials.gov/ct2/show/NCT03514394), and NCT03516513 (https://clinicaltrials.gov/ct2/show/NCT03516513).
INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/14990.
Affiliation(s)
- Aaron R Lyon
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States
- Sean A Munson
- Department of Human Centered Design and Engineering, University of Washington, Seattle, WA, United States
- Brenna N Renn
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States
- David C Atkins
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States
- Michael D Pullmann
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States
- Emily Friedman
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States
- Patricia A Areán
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States