1. Morris C, Jones SH, Oliveira JP. A Practitioner's Guide to Measuring Procedural Fidelity. Behav Anal Pract 2024;17:643-655. PMID: 38966272; PMCID: PMC11219619; DOI: 10.1007/s40617-024-00910-8.
Abstract
Ensuring high levels of procedural fidelity during behavior-analytic interventions is a crucial component of providing effective behavior-analytic services. However, few resources are available to guide practitioners through measuring procedural fidelity. In fact, most published behavior-analytic research on procedural fidelity analyzes a single treatment procedure, which may not fully reflect the process of monitoring and addressing the procedural fidelity of the robust treatment packages often necessary in clinical settings. The purpose of this article is to guide behavior analysts through the process of creating and using procedural fidelity measurement systems, with a focus on direct observation of implementation as a means of fidelity data collection. This process consists of six steps: (1) task analyze treatment procedures into measurable units; (2) assign measures to each treatment component; (3) plan the direct observation; (4) collect procedural fidelity data; (5) analyze and interpret procedural fidelity data; and (6) take action to improve procedural fidelity. Each step is described and discussed in the article.
Affiliation(s)
- Cody Morris
- Department of Psychology, Salve Regina University, 100 Ochre Point Avenue, Newport, RI 02840 USA
- Stephanie H. Jones
- Department of Psychology, Salve Regina University, 100 Ochre Point Avenue, Newport, RI 02840 USA
- Jacob P. Oliveira
- Department of Psychology, Salve Regina University, 100 Ochre Point Avenue, Newport, RI 02840 USA
2. Preas EJ, Halbur ME, Carroll RA. Procedural Fidelity Reporting in The Analysis of Verbal Behavior from 2007-2021. Anal Verbal Behav 2024;40:1-12. PMID: 38962519; PMCID: PMC11217236; DOI: 10.1007/s40616-023-00197-w.
Abstract
Procedural fidelity refers to the degree to which procedures for an assessment or intervention (i.e., independent variables) are implemented consistently with the prescribed protocols. Procedural fidelity is an important factor in demonstrating the internal validity of experiments and clinical treatments. Previous reviews evaluating the inclusion of procedural fidelity in published empirical articles demonstrated underreporting of procedural fidelity procedures and measures within specific journals. We conducted a systematic review of The Analysis of Verbal Behavior (TAVB) to evaluate trends in procedural fidelity reporting from 2007 to 2021. Of the 253 articles published in TAVB during the reporting period, 144 articles (168 studies) met inclusion criteria for further analysis. Our results showed that 54% of studies reported procedural fidelity data, which is slightly higher than previous reviews. In comparison, interobserver-agreement data were reported for a high percentage of the studies reviewed (i.e., 93%). Further discussion of the results and their implications for applied research is included.
Affiliation(s)
- Mary E. Halbur
- University of Nebraska Medical Center’s Munroe-Meyer Institute, 9012 Q St., Omaha, NE 68127 USA
- Regina A. Carroll
- University of Nebraska Medical Center’s Munroe-Meyer Institute, 9012 Q St., Omaha, NE 68127 USA
3. Moore TR, Lee S, Freeman R, Mahmoundi M, Dimian A, Riegelman A, Simacek JJ. A Meta-Analysis of Treatment for Self-Injurious Behavior in Children and Adolescents With Intellectual and Developmental Disabilities. Behav Modif 2024;48:216-256. PMID: 38197303; DOI: 10.1177/01454455231218742.
Abstract
Self-injurious behavior (SIB) among children and youth with developmental disabilities has not diminished in prevalence despite the availability of effective interventions, and the impact on quality of life for people and their families is devastating. The current meta-analysis reviews SIB intervention research between 2011 and 2021 using single-case experimental designs with children and youth up to 21 years old and provides a quantitative synthesis of data from high-quality studies including moderator analyses to determine effects of participant and study characteristics on intervention outcomes. Encouraging findings include a high level of effectiveness across studies in the decrease of SIB (Tau-U = -0.90) and increase of positive behavior (Tau-U = 0.73), as well as an increase in studies (relative to prior reviews) reporting intervention fidelity, generalization, maintenance, and social validity. However, our findings shed limited light on potential moderating variables in the development of interventions for children and youth who exhibit SIB. Of the potential moderators of intervention effects, only implementer (researcher/therapist vs. parent/caregiver) and setting (clinic vs. home) were significantly associated with improved outcomes. We discuss the need for more robust involvement of natural communities of implementers in SIB intervention research to better equip them to effectively and sustainably meet the needs of people they care for. We also discuss the importance of creating systems enabling broad access for children with SIB to effective interventions in service of reducing burden for people, families, and society over time.
Affiliation(s)
- Timothy R Moore
- Department of Psychiatry and Behavioral Sciences, University of Minnesota, Minneapolis, USA
- Seunghee Lee
- Institute on Community Integration, University of Minnesota, Minneapolis, USA
- Rachel Freeman
- Institute on Community Integration, University of Minnesota, Minneapolis, USA
- Maryam Mahmoundi
- Institute on Community Integration, University of Minnesota, Minneapolis, USA
- Adele Dimian
- Institute on Community Integration, University of Minnesota, Minneapolis, USA
- Amy Riegelman
- Social Sciences Libraries, University of Minnesota, Minneapolis, USA
- Jessica J Simacek
- Institute on Community Integration, University of Minnesota, Minneapolis, USA
4. Becraft JL, Hardesty SL, Goldman KJ, Shawler LA, Edelstein ML, Orchowitz P. Caregiver involvement in applied behavior-analytic research: A scoping review and discussion. J Appl Behav Anal 2024;57:55-70. PMID: 37937407; DOI: 10.1002/jaba.1035.
Abstract
We conducted a scoping review to characterize the role of caregiver involvement in behavior-analytic research. We reviewed eight behavioral-learning journals from 2011-2022 for works that included children or caregivers as participants and characterized caregiver involvement as passive (implications for caregivers, input, social validity) and active (implementation, caregiver behavior, training, caregiver-collected data). The review identified 228 studies, and almost all (96.1%; n = 219) involved caregivers in some capacity; 94.3% (n = 215) had passive involvement (26.8% had only passive involvement; n = 61), 69.3% (n = 158) had active involvement (1.8% had only active involvement; n = 4), and 3.9% (n = 9) had neither passive nor active involvement. Involvement generally increased over publication years. The most common types of involvement were implications for caregivers, implementation, and input; caregiver-collected data were rare. We propose considerations when engaging caregivers in research and suggest new avenues of inquiry related to caregivers' treatment objectives and social validity, treatment implementers, and caregiver-collected data.
Affiliation(s)
- Jessica L Becraft
- Department of Behavioral Psychology, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Samantha L Hardesty
- Department of Behavioral Psychology, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Kissel J Goldman
- Department of Behavioral Psychology, Kennedy Krieger Institute, Baltimore, MD, USA
- Lesley A Shawler
- School of Psychological and Behavioral Sciences, Southern Illinois University, Carbondale, IL, USA
- Matthew L Edelstein
- Department of Behavioral Psychology, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Phillip Orchowitz
- Department of Behavioral Psychology, Kennedy Krieger Institute, Baltimore, MD, USA
5. Bergmann S, Long BP, St Peter CC, Brand D, Strum MD, Han JB, Wallace MD. A detailed examination of reporting procedural fidelity in the Journal of Applied Behavior Analysis. J Appl Behav Anal 2023;56:708-719. PMID: 37572025; DOI: 10.1002/jaba.1015.
Abstract
Few reviews on procedural fidelity (the degree to which procedures are implemented as designed) provide details to gauge the quality of fidelity reporting in behavior-analytic research. This review focused on experiments in the Journal of Applied Behavior Analysis (2006-2021) with "integrity" or "fidelity" in the abstract or body. When fidelity data were collected, the coders characterized measurement details (e.g., description of calculation, report of single or multiple values, frequency of fidelity checks, checklist use). The researchers found increasing trends in describing the calculation(s), reporting multiple values, and stating the frequency of measurement. Few studies described using a checklist. Most studies reported fidelity as a percentage, with high obtained values (M = 97%). When the absence of fidelity data was stated as a limitation, authors were unlikely to provide a rationale for the omission. We discuss recommendations for reporting procedural fidelity to increase the quality of and transparency in behavior-analytic research.
Affiliation(s)
- Samantha Bergmann
- Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Brian P Long
- Department of Psychology, West Virginia University, Morgantown, WV, USA
- Claire C St Peter
- Department of Psychology, West Virginia University, Morgantown, WV, USA
- Denys Brand
- Department of Psychology, California State University, Sacramento, CA, USA
- Marcus D Strum
- Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Justin B Han
- Department of Child and Family Studies, University of South Florida, Tampa, FL, USA
- Michele D Wallace
- Department of Special Education & Counseling, California State University, Los Angeles, CA, USA
6. Bergmann S, Niland H, Gavidia VL, Strum MD, Harman MJ. Comparing Multiple Methods to Measure Procedural Fidelity of Discrete-trial Instruction. Educ Treat Children 2023;46:1-20. PMID: 37362029; PMCID: PMC10208552; DOI: 10.1007/s43494-023-00094-w.
Abstract
Procedural fidelity is the extent to which an intervention is implemented as designed and is an important component of research and practice. There are multiple ways to measure procedural fidelity, and few studies have explored how procedural fidelity varies based on the method of measurement. The current study compared adherence to discrete-trial instruction protocols by behavior technicians with a child with autism when observers used different procedural-fidelity measures. We collected individual-component and individual-trial fidelity with an occurrence-nonoccurrence data sheet and compared these scores to global fidelity and all-or-nothing, 3-point Likert scale, and 5-point Likert scale measurement methods. The all-or-nothing method required all instances of a component or trial to be implemented without error to be scored correct. The Likert scales used a rating system to score components and trials. At the component level, we found that the global, 3-point Likert, and 5-point Likert methods were likely to overestimate fidelity and mask component errors, and the all-or-nothing method was unlikely to mask errors. At the trial level, we found that the global and 5-point Likert methods approximated individual-trial fidelity, the 3-point Likert method overestimated fidelity, and the all-or-nothing method underestimated fidelity. The occurrence-nonoccurrence method required the most time to complete, and all-or-nothing by trial required the least. We discuss the implications of measuring procedural fidelity with different methods of measurement, including false positives and false negatives, and provide suggestions for practice and research. Supplementary information: the online version contains supplementary material available at 10.1007/s43494-023-00094-w.
Affiliation(s)
- Samantha Bergmann
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203 USA
- Haven Niland
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203 USA
- Kristin Farmer Autism Center, University of North Texas, Denton, TX USA
- Valeria Laddaga Gavidia
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203 USA
- Kristin Farmer Autism Center, University of North Texas, Denton, TX USA
- Marcus D. Strum
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203 USA
- Michael J. Harman
- Department of Psychology, Briar Cliff University, Sioux City, IA USA