1. Preston A, Szikszai P, Saini V, Brightman R. Evaluating an Excel-based tool for interpreting functional analyses: A functional analysis decision support system. J Appl Behav Anal. 2024. PMID: 39036867. DOI: 10.1002/jaba.2901.
Abstract
When applied to functional analysis results, structured visual inspection criteria have improved agreement between raters and enabled earlier identification of the function of challenging behavior. However, multistep criteria can be difficult to apply in real time, which could be a barrier to widespread adoption in practice. This study evaluated a Microsoft Excel-based functional analysis decision support system (FADSS) that could aid behavior analysts in interpreting functional analysis results. Final overall agreement between the FADSS and post hoc visual inspection was high at 95%. Final overall agreement between post hoc and ongoing results generated by the FADSS was acceptable at 81%, representing a 50% increase in efficiency. These results indicate that the FADSS could aid behavior analysts in interpreting functional analysis results in real time.
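Overall agreement of the kind reported here is typically computed as the percentage of cases on which two interpretation methods identify the same function. A minimal sketch with hypothetical data and function names (not the authors' FADSS implementation):

```python
def percent_agreement(calls_a, calls_b):
    """Percentage of cases on which two interpretation methods agree."""
    if len(calls_a) != len(calls_b):
        raise ValueError("Both methods must rate the same cases.")
    matches = sum(a == b for a, b in zip(calls_a, calls_b))
    return 100 * matches / len(calls_a)

# Hypothetical function calls for five functional analyses
fadss = ["attention", "escape", "tangible", "automatic", "escape"]
visual = ["attention", "escape", "tangible", "automatic", "attention"]
print(percent_agreement(fadss, visual))  # 80.0
```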
Affiliation(s)
- Valdeep Saini, Department of Applied Disability Studies, Brock University, St. Catharines, Ontario, Canada
2. Dowdy A, Prime K, Peltier C. Generalized Linear Mixed Effects Modeling (GLMM) of Functional Analysis Graphical Construction Elements on Visual Analysis. Perspect Behav Sci. 2024;47:499-521. PMID: 39099739. PMCID: PMC11294292. DOI: 10.1007/s40614-024-00406-4.
Abstract
Multielement designs are the quintessential design tactic for evaluating outcomes of a functional analysis in applied behavior analysis. Protecting the credibility of the data-collection, graphing, and visual-analysis processes of a functional analysis increases the likelihood that optimal intervention decisions are made for individuals. Time-series graphs and visual analysis are the most prevalent methods used to interpret functional analysis data. The current project had two principal aims. First, we tested whether manipulating the graphical construction of the x-to-y axes ratio (i.e., the data points per x-axis to y-axis ratio [DPPXYR]) influenced visual analysts' detection of a function on 32 multielement design graphs displaying functional analyses. Second, we investigated the alignment of board certified behavior analysts' (BCBAs; N = 59) visual analysis with the modified visual inspection (MVI) criteria (Roane et al., Journal of Applied Behavior Analysis, 46, 130-146, 2013). We found that the crossed GLMM that included random slopes and random intercepts but no interaction effect (AIC = 1406.1, BIC = 1478.2) performed optimally. Second, alignment between BCBAs' decisions and the MVI criteria appeared to be low across data sets. We also leveraged current best practices in Open Science for raw data and analysis transparency.
Affiliation(s)
- Art Dowdy, Department of Teaching and Learning, College of Education and Human Development, Temple University, Philadelphia, PA, USA
- Kasey Prime, Department of Teaching and Learning, College of Education and Human Development, Temple University, Philadelphia, PA, USA
- Corey Peltier, Department of Educational Psychology, Jeannine Rainbolt College of Education, University of Oklahoma, Norman, OK, USA
3. Bergmann S, Long BP, St Peter CC, Brand D, Strum MD, Han JB, Wallace MD. A detailed examination of reporting procedural fidelity in the Journal of Applied Behavior Analysis. J Appl Behav Anal. 2023;56:708-719. PMID: 37572025. DOI: 10.1002/jaba.1015.
Abstract
Few reviews of procedural fidelity (the degree to which procedures are implemented as designed) provide enough detail to gauge the quality of fidelity reporting in behavior-analytic research. This review focused on experiments in the Journal of Applied Behavior Analysis (2006-2021) with "integrity" or "fidelity" in the abstract or body. When fidelity data were collected, the coders characterized measurement details (e.g., description of the calculation, report of single or multiple values, frequency of fidelity checks, checklist use). The researchers found increasing trends in describing the calculation(s), reporting multiple values, and stating the frequency of measurement. Few studies described using a checklist. Most studies reported fidelity as a percentage, with high obtained values (M = 97%). When not collecting fidelity data was stated as a limitation, authors were unlikely to provide a rationale for the omission. We discuss recommendations for reporting procedural fidelity to increase the quality of and transparency in behavior-analytic research.
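For reference, procedural fidelity is usually reported as the percentage of planned steps implemented correctly, either per session or averaged across sessions. A minimal sketch assuming hypothetical checklist counts (not data from the reviewed studies):

```python
def session_fidelity(steps_correct, steps_total):
    """Procedural fidelity for one session, as a percentage."""
    return 100 * steps_correct / steps_total

# Hypothetical checklist data: steps implemented correctly out of 10 per session
steps_correct_per_session = [10, 9, 10, 8, 10]
values = [session_fidelity(c, 10) for c in steps_correct_per_session]
mean_fidelity = sum(values) / len(values)

print(values)         # [100.0, 90.0, 100.0, 80.0, 100.0]
print(mean_fidelity)  # 94.0
```

Reporting both the per-session values and the mean, as the review recommends, exposes low-fidelity sessions that a single aggregate value would hide.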
Affiliation(s)
- Samantha Bergmann, Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Brian P Long, Department of Psychology, West Virginia University, Morgantown, WV, USA
- Claire C St Peter, Department of Psychology, West Virginia University, Morgantown, WV, USA
- Denys Brand, Department of Psychology, California State University, Sacramento, CA, USA
- Marcus D Strum, Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Justin B Han, Department of Child and Family Studies, University of South Florida, Tampa, FL, USA
- Michele D Wallace, Department of Special Education & Counseling, California State University, Los Angeles, CA, USA
4. Imam AA. Remarkably reproducible psychological (memory) phenomena in the classroom: some evidence for generality from small-N research. BMC Psychol. 2022;10:274. PMID: 36419180. PMCID: PMC9685964. DOI: 10.1186/s40359-022-00982-7.
Abstract
BACKGROUND: Mainstream psychology is experiencing a crisis of confidence. Many of the methodological solutions offered in response have focused largely on statistical alternatives to null hypothesis statistical testing, ignoring nonstatistical remedies that are readily available within psychology, namely the use of small-N designs. In fact, many classic memory studies that have passed the test of replicability used them. That methodological legacy warranted a retrospective look at nonexperimental data to explore the generality of the reported effects.
METHOD: Various classroom demonstrations based on classic memory experiments on immediate memory span, chunking, and depth of processing were conducted over multiple semesters in introductory psychology courses with typical, mostly freshman students at a predominantly white private Catholic university in the US Midwest.
RESULTS: Students tended to remember 7 ± 2 digits, remembered more digits of π following an attached meaningful story, and remembered more words after elaborative rehearsal than after maintenance rehearsal. These results amount to replications, under uncontrolled classroom conditions, of the classic experiments originally conducted largely outside null hypothesis statistical testing frameworks.
CONCLUSIONS: In light of the ongoing replication crisis in psychology, the results are remarkable and noteworthy, validating these historically important psychological findings. They are a testament to the reliability of reproducible effects as the hallmark of empirical findings in science and suggest an alternative to commonly proffered solutions to the replication crisis.
Affiliation(s)
- Abdulrazaq A. Imam, Department of Psychology, John Carroll University, 1 John Carroll Blvd, University Heights, OH 44118, USA
5. Mittelman C. Creating Responsive Asynchronous Instructional Sequences Using PowerPoint™ for Microsoft 365®. Behav Anal Pract. 2022;16:312-333. PMID: 35668747. PMCID: PMC9159038. DOI: 10.1007/s40617-022-00713-9.
Abstract
The prevalence of distance education utilizing asynchronous instruction has increased in recent years. Asynchronous instruction differs from the more common synchronous instruction in that learners primarily contact the lessons and educational materials on their own rather than with a live instructor. Though not without its limitations, asynchronous instruction offers a variety of advantages that can make instruction more efficient, produce better outcomes, and increase accessibility for a greater variety of learners when created using known principles of effective instructional design. Though many platforms exist for creating asynchronous instruction, these are often accompanied by barriers to their widespread use. A potentially cost-effective and flexible alternative is Microsoft® PowerPoint™. The present report serves as a guide for creating interactive and responsive asynchronous instructional sequences with PowerPoint for Microsoft 365® using principles and procedures derived from programmed instruction (Skinner, 1968). Ideas for additional response types are also provided, as are the limitations of designing instructional sequences with this software. Previous papers on the use of PowerPoint as an instructional tool have been geared primarily toward instruction for young learners or learners with autism; the present article therefore extends the use of PowerPoint to higher education.
6. Guerrero LA, Engler CW, Hansen BA, Piazza CC. On the validity of interpreting functional analyses of inappropriate mealtime behavior using structured criteria. J Appl Behav Anal. 2022;55:1280-1293. PMID: 35818937. PMCID: PMC9796493. DOI: 10.1002/jaba.945.
Abstract
Visual inspection is the traditional method behavior analysts use to interpret functional-analysis results. Limitations of visual inspection include lack of standardized rules, subjectivity, and inconsistent interrater reliability (Fisch, 1998). To address these limitations, researchers have developed, evaluated, and refined structured criteria to aid interpretation of functional analyses of destructive behavior (Hagopian et al., 1997; Roane et al., 2013; Saini et al., 2018). The current study applied the structured criteria described by Saini et al. (2018) to functional analyses of inappropriate mealtime behavior. We assessed their predictive validity and evaluated their efficiency relative to 3 post hoc visual inspection procedures. Validity metrics were lower than those in Saini et al.; however, ongoing visual inspection increased the efficiency of functional analyses by more than 30%. We discuss these findings relative to the procedural differences between functional analyses of destructive behavior and of inappropriate mealtime behavior.
Affiliation(s)
- Lisa A. Guerrero, University of Nebraska Medical Center, Munroe-Meyer Institute; Rutgers University, Rutgers Biomedical and Health Sciences
- Cathleen C. Piazza, Children's Specialized Hospital; Rutgers University, Rutgers Biomedical and Health Sciences
7. Dowdy A, Jessel J, Saini V, Peltier C. Structured visual analysis of single-case experimental design data: Developments and technological advancements. J Appl Behav Anal. 2021;55:451-462. PMID: 34962646. DOI: 10.1002/jaba.899.
Abstract
Visual analysis is the primary method used to interpret single-case experimental design (SCED) data in applied behavior analysis. Research shows that agreement between visual analysts can be suboptimal at times. To address inconsistent interpretations of SCED data, structured visual-analysis methods and technologies have recently been developed. To assess the extent to which structured visual analysis is used to guide or supplement applied behavior analysts' interpretation of SCED graphs, we conducted a systematic review of the Journal of Applied Behavior Analysis from 2015 to 2020. Findings showed that despite recent efforts to develop structured visual-analysis tools and criteria, these methods are rarely used to analyze SCED data. We provide an overview of structured visual-analysis tools, delineate their utility, identify their common characteristics, and highlight future directions for research and clinical use.
Affiliation(s)
- Art Dowdy, Department of Teaching and Learning, Temple University
- Valdeep Saini, Department of Applied Disability Studies, Brock University
- Corey Peltier, Department of Educational Psychology, University of Oklahoma
8. Implementing Automated Nonparametric Statistical Analysis on Functional Analysis Data: A Guide for Practitioners and Researchers. Perspect Behav Sci. 2021;45:53-75. DOI: 10.1007/s40614-021-00290-2.
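This entry appears without an abstract. As an illustration of what a nonparametric analysis of functional analysis data can look like, the sketch below runs an exact two-sample permutation test on hypothetical response rates from a test and a control condition; it is not necessarily the approach the article automates:

```python
from itertools import combinations

def permutation_p(test, control):
    """One-sided exact permutation test on the difference in mean response rate."""
    observed = sum(test) / len(test) - sum(control) / len(control)
    pooled = test + control
    n = len(test)
    count = total = 0
    # Enumerate every way of splitting the pooled sessions into two groups
    for idx in combinations(range(len(pooled)), n):
        grp = [pooled[i] for i in idx]
        rest = [pooled[i] for i in range(len(pooled)) if i not in idx]
        if sum(grp) / n - sum(rest) / len(rest) >= observed:
            count += 1
        total += 1
    return count / total

# Hypothetical responses per minute in attention (test) vs. play (control) sessions
attention = [3.2, 2.8, 3.5, 3.0]
play = [0.4, 0.6, 0.2, 0.5]
print(permutation_p(attention, play))  # ≈ 0.014 (1/70)
```

Because the test conditions on the observed data rather than a distributional assumption, it suits the short, non-normal session counts typical of functional analyses.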
9. Lanovaz MJ, Turgeon S. How Many Tiers Do We Need? Type I Errors and Power in Multiple Baseline Designs. Perspect Behav Sci. 2020;43:605-616. PMID: 33024931. PMCID: PMC7490309. DOI: 10.1007/s40614-020-00263-x.
Abstract
Design quality guidelines typically recommend that multiple baseline designs include at least three demonstrations of effect. Despite its widespread adoption, this recommendation does not appear grounded in empirical evidence. The main purpose of our study was to address this issue by assessing Type I error rate and power in multiple baseline designs. First, we generated 10,000 multiple baseline graphs, applied the dual-criteria method to each tier, and computed Type I error rate and power for different numbers of tiers showing a clear change. Second, two raters categorized the tiers for 300 multiple baseline graphs to replicate our analyses using visual inspection. When multiple baseline designs had at least three tiers and two or more of these tiers showed a clear change, the Type I error rate remained adequate (< .05) while power also reached acceptable levels (> .80). In contrast, requiring all tiers to show a clear change resulted in overly stringent conclusions (i.e., unacceptably low power). Therefore, our results suggest that researchers and practitioners should carefully consider limitations in power when requiring all tiers of a multiple baseline design to show a clear change in their analyses.
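The simulation logic can be illustrated with a small Monte Carlo sketch. The per-tier rule below is a deliberately simplified stand-in for the dual-criteria method (it flags a tier only when every treatment point exceeds the baseline mean), and all data and parameters are hypothetical; the point is only to show how requiring a minimum number of changed tiers keeps the design-level false-positive rate low:

```python
import random

random.seed(1)

def tier_shows_change(baseline, treatment):
    """Simplified stand-in for the dual-criteria method: flag a change
    only when every treatment point exceeds the baseline mean."""
    m = sum(baseline) / len(baseline)
    return all(x > m for x in treatment)

def type1_error(n_tiers, min_changed, n_sims=10000, n_points=5):
    """Estimate the false-positive rate of a multiple baseline design
    when the data are pure noise (no true effect in any tier)."""
    flags = 0
    for _ in range(n_sims):
        changed = sum(
            tier_shows_change(
                [random.gauss(0, 1) for _ in range(n_points)],
                [random.gauss(0, 1) for _ in range(n_points)],
            )
            for _ in range(n_tiers)
        )
        if changed >= min_changed:
            flags += 1
    return flags / n_sims

# Requiring at least 2 of 3 tiers to show change keeps the design-level
# false-positive rate well below .05 even with this liberal per-tier rule
rate = type1_error(n_tiers=3, min_changed=2)
print(rate)
```

Varying `min_changed` from 1 to `n_tiers` in this sketch reproduces the qualitative trade-off the study reports: each additional required tier lowers the Type I error rate but also costs power when a true effect is present.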
Affiliation(s)
- Marc J Lanovaz, École de psychoéducation, Université de Montréal, C.P. 6128, succursale Centre-Ville, Montreal, QC H3C 3J7, Canada; Centre de recherche de l'Institut universitaire en santé mentale de Montréal, Montreal, QC, Canada
- Stéphanie Turgeon, École de psychoéducation, Université de Montréal, C.P. 6128, succursale Centre-Ville, Montreal, QC H3C 3J7, Canada
10. Mitteer DR, Greer BD, Randall KR, Briggs AM. Further Evaluation of Teaching Behavior Technicians to Input Data and Graph Using GraphPad Prism. Behavior Analysis (Washington, D.C.). 2020;20:81-93. PMID: 33244483. DOI: 10.1037/bar0000172.
Abstract
We replicated and extended Mitteer, Greer, Fisher, and Cohrs (2018) by examining the effects of a video model on inputting data into GraphPad Prism, a necessary skill for graph construction. We used a concurrent multiple-probe-across-behavior design with two behavior technicians to assess data-input and graphing skills separately prior to and during access to relevant video models. We evaluated the generality of the training procedures by assessing both skills during data-input-plus-graphing sessions without access to the video models. The video models produced mastery of data-input and graphing skills when each was assessed individually, and training effects generalized to data-input-plus-graphing sessions once behavior technicians had experienced all relevant video models. These results suggest that individuals should view both data-input and graphing video models prior to depicting single-case design data in Prism, but that these skills can be maintained at high levels of accuracy without continued access to the training materials.
Affiliation(s)
- Brian D Greer, University of Nebraska Medical Center's Munroe-Meyer Institute
- Kayla R Randall, University of Nebraska Medical Center's Munroe-Meyer Institute
- Adam M Briggs, University of Nebraska Medical Center's Munroe-Meyer Institute