The influence of procedural characteristics on within-case effect sizes for academic outcomes
Ethan R. Van Norman, David A. Klingbeil, Adelle K. Sturgell
Journal of School Psychology · Published 2024-07-02 · DOI: 10.1016/j.jsp.2024.101347
https://www.sciencedirect.com/science/article/pii/S0022440524000670
Citations: 0
Abstract
Single-case experimental designs (SCEDs) have been used with increasing frequency to identify evidence-based interventions in education. The purpose of this study was to explore how several procedural characteristics, including within-phase variability (i.e., measurement error), number of baseline observations, and number of intervention observations, influenced the magnitude of four SCED effect sizes, including (a) non-overlap of all pairs (NAP), (b) baseline-corrected Tau (BC-Tau), (c) mean phase difference (MPD), and (d) generalized least squares (GLS) when applied to hypothetical academic intervention SCED data. Higher levels of measurement error decreased the average magnitude of effect sizes, particularly NAP and BC-Tau. However, the number of intervention observations had minimal impact on the average magnitude of NAP and BC-Tau. Increasing the number of intervention observations dramatically increased the magnitude of GLS and MPD. Increasing the number of baseline observations also tended to increase the average magnitude of MPD. The ratio of baseline to intervention observations had a statistically but not practically significant influence on the average magnitude of NAP, BC-Tau, and GLS. Careful consideration is required when determining the length of time academic SCEDs are conducted and what effect sizes are used to summarize treatment outcomes. This article also highlights the value of using meaningful simulation conditions to understand the performance of SCED effect sizes.
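The abstract names four effect-size indices without defining them. As a rough point of reference, the Python sketch below (hypothetical data, illustrative parameter values, and a generic `nap` helper, none taken from the study itself) computes the first of these, NAP: the proportion of all baseline-intervention observation pairs in which the intervention value exceeds the baseline value, with ties counted as half. It then adds increasing amounts of normally distributed measurement error to otherwise steadily improving academic data, which illustrates the kind of downward pull on NAP the abstract reports.

```python
import numpy as np

def nap(baseline, intervention):
    """Non-overlap of All Pairs (NAP): the proportion of all
    baseline-intervention pairs in which the intervention value
    exceeds the baseline value, with ties counted as half.
    Assumes higher scores indicate improvement."""
    baseline = np.asarray(baseline, dtype=float)
    intervention = np.asarray(intervention, dtype=float)
    # Compare every intervention observation against every baseline observation.
    diffs = intervention[:, None] - baseline[None, :]
    wins = np.sum(diffs > 0)
    ties = np.sum(diffs == 0)
    return (wins + 0.5 * ties) / diffs.size

rng = np.random.default_rng(42)

# Hypothetical linear-growth data: flat baseline, improving intervention phase
# (e.g., words read correctly per minute with steady weekly growth).
n_base, n_intv = 5, 15
true_base = np.full(n_base, 20.0)
true_intv = 20.0 + 1.5 * np.arange(1, n_intv + 1)

for sd in (0.0, 2.0, 8.0):  # increasing levels of measurement error
    obs_base = true_base + rng.normal(0, sd, n_base)
    obs_intv = true_intv + rng.normal(0, sd, n_intv)
    print(f"measurement-error SD = {sd:>4}: NAP = {nap(obs_base, obs_intv):.2f}")
```

Because NAP saturates at 1.0 once the two phases no longer overlap, collecting more intervention observations cannot raise it further, which is one plausible reading of the abstract's finding that intervention-phase length had minimal impact on NAP and BC-Tau while dramatically increasing trend-sensitive indices such as MPD and GLS.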
Journal Introduction
The Journal of School Psychology publishes original empirical articles and critical reviews of the literature on research and practices relevant to psychological and behavioral processes in school settings. JSP presents research on intervention mechanisms and approaches; schooling effects on the development of social, cognitive, mental-health, and achievement-related outcomes; assessment; and consultation. Submissions from a variety of disciplines are encouraged. All manuscripts are read by the Editor and one or more editorial consultants with the intent of providing appropriate and constructive written reviews.