{"title":"A Description of Missing Data in Single-Case Experimental Designs Studies and an Evaluation of Single Imputation Methods.","authors":"Orhan Aydin","doi":"10.1177/01454455241226879","DOIUrl":null,"url":null,"abstract":"<p><p>Missing data is inevitable in single-case experimental designs (SCEDs) studies due to repeated measures over a period of time. Despite this fact, SCEDs implementers such as researchers, teachers, clinicians, and school psychologists usually ignore missing data in their studies. Performing analyses without considering missing data in an intervention study using SCEDs or a meta-analysis study including SCEDs studies in a topic can lead to biased results and affect the validity of individual or overall results. In addition, missingness can undermine the generalizability of SCEDs studies. Considering these drawbacks, this study aims to give descriptive and advisory information to SCEDs practitioners and researchers about missing data in single-case data. To accomplish this task, the study presents information about missing data mechanisms, item level and unit level missing data, planned missing data designs, drawbacks of ignoring missing data in SCEDs, and missing data handling methods. Since single imputation methods among missing data handling methods do not require complicated statistical knowledge, are easy to use, and hence are more likely to be used by practitioners and researchers, the present study evaluates single imputation methods in terms of intervention effect sizes and missing data rates by using a real and hypothetical data sample. 
This study encourages SCEDs implementers, and also meta-analysts to use some of the single imputation methods to increase the generalizability and validity of the study results in case they encounter missing data in their studies.</p>","PeriodicalId":48037,"journal":{"name":"Behavior Modification","volume":" ","pages":"312-359"},"PeriodicalIF":2.0000,"publicationDate":"2024-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Behavior Modification","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1177/01454455241226879","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/2/19 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"PSYCHOLOGY, CLINICAL","Score":null,"Total":0}
Citations: 0
Abstract
Missing data are inevitable in single-case experimental design (SCED) studies because they rely on repeated measures over time. Despite this, SCED implementers such as researchers, teachers, clinicians, and school psychologists usually ignore missing data in their studies. Performing analyses without accounting for missing data, whether in an intervention study using SCEDs or in a meta-analysis of SCED studies on a topic, can produce biased results and affect the validity of individual or overall findings. In addition, missingness can undermine the generalizability of SCED studies. Considering these drawbacks, this study aims to give descriptive and advisory information to SCED practitioners and researchers about missing data in single-case data. To accomplish this, the study presents information on missing data mechanisms, item-level and unit-level missing data, planned missing data designs, the drawbacks of ignoring missing data in SCEDs, and missing data handling methods. Because single imputation methods do not require complicated statistical knowledge, are easy to use, and are therefore more likely to be adopted by practitioners and researchers, the present study evaluates single imputation methods in terms of intervention effect sizes and missing data rates using real and hypothetical data samples. The study encourages SCED implementers, as well as meta-analysts, to use some of the single imputation methods to increase the generalizability and validity of study results when they encounter missing data.
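The abstract names single imputation as the family of methods evaluated but does not spell out what such methods do. A minimal sketch in plain Python, using a hypothetical single-case data stream (the phase labels, values, and function names below are illustrative assumptions, not the paper's actual data or procedures), shows three common single imputation strategies for a repeated-measures series with missed sessions:

```python
# Hypothetical single-case data (e.g., daily frequency counts per session);
# None marks sessions where measurement was missed. These values are
# illustrative only, not the study's real or simulated data.
baseline = [4, 5, None, 4, 6]
intervention = [9, None, 11, 10, None, 12]

def mean_imputation(series):
    """Replace each missing point with the mean of the observed points."""
    observed = [x for x in series if x is not None]
    mean = sum(observed) / len(observed)
    return [mean if x is None else x for x in series]

def locf(series):
    """Last observation carried forward: repeat the most recent observed value.
    A leading None stays None, since there is nothing to carry forward."""
    out, last = [], None
    for x in series:
        last = last if x is None else x
        out.append(last)
    return out

def linear_interpolation(series):
    """Fill each gap on a straight line between its neighboring observed
    points; leading or trailing gaps are left unfilled in this sketch."""
    out = list(series)
    obs = [i for i, x in enumerate(series) if x is not None]
    for a, b in zip(obs, obs[1:]):
        for i in range(a + 1, b):
            frac = (i - a) / (b - a)
            out[i] = series[a] + frac * (series[b] - series[a])
    return out

print(mean_imputation(baseline))       # [4, 5, 4.75, 4, 6]
print(locf(intervention))              # [9, 9, 11, 10, 10, 12]
print(linear_interpolation(intervention))  # [9, 10.0, 11, 10, 11.0, 12]
```

In a SCED context, imputing within each phase separately (e.g., a baseline mean for baseline gaps) is often preferable to pooling across phases, since pooling would blur the very level change the effect size is meant to capture; the paper's evaluation of these methods against effect sizes speaks to exactly that concern.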
Journal Description
For two decades, researchers and practitioners have turned to Behavior Modification for current scholarship on applied behavior modification. Starting in 1995, in addition to keeping you informed on assessment and modification techniques relevant to psychiatric, clinical, educational, and rehabilitation settings, Behavior Modification revised and expanded its focus to include treatment manuals and program descriptions. With these features you can follow the process of clinical research and see how it can be applied to your own work. And with Behavior Modification, successful clinical and administrative experts have an outlet for sharing their solutions in the field.