{"title":"The Internal Validity of the School-Level Comparative Interrupted Time Series Design: Evidence From Four New Within-Study Comparisons","authors":"Sam Sims, Jake Anders, Laura Zieger","doi":"10.1080/19345747.2022.2051652","DOIUrl":null,"url":null,"abstract":"Abstract Comparative interrupted time series (CITS) designs evaluate impact by modeling the relative deviation from trends among a treatment and comparison group after an intervention. The broad applicability of the design means it is widely used in education research. Like all non-experimental evaluation methods however, the internal validity of a given CITS evaluation depends on assumptions that cannot be directly verified. We provide an empirical test of the internal validity of CITS by conducting four within-study comparisons of school-level interventions previously evaluated using randomized controlled trials. Our estimate of bias across these four studies is 0.03 school-level (or 0.01 pupil-level) standard deviations. The results suggest well-conducted CITS evaluations of similar school-level education interventions are likely to display limited bias.","PeriodicalId":47260,"journal":{"name":"Journal of Research on Educational Effectiveness","volume":"15 1","pages":"876 - 897"},"PeriodicalIF":1.7000,"publicationDate":"2022-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Research on Educational Effectiveness","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1080/19345747.2022.2051652","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 6
Abstract
Comparative interrupted time series (CITS) designs evaluate impact by modeling the relative deviation from trend between a treatment and a comparison group after an intervention. The broad applicability of the design means it is widely used in education research. Like all non-experimental evaluation methods, however, the internal validity of a given CITS evaluation depends on assumptions that cannot be directly verified. We provide an empirical test of the internal validity of CITS by conducting four within-study comparisons of school-level interventions previously evaluated using randomized controlled trials. Our estimate of bias across these four studies is 0.03 school-level (or 0.01 pupil-level) standard deviations. The results suggest that well-conducted CITS evaluations of similar school-level education interventions are likely to display limited bias.
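For readers unfamiliar with the design, the sketch below illustrates the kind of regression a CITS evaluation typically fits: group-specific trends with a level shift (and optional slope change) at the intervention point, where the treatment-by-post interaction is the impact estimate. This is a minimal illustration on simulated data, not the authors' specification; the variable names (score, treated, post, year), the data-generating values, and the clustered-error choice are assumptions made for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-by-year panel; the paper's real data are not reproduced here.
rng = np.random.default_rng(42)
n_schools, n_years, cutoff = 200, 8, 5  # intervention begins in year 5 for treated schools

rows = []
for s in range(n_schools):
    treated = int(s < n_schools // 2)      # first half of schools "receive" the intervention
    school_effect = rng.normal(0.0, 0.5)   # time-invariant differences between schools
    for year in range(n_years):
        post = int(year >= cutoff)
        # Shared linear trend plus a 0.2 SD post-intervention shift for treated schools.
        score = 0.1 * year + school_effect + 0.2 * treated * post + rng.normal(0.0, 0.3)
        rows.append(dict(school=s, year=year, treated=treated, post=post, score=score))

df = pd.DataFrame(rows)

# CITS regression: main effects and all interactions of year, treatment status, and the
# post-intervention indicator. treated:post is the level-shift impact estimate;
# year:treated:post captures any differential change in slope after the intervention.
fit = smf.ols("score ~ year * treated * post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]}  # standard errors clustered by school
)
print(fit.params[["treated:post", "year:treated:post"]])
print(fit.bse[["treated:post", "year:treated:post"]])
```

With this simulated panel, the treated:post coefficient recovers a value close to the 0.2 SD shift built into the data, while the slope-change term is close to zero; in an actual evaluation, the credibility of that estimate rests on the unverifiable assumption that the comparison group's deviation from trend mimics what the treatment group would have experienced absent the intervention.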
About the journal:
As the flagship publication of the Society for Research on Educational Effectiveness, the Journal of Research on Educational Effectiveness (JREE) publishes original articles from the multidisciplinary community of researchers who are committed to applying principles of scientific inquiry to the study of educational problems. Articles published in JREE should advance our knowledge of factors important for educational success and/or improve our ability to conduct further disciplined studies of pressing educational problems. JREE welcomes manuscripts that fit into one of the following categories: (1) intervention, evaluation, and policy studies; (2) theory, contexts, and mechanisms; and (3) methodological studies. The first category includes studies that focus on process and implementation and seek to demonstrate causal claims in educational research. The second category includes meta-analyses and syntheses, descriptive studies that illuminate educational conditions and contexts, and studies that rigorously investigate educational processes and mechanisms. The third category includes studies that advance our understanding of theoretical and technical features of measurement and research design and describe advances in data analysis and data modeling. To establish a stronger connection between scientific evidence and educational practice, studies submitted to JREE should focus on pressing problems found in classrooms and schools. Studies that help advance our understanding of, and demonstrate effectiveness related to, challenges in reading, mathematics education, and science education are especially welcome, as are studies related to cognitive functions, social processes, organizational factors, and cultural features that mediate and/or moderate critical educational outcomes. On occasion, invited responses to JREE articles and rejoinders to those responses will be included in an issue.