Sophie E. Stallasch, Oliver Lüdtke, Cordula Artelt, Larry V. Hedges, Martin Brunner
Educational Psychology Review, 57(1). Published 2024-09-25. DOI: 10.1007/s10648-024-09898-7
Single- and Multilevel Perspectives on Covariate Selection in Randomized Intervention Studies on Student Achievement
Well-chosen covariates boost the design sensitivity of individually and cluster-randomized trials. We provide guidance on covariate selection by generating an extensive compilation of single- and multilevel design parameters on student achievement. Embedded in psychometric heuristics, we analyzed (a) covariate types of varying bandwidth-fidelity, namely domain-identical (IP), cross-domain (CP), and fluid intelligence (Gf) pretests, as well as sociodemographic characteristics (SC); (b) covariate combinations quantifying the incremental validities of CP, Gf, and/or SC beyond IP; and (c) covariate time lags of 1–7 years, testing validity degradation in IP, CP, and Gf. Estimates from six German samples (1868 ≤ N ≤ 10,543) covering various outcome domains across grades 1–12 were meta-analyzed and included in precision simulations. Results varied widely by grade level, domain, and hierarchical level. In general, IP outperformed CP, which slightly outperformed Gf and SC. Benefits from coupling IP with CP, Gf, and/or SC were small. IP appeared most affected by temporal validity decay. Findings are applied in illustrative scenarios of study planning and enriched by comprehensive Online Supplemental Material (OSM) accessible via the Open Science Framework (OSF; https://osf.io/nhx4w).
About the journal:
Educational Psychology Review aims to disseminate knowledge and promote dialogue within the field of educational psychology. It serves as a platform for the publication of various types of articles, including peer-reviewed integrative reviews, special thematic issues, reflections on previous research or new research directions, interviews, and research-based advice for practitioners. The journal caters to a diverse readership, ranging from generalists in educational psychology to experts in specific areas of the discipline. The content offers a comprehensive coverage of topics and provides in-depth information to meet the needs of both specialized researchers and practitioners.