Causal Inference in Multilevel Settings in Which Selection Processes Vary across Schools. CSE Technical Report 708.

Junyeop Kim, Michael H. Seltzer
{"title":"Causal Inference in Multilevel Settings in Which Selection Processes Vary across Schools. CSE Technical Report 708.","authors":"Junyeop Kim, Michael H. Seltzer","doi":"10.1037/e644002011-001","DOIUrl":null,"url":null,"abstract":"In this report we focus on the use of propensity score methodology in multisite studies of the effects of educational programs and practices in which both treatment and control conditions are enacted within each of the schools in a sample, and the assignment to treatment is not random. A key challenge in applying propensity score methodology in such settings is that the process by which students wind up in treatment or control conditions may differ substantially from school to school. To help capture differences in selection processes across schools, and achieve balance on key covariates between treatment and control students in each school, we propose the use of multilevel logistic regression models for propensity score estimation in which intercepts and slopes are treated as varying across schools. Through analyses of the data from the Early Academic Outreach Program (EAOP), we compare the performance of this approach with other possible strategies for estimating propensity scores (e.g., single-level logistic regression models; multilevel logistic regression models with intercepts treated as random and slopes treated as fixed). 
Furthermore, we draw attention to how the failure to achieve balance within each school can result in misleading inferences concerning the extent to which the effect of a treatment varies across schools, and concerning factors (e.g., differences in implementation across schools) that might dampen or magnify the effects of a treatment.","PeriodicalId":19116,"journal":{"name":"National Center for Research on Evaluation, Standards, and Student Testing","volume":"80 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2007-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"38","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"National Center for Research on Evaluation, Standards, and Student Testing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1037/e644002011-001","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 38

Abstract

In this report we focus on the use of propensity score methodology in multisite studies of the effects of educational programs and practices in which both treatment and control conditions are enacted within each of the schools in a sample, and the assignment to treatment is not random. A key challenge in applying propensity score methodology in such settings is that the process by which students wind up in treatment or control conditions may differ substantially from school to school. To help capture differences in selection processes across schools, and achieve balance on key covariates between treatment and control students in each school, we propose the use of multilevel logistic regression models for propensity score estimation in which intercepts and slopes are treated as varying across schools. Through analyses of the data from the Early Academic Outreach Program (EAOP), we compare the performance of this approach with other possible strategies for estimating propensity scores (e.g., single-level logistic regression models; multilevel logistic regression models with intercepts treated as random and slopes treated as fixed). Furthermore, we draw attention to how the failure to achieve balance within each school can result in misleading inferences concerning the extent to which the effect of a treatment varies across schools, and concerning factors (e.g., differences in implementation across schools) that might dampen or magnify the effects of a treatment.
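The modeling strategy the abstract describes — estimating propensity scores from a logistic model whose intercepts and slopes are allowed to vary across schools, then checking covariate balance between treatment and control students within each school — can be illustrated on simulated data. The sketch below is not the report's actual multilevel (random-coefficients) estimator; it fits a separate logistic regression per school as a simple stand-in for fully varying intercepts and slopes, and all data, variable names, and parameter values are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated multisite data: the selection process into treatment
# differs by school via school-specific intercepts and slopes.
n_schools, n_per = 20, 150
rows = []
for s in range(n_schools):
    a = rng.normal(0.0, 1.0)   # school-specific intercept
    b = rng.normal(1.0, 0.5)   # school-specific slope on the covariate
    x = rng.normal(0.0, 1.0, n_per)  # e.g., prior achievement
    p = 1.0 / (1.0 + np.exp(-(a + b * x)))
    rows.append(pd.DataFrame({"school": s, "x": x,
                              "treat": rng.binomial(1, p)}))
df = pd.concat(rows, ignore_index=True)

# Per-school logistic regressions: a crude stand-in for a multilevel
# logistic model in which intercepts and slopes vary across schools.
df["pscore"] = np.nan
for s, g in df.groupby("school"):
    if g["treat"].nunique() < 2:
        continue  # no within-school variation in treatment; cannot fit
    m = LogisticRegression().fit(g[["x"]], g["treat"])
    df.loc[g.index, "pscore"] = m.predict_proba(g[["x"]])[:, 1]

# Inverse-propensity weights for the condition each student received.
df["w"] = np.where(df["treat"] == 1,
                   1.0 / df["pscore"], 1.0 / (1.0 - df["pscore"]))

# Balance check: weighting should pull the treated and control
# covariate means together, pooled and within each school.
for t in (0, 1):
    g = df[df["treat"] == t]
    print(f"treat={t}: raw mean x = {g['x'].mean():+.3f}, "
          f"weighted mean x = {np.average(g['x'], weights=g['w']):+.3f}")
```

Under this kind of selection process a single pooled (single-level) logistic regression would average over the school-specific slopes, which is the failure mode the report examines: propensity scores that balance covariates overall can still leave imbalance within individual schools.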