Do Social Programs Help Some Beneficiaries More Than Others? Evaluating the Potential for Comparison Group Designs to Yield Low-Bias Estimates of Differential Impact
{"title":"Do Social Programs Help Some Beneficiaries More Than Others? Evaluating the Potential for Comparison Group Designs to Yield Low-Bias Estimates of Differential Impact","authors":"Andrew P. Jaciw","doi":"10.1177/10982140231160561","DOIUrl":null,"url":null,"abstract":"In the current socio-political climate, there is an extra urgency to evaluate whether program impacts are distributed fairly across important student groups in education. Both experimental and quasi-experimental designs (QEDs) can contribute to answering this question. This work demonstrates that QEDs that compare outcomes across higher-level implementation units, such as schools, are especially well-suited to contributing evidence on differential program effects across student groups. Such designs, by differencing away site-level (macro) effects, on average produce estimates of the differential impact that are closer to experimental benchmark results than are estimates of average impact based on the same design. This work argues for the importance of routine evaluation of moderated impacts, describes the differencing procedure, and empirically tests the methodology with seven impact evaluations in education. The hope is to encourage broader use of this design type to more-efficiently develop the evidence base for differential program effects, particularly for underserved students.","PeriodicalId":51449,"journal":{"name":"American Journal of Evaluation","volume":null,"pages":null},"PeriodicalIF":1.1000,"publicationDate":"2023-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"American Journal of Evaluation","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.1177/10982140231160561","RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"SOCIAL SCIENCES, INTERDISCIPLINARY","Score":null,"Total":0}
引用次数: 0
Abstract
In the current socio-political climate, there is extra urgency to evaluate whether program impacts are distributed fairly across important student groups in education. Both experimental and quasi-experimental designs (QEDs) can contribute to answering this question. This work demonstrates that QEDs that compare outcomes across higher-level implementation units, such as schools, are especially well suited to contributing evidence on differential program effects across student groups. Such designs, by differencing away site-level (macro) effects, on average produce estimates of differential impact that are closer to experimental benchmark results than are estimates of average impact based on the same design. This work argues for the importance of routine evaluation of moderated impacts, describes the differencing procedure, and empirically tests the methodology with seven impact evaluations in education. The hope is to encourage broader use of this design type to more efficiently develop the evidence base for differential program effects, particularly for underserved students.
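To illustrate the differencing logic the abstract describes, here is a minimal sketch; the bias decomposition and notation are assumptions introduced for exposition, not taken from the paper itself. Suppose the comparison group estimate of impact for student group $g$ decomposes as

$$\hat{\Delta}_g = \Delta_g + B_{\text{site}} + B_g,$$

where $\Delta_g$ is the true impact for group $g$, $B_{\text{site}}$ is selection bias arising from site-level (macro) differences between treated and comparison schools, shared by all student groups within a site, and $B_g$ is any remaining group-specific bias. The estimate of differential impact between groups 1 and 2 is then

$$\hat{\Delta}_1 - \hat{\Delta}_2 = (\Delta_1 - \Delta_2) + (B_1 - B_2),$$

so the common site-level term $B_{\text{site}}$ cancels, leaving only the group-specific bias terms. Under the assumption that these are typically smaller than the macro bias, this is consistent with the abstract's claim that differential impact estimates track experimental benchmarks more closely than average impact estimates from the same design.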
About the Journal
The American Journal of Evaluation (AJE) publishes original papers about the methods, theory, practice, and findings of evaluation. The general goal of AJE is to present the best work in and about evaluation, in order to improve the knowledge base and practice of its readers. Because the field of evaluation is diverse, with different intellectual traditions, approaches to practice, and domains of application, the papers published in AJE will reflect this diversity. Nevertheless, preference is given to papers that are likely to be of interest to a wide range of evaluators and that are written to be accessible to most readers.