Kaitlin Anderson, Gema Zamarro, Jennifer Steele, Trey Miller
{"title":"随机实验评估中处理差异磨损方法的性能比较。","authors":"Kaitlin Anderson, Gema Zamarro, Jennifer Steele, Trey Miller","doi":"10.1177/0193841X211034363","DOIUrl":null,"url":null,"abstract":"<p><p><b>Background:</b> In randomized controlled trials, attrition rates often differ by treatment status, jeopardizing causal inference. Inverse probability weighting methods and estimation of treatment effect bounds have been used to adjust for this bias. <b>Objectives:</b> We compare the performance of various methods within two samples, both generated through lottery-based randomization: one with considerable differential attrition and an augmented dataset with less problematic attrition. <b>Research Design:</b> We assess the performance of various correction methods within the dataset with problematic attrition. In addition, we conduct simulation analyses. <b>Results:</b> Within the more problematic dataset, we find the correction methods often performed poorly. Simulation analyses indicate that deviations from the underlying assumptions for bounding approaches damage the performance of estimated bounds. 
<b>Conclusions:</b> We recommend the verification of the underlying assumptions in attrition correction methods whenever possible and, when verification is not possible, using these methods with caution.</p>","PeriodicalId":47533,"journal":{"name":"Evaluation Review","volume":"45 1-2","pages":"70-104"},"PeriodicalIF":3.0000,"publicationDate":"2021-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations.\",\"authors\":\"Kaitlin Anderson, Gema Zamarro, Jennifer Steele, Trey Miller\",\"doi\":\"10.1177/0193841X211034363\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p><b>Background:</b> In randomized controlled trials, attrition rates often differ by treatment status, jeopardizing causal inference. Inverse probability weighting methods and estimation of treatment effect bounds have been used to adjust for this bias. <b>Objectives:</b> We compare the performance of various methods within two samples, both generated through lottery-based randomization: one with considerable differential attrition and an augmented dataset with less problematic attrition. <b>Research Design:</b> We assess the performance of various correction methods within the dataset with problematic attrition. In addition, we conduct simulation analyses. <b>Results:</b> Within the more problematic dataset, we find the correction methods often performed poorly. Simulation analyses indicate that deviations from the underlying assumptions for bounding approaches damage the performance of estimated bounds. 
<b>Conclusions:</b> We recommend the verification of the underlying assumptions in attrition correction methods whenever possible and, when verification is not possible, using these methods with caution.</p>\",\"PeriodicalId\":47533,\"journal\":{\"name\":\"Evaluation Review\",\"volume\":\"45 1-2\",\"pages\":\"70-104\"},\"PeriodicalIF\":3.0000,\"publicationDate\":\"2021-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Evaluation Review\",\"FirstCategoryId\":\"90\",\"ListUrlMain\":\"https://doi.org/10.1177/0193841X211034363\",\"RegionNum\":4,\"RegionCategory\":\"社会学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"SOCIAL SCIENCES, INTERDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evaluation Review","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.1177/0193841X211034363","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"SOCIAL SCIENCES, INTERDISCIPLINARY","Score":null,"Total":0}
Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations.
Background: In randomized controlled trials, attrition rates often differ by treatment status, jeopardizing causal inference. Inverse probability weighting methods and estimation of treatment effect bounds have been used to adjust for this bias.

Objectives: We compare the performance of various methods within two samples, both generated through lottery-based randomization: one with considerable differential attrition and an augmented dataset with less problematic attrition.

Research Design: We assess the performance of various correction methods within the dataset with problematic attrition. In addition, we conduct simulation analyses.

Results: Within the more problematic dataset, we find the correction methods often performed poorly. Simulation analyses indicate that deviations from the underlying assumptions for bounding approaches damage the performance of estimated bounds.

Conclusions: We recommend the verification of the underlying assumptions in attrition correction methods whenever possible and, when verification is not possible, using these methods with caution.
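To illustrate the inverse probability weighting (IPW) approach the abstract refers to, the sketch below simulates a randomized trial with differential attrition and shows how weighting responders by the inverse of their response probability can remove the resulting bias. This is not the authors' code or data: the data-generating process, parameter values, and the use of the true (rather than estimated) response probabilities are all illustrative assumptions.

```python
import numpy as np

# Illustrative simulation: differential attrition biases the naive
# complete-case estimate; IPW corrects it. All parameters are made up.
rng = np.random.default_rng(0)
n = 200_000

x = rng.normal(size=n)                      # baseline covariate
t = rng.integers(0, 2, size=n)              # randomized treatment (0/1)
y = 1.0 * t + 2.0 * x + rng.normal(size=n)  # true treatment effect = 1.0

# Differential attrition: treated units with low x are more likely to
# drop out, while control response does not depend on x.
p_respond = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x * t)))
observed = rng.random(n) < p_respond

# Naive complete-case difference in means: biased upward, because the
# retained treated units have systematically higher x (and hence y).
naive = y[observed & (t == 1)].mean() - y[observed & (t == 0)].mean()

# IPW correction: weight each responder by 1 / Pr(respond). Here the
# probabilities are known by construction; in practice they would be
# estimated, e.g. via a logistic regression of `observed` on x and t.
w = 1.0 / p_respond
treated = observed & (t == 1)
control = observed & (t == 0)
ipw = (np.average(y[treated], weights=w[treated])
       - np.average(y[control], weights=w[control]))
```

As the abstract cautions, this correction is only valid under its underlying assumption: attrition must depend only on variables used to model the response probability (here, x and t), not on unobservables.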
Journal description:
Evaluation Review is the forum for researchers, planners, and policy makers engaged in the development, implementation, and utilization of studies aimed at the betterment of the human condition. The Editors invite submission of papers reporting the findings of evaluation studies in such fields as child development, health, education, income security, manpower, mental health, criminal justice, and the physical and social environments. In addition, Evaluation Review will contain articles on methodological developments, discussions of the state of the art, and commentaries on issues related to the application of research results. Special features will include periodic review essays, "research briefs", and "craft reports".