Addressing Common Analytic Challenges to Randomized Experiments in MOOCs: Attrition and Zero-Inflation

Anne Lamb, Jascha Smilack, Andrew D. Ho, J. Reich

Proceedings of the Second (2015) ACM Conference on Learning @ Scale, March 14, 2015
DOI: 10.1145/2724660.2724669 (https://doi.org/10.1145/2724660.2724669)
Citations: 28
Abstract
Massive open online course (MOOC) platforms increasingly allow easily implemented randomized experiments. The heterogeneity of MOOC students, however, leads to two methodological obstacles in analyzing interventions to increase engagement. (1) Many MOOC participation metrics have distributions with substantial positive skew from highly active users as well as zero-inflation from high attrition. (2) High attrition means that in some experimental designs, most users assigned to the treatment never receive it; analyses that do not consider attrition result in "intent-to-treat" (ITT) estimates that underestimate the true effects of interventions. We address these challenges in analyzing an intervention to improve forum participation in the 2014 JusticeX course offered on the edX MOOC platform. We compare the results of four ITT models (OLS, logistic, quantile, and zero-inflated negative binomial regressions) and three "treatment-on-treated" (TOT) models (Wald estimator, 2SLS with a second-stage logistic model, and instrumental variables quantile regression). A combination of logistic, quantile, and zero-inflated negative binomial regressions provides the most comprehensive description of the ITT effects. TOT methods then adjust the ITT underestimates. Substantively, we demonstrate that self-assessment questions about forum participation encourage more students to engage in forums and increase the participation of already active students.
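The ITT/TOT distinction the abstract describes can be illustrated with a minimal sketch of the Wald estimator. Everything below is hypothetical: the sample size, the 25% receipt rate, and the posting probabilities are invented for illustration and are not the paper's data. Under one-sided noncompliance (controls cannot receive the treatment), the Wald/TOT estimate is simply the ITT effect divided by the estimated compliance rate.

```python
# Synthetic illustration of ITT vs. TOT (Wald) estimation under one-sided
# noncompliance. All numbers are made up; nothing here reproduces the paper.
import random

random.seed(0)
n = 10_000

assigned, received_flags, posts = [], [], []
for i in range(n):
    z = i % 2                                  # alternate assignment: balanced arms
    # Only 25% of assigned users ever "receive" the intervention
    # (e.g., reach the course unit containing it) -- assumed rate.
    r = (z == 1) and (random.random() < 0.25)
    # Assumed effect: receipt raises forum-post probability from 0.10 to 0.30.
    p = 0.30 if r else 0.10
    assigned.append(z)
    received_flags.append(r)
    posts.append(1 if random.random() < p else 0)

treat = [y for y, z in zip(posts, assigned) if z == 1]
ctrl  = [y for y, z in zip(posts, assigned) if z == 0]

# ITT: compare by *assignment*, ignoring whether the treatment was received.
itt = sum(treat) / len(treat) - sum(ctrl) / len(ctrl)

# Compliance rate among the assigned (receipt is impossible in the control arm).
compliance = sum(r for r, z in zip(received_flags, assigned) if z == 1) / len(treat)

# Wald/TOT estimator: rescale the ITT effect by the compliance rate.
tot = itt / compliance
print(f"ITT = {itt:.3f}, compliance = {compliance:.3f}, TOT = {tot:.3f}")
```

With these assumed parameters the ITT effect is roughly 0.25 × 0.20 ≈ 0.05, while the TOT estimate recovers something near the 0.20 effect among actual recipients, which is the sense in which naive ITT analyses "underestimate the true effects of interventions."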