Stratified learning: A general‐purpose statistical method for improved learning under covariate shift

Statistical Analysis and Data Mining · IF 2.1 · JCR Q3 (Computer Science, Artificial Intelligence) · CAS Tier 4 (Mathematics) · Pub Date: 2023-09-29 · DOI: 10.1002/sam.11643
Maximilian Autenrieth, David A. Van Dyk, Roberto Trotta, David C. Stenning
{"title":"Stratified learning: A general‐purpose statistical method for improved learning under covariate shift","authors":"Maximilian Autenrieth, David A. Van Dyk, Roberto Trotta, David C. Stenning","doi":"10.1002/sam.11643","DOIUrl":null,"url":null,"abstract":"Abstract We propose a simple, statistically principled, and theoretically justified method to improve supervised learning when the training set is not representative, a situation known as covariate shift. We build upon a well‐established methodology in causal inference and show that the effects of covariate shift can be reduced or eliminated by conditioning on propensity scores. In practice, this is achieved by fitting learners within strata constructed by partitioning the data based on the estimated propensity scores, leading to approximately balanced covariates and much‐improved target prediction. We refer to the overall method as Stratified Learning, or StratLearn . We demonstrate the effectiveness of this general‐purpose method on two contemporary research questions in cosmology, outperforming state‐of‐the‐art importance weighting methods. We obtain the best‐reported AUC (0.958) on the updated “Supernovae photometric classification challenge,” and we improve upon existing conditional density estimation of galaxy redshift from Sloan Digital Sky Survey (SDSS) data.","PeriodicalId":48684,"journal":{"name":"Statistical Analysis and Data Mining","volume":"44 1","pages":"0"},"PeriodicalIF":2.1000,"publicationDate":"2023-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistical Analysis and Data Mining","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/sam.11643","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 1

Abstract

We propose a simple, statistically principled, and theoretically justified method to improve supervised learning when the training set is not representative, a situation known as covariate shift. We build upon a well‐established methodology in causal inference and show that the effects of covariate shift can be reduced or eliminated by conditioning on propensity scores. In practice, this is achieved by fitting learners within strata constructed by partitioning the data based on the estimated propensity scores, leading to approximately balanced covariates and much‐improved target prediction. We refer to the overall method as Stratified Learning, or StratLearn. We demonstrate the effectiveness of this general‐purpose method on two contemporary research questions in cosmology, outperforming state‐of‐the‐art importance weighting methods. We obtain the best‐reported AUC (0.958) on the updated “Supernovae photometric classification challenge,” and we improve upon existing conditional density estimation of galaxy redshift from Sloan Digital Sky Survey (SDSS) data.
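The abstract's core recipe (estimate propensity scores that distinguish source from target data, cut the pooled data into strata along those scores, and fit a separate learner within each stratum) can be sketched in a few lines. The following is a minimal Python illustration assuming scikit-learn-style estimators; the function name stratified_learning, the logistic-regression propensity model, the random-forest within-stratum learner, and the choice of five equal-frequency strata are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of propensity-score stratification under covariate shift.
# Estimator choices and the number of strata are assumptions for illustration,
# not the configuration used in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestRegressor

def stratified_learning(X_source, y_source, X_target, n_strata=5):
    """Fit learners within propensity-score strata and predict on the target set."""
    # 1. Propensity scores: probability that a point belongs to the target set,
    #    estimated from covariates only (labels are not needed for this step).
    X_all = np.vstack([X_source, X_target])
    membership = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    ps_model = LogisticRegression(max_iter=1000).fit(X_all, membership)
    ps_source = ps_model.predict_proba(X_source)[:, 1]
    ps_target = ps_model.predict_proba(X_target)[:, 1]

    # 2. Strata: equal-frequency bins of the estimated propensity scores,
    #    with cut points computed on the pooled scores so that source and
    #    target points share the same stratum boundaries.
    cuts = np.quantile(np.concatenate([ps_source, ps_target]),
                       np.linspace(0, 1, n_strata + 1)[1:-1])
    strat_source = np.digitize(ps_source, cuts)
    strat_target = np.digitize(ps_target, cuts)

    # 3. Fit one learner per stratum on the source data in that stratum,
    #    then predict the target points assigned to the same stratum.
    y_pred = np.empty(len(X_target))
    for s in range(n_strata):
        src_mask, tgt_mask = strat_source == s, strat_target == s
        if src_mask.sum() == 0 or tgt_mask.sum() == 0:
            continue  # empty stratum; in practice one would merge with a neighbor
        learner = RandomForestRegressor().fit(X_source[src_mask], y_source[src_mask])
        y_pred[tgt_mask] = learner.predict(X_target[tgt_mask])
    return y_pred
```

Because source and target points falling in the same stratum have, by construction, similar propensity scores, their covariates are approximately balanced within each stratum, which is what mitigates the covariate shift when the stratum-specific learners are applied to the target set.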
Source journal
Statistical Analysis and Data Mining
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
CiteScore: 3.20
Self-citation rate: 7.70%
Articles published: 43
Journal description: Statistical Analysis and Data Mining addresses the broad area of data analysis, including statistical approaches, machine learning, data mining, and applications. Topics include statistical and computational approaches for analyzing massive and complex datasets, novel statistical and/or machine learning methods and theory, and state-of-the-art applications with high impact. Of special interest are articles that describe innovative analytical techniques and discuss their application to real problems, in such a way that they are accessible and beneficial to domain experts across science, engineering, and commerce. The focus of the journal is on papers that satisfy one or more of the following criteria: solve data analysis problems associated with massive, complex datasets; develop innovative statistical approaches, machine learning algorithms, or methods integrating ideas across disciplines, e.g., statistics, computer science, electrical engineering, operations research; formulate and solve high-impact real-world problems that challenge existing paradigms via new statistical and/or computational models; or provide surveys of prominent research topics.
Latest articles from this journal
Boosting diversity in regression ensembles
Multivariate contaminated normal mixture regression modeling of longitudinal data based on joint mean-covariance model
A machine learning oracle for parameter estimation
The generalized hyperbolic family and automatic model selection through the multiple-choice LASSO
Spatially-correlated time series clustering using location-dependent Dirichlet process mixture model