Estimation and group-feature selection in sparse mixture-of-experts with diverging number of parameters

Journal of Statistical Planning and Inference · IF 0.8 · CAS Tier 4 (Mathematics) · JCR Q3 (Statistics & Probability) · Pub Date: 2024-11-19 · DOI: 10.1016/j.jspi.2024.106250
Abbas Khalili, Archer Yi Yang, Xiaonan Da
{"title":"Estimation and group-feature selection in sparse mixture-of-experts with diverging number of parameters","authors":"Abbas Khalili ,&nbsp;Archer Yi Yang ,&nbsp;Xiaonan Da","doi":"10.1016/j.jspi.2024.106250","DOIUrl":null,"url":null,"abstract":"<div><div>Mixture-of-experts provide flexible statistical models for a wide range of regression (supervised learning) problems. Often a large number of covariates (features) are available in many modern applications yet only a small subset of them is useful in explaining a response variable of interest. This calls for a feature selection device. In this paper, we present new group-feature selection and estimation methods for sparse mixture-of-experts models when the number of features can be nearly comparable to the sample size. We prove the consistency of the methods in both parameter estimation and feature selection. We implement the methods using a modified EM algorithm combined with proximal gradient method which results in a convenient closed-form parameter update in the M-step of the algorithm. We examine the finite-sample performance of the methods through simulations, and demonstrate their applications in a real data example on exploring relationships in body measurements.</div></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"237 ","pages":"Article 106250"},"PeriodicalIF":0.8000,"publicationDate":"2024-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375824001071","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0

Abstract

Mixture-of-experts models provide flexible statistical models for a wide range of regression (supervised learning) problems. In many modern applications a large number of covariates (features) is available, yet only a small subset of them is useful in explaining the response variable of interest. This calls for a feature-selection device. In this paper, we present new group-feature selection and estimation methods for sparse mixture-of-experts models when the number of features can be nearly comparable to the sample size. We prove the consistency of the methods in both parameter estimation and feature selection. We implement the methods using a modified EM algorithm combined with a proximal gradient method, which yields a convenient closed-form parameter update in the M-step of the algorithm. We examine the finite-sample performance of the methods through simulations, and demonstrate their application in a real-data example exploring relationships among body measurements.
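For context, the model class the abstract refers to is commonly written as a softmax-gated mixture of Gaussian experts; the display below is a standard formulation of that class, and the paper's exact parameterization may differ:

$$
f(y \mid \mathbf{x}) = \sum_{k=1}^{K} \pi_k(\mathbf{x})\,\phi\!\left(y;\ \mathbf{x}^{\top}\boldsymbol{\beta}_k,\ \sigma_k^{2}\right),
\qquad
\pi_k(\mathbf{x}) = \frac{\exp(\mathbf{x}^{\top}\boldsymbol{\alpha}_k)}{\sum_{l=1}^{K}\exp(\mathbf{x}^{\top}\boldsymbol{\alpha}_l)},
$$

where $\phi(\cdot;\mu,\sigma^2)$ is the normal density. Group-feature selection collects, for each feature $j$, all coefficients involving that feature (across the gating vectors $\boldsymbol{\alpha}_k$ and expert vectors $\boldsymbol{\beta}_k$) into one group $\boldsymbol{\theta}_{(j)}$ and penalizes $\lambda \sum_j \|\boldsymbol{\theta}_{(j)}\|_2$, so a feature enters or leaves the whole model at once.

The closed-form M-step update mentioned in the abstract is characteristic of proximal gradient steps under such a group penalty, whose proximal operator is block soft-thresholding. The sketch below is a minimal illustration of that mechanism, not the authors' implementation; the function names, the grouping scheme, and the step-size choice are assumptions.

```python
import numpy as np

def group_soft_threshold(v, tau):
    """Proximal operator of tau * ||v||_2 (block soft-thresholding).

    Shrinks the coefficient group v toward zero, and sets it exactly
    to zero when its norm falls below tau -- the closed-form update
    that produces group-level sparsity.
    """
    norm = np.linalg.norm(v)
    if norm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / norm) * v

def proximal_gradient_step(theta, grad, step, lam, groups):
    """One proximal gradient update on grouped parameters.

    theta  : 1-D array of all coefficients
    grad   : gradient of the smooth part of the M-step objective at theta
    step   : step size of the gradient step
    lam    : group-lasso penalty level
    groups : list of index arrays, one per feature group; indices not in
             any group (e.g. intercepts) are left unpenalized
    """
    z = theta - step * grad              # gradient step on the smooth part
    out = z.copy()
    for idx in groups:                   # closed-form prox, group by group
        out[idx] = group_soft_threshold(z[idx], step * lam)
    return out

# Toy usage: two penalized groups plus an unpenalized intercept at index 0.
theta = np.array([0.5, 1.2, -0.7, 0.05, -0.02])
grad = np.array([0.1, -0.3, 0.2, 0.01, 0.02])
groups = [np.array([1, 2]), np.array([3, 4])]
theta_new = proximal_gradient_step(theta, grad, step=0.5, lam=0.4, groups=groups)
# The weak group at indices [3, 4] is thresholded to exactly zero.
```

In a full algorithm of the kind the abstract describes, the E-step would compute component responsibilities and the M-step would apply updates like this one to the penalized expected complete-data log-likelihood.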
Source Journal: Journal of Statistical Planning and Inference (Mathematics, Statistics & Probability)
CiteScore: 2.10
Self-citation rate: 11.10%
Annual articles: 78
Review time: 3-6 weeks
Journal Introduction: The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability, and the emerging interdisciplinary aspects that have the potential of revolutionizing the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large-sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists. We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.
Latest Articles from This Journal
Estimation and group-feature selection in sparse mixture-of-experts with diverging number of parameters
Semi-parametric empirical likelihood inference on quantile difference between two samples with length-biased and right-censored data
Sieve estimation of the accelerated mean model based on panel count data
The proximal bootstrap for constrained estimators
Testing the equality of distributions using integrated maximum mean discrepancy