Group variable selection via group sparse neural network

IF 1.5 · JCR Q3, Region 3 (Mathematics) · Computer Science, Interdisciplinary Applications · Computational Statistics & Data Analysis · Pub Date: 2023-12-29 · DOI: 10.1016/j.csda.2023.107911
Xin Zhang , Junlong Zhao
Citations: 0

Abstract

Group variable selection is an important issue in high-dimensional data modeling, and most existing methods consider only the linear model. Therefore, a new method based on the deep neural network (DNN), an increasingly popular nonlinear method in both the statistics and deep learning communities, is proposed. The method is applicable to general nonlinear models, including the linear model as a special case. Specifically, a group sparse neural network (GSNN) is designed, in which the definition of nonlinear group high-level features (NGHFs) is generalized to the network structure. A two-stage group sparse (TGS) algorithm is employed to induce group variable selection by performing group structure selection on the network. GSNN is promising for complex nonlinear systems with interactions and correlated predictors, overcoming the shortcomings of linear or marginal variable selection methods. Theoretical results on convergence and group-level selection consistency are also given. Simulation results and real data analysis demonstrate the superiority of our method.
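The core idea the abstract describes, selecting or discarding entire groups of input variables by imposing group sparsity on a network's first-layer weights, can be illustrated with a small sketch. This is not the paper's TGS algorithm; it is a generic group-lasso proximal step (block soft-thresholding) on a first-layer weight matrix, with the function names and the √|g| group scaling chosen here for illustration only.

```python
import numpy as np

def group_lasso_penalty(W, groups, lam=1.0):
    """Group-lasso penalty on a (p, h) first-layer weight matrix W.

    Rows of W correspond to the p input variables; `groups` is a list of
    index arrays partitioning {0, ..., p-1} into variable groups. Each
    group's block is penalized by its Frobenius norm, scaled by sqrt of
    the group size (a common convention, assumed here).
    """
    return lam * sum(np.sqrt(len(g)) * np.linalg.norm(W[g]) for g in groups)

def prox_group(W, groups, step=1.0, lam=1.0):
    """One proximal (block soft-thresholding) step for the penalty above.

    Groups whose weight block has small norm are set exactly to zero,
    which removes the whole variable group from the network; the rest
    are shrunk toward zero but kept. This is how a group-sparse penalty
    on the input layer induces group variable selection.
    """
    W = W.copy()
    for g in groups:
        norm = np.linalg.norm(W[g])
        thresh = step * lam * np.sqrt(len(g))
        if norm <= thresh:
            W[g] = 0.0          # entire group eliminated
        else:
            W[g] *= 1.0 - thresh / norm  # group kept, block shrunk
    return W
```

In a training loop, such a step would alternate with gradient updates of the network loss; groups whose first-layer blocks are driven exactly to zero are the deselected variable groups.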

Source Journal

Computational Statistics & Data Analysis (Mathematics / Computer Science, Interdisciplinary Applications)

CiteScore: 3.70
Self-citation rate: 5.60%
Articles per year: 167
Review time: 60 days
Journal Description: Computational Statistics and Data Analysis (CSDA), an Official Publication of the network Computational and Methodological Statistics (CMStatistics) and of the International Association for Statistical Computing (IASC), is an international journal dedicated to the dissemination of methodological research and applications in the areas of computational statistics and data analysis. The journal consists of four refereed sections, divided into the following subject areas:

I) Computational Statistics - Manuscripts dealing with: 1) the explicit impact of computers on statistical methodology (e.g., Bayesian computing, bioinformatics, computer graphics, computer-intensive inferential methods, data exploration, data mining, expert systems, heuristics, knowledge-based systems, machine learning, neural networks, numerical and optimization methods, parallel computing, statistical databases, statistical systems), and 2) the development, evaluation, and validation of statistical software and algorithms. Software and algorithms can be submitted with manuscripts and will be stored together with the online article.

II) Statistical Methodology for Data Analysis - Manuscripts dealing with novel and original data-analytical strategies and methodologies applied in biostatistics (design and analytic methods for clinical trials, epidemiological studies, statistical genetics, or genetic/environmental interactions), chemometrics, classification, data exploration, density estimation, design of experiments, environmetrics, education, image analysis, marketing, model-free data exploration, pattern recognition, psychometrics, statistical physics, image processing, robust procedures. [...]

III) Special Applications - [...]

IV) Annals of Statistical Data Science [...]
Latest Articles in This Journal

Goodness-of-fit tests based on the min-characteristic function
Editorial Board
A switching state-space transmission model for tracking epidemics and assessing interventions
Empirical Bayes Poisson matrix completion
Transfer learning via random forests: A one-shot federated approach