Multi-level sparse network lasso: Locally sparse learning with flexible sample clusters

Neurocomputing · IF 6.5 · JCR Q1 (Computer Science, Artificial Intelligence) · CAS Region 2 (Computer Science) · Pub Date: 2025-06-28 · Epub Date: 2025-03-11 · DOI: 10.1016/j.neucom.2025.129898
Luhuan Fei, Xinyi Wang, Jiankun Wang, Lu Sun, Yuyao Zhang
{"title":"Multi-level sparse network lasso: Locally sparse learning with flexible sample clusters","authors":"Luhuan Fei ,&nbsp;Xinyi Wang ,&nbsp;Jiankun Wang ,&nbsp;Lu Sun ,&nbsp;Yuyao Zhang","doi":"10.1016/j.neucom.2025.129898","DOIUrl":null,"url":null,"abstract":"<div><div>Traditional learning usually assumes that all samples share the same global model, which fails to preserve critical local information for heterogeneous data. It can be tackled by detecting sample clusters and learning sample-specific models but is limited to sample-level clustering and sample-specific feature selection. In this paper, we propose multi-level sparse network lasso (MSN Lasso) for flexible local learning. It multiplicatively decomposes model parameters into two components: One component is for coarse-grained group-level, and another is for fine-grained entry-level. At the clustering stage, MSN Lasso simultaneously groups samples (group-level) and clusters specific features across samples (entry-level). At the feature selection stage, it enables both across-sample (group-level) and sample-specific (entry-level) feature selection. Theoretical analysis reveals a potential equivalence to a jointly regularized local model, which informs the development of an efficient algorithm. A divide-and-conquer optimization strategy is further introduced to enhance the algorithm’s efficiency. Extensive experiments across diverse datasets demonstrate that MSN Lasso outperforms existing methods and exhibits greater flexibility.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"635 ","pages":"Article 129898"},"PeriodicalIF":6.5000,"publicationDate":"2025-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225005703","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/3/11 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Traditional learning usually assumes that all samples share a single global model, which fails to preserve critical local information in heterogeneous data. This can be addressed by detecting sample clusters and learning sample-specific models, but existing approaches are limited to sample-level clustering and sample-specific feature selection. In this paper, we propose multi-level sparse network lasso (MSN Lasso) for flexible local learning. It multiplicatively decomposes the model parameters into two components: one captures coarse-grained, group-level structure, and the other captures fine-grained, entry-level structure. At the clustering stage, MSN Lasso simultaneously groups samples (group level) and clusters individual features across samples (entry level). At the feature selection stage, it enables both across-sample (group-level) and sample-specific (entry-level) feature selection. Theoretical analysis reveals a potential equivalence to a jointly regularized local model, which informs the development of an efficient algorithm. A divide-and-conquer optimization strategy further improves the algorithm's efficiency. Extensive experiments on diverse datasets demonstrate that MSN Lasso outperforms existing methods and exhibits greater flexibility.
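
To make the multiplicative decomposition concrete, the following is a minimal sketch of one plausible form of the objective. It assumes a standard network-lasso sum-of-norms penalty on the group-level component and an elementwise L1 penalty on the entry-level component; the loss \ell, the components g_i and v_i, the sample graph (\mathcal{E}, a_{ij}), and the weights \lambda_1, \lambda_2 are illustrative notation of ours, not necessarily the paper's exact formulation.

% Illustrative objective (assumed form). Each sample i carries parameters
% w_i = g_i \odot v_i: g_i is the coarse-grained, group-level component,
% v_i the fine-grained, entry-level component.
\min_{\{g_i, v_i\}_{i=1}^{n}} \;
  \sum_{i=1}^{n} \ell\bigl(y_i,\, x_i^{\top} w_i\bigr)
  \;+\; \lambda_1 \sum_{(i,j) \in \mathcal{E}} a_{ij}\, \lVert g_i - g_j \rVert_2
  \;+\; \lambda_2 \sum_{i=1}^{n} \lVert v_i \rVert_1,
  \qquad \text{where } w_i = g_i \odot v_i.

Under this assumed form, the sum-of-norms term fuses the group-level components g_i of neighboring samples into clusters, and samples in a fused cluster share a sparsity pattern, which yields across-sample feature selection; the L1 term zeroes individual entries of v_i, which yields sample-specific feature selection. It would also be consistent with a divide-and-conquer strategy: once the graph separates into connected components, each component can be optimized independently.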
Source Journal
Neurocomputing (Engineering & Technology; Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles published: 1382
Review time: 70 days
Aims and scope: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.