Joint subspace learning and subspace clustering based unsupervised feature selection

Zijian Xiao, Hongmei Chen, Yong Mi, Chuan Luo, Shi-Jinn Horng, Tianrui Li

Neurocomputing, Volume 635, Article 129885. Published 2025-03-12. DOI: 10.1016/j.neucom.2025.129885
Citations: 0
Abstract
Unsupervised feature selection (UFS) has become a focal point of extensive research due to its ability to reduce the dimensionality of unlabeled data. Currently, many UFS methods based on subspace learning embed multiple graph regularization terms to preserve the local similarity structure of samples or features, but rarely explore the global structure at the same time, such as the self-representation structure between features and the potential clustering structure of samples. We propose a novel UFS model based on joint subspace learning and subspace orthogonal basis clustering (JSLSC) to address this problem. First, through robust subspace learning, JSLSC explores the self-representation information between the selected features and the original feature space; the features' local and global structures are learned through feature selection and self-representation structure learning. Second, orthogonal basis clustering is introduced to learn the potential clustering structure in the low-dimensional sample space, thus enabling subspace clustering. Third, hard-constrained graph structure learning is introduced to adaptively maintain the local structural consistency between low-dimensional samples and the original samples. Finally, an optimization algorithm and a convergence proof are given, and the superiority of JSLSC is demonstrated through comparative experiments on nine real-world datasets.
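The feature self-representation idea underlying the first step can be illustrated with a minimal sketch: rank features by how strongly they participate in reconstructing the full feature space under a row-sparse coefficient matrix, i.e. minimize ||X - XW||_F^2 + λ||W||_{2,1} and score features by the row norms of W. This is a simplified, hypothetical illustration of self-representation-based UFS solved by iteratively reweighted least squares, not the full JSLSC model (which additionally performs orthogonal basis clustering and hard-constrained graph learning).

```python
import numpy as np

def self_representation_ufs(X, lam=1.0, n_iter=50, eps=1e-8):
    """Rank features via a feature-level self-representation model:
        min_W ||X - X W||_F^2 + lam * ||W||_{2,1},
    solved by iteratively reweighted least squares (IRLS).
    Returns feature indices sorted from most to least important.
    """
    n, d = X.shape
    G = X.T @ X                    # d x d Gram matrix of features
    D = np.eye(d)                  # IRLS reweighting matrix for the l2,1 term
    for _ in range(n_iter):
        # Stationarity condition: (G + lam * D) W = G
        W = np.linalg.solve(G + lam * D, G)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    scores = np.sqrt((W ** 2).sum(axis=1))  # l2 norm of each row of W
    return np.argsort(-scores)

# Toy usage: the last feature is a near-copy of the first,
# so the representation structure between features is easy to see.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X = np.hstack([X, X[:, :1] + 0.01 * rng.normal(size=(100, 1))])
ranking = self_representation_ufs(X, lam=0.1)
```

The row-sparsity of the ℓ2,1 penalty is what makes the coefficient matrix usable for selection: features whose rows shrink toward zero contribute little to reconstructing the others and can be discarded.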
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The journal covers neurocomputing theory, practice, and applications.