{"title":"A general adaptive unsupervised feature selection with auto-weighting","authors":"Huming Liao , Hongmei Chen , Tengyu Yin , Zhong Yuan , Shi-Jinn Horng , Tianrui Li","doi":"10.1016/j.neunet.2024.106840","DOIUrl":null,"url":null,"abstract":"<div><div>Feature selection (FS) is essential in machine learning and data mining as it makes handling high-dimensional data more efficient and reliable. More attention has been paid to unsupervised feature selection (UFS) due to the extra resources required to obtain labels for data in the real world. Most of the existing embedded UFS utilize a sparse projection matrix for FS. However, this may introduce additional regularization terms, and it is difficult to control the sparsity of the projection matrix well. Moreover, such methods may seriously destroy the original feature structure in the embedding space. Instead, avoiding projecting the original data into the low-dimensional embedding space and identifying features directly from the raw features that perform well in the process of making the data show a distinct cluster structure is a feasible solution. Inspired by this, this paper proposes a model called A General Adaptive Unsupervised Feature Selection with Auto-weighting (GAWFS), which utilizes two techniques, non-negative matrix factorization, and adaptive graph learning, to simulate the process of dividing data into clusters, and identifies the features that are most discriminative in the clustering process by a feature weighting matrix <span><math><mi>Θ</mi></math></span>. Since the weighting matrix is sparse, it also plays the role of FS or a filter. Finally, experiments comparing GAWFS with several state-of-the-art UFS methods on synthetic datasets and real-world datasets are conducted, and the results demonstrate the superiority of the GAWFS.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"181 ","pages":"Article 106840"},"PeriodicalIF":6.0000,"publicationDate":"2024-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024007640","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Feature selection (FS) is essential in machine learning and data mining because it makes handling high-dimensional data more efficient and reliable. Unsupervised feature selection (UFS) has received growing attention because obtaining labels for real-world data requires extra resources. Most existing embedded UFS methods rely on a sparse projection matrix for FS. However, this may introduce additional regularization terms, and the sparsity of the projection matrix is difficult to control well. Moreover, such methods may seriously damage the original feature structure in the embedding space. A feasible alternative is to avoid projecting the original data into a low-dimensional embedding space and instead identify, directly among the raw features, those that best help the data exhibit a distinct cluster structure. Inspired by this, this paper proposes a model called A General Adaptive Unsupervised Feature Selection with Auto-weighting (GAWFS), which uses two techniques, non-negative matrix factorization and adaptive graph learning, to simulate the process of dividing the data into clusters, and identifies the features that are most discriminative during clustering through a feature weighting matrix Θ. Since the weighting matrix is sparse, it also acts as a feature selector, i.e., a filter. Finally, experiments comparing GAWFS with several state-of-the-art UFS methods on synthetic and real-world datasets are conducted, and the results demonstrate the superiority of GAWFS.
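The abstract describes the method only at a high level, so the snippet below is a minimal illustrative sketch of the general idea rather than the authors' algorithm: weight each raw feature with a diagonal weighting matrix Θ, factorize the weighted data in NMF style as a clustering surrogate, and re-estimate the feature weights from how well each feature fits the factorization, then keep the highest-weighted features. The function name gawfs_sketch, the residual-based weight update, and all hyperparameters are assumptions made for illustration; the adaptive graph learning term used in the actual GAWFS model is omitted here.

```python
import numpy as np

def gawfs_sketch(X, n_clusters, n_selected, n_iter=100, seed=0):
    """Illustrative sketch: NMF-style clustering surrogate with a
    per-feature weight vector (the diagonal of Theta) used for
    unsupervised feature selection. Not the authors' exact objective.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    X = np.abs(X)                          # NMF requires non-negative data
    theta = np.full(d, 1.0 / d)            # feature weights (diagonal of Theta)
    W = rng.random((n, n_clusters))        # cluster-indicator-like factor
    H = rng.random((n_clusters, d))        # centroid-like factor
    eps = 1e-10

    for _ in range(n_iter):
        Xw = X * theta                     # apply current feature weights
        # standard multiplicative NMF updates on the weighted data
        W *= (Xw @ H.T) / (W @ H @ H.T + eps)
        H *= (W.T @ Xw) / (W.T @ W @ H + eps)
        # assumed re-weighting rule: features with small reconstruction
        # residual (i.e., consistent with the cluster structure) get
        # larger weights; normalize so the weights stay on the simplex
        resid = np.linalg.norm(Xw - W @ H, axis=0)
        theta = 1.0 / (resid + eps)
        theta /= theta.sum()

    selected = np.argsort(theta)[::-1][:n_selected]
    return selected, theta
```

A hypothetical call would be selected, theta = gawfs_sketch(X, n_clusters=5, n_selected=50) on a data matrix X of shape (samples, features); the sparse, auto-learned weights then act as the filter described in the abstract.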
Journal Introduction
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.