{"title":"Partial multi-label feature selection with feature noise","authors":"You Wu , Peipei Li , Yizhang Zou","doi":"10.1016/j.patcog.2024.111310","DOIUrl":null,"url":null,"abstract":"<div><div>As the dimensionality of multi-label data continues to increase, feature selection has become increasingly prevalent in multi-label learning, serving as an efficient and interpretable means of dimensionality reduction. However, existing multi-label feature selection algorithms often assume data to be noise-free, which cannot hold in real-world applications where feature and label noise are frequently encountered. Therefore, we propose a novel partial multi-label feature selection algorithm, which aims to effectively select an optimal subset of features in the environment plagued by feature noise and partial multi-label. Specifically, we first propose a robust label enhancement model to diminish noise interference and enrich the semantic information of labels. Subsequently, a sparse reconstruction is utilized to learn the instance relevance information and then applied to the smoothness assumption to obtain more accurate label distributions. Additionally, we employ the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn><mo>,</mo><mn>1</mn></mrow></msub></math></span>-norm to eliminate irrelevant features and constrain the model complexity. Finally, the above processing is optimized end-to-end within a unified objective function. 
Experimental results demonstrate that our algorithm outperforms several state-of-the-art feature selection methods across 15 datasets.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"162 ","pages":"Article 111310"},"PeriodicalIF":7.5000,"publicationDate":"2025-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320324010616","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
As the dimensionality of multi-label data continues to increase, feature selection has become increasingly prevalent in multi-label learning, serving as an efficient and interpretable means of dimensionality reduction. However, existing multi-label feature selection algorithms often assume the data to be noise-free, an assumption that rarely holds in real-world applications, where both feature noise and label noise are frequently encountered. We therefore propose a novel partial multi-label feature selection algorithm that aims to select an optimal subset of features in environments affected by both feature noise and partial labels. Specifically, we first propose a robust label enhancement model to diminish noise interference and enrich the semantic information of labels. Subsequently, sparse reconstruction is used to learn instance-relevance information, which is then combined with the smoothness assumption to obtain more accurate label distributions. Additionally, we employ the ℓ2,1-norm to eliminate irrelevant features and constrain model complexity. Finally, all of the above steps are optimized end-to-end within a unified objective function. Experimental results demonstrate that our algorithm outperforms several state-of-the-art feature selection methods across 15 datasets.
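To illustrate the role of the ℓ2,1-norm regularizer mentioned in the abstract, here is a minimal sketch (not the authors' actual optimization procedure): the ℓ2,1-norm of a feature-weight matrix W sums the ℓ2 norms of its rows, so penalizing it drives entire rows toward zero, and features whose rows survive with large norms are the ones selected. The function and variable names below are illustrative assumptions.

```python
import numpy as np

def l21_norm(W):
    # ℓ2,1-norm: sum over rows of the ℓ2 norm of each row.
    # Penalizing this encourages row-wise sparsity, i.e. whole
    # features (rows of W) being zeroed out.
    return float(np.sum(np.linalg.norm(W, axis=1)))

def select_features(W, k):
    # Score each feature by the ℓ2 norm of its weight row and
    # keep the k highest-scoring features; rows driven to zero
    # by the ℓ2,1 penalty are treated as irrelevant.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy example: 6 features, 3 labels; two rows zeroed out,
# as a sparse solution of an ℓ2,1-regularized objective would be.
rng = np.random.default_rng(0)
W = rng.normal(size=(6, 3))
W[[1, 4]] = 0.0
print("l21:", l21_norm(W))
print("top-3 features:", select_features(W, 3))
```

In practice the ℓ2,1 term appears inside a larger objective (here, alongside the label enhancement and sparse-reconstruction terms) and is minimized jointly rather than applied after the fact; this sketch only shows why the norm induces feature-level sparsity.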
About the journal:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.