Fast Co-clustering via Anchor-guided Label Spreading
Fangyuan Xie, Feiping Nie, Weizhong Yu, Xuelong Li
Neural Networks, vol. 185, 107187 (2025). DOI: 10.1016/j.neunet.2025.107187
Abstract
Attention to clustering with anchor graphs has grown because of their effectiveness and efficiency. As the most representative points in the original data, anchors are also regarded as connecting the sample space to the label space. However, when the original data are noisy, anchor-guided label spreading may fail. To alleviate this, we propose a Fast Co-clustering method via Anchor-guided Label Spreading (FCALS), in which the labels of samples and anchors are obtained simultaneously. Our method not only maximizes the intra-cluster similarity among anchors but also preserves the relationship between anchors and the original data. In addition, to avoid trivial solutions, a size constraint is introduced into the model, requiring that the number of samples in each cluster not fall below a given value; the results are insensitive to this lower limit over a relatively broad range. Because the label matrix of the original data may be fuzzy or discrete, continuous and discrete models are proposed, named FCALS-C and FCALS-D respectively. Since the labels of anchors can be obtained directly, the proposed methods are naturally applicable to out-of-sample problems. The superiority of the proposed methods is demonstrated by experimental results on both synthetic and real-world datasets.
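To make the anchor-guided label spreading idea concrete, the sketch below illustrates the generic pipeline: select anchors, build a bipartite sample-to-anchor graph, cluster the anchors, and propagate the anchor labels to the samples. This is only a simplified illustration under assumed choices (k-means anchors, Gaussian k-nearest-anchor weights, hard argmax assignment); it is not the FCALS-C/FCALS-D optimization proposed in the paper, and all function names and parameters are assumptions.

```python
# Illustrative sketch of generic anchor-guided label spreading.
# NOT the FCALS algorithm from the paper: anchor selection, graph
# construction, and the final argmax assignment are simplifying
# assumptions used only to convey the general idea.
import numpy as np
from sklearn.cluster import KMeans

def anchor_graph(X, anchors, k=5, sigma=1.0):
    """Sparse sample-to-anchor similarity matrix Z (n x m), row-normalized."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)   # squared distances (n x m)
    Z = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, :k]                         # k nearest anchors per sample
    rows = np.arange(X.shape[0])[:, None]
    Z[rows, idx] = np.exp(-d2[rows, idx] / (2 * sigma ** 2))    # Gaussian weights
    return Z / Z.sum(axis=1, keepdims=True)

def anchor_guided_labels(X, n_clusters=3, n_anchors=50):
    # 1) pick anchors as k-means centers of the data (one common heuristic)
    anchors = KMeans(n_clusters=n_anchors, n_init=10).fit(X).cluster_centers_
    # 2) cluster the anchors themselves to obtain anchor labels
    anchor_labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(anchors)
    # 3) spread anchor labels to samples through the anchor graph
    Z = anchor_graph(X, anchors)
    F = np.eye(n_clusters)[anchor_labels]   # one-hot anchor label matrix (m x c)
    return Z @ F                             # soft sample labels (n x c)

# Usage: soft labels for every sample; take argmax for a hard assignment.
X = np.random.rand(500, 10)
soft = anchor_guided_labels(X)
hard = soft.argmax(axis=1)
```

Because every sample's label is recovered purely from its similarities to the anchors, the same spreading step applies unchanged to new, out-of-sample points, which is the property the abstract highlights.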
About the Journal
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.