Spectral embedded clustering: a framework for in-sample and out-of-sample spectral clustering.

IEEE Transactions on Neural Networks · Pub Date: 2011-11-01 · Epub Date: 2011-09-29 · DOI: 10.1109/TNN.2011.2162000
Feiping Nie, Zinan Zeng, Ivor W Tsang, Dong Xu, Changshui Zhang
{"title":"Spectral embedded clustering: a framework for in-sample and out-of-sample spectral clustering.","authors":"Feiping Nie,&nbsp;Zinan Zeng,&nbsp;Ivor W Tsang,&nbsp;Dong Xu,&nbsp;Changshui Zhang","doi":"10.1109/TNN.2011.2162000","DOIUrl":null,"url":null,"abstract":"<p><p>Spectral clustering (SC) methods have been successfully applied to many real-world applications. The success of these SC methods is largely based on the manifold assumption, namely, that two nearby data points in the high-density region of a low-dimensional data manifold have the same cluster label. However, such an assumption might not always hold on high-dimensional data. When the data do not exhibit a clear low-dimensional manifold structure (e.g., high-dimensional and sparse data), the clustering performance of SC will be degraded and become even worse than K -means clustering. In this paper, motivated by the observation that the true cluster assignment matrix for high-dimensional data can be always embedded in a linear space spanned by the data, we propose the spectral embedded clustering (SEC) framework, in which a linearity regularization is explicitly added into the objective function of SC methods. More importantly, the proposed SEC framework can naturally deal with out-of-sample data. We also present a new Laplacian matrix constructed from a local regression of each pattern and incorporate it into our SEC framework to capture both local and global discriminative information for clustering. Comprehensive experiments on eight real-world high-dimensional datasets demonstrate the effectiveness and advantages of our SEC framework over existing SC methods and K-means-based clustering methods. Our SEC framework significantly outperforms SC using the Nyström algorithm on unseen data.</p>","PeriodicalId":13434,"journal":{"name":"IEEE transactions on neural networks","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2011-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/TNN.2011.2162000","citationCount":"285","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TNN.2011.2162000","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2011/9/29 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 285

Abstract

Spectral clustering (SC) methods have been successfully applied to many real-world applications. The success of these SC methods is largely based on the manifold assumption, namely, that two nearby data points in the high-density region of a low-dimensional data manifold have the same cluster label. However, such an assumption might not always hold on high-dimensional data. When the data do not exhibit a clear low-dimensional manifold structure (e.g., high-dimensional and sparse data), the clustering performance of SC will be degraded and become even worse than K-means clustering. In this paper, motivated by the observation that the true cluster assignment matrix for high-dimensional data can always be embedded in a linear space spanned by the data, we propose the spectral embedded clustering (SEC) framework, in which a linearity regularization is explicitly added into the objective function of SC methods. More importantly, the proposed SEC framework can naturally deal with out-of-sample data. We also present a new Laplacian matrix constructed from a local regression of each pattern and incorporate it into our SEC framework to capture both local and global discriminative information for clustering. Comprehensive experiments on eight real-world high-dimensional datasets demonstrate the effectiveness and advantages of our SEC framework over existing SC methods and K-means-based clustering methods. Our SEC framework significantly outperforms SC using the Nyström algorithm on unseen data.
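The key idea in the abstract, adding an explicit linearity regularization to the spectral clustering objective, can be sketched as follows. The exact SEC formulation is not reproduced on this page, so the notation below is an illustrative assumption: X is the data matrix, F the relaxed cluster indicator (spectral embedding), L a graph Laplacian, W and b a linear projection and bias, and mu, gamma trade-off parameters. A plausible form of such an objective is

\min_{F^{\top}F = I,\; W,\; b} \; \operatorname{tr}\!\left(F^{\top} L F\right) \;+\; \mu \left( \left\lVert X^{\top} W + \mathbf{1} b^{\top} - F \right\rVert_F^{2} \;+\; \gamma \left\lVert W \right\rVert_F^{2} \right)

The first term is the usual spectral smoothness criterion on the graph; the second pulls the embedding F toward the linear span of the data. Under this reading, out-of-sample handling becomes natural: once W and b are learned, an unseen point x can be mapped to W^T x + b and assigned to the nearest cluster in the embedded space.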

Source journal
IEEE Transactions on Neural Networks (Engineering Technology – Engineering: Electrical & Electronic)
Self-citation rate: 0.00%
Articles published: 2
Review time: 8.7 months