{"title":"Online incremental clustering with distance metric learning for high dimensional data","authors":"S. Okada, T. Nishida","doi":"10.1109/IJCNN.2011.6033478","DOIUrl":null,"url":null,"abstract":"In this paper, we present a novel incremental clustering algorithm which assigns of a set of observations into clusters and learns the distance metric iteratively in an incremental manner. The proposed algorithm SOINN-AML is composed based on the Self-organizing Incremental Neural Network (Shen et al 2006), which represents the distribution of unlabeled data and reports a reasonable number of clusters. SOINN adopts a competitive Hebbian rule for each input signal, and distance between nodes is measured using the Euclidean distance. Such algorithms rely on the distance metric for the input data patterns. Distance Metric Learning (DML) learns a distance metric for the high dimensional input space of data that preserves the distance relation among the training data. DML is not performed for input space of data in SOINN based approaches. SOINN-AML learns input space of data by using the Adaptive Distance Metric Learning (AML) algorithm which is one of the DML algorithms. It improves the incremental clustering performance of the SOINN algorithm by optimizing the distance metric in the case that input data space is high dimensional. In experimental results, we evaluate the performance by using two artificial datasets, seven real datasets from the UCI dataset and three real image datasets. We have found that the proposed algorithm outperforms conventional algorithms including SOINN (Shen et al 2006) and Enhanced SOINN (Shen et al 2007). The improvement of clustering accuracy (NMI) is between 0.03 and 0.13 compared to state of the art SOINN based approaches.","PeriodicalId":415833,"journal":{"name":"The 2011 International Joint Conference on Neural Networks","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 2011 International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2011.6033478","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 17
Abstract
In this paper, we present a novel incremental clustering algorithm that assigns a set of observations to clusters and iteratively learns the distance metric in an incremental manner. The proposed algorithm, SOINN-AML, is built on the Self-Organizing Incremental Neural Network (Shen et al., 2006), which represents the distribution of unlabeled data and reports a reasonable number of clusters. SOINN applies a competitive Hebbian rule to each input signal, and the distance between nodes is measured with the Euclidean distance. Such algorithms rely on the distance metric chosen for the input data patterns. Distance Metric Learning (DML) learns a distance metric for a high-dimensional input space that preserves the distance relations among the training data. In SOINN-based approaches, DML is not applied to the input space. SOINN-AML learns the input space of the data using the Adaptive Distance Metric Learning (AML) algorithm, one of the DML algorithms. This improves the incremental clustering performance of SOINN by optimizing the distance metric when the input space is high dimensional. In our experiments, we evaluate performance on two artificial datasets, seven real datasets from the UCI repository, and three real image datasets. We find that the proposed algorithm outperforms conventional algorithms, including SOINN (Shen et al., 2006) and Enhanced SOINN (Shen et al., 2007). The improvement in clustering accuracy (NMI) over state-of-the-art SOINN-based approaches ranges from 0.03 to 0.13.
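As the abstract describes, SOINN selects winner nodes for each input signal under the Euclidean distance, while SOINN-AML replaces that distance with one learned by AML. The following minimal sketch is not the authors' implementation; the function name, the toy node set, and the matrix M are illustrative assumptions. It only shows how the winner/runner-up node lookup changes when the plain Euclidean distance is swapped for a learned Mahalanobis-style metric.

```python
import numpy as np

def nearest_two_nodes(nodes, x, M=None):
    """Return indices of the winner and runner-up nodes for input signal x.

    nodes : (n_nodes, d) array of node weight vectors.
    x     : (d,) input signal.
    M     : optional (d, d) positive semi-definite matrix defining a learned
            Mahalanobis-style metric; None falls back to the plain Euclidean
            distance used in the original SOINN.
    """
    diff = nodes - x
    if M is None:
        d2 = np.einsum("ij,ij->i", diff, diff)        # squared Euclidean distance
    else:
        d2 = np.einsum("ij,jk,ik->i", diff, M, diff)  # squared distance under learned metric
    order = np.argsort(d2)
    return order[0], order[1]

# Toy usage: a few 2-D nodes, one input signal, and a hypothetical learned metric.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
x = np.array([0.6, 0.1])
M = np.diag([1.0, 4.0])  # assumed metric that weights the second dimension more heavily
print(nearest_two_nodes(nodes, x))      # winner / runner-up under Euclidean distance
print(nearest_two_nodes(nodes, x, M))   # winner / runner-up under the learned metric
```

The point of the sketch is that the rest of the SOINN update (edge creation via the competitive Hebbian rule, node insertion, and pruning) can stay unchanged: only the distance computation that picks the winner and runner-up nodes depends on the metric learned by AML.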