{"title":"自适应子序列聚类的时间自组织神经网络及其实例研究","authors":"Dong Wang, Yanfang Long, Zhu Xiao, Zhiyang Xiang, Wenjie Chen","doi":"10.1109/CITS.2016.7546436","DOIUrl":null,"url":null,"abstract":"Temporal neural networks such as Temporal Kohonen Map (TKM) and Recurrent Self-Organizing Map (RSOM) are popular for their incremental and explicit learning abilities. However, for sub-sequence clustering TKM and RSOM may generate many fragments whose classification membership is hard to decide. Besides they have stability issues in multivariate time series processing because they model the historical neuron activities on each variable independently. To overcome the drawbacks, we propose an adaptive sub-sequence clustering method based on single layered Self-Organizing Incremental Neural Network (SOINN). A recurrent filter is proposed to model the quantizations of neuron activations each as a scalar instead of a vector like in TKM and RSOM. Then it is integrated with the single layered SOINN for adaptive clustering where fragmented clusters in TKM and RSOM is replaced by a smoothed clustering result. Experiments are carried out on two datasets, namely a traffic flow dataset from open Caltrans performance measurement systems and a part of the KDD Cup 99 intrusion detection dataset. Experimental results show that the proposed method outperforms the conventional methods by 21.3% and 9.1% on the two datasets respectively.","PeriodicalId":340958,"journal":{"name":"2016 International Conference on Computer, Information and Telecommunication Systems (CITS)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A temporal self-organizing neural network for adaptive sub-sequence clustering and case studies\",\"authors\":\"Dong Wang, Yanfang Long, Zhu Xiao, Zhiyang Xiang, Wenjie Chen\",\"doi\":\"10.1109/CITS.2016.7546436\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Temporal neural networks such as Temporal Kohonen Map (TKM) and Recurrent Self-Organizing Map (RSOM) are popular for their incremental and explicit learning abilities. However, for sub-sequence clustering TKM and RSOM may generate many fragments whose classification membership is hard to decide. Besides they have stability issues in multivariate time series processing because they model the historical neuron activities on each variable independently. To overcome the drawbacks, we propose an adaptive sub-sequence clustering method based on single layered Self-Organizing Incremental Neural Network (SOINN). A recurrent filter is proposed to model the quantizations of neuron activations each as a scalar instead of a vector like in TKM and RSOM. Then it is integrated with the single layered SOINN for adaptive clustering where fragmented clusters in TKM and RSOM is replaced by a smoothed clustering result. Experiments are carried out on two datasets, namely a traffic flow dataset from open Caltrans performance measurement systems and a part of the KDD Cup 99 intrusion detection dataset. 
Experimental results show that the proposed method outperforms the conventional methods by 21.3% and 9.1% on the two datasets respectively.\",\"PeriodicalId\":340958,\"journal\":{\"name\":\"2016 International Conference on Computer, Information and Telecommunication Systems (CITS)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-07-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 International Conference on Computer, Information and Telecommunication Systems (CITS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CITS.2016.7546436\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Conference on Computer, Information and Telecommunication Systems (CITS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CITS.2016.7546436","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citation count: 0
Abstract
Temporal neural networks such as the Temporal Kohonen Map (TKM) and the Recurrent Self-Organizing Map (RSOM) are popular for their incremental and explicit learning abilities. However, for sub-sequence clustering, TKM and RSOM may generate many fragments whose cluster membership is hard to determine. They also have stability issues when processing multivariate time series, because they model the historical neuron activity of each variable independently. To overcome these drawbacks, we propose an adaptive sub-sequence clustering method based on a single-layered Self-Organizing Incremental Neural Network (SOINN). A recurrent filter is proposed that models the quantization of each neuron activation as a scalar rather than as a vector, as in TKM and RSOM. The filter is then integrated with the single-layered SOINN for adaptive clustering, so that the fragmented clusters produced by TKM and RSOM are replaced by a smoothed clustering result. Experiments are carried out on two datasets: a traffic flow dataset from the open Caltrans performance measurement system and a part of the KDD Cup 99 intrusion detection dataset. Experimental results show that the proposed method outperforms the conventional methods by 21.3% and 9.1% on the two datasets, respectively.
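The method sketched in the abstract has two moving parts: a recurrent filter that keeps a single scalar per neuron (a leaky-integrated quantization error, rather than the vector recursion used in RSOM), and a single-layered SOINN that incrementally inserts, adapts, and links nodes. The Python sketch below only illustrates how such a combination could be wired together; it is not the authors' implementation, and the class name, parameters (alpha, age_max, learn_rate), and the nearest-neighbour similarity threshold are hypothetical choices made for clarity.

```python
# Hypothetical sketch, NOT the paper's reference code: a leaky recurrent filter that
# keeps one scalar activation per node, wired into a minimal single-layered
# SOINN-style incremental clusterer. Parameter names and the threshold rule are
# assumptions.
import numpy as np


class RecurrentScalarSOINN:
    def __init__(self, alpha=0.3, age_max=50, learn_rate=0.1):
        self.alpha = alpha            # leak rate of the recurrent scalar filter
        self.age_max = age_max        # edges older than this are pruned
        self.learn_rate = learn_rate  # adaptation step for the winning node
        self.nodes = []               # node weight vectors
        self.activ = []               # one scalar filtered activation per node
        self.edges = {}               # (i, j) -> age

    def _threshold(self, i):
        """Similarity threshold of node i: distance to its nearest other node
        (a common SOINN heuristic), or +inf if no other node exists."""
        dists = [np.linalg.norm(self.nodes[i] - self.nodes[j])
                 for j in range(len(self.nodes)) if j != i]
        return min(dists) if dists else np.inf

    def partial_fit(self, x):
        """Present one sub-sequence window x; return the index of its node."""
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:                      # bootstrap with the first inputs
            self.nodes.append(x.copy())
            self.activ.append(0.0)
            return len(self.nodes) - 1

        # Recurrent filter: leaky-integrate each node's quantization error as a
        # single SCALAR (instead of a difference vector as in RSOM).
        for i, w in enumerate(self.nodes):
            err = float(np.linalg.norm(x - w))
            self.activ[i] = (1.0 - self.alpha) * self.activ[i] + self.alpha * err

        order = np.argsort(self.activ)               # winner = lowest filtered error
        s1, s2 = int(order[0]), int(order[1])

        if self.activ[s1] > self._threshold(s1):     # too dissimilar: grow a node
            self.nodes.append(x.copy())
            self.activ.append(0.0)
            return len(self.nodes) - 1

        # Otherwise adapt the winner (and, more weakly, the runner-up),
        # refresh their edge, and age/prune the remaining edges.
        self.nodes[s1] += self.learn_rate * (x - self.nodes[s1])
        self.nodes[s2] += 0.1 * self.learn_rate * (x - self.nodes[s2])
        key = (min(s1, s2), max(s1, s2))
        self.edges[key] = 0
        for e in list(self.edges):
            if e != key:
                self.edges[e] += 1
                if self.edges[e] > self.age_max:
                    del self.edges[e]
        return s1
```

A possible way to drive the sketch with sliding sub-sequences of a univariate series (the window length 4 is an arbitrary choice for illustration):

```python
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 300)) + 0.05 * rng.standard_normal(300)
model = RecurrentScalarSOINN()
labels = [model.partial_fit(series[t:t + 4]) for t in range(len(series) - 4)]
```

Because the filtered activation decays slowly, consecutive windows tend to keep the same winning node, which corresponds to the smoothing effect the abstract attributes to the recurrent filter.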