Qiushi Wang, Yueming Zhu, Zhicheng Sun, Dong Li, Yunbin Ma
{"title":"Self-attention-based graph transformation learning for anomaly detection in multivariate time series","authors":"Qiushi Wang, Yueming Zhu, Zhicheng Sun, Dong Li, Yunbin Ma","doi":"10.1007/s40747-025-01839-3","DOIUrl":null,"url":null,"abstract":"<p>Multivariate time series anomaly detection has widely applications in many fields such as finance, power, and industry. Recently, Graph Neural Network (GNN) have achieved great success in this task due to their powerful ability of modeling multivariate relationships. However, most existing methods employ shallow networks with only two layers, resulting in restricted node information transfer range and limited sensing field. In this paper, we propose a self-attention based graph transformation learning (AT-GTL) method to solve this problem. AT-GTL uses a global self-attention graph pooling (GATP) module to aggregate all node features to obtain global features. Then, a graph transformation learning pipeline is constructed based on neural transformation learning, and a triplet contrastive loss (TCL) is constructed to optimize the global feature extraction networks using potential features from multi-viewpoints. 
Extensive experiments on three real-world datasets demonstrate that our method can effectively aggregate global graph features and detect anomalies, providing a new transformation learning solution for multivariate time series anomaly detection.</p>","PeriodicalId":10524,"journal":{"name":"Complex & Intelligent Systems","volume":"55 1","pages":""},"PeriodicalIF":5.0000,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Complex & Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s40747-025-01839-3","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Multivariate time series anomaly detection has wide applications in many fields such as finance, power, and industry. Recently, Graph Neural Networks (GNNs) have achieved great success in this task due to their powerful ability to model multivariate relationships. However, most existing methods employ shallow networks with only two layers, resulting in a restricted node information transfer range and a limited receptive field. In this paper, we propose a self-attention-based graph transformation learning (AT-GTL) method to solve this problem. AT-GTL uses a global self-attention graph pooling (GATP) module to aggregate all node features into a global feature. A graph transformation learning pipeline is then constructed based on neural transformation learning, and a triplet contrastive loss (TCL) is defined to optimize the global feature extraction networks using latent features from multiple viewpoints. Extensive experiments on three real-world datasets demonstrate that our method can effectively aggregate global graph features and detect anomalies, providing a new transformation learning solution for multivariate time series anomaly detection.
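The abstract does not give the equations behind the GATP module or the triplet contrastive loss, so the following is only a minimal NumPy sketch of the two general ideas it names: attention-weighted pooling of node features into a single global vector, and a margin-based triplet loss over anchor/positive/negative embeddings. All function and parameter names here are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def global_attention_pool(node_feats, w):
    """Pool an (N, D) node-feature matrix into one (D,) global feature.

    Each node gets a scalar attention score from a learnable vector `w`;
    the global feature is the attention-weighted sum of node features.
    """
    scores = node_feats @ w       # (N,) one attention logit per node
    alpha = softmax(scores)       # attention weights, sum to 1
    return alpha @ node_feats     # weighted sum -> (D,) global feature

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: pull the positive view toward the
    anchor and push the negative view at least `margin` farther away."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)
```

In a contrastive pipeline like the one the abstract describes, the anchor/positive/negative embeddings would come from different learned transformations (views) of the same graph, with the pooled global feature serving as each view's embedding.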
About the journal:
Complex & Intelligent Systems aims to provide a forum for presenting and discussing novel approaches, tools and techniques meant for attaining a cross-fertilization between the broad fields of complex systems, computational simulation, and intelligent analytics and visualization. The transdisciplinary research that the journal focuses on will expand the boundaries of our understanding by investigating the principles and processes that underlie many of the most profound problems facing society today.