Anomaly Detection in High-Dimensional Time Series Data with Scaled Bregman Divergence.
Algorithms (IF 1.8, Q3, Computer Science, Artificial Intelligence) · Pub Date: 2025-02-01 · Epub Date: 2025-01-24 · DOI: 10.3390/a18020062
Yunge Wang, Lingling Zhang, Tong Si, Graham Bishop, Haijun Gong
{"title":"Anomaly Detection in High-Dimensional Time Series Data with Scaled Bregman Divergence.","authors":"Yunge Wang, Lingling Zhang, Tong Si, Graham Bishop, Haijun Gong","doi":"10.3390/a18020062","DOIUrl":null,"url":null,"abstract":"<p><p>The purpose of anomaly detection is to identify special data points or patterns that significantly deviate from the expected or typical behavior of the majority of the data, and it has a wide range of applications across various domains. Most existing statistical and machine learning-based anomaly detection algorithms face challenges when applied to high-dimensional data. For instance, the unconstrained least-squares importance fitting (uLSIF) method, a state-of-the-art anomaly detection approach, encounters the unboundedness problem under certain conditions. In this study, we propose a scaled Bregman divergence-based anomaly detection algorithm using both least absolute deviation and least-squares loss for parameter learning. This new algorithm effectively addresses the unboundedness problem, making it particularly suitable for high-dimensional data. The proposed technique was evaluated on both synthetic and real-world high-dimensional time series datasets, demonstrating its effectiveness in detecting anomalies. Its performance was also compared to other density ratio estimation-based anomaly detection methods.</p>","PeriodicalId":7636,"journal":{"name":"Algorithms","volume":"18 2","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11790285/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Algorithms","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/a18020062","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/24 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
The purpose of anomaly detection is to identify data points or patterns that deviate significantly from the expected or typical behavior of the majority of the data, and it has a wide range of applications across various domains. Most existing statistical and machine learning-based anomaly detection algorithms face challenges when applied to high-dimensional data. For instance, the unconstrained least-squares importance fitting (uLSIF) method, a state-of-the-art anomaly detection approach, suffers from an unboundedness problem under certain conditions. In this study, we propose a scaled Bregman divergence-based anomaly detection algorithm that uses both least absolute deviation and least-squares loss for parameter learning. The new algorithm effectively addresses the unboundedness problem, making it particularly suitable for high-dimensional data. The proposed technique was evaluated on both synthetic and real-world high-dimensional time series datasets, demonstrating its effectiveness in detecting anomalies. Its performance was also compared to other density ratio estimation-based anomaly detection methods.
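For context, the sketch below illustrates the general density ratio estimation approach that the baseline uLSIF method follows: the ratio r(x) = p_nu(x)/p_de(x) is modeled as a linear combination of Gaussian kernels and fit by regularized least squares, and low estimated ratios flag likely anomalies. This is only a minimal illustration of the comparison baseline, not the paper's scaled Bregman divergence algorithm; the kernel bandwidth `sigma`, regularization constant `lam`, and number of kernel centers are assumed values chosen for demonstration.

```python
# Illustrative sketch of density ratio estimation-based anomaly scoring
# in the style of uLSIF (the baseline discussed above). NOT the paper's
# scaled Bregman divergence method; sigma, lam, and n_centers are assumptions.
import numpy as np

def gaussian_kernel(X, centers, sigma):
    """Gaussian basis functions phi_l(x) = exp(-||x - c_l||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ulsif_fit(X_nu, X_de, sigma=1.0, lam=0.1, n_centers=100, seed=0):
    """Fit r(x) ~ p_nu(x)/p_de(x) as a linear model over Gaussian kernels.
    Ridge-style closed form: alpha = (H + lam I)^{-1} h, then clip at zero."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_nu), size=min(n_centers, len(X_nu)), replace=False)
    centers = X_nu[idx]
    Phi_de = gaussian_kernel(X_de, centers, sigma)   # (n_de, b)
    Phi_nu = gaussian_kernel(X_nu, centers, sigma)   # (n_nu, b)
    H = Phi_de.T @ Phi_de / len(X_de)                # (b, b) second-moment matrix
    h = Phi_nu.mean(axis=0)                          # (b,)   mean kernel response
    alpha = np.linalg.solve(H + lam * np.eye(len(h)), h)
    return centers, np.maximum(alpha, 0.0)           # ratios are nonnegative

def anomaly_scores(X_test, centers, alpha, sigma=1.0):
    """Estimated density ratio at each test point; small values suggest anomalies."""
    return gaussian_kernel(X_test, centers, sigma) @ alpha
```

In practice one would select `sigma` and `lam` by cross-validation and threshold the resulting scores; the proposed scaled Bregman divergence approach is motivated precisely by cases where this kind of ratio estimate becomes unbounded in high dimensions.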