Contrastive learning enhanced by graph neural networks for Universal Multivariate Time Series Representation

IF 3.0 · CAS Zone 2 (Computer Science) · JCR Q2, COMPUTER SCIENCE, INFORMATION SYSTEMS · Information Systems · Pub Date: 2024-07-25 · DOI: 10.1016/j.is.2024.102429
{"title":"Contrastive learning enhanced by graph neural networks for Universal Multivariate Time Series Representation","authors":"","doi":"10.1016/j.is.2024.102429","DOIUrl":null,"url":null,"abstract":"<div><p>Analyzing multivariate time series data is crucial for many real-world issues, such as power forecasting, traffic flow forecasting, industrial anomaly detection, and more. Recently, universal frameworks for time series representation based on representation learning have received widespread attention due to their ability to capture changes in the distribution of time series data. However, existing time series representation learning models, when confronting multivariate time series data, merely apply contrastive learning methods to construct positive and negative samples for each variable at the timestamp level, and then employ a contrastive loss function to encourage the model to learn the similarities among the positive samples and the dissimilarities among the negative samples for each variable. Despite this, they fail to fully exploit the latent space dependencies between pairs of variables. To address this problem, we propose the Contrastive Learning Enhanced by Graph Neural Networks for Universal Multivariate Time Series Representation (COGNet), which has three distinctive features. (1) COGNet is a comprehensive self-supervised learning model that combines autoencoders and contrastive learning methods. (2) We introduce graph feature representation blocks on top of the backbone encoder, which extract adjacency features of each variable with other variables. (3) COGNet uses graph contrastive loss to learn graph feature representations. Experimental results across multiple public datasets indicate that COGNet outperforms existing methods in time series prediction and anomaly detection tasks.</p></div>","PeriodicalId":50363,"journal":{"name":"Information Systems","volume":null,"pages":null},"PeriodicalIF":3.0000,"publicationDate":"2024-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0306437924000875","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Analyzing multivariate time series data is crucial for many real-world applications, such as power forecasting, traffic flow forecasting, and industrial anomaly detection. Recently, universal frameworks for time series representation based on representation learning have received widespread attention due to their ability to capture changes in the distribution of time series data. However, when applied to multivariate time series data, existing time series representation learning models merely use contrastive learning to construct positive and negative samples for each variable at the timestamp level, and then employ a contrastive loss function to encourage the model to learn, for each variable, the similarities among positive samples and the dissimilarities among negative samples. As a result, they fail to fully exploit the latent-space dependencies between pairs of variables. To address this problem, we propose Contrastive Learning Enhanced by Graph Neural Networks for Universal Multivariate Time Series Representation (COGNet), which has three distinctive features. (1) COGNet is a comprehensive self-supervised learning model that combines autoencoders and contrastive learning methods. (2) We introduce graph feature representation blocks on top of the backbone encoder, which extract adjacency features between each variable and the other variables. (3) COGNet uses a graph contrastive loss to learn graph feature representations. Experimental results across multiple public datasets indicate that COGNet outperforms existing methods in time series prediction and anomaly detection tasks.
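The abstract describes the architecture only at a high level. As a purely illustrative sketch of the general idea, not the authors' implementation, the following PyTorch snippet shows a learnable graph feature block over the variables of a multivariate series together with an InfoNCE-style contrastive loss between two augmented views. All module names, shapes, and hyperparameters here are assumptions made for illustration; the paper should be consulted for COGNet's actual design.

```python
# Illustrative sketch only: a hypothetical graph feature block plus a
# graph contrastive (InfoNCE-style) loss for multivariate time series
# representations. Names, shapes, and hyperparameters are assumptions,
# not taken from the COGNet paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphFeatureBlock(nn.Module):
    """Learns a soft adjacency over variables and mixes their embeddings."""

    def __init__(self, n_vars: int, dim: int):
        super().__init__()
        self.adj_logits = nn.Parameter(torch.zeros(n_vars, n_vars))
        self.proj = nn.Linear(dim, dim)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, n_vars, dim) per-variable representations from a backbone encoder
        adj = torch.softmax(self.adj_logits, dim=-1)    # row-normalized soft adjacency
        mixed = torch.einsum("vu,bud->bvd", adj, z)     # aggregate neighbour embeddings
        return F.relu(self.proj(mixed)) + z             # residual graph-enhanced features


def graph_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """InfoNCE between two augmented views; each (batch, variable) pair is an
    instance, and all other instances act as negatives."""
    b, v, d = z1.shape
    a = F.normalize(z1.reshape(b * v, d), dim=-1)
    p = F.normalize(z2.reshape(b * v, d), dim=-1)
    logits = a @ p.t() / tau                            # cosine similarities / temperature
    labels = torch.arange(b * v, device=z1.device)      # positives sit on the diagonal
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    batch, n_vars, dim = 8, 7, 64
    block = GraphFeatureBlock(n_vars, dim)
    view1 = torch.randn(batch, n_vars, dim)             # embeddings of augmented view 1
    view2 = torch.randn(batch, n_vars, dim)             # embeddings of augmented view 2
    loss = graph_contrastive_loss(block(view1), block(view2))
    print(f"graph contrastive loss: {loss.item():.4f}")
```

In this sketch, the learnable soft adjacency stands in for the "adjacency features of each variable with other variables" mentioned in the abstract, and each (batch, variable) pair is treated as a contrastive instance.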

Source Journal
Information Systems (Engineering & Technology, Computer Science: Information Systems)
CiteScore: 9.40 · Self-citation rate: 2.70% · Articles published: 112 · Review time: 53 days
Journal Introduction
Information systems are the software and hardware systems that support data-intensive applications. The journal Information Systems publishes articles concerning the design and implementation of languages, data models, process models, algorithms, software and hardware for information systems. Subject areas include data management issues as presented in the principal international database conferences (e.g., ACM SIGMOD/PODS, VLDB, ICDE and ICDT/EDBT) as well as data-related issues from the fields of data mining/machine learning, information retrieval coordinated with structured data, internet and cloud data management, business process management, web semantics, visual and audio information systems, scientific computing, and data science. Implementation papers having to do with massively parallel data management, fault tolerance in practice, and special purpose hardware for data-intensive systems are also welcome. Manuscripts from application domains, such as urban informatics, social and natural science, and Internet of Things, are also welcome. All papers should highlight innovative solutions to data management problems such as new data models, performance enhancements, and show how those innovations contribute to the goals of the application.
Latest Articles in This Journal
- Effective data exploration through clustering of local attributive explanations
- Data Lakehouse: A survey and experimental study
- Temporal graph processing in modern memory hierarchies
- Bridging reading and mapping: The role of reading annotations in facilitating feedback while concept mapping
- A universal approach for simplified redundancy-aware cross-model querying