TS-MAE: A masked autoencoder for time series representation learning
Qian Liu, Junchen Ye, Haohan Liang, Leilei Sun, Bowen Du
Information Sciences, vol. 690, Article 121576 (2024). DOI: 10.1016/j.ins.2024.121576
Abstract
Self-supervised learning (SSL) has been widely researched in recent years. In particular, generative self-supervised methods have achieved remarkable success across many AI domains, such as MAE in computer vision, the well-known BERT and GPT in natural language processing, and GraphMAE in graph learning. In time series analysis, however, not only is work along this line limited, but its performance has also fallen short of the potential demonstrated in other fields. To fill this gap, we propose a simple and elegant masked autoencoder for time series representation learning. Firstly, unlike most existing work, which uses the Transformer as the backbone, we build our model on neural ordinary differential equations, which possess excellent mathematical properties. Compared with the positional encoding in the Transformer, modeling evolution patterns continuously can better capture temporal dependencies. Secondly, a timestamp-wise masking strategy is designed to work with the autoencoder to avoid bias; it also reduces cross-imputation between variables, yielding more robust representations. Lastly, extensive experiments on two classical tasks demonstrate the superiority of our model over state-of-the-art baselines.
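The timestamp-wise masking idea can be illustrated with a short sketch. Below is a minimal PyTorch example, assuming a (batch, time, variables) tensor layout; the function name timestamp_mask, the mask ratio, and the zero-fill convention are illustrative assumptions, not details from the paper:

```python
import torch

def timestamp_mask(x: torch.Tensor, mask_ratio: float = 0.5):
    """Mask entire timestamps of a multivariate series.

    x: (batch, time, num_vars). Zeroing out every variable at the
    chosen timestamps (rather than scattered individual values) keeps
    the model from filling in a masked value using other variables
    observed at the same step. Returns the masked series and a
    boolean mask of shape (batch, time) marking the masked steps.
    """
    batch, time, _ = x.shape
    num_masked = int(time * mask_ratio)
    # Draw an independent random ordering of timestamps per sample
    # and mask the first num_masked of them.
    idx = torch.rand(batch, time).argsort(dim=1)[:, :num_masked]
    mask = torch.zeros(batch, time, dtype=torch.bool)
    mask.scatter_(1, idx, True)
    x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)
    return x_masked, mask

# Example: the pretraining loss is computed on masked timestamps only.
x = torch.randn(8, 96, 7)               # 8 series, 96 steps, 7 variables
x_masked, mask = timestamp_mask(x, 0.5)
x_hat = x_masked                        # stand-in for the autoencoder output
loss = (x_hat - x)[mask].pow(2).mean()  # MSE over masked steps
```

Because every variable is hidden at a masked timestamp, the model cannot reconstruct a masked value from co-occurring variables, which is the cross-imputation the abstract aims to reduce.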
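The continuous-time backbone can likewise be sketched. The snippet below uses the torchdiffeq library's odeint to evolve a latent state between observation times; it is a generic latent-ODE sketch under stated assumptions (MLP dynamics, a fixed initial state), not the paper's actual architecture:

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class ODEFunc(nn.Module):
    """dh/dt = f(h): a small MLP parameterizes the latent dynamics."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(),
                                 nn.Linear(dim, dim))

    def forward(self, t, h):
        return self.net(h)

func = ODEFunc(dim=32)
h0 = torch.zeros(8, 32)            # initial latent state per series
t = torch.linspace(0.0, 1.0, 96)   # the (possibly irregular) timestamps
h = odeint(func, h0, t)            # (96, 8, 32): latent state at each time
```

Since the latent state h(t) is defined at every real-valued time, it can be evaluated directly at the observed timestamps, standing in for the discrete positional encoding a Transformer would require.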
Journal Introduction
Information Sciences: Informatics and Computer Science, Intelligent Systems, Applications is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.