{"title":"Ratai: recurrent autoencoder with imputation units and temporal attention for multivariate time series imputation","authors":"Xiaochen Lai, Yachen Yao, Jichong Mu, Wei Lu, Liyong Zhang","doi":"10.1007/s10462-024-11039-z","DOIUrl":null,"url":null,"abstract":"<div><p>Multivariate time series is ubiquitous in real-world applications, yet it often suffers from missing values that impede downstream analytical tasks. In this paper, we introduce the Long Short-Term Memory Network based Recurrent Autoencoder with Imputation Units and Temporal Attention Imputation Model (RATAI), tailored for multivariate time series. RATAI is designed to address certain limitations of traditional RNN-based imputation methods, which often focus on predictive modeling to estimate missing values, sometimes neglecting the contextual impact of observed data at and beyond the target time step. Drawing inspiration from Kalman smoothing, which effectively integrates past and future information to refine state estimations, RATAI aims to extract feature representations from time series data and use them to reconstruct a complete time series, thus overcoming the shortcomings of existing approaches. It employs a dual-stage imputation process: the encoder utilizes temporal information and attribute correlations to predict and impute missing values, and extract feature representation of imputed time series. Subsequently, the decoder reconstructs the series from the feature representation, and the reconstructed values are used as the final imputation values. Additionally, RATAI incorporates a temporal attention mechanism, allowing the decoder to focus on highly relevant inputs during reconstruction. This model can be trained directly using data that contains missing values, avoiding the misleading effects on model training that can arise from setting initial values for missing values. Our experiments demonstrate that RATAI outperforms benchmark models in multivariate time series imputation.</p></div>","PeriodicalId":8449,"journal":{"name":"Artificial Intelligence Review","volume":"58 2","pages":""},"PeriodicalIF":10.7000,"publicationDate":"2024-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10462-024-11039-z.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial Intelligence Review","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10462-024-11039-z","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Multivariate time series are ubiquitous in real-world applications, yet they often contain missing values that impede downstream analytical tasks. In this paper, we introduce the LSTM-based Recurrent Autoencoder with Imputation Units and Temporal Attention imputation model (RATAI), tailored for multivariate time series. RATAI addresses limitations of traditional RNN-based imputation methods, which often focus on predictive modeling to estimate missing values and can neglect the contextual influence of observed data at and beyond the target time step. Drawing inspiration from Kalman smoothing, which integrates past and future information to refine state estimates, RATAI extracts a feature representation from the time series and uses it to reconstruct a complete series, thereby overcoming this shortcoming. It employs a two-stage imputation process: the encoder exploits temporal information and attribute correlations to predict and impute missing values and extracts a feature representation of the imputed time series; the decoder then reconstructs the series from this representation, and the reconstructed values serve as the final imputations. In addition, RATAI incorporates a temporal attention mechanism that lets the decoder focus on highly relevant encoder inputs during reconstruction. The model can be trained directly on data containing missing values, avoiding the bias that pre-filling missing entries with arbitrary initial values can introduce into training. Our experiments demonstrate that RATAI outperforms benchmark models in multivariate time series imputation.
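The abstract only outlines the architecture; the sketch below is one possible reading of it in PyTorch, not the authors' implementation. All names (RataiLikeImputer, impute_head, recon_head, masked_mse), the encoder/decoder wiring, and the hyperparameters are assumptions made here purely to illustrate the two-stage idea: an encoder that fills gaps with its own predictions while building a feature representation, and an attention-equipped decoder that reconstructs the full series, trained with a loss computed only on observed entries.

```python
# Illustrative sketch of an encoder-imputation / attention-decoder scheme in the
# spirit of the abstract. NOT the authors' code; every name here is invented.
import torch
import torch.nn as nn


class RataiLikeImputer(nn.Module):
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.hidden_size = hidden_size
        self.encoder = nn.LSTMCell(n_features, hidden_size)
        # Decoder input = previous reconstruction concatenated with attention context.
        self.decoder = nn.LSTMCell(n_features + hidden_size, hidden_size)
        # "Imputation unit": predicts the current observation from the encoder state,
        # using temporal information and cross-attribute correlations.
        self.impute_head = nn.Linear(hidden_size, n_features)
        # Reconstruction head on the decoder side.
        self.recon_head = nn.Linear(hidden_size, n_features)
        # Temporal attention scores over the encoder hidden states.
        self.attn = nn.Linear(2 * hidden_size, 1)

    def forward(self, x, mask):
        # x:    (batch, time, features), missing entries may hold any placeholder
        # mask: (batch, time, features), 1 where observed, 0 where missing
        batch, T, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = torch.zeros_like(h)
        enc_states = []
        for t in range(T):
            pred_t = self.impute_head(h)                                # predictive estimate
            x_t = mask[:, t] * x[:, t] + (1 - mask[:, t]) * pred_t      # first-stage imputation
            h, c = self.encoder(x_t, (h, c))
            enc_states.append(h)
        enc_states = torch.stack(enc_states, dim=1)                     # (batch, T, hidden)

        # Decode from the feature representation of the imputed series (final encoder
        # state), attending over all encoder states at every reconstruction step.
        dh = enc_states[:, -1]
        dc = torch.zeros_like(dh)
        inp = x.new_zeros(batch, x.size(-1))
        recon = []
        for t in range(T):
            query = dh.unsqueeze(1).expand_as(enc_states)
            score = self.attn(torch.cat([enc_states, query], dim=-1)).squeeze(-1)
            alpha = torch.softmax(score, dim=1)                         # (batch, T)
            context = (alpha.unsqueeze(-1) * enc_states).sum(dim=1)     # (batch, hidden)
            dh, dc = self.decoder(torch.cat([inp, context], dim=-1), (dh, dc))
            out_t = self.recon_head(dh)                                 # second-stage values
            recon.append(out_t)
            # Feed observed values where available, reconstructions elsewhere.
            inp = mask[:, t] * x[:, t] + (1 - mask[:, t]) * out_t
        return torch.stack(recon, dim=1)


def masked_mse(recon, x, mask):
    # Loss is computed only on observed entries, so training never depends on
    # arbitrary initial guesses for the missing values.
    return ((recon - x) ** 2 * mask).sum() / mask.sum().clamp(min=1)
```

Under this reading, the masked loss is what lets the model train directly on incomplete data: gradients flow only through positions where ground truth exists, while the missing positions are filled first by the encoder's predictive imputation and then replaced by the decoder's reconstructions, which are taken as the final imputed values.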
About the journal
Artificial Intelligence Review, a fully open access journal, publishes cutting-edge research in artificial intelligence and cognitive science. It features critical evaluations of applications, techniques, and algorithms, providing a platform for both researchers and application developers. The journal includes refereed survey and tutorial articles, along with reviews and commentary on significant developments in the field.