{"title":"Recurrent Interpolants for Probabilistic Time Series Prediction","authors":"Yu Chen, Marin Biloš, Sarthak Mittal, Wei Deng, Kashif Rasul, Anderson Schneider","doi":"arxiv-2409.11684","DOIUrl":null,"url":null,"abstract":"Sequential models such as recurrent neural networks or transformer-based\nmodels became \\textit{de facto} tools for multivariate time series forecasting\nin a probabilistic fashion, with applications to a wide range of datasets, such\nas finance, biology, medicine, etc. Despite their adeptness in capturing\ndependencies, assessing prediction uncertainty, and efficiency in training,\nchallenges emerge in modeling high-dimensional complex distributions and\ncross-feature dependencies. To tackle these issues, recent works delve into\ngenerative modeling by employing diffusion or flow-based models. Notably, the\nintegration of stochastic differential equations or probability flow\nsuccessfully extends these methods to probabilistic time series imputation and\nforecasting. However, scalability issues necessitate a computational-friendly\nframework for large-scale generative model-based predictions. This work\nproposes a novel approach by blending the computational efficiency of recurrent\nneural networks with the high-quality probabilistic modeling of the diffusion\nmodel, which addresses challenges and advances generative models' application\nin time series forecasting. Our method relies on the foundation of stochastic\ninterpolants and the extension to a broader conditional generation framework\nwith additional control features, offering insights for future developments in\nthis dynamic field.","PeriodicalId":501340,"journal":{"name":"arXiv - STAT - Machine Learning","volume":"11 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11684","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Sequential models such as recurrent neural networks and transformer-based architectures have become de facto tools for probabilistic multivariate time series forecasting, with applications to a wide range of domains such as finance, biology, and medicine. Despite their strengths in capturing temporal dependencies, quantifying prediction uncertainty, and training efficiently, these models struggle with high-dimensional complex distributions and cross-feature dependencies. To tackle these issues, recent works turn to generative modeling with diffusion or flow-based models. Notably, the integration of stochastic differential equations or probability flow successfully extends these methods to probabilistic time series imputation and forecasting. However, scalability concerns call for a computationally efficient framework for large-scale generative forecasting. This work proposes a novel approach that blends the computational efficiency of recurrent neural networks with the high-quality probabilistic modeling of diffusion models, addressing these challenges and advancing the application of generative models to time series forecasting. Our method builds on the foundation of stochastic interpolants, extending them to a broader conditional generation framework with additional control features, and offers insights for future developments in this dynamic field.
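
To make the core idea concrete, below is a minimal PyTorch sketch of a recurrent stochastic interpolant for one-step-ahead forecasting. The class name, network sizes, the linear interpolant choice, and the Euler sampler are illustrative assumptions, not the paper's actual implementation: an RNN encodes the observed history into a context vector, and a velocity network trained with the interpolant regression objective transports Gaussian noise to the predictive distribution conditioned on that context.

```python
# Illustrative sketch only: names and architecture are assumptions,
# not the authors' implementation.
import torch
import torch.nn as nn

class RecurrentInterpolant(nn.Module):
    def __init__(self, x_dim: int, hidden_dim: int = 64):
        super().__init__()
        # RNN encoder summarizes the observed history into a context vector.
        self.encoder = nn.GRU(x_dim, hidden_dim, batch_first=True)
        # Velocity network b(t, x_t, h): input = noisy state + time + context.
        self.velocity = nn.Sequential(
            nn.Linear(x_dim + 1 + hidden_dim, hidden_dim),
            nn.SiLU(),
            nn.Linear(hidden_dim, x_dim),
        )

    def interpolate(self, x0, x1, t):
        # Linear interpolant x_t = (1 - t) x0 + t x1, one common choice.
        return (1.0 - t) * x0 + t * x1

    def loss(self, history, target):
        # history: (B, T, D) past observations; target: (B, D) next value.
        _, h = self.encoder(history)       # h: (1, B, H) final hidden state
        h = h.squeeze(0)
        x0 = torch.randn_like(target)      # Gaussian source sample
        t = torch.rand(target.size(0), 1)  # uniform interpolation time
        xt = self.interpolate(x0, target, t)
        # For the linear interpolant, the target velocity is d/dt x_t = x1 - x0.
        v_target = target - x0
        v_pred = self.velocity(torch.cat([xt, t, h], dim=-1))
        return ((v_pred - v_target) ** 2).mean()

    @torch.no_grad()
    def sample(self, history, n_steps: int = 50):
        # Euler integration of the learned flow from t = 0 to t = 1.
        _, h = self.encoder(history)
        h = h.squeeze(0)
        x = torch.randn(history.size(0), history.size(-1))
        dt = 1.0 / n_steps
        for i in range(n_steps):
            t = torch.full((x.size(0), 1), i * dt)
            x = x + dt * self.velocity(torch.cat([x, t, h], dim=-1))
        return x
```

In this linear case the regression target is simply x1 - x0, the time derivative of the interpolant; richer schedules alpha(t), beta(t) and an added noise term gamma(t) z follow the same training pattern.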