{"title":"Learning on predictions: Fusing training and autoregressive inference for long-term spatiotemporal forecasts","authors":"P.R. Vlachas , P. Koumoutsakos","doi":"10.1016/j.physd.2024.134371","DOIUrl":null,"url":null,"abstract":"<div><p>Predictions of complex systems ranging from natural language processing to weather forecasting have benefited from advances in Recurrent Neural Networks (RNNs). RNNs are typically trained using techniques like Backpropagation Through Time (BPTT) to minimize one-step-ahead prediction loss. During testing, RNNs often operate in an auto-regressive mode, with the output of the network fed back into its input. However, this process can eventually result in exposure bias since the network has been trained to process ”ground-truth” data rather than its own predictions. This inconsistency causes errors that compound over time, indicating that the distribution of data used for evaluating losses differs from the actual operating conditions encountered by the model during training. Inspired by the solution to this challenge in language processing networks we propose the Scheduled Autoregressive Truncated Backpropagation Through Time (BPTT-SA) algorithm for predicting complex dynamical systems using RNNs. We find that BPTT-SA effectively reduces iterative error propagation in Convolutional and Convolutional Autoencoder RNNs and demonstrates its capabilities in the long-term prediction of high-dimensional fluid flows.</p></div>","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S016727892400321X/pdfft?md5=c5231aef9d912b65fa750a286252a7f5&pid=1-s2.0-S016727892400321X-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S016727892400321X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Citations: 0
Abstract
Predictions of complex systems ranging from natural language processing to weather forecasting have benefited from advances in Recurrent Neural Networks (RNNs). RNNs are typically trained with techniques such as Backpropagation Through Time (BPTT) to minimize the one-step-ahead prediction loss. During testing, RNNs often operate in an autoregressive mode, with the output of the network fed back into its input. However, this process can result in exposure bias, since the network has been trained to process "ground-truth" data rather than its own predictions. This inconsistency causes errors that compound over time: the distribution of inputs on which the losses are evaluated during training differs from the operating conditions the model encounters when it predicts autoregressively. Inspired by the solution to this challenge in language-processing networks, we propose the Scheduled Autoregressive Truncated Backpropagation Through Time (BPTT-SA) algorithm for predicting complex dynamical systems with RNNs. We find that BPTT-SA effectively reduces iterative error propagation in Convolutional and Convolutional Autoencoder RNNs, and we demonstrate its capability for long-term prediction of high-dimensional fluid flows.
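To make the idea of scheduled autoregressive training concrete, the sketch below shows a minimal training loop in which the RNN's input at each step is either the ground-truth state (teacher forcing) or the network's own previous prediction, chosen at random with a probability that decays over training. This is an illustrative reconstruction, not the authors' implementation: the PyTorch model, the inverse-sigmoid schedule, and the names `OneStepRNN`, `teacher_forcing_prob`, and `train_epoch` are assumptions introduced here for clarity.

```python
# Minimal sketch of scheduled autoregressive training (a BPTT-SA-style loop).
# Assumptions not taken from the paper: PyTorch, an LSTM one-step predictor,
# an inverse-sigmoid decay of the teacher-forcing probability, and all names.
import math
import torch
import torch.nn as nn


class OneStepRNN(nn.Module):
    """Predicts the next state x_{t+1} from the current input and hidden state."""

    def __init__(self, state_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.cell = nn.LSTMCell(state_dim, hidden_dim)
        self.readout = nn.Linear(hidden_dim, state_dim)

    def forward(self, x, hc=None):
        hc = self.cell(x, hc)          # hc is the (hidden, cell) state pair
        return self.readout(hc[0]), hc


def teacher_forcing_prob(step: int, k: float = 500.0) -> float:
    """Inverse-sigmoid decay: starts near 1 (full teacher forcing) and
    decays toward 0 (fully autoregressive) as training progresses."""
    return k / (k + math.exp(step / k))


def train_epoch(model, dataloader, optimizer, global_step):
    """One epoch of scheduled autoregressive training over trajectory segments.

    Each batch `traj` has shape (batch, T, state_dim). The network is unrolled
    over the segment (truncated BPTT); at every step the next input is either
    the ground-truth state or the model's own prediction.
    """
    loss_fn = nn.MSELoss()
    for traj in dataloader:
        eps = teacher_forcing_prob(global_step)
        hc = None
        x_in = traj[:, 0]                        # first input is always ground truth
        loss = 0.0
        for t in range(traj.shape[1] - 1):
            pred, hc = model(x_in, hc)
            loss = loss + loss_fn(pred, traj[:, t + 1])
            # Scheduled sampling: with probability eps feed the ground-truth
            # next state, otherwise feed back the prediction so the gradient
            # flows through the autoregressive chain within the segment.
            use_truth = torch.rand(()) < eps
            x_in = traj[:, t + 1] if use_truth else pred
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        global_step += 1
    return global_step
```

One design choice worth noting in such a sketch: feeding back the prediction without detaching it lets the one-step losses also penalize errors that compound across the unrolled segment, which is the training/inference consistency the abstract argues for; detaching the fed-back prediction would instead treat it as a fixed, noisy input.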