{"title":"在预测中学习:融合训练和自回归推理进行长期时空预测","authors":"P.R. Vlachas , P. Koumoutsakos","doi":"10.1016/j.physd.2024.134371","DOIUrl":null,"url":null,"abstract":"<div><p>Predictions of complex systems ranging from natural language processing to weather forecasting have benefited from advances in Recurrent Neural Networks (RNNs). RNNs are typically trained using techniques like Backpropagation Through Time (BPTT) to minimize one-step-ahead prediction loss. During testing, RNNs often operate in an auto-regressive mode, with the output of the network fed back into its input. However, this process can eventually result in exposure bias since the network has been trained to process ”ground-truth” data rather than its own predictions. This inconsistency causes errors that compound over time, indicating that the distribution of data used for evaluating losses differs from the actual operating conditions encountered by the model during training. Inspired by the solution to this challenge in language processing networks we propose the Scheduled Autoregressive Truncated Backpropagation Through Time (BPTT-SA) algorithm for predicting complex dynamical systems using RNNs. We find that BPTT-SA effectively reduces iterative error propagation in Convolutional and Convolutional Autoencoder RNNs and demonstrates its capabilities in the long-term prediction of high-dimensional fluid flows.</p></div>","PeriodicalId":20050,"journal":{"name":"Physica D: Nonlinear Phenomena","volume":"470 ","pages":"Article 134371"},"PeriodicalIF":2.7000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S016727892400321X/pdfft?md5=c5231aef9d912b65fa750a286252a7f5&pid=1-s2.0-S016727892400321X-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Learning on predictions: Fusing training and autoregressive inference for long-term spatiotemporal forecasts\",\"authors\":\"P.R. Vlachas , P. Koumoutsakos\",\"doi\":\"10.1016/j.physd.2024.134371\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Predictions of complex systems ranging from natural language processing to weather forecasting have benefited from advances in Recurrent Neural Networks (RNNs). RNNs are typically trained using techniques like Backpropagation Through Time (BPTT) to minimize one-step-ahead prediction loss. During testing, RNNs often operate in an auto-regressive mode, with the output of the network fed back into its input. However, this process can eventually result in exposure bias since the network has been trained to process ”ground-truth” data rather than its own predictions. This inconsistency causes errors that compound over time, indicating that the distribution of data used for evaluating losses differs from the actual operating conditions encountered by the model during training. Inspired by the solution to this challenge in language processing networks we propose the Scheduled Autoregressive Truncated Backpropagation Through Time (BPTT-SA) algorithm for predicting complex dynamical systems using RNNs. 
We find that BPTT-SA effectively reduces iterative error propagation in Convolutional and Convolutional Autoencoder RNNs and demonstrates its capabilities in the long-term prediction of high-dimensional fluid flows.</p></div>\",\"PeriodicalId\":20050,\"journal\":{\"name\":\"Physica D: Nonlinear Phenomena\",\"volume\":\"470 \",\"pages\":\"Article 134371\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-09-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S016727892400321X/pdfft?md5=c5231aef9d912b65fa750a286252a7f5&pid=1-s2.0-S016727892400321X-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Physica D: Nonlinear Phenomena\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S016727892400321X\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Physica D: Nonlinear Phenomena","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S016727892400321X","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Learning on predictions: Fusing training and autoregressive inference for long-term spatiotemporal forecasts
Predictions of complex systems ranging from natural language processing to weather forecasting have benefited from advances in Recurrent Neural Networks (RNNs). RNNs are typically trained with techniques such as Backpropagation Through Time (BPTT) to minimize the one-step-ahead prediction loss. During testing, however, RNNs often operate in an autoregressive mode, with the output of the network fed back into its input. This mismatch eventually results in exposure bias, since the network has been trained to process "ground-truth" data rather than its own predictions. The resulting errors compound over time: the data distribution on which the loss is evaluated during training differs from the operating conditions the model actually encounters during autoregressive inference. Inspired by the solution to this challenge in language-processing networks, we propose the Scheduled Autoregressive Truncated Backpropagation Through Time (BPTT-SA) algorithm for predicting complex dynamical systems with RNNs. We find that BPTT-SA effectively reduces iterative error propagation in Convolutional and Convolutional Autoencoder RNNs and demonstrates its capabilities in the long-term prediction of high-dimensional fluid flows.
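As a rough illustration of the idea described in the abstract (not the authors' implementation), the following Python/PyTorch sketch shows one way a scheduled-sampling-style rule can be folded into truncated BPTT for an RNN forecaster: at each step the network is fed either the ground-truth state or its own previous prediction, with the ground-truth probability annealed toward zero over training. All names here (ForecastRNN, train_window, p_truth) and design choices (GRU cell, detaching the fed-back prediction) are assumptions made for the sketch.

```python
# Minimal sketch of scheduled autoregressive training with truncated BPTT.
# Not the paper's code: names and design choices are illustrative assumptions.
import random
import torch
import torch.nn as nn


class ForecastRNN(nn.Module):
    """Single-cell GRU forecaster mapping the state at time t to time t+1."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.cell = nn.GRUCell(dim, hidden)
        self.readout = nn.Linear(hidden, dim)

    def step(self, x, h):
        h = self.cell(x, h)
        return self.readout(h), h


def train_window(model, optimizer, window, p_truth):
    """One truncated-BPTT update on a (T, dim) window of ground-truth states.

    p_truth is the probability of feeding the ground-truth state at each step;
    scheduling it from 1.0 toward 0.0 over training moves the network from
    teacher forcing toward fully autoregressive operation.
    """
    T, _ = window.shape
    h = torch.zeros(1, model.cell.hidden_size)
    x = window[0:1]  # seed with the first ground-truth state
    loss = torch.zeros(())
    for t in range(1, T):
        pred, h = model.step(x, h)
        loss = loss + torch.mean((pred - window[t:t + 1]) ** 2)
        # Scheduled sampling: mix ground truth with the network's own output.
        # Detaching the fed-back prediction is one possible design choice.
        x = window[t:t + 1] if random.random() < p_truth else pred.detach()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, p_truth would be decayed across epochs (e.g., linearly or with an inverse-sigmoid schedule), so that the training regime gradually approaches the autoregressive mode used at inference time.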
Journal introduction:
Physica D (Nonlinear Phenomena) publishes research and review articles reporting on experimental and theoretical works, techniques and ideas that advance the understanding of nonlinear phenomena. Topics encompass wave motion in physical, chemical and biological systems; physical or biological phenomena governed by nonlinear field equations, including hydrodynamics and turbulence; pattern formation and cooperative phenomena; instability, bifurcations, chaos, and space-time disorder; integrable/Hamiltonian systems; asymptotic analysis and, more generally, mathematical methods for nonlinear systems.