{"title":"MSDformer: an autocorrelation transformer with multiscale decomposition for long-term multivariate time series forecasting","authors":"Guangyao Su, Yepeng Guan","doi":"10.1007/s10489-024-06105-6","DOIUrl":null,"url":null,"abstract":"<p>The improvement of performance and efficiency in long-term time series forecasting is significant for practical applications. However, while enhancing overall performance, existing time series forecasting methods often exhibit unsatisfactory capabilities in the restoration of details and prediction efficiency. To address these issues, an autocorrelation Transformer with multiscale decomposition (MSDformer) is proposed for long-term multivariate time series forecasting. Specifically, a multiscale decomposition (MSDecomp) module is designed, which identifies the temporal repeating patterns in time series with different scales to retain more historical details while extracting trend components. An Encoder layer is proposed based on the MSDecomp module and Auto-Correlation mechanism, which discovers the similarity of subsequences in a periodic manner and effectively captures the seasonal components to improve the degree of restoration of prediction details while extracting the residual trend components. Finally, unlike the traditional Transformer structure, the decoder structure is replaced by the proposed Autoregressive module to simplify the output mode of the decoder and enhance linear information. Compared to other advanced and representative models on six real-world datasets, the experimental results demonstrate that the MSDformer has a relative performance improvement of an average of 8.1%. MSDformer also has lower memory usage and temporal consumption, making it more advantageous for long-term time series forecasting.</p>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 2","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2024-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-024-06105-6","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Improving the performance and efficiency of long-term time series forecasting is important for practical applications. However, while enhancing overall performance, existing time series forecasting methods often fall short in restoring fine-grained details and in prediction efficiency. To address these issues, an autocorrelation Transformer with multiscale decomposition (MSDformer) is proposed for long-term multivariate time series forecasting. Specifically, a multiscale decomposition (MSDecomp) module is designed, which identifies temporally repeating patterns in the time series at different scales, retaining more historical detail while extracting trend components. An encoder layer is built on the MSDecomp module and the Auto-Correlation mechanism; it discovers subsequence similarity at the period level and effectively captures seasonal components, improving the restoration of prediction details while extracting the residual trend components. Finally, unlike the traditional Transformer structure, the decoder is replaced by a proposed Autoregressive module, which simplifies the decoder's output stage and strengthens linear information. Experiments on six real-world datasets show that MSDformer achieves an average relative performance improvement of 8.1% over other advanced, representative models. MSDformer also has lower memory usage and time consumption, making it better suited to long-term time series forecasting.
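To make the decomposition idea in the abstract concrete, below is a minimal sketch of what a multiscale decomposition step could look like in PyTorch: moving averages at several window sizes are averaged into a trend estimate, and the residual is treated as the seasonal/detail component. The class name, kernel sizes, and averaging scheme are illustrative assumptions, not the paper's actual MSDecomp implementation.

```python
import torch
import torch.nn as nn


class MultiScaleDecomp(nn.Module):
    """Sketch of a multiscale series decomposition: trend = mean of several
    moving averages at different scales; seasonal = input - trend.
    Kernel sizes are illustrative assumptions."""

    def __init__(self, kernel_sizes=(5, 13, 25)):
        super().__init__()
        self.kernel_sizes = kernel_sizes

    def moving_avg(self, x, k):
        # x: (batch, length, channels); pad both ends so output length matches input
        front = x[:, :1, :].repeat(1, (k - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, k - 1 - (k - 1) // 2, 1)
        padded = torch.cat([front, x, end], dim=1)
        # avg_pool1d expects (batch, channels, length)
        return nn.functional.avg_pool1d(
            padded.permute(0, 2, 1), kernel_size=k, stride=1
        ).permute(0, 2, 1)

    def forward(self, x):
        # Average the trend estimates extracted at each scale
        trend = torch.stack(
            [self.moving_avg(x, k) for k in self.kernel_sizes], dim=0
        ).mean(dim=0)
        seasonal = x - trend  # residual keeps the finer repeating details
        return seasonal, trend


# Usage: decompose a batch of 32 multivariate series (length 96, 7 variables)
x = torch.randn(32, 96, 7)
seasonal, trend = MultiScaleDecomp()(x)
print(seasonal.shape, trend.shape)  # torch.Size([32, 96, 7]) for both
```

Averaging several window sizes rather than using a single moving average is one way to retain repeating patterns at multiple temporal scales while still isolating a smooth trend, which matches the role the abstract attributes to MSDecomp.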
Journal description:
With a focus on research in artificial intelligence and neural networks, this journal addresses solutions to real-life problems in manufacturing, defense, management, government, and industry that are too complex to be solved through conventional approaches and that require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.