Hongwei Jiang, Dongsheng Liu, Xinyi Ding, Yaning Chen, Hongtao Li
{"title":"基于仿射变换的高效轻量级mlp网络,用于长期时间序列预测","authors":"Hongwei Jiang, Dongsheng Liu, Xinyi Ding, Yaning Chen, Hongtao Li","doi":"10.1016/j.neucom.2024.128960","DOIUrl":null,"url":null,"abstract":"<div><div>Time series forecasting (TSF) involves extracting underlying patterns from past information to predict future sequences over a specific period. Extending the prediction length of time series and improving the prediction accuracy have always been challenging tasks. Autoregressive prediction methods based on Markov chains tend to accumulate errors over time. Although Transformer-based methods with various self-attention mechanisms have shown some improvements, they require higher memory and computational resources. In this work, we present an effective MLP-based TSF framework named TCM, which models the sequence and channel dependencies separately using Token MLP and Channel MLP. Additionally, we employ the Affine Transformation to replace layer normalization or batch normalization, leading to substantial enhancements in both accuracy and inference speed. Compared to current state-of-the-art long-term time series forecasting models, TCM achieves 6.0% relative improvement on seven real-world datasets, including electricity, weather, and illness domains. The TCM model, characterized by its efficiency and lightweight architecture, also makes it suitable for scenarios with high real-time requirements.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"617 ","pages":"Article 128960"},"PeriodicalIF":5.5000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"TCM: An efficient lightweight MLP-based network with affine transformation for long-term time series forecasting\",\"authors\":\"Hongwei Jiang, Dongsheng Liu, Xinyi Ding, Yaning Chen, Hongtao Li\",\"doi\":\"10.1016/j.neucom.2024.128960\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Time series forecasting (TSF) involves extracting underlying patterns from past information to predict future sequences over a specific period. Extending the prediction length of time series and improving the prediction accuracy have always been challenging tasks. Autoregressive prediction methods based on Markov chains tend to accumulate errors over time. Although Transformer-based methods with various self-attention mechanisms have shown some improvements, they require higher memory and computational resources. In this work, we present an effective MLP-based TSF framework named TCM, which models the sequence and channel dependencies separately using Token MLP and Channel MLP. Additionally, we employ the Affine Transformation to replace layer normalization or batch normalization, leading to substantial enhancements in both accuracy and inference speed. Compared to current state-of-the-art long-term time series forecasting models, TCM achieves 6.0% relative improvement on seven real-world datasets, including electricity, weather, and illness domains. 
The TCM model, characterized by its efficiency and lightweight architecture, also makes it suitable for scenarios with high real-time requirements.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"617 \",\"pages\":\"Article 128960\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-11-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231224017314\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224017314","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
TCM: An efficient lightweight MLP-based network with affine transformation for long-term time series forecasting
Time series forecasting (TSF) extracts underlying patterns from past observations to predict future values over a given horizon. Extending the prediction horizon while improving accuracy has long been a challenging task. Autoregressive prediction methods based on Markov chains tend to accumulate errors over time. Transformer-based methods with various self-attention mechanisms have brought some improvements, but they demand considerably more memory and computation. In this work, we present an effective MLP-based TSF framework named TCM, which models sequence and channel dependencies separately using a Token MLP and a Channel MLP. In addition, we employ an affine transformation in place of layer normalization or batch normalization, yielding substantial gains in both accuracy and inference speed. Compared with current state-of-the-art long-term time series forecasting models, TCM achieves a 6.0% relative improvement on seven real-world datasets spanning the electricity, weather, and illness domains. Its efficiency and lightweight architecture also make TCM well suited to scenarios with strict real-time requirements.
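The abstract describes the architecture only at a high level; the sketch below is one illustrative reading of it in PyTorch, not the authors' released code. The names (`Affine`, `TCMBlock`, `TCMForecaster`) and all hyperparameters (hidden width, block count, window lengths) are hypothetical. It only shows how a Token MLP mixing along the time axis, a Channel MLP mixing along the variable axis, and a learned affine transformation in place of normalization could fit together.

```python
import torch
import torch.nn as nn

class Affine(nn.Module):
    """Learned per-channel scale and shift, used here instead of LayerNorm/BatchNorm."""
    def __init__(self, dim):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        return x * self.scale + self.shift

class TCMBlock(nn.Module):
    """Illustrative block: the Token MLP mixes along the time axis,
    the Channel MLP mixes along the variable axis."""
    def __init__(self, seq_len, n_channels, hidden=256):
        super().__init__()
        self.affine1 = Affine(n_channels)
        self.token_mlp = nn.Sequential(            # acts on the time dimension
            nn.Linear(seq_len, hidden), nn.GELU(), nn.Linear(hidden, seq_len))
        self.affine2 = Affine(n_channels)
        self.channel_mlp = nn.Sequential(          # acts on the channel dimension
            nn.Linear(n_channels, hidden), nn.GELU(), nn.Linear(hidden, n_channels))

    def forward(self, x):                          # x: (batch, seq_len, n_channels)
        y = self.affine1(x).transpose(1, 2)        # (batch, n_channels, seq_len)
        x = x + self.token_mlp(y).transpose(1, 2)  # residual over time mixing
        x = x + self.channel_mlp(self.affine2(x))  # residual over channel mixing
        return x

class TCMForecaster(nn.Module):
    """Sketch of a full forecaster: stacked blocks followed by a linear head
    mapping the input window to the prediction horizon."""
    def __init__(self, seq_len, pred_len, n_channels, n_blocks=2):
        super().__init__()
        self.blocks = nn.Sequential(
            *[TCMBlock(seq_len, n_channels) for _ in range(n_blocks)])
        self.head = nn.Linear(seq_len, pred_len)

    def forward(self, x):                          # x: (batch, seq_len, n_channels)
        x = self.blocks(x)
        return self.head(x.transpose(1, 2)).transpose(1, 2)  # (batch, pred_len, n_channels)

# Example: 96-step history, 192-step horizon, 7 variables (ETT-style shape, for illustration)
model = TCMForecaster(seq_len=96, pred_len=192, n_channels=7)
out = model(torch.randn(32, 96, 7))
print(out.shape)  # torch.Size([32, 192, 7])
```

Replacing normalization with a plain per-channel scale and shift removes the on-the-fly statistics computation from the forward pass, which is one plausible reason such a design can trade well between accuracy and inference speed.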
About the journal:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its essential topics are neurocomputing theory, practice, and applications.