Long-term time series forecasting (LTSF) is crucial in modern society, playing a pivotal role in facilitating long-term planning and developing early warning systems. While many Transformer-based models have recently been introduced for LTSF, doubts have been raised about the effectiveness of attention modules in capturing cross-time dependencies. In this study, we design a mask-series experiment to examine this claim and subsequently propose the "Cross-variable Linear Integrated ENhanced Transformer for Multivariate Long-Term Time Series Forecasting" (Client), an advanced model that outperforms both traditional Transformer-based models and linear models. Client employs a linear module to learn trend information and an enhanced Transformer module to capture cross-variable dependencies. The cross-variable Transformer module in Client simplifies the embedding and positional encoding layers and replaces the decoder with a projection layer. Extensive experiments on nine real-world datasets confirm the state-of-the-art performance of Client, with less computation time and memory consumption than previous Transformer-based models. Our code is available at https://github.com/daxin007/Client.
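As a rough illustration of the design described above, the following is a minimal PyTorch sketch of a model that combines a per-variable linear trend branch with a cross-variable attention branch. This is not the authors' implementation (see the linked repository for the actual code); the class and parameter names (`Client`, `seq_len`, `pred_len`, `n_vars`, `d_model`), layer counts, and the way the two branches are fused are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class Client(nn.Module):
    """Illustrative sketch (not the official code): a linear branch for
    trend information plus a Transformer branch whose attention mixes
    variables (channels) rather than time steps; the decoder is replaced
    by a simple projection layer, as the abstract describes."""

    def __init__(self, seq_len: int, pred_len: int, n_vars: int, d_model: int = 128):
        super().__init__()
        # Linear branch: map each variable's input window directly to the horizon.
        self.trend_linear = nn.Linear(seq_len, pred_len)
        # Cross-variable branch: embed each variable's whole history as one token,
        # so self-attention operates across variables instead of across time.
        # A plain linear embedding stands in for the simplified embedding layer;
        # no positional encoding is used, since token order over variables is arbitrary.
        self.embed = nn.Linear(seq_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Projection layer in place of a Transformer decoder.
        self.projection = nn.Linear(d_model, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars)
        x = x.transpose(1, 2)                   # (batch, n_vars, seq_len)
        trend = self.trend_linear(x)            # (batch, n_vars, pred_len)
        tokens = self.embed(x)                  # (batch, n_vars, d_model), one token per variable
        attended = self.encoder(tokens)         # attention runs over the variable axis
        cross = self.projection(attended)       # (batch, n_vars, pred_len)
        # Additive fusion of the two branches is an assumption of this sketch.
        return (trend + cross).transpose(1, 2)  # (batch, pred_len, n_vars)

model = Client(seq_len=96, pred_len=192, n_vars=7)
out = model(torch.randn(8, 96, 7))  # -> torch.Size([8, 192, 7])
```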
