In long-term time series forecasting (LSTF), a fundamental challenge lies in simultaneously capturing fine-grained local dynamics and long-range global dependencies within inherently complex and non-stationary temporal series. However, most existing forecasting architectures rely on single-structure paradigms, each exhibiting inherent representational biases: for example, CNNs are constrained by limited receptive fields, while Transformers often overlook fine-grained local patterns. More critically, these architectures typically operate in isolation, lacking collaborative mechanisms to integrate their complementary modeling capabilities effectively. To address these limitations, we propose MatNet, a Multi-scale Adaptive Forecasting Network built on a novel bidirectional collaborative architecture that establishes complementary pathways between CNN and Transformer branches. Within this architecture, local representations extracted by the CNN branch refine and enrich the global context modeled by the Transformer branch, improving the model's sensitivity to fine-grained temporal structures. Conversely, global dependencies captured by the Transformer branch provide high-level semantic guidance to the CNN branch, enabling it to focus on contextually salient local regions and enhancing representation coherence. Additionally, we introduce a Dynamic Temporal-Aware Router that adaptively extracts and fuses temporal features across multiple scales. Extensive experiments on nine public datasets demonstrate that MatNet consistently outperforms existing state-of-the-art methods in forecasting accuracy.
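The bidirectional collaboration described above can be illustrated with a minimal NumPy sketch. This is not the MatNet implementation; the moving-average "local branch", toy self-attention "global branch", and the specific refinement and gating formulas are all hypothetical stand-ins, chosen only to show the two information-flow directions (local refines global, global guides local).

```python
import numpy as np

def local_branch(x, k=3):
    # CNN-style local feature: causal moving average over a window of k
    # steps (a stand-in for the convolutional branch; k is an assumption).
    out = np.empty_like(x)
    for t in range(len(x)):
        out[t] = x[max(0, t - k + 1): t + 1].mean()
    return out

def global_branch(x):
    # Transformer-style global feature: scaled dot-product self-attention
    # over scalar tokens (a toy stand-in for the Transformer branch).
    scores = np.outer(x, x)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

def bidirectional_fuse(x, alpha=0.5):
    # Local -> global: local features additively refine the global context.
    # Global -> local: the global summary gates the local features.
    # The additive/gating forms and the final averaging are assumptions.
    local = local_branch(x)
    glob = global_branch(x)
    global_refined = glob + alpha * local        # local refines global
    local_guided = local * (1 + np.tanh(glob))   # global guides local
    return 0.5 * (global_refined + local_guided)

series = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
fused = bidirectional_fuse(series)
print(fused.shape)  # (5,)
```

The key design point the sketch mirrors is that neither branch's output is used in isolation: each branch's representation is conditioned on the other's before fusion, rather than the two being concatenated independently.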