{"title":"DSFormer-LRTC:利用低张量压缩进行交通预测的动态空间变换器","authors":"Jianli Zhao;Futong Zhuo;Qiuxia Sun;Qing Li;Yiran Hua;Jianye Zhao","doi":"10.1109/TITS.2024.3436523","DOIUrl":null,"url":null,"abstract":"Traffic flow forecasting is challenging due to the intricate spatio-temporal correlations in traffic patterns. Previous works captured spatial dependencies based on graph neural networks and used fixed graph construction methods to characterize spatial relationships, which limits the ability of models to capture dynamic and long-range spatial dependencies. Meanwhile, prior studies did not consider the issue of a large number of redundant parameters in traffic prediction models, which not only increases the storage cost of the model but also reduces its generalization ability. To address the above challenges, we propose a Dynamic Spatial Transformer for Traffic Forecasting with Low-Rank Tensor Compression (DSFormer-LRTC). Specifically, we constructed a global spatial Transformer to capture remote spatial dependencies, and a distance-based mask matrix is used in local spatial Transformer to enhance the adjacent spatial influence. To reduce the complexity of the model, the model adopts a design that separates temporal and spatial. Meanwhile, we introduce low-rank tensor decomposition to reconstruct the parameter matrix in Transformer module to compress the proposed model. Experimental results show that DSFormer-LRTC achieves state-of-the-art performance on four real-world datasets. The experimental analysis of attention matrix also proves that the model can learn dynamic and distant spatial features. Finally, the compressed model parameters reduce the original parameter size by two-thirds, while significantly outperforming the baseline model in terms of computational efficiency.","PeriodicalId":13416,"journal":{"name":"IEEE Transactions on Intelligent Transportation Systems","volume":"25 11","pages":"16323-16335"},"PeriodicalIF":7.9000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DSFormer-LRTC: Dynamic Spatial Transformer for Traffic Forecasting With Low-Rank Tensor Compression\",\"authors\":\"Jianli Zhao;Futong Zhuo;Qiuxia Sun;Qing Li;Yiran Hua;Jianye Zhao\",\"doi\":\"10.1109/TITS.2024.3436523\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Traffic flow forecasting is challenging due to the intricate spatio-temporal correlations in traffic patterns. Previous works captured spatial dependencies based on graph neural networks and used fixed graph construction methods to characterize spatial relationships, which limits the ability of models to capture dynamic and long-range spatial dependencies. Meanwhile, prior studies did not consider the issue of a large number of redundant parameters in traffic prediction models, which not only increases the storage cost of the model but also reduces its generalization ability. To address the above challenges, we propose a Dynamic Spatial Transformer for Traffic Forecasting with Low-Rank Tensor Compression (DSFormer-LRTC). Specifically, we constructed a global spatial Transformer to capture remote spatial dependencies, and a distance-based mask matrix is used in local spatial Transformer to enhance the adjacent spatial influence. To reduce the complexity of the model, the model adopts a design that separates temporal and spatial. 
Meanwhile, we introduce low-rank tensor decomposition to reconstruct the parameter matrix in Transformer module to compress the proposed model. Experimental results show that DSFormer-LRTC achieves state-of-the-art performance on four real-world datasets. The experimental analysis of attention matrix also proves that the model can learn dynamic and distant spatial features. Finally, the compressed model parameters reduce the original parameter size by two-thirds, while significantly outperforming the baseline model in terms of computational efficiency.\",\"PeriodicalId\":13416,\"journal\":{\"name\":\"IEEE Transactions on Intelligent Transportation Systems\",\"volume\":\"25 11\",\"pages\":\"16323-16335\"},\"PeriodicalIF\":7.9000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Intelligent Transportation Systems\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10682604/\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, CIVIL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Intelligent Transportation Systems","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10682604/","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
DSFormer-LRTC: Dynamic Spatial Transformer for Traffic Forecasting With Low-Rank Tensor Compression
Traffic flow forecasting is challenging due to the intricate spatio-temporal correlations in traffic patterns. Previous works captured spatial dependencies based on graph neural networks and used fixed graph construction methods to characterize spatial relationships, which limits the ability of models to capture dynamic and long-range spatial dependencies. Meanwhile, prior studies did not consider the issue of a large number of redundant parameters in traffic prediction models, which not only increases the storage cost of the model but also reduces its generalization ability. To address the above challenges, we propose a Dynamic Spatial Transformer for Traffic Forecasting with Low-Rank Tensor Compression (DSFormer-LRTC). Specifically, we constructed a global spatial Transformer to capture remote spatial dependencies, and a distance-based mask matrix is used in local spatial Transformer to enhance the adjacent spatial influence. To reduce the complexity of the model, the model adopts a design that separates temporal and spatial. Meanwhile, we introduce low-rank tensor decomposition to reconstruct the parameter matrix in Transformer module to compress the proposed model. Experimental results show that DSFormer-LRTC achieves state-of-the-art performance on four real-world datasets. The experimental analysis of attention matrix also proves that the model can learn dynamic and distant spatial features. Finally, the compressed model parameters reduce the original parameter size by two-thirds, while significantly outperforming the baseline model in terms of computational efficiency.
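For readers unfamiliar with the two mechanisms named in the abstract, the following minimal PyTorch sketch illustrates the general ideas rather than the authors' implementation: a distance-based additive mask that restricts local spatial attention to nearby nodes, and a simple rank-r matrix factorization standing in for the paper's low-rank tensor decomposition of Transformer weights. All names (LowRankLinear, distance_mask, local_spatial_attention), the rank, and the distance threshold are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LowRankLinear(nn.Module):
    """Replaces a dense d_in x d_out weight with two rank-r factors (illustrative only)."""

    def __init__(self, d_in: int, d_out: int, rank: int):
        super().__init__()
        self.u = nn.Linear(d_in, rank, bias=False)  # d_in -> r
        self.v = nn.Linear(rank, d_out, bias=True)  # r -> d_out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.v(self.u(x))


def distance_mask(dist: torch.Tensor, threshold: float) -> torch.Tensor:
    """Additive attention mask: 0 for node pairs within `threshold`, -inf otherwise."""
    mask = torch.zeros_like(dist)
    mask[dist > threshold] = float("-inf")
    return mask


def local_spatial_attention(q, k, v, dist, threshold=2.0):
    """Scaled dot-product attention over nodes, restricted to nearby neighbors."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5       # (N, N) pairwise scores
    scores = scores + distance_mask(dist, threshold)  # suppress distant node pairs
    return F.softmax(scores, dim=-1) @ v


if __name__ == "__main__":
    num_nodes, hidden, rank = 8, 16, 4
    x = torch.randn(num_nodes, hidden)

    proj = LowRankLinear(hidden, hidden, rank)        # low-rank projection (shared by q, k, v here)
    q = k = v = proj(x)

    dist = torch.rand(num_nodes, num_nodes) * 5.0     # hypothetical pairwise road-network distances
    dist.fill_diagonal_(0.0)                          # a node is always "close" to itself

    out = local_spatial_attention(q, k, v, dist)
    print(out.shape)                                  # torch.Size([8, 16])
```

In the paper the compression is applied to the Transformer parameter tensors via low-rank tensor decomposition; the two-factor linear layer above is only the simplest matrix analogue of that idea.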
Journal Introduction:
IEEE Transactions on Intelligent Transportation Systems covers the theoretical, experimental, and operational aspects of electrical and electronics engineering and information technologies as applied to Intelligent Transportation Systems (ITS). Intelligent Transportation Systems are defined as those systems utilizing synergistic technologies and systems engineering concepts to develop and improve transportation systems of all kinds. The scope of this interdisciplinary activity includes the promotion, consolidation, and coordination of ITS technical activities among IEEE entities, and providing a focus for cooperative activities, both internally and externally.