Efficient time series adaptive representation learning via Dynamic Routing Sparse Attention
Wenyan Wang, Enguang Zuo, Chen Chen, Cheng Chen, Jie Zhong, Ziwei Yan, Xiaoyi Lv
Pattern Recognition, Volume 158, Article 111058. Published online 2024-09-28. DOI: 10.1016/j.patcog.2024.111058
Citations: 0
Abstract
Time series prediction plays a crucial role in many fields but also faces significant challenges. Converting the original 1D time series into 2D data through dimension transformation captures more hidden features but incurs high memory consumption and low time efficiency. To address these issues, we design a sparse attention mechanism with dynamic routing perception called Dynamic Routing Sparse Attention (DRSA). DRSA effectively handles the variations of complex time series data, and, under memory constraints, its Dynamic Routing Filter (DRF) module further refines the representation by filtering the blocked 2D time series data to identify the most relevant feature vectors in the local context. We conducted predictive experiments on six real-world time series datasets with fine granularity and long sequence dependencies. Compared to eight state-of-the-art (SOTA) models, DRSA achieves relative improvements ranging from 4.18% to 81.02%, and its time efficiency is 2 to 5 times higher than the baselines. Our code and dataset will be available at https://github.com/wwy8/DRSA_main.
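To make the general idea concrete, the sketch below folds a 1D series into 2D blocks and applies block-sparse attention in which each query block attends only to its top-k most relevant key blocks, selected by a coarse block-level routing score. This is a minimal PyTorch sketch of one possible reading of the abstract: the function names, the mean-pooled block summaries used for routing, and the top-k selection are illustrative assumptions, not the authors' actual DRSA/DRF implementation (see the linked repository for that).

```python
# Hedged sketch: block-sparse attention with top-k routing over a 2D-reshaped
# time series. All names and design choices here are assumptions for
# illustration only, not the paper's implementation.
import torch
import torch.nn.functional as F


def reshape_1d_to_2d(x: torch.Tensor, period: int) -> torch.Tensor:
    """Fold a 1D series (batch, length, dim) into blocks of size `period`.

    Returns (batch, num_blocks, period, dim); the tail is zero-padded.
    """
    b, length, d = x.shape
    pad = (-length) % period
    x = F.pad(x, (0, 0, 0, pad))  # pad the time axis to a multiple of `period`
    return x.view(b, (length + pad) // period, period, d)


def routed_sparse_attention(x: torch.Tensor, period: int, top_k: int) -> torch.Tensor:
    """Each query block attends only to its `top_k` most relevant key blocks,
    chosen by a coarse routing score between block summaries (one possible
    reading of a "dynamic routing filter")."""
    blocks = reshape_1d_to_2d(x, period)               # (b, n, p, d)
    b, n, p, d = blocks.shape

    summaries = blocks.mean(dim=2)                     # (b, n, d) block summaries
    route = summaries @ summaries.transpose(1, 2) / d ** 0.5   # (b, n, n) routing scores
    top_k = min(top_k, n)
    idx = route.topk(top_k, dim=-1).indices            # (b, n, k) selected key blocks

    # Gather the selected key/value blocks for every query block.
    gather = idx[..., None, None].expand(b, n, top_k, p, d)
    kv = blocks[:, None].expand(b, n, n, p, d).gather(2, gather)  # (b, n, k, p, d)
    kv = kv.reshape(b, n, top_k * p, d)

    # Standard scaled dot-product attention, restricted to the routed blocks.
    q = blocks                                         # (b, n, p, d)
    attn = torch.softmax(q @ kv.transpose(-1, -2) / d ** 0.5, dim=-1)
    out = attn @ kv                                    # (b, n, p, d)
    return out.reshape(b, n * p, d)[:, : x.shape[1]]   # unfold back to 1D length


# Toy usage: a batch of 2 series, length 96, 8 channels.
x = torch.randn(2, 96, 8)
y = routed_sparse_attention(x, period=12, top_k=3)
print(y.shape)  # torch.Size([2, 96, 8])
```

Restricting each query block to a small number of routed key blocks is what keeps memory sublinear in the number of block pairs; the block-summary routing step is the cheap filter that decides which pairs are worth full attention.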
About the journal:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.