{"title":"ALSTM:持续序列数据的自适应LSTM","authors":"Dejiao Niu, Zheng Xia, Yawen Liu, Tao Cai, Tianquan Liu, Yongzhao Zhan","doi":"10.1109/ICTAI.2018.00032","DOIUrl":null,"url":null,"abstract":"Long short-term memory (LSTM) network is an effective model architecture for deep learning approaches to sequence modeling tasks. However, the current LSTMs can't use the property of sequential data when dealing with the sequence components, which last for a certain period of time. This may make the model unable to benefit from the inherent characteristics of time series and result in poor performance as well as lower efficiency. In this paper, we present a novel adaptive LSTM for durative sequential data which exploits the temporal continuance of the input data in designing a new LSTM unit. By adding a new mask gate and maintaining span, the cell's memory update is not only determined by the input data but also affected by its duration. An adaptive memory update method is proposed according to the change of the sequence input at each time step. This breaks the limitation that the cells calculate the cell state and hidden output for each input always in a unified manner, making the model more suitable for processing the sequences with continuous data. The experimental results on various sequence training tasks show that under the same iteration epochs, the proposed method can achieve higher accuracy, but need relatively less training time compared with the standard LSTM architecture.","PeriodicalId":254686,"journal":{"name":"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"ALSTM: Adaptive LSTM for Durative Sequential Data\",\"authors\":\"Dejiao Niu, Zheng Xia, Yawen Liu, Tao Cai, Tianquan Liu, Yongzhao Zhan\",\"doi\":\"10.1109/ICTAI.2018.00032\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Long short-term memory (LSTM) network is an effective model architecture for deep learning approaches to sequence modeling tasks. However, the current LSTMs can't use the property of sequential data when dealing with the sequence components, which last for a certain period of time. This may make the model unable to benefit from the inherent characteristics of time series and result in poor performance as well as lower efficiency. In this paper, we present a novel adaptive LSTM for durative sequential data which exploits the temporal continuance of the input data in designing a new LSTM unit. By adding a new mask gate and maintaining span, the cell's memory update is not only determined by the input data but also affected by its duration. An adaptive memory update method is proposed according to the change of the sequence input at each time step. This breaks the limitation that the cells calculate the cell state and hidden output for each input always in a unified manner, making the model more suitable for processing the sequences with continuous data. 
The experimental results on various sequence training tasks show that under the same iteration epochs, the proposed method can achieve higher accuracy, but need relatively less training time compared with the standard LSTM architecture.\",\"PeriodicalId\":254686,\"journal\":{\"name\":\"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"volume\":\"62 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICTAI.2018.00032\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI.2018.00032","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The long short-term memory (LSTM) network is an effective architecture for deep learning approaches to sequence modeling tasks. However, current LSTMs cannot exploit a basic property of sequential data: individual sequence components often persist for a certain period of time. The model therefore cannot benefit from this inherent characteristic of time series, which degrades both accuracy and efficiency. In this paper, we present a novel adaptive LSTM (ALSTM) for durative sequential data that exploits the temporal continuance of the input in the design of a new LSTM unit. By adding a new mask gate and a maintaining span, the cell's memory update is determined not only by the input data but also by how long that input has lasted. We propose an adaptive memory update method driven by whether the sequence input changes at each time step. This removes the restriction that cells always compute the cell state and hidden output in the same uniform way for every input, making the model better suited to sequences with durative data. Experimental results on various sequence training tasks show that, for the same number of training epochs, the proposed method achieves higher accuracy while requiring less training time than the standard LSTM architecture.
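The abstract does not give the ALSTM equations, but the mechanism it describes, a mask gate that lets the memory update depend on whether the input has changed since the previous time step, can be sketched in a few lines. The following is a minimal PyTorch interpretation under stated assumptions: the class name AdaptiveLSTMCell, the hard change test with tolerance tol, and the state-blending scheme are all illustrative choices, not the authors' actual formulation.

```python
import torch
import torch.nn as nn

class AdaptiveLSTMCell(nn.Module):
    """Sketch of an LSTM cell with a mask gate: while the input persists
    unchanged, the cell carries its previous state forward instead of
    applying a full memory update (an assumption; the paper's exact
    equations are not given in the abstract)."""

    def __init__(self, input_size: int, hidden_size: int, tol: float = 1e-6):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.tol = tol  # tolerance below which the input counts as "unchanged"

    def forward(self, x_t, x_prev, state):
        h, c = state
        # Mask gate: 1 for batch rows whose input changed since the last step,
        # 0 for rows whose input is still within its maintaining span.
        changed = (x_t - x_prev).abs().amax(dim=1) > self.tol   # (batch,)
        mask = changed.to(x_t.dtype).unsqueeze(1)               # (batch, 1)
        # Full LSTM update, then blend: unchanged rows keep the old state.
        h_new, c_new = self.cell(x_t, (h, c))
        h = mask * h_new + (1.0 - mask) * h
        c = mask * c_new + (1.0 - mask) * c
        return h, c

# Usage over a (seq_len, batch, input_size) tensor:
seq = torch.randn(20, 4, 8)
cell = AdaptiveLSTMCell(input_size=8, hidden_size=16)
h = torch.zeros(4, 16)
c = torch.zeros(4, 16)
x_prev = torch.full_like(seq[0], float("inf"))  # force an update at t = 0
for x_t in seq:
    h, c = cell(x_t, x_prev, (h, c))
    x_prev = x_t
```

The blending form keeps the sketch simple and differentiable; the efficiency gain reported in the abstract would correspond to skipping the gate computation entirely for rows whose input has not changed, rather than computing it and discarding the result as done here.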