ALSTM: Adaptive LSTM for Durative Sequential Data

Dejiao Niu, Zheng Xia, Yawen Liu, Tao Cai, Tianquan Liu, Yongzhao Zhan
{"title":"ALSTM:持续序列数据的自适应LSTM","authors":"Dejiao Niu, Zheng Xia, Yawen Liu, Tao Cai, Tianquan Liu, Yongzhao Zhan","doi":"10.1109/ICTAI.2018.00032","DOIUrl":null,"url":null,"abstract":"Long short-term memory (LSTM) network is an effective model architecture for deep learning approaches to sequence modeling tasks. However, the current LSTMs can't use the property of sequential data when dealing with the sequence components, which last for a certain period of time. This may make the model unable to benefit from the inherent characteristics of time series and result in poor performance as well as lower efficiency. In this paper, we present a novel adaptive LSTM for durative sequential data which exploits the temporal continuance of the input data in designing a new LSTM unit. By adding a new mask gate and maintaining span, the cell's memory update is not only determined by the input data but also affected by its duration. An adaptive memory update method is proposed according to the change of the sequence input at each time step. This breaks the limitation that the cells calculate the cell state and hidden output for each input always in a unified manner, making the model more suitable for processing the sequences with continuous data. The experimental results on various sequence training tasks show that under the same iteration epochs, the proposed method can achieve higher accuracy, but need relatively less training time compared with the standard LSTM architecture.","PeriodicalId":254686,"journal":{"name":"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"ALSTM: Adaptive LSTM for Durative Sequential Data\",\"authors\":\"Dejiao Niu, Zheng Xia, Yawen Liu, Tao Cai, Tianquan Liu, Yongzhao Zhan\",\"doi\":\"10.1109/ICTAI.2018.00032\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Long short-term memory (LSTM) network is an effective model architecture for deep learning approaches to sequence modeling tasks. However, the current LSTMs can't use the property of sequential data when dealing with the sequence components, which last for a certain period of time. This may make the model unable to benefit from the inherent characteristics of time series and result in poor performance as well as lower efficiency. In this paper, we present a novel adaptive LSTM for durative sequential data which exploits the temporal continuance of the input data in designing a new LSTM unit. By adding a new mask gate and maintaining span, the cell's memory update is not only determined by the input data but also affected by its duration. An adaptive memory update method is proposed according to the change of the sequence input at each time step. This breaks the limitation that the cells calculate the cell state and hidden output for each input always in a unified manner, making the model more suitable for processing the sequences with continuous data. 
The experimental results on various sequence training tasks show that under the same iteration epochs, the proposed method can achieve higher accuracy, but need relatively less training time compared with the standard LSTM architecture.\",\"PeriodicalId\":254686,\"journal\":{\"name\":\"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"volume\":\"62 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICTAI.2018.00032\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI.2018.00032","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

Long short-term memory (LSTM) networks are an effective model architecture for deep learning approaches to sequence modeling tasks. However, current LSTMs cannot exploit a key property of sequential data: its components often persist for a certain period of time. As a result, the model cannot benefit from the inherent characteristics of time series, leading to poorer performance and lower efficiency. In this paper, we present a novel adaptive LSTM for durative sequential data that exploits the temporal continuance of the input in the design of a new LSTM unit. By adding a new mask gate and a maintaining span, the cell's memory update is determined not only by the input data but also by its duration. An adaptive memory update method is proposed that responds to how much the sequence input changes at each time step. This removes the limitation that cells always compute the cell state and hidden output in a uniform manner for every input, making the model better suited to processing sequences of durative data. Experimental results on various sequence training tasks show that, for the same number of iteration epochs, the proposed method achieves higher accuracy while requiring less training time than the standard LSTM architecture.
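The abstract describes the mask-gate mechanism only at a high level, so the sketch below is one plausible reading of the idea, not the paper's actual equations. It assumes a gate (here called `mask_gate`) that measures how much the input has changed since the previous step and interpolates between a full LSTM update and the previous state, so a durative (slowly varying) input mostly reuses its existing memory. The class name `ALSTMCell` and the difference-based gating are illustrative assumptions.

```python
# Minimal PyTorch sketch of an "adaptive" LSTM cell in the spirit of the
# abstract. NOTE: the gate formulation here is an assumption; the paper's
# exact update rule is not given in the abstract.
import torch
import torch.nn as nn

class ALSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Standard LSTM cell providing the full per-step update.
        self.lstm = nn.LSTMCell(input_size, hidden_size)
        # Hypothetical mask gate: maps the change in the input to a
        # per-unit update weight in [0, 1].
        self.mask_gate = nn.Sequential(
            nn.Linear(input_size, hidden_size), nn.Sigmoid()
        )

    def forward(self, x, state, x_prev):
        h_prev, c_prev = state
        # Full standard LSTM update for the current step.
        h_new, c_new = self.lstm(x, (h_prev, c_prev))
        # Weight the update by how much the input changed since the last
        # step; a near-duplicate input keeps most of the previous memory.
        m = self.mask_gate(x - x_prev)
        c = m * c_new + (1 - m) * c_prev
        h = m * h_new + (1 - m) * h_prev
        return h, c

# Example usage over a toy sequence of shape (time, batch, feature).
cell = ALSTMCell(8, 16)
x_prev = torch.zeros(4, 8)
h, c = torch.zeros(4, 16), torch.zeros(4, 16)
for x in torch.randn(5, 4, 8):
    h, c = cell(x, (h, c), x_prev)
    x_prev = x
```

The interpolated form above keeps the cell fully differentiable; a speed-oriented variant could skip the full LSTM computation entirely when the input is unchanged, which would be one way to account for the training-time savings the abstract reports.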