A Modified Long Short-Term Memory Cell.

IF 6.6 | Zone 2, Computer Science | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | International Journal of Neural Systems | Pub Date: 2023-07-01 | DOI: 10.1142/S0129065723500399
Giannis Haralabopoulos, Gerasimos Razis, Ioannis Anagnostopoulos
{"title":"A Modified Long Short-Term Memory Cell.","authors":"Giannis Haralabopoulos,&nbsp;Gerasimos Razis,&nbsp;Ioannis Anagnostopoulos","doi":"10.1142/S0129065723500399","DOIUrl":null,"url":null,"abstract":"<p><p>Machine Learning (ML), among other things, facilitates Text Classification, the task of assigning classes to textual items. Classification performance in ML has been significantly improved due to recent developments, including the rise of Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs), and Transformer Models. Internal memory states with dynamic temporal behavior can be found in these kinds of cells. This temporal behavior in the LSTM cell is stored in two different states: \"Current\" and \"Hidden\". In this work, we define a modification layer within the LSTM cell which allows us to perform additional state adjustments for either state, or even simultaneously alter both. We perform 17 state alterations. Out of these 17 single-state alteration experiments, 12 involve the Current state whereas five involve the Hidden one. These alterations are evaluated using seven datasets related to sentiment analysis, document classification, hate speech detection, and human-to-robot interaction. Our results showed that the highest performing alteration for Current and Hidden state can achieve an average <i>F</i>1 improvement of 0.5% and 0.3%, respectively. We also compare our modified cell performance to two Transformer models, where our modified LSTM cell is outperformed in classification metrics in 4/6 datasets, but improves upon the simple Transformer model and clearly has a better cost efficiency than both Transformer models.</p>","PeriodicalId":50305,"journal":{"name":"International Journal of Neural Systems","volume":"33 7","pages":"2350039"},"PeriodicalIF":6.6000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Neural Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1142/S0129065723500399","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Machine Learning (ML) facilitates, among other tasks, Text Classification: the task of assigning classes to textual items. Classification performance in ML has improved significantly with recent developments, including the rise of Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs), and Transformer models. These kinds of cells contain internal memory states with dynamic temporal behavior. In the LSTM cell, this temporal behavior is stored in two different states: "Current" and "Hidden". In this work, we define a modification layer within the LSTM cell which allows us to perform additional state adjustments for either state, or even to alter both simultaneously. We perform 17 state alterations. Of these 17 single-state alteration experiments, 12 involve the Current state and five involve the Hidden one. These alterations are evaluated on seven datasets related to sentiment analysis, document classification, hate speech detection, and human-to-robot interaction. Our results show that the highest-performing alterations for the Current and Hidden states achieve average F1 improvements of 0.5% and 0.3%, respectively. We also compare our modified cell to two Transformer models: the modified LSTM cell is outperformed on classification metrics in 4/6 datasets, but it improves upon the simple Transformer model and has a clearly better cost efficiency than both Transformer models.
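
The abstract does not spell out the 17 alterations, but the placement of such a modification layer can be illustrated on the standard LSTM recurrence, where the Current (cell) state is updated as c_t = f_t * c_{t-1} + i_t * g_t and the Hidden state as h_t = o_t * tanh(c_t). The PyTorch sketch below is a minimal, hypothetical reading of that idea: ModifiedLSTMCell, cell_alteration, and hidden_alteration are illustrative names of ours, not the authors' implementation, and the example alteration (a simple damping of the cell state) merely stands in for the paper's actual alterations.

import torch
import torch.nn as nn

class ModifiedLSTMCell(nn.Module):
    """Standard LSTM cell with hooks for post-update state alterations.

    Hypothetical sketch: the alteration callables below are placeholders
    for the kind of per-state adjustments the paper describes; the 17
    specific alterations are not reproduced here.
    """

    def __init__(self, input_size, hidden_size,
                 cell_alteration=None, hidden_alteration=None):
        super().__init__()
        # One linear map produces all four gate pre-activations at once.
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.cell_alteration = cell_alteration      # applied to c_t ("Current")
        self.hidden_alteration = hidden_alteration  # applied to h_t ("Hidden")

    def forward(self, x, state):
        h_prev, c_prev = state
        z = self.gates(torch.cat([x, h_prev], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)

        c = f * c_prev + i * g           # Current (cell) state update
        if self.cell_alteration is not None:
            c = self.cell_alteration(c)  # modification layer on the Current state

        h = o * torch.tanh(c)            # Hidden state update
        if self.hidden_alteration is not None:
            h = self.hidden_alteration(h)  # modification layer on the Hidden state

        return h, (h, c)

# Usage example with a hypothetical alteration that damps the cell state.
cell = ModifiedLSTMCell(32, 64, cell_alteration=lambda c: 0.9 * c)
x = torch.randn(8, 32)
h0, c0 = torch.zeros(8, 64), torch.zeros(8, 64)
h, (h1, c1) = cell(x, (h0, c0))

Passing both callables would correspond to the paper's simultaneous alteration of the two states; passing neither recovers a plain LSTM cell.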

Source Journal
International Journal of Neural Systems (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 11.30
Self-citation rate: 28.80%
Annual publications: 116
Review time: 24 months
Journal Description: The International Journal of Neural Systems is a monthly, rigorously peer-reviewed transdisciplinary journal focusing on information processing in both natural and artificial neural systems. Special interests include machine learning, computational neuroscience and neurology. The journal prioritizes innovative, high-impact articles spanning multiple fields, including neurosciences and computer science and engineering. It adopts an open-minded approach to this multidisciplinary field, serving as a platform for novel ideas and enhanced understanding of collective and cooperative phenomena in computationally capable systems.
Latest Articles in This Journal
- Epileptic Seizure Detection with an End-to-end Temporal Convolutional Network and Bidirectional Long Short-Term Memory Model
- A graph-based neural approach to linear sum assignment problems
- Automated Quality Evaluation of Large-Scale Benchmark Datasets for Vision-Language Tasks
- sEMG-based Inter-Session Hand Gesture Recognition via Domain Adaptation with Locality Preserving and Maximum Margin
- Cultural Differences in the Assessment of Synthetic Voices