Textual emotion detection utilizing a transfer learning approach.

IF 2.5 | JCR Zone 3 (Computer Science) | Q2 (Computer Science, Hardware & Architecture) | Journal of Supercomputing | Pub Date: 2023-03-22 | DOI: 10.1007/s11227-023-05168-5
Mahsa Hadikhah Mozhdehi, AmirMasoud Eftekhari Moghadam
Citations: 1

Abstract




Many attempts have been made to overcome the challenges of automating textual emotion detection using traditional deep learning models such as LSTM, GRU, and BiLSTM. The problem with these models is that they need large datasets, massive computing resources, and long training times. They are also prone to forgetting and do not perform well on small datasets. In this paper, we aim to demonstrate that transfer learning techniques can better capture the contextual meaning of text, and as a result better detect the emotion it expresses, even without large amounts of data and training time. To do this, we conduct an experiment using a pre-trained model called EmotionalBERT, which is based on bidirectional encoder representations from transformers (BERT), and we compare its performance to that of RNN-based models on two benchmark datasets, focusing on how the amount of training data affects model performance.
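The recipe the abstract describes — reuse a large pre-trained encoder and train only a small task-specific head on limited labelled data — can be sketched in miniature. The snippet below is an illustrative toy, not the paper's EmotionalBERT code: the vocabulary, the random "pre-trained" projection, and the two-class joy/sadness setup are all assumptions made for demonstration. A real implementation would fine-tune an actual BERT checkpoint (e.g., via the Hugging Face `transformers` library) on an emotion-labelled corpus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen stand-in for a pre-trained encoder (hypothetical; the paper's
# EmotionalBERT would supply real contextual embeddings here). We project
# bag-of-words counts through a fixed random matrix that is never updated.
VOCAB = ["happy", "joy", "great", "sad", "cry", "awful"]
W_ENC = rng.normal(size=(len(VOCAB), 8))  # "pre-trained" weights, kept frozen

def encode(text):
    counts = np.array([text.lower().split().count(w) for w in VOCAB], dtype=float)
    return counts @ W_ENC  # frozen features, analogous to a BERT embedding

# Tiny labelled corpus: 1 = joy, 0 = sadness.
texts = ["so happy great joy", "great happy day", "sad awful cry", "cry sad news"]
labels = np.array([1.0, 1.0, 0.0, 0.0])
X = np.stack([encode(t) for t in texts])

# Transfer-learning step: only this small classification head is trained,
# which is why the approach can work with little data and training time.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid over the frozen features
    grad = (p - labels) / len(labels)       # gradient of binary cross-entropy
    w -= 0.1 * (X.T @ grad)                 # update head weights only
    b -= 0.1 * grad.sum()

def predict(text):
    return int(encode(text) @ w + b > 0.0)

preds = [predict(t) for t in texts]
```

Freezing the encoder means only nine parameters (the head's weights and bias) are trained here, which is the core reason transfer learning can succeed on small datasets where end-to-end RNN training overfits or fails to converge.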

Source journal
Journal of Supercomputing (Engineering & Technology: Electronic & Electrical Engineering)
CiteScore: 6.30
Self-citation rate: 12.10%
Articles per year: 734
Review time: 13 months
Journal overview: The Journal of Supercomputing publishes papers on the technology, architecture and systems, algorithms, languages and programs, performance measures and methods, and applications of all aspects of supercomputing. Tutorial and survey papers are intended for workers and students in the fields associated with and employing advanced computer systems. The journal also publishes letters to the editor, especially in areas relating to policy, succinct statements of paradoxes, intuitively puzzling results, partial results and real needs. Published theoretical and practical papers are advanced, in-depth treatments describing new developments and new ideas. Each includes an introduction summarizing prior, directly pertinent work that is useful for the reader to understand, in order to appreciate the advances being described.