TM-BERT: A Twitter Modified BERT for Sentiment Analysis on Covid-19 Vaccination Tweets

Muhammad Talha Riaz, M. Shah Jahan, S. G. Khawaja, A. Shaukat, Jahan Zeb
{"title":"TM-BERT: A Twitter Modified BERT for Sentiment Analysis on Covid-19 Vaccination Tweets","authors":"Muhammad Talha Riaz, M. Shah Jahan, S. G. Khawaja, A. Shaukat, Jahan Zeb","doi":"10.1109/ICoDT255437.2022.9787395","DOIUrl":null,"url":null,"abstract":"In transfer learning a model is pre-trained on a large unsupervised dataset and then fine-tuned on domain-specific downstream tasks. BERT is the first true-natured deep bidirectional language model which reads the input from both sides of input to better understand the context of a sentence by solely relying on the Attention mechanism. This study presents a Twitter Modified BERT (TM-BERT) based upon Transformer architecture. It has also developed a new Covid-19 Vaccination Sentiment Analysis Task (CV-SAT) and a COVID-19 unsupervised pre-training dataset containing (70K) tweets. BERT achieved (0.70) and (0.76) accuracy when fine-tuned on CV-SAT, whereas TM-BERT achieved (0.89), a (19%) and (13%) accuracy over BERT. Another enhancement introduced is in terms of time efficiency as BERT takes (64) hours of pre-training while TM-BERT takes only (17) hours and still produces (19%) improvement even after pre-trained on four (4) times fewer data.","PeriodicalId":291030,"journal":{"name":"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICoDT255437.2022.9787395","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In transfer learning, a model is pre-trained on a large unsupervised dataset and then fine-tuned on domain-specific downstream tasks. BERT is the first genuinely deep bidirectional language model: it reads the input from both directions to better capture the context of a sentence, relying solely on the attention mechanism. This study presents a Twitter Modified BERT (TM-BERT) based on the Transformer architecture. It also develops a new Covid-19 Vaccination Sentiment Analysis Task (CV-SAT) and a COVID-19 unsupervised pre-training dataset containing 70K tweets. BERT achieved 0.70 and 0.76 accuracy when fine-tuned on CV-SAT, whereas TM-BERT achieved 0.89, an improvement of 19 and 13 percentage points over BERT, respectively. TM-BERT also improves time efficiency: BERT requires 64 hours of pre-training, while TM-BERT requires only 17 hours and still delivers the 19-point improvement despite being pre-trained on four times less data.
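The pre-train-then-fine-tune workflow the abstract describes is the standard BERT recipe. As a rough illustration of the fine-tuning stage only, the sketch below uses the Hugging Face transformers library with a generic bert-base-uncased checkpoint, an assumed three-class sentiment label set, and toy tweets; TM-BERT itself and the CV-SAT data are not published on this page, so every name and hyperparameter here is an assumption, not the authors' setup.

```python
# Minimal sketch of fine-tuning a pre-trained BERT encoder on a
# tweet sentiment task. The checkpoint, labels, learning rate, and
# example tweets are illustrative assumptions, not the paper's TM-BERT.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # stand-in; the TM-BERT checkpoint is not public here

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=3  # assumed labels: negative / neutral / positive
)

# Toy labeled tweets standing in for the CV-SAT fine-tuning set.
texts = [
    "Got my covid vaccine today, feeling great!",
    "Still unsure about the vaccine side effects.",
]
labels = torch.tensor([2, 1])  # positive, neutral (assumed encoding)

batch = tokenizer(
    texts, padding=True, truncation=True, max_length=128, return_tensors="pt"
)

# One fine-tuning step: the encoder and the new classification head
# are updated jointly on the downstream cross-entropy loss.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```

In practice this step would run over the full labeled CV-SAT training set for a few epochs; the point of the sketch is that fine-tuning reuses the pre-trained weights wholesale and only the task loss drives the update, which is why a better-matched pre-training corpus (here, COVID-19 tweets) can lift downstream accuracy.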