CrisisBERT: A Robust Transformer for Crisis Classification and Contextual Crisis Embedding

Junhua Liu, Trisha Singhal, L. Blessing, Kristin L. Wood, Kwan Hui Lim
{"title":"CrisisBERT: A Robust Transformer for Crisis Classification and Contextual Crisis Embedding","authors":"Junhua Liu, Trisha Singhal, L. Blessing, Kristin L. Wood, Kwan Hui Lim","doi":"10.1145/3465336.3475117","DOIUrl":null,"url":null,"abstract":"Detecting crisis events accurately is an important task, as it allows the relevant authorities to implement necessary actions to mitigate damages. For this purpose, social media serve as a timely information source due to its prevalence and high volume of first-hand accounts. While there are prior works on crises detection, many of them do not perform crisis embedding and classification using state-of-the-art attention-based deep neural networks models, such as Transformers and document-level contextual embeddings. In contrast, we propose CrisisBERT, an end-to-end transformer-based model for two crisis classification tasks, namely crisis detection and crisis recognition, which shows promising results across accuracy and F1 scores. The proposed CrisisBERT model demonstrates superior robustness over various benchmarks, and it includes only marginal performance compromise while extending from 6 to 36 events with a mere 51.4% additional data points. We also propose Crisis2Vec, an attention-based, document-level contextual embedding architecture, for crisis embedding, which achieves better performance than conventional crisis embedding methods such as Word2Vec and GloVe.","PeriodicalId":325072,"journal":{"name":"Proceedings of the 32nd ACM Conference on Hypertext and Social Media","volume":"2014 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"42","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 32nd ACM Conference on Hypertext and Social Media","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3465336.3475117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 42

Abstract

Detecting crisis events accurately is an important task, as it allows the relevant authorities to implement necessary actions to mitigate damages. For this purpose, social media serve as a timely information source due to their prevalence and high volume of first-hand accounts. While there are prior works on crisis detection, many of them do not perform crisis embedding and classification using state-of-the-art attention-based deep neural network models, such as Transformers and document-level contextual embeddings. In contrast, we propose CrisisBERT, an end-to-end transformer-based model for two crisis classification tasks, namely crisis detection and crisis recognition, which shows promising results across accuracy and F1 scores. The proposed CrisisBERT model demonstrates superior robustness over various benchmarks, and it incurs only a marginal performance compromise while extending from 6 to 36 events with a mere 51.4% additional data points. We also propose Crisis2Vec, an attention-based, document-level contextual embedding architecture, for crisis embedding, which achieves better performance than conventional crisis embedding methods such as Word2Vec and GloVe.
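To make the described setup concrete, below is a minimal sketch (not the authors' released code) of fine-tuning a BERT encoder for crisis tweet classification in the spirit of CrisisBERT, and mean-pooling the encoder's final hidden states into a document-level contextual embedding akin to Crisis2Vec. The backbone name, label count, sequence length, and pooling choice are illustrative assumptions; training is omitted.

```python
# Sketch of transformer-based crisis classification and document-level embedding.
# Assumptions: "bert-base-uncased" backbone, 36 event classes (crisis recognition),
# mean pooling over non-padding tokens for the document embedding.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-uncased"   # assumed backbone
NUM_EVENTS = 36                    # recognition over 36 crisis events

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
classifier = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_EVENTS
)

def classify(tweets):
    """Predict a crisis event label index for each tweet."""
    batch = tokenizer(tweets, padding=True, truncation=True,
                      max_length=128, return_tensors="pt")
    with torch.no_grad():
        logits = classifier(**batch).logits
    return logits.argmax(dim=-1)

def embed(tweets):
    """Document-level contextual embedding: mean-pool final hidden states."""
    batch = tokenizer(tweets, padding=True, truncation=True,
                      max_length=128, return_tensors="pt")
    with torch.no_grad():
        outputs = classifier.bert(**batch)       # underlying BERT encoder
    hidden = outputs.last_hidden_state           # (batch, seq_len, hidden_dim)
    mask = batch["attention_mask"].unsqueeze(-1) # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

print(classify(["Flood waters rising near the river, evacuations underway."]))
```

In practice the classification head would be fine-tuned on labeled crisis tweets before either function is used; the pooled embedding can then serve as a general-purpose crisis representation for downstream tasks.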