Bert-Pair-Networks for Sentiment Classification

Ziwen Wang, Haiming Wu, Han Liu, Qianhua Cai
{"title":"情感分类的bert - pair网络","authors":"Ziwen Wang, Haiming Wu, Han Liu, Qianhua Cai","doi":"10.1109/ICMLC51923.2020.9469534","DOIUrl":null,"url":null,"abstract":"BERT has demonstrated excellent performance in natural language processing due to the training on large amounts of text corpus in an unsupervised way. However, this model is trained to predict the next sentence, and thus it is good at dealing with sentence pair tasks but may not be sufficiently good for other tasks. In our paper, we introduce a novel representation framework BERT-pair-Networks (p-BERTs) for sentiment classification, where p-BERTs involve adopting BERT to encode sentences for sentiment classification as a classic task of single sentence classification, using the auxiliary sentence, and a feature extraction layer on the top. Results on three datasets show that our method achieves considerably improved performance.","PeriodicalId":170815,"journal":{"name":"2020 International Conference on Machine Learning and Cybernetics (ICMLC)","volume":"452 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Bert-Pair-Networks for Sentiment Classification\",\"authors\":\"Ziwen Wang, Haiming Wu, Han Liu, Qianhua Cai\",\"doi\":\"10.1109/ICMLC51923.2020.9469534\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"BERT has demonstrated excellent performance in natural language processing due to the training on large amounts of text corpus in an unsupervised way. However, this model is trained to predict the next sentence, and thus it is good at dealing with sentence pair tasks but may not be sufficiently good for other tasks. In our paper, we introduce a novel representation framework BERT-pair-Networks (p-BERTs) for sentiment classification, where p-BERTs involve adopting BERT to encode sentences for sentiment classification as a classic task of single sentence classification, using the auxiliary sentence, and a feature extraction layer on the top. Results on three datasets show that our method achieves considerably improved performance.\",\"PeriodicalId\":170815,\"journal\":{\"name\":\"2020 International Conference on Machine Learning and Cybernetics (ICMLC)\",\"volume\":\"452 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 International Conference on Machine Learning and Cybernetics (ICMLC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLC51923.2020.9469534\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Machine Learning and Cybernetics (ICMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC51923.2020.9469534","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

BERT has demonstrated excellent performance in natural language processing thanks to its unsupervised pre-training on large text corpora. However, the model is trained to predict the next sentence, so it handles sentence-pair tasks well but may not be sufficiently effective for other tasks. In this paper, we introduce a novel representation framework, BERT-pair-Networks (p-BERTs), for sentiment classification. p-BERTs adopt BERT to encode sentences for sentiment classification, a classic single-sentence classification task, by using an auxiliary sentence and adding a feature extraction layer on top. Results on three datasets show that our method achieves considerably improved performance.
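The abstract describes turning single-sentence sentiment classification into a BERT sentence-pair input by attaching an auxiliary sentence, with a feature extraction layer on top of the encoder. The sketch below illustrates that general idea, assuming the HuggingFace transformers library; the auxiliary-sentence template, the linear head, and all names here are illustrative assumptions, not the authors' exact p-BERT architecture.

```python
# Illustrative sketch only: sentence-pair encoding with an auxiliary sentence
# plus a simple classification layer on the [CLS] vector. The template and
# head are assumptions, not the paper's exact p-BERT design.
import torch
from transformers import BertTokenizer, BertModel


class PairSentimentClassifier(torch.nn.Module):
    def __init__(self, num_labels=3, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        # Feature-extraction / classification layer on top of BERT
        self.head = torch.nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, **inputs):
        outputs = self.bert(**inputs)
        cls_vec = outputs.last_hidden_state[:, 0]  # [CLS] representation
        return self.head(cls_vec)


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Hypothetical auxiliary sentence that converts the single review sentence
# into a pair: [CLS] review sentence [SEP] auxiliary sentence [SEP]
inputs = tokenizer(
    "The food was great but the service was slow.",
    "What is the sentiment of this sentence?",
    return_tensors="pt",
    truncation=True,
)

model = PairSentimentClassifier()
with torch.no_grad():
    logits = model(**inputs)  # shape: (1, num_labels)
```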