T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification

Hongye Cao, Qianru Wei, Jiangbin Zheng
{"title":"T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification","authors":"Hongye Cao, Qianru Wei, Jiangbin Zheng","doi":"10.34028/iajit/20/1/15","DOIUrl":null,"url":null,"abstract":"Cross-domain sentiment classification transfers the knowledge from the source domain to the target domain lacking supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by extracting domain-invariant features manually. However, these methods have poor adaptability to bridge connections across different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaption to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain and the topic information of the target domain to improve the domain adaptability of the model. Due to the unbalanced distribution of information in the combined data, we apply a two-layer attention adaptive mechanism for classification. A shallow attention layer is applied to weigh the important features of the combined data. Inspired by active learning, we propose a deep domain adaption layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods. T-LBERT shows stable classification performance on multiple metrics.","PeriodicalId":13624,"journal":{"name":"Int. Arab J. Inf. Technol.","volume":"2 1","pages":"141-150"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. Arab J. Inf. Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.34028/iajit/20/1/15","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Cross-domain sentiment classification transfers knowledge from a source domain to a target domain that lacks supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by manually extracting domain-invariant features. However, these methods adapt poorly when bridging connections across different domains, and they ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaptation to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain with the topic information of the target domain to improve the model's domain adaptability. Because information is unevenly distributed in the combined data, we apply a two-layer adaptive attention mechanism for classification. A shallow attention layer weights the important features of the combined data. Inspired by active learning, we propose a deep domain adaptation layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods, and T-LBERT shows stable classification performance on multiple metrics.
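The abstract does not disclose implementation details, so the following is only a minimal sketch of the two-layer idea in PyTorch. All names, shapes, and the combination scheme (projecting a target-domain topic vector and prepending it as an extra token) are illustrative assumptions, not the authors' code; `domain_balance_loss` is an MMD-style stand-in for the deep domain adaptation layer described above.

```python
# Hypothetical sketch only: module names, dimensions, and the way content and
# topic features are combined are assumptions, not the authors' released code.
import torch
import torch.nn as nn


class ShallowAttention(nn.Module):
    """Shallow attention layer: weights the features of the combined data."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(h).squeeze(-1), dim=-1)  # (batch, seq_len)
        return torch.einsum("bs,bsh->bh", weights, h)  # attention-weighted pooling


class TLBERTSketch(nn.Module):
    """Combines source-domain content encodings with a target-domain topic
    vector, then classifies the attention-pooled representation."""

    def __init__(self, encoder: nn.Module, topic_dim: int,
                 hidden_dim: int = 768, num_classes: int = 2):
        super().__init__()
        self.encoder = encoder  # assumed HuggingFace-style BERT/ALBERT encoder
        self.topic_proj = nn.Linear(topic_dim, hidden_dim)
        self.attention = ShallowAttention(hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, input_ids, attention_mask, topic_vec):
        # Content representation of the source-domain text.
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        # Target-domain topic information, projected and prepended as an
        # extra "token" so the attention layer can weight content vs. topic.
        t = self.topic_proj(topic_vec).unsqueeze(1)   # (batch, 1, hidden)
        combined = torch.cat([t, h], dim=1)           # (batch, seq+1, hidden)
        return self.classifier(self.attention(combined))


def domain_balance_loss(src_pooled: torch.Tensor,
                        tgt_pooled: torch.Tensor) -> torch.Tensor:
    """MMD-style stand-in for the deep domain adaptation layer: penalizes the
    mean-feature gap between domains. The paper's actual criterion, which also
    accounts for representativeness, may differ."""
    return (src_pooled.mean(dim=0) - tgt_pooled.mean(dim=0)).pow(2).sum()
```

For a concrete instantiation, the encoder could be a Lite BERT such as `AutoModel.from_pretrained("albert-base-v2")` from HuggingFace Transformers (matching the "Lite BERT" in the model name), and `topic_vec` could be per-document topic proportions from an LDA model fitted on unlabeled target-domain reviews; both choices are assumptions, since the paper body rather than the abstract specifies them.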