Contrastive pre-training and instruction tuning for cross-lingual aspect-based sentiment analysis

IF 3.4 · CAS Tier 2 (Computer Science) · Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · Applied Intelligence · Pub Date: 2025-01-21 · DOI: 10.1007/s10489-025-06251-5
Wenwen Zhao, Zhisheng Yang, Song Yu, Shiyu Zhu, Li Li
{"title":"Contrastive pre-training and instruction tuning for cross-lingual aspect-based sentiment analysis","authors":"Wenwen Zhao,&nbsp;Zhisheng Yang,&nbsp;Song Yu,&nbsp;Shiyu Zhu,&nbsp;Li Li","doi":"10.1007/s10489-025-06251-5","DOIUrl":null,"url":null,"abstract":"<div><p>In Natural Language Processing (NLP), aspect-based sentiment analysis (ABSA) has always been one of the critical research areas. However, due to the lack of sufficient sentiment corpora in most languages, existing research mainly focuses on English texts, resulting in limited studies on multilingual ABSA tasks. In this paper, we propose a new pre-training strategy using contrastive learning to improve the performance of cross-lingual ABSA tasks, and we construct a semantic contrastive loss to align parallel sentence representations with the same semantics in different languages. Secondly, we introduce instruction prompt template tuning, which enables the language model to fully understand the task content and learn to generate the required targets through manually constructed instruction prompt templates. During the generation process, we create a more generic placeholder template-based structured output target to capture the relationship between aspect term and sentiment polarity, facilitating cross-lingual transfer. In addition, we have introduced a copy mechanism to improve task performance further. We conduct detailed experiments and ablation analyzes on eight languages to demonstrate the importance of each of our proposed components.</p></div>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 5","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-01-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-025-06251-5","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Aspect-based sentiment analysis (ABSA) has long been a key research area in Natural Language Processing (NLP). However, because most languages lack sufficient sentiment corpora, existing research focuses mainly on English texts, and studies on multilingual ABSA remain limited. In this paper, we propose a new pre-training strategy based on contrastive learning to improve cross-lingual ABSA, constructing a semantic contrastive loss that aligns the representations of parallel sentences with the same semantics across languages. We then introduce instruction prompt template tuning, which enables the language model to fully understand the task and learn to generate the required targets through manually constructed instruction prompt templates. During generation, we design a more generic, placeholder-based structured output target that captures the relationship between aspect terms and sentiment polarities, facilitating cross-lingual transfer. In addition, we introduce a copy mechanism to further improve task performance. We conduct detailed experiments and ablation analyses on eight languages to demonstrate the importance of each proposed component.
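The abstract does not give the exact formulation, but the semantic contrastive loss it describes is commonly realized as an InfoNCE-style objective over parallel sentence pairs. The sketch below is a minimal, hypothetical PyTorch illustration of that idea: a sentence and its translation form a positive pair, while the other sentences in the batch serve as negatives. The function name, temperature value, and symmetric formulation are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a semantic contrastive loss for aligning parallel
# sentence representations across languages (InfoNCE-style; not the paper's
# exact objective).
import torch
import torch.nn.functional as F


def semantic_contrastive_loss(src_repr: torch.Tensor,
                              tgt_repr: torch.Tensor,
                              temperature: float = 0.07) -> torch.Tensor:
    """src_repr, tgt_repr: (batch, dim) representations of parallel sentence pairs."""
    src = F.normalize(src_repr, dim=-1)
    tgt = F.normalize(tgt_repr, dim=-1)
    # Cosine-similarity logits between every source and every target sentence.
    logits = src @ tgt.t() / temperature
    # The i-th source sentence should align with the i-th target sentence.
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric loss over both alignment directions.
    return (F.cross_entropy(logits, labels) +
            F.cross_entropy(logits.t(), labels)) / 2
```

The instruction prompt templates and the placeholder-based output format are likewise not reproduced here; a target such as "<aspect> food <sentiment> positive" is only a hypothetical illustration of the kind of structured aspect-sentiment pairing such a template could produce.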

Source journal
Applied Intelligence (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 6.60
Self-citation rate: 20.80%
Articles published: 1361
Review time: 5.9 months
Journal description: With a focus on research in artificial intelligence and neural networks, this journal addresses issues involving solutions of real-life manufacturing, defense, management, government and industrial problems which are too complex to be solved through conventional approaches and require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance. The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.
Latest articles from this journal
- Insulator defect detection from aerial images in adverse weather conditions
- A review of the emotion recognition model of robots
- Knowledge guided relation enhancement for human-object interaction detection
- A modified dueling DQN algorithm for robot path planning incorporating priority experience replay and artificial potential fields
- A non-parameter oversampling approach for imbalanced data classification based on hybrid natural neighbors