Enhanced local knowledge with proximity values and syntax-clusters for aspect-level sentiment analysis

IF 3.1 | CAS Tier 3 (Computer Science) | JCR Q2 (Computer Science, Artificial Intelligence) | Computer Speech and Language | Pub Date: 2023-12-28 | DOI: 10.1016/j.csl.2023.101616
Pengfei Chen, Biqing Zeng, Yuwu Lu, Yun Xue, Fei Fan, Mayi Xu, Lingcong Feng
{"title":"利用近似值和语法簇增强本地知识,进行方面级情感分析","authors":"Pengfei Chen ,&nbsp;Biqing Zeng ,&nbsp;Yuwu Lu ,&nbsp;Yun Xue ,&nbsp;Fei Fan ,&nbsp;Mayi Xu ,&nbsp;Lingcong Feng","doi":"10.1016/j.csl.2023.101616","DOIUrl":null,"url":null,"abstract":"<div><p>Aspect-level sentiment analysis (ALSA) aims to extract the polarity of different aspect terms in a sentence. Previous works leveraging traditional dependency syntax parsing<span> trees (DSPT) to encode contextual syntactic<span> information had obtained state-of-the-art results. However, these works may not be able to learn fine-grained syntactic knowledge efficiently, which makes them difficult to take advantage of local context. Furthermore, these works failed to exploit the dependency relation from DSPT sufficiently. To solve these problems, we propose a novel method to enhance local knowledge by using extensions of Local Context Network based on Proximity Values (LCPV) and Syntax-clusters Attention (SCA), named LCSA. LCPV first gets the induced trees from pre-trained models and generates the syntactic proximity values between context word and aspect to adaptively determine the extent of local context. Our improved SCA further extracts fine-grained knowledge, which not only focuses on the essential clusters for the target aspect term but also guides the model to learn essential words inside each cluster in DSPT. Extensive experimental results on multiple benchmark datasets demonstrate that LCSA is highly robust and achieves state-of-the-art performance for ALSA.</span></span></p></div>","PeriodicalId":50638,"journal":{"name":"Computer Speech and Language","volume":null,"pages":null},"PeriodicalIF":3.1000,"publicationDate":"2023-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Enhanced local knowledge with proximity values and syntax-clusters for aspect-level sentiment analysis\",\"authors\":\"Pengfei Chen ,&nbsp;Biqing Zeng ,&nbsp;Yuwu Lu ,&nbsp;Yun Xue ,&nbsp;Fei Fan ,&nbsp;Mayi Xu ,&nbsp;Lingcong Feng\",\"doi\":\"10.1016/j.csl.2023.101616\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Aspect-level sentiment analysis (ALSA) aims to extract the polarity of different aspect terms in a sentence. Previous works leveraging traditional dependency syntax parsing<span> trees (DSPT) to encode contextual syntactic<span> information had obtained state-of-the-art results. However, these works may not be able to learn fine-grained syntactic knowledge efficiently, which makes them difficult to take advantage of local context. Furthermore, these works failed to exploit the dependency relation from DSPT sufficiently. To solve these problems, we propose a novel method to enhance local knowledge by using extensions of Local Context Network based on Proximity Values (LCPV) and Syntax-clusters Attention (SCA), named LCSA. LCPV first gets the induced trees from pre-trained models and generates the syntactic proximity values between context word and aspect to adaptively determine the extent of local context. Our improved SCA further extracts fine-grained knowledge, which not only focuses on the essential clusters for the target aspect term but also guides the model to learn essential words inside each cluster in DSPT. 
Extensive experimental results on multiple benchmark datasets demonstrate that LCSA is highly robust and achieves state-of-the-art performance for ALSA.</span></span></p></div>\",\"PeriodicalId\":50638,\"journal\":{\"name\":\"Computer Speech and Language\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2023-12-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Speech and Language\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0885230823001353\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Speech and Language","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0885230823001353","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Aspect-level sentiment analysis (ALSA) aims to extract the polarity of different aspect terms in a sentence. Previous works that leverage traditional dependency syntax parsing trees (DSPT) to encode contextual syntactic information have obtained state-of-the-art results. However, these works may not learn fine-grained syntactic knowledge efficiently, which makes it difficult for them to take advantage of the local context. Furthermore, they fail to sufficiently exploit the dependency relations in the DSPT. To solve these problems, we propose LCSA, a novel method that enhances local knowledge through two extensions: a Local Context network based on Proximity Values (LCPV) and Syntax-clusters Attention (SCA). LCPV first obtains induced trees from pre-trained models and generates syntactic proximity values between each context word and the aspect to adaptively determine the extent of the local context. Our improved SCA then extracts fine-grained knowledge: it not only focuses on the clusters essential to the target aspect term but also guides the model to learn the essential words inside each cluster of the DSPT. Extensive experimental results on multiple benchmark datasets demonstrate that LCSA is highly robust and achieves state-of-the-art performance for ALSA.
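The abstract describes LCPV only at a high level, so the sketch below illustrates one plausible reading of the general idea of proximity-based local-context weighting: hop distances from the aspect term over a dependency tree are turned into weights that emphasize syntactically close context words. This is an assumption-laden illustration, not the paper's implementation; the function names (tree_distances, syntactic_proximity_weights), the linear decay, and the radius cutoff are all hypothetical.

```python
# Minimal, illustrative sketch (not the paper's code): derive proximity-style
# weights for a local context window from hop distances in a dependency tree.
# All names and the specific decay/cutoff below are assumptions for illustration.
from collections import deque
from typing import List, Tuple


def tree_distances(num_tokens: int,
                   edges: List[Tuple[int, int]],
                   aspect_idx: int) -> List[int]:
    """Breadth-first hop distance from the aspect token to every token,
    treating dependency edges as undirected."""
    adj = [[] for _ in range(num_tokens)]
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)
    dist = [num_tokens] * num_tokens  # "far away" default for unreachable tokens
    dist[aspect_idx] = 0
    queue = deque([aspect_idx])
    while queue:
        node = queue.popleft()
        for nxt in adj[node]:
            if dist[nxt] > dist[node] + 1:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist


def syntactic_proximity_weights(dist: List[int], radius: int = 3) -> List[float]:
    """Map hop distances to weights in [0, 1] with a linear decay; tokens more
    than `radius` hops away are treated as outside the local context."""
    return [max(0.0, 1.0 - d / (radius + 1)) for d in dist]


if __name__ == "__main__":
    # Toy sentence: "The battery life is great but the screen is dim"
    # with (head, dependent) edges from a hand-written dependency parse.
    tokens = ["The", "battery", "life", "is", "great",
              "but", "the", "screen", "is", "dim"]
    edges = [(2, 0), (2, 1), (4, 2), (4, 3), (4, 5),
             (4, 9), (9, 7), (9, 8), (7, 6)]
    aspect_idx = 1  # aspect term "battery"
    weights = syntactic_proximity_weights(tree_distances(len(tokens), edges, aspect_idx))
    for token, weight in zip(tokens, weights):
        print(f"{token:>8s}  {weight:.2f}")
```

In this toy example the weights peak at the aspect word "battery" and decay to zero for tokens more than a few dependency hops away, which mirrors the adaptive local-context behavior the abstract attributes to LCPV.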

Source journal
Computer Speech and Language
Category: Engineering & Technology - Computer Science: Artificial Intelligence
CiteScore: 11.30
Self-citation rate: 4.70%
Publication volume: 80
Review duration: 22.9 weeks
Journal description: Computer Speech & Language publishes reports of original research related to the recognition, understanding, production, coding and mining of speech and language. The speech and language sciences have a long history, but it is only relatively recently that large-scale implementation of and experimentation with complex models of speech and language processing has become feasible. Such research is often carried out somewhat separately by practitioners of artificial intelligence, computer science, electronic engineering, information retrieval, linguistics, phonetics, or psychology.
Latest articles in this journal
Editorial Board
Enhancing analysis of diadochokinetic speech using deep neural networks
Copiously Quote Classics: Improving Chinese Poetry Generation with historical allusion knowledge
Significance of chirp MFCC as a feature in speech and audio applications
Artificial disfluency detection, uh no, disfluency generation for the masses