DIWS-LCR-Rot-hop++: A Domain-Independent Word Selector for Cross-Domain Aspect-Based Sentiment Classification

IF 0.4 · Q4 · COMPUTER SCIENCE, INFORMATION SYSTEMS · Applied Computing Review · Pub Date: 2023-09-01 · DOI: 10.1145/3626307.3626309
Junhee Lee, Flavius Frasincar, Maria Mihaela Truşcă
{"title":"面向跨领域基于方面的情感分类的领域独立词选择器","authors":"Junhee Lee, Flavius Frasincar, Maria Mihaela Truşcă","doi":"10.1145/3626307.3626309","DOIUrl":null,"url":null,"abstract":"The Aspect-Based Sentiment Classification (ABSC) models often suffer from a lack of training data in some domains. To exploit the abundant data from another domain, this work extends the original state-of-the-art LCR-Rot-hop++ model that uses a neural network with a rotatory attention mechanism for a cross-domain setting. More specifically, we propose a Domain-Independent Word Selector (DIWS) model that is used in combination with the LCR-Rot-hop++ model (DIWS-LCR-Rot-hop++). DIWS-LCR-Rot-hop++ uses attention weights from the domain classification task to determine whether a word is domain-specific or domain-independent, and discards domain-specific words when training and testing the LCR-Rot-hop++ model for cross-domain ABSC. Overall, our results confirm that DIWS-LCR-Rot-hop++ outperforms the original LCR-Rot-hop++ model under a cross-domain setting in case we impose an optimal domain-dependent attention threshold value for deciding whether a word is domain-specific or domain-independent. For a target domain that is highly similar to the source domain, we find that imposing moderate restrictions on classifying domain-independent words yields the best performance. Differently, a dissimilar target domain requires a strict restriction that classifies a small proportion of words as domain-independent. Also, we observe information loss which deteriorates the performance of DIWS-LCR-Rot-hop++ when we categorize an excessive amount of words as domain-specific and discard them.","PeriodicalId":42971,"journal":{"name":"Applied Computing Review","volume":"20 1","pages":"0"},"PeriodicalIF":0.4000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DIWS-LCR-Rot-hop++: A Domain-Independent Word Selector for Cross-Domain Aspect-Based Sentiment Classification\",\"authors\":\"Junhee Lee, Flavius Frasincar, Maria Mihaela Truşcă\",\"doi\":\"10.1145/3626307.3626309\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Aspect-Based Sentiment Classification (ABSC) models often suffer from a lack of training data in some domains. To exploit the abundant data from another domain, this work extends the original state-of-the-art LCR-Rot-hop++ model that uses a neural network with a rotatory attention mechanism for a cross-domain setting. More specifically, we propose a Domain-Independent Word Selector (DIWS) model that is used in combination with the LCR-Rot-hop++ model (DIWS-LCR-Rot-hop++). DIWS-LCR-Rot-hop++ uses attention weights from the domain classification task to determine whether a word is domain-specific or domain-independent, and discards domain-specific words when training and testing the LCR-Rot-hop++ model for cross-domain ABSC. Overall, our results confirm that DIWS-LCR-Rot-hop++ outperforms the original LCR-Rot-hop++ model under a cross-domain setting in case we impose an optimal domain-dependent attention threshold value for deciding whether a word is domain-specific or domain-independent. For a target domain that is highly similar to the source domain, we find that imposing moderate restrictions on classifying domain-independent words yields the best performance. Differently, a dissimilar target domain requires a strict restriction that classifies a small proportion of words as domain-independent. 
Also, we observe information loss which deteriorates the performance of DIWS-LCR-Rot-hop++ when we categorize an excessive amount of words as domain-specific and discard them.\",\"PeriodicalId\":42971,\"journal\":{\"name\":\"Applied Computing Review\",\"volume\":\"20 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.4000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Computing Review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3626307.3626309\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Computing Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3626307.3626309","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Aspect-Based Sentiment Classification (ABSC) models often suffer from a lack of training data in some domains. To exploit the abundant data from another domain, this work extends the state-of-the-art LCR-Rot-hop++ model, a neural network with a rotatory attention mechanism, to a cross-domain setting. More specifically, we propose a Domain-Independent Word Selector (DIWS) model that is used in combination with the LCR-Rot-hop++ model (DIWS-LCR-Rot-hop++). DIWS-LCR-Rot-hop++ uses attention weights from the domain classification task to determine whether a word is domain-specific or domain-independent, and discards domain-specific words when training and testing the LCR-Rot-hop++ model for cross-domain ABSC. Overall, our results confirm that DIWS-LCR-Rot-hop++ outperforms the original LCR-Rot-hop++ model in a cross-domain setting when we impose an optimal domain-dependent attention threshold for deciding whether a word is domain-specific or domain-independent. For a target domain that is highly similar to the source domain, we find that imposing moderate restrictions on classifying words as domain-independent yields the best performance. In contrast, a dissimilar target domain requires a strict restriction that classifies only a small proportion of words as domain-independent. Also, we observe information loss that deteriorates the performance of DIWS-LCR-Rot-hop++ when we categorize an excessive number of words as domain-specific and discard them.
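The abstract describes DIWS as thresholding per-word attention weights from a domain classifier: words whose domain attention exceeds a threshold are treated as domain-specific and discarded before LCR-Rot-hop++ is trained and tested. Below is a minimal Python sketch of that filtering step; the function name `select_domain_independent_words`, the threshold `tau`, and the toy attention weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def select_domain_independent_words(tokens, domain_attention, tau):
    """Keep tokens whose domain-classifier attention weight is below tau.

    tokens           : list[str], the input sentence
    domain_attention : per-token attention weights produced by a separately
                       trained domain classifier (hypothetical input here)
    tau              : domain-dependent threshold; a lower tau discards more
                       words as domain-specific
    """
    weights = np.asarray(domain_attention)
    keep = weights < tau  # True = domain-independent, retained
    return [tok for tok, k in zip(tokens, keep) if k]

# Toy example: "sushi" draws high domain attention (restaurant domain),
# so it is discarded; the remaining words are what LCR-Rot-hop++ would see.
tokens = ["the", "sushi", "was", "absolutely", "wonderful"]
weights = [0.05, 0.60, 0.05, 0.10, 0.20]
print(select_domain_independent_words(tokens, weights, tau=0.5))
# ['the', 'was', 'absolutely', 'wonderful']
```

The threshold plays the role of the domain-dependent knob discussed in the abstract: a strict (low) `tau` keeps only clearly domain-independent words, which suits dissimilar source-target pairs, while a moderate `tau` suits highly similar ones.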
Source journal: Applied Computing Review (COMPUTER SCIENCE, INFORMATION SYSTEMS)
Self-citation rate: 40.00%
Articles published: 8