Document-level relation extraction via dual attention fusion and dynamic asymmetric loss

IF 5.0 · JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE) · CAS Zone 2, Computer Science · Complex & Intelligent Systems · Pub Date: 2024-11-11 · DOI: 10.1007/s40747-024-01632-8
Xiaoyao Ding, Dongyan Ding, Gang Zhou, Jicang Lu, Taojie Zhu
Citations: 0

Abstract

Document-level relation extraction (RE) requires integrating and reasoning over information to identify multiple possible relations among entities. However, previous research typically performed reasoning on heterogeneous graphs and set a global threshold for multi-relation classification, disregarding both the interaction of reasoning information among multiple relations and the positive–negative sample imbalance in the datasets. This paper proposes a novel framework for document-level RE built on two techniques: dual attention fusion and dynamic asymmetric loss. Concretely, to learn richer interdependency features, we construct entity-pair and contextual matrices using multi-head axial attention and a co-attention mechanism to deeply model the interaction among entity pairs. To alleviate the influence of hard thresholds under positive–negative sample imbalance, we dynamically adjust weights to optimize the probabilities of different labels. We evaluate our model on two benchmark document-level RE datasets, DocRED and CDR. Experimental results show that our DASL (Dual Attention fusion and dynamic aSymmetric Loss) achieves superior performance on both public datasets. We further provide extensive experiments to analyze how dual attention fusion and dynamic asymmetric loss guide the model toward better extraction of multi-label relations among entities.
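The abstract's axial attention over entity-pair matrices can be illustrated with a minimal sketch. The paper's exact formulation is not given here, so the following assumes a standard single-head axial pattern: each entity pair (i, j) attends first over pairs sharing its head entity (row axis), then over pairs sharing its tail entity (column axis). The function name `axial_attention` and the single-head, unprojected form are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention(P):
    """Single-head axial self-attention over an entity-pair matrix (sketch).

    P: (N, N, d) array where P[i, j] represents entity pair (i, j).
    Attending along rows then columns lets every pair interact with all
    pairs sharing its head or tail entity at O(N^3 * d) cost, instead of
    full attention over all N^2 pairs at O(N^4 * d).
    """
    N, _, d = P.shape
    scale = np.sqrt(d)
    # Row axis: pair (i, j) attends over pairs (i, *) with the same head entity.
    row_scores = np.einsum('ijd,ikd->ijk', P, P) / scale        # (N, N, N)
    row_out = np.einsum('ijk,ikd->ijd', softmax(row_scores), P)
    # Column axis: pair (i, j) attends over pairs (*, j) with the same tail entity.
    col_scores = np.einsum('ijd,kjd->ijk', row_out, row_out) / scale
    col_out = np.einsum('ijk,kjd->ijd', softmax(col_scores), row_out)
    return col_out
```

A full dual-attention-fusion module would add learned query/key/value projections, multiple heads, and co-attention with a contextual matrix; this sketch only shows the axial factorization itself.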

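The "dynamic asymmetric loss" addresses the fact that in multi-label RE almost all entity-pair/label combinations are negatives. The paper's dynamic weighting schedule is not specified in the abstract, so the sketch below shows only the static asymmetric base loss in the style of Ridnik et al.: negatives get a stronger focusing exponent (`gamma_neg > gamma_pos`) and a probability shift (`clip`) that zeroes out easy negatives. The function name and default hyperparameters are illustrative assumptions.

```python
import numpy as np

def asymmetric_loss(logits, targets, gamma_pos=1.0, gamma_neg=4.0, clip=0.05):
    """Asymmetric multi-label loss (static sketch; the paper adjusts weights dynamically).

    logits, targets: (batch, num_labels) arrays; targets are 0/1 per label.
    Negatives are focused more aggressively than positives, and easy
    negatives whose shifted probability falls below `clip` contribute ~0,
    countering the heavy positive-negative imbalance in document-level RE.
    """
    p = 1.0 / (1.0 + np.exp(-logits))            # per-label sigmoid probability
    p_neg = np.clip(p - clip, 0.0, 1.0)          # probability shift for negatives
    eps = 1e-8
    loss_pos = targets * (1 - p) ** gamma_pos * np.log(p + eps)
    loss_neg = (1 - targets) * p_neg ** gamma_neg * np.log(1 - p_neg + eps)
    return -(loss_pos + loss_neg).mean()
```

Confident correct predictions should yield a much smaller loss than confident wrong ones, and a "dynamic" variant would presumably adapt `gamma_neg` or per-label weights over training; that schedule is left out here as it is not described in the abstract.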

Source journal: Complex & Intelligent Systems (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
CiteScore: 9.60
Self-citation rate: 10.30%
Annual articles: 297
About the journal: Complex & Intelligent Systems aims to provide a forum for presenting and discussing novel approaches, tools and techniques meant for attaining a cross-fertilization between the broad fields of complex systems, computational simulation, and intelligent analytics and visualization. The transdisciplinary research that the journal focuses on will expand the boundaries of our understanding by investigating the principles and processes that underlie many of the most profound problems facing society today.