AssocKD: An Association-Aware Knowledge Distillation Method for Document-Level Event Argument Extraction

Published: 2024-09-18 · DOI: 10.3390/math12182901
Lijun Tan, Yanli Hu, Jianwei Cao, Zhen Tan

Abstract

Event argument extraction is a crucial subtask of event extraction, which aims at extracting arguments that correspond to argument roles when given event types. The majority of current document-level event argument extraction works focus on extracting information for only one event at a time without considering the association among events; this is known as document-level single-event extraction. However, the interrelationship among arguments can yield mutual gains in their extraction. Therefore, in this paper, we propose AssocKD, an Association-aware Knowledge Distillation Method for Document-level Event Argument Extraction, which enables the enhancement of document-level multi-event extraction with event association knowledge. Firstly, we introduce an association-aware training task to extract unknown arguments with the given privileged knowledge of relevant arguments, obtaining an association-aware model that can construct both intra-event and inter-event relationships. Secondly, we adopt multi-teacher knowledge distillation to transfer such event association knowledge from the association-aware teacher models to the event argument extraction student model. Our proposed method, AssocKD, is capable of explicitly modeling and efficiently leveraging event association to enhance the extraction of multi-event arguments at the document level. We conduct experiments on RAMS and WIKIEVENTS datasets and observe a significant improvement, thus demonstrating the effectiveness of our method.
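The abstract does not give AssocKD's exact training objective, but the multi-teacher step it describes follows a standard pattern: soften each teacher's predictions with a temperature, average them into an ensemble target, and train the student to match that target. A minimal, self-contained sketch of that generic soft-label loss (all function names here are illustrative, not from the paper):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    temperature=2.0):
    """Distillation loss against the average of the teachers' softened
    output distributions. A higher temperature exposes more of the
    teachers' 'dark knowledge' about non-argmax classes."""
    teacher_probs = [softmax(t, temperature) for t in teacher_logits_list]
    n = len(teacher_probs)
    ensemble = [sum(p[i] for p in teacher_probs) / n
                for i in range(len(student_logits))]
    student_probs = softmax(student_logits, temperature)
    # Conventional KD scales the soft loss by T^2 so its gradient
    # magnitude stays comparable to the hard-label loss.
    return (temperature ** 2) * kl_divergence(ensemble, student_probs)
```

When the student's logits already agree with the teachers', the loss is zero; any disagreement yields a positive penalty, which is what pushes event-association knowledge from the association-aware teachers into the student.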