A Multi-Format Transfer Learning Model for Event Argument Extraction via Variational Information Bottleneck

Jie Zhou, Qi Zhang, Qin Chen, Liang He, Xuanjing Huang
{"title":"A Multi-Format Transfer Learning Model for Event Argument Extraction via Variational Information Bottleneck","authors":"Jie Zhou, Qi Zhang, Qin Chen, Liang He, Xuanjing Huang","doi":"10.48550/arXiv.2208.13017","DOIUrl":null,"url":null,"abstract":"Event argument extraction (EAE) aims to extract arguments with given roles from texts, which have been widely studied in natural language processing. Most previous works have achieved good performance in specific EAE datasets with dedicated neural architectures. Whereas, these architectures are usually difficult to adapt to new datasets/scenarios with various annotation schemas or formats. Furthermore, they rely on large-scale labeled data for training, which is unavailable due to the high labelling cost in most cases. In this paper, we propose a multi-format transfer learning model with variational information bottleneck, which makes use of the information especially the common knowledge in existing datasets for EAE in new datasets. Specifically, we introduce a shared-specific prompt framework to learn both format-shared and format-specific knowledge from datasets with different formats. In order to further absorb the common knowledge for EAE and eliminate the irrelevant noise, we integrate variational information bottleneck into our architecture to refine the shared representation. We conduct extensive experiments on three benchmark datasets, and obtain new state-of-the-art performance on EAE.","PeriodicalId":91381,"journal":{"name":"Proceedings of COLING. International Conference on Computational Linguistics","volume":"36 1","pages":"1990-2000"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of COLING. International Conference on Computational Linguistics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2208.13017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

Event argument extraction (EAE) aims to extract arguments with given roles from texts and has been widely studied in natural language processing. Most previous works achieve good performance on specific EAE datasets with dedicated neural architectures. However, these architectures are usually difficult to adapt to new datasets or scenarios with different annotation schemas or formats. Furthermore, they rely on large-scale labeled data for training, which is often unavailable because of the high cost of labeling. In this paper, we propose a multi-format transfer learning model with a variational information bottleneck, which makes use of the information, especially the common knowledge, in existing datasets to perform EAE on new datasets. Specifically, we introduce a shared-specific prompt framework to learn both format-shared and format-specific knowledge from datasets with different formats. To further absorb the common knowledge for EAE and eliminate irrelevant noise, we integrate a variational information bottleneck into our architecture to refine the shared representation. We conduct extensive experiments on three benchmark datasets and obtain new state-of-the-art performance on EAE.
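The abstract mentions refining the shared representation with a variational information bottleneck (VIB). The following is a minimal sketch, not the authors' implementation, of how a VIB layer is typically applied to a shared encoder representation: the class name, dimensions, and the weight `beta` are illustrative assumptions.

```python
# Minimal VIB sketch (assumed names and hyperparameters, not the paper's code).
import torch
import torch.nn as nn


class VIBLayer(nn.Module):
    """Maps a shared encoder representation h to a stochastic bottleneck z.

    The KL term between q(z|h) and a standard normal prior pressures z to keep
    only information that is useful across formats and to drop irrelevant noise.
    """

    def __init__(self, hidden_dim: int, bottleneck_dim: int):
        super().__init__()
        self.mu = nn.Linear(hidden_dim, bottleneck_dim)
        self.log_var = nn.Linear(hidden_dim, bottleneck_dim)

    def forward(self, h: torch.Tensor):
        mu, log_var = self.mu(h), self.log_var(h)
        # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).
        std = torch.exp(0.5 * log_var)
        z = mu + std * torch.randn_like(std)
        # KL( N(mu, sigma^2) || N(0, I) ), averaged over the batch.
        kl = -0.5 * torch.mean(
            torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=-1)
        )
        return z, kl


if __name__ == "__main__":
    vib = VIBLayer(hidden_dim=768, bottleneck_dim=128)
    shared_h = torch.randn(4, 768)      # stand-in for format-shared features
    z, kl = vib(shared_h)
    task_loss = torch.tensor(0.0)       # placeholder for the EAE training loss
    beta = 1e-3                         # KL weight; an assumed hyperparameter
    total_loss = task_loss + beta * kl
    print(z.shape, kl.item())
```

In this kind of setup, the refined bottleneck representation `z` would be fed to the argument extraction head, while the KL term acts as the "information bottleneck" regularizer on the shared features.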