Coreference Resolution and Meaning Representation in a Legislative Corpus

Surawat Pothong, N. Facundes
{"title":"立法语料库中的共涉决议与意义表示","authors":"Surawat Pothong, N. Facundes","doi":"10.1109/iSAI-NLP54397.2021.9678168","DOIUrl":null,"url":null,"abstract":"This paper addresses the application and integration of coreferences resolution tasks in a legislative corpus by using SpanBERT, which is an improvement of the BERT (Bidirectional Encoder Representations from Transformers) model and semantic extraction by Abstract Meaning Representation (AMR) for reducing text complexity, meaning preservation and further applications. Our main processes are divided into four subparts: legal text pre-processing, coreference resolution, AMR, evaluation for meaning preservation, and complexity reduction. Smatch evaluation tool and Bilingual Evaluation Understudy (BLEU) scores are applied to evaluate overlapped meaning between resolved and unresolved coreference sentences. The AMR graphs after complexity have been reduced can be applied for further processing tasks with Neural Network such as legal inferencing and legal engineering tasks.","PeriodicalId":339826,"journal":{"name":"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","volume":"68 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Coreference Resolution and Meaning Representation in a Legislative Corpus\",\"authors\":\"Surawat Pothong, N. Facundes\",\"doi\":\"10.1109/iSAI-NLP54397.2021.9678168\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper addresses the application and integration of coreferences resolution tasks in a legislative corpus by using SpanBERT, which is an improvement of the BERT (Bidirectional Encoder Representations from Transformers) model and semantic extraction by Abstract Meaning Representation (AMR) for reducing text complexity, meaning preservation and further applications. Our main processes are divided into four subparts: legal text pre-processing, coreference resolution, AMR, evaluation for meaning preservation, and complexity reduction. Smatch evaluation tool and Bilingual Evaluation Understudy (BLEU) scores are applied to evaluate overlapped meaning between resolved and unresolved coreference sentences. 
The AMR graphs after complexity have been reduced can be applied for further processing tasks with Neural Network such as legal inferencing and legal engineering tasks.\",\"PeriodicalId\":339826,\"journal\":{\"name\":\"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"volume\":\"68 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iSAI-NLP54397.2021.9678168\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iSAI-NLP54397.2021.9678168","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This paper addresses the application and integration of coreference resolution in a legislative corpus using SpanBERT, an improvement over the BERT (Bidirectional Encoder Representations from Transformers) model, together with semantic extraction via Abstract Meaning Representation (AMR), with the goals of reducing text complexity, preserving meaning, and enabling further applications. The main pipeline is divided into four subparts: legal text pre-processing, coreference resolution, AMR parsing, and evaluation of meaning preservation and complexity reduction. The Smatch evaluation tool and Bilingual Evaluation Understudy (BLEU) scores are applied to measure the overlap in meaning between coreference-resolved and unresolved sentences. The complexity-reduced AMR graphs can then be used in downstream neural-network tasks such as legal inference and legal engineering.
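The paper does not include an implementation, but the coreference-resolution step it describes can be illustrated with a short sketch. The example below assumes the publicly released AllenNLP SpanBERT coreference model and an invented legislative-style sentence; the authors may have used a different SpanBERT setup.

```python
# Sketch of the coreference-resolution step with a SpanBERT-based model.
# The AllenNLP public coref-spanbert checkpoint is an assumption; the paper
# does not state which SpanBERT implementation the authors used.
from allennlp.predictors.predictor import Predictor

# Pretrained SpanBERT coreference model released by AllenNLP.
predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2021.03.10.tar.gz"
)

# Illustrative legislative-style sentence (not from the paper's corpus).
text = (
    "The committee reviewed the draft act. "
    "It decided that its provisions should apply from the next fiscal year."
)

# predict() returns the mention clusters found in the document;
# coref_resolved() rewrites the text with pronouns replaced by their antecedents.
clusters = predictor.predict(document=text)["clusters"]
resolved_text = predictor.coref_resolved(text)

print(clusters)       # e.g. clusters linking the pronouns to "The committee" / "the draft act"
print(resolved_text)  # resolved text, ready for AMR parsing
```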
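The AMR and Smatch steps can be sketched in the same spirit. The snippet below assumes the amrlib library for sentence-to-graph parsing and the reference smatch package for scoring; the paper names Smatch but not a specific AMR parser, so both tool choices and the example sentences are assumptions.

```python
# Sketch of the AMR and Smatch steps: parse the unresolved and the
# coreference-resolved sentence to AMR, then score graph overlap with Smatch.
# amrlib (parser) and the smatch package (scorer) are assumed toolkits.
import amrlib
import smatch

def strip_metadata(graph: str) -> str:
    # Drop amrlib's "# ::snt ..." metadata lines; smatch expects the bare graph.
    return "\n".join(line for line in graph.splitlines() if not line.startswith("#"))

# Load a pretrained sentence-to-graph model (amrlib expects a model to have
# been downloaded into its data directory beforehand).
stog = amrlib.load_stog_model()

unresolved = "It decided that its provisions apply next year."
resolved = "The committee decided that the draft act's provisions apply next year."

amr_unresolved = strip_metadata(stog.parse_sents([unresolved])[0])
amr_resolved = strip_metadata(stog.parse_sents([resolved])[0])

# Smatch counts matching triples between the two graphs and reports precision,
# recall, and F1 over the best variable alignment.
match_num, test_num, gold_num = smatch.get_amr_match(amr_resolved, amr_unresolved)
precision, recall, f_score = smatch.compute_f(match_num, test_num, gold_num)
print(f"Smatch F1 between resolved and unresolved AMRs: {f_score:.3f}")
```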
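Finally, the surface-level BLEU comparison between unresolved and coreference-resolved sentences can be approximated with NLTK's implementation; this is an assumption, since the paper does not state which BLEU toolkit was used.

```python
# Sketch of the BLEU comparison between the unresolved and the
# coreference-resolved sentence; NLTK's BLEU implementation is an assumption.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

unresolved = "It decided that its provisions apply next year.".split()
resolved = "The committee decided that the draft act's provisions apply next year.".split()

# Smoothing avoids zero scores when short sentences lack higher-order n-gram matches.
smooth = SmoothingFunction().method1
score = sentence_bleu([unresolved], resolved, smoothing_function=smooth)
print(f"BLEU overlap between unresolved and resolved sentences: {score:.3f}")
```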