Toward End-to-End Neural Cascading Strategies for Grammatical Error Correction

Kingsley Nketia Acheampong, Wenhong Tian
DOI: 10.1109/ISKE47853.2019.9170364
Published in: 2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE), November 2019
Citations: 1

Abstract

Neural sequence-to-sequence (seq2seq) grammatical error correction (GEC) models are usually computationally expensive in both training and inference. They also tend to generalize poorly because error-corrected training data are limited, leaving them unable to correct grammar effectively. In this work, we propose neural cascading strategies for enhancing the effectiveness of neural seq2seq GEC models, inspired by the post-editing processes of neural machine translation. Our experiments show that adopting cascading techniques in low-resource NMT models yields performance comparable to that of high-resource NMT models. We extensively exploit and evaluate multiple cascading learning strategies and establish best practices for improving neural seq2seq GEC models.
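The cascading idea described above (feeding one model's output into the next correction stage, as in post-editing) can be sketched generically. The snippet below is a minimal, hypothetical illustration of cascaded inference with iteration until convergence; the toy rule-based stage functions stand in for the paper's neural models and are not part of the original work.

```python
from typing import Callable, List


def cascade_correct(sentence: str,
                    stages: List[Callable[[str], str]],
                    max_rounds: int = 3) -> str:
    """Apply each correction stage in order, repeating the full cascade
    until the sentence stops changing or max_rounds is reached."""
    for _ in range(max_rounds):
        before = sentence
        for stage in stages:
            sentence = stage(sentence)
        if sentence == before:  # converged: no stage made an edit this round
            break
    return sentence


# Toy stages standing in for seq2seq correction models in the cascade.
def fix_article(s: str) -> str:
    return s.replace("a apple", "an apple")


def fix_agreement(s: str) -> str:
    return s.replace("he go ", "he goes ")


print(cascade_correct("he go to eat a apple.", [fix_article, fix_agreement]))
# -> "he goes to eat an apple."
```

Each round composes all stages, so corrections introduced by a later stage can be refined by an earlier one on the next pass, mirroring the post-editing loop the abstract alludes to.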