Research on automatic error-checking in English short text translation by a neural network algorithm

Author: Liang Guo
Journal Article, published 2024-05-31
DOI: 10.1007/s10015-024-00952-9
URL: https://link.springer.com/article/10.1007/s10015-024-00952-9
With the growing number of English learners, improving the efficiency of English learning has become a research focus. This article addresses automatic error-checking in English short text translation. The Transformer model was enhanced by combining it with the bidirectional gated recurrent unit (BiGRU) algorithm, yielding a dual-encoder model that better captures information within input sequences. Experiments were then conducted on different corpora. The improved Transformer model obtained an F0.5 score of 59.09 on CoNLL-2014 and a GLEU (Google bilingual evaluation understudy) score of 61.05 on JFLEG, outperforming the other methods compared. A case analysis showed that the improved Transformer model accurately identified errors in short text translation. The findings indicate that the proposed approach is reliable for automatic error-checking of English short text translation and can be applied in practice.
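The F0.5 metric reported above is the standard F-beta score with beta = 0.5, which weights precision twice as heavily as recall; this suits grammatical error correction, where a spurious "correction" is costlier than a missed error. A minimal sketch of that formula follows (the precision/recall values in the example are illustrative, not taken from the paper):

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """Standard F-beta score: beta < 1 emphasizes precision, beta > 1 recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

# With precision 0.6 and recall 0.3, F0.5 leans toward the higher precision:
print(round(f_beta(0.6, 0.3), 4))  # → 0.5
```

Note that at beta = 1 the same function reduces to the familiar harmonic-mean F1 score.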