Title: Aided Translation Model Based on Logarithmic Position Representation Method and Self-Attention Mechanism
Author: Chongjun Zhao
Venue: 2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT)
Published: 2022-12-09
DOI: 10.1109/ACAIT56212.2022.10137932
Citations: 0
Abstract
To address the limited translation accuracy of traditional computer-aided translation software, this paper proposes an aided translation model based on logarithmic position representation and self-attention. The model uses the self-attention mechanism (SA) to capture the semantic relevance of contextual words, while logarithmic position representation (LPR) retains the distance and direction information between words, improving the model's translation accuracy. Experimental results show that the proposed model achieves a BLEU score of 31.59, which is 8.04 and 3.65 points higher than the GNMT RL model and the existing SOTA model, respectively. On the English-French machine translation task, the proposed model achieves a BLEU score of 42.98, again higher than both baselines. The deep-learning machine translation model constructed in this paper therefore offers higher accuracy and can improve the efficiency of machine translation.
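The abstract's core idea, retaining both distance and direction between words via a logarithmic position term inside self-attention, can be illustrated with a minimal sketch. This is not the paper's implementation; the signed-log bias `sign(j - i) * log(1 + |j - i|)`, the `bias_scale` parameter, and the single-head NumPy attention below are assumptions made for illustration only.

```python
import numpy as np

def log_position_bias(n, scale=1.0):
    # Signed logarithmic relative position: sign(j - i) * log(1 + |j - i|).
    # The sign keeps direction information; the log compresses distance.
    idx = np.arange(n)
    d = idx[None, :] - idx[:, None]          # relative offsets j - i
    return scale * np.sign(d) * np.log1p(np.abs(d))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_with_lpr(X, Wq, Wk, Wv, bias_scale=0.1):
    # Standard scaled dot-product self-attention over a sequence X of
    # shape (n_tokens, d_model), with the signed logarithmic position
    # bias added to the attention logits before the softmax.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    logits = (Q @ K.T) / np.sqrt(K.shape[-1])
    logits = logits + log_position_bias(X.shape[0], bias_scale)
    return softmax(logits, axis=-1) @ V
```

Because the bias is antisymmetric (`bias[i, j] == -bias[j, i]`), attention toward earlier and later context words is shifted in opposite directions, which is one plausible way to encode the "direction information" the abstract describes.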