Arabic Machine Translation (ArMT) based on LSTM with Attention Mechanism Architecture
Dalal Abdullah Aljohany, Hassanin M. Al-Barhamtoshy, Felwa A. Abukhodair
2022 20th International Conference on Language Engineering (ESOLEC), published 2022-10-12
DOI: 10.1109/ESOLEC54569.2022.10009530
Citations: 1
Abstract
Arabic is a morphologically rich, low-resource language, which makes it one of the most challenging languages for Machine Translation (MT). While much translation research has concentrated on Indo-European languages, far less has been done for Arabic, so the quality of Arabic Machine Translation (ArMT) still requires improvement. Neural Machine Translation (NMT) is now the state of the art among MT approaches. In this paper, we propose a model for two-way translation between Arabic and English. The proposed model is based on NMT and uses a Long Short-Term Memory (LSTM) encoder-decoder with an attention mechanism. In the basic encoder-decoder, performance is tied to the length of the input sentence: as the sentence grows longer, performance diminishes swiftly. Attention mechanisms (AMs) are used to overcome this issue. By combining an LSTM encoder-decoder with an attention mechanism, the proposed model improves translation accuracy. The experimental results show that the proposed model improves translation accuracy and reduces the loss.
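To illustrate the attention step the abstract refers to, here is a minimal NumPy sketch of how a decoder state can attend over encoder hidden states instead of relying on a single fixed-size context vector. This is an illustrative dot-product attention toy, not the authors' implementation; all names, dimensions, and the toy inputs are assumptions for demonstration only.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    # score each source position against the current decoder state
    scores = encoder_states @ decoder_state        # shape: (src_len,)
    weights = softmax(scores)                      # attention distribution
    context = weights @ encoder_states             # weighted sum of encoder states
    return weights, context

# toy example: 4 source tokens, hidden size 3 (values are arbitrary)
H = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [0., 1., 1.]])
s = np.array([2., 0., 0.])                         # hypothetical decoder state

w, c = attention_context(H, s)
```

Because the context vector is recomputed at every decoding step from all encoder states, long sentences no longer have to be squeezed into one fixed vector, which is the bottleneck the abstract describes.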