{"title":"通过在 NMT 中整合 MLM 知识,加强评论文本的情感和情绪翻译","authors":"Divya Kumari, Asif Ekbal","doi":"10.1007/s10844-024-00843-2","DOIUrl":null,"url":null,"abstract":"<p>Producing a high-quality review translation is a multifaceted process. It goes beyond successful semantic transfer and requires conveying the original message’s tone and style in a way that resonates with the target audience, whether they are human readers or Natural Language Processing (NLP) applications. Capturing these subtle nuances of the review text demands a deeper understanding and better encoding of the source message. In order to achieve this goal, we explore the use of self-supervised masked language modeling (MLM) and a variant called polarity masked language modeling (p-MLM) as auxiliary tasks in a multi-learning setup. MLM is widely recognized for its ability to capture rich linguistic representations of the input and has been shown to achieve state-of-the-art accuracy in various language understanding tasks. Motivated by its effectiveness, in this paper we adopt joint learning, combining the neural machine translation (NMT) task with source polarity-masked language modeling within a shared embedding space to induce a deeper understanding of the emotional nuances of the text. We analyze the results and observe that our multi-task model indeed exhibits a better understanding of linguistic concepts like sentiment and emotion. Intriguingly, this is achieved even without explicit training on sentiment-annotated or domain-specific sentiment corpora. Our multi-task NMT model consistently improves the translation quality of affect sentences from diverse domains in three language pairs.</p>","PeriodicalId":56119,"journal":{"name":"Journal of Intelligent Information Systems","volume":"135 1","pages":""},"PeriodicalIF":2.3000,"publicationDate":"2024-02-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Enhancing sentiment and emotion translation of review text through MLM knowledge integration in NMT\",\"authors\":\"Divya Kumari, Asif Ekbal\",\"doi\":\"10.1007/s10844-024-00843-2\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Producing a high-quality review translation is a multifaceted process. It goes beyond successful semantic transfer and requires conveying the original message’s tone and style in a way that resonates with the target audience, whether they are human readers or Natural Language Processing (NLP) applications. Capturing these subtle nuances of the review text demands a deeper understanding and better encoding of the source message. In order to achieve this goal, we explore the use of self-supervised masked language modeling (MLM) and a variant called polarity masked language modeling (p-MLM) as auxiliary tasks in a multi-learning setup. MLM is widely recognized for its ability to capture rich linguistic representations of the input and has been shown to achieve state-of-the-art accuracy in various language understanding tasks. Motivated by its effectiveness, in this paper we adopt joint learning, combining the neural machine translation (NMT) task with source polarity-masked language modeling within a shared embedding space to induce a deeper understanding of the emotional nuances of the text. We analyze the results and observe that our multi-task model indeed exhibits a better understanding of linguistic concepts like sentiment and emotion. 
Intriguingly, this is achieved even without explicit training on sentiment-annotated or domain-specific sentiment corpora. Our multi-task NMT model consistently improves the translation quality of affect sentences from diverse domains in three language pairs.</p>\",\"PeriodicalId\":56119,\"journal\":{\"name\":\"Journal of Intelligent Information Systems\",\"volume\":\"135 1\",\"pages\":\"\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2024-02-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Intelligent Information Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s10844-024-00843-2\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intelligent Information Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10844-024-00843-2","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Enhancing sentiment and emotion translation of review text through MLM knowledge integration in NMT
Producing a high-quality review translation is a multifaceted process. It goes beyond successful semantic transfer and requires conveying the original message’s tone and style in a way that resonates with the target audience, whether they are human readers or Natural Language Processing (NLP) applications. Capturing these subtle nuances of the review text demands a deeper understanding and better encoding of the source message. To achieve this goal, we explore the use of self-supervised masked language modeling (MLM) and a variant called polarity masked language modeling (p-MLM) as auxiliary tasks in a multi-task learning setup. MLM is widely recognized for its ability to capture rich linguistic representations of the input and has been shown to achieve state-of-the-art accuracy in various language understanding tasks. Motivated by its effectiveness, in this paper we adopt joint learning, combining the neural machine translation (NMT) task with source polarity-masked language modeling within a shared embedding space to induce a deeper understanding of the emotional nuances of the text. We analyze the results and observe that our multi-task model indeed exhibits a better understanding of linguistic concepts such as sentiment and emotion. Intriguingly, this is achieved even without explicit training on sentiment-annotated or domain-specific sentiment corpora. Our multi-task NMT model consistently improves the translation quality of affect sentences from diverse domains in three language pairs.
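The abstract describes a multi-task setup in which the NMT objective is trained jointly with a source-side polarity-masked language modeling (p-MLM) objective over a shared embedding space. The sketch below illustrates one way such a setup could be wired in PyTorch; the class and function names (JointNMTpMLM, polarity_mask, joint_loss), the lexicon-based masking rule, the model sizes, and the loss weight lmbda are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch of joint NMT + source-side p-MLM training with a shared
# source embedding and encoder. Assumptions: standard Transformer
# encoder-decoder, lexicon-based polarity masking, fixed loss weight `lmbda`.
# Positional encodings are omitted for brevity. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointNMTpMLM(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, d_model=512, nhead=8, layers=6):
        super().__init__()
        # The source embedding and encoder are shared by both tasks.
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), layers)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), layers)
        self.gen_head = nn.Linear(d_model, tgt_vocab)  # translation logits
        self.mlm_head = nn.Linear(d_model, src_vocab)  # p-MLM logits

    def forward(self, src_masked, tgt_in):
        memory = self.encoder(self.src_emb(src_masked))  # shared representation
        mlm_logits = self.mlm_head(memory)               # recover masked source tokens
        causal = nn.Transformer.generate_square_subsequent_mask(tgt_in.size(1))
        dec = self.decoder(self.tgt_emb(tgt_in), memory, tgt_mask=causal)
        return self.gen_head(dec), mlm_logits


def polarity_mask(src_ids, polarity_ids, mask_id):
    """Mask source positions whose token ids appear in a sentiment lexicon
    (`polarity_ids`); all other labels are -100 so the loss ignores them."""
    labels = torch.full_like(src_ids, -100)
    masked = src_ids.clone()
    hit = torch.isin(src_ids, polarity_ids)
    labels[hit] = src_ids[hit]
    masked[hit] = mask_id
    return masked, labels


def joint_loss(nmt_logits, tgt_out, mlm_logits, mlm_labels, lmbda=0.5, pad_id=0):
    # Translation loss over all target positions plus a weighted p-MLM loss
    # computed only over the masked source positions.
    nmt = F.cross_entropy(nmt_logits.transpose(1, 2), tgt_out, ignore_index=pad_id)
    mlm = F.cross_entropy(mlm_logits.transpose(1, 2), mlm_labels, ignore_index=-100)
    return nmt + lmbda * mlm
```

Because the source embedding and encoder are shared, the gradients from recovering masked polarity words shape the same representation the decoder attends to, which is the intuition behind expecting better transfer of sentiment and emotion; the weight lmbda controls how strongly the auxiliary objective influences training.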
Journal description:
The mission of the Journal of Intelligent Information Systems: Integrating Artificial Intelligence and Database Technologies is to foster and present research and development results focused on the integration of artificial intelligence and database technologies to create next-generation information systems - Intelligent Information Systems.
These new information systems embody knowledge that allows them to exhibit intelligent behavior, cooperate with users and other systems in problem solving, discovery, access, retrieval and manipulation of a wide variety of multimedia data and knowledge, and reason under uncertainty. Increasingly, knowledge-directed inference processes are being used to:
discover knowledge from large data collections,
provide cooperative support to users in complex query formulation and refinement,
access, retrieve, store and manage large collections of multimedia data and knowledge,
integrate information from multiple heterogeneous data and knowledge sources, and
reason about information under uncertain conditions.
Multimedia and hypermedia information systems now operate on a global scale over the Internet, and new tools and techniques are needed to manage these dynamic and evolving information spaces.
The Journal of Intelligent Information Systems provides a forum wherein academics, researchers and practitioners may publish high-quality, original and state-of-the-art papers describing theoretical aspects, systems architectures, analysis and design tools and techniques, and implementation experiences in intelligent information systems. The categories of papers published by JIIS include: research papers, invited papers, meeting, workshop and conference announcements and reports, survey and tutorial articles, and book reviews. Short articles describing open problems or their solutions are also welcome.