Effortless and beneficial processing of natural languages using transformers

K. Amrutha, P. Prabu
Journal of Discrete Mathematical Sciences & Cryptography, 2022-10-03
DOI: 10.1080/09720529.2022.2133239
Abstract: Natural Language Processing (NLP) plays a vital role in our day-to-day lives. Deep learning models for NLP help make human life easier by enabling computers to process, generate, and interact with language in human-like ways. Applications of NLP models appear in many domains, especially machine translation and psychology. This paper briefly reviews different transformer models and the advantages of using an encoder-decoder language-translation model. The article focuses on the need for sequence-to-sequence language-translation models such as BERT, RoBERTa, and XLNet, along with their components.
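The transformer models the abstract surveys (BERT, RoBERTa, XLNet, and encoder-decoder translators) all build on the same core operation: scaled dot-product attention, in which each query vector is compared against every key vector and the resulting weights mix the value vectors. The minimal pure-Python sketch below is illustrative only (it is not from the paper, and the toy vectors are invented for demonstration):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention, the core of every transformer layer.

    queries, keys, values: lists of equal-length float vectors.
    Returns (outputs, attention_weights), one row per query.
    """
    d_k = len(keys[0])
    outputs, all_weights = [], []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d_k)
        # to keep the softmax from saturating at high dimensions.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        # Each output is a weighted average of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
        all_weights.append(weights)
    return outputs, all_weights

# Toy example: one query attending over two key/value pairs.
# The query is aligned with the first key, so the first value
# should receive the larger attention weight.
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
out, weights = scaled_dot_product_attention(q, k, v)
```

In an encoder-decoder translator of the kind the paper reviews, this same operation appears three times: self-attention over the source sentence in the encoder, masked self-attention over the partial translation in the decoder, and cross-attention from decoder queries to encoder keys and values.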