{"title":"Attention-Based Recursive Autoencoder For Sentence-Level Sentiment Classification","authors":"Jiayi Sun, Mingbo Zhao","doi":"10.1109/PRMVIA58252.2023.00050","DOIUrl":null,"url":null,"abstract":"Sentiment analysis is a crucial task in the research of natural language handling. Traditional machine learning approaches frequently employ bag-of-word representations that do not capture complex linguistic phenomena. The recursive autoencoder (RAE) method can availably learn the vector space representation of phrases, which is superior to other sentiment prediction methods on commonly used data sets. However, during the learning process, extensive label data is often required to label each node. In addition, RAE uses greedy strategies to merge adjacent words, it is difficult to capture long-distance and deeper semantic information. We put forward a semi-supervised approach that combines the SenticNet lexicon to train the recursive autoencoder for calculating the sentiment orientation of each node, and incorporates an attention mechanism to capture the contextual relationship between the words in a sentence. 
Experiments prove that the model proposed in this paper outperforms RAE and other models.","PeriodicalId":221346,"journal":{"name":"2023 International Conference on Pattern Recognition, Machine Vision and Intelligent Algorithms (PRMVIA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Pattern Recognition, Machine Vision and Intelligent Algorithms (PRMVIA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PRMVIA58252.2023.00050","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Sentiment analysis is a crucial task in natural language processing research. Traditional machine learning approaches frequently employ bag-of-words representations that fail to capture complex linguistic phenomena. The recursive autoencoder (RAE) can effectively learn vector-space representations of phrases and outperforms other sentiment prediction methods on commonly used data sets. However, training typically requires extensive labeled data to supervise each node, and because RAE merges adjacent words with a greedy strategy, it struggles to capture long-distance and deeper semantic information. We propose a semi-supervised approach that uses the SenticNet lexicon to train the recursive autoencoder to compute the sentiment orientation of each node, and incorporates an attention mechanism to capture the contextual relationships between the words in a sentence. Experiments show that the proposed model outperforms RAE and other models.
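The two ingredients the abstract names, greedy RAE merging and attention over word vectors, can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the parameter shapes, the squared reconstruction error, and the use of the root vector as the attention query are all illustrative choices.

```python
import numpy as np

def rae_greedy_merge(word_vecs, W_e, b_e, W_d, b_d):
    """Greedily merge adjacent vectors, RAE-style.

    Each adjacent pair (c1, c2) is encoded into a parent
    p = tanh(W_e [c1; c2] + b_e) and decoded back; the pair with the
    lowest reconstruction error is merged first. Returns the root vector.
    """
    nodes = list(word_vecs)
    while len(nodes) > 1:
        best_err, best_i, best_p = None, None, None
        for i in range(len(nodes) - 1):
            c = np.concatenate([nodes[i], nodes[i + 1]])
            p = np.tanh(W_e @ c + b_e)
            c_rec = np.tanh(W_d @ p + b_d)          # reconstruction of [c1; c2]
            err = float(np.sum((c - c_rec) ** 2))    # squared reconstruction error
            if best_err is None or err < best_err:
                best_err, best_i, best_p = err, i, p
        nodes[best_i:best_i + 2] = [best_p]          # replace the pair by its parent
    return nodes[0]

def attention_pool(word_vecs, query):
    """Softmax attention over word vectors against a query vector."""
    scores = np.array([v @ query for v in word_vecs])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return sum(w * v for w, v in zip(weights, word_vecs))

# Toy usage with random parameters (hidden dimension d = 4).
rng = np.random.default_rng(0)
d = 4
W_e = rng.normal(scale=0.1, size=(d, 2 * d))
b_e = np.zeros(d)
W_d = rng.normal(scale=0.1, size=(2 * d, d))
b_d = np.zeros(2 * d)
words = [rng.normal(size=d) for _ in range(5)]

root = rae_greedy_merge(words, W_e, b_e, W_d, b_d)
pooled = attention_pool(words, query=root)
```

In training, the per-pair reconstruction error above would be the unsupervised part of the loss, with the lexicon-derived sentiment label supplying the supervised signal at each node; the attention-pooled vector is one plausible sentence representation for the final classifier.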