Authors: Kuihua Zhang, Min Hu, Fuji Ren, Pengyuan Hu
Venue: 2021 IEEE International Conference on Computer Science, Artificial Intelligence and Electronic Engineering (CSAIEE)
Published: 2021-08-20
DOI: 10.1109/CSAIEE54046.2021.9543231
Sentiment Analysis of Chinese Product Reviews Based on BERT Word Vector and Hierarchical Bidirectional LSTM
Sentiment analysis of Chinese shopping reviews has attracted much attention in recent years. Many previous studies focus on the relationships between words within a single sentence but ignore the contextual relationships between sentences. To address this problem, we propose a method for Chinese sentiment analysis based on the Bidirectional Encoder Representations from Transformers (BERT) pre-trained language model, a Hierarchical Bi-directional Long Short-Term Memory (Hierarchical Bi-LSTM) network, and an attention mechanism. We first use the BERT pre-trained language model to obtain word vectors, then apply the Hierarchical Bi-LSTM model to extract contextual features at both the word and sentence levels. Finally, we introduce an attention mechanism to highlight key information. Experimental results show that our method achieves better performance.
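The pipeline the abstract describes (word vectors → word-level Bi-LSTM with attention pooling to form sentence vectors → sentence-level Bi-LSTM with attention pooling to form a document vector) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimensions, parameter initialization, and the random arrays standing in for real BERT embeddings are all assumptions, and the attention pooling here is the common dot-product-with-learned-query form.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_pass(xs, W, U, b, h_dim):
    """Run a single-direction LSTM over a sequence xs of shape (T, d_in)."""
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    outs = []
    for x in xs:
        z = W @ x + U @ h + b                     # all four gates at once, (4*h_dim,)
        i = sigmoid(z[:h_dim])                    # input gate
        f = sigmoid(z[h_dim:2 * h_dim])           # forget gate
        o = sigmoid(z[2 * h_dim:3 * h_dim])       # output gate
        g = np.tanh(z[3 * h_dim:])                # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
        outs.append(h)
    return np.stack(outs)                         # (T, h_dim)

def bilstm(xs, params, h_dim):
    """Bi-LSTM: forward and backward passes, hidden states concatenated."""
    Wf, Uf, bf, Wb, Ub, bb = params
    fwd = lstm_pass(xs, Wf, Uf, bf, h_dim)
    bwd = lstm_pass(xs[::-1], Wb, Ub, bb, h_dim)[::-1]
    return np.concatenate([fwd, bwd], axis=1)     # (T, 2*h_dim)

def attention_pool(H, w):
    """Weight timesteps of H (T, d) by softmax(H @ w) and sum; highlights key steps."""
    scores = H @ w
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ H                                  # (d,)

def make_params(d_in, h_dim, rng):
    """Random Bi-LSTM parameters (forward and backward directions)."""
    def m(r, c):
        return rng.standard_normal((r, c)) * 0.1
    return (m(4 * h_dim, d_in), m(4 * h_dim, h_dim), np.zeros(4 * h_dim),
            m(4 * h_dim, d_in), m(4 * h_dim, h_dim), np.zeros(4 * h_dim))

rng = np.random.default_rng(0)
d_in, h = 8, 4                                    # toy sizes; real BERT vectors are 768-d

word_params = make_params(d_in, h, rng)           # word-level Bi-LSTM
sent_params = make_params(2 * h, h, rng)          # sentence-level Bi-LSTM
w_word = rng.standard_normal(2 * h)               # word-level attention query
w_sent = rng.standard_normal(2 * h)               # sentence-level attention query

# A "document" of 3 sentences with 5, 7, and 3 word vectors,
# standing in for BERT word embeddings.
doc = [rng.standard_normal((t, d_in)) for t in (5, 7, 3)]

# Word level: encode each sentence, attention-pool into one sentence vector.
sent_vecs = np.stack([attention_pool(bilstm(s, word_params, h), w_word)
                      for s in doc])              # (3, 2*h)

# Sentence level: encode the sentence sequence, attention-pool into a
# document vector that a softmax classifier could score for sentiment.
doc_vec = attention_pool(bilstm(sent_vecs, sent_params, h), w_sent)
```

The hierarchical structure is what lets the model capture relationships between sentences, not just within them: the sentence-level Bi-LSTM sees each sentence's pooled representation in the context of its neighbors, and the attention pooling at each level emphasizes the words and sentences most relevant to the overall sentiment.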