{"title":"Learning sentiment-inherent word embedding for word-level and sentence-level sentiment analysis","authors":"Zhihua Zhang, Man Lan","doi":"10.1109/IALP.2015.7451540","DOIUrl":null,"url":null,"abstract":"Vector-based word representations have made great progress on many Natural Language Processing tasks. However, due to the lack of sentiment information, the traditional word vectors are insufficient to settle sentiment analysis tasks. In order to capture the sentiment information, we extended Continuous Skip-gram model (Skip-gram) and presented two sentiment word embedding models by integrating sentiment information into semantic word representations. Experimental results showed that the sentiment word embeddings learned by two models indeed capture sentiment and semantic information as well. Moreover, the proposed sentiment word embedding models outperform traditional word vectors on both Chinese and English corpora.","PeriodicalId":256927,"journal":{"name":"2015 International Conference on Asian Language Processing (IALP)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"19","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Asian Language Processing (IALP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IALP.2015.7451540","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 19
Abstract
Vector-based word representations have made great progress on many Natural Language Processing tasks. However, because they lack sentiment information, traditional word vectors are insufficient for sentiment analysis tasks. To capture sentiment information, we extended the Continuous Skip-gram model (Skip-gram) and presented two sentiment word embedding models that integrate sentiment information into semantic word representations. Experimental results showed that the sentiment word embeddings learned by the two models capture both sentiment and semantic information. Moreover, the proposed sentiment word embedding models outperform traditional word vectors on both Chinese and English corpora.
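To illustrate the general idea of integrating sentiment information into a Skip-gram-style objective, the following is a minimal sketch, not the authors' exact formulation: a standard skip-gram context-prediction loss (with negative sampling) is combined with a word-level sentiment prediction loss, so the learned vectors encode both semantics and sentiment. The model class, hyperparameters, and the weighting factor `alpha` are illustrative assumptions.

```python
# Hedged sketch of a sentiment-aware Skip-gram objective (assumed formulation,
# not the paper's exact model). Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentimentSkipGram(nn.Module):
    def __init__(self, vocab_size, dim=100, num_sentiment_classes=2):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, dim)    # "input" word vectors
        self.out_embed = nn.Embedding(vocab_size, dim)   # context ("output") vectors
        self.sentiment_head = nn.Linear(dim, num_sentiment_classes)

    def skipgram_loss(self, center, context, negatives):
        # Skip-gram with negative sampling: pull (center, context) pairs
        # together, push K sampled negative words apart.
        v = self.in_embed(center)                        # (B, dim)
        u_pos = self.out_embed(context)                  # (B, dim)
        u_neg = self.out_embed(negatives)                # (B, K, dim)
        pos = F.logsigmoid((v * u_pos).sum(-1))
        neg = F.logsigmoid(-(u_neg @ v.unsqueeze(-1)).squeeze(-1)).sum(-1)
        return -(pos + neg).mean()

    def sentiment_loss(self, center, labels):
        # Predict a (lexicon- or corpus-derived) sentiment label for the word
        # directly from its embedding.
        logits = self.sentiment_head(self.in_embed(center))
        return F.cross_entropy(logits, labels)

    def forward(self, center, context, negatives, labels, alpha=0.5):
        # Joint objective: alpha balances semantic vs. sentiment signal
        # (alpha is an assumed hyperparameter for this sketch).
        return (1 - alpha) * self.skipgram_loss(center, context, negatives) \
               + alpha * self.sentiment_loss(center, labels)
```

In this sketch, training would iterate over (center word, context word, negative samples, sentiment label) tuples; after training, `in_embed.weight` serves as the sentiment-inherent word embedding matrix for downstream word-level or sentence-level classifiers.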