Distilling Knowledge from User Information for Document Level Sentiment Classification

Jialing Song
{"title":"从用户信息中提取知识用于文档级情感分类","authors":"Jialing Song","doi":"10.1109/ICDEW.2019.00-15","DOIUrl":null,"url":null,"abstract":"Combining global user and product characteristics with local review information provides a powerful mechanism for predicting users' sentiment in a review document about a product on online review sites such as Amazon, Yelp and IMDB. However, the user information is not always available in the real scenario, for example, some new-registered users, or some sites allowing users' comments without logging in. To address this issue, we introduce a novel knowledge distillation (KD) learning paradigm, to transfer the user characteristics into the weights of student neural networks that just utilize product and review information. The teacher model transfers its predictive distributions of training data to the student model. Thus, the user profiles are only required during the training stage. Experimental results on several sentiment classification datasets show that the proposed learning framework enables student models to achieve improved performance.","PeriodicalId":186190,"journal":{"name":"2019 IEEE 35th International Conference on Data Engineering Workshops (ICDEW)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Distilling Knowledge from User Information for Document Level Sentiment Classification\",\"authors\":\"Jialing Song\",\"doi\":\"10.1109/ICDEW.2019.00-15\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Combining global user and product characteristics with local review information provides a powerful mechanism for predicting users' sentiment in a review document about a product on online review sites such as Amazon, Yelp and IMDB. However, the user information is not always available in the real scenario, for example, some new-registered users, or some sites allowing users' comments without logging in. To address this issue, we introduce a novel knowledge distillation (KD) learning paradigm, to transfer the user characteristics into the weights of student neural networks that just utilize product and review information. The teacher model transfers its predictive distributions of training data to the student model. Thus, the user profiles are only required during the training stage. 
Experimental results on several sentiment classification datasets show that the proposed learning framework enables student models to achieve improved performance.\",\"PeriodicalId\":186190,\"journal\":{\"name\":\"2019 IEEE 35th International Conference on Data Engineering Workshops (ICDEW)\",\"volume\":\"17 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-04-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE 35th International Conference on Data Engineering Workshops (ICDEW)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICDEW.2019.00-15\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 35th International Conference on Data Engineering Workshops (ICDEW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDEW.2019.00-15","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

Combining global user and product characteristics with local review information provides a powerful mechanism for predicting users' sentiment in a review document about a product on online review sites such as Amazon, Yelp and IMDB. However, user information is not always available in practice, for example for newly registered users, or on sites that allow users to comment without logging in. To address this issue, we introduce a novel knowledge distillation (KD) learning paradigm that transfers user characteristics into the weights of student neural networks that use only product and review information. The teacher model transfers its predictive distributions over the training data to the student model, so user profiles are only required during the training stage. Experimental results on several sentiment classification datasets show that the proposed learning framework enables student models to achieve improved performance.
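The approach described in the abstract follows the familiar teacher-student distillation recipe: a teacher trained with user, product, and review features exposes its predictive distribution, and a student that sees only product and review features is trained to match that distribution alongside the ground-truth labels. The sketch below illustrates this idea in PyTorch; the model class, feature dimensions, temperature, and loss weighting are illustrative assumptions, not the paper's actual architecture or hyperparameters.

# Hypothetical sketch of the teacher-student distillation described in the abstract.
# The teacher sees user + product + review features; the student sees only
# product + review features. Shapes and the classifier are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReviewClassifier(nn.Module):
    """Simple feed-forward classifier over a concatenated feature vector."""
    def __init__(self, input_dim: int, num_classes: int, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # unnormalized logits

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 2.0, alpha: float = 0.5):
    """Blend hard-label cross-entropy with a KL term that matches the student's
    softened predictions to the teacher's predictive distribution."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# One training step. User features are needed only here, at training time;
# at inference the student runs on product + review features alone.
user_dim, prod_dim, rev_dim, num_classes = 32, 32, 256, 5
teacher = ReviewClassifier(user_dim + prod_dim + rev_dim, num_classes)  # a pre-trained teacher would be loaded here
student = ReviewClassifier(prod_dim + rev_dim, num_classes)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

user_x = torch.randn(8, user_dim)   # available only during training
prod_x = torch.randn(8, prod_dim)
rev_x = torch.randn(8, rev_dim)
labels = torch.randint(0, num_classes, (8,))

with torch.no_grad():               # teacher is frozen; only its predictions are used
    teacher_logits = teacher(torch.cat([user_x, prod_x, rev_x], dim=-1))
student_logits = student(torch.cat([prod_x, rev_x], dim=-1))

optimizer.zero_grad()
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
optimizer.step()

At inference time only the student is kept, so the deployed model never needs user profiles, which matches the constraint the abstract motivates.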