Sentiment Analysis: Amazon Electronics Reviews Using BERT and Textblob

Abdulrahman Mahgoub, Hesham Atef, Abdulrahman Nasser, Mohamed Yasser, Walaa Medhat, M. Darweesh, Passent El-Kafrawy
{"title":"Sentiment Analysis: Amazon Electronics Reviews Using BERT and Textblob","authors":"Abdulrahman Mahgoub, Hesham Atef, Abdulrahman Nasser, Mohamed Yasser, Walaa Medhat, M. Darweesh, Passent El-Kafrawy","doi":"10.1109/ESOLEC54569.2022.10009176","DOIUrl":null,"url":null,"abstract":"The market needs a deeper and more comprehensive grasp of its insight, where the analytics world and methodologies such as “Sentiment Analysis” come in. These methods can assist people especially “business owners” in gaining live insights into their businesses and determining wheatear customers are satisfied or not. This paper plans to provide indicators by gathering real world Amazon reviews from Egyptian customers. By applying both Bidirectional Encoder Representations from Transformers “Bert” and “Text Blob” sentiment analysis methods. The processes shall determine the overall satisfaction of Egyptian customers in the electronics department - in order to focus on a specific domain. The two methods will be compared for both the Arabic and English languages. The results show that people in Amazon.eg are mostly satisfied with the percentage of 47%. For the performance, BERT outperformed Textblob indicating that word embedding model BERT is more superior than rule-based model Textblob with a difference of 15% - 25%.","PeriodicalId":179850,"journal":{"name":"2022 20th International Conference on Language Engineering (ESOLEC)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 20th International Conference on Language Engineering (ESOLEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ESOLEC54569.2022.10009176","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

The market needs a deeper and more comprehensive grasp of customer insight, which is where analytics methodologies such as sentiment analysis come in. These methods can help people, especially business owners, gain live insight into their businesses and determine whether customers are satisfied. This paper provides such indicators by gathering real-world Amazon reviews from Egyptian customers and applying two sentiment analysis methods: Bidirectional Encoder Representations from Transformers (BERT) and TextBlob. To focus on a specific domain, the analysis measures the overall satisfaction of Egyptian customers with the electronics department, and the two methods are compared for both Arabic and English. The results show that customers on Amazon.eg are mostly satisfied, at 47%. In terms of performance, BERT outperformed TextBlob by 15%-25%, indicating that the embedding-based BERT model is superior to the rule-based TextBlob model.
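
The abstract contrasts a rule-based scorer (TextBlob) with a transformer model (BERT) applied to the same reviews. The snippet below is a minimal Python sketch of that comparison on a single English review, not the authors' implementation: the pretrained checkpoint `nlptown/bert-base-multilingual-uncased-sentiment` and the zero-polarity "satisfied" threshold are assumptions made here for illustration.

```python
# Minimal sketch, not the paper's pipeline: compare a rule-based TextBlob
# score with a pretrained BERT sentiment pipeline on one review.
# The model checkpoint and the >0 "satisfied" threshold are assumptions.
from textblob import TextBlob
from transformers import pipeline

review = "The headphones arrived quickly and the sound quality is excellent."

# Rule-based: TextBlob returns a polarity in [-1.0, 1.0].
polarity = TextBlob(review).sentiment.polarity
textblob_label = "satisfied" if polarity > 0 else "not satisfied"
print(f"TextBlob: polarity={polarity:+.2f} -> {textblob_label}")

# Transformer-based: a multilingual BERT sentiment checkpoint (assumed here;
# the abstract does not name the exact model used in the paper).
bert = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)
result = bert(review)[0]  # e.g. {'label': '5 stars', 'score': 0.87}
print(f"BERT: {result['label']} (confidence={result['score']:.2f})")
```

For the Arabic reviews the paper also evaluates, TextBlob's default analyzer handles only English, so an Arabic-capable model or a translation step would be needed; the abstract does not specify which route the authors take, so that part is omitted from the sketch.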