Enhancing E-commerce Chatbots with Falcon-7B and 16-bit Full Quantization

Yang Luo, Zibu Wei, Guokun Xu, Zhengning Li, Ying Xie, Yibo Yin
{"title":"Enhancing E-commerce Chatbots with Falcon-7B and 16-bit Full Quantization","authors":"Yang Luo, Zibu Wei, Guokun Xu, Zhengning Li, Ying Xie, Yibo Yin","doi":"10.53469/jtpes.2024.04(02).08","DOIUrl":null,"url":null,"abstract":"E-commerce chatbots play a crucial role in customer service but often struggle with understanding complex queries. This study introduces a breakthrough approach leveraging the Falcon-7B model, a state-of-the-art Large Language Model (LLM) with 7 billion parameters. Trained on a vast dataset of 1,500 billion tokens from RefinedWeb and curated corpora, the Falcon-7B model excels in natural language understanding and generation. Notably, its 16-bit full quantization transformer ensures efficient computation without compromising scalability or performance. By harnessing cutting-edge machine learning techniques, our method aims to redefine e-commerce chatbot systems, providing businesses with a robust solution for delivering personalized customer experiences.","PeriodicalId":489516,"journal":{"name":"Journal of Theory and Practice of Engineering Science","volume":"85 6","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Theory and Practice of Engineering Science","FirstCategoryId":"0","ListUrlMain":"https://doi.org/10.53469/jtpes.2024.04(02).08","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

E-commerce chatbots play a crucial role in customer service but often struggle to understand complex queries. This study introduces a breakthrough approach leveraging Falcon-7B, a state-of-the-art large language model (LLM) with 7 billion parameters. Trained on 1,500 billion tokens drawn from RefinedWeb and curated corpora, Falcon-7B excels at natural language understanding and generation. Notably, running the full transformer in 16-bit precision keeps computation efficient without compromising scalability or performance. By harnessing these cutting-edge machine learning techniques, our method aims to redefine e-commerce chatbot systems, giving businesses a robust solution for delivering personalized customer experiences.
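
In practice, the "16-bit full quantization" the abstract describes amounts to storing and running the transformer's weights in 16-bit floating point rather than 32-bit, roughly halving memory and bandwidth costs. As a minimal sketch of what this looks like (an assumed setup, not the authors' released code), the snippet below loads the public tiiuae/falcon-7b checkpoint in bfloat16 via the Hugging Face transformers library; the prompt and generation settings are illustrative only, and older transformers releases also required trust_remote_code=True for Falcon.

```python
# Minimal sketch: Falcon-7B with 16-bit weights via Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "tiiuae/falcon-7b"  # public Falcon-7B checkpoint on the Hugging Face Hub

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # 16-bit weights: ~14 GB instead of ~28 GB in fp32
    device_map="auto",           # needs `accelerate`; spreads layers over available devices
)

# Hypothetical e-commerce support prompt, purely for illustration.
prompt = "Customer: My order arrived damaged. What are my options?\nAgent:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # Falcon's tokenizer defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keeping the weights in 16-bit halves the footprint relative to fp32 with negligible quality loss for inference, which is what makes serving a 7B-parameter chatbot on a single commodity GPU feasible.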