Applications of Large Language Models in Cloud Computing: An Empirical Study Using Real-world Data

Hanzhe Li, Sherry X Wang, Fu Shang, Kaiyi Niu, Runze Song
DOI: 10.55524/ijircst.2024.12.4.10
Journal: International Journal of Innovative Research in Computer Science and Technology
Published: 2024-07-01 (Journal Article)
Citations: 0

Abstract

This study investigates the integration of Large Language Models (LLMs) in cloud computing, focusing on their impact on resource allocation and management. The research employs Bayesian inference and Markov Decision Processes (MDPs) to enhance predictive accuracy and decision-making efficiency. Over a month, data collected from AWS, GCP, Azure, IBM, and Oracle reveals significant improvements in CPU utilization, memory usage, network latency, and storage performance. LLMs demonstrated superior performance compared to traditional models, optimizing task scheduling and reducing idle times. Bayesian inference refined resource predictions, while MDPs provided a structured approach to dynamic optimization, resulting in lower latency and better system efficiency. The findings suggest that integrating LLMs can transform cloud service management, offering enhanced performance, reliability, and cost savings. Future research should explore long-term trends, security implications, and the ethical aspects of AI deployment in cloud environments.
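The abstract pairs Bayesian inference (to refine resource-usage predictions) with a Markov Decision Process (to choose scaling actions). The paper's actual models and data are not reproduced here, but the general pattern can be sketched with toy numbers: a Beta-Binomial posterior over the probability that a host is busy, fed into a tiny two-state MDP solved by value iteration. All states, actions, rewards, and transition probabilities below are hypothetical illustrations, not values from the study.

```python
import numpy as np

# --- Bayesian refinement of a utilization estimate (illustrative) ---
# Beta-Binomial update: prior Beta(a, b) over the probability that a
# host is "busy" in a sampling interval; observed busy/idle counts
# sharpen the posterior.
a, b = 2.0, 2.0            # weakly informative prior (hypothetical)
busy, idle = 45, 15        # toy observation counts, not the paper's data
post_mean = (a + busy) / (a + b + busy + idle)   # posterior mean

# --- Tiny MDP for scaling decisions, solved by value iteration ---
# States: 0 = underloaded, 1 = overloaded.
# Actions: 0 = keep current capacity, 1 = scale up.
p = post_mean              # posterior busy probability drives load drift
P = np.array([
    # action "keep": load drifts toward busy with probability p
    [[1 - p, p], [1 - p, p]],
    # action "scale up": extra capacity makes overload less likely
    [[0.9, 0.1], [0.7, 0.3]],
])                         # shape: (actions, states, next states)
R = np.array([
    [0.0, -5.0],           # keep: overload is costly
    [-1.0, -2.0],          # scale up: pay for capacity, soften overload
])                         # shape: (actions, states)
gamma = 0.9                # discount factor

V = np.zeros(2)
for _ in range(200):       # value iteration to (near) convergence
    Q = R + gamma * P @ V  # expected return per (action, state)
    V = Q.max(axis=0)      # best achievable value per state
policy = Q.argmax(axis=0)  # greedy action per state

print(f"posterior busy prob: {post_mean:.3f}, policy: {policy.tolist()}")
```

With these toy numbers the greedy policy keeps capacity while underloaded and scales up once overloaded, which mirrors the abstract's claim that structured dynamic optimization reduces idle time without letting latency degrade.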