Training a Graph Neural Network to solve URLLC and eMBB coexisting in 5G networks

IF 4.5 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Computer Communications · Pub Date: 2024-07-18 · DOI: 10.1016/j.comcom.2024.07.008
Seyyed Mohammad Mahdi Hosseini Daneshvar, Sayyed Majid Mazinani
{"title":"训练图神经网络以解决 5G 网络中并存的 URLLC 和 eMBB 问题","authors":"Seyyed Mohammad Mahdi Hosseini Daneshvar,&nbsp;Sayyed Majid Mazinani","doi":"10.1016/j.comcom.2024.07.008","DOIUrl":null,"url":null,"abstract":"<div><p>Coexistence of enhanced mobile broadband and ultra-reliable low latency communication in 5G networks is a challenging problem due to the conflicting requirements. In this paper, we decompose the problem into eMBB and URLLC resource allocation phases. For the first phase we propose a heuristic algorithm with <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span> runtime and prove its efficiency and optimality under min–max fairness paradigm. For the URLLC resource allocation, the puncturing framework is adopted and a novel approach using the Graph Neural Networks is proposed to maximize eMBB data rates and fairness while minimizing URLLC outage probability. We show that the runtime of this GNN-based algorithm is also <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span>. To train the GNN, an application-specific loss function is designed and empirically shown to be convergent. Our simulation results show that our proposed approach performs very well in terms of eMBB data rates, fairness, and URLLC outage probability in comparison to a number of thoughtfully chosen baselines. We also demonstrate that the proposed GNN is robust to changes in network topology and traffic volume. As we show our algorithm has <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span> runtime, it is fully practical for solving the resource allocation problem in the very short time spans that are required by 5G and future generation networks.</p></div>","PeriodicalId":55224,"journal":{"name":"Computer Communications","volume":"225 ","pages":"Pages 171-184"},"PeriodicalIF":4.5000,"publicationDate":"2024-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Training a Graph Neural Network to solve URLLC and eMBB coexisting in 5G networks\",\"authors\":\"Seyyed Mohammad Mahdi Hosseini Daneshvar,&nbsp;Sayyed Majid Mazinani\",\"doi\":\"10.1016/j.comcom.2024.07.008\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Coexistence of enhanced mobile broadband and ultra-reliable low latency communication in 5G networks is a challenging problem due to the conflicting requirements. In this paper, we decompose the problem into eMBB and URLLC resource allocation phases. For the first phase we propose a heuristic algorithm with <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span> runtime and prove its efficiency and optimality under min–max fairness paradigm. For the URLLC resource allocation, the puncturing framework is adopted and a novel approach using the Graph Neural Networks is proposed to maximize eMBB data rates and fairness while minimizing URLLC outage probability. We show that the runtime of this GNN-based algorithm is also <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span>. To train the GNN, an application-specific loss function is designed and empirically shown to be convergent. Our simulation results show that our proposed approach performs very well in terms of eMBB data rates, fairness, and URLLC outage probability in comparison to a number of thoughtfully chosen baselines. 
We also demonstrate that the proposed GNN is robust to changes in network topology and traffic volume. As we show our algorithm has <span><math><mrow><mi>O</mi><mrow><mo>(</mo><mi>n</mi><mo>)</mo></mrow></mrow></math></span> runtime, it is fully practical for solving the resource allocation problem in the very short time spans that are required by 5G and future generation networks.</p></div>\",\"PeriodicalId\":55224,\"journal\":{\"name\":\"Computer Communications\",\"volume\":\"225 \",\"pages\":\"Pages 171-184\"},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2024-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Communications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0140366424002469\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Communications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0140366424002469","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Coexistence of enhanced mobile broadband and ultra-reliable low latency communication in 5G networks is a challenging problem due to the conflicting requirements. In this paper, we decompose the problem into eMBB and URLLC resource allocation phases. For the first phase we propose a heuristic algorithm with O(n) runtime and prove its efficiency and optimality under min–max fairness paradigm. For the URLLC resource allocation, the puncturing framework is adopted and a novel approach using the Graph Neural Networks is proposed to maximize eMBB data rates and fairness while minimizing URLLC outage probability. We show that the runtime of this GNN-based algorithm is also O(n). To train the GNN, an application-specific loss function is designed and empirically shown to be convergent. Our simulation results show that our proposed approach performs very well in terms of eMBB data rates, fairness, and URLLC outage probability in comparison to a number of thoughtfully chosen baselines. We also demonstrate that the proposed GNN is robust to changes in network topology and traffic volume. As we show our algorithm has O(n) runtime, it is fully practical for solving the resource allocation problem in the very short time spans that are required by 5G and future generation networks.
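The abstract does not spell out the application-specific loss used to train the GNN, only that it balances eMBB data rates, fairness, and URLLC outage probability. As a rough, hypothetical sketch of how such an objective could be written, the PyTorch snippet below mixes mean eMBB rate, Jain's fairness index, and mean URLLC outage into one differentiable scalar; the function name coexistence_loss, the weights alpha, beta, gamma, and the toy inputs are illustrative assumptions, not the authors' definitions.

```python
# Hypothetical sketch (not the paper's implementation) of an application-specific
# loss that rewards eMBB throughput and fairness while penalizing URLLC outage.
import torch

def coexistence_loss(embb_rates: torch.Tensor,
                     urllc_outage: torch.Tensor,
                     alpha: float = 1.0,
                     beta: float = 1.0,
                     gamma: float = 1.0) -> torch.Tensor:
    """Combine per-user eMBB rates (shape (U,)) and per-request URLLC outage
    probabilities (shape (R,), values in [0, 1]) into a single scalar loss."""
    throughput = embb_rates.mean()                       # term to maximize
    # Jain's fairness index: (sum x)^2 / (n * sum x^2), lies in (0, 1].
    jain = embb_rates.sum() ** 2 / (embb_rates.numel() * (embb_rates ** 2).sum() + 1e-9)
    outage = urllc_outage.mean()                         # term to minimize
    # Negate the terms to be maximized; the optimizer minimizes this scalar.
    return -alpha * throughput - beta * jain + gamma * outage

# Toy usage with random stand-in values for the quantities a GNN would predict.
if __name__ == "__main__":
    rates = torch.rand(8, requires_grad=True) * 100.0    # hypothetical Mbps per eMBB user
    outage = torch.rand(4, requires_grad=True) * 1e-3    # hypothetical outage probabilities
    loss = coexistence_loss(rates, outage)
    loss.backward()                                      # gradients flow back to the predictions
    print(float(loss))
```

In a training pipeline of this kind, the GNN's raw outputs (for example, per-resource-block puncturing decisions) would first be mapped to rates and outage probabilities by a differentiable rate/outage model, and the weights would set the trade-off between the three terms; the paper's actual loss, weighting, and outage model may differ.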

Source journal
Computer Communications (Engineering & Technology: Telecommunications)
CiteScore: 14.10
Self-citation rate: 5.00%
Articles published: 397
Review time: 66 days
Journal description: Computer and communications networks are key infrastructures of the information society with high socio-economic value, as they contribute to the correct operation of many critical services (from healthcare to finance and transportation). The Internet is the core of today's computer-communication infrastructures. This has transformed the Internet from a robust network for data transfer between computers into a global, content-rich communication and information system where contents are increasingly generated by the users and distributed according to human social relations. Next-generation network technologies, architectures and protocols are therefore required to overcome the limitations of the legacy Internet and add new capabilities and services. The future Internet should be ubiquitous, secure, resilient, and closer to human communication paradigms. Computer Communications is a peer-reviewed international journal that publishes high-quality scientific articles (both theory and practice) and survey papers covering all aspects of future computer communication networks (on all layers, except the physical layer), with special attention to the evolution of the Internet architecture, protocols, services, and applications.
Latest articles in this journal
Editorial Board
A deep dive into cybersecurity solutions for AI-driven IoT-enabled smart cities in advanced communication networks
The pupil outdoes the master: Imperfect demonstration-assisted trust region jamming policy optimization against frequency-hopping spread spectrum
High-performance BFT consensus for Metaverse through block linking and shortcut loop
Automating 5G network slice management for industrial applications