Pub Date: 2024-11-27 | DOI: 10.1109/COMST.2024.3507019
Navid Heydarishahreza;Tao Han;Nirwan Ansari
The rapid emergence of satellite systems introduces unprecedented interference challenges to both existing satellite networks and Terrestrial Networks (TNs), necessitating innovative strategies to efficiently manage spectrum resources amid heightened competition. Traditional interference management methods fail to address the unique challenges facing satellite systems. These challenges include higher propagation delays caused by the high altitude of Low Earth Orbit (LEO) satellites, increased Doppler shifts due to their high speeds, atmospheric attenuation affecting LEO satellite-TN links, and limited processing capacity in satellite systems. This article provides a comprehensive exploration of interference in LEO satellite-Integrated Terrestrial Networks (LITNets), encompassing various types of interference, including Inter-Beam Interference (IBI), which occurs between different beams of the same satellite; Inter-Satellite Interference (ISI), which arises between different satellites; and LEO satellite-Terrestrial infrastructure Interference (LTI). Moreover, it outlines strategies for interference management and reviews current mitigation methods. Finally, the article concludes by discussing the research challenges and proposing future directions for enhancing spectrum efficiency and interference management in LITNets.
Title: "Spectrum Sharing and Interference Management for 6G LEO Satellite-Terrestrial Network Integration" (IEEE Communications Surveys and Tutorials, vol. 27, no. 5, pp. 2794-2825; DOI: 10.1109/COMST.2024.3507019)
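As a rough illustrative complement to the abstract above (not taken from the article itself), the propagation-delay and Doppler figures of a LEO link can be sketched from basic orbital mechanics; the `leo_link_metrics` helper and the 550 km / 12 GHz example values are assumptions chosen for illustration:

```python
import math

C = 299_792_458.0        # speed of light, m/s
EARTH_RADIUS = 6_371e3   # mean Earth radius, m
MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2

def leo_link_metrics(altitude_m: float, carrier_hz: float) -> dict:
    """Rough one-way propagation delay (satellite at zenith) and worst-case
    Doppler shift for a circular-orbit LEO satellite seen from the ground."""
    r = EARTH_RADIUS + altitude_m
    orbital_speed = math.sqrt(MU / r)      # circular-orbit velocity
    delay_s = altitude_m / C               # shortest path: directly overhead
    # Worst case: the full orbital velocity appears as radial velocity.
    max_doppler_hz = carrier_hz * orbital_speed / C
    return {"delay_ms": delay_s * 1e3,
            "orbital_speed_kms": orbital_speed / 1e3,
            "max_doppler_khz": max_doppler_hz / 1e3}

# Example: a 550 km constellation on a 12 GHz (Ku-band) carrier.
m = leo_link_metrics(altitude_m=550e3, carrier_hz=12e9)
```

Even these crude numbers (milliseconds of delay, hundreds of kHz of Doppler) show why terrestrial-only interference management assumptions break down for LITNets.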
As an essential part of the 6G sea-land-air integrated network, underwater networking has attracted increasing attention and has been widely studied. The key to improving its performance is communication optimization based on data rate, throughput, latency, reliability, spectrum utilization, and other factors impacting the quality of service (QoS). However, the poor underwater communication environment makes it difficult to improve the communication quality of underwater networking and brings many challenges to the design of optimization schemes. In the face of a complex, unknown, and dynamic underwater environment, optimization schemes need a higher level of adaptability and intelligence to carry out autonomous decision-making and multi-objective optimization under different conditions. To meet the above challenges and needs, reinforcement learning (RL) is widely used to obtain the optimal strategy for underwater communication. Nevertheless, there is still a lack of comprehensive reviews on using RL to optimize underwater communication networking. Therefore, this survey comprehensively investigates the application of RL in underwater networking to bridge this gap and guide future optimization of underwater communication. Specifically, we provide an overview of RL usage processes and tools and detail its various applications in underwater communication networking, including spectrum resource allocation and development, throughput improvement and delay reduction, reliability improvement, energy saving and energy efficiency optimization, data sensing and processing, and intelligent cluster networking. Based on the review, we further analyze the open challenges and research directions of RL-enabled underwater communication networking in the future.
Title: "Toward Communication Optimization for Future Underwater Networking: A Survey of Reinforcement Learning-Based Approaches" by Ziyuan Wang, Jun Du, Xiangwang Hou, Jingjing Wang, Chunxiao Jiang, Xiao-Ping Zhang, Yong Ren (IEEE Communications Surveys and Tutorials, vol. 27, no. 5, pp. 2765-2793; DOI: 10.1109/COMST.2024.3505850; published 2024-11-25)
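To make the RL framing above concrete, here is a minimal self-contained sketch (not from the survey) of epsilon-greedy Q-learning choosing among acoustic channels; the toy success probabilities and all names are invented for illustration:

```python
import random

random.seed(7)

# Toy environment: three acoustic channels with success probabilities that
# are unknown to the agent, standing in for a dynamic underwater link.
SUCCESS_PROB = {0: 0.2, 1: 0.8, 2: 0.5}

def step(channel: int) -> float:
    """Reward 1.0 for a successful transmission, else 0.0."""
    return 1.0 if random.random() < SUCCESS_PROB[channel] else 0.0

def q_learning_channel_selection(episodes=5000, alpha=0.1, eps=0.1):
    """Stateless Q-learning: incrementally estimate each channel's value."""
    q = {c: 0.0 for c in SUCCESS_PROB}
    for _ in range(episodes):
        if random.random() < eps:                 # explore
            c = random.choice(list(q))
        else:                                     # exploit current best
            c = max(q, key=q.get)
        q[c] += alpha * (step(c) - q[c])          # incremental value update
    return q

q = q_learning_channel_selection()
```

The agent converges on the channel with the highest success probability without ever being told the channel model, which is exactly the adaptability argument the abstract makes for underwater environments.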
With the increasing complexity and scale of modern networks, the demand for transparent and interpretable Artificial Intelligence (AI) models has surged. This survey comprehensively reviews the current state of eXplainable Artificial Intelligence (XAI) methodologies in the context of Network Traffic Analysis (NTA), including tasks such as traffic classification, intrusion detection, attack classification, and traffic prediction, and encompasses various aspects such as techniques, applications, requirements, challenges, and ongoing projects. It explores the vital role of XAI in enhancing network security, performance optimization, and reliability. Additionally, this survey underscores the importance of understanding why AI-driven decisions are made, emphasizing the need for explainability in critical network environments. By providing a holistic perspective on XAI for Internet NTA, this survey aims to guide researchers and practitioners in harnessing the potential of transparent AI models to address the intricate challenges of modern network management and security.
Title: "A Survey on Explainable Artificial Intelligence for Internet Traffic Classification and Prediction, and Intrusion Detection" by Alfredo Nascita, Giuseppe Aceto, Domenico Ciuonzo, Antonio Montieri, Valerio Persico, Antonio Pescapé (IEEE Communications Surveys and Tutorials, vol. 27, no. 5, pp. 3165-3198; DOI: 10.1109/COMST.2024.3504955; published 2024-11-22; open access)
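One model-agnostic XAI technique commonly applied to traffic classifiers is permutation importance; the following toy sketch (synthetic flow data and a hypothetical one-feature classifier, not from the survey) shows how it attributes a classifier's decisions to its input features:

```python
import random

random.seed(0)

# Synthetic flow records: the label depends only on mean packet size
# (feature 0); feature 1 (inter-arrival time) is pure noise.
X = [[random.uniform(0, 1500), random.uniform(0, 1)] for _ in range(400)]
y = [1 if row[0] > 700 else 0 for row in X]

def stump_predict(row):
    """A fixed, already-'trained' decision stump on packet size."""
    return 1 if row[0] > 700 else 0

def accuracy(X, y):
    return sum(stump_predict(r) == t for r, t in zip(X, y)) / len(y)

def permutation_importance(X, y, feature: int) -> float:
    """Accuracy drop when one feature column is shuffled: a simple,
    model-agnostic measure of which inputs the classifier relies on."""
    base = accuracy(X, y)
    col = [r[feature] for r in X]
    random.shuffle(col)
    X_perm = [r[:feature] + [v] + r[feature + 1:] for r, v in zip(X, col)]
    return base - accuracy(X_perm, y)

imp_size = permutation_importance(X, y, 0)   # informative feature
imp_noise = permutation_importance(X, y, 1)  # irrelevant feature
```

A large accuracy drop for the packet-size feature and none for the noise feature is the kind of explanation that lets an operator verify an intrusion detector is keying on meaningful traffic properties.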
Pub Date: 2024-11-21 | DOI: 10.1109/COMST.2024.3503680
Yujun Cheng;Weiting Zhang;Zhewei Zhang;Chuan Zhang;Shengjin Wang;Shiwen Mao
Large Language Models (LLMs), such as LLaMA and GPT-4, have transformed the paradigm of natural language comprehension and generation. Despite their impressive performance, these models still face certain challenges, including the need for extensive data, high computational resources, and privacy concerns related to their data sources. Recently, Federated Learning (FL) has surfaced as a cooperative AI methodology that enables AI training across distributed computation entities while maintaining decentralized data. Integrating FL with LLMs presents an encouraging solution for privacy-preserving and collaborative LLM learning across multiple end-users, thus addressing the aforementioned challenges. In this paper, we provide an exhaustive review of federated Large Language Models, starting from an overview of the latest progress in FL and LLMs, and proceeding to a discourse on their motivation and challenges for integration. We then conduct a thorough review of the existing federated LLM research from the perspective of the entire lifespan, from pre-training to fine-tuning and practical applications. Moreover, we address the threats and issues arising from this integration, shedding light on the delicate balance between privacy and robustness, and introduce existing approaches and potential strategies for enhancing federated LLM privacy and resilience. Finally, we conclude this survey by outlining promising avenues for future research in this emerging field.
Title: "Toward Federated Large Language Models: Motivations, Methods, and Future Directions" (IEEE Communications Surveys and Tutorials, vol. 27, no. 4, pp. 2733-2764; DOI: 10.1109/COMST.2024.3503680)
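The FL-LLM integration above rests on federated averaging (FedAvg); a minimal single-parameter sketch (illustrative, with invented client data) shows the core loop in which only model weights, never raw data, leave the clients:

```python
# Minimal FedAvg sketch: each client trains locally on private data, and
# the server averages the returned weights, weighted by dataset size.
def local_sgd(w, data, lr=0.1, epochs=20):
    """Local training of a one-parameter least-squares model y = w*x."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

def fedavg(w0, client_data, rounds=10):
    w = w0
    for _ in range(rounds):
        sizes = [len(d) for d in client_data]
        # Raw examples never leave the clients; only weights are shared.
        local_ws = [local_sgd(w, d) for d in client_data]
        w = sum(n * wl for n, wl in zip(sizes, local_ws)) / sum(sizes)
    return w

# Two clients whose private data are both consistent with y = 3x.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5)]]
w_global = fedavg(0.0, clients)
```

Real federated LLM training replaces the scalar weight with billions of parameters (or parameter-efficient adapters), but the communication pattern and the privacy argument are the same.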
Pub Date: 2024-11-21 | DOI: 10.1109/COMST.2024.3464708
Dusit Niyato
I welcome you to the fourth issue of the IEEE Communications Surveys and Tutorials in 2024. This issue includes 19 papers covering different aspects of communication networks. In particular, these articles survey and provide tutorials on various topics in "Wireless Communications", "Cyber Security", "IoT and M2M", "Vehicular and Sensor Communications", "Internet Technologies", "Network and Service Management and Green Communications", "Network Virtualization", "Optical Communications", and "Multimedia Communications". A brief account of each of these papers is given below.
Title: "Editorial: Fourth Quarter 2024 IEEE Communications Surveys and Tutorials" (IEEE Communications Surveys and Tutorials, vol. 26, no. 4, pp. i-vi; DOI: 10.1109/COMST.2024.3464708; open access)
Owing to its outstanding parallel computing capabilities, quantum computing (QC) has been a subject of continuous attention. With the gradual maturation of QC platforms, it has increasingly played a significant role in various fields such as transportation, pharmaceuticals, and industrial manufacturing, achieving unprecedented milestones. In modern society, wireless communication stands as an indispensable infrastructure, with its essence lying in optimization. Although artificial intelligence (AI) algorithms such as reinforcement learning (RL) and mathematical optimization have greatly enhanced the performance of wireless communication, the rapid attainment of optimal solutions for wireless communication problems remains an unresolved challenge. QC, however, presents a new alternative. This paper aims to elucidate the fundamentals of QC and explore its applications in wireless communications and networking. First, we provide a tutorial on QC, covering its basics, characteristics, and popular QC algorithms. Next, we introduce the applications of QC in communications and networking, followed by its applications in miscellaneous areas such as security and privacy, localization and tracking, and video streaming. Finally, we discuss remaining open issues before concluding.
Title: "Quantum Computing in Wireless Communications and Networking: A Tutorial-cum-Survey" by Wei Zhao, Tangjie Weng, Yue Ruan, Zhi Liu, Xuangou Wu, Xiao Zheng, Nei Kato (IEEE Communications Surveys and Tutorials, vol. 27, no. 4, pp. 2378-2419; DOI: 10.1109/COMST.2024.3502762; published 2024-11-20)
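As a concrete taste of the QC fundamentals the tutorial covers, a tiny statevector simulation (illustrative, not from the paper) of one Grover iteration over four items shows the oracle-plus-diffusion step concentrating all probability on the marked index:

```python
import math

# One Grover iteration over N = 4 items: for this size, a single
# oracle + diffusion step drives all amplitude onto the marked index.
N, marked = 4, 2
state = [1 / math.sqrt(N)] * N            # uniform superposition

# Oracle: flip the phase of the marked item's amplitude.
state[marked] = -state[marked]

# Diffusion operator: reflect every amplitude about the mean amplitude.
mean = sum(state) / N
state = [2 * mean - a for a in state]

# Measurement probabilities are squared amplitudes.
probs = [a * a for a in state]
```

A classical search over N unsorted items needs O(N) queries; Grover's algorithm needs O(sqrt(N)), which is the kind of speedup that motivates applying QC to the wireless optimization problems described above.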
The advent of the sixth-generation (6G) networks presents another round of revolution for the mobile communication landscape, promising an immersive experience, robust reliability, minimal latency, extreme connectivity, ubiquitous coverage, and capabilities beyond communication, including intelligence and sensing. To achieve these ambitious goals, it is apparent that 6G networks need to incorporate state-of-the-art technologies. One technology that has garnered rising interest is the fluid antenna system (FAS), which represents any software-controllable fluidic, conductive, or dielectric structure capable of dynamically changing its shape and position to reconfigure essential radio-frequency (RF) characteristics. Compared to traditional antenna systems (TASs) with fixed-position radiating elements, the core idea of FAS revolves around the unique flexibility of reconfiguring the radiating elements within a given space. One recent driver of FAS is the recognition of its position-flexibility as a new degree of freedom (dof) to harness diversity and multiplexing gains. In this paper, we provide a comprehensive tutorial, covering channel modeling, signal processing and estimation methods, information-theoretic insights, new multiple access techniques, and hardware designs. Moreover, we delineate the challenges of FAS and explore the potential of using FAS to improve the performance of other contemporary technologies. By providing insights and guidance, this tutorial paper serves to inspire researchers to explore new horizons and fully unleash the potential of FAS.
Title: "A Tutorial on Fluid Antenna System for 6G Networks: Encompassing Communication Theory, Optimization Methods and Hardware Designs" by Wee Kiat New, Kai-Kit Wong, Hao Xu, Chao Wang, Farshad Rostami Ghadi, Jichen Zhang, Junhui Rao, Ross Murch, Pablo Ramírez-Espinosa, David Morales-Jimenez, Chan-Byoung Chae, Kin-Fai Tong (IEEE Communications Surveys and Tutorials, vol. 27, no. 4, pp. 2325-2377; DOI: 10.1109/COMST.2024.3498855; published 2024-11-15)
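The diversity gain from FAS position-flexibility can be illustrated with a simple Monte Carlo sketch of best-port selection (assuming independent Rayleigh fading per port for simplicity, whereas real FAS ports are spatially correlated; all names and values here are illustrative):

```python
import math
import random

random.seed(1)

def rayleigh_gain() -> float:
    """Channel power |h|^2 for a unit-mean-power Rayleigh fading channel."""
    re = random.gauss(0, math.sqrt(0.5))
    im = random.gauss(0, math.sqrt(0.5))
    return re * re + im * im

def avg_selected_gain(n_ports: int, trials: int = 20000) -> float:
    """Mean post-selection channel gain when the fluid antenna switches its
    radiating element to the strongest of n_ports candidate positions."""
    total = 0.0
    for _ in range(trials):
        total += max(rayleigh_gain() for _ in range(n_ports))
    return total / trials

fixed = avg_selected_gain(1)     # traditional fixed-position antenna
fas_8 = avg_selected_gain(8)     # fluid antenna choosing among 8 ports
```

Under the independence assumption the selected gain grows roughly with the harmonic number of the port count, so even a modest number of ports gives a clear average-SNR advantage over a fixed-position element, which is the position-flexibility degree of freedom the tutorial formalizes.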
Pub Date: 2024-11-07 | DOI: 10.1109/COMST.2024.3493626
Muhammad Ali Naeem;Ikram Ud Din;Yahui Meng;Ahmad Almogren;Joel J. P. C. Rodrigues
The substantial growth in data volume has led to considerable technological obstacles on the Internet. To address the high volume of Internet traffic, the research community has investigated the improvement of Internet architecture by implementing centrality-based caching, which could involve collaborative efforts. Different centrality-based caching strategies that allow for different data distributions have been put forward, including betweenness centrality, degree centrality, and closeness centrality. Caching provides several advantages in terms of reducing latency, improving scalability, and enhancing data manageability. In addition, this study provides an overview of cache management algorithms based on centrality in the context of Information Centric Networking (ICN), Named Data Networking (NDN), and the Internet of Things (IoT). It highlights the advantages and disadvantages of these algorithms and evaluates their performance in a network simulation environment, specifically in terms of cache hit ratio, data retrieval latency, and average hop count. Ultimately, we aim to pinpoint and deliberate on possible research directions for future studies concerning various aspects of centrality-based caching in communication systems.
Title: "Centrality-Based On-Path Caching Strategies in NDN-Based Internet of Things: A Survey" (IEEE Communications Surveys and Tutorials, vol. 27, no. 4, pp. 2621-2657; DOI: 10.1109/COMST.2024.3493626)
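A minimal sketch of the centrality-based cache placement idea surveyed above (the toy topology and helper names are invented for illustration): compute a centrality score per router, then cache only at the top-k nodes on the delivery path.

```python
from collections import deque

# Toy NDN topology as an adjacency list: "c" and "d" form the backbone.
topo = {
    "a": ["c"], "b": ["c"],
    "c": ["a", "b", "d"],
    "d": ["c", "e", "f"],
    "e": ["d"], "f": ["d"],
}

def degree_centrality(g):
    """Fraction of other nodes each node is directly connected to."""
    n = len(g) - 1
    return {v: len(nbrs) / n for v, nbrs in g.items()}

def closeness_centrality(g):
    """(n-1) / sum of BFS hop distances to all other nodes."""
    out = {}
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        out[src] = (len(g) - 1) / sum(dist.values())
    return out

# On-path caching policy: cache content only at the k highest-centrality
# routers, so popular data sits where the most request paths intersect.
k = 2
cache_nodes = sorted(topo, key=degree_centrality(topo).get, reverse=True)[:k]
```

Swapping `degree_centrality` for `closeness_centrality` (or a betweenness computation) yields the other placement variants the survey compares on cache hit ratio, retrieval latency, and hop count.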
Pub Date: 2024-11-07 | DOI: 10.1109/COMST.2024.3493630
Lan-Huong Nguyen;Van-Linh Nguyen;Ren-Hung Hwang;Jian-Jhih Kuo;Yu-Wen Chen;Chien-Chung Huang;Ping-I Pan
Many nations are promoting the green transition in the energy sector to attain neutral carbon emissions by 2050. Smart Grid 2.0 (SG2) is expected to explore data-driven analytics and enhance communication technologies to improve the efficiency and sustainability of distributed renewable energy systems. These features are beyond smart metering and electric surplus distribution in conventional smart grids. Given the high dependence on communication networks to connect distributed microgrids in SG2, potential cascading failures of connectivity can disrupt data synchronization with remote control systems. This paper reviews security threats and defense tactics for three stakeholders: power grid operators, communication network providers, and consumers. Through the survey, we found that SG2's stakeholders are particularly vulnerable to substation attacks/vandalism, malware/ransomware threats, blockchain vulnerabilities, and supply chain breakdowns. Furthermore, incorporating artificial intelligence (AI) into autonomous energy management in distributed energy resources of SG2 creates new challenges. Accordingly, adversarial samples and false data injection on electricity reading and measurement sensors at power plants can fool AI-powered control functions and cause messy error-checking operations in energy storage, wrong energy estimation in electric vehicle charging, and even fraudulent transactions in peer-to-peer energy trading models. Scalable blockchain-based models, physical unclonable functions, interoperable security protocols, and trustworthy AI models designed for managing distributed microgrids in SG2 are typical promising protection models for future research.
{"title":"Toward Secured Smart Grid 2.0: Exploring Security Threats, Protection Models, and Challenges","authors":"Lan-Huong Nguyen;Van-Linh Nguyen;Ren-Hung Hwang;Jian-Jhih Kuo;Yu-Wen Chen;Chien-Chung Huang;Ping-I Pan","doi":"10.1109/COMST.2024.3493630","DOIUrl":"10.1109/COMST.2024.3493630","url":null,"abstract":"Many nations are promoting the green transition in the energy sector to attain neutral carbon emissions by 2050. Smart Grid 2.0 (SG2) is expected to explore data-driven analytics and enhance communication technologies to improve the efficiency and sustainability of distributed renewable energy systems. These features are beyond smart metering and electric surplus distribution in conventional smart grids. Given the high dependence on communication networks to connect distributed microgrids in SG2, potential cascading failures of connectivity can cause disruption to data synchronization to the remote control systems. This paper reviews security threats and defense tactics for three stakeholders: power grid operators, communication network providers, and consumers. Through the survey, we found that SG2‘s stakeholders are particularly vulnerable to substation attacks/vandalism, malware/ransomware threats, blockchain vulnerabilities and supply chain breakdowns. Furthermore, incorporating artificial intelligence (AI) into autonomous energy management in distributed energy resources of SG2 creates new challenges. Accordingly, adversarial samples and false data injection on electricity reading and measurement sensors at power plants can fool AI-powered control functions and cause messy error-checking operations in energy storage, wrong energy estimation in electric vehicle charging, and even fraudulent transactions in peer-to-peer energy trading models. 
Scalable blockchain-based models, physical unclonable function, interoperable security protocols, and trustworthy AI models designed for managing distributed microgrids in SG2 are typical promising protection models for future research.","PeriodicalId":55029,"journal":{"name":"IEEE Communications Surveys and Tutorials","volume":"27 4","pages":"2581-2620"},"PeriodicalIF":34.4,"publicationDate":"2024-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142597759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
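The false-data-injection threat described in the abstract above can be illustrated with a minimal residual-threshold bad-data check, a simplified stand-in for the detectors used in power-system state estimation. All function names, sensor values, and the threshold below are hypothetical, not taken from the paper; the point is only that a small coordinated bias can stay under a naive detection threshold while a crude spike cannot.

```python
# Minimal sketch (illustrative values only): a residual-threshold bad-data
# check on power-sensor readings, and a stealthy false data injection that
# evades it by keeping the total deviation under the detection threshold.

def residual(readings, expected):
    """Sum of absolute deviations between reported and expected readings."""
    return sum(abs(r - e) for r, e in zip(readings, expected))

def is_flagged(readings, expected, threshold=5.0):
    """Flag the measurement vector as bad data if the residual is too large."""
    return residual(readings, expected) > threshold

expected = [100.0, 100.0, 100.0]        # model-predicted sensor values (kW)

crude_attack   = [130.0, 100.0, 100.0]  # large single-sensor spike
stealth_attack = [101.5, 101.5, 101.4]  # small coordinated bias

print(is_flagged(crude_attack, expected))    # True  -> detected
print(is_flagged(stealth_attack, expected))  # False -> evades the check
```

The stealthy vector keeps its total residual (4.4) just under the threshold (5.0), which is why defenses beyond simple thresholding, such as the AI-robustness measures the paper surveys, are needed.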
Pub Date : 2024-11-04  DOI: 10.1109/COMST.2024.3490178
Marvin Manalastas;Muhammad Umar Bin Farooq;Syed Muhammad Asad Zaidi;Haneya Naeem Qureshi;Yusuf Sambo;Ali Imran
Simulators are an indispensable part of the research and development needed to advance countless industries, including cellular networks. With simulators, the evaluation, analysis, testing, and experimentation of novel designs and algorithms can be carried out cost-effectively and conveniently, without risking disruption to a real network's service. Moreover, recent trends indicate that the advancement of such Digital System Models (DSMs), including system-level simulators, will play a pivotal role in advancing cellular networks by enabling the development of digital twins. Given this growing significance, this survey and tutorial paper presents an extensive review of the currently available DSMs for 5G and beyond (5G&B) networks. Specifically, we begin with a tutorial on the fundamental concepts of 5G&B network simulations, followed by an identification of the essential design requirements needed to model the key features of these networks. We also devise a taxonomy of the different types of 5G&B network simulators. In contrast to existing simulator surveys, which mostly apply traditional metrics suited to legacy networks, we devise and use 5G-specific evaluation metrics that capture three key facets of a network simulator: realism, completeness, and computational efficiency. We evaluate each simulator against these metrics to generate an applicability matrix that maps different 5G&B simulators to the research themes they can potentially enable. We also present the current challenges in developing 5G&B simulators and lay out several potential solutions to address them. Finally, we discuss the future challenges in simulator design that will arise with the emergence of 6G networks.
{"title":"From Simulators to Digital Twins for Enabling Emerging Cellular Networks: A Tutorial and Survey","authors":"Marvin Manalastas;Muhammad Umar Bin Farooq;Syed Muhammad Asad Zaidi;Haneya Naeem Qureshi;Yusuf Sambo;Ali Imran","doi":"10.1109/COMST.2024.3490178","DOIUrl":"10.1109/COMST.2024.3490178","url":null,"abstract":"Simulators are indispensable parts of the research and development necessary to advance countless industries, including cellular networks. With simulators, the evaluation, analysis, testing, and experimentation of novel designs and algorithms can be executed in a more cost-effective and convenient manner without the risk of real network service disruption. Additionally, recent trends indicate that the advancement of these Digital System Models (DSM), such as system-level simulators, will hold a pivotal role in advancing cellular networks by facilitating the development of digital twins. Given this growing significance, in this survey and tutorial paper, we present an extensive review of the currently available DSMs for 5G and beyond (5G&B) networks. Specifically, we begin with a tutorial on the fundamental concepts of 5G&B network simulations, followed by an identification of the essential design requirements needed to model the key features of these networks. We also devised a taxonomy of different types of 5G&B network simulators. In contrast to existing simulator surveys, which mostly leverage traditional metrics applicable to legacy networks, we devise and use 5G-specific evaluation metrics that capture three key facets of a network simulator, namely realism, completeness, and computational efficiency. We evaluate each simulator according to the devised metrics to generate an applicability matrix that maps different 5G&B simulators vis-à-vis the different research themes they can potentially enable. We also present the current challenges in developing 5G&B simulators while laying out several potential solutions to address the issues. 
Finally, we discuss the future challenges related to simulator design provisions that will arise with the emergence of 6G networks.","PeriodicalId":55029,"journal":{"name":"IEEE Communications Surveys and Tutorials","volume":"27 4","pages":"2693-2732"},"PeriodicalIF":34.4,"publicationDate":"2024-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142580483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
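The three-facet evaluation described in the abstract above (realism, completeness, computational efficiency) feeding an applicability matrix can be sketched as a small scoring routine. The simulator names, facet scores, and theme requirements below are hypothetical placeholders, not the survey's actual data; the sketch only shows one plausible way such a matrix could be derived from per-facet scores.

```python
# Illustrative sketch (hypothetical data): rate each simulator on the three
# facets used in the survey, then derive a simple applicability matrix that
# maps each research theme to the simulators meeting its minimum facet scores.

scores = {                      # facet scores on a 1-5 scale (made up)
    "SimA": {"realism": 4, "completeness": 5, "efficiency": 2},
    "SimB": {"realism": 3, "completeness": 3, "efficiency": 5},
}

themes = {                      # minimum facet scores a theme demands (made up)
    "digital-twin prototyping": {"realism": 4, "completeness": 4},
    "large-scale parameter sweeps": {"efficiency": 4},
}

def applicability(scores, themes):
    """Map each theme to the sorted list of simulators meeting all its minimums."""
    return {
        theme: sorted(sim for sim, facets in scores.items()
                      if all(facets[k] >= v for k, v in reqs.items()))
        for theme, reqs in themes.items()
    }

print(applicability(scores, themes))
# {'digital-twin prototyping': ['SimA'], 'large-scale parameter sweeps': ['SimB']}
```

A matrix built this way makes the trade-off explicit: a simulator strong on realism and completeness but weak on efficiency suits digital-twin work, while a fast but less faithful one suits large sweeps.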