Frequency selective surfaces (FSSs) have attracted extensive attention for suppressing interference and improving channel quality and coverage by selectively transmitting or directionally reflecting electromagnetic waves within a specific frequency range in wireless communication networks. Therefore, FSS technology is considered another candidate, after intelligent reflecting surfaces (IRSs), for sixth generation (6G) communication systems. In this paper, we provide a comprehensive investigation of the theory, design, and classification of FSS, a contemporary overview of its vision and current applications in future 6G networks, and some emerging use cases of FSSs in 6G networks. Then, we describe the fundamentals and design methods of FSS in terms of architecture, performance metrics, and experimental analysis methods, and classify the relevant papers according to their FSS applications. Moreover, we systematically review the frequency selection characteristics and the reflection, transmission, and absorption properties of FSS, and discuss the corresponding communication models. Additionally, we provide an overview of the vision and requirements for FSS-assisted 6G networks and summarize the FSS communication architecture and performance analysis towards 6G networks. Since IRS technology is another metasurface candidate for 6G networks, we compare FSS-assisted designs with their counterparts for IRS-assisted 6G networks. Furthermore, we review some emerging use cases in 6G networks and present a new use case of adaptive spectrum allocation based on FSS. Finally, based on an extensive literature review and the common weaknesses of the extant FSS literature, we offer several challenges and some potential research directions.
"Frequency Selective Surface Toward 6G Communication Systems: A Contemporary Survey," by Xuehan Chen, Jingjing Tan, Litian Kang, Fengxiao Tang, Ming Zhao, and Nei Kato. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1635-1675. DOI: 10.1109/COMST.2024.3369250. Published 2024-02-23.
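The frequency-selective behavior this survey centers on (passing some bands while reflecting others) can be illustrated with a toy band-stop model. The single-Lorentzian resonance below, the 28 GHz resonant frequency, and the quality factor are illustrative assumptions for a sketch, not a design taken from the survey.

```python
import math

def fss_transmission_db(freq_ghz, f0_ghz=28.0, q=10.0):
    """Toy band-stop FSS: a single resonance at f0 makes the surface
    reflect near f0 and transmit elsewhere. Hypothetical model."""
    # Normalized detuning of a series resonance; zero at f0.
    x = q * (freq_ghz / f0_ghz - f0_ghz / freq_ghz)
    t_power = x * x / (1.0 + x * x)          # transmitted power fraction
    return 10.0 * math.log10(max(t_power, 1e-12))

# Deep stopband at resonance, near-transparent well away from it.
for f in (14.0, 28.0, 56.0):
    print(f, round(fss_transmission_db(f), 1))
```

A surface like this would suppress interference in the stopband around 28 GHz while leaving signals at 14 or 56 GHz essentially untouched, which is the selectivity the abstract describes.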
Communication security has to evolve to a higher plane in the face of the threat from the massive computing power of the emerging quantum computers. Quantum secure direct communication (QSDC) constitutes a promising branch of quantum communication, which is provably secure and overcomes the threat of quantum computing, whilst conveying secret messages directly via the quantum channel. In this survey, we highlight the motivation and the status of QSDC research with special emphasis on its theoretical basis and experimental verification. We will detail the associated point-to-point communication protocols and show how information is protected and transmitted. Finally, we discuss the open challenges as well as the future trends of QSDC networks, emphasizing again that QSDC is not a pure quantum key distribution (QKD) protocol, but a fully-fledged secure communication scheme.
"The Evolution of Quantum Secure Direct Communication: On the Road to the Qinternet," by Dong Pan, Gui-Lu Long, Liuguo Yin, Yu-Bo Sheng, Dong Ruan, Soon Xin Ng, Jianhua Lu, and Lajos Hanzo. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1898-1949. DOI: 10.1109/COMST.2024.3367535. Published 2024-02-19. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10440135
Pub Date: 2024-02-12. DOI: 10.1109/COMST.2024.3365076
Malka N. Halgamuge
Deep learning shows immense potential for strengthening the cyber-resilience of renewable energy supply chains. However, research gaps persist in comprehensive benchmarks, real-world model evaluations, and data generation tailored to the renewable domain. This study explores applying state-of-the-art deep learning techniques to secure renewable supply chains, drawing insights from over 300 publications. We aim to provide an updated, rigorous analysis of deep learning applications in this field to guide future research. We systematically review literature spanning 2020–2023, retrieving relevant articles from major databases. We examine deep learning’s role in intrusion/anomaly detection, supply chain cyberattack detection frameworks, security standards, historical attack analysis, data management strategies, model architectures, and supply chain cyber datasets. Our analysis demonstrates that deep learning enables renewable supply chain anomaly detection by processing massively distributed data. We highlight crucial model design factors, including accuracy, adaptation capability, communication security, and resilience to adversarial threats. A comparison of 18 major historical attacks informs risk analysis. We also showcase potential deep learning architectures, evaluating their relative strengths and limitations in security applications. Moreover, our review emphasizes best practices for renewable data curation, considering quality, labeling, access efficiency, and governance. Effective deep learning integration necessitates tailored benchmarks, model tuning guidance, and renewable energy data generation. Our multi-dimensional analysis motivates focused efforts on enhancing detection explanations, securing communications, continually retraining models, and establishing standardized assessment protocols. Overall, we provide a comprehensive roadmap for advancing renewable supply chain cyber-resilience by leveraging deep learning’s immense potential.
"Leveraging Deep Learning to Strengthen the Cyber-Resilience of Renewable Energy Supply Chains: A Survey," by Malka N. Halgamuge. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 2146-2175. DOI: 10.1109/COMST.2024.3365076. Published 2024-02-12.
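As a deliberately simplified stand-in for the deep learning anomaly detectors this survey covers, a streaming z-score detector over renewable telemetry might look like the sketch below. The Welford update is standard; the threshold and the telemetry values are assumptions for illustration only.

```python
import math

class AnomalyDetector:
    """Streaming z-score anomaly detector over a telemetry feed --
    a toy baseline, not one of the surveyed deep learning models."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # sum of squared deviations
        self.threshold = threshold

    def update(self, x):
        # Welford's online mean/variance update.
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def is_anomaly(self, x):
        if self.n < 2:
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) / std > self.threshold

det = AnomalyDetector()
for reading in [50.1, 49.8, 50.3, 50.0, 49.9, 50.2]:  # normal grid telemetry
    det.update(reading)
print(det.is_anomaly(50.1))   # False
print(det.is_anomaly(75.0))   # True (e.g., a spoofed inverter reading)
```

A deep model would replace the z-score with a learned representation, but the pipeline shape (fit on benign telemetry, flag large deviations) is the same.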
Sixth-generation (6G) mobile communication networks are expected to have dense infrastructures, large antenna sizes, wide bandwidths, cost-effective hardware, diversified positioning methods, and enhanced intelligence. Such trends bring both new challenges and opportunities for the practical design of 6G. On one hand, acquiring channel state information (CSI) in real time for all wireless links becomes quite challenging in 6G. On the other hand, 6G will contain numerous data sources with high-quality location-tagged channel data, e.g., the estimated channels or beams between base station (BS) and user equipment (UE), making it possible to better learn the local wireless environment. By exploiting this new opportunity to tackle the CSI acquisition challenge, a promising paradigm shift is underway from conventional environment-unaware communications to environment-aware communications based on the novel approach of the channel knowledge map (CKM). This article aims to provide a comprehensive overview of environment-aware communications enabled by CKM to fully harness its benefits for 6G. First, the basic concept of CKM is presented, followed by a comparison of CKM with various existing channel inference techniques. Next, the main techniques for CKM construction are discussed, including both environment model-free and environment model-assisted approaches. Furthermore, a general framework is presented for utilizing CKM to achieve environment-aware communications, followed by some typical CKM-aided communication scenarios. Finally, important open problems in CKM research are highlighted and potential solutions are discussed to inspire future work.
"A Tutorial on Environment-Aware Communications via Channel Knowledge Map for 6G," by Yong Zeng, Junting Chen, Jie Xu, Di Wu, Xiaoli Xu, Shi Jin, Xiqi Gao, David Gesbert, Shuguang Cui, and Rui Zhang. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1478-1519. DOI: 10.1109/COMST.2024.3364508. Published 2024-02-09.
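The core CKM idea, using location-tagged channel data to answer channel queries without real-time pilot estimation, can be sketched as a grid-indexed lookup. The grid resolution, the nearest-cell fallback, and the gain values below are illustrative assumptions, not a construction method from the tutorial.

```python
class ChannelKnowledgeMap:
    """Toy CKM: store location-tagged channel gains on a quantized grid
    and answer environment-aware queries for new UE locations."""

    def __init__(self, cell_size_m=5.0):
        self.cell = cell_size_m
        self.map = {}            # (ix, iy) -> list of observed gains (dB)

    def _key(self, x, y):
        return (round(x / self.cell), round(y / self.cell))

    def record(self, x, y, gain_db):
        # BS logs a location-tagged channel measurement.
        self.map.setdefault(self._key(x, y), []).append(gain_db)

    def predict(self, x, y):
        # Average gain in the UE's grid cell, or the nearest populated
        # cell when no measurement exists there yet.
        k = self._key(x, y)
        if k in self.map:
            samples = self.map[k]
        else:
            nearest = min(self.map,
                          key=lambda c: (c[0] - k[0])**2 + (c[1] - k[1])**2)
            samples = self.map[nearest]
        return sum(samples) / len(samples)

ckm = ChannelKnowledgeMap()
ckm.record(10.0, 12.0, -80.0)
ckm.record(11.0, 13.0, -82.0)
ckm.record(50.0, 50.0, -95.0)
print(ckm.predict(10.5, 12.5))    # served from the populated cell
print(ckm.predict(100.0, 100.0))  # nearest-cell fallback
```

The environment model-free and model-assisted construction approaches the tutorial discusses differ mainly in how `predict` is filled in; the location-indexed structure is the common core.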
Pub Date: 2024-02-08. DOI: 10.1109/COMST.2024.3363639
Tiep M. Hoang;Alireza Vahid;Hoang Duong Tuan;Lajos Hanzo
Security at the physical layer (PHY) is a salient research topic in wireless systems, and machine learning (ML) is emerging as a powerful tool for providing new data-driven security solutions. The application of ML techniques to PHY security is therefore of crucial importance in the landscape of increasingly data-driven wireless services. In this context, we first summarize the family of bespoke ML algorithms that are eminently suitable for wireless security. Then, we review the recent progress in ML-aided PHY security, where the term “PHY security” covers two different types: i) PHY authentication and ii) secure PHY transmission. Moreover, we treat neural networks (NNs) as a special family of ML techniques and present how to tackle PHY security optimization problems using NNs. Finally, we identify some major challenges and opportunities in addressing PHY security by applying carefully tailored ML tools.
"Physical Layer Authentication and Security Design in the Machine Learning Era," by Tiep M. Hoang, Alireza Vahid, Hoang Duong Tuan, and Lajos Hanzo. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1830-1860. DOI: 10.1109/COMST.2024.3363639. Published 2024-02-08.
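PHY authentication, the first of the two types above, typically decides whether a received frame's channel "fingerprint" matches the enrolled legitimate transmitter. A hypothetical distance-threshold rule stands in here for the learned ML classifiers the survey reviews; the feature vectors and threshold are invented for illustration.

```python
import math

def phy_authenticate(reference, observed, threshold=1.0):
    """Toy channel-fingerprint authenticator: accept a frame when the
    observed channel feature vector (e.g., per-subcarrier gains) lies
    close to the enrolled legitimate fingerprint."""
    return math.dist(reference, observed) <= threshold

legit = [0.9, 1.1, 1.0, 0.95]        # enrolled fingerprint of the real user
probe_ok = [0.92, 1.08, 1.01, 0.97]  # same link, small fading variation
probe_spoof = [0.3, 1.9, 0.2, 1.6]   # impersonator at a different location

print(phy_authenticate(legit, probe_ok))     # True
print(phy_authenticate(legit, probe_spoof))  # False
```

An ML-aided version replaces the fixed threshold with a classifier trained on fingerprints from both legitimate and adversarial channel conditions, which is where the survey's NN treatment comes in.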
Due to the greatly improved capabilities of devices, massive data, and increasing concern about data privacy, Federated Learning (FL) has been increasingly considered for applications to wireless communication networks (WCNs). Wireless FL (WFL) is a distributed method of training a global deep learning model in which a large number of participants each train a local model on their training datasets and then upload the local model updates to a central server. However, in general, non-independent and identically distributed (non-IID) data of WCNs raises concerns about robustness, as a malicious participant could potentially inject a “backdoor” into the global model by uploading poisoned data or models over WCN. This could cause the model to misclassify malicious inputs as a specific target class while behaving normally with benign inputs. This survey provides a comprehensive review of the latest backdoor attacks and defense mechanisms. It classifies them according to their targets (data poisoning or model poisoning), the attack phase (local data collection, training, or aggregation), and defense stage (local training, before aggregation, during aggregation, or after aggregation). The strengths and limitations of existing attack strategies and defense mechanisms are analyzed in detail. Comparisons of existing attack methods and defense designs are carried out, pointing to noteworthy findings, open challenges, and potential future research directions related to security and privacy of WFL.
"Data and Model Poisoning Backdoor Attacks on Wireless Federated Learning, and the Defense Mechanisms: A Comprehensive Survey," by Yichen Wan, Youyang Qu, Wei Ni, Yong Xiang, Longxiang Gao, and Ekram Hossain. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1861-1897. DOI: 10.1109/COMST.2024.3361451. Published 2024-02-07.
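The model-poisoning threat and one during-aggregation defense class from the taxonomy above can be sketched in a few lines: plain FedAvg is pulled toward a single malicious update, while a coordinate-wise trimmed mean limits its influence. The update values are synthetic, and trimmed mean is just one illustrative robust aggregator among those surveyed.

```python
def fedavg(updates):
    # Plain FedAvg: coordinate-wise mean of client model updates.
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(updates[0]))]

def trimmed_mean(updates, trim=1):
    # Robust aggregation: per coordinate, drop the `trim` largest and
    # smallest client values before averaging.
    out = []
    for i in range(len(updates[0])):
        vals = sorted(u[i] for u in updates)[trim:len(updates) - trim]
        out.append(sum(vals) / len(vals))
    return out

benign = [[0.1, -0.2], [0.12, -0.18], [0.09, -0.22], [0.11, -0.19]]
poisoned = [5.0, 5.0]              # attacker's backdoor-implanting update
updates = benign + [poisoned]

print(fedavg(updates))        # pulled far from the benign consensus
print(trimmed_mean(updates))  # stays close to the benign average
```

Defenses at the other stages in the taxonomy (local training, before aggregation, after aggregation) act on the same pipeline but at different points.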
Pub Date: 2024-02-05. DOI: 10.1109/COMST.2024.3361662
Yuan Li;Hao Zhang;Chen Zhang;Tao Huang;F. Richard Yu
With the development of quantum technologies, the quantum Internet has demonstrated unique applications beyond the classical Internet and has been investigated extensively in recent years. In the construction of conventional Internet software, the protocol stack is the core architecture for coordinating modules. How to design a protocol stack for the quantum Internet is a challenging problem. In this paper, we systematically review the latest developments in quantum Internet protocols from the perspective of protocol stack layering. By summarizing and analyzing the progress in each layer’s protocols, we reveal the current research status and connections among the layers. Our work provides readers with a comprehensive understanding of the quantum Internet and can help support researchers focusing on a single layer to better define the functions that the layer should possess and optimize related protocols. This approach enables all layers to work better together based on an understanding of the other layers.
"A Survey of Quantum Internet Protocols From a Layered Perspective," by Yuan Li, Hao Zhang, Chen Zhang, Tao Huang, and F. Richard Yu. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1606-1634. DOI: 10.1109/COMST.2024.3361662. Published 2024-02-05.
Communication in the millimeter wave (mmWave) and even terahertz (THz) frequency bands is ushering in a new era of wireless communications. Beam management, namely initial access and beam tracking, has been recognized as an essential technique for ensuring robust mmWave/THz communications, especially in mobile scenarios. However, narrow beams at higher carrier frequencies incur huge beam measurement overhead, which negatively impacts beam acquisition and tracking. In addition, the beam management process is further complicated by the fluctuation of mmWave/THz channels, the random movement patterns of users, and dynamic changes in the environment. For mmWave and THz communications toward 6G, we have witnessed a substantial increase in research and industrial attention on artificial intelligence (AI), reconfigurable intelligent surfaces (RISs), and integrated sensing and communications (ISAC). The introduction of these enabling technologies presents both open opportunities and unique challenges for beam management. In this paper, we present a comprehensive survey of mmWave and THz beam management. Further, we give some insights on technical challenges and future research directions in this promising area.
"A Survey of Beam Management for mmWave and THz Communications Towards 6G," by Qing Xue, Chengwang Ji, Shaodan Ma, Jiajia Guo, Yongjun Xu, Qianbin Chen, and Wei Zhang. IEEE Communications Surveys and Tutorials, vol. 26, no. 3, pp. 1520-1559. DOI: 10.1109/COMST.2024.3361991. Published 2024-02-05.
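The overhead trade-off described above, narrower beams align better but require more measurements during initial access, can be sketched with an exhaustive beam sweep. The cosine-free quadratic gain model and the uniform codebook below are illustrative assumptions, not an array pattern from the survey.

```python
import math

def beam_sweep(codebook_size, channel_angle_deg):
    """Toy exhaustive beam sweep for initial access: measure every beam
    in a uniform codebook over [-90, 90) degrees and keep the strongest.
    The quadratic mismatch penalty is a hypothetical gain model."""
    best_beam, best_gain = None, -math.inf
    for b in range(codebook_size):
        beam_angle = -90.0 + 180.0 * (b + 0.5) / codebook_size
        mismatch = abs(beam_angle - channel_angle_deg)
        gain_db = -0.1 * mismatch ** 2     # falls off with misalignment
        if gain_db > best_gain:
            best_beam, best_gain = b, gain_db
    return best_beam, best_gain

# A larger codebook (narrower beams) finds a better-aligned beam, but the
# number of measurements -- the overhead the survey highlights -- grows
# linearly with codebook size.
for n in (8, 64):
    beam, gain = beam_sweep(n, channel_angle_deg=37.0)
    print(n, beam, round(gain, 2))
```

Hierarchical sweeps, AI-predicted beams, and sensing-aided initial access are all ways of cutting the measurement count that this exhaustive baseline makes explicit.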
Pub Date: 2024-01-23. DOI: 10.1109/COMST.2024.3357591
Yafeng Yin;Lei Xie;Zhiwei Jiang;Fu Xiao;Jiannong Cao;Sanglu Lu
Due to their ever-growing sensing, computing, communication, and storage capabilities, mobile devices (e.g., smartphones, smartwatches, smart glasses) have become ubiquitous and an indispensable part of people’s daily life. Mobile devices have been adopted in many applications, e.g., exercise assessment, daily life monitoring, human-computer interaction, user authentication, etc. Among these various applications, Human Activity Recognition (HAR) is the core underlying technology. Specifically, HAR acquires sensor data corresponding to human activities from the built-in sensors of mobile devices, and then adopts suitable recognition approaches to infer the type of activity from the sensor data. The last two decades have witnessed ever-increasing research in HAR. However, new challenges and opportunities are emerging, especially for HAR based on mobile devices. Therefore, in this paper, we review the research on HAR based on mobile devices, aiming to advance future research in this area. Firstly, we give an overview of HAR based on mobile devices, including the general rationale, main components, and challenges. Secondly, we review and analyze the research progress of HAR based on mobile devices along each main aspect, including human activities, sensor data, data preprocessing, recognition approaches, evaluation standards, and application cases. Finally, we present some promising trends in HAR based on mobile devices for future research.
{"title":"A Systematic Review of Human Activity Recognition Based on Mobile Devices: Overview, Progress and Trends","authors":"Yafeng Yin;Lei Xie;Zhiwei Jiang;Fu Xiao;Jiannong Cao;Sanglu Lu","doi":"10.1109/COMST.2024.3357591","DOIUrl":"10.1109/COMST.2024.3357591","url":null,"abstract":"Due to the ever-growing powers in sensing, computing, communicating and storing, mobile devices (e.g., smartphone, smartwatch, smart glasses) become ubiquitous and an indispensable part of people’s daily life. Until now, mobile devices have been adopted in many applications, e.g., exercise assessment, daily life monitoring, human-computer interactions, user authentication, etc. Among the various applications, Human Activity Recognition (HAR) is the core technology behind them. Specifically, HAR gets the sensor data corresponding to human activities based on the built-in sensors of mobile devices, and then adopts suitable recognition approaches to infer the type of activity based on sensor data. The last two decades have witnessed the ever-increasing research in HAR. However, new challenges and opportunities are emerging, especially for HAR based on mobile devices. Therefore, in this paper, we review the research of HAR based on mobile devices, aiming to advance the following research in this area. Firstly, we give an overview of HAR based on mobile devices, including the general rationales, main components and challenges. Secondly, we review and analyze the research progress of HAR based on mobile devices from each main aspect, including human activities, sensor data, data preprocessing, recognition approaches, evaluation standards and application cases. 
Finally, we present some promising trends in HAR based on mobile devices for future research.","PeriodicalId":55029,"journal":{"name":"IEEE Communications Surveys and Tutorials","volume":"26 2","pages":"890-929"},"PeriodicalIF":35.6,"publicationDate":"2024-01-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139938434","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
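The HAR pipeline the abstract above describes (built-in sensors → windowed sensor data → features → recognition approach) can be sketched minimally. Everything concrete here is an illustrative assumption, not from the survey: the window size, the mean/std features, the toy nearest-centroid recognizer, and the synthetic accelerometer signals.

```python
# Hypothetical minimal sketch of a mobile-device HAR pipeline:
# windowed sensor data -> hand-crafted features -> classifier.
import math
import random

def extract_features(window):
    """Mean and standard deviation of one accelerometer-magnitude window."""
    n = len(window)
    mean = sum(window) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return (mean, std)

def sliding_windows(signal, size, step):
    """Split a 1-D signal into fixed-size, possibly overlapping windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

class NearestCentroid:
    """Toy recognizer: assign a window to the activity whose mean feature
    vector (centroid) is closest in squared Euclidean distance."""
    def fit(self, features, labels):
        by_label = {}
        for f, y in zip(features, labels):
            by_label.setdefault(y, []).append(f)
        self.centroids = {
            y: tuple(sum(col) / len(col) for col in zip(*fs))
            for y, fs in by_label.items()
        }
        return self

    def predict(self, f):
        return min(self.centroids,
                   key=lambda y: sum((a - b) ** 2
                                     for a, b in zip(f, self.centroids[y])))

# Synthetic accelerometer magnitudes: "still" is low-variance around 1 g,
# "walking" adds a periodic component with higher variance.
random.seed(0)
still = [1.0 + random.gauss(0, 0.02) for _ in range(200)]
walking = [1.0 + 0.5 * math.sin(i / 3) + random.gauss(0, 0.05) for i in range(200)]

feats, labels = [], []
for sig, y in ((still, "still"), (walking, "walking")):
    for w in sliding_windows(sig, 50, 25):
        feats.append(extract_features(w))
        labels.append(y)

clf = NearestCentroid().fit(feats, labels)
print(clf.predict(extract_features(still[:50])))    # expected: still
print(clf.predict(extract_features(walking[:50])))  # expected: walking
```

Real systems replace the hand-crafted features and nearest-centroid step with the richer recognition approaches the survey catalogs (e.g., deep models on raw multi-axis sensor streams), but the data flow stays the same.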
Pub Date: 2024-01-19 | DOI: 10.1109/COMST.2023.3347145
Shadab Mahboob;Lingjia Liu
Non-Terrestrial Networks (NTN) are expected to be a critical component of 6th Generation (6G) networks, providing ubiquitous, continuous, and scalable services. Satellites emerge as the primary enabler for NTN, leveraging their extensive coverage, stable orbits, scalability, and adherence to international regulations. However, satellite-based NTN presents unique challenges, including long propagation delay, high Doppler shift, frequent handovers, spectrum sharing complexities, and intricate beam and resource allocation, among others. The integration of NTNs into existing terrestrial networks in 6G introduces a range of novel challenges, including task offloading, network routing, network slicing, and many more. To tackle all these obstacles, this paper proposes Artificial Intelligence (AI) as a promising solution, harnessing its ability to capture intricate correlations among diverse network parameters. We begin by providing a comprehensive background on NTN and AI, highlighting the potential of AI techniques in addressing various NTN challenges. Next, we present an overview of existing works, emphasizing AI as an enabling tool for satellite-based NTN, and explore potential research directions. Furthermore, we discuss ongoing research efforts that aim to enable AI in satellite-based NTN through software-defined implementations, while also discussing the associated challenges. Finally, we conclude by providing insights and recommendations for enabling AI-driven satellite-based NTN in future 6G networks.
{"title":"Revolutionizing Future Connectivity: A Contemporary Survey on AI-Empowered Satellite-Based Non-Terrestrial Networks in 6G","authors":"Shadab Mahboob;Lingjia Liu","doi":"10.1109/COMST.2023.3347145","DOIUrl":"10.1109/COMST.2023.3347145","url":null,"abstract":"Non-Terrestrial Networks (NTN) are expected to be a critical component of 6th Generation (6G) networks, providing ubiquitous, continuous, and scalable services. Satellites emerge as the primary enabler for NTN, leveraging their extensive coverage, stable orbits, scalability, and adherence to international regulations. However, satellite-based NTN presents unique challenges, including long propagation delay, high Doppler shift, frequent handovers, spectrum sharing complexities, and intricate beam and resource allocation, among others. The integration of NTNs into existing terrestrial networks in 6G introduces a range of novel challenges, including task offloading, network routing, network slicing, and many more. To tackle all these obstacles, this paper proposes Artificial Intelligence (AI) as a promising solution, harnessing its ability to capture intricate correlations among diverse network parameters. We begin by providing a comprehensive background on NTN and AI, highlighting the potential of AI techniques in addressing various NTN challenges. Next, we present an overview of existing works, emphasizing AI as an enabling tool for satellite-based NTN, and explore potential research directions. Furthermore, we discuss ongoing research efforts that aim to enable AI in satellite-based NTN through software-defined implementations, while also discussing the associated challenges. 
Finally, we conclude by providing insights and recommendations for enabling AI-driven satellite-based NTN in future 6G networks.","PeriodicalId":55029,"journal":{"name":"IEEE Communications Surveys and Tutorials","volume":"26 2","pages":"1279-1321"},"PeriodicalIF":35.6,"publicationDate":"2024-01-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139938437","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
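Two of the NTN challenges named in the abstract above, long propagation delay and high Doppler shift, can be made concrete with back-of-envelope numbers. The 600 km altitude and 2 GHz carrier below are illustrative assumptions, not figures from the survey; the Doppler value is a worst-case upper bound (satellite moving directly along the line of sight).

```python
# Back-of-envelope one-way propagation delay and worst-case Doppler
# shift for an assumed LEO satellite link (illustrative parameters).
import math

C = 299_792_458.0    # speed of light, m/s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # Earth mass, kg
R_EARTH = 6_371e3    # mean Earth radius, m

def orbital_velocity(altitude_m):
    """Circular-orbit speed at the given altitude: v = sqrt(G*M / r)."""
    return math.sqrt(G * M_EARTH / (R_EARTH + altitude_m))

def propagation_delay_ms(slant_range_m):
    """One-way propagation delay in milliseconds."""
    return slant_range_m / C * 1e3

def max_doppler_hz(altitude_m, carrier_hz):
    """Upper-bound Doppler shift f_d = (v / c) * f_c, i.e. the full
    orbital speed projected onto the line of sight."""
    return orbital_velocity(altitude_m) / C * carrier_hz

alt = 600e3   # assumed LEO altitude: 600 km
fc = 2e9      # assumed carrier frequency: 2 GHz (S-band)
print(f"orbital speed: {orbital_velocity(alt) / 1e3:.2f} km/s")   # ~7.56 km/s
print(f"nadir delay:   {propagation_delay_ms(alt):.2f} ms")       # ~2 ms
print(f"max Doppler:   {max_doppler_hz(alt, fc) / 1e3:.1f} kHz")  # ~50 kHz
```

A delay of milliseconds (versus microseconds in terrestrial cells) and Doppler in the tens of kHz illustrate why timing advance, synchronization, and handover procedures designed for terrestrial networks need the AI-assisted rethinking the survey discusses.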