Recognizing human activities in smart environments has significant applications in home automation, security, healthcare monitoring, and related domains. This paper proposes WiAR, a Wi-Fi-based human activity recognition method that combines the Continuous Wavelet Transform (CWT) with lightweight Convolutional Neural Networks (CNNs). The proposed approach is evaluated on the IEEE 802.11ax Channel State Information (CSI) dataset. First, WiAR applies the CWT to the CSI extracted from Wi-Fi signals to generate spectrogram images for different activities: walking, running, staying in place, and empty space. Then, these spectrogram images are processed by a CNN to classify the activities efficiently. Experimental results show that the proposed WiAR achieves an accuracy of approximately 91.1% in comparison with various pre-trained models such as DenseNet, EfficientNet, MobileNet, ResNet, and VGGNet. Finally, the proposed CNN model is deployed on various edge computing devices, including the Raspberry Pi 5, to validate its real-time feasibility in terms of inference time.
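As a rough illustration of the pipeline the abstract describes (CSI window → CWT scalogram → lightweight CNN), the sketch below shows one plausible way to implement it in Python. The window length, wavelet, scale range, and network shape are illustrative assumptions, not the settings used in WiAR.

```python
# Hedged sketch: turning a 1-D CSI amplitude stream into a CWT scalogram image
# and classifying it with a small CNN. All parameters here are assumptions.
import numpy as np
import pywt
import torch
import torch.nn as nn

def csi_to_scalogram(csi_amplitude, scales=np.arange(1, 65), wavelet="morl"):
    """Continuous Wavelet Transform of one CSI subcarrier's amplitude trace."""
    coeffs, _freqs = pywt.cwt(csi_amplitude, scales, wavelet)
    img = np.abs(coeffs)                                   # (scale, time) "image"
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)  # normalize to [0, 1]
    return img.astype(np.float32)

class SmallCNN(nn.Module):
    """Lightweight CNN over single-channel scalograms (4 activity classes)."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    fake_csi = np.random.randn(256)            # stand-in for one CSI amplitude window
    scalogram = csi_to_scalogram(fake_csi)     # shape: (64 scales, 256 time steps)
    batch = torch.from_numpy(scalogram)[None, None]  # (N=1, C=1, H, W)
    print(SmallCNN()(batch).shape)             # torch.Size([1, 4])
```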
{"title":"WiAR : Wi-Fi-based human activity recognition using time-frequency analysis and lightweight deep learning for smart environments","authors":"Vamsi Krishna Puduru , Rakesh Reddy Yakkati , Sreenivasa Reddy Yeduri , Sagar Koorapati , Linga Reddy Cenkeramaddi","doi":"10.1016/j.iot.2026.101881","DOIUrl":"10.1016/j.iot.2026.101881","url":null,"abstract":"<div><div>Recognizing human activities in smart environments has significant usage in home automation, security, healthcare monitoring, etc. This paper proposes WiAR, which is a Wi-Fi-based human activity recognition method using Continuous Wavelet Transform (CWT) and lightweight Convolutional Neural Networks (CNNs). The proposed approach is evaluated on the IEEE 802.11ax Channel State Information (CSI) dataset. First, WiAR utilizes the CWT to generate the spectrogram images from the CSI extracted from Wi-Fi signals for different activities: walking, running, staying in place, and empty space. Then, these spectrogram images are processed with a CNN to classify these activities efficiently. Experimental results show that the proposed WiAR achieves an accuracy of approximately 91.1% when compared to various pre-trained models such as DenseNet, EfficientNet, MobileNet, ResNet, and VGGNet. Finally, the proposed CNN model is deployed on various edge computing devices, including Raspberry Pi 5, to validate its real-time implementation in terms of inference time.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101881"},"PeriodicalIF":7.6,"publicationDate":"2026-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077993","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-24. DOI: 10.1016/j.iot.2026.101880
José Gómez-delaHiz, Aymen Fakhreddine, Jaime Galán-Jiménez
The research community is currently exploring the use of Unmanned Aerial Vehicle (UAV) networks to address coverage challenges in rural and economically disadvantaged regions. By equipping UAVs with small cells, coverage can be improved in areas where network operators are reluctant to invest due to low Return on Investment. If users in rural scenarios require higher throughput (for instance, users seeking IoT services with stringent Quality-of-Service requirements), deploying multiple UAVs in the same area can be an effective strategy. However, this approach also increases energy consumption. This paper addresses the challenge of maximizing the throughput offered in rural areas to users accessing microservice-based IoT applications, while also minimizing the energy consumption of UAV swarms. To achieve this, an optimal solution is proposed through a Mixed Integer Linear Programming (MILP) model, which is evaluated in realistic environments. Since this placement problem is NP-hard, we also present a genetic algorithm (GA) that obtains solutions for large scenarios in tractable time, with results close to those reported by the MILP and a remarkable reduction in computation time. Specifically, the optimality gap of the proposed GA-based solution is on average 2.32%, with a reduction of 89.92% in computation time.
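To make the GA component more concrete, here is a minimal, self-contained sketch of a genetic algorithm that evolves UAV positions under a throughput-minus-energy fitness. The fitness model, operators, and all numbers are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch of the genetic-algorithm idea: evolve candidate UAV placements
# that trade off served throughput against swarm energy. Toy models throughout.
import numpy as np

rng = np.random.default_rng(0)
N_UAVS, AREA = 4, 1000.0                      # UAVs in a 1000 m x 1000 m rural area
USERS = rng.uniform(0, AREA, size=(30, 2))    # toy user locations

def fitness(placement):
    """Higher is better: coverage-weighted throughput minus an energy penalty."""
    pos = placement.reshape(N_UAVS, 2)
    d = np.linalg.norm(USERS[:, None, :] - pos[None, :, :], axis=2)  # user-UAV distances
    throughput = np.log2(1.0 + 1.0 / (1.0 + (d.min(axis=1) / 200.0) ** 2)).sum()
    energy = 0.01 * N_UAVS                     # crude constant hover-energy term
    return throughput - energy

def evolve(pop_size=40, generations=200, mutation_sigma=30.0):
    pop = rng.uniform(0, AREA, size=(pop_size, N_UAVS * 2))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]         # truncation selection
        cut = N_UAVS                                               # one-point crossover
        children = np.concatenate([parents[:, :cut], parents[::-1, cut:]], axis=1)
        children = children + rng.normal(0, mutation_sigma, children.shape)  # mutation
        pop = np.clip(np.concatenate([parents, children]), 0, AREA)
    best = max(pop, key=fitness)
    return best.reshape(N_UAVS, 2), fitness(best)

if __name__ == "__main__":
    placement, score = evolve()
    print("best UAV positions:\n", placement.round(1), "\nfitness:", round(score, 2))
```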
{"title":"Energy-efficient throughput optimization in UAV-based microservice networks for rural connectivity scenarios","authors":"José Gómez-delaHiz , Aymen Fakhreddine , Jaime Galán-Jiménez","doi":"10.1016/j.iot.2026.101880","DOIUrl":"10.1016/j.iot.2026.101880","url":null,"abstract":"<div><div>The research community is currently exploring the use of Unmanned Aerial Vehicle (UAV) networks to address coverage challenges in rural and economically disadvantaged regions. By equipping UAVs with small cells, coverage can be improved in areas where network operators are not prone to invest due to low Return on Investment. If there is a requirement from users in rural scenarios to achieve higher throughput (for instance, users seeking IoT services with stringent Quality-of-Service requirements), deploying multiple UAVs in the same area could be an effective strategy. However, this approach would also result in increased energy consumption. This paper addresses the challenge of maximizing the throughput offered in rural areas for users accessing microservice-based IoT applications, while also minimizing the energy consumption of UAV swarms. To achieve this, an optimal solution is proposed through a Mixed Integer Linear Programming (MILP) model, which is evaluated within realistic environments. Since this placement problem is complex due to its NP hard nature, in order to obtain solutions for large scenarios in tractable times, we also present a genetic algorithm (GA) that obtains results close to those reported by the MILP with a remarkable reduction in the computation time. Specifically, the optimality gap of the proposed GA-based solution is on average 2.32%, with a reduction of 89.92% in the computation time.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101880"},"PeriodicalIF":7.6,"publicationDate":"2026-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077989","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-24. DOI: 10.1016/j.iot.2026.101873
Marco Silva, José Santos, Marília Curado, Chan-Tong Lam, Benjamin K. Ng
This paper introduces SCENE (Serving Cluster formation in cEll-free dyNamic Environments), a novel optimization model for serving cluster formation in Cell-Free massive Multiple-Input Multiple-Output networks. SCENE addresses the challenge of supporting heterogeneous service requirements, including critical Internet of Things (IoT) and latency-sensitive services as well as regular broadband applications, under dynamic network conditions. Unlike traditional approaches that rely on iterative refinement algorithms, SCENE performs one-shot serving cluster formation, eliminating the overhead of successive optimization loops. This innovation leads to significantly lower computational complexity and faster execution times while preserving service quality. Simulation results show that SCENE achieves superior performance in both average and 90%-likely spectral efficiency compared to state-of-the-art baselines, while demonstrating strong robustness under varying traffic profiles and pilot scarcity. SCENE enables efficient, scalable, and service-aware cluster formation, making it a promising candidate for dynamic and heterogeneous 6G and IoT environments.
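A minimal sketch of what one-shot (non-iterative) serving-cluster formation can look like: each user picks its strongest access points in a single pass, subject to a per-AP load cap. The channel model, cluster size, and cap below are illustrative assumptions, not SCENE's optimization model.

```python
# Hedged sketch of one-shot serving-cluster formation in a cell-free layout:
# each user is assigned the K access points (APs) with the strongest large-scale
# gains, without iterative refinement. All values are toy placeholders.
import numpy as np

rng = np.random.default_rng(1)
N_APS, N_USERS, K, AP_CAP = 16, 8, 4, 3

# Toy large-scale fading matrix beta[ap, user] (linear scale).
beta = rng.lognormal(mean=0.0, sigma=1.0, size=(N_APS, N_USERS))

def form_clusters(beta, k, ap_cap):
    load = np.zeros(beta.shape[0], dtype=int)
    clusters = {}
    # Serve users in order of their best gain so constrained users choose first.
    order = np.argsort(-beta.max(axis=0))
    for u in order:
        ranked_aps = np.argsort(-beta[:, u])             # strongest APs first
        chosen = [a for a in ranked_aps if load[a] < ap_cap][:k]
        for a in chosen:
            load[a] += 1
        clusters[int(u)] = [int(a) for a in chosen]
    return clusters

if __name__ == "__main__":
    for user, aps in sorted(form_clusters(beta, K, AP_CAP).items()):
        print(f"user {user}: serving APs {aps}")
```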
{"title":"SCENE: Serving cluster formation in cEll-free dyNamic environments","authors":"Marco Silva , José Santos , Marília Curado , Chan-Tong Lam , Benjamin K. Ng","doi":"10.1016/j.iot.2026.101873","DOIUrl":"10.1016/j.iot.2026.101873","url":null,"abstract":"<div><div>This paper introduces SCENE (Serving Cluster formation in cEll-free dyNamic Environments), a novel optimization model for serving cluster formation in Cell-Free massive Multiple-Input Multiple-Output networks. SCENE addresses the challenge of supporting heterogeneous service requirements - including critical Internet of Things (IoT) and latency-sensitive services as well as regular broadband applications - under dynamic network conditions. Unlike traditional approaches that rely on iterative refinement algorithms, SCENE performs one-shot serving cluster formation, eliminating the overhead of successive optimization loops. This innovation leads to significantly lower computational complexity and faster execution times while preserving service quality. Simulation results show that SCENE achieves superior performance in both average and 90%-likely spectral efficiency compared to state-of-the-art baselines, while demonstrating strong robustness under varying traffic profiles and pilot scarcity. SCENE enables efficient, scalable, and service-aware cluster formation, making it a promising candidate for dynamic and heterogeneous 6G and IoT environments</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101873"},"PeriodicalIF":7.6,"publicationDate":"2026-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-24. DOI: 10.1016/j.iot.2026.101875
Samir Si-Mohammed, Fabrice Théoleyre
Wireless networks are increasingly deployed in diverse domains, from best-effort object tracking to real-time control in smart factories. Yet, their performance strongly depends on configuration choices, especially at the MAC level. Thus, a single homogeneous configuration is often suboptimal due to the heterogeneous nature of individual links. We argue that Digital Twins (DTs) are a promising enabler for autonomous networks, capable of adapting configurations dynamically to prevailing conditions. However, global modeling approaches in DTs make it difficult to capture link-level variability and to accurately model the impact of configuration changes on performance. In this work, we propose a link-oriented prediction model designed to serve as a cornerstone for future wireless Digital Twins. Our model estimates the Packet Reception Rate (PRR) under different MAC configurations, capturing the unique characteristics of each communication link. To improve scalability in large deployments, we explore a clustering-based approach, where predictive models are trained per group of similar links rather than per individual link. Our experimental evaluation shows that these data-driven methods effectively capture link heterogeneity while offering robust prediction accuracy and enhanced generalization capabilities.
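A minimal sketch of the group-based idea, assuming scikit-learn: links are clustered by long-term features, and one PRR regressor is trained per cluster rather than per link. The features, cluster count, and regressor choice are illustrative assumptions, not the paper's models.

```python
# Hedged sketch: cluster links by long-term descriptors, then train one PRR
# predictor per cluster on the pooled samples of its links. Toy data throughout.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
N_LINKS, N_SAMPLES = 60, 40

# Toy per-link descriptors: [mean RSSI, RSSI variance, mean noise floor].
link_features = rng.normal(size=(N_LINKS, 3))
# Toy per-sample inputs: [retry limit, backoff exponent] (normalized) and observed PRR.
X = rng.uniform(0, 1, size=(N_LINKS, N_SAMPLES, 2))
prr = np.clip(0.7 + 0.2 * X[..., 0] - 0.1 * X[..., 1]
              + 0.05 * rng.normal(size=(N_LINKS, N_SAMPLES)), 0, 1)

# 1) Group similar links.
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(link_features)

# 2) Train one model per group on the pooled samples of its links.
models = {}
for g in np.unique(groups):
    idx = np.where(groups == g)[0]
    models[g] = RandomForestRegressor(n_estimators=50, random_state=0).fit(
        X[idx].reshape(-1, 2), prr[idx].reshape(-1))

# 3) Predict PRR for a new MAC configuration on link 0 using its group's model.
new_config = np.array([[0.9, 0.2]])            # hypothetical MAC setting
print(models[groups[0]].predict(new_config))
```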
{"title":"Group-based link modeling for wireless digital twins: Towards accurate network performance prediction","authors":"Samir Si-Mohammed , Fabrice Théoleyre","doi":"10.1016/j.iot.2026.101875","DOIUrl":"10.1016/j.iot.2026.101875","url":null,"abstract":"<div><div>Wireless networks are increasingly deployed in diverse domains, from best-effort object tracking to real-time control in smart factories. Yet, their performance strongly depends on configuration choices, especially at the MAC level. Thus, a single homogeneous configuration is often suboptimal due to the heterogeneous nature of individual links. We argue that Digital Twins (DTs) are a promising enabler for autonomous networks, capable of adapting configurations dynamically to prevailing conditions. However, global modeling approaches in DTs make it difficult to capture link-level variability and to accurately model the impact of configuration changes on performance. In this work, we propose a link-oriented prediction model designed to serve as a cornerstone for future wireless Digital Twins. Our model estimates the Packet Reception Rate (Packet Reception Rate (PRR)) under different MAC configurations, capturing the unique characteristics of each communication link. To improve scalability in large deployments, we explore a clustering-based approach, where predictive models are trained per group of similar links rather than per individual link. Our experimental evaluation shows that these data-driven methods effectively capture link heterogeneity while offering robust prediction accuracy and enhanced generalization capabilities.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101875"},"PeriodicalIF":7.6,"publicationDate":"2026-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077992","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The explosive growth of the IoT has led to increasingly complex and heterogeneous network traffic, posing major challenges for intrusion detection. Most existing machine learning and deep learning approaches model network traffic at the level of individual flows, which limits their ability to capture contextual relationships among concurrent communications. This paper introduces a Transformer-based framework for IoT intrusion detection that aggregates network flows into fixed-duration windows and treats each flow as a token within the input sequence. The self-attention mechanism captures contextual relationships among concurrent flows, enabling effective modeling of temporal dependencies without recurrence. Experiments conducted on the CICIoT2023 dataset show that the proposed model achieves a weighted F1-score of 97.9% and a macro ROC–AUC of 99.6% under temporally blocked cross-validation, while maintaining high computational efficiency. These results demonstrate that flow-to-window aggregation combined with self-attention provides a robust and scalable foundation for IoT network security, suitable for deployment in edge and smart-home environments.
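The following sketch, assuming PyTorch, shows the flow-to-window idea in miniature: flows collected in one fixed-duration window become a token sequence, and a Transformer encoder scores each flow using self-attention over the whole window. Feature and class counts are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch of flow-to-window aggregation: each flow's feature vector is a
# token, and one window of flows forms one sequence. Dimensions are placeholders.
import torch
import torch.nn as nn

class FlowWindowTransformer(nn.Module):
    def __init__(self, n_features=32, d_model=64, n_heads=4, n_layers=2, n_classes=8):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)       # flow features -> token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)         # per-flow class scores

    def forward(self, flows, pad_mask=None):
        # flows: (batch, flows_per_window, n_features)
        tokens = self.embed(flows)
        ctx = self.encoder(tokens, src_key_padding_mask=pad_mask)
        return self.head(ctx)                             # (batch, flows, n_classes)

if __name__ == "__main__":
    batch = torch.randn(2, 50, 32)        # 2 windows, up to 50 flows, 32 features each
    logits = FlowWindowTransformer()(batch)
    print(logits.shape)                   # torch.Size([2, 50, 8])
```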
{"title":"Transformer-based classification of IoT network traffic with flow-to-window aggregation","authors":"Sergio Martin-Reizabal , Adrian Caballero-Quiroga , Beatriz Gil-Arroyo , Nuño Basurto , Ruben Ruiz-Gonzalez","doi":"10.1016/j.iot.2026.101879","DOIUrl":"10.1016/j.iot.2026.101879","url":null,"abstract":"<div><div>The explosive growth of the IoT has led to an increasingly complex and heterogeneous network traffic, posing major challenges for intrusion detection. Most existing machine learning and deep learning approaches model network traffic at the level of individual flows, which limits their ability to capture contextual relationships among concurrent communications. This paper introduces a Transformer-based framework for IoT intrusion detection that aggregates network flows into fixed-duration windows and treats each flow as a token within the input sequence. The self-attention mechanism captures contextual relationships among concurrent flows, enabling effective modeling of temporal dependencies without recurrence. Experiments conducted on the CICIoT2023 dataset show that the proposed model achieves a weighted F1-score of 97.9% and a macro ROC–AUC of 99.6% under temporally blocked cross-validation, while maintaining high computational efficiency. These results demonstrate that flow-to-window aggregation combined with self-attention provides a robust and scalable foundation for IoT network security, suitable for deployment in edge and smart-home environments.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101879"},"PeriodicalIF":7.6,"publicationDate":"2026-01-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077990","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The convergence of Information Technology (IT) and Operational Technology (OT) in Industry 4.0 produces diverse data streams, such as system logs, sensor readings, and network traffic, which are vital for industrial security. However, existing security analytics are siloed by modality and rely on centralized processing, raising concerns regarding privacy, latency, and scalability. Although Federated Learning (FL) mitigates privacy risks, most frameworks remain unimodal, lack support for non-IID data distributions, and face adversarial evasion challenges. We propose FedMamba, a novel multimodal Federated Learning (MMFL) framework that creates a unified Mamba-based model to address these issues via (i) efficient cross-modal learning, (ii) a FedProx-based protocol for stable non-IID training that remains compatible with secure aggregation, and (iii) modality-specific adversarial training for robustness. Experiments on HDFS, SWaT, and CICIoMT-2024 datasets show that the standard FedMamba achieved competitive macro F1-scores of 0.9584, 0.9795, and 0.9665 relative to centralized baselines, but degraded on HDFS and SWaT under PGD attack (0.3791 and 0.5147), whereas CICIoMT-2024 remained robust under the same attack (0.9665). The adversarially trained FedMamba-AT sustains robust F1-scores (0.9480, 0.8357, 0.9645). FedMamba offers a robust and scalable solution for secure IIoT monitoring.
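As context for the FedProx-based protocol mentioned above, here is a minimal sketch of a FedProx-style local update, in which each client adds a proximal term (mu/2)*||w - w_global||^2 to its loss to stabilize training on non-IID data. The model, data, and mu value are illustrative placeholders, not FedMamba's architecture.

```python
# Hedged sketch of a FedProx-style client update: task loss plus a proximal
# penalty that keeps local weights close to the current global model.
import copy
import torch
import torch.nn as nn

def local_fedprox_step(model, global_model, batch, mu=0.01, lr=1e-3):
    x, y = batch
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    # Proximal term: (mu/2) * squared distance to the (frozen) global weights.
    prox = sum((w - wg.detach()).pow(2).sum()
               for w, wg in zip(model.parameters(), global_model.parameters()))
    (loss + 0.5 * mu * prox).backward()
    opt.step()
    return float(loss)

if __name__ == "__main__":
    global_model = nn.Linear(16, 3)                  # stand-in for the shared model
    client_model = copy.deepcopy(global_model)
    batch = (torch.randn(8, 16), torch.randint(0, 3, (8,)))
    print(local_fedprox_step(client_model, global_model, batch))
```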
{"title":"FedMamba: Robust multimodal federated intrusion detection for heterogeneous IoT systems","authors":"Hafiz Bilal Ahmad , Haichang Gao , Naila Latif , Tanjeena Manzoor","doi":"10.1016/j.iot.2026.101877","DOIUrl":"10.1016/j.iot.2026.101877","url":null,"abstract":"<div><div>The convergence of Information Technology (IT) and Operational Technology (OT) in Industry 4.0 produces diverse data streams, such as system logs, sensor readings, and network traffic, which are vital for industrial security. However, existing security analytics are siloed by modality and rely on centralized processing, raising concerns regarding privacy, latency, and scalability. Although Federated Learning (FL) mitigates privacy risks, most frameworks remain unimodal, lack support for non-IID data distributions, and face adversarial evasion challenges. We propose FedMamba, a novel multimodal Federated Learning (MMFL) framework that creates a unified Mamba-based model to address these issues via (i) efficient cross-modal learning, (ii) a FedProx-based protocol for stable non-IID training that remains compatible with secure aggregation, and (iii) modality-specific adversarial training for robustness. Experiments on HDFS, SWaT, and CICIoMT-2024 datasets show that the standard FedMamba achieved competitive macro F1-scores of 0.9584, 0.9795, and 0.9665 relative to centralized baselines, but degraded on HDFS and SWaT under PGD attack (0.3791 and 0.5147), whereas CICIoMT-2024 remained robust under the same attack (0.9665). The adversarially trained FedMamba-AT sustains robust F1-scores (0.9480, 0.8357, 0.9645). FedMamba offers a robust and scalable solution for secure IIoT monitoring.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101877"},"PeriodicalIF":7.6,"publicationDate":"2026-01-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077994","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-20. DOI: 10.1016/j.iot.2026.101878
Hua Qin, Ni Li, Gelan Yang, Yang Peng
The rapid expansion of the Internet of Things (IoT) ecosystem has propelled widespread deployment of distributed low-power wireless networks, among which ZigBee stands out for diverse innovative applications. However, energy-efficient cross-network communication remains challenging, as existing solutions like multi-hop ZigBee and one-hop LoRa entail trade-offs between communication delays and deployment costs. To tackle these issues, we propose a novel cross-interface relaying paradigm that utilizes a star topology within each ZigBee network and designates a relay node with an additional LoRa interface to bridge networks via a central LoRa gateway. Compared with existing methods, this approach reduces energy consumption and costs while improving scalability. To implement this paradigm while balancing energy conservation with delay guarantees, we introduce a Cross-network Cross-interface Relaying (CCR) scheme, which jointly schedules LoRa and ZigBee transmission behaviors to minimize energy consumption under delay constraints. CCR uses a scheduling framework that breaks down end-to-end delay constraints into link-level constraints, enabling global optimization of transmission parameters and dynamic adaptation to link quality variations. The effectiveness of CCR is demonstrated through extensive field tests on a prototype implemented on Raspberry Pi 3B+. Results show that CCR reduces energy consumption by 55.4% and 39.1% compared with an advanced LoRa communication protocol and a state-of-the-art cross-interface relaying scheme, respectively, while ensuring that 98.7% of packets satisfy their delay constraints. These findings highlight the potential of CCR for efficient and reliable cross-network communication in large-scale IoT deployments.
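A toy sketch of the delay-budget decomposition idea described above: an end-to-end deadline is split into per-link deadlines in proportion to each link's expected transmission time, so lossier links receive more slack. The weighting rule and numbers are illustrative assumptions, not CCR's actual scheduler.

```python
# Hedged sketch: split an end-to-end delay budget into per-link budgets,
# weighted by each link's expected per-packet airtime (geometric retries).
def split_delay_budget(end_to_end_ms, link_prr, base_tx_ms):
    expected = [t / p for p, t in zip(link_prr, base_tx_ms)]  # expected time per link
    total = sum(expected)
    return [end_to_end_ms * e / total for e in expected]

if __name__ == "__main__":
    # Two ZigBee hops plus one LoRa hop on the relay path (illustrative values).
    prr = [0.95, 0.90, 0.80]
    tx = [5.0, 5.0, 60.0]                  # per-attempt airtime in ms
    budgets = split_delay_budget(200.0, prr, tx)
    print([round(b, 1) for b in budgets])  # per-link deadlines summing to 200 ms
```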
{"title":"Cross-network cross-interface relaying via LoRa-ZigBee synergy: Enabling energy-efficient delay-constrained communication across low-power IoT networks","authors":"Hua Qin , Ni Li , Gelan Yang , Yang Peng","doi":"10.1016/j.iot.2026.101878","DOIUrl":"10.1016/j.iot.2026.101878","url":null,"abstract":"<div><div>The rapid expansion of the Internet of Things (IoT) ecosystem has propelled widespread deployment of distributed low-power wireless networks, among which ZigBee stands out for diverse innovative applications. However, energy-efficient cross-network communication remains challenging, as existing solutions like multi-hop ZigBee and one-hop LoRa entail trade-offs between communication delays and deployment costs. To tackle these issues, we propose a novel cross-interface relaying paradigm that utilizes a star topology within each ZigBee network and designates a relay node with an additional LoRa interface to bridge networks via a central LoRa gateway. Compared with existing methods, this approach reduces energy consumption and costs while improving scalability. To implement this paradigm while balancing energy conservation with delay guarantees, we introduce a Cross-network Cross-interface Relaying (CCR) scheme, which jointly schedules LoRa and ZigBee transmission behaviors to minimize energy consumption under delay constraints. CCR uses a scheduling framework that breaks down end-to-end delay constraints into link-level constraints, enabling global optimization of transmission parameters and dynamic adaptation to link quality variations. The effectiveness of CCR is demonstrated through extensive field tests on a prototype implemented on Raspberry Pi 3B+. Results show that CCR reduces energy consumption by 55.4% and 39.1% compared with an advanced LoRa communication protocol and a state-of-the-art cross-interface relaying scheme, respectively, while ensuring that 98.7% of packets satisfy their delay constraints. These findings highlight the potential of CCR for efficient and reliable cross-network communication in large-scale IoT deployments.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101878"},"PeriodicalIF":7.6,"publicationDate":"2026-01-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146023232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Smart spaces are ubiquitous computing environments that integrate diverse sensing and communication technologies to enhance functionality, optimize energy utilization, and improve user comfort and well-being. The adoption of emerging artificial intelligence (AI) methodologies has led to the development of AI-driven smart spaces, further expanding capabilities through applications such as personalized comfort settings, interactive living spaces, and automation of space systems. These advancements collectively elevate the quality of indoor experiences for users. To systematically examine these developments, we present a comprehensive survey of the foundational components of AI-driven smart spaces, including sensor technologies, data communication protocols, network management and maintenance strategies, and data collection, processing, and analytics. We investigate both traditional machine learning (ML) methods, such as deep learning (DL), and emerging approaches, including transformer networks and large language models (LLMs), highlighting their contributions and potential. We also showcase real-world applications of these technologies and provide insights to guide their continued development. Each section details relevant technologies and methodologies and concludes with an analysis of challenges and limitations, identifying directions for future research.
{"title":"Creation of AI-driven smart spaces for enhanced indoor environments – A survey","authors":"Aygün Varol , Naser Hossein Motlagh , Mirka Leino , Sasu Tarkoma , Johanna Virkki","doi":"10.1016/j.iot.2026.101876","DOIUrl":"10.1016/j.iot.2026.101876","url":null,"abstract":"<div><div>Smart spaces are ubiquitous computing environments that integrate diverse sensing and communication technologies to enhance functionality, optimize energy utilization, and improve user comfort and well-being. The adoption of emerging artificial intelligence (AI) methodologies has led to the development of AI-driven smart spaces, further expanding capabilities through applications such as personalized comfort settings, interactive living spaces, and automation of space systems. These advancements collectively elevate the quality of indoor experiences for users. To systematically examine these developments, we present a comprehensive survey of the foundational components of AI-driven smart spaces, including sensor technologies, data communication protocols, network management and maintenance strategies, and data collection, processing, and analytics. We investigate both traditional machine learning (ML) methods, such as deep learning (DL), and emerging approaches, including transformer networks and large language models (LLMs), highlighting their contributions and potential. We also showcase real-world applications of these technologies and provide insights to guide their continued development. Each section details relevant technologies and methodologies and concludes with an analysis of challenges and limitations, identifying directions for future research.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101876"},"PeriodicalIF":7.6,"publicationDate":"2026-01-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146023233","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-16. DOI: 10.1016/j.iot.2026.101872
Hamza Ibrahim, Love Allen Chijioke Ahakonye, Jae-Min Lee, Dong-Seong Kim
The convergence of Quantum Machine Learning (QML) and Blockchain is emerging as a transformative paradigm to address escalating security and scalability challenges in 6G-enabled Industrial Internet of Things (IIoT) networks. This study presents the first comprehensive bibliometric and meta-analysis of this nascent interdisciplinary field. We analyzed 159 peer-reviewed publications (indexed from January 2022 through December 22, 2024) from Scopus, employing a systematic Kitchenham-based methodology for literature selection and VOSviewer for science mapping. Our analysis reveals a 75% annual growth rate since 2022, with India (37.7%), the USA (12.6%), and South Korea (12.6%) as the leading contributors. Keyword co-occurrence analysis identified four dominant thematic clusters: “6G Network Security,” “Quantum Computing and AI,” “Blockchain and Decentralization,” and “IIoT Applications.” The study’s novelty lies in synthesizing bibliometric insights with a proposed five-layer QML-Blockchain integration framework and a comparative analysis against existing reviews. Quantitative performance metrics indicate that QML can improve anomaly detection accuracy by 5–9% over classical models, while advanced consensus mechanisms like PoA2 can reduce transaction latency by 35%. However, significant challenges persist, including quantum hardware limitations (e.g., qubit coherence < 100 μs), scalability challenges in achieving consensus across massive IIoT device densities, and a critical lack of empirical testbeds. This research provides a foundational roadmap, emphasizing the urgent need for standardized benchmarks, hybrid orchestration models, and quantum-resistant cryptography to realize secure, intelligent, and autonomous IIoT ecosystems in the 6G era.
{"title":"Bibliometric analysis of secure IoT for quantum computing","authors":"Hamza Ibrahim , Love Allen Chijioke Ahakonye , Jae-Min Lee , Dong-Seong Kim","doi":"10.1016/j.iot.2026.101872","DOIUrl":"10.1016/j.iot.2026.101872","url":null,"abstract":"<div><div>The convergence of Quantum Machine Learning (QML) and Blockchain is emerging as a transformative paradigm to address escalating security and scalability challenges in 6G-enabled Industrial Internet of Things (IIoT) networks. This study presents the first comprehensive bibliometric and meta-analysis of this nascent interdisciplinary field. We analyzed 159 peer-reviewed publications (indexed from January 2022 through December 22, 2024) from Scopus, employing a systematic Kitchenham-based methodology for literature selection and VOSviewer for science mapping. Our analysis reveals a 75% annual growth rate since 2022, with India (37.7%), the USA (12.6%), and South Korea (12.6%) as the leading contributors. Keyword co-occurrence analysis identified four dominant thematic clusters: “6G Network Security,” “Quantum Computing and AI,” “Blockchain and Decentralization,” and “IIoT Applications.” The study’s novelty lies in synthesizing bibliometric insights with a proposed five-layer QML-Blockchain integration framework and a comparative analysis against existing reviews. Quantitative performance metrics indicate that QML can improve anomaly detection accuracy by 5–9% over classical models, while advanced consensus mechanisms like PoA<sup>2</sup> can reduce transaction latency by 35%. However, significant challenges persist, including quantum hardware limitations (e.g., qubit coherence < 100 <em>μ</em>s), scalability challenges in achieving consensus across massive IIoT device densities, and a critical lack of empirical testbeds. This research provides a foundational roadmap, emphasizing the urgent need for standardized benchmarks, hybrid orchestration models, and quantum-resistant cryptography to realize secure, intelligent, and autonomous IIoT ecosystems in the 6G era.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101872"},"PeriodicalIF":7.6,"publicationDate":"2026-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077991","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-01-14. DOI: 10.1016/j.iot.2026.101874
Recep Sinan Arslan
The Internet of Things has a network structure that is vulnerable to cyberattacks. Ensuring the privacy of organizations and individuals is a crucial issue in IoT networks, where sensitive data is transmitted and various types of attacks are prevalent. The importance of Intrusion Detection Systems (IDS) in detecting attacks and infiltration attempts on IoT networks is increasing daily, as they strengthen resistance against attackers and support the spread of this modern technology. In this study, Prediction Tree Traffic Analyzer (PT-TAnalyzer), an IDS capable of detecting and classifying attacks on IoT networks, is proposed. PT-TAnalyzer features an ensemble model structure, in which weighting is determined by the validation scores of the machine learning models, and a prediction tree comprising eight ensemble models trained on the CIC-IoT-2023 dataset. Thanks to its unique attack-detection approach, the proposed model detects 34 traffic classes (33 malicious and one benign) with high success rates, and does so efficiently and cost-effectively. Unlike traditional studies, it achieves this by using eight trained models within the prediction-tree architecture rather than classifying all attacks with a single model and a single prediction structure. In the tests performed, PT-TAnalyzer achieved 99.76% accuracy in the binary classification experiment (Benign vs. Malicious) and 98.70% accuracy in the 34-class experiment, with similar F1-scores. The test time per sample is less than 0.1 ms. Compared with previous frameworks using the same dataset, PT-TAnalyzer shows a 2% improvement in overall accuracy and a lower processing time. In practice, the proposed model can be deployed on IoT gateways or edge devices to provide real-time, low-cost, and scalable intrusion detection. The model outperforms previous studies using the same dataset while also addressing their limitations.
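A minimal sketch of validation-weighted soft voting, the mechanism behind the ensemble nodes described above, assuming scikit-learn: each base model's predicted probabilities are weighted by its accuracy on a held-out validation split. The base learners, split, and synthetic data are illustrative, not the exact PT-TAnalyzer configuration.

```python
# Hedged sketch: weight each base model by its validation accuracy, then combine
# predicted probabilities as a weighted soft vote on the test split.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1200, n_features=20, n_classes=2, random_state=0)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

base_models = [DecisionTreeClassifier(random_state=0),
               RandomForestClassifier(n_estimators=50, random_state=0),
               LogisticRegression(max_iter=1000)]

# Weight each model by its accuracy on the held-out validation split.
weights = []
for m in base_models:
    m.fit(X_train, y_train)
    weights.append(m.score(X_val, y_val))
weights = np.array(weights) / sum(weights)

# Weighted soft vote on the test split.
proba = sum(w * m.predict_proba(X_test) for w, m in zip(weights, base_models))
pred = proba.argmax(axis=1)
print("ensemble accuracy:", round((pred == y_test).mean(), 4))
```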
{"title":"PT-TrafficAnalyzer: A weighted ensemble prediction tree for IoT attack detection","authors":"Recep Sinan Arslan","doi":"10.1016/j.iot.2026.101874","DOIUrl":"10.1016/j.iot.2026.101874","url":null,"abstract":"<div><div>The Internet of Things has a network structure that is vulnerable to cyberattacks and susceptible to attackers. Ensuring the privacy of organizations and individuals is a crucial issue in IoT networks, where sensitive data is transmitted and various types of attacks are prevalent. The importance of Intrusion Detection Systems (IDS) in detecting attacks and infiltration attempts on IoT networks is increasing daily. In this way, it will contribute to the resistance against attackers and the spread of this modern technology. In this study, Prediction Tree Traffic Analyzer (PT-TAnalyzer), an IDS system capable of detecting and classifying attacks on IoT networks, is proposed. PT-TAnalyzer features an ensemble model structure, where weighting is determined by the validation scores of machine learning models, and a prediction tree comprising eight ensemble models trained on the CIC-IoT-2023 dataset. This proposed model detects 34 attack types (including 33 malicious and one benign) with high success rates, due to its unique attack-detection approach, and does so efficiently and cost-effectively. Unlike traditional studies, it achieves this by using eight trained models rather than classifying all attacks with a single model and a single prediction structure within the tree architecture. In the tests performed, PT-TAnalyzer achieved 99.76 % accuracy in the binary classification experiment (Benign vs. Malicious) and 98.70 % accuracy in the 34-class experiment, yielding a similar F1 Score. The test time per sample is less than 0.1 ms. Compared with previous frameworks using the same dataset, PT-TAnalyzer shows a 2 % improvement in overall accuracy and a lower processing time. In practice, the proposed model can be deployed on IoT gateways or edge devices to provide real-time, low-cost, and scalable intrusion detection capabilities. The model outperforms previous studies using the same dataset, while also addressing the limitations.</div></div>","PeriodicalId":29968,"journal":{"name":"Internet of Things","volume":"36 ","pages":"Article 101874"},"PeriodicalIF":7.6,"publicationDate":"2026-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146023230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}