
Latest publications in Internet of Things

WiAR: Wi-Fi-based human activity recognition using time-frequency analysis and lightweight deep learning for smart environments
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-28 | DOI: 10.1016/j.iot.2026.101881
Vamsi Krishna Puduru, Rakesh Reddy Yakkati, Sreenivasa Reddy Yeduri, Sagar Koorapati, Linga Reddy Cenkeramaddi
Recognizing human activities in smart environments has significant applications in home automation, security, healthcare monitoring, etc. This paper proposes WiAR, a Wi-Fi-based human activity recognition method using Continuous Wavelet Transform (CWT) and lightweight Convolutional Neural Networks (CNNs). The proposed approach is evaluated on the IEEE 802.11ax Channel State Information (CSI) dataset. First, WiAR utilizes the CWT to generate spectrogram images from the CSI extracted from Wi-Fi signals for different activities: walking, running, staying in place, and empty space. Then, these spectrogram images are processed with a CNN to classify the activities efficiently. Experimental results show that the proposed WiAR achieves an accuracy of approximately 91.1% when compared against various pre-trained models such as DenseNet, EfficientNet, MobileNet, ResNet, and VGGNet. Finally, the proposed CNN model is deployed on various edge computing devices, including the Raspberry Pi 5, to validate its real-time implementation in terms of inference time.
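The processing chain the abstract describes (CWT scalograms computed from CSI amplitude windows, then a small CNN classifier) can be illustrated with a minimal Python sketch; the window length, wavelet choice, and layer sizes below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a CWT-scalogram + lightweight-CNN pipeline (not the paper's code).
# Assumed: CSI amplitude windows of 256 samples, Morlet wavelet, 4 activity classes.
import numpy as np
import pywt
import torch
import torch.nn as nn

def csi_window_to_scalogram(csi_amplitude, scales=np.arange(1, 65)):
    """Continuous Wavelet Transform of one CSI amplitude window -> 2-D time-frequency image."""
    coeffs, _ = pywt.cwt(csi_amplitude, scales, wavelet="morl")
    img = np.abs(coeffs)                      # magnitude scalogram, shape (64, T)
    return (img - img.min()) / (img.max() - img.min() + 1e-9)

class LightweightCNN(nn.Module):
    """Small CNN classifier over scalogram images (4 classes assumed)."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x):                     # x: (batch, 1, scales, time)
        return self.classifier(self.features(x).flatten(1))

# Example: one synthetic CSI window -> class logits.
window = np.random.randn(256)
scalogram = torch.tensor(csi_window_to_scalogram(window), dtype=torch.float32)[None, None]
logits = LightweightCNN()(scalogram)          # shape (1, 4)
```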
Citations: 0
Energy-efficient throughput optimization in UAV-based microservice networks for rural connectivity scenarios
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-24 | DOI: 10.1016/j.iot.2026.101880
José Gómez-delaHiz, Aymen Fakhreddine, Jaime Galán-Jiménez
The research community is currently exploring the use of Unmanned Aerial Vehicle (UAV) networks to address coverage challenges in rural and economically disadvantaged regions. By equipping UAVs with small cells, coverage can be improved in areas where network operators are reluctant to invest due to low Return on Investment. If users in rural scenarios require higher throughput (for instance, users seeking IoT services with stringent Quality-of-Service requirements), deploying multiple UAVs in the same area could be an effective strategy. However, this approach would also increase energy consumption. This paper addresses the challenge of maximizing the throughput offered in rural areas to users accessing microservice-based IoT applications, while also minimizing the energy consumption of UAV swarms. To achieve this, an optimal solution is proposed through a Mixed Integer Linear Programming (MILP) model, which is evaluated in realistic environments. Since this placement problem is NP-hard, we also present a genetic algorithm (GA) that obtains solutions for large scenarios in tractable time, close to those reported by the MILP, with a remarkable reduction in computation time. Specifically, the optimality gap of the proposed GA-based solution is 2.32% on average, with an 89.92% reduction in computation time.
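The GA component can be pictured with a compact sketch in which a binary chromosome selects which candidate UAV sites to activate and the fitness trades served throughput against energy; the fitness model, operators, and parameter values are illustrative assumptions rather than the paper's MILP/GA formulation.

```python
# Illustrative GA for selecting active UAV sites (not the paper's MILP/GA formulation).
# A chromosome is a bit-vector over candidate sites; fitness = served throughput - energy penalty.
import random

N_SITES, POP, GENS, MUT = 12, 30, 100, 0.05
site_throughput = [random.uniform(10, 50) for _ in range(N_SITES)]   # Mbps if site is active (assumed)
site_energy = [random.uniform(50, 120) for _ in range(N_SITES)]      # W per active UAV (assumed)
LAMBDA = 0.2                                                         # energy weight (assumed)

def fitness(bits):
    thr = sum(t for b, t in zip(bits, site_throughput) if b)
    eng = sum(e for b, e in zip(bits, site_energy) if b)
    return thr - LAMBDA * eng

def crossover(a, b):
    cut = random.randrange(1, N_SITES)
    return a[:cut] + b[cut:]

def mutate(bits):
    return [1 - b if random.random() < MUT else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                          # truncation selection
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print("active sites:", [i for i, b in enumerate(best) if b], "fitness:", round(fitness(best), 1))
```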
Citations: 0
SCENE: Serving cluster formation in cEll-free dyNamic environments
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-24 | DOI: 10.1016/j.iot.2026.101873
Marco Silva, José Santos, Marília Curado, Chan-Tong Lam, Benjamin K. Ng
This paper introduces SCENE (Serving Cluster formation in cEll-free dyNamic Environments), a novel optimization model for serving cluster formation in Cell-Free massive Multiple-Input Multiple-Output networks. SCENE addresses the challenge of supporting heterogeneous service requirements - including critical Internet of Things (IoT) and latency-sensitive services as well as regular broadband applications - under dynamic network conditions. Unlike traditional approaches that rely on iterative refinement algorithms, SCENE performs one-shot serving cluster formation, eliminating the overhead of successive optimization loops. This innovation leads to significantly lower computational complexity and faster execution times while preserving service quality. Simulation results show that SCENE achieves superior performance in both average and 90%-likely spectral efficiency compared to state-of-the-art baselines, while demonstrating strong robustness under varying traffic profiles and pilot scarcity. SCENE enables efficient, scalable, and service-aware cluster formation, making it a promising candidate for dynamic and heterogeneous 6G and IoT environments.
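The notion of one-shot serving cluster formation can be sketched as a single pass that assigns each user the K access points with the strongest large-scale fading, using a larger K for latency-critical services; this toy example only conveys the non-iterative idea and is not the SCENE optimization model.

```python
# Hedged sketch of one-shot serving-cluster formation in a cell-free network.
# Each user gets, in a single pass, the K strongest access points (APs); latency-critical
# users get a larger K. All values and the selection rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_aps, n_users = 16, 6
beta = rng.exponential(scale=1.0, size=(n_aps, n_users))     # large-scale fading (assumed)
critical = [True, False, False, True, False, False]          # per-user service class (assumed)

def form_clusters(beta, critical, k_regular=3, k_critical=5):
    clusters = {}
    for u in range(beta.shape[1]):
        k = k_critical if critical[u] else k_regular
        clusters[u] = np.argsort(beta[:, u])[-k:][::-1].tolist()   # strongest-K APs, best first
    return clusters

print(form_clusters(beta, critical))
```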
Citations: 0
Group-based link modeling for wireless digital twins: Towards accurate network performance prediction
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-24 | DOI: 10.1016/j.iot.2026.101875
Samir Si-Mohammed, Fabrice Théoleyre
Wireless networks are increasingly deployed in diverse domains, from best-effort object tracking to real-time control in smart factories. Yet, their performance strongly depends on configuration choices, especially at the MAC level. Thus, a single homogeneous configuration is often suboptimal due to the heterogeneous nature of individual links. We argue that Digital Twins (DTs) are a promising enabler for autonomous networks, capable of adapting configurations dynamically to prevailing conditions. However, global modeling approaches in DTs make it difficult to capture link-level variability and to accurately model the impact of configuration changes on performance. In this work, we propose a link-oriented prediction model designed to serve as a cornerstone for future wireless Digital Twins. Our model estimates the Packet Reception Rate (PRR) under different MAC configurations, capturing the unique characteristics of each communication link. To improve scalability in large deployments, we explore a clustering-based approach, where predictive models are trained per group of similar links rather than per individual link. Our experimental evaluation shows that these data-driven methods effectively capture link heterogeneity while offering robust prediction accuracy and enhanced generalization capabilities.
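The group-based strategy can be sketched as clustering links by their features and fitting one PRR predictor per cluster; the feature set, cluster count, and regressor below are assumptions chosen for illustration, not the paper's exact setup.

```python
# Sketch of the group-based idea: cluster links by their features, then train one
# PRR predictor per cluster instead of per individual link (illustrative assumptions only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_links = 200
link_features = rng.normal(size=(n_links, 3))            # e.g. RSSI, traffic load, node degree (assumed)
mac_config = rng.integers(0, 4, size=(n_links, 1))        # e.g. retry-limit index (assumed)
prr = rng.uniform(0.5, 1.0, size=n_links)                 # observed Packet Reception Rate

# 1) Group similar links.
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(link_features)

# 2) One model per group: PRR = f(link features, MAC configuration).
X = np.hstack([link_features, mac_config])
models = {}
for g in np.unique(groups):
    idx = groups == g
    models[g] = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[idx], prr[idx])

# Predict PRR for the first link under a hypothetical MAC configuration (index 2).
x_new = np.hstack([link_features[0], [2]]).reshape(1, -1)
print(models[groups[0]].predict(x_new))
```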
Citations: 0
Transformer-based classification of IoT network traffic with flow-to-window aggregation
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-21 | DOI: 10.1016/j.iot.2026.101879
Sergio Martin-Reizabal, Adrian Caballero-Quiroga, Beatriz Gil-Arroyo, Nuño Basurto, Ruben Ruiz-Gonzalez
The explosive growth of the IoT has led to increasingly complex and heterogeneous network traffic, posing major challenges for intrusion detection. Most existing machine learning and deep learning approaches model network traffic at the level of individual flows, which limits their ability to capture contextual relationships among concurrent communications. This paper introduces a Transformer-based framework for IoT intrusion detection that aggregates network flows into fixed-duration windows and treats each flow as a token within the input sequence. The self-attention mechanism captures contextual relationships among concurrent flows, enabling effective modeling of temporal dependencies without recurrence. Experiments conducted on the CICIoT2023 dataset show that the proposed model achieves a weighted F1-score of 97.9% and a macro ROC–AUC of 99.6% under temporally blocked cross-validation, while maintaining high computational efficiency. These results demonstrate that flow-to-window aggregation combined with self-attention provides a robust and scalable foundation for IoT network security, suitable for deployment in edge and smart-home environments.
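A minimal sketch of the flow-to-window idea follows: the flows seen in one time window become a token sequence, a Transformer encoder mixes context across concurrent flows, and each flow token receives class logits; the dimensions and the per-token head are assumptions for illustration, not the paper's architecture.

```python
# Minimal sketch of flow-to-window aggregation with a Transformer encoder.
# Each flow in a window is a token of flow-level features; self-attention shares
# context among concurrent flows; a per-token head emits class logits (assumed sizes).
import torch
import torch.nn as nn

class FlowWindowTransformer(nn.Module):
    def __init__(self, n_features=40, d_model=64, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)                  # flow features -> token embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)                    # per-flow class logits

    def forward(self, flows):                                        # flows: (batch, n_flows, n_features)
        return self.head(self.encoder(self.embed(flows)))

# One window containing 25 flows with 40 features each -> logits per flow.
window = torch.randn(1, 25, 40)
logits = FlowWindowTransformer()(window)                             # shape (1, 25, 2)
```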
Citations: 0
FedMamba: Robust multimodal federated intrusion detection for heterogeneous IoT systems
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-20 | DOI: 10.1016/j.iot.2026.101877
Hafiz Bilal Ahmad, Haichang Gao, Naila Latif, Tanjeena Manzoor
The convergence of Information Technology (IT) and Operational Technology (OT) in Industry 4.0 produces diverse data streams, such as system logs, sensor readings, and network traffic, which are vital for industrial security. However, existing security analytics are siloed by modality and rely on centralized processing, raising concerns regarding privacy, latency, and scalability. Although Federated Learning (FL) mitigates privacy risks, most frameworks remain unimodal, lack support for non-IID data distributions, and face adversarial evasion challenges. We propose FedMamba, a novel multimodal Federated Learning (MMFL) framework that creates a unified Mamba-based model to address these issues via (i) efficient cross-modal learning, (ii) a FedProx-based protocol for stable non-IID training that remains compatible with secure aggregation, and (iii) modality-specific adversarial training for robustness. Experiments on HDFS, SWaT, and CICIoMT-2024 datasets show that the standard FedMamba achieved competitive macro F1-scores of 0.9584, 0.9795, and 0.9665 relative to centralized baselines, but degraded on HDFS and SWaT under PGD attack (0.3791 and 0.5147), whereas CICIoMT-2024 remained robust under the same attack (0.9665). The adversarially trained FedMamba-AT sustains robust F1-scores (0.9480, 0.8357, 0.9645). FedMamba offers a robust and scalable solution for secure IIoT monitoring.
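The FedProx-based local training mentioned in (ii) can be sketched as a client-side update whose loss adds a proximal term (mu/2)·||w − w_global||² that pulls local weights toward the global model under non-IID data; the toy model, mu value, and optimizer are assumptions, and this is not the FedMamba training code.

```python
# Hedged sketch of a FedProx-style client update (not the paper's implementation).
import copy
import torch
import torch.nn as nn

def fedprox_local_update(global_model, data_loader, mu=0.01, lr=0.01, epochs=1):
    local = copy.deepcopy(global_model)
    global_params = [p.detach().clone() for p in global_model.parameters()]
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data_loader:
            opt.zero_grad()
            loss = loss_fn(local(x), y)
            # Proximal term keeps local weights close to the global model (non-IID stabilization).
            prox = sum(((p - g) ** 2).sum() for p, g in zip(local.parameters(), global_params))
            (loss + 0.5 * mu * prox).backward()
            opt.step()
    return local.state_dict()     # sent back for (secure) aggregation at the server

# Example with a toy model and a single synthetic batch.
model = nn.Linear(10, 2)
batch = [(torch.randn(8, 10), torch.randint(0, 2, (8,)))]
updated = fedprox_local_update(model, batch)
```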
Citations: 0
Cross-network cross-interface relaying via LoRa-ZigBee synergy: Enabling energy-efficient delay-constrained communication across low-power IoT networks
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-20 | DOI: 10.1016/j.iot.2026.101878
Hua Qin, Ni Li, Gelan Yang, Yang Peng
The rapid expansion of the Internet of Things (IoT) ecosystem has propelled widespread deployment of distributed low-power wireless networks, among which ZigBee stands out for diverse innovative applications. However, energy-efficient cross-network communication remains challenging, as existing solutions like multi-hop ZigBee and one-hop LoRa entail trade-offs between communication delays and deployment costs. To tackle these issues, we propose a novel cross-interface relaying paradigm that utilizes a star topology within each ZigBee network and designates a relay node with an additional LoRa interface to bridge networks via a central LoRa gateway. Compared with existing methods, this approach reduces energy consumption and costs while improving scalability. To implement this paradigm while balancing energy conservation with delay guarantees, we introduce a Cross-network Cross-interface Relaying (CCR) scheme, which jointly schedules LoRa and ZigBee transmission behaviors to minimize energy consumption under delay constraints. CCR uses a scheduling framework that breaks down end-to-end delay constraints into link-level constraints, enabling global optimization of transmission parameters and dynamic adaptation to link quality variations. The effectiveness of CCR is demonstrated through extensive field tests on a prototype implemented on Raspberry Pi 3B+. Results show that CCR reduces energy consumption by 55.4% and 39.1% compared with an advanced LoRa communication protocol and a state-of-the-art cross-interface relaying scheme, respectively, while ensuring that 98.7% of packets satisfy their delay constraints. These findings highlight the potential of CCR for efficient and reliable cross-network communication in large-scale IoT deployments.
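The delay-budget decomposition can be illustrated with a small sketch that splits an end-to-end budget across the ZigBee and LoRa links in proportion to their nominal delays and then picks, per link, the lowest-energy configuration that still meets its budget; the candidate configurations and the proportional split rule are assumptions, not the CCR scheduling algorithm.

```python
# Illustrative delay-budget decomposition and per-link parameter choice (assumed values).
# (delay_ms, energy_mJ) per candidate configuration for each link.
zigbee_configs = [(20, 3.0), (40, 1.8), (80, 1.1)]     # e.g. different wake-up intervals (assumed)
lora_configs = [(150, 25.0), (400, 14.0), (900, 9.0)]  # e.g. different spreading factors (assumed)

def split_budget(total_ms, nominal):
    """Proportional decomposition of the end-to-end budget into link-level budgets."""
    s = sum(nominal)
    return [total_ms * n / s for n in nominal]

def pick_min_energy(configs, budget_ms):
    """Lowest-energy configuration meeting the link budget; fall back to the fastest one."""
    feasible = [c for c in configs if c[0] <= budget_ms]
    return min(feasible, key=lambda c: c[1]) if feasible else min(configs, key=lambda c: c[0])

end_to_end_budget = 1000                                 # ms (assumed)
budgets = split_budget(end_to_end_budget, [40, 400])     # nominal ZigBee / LoRa delays (assumed)
choice = [pick_min_energy(zigbee_configs, budgets[0]),
          pick_min_energy(lora_configs, budgets[1])]
print(budgets, choice)
```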
Citations: 0
Creation of AI-driven smart spaces for enhanced indoor environments – A survey
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-18 | DOI: 10.1016/j.iot.2026.101876
Aygün Varol, Naser Hossein Motlagh, Mirka Leino, Sasu Tarkoma, Johanna Virkki
Smart spaces are ubiquitous computing environments that integrate diverse sensing and communication technologies to enhance functionality, optimize energy utilization, and improve user comfort and well-being. The adoption of emerging artificial intelligence (AI) methodologies has led to the development of AI-driven smart spaces, further expanding capabilities through applications such as personalized comfort settings, interactive living spaces, and automation of space systems. These advancements collectively elevate the quality of indoor experiences for users. To systematically examine these developments, we present a comprehensive survey of the foundational components of AI-driven smart spaces, including sensor technologies, data communication protocols, network management and maintenance strategies, and data collection, processing, and analytics. We investigate both traditional machine learning (ML) methods, such as deep learning (DL), and emerging approaches, including transformer networks and large language models (LLMs), highlighting their contributions and potential. We also showcase real-world applications of these technologies and provide insights to guide their continued development. Each section details relevant technologies and methodologies and concludes with an analysis of challenges and limitations, identifying directions for future research.
Citations: 0
Bibliometric analysis of secure IoT for quantum computing
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-16 | DOI: 10.1016/j.iot.2026.101872
Hamza Ibrahim, Love Allen Chijioke Ahakonye, Jae-Min Lee, Dong-Seong Kim
The convergence of Quantum Machine Learning (QML) and Blockchain is emerging as a transformative paradigm to address escalating security and scalability challenges in 6G-enabled Industrial Internet of Things (IIoT) networks. This study presents the first comprehensive bibliometric and meta-analysis of this nascent interdisciplinary field. We analyzed 159 peer-reviewed publications (indexed from January 2022 through December 22, 2024) from Scopus, employing a systematic Kitchenham-based methodology for literature selection and VOSviewer for science mapping. Our analysis reveals a 75% annual growth rate since 2022, with India (37.7%), the USA (12.6%), and South Korea (12.6%) as the leading contributors. Keyword co-occurrence analysis identified four dominant thematic clusters: “6G Network Security,” “Quantum Computing and AI,” “Blockchain and Decentralization,” and “IIoT Applications.” The study’s novelty lies in synthesizing bibliometric insights with a proposed five-layer QML-Blockchain integration framework and a comparative analysis against existing reviews. Quantitative performance metrics indicate that QML can improve anomaly detection accuracy by 5–9% over classical models, while advanced consensus mechanisms like PoA2 can reduce transaction latency by 35%. However, significant challenges persist, including quantum hardware limitations (e.g., qubit coherence < 100 μs), scalability challenges in achieving consensus across massive IIoT device densities, and a critical lack of empirical testbeds. This research provides a foundational roadmap, emphasizing the urgent need for standardized benchmarks, hybrid orchestration models, and quantum-resistant cryptography to realize secure, intelligent, and autonomous IIoT ecosystems in the 6G era.
Citations: 0
PT-TrafficAnalyzer: A weighted ensemble prediction tree for IoT attack detection
IF 7.6 | CAS Zone 3 (Computer Science) | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2026-01-14 | DOI: 10.1016/j.iot.2026.101874
Recep Sinan Arslan
The Internet of Things has a network structure that is vulnerable to cyberattacks and susceptible to attackers. Ensuring the privacy of organizations and individuals is a crucial issue in IoT networks, where sensitive data is transmitted and various types of attacks are prevalent. The importance of Intrusion Detection Systems (IDS) in detecting attacks and infiltration attempts on IoT networks is increasing daily. In this way, it will contribute to the resistance against attackers and the spread of this modern technology. In this study, Prediction Tree Traffic Analyzer (PT-TAnalyzer), an IDS system capable of detecting and classifying attacks on IoT networks, is proposed. PT-TAnalyzer features an ensemble model structure, where weighting is determined by the validation scores of machine learning models, and a prediction tree comprising eight ensemble models trained on the CIC-IoT-2023 dataset. This proposed model detects 34 attack types (including 33 malicious and one benign) with high success rates, due to its unique attack-detection approach, and does so efficiently and cost-effectively. Unlike traditional studies, it achieves this by using eight trained models rather than classifying all attacks with a single model and a single prediction structure within the tree architecture. In the tests performed, PT-TAnalyzer achieved 99.76 % accuracy in the binary classification experiment (Benign vs. Malicious) and 98.70 % accuracy in the 34-class experiment, yielding a similar F1 Score. The test time per sample is less than 0.1 ms. Compared with previous frameworks using the same dataset, PT-TAnalyzer shows a 2 % improvement in overall accuracy and a lower processing time. In practice, the proposed model can be deployed on IoT gateways or edge devices to provide real-time, low-cost, and scalable intrusion detection capabilities. The model outperforms previous studies using the same dataset, while also addressing the limitations.
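The validation-score-weighted ensemble idea can be sketched as follows: several classifiers are trained, each is weighted by its validation accuracy, and their class probabilities are combined with those weights; the member models and weighting rule are illustrative assumptions and do not reproduce the paper's eight-model prediction tree over the CIC-IoT-2023 classes.

```python
# Sketch of validation-score-weighted ensemble voting (illustrative, not PT-TAnalyzer itself).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic multi-class traffic-like data (assumed stand-in for real flow features).
X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                           n_informative=8, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

members = [RandomForestClassifier(n_estimators=100, random_state=0),
           ExtraTreesClassifier(n_estimators=100, random_state=0),
           GradientBoostingClassifier(random_state=0)]

weights = []
for m in members:
    m.fit(X_tr, y_tr)
    weights.append(accuracy_score(y_val, m.predict(X_val)))   # validation score as weight
weights = np.array(weights) / sum(weights)

def weighted_predict(X_new):
    """Combine class probabilities with validation-derived weights."""
    proba = sum(w * m.predict_proba(X_new) for w, m in zip(weights, members))
    return proba.argmax(axis=1)

print(weighted_predict(X_val[:5]), y_val[:5])
```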
Citations: 0