
Latest articles in Internet of Things

Smart-contract-based blockchain-enabled decentralized scheme for improving smart-grid security
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-10-24 · DOI: 10.1016/j.iot.2025.101811
Bhabendu Kumar Mohanta, Ali Ismail Awad, Tarek Elsaka, Hamza Kheddar, Ezedin Baraka
Intelligent devices with embedded technology have proliferated dramatically over the past decade. The Internet of Things (IoT) has emerged as a transformational force, advancing traditional systems to previously unattainable levels of intelligence. Smart cities, transportation, healthcare, supply-chain management, agriculture, water management, and smart grid (SG) systems are among the industries where the IoT has found applications. These developments are demonstrated by the integration of IoT systems into SG networks, offering significant improvements in sustainability, dependability, and efficiency. Such systems use various IoT devices to continuously monitor the environment and transmit data for processing and analysis. Nonetheless, the growth of the IoT has introduced security vulnerabilities, including concerns about user identification, data integrity, and trust, especially in SG applications. This study aims to resolve several security challenges in IoT-enabled SG applications to support sustainability. The proposed scheme effectively tackles critical security requirements such as data integrity, user anonymity, distributed storage, trust management, and decentralized architecture. The security concerns addressed by blockchain technology include preserving data integrity, fostering trust, providing secure communication, and enabling effective monitoring. Smart contracts automate system processes and are effective in maintaining user trust. The experimental findings support the viability of the proposed system, demonstrating a computational cost of 3.150 ms and a communication overhead of 992 bits, both representing improvements over various existing solutions. Additionally, the deployment cost for the smart contract is found to be 5.64 USD with a writing cost of 2.89 USD, both of which are lower than the costs associated with comparable approaches.
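The data-integrity property the abstract attributes to the blockchain can be illustrated with a minimal hash-chained log of meter readings. This is an illustrative sketch only, not the paper's scheme: there is no smart contract or consensus here, just the core idea that each record commits to its predecessor's hash, so any tampering is detectable.

```python
import hashlib
import json

def make_block(reading, prev_hash):
    """Bundle a meter reading with the previous block's hash (illustrative only)."""
    payload = {"reading": reading, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {"payload": payload, "hash": digest}

def verify_chain(chain):
    """Recompute every hash and check that each block links to its predecessor."""
    prev = "0" * 64  # genesis sentinel
    for block in chain:
        if block["payload"]["prev"] != prev:
            return False
        digest = hashlib.sha256(
            json.dumps(block["payload"], sort_keys=True).encode()
        ).hexdigest()
        if digest != block["hash"]:
            return False
        prev = block["hash"]
    return True

# Chain three smart-meter readings, then tamper with one of them.
chain, prev = [], "0" * 64
for kwh in [3.2, 4.1, 2.8]:
    block = make_block(kwh, prev)
    chain.append(block)
    prev = block["hash"]

assert verify_chain(chain)
chain[1]["payload"]["reading"] = 99.9  # any modification breaks verification
assert not verify_chain(chain)
```

In a deployed system the hash links and verification logic would live in the smart contract rather than in client code, but the tamper-evidence argument is the same.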
Citations: 0
Resource-efficient fog computing vision system for occupancy monitoring: A real-world deployment in university libraries
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-08-25 · DOI: 10.1016/j.iot.2025.101748
Alejandro S. Martínez-Sala, Lucio Hernando-Cánovas, Juan C. Sánchez-Aarnoutse, Juan J. Alcaraz
This paper presents a fog computing system for real-time occupancy monitoring across three university libraries, using ceiling-mounted, top-view cameras positioned above each entrance. Video streams from low-cost cameras are securely transmitted to a fog server deployed within the university’s intranet. Top-view person tracking ensures privacy compliance by inherently eliminating facial recognition, but introduces challenges such as non-standard human appearance, occlusions, and lighting variations. For person detection, we employ a YOLOv5 model initially trained on top-view human annotations, further refined through transfer learning using a curated dataset from the three libraries. The system features a two-stage processing pipeline. First, a lightweight background subtraction algorithm filters frames with potential motion, which are queued via RabbitMQ for sequential processing. Second, a People Flow Counting module applies the optimized YOLOv5 model to detect and count individuals in each frame, followed by a custom tracking algorithm and virtual line-crossing logic to ensure accurate flow tracking. Each library is handled independently through a batch processing approach, updating occupancy estimates with bounded delay using a single CPU-only fog server. This architecture maintains low latency while avoiding server overload and minimizing energy use. The system has been in continuous production for over twelve months, demonstrating reliable performance across all three libraries on commodity hardware. Quantitative evaluation confirms 94 % accuracy in people flow detection, validating the system’s robustness, scalability, and practical utility for long-term, privacy-preserving deployment in smart campus environments.
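The virtual line-crossing logic in the counting module can be sketched in a few lines. The coordinate names, line position, and the downward-equals-entering convention below are assumptions for illustration, not details from the paper:

```python
def update_counts(prev_y, new_y, line_y, counts):
    """Count a crossing when a tracked centroid's y-coordinate passes the
    virtual line. The downward-equals-entering convention is an assumption."""
    if prev_y < line_y <= new_y:
        counts["in"] += 1
    elif new_y < line_y <= prev_y:
        counts["out"] += 1
    return counts

LINE_Y = 160  # y-position of the virtual line, in pixels (hypothetical)
counts = {"in": 0, "out": 0}
# Centroid y-positions of one tracked person entering, then leaving.
trajectory = [100, 140, 180, 220, 180, 140, 100]
for prev_y, new_y in zip(trajectory, trajectory[1:]):
    update_counts(prev_y, new_y, LINE_Y, counts)

assert counts == {"in": 1, "out": 1}  # net occupancy change: 0
```

Comparing consecutive positions rather than instantaneous ones makes the count robust to a person lingering on the line: each physical crossing is counted exactly once.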
Citations: 0
CARES: A Hybrid caregivers recommendation system using deep learning and knowledge graphs
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-09-19 · DOI: 10.1016/j.iot.2025.101769
Qiaoyun Zhang, Sze-Han Wang, Chung-Chih Lin, Chih-Yung Chang, Diptendu Sinha Roy
Recommendation systems have prospered by leveraging user-item interactions and their features for personalized recommendations. Recent advancements in deep learning further enhance these recommendation systems with powerful backbones for learning from user-item data. However, solely depending on these interactions often leads to the cold-start problem, where items lacking historical data cannot be effectively recommended. Additionally, the issue of high similarity between user and item features frequently goes unresolved. This paper introduces a Hybrid Caregiver Recommendation mechanism, called CARES, designed to recommend suitable caregivers for postpartum women using deep learning and knowledge graphs. Initially, the proposed CARES utilizes Extreme Gradient Boosting (XGBoost) to identify important features, addressing the issue of feature similarity. Then it employs K-Means clustering to group postpartum women and caregivers based on similar features. Subsequently, it utilizes a Deep & Cross Network (DCN) to automatically learn feature interactions and constructs knowledge graphs to tackle the cold start problem. The proposed CARES also integrates exploration and exploitation strategies to balance the accuracy and diversity of recommendations. The proposed CARES is compared with existing mechanisms on real datasets, and the simulation results demonstrate its effectiveness in terms of precision, recall, and F1-Score.
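The select-then-cluster front end of such a pipeline can be sketched as follows. This is a simplified stand-in: feature variance substitutes for XGBoost importance scores, and a minimal k-means replaces a library implementation; the profile data is invented for illustration.

```python
import random

def top_k_features(X, k):
    """Rank features by variance, a simple stand-in for the XGBoost
    importance scores used in the paper (variance is only a proxy)."""
    n = len(X)
    variances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mu = sum(col) / n
        variances.append(sum((v - mu) ** 2 for v in col) / n)
    return sorted(range(len(variances)), key=lambda j: -variances[j])[:k]

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means to group users (e.g., mothers or caregivers)
    by their selected features."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
            clusters[nearest].append(p)
        centers = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Select the two most informative of three features, then cluster on them.
profiles = [[0, 10, 1], [1, -10, 1], [0, 12, 1], [1, -12, 1]]
selected = top_k_features(profiles, 2)  # feature 1 varies most, then feature 0
reduced = [[row[j] for j in selected] for row in profiles]
centers, clusters = kmeans(reduced, 2)
```

Dropping near-constant features before clustering is what mitigates the feature-similarity issue the abstract mentions: distances are then dominated by attributes that actually distinguish users.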
Citations: 0
Enhancing big data analysis in IoT applications and optimizing the performance of machine learning models using hybrid dimensionality optimization approach
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-09-18 · DOI: 10.1016/j.iot.2025.101764
Ihab Nassra, Juan V. Capella
The proliferation of Internet of Things (IoT) applications generates high-dimensional datasets characterized by substantial velocity, variety, and complexity, imposing severe computational constraints on machine learning systems. The high dimensionality of such data complicates the identification of meaningful correlations among features: the abundance of variables tends to obscure relevant relationships and hinder practical data analysis, particularly regarding computational resource consumption (e.g., memory usage), processing time, and the training efficiency and performance of machine learning models. Dimensionality reduction techniques address these challenges by decreasing the number of input variables while preserving the intrinsic structure of the data and alleviating computational burdens. Nevertheless, most contemporary methods are optimized for either linear or nonlinear data patterns, but rarely both. Hybrid strategies that integrate linear and nonlinear reduction techniques have increasingly addressed this limitation. Specifically, combining Principal Component Analysis (PCA) as a preprocessing stage with Restricted Boltzmann Machines (RBMs) offers a complementary solution: PCA condenses the feature space into a lower-dimensional representation, improving training efficiency and enabling RBMs to capture complex nonlinear dependencies with enhanced convergence and generalization. While this combination can in principle exploit both the linear and nonlinear characteristics of the data, conventional PCA-RBM frameworks often struggle to retain essential local manifold structures, limiting their effectiveness on complex real-world datasets.
This study addresses these challenges by proposing a novel hybrid dimensionality reduction framework that integrates PCA's global linear projection capabilities with RBMs' nonlinear feature learning strengths through an adaptive graph regularization mechanism that preserves critical local manifold properties, overcoming the limitations of conventional PCA-RBM combinations. The adaptive regularization mechanism ensures that proximate data points in the input space remain similar in the reduced feature space, effectively bridging global and local structure preservation. Experimental validation demonstrates superior performance over conventional methods across multiple evaluation metrics, including data reduction efficiency, classification accuracy, precision, recall, and F-score. The framework addresses critical limitations in high-dimensional data processing while maintaining model performance, constituting a methodologically significant contribution to dimensionality reduction techniques applicable across scientific disciplines handling complex IoT-generated datasets. Our findings indicate that dimensionality reduction constitutes a viable and efficacious approach to simplifying datasets without significantly compromising performance.
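The PCA preprocessing stage of such a hybrid pipeline can be sketched with a classical eigendecomposition of the covariance matrix. This illustrates standard PCA only; the RBM stage and the paper's adaptive graph regularization are omitted, and the synthetic data is invented for illustration.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Classic PCA: project centered data onto the top principal components.
    Only the linear preprocessing stage of the hybrid pipeline is shown."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]
    return Xc @ components, components

rng = np.random.default_rng(0)
# 200 synthetic IoT-style samples lying near a 2-D plane inside 10-D space.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 10))

Z, components = pca_reduce(X, 2)
explained = np.var(Z, axis=0).sum() / np.var(X - X.mean(axis=0), axis=0).sum()
assert Z.shape == (200, 2) and explained > 0.99
```

The reduced representation `Z` is what a nonlinear stage (here, an RBM) would consume; because the data is nearly planar, two components retain almost all of its variance.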
Citations: 0
Robust priority aware multi-criterion offloading in digital twin UAVs networks
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-09-19 · DOI: 10.1016/j.iot.2025.101763
Muhammad Yahya, Muhammad Naeem, Zeeshan Kaleem, Waleed Ejaz
Unmanned Aerial Vehicles (UAVs) play a critical role in replenishing the energy of power-constrained Internet of Things (IoT) devices, particularly in public safety operations, thereby maintaining continuous system functionality. Integrating Mobile Edge Computing (MEC) into UAV platforms enables offloading computational tasks to aerial nodes, optimizing resource utilization. Efficient orchestration of communication, computation, caching, and energy resources is imperative to maximize the benefits of UAV-assisted MEC networks. Additionally, ensuring high situational awareness is essential for supporting priority-based latency-sensitive applications. Digital twin technology can be instrumental in minimizing latency by generating a real-time digital representation of the physical infrastructure, enabling enhanced system monitoring and optimization. Accordingly, we have formulated an optimization problem to maximize the number of IoT devices UAVs can support while adhering to predefined constraints. The formulated problem is a mixed integer non-linear programming model. Additionally, the dynamic management of tasks with varying priorities and computational demands introduces a significant resource allocation and scheduling challenge. Our proposed approach entails an efficient task offloading and priority-based scheduling strategy that prioritizes tasks, allocating computational resources to those with higher priority. The approach encompasses a multi-stage offloading strategy combining an interior-point method with a learning algorithm to address the inherent complexity and provide a viable solution. Simulation results validate the effectiveness of the proposed approach, outperforming conventional methods. Specifically, the Penalty Function Method Heuristic combined with the Interior Point Method achieves superior user connectivity compared to the Simple Relaxation Heuristic strategy.
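The priority-based scheduling idea can be sketched as a greedy admission rule: serve higher-priority tasks first and accept tasks while resource budgets allow. This is a heuristic stand-in for the paper's interior-point + learning solver, not the actual algorithm, and the task fields and capacities are hypothetical.

```python
def admit_tasks(tasks, cpu_capacity, bw_capacity):
    """Priority-first greedy admission: consider higher-priority tasks first
    (lighter tasks first within a priority level) and accept each task while
    the CPU and bandwidth budgets allow."""
    admitted = []
    cpu_left, bw_left = cpu_capacity, bw_capacity
    for task in sorted(tasks, key=lambda t: (-t["priority"], t["cpu"] + t["bw"])):
        if task["cpu"] <= cpu_left and task["bw"] <= bw_left:
            admitted.append(task["id"])
            cpu_left -= task["cpu"]
            bw_left -= task["bw"]
    return admitted

tasks = [
    {"id": "t1", "priority": 3, "cpu": 4, "bw": 2},  # high priority, heavier
    {"id": "t2", "priority": 1, "cpu": 2, "bw": 1},  # low priority
    {"id": "t3", "priority": 3, "cpu": 2, "bw": 2},  # high priority, lighter
    {"id": "t4", "priority": 2, "cpu": 5, "bw": 3},  # too big once t1/t3 run
]
assert admit_tasks(tasks, cpu_capacity=8, bw_capacity=5) == ["t3", "t1", "t2"]
```

Breaking ties toward lighter tasks within a priority level maximizes the number of admitted devices at that level, which mirrors the objective of supporting as many IoT devices as constraints permit.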
Citations: 0
Joint optimisation of time and energy consumption for data aggregation in fog-enabled IoT networks
IF 7.6 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2025-11-01 · Epub Date: 2025-09-24 · DOI: 10.1016/j.iot.2025.101775
Sira Yongchareon
Fog computing extends cloud capabilities to the network edge, enabling Internet-of-Things (IoT) devices to offload computation to nearby fog nodes rather than a remote cloud. Offloading aggregation tasks reduces data redundancy and accelerates analytics while easing device energy use and backhaul load. Yet end-to-end completion time—comprising execution, transmission, and queueing—can still be substantial, creating a challenging time-energy trade-off. We formulate data-aggregation offloading as a multi-objective optimization problem that jointly minimizes latency (makespan) and energy under compute and bandwidth constraints. To solve it, we develop an NSGA-III-based method that searches for Pareto-optimal offloading and scheduling decisions across sensor and fog nodes. Comprehensive simulations and systematic experiments demonstrate that our approach consistently outperforms state-of-the-art baselines, delivering lower latency and energy consumption with better scalability.
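Pareto optimality, the notion NSGA-III searches for, can be illustrated by filtering candidate offloading plans down to their non-dominated set. This sketches only the dominance check on (latency, energy) pairs, not NSGA-III itself, and the candidate values are invented.

```python
def pareto_front(solutions):
    """Return the non-dominated (latency, energy) pairs: keep a solution unless
    some other solution is at least as good in both objectives and strictly
    better in one (both objectives are minimized)."""
    front = []
    for s in solutions:
        dominated = any(
            o[0] <= s[0] and o[1] <= s[1] and (o[0] < s[0] or o[1] < s[1])
            for o in solutions
        )
        if not dominated:
            front.append(s)
    return sorted(set(front))

# Candidate offloading plans as (makespan_ms, energy_mJ) pairs.
candidates = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 7), (11, 6)]
# (9, 9) is dominated by (8, 7); (11, 6) is dominated by (10, 5).
assert pareto_front(candidates) == [(8, 7), (10, 5), (12, 4)]
```

The surviving pairs are the time-energy trade-off curve: moving along the front trades lower latency for higher energy, and an operator picks the operating point that fits the deployment.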
Citations: 0
Exploring healthcare in the 6G and AI era: Opportunities and challenges
IF 7.6, CAS Tier 3 (Computer Science), Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS. Pub Date: 2025-11-01; Epub Date: 2025-08-29; DOI: 10.1016/j.iot.2025.101744
Houssein Taleb , Guillaume Andrieux , Daniele Khalife , Alain Ajami , Abbass Nasser
The integration of AI with emerging 6G wireless communications promises to revolutionize healthcare delivery by providing ultra-fast, reliable, and intelligent medical services. Unlike previous generations of mobile networks, 6G is expected to offer sub-millisecond latency, data rates of up to terabits per second, and connectivity for more than 10 million devices per square kilometer. Together, these technologies will enable unprecedented healthcare applications, such as real-time remote robotic surgery, holographic telemedicine, and continuous monitoring using bio-nanosensors within the Bio-Nano Internet of Things.
This survey systematically analyzes the integration of AI and 6G technologies, focusing on how their convergence will enable enhanced edge computing, federated and generative AI models, low-latency analytics for personalized treatment, predictive diagnostics, and efficient resource utilization. We present a comprehensive comparison of 5G and 6G architectures, highlighting the limitations of current systems and demonstrating how 6G advancements can address critical healthcare needs, including data throughput, mobility, and security.
Furthermore, this work identifies detailed opportunities, such as AI-powered virtual nurse assistants, AI-enhanced drug discovery accelerated by hyper-responsive 6G infrastructures, and digital twin-enabled patient simulation. Alongside these opportunities, we critically examine the technical challenges related to spectrum management in the terahertz band, the design of energy-efficient IoT devices, robust data privacy frameworks that integrate federated learning and blockchain technology, ethical considerations surrounding AI explainability, and equitable access to healthcare.
By filling gaps in the existing literature, this paper presents a comprehensive framework that combines AI and 6G, specifically designed for healthcare systems. Our findings underscore the transformative potential of this combination for achieving proactive and accessible healthcare, while outlining a roadmap for overcoming prevailing technical, ethical, and infrastructural barriers.
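Among the privacy frameworks the survey examines is federated learning, in which hospitals train models locally and share only model parameters, never raw patient data. The toy sketch below shows the weighted-averaging aggregation step (in the style of FedAvg); the weight vectors, sample counts, and function name are illustrative assumptions, not taken from the paper.

```python
# Toy sketch of federated averaging: a server combines per-client model
# weights, weighted by how much local data each client trained on.

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors (lists of floats)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(dim)]

# Three hypothetical hospitals with different amounts of local data;
# only these weight vectors, not patient records, leave each site.
local_models = [[0.2, 1.0], [0.4, 0.8], [0.3, 0.9]]
samples = [100, 300, 100]
print(fed_avg(local_models, samples))  # → approximately [0.34, 0.86]
```

Real deployments add secure aggregation and differential privacy on top of this step, which is where the blockchain integration discussed in the abstract would sit.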
Internet of Things, Volume 34, Article 101744.
Citations: 0
TrustAIoT: A framework for building trustworthy AIoT platforms
IF 7.6, CAS Tier 3 (Computer Science), Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS. Pub Date: 2025-11-01; Epub Date: 2025-09-08; DOI: 10.1016/j.iot.2025.101751
Carlos Mario Braga , Ángel Suarez-Bárcena , Manuel A. Serrano , Eduardo Fernández-Medina
In an increasingly connected world, each device is expected to be linked to the internet and, consequently, to other objects with which it can communicate. As the Internet of Things (IoT) expands, so do social, legal, and ethical concerns. Moreover, IoT systems are evolving to process data and provide recommendations based on their findings, a capability that stems from the integration of Artificial Intelligence (AI) and Machine Learning technologies. The convergence of IoT and AI (AIoT), along with the unique aspects of AIoT architectures that differ from non-IoT-enabled AI systems, necessitates a thorough review of specific considerations for building trustworthy AIoT systems. Ensuring trustworthiness in AIoT is crucial due to the increased complexity and potential vulnerabilities introduced by this convergence.
This article introduces TrustAIoT, a structured framework for the development and long-term governance of trustworthy AIoT platforms. The framework integrates ethical, legal, and technical dimensions, and consists of both a multi-layer guideline and a lifecycle-oriented process tailored to the specific architectural characteristics of AIoT systems.
Based on a systematic literature review, trustworthiness-related technical, ethical, and legal elements are cross-referenced and contrasted with the operational and architectural needs of AIoT environments, ensuring that all critical aspects are addressed. Challenges in applying these elements across heterogeneous architectures are analyzed, leading to the definition of a guideline for transferring trust principles across the different layers of an AIoT platform, and a process for constructing and maintaining such platforms. A platform is understood as an environment comprising infrastructure, tools, services, processes, and components for developing, deploying, and operating applications.
The TrustAIoT framework consolidates these artifacts into a cohesive approach that supports trust-oriented decision-making throughout the platform lifecycle, from architectural design to project deployment.
The proposed guideline and process form a unified framework that ensures high trustworthiness standards, thereby enabling the reliable development of multiple projects and applications within AIoT ecosystems.
Internet of Things, Volume 34, Article 101751.
Citations: 0
Data security for cloud-fog-assisted Industrial Internet of Things (IIoT) in future Industry 5.0
IF 7.6, CAS Tier 3 (Computer Science), Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS. Pub Date: 2025-11-01; Epub Date: 2025-09-17; DOI: 10.1016/j.iot.2025.101765
Haowen Tan , Max Hashem Eiza , Sangman Moh , Kouichi Sakurai
Cloud–Fog assisted Industrial Internet of Things (IIoT) has emerged as a core enabling technology for Industry 5.0, driving innovations in smart manufacturing by facilitating real-time interactions among industrial devices, fog nodes, and cloud platforms. However, inherent limitations in computational power and adaptability of IIoT terminals pose significant challenges to data security protection. This special issue focuses on addressing critical data security issues in Cloud–Fog IIoT systems.
Internet of Things, Volume 34, Article 101765.
Citations: 0
A systematic literature review on AI in IoT systems: Tasks, applications, and deployment
IF 7.6, CAS Tier 3 (Computer Science), Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS. Pub Date: 2025-11-01; Epub Date: 2025-09-30; DOI: 10.1016/j.iot.2025.101779
Umair Khadam , Paul Davidsson , Romina Spalazzese
The integration of Artificial Intelligence (AI) into Internet of Things (IoT) systems has garnered considerable attention for its ability to enhance efficiency, functionality, and decision-making. To drive further research and practical applications, it is essential to gain a deeper understanding of the different roles of AI in IoT systems. In this systematic literature review, we analyze 103 articles describing Artificial Intelligence of Things (AIoT) systems found in three databases: Scopus, IEEE Xplore, and Web of Science. For each article, we examined the tasks for which AI was used, the input and output data, the application domain, the maturity level of the system, the AI methods used, and where the AI components were deployed. As a result, we identified six general tasks of AI in IoT systems and thirteen subtasks, the most frequent being prediction, object and event recognition, and operational decision-making. Moreover, we conclude that most AI components in IoT systems process numeric data as input, and that healthcare is the most common application domain, followed by farming and transportation. Our analysis further revealed that most AIoT systems are in early development stages and have not been validated in real environments. We also identified the Convolutional Neural Network as the most frequently employed AI method, with supervised learning being the dominant approach. Additionally, we found that AI deployment in the cloud and at the edge are both common, whereas hybrid deployment is not. Finally, we identified key gaps in current AIoT research and, based on these, suggest directions for future research.
Internet of Things, Volume 34, Article 101779.
Citations: 0