Security and energy consumption are pressing concerns in Wireless Sensor Networks (WSNs). WSNs play an important role among networking technologies in handling edge devices on heterogeneous edge computing platforms. To speed up the processing of sensor nodes in the Industrial Internet of Everything (IIOE), efficient computing techniques for this emerging networking technology are being explored. Accordingly, the proposed study provides a chaotic mud ring-based elliptic curve cryptography (CMR_ECC) encryption solution for WSN security. In the proposed WSN environment, various sensor nodes are deployed to collect data. To extend the network lifetime, the nodes are grouped into clusters, and cluster-head selection is performed with a fuzzy logic-based osprey algorithm (FL_OA). For the encryption step, optimal key selection is performed with a hybrid chaotic mud ring algorithm, and the encrypted data are routed to the various edge servers with a hybrid Chebyshev Gannet Optimization (CGO) approach. Data aggregation is performed with a Q-reinforcement learning approach. The proposed work is implemented in MATLAB. For 500, 750, and 1000 WSN sensor nodes, the proposed technique yielded energy consumption values of 0.28780005 mJ, 0.31141 mJ, and 0.339419 mJ, respectively.
{"title":"Secured energy optimization of wireless sensor nodes on edge computing platform using hybrid data aggregation scheme and Q-based reinforcement learning technique","authors":"Rupa Kesavan , Yaashuwanth Calpakkam , Prathibanandhi Kanagaraj , Vijayaraja Loganathan","doi":"10.1016/j.suscom.2024.101072","DOIUrl":"10.1016/j.suscom.2024.101072","url":null,"abstract":"<div><div>Wireless Sensor Network (WSN) security and energy consumption is a potential issue. WSN plays an important role in networking technologies to handle edge devices on a heterogeneous edge computing platform. For faster processing of sensor nodes on an Industrial Internet of Everything (IIOE), an efficient computing technique for an emerging networking technology is being explored. As a result, the proposed study provides a chaotic mud ring-based elliptic curve cryptographic (CMR_ECC)-based encryption solution for WSN security. In the proposed WSN environment, various sensor nodes are deployed to collect data. To enhance the network lifetime, the nodes are combined into clusters, and the selection of cluster heads is performed with a fuzzy logic-based osprey algorithm (FL_OA). After the encryption process, the most optimal key selection process is performed with a hybrid chaotic mud ring algorithm, and the encrypted data are optimally routed to varied edge servers with a hybrid Chebyshev Gannet Optimization (CGO) approach. The data aggregation is performed with a Q-reinforcement learning approach. The proposed work is implemented with MATLAB. 
For 500, 750, and 1000 WSN sensor nodes, the proposed technique resulted in energy consumption values of 0.28780005 mJ, 0.31141 mJ, and 0.339419 mJ, respectively.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101072"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143096403","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
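As a hedged illustration of the Q-reinforcement-learning idea behind the aggregation step, the single-state sketch below learns which forwarding choice costs the least energy. The neighbours, energy costs, and hyperparameters are invented for the example; the paper's actual formulation and MATLAB implementation are not reproduced here.

```python
import random

# Single-state (bandit-style) Q-learning sketch for an aggregation decision:
# a node learns which neighbour is cheapest to forward aggregated data to.
# All numbers below are illustrative assumptions, not the paper's values.
ALPHA, EPISODES = 0.5, 500
# action -> energy cost (mJ) of forwarding via that neighbour (assumed)
energy_cost = {"neighbour_a": 0.4, "neighbour_b": 0.3, "neighbour_c": 0.6}
q = {a: 0.0 for a in energy_cost}

random.seed(1)
for _ in range(EPISODES):
    # epsilon-greedy exploration over forwarding choices
    if random.random() < 0.1:
        action = random.choice(list(q))
    else:
        action = max(q, key=q.get)
    reward = -energy_cost[action]      # cheaper forwarding => higher reward
    # single-step episode, so the Q-update has no bootstrap term
    q[action] += ALPHA * (reward - q[action])

best = max(q, key=q.get)
print(best)
```

After enough episodes the Q-values approach the negated energy costs, so the greedy choice settles on the lowest-energy neighbour.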
Pub Date: 2025-01-01 | DOI: 10.1016/j.suscom.2024.101076
Sneha Pokharkar , Mahesh D. Goudar , Vrushali Waghmare
Because of high maintenance costs and inaccessibility, regularly replacing batteries is a major difficulty for Wireless Sensor Nodes (WSNs) in remote locations. Harvesting energy from resources such as sun, wind, heat, and vibration is one option, and because of its plentiful availability, solar energy harvesting is the most attractive alternative among them. The battery is charged by solar energy during the day, and while solar energy is unavailable, the system is powered by the charge stored in the battery. Hence, in this paper, a highly efficient Solar Energy Harvesting (SEH) system is proposed using the Leadership Promoted Wild Horse Optimizer (LPWHO), a conceptual improvement of the standard Wild Horse Optimization (WHO) algorithm. This research focuses on overall harvesting efficiency, which in turn depends on Maximum Power Point Tracking (MPPT). MPPT is used because it extracts maximal power from the solar panels and reduces power loss: it improves the efficiency of the power extracted from a solar panel whose operating voltage deviates from the maximum power point. Finally, the superiority of the presented approach is demonstrated with respect to varied measures.
{"title":"An MPPT integrated DC-DC boost converter for solar energy harvester using LPWHO approach","authors":"Sneha Pokharkar , Mahesh D. Goudar , Vrushali Waghmare","doi":"10.1016/j.suscom.2024.101076","DOIUrl":"10.1016/j.suscom.2024.101076","url":null,"abstract":"<div><div>Due to high maintenance costs and inaccessibility, replacing batteries regularly is a major difficulty for Wireless Sensor Nodes (WSNs) in remote locations. Harvesting energy from multiple resources like sun, wind, thermal, and vibration is one option. Because of its plentiful availability, solar energy harvesting is the finest alternative among them. The battery gets charged during the day by solar energy, and while solar energy is unavailable, the system is powered by the charge stored in the battery. Hence, in this paper, a highly efficient Solar Energy Harvesting (SEH) system is proposed using Leadership Promoted Wild Horse Optimizer (LPWHO). LPWHO refers to the conceptual improvement of the standard Wild Horse optimization (WHO) algorithm. This research is going to focus on overall harvesting efficiency which further depends on MPPT. MPPT is used as it extracts maximal power from the solar panels and reduces power loss. The usage of MPPT enhances the extracted power’s efficiency out of the solar panel when its voltages are out of sync. 
At last, the supremacy of the presented approach is proved with respect to varied measures.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101076"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143096407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
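To make the MPPT idea concrete, here is a minimal perturb-and-observe (P&O) loop on a toy single-peak PV power curve. The paper tunes its harvester with LPWHO rather than plain P&O, and the curve below is an invented stand-in, so this sketch only shows what "tracking the maximum power point" means.

```python
# Perturb-and-observe MPPT sketch: nudge the operating voltage, keep the
# direction while power rises, reverse it when power falls. The PV curve is
# an assumed toy model with its maximum power point near 13.9 V.
def pv_power(v):
    return max(0.0, v * (3.0 - 0.0052 * v ** 2))

v, step = 5.0, 0.2
p_prev = pv_power(v)
for _ in range(200):
    v += step
    p = pv_power(v)
    if p < p_prev:        # power dropped: reverse the perturbation direction
        step = -step
    p_prev = p

print(round(v, 1))  # oscillates around the maximum power point
```

The loop converges to a small oscillation around the peak, which is why fixed-step P&O trades tracking speed against steady-state ripple; metaheuristic tuning such as LPWHO targets exactly that trade-off.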
Cloud services have become indispensable in critical sectors such as healthcare, drones, digital twins, and autonomous vehicles, providing essential infrastructure for data processing and real-time analytics. These systems operate across multiple layers, including edge, fog, and cloud, requiring efficient resource management to ensure reliability and energy efficiency. However, increasing computational demands have led to rising energy consumption and frequent faults in cloud data centers. Inefficient task scheduling exacerbates these issues, causing resource overutilization, execution delays, and redundant processing. Current approaches struggle to optimize energy consumption, execution time, and fault tolerance simultaneously. While some methods offer partial solutions, they suffer from high computational complexity and fail to effectively balance the workloads or manage redundancy. Therefore, a comprehensive task scheduling solution is needed for mission-critical applications. In this article, we introduce a novel scheduling algorithm based on Mixed Integer Linear Programming (MILP) that optimizes task allocation across edge, fog, and cloud environments. Our solution reduces energy consumption, execution time, and failure rates while ensuring balanced distribution of computational loads across virtual machines. Additionally, it incorporates a fault tolerance mechanism that reduces the overlap between primary and backup tasks by distributing them across multiple availability zones. The scheduler’s efficiency is further enhanced by a custom-designed heuristic, ensuring scalability and practical applicability. The proposed MILP-based scheduler demonstrates significant average improvements over the best state-of-the-art algorithms evaluated. It achieves a 9.63% increase in task throughput, reduces energy consumption by 18.20%, shortens execution times by 9.35%, and lowers failure probabilities by 11.50% across all layers of the distributed cloud system. 
These results highlight the scheduler’s effectiveness in addressing key challenges in energy-efficient and reliable cloud computing for mission-critical applications.
{"title":"Improving energy efficiency and fault tolerance of mission-critical cloud task scheduling: A mixed-integer linear programming approach","authors":"Mohammadreza Saberikia , Hamed Farbeh , Mahdi Fazeli","doi":"10.1016/j.suscom.2024.101068","DOIUrl":"10.1016/j.suscom.2024.101068","url":null,"abstract":"<div><div>Cloud services have become indispensable in critical sectors such as healthcare, drones, digital twins, and autonomous vehicles, providing essential infrastructure for data processing and real-time analytics. These systems operate across multiple layers, including edge, fog, and cloud, requiring efficient resource management to ensure reliability and energy efficiency. However, increasing computational demands have led to rising energy consumption and frequent faults in cloud data centers. Inefficient task scheduling exacerbates these issues, causing resource overutilization, execution delays, and redundant processing. Current approaches struggle to optimize energy consumption, execution time, and fault tolerance simultaneously. While some methods offer partial solutions, they suffer from high computational complexity and fail to effectively balance the workloads or manage redundancy. Therefore, a comprehensive task scheduling solution is needed for mission-critical applications. In this article, we introduce a novel scheduling algorithm based on Mixed Integer Linear Programming (MILP) that optimizes task allocation across edge, fog, and cloud environments. Our solution reduces energy consumption, execution time, and failure rates while ensuring balanced distribution of computational loads across virtual machines. Additionally, it incorporates a fault tolerance mechanism that reduces the overlap between primary and backup tasks by distributing them across multiple availability zones. The scheduler’s efficiency is further enhanced by a custom-designed heuristic, ensuring scalability and practical applicability. 
The proposed MILP-based scheduler demonstrates significant average improvements over the best state-of-the-art algorithms evaluated. It achieves a 9.63% increase in task throughput, reduces energy consumption by 18.20%, shortens execution times by 9.35%, and lowers failure probabilities by 11.50% across all layers of the distributed cloud system. These results highlight the scheduler’s effectiveness in addressing key challenges in energy-efficient and reliable cloud computing for mission-critical applications.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101068"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143096460","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
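The core assignment problem the MILP optimizes can be sketched in miniature: place each task on the edge, fog, or cloud layer to minimise a weighted sum of energy and execution time. The costs and weights below are invented, and brute force stands in for a real MILP solver (the paper uses an actual solver plus a custom heuristic at scale; fault-tolerance and load-balancing constraints are omitted here).

```python
from itertools import product

# Exhaustive search over layer assignments for a 3-task toy instance.
# (energy mJ, time ms) per task per layer -- assumed illustrative numbers.
tasks = ["t1", "t2", "t3"]
layers = ["edge", "fog", "cloud"]
cost = {
    "t1": {"edge": (2, 5), "fog": (3, 3), "cloud": (6, 1)},
    "t2": {"edge": (1, 9), "fog": (2, 4), "cloud": (5, 2)},
    "t3": {"edge": (4, 6), "fog": (3, 5), "cloud": (7, 2)},
}
W_ENERGY, W_TIME = 1.0, 0.5   # assumed objective weights

def objective(assign):
    return sum(W_ENERGY * cost[t][l][0] + W_TIME * cost[t][l][1]
               for t, l in zip(tasks, assign))

best = min(product(layers, repeat=len(tasks)), key=objective)
print(dict(zip(tasks, best)), round(objective(best), 1))
```

A MILP formulation expresses the same objective with binary placement variables and adds linear constraints (capacity, deadlines, backup-task separation across availability zones), which is what makes it tractable for solvers at realistic scale.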
Wildlife trafficking, a significant global issue driven by unsubstantiated medical claims and predatory lifestyles that can lead to zoonotic diseases, involves the illegal trade of endangered and protected species. While IoT-based solutions exist to make wildlife monitoring more widespread and precise, they come with trade-offs. For instance, UAVs cover large areas but cannot detect poaching in real time once their power is drained; similarly, fitting RFID collars on all wildlife is impractical. A wildlife monitoring system should be expeditious, vigilant, and efficient. We therefore propose a scalable, motion-sensitive IoT-based wildlife monitoring framework that leverages distributed edge analytics and fog computing and requires no animal contact. The framework includes (1) Motion Sensing Units (MSUs); (2) Actuating and Processing Units (APUs), each containing a camera, a processing unit (such as a single-board computer), and a servo motor; and (3) a Hub containing a processing unit. For communication across these components, ESP-NOW, Apache Kafka, and MQTT were employed. Tailored applications (e.g., rare-species detection using ML) can then be deployed on these components. This paper details the framework’s implementation, validated through tests in semi-forest and dense-forest environments. The system achieved real-time monitoring, defined as detecting motion, turning the camera, capturing an image, and transmitting it to the Hub. We also provide a detailed model for implementing the framework, supported by 2800 simulated architectures. These simulations optimize device selection for wildlife monitoring based on latency, cost, and energy consumption, contributing to conservation efforts.
{"title":"An energy efficient fog-based internet of things framework to combat wildlife poaching","authors":"Rahul Siyanwal , Arun Agarwal , Satish Narayana Srirama","doi":"10.1016/j.suscom.2024.101070","DOIUrl":"10.1016/j.suscom.2024.101070","url":null,"abstract":"<div><div>Wildlife trafficking, a significant global issue driven by unsubstantiated medical claims and predatory lifestyle that can lead to zoonotic diseases, involves the illegal trade of endangered and protected species. While IoT-based solutions exist to make wildlife monitoring more widespread and precise, they come with trade-offs. For instance, UAVs cover large areas but cannot detect poaching in real-time once their power is drained. Similarly, using RFID collars on all wildlife is impractical. The wildlife monitoring system should be expeditious, vigilant, and efficient. Therefore, we propose a scalable, motion-sensitive IoT-based wildlife monitoring framework that leverages distributed edge analytics and fog computing, requiring no animal contact. The framework includes 1. Motion Sensing Units (MSUs), 2. Actuating and Processing Units (APUs) containing a camera, a processing unit (such as a single-board computer), and a servo motor, and 3. Hub containing a processing unit. For communication across these components, ESP-NOW, Apache Kafka, and MQTT were employed. Tailored applications (e.g. rare species detection utilizing ML) can then be deployed on these components. This paper details the framework’s implementation, validated through tests in semi-forest and dense forest environments. The system achieved real-time monitoring, defined as a procedure of detecting motion, turning the camera, capturing an image, and transmitting it to the Hub. We also provide a detailed model for implementing the framework, supported by 2800 simulated architectures. 
These simulations optimize device selection for wildlife monitoring based on latency, cost, and energy consumption, contributing to conservation efforts.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101070"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143096402","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
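A toy version of the device-selection step in those simulations might look like the following: filter candidate processing units by a latency bound, then pick the cheapest-to-run feasible one. The device names and figures are invented placeholders, not the paper's measured values or its actual selection procedure.

```python
# Constraint-then-minimise device selection: keep devices meeting the
# latency bound, then minimise energy draw (cost breaks ties).
devices = [
    {"name": "sbc_small", "latency_ms": 220, "energy_mw": 900,  "cost_usd": 15},
    {"name": "sbc_mid",   "latency_ms": 140, "energy_mw": 1600, "cost_usd": 35},
    {"name": "sbc_big",   "latency_ms": 60,  "energy_mw": 4200, "cost_usd": 90},
]
LATENCY_BOUND_MS = 150   # assumed real-time requirement

feasible = [d for d in devices if d["latency_ms"] <= LATENCY_BOUND_MS]
choice = min(feasible, key=lambda d: (d["energy_mw"], d["cost_usd"]))
print(choice["name"])
```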
Energy constraints are a major issue in radio-based communication in Wireless Sensor Networks, where each sensor node has a limited amount of power. Conventional clustering and optimization methods are poorly suited to dynamic conditions, leading to premature energy drainage and reduced network lifetime. In this research, a novel Deep Reinforcement Learning-Enhanced Hybrid African Vulture and Aquila Optimizer is proposed that tunes clustering and energy-related parameters in real time. The proposed model optimizes Wireless Sensor Networks by using Deep Reinforcement Learning to adjust cluster formation dynamically based on real-time data, leading to efficient energy utilization across all sensor nodes. It combines the best properties of the Aquila Optimizer and the African Vulture Optimizer to improve network lifetime and energy consumption: network lifetime, one of the most crucial characteristics, is optimized by the global search of the African Vulture Optimizer, while energy consumption is reduced by the localized search of the Aquila Optimizer. The presented African Vulture and Aquila model outperforms existing methods based on conventional optimization, showing a 20 % improvement in energy efficiency and faster convergence with better robustness while preserving network scalability. The proposed approach is well suited to scalable WSNs deployed in environments such as smart cities and IoT systems, where timely adaptation is essential.
{"title":"Deep reinforcement learning and enhanced optimization for real-time energy management in wireless sensor networks","authors":"Vidhya Sachithanandam , Jessintha D. , Balaji V.S. , Mathankumar Manoharan","doi":"10.1016/j.suscom.2024.101071","DOIUrl":"10.1016/j.suscom.2024.101071","url":null,"abstract":"<div><div>Constraints are a major issue in radio-based communication in Wireless Sensor Networks, where each sensor node has a limited amount of power. Conventional clustering and optimization methods have been inappropriate for dynamic conditions which lead to timely energy drainage and reduce the network lifetime. In this research, the novel Deep Reinforcement Learning-Enhanced Hybrid African Vulture and Aquila Optimizer has been proposed that optimizes the dynamic clustering and energy-based parameters in real time. The proposed model is designed for optimizing the Wireless Sensor Networks, by including Deep Reinforcement Learning to adjust the dynamic formation of the base of the cluster on real-time data which leads to efficient energy utilization among all the sensor nodes. It combines the best properties of the Aquila and African Vulture Optimizer to optimize the network lifetime and energy consumption. The network lifetime, which is one of the most crucial characteristics, is optimized by using the global search algorithm of African Vulture Optimiser. In contrast, it is optimized by the localized search of Aquila optimizer to reduce energy consumption. The presented novel African Vulture and Aquila model outperforms the existing methods used convention-based optimization methods. It shows a 20 % improvement in energy efficiency and faster convergence with better robustness while keeping the network scalability. 
The proposed approach is perfectly suited for the scalable WSNs which are mainly used in the environment such as smart cities and IoT systems where a timely adaptation process is inevitable.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101071"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143096959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
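The global-search-plus-local-search division of labour described above can be sketched as a two-phase minimisation of a toy energy cost: coarse random sampling stands in for the African Vulture Optimizer's exploration, and shrinking local perturbations stand in for the Aquila Optimizer's exploitation. The objective and phase mechanics are invented for illustration, not the paper's actual model.

```python
import random

# Explore/exploit sketch: global sampling seeds a local refinement loop.
def energy_cost(x):
    return (x - 2.0) ** 2 + 0.5   # toy cost with minimum 0.5 at x = 2

random.seed(0)
# Phase 1 -- global exploration: coarse sampling of the search space
candidates = [random.uniform(-10, 10) for _ in range(200)]
best = min(candidates, key=energy_cost)
# Phase 2 -- local exploitation: shrinking perturbations around the incumbent
step = 1.0
for _ in range(100):
    trial = best + random.uniform(-step, step)
    if energy_cost(trial) < energy_cost(best):
        best = trial
    step *= 0.95

print(round(best, 2), round(energy_cost(best), 2))
```

The design choice the hybrid exploits is that global sampling avoids poor local minima while the local phase delivers precision the global phase cannot afford.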
Pub Date: 2025-01-01 | DOI: 10.1016/j.suscom.2024.101051
Ankica Barišić , Jácome Cunha , Ivan Ruchkin , Ana Moreira , João Araújo , Moharram Challenger , Dušan Savić , Vasco Amaral
Supporting sustainability through modelling and analysis has become an active area of research in Software Engineering. Therefore, it is important and timely to survey the current state of the art in sustainability in Cyber-Physical Systems (CPS), one of the most rapidly evolving classes of complex software systems. This work presents the findings of a Systematic Mapping Study (SMS) that aims to identify key primary studies reporting on CPS modelling approaches that address sustainability over the last 10 years. Our literature search retrieved 2209 papers, of which 104 primary studies were deemed relevant for a detailed characterisation. These studies were analysed based on nine research questions designed to extract information on sustainability attributes, methods, models/meta-models, metrics, processes, and tools used to improve the sustainability of CPS. These questions also aimed to gather data on domain-specific modelling approaches and relevant application domains. The final results report findings for each of our questions, highlight interesting correlations among them, and identify literature gaps worth investigating in the near future.
{"title":"Modelling sustainability in cyber–physical systems: A systematic mapping study","authors":"Ankica Barišić , Jácome Cunha , Ivan Ruchkin , Ana Moreira , João Araújo , Moharram Challenger , Dušan Savić , Vasco Amaral","doi":"10.1016/j.suscom.2024.101051","DOIUrl":"10.1016/j.suscom.2024.101051","url":null,"abstract":"<div><div>Supporting sustainability through modelling and analysis has become an active area of research in Software Engineering. Therefore, it is important and timely to survey the current state of the art in sustainability in Cyber-Physical Systems (CPS), one of the most rapidly evolving classes of complex software systems. This work presents the findings of a Systematic Mapping Study (SMS) that aims to identify key primary studies reporting on CPS modelling approaches that address sustainability <em>over the last 10 years</em>. Our literature search retrieved 2209 papers, of which 104 primary studies were deemed relevant for a detailed characterisation. These studies were analysed based on nine research questions designed to extract information on sustainability attributes, methods, models/meta-models, metrics, processes, and tools used to improve the sustainability of CPS. These questions also aimed to gather data on domain-specific modelling approaches and relevant application domains. 
The final results report findings for each of our questions, highlight interesting correlations among them, and identify literature gaps worth investigating in the near future.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101051"},"PeriodicalIF":3.8,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143135635","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-12-27 | DOI: 10.1016/j.suscom.2024.101075
Kruti Sutariya , C. Menaka , Mohammad Shahid , Sneha Kashyap , Deeksha Choudhary , Sumitra Padmanabhan
The agricultural industry is critical to guaranteeing food security and sustainability, and technological improvements have created new opportunities for enhancing farming operations. Nano-grids, or small-scale decentralized energy systems, are a viable response to agriculture's energy challenges. This study investigates the integration of AI technologies into cloud computing frameworks to empower agricultural nano-grids. We propose Dragon Fruit Fly Optimization algorithms (D-FF) for energy management in nano-grid operations with sustainable farming technology. The proposed approach's efficacy is evaluated using simulations and real-world situations in agricultural environments. The results show that the nano-grid supports agricultural activities and improves Accuracy (96 %), F1-Score (93 %), Precision (91 %), and Recall (92 %), with less energy wasted and lower operating expenses. By advancing smart-agriculture techniques, these results enable more dependable and effective energy management in the agricultural sector.
{"title":"Leveraging AI in cloud computing to enhance nano grid operations and performance in agriculture","authors":"Kruti Sutariya , C. Menaka , Mohammad Shahid , Sneha Kashyap , Deeksha Choudhary , Sumitra Padmanabhan","doi":"10.1016/j.suscom.2024.101075","DOIUrl":"10.1016/j.suscom.2024.101075","url":null,"abstract":"<div><div>The agricultural industry is critical to guaranteeing food security and sustainability, yet technological improvements have created new opportunities for enhancing farming operations. Nano-grids, or small-scale decentralized energy systems, are a viable response to agriculture's energy challenges.This study aims to investigate the integration of AI technologies into cloud computing frameworks to empower agricultural nano-grids. We propose Dragon Fruit Fly Optimization algorithms (D-FF) for energy management in Nano-grids operations with sustainable farming technology.The proposed approach's efficacy is evaluated using simulations and real-world situations in agricultural environments.The results show that the nano-grid supports agricultural activities as well as improves Accuracy (96 %), F1-Score (93 %), Precision (91 %), and Recall (92 %) with less energy wasted along with lower operating expenses.By developing smart agriculture techniques, more dependable and effective energy management in the agricultural sector is made possible by the results.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"46 ","pages":"Article 101075"},"PeriodicalIF":3.8,"publicationDate":"2024-12-27","publicationTypes":"Journal 
Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143172631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-11-23 | DOI: 10.1016/j.suscom.2024.101054
Aml G. AbdElkader , Hanaa ZainEldin , Mahmoud M. Saafan
Wind energy is a crucial renewable resource that supports sustainable development and reduces carbon emissions. However, accurate wind power forecasting is challenging due to the inherent variability in wind patterns. This paper addresses these challenges by developing and evaluating several machine learning (ML) and deep learning (DL) models to enhance wind power forecasting accuracy. Traditional ML models, including Random Forest, k-nearest Neighbors, Ridge Regression, LASSO, Support Vector Regression, and Elastic Net, are compared with advanced DL models, such as Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM), Stacked LSTM, Graph Convolutional Networks (GCN), Temporal Convolutional Networks (TCN), and the Informer network, which is well-suited for long-sequence forecasting and large, sparse datasets. Recognizing the complexities of wind power forecasting, such as the need for high-resolution meteorological data and the limitations of ML models like overfitting and computational complexity, a novel hybrid approach is proposed. This approach uses hybrid RNN-LSTM models optimized through grid-search cross-validation (GS-CV). The models were trained and validated on a SCADA dataset from a Turkish wind farm, comprising 50,530 instances. Data preprocessing included cleaning, encoding, and normalization, with 70 % of the dataset allocated for training and 30 % for validation. Model performance was evaluated using key metrics such as R², MSE, MAE, RMSE, and MedAE. The proposed hybrid RNN-LSTM models achieved outstanding results, with the RNN-LSTM model attaining an R² of 99.99 %, significantly outperforming other models. These results demonstrate the effectiveness of the hybrid approach and the Informer network in improving wind power forecasting accuracy, contributing to grid stability, and facilitating the broader adoption of sustainable energy solutions. The proposed model also achieved comparable or superior performance relative to state-of-the-art methods.
{"title":"Optimizing wind power forecasting with RNN-LSTM models through grid search cross-validation","authors":"Aml G. AbdElkader , Hanaa ZainEldin , Mahmoud M. Saafan","doi":"10.1016/j.suscom.2024.101054","DOIUrl":"10.1016/j.suscom.2024.101054","url":null,"abstract":"<div><div>Wind energy is a crucial renewable resource that supports sustainable development and reduces carbon emissions. However, accurate wind power forecasting is challenging due to the inherent variability in wind patterns. This paper addresses these challenges by developing and evaluating some machine learning (ML) and deep learning (DL) models to enhance wind power forecasting accuracy. Traditional ML models, including Random Forest, k-nearest Neighbors, Ridge Regression, LASSO, Support Vector Regression, and Elastic Net, are compared with advanced DL models, such as Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM), Stacked LSTM, Graph Convolutional Networks (GCN), Temporal Convolutional Networks (TCN), and the Informer network, which is well-suited for long-sequence forecasting and large, sparse datasets. Recognizing the complexities of wind power forecasting, such as the need for high-resolution meteorological data and the limitations of ML models like overfitting and computational complexity, a novel hybrid approach is proposed. This approach uses hybrid RNN-LSTM models optimized through GS-CV. The models were trained and validated on a SCADA dataset from a Turkish wind farm, comprising 50,530 instances. Data preprocessing included cleaning, encoding, and normalization, with 70 % of the dataset allocated for training and 30 % for validation. Model performance was evaluated using key metrics such as R², MSE, MAE, RMSE, and MedAE. The proposed hybrid RNN-LSTM Models achieved outstanding results, with the RNN-LSTM model attaining an R² of 99.99 %, significantly outperforming other models. 
These results demonstrate the effectiveness of the hybrid approach and the Informer network in improving wind power forecasting accuracy, contributing to grid stability, and facilitating the broader adoption of sustainable energy solutions. The proposed model also achieved superior comparable performance when compared to state-of-the-art methods.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"45 ","pages":"Article 101054"},"PeriodicalIF":3.8,"publicationDate":"2024-11-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142745129","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
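The GS-CV loop itself is model-agnostic, and its mechanics can be shown without a neural network: below, the "model" is one-step exponential smoothing of a synthetic wind-like series, and the grid is over its smoothing factor. Only the grid-search-with-cross-validation pattern is illustrated; the paper's actual RNN-LSTM, dataset, and hyperparameter grid are not reproduced.

```python
import math

# Synthetic slowly varying series with small deterministic pseudo-noise,
# standing in for a wind-power signal.
series = [math.sin(0.3 * t) + 0.1 * ((t * 7919) % 13 - 6) / 6
          for t in range(120)]

def one_step_mse(data, alpha):
    # One-step-ahead forecast error of exponential smoothing with factor alpha.
    level, err, n = data[0], 0.0, 0
    for x in data[1:]:
        err += (x - level) ** 2        # forecast for x is the current level
        level = alpha * x + (1 - alpha) * level
        n += 1
    return err / n

def cv_score(alpha, folds=4):
    # Average validation error across contiguous folds of the series.
    size = len(series) // folds
    scores = [one_step_mse(series[i * size:(i + 1) * size], alpha)
              for i in range(folds)]
    return sum(scores) / folds

grid = [0.1, 0.3, 0.5, 0.7, 0.9]                 # hyperparameter grid
best_alpha = min(grid, key=cv_score)             # grid search over CV score
print(best_alpha)
```

In the paper's setting the same outer loop enumerates network hyperparameters (layers, units, learning rate) and averages fold-wise validation error before retraining the winner on the full training split.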
Pub Date: 2024-11-20 | DOI: 10.1016/j.suscom.2024.101052
Namita K. Shinde, Vinod H. Patil
There are two main design issues in Wireless Sensor Network (WSN) routing: energy optimization and security provision. Owing to the energy limitations of wireless sensor devices, high energy usage must be properly addressed to enhance network efficiency. Several research works have addressed the routing issue in WSNs with security concerns and network lifetime enhancement. However, network overhead and routing traffic are obstacles still not tackled by the existing models. Hence, to enhance routing performance, a new cluster-based routing model is introduced in this work that comprises two phases: Cluster Head (CH) selection and routing. In the first phase, a hybrid optimization model, the Tasmanian Integrated Coot Optimization Algorithm (TICOA), is proposed for selecting the optimal CH under the constraints of security, energy, trust, delay, and distance. Subsequently, the routing process takes place under the constraints of trust and link quality, which enhances the network lifetime of the WSN. Finally, simulation results show the performance of the proposed cluster-based routing in terms of different performance measures. The conventional systems received lower trust ratings, specifically BOA=0.489, BSA=0.475, GA=0.493, TDO=0.418, COOT=0.439, TSGWO=0.427, and P-WWO=0.408, whereas the trust value of the TICOA technique is 0.683.
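The constraint-driven CH selection described above can be illustrated with a minimal sketch. The weights and node attributes below are hypothetical, and TICOA's actual population-based search is replaced by a direct argmax over a weighted fitness; this only shows the shape of such an objective, not the paper's formulation.

```python
def ch_fitness(node, weights):
    """Weighted CH fitness: higher residual energy and trust help,
    larger delay and distance to the sink hurt (weights are illustrative)."""
    return (weights["energy"] * node["energy"]
            + weights["trust"] * node["trust"]
            - weights["delay"] * node["delay"]
            - weights["distance"] * node["distance"])

def select_cluster_head(nodes, weights):
    """Pick the node maximizing the fitness; TICOA would search this
    objective iteratively instead of enumerating."""
    return max(nodes, key=lambda n: ch_fitness(n, weights))

# Hypothetical normalized node attributes and weights.
weights = {"energy": 0.4, "trust": 0.3, "delay": 0.15, "distance": 0.15}
nodes = [
    {"id": 1, "energy": 0.95, "trust": 0.7, "delay": 0.2, "distance": 0.5},
    {"id": 2, "energy": 0.6,  "trust": 0.9, "delay": 0.1, "distance": 0.2},
    {"id": 3, "energy": 0.4,  "trust": 0.5, "delay": 0.4, "distance": 0.7},
]
ch = select_cluster_head(nodes, weights)
# node 1 wins: high residual energy outweighs its longer distance
```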
{"title":"Secured and energy efficient cluster based routing in WSN via hybrid optimization model, TICOA","authors":"Namita K. Shinde, Vinod H. Patil","doi":"10.1016/j.suscom.2024.101052","DOIUrl":"10.1016/j.suscom.2024.101052","url":null,"abstract":"<div><div>There are two main design issues in Wireless Sensor Network (WSN) routing: energy optimization and security provision. Owing to the energy limitations of wireless sensor devices, high energy usage must be properly addressed to enhance network efficiency. Several research works have addressed the routing issue in WSNs with security concerns and network lifetime enhancement. However, network overhead and routing traffic are obstacles still not tackled by the existing models. Hence, to enhance routing performance, a new cluster-based routing model is introduced in this work that comprises two phases: Cluster Head (CH) selection and routing. In the first phase, a hybrid optimization model, the Tasmanian Integrated Coot Optimization Algorithm (TICOA), is proposed for selecting the optimal CH under the constraints of security, energy, trust, delay, and distance. Subsequently, the routing process takes place under the constraints of trust and link quality, which enhances the network lifetime of the WSN. Finally, simulation results show the performance of the proposed cluster-based routing in terms of different performance measures. The conventional systems received lower trust ratings, specifically BOA=0.489, BSA=0.475, GA=0.493, TDO=0.418, COOT=0.439, TSGWO=0.427, and P-WWO=0.408, whereas the trust value of the TICOA technique is 0.683.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"44 ","pages":"Article 101052"},"PeriodicalIF":3.8,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142699367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-11-09 | DOI: 10.1016/j.suscom.2024.101053
P. Jagannadha Varma, Srinivasa Rao Bendi
With the rapid development of computing networks, cloud computing (CC) enables the deployment of large-scale applications and meets the increased rate of computational demands. Task scheduling is an essential process in CC: tasks must be effectively scheduled across the Virtual Machines (VMs) to increase resource usage and diminish the makespan. In this paper, a multi-objective optimization called Al-Biruni Earth Namib Beetle Optimization (BENBO), combined with a Bidirectional Long Short-Term Memory (Bi-LSTM) network and termed BENBO+Bi-LSTM, is developed for task scheduling. The user task is subjected to the multi-objective BENBO, in which parameters such as makespan, computational cost, reliability, and predicted energy are used to schedule the task. Simultaneously, the user task is fed to Bi-LSTM-based task scheduling, in which VM parameters such as average computation cost, Earliest Starting Time (EST), task priority, and Earliest Finishing Time (EFT), as well as task parameters such as bandwidth and memory capacity, are utilized to schedule the task. The task scheduling outcomes from the multi-objective BENBO and the Bi-LSTM are then fused to obtain the final schedule with less makespan and resource usage. The predicted energy, resource utilization, and makespan are considered to validate the BENBO+Bi-LSTM-based task scheduling, which offered optimal values of 0.669 J, 0.535, and 0.381, respectively.
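The fusion of the two scheduling outcomes can be sketched as a weighted combination of per-task VM cost tables, one from the optimizer and one from the learner. The cost values, the fusion weight `alpha`, and the table layout are illustrative assumptions for the fusion step only, not the paper's formulation.

```python
def fuse_schedules(opt_cost, learned_cost, alpha=0.5):
    """Blend optimizer-derived and learner-predicted task-to-VM costs,
    then assign each task to its cheapest VM under the fused cost."""
    assignment = {}
    for task in opt_cost:
        fused = {vm: alpha * opt_cost[task][vm]
                     + (1 - alpha) * learned_cost[task][vm]
                 for vm in opt_cost[task]}
        assignment[task] = min(fused, key=fused.get)
    return assignment

# Hypothetical cost tables (lower is better), e.g. normalized
# combinations of makespan, energy, and resource-usage estimates.
opt_cost     = {"t1": {"vm1": 3.0, "vm2": 5.0}, "t2": {"vm1": 4.0, "vm2": 2.0}}
learned_cost = {"t1": {"vm1": 2.5, "vm2": 6.0}, "t2": {"vm1": 5.0, "vm2": 1.5}}
plan = fuse_schedules(opt_cost, learned_cost)
# → {"t1": "vm1", "t2": "vm2"}
```

With `alpha=0.5` the two signals are weighted equally; tilting `alpha` toward 0 or 1 trusts the learned predictions or the optimizer more, respectively.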
{"title":"Multiobjective hybrid Al-Biruni Earth Namib Beetle Optimization and deep learning based task scheduling in cloud computing","authors":"P. Jagannadha Varma, Srinivasa Rao Bendi","doi":"10.1016/j.suscom.2024.101053","DOIUrl":"10.1016/j.suscom.2024.101053","url":null,"abstract":"<div><div>With the rapid development of computing networks, cloud computing (CC) enables the deployment of large-scale applications and meets the increased rate of computational demands. Task scheduling is an essential process in CC: tasks must be effectively scheduled across the Virtual Machines (VMs) to increase resource usage and diminish the makespan. In this paper, a multi-objective optimization called Al-Biruni Earth Namib Beetle Optimization (BENBO), combined with a Bidirectional Long Short-Term Memory (Bi-LSTM) network and termed BENBO+Bi-LSTM, is developed for task scheduling. The user task is subjected to the multi-objective BENBO, in which parameters such as makespan, computational cost, reliability, and predicted energy are used to schedule the task. Simultaneously, the user task is fed to Bi-LSTM-based task scheduling, in which VM parameters such as average computation cost, Earliest Starting Time (EST), task priority, and Earliest Finishing Time (EFT), as well as task parameters such as bandwidth and memory capacity, are utilized to schedule the task. The task scheduling outcomes from the multi-objective BENBO and the Bi-LSTM are then fused to obtain the final schedule with less makespan and resource usage. The predicted energy, resource utilization, and makespan are considered to validate the BENBO+Bi-LSTM-based task scheduling, which offered optimal values of 0.669 J, 0.535, and 0.381, respectively.</div></div>","PeriodicalId":48686,"journal":{"name":"Sustainable Computing-Informatics & Systems","volume":"44 ","pages":"Article 101053"},"PeriodicalIF":3.8,"publicationDate":"2024-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142699366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}