{"title":"A Machine Learning-Based Approach to Estimate the CPU-Burst Time for Processes in the Computational Grids","authors":"T. Helmy, Sadam Al-Azani, Omar Bin-Obaidellah","doi":"10.1109/AIMS.2015.11","DOIUrl":null,"url":null,"abstract":"The implementation of CPU-Scheduling algorithms such as Shortest-Job-First (SJF) and Shortest Remaining Time First (SRTF) is relying on knowing the length of the CPU-bursts for processes in the ready queue. There are several methods to predict the length of the CPU-bursts, such as exponential averaging method, however these methods may not give an accurate or reliable predicted values. In this paper, we will propose a Machine Learning (ML) based approach to estimate the length of the CPU-bursts for processes. The proposed approach aims to select the most significant attributes of the process using feature selection techniques and then predicts the CPU-burst for the process in the grid. ML techniques such as Support Vector Machine (SVM) and K-Nearest Neighbors (K-NN), Artificial Neural Networks (ANN) and Decision Trees (DT) are used to test and evaluate the proposed approach using a grid workload dataset named \"GWA-T-4 Auver Grid\". The experimental results show that there is a strength linear relationship between the process attributes and the burst CPU time. Moreover, K-NN performs better in nearly all approaches in terms of CC and RAE. Furthermore, applying attribute selection techniques improves the performance in terms of space, time and estimation.","PeriodicalId":121874,"journal":{"name":"2015 3rd International Conference on Artificial Intelligence, Modelling and Simulation (AIMS)","volume":"83 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 3rd International Conference on Artificial Intelligence, Modelling and Simulation (AIMS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIMS.2015.11","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 14
Abstract
The implementation of CPU-scheduling algorithms such as Shortest-Job-First (SJF) and Shortest-Remaining-Time-First (SRTF) relies on knowing the length of the CPU bursts of the processes in the ready queue. Several methods exist to predict CPU-burst lengths, such as exponential averaging; however, these methods may not give accurate or reliable predictions. In this paper, we propose a Machine Learning (ML) based approach to estimate the length of a process's CPU burst. The proposed approach first selects the most significant attributes of the process using feature-selection techniques and then predicts the CPU burst of the process in the grid. ML techniques such as Support Vector Machines (SVM), K-Nearest Neighbors (K-NN), Artificial Neural Networks (ANN), and Decision Trees (DT) are used to test and evaluate the proposed approach on a grid workload dataset named "GWA-T-4 AuverGrid". The experimental results show a strong linear relationship between the process attributes and the CPU-burst time. Moreover, K-NN performs best in nearly all settings in terms of the correlation coefficient (CC) and relative absolute error (RAE). Furthermore, applying attribute-selection techniques improves performance in terms of space, time, and estimation accuracy.
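For context, the exponential averaging baseline referenced in the abstract is the standard recurrence from operating-systems texts: the next predicted burst length is a weighted combination of the most recent measured burst and the previous prediction,

\[
\tau_{n+1} = \alpha\, t_n + (1 - \alpha)\, \tau_n, \qquad 0 \le \alpha \le 1,
\]

where \(t_n\) is the length of the \(n\)-th measured CPU burst and \(\tau_n\) its prediction. A value of \(\alpha\) near 1 weights recent history heavily; near 0, it trusts accumulated history. This single-parameter recurrence is the predictor the ML models are compared against.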
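To make the described pipeline concrete, below is a minimal sketch of the two-stage approach (feature selection, then regression with K-NN, the best-performing model in the abstract). The attribute matrix, synthetic target, number of selected features, and neighbor count are illustrative assumptions, not the actual GWA-T-4 AuverGrid schema or the authors' configuration; scikit-learn is assumed as the ML library.

```python
# Sketch: filter-style feature selection followed by K-NN regression
# to estimate CPU-burst time, evaluated with the two metrics the
# abstract reports: correlation coefficient (CC) and relative
# absolute error (RAE). All data here is synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Hypothetical process attributes (e.g., requested time, memory, queue).
X = rng.random((1000, 8))
# Synthetic CPU-burst target with a linear dependence on two attributes,
# mirroring the strong linear relationship reported in the abstract.
y = 5.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0.0, 0.1, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1: keep only the attributes most correlated with the target.
selector = SelectKBest(score_func=f_regression, k=4).fit(X_train, y_train)
X_train_sel = selector.transform(X_train)
X_test_sel = selector.transform(X_test)

# Stage 2: K-NN regression predicts a burst as the mean of its neighbors.
knn = KNeighborsRegressor(n_neighbors=5).fit(X_train_sel, y_train)
pred = knn.predict(X_test_sel)

# CC: Pearson correlation between actual and predicted bursts.
cc = np.corrcoef(y_test, pred)[0, 1]
# RAE: total absolute error relative to a predict-the-mean baseline.
rae = np.abs(y_test - pred).sum() / np.abs(y_test - y_test.mean()).sum()
print(f"CC={cc:.3f}  RAE={rae:.3%}")
```

Reducing the attribute set before fitting is what yields the space and time savings the abstract mentions: the regressor stores and searches a smaller feature space at prediction time.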