Automatically Detect Software Security Vulnerabilities Based on Natural Language Processing Techniques and Machine Learning Algorithms
Pub Date: 2022-05-11 | DOI: 10.5614/itbj.ict.res.appl.2022.16.1.5
Donghwang Cho, Vu Ngoc Son, D. Duc
Software vulnerabilities pose a serious problem because cyber-attackers often compromise systems by exploiting them. Software vulnerabilities can be detected with two main approaches: i) signature-based detection, which compares code against a list of known security vulnerabilities; and ii) behavior-analysis-based detection, which applies classification algorithms to the software code. To improve the accuracy of software vulnerability detection, this study proposes a new approach that combines a technique for analyzing and standardizing software code with the random forest (RF) classification algorithm. The novelty and advantage of the proposed method is that, instead of trying to define the behaviors of functions explicitly, it uses the Word2vec natural language processing model to normalize function code and extract features from it in order to identify abnormal function behavior. Finally, to detect security vulnerabilities in the functions, the study uses a popular and effective supervised machine learning algorithm.
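The pipeline the abstract describes, embedding function code with Word2vec and classifying the resulting vectors with a random forest, can be sketched roughly as follows. The library choices (gensim, scikit-learn), the naive tokenizer, and the averaging of token vectors into one function vector are illustrative assumptions, not details given by the paper.

```python
# Hedged sketch: Word2vec features over function tokens + random forest classifier.
import numpy as np
from gensim.models import Word2Vec
from sklearn.ensemble import RandomForestClassifier

def tokenize(source: str) -> list[str]:
    # Naive tokenizer for illustration; real code tokenization would use a
    # language-aware lexer.
    return source.replace("(", " ( ").replace(")", " ) ").split()

# functions: list of function source strings; labels: 1 = vulnerable, 0 = safe
def train(functions: list[str], labels: list[int]):
    corpus = [tokenize(f) for f in functions]
    w2v = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)

    def embed(tokens):
        vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
        return np.mean(vecs, axis=0) if vecs else np.zeros(100)

    X = np.vstack([embed(toks) for toks in corpus])
    clf = RandomForestClassifier(n_estimators=200).fit(X, labels)
    return w2v, clf
```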
{"title":"Automatically Detect Software Security Vulnerabilities Based on Natural Language Processing Techniques and Machine Learning Algorithms","authors":"Donghwang Cho, Vu Ngoc Son, D. Duc","doi":"10.5614/itbj.ict.res.appl.2022.16.1.5","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2022.16.1.5","url":null,"abstract":"Nowadays, software vulnerabilities pose a serious problem, because cyber-attackers often find ways to attack a system by exploiting software vulnerabilities. Detecting software vulnerabilities can be done using two main methods: i) signature-based detection, i.e. methods based on a list of known security vulnerabilities as a basis for contrasting and comparing; ii) behavior analysis-based detection using classification algorithms, i.e., methods based on analyzing the software code. In order to improve the ability to accurately detect software security vulnerabilities, this study proposes a new approach based on a technique of analyzing and standardizing software code and the random forest (RF) classification algorithm. The novelty and advantages of our proposed method are that to determine abnormal behavior of functions in the software, instead of trying to define behaviors of functions, this study uses the Word2vec natural language processing model to normalize and extract features of functions. Finally, to detect security vulnerabilities in the functions, this study proposes to use a popular and effective supervised machine learning algorithm.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2022-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49563461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Wireless Vibration Monitoring System for Milling Process
Pub Date: 2022-04-30 | DOI: 10.5614/itbj.ict.res.appl.2022.16.1.3
M. Fikri, K. Saptaji, Fijai Naja Azmi
Implementing Industry 4.0 in manufacturing is necessary to adapt to rapid technological change. Milling is one of the most common manufacturing processes used to produce engineering products, and the vibration that occurs during milling can disturb the continuity of the process. Wired vibration monitoring systems used in manufacturing suffer from two main drawbacks, cable entanglement and high cost, so a wireless vibration monitoring system was developed to replace them. The wireless monitoring setup consists of three components: a sensor node, a monitoring node, and a base station. Milling experiments with various depths of cut, feed rates, and spindle speeds were conducted to examine the performance of the wireless monitoring system. The results indicate that the wireless system records data similar to the wired system. The wireless vibration monitoring system can identify the effect of milling parameters such as depth of cut, feed rate, and spindle speed on the vibration level. Within the tested parameter range, the effect of depth of cut was more significant than that of spindle speed and feed rate.
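The abstract does not name the radio hardware or protocol, so the following is only an illustrative sketch of a sensor node streaming accelerometer samples to a base station, here over plain UDP with a made-up base-station address and a placeholder sensor driver.

```python
# Illustrative only: generic sensor node streaming vibration samples to a base station.
import json, socket, time

BASE_STATION = ("192.168.1.10", 5005)   # assumed base-station address

def read_accelerometer():
    # Placeholder for the real sensor driver (e.g., an I2C accelerometer).
    return {"x": 0.0, "y": 0.0, "z": 0.0}

def sensor_node(sample_rate_hz: int = 100):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        sample = read_accelerometer()
        sample["t"] = time.time()
        sock.sendto(json.dumps(sample).encode(), BASE_STATION)
        time.sleep(1.0 / sample_rate_hz)
```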
{"title":"Wireless Vibration Monitoring System for Milling Process","authors":"M. Fikri, K. Saptaji, Fijai Naja Azmi","doi":"10.5614/itbj.ict.res.appl.2022.16.1.3","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2022.16.1.3","url":null,"abstract":"The implementation of industrial revolution 4.0 in manufacturing industries is necessary to adapt to the rapid changes of technologies. The milling process is one of the common manufacturing processes applied in the industries to produce engineering products. The vibration that occurs in the milling process can disturb the continuity of the process. The wired vibration monitoring system implemented in the manufacturing process needs to be replaced with the wireless monitoring system. Hence wireless vibration monitoring system is developed to solve the problem with wired monitoring systems where tucked cable and high cost are the main challenges of the wired monitoring system. The wireless monitoring system setup is built using three components: sensor node, monitoring node, and base station. Milling experiments with various depths of cut, feed rate, and spindle speed were conducted to examine the performance of the wireless monitoring system. The results indicate the wireless system shows similar data recorded by the wired system. The wireless vibration monitoring system can identify the effect of milling parameters such as depth of cut, feed rate, and spindle speed on the vibrations level. The effect of cut depth is more significant than spindle speed and feed rate in the defined parameters.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2022-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42135482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predicting the Extent of Sidoarjo Mud Flow Using Remote Sensing
Pub Date: 2022-04-30 | DOI: 10.5614/itbj.ict.res.appl.2022.16.1.4
Wishnumurti Wicaksono, S. Isa
The Sidoarjo mud flow in East Java is a natural phenomenon in which hot mud erupts due to volcanic activity, and it has caused a considerable ecological disaster in the area. In this study, the Modified Normalized Difference Water Index (MNDWI) technique was used to measure the extent of the mudflow area from 2013 to 2020 from Landsat 8 satellite imagery. The study aimed to predict the future extent of the mudflow area at the research site by comparing regression and neural network techniques to find the best approach. An RPROP MLP neural network was then used to predict the mudflow area from 2021 to 2025. The RPROP MLP neural network with three hidden layers and 20 neurons performed best, with an R-squared value of 0.77915565 for training and 0.78321550 for testing.
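MNDWI, the index named in the abstract, is conventionally computed as (Green − SWIR1)/(Green + SWIR1); for Landsat 8 OLI these are band 3 and band 6. A minimal sketch follows; the masking threshold of 0 is a common convention, not necessarily the authors' exact choice.

```python
# MNDWI = (Green - SWIR1) / (Green + SWIR1) on Landsat 8 band arrays.
import numpy as np

def mndwi(green: np.ndarray, swir1: np.ndarray) -> np.ndarray:
    green = green.astype(np.float64)
    swir1 = swir1.astype(np.float64)
    return (green - swir1) / (green + swir1 + 1e-10)  # epsilon avoids divide-by-zero

def water_mask(green: np.ndarray, swir1: np.ndarray, threshold: float = 0.0) -> np.ndarray:
    # Pixels above the threshold are treated as water/mud-covered surface.
    return mndwi(green, swir1) > threshold
```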
{"title":"Predicting the Extent of Sidoarjo Mud Flow Using Remote Sensing","authors":"Wishnumurti Wicaksono, S. Isa","doi":"10.5614/itbj.ict.res.appl.2022.16.1.4","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2022.16.1.4","url":null,"abstract":"The Sidoarjo mud flow in East Java is the result of a natural phenomenon in which hot mudflow occurs due to volcanic activity. The Sidoarjo mud flow resulted in a considerable ecological disaster in the area. In this study, by using the Modification of Normalized Difference Water Index (MNDWI) technique we measured the extension of the mudflow area from 2013 to 2020 using Landsat 8 satellite data imagery. This study is meant to predict the extension of the mud flow area in the research site by comparing regression and neural network techniques in order to find the best approach. The RPROP MLP neural network technique was used to predict the Sidoarjo mud-flowing area in 2021 to 2025. Surprisingly the results of these calculations showed that the RPROP MLP neural network with three hidden layers and 20 neurons performed the best, with an R square value for training of 0.77915565 and for testing of 0.78321550.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2022-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49261513","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A CNN-ELM Classification Model for Automated Tomato Maturity Grading
Pub Date: 2022-04-30 | DOI: 10.5614/itbj.ict.res.appl.2022.16.1.2
J. P. T. Yusiong
Tomatoes are popular around the world due to their high nutritional value. Tomatoes are also one of the world’s most widely cultivated and profitable crops. The distribution and marketing of tomatoes depend highly on their quality. Estimating tomato ripeness is an essential step in determining shelf life and quality. With the abundant supply of tomatoes on the market, it is exceedingly difficult to estimate tomato ripeness using human graders. To address this issue and improve tomato quality inspection and sorting, automated tomato maturity classification models based on different features have been developed. However, current methods heavily rely on human-engineered or handcrafted features. Convolutional neural networks have emerged as the preferred technique for general object recognition problems because they can automatically detect and extract valuable features by directly working on input images. This paper proposes a CNN-ELM classification model for automated tomato maturity grading that combines CNNs’ automated feature learning capabilities with the efficiency of extreme learning machines to perform fast and accurate classification even with limited training data. The results showed that the proposed CNN-ELM model had a classification accuracy of 96.67% and an F1-score of 96.67% in identifying six maturity stages from the test data.
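The extreme learning machine part of the proposed CNN-ELM model admits a compact sketch: hidden weights are drawn at random and kept fixed, and the output weights are obtained in closed form with a pseudo-inverse. The layer size and the CNN feature extractor below are placeholders, not the authors' architecture.

```python
# Minimal ELM classifier head over pre-extracted CNN feature vectors.
import numpy as np

class ELMClassifier:
    def __init__(self, n_hidden: int = 500, seed: int = 0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X: np.ndarray, y: np.ndarray):
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # fixed random weights
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)             # hidden activations
        T = np.eye(n_classes)[y]                      # one-hot targets
        self.beta = np.linalg.pinv(H) @ T             # closed-form output weights
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# Usage: features = cnn_backbone(images)  # pooled CNN feature vectors (assumed)
#        ELMClassifier().fit(train_feats, train_labels).predict(test_feats)
```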
{"title":"A CNN-ELM Classification Model for Automated Tomato Maturity Grading","authors":"J. P. T. Yusiong","doi":"10.5614/itbj.ict.res.appl.2022.16.1.2","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2022.16.1.2","url":null,"abstract":"Tomatoes are popular around the world due to their high nutritional value. Tomatoes are also one of the world’s most widely cultivated and profitable crops. The distribution and marketing of tomatoes depend highly on their quality. Estimating tomato ripeness is an essential step in determining shelf life and quality. With the abundant supply of tomatoes on the market, it is exceedingly difficult to estimate tomato ripeness using human graders. To address this issue and improve tomato quality inspection and sorting, automated tomato maturity classification models based on different features have been developed. However, current methods heavily rely on human-engineered or handcrafted features. Convolutional neural networks have emerged as the preferred technique for general object recognition problems because they can automatically detect and extract valuable features by directly working on input images. This paper proposes a CNN-ELM classification model for automated tomato maturity grading that combines CNNs’ automated feature learning capabilities with the efficiency of extreme learning machines to perform fast and accurate classification even with limited training data. The results showed that the proposed CNN-ELM model had a classification accuracy of 96.67% and an F1-score of 96.67% in identifying six maturity stages from the test data.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2022-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42845431","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mobile Robot Path Planning Optimization Based on Integration of Firefly Algorithm and Cubic Polynomial Equation
Pub Date: 2022-04-30 | DOI: 10.5614/itbj.ict.res.appl.2022.16.1.1
Sura Mazin Ali, Jan Yonan, Omar Alniemi, A. A. Ahmed
Mobile robots are an essential technology in the industrial world, and optimal path planning is essential for their navigation. The firefly algorithm is a promising swarm intelligence tool used in various optimization areas. This study used the firefly algorithm to solve the mobile robot path-planning problem and achieve optimal trajectory planning. The objective of the proposed method is to find collision-free points in the mobile robot's environment and then generate the optimal path based on the firefly algorithm, using the A* algorithm to find the shortest path. The firefly algorithm is applied to specify the optimal control points for the corresponding shortest smooth trajectory of the mobile robot. A cubic polynomial equation is then applied to generate a smooth path from the initial point to the goal point within a specified period. Computer simulation results demonstrate the efficiency of the firefly algorithm in generating optimal mobile robot trajectories in environments of varying complexity.
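The cubic polynomial smoothing step mentioned in the abstract can be illustrated with the standard zero-boundary-velocity cubic between two waypoints; the firefly-selected control points are assumed given, and only the interpolation between a pair of them is shown.

```python
# Cubic polynomial segment with zero start/end velocity between two waypoints.
import numpy as np

def cubic_trajectory(q0: np.ndarray, qf: np.ndarray, T: float, n: int = 100) -> np.ndarray:
    # q(t) = q0 + (3*(t/T)^2 - 2*(t/T)^3) * (qf - q0), so q(0)=q0, q(T)=qf, q'(0)=q'(T)=0.
    t = np.linspace(0.0, T, n)
    s = t / T
    return q0 + (3 * s**2 - 2 * s**3)[:, None] * (qf - q0)

# Example: smooth x-y path from (0, 0) to (4, 3) over 5 seconds.
path = cubic_trajectory(np.array([0.0, 0.0]), np.array([4.0, 3.0]), T=5.0)
```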
{"title":"Mobile Robot Path Planning Optimization Based on Integration of Firefly Algorithm and Cubic Polynomial Equation","authors":"Sura Mazin Ali, Jan Yonan, Omar Alniemi, A. A. Ahmed","doi":"10.5614/itbj.ict.res.appl.2022.16.1.1","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2022.16.1.1","url":null,"abstract":"Mobile Robot is an extremely essential technology in the industrial world. Optimal path planning is essential for the navigation of mobile robots. The firefly algorithm is a very promising tool of Swarm Intelligence, which is used in various optimization areas. This study used the firefly algorithm to solve the mobile robot path-planning problem and achieve optimal trajectory planning. The objective of the proposed method is to find the free-collision-free points in the mobile robot environment and then generate the optimal path based on the firefly algorithm. It uses the A∗ algorithm to find the shortest path. The essential function of use the firefly algorithm is applied to specify the optimal control points for the corresponding shortest smooth trajectory of the mobile robot. Cubic Polynomial equation is applied to generate a smooth path from the initial point to the goal point during a specified period. The results of computer simulation demonstrate the efficiency of the firefly algorithm in generating optimal trajectory of mobile robot in a variable degree of mobile robot environment complexity.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2022-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45195328","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Serious Game Development Model Based on the Game-Based Learning Foundation
Pub Date: 2021-12-28 | DOI: 10.5614/itbj.ict.res.appl.2021.15.3.6
Rickman Roedavan, Bambang Pudjoatmodjo, Y. Siradj, S. Salam, BQ Desy Hardianti
Serious games or applied games are digital games applied in serious fields such as education, advertising, health, business, and the military. Currently, serious game development is mostly based on the Game Development Life Cycle (GDLC) approach. A serious game is a game product with unique characteristics that require a particular approach to its development. This paper proposes a serious game development model adapted from the Game-Based Learning Foundation. This paper’s main contribution is to enhance knowledge in the game development field and game-related application research. The proposed model was validated using the relativism approach and it was used to develop several game prototypes for universities, national companies, and the military.
{"title":"Serious Game Development Model Based on the Game-Based Learning Foundation","authors":"Rickman Roedavan, Bambang Pudjoatmodjo, Y. Siradj, S. Salam, BQ Desy Hardianti","doi":"10.5614/itbj.ict.res.appl.2021.15.3.6","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2021.15.3.6","url":null,"abstract":"Serious games or applied games are digital games applied in serious fields such as education, advertising, health, business, and the military. Currently, serious game development is mostly based on the Game Development Life Cycle (GDLC) approach. A serious game is a game product with unique characteristics that require a particular approach to its development. This paper proposes a serious game development model adapted from the Game-Based Learning Foundation. This paper’s main contribution is to enhance knowledge in the game development field and game-related application research. The proposed model was validated using the relativism approach and it was used to develop several game prototypes for universities, national companies, and the military.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48660331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
New Stereo Vision Algorithm Composition Using Weighted Adaptive Histogram Equalization and Gamma Correction
Pub Date: 2021-12-28 | DOI: 10.5614/itbj.ict.res.appl.2021.15.3.3
A. F. Kadmin, Rostam Affendi, Nurulfajar Abd Manap, Mohd Saad, Nadzrie Nadzrie, Tg. Mohd Faisal
This work presents a new algorithm composition for a stereo vision system to acquire accurate depth measurements from stereo correspondence. Stereo correspondence produced by matching is commonly affected by image noise such as illumination variation, blurry boundaries, and radiometric differences. The proposed algorithm introduces a pre-processing step based on the combination of Contrast Limited Adaptive Histogram Equalization (CLAHE) and Adaptive Gamma Correction Weighted Distribution (AGCWD) with a guided filter (GF). The matching cost of the pre-processed images is computed using the census transform (CT), followed by aggregation using a fixed window and the GF technique. A winner-takes-all (WTA) approach is employed to select the disparity with the minimum aggregated cost, and final refinement uses left-right consistency checking (LR) along with a weighted median filter (WMF) to remove outliers. Compared to several established algorithms on the Middlebury dataset, the proposed algorithm improved accuracy by 31.65% for all pixel errors and by 23.35% for pixel errors in non-occluded regions.
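Of the pipeline stages listed above, the census transform used for the matching cost is the easiest to sketch; the window size below is an assumption, and CLAHE, AGCWD, the guided filter, and the WTA/refinement stages are omitted.

```python
# Census transform: encode each pixel as a bit string of comparisons with its neighbours.
import numpy as np

def census_transform(img: np.ndarray, window: int = 5) -> np.ndarray:
    h, w = img.shape
    r = window // 2
    census = np.zeros((h, w), dtype=np.int64)     # 5x5 window -> 24 bits, fits in int64
    padded = np.pad(img, r, mode="edge")
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[r + dy:r + dy + h, r + dx:r + dx + w]
            census = (census << 1) | (shifted < img).astype(np.int64)
    return census

# The matching cost between left/right census codes is their Hamming distance,
# e.g. bin(int(c_left) ^ int(c_right)).count("1") for a single pixel pair.
```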
{"title":"New Stereo Vision Algorithm Composition Using Weighted Adaptive Histogram Equalization and Gamma Correction","authors":"A. F. Kadmin, Rostam Affendi, Nurulfajar Abd Manap, Mohd Saad, Nadzrie Nadzrie, Tg. Mohd Faisal","doi":"10.5614/itbj.ict.res.appl.2021.15.3.3","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2021.15.3.3","url":null,"abstract":"This work presents the composition of a new algorithm for a stereo vision system to acquire accurate depth measurement from stereo correspondence. Stereo correspondence produced by matching is commonly affected by image noise such as illumination variation, blurry boundaries, and radiometric differences. The proposed algorithm introduces a pre-processing step based on the combination of Contrast Limited Adaptive Histogram Equalization (CLAHE) and Adaptive Gamma Correction Weighted Distribution (AGCWD) with a guided filter (GF). The cost value of the pre-processing step is determined in the matching cost step using the census transform (CT), which is followed by aggregation using the fixed-window and GF technique. A winner-takes-all (WTA) approach is employed to select the minimum disparity map value and final refinement using left-right consistency checking (LR) along with a weighted median filter (WMF) to remove outliers. The algorithm improved the accuracy 31.65% for all pixel errors and 23.35% for pixel errors in nonoccluded regions compared to several established algorithms on a Middlebury dataset.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47748613","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Development of Focused Crawlers for Building Large Punjabi News Corpus
Pub Date: 2021-12-28 | DOI: 10.5614/itbj.ict.res.appl.2021.15.3.1
Gurjot Singh Mahi, A. Verma
Web crawlers are as old as the Internet and are most commonly used by search engines to visit websites and index them into repositories. They are not limited to search engines but are also widely used to build corpora in different domains and languages. This study developed a set of focused web crawlers for three Punjabi news websites. The crawlers were developed to extract quality text articles and add them to a local repository for use in further research. The crawlers were implemented in the Python programming language and were used to construct a corpus of more than 134,000 news articles in nine different news genres. The crawler code and extracted corpora were made publicly available to the scientific community for research purposes.
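A toy focused crawler in the spirit described above might look as follows; the seed URL, the paragraph-based extraction rule, and the politeness delay are placeholders, since the authors' crawlers are tailored to three specific Punjabi news sites.

```python
# Toy focused crawler: stay on one news domain, collect article text.
import time
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(seed_url: str, max_pages: int = 50, delay: float = 1.0):
    domain = urlparse(seed_url).netloc
    frontier, seen, articles = [seed_url], set(), []
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        # Assumed extraction rule; real news sites need site-specific selectors.
        body = " ".join(p.get_text(" ", strip=True) for p in soup.find_all("p"))
        if body:
            articles.append({"url": url, "text": body})
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == domain and link not in seen:
                frontier.append(link)
        time.sleep(delay)  # politeness delay between requests
    return articles
```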
{"title":"Development of Focused Crawlers for Building Large Punjabi News Corpus","authors":"Gurjot Singh Mahi, A. Verma","doi":"10.5614/itbj.ict.res.appl.2021.15.3.1","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2021.15.3.1","url":null,"abstract":" \u0000Web crawlers are as old as the Internet and are most commonly used by search engines to visit websites and index them into repositories. They are not limited to search engines but are also widely utilized to build corpora in different domains and languages. This study developed a focused set of web crawlers for three Punjabi news websites. The web crawlers were developed to extract quality text articles and add them to a local repository to be used in further research. The crawlers were implemented using the Python programming language and were utilized to construct a corpus of more than 134,000 news articles in nine different news genres. The crawler code and extracted corpora were made publicly available to the scientific community for research purposes.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47243303","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficient Task Scheduling and Fair Load Distribution Among Federated Clouds
Pub Date: 2021-12-28 | DOI: 10.5614/itbj.ict.res.appl.2021.15.3.2
Rajeshwari B S, M. Dakshayini, H. Guruprasad
The federated cloud is the next generation of cloud computing, allowing computing and storage resources to be shared, and user tasks to be serviced, among cloud providers through a centralized control mechanism. However, a great challenge lies in the efficient management of such federated clouds and fair distribution of the load among heterogeneous cloud providers. In the proposed approach, called QPFS_MASG, the incoming task queue is partitioned at the federated cloud level in order to achieve a fair distribution of the load among all cloud providers of the federated cloud. Then, at the cloud level, task scheduling using the Modified Activity Selection by Greedy (MASG) technique assigns the tasks to different virtual machines (VMs), treating the task deadline as the key factor in achieving good quality of service (QoS). The proposed approach services tasks within their deadlines, reducing service level agreement (SLA) violations, improving the response time of user tasks, and achieving a fair distribution of the load among all participating cloud providers. QPFS_MASG was implemented in CloudSim, and the evaluation revealed a guaranteed degree of fairness in service distribution among the cloud providers, with reduced response time and fewer SLA violations compared to existing approaches. The evaluation also showed that the proposed approach serviced the user tasks with a minimum number of VMs.
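A hedged sketch of the deadline-driven greedy assignment suggested by the MASG step: tasks are taken in deadline order, as in classic activity selection, and placed on the least-loaded VM if it can still meet the deadline. The data structures and the federated-level queue partitioning are simplified assumptions, not the paper's exact algorithm.

```python
# Deadline-aware greedy task-to-VM assignment (simplified illustration).
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    length: float      # execution time on a unit-speed VM
    deadline: float    # absolute deadline

@dataclass
class VM:
    name: str
    busy_until: float = 0.0
    assigned: list = field(default_factory=list)

def schedule(tasks: list[Task], vms: list[VM]):
    violations = []
    for task in sorted(tasks, key=lambda t: t.deadline):   # earliest deadline first
        vm = min(vms, key=lambda v: v.busy_until)            # least-loaded VM
        finish = vm.busy_until + task.length
        if finish <= task.deadline:
            vm.busy_until = finish
            vm.assigned.append(task.name)
        else:
            violations.append(task.name)                     # would miss its deadline (SLA violation)
    return vms, violations
```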
{"title":"Efficient Task Scheduling and Fair Load Distribution Among Federated Clouds","authors":"Rajeshwari B S, M. Dakshayini, H. Guruprasad","doi":"10.5614/itbj.ict.res.appl.2021.15.3.2","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2021.15.3.2","url":null,"abstract":"The federated cloud is the future generation of cloud computing, allowing sharing of computing and storage resources, and servicing of user tasks among cloud providers through a centralized control mechanism. However, a great challenge lies in the efficient management of such federated clouds and fair distribution of the load among heterogeneous cloud providers. In our proposed approach, called QPFS_MASG, at the federated cloud level, the incoming tasks queue are partitioned in order to achieve a fair distribution of the load among all cloud providers of the federated cloud. Then, at the cloud level, task scheduling using the Modified Activity Selection by Greedy (MASG) technique assigns the tasks to different virtual machines (VMs), considering the task deadline as the key factor in achieving good quality of service (QoS). The proposed approach takes care of servicing tasks within their deadline, reducing service level agreement (SLA) violations, improving the response time of user tasks as well as achieving fair distribution of the load among all participating cloud providers. The QPFS_MASG was implemented using CloudSim and the evaluation result revealed a guaranteed degree of fairness in service distribution among the cloud providers with reduced response time and SLA violations compared to existing approaches. Also, the evaluation results showed that the proposed approach serviced the user tasks with minimum number of VMs.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44825670","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Machine-Learning Classifiers for Malware Detection Using Data Features
Pub Date: 2021-12-28 | DOI: 10.5614/itbj.ict.res.appl.2021.15.3.5
Saleh Abdulaziz Habtor, Ahmed Haidarah Hasan Dahah
The spread of ransomware has risen exponentially over the past decade, causing huge financial damage to many organizations. Various anti-ransomware firms have suggested methods for preventing malware threats, but the growing pace, scale, and sophistication of malware present the anti-malware industry with ever more challenges. Recent literature indicates that academics and anti-virus organizations have begun to apply machine learning and fundamental modeling techniques to the analysis and identification of malware, since orthodox signature-based anti-virus programs struggle to identify unfamiliar malware and to track new forms of malware. In this study, a machine learning-based malware evaluation framework was adopted, consisting of several modules: dataset compilation in two separate classes (malicious and benign software), file disassembly, data processing, decision making, and updated malware identification. The data processing module uses grayscale images, import functions, and opcode n-grams to extract malware features. The decision-making module detects malware and recognizes suspected malware. Different classifiers were considered in the research methodology for the detection and classification of malware, and their effectiveness was validated on the basis of the accuracy of the complete process.
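The opcode n-gram feature path mentioned above can be sketched with a bag-of-n-grams and a single classifier; scikit-learn and the random forest below are illustrative choices only, since the paper compares several classifiers and also uses grayscale-image and import-based features.

```python
# Opcode n-gram bag-of-features + one example classifier (illustrative choice).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

# Each sample is the opcode sequence of one disassembled binary,
# e.g. "push mov call pop ret ..."; labels: 1 = malicious, 0 = benign.
def build_model(opcode_sequences: list[str], labels: list[int]):
    model = make_pipeline(
        CountVectorizer(analyzer="word", ngram_range=(2, 3)),  # opcode 2- and 3-grams
        RandomForestClassifier(n_estimators=200),
    )
    return model.fit(opcode_sequences, labels)
```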
{"title":"Machine-Learning Classifiers for Malware Detection Using Data Features","authors":"Saleh Abdulaziz Habtor, Ahmed Haidarah Hasan Dahah","doi":"10.5614/itbj.ict.res.appl.2021.15.3.5","DOIUrl":"https://doi.org/10.5614/itbj.ict.res.appl.2021.15.3.5","url":null,"abstract":"The spread of ransomware has risen exponentially over the past decade, causing huge financial damage to multiple organizations. Various anti-ransomware firms have suggested methods for preventing malware threats. The growing pace, scale and sophistication of malware provide the anti-malware industry with more challenges. Recent literature indicates that academics and anti-virus organizations have begun to use artificial learning as well as fundamental modeling techniques for the research and identification of malware. Orthodox signature-based anti-virus programs struggle to identify unfamiliar malware and track new forms of malware. In this study, a malware evaluation framework focused on machine learning was adopted that consists of several modules: dataset compiling in two separate classes (malicious and benign software), file disassembly, data processing, decision making, and updated malware identification. The data processing module uses grey images, functions for importing and Opcode n-gram to remove malware functionality. The decision making module detects malware and recognizes suspected malware. Different classifiers were considered in the research methodology for the detection and classification of malware. Its effectiveness was validated on the basis of the accuracy of the complete process.","PeriodicalId":42785,"journal":{"name":"Journal of ICT Research and Applications","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47456810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}