Energy Efficiency Parameters Evaluation for 5G Application
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075603
Fatimah H. Mohialdeen, Y. E. Mohammed Ali, F. Mahmood
The deployment of mobile telecommunication networks has increased dramatically in recent decades. This increase in the number of mobile devices and towers leads to an increase in consumed energy. Hence, the need for energy efficiency (EE) has grown, both to reduce cost and to reduce pollution. In this paper, the following parameters are studied to enhance EE: the number of base station antennas, the number of user equipment (UEs), and other parameters such as channel state information (CSI). The purpose of this study is to examine how such improvement might be achieved. Using MATLAB, this article analyzes and enhances EE using a mathematical model of fifth-generation (5G) massive multiple-input multiple-output (Massive-MIMO) wireless communication. The EE effectiveness is demonstrated through simulation results, which show how different parameter selections affect either the fundamental balance between EE and spectral efficiency (SE) or the EE alone. The results show that several parameters shape the EE-SE curve, such as the number of base station antennas, transmit bandwidth, circuit power, number of users, and the availability of CSI. Increasing the number of base station antennas is a simple way to increase the EE, until the accompanying growth in circuit power outweighs the gain. Increasing the number of antennas also reduces the impact of imperfect CSI. The results show that increasing the number of antennas relative to the number of users from 4 to 10 does not increase EE, yet increases the SE by around 55%.
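The EE-SE trade-off the abstract describes can be illustrated with a simplified textbook-style power model, not the paper's exact formulation: EE = B·SE / (P_tx + M·P_circ + P_fix). The sketch below, with assumed bandwidth, power, and SNR values, reproduces the qualitative effect reported above: adding antennas keeps raising SE, but circuit power eventually erodes EE.

```python
import numpy as np

# Simplified illustrative EE model (assumed, not the paper's exact model):
# EE [bit/J] = B * SE / (P_tx + M * P_circ + P_fix)
B = 20e6        # transmission bandwidth [Hz] (assumed value)
P_tx = 1.0      # total transmit power [W] (assumed)
P_circ = 0.1    # circuit power per BS antenna [W] (assumed)
P_fix = 2.0     # fixed site power [W] (assumed)
K = 10          # number of served users
snr = 1.0       # per-user SNR (assumed)

for M in (16, 32, 64, 128):                 # number of BS antennas
    # Sum SE with a simple array-gain model: SE ~ K * log2(1 + M*snr/K)
    se = K * np.log2(1 + M * snr / K)       # [bit/s/Hz]
    ee = B * se / (P_tx + M * P_circ + P_fix)
    print(f"M={M:4d}  SE={se:6.1f} bit/s/Hz  EE={ee/1e6:6.2f} Mbit/J")
```

Running this shows SE growing monotonically with M while EE peaks and then falls, which is the "increase EE before the increase in circuit power" behavior the abstract refers to.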
{"title":"Energy Efficiency Parameters Evaluation for 5G Application","authors":"Fatimah H. Mohialdeen, Y. E. Mohammed Ali, F. Mahmood","doi":"10.1109/ICOASE56293.2022.10075603","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075603","url":null,"abstract":"The deployment of mobile telecommunication networks has increased dramatically in recent decades. This increase in the number of mobile devices, and towers yields to increase in consumed energy. Hence, the need for energy efficiency (EE) has increased to reduce cost and pollution. In this paper, the following parameters are studied to enhance EE: increasing the number of base station antennas, increasing the number of user equipment (UEs), and other parameters such as channel state information (CSI). The purpose of this study is to look into how improvement might be achieved. Using the MATLAB program, this article analyzes and enhances EE using a mathematical model in the fifth generation of wireless communication (5G) massive multiple-input multiple-output (Massive-MIMO). The EE effectiveness is demonstrated through simulation results and shows how different parameter selections affect the fundamental balance between EE and spectral efficiency (SE) or only on the EE. The results show that a couple of parameters enhance the EE-SE curve, such as the number of base station antenna, transmit bandwidth, circuit power, number of users, and the availability of CSI. The increase in the number of base station antennas is considered to be a simple solution to increase the EE before the increase in circuit power. Increasing the number of antennas, also, reduces the impact of having imperfect CSI. The results show an increasing number of antennas with respect to the number of users from 4 to 10 do not increase EE, yet increase the SE by around %55.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133004950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predicted of Software Fault Based on Random Forest and K-Nearest Neighbor
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075596
Mustafa Zaki Mohammed, I. Saleh
Software systems have become increasingly complicated and adaptable in today's computing world. As a result, it is critical to track down and fix software design flaws on a regular basis. Software fault prediction in early phases is useful for enhancing software quality and for reducing software testing time and expense; it is a technique for predicting problems using historical data. To anticipate software flaws from historical databases, several machine learning approaches are applied. This paper focuses on creating a predictor for software defects based on previous data. For this purpose, supervised machine learning techniques were utilized to forecast future software failures: K-Nearest Neighbor (KNN) and Random Forest (RF) were applied to defective data sets belonging to NASA's PROMISE repository. A set of performance measures, including accuracy, precision, recall, and F1 score, was used to evaluate the models. The RF model performed well compared with the KNN model, with maximum and minimum accuracies of 99% and 88% on MC1 and KC1, respectively. In general, the study's findings suggest that software defect metrics may be used to identify problematic modules, and that the RF model can be used to anticipate software errors.
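As an illustration of the experimental pipeline described above, and not the authors' code, the following sketch compares RF and KNN with scikit-learn. The CSV file name and the `defects` label column are assumptions about how a PROMISE dataset such as KC1 might be exported.

```python
# Hypothetical sketch of the RF-vs-KNN comparison; "kc1.csv" and the
# "defects" column are assumed, not taken from the paper.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

df = pd.read_csv("kc1.csv")
X, y = df.drop(columns=["defects"]), df["defects"].astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

for name, model in [("RF", RandomForestClassifier(n_estimators=100, random_state=42)),
                    ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name,
          f"acc={accuracy_score(y_te, pred):.3f}",
          f"prec={precision_score(y_te, pred):.3f}",
          f"rec={recall_score(y_te, pred):.3f}",
          f"f1={f1_score(y_te, pred):.3f}")
```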
{"title":"Predicted of Software Fault Based on Random Forest and K-Nearest Neighbor","authors":"Mustafa Zaki Mohammed, I. Saleh","doi":"10.1109/ICOASE56293.2022.10075596","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075596","url":null,"abstract":"Software systems have gotten increasingly complicated and adaptable in today's computer world. As a result, it's critical to track down and fix software design flaws on a regular basis. Software fault prediction in early phase is useful for enhancing software quality and for reducing software testing time and expense; it's a technique for predicting problems using historical data. To anticipate software flaws from historical databases, several machine learning approaches are applied. This paper focuses on creating a predictor to predict software defects, Based on previous data. For this purpose, a supervised machine learning techniques was utilized to forecast future software failures, K-Nearest Neighbor (KNN) and Random Forest (RF) applied technique applied to the defective data set belonging to the NASA's PROMISE repository. Also, a set of performance measures such as accuracy, precision, recall and f1 measure were used to evaluate the performance of the models. This paper showed a good performance of the RF model compared to the KNN model resulting in a maximum and minimum accuracy are 99%,88% on the MC1 and KC1 responsibly. In general, the study's findings suggest that software defect metrics may be used to determine the problematic module, and that the RF model can be used to anticipate software errors.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115689399","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
PSO Algorithm for Three Phase Induction Motor with V/F Speed Control
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075610
Qusay Hussein Mirdas, N. Yasin, N. Alshamaa
Because induction motors (IMs) are used in most industries, IM control is essential, and optimization approaches are becoming more common for improving the three-phase induction motor (TIM). In addition, Volt/Hz (V/f) control is utilized to minimize the harmonic levels of other control and modulation approaches. This study concerns tuning PI controller parameters for use with a TIM. To optimize the speed response of the TIM, the Particle Swarm Optimization (PSO) algorithm is used to adjust each parameter of the PI speed controller. The Kp and Ki parameters of the PI speed controller are optimized for TIM operation under V/f control by designing an appropriate PSO algorithm. The PI speed controller's performance on the TIM is assessed by measuring changes in speed and torque during speed-response events. With PSO, the PI controller performs well in terms of overshoot, settling time, and steady-state error.
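A minimal sketch of the tuning loop the abstract describes, assuming a first-order plant as a stand-in for the TIM speed loop and standard PSO constants; the plant parameters, search bounds, and ITAE cost implementation are illustrative, not the paper's.

```python
# Illustrative PSO tuning of (Kp, Ki) by minimizing ITAE on an assumed
# first-order plant; all constants here are assumptions.
import numpy as np

def itae(gains, T=2.0, dt=1e-3, tau=0.2, K=1.0):
    """ITAE of the closed-loop unit-step response, Euler-integrated."""
    kp, ki = gains
    y, integ, cost = 0.0, 0.0, 0.0
    for n in range(int(T / dt)):
        e = 1.0 - y                       # unit-step speed reference
        integ += e * dt
        u = kp * e + ki * integ           # PI control law
        y += dt * (K * u - y) / tau       # plant: dy/dt = (K*u - y)/tau
        cost += (n * dt) * abs(e) * dt    # ITAE = integral of t*|e(t)| dt
    return cost

rng = np.random.default_rng(0)
n_p, lo, hi = 20, np.array([0.0, 0.0]), np.array([20.0, 50.0])
x = rng.uniform(lo, hi, (n_p, 2))
v = np.zeros_like(x)
pbest, pcost = x.copy(), np.array([itae(p) for p in x])
g = pbest[pcost.argmin()]                 # global best (Kp, Ki)

for _ in range(40):                       # standard PSO velocity/position update
    r1, r2 = rng.random((n_p, 2)), rng.random((n_p, 2))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = np.clip(x + v, lo, hi)
    c = np.array([itae(p) for p in x])
    better = c < pcost
    pbest[better], pcost[better] = x[better], c[better]
    g = pbest[pcost.argmin()]

print("tuned Kp, Ki:", g.round(3), " ITAE:", pcost.min())
```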
{"title":"PSO Algorithm for Three Phase Induction Motor with V/F Speed Control","authors":"Qusay Hussein Mirdas, N. Yasin, N. Alshamaa","doi":"10.1109/ICOASE56293.2022.10075610","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075610","url":null,"abstract":"Because induction motors are used in most industries, IM control is more essential, Optimization is used approaches are becoming more common for improving Three - Phase induction motor (TIM). In addition, the Volt/Hz (V/f) control is utilized to minimize the harmonics level of other control and modulation approaches. This study is about tuning the PI controller parameters for utilization in TIM. To optimize the speed response performance of the TIM, the Particle Swarm Optimization (PSO) algorithm is used to adjust each parameter of the PI speed controller. Kp and Ki of the PI speed controller parameters are optimized for TIM operation with V/ f Control by designing an appropriate PSO algorithm. The PI speed controller's performance on the TIM is measured by measuring changes in speed and torque under-speed response events. In PSO, the PI controller performs well in terms of overshoot, settling time, and steady-state error.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"119 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123485695","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
DHFogSim: Smart Real-Time Traffic Management Framework for Fog Computing Systems
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075605
D. Abdullah, H. Mohammed
Clouds are the most powerful computation architecture; nevertheless, some applications are delay sensitive and need real-time responses. Offloading tasks from the user device to the cloud takes a relatively long time and consumes network bandwidth. This motivated the appearance of fog computing. In fog computing, an additional layer falls between the user device layer and the cloud. Offloading tasks to the fog layer is faster and saves network bandwidth. Fog computing has spread widely, but it is difficult to build and test such systems in the real world, which has led developers to use fog simulation frameworks to simulate and test their systems. In this paper, we adopt a fog simulation framework that adds a smart agent layer between the user device and fog layers. The framework uses a multilevel queue instead of a single queue at the Ethernet layer; these queues are scheduled according to weighted round robin, and tasks are dispatched to the queues according to the value of the Type of Service (ToS) bits, which fall in the second byte of the IP header. The value of the ToS bits is assigned by the smart agent layer according to task constraints. The framework's behavior was compared with the mFogSim framework, and the results show that the proposed framework significantly decreases the delay on both brokers and fog nodes. Furthermore, the packet drop count and packet error rate are slightly improved.
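The dispatch-and-scheduling idea can be sketched as follows; the queue count, weights, and ToS-to-queue mapping are illustrative assumptions, not DHFogSim's actual configuration.

```python
# Minimal sketch (not the DHFogSim code): packets are queued by the IP ToS
# byte and drained by weighted round robin.
from collections import deque

NUM_QUEUES = 3
WEIGHTS = [4, 2, 1]                 # packets served per WRR round, high to low priority
queues = [deque() for _ in range(NUM_QUEUES)]

def classify(tos: int) -> int:
    """Map the ToS byte (2nd byte of the IPv4 header) to a queue index."""
    precedence = tos >> 5           # IP precedence: top 3 bits of the ToS byte
    if precedence >= 5:
        return 0                    # delay-sensitive traffic
    if precedence >= 2:
        return 1
    return 2                        # best effort

def enqueue(pkt_id: int, tos: int):
    queues[classify(tos)].append(pkt_id)

def wrr_round():
    """Serve up to WEIGHTS[i] packets from queue i in one scheduling round."""
    served = []
    for q, w in zip(queues, WEIGHTS):
        for _ in range(min(w, len(q))):
            served.append(q.popleft())
    return served

for pid, tos in [(1, 0xB8), (2, 0x00), (3, 0x28), (4, 0xE0)]:
    enqueue(pid, tos)
print(wrr_round())   # high-priority packets drain first within each round
```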
{"title":"DHFogSim: Smart Real-Time Traffic Management Framework for Fog Computing Systems","authors":"D. Abdullah, H. Mohammed","doi":"10.1109/ICOASE56293.2022.10075605","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075605","url":null,"abstract":"Clouds are the most powerful computation architecture; nevertheless, some applications are delay sensitive and need real time responses. Offloading tasks from user device to the cloud will take relatively long time and consumes network bandwidth. This motivates the appearance of fog computing. In fog, computing additional layer falls between user device layer and the cloud. Offloading tasks to fog layer will be faster and save network bandwidth. Fog computing has spread widely, but it is difficult to build and test such systems in real word. This led the developers to use fog simulation frameworks to simulate and test their own systems. In this paper, we adopt fog simulation formwork, which adds smart agent layer between user device and fog layer. The framework uses multilevel queue instead of single queue at the Ethernet layer, these queues are scheduled according to weighted round robin and tasks dispatched to theses queues according to the value of Type of Service (ToS) bits which falls at the second byte inside the IP header. The value of ToS bits given by the smart agent layer according to take constraints. Framework behavior compared with mFogSim framework and the results shows that the proposed framework has significantly decrease the delay on both brokers and fog nodes. furthermore, packet drop count and packet error rate are slightly improved","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125467940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improving the Clustering Performance of the K-Means Algorithm for Non-linear Clusters
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075614
Naaman Omar, Adel Al-zebari, A. Şengur
K-means clustering is the most traditional approach in machine learning and has been put to many different uses. However, it is sensitive to initialization and performs poorly on non-linear clusters. Several approaches have been offered in the literature to circumvent these restrictions; Kernel K-means (KK-M) is one of them. In this paper, a two-step approach is developed to increase the clustering performance of the K-means algorithm. A transformation procedure is applied in the first step, where the low-dimensional input space is mapped to a high-dimensional feature space. To this end, the hidden layer of a Radial Basis Function (RBF) network is used. The typical K-means method is applied in the second step. We present experimental results comparing our approach with KK-M on simulated data sets to assess its correctness. The experiments show the efficiency of the proposed method: the clustering accuracy attained is higher than that of the KK-M algorithm. We also applied the proposed clustering algorithm to an image segmentation application, and a series of segmentation results is given accordingly.
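A hedged sketch of the two-step idea on a classic non-linear clustering problem (two moons): points are projected through an RBF hidden layer, Gaussian units at sampled centers, and ordinary K-means then runs in the resulting feature space. The number of centers and the gamma value are illustrative choices, not the paper's settings.

```python
# Sketch of the two-step method under assumed hyperparameters.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.cluster import KMeans

X, y_true = make_moons(n_samples=400, noise=0.05, random_state=0)

rng = np.random.default_rng(0)
centers = X[rng.choice(len(X), size=30, replace=False)]   # RBF unit centers
Phi = rbf_kernel(X, centers, gamma=20.0)                  # hidden-layer activations

# Step 2: plain K-means on the transformed features.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Phi)
agree = max(np.mean(labels == y_true), np.mean(labels != y_true))
print(f"cluster/label agreement: {agree:.2%}")   # near 100% on two moons
```

Plain K-means fails on the two-moons data because the clusters are not linearly separable; the RBF layer makes them separable before the standard algorithm is applied.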
{"title":"Improving the Clustering Performance of the K-Means Algorithm for Non-linear Clusters","authors":"Naaman Omar, Adel Al-zebari, A. Şengur","doi":"10.1109/ICOASE56293.2022.10075614","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075614","url":null,"abstract":"K-means clustering is known to be the most traditional approach in machine learning. It's been put to a lot of different uses. However, it has difficulty with initialization and performs poorly for non-linear clusters. Several approaches have been offered in the literature to circumvent these restrictions. Kernel K-means (KK-M) is a type of K-means that falls under this group. In this paper, a two-stepped approach is developed to increase the clustering performance of the K-means algorithm. A transformation procedure is applied in the first step where the low-dimensional input space is transferred to a high-dimensional feature space. To this end, the hidden layer of a Radial basis function (RBF) network is used. The typical K-means method is used in the second part of our approach. We offer experimental results comparing the KK-M on simulated data sets to assess the correctness of the suggested approach. The results of the experiments show the efficiency of the proposed method. The clustering accuracy attained is higher than that of the KK-M algorithm. We also applied the proposed clustering algorithm on image segmentation application. A series of segmentation results were given accordingly.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121100457","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Review on Image Segmentation Methods Using Deep Learning
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075607
Nabeel N. Ali, N. Kako, A. Abdi
In recent years, the machine learning field has been inundated with a variety of deep learning methods. Different deep learning model types, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), adversarial neural networks (ANNs), and autoencoders, are successfully tackling challenging computer vision problems, including image detection and segmentation in unconstrained environments. Although image segmentation has received a lot of interest, several new deep learning methods have also emerged for object detection and recognition. This article presents an academic review of deep learning image segmentation methods. The major goal of this study is to offer a clear understanding of the basic approaches that have made substantial contributions to the domain of image segmentation over the years. The article describes the current state of image segmentation and argues that deep learning has revolutionized this field. Segmentation algorithms are then scientifically classified, each with its own special contribution. With a variety of informative narratives, the reader may understand the internal workings of these processes more quickly.
{"title":"Review on Image Segmentation Methods Using Deep Learning","authors":"Nabeel N. Ali, N. Kako, A. Abdi","doi":"10.1109/ICOASE56293.2022.10075607","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075607","url":null,"abstract":"In recent years, the machine learning field has been inundated with a variety of deep learning methods. Different deep learning model types, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), adversarial neural networks (ANNs), and autoencoders, are successfully tackling challenging computer vision problems including image detection and segmentation in an unconstrained environment. Although image segmentation has received a lot of interest, there have been several new deep learning methods discovered with regard to object detection and recognition. An academic review of deep learning image segmentation methods is presented in this article. In this study, the major goal is to offer a sensible comprehension of the basic approaches that have already made a substantial contribution to the domain of image segmentation throughout the years. The article describes the existing state of image segmentation, and goes on to make the argument that deep learning has revolutionized this field. Afterwards, segmentation algorithms have been scientifically classified and optimized, each with their own special contribution. With a variety of informative narratives, the reader may be able to understand the internal workings of these processes more quickly.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"86 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134147489","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Investigation of Healthcare Security Using Blockchain Technology: A review
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075578
M. A. Omer, Shimal Sh. Taher, S. Ameen
Telemedicine and telehealth care systems represent a revolutionary, modern way to deal with the coronavirus 2019 pandemic. However, such systems face increased security risks. As a result, healthcare providers and academic institutions must be well-informed, safe, and prepared to respond to any cyber-attack. The aim of this paper is to review healthcare information systems together with how security can be provided for such systems. The paper's main focus is on the adoption of blockchain technology to support the security of the healthcare system. This adoption has been investigated and assessed to show its benefits compared with conventional technologies. Finally, recommendations are given for securing healthcare with the use of blockchain technology.
{"title":"Investigation of Healthcare Security Using Blockchain Technology: A review","authors":"M. A. Omer, Shimal Sh. Taher, S. Ameen","doi":"10.1109/ICOASE56293.2022.10075578","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075578","url":null,"abstract":"Telemedicine and telehealth care system show the revolutionary and modern way to deal with the coronavirus 2019 pandemic. However, such systems are facing increased security risks. As a result, healthcare providers and academic institutions must be well-informed, safe, and prepared to respond to any cyber-attack. The aim of this paper is to conduct a review of healthcare information systems together with how security can be provided for such systems. The paper main focus is on the adoption of blockchain technology to support the security of the healthcare system. This adoption has been investigated and assessed to show its benefits compared with other conventional technologies. Finally, a recommendation was pointed out for the security of healthcare with the usage of blockchain technology.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"101 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134206270","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Diseases Diagnosis Using Machine Learning of Medical Images
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075581
Shakir M. Abas, Omer Mohammed Salih Hassan, Imad Manaf Ali, Safin Saber Nori, Hamza Sardar Hassan
Recently, various diseases have been infecting humans due to their living environment and environmental changes. It is very important to identify and predict such diseases at early stages to prevent their outbreak. Identifying these diseases manually is difficult for doctors. Many chronic diseases affect humans; one of them is the brain tumor, which arises from the abnormal growth and division of brain cells and can lead to brain cancer. Computer vision plays an important role in the human health field, giving accurate results that help people make the right decision. In addition, traditional techniques are time consuming and expensive, and the addressed problem requires expert knowledge. This research focuses on using a simple deep learning architecture that produces accurate results. A Convolutional Neural Network (CNN) is used for reliable classification of brain tumor images. The proposed model shows very good results, reaching almost 96.4% accuracy on the Brain MRI Images for Brain Tumor Detection dataset.
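The abstract does not give the network architecture, so the following Keras sketch is only a plausible simple CNN of the kind described; the layer sizes and input shape are assumptions, not the authors' exact design.

```python
# Illustrative binary-classification CNN for MRI slices; all layer choices
# are assumptions made for the sketch.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),        # grayscale MRI slice (assumed size)
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),    # tumor / no-tumor output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Training would use the Brain MRI Images for Brain Tumor Detection dataset,
# e.g. loaded with tf.keras.utils.image_dataset_from_directory on its
# yes/no class folders.
```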
{"title":"Diseases Diagnosis Using Machine Learning of Medical Images","authors":"Shakir M. Abas, Omer Mohammed Salih Hassan, Imad Manaf Ali, Safin Saber Nori, Hamza Sardar Hassan","doi":"10.1109/ICOASE56293.2022.10075581","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075581","url":null,"abstract":"Recently, the various diseases are infecting the humans due to their living environmental and the changes of the environmental. It is much important to identification and prediction of such diseases at earlier stages to prevent the outbreak these diseases. The identification of these diseases manually by the doctors is difficult. There are many of the chronic diseases that affect human. One of these diseases is the brain tumors that arise by the abnormal growth and division of brain cells which leads to brain cancer. The computer vision plays important role in human health field which gives accurate results that helps the human to tack the true decision. In addition, traditional technics are time consuming, expensive and addressed problem requires expert knowledge. This research aims to focus on the using simple deep learning architecture with accurate results. Moreover, the Convolution Neural Network (CNN) algorithm is used for reliable Classification of the brain tumor Image. The proposed models are showed very good results and reached almost 96.4% accuracy on Brain MRI Images for Brain Tumor Detection1 dataset.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133925234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generating Masked Facial Datasets Using Dlib-Machine Learning Library
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075601
Waleed Ayad Mahdi, S. Q. Mahdi, Ali Al-Naji
In 2020, the COVID-19 pandemic spread globally, leading countries to impose health restrictions on people, including wearing masks, to prevent the spread of the disease. Wearing a mask significantly decreases recognition ability because it conceals the main facial features. After the outbreak of the pandemic, existing datasets became unsuitable because they did not contain images of people wearing masks. To address the shortage of large-scale masked-face datasets, a method is proposed to generate artificial masks and place them on the faces of an unmasked-face dataset, producing a masked-face dataset. Following the proposed method, masked faces are generated in two steps. First, the face is detected in the unmasked image, and the detected face image is aligned. Second, the mask is overlaid on the cropped face image using the dlib-ml library. Using the proposed method, two masked-face datasets, called masked-dataset-1 and masked-dataset-2, were created. Promising results were obtained when they were evaluated against the Labeled Faces in the Wild (LFW) dataset using two state-of-the-art facial recognition systems, FaceNet and ArcFace: the accuracies of the two systems were 96.1 and 97, respectively, with masked-dataset-1, and 87.6 and 88.9, respectively, with masked-dataset-2.
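A hedged sketch of the two-step generation process using dlib's stock frontal detector and 68-point landmark model; the landmark indices chosen for the mask polygon and the plain white fill are illustrative choices, not the paper's exact overlay procedure.

```python
# Illustrative mask overlay: detect the face with dlib, then draw a synthetic
# mask polygon over the lower-face landmarks with OpenCV. File names and the
# polygon construction are assumptions.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Standard dlib 68-point model (downloaded separately from dlib.net):
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = cv2.imread("face.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

for rect in detector(gray, 1):                   # step 1: face detection
    shape = predictor(gray, rect)                # 68 facial landmarks
    pts = np.array([(p.x, p.y) for p in shape.parts()])
    # Step 2: cover the jawline (points 2-15) up to a nose-bridge point (28),
    # an assumed approximation of a surgical mask's outline.
    mask_poly = np.vstack([pts[2:16], pts[28:29]]).astype(np.int32)
    cv2.fillPoly(img, [mask_poly], color=(255, 255, 255))

cv2.imwrite("face_masked.jpg", img)
```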
{"title":"Generating Masked Facial Datasets Using Dlib-Machine Learning Library","authors":"Waleed Ayad Mahdi, S. Q. Mahdi, Ali Al-Naji","doi":"10.1109/ICOASE56293.2022.10075601","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075601","url":null,"abstract":"In 2020, the COVID-19 pandemic spread globally, leading to countries imposing health restrictions on people, including wearing masks, to prevent the spread of the disease. Wearing a mask significantly decreases distinguishing ability due to its concealment of the main facial features. After the outbreak of the pandemic, the existing datasets became unsuitable because they did not contain images of people wearing masks. To address the shortage of large-scale masked faces datasets, a developed method was proposed to generate artificial masks and place them on the faces in the unmasked faces dataset to generate the masked faces dataset. Following the proposed method, masked faces are generated in two steps. First, the face is detected in the unmasked image, and then the detected face image is aligned. The second step is to overlay the mask on the cropped face images using the dlib-ml library. Depending on the proposed method, two datasets of masked faces called masked-dataset-1 and masked-dataset-2 were created. Promising results were obtained when they were evaluated using the Labeled Faces in the Wild (LFW) dataset, and two of the state-of-the-art facial recognition systems for evaluation are FaceNet and ArcFace, where the accuracy of using the two systems was 96.1 and 97, respectively with masked-dataset-1 and 87.6 and 88.9, respectively with masked-dataset-2.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114779694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Salp Swarm Algorithm-based Position Control of a BLDC Motor
Pub Date: 2022-09-21 | DOI: 10.1109/ICOASE56293.2022.10075598
O. M. Hussein, N. Yasin
The best P and PI controller parameters for the cascade control of a BLDC system are determined in this paper using a new artificial intelligence-based optimization method called the salp swarm algorithm (SSA). The algorithm's simplicity allows precise tuning of the optimal P and PI controller values. The integral time absolute error (ITAE) was chosen as the fitness function for optimizing the controller parameters. Compared with the classical control technique (PID), the SSA approach was found to tune well, achieving a shorter rise time and less (approximately zero) overshoot, and is more effective at improving the step response of the BLDC system, according to the transient response study.
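For contrast with a full plant simulation, the following sketch shows only the salp swarm update rule, the leader and follower equations from Mirjalili et al. (2017), searching a (Kp, Ki) plane; the quadratic toy fitness is a stand-in for the paper's ITAE evaluation of the BLDC loop.

```python
# Salp swarm update rule on an assumed toy cost; not the paper's plant model.
import numpy as np

def fitness(p):
    # Toy quadratic cost standing in for ITAE; its optimum (8, 3) is arbitrary.
    kp, ki = p
    return (kp - 8.0) ** 2 + (ki - 3.0) ** 2

rng = np.random.default_rng(1)
lb, ub = np.array([0.0, 0.0]), np.array([20.0, 10.0])
n, iters = 15, 60
salps = rng.uniform(lb, ub, (n, 2))
food, food_cost = salps[0].copy(), np.inf          # best (Kp, Ki) found so far

for t in range(1, iters + 1):
    for s in salps:                                # update the food source
        c = fitness(s)
        if c < food_cost:
            food, food_cost = s.copy(), c
    c1 = 2 * np.exp(-(4 * t / iters) ** 2)         # SSA exploration factor
    for i in range(n):
        if i == 0:                                 # leader moves around the food source
            r2, r3 = rng.random(2), rng.random(2)
            step = c1 * ((ub - lb) * r2 + lb)
            salps[i] = np.where(r3 < 0.5, food + step, food - step)
        else:                                      # followers: midpoint of the chain
            salps[i] = (salps[i] + salps[i - 1]) / 2
        salps[i] = np.clip(salps[i], lb, ub)

print("tuned Kp, Ki:", food.round(3), " cost:", round(food_cost, 4))
```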
{"title":"Salp Swarm Algorithm-based Position Control of a BLDC Motor","authors":"O. M. Hussein, N. Yasin","doi":"10.1109/ICOASE56293.2022.10075598","DOIUrl":"https://doi.org/10.1109/ICOASE56293.2022.10075598","url":null,"abstract":"The best P and PI controller parameters of the cascade control of the BLDC system are determined using a new artificial intelligence-based optimization method called the slap swarm algorithm (SSA) in this paper. The algorithm's simplicity allows for precise tuning of optimal P and PI controller values. The integral time absolute error (ITAE) was chosen as the fitness function to optimize the controller parameters. Compared with the classical control technique (PID), the SSA approach was found to have good tuning and obtained less rise time, also less (Approximately zero) overshoot, and is more efficient in increasing the step response of the BLDC system, according to the transient response study.","PeriodicalId":297211,"journal":{"name":"2022 4th International Conference on Advanced Science and Engineering (ICOASE)","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130347794","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}