Pub Date: 2024-04-08 | DOI: 10.37394/23205.2024.23.4
Ekbal Rashid, Nikos E. Mastorakis
Mathematical models have been developed to study the relationship between the growth of quantity and the growth of quality in open-source collaborative software repositories on GitHub. GitHub events triggering growth in quantity and quality have been identified. Linear regression analysis and Pearson's, Spearman's, and Kendall's correlation coefficients have been used. Hypothesis testing has led to the conclusion that there may be a linear relation between quality and quantity within a certain range of values. Positive monotonic relations and dependency between quantity and quality have been strongly established. Scripts for automated testing have been developed.
Title: Monotonic and Linear Relations between Growth of Quality vs Growth in Quantity in Open-Source Software Projects (WSEAS Transactions on Computers)
Pub Date: 2024-04-04 | DOI: 10.37394/23205.2024.23.3
A. Owolabi, Oluwaseyi Oluwadamilare Okunlola, E. Adewuyi, Janet Iyabo Idowu, Olasunkanmi James Oladapo
Artificial Intelligence (AI) has proven valuable in almost every field of endeavour, including education, the sciences, engineering, technology, medical sciences, and numerous other areas of application. Despite its widespread usefulness, concerns have arisen about AI potentially displacing jobs due to its highly advanced capabilities, commonly called the "god-man effect." One remarkable AI product is ChatGPT, a chatbot developed by OpenAI in the USA that is capable of engaging in conversations that resemble human interactions. This study explores the strengths and limitations of ChatGPT for data analysis, with the primary objective of assessing whether ChatGPT poses a threat to the job of data analysts. An econometric dataset with a sample size of thirty (30), consisting of one dependent variable and three independent variables, was simulated. The dataset was intentionally generated with issues such as multicollinearity, outliers, and heteroscedasticity, and multiple tests were then conducted on it to confirm the presence of these problems. ChatGPT versions 3.5 and 4.0 were then used to analyze the data, to examine the chatbot's prowess in performing data analysis. Both versions accurately identified the statistical tools suitable for analyzing the simulated dataset, but emphasized that the expertise of a professional data analyst would still be necessary: while they could offer guidance on data analysis, they could not perform the analysis themselves, being solely AI models. ChatGPT can help a data analyst decide what to do next when stuck, but it should not be treated as an authority in making statistical decisions. Therefore, ChatGPT may not replace data analysts but could make their job easier by serving as a helpful resource when they encounter challenges.
Title: The advent of ChatGPT: Job Made Easy or Job Loss to Data Analysts
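The data problems the study injects have standard diagnostics. Below is a hedged sketch of one of them, the variance inflation factor (VIF) for multicollinearity, computed from auxiliary regressions on a simulated dataset of the same shape as the paper's (n = 30, three regressors) but not the authors' actual data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)  # deliberately collinear with x1
x3 = rng.normal(size=n)
# heteroscedastic noise: its spread grows with |x3|
y = 2.0 + 1.5 * x1 - 0.5 * x3 + rng.normal(scale=np.abs(x3) + 0.1, size=n)

def vif(target, others):
    # regress one regressor on the rest; VIF = 1 / (1 - R^2)
    X = np.column_stack([np.ones(len(target))] + list(others))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    r2 = 1.0 - resid.var() / target.var()
    return 1.0 / (1.0 - r2)

vif_x1 = vif(x1, [x2, x3])  # large: x1 is nearly a copy of x2
vif_x3 = vif(x3, [x1, x2])  # near 1: x3 is independent of the others
```

A Breusch-Pagan-style check for heteroscedasticity would proceed the same way, regressing the squared residuals of the main fit of `y` on the regressors.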
Pub Date: 2024-04-04 | DOI: 10.37394/23205.2024.23.2
Yunus Emre Yeti̇ş, Safiye Turgay, Bi̇lal Erdemi̇r
In the third-party logistics (3PL) environment, it is very important to reduce damage parameters, increase operational efficiency, and reduce costs. This study aims to develop strategies for reshaping 3PL operations by analyzing the parameters involved in damage control with machine learning. The logistics sector is growing steadily worldwide, and its potential is becoming better understood over time. Damage to products, especially during transportation and storage, not only causes financial losses but also affects customer satisfaction and operational efficiency. With artificial intelligence techniques, it is possible to determine consumer expectations, predict damage losses, and develop innovative strategies by applying machine learning algorithms. At the same time, developments enabled by artificial intelligence, such as driverless vehicles, warehouse and shelving robots, and the easy use of big data within the system, minimize errors in the logistics sector and make businesses more efficient. This study presents an estimation study of error parameters for the logistics service sector using machine learning methods. In the application, real data from a 3PL company covering the last 5 years is used. For the success of 3PL companies, warehousing and undamaged delivery of products are of great importance: the fewer damaged products they send, the more they increase their value. The company examined in this study kept records of its damage data and wanted them analyzed so that it could take precautions accordingly and follow a more profitable path. For this reason, the study focuses on data on errors and damages, showing what kinds of problems can occur in such a company and how a 3PL company can evaluate them to increase customer service quality and cost efficiency.
Title: Reshaping 3PL Operations: Machine Learning Approaches to Mitigate and Manage Damage Parameters
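As a hedged illustration of the kind of model such a study might use (not the company's actual data or pipeline), the sketch below fits a logistic regression by gradient descent to predict shipment damage from a few hypothetical handling features:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# hypothetical shipment features: transit time (h), handling events, fragile flag
X = np.column_stack([
    rng.uniform(1, 72, n),
    rng.integers(1, 8, n).astype(float),
    rng.integers(0, 2, n).astype(float),
])
# simulate damage outcomes where all three features raise the risk
true_logit = -6.0 + 0.03 * X[:, 0] + 0.5 * X[:, 1] + 1.2 * X[:, 2]
damaged = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# standardize features, then fit logistic regression by plain gradient descent
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(n), Xs])
w = np.zeros(Xb.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.5 * Xb.T @ (p - damaged) / n  # gradient of the average log-loss

risk = 1.0 / (1.0 + np.exp(-Xb @ w))     # predicted damage probability
```

Shipments with high predicted risk could then be flagged for extra packaging, routing, or handling precautions, which is the preventive use the company in the study is after.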
Pub Date: 2023-11-30 | DOI: 10.37394/23205.2023.22.27
N. W. C. Lasantha, R. Abeysekara, M. W. P. Maduranga
This research project primarily focused on improving security by addressing vulnerabilities and creating a cost-scalable storage cluster solution for applications. The Oracle Grid Infrastructure for Storage Clusters (OGISC) represents cutting-edge advancements in this field. The research extensively explores the philosophy of OGISC and explains its core methodology. Recognizing the weaknesses of existing security mechanisms, we propose a solution to enhance security measures. A key aspect of our investigation involves integrating GlusterFS, a distributed file system known for its ability to scale linearly. We delve into the architecture of the solution, demystifying its storage scale-out processes and operation. Our study goes beyond integration by developing an approach and metadata model specifically tailored for GlusterFS, ensuring optimal performance and robustness. One noteworthy aspect of our research is the application of GlusterFS compression with OpenVPN; this exploration highlights the benefits derived from the integration, emphasizing how OpenVPN enhances GlusterFS's capabilities. Rigorous analysis stages serve as the foundation for our findings, resulting in a forward-thinking solution. Finally, we conclude this research with an examination of avenues for future exploration in this dynamic field.
Title: Enhancing Security in Database Grid Infrastructure for Storage Clusters
Pub Date: 2023-11-30 | DOI: 10.37394/23205.2023.22.28
Prasenjit Mukherjee, R. S. Gokul, Manish Godse
Precise identification of autism spectrum disorder (ASD) is a challenging task due to the heterogeneity of ASD. Early diagnosis and intervention have positive effects on treatment and later skills development. Hence, it is necessary to provide families and communities with the resources, training, and tools required to diagnose and help patients. Recent work has shown that artificial intelligence-based methods are suitable for the identification of ASD. AI-based tools can be good resources for parents for early detection of ASD in their kids, and advanced AI-based tools are also helpful for health workers and physicians in detecting ASD. Facial images and MRI are the best sources for understanding ASD symptoms and are hence the inputs required for AI-based model training. The trained models are used to classify ASD patients versus typically developing kids, and deep learning models have been found to be very accurate in ASD detection. In this paper, we present a comprehensive study of AI techniques, such as machine learning, image processing, and deep learning, and of their accuracy when applied to facial and MRI images of ASD and typically developing kids.
Title: Examination of AI Algorithms for Image and MRI-based Autism Detection
Pub Date: 2023-11-20 | DOI: 10.37394/23205.2023.22.26
Holger Kraenzle, Maximilian Rampp, Daniel Werner, Jürgen Seitz, Neha Sharma
The whole world is affected by climate change, and renewable energy plays an important role in combating it. Adding to an already precarious situation, current political events such as the war in Ukraine mean that fossil raw materials such as oil and gas are becoming more and more expensive on the raw-materials markets. This paper presents the current state of renewable energies in Germany and Europe. Using data from the past 56 years, the predictive models ARIMA and Prophet are used to determine whether the conversion to renewable energies and the elimination of fossil raw materials from the energy sector can be achieved in the EU. The results are compared with the EU's target for 2030, and a long-term outlook until 2050 is provided.
Title: Prediction of the Growth of Renewable Energies in the European Union using Time Series Analysis
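A minimal stand-in for the ARIMA-style forecasting described above is a linear drift plus an AR(1) term on the residuals. The renewable-share series below is made up for illustration; the paper itself works from 56 years of EU data:

```python
import numpy as np

# hypothetical renewable share of energy consumption (%), one value per year
share = np.array([9.6, 10.5, 11.1, 12.5, 13.2, 14.4, 15.2, 16.2, 17.0, 17.8,
                  18.1, 18.9, 19.7, 21.8, 21.8, 23.0])

# fit a linear trend (drift), then an AR(1) coefficient on the residuals:
# a rough stand-in for the differencing-plus-autoregression idea in ARIMA
t = np.arange(len(share))
b, a = np.polyfit(t, share, 1)         # slope and intercept of the trend
resid = share - (a + b * t)
phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])

def forecast(h):
    # trend extrapolation with a geometrically decaying residual correction
    out, r = [], resid[-1]
    for k in range(1, h + 1):
        r = phi * r
        out.append(a + b * (t[-1] + k) + r)
    return np.array(out)

target_year_share = forecast(7)[-1]    # e.g. seven years ahead of the sample
```

The forecast can then be compared against a policy target, which is the shape of the comparison the paper makes for the EU's 2030 goal.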
Pub Date: 2023-11-20 | DOI: 10.37394/23205.2023.22.25
Georgios Rigopoulos
In this work, a group decision methodology and algorithm for small collaborating teams is introduced. It is based on a multicriteria algorithm for classification decisions, in which the aggregation of member preferences is executed at the parameter level. The algorithm applies to relatively well-structured problems guided by a process facilitator. Initially, a set of parameters is proposed by the facilitator to the group; next, group members evaluate the proposed parameter set and express their preferences in numeric or linguistic format. Individual preferences are aggregated by appropriate operators, and a set of group parameter values is generated, which is used as input for the classification algorithm. The NeXClass multicriteria classification algorithm is used for the classification of alternatives, initially on a training set of alternatives and later on the entire set. Finally, group members evaluate the results, and consensus and satisfaction metrics are calculated. In case of a low acceptance level, the problem parameters are reviewed by the facilitator and the aggregation phase is repeated. The methodology is a valid approach for group decision problems and can be utilized in numerous business environments. The algorithm can also be utilized by software agents in multiagent environments for automated decision-making, given the large volume of agent-based decision-making in various settings today.
Title: Methodology and Multicriteria Algorithm for Group Decision Support in Classification Problems
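The parameter-level aggregation step can be sketched in a few lines. The weighted-average operator, the member names, and the threshold profile below are generic stand-ins for illustration, not the NeXClass procedure itself:

```python
def aggregate(preferences, weights):
    """preferences: {member: {parameter: value}}; weights: {member: weight}.
    Returns the weighted-average group value for each parameter."""
    params = next(iter(preferences.values())).keys()
    total = sum(weights.values())
    return {p: sum(weights[m] * prefs[p] for m, prefs in preferences.items()) / total
            for p in params}

def classify(score, thresholds):
    # assign the highest category whose threshold the score meets
    for label, cut in sorted(thresholds.items(), key=lambda kv: -kv[1]):
        if score >= cut:
            return label
    return "reject"

# hypothetical member preferences on two category thresholds, plus weights
prefs = {
    "m1": {"accept": 0.8, "review": 0.5},
    "m2": {"accept": 0.7, "review": 0.4},
    "m3": {"accept": 0.9, "review": 0.6},
}
weights = {"m1": 1.0, "m2": 1.0, "m3": 2.0}
group = aggregate(prefs, weights)   # group profile fed to the classifier
```

Consensus can then be measured by how far each member's parameter values lie from the group profile, and a low acceptance level sends the parameters back to the facilitator, as the abstract describes.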
Pub Date: 2023-08-29 | DOI: 10.37394/23205.2023.22.13
R. Al-Khannak, Sajjan Singh Nehal
This paper discusses the methods, tools, approaches, and techniques used for penetration testing of a cloud-based web application on the Amazon AWS platform. The findings of a penetration test can be used to fix weaknesses and vulnerabilities and to significantly improve security. The testing is implemented by undertaking a malicious attack aiming to breach system networks and thereby confirm the presence of vulnerabilities in the cloud infrastructure. The research focuses on high-risk vulnerabilities of cloud-based web applications such as unrestricted file upload, command injection, and cross-site scripting. The outcomes expose and confirm several vulnerabilities, flaws, and mistakes in the cloud-based web application under test. It is concluded that some vulnerabilities have to be considered before architecting the cloud system. Recommendations propose solutions to the testing results.
Title: Penetration Testing for the Cloud-Based Web Application
Pub Date: 2023-08-03 | DOI: 10.37394/23205.2023.22.10
Daehwan Lee, J. Jeong, Chaegyu Lee, Hakjun Moon, Jaeuk Lee, Dongyoung Lee
In this paper, a Siamese network-based WDCNN + LSTM model is used to diagnose bearing faults with a few-shot learning algorithm. Recently, deep learning-based fault diagnosis methods have achieved good results in equipment fault diagnosis, but limitations remain in existing research. The biggest problem is that a large number of training samples is required to train a deep learning model: manufacturing sites are complex, it is not easy to intentionally create equipment defects, and it is impossible to obtain enough training samples for all failure types under all working conditions. Therefore, in this study, we propose a Siamese network-based WDCNN + LSTM bearing fault diagnosis model with a few-shot learning algorithm that can learn effectively from limited data.
Title: Bearing Fault Diagnosis of WDCNN-LSTM in Siamese Network
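The metric-learning idea behind a Siamese few-shot classifier can be reduced to nearest-prototype matching in an embedding space. In the sketch below, the paper's WDCNN + LSTM encoder is replaced by a toy fixed embedding, and the vibration signals are synthetic, so this only illustrates the few-shot mechanism, not the proposed model:

```python
import numpy as np

def embed(signal):
    # toy embedding: a few summary statistics of the vibration signal
    # (a Siamese WDCNN+LSTM would learn this mapping instead)
    return np.array([signal.mean(), signal.std(), np.abs(np.diff(signal)).mean()])

def few_shot_classify(query, support):
    """support: {label: list of example signals}, e.g. 5 shots per fault class.
    Classify the query by distance to each class's mean embedding."""
    prototypes = {lab: np.mean([embed(s) for s in sigs], axis=0)
                  for lab, sigs in support.items()}
    return min(prototypes,
               key=lambda lab: np.linalg.norm(embed(query) - prototypes[lab]))

rng = np.random.default_rng(7)
healthy = [rng.normal(0, 0.1, 256) for _ in range(5)]
faulty = [rng.normal(0, 1.0, 256) for _ in range(5)]  # higher-energy vibration
support = {"healthy": healthy, "faulty": faulty}
```

Because only a prototype per class is needed, a handful of labeled samples per fault type suffices, which is exactly the limited-data regime the abstract targets.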
Pub Date: 2023-08-03 | DOI: 10.37394/23205.2023.22.12
M. Kucherov, N. Bogulskaya
Designing security from the hardware level up is essential to ensure the integrity of the intelligent cyber-physical infrastructure that is the Industrial Internet of Things (IIoT). If intelligent cyber-physical infrastructure fails to do the right things because it is insecure and vulnerable, there will be negative social consequences [1]. Security is, in a sense, access control to IIoT systems, which increasingly relies on the ability to compose different policies. Therefore, the advantage of any framework for composing policies is that it is intuitive, formal, expressive, and application-independent, as well as extensible to create domain-specific instances. Recently, such a scheme was proposed based on the Belnap logic FOUR2 [2]. The four values of the Belnap bilattice have been interpreted as grant, deny, conflict, or unspecified with respect to access-control policy. Belnap's four-valued logic has found a variety of applications in fields such as deductive database theory and distributed logic programming. However, it turns out that the truth order in FOUR2 is simultaneously a truth-and-falsity order [3]. The smallest lattice in which the orders of truth and falsity are independent of each other, which is especially important for security policy, is Shramko-Wansing's SIXTEEN3. This generalization is well motivated and leads from the bilattice FOUR2, with an information ordering and a truth-and-falsity ordering, to another algebraic structure, namely the trilattice SIXTEEN3, with an information ordering together with a truth ordering and a (distinct) falsity ordering. Based on SIXTEEN3 and new Boolean predicates to control access [4], we define an expressive access-control policy language whose composition statements are based on the statements of Shramko-Wansing's logic. Natural orderings on policies are obtained by independently lifting the orders of truth and falsity of the trilattice, which results in a query language in which conflict-freedom analysis can be developed. The reduction of formal verification of queries to verification of predicates over access requests enables policy analysis to be carried out. We evaluate our approach through examples of access-control model policy.
Title: Trilattice-Based Access Control Models: How to Secure Current Computer Network
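The four-valued core of such a policy framework can be sketched directly. This is only the FOUR2 fragment described above (grant / deny / conflict / unspecified) with its information-order join and one common resolution operator, not the full SIXTEEN3 trilattice with its separate truth and falsity orders:

```python
from enum import Enum

class Decision(Enum):
    UNSPEC = "unspecified"   # policy is silent on the request
    GRANT = "grant"
    DENY = "deny"
    CONFLICT = "conflict"    # policies disagree

def join(a, b):
    # information-order join of the Belnap bilattice: combine two verdicts
    if a == b or b == Decision.UNSPEC:
        return a
    if a == Decision.UNSPEC:
        return b
    return Decision.CONFLICT  # grant vs deny, or anything vs conflict

def deny_overrides(d):
    # a common resolution operator: treat conflicts as denials
    return Decision.DENY if d in (Decision.DENY, Decision.CONFLICT) else d

# two hypothetical sub-policies voting on the same access request
verdict = join(Decision.GRANT, Decision.DENY)
final = deny_overrides(verdict)
```

SIXTEEN3 refines this picture by letting the truth and falsity components of each verdict be ordered independently, which is what makes the conflict-freedom analysis described in the abstract expressible.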