A FRAMEWORK FOR MANAGEMENT OF LEAKS AND EQUIPMENT FAILURE IN OIL WELLS
Pub Date: 2024-07-23 | DOI: 10.18535/ijecs/v13i07.4842
Dennis, T. L., A. V I E, Emmah, V. T.
Oil is a precious and critical natural energy resource used in numerous ways to drive industries worldwide. Extracting oil from underground reservoirs is a complex process that demands extensive planning, careful execution, and risk management. Detecting undesirable events such as leaks and equipment failure in oil wells is crucial for preventing safety hazards, environmental damage, and financial losses, yet identifying such issues in a timely and accurate manner remains challenging. This paper describes a hybrid model for detecting undesirable events in oil and gas wells that combines Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) techniques. The CNN is employed to extract relevant features from primary sensor data collected from various wells: its convolutional layers and pooling operations identify patterns and spatial dependencies in the data. The extracted features are then fed into an LSTM network, which captures temporal dependencies and learns long-term patterns. Using the LSTM, the model can effectively analyse the time-series data and detect the occurrence of undesirable events such as abnormal pressure, fluid leakage, or equipment malfunction. This hybrid model, leveraging CNN for feature extraction and LSTM for event detection, offers a comprehensive approach to enhancing well monitoring and preventing potential hazards. Accuracy rates of 99.8% in training and 99.78% in testing demonstrate the model's efficacy in identifying and classifying undesirable events in oil and gas wells.
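A minimal sketch of the CNN-plus-LSTM pipeline the abstract describes, in Keras. The window length, channel count, layer sizes, and class count are illustrative assumptions, not values reported by the authors.

```python
from tensorflow.keras import layers, models

WINDOW = 128    # assumed number of time steps per sensor window
CHANNELS = 8    # assumed number of sensor channels per well
N_CLASSES = 9   # assumed: normal operation plus eight undesirable-event classes

model = models.Sequential([
    # CNN stage: convolution + pooling extract local patterns from raw sensor windows
    layers.Conv1D(32, kernel_size=5, activation="relu", input_shape=(WINDOW, CHANNELS)),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # LSTM stage: captures temporal dependencies across the extracted feature sequence
    layers.LSTM(64),
    layers.Dropout(0.3),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```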
{"title":"A FRAMEWORK FOR MANAGEMENT OF LEAKS AND EQUIPMENT FAILURE IN OIL WELLS","authors":"Dennis, T. L., A. V I E, Emmah, V. T.","doi":"10.18535/ijecs/v13i07.4842","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4842","url":null,"abstract":"Oil is a precious and critical natural energy resource that is used in numerous ways to drive various industries worldwide. The extraction of oil from underground reservoirs is a complex process that requires a lot of planning, careful execution, and risk management. In this paper, CNN is employed to extract relevant features from sensor primary data collected from various wells. Detecting undesirable events such as leaks and equipment failure in oil wells is crucial for preventing safety hazards, environmental damage and financial losses, making it challenging to identify issues in a timely and accurate manner. This dissertation describes a hybrid model for detecting undesirable events in oil and gas wells using a combination of Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) techniques. The CNN architecture enables effective information extraction by applying convolutional layers and pooling operations to identify patterns and spatial dependencies in the data. The extracted features are then fed into an LSTM network, which can capture temporal dependencies and learning long-term patterns. By utilizing LSTM, the model can effectively analyse the time series data and detect the occurrence of undesirable events, such as abnormal pressure, fluid leakage, or equipment malfunction, in oil and gas wells. The hybrid model leveraging CNN for feature extraction and LSTM for detecting undesirable events in the oil and gas industry presents a comprehensive approach to enhance well monitoring and prevent potential hazards. Achieving high accuracy rates of 99.8% for training and 99.78% for testing demonstrates the efficacy of the proposed model in accurately identifying and classifying undesirable events in oil and gas wells.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":"30 5","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141813031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predictive Analytics for Demand Forecasting: A Deep Learning-Based Decision Support System
Pub Date: 2024-07-21 | DOI: 10.18535/ijecs/v13i07.4853
Saurabh Kumar, Mr. Amar Nayak
Demand forecasting is a critical component of supply chain management and business operations, enabling organizations to make informed decisions about production, inventory management, and resource allocation. In recent years, predictive analytics has emerged as a powerful tool for enhancing the accuracy and efficiency of demand forecasting. This review paper explores the transformative role of predictive analytics and deep learning in demand forecasting. It examines how these advanced techniques have evolved from traditional models based on past sales data, offering nuanced predictions through sophisticated statistical and machine learning methods. Deep learning, with its neural network structures, brings automatic feature learning, complex pattern handling, and scalability, enhancing forecasting in sectors like retail, manufacturing, and healthcare. The paper reviews various deep learning models, compares them with traditional methods, and discusses their impact on business operations and decision-making. It concludes by looking at future trends in predictive analytics and deep learning in demand forecasting.
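An illustrative sketch of one deep-learning forecaster of the kind the review discusses: an LSTM that maps a window of past demand to the next period's value. The lookback window, layer width, and the toy seasonal series are assumptions for demonstration only.

```python
import numpy as np
from tensorflow.keras import layers, models

LOOKBACK = 12  # assumed: forecast from the previous 12 periods

model = models.Sequential([
    layers.LSTM(32, input_shape=(LOOKBACK, 1)),
    layers.Dense(1),  # next-period demand (regression)
])
model.compile(optimizer="adam", loss="mse")

# Toy series: sliding windows over a noisy seasonal signal stand in for sales history
t = np.arange(200, dtype="float32")
series = 100 + 10 * np.sin(2 * np.pi * t / 12) + np.random.randn(200).astype("float32")
X = np.stack([series[i:i + LOOKBACK] for i in range(len(series) - LOOKBACK)])[..., None]
y = series[LOOKBACK:]
model.fit(X, y, epochs=5, verbose=0)
```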
{"title":"Predictive Analytics for Demand Forecasting: A deep Learning-based Decision Support System","authors":"Saurabh Kumar, Mr. Amar Nayak","doi":"10.18535/ijecs/v13i07.4853","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4853","url":null,"abstract":"Demand forecasting is a critical component of supply chain management and business operations, enabling organizations to make informed decisions about production, inventory management, and resource allocation. In recent years, predictive analytics has emerged as a powerful tool for enhancing the accuracy and efficiency of demand forecasting. This review paper explores the transformative role of predictive analytics and deep learning in demand forecasting. It examines how these advanced techniques have evolved from traditional models based on past sales data, offering nuanced predictions through sophisticated statistical and machine learning methods. Deep learning, with its neural network structures, brings automatic feature learning, complex pattern handling, and scalability, enhancing forecasting in sectors like retail, manufacturing, and healthcare. The paper reviews various deep learning models, compares them with traditional methods, and discusses their impact on business operations and decision-making. It concludes by looking at future trends in predictive analytics and deep learning in demand forecasting.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":"31 8","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141818368","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Model for Detection of Malwares on Edge Devices
Pub Date: 2024-07-21 | DOI: 10.18535/ijecs/v13i07.4846
Nwagwu, C. B., Taylor, O. E., Nwiabu, N. D.
Malware detection is a significant challenge in today's digital landscape. As new forms of malware are continuously developed, traditional detection techniques often fall short because they cannot detect these new strains. This paper introduces meaningful features that effectively capture various types of malware on edge devices, including viruses, worms, Trojans, and ransomware. The proposed model implements a random forest classifier for feature selection and a support vector machine (SVM) for malware detection. Object-Oriented Analysis and Design (OOAD) was used as the design methodology, which involved identifying and modelling the different components of the system and their interactions. The system was developed in Python, with the model deployed via Flask for web-based testing and execution. Experimental results demonstrate the effectiveness of the proposed system compared with existing systems: it achieves a detection accuracy of 99.98%, outperforming existing techniques. This paper presents a promising direction for improving malware detection with an SVM model and highlights the potential for collaborative learning approaches to overcome the challenges of traditional centralized approaches. The experiments also simulate an edge device performing malware detection, measuring the latency of each detection, reporting whether it is high or low, and plotting latency over multiple requests; the proposed model maintained a low latency of between 0.15 s and 0.25 s across multiple requests.
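A minimal sketch of the described pipeline: a random-forest feature selector feeding an SVM classifier. The dataset here is synthetic and the hyperparameters are illustrative assumptions; the paper's own features would replace the stand-in matrix.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for extracted malware features (API calls, permissions, etc.)
X, y = make_classification(n_samples=2000, n_features=50, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = make_pipeline(
    SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0)),  # RF feature selection
    StandardScaler(),
    SVC(kernel="rbf", C=1.0),  # malware / benign decision
)
pipeline.fit(X_train, y_train)
print(f"test accuracy: {pipeline.score(X_test, y_test):.4f}")
```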
{"title":"A Model for Detection of Malwares on Edge Devices","authors":"Nwagwu, C .B., Taylor O. E., Nwiabu N.D","doi":"10.18535/ijecs/v13i07.4846","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4846","url":null,"abstract":"Abstract- Malware detection is a significant challenge in today's digital landscape. As new forms of malware are continuously being developed, traditional detection techniques often fall short due to their inability to detect these new strains. This paperintroduces meaningful features that effectively capture various types of malware, including viruses, worms, Trojans and Ransomware on Edge devices. The paper used a model that implemented Random forest classifier for feature selection and a support vector machine (SVM) model for Malware detection. Object-Oriented Analysis and Design (OOAD) methodology was used to as the design methodology, which involved identifying and modeling the different components of the system and their interactions. The system was developed using Python programming language, with an emphasis on model deployment via Python Flask for web-based testing and execution. The experimental results demonstrate the effectiveness of the proposed systems when compared with other existing system. The result gotten from proposed system is better than that of the existing system by achieving a detection accuracy of 99.98% which is better than existing techniques. This dissertation presents a promising direction for improving malware detection using support vector machine (SVM) model and highlights the potential for collaborative learning approaches to overcome the challenges of traditional centralized approaches. This result simulates edge device that performs malware detection. It measures the latency for each detection and prints whether the latency is high or low. After the simulation, it plots a graph to visualize the latency over multiple requests. Which shows that the proposed model had low latency between 0.25secs to 0.15 secs on multiple requests.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":"33 5","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141818537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Data-Driven Approach to Automated Lyric Generation
Pub Date: 2024-07-21 | DOI: 10.18535/ijecs/v13i07.4839
Jeyadev Needhi, D. Kk, Vishnu G, Ram Prasath G
This project leverages Recurrent Neural Networks (RNNs) to generate coherent and contextually relevant song lyrics. The methodology includes extensive text preprocessing and dataset creation, followed by the construction of a robust model featuring Embedding, Gated Recurrent Unit (GRU), Dense, and Dropout layers. The model is compiled and trained using the Adam optimizer, with checkpointing to monitor and optimize the training process. Upon successful training on a comprehensive lyrics dataset, the model is thoroughly evaluated and fine-tuned to enhance performance. Finally, the model generates new lyrics from a given seed, showcasing its ability to learn intricate linguistic patterns and structures, thereby offering a powerful tool for creative and original lyric composition.
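A sketch of the stated architecture — Embedding, GRU, Dropout, and Dense layers compiled with Adam, plus a training checkpoint — in Keras. The vocabulary size, sequence handling, and layer widths are assumed values; the tokenized lyric corpus would supply the real training arrays.

```python
from tensorflow.keras import layers, models, callbacks

VOCAB = 5000  # assumed vocabulary size after preprocessing

model = models.Sequential([
    layers.Embedding(VOCAB, 128),              # token ids -> dense vectors
    layers.GRU(256),                           # sequence modelling
    layers.Dropout(0.2),                       # regularization
    layers.Dense(VOCAB, activation="softmax"), # distribution over the next token
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Checkpointing monitors training and keeps the best weights, as described
ckpt = callbacks.ModelCheckpoint("lyrics_gru.keras", monitor="loss", save_best_only=True)
# model.fit(X, y, epochs=30, callbacks=[ckpt])  # X, y from the tokenized lyric corpus
```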
{"title":"Data-Driven Approach to Automated Lyric Generation","authors":"Jeyadev Needhi, D. Kk, Vishnu G, Ram Prasath G","doi":"10.18535/ijecs/v13i07.4839","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4839","url":null,"abstract":"This project leverages Recurrent Neural Networks(RNNs) to generate coherent and contextually relevant songlyrics. The methodology includes extensive text preprocessing anddataset creation, followed by the construction of a robust modelfeaturing Embedding, Gated Recurrent Unit (GRU), Dense, andDropout layers. The model is compiled and trained using theAdam optimizer, with checkpointing to monitor and optimize thetraining process. Upon successful training on a comprehensivelyrics dataset, the model is thoroughly evaluated and fine-tunedto enhance performance. Finally, the model generates new lyricsfrom a given seed, showcasing its ability to learn intricatelinguistic patterns and structures, thereby offering a powerfultool for creative and original lyric composition.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":"51 25","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141818072","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
ENHANCE DOCUMENT VALIDATION: UIPATH-POWERED SIGNATURE VERIFICATION
Pub Date: 2024-07-17 | DOI: 10.18535/ijecs/v13i07.4851
Mrs. K. Gowri, A. Aswath, A. P. Adarsh, R. S. K. Gowtham Balaji
Signatures are widely used as a means of personal identification and verification. Many documents, such as bank cheques and legal transactions, require signature verification, and verifying signatures on a large number of documents is a difficult and time-consuming task. Consequently, explosive growth has been observed in biometric personal verification and authentication systems based on quantifiable physical characteristics (fingerprints, hand geometry, face, ear, iris scan, or DNA) or behavioural features (gait, voice, etc.). Because traditional identity verification methods such as tokens, passwords, and PINs suffer from fatal flaws and cannot satisfy security requirements, this paper considers a more reliable biometric feature: signature verification. We present a survey of signature verification systems, classifying and describing the various approaches that have been proposed.
{"title":"ENHANCE DOCUMENT VALIDATION UIPATH POWERED SIGNATURE VERIFICATION","authors":"Mrs. K. Gowri, A. Aswath, A. P. Adarsh, R. S. K. Gowtham Balaji","doi":"10.18535/ijecs/v13i07.4851","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4851","url":null,"abstract":"Abstract—Signatures are widely used as a means of personal identification and verification. Many documents like bank cheques and legal transactions require signature verification. Signature-based verification of a large number of documents is a very difficult and time-consuming task. Consequently, an explosive growth has been observed in biometric personal verification and authentication systems that are connected with quantifiable physical unique characteristics (finger prints, hand geometry, face, ear, iris scan, or DNA) or behavioural features (gait, voice etc.). As traditional identity verification methods such as tokens, passwords, pins etc suffer from some fatal flaws and are incapable to satisfy the security necessities, the paper aims to consider a more reliable biometric feature, signature verification for the considering. We present a survey of signature verification systems. We classify and give an account of the various approaches that have been proposed for signature verification.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":" 4","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141827874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Feature value quantization and reduction process for predicting heart attack possibility and the level of severity by a machine learning model
Pub Date: 2024-07-17 | DOI: 10.18535/ijecs/v13i07.4831
Md Zawharul Islam, Md. Atahar Ishrak, A. H. M. Kamal
Heart disease is a prevalent condition that, if undiagnosed, can be deadly. Researchers have designed many machine learning models to predict heart disease. In this study, we propose a model that selects fewer attribute columns for training and uses those chosen features to determine the severity of the heart problem. Repeated correlation heat maps and information gain were used for feature selection. We trained our model on the UCI Cleveland heart disease dataset, removing duplicate records to improve the accuracy score and encoding the categorical data with the One-Hot (OH) encoding method, which can improve prediction accuracy. Eight classifier algorithms were used overall: Support Vector Machine, Logistic Regression, K-Nearest Neighbour, Naive Bayes, Decision Tree, Random Forest, AdaBoost, and XGBoost. Based on the repeated heat-map correlation, we compare the accuracy score at each step; in this proposed method, the AdaBoost classifier applied to the fbs-row heat map achieves the highest accuracy for heart disease detection, at 92%. Selecting features by information gain value, we again compare accuracy scores each time; both XGBoost and Logistic Regression achieved an accuracy of 93.44%, although Logistic Regression requires less time than XGBoost. Accuracy, precision, recall, F1-score, sensitivity, specificity, and the AUC of ROC charts were used to evaluate the model's performance. Overall, the results demonstrate that our model is reliable and accurate in identifying cardiac disease and its level of severity.
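A sketch of the information-gain branch of this pipeline in scikit-learn: mutual information (information gain) ranks features, the top-k are kept, and logistic regression is trained on them. The file name "cleveland.csv", the "target" column, and k = 8 are assumptions for illustration, not details from the paper.

```python
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("cleveland.csv").drop_duplicates()  # hypothetical path to the UCI Cleveland data
X = pd.get_dummies(df.drop(columns="target"))        # one-hot encode categorical columns
y = df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=8),  # keep the 8 most informative features (assumed k)
    StandardScaler(),
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
print(f"accuracy: {clf.score(X_test, y_test):.4f}")
```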
{"title":"Feature value quantization and reduction process for predicting heart attack possibility and the level of severity by a machine learning model","authors":"Md Zawharul Islam, Md. Atahar Ishrak, A. H. M. Kamal","doi":"10.18535/ijecs/v13i07.4831","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4831","url":null,"abstract":"Heart disease is a prevalent condition nowadays that, if undiagnosed, can be deadly. To predict heart disease, \u0000researchers designed many machine learning models. In this study, we propose a model that chooses fewer attribute columns for training, and we use these chosen features to determine the heart problem severity. Correlation Repeated Heat map and Information Gain were used for selecting the features. To train our model we used the UCI Cleveland heart disease dataset. We removed duplicate data to improve the accuracy score, and we also encoded the categorical data collection using the OneHot(OH) encoding method, which can improve prediction accuracy. Support Vector, Logistic Regression, K-Nearest Neighbour, Naive Bayes, Decision Tree, Random Forest, Adaboost, and XGBoost are the eight classifier algorithms that are used in this process overall. Based on repeated heat map correlation, we compare the accuracy score each time. In this proposed method, the Adaboost classification algorithm used by the fbs row heat map achieves the highest accuracy for heart disease detection and it is 92%. By choosing features according to the information gain value, we compare the accuracy score each time in information gain. For both XGBoost and Logistic Regression, we got an accuracy score of 93.44%. However, compared to the XGBoost classification technique, Logistic Regression requires less time. Accuracy, precision, recall, f1-score, sensitivity, specificity, and the AUC of ROC charts were used to evaluate the performance of the model. Overall, the results of our model demonstrate that it is reliable and accurate in identifying cardiac disease and its level of severeness.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":" 6","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141830728","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Enhancing Plant Disease Detection through Transfer Learning by Incorporating Memory-Augmented Networks and Meta-Learning Approaches
Pub Date: 2024-07-14 | DOI: 10.18535/ijecs/v13i07.4852
Dr. Mohana Priya C
Transfer learning has revolutionized automated plant disease detection by leveraging pre-trained convolutional neural networks (CNNs) on large-scale datasets like ImageNet. This paper explores advanced methodologies in transfer learning, focusing on the integration of memory-augmented networks and meta-learning approaches. These enhancements aim to improve model adaptation to new disease types and environmental conditions, thereby enhancing accuracy and robustness in agricultural applications. The paper reviews existing literature, discusses methodologies, and suggests future research directions to advance the field of AI-driven plant pathology.
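A minimal transfer-learning baseline of the kind the paper builds on: an ImageNet-pretrained CNN with a new classification head for plant-disease classes. The memory-augmented and meta-learning extensions the paper discusses are not shown; the backbone choice, class count, and input size are assumptions.

```python
from tensorflow.keras import applications, layers, models

N_DISEASES = 10  # assumed number of disease classes

base = applications.MobileNetV2(weights="imagenet", include_top=False,
                                input_shape=(224, 224, 3))
base.trainable = False  # freeze pretrained ImageNet features; train only the new head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(N_DISEASES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```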
{"title":"Enhancing Plant Disease Detection through Transfer Learning by Incorporating MemoryAugmented Networks and Meta-Learning Approaches","authors":"Dr. Mohana Priya C","doi":"10.18535/ijecs/v13i07.4852","DOIUrl":"https://doi.org/10.18535/ijecs/v13i07.4852","url":null,"abstract":"Transfer learning has revolutionized automated plant disease detection by leveraging pre-trained convolutional neural networks (CNNs) on large-scale datasets like ImageNet. This paper explores advanced methodologies in transfer learning, focusing on the integration of memory-augmented networks and meta-learning approaches. These enhancements aim to improve model adaptation to new disease types and environmental conditions, thereby enhancing accuracy and robustness in agricultural applications. The paper reviews existing literature, discusses methodologies, and suggests future research directions to advance the field of AI-driven plant pathology. \u0000 ","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":" 28","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141833892","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Performance Optimization of Voice-Assisted File Management Systems
Pub Date: 2024-07-14 | DOI: 10.18535/ijecs/v13i07.4854
Jeyadev Needhi, Ram Prasath G, Vishnu G, D. Kk
In this paper, we present a novel approach for managing the file system in Linux using a voice assistant. Our system allows users to perform file system operations such as creating directories, renaming files, and deleting files by issuing voice commands. We develop a voice assistant using Python libraries and integrate it with the file system in Linux. The voice assistant is capable of understanding natural language and executing commands based on the user’s voice inputs. We conduct experiments to evaluate the performance of the system and demonstrate that our approach is effective and efficient in managing the file system using voice commands. Our system can enhance the accessibility and usability of the file system in Linux for individuals with disabilities or those who prefer a hands-free approach to file management.
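A minimal sketch of this kind of voice-driven file manager. The paper only says "Python libraries", so the SpeechRecognition package here is one plausible choice, not necessarily the authors', and the command grammar is an assumption.

```python
import os
import speech_recognition as sr

def execute(command: str) -> None:
    """Map a recognized phrase to a file-system operation (assumed grammar)."""
    words = command.lower().split()
    if words[:2] == ["create", "directory"] and len(words) > 2:
        os.makedirs(words[2], exist_ok=True)
    elif words[:2] == ["delete", "file"] and len(words) > 2:
        os.remove(words[2])
    elif words[:2] == ["rename", "file"] and len(words) > 4 and words[3] == "to":
        os.rename(words[2], words[4])
    else:
        print(f"unrecognized command: {command}")

recognizer = sr.Recognizer()
with sr.Microphone() as mic:          # capture one utterance from the microphone
    audio = recognizer.listen(mic)
command = recognizer.recognize_google(audio)  # speech -> text
execute(command)
```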
Machine Learning Algorithms for Predictive Maintenance in Autonomous Vehicles
Pub Date: 2024-07-11 | DOI: 10.18535/ijecs/v13i01.4786
Chirag Vinalbhai Shah
The complexity and hazards of autonomous vehicle systems pose a significant challenge for predictive maintenance. Since failures in autonomous vehicle software and hardware can lead to life-threatening crashes, maintenance should be performed regularly to protect human safety. For automotive systems, predicting future failures and acting in advance to maintain system reliability and safety is crucial in large-scale product design. This paper explores several machine learning algorithms used to assess system maintenance needs in autonomous vehicles, including regression, classification, ensemble, clustering, and deep learning techniques. Experimental results indicate that predictive maintenance can greatly help autonomous vehicles, whether by improving system design or by mitigating the risk of threats.
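An illustrative example of one technique from this family: an ensemble (gradient boosting) classifier predicting imminent component failure from sensor features. The features, the failure rule, and the data are synthetic assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(70, 10, n),    # e.g. motor temperature (assumed feature)
    rng.normal(0.5, 0.1, n),  # e.g. vibration RMS (assumed feature)
    rng.uniform(0, 1, n),     # e.g. normalized component age (assumed feature)
])
# Synthetic rule: failures made more likely by heat, vibration, and age
p_fail = 1 / (1 + np.exp(-(0.05 * (X[:, 0] - 80)
                           + 8 * (X[:, 1] - 0.6)
                           + 2 * (X[:, 2] - 0.7))))
y = rng.random(n) < p_fail

clf = GradientBoostingClassifier()
print("mean ROC AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```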
{"title":"Machine Learning Algorithms for Predictive Maintenance in Autonomous Vehicles","authors":"Chirag Vinalbhai Shah","doi":"10.18535/ijecs/v13i01.4786","DOIUrl":"https://doi.org/10.18535/ijecs/v13i01.4786","url":null,"abstract":"The complexity and hazards of autonomous vehicle systems have posed a significant challenge in predictive maintenance. Since the incompetence of autonomous vehicle system software and hardware could lead to life-threatening crashes, maintenance should be performed regularly to protect human safety. For automotive systems, predicting future failures and taking actions in advance to maintain system reliability and safety is very crucial in large-scale product design. This paper will explore several machine learning algorithms including regression techniques, classification techniques, ensemble techniques, clustering techniques, and deep learning techniques used for system maintenance need assessment in autonomous vehicles. Experimental results indicate that predictive maintenance can be greatly helpful for autonomous vehicles either in improving system design or mitigating the risk of threats.","PeriodicalId":231371,"journal":{"name":"International Journal of Engineering and Computer Science","volume":"61 5","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141655388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Convergence of AI, ML, and IoT in Automotive Systems: A Future Perspective on Edge Computing
Pub Date: 2024-07-11 | DOI: 10.18535/ijecs/v11i05.4673
Dilip Kumar Vaka
Edge computing, where sensing, control, and intelligent processing occur near where data is acquired, is poised to be a fundamental enabler of several imminent, disruptive computing paradigms for emerging applications such as CPS, IoT, and more sophisticated AI-driven services. In this context, we examine the convergence of AI, ML, and IoT in automotive systems, the infrastructure required to enable it, and the pivotal role edge computing will play in the real-world deployment of this ecosystem. We also review several digital infrastructure technologies that can vastly enhance these next-generation digital automotive systems, examined through real-world scenarios provided by our partner companies, the prominent Consumer Electronics Show (CES), and other sources. First, several industrial benchmarks demonstrate that the proposed digital infrastructure technologies significantly improve application accuracy, at times exceeding the benefits of equivalent DNN accelerator-based systems in resource-constrained edge computing environments. We then outline the challenges of designing and deploying these technologies in real-world automotive systems. The paper concludes with the verifiable thesis that edge computing technologies must play a significant role in next-generation digital automotive system development, so that the ML-driven AI systems of the future are designed and deployed successfully in the field and deliver their intent of superior user experience, enhanced safety, and convenience.