Pub Date: 2024-02-12 · DOI: 10.1109/UVS59630.2024.10467043
Vijaya Bhaskar Reddy Muvva, Ramesh Kumpati, Wojciech Skarka
In this work, an artificial intelligence and IoT-based robotic network is proposed to optimize crop growth in the agricultural fields of the Sultanate of Oman. Traditionally, weed detection systems extract color and texture features from images, and machine learning algorithms then identify weeds based on those features. The major drawbacks of such feature extraction, however, are the loss of image originality and quality, along with performance issues. To overcome these issues, we propose a simple and efficient weed detection model using deep-learning techniques: an image comparison model based on convolutional neural networks (CNNs). The model was simulated in Python using Visual Studio Code. First, to train the CNN, we collected a sample of 1300 images of 512 × 512 pixels from several potato farms in Sohar and grouped them into two clusters: cluster one comprises 737 weed images, while cluster two comprises 563 non-weed images. Loading the full-resolution images, however, takes more time and memory and degrades model performance, so we resized them to 200 × 200 pixels and stored them in a two-dimensional array with binary labels, using a seed value of 42. The labels are stored in memory as zero (0) for non-weed images and one (1) for weed images. This array is fed into a CNN that uses the rectified linear unit (ReLU) as the activation function for the convolution and normalization layers. As a result, each image is compared against the others and weeds are detected effectively; however, 64 training iterations are required to improve the model's efficiency. Second, the model was tested using random images from both clusters, and it successfully identified weeds and non-weeds.
Finally, we developed an autonomous robot based on an ESP32 microcontroller with motors, together with a Raspberry Pi 3B+ and a camera, to test the model's efficiency in real time. The robot detected weed and non-weed images with 95.96% accuracy.
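The paper does not include code, but the preprocessing pipeline described above (resize 512 × 512 images to 200 × 200, label weed images 1 and non-weed images 0, fix the random seed at 42) can be sketched with NumPy. The image data below is random noise standing in for the Sohar farm photos, and the nearest-neighbour resize is an assumption, since the abstract does not name an interpolation method:

```python
import numpy as np

def resize_nearest(img, out_h=200, out_w=200):
    """Nearest-neighbour downscale; the abstract does not name an
    interpolation method, so this choice is an assumption."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

rng = np.random.default_rng(42)  # seed value 42, as stated in the paper

# Random-noise stand-ins for the 512 x 512 farm photos (the real dataset
# has 737 weed and 563 non-weed images; 3 + 2 are used here to stay small).
weed_imgs = [rng.integers(0, 256, (512, 512), dtype=np.uint8) for _ in range(3)]
clean_imgs = [rng.integers(0, 256, (512, 512), dtype=np.uint8) for _ in range(2)]

X = np.stack([resize_nearest(im) for im in weed_imgs + clean_imgs])
y = np.array([1] * len(weed_imgs) + [0] * len(clean_imgs))  # 1 = weed, 0 = non-weed

# Shuffle with the same fixed generator so the ordering is reproducible.
order = rng.permutation(len(X))
X, y = X[order], y[order]
```

The resulting `X` array is what would be fed to the CNN; a real run would load the images from disk instead of generating noise.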
Title: Efficient Weed Detection Using CNN with an Autonomous Robot · 2024 2nd International Conference on Unmanned Vehicle Systems-Oman (UVS), pp. 1-7
Pub Date: 2024-02-12 · DOI: 10.1109/UVS59630.2024.10467166
Hamam Mokayed, Amirhossein Nayebiastaneh, Lama Alkhaled, Stergios Sozos, Olle Hagner, Björn Backe
In the world of autonomous systems and aerial surveillance, the quest to efficiently detect vehicles in diverse environmental conditions has emerged as a pivotal challenge. While these technologies have made significant advancements in identifying objects under ordinary circumstances, the complexities introduced by snow-laden landscapes present a unique set of hurdles. The deployment of unmanned aerial vehicles (UAVs) equipped with state-of-the-art detectors in snowy regions has become an area of intense research, as it holds promise for various applications, from search-and-rescue operations to efficient transportation management. This paper explores the complexities that surface when identifying vehicles in snowy landscapes with drones. It delves into the intricacies of this state-of-the-art undertaking, offering insights into potential future directions for tackling these challenges under the unique demands of such environments. The research applies the conventional procedures typically used to enhance the performance of state-of-the-art (SOTA) detectors such as YOLO and Faster R-CNN, to underscore that adhering to traditional approaches may not suffice to achieve the desired level of efficiency and accuracy from an industrial standpoint. The code and the dataset will be available at https://nvd.ltu-ai.dev/
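One conventional enhancement procedure of the kind the paper evaluates is photometric data augmentation biased toward snowy scenes. The sketch below is illustrative only; the parameter ranges are invented and are not taken from the paper or the NVD dataset:

```python
import numpy as np

def snow_augment(img, rng):
    """Photometric jitter meant to mimic snowy scenes: raise brightness,
    flatten contrast, and sprinkle bright pixels. Parameter ranges are
    illustrative guesses, not values from the paper."""
    out = img.astype(np.float32)
    gain = rng.uniform(0.6, 0.9)    # reduced contrast from flat winter light
    bias = rng.uniform(20.0, 60.0)  # overall brightening from snow cover
    out = out * gain + bias
    # Random bright speckles standing in for falling or settled snow.
    mask = rng.random(out.shape[:2]) < 0.01
    out[mask] = 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)  # dummy UAV frame
aug = snow_augment(frame, rng)
```

Augmented frames like `aug` would be mixed into the training set of a detector such as YOLO or Faster R-CNN; the paper's point is that such conventional measures alone may not reach industrial-grade accuracy.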
Title: Challenging YOLO and Faster RCNN in Snowy Conditions: UAV Nordic Vehicle Dataset (NVD) as an Example · 2024 2nd International Conference on Unmanned Vehicle Systems-Oman (UVS), pp. 1-6
Pub Date: 2024-02-12 · DOI: 10.1109/UVS59630.2024.10467174
Foisal Ahmed, Maksim Jenihhin
The growing use of Unmanned Aerial Vehicles (UAVs) implies high reliability and safety requirements, particularly for safety- and mission-critical applications. To ensure flawless operation of a UAV, it is essential to recognize and isolate faults at all layers before they cause system failures. This paper presents an integrated Bayesian network-based method for UAV health management, considering the cross-layer dependencies of various submodules such as avionics, propulsion, sensors and actuators, communication modules, and onboard computers. The approach employs Failure Mode and Effect Analysis (FMEA) in a cross-layer manner, factoring in dependencies across subsystems to enhance Fault Detection and Isolation (FDI) performance. By converting FMEA-derived faults and failure events into a cohesive Bayesian network, the proposed methodology enables efficient identification and quantification of fault probabilities based on evidence gathered from sensor data. The paper includes case studies and numerical examples that illustrate the efficacy of the proposed methodology in analysing UAV health and isolating faults in intricate, interdependent systems.
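The core computation, posterior fault probabilities in a Bayesian network given sensor evidence, can be illustrated with a deliberately tiny two-fault network. The structure and all probabilities below are invented for illustration; the paper's actual networks span avionics, propulsion, sensors and actuators, communication, and onboard computing:

```python
from itertools import product

# Toy cross-layer model with illustrative numbers (not from the paper):
# a sensor fault S or an actuator fault A can each cause an observed
# attitude-error symptom E.
P_S = 0.02  # prior probability of an IMU sensor fault
P_A = 0.01  # prior probability of a motor/actuator fault
# P(E=1 | S, A): conditional probability table for the symptom.
P_E = {(0, 0): 0.001, (1, 0): 0.90, (0, 1): 0.85, (1, 1): 0.99}

def posterior(evidence_e=1):
    """P(S=1 | E) and P(A=1 | E) by brute-force enumeration of the joint."""
    joint = {}
    for s, a in product((0, 1), repeat=2):
        p = (P_S if s else 1 - P_S) * (P_A if a else 1 - P_A)
        pe = P_E[(s, a)]
        joint[(s, a)] = p * (pe if evidence_e else 1 - pe)
    z = sum(joint.values())  # normalization constant P(E = evidence)
    p_s = sum(v for (s, a), v in joint.items() if s) / z
    p_a = sum(v for (s, a), v in joint.items() if a) / z
    return p_s, p_a

# Observing the attitude-error symptom raises both fault posteriors
# well above their priors, which is the basis for isolating the fault.
p_s, p_a = posterior(evidence_e=1)
```

A full FDI system would build this network automatically from the FMEA tables and update it as new sensor evidence arrives.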
Title: Cross-layer Bayesian Network for UAV Health Monitoring · 2024 2nd International Conference on Unmanned Vehicle Systems-Oman (UVS), pp. 1-7
Pub Date: 2024-02-12 · DOI: 10.1109/UVS59630.2024.10467157
César García-Gascón, Pablo Castelló-Pedrero, J. A. García-Manrique
This paper outlines the methodology employed in the design and production of a fixed-wing aircraft using additive manufacturing techniques while integrating solar panel technology. A study of the main minimal surfaces has been carried out for the material extrusion process. The results show that it is possible to design and fabricate small UAVs using minimal-surface infill as reinforcement, with a stiffness-to-weight ratio equal or superior to that of conventional materials and processes. Gyroid structures maximize mechanical properties; however, integrating elements such as spars and actuators into minimal-surface structures is complex. This study showcases the capability of additive manufacturing to expedite the development of UAVs, resulting in significant reductions in both production cost and time.
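Gyroid infill of the kind discussed above is usually generated from the standard trigonometric approximation of the gyroid surface. A minimal sketch, assuming the common level-set formulation (the paper itself does not give the equation):

```python
import numpy as np

def gyroid(x, y, z):
    """Implicit gyroid approximation: g(x, y, z) = 0 defines the triply
    periodic minimal surface widely used as a lightweight infill pattern."""
    return (np.sin(x) * np.cos(y)
            + np.sin(y) * np.cos(z)
            + np.sin(z) * np.cos(x))

# Sample one periodic cell and estimate the solid volume fraction when the
# region g < t is filled with material; by symmetry, t = 0 splits the cell
# into two equal halves.
n = 48
axis = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
G = gyroid(X, Y, Z)
fill_fraction = float(np.mean(G < 0.0))
```

Shifting the threshold `t` away from zero is one way a slicer can trade infill density against weight, which is the stiffness-to-weight lever the study exploits.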
Title: Solar-Powered UAV Design: Leveraging Minimal Surfaces in Material Extrusion Additive Manufacturing · 2024 2nd International Conference on Unmanned Vehicle Systems-Oman (UVS), pp. 1-7