
Latest publications from the 2024 2nd International Conference on Unmanned Vehicle Systems-Oman (UVS)

Efficient Weed Detection Using CNN with an Autonomous Robot
Pub Date : 2024-02-12 DOI: 10.1109/UVS59630.2024.10467043
Vijaya Bhaskar Reddy Muvva, Ramesh Kumpati, Wojciech Skarka
In this work, an artificial intelligence and IoT-based robotic network is proposed to optimize crop growth in the agricultural fields of the Sultanate of Oman. Traditional weed detection systems extract color and texture features from images, and machine learning algorithms then identify weeds from these features. The major drawbacks of feature extraction, however, are loss of image fidelity and quality along with performance issues. To overcome these issues, we propose a simple and efficient weed detection model using deep-learning techniques. In this research, an image comparison model using convolutional neural networks (CNN) was developed for weed detection; the model was simulated with Python programs in Visual Studio Code. First, to train the CNN model, we collected a sample of 1300 images of 512 by 512 (512 * 512) pixels from various potato farms in Sohar and grouped them into two clusters: cluster one comprises 737 weed images, while cluster two comprises 563 non-weed images. Loading these images at full resolution, however, takes more time and memory and degrades the model's performance, so we resized them to 200 by 200 (200 * 200) pixels and stored them in a two-dimensional array with binary labels, using a seed value of 42: zero (0) for non-weed images and one (1) for weed images. This array is given as input to a CNN that uses the rectified linear unit as the activation function for convolution and normalization, so that images are compared against each other and weeds are detected effectively. The model required 64 training iterations to reach its reported efficiency. Second, the model was tested on random images from both clusters and successfully distinguished weeds from non-weeds. Finally, we developed an autonomous robot driven by an ESP32 microcontroller with motors, together with an embedded Raspberry Pi 3B+ and camera, to test the model's efficiency in real time. The robot detected weed and non-weed images with 95.96% accuracy.
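The preprocessing pipeline the abstract describes (resize to 200 × 200, binary 0/1 labels, seed value 42) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the `resize_nearest` helper and the synthetic stand-in images are assumptions made only to keep the example self-contained.

```python
import numpy as np

def resize_nearest(img, size=200):
    """Nearest-neighbour downscale of a square grayscale image."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return img[np.ix_(rows, cols)]

# Stand-in data: two synthetic 512x512 grayscale "images"
# (the paper's real inputs are field photos from Sohar).
rng = np.random.default_rng(0)
weed_img = rng.integers(0, 256, (512, 512), dtype=np.uint8)
crop_img = rng.integers(0, 256, (512, 512), dtype=np.uint8)

images = np.stack([resize_nearest(weed_img), resize_nearest(crop_img)])
labels = np.array([1, 0])  # 1 = weed, 0 = non-weed, as in the paper

# Shuffle with the paper's stated seed value of 42.
perm = np.random.default_rng(42).permutation(len(images))
images, labels = images[perm], labels[perm]
```

The resized stack would then feed a small CNN with ReLU activations, as the abstract outlines.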
Citations: 0
Challenging YOLO and Faster RCNN in Snowy Conditions: UAV Nordic Vehicle Dataset (NVD) as an Example
Pub Date : 2024-02-12 DOI: 10.1109/UVS59630.2024.10467166
Hamam Mokayed, Amirhossein Nayebiastaneh, Lama Alkhaled, Stergios Sozos, Olle Hagner, Björn Backe
In the world of autonomous systems and aerial surveillance, the quest to efficiently detect vehicles in diverse environmental conditions has emerged as a pivotal challenge. While these technologies have made significant advances in identifying objects under ordinary circumstances, snow-laden landscapes introduce a unique set of hurdles. Deploying unmanned aerial vehicles (UAVs) equipped with state-of-the-art detectors in snowy regions has become an area of intense research, as it holds promise for applications ranging from search and rescue operations to efficient transportation management. This paper explores the complexities of identifying vehicles within snowy landscapes through the use of drones, delving into the intricacies of this state-of-the-art undertaking and offering insights into future directions for tackling these challenges under the unique demands of such environments. The research applies the conventional procedures typically used to enhance the performance of state-of-the-art (SOTA) detectors such as YOLO and Faster RCNN, underscoring that adhering to traditional approaches may not suffice to achieve the desired level of efficiency and accuracy from an industrial standpoint. The code and the dataset will be available at https://nvd.ltu-ai.dev/
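One conventional procedure of the kind the abstract alludes to is weather-style data augmentation applied before retraining a detector. The sketch below is a hypothetical snow-speckle augmentation in plain NumPy, not part of the NVD pipeline; realistic snow simulation (motion-blurred streaks, fog, contrast loss) is considerably richer.

```python
import numpy as np

def add_synthetic_snow(img, density=0.01, flake_value=255, seed=None):
    """Overlay random bright 'snowflake' pixels on an image array.

    A crude stand-in for weather augmentation: each pixel becomes a
    white speck with probability `density`.
    """
    rng = np.random.default_rng(seed)
    out = img.copy()
    mask = rng.random(img.shape[:2]) < density
    out[mask] = flake_value
    return out

frame = np.zeros((64, 64, 3), dtype=np.uint8)   # placeholder aerial frame
snowy = add_synthetic_snow(frame, density=0.05, seed=1)
```

Augmented frames like `snowy` would be mixed into the training set so the detector sees snow-degraded examples during fine-tuning.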
Citations: 0
Cross-layer Bayesian Network for UAV Health Monitoring
Pub Date : 2024-02-12 DOI: 10.1109/UVS59630.2024.10467174
Foisal Ahmed, Maksim Jenihhin
The growing use of Unmanned Aerial Vehicles (UAVs) implies high reliability and safety requirements, particularly for safety- and mission-critical applications. To ensure flawless operation of a UAV, it is essential to recognize and isolate faults at all layers before they cause system failures. This paper presents an integrated Bayesian network-based method for UAV health management that considers the cross-layer dependencies of submodules such as avionics, propulsion, sensors and actuators, communication modules, and onboard computers. The approach applies Failure Mode and Effect Analysis (FMEA) in a cross-layer manner, factoring in dependencies across subsystems to enhance Fault Detection and Isolation (FDI) performance. By converting FMEA-derived faults and failure events into a cohesive Bayesian network, the proposed methodology enables efficient identification and quantification of fault probabilities based on evidence gathered from sensor data. The paper includes case studies and numerical examples that illustrate the efficacy of the proposed methodology in analysing UAV health and isolating faults in intricate, interdependent systems.
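The core inference step, updating a fault probability from sensor evidence, can be illustrated with a minimal two-node fragment of such a network. All probabilities below are hypothetical placeholders, not values from the paper.

```python
# Illustrative two-node fragment: a propulsion fault (F) influences a
# vibration-sensor alarm (A). Numbers are hypothetical stand-ins for
# FMEA-derived rates.
p_fault = 0.02               # prior P(F=1)
p_alarm_given_fault = 0.90   # P(A=1 | F=1), sensor sensitivity
p_alarm_given_ok = 0.05      # P(A=1 | F=0), false-alarm rate

# Evidence: the alarm fired (A=1). Posterior via Bayes' rule.
joint_fault = p_alarm_given_fault * p_fault
joint_ok = p_alarm_given_ok * (1 - p_fault)
posterior_fault = joint_fault / (joint_fault + joint_ok)

print(f"P(fault | alarm) = {posterior_fault:.3f}")  # ~0.269
```

In the full cross-layer network the same computation is carried out over many interdependent nodes, typically by a general-purpose inference engine rather than by hand.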
Citations: 0
Solar-Powered UAV Design: Leveraging Minimal Surfaces in Material Extrusion Additive Manufacturing
Pub Date : 2024-02-12 DOI: 10.1109/UVS59630.2024.10467157
César García-Gascón, Pablo Castelló-Pedrero, J. A. García-Manrique
This paper outlines the methodology employed in the design and production of a fixed-wing aircraft using additive manufacturing techniques while integrating solar panel technology. A study of the main minimal surfaces was carried out for the material extrusion process. The results show that it is possible to design and fabricate small UAVs using minimal-surface infill as reinforcement, with a stiffness-to-weight ratio equal or superior to conventional materials and processes. Gyroid structures maximize mechanical properties; however, integrating elements such as spars and actuators into minimal-surface structures is complex. This study showcases the capability of additive manufacturing to expedite the development of UAVs, yielding significant reductions in both production cost and time.
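As background for the gyroid infill discussed above: the gyroid is commonly approximated by a triply periodic implicit equation, and thickening its zero level set yields a printable shell. The sketch below samples one periodic cell and estimates the solid fraction for a shell half-thickness t; the parameter values are illustrative, not from the paper.

```python
import numpy as np

def gyroid(x, y, z):
    """Standard implicit approximation of the gyroid surface (= 0 on it)."""
    return (np.sin(x) * np.cos(y)
            + np.sin(y) * np.cos(z)
            + np.sin(z) * np.cos(x))

# Sample one 2*pi periodic cell; |g| < t defines a shell of roughly
# uniform thickness usable as sparse infill.
n, t = 64, 0.3   # grid resolution and shell half-thickness (illustrative)
axis = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
solid = np.abs(gyroid(X, Y, Z)) < t
infill_fraction = solid.mean()   # approximate infill density
```

Varying t trades infill density against weight, which is the stiffness-to-weight lever the abstract refers to.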
Citations: 0