
Journal of Unmanned Vehicle Systems: Latest Publications

Application of RPAS to disaster risk reduction in Brazil: application in the analysis of urban floods
IF 2.3 Q3 REMOTE SENSING Pub Date : 2021-07-26 DOI: 10.1139/JUVS-2020-0033
Elaiz Aparecida Mensch Buffon, F. Mendonça
Geotechnologies have significant potential for application in socio-environmental analysis coupled with disaster risk reduction. Equipment and applications supported by scientific computing are available, promoting advances in the acquisition and processing of remote sensing data. Two types stand out: (i) equipment associated with LiDAR (light detection and ranging) technology and (ii) remotely piloted aircraft systems (RPAS) carrying remote sensing platforms. Recently, a growing number of studies have explored applications of the sensors carried by RPAS in environmental studies, especially those that evaluate the impacts of natural disasters. In this context, the aim of this research is to demonstrate the possibilities of RPAS applications in collecting data of interest for the management of natural disasters. Also associated with this task is the implementation of some scientific computing techniques required by these applications. With these activities, we seek to contribute to advancing the use of RPAS in managing and preventing the risk of natural disasters.
Citations: 0
A comparison of two novel approaches for conducting detect and avoid flight test
IF 2.3 Q3 REMOTE SENSING Pub Date : 2021-06-18 DOI: 10.1139/juvs-2021-0005
K. Ellis, Iryna Borshchova, S. Jennings, Caidence Paleske
This paper compares two approaches developed by the National Research Council of Canada to conduct “near-miss” intercepts in flight test, and describes a new method for assessing the efficacy of these trajectories. Each approach used a different combination of flight test techniques and displays to provide guidance to the pilots to set up the aircraft on a collision trajectory and to maintain the desired path. Approach 1 provided only visual guidance of the relative azimuth and position of the aircraft, whereas Approach 2 established the conflict point (latitude/longitude) from the desired geometry, and provided cross-track error from the desired intercept as well as speed cueing for the arrival time. The performance of the approaches was analyzed by comparing the proportion of time during which the predicted closest approach distance was below a desired threshold value. The analysis showed that Approach 2 resulted in more than double the amount of time spent at or below the desired closest approach distance across all azimuths flown. Moreover, since less time was required to establish the required initial conditions and to stabilize the flight paths, the authors were able to conduct 50% more intercepts.
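The evaluation metric above, predicted closest approach distance, follows from a closest-point-of-approach calculation on the two tracks. The sketch below is a generic illustration under flat-Earth, constant-velocity assumptions; it is not the NRC flight-test tooling, and all function and variable names are placeholders.

```python
import numpy as np

def closest_point_of_approach(p_own, v_own, p_intruder, v_intruder):
    """Predict time (s) and distance (m) of closest approach for two
    constant-velocity tracks given in a local flat-Earth frame."""
    dp = p_intruder - p_own              # relative position (m)
    dv = v_intruder - v_own              # relative velocity (m/s)
    dv2 = float(np.dot(dv, dv))
    # If the relative velocity is ~zero, the separation never changes.
    t_cpa = 0.0 if dv2 < 1e-9 else max(0.0, -float(np.dot(dp, dv)) / dv2)
    d_cpa = float(np.linalg.norm(dp + dv * t_cpa))
    return t_cpa, d_cpa

# Example: intruder 2 km ahead with a 20 m lateral offset, closing head-on.
t, d = closest_point_of_approach(
    np.array([0.0, 0.0, 300.0]), np.array([60.0, 0.0, 0.0]),
    np.array([2000.0, 20.0, 300.0]), np.array([-60.0, 0.0, 0.0]))
print(f"CPA in {t:.1f} s at {d:.1f} m")  # ~16.7 s, 20.0 m
```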
Citations: 1
Video analysis for the detection of animals using convolutional neural networks and consumer-grade drones
IF 2.3 Q3 REMOTE SENSING Pub Date : 2021-04-15 DOI: 10.1139/JUVS-2020-0018
C. Chalmers, P. Fergus, C. C. Montañez, S. Longmore, S. Wich
Determining animal distribution and density is important in conservation. The process is both time-consuming and labour-intensive. Drones have been used to help mitigate human-intensive tasks by covering large geographical areas over a much shorter timescale. In this paper we investigate this idea further using a proof of concept to detect rhinos and cars from drone footage. The proof of concept utilises off-the-shelf technology and consumer-grade drone hardware. The study demonstrates the feasibility of using machine learning (ML) to automate routine conservation tasks, such as animal detection and tracking. The prototype has been developed using a DJI Mavic Pro 2 and tested over a global system for mobile communications (GSM) network. The Faster-RCNN Resnet 101 architecture is used for transfer learning. Inference is performed with a frame sampling technique to address the required trade-off between precision, processing speed, and live video feed synchronisation. Inference models are hosted on a web platform and video streams from the drone (using OcuSync) are transmitted to a real-time messaging protocol (RTMP) server for subsequent classification. During training, the best model achieves a mean average precision (mAP) of 0.83 at an intersection-over-union (IoU) threshold of 0.50 and 0.69 at an IoU threshold of 0.75. On testing the system in Knowsley Safari, our prototype was able to achieve the following: sensitivity (Sen), 0.91 (0.869, 0.94); specificity (Spec), 0.78 (0.74, 0.82); and an accuracy (ACC), 0.84 (0.81, 0.87) when detecting rhinos, and Sen, 1.00 (1.00, 1.00); Spec, 1.00 (1.00, 1.00); and an ACC, 1.00 (1.00, 1.00) when detecting cars.
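The IoU thresholds and the sensitivity, specificity, and accuracy figures quoted above follow from standard definitions. The sketch below shows those computations for illustration only; it is not the authors' evaluation code, and the confusion-matrix counts are invented purely to reproduce the headline rates.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def detection_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / (tp + fp + tn + fn)

# A detection counts as correct when IoU with the ground-truth box >= 0.50.
print(round(iou((10, 10, 60, 60), (30, 30, 80, 80)), 2))  # 0.22, rejected at 0.50
print(detection_metrics(tp=91, fp=22, tn=78, fn=9))        # ~(0.91, 0.78, 0.845)
```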
Citations: 6
Drone data for decision making in regeneration forests: from raw data to actionable insights
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-11-26 DOI: 10.1139/juvs-2020-0029
S. Puliti, A. Granhus
In this study, we aim to develop ways to directly translate raw drone data into actionable insights, thus enabling management decisions to be made directly from drone data. Drone photogrammetric data and data analytics were used to model stand-level immediate tending need and cost in regeneration forests. Field reference data were used to train and validate a logistic model for the binary classification of immediate tending need and a multiple linear regression model to predict the cost of performing the tending operation. The performance of the models derived from drone data was compared with models utilizing the following alternative data sources: airborne laser scanning data (ALS), prior information from forest management plans (Prior), and the combinations of drone + Prior and ALS + Prior. The use of drone data and prior information outperformed the remaining alternatives in terms of classification of tending needs, whereas drone data alone resulted in the most accurate cost models. Our results are encouraging for further use of drones in the operational management of regeneration forests and show that drone data and data analytics are useful for deriving actionable insights.
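A minimal sketch of the two-model workflow described above, a logistic classifier for immediate tending need plus a linear regression for tending cost, using scikit-learn on synthetic data; the predictor variables, the tending rule, and the cost formula are placeholders, not the study's data or fitted models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical stand-level predictors derived from drone photogrammetry:
# mean canopy height (m), stem density (stems/ha), broadleaf fraction.
X = rng.uniform([1.0, 500.0, 0.0], [6.0, 5000.0, 1.0], size=(200, 3))
needs_tending = (X[:, 1] > 2500).astype(int)           # placeholder rule
cost = 50 + 0.08 * X[:, 1] + rng.normal(0, 20, 200)    # synthetic cost per ha

X_tr, X_te, y_tr, y_te, c_tr, c_te = train_test_split(
    X, needs_tending, cost, test_size=0.3, random_state=42)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)          # tending need
reg = LinearRegression().fit(X_tr[y_tr == 1], c_tr[y_tr == 1])   # tending cost

print("classification accuracy:", clf.score(X_te, y_te))
print("cost model R^2:", reg.score(X_te[y_te == 1], c_te[y_te == 1]))
```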
Citations: 3
Point de référence pour la planification de trajectoires d’UAV à voilure fixe (A reference benchmark for fixed-wing UAV trajectory planning)
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-10-08 DOI: 10.1139/juvs-2019-0022
F. Allaire, G. Labonté, Vincent Roberge, M. Tarbouchi
Extensive reviews of the trajectory planning literature all agree on one point: the lack of a benchmark that would allow the different implementations in this field to be compared. This work presents an evaluation function for fixed-wing unmanned aerial vehicle (UAV) trajectories that covers four essential flyability criteria and three optimization criteria, applicable to a wide variety of missions while remaining adaptable to the addition of further criteria. This work also presents a series of 20 scenarios covering a wide variety of possible conditions in order to characterize the quality of the trajectories planned for a fixed-wing UAV. It proposes combining these two elements to form a detailed test environment to serve as a benchmark for future work on fixed-wing UAV trajectory planning.
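A minimal sketch of how such a benchmark evaluation function could be structured, with hard flyability checks gating a weighted cost over the optimization criteria; the specific criteria, thresholds, and weights below are illustrative assumptions, not those defined in the paper.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    length_m: float        # total path length
    max_climb_deg: float   # steepest climb angle along the path
    max_turn_deg: float    # sharpest turn angle between segments
    min_altitude_m: float  # lowest clearance above terrain
    collides: bool         # intersects terrain or a no-fly zone
    fuel_used: float       # optimization criterion
    exposure: float        # optimization criterion (e.g., threat exposure)

def evaluate(traj, max_climb=20.0, max_turn=60.0, min_alt=50.0):
    """Return None if any hard flyability criterion fails, otherwise a
    weighted cost (lower is better) over the optimization criteria."""
    flyable = (not traj.collides
               and traj.max_climb_deg <= max_climb
               and traj.max_turn_deg <= max_turn
               and traj.min_altitude_m >= min_alt)
    if not flyable:
        return None
    # Illustrative weights over path length, fuel, and exposure.
    return 0.5 * traj.length_m / 1000.0 + 0.3 * traj.fuel_used + 0.2 * traj.exposure

print(evaluate(Trajectory(12000, 12, 45, 80, False, 30.0, 5.0)))  # 16.0
```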
Citations: 0
Preliminary data on an affordable UAV system to survey for freshwater turtles: advantages and disadvantages of low-cost drones
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-09-09 DOI: 10.1139/juvs-2018-0037
J. Escobar, Mark A. Rollins, S. Unger
Unmanned aerial vehicles (UAVs) are established, valuable tools for wildlife surveys in marine and terrestrial environments; however, they are seldom utilized in freshwater ecosystems. Therefore, baseline data on the use of UAVs in lotic environments are needed that balance flight parameters (e.g., altitude and noise level) with image quality while minimizing disturbance to individuals. Moreover, traditional high-cost UAVs may present challenges to researchers with limited funding who are conducting rapid assessments of species presence. However, emerging, affordable UAV systems can provide these preliminary data to researchers, albeit with caveats on data reliability. We tested a low-cost UAV system to document freshwater turtle presence, species distribution, and habitat use in a small North Carolina wetland. We observed minimal instances of turtles fleeing basking sites (∼0.7%), as this UAV system was only ∼2.1 dB above ambient noise levels at an altitude of 20 m. Freshwater turtles were found primarily in algal mat basking habitats, with highly variable numbers observed across locations and flights, likely due to image quality reliability and altitude. Our affordable UAV system was successful in providing baseline information on species presence, size distribution, and habitat preference of turtles in freshwater ecosystems.
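For context on the reported noise figure, a 2.1 dB rise over ambient corresponds to only a modest increase in sound pressure and intensity; the quick conversion below is our interpretation of that number, not a measurement from the study.

```python
# Convert the reported ~2.1 dB difference between in-flight and ambient
# noise levels into pressure and intensity ratios.
delta_db = 2.1
pressure_ratio = 10 ** (delta_db / 20)    # ~1.27x ambient sound pressure
intensity_ratio = 10 ** (delta_db / 10)   # ~1.62x ambient sound intensity
print(f"pressure x{pressure_ratio:.2f}, intensity x{intensity_ratio:.2f}")
```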
Citations: 3
Archaeological documentation of wood caribou fences using unmanned aerial vehicle and very high-resolution satellite imagery in the Mackenzie Mountains, Northwest Territories
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-07-23 DOI: 10.1139/juvs-2020-0007
J. V. D. Sluijs, Glen MacKay, L. Andrew, Naomi Smethurst, T. D. Andrews
Indigenous peoples of Canada’s North have long made use of boreal forest products, with wooden drift fences to direct caribou movement towards kill sites as unique examples. Caribou fences are of archaeological and ecological significance, yet sparsely distributed and increasingly at risk from wildfire. Costly remote field logistics require efficient prior fence verification and rapid on-site documentation of structure and landscape context. Unmanned aerial vehicle (UAV) and very high-resolution (VHR) satellite imagery were used for detailed site recording and detection of coarse woody debris (CWD) objects under challenging Subarctic alpine woodland conditions. UAVs enabled discovery of previously unknown wooden structures and revealed extensive use of CWD (n = 1745, total length = 2682 m, total volume = 16.7 m³). The methodology detected CWD objects much smaller than previously reported in the remote sensing literature (mean 1.5 m long, 0.09 m wide), substantiating a high spatial resolution requirement for detection. Structurally, the fences were not uniformly left on the landscape. Permafrost patterned ground combined with small CWD contributions at the pixel level complicated identification through VHR data sets. UAV outputs significantly enriched field techniques and supported a deeper understanding of caribou fences as a hunting technology, and they will aid ongoing archaeological interpretation and time-series comparisons of change agents.
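The reported CWD totals are internally consistent: treating the mapped pieces as cylinders of a single equivalent diameter (our simplification, not the authors' measurement protocol), the total length and volume imply a mean diameter close to the reported 0.09 m width.

```python
import math

total_length_m = 2682.0   # reported total CWD length
total_volume_m3 = 16.7    # reported total CWD volume

# V = pi * (d/2)**2 * L  =>  d = 2 * sqrt(V / (pi * L))
equivalent_diameter_m = 2.0 * math.sqrt(total_volume_m3 / (math.pi * total_length_m))
print(f"{equivalent_diameter_m:.3f} m")   # ~0.089 m
```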
Citations: 6
Individual tree species identification using Dense Convolutional Network (DenseNet) on multitemporal RGB images from UAV
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-07-22 DOI: 10.1139/juvs-2020-0014
Sowmya Natesan, C. Armenakis, U. Vepakomma
Tree species identification at the individual tree level is crucial for forest operations and management, yet its automated mapping remains challenging. Emerging technology, such as high-resolution imagery from unmanned aerial vehicles (UAVs) that is now becoming part of every forester’s surveillance kit, can potentially provide a solution to better characterize the tree canopy. To address this need, we have developed an approach based on a deep Convolutional Neural Network (CNN) that classifies forest tree species at the individual tree level using high-resolution RGB images acquired from a consumer-grade camera mounted on a UAV platform. This work explores the ability of the Dense Convolutional Network (DenseNet) to classify commonly available economic coniferous tree species in eastern Canada. The network was trained using multitemporal images captured under varying acquisition parameters to include seasonal, temporal, illumination, and angular variability. Validation of this model using distinct images over a mixed-wood forest in Ontario, Canada, showed over 84% classification accuracy in distinguishing five predominant species of coniferous trees. The model remains highly robust even when using images taken during different seasons and times, and with varying illumination and angles.
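A minimal transfer-learning sketch in the spirit of the approach described above, using torchvision's pretrained DenseNet-121 with a new five-class head; the architecture variant, frozen backbone, input size, and hyperparameters are placeholder choices, not the study's configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_SPECIES = 5  # e.g., five predominant coniferous species

# Load an ImageNet-pretrained DenseNet-121 and replace its classifier head.
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, NUM_SPECIES)

# Freeze the convolutional backbone; fine-tune only the new head.
for p in model.features.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB crops.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_SPECIES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```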
Citations: 28
Forest fire flame and smoke detection from UAV-captured images using fire-specific color features and multi-color space local binary pattern
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-06-30 DOI: 10.1139/juvs-2020-0009
Faruk Hossain, Youmin Zhang, Masuda A. Tonima
In recent years, the frequency and severity of forest fire occurrence have increased, compelling research communities to actively search for early forest fire detection and suppression methods. Remote sensing using computer vision techniques can provide early detection over a large field of view along with additional information such as the location and severity of the fire. Over the last few years, the feasibility of forest fire detection by combining computer vision with aerial platforms such as manned and unmanned aerial vehicles, especially low-cost and small-size unmanned aerial vehicles, has been experimented with and has shown promise by providing detection, geolocation, and fire characteristic information. This paper adds to the existing research by proposing a novel method of detecting forest fire using color features and a multi-color space local binary pattern of both flame and smoke signatures, together with a single artificial neural network. The training and evaluation images in this paper have mostly been obtained from aerial platforms under challenging circumstances such as minuscule flame pixels, varying illumination and range, complex backgrounds, occluded flame and smoke regions, and smoke blending into the background. The proposed method has achieved F1 scores of 0.84 for flame and 0.90 for smoke while maintaining a processing speed of 19 frames per second. It has outperformed support vector machine, random forest, and Bayesian classifiers as well as YOLOv3, and has demonstrated the capability of detecting challenging flame and smoke regions across a wide range of sizes, colors, textures, and opacities.
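A minimal sketch of extracting local binary pattern histograms across several color spaces, in the spirit of the feature design named above; the choice of color spaces, LBP radius, and point count are illustrative assumptions, and the neural-network classifier that would consume these features is omitted.

```python
import numpy as np
import cv2
from skimage.feature import local_binary_pattern

def multi_colorspace_lbp(bgr_patch, points=8, radius=1):
    """Concatenate uniform-LBP histograms computed on every channel of the
    BGR, HSV, and YCrCb representations of an image patch."""
    spaces = [bgr_patch,
              cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV),
              cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2YCrCb)]
    n_bins = points + 2  # number of distinct uniform-LBP codes
    feats = []
    for img in spaces:
        for channel in cv2.split(img):
            codes = local_binary_pattern(channel, points, radius, method="uniform")
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            feats.append(hist)
    return np.concatenate(feats)  # 3 spaces x 3 channels x (points + 2) bins

patch = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
print(multi_colorspace_lbp(patch).shape)  # (90,)
```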
Citations: 45
Modelling of unmanned aircraft visibility for see-and-avoid operations
IF 2.3 Q3 REMOTE SENSING Pub Date : 2020-06-24 DOI: 10.1139/juvs-2020-0011
P. Highland, J. Williams, M. Yazvec, A. Dideriksen, N. Corcoran, K. Woodruff, C. Thompson, L. Kirby, E. Chun, H. Kousheh, J. Stoltz, T. Schnell
With more unmanned aircraft (UA) becoming airborne each day, an already high manned-aircraft-to-UA exposure rate continues to grow. Pilots and rulemaking authorities realize that UA visibility is a real, but unquantified, threat to operations under the see-and-avoid concept. To finally quantify the threat, a novel contrast-based UA visibility model is constructed here using collected empirical data as well as previous work on the factors affecting visibility. This work showed that UA visibility <1300 m makes a midair collision a serious threat if a manned aircraft and a UA are on a collision course while operating under the see-and-avoid concept. Similarly, this work also showed that a midair collision may be unavoidable when UA visibility is <400 m. Validating pilot and rulemaking authority concerns, this work demonstrated that UA visibility distances <1300 m and <400 m occur often in the real world. Finally, the model produced UA visibility lookup tables that may prove useful to rulemaking authorities such as the U.S. Federal Aviation Administration and International Civil Aviation Organization for future work on the proof of equivalency of detect-and-avoid operations. Until then, pilots flying at slower airspeeds in the vicinity of UA may improve safety margins.
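The general idea of a contrast-based visibility model can be illustrated with Koschmieder's relation, in which a target's apparent contrast decays exponentially with range; the sketch below is a textbook illustration under that assumption, not the fitted model from this work, and it ignores the angular-size effects that dominate detection of a small UA.

```python
import math

def apparent_contrast(c0, distance_m, extinction_per_m):
    """Koschmieder-style attenuation of inherent contrast c0 with range."""
    return c0 * math.exp(-extinction_per_m * distance_m)

def contrast_limited_range(c0, extinction_per_m, threshold=0.05):
    """Range at which apparent contrast drops to the liminal threshold
    (0.05 is the classic value defining meteorological visibility)."""
    return math.log(c0 / threshold) / extinction_per_m

# A grey UA against sky in clear air (20 km meteorological visibility).
sigma = 3.912 / 20000.0                       # extinction coefficient per metre
print(apparent_contrast(0.3, 1300.0, sigma))  # ~0.23 contrast remaining at 1300 m
print(contrast_limited_range(0.3, sigma))     # ~9160 m for contrast alone;
# real detection ranges are far shorter because angular-size limits apply.
```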
Citations: 5