A New U-Net Based Convolutional Neural Network for Estimating Caribou Lichen Ground Cover from Field-Level RGB Images

Canadian Journal of Remote Sensing · Published 2022-11-02 · DOI: 10.1080/07038992.2022.2144179 · Impact Factor 2.0 · JCR Q3 (Remote Sensing) · CAS Tier 4 (Earth Sciences)
Julie Lovitt, Galen Richardson, K. Rajaratnam, Wen-jia Chen, S. Leblanc, Liming He, S. Nielsen, Ashley Hillman, Isabelle Schmelzer, A. Arsenault
{"title":"A New U-Net Based Convolutional Neural Network for Estimating Caribou Lichen Ground Cover from Field-Level RGB Images","authors":"Julie Lovitt, Galen Richardson, K. Rajaratnam, Wen-jia Chen, S. Leblanc, Liming He, S. Nielsen, Ashley Hillman, Isabelle Schmelzer, A. Arsenault","doi":"10.1080/07038992.2022.2144179","DOIUrl":null,"url":null,"abstract":"Abstract High-quality ground-truth data are critical for developing reliable Earth Observation (EO) based geospatial products. Conventional methods of collecting these data are either subject to an unknown amount of human error and bias or require extended time in the field to complete (i.e., point-intercept assessments). Digital photograph classification (DPC) may address these drawbacks. In this study, we first assess the performance of a DPC method developed through licensed software to estimate ground cover percentage (%) of bright lichens, a critical caribou forage in fall and winter when other food resources are scarce. We then evaluate the feasibility of replicating this workflow in an open-source environment with a modified U-net model to improve processing time and scalability. Our results indicate that DPC is appropriate for generating ground-truth data in support of large-scale EO-based lichen mapping within the boreal forests of eastern Canada. Our final open-sourced classification model, Lichen Convolutional Neural Network (LiCNN), is comparably accurate yet more efficient than the licensed workflow. Therefore, the LiCNN approach successfully addresses the mentioned shortcomings of conventional ground-truth data collection methods efficiently and without the need for specialized software.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":"48 1","pages":"849 - 872"},"PeriodicalIF":2.0000,"publicationDate":"2022-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Canadian Journal of Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/07038992.2022.2144179","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
Citations: 3

Abstract

High-quality ground-truth data are critical for developing reliable Earth Observation (EO) based geospatial products. Conventional methods of collecting these data are either subject to an unknown amount of human error and bias or require extended time in the field to complete (i.e., point-intercept assessments). Digital photograph classification (DPC) may address these drawbacks. In this study, we first assess the performance of a DPC method developed through licensed software to estimate ground cover percentage (%) of bright lichens, a critical caribou forage in fall and winter when other food resources are scarce. We then evaluate the feasibility of replicating this workflow in an open-source environment with a modified U-Net model to improve processing time and scalability. Our results indicate that DPC is appropriate for generating ground-truth data in support of large-scale EO-based lichen mapping within the boreal forests of eastern Canada. Our final open-sourced classification model, Lichen Convolutional Neural Network (LiCNN), is comparably accurate yet more efficient than the licensed workflow. Therefore, the LiCNN approach successfully addresses the mentioned shortcomings of conventional ground-truth data collection methods efficiently and without the need for specialized software.
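The paper itself does not provide code on this page, so the sketch below is purely illustrative: a minimal U-Net-style binary segmentation network in PyTorch, plus a helper that converts a predicted lichen mask into a ground-cover percentage for one field photo. The architecture depth, layer widths, names such as MiniUNet and lichen_cover_percent, and the 0.5 decision threshold are assumptions for illustration; they do not reproduce the authors' LiCNN implementation.

```python
# Illustrative sketch only: a small U-Net-style binary segmenter (NOT the authors' LiCNN).
# Assumptions: 3-channel RGB input, 1-channel lichen probability output,
# two encoder/decoder levels, and a 0.5 threshold for the cover estimate.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, 1)  # per-pixel lichen logit

    def forward(self, x):
        e1 = self.enc1(x)                              # skip connection 1
        e2 = self.enc2(self.pool(e1))                  # skip connection 2
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                           # logits, same H x W as input

def lichen_cover_percent(model, rgb, threshold=0.5):
    """Fraction of pixels classified as lichen, expressed as a percentage."""
    model.eval()
    with torch.no_grad():
        prob = torch.sigmoid(model(rgb))
        mask = prob > threshold
    return 100.0 * mask.float().mean().item()

# Usage sketch: a random tensor standing in for one 256x256 field photograph.
model = MiniUNet()
dummy_photo = torch.rand(1, 3, 256, 256)
print(f"Estimated lichen cover: {lichen_cover_percent(model, dummy_photo):.1f}%")
```

In this kind of workflow the ground-cover percentage is simply the proportion of pixels the segmenter labels as lichen, which is what makes per-photo estimates fast to compute once the network is trained.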
Source Journal
Self-citation rate: 3.80%
Articles published per year: 40
Journal Description: Canadian Journal of Remote Sensing / Journal canadien de télédétection is a publication of the Canadian Aeronautics and Space Institute (CASI) and the official journal of the Canadian Remote Sensing Society (CRSS-SCT). Canadian Journal of Remote Sensing provides a forum for the publication of scientific research and review articles. The journal publishes topics including sensor and algorithm development, image processing techniques, and advances focused on a wide range of remote sensing applications including, but not restricted to: forestry and agriculture, ecology, hydrology and water resources, oceans and ice, geology, urban, atmosphere, and environmental science. Articles can cover local to global scales and can be directly relevant to the Canadian, or equally important, the international community. The international editorial board provides expertise in a wide range of remote sensing theory and applications.
Latest Articles from This Journal
Crop Classification Using Multi-Temporal RADARSAT Constellation Mission Compact Polarimetry SAR Data
A Bi-Temporal Airborne Lidar Shrub-to-Tree Aboveground Biomass Model for the Taiga of Western Canada
Estimating GDP by Fusing Nighttime Light and Land Cover Data
Active Reinforcement Learning for the Semantic Segmentation of Urban Images
Cumulative Changes in Minimum Snow/Ice Extent over Canada and Northern USA for 2000–2023