A cloud detection neural network for above-aircraft clouds using airborne cameras

Joseph Nied, Michael Jones, S. Seaman, Taylor J. Shingler, J. Hair, B. Cairns, D. V. Gilst, A. Bucholtz, S. Schmidt, S. Chellappan, P. Zuidema, B. van Diedenhoven, A. Sorooshian, S. Stamnes
Journal: Frontiers in Remote Sensing
DOI: 10.3389/frsen.2023.1118745
Published: 2023-02-22 (Journal Article)
Citations: 3

Abstract

For aerosol, cloud, land, and ocean remote sensing, the development of accurate cloud detection methods, or cloud masks, is extremely important. For airborne passive remote sensing, it is also important to identify when clouds are above the aircraft, since their presence contaminates the measurements of nadir-viewing passive sensors. We describe the development of a camera-based approach to detecting clouds above the aircraft via a convolutional neural network called the cloud detection neural network (CDNN). We quantify the performance of this CDNN using human-labeled validation data, reporting 96% accuracy in detecting clouds in the testing datasets for both the zenith-viewing and forward-viewing models. We present results from the CDNN based on airborne imagery from the NASA Aerosol Cloud meTeorology Interactions oVer the western ATlantic Experiment (ACTIVATE) and the Clouds, Aerosol, and Monsoon Processes Philippines Experiment (CAMP2Ex). We quantify the ability of the CDNN to identify the presence of clouds above the aircraft using a forward-looking camera mounted inside the aircraft cockpit, compared to an all-sky upward-looking camera mounted outside the fuselage on top of the aircraft. We assess performance by comparing the flight-averaged cloud fraction from the zenith and forward CDNN retrievals with cloud optical depth data from the prototype hyperspectral total-diffuse Sunshine Pyranometer (SPN-S) instrument. A comparison of the CDNN with the SPN-S on time-specific intervals yielded 93% accuracy for the zenith-viewing CDNN and 84% for the forward-viewing CDNN. A comparison of the CDNNs with the SPN-S on flight-averaged cloud fraction yielded an agreement of 0.15 for the forward CDNN and 0.07 for the zenith CDNN. For CAMP2Ex, 53% of flight dates had an above-aircraft cloud fraction above 50%; for ACTIVATE, 52% and 54% of flight dates had an above-aircraft cloud fraction above 50% in 2020 and 2021, respectively.

The CDNN enables cost-effective detection of clouds above the aircraft using an inexpensive camera installed in the cockpit. This is valuable for airborne science research flights that carry no dedicated upward-looking cloud detection instruments, since installing such instruments requires time-consuming and expensive aircraft modifications and adds mission cost and the complexity of operating additional instruments.
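The two evaluation metrics quoted above can be sketched in a few lines: interval-by-interval accuracy (the 93%/84% comparisons) and the difference in flight-averaged cloud fraction (the 0.07/0.15 agreements). The arrays below are synthetic stand-ins for the CDNN's binary above-aircraft cloud flags and an SPN-S optical-depth-derived cloud mask; the 7% disagreement rate is illustrative, not mission data:

```python
import random

random.seed(0)
n = 3600  # hypothetical 1 Hz samples for one flight

# Synthetic stand-ins: 1 = cloud above aircraft, 0 = clear sky.
spn_s_mask = [random.randint(0, 1) for _ in range(n)]   # "truth" from SPN-S optical depth
cdnn_mask = [1 - v if random.random() < 0.07 else v     # CDNN flips ~7% of intervals
             for v in spn_s_mask]

# Interval-by-interval accuracy: fraction of time steps where the two masks agree.
accuracy = sum(c == s for c, s in zip(cdnn_mask, spn_s_mask)) / n

# Flight-averaged cloud fraction from each source, and their absolute difference.
cf_cdnn = sum(cdnn_mask) / n
cf_spns = sum(spn_s_mask) / n
agreement = abs(cf_cdnn - cf_spns)

print(f"accuracy = {accuracy:.3f}, cloud-fraction agreement = {agreement:.3f}")
```

Note that the flight-averaged comparison is deliberately forgiving: random disagreements largely cancel in the mean, so cloud-fraction agreement can look tight even when interval accuracy is lower, which matches the abstract's pairing of 84% accuracy with 0.15 agreement for the forward view.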