
Latest Publications from Photogrammetric Engineering & Remote Sensing

ReLAP-Net: Residual Learning and Attention Based Parallel Network for Hyperspectral and Multispectral Image Fusion
Pub Date : 2024-07-01 DOI: 10.14358/pers.24-00003r2
Aditya Agrawal, Souraja Kundu, Touseef Ahmad, Manish Bhatt
Remote sensing applications require high-resolution images to obtain precise information about the Earth's surface. Multispectral images have high spatial resolution but low spectral resolution; hyperspectral images have high spectral resolution but low spatial resolution. This study proposes ReLAP-Net, a parallel network built on residual learning and channel attention, which fuses a high spatial resolution multispectral image with a low spatial resolution hyperspectral image. Training and fusion experiments on four public benchmark data sets demonstrate the effectiveness of the proposed model, and its fusion performance is compared with classical signal processing-based image fusion techniques. Four image metrics are used for the quantitative evaluation of the fused images. Compared with other state-of-the-art models, the proposed network reduced the root mean square error and the relative dimensionless global error in synthesis and increased the peak signal-to-noise ratio.
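Two of the evaluation metrics named in this abstract, root mean square error and peak signal-to-noise ratio, are straightforward to compute. A minimal sketch (not the authors' code; the pixel values are hypothetical, and bands are treated as flat lists):

```python
import math

def rmse(reference, fused):
    """Root mean square error between two equally sized bands (flat pixel lists)."""
    n = len(reference)
    return math.sqrt(sum((r - f) ** 2 for r, f in zip(reference, fused)) / n)

def psnr(reference, fused, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means the fused band is
    closer to the reference."""
    e = rmse(reference, fused)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)

ref = [100.0, 120.0, 130.0, 140.0]   # hypothetical reference pixels
fus = [102.0, 118.0, 131.0, 139.0]   # hypothetical fused pixels
print(round(rmse(ref, fus), 3))  # 1.581
print(round(psnr(ref, fus), 2))  # 44.15
```

A perfect reconstruction gives RMSE 0 and an infinite PSNR, which is why lower RMSE and higher PSNR together indicate better fusion.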
Citations: 0
Book Review ‐ Top 20 Essential Skills for ArcGIS Pro
Pub Date : 2024-07-01 DOI: 10.14358/pers.90.7.391
Citations: 0
Dynamic Monitoring of Ecological Quality in Eastern Ukraine Amidst the Russia‐Ukraine Conflict
Pub Date : 2024-07-01 DOI: 10.14358/pers.23-00085r2
Chaofei Zhang, Zhanghua Xu, Yuanyao Yang, Lei Sun, Haitao Li
To evaluate the spatiotemporal changes in the ecological environment of eastern Ukraine since the Russia-Ukraine conflict, this study used MODIS images from March to September of 2020 and 2022 to calculate the Remote Sensing-Based Ecological Index. In 2022, compared with 2020, conflict zones exhibited reduced improvement and increased slight degradation, whereas nonconflict areas showed marginal enhancement. Through propensity score matching, the research confirmed a causal relationship between the conflict and ecological trends. Pathway analysis revealed that the conflict contributed a 0.016-unit increase in ecological quality while reducing the improvement rate by 0.042 units. This study provides empirical support for understanding the correlation between conflicts and specific environmental factors, offering technical references for ecological quality assessments in other conflict areas and for future evaluations by the Ukrainian government.
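Propensity score matching, used above to argue causality, pairs each conflict-zone unit with the most similar nonconflict unit before comparing their ecological trends. A greedy 1:1 nearest-neighbor matcher over precomputed propensity scores is one common variant; this is an illustrative sketch, not the study's implementation, and the unit IDs and scores are made up:

```python
def nearest_neighbor_match(treated, control):
    """Greedy 1:1 nearest-neighbor matching on propensity scores, without
    replacement: each treated unit takes the closest remaining control.
    Both arguments map unit id -> propensity score."""
    available = dict(control)
    pairs = {}
    for t_id, t_score in treated.items():
        if not available:
            break  # more treated units than controls
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs[t_id] = c_id
        del available[c_id]
    return pairs

# Hypothetical example: two conflict-zone units, three nonconflict units.
print(nearest_neighbor_match({"a": 0.8, "b": 0.3},
                             {"x": 0.75, "y": 0.35, "z": 0.5}))
# {'a': 'x', 'b': 'y'}
```

Because matching is greedy and without replacement, the iteration order of the treated units can change the pairing; production implementations usually add a caliper (maximum allowed score distance) and check covariate balance after matching.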
Citations: 0
A Surface Water Extraction Method Integrating Spectral and Temporal Characteristics
Pub Date : 2024-07-01 DOI: 10.14358/pers.24-00013r2
Yebin Zou
Remote sensing has been applied to observe large areas of surface water, yielding higher-resolution and long-term continuous observation records. However, limitations remain in the detection of large-scale, multi-temporal surface water, mainly due to the high spatial and temporal variability of water surface signatures. In this study, we developed a surface water remote sensing information extraction model that integrates spectral and temporal characteristics to extract surface water from multi-dimensional data of long-term Landsat scenes and explore its spatiotemporal changes over decades. The goal is to extract open water from medium-resolution remote sensing images against backgrounds of vegetation, clouds, terrain shadows, and other land cover. The average overall accuracy and average kappa coefficient of the classification were verified to be 0.91 and 0.81, respectively. Experiments in China's inland arid area have shown that the method is effective under complex surface environmental conditions.
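For context on the spectral side of such a model: the classic single-date baseline that spectral-temporal methods build beyond is a water index threshold. A sketch of McFeeters' Normalized Difference Water Index, (G − NIR)/(G + NIR), which is not the paper's model but the standard point of comparison; water pixels typically have NDWI above 0:

```python
def ndwi(green, nir):
    """Per-pixel NDWI for two co-registered reflectance grids
    (lists of rows): (green - nir) / (green + nir)."""
    return [
        [(g - n) / (g + n) if (g + n) != 0 else 0.0
         for g, n in zip(g_row, n_row)]
        for g_row, n_row in zip(green, nir)
    ]

def water_mask(ndwi_grid, threshold=0.0):
    """Binary mask: 1 where NDWI exceeds the threshold (likely open water)."""
    return [[1 if v > threshold else 0 for v in row] for row in ndwi_grid]

# Hypothetical 1 x 2 scene: a water pixel (green > NIR) and a land pixel.
grid = ndwi([[0.3, 0.1]], [[0.1, 0.3]])
print(grid)              # [[0.5, -0.5]]
print(water_mask(grid))  # [[1, 0]]
```

Single-date indices like this are exactly where clouds and shadows cause confusion, which is the motivation for adding the temporal dimension described in the abstract.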
Citations: 0
Enhancing Forest‐Steppe Ecotone Mapping Accuracy through Synthetic Aperture Radar‐Optical Remote Sensing Data Fusion and Object-based Analysis
Pub Date : 2024-07-01 DOI: 10.14358/pers.23-00070r2
Ruilin Wang, Meng Wang, Xiaofang Sun, Junbang Wang, Guicai Li
In ecologically vulnerable regions with intricate land use dynamics, such as ecotones, frequent and intense land use transitions unfold, making precise and timely land use mapping imperative. To that end, we integrated Sentinel-1 and Sentinel-2 data using principal component analysis and an object-based methodology to craft a 10-meter-resolution land use map of the forest‐grassland ecological zone of the Greater Khingan Mountains spanning the years 2019 to 2021. Our research reveals a substantial enhancement in classification accuracy achieved through the integration of synthetic aperture radar‐optical remote sensing data. Notably, our products outperformed other land use/land cover data sets, excelling particularly in delineating intricate riverine wetlands. The 10-meter land use product stands as a pivotal guide, offering indispensable support for sustainable development, ecological assessment, and conservation endeavors in the Greater Khingan Mountains region.
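The fusion step above rests on principal component analysis of co-registered bands. For the two-band case (say, one SAR band and one optical band) the first principal component has a closed form via the 2x2 covariance matrix; this is an illustrative sketch only, since a real SAR-optical pipeline applies PCA across many bands:

```python
import math

def first_principal_component(x, y):
    """Project two co-registered band vectors onto the first principal
    component of their 2x2 covariance matrix (closed-form eigenvector)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((v - mx) ** 2 for v in x) / n          # var(x)
    c = sum((v - my) ** 2 for v in y) / n          # var(y)
    b = sum((u - mx) * (v - my) for u, v in zip(x, y)) / n  # cov(x, y)
    lam = (a + c + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2  # largest eigenvalue
    if abs(b) < 1e-12:
        # Uncorrelated bands: the PC is whichever axis has larger variance.
        ex, ey = (1.0, 0.0) if a >= c else (0.0, 1.0)
    else:
        norm = math.hypot(b, lam - a)
        ex, ey = b / norm, (lam - a) / norm        # eigenvector (b, lam - a)
    return [ex * (u - mx) + ey * (v - my) for u, v in zip(x, y)]
```

For perfectly correlated hypothetical bands like x = [1, 2, 3] and y = [2, 4, 6], all the variance lands on the first component, which is the property fusion schemes exploit.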
Citations: 0
Assessing the Utility of Uncrewed Aerial System Photogrammetrically Derived Point Clouds for Land Cover Classification in the Alaska North Slope
Pub Date : 2024-07-01 DOI: 10.14358/pers.24-00016r1
Jung-Kuan Liu, Rongjun Qin, Samantha T. Arundel
Uncrewed aerial systems (UASs) have been used to collect “pseudo field plot” data in the form of large-scale stereo imagery to supplement and bolster direct field observations for monitoring areas in Alaska. These data supplement field data that are difficult to collect across such a vast landscape with a relatively short field season. In this study, dense photogrammetrically derived point clouds are created and used to extract land cover data with a support vector machine (SVM) classifier. We test our approach using point clouds derived from 1-cm stereo imagery of plots in the Alaska North Slope region and compare the results to field observations. The results show that the overall accuracy across six land cover classes (bare soil, shrub, grass, forb/herb, rock, and litter) is 96.8% for classified patches. Shrub had the highest accuracy (>99%) and forb/herb the lowest (<48%). This study reveals that the approach could be used as reference data to check field observations in remote areas.
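Accuracy figures like the 96.8% overall accuracy reported here, and the kappa coefficients reported in other accuracy assessments in this issue, are derived from a confusion matrix. A minimal sketch with rows as reference classes and columns as predicted classes (the example matrix is made up):

```python
def overall_accuracy(cm):
    """Fraction of samples on the diagonal of a square confusion matrix."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def cohens_kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement
    implied by the row (reference) and column (predicted) marginals."""
    total = sum(sum(row) for row in cm)
    po = overall_accuracy(cm)
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm)
             for i in range(len(cm))) / (total * total)
    return (po - pe) / (1 - pe)

cm = [[45, 5],   # hypothetical 2-class assessment
      [10, 40]]
print(overall_accuracy(cm))  # 0.85
print(round(cohens_kappa(cm), 2))  # 0.7
```

Kappa is always at or below overall accuracy; a large gap between the two signals that much of the apparent accuracy could arise from imbalanced class marginals.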
Citations: 0
GIS Tips & Tricks ‐ USGS Adds 100K Topo Scale to OnDemand Map Products
Pub Date : 2024-07-01 DOI: 10.14358/pers.90.7.389
Ariel Doumbouya
For this month's GIS Tips & Tricks, we are revisiting the U.S. Geological Survey (USGS) National Map topoBuilder tool. topoBuilder was featured in this column shortly after its initial release in October 2022. Since then, topoBuilder has become the “go-to” application for USGS topographic maps and was recently updated to accommodate even more features. This month, our guest columnist, Ariel Doumbouya, USGS, provides an update on and tutorial for some new features in topoBuilder.
Citations: 0
Real-Time Semantic Segmentation of Remote Sensing Images for Land Management
Pub Date : 2024-06-01 DOI: 10.14358/pers.23-00083r2
Yinsheng Zhang, Ru Ji, Yuxiang Hu, Yulong Yang, Xin Chen, Xiuxian Duan, Huilin Shan
Remote sensing image segmentation is a crucial technique in the field of land management. However, existing semantic segmentation networks require a large number of floating-point operations (FLOPs) and have long run times. In this paper, we propose a dual-path feature aggregation network (DPFANet) specifically designed for the low-latency operation required in land management applications. First, we use four sets of spatially separable convolutions with varying dilation rates to extract spatial features. We then use an improved version of MobileNetV2 to extract semantic features, and an asymmetric multi-scale fusion module and a dual-path feature aggregation module to enhance feature extraction and fusion. Finally, a decoder is constructed to enable progressive up-sampling. Experimental results on the Potsdam data set and the Gaofen image data set (GID) demonstrate that DPFANet achieves overall accuracy of 92.2% and 89.3%, respectively, with 6.72 giga FLOPs and 2.067 million parameters.
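The spatially separable convolutions mentioned above reduce weight count by replacing one k x k kernel with a k x 1 kernel followed by a 1 x k kernel. A back-of-the-envelope sketch of the savings (bias terms ignored; the channel sizes are hypothetical, not DPFANet's actual configuration):

```python
def conv_params(in_ch, out_ch, kh, kw):
    """Weight count of a standard 2-D convolution layer (bias ignored)."""
    return out_ch * in_ch * kh * kw

def separable_conv_params(in_ch, out_ch, k):
    """Spatially separable variant: a k x 1 convolution followed by a 1 x k
    convolution, replacing a single k x k convolution."""
    return conv_params(in_ch, out_ch, k, 1) + conv_params(out_ch, out_ch, 1, k)

full = conv_params(64, 64, 3, 3)        # 36864 weights
sep = separable_conv_params(64, 64, 3)  # 24576 weights
print(full, sep, round(sep / full, 2))  # 36864 24576 0.67
```

For a k x k kernel the ratio tends toward 2/k as channel counts grow, which is one of the levers lightweight networks pull to keep FLOPs and parameter budgets low.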
Citations: 0
GIS Tips & Tricks – Say Goodbye to ArcMap; We Were Just Getting to Know You, Old Friend!
Pub Date : 2024-06-01 DOI: 10.14358/pers.90.6.329
Al Karlin
As I write this column (today is 1 March 2024) and as many readers are probably aware, Esri is formally retiring ArcGIS Desktop (i.e., ArcMap). Retirement for ArcMap means that there will be no new releases (i.e., no ArcGIS Desktop 10.9), and the ArcGIS Desktop product life cycle will come to an end, with no additional software fixes or Esri support after 1 March 2026. ArcGIS Desktop has been with us since its release in June 2010 and has served the GIS community well, with a total of 17 releases and numerous patches and fixes over more than 14 years! Quite a run. So, with that, I think it fitting that I say goodbye to ArcMap with these last ArcMap Desktop tips, a "Say goodbye to ArcMap".
Citations: 0
Real-Time Cross-View Image Matching and Camera Pose Determination for Unmanned Aerial Vehicles
Pub Date : 2024-06-01 DOI: 10.14358/pers.23-00073r2
Long Chen, Bo Wu, Ran Duan, Zeyu Chen
In global navigation satellite system (GNSS)-denied environments, vision-based methods are commonly used for the positioning and navigation of aerial robots. However, traditional methods often suffer from accumulated estimation errors over time, leading to trajectory drift, and lack real-time performance, particularly in large-scale scenarios. This article presents novel approaches, including feature-based cross-view image matching and the integration of visual odometry and photogrammetric space resection for camera pose determination in real time. Experimental evaluation with real UAV datasets demonstrated that the proposed method reliably matches features in cross-view images with large differences in spatial resolution, coverage, and perspective, achieving a root-mean-square error of 4.7 m for absolute position and 0.33° for rotation, and delivering real-time performance of 12 frames per second (FPS) when implemented on a lightweight edge device onboard the UAV. This approach offers potential for diverse intelligent UAV applications in GNSS-denied environments based on real-time feedback control.
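Photogrammetric space resection, used above for pose determination, recovers the camera pose that minimizes the mismatch between observed image points and reprojected ground points. A minimal reprojection-residual sketch under a pinhole model with identity rotation (illustrative numbers only; an actual resection also solves for rotation and refines the pose iteratively):

```python
import math

def project(point, cam, focal):
    """Pinhole projection of a world point into image coordinates, assuming
    the camera axes are aligned with the world axes (identity rotation)."""
    x, y, z = (p - c for p, c in zip(point, cam))
    return (focal * x / z, focal * y / z)

def reprojection_rmse(points, observations, cam, focal):
    """RMS distance (in image units) between observed image points and the
    reprojections of their corresponding world points."""
    sq = 0.0
    for p, (u, v) in zip(points, observations):
        pu, pv = project(p, cam, focal)
        sq += (pu - u) ** 2 + (pv - v) ** 2
    return math.sqrt(sq / len(points))

# Hypothetical camera 10 units behind a ground point, focal length 100.
print(project((1.0, 2.0, 0.0), (0.0, 0.0, -10.0), 100.0))  # (10.0, 20.0)
```

Resection searches over the camera parameters for the pose that drives this residual toward zero; the matched cross-view features supply the point correspondences.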
Citations: 0