Building detection using a dense attention network from LiDAR and image data

Geomatica (Q3, Social Sciences) · Pub Date: 2022-04-28 · DOI: 10.1139/geomat-2021-0013
N. Ghasemian, Jinfei Wang, Mohammad Reza Najafi
Citations: 1

Abstract

Accurate building mapping using remote sensing data is challenging because of the complexity of building structures, particularly in populated cities. LiDAR data are widely used for building extraction because they provide height information, which can help distinguish buildings from other tall objects. However, tall trees and bridges in the vicinity of buildings can limit the application of LiDAR data, particularly in urban areas. Combining LiDAR and orthoimages can help in such situations, because orthoimages can provide information on the physical properties of objects, such as reflectance characteristics. One efficient way to combine these two data sources is to use convolutional neural networks (CNNs). This study proposes a CNN architecture based on dense attention blocks for building detection in southern Toronto and Massachusetts. The stacking of information from multiple previous layers was inspired by dense attention networks (DANs). DAN blocks consist of batch normalization, convolution, dropout, and average pooling layers to extract high- and low-level features. Compared with two other widely used deep learning techniques, VGG16 and ResNet50, the proposed method has a simpler architecture and converges faster with higher accuracy. In addition, a comparison with two other state-of-the-art deep learning methods, U-net and ResUnet, showed that the proposed technique achieved a higher F1-score (0.71) than U-net (0.42) and ResUnet (0.49).
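The abstract names the ingredients of a DAN block (batch normalization, convolution, dropout, average pooling) and the dense stacking of features from earlier layers, with LiDAR height and orthoimage channels as joint input. The paper's exact architecture, channel counts, and fusion strategy are not given here, so the following NumPy sketch is only illustrative: the block structure, the early channel-concatenation fusion, and all sizes are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv3x3(x, w):
    """Naive 'same'-padded 3x3 convolution: (C_in, H, W) -> (C_out, H, W)."""
    c_out, c_in, _, _ = w.shape
    _, h, wd = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((c_out, h, wd))
    for o in range(c_out):
        for i in range(h):
            for j in range(wd):
                out[o, i, j] = np.sum(xp[:, i:i+3, j:j+3] * w[o])
    return out

def dan_block(x, w, p_drop=0.1, training=False):
    """One illustrative block with the layers the abstract lists:
    normalization, convolution (+ReLU), dropout, 2x2 average pooling."""
    # Batch-norm-style per-channel standardization (inference form, no learned scale).
    x = (x - x.mean(axis=(1, 2), keepdims=True)) / (x.std(axis=(1, 2), keepdims=True) + 1e-5)
    x = np.maximum(conv3x3(x, w), 0.0)   # convolution + ReLU
    if training:                          # inverted dropout, only during training
        x *= rng.binomial(1, 1 - p_drop, x.shape) / (1 - p_drop)
    c, h, wd = x.shape
    return x.reshape(c, h // 2, 2, wd // 2, 2).mean(axis=(2, 4))  # 2x2 avg pool

# Assumed early fusion: stack one LiDAR-derived height channel with RGB orthoimage.
lidar = rng.standard_normal((1, 8, 8))    # e.g. a normalized height (nDSM) channel
rgb = rng.standard_normal((3, 8, 8))      # orthoimage channels
x = np.concatenate([lidar, rgb], axis=0)  # (4, 8, 8)

w1 = rng.standard_normal((6, 4, 3, 3)) * 0.1
f1 = dan_block(x, w1)                     # (6, 4, 4)

# Dense stacking: the next block also sees the (downsampled) original input.
x_ds = x.reshape(4, 4, 2, 4, 2).mean(axis=(2, 4))        # match f1's resolution
w2 = rng.standard_normal((8, 10, 3, 3)) * 0.1
f2 = dan_block(np.concatenate([x_ds, f1], axis=0), w2)   # (8, 2, 2)
```

The concatenation before the second block is the point of the "dense" design: later blocks receive both low-level (raw fused input) and high-level (previous block) features rather than only the immediately preceding output.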
Source Journal
Geomatica
Subject area: Social Sciences: Geography, Planning and Development
CiteScore: 1.50
Self-citation rate: 0.00%
Articles published: 7
About the journal: Geomatica (formerly CISM Journal ACSGC) is the official quarterly publication of the Canadian Institute of Geomatics. It is the oldest surveying and mapping publication in Canada and was first published in 1922 as the Journal of the Dominion Land Surveyors' Association. Geomatica is dedicated to the dissemination of information on technical advances in the geomatics sciences. The internationally respected publication contains special features, notices of conferences, a calendar of events, articles on personalities, reviews of current books, industry news, and new products, all of which keep the publication lively and informative.
Latest articles in this journal:
- Soil Salinity Mapping Using Remote Sensing and GIS
- A deep transfer learning-based damage assessment on post-event very high-resolution orthophotos
- Building detection using a dense attention network from LiDAR and image data
- Exploring five indicators for the quality of OpenStreetMap road networks: a case study of Québec, Canada
- Fonction d'appartenance et pouvoir d'expression topologique entre objets aux limites fixes et floues dans le processus d'affectation des terres au Gabon