Fine-grained building function recognition with street-view images and GIS map data via geometry-aware semi-supervised learning

Weijia Li, Jinhua Yu, Dairong Chen, Yi Lin, Runmin Dong, Xiang Zhang, Conghui He, Haohuan Fu
{"title":"Fine-grained building function recognition with street-view images and GIS map data via geometry-aware semi-supervised learning","authors":"Weijia Li ,&nbsp;Jinhua Yu ,&nbsp;Dairong Chen ,&nbsp;Yi Lin ,&nbsp;Runmin Dong ,&nbsp;Xiang Zhang ,&nbsp;Conghui He ,&nbsp;Haohuan Fu","doi":"10.1016/j.jag.2025.104386","DOIUrl":null,"url":null,"abstract":"<div><div>The diversity of building functions is vital for urban planning and optimizing infrastructure and services. Street-view images offer rich exterior details, aiding in function recognition. However, street-view building function annotations are limited and challenging to obtain. In this work, we propose a geometry-aware semi-supervised method for fine-grained building function recognition, which effectively uses multi-source geoinformation data to achieve accurate function recognition in both single-city and cross-city scenarios. We restructured the semi-supervised method based on the Teacher–Student architecture into three stages, which involve pre-training for building facade recognition, building function annotation generation, and building function recognition. In the first stage, to enable semi-supervised training with limited annotations, we employ a semi-supervised object detection model, which trains on both labeled samples and a large amount of unlabeled data simultaneously, achieving building facade detection. In the second stage, to further optimize the pseudo-labels, we effectively utilize the geometric spatial relationships between GIS map data and panoramic street-view images, integrating the building function information with facade detection results. We ultimately achieve fine-grained building function recognition in both single-city and cross-city scenarios by combining the coarse annotations and labeled data in the final stage. We conduct extensive comparative experiments on four datasets, which include OmniCity, Madrid, Los Angeles, and Boston, to evaluate the performance of our method in both single-city (OmniCity &amp; Madrid) and cross-city (OmniCity - Los Angeles &amp; OmniCity - Boston) scenarios. The experimental results show that, compared to advanced recognition methods, our method improves mAP by at least 4.8% and 4.3% for OmniCity and Madrid, respectively, while also effectively handling class imbalance. Furthermore, our method performs well in the cross-categorization system experiments for Los Angeles and Boston, highlighting its strong potential for cross-city tasks. This study offers a new solution for large-scale and multi-city applications by efficiently utilizing multi-source geoinformation data, enhancing urban information acquisition efficiency, and assisting in rational resource allocation.</div></div>","PeriodicalId":73423,"journal":{"name":"International journal of applied earth observation and geoinformation : ITC journal","volume":"137 ","pages":"Article 104386"},"PeriodicalIF":7.6000,"publicationDate":"2025-02-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of applied earth observation and geoinformation : ITC journal","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1569843225000330","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"REMOTE SENSING","Score":null,"Total":0}

Abstract

The diversity of building functions is vital for urban planning and for optimizing infrastructure and services. Street-view images offer rich exterior details that aid function recognition; however, street-view building function annotations are limited and challenging to obtain. In this work, we propose a geometry-aware semi-supervised method for fine-grained building function recognition that effectively uses multi-source geoinformation data to achieve accurate recognition in both single-city and cross-city scenarios. We restructure the Teacher–Student semi-supervised framework into three stages: pre-training for building facade recognition, building function annotation generation, and building function recognition. In the first stage, to enable semi-supervised training with limited annotations, we employ a semi-supervised object detection model that trains on labeled samples and a large amount of unlabeled data simultaneously to detect building facades. In the second stage, to further optimize the pseudo-labels, we exploit the geometric spatial relationships between GIS map data and panoramic street-view images, integrating building function information with the facade detection results. In the final stage, we combine these coarse annotations with the labeled data to achieve fine-grained building function recognition in both single-city and cross-city scenarios. We conduct extensive comparative experiments on four datasets (OmniCity, Madrid, Los Angeles, and Boston) to evaluate our method in single-city (OmniCity and Madrid) and cross-city (OmniCity to Los Angeles and OmniCity to Boston) scenarios. The experimental results show that, compared with advanced recognition methods, our method improves mAP by at least 4.8% on OmniCity and 4.3% on Madrid, while also handling class imbalance effectively. Furthermore, our method performs well in the cross-city experiments for Los Angeles and Boston, which use different categorization systems, highlighting its strong potential for cross-city tasks. This study offers a new solution for large-scale, multi-city applications by efficiently utilizing multi-source geoinformation data, enhancing urban information acquisition efficiency, and assisting rational resource allocation.
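The geometry-aware pseudo-label generation described in the second stage lends itself to a simple illustration. The sketch below is not taken from the paper: the function names, the equirectangular-panorama assumption, the use of a single footprint centroid, and the nearest-column matching heuristic with a `max_offset` tolerance are all our own simplifications. It shows one plausible way to transfer a building-function label from a GIS footprint to a detected facade box: compute the bearing from the panorama's capture point to the footprint centroid, map that bearing to a horizontal pixel column, and attach the label to the closest detection.

```python
import math

def bearing_deg(cam_lat, cam_lon, bld_lat, bld_lon):
    """Initial great-circle bearing from the camera location to a building centroid, in degrees [0, 360)."""
    phi1, phi2 = math.radians(cam_lat), math.radians(bld_lat)
    dlon = math.radians(bld_lon - cam_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def bearing_to_column(bearing, pano_heading, pano_width):
    """Map a geographic bearing to a horizontal pixel column of an equirectangular panorama.
    Assumes the image center points along pano_heading and the image spans 360 degrees."""
    rel = (bearing - pano_heading + 180.0) % 360.0
    return rel / 360.0 * pano_width

def assign_function_labels(detections, buildings, cam_lat, cam_lon, pano_heading, pano_width, max_offset=150):
    """Attach GIS function labels to detected facade boxes by horizontal proximity (hypothetical heuristic).
    detections: list of dicts with 'box' = (x1, y1, x2, y2) in pixels
    buildings:  list of dicts with 'lat', 'lon', 'function'
    Returns the detections with a 'function' pseudo-label, or None if no building is close enough."""
    for det in detections:
        x1, _, x2, _ = det["box"]
        center_col = 0.5 * (x1 + x2)
        best, best_dist = None, max_offset
        for bld in buildings:
            col = bearing_to_column(
                bearing_deg(cam_lat, cam_lon, bld["lat"], bld["lon"]),
                pano_heading, pano_width)
            # Use wrap-around distance so boxes near the panorama seam still match.
            dist = min(abs(col - center_col), pano_width - abs(col - center_col))
            if dist < best_dist:
                best, best_dist = bld, dist
        det["function"] = best["function"] if best else None
    return detections
```

In practice one would likely use the footprint polygon's full angular extent rather than a single centroid bearing, and gate matches by camera-to-building distance and occlusion; the point here is only how GIS geometry and panorama geometry can be aligned to turn map attributes into pseudo-labels for detected facades without manual annotation.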
Source journal

International Journal of Applied Earth Observation and Geoinformation (ITC Journal)
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00 · Self-citation rate: 0.00% · Review turnaround: 77 days

Journal description

The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes such as capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.