From 3D pedestrian networks to wheelable networks: An automatic wheelability assessment method for high-density urban areas using contrastive deep learning of smartphone point clouds

Computers Environment and Urban Systems · IF 7.1 · Q1 (Environmental Studies) · Earth Sciences, Division 1 · Pub Date: 2025-02-05 · DOI: 10.1016/j.compenvurbsys.2025.102255
Siyuan Meng, Xian Su, Guibo Sun, Maosu Li, Fan Xue
{"title":"From 3D pedestrian networks to wheelable networks: An automatic wheelability assessment method for high-density urban areas using contrastive deep learning of smartphone point clouds","authors":"Siyuan Meng ,&nbsp;Xian Su ,&nbsp;Guibo Sun ,&nbsp;Maosu Li ,&nbsp;Fan Xue","doi":"10.1016/j.compenvurbsys.2025.102255","DOIUrl":null,"url":null,"abstract":"<div><div>This paper presents a contrastive deep learning-based wheelability assessment method bridging street-scale smartphone point clouds and a city-scale 3D pedestrian network (3DPN). 3DPNs have been studied and mapped for walkability and smart city applications. However, the city-level scale of 3DPN in the literature was incomplete for assessing wheelchair accessibility (i.e., wheelability) due to omitted pedestrian paths, undetected stairs, and oversimplified elevated walkways; these features could be better represented if the mapping scale was at a micro-level designed for wheelchair users. In this paper, we reinforced the city-scale 3DPN using smartphone point clouds, a promising data source for supplementing fine-grain details and temporal changes due to the centimeter-level accuracy, vivid color, high density, and crowd sourcing nature. The three-step method reconstructs pedestrian paths, stairs, and slope details and enriches the city-scale 3DPN for wheelability assessment. The experimental results on pedestrian paths demonstrated accurate 3DPN centerline position (<em>mIoU</em> = 88.81 %), stairs detection (<em>mIoU</em> = 86.39 %), and wheelability assessment (<em>MAE</em> = 0.09). This paper contributes an automatic, accurate, and crowd sourcing wheelability assessment method that bridges ubiquitous smartphones and 3DPN for barrier-free travels in high-density and hilly urban areas.</div></div>","PeriodicalId":48241,"journal":{"name":"Computers Environment and Urban Systems","volume":"117 ","pages":"Article 102255"},"PeriodicalIF":7.1000,"publicationDate":"2025-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers Environment and Urban Systems","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0198971525000080","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENVIRONMENTAL STUDIES","Score":null,"Total":0}
Citations: 0

Abstract

This paper presents a contrastive deep learning-based wheelability assessment method bridging street-scale smartphone point clouds and a city-scale 3D pedestrian network (3DPN). 3DPNs have been studied and mapped for walkability and smart city applications. However, city-scale 3DPNs in the literature are incomplete for assessing wheelchair accessibility (i.e., wheelability) due to omitted pedestrian paths, undetected stairs, and oversimplified elevated walkways; these features could be better represented if the mapping scale were at a micro-level designed for wheelchair users. In this paper, we reinforced the city-scale 3DPN using smartphone point clouds, a promising data source for supplementing fine-grained details and temporal changes owing to their centimeter-level accuracy, vivid color, high density, and crowdsourcing nature. The three-step method reconstructs pedestrian paths, stairs, and slope details and enriches the city-scale 3DPN for wheelability assessment. The experimental results on pedestrian paths demonstrated accurate 3DPN centerline positioning (mIoU = 88.81 %), stair detection (mIoU = 86.39 %), and wheelability assessment (MAE = 0.09). This paper contributes an automatic, accurate, and crowdsourced wheelability assessment method that bridges ubiquitous smartphones and the 3DPN for barrier-free travel in high-density and hilly urban areas.
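The abstract names contrastive deep learning as the representation-learning backbone and reports mIoU for path and stair extraction plus MAE for the wheelability score, but does not give the network or loss details. The snippet below is therefore only a minimal NumPy sketch of a generic InfoNCE-style contrastive loss over paired point-cloud patch embeddings, together with the two evaluation metrics as commonly defined; the function names, temperature value, and input shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def info_nce_loss(anchor, positive, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (assumed formulation, not the paper's).

    anchor, positive: (N, D) arrays of paired embeddings, e.g. two augmented views
    of the same smartphone point-cloud patch. Each anchor's positive is the matching
    row; all other rows in the batch act as negatives.
    """
    a = anchor / np.linalg.norm(anchor, axis=1, keepdims=True)
    p = positive / np.linalg.norm(positive, axis=1, keepdims=True)
    logits = a @ p.T / temperature                  # (N, N) cosine-similarity logits
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -float(np.mean(np.diag(log_prob)))       # positives lie on the diagonal

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union for per-point class labels (e.g. path / stair)."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

def mean_absolute_error(pred_scores, true_scores):
    """MAE between predicted and reference wheelability scores."""
    pred_scores = np.asarray(pred_scores, dtype=float)
    true_scores = np.asarray(true_scores, dtype=float)
    return float(np.mean(np.abs(pred_scores - true_scores)))
```

As a usage sketch, `mean_iou` would be applied to per-point segmentation labels of the reconstructed paths and stairs, and `mean_absolute_error` to predicted versus ground-truth wheelability scores on a 0-1 scale; both correspond to the metric types the abstract reports, not to its exact evaluation protocol.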
Source journal
CiteScore: 13.30
Self-citation rate: 7.40%
Annual publications: 111
Review time: 32 days
Journal description: Computers, Environment and Urban Systems is an interdisciplinary journal publishing cutting-edge, innovative computer-based research on environmental and urban systems that privileges the geospatial perspective. The journal welcomes original, high-quality scholarship of a theoretical, applied, or technological nature, and provides a stimulating presentation of perspectives, research developments, overviews of important new technologies, and uses of major computational, information-based, and visualization innovations. Applied and theoretical contributions demonstrate the scope of computer-based analysis fostering a better understanding of environmental and urban systems, their spatial scope, and their dynamics.
Latest articles from this journal:
Plot-scale population estimation modeling based on residential plot form clustering and locational attractiveness analysis
Future scenarios for urban agriculture and food security in sub-Saharan Africa: Modelling the urban land-food system in an agent-based approach
Editorial Board
Model-based estimation of the isolated impacts of urban expansion on projected streamflow values under varied climate scenarios
Shortest covering paths and other covering walks: Refinements and prospects for subtour prevention