A close-range photogrammetric model for tracking and performance-based forecasting earthmoving operations

Construction Innovation-England (Impact Factor 3.1, Q2, Construction & Building Technology). Published 2023-06-09. DOI: 10.1108/ci-12-2022-0323
Wahib Saif, Adel Alshibani

Abstract

Purpose
This paper aims to present a highly accessible and affordable tracking model for earthmoving operations, in an attempt to overcome some of the limitations of current tracking models.

Design/methodology/approach
The proposed methodology involves four main processes: acquiring onsite terrestrial images; processing the images into scaled 3D point-cloud data; extracting volumetric measurements and crew productivity estimates from multiple point clouds using Delaunay triangulation; and conducting earned value/schedule analysis and forecasting the remaining scope of work based on the estimated performance. For validation, the tracking model was compared with an observation-based tracking approach at a backfilling site. It was also used to track a coarse base aggregate inventory for a road construction project.

Findings
The presented model proved to be a practical and accurate tracking approach that algorithmically estimates and forecasts all performance parameters from the captured data.

Originality/value
The proposed model is unique in extracting accurate volumetric measurements directly from multiple point clouds in purpose-written code using Delaunay triangulation, instead of extracting them from textured models in modelling software, which is neither automated nor time-efficient. Furthermore, the model uses a self-calibration approach that eliminates the pre-calibration procedure otherwise required for each camera before image capture. Thus, any worker onsite can capture the required images with an easily accessible camera (e.g. a handheld camera or a smartphone), and the images can be sent to any processing device via e-mail, cloud-based storage or any communication application (e.g. WhatsApp).
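The Delaunay-based volumetric step can be illustrated compactly. The authors' own code is not reproduced here; the following is a minimal 2.5-D sketch (an assumption, not the paper's implementation) that triangulates a point cloud's horizontal footprint with SciPy's `Delaunay` and sums triangular-prism volumes above an assumed flat base plane.

```python
import numpy as np
from scipy.spatial import Delaunay

def prism_volume(points, base_z=0.0):
    """Estimate material volume above a flat base plane.

    points: (N, 3) array-like of x, y, z terrain samples (e.g. from a point cloud).
    A 2-D Delaunay triangulation of the x-y footprint turns the surface into
    triangular prisms; each prism's volume is its projected footprint area
    times the mean height of its three vertices above base_z.
    """
    pts = np.asarray(points, dtype=float)
    tri = Delaunay(pts[:, :2])  # triangulate the horizontal footprint
    total = 0.0
    for simplex in tri.simplices:
        a, b, c = pts[simplex]
        # projected footprint area via the 2-D cross product
        area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                         - (c[0] - a[0]) * (b[1] - a[1]))
        mean_h = (a[2] + b[2] + c[2]) / 3.0 - base_z
        total += area * mean_h
    return total

# Example: a 1 m x 1 m pad of uniform 2 m height -> 2 m^3
pad = [(0, 0, 2), (1, 0, 2), (0, 1, 2), (1, 1, 2)]
print(prism_volume(pad))  # 2.0
```

Differencing two such volumes from point clouds captured at successive dates would give the quantity moved in the interval, from which crew productivity can be estimated.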
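The earned value/schedule analysis rests on standard EVM indices. A minimal sketch using the conventional CPI/SPI formulas follows; the paper's exact forecasting procedure is not reproduced, and the input figures are hypothetical.

```python
def earned_value_forecast(bcws, bcwp, acwp, bac):
    """Standard earned-value indices and a performance-based cost forecast.

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed
    bac:  budget at completion
    """
    cpi = bcwp / acwp  # cost performance index (< 1 means over budget)
    spi = bcwp / bcws  # schedule performance index (< 1 means behind schedule)
    eac = bac / cpi    # estimate at completion, assuming current CPI persists
    return cpi, spi, eac

# Hypothetical status figures for an earthmoving crew, in currency units
cpi, spi, eac = earned_value_forecast(bcws=50_000, bcwp=45_000,
                                      acwp=60_000, bac=200_000)
print(round(cpi, 2), round(spi, 2), round(eac))  # 0.75 0.9 266667
```

In the proposed model, BCWP would be derived from the photogrammetric volume measurements rather than from manual progress observation.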
Source journal
Construction Innovation-England (Construction & Building Technology)
CiteScore: 7.10
Self-citation rate: 12.10%
Articles per year: 71