{"title":"Restoration of Motion Blur in Time-of-Flight Depth Image Using Data Alignment","authors":"Zhuo Chen, Peilin Liu, Fei Wen, Jun Wang, R. Ying","doi":"10.1109/3DV50981.2020.00092","DOIUrl":null,"url":null,"abstract":"Time-of-flight (ToF) sensors are vulnerable to motion blur in the presence of moving objects. This is due to the principle of ToF camera that it estimates depth from the phase-shift between emitted and received modulated signals. And the phase-shift is measured by four sequential phase-shifted images, which is assumed to be consistent in an integration time. However, object motion would give rise to disparity among the four phase-shifted images, contributing to unreliable depth measurement. In this paper, we propose a novel method that is capable of aligning the four phase-shifted images through investigating the electronic value of each pixel in the phase images. It consists of two steps, motion detecting and deblurring. Furthermore, a refinement utilizing an additional group of phase-shifted images is adopted to further improve the accuracy of depth measurement. Experiment results on a new elaborated dataset with ground-truth demonstrate that the proposed method compares favorably over existing methods in both accuracy and runtime. Particularly, the new method can achieve the best accuracy while being computationally efficient that can support real-time running.","PeriodicalId":293399,"journal":{"name":"2020 International Conference on 3D Vision (3DV)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on 3D Vision (3DV)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/3DV50981.2020.00092","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Time-of-flight (ToF) sensors are vulnerable to motion blur in the presence of moving objects. This is because a ToF camera estimates depth from the phase shift between the emitted and received modulated signals, and this phase shift is measured from four sequential phase-shifted images that are assumed to be consistent within one integration time. Object motion, however, introduces disparity among the four phase-shifted images, leading to unreliable depth measurements. In this paper, we propose a novel method that aligns the four phase-shifted images by investigating the electronic value of each pixel in the phase images. It consists of two steps: motion detection and deblurring. Furthermore, a refinement that utilizes an additional group of phase-shifted images is adopted to further improve the accuracy of the depth measurement. Experimental results on a newly constructed dataset with ground truth demonstrate that the proposed method compares favorably with existing methods in both accuracy and runtime. In particular, the new method achieves the best accuracy while remaining computationally efficient enough to support real-time operation.
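To make the failure mode concrete, the sketch below shows the standard four-bucket depth recovery that the abstract describes, where the four phase-shifted samples are assumed to come from a static scene; when the scene moves between the four captures, this formula mixes inconsistent samples and produces the blur the paper addresses. This is a minimal illustrative sketch, not the paper's algorithm: the function name, the 0/90/180/270-degree sampling convention, and the 20 MHz modulation frequency are assumptions for illustration only.

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F_MOD = 20e6        # assumed modulation frequency, Hz (illustrative value)

def depth_from_four_phases(q0, q1, q2, q3):
    """Per-pixel depth (m) from four phase-shifted images sampled at 0/90/180/270 deg.

    Assumes the scene is static across all four captures; object motion breaks
    this assumption, which is why the paper aligns the four images beforehand.
    """
    phase = np.arctan2(q3 - q1, q0 - q2)    # phase shift, in (-pi, pi]
    phase = np.mod(phase, 2 * np.pi)        # wrap to [0, 2*pi)
    return C * phase / (4 * np.pi * F_MOD)  # unambiguous range = C / (2 * F_MOD)

# Example on ideal synthetic samples for a flat scene at 2.5 m.
true_depth = 2.5
phi = 4 * np.pi * F_MOD * true_depth / C
q = [np.cos(phi + k * np.pi / 2) for k in range(4)]   # ideal correlation samples
print(depth_from_four_phases(*map(np.asarray, q)))    # ~2.5
```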