An Improved Approach to 6D Object Pose Tracking in Fast Motion Scenarios
Yanming Wu, P. Vandewalle, P. Slaets, E. Demeester
2022 Sixth IEEE International Conference on Robotic Computing (IRC), December 2022. DOI: 10.1109/IRC55401.2022.00045
Abstract
Tracking the 6D poses of objects in video sequences is important for many applications such as robot manipulation and augmented reality. End-to-end, deep-learning-based 6D pose tracking methods have achieved notable performance in terms of both accuracy and speed on standard benchmarks characterized by slowly varying poses. However, these methods do not address a key challenge for deploying 6D pose trackers in fast motion scenarios: the performance of temporal trackers degrades significantly under fast motion, and tracking failures occur frequently. In this work, we propose a framework that makes end-to-end 6D pose trackers work better in fast motion scenarios. We integrate the “Relative Pose Estimation Network” from an end-to-end 6D pose tracker into an extended Kalman filter (EKF) framework. The EKF adopts a constant-velocity motion model, and its measurement is computed from the output of the “Relative Pose Estimation Network”. The proposed method is evaluated on challenging hand-object interaction sequences from the Laval dataset and compared against the original end-to-end pose tracker, referred to as the baseline. Experiments show that integration with the EKF significantly improves tracking performance, achieving a pose detection rate of 85.23%, compared to 61.32% for the baseline. The proposed framework also exceeds the real-time requirement of 30 fps.
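The core of the approach described above is a standard EKF predict/update cycle driven by a constant-velocity motion model, with the measurement derived from the relative pose predicted by the network. The sketch below is a minimal illustration of that filtering step for the translation component only, not the authors' implementation: the frame rate, noise covariances, and the composition used to turn the network's relative pose into an absolute position measurement (`T_prev @ T_rel`) are all assumptions, and the rotation component would be filtered analogously (e.g., on the tangent space of SO(3)).

```python
# Minimal sketch (not the paper's implementation): a constant-velocity EKF over
# the 3D translation of the object pose. The measurement z_k is assumed to be
# an absolute position obtained by composing the previous pose estimate with
# the relative pose predicted by the network.
import numpy as np

dt = 1.0 / 30.0  # assumed frame interval at 30 fps

# State x = [p_x, p_y, p_z, v_x, v_y, v_z]
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                     # constant-velocity transition
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # observe position only

Q = 1e-4 * np.eye(6)   # process noise covariance (assumed, tunable)
R = 1e-3 * np.eye(3)   # measurement noise covariance (assumed, tunable)

x = np.zeros(6)        # initial state
P = np.eye(6)          # initial state covariance


def ekf_step(x, P, z):
    """One predict/update cycle; z is the 3D position measurement."""
    # Predict with the constant-velocity model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement derived from the network output
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new


# Usage: at every frame, turn the network's relative pose into an absolute
# position measurement and feed it to the filter (hypothetical composition):
# z_k = (T_prev @ T_rel_from_network)[:3, 3]
# x, P = ekf_step(x, P, z_k)
```

With a linear constant-velocity model and a linear position measurement, the update reduces to the ordinary Kalman filter equations; nonlinearity, and hence the "extended" part of the EKF, enters once the rotation component is included in the state.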