{"title":"低计算成本的移动设备实时物体姿态跟踪系统","authors":"Yo-Chung Lau;Kuan-Wei Tseng;Peng-Yuan Kao;I-Ju Hsieh;Hsiao-Ching Tseng;Yi-Ping Hung","doi":"10.1109/JISPIN.2023.3340987","DOIUrl":null,"url":null,"abstract":"Real-time object pose estimation and tracking is challenging but essential for some emerging applications, such as augmented reality. In general, state-of-the-art methods address this problem using deep neural networks, which indeed yield satisfactory results. Nevertheless, the high computational cost of these methods makes them unsuitable for mobile devices where real-world applications usually take place. We propose real-time object pose tracking system with low computational cost for mobile devices. It is a monocular inertial-assisted-visual system with a client–server architecture connected by high-speed networking. Inertial measurement unit (IMU) pose propagation is performed on the client side for fast pose tracking, and RGB image-based 3-D object pose estimation is performed on the server side to obtain accurate poses, after which the pose is sent to the client side for refinement, where we propose a bias self-correction mechanism to reduce the drift. We also propose a fast and effective pose inspection algorithm to detect tracking failures and incorrect pose estimation. In this way, the pose updates rapidly even within 5 ms on low-level devices, making it possible to support real-time tracking for applications. In addition, an object pose dataset with RGB images and IMU measurements is delivered for evaluation. Experiments also show that our method performs well with both accuracy and robustness.","PeriodicalId":100621,"journal":{"name":"IEEE Journal of Indoor and Seamless Positioning and Navigation","volume":"1 ","pages":"211-220"},"PeriodicalIF":0.0000,"publicationDate":"2023-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10352604","citationCount":"0","resultStr":"{\"title\":\"Real-Time Object Pose Tracking System With Low Computational Cost for Mobile Devices\",\"authors\":\"Yo-Chung Lau;Kuan-Wei Tseng;Peng-Yuan Kao;I-Ju Hsieh;Hsiao-Ching Tseng;Yi-Ping Hung\",\"doi\":\"10.1109/JISPIN.2023.3340987\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Real-time object pose estimation and tracking is challenging but essential for some emerging applications, such as augmented reality. In general, state-of-the-art methods address this problem using deep neural networks, which indeed yield satisfactory results. Nevertheless, the high computational cost of these methods makes them unsuitable for mobile devices where real-world applications usually take place. We propose real-time object pose tracking system with low computational cost for mobile devices. It is a monocular inertial-assisted-visual system with a client–server architecture connected by high-speed networking. Inertial measurement unit (IMU) pose propagation is performed on the client side for fast pose tracking, and RGB image-based 3-D object pose estimation is performed on the server side to obtain accurate poses, after which the pose is sent to the client side for refinement, where we propose a bias self-correction mechanism to reduce the drift. We also propose a fast and effective pose inspection algorithm to detect tracking failures and incorrect pose estimation. In this way, the pose updates rapidly even within 5 ms on low-level devices, making it possible to support real-time tracking for applications. 
In addition, an object pose dataset with RGB images and IMU measurements is delivered for evaluation. Experiments also show that our method performs well with both accuracy and robustness.\",\"PeriodicalId\":100621,\"journal\":{\"name\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"volume\":\"1 \",\"pages\":\"211-220\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-12-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10352604\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10352604/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Indoor and Seamless Positioning and Navigation","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10352604/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Real-Time Object Pose Tracking System With Low Computational Cost for Mobile Devices
Real-time object pose estimation and tracking is challenging but essential for emerging applications such as augmented reality. State-of-the-art methods generally address this problem with deep neural networks, which yield satisfactory results; however, their high computational cost makes them unsuitable for the mobile devices on which real-world applications usually run. We propose a real-time object pose tracking system with low computational cost for mobile devices. It is a monocular inertial-assisted visual system with a client–server architecture connected over a high-speed network. Inertial measurement unit (IMU) pose propagation is performed on the client for fast pose tracking, while RGB-image-based 3-D object pose estimation is performed on the server to obtain accurate poses; each accurate pose is then sent back to the client for refinement, where a proposed bias self-correction mechanism reduces drift. We also propose a fast and effective pose inspection algorithm that detects tracking failures and incorrect pose estimates. As a result, the pose is updated within 5 ms even on low-end devices, making real-time tracking practical for applications. In addition, an object pose dataset with RGB images and IMU measurements is released for evaluation. Experiments show that our method performs well in terms of both accuracy and robustness.
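
The abstract only outlines the client-side IMU propagation and the bias self-correction step; the paper's actual algorithm is not reproduced here. Below is a minimal, hypothetical Python sketch of how a client might integrate gyroscope samples between server updates and, when an accurate server pose arrives, attribute the accumulated orientation drift to a gyroscope bias. All class and function names, the gain parameter, and the small-angle correction rule are illustrative assumptions, not the authors' method.

import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

class ImuOrientationPropagator:
    """Hypothetical client-side orientation propagator (sketch only)."""

    def __init__(self):
        self.q = np.array([1.0, 0.0, 0.0, 0.0])  # orientation quaternion (w, x, y, z)
        self.gyro_bias = np.zeros(3)             # estimated gyroscope bias, rad/s

    def propagate(self, gyro, dt):
        """Integrate one bias-corrected gyroscope sample (rad/s) over dt seconds."""
        omega = gyro - self.gyro_bias
        dq = np.concatenate(([1.0], 0.5 * omega * dt))  # small-angle body-frame increment
        self.q = quat_mul(self.q, dq)
        self.q /= np.linalg.norm(self.q)
        return self.q

    def correct(self, q_server, elapsed, gain=0.1):
        """Snap to an accurate server pose and nudge the bias estimate.

        Assumes (for this sketch only) that the drift accumulated over
        `elapsed` seconds is mostly caused by a constant gyroscope bias."""
        q_server = np.asarray(q_server, dtype=float)
        conj = np.array([q_server[0], -q_server[1], -q_server[2], -q_server[3]])
        err = quat_mul(conj, self.q)             # body-frame error: q_server^-1 * q_client
        if err[0] < 0:                           # keep the shorter rotation
            err = -err
        rot_err = 2.0 * err[1:]                  # small-angle rotation-vector error, rad
        if elapsed > 0:
            self.gyro_bias += gain * rot_err / elapsed
        self.q = q_server.copy()                 # accurate pose replaces the drifted one

In a full system, translation would be propagated from accelerometer measurements in the same loop, and a pose inspection step would reject failed or inconsistent server estimates before they are used for correction; neither is shown in this sketch.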