Tele-AR System Based on Real-time Camera Tracking
Yanke Wang, Fan Zhong, C. Li, Hui Xiang, Qunsheng Peng, Xueying Qin
2012 International Conference on Virtual Reality and Visualization, September 14, 2012. DOI: 10.1109/ICVRV.2012.13 (https://doi.org/10.1109/ICVRV.2012.13)
We propose a visual tele-AR system that combines tele-existence and AR techniques. By wearing a video see-through HMD and moving his head, a user of the system can tele-operate a remote pan-tilt platform to observe the remote scene. Unlike traditional tele-existence systems, which use mechanical or magnetic sensors for tele-operation, we propose a novel tele-operation method based on real-time tracking of master and slave cameras. The master camera is a component of the video see-through HMD mounted in front of the user's eyes, while the slave camera is carried by the remote platform. We estimate the user's head motion by tracking the master camera in real time, and translate head motions into motion commands that drive the remote pan-tilt platform. Ideally, the motion of the remote platform would exactly match that of the user's head. Unfortunately, the motion speed of the slave platform is usually limited by its mechanical capability, which may cause pose inconsistency between the master and slave cameras, especially when the user's head moves fast. We solve this problem by also tracking the slave camera in real time and warping the video images according to the difference between the master and slave camera poses. Since the slave camera is tracked, we can also integrate virtual objects into the remote scene, giving the user an augmented visual sense. To reduce system delay, our implementation is accelerated with GPU parallel computation. Experimental results show the efficiency and robustness of our method.
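The abstract's two key steps, translating tracked head motion into pan-tilt commands and warping the slave video by the master/slave pose difference, can be illustrated with a short sketch. The paper does not publish code, so the snippet below is only a minimal illustration under stated assumptions: both poses are available as world-to-camera rotation matrices (e.g., as produced by cv2.solvePnP), the pan-tilt platform makes the two views differ by an approximately pure rotation about a shared camera centre, and the slave camera intrinsics K are known. The function names pose_to_pan_tilt and warp_to_master_view, and the Euler-angle convention used, are hypothetical choices, not the authors' implementation.

```python
import cv2
import numpy as np


def pose_to_pan_tilt(R_master):
    """Translate the tracked head rotation into pan/tilt commands (degrees).

    Assumes R_master is a world-to-camera rotation and a y-down world frame,
    so pan is rotation about the vertical (y) axis and tilt about x.
    This angle convention is an assumption for illustration only.
    """
    fwd = R_master.T @ np.array([0.0, 0.0, 1.0])  # viewing direction in world coords
    pan = np.degrees(np.arctan2(fwd[0], fwd[2]))
    tilt = np.degrees(np.arcsin(-fwd[1]))
    return pan, tilt


def warp_to_master_view(slave_frame, K, R_master, R_slave):
    """Re-render the slave image as if it were taken from the master (head) pose.

    With a pan-tilt platform the two viewpoints share (approximately) the same
    camera centre, so the image-to-image mapping is the pure-rotation homography
    H = K * R_delta * K^{-1}, where R_delta rotates slave-frame rays into the
    master frame (rotations here are world-to-camera).
    """
    R_delta = R_master @ R_slave.T
    H = K @ R_delta @ np.linalg.inv(K)
    h, w = slave_frame.shape[:2]
    return cv2.warpPerspective(slave_frame, H, (w, h))
```

In a real-time pipeline this warp would run per frame (the paper reports GPU acceleration to reduce delay), and the same tracked slave pose is what allows virtual objects to be rendered consistently into the warped view.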