{"title":"从非常大的运动数据库的运动数据检索","authors":"Cheng Ren, Xiaoyong Lei, Guofeng Zhang","doi":"10.1109/ICVRV.2011.50","DOIUrl":null,"url":null,"abstract":"The reuse of motion capture data has become an important way to generate realistic motions. Retrieval of similar motion segments from large motion datasets accordingly serves as a fundamental problem for data-based motion processing methods. The retrieval task is difficult due to the spatio-temporal variances existing in human motion. With the increasing amount of data, the retrieval task has become even more time consuming. In this paper, we present a motion retrieval approach that is capable of extracting similar motion subsequences from very large motion databases given a query motion input. Our method employs BIRCH-based(Balanced Iterative Reducing and Clustering using Hierarchies) clustering method to incrementally cluster poses so as to effectively deal with very large datasets. An elastic LCS(longest common subsequence) algorithm is then proposed to discover the similar motion subsequences based on the posture clustering result. Finally, the motion patterns are extracted and stored, with each pattern containing a set of similar motions. In the runtime retrieval stage, as each stored pattern effectively compared with the query motion, the group of the similar motions is acquired. Experimental results show that our method successfully retrieves similar motions and outperforms the existing methods in time and space costs when applying to very large motion datasets.","PeriodicalId":239933,"journal":{"name":"2011 International Conference on Virtual Reality and Visualization","volume":"147 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"Motion Data Retrieval from Very Large Motion Databases\",\"authors\":\"Cheng Ren, Xiaoyong Lei, Guofeng Zhang\",\"doi\":\"10.1109/ICVRV.2011.50\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The reuse of motion capture data has become an important way to generate realistic motions. Retrieval of similar motion segments from large motion datasets accordingly serves as a fundamental problem for data-based motion processing methods. The retrieval task is difficult due to the spatio-temporal variances existing in human motion. With the increasing amount of data, the retrieval task has become even more time consuming. In this paper, we present a motion retrieval approach that is capable of extracting similar motion subsequences from very large motion databases given a query motion input. Our method employs BIRCH-based(Balanced Iterative Reducing and Clustering using Hierarchies) clustering method to incrementally cluster poses so as to effectively deal with very large datasets. An elastic LCS(longest common subsequence) algorithm is then proposed to discover the similar motion subsequences based on the posture clustering result. Finally, the motion patterns are extracted and stored, with each pattern containing a set of similar motions. In the runtime retrieval stage, as each stored pattern effectively compared with the query motion, the group of the similar motions is acquired. 
Experimental results show that our method successfully retrieves similar motions and outperforms the existing methods in time and space costs when applying to very large motion datasets.\",\"PeriodicalId\":239933,\"journal\":{\"name\":\"2011 International Conference on Virtual Reality and Visualization\",\"volume\":\"147 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-11-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 International Conference on Virtual Reality and Visualization\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICVRV.2011.50\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 International Conference on Virtual Reality and Visualization","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICVRV.2011.50","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13
Abstract
The reuse of motion capture data has become an important way to generate realistic motions. Retrieving similar motion segments from large motion datasets is therefore a fundamental problem for data-driven motion processing methods. The retrieval task is difficult because of the spatio-temporal variance in human motion, and it becomes even more time-consuming as the amount of data grows. In this paper, we present a motion retrieval approach that extracts similar motion subsequences from very large motion databases given a query motion. Our method employs a BIRCH-based (Balanced Iterative Reducing and Clustering using Hierarchies) clustering method to incrementally cluster poses, which allows it to handle very large datasets effectively. An elastic LCS (longest common subsequence) algorithm is then proposed to discover similar motion subsequences based on the posture clustering result. Finally, motion patterns are extracted and stored, with each pattern containing a set of similar motions. In the runtime retrieval stage, each stored pattern is efficiently compared with the query motion to obtain the group of similar motions. Experimental results show that our method successfully retrieves similar motions and outperforms existing methods in time and space costs when applied to very large motion datasets.
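The sketch below illustrates the general pipeline the abstract describes, not the authors' implementation: poses are quantised with scikit-learn's Birch clusterer (fed incrementally via partial_fit), each motion becomes a sequence of cluster labels, and a plain longest-common-subsequence score is used as a stand-in for the paper's elastic LCS when ranking stored motions against a query. All function names, parameters (threshold, n_clusters, top_k), and the assumption that poses are flattened joint-feature vectors are illustrative choices, not taken from the paper.

```python
# Illustrative sketch of BIRCH pose clustering + LCS-based motion retrieval.
# Assumes each motion is a (frames x pose_dim) NumPy array of pose features.
import numpy as np
from sklearn.cluster import Birch


def cluster_poses(pose_batches, n_clusters=64):
    """Incrementally fit BIRCH on batches of pose vectors (large datasets)."""
    birch = Birch(threshold=0.5, branching_factor=50, n_clusters=n_clusters)
    for batch in pose_batches:      # each batch: (frames, pose_dim) array
        birch.partial_fit(batch)    # incremental pass, no need to hold all data
    return birch


def to_label_sequence(birch, motion):
    """Map a motion to a sequence of pose-cluster labels."""
    return birch.predict(motion).tolist()


def lcs_length(a, b):
    """Classic O(len(a)*len(b)) longest-common-subsequence DP."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]


def retrieve(birch, stored_motions, query, top_k=5):
    """Rank stored motions by normalised LCS similarity to the query motion."""
    q = to_label_sequence(birch, query)
    scored = []
    for name, motion in stored_motions.items():
        s = to_label_sequence(birch, motion)
        scored.append((lcs_length(q, s) / max(len(q), len(s)), name))
    return sorted(scored, reverse=True)[:top_k]
```

Comparing label sequences rather than raw pose frames is what makes the matching tolerant to spatial variance; the elastic LCS in the paper additionally relaxes temporal alignment, which the plain DP above does not attempt to reproduce.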