{"title":"Synchronized real-time multi-sensor motion capture system","authors":"J. Ruttle, M. Manzke, M. Prazák, Rozenn Dahyot","doi":"10.1145/1666778.1666828","DOIUrl":null,"url":null,"abstract":"This work addresses the challenge of synchronizing multiple sources of visible and audible information from a variety of devices, while capturing human motion in realtime. Video and audio data will be used to augment and enrich a motion capture database that will be released to the research community. While other such augmented motion capture databases exist [Black and Sigal 2006], the goal of this work is to build on these previous works. Critical areas of improvement are in the synchronization between cameras and synchronization between devices. Adding an array of audio recording devices to the setup will greatly expand the research potential of the database, and the positioning of the cameras will be varied to give greater flexibility. The augmented database will facilitate the testing and validation of human pose estimation and motion tracking techniques, among other applications. This sketch briefly describes some of the interesting challenges faced in setting up the pipeline for capturing the synchronized data and the novel approaches proposed to solve them.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1666778.1666828","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This work addresses the challenge of synchronizing multiple sources of visible and audible information from a variety of devices while capturing human motion in real time. Video and audio data will be used to augment and enrich a motion capture database that will be released to the research community. While other such augmented motion capture databases exist [Black and Sigal 2006], the goal of this work is to build on those previous efforts. The critical areas of improvement are synchronization between cameras and synchronization between different device types. Adding an array of audio recording devices to the setup will greatly expand the research potential of the database, and the positioning of the cameras will be varied to give greater flexibility. The augmented database will facilitate the testing and validation of human pose estimation and motion tracking techniques, among other applications. This sketch briefly describes some of the interesting challenges faced in setting up the pipeline for capturing the synchronized data and the novel approaches proposed to solve them.
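To make the synchronization problem concrete, the sketch below shows one simple way to align independently timestamped streams (e.g. camera frames and audio blocks) against a common reference clock by nearest-timestamp matching. This is a minimal illustration only, not the authors' pipeline; the device rates, tolerance, and function names are assumptions introduced for the example.

```python
# Minimal sketch (assumed, not the paper's method): align timestamped
# samples from one device to a reference stream by nearest-neighbour
# timestamp matching within a tolerance window.
from bisect import bisect_left
from typing import List, Tuple


def align_to_reference(ref_times: List[float],
                       dev_times: List[float],
                       tolerance: float = 0.010) -> List[Tuple[int, int]]:
    """For each reference timestamp, find the closest device sample
    within `tolerance` seconds; return (ref_index, dev_index) pairs.
    Both input lists are assumed to be sorted in ascending order."""
    pairs = []
    for i, t in enumerate(ref_times):
        j = bisect_left(dev_times, t)
        # Candidates are the device samples just before and just after t.
        candidates = [k for k in (j - 1, j) if 0 <= k < len(dev_times)]
        if not candidates:
            continue
        best = min(candidates, key=lambda k: abs(dev_times[k] - t))
        if abs(dev_times[best] - t) <= tolerance:
            pairs.append((i, best))
    return pairs


# Illustrative usage: a 30 fps camera as the reference stream and an
# audio device delivering 512-sample blocks at 44.1 kHz (~11.6 ms each).
camera_ts = [n / 30.0 for n in range(90)]            # ~3 s of video frames
audio_ts = [n * 512 / 44100.0 for n in range(259)]   # ~3 s of audio blocks
matched = align_to_reference(camera_ts, audio_ts)
print(f"matched {len(matched)} of {len(camera_ts)} camera frames")
```

In a real capture setup the reference clock would typically come from a hardware trigger or a shared timebase rather than per-device software timestamps, which is precisely the kind of cross-device synchronization issue the paper targets.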