{"title":"从视频中学习运动机器模型","authors":"Lucas Thies, M. Stamminger, F. Bauer","doi":"10.1109/AIVR50618.2020.00028","DOIUrl":null,"url":null,"abstract":"VR/AR applications, such as virtual training or coaching, often require a digital twin of a machine. Such a virtual twin must also include a kinematic model that defines its motion behavior. This behavior is usually expressed by constraints in a physics engine. In this paper, we present a system that automatically derives the kinematic model of a machine from RGB video with an optional depth channel. Our system records a live session while a user performs all typical machine movements. It then searches for trajectories and converts them into linear, circular and helical constraints. Our system can also detect kinematic chains and coupled constraints, for example, when a crank moves a toothed rod.","PeriodicalId":348199,"journal":{"name":"2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Learning Kinematic Machine Models from Videos\",\"authors\":\"Lucas Thies, M. Stamminger, F. Bauer\",\"doi\":\"10.1109/AIVR50618.2020.00028\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"VR/AR applications, such as virtual training or coaching, often require a digital twin of a machine. Such a virtual twin must also include a kinematic model that defines its motion behavior. This behavior is usually expressed by constraints in a physics engine. In this paper, we present a system that automatically derives the kinematic model of a machine from RGB video with an optional depth channel. Our system records a live session while a user performs all typical machine movements. It then searches for trajectories and converts them into linear, circular and helical constraints. Our system can also detect kinematic chains and coupled constraints, for example, when a crank moves a toothed rod.\",\"PeriodicalId\":348199,\"journal\":{\"name\":\"2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)\",\"volume\":\"8 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AIVR50618.2020.00028\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIVR50618.2020.00028","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
VR/AR applications, such as virtual training or coaching, often require a digital twin of a machine. Such a virtual twin must also include a kinematic model that defines its motion behavior. This behavior is usually expressed by constraints in a physics engine. In this paper, we present a system that automatically derives the kinematic model of a machine from RGB video with an optional depth channel. Our system records a live session while a user performs all typical machine movements. It then searches for trajectories and converts them into linear, circular and helical constraints. Our system can also detect kinematic chains and coupled constraints, for example, when a crank moves a toothed rod.
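To illustrate the kind of constraint extraction the abstract describes, here is a minimal, hypothetical sketch (not the authors' implementation): given a tracked 3D point trajectory recovered from the video, it fits a line and a circle to the points and labels the motion as a linear or circular constraint based on which fit has the smaller residual. The function names (fit_line, fit_circle, classify_trajectory) and the use of NumPy are assumptions made for this example only.

```python
# Hypothetical sketch, not the paper's method: classify a tracked 3D point
# trajectory as a linear or circular (revolute) constraint by fitting both
# models and comparing residuals.
import numpy as np

def fit_line(points):
    """Fit a 3D line via PCA; return (centroid, direction, RMS residual)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal direction = right singular vector of the largest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    # Residual = distance of each point from the fitted line.
    proj = centered @ direction
    residuals = centered - np.outer(proj, direction)
    rms = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return centroid, direction, rms

def fit_circle(points):
    """Fit a circle in the best-fit plane; return (center, axis, radius, RMS residual)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    u, v, normal = vt[0], vt[1], vt[2]   # in-plane axes and plane normal
    # Project points into the best-fit plane.
    xy = np.column_stack([centered @ u, centered @ v])
    # Algebraic circle fit: x^2 + y^2 = 2*a*x + 2*b*y + c.
    A = np.column_stack([2 * xy, np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    center = centroid + a * u + b * v
    # Residual = deviation of each point's distance to the center from the radius.
    dists = np.linalg.norm(xy - np.array([a, b]), axis=1)
    rms = np.sqrt(((dists - radius) ** 2).mean())
    return center, normal, radius, rms

def classify_trajectory(points):
    """Pick the constraint type whose fit has the smaller residual."""
    *_, line_rms = fit_line(points)
    *_, circ_rms = fit_circle(points)
    return "linear" if line_rms <= circ_rms else "circular"

if __name__ == "__main__":
    # Example: a noisy quarter-circle arc of radius 0.5 m in the XY plane.
    t = np.linspace(0, np.pi / 2, 50)
    traj = np.column_stack([0.5 * np.cos(t), 0.5 * np.sin(t), np.zeros_like(t)])
    traj += np.random.normal(scale=1e-3, size=traj.shape)
    print(classify_trajectory(traj))  # expected: "circular"
```

A helical constraint could be handled analogously by additionally fitting the displacement along the circle's axis as a linear function of the turning angle; detecting coupled constraints, as in the crank-and-rack example from the abstract, would further require comparing the parameters of trajectories from different parts.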