Decoupling Contact for Fine-Grained Motion Style Transfer

Xiangjun Tang, Linjun Wu, He Wang, Yiqian Wu, Bo Hu, Songnan Li, Xu Gong, Yuchen Liao, Qilong Kou, Xiaogang Jin

arXiv - CS - Graphics · arXiv:2409.05387 · 2024-09-09
Motion style transfer changes the style of a motion while retaining its
content and is useful in computer animations and games. Contact is an essential
component of motion style transfer that should be controlled explicitly in
order to express the style vividly while enhancing motion naturalness and
quality. However, it is unknown how to decouple and control contact to achieve
fine-grained control in motion style transfer. In this paper, we present a
novel style transfer method for fine-grained control over contacts while
achieving both motion naturalness and spatial-temporal variations of style.
Based on our empirical evidence, we propose controlling contact indirectly
through the hip velocity, which can be further decomposed into a trajectory
component and a contact-timing component. To this end, we propose a new model that
explicitly models the correlations between motions and trajectory/contact
timing/style, allowing us to decouple and control each separately. Our approach
is built around a motion manifold, where hip controls can be easily integrated
into a Transformer-based decoder. It is versatile in that it can generate
motions directly as well as be used as post-processing for existing methods to
improve quality and contact controllability. In addition, motivated by our
empirical evidence, we propose a new metric that measures correlation patterns
in motion and aligns well with human perception of motion naturalness.
Based on extensive evaluation, our method outperforms existing methods in terms
of style expressivity and motion quality.
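
The central idea above, controlling contact indirectly through the hip velocity and splitting that velocity into a trajectory component and a contact-timing component, can be illustrated with a minimal sketch. The function below is our own illustration, not the paper's implementation: it separates per-frame hip velocity into a unit direction (the trajectory shape) and a scalar speed profile (a proxy for contact timing); the actual decomposition in the paper may differ.

```python
import numpy as np

def decompose_hip_velocity(hip_pos, dt=1.0 / 30.0):
    """Split hip motion into a spatial trajectory and a timing signal.

    hip_pos: (T, 3) array of hip positions, one row per frame.
    Returns (direction, speed): a unit direction vector per frame
    (trajectory component) and a scalar speed per frame (a proxy
    for the contact-timing component). Illustrative only.
    """
    vel = np.gradient(hip_pos, dt, axis=0)              # (T, 3) hip velocity
    speed = np.linalg.norm(vel, axis=1)                 # contact-timing proxy
    direction = vel / np.maximum(speed, 1e-8)[:, None]  # trajectory shape
    return direction, speed
```

Editing the two outputs independently (e.g. retiming the speed profile while keeping the direction curve fixed) is one way to realize the kind of decoupled control the abstract describes.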