Decoupling Contact for Fine-Grained Motion Style Transfer

Xiangjun Tang, Linjun Wu, He Wang, Yiqian Wu, Bo Hu, Songnan Li, Xu Gong, Yuchen Liao, Qilong Kou, Xiaogang Jin
arXiv - CS - Graphics · arXiv:2409.05387 · Published 2024-09-09
Citations: 0

Abstract

Motion style transfer changes the style of a motion while retaining its content, and is useful in computer animation and games. Contact is an essential component of motion style transfer that should be controlled explicitly in order to express the style vividly while enhancing motion naturalness and quality. However, it remains unclear how to decouple and control contact to achieve fine-grained control in motion style transfer. In this paper, we present a novel style transfer method that offers fine-grained control over contacts while achieving both motion naturalness and spatial-temporal variation of style. Based on our empirical evidence, we propose controlling contact indirectly through the hip velocity, which can be further decomposed into the trajectory and the contact timing. To this end, we propose a new model that explicitly captures the correlations between motion and trajectory, contact timing, and style, allowing us to decouple and control each separately. Our approach is built around a motion manifold in which hip controls can be easily integrated into a Transformer-based decoder. It is versatile: it can generate motions directly, and it can also serve as a post-processing step for existing methods to improve quality and contact controllability. In addition, we propose a new metric, grounded in our empirical evidence, that measures a correlation pattern of motions and aligns well with human perception of motion naturalness. Extensive evaluation shows that our method outperforms existing methods in style expressivity and motion quality.
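The core idea of the abstract — controlling contact indirectly through the hip velocity, which is decomposed into a trajectory and a contact-timing signal — can be illustrated with a minimal sketch. This is not the paper's model: the finite-difference velocity, the ground-plane projection used as the trajectory, and the use of a hip-speed threshold as a proxy for contact phases are all assumptions made for illustration.

```python
import numpy as np

def decompose_hip_velocity(hip_positions, dt=1.0 / 30.0, contact_speed_thresh=0.05):
    """Split hip motion into a spatial trajectory and a contact-timing signal.

    Illustrative sketch only: the threshold value and the assumption that
    low hip speed correlates with contact phases are hypothetical choices,
    not the paper's actual decomposition.
    """
    hip_positions = np.asarray(hip_positions, dtype=float)  # shape (T, 3): x, y (up), z
    velocity = np.gradient(hip_positions, dt, axis=0)       # finite-difference hip velocity
    speed = np.linalg.norm(velocity, axis=1)                # per-frame hip speed profile

    # Trajectory: the path the hip traces on the ground plane (x, z).
    trajectory = hip_positions[:, [0, 2]]

    # Contact timing: frames where hip speed dips below the threshold are
    # treated as a proxy for foot-contact phases.
    contact_timing = speed < contact_speed_thresh
    return trajectory, contact_timing
```

A downstream controller could then edit the two components independently — e.g. reshape the trajectory while preserving the contact-timing mask — which is the kind of decoupled control the method argues for.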