Parametric Linear Blend Skinning Model for Multiple-Shape 3D Garments

Xipeng Chen, Guangrun Wang, Xiaogang Xu, Philip Torr, Liang Lin
Journal: IEEE Transactions on Visualization and Computer Graphics
DOI: 10.1109/TVCG.2024.3478852
Published: 2024-10-18 (Journal Article)

Abstract

We present a novel data-driven Parametric Linear Blend Skinning (PLBS) model designed for generalized 3D garment dressing and animation. Previous data-driven methods are impeded by challenges including overreliance on human body modeling and limited adaptability across different garment shapes. Our method resolves these challenges via two goals: 1) develop a model based on garment modeling rather than human body modeling; 2) separately construct low-dimensional sub-spaces for modeling in-plane deformation (such as variation in garment shape and size) and out-of-plane deformation (such as deformation due to varied body size and motion). Accordingly, we formulate garment deformation as a PLBS model controlled by a canonical 3D garment mesh, vertex-based skinning weights, and associated local patch transformations. Unlike traditional LBS models specialized for individual objects, the PLBS model can uniformly express varied garments and bodies: the in-plane deformation is encoded on the canonical 3D garment, and the out-of-plane deformation is controlled by the local patch transformations. In addition, we propose novel 3D garment registration and skinning weight decomposition strategies to obtain adequate data for building PLBS models under different garment categories. Furthermore, we employ dynamic fine-tuning to complement the high-frequency signals missing from LBS on unseen testing data. Experiments illustrate that our method can model dynamics for loose-fitting garments, outperforming previous data-driven methods that use different sub-space modeling strategies. We show that our method factorizes and generalizes across varied body sizes, garment shapes, garment sizes, and human motions under different garment categories.
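The linear blend skinning formulation that the abstract builds on can be sketched as follows: each deformed vertex is the canonical vertex transformed by a weighted blend of local transformations. This is a minimal illustration of generic LBS, not the authors' PLBS implementation; the array shapes, uniform weights, and toy vertices are assumptions for demonstration only.

```python
import numpy as np

def linear_blend_skinning(vertices, weights, transforms):
    """Generic LBS: blend per-patch transforms with per-vertex weights,
    then apply the blended transform to each canonical vertex.

    vertices:   (V, 3) canonical garment vertices
    weights:    (V, K) per-vertex skinning weights (each row sums to 1)
    transforms: (K, 4, 4) homogeneous local patch transformations
    """
    V = vertices.shape[0]
    homo = np.concatenate([vertices, np.ones((V, 1))], axis=1)   # (V, 4)
    # Per-vertex blended transform: T_i = sum_k w_ik * T_k
    blended = np.einsum("vk,kij->vij", weights, transforms)      # (V, 4, 4)
    deformed = np.einsum("vij,vj->vi", blended, homo)            # (V, 4)
    return deformed[:, :3]

# Toy check: identity patch transforms leave the canonical garment unchanged.
verts = np.array([[0.0, 1.0, 0.0], [0.5, 0.8, 0.1]])
w = np.full((2, 3), 1.0 / 3.0)            # uniform weights over 3 patches
T = np.tile(np.eye(4), (3, 1, 1))
out = linear_blend_skinning(verts, w, T)
print(np.allclose(out, verts))            # True
```

In the paper's setting, the canonical mesh and the weights come from the learned garment sub-spaces rather than being fixed inputs as in this sketch; the blend-then-apply structure is what the PLBS model parameterizes.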
