CT2Hair: High-Fidelity 3D Hair Modeling using Computed Tomography

Yuefan Shen, Shunsuke Saito, Ziyan Wang, O. Maury, Chenglei Wu, J. Hodgins, Youyi Zheng, Giljoo Nam
ACM Transactions on Graphics (TOG), pages 1-13. Published 2023-07-26. DOI: 10.1145/3592106.

Abstract

We introduce CT2Hair, a fully automatic framework for creating high-fidelity 3D hair models that are suitable for use in downstream graphics applications. Our approach uses real-world hair wigs as input and is able to reconstruct hair strands for a wide range of hair styles. Our method leverages computed tomography (CT) to create density volumes of the hair regions, allowing us to see through the hair, unlike image-based approaches, which are limited to reconstructing the visible surface. To address the noise and limited resolution of the input density volumes, we employ a coarse-to-fine approach. This process first recovers guide strands with estimated 3D orientation fields, and then populates dense strands through a novel neural interpolation of the guide strands. The generated strands are then refined to conform to the input density volumes. We demonstrate the robustness of our approach by presenting results on a wide variety of hair styles and conducting thorough evaluations on both real-world and synthetic datasets. Code and data for this paper are at github.com/facebookresearch/CT2Hair.
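The abstract does not specify how the 3D orientation fields are estimated from the CT density volume. A common generic technique for recovering local filament directions in volumetric data is the structure tensor: for hair-like structures, the density gradient varies least along the strand, so the eigenvector with the smallest eigenvalue of the locally smoothed gradient outer-product gives the strand tangent. The sketch below illustrates that generic idea only; the function name, parameters, and approach are illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from scipy import ndimage

def estimate_orientation_field(density, sigma=1.0):
    """Estimate a per-voxel 3D orientation field from a density volume
    via the structure tensor (illustrative sketch, not CT2Hair's method).

    For filamentary structures, the gradient is near-zero along the
    filament, so the eigenvector of the smallest eigenvalue of the
    smoothed structure tensor approximates the local strand tangent.
    """
    # Spatial gradients of the density volume (axes: z, y, x).
    gz, gy, gx = np.gradient(density.astype(np.float64))
    grads = np.stack([gx, gy, gz], axis=-1)           # (D, H, W, 3)

    # Structure tensor: outer product of gradients per voxel.
    T = grads[..., :, None] * grads[..., None, :]     # (D, H, W, 3, 3)

    # Smooth each tensor component over a local neighborhood.
    for i in range(3):
        for j in range(3):
            T[..., i, j] = ndimage.gaussian_filter(T[..., i, j], sigma)

    # eigh returns eigenvalues in ascending order; the eigenvector of
    # the smallest eigenvalue is the estimated strand tangent.
    _, eigvecs = np.linalg.eigh(T)
    tangents = eigvecs[..., :, 0]                     # (D, H, W, 3)
    return tangents
```

On a synthetic volume containing a straight Gaussian "strand" running along the x-axis, the recovered tangents point along x (up to sign), which is the sanity check one would expect from this construction.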