Part-Based 3D Face Morphable Model with Anthropometric Local Control

Donya Ghafourzadeh, Cyrus Rahgoshay, Sahel Fallahdoust, A. Beauchamp, Adeline Aubame, T. Popa, Eric Paquette
{"title":"基于局部控制的零件三维人脸变形模型","authors":"Donya Ghafourzadeh, Cyrus Rahgoshay, Sahel Fallahdoust, A. Beauchamp, Adeline Aubame, T. Popa, Eric Paquette","doi":"10.20380/GI2020.03","DOIUrl":null,"url":null,"abstract":"We propose an approach to construct realistic 3D facial morphable models (3DMM) that allows an intuitive facial attribute editing workflow. Current face modeling methods using 3DMM suffer from a lack of local control. We thus create a 3DMM by combining local part-based 3DMM for the eyes, nose, mouth, ears, and facial mask regions. Our local PCA-based approach uses a novel method to select the best eigenvectors from the local 3DMM to ensure that the combined 3DMM is expressive, while allowing accurate reconstruction. The editing controls we provide to the user are intuitive as they are extracted from anthropometric measurements found in the literature. Out of a large set of possible anthropometric measurements, we filter those that have meaningful generative power given the face data set. We bind the measurements to the part-based 3DMM through mapping matrices derived from our data set of facial scans. Our part-based 3DMM is compact, yet accurate, and compared to other 3DMM methods, it provides a new trade-off between local and global control. We tested our approach on a data set of 135 scans used to derive the 3DMM, plus 19 scans that served for validation. The results show that our part-based 3DMM approach has excellent generative properties and allows the user intuitive local control. *e-mail: donya.ghafourzadeh@ubisoft.com †e-mail: cyrus.rahgoshay@ubisoft.com ‡e-mail: sahel.fallahdoust@ubisoft.com §e-mail: andre.beauchamp@ubisoft.com ¶e-mail: adeline.aubame@ubisoft.com ||e-mail: tiberiu.popa@concordia.ca **e-mail: eric.paquette@etsmtl.ca","PeriodicalId":93493,"journal":{"name":"Proceedings. Graphics Interface (Conference)","volume":"1 1","pages":"7-16"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"Part-Based 3D Face Morphable Model with Anthropometric Local Control\",\"authors\":\"Donya Ghafourzadeh, Cyrus Rahgoshay, Sahel Fallahdoust, A. Beauchamp, Adeline Aubame, T. Popa, Eric Paquette\",\"doi\":\"10.20380/GI2020.03\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose an approach to construct realistic 3D facial morphable models (3DMM) that allows an intuitive facial attribute editing workflow. Current face modeling methods using 3DMM suffer from a lack of local control. We thus create a 3DMM by combining local part-based 3DMM for the eyes, nose, mouth, ears, and facial mask regions. Our local PCA-based approach uses a novel method to select the best eigenvectors from the local 3DMM to ensure that the combined 3DMM is expressive, while allowing accurate reconstruction. The editing controls we provide to the user are intuitive as they are extracted from anthropometric measurements found in the literature. Out of a large set of possible anthropometric measurements, we filter those that have meaningful generative power given the face data set. We bind the measurements to the part-based 3DMM through mapping matrices derived from our data set of facial scans. Our part-based 3DMM is compact, yet accurate, and compared to other 3DMM methods, it provides a new trade-off between local and global control. We tested our approach on a data set of 135 scans used to derive the 3DMM, plus 19 scans that served for validation. 
The results show that our part-based 3DMM approach has excellent generative properties and allows the user intuitive local control. *e-mail: donya.ghafourzadeh@ubisoft.com †e-mail: cyrus.rahgoshay@ubisoft.com ‡e-mail: sahel.fallahdoust@ubisoft.com §e-mail: andre.beauchamp@ubisoft.com ¶e-mail: adeline.aubame@ubisoft.com ||e-mail: tiberiu.popa@concordia.ca **e-mail: eric.paquette@etsmtl.ca\",\"PeriodicalId\":93493,\"journal\":{\"name\":\"Proceedings. Graphics Interface (Conference)\",\"volume\":\"1 1\",\"pages\":\"7-16\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings. Graphics Interface (Conference)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.20380/GI2020.03\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. Graphics Interface (Conference)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20380/GI2020.03","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 12

Abstract

We propose an approach to construct realistic 3D facial morphable models (3DMM) that allows an intuitive facial attribute editing workflow. Current face modeling methods using 3DMM suffer from a lack of local control. We thus create a 3DMM by combining local part-based 3DMM for the eyes, nose, mouth, ears, and facial mask regions. Our local PCA-based approach uses a novel method to select the best eigenvectors from the local 3DMM to ensure that the combined 3DMM is expressive, while allowing accurate reconstruction. The editing controls we provide to the user are intuitive as they are extracted from anthropometric measurements found in the literature. Out of a large set of possible anthropometric measurements, we filter those that have meaningful generative power given the face data set. We bind the measurements to the part-based 3DMM through mapping matrices derived from our data set of facial scans. Our part-based 3DMM is compact, yet accurate, and compared to other 3DMM methods, it provides a new trade-off between local and global control. We tested our approach on a data set of 135 scans used to derive the 3DMM, plus 19 scans that served for validation. The results show that our part-based 3DMM approach has excellent generative properties and allows the user intuitive local control.
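The abstract describes per-part PCA models whose coefficients are driven from anthropometric measurements through mapping matrices. The sketch below illustrates that general idea only; it is not the paper's implementation. The part names, array shapes, component counts, and the plain least-squares fit of the mapping matrix are all hypothetical stand-ins.

# Minimal sketch, assuming: one local PCA model per facial part and a
# linear measurement-to-coefficient mapping fitted by least squares.
import numpy as np

class PartModel:
    """Local PCA model for one facial part (e.g. the nose region)."""
    def __init__(self, vertices, n_components):
        # vertices: (n_scans, n_part_vertices * 3) flattened part geometry
        self.mean = vertices.mean(axis=0)
        centered = vertices - self.mean
        # SVD of the centered data gives the principal directions of the part
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        self.basis = vt[:n_components]            # (k, n_part_vertices * 3)
        self.coeffs = centered @ self.basis.T     # training coefficients

    def reconstruct(self, coeffs):
        # New part geometry from a vector of PCA coefficients
        return self.mean + coeffs @ self.basis

def fit_measurement_mapping(measurements, coeffs):
    """Least-squares mapping matrix M such that coeffs ~= measurements @ M.

    measurements: (n_scans, n_meas) anthropometric measurements per scan
    coeffs:       (n_scans, k) PCA coefficients of one part
    """
    M, *_ = np.linalg.lstsq(measurements, coeffs, rcond=None)
    return M  # (n_meas, k)

# Toy usage with random stand-in data (no real scans are used here)
rng = np.random.default_rng(0)
n_scans, n_vertices, n_meas = 135, 500, 6
scans = rng.normal(size=(n_scans, n_vertices * 3))
measurements = rng.normal(size=(n_scans, n_meas))

nose = PartModel(scans, n_components=10)
M = fit_measurement_mapping(measurements, nose.coeffs)

# Editing: change one measurement, map it to coefficients, rebuild the part
edited = measurements[0].copy()
edited[0] += 1.0                      # hypothetical "widen the nose" control
new_nose = nose.reconstruct(edited @ M)
print(new_nose.shape)                 # (1500,) flattened part vertices

In this sketch, editing a single measurement moves the part's PCA coefficients through the fitted mapping, which mirrors the kind of local, measurement-driven control the abstract refers to; the paper's eigenvector selection and the blending of parts into a full face are not reproduced here.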