Intuitive editing of material appearance

A. Serrano, D. Gutierrez, K. Myszkowski, H. Seidel, B. Masiá
{"title":"Intuitive editing of material appearance","authors":"A. Serrano, D. Gutierrez, K. Myszkowski, H. Seidel, B. Masiá","doi":"10.1145/2945078.2945141","DOIUrl":null,"url":null,"abstract":"Many different techniques for measuring material appearance have been proposed in the last few years. These have produced large public datasets, which have been used for accurate, data-driven appearance modeling. However, although these datasets have allowed us to reach an unprecedented level of realism in visual appearance, editing the captured data remains a challenge. In this work, we develop a novel methodology for intuitive and predictable editing of captured BRDF data, which allows for artistic creation of plausible material appearances, bypassing the difficulty of acquiring novel samples. We synthesize novel materials, and extend the existing MERL dataset [Matusik et al. 2003] up to 400 mathematically valid BRDFs. We design a large-scale experiment with 400 participants, gathering 56000 ratings about the perceptual attributes that best describe our extended dataset of materials. Using these ratings, we build and train networks of radial basis functions to act as functionals that map the high-level perceptual attributes to an underlying PCA-based representation of BRDFs. We show how our approach allows for intuitive edits of a wide range of visual properties, and demonstrate through a user study that our functionals are excellent predictors of the perceived attributes of appearance, enabling predictable editing with our framework.","PeriodicalId":417667,"journal":{"name":"ACM SIGGRAPH 2016 Posters","volume":"135 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM SIGGRAPH 2016 Posters","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2945078.2945141","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Many different techniques for measuring material appearance have been proposed in the last few years. These have produced large public datasets, which have been used for accurate, data-driven appearance modeling. However, although these datasets have allowed us to reach an unprecedented level of realism in visual appearance, editing the captured data remains a challenge. In this work, we develop a novel methodology for intuitive and predictable editing of captured BRDF data, which allows for artistic creation of plausible material appearances, bypassing the difficulty of acquiring novel samples. We synthesize novel materials, and extend the existing MERL dataset [Matusik et al. 2003] up to 400 mathematically valid BRDFs. We design a large-scale experiment with 400 participants, gathering 56000 ratings about the perceptual attributes that best describe our extended dataset of materials. Using these ratings, we build and train networks of radial basis functions to act as functionals that map the high-level perceptual attributes to an underlying PCA-based representation of BRDFs. We show how our approach allows for intuitive edits of a wide range of visual properties, and demonstrate through a user study that our functionals are excellent predictors of the perceived attributes of appearance, enabling predictable editing with our framework.
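To make the editing pipeline concrete, below is a minimal sketch (not the authors' code) of the core mapping the abstract describes: a Gaussian radial-basis-function network that takes high-level perceptual attribute ratings and returns coefficients in a PCA-based BRDF representation, which can then be projected back to a full BRDF. All array shapes, attribute counts, kernel width, and regularization values are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch: RBF functional from perceptual attributes to PCA coefficients.
# Shapes, attribute names, and hyperparameters are assumptions for demonstration only.
import numpy as np

def fit_rbf(attrs, pca_coeffs, sigma=0.5, reg=1e-6):
    """Fit RBF weights so that f(attrs[i]) ~= pca_coeffs[i].

    attrs:      (N, A) perceptual attribute ratings for N training BRDFs
    pca_coeffs: (N, K) their coordinates in a PCA basis of the measured BRDFs
    """
    d2 = ((attrs[:, None, :] - attrs[None, :, :]) ** 2).sum(-1)   # pairwise squared distances (N, N)
    Phi = np.exp(-d2 / (2.0 * sigma ** 2))                        # Gaussian kernel matrix
    W = np.linalg.solve(Phi + reg * np.eye(len(attrs)), pca_coeffs)
    return W

def predict_rbf(query_attrs, train_attrs, W, sigma=0.5):
    """Map (possibly edited) attribute values to PCA coefficients of a plausible BRDF."""
    d2 = ((query_attrs[:, None, :] - train_attrs[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2.0 * sigma ** 2))
    return Phi @ W                                                # (M, K)

# Toy usage with made-up data: 400 materials, 6 attributes, 20 PCA components.
rng = np.random.default_rng(0)
attrs = rng.uniform(0.0, 1.0, (400, 6))
coeffs = rng.normal(size=(400, 20))
W = fit_rbf(attrs, coeffs)

edited = attrs[:1].copy()
edited[0, 2] += 0.3                    # e.g. push one hypothetical attribute ("glossiness") up
new_coeffs = predict_rbf(edited, attrs, W)
# new_coeffs would then be mapped back through the PCA basis to reconstruct an edited BRDF.
```

In this formulation, editing a material amounts to moving in the low-dimensional attribute space and letting the learned functional produce the corresponding BRDF coefficients, which is what makes the edits predictable rather than requiring direct manipulation of the high-dimensional measured data.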