Modeling and representing materials in the wild

K. Bala
{"title":"在野外建模和表示材料","authors":"K. Bala","doi":"10.1145/2643188.2700379","DOIUrl":null,"url":null,"abstract":"Our everyday life brings us in contact with a rich range of materials that contribute to both the utility and aesthetics of our environment. Human beings are very good at using subtle distinctions in appearance to distinguish between materials (e.g., silk vs. cotton, laminate vs. granite). Capturing these visually important, yet subtle, distinctions is critical for applications in many domains: in virtual and augmented reality fueled by the advent of devices like Google Glass, in virtual prototyping for industrial design, in ecommerce and retail, in textile design and prototyping, in interior design and remodeling, and in games and movies. Understanding how humans perceive materials can drive better graphics and vision algorithms for material recognition and understanding, and material reproduction. As a first step towards achieving this goal, it is useful to collect information about the vast range of materials that we encounter in our daily lives. We introduce two new crowdsourced databases of material annotations to drive better material-driven exploration. OpenSurfaces is a rich, labeled database consisting of thousands of examples of surfaces segmented from consumer photographs of interiors, and annotated with material parameters, texture information, and contextual information. IIW (Intrinsic Images in theWild) is a database of pairwise material annotations of points in images that is useful for decomposing images in the wild into material and lighting layers. Together these databases can drive various material-based applications like surface retexturing, intrinsic image decomposition, intelligent material-based image browsing, and material design.","PeriodicalId":115384,"journal":{"name":"Proceedings of the 30th Spring Conference on Computer Graphics","volume":"225 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Modeling and representing materials in the wild\",\"authors\":\"K. Bala\",\"doi\":\"10.1145/2643188.2700379\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Our everyday life brings us in contact with a rich range of materials that contribute to both the utility and aesthetics of our environment. Human beings are very good at using subtle distinctions in appearance to distinguish between materials (e.g., silk vs. cotton, laminate vs. granite). Capturing these visually important, yet subtle, distinctions is critical for applications in many domains: in virtual and augmented reality fueled by the advent of devices like Google Glass, in virtual prototyping for industrial design, in ecommerce and retail, in textile design and prototyping, in interior design and remodeling, and in games and movies. Understanding how humans perceive materials can drive better graphics and vision algorithms for material recognition and understanding, and material reproduction. As a first step towards achieving this goal, it is useful to collect information about the vast range of materials that we encounter in our daily lives. We introduce two new crowdsourced databases of material annotations to drive better material-driven exploration. 
OpenSurfaces is a rich, labeled database consisting of thousands of examples of surfaces segmented from consumer photographs of interiors, and annotated with material parameters, texture information, and contextual information. IIW (Intrinsic Images in theWild) is a database of pairwise material annotations of points in images that is useful for decomposing images in the wild into material and lighting layers. Together these databases can drive various material-based applications like surface retexturing, intrinsic image decomposition, intelligent material-based image browsing, and material design.\",\"PeriodicalId\":115384,\"journal\":{\"name\":\"Proceedings of the 30th Spring Conference on Computer Graphics\",\"volume\":\"225 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-05-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 30th Spring Conference on Computer Graphics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2643188.2700379\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 30th Spring Conference on Computer Graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2643188.2700379","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Our everyday life brings us in contact with a rich range of materials that contribute to both the utility and aesthetics of our environment. Human beings are very good at using subtle distinctions in appearance to distinguish between materials (e.g., silk vs. cotton, laminate vs. granite). Capturing these visually important, yet subtle, distinctions is critical for applications in many domains: in virtual and augmented reality fueled by the advent of devices like Google Glass, in virtual prototyping for industrial design, in ecommerce and retail, in textile design and prototyping, in interior design and remodeling, and in games and movies. Understanding how humans perceive materials can drive better graphics and vision algorithms for material recognition and understanding, and material reproduction. As a first step towards achieving this goal, it is useful to collect information about the vast range of materials that we encounter in our daily lives. We introduce two new crowdsourced databases of material annotations to drive better material-driven exploration. OpenSurfaces is a rich, labeled database consisting of thousands of examples of surfaces segmented from consumer photographs of interiors, and annotated with material parameters, texture information, and contextual information. IIW (Intrinsic Images in the Wild) is a database of pairwise material annotations of points in images that is useful for decomposing images in the wild into material and lighting layers. Together these databases can drive various material-based applications like surface retexturing, intrinsic image decomposition, intelligent material-based image browsing, and material design.
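The IIW description above leans on the standard intrinsic-image model, in which an observed image is (approximately) the pixelwise product of a reflectance (material) layer and a shading (lighting) layer. The sketch below is a minimal illustration of that model and of how pairwise "which point has the darker material" judgments could be scored against a candidate reflectance layer. The function names, the tuple-based annotation format, and the tolerances are assumptions for illustration, not the actual IIW data format or the paper's evaluation code.

```python
import numpy as np

def check_multiplicative_model(image, reflectance, shading, tol=1e-3):
    """Check that a candidate decomposition satisfies the standard
    multiplicative intrinsic-image model: image ~= reflectance * shading."""
    return np.allclose(image, reflectance * shading, atol=tol)

def pairwise_agreement(reflectance, judgments, equal_tol=0.05):
    """Fraction of human pairwise judgments that a (single-channel)
    reflectance layer agrees with.

    `judgments` is a list of ((r1, c1), (r2, c2), relation) tuples, where the
    two pixel coordinates mark the compared points and `relation` is one of
    'darker', 'lighter', or 'same' -- i.e. how annotators compared the
    material (reflectance) at point 1 relative to point 2.  This tuple format
    is an illustrative stand-in, not the actual IIW annotation format.
    """
    agree = 0
    for (r1, c1), (r2, c2), relation in judgments:
        a, b = reflectance[r1, c1], reflectance[r2, c2]
        if relation == 'darker':
            agree += a < b
        elif relation == 'lighter':
            agree += a > b
        else:  # 'same', within a small tolerance
            agree += abs(a - b) < equal_tol
    return agree / len(judgments)

# Toy example: the right column is a darker material; the bottom row is in shadow.
reflectance = np.array([[0.8, 0.2],
                        [0.8, 0.2]])
shading     = np.array([[1.0, 1.0],
                        [0.5, 0.5]])
image = reflectance * shading
judgments = [((0, 1), (0, 0), 'darker')]  # point (0,1) has darker material than (0,0)
print(check_multiplicative_model(image, reflectance, shading))  # True
print(pairwise_agreement(reflectance, judgments))               # 1.0
```

Note how the shadowed bottom-left pixel is darker in the image than the lit top-right pixel even though its material is lighter; this is exactly the kind of confusion that pairwise material judgments help disentangle.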