AR360: Dynamic Illumination for Augmented Reality with Real-Time Interaction

A. Alhakamy, M. Tuceryan
{"title":"AR360: Dynamic Illumination for Augmented Reality with Real-Time Interaction","authors":"A. Alhakamy, M. Tuceryan","doi":"10.1109/INFOCT.2019.8710982","DOIUrl":null,"url":null,"abstract":"Current augmented and mixed reality systems suffer a lack of correct illumination modeling where the virtual objects render the same lighting condition as the real environment. While we are experiencing astonishing results from the entertainment industry in multiple media forms, the procedure is mostly accomplished offline. The illumination information extracted from the physical scene is used to interactively render the virtual objects which results in a more realistic output in real-time. In this paper, we present a method that detects the physical illumination with dynamic scene, then uses the extracted illumination to render the virtual objects added to the scene. The method has three steps that are assumed to be working concurrently in real-time. The first is the estimation of the direct illumination (incident light) from the physical scene using computer vision techniques through a 360° live-feed camera connected to AR device. The second is the simulation of indirect illumination (reflected light) from the real-world surfaces to virtual objects rendering using region capture of 2D texture from the AR camera view. The third is defining the virtual objects with proper lighting and shadowing characteristics using shader language through multiple passes. Finally, we tested our work with multiple lighting conditions to evaluate the accuracy of results based on the shadow falling from the virtual objects which should be consistent with the shadow falling from the real objects with a reduced performance cost.","PeriodicalId":369231,"journal":{"name":"2019 IEEE 2nd International Conference on Information and Computer Technologies (ICICT)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 2nd International Conference on Information and Computer Technologies (ICICT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INFOCT.2019.8710982","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 13

Abstract

Current augmented and mixed reality systems lack correct illumination modeling, in which virtual objects are rendered under the same lighting conditions as the real environment. While the entertainment industry achieves striking results across multiple media forms, the process is mostly performed offline. Illumination information extracted from the physical scene can instead be used to render the virtual objects interactively, producing more realistic output in real time. In this paper, we present a method that detects the physical illumination in a dynamic scene and then uses the extracted illumination to render the virtual objects added to that scene. The method has three steps that run concurrently in real time. The first estimates direct illumination (incident light) from the physical scene using computer vision techniques applied to a 360° live-feed camera connected to the AR device. The second simulates indirect illumination (light reflected from real-world surfaces onto virtual objects) using region capture of a 2D texture from the AR camera view. The third defines the virtual objects with appropriate lighting and shadowing characteristics in a shader language over multiple rendering passes. Finally, we tested our work under multiple lighting conditions, evaluating accuracy by checking that the shadows cast by virtual objects are consistent with those cast by real objects, at a reduced performance cost.
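The abstract describes these steps only at the level of method; as a minimal illustrative sketch of steps one and two, the Python below picks the brightest pixel of an equirectangular 360° frame as a proxy for the incident-light direction and averages a rectangular region of the AR camera frame as a proxy for light reflected onto virtual objects. Everything here is an assumption made for illustration: the function names, the brightest-pixel heuristic, and the NumPy-array interface are not the authors', and the paper's actual pipeline uses richer computer vision techniques, with step three (multi-pass shading) living in shader code not reproduced here.

```python
import numpy as np

def estimate_light_direction(equirect: np.ndarray) -> np.ndarray:
    """Step 1 (sketch): dominant incident-light direction from a 360° frame.

    `equirect` is an H x W x 3 float RGB image in [0, 1] from the live-feed
    camera. Returns a unit vector (x, y, z), y-up, pointing toward the light.
    """
    # Rec. 709 luminance; the single brightest pixel stands in for the
    # light source (a real system would blur and cluster first).
    lum = equirect @ np.array([0.2126, 0.7152, 0.0722])
    h, w = lum.shape
    v, u = np.unravel_index(np.argmax(lum), lum.shape)
    # Equirectangular pixel -> spherical angles (longitude theta, latitude phi).
    theta = (u / w) * 2.0 * np.pi - np.pi      # [-pi, pi)
    phi = np.pi / 2.0 - (v / h) * np.pi        # [pi/2, -pi/2]
    # Spherical -> Cartesian unit direction.
    return np.array([np.cos(phi) * np.sin(theta),
                     np.sin(phi),
                     np.cos(phi) * np.cos(theta)])

def sample_indirect_color(camera_frame: np.ndarray,
                          region: tuple[int, int, int, int]) -> np.ndarray:
    """Step 2 (sketch): region capture of a 2D texture from the AR camera view.

    Averages the RGB of a rectangular patch (y0, y1, x0, x1); the mean color
    of a nearby real surface approximates the light it would reflect onto a
    virtual object placed beside it.
    """
    y0, y1, x0, x1 = region
    patch = camera_frame[y0:y1, x0:x1].astype(np.float64)
    return patch.reshape(-1, 3).mean(axis=0)

if __name__ == "__main__":
    frame360 = np.random.rand(512, 1024, 3)   # stand-in for a live 360° frame
    print("light direction:", estimate_light_direction(frame360))
    print("indirect color:", sample_indirect_color(frame360, (200, 260, 400, 460)))
```

In a full pipeline, values like these would drive the engine's directional light and an ambient term in the multi-pass shaders, re-evaluated every frame so that the three steps remain concurrent.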