Procedurally generated virtual reality from 3D reconstructed physical space

Misha Sra, Sergio Garrido-Jurado, C. Schmandt
DOI: 10.1145/2993369.2993372
Published in: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016
Citations: 99

Abstract

We present a novel system for automatically generating immersive and interactive virtual reality (VR) environments using the real world as a template. The system captures indoor scenes in 3D, detects obstacles like furniture and walls, and maps walkable areas (WA) to enable real-walking in the generated virtual environment (VE). Depth data is additionally used for recognizing and tracking objects during the VR experience. The detected objects are paired with virtual counterparts to leverage the physicality of the real world for a tactile experience. Our approach is new, in that it allows a casual user to easily create virtual reality worlds in any indoor space of arbitrary size and shape without requiring specialized equipment or training. We demonstrate our approach through a fully working system implemented on the Google Project Tango tablet device.
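The abstract describes mapping walkable areas (WA) so the user can really walk inside the generated VE. As a minimal illustrative sketch (not the authors' implementation), one can imagine the captured scene reduced to a 2D occupancy grid of the floor, with the walkable area found by a flood fill from the user's position; the grid contents, cell encoding (1 = obstacle such as furniture or a wall, 0 = free floor), and `start` position below are all hypothetical.

```python
from collections import deque

def walkable_area(grid, start):
    """Return the set of free cells reachable from `start` (4-connectivity)."""
    rows, cols = len(grid), len(grid[0])
    if grid[start[0]][start[1]] == 1:
        return set()  # starting inside an obstacle: nothing is walkable
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# Hypothetical room: a wall segment (1s) partially divides the floor.
room = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
wa = walkable_area(room, (0, 0))
print(len(wa))  # → 10: every free cell is reachable around the wall
```

In the actual system such a map would be derived from the Tango device's depth reconstruction rather than a hand-written grid, and the reachable region would then constrain where virtual geometry permits locomotion.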