Omegalib: A multi-view application framework for hybrid reality display environments

Alessandro Febretti, Arthur Nishimoto, Victor Mateevitsi, Luc Renambot, Andrew E. Johnson, Jason Leigh
Published in: 2014 IEEE Virtual Reality (VR), April 2014
DOI: 10.1109/VR.2014.6802043
Citations: 43

Abstract

In the domain of large-scale visualization instruments, hybrid reality environments (HREs) are a recent innovation that combines the best-in-class capabilities of immersive environments, with the best-in-class capabilities of ultra-high-resolution display walls. HREs create a seamless 2D/3D environment that supports both information-rich analysis as well as virtual reality simulation exploration at a resolution matching human visual acuity. Co-located research groups in HREs tend to work on a variety of tasks during a research session (sometimes in parallel), and these tasks require 2D data views, 3D views, linking between them and the ability to bring in (or hide) data quickly as needed. In this paper we present Omegalib, a software framework that facilitates application development on HREs. Omegalib is designed to support dynamic reconfigurability of the display environment, so that areas of the display can be interactively allocated to 2D or 3D workspaces as needed. Compared to existing frameworks and toolkits, Omegalib makes it possible to have multiple immersive applications running on a cluster-controlled display system, have different input sources dynamically routed to applications, and have rendering results optionally redirected to a distributed compositing manager. Omegalib supports pluggable front-ends, to simplify the integration of third-party libraries like OpenGL, OpenSceneGraph, and the Visualization Toolkit (VTK). We present examples of applications developed with Omegalib for the 74-megapixel, 72-tile CAVE2™ system, and show how a Hybrid Reality Environment proved effective in supporting work for a co-located research group in the environmental sciences.
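The abstract's central idea of dynamic reconfigurability — interactively allocating rectangular regions of a tiled wall to 2D or 3D workspaces — can be illustrated with a small sketch. This is a hypothetical model, not Omegalib's actual API; the names `TiledWall` and `Workspace` and the tile-ownership scheme are illustrative assumptions only.

```python
# Hypothetical sketch of the workspace-allocation idea from the abstract:
# a tiled display wall whose rectangular tile regions can be claimed by
# 2D or 3D applications and released as needs change.

from dataclasses import dataclass

@dataclass(frozen=True)
class Workspace:
    app: str           # owning application
    mode: str          # "2D" or "3D"
    col: int           # leftmost tile column
    row: int           # top tile row
    cols: int          # width in tiles
    rows: int          # height in tiles

class TiledWall:
    """Tracks which tiles of a display wall each application owns."""

    def __init__(self, cols, rows):
        self.cols, self.rows = cols, rows
        self.owner = {}          # (col, row) -> app name
        self.workspaces = []

    def _tiles(self, ws):
        return [(c, r)
                for c in range(ws.col, ws.col + ws.cols)
                for r in range(ws.row, ws.row + ws.rows)]

    def allocate(self, ws):
        """Claim a rectangular block of tiles; refuse if any tile is
        already owned or the block falls outside the wall."""
        if ws.col + ws.cols > self.cols or ws.row + ws.rows > self.rows:
            return False
        tiles = self._tiles(ws)
        if any(t in self.owner for t in tiles):
            return False
        for t in tiles:
            self.owner[t] = ws.app
        self.workspaces.append(ws)
        return True

    def release(self, app):
        """Free every tile owned by an application."""
        self.owner = {t: a for t, a in self.owner.items() if a != app}
        self.workspaces = [w for w in self.workspaces if w.app != app]

# A CAVE2-like 72-tile layout, modeled here as 18 columns x 4 rows:
wall = TiledWall(cols=18, rows=4)
assert wall.allocate(Workspace("flow-sim", "3D", col=0, row=0, cols=10, rows=4))
assert wall.allocate(Workspace("charts", "2D", col=10, row=0, cols=8, rows=4))
# Overlapping requests are rejected:
assert not wall.allocate(Workspace("late", "2D", col=9, row=0, cols=2, rows=4))
# Releasing a workspace makes its tiles available again:
wall.release("charts")
assert wall.allocate(Workspace("late", "2D", col=10, row=0, cols=2, rows=4))
```

The real system additionally handles per-application input routing and redirection of rendered output to a distributed compositing manager, which this sketch omits.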