
International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications

High volume offline image recognition
Tomasz Adamek, Luis Martinell, Miquel Ferrarons, A. Torrents, David Marimón
We show a prototype of an offline image recognition engine, running on a tablet with an Intel® Atom™ processor, that searches through thousands (5,000+) of images in less than 250 ms. The prototype also offers the advanced capability of recognising real-world 3D objects, until now reserved for cloud solutions. Previously, image search within large collections could be performed only in the cloud, requiring mobile devices to have Internet connectivity. For many use cases, however, the connectivity requirement is impractical: many museums have no network coverage, or do not want their visitors to incur expensive roaming charges. Existing commercial offline solutions are very limited in the size of the searched collection, often imposing a maximum of fewer than 100 reference images. Moreover, adding images typically slows recognition and increases RAM requirements.
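The abstract does not say how the engine keeps lookup fast as the collection grows, but a standard way to get sublinear search over thousands of images is an inverted index over quantized local features (bag of visual words). The sketch below is purely illustrative and is not the authors' implementation; the class, method names, and toy feature ids are all hypothetical. A query only touches images that share visual words with it, which is why adding more reference images need not slow every lookup.

```python
from collections import defaultdict

# Illustrative sketch (not the paper's engine): an inverted index over
# quantized local features. Each image is represented by a set of integer
# "visual word" ids; querying votes only on images sharing words with the
# query, so lookup cost scales with matches, not with collection size.

class OfflineImageIndex:
    def __init__(self):
        self.index = defaultdict(set)   # visual word -> set of image ids
        self.word_counts = {}           # image id -> number of indexed words

    def add_image(self, image_id, visual_words):
        """Index one reference image by its quantized feature ids."""
        words = set(visual_words)
        self.word_counts[image_id] = len(words)
        for word in words:
            self.index[word].add(image_id)

    def query(self, visual_words, min_score=0.2):
        """Vote for images sharing visual words; return (id, score) ranked."""
        votes = defaultdict(int)
        for word in set(visual_words):
            for image_id in self.index.get(word, ()):
                votes[image_id] += 1
        scored = [(i, v / self.word_counts[i]) for i, v in votes.items()]
        return sorted((s for s in scored if s[1] >= min_score),
                      key=lambda s: -s[1])

index = OfflineImageIndex()
index.add_image("mona_lisa", [3, 17, 42, 99])
index.add_image("starry_night", [5, 17, 63, 80])
print(index.query([3, 42, 99, 7])[0][0])  # best match: "mona_lisa"
```

A real engine would quantize descriptors such as ORB or SIFT into the word ids and verify top candidates geometrically, but the index structure above is the part that keeps per-query time roughly independent of collection size.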
DOI: 10.1109/ISMAR.2014.6948470 · Published: 2014-11-06
Citations: 0
The collaborative design platform - A protocol for a mixed reality installation for improved incorporation of laypeople in architecture
Tibor Goldschwendt, C. Anthes, G. Schubert, D. Kranzlmüller, F. Petzold
Presentations and discussions between architects and clients during the early stages of design usually involve sketches, paper and models, with digital information in the form of simulations and analyses used to assess variants and underpin arguments. Laypeople, however, are not used to reading plans or models and find it difficult to relate digital representations to the real world. Immersive environments represent an alternative approach but are laborious and costly to produce, particularly in the early design phases where information and ideas are still vague. Our project shows how linking analogue design tools with digital VR representation has given rise to a new interactive presentation platform that bridges the gap between analogue design methods and digital architectural presentation. The prototypical platform creates a direct connection between a physical volumetric model and interactive digital content, using a large-format multi-touch table as a work surface combined with real-time 3D scanning. Coupling the 3D data from the scanned model with the 3D digital environment model makes it possible to compute design-relevant simulations and analyses. These are displayed in real time on the working model to help architects assess and substantiate their design decisions. Combining this with a five-sided projection installation based on the concepts of Carolina Cruz-Neira's CAVE Automatic Virtual Environment (CAVE) offers an entirely new means of presentation and interaction. The design (physical working model), the surroundings (GIS data) and the simulations and analyses are presented stereoscopically in real time in the virtual environment. While the architect can work as usual, the observer is presented with an entirely new mode of viewing. Different ideas and scenarios can be tried out spontaneously, and new ideas can be developed and viewed directly in three dimensions. The client is involved more directly in the process, can contribute their own ideas and changes, and can then see these in user-centred stereoscopic 3D. By varying system parameters, the model can be walked through at life size.
DOI: 10.1109/ISMAR.2014.6948477 · Published: 2014-11-06
Citations: 0
Mobile augmented reality - 3D object selection and reconstruction with an RGBD sensor and scene understanding
Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Erick Méndez, S. Diaz
In this proposal we showcase two 3D reconstruction systems running in real time on a tablet equipped with a depth sensor. We believe the proposed set of demonstrations will engage ISMAR attendees in terms of both tracking technology and user experience. Both demos show state-of-the-art 3D reconstruction technology and give attendees a chance to try our tracking hands-on, with simple and interactive user interfaces.
DOI: 10.1109/ISMAR.2014.6948499 · Published: 2014-11-06
Citations: 0
On-site augmented collaborative architecture visualization
David Schattel, M. Tönnis, G. Klinker, G. Schubert, F. Petzold
The early design phase for a new building is a crucial stage in the architect's design process. It has to be ensured that the building fits into its future environment. The Collaborative Design Platform targets this issue by integrating modern digital means with well-known traditional concepts. Familiar styrofoam blocks are still cut by hand, but are now tracked, placed and visualized in 3D by means of a tabletop platform and a TV screen showing an arbitrary view of the scene. With this demonstration, we go one step further and provide an interactive visualization at the proposed building site, further enhancing collaboration between different audiences. Mobile phones and tablet devices are used to visualize markerless, registered virtual building structures and to show immediately any changes made to the models in the Collaborative Design laboratory. This way, architects can get a direct impression of how a building will integrate with its environment, and residents can get an early impression of future plans.
DOI: 10.1109/ISMAR.2014.6948493 · Published: 2014-11-06
Citations: 11
Measurement of perceptual tolerance for inconsistencies within mixed reality scenes
G. Hough, Ian Williams, C. Athwal
This demonstration is a live example of the experiment presented in [1], namely a method of assessing the visual credibility of a scene in which a real person interacts with a virtual object in real time. Inconsistencies created by an actor's incorrect estimation of the virtual object are measured through a series of videos, each containing a defined visual error and rated for interaction credibility on a scale of 1–5 by conference delegates.
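The ratings collected this way can be aggregated into a mean opinion score per error magnitude, from which a tolerance threshold falls out directly. The sketch below is our own illustration of that analysis, not the authors' code; the error magnitudes, ratings, and the acceptability cutoff of 3.5 are all assumed values.

```python
from statistics import mean

# Illustrative analysis sketch (not the authors' code): compute the mean
# opinion score (MOS) for each visual-error magnitude, then report the
# largest error whose MOS still meets an assumed acceptability threshold.

def tolerance_threshold(ratings_by_error, acceptable=3.5):
    """ratings_by_error maps error magnitude -> list of 1..5 ratings.
    Returns (largest tolerable error or None, MOS per error)."""
    mos = {err: mean(r) for err, r in ratings_by_error.items()}
    tolerable = [err for err, m in sorted(mos.items()) if m >= acceptable]
    return (tolerable[-1] if tolerable else None), mos

# Hypothetical ratings for four error magnitudes (in pixels).
ratings = {0: [5, 5, 4], 10: [4, 4, 5], 25: [3, 4, 3], 50: [2, 1, 2]}
threshold, mos = tolerance_threshold(ratings)
print(threshold)  # 10
```

With these made-up numbers, errors up to 10 px keep a mean rating at or above 3.5, so 10 px would be reported as the perceptual tolerance.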
DOI: 10.1109/ISMAR.2014.6948480 · Published: 2014-11-06
Citations: 2
"It's a Pirate's Life" AR game
D. Molyneaux, Selim Benhimane
DOI: 10.1109/ISMAR.2014.6948489 · Published: 2014-11-06
Citations: 2
QubeAR: Cube style QR code AR interaction
Han Park, TaeGyu Kim, Jun Park
DOI: 10.1109/ISMAR.2014.6948490 · Published: 2014-11-06
Citations: 1
G-SIAR: Gesture-speech interface for augmented reality
Thammathip Piumsomboon, Adrian Clark, M. Billinghurst
DOI: 10.1109/ISMAR.2014.6948491 · Published: 2014-11-06
Citations: 10
Smartwatch-aided handheld augmented reality
Darko Stanimirovic, Daniel Kurz
We demonstrate a novel method for human interaction with real objects in the surroundings, combining Visual Search and Augmented Reality (AR). The method is based on a smartwatch tethered to a smartphone, and is designed to provide a more user-friendly experience than approaches based only on a handheld device such as a smartphone or tablet computer. The smartwatch has a built-in camera, which enables scanning objects without taking the smartphone out of the pocket. An image captured by the watch is sent wirelessly to the phone, which performs Visual Search and then informs the smartwatch whether digital information related to the object is available. As described in our ISMAR 2014 poster [6], we distinguish three cases. If no information is available or object recognition failed, the user is notified accordingly. If digital information is available that can be presented using the smartwatch display and/or audio output, it is presented there. In the third case, the recognized object has digital information that would best be experienced in an Augmented Reality view spatially registered with the object in real time. The smartwatch then informs the user that this option exists and encourages using the smartphone to experience the Augmented Reality view. As a result, the user only needs to take the phone out of the pocket when Augmented Reality content is available and of interest.
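The three-way protocol above can be sketched as a small decision function. This is our own hedged illustration of the described behaviour, not the authors' API; the enum values and function name are invented for clarity.

```python
from enum import Enum

# Hedged sketch of the three-case smartwatch protocol described above.
# All names here are ours, not the authors'.

class Action(Enum):
    NOTIFY_NOT_FOUND = "notify on watch: nothing recognised / no info"
    SHOW_ON_WATCH = "present info via watch display and/or audio"
    SUGGEST_PHONE_AR = "prompt user to take out the phone for the AR view"

def watch_response(recognised, has_info, info_fits_watch):
    """Map a visual-search result to what the smartwatch should do."""
    if not recognised or not has_info:
        return Action.NOTIFY_NOT_FOUND      # case 1: nothing to show
    if info_fits_watch:
        return Action.SHOW_ON_WATCH         # case 2: watch suffices
    return Action.SUGGEST_PHONE_AR          # case 3: spatially registered AR

print(watch_response(True, True, False).name)  # SUGGEST_PHONE_AR
```

The point of the protocol is that the phone stays pocketed in cases 1 and 2, and is only brought out when case 3 applies and the user opts in.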
DOI: 10.1109/ISMAR.2014.6948495 · Published: 2014-11-06
Citations: 8
MRI design review system: A mixed reality interactive design review system for architecture, serious games and engineering using game engines, standard software, a tablet computer and natural interfaces
A. Behmel, W. Höhl, Thomas Kienzl
Experience and control your design using natural interfaces! Most of today's conventional design review systems require special programming skills for preparation and high-capacity hardware and software for demonstration, and interacting with 3D data can be complicated. Today we face five major problem fields in using design review systems: interaction with 3D data, navigation in 3D space, controlling design alternatives, design presentation using less extensive hardware, and content development without special software and programming skills. Developments targeting these issues with different methods have been presented, e.g. by LANCELLE, SETTGAST and FELLNER (2008), who developed DAVE (Definitely Affordable Virtual Environment) at Graz University of Technology; this immersive cage-based system is used today in evaluating the design of the new main railway station in Vienna, Austria. SHIRATUDDIN and THABET (2011) utilized the Torque 3D game engine to develop a Virtual Design Review System. Finally, DUNSTON et al. (2011) designed an Immersive Virtual Reality Mock-Up for Design Review of Hospital Patient Rooms. These and other research works were based on standard 3D game engines, using a conventional cave or power wall for presentation and physical immersion. The edddison MRI Design Review System is an easy-to-use mixed reality interface for design evaluation and presentation. It integrates a wide range of hardware input systems, including a special 3D-printed tangible user interface, desktop computers, tablets and touch screens. On the software side it offers plug-ins for standard 3D software including Autodesk Navisworks and Showcase, Unity3D, Trimble SketchUp, WebGL and others. The edddison MRI Design Review System enables laymen to create their own interactive 3D content: it makes the creation and presentation of interactive 3D applications as simple as preparing a PowerPoint presentation. Without any programming skills you can easily manipulate 3D models within standard software applications, control, change or adapt your design, and interact with 3D models through natural interfaces and standard handheld devices. Navigate in 3D space using only your tablet computer; complex buildings can be experienced by means of 2D floor plans and a touchscreen. System requirements are reduced by using standard software applications such as SketchUp or Unity3D. The edddison MRI Design Review System also makes it easy to present different design stages, without extensive hardware and software, on all common mobile platforms. Current application areas are Architectural Design, Digital Prototyping, Industrial Simulation, Serious Games and Product Presentation. The system currently has two major use cases: one setup shows the WebGL demo running on an iPad or an Android tablet computer, where a WebGL/HTML5 cloud solution lets the MRI Design Review System reach the masses. The second demo is a SketchUp file controlled by optical tracking and a 3D-printed tangible object, also usable via a touchscreen or handheld device. In this way the edddison MRI Design Review System extends the scope of existing design review systems with easy-to-use hardware and software, simplifying the whole design process through an evolutionary, iterative approach combined with a set of user-friendly, intuitive interfaces. Demo video: www.youtube.com/watch?v=CyC_TYdRSvo
DOI: 10.1109/ISMAR.2014.6948472 · Published: 2014-11-06
Citations: 2