International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications
High volume offline image recognition
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948470
Tomasz Adamek, Luis Martinell, Miquel Ferrarons, A. Torrents, David Marimón
We show a prototype of an offline image recognition engine, running on a tablet with an Intel® Atom™ processor, that searches through thousands (5000+) of images in less than 250 ms. The prototype also offers the advanced capability of recognising real-world 3D objects, until now reserved for cloud solutions. Previously, image search within large collections could be performed only in the cloud, requiring mobile devices to have Internet connectivity. For many use cases, however, this connectivity requirement is impractical: many museums have no network coverage, for instance, or do not want their visitors to incur expensive roaming charges. Existing commercial offline solutions are very limited in terms of searchable collection size, often imposing a maximum of <100 reference images. Moreover, adding images typically degrades recognition speed and increases RAM requirements.
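The abstract gives no implementation details. As a rough, hypothetical sketch of how an on-device search over thousands of reference images can stay fast (local binary descriptors pooled into one approximate-nearest-neighbour index, with per-image vote counting), here is a minimal example using OpenCV's ORB features and a FLANN-LSH index; every parameter and threshold below is our assumption, not the authors' engine.

```python
# Hypothetical offline image-search sketch (not the authors' implementation):
# ORB descriptors from all reference images go into one FLANN-LSH index;
# a query votes for the reference image that owns the most matched rows.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)

def build_index(reference_paths):
    descs, owners = [], []
    for img_id, path in enumerate(reference_paths):
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, d = orb.detectAndCompute(img, None)
        if d is not None:
            descs.append(d)
            owners.extend([img_id] * len(d))
    index_params = dict(algorithm=6,  # FLANN_INDEX_LSH, for binary descriptors
                        table_number=6, key_size=12, multi_probe_level=1)
    matcher = cv2.FlannBasedMatcher(index_params, dict(checks=32))
    matcher.add([np.vstack(descs)])   # one stacked train array
    matcher.train()
    return matcher, np.array(owners)

def query(matcher, owners, query_img, min_votes=10):
    _, d = orb.detectAndCompute(query_img, None)
    if d is None:
        return None
    votes = np.zeros(owners.max() + 1, dtype=int)
    for m in matcher.match(d):
        votes[owners[m.trainIdx]] += 1  # trainIdx = row in the stacked array
    best = int(votes.argmax())
    return best if votes[best] >= min_votes else None  # None: no match
```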
{"title":"High volume offline image recognition","authors":"Tomasz Adamek, Luis Martinell, Miquel Ferrarons, A. Torrents, David Marimón","doi":"10.1109/ISMAR.2014.6948470","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948470","url":null,"abstract":"We show a prototype of an offline image recognition engine, running on a tablet with Intel®AtomTM processor, searching within less than 250ms through thousands (5000+) of images. Moreover, the prototype still offers the advanced capabilities of recognising real world 3D objects, until now reserved only for cloud solutions. Until now image search within large collections of images could be performed only in the cloud, requiring mobile devices to have Internet connectivity. However, for many use cases the connectivity requirement is impractical, e.g. many museums have no network coverage, or do not want their visitors incurring expensive roaming charges. Existing commercial solutions are very limited in terms of searched collections sizes, often imposing a maximum limit of <100 reference images. Moreover, adding images typically affects the recognition speed and increases RAM requirements.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79470750","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The collaborative design platform - A protocol for a mixed reality installation for improved incorporation of laypeople in architecture
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948477
Tibor Goldschwendt, C. Anthes, G. Schubert, D. Kranzlmüller, F. Petzold
Presentations and discussions between architects and clients during the early stages of design usually involve sketches, paper and models, with digital information in the form of simulations and analyses used to assess variants and underpin arguments. Laypeople, however, are not used to reading plans or models and find it difficult to relate digital representations to the real world. Immersive environments offer an alternative approach but are laborious and costly to produce, particularly in the early design phases, when information and ideas are still vague. Our project shows how linking analogue design tools with digital VR representation gives rise to a new interactive presentation platform that bridges the gap between analogue design methods and digital architectural presentation. The prototype platform creates a direct connection between a physical volumetric model and interactive digital content, using a large-format multi-touch table as a work surface combined with real-time 3D scanning. Coupling the 3D data from the scanned model with the 3D digital environment model makes it possible to compute design-relevant simulations and analyses. These are displayed in real time on the working model to help architects assess and substantiate their design decisions. Combining this with a five-sided projection installation based on the concepts of Carolina Cruz-Neira's CAVE Automatic Virtual Environment (CAVE) offers an entirely new means of presentation and interaction. The design (physical working model), the surroundings (GIS data) and the simulations and analyses are presented stereoscopically in real time in the virtual environment. While the architect can work as usual, the observer is presented with an entirely new mode of viewing. Different ideas and scenarios can be tried out spontaneously, and new ideas can be developed and viewed directly in three dimensions. The client is involved more directly in the process, can contribute their own ideas and changes, and can then see these in user-centred stereoscopic 3D. By varying system parameters, the model can be walked through at life size.
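The abstract does not say which simulations the platform computes. As a purely illustrative stand-in for one such "design-relevant analysis", the sketch below shades the scanned model's faces by sun incidence after placing it in the site frame; the transform and function names are hypothetical assumptions, not the platform's actual pipeline.

```python
# Toy "design-relevant analysis" (our assumption, not the platform's code):
# Lambert-style sun incidence on the scanned model's face normals after
# moving them into the GIS site frame via the table calibration transform.
import numpy as np

def sun_incidence(face_normals, T_site_from_scan, sun_dir):
    n_site = face_normals @ T_site_from_scan[:3, :3].T  # rotate normals only
    sun = sun_dir / np.linalg.norm(sun_dir)
    # 0.0 for faces turned away from the sun; larger = more direct exposure.
    return np.clip(n_site @ sun, 0.0, None)
```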
{"title":"The collaborative design platform - A protocol for a mixed reality installation for improved incorporation of laypeople in architecture","authors":"Tibor Goldschwendt, C. Anthes, G. Schubert, D. Kranzlmüller, F. Petzold","doi":"10.1109/ISMAR.2014.6948477","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948477","url":null,"abstract":"Presentations and discussions between architects and clients during the early stages of design usually involve sketches, paper and models, with digital information in the form of simulations and analyses used to assess variants and underpin arguments. Laypeople, however, are not used to reading plans or models and find it difficult to relate digital representations to the real world. Immersive environments represent an alternative approach but are laborious and costly to produce, particularly in the early design phases where information and ideas are still vague. Our project shows, how linking analogue design tools and digital VR representation has given rise to a new interactive presentation platform that bridges the gap between analogue design methods and digital architectural presentation. The prototypical platform creates a direct connection between a physical volumetric model and interactive digital content using a large-format multi-touch table as a work surface combined with real-time 3D scanning. Coupling the 3D data from the scanned model with the 3D digital environment model makes it possible to compute design relevant simulations and analyses. These are displayed in real-time on the working model to help architects assess and substantiate their design decisions. Combining this with a 5sided projection installation based on the concepts Carolina Cruz Neiras CAVE Automatic Virtual Environment (CAVE)1 offers an entirely new means of presentation and interaction. The design (physical working model), the surroundings (GIS data) and the simulations and analyses are presented stereoscopically in real-time in the virtual environment. While the architect can work as usual, the observer is presented with an entirely new mode of viewing. Different ideas and scenarios can be tried out spontaneously and new ideas can be developed and viewed directly in three dimensions. The client is involved more directly in the process and can contribute own ideas and changes, and then see these in user-centred stereoscopic 3D. By varying system parameters, the model can be walked through at life size.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87194689","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mobile augmented reality - 3D object selection and reconstruction with an RGBD sensor and scene understanding
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948499
Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Erick Méndez, S. Diaz
In this proposal we showcase two 3D reconstruction systems running in real time on a tablet equipped with a depth sensor. We believe that the proposed set of demonstrations will engage ISMAR attendees both in terms of tracking technology and user experience. Both demos show state-of-the-art 3D reconstruction technology and give attendees a chance to try our tracking hands-on through simple and interactive user interfaces.
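The abstract names no algorithm. As a hedged sketch of how depth-sensor reconstruction systems of this general kind commonly integrate depth frames (a KinectFusion-style truncated signed distance function volume), here is a minimal NumPy version; the grid layout, truncation distance and pose convention are illustrative assumptions, not the authors' system.

```python
# Minimal TSDF depth-fusion sketch (illustrative, not the demoed system).
# Each depth frame updates a voxel grid with a truncated signed distance
# and a running integration weight.
import numpy as np

def integrate(tsdf, weight, depth, K, cam_pose, voxel_size=0.01, trunc=0.05):
    nx, ny, nz = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    # Voxel centers in world coordinates (grid origin assumed at 0,0,0).
    pts = np.stack([ii, jj, kk], -1).reshape(-1, 3) * voxel_size
    # cam_pose: 4x4 camera-to-world; invert to map voxels into the camera.
    w2c = np.linalg.inv(cam_pose)
    cam = pts @ w2c[:3, :3].T + w2c[:3, 3]
    z = cam[:, 2]
    u = np.round(K[0, 0] * cam[:, 0] / z + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[:, 1] / z + K[1, 2]).astype(int)
    h, w = depth.shape
    ok = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[ok] = depth[v[ok], u[ok]]
    sdf = d - z                      # signed distance along the viewing ray
    ok &= (d > 0) & (sdf > -trunc)   # keep only voxels near the surface
    tsdf_new = np.clip(sdf / trunc, -1.0, 1.0)
    t, wgt = tsdf.reshape(-1), weight.reshape(-1)   # in-place views
    t[ok] = (t[ok] * wgt[ok] + tsdf_new[ok]) / (wgt[ok] + 1)
    wgt[ok] += 1
```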
{"title":"Mobile augmented reality - 3D object selection and reconstruction with an RGBD sensor and scene understanding","authors":"Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Erick Méndez, S. Diaz","doi":"10.1109/ISMAR.2014.6948499","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948499","url":null,"abstract":"In this proposal we show case two 3D reconstruction systems running in real-time on a tablet equipped with a depth sensor. We believe that the proposed set of demonstrations will engage ISMAR attendees both in terms of tracking technology and user experience. Both demos show state-of-the art 3D reconstruction technology and give attendees a chance to try hands-on our tracking with simple and interactive user interfaces.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89614884","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
On-site augmented collaborative architecture visualization
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948493
David Schattel, M. Tönnis, G. Klinker, G. Schubert, F. Petzold
The early design phase for a new building is a crucial stage in an architect's design process: it has to be ensured that the building fits into its future environment. The Collaborative Design Platform targets this issue by integrating modern digital means with well-known traditional concepts. Familiar styrofoam blocks are still cut by hand, but are now tracked, placed and visualized in 3D by means of a tabletop platform and a TV screen showing an arbitrary view of the scenery. With this demonstration we go one step further and provide an interactive visualization at the proposed building site, further enhancing collaboration between different audiences. Mobile phones and tablet devices are used to visualize marker-lessly registered virtual building structures and to immediately show changes made to the models in the Collaborative Design laboratory. This way, architects can get a direct impression of how a building will integrate into its environment, and residents can get an early impression of future plans.
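The registration math is not spelled out in the abstract. For illustration only, here is the projection step that any such marker-less overlay ultimately needs: virtual building vertices in world coordinates projected into the phone image, given an estimated camera pose and intrinsics. Names and conventions are hypothetical.

```python
# Pinhole projection sketch for an AR overlay (assumed convention:
# pose_w2c is a 4x4 world-to-camera transform, K a 3x3 intrinsics matrix).
import numpy as np

def project_points(X_world, pose_w2c, K):
    """Project Nx3 world points to Nx2 pixel coordinates."""
    Xc = X_world @ pose_w2c[:3, :3].T + pose_w2c[:3, 3]  # into camera frame
    uvw = Xc @ K.T
    return uvw[:, :2] / uvw[:, 2:3]                      # perspective divide
```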
{"title":"On-site augmented collaborative architecture visualization","authors":"David Schattel, M. Tönnis, G. Klinker, G. Schubert, F. Petzold","doi":"10.1109/ISMAR.2014.6948493","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948493","url":null,"abstract":"The early design phase for a new building is a crucial stage in the design process of architects. It has to be ensured that the building fits into the future environment. The Collaborative Design Platform targets this issue by integrating modern digital means with well known traditional concepts. Well-used styrofoam blocks are still cut by hand but are now tracked, placed and visualized in 3D by use of a tabletop platform and a TV screen showing an arbitrary view of the scenery. With this demonstration, we get one step further and provide an interactive visualization at the proposed building site, further enhancing collaboration between different audiences. Mobile phones and tablet devices are used to visualize marker-less registered virtual building structures and immediately show changes made to the models in the Collaborative Design laboratory. This way, architects can get a direct impression about how a building will integrate within the environment and residents can get an early impression about future plans.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77443807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Measurement of perceptual tolerance for inconsistencies within mixed reality scenes
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948480
G. Hough, Ian Williams, C. Athwal
This demonstration is a live example of the experiment presented in [1], namely a method of assessing the visual credibility of a scene in which a real person interacts with a virtual object in real time. Inconsistencies created by the actor's incorrect estimation of the virtual object's position are measured through a series of videos, each containing a defined visual error and rated for interaction credibility on a scale of 1–5 by conference delegates.
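As a small illustration of how such 1–5 credibility ratings might be summarised per defined visual error (a mean-opinion-score style aggregation; this analysis is our assumption, not the method of [1]):

```python
# Aggregate delegate ratings per visual-error condition (illustrative only).
import statistics
from collections import defaultdict

def mean_scores(ratings):
    """ratings: iterable of (error_condition, rating_1_to_5) pairs.
    Returns {error_condition: (mean, stdev)} sorted by condition."""
    by_error = defaultdict(list)
    for err, score in ratings:
        by_error[err].append(score)
    return {err: (statistics.mean(s),
                  statistics.stdev(s) if len(s) > 1 else 0.0)
            for err, s in sorted(by_error.items())}
```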
{"title":"Measurement of perceptual tolerance for inconsistencies within mixed reality scenes","authors":"G. Hough, Ian Williams, C. Athwal","doi":"10.1109/ISMAR.2014.6948480","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948480","url":null,"abstract":"This demonstration is a live example of the experiment presented in [1], namely a method of assessing the visual credibility of a scene where a real person interacts with a virtual object in realtime. Inconsistencies created by actor's incorrect estimation of the virtual object are measured through a series of videos, each containing a defined visual error and rated against interaction credibility on a scale of 1–5 by conference delegates.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75549638","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"It's a Pirate's Life" AR game
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948489
D. Molyneaux, Selim Benhimane
{"title":"\"It's a Pirate's Life\" AR game","authors":"D. Molyneaux, Selim Benhimane","doi":"10.1109/ISMAR.2014.6948489","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948489","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72950070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
QubeAR: Cube style QR code AR interaction
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948490
Han Park, TaeGyu Kim, Jun Park
{"title":"QubeAR: Cube style QR code AR interaction","authors":"Han Park, TaeGyu Kim, Jun Park","doi":"10.1109/ISMAR.2014.6948490","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948490","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91353200","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
G-SIAR: Gesture-speech interface for augmented reality
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948491
Thammathip Piumsomboon, Adrian Clark, M. Billinghurst
{"title":"G-SIAR: Gesture-speech interface for augmented reality","authors":"Thammathip Piumsomboon, Adrian Clark, M. Billinghurst","doi":"10.1109/ISMAR.2014.6948491","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948491","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84116351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Smartwatch-aided handheld augmented reality
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948495
Darko Stanimirovic, Daniel Kurz
We demonstrate a novel method for interaction of humans with real objects in their surroundings, combining Visual Search and Augmented Reality (AR). The method is based on a smartwatch tethered to a smartphone, and it is designed to provide a more user-friendly experience than approaches based only on a handheld device, such as a smartphone or tablet computer. The smartwatch has a built-in camera, which enables scanning objects without the need to take the smartphone out of the pocket. An image captured by the watch is sent wirelessly to the phone, which performs Visual Search and subsequently informs the smartwatch whether digital information related to the object is available. As described in our ISMAR 2014 poster [6], we distinguish between three cases. If no information is available or object recognition failed, the user is notified accordingly. If digital information is available that can be presented using the smartwatch display and/or audio output, it is presented there. In the third case, the recognized object has related digital information that would be best experienced in an Augmented Reality view spatially registered with the object in real time; the smartwatch then informs the user that this option exists and encourages using the smartphone for the Augmented Reality view. Thereby, the user only needs to take the phone out of the pocket when Augmented Reality content is available and of interest.
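The three-case logic reads naturally as a small dispatch on the phone's visual-search result. A schematic sketch follows; the types and the watch interface (notify, show, suggest_ar) are hypothetical, not the authors' API.

```python
# Schematic three-case dispatch for the watch/phone protocol (hypothetical).
from dataclasses import dataclass
from enum import Enum, auto

class Outcome(Enum):
    NOT_RECOGNIZED = auto()   # case 1: nothing found / recognition failed
    WATCH_CONTENT = auto()    # case 2: fits on the watch display or audio
    AR_CONTENT = auto()       # case 3: needs a spatially registered AR view

@dataclass
class SearchResult:
    outcome: Outcome
    payload: object = None

def dispatch_to_watch(result: SearchResult, watch):
    if result.outcome is Outcome.NOT_RECOGNIZED:
        watch.notify("No information available for this object.")
    elif result.outcome is Outcome.WATCH_CONTENT:
        watch.show(result.payload)   # present on the watch itself
    else:
        # Encourage taking out the phone only when AR content exists.
        watch.suggest_ar("AR view available: open your phone.")
```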
{"title":"Smartwatch-aided handheld augmented reality","authors":"Darko Stanimirovic, Daniel Kurz","doi":"10.1109/ISMAR.2014.6948495","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948495","url":null,"abstract":"We demonstrate a novel method for interaction of humans with real objects in their surrounding combining Visual Search and Augmented Reality (AR). This method is based on utilizing a smartwatch tethered to a smartphone, and it is designed to provide a more user-friendly experience compared to approaches based only on a handheld device, such as a smartphone or a tablet computer. The smartwatch has a built-in camera, which enables scanning objects without the need to take the smartphone out of the pocket. An image captured by the watch is sent wirelessly to the phone that performs Visual Search and subsequently informs the smartwatch whether digital information related to the object is available or not. As described in our ISMAR 2014 poster [6], we distinguish between three cases. If no information is available or the object recognition failed, the user is notified accordingly. If there is digital information available that can be presented using the smartwatch display and/or audio output, it is presented there. The third case is that the recognized object has digital information related to it, which would be beneficial to see in an Augmented Reality view spatially registered with the object in real-time. Then the smartwatch informs the user that this option exists and encourages using the smartphone to experience the Augmented Reality view. Thereby, the user only needs to take the phone out of the pocket in case Augmented Reality content is available, and when the content is of interest for the user.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73841074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
MRI design review system: A mixed reality interactive design review system for architecture, serious games and engineering using game engines, standard software, a tablet computer and natural interfaces
Pub Date: 2014-11-06 | DOI: 10.1109/ISMAR.2014.6948472
A. Behmel, W. Höhl, Thomas Kienzl
Experience and control your design using natural interfaces! Most of today's conventional design review systems require special programming skills for preparation and high-capacity hardware and software for demonstration, and interacting with 3D data can be complicated. Today we face five major problem fields in design review systems: interaction with 3D data, navigation in 3D space, controlling design alternatives, design presentation with less extensive hardware, and content development without special software and programming skills. Developments targeting these issues with different methods have been presented, e.g., by Lancelle, Settgast and Fellner (2008), who developed DAVE (Definitely Affordable Virtual Environment) at Graz University of Technology; this immersive cage-based system is used today in evaluating the design of the new main railway station in Vienna, Austria. Shiratuddin and Thabet (2011) utilized the Torque 3D game engine to develop a Virtual Design Review System. Finally, Dunston et al. (2011) designed an immersive virtual reality mock-up for design review of hospital patient rooms. These and other research works were based on standard 3D game engines, using a conventional CAVE or power wall for presentation and physical immersion. The edddison MRI Design Review System is an easy-to-use mixed reality interface for design evaluation and presentation. It integrates a wide range of hardware input systems, including a special 3D-printed tangible user interface, desktop computers, tablets and touch screens. On the software side it offers plug-ins for standard 3D software including Autodesk Navisworks and Showcase, Unity3D, Trimble SketchUp, WebGL and others. The edddison MRI Design Review System enables laymen to create their own interactive 3D content. It is a solution that makes creating and presenting interactive 3D applications as simple as preparing a PowerPoint presentation. Without any programming skills, you can easily manipulate 3D models within standard software applications; control, change or adapt your design and interact with 3D models via natural interfaces and standard handheld devices. Navigate in 3D space using only your tablet computer. Complex buildings can be experienced by means of 2D floor plans and a touchscreen. System requirements are reduced by using standard software applications such as SketchUp or Unity3D. The edddison MRI Design Review System also makes it easy to present different design stages without extensive hardware and software on all common mobile platforms. Current application areas are architectural design, digital prototyping, industrial simulation, serious games and product presentation. The system currently has two major use cases: one setup shows the WebGL demo running on an iPad or Android tablet computer; using a WebGL/HTML5 cloud solution, the MRI Design Review System is able to reach the masses. The second demo is a SketchUp file controlled by optical tracking a…
{"title":"MRI design review system: A mixed reality interactive design review system for architecture, serious games and engineering using game engines, standard software, a tablet computer and natural interfaces","authors":"A. Behmel, W. Höhl, Thomas Kienzl","doi":"10.1109/ISMAR.2014.6948472","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948472","url":null,"abstract":"Experience and control your design using natural interfaces! Most of todays conventional design review systems require special programming skills for preparation and high-capacity hardand software for demonstration. Interacting with 3D data sometimes can be complicated. Today we face five major problem fields using design review systems: Interaction with 3D data, navigation in 3D space, controlling design alternatives, design presentation using less extensive hardware, content development without special software and programming skills. Developments also targeting these issues by using different methods are presented e.g. by LANCELLE, SETTGAST and FELLNER (2008). They developed DAVE – Definitely Affordable Virtual Environment at Graz University of Technology. This immersive cage-based system today is used in evaluating the design of the new main railway station in Vienna, Austria. Also SHIRATUDDIN and THABET (2011) utilized the Torque 3D game engine to develop a Virtual Design Review System. Finally DUNSTON et. al. (2011) designed an Immersive Virtual Reality Mock-Up for Design Review of Hospital Patient Rooms. These and other research work was based on standard 3D game engines by using a conventional cave or power wall for presentation and physical immersion. The edddison MRI Design Review System is an easy to use mixed reality interface for design evaluation and presentation. It integrates a wide range of hardware input systems including a special 3D-printed tangible user interface, desktop computers, tablets and touch screens. On the software side it offers plug-ins for standard 3D software including Autodesk Navisworks and Showcase, Unity3D, Trimbles, SketchUp, Web GL and others. The edddison MRI Design Review System enables laymen to create their own interactive 3D content. It is a solution which makes the creation and presentation of interactive 3D applications as simple as preparing a powerpoint presentation. Without any programming skills you can easily manipulate 3D models within standard software applications. Control, change or adapt your design easily and interact with 3D models by natural interfaces and standard handheld devices. Navigate in 3D space using only your tablet computer. Complex buildings can be experienced by means of 2D floor plans and a touchscreen. System requirements are reduced by using standard software applications such as SketchUp or Unity3D. The edddison MRI Design Review System also makes it easy to present different design stages without extensive hardand software on all common mobile platforms. Actual application areas are Architectural Design, Digital Prototyping, Industrial Simulation, Serious Games and Product Presentation. Currently, the system has two major usecases: one setup will show the WebGL demo running on an iPad or an Android tablet computer. Using a WebGL/HTML5 cloud solution MRI Design Review System is able to reach the masses. The second demo is a SketchUp file controlled by optical tracking a","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. 
IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73012361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}