International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications
Exploring multimodal interaction techniques for a mixed reality digital surface
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948476
Martin Fischbach, C. Zimmerer, Anke Giebler-Schubert, Marc Erich Latoschik
Quest – XRoads is a multimodal, multimedia mixed reality version of the traditional role-playing tabletop game Quest: Zeit der Helden. The original game concept is augmented with virtual content, controllable via auditory, tangible, and spatial interfaces, to enable a novel gaming experience and increase player satisfaction. The demonstration consists of a turn-based skirmish in which up to four players have to collaborate to defeat an opposing player. To be victorious, players control heroes or villains and use their abilities via speech, gestures, touch, and tangible interactions.
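As a minimal sketch (not the authors' implementation), the following shows how commands arriving from different modalities might be routed to the same set of turn-based game actions; the event fields and handler names are assumptions for illustration.

```python
# Hypothetical sketch of routing multimodal inputs (speech, touch, tangible
# tokens, gestures) to turn-based game actions. Names and structure are
# assumptions, not the system described in the paper.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class InputEvent:
    modality: str   # "speech", "touch", "tangible", or "gesture"
    command: str    # e.g. "attack", "move"
    payload: dict   # modality-specific data (target tile, token id, ...)


class CommandRouter:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[InputEvent], None]] = {}

    def register(self, command: str, handler: Callable[[InputEvent], None]) -> None:
        self._handlers[command] = handler

    def dispatch(self, event: InputEvent) -> None:
        handler = self._handlers.get(event.command)
        if handler is None:
            print(f"unhandled {event.modality} command: {event.command}")
            return
        handler(event)


router = CommandRouter()
router.register("attack", lambda e: print(f"hero attacks tile {e.payload['tile']}"))
router.register("move", lambda e: print(f"hero moves to tile {e.payload['tile']}"))

# The same game action can be triggered from different modalities.
router.dispatch(InputEvent("speech", "attack", {"tile": (3, 4)}))
router.dispatch(InputEvent("touch", "move", {"tile": (2, 5)}))
```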
{"title":"Exploring multimodal interaction techniques for a mixed reality digital surface","authors":"Martin Fischbach, C. Zimmerer, Anke Giebler-Schubert, Marc Erich Latoschik","doi":"10.1109/ISMAR.2014.6948476","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948476","url":null,"abstract":"Quest – XRoads is a multimodal and multimedia mixed reality version of the traditional role-play tabletop game Quest: Zeit der Helden. The original game concept is augmented with virtual content, controllable via auditory, tangible and spatial interfaces to permit a novel gaming experience and to increase the satisfaction while playing. The demonstration consists of a turn-based skirmish, where up to four players have to collaborate to defeat an opposing player. In order to be victorious, players have to control heroes or villains and use their abilities via speech, gesture, touch as well as tangible interactions.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83755011","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Integrating highly dynamic RESTful linked data APIs in a virtual reality environment
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948482
Felix Leif Keppmann, Tobias Käfer, S. Stadtmüller, R. Schubotz, A. Harth
We demonstrate a Virtual Reality information system that shows the applicability of REST in highly dynamic environments as well as the advantages of Linked Data for on-the-fly data integration. We integrate a motion-detection sensor application to remotely control an avatar in the Virtual Reality environment, where information about the user is integrated and visualised. Moreover, the user can interact with the visualised information.
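A minimal sketch of how such a RESTful Linked Data sensor resource could be polled and mapped onto an avatar pose; the URL, vocabulary, and property names below are assumptions for illustration, not the authors' API.

```python
# Hypothetical sketch: poll a RESTful Linked Data resource describing a
# tracked user and map it onto an avatar pose. URL and vocabulary are
# placeholders, not the system described in the paper.
import requests
from rdflib import Graph, Namespace

SENSOR_URL = "http://example.org/sensor/skeleton"   # hypothetical resource
EX = Namespace("http://example.org/vocab#")          # hypothetical vocabulary


def fetch_head_position():
    # REST: a plain HTTP GET with content negotiation for an RDF serialization.
    resp = requests.get(SENSOR_URL, headers={"Accept": "text/turtle"}, timeout=1.0)
    resp.raise_for_status()
    g = Graph()
    g.parse(data=resp.text, format="turtle")
    # Linked Data: look up the property values of interest by URI.
    x = float(g.value(EX.user1, EX.headX))
    y = float(g.value(EX.user1, EX.headY))
    z = float(g.value(EX.user1, EX.headZ))
    return (x, y, z)


def update_avatar(avatar, position):
    avatar["head"] = position   # stand-in for a real scene-graph update


avatar = {}
# In the running system this would be polled at interactive rates:
# update_avatar(avatar, fetch_head_position())
```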
{"title":"Integrating highly dynamic RESTful linked data APIs in a virtual reality environment","authors":"Felix Leif Keppmann, Tobias Käfer, S. Stadtmüller, R. Schubotz, A. Harth","doi":"10.1109/ISMAR.2014.6948482","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948482","url":null,"abstract":"We demonstrate a Virtual Reality information system that shows the applicability of REST in highly dynamic environments as well as the advantages of Linked Data for on-the-fly data integration. We integrate a motion detection sensor application to remote control an avatar in the Virtual Reality. In the Virtual Reality, information about the user is integrated and visualised. Moreover, the user can interact with the visualised information.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79006842","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
On the use of augmented reality techniques in a telerehabilitation environment for wheelchair users' training
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948473
Daniel Caetano, F. Mattioli, E. Lamounier, Alexandre Cardoso
This work's purpose is to investigate the use of Augmented Reality techniques in telerehabilitation, applied to the training of wheelchair users. In this scenario, using a computer with unconventional devices, the user is connected to a remote training space and can issue commands in order to carry out training exercises. The telerehabilitation environment should reproduce the main challenges faced by wheelchair users in their daily activities.
{"title":"On the use of augmented reality techniques in a telerehabilitation environment for wheelchair users' training","authors":"Daniel Caetano, F. Mattioli, E. Lamounier, Alexandre Cardoso","doi":"10.1109/ISMAR.2014.6948473","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948473","url":null,"abstract":"This work’s purpose is to investigate the use of Augmented Reality techniques on telerehabilitation, applied to wheelchair users training. In this scenario, using a computer with unconventional devices, the user will be connected to a remote training space and will be able to issue commands, in order to accomplish the execution of training exercises. The telerehabilitation environment should reproduce the main challenges faced by wheelchair users in their daily activities.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89130433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
INDICA: Interaction-free display calibration for optical see-through head-mounted displays based on 3D eye localization
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948481
Yuta Itoh, G. Klinker
Correct spatial registration of Optical See-Through Head-Mounted Displays (OST-HMDs) w.r.t. a user's eye(s) is an essential problem for any AR application using such HMDs (Fig. 1). Maintaining correct registration demands frequent (re)calibration by end users whenever they move the HMD on their head. Thus, a calibration technique should be simple and accurate for the universal, long-term use of the displays. This demonstration showcases INDICA, an automatic OST-HMD calibration approach presented in our previous work [1] and our ISMAR 2014 paper [2]. The method calibrates the display to the user's current eyeball position by combining online eye-position tracking with offline parameters. Visitors of our demonstration can try both our manual calibration and our interaction-free calibration on a customized OST-HMD.
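To make the "online eye position plus offline display parameters" idea concrete, here is a deliberately simplified sketch of an eye-position-dependent projection: the virtual screen pose and size are fixed offline parameters, the eye position comes from online tracking, and a 3D point is mapped to screen pixels by intersecting the eye-to-point ray with the screen plane. This illustrates the general principle only; it is not the INDICA formulation, and all numeric values are assumed.

```python
# Simplified, hypothetical eye-position-dependent projection for an OST-HMD.
import numpy as np

# "Offline" parameters (assumed values): virtual screen plane at z = SCREEN_Z
# in eye-tracker coordinates, with a given physical size and pixel resolution.
SCREEN_Z = 0.05                   # metres in front of the tracker origin
SCREEN_W, SCREEN_H = 0.04, 0.03   # physical screen size (m)
RES_W, RES_H = 1280, 720          # pixel resolution


def project(point, eye):
    """Project a 3D point (tracker coords) to screen pixels for a given eye position."""
    point, eye = np.asarray(point, float), np.asarray(eye, float)
    ray = point - eye
    t = (SCREEN_Z - eye[2]) / ray[2]   # ray-plane intersection parameter
    hit = eye + t * ray                # intersection on the screen plane
    # Convert metric screen coordinates (centred) to pixel coordinates.
    u = (hit[0] / SCREEN_W + 0.5) * RES_W
    v = (0.5 - hit[1] / SCREEN_H) * RES_H
    return u, v


# The same world point lands on different pixels when the eye moves, which is
# why the calibration must track the eye position online instead of assuming
# a fixed eye pose.
print(project([0.0, 0.0, 1.0], eye=[0.000, 0.0, 0.0]))
print(project([0.0, 0.0, 1.0], eye=[0.005, 0.0, 0.0]))
```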
{"title":"INDICA : Interaction-free display calibration for optical see-through head-mounted displays based on 3D eye localization","authors":"Yuta Itoh, G. Klinker","doi":"10.1109/ISMAR.2014.6948481","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948481","url":null,"abstract":"A correct spatial registration of Optical See-Through Head-Mounted Displays (OST-HMD) w.r.t. a user's eye(s) is an essential problem for any AR application using the such HMDs (Fig. 1). Maintaining the correct registration demands frequent (re)calibrations for the end-users whenever they move the HMD on their head. Thus, a calibration technique should be simple and accurate for the universal, long-run use of the displays. This demonstration showcases INDICA, an automatic OST-HMD calibration approach presented in our previous work[1] and ISMAR 2014 paper [2]. The method calibrates the display to the user's current eyeball position by combining online eye-position tracking with offline parameters. Visitors of our demonstration can try our both manual calibration and our interaction-free calibration on a customized OST-HMD.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83032950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A complete interior design solution with diminished reality
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948494
S. Siltanen, Henrikki Saraspaa, Jari T. Karvonen
{"title":"A complete interior design solution with diminished reality","authors":"S. Siltanen, Henrikki Saraspaa, Jari T. Karvonen","doi":"10.1109/ISMAR.2014.6948494","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948494","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73930170","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tablet system for visual overlay of 3D virtual objects onto a real environment
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948502
Hiroyuki Yoshida, T. Okamoto, H. Saito
We propose a novel system for the visual overlay of 3D virtual objects onto a real environment observed by a camera-equipped tablet PC. The system allows us to visually simulate the layout of virtual 3D objects, such as furniture, in the real environment captured by the tablet. To estimate the pose and position of the tablet PC within the 3D structure of the target environment, we propose and implement two procedures: one using the captured image and one using the tablet's motion sensor. Both are presented in the demonstration.
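As a minimal sketch of the image-based half of such a pose estimate, the following recovers a camera pose from 2D detections of known 3D scene points with PnP; the correspondences and camera intrinsics are placeholders, not the authors' data or method.

```python
# Hypothetical sketch of image-based tablet pose estimation via PnP.
import numpy as np
import cv2

# Known 3D points in the scene model (metres) and their detected pixel positions.
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.0, 0.0],
                          [0.5, 0.4, 0.0],
                          [0.0, 0.4, 0.0]], dtype=np.float64)
image_points = np.array([[320, 240],
                         [520, 245],
                         [515, 400],
                         [325, 395]], dtype=np.float64)

# Placeholder intrinsics for the tablet camera.
K = np.array([[800, 0, 320],
              [0, 800, 240],
              [0, 0, 1]], dtype=np.float64)
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix of the camera pose
    print("camera translation:", tvec.ravel())

# A second procedure could instead take orientation from the tablet's motion
# sensor and estimate only the remaining degrees of freedom visually, which is
# the kind of comparison the demonstration presents.
```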
{"title":"Tablet system for visual, overlay of 3D virtual object onto real environment","authors":"Hiroyuki Yoshida, T. Okamoto, H. Saito","doi":"10.1109/ISMAR.2014.6948502","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948502","url":null,"abstract":"We propose a novel system for visual overlay of 3D virtual object onto real environment observed by tablet PC with camera. This system allows us to visually simulate the layout of virtual 3D objects such as furniture in the real environment captured by the tablet PC. For estimating the pose and position of the tablet PC in the 3D structure of the target environment, we propose and implement the 2 procedures using the captured image and using the motion sensor in tablet PC. Those performances are presented in the demonstration.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89366216","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
User friendly calibration and tracking for optical stereo see-through augmented reality
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948501
Folker Wientapper, T. Engelke, J. Keil, H. Wuest, J. Mensik
Optical see-through head-mounted displays (OST-HMDs) have been a focus of development since the first days of Augmented Reality (AR), and the first affordable prototypes are now reaching the market. Beyond common technical problems such as limited field of view, weight, and the miniaturization of these systems, a crucial aspect for AR is the calibration of such a device with respect to the individual user, so that augmentations are properly aligned. Our demonstrator shows a practical solution to this problem, together with a fully featured example application for a typical maintenance use case, built on a generalized framework for application creation. We describe the technical background and procedure of the calibration, the tracking approach that takes the device's sensors into account, user-experience factors, and the implementation procedure in general. We present our demonstrator on an Epson Moverio BT-200 OST-HMD.
{"title":"User friedly calibration and tracking for optical stereo see-through augmented reality","authors":"Folker Wientapper, T. Engelke, J. Keil, H. Wuest, J. Mensik","doi":"10.1109/ISMAR.2014.6948501","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948501","url":null,"abstract":"Optical see through head mounted displays (OST-HMD) are ever since the first days of Augmented Reality (AR) in focus of development and in nowadays first affordable and prototypes are spread out to markets. Despite common technical problems, such as having a proper field of view, weight, and other problems concerning the miniaturization of these systems, a crucial aspect for AR relies also in the calibration of such a device with respect to the individual user for proper alignment of augmentations. Our demonstrator shows a practical solution for this problem along with a fully featured example application for a typical maintenance use case based on a generalized framework for application creation. We depict the technical background and procedure of the calibration, the tracking approach considering the sensors of the device, user experience factors, and its implementation procedure in general. We present our demonstrator using an Epson Moverio BT-200 OST-HMD.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87961779","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Markerless augmented reality solution for industrial manufacturing
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948488
Boris Meden, Sebastian Knödel, S. Bourgeois
{"title":"Markerless augmented reality solution for industrial manufacturing","authors":"Boris Meden, Sebastian Knödel, S. Bourgeois","doi":"10.1109/ISMAR.2014.6948488","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948488","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76816575","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Displaying free-viewpoint video with user controllable head mounted display DEMO
Pub Date: 2014-09-01 | DOI: 10.1109/ISMAR.2014.6948503
Y. Yoshida, T. Kawamoto
{"title":"Displaying free-viewpoint video with user controlable head mounted display DEMO","authors":"Y. Yoshida, T. Kawamoto","doi":"10.1109/ISMAR.2014.6948503","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948503","url":null,"abstract":"","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87958836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fast vision-based multiplanar scene modeling in unprepared environments
Pub Date: 2014-01-01 | DOI: 10.1109/ISMAR.2014.6948498
J. Vigueras
In this demonstration, we present a general-purpose Augmented Reality (AR) system that allows 3D computer-generated (CG) objects to be added easily into real man-made environments. Our system offers very intuitive and easy in situ 3D structure recovery of piecewise-planar scenes without requiring powerful hardware or extra sensors. The user simply has to move the camera (translation of the camera is mandatory) and take two different pictures of the scene; our approach then obtains a rough piecewise-planar representation of the environment, suitable for multi-planar tracking in visual model-based augmented reality and for coherently augmenting the scene with virtual objects. Polyhedral representations are well suited to man-made environments both indoors (e.g., offices, rooms, classrooms) and outdoors (e.g., facades, the ground), hence we focus the potential applications of our system on augmenting simple rooms or urban scenes with virtual imagery.
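A minimal sketch of the per-plane building block behind such two-view plane recovery: matching features between the two user-taken pictures and estimating a dominant plane's homography with RANSAC. The file names and thresholds are placeholders, and this is not the authors' pipeline.

```python
# Hypothetical sketch: estimate the homography of a dominant plane between the
# two pictures the user takes; inliers of the RANSAC fit correspond to points
# on that plane, and the procedure could be repeated for further planes.
import numpy as np
import cv2

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Points lying on one plane are consistent with a single homography; RANSAC
# separates them from matches belonging to other planes or to outliers.
H, inlier_mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
print("inliers on the dominant plane:", int(inlier_mask.sum()))
```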
{"title":"Fast vision-based multiplanar scene modeling in unprepared environments","authors":"J. Vigueras","doi":"10.1109/ISMAR.2014.6948498","DOIUrl":"https://doi.org/10.1109/ISMAR.2014.6948498","url":null,"abstract":"In this demonstration, we present a general purpose Augmented Reality (AR) system that allows to add easily 3D computer generated (CG) objects into real man-made environments. Our system goes to a very intuitive and easy in situ 3D structure recovery of planar piecewise scenes without using powerful hardware nor commodity sensors. The user simply has to move the camera (translation of the camera is mandatory) and take two different pictures of the scene, and our approach obtains a rough planar piecewise representation of the environment suitable to conduct multi-planar tracking for visual model-based augmented reality and to augment it with virtual objects coherently. Polyhedral representations of scenes are very convenient for manmade environments indoor (e.g., offices, rooms, classrooms) and outdoor (e.g., facades, floor), hence we focus the potential applications of our system to augment simple rooms or urban scenes with virtual imagery.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83466172","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}