Time-domain augmented reality based on locally adaptive video sampling
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643597
Tatsuro Orikasa, S. Kagami, K. Hashimoto
We propose a new approach for augmented reality, in which real-world scene images are augmented with video fragments manipulated in the time domain. The proposed system aims to display slow-motion video sequences of moving objects instantly without accumulated time lag so that a user can recognize and observe high-speed motion on the spot. Images from a high-speed camera are analyzed to detect regions with important visual features, which are overlaid on a normal-speed video sequence.
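The abstract does not spell out the buffering scheme, so the following is only a minimal sketch of the core idea: keep recent high-speed frames in a ring buffer, mask the moving region by frame differencing, and replay that region on top of the live stream, where the capture/display frame-rate mismatch alone produces the slow-motion effect. All names, thresholds, and the 240/30 fps figures are assumptions, not the authors' locally adaptive sampling method.

```python
import numpy as np
from collections import deque

class SlowMotionOverlay:
    """Illustrative only: replay buffered high-speed frames over a live
    normal-speed stream. Grayscale uint8 frames of equal size assumed."""

    def __init__(self, motion_thresh=25, buffer_len=512):
        self.motion_thresh = motion_thresh      # per-pixel difference threshold
        self.buffer = deque(maxlen=buffer_len)  # ring buffer of (frame, mask)
        self.prev = None
        self.cursor = 0

    def push_highspeed_frame(self, frame):
        """Store a captured frame together with a coarse motion mask."""
        if self.prev is not None:
            diff = np.abs(frame.astype(np.int16) - self.prev.astype(np.int16))
            self.buffer.append((frame, diff > self.motion_thresh))
        self.prev = frame

    def compose_display_frame(self, live_frame):
        """Advance one buffered frame per displayed frame: frames captured
        at e.g. 240 fps but shown at 30 fps appear slowed down by 8x."""
        if not self.buffer:
            return live_frame
        frame, mask = self.buffer[self.cursor % len(self.buffer)]
        self.cursor += 1
        out = live_frame.copy()
        out[mask] = frame[mask]   # overlay only the detected moving region
        return out
```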
{"title":"Time-domain augmented reality based on locally adaptive video sampling","authors":"Tatsuro Orikasa, S. Kagami, K. Hashimoto","doi":"10.1109/ISMAR.2010.5643597","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643597","url":null,"abstract":"We propose a new approach for augmented reality, in which real-world scene images are augmented with video fragments manipulated in the time domain. The proposed system aims to display slow-motion video sequences of moving objects instantly without accumulated time lag so that a user can recognize and observe high-speed motion on the spot. Images from a high-speed camera are analyzed to detect regions with important visual features, which are overlaid on a normal-speed video sequence.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124571471","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Demo for differential Instant Radiosity for Mixed Reality
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643618
C. Traxler, Martin Knecht
This laboratory demo is a showcase for the research results published in our ISMAR 2010 paper [3], where we describe a method to simulate the mutual shading effects between virtual and real objects in Mixed Reality applications. The aim is to provide a plausible illusion so that virtual objects seem to be really there. It combines Instant Radiosity [2] with Differential Rendering [1] into a method suitable for MR applications. The demo consists of two scenarios: a simple one to focus on mutual shading effects, and an MR game based on LEGO®.
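As a quick illustration of the Differential Rendering step the demo combines with Instant Radiosity, the sketch below composites the camera image with the difference of two synthetic renderings: one of the real-scene proxy together with the virtual objects, and one of the proxy alone. The function name and the assumption of linear float images are ours, not the authors'.

```python
import numpy as np

def differential_composite(camera_image, render_with_virtual, render_proxy_only):
    """Differential Rendering in the spirit of [1]: add to the camera image
    the change in radiance that the virtual objects cause on the modelled
    part of the real scene. Inputs are float arrays in [0, 1]; both
    renderings are assumed to use the same (instant-radiosity) lighting."""
    delta = render_with_virtual - render_proxy_only  # shadows < 0, bounce light > 0
    return np.clip(camera_image + delta, 0.0, 1.0)
```

Negative differences darken the photo (virtual shadows falling on real surfaces) and positive ones brighten it (indirect bounce light), which is precisely the mutual shading effect the demo exhibits.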
{"title":"Demo for differential Instant Radiosity for Mixed Reality","authors":"C. Traxler, Martin Knecht","doi":"10.1109/ISMAR.2010.5643618","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643618","url":null,"abstract":"This laboratory demo is a showcase for the research results published in our ISMAR 2010 paper [3], where we describe a method to simulate the mutual shading effects between virtual and real objects in Mixed Reality applications. The aim is to provide a plausible illusion so that virtual objects seem to be really there. It combines Instant Radiosity [2] with Differential Rendering [1] to a method suitable for MR applications. The demo consists of two scenarios, a simple one to focus on mutual shading effects and an MR game based on LEGO®.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127208074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An automatic parallax adjustment method for stereoscopic augmented reality systems
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643574
Wen-Chao Chen, Fu-Jen Hsiao, Chung-Wei Lin
This paper presents an automatic parallax adjustment method that considers the border effect to produce more realistic stereo images on a stereoscopic augmented reality system. Three-dimensional (3D) imaging is an emerging method of displaying three-dimensional information and providing an immersive and intuitive experience with augmented reality. However, the protruding parts of displayed stereoscopic images may be blurry and cause viewing discomfort. Furthermore, the border effect may make it difficult for an imaging system to display regions next to screen borders, even with considerable negative parallax. This paper proposes a method of automatically adjusting the parallax of displayed stereo images by analyzing the feature points in regions near screen borders to produce better stereo effects. Experimental results and a subjective assessment of human factor issues indicate that the proposed method makes stereoscopic augmented reality systems significantly more attractive and comfortable to view.
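The paper's actual adjustment also weighs human-factor issues, but the core mechanism can be sketched as follows: estimate the disparity of matched feature points near the screen borders and derive a horizontal image shift that pushes strongly protruding border regions back toward the screen plane. All names and the simple half-shift policy are hypothetical.

```python
import numpy as np

def adjust_parallax(left_pts, right_pts, width, border_frac=0.1):
    """Hypothetical sketch: find the most negative disparity among matched
    features near the left/right image borders and return a per-view
    horizontal shift that brings those regions back to the screen plane."""
    left_pts = np.asarray(left_pts, float)    # (N, 2) x,y in the left image
    right_pts = np.asarray(right_pts, float)  # (N, 2) matches in the right image
    disparity = right_pts[:, 0] - left_pts[:, 0]  # negative -> in front of screen
    border = width * border_frac
    near_border = (left_pts[:, 0] < border) | (left_pts[:, 0] > width - border)
    if not near_border.any():
        return 0.0
    worst = disparity[near_border].min()
    return max(0.0, -worst) / 2.0   # split the correction between both views
```

The returned shift would be applied symmetrically, moving the right image rightward and the left image leftward (with matching crops), which increases disparity at the borders and so reduces the border effect.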
{"title":"An automatic parallax adjustment method for stereoscopic augmented reality systems","authors":"Wen-Chao Chen, Fu-Jen Hsiao, Chung-Wei Lin","doi":"10.1109/ISMAR.2010.5643574","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643574","url":null,"abstract":"This paper presents an automatic parallax adjustment method that considers the border effect to produce more realistic stereo images on a stereoscopic augmented reality system. Three-dimensional (3D) imaging is an emerging method of displaying three-dimensional information and providing an immersive and intuitive experience with augmented reality. However, the protruding parts of displayed stereoscopic images may be blurry and cause viewing discomfort. Furthermore, the border effect may make it difficult for an imaging system to display regions next to screen borders, even with considerable negative parallax. This paper proposes a method of automatically adjusting the parallax of displayed stereo images by analyzing the feature points in regions near screen borders to produce better stereo effects. Experimental results and a subjective assessment of human factor issues indicate that the proposed method makes stereoscopic augmented reality systems significantly more attractive and comfortable to view.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"83 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115866397","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Color harmonization for Augmented Reality
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643580
Lukas Gruber, Denis Kalkofen, D. Schmalstieg
In this paper we discuss color harmonization for Augmented Reality. Color harmonization is a technique that adjusts combinations of colors to follow aesthetic guidelines. We implemented a system that harmonizes the color combinations in video-based AR systems. The presented approach can re-color both virtual and real-world items, achieving overall more visually pleasing results. To allow certain colors in an AR composition to be preserved, we furthermore introduce the concept of constrained color harmonization.
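The abstract leaves the harmonization machinery implicit; below is a toy sketch in the spirit of template-based hue harmonization (hues pulled into the sectors of a harmonic template on the hue wheel), extended with the abstract's idea of preserving selected colors. Template fitting, the AR compositing, and all parameter values are omitted or assumed.

```python
import numpy as np

def harmonize_hues(hues, template_centers, sector_width=30.0, locked=None):
    """Toy template-based harmonization: pull each hue (degrees, 0-360)
    to the nearest template sector; hues flagged in `locked` stay fixed,
    mimicking the constrained harmonization idea."""
    hues = np.asarray(hues, float) % 360.0
    out = hues.copy()
    centers = np.asarray(template_centers, float)
    for i, h in enumerate(hues):
        if locked is not None and locked[i]:
            continue  # constrained color: preserve it unchanged
        d = (centers - h + 180.0) % 360.0 - 180.0  # signed circular distances
        j = np.argmin(np.abs(d))
        if abs(d[j]) > sector_width / 2.0:          # outside the sector:
            out[i] = (h + d[j] - np.sign(d[j]) * sector_width / 2.0) % 360.0
    return out
```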
{"title":"Color harmonization for Augmented Reality","authors":"Lukas Gruber, Denis Kalkofen, D. Schmalstieg","doi":"10.1109/ISMAR.2010.5643580","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643580","url":null,"abstract":"In this paper we discuss color harmonization for Augmented Reality. Color harmonization is a technique used to adjust the combination of colors in order to follow aesthetic guidelines. We implemented a system which is able to harmonize the combination of the colors in video based AR systems. The presented approach is able to re-color virtual and real-world items, achieving overall more visually pleasant results. In order to allow preservation of certain colors in an AR composition, we furthermore introduce the concept of constraint color harmonization.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"193 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115635899","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A multi-sensor platform for wide-area tracking
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643604
C. Waechter, Manuel J. Huber, P. Keitler, M. Schlegel, G. Klinker, D. Pustka
Indoor tracking scenarios still face challenges in providing continuous tracking support in wide-area workplaces. This is especially the case in Augmented Reality, since augmentations generally require exact full 6DOF pose measurements in order to continuously display 3D graphics from user-related viewpoints. Many single-sensor systems have been explored, but only a few of them can track reliably in wide-area environments. We introduce a mobile multi-sensor platform to overcome the shortcomings of single-sensor systems. The platform is equipped with a detachable optical camera and a rigidly mounted odometric measurement system providing relative positions and orientations with respect to the ground plane. The camera is used for marker-based as well as marker-less (feature-based) inside-out tracking as part of a hybrid approach. We explain the principal tracking technologies in our competitive/cooperative fusion approach and outline possible enhancements for further development. This inside-out approach scales well with increasing tracking range, as opposed to stationary outside-in tracking.
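The paper's competitive/cooperative fusion is more involved than anything that fits here, but the basic division of labor — odometry supplying smooth relative motion over the ground plane, the camera occasionally supplying absolute poses — can be sketched as a simple complementary blend. The function name, the planar 3-DOF state, and the gain alpha are assumptions.

```python
import numpy as np

def fuse_pose(odom_delta, vision_pose, fused_prev, alpha=0.3):
    """Simplistic complementary fusion (not the paper's actual scheme):
    propagate the last fused 2D pose (x, y, heading) with the odometric
    increment, then blend toward an absolute vision pose when the camera
    tracker succeeds. alpha is an assumed correction gain."""
    x, y, th = fused_prev
    dx, dy, dth = odom_delta                 # increment in the platform frame
    # Dead reckoning: rotate the increment into the world frame.
    xw = x + dx * np.cos(th) - dy * np.sin(th)
    yw = y + dx * np.sin(th) + dy * np.cos(th)
    thw = th + dth
    if vision_pose is None:
        return (xw, yw, thw)                 # camera lost: odometry only
    vx, vy, vth = vision_pose
    err = (vth - thw + np.pi) % (2 * np.pi) - np.pi   # wrapped heading error
    return (xw + alpha * (vx - xw),
            yw + alpha * (vy - yw),
            thw + alpha * err)
```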
{"title":"A multi-sensor platform for wide-area tracking","authors":"C. Waechter, Manuel J. Huber, P. Keitler, M. Schlegel, G. Klinker, D. Pustka","doi":"10.1109/ISMAR.2010.5643604","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643604","url":null,"abstract":"Indoor tracking scenarios still face challenges in providing continuous tracking support in wide-area workplaces. This is especially the case in Augmented Reality since such augmentations generally require exact full 6DOF pose measurements in order to continuously display 3D graphics from user-related view points. Many single sensor systems have been explored but only few of them have the capability to track reliably in wide-area environments. We introduce a mobile multi-sensor platform to overcome the shortcomings of single sensor systems. The platform is equipped with a detachable optical camera and a rigidly mounted odometric measurement system providing relative positions and orientations with respect to the ground plane. The camera is used for marker-based as well as for marker-less (feature-based) inside-out tracking as part of a hybrid approach. We explain the principle tracking technologies in our competitive/cooperative fusion approach and show possible enhancements to further developments. This inside-out approach scales well with increasing tracking range, as opposed to stationary outside-in tracking.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"51 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126354538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Build your world and play in it: Interacting with surface particles on complex objects
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643566
Brett R. Jones, Rajinder Sodhi, R. Campbell, Guy E. Garnett, B. Bailey
We explore interacting with everyday objects by representing content as interactive surface particles. Users can build their own physical world, map virtual content onto their physical construction, and play directly with the surface using a stylus. A surface-particle representation allows programmed content to be created independently of the display object and reused on many surfaces. We demonstrate this idea through a projector-camera system that acquires the object geometry and enables direct interaction through an IR-tracked stylus. We present three motivating example applications, each displayed on three example surfaces. We discuss a set of interaction techniques that show possible avenues for structuring interaction on complicated everyday objects, such as Surface Adaptive GUIs for menu selection. Through a preliminary informal evaluation and interviews with end users, we demonstrate the potential of interacting with surface particles and identify the improvements necessary to make this interaction practical on everyday surfaces.
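One plausible (hypothetical) reading of the surface-particle representation: content stores per-particle state, and particles are bound at runtime to the nearest points of whatever geometry the projector-camera system acquired, so the same content runs on any scanned object. The sketch uses SciPy's k-d tree for the nearest-surface queries; all names and fields are ours.

```python
import numpy as np
from scipy.spatial import cKDTree

class SurfaceParticles:
    """Hypothetical sketch: bind display-independent content to the
    nearest points of an acquired surface scan."""

    def __init__(self, surface_points):
        self.points = np.asarray(surface_points, float)  # (N, 3) scanned geometry
        self.tree = cKDTree(self.points)
        self.color = np.zeros((len(self.points), 3))     # per-particle state

    def paint(self, stylus_pos, rgb, radius=0.01):
        """Stylus interaction: color every particle within `radius` metres
        of the tracked stylus tip."""
        idx = self.tree.query_ball_point(np.asarray(stylus_pos, float), radius)
        self.color[idx] = rgb
```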
{"title":"Build your world and play in it: Interacting with surface particles on complex objects","authors":"Brett R. Jones, Rajinder Sodhi, R. Campbell, Guy E. Garnett, B. Bailey","doi":"10.1109/ISMAR.2010.5643566","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643566","url":null,"abstract":"We explore interacting with everyday objects by representing content as interactive surface particles. Users can build their own physical world, map virtual content onto their physical construction and play directly with the surface using a stylus. A surface particle representation allows programmed content to be created independent of the display object and to be reused on many surfaces. We demonstrated this idea through a projector-camera system that acquires the object geometry and enables direct interaction through an IR tracked stylus. We present three motivating example applications, each displayed on three example surfaces. We discuss a set of interaction techniques that show possible avenues for structuring interaction on complicated everyday objects, such as Surface Adaptive GUIs for menu selection. Through a preliminary informal evaluation and interviews with end users, we demonstrate the potential of interacting with surface particles and identify improvements necessary to make this interaction practical on everyday surfaces.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124392676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D discrepancy check via Augmented Reality
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643587
S. Kahn, H. Wuest, D. Stricker, D. Fellner
For many tasks, such as markerless model-based camera tracking, it is essential that the 3D model of a scene accurately represents the real geometry of the scene. It is therefore very important to detect deviations between a 3D model and a scene. We present an innovative approach based on the insight that camera tracking can be used not only for Augmented Reality visualization but also to solve the correspondence problem between 3D measurements of a real scene and their corresponding positions in the 3D model. We combine a time-of-flight camera (which acquires depth images in real time) with a custom 2D camera (used for the camera tracking) and develop an analysis-by-synthesis approach to detect deviations between a scene and its 3D model.
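Conceptually, the analysis-by-synthesis comparison reduces to rendering the model's depth from the tracked camera pose and differencing it against the time-of-flight depth image. The sketch below assumes both depth maps are already registered into the same view; the names and the tolerance are ours.

```python
import numpy as np

def depth_discrepancy(tof_depth, model_depth, tol=0.02):
    """Simplified discrepancy check: `model_depth` is the 3D model's depth
    buffer rendered from the tracked camera pose and resampled into the
    time-of-flight camera's view. Returns a per-pixel deviation map and a
    mask of pixels deviating by more than `tol` metres."""
    valid = (tof_depth > 0) & (model_depth > 0)   # ignore missing measurements
    deviation = np.where(valid, tof_depth - model_depth, np.nan)
    mask = valid & (np.abs(tof_depth - model_depth) > tol)
    return deviation, mask
```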
{"title":"3D discrepancy check via Augmented Reality","authors":"S. Kahn, H. Wuest, D. Stricker, D. Fellner","doi":"10.1109/ISMAR.2010.5643587","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643587","url":null,"abstract":"For many tasks like markerless model-based camera tracking it is essential that the 3D model of a scene accurately represents the real geometry of the scene. It is therefore very important to detect deviations between a 3D model and a scene. We present an innovative approach which is based on the insight that camera tracking can not only be used for Augmented Reality visualization but also to solve the correspondence problem between 3D measurements of a real scene and their corresponding positions in the 3D model. We combine a time-of-flight camera (which acquires depth images in real time) with a custom 2D camera (used for the camera tracking) and developed an analysis-by-synthesis approach to detect deviations between a scene and a 3D model of the scene.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123359895","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards real time 3D tracking and reconstruction on a GPU using Monte Carlo simulations
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643568
Jairo R. Sánchez, H. Álvarez, D. Borro
This paper addresses the problem of camera tracking and 3D reconstruction from image sequences, i.e., the monocular SLAM problem. Traditionally, this problem is solved using non-linear minimization techniques that are very accurate but hard to run in real time. This work presents a highly parallelizable random-sampling approach based on Monte Carlo simulations that fits very well on graphics hardware. The proposed algorithm achieves the same precision as non-linear optimization while attaining real-time performance on commodity graphics hardware. Both accuracy and performance are evaluated using synthetic data and real video sequences captured with a hand-held camera. Moreover, the results are compared with an implementation of Bundle Adjustment, showing that the presented method obtains similar results in much less time.
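A minimal sketch of such a Monte Carlo refinement, assuming a 6-vector pose parameterization and a user-supplied projection function: sample perturbed pose hypotheses, score each by reprojection error against the observed 2D features, keep the best, and anneal the search. The per-hypothesis loop is written sequentially here; since every hypothesis is independent, a GPU can score them all in parallel, which is the parallelism the paper exploits.

```python
import numpy as np

def refine_pose_monte_carlo(pose, points3d, obs2d, project,
                            n=4096, sigma=0.01, iters=10):
    """Illustrative random-sampling refinement (not the authors' exact
    algorithm). `project(pose, points3d) -> (N, 2)` is an assumed
    pinhole projection; `pose` is a 6-vector (rotation + translation)."""
    best = np.asarray(pose, float)
    best_err = np.inf
    for _ in range(iters):
        hyp = best + np.random.normal(0.0, sigma, size=(n, 6))
        hyp[0] = best                         # always keep the current best
        errs = np.array([np.sum((project(h, points3d) - obs2d) ** 2)
                         for h in hyp])       # parallel per hypothesis on a GPU
        i = int(np.argmin(errs))
        if errs[i] < best_err:
            best, best_err = hyp[i], errs[i]
        sigma *= 0.7                          # anneal the sampling spread
    return best, best_err
```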
{"title":"Towards real time 3D tracking and reconstruction on a GPU using Monte Carlo simulations","authors":"Jairo R. Sánchez, H. Álvarez, D. Borro","doi":"10.1109/ISMAR.2010.5643568","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643568","url":null,"abstract":"This paper addresses the problem of camera tracking and 3D reconstruction from image sequences, i.e., the monocular SLAM problem. Traditionally, this problem is solved using non-linear minimization techniques that are very accurate but hardly used in real time. This work presents a highly parallelizable random sampling approach based on Monte Carlo simulations that fits very well on the graphics hardware. The proposed algorithm achieves the same precision as non linear optimization, getting real time performance running on commodity graphics hardware. Both accuracy and performance are evaluated using synthetic data and real video sequences captured with a hand-held camera. Moreover, results are compared with an implementation of Bundle Adjustment showing that the presented method gets similar results in much less time.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115010440","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Extended investigations of user-related issues in mobile industrial AR
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643581
Jens Grubert, D. Hamacher, R. Mecke, I. Böckelmann, L. Schega, A. Huckauf, Mario H. Urbina, Michael Schenk, Fabian Doil, Johannes Tümler
The potential of Augmented Reality (AR) to support industrial processes has been demonstrated in several studies. While there have been first investigations of user-related issues in the long-duration use of mobile AR systems, to date the impact of these systems on physiological and psychological aspects has not been explored extensively. We conducted an extended study in which 19 participants worked continuously for 4 hours in an order-picking process, with and without AR support. Results of the study comparing strain and work efficiency are presented, and open issues are discussed.
{"title":"Extended investigations of user-related issues in mobile industrial AR","authors":"Jens Grubert, D. Hamacher, R. Mecke, I. Böckelmann, L. Schega, A. Huckauf, Mario H. Urbina, Michael Schenk, Fabian Doil, Johannes Tümler","doi":"10.1109/ISMAR.2010.5643581","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643581","url":null,"abstract":"The potential of Augmented Reality (AR) to support industrial processes has been demonstrated in several studies. While there have been first investigations on user related issues in the long-duration use of mobile AR systems, to date the impact of theses systems on physiological and psychological aspects is not explored extensively. We conducted an extended study in which 19 participants worked 4 hours continuously in an order picking process with and without AR support. Results of the study comparing strain and work efficiency are presented and open issues are discussed.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"1974 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129987918","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An immersive e-learning system providing virtual experience
Pub Date: 2010-11-22 | DOI: 10.1109/ISMAR.2010.5643591
Suwoong Lee, Jong-Gook Ko, Seokbin Kang, Junsuk Lee
This paper introduces an immersive e-learning system that provides a vivid learning experience using augmented reality (AR) technology. The system creates the illusion that participants are in a foreign environment by synthesizing images of the participants, a virtual environment, and foreign-language speakers in real time. Furthermore, the surrounding virtual environment reacts to the behavior of each participant, including the student, the local teacher, and the remote teacher. The system has been installed with 10 scenarios at 14 public elementary schools and used during regular class time. This paper presents our motivations for developing the system, its detailed design, and its contents.
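The paper does not state how the real-time synthesis is implemented; a common building block for placing camera images of participants into a virtual environment is chroma keying, sketched below purely for illustration. Key color and tolerance are assumptions.

```python
import numpy as np

def composite_participant(camera_rgb, virtual_rgb, key=(0, 255, 0), tol=80):
    """Hypothetical chroma-key compositing: pixels close to the key color
    are replaced by the virtual environment, so the participant appears
    'inside' the virtual scene. Both inputs are uint8 (H, W, 3) images."""
    cam = camera_rgb.astype(np.int16)
    dist = np.linalg.norm(cam - np.array(key, np.int16), axis=-1)
    mask = dist < tol                     # background pixels to replace
    out = camera_rgb.copy()
    out[mask] = virtual_rgb[mask]
    return out
```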
{"title":"An immersive e-learning system providing virtual experience","authors":"Suwoong Lee, Jong-Gook Ko, Seokbin Kang, Junsuk Lee","doi":"10.1109/ISMAR.2010.5643591","DOIUrl":"https://doi.org/10.1109/ISMAR.2010.5643591","url":null,"abstract":"This paper introduces immersive e-learning system which provides vivid learning experience using augmented reality(AR) technology. This system gives illusion that participants feel as if they are in foreign environment by synthesizing images of participants, virtual environment, foreign-language speakers in real-time. Furthermore, surrounding virtual environment reacts to the behavior of each participant including student, local teacher, remote teacher. The system has been installed along with 10 scenarios at 14 public elementary schools and conducted during regular class time. This paper presents our motivations for the system development, a detailed design, and its contents.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133542464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}