
2010 IEEE International Symposium on Mixed and Augmented Reality: Latest Publications

Time-domain augmented reality based on locally adaptive video sampling
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643597
Tatsuro Orikasa, S. Kagami, K. Hashimoto
We propose a new approach for augmented reality, in which real-world scene images are augmented with video fragments manipulated in the time domain. The proposed system aims to display slow-motion video sequences of moving objects instantly without accumulated time lag so that a user can recognize and observe high-speed motion on the spot. Images from a high-speed camera are analyzed to detect regions with important visual features, which are overlaid on a normal-speed video sequence.
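A minimal sketch of the general idea described above, not the authors' implementation: frame-difference the high-speed stream to find moving regions, queue them, and paste one queued region per normal-speed display frame so the motion plays back in slow motion. The OpenCV/NumPy usage, the SLOWDOWN ratio, and the MOTION_THRESH value are illustrative assumptions, and both streams are assumed to be pixel-aligned.

```python
# Illustrative sketch, not the ISMAR 2010 system: overlay a slow-motion replay of a
# moving region, detected by frame differencing in the high-speed stream, onto the
# normal-speed video. Assumes both cameras deliver aligned frames of equal size.
import collections
import cv2
import numpy as np

SLOWDOWN = 4          # hypothetical ratio: high-speed frames consumed per display frame
MOTION_THRESH = 25    # hypothetical per-pixel intensity-difference threshold

replay_queue = collections.deque()   # queued (bounding box, frame) pairs awaiting slow-motion display

def detect_motion_region(prev_gray, gray):
    """Return a bounding box (x0, y0, x1, y1) around changed pixels, or None."""
    diff = cv2.absdiff(prev_gray, gray)
    mask = diff > MOTION_THRESH
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return xs.min(), ys.min(), xs.max(), ys.max()

def augment(display_frame, highspeed_frames, prev_gray):
    """Queue motion regions from the high-speed burst, then paste the oldest queued
    region onto the current normal-speed frame (one per call -> slow motion)."""
    for hs in highspeed_frames:                      # e.g. SLOWDOWN frames per display frame
        gray = cv2.cvtColor(hs, cv2.COLOR_BGR2GRAY)
        box = detect_motion_region(prev_gray, gray)
        if box is not None:
            replay_queue.append((box, hs))
        prev_gray = gray
    if replay_queue:
        (x0, y0, x1, y1), hs = replay_queue.popleft()
        display_frame[y0:y1 + 1, x0:x1 + 1] = hs[y0:y1 + 1, x0:x1 + 1]
    return display_frame, prev_gray
```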
Citations: 2
Demo for differential Instant Radiosity for Mixed Reality
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643618
C. Traxler, Martin Knecht
This laboratory demo is a showcase for the research results published in our ISMAR 2010 paper [3], where we describe a method to simulate the mutual shading effects between virtual and real objects in Mixed Reality applications. The aim is to provide a plausible illusion so that virtual objects seem to be really there. It combines Instant Radiosity [2] with Differential Rendering [1] into a method suitable for MR applications. The demo consists of two scenarios: a simple one focusing on mutual shading effects and an MR game based on LEGO®.
Citations: 0
An automatic parallax adjustment method for stereoscopic augmented reality systems
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643574
Wen-Chao Chen, Fu-Jen Hsiao, Chung-Wei Lin
This paper presents an automatic parallax adjustment method that considers the border effect to produce more realistic stereo images on a stereoscopic augmented reality system. Three-dimensional (3D) imaging is an emerging method of displaying three-dimensional information and providing an immersive and intuitive experience with augmented reality. However, the protruding parts of displayed stereoscopic images may be blurry and cause viewing discomfort. Furthermore, the border effect may make it difficult for an imaging system to display regions next to screen borders, even with considerable negative parallax. This paper proposes a method of automatically adjusting the parallax of displayed stereo images by analyzing the feature points in regions near screen borders to produce better stereo effects. Experimental results and a subjective assessment of human factor issues indicate that the proposed method makes stereoscopic augmented reality systems significantly more attractive and comfortable to view.
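One plausible way to realize such a border-aware adjustment, sketched here under assumptions the abstract does not state (a uniform horizontal shift of both views, a 10% border margin, and the convention that negative parallax means content appears in front of the screen):

```python
# Hedged sketch: find the most negative disparity among features near the image
# borders and shift the stereo pair horizontally so that content moves back to
# the screen plane. Wrap-around from np.roll is ignored for brevity; a real
# system would crop or inpaint the exposed columns.
import numpy as np

def border_mask(shape, margin=0.1):
    """Boolean mask marking a border band of the given relative width."""
    h, w = shape[:2]
    bh, bw = int(h * margin), int(w * margin)
    m = np.zeros((h, w), dtype=bool)
    m[:bh, :] = m[-bh:, :] = m[:, :bw] = m[:, -bw:] = True
    return m

def adjust_parallax(left, right, pts_left, disparities):
    """pts_left: Nx2 feature coordinates (x, y) in the left image;
    disparities: per-feature parallax x_right - x_left (negative = in front of screen)."""
    mask = border_mask(left.shape)
    near_border = mask[pts_left[:, 1].astype(int), pts_left[:, 0].astype(int)]
    worst = disparities[near_border].min() if near_border.any() else 0.0
    shift = int(max(0.0, -worst))        # pixels needed to push border content to zero parallax
    # split the shift between the two views: decreases x_left, increases x_right
    left_adj = np.roll(left, -(shift // 2), axis=1)
    right_adj = np.roll(right, shift - shift // 2, axis=1)
    return left_adj, right_adj
```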
Citations: 3
Color harmonization for Augmented Reality
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643580
Lukas Gruber, Denis Kalkofen, D. Schmalstieg
In this paper we discuss color harmonization for Augmented Reality. Color harmonization is a technique used to adjust the combination of colors in order to follow aesthetic guidelines. We implemented a system which is able to harmonize the combination of the colors in video based AR systems. The presented approach is able to re-color virtual and real-world items, achieving overall more visually pleasant results. In order to allow preservation of certain colors in an AR composition, we furthermore introduce the concept of constraint color harmonization.
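The paper builds on hue-template color harmonization; the following is a minimal sketch of that underlying operation only, assuming a single-sector template of illustrative width and a hard clamp to the sector border instead of the smoother falloff used in the harmonization literature:

```python
# Minimal sketch of hue-template harmonization (the building block the abstract
# refers to), not the paper's constrained AR variant. Hues are in degrees.
import numpy as np

SECTOR_WIDTH = 30.0   # hypothetical width of a single harmonic sector, in degrees

def harmonize_hues(hues_deg, sector_center_deg):
    """Pull every hue into the template sector centered at sector_center_deg."""
    # signed angular distance to the sector center, mapped into [-180, 180)
    d = (hues_deg - sector_center_deg + 180.0) % 360.0 - 180.0
    inside = np.abs(d) <= SECTOR_WIDTH / 2.0
    clamped = np.clip(d, -SECTOR_WIDTH / 2.0, SECTOR_WIDTH / 2.0)
    return np.where(inside, hues_deg % 360.0, (sector_center_deg + clamped) % 360.0)
```

Applied per pixel to the hue channel of an HSV image, this maps every color into the chosen sector; constraining selected colors, as the abstract describes, would amount to exempting them from the remapping.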
Citations: 20
A multi-sensor platform for wide-area tracking
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643604
C. Waechter, Manuel J. Huber, P. Keitler, M. Schlegel, G. Klinker, D. Pustka
Indoor tracking scenarios still face challenges in providing continuous tracking support in wide-area workplaces. This is especially the case in Augmented Reality, since such augmentations generally require exact full 6DOF pose measurements in order to continuously display 3D graphics from user-related view points. Many single-sensor systems have been explored, but only a few of them are capable of tracking reliably in wide-area environments. We introduce a mobile multi-sensor platform to overcome the shortcomings of single-sensor systems. The platform is equipped with a detachable optical camera and a rigidly mounted odometric measurement system providing relative positions and orientations with respect to the ground plane. The camera is used for marker-based as well as for marker-less (feature-based) inside-out tracking as part of a hybrid approach. We explain the principal tracking technologies in our competitive/cooperative fusion approach and show possible enhancements for further development. This inside-out approach scales well with increasing tracking range, as opposed to stationary outside-in tracking.
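As a toy illustration of camera/odometry pose fusion in general, not the authors' competitive/cooperative scheme: dead-reckon a planar pose from odometry increments and blend in an absolute camera fix whenever the marker or feature tracker provides one. The FusedPose2D class and the blending weight are hypothetical.

```python
# Hedged sketch: planar (x, y, heading) pose maintained from wheel odometry and
# corrected by absolute camera fixes. Real systems would fuse full 6DOF poses
# with proper uncertainty handling (e.g. a Kalman filter).
import numpy as np

class FusedPose2D:
    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta

    def update_odometry(self, d_forward, d_theta):
        """Dead-reckoning step from an odometry increment (distance, heading change)."""
        self.theta += d_theta
        self.x += d_forward * np.cos(self.theta)
        self.y += d_forward * np.sin(self.theta)

    def correct_with_camera(self, x, y, theta, weight=0.8):
        """Blend toward an absolute camera pose; weight is an illustrative tuning knob."""
        self.x = (1 - weight) * self.x + weight * x
        self.y = (1 - weight) * self.y + weight * y
        # blend the heading on the circle to avoid wrap-around artifacts
        s = (1 - weight) * np.sin(self.theta) + weight * np.sin(theta)
        c = (1 - weight) * np.cos(self.theta) + weight * np.cos(theta)
        self.theta = float(np.arctan2(s, c))
```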
Citations: 11
Build your world and play in it: Interacting with surface particles on complex objects
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643566
Brett R. Jones, Rajinder Sodhi, R. Campbell, Guy E. Garnett, B. Bailey
We explore interacting with everyday objects by representing content as interactive surface particles. Users can build their own physical world, map virtual content onto their physical construction and play directly with the surface using a stylus. A surface particle representation allows programmed content to be created independent of the display object and to be reused on many surfaces. We demonstrated this idea through a projector-camera system that acquires the object geometry and enables direct interaction through an IR tracked stylus. We present three motivating example applications, each displayed on three example surfaces. We discuss a set of interaction techniques that show possible avenues for structuring interaction on complicated everyday objects, such as Surface Adaptive GUIs for menu selection. Through a preliminary informal evaluation and interviews with end users, we demonstrate the potential of interacting with surface particles and identify improvements necessary to make this interaction practical on everyday surfaces.
Citations: 54
3D discrepancy check via Augmented Reality
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643587
S. Kahn, H. Wuest, D. Stricker, D. Fellner
For many tasks like markerless model-based camera tracking it is essential that the 3D model of a scene accurately represents the real geometry of the scene. It is therefore very important to detect deviations between a 3D model and a scene. We present an innovative approach which is based on the insight that camera tracking can not only be used for Augmented Reality visualization but also to solve the correspondence problem between 3D measurements of a real scene and their corresponding positions in the 3D model. We combine a time-of-flight camera (which acquires depth images in real time) with a custom 2D camera (used for the camera tracking) and developed an analysis-by-synthesis approach to detect deviations between a scene and a 3D model of the scene.
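A hedged sketch of one depth-comparison step consistent with the analysis-by-synthesis idea described above: render the model's depth map from the tracked camera pose, then flag pixels where the time-of-flight measurement deviates beyond a threshold. The 5 cm default and the function name are assumptions, and the rendering step itself is omitted.

```python
# Illustrative deviation test between a measured and a synthesized depth image,
# both HxW in meters and aligned to the same (tracked) camera pose.
import numpy as np

def discrepancy_mask(measured_depth, rendered_depth, threshold_m=0.05):
    """Boolean mask of pixels where scene and model disagree by more than threshold_m."""
    valid = (measured_depth > 0) & (rendered_depth > 0)   # skip invalid / background pixels
    return valid & (np.abs(measured_depth - rendered_depth) > threshold_m)
```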
Citations: 17
Towards real time 3D tracking and reconstruction on a GPU using Monte Carlo simulations
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643568
Jairo R. Sánchez, H. Álvarez, D. Borro
This paper addresses the problem of camera tracking and 3D reconstruction from image sequences, i.e., the monocular SLAM problem. Traditionally, this problem is solved using non-linear minimization techniques that are very accurate but rarely usable in real time. This work presents a highly parallelizable random sampling approach based on Monte Carlo simulations that fits very well on graphics hardware. The proposed algorithm achieves the same precision as non-linear optimization while attaining real-time performance on commodity graphics hardware. Both accuracy and performance are evaluated using synthetic data and real video sequences captured with a hand-held camera. Moreover, results are compared with an implementation of Bundle Adjustment, showing that the presented method obtains similar results in much less time.
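For intuition only, a sequential toy version of this kind of Monte Carlo pose search (the paper's point is running the sampling in parallel on the GPU): perturb the previous camera pose randomly, score each sample by mean reprojection error against tracked 2D features, and keep the best. The perturbation scales and the first-order rotation update are illustrative assumptions.

```python
# Hedged CPU sketch of Monte Carlo camera-pose sampling; not the paper's GPU code.
import numpy as np

def project(points_3d, R, t, K):
    """Pinhole projection of Nx3 world points with pose (R, t) and intrinsics K (3x3)."""
    cam = points_3d @ R.T + t
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def monte_carlo_pose(points_3d, points_2d, K, R0, t0,
                     n_samples=2000, rot_sigma=0.01, trans_sigma=0.01, rng=None):
    """Sample pose perturbations around (R0, t0) and return the lowest-error one."""
    rng = rng or np.random.default_rng()
    best, best_err = (R0, t0), np.inf
    for _ in range(n_samples):
        w = rng.normal(scale=rot_sigma, size=3)           # small random axis-angle vector
        Wx = np.array([[0, -w[2], w[1]],
                       [w[2], 0, -w[0]],
                       [-w[1], w[0], 0]])
        R = R0 @ (np.eye(3) + Wx)                         # first-order rotation update
        t = t0 + rng.normal(scale=trans_sigma, size=3)
        err = np.mean(np.linalg.norm(project(points_3d, R, t, K) - points_2d, axis=1))
        if err < best_err:
            best, best_err = (R, t), err
    return best, best_err
```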
Citations: 17
Extended investigations of user-related issues in mobile industrial AR
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643581
Jens Grubert, D. Hamacher, R. Mecke, I. Böckelmann, L. Schega, A. Huckauf, Mario H. Urbina, Michael Schenk, Fabian Doil, Johannes Tümler
The potential of Augmented Reality (AR) to support industrial processes has been demonstrated in several studies. While there have been first investigations of user-related issues in the long-duration use of mobile AR systems, to date the impact of these systems on physiological and psychological aspects has not been explored extensively. We conducted an extended study in which 19 participants worked 4 hours continuously in an order-picking process with and without AR support. Results of the study comparing strain and work efficiency are presented and open issues are discussed.
Citations: 29
An immersive e-learning system providing virtual experience
Pub Date : 2010-11-22 DOI: 10.1109/ISMAR.2010.5643591
Suwoong Lee, Jong-Gook Ko, Seokbin Kang, Junsuk Lee
This paper introduces an immersive e-learning system that provides a vivid learning experience using augmented reality (AR) technology. The system gives participants the illusion of being in a foreign environment by synthesizing images of the participants, a virtual environment, and foreign-language speakers in real time. Furthermore, the surrounding virtual environment reacts to the behavior of each participant, including the student, the local teacher, and the remote teacher. The system has been installed, along with 10 scenarios, at 14 public elementary schools and used during regular class time. This paper presents our motivations for the system development, a detailed design, and its contents.
Citations: 5