
Latest publications from the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality

Fast annotation and modeling with a single-point laser range finder
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637326
Jason Wither, Christopher Coffin, Jonathan Ventura, Tobias Höllerer
This paper presents methodology for integrating a small, single-point laser range finder into a wearable augmented reality system. We first present a way of creating object-aligned annotations with very little user effort. Second, we describe techniques to segment and pop-up foreground objects. Finally, we introduce a method using the laser range finder to incrementally build 3D panoramas from a fixed observer's location. To build a 3D panorama semi-automatically, we track the system's orientation and use the sparse range data acquired as the user looks around in conjunction with real-time image processing to construct geometry around the user's position. Using full 3D panoramic geometry, it is possible for new virtual objects to be placed in the scene with proper lighting and occlusion by real world objects, which increases the expressivity of the AR experience.
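The core geometric step the abstract describes — combining the tracked orientation with a sparse range sample to place a 3D point around the user — can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the coordinate convention and function name are assumptions.

```python
import math

# Hedged sketch: turn one range reading plus the tracked orientation
# (yaw/pitch) into a world-space point around the user's position --
# the building block of the incremental 3D panorama described above.
def range_to_point(user_pos, yaw_deg, pitch_deg, range_m):
    """Convert an (orientation, range) sample into a 3D point (metres)."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    dx = range_m * math.cos(pitch) * math.sin(yaw)   # east
    dy = range_m * math.sin(pitch)                   # up
    dz = range_m * math.cos(pitch) * math.cos(yaw)   # north
    return (user_pos[0] + dx, user_pos[1] + dy, user_pos[2] + dz)

# Example: a 5 m reading while looking due east at the horizon.
p = range_to_point((0.0, 0.0, 0.0), 90.0, 0.0, 5.0)
```

Accumulating such points as the user looks around, and interpolating between them with image processing, yields the panorama geometry the paper uses for occlusion.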
Citations: 37
A differential GPS carrier phase technique for precision outdoor AR tracking
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637319
W. T. Fong, S. Ong, A. Nee
This paper presents a differential GPS carrier phase technique for 3D outdoor position tracking in mobile augmented reality (AR) applications. It has good positioning accuracy, low drift and jitter, and low computation requirements, and it avoids the need to resolve integer ambiguities. The position from an initial point is tracked by accumulating the displacement in each time step, which is determined using Differential Single Difference. Preliminary results using low-cost GPS receivers show that the position error is 10 cm and the drift is 0.001 m/s, which can be compensated using linear models. Stable and accurate augmentations in outdoor scenes are demonstrated.
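The displacement-accumulation scheme the abstract describes — summing a per-epoch displacement from an initial fix rather than resolving integer ambiguities — reduces to a simple integration loop. A minimal sketch, with assumed names, not the authors' implementation:

```python
# Hedged sketch of carrier-phase position accumulation: each epoch's
# differential single-difference solution yields a small displacement
# vector, which is summed from a known initial position.
def accumulate_position(initial_pos, displacements):
    """Integrate per-epoch displacement vectors (metres) from an initial fix."""
    x, y, z = initial_pos
    track = [(x, y, z)]
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
        track.append((x, y, z))
    return track

# Example: three epochs of 10 cm eastward motion from the origin.
path = accumulate_position((0.0, 0.0, 0.0), [(0.1, 0.0, 0.0)] * 3)
```

The trade-off is that per-epoch errors also accumulate, which is why the paper models and compensates the resulting drift linearly.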
Citations: 13
The effect of registration error on tracking distant augmented objects
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637329
M. Livingston, Zhuming Ai
We conducted a user study of the effect of registration error on performance of tracking distant objects in augmented reality. Categorizing error by types that are often used as specifications, we hoped to derive some insight into the ability of users to tolerate noise, latency, and orientation error. We used measurements from actual systems to derive the parameter settings. We expected all three errors to influence users' ability to perform the task correctly and the precision with which they performed the task. We found that high latency had a negative impact on both performance and response time. While noise consistently interacted with the other variables, and orientation error increased user error, the differences between “high” and “low” amounts were smaller than we expected. Results of users' subjective rankings of these three categories of error were surprisingly mixed. Users believed noise was the most detrimental, though statistical analysis of performance refuted this belief. We interpret the results and draw insights for system design.
Citations: 40
3D fiducials for scalable AR visual tracking
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637357
J. Steinbis, W. Hoff, T. Vincent
A new vision and inertial pose estimation system was implemented for real-time handheld augmented reality (AR). A sparse set of 3D cone fiducials is used for scalable indoor/outdoor tracking, as opposed to traditional planar patterns. The cones are easy to segment and have a large working volume, which makes them more suitable for many applications. The pose estimation system receives measurements from the camera and IMU at 30 Hz and 100 Hz respectively. With a dual-core workstation, all measurements can be processed in real-time to update the pose of virtual graphics within the AR display.
Citations: 9
Efficiency of techniques for mixed-space collaborative navigation
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637356
A. Stafford, B. Thomas, W. Piekarski
This paper describes the results of a study conducted to determine the efficiency of visual cues for a collaborative navigation task in a mixed-space environment. The task required a user with an exocentric view of a virtual room to navigate a fully immersed user with an egocentric view to an exit. The study compares natural hand-based gestures, a mouse-based interface and an audio-only technique to determine their relative efficiency on task completion times. The results show that visual cue-based collaborative navigation techniques are significantly more efficient than an audio-only technique.
Citations: 11
User evaluation of see-through vision for mobile outdoor augmented reality
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637327
Ben Avery, B. Thomas, W. Piekarski
We have developed a system built on our mobile AR platform that provides users with see-through vision, allowing visualization of occluded objects textured with real-time video information. We present a user study that evaluates the user's ability to view this information and understand the appearance of an outdoor area occluded by a building while using a mobile AR computer. This understanding was compared against a second group of users who watched video footage of the same outdoor area on a regular computer monitor. The comparison found an increased accuracy in locating specific points from the scene for the outdoor AR participants. The outdoor participants also displayed more accurate results, and showed better speed improvement than the indoor group when viewing more than one video simultaneously.
Citations: 40
Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR
Pub Date : 2008-09-15 DOI: 10.1109/ismar.2008.4637362
Feng Zhou, H. Duh, M. Billinghurst
Although Augmented Reality technology was first developed over forty years ago, there has been little survey work giving an overview of recent research in the field. This paper reviews the ten-year development of the work presented at the ISMAR conference and its predecessors with a particular focus on tracking, interaction and display research. It provides a roadmap for future augmented reality research which will be of great value to this relatively young field, and also for helping researchers decide which topics should be explored when they are beginning their own studies in the area.
Citations: 1107
Haptically extended augmented prototyping
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637350
Mariza Dima, D. Arvind, John R. Lee, Mark Wright
This project presents a new display concept, which brings together haptics, augmented and mixed reality, and tangible computing within the context of an intuitive conceptual design environment. The project extends the paradigm of augmented prototyping by allowing modelling of virtual geometry on the physical prototype, which can be touched by means of a haptic device. Wireless tracking of the physical prototype is achieved in three different ways, by attaching to it a 'Speck', a tracker, or a Nintendo Wii Remote, and it provides continuous tangible interaction. The physical prototype becomes a tangible interface augmented with mixed reality and with a novel 3D haptic design system.
Citations: 4
Multiple 3D Object tracking for augmented reality
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637336
Youngmin Park, V. Lepetit, Woontack Woo
We present a method that is able to track several 3D objects simultaneously, robustly, and accurately in real-time. While many applications need to consider more than one object in practice, the existing methods for single object tracking do not scale well with the number of objects, and a proper way to deal with several objects is required. Our method combines object detection and tracking: Frame-to-frame tracking is less computationally demanding but is prone to fail, while detection is more robust but slower. We show how to combine them to take the advantages of the two approaches, and demonstrate our method on several real sequences.
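The detection/tracking combination the abstract describes — cheap frame-to-frame tracking, with slower but more robust detection for initialization and recovery — can be sketched as a simple control loop. Names and interfaces here are hypothetical, not the authors' API:

```python
# Illustrative sketch of the detect-then-track pattern: prefer fast
# frame-to-frame tracking when a previous pose exists, and fall back
# to robust (but slower) detection on startup or tracking failure.
def process_frames(frames, detect, track):
    """Return (frame_index, pose) pairs, preferring tracking over detection."""
    poses = []
    prev_pose = None
    for i, frame in enumerate(frames):
        pose = track(frame, prev_pose) if prev_pose is not None else None
        if pose is None:            # not initialized, or tracking lost
            pose = detect(frame)    # robust but slower re-detection
        poses.append((i, pose))
        prev_pose = pose
    return poses

# Toy stand-ins: tracking "fails" on frame 2, forcing a re-detection.
detect = lambda f: ("detected", f)
track = lambda f, prev: None if f == 2 else ("tracked", f)
result = process_frames(range(4), detect, track)
```

Per tracked object, the same loop applies independently, which is why per-frame tracking cost must stay low as the object count grows.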
Citations: 135
Virtual redlining for civil engineering in real environments
Pub Date : 2008-09-15 DOI: 10.1109/ISMAR.2008.4637332
Gerhard Schall, Erick Méndez, D. Schmalstieg
Field workers of utility companies are regularly engaged in outdoor tasks such as network planning and inspection of underground infrastructure. Redlining is the term used for manually annotating either printed paper maps or a 2D geographic information system on a notebook computer taken to the field. Either of these approaches requires finding the physical location to be annotated on the physical or digital map. In this paper, we describe a mobile Augmented Reality (AR) system capable of supporting virtual redlining. The AR visualization delivered by the system is constructed from data directly extracted from a GIS used in day-to-day production by utility companies. We also report on encouraging trials and interviews performed with professional field workers from the utility sector.
Citations: 49