International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications
Exploring humanistic intelligence through physiologically mediated reality
J. Fung, Steve Mann
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115110
We present a way of making the wearing of a lifelong electrocardiographic health monitor fun for the user. The health monitor is coupled with a reality mediator device to create physiologically mediated reality, i.e. mediated reality that alters the user's audiovisual perception of the world based upon their own electrocardiographic waveform. This creates an interesting audiovisual experience for the user, playing upon the poetic narrative of combining the cardio-centric metaphors pervasive in everyday life (the heart as a symbol of love and centrality, e.g. "get to the heart of the matter") with ubiquitous ocular-centric metaphors such as "see the world from my point of view". The experience is further enhanced by music, which alters the visual perception, heightens the user's emotional response and, in doing so, further affects their heart(beat).
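The coupling described above (an ECG waveform driving audiovisual mediation) can be sketched minimally: estimate heart rate from R-peak timestamps, then map it to a single visual parameter. The function names, the linear mapping, and the thresholds below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of physiologically mediated reality's control loop:
# ECG R-peak times -> heart rate -> a 0..1 gain that could drive, say,
# color saturation of the mediated view. All constants are illustrative.

def heart_rate_bpm(r_peak_times):
    """Estimate heart rate (beats per minute) from R-peak timestamps in seconds."""
    if len(r_peak_times) < 2:
        return None
    intervals = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    mean_rr = sum(intervals) / len(intervals)  # mean R-R interval
    return 60.0 / mean_rr

def saturation_gain(bpm, rest=60.0, max_bpm=180.0):
    """Map heart rate linearly onto a clamped 0..1 visual-effect gain."""
    t = (bpm - rest) / (max_bpm - rest)
    return max(0.0, min(1.0, t))

peaks = [0.0, 1.0, 2.0, 3.0]        # one beat per second
print(heart_rate_bpm(peaks))        # 60.0
print(saturation_gain(120.0))       # 0.5
```

A real system would of course run peak detection on the raw ECG and smooth the rate estimate; this only shows the shape of the physiological-signal-to-visual-parameter mapping.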
Visual marker detection and decoding in AR systems: a comparative study
Xiang Zhang, S. Fronz, Nassir Navab
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115078
Visual markers are widely used in existing augmented reality (AR) applications. In most such applications, the performance of an AR system depends heavily on the tracking system for visual marker detection, tracking, and pose estimation. Currently, more than one marker-based tracking/calibration system is available, so it is desirable for the user to know which marker tracking system is likely to perform best for a specific AR application. For this purpose, we compare several marker systems, all using planar square coded visual markers. We present the evaluation results, both qualitative and quantitative, for usability, efficiency, accuracy, and reliability. Each AR application has different marker detection and tracking requirements; therefore, the purpose of this work is not to rank existing marker systems. Instead, we analyze the strengths and weaknesses of various aspects of the marker tracking systems and provide AR application developers with this information.
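One step all the compared systems share is decoding a planar square coded marker: after rectifying the marker image, the payload is sampled as a bit grid, and the four-fold rotational ambiguity of a square must be resolved. A common trick, sketched below under that assumption (the paper does not describe any one system's scheme), is to take the rotation whose bit string encodes the smallest integer as the canonical ID.

```python
# Hypothetical sketch: rotation-invariant ID for a square coded marker's
# sampled bit grid. A real detector would first find the quad in the
# image and rectify it; here we start from the already-sampled grid.

def rotate(grid):
    """Rotate a square bit grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def marker_id(grid):
    """Return the smallest integer encoded by any of the grid's 4 rotations."""
    best = None
    g = grid
    for _ in range(4):
        bits = int("".join(str(b) for row in g for b in row), 2)
        best = bits if best is None else min(best, bits)
        g = rotate(g)
    return best

g = [[1, 0],
     [0, 0]]
print(marker_id(g))  # 1 -- and the same ID for any rotation of g
```

Real systems additionally reserve bits for error detection so that a misread grid is rejected rather than decoded to a wrong ID.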
A concept for the application of augmented reality in manual gas metal arc welding
P. Tschirner, B. Hillers, A. Gräser
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115098
The difficulty of producing manual welds of consistently high quality stems from the lack of optical information during the actual welding process. Owing to the extreme brightness conditions in arc welding and the use of protective glasses, even experienced welders can hardly recognize details of the welding process and the environment. This paper describes a new research project for the development of a support system for the welder.
Testable design representations for mobile augmented reality authoring
C. Geiger, V. Paelke, C. Reimann, W. Rosenbach, Jörg Stöcklein
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115113
This paper applies the idea of a continuously testable design representation to the authoring of augmented realities for mobile devices.
MR Platform: a basic body on which mixed reality applications are built
Shinji Uchiyama, Kazuki Takemoto, K. Satoh, Hiroyuki Yamamoto, H. Tamura
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115095
This paper describes a platform package, called "MR Platform," which we have been implementing for research and development of augmented reality technology and applications. The package includes a parallax-less stereo video see-through HMD and a software development kit (SDK) for a Linux PC environment. The SDK comprises a C++ class library for building runtime MR applications and related utilities such as a camera calibration tool. Using the SDK, the following functions are available: capturing video, handling a six-degree-of-freedom (6-DOF) sensor, image processing such as color detection, estimating head position and orientation, displaying the real-world image, and calibrating the sensor placement and the camera parameters of the two cameras mounted on the HMD.
ARVIKA-augmented reality for development, production and service
Wolfgang Friedrich
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115059
Augmented reality (AR) is a form of human-machine interaction where information is presented in the field of view of an individual. ARVIKA, funded by the German Ministry of Education and Research, develops this technology and its applications in development, production, and service for the automotive and aerospace industries, for power and processing plants, and for machine tools and production machinery. Up to now, AR has only been a subject of individual research projects and a small number of application-specific industrial projects on a global scale. The current state of the art and the available appliances do not yet permit a product-oriented application of the technology. However, AR enables a new, innovative form of human-machine interaction that not only places the individual at the center of the industrial workflow, but also offers high potential for process and quality improvements in production and process workflows. ARVIKA is primarily designed to implement an augmented reality system for mobile use in industrial applications. The report presents the milestones achieved after a project duration of a full three years.
A flexible tracking concept applied to medical scenarios using an AR window
B. Schwald, H. Seibert, Tanja Weller
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115102
This paper presents an approach that uses a semitransparent display as a kind of window into a patient in the context of medical augmented reality (AR) applications. Besides presenting this custom-built (non-off-the-shelf) display, the work focuses on the tracking aspects of such an application. To allow real objects to be augmented by virtual ones on the display, the user (i.e. the physician), the display, the object (i.e. the patient), and optional instruments all have to be tracked. Where required, a tracking system consisting of more than one subsystem, e.g. optical tracking combined with electromagnetic tracking, is used to satisfy all the needs of such a medical application.
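Combining subsystems as described above raises the question of how to merge their pose estimates. A minimal sketch, under assumptions not taken from the paper: prefer the optical estimate while it has line of sight, blend in the electromagnetic one with a fixed weight, and fall back to it entirely during occlusion. Weights and names are illustrative.

```python
# Hypothetical sketch of fusing two tracking subsystems (optical +
# electromagnetic) into one position estimate. A production system would
# use a proper filter (e.g. Kalman) over full 6-DOF poses; this only
# shows the fallback-and-blend idea.

def fuse_position(optical, magnetic, optical_visible, w_opt=0.9):
    """Blend two 3-D position estimates (tuples); fall back to the
    electromagnetic estimate when the optical markers are occluded."""
    if not optical_visible:
        return magnetic
    return tuple(w_opt * o + (1.0 - w_opt) * m
                 for o, m in zip(optical, magnetic))

print(fuse_position((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), True))   # (0.9, 0.0, 0.0)
print(fuse_position((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), False))  # (0.0, 0.0, 0.0)
```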
Model-based visual tracking for outdoor augmented reality applications
R. Behringer, Jun Park, V. Sundareswaran
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115111
Outdoor augmented reality (AR) applications rely on hybrid tracking (GPS, digital compass, visual) for registration. RSC has developed a real-time visual tracking system that uses visual cues from buildings in an urban environment to correct the results of a conventional tracking system. The approach relies on knowledge of a CAD model of the building and provides not only motion estimation but also absolute orientation and position. It is based on the "visual servoing" approach originally developed for robotics tasks. We have demonstrated the approach in real time on a building on the NRL campus. This poster shows the approach and results. The concept generalizes to any scenario where a CAD model is available. The system is being prepared for integration into the NRL Battlefield Augmented Reality System (BARS).
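The visual-servoing idea the poster builds on can be illustrated in one dimension: iteratively correct a drifting orientation estimate until a known model point's predicted image position matches its observed position. The yaw-only camera, pinhole projection, and unit gain below are illustrative simplifications, not the poster's actual formulation.

```python
# Hypothetical 1-DOF sketch of visual servoing for orientation correction:
# drive the yaw estimate so that a landmark at a known bearing projects
# to where the camera actually observes it.
import math

def predicted_x(yaw, bearing, focal=500.0):
    """Image x-coordinate of a landmark at a known bearing, given camera yaw."""
    return focal * math.tan(bearing - yaw)

def servo_yaw(observed_x, bearing, yaw0=0.0, focal=500.0, gain=1.0, steps=50):
    """Proportional visual-servoing update on the yaw estimate."""
    yaw = yaw0
    for _ in range(steps):
        error = predicted_x(yaw, bearing, focal) - observed_x
        yaw += gain * error / focal   # approximate inverse image Jacobian
    return yaw

# If the true yaw is 0.1 rad, the servo recovers it from the observation:
observed = predicted_x(0.1, bearing=0.3)
print(round(servo_yaw(observed, bearing=0.3), 6))  # 0.1
```

The real system does the analogous update over full 6-DOF pose using line features of the CAD building model, which is what makes the correction absolute rather than incremental.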
Spacedesign: a mixed reality workspace for aesthetic industrial design
M. Fiorentino, R. Amicis, G. Monno, A. Stork
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115077
Spacedesign is an innovative mixed reality (MR) application for the aesthetic design of free-form curves and surfaces. It is a unique and comprehensive approach that uses task-specific configurations to support the design workflow from concept to mock-up evaluation and review. First-phase conceptual design benefits from a workbench-like 3-D display for freehand sketching, surfacing, and engineering visualization. Semitransparent stereo glasses augment the pre-production physical prototype with additional shapes, textures, and annotations. Both workspaces share a common interface and allow collaboration and cooperation between different experts, who can configure the system for the specific task. A faster design workflow and CAD data consistency are thus naturally achieved. Tests and collaborations with designers, mainly from the automotive industry, are providing systematic feedback for this ongoing research. To the best of the authors' knowledge, no comparable approach integrates the creation and editing of 3D curves and surfaces in virtual and augmented reality (VR/AR); herein we see the major contribution of our new application.
Diminishing head-mounted display for shared mixed reality
Masayuki Takemura, Y. Ohta
Pub Date: 2002-09-30 | DOI: 10.1109/ISMAR.2002.1115084
We propose a new scheme to recover eye contact between multiple users in a shared mixed-reality space. Eye contact in a shared mixed-reality space is lost as a side effect of wearing head-mounted displays (HMDs). We synthesize facial images in real time, with arbitrary poses and eye expressions, using several photographs of the user. These face images are overlaid to diminish the HMD in the partner's view and thereby recover eye contact. This paper presents the basic idea, the facial image synthesis, and an experimental system for diminishing the HMD.
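The final overlay step of such a scheme, compositing the synthesized face over the HMD region of the partner's view, reduces to per-pixel alpha blending. The sketch below assumes grayscale images as nested lists and a mask marking rendered-face pixels; these representations are illustrative, not the paper's.

```python
# Hypothetical sketch of the "diminishing" overlay: alpha-composite a
# synthesized face patch into the camera view wherever the mask says a
# face pixel was rendered (mask 1.0 = fully replace, 0.0 = keep view).

def composite(view, face, mask):
    """Per-pixel blend: result = (1 - a) * view + a * face."""
    return [[(1.0 - a) * v + a * f
             for v, f, a in zip(vr, fr, mr)]
            for vr, fr, mr in zip(view, face, mask)]

view = [[10.0, 10.0], [10.0, 10.0]]      # camera image (HMD visible)
face = [[200.0, 200.0], [200.0, 200.0]]  # synthesized face pixels
mask = [[1.0, 0.0], [0.5, 0.0]]          # soft edge at 0.5
print(composite(view, face, mask))       # [[200.0, 10.0], [105.0, 10.0]]
```

The hard part the paper addresses is upstream of this: synthesizing a face patch whose pose and eye expression match the tracked HMD, so that the blended result reads as genuine eye contact.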