The Globefish: A 3D Motion Controller
Alexander Kulik, Jan Hochstrate, André Kunert, B. Fröhlich. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811064

The Globefish is a novel desktop input device for three-dimensional object manipulation and viewpoint navigation. It consists of an elastically suspended 3D trackball, which provides a natural mapping for position-controlled 3D rotations. Rate-controlled 3D translations are performed by pushing the trackball in the appropriate direction. The device is manipulated with the fingertips, allowing precise interaction with virtual objects. The Globefish was designed for 3D graphics applications such as computer-aided design (CAD), digital content creation (DCC), and 3D games.
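The split between position-controlled rotation and rate-controlled translation can be sketched in a few lines. The following is a hypothetical illustration of that hybrid mapping, not the paper's implementation; the deadzone and gain values are assumptions:

```python
import math

# Illustrative sketch of a hybrid control mapping like the Globefish's
# (names and gains are assumptions, not from the paper): trackball rotation
# maps 1:1 to object rotation (position control), while displacement of the
# elastic suspension maps to translation velocity (rate control).

DEADZONE = 0.5   # mm of displacement ignored as sensor noise (assumed)
GAIN = 20.0      # mm/s of object velocity per mm of displacement (assumed)

def translation_velocity(displacement):
    """Map elastic-suspension displacement (mm, per axis) to a
    rate-controlled translation velocity (mm/s, per axis)."""
    velocity = []
    for d in displacement:
        mag = abs(d)
        if mag <= DEADZONE:
            velocity.append(0.0)  # inside the deadzone: no motion
        else:
            # velocity grows with displacement beyond the deadzone
            velocity.append(math.copysign(GAIN * (mag - DEADZONE), d))
    return velocity

def apply_rotation(object_rotation, trackball_delta):
    """Position control: an incremental trackball rotation is applied
    directly to the object (rotations shown here as Euler-angle deltas
    for brevity; a real implementation would use quaternions)."""
    return [o + t for o, t in zip(object_rotation, trackball_delta)]
```

Releasing the ball returns it to its rest pose, so translation stops automatically — the usual argument for rate control on an elastic device.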
Virtual vs. Real-World Pointing in Two-User Scenarios
Holger Salzmann, Mathias Moehring, B. Fröhlich. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811011

We are investigating the utility of a projection-based stereoscopic two-user system for applications in the automotive industry. In this paper we compare real-world pointing to pointing at virtual objects in a two-user scenario. Our study investigated the following situation: one user points at an object while the second person has to guess the referenced object. Our results indicate that pointing at objects in a virtual world is much less precise than pointing in the real world, even for a well-calibrated stereoscopic two-user system. Pointing techniques such as outlining an object or pointing from a distance produce more errors than the corresponding real-world techniques, but they are less error-prone than interactions that require touching a virtual object. Our findings are a first step toward qualifying direct interaction techniques in a multi-user projection-based system.
A Distributed Cooperative Framework for Continuous Multi-Projector Pose Estimation
Tyler Johnson, G. Welch, H. Fuchs, E. Force, H. Towles. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4810996

We present a novel calibration framework for multi-projector displays that achieves continuous geometric calibration by estimating and refining the poses of all projectors in an ongoing fashion during actual display use. Our framework provides scalability by operating as a distributed system of "intelligent" projector units: projectors augmented with rigidly mounted cameras and paired with dedicated computers. Each unit interacts asynchronously with its peers, leveraging their combined computational power to cooperatively estimate the poses of all of the projectors. In cases where the projection surface is static, our system is able to continuously refine all of the projector poses, even when they change simultaneously.
Odor Presentation with a Vivid Sense of Reality: Incorporating Fluid Dynamics Simulation into Olfactory Display
H. Matsukura, Hitoshi Yoshida, H. Ishida, A. Saitoh, T. Nakamoto. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811062

We demonstrate the results of our current multidisciplinary research on the development of an olfactory display system. To present odors with a vivid sense of reality, we propose using computational fluid dynamics (CFD) simulation in conjunction with the olfactory display system. In this demonstration, an odor is released in synchronization with a movie clip using the olfactory display. A CFD solver calculates the turbulent airflow field in the given environment and the diffusion and advection of odor molecules from their source. The olfactory display system generates an odor with the concentration determined by the calculated odor distribution. The user takes the role of a small animal slowly walking through a virtual room and experiences the spread of the odor in the room.
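The diffusion/advection process driving the odor distribution can be illustrated with a toy transport equation. The sketch below is not the authors' CFD solver — it is a minimal 1D explicit finite-difference step for dc/dt = -u dc/dx + D d2c/dx2, with illustrative parameter values:

```python
# Toy sketch of the advection-diffusion process the abstract describes
# (not the authors' CFD solver): odor concentration c is carried by a
# flow velocity u and spreads with diffusivity D. One explicit Euler
# step on a 1D grid, using an upwind difference for the advection term.

def advect_diffuse_step(c, u=0.1, D=0.01, dx=1.0, dt=0.5):
    """One step of dc/dt = -u*dc/dx + D*d2c/dx2; endpoints held fixed."""
    n = len(c)
    new_c = c[:]
    for i in range(1, n - 1):
        advection = -u * (c[i] - c[i - 1]) / dx            # upwind difference
        diffusion = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new_c[i] = c[i] + dt * (advection + diffusion)
    return new_c

# A pulse of odor released at one grid cell drifts downwind and spreads:
field = [0.0] * 10
field[2] = 1.0
for _ in range(20):
    field = advect_diffuse_step(field)
```

The chosen time step keeps the scheme stable (u*dt/dx and D*dt/dx² are both well below the explicit-scheme limits), so concentrations stay non-negative as the pulse spreads.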
Immersive Rear Projection on Curved Screens
A. Kolb, M. Lambers, S. Todt, Nicolas Cuntz, C. Rezk-Salama. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811057

We present a new VR installation at the University of Siegen, Germany. It consists of a 180° cylindrical rear-projection screen and a front-projection floor, allowing both immersive VR applications with user tracking and convincing presentations for a larger audience.
Virtual Reality in Physical Mirrors
Seunghyun Woo, Takafumi Aoki, Hironori Mitake, N. Hashimoto, Makoto Sato. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811034

Mirrors can sometimes produce illusions that appear to defy physical laws. In our approach, such illusions become "real" as the visual, tactile, and auditory senses are immersed in the world inside the mirror. Our method allows the user to experience a mirror illusion through three modalities of feedback (visual, haptic, and auditory) and to perceive a seamless transition between the real world and the world inside the mirror. This approach is expected to open new possibilities for using mirrors in media art and virtual reality.
A Global Illumination and BRDF Solution Applied to Photorealistic Augmented Reality
Saulo A. Pessoa, G. Moura, J. P. Lima, V. Teichrieb, J. Kelner. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811036

This paper presents a solution for the photorealistic rendering of synthetic objects in dynamic real scenes for Augmented Reality applications. To achieve this goal, an image-based lighting approach is proposed in which environment maps with different levels of glossiness are generated for each virtual object in the scene at every frame. As a result, illumination effects such as color bleeding and specular reflections can be simulated consistently for virtual objects, even in the presence of scene changes. A unified sampling method is used for the spherical harmonics transformation pass; it is independent of the map format and does not need to apply a different weight to each sample. The technique is combined with an extended version of the Lafortune spatial BRDF, featuring the Fresnel effect and a novel tangent rotation parameterization. The solution is evaluated in several Augmented Reality case studies that also exploit features such as shadowing and lens effects.
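The claim that uniform sampling needs no per-sample weights can be illustrated with a generic Monte-Carlo spherical harmonics projection. This is a sketch of the general technique, not the authors' exact method; the `radiance` callback stands in for a lookup into an environment map of any format:

```python
import math
import random

# Sketch of a map-format-independent SH projection: draw directions
# uniformly on the sphere, so every sample carries the identical weight
# 4*pi/N regardless of how the environment map is stored. Only the first
# two SH bands are shown for brevity.

def uniform_direction(rng):
    """Uniform direction on the unit sphere via normalized Gaussians."""
    while True:
        x, y, z = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
        n = math.sqrt(x * x + y * y + z * z)
        if n > 1e-9:
            return x / n, y / n, z / n

def sh_basis(d):
    """First two real SH bands (4 coefficients) for direction d."""
    x, y, z = d
    return (0.282095,          # Y(0,0)
            0.488603 * y,      # Y(1,-1)
            0.488603 * z,      # Y(1,0)
            0.488603 * x)      # Y(1,1)

def project_sh(radiance, n_samples=20000, seed=1):
    rng = random.Random(seed)
    coeffs = [0.0] * 4
    for _ in range(n_samples):
        d = uniform_direction(rng)
        L = radiance(d)        # look up the environment map, any format
        for i, b in enumerate(sh_basis(d)):
            # uniform sphere sampling: the same weight for every sample
            coeffs[i] += L * b * 4.0 * math.pi / n_samples
    return coeffs

# A constant white environment projects to (nearly) only the DC term:
coeffs = project_sh(lambda d: 1.0)
```

With a lat-long map, by contrast, each texel would need a sin(theta) solid-angle weight; sampling directions uniformly moves that correction out of the projection pass entirely.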
Simulation of Augmented Reality Systems in Purely Virtual Environments
E. Ragan, Curtis Wilkes, D. Bowman, Tobias Höllerer. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811058

We propose using virtual environments to simulate augmented reality (AR) systems for experimentation and usability evaluation. This method allows complete control over the AR environment, providing many advantages over testing with true AR systems; we also discuss limitations of the simulation approach. We demonstrate the method in a proof-of-concept experiment that controlled the level of registration error in the AR scenario. In this experiment, we used the simulation to investigate the effects of registration error on task performance for a generic task involving precise motor control for AR object manipulation. By isolating jitter and latency errors, we provide empirical evidence of the relationship between accurate registration and task performance.
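The two error sources the experiment isolates, jitter and latency, are easy to inject in a simulated pipeline. The sketch below is an illustration of that idea, not the paper's implementation; the class name, noise level, and frame delay are assumptions:

```python
import random
from collections import deque

# Illustrative sketch of injecting controlled registration error into a
# simulated AR display (parameters are assumptions, not from the paper):
# jitter is additive noise on the tracked position, latency is modeled by
# replaying the pose from a fixed number of frames in the past.

class RegistrationErrorSimulator:
    def __init__(self, jitter_std=0.002, latency_frames=5, seed=7):
        self.jitter_std = jitter_std                 # meters of positional noise
        self.history = deque(maxlen=latency_frames + 1)
        self.rng = random.Random(seed)

    def degrade(self, true_pos):
        """Return the pose the simulated AR display would register."""
        self.history.append(true_pos)
        delayed = self.history[0]                    # oldest pose = latency
        return tuple(p + self.rng.gauss(0, self.jitter_std) for p in delayed)

# A tracked point moving along x: the displayed pose lags and jitters.
sim = RegistrationErrorSimulator()
shown = [sim.degrade((0.1 * t, 0.0, 0.0)) for t in range(10)]
```

Because both error sources are parameters of the simulator rather than properties of real hardware, each can be varied independently per condition — the control a physical AR system cannot offer.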
Efficient Large-Scale Sweep and Prune Methods with AABB Insertion and Removal
Daniel J. Tracy, S. Buss, Bryan M. Woods. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811022

We introduce new features for the sweep-and-prune broad-phase algorithm that increase scalability for large virtual reality environments and allow efficient AABB insertion and removal, supporting dynamic object creation and destruction. We introduce a novel segmented interval list structure that allows AABB insertion and removal without requiring a full sort of the axes. The algorithm is well suited to large environments in which many objects are not moving at once. We analyze and test implementations of sweep and prune that include subdivision, batch insertion and removal, and segmented interval lists. Our tests show that these techniques outperform previous sweep-and-prune methods and perform better than octrees in temporally coherent environments.
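For readers unfamiliar with the baseline the paper extends, here is a textbook single-axis sweep and prune — not the paper's segmented interval lists. Endpoints are sorted along one axis and swept; a full broad phase would require overlap on all three axes:

```python
# Minimal single-axis sweep and prune (the classic baseline, not the
# paper's segmented interval lists): sort AABB interval endpoints along
# one axis, then sweep, maintaining the set of currently open intervals.

def sweep_and_prune_1d(boxes):
    """boxes: dict of name -> (min_x, max_x). Returns the set of
    overlapping pairs along this axis."""
    events = []
    for name, (lo, hi) in boxes.items():
        events.append((lo, 0, name))   # 0 = interval opens
        events.append((hi, 1, name))   # 1 = interval closes
    events.sort()                      # ties: opens sort before closes,
                                       # so touching intervals count as overlapping

    active, pairs = set(), set()
    for _, kind, name in events:
        if kind == 0:
            for other in active:       # every open interval overlaps this one
                pairs.add(frozenset((name, other)))
            active.add(name)
        else:
            active.discard(name)
    return pairs

boxes = {"a": (0.0, 2.0), "b": (1.0, 3.0), "c": (5.0, 6.0)}
pairs = sweep_and_prune_1d(boxes)
```

The cost the paper attacks is visible here: inserting or removing a box naively forces a re-sort of the event list, which is exactly what the segmented interval list structure avoids.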
Explosion Diagrams in Augmented Reality
Denis Kalkofen, Markus Tatzgern, D. Schmalstieg. 2009 IEEE Virtual Reality Conference. DOI: 10.1109/VR.2009.4811001

This article introduces explosion diagrams to Augmented Reality (AR) applications. It presents algorithms for seamlessly integrating an object's explosion diagram into a real-world environment, including AR rendering of relocated parts textured with live video and restoration of the visual information hidden behind them. It demonstrates several types of visualizations for convincing AR explosion diagrams and discusses visualizations of exploded parts as well as visual links conveying their relocation direction. Furthermore, we show the integration of our rendering and visualization techniques into an AR framework that automatically computes a diagram's layout and an animation of its corresponding explosion.