Frank Gommlich, Guido Heumer, Arnd Vitzthum, B. Jung. "Simulation of Standard Control Actuators in Dynamic Virtual Environments." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811049

Realistic behavior of control actuators is important for virtual prototyping applications. We present a systematic approach for modeling such articulated components as described in the European Standard EN 894-3. Control actuators may have several rotational and translational degrees of freedom (DOFs), possibly with discrete lock states. During user interactions, information about the actuators' manipulation is collected and made available to higher application layers in the form of interaction events. This allows recording and playback of demonstrated manipulation sequences for purposes such as ergonomics evaluations involving virtual humans. The framework uses XML for declaration and is implemented on top of a freely available physics engine.
H. Matsukura, Hitoshi Yoshida, H. Ishida, A. Saitoh, T. Nakamoto. "Odor Presentation with a Vivid Sense of Reality: Incorporating Fluid Dynamics Simulation into Olfactory Display." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811062

The results of our current multidisciplinary research efforts on the development of an olfactory display system are demonstrated. To present odors with a vivid sense of reality, we propose to use computational fluid dynamics (CFD) simulation in conjunction with the olfactory display system. In this demonstration, an odor is released with a movie clip using the olfactory display. A CFD solver is employed to calculate the turbulent airflow field in the given environment and the diffusion and advection of odor molecules from their source. The olfactory display system generates an odor with the concentration determined by the calculated odor distribution. The user is assumed to be a small animal slowly walking through a virtual room, and experiences the spread of the odor in the room.
Alexander Kulik, Jan Hochstrate, André Kunert, B. Fröhlich. "The Globefish: A 3D Motion Controller." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811064

The Globefish is a novel desktop input device for three-dimensional object manipulation and viewpoint navigation. It consists of an elastically suspended 3D trackball, which provides a natural mapping for position-controlled 3D rotations. Rate-controlled 3D translations are performed by pushing the trackball in the appropriate direction. The device is manipulated by the fingertips, allowing for precise interaction with virtual objects. The Globefish was designed for 3D graphics applications such as computer-aided design (CAD), digital content creation (DCC), and 3D games.
Tyler Johnson, G. Welch, H. Fuchs, E. Force, H. Towles. "A Distributed Cooperative Framework for Continuous Multi-Projector Pose Estimation." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4810996

We present a novel calibration framework for multi-projector displays that achieves continuous geometric calibration by estimating and refining the poses of all projectors in an ongoing fashion during actual display use. Our framework provides scalability by operating as a distributed system of "intelligent" projector units: projectors augmented with rigidly mounted cameras and paired with dedicated computers. Each unit interacts asynchronously with its peers, leveraging their combined computational power to cooperatively estimate the poses of all of the projectors. In cases where the projection surface is static, our system is able to continuously refine all of the projector poses, even when they change simultaneously.
E. Ragan, Curtis Wilkes, D. Bowman, Tobias Höllerer. "Simulation of Augmented Reality Systems in Purely Virtual Environments." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811058

We propose the use of virtual environments to simulate augmented reality (AR) systems for the purposes of experimentation and usability evaluation. This method allows complete control of the AR environment, providing many advantages over testing with true AR systems. We also discuss some of the limitations of the simulation approach. We demonstrate the use of such a simulation in a proof-of-concept experiment controlling the levels of registration error in the AR scenario. In this experiment, we used the simulation method to investigate the effects of registration error on task performance for a generic task involving precise motor control for AR object manipulation. By isolating jitter and latency errors, we provide empirical evidence of the relationship between accurate registration and task performance.
Denis Kalkofen, Markus Tatzgern, D. Schmalstieg. "Explosion Diagrams in Augmented Reality." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811001

This article introduces explosion diagrams to Augmented Reality (AR) applications. It presents algorithms to seamlessly integrate an object's explosion diagram into a real-world environment, including the AR rendering of relocated objects textured with live video and the restoration of visual information hidden behind relocated objects. It demonstrates several types of visualizations for convincing AR explosion diagrams and discusses visualizations of exploded parts as well as visual links conveying their relocation direction. Furthermore, we show the integration of our rendering and visualization techniques in an AR framework, which is able to automatically compute a diagram's layout and an animation of its corresponding explosion.
Seunghyun Woo, Takafumi Aoki, Hironori Mitake, N. Hashimoto, Makoto Sato. "Virtual Reality in Physical Mirrors." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811034

Mirrors can produce illusions that seem to defy physical laws. With our approach, these illusions become "real" as the visual, tactile, and auditory senses are immersed in the world inside the mirror. Our method allows the user to experience a mirror illusion through three modalities of feedback (visual, haptic, and auditory) and to perceive a seamless transition between the real world and the world inside the mirror. This approach is expected to open new possibilities for using mirrors in the fields of media art and virtual reality.
A. Kolb, M. Lambers, S. Todt, Nicolas Cuntz, C. Rezk-Salama. "Immersive Rear Projection on Curved Screens." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811057

We present a new VR installation at the University of Siegen, Germany. It consists of a 180° cylindrical rear-projection screen and a front-projection floor, allowing both immersive VR applications with user tracking and convincing presentations for a larger audience.
Saulo A. Pessoa, G. Moura, J. P. Lima, V. Teichrieb, J. Kelner. "A Global Illumination and BRDF Solution Applied to Photorealistic Augmented Reality." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811036

This paper presents a solution for the photorealistic rendering of synthetic objects in dynamic real scenes for Augmented Reality applications. To achieve this goal, an Image-Based Lighting approach is proposed in which environment maps with different levels of glossiness are generated for each virtual object in the scene at every frame. As a result, illumination effects such as color bleeding and specular reflections can be simulated consistently for virtual objects, even in the presence of scene changes. A unified sampling method for the spherical harmonics transformation pass is also used; it is independent of the map format and does not need to apply different weights to each sample. The developed technique is combined with an extended version of the Lafortune spatial BRDF, featuring a Fresnel term and a novel tangent rotation parameterization. The solution is evaluated in several Augmented Reality case studies, in which other features such as shadowing and lens effects are also exploited.
Robert J. Teather, Andriy Pavlovych, W. Stuerzlinger. "Effects of Latency and Spatial Jitter on 2D and 3D Pointing." 2009 IEEE Virtual Reality Conference, March 2009. doi:10.1109/VR.2009.4811029

We investigate the effects of input device latency and spatial jitter on 2D pointing tasks and a 3D movement task. First, we characterize jitter and latency in a 3D tracking device and in an optical mouse used for baseline comparison. We then present an experiment based on ISO 9241-9, which measures the performance of pointing devices: we added latency and jitter to the mouse and compared it to the 3D tracker. Results indicate that latency has a stronger effect on performance than small amounts of spatial jitter. A second experiment found that erratic jitter "spikes" can also affect 3D movement performance.