Measuring the effect of gaming experience on virtual environment navigation tasks
Shamus P. Smith, Sam Du'Mont
2009 IEEE Symposium on 3D User Interfaces · Pub Date: 2009-03-14 · DOI: 10.1109/3DUI.2009.4811198

Virtual environments are synthetic 3D worlds typically viewed from a first-person point of view with many potential applications within areas such as visualisation, entertainment and training simulators. To effectively develop and utilise virtual environments, user-centric evaluations are commonly performed. Anecdotal evidence suggests that factors such as prior experience with computer games may affect the results of such evaluations. This paper examines the effects of previous computer gaming experience, user-perceived gaming ability and actual gaming performance on navigation tasks in a virtual environment. Two computer games and a virtual environment were developed to elicit performance metrics for use within a user study. Results indicated that perceived gaming skill and progress in a first-person shooter (FPS) game were the most consistent metrics showing significant correlations with performance in time-based navigation tasks. There was also strong evidence that these relations were significantly intensified by the inclusion of participants who play FPS games. In addition, it was found that increased gaming experience decreased spatial perception performance.
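The study's core analysis, correlating each gaming metric with navigation task time, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names and the choice of Pearson's r are assumptions (the abstract does not specify the correlation test).

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def metric_correlations(gaming_metrics, completion_times):
    """Correlate each gaming metric with per-participant navigation
    completion times; a negative r means higher skill/progress went
    with faster task completion."""
    return {name: pearson_r(scores, completion_times)
            for name, scores in gaming_metrics.items()}
```

For example, perceived-skill ratings that rise exactly as completion times fall would yield r = -1, the direction of effect the paper reports for its FPS metrics.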
Poster: MR telepresence system with inertial force sensation using a motion platform and an immersive display
Maiya Hori, M. Kanbara, N. Yokoya
2009 IEEE Symposium on 3D User Interfaces · Pub Date: 2009-03-14 · DOI: 10.1109/3DUI.2009.4811222

This paper presents a telepresence system for a ride, such as on a roller coaster, using a motion platform that can provide a seated user with the sensation of inertial force. Most conventional studies using a motion platform with a few degrees of freedom have not generated an inertial force when a ride accelerates, because a motion platform cannot simulate the same motion as a real roller coaster. We propose a new telepresence system that can provide a user with an inertial force sensation using a motion platform with a few degrees of freedom and an immersive display. In our research, the inertial force sensation is generated by the acceleration of gravity produced by inclining the motion platform. The inclination of the seated user is estimated from an image sequence captured using an omnidirectional camera placed on an actual running roller coaster. In our experiments, the inertial force sensation is realized using a motion platform and an immersive display.
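The abstract's key idea, substituting a gravity component for true inertial force by tilting the platform, reduces to a = g·sin(θ). A minimal sketch, assuming a hypothetical 30° mechanical tilt range (the paper does not state its platform's limits):

```python
import math

def tilt_angle_for_acceleration(a_long, g=9.81, max_tilt_deg=30.0):
    """Return the platform pitch (degrees) whose gravity component along
    the seat approximates a desired longitudinal acceleration a_long
    (m/s^2), clamped to the platform's mechanical range.
    Uses a = g * sin(theta), so theta = asin(a / g)."""
    ratio = max(-1.0, min(1.0, a_long / g))
    theta = math.degrees(math.asin(ratio))
    return max(-max_tilt_deg, min(max_tilt_deg, theta))
```

A desired acceleration of g/2 maps to a 30° incline; larger accelerations saturate at the platform limit, which is why a low-DOF platform can only approximate the real ride's motion.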
A direct manipulation interface for time navigation in scientific visualizations
M. Wolter, B. Hentschel, I. Tedjo-Palczynski, T. Kuhlen
2009 IEEE Symposium on 3D User Interfaces · Pub Date: 2009-03-14 · DOI: 10.1109/3DUI.2009.4811199

Scientific visualization tools are applied to gain understanding of time-varying simulations. When these simulations have a high temporal resolution or simulate a long time span, efficient navigation in the temporal dimension of the visualization is mandatory. For this purpose, we propose direct manipulation of visualization objects to control time. By dragging objects along their three-dimensional trajectory, a user can navigate in time by specifying spatial input. We propose two interaction techniques for different kinds of trajectories. In the design phase of these methods, we conducted expert evaluations. To show the benefits of the techniques, we compare them in a user study with the traditional slider-based interface.
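The drag-to-navigate mapping described above, spatial input along a trajectory selecting a time step, can be sketched as a nearest-sample lookup. This is a simplified assumption of how such a mapping might work; the paper's two actual techniques for different trajectory kinds are more elaborate:

```python
def time_from_drag(trajectory, cursor):
    """Map a 3D cursor position to a time step by snapping to the
    nearest sample of the dragged object's precomputed trajectory.
    trajectory: list of (x, y, z) positions, one per time step.
    cursor: (x, y, z) position of the dragging input device."""
    def sq_dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(range(len(trajectory)), key=lambda t: sq_dist(trajectory[t], cursor))
```

Dragging the object toward where it will be later in the simulation thus advances the global time, replacing the indirect slider with in-scene interaction.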
Poster: Evaluation of a cup-shaped interface in tabletop AR environments
Seiko Myojin, H. Kato, S. Nishida
2009 IEEE Symposium on 3D User Interfaces · Pub Date: 2009-03-14 · DOI: 10.1109/3DUI.2009.4811224

We have proposed MagicCup, a novel wand (3D mouse), for controlling virtual objects in a tabletop augmented reality (AR) environment. In this paper, we present an evaluation of its usability. MagicCup is a handheld input device that uses a real cup as a control interface. It differs from conventional wands in that it can be used to perform “covering” operations to interact with virtual objects. In order to evaluate its fundamental characteristics, we compared the “covering” interaction method with the corresponding methods of conventional wands, such as “pointing” and “touching.” MagicCup allows a user to select a virtual object by covering it with the cup and then holding it up. In contrast, pointing and touching wands allow a user to select an object by pointing at it and pressing a button or by touching it, respectively. Our experimental results indicated that the fundamental characteristics differed depending on the interaction methods. MagicCup was suitable for selecting one object from among several small, scattered objects. In contrast, the pointing and touching wands were suitable for selecting one object from among several small, closely packed objects. In addition, we describe the advantages of MagicCup and the other wands in detail.
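The "covering" selection described above can be sketched as a simple containment test on the table plane. This is a hypothetical reconstruction (the paper does not describe its selection logic at this level); the tie-breaking rule for multiple covered objects is my assumption:

```python
def covered_object(objects, cup_center, cup_radius):
    """Return the id of the single virtual object whose tabletop position
    lies under the cup's rim, or None if the cup covers zero or several
    objects (an ambiguous cover).
    objects: dict id -> (x, y) position on the table plane.
    cup_center: (x, y) of the cup on the table; cup_radius: rim radius."""
    inside = [oid for oid, (x, y) in objects.items()
              if (x - cup_center[0]) ** 2 + (y - cup_center[1]) ** 2
              <= cup_radius ** 2]
    return inside[0] if len(inside) == 1 else None
```

This test also suggests why the study found covering better for scattered objects than for closely packed ones: a rim-sized footprint over a dense cluster covers several candidates at once.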
Poster: A hybrid direct visual editing method for architectural massing study in virtual environments
Jing Chen, D. Bowman, D. Laidlaw
2009 IEEE Symposium on 3D User Interfaces · Pub Date: 2009-03-14 · DOI: 10.1109/3DUI.2009.4811227

We present a hybrid user interface and gesture-based direct visual editing techniques for quick and rough object creation and manipulation in three-dimensional (3D) virtual environments (VEs). The user interface includes a novel table-prop to resemble an architect's physical workbench. A tracked pinch glove and a stylus pen provide both rough and fingertip-level precise spatial input. For quick placement, objects do not float in space, but instead obey a set of constrained physics laws. Experimental results indicate our design is effective for architectural massing study. Our work contributes to the hardware system design and novel gesture-based interaction techniques, which have the potential to bring VEs into practical use in architecture.
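One plausible form of the "constrained physics laws" mentioned above is dropping a released massing block onto the ground plane or the tallest block beneath it, so nothing floats. This is purely an illustrative guess at such a constraint, not the paper's actual rule set:

```python
def drop_to_support(box, placed_boxes, ground_z=0.0):
    """Constrained placement: a released box falls onto the ground plane
    or onto the top of the highest placed box it overlaps in plan view.
    Boxes are axis-aligned dicts {'x','y','w','d','h'}; 'z' (bottom
    height) is computed here."""
    def overlaps(a, b):  # plan-view (footprint) overlap test
        return (a['x'] < b['x'] + b['w'] and b['x'] < a['x'] + a['w'] and
                a['y'] < b['y'] + b['d'] and b['y'] < a['y'] + a['d'])
    support = ground_z
    for other in placed_boxes:
        if overlaps(box, other):
            support = max(support, other['z'] + other['h'])
    box['z'] = support
    return box
```

Constraints like this trade physical generality for speed: every gesture ends in a valid, stable configuration, which suits rough massing study better than free 6-DOF placement.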
Demo: Teaching letter writing using a programmable haptic device interface for children with handwriting difficulties
Younhee Kim, Zoran Duric, N. L. Gerber, Arthur R. Palsbo, S. E. Palsbo
2009 IEEE Symposium on 3D User Interfaces · DOI: 10.1109/3DUI.2009.4811234

We designed a virtual handwriting teaching system for children with handwriting difficulties due to attention or motor deficits, using a haptic interface that can provide a neutral, repetitive, engaging approach to letter writing. Our approach includes: (a) letter primitives; (b) a user-friendly interface for teachers, therapists, subjects, and parents; (c) adjustable force and an assessment mode; and (d) quantitative reports.
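The "adjustable force" element above suggests a guidance force pulling the pen toward the current letter stroke. A minimal sketch under assumed parameters: the spring model, stiffness value, and force cap are illustrative choices, not details from the demo:

```python
import math

def guidance_force(pen, stroke, stiffness=0.5, max_force=3.0):
    """Spring-like 2D force (N) pulling the haptic pen toward the nearest
    sample of a letter-stroke polyline, capped at the device's force
    limit. Lowering `stiffness` weakens assistance as the child improves."""
    nearest = min(stroke, key=lambda p: (p[0] - pen[0]) ** 2 +
                                        (p[1] - pen[1]) ** 2)
    fx = stiffness * (nearest[0] - pen[0])
    fy = stiffness * (nearest[1] - pen[1])
    mag = math.hypot(fx, fy)
    if mag > max_force:  # respect the device's maximum output force
        fx, fy = fx * max_force / mag, fy * max_force / mag
    return fx, fy
```

The same pen-to-stroke distances that drive the force can be logged per session, which is one way the system's quantitative reports and assessment mode could be derived.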