Head-coupled virtual reality systems can cause symptoms of sickness (cybersickness). A study was conducted to investigate the effects of scene oscillations on the level and types of cybersickness. Sixteen male subjects participated in the experiments. They were exposed to four 20-minute virtual simulation sessions, in a balanced order with 10 days' separation between sessions. The four sessions exposed the subjects to the same visual scene oscillating about different axes: pitch, yaw, roll, and a no-oscillation control (speed: 30°/s, range: ±60°). Verbal ratings of nausea level were taken at 5-minute intervals, and sickness symptoms were measured before and after exposure using the Simulator Sickness Questionnaire (SSQ). Significant differences were found between the no-oscillation condition and the oscillating conditions. With scene oscillation, nausea ratings increased significantly after 5 minutes of exposure for all oscillation axes (pitch, yaw, and roll). Total sickness scores were obtained from the SSQ, and their profiles for the different scene oscillation axes are presented.
{"title":"Cybersickness: an experimental study to isolate the effects of rotational scene oscillations","authors":"R. So, W. Lo","doi":"10.1109/VR.1999.756957","DOIUrl":"https://doi.org/10.1109/VR.1999.756957","url":null,"abstract":"Head-coupled virtual reality systems can cause symptoms of sickness (cybersickness). A study has been conducted to investigate the effects of scene oscillations on the level and types of cybersickness. Sixteen male subjects participated in the experiments. They were exposed to four 20-minute virtual simulation sessions, in a balanced order with 10 days separation. The 4 simulation sessions exposed the subjects to similar visual scene oscillation in different axis: pitch axis, yaw axis, roll axis and no oscillation (speed: 30/spl deg//s, range: +/-60/spl deg/). Verbal ratings of nausea level were taken at 5-minute intervals and sickness symptoms were measured before and after the exposure using the Simulator Sickness Questionnaire (SSQ). Significant differences were found between the no oscillation condition and the oscillating conditions. With scene oscillation, nausea ratings increased significantly after 5-minute exposure for all the oscillation axes (pitch, yaw, and roll axes). Total sickness scores were obtained from the SSQ and their profiles with different scene oscillation axes were presented.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116140531","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Naval Postgraduate School has spent the last two years developing a new degree program called the Modeling, Virtual Environments and Simulation (MOVES) curriculum. That curriculum has grown into an Academic Group, a department-like structure, and a research center. We discuss the composition of that curriculum and the directions for the MOVES Research Center. Background: The Naval Postgraduate School has had a successful research program in virtual environments for the last twelve years under the aegis of the NPSNET Research Group (http://www.npsnet.nps.navy.mil). Students attached to the NPSNET group have typically graduated with an MS in Computer Science, with a specialization in Computer Graphics and Visual Simulation. About two years ago, we wanted to expand our course offerings in the virtual environment field, particularly with respect to human-computer interaction, physically-based modeling, and modeling and simulation. Within the confines of the defined Computer Science MS program, we found no room for expansion and decided to explore the potential for developing our own degree program. We found great sponsor interest in such a program and rapidly designed a new degree. The sponsors of MOVES wanted students who understood applied computer visual simulation technology and the application of quantitative analyses to human-computer interaction in visual simulation technology. We wanted to build on the successes of our existing work and extend that degree program. We constructed a two-year, eight-quarter program that is roughly half computer science and half operations analysis and mathematics. The program looks very much like a double major with a fully packed eight quarters. We educate people in modeling, virtual environments and simulation fundamentals and their application. The MOVES program offers an MS degree in MOVES, with a Ph.D. program under development for consideration by the NPS Academic Council. Initially, we did not create a department but rather a degree program composed of existing courses from several departments. We are now in the process of forming our own department, as our student population has grown from zero to twenty-four in two years and will climb to more than seventy at the end of another two years.
{"title":"Directions in Modeling, Virtual Environments and Simulation (MOVES)","authors":"M. Zyda","doi":"10.1109/VR.1999.756925","DOIUrl":"https://doi.org/10.1109/VR.1999.756925","url":null,"abstract":"The Naval Postgraduate School has spent the last two years developing a new degree program called the Modeling, Virtual Environments and Simulation (MOVES) curriculum. That curriculum has turned into an Academic Group, a department-like structure, and a research center. We discuss the composition of that curriculum and the directions for the MOVES Research Center. Background The Naval Postgraduate School has had a successful research program in virtual environments for the last twelve years under the aegis of the NPSNET Research Group (http://www.npsnet.nps.navy.mil). Students attached to the NPSNET group have typically graduated with an MS in Computer Science, with a specialization in Computer Graphics and Visual Simulation. About two years ago, we desired to expand our course offerings in the virtual environment field, particularly with respect to human-computer interaction, physically-based modeling, and modeling and simulation. Within the confines of the defined Computer Science MS program, we found no room for expansion and decided to explore the potential for developing our own degree program. We found great sponsor interest in the development of that program and rapidly designed a new degree. The desire of the sponsors of MOVES was for students who understood applied computer visual simulation technology, and the application of quantitative analyses to human-computer interaction in visual simulation technology. We wanted to work from our successes in what we had been doing and add onto that degree program. We constructed a two year, eight quarter program that is roughly 1/2 computer science and 1/2 operations analysis and mathematics. The program looks very much like a double major with a fully packed eight quarters. We educate people in modeling, virtual environments and simulation fundamentals and their application. The MOVES program offers MS degrees in MOVES, with a Ph.D. program under development for consideration by the NPS Academic Council. Initially, we did not create a department but rather a degree program comprised of existing courses from several departments. We are now in the process of forming our own department as our student population has grown from zero to twenty-four in two years and will climb to more than seventy at the end of another two years.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129307031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
H. Q. Dinh, N. Walker, L. Hodges, Chang Song, Akira Kobayashi
A total of 322 subjects participated in an experimental study to investigate the effects of tactile, olfactory, audio and visual sensory cues on participants' sense of presence in a virtual environment and on their memory for the environment and the objects in it. Results strongly indicate that increasing the modalities of sensory input in a virtual environment can increase both the sense of presence and memory for objects in the environment. In particular, the addition of tactile, olfactory and auditory cues to a virtual environment increased the user's sense of presence and memory of the environment. Surprisingly, increasing the level of visual detail did not result in an increase in the user's sense of presence or memory of the environment.
{"title":"Evaluating the importance of multi-sensory input on memory and the sense of presence in virtual environments","authors":"H. Q. Dinh, N. Walker, L. Hodges, Chang Song, Akira Kobayashi","doi":"10.1109/VR.1999.756955","DOIUrl":"https://doi.org/10.1109/VR.1999.756955","url":null,"abstract":"322 subjects participated in an experimental study to investigate the effects of tactile, olfactory, audio and visual sensory cues on a participant's sense of presence in a virtual environment and on their memory for the environment and the objects in that environment. Results strongly indicate that increasing the modalities of sensory input in a virtual environment can increase both the sense of presence and memory for objects in the environment. In particular, the addition of tactile, olfactory and auditory cues to a virtual environment increased the user's sense of presence and memory of the environment. Surprisingly, increasing the level of visual detail did not result in an increase in the user's sense of presence or memory of the environment.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129867785","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The study of human-computer interaction within immersive virtual environments requires us to balance what we have learned from the design and use of desktop interfaces with novel approaches that allow us to work effectively in three dimensions. While some researchers have called for revolutionary interfaces for these new environments, devoid of two-dimensional (2D) desktop widgets, others have taken a more evolutionary approach. Windowing within immersive virtual environments is an attempt to apply 2D interface techniques to three-dimensional (3D) worlds. 2D techniques are attractive because of their proven acceptance and widespread use on the desktop. With current methods, however, it is difficult for users of 3D worlds to perform precise manipulations, such as dragging sliders or precisely positioning or orienting objects. We have developed a testbed designed to take advantage of bimanual interaction, proprioception, and passive-haptic feedback, using a window registered with a tracked physical surface to support precise manipulation of interface widgets displayed in the virtual environment. We present preliminary results from an empirical study of 2D interaction in 3D environments using this system.
{"title":"Hand-held windows: towards effective 2D interaction in immersive virtual environments","authors":"R. Lindeman, J. Sibert, J. Hahn","doi":"10.1109/VR.1999.756952","DOIUrl":"https://doi.org/10.1109/VR.1999.756952","url":null,"abstract":"The study of human-computer interaction within immersive virtual environments requires us to balance what we have learned from the design and use of desktop interfaces with novel approaches to allow us to work effectively in three dimensions. While some researchers have called for revolutionary interfaces for these new environments, devoid of two-dimensional (2D) desktop widgets, others have taken a more evolutionary approach. Windowing within immersive virtual environments is an attempt to apply 2D interface techniques to three-dimensional (3D) worlds. 2D techniques are attractive because of their proven acceptance and widespread use on the desktop. With current methods environments, however, it is difficult for users of 3D worlds to perform precise manipulations, such as dragging sliders, or precisely positioning or orienting objects. We have developed a testbed designed to take advantage of bimanual interaction, proprioception, and passive-haptic feedback. We present preliminary results from an empirical study of 2D interaction in 3D environments using this system. We use a window registered with a tracked, physical surface, to provide support for precise manipulation of interface widgets displayed in the virtual environment.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"36 12","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133171027","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We present Avocado, our object-oriented framework for the development of distributed, interactive virtual environment applications. Data distribution is achieved by transparent replication of a shared scene graph among the participating processes of a distributed application. A sophisticated group communication system is used to guarantee state consistency even in the presence of late joining and leaving processes. We also describe how the familiar data flow graph found in modern stand-alone 3D-application toolkits extends nicely to the distributed case.
{"title":"Avocado: a distributed virtual reality framework","authors":"H. Tramberend","doi":"10.1109/VR.1999.756918","DOIUrl":"https://doi.org/10.1109/VR.1999.756918","url":null,"abstract":"We present Avocado, our object-oriented framework for the development of distributed, interactive virtual environment applications. Data distribution is achieved by transparent replication of a shared scene graph among the participating processes of a distributed application. A sophisticated group communication system is used to guarantee state consistency even in the presence of late joining and leaving processes. We also describe how the familiar data flow graph found in modern stand-alone 3D-application toolkits extends nicely to the distributed case.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131309797","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
S. Ellis, B. Adelstein, S. Baumeler, G. Jense, R. Jacoby
We examined the effects on human 3D tracking performance of several common defects of immersive virtual environments: spatial sensor distortion, visual latency, and low update rates. Results show that removal of relatively small static distortion had minor effects on tracking accuracy; that an adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of simulation degradation; and that RMS tracking error and subjective impressions were more influenced by changing visual latency than by update rate.
{"title":"Sensor spatial distortion, visual latency, and update rate effects on 3D tracking in virtual environments","authors":"S. Ellis, B. Adelstein, S. Baumeler, G. Jense, R. Jacoby","doi":"10.1109/VR.1999.756954","DOIUrl":"https://doi.org/10.1109/VR.1999.756954","url":null,"abstract":"We examined the effects of human 3D tracking performance of several common defects of immersing virtual environments: spatial sensor distortion, visual latency and low update rates. Results show: removal of relatively small static distortion had minor effects on tracking accuracy; an adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of simulation degradation; and RMS tracking error and subjective impressions were more influenced by changing visual latency than by update rate.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"581 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116591719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generating, representing, and sharing synthetic environments are key factors in networked simulation systems. Because the use of three-dimensional synthetic environments in commercial, educational, and entertainment applications will increase, and because these applications will be connected via local and global networks, it is important to understand how these environmental databases can impact networked systems. There are unique constraints that real-time networked simulation places on the successful creation of these databases. The representation of the information about a particular environment can be drastically different based on the needs of a simulation platform or the specific application. As an example, creation of electronic (or paper) maps demands a data representation scheme that may not be suitable for a thermal sight simulator, even though both are intended to depict and operate in the same geographical region. There are many such examples. These diverse computing needs, and how they affect the creation and exchange of environmental databases, have direct impacts on the cost and effectiveness of “virtual reality” systems. The trade-offs in source data selection and geometric representations, the key steps in creating environmental databases, the management considerations in improving the process, tool development and achieving quality simulations, as well as issues such as interchange vs. interoperability, are all critical elements that deserve close examination and attention.
{"title":"SEDRIS as a Standard for Interchange Virtual World Data Sets","authors":"Farid Mamaghani","doi":"10.1109/VR.1999.756927","DOIUrl":"https://doi.org/10.1109/VR.1999.756927","url":null,"abstract":"Generating, representing, and sharing synthetic environments is a key factor in networked simu.lation systems. Because the use of three-dimensional synthetic environments in commercial, educational, and entertainment applications will increase, and because these app1:ications will be connected via local and global networks, it is important to understand how these environmental databases can impact networked systems. There are unique constraints that realtime networked simulation places on the successful creation of these databases. The representation of the information about a particular environment can be drastically different based on the needs of a simulation platform, or the specific application. As an example, creation of electronic (or paper) maps demands a data representation scheme that may not be suitable for a. thermal sight simulator, even though both are intended to depict and operate in the same geographical region. There are many such examples. These diverse computing needs, and how they affect the creation and exchange of environmental databases, have direct impacts on the cost and effectiveness of “virtual reality” systems. The trade-offs in source data selection, geometric representations, the key steps in creation of environmental databases, the management considerations in improving the process, tool development, and achieving quality simulations, as well as issues such as interchange vs. inte:roperability are all critical elements that deserve close examination and attention.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121144904","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
F. V. Deriggi, M. Kubo, A. Sementille, J. Brega, Simone Garcon dos Santos, C. Kirner
This paper focuses on the use of the CORBA (Common Object Request Broker Architecture) platform as a middleware layer to support distributed virtual environments, particularly those based on the WorldToolKit software. Some results of an application implemented using the ILU (Inter-Language Unification) software, which is compatible with the CORBA platform, are also discussed.
{"title":"CORBA platform as support for distributed virtual environments","authors":"F. V. Deriggi, M. Kubo, A. Sementille, J. Brega, Simone Garcon dos Santos, C. Kirner","doi":"10.1109/VR.1999.756917","DOIUrl":"https://doi.org/10.1109/VR.1999.756917","url":null,"abstract":"This paper focuses on the use of the CORBA (Common Object Request Broker Architecture) platform as a middleware layer to support distributed virtual environments, particularly those based on the WorldToolKit software. Some results of an application implemented by using the ILU (InterLanguage Unification) software that is compatible with the CORBA platform, are also discussed.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124878139","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Arthur D. Gregory, M. Lin, S. Gottschalk, Russell M. Taylor
We present a framework for fast and accurate collision detection for haptic interaction with polygonal models. Given a model, we pre-compute a hybrid hierarchical representation consisting of uniform grids and trees of tight-fitting oriented bounding boxes (OBBTrees). At run time, we use these hybrid hierarchical representations and exploit frame-to-frame coherence for fast proximity queries. We describe a new overlap test, specialized for haptic simulation, that intersects a line segment with an oriented bounding box in 6-36 operations, excluding transformation costs. The algorithms have been implemented as part of H-COLLIDE, interfaced with a PHANToM arm and its haptic toolkit, GHOST, and applied to a number of models. Compared to the commercial implementation, we achieve up to 20 times speedup in our experiments and sustain update rates over 1000 Hz on a 400 MHz Pentium II.
{"title":"A framework for fast and accurate collision detection for haptic interaction","authors":"Arthur D. Gregory, M. Lin, S. Gottschalk, Russell M. Taylor","doi":"10.1109/VR.1999.756921","DOIUrl":"https://doi.org/10.1109/VR.1999.756921","url":null,"abstract":"We present a framework for fast and accurate collision detection for haptic interaction with polygonal models. Given a model, we pre-compute a hybrid hierarchical representation, consisting of uniform grids and trees of tight-fitting oriented bounding box trees (OBB-Trees). At run time, we use hybrid hierarchical representations and exploit frame-to-frame coherence for fast proximity queries. We describe a new overlap test, which is specialized for intersection of a line segment with an oriented bounding box for haptic simulation and takes 6-36 operations excluding transformation costs. The algorithms have been implemented as part of H-COLLIDE and interfaced with a PHANToM arm and its haptic toolkit, GHOST, and applied to a number of models. As compared to the commercial implementation, we are able to achieve up to 20 times speedup in our experiments and sustain update rates over 1000 Hz on a 400 MHz Pentium II.","PeriodicalId":175913,"journal":{"name":"Proceedings IEEE Virtual Reality (Cat. No. 99CB36316)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132345485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}