Enhancement of Motion Sensation by Pulling Clothes
Erika Oishi, Masahiro Koge, Sugarragchaa Khurelbaatar, H. Kajimoto
DOI: 10.1145/2983310.2985749

Stimulation of the vestibular and somatosensory systems has been proposed as a way to enhance motion sensation in combination with visual movement. However, such systems tend to be large and offer limited presentation areas. Here, we propose a method of enhancing motion sensation by pulling the user's clothing. Our system uses DC motors and force sensors to present a traction force and cause skin deformation. We investigated whether users perceived the presented sensation as acceleration or another physical quantity, and found that they matched it with velocity. We also conducted a user study to examine whether our clothes-pulling system improves immersion in gaming content.
{"title":"Session details: Input Device & Usability","authors":"W. Stuerzlinger","doi":"10.1145/3248576","DOIUrl":"https://doi.org/10.1145/3248576","url":null,"abstract":"","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115295570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Combining Ring Input with Hand Tracking for Precise, Natural Interaction with Spatial Analytic Interfaces
Barrett Ens, A. Byagowi, Teng Han, Juan David Hincapié-Ramos, Pourang Irani
DOI: 10.1145/2983310.2985757

Current wearable interfaces are designed to support short-duration tasks known as micro-interactions. To support productive interfaces for everyday analytic tasks, designers can leverage natural input methods such as direct manipulation and pointing. Such natural methods are now available in virtual, mobile environments thanks to miniature depth cameras mounted on head-worn displays (HWDs). However, these techniques have drawbacks, such as fatigue and limited precision. To overcome these limitations, we explore combined input: hand tracking data from a head-mounted depth camera and input from a small ring device. We demonstrate how a variety of input techniques can be implemented with this novel combination of devices, and we harness them for Spatial Analytic Interfaces: multi-application, spatial UIs for in-situ analytic taskwork on wearable devices. This research demonstrates how combined input from multiple wearable devices holds promise for high-precision, low-fatigue interaction techniques that support Spatial Analytic Interfaces on HWDs.
Impact of Motorized Projection Guidance on Spatial Memory
Hind Gacem, G. Bailly, James R. Eagan, É. Lecolinet
DOI: 10.1145/2983310.2985751

Various guidance techniques have been proposed to help users quickly and effectively locate objects in large, dense environments such as supermarkets, libraries, or control rooms. Little research, however, has focused on their impact on learning. These techniques generally transfer control from the user to the system, making the user more passive and reducing kinesthetic feedback. In this paper, we present an experiment that evaluates the impact of projection-based guidance techniques on spatial memorization. We investigate the roles of user-controlled (handheld) vs. system-controlled (robotic arm) guidance and of kinesthetic feedback on memorization. Results show (1) higher recall rates with system-controlled guidance, (2) no significant influence of kinesthetic feedback on recall under system control, and (3) that the visibility and noticeability of objects affect memorization.
{"title":"Session details: Keynote Address","authors":"C. Sandor","doi":"10.1145/3248571","DOIUrl":"https://doi.org/10.1145/3248571","url":null,"abstract":"","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133977884","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Spatial User Interaction Panel
Hrvoje Benko, Katsuhiro Harada, Otmar Hilliges, A. Olwal, Aitor Rovira
DOI: 10.1145/2983310.2996295

In this panel, we will focus the discussion on the present and future of spatial user interfaces, covering both the technologies and their applications. We want to explore current limitations, possible future solutions, and how other fields such as artificial intelligence, body implants, or robot-enhanced interaction will contribute to spatial interaction. New applications will bring new possibilities, but they may also raise controversy, leading to a discussion of where the ethical limits lie.
{"title":"Session details: Applications & Technology","authors":"D. Iwai","doi":"10.1145/3248575","DOIUrl":"https://doi.org/10.1145/3248575","url":null,"abstract":"","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129627716","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Touch and Movement","authors":"E. Kruijff","doi":"10.1145/3248573","DOIUrl":"https://doi.org/10.1145/3248573","url":null,"abstract":"","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126490304","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Multimodal Embodied Interface for Levitation and Navigation in 3D Space
Monica Perusquía-Hernández, T. Martins, Takahisa Enomoto, M. Otsuki, Hiroo Iwata, Kenji Suzuki
DOI: 10.1145/2983310.2989207

A multimodal embodied interface for 3D navigation was designed as a modular wearable. The user is suspended in a harness controlled by a mechanical Motion Base, allowing both physical and virtual displacement within an immersive virtual environment. Through a combination of passive and active modalities, users can fly at will.
Effect of using Walk-In-Place Interface for Panoramic Video Play in VR
A. Muhammad, S. Ahn, Jae-In Hwang
DOI: 10.1145/2983310.2989197

This paper describes the use of a walk-in-place (WIP) interface to control the playback speed of 360-degree panoramic virtual reality (VR) video on a head-mounted display. Through a pilot user study, we evaluate the effect of this interface on simulator sickness and on the sense of presence felt by the user, and compare the results with a traditional 360-degree VR video experience.