We present a mixed-reality exhibition for the Museum of Peace Corps Experiences designed using the Ad-Hoc Mixed-reality Exhibition Designer (AHMED) toolset. AHMED enables visitors to experience mixed-reality museum or art exhibitions created ad hoc at any location. The system democratizes access to exhibitions for people who cannot visit them in person because of disability, time constraints, travel restrictions, or socio-economic status.
Krzysztof Pietroszek. "Mixed-Reality Exhibition for Museum of Peace Corps Experiences using AHMED toolset." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3358754
Chronic pain is a widespread disorder affecting millions of people, influencing even the most basic decisions in their lives. With computers becoming an integral part of our society and the interaction paradigm ever expanding, the need to explore computer interaction techniques for people with chronic pain has only increased. In this paper we explore the use of gesture-based interaction as a medium through which these users can perform the basic operations of computer interaction. We show that, for gestural pointing and selection, modeling users' interaction space and using multimodal interaction performed best in terms of throughput.
G. M. Poor and Alvin Jude. "Interaction can hurt - Exploring gesture-based interaction for users with Chronic Pain." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3357589
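The abstract reports pointing performance "in terms of throughput" without spelling the metric out; in pointing studies this usually means ISO 9241-9 style effective throughput, computed from effective target width and movement time. A minimal sketch under that assumption — the function name, argument names, and units are ours, not the paper's:

```python
import math
from statistics import mean, stdev

def effective_throughput(amplitudes, endpoint_errors, movement_times):
    """ISO 9241-9 style effective throughput (bits/s) for one block of trials.

    amplitudes      -- nominal distance to the target in each trial
    endpoint_errors -- signed distance from each selection point to the
                       target centre, along the task axis (same units)
    movement_times  -- movement time of each trial, in seconds
    """
    effective_amplitude = mean(amplitudes)
    # Effective width: 4.133 x the standard deviation of the endpoints,
    # i.e. the width the targets were "effectively" treated as having
    effective_width = 4.133 * stdev(endpoint_errors)
    # Shannon formulation of the effective index of difficulty, in bits
    ide = math.log2(effective_amplitude / effective_width + 1)
    return ide / mean(movement_times)
```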
Several techniques exist for showing a smartphone's content on an external display at a larger size. However, since smartphones are designed for mobility, seamless interaction is necessary for a smartphone to make the best use of an external display. We are currently exploring the feasibility of another technique, which we call Screen Extension. Our technique seamlessly adds display space to a smartphone using an external display, allowing users to take advantage of displays available in many places. To test search performance with Screen Extension, we conducted a pilot study, which suggested that Screen Extension helps users find content faster.
Yuta Urushiyama, B. Shizuki, and Shin Takahashi. "Preliminary Study of Screen Extension for Smartphone Using External Display." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3359750
Many cases in which augmented reality would be useful in everyday life require the ability to access information on the go. This means that interfaces should support user movement and also adjust to different physical environments. Prior research has shown that spatial adaptation can reduce the effort required to manage windows when walking and moving between spaces. We designed and implemented a unified interaction system for AR windows that allows users to quickly switch and fine-tune spatial adaptation. Our study indicates that a small number of adaptive behaviors is sufficient to facilitate information access in a variety of conditions.
W. Lages and D. Bowman. "Adjustable Adaptation for Spatial Augmented Reality Workspaces." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3358755
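The abstract mentions switching and fine-tuning "spatial adaptation" for AR windows without naming the behaviors. As a rough illustration only, the sketch below assumes two common window behaviors (world-fixed and lazy-follow) and one user-tunable parameter; the class, behavior names, and parameter are ours, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def lerp(a: Vec3, b: Vec3, t: float) -> Vec3:
    """Linear interpolation between two points."""
    return Vec3(a.x + (b.x - a.x) * t,
                a.y + (b.y - a.y) * t,
                a.z + (b.z - a.z) * t)

class AdaptiveWindow:
    """Toy AR window with switchable spatial-adaptation behaviors."""

    def __init__(self, position: Vec3):
        self.position = position
        self.behavior = "world_fixed"   # or "lazy_follow"
        self.follow_speed = 2.0         # user-tunable adaptation parameter

    def update(self, head_position: Vec3, offset: Vec3, dt: float) -> None:
        if self.behavior == "world_fixed":
            return  # stay anchored to the environment
        # lazy_follow: drift toward a point at a fixed offset from the head,
        # at a rate the user can fine-tune
        target = Vec3(head_position.x + offset.x,
                      head_position.y + offset.y,
                      head_position.z + offset.z)
        self.position = lerp(self.position, target,
                             min(1.0, self.follow_speed * dt))
```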
Building collaborative VR applications for exploring and interacting with large or abstract spaces presents several problems. Given a large space and a potentially large number of possible interactions, it is expected that users will need a tool selection menu that will be easily accessible at any point in the environment. Given the collaborative nature, users will also want to be able to maintain awareness of each other within the environment and communicate about what they are seeing or doing. We present a demo that shows solutions to these problems developed in the context of a collaborative geological dataset viewer.
J. Woodworth, David M. Broussard, and C. Borst. "Collaborative Interaction in Large Explorative Environments." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3360017
In this paper, we propose a virtual window manipulation method for information search while using a head-mounted display (HMD). Existing HMD operation methods have several issues, such as causing user fatigue and handling input tasks inefficiently, and these problems are difficult to solve simultaneously. We therefore propose combining a head-tracking cursor with a smart device: the cursor is operated through swipe input on the smart device. We compared the operability of this new method with that of classic hand tracking in user experiments. The results confirmed that the operability of the proposed method is high.
Shu Sorimachi, Kota Kita, and Mitsunori Matsushita. "Virtual Window Manipulation Method for Head-mounted Display Using Smart Device." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3358753
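The abstract states that the head-tracking cursor is operated through swipe input on the smart device, but not how the two inputs combine; one plausible reading is coarse positioning from head rotation plus fine adjustment from swipe deltas. A minimal sketch under that assumption — the class name, gain value, and mapping are ours:

```python
import math

class HeadSwipeCursor:
    """Toy cursor: anchored to the head-gaze direction, nudged by swipes."""

    def __init__(self, gain_deg_per_px: float = 0.1):
        self.gain = gain_deg_per_px
        self.yaw_offset = 0.0    # degrees, relative to the head direction
        self.pitch_offset = 0.0

    def on_swipe(self, dx_px: float, dy_px: float) -> None:
        # Fine positioning: swipes move the cursor around the gaze point
        self.yaw_offset += dx_px * self.gain
        self.pitch_offset -= dy_px * self.gain

    def direction(self, head_yaw_deg: float, head_pitch_deg: float):
        # Coarse positioning comes from head rotation; the swipe offsets
        # are added on top of it
        yaw = math.radians(head_yaw_deg + self.yaw_offset)
        pitch = math.radians(head_pitch_deg + self.pitch_offset)
        return (math.cos(pitch) * math.sin(yaw),
                math.sin(pitch),
                math.cos(pitch) * math.cos(yaw))
```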
Spatial knowledge about the environment often helps people accomplish navigation and wayfinding tasks more efficiently. Off-the-shelf mobile navigation applications typically focus on guiding people between two locations, ignoring the importance of learning spatial knowledge. Drawing on theories and findings from research on spatial knowledge acquisition, we investigated how background reference frames (RFs) and navigational cues can be combined in navigation applications to help people acquire better spatial (route and survey) knowledge. We conducted two user studies in which participants used our custom-designed applications to navigate an indoor location. We found that having more navigational cues in a navigation application does not always help users acquire better spatial knowledge; rather, these cues can be distracting in some setups. Users acquired better spatial knowledge only when the navigational cues complemented each other in the interface design. We discuss the implications for designing navigation interfaces that assist users in learning spatial knowledge by combining navigational elements in a complementary way.
S. Dey, W. Fu, and Karrie Karahalios. "Understanding the Effect of the Combination of Navigation Tools in Learning Spatial Knowledge." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3357582
Redirected walking enables natural locomotion in virtual environments that are larger than the user’s real world space. However, in complex setups with physical obstacles, existing redirection techniques that were originally designed for empty spaces may be sub-optimal. This poster presents strafing gains, a novel redirected walking technique that can be used to shift the user laterally away from obstacles without disrupting their current orientation. In the future, we plan to conduct a study to identify perceptual detection thresholds and investigate new algorithms that can use strafing gains in combination with other existing redirection techniques to achieve superior obstacle avoidance in complex physical spaces.
Christopher You, Evan Suma Rosenberg, and Jerald Thomas. "Strafing Gain: A Novel Redirected Walking Technique." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3358757
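The poster abstract defines the goal of a strafing gain — shift the user laterally without disturbing their orientation — but not the exact mapping. Below is a minimal per-frame sketch of one way such a gain could be applied; the function signature and the choice to scale the injected drift by forward progress are our assumptions:

```python
def apply_strafing_gain(real_step, forward_dir, right_dir, gain):
    """Map a real-world step to a virtual step with an injected lateral drift.

    real_step   -- (dx, dz) real translation of the user this frame
    forward_dir -- unit vector of the walking direction, (x, z)
    right_dir   -- unit vector perpendicular to forward_dir, (x, z)
    gain        -- metres of lateral drift injected per metre walked forward
                   (kept small so it stays below perceptual thresholds)
    """
    # Component of the real step along the walking direction
    forward_amount = (real_step[0] * forward_dir[0] +
                      real_step[1] * forward_dir[1])
    # Sideways drift proportional to forward progress; orientation untouched
    lateral = gain * forward_amount
    return (real_step[0] + lateral * right_dir[0],
            real_step[1] + lateral * right_dir[1])
```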
In this work, a spatial interface was designed and evaluated for effective teleoperation of bi-manual robotic manipulators. Previous work in this area has investigated using immersive virtual reality systems to provide more natural, intuitive spatial control and viewing of the remote robot workspace. The current work builds upon this research through the design of the teleoperation interface and by additionally studying how the spatial interaction metaphor and the devices used to control the robot affect task performance. A user study was conducted with 33 novice teleoperators split into two groups by the interaction metaphor used to control the robot end-effectors: one group used a grabbing metaphor with tracked motion controllers (Oculus Touch), and the other used a driving metaphor with two fixed 6-axis controllers (3Dconnexion SpaceMouse). Results indicated that, despite the challenging task, both interfaces were highly effective for bimanual teleoperation, but that motion controls provided higher peak performance, likely due to faster gross movement planning.
Anton Franzluebbers and Kyle A. Johnson. "Remote Robotic Arm Teleoperation through Virtual Reality." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3359444
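The study contrasts a grabbing metaphor (tracked controllers) with a driving metaphor (fixed 6-axis devices); the essential difference is position control with clutching versus rate control. A minimal sketch of that distinction for translation only, with names and signatures assumed for illustration:

```python
def grabbing_update(ee_pos, controller_delta, grip_held):
    """Grabbing metaphor: while the grip is held, the end-effector mirrors
    the controller's incremental motion; releasing the grip clutches."""
    if not grip_held:
        return ee_pos
    return (ee_pos[0] + controller_delta[0],
            ee_pos[1] + controller_delta[1],
            ee_pos[2] + controller_delta[2])

def driving_update(ee_pos, device_axes, speed, dt):
    """Driving metaphor: device deflection commands a velocity that is
    integrated over time (rate control)."""
    vx, vy, vz = device_axes
    return (ee_pos[0] + vx * speed * dt,
            ee_pos[1] + vy * speed * dt,
            ee_pos[2] + vz * speed * dt)
```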
Drifting student attention is a common problem in educational environments. We demonstrate eight attention-restoring visual cues that are displayed when eye tracking detects that student attention has shifted away from critical objects. These cues include novel aspects and variations of standard cues that performed well in prior work on visual guidance. Our cues are integrated into an offshore training system set on an oil rig. While students participate in training on the oil rig, we can compare the cues in terms of performance and student preference, while also observing the impact of eye tracking. We demonstrate experiment software with which users can compare the various cues and tune selected parameters for visual quality and effectiveness.
Andrew Yoshimura, Adil Khokhar, and C. Borst. "Visual Cues to Restore Student Attention based on Eye Gaze Drift, and Application to an Offshore Training System." Symposium on Spatial User Interaction, 2019. https://doi.org/10.1145/3357251.3360007
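The demo displays a cue when eye tracking detects that attention has drifted from critical objects; the abstract does not give the trigger logic, so the sketch below assumes a simple dwell-away timer. The threshold value and return flag are illustrative only:

```python
import time

class AttentionMonitor:
    """Toy gaze-drift detector: signal when gaze has been off every critical
    object for longer than a drift threshold."""

    def __init__(self, drift_threshold_s: float = 3.0):
        self.drift_threshold_s = drift_threshold_s
        self.last_on_target = time.monotonic()

    def update(self, gazed_object_id, critical_ids):
        now = time.monotonic()
        if gazed_object_id in critical_ids:
            self.last_on_target = now
            return False                 # attention is on a critical object
        # Attention is elsewhere; trigger a cue once the dwell-away time
        # exceeds the threshold
        return now - self.last_on_target > self.drift_threshold_s
```

When the monitor returns True, a system like the one described above would show one of its attention-restoring cues until gaze returns to the critical object.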