Automatic 3D modeling of artwork and visualizing audio in an augmented reality environment
Elijah Schwelling, Kyungjin Yoo
In recent years, traditional art museums have begun to use AR/VR technology to make visits more engaging and interactive. This paper details an application that provides immediately engaging and educational features to museum visitors within an AR view. The application superimposes an automatically generated 3D representation over a scanned artwork, along with the work's authorship, title, and date of creation. A GUI allows the user to exaggerate or reduce the depth scale of the 3D representation, and to search for related works of music. Given this music as audio input, the generated 3D model acts as an audio visualizer, changing its depth scale based on the input frequency.
DOI: 10.1145/3281505.3281617
TransFork
Ying-Li Lin, Tsai-Yi Chou, Yu-Cheng Lieo, Yu-Cheng Huang, Ping-Hsuan Han
When people eat, taste is complex and easily influenced by the other senses. Visual, olfactory, and haptic cues, and even past experiences, can affect human perception, which in turn creates more taste possibilities. We present TransFork, an eating tool with olfactory feedback that augments the tasting experience through a video see-through head-mounted display. We also design a recipe, via preliminary experiments, to find a taste-conversion formula that can enhance the flavor of foods and change how users recognize them. In this demonstration, we prepare a mini feast of bite-sized fruit: participants use TransFork to eat food A while smelling the scent of food B, stored in an aromatic box and delivered by airflow guiding. Before they bring the food to their mouths, the head-mounted display overlays the color of food B on food A, tracked via the QR code on the aromatic box. With these augmented reality techniques and the recipe, the tasting experience can be augmented or enhanced, a promising and playful approach to eating.
{"title":"TransFork","authors":"Ying-Li Lin, Tsai-Yi Chou, Yu-Cheng Lieo, Yu-Cheng Huang, Ping-Hsuan Han","doi":"10.1145/3281505.3281560","DOIUrl":"https://doi.org/10.1145/3281505.3281560","url":null,"abstract":"When people eat, the taste is very complex and be influenced easily by other senses. Such as visual, olfactory, and haptic, even past experiences, can affect the human perception, which in turn creates more taste possibilities. We present TransFork, an eating tool with olfactory feedback, which augments the tasting experience with video see-through head-mounted display. Additionally, we design a recipe via preliminary experiments to find out the taste conversion formula, which could enhance the flavor of foods and change the user perception to recognize the food. In this demonstration, we prepare a mini feast with bite-sized fruit, the participants use the TransFork to eat food A and smell the scent of food B stored at the aromatic box via airflow guiding. Before they deliver the food to their mouth, the head-mounted display augmented the color of food B on food A by the QR code on the aromatic box. With this augmented reality techniques and the recipe, the tasting experience could be augmented or enhanced, which is a potential approach and could be a playful used for eating.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127240794","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A low-cost omni-directional VR walking platform by thigh supporting and motion estimation
Wataru Wakita, Tomoyuki Takano, Toshiyuki Hadama
We propose a low-cost omni-directional VR walking platform based on thigh support and motion estimation. The platform supports the user's thighs in the walking direction, and the user makes a stepping motion while leaning toward the walking direction, shifting the center of gravity across the sole of the foot as in actual walking. The platform estimates the foot movement, constrained by the thigh-support part, from load cells around the user's thighs, and renders the view to the HMD according to the estimated movement. As a result, the platform delivers a more realistic walking sensation at low cost.
DOI: 10.1145/3281505.3281570
A lightweight and efficient system for tracking handheld objects in virtual reality
Ya-Kuei Chang, Jui-Wei Huang, Chien-Hua Chen, Chien-Wen Chen, Jian-Wei Peng, Min-Chun Hu, Chih-Yuan Yao, Hung-Kuo Chu
While virtual reality (VR) content has grown explosively in recent years, the design of user-friendly control interfaces for VR still advances at a slow pace. The most commonly used devices, such as gamepads and controllers, have a fixed shape and weight and thus cannot provide realistic haptic feedback when interacting with virtual objects in VR. In this work, we present a novel, lightweight tracking system for manipulating handheld objects in VR. Our system synchronizes the 3D pose of arbitrary handheld objects between the real world and VR in real time. The tracking algorithm is simple: it leverages a Leap Motion and an IMU sensor to track the object's location and orientation, respectively. We demonstrate the effectiveness of our system with three VR applications that use a pencil, a ping-pong paddle, and a smartphone as control interfaces to provide a more immersive VR experience.
DOI: 10.1145/3281505.3281567
Resolving occlusion for 3D object manipulation with hands in mixed reality
Qi Feng, Hubert P. H. Shum, S. Morishima
Because users need to interact with virtual objects, hand-object interaction has become an important element of mixed reality (MR) applications. In this paper, we propose a novel approach to handling occlusion when manipulating augmented 3D objects with the hands. By exploiting the nature of hand poses and combining tracking-based and model-based methods, it achieves a complete mixed reality experience without heavy computation, complex manual segmentation, or special gloves. Experimental results show a faster-than-real-time frame rate and highly accurate rendering of virtual appearances, and a user study confirms a more immersive experience compared with past approaches. We believe the proposed method can improve a wide range of mixed reality applications that involve hand-object interaction.
DOI: 10.1145/3281505.3283390
The physical-virtual table: exploring the effects of a virtual human's physical influence on social interaction
Myungho Lee, Nahal Norouzi, G. Bruder, P. Wisniewski, G. Welch
In this paper, we investigate the effects of the physical influence of a virtual human (VH) in the context of face-to-face interaction in augmented reality (AR). In our study, participants played a tabletop game with a VH in which each player takes a turn and moves their own token along designated spots on the shared table. We compared two conditions: in the virtual condition, the VH moves a virtual token that can only be seen through AR glasses; in the physical condition, the VH moves a physical token just as the participants do, so the VH's token can be seen even in the periphery of the AR glasses. For the physical condition, we designed an actuator system underneath the table that moves a magnet, which in turn moves the VH's physical token over the surface of the table. Our results indicate that participants felt higher co-presence with the VH in the physical condition and assessed it as a more physical entity than the VH in the virtual condition. We further observed transference effects: participants attributed the VH's ability to move physical objects to other elements in the real world. The VH's physical influence also improved participants' overall experience with the VH. We discuss potential explanations for these findings and implications for future shared AR tabletop setups.
{"title":"The physical-virtual table: exploring the effects of a virtual human's physical influence on social interaction","authors":"Myungho Lee, Nahal Norouzi, G. Bruder, P. Wisniewski, G. Welch","doi":"10.1145/3281505.3281533","DOIUrl":"https://doi.org/10.1145/3281505.3281533","url":null,"abstract":"In this paper, we investigate the effects of the physical influence of a virtual human (VH) in the context of face-to-face interaction in augmented reality (AR). In our study, participants played a tabletop game with a VH, in which each player takes a turn and moves their own token along the designated spots on the shared table. We compared two conditions as follows: the VH in the virtual condition moves a virtual token that can only be seen through AR glasses, while the VH in the physical condition moves a physical token as the participants do; therefore the VH's token can be seen even in the periphery of the AR glasses. For the physical condition, we designed an actuator system underneath the table. The actuator moves a magnet under the table which then moves the VH's physical token over the surface of the table. Our results indicate that participants felt higher co-presence with the VH in the physical condition, and participants assessed the VH as a more physical entity compared to the VH in the virtual condition. We further observed transference effects when participants attributed the VH's ability to move physical objects to other elements in the real world. Also, the VH's physical influence improved participants' overall experience with the VH. We discuss potential explanations for the findings and implications for future shared AR tabletop setups.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130650018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
VR safari park: a concept-based world building interface using blocks and world tree
Shotaro Ichikawa, Kazuki Takashima, Anthony Tang, Y. Kitamura
We present a concept-based world building approach, realized in a system called VR Safari Park, which allows users to rapidly create and manipulate a world simulation. Conventional world building tools focus on the manipulation and arrangement of entities to set up the simulation, which is time-consuming because it requires frequent view and entity manipulations. Our approach relies on a far simpler mechanic: users add virtual blocks representing world entities (e.g., animals, terrain, weather) to a World Tree, which represents the simulation. The World Tree thus provides a quick overview of the simulation, and users can easily set up scenarios without manually performing fine-grained manipulations on world entities. A preliminary user study found the proposed interface effective and usable for novice users without prior immersive VR experience.
{"title":"VR safari park: a concept-based world building interface using blocks and world tree","authors":"Shotaro Ichikawa, Kazuki Takashima, Anthony Tang, Y. Kitamura","doi":"10.1145/3281505.3281517","DOIUrl":"https://doi.org/10.1145/3281505.3281517","url":null,"abstract":"We present a concept-based world building approach, realized in a system called VR Safari Park, which allows users to rapidly create and manipulate a world simulation. Conventional world building tools focus on the manipulation and arrangement of entities to set up the simulation, which is time consuming as it requires frequent view and entity manipulations. Our approach focuses on a far simpler mechanic, where users add virtual blocks which represent world entities (e.g. animals, terrain, weather, etc.) to a World Tree, which represents the simulation. In so doing, the World Tree provides a quick overview of the simulation, and users can easily set up scenarios in the simulation without having to manually perform fine-grain manipulations on world entities. A preliminary user study found that the proposed interface is effective and usable for novice users without prior immersive VR experience.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131143486","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Designing dynamic aware interiors
Y. Kitamura, Kazuki Takashima, Kazuyuki Fujita
We are pursuing a vision of reactive interior spaces that are aware of people's actions and transform according to changing needs. We envision furniture and walls that act as interactive displays and that shape-shift into the appropriate physical form, with interactive visual content and modality to match. This paper briefly describes our proposal, based on our recent efforts toward realizing this vision.
DOI: 10.1145/3281505.3281603
An AR system for artistic creativity education
Jiajia Tan, Boyang Gao, Xiaobo Lu
Creativity and innovation training is at the core of art education, and modern technology provides more effective tools to help students develop artistic creativity. In this paper, we propose employing augmented reality technology to assist artistic creativity education. We first analyze the inefficiency of traditional artistic creation training. We then introduce our AR-based smartphone app in technical detail and explain how it can accelerate artistic creativity training. Finally, we show three examples created with our AR app to demonstrate the effectiveness of the proposed method.
DOI: 10.1145/3281505.3283396
Using mixed reality for promoting brand perception
Kelvin Cheng, Ichiro Furusawa
Mixed reality offers an immersive and interactive experience through head-mounted displays and in-air gestures, letting visitors discover additional virtual content on top of existing physical items. For a small-scale exhibition at a cafe, we developed a Microsoft HoloLens application that creates an interactive experience on top of a collection of historic physical items. Through public showings of this exhibition, we received positive feedback on our system and found that it also helped promote brand perception. In this demo, visitors can try a mixed reality experience similar to the one shown at the exhibition.
DOI: 10.1145/3281505.3281574