Exploration of Visuo-haptic Interactions to Support Learning Leopold's Maneuvers Process in Virtual Reality
S. Chun, J. Seo, Caleb Kicklighter, Elizabeth Wells-Beede, Jack Greene, Tomas Arguello
We present Leopold's Maneuvers VR, a haptic-enabled virtual reality simulation in which a user determines the size, position, and weight of a fetus within a virtual patient by palpating the patient's abdomen. Users receive corresponding haptic cues (force and vibration) as they touch the fetus through the patient's torso. The physical sensations generated by the SenseGlove, paired with the immersive visuals of the virtual environment, combine to create a palpation experience similar to performing the real Leopold's maneuvers. This application addresses a need in nursing education for an immersive virtual reality experience that integrates direct hand manipulation into the learning of assessment skills.
{"title":"Exploration of Visuo-haptic Interactions to Support Learning Leopold's Maneuvers Process in Virtual Reality","authors":"S. Chun, J. Seo, Caleb Kicklighter, Elizabeth Wells-Beede, Jack Greene, Tomas Arguello","doi":"10.1145/3478514.3487615","DOIUrl":"https://doi.org/10.1145/3478514.3487615","url":null,"abstract":"We present Leopold's Maneuvers VR, a haptic-enabled virtual reality simulation in which a user determines the size, position, and weight of a fetus within a virtual patient by palpating the patient's abdomen. Users of the application receive corresponding haptic cues (force and vibration) as they touch the fetus through the patient's torso. The physical sensations generated by the SenseGlove, paired with the immersive visuals within the virtual environment, support to create a palpation experience similar to the real Leopold's Maneuvers activity. This application addresses a need in nursing education for an immersive virtual reality experience that integrates direct hand manipulation in the learning of assessment skills.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133187128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Beat
Keisuke Itoh, Hiroko Fujioka, Katsutoshi Machiba, T. Ohashi
A story incubated from your heartbeat.
Concept: You share your own heartbeat with a robot and feel it grow together with you.
Main character: A robot boy called “Maruboro”, an honest robot who is not yet familiar with “life”. He has been left quietly in an old factory. One day the viewer comes over and gives their heart to him as life. Viewers can project themselves into the characters and enjoy the story.
Logline: The viewer gives their heart to Maruboro and breathes life into him. Maruboro wants to become friends with another robot called Kakuboro, but he does not know how to interact with others, which makes Kakuboro angry. Maruboro desperately wants to make friends, however, and starts to think from Kakuboro's point of view. Kakuboro finally opens his heart to Maruboro.
Story: “Beat” is a story elaborated from your “heart”. Viewers experience the work with their hearts in their hands: the heart in the animation vibrates at the same pace as the viewer's heartbeat. The story is set in an old factory, where viewers encounter a rusted robot called Maruboro, completely motionless; he has no “heart” to move him. Viewers grant Maruboro a new heart by placing theirs on the robot. He then stands up and starts to move, expressing the joy of living with all his strength. Maruboro looks a little lonely. When he sees other robots opening their hearts and connecting with each other, he starts to search for friends. However, when Maruboro meets new robots, he does not know how to communicate properly. He tries to open his heart to another robot called Kakuboro, but in the commotion Maruboro causes, Kakuboro drops an important component of his factory. Kakuboro becomes angry, tries to get rid of Maruboro, and accidentally breaks Maruboro's heart. Maruboro is deeply discouraged, but works with the viewer to search for the lost component. Finally, Maruboro finds it and gives it back to Kakuboro, who uses it to run the factory. The factory starts to operate and sets off big fireworks. The hearts of the two robots eventually come together, and Maruboro's heart starts to beat again. The “heart” is the key that moves the story forward; the story aims to arouse consciousness of the “heart” through the robot's growth.
{"title":"Beat","authors":"Keisuke Itoh, Hiroko Fujioka, Katsutoshi Machiba, T. Ohashi","doi":"10.1145/3478514.3487616","DOIUrl":"https://doi.org/10.1145/3478514.3487616","url":null,"abstract":"The Story Incubated from Your heartbeats. Concept: You can share your own heartbeat with a robot, and you can feel its growth together. Main Character: A robot boy called “Maruboro” honest robot, who is not yet familiar with “life”.. He has been left in an old factory quietly. One day, the viewer comes over and gives his/her heart as life to him. Viewers can project themselves into the characters and enjoy the story. Logline: The viewer gives his/her heart to Maruboro and breathes a life into him. Maruboro wants to become friends with another robot called Kakuboro, but Maruboro doesn't know how to interact with others. It makes Kakuboro angry. However, Maruboro desperately wants to make friends and he starts to think from Kakuboro's point of view. Kakuboro finally opens up his heart to Maruboro. Story: “Beat” is a story elaborated from your “Heart”. Viewers can experience the work with their hearts in their hands. The heart in the animation vibrates at the same pace at viewers' heartbeat. The setting of the story is an old factory. Viewers encounter a rusted robot called Maruboro, it was absolutely static. He doesn't have a “heart” to move. Viewers can grant Maruboro a new heart by putting theirs on the robot. He then stands up and starts to move, expressing joy to live out all his strength. Maruboro looks a little lonely. When he finds that other robots open their hearts and connect with each other, he starts to search for friends. However, when Maruboro meets with new robots, he doesn't know how to communicate properly. He tries to open up his heart to another robot called Kakuboro. While in Maruboro's disturbance, Kakuboro dropped an important component of his factory. Kakuboro becomes angry and tries to gets rid of Maruboro. He broke Maruboro's heart accidentally. Maruboro is so depressed, but he tries to work with the viewer to search for the lost component. Finally, Maruboro finds it and give that back to Kakuboro. Kakuboro uses that to run the factory. The factory then starts to operate and sets off big fireworks. The hearts of the two robots eventually come together. Maruboro's heart starts to beat again. “Heart” becomes the key to move the story forward. The story aims to arouse consciousness of “Heart” via the growth of the robot.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122652293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
WizardOfVR: An Emotion-Adaptive Virtual Wizard Experience
Kunal Gupta, Yuewei Zhang, Yun Suen Pai, M. Billinghurst
We demonstrate WizardOfVR, a personalized emotion-adaptive Virtual Reality (VR) game akin to a Harry Potter experience, which uses off-the-shelf physiological sensors to create a real-time biofeedback loop between a user's emotional state and an adaptive VR environment (VRE). In our demo, the user first trains the system during a calibration process using Electroencephalogram (EEG), Electrodermal Activity (EDA), and Heart Rate Variability (HRV) signals. After calibration, the user explores a virtual forest whose environmental factors adapt according to a 'SanityMeter' driven by the user's real-time emotional state. The overall goal is to provide more balanced, immersive, and emotionally optimal virtual experiences.
{"title":"WizardOfVR: An Emotion-Adaptive Virtual Wizard Experience","authors":"Kunal Gupta, Yuewei Zhang, Yun Suen Pai, M. Billinghurst","doi":"10.1145/3478514.3487628","DOIUrl":"https://doi.org/10.1145/3478514.3487628","url":null,"abstract":"We demonstrate WizardOfVR, a personalized emotion-adaptive Virtual Reality (VR) game akin to a Harry Potter experience, which uses using off-the-shelf physiological sensors to create a real-time biofeedback loop between a user’s emotional state and an adaptive VR environment (VRE). In our demo, the user initially trains the system during a calibration process using Electroencephalogram (EEG), Electrodermal Activity (EDA), and Heart Rate Variability (HRV) physiological signals. After calibration, the user will explore a virtual forest with adapting environmental factors based on a ’SanityMeter’ determined by the user’s real-time emotional state. The overall goal is to provide more balanced, immersive, and optimal emotional virtual experiences.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127776062","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
“Flower and the Youth”: Virtual Narrative and Creative Methods of Oral Performance in China Intangible Cultural Heritage
Yu Lu, Zixiao Liu, Xuning Yan, Shuo Yan
The intangible cultural heritage of China contains many different forms of performing arts, of which oral performance is an important branch. "Hua'er" is the most popular folk performance sung in the Hui ethnic area of Ningxia. Building on virtual reality (VR) and gesture-recognition technology, our work proposes three design methods (interactive performance narrative, metaphorical elements, and embodied cognition) and applies them to the VR performance "Flower and the Youth". VR can provide audiences with a more immersive experience, which contributes to the transmission and dissemination of intangible cultural heritage performing arts. Our work offers universal design approaches for creating content for future intangible cultural heritage performances.
{"title":"“Flower and the Youth”: Virtual Narrative and Creative Methods of Oral Performance in China Intangible Cultural Heritage","authors":"Yu Lu, Zixiao Liu, Xuning Yan, Shuo Yan","doi":"10.1145/3478514.3487623","DOIUrl":"https://doi.org/10.1145/3478514.3487623","url":null,"abstract":"The intangible cultural heritage of China contains many different forms of performing arts, of which oral performance is an important branch. \"Hua'er\" is the most popular folk performance sung in the Hui ethnic area of Ningxia. Based on technology of virtual reality (VR) and gesture recognition, our work proposes three design methods: interactive performance narrative, metaphorical elements and embodied cognition, applied to the VR performance, \"Flower and the Youth\". VR can provide audience with a more immersed experience that contributes to the transmission and dissemination of non-heritage performing arts. Our work provides universal design approaches to the creation of content for future intangible cultural heritage performances.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128092813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Whiskers: Cheek Haptic-Based Spatial Directional Guidance in a Virtual Space
Fumihiko Nakamura, Adrien Verhulst, Kuniharu Sakurada, M. Fukuoka, M. Sugimoto
In spatial navigation, adding haptic cues to visual information helps users understand spatial information better. Most haptic devices stimulate various body parts, while few target the head, which is sensitive to mechanical stimuli. This paper presents Virtual Whiskers, a spatial directional guidance technique using cheek haptics in a virtual space. We created a cheek haptic stimulation device by attaching two tiny robot arms to a Head-Mounted Display. The robot arms trace the cheek with proximity sensors to estimate the cheek surface. Target azimuthal and elevational directions are translated into a point on the cheek surface, and the robot arms touch that point to present directional cues. We demonstrate our technique in two applications.
{"title":"Virtual Whiskers: Cheek Haptic-Based Spatial Directional Guidance in a Virtual Space","authors":"Fumihiko Nakamura, Adrien Verhulst, Kuniharu Sakurada, M. Fukuoka, M. Sugimoto","doi":"10.1145/3478514.3487625","DOIUrl":"https://doi.org/10.1145/3478514.3487625","url":null,"abstract":"In spatial navigation, adding haptic cues to visual information lets users understand the spatial information better. Most haptic devices stimulate various body parts, while few devices target our heads that are sensitive to mechanical stimuli. This paper presents Virtual Whiskers, a spatial directional guidance technique using cheek haptics in a virtual space. We created a cheek haptic stimulation device by attaching two tiny robot arms to a Head-Mounted Display. The robot arms trace the cheek with proximity sensors to estimate the cheek surface. Target azimuthal and elevational directions are translated into a point on the cheek surface. The robot arms touch the point to present target directional cues. We demonstrate our technique in two applications.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114581496","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This work "indefinitely" tells the epitome of the earth and future urban pollution. In contemporary society, high walls have been built between people, and people living in cities are basically replicas. We use VR technology to create a surreal world in VR helmets, focusing on spiritual pollution and environmental pollution. In this ambiguous world, how can people find themselves in the bustling maze of nothingness? Through the three scenes of tomb forest, acid rain wasteland and hazy village, the experimenter uses the handle to understand human diseases, disasters and pollution in the past[Ypsilanti et al., 2018].They also picked up the photo fragments of "Lucas" representing hope and felt the fear of being lost in the spiritual maze. In this scene, they become poor people who are lost and polluted in their souls. At the end of the scene, the experimenter found a door full of hope, picked up all the fragments of Lucas, took a group photo of Lucas and his daughter before departure, and ended the journey.
{"title":"Indefinately:Indefinately","authors":"Chenxin Zhang, Lesi Hu","doi":"10.1145/3478514.3487626","DOIUrl":"https://doi.org/10.1145/3478514.3487626","url":null,"abstract":"This work \"indefinitely\" tells the epitome of the earth and future urban pollution. In contemporary society, high walls have been built between people, and people living in cities are basically replicas. We use VR technology to create a surreal world in VR helmets, focusing on spiritual pollution and environmental pollution. In this ambiguous world, how can people find themselves in the bustling maze of nothingness? Through the three scenes of tomb forest, acid rain wasteland and hazy village, the experimenter uses the handle to understand human diseases, disasters and pollution in the past[Ypsilanti et al., 2018].They also picked up the photo fragments of \"Lucas\" representing hope and felt the fear of being lost in the spiritual maze. In this scene, they become poor people who are lost and polluted in their souls. At the end of the scene, the experimenter found a door full of hope, picked up all the fragments of Lucas, took a group photo of Lucas and his daughter before departure, and ended the journey.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126639611","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
GIBSON: AR/VR synchronized city walking system
Seiichiro Takeuchi, Kyoko Hashiguchi, Yuki Homma, Kent Kajitani, Shingo Meguro
GIBSON is a novel city-walking system that enables distant users to walk together as if they were physically in the same city. Advances in virtual reality technology have made it possible to travel the world virtually, beyond geographical limitations, but there is still room to make the experience as realistic as real travel. Unlike conventional virtual travel tools and prior multi-user collaborative XR studies, we designed our system to evoke both a sense of co-presence and a sense of being in the real space. For this purpose, we implemented two main functions: (1) a function that transfers real-time audio-visual information of the surroundings, and (2) a function that transfers users' body movements through avatars. We also combined a visual positioning system (VPS) with SLAM to align the users' locations. We conducted user testing to verify the experience of cross-AR/VR city walking using GIBSON. The results suggest that our system can make people feel as if they were walking together in the city even though they are physically apart.
{"title":"GIBSON: AR/VR synchronized city walking system","authors":"Seiichiro Takeuchi, Kyoko Hashiguchi, Yuki Homma, Kent Kajitani, Shingo Meguro","doi":"10.1145/3478514.3487638","DOIUrl":"https://doi.org/10.1145/3478514.3487638","url":null,"abstract":"GIBSON is a novel city walking system that enables distant users to walk together as if they are physically in the same city. The advancement of virtual reality technology has opened the possibility to travel around the world virtually beyond geographical limitations, but there is still room for improvement to make the experience as realistic as real travel. Unlike conventional virtual travel tools and prior multi-user collaborative XR studies, we designed our system to evoke both a sense of co-presence and a sense of being in the real space. For this purpose, we implemented two main functions: (1) function to transfer real-time audio-visual information of the surroundings and (2) function to transfer body movements of users through avatars. We also combined visual positioning system (VPS) and SLAM to align the user locations. We conducted user testing to verify the experience of cross-AR/VR city walking using GIBSON. The result suggests that our system could make people feel as if they were walking together in the city even though they are physically distanced.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116619541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Walk Alone: Triggering Fear and Simulating Empathy to Raise Awareness about the Dangers Women Face when Walking Alone at Night
Eman Al-Zubeidi, J. Seo, Julia DeLaney, Dominic Nguyen, Jonathan Konderla, Jaime Díaz
A Walk Alone is a virtual reality experience that simulates what it feels like to walk alone at night as a woman. The concept is inspired by the kidnapping and murder of Sarah Everard, a 33-year-old woman who went missing during her walk home from her friend's apartment in March 2021. It is one thing to discuss this universal issue and highlight the precautions women take when walking alone; it is another to experience it. A Walk Alone focuses on conveying the user's vulnerability by triggering multiple senses in the virtual reality environment. The experience is centered around a linear story involving (1) a night-time city environment, (2) a first-person point of view, (3) eerie sound design, and (4) dim street lighting.
{"title":"A Walk Alone: Triggering Fear and Simulating Empathy to Raise Awareness about the Dangers Women Face when Walking Alone at Night: Triggering Fear and Simulating Empathy to Raise Awareness about the Dangers Women Face when Walking Alone at Night","authors":"Eman Al-Zubeidi, J. Seo, Julia DeLaney, Dominic Nguyen, Jonathan Konderla, Jaime Díaz","doi":"10.1145/3478514.3487627","DOIUrl":"https://doi.org/10.1145/3478514.3487627","url":null,"abstract":"A Walk Alone is a virtual reality experience that simulates what it feels like to walk alone at night as a woman. The concept is inspired by the kidnapping and murder of Sarah Everard, a 33-year-old woman who went missing during her walk home from her friend's apartment in March 2021. It is one thing to discuss this universal issue and highlight the precautions women take when walking alone, but it is another thing to experience it. A Walk Alone focuses on displaying the vulnerability of the user by triggering multiple senses in the virtual reality environment. The experience is centered around a linear story involving (1) one night-city environment, (2) the user in first person point-of-view, (3) eerie sound design, and (5) dim street lighting.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129802099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dementia Eyes: Perceiving Dementia with Augmented Reality
Ximing Shen, Yun Suen Pai, Dai Kiuchi, Kanoko Oishi, Kehan Bao, Tomomi Aoki, K. Minamizawa
Dementia is a global health crisis, and understanding patients' perception is a key step toward improving their quality of life. We propose Dementia Eyes, a mobile AR experience that simulates common visual symptoms of senile dementia based on the known pathology and on caregivers' actual experience with patients. Using an iPhone and a Head-Mounted Display (HMD), we developed a real-time application that allows users to see the world from the perspective of a patient with Alzheimer's-type dementia (AD). The experience was validated by professional medical workers in Japan, and the results support its effectiveness in evoking the empathy we intended.
{"title":"Dementia Eyes: Perceiving Dementia with Augmented Reality","authors":"Ximing Shen, Yun Suen Pai, Dai Kiuchi, Kanoko Oishi, Kehan Bao, Tomomi Aoki, K. Minamizawa","doi":"10.1145/3478514.3487617","DOIUrl":"https://doi.org/10.1145/3478514.3487617","url":null,"abstract":"Dementia is a global health crisis, of which there is a need to understand the patients’ perception towards improving their quality of life. We propose Dementia Eyes, a mobile AR experience that simulates common visual symptoms of senile dementia based on the known pathology and caregivers’ actual experience with patients. Leveraging an iPhone and a Head-Mounted Display (HMD), we developed a real-time application which allows users to see the world from the perspective of an Alzheimer’s type of dementia (AD) patient. The experience was validated by professional medical workers in Japan, and the result advocates for the efficacy of the empathy we intended to bring to them.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114213183","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality Experience ‘The World of Hiroshige’: An Immersive Virtual Reality Experience of the World that Inspired the Works of the Japanese Artist Utagawa Hiroshige
A. Weight, Daniel Flood, Dylan Neill
‘The World of Hiroshige’ is an immersive virtual reality experience that allows participants not only to view the artworks of the Japanese artist Utagawa Hiroshige in three dimensions, but also to experience and interact with the Ukiyo, or “floating world”, that inspired the works.
{"title":"Virtual Reality Experience ‘The World of Hiroshige’: An Immersive Virtual Reality Experience of the World that Inspired the Works of the Japanese Artist Utagawa Hiroshige","authors":"A. Weight, Daniel Flood, Dylan Neill","doi":"10.1145/3478514.3487611","DOIUrl":"https://doi.org/10.1145/3478514.3487611","url":null,"abstract":"‘The World of Hiroshige’ is an immersive virtual reality experience that allows the participants not only to view the artworks of the Japanese artist Utagawa Hiroshige in 3 dimensions, but also to experience and interact with the Ukiyo or “floating world\" that inspired the works.","PeriodicalId":294021,"journal":{"name":"SIGGRAPH Asia 2021 XR","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126963642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}