Making Sustainable, Tangible Objects with Myco-materials
Phillip Gough, A. Globa, A. Hadigheh, A. Withana
DOI: 10.1145/3532104.3571468
There is growing interest in using living materials as sustainable alternatives to conventional materials in product design. In particular, the ability of some mushroom species to grow into lightweight, rigid materials that can be moulded into complex 3D forms is of interest to a range of interactive applications, including data physicalisation, aesthetic experiences of a space, and wearable computing. This tutorial aims to provide a hands-on experience with living, mushroom-based “myco-materials” that can be grown into a range of complex shapes, and to introduce an inexpensive, low-technology workflow for the design and fabrication of 3D objects from living, sustainable materials.
Social VR for Socially Isolated Adolescents with Significant Illnesses
U. Udapola
DOI: 10.1145/3532104.3571466
Adolescents with significant illnesses face various psychosocial and mental wellbeing challenges during hospitalisation. Social isolation from family and peers is a significant concern for this group. Several digital interventions have been proposed to connect these young people with others, such as video conferencing, social media, social robots, and online games, and research so far has found these to be beneficial for adolescents’ wellbeing. Social VR is a relatively novel social interaction mechanism that allows users to interact socially within an immersive 3D virtual environment with a sense of embodiment. Game-play is generally identified as a motivating factor in user engagement; applying gamification to a social VR space could therefore encourage and motivate socially isolated adolescents to engage socially. The main goal of this research project is to enhance the social engagement of socially isolated adolescents by fostering positive interactions with their peers. We expect to discover whether engaging with the intervention decreases problems associated with social isolation.
Impact on the Quality of Interpersonal Relationships by Proximity using the Ventriloquism Effect in a Virtual Environment
A. Yamazaki, N. Wakatsuki, K. Mizutani, Yukihiko Okada, K. Zempo
DOI: 10.1145/3532104.3571460
This paper discusses more effective interaction between a user immersed in a virtual environment (VE) and a salesperson avatar. Exploiting the dominance of visual information over auditory information, we implemented an interpersonal situation in which only the avatar’s sound image invaded the user’s personal space (PS). We investigated the impressions that 16 participants had of the avatar. The experimental results showed that when the sound image invaded the PS, the correlation coefficient between rapport for service quality and the distance from the user to the sound image, normalized by each participant’s interpersonal distance, was −0.24 (p < 0.05). This indicates that, within the range of the ventriloquism effect, the avatar’s sound image intruding into the PS and approaching the user improves rapport. The technique proposed in this paper, in which the positions of the visual image and the sound image diverge, is expected to improve the value co-creation of interpersonal service experiences in VEs such as virtual stores, which have been increasing in recent years.
Indigenous CHI Workshop
Kevin Shedlock, M. Vos, Petera Hudson, Jamey Hepi, Betty Kim, Zane Rawson, Marino Doyle
DOI: 10.1145/3532104.3571469
This CHI workshop looked to connect Indigenous peoples to surfaces and spaces through experiential media for revisiting historical narratives, heritage, and/or interactive and immersive media applications. The workshop was especially interested in contributions that displayed cultural resilience inside a digital environment or aligned with a common interest in challenging the status quo. We welcomed individuals or groups interested in submitting work on (but not limited to) interactive design, prototyping, methodology, human-computer interaction, and conceptual and creative works that advance Indigenous ideas through a technical lens.
Video Analysis of Hand Gestures for Distinguishing Patients with Carpal Tunnel Syndrome
R. Matsui, Takuya Ibara, Kazuya Tsukamoto, Takafumi Koyama, K. Fujita, Yuta Sugiura
DOI: 10.1145/3532104.3571461
Carpal tunnel syndrome (CTS) is a common condition characterized by hand dysfunction due to median nerve compression. Orthopedic surgeons often look for signs of the symptoms to screen for CTS; however, it is difficult to distinguish CTS from other diseases with similar symptoms. We previously introduced a method of evaluating fine hand movements to screen for cervical myelopathy (CM). The present work applies this method to screening for CTS, using videos of specific hand gestures to measure their quickness. Machine learning models evaluate the gestures to estimate the probability that a patient has CTS. We cross-validated the models to evaluate our method’s effectiveness in screening for CTS; the results showed a sensitivity of 90.0% and a specificity of 85.3%. Furthermore, we found that our method can also distinguish CTS from CM, and it may enable earlier detection and treatment of similar neurological diseases.
Space Ocean Library: Interactive Narrative Experience in VR
Becky Lake, Krzysztof Pietroszek
DOI: 10.1145/3532104.3571453
“Space Ocean Library” is an interactive narrative VR experience that transports you to a study turned mystical purgatory. In this room, you explore the connectivity of humans through the objects we keep around to remember our lives. “Space Ocean Library” features actors recorded through volumetric capture and objects rendered using photogrammetry.
Retzzles: Engaging Users towards Retention through Touchscreen Puzzles
N. Kovacevic, M. Weerasinghe, J. A. Deja
DOI: 10.1145/3532104.3571531
Textual sources provide limited information to their readers, which can be underwhelming and may reduce engagement. Augmentation approaches have been introduced to present information more engagingly and have shown potential for supporting information retention. In this research, we inquire further into this opportunity through the use of interactive touchscreen visual elements such as puzzle pieces. We present Retzzles, in which users solve puzzles applied to a tourist use case. To evaluate this, we will conduct a within-subjects study with n = 30 participants to determine whether such elements promote engagement and thereby support information retention. Our preliminary findings shed light on some perspectives on the use of touchscreen displays for engagement but are subject to further investigation. We aim to contribute to further discussion on the use of interactive screens in similar learning scenarios.
Rethinking Smart Objects: The International Workshop on Interacting with Smart Objects in Interactive Spaces
Martin Schmitz, Sebastian Günther, Karola Marky, Florian Müller, A. Matviienko, Alexandra Voit, Roberts Marky, M. Mühlhäuser, T. Kosch
DOI: 10.1145/3532104.3571470
The increasing proliferation of smart objects in everyday life has changed how we interact with computers. Instead of concentrating computational capabilities and interaction in one device, interactive features have become naturally integrated parts of everyday objects. Although this has led to many practical applications, the possibilities for explicit or implicit interaction with such objects in interaction spaces are still limited. We still often rely on smartphones as interactive hubs for controlling smart objects, hence not fulfilling the vision of truly smart objects. The workshop Rethinking Smart Objects invites practitioners and researchers from both academia and industry to discuss novel interaction paradigms as well as the integration and societal implications of using smart objects in interactive spaces. The workshop will include an action plan with leading questions, aiming to move the research field forward.
Evaluation of Grasp Posture Detection Method using Corneal Reflection Images through a Crowdsourced Experiment
X. Zhang, Kaori Ikematsu, Kunihiro Kato, Yuta Sugiura
DOI: 10.1145/3532104.3571457
To achieve adaptive user interfaces (UIs) for smartphones, researchers have been developing sensing methods to detect how a user is holding a smartphone. A variety of promising adaptive UIs have been demonstrated, such as those that automatically switch the displayed content and the position of interactive components according to how the phone is being held. In this paper, we present a follow-up study on ReflecTouch, a state-of-the-art grasp posture detection method proposed by Zhang et al. that uses corneal reflection images captured by the front camera of a smartphone. We extend the previous work by investigating the method’s performance toward real-world use, and its potential challenges, through a crowdsourced experiment with a large number of participants.
Design and Prototype Conversational Agents for Research Data Collection
Jing Wei, Young-Ho Kim, Samantha W. T. Chan, Tilman Dingler
DOI: 10.1145/3532104.3571467
Conversational agents have gained increasing interest from researchers as a tool for collecting data and administering interventions. They provide a natural user interface through conversation and hence have the potential to reach a wide population in their homes and on the go. Several developer tools and commercial as well as open-source frameworks allow for the deployment of both text-based chatbots and voice assistants. In this 90-minute tutorial, participants will learn how to choose an appropriate platform, how to design and deploy their conversational agents, and how to transform traditional surveys through conversational agents.