Soft robots - robots built from highly compliant materials that resemble soft biological tissue - have recently become more popular, especially in industrial settings, given their ability to handle fragile objects. One limitation of these devices, however, is that they can only communicate intent or the need for help through movement. To overcome this limitation, we present ChromaBot, a method for prototyping soft robotic actuators with integrated printed electrochromic displays. Our method degrades the longevity of the soft robotic actuator only to an acceptable degree while enabling a more expressive soft robot. We present detailed instructions for prototyping ChromaBot as well as an initial analysis of the durability of both the display and the actuator.
Anna Dagmar Bille Milthers and Markus Löchtefeld. "ChromaBot - Prototyping Soft Robotic Actuators with Integrated Electrochromic Displays." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497831
We live surrounded by a wide variety of computing devices, which gives us the opportunity to combine them into a unified and richer user experience. With this opportunity in mind, we created the UnaxY Framework to support the development of applications whose UI components are distributed across co-located devices. This paper focuses on a feasibility study based on two prototype applications created with UnaxY. We performed user studies to evaluate the concepts behind this type of application and the framework it is based on. We were particularly interested in how users would perceive and receive managing application state and collaborating across devices. The results are positive and clearly indicate that we should continue developing solutions that support a generalized implementation of applications in which user interaction spans multiple devices.
P. Santos, R. Madeira, and Nuno Correia. "Designing Proxemic-aware Cross-Device Applications: A Feasibility Study." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3490658
This course is a hands-on introduction to the fabrication of flexible, transparent, free-form displays based on electrochromism, aimed at an audience with a variety of backgrounds, including artists and designers with no prior knowledge of physical prototyping. In addition to prototyping displays by screen printing or ink-jet printing electrochromic ink and following a simple assembly process, participants will learn the essentials of designing and controlling electrochromic displays.
Markus Löchtefeld, Walther Jensen, and Çağlar Genç. "Prototyping of Transparent and Flexible Electrochromic Displays." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497750
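The control principle behind such displays can be sketched in a few lines: an electrochromic segment changes color when a small voltage is applied across it and bleaches again when the polarity is reversed. The class below is a minimal, hypothetical model of that drive logic; the voltage values, switching time, and all names are illustrative assumptions, not material from the course.

```python
COLOR_VOLTS = 1.5    # assumed coloring potential for a printed EC segment
BLEACH_VOLTS = -1.5  # reversed polarity bleaches the segment again
SWITCH_TIME_S = 2.0  # assumed switching time; real values depend on the ink

class ECSegment:
    """Minimal model of one electrochromic display segment."""

    def __init__(self, name):
        self.name = name
        self.colored = False  # segments start in the bleached state

    def _drive(self, volts, seconds):
        # On real hardware this would drive a GPIO/DAC pair across the
        # segment for `seconds`; here we only record the optical state.
        self.colored = volts > 0

    def color(self):
        self._drive(COLOR_VOLTS, SWITCH_TIME_S)

    def bleach(self):
        self._drive(BLEACH_VOLTS, SWITCH_TIME_S)
```

Because electrochromic segments hold their state without power between switches, a controller only needs to drive a segment briefly on each state change rather than refresh it continuously.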
D. Geerts, Radu-Daniel Vatavu, Alisa Burova, V. Vinayagamoorthy, Martez E. Mott, Mike Crabb, K. Gerling
Immersive experiences – enabled by technologies such as VR, AR, 360° video, and other highly immersive multimedia applications – have the potential to make a wide range of activities more inclusive for many people. This can be achieved by applying the principles of inclusive design. This panel will discuss the current challenges in designing inclusive immersive technologies and how they should be addressed.
D. Geerts, Radu-Daniel Vatavu, Alisa Burova, V. Vinayagamoorthy, Martez E. Mott, Mike Crabb, and K. Gerling. "Challenges in Designing Inclusive Immersive Technologies." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497751
We present two use cases of mobile displays in cross-reality interactions between users immersed in Virtual Reality (VR) and users present in the Physical Reality (PR) by using the mobile display to show select artefacts of interest. The first use case is the “Substitutional Display” where a display serves as a passive haptic for an artefact. Both VR and PR users can then move the artefact by physically moving the display. The second use case is a “Virtual Artefact Handover” which allows the VR user to pass artefacts onto the PR user’s display. We envision this handover as a natural interaction where the VR user moves the artefact onto a virtual proxy of the display the PR user is holding, after which the artefact is displayed for the PR user to see.
Robbe Cools and A. Simeone. "Mobile Displays for Cross-Reality Interactions between Virtual and Physical Realities." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497838
Tamzid Hossain, Md. Fahimul Islam, W. Delamare, Farida Chowdhury, Khalad Hasan
Advancements in eye-tracking technology have prompted researchers to explore potential eye-based interactions with diverse devices. Although many commercial devices are now equipped with eye-tracking solutions (e.g., the HTC VIVE Pro), little is known about the social acceptability of eye-based interaction techniques and users' preferences for them, especially with smartphones. We report on three studies exploring users' social acceptance of and preferences for different head- and eye-based inputs with smartphones. The results show that eye movements are more socially acceptable than other head- and eye-based techniques due to their subtle nature. Based on these findings, we further examine users' preferences regarding saccade and pursuit eye movements. The results reveal a preference for saccade over pursuit eye movements. In a third study exploring delimiting actions that discriminate between intentional and unintentional eye input, dwell emerged as the preferred delimiter in both public and private spaces. We conclude with design guidelines for eye-based interactions on smartphones.
Tamzid Hossain, Md. Fahimul Islam, W. Delamare, Farida Chowdhury, and Khalad Hasan. "Exploring Social Acceptability and Users’ Preferences of Head- and Eye-Based Interaction with Mobile Devices." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3490636
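A dwell delimiter of the kind the study found preferable can be sketched in a few lines: an eye-based command counts as intentional only if the gaze remains on the same target for at least a threshold duration. The sample format, threshold, and function names below are illustrative assumptions, not details from the paper.

```python
DWELL_MS = 500  # assumed dwell threshold in milliseconds

def detect_dwell(samples, dwell_ms=DWELL_MS):
    """Detect an intentional gaze selection via dwell.

    samples: chronological list of (timestamp_ms, target_id or None)
    Returns the first target fixated for >= dwell_ms, else None.
    """
    current = None  # target the gaze is currently resting on
    start = None    # timestamp when the gaze entered that target
    for ts, target in samples:
        if target != current:
            # Gaze moved to a new target (or off-target): restart the clock.
            current, start = target, ts
        if current is not None and ts - start >= dwell_ms:
            return current
    return None
```

A brief glance across a target therefore never triggers a command, which is exactly the property that makes dwell useful for separating intentional from unintentional eye input.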
Jonah-Noël Kaiser, Thu Marianski, Marco Muras, M. Chamunorwa
The need for remote usability testing has increased during the ongoing COVID-19 pandemic, as lockdown and physical-distancing regulations have affected how HCI researchers conduct in-person tests of systems and technologies under design. We present the Pop-Up Observation Kit, which serves as an affordable mobile usability lab. The kit is sent to study participants alongside the system they are testing. Its simple, unobtrusive form factor enables participants to concentrate on the task itself rather than on documenting the task they are performing. While initially developed to observe hand and finger gestures on a pressure-sensor mat for the Rich Interactive Materials for everyday objects project, the Pop-Up Observation Kit also applies to other use cases. Additionally, the kit can be extended with further functionality, in combination with the sensor mat, to enable richer data collection or unmoderated remote observation.
Jonah-Noël Kaiser, Thu Marianski, Marco Muras, and M. Chamunorwa. "Popup Observation Kit for Remote Usability Testing." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497871
Sarah Aragon Bartsch, Christina Schneegass, Florian Bemmann, D. Buschek
Common sources of career information, such as websites, often provide a static overall picture of a job but lack personal insights into daily working life. To address this problem, we present a novel mobile career guidance method: it enables users to remotely gain an impression of different work routines by receiving several short, scheduled chat messages from a persona throughout the day. These messages were previously collected from real professionals reporting on their tasks over a week. We implemented a smartphone application to compare our message-based approach to a traditional blog entry in a two-week within-subject field study (N = 17). Users highlighted that the scheduled messages (1) enhanced their understanding of work routines by integrating career information into their own daily context and (2) offered authentic insights into the jobs. We discuss design implications for mobile career guidance systems and future opportunities for presenting chunks of information in a temporal context.
Sarah Aragon Bartsch, Christina Schneegass, Florian Bemmann, and D. Buschek. "A Day in the Life: Exploring the Use of Scheduled Mobile Chat Messages for Career Guidance." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3490637
We present the design and prototype of a memory orb, a cylindrical device inspired by an artifact depicted in the science-fiction movie Blade Runner 2049. The memory orb is a handheld input device that allows users to control and manipulate three-dimensional virtual content by combining pushing, pulling, and rotating motor actions, facilitating muscle memory and eyes-free interaction. Our prototype integrates a large number of electronic components while maintaining a small form factor to allow ease of control and simple handling. The described implementation aims to showcase the application potential of this cylindrical device in mixed- or virtual-reality systems.
D. Brun, Philipp Jordan, and Jonna Häkkilä. "Demonstrating a Memory Orb — Cylindrical Device Inspired by Science Fiction." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3497873
Munehiro Iwamoto, Ayumi Ohnishi, T. Terada, M. Tsukamoto
Field hockey is a popular global sport played with a stick and a hard ball, and skillful stick manipulation is difficult to master. The push pass is the most basic and common way to pass to teammates. For an optimal push pass, the stick should stay in contact with the ball for as long as possible before the ball is released. However, it is difficult for beginners to perceive the contact point of the ball on the stick by themselves. This study proposes a system that lets the user hear the movement path of the ball's contact position on the stick in order to improve push-pass technique. In the proposed system, eight pressure sensors are placed on a hockey stick. The system detects the contact path of the ball and produces real-time feedback sounds through a piezoelectric speaker, with pitches that vary depending on the contact position. In an evaluation experiment conducted over two months, the average length of the ball's movement path on the stick was significantly longer when auditory feedback was provided than when it was not. This confirms the effectiveness of auditory feedback using the proposed system.
Munehiro Iwamoto, Ayumi Ohnishi, T. Terada, and M. Tsukamoto. "Design and Implementation of Push Pass Practice Support System for Field Hockey with Auditory Feedback." In Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. DOI: https://doi.org/10.1145/3490632.3490648
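The core of such a position-to-sound mapping can be sketched compactly: each of the eight pressure sensors along the stick is assigned its own feedback pitch, so the player can hear the ball travel along the blade. The base frequency, semitone spacing, and function names below are illustrative assumptions, not values from the paper.

```python
NUM_SENSORS = 8          # the paper places eight pressure sensors on the stick
BASE_FREQ_HZ = 440.0     # assumed pitch for the sensor nearest the handle
SEMITONES_PER_STEP = 2   # assumed interval between neighbouring sensors

def sensor_to_pitch(index):
    """Map a sensor index (0..7) to a feedback tone frequency in Hz,
    using equal-tempered semitone steps."""
    if not 0 <= index < NUM_SENSORS:
        raise ValueError("sensor index out of range")
    return BASE_FREQ_HZ * 2 ** (index * SEMITONES_PER_STEP / 12)

def contact_path_to_pitches(active_sensors):
    """Turn the sequence of contact positions detected during a pass
    into the sequence of tone frequencies to play in real time."""
    return [sensor_to_pitch(i) for i in active_sensors]
```

A longer push pass then produces an audible glissando across more of the scale, which is one plausible way the rising-pitch feedback could convey how far the ball travelled along the stick.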