We present ShapelineGuide, a dynamic visual guide that supports users of large interactive displays while performing mid-air gestures. Today, we find many examples of large displays that support interaction through gestures performed in mid-air. Yet, approaches that support users in learning and executing these gestures are still scarce. Prior approaches require complex setups, are targeted at 2D gestures, or focus only on the initial gestures. Our work extends the state of the art by presenting a feedforward system that provides users with constant updates on their gestures. We report on the design and implementation of the approach and present findings from an evaluation of the system in a lab study (N=44), focusing on learning performance, accuracy, and errors. We found that ShapelineGuide helps users learn the gestures and decreases execution times and cognitive load.
{"title":"ShapelineGuide","authors":"Florian Alt, Sabrina Geiger, W. Höhl","doi":"10.1145/3205873.3205887","DOIUrl":"https://doi.org/10.1145/3205873.3205887","url":null,"abstract":"We present ShapelineGuide, a dynamic visual guide that supports users of large interactive displays while performing mid-air gestures. Today, we find many examples of large displays supporting interaction through gestures performed in Mid-air. Yet, approaches that support users in learning and executing these gestures are still scarce. Prior approaches require complex setups, are targeted towards the use of 2D gestures, or focus on the initial gestures only. Our work extends state-of-the-art by presenting a feedforward system that provides users constant updates on their gestures. We report on the design and implementation of the approach and present findings from an evaluation of the system in a lab study (N=44), focusing on learning performance, accuracy, and errors. We found that ShapelineGuide helps users with regard to learning the gestures as well as decreases execution times and cognitive load.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"148 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116152788","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Internet of Things (IoT), enabled through sensor-rich environments and smart devices, allows us to collect and exchange vast quantities of data. The advent of new markets, such as the smart home sector, and movements, such as the quantified self, indicate the IoT's huge economic and social impact. With the increased availability of IoT services, it becomes important to provide users with intuitive mechanisms for accessing the gathered data. In this work, we present findings from an exploratory design case study in which we deployed a low-res lighting display in three family households to visualize domestic energy performance data. Our study showed that the standalone lighting display was preferred over a commercially available web-based application. Further, we found that in two of the three households, participants who had not used the mobile application before became the main users of the display and actively engaged with the visualized data. The paper concludes with design implications for pervasive displays connected as ambient gateways to smart devices.
{"title":"Designing Low-Res Lighting Displays as Ambient Gateways to Smart Devices","authors":"Marius Hoggenmüller, A. Wiethoff, M. Tomitsch","doi":"10.1145/3205873.3205876","DOIUrl":"https://doi.org/10.1145/3205873.3205876","url":null,"abstract":"The Internet of Things (IoT) enabled through sensor-rich environments and smart devices allows us to collect and exchange vast quantities of data. The advent of new markets, such as the smart home sector, and movements, such as the quantified self, indicate the IoT's huge economic and social impact. With the increased availability of IoT services, it becomes important to enable users with intuitive mechanisms for accessing the gathered data. In this work, we present findings from an exploratory design case study, in which we deployed a low-res lighting display in three family households to visualize domestic energy performance data. Our study showed that the standalone lighting display was preferred over a commercially available web-based application. Further, we found that in two of the three households those participants, who did not use the mobile application before, became the main user of the display and actively engaged with the visualized data. The paper concludes with design implications for pervasive displays connected as ambient gateways to smart devices.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116255094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
LEGO Serious Play (LSP) supports the facilitation of creative communication with various methods (e.g., team building or strategy development) that have been used successfully for more than 15 years. Nevertheless, it lacks a process for documenting the outcomes of the workshops. Therefore, participants or an assistant of the facilitator take hand-written notes during the workshops. As these are personal notes of the participants, they are most likely not suitable as documentation for people who did not attend the workshop. On the other hand, many applications use augmented reality (AR) to add information to real-world objects. In this paper, we develop a method for documenting LSP artefacts with AR. Our focus is, on the one hand, on the process of producing the documentation and, on the other hand, on the interaction with an AR prototype for 3D positioning of annotations using Microsoft's HoloLens.
{"title":"An AR-method for documenting LEGO Serious Play models","authors":"J. Herbert, Thomas Herrmann","doi":"10.1145/3205873.3210703","DOIUrl":"https://doi.org/10.1145/3205873.3210703","url":null,"abstract":"LEGO Serious Play (LSP) supports the facilitation of creative communication with various methods (e.g. team-building or strategy development) that are successfully used for more than 15 years. Nevertheless, it is missing a process for documenting the outcome of the workshops. Therefore, participants or an assistant of the facilitator take hand-written notes within the workshops. As these are personal notes of the participants, they are most likely not feasible as a documentation for people who did not attend at the workshop. On the other hand, many applications use augmented reality (AR) to add information to real world objects. In this paper, we try to develop a method for documentation of LSP artefacts with AR. Our focus is on the one hand on the process of producing the documentation and on the other hand on the interaction with an AR prototype for 3D positioning of annotations using Microsoft's HoloLens.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133080747","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
S. Sorce, V. Gentile, Debora Oliveto, Rossella Barraco, A. Malizia, A. Gentile
Many prior works have investigated the potential of pervasive technologies and interactive applications to increase access to digital content for people with disabilities, particularly Neuro-Developmental Disorders (NDDs). In this paper, we present an exploratory study aimed at understanding whether an avatar-based touchless gestural interface is able to foster interest in digital representations of artworks, e.g. paintings or sculptures usually exhibited in museums, and to make them more accessible for such people. In particular, the study involved three autistic people and a therapist, and allowed us to report on the potential of an avatar to communicate interactivity and stimulate interaction with only a few initial directions, or none at all. We also briefly present and discuss possible ideas for future developments.
{"title":"Exploring Usability and Accessibility of Avatar-based Touchless Gestural Interfaces for Autistic People","authors":"S. Sorce, V. Gentile, Debora Oliveto, Rossella Barraco, A. Malizia, A. Gentile","doi":"10.1145/3205873.3210705","DOIUrl":"https://doi.org/10.1145/3205873.3210705","url":null,"abstract":"Many prior works investigated the potential of pervasive technologies and interactive applications to increase access capabilities to digital content for people with disability, particularly Neuro-Developmental Disorders (NDDs). In this paper, we present an exploratory study aimed at understanding if an avatar-based touchless gestural interface is able to foster interest towards digital representations of artworks, e.g. paintings or sculptures usually exhibited in museums, and to make them more accessible for such people. In particular, the study involved three autistic people and a therapist, and allowed us to report the potential of an avatar to communicate the interactivity and stimulate interaction with just a few directions to start, or not at all. We also shortly present and discuss some possible idea for future developments.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133060431","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Anke M. Brock, Julia Chatain, Michelle Park, Tommy Fang, M. Hachet, J. Landay, Jessica R. Cauchard
Interactive maps have become ubiquitous in our daily lives, helping us reach destinations and discover our surroundings. Yet, designing map interactions is not straightforward and depends on the device being used. As mobile devices evolve and become independent from users, such as with robots and drones, how will we interact with the maps they provide? We propose FlyMap as a novel user experience for drone-based interactive maps. We designed and developed three interaction techniques for FlyMap's usage scenarios. In a comprehensive indoor study (N = 16), we show the strengths and weaknesses of two techniques with respect to users' cognition, task load, and satisfaction. FlyMap was then pilot tested with the third technique outdoors, in real-world conditions, with four groups of participants (N = 13). We show that FlyMap's interactivity is exciting to users and opens the space for more direct interactions with drones.
{"title":"FlyMap","authors":"Anke M. Brock, Julia Chatain, Michelle Park, Tommy Fang, M. Hachet, J. Landay, Jessica R. Cauchard","doi":"10.1145/3205873.3205877","DOIUrl":"https://doi.org/10.1145/3205873.3205877","url":null,"abstract":"Interactive maps have become ubiquitous in our daily lives, helping us reach destinations and discovering our surroundings. Yet, designing map interactions is not straightforward and depends on the device being used. As mobile devices evolve and become independent from users, such as with robots and drones, how will we interact with the maps they provide? We propose FlyMap as a novel user experience for drone-based interactive maps. We designed and developed three interaction techniques for FlyMap's usage scenarios. In a comprehensive indoor study (N = 16), we show the strengths and weaknesses of two techniques on users' cognition, task load, and satisfaction. FlyMap was then pilot tested with the third technique outdoors in real world conditions with four groups of participants (N = 13). We show that FlyMap's interactivity is exciting to users and opens the space for more direct interactions with drones.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"338 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122334750","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
L. Coconu, Brygg Ullmer, P. Paar, Jing Lyu, Miriam K. Konkel, H. Hege
The use of tangible interfaces for navigating landscape scenery -- for example, lost places re-created in 3D -- has been pursued and articulated as a promising, impactful application of interactive visualization. In this demonstration, we present a modern, low-cost implementation of a previously realized multimodal gallery installation. Our demonstration centers on the versatile usage of a smartphone for sensing, navigating, and (optionally) displaying elements on a physical surface in tandem with a larger, more immersive display.
{"title":"A Smartphone-Based Tangible Interaction Approach for Landscape Visualization","authors":"L. Coconu, Brygg Ullmer, P. Paar, Jing Lyu, Miriam K. Konkel, H. Hege","doi":"10.1145/3205873.3210707","DOIUrl":"https://doi.org/10.1145/3205873.3210707","url":null,"abstract":"The use of tangible interfaces for navigation of landscape scenery -- for example, lost places re-created in 3D -- has been pursued and articulated as a promising, impactful application of interactive visualization. In this demonstration, we present a modern, low-cost implementation of a previously-realized multimodal gallery installation. Our demonstration centers upon the versatile usage of a smartphone for sensing, navigating, and (optionally) displaying element on a physical surface in tandem with a larger, more immersive display.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114169522","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Vanessa Cobus, Bastian Ehrhardt, Susanne CJ Boll, Wilko Heuten
Nurses in intensive care units are exposed to a large number of acoustic alarms which need to be evaluated and acknowledged. With different pitches and frequencies, those alarms convey different levels of urgency to support the evaluation of their source. However, with up to 350 acoustic alarms per patient per day, desensitization to alarms is unavoidable. To reduce the risk of alarm fatigue, we develop multimodal concepts to deliver alarms in a non-acoustical manner. In this work, we present a body-worn pervasive device by which former acoustic alarms are now displayed through tactile stimuli on the upper arm of a caregiver. The prototype as well as the implemented vibration patterns were evaluated with nurses under task conditions that mimic common loads of care tasks.
{"title":"Demo","authors":"Vanessa Cobus, Bastian Ehrhardt, Susanne CJ Boll, Wilko Heuten","doi":"10.1145/3205873.3210710","DOIUrl":"https://doi.org/10.1145/3205873.3210710","url":null,"abstract":"Nurses in intensive care units are exposed to a large number of acoustic alarms which need to be evaluated and acknowledged. With different pitches and frequencies those alarms convey different levels of urgency to support the evaluation of its source. However, with up to 350 acoustic alarms per patient per day a desensitization for alarms is unavoidable. To reduce the risk of alarm fatigue, we develop multimodal concepts to deliver alarms in a non-acoustical manner. In this work, we present a body-worn pervasive device by which former acoustic alarms are now displayed by tactile stimuli on the upper arm of a care taker. The prototype as well as the implemented vibration patterns were evaluated with nurses under task conditions that mimic common loads of care tasks.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"128 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115096881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tim Claudius Stratmann, Andreas Löcken, Uwe Gruenefeld, Wilko Heuten, Susanne CJ Boll
For decision making in monitoring and control rooms, situation awareness is key. Given the often spacious and complex environments, simple alarms are not sufficient for attention guidance (e.g., on ship bridges). In our work, we explore shifting attention towards the location of relevant entities in large cyber-physical systems. To this end, we used pervasive displays: tactile displays on both upper arms and a peripheral display. With these displays, we investigated shifting attention in a seated and a standing scenario. In a first user study, we evaluated four distinct cue patterns for each on-body display. We tested seated monitoring limited to 90° in front of the user. In a second study, we continued with the two patterns from the first study that were perceived as lowest and highest in urgency. Here, we investigated standing monitoring in a 360° environment. We found that tactile cues led to faster arousal times than visual cues, whereas attention shifts were faster for visual cues than for tactile cues.
{"title":"Exploring Vibrotactile and Peripheral Cues for Spatial Attention Guidance","authors":"Tim Claudius Stratmann, Andreas Löcken, Uwe Gruenefeld, Wilko Heuten, Susanne CJ Boll","doi":"10.1145/3205873.3205874","DOIUrl":"https://doi.org/10.1145/3205873.3205874","url":null,"abstract":"For decision making in monitoring and control rooms situation awareness is key. Given the often spacious and complex environments, simple alarms are not sufficient for attention guidance (e.g., on ship bridges). In our work, we explore shifting attention towards the location of relevant entities in large cyber-physical systems. Therefore, we used pervasive displays: tactile displays on both upper arms and a peripheral display. With these displays, we investigated shifting the attention in a seated and standing scenario. In a first user study, we evaluated four distinct cue patterns for each on-body display. We tested seated monitoring limited to 90° in front of the user. In a second study, we continued with the two patterns from the first study for lowest and highest urgency perceived. Here, we investigated standing monitoring in a 360° environment. We found that tactile cues led to faster arousal times than visual cues, whereas the attention shift speed for visual cues was faster than tactile cues.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125209167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper we investigate kinetic displays in the form of robotic installations in the city of Hull, UK. The kinetic installations -- comprising a set of orange robotic arms, light sources, mirrors, and soundscapes -- performed spatial and temporal rhythms in four different urban settings across Hull's Old Town. We investigate the installations in an attempt to clarify a) the visual and auditory impact of the robots on the surrounding environments; b) the social impact of the performances on each setting; and c) the temporal impact of the performances on the social behaviours and experiences around the robots. The results of the study suggest that, in the context of outdoor urban settings, people tend to perceive robots as kinetic sculptures more than as urban installations. We contribute to the discussion around pervasive displays by considering kinetic robotic installations as an emergent type of urban display, with potentially lasting effects on the experience of city environments. We address and chart constraints and challenges for urban environments of the future.
{"title":"Welcoming the Orange Collars: Robotic Performance in Everyday City Life","authors":"Ecem Ergin, A. G. Afonso, A. Schieck","doi":"10.1145/3205873.3205893","DOIUrl":"https://doi.org/10.1145/3205873.3205893","url":null,"abstract":"In this paper we investigate kinetic displays in the form of robotic installations in the city of Hull, UK. The kinetic installations -- comprising a set of orange robotic arms, light sources, mirrors and soundscapes -- performed spatial and temporal rhythms in four different urban settings across Hull's Old Town. We investigate the installations as an attempt to clarify a) the visual and auditory impact of the robots on the surrounding environments; b) the social impact of the performances on each setting; and c) the temporal impact of the performances on the social behaviours and experiences around the robots. The results of the study suggest that, in the context of outdoor urban settings, people tend to perceive robots as kinetic sculptures more than as urban installations. We contribute to the discussion around pervasive displays by considering kinetic robotic installations as an emergent type of urban displays, with potentially lasting effects on the experience of city environments. We address and chart constraints and challenges for urban environment of the future.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125408282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, we address smart jewelry, with a particular focus on the enhancement of traditional jewelry. We chart people's perceptions of smart jewelry through focus group and online survey user studies. We then present and evaluate a necklace design, including a prototype mobile augmented reality application. Our salient findings show that the technical concept and visual design should not conflict with the physical form and aesthetic of the jewelry, and that the personal nature of and emotional bonding with the jewelry should also be reflected in the digital extensions. Our work advances the understanding of people's attitudes and preferences towards smart jewelry, and extends the body of research on augmented wearables, which is under-explored but has great potential for the future.
{"title":"Smart Jewelry: Augmenting Traditional Wearable Self-Expression Displays","authors":"Inka Rantala, Ashley Colley, Jonna Häkkilä","doi":"10.1145/3205873.3205891","DOIUrl":"https://doi.org/10.1145/3205873.3205891","url":null,"abstract":"In this paper, we address smart jewelry, with particular focus on the enhancement of traditional jewelry. We chart people's perceptions on smart jewelry through focus group and online survey user studies. We then present and evaluate a necklace design including a prototype mobile augmented reality application. Our salient findings show that the technical concept and visual design should not conflict with the physical form and aesthetic of the jewelry, and that the personal nature and emotional bonding with the jewelry should be reflected also in the digital extensions. Our work progresses the understanding of people's attitudes and preferences towards smart jewelry, and extends the body of research on augmented wearables, which is under-explored but has a great potential for the future.","PeriodicalId":340580,"journal":{"name":"Proceedings of the 7th ACM International Symposium on Pervasive Displays","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130340250","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}