BackPat: one-handed off-screen patting gestures
Karsten Seipp, K. Devlin
MobileHCI 2014, pp. 77-80. DOI: https://doi.org/10.1145/2628363.2628396
We present BackPat, a technique for supporting one-handed smartphone operation with pats of the index finger, middle finger, or thumb on the back or side of the device. We devise a novel method that uses the device's microphone and gyroscope to enable finger-specific gesture detection, and we explore the efficiency and user acceptance of gesture execution for each finger in three user studies with novice BackPat users.
Automating UI tests for mobile applications with formal gesture descriptions
M. Hesenius, Tobias Griebe, Stefan Gries, V. Gruhn
MobileHCI 2014, pp. 213-222. DOI: https://doi.org/10.1145/2628363.2628391
Touch- and gesture-based interfaces are common in applications for mobile devices. As smartphones and tablets evolved into mass-market products, they created an increased need for specialized software engineering methods. To ensure high-quality applications, constant and efficient testing is crucial in software development. However, testing mobile applications is still cumbersome, time-consuming, and error-prone. One reason is the devices' focus on touch-based interaction: gestures cannot be easily incorporated into automated application tests. We present an extension to the popular Calabash testing framework that solves this problem by allowing gestures to be described with a formal language in test scripts.
CoPerformance: a rapid prototyping platform for developing interactive artist-audience performances with mobile devices
B. Anderson, S. Oliver, Patricio Dávila
MobileHCI 2014, pp. 605-607. DOI: https://doi.org/10.1145/2628363.2645699
How can mobile technology create new models for audience participation in live performances? CoPerformance is a research project that aims to develop a set of plug-and-play participatory performance modules. These modules will allow designers to quickly build and test interactive experiences that use mobile devices. CoPerformance can be deployed via web browsers or native applications. The goal of the platform is to use existing frameworks to offer designers a powerful set of tools, templates, and scripts for interactive performances, and to lower the barriers to building participatory performances.
A wearable virtual guide for context-aware cognitive indoor navigation
Qianli Xu, Liyuan Li, Joo-Hwee Lim, Cheston Tan, Michal Mukawa, Gang S. Wang
MobileHCI 2014, pp. 111-120. DOI: https://doi.org/10.1145/2628363.2628390
In this paper, we explore a new way to provide context-aware assistance for indoor navigation using a wearable vision system. We investigate how to represent the cognitive knowledge of wayfinding based on first-person-view videos in real time, and how to provide context-aware navigation instructions in a human-like manner. Inspired by the human cognitive process of wayfinding, we propose a novel cognitive model that represents visual concepts as a hierarchical structure and facilitates efficient and robust localization based on those concepts. We then design a prototype system that provides intelligent context-aware assistance based on this cognitive indoor navigation knowledge model. We conducted field tests and evaluated the system's efficacy by benchmarking it against traditional 2D maps and human guidance. The results show that context-awareness built on cognitive visual perception enables the system to emulate the efficacy of a human guide, leading to a positive user experience.
Logging and visualization of touch interactions on teach pendants
Clemens Holzmann, Florian Lettner, C. Grossauer, Werner Wetzlinger, Paul Latzelsperger, Christian Augdopler
MobileHCI 2014, pp. 619-624. DOI: https://doi.org/10.1145/2628363.2628429
In industrial automation, touch panels are increasingly used for controlling and programming machines. Compared to traditional panels with physical buttons, they provide higher flexibility and operating efficiency. A big challenge in their user interface design is usability, which is directly related to the operator's safety and performance. In this paper, we present a software solution for the acquisition and visualization of user interaction data on teach pendants, the handheld terminals used for teaching robot positions. The interaction data include touch coordinates and navigation sequences, which are visualized with a heatmap and a graph view, respectively. The design and implementation of the software are based on interviews we conducted with companies from the automation industry. The painless integration of our software by manufacturers, together with the automated recording and visualization of user interactions, allows for cost-efficient usability analysis of handheld terminals.
Understanding shortcut gestures on mobile touch devices
Benjamin Poppinga, Alireza Sahami Shirazi, N. Henze, Wilko Heuten, Susanne CJ Boll
MobileHCI 2014, pp. 173-182. DOI: https://doi.org/10.1145/2628363.2628378
Touch gestures are becoming steadily more important with the ongoing success of touchscreen devices. Compared to traditional user interfaces, gestures have the potential to lower cognitive load and the need for visual attention. Today, however, gestures are defined by designers and developers, and it is questionable whether these meet all user requirements. In this paper, we present two exploratory studies that investigate how users would use unistroke touch gestures for shortcut access to a mobile phone's key functionalities. We study the functions that users want to access, the preferred activators for gesture execution, and the shapes of the user-invented gestures. We found that most gestures trigger applications, that letter-shaped gestures are preferred, and that gestures should be accessible from the lock screen, the wallpaper, and the notification bar. We conclude with a coherent, unambiguous set of gestures for the 20 most frequently accessed functions, which can inform the design of future gesture-controlled applications.
Workshop on designing the future of mobile healthcare support
S. Diamond, B. Arunachalan, Derek F. Reilly, Anne Stevens
MobileHCI 2014, pp. 589-592. DOI: https://doi.org/10.1145/2628363.2634263
This workshop aims to discuss and develop ideas on how healthcare services, mobile technologies, and visual analytics techniques can be leveraged to create new designs for mobile healthcare support systems. Designing contemporary mobile healthcare support systems requires a clear understanding of users' information requirements, behaviors, and basic needs. Design must take into account the challenges of human-device interaction in the healthcare environment; the extension of the care environment beyond the institutional setting and the engagement of patients, facility residents, and families in an extended circle of care; and issues of formal and informal data sharing and privacy. This workshop invites researchers and designers working in relevant fields to discuss, compare, and demonstrate effective design approaches for improving mobile support systems for interactive visualization in healthcare.
AirAuth: evaluating in-air hand gestures for authentication
Md Tanvir Islam Aumi, Sven G. Kratz
MobileHCI 2014, pp. 309-318. DOI: https://doi.org/10.1145/2628363.2628388
Secure authentication with devices or services that store sensitive and personal information is highly important. However, traditional password- and PIN-based authentication methods force a compromise between the level of security and the user experience. AirAuth is a biometric authentication technique that uses in-air gesture input to authenticate users. We evaluated our technique on a predefined (simple) gesture set, and our classifier achieved an average accuracy of 96.6% in an equal-error-rate (EER) based study. We obtained an accuracy of 100% when exclusively using personal (complex) user gestures. In a further user study, we found that AirAuth is highly resilient to video-based shoulder-surfing attacks, with a measured false acceptance rate of just 2.2%. Furthermore, a longitudinal study demonstrates AirAuth's repeatability and accuracy over time. AirAuth is relatively simple and robust, requires little computational power, and is hence deployable on embedded or mobile hardware. Unlike traditional authentication methods, our system's security is positively aligned with user-rated pleasure and excitement levels. In addition, AirAuth attained acceptability ratings in personal, office, and public spaces comparable to those of an existing stroke-based on-screen authentication technique. Based on the results presented in this paper, we believe that AirAuth shows great promise as a novel, secure, ubiquitous, and highly usable authentication method.
FlexiGroups: binding mobile devices for collaborative interactions in medium-sized groups with device touch
T. Jokela, A. Lucero
MobileHCI 2014, pp. 369-378. DOI: https://doi.org/10.1145/2628363.2628376
We present a touch-based method for binding mobile devices for collaborative interactions in a group of collocated users. The method is highly flexible, enabling a broad range of group formation strategies. We report an evaluation of the method in medium-sized groups of six users. When forming a group, the participants primarily followed viral patterns, opportunistically adding other participants to the group without advance planning. The participants also suggested a number of more systematic patterns, which required the group to agree on a common strategy but then provided a clear procedure to follow. The flexibility of the method allowed the participants to adapt it to the changing needs of the situation and to recover from errors and technical problems. Overall, device binding in medium-sized groups was found to be a highly collaborative group activity, and binding methods should pay special attention to supporting group work and social interaction.
Email management and work-home boundaries
Marta E. Cecchinato
MobileHCI 2014, pp. 403-404. DOI: https://doi.org/10.1145/2628363.2634267
In my PhD research I am exploring the effect of email on work-home boundaries. The ultimate goal is to design a tool that helps people manage their email better and reduces the stress associated with this activity. I argue that this will require understanding individual differences in email behaviours and how email can impact work-home boundaries.