Developing tactile feedback for wearable presentation: observations from using a participatory approach. Flynn Wolf, Ravi Kuber. doi:10.1145/2628363.2634230, pp. 543-548.
In this paper, we describe a participatory approach to developing tactile feedback for a head-mounted device. Three focus groups iteratively designed and evaluated tactile interaction concepts for user-generated use-case scenarios. The groups produced useful design insights into tactile coding schemes addressing the scenario conditions, as well as methodological innovations to participatory design techniques for interaction development in unfamiliar sensory modalities such as touch. The study culminated in a library of tactile icons relating to spatial concepts, which will be tested in future work.
Around-body interaction: sensing & interaction techniques for proprioception-enhanced input with mobile devices. Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, S. Hudson. doi:10.1145/2628363.2628402, pp. 287-290.
The space around the body provides a large interaction volume that can allow for big interactions on small mobile devices. However, interaction techniques that make use of this opportunity are underexplored, and have primarily focused on distributing information in the space around the body. We demonstrate three types of around-body interaction (canvas, modal, and context-aware) in six demonstration applications. We also present a sensing solution that uses standard smartphone hardware: the phone's front camera, accelerometer, and inertial measurement unit. Our solution allows a person to interact with a mobile device by holding and positioning it between the normal field of view and its vicinity around the body. By leveraging the user's proprioceptive sense, around-body interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.
{"title":"Around-body interaction: sensing & interaction techniques for proprioception-enhanced input with mobile devices","authors":"Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, S. Hudson","doi":"10.1145/2628363.2628402","DOIUrl":"https://doi.org/10.1145/2628363.2628402","url":null,"abstract":"The space around the body provides a large interaction volume that can allow for big interactions on small mobile devices. However, interaction techniques making use of this opportunity are underexplored, primarily focusing on distributing information in the space around the body. We demonstrate three types of around-body interaction including canvas, modal and context-aware interactions in six demonstration applications. We also present a sensing solution using standard smartphone hardware: a phone's front camera, accelerometer and inertia measurement units. Our solution allows a person to interact with a mobile device by holding and positioning it between a normal field of view and its vicinity around the body. By leveraging a user's proprioceptive sense, around-body Interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"1 1","pages":"287-290"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79970744","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Changed shape of key: an approach to enhance the performance of the soft keyboard. Hsi-Jen Chen. doi:10.1145/2628363.2634213, pp. 447-452.
Text entry with a soft keyboard on small mobile devices is difficult, one reason being that typing often suffers from touch offset. This paper presents a soft keyboard whose key shapes have been changed to counter this offset. An app and a usability test indicate that the reshaped keys increase words per minute and reduce the error rate. A further finding is that users prefer a larger space between keys.
Urgent mobile tool for hearing impaired, language dysfunction and foreigners at emergency situation. Naotsune Hosono, Hiromitsu Inoue, M. Nakanishi, Y. Tomita. doi:10.1145/2628363.2633568, pp. 413-416.
This paper introduces a mobile application that allows deaf users, users with language dysfunction, and non-native speakers to report emergencies. An earlier version (a booklet) was designed so that hearing-impaired people could communicate with others without speaking. The current smartphone application allows calls to be made from a remote location. The application's screen transitions follow the dialogue models used by emergency services, and users respond by tapping icons or pictograms instead of composing text messages. An evaluation with deaf people and a non-native speaker found that reporting an emergency with the tool was about three times quicker than doing so by text message.
{"title":"Urgent mobile tool for hearing impaired, language dysfunction and foreigners at emergency situation","authors":"Naotsune Hosono, Hiromitsu Inoue, M. Nakanishi, Y. Tomita","doi":"10.1145/2628363.2633568","DOIUrl":"https://doi.org/10.1145/2628363.2633568","url":null,"abstract":"This paper introduces a mobile application that allows deaf, language dysfunctioned, or non-native language users to report emergencies. An earlier version (booklet) was designed for hearing impaired person to be able to communicate with others without speaking. The current smart phone application allows calls to be made from a remote location. The screen transitions application follows the dialogue models used by emergency services. Users interact with the dialogues by tapping on icons or pictograms instead of using text messages. Evaluation by deaf people and a non-native speaker found that it was about three times quicker to report an emergency using this tool than it was by using text messages.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"3 1","pages":"413-416"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89348592","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"People don't bump": sharing around mobile phones in close proximity. Afshan Kirmani, Rowanne Fleck. doi:10.1145/2628363.2634231, pp. 549-554.
A large body of mobile phone sharing research focuses on creating new interaction techniques for sharing, and considers the usability of such applications and features while ignoring the context of their use, their adoption, or their appropriation. It is therefore not known whether these technologies are used in practice or whether they really meet people's sharing needs. The aim of this research was to understand current real-world sharing practices around smartphones through a diary study with 63 participants. We focused on close-proximity sharing and discovered that new technologies supporting this kind of sharing, for example bumping handsets together to exchange files, are not widely used. More than half of all sharing via phones in this sample involved only telling, showing, or passing the phone, though this often triggered further sharing. Possible explanations and their implications are discussed.
Mobile cloud storage: a contextual experience. Karel Vandenbroucke, Denzil Ferreira, Jorge Gonçalves, V. Kostakos, K. Moor. doi:10.1145/2628363.2628386, pp. 101-110.
In an increasingly connected world, users access personal or shared data stored "in the cloud" (e.g., Dropbox, SkyDrive, iCloud) from multiple devices. Despite the popularity of cloud storage services, little work has investigated cloud storage users' Quality of Experience (QoE), particularly on mobile devices, and it is not clear how users' context affects QoE. We conducted an online survey with 349 cloud service users to gain insight into their usage and the affordances they value. In a two-week follow-up study, we monitored mobile cloud service usage on tablets and smartphones in real time using a mobile Experience Sampling Method (ESM) questionnaire, collecting 156 responses on the in-situ context of Dropbox use on mobile devices. We provide insights for future QoE-aware cloud services by highlighting the most important mobile contextual factors (e.g., connectivity, location, social setting, device) and how they affect users' experiences with such services on their mobile devices.
{"title":"Mobile cloud storage: a contextual experience","authors":"Karel Vandenbroucke, Denzil Ferreira, Jorge Gonçalves, V. Kostakos, K. Moor","doi":"10.1145/2628363.2628386","DOIUrl":"https://doi.org/10.1145/2628363.2628386","url":null,"abstract":"In an increasingly connected world, users access personal or shared data, stored \"in the cloud\" (e.g., Dropbox, Skydrive, iCloud) with multiple devices. Despite the popularity of cloud storage services, little work has focused on investigating cloud storage users' Quality of Experience (QoE), in particular on mobile devices. Moreover, it is not clear how users' context might affect QoE. We conducted an online survey with 349 cloud service users to gain insight into their usage and affordances. In a 2-week follow-up study, we monitored mobile cloud service usage on tablets and smartphones, in real-time using a mobile-based Experience Sampling Method (ESM) questionnaire. We collected 156 responses on in-situ context of use for Dropbox on mobile devices. We provide insights for future QoE-aware cloud services by highlighting the most important mobile contextual factors (e.g., connectivity, location, social, device), and how they affect users' experiences while using such services on their mobile devices.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"1 1","pages":"101-110"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73388136","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
JuxtaPinch: an application for collocated multi-device photo sharing. H. S. Nielsen, M. Olsen, M. Skov, J. Kjeldskov. doi:10.1145/2628363.2633569, pp. 417-420.
We have developed an application called JuxtaPinch that allows collocated users to share photos across multiple devices, i.e., mobile phones and tablets. JuxtaPinch employs simple and intuitive interaction techniques, e.g., pinching to connect devices; it enables flexible physical positioning of the devices and supports partial photo viewing. It also lets users bring their own devices and access photos stored on them. In the Interactivity session, audience members can explore and view photos with friends and colleagues on different devices, and experience the defamiliarization and playful interaction with photos that we uncovered during lab and field studies of JuxtaPinch.
MuGIS multi-user geographical information system. S. Schöffel, Johannes Schwank, A. Ebert. doi:10.1145/2628363.2634219, pp. 477-482.
Collaboration between the users of a system is often crucial for reaching given goals effectively and efficiently. However, in many application domains current systems do not sufficiently support collaborative work (sometimes they do not support it at all). A good example is geographical information systems (GIS), which usually follow a one-user-at-a-time approach. In this paper, we present the development of a scalable Multi-user Geographical Information System (MuGIS). MuGIS makes it possible to integrate large display environments with mobile smart devices for remote control. The system is deployed as a client-server architecture and uses the NASA World Wind Java framework and SOAP web services for communication. On the client side, all common mobile smart devices are supported. The underlying concept provides different user roles and multi-user identification.
{"title":"MuGIS multi-user geographical information system","authors":"S. Schöffel, Johannes Schwank, A. Ebert","doi":"10.1145/2628363.2634219","DOIUrl":"https://doi.org/10.1145/2628363.2634219","url":null,"abstract":"Collaboration between users of a system is often a crucial factor for reaching given goals in an effective and efficient way. However, in many application domains, the current systems do not sufficiently support collaborative work (sometimes they even don't support it at all). One good example is geographical information systems (GIS), that usually only follow an one user at one time approach. In this paper, we present the development of a scalable Multi-user Geographical Information System (MuGIS). With MuGIS it is now possible to integrate large display environments with mobile smart devices for remote control. The system is deployed as a client-server architecture. It uses the NASA World Wind Java framework and SOAP web services for communication. On the client side, all common mobile smart devices are supported. The underlying concept provides different user roles and multi-user identification.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"21 1","pages":"477-482"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78479739","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mobile-based tangible interaction techniques for shared displays. Ali Mazalek, A. Arif. doi:10.1145/2628363.2645668, pp. 561-562.
This tutorial explores the possibility of using touchscreen-based mobile devices as active tangibles on an interactive tabletop surface. It starts with an open discussion of various aspects of tangible interaction, including an overview of different approaches and design principles, and then guides participants through the design and development of innovative interaction techniques in which mobile phones are used as active tangibles on a shared tabletop display. The intent is to encourage the mobile HCI community to further explore the use of everyday devices such as mobile phones as tangibles.
Mobile interaction analysis: towards a novel concept for interaction sequence mining. Florian Lettner, C. Grossauer, Clemens Holzmann. doi:10.1145/2628363.2628384, pp. 359-368.
Identifying users' intentions when they launch an application on their smartphone, and understanding which tasks they actually execute, is a key problem in mobile usability analysis. First, knowing which tasks users actually execute is required for calculating common usability metrics such as task efficiency, error rates, and effectiveness. Second, understanding how users perform these tasks helps developers validate the interaction sequences designed for tasks (e.g., the sequential steps required to successfully perform and complete a task). In this paper, we describe a novel approach for automatically extracting and grouping interaction sequences from users, assigning them to predefined tasks (e.g., writing an email), and visualising them in an intuitive way. This lets us determine whether the designer's intention of how users should perform a task matches how they actually execute it in the field, and where the two differ. It also reveals whether users find alternative ways of performing certain tasks, which feeds back into the application design process. Moreover, where users' perception of tasks differs from the designer's intention, we lay the foundation for recognising issues users may have while executing them.