This short paper introduces an enhancement to the Keystroke-Level Model (KLM) by extending it with three new operators that describe interactions on mobile touchscreen devices. Based on Fitts's Law, we modelled a performance estimation equation for each common touchscreen interaction. Three prototypes were developed to serve as a test environment in which to validate the Fitts's Law equations and estimate the parameters for these interactions. A total of 3090 observations were collected from 51 users. While the studies confirmed that most interactions fit Fitts's Law well, the law did not fit interactions with an Index of Difficulty exceeding 4 bits, highlighting a possible maximum comfortable stretch. Based on these results, the following approximate movement times are suggested for KLM: 70 ms for a short untargeted swipe, 200 ms for a half-screen-sized zoom, and 80 ms for pointing at an icon from a home position. Developers of mobile phone and tablet applications could use these results to describe tasks as sequences of operators and to predict user interaction times before creating prototypes.
{"title":"Enhancing KLM (keystroke-level model) to fit touch screen mobile devices","authors":"K. E. Batran, Mark D. Dunlop","doi":"10.1145/2628363.2628385","DOIUrl":"https://doi.org/10.1145/2628363.2628385","url":null,"abstract":"This PVA-shortpaper introduces an enhancement to the Keystroke-Level Model (KLM) by extending it with three new operators to describe interactions on mobile touchscreen devices. Based on Fitts's Law we modelled a performance measure estimate equation for each common touch screen interaction. Three prototypes were developed to serve as a test environment in which to validate Fitts's equations and estimate the parameters for these interactions. A total of 3090 observations were made with a total of 51 users. While the studies confirmed each interaction fitted well to Fitts's Law for most interactions, it was noticed that Fitts's Law does not fit well for interactions with an Index of Difficulty exceeding 4 bits, highlighting a possible maximum comfortable stretch. Based on results, the following approximate movement times for KLM are suggested: 70ms for a short untargeted swipe, 200ms for a half-screen sized zoom, and 80ms for an icon pointing from a home position. These results could be used by developers of mobile phone and tablet applications to describe tasks as a sequence of the operators used and to predict user interaction times prior to creating prototypes.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"73 1","pages":"283-286"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91131594","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Genovefa Kefalidou, A. Skatova, Michael A. Brown, Victoria Shipp, J. Pinchin, P. Kelly, A. Dix, Xu Sun
Advances in ubiquitous technologies have changed the way humans interact with the world around them. Technology has the power not only to inform and perform but also to further people's experiences of the world. It has enhanced methodological approaches within the CHI research realm in terms of data gathering (e.g. via wearable sensors) and sharing (e.g. via self-reflection methods). While such methodologies have mainly been adopted in isolation, their implications and the synergies between them have yet to be fully explored. This workshop brings together a multidisciplinary group of researchers to explore and experience the use of wearable sensors with self-reflection as a multi-method approach to conducting research and fully experiencing the world on-the-go.
{"title":"Enhancing self-reflection with wearable sensors","authors":"Genovefa Kefalidou, A. Skatova, Michael A. Brown, Victoria Shipp, J. Pinchin, P. Kelly, A. Dix, Xu Sun","doi":"10.1145/2628363.2634257","DOIUrl":"https://doi.org/10.1145/2628363.2634257","url":null,"abstract":"Advances in ubiquitous technologies have changed the way humans interact with the world around them. Technology has the power not only to inform and perform but also to further peoples' experiences of the world. It has enhanced the methodological approaches within the CHI research realm in terms of data gathering (e.g. via wearable sensors) and sharing (e.g. via self-reflection methods). While such methodologies have been mainly adopted in isolation, exploring the implications and the synergy of them has yet to be fully explored. This workshop brings together a multidisciplinary group of researchers to explore and experience the use of wearable sensors with self-reflection as a multi-method approach to conduct research and fully experience the world on-the-go.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"20 1","pages":"577-580"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90481594","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
With the increasing availability of smartwatches, the question of suitable input modalities arises. While direct touch input comes at the cost of the fat-finger problem, we propose using a dynamic peephole to explore larger content such as websites or maps. In this paper, we present the results of a user study comparing the performance of static and dynamic peephole interactions for a map navigation task on a smartwatch display. As a first method, we investigated the static peephole metaphor, in which the displayed map is moved on the device via direct touch interaction. In contrast, for the second method - the dynamic peephole - the device is moved while the map remains static with respect to an external frame of reference. We compared both methods in terms of task performance and perceived user experience. The results show that the dynamic peephole interaction is significantly slower in terms of task completion time.
{"title":"Investigating the effectiveness of peephole interaction for smartwatches in a map navigation task","authors":"Frederic Kerber, A. Krüger, Markus Löchtefeld","doi":"10.1145/2628363.2628393","DOIUrl":"https://doi.org/10.1145/2628363.2628393","url":null,"abstract":"With the increasing availability of smartwatches the question of suited input modalities arises. While direct touch input comes at the cost of the fat-finger problem, we propose to use a dynamic peephole to explore larger content such as websites or maps. In this paper, we present the results of a user study comparing the performance of static and dynamic peephole interactions for a map navigation task on a smartwatch display. As a first method, we investigated the static peephole methaphor where the displayed map is moved on the device via direct touch interaction. In contrast, for the second method - the dynamic peephole - the device is moved and the map is static with respect to an external frame of reference. We compared both methods in terms of task performance and perceived user experience. The results show that the dynamic peephole interaction performs significantly more slowly in terms of task completion time.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"15 1","pages":"291-294"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84329096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nicola Dell, Ian Francis, H. Sheppard, R. Simbi, G. Borriello
The worldwide adoption of mobile devices presents an opportunity to build mobile systems to support health workers in low-resource settings. This paper presents an in-depth field evaluation of a mobile system that uses a smartphone's built-in camera and computer vision to capture and analyze diagnostic tests for infectious diseases. We describe how health workers integrate the system into their daily clinical workflow and detail important differences in system usage between small clinics and large hospitals that could inform the design of future mobile health systems. We also describe a variety of strategies that health workers developed to overcome poor network connectivity and transmit data to a central database. Finally, we show strong agreement between our system's computed diagnoses and trained health workers' visual diagnoses, which suggests that our system could aid disease diagnosis in a variety of scenarios. Our findings will help to guide ministries of health and other stakeholders working to deploy mobile health systems in similar environments.
{"title":"Field evaluation of a camera-based mobile health system in low-resource settings","authors":"Nicola Dell, Ian Francis, H. Sheppard, R. Simbi, G. Borriello","doi":"10.1145/2628363.2628366","DOIUrl":"https://doi.org/10.1145/2628363.2628366","url":null,"abstract":"The worldwide adoption of mobile devices presents an opportunity to build mobile systems to support health workers in low-resource settings. This paper presents an in-depth field evaluation of a mobile system that uses a smartphone's built-in camera and computer vision to capture and analyze diagnostic tests for infectious diseases. We describe how health workers integrate the system into their daily clinical workflow and detail important differences in system usage between small clinics and large hospitals that could inform the design of future mobile health systems. We also describe a variety of strategies that health workers developed to overcome poor network connectivity and transmit data to a central database. Finally, we show strong agreement between our system's computed diagnoses and trained health workers' visual diagnoses, which suggests that our system could aid disease diagnosis in a variety of scenarios. Our findings will help to guide ministries of health and other stakeholders working to deploy mobile health systems in similar environments.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"562 1","pages":"33-42"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77763159","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A. Tindale, M. Cumming, Hudson Pridham, Jessica Peter, S. Diamond
In this paper we explore the design, layout and configuration of wrist-wearable haptic gaming interfaces that involve visual and vibrotactile spatial and temporal patterns. Our goal is to determine overall layouts and the spatial and temporal resolutions on the wrist suitable for interactive tactile stimuli. Our approach is to first explore the simplest configuration patterns intended to encircle the wrist, and then study their affordances. We describe the various informal user studies we employed to explore and test the issues that arose.
{"title":"Wearable haptic gaming using vibrotactile arrays","authors":"A. Tindale, M. Cumming, Hudson Pridham, Jessica Peter, S. Diamond","doi":"10.1145/2628363.2633574","DOIUrl":"https://doi.org/10.1145/2628363.2633574","url":null,"abstract":"In this paper we explore the design, layout and configuration of wrist-wearable, haptic gaming interfaces, which involve visual and vibrotactile spatial and temporal patterns. Our goal is to determine overall layouts and spatial and temporal resolutions on the wrist suitable for interactive tactile stimuli. Our approach is to first explore the simplest of configurative patterns that are intended to encircle the wrist, and then study their affordances. We describe various informal user studies we have employed to explore and test issues that arose.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"35 1","pages":"435-438"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76346678","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nikola Banovic, Christina Brant, Jennifer Mankoff, A. Dey
Mobile devices have become powerful ultra-portable personal computers that support not only communication but also a variety of complex, interactive applications. Because of the unique characteristics of mobile interaction, a better understanding of the duration and context of mobile device use sessions could help to improve and streamline the user experience. In this paper, we first explore the anatomy of mobile device use and propose a classification of uses based on duration and interaction type: glance, review, and engage. We then focus our investigation on short review interactions and identify opportunities for streamlining these mobile device uses by proactively suggesting short tasks to the user that go beyond simple application notifications. We evaluate the concept through a user evaluation of an interactive lock screen prototype called ProactiveTasks. We use the findings from our study to create and explore the design space for proactively presenting tasks to users. Our findings underline the need for a more nuanced set of interactions that support short mobile device uses, in particular review sessions.
{"title":"ProactiveTasks: the short of mobile device use sessions","authors":"Nikola Banovic, Christina Brant, Jennifer Mankoff, A. Dey","doi":"10.1145/2628363.2628380","DOIUrl":"https://doi.org/10.1145/2628363.2628380","url":null,"abstract":"Mobile devices have become powerful ultra-portable personal computers supporting not only communication but also running a variety of complex, interactive applications. Because of the unique characteristics of mobile interaction, a better understanding of the time duration and context of mobile device uses could help to improve and streamline the user experience. In this paper, we first explore the anatomy of mobile device use and propose a classification of use based on duration and interaction type: glance, review, and engage. We then focus our investigation on short review interactions and identify opportunities for streamlining these mobile device uses through proactively suggesting short tasks to the user that go beyond simple application notifications. We evaluate the concept through a user evaluation of an interactive lock screen prototype, called ProactiveTasks. We use the findings from our study to create and explore the design space for proactively presenting tasks to the users. Our findings underline the need for a more nuanced set of interactions that support short mobile device uses, in particular review sessions.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"122 1","pages":"243-252"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73454979","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, S. Hudson
The space around the body provides a large interaction volume that can allow for big interactions on small mobile devices. However, interaction techniques that make use of this opportunity are underexplored, focusing primarily on distributing information in the space around the body. We demonstrate three types of around-body interaction, including canvas, modal and context-aware interactions, in six demonstration applications. We also present a sensing solution using standard smartphone hardware: a phone's front camera, accelerometer and inertial measurement unit. Our solution allows a person to interact with a mobile device by holding and positioning it between the normal field of view and the space around the body. By leveraging a user's proprioceptive sense, around-body interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.
{"title":"Around-body interaction: sensing & interaction techniques for proprioception-enhanced input with mobile devices","authors":"Xiang 'Anthony' Chen, Julia Schwarz, Chris Harrison, Jennifer Mankoff, S. Hudson","doi":"10.1145/2628363.2628402","DOIUrl":"https://doi.org/10.1145/2628363.2628402","url":null,"abstract":"The space around the body provides a large interaction volume that can allow for big interactions on small mobile devices. However, interaction techniques making use of this opportunity are underexplored, primarily focusing on distributing information in the space around the body. We demonstrate three types of around-body interaction including canvas, modal and context-aware interactions in six demonstration applications. We also present a sensing solution using standard smartphone hardware: a phone's front camera, accelerometer and inertia measurement units. Our solution allows a person to interact with a mobile device by holding and positioning it between a normal field of view and its vicinity around the body. By leveraging a user's proprioceptive sense, around-body Interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"1 1","pages":"287-290"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79970744","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, we describe a participatory approach to developing tactile feedback for a head-mounted device. Three focus groups iteratively designed and evaluated tactile interaction concepts for user-generated use-case scenarios. The results yielded productive design insights from the groups regarding approaches to tactile coding schemes addressing the scenario conditions, as well as methodological innovations to participatory design techniques for interaction development in unfamiliar sensory modalities such as touch. The study has culminated in a library of tactile icons relating to spatial concepts, which will be tested as part of future work.
{"title":"Developing tactile feedback for wearable presentation: observations from using a participatory approach","authors":"Flynn Wolf, Ravi Kuber","doi":"10.1145/2628363.2634230","DOIUrl":"https://doi.org/10.1145/2628363.2634230","url":null,"abstract":"In this paper, we describe a participatory-based approach to developing tactile feedback for a head-mounted device. Three focus groups iteratively designed and evaluated tactile interaction concepts for user-generated use-case scenarios. Results showed productive design insights from the groups regarding approaches to tactile coding schemes addressing the scenario conditions, as well as method-innovations to participatory design techniques for interaction development in unfamiliar sensory modalities such as touch. The study has culminated in the development of a library of tactile icons relating to spatial concepts, which will be tested as part of future work.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"151 1","pages":"543-548"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85386589","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Text entry using a soft keyboard on small mobile devices is difficult, one reason being that touches often land offset from the intended key. This paper presents a soft keyboard whose key shapes have been changed to counteract this offset. A prototype app and a usability test show that the soft keyboard with changed key shapes can increase words per minute and reduce the error rate. Another finding is that a large space between keys is preferred.
{"title":"Changed shape of key: an approach to enhance the performance of the soft keyboard","authors":"Hsi-Jen Chen","doi":"10.1145/2628363.2634213","DOIUrl":"https://doi.org/10.1145/2628363.2634213","url":null,"abstract":"Text entry using a soft keyboard on small mobile devices is difficult, one reason being that there is often an offset when typing. This essay presents a soft keyboard whose key shapes have been changed in order to avoid the problem of offset. An app and a usability test prove that this soft keyboard with a changed shape of the keys can increase words per minute and reduce the error rate. Another finding is that a large space between keys is preferred.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"24 1","pages":"447-452"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84512720","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Naotsune Hosono, Hiromitsu Inoue, M. Nakanishi, Y. Tomita
This paper introduces a mobile application that allows users who are deaf, have language dysfunction, or are non-native speakers to report emergencies. An earlier version (a booklet) was designed to enable hearing-impaired people to communicate with others without speaking. The current smartphone application allows calls to be made from a remote location. The application's screen transitions follow the dialogue models used by emergency services. Users interact with the dialogues by tapping on icons or pictograms instead of using text messages. An evaluation with deaf people and a non-native speaker found that reporting an emergency with this tool was about three times quicker than using text messages.
{"title":"Urgent mobile tool for hearing impaired, language dysfunction and foreigners at emergency situation","authors":"Naotsune Hosono, Hiromitsu Inoue, M. Nakanishi, Y. Tomita","doi":"10.1145/2628363.2633568","DOIUrl":"https://doi.org/10.1145/2628363.2633568","url":null,"abstract":"This paper introduces a mobile application that allows deaf, language dysfunctioned, or non-native language users to report emergencies. An earlier version (booklet) was designed for hearing impaired person to be able to communicate with others without speaking. The current smart phone application allows calls to be made from a remote location. The screen transitions application follows the dialogue models used by emergency services. Users interact with the dialogues by tapping on icons or pictograms instead of using text messages. Evaluation by deaf people and a non-native speaker found that it was about three times quicker to report an emergency using this tool than it was by using text messages.","PeriodicalId":74207,"journal":{"name":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","volume":"3 1","pages":"413-416"},"PeriodicalIF":0.0,"publicationDate":"2014-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89348592","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}