Matthias Baldauf, Stefan Suette, Peter Fröhlich, Ulrich L. Lehner
Interactive opinion polls are a promising novel use case for public urban displays. However, voicing one's opinion at such a public installation poses special privacy requirements. In this paper, we introduce our ongoing work on investigating the roles of the interaction technique and the poll question in this novel context. We present a field study comparing three different voting techniques (a public touch interface, a personal smartphone via scanning a QR code, and remote voting through a short Web address) and three types of poll questions (general, personal, local). Overall, the results show that actively casting an opinion on a timely topic is highly appreciated by passers-by. The public voting opportunity through a touch screen is clearly preferred. Offering mobile or remote voting does not significantly increase the overall participation rate. The type of poll question has an impact on the number of participants but does not influence the preferred interaction modality.
{"title":"Interactive opinion polls on public displays: studying privacy requirements in the wild","authors":"Matthias Baldauf, Stefan Suette, Peter Fröhlich, Ulrich L. Lehner","doi":"10.1145/2628363.2634222","url":"https://doi.org/10.1145/2628363.2634222","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"495-500","publicationDate":"2014-09-23"}
The rise of modern smartphones brought gesture-based interaction to our daily lives. As the number of different operating systems and graphical user interfaces increases, designers and researchers can benefit from a common notation for mobile interaction design. In this paper, we present a concept of an extensible sketching notation for mobile gestures. The proposed notation, Monox, provides a common basis for collaborative design and analysis of mobile interactions. Monox is platform independent and enables general discussions and negotiations on topics of mobile gestures. An extensive evaluation showed the practicability and ability of Monox to serve as a common denominator for discussion and communication within interdisciplinary groups of researchers, designers and developers.
{"title":"Monox: extensible gesture notation for mobile devices","authors":"Roman Ganhör, Wolfgang Spreicer","doi":"10.1145/2628363.2628394","url":"https://doi.org/10.1145/2628363.2628394","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"203-212","publicationDate":"2014-09-23"}
The popularity of touchscreen mobile devices gives users a variety of useful apps and functionality on the move. As a result, mobile devices are being used in a range of different contexts. One common scenario that has received little attention from researchers is the effect of encumbrance: carrying objects (for example, shopping bags and personal gear) while interacting with mobile devices at the same time. This is a frequent everyday situation and one that can cause interaction difficulties [3]. There is a lack of knowledge of the impact encumbrance has on interaction; therefore, the usability issues in these physically and mentally demanding contexts are unknown. Prior to the start of our research, there was only one related study that had examined input performance with handheld devices while multitasking with different types of objects [7]. A better understanding of the interaction problems caused by encumbrance would allow researchers to develop more effective input techniques for touchscreen mobile devices.
{"title":"The effects of encumbrance on mobile interactions","authors":"Alexander Ng","doi":"10.1145/2628363.2634268","url":"https://doi.org/10.1145/2628363.2634268","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"405-406","publicationDate":"2014-09-23"}
Although there has been a great deal of work on using machine learning algorithms to categorize user activity (e.g., walking, biking) from accelerometer and other sensor data, less attention has been focused on relaying such information in real-time for remote implicit communication. Humans are very familiar with using both explicit and implicit communication with others physically near to us, for example, by using body language and modulating our voice tone and volume. Likewise, remote explicit communication has also existed for a long time in the form of phone calls, text messages, and other mechanisms for communicating explicitly over great distances. However, remote implicit communication between mobile users has been less explored, likely since there have been limited avenues for collecting such information and rendering it in a meaningful way while we go about our daily lives.
{"title":"Body-worn sensors for remote implicit communication","authors":"Jeffrey R. Blum","doi":"10.1145/2628363.2634271","url":"https://doi.org/10.1145/2628363.2634271","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"411-412","publicationDate":"2014-09-23"}
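The abstract above refers to categorizing user activity (e.g., walking, biking) from accelerometer data. As a minimal sketch of that idea, and not the author's actual method, a window of samples can be classified by the variability of its acceleration magnitude; the thresholds and labels below are illustrative assumptions, whereas a real system would use a trained classifier as the abstract notes.

```python
import math
from statistics import stdev

def classify_window(samples, still_thresh=0.05, walk_thresh=0.6):
    """Classify a window of (x, y, z) accelerometer samples (in g)
    by the standard deviation of the acceleration magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    s = stdev(mags) if len(mags) > 1 else 0.0
    if s < still_thresh:
        return "stationary"
    if s < walk_thresh:
        return "walking"
    return "vigorous"  # e.g. running or biking

# A stationary device reads roughly 1 g of gravity with little variation.
still = [(0.0, 0.0, 1.0 + 0.001 * (i % 3)) for i in range(50)]
# Walking adds a roughly periodic component to the magnitude.
walking = [(0.0, 0.0, 1.0 + 0.3 * math.sin(i / 3)) for i in range(50)]
```

Relaying the resulting label to a remote peer in real time would then be the implicit-communication channel the abstract describes.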
This poster examines how users utilize mobile applications that offer an interface for finding locations, and how the mode of interaction changes depending on the user's intent. Through an analysis of existing interfaces, we identified five location search patterns. In a further evaluation of these patterns, we sought to identify which patterns serve which user demands for information. In a goal-directed pilot study, we gained first insights into the correlations between specific user requirements and location search patterns.
{"title":"A comparison of location search UI patterns on mobile devices","authors":"S. Meier, Frank Heidmann, Andreas Thom","doi":"10.1145/2628363.2634216","url":"https://doi.org/10.1145/2628363.2634216","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"465-470","publicationDate":"2014-09-23"}
With the recent success of tablet devices, a new device type became available for mobile interaction. Just as for mobile phones, touch is the dominant way people interact with tablets. In contrast to much smaller phones, a firm grip with both hands is needed to securely hold tablet devices. While a large body of work has investigated touch interaction on smaller devices, little empirical research has been carried out on touch-based pointing while holding the device with both hands. To understand touch-based interaction on tablet devices, we conducted an experiment comparing four pointing techniques on both the front and the back of the device while it was held in landscape format. We compare direct touch with the following alternatives for selecting targets: indirect pointing on a virtual touchpad, an inverse cursor, and a miniature interaction area. While direct touch is 35% faster than the fastest alternative, only 74% of the touchscreen and 64% of the back of the device can be reached by each hand. We show that among the indirect pointing techniques, the miniaturized interaction area is significantly faster and received the best subjective ratings. We conclude that a miniaturized interaction area is a viable alternative to direct touch, especially on the back side of tablet devices.
{"title":"Comparing pointing techniques for grasping hands on tablets","authors":"Katrin Wolf, N. Henze","doi":"10.1145/2628363.2628371","url":"https://doi.org/10.1145/2628363.2628371","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"53-62","publicationDate":"2014-09-23"}
Conor Holler, Patrick Crowe, A. Mayhew, A. Tindale, S. Diamond
Time Tremors is a transmedia experience for children aged 8-14 that crosses television, web, locative media, and mobile apps. Time Tremors is a collection game in which players search for objects from history supposedly scattered throughout time and space: hidden and invisible to the human eye, but detectable and collectable using a variety of mobile and online broadband technologies. Extending the game into locative augmented reality and mobile play was an applied research challenge that required narrative continuity while ensuring safe play.
{"title":"Time tremors: developing transmedia gaming for children","authors":"Conor Holler, Patrick Crowe, A. Mayhew, A. Tindale, S. Diamond","doi":"10.1145/2628363.2645698","url":"https://doi.org/10.1145/2628363.2645698","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"601-603","publicationDate":"2014-09-23"}
This demonstration accompanies a paper accepted at MobileHCI '14. Back-of-device (BoD) authentication has been shown to be significantly more secure than standard front-facing approaches, with BoD Shapes being the most representative method found in the literature. With the aim of better understanding and improving its usage, we developed BoD Taps as a novel alternative. Our experiments revealed that BoD Taps and BoD Shapes perform equally well at unlocking the device, but BoD Taps allows users to enter passwords about twice as fast. Moreover, BoD Taps was perceived as more usable and less frustrating than BoD Shapes. This demonstration showcases both authentication methods in action, with the aim of comparing and discussing their features and potential improvements.
{"title":"Back-of-device authentication with bod taps and bod shapes","authors":"A. Catalá, Luis A. Leiva","doi":"10.1145/2628363.2633571","url":"https://doi.org/10.1145/2628363.2633571","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"425","publicationDate":"2014-09-23"}
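The abstract above does not specify how a BoD tap password is represented or checked; as a hedged sketch of one plausible scheme (the grid-cell encoding and hashing are this sketch's assumptions, not the authors' design), an enrolled tap sequence can be stored as a digest and verified in constant time:

```python
import hashlib
import hmac

def encode_taps(taps):
    """Serialize a tap sequence (here assumed to be grid-cell indices
    on the device's back) and hash it so the raw pattern is not stored."""
    raw = ",".join(str(t) for t in taps).encode()
    return hashlib.sha256(raw).hexdigest()

def verify_taps(stored_digest, entered_taps):
    # compare_digest avoids leaking information through comparison timing.
    return hmac.compare_digest(stored_digest, encode_taps(entered_taps))

stored = encode_taps([0, 3, 1, 2])  # enrolment: a four-tap pattern
```

A real implementation would add salting and rate limiting; this only illustrates the unlock check itself.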
A great part of human communication is carried out non-verbally. All this information is lost in mobile text messaging. This work describes an attempt to augment text chatting on mobile phones by adding automatically detected facial expression reactions to conversations. These expressions are detected using known image processing techniques. Related work on non-verbal communication through text messaging is reviewed and distinguished from the present solution. The conception and implementation of a mobile phone application with the proposed feature is described, and user studies are reported. Finally, application contexts, conclusions, and future work are discussed.
{"title":"Non-verbal communications in mobile text chat: emotion-enhanced mobile chat","authors":"Jackson Feijó Filho, T. Valle, Wilson Prata","doi":"10.1145/2628363.2633576","url":"https://doi.org/10.1145/2628363.2633576","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"443-446","publicationDate":"2014-09-23"}
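The paper above attaches detected facial expressions to outgoing messages; the actual detector output and rendering are not specified in the abstract, so the expression labels and text annotations below are purely hypothetical, meant only to illustrate the augmentation step once an expression has been recognized:

```python
# Hypothetical mapping from a detector's expression label to a text annotation.
EXPRESSION_TAGS = {
    "happy": ":-)",
    "sad": ":-(",
    "surprised": ":-O",
}

def annotate_message(text, expression):
    """Append the sender's detected facial expression to an outgoing message;
    unknown or neutral expressions leave the text unchanged."""
    tag = EXPRESSION_TAGS.get(expression)
    return f"{text} {tag}" if tag else text
```

For example, `annotate_message("on my way", "happy")` yields `"on my way :-)"`, while an unrecognized label passes the text through untouched.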
For many people, their phone has become their main everyday tool. While phones can fulfill many different roles, they also require users to (1) make do with affordances not specialized for the specific task, and (2) closely engage with the device itself. We propose utilizing the space and objects around the phone to offer better task affordances and to create opportunities for casual interactions. Such around-device devices are a class of interactors that do not require users to bring special tangibles, but instead repurpose items already found in the user's surroundings. In a survey study, we determine which places and objects are available to around-device devices. Furthermore, in an elicitation study, we observe which objects users would use for ten interactions.
{"title":"Around-device devices: my coffee mug is a volume dial","authors":"Henning Pohl, M. Rohs","doi":"10.1145/2628363.2628401","url":"https://doi.org/10.1145/2628363.2628401","journal":"MobileHCI : proceedings of the ... International Conference on Human Computer Interaction with Mobile Devices and Services. MobileHCI (Conference)","pages":"81-90","publicationDate":"2014-09-23"}