Creating visibility: understanding the design space for food waste
Eva Ganglbauer, G. Fitzpatrick, Georg Molzer. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406369

Abstract: Support for ecological sustainability is of growing interest, and the over-consumption, production and disposal of food are a major concern for sustainability, ethics and the economy. However, there is little current understanding of how technologies could be used in this area. In this paper we focus on food waste and report on a qualitative study of daily food practices around shopping planning, gardening, storing, cooking and throwing away food, and their relation to waste. The findings point to design-relevant factors such as losing sight and reordering; spatial, temporal and social constraints; trust in and valuing of the food source; and busyness, unpredictability and effort. The main contribution of this paper is an understanding of food practices and, in turn, seven dimensions of visibility that draw out implications for designing mobile and ubiquitous technologies for this new arena for design. We also present a prototype evolving from our qualitative results, a mobile food waste diary.
HiStory: a hierarchical storyboard interface design for video browsing on mobile devices
Wolfgang Hürst, Dimitrios Paris Darzentas. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406389

Abstract: This paper presents an interactive thumbnail-based video browser for mobile devices such as smartphones with a touch screen. Developed as part of ongoing research and supported by user studies, it introduces HiStory, a hierarchical storyboard design offering an interface metaphor that is familiar and intuitive, yet supports fast and effective completion of known-item search tasks by rapidly providing an overview of a video's content at varying degrees of granularity.
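The core idea of a hierarchical storyboard can be sketched in a few lines. This is our own simplification, not HiStory's actual implementation: we assume each level divides the current video segment into a fixed number of sub-segments (the `fanout` parameter is our assumption) and shows one representative thumbnail per sub-segment, and tapping a thumbnail recurses into that sub-segment at finer granularity.

```python
def storyboard(start, end, fanout=4):
    """One storyboard level: split the frame range [start, end) into `fanout`
    sub-segments and pick the midpoint frame of each as its thumbnail.
    Returns a list of ((sub_start, sub_end), representative_frame) pairs."""
    step = (end - start) / fanout
    cells = []
    for i in range(fanout):
        s = start + i * step
        e = start + (i + 1) * step
        # Midpoint frame stands in for the sub-segment's thumbnail.
        cells.append(((int(s), int(e)), int((s + e) / 2)))
    return cells

# Drilling down is just calling storyboard() again on a tapped cell's range,
# which yields the "varying degrees of granularity" the abstract describes.
top_level = storyboard(0, 100)
finer = storyboard(*top_level[1][0])
```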
ProPane: fast and precise video browsing on mobile phones
Roman Ganhör. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406392

Abstract: Studies show that every fourth smartphone user watches videos on their device. Moreover, with increasing camera and encoding quality, more and more smartphones provide an attractive tool for creating and editing videos. The demand for smooth video browsing interfaces is challenged by the limited input and output capabilities of such mobile devices. This paper discusses a novel interface for fast and precise video browsing suitable for both watching and editing videos. The browsing mechanism offers a simple but powerful interface for browsing videos at different levels of granularity, and all interactions can be carried out without any modal changes. The interface is easy to understand and efficient to use. A first evaluation indicates that the presented design suits casual users as well as creative professionals such as video editors.
What influences users' decisions to take apps into use?: a framework for evaluating persuasive and engaging design in mobile Apps for well-being
Ting-Ray Chang, E. Kaasinen, K. Kaipainen. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406370

Abstract: Personal well-being is influenced by small daily decisions such as what or when to eat, or whether to go jogging. The health consequences of these decisions accumulate over time. Mobile applications can be designed to support people in everyday decisions and thus help improve well-being in real time. In this paper we propose a framework for studying the factors that influence users to download and use applications from the wide selection currently available in app stores. The framework comprises attractiveness, value, ease of use, trust, social support, diffusiveness, and fun and excitement. We illustrate how the framework works in practice by applying it in an online survey assessing 12 mobile apps for well-being. The results showed that these factors matched users' decisions and attitudes toward taking apps into use. User feedback explained how people assessed the factors before actually using the applications.
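To make the framework's factor list concrete, a survey evaluation along these dimensions might be aggregated as below. This is a hypothetical sketch of ours (the paper does not specify a scoring procedure); we assume per-factor Likert ratings collected per respondent.

```python
# The seven influential factors named in the abstract.
FACTORS = ["attractiveness", "value", "ease_of_use", "trust",
           "social_support", "diffusiveness", "fun_and_excitement"]

def factor_scores(responses):
    """Mean rating per factor across survey responses.
    Each response is assumed to map every factor name to a 1-5 rating."""
    return {f: sum(r[f] for r in responses) / len(responses) for f in FACTORS}
```

An app assessed this way yields one mean score per factor, which can then be compared against users' stated intention to take the app into use.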
Evaluation study on sensor placement and gesture selection for mobile devices
Kazuya Murao, A. Yano, T. Terada, R. Matsukura. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406376

Abstract: Mobile phones and video game controllers using gesture recognition technologies enable easy and intuitive operations, such as scrolling a browser and drawing objects. Usually, however, only one sensor of each kind is installed in a device, and the effect of multiple homogeneous sensors on recognition accuracy has not been investigated; nor has the effect of differences in the motion of a gesture been examined. We investigated a test mobile device with nine accelerometers and nine gyroscopes and captured data for 27 kinds of gestures on a mobile tablet. We experimentally investigated how recognition accuracy is affected by the number and positions of the sensors and by the number and kinds of gestures. The results showed that using multiple homogeneous sensors has little or no effect on recognition accuracy, but that using an accelerometer together with a gyroscope improves it. They also showed that some gestures were not performed consistently among test subjects and were interdependent, so selecting specific gestures to use can improve recognition accuracy.
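The paper's key finding, that pairing an accelerometer with a gyroscope helps more than adding further sensors of the same kind, amounts to fusing features from both sensor types before classification. The sketch below is our own minimal illustration, not the paper's recognizer: we assume windowed 3-axis samples per sensor, simple per-axis statistics as features, and a nearest-centroid classifier.

```python
import math

def features(accel_window, gyro_window):
    """Concatenate per-axis mean and energy for both sensor types.
    Each window is a list of (x, y, z) samples; fusing the two sensors
    yields 12 features (2 sensors x 3 axes x 2 statistics)."""
    feats = []
    for window in (accel_window, gyro_window):
        for axis in zip(*window):  # iterate over x, y, z columns
            mean = sum(axis) / len(axis)
            energy = sum(v * v for v in axis) / len(axis)
            feats.extend([mean, energy])
    return feats

def nearest_centroid(sample, centroids):
    """Classify a feature vector by the closest class centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))
```

Dropping the gyroscope window (or feeding a second accelerometer instead) removes half of the discriminative features, which mirrors why heterogeneous fusion outperformed homogeneous duplication in the study.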
MobIES: extending mobile interfaces using external screens
Dennis Schneider, Julian Seifert, E. Rukzio. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406438

Abstract: Recent mobile phones allow users to perform a multitude of different tasks. The complexity of these tasks challenges the design of mobile applications in many ways. For instance, the limited screen space of mobile devices allows only a small number of items to be displayed, so users often have to change the view or resize the displayed content (e.g., images or maps) to see the required information. We present MobIES, a system that allows users to extend the interfaces of their mobile phones using external screens. Users connect their mobile phone to an external display by holding it to the border of that display. Once the connection is established, the user interface of the currently active mobile application is distributed across the phone and the external screen. This lets users take advantage of existing screens in their environment and temporarily benefit from extended screen space. In this paper, we discuss the concept of MobIES and present a prototype implementation.
Creating web-based interactive public display applications with the PuReWidgets toolkit
Jorge C. S. Cardoso, R. Jose. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406434

Abstract: Interaction is repeatedly pointed out as a key element in making public displays more engaging and valuable. Still, most digital public displays today do not support any interactive features. We believe this is mainly due to the lack of efficient and clear abstractions that developers can use to incorporate interactivity into their applications. In this demo we present PuReWidgets, a toolkit that developers can use in their public display applications to support the interaction process across multiple display systems, without considering the specifics of which interaction modality will be used on each particular display. PuReWidgets provides high-level widgets to application programmers and allows users to interact via various mechanisms, such as graphical user interfaces on mobile devices, QR codes, and SMS.
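The abstraction the toolkit offers, registering a high-level widget once and receiving input from any modality through one callback, can be illustrated as follows. This is an illustrative sketch only, with class and method names of our own invention; it is not the actual PuReWidgets API.

```python
class Widget:
    """A high-level, modality-agnostic widget the application registers once."""
    def __init__(self, widget_id, on_input):
        self.widget_id = widget_id
        self.on_input = on_input  # single callback for all modalities

class Display:
    """Stands in for the toolkit runtime on one public display."""
    def __init__(self):
        self.widgets = {}

    def register(self, widget):
        self.widgets[widget.widget_id] = widget

    def deliver(self, widget_id, payload, modality):
        # Input from a mobile GUI, a scanned QR code, or an SMS gateway is
        # normalized into the same widget callback; the application never
        # branches on the modality unless it wants to.
        self.widgets[widget_id].on_input(payload, modality)

votes = []
display = Display()
display.register(Widget("poll", lambda payload, modality: votes.append((payload, modality))))
display.deliver("poll", "option-a", "sms")
display.deliver("poll", "option-a", "qr")
```

The design point is that the per-display modality choice lives in the runtime (`deliver`), so the same application code runs unchanged on displays with different input mechanisms.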
evoGuide: implementation of a tour guide support solution with multimedia and augmented-reality content
R. Hable, T. Rössler, Christina Schuller. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406403

Abstract: This paper describes the technical implementation of a complete software and hardware solution supporting guided factory tours in an industrial environment. It starts with an overview of the strategies applied to arrive at a working solution satisfying the given user requirements. The overall architecture of the system is presented based on the design decisions elaborated during this process. Specific implementation challenges are then described, from the assessment of the current state of the art and possible implementation options to the final working solution. These topics include the organization of configurable content delivery, the selection and application of an augmented reality framework, and the creation of an audio broadcast solution for the tour guide. We conclude with a review of the practical application of the system and possible further development and future improvements.
Social devices: collaborative co-located interactions in a mobile cloud
Niko Mäkitalo, Jari Pääkkö, M. Raatikainen, Varvana Myllärniemi, T. Aaltonen, Tapani Leppänen, T. Männistö, T. Mikkonen. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406380

Abstract: Online social media services, such as Facebook and Twitter, have set new standards for how people interact with each other online and share their everyday activities and media. While current mobile services supporting social interaction primarily target remote communication, similar services can be introduced to co-located social interaction. In such a setting, people and proactive, context-sensing mobile devices form a new kind of socio-digital system in which the mobile devices are active participants and can initiate interaction among devices and people. Physical proximity of the devices becomes a key enabler for advancing users' interaction with each other and with the supporting mobile services. In this paper, we introduce the concept of Social Devices and its implementation. The Social Devices Platform facilitates autonomously composed cooperative services on co-located devices, where the client part is simple and easily deployable to different kinds of devices.
Exploring user preferences for privacy interfaces in mobile sensing applications
D. Reinhardt, A. Reinhardt, M. Hollick, Kai Trumpold. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, December 2012. DOI: 10.1145/2406367.2406385

Abstract: By leveraging smartphones as sensing platforms, mobile sensing applications can collect information in unprecedented quantity and granularity. The transmission of unprocessed sensor readings can, however, pose severe threats to users' privacy. To protect their privacy, users can apply filters that eliminate privacy-sensitive elements of the sensor readings prior to transmission. The resulting privacy protection depends on the configuration of these filters, which users control through a privacy interface. In this paper, we study interface elements for realizing this interface in order to foster its acceptance and maximize the efficacy of the provided privacy protection. To this end, we implemented six graphical privacy interfaces, which were evaluated by the 80 participants of our user study. The results show that users prefer differently colored and sized elements both to visualize the current level of privacy protection and to define their preferred privacy settings.
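The filters behind such a privacy interface might look like the sketch below. This is our own illustration under stated assumptions, not the paper's implementation: we assume a slider-style privacy level that maps to coarser location precision, plus a field filter that strips sensitive elements of a reading before transmission.

```python
# Assumed mapping from a privacy-interface level to retained GPS decimals;
# fewer decimals means a coarser, more privacy-preserving location.
PRECISION_BY_LEVEL = {0: 4, 1: 2, 2: 1, 3: 0}

def filter_location(lat, lon, level):
    """Round a GPS fix to the precision allowed by the selected privacy level."""
    digits = PRECISION_BY_LEVEL[level]
    return round(lat, digits), round(lon, digits)

def filter_reading(reading, blocked_fields):
    """Drop privacy-sensitive fields from a sensor reading before it is sent."""
    return {k: v for k, v in reading.items() if k not in blocked_fields}
```

A graphical interface like those in the study would then only need to translate its colored and sized elements into a `level` and a set of `blocked_fields`.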