Stefan Schneegass, Thomas Olsson, Sven Mayer, Kristof Van Laerhoven
{"title":"Mobile Interactions Augmented by Wearable Computing: A Design Space and Vision","authors":"Stefan Schneegass, Thomas Olsson, Sven Mayer, Kristof Van Laerhoven","doi":"10.4018/IJMHCI.2016100106","DOIUrl":null,"url":null,"abstract":"Wearable computing has a huge potential to shape the way we interact with mobile devices in the future. Interaction with mobile devices is still mainly limited to visual output and tactile finger-based input. Despite the visions of next-generation mobile interaction, the hand-held form factor hinders new interaction techniques becoming commonplace. In contrast, wearable devices and sensors are intended for more continuous and close-to-body use. This makes it possible to design novel wearable-augmented mobile interaction methods-both explicit and implicit. For example, the EEG signal from a wearable breast strap could be used to identify user status and change the device state accordingly implicit and the optical tracking with a head-mounted camera could be used to recognize gestural input explicit. In this paper, the authors outline the design space for how the existing and envisioned wearable devices and sensors could augment mobile interaction techniques. 
Based on designs and discussions in a recently organized workshop on the topic as well as other related work, the authors present an overview of this design space and highlight some use cases that underline the potential therein.","PeriodicalId":43100,"journal":{"name":"International Journal of Mobile Human Computer Interaction","volume":null,"pages":null},"PeriodicalIF":0.2000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Mobile Human Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/IJMHCI.2016100106","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 17
Abstract
Wearable computing has huge potential to shape the way we interact with mobile devices in the future. Interaction with mobile devices is still mainly limited to visual output and tactile finger-based input. Despite the visions of next-generation mobile interaction, the hand-held form factor hinders new interaction techniques from becoming commonplace. In contrast, wearable devices and sensors are intended for more continuous and close-to-body use. This makes it possible to design novel wearable-augmented mobile interaction methods, both explicit and implicit. For example, the EEG signal from a wearable breast strap could be used to identify the user's status and change the device state accordingly (implicit), while optical tracking with a head-mounted camera could be used to recognize gestural input (explicit). In this paper, the authors outline the design space for how existing and envisioned wearable devices and sensors could augment mobile interaction techniques. Based on designs and discussions in a recently organized workshop on the topic, as well as other related work, the authors present an overview of this design space and highlight some use cases that underline the potential therein.