Florian Güldenpfennig, Oliver Hödl, P. Reichl, Christian Löw, A. Gartus, Matthew Pelowski
We introduce 'The Interactive Audience Sensor Kit' (TASK), a modular system of wirelessly networked sensors that facilitates the augmentation of artistic performances, in particular music events and visual art. It was conceived to enable low-level, low-cost audience interaction by offering a set of tracker nodes that can be arranged across the venue as the performance demands. In this paper, we present two TASK modules currently under development and provide early insights from a field study. (1) TASKswitch reacts to the presence of bodies, acting as an on/off switch, and can thereby be used to modify particular aspects of a performance. (2) TASKvector, on the other hand, enables more complex input by tracking movement among the audience. For example, as we show in the paper, we used the modules to create interactive audio-visual experiences for the audience in which projections were modulated by TASK.
{"title":"TASK: Introducing The Interactive Audience Sensor Kit","authors":"Florian Güldenpfennig, Oliver Hödl, P. Reichl, Christian Löw, A. Gartus, Matthew Pelowski","doi":"10.1145/2839462.2856538","DOIUrl":"https://doi.org/10.1145/2839462.2856538","url":null,"abstract":"We introduce 'The interactive Audience Sensor Kit' (TASK). This modular system of wirelessly networked sensors facilitates the augmentation of artistic performances, in particular, music events or visual art. It was conceived to enable low-level and low-cost audience interaction by offering a set of tracker-nodes to be arranged across the venue as demanded by the corresponding performance event. In this paper, we present two modules from TASK, which are currently under development, and provide early insights from a field study. (1) TASKswitch reacts to the presence of bodies acting as an on/off switch and thereby can be used to modify particular aspects of a performance. (2) TASKvector, on the other hand, enables more complex input by tracking movement among the audience. For example, as we will show in the paper, we used the modules to create interactive audio-visual experiences for the audience where projections were modulated by TASK.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128632587","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper we describe the development and evaluation of three kitchen blenders specifically designed to stimulate mindfulness in interaction, that is, engagement with and care for what you are doing. We find that the directness we used to have in preparing our food has been sacrificed to efficiency and ease of use, which does not match our current zest for 'slow food' and 'slow cooking'. We argue that most of our kitchen appliances make us less engaged in, and less caring about, the act of cooking. To counter this we see opportunities for a more tangible or embodied interaction style in which expressive input leads to expressive output. To investigate this argument we developed three embodied kitchen blender interaction styles and compared them to a more traditional blender interaction. Preliminary findings suggest that more embodied interaction styles do indeed lead to more mindful engagement in interaction.
{"title":"Engagement Through Embodiment: A Case For Mindful Interaction","authors":"V. V. Rheden, B. Hengeveld","doi":"10.1145/2839462.2839498","DOIUrl":"https://doi.org/10.1145/2839462.2839498","url":null,"abstract":"In this paper we describe the development and evaluation of three kitchen blenders that were specifically designed to stimulate mindfulness in interaction, that is: engagement with, and care for what you are doing. We find that the directness we used to have preparing our food has been sacrificed to efficiency and ease of use, which does not match our current zest for 'slow food' and 'slow cooking'. We argue that most of our kitchen appliances make us less engaged in the act of and less caring for cooking. In order to counter this we see opportunities for a more tangible or embodied interaction style where expressive input leads to expressive output. In order to research this argument we have developed three embodied kitchen blender interaction styles and compared these to a more traditional blender interaction. Preliminary findings suggest that more embodied interaction styles do indeed lead to more mindful engagement in interaction.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130449037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Stuff That Works","authors":"P. Bennett","doi":"10.1145/3257866","DOIUrl":"https://doi.org/10.1145/3257866","url":null,"abstract":"","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129206315","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
With few exceptions, technology for autistic children tends to focus on the regulation of perceived deficits. With OutsideTheBox we focus on the strengths of the children as design partners and, in our first year, created four technological objects together with them. They all have in common that they are embedded in the children's lives and share some degree of embodied interaction. We present a case study along with the four objects, two of them with wearable components and two of them focused on sharing experiences in an embodied mode. This opens up the argument not only for more design actually led by autistic children, but also for companion technologies that embody situatedness. Such technologies are then not driven by an outsider's perspective of what an autistic child needs, but rather are intrinsically valuable to the children as users.
{"title":"Embodied Companion Technologies for Autistic Children","authors":"Katta Spiel, Julia Makhaeva, C. Frauenberger","doi":"10.1145/2839462.2839495","DOIUrl":"https://doi.org/10.1145/2839462.2839495","url":null,"abstract":"With few exceptions, technology for autistic children tends to be focused on the regulation of perceived deficits. With OutsideTheBox we focus on the strengths of the children as design partners and created in our first year four technological objects together with them. They all have common that they are embedded in the children's lives and share some degree of embodied interaction. We present a case study along with four objects, two of them with wearable components, two of them focused at sharing experiences in an embodied mode. This opens up the argument not only for more design actually led by autistic children, but also for companion technologies that embody situatedness. Such technologies are then not driven by an outsider's perspective of what an autistic child needs, but rather are intrinsically valuable to them as a user.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121583285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D modeling is used in computer graphics across various fields. With the growth of gestural interfaces, virtual reality and embodied cognition, various new technologies have been developed either to improve modeling efficiency or to provide a more natural, intuitive experience to users. In this paper, from the user experience perspective, we compare methods for the navigation of 3D objects in a virtual modeling environment: simple bare-hand gestures, tangible user interfaces (TUI) with an object in hand, and mouse/keyboard as the primary input. Based on embodied cognition theory, we hypothesize that the object-in-hand method might provide a better user experience, since the interaction between the object and the hand can enhance the user's cognition while navigating a model. We present a conceptual design, with two approaches and three design models, which demonstrates differences in user interaction with 3D modeling software.
{"title":"Comparing bare-hand-in-air Gesture and Object-in-hand Tangible User Interaction for Navigation of 3D Objects in Modeling","authors":"S. Dangeti, Victor Y. Chen, Chunhui Zheng","doi":"10.1145/2839462.2856555","DOIUrl":"https://doi.org/10.1145/2839462.2856555","url":null,"abstract":"3D modeling is used in Computer Graphics in various fields. Since the growth of gestures, virtual reality and embodied cognition, there have been various new technologies developed to either improve the modeling efficiency, or to provide more nature intuitive experience to the users. In this paper, from the user experience perspective, we try to compare these methods for navigation of 3D objects in the virtual modeling environment including: simple bare hand gestures, tangible user interfaces (TUI) with object in hand, as well as mouse/keyboard as the primary input. Based on embodied cognition theory, we hypothesis that the object-in-hand method might bring better user experience since the interaction between the object and hand can enhance the user's cognition while navigating a model. We present a conceptual design, with two approaches and three design models which demonstrate differences in user interaction with 3D modeling software.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"422 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122796558","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sowmya Somanath, Laura Morrison, J. Hughes, E. Sharlin, M. Sousa
This paper presents a set of lessons learnt from introducing maker culture and DIY paradigms to 'at-risk' students (ages 12-14). Our goal is to engage 'at-risk' students through maker culture activities. While improved technology literacy is one of the outcomes, we also wanted the learners to use technology to realize concepts and ideas, and to gain a freedom of thinking similar to that of creators, artists and designers. We present our study and a set of high-level suggestions for thinking about how maker culture activities can facilitate engagement and creative use of technology: 1) thinking about creativity in the task, 2) facilitating different entry points, 3) the importance of personal relevance, and 4) relevance to education.
{"title":"Engaging 'At-Risk' Students through Maker Culture Activities","authors":"Sowmya Somanath, Laura Morrison, J. Hughes, E. Sharlin, M. Sousa","doi":"10.1145/2839462.2839482","DOIUrl":"https://doi.org/10.1145/2839462.2839482","url":null,"abstract":"This paper presents a set of lessons learnt from introducing maker culture and DIY paradigms to 'at-risk' students (age 12-14). Our goal is to engage 'at-risk' students through maker culture activities. While improved technology literacy is one of the outcomes we also wanted the learners to use technology to realize concepts and ideas, and to gain freedom of thinking similar to creators, artists and designers. We present our study and a set of high level suggestions to enable thinking about how maker culture activities can facilitate engagement and creative use of technology by 1) thinking about creativity in task, 2) facilitating different entry points, 3) the importance of personal relevance, and 4) relevance to education.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131363459","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this research, we present the design and formative evaluation of an interactive simulation for informal learning environments. The wearable nature of Augmented Reality (AR) glasses enables full-body movement and embodied interaction in digitally augmented physical environments. The interactive simulation was developed to engage and immerse users in understanding an abstract scientific concept: the refraction of light. To design playful and meaningful learning experiences, several design features related to social interaction, multi-user interaction, and embodied interaction were unpacked and integrated in the design process. Through a formative evaluation with participants in a laboratory setting, we identified several possibilities and challenges for designing interactive simulations in informal learning contexts using AR glasses.
{"title":"Designing a Multi-user Interactive Simulation Using AR Glasses","authors":"Seungjae Oh, Kyudong Park, Soonmo Kwon, H. So","doi":"10.1145/2839462.2856521","DOIUrl":"https://doi.org/10.1145/2839462.2856521","url":null,"abstract":"In this research, we present the design and formative evaluation of an interactive simulation for informal learning environments. The wearable feature of Augmented Reality(AR) glasses enables full-body movement and embodied interactions in digitally augmented physical environments. The interactive simulation was developed to engage and immerse users to understand an abstract scientific concept about the refraction of light. To design playful and meaningful learning experiences, several design features related to social interaction, multi-user interaction, and embodied interaction were unpacked and integrated in the design process. Through the formative evaluation with participants in the laboratory setting, we found several possibilities and challenges about designing an interactive simulation in informal learning contexts using AR glasses.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128149580","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We propose miMic, a sonic analogue of paper and pencil: an augmented microphone for vocal and gestural sonic sketching. Vocalizations are classified and interpreted as instances of sound models, which the user can play with through vocal and gestural control. The physical device is based on a modified microphone with embedded inertial sensors and buttons. Sound models are selected by vocal imitations that are automatically classified, and each model is mapped to vocal and gestural features for real-time control. With miMic, the sound designer can explore a vast sonic space and quickly produce expressive sonic sketches, which may be turned into sound prototypes by further adjustment of model parameters.
{"title":"miMic: The Microphone as a Pencil","authors":"D. Rocchesso, D. Mauro, S. Monache","doi":"10.1145/2839462.2839467","DOIUrl":"https://doi.org/10.1145/2839462.2839467","url":null,"abstract":"miMic, a sonic analogue of paper and pencil is proposed: An augmented microphone for vocal and gestural sonic sketching. Vocalizations are classified and interpreted as instances of sound models, which the user can play with by vocal and gestural control. The physical device is based on a modified microphone, with embedded inertial sensors and buttons. Sound models can be selected by vocal imitations that are automatically classified, and each model is mapped to vocal and gestural features for real-time control. With miMic, the sound designer can explore a vast sonic space and quickly produce expressive sonic sketches, which may be turned into sound prototypes by further adjustment of model parameters.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133552265","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This proposal presents a modular system design for a set of programmable tools with various form factors inspired by the relations between human body parts and industrial elements. By giving functional forms to sensors and actuators, and providing tangible methods of programming behaviors into objects, we propose a more customizable experience in the area of the Internet of Things. We explore the design space by studying motions in everyday elements and introduce applications of digitally enabled modules in form factors such as hinges, joints and zippers. Lastly, we investigate physical ways to program these modules that afford playful interactions in the tangible world. The workshop will first focus on using probes and brainstorming toolkits to generate design ideas related to themes such as super-powers, environments as extensions of the body, and the next generation of ubiquitous computing [5]. Using the resources and toolkits provided by the workshop organizers, participants will create proof-of-concept prototypes as final deliverables. Participants are required to bring personal laptops.
{"title":"MeMod: A Modular Hacking and Programming Toolkit For Everyday Objects","authors":"Austin S. Lee, Dhairya Dand","doi":"10.1145/2839462.2854113","DOIUrl":"https://doi.org/10.1145/2839462.2854113","url":null,"abstract":"This proposal presents a modular system design for a set of programmable tools with various form factors inspired by the relations between human body parts and industrial elements. By providing functional forms to sensors and actuators, and tangible methods of programming behaviors to the objects, we propose a more customizable experience in the area of the Internet of Things. We explore the design space through studying motions in everyday elements and introduce applications of digitally enabled modules in form factors such as hinge, joints, zipper etc. Lastly, we investigate physical ways to program these modules that affords playful interactions in the tangible world. The workshop will first focus on using probes, brainstorming toolkits to generate the design ideas related to themes such as super-powers, environments as extension of the body and next generation of ubiquitous computing [5]. By using the resources and tool-kits provided by the workshop organizers, participants will generate proof-of-concept prototype as final deliverables. Participants are required to bring personal laptops.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132194455","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This research explores design opportunities where tangible interaction enables new ways to engage visitors with the stories and artefacts on display, not in a museum as such, but within a house museum -- a particular type of heritage site that has received little attention from the field of interaction design. The work sits between the fields of design (e.g. exhibition design), heritage, and technology (e.g. HCI), and it investigates how the approach to designing for house museums may differ from that for conventional museums. The research unfolds through a designerly approach that explores the potential of tangible interaction by means of a series of design interventions in which art and design practices (e.g. the creation of interpretive objects), technology (e.g. tangible technologies embedded within objects) and historical content (e.g. evocative narratives) are connected to prompt visitors' personal, tangible and multi-sensory engagement at a house museum. This research is at an early stage (begun in October 2015); this paper therefore presents an initial analysis of the literature framing the research, the motivations and context behind the project, the methods for achieving the goals of the study, and plans for future work.
{"title":"Crafting Tangible Interaction to Prompt Visitors' Engagement in House Museums","authors":"C. Claisse","doi":"10.1145/2839462.2854107","DOIUrl":"https://doi.org/10.1145/2839462.2854107","url":null,"abstract":"This research explores design opportunities where tangible interaction enables new ways to engage visitors with the stories and artefacts on display, not in a museum as such, but within a house museum -- a particular type of heritage site where I noticed little attention from the field of interaction design. The work sits between the fields of design (e.g. exhibition design), heritage, and technology (e.g. HCI) and it investigates how the approach to designing for house museums may be different than for conventional museums. This research unfolds through a designerly approach to explore the potential of tangible interaction by means of a series of design interventions where art and design practices (e.g. creation of interpretive object), technology (e.g. tangible technologies embedded within object) and historical content (e.g. evocative narrative) are connected together to prompt visitors' personal, tangible and multi-sensory engagement at a house museum. 
This research is at an early stage (begun in October 2015), thus this paper presents an initial analysis of literature to frame the research, the motivations and context behind the project, the methods to achieve the goals of the study and future work plans.","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"182 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132246801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}