Flutter: An Exploration of an Assistive Garment Using Distributed Sensing, Computation and Actuation
Halley P. Profita, N. Farrow, N. Correll. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680586

Assistive technology (AT) can improve the quality of life of people with disabilities; however, it is often abandoned because it is aesthetically unappealing or stigmatizing. Garment-based AT offers novel opportunities to address these issues: it stays with the user to continuously monitor and convey relevant information, is non-invasive, and can provide aesthetically pleasing alternatives. In an effort to overcome traditional AT and wearable computing challenges, including cumbersome hardware and limited social acceptability, we present Flutter, a fashion-oriented wearable AT. Flutter seamlessly embeds low-profile networked sensing, computation, and actuation to facilitate sensory augmentation for people with hearing loss. The miniaturized, distributed hardware enables both textile integration and new ways to pair fashion with function, as embellishments are leveraged to complement the integrated technology. Finally, we discuss future applications and broader implications of using such computationally enabled textile wearables to support sensory augmentation beyond the realm of AT.
Tangible Interactive Microbiology for Informal Science Education
S. A. Lee, Alice M. Chung, Nate J. Cira, I. Riedel-Kruse. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680561

We present an interactive platform that enables human users to interface with living microbiological cells through a touch-screen, generating a tangible interactive experience with a microscopic world that is hidden from most people. Euglena gracilis, single-celled phototactic microorganisms, are imaged and optically stimulated via a microscope setup equipped with a projector and a touch-screen display. Users interact directly with these organisms by drawing patterns onto the screen, which displays a real-time magnified view of the microfluidic chamber containing the motile Euglena cells. The drawings are projected onto the chamber, thereby influencing the swimming motion of the cells. We discuss the architecture of the system and provide exploratory user-testing results from a facilitated setting, which show the engaging nature of our system for children and the general public. In conclusion, our tangible interactive microscope allows artistic expression and scientific exploration with the ease of child's play.
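The interaction loop described above — a drawing becomes a projected light field, and phototactic cells adjust their headings in response — can be sketched as a toy simulation. This is purely an illustrative assumption, not the authors' implementation: the function names and the simple light-avoidance steering model are invented here.

```python
import math

def steer_cell(cell_xy, heading, light_spots, gain=0.3):
    """Turn a simulated cell away from the nearest lit spot (a crude
    light-avoidance model of phototaxis; gain sets turn rate)."""
    x, y = cell_xy
    if not light_spots:
        return heading                   # no stimulus: keep swimming straight
    # Find the nearest projected light spot (a pixel the user drew).
    lx, ly = min(light_spots, key=lambda s: (s[0] - x) ** 2 + (s[1] - y) ** 2)
    away = math.atan2(y - ly, x - lx)    # direction pointing away from the light
    # Blend the current heading toward the 'away' direction, wrapping angles.
    diff = math.atan2(math.sin(away - heading), math.cos(away - heading))
    return heading + gain * diff

# A cell at the origin heading toward a light spot at (1, 0) turns away:
new_heading = steer_cell((0.0, 0.0), 0.0, [(1.0, 0.0)])
```

In the real system the "light spots" would come from the touch-screen drawing and the cell positions from the microscope's camera; this sketch only shows the shape of that feedback loop.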
An Accessible Platform for Exploring Haptic Interactions with Co-located Capacitive and Piezoresistive Sensors
A. Freed, D. Wessel. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680571

This paper introduces an open research platform for exploring haptic interactions with co-located capacitive and piezoresistive sensors. The solution uses readily available materials, hardware, and software components, and allows for experiments at many system levels, from low-level material concerns up to high-level sensor-fusion software. This provides the HCI community with a platform to accelerate exploration of the many applications that sensor fusion opens up in haptic interaction.
Mapping Place: Supporting Cultural Learning through a Lukasa-inspired Tangible Tabletop Museum Exhibit
J. Chu, Paul G. Clifton, Daniel Harley, J. Pavao, Ali Mazalek. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680559

Museums are exploring new ways of using emerging digital technologies to enhance the visitor experience. In this context, our research focuses on designing, developing, and studying interactions for museum exhibits that introduce cultural concepts in ways that are tangible and embodied. We introduce a tangible tabletop installation designed for a museum exhibition contrasting Western and African notions of mapping history and place. Inspired by the Lukasa board, a mnemonic device used by the Luba peoples of Central Africa, the tabletop piece enables visitors to learn and understand symbolic, nonlinguistic mapping concepts central to the Lukasa by creating and sharing stories with each other. In this paper we share our design process, a user study focusing on children and learning, and design implications for how digital and tangible interaction technologies can support cultural learning in museum exhibits.
Prototyping Social Interactions with DIY Animatronic Creatures
Cesar Vandevelde, M. Vanhoucke, Jelle Saldien. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2687146

For some time now, robotics research has shifted its attention from robots that function within their own predefined space to robots that coexist with humans in humans' natural habitats. This evolution has not only driven interest in robot safety and compliance; it has also produced the subdomain of social robotics, which is concerned with natural interaction between robots and humans. In this studio, we offer participants the chance to create their own animatronic creature using modular building blocks derived from Ono, our low-cost do-it-yourself social robot. In the first part, we help participants conceptualize a context and scenario for their social robot. Then, using craft materials (e.g., cardboard, glue, fabrics, foam) in combination with custom connectors and our animatronic modules, participants build the physical embodiment of their creature. Finally, the creatures are brought to life by connecting the modules to our electronics platform (a Raspberry Pi), which is programmed using an easy-to-use library.
Sticky Actuator: Free-Form Planar Actuators for Animated Objects
Ryuma Niiyama, Xu Sun, Lining Yao, H. Ishii, D. Rus, Sangbae Kim. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680600

We propose soft planar actuators, enhanced by free-form fabrication, that are suitable for making everyday objects move. Each actuator consists of one or more inflatable pouches with an adhesive back. We have developed a machine for fabricating free-form pouches; squares, circles, and ribbons are all possible. The deformation of the pouches can provide linear, rotational, and more complicated motion corresponding to the pouch's geometry. We also provide both manual and programmable control systems. In a user study, we organized a hands-on workshop on actuated origami for children. The results show that combining the actuators with classic materials can enhance rapid prototyping of animated objects.
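The reason an inflating pouch produces motion can be sketched with the standard pouch-motor geometry: the flat film (arc length L) bows into a circular arc when inflated, so the distance between the pouch's ends (the chord) shortens and pulls on whatever the pouch is glued to. The function name and parameterization below are illustrative, not taken from the paper.

```python
import math

def contraction_ratio(theta):
    """Chord/arc-length ratio for a film of fixed length bowed into a
    circular arc subtending angle theta (radians). 1.0 means flat."""
    if theta == 0:
        return 1.0                       # flat pouch: ends at full distance
    return math.sin(theta / 2) / (theta / 2)

# As the pouch inflates, theta grows and its ends draw together.
# At theta = pi (a half-circle cross-section) the end-to-end distance
# has shrunk to 2/pi (about 0.64) of the flat length:
r = contraction_ratio(math.pi)
```

That contraction is the linear stroke; gluing a pouch across a crease converts the same shortening into a rotational fold, which is how pouch geometry selects the motion type.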
Scented Pebbles: Interactive Ambient Experience with Smell and Lighting
Y. Y. Cao, N. Okude. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2690873

Scented Pebbles is a collection of interactive lighting objects that create a multisensory ambience of light and smell. When the networked objects sense people's movement and touch, they generate a dynamic ambience: the pebbles emit smells and control the lighting conditions to create a unique atmosphere, such as a Hawaiian sunset or a Japanese onsen. Experience the orchestra of smell and light at play and let your mind run free. The paper presents an interactive approach to evoking sensorial imagination through multisensory interactions, including the olfactory sense.
Cube-in: A Learning Kit for Physical Computing Basics
Hyunjoo Oh, M. Gross. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680597

We present Cube-in, a kit designed to help beginners learn fundamental concepts in physical computing. Through play and observation, Cube-in users can investigate digital and analog signals, inputs and outputs, and the mapping between inputs and outputs before they work with electronics and construct circuits. By simplifying interaction methods, Cube-in provides an accessible entry point to key physical computing concepts.
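The concepts the kit teaches — digital versus analog signals, and mapping an input range onto an output range — can be illustrated with a toy sketch (invented here, not part of Cube-in):

```python
def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale an analog reading from one range onto another,
    in the spirit of the classic Arduino-style map() helper."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# An analog input is a value in a range, e.g. a 10-bit sensor reading
# (0-1023) rescaled to drive an 8-bit LED brightness (0-255):
brightness = map_range(512, 0, 1023, 0, 255)

# A digital input is just on/off, e.g. the same reading thresholded:
is_bright = 512 > 300
```

In the physical kit this input-to-output mapping happens by connecting cubes rather than writing code; the sketch only makes the underlying concept concrete.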
The ATB Framework: Quantifying and Classifying Epistemic Strategies in Tangible Problem-Solving Tasks
Augusto Esteves, Saskia Bakker, A. Antle, Aaron May, Jillian L. Warren, Ian Oakley. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2680546

In task performance, pragmatic actions are behaviors that make direct progress toward a goal, while epistemic actions alter the world so that cognitive processes become faster, more reliable, or less taxing. Epistemic actions are frequently presented as a beneficial consequence of interacting with tangible systems. However, we currently lack tools to measure epistemic behaviors, making such claims hard to substantiate. This paper addresses this problem by presenting ATB, a video-coding framework that enables the identification and measurement of different epistemic actions during problem-solving tasks. The framework was developed through a systematic literature review of 78 papers and evaluated through a study of a jigsaw puzzle -- a classic spatial problem -- with 60 participants. To assess the framework's value as a metric, we analyze the study with respect to its reliability, validity, and predictive power. The broadly supportive results lead us to conclude that the ATB framework enables observed epistemic behaviors to serve as a performance metric for tangible systems. We believe that metrics focused explicitly on the properties of tangible interaction are needed to gain insight into its genuine and unique benefits. The ATB framework is a step toward this goal.
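A video-coding framework of this kind ultimately yields a log of coded actions that can be tallied into metrics. The sketch below shows that final step in the abstract's own terms; the action labels and log format are invented for illustration and are not the framework's actual coding scheme.

```python
from collections import Counter

# Hypothetical output of a video-coding pass over one jigsaw session:
# each entry is (coded action, pragmatic/epistemic classification).
coded_log = [
    ("place_piece", "pragmatic"),       # direct progress on the puzzle
    ("rotate_to_inspect", "epistemic"), # simplifies perception, no progress
    ("sort_by_color", "epistemic"),     # restructures the world to aid search
    ("place_piece", "pragmatic"),
    ("group_edge_pieces", "epistemic"),
]

counts = Counter(kind for _action, kind in coded_log)
epistemic_share = counts["epistemic"] / len(coded_log)
```

Counts and proportions like `epistemic_share` are the sort of observed-behavior metric the framework is meant to make comparable across tangible systems.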
Redeform: Participatory 3D Printing in Public Spaces
Laura Devendorf, Kimiko Ryokai. Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15), 2015. DOI: 10.1145/2677199.2690880

Redeform presents an alternative vision of 3D printing that complicates common divisions between human/machine, abstract/concrete, and high/low tech. It invites people to perform the functions of a 3D printer, collaboratively constructing digital models from everyday materials in everyday spaces. At TEI, Redeform will serve as a site for discussion about values in digital fabrication design.