Sootoid is a symphonic artwork created through collaboration between a human, a computer, and a candle flame. While most generative art is drawn on a computer display, Sootoid produces a generative drawing on a linen canvas using candle soot. The candle's movement is controlled by a pre-programmed algorithm; each movement creates airflow and flame flicker that yield unique soot lines and colors.
{"title":"Sootoid: Generative Art Drawing by Flame of Candle","authors":"Hisakazu Hada, Daisuke Tozuka, Sho Kamei, Tatsumi Hasegawa, Fuhito Morita, Shuntaro Ootsuki, Akito Nakano","doi":"10.1145/2677199.2690869","DOIUrl":"https://doi.org/10.1145/2677199.2690869","url":null,"abstract":"Sootoid is a symphonic art created by collaboration between human, computer and flame of candle. While the major generative art pictures are drawn in a computer display, Sootoid creates a generative art drawing on a linen canvas by candle soot. Movement of candle is controlled by pre-programmed algorithm, making airflow and flickering of candle flame each time it moves to create unique soot line and color.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"106 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115556837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
For some time now, robotics research has shifted its attention from robots that function within their own predefined space to robots that coexist with humans in humans' natural habitats. This evolution has not only driven interest in robot safety and compliance, it has also given rise to the subdomain of Social Robotics, which is concerned with natural interaction between robots and humans. In this studio, we offer participants the chance to create their own animatronic creature using modular building blocks derived from Ono, our low-cost Do-It-Yourself social robot. In the first part, we help participants conceptualize a context and scenario for their social robot. Then, using craft materials (e.g. cardboard, glue, fabrics, foam) in combination with custom connectors and our animatronic modules, participants build the physical embodiment of their creature. Finally, the creatures are brought to life by connecting the modules to our electronics platform (Raspberry Pi), which is then programmed using an easy-to-use library.
{"title":"Prototyping Social Interactions with DIY Animatronic Creatures","authors":"Cesar Vandevelde, M. Vanhoucke, Jelle Saldien","doi":"10.1145/2677199.2687146","DOIUrl":"https://doi.org/10.1145/2677199.2687146","url":null,"abstract":"For some time now, robotics research has shifted its attention from robots that function within their own predefined space to robots that coexist with humans in the human's natural habitats. This evolution has not only driven interest in robot safety and compliance, it has also resulted in the subdomain of Social Robotics, which is concerned with natural interaction between robots and humans. In this studio, we will offer participants the chance to create their own animatronic creature using modular building blocks derived from Ono, our low-cost Do-It-Yourself social robot. In the first part, we will help participants to conceptualize a context and scenario for their social robot. Then, using craft materials (e.g. cardboard, glue, fabrics, foam, etc.) in combination with custom connectors and our animatronic modules, participants will build the physical embodiment of their creature. Finally, they are brought to life by connecting the modules to our electronics platform (Raspberry PI), which is then programmed using an easy to use library.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"192 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116523367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Scented Pebbles is a collection of interactive lighting objects that create a multisensory ambience of light and smell. When they sense people's movement and touch, the networked objects generate a dynamic ambience: the pebbles emit scents and control the lighting conditions to create a unique atmosphere, such as a Hawaiian sunset or a Japanese onsen. Experience the orchestra of smell and light at play and let your mind run free. The paper presents an interactive approach to evoking sensorial imagination through multisensory interactions, including the olfactory sense.
{"title":"Scented Pebbles: Interactive Ambient Experience with Smell and Lighting","authors":"Y. Y. Cao, N. Okude","doi":"10.1145/2677199.2690873","DOIUrl":"https://doi.org/10.1145/2677199.2690873","url":null,"abstract":"Scented Pebbles is a collection of interactive lighting objects to create multisensory ambience of light and smell. When it senses people's movement and touch, the networked objects will generate dynamic ambience. The pebbles emit smells and control the lighting conditions to create your unique ambience such as Hawaiian Sunset or Japanese Onsen. Experience the orchestra of smell and light play and let your mind run free. The paper presented interactive approach to evoke sensorial imagination through multisensory interactions including olfactory sense.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123799056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We present Cube-in, a kit designed to help beginners learn about fundamental concepts in physical computing. Through play and observation, Cube-in users can investigate digital and analog signals, inputs and outputs, and mapping between inputs and outputs before they work on electronics and construct circuits. By simplifying interaction methods, Cube-in provides an accessible entry point to key physical computing concepts.
{"title":"Cube-in: A Learning Kit for Physical Computing Basics","authors":"Hyunjoo Oh, M. Gross","doi":"10.1145/2677199.2680597","DOIUrl":"https://doi.org/10.1145/2677199.2680597","url":null,"abstract":"We present Cube-in, a kit designed to help beginners learn about fundamental concepts in physical computing. Through play and observation, Cube-in users can investigate digital and analog signals, inputs and outputs, and mapping between inputs and outputs before they work on electronics and construct circuits. By simplifying interaction methods, Cube-in provides an accessible entry point to key physical computing concepts.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"151 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124364978","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
S. A. Lee, Alice M. Chung, Nate J. Cira, I. Riedel-Kruse
We present an interactive platform that enables human users to interface with living microbiological cells through a touch-screen, thereby generating a tangible interactive experience with the microscopic world that is hidden to most people. Euglena gracilis, a single-celled phototactic microorganism, is imaged and optically stimulated via a microscope setup equipped with a projector and a touch-screen display. Users can directly interact with these organisms by drawing patterns onto the screen, which displays a real-time magnified view of the microfluidic chamber containing the motile euglena cells. The drawings are projected directly onto the chamber, thereby influencing the swimming motion of the cells. We discuss the architecture of the system and provide exploratory user-testing results from a facilitated setting, which show the engaging nature of our system for children and the general public. In conclusion, our tangible interactive microscope allows artistic expression and scientific exploration with the ease of child's play.
{"title":"Tangible Interactive Microbiology for Informal Science Education","authors":"S. A. Lee, Alice M. Chung, Nate J. Cira, I. Riedel-Kruse","doi":"10.1145/2677199.2680561","DOIUrl":"https://doi.org/10.1145/2677199.2680561","url":null,"abstract":"We present an interactive platform that enables human users to interface with microbiological living cells through a touch-screen, thereby generating a tangible interactive experience with the microscopic world that is hidden to most people. Euglena gracilis, single-celled phototactic microorganisms, are imaged and optically stimulated via a microscope setup equipped with a projector and a touch-screen display. Users can directly interact with these organisms by drawing patterns onto the screen, which displays the real-time magnified view of the microfluidic chamber with the motile euglena cells. The drawings are directly projected onto the chamber, thereby influencing the swimming motion of the cells. We discuss the architecture of the system and provide exploratory user testing results in a facilitated setting, which shows engaging nature of our system for children and the general public. In conclusion, our tangible interactive microscope allows artistic expression and scientific exploration with the ease of a \"child's play.\"","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130918317","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Assistive technology (AT) can improve the standard of living of people with disabilities; however, it is often abandoned for aesthetic reasons or because of stigma. Garment-based AT offers novel opportunities to address these issues: it can stay with the user to continuously monitor and convey relevant information, is non-invasive, and can provide aesthetically pleasing alternatives. In an effort to overcome traditional AT and wearable-computing challenges, including cumbersome hardware constraints and social acceptability, we present Flutter, a fashion-oriented wearable AT. Flutter seamlessly embeds low-profile networked sensing, computation, and actuation to facilitate sensory augmentation for people with hearing loss. The miniaturized distributed hardware enables both textile integration and new methods of pairing fashion with function, as embellishments are functionally leveraged to complement technology integration. Finally, we discuss future applications and broader implications of using such computationally enabled textile wearables to support sensory augmentation beyond the realm of AT.
{"title":"Flutter: An Exploration of an Assistive Garment Using Distributed Sensing, Computation and Actuation","authors":"Halley P. Profita, N. Farrow, N. Correll","doi":"10.1145/2677199.2680586","DOIUrl":"https://doi.org/10.1145/2677199.2680586","url":null,"abstract":"Assistive technology (AT) has the ability to improve the standard of living of those with disabilities, however, it can often be abandoned for aesthetic or stigmatizing reasons. Garment-based AT offers novel opportunities to address these issues as it can stay with the user to continuously monitor and convey relevant information, is non-invasive, and can provide aesthetically pleasing alternatives. In an effort to overcome traditional AT and wearable computing challenges including, cumbersome hardware constraints and social acceptability, we present Flutter, a fashion-oriented wearable AT. Flutter seamlessly embeds low-profile networked sensing, computation, and actuation to facilitate sensory augmentation for those with hearing loss. The miniaturized distributed hardware enables both textile integration and new methods to pair fashion with function, as embellishments are functionally leveraged to complement technology integration. Finally, we discuss future applications and broader implications of using such computationally-enabled textile wearables to support sensory augmentation beyond the realm of AT.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130244996","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Paper Session 1: Why Use Theory?","authors":"Ali Mazalek","doi":"10.1145/3246880","DOIUrl":"https://doi.org/10.1145/3246880","url":null,"abstract":"","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129796880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Stefanie Mueller, Martin Fritzsche, Jan Kossmann, Maximilian Schneider, Jonathan Striebel, Patrick Baudisch
We present a simple self-contained appliance that allows relocating inanimate physical objects across distances. Each unit consists of an off-the-shelf 3D printer that we have extended with a 3-axis milling machine, a camera, and a micro-controller for encryption/decryption and transmission. Users place an object into the sender unit, enter the address of a receiver unit, and press the relocate button. The sender unit then digitizes the original object layer-by-layer: it shaves off material using the built-in milling machine, takes a photo using the built-in camera, encrypts the layer using the public key of the receiver, and transmits it. The receiving unit decrypts each layer in real time and starts printing right away. Users thus see the object appear layer-by-layer on the receiver side as it disappears layer-by-layer on the sender side. Scotty differs from previous systems that copy physical objects in that its destruction and encryption mechanism guarantees that only one copy of the object exists at a time. Even though our current prototype is limited to single-material plastic objects, it allows us to address two application scenarios: (1) Scotty can help preserve the uniqueness and thus the emotional value of physical objects shared between friends. (2) Scotty can address some of the licensing issues involved in fast electronic delivery of physical goods. We explore the former in an exploratory user study with three pairs of participants.
{"title":"Scotty: Relocating Physical Objects Across Distances Using Destructive Scanning, Encryption, and 3D Printing","authors":"Stefanie Mueller, Martin Fritzsche, Jan Kossmann, Maximilian Schneider, Jonathan Striebel, Patrick Baudisch","doi":"10.1145/2677199.2680547","DOIUrl":"https://doi.org/10.1145/2677199.2680547","url":null,"abstract":"We present a simple self-contained appliance that allows relocating inanimate physical objects across distance. Each unit consists of an off-the-shelf 3D printer that we have extended with a 3-axis milling machine, a camera, and a micro-controller for encryption/decryption and transmission. Users place an object into the sender unit, enter the address of a receiver unit, and press the relocate button. The sender unit now digitizes the original object layer-by-layer: it shaves off material using the built-in milling machine, takes a photo using the built-in camera, encrypts the layer using the public key of the receiver, and transmits it. The receiving unit decrypts the layer in real-time and starts printing right away. Users thus see the object appear layer-by-layer on the receiver side as it disappears layer-by-layer at the sender side. Scotty is different from previous systems that copy physical objects, as its destruction and encryption mechanism guarantees that only one copy of the object exists at a time. Even though our current prototype is limited to single-material plastic objects, it allows us to address two application scenarios: (1) Scotty can help preserve the uniqueness and thus the emotional value of physical objects shared between friends. (2) Scotty can address some of the licensing issues involved in fast electronic delivery of physical goods. We explore the former in an exploratory user study with three pairs of participants.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125633585","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
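The Scotty abstract describes a simple pipeline: capture a layer while destroying it, encrypt it for the receiver, transmit, and print on arrival. A rough sketch of that loop follows; all function names are illustrative assumptions, and the XOR "cipher" is a toy stand-in (the actual system uses real public-key encryption and hardware milling/printing):

```python
# Sketch of Scotty's layer-by-layer relocate loop (illustrative only).
from typing import Iterator, List


def toy_encrypt(layer: bytes, key: bytes) -> bytes:
    """Stand-in for public-key encryption. XOR is NOT secure; it only
    illustrates that each layer travels in encrypted form."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(layer))


def toy_decrypt(blob: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decryption mirrors encryption."""
    return toy_encrypt(blob, key)


def sender(layers: Iterator[bytes], receiver_key: bytes) -> Iterator[bytes]:
    """Digitize the object layer-by-layer. In the real appliance, each
    layer is milled away (destroyed) as it is photographed, so at most
    one copy of any layer exists at a time."""
    for layer in layers:                 # milling + camera capture
        yield toy_encrypt(layer, receiver_key)


def receiver(stream: Iterator[bytes], key: bytes) -> List[bytes]:
    """Decrypt each layer as it arrives and start printing immediately."""
    printed = []
    for blob in stream:
        printed.append(toy_decrypt(blob, key))   # feed to the 3D printer
    return printed


object_layers = [b"layer0", b"layer1", b"layer2"]
shared_key = b"demo-key"
result = receiver(sender(iter(object_layers), shared_key), shared_key)
assert result == object_layers           # object reappears at the receiver
```

Because the sender consumes (destroys) each layer before transmitting it, the single-copy guarantee the paper claims falls out of the streaming structure rather than any extra bookkeeping.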
Ryuma Niiyama, Xu Sun, Lining Yao, H. Ishii, D. Rus, Sangbae Kim
We propose soft planar actuators, enhanced by free-form fabrication, that are suitable for making everyday objects move. The actuator consists of one or more inflatable pouches with an adhesive back. We have developed a machine for the fabrication of free-form pouches; squares, circles, and ribbons are all possible. The deformation of the pouches can provide linear, rotational, and more complicated motion corresponding to the pouch's geometry. We also provide both a manual and a programmable control system. In a user study, we organized a hands-on workshop of actuated origami for children. The results show that combining the actuator with classic materials can enhance rapid prototyping of animated objects.
{"title":"Sticky Actuator: Free-Form Planar Actuators for Animated Objects","authors":"Ryuma Niiyama, Xu Sun, Lining Yao, H. Ishii, D. Rus, Sangbae Kim","doi":"10.1145/2677199.2680600","DOIUrl":"https://doi.org/10.1145/2677199.2680600","url":null,"abstract":"We propose soft planar actuators enhanced by free-form fabrication that are suitable for making everyday objects move. The actuator consists of one or more inflatable pouches with an adhesive back. We have developed a machine for the fabrication of free-from pouches; squares, circles and ribbons are all possible. The deformation of the pouches can provide linear, rotational, and more complicated motion corresponding to the pouch's geometry. We also provide a both manual and programmable control system. In a user study, we organized a hands-on workshop of actuated origami for children. The results show that the combination of the actuator and classic materials can enhance rapid prototyping of animated objects.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127219166","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper introduces an open research platform for exploring haptic interactions with co-located capacitive and piezoresistive sensors. The solution uses readily available materials, hardware, and software components and allows for experiments at many system levels, from low-level material concerns up to high-level sensor-fusion software. This provides the HCI community with a platform to accelerate exploration of the many applications of sensor fusion in haptic interaction that have opened up.
{"title":"An Accessible Platform for Exploring Haptic Interactions with Co-located Capacitive and Piezoresistive Sensors","authors":"A. Freed, D. Wessel","doi":"10.1145/2677199.2680571","DOIUrl":"https://doi.org/10.1145/2677199.2680571","url":null,"abstract":"This paper introduces an open research platform for exploring haptic interactions with co-located, capacitive and piezoresistive sensors. The solution uses readily available material, hardware and software components and allows for experiments on many system levels from low-level material concerns up to high-level sensor fusion software. This provides the HCI community with a platform to accelerate explorations of the many applications that have opened up of sensor fusion in haptic interaction.","PeriodicalId":117478,"journal":{"name":"Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127697457","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}