Tactile Band: Accessing Gaze Signals from the Sighted in Face-to-Face Communication
S. Qiu, G.W.M. Rauterberg, Jun Hu. DOI: https://doi.org/10.1145/2839462.2856520
Gaze signals, frequently used by the sighted as visual cues in social interactions, are hardly accessible to low-vision and blind people. We propose a concept to help blind people access and react to gaze signals in face-to-face communication, and interviewed 20 blind and low-vision participants to discuss its features. One feature of the concept was developed into a prototype, the Tactile Band, to test the hypothesis that tactile feedback can enable a blind person to feel attention (gaze signals) from a sighted partner, enhancing engagement in face-to-face communication. We tested this hypothesis with 30 participants in a face-to-face conversation scenario in which a blindfolded participant and a sighted participant talked about a given daily topic. Comments from the participants and reflection on the experiment provided useful insights for improvements and further research.

SynFlo: A Tangible Museum Exhibit for Exploring Bio-Design
Johanna Okerlund, E. Segreto, C. Grote, Lauren Westendorf, Anja Scholze, R. Littrell, Orit Shaer. DOI: https://doi.org/10.1145/2839462.2839488
We present SynFlo, a tangible museum exhibit for exploring bio-design. SynFlo utilizes active and concrete tangible tokens to allow visitors to experience a playful bio-design activity through complex interactivity with digital biological creations. We developed two versions of SynFlo: one that combines active tokens with real concrete objects (i.e., labware) and one that consists of only abstract active tokens. Results from an evaluation in a museum indicate that both versions support learning. We discuss design choices for biology education tools to overcome confounders of biology and facilitate positive engagement and learning.

Balancing User and System Control in Shape-Changing Interfaces: a Designerly Exploration
Majken Kirkegaard Rasmussen, Timothy R. Merritt, M. B. Alonso, M. G. Petersen. DOI: https://doi.org/10.1145/2839462.2839499
Despite an increasing number of examples of shape-changing interfaces, the relation between users' actions and product movements has not gained a great deal of attention, nor been very well articulated. This paper presents a framework articulating the level of control offered to the user over the shape change. The framework considers whether the shape change is: 1) directly controlled by the user's explicit interactions; 2) negotiated with the user; 3) indirectly controlled by the user's actions; or 4) fully controlled by the system. The four types are described through design examples using ReFlex, a shape-changing interface in the form of a smartphone. The paper concludes that shape-changing interfaces tend to assign control to either the user or the underlying system, while few (e.g. [16,28]) consider sharing the control between the user and the system.

{"title":"Session details: Keep In Shape","authors":"T. Nam","doi":"10.1145/3257870","DOIUrl":"https://doi.org/10.1145/3257870","url":null,"abstract":"","PeriodicalId":422083,"journal":{"name":"Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125116225","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
MOR4R: How to Create 3D Objects Using a Microwave Oven
K. Yasu. DOI: https://doi.org/10.1145/2839462.2839507
This study presents a technique for making 3D objects by folding a resin sheet with a piece of common home electronic equipment: a microwave oven. Although personal fabrication has become widely popular thanks to the falling prices of digital fabrication tools such as 3D printers and laser cutters, printing a 3D object is still slow, and installing a laser cutter at home remains difficult for health and safety reasons. We therefore propose a simple but widely applicable home fabrication method called "MOR4R": Microwave Oven Recipes for Resins. When properly sized microwave susceptor strips are placed on an acrylic sheet and the sheet is microwaved for about 3 minutes at a power of 800 W, only the areas covered by the susceptor become soft. This paper identifies a suitable size of susceptor strip for folding an acrylic sheet. The technique allows the creator to form a rigid, strong object, much like folding origami.

Tangible Modeling Methods for Faster Rapid Prototyping
Satoshi Nakamaru, J. Bak, D. Saxena. DOI: https://doi.org/10.1145/2839462.2856530
In this paper we discuss the development of a novel rapid prototyping method that makes the process of creating tangible electronic artifacts faster and easier. The method uses a new paper-like material that can be given any form by hand or with other stationary objects, and that hardens soon after the modeling process. Furthermore, the material can be integrated into electronic circuits using magnetic connectors and silver. The method was conceptualized using a People-Centered Design approach, while its implementation was led by an engineering approach. Both the conceptualization process and the implementation of the method are detailed in this paper, along with example scenarios in which it could be useful.

Inherently Meaningful
J. P. Djajadiningrat. DOI: https://doi.org/10.1145/2839462.2883590
Am I the only one who got a little confused over the years on what tangible interaction is about? Originally it was about fusing the physical and the digital, but isn't everything these days? Augmented Reality, the Internet of Things... are they too forms of tangible interaction? Does it really matter whether intelligence is embedded in an object or attributed to it through computer vision? And when a commonplace electronic product gets internet-enabled and its state affects other systems, does its use suddenly count as tangible interaction? More and more, I am starting to understand tangible interaction as an approach to designing interactive products which respect and exploit the user's bodily skills and which build upon the notion that our traditional physical environment is inherently meaningful to us. In this talk I will show a number of demonstrators built at Philips Design. Whilst none of them were designed with tangible interaction in mind, they all take cues from our interaction with the physical world. Working from these examples, I will share some of the obstacles we ran into as well as some interaction principles which I believe have recurring value when designing for tangible interaction.

IrukaTact: Submersible Haptic Search Glove
A. Chacin, Takeshi Oozu, Hiroo Iwata. DOI: https://doi.org/10.1145/2839462.2856546
IrukaTact is an open-source assistive underwater glove that translates ultrasonic range-finding data into haptic feedback. This paper describes the development of new models for underwater haptic actuation and the ongoing prototype development process necessary for creating this tool. Our objective is to create a system that detects underwater topographies and assists in locating sunken objects in flooded areas by sending haptic signals to the wearer's fingertips, produced by micropumps that propel water at varying pressures. This feedback system extends current haptic technologies by providing hybrid actuation underwater, including pressure and vibration, while preserving the wearer's natural ability to grasp objects. The technology has many potential applications beyond underwater echohaptic location, such as new interfaces for virtual reality object simulation in aqueous environments.

Using Tangible Smart Replicas as Controls for an Interactive Museum Exhibition
M. Marshall, N. Dulake, L. Ciolfi, D. Duranti, Hub Kockelkorn, Daniela Petrelli. DOI: https://doi.org/10.1145/2839462.2839493
This paper presents the design, creation and use of tangible smart replicas in a large-scale museum exhibition. We describe the design rationale for the replicas and the process used in their creation, as well as the implementation and deployment of the replicas in a live museum exhibition. Over 14,000 visitors interacted with the system during the six months that the exhibition was open. Based on log data, interviews and observations, we examine the reaction to the smart replicas from the point of view of both the museum curators and the museum's visitors, and reflect on whether our expectations were fulfilled.

UnicrePaint: Digital Painting through Physical Objects for Unique Creative Experiences
Mami Kosaka, K. Fujinami. DOI: https://doi.org/10.1145/2839462.2856553
Mankind's capacity for creativity is infinite. In the physical world, people create visual artistic works not only with specific tools, such as paintbrushes, but also with various objects, such as dried flowers pressed on paper. Digital painting has a number of advantages, but it currently requires a specific tool, such as a stylus, which might diminish the pleasurable experience of creation. This paper proposes a digital painting system called UnicrePaint that uses everyday objects as tools of expression, and demonstrates the capabilities of the first prototype with a pilot user study.