InfoPhys: Direct Manipulation of Information Visualisation through a Force-Feedback Pointing Device
Christian Frisson, Bruno Dumas. DOI: 10.1145/2839462.2856545
Information visualisation is the transformation of abstract data into visual, interactive representations. In this paper we present InfoPhys, a device that enables the direct, tangible manipulation of visualisations. InfoPhys uses a force-feedback pointing device to simulate haptic feedback while the user explores visualisations projected on top of the device. We present a use case illustrating the trends in ten years of TEI proceedings and how InfoPhys allows users to feel and manipulate these trends. We describe the technical and software aspects of our prototype and discuss promising improvements and future work opened up by InfoPhys.
Embodying Soft Wearables Research
O. Tomico, D. Wilde. DOI: 10.1145/2839462.2854115
The value of engaging sensory motor skills in the design and use of smart systems is increasingly recognized. Yet robust and reliable methods for development, reporting and transfer are not fully understood. This workshop investigates the role of embodied design research techniques in the context of soft wearables. Throughout, we will experiment with how embodied design research techniques might be shared, developed, and used as direct and unmediated vehicles for their own reporting. Rather than engage in oral presentations, participants will lead each other through a proven embodied method or approach. Then small groups will create mash-ups of techniques, exploring ways that the new approaches might be coherently reported. By applying such methods to the problem of their reporting, we hope to deepen understanding of how to move towards nuanced and repeatable methods for embodied design and knowledge transfer in the context of soft wearables.
Exploring SCI as Means of Interaction through the Design Case of Vacuum Cleaning
Lasse Legaard, J. Thomsen, Christian Hannesbo Lorentzen, Jonas Peter Techen. DOI: 10.1145/2839462.2856540
This paper explores the opportunities for incorporating shape-changing properties into everyday home appliances. Through a design research approach, the vacuum cleaner is used as a design case with the overall aim of enhancing the user experience by transforming the appliance into a sensing object. Three fully functional prototypes were developed to illustrate how shape change can fit into the context of our homes. The shape-changing functionalities are: 1) a digital power button that supports dynamic affordances, 2) an analog handle that mediates the amount of dust particles through haptic feedback, and 3) a body that behaves in a lifelike manner depending on how the user treats it. We report the development and implementation of the functional prototypes as well as technical limitations and initial user reactions to the prototypes.
Sparse Tangibles: Collaborative Exploration of Gene Networks using Active Tangibles and Interactive Tabletops
A. Arif, R. Manshaei, Sean DeLong, Brien East, M. Kyan, Ali Mazalek. DOI: 10.1145/2839462.2839500
We present Sparse Tangibles, a tabletop and active tangible-based framework to support cross-platform, collaborative gene network exploration using a Web interface. It uses smartwatches as active tangibles to allow query construction on- and off-the-table. We expand their interaction vocabulary using inertial sensors and a custom case. We also introduce a new metric for measuring the "confidence level" of protein and genetic interactions. Three expert biologists evaluated the system and found it fun, useful, easy to use, and ideal for collaborative explorations.
Tangible Viewports: Getting Out of Flatland in Desktop Environments
Renaud Gervais, J. Roo, M. Hachet. DOI: 10.1145/2839462.2839468
Spatial augmented reality and tangible interaction enrich the standard computer I/O space. Systems based on such modalities offer new user experiences and open up interesting perspectives in various fields. On the other hand, such systems tend to live outside the standard desktop paradigm and, as a consequence, they do not benefit from the richness and versatility of desktop environments. In this work, we propose to join together physical visualization and tangible interaction within a standard desktop environment. We introduce the concept of Tangible Viewport, an on-screen window that creates a dynamic link between augmented objects and computer screens, allowing a screen-based cursor to move onto the object in a seamless manner. We describe an implementation of this concept and explore the interaction space around it. A preliminary evaluation shows the metaphor is transparent to the users while providing the benefits of tangibility.
DataSpoon: Overcoming Design Challenges in Tangible and Embedded Assistive Technologies
Oren Zuckerman, Tamar Gal, T. Keren-Capelovitch, T. Krasovsky, Ayelet Gal-Oz, P. Weiss. DOI: 10.1145/2839462.2839505
The design of tangible and embedded assistive technologies poses unique challenges. We describe the challenges we encountered during the design of "DataSpoon", explain how we overcame them, and suggest design guidelines. DataSpoon is an instrumented spoon that monitors movement kinematics during self-feeding. Children with motor disorders often encounter difficulty mastering self-feeding. In order to treat them effectively, professional caregivers need to assess their movement kinematics. Currently, assessment is performed through observations and questionnaires. DataSpoon adds sensor-based data to this process. A validation study showed that data obtained from DataSpoon and from a 6-camera 3D motion capture system were similar. Our experience yielded three design guidelines: needs of both caregivers and children should be considered; distractions to direct caregiver-child interaction should be minimized; familiar-looking devices may alleviate concerns associated with unfamiliar technology.
Maketec: A Makerspace as a Third Place for Children
David Bar-El, Oren Zuckerman. DOI: 10.1145/2839462.2856556
Makerspaces of various models are forming all around the world. We present a model and case study of the Maketec, a public drop-in makerspace for children, run by teens. The Maketec model is designed to promote making and socializing opportunities for girls and boys of ages 9-14. It is based on three underlying principles: (1) "Low Floor/Wide Walls": construction kits and digital fabrication technologies that allow kids to invent and create with no prior knowledge or expertise; (2) "Unstructured Learning": no formal instructors, teens serve as mentors for kids, and promote a culture of self-driven learning through projects; and (3) "A Makerspace as a Third Place": the Maketec is free and managed by kids for kids in an effort to form a unique community of young makers. We report on interviews with four recurring visitors, and discuss our insights around the three principles and the proposed model.
Functionality in Wearable Tech: Device, as Jewelry, as Body Mediator
A. Ju. DOI: 10.1145/2839462.2856348
Jewelry has long been used to modify one's body and mediate experiences and interactions. Fulfilling a variety of roles, ranging from ritual objects to sentimental tokens to pure adornment, jewelry provides an externalization of inner expression on the body, non-verbally communicating information about the wearer to those she or he encounters. "Functionality in Wearable Tech," a set of three jewelry pieces, seeks to satirize and call attention to wearable technology's transition into the space that jewelry has occupied for thousands of years. Through the use of cheap robotic children's toys converted into "functional" jewelry, the series considers desired and real emotional attachments to technological objects, jewelry objects, and those objects in between. The documented performance captures an exploration of jewelry with technological media and the ostentatious statement that wearing wearables makes today.
Exploring 3D Printed Interaction
Martin Schmitz. DOI: 10.1145/2839462.2854105
3D printing is widely used to physically prototype the look and feel of 3D objects. However, interaction possibilities of these prototypes are often limited to mechanical parts or post-assembled electronics. Moreover, fabricating interactive 3D printed objects is still an expert task. In my dissertation, I therefore explore how to support users in the design of interactive 3D objects and how to automate the generation of adequate sensing structures. Further, I investigate tangible interaction concepts for 3D printed objects. In this paper, I outline my past and future research towards the fabrication of 3D objects in terms of (1) user-friendly design, (2) automation of fabrication, and (3) tangible interaction concepts for the input modalities touch and deformation.
Towards a Framework for Tangible Narratives
Daniel Harley, J. Chu, Jamie Kwan, Ali Mazalek. DOI: 10.1145/2839462.2839471
This paper presents a preliminary framework to inform the analysis and design of tangible narratives. Researchers and designers have been using tangible user interfaces (TUIs) for storytelling over the past two decades, but to date no comprehensive analysis of these systems exists. We argue that storytelling systems that use digitally-enhanced physical objects form a unique medium with identifiable narrative characteristics. Our framework isolates these characteristics and focuses on the user's perspective to identify commonalities between existing systems, as well as gaps that can be addressed by new systems. We find that the majority of systems in our sample require the user to perform exploratory actions from an external narrative position. We note that systems that cast the user in other interactive roles are rare but technologically feasible, suggesting that there are many underexplored possibilities for tangible storytelling.