The Bronze Key art installation is the result of performative re-materializations of bodily data. This collaborative experiment in data encryption expands research into practices of archiving and critical discourses around open data. It integrates bodily movement, motion capture and Virtual Reality (VR) with a critical awareness of data trails and data protection. A symmetric cryptosystem was enacted, producing a post-digital cipher system along with archival artefacts of the encryption process. Material components for inclusion in the TEI Arts Track include: a cassette tape carrying a text-to-speech audio rendering of the raw motion capture data from the original movement sequence (The Plaintext), a 3D printed bronze shape produced from a motion captured gesture (The Encryption Key), and a printed book containing the scrambled motion capture data (The Ciphertext).
"The Bronze Key: Performing Data Encryption." S. Kozel, Ruth Gibson, B. Martelli. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173306
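The abstract names the three classic components of a symmetric cryptosystem (plaintext, key, ciphertext) but does not publish the cipher itself. As a hedged sketch only, a keyed scramble of motion-capture rows, in the spirit of the "scrambled motion capture data" in the printed book, might look like the following; the function names, key phrase, and row format are illustrative assumptions, not details of the installation:

```python
import hashlib
import random

def derive_order(n, key_phrase):
    """Derive a deterministic permutation of n rows from a shared secret key.

    The same key phrase always yields the same shuffle order, which is what
    makes the scheme symmetric: whoever holds the key can invert the scramble.
    """
    seed = int.from_bytes(hashlib.sha256(key_phrase.encode()).digest()[:8], "big")
    order = list(range(n))
    random.Random(seed).shuffle(order)
    return order

def scramble(rows, key_phrase):
    """Encrypt: reorder the plaintext rows by the key-derived permutation."""
    order = derive_order(len(rows), key_phrase)
    return [rows[i] for i in order]

def unscramble(rows, key_phrase):
    """Decrypt: re-derive the same permutation and undo it."""
    order = derive_order(len(rows), key_phrase)
    out = [None] * len(rows)
    for dst, src in enumerate(order):
        out[src] = rows[dst]
    return out
```

A transposition cipher like this leaves each row legible but destroys the temporal order of the movement sequence, which loosely matches the artefacts described: the ciphertext book contains real data rows, yet the gesture cannot be reconstructed without the key.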
Telecommunication with family and friends is often offered as a solution for older adults facing social isolation. While strengthening existing ties is important, this approach fails to address the importance of spontaneous community interactions. This paper presents Nettle, a system that is designed to build casual human connection into one's daily routine. Nettle is based on the artist's alternate vision of smart home design, where interfaces are playful, based on familiar household forms, and warmly inviting. The audience will observe a performance where Nettle fosters a spontaneous spoken conversation alongside the process of making a pot of tea.
"Nettle: An Exploration of Communication Interface Design for Older Adults." Audrey M. Fox. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173298
Touch Connection is a pair of digitally embroidered textile surfaces with vibrotactile capability. The surfaces are designed as a paired system, to enable people to connect, share touch responses and construct communal tactile experiences. It consists of two different fabric surfaces that use the textile techniques of quilting and embroidery to create dramatic volume and texture. We explore the application of textural effects and vibrotactile feedback as materials for design that can be shaped and moulded to engage participants. The work explores the role these materials play in generating improvised action and emotional responses.
"Touch Connection: A Vibrotactile, Textile Prototype." L. Hernandez. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173293
We present early work developing an Augmented Reality (AR) system that allows young children to design and experiment with complex systems (e.g., bicycle gears, human circulatory system). Our novel approach combines low-fidelity prototyping to help children represent creative ideas, AR visualization to scaffold iterative design, and virtual simulation to support personalized experiments. To evaluate our approach, we conducted an exploratory study with eight children (ages 8-11) using an initial prototype. Our findings demonstrate the viability of our approach, uncover usability challenges, and suggest opportunities for future work. We also distill additional design implications from a follow-up participatory design session with children.
"Prototyping and Simulating Complex Systems with Paper Craft and Augmented Reality: An Initial Investigation." Seokbin Kang, Leyla Norooz, Virginia L. Byrne, Tamara L. Clegg, Jon E. Froehlich. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173264
Dynamic elements of traditional drawing processes, such as the order of composition and the speed, length, and pressure of strokes, can be as important as the final art piece because they can reveal the technique, process, and emotions of the artist. In this paper, we present an interactive system that unobtrusively tracks the freehand drawing process (movement and pressure of the artist's pencil) on a regular easel. The system outputs captured information using 2D video renderings and 3D-printed sculptures. We also present a summary of findings from a user study with six experienced artists who created multiple pencil drawings using our system. The resulting digital and physical outputs from our system revealed vast differences in drawing speeds, styles, and techniques. At the TEI Arts Track, attendees will likely engage in lively discussion around the analog, digital, and tangible aspects of our exhibit. We believe that such a discussion will be critical not only in shaping the future of our work, but also in understanding novel research directions at the intersection of art and computation.
"Tracking, Animating, and 3D Printing Elements of the Fine Arts Freehand Drawing Process." Piyum Fernando, Jennifer Weiler, S. Kuznetsov, P. Turaga. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173307
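The paper does not include its capture or rendering code. As an illustrative sketch only of the dynamic elements the abstract names (stroke length, speed, and pressure), a per-stroke summary computed from pencil samples might look like the following; the sample tuple layout `(x, y, pressure, t)` and the function name are assumptions for illustration:

```python
import math

def stroke_stats(samples):
    """Summarize one stroke from time-ordered samples of (x, y, pressure, t).

    Returns the path length, duration, mean pen pressure, and average speed —
    the kind of dynamic features that distinguish drawing techniques.
    """
    length = 0.0
    for (x0, y0, _, _), (x1, y1, _, _) in zip(samples, samples[1:]):
        length += math.hypot(x1 - x0, y1 - y0)
    duration = samples[-1][3] - samples[0][3]
    mean_pressure = sum(p for _, _, p, _ in samples) / len(samples)
    speed = length / duration if duration > 0 else 0.0
    return {"length": length, "duration": duration,
            "mean_pressure": mean_pressure, "speed": speed}
```

Summaries like these could then drive either a 2D animated replay (speed mapped to playback rate) or a 3D form (pressure mapped to extrusion height), mirroring the two output modes the system described above produces.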
"Let's Fake News" is an interactive media art installation that forces participants to realize that anyone can create fake news and may even find joy in doing so. The artwork addresses the conference themes as an interactive installation that challenges ideas around "post-truth", creating an experience that engages with digital representations of the discursive interactions.
"'Let's Fake News'." Léon McCarthy. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173318
[pain]Byte looks at the world of chronic pain. The invisible disability of spinal chronic pain is manifested and represented through data-driven dance (classical ballet) and virtual reality (VR), enabling non-sufferer audiences to 'see' the hidden nature and challenges of chronic pain, linked to the benefits of biomedical engineering and implanted technology. The body as analogue is represented through the digital of the wearables and the virtual of the VR experience, humanising implanted technology and exposing the invisible nature of chronic pain for audiences. In our exhibit, people can watch the VR experience, interact with the biometric sensors, and try our single-Kinect motion capture. A recording of the ballet will be projected.
"PainByte: Chronic Pain and BioMedical Engineering Through the Lens of Classical Ballet & Virtual Reality." Genevieve Smith-Nunes, Alex Shaw, C. Neale. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173296
When compared to mainstream touch/button-centric devices, deformable devices enable a more organic and tangible way for human-computer interaction. This studio provides participants an opportunity to have hands-on experience in fabricating controllers that use various deformation inputs (e.g., bending, stretching), and promote novel interaction mechanisms using hand gestures. Participants will learn different types of deformation inputs and create their own sensors. They will work in groups to design deformable controllers using both the sensors they make in-session and/or pre-built ones, to afford novel hand gestural inputs beyond conventional touches and clicks. The main objectives of this studio are to facilitate exchange of experience in fabricating deformable sensors/materials, and to foster creativity in using such controllers as inputs with hand gestures.
"Deformable Controllers: Fabrication and Design to Promote Novel Hand Gestural Interaction Mechanisms." Victor Cheung, Alexander Keith Eady, A. Girouard. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173332
We propose a new shape- and color-changing display method called COLORISE. Our COLORISE system has inflatable shape-changing pixels that can change their colors without using any light-emitting devices. The array of modules enables various color patterns, making full use of the characteristics of the material. Each pixel also has a touch-sensing function that enables users to interact with it intuitively. This paper describes the design and mechanism of our system, explores interactions with users, and presents technical evaluations of the proposed pixel modules.
"COLORISE: Shape- and Color-Changing Pixels with Inflatable Elastomers and Interactions." Juri Fujii, Takuya Matsunobu, Y. Kakehi. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173228
Ani-Bot is a modular robotics system that allows users to control their DIY robots using Mixed-Reality Interaction (MRI). This system takes advantage of MRI to enable users to visually program the robot through the augmented view of a Head-Mounted Display (HMD). In this paper, we first explain the design of the Mixed-Reality (MR)-ready modular robotics system, which allows users to instantly perform MRI once they finish assembling the robot. Then, we elaborate on the augmentations provided by the MR system in the three primary phases of a construction kit's lifecycle: Creation, Tweaking, and Usage. Finally, we demonstrate Ani-Bot with four application examples and evaluate the system with a two-session user study. The results of our evaluation indicate that Ani-Bot successfully embeds MRI into the lifecycle (Creation, Tweaking, Usage) of DIY robotics and shows strong potential for delivering an enhanced user experience.
"Ani-Bot: A Modular Robotics System Supporting Creation, Tweaking, and Usage with Mixed-Reality Interactions." Yuanzhi Cao, Zhuangying Xu, Terrell Glenn, Ke Huo, K. Ramani. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18), March 18, 2018. DOI: https://doi.org/10.1145/3173225.3173226