Current attempts to render touch in multimedia technology still represent a challenge. Touch is a complex system, and there are many aspects to take into account when trying to render it (e.g. the compliance of an object, its weight, orientation, geometric properties, and the forces it exerts on the skin). This is especially true for VR, where touch is an important factor in achieving embodiment in virtual environments. Recently, a new class of tactile technology has been developed: mid-air devices, capable of delivering tactile feedback without making contact with the skin. One contribution of the doctoral research described in this paper is to overcome design challenges and create immersive experiences by applying psychological principles and paradigms that exploit the advantages of mid-air technology. We designed possible embodied interaction scenarios involving mid-air and physical touch. Findings from this research point to opportunities for designing new immersive experiences. Future work will involve different parts of the body and different tactile properties (e.g. thermal stimulation).
{"title":"Understanding and Designing Embodied Experiences Through Mid-air Tactile Stimulation","authors":"D. Pittera","doi":"10.1145/3173225.3173340","DOIUrl":"https://doi.org/10.1145/3173225.3173340","url":null,"abstract":"Current attempts to render touch in multimedia technology still represent a challenge. Touch is a complex system, and there are many aspects to take into account when trying to render it (e.g. the compliance of an object, its weight, orientation, geometric properties, and the forces it exerts on the skin). This is especially true for VR, where touch is an important factor in achieving embodiment in virtual environments. Recently, a new class of tactile technology has been developed: mid-air devices, capable of delivering tactile feedback without making contact with the skin. One contribution of the doctoral research described in this paper is to overcome design challenges and create immersive experiences by applying psychological principles and paradigms that exploit the advantages of mid-air technology. We designed possible embodied interaction scenarios involving mid-air and physical touch. Findings from this research point to opportunities for designing new immersive experiences. Future work will involve different parts of the body and different tactile properties (e.g. thermal stimulation).","PeriodicalId":176301,"journal":{"name":"Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128450663","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Live performances which involve digital technology often strive toward clear correspondences between distinct media modes, particularly those works which combine audio and video. Often, the process of creating and executing such performances involves mapping schemes which are encased within the digital system, producing content which is tightly synchronized but with relationships which can feel rigid and unexpressive. Within this paper we present a collaborative process between visualist and musician, which builds toward a method for promoting co-creativity in multimedia performance and prioritizes the performer's physical presence and interaction with digital content. Through the development of two autonomous systems, a novel physical interface and an interactive music system, we summarize our creative process of co-exploration of system capabilities, and extended periods of experimentation and exploration. From this experience, we offer an early-stage framework for approaching engaging digital audiovisual relationships in live performance settings.
{"title":"Beacon: Exploring Physicality in Digital Performance","authors":"Anna Weisling, Anna Xambó","doi":"10.1145/3173225.3173312","DOIUrl":"https://doi.org/10.1145/3173225.3173312","url":null,"abstract":"Live performances which involve digital technology often strive toward clear correspondences between distinct media modes, particularly those works which combine audio and video. Often, the process of creating and executing such performances involves mapping schemes which are encased within the digital system, producing content which is tightly synchronized but with relationships which can feel rigid and unexpressive. Within this paper we present a collaborative process between visualist and musician, which builds toward a method for promoting co-creativity in multimedia performance and prioritizes the performer's physical presence and interaction with digital content. Through the development of two autonomous systems, a novel physical interface and an interactive music system, we summarize our creative process of co-exploration of system capabilities, and extended periods of experimentation and exploration. From this experience, we offer an early-stage framework for approaching engaging digital audiovisual relationships in live performance settings.","PeriodicalId":176301,"journal":{"name":"Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125814208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A growing part of HCI research concerns wearables, e-textiles, and performances with technology. However, conventional theatre productions are bound to traditionally grown structures and predefined production processes that do not allow for the in-house creation of interactive costumes. This research project addresses this issue and investigates how this cultural sector and traditional costume design could engage with contemporary e-textile and wearable technologies.
{"title":"Creating and Staging Interactive Costumes","authors":"Michaela Honauer","doi":"10.1145/3173225.3173343","DOIUrl":"https://doi.org/10.1145/3173225.3173343","url":null,"abstract":"A growing part of HCI research concerns wearables, e-textiles, and performances with technology. However, conventional theatre productions are bound to traditionally grown structures and predefined production processes that do not allow for the in-house creation of interactive costumes. This research project addresses this issue and investigates how this cultural sector and traditional costume design could engage with contemporary e-textile and wearable technologies.","PeriodicalId":176301,"journal":{"name":"Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115162916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Stroke is a significant cause of long-term disability, impairing the motor function of over 10 million people every year, primarily on one side of the body. Whilst effective rehabilitation exercises can help recover and maintain some affected motor function, stroke survivors often do not carry out enough of these, instead relying on their 'good' side to carry out tasks. However, this leads to poor recovery, limiting the ability to carry out everyday bimanual tasks (such as dressing or cooking). We present work that seeks to support stroke survivors in engaging with bimanual rehabilitation through interaction with augmented tangible objects that can be used to control everyday devices. Through a user-centered design process, we uncovered how bimanual rehabilitation can be supported. This led to the development of the ActivSticks device, which allows bimanual rehabilitation and interaction with other devices and services.
{"title":"Designing Bimanual Tangible Interaction for Stroke Survivors","authors":"Mikko Kytö, D. McGookin, Wilfried Bock, Héctor A. Caltenco, Charlotte Magnusson","doi":"10.1145/3173225.3173269","DOIUrl":"https://doi.org/10.1145/3173225.3173269","url":null,"abstract":"Stroke is a significant cause of long-term disability, impairing the motor function of over 10 million people every year, primarily on one side of the body. Whilst effective rehabilitation exercises can help recover and maintain some affected motor function, stroke survivors often do not carry out enough of these, instead relying on their 'good' side to carry out tasks. However, this leads to poor recovery, limiting the ability to carry out everyday bimanual tasks (such as dressing or cooking). We present work that seeks to support stroke survivors in engaging with bimanual rehabilitation through interaction with augmented tangible objects that can be used to control everyday devices. Through a user-centered design process, we uncovered how bimanual rehabilitation can be supported. This led to the development of the ActivSticks device, which allows bimanual rehabilitation and interaction with other devices and services.","PeriodicalId":176301,"journal":{"name":"Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"59 9","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113933480","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Opto-Phono-Kinesia (OPK) is an audio-visual performance piece in which all media elements are controlled by the body movements of a single performer. The title is a play on a possible synesthetic state involving connections between vision, sound and body motion. Theoretically, for a person who experiences this state, a specific colour could trigger both a sound and a body action. This synesthetic intersection is simulated in OPK by the simultaneity of body movement and audio-visual result. Using the Gesture and Media System 3.0 motion-tracking system, the performer can dynamically manipulate an immersive environment using two small infrared trackers. The project employs a multipart interface design based on a formal model of increasing complexity in visual-sound-body mapping, and is therefore best performed by an expert performer with strong spatial memory and advanced musical ability. OPK utilizes the "body as experience, instrument and interface" [1] for control of a large-scale environment.
{"title":"Opto-Phono-Kinesia (OPK): Designing Motion-Based Interaction for Expert Performers","authors":"Steve Gibson","doi":"10.1145/3173225.3173295","DOIUrl":"https://doi.org/10.1145/3173225.3173295","url":null,"abstract":"Opto-Phono-Kinesia (OPK) is an audio-visual performance piece in which all media elements are controlled by the body movements of a single performer. The title is a play on a possible synesthetic state involving connections between vision, sound and body motion. Theoretically, for a person who experiences this state, a specific colour could trigger both a sound and a body action. This synesthetic intersection is simulated in OPK by the simultaneity of body movement and audio-visual result. Using the Gesture and Media System 3.0 motion-tracking system, the performer can dynamically manipulate an immersive environment using two small infrared trackers. The project employs a multipart interface design based on a formal model of increasing complexity in visual-sound-body mapping, and is therefore best performed by an expert performer with strong spatial memory and advanced musical ability. OPK utilizes the \"body as experience, instrument and interface\" [1] for control of a large-scale environment.","PeriodicalId":176301,"journal":{"name":"Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128800952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}