"Situated modeling: a shape-stamping interface with tangible primitives"
Manfred Lau, Masaki Hirose, Akira Ohgawara, J. Mitani, T. Igarashi
In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction (TEI '12), February 19, 2012. DOI: 10.1145/2148131.2148190
Abstract: Existing 3D sketching methods typically allow the user to draw in empty space, which is imprecise and lacks tactile feedback. We introduce a shape-stamping interface where users model with tangible 3D primitive shapes. Each of these shapes represents a copy or a fragment of the construction material. Instead of modeling in empty space, these shapes allow us to use the real-world environment and other existing objects as a tangible guide during 3D modeling. We call this approach Situated Modeling: users can create new real-sized 3D objects directly in 3D space while using the nearby existing objects as the ultimate reference. We also describe a two-handed shape-stamping technique for stamping with tactile feedback. We show a variety of do-it-yourself furniture and household products designed with our system, and perform a user study to compare our method with a related AR-based modeling system.
"Session details: Art explorations"
A. Antle, T. Schiphorst
In Proc. TEI '12. DOI: 10.1145/3256403
"T.F.O.: tangible flying objects"
R. Mustafin, J. Wehner, Wolfgang Sattler, Kristian Gohlke
In Proc. TEI '12. DOI: 10.1145/2148131.2148173
Abstract: We present the prototype of an augmented game that uses an enhanced Frisbee disc as an interaction device to explore the potential of flying tangible user interfaces to increase the attractiveness of physical games. While playing with the disc, a dynamic audio stream is generated, which serves as an additional semantic layer that can be leveraged to develop novel game concepts for simple catch-and-throw games. Initial observations of users interacting with our prototype indicate that even minor auditory augmentations to seemingly old-fashioned physical exertion games have the potential to enhance the playing experience and support more persistent engagement in physical activity.
"dSensingNI: a framework for advanced tangible interaction using a depth camera"
Florian Klompmaker, Karsten Nebe, Alex Fast
In Proc. TEI '12. DOI: 10.1145/2148131.2148179
Abstract: This work presents an approach to overcoming the disadvantages of existing interaction frameworks and technologies for touch detection and object interaction. We describe the robust, easy-to-use framework dSensingNI (Depth Sensing Natural Interaction), which supports multitouch and tangible interaction with arbitrary objects. It uses images from a depth-sensing camera to track users' fingers and palms, and combines this with object interactions such as grasping, grouping, and stacking, which can be used for advanced interaction techniques.
"VizTouch: automatically generated tactile visualizations of coordinate spaces"
C. Brown, A. Hurst
In Proc. TEI '12. DOI: 10.1145/2148131.2148160
Abstract: Visual mathematical concepts have long been challenging to access for people with limited or no vision. Because functions and data plots are typically presented visually, there are few affordable and accessible ways for individuals with limited or no vision to interpret data in this format. We have developed software that leverages affordable new 3D printing technology to rapidly and automatically generate tactile visualizations. In this paper, we describe the development of the VizTouch software through a user-centered design process. We worked with six individuals with low or limited vision to understand the usefulness and design of 3D-printed custom tactile visualizations. We describe how VizTouch can be used to make data visualizations in education, business, and entertainment accessible.
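As a rough illustration of the kind of conversion VizTouch automates (a function plot turned into printable tactile geometry), the sketch below rasterizes y = f(x) onto a grid of raised cells that could be extruded for 3D printing. The function name, grid resolution, and extrusion approach are assumptions for illustration; the abstract does not describe VizTouch's actual pipeline.

```python
def tactile_grid(f, x_min, x_max, cols=40, rows=20):
    """Rasterize y = f(x) onto a boolean grid; True cells mark where
    material would be raised on a 3D-printed tactile plot."""
    xs = [x_min + (x_max - x_min) * i / (cols - 1) for i in range(cols)]
    ys = [f(x) for x in xs]
    y_min, y_max = min(ys), max(ys)
    y_span = (y_max - y_min) or 1.0  # avoid dividing by zero for flat plots
    grid = [[False] * cols for _ in range(rows)]
    for col, y in enumerate(ys):
        # Normalize y into a row index, with row 0 at the top of the plot.
        row = round((y - y_min) / y_span * (rows - 1))
        grid[rows - 1 - row][col] = True
    return grid
```

For example, `tactile_grid(lambda x: x, 0.0, 1.0)` raises one cell per column along the diagonal; each raised cell would become a small extruded bump, so a finger tracing the ridge follows the line of the plot.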
"Sketch-a-TUI: low cost prototyping of tangible interactions using cardboard and conductive ink"
A. Wiethoff, H. Schneider, M. Rohs, A. Butz, S. Greenberg
In Proc. TEI '12. DOI: 10.1145/2148131.2148196
Abstract: Graspable tangibles are now being explored on the current generation of capacitive touch surfaces, such as the iPad and Android tablets. Because the size and form factor are relatively new, early, low-fidelity prototyping of these TUIs is crucial to getting the right design. The problem is that it is difficult for the average interaction designer to develop such physical prototypes: they require a substantial amount of time and effort to physically model the tangibles, and expertise in electronics to instrument them. Prototyping is therefore sometimes handed off to specialists, or is limited to only a few design iterations and alternative designs. Our solution contributes a low-fidelity prototyping approach that is time- and cost-effective and requires no electronics knowledge. First, we supply non-specialists with cardboard forms to create tangibles. Second, we have them draw lines on the forms with conductive ink, which makes the objects recognizable by the capacitive touch screen. They can then apply routine programming to recognize these tangibles and thus iterate over various designs.
"Hangul Gangul: interactive toy for Hangul learning"
Azusa Kadomura, K. Tsukada, Tetsuaki Baba, Kumiko Kushiyama
In Proc. TEI '12. DOI: 10.1145/2148131.2148203
Abstract: We propose an interactive toy, "Hangul Gangul", which helps users learn Hangul characters through a tangible interface. With our system, users can enjoy learning Hangul characters by combining physical blocks representing vowels and consonants. Our system aims to encourage collaborative learning between children and adults.
"With a flick of the wrist: stretch sensors as lightweight input for mobile devices"
Paul Strohmeier, Roel Vertegaal, A. Girouard
In Proc. TEI '12. DOI: 10.1145/2148131.2148195
Abstract: With WristFlicker, we detect wrist movement through sets of stretch sensors embedded in clothing. Our system supports wrist rotation (pronation/supination) and both wrist tilts (flexion/extension and ulnar/radial deviation). Each wrist movement is measured by two opposing stretch sensors, mimicking the counteracting movement of muscle pairs. We discuss interaction techniques that allow a user to control a music player through this lightweight input.
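The differential scheme the abstract describes (two opposing stretch sensors per axis, like an antagonistic muscle pair) can be sketched as below. The calibration constants (`rest`, `span`, `max_angle`) and the linear mapping are illustrative assumptions, not the authors' implementation.

```python
def wrist_angle(flexor_reading, extensor_reading, rest=512, span=300, max_angle=90.0):
    """Estimate one wrist angle (degrees) from two opposing stretch sensors.

    Each sensor's raw value grows as it stretches. Taking the difference
    of the two opposing readings cancels common-mode stretch (e.g. the
    sleeve shifting as a whole) and doubles sensitivity, mimicking the
    counteracting muscle pair described in the paper. `rest` is the
    assumed raw reading at neutral posture; `span` the assumed raw range
    of one sensor at full deflection.
    """
    differential = (flexor_reading - rest) - (extensor_reading - rest)
    # Map the differential linearly onto [-max_angle, +max_angle], clamped.
    return max(-max_angle, min(max_angle, differential / (2 * span) * max_angle))
```

At rest both sensors read the neutral value and the estimate is 0°; a full flexion stretches one sensor and slackens the other, driving the differential to its clamped extreme.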
"Social contraptions as breaching environments"
Robb Mitchell
In Proc. TEI '12. DOI: 10.1145/2148131.2148230
Abstract: A major challenge for developers of tangible, embedded and embodied interfaces is understanding dynamic social contexts. To address this challenge, we propose the concept of "social contraptions": interactive installations and performative interventions employed as designerly explorations of social relations.
"The HapticTouch toolkit: enabling exploration of haptic interactions"
David Ledo, Miguel A. Nacenta, Nicolai Marquardt, Sebastian Boring, S. Greenberg
In Proc. TEI '12. DOI: 10.1145/2148131.2148157
Abstract: In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent from current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with, and development of, inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the HapticTouch toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that let programmers (1) directly control the device, (2) create customized, combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In a preliminary exploration we found that programmers could use the toolkit to create haptic tabletop applications in a short amount of time.
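The middle layer described in the abstract, customized combinable behaviors such as softness and oscillation, suggests a behavior-composition pattern along the following lines. All class and method names here are hypothetical illustrations, not the toolkit's actual API, and averaging is just one simple way behaviors could be combined.

```python
import math

class HapticBehavior:
    """A behavior maps (time, finger pressure) to a normalized rod height in [0, 1]."""
    def height(self, t, pressure):
        raise NotImplementedError

class Softness(HapticBehavior):
    """Yield under pressure: the harder the press, the lower the rod sinks."""
    def __init__(self, compliance=0.8):
        self.compliance = compliance
    def height(self, t, pressure):
        return max(0.0, 1.0 - self.compliance * pressure)

class Oscillation(HapticBehavior):
    """Vibrate the rod sinusoidally around a mid height."""
    def __init__(self, hz=5.0, amplitude=0.2):
        self.hz, self.amplitude = hz, amplitude
    def height(self, t, pressure):
        return 0.5 + self.amplitude * math.sin(2 * math.pi * self.hz * t)

def combine(behaviors, t, pressure):
    """Blend component behaviors by averaging their target heights."""
    return sum(b.height(t, pressure) for b in behaviors) / len(behaviors)
```

A soft, buzzing virtual button would then be `combine([Softness(), Oscillation()], t, pressure)` evaluated each control tick; the top layer of such a toolkit would attach a behavior list like this to a visual region on the tabletop.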