2008 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems (TABLETOP 2008)

Pokey: Interaction through covert structured light
C. Wren, Y. Ivanov, P. Beardsley, Biliana K. Kaneva, Shoji Tanaka
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660204
In this paper we describe a method to support interaction with a cellphone-based projector-camera system. Our approach builds on a computer-vision technique known as "structured light": a special pattern of light is projected onto a scene while a camera images it, and the distortions of the known pattern in the resulting image are due to the scene geometry, which can therefore be readily estimated. The main contribution of this paper is that the structure is created as a consequence of the way raster-scan, laser-based micro-projectors operate; it is sensed through careful synchronization within the camera-projector system and is imperceptible to the user. We describe the technique and test it with a cellphone-based application that exploits this method while providing a natural interactive environment with no additional special equipment. The system enables manual interaction with a projected user interface using only the rasterizing projector and camera that will be part of next-generation cellphones.
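The geometric core of structured light — recovering depth from the displacement of a known pattern — can be sketched as follows. This is a generic rectified-geometry illustration, not the paper's covert raster-scan timing method; the function and parameter names are ours.

```python
import numpy as np

def depth_from_pattern(x_proj, x_cam, focal_px, baseline_m):
    """Estimate depth (metres) for matched pattern features.

    x_proj:    column of a feature in the projected pattern (pixels)
    x_cam:     column where the camera observes that feature (pixels)
    focal_px:  shared focal length in pixels
    baseline_m: projector-camera separation in metres

    Standard triangulation for a rectified pair: z = f * b / disparity.
    """
    disparity = np.asarray(x_proj, dtype=float) - np.asarray(x_cam, dtype=float)
    return focal_px * baseline_m / disparity
```

With a 500 px focal length and a 10 cm baseline, a 10 px disparity corresponds to a surface 5 m away; nearer surfaces produce larger disparities.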
On top of tabletop: A virtual touch panel display
Liwei Chan, Ting-Ting Hu, Jin-Yao Lin, Y. Hung, Jane Yung-jen Hsu
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660201
In the real world, a physical tabletop serves both public and private needs for the people around it. In competitive scenarios such as playing a poker game or running a price negotiation around a tabletop system, privacy protection is an indispensable requirement. In this work we developed a privacy-enhanced tabletop system composed of two kinds of displays: the tabletop surface and the virtual panel. All users share the large tabletop surface as a public display, while each user is provided with a virtual panel, emerging above the tabletop, as a personal display for viewing private information. The virtual panel is an intangible, privacy-protected virtual screen created by a special optical mechanism; it offers several promising characteristics that make it well suited to integration into a tabletop system. The contributions of the paper are as follows. Firstly, we introduce a novel display technique, the virtual panel, into a tabletop system to build a privacy-enhanced tabletop system. Secondly, an analysis of the display optics of the virtual panel is presented to explore other potentials of the display and to establish the feasibility of the proposed combination. Thirdly, a computer vision-based interaction technique is proposed to provide direct-touch interaction with the virtual panel. Finally, we discuss a wide range of considerations in designing the user interface and interaction for the virtual panel.
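One way to realize direct-touch interaction with an intangible mid-air panel is to test whether a tracked 3D fingertip lies within a small tolerance of the panel's plane. The sketch below illustrates that idea under our own assumptions; the paper's actual vision pipeline is not specified at this level of detail.

```python
import numpy as np

def touches_panel(finger, panel_point, panel_normal, eps=0.01):
    """Report a direct touch when a tracked 3D fingertip (metres) comes
    within eps metres of the plane of the virtual panel, defined by a
    point on the plane and its normal vector."""
    n = np.asarray(panel_normal, dtype=float)
    n = n / np.linalg.norm(n)  # unit normal
    offset = np.asarray(finger, dtype=float) - np.asarray(panel_point, dtype=float)
    return abs(np.dot(offset, n)) < eps  # perpendicular distance test
```

A fingertip 5 mm from the panel plane registers as a touch under the default 1 cm tolerance; one 5 cm away does not.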
System design for the WeSpace: Linking personal devices to a table-centered multi-user, multi-surface environment
Hao Jiang, Daniel J. Wigdor, C. Forlines, Chia Shen
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660191
The WeSpace is a long-term project dedicated to the creation of environments supporting walk-up-and-share collaboration among small groups. The focus of our system design has been to provide 1) mechanisms for groups to easily share their own data and 2) the necessary native visual applications suitable for large display environments. Our current prototype system includes both a large high-resolution data wall and an interactive table, which together provide a focal point for collaborative interaction with data and applications. In this paper, we describe in detail the design behind the current prototype. In particular, we present 1) the infrastructure that allows users to connect and visually share their laptop content on the fly and supports the extension of native visualization applications, and 2) the table-centric design employed in customized WeSpace applications to support cross-surface interactions. We also describe elements of our user-centered iterative design process, in particular the results from a late-stage session in which our astrophysicist participants successfully used the WeSpace to collaborate on their own real research problems.
Presenting using two-handed interaction in open space
Luc Vlaming, Jasper Smit, Tobias Isenberg
Pub Date: 2008-10-28 · DOI: 10.1109/tabletop.2008.4660180
Based on recent demonstrations of low-cost, infrared-based point tracking, we explore two-handed, surface-less interaction for presentation. On both hands, thumb and index finger are equipped with retro-reflective markers which are tracked by a Wiimote. We contribute a robust finger pairing and pinch recognition method that allows us to discriminate the hands and to initiate actions. We apply this input to a presentation application that allows users to work with slide decks, images, and videos. We identify specific requirements of this application domain and discuss the implemented transformation interactions and widgets. We report on user experience in both casual use and an actual presentation, and discuss advantages and limitations.
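Pairing four tracked marker points into two hands can be done by choosing the grouping that minimizes total intra-pair distance, with a pinch fired when a pair's thumb-index distance drops below a threshold. This is a minimal sketch of that idea under our own assumptions, not the authors' published method.

```python
import itertools
import math

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def pair_hands(points):
    """Group four tracked markers (thumb + index per hand) into two hands
    by picking the pairing with the smallest total intra-pair distance.
    Returns two index pairs into `points`."""
    best = None
    for combo in itertools.combinations(range(4), 2):
        rest = tuple(i for i in range(4) if i not in combo)
        cost = dist(points[combo[0]], points[combo[1]]) + dist(points[rest[0]], points[rest[1]])
        if best is None or cost < best[0]:
            best = (cost, (combo, rest))
    return best[1]

def is_pinching(thumb, index, threshold=20.0):
    """A pinch is recognized when thumb and index markers come within
    `threshold` tracker units of each other."""
    return dist(thumb, index) < threshold
```

For markers at x = 0, 5, 100, and 108, the pairing (0, 1) and (2, 3) wins, and the first pair (5 units apart) registers as a pinch.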
From DigiQuilt to DigiTile: Adapting educational technology to a multi-touch table
Jochen Rick, Y. Rogers
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660186
To realize the potential of multi-touch tables, interaction designers need to create meaningful applications for them in real-world contexts. One convenient shortcut towards that end is adapting a meaningful application from another interface paradigm. In this paper, we detail the process of adapting DigiQuilt, a single-user desktop educational technology, to DigiTile, a collaborative multi-touch application. With this case study, we concretely demonstrate the utility of adapting and how previous research and theory can inform that process. In particular, we show how learning theory (1) motivated the transition from the desktop to the multi-touch table, (2) guided the design process, and (3) informed the evaluation.
A field study of knowledge workers' use of interactive horizontal displays
M. Morris, A. Brush, B. Meyers
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660192
To better understand the potential for horizontal surfaces in day-to-day work, we conducted a field study. We collected and analyzed over a month of use data from eight participants who used horizontal displays in conjunction with their existing office computer setups. Our analysis of the system logs, observations, and interview data from the study reveals clear differences in preference and use patterns for horizontal and vertical display configurations. Based on these findings, we formulate hardware and software design guidelines that would increase the utility of interactive horizontal displays for office scenarios.
Little fingers on the tabletop: A usability evaluation in the kindergarten
E. Mansor, A. D. Angeli, O. Bruijn
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660190
This paper presents selected results from an experimental study designed to compare fantasy play in a virtual and a physical setting. Twenty-two children (aged 3 and 4) played in same-sex dyads with a real wooden tree house and its virtual implementation on a DiamondTouch tabletop. The study evinced several problems in the interaction with the tabletop, as children often struggled to drag the objects displayed on the surface. An error analysis is presented, and the results are used to propose guidelines for improving the use of DiamondTouch tabletops by young children.
Implementation and evaluations of vision-based finger flicking gesture recognition for tabletops
Toshiki Sato, K. Fukuchi, H. Koike
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660196
Finger flicking is a familiar finger movement: we often flick small objects across a table with a finger in daily life. We developed a flicking gesture recognition system called the 'OHAJIKI interface', which can track very rapid flicking movements using a high-speed camera and estimate the flicking power and direction in real time. In this paper, we describe the recognition techniques for the flicking gesture on a large LCD, producing a novel tabletop system that enables users to flick a virtual marble on the table. We also present an evaluation of the vision-based flicking tabletop system as an entertainment application.
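Estimating flick power and direction from high-speed camera frames reduces, at its simplest, to a finite-difference velocity over consecutive fingertip positions. The sketch below illustrates that reduction under our own assumptions; the paper's tracker will be more elaborate.

```python
import math

def flick_vector(track, fps):
    """Estimate a flick from recent fingertip positions (newest last)
    captured by a camera running at `fps` frames per second.

    Returns (speed, direction) where speed is in pixels/second and
    direction is the angle in radians, via a finite difference over
    the last two frames."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    dt = 1.0 / fps
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return math.hypot(vx, vy), math.atan2(vy, vx)
```

A fingertip that moves 5 pixels between two frames of a 100 fps camera yields a 500 px/s flick; a high-speed camera keeps this difference well conditioned for even very rapid flicks.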
i-m-Top: An interactive multi-resolution tabletop system accommodating to multi-resolution human vision
Ting-Ting Hu, Yi-Wei Chia, Liwei Chan, Y. Hung, Jane Yung-jen Hsu
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660202
In human vision, the resolution of visual perception is not uniform across the visual field: the fovea, a small pit in the central retina, provides our highest-resolution vision. While many researchers have focused on building large, homogeneous high-resolution displays for better visual quality, our approach goes a step further and exploits the variable-resolution nature of human vision on tabletop systems. In this work, we developed an innovative tabletop display system, called i-m-Top (interactive multi-resolution tabletop), featuring not only multi-touch input but also a multi-resolution display that accommodates the multi-resolution characteristics of human vision. Accordingly, i-m-Top provides a high-resolution projection image in the foveal region with a steerable projector, while providing a low-resolution projection image in the peripheral region with a wide-angle fixed projector. With this configuration, we are able to realize an interactive high-resolution display in a cost-effective way. To hide the engineering challenges posed by the unique hardware configuration, we also developed a software development toolkit, the i-m-Top SDK, for rapidly prototyping multi-resolution and multi-touch applications, to help push forward research in this field.
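The routing logic of such a dual-projector setup can be sketched as: centre a high-resolution window on the region of interest (here, the latest touch point), clamp it to the table, and send everything inside it to the steerable projector and everything outside to the fixed one. This is our illustrative reduction, not the i-m-Top SDK's API.

```python
def foveal_window(touch, size, table_w, table_h):
    """Centre a square high-resolution window of side `size` (served by
    the steerable projector) on the latest touch point, clamped so it
    stays within the table bounds. Returns (x, y, w, h)."""
    half = size / 2.0
    x = min(max(touch[0] - half, 0), table_w - size)
    y = min(max(touch[1] - half, 0), table_h - size)
    return (x, y, size, size)

def projector_for(point, window):
    """Route a point to the projector responsible for it: the steerable
    high-resolution projector inside the foveal window, the fixed
    wide-angle projector elsewhere."""
    x, y, w, h = window
    inside = x <= point[0] < x + w and y <= point[1] < y + h
    return "steerable" if inside else "fixed"
```

A touch near the table edge simply slides the window inward rather than letting the steerable projector aim off the surface.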
The deformable workspace: A membrane between real and virtual space
Yoshihiro Watanabe, Á. Cassinelli, T. Komuro, M. Ishikawa
Pub Date: 2008-10-28 · DOI: 10.1109/TABLETOP.2008.4660197
We propose a variant of multi-touch display technology that introduces an original way of manipulating three-dimensional data. The underlying metaphor is that of a deformable screen acting as a boundary surface between the real and virtual worlds. In doing so, the interface creates the illusion of continuity between the user's real space and the virtual three-dimensional space. The prototype system presented here achieves this with three key technologies: a tangible and deformable projection screen, a real-time three-dimensional sensing mechanism, and an algorithm for dynamic compensation of anamorphic projection. This paper introduces the concept of the deformable tangible workspace and describes the technologies required to implement it. Several applications developed on the prototype system are also detailed and demonstrated.
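Compensating anamorphic projection means pre-distorting the image so it appears undistorted on the sensed surface. For a locally planar patch this is a homography fitted from projector-to-surface correspondences; the paper compensates a continuously deforming membrane, so the sketch below is only the planar-patch building block, with names and interfaces of our own choosing.

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: fit a 3x3 matrix H such that dst ~ H @ src
    for four (x, y) point correspondences, e.g. projector pixels mapped to
    their sensed positions on a planar patch of the surface."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H is the null vector of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, p):
    """Apply H to a 2D point in homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[0] / q[2], q[1] / q[2]
```

To pre-compensate, one would fit H from surface sensing and render content through its inverse, so that projection onto the tilted or displaced patch lands where intended.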