Poster: Interscopic multi-touch surfaces: Using bimanual interaction for intuitive manipulation of spatial data
Johannes Schöning, Frank Steinicke, A. Krüger, K. Hinrichs
2009 IEEE Symposium on 3D User Interfaces, March 14, 2009. DOI: 10.1109/3DUI.2009.4811219
In recent years, visualization of and interaction with 3D data have become increasingly popular and widespread due to the requirements of numerous application areas. Two-dimensional desktop systems are often limited in cases where natural and intuitive interfaces are desired. Sophisticated 3D user interfaces, as provided by virtual reality (VR) systems consisting of stereoscopic projection and tracked input devices, are rarely adopted by ordinary users or even by experts. Since most applications dealing with 3D data still use traditional 2D GUIs, current user interface designs lack adequate efficiency. Multi-touch interaction has received considerable attention in the last few years, in particular for non-immersive, natural 2D interaction. Interactive multi-touch surfaces even support three degrees of freedom per contact, in terms of 2D position on the surface and varying levels of pressure. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of the next generation of 2D and 3D user interfaces. Stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations of multi-touch interaction in this context have not been considered. In this paper we present new multi-touch paradigms that combine traditional 2D interaction performed in monoscopic mode with 3D interaction and stereoscopic projection, which we refer to as interscopic multi-touch surfaces (iMUTS).
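To make the abstract's claim about per-contact degrees of freedom concrete: on a pressure-sensing multi-touch surface, each contact can be described by its 2D position on the surface plus the applied pressure. The following minimal Python sketch is not taken from the paper; the class name, field names, and normalized value ranges are illustrative assumptions only.

```python
from dataclasses import dataclass


@dataclass
class TouchPoint:
    """One contact on a multi-touch surface: 2D position plus pressure."""
    x: float         # horizontal position on the surface, normalized to [0, 1]
    y: float         # vertical position on the surface, normalized to [0, 1]
    pressure: float  # contact pressure, normalized to [0, 1]


def degrees_of_freedom(point: TouchPoint) -> tuple[float, float, float]:
    """Return the three per-contact degrees of freedom mentioned in the abstract."""
    return (point.x, point.y, point.pressure)


if __name__ == "__main__":
    p = TouchPoint(x=0.42, y=0.77, pressure=0.30)
    print(degrees_of_freedom(p))  # -> (0.42, 0.77, 0.3)
```

Under this reading, the pressure value is what distinguishes such surfaces from purely planar (x, y) touch input, which is the trade-off the abstract highlights between constrained touch interaction with tangible feedback and unrestricted free-space interaction.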