A user study on the Snap-To-Feature interaction method

Gun A. Lee, M. Billinghurst
2011 10th IEEE International Symposium on Mixed and Augmented Reality, 26 October 2011
DOI: 10.1109/ISMAR.2011.6092398
Recent advances in mobile computing and augmented reality (AR) technology have led to the popularization of mobile AR applications. Touch screen input is common on mobile devices and is also widely used in mobile AR applications. However, due to unsteady camera view movement, it can be hard to carry out precise interactions in handheld AR environments for tasks such as tracing physical objects. In this research, we investigate a Snap-To-Feature interaction method that helps users perform more accurate touch screen interactions by attracting user input points to image features in the AR scene. A user experiment is performed in which the method is used to trace a physical object, a typical task for modeling real objects within an AR scene. The results show that the Snap-To-Feature method makes a significant difference in the accuracy of touch-screen-based AR interaction.
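The core idea described above, attracting a raw touch point to a nearby image feature, can be sketched as a simple nearest-neighbor snap in screen space. The function name, the tuple-based point representation, and the 20-pixel snap radius below are illustrative assumptions, not details taken from the paper:

```python
import math

def snap_to_feature(touch_point, features, snap_radius=20.0):
    """Return the detected image feature nearest to the touch point,
    if one lies within snap_radius pixels; otherwise return the raw
    touch point unchanged. (Hypothetical helper illustrating the
    general snapping idea; names and radius are assumptions.)"""
    best, best_dist = None, snap_radius
    for fx, fy in features:
        # Euclidean distance in screen-space pixels
        d = math.hypot(fx - touch_point[0], fy - touch_point[1])
        if d <= best_dist:
            best, best_dist = (fx, fy), d
    return best if best is not None else touch_point

# Example: a touch near a corner feature snaps to it, while a touch
# far from every feature is left where the user placed it.
print(snap_to_feature((10, 10), [(12, 11), (100, 100)]))  # snaps to (12, 11)
print(snap_to_feature((10, 10), [(100, 100)]))            # stays at (10, 10)
```

In a real handheld AR pipeline the `features` list would come from an image feature detector run on the current camera frame, so the snap targets track the physical object even as the camera shakes.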