FlyingARrow
Uwe Gruenefeld, Daniel Lange, Lasse Hammer, Susanne CJ Boll, Wilko Heuten
Proceedings of the 7th ACM International Symposium on Pervasive Displays, 2018-06-06
DOI: 10.1145/3205873.3205881
Citations: 2
Abstract
Augmented Reality (AR) devices empower users to enrich their surroundings by pinning digital content onto real-world objects. However, current AR devices suffer from small fields of view, making the process of locating spatially distributed digital content similar to looking through a keyhole. Because of visual clutter, previous solutions are not suitable for locating out-of-view digital content on small-field-of-view devices. Therefore, we developed FlyingARrow, which consists of a visual representation that flies on demand from the user's line of sight toward the position of the out-of-view object and plays an acoustic signal over headphones once it reaches it. We compared our technique with the out-of-view object visualization technique EyeSee360 and found that it resulted in higher usability and lower workload. However, FlyingARrow performed slightly worse with respect to search time and direction error. Furthermore, we discuss the challenges and opportunities in combining visual and acoustic representations to overcome visual clutter.
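The core interaction described above, a marker that travels from the user's line of sight to an out-of-view target and triggers a sound on arrival, can be sketched as a simple constant-speed flight toward a 3D point. This is a minimal illustrative sketch, not the authors' implementation; the function name, parameters, and sampling scheme are all assumptions made for illustration.

```python
import math

def fly_arrow_path(start, target, speed, dt):
    """Yield successive 3D positions of a hypothetical arrow moving from
    `start` toward `target` at `speed` units/second, sampled every `dt`
    seconds. The final yielded position is exactly `target`, which is
    where an acoustic cue would be triggered in a FlyingARrow-like UI."""
    pos = list(start)
    while True:
        # Vector and distance from the current position to the target.
        delta = [t - p for t, p in zip(target, pos)]
        dist = math.sqrt(sum(d * d for d in delta))
        if dist <= speed * dt:
            # The arrow reaches the target within this step:
            # snap to it and stop (play the acoustic signal here).
            yield tuple(target)
            return
        # Advance along the direction vector by one step of travel.
        pos = [p + d / dist * speed * dt for p, d in zip(pos, delta)]
        yield tuple(pos)

# Example: fly from the viewer's gaze origin to a target 10 units ahead.
path = list(fly_arrow_path((0.0, 0.0, 0.0), (0.0, 0.0, 10.0),
                           speed=5.0, dt=0.1))
```

In a real AR renderer, each yielded position would be drawn for one frame; the key design point from the paper is that the visual part ends when the target is reached and an acoustic signal takes over, so the scene is never cluttered with persistent overlays.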