{"title":"Visuo-Haptic Augmented Reality runtime environment for medical training","authors":"U. Eck, C. Sandor, Hamid Laga","doi":"10.1109/ISMAR.2013.6671816","DOIUrl":null,"url":null,"abstract":"During the last decade, Visuo-Haptic Augmented Reality (VHAR) systems have emerged that enable users to see and touch digital information that is embedded in the real world. They pose unique problems to developers, including the need for precise augmentations, accurate colocation of haptic devices, and efficient concurrent processing of multiple, realtime sensor inputs to achieve low latency. We think that this complexity is one of the main reasons, why VHAR technology has only been used in few user interface research projects. The proposed project's main objective is to pioneer the development of a widely applicable VHAR runtime environment, which meets the requirements of realtime, low latency operation with precise co-location, haptic interaction with deformable bodies, and realistic rendering, while reducing the overall cost and complexity for developers. A further objective is to evaluate the benefits of VHAR user interfaces with a focus on medical training applications, so that creators of future medical simulators or other haptic applications recognize the potential of VHAR.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2013.6671816","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
During the last decade, Visuo-Haptic Augmented Reality (VHAR) systems have emerged that enable users to see and touch digital information embedded in the real world. They pose unique problems for developers, including the need for precise augmentations, accurate co-location of haptic devices, and efficient concurrent processing of multiple real-time sensor inputs to achieve low latency. We think this complexity is one of the main reasons why VHAR technology has been used in only a few user interface research projects. The proposed project's main objective is to pioneer the development of a widely applicable VHAR runtime environment that meets the requirements of real-time, low-latency operation with precise co-location, haptic interaction with deformable bodies, and realistic rendering, while reducing the overall cost and complexity for developers. A further objective is to evaluate the benefits of VHAR user interfaces, with a focus on medical training applications, so that creators of future medical simulators and other haptic applications recognize the potential of VHAR.
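To illustrate the kind of concurrent sensor processing the abstract alludes to, the sketch below shows one common pattern for feeding a fast haptic loop from a slower visual tracker: each sensor thread publishes only its most recent sample, so the high-rate force loop never blocks on a growing queue. This is a minimal, assumption-laden example and not the authors' runtime; the types, update rates, and names (Pose, LatestValue) are hypothetical.

```cpp
// Minimal sketch (not the authors' implementation) of a "latest sample" slot
// shared between a ~60 Hz tracking thread and a ~1 kHz haptic thread.
#include <chrono>
#include <mutex>
#include <thread>

struct Pose {                 // hypothetical 6-DoF tracker sample
    double t[3]{};            // translation
    double q[4]{1, 0, 0, 0};  // orientation quaternion
};

template <typename Sample>
class LatestValue {           // single-writer, many-reader slot holding the newest sample
public:
    void publish(const Sample& s) {
        std::lock_guard<std::mutex> lock(mutex_);
        sample_ = s;
    }
    Sample read() const {
        std::lock_guard<std::mutex> lock(mutex_);
        return sample_;       // readers always get the newest complete sample
    }
private:
    mutable std::mutex mutex_;
    Sample sample_{};
};

int main() {
    LatestValue<Pose> trackerPose;   // written by the camera-tracking thread
    bool running = true;

    // Tracking thread: ~60 Hz visual tracker (placeholder data).
    std::thread tracker([&] {
        while (running) {
            Pose p;                  // in a real system: estimated from camera images
            trackerPose.publish(p);
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    });

    // Haptic thread: ~1 kHz loop reads the newest pose without waiting for
    // the slower tracker, which keeps force-rendering latency low.
    std::thread haptics([&] {
        for (int i = 0; i < 1000; ++i) {
            Pose p = trackerPose.read();
            (void)p;                 // in a real system: co-locate device, render forces
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
        running = false;
    });

    haptics.join();
    tracker.join();
}
```

The design choice sketched here (overwrite instead of queue) trades completeness of the sensor history for bounded latency, which is usually the right trade-off when the consumer is a haptic or rendering loop that only needs the current state.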