International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications
Workshop chairs
Hartmut Seichter, Denis Kalkofen
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671750
It is our pleasure to present the workshops associated with ISMAR 2013. These events provide a chance to thoroughly examine specific research areas in the exciting field of Mixed and Augmented Reality.
{"title":"Workshop chairs","authors":"Hartmut Seichter, Denis Kalkofen","doi":"10.1109/ISMAR.2013.6671750","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671750","url":null,"abstract":"It is our pleasure to present the workshops associated with ISMAR 2013. These events provide a chance to thoroughly examine specific research areas in the exciting field of Mixed and Augmented Reality.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74359609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Visuo-Haptic Augmented Reality runtime environment for medical training
U. Eck, C. Sandor, Hamid Laga
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671816
During the last decade, Visuo-Haptic Augmented Reality (VHAR) systems have emerged that enable users to see and touch digital information embedded in the real world. They pose unique problems for developers, including the need for precise augmentations, accurate co-location of haptic devices, and efficient concurrent processing of multiple real-time sensor inputs to achieve low latency. We believe this complexity is one of the main reasons why VHAR technology has been used in only a few user interface research projects. The proposed project's main objective is to pioneer the development of a widely applicable VHAR runtime environment that meets the requirements of real-time, low-latency operation with precise co-location, haptic interaction with deformable bodies, and realistic rendering, while reducing the overall cost and complexity for developers. A further objective is to evaluate the benefits of VHAR user interfaces with a focus on medical training applications, so that creators of future medical simulators and other haptic applications recognize the potential of VHAR.
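As an illustration of the concurrency requirement stated above, the sketch below runs each sensor on its own thread and has a fusion loop that always consumes the newest sample, dropping stale ones to bound latency. The sensor names, rates, and single-slot-queue policy are assumptions for illustration, not the project's actual design.

```python
import queue
import threading
import time

def sensor_thread(name, rate_hz, out_queue, stop_event):
    """Poll one (simulated) sensor and publish timestamped samples."""
    period = 1.0 / rate_hz
    while not stop_event.is_set():
        sample = (time.monotonic(), name, 0.0)  # placeholder reading
        try:
            out_queue.put_nowait(sample)
        except queue.Full:
            try:
                out_queue.get_nowait()  # discard the stale sample
            except queue.Empty:
                pass                    # consumer beat us to it
            out_queue.put_nowait(sample)
        time.sleep(period)

def fusion_loop(queues, stop_event):
    """Once per visual frame, fuse the newest sample from each sensor."""
    latest = {}
    while not stop_event.is_set():
        for q in queues:
            try:
                ts, name, value = q.get_nowait()
                latest[name] = (ts, value)
            except queue.Empty:
                pass  # no fresh sample: reuse the previous one
        # ... here: co-locate the haptic proxy, update and render the scene
        time.sleep(1.0 / 60.0)

stop = threading.Event()
cam_q, hap_q = queue.Queue(maxsize=1), queue.Queue(maxsize=1)
workers = [
    threading.Thread(target=sensor_thread, args=("camera", 30, cam_q, stop)),
    threading.Thread(target=sensor_thread, args=("haptic", 1000, hap_q, stop)),
    threading.Thread(target=fusion_loop, args=([cam_q, hap_q], stop)),
]
for w in workers:
    w.start()
time.sleep(0.5)  # run briefly for demonstration
stop.set()
for w in workers:
    w.join()
```

The single-slot queues make the freshest-sample-wins policy explicit: a fast sensor (here a 1 kHz haptic loop) never backs up behind a slow consumer, which is one simple way to keep end-to-end latency bounded.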
{"title":"Visuo-Haptic Augmented Reality runtime environment for medical training","authors":"U. Eck, C. Sandor, Hamid Laga","doi":"10.1109/ISMAR.2013.6671816","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671816","url":null,"abstract":"During the last decade, Visuo-Haptic Augmented Reality (VHAR) systems have emerged that enable users to see and touch digital information that is embedded in the real world. They pose unique problems to developers, including the need for precise augmentations, accurate colocation of haptic devices, and efficient concurrent processing of multiple, realtime sensor inputs to achieve low latency. We think that this complexity is one of the main reasons, why VHAR technology has only been used in few user interface research projects. The proposed project's main objective is to pioneer the development of a widely applicable VHAR runtime environment, which meets the requirements of realtime, low latency operation with precise co-location, haptic interaction with deformable bodies, and realistic rendering, while reducing the overall cost and complexity for developers. A further objective is to evaluate the benefits of VHAR user interfaces with a focus on medical training applications, so that creators of future medical simulators or other haptic applications recognize the potential of VHAR.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83336853","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Doctoral chairs
W. Chinthammit, S. Kim
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671752
We are proud to present the 2nd ISMAR Doctoral Consortium (DC). We are continuing to build on the success of the 1st DC at ISMAR 2012, where DC students received invaluable input from DC mentors to help improve their research. The goal of the DC is to give PhD students an opportunity to present their research, discuss their current progress and future plans, and receive constructive criticism and guidance regarding their current work, future work, and career prospects.
{"title":"Doctoral chairs","authors":"W. Chinthammit, S. Kim","doi":"10.1109/ISMAR.2013.6671752","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671752","url":null,"abstract":"We are proud to present the 2nd ISMAR Doctoral Consortium (DC). We are continuing to build on the success of the 1st DC at ISMAR 2012, where DC students received invaluable inputs from DC mentors to help improve their research. The goal of the DC is to create an opportunity for PhD students to present their research, discuss their current progresses and future plans, and receive constructive criticism and guidances regarding their current work, future work, and career perspectives.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73072524","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Visual analytics in Augmented Reality
Neven A. M. ElSayed, C. Sandor, Hamid Laga
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671817
In the last decade, Augmented Reality has matured and become widely adopted on mobile devices. Exploring the information available in a user's environment is one of its key applications. However, current mobile Augmented Reality interfaces are very limited compared to the big data exploration tools recently emerging for desktop computers. Our vision is to bring powerful Visual Analytics tools to mobile Augmented Reality.
{"title":"Visual analytics in Augmented Reality","authors":"Neven A. M. ElSayed, C. Sandor, Hamid Laga","doi":"10.1109/ISMAR.2013.6671817","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671817","url":null,"abstract":"In the last decade, Augmented Reality has become more mature and is widely adopted on mobile devices. Exploring the available information of a user's environment is one of the key applications. However, current mobile Augmented Reality interfaces are very limited compared to the recently emerging big data exploration tools for desktop computers. Our vision is to bring powerful Visual Analytic tools to mobile Augmented Reality.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79910859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
WIP chairs
S. DiVerdi, Jun Park
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671753
New to ISMAR this year, we are proud to present the Works in Progress (WIP) program. Augmented Reality is rapidly growing into many new areas, so the WIP program is a platform for presenting the field's latest, emerging results to the larger community before the work has reached its final form. This year, the program ranges from bread-and-butter AR technologies such as remote collaboration interfaces, fiducial marker design, and perceptual studies to loftier applications like AR interactions aboard the International Space Station. Passive haptics, bare-handed gesture interfaces, and realistic rendering round out the offerings. So come to the WIP sessions to hear about active AR research and find the spark of inspiration!
{"title":"Wip chairs","authors":"S. DiVerdi, Jun Park","doi":"10.1109/ISMAR.2013.6671753","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671753","url":null,"abstract":"New this year to ISMAR 2013, we are proud to present the Works In Progress (WIP) Program. Augmented Reality is rapidly growing into many new areas, so the WIP is a platform to present the field's latest, emerging results to the larger community before the work has reached its final form. This year, the program includes bread and butter AR technologies such as remote collaboration interfaces, fiducial marker design, and perceptual studies, to even loftier applications like AR interactions aboard the International Space Station. Passive haptics, bare-handed gesture interfaces, and realistic rendering round out the offerings. So come to the WIP sessions to hear about active AR research and find the spark of inspiration!","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74639129","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Bare hand natural interaction with augmented objects
L. Figueiredo, Ronaldo Dos Anjos, Jorge Eduardo Falcao Lindoso, E. Neto, R. Roberto, Manoela Silva, V. Teichrieb
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671836
In this work in progress, we address the problem of interacting with augmented objects. We develop a bare-hand tracking technique which, combined with gesture-recognition heuristics, enables intuitive interaction with augmented objects. The tracking algorithm uses a flock-of-features approach that tracks both hands in real time. Interaction occurs through grasp and release gestures. Physics simulation and photorealistic rendering are added to the pipeline, so the tool provides more coherent feedback and makes the virtual objects look and respond more like real ones. The pipeline was tested through specific tasks designed to analyze its performance in terms of ease of use, precision, and response time.
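As a rough illustration of the flock-of-features idea named in the abstract, the sketch below tracks KLT features with OpenCV's pyramidal Lucas-Kanade tracker and treats the flock's median as the hand position, respawning outliers near it. The thresholds, respawn policy, and grasp heuristic are placeholders, not the authors' implementation.

```python
import cv2
import numpy as np

MAX_DIST = 60.0  # max pixel distance a feature may stray from the flock median
LK_PARAMS = dict(winSize=(15, 15), maxLevel=2)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                              qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok or pts is None:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts,
                                                  None, **LK_PARAMS)
    good = new_pts[status.ravel() == 1].reshape(-1, 2)
    if len(good) == 0:
        break

    # Flocking constraint: the median of the feature cloud stands in for
    # the hand position; features that stray too far are respawned near it.
    median = np.median(good, axis=0)
    flock = good[np.linalg.norm(good - median, axis=1) < MAX_DIST]
    n_lost = len(good) - len(flock)
    if n_lost:
        respawn = median + np.random.uniform(-20, 20, size=(n_lost, 2))
        flock = np.vstack([flock, respawn])

    # Placeholder grasp heuristic: the flock contracts as the hand closes.
    spread = np.mean(np.linalg.norm(flock - median, axis=1))
    grasping = spread < 15.0  # threshold is illustrative only
    print("grasp" if grasping else "open", len(flock))

    pts = flock.reshape(-1, 1, 2).astype(np.float32)
    prev_gray = gray
```

Tracking each hand with its own flock, as the paper describes, would simply run two instances of this loop seeded on separate hand regions.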
{"title":"Bare hand natural interaction with augmented objects","authors":"L. Figueiredo, Ronaldo Dos Anjos, Jorge Eduardo Falcao Lindoso, E. Neto, R. Roberto, Manoela Silva, V. Teichrieb","doi":"10.1109/ISMAR.2013.6671836","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671836","url":null,"abstract":"In this work in progress we address the problem of interacting with augmented objects. A bare hand tracking technique is developed, which allied to gesture recognition heuristics, enables interaction with augmented objects in an intuitive way. The tracking algorithm uses a flock of features approach that tracks both hands in real time. The interaction occurs by the execution of grasp and release gestures. Physics simulation and photorealistic rendering are added to the pipeline. This way, the tool provides more coherent feedback in order to make the virtual objects look and respond more likely real ones. The pipeline was tested through specific tasks, designed to analyze its performance regarding the easiness of use, precision and response time.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87133675","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tutorial chairs
T. Drummond, Matt Adcock
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671749
It is our great pleasure to present the ISMAR Tutorials. We proudly host two tutorials in which seasoned researchers share their knowledge. Our tutorials cover open standards and development using HTML and instantAR. Through these exciting tutorials we hope to expand the minds of ISMAR 2013 attendees and help foster the next generation of Mixed and Augmented Reality researchers, practitioners, and artists.
{"title":"Tutorial chairs","authors":"T. Drummond, Matt Adcock","doi":"10.1109/ISMAR.2013.6671749","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671749","url":null,"abstract":"It is our great pleasure to present the ISMAR Tutorials. We proudly host two tutorials that provide sharing of knowledge from seasoned researchers. Our tutorials cover open standards and development using HTML and instantAR. Through these exciting tutorials we hope to expand the minds of ISMAR 2013 attendees and help to foster the next generation of Mixed and Augmented Reality researchers, practitioners, and artists.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72582913","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using a HHD with a HMD for mobile AR interaction
Rahul Budhiraja, Gun A. Lee, M. Billinghurst
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671837
Mobile Augmented Reality (AR) applications are typically deployed either on head-mounted displays (HMDs) or handheld displays (HHDs). This paper explores novel interaction techniques for a hybrid HHD-HMD system that builds on the strengths of each type of device. We use the HMD for viewing AR content and a touchscreen HHD for interacting with the content. A prototype system was developed, and a user study was conducted comparing four interaction techniques for selection tasks.
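The paper's four selection techniques are not described in this abstract, so the sketch below shows only the generic plumbing such a hybrid needs: unprojecting a touch point from the HHD into a world-space selection ray in the HMD's view and testing it against a target. All intrinsics, poses, and the target sphere are made-up values.

```python
import numpy as np

def touch_to_ray(touch_uv, screen_wh, K, cam_to_world):
    """Map a normalized HHD touch to a world-space ray from the HMD camera."""
    u = touch_uv[0] * screen_wh[0]
    v = touch_uv[1] * screen_wh[1]
    # Back-project the pixel into a direction in camera coordinates.
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate into world space; the camera position is the ray origin.
    origin = cam_to_world[:3, 3]
    direction = cam_to_world[:3, :3] @ d_cam
    return origin, direction / np.linalg.norm(direction)

def pick_sphere(origin, direction, center, radius):
    """Ray-sphere test: True if the ray hits the target's bounding sphere."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    return disc >= 0 and (-b + np.sqrt(max(disc, 0.0))) > 0

# Illustrative values: 640x480 view, identity head pose, a target 2 m ahead.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
origin, direction = touch_to_ray((0.5, 0.5), (640, 480), K, np.eye(4))
print(pick_sphere(origin, direction, np.array([0, 0, 2.0]), 0.25))  # True
```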
{"title":"Using a HHD with a HMD for mobile AR interaction","authors":"Rahul Budhiraja, Gun A. Lee, M. Billinghurst","doi":"10.1109/ISMAR.2013.6671837","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671837","url":null,"abstract":"Mobile Augmented Reality (AR) applications are typically deployed either on head mounted displays (HMD) or handheld displays (HHD). This paper explores novel interaction techniques for a combined HHD-HMD hybrid system that builds on the strengths of each type of device. We use the HMD for viewing AR content and a touch screen HHD for interacting with the content. A prototype system was developed and a user study was conducted comparing four interaction techniques for selection tasks.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81017196","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Comparing pointing and drawing for remote collaboration
Seungwon Kim, Gun A. Lee, Nobuchika Sakata
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671833
In this research, we explore the use of pointing and drawing in a remote collaboration system. Our application allows a local user with a tablet to communicate with a remote expert on a desktop computer. We compared performance in four conditions: (1) pointers on a still image, (2) pointers on live video, (3) annotation on a still image, and (4) annotation on live video. We found that drawing annotations require fewer inputs on the expert's side and impose less cognitive load on the local worker. In a follow-on study, we compared conditions (2) and (4) using a more complicated task. We found that pointing input requires good verbal communication to be effective, and that drawing annotations need to be erased after each step of a task is completed.
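The abstract does not describe the system's wire format; the following is a hypothetical minimal message schema that captures the two input styles being compared, transient pointer updates versus persistent strokes, plus the explicit erase that the study found necessary after each completed step.

```python
import json
import time

def pointer_event(x, y):
    """Transient: the client replaces the previous pointer on arrival."""
    return json.dumps({"type": "pointer", "x": x, "y": y, "t": time.time()})

def stroke_event(stroke_id, points):
    """Persistent: the client renders the stroke until told to erase it."""
    return json.dumps({"type": "stroke", "id": stroke_id, "points": points})

def erase_event(stroke_ids=None):
    """Erase specific strokes, or all annotations when ids is None."""
    return json.dumps({"type": "erase", "ids": stroke_ids})

# Example: the remote expert circles a part, then clears it after the step.
msgs = [
    stroke_event(1, [[0.41, 0.30], [0.44, 0.28], [0.47, 0.31]]),
    erase_event([1]),
]
for m in msgs:
    print(m)
```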
{"title":"Comparing pointing and drawing for remote collaboration","authors":"Seungwon Kim, Gun A. Lee, Nobuchika Sakata","doi":"10.1109/ISMAR.2013.6671833","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671833","url":null,"abstract":"In this research, we explore using pointing and drawing in a remote collaboration system. Our application allows a local user with a tablet to communicate with a remote expert on a desktop computer. We compared performance in four conditions: (1) Pointers on Still Image, (2) Pointers on Live Video, (3) Annotation on Still Image, and (4) Annotation on Live Video. We found that using drawing annotations would require fewer inputs on an expert side, and would require less cognitive load on the local worker side. In a follow-on study we compared the conditions (2) and (4) using a more complicated task. We found that pointing input requires good verbal communication to be effective and that drawing annotations need to be erased after completing each step of a task.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88292917","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Adapting ray tracing to Spatial Augmented Reality
Markus Broecker, B. Thomas, Ross T. Smith
Pub Date: 2013-10-01 | DOI: 10.1109/ISMAR.2013.6671826
Ray tracing is an elegant and intuitive image generation method. The introduction of GPU-accelerated ray tracing and corresponding software frameworks makes this rendering technique a viable option for Augmented Reality applications. Spatial Augmented Reality (SAR) employs projectors to illuminate physical models and is used in fields that require photorealism, such as design and prototyping. Ray tracing can be used to great effect in this Augmented Reality environment to create scenes of high visual fidelity. However, the peculiarities of SAR systems require that core ray tracing algorithms be adapted to this new rendering environment. This paper highlights the problems involved in using ray tracing in a SAR environment and provides solutions to overcome them. In particular, the following issues are addressed: ray generation, hybrid rendering, and view-dependent rendering.
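As a sketch of the first issue listed, ray generation: in SAR the image plane belongs to the projector rather than a virtual camera, so primary rays start at the projector's center of projection and pass through each projector pixel before hitting the physical model. The intrinsics and pose below are illustrative values, not the paper's calibration.

```python
import numpy as np

def projector_rays(K, proj_to_world, width, height):
    """One world-space primary ray per projector pixel (origins, directions)."""
    us, vs = np.meshgrid(np.arange(width) + 0.5, np.arange(height) + 0.5)
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3)
    dirs_proj = pix @ np.linalg.inv(K).T  # directions in projector space
    R, t = proj_to_world[:3, :3], proj_to_world[:3, 3]
    dirs_world = dirs_proj @ R.T
    dirs_world /= np.linalg.norm(dirs_world, axis=1, keepdims=True)
    origins = np.broadcast_to(t, dirs_world.shape)
    return origins, dirs_world

K = np.array([[1000.0, 0, 512], [0, 1000.0, 384], [0, 0, 1]])  # illustrative
origins, dirs = projector_rays(K, np.eye(4), 1024, 768)
print(origins.shape, dirs.shape)  # (786432, 3) each: one ray per pixel
```

Tracing these rays against the physical model's mesh yields the surface point each projector pixel illuminates, which is the starting point for the hybrid and view-dependent rendering the paper goes on to address.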
{"title":"Adapting ray tracing to Spatial Augmented Reality","authors":"Markus Broecker, B. Thomas, Ross T. Smith","doi":"10.1109/ISMAR.2013.6671826","DOIUrl":"https://doi.org/10.1109/ISMAR.2013.6671826","url":null,"abstract":"Ray tracing is an elegant and intuitive image generation method. The introduction of GPU-accelerated ray tracing and corresponding software frameworks makes this rendering technique a viable option for Augmented Reality applications. Spatial Augmented Reality employs projectors to illuminate physical models and is used in fields that require photorealism, such as design and prototyping. Ray tracing can be used to great effect in this Augmented Reality environment to create scenes of high visual fidelity. However, the peculiarities of SAR systems require that core ray tracing algorithms be adapted to this new rendering environment. This paper highlights the problems involved in using ray tracing in a SAR environment and provides solutions to overcome them. In particular, the following issues are addressed: ray generation, hybrid rendering and view-dependent rendering.","PeriodicalId":92225,"journal":{"name":"International Symposium on Mixed and Augmented Reality : (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89126860","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}