Point-and-shoot for ubiquitous tagging on mobile phones

Wonwoo Lee, Youngmin Park, V. Lepetit, Woontack Woo
DOI: 10.1109/ISMAR.2010.5643551
Published in: 2010 IEEE International Symposium on Mixed and Augmented Reality, 2010-11-22
Citations: 19

Abstract

We propose a novel way to augment a real scene with minimal user intervention on a mobile phone: the user only has to point the phone camera at the desired location of the augmentation. Our method is valid only for vertical or horizontal surfaces, but this is not a restriction in practice in man-made environments, and it avoids going through any reconstruction of the 3D scene, which is still a delicate process. Our approach is inspired by recent work on perspective patch recognition [5]; we show how to modify it for better performance on mobile phones, and how to exploit the phone's accelerometers to relax the need for fronto-parallel views. In addition, our implementation allows sharing the augmentations and the required data over peer-to-peer communication to build a shared AR space on mobile phones.
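The abstract does not give implementation details for the accelerometer step. As a rough sketch of the underlying idea, the gravity direction measured in the camera frame induces a homography that warps the image so a horizontal surface (whose normal is gravity) appears fronto-parallel; the function name and API below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def gravity_rectifying_homography(K, g_cam):
    """Homography warping the image so a horizontal plane (normal to
    gravity) looks fronto-parallel.
    K: 3x3 camera intrinsics; g_cam: gravity direction from the
    accelerometer, expressed in the camera frame."""
    g = np.asarray(g_cam, dtype=float)
    g /= np.linalg.norm(g)
    z = np.array([0.0, 0.0, 1.0])  # camera optical axis
    v = np.cross(g, z)             # rotation axis taking g onto z
    c = float(np.dot(g, z))
    if np.linalg.norm(v) < 1e-12:  # already aligned (or opposite;
        R = np.eye(3)              # the c == -1 case is degenerate)
    else:
        vx = np.array([[0.0, -v[2], v[1]],
                       [v[2], 0.0, -v[0]],
                       [-v[1], v[0], 0.0]])
        # Rodrigues-style rotation with R @ g == z
        R = np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))
    # Induced homography in pixel coordinates: a pure camera rotation
    # maps image points through K R K^-1.
    return K @ R @ np.linalg.inv(K)
```

Under this sketch, the vanishing point of the vertical direction is warped to the principal point, which is exactly the fronto-parallel condition for a horizontal plane; a vertical surface would be handled analogously with the axis perpendicular to gravity.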