An efficient index structure for large-scale geo-tagged video databases

Ying Lu, C. Shahabi, S. H. Kim
{"title":"大型地理标记视频数据库的高效索引结构","authors":"Ying Lu, C. Shahabi, S. H. Kim","doi":"10.1145/2666310.2666480","DOIUrl":null,"url":null,"abstract":"An unprecedented number of user-generated videos (UGVs) are currently being collected by mobile devices, however, such unstructured data are very hard to index and search. Due to recent development, UGVs can be geo-tagged, e.g., GPS locations and compass directions, at the acquisition time at a very fine spatial granularity. Ideally, each video frame can be tagged by the spatial extent of its coverage area, termed Field-Of-View (FOV). In this paper, we focus on the challenges of spatial indexing and querying of FOVs in a large repository. Since FOVs contain both location and orientation information, and their distribution is non-uniform, conventional spatial indexes (e.g., R-tree, Grid) cannot index them efficiently. We propose a class of new R-tree-based index structures that effectively harness FOVs' camera locations, orientations and view-distances, in tandem, for both filtering and optimization. In addition, we present novel search strategies and algorithms for efficient range and directional queries on FOVs utilizing our indexes. Our experiments with a real-world dataset and a large synthetic video dataset (over 30 years worth of videos) demonstrate the scalability and efficiency of our proposed indexes and search algorithms and their superiority over the competitors.","PeriodicalId":153031,"journal":{"name":"Proceedings of the 22nd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"An efficient index structure for large-scale geo-tagged video databases\",\"authors\":\"Ying Lu, C. Shahabi, S. H. Kim\",\"doi\":\"10.1145/2666310.2666480\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An unprecedented number of user-generated videos (UGVs) are currently being collected by mobile devices, however, such unstructured data are very hard to index and search. Due to recent development, UGVs can be geo-tagged, e.g., GPS locations and compass directions, at the acquisition time at a very fine spatial granularity. Ideally, each video frame can be tagged by the spatial extent of its coverage area, termed Field-Of-View (FOV). In this paper, we focus on the challenges of spatial indexing and querying of FOVs in a large repository. Since FOVs contain both location and orientation information, and their distribution is non-uniform, conventional spatial indexes (e.g., R-tree, Grid) cannot index them efficiently. We propose a class of new R-tree-based index structures that effectively harness FOVs' camera locations, orientations and view-distances, in tandem, for both filtering and optimization. In addition, we present novel search strategies and algorithms for efficient range and directional queries on FOVs utilizing our indexes. 
Our experiments with a real-world dataset and a large synthetic video dataset (over 30 years worth of videos) demonstrate the scalability and efficiency of our proposed indexes and search algorithms and their superiority over the competitors.\",\"PeriodicalId\":153031,\"journal\":{\"name\":\"Proceedings of the 22nd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems\",\"volume\":\"36 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-11-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 22nd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2666310.2666480\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 22nd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2666310.2666480","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 13

Abstract

An unprecedented number of user-generated videos (UGVs) are currently being collected by mobile devices; however, such unstructured data are very hard to index and search. Due to recent developments, UGVs can be geo-tagged, e.g., with GPS locations and compass directions, at acquisition time at a very fine spatial granularity. Ideally, each video frame can be tagged by the spatial extent of its coverage area, termed Field-Of-View (FOV). In this paper, we focus on the challenges of spatial indexing and querying of FOVs in a large repository. Since FOVs contain both location and orientation information, and their distribution is non-uniform, conventional spatial indexes (e.g., R-tree, Grid) cannot index them efficiently. We propose a class of new R-tree-based index structures that effectively harness FOVs' camera locations, orientations and view-distances, in tandem, for both filtering and optimization. In addition, we present novel search strategies and algorithms for efficient range and directional queries on FOVs utilizing our indexes. Our experiments with a real-world dataset and a large synthetic video dataset (over 30 years worth of videos) demonstrate the scalability and efficiency of our proposed indexes and search algorithms and their superiority over the competitors.
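
The abstract models each frame's coverage as a Field-Of-View (FOV) built from the camera's location, compass orientation, and view distance. As a concrete illustration, the following is a minimal Python sketch of such a sector-shaped FOV and a point-coverage test of the kind a range query would apply as a refinement step after index filtering. The class, field names, and parameters here are illustrative assumptions for exposition only; they are not the paper's actual index structures or API.

```python
import math
from dataclasses import dataclass

@dataclass
class FOV:
    """One video frame's field of view, modeled as a circular sector.
    Field names are illustrative, not the paper's actual schema."""
    lat: float    # camera latitude  (degrees)
    lon: float    # camera longitude (degrees)
    theta: float  # compass orientation of the optical axis (degrees, 0 = north)
    alpha: float  # viewable angle of the camera lens (degrees)
    R: float      # visible distance (meters)

def _bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def _distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters."""
    r_earth = 6_371_000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

def covers_point(fov: FOV, lat: float, lon: float) -> bool:
    """Refinement test: the FOV covers the point iff the point lies within the
    visible distance and within half the viewable angle of the orientation."""
    if _distance_m(fov.lat, fov.lon, lat, lon) > fov.R:
        return False
    bearing = _bearing_deg(fov.lat, fov.lon, lat, lon)
    diff = abs((bearing - fov.theta + 180.0) % 360.0 - 180.0)  # smallest angular difference
    return diff <= fov.alpha / 2.0

# Example: a frame shot facing roughly north-east with a 60-degree lens
frame = FOV(lat=34.0205, lon=-118.2856, theta=45.0, alpha=60.0, R=250.0)
print(covers_point(frame, 34.0215, -118.2846))  # a point ~140 m to the NE -> True
```

An index over camera locations alone would retrieve many FOVs whose orientations point away from the query region; the paper's contribution is to fold orientation and view-distance into the index-level filtering itself, so that far fewer candidates ever reach a per-FOV refinement test like the one sketched above.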