Learning Dense Optical-Flow Trajectory Patterns for Video Object Extraction

Wang-Chou Lu, Y. Wang, Chu-Song Chen
{"title":"Learning Dense Optical-Flow Trajectory Patterns for Video Object Extraction","authors":"Wang-Chou Lu, Y. Wang, Chu-Song Chen","doi":"10.1109/AVSS.2010.79","DOIUrl":null,"url":null,"abstract":"We proposes an unsupervised method to address videoobject extraction (VOE) in uncontrolled videos, i.e. videoscaptured by low-resolution and freely moving cameras. Weadvocate the use of dense optical-flow trajectories (DOTs),which are obtained by propagating the optical flow informationat the pixel level. Therefore, no interest point extractionis required in our framework. To integrate colorand and shape information of moving objects, we groupthe DOTs at the super-pixel level to extract co-motion regions,and use the associated pyramid histogram of orientedgradients (PHOG) descriptors to extract objects of interestacross video frames. Our approach for VOE is easy to implement,and the use of DOTs for both motion segmentationand object tracking is more robust than existing trajectorybasedmethods. Experiments on several video sequencesexhibit the feasibility of our proposed VOE framework.","PeriodicalId":415758,"journal":{"name":"2010 7th IEEE International Conference on Advanced Video and Signal Based Surveillance","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 7th IEEE International Conference on Advanced Video and Signal Based Surveillance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AVSS.2010.79","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 15

Abstract

We propose an unsupervised method to address video object extraction (VOE) in uncontrolled videos, i.e., videos captured by low-resolution and freely moving cameras. We advocate the use of dense optical-flow trajectories (DOTs), which are obtained by propagating the optical flow information at the pixel level; therefore, no interest point extraction is required in our framework. To integrate color and shape information of moving objects, we group the DOTs at the super-pixel level to extract co-motion regions, and use the associated pyramid histogram of oriented gradients (PHOG) descriptors to extract objects of interest across video frames. Our approach to VOE is easy to implement, and the use of DOTs for both motion segmentation and object tracking is more robust than existing trajectory-based methods. Experiments on several video sequences demonstrate the feasibility of our proposed VOE framework.
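The core idea of a dense optical-flow trajectory — chaining per-frame flow vectors so that every pixel, not just detected interest points, traces a path through the video — can be sketched as follows. This is a minimal illustration of the general propagation step, not the paper's implementation; the function name `propagate_trajectories`, the nearest-neighbour flow sampling, and the toy uniform-motion flow fields are all assumptions for the sake of a runnable example.

```python
import numpy as np

def propagate_trajectories(flows, h, w):
    """Propagate a dense grid of points through a sequence of flow fields.

    flows: list of (h, w, 2) arrays; flows[t][y, x] = (dx, dy) from frame t to t+1.
    Returns an array of shape (T+1, h*w, 2) holding each pixel's (x, y) trajectory.
    """
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float64)  # (N, 2)
    traj = [pts.copy()]
    for flow in flows:
        # sample the flow field at each point's current (rounded, clipped) position
        xi = np.clip(np.round(pts[:, 0]).astype(int), 0, w - 1)
        yi = np.clip(np.round(pts[:, 1]).astype(int), 0, h - 1)
        pts = pts + flow[yi, xi]  # advect every point by its local flow vector
        traj.append(pts.copy())
    return np.stack(traj)  # (T+1, N, 2)

# toy example: two frames of uniform rightward motion (1 px per frame)
h, w = 4, 4
uniform = np.zeros((h, w, 2))
uniform[..., 0] = 1.0
traj = propagate_trajectories([uniform, uniform], h, w)
print(traj[-1][0])  # → [2. 0.]  (pixel at (0, 0) has drifted to x = 2)
```

In practice the per-frame flow fields would come from a dense optical-flow estimator (e.g. OpenCV's `calcOpticalFlowFarneback`), and the resulting per-pixel trajectories could then be grouped over super-pixels to find co-motion regions, as the abstract describes.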