Moving Object Detection and Tracking Based on the Contour Extraction and Centroid Representation

N. M, Sriharsha K. V., V. A.
{"title":"Moving Object Detection and Tracking Based on the Contour Extraction and Centroid Representation","authors":"N. M, Sriharsha K. V., V. A.","doi":"10.4018/978-1-5225-7368-5.ch012","DOIUrl":null,"url":null,"abstract":"This chapter presents a novel approach for moving object detection and tracking based on contour extraction and centroid representation (CECR). Firstly, two consecutive frames are read from the video, and they are converted into grayscale. Next, the absolute difference is calculated between them and the result frame is converted into binary by applying gray threshold technique. The binary frame is segmented using contour extraction algorithm. The centroid representation is used for motion tracking. In the second stage of experiment, initially object is detected by using CECR and motion of each track is estimated by Kalman filter. Experimental results show that the proposed method can robustly detect and track the moving object.","PeriodicalId":52560,"journal":{"name":"Foundations and Trends in Human-Computer Interaction","volume":"06 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations and Trends in Human-Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/978-1-5225-7368-5.ch012","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Computer Science","Score":null,"Total":0}
引用次数: 0

Abstract

This chapter presents a novel approach to moving object detection and tracking based on contour extraction and centroid representation (CECR). First, two consecutive frames are read from the video and converted to grayscale. Next, the absolute difference between them is computed, and the resulting frame is converted to binary using a gray-level thresholding technique. The binary frame is then segmented with a contour extraction algorithm, and the centroid representation is used for motion tracking. In the second stage of the experiment, the object is first detected using CECR, and the motion of each track is estimated with a Kalman filter. Experimental results show that the proposed method can robustly detect and track moving objects.
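The abstract describes a concrete pipeline: frame differencing, grayscale thresholding, contour extraction, centroid computation, and Kalman-filter tracking. The following is a minimal Python/OpenCV sketch of that pipeline, not the chapter's implementation: the input file name `input.avi`, the Otsu threshold choice, the minimum contour area, and the Kalman noise covariances are illustrative assumptions, and the loop below maintains a single filter rather than one per track as the chapter's second stage implies.

```python
import cv2
import numpy as np

def detect_moving_objects(frame_prev, frame_curr, min_area=100):
    """CECR-style detection: difference two consecutive frames, threshold the
    result to binary, extract contours, and return each contour's centroid."""
    g1 = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g1, g2)                                  # absolute frame difference
    _, binary = cv2.threshold(diff, 0, 255,
                              cv2.THRESH_BINARY | cv2.THRESH_OTSU)  # gray-level thresholding (Otsu assumed)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)     # contour extraction (OpenCV >= 4 return signature)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:                       # drop small noise blobs (threshold is a guess)
            continue
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def make_kalman_tracker():
    """Constant-velocity Kalman filter over state (x, y, vx, vy) with centroid
    measurements (x, y); the noise covariances are placeholder values."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.errorCovPost = np.eye(4, dtype=np.float32)
    return kf

if __name__ == "__main__":
    cap = cv2.VideoCapture("input.avi")                         # hypothetical input video
    ok, prev = cap.read()
    kf = make_kalman_tracker()
    while ok:
        ok, curr = cap.read()
        if not ok:
            break
        for (cx, cy) in detect_moving_objects(prev, curr):
            kf.predict()                                        # predict the track's next position
            kf.correct(np.array([[cx], [cy]], np.float32))      # update with the measured centroid
        prev = curr
    cap.release()
```

A fuller implementation along the chapter's lines would associate detections with existing tracks (for example by nearest centroid) and keep one Kalman filter per track, rather than feeding every centroid to a single filter as this sketch does.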