Video-based affinity group detection using trajectories of multiple subjects

Abdullah Al Masum, Mahady Hasan Rafy, S. M. Mahbubur Rahman
DOI: 10.1109/ICECE.2014.7026834
Published in: 8th International Conference on Electrical and Computer Engineering
Publication date: 2014-12-01
Citations: 2

Abstract

Affinity detection has been largely motivated by the increasing interest in modelling the social behavior of humans. This paper presents a supervised learning method for affinity detection which is based on an inference obtained from tracking trajectories of the human subjects captured in video sequences. In particular, the proxemic cues of group detection such as the pair-wise similarity of the positional and translational measurements of the tracked people are used in the well-known principal component analysis-based feature extraction process. The existence or non-existence of pair-wise affinities is recognized using the nearest neighbor detector applied on the proposed features and the majority voting-based fusion of decisions. Experiments conducted on surveillance video captured in diverse-type of movements of the subjects show favorable results in terms of accuracy of detecting affinities when compared with the ground truth.
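The pipeline described in the abstract starts from per-frame pair-wise proxemic cues computed on the tracked trajectories. A minimal sketch of such feature extraction is shown below; the trajectory format and the three specific measurements (inter-person distance, speed difference, velocity difference) are illustrative assumptions, not the paper's exact definitions, and the PCA-based dimensionality reduction that follows in the paper is omitted here.

```python
import math

def pairwise_features(traj_a, traj_b):
    """Per-frame proxemic cues for one pair of tracked subjects.

    traj_a, traj_b: lists of (x, y) image-plane positions, one per frame.
    Returns one feature vector per frame transition:
      [inter-person distance, speed difference, velocity-vector difference].
    These three cues are illustrative stand-ins for the paper's positional
    and translational measurements.
    """
    feats = []
    for t in range(1, min(len(traj_a), len(traj_b))):
        ax, ay = traj_a[t]
        bx, by = traj_b[t]
        # Positional cue: Euclidean distance between the two subjects.
        dist = math.hypot(ax - bx, ay - by)
        # Translational cues: per-frame displacement of each subject.
        vax, vay = ax - traj_a[t - 1][0], ay - traj_a[t - 1][1]
        vbx, vby = bx - traj_b[t - 1][0], by - traj_b[t - 1][1]
        dspeed = abs(math.hypot(vax, vay) - math.hypot(vbx, vby))
        # Magnitude of the difference of the two velocity vectors
        # (small when the pair moves in the same direction at the same pace).
        dvel = math.hypot(vax - vbx, vay - vby)
        feats.append([dist, dspeed, dvel])
    return feats
```

For two people walking side by side, all three cues stay small and stable across frames, which is exactly the regularity a downstream classifier can exploit.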
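The final stage the abstract describes — a nearest-neighbor decision per frame, fused by majority voting — can be sketched as follows. This is a simplified 1-NN on raw feature vectors under assumed labels (1 = affinity, 0 = no affinity); in the paper the detector operates on PCA-extracted features, which this sketch skips.

```python
import math
from collections import Counter

def nn_label(feat, train):
    """1-nearest-neighbour decision for one per-frame feature vector.

    train: list of (feature_vector, label) pairs, label in {0, 1}.
    Returns the label of the closest training example (Euclidean distance).
    """
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return min(train, key=lambda fv: dist(feat, fv[0]))[1]

def detect_affinity(pair_feats, train):
    """Fuse the per-frame 1-NN decisions by majority voting:
    the pair is declared an affinity group iff most frames vote 1."""
    votes = [nn_label(f, train) for f in pair_feats]
    return Counter(votes).most_common(1)[0][0]
```

Voting over all frames makes the decision robust to occasional tracking noise: a few frames where the pair drifts apart are outvoted by the majority where their motion stays similar.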