Feature extraction for human action classification using adaptive key frame interval

Kanokphan Lertniphonphan, S. Aramvith, T. Chalidabhongse
DOI: 10.1109/APSIPA.2014.7041766
Published in: Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, December 2014
Citations: 4

Abstract

Human actions in video vary in both the spatial and temporal domains, which makes action classification difficult. Because of the articulated nature of the human body, the amount of frame-to-frame movement is not constant and typically follows a bell-shaped profile over an action cycle. In this paper, key frames are detected to mark the starting and ending points of an action cycle. The time between key frames determines the window length for feature extraction in the temporal domain. Since cycle lengths vary, the key frame interval also varies and adapts to the performer and the action. A local orientation histogram of the Key Pose Energy Image (KPEI) and the Motion History Image (MHI) is constructed over this interval. Experimental results on the WEIZMANN dataset demonstrate that features extracted within the adaptive key frame interval classify actions effectively.
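The Motion History Image mentioned in the abstract is a standard temporal template: each pixel stores a value that is refreshed when motion occurs there and decays otherwise, so brighter pixels indicate more recent movement. A minimal sketch of the update rule, assuming a binary per-frame motion mask (the `update_mhi` helper, its parameters, and the toy input are illustrative and not taken from the paper):

```python
import numpy as np

def update_mhi(mhi, motion_mask, timestamp, duration):
    """Update a Motion History Image.

    Pixels where motion_mask is True are stamped with the current
    timestamp; pixels whose stored timestamp has aged beyond
    `duration` are cleared to zero.
    """
    mhi = np.where(motion_mask, float(timestamp), mhi)
    mhi[mhi < timestamp - duration] = 0.0
    return mhi

# Toy example: a single moving pixel sweeping left to right over 3 frames.
mhi = np.zeros((1, 4))
for t in range(1, 4):
    mask = np.zeros((1, 4), dtype=bool)
    mask[0, t] = True  # motion detected at column t in frame t
    mhi = update_mhi(mhi, mask, timestamp=t, duration=3)

# The MHI now encodes recency of motion along the sweep:
# mhi.tolist() == [[0.0, 1.0, 2.0, 3.0]]
```

In the paper's setting, the `duration` window would correspond to the adaptive key frame interval rather than a fixed constant, so the template spans exactly one action cycle.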