Feature extraction for human action classification using adaptive key frame interval
Kanokphan Lertniphonphan, S. Aramvith, T. Chalidabhongse
Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, December 2014
DOI: 10.1109/APSIPA.2014.7041766
Citations: 4
Abstract
Human actions in video vary in both the spatial and temporal domains, which makes action classification difficult. Because the human body is articulated, the amount of point-to-point movement within an action is not constant; over one cycle it follows a roughly bell-shaped profile. In this paper, key frames are detected to mark the starting and ending points of an action cycle. The time between key frames determines the window length for feature extraction in the time domain. Since cycle lengths vary, the key frame interval also varies and adapts to the performer and the action. A local orientation histogram of the Key Pose Energy Image (KPEI) and the Motion History Image (MHI) is constructed over this interval. Experimental results on the WEIZMANN dataset demonstrate that features computed within the adaptive key frame interval can effectively classify actions.
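To make the feature pipeline concrete, the sketch below illustrates the two ingredients named in the abstract: a Motion History Image accumulated over one key-frame interval, and a local (grid-cell) orientation histogram of the resulting image. This is only an illustrative sketch, not the authors' implementation: the function names, the simple frame-differencing motion test, the grid size, and the bin count are all assumptions introduced here for clarity.

```python
import numpy as np

def motion_history_image(frames, tau=None, diff_thresh=30):
    """Build a Motion History Image (MHI) over one key-frame interval.

    frames: list of 2-D uint8 grayscale frames between two detected key frames.
    tau: decay duration; defaults to the interval length, so the temporal
         window adapts to the detected action cycle (illustrative assumption).
    """
    tau = tau if tau is not None else len(frames) - 1
    mhi = np.zeros(frames[0].shape, dtype=np.float32)
    for t in range(1, len(frames)):
        motion = np.abs(frames[t].astype(np.int16) -
                        frames[t - 1].astype(np.int16)) > diff_thresh
        # Pixels where motion is detected are set to the maximum duration tau;
        # all other pixels decay by one step, clipped at zero.
        mhi = np.where(motion, float(tau), np.maximum(mhi - 1.0, 0.0))
    return mhi / max(tau, 1)  # normalise to [0, 1]

def local_orientation_histogram(image, grid=(4, 4), n_bins=9):
    """Gradient-orientation histograms on a coarse spatial grid,
    concatenated into one descriptor (a HOG-like local orientation feature)."""
    gy, gx = np.gradient(image.astype(np.float32))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    h, w = image.shape
    cell_h, cell_w = h // grid[0], w // grid[1]
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            m = mag[i * cell_h:(i + 1) * cell_h, j * cell_w:(j + 1) * cell_w]
            a = ang[i * cell_h:(i + 1) * cell_h, j * cell_w:(j + 1) * cell_w]
            # Magnitude-weighted orientation histogram for this cell.
            hist, _ = np.histogram(a, bins=n_bins, range=(0, np.pi), weights=m)
            feats.append(hist / (hist.sum() + 1e-8))
    return np.concatenate(feats)
```

In the paper's setting, the same kind of local orientation histogram would also be computed on the Key Pose Energy Image, and the two descriptors combined per adaptive key-frame interval before classification; that combination step is omitted here.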