Recognition of Human Walking Motion Using a Wearable Camera

Zi-yang Liu, Tomoyuki Kurosaki, J. Tan
DOI: 10.1145/3581807.3581809
Published in: Proceedings of the 2022 11th International Conference on Computing and Pattern Recognition, 2022-11-17

Abstract

In recent years, computer vision technology has attracted more attention than ever and is being applied in a wide range of fields. Among its applications, automatic recognition of human motion is particularly important, since it enables automatic detection of suspicious persons and automatic monitoring of elderly people. Research on human motion recognition using computer vision techniques has therefore been actively conducted in Japan and overseas. However, most conventional research on human motion recognition employs video of a human motion captured by an external fixed camera; there has been no research on human motion recognition that uses video of the surrounding scenery provided by a wearable camera. This paper proposes a method of recognizing a human motion by estimating the posture change of a wearable camera attached to a walking person from the motion of the scenery in the video the camera provides, and by analyzing the change of the person's trunk derived from that posture change. In the method, AKAZE is applied to the images to detect feature points and find their correspondences. The 5-point algorithm is used to estimate the epipolar geometry constraint and an essential matrix, which yields the camera's relative motion. The change of the relative camera motion is then used to analyze the shape of the human trunk. The analyzed results, i.e., walking motion features, are finally fed into an SVM to identify the motion. In the experiment, five types of walking motion were captured by a wearable camera worn by five subjects. The accuracy of human motion recognition was 80%. More precise feature point extraction, more exact motion estimation, and consideration of the variety of human walking motions are needed to improve the proposed technique.