Subject Independent Human Activity Recognition with Foot IMU Data

Xiaotong Zhang, Jin Zhang
{"title":"Subject Independent Human Activity Recognition with Foot IMU Data","authors":"Xiaotong Zhang, Jin Zhang","doi":"10.1109/MSN48538.2019.00054","DOIUrl":null,"url":null,"abstract":"Human activity recognition is a very active research on pervasive computing and mobile health application. Many human activity systems based on inertial measurement unit (IMU) sensor data were proposed in the past few years. These systems mainly use IMU sensor placed on he torso and limbs to collect data and utilize supervised machine learning algorithms on sensor data. One main issue of these systems is that wearing multiple on-body IMU sensors may bring inconvenience to users' daily life. The other issue of these exiting methods is that an activity recognition model that is trained on a specific subject does not work well when being applied to predict another subject's activities since IMU activity data always carry information that is specific to the human subject who conducts the activities. In our work, inspired by the principle of domain adaption, we proposed a new deep-learning activity recognition model based on an adversarial network which can remove the subject-specific information within the IMU activity data and extract subject-independent features shared by the data collected on different subjects. We also for the first time use data collected from insole based IMU sensors on 8 participants for 5 common activities to build a new real world human activity dataset which can minimize the inconvenience for users to wear. We conducted experiments with our new real-world dataset. Results show that our subject independent activity recognition model outperforms state-of-art supervised learning techniques and eliminates the effects of individual differences between subjects successfully. The average recognition accuracy under the leave-one-out (L1O) condition achieves 99.0% which is higher than the performance of traditional human activity recognition system based on CNNs.","PeriodicalId":368318,"journal":{"name":"2019 15th International Conference on Mobile Ad-Hoc and Sensor Networks (MSN)","volume":"2 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 15th International Conference on Mobile Ad-Hoc and Sensor Networks (MSN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MSN48538.2019.00054","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Human activity recognition is a very active research area in pervasive computing and mobile health applications. Many human activity recognition systems based on inertial measurement unit (IMU) sensor data have been proposed in the past few years. These systems mainly use IMU sensors placed on the torso and limbs to collect data and apply supervised machine learning algorithms to the sensor data. One main issue with these systems is that wearing multiple on-body IMU sensors can be inconvenient in users' daily lives. The other issue with these existing methods is that an activity recognition model trained on a specific subject does not work well when applied to predict another subject's activities, since IMU activity data always carry information specific to the human subject who performs the activities. In our work, inspired by the principle of domain adaptation, we propose a new deep-learning activity recognition model based on an adversarial network, which removes the subject-specific information within the IMU activity data and extracts subject-independent features shared by the data collected from different subjects. We also, for the first time, use data collected from insole-based IMU sensors on 8 participants performing 5 common activities to build a new real-world human activity dataset that minimizes the inconvenience of wearing the sensors. We conducted experiments with this new real-world dataset. The results show that our subject-independent activity recognition model outperforms state-of-the-art supervised learning techniques and successfully eliminates the effects of individual differences between subjects. The average recognition accuracy under the leave-one-out (L1O) condition reaches 99.0%, which is higher than the performance of traditional human activity recognition systems based on CNNs.
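
As a rough illustration of the adversarial, subject-independent idea summarized above, the sketch below shows a domain-adversarial setup in PyTorch: a shared 1-D CNN feature extractor feeds both an activity classifier and a subject discriminator, and a gradient-reversal layer flips the discriminator's gradient so that the shared features are pushed to discard subject-specific information. The layer sizes, window length, channel count, and class/variable names are illustrative assumptions and do not reproduce the paper's exact architecture or hyperparameters.

```python
# Minimal sketch of a domain-adversarial HAR model (assumed architecture,
# not the authors' exact network): shared feature extractor, activity head,
# and subject head trained through a gradient-reversal layer.
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity on the forward pass; multiplies the gradient by -lambda on backward."""

    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None


class SubjectIndependentHAR(nn.Module):
    def __init__(self, n_channels=6, n_activities=5, n_subjects=8, lamb=1.0):
        super().__init__()
        self.lamb = lamb
        # Shared feature extractor over a window of foot-IMU samples
        # (e.g. 3-axis accelerometer + 3-axis gyroscope).
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        # Predicts which of the 5 activities the window belongs to.
        self.activity_head = nn.Linear(64, n_activities)
        # Tries to predict which of the 8 subjects produced the window;
        # the reversed gradient pushes the features to hide that information.
        self.subject_head = nn.Linear(64, n_subjects)

    def forward(self, x):
        z = self.features(x)                      # x: (batch, channels, time)
        activity_logits = self.activity_head(z)
        z_rev = GradientReversal.apply(z, self.lamb)
        subject_logits = self.subject_head(z_rev)
        return activity_logits, subject_logits
```

Training such a model would minimize the sum of the two cross-entropy losses; because the subject branch only sends reversed gradients back into the feature extractor, lowering the total loss makes the features informative about the activity but uninformative about who performed it. The leave-one-out evaluation mentioned in the abstract could be set up as a leave-one-subject-out split, for example with scikit-learn's LeaveOneGroupOut (again an assumed setup with hypothetical array names):

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

# Hypothetical data layout: 800 windows of 6 IMU channels x 128 samples,
# with one activity label and one subject id (0..7) per window.
X = np.zeros((800, 6, 128), dtype=np.float32)
y = np.zeros(800, dtype=np.int64)
subjects = np.repeat(np.arange(8), 100)

for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    pass  # train on the 7 remaining subjects, evaluate on the held-out one
```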