A method for recognizing living activities in homes using positioning sensor and power meters

Kenki Ueda, M. Tamai, K. Yasumoto
DOI: 10.1109/PERCOMW.2015.7134062
Published in: 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)
Publication date: 2015-03-23
Citations: 35

Abstract

To realize smart homes with sophisticated services, such as energy-saving context-aware appliance control and elderly monitoring systems, automatic recognition of human activities in homes is essential. Several daily activity recognition methods have been proposed so far, but most of them still have unresolved issues, such as high deployment cost due to the large number of sensors required and/or violation of users' sense of privacy due to the use of cameras. Many activity recognition methods using wearable sensors have also been proposed, but they focus on simple human activities such as walking and running, and are difficult to apply to the various complex activities performed in homes. In this paper, we propose a machine-learning-based method for recognizing various daily activities in homes using only positioning sensors carried by inhabitants and power meters attached to appliances. To efficiently collect training data for constructing a recognition model, we have developed a tool that visualizes a time series of sensor data and lets a user attach labels (activity types) to specified time intervals of the data. We obtain training samples by dividing the extracted training data into fixed-length time windows and computing, for each sample, the position and power consumption averaged over the window as feature values. The obtained samples are then used to construct an activity recognition model by machine learning. Targeting six activities (watching TV, taking a meal, cooking, reading a book, washing dishes, and other), we applied the proposed method to sensor data collected in a smart home testbed. As a result, our method recognized the six activities with a precision of about 85% and a recall of about 82%.
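The windowing and feature-extraction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `windowed_features`, the toy (x, y, tv_power) channels, and the 3-second window are assumptions made for the example. Each fixed-length time window of raw readings is reduced to one feature vector of per-channel averages, which would then be labeled with an activity type and fed to a classifier.

```python
# Sketch (not the authors' code): divide time-stamped sensor readings into
# fixed-length windows and average each channel over the window, producing
# one feature vector (training sample) per window.

def windowed_features(timestamps, readings, window):
    """timestamps: sample times; readings: one tuple of channel values per
    timestamp (e.g. position x, y and appliance power). Returns one
    per-channel-averaged feature vector per fixed-length time window."""
    if not timestamps:
        return []
    start = timestamps[0]
    buckets = {}
    for t, row in zip(timestamps, readings):
        idx = int((t - start) // window)          # which window this sample falls in
        buckets.setdefault(idx, []).append(row)
    features = []
    for idx in sorted(buckets):
        rows = buckets[idx]
        # transpose rows into channels and average each channel
        features.append(tuple(sum(col) / len(col) for col in zip(*rows)))
    return features

# Toy example: (x, y, tv_power) sampled once per second, 3-second windows.
ts = [0, 1, 2, 3, 4, 5]
rows = [(1.0, 2.0, 100.0), (1.0, 2.0, 110.0), (1.0, 2.0, 120.0),
        (5.0, 5.0, 0.0), (5.0, 5.0, 0.0), (5.0, 5.0, 0.0)]
feats = windowed_features(ts, rows, window=3)
print(feats)  # one averaged (x, y, power) vector per window
```

In the paper's pipeline, each such feature vector would carry the activity label assigned via the visualization tool, and the labeled vectors would train a standard supervised classifier.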