Lili Tao, T. Volonakis, Bo Tan, Ziqi Zhang, Yanguo Jing, Melvyn L. Smith
{"title":"3D Convolutional Neural network for Home Monitoring using Low Resolution Thermal-sensor Array","authors":"Lili Tao, T. Volonakis, Bo Tan, Ziqi. Zhang, Yanguo Jing, Melvyn L. Smith","doi":"10.1049/CP.2019.0100","DOIUrl":null,"url":null,"abstract":"The recognition of daily actions, such as walking, sitting or standing, in the home is informative for assisted living, smart homes and general health care. A variety of actions in complex scenes can be recognised using visual information. However cameras succumb to privacy concerns. In this paper, we present a home activity recognition system using an 8×8 infared sensor \narray. This low spatial resolution retains user privacy, but is still a powerful representation of actions in a scene. Actions are recognised using a 3D convolutional neural network, extracting not only spatial but temporal information from video sequences. Experimental results obtained from a publicly available dataset Infra-ADL2018 demonstrate a better performance of the proposed approach compared to the state-of-the-art. We show that the sensor is considered better at detecting the occurrence of falls and activities of daily living. Our method achieves an overall accuracy of 97.22% across 7 actions with a \nfall detection sensitivity of 100% and specificity of 99.31%.","PeriodicalId":331745,"journal":{"name":"3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1049/CP.2019.0100","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
The recognition of daily actions, such as walking, sitting or standing, in the home is informative for assisted living, smart homes and general health care. A variety of actions in complex scenes can be recognised using visual information; however, cameras raise privacy concerns. In this paper, we present a home activity recognition system using an 8×8 infrared sensor array. This low spatial resolution preserves user privacy but still provides a powerful representation of actions in a scene. Actions are recognised using a 3D convolutional neural network, which extracts not only spatial but also temporal information from video sequences. Experimental results on the publicly available Infra-ADL2018 dataset demonstrate that the proposed approach outperforms the state of the art. We show that the sensor is effective at detecting falls and activities of daily living. Our method achieves an overall accuracy of 97.22% across 7 actions, with a fall detection sensitivity of 100% and specificity of 99.31%.
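
To make the idea concrete, the following is a minimal PyTorch sketch of a 3D CNN that classifies clips of 8×8 thermal frames into 7 action classes. It is not the architecture reported in the paper: the layer widths, kernel sizes and clip length (16 frames) are illustrative assumptions only.

# Minimal sketch of a 3D CNN for low-resolution thermal clips (assumed layout,
# not the authors' exact network).
import torch
import torch.nn as nn

class Thermal3DCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        # Input shape: (batch, 1, frames, 8, 8) -- single-channel 8x8 thermal frames.
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),   # spatio-temporal filters
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=2),                   # halve time and space
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),                       # global spatio-temporal pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Usage: classify a batch of 4 clips, each 16 frames of 8x8 sensor readings.
model = Thermal3DCNN()
clips = torch.randn(4, 1, 16, 8, 8)
logits = model(clips)   # shape (4, 7): one score per action class
print(logits.shape)

The key design point illustrated here is that the convolutions are three-dimensional, so each filter spans both the spatial extent of the 8×8 frame and a window of consecutive frames, capturing motion cues as well as appearance.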