Ambient assisted living (AAL) has benefitted tremendously from the near-ubiquity of powerful smartphones and the very high data rates available over broadband and mobile networks. However, these remain beyond the reach of many users. IoT systems offer the potential to extend some of these benefits to disadvantaged users. Such solutions will need to secure personal health information and provide a sufficient quality of service even when operating over constrained user devices and communication links.
"Improving access to healthcare in rural communities - IoT as part of the solution", I. K. Poyner and R. Sherratt, 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019), 25 March 2019. DOI: 10.1049/CP.2019.0104
N. Kobayashi, M. Iwasaki, Y. Ito, Ann-Louise Lindborg, Takumi Ohashi, Miki Saijo
Recent years have seen a growing need for products with high usability. Conventionally, researchers carrying out user tests have focused only on the interaction between user and device. We have premised our research, however, on the assumption that the use of robotic assistive devices involves two users: a frail person, such as an elderly person who needs care, and the caregiver who provides support. We propose a methodology for video analysis of user tests that focuses on the perspectives of these two users. We used behaviour analysis to compare two videos: one showing a frail elderly person being fed by a caregiver, and the other showing the same person eating on their own with the help of a robotic assistive device. Based on these videos, we found that the robotic assistive device reduces the burden on caregivers and increases the quality of life (QOL) of the elderly. We also found that this video-analysis methodology is useful for pinpointing improvements that can be made to the device.
"Device Improvement by Video Analysis of User Tests: Case Study of a Robotic Assistive Device for the Frail Elderly", 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0099
This paper proposes a new method to protect privacy while retaining the ability to accurately recognise activities of daily living in video-based monitoring for ambient assisted living applications. The proposed method obfuscates the human appearance by modelling the temporal saliency in the monitoring video sequences. It mimics the functionality of neuromorphic cameras, exploiting temporal saliency to generate a mask that anonymises the human appearance. Since the anonymising masks encapsulate the temporal saliency with respect to motion in the sequence, they provide a good basis for activity recognition, which is achieved by computing HOG features on the privacy masks. The proposed method achieves excellent anonymisation performance as measured by cross-correlation. In terms of activity recognition, it improves accuracy by 5.6% and 5.4% over other anonymisation methods on the Weizmann and DHA datasets, respectively.
"Privacy Protected Recognition of Activities of Daily Living in Video", S. Al-Obaidi and Charith Abhayaratne, 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0101
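The abstract above describes generating an anonymising mask from temporal saliency and then computing HOG descriptors on the mask rather than on raw pixels. As a rough illustration only (the paper's actual saliency model mimics neuromorphic cameras; plain frame differencing, the threshold, and the HOG parameters below are stand-ins, not the authors' settings):

```python
import numpy as np
from skimage.feature import hog  # scikit-image

def saliency_mask(prev_frame, frame, thresh=15):
    """Binary mask of temporally salient (changed) pixels via frame
    differencing -- a crude stand-in for the paper's saliency model."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def masked_hog(prev_frame, frame):
    """HOG descriptor computed on the anonymising mask instead of raw pixels,
    so appearance is discarded but motion structure is kept."""
    mask = saliency_mask(prev_frame, frame)
    return hog(mask * 255, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# toy frames: a bright 10x10 square shifts two pixels to the right
f0 = np.zeros((64, 64), dtype=np.uint8); f0[20:30, 20:30] = 200
f1 = np.zeros((64, 64), dtype=np.uint8); f1[20:30, 22:32] = 200
features = masked_hog(f0, f1)
```

On the toy frames, only the pixels that changed between frames survive into the mask, so the descriptor encodes motion while the subject's appearance never reaches the feature stage.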
Physiological changes in old age that affect mobility, along with losses such as widowhood and the shrinking of social circles, may lead to isolation and to lower levels of emotional well-being. Remaining engaged in activity and in society in later life is associated with emotional well-being for many older adults. This paper presents the results of applying an easy-to-use, accessible technology-based intervention aimed at promoting social connectivity and providing entertainment content for older adults. Forty people between the ages of 77 and 100 participated in the study. The system's hardware was installed on each participant's television set, enabling them to participate in Senior Center activities from afar, connect with family and friends through video calls, share photos, and watch curated video content. The system used in this study was developed and deployed by Uniper-Care Technologies. Data regarding emotional well-being were collected at two time points: the first on the day the system was installed in each participant's home, and the second four to five weeks later. Results show that participants successfully adopted the system and used it as intended. In addition, there was a significant decrease in loneliness and depression, as well as an increase in emotional well-being and social engagement.
"Leveraging Emotional Wellbeing and Social Engagement of the Oldest Old by Using Advanced Communication Technologies: A Pilot Study Using Uniper-Care's Technology", M. Isaacson, I. Cohen and C. Shpigelman, 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0102
D. Năstac, O. Arsene, M. Drăgoi, I. Stanciu, I. Mocanu
"An AAL scenario involving automatic data collection and robotic manipulation", 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0105
Lili Tao, T. Volonakis, Bo Tan, Ziqi. Zhang, Yanguo Jing, Melvyn L. Smith
The recognition of daily actions, such as walking, sitting or standing, in the home is informative for assisted living, smart homes and general health care. A variety of actions in complex scenes can be recognised using visual information; however, cameras raise privacy concerns. In this paper, we present a home activity recognition system using an 8×8 infrared sensor array. This low spatial resolution preserves user privacy but still provides a powerful representation of actions in a scene. Actions are recognised using a 3D convolutional neural network, which extracts not only spatial but also temporal information from video sequences. Experimental results on the publicly available Infra-ADL2018 dataset demonstrate better performance than the state-of-the-art. We show that the sensor is effective at detecting the occurrence of falls and activities of daily living. Our method achieves an overall accuracy of 97.22% across 7 actions, with a fall detection sensitivity of 100% and specificity of 99.31%.
"3D Convolutional Neural network for Home Monitoring using Low Resolution Thermal-sensor Array", 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0100
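The abstract reports a 3D CNN over sequences of 8×8 thermal frames. Its core operation, a convolution that spans time as well as space, can be sketched in plain NumPy; the single hand-crafted kernel here stands in for the network's learned filters, and the clip is an invented example, not Infra-ADL2018 data:

```python
import numpy as np

def conv3d(clip, kernel):
    """Valid 3-D convolution (correlation form, as in deep-learning
    frameworks) over a (time, height, width) clip."""
    T, H, W = clip.shape
    t, h, w = kernel.shape
    out = np.zeros((T - t + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(clip[i:i + t, j:j + h, k:k + w] * kernel)
    return out

# a 10-frame clip of 8x8 "thermal" frames: a warm pixel drifts one row
# downward per frame (a crude proxy for a person moving through the scene)
clip = np.zeros((10, 8, 8))
for f in range(10):
    clip[f, min(f, 7), 3] = 1.0

# a 2-frame kernel that fires when a warm pixel appears one row lower
# in the next frame, i.e. a spatio-temporal "downward motion" detector
kernel = np.zeros((2, 3, 3))
kernel[0, 0, 1] = 1.0
kernel[1, 1, 1] = 1.0
resp = conv3d(clip, kernel)
```

The response peaks exactly where the warm pixel moves down between consecutive frames: patterns of this spatio-temporal kind are what the 3D CNN's filters learn from data rather than being hand-designed.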
Human activity recognition using smartphones and wearables is a field attracting considerable attention. Although a plethora of systems have been proposed in the literature, comparing their results is not easy: in the absence of a universal evaluation framework, direct comparison is not feasible. This paper compares state-of-the-art classifiers already used for mobile human activity recognition under the same conditions. In addition, an Android application was developed, and the method yielding the best results was evaluated in the real world in a semi-supervised environment. Results showed that deep learning techniques perform better and can be transferred to a phone without many modifications.
"Evaluating state-of-the-art classifiers for human activity recognition using smartphones", A. Lentzas, A. Agapitos and D. Vrakas, 3rd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2019). DOI: 10.1049/CP.2019.0098
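The abstract's point is that a fair comparison requires identical conditions for every classifier: same data, same folds, same preprocessing. A minimal scikit-learn sketch of such a like-for-like comparison (the dataset is synthetic and the model list and hyperparameters are illustrative, not those of the paper):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# stand-in for windowed smartphone sensor features (6 activity classes)
X, y = make_classification(n_samples=600, n_features=60, n_informative=20,
                           n_classes=6, random_state=0)

classifiers = {
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                         random_state=0),
}

# identical data, folds, and scaling for every model -- the "same conditions"
scores = {
    name: cross_val_score(make_pipeline(StandardScaler(), clf),
                          X, y, cv=5, scoring="accuracy").mean()
    for name, clf in classifiers.items()
}
```

Wrapping each classifier in the same pipeline and scoring it with the same cross-validation split removes the evaluation-protocol differences that make results from separate papers incomparable.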