MHCNLS-HAR: Multiheaded CNN-LSTM-Based Human Activity Recognition Leveraging a Novel Wearable Edge Device for Elderly Health Care
Neha Gaud; Maya Rathore; Ugrasen Suman
IEEE Sensors Journal, vol. 24, no. 21, pp. 35394-35405 (published 2024-09-05)
DOI: 10.1109/JSEN.2024.3450499
https://ieeexplore.ieee.org/document/10666991/
Impact Factor 4.3, JCR Q1 (Engineering, Electrical & Electronic)
Citations: 0
Abstract
Human activity recognition (HAR) systems recognize the activities performed by a person during daily routines, such as running, walking, and jogging, which are performed with the help of the lower extremities. This article proposes an HAR system designed for health monitoring, focusing on five gesture categories for data collection: walking, running, jumping, squatting, and other activities. Data collection was carried out using an Arduino Nano 33 Bluetooth Low Energy (BLE) Sense microcontroller, equipped with a 9-axis inertial measurement unit (IMU), at a sampling frequency of 110 Hz. For each gesture, 50 samples were collected from 30 subjects of various age groups (15-65) from the Indian subcontinent (Asian region). All gestures were manifested through the movement of the hip, knee, and ankle joints, capturing the spatial and temporal data of the person during each gesture. This research leverages the power of edge computing by deploying the deep learning model on the Arduino Nano microcontroller for gesture recognition. A multiheaded convolutional neural network (CNN) and long short-term memory (LSTM) (MHCNLS)-based deep learning model is proposed to classify the gestures. The model uses CNN heads to capture spatial dependencies and LSTM layers to capture sequential, time-series dependencies in the human activity data. The proposed MHCNLS model is evaluated on three benchmark datasets (WISDM, PAMAP2, and UCI-HAR) and our own HEAHL-HAR dataset. Its results are compared with those of various other hybrid deep learning models based on CNN, LSTM, GRU, and their combinations to classify the gestures and check the stability of the model. The results are evaluated using several performance indices: accuracy, precision, F1-score, recall, and sensitivity. The proposed MHCNLS model outperforms all existing state-of-the-art models reported in the literature, with an accuracy of 98.17%.
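The multiheaded CNN-LSTM idea described above can be sketched in Keras: several parallel 1-D convolutional heads extract spatial features from the same IMU window, and an LSTM models the temporal sequence of the merged features. This is a minimal illustrative sketch, not the authors' exact architecture; the window length (110 samples, matching the stated sampling rate), the kernel sizes per head, and the layer widths are all assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_mhcnls(window=110, channels=9, n_classes=5):
    """Hypothetical multiheaded CNN-LSTM sketch for IMU-based HAR.

    One input window of `window` time steps x `channels` IMU axes is fed
    to three CNN heads with different kernel sizes (an assumption), whose
    outputs are concatenated and passed to an LSTM for temporal modeling.
    """
    inp = layers.Input(shape=(window, channels))
    heads = []
    for kernel in (3, 5, 7):  # assumed kernel sizes, one per head
        x = layers.Conv1D(32, kernel, padding="same", activation="relu")(inp)
        x = layers.MaxPooling1D(2)(x)
        heads.append(x)
    merged = layers.concatenate(heads)          # fuse the spatial features
    x = layers.LSTM(64)(merged)                 # temporal dependencies
    x = layers.Dense(64, activation="relu")(x)
    out = layers.Dense(n_classes, activation="softmax")(x)
    return Model(inp, out)

model = build_mhcnls()
model.summary()
```

Each head sees the same raw window but at a different receptive-field size, so the concatenated features cover short and longer spatial patterns before the LSTM consumes them.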
To enable real-time functionality, the MHCNLS model was compressed using pruning and quantization and successfully deployed on an edge computing device with constrained power, data rate, and bandwidth. The model size was reduced by a factor of up to five while maintaining accuracy comparable to that of the uncompressed version. This approach has significant implications for healthcare, rehabilitation, sports, prosthetics, and augmented learning.
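The compression step above can be illustrated with TensorFlow Lite post-training quantization, which is one standard way to shrink a Keras model for a constrained microcontroller. This is an assumed workflow, not the authors' exact pipeline, and it is shown on a small stand-in CNN rather than the full MHCNLS model.

```python
import tensorflow as tf

# Stand-in model (hypothetical): a small CNN over 110 x 9 IMU windows.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(110, 9)),
    tf.keras.layers.Conv1D(64, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),
])

# Post-training dynamic-range quantization: weights are stored as int8
# in the flatbuffer, roughly a 4x reduction versus float32 weights.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()

float32_size = sum(w.size for w in model.get_weights()) * 4
print(f"float32 weights: {float32_size} B, quantized model: {len(tflite_bytes)} B")
```

Pruning (e.g., via the TensorFlow Model Optimization Toolkit) would be applied before conversion; combined with quantization, this is consistent with the roughly fivefold size reduction reported in the abstract.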
Journal description:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion, processing of wave e.g., electromagnetic and acoustic; and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data, detection, estimation and classification based on sensor data)
-Sensors in Industrial Practice