A Framework for Daily Living Activity Recognition using Fusion of Smartphone Inertial Sensors Data
Sheharyar Khan, S. M. A. Shah, Sadam Hussain Noorani, Aamir Arsalan, M. Ehatisham-ul-Haq, Aasim Raheel, Wakeel Ahmed
2023 4th International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), 17 March 2023. DOI: 10.1109/iCoMET57998.2023.10099271
Abstract
Recent years have seen rapid advances in human activity recognition using data from smart sensor devices. A wide variety of real-world applications exist across domains, particularly health and security. Smartphones are ubiquitous devices that let people perform a wide range of everyday tasks anytime, anywhere, and the sensors and networking capabilities of modern smartphones enable context awareness for many applications. This research focuses on recognizing human activities in the wild, for which the in-the-wild ExtraSensory dataset is selected. Six human activities, i.e., lying down, sitting, standing, running, walking, and bicycling, are considered. Time-domain features are extracted, and human activity recognition is performed using three machine learning classifiers, i.e., random forest, k-nearest neighbors, and decision tree. The proposed human activity recognition scheme achieves the highest classification accuracy of 89.98% with the random forest classifier, outperforming state-of-the-art human activity recognition schemes in the wild.
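To make the described pipeline concrete, the sketch below illustrates time-domain feature extraction from windowed inertial sensor data followed by random forest classification. It is a minimal illustration only: the window length, feature set, variable names, and synthetic data are assumptions for demonstration and are not taken from the paper or the ExtraSensory dataset.

```python
# Minimal sketch of the abstract's pipeline: time-domain features from
# windowed smartphone inertial data, classified with a random forest.
# Window size, feature choices, and the synthetic data are illustrative
# assumptions, not details reported in the paper.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def time_domain_features(window):
    """Compute simple time-domain statistics per sensor axis.

    window: array of shape (n_samples, n_axes), e.g. a few seconds of
    3-axis accelerometer readings.
    """
    feats = []
    for axis in window.T:
        feats.extend([
            axis.mean(), axis.std(), axis.min(), axis.max(),
            np.median(axis), skew(axis), kurtosis(axis),
        ])
    return np.array(feats)

# Placeholder data: 1000 windows of 128 samples from a 3-axis accelerometer,
# each labeled with one of six activities (synthetic, for illustration only).
rng = np.random.default_rng(0)
windows = rng.normal(size=(1000, 128, 3))
labels = rng.integers(0, 6, size=1000)

X = np.vstack([time_domain_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The same feature matrix could be passed to k-nearest neighbors or decision tree classifiers from scikit-learn for the comparison the abstract describes; only the classifier object changes.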