{"title":"用于识别人体姿势转换的可见光传感","authors":"Ziad Salem, A. Weiss","doi":"10.1109/CoBCom55489.2022.9880682","DOIUrl":null,"url":null,"abstract":"Human activity recognition and monitoring classify signals that are generated from various sensors based on the physical activities a person is performing during his/her daily life. This is useful if a human performs some postural transition activities such as sit-to-stand and stand-to-sit, which are hardly detected accurately by a single sensor. The aim of this study is to explore the possibilities of detecting daily postural transition activities through a novel wearable approach comprising of inertial measurement sensors (IMU) and visible light sensing (VLS) utilizing a single RGB photodiode in an unmodified lighting infrastructure. By employing a low-complex decision tree algorithm, the activity recognition can be achieved in a resourceful way. For enabling our approach to work precisely in changing environments, a K-means clustering algorithm is employed to adapt the parameters of both sit-to-stand and stand-to-sit transition detection. Our approach is validated with different scenarios; representing basic and daily life postural transition activities. The results showed that the approach was able to achieve the tasks accurately, which could not be the case if either IMU sensors or VLS is used alone.","PeriodicalId":131597,"journal":{"name":"2022 International Conference on Broadband Communications for Next Generation Networks and Multimedia Applications (CoBCom)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Visible Light Sensing for Recognising Human Postural Transitions\",\"authors\":\"Ziad Salem, A. Weiss\",\"doi\":\"10.1109/CoBCom55489.2022.9880682\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Human activity recognition and monitoring classify signals that are generated from various sensors based on the physical activities a person is performing during his/her daily life. This is useful if a human performs some postural transition activities such as sit-to-stand and stand-to-sit, which are hardly detected accurately by a single sensor. The aim of this study is to explore the possibilities of detecting daily postural transition activities through a novel wearable approach comprising of inertial measurement sensors (IMU) and visible light sensing (VLS) utilizing a single RGB photodiode in an unmodified lighting infrastructure. By employing a low-complex decision tree algorithm, the activity recognition can be achieved in a resourceful way. For enabling our approach to work precisely in changing environments, a K-means clustering algorithm is employed to adapt the parameters of both sit-to-stand and stand-to-sit transition detection. Our approach is validated with different scenarios; representing basic and daily life postural transition activities. 
The results showed that the approach was able to achieve the tasks accurately, which could not be the case if either IMU sensors or VLS is used alone.\",\"PeriodicalId\":131597,\"journal\":{\"name\":\"2022 International Conference on Broadband Communications for Next Generation Networks and Multimedia Applications (CoBCom)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Conference on Broadband Communications for Next Generation Networks and Multimedia Applications (CoBCom)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CoBCom55489.2022.9880682\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Broadband Communications for Next Generation Networks and Multimedia Applications (CoBCom)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CoBCom55489.2022.9880682","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Visible Light Sensing for Recognising Human Postural Transitions
Human activity recognition and monitoring classify signals generated by various sensors according to the physical activities a person performs in daily life. This is particularly useful for postural transitions such as sit-to-stand and stand-to-sit, which are difficult to detect accurately with a single sensor. The aim of this study is to explore the possibility of detecting daily postural transitions through a novel wearable approach combining inertial measurement unit (IMU) sensors with visible light sensing (VLS) that uses a single RGB photodiode in an unmodified lighting infrastructure. By employing a low-complexity decision tree algorithm, activity recognition can be achieved in a resource-efficient way. To enable our approach to work precisely in changing environments, a K-means clustering algorithm is employed to adapt the parameters of both sit-to-stand and stand-to-sit transition detection. Our approach is validated in different scenarios representing basic and daily-life postural transitions. The results showed that the approach performed these tasks accurately, which would not be the case if either the IMU sensors or VLS were used alone.
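The abstract gives no implementation details, but the minimal Python sketch below illustrates one way such a pipeline could be assembled: a shallow (low-complexity) decision tree classifies windowed IMU and light features, while K-means clustering over recent light readings adapts a detection threshold to the current lighting environment. The feature names, window contents, threshold-crossing fusion rule, and the use of scikit-learn are assumptions made for illustration; this is not the authors' implementation.

```python
# Illustrative sketch only: decision-tree recognition of postural transitions
# with a K-means-adapted light threshold. All data here is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic training windows: each row holds hypothetical features such as
# [mean light level, light-level change, vertical-acceleration range, trunk-tilt change].
X_train = rng.normal(size=(200, 4))
# Synthetic labels: 0 = no transition, 1 = sit-to-stand, 2 = stand-to-sit.
y_train = rng.integers(0, 3, size=200)

# Low-complexity decision tree (the depth limit is an assumption).
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

# K-means over recent light readings to adapt the detection threshold to the
# current, unmodified lighting environment (two clusters: "sitting" vs "standing" light levels).
recent_light = rng.normal(loc=[0.2, 0.8], scale=0.05, size=(50, 2)).reshape(-1, 1)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(recent_light)
low, high = sorted(c[0] for c in km.cluster_centers_)
adaptive_threshold = (low + high) / 2.0  # midpoint between the two light clusters


def classify_window(features, light_before, light_after):
    """Fuse the tree prediction with the adapted light threshold (hypothetical rule)."""
    label = int(clf.predict(features.reshape(1, -1))[0])
    crossed = (light_before - adaptive_threshold) * (light_after - adaptive_threshold) < 0
    # Accept a transition only if the light reading also crossed the adapted
    # threshold during the window; otherwise report "no transition".
    return label if label == 0 or crossed else 0


print(classify_window(rng.normal(size=4), light_before=0.25, light_after=0.75))
```

In the paper the K-means adaptation is applied to the transition-detection parameters themselves; here a single light threshold stands in for those parameters to keep the sketch short.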