Detecting Process Transitions from Wearable Sensors: An Unsupervised Labeling Approach

S. Böttcher, P. Scholl, Kristof Van Laerhoven
Proceedings of the 4th International Workshop on Sensor-based Activity Recognition and Interaction, 21 September 2017
DOI: 10.1145/3134230.3134233
Authoring protocols for manual tasks such as following recipes, manufacturing processes, or laboratory experiments requires significant effort. This paper presents a system that estimates individual procedure transitions from the user's physical movements and gestures, recorded with inertial motion sensors. Combined with egocentric or external video recordings, this facilitates efficient review and annotation of video databases. We investigate different clustering algorithms on wearable inertial sensor data recorded in parallel with video data, in order to automatically create transition marks between task steps. The goal is to match these marks to the transitions given in a description of the workflow, thus creating navigation cues for browsing video repositories of manual work. To evaluate the performance of the unsupervised clustering algorithms, the automatically generated marks are compared to labels created by human experts on publicly available datasets. Additionally, we tested the approach on a new dataset recorded in a manufacturing lab environment, describing an existing sequential manufacturing process.
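The pipeline sketched in the abstract (window the inertial signal, cluster the windows without labels, and treat cluster-assignment changes as candidate transition marks) can be illustrated roughly as follows. This is not the authors' implementation: the windowing scheme, the mean-absolute-deviation feature, and the minimal 1-D k-means below are illustrative stand-ins for the clustering algorithms the paper actually compares, and the signal is synthetic.

```python
# Illustrative sketch (not the paper's implementation): cluster windowed
# features of an inertial signal and report cluster-label changes as
# candidate transition marks between task steps.
import numpy as np

def window_features(signal, win=50):
    """Per-window mean absolute deviation; a simple movement-intensity feature."""
    n = len(signal) // win
    wins = signal[: n * win].reshape(n, win)
    return np.abs(wins - wins.mean(axis=1, keepdims=True)).mean(axis=1)

def kmeans_1d(x, k=2, iters=50):
    """Minimal 1-D k-means with deterministic min/max initialization;
    a stand-in for the unsupervised clustering algorithms compared in the paper."""
    centers = np.linspace(x.min(), x.max(), k)
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):  # keep old center if a cluster empties out
                centers[j] = x[labels == j].mean()
    return labels

def transition_marks(labels):
    """Window indices where the cluster assignment changes, i.e. candidate step boundaries."""
    return np.flatnonzero(np.diff(labels)) + 1

# Synthetic example: two calm "task steps" separated by a high-movement gesture.
rng = np.random.default_rng(1)
sig = np.concatenate([
    rng.normal(0, 0.1, 500),   # step 1: low movement
    rng.normal(0, 1.0, 200),   # transition gesture: high movement
    rng.normal(0, 0.1, 500),   # step 2: low movement
])
feats = window_features(sig, win=50)
labels = kmeans_1d(feats)
marks = transition_marks(labels)   # window indices flagged as transition boundaries
```

In the paper's setting, such marks would then be aligned with the transitions listed in a textual workflow description to produce navigation cues into the parallel video recording.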