Activity Segmentation and Identification based on Eye Gaze Features
S. Amrouche, Benedikt Gollan, A. Ferscha, Josef Heftberger
Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, 2018-06-26. DOI: 10.1145/3197768.3197775
In coherence with the ongoing digitalization of production processes, Human-Computer Interaction (HCI) technologies have evolved rapidly in industrial applications, providing an abundance of versatile tracking and monitoring devices suitable for addressing complex challenges. This paper focuses on activity segmentation and activity identification, one of the most crucial challenges in pervasive computing, using only visual attention features captured through mobile eye-tracking sensors. We propose a novel, application-independent approach to segmenting task executions in a semi-manual industrial assembly setup, exploiting the expressive properties of the distribution-based gaze feature Nearest Neighbor Index (NNI) to build a dynamic activity segmentation algorithm. The proposed approach is complemented by a machine learning validation model acting as a feedback loop to classify segment quality. The approach is evaluated in an alpine ski assembly scenario with real-world data, reaching an overall detection accuracy of 91%.
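For context, the Nearest Neighbor Index referenced in the abstract is commonly defined as the Clark-Evans ratio: the mean observed nearest-neighbor distance among points, divided by the distance expected under a uniform random distribution over the same area. Below is a minimal sketch of how such an NNI could be computed over a window of gaze samples; the window size, screen dimensions, and use of NumPy are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Clark-Evans Nearest Neighbor Index for a set of 2-D gaze points.

    NNI = mean observed nearest-neighbor distance / expected distance
    under complete spatial randomness, where the expected distance is
    0.5 * sqrt(area / n). Values well below 1 indicate clustered gaze
    (focused attention); values near or above 1 indicate dispersed gaze.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    if n < 2:
        raise ValueError("need at least two gaze points")
    # Pairwise distances; mask the zero self-distances on the diagonal.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    mean_observed = d.min(axis=1).mean()
    expected = 0.5 * np.sqrt(area / n)
    return mean_observed / expected

# Hypothetical usage: NNI over a sliding window of 120 gaze samples on a
# 1920x1080 display (both values are illustrative choices).
window = np.random.rand(120, 2) * [1920, 1080]
nni = nearest_neighbor_index(window, area=1920 * 1080)
print(f"NNI for this window: {nni:.3f}")
```

A segmentation scheme along the lines described in the abstract would presumably track this value over consecutive windows and place activity boundaries where it shifts between clustered and dispersed regimes, but the specific thresholding and feedback-loop details are those of the paper, not of this sketch.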