{"title":"基于对话的日常生活活动主动学习方法","authors":"Ronnie Smith, M. Dragone","doi":"10.1145/3616017","DOIUrl":null,"url":null,"abstract":"While Human Activity Recognition systems may benefit from Active Learning by allowing users to self-annotate their Activities of Daily Living (ADLs), many proposed methods for collecting such annotations are for short-term data collection campaigns for specific datasets. We present a reusable dialogue-based approach to user interaction for active learning in activity recognition systems, which utilises semantic similarity measures and a dataset of natural language descriptions of common activities (which we make publicly available). Our approach involves system-initiated dialogue, including follow-up questions to reduce ambiguity in user responses where appropriate. We apply this approach to two active learning scenarios: (i) using an existing CASAS dataset, demonstrating long-term usage; and (ii) using an online activity recognition system, which tackles the issue of online segmentation and labelling. We demonstrate our work in context, in which a natural language interface provides knowledge that can help interpret other multi-modal sensor data. We provide results highlighting the potential of our dialogue- and semantic similarity-based approach. We evaluate our work: (i) quantitatively, as an efficient way to seek users’ input for active learning of ADLs; and (ii) qualitatively, through a user study in which users were asked to compare our approach and an established method. Results show the potential of our approach as a hands-free interface for annotation of sensor data as part of an active learning system. We provide insights into the challenges of active learning for activity recognition under real-world conditions and identify potential ways to address them.","PeriodicalId":3,"journal":{"name":"ACS Applied Electronic Materials","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2023-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Generalisable Dialogue-based Approach for Active Learning of Activities of Daily Living\",\"authors\":\"Ronnie Smith, M. Dragone\",\"doi\":\"10.1145/3616017\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"While Human Activity Recognition systems may benefit from Active Learning by allowing users to self-annotate their Activities of Daily Living (ADLs), many proposed methods for collecting such annotations are for short-term data collection campaigns for specific datasets. We present a reusable dialogue-based approach to user interaction for active learning in activity recognition systems, which utilises semantic similarity measures and a dataset of natural language descriptions of common activities (which we make publicly available). Our approach involves system-initiated dialogue, including follow-up questions to reduce ambiguity in user responses where appropriate. We apply this approach to two active learning scenarios: (i) using an existing CASAS dataset, demonstrating long-term usage; and (ii) using an online activity recognition system, which tackles the issue of online segmentation and labelling. We demonstrate our work in context, in which a natural language interface provides knowledge that can help interpret other multi-modal sensor data. We provide results highlighting the potential of our dialogue- and semantic similarity-based approach. 
We evaluate our work: (i) quantitatively, as an efficient way to seek users’ input for active learning of ADLs; and (ii) qualitatively, through a user study in which users were asked to compare our approach and an established method. Results show the potential of our approach as a hands-free interface for annotation of sensor data as part of an active learning system. We provide insights into the challenges of active learning for activity recognition under real-world conditions and identify potential ways to address them.\",\"PeriodicalId\":3,\"journal\":{\"name\":\"ACS Applied Electronic Materials\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2023-08-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Electronic Materials\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1145/3616017\",\"RegionNum\":3,\"RegionCategory\":\"材料科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Electronic Materials","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/3616017","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Generalisable Dialogue-based Approach for Active Learning of Activities of Daily Living
While Human Activity Recognition systems may benefit from Active Learning by allowing users to self-annotate their Activities of Daily Living (ADLs), many proposed methods for collecting such annotations are for short-term data collection campaigns for specific datasets. We present a reusable dialogue-based approach to user interaction for active learning in activity recognition systems, which utilises semantic similarity measures and a dataset of natural language descriptions of common activities (which we make publicly available). Our approach involves system-initiated dialogue, including follow-up questions to reduce ambiguity in user responses where appropriate. We apply this approach to two active learning scenarios: (i) using an existing CASAS dataset, demonstrating long-term usage; and (ii) using an online activity recognition system, which tackles the issue of online segmentation and labelling. We demonstrate our work in context, in which a natural language interface provides knowledge that can help interpret other multi-modal sensor data. We provide results highlighting the potential of our dialogue- and semantic similarity-based approach. We evaluate our work: (i) quantitatively, as an efficient way to seek users’ input for active learning of ADLs; and (ii) qualitatively, through a user study in which users were asked to compare our approach and an established method. Results show the potential of our approach as a hands-free interface for annotation of sensor data as part of an active learning system. We provide insights into the challenges of active learning for activity recognition under real-world conditions and identify potential ways to address them.
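The abstract describes matching users' natural-language responses against a dataset of activity descriptions using semantic similarity, with follow-up questions when a response is ambiguous. The sketch below illustrates one way such matching could work; it is not the authors' implementation. The embedding model, the example activity descriptions, the AMBIGUITY_MARGIN threshold, and the annotate helper are assumptions made purely for illustration.

```python
# Illustrative sketch (not the paper's implementation): map a free-text user
# response to a known ADL label via sentence-embedding similarity, and flag
# the response as ambiguous when the top two candidates score too closely.
from sentence_transformers import SentenceTransformer, util

# Hypothetical canonical descriptions of common activities; the paper's
# released dataset of natural-language activity descriptions would be used here.
ACTIVITY_DESCRIPTIONS = {
    "Cook": "preparing a meal in the kitchen, using the stove or oven",
    "Eat": "eating a meal or a snack at the table",
    "Sleep": "sleeping or resting in bed",
    "Wash_Dishes": "washing dishes at the kitchen sink",
    "Watch_TV": "sitting in the living room watching television",
}

AMBIGUITY_MARGIN = 0.10  # assumed threshold for triggering a follow-up question

model = SentenceTransformer("all-MiniLM-L6-v2")
label_names = list(ACTIVITY_DESCRIPTIONS)
label_embeddings = model.encode(
    [ACTIVITY_DESCRIPTIONS[name] for name in label_names], convert_to_tensor=True
)


def annotate(user_response: str):
    """Return (best_label, needs_follow_up) for a user's free-text answer."""
    query = model.encode(user_response, convert_to_tensor=True)
    scores = util.cos_sim(query, label_embeddings)[0]
    ranked = scores.argsort(descending=True)
    best, second = ranked[0].item(), ranked[1].item()
    # If the best and second-best candidates score similarly, the response is
    # ambiguous and the dialogue manager should ask a clarifying question.
    ambiguous = (scores[best] - scores[second]).item() < AMBIGUITY_MARGIN
    return label_names[best], ambiguous


if __name__ == "__main__":
    label, ask_follow_up = annotate("I was just making some pasta for dinner")
    print(label, "follow-up needed:", ask_follow_up)
```

In a complete system of the kind the abstract outlines, the returned label and ambiguity flag would feed the system-initiated dialogue: the annotation is either committed to the corresponding sensor-data segment or deferred until a follow-up question resolves the ambiguity.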