Pervasive Self-Learning with Multi-modal Distributed Sensors
N. Bicocchi, M. Mamei, A. Prati, R. Cucchiara, F. Zambonelli
2008 Second IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops, 20 October 2008. DOI: 10.1109/SASOW.2008.51
Abstract: Truly ubiquitous computing poses new and significant challenges. One of the key aspects that will condition the impact of these new technologies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will make devices able to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. The framework allows different sensors to exchange the knowledge needed to create a model for classifying events. In particular, as a case study, we developed a multi-modal multi-sensor classification system that combines data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer learns a model of user behavior by exploiting the information coming from the camera, and later uses it to classify the user's motion autonomously. Experiments demonstrate the accuracy of the proposed approach in different situations.
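The training scheme the abstract describes — a vision node acting as a teacher that labels synchronized time windows so the accelerometer node can train its own classifier and then operate without the camera — can be illustrated with a minimal sketch. Everything below (the feature choice, the nearest-centroid classifier, the class names) is an illustrative assumption, not the paper's actual implementation.

```python
# Minimal sketch of the cross-modal teacher-student idea: a camera (teacher)
# labels synchronized time windows; the body-worn accelerometer (student)
# trains on those labels, then classifies on its own. Features, classifier,
# and class names are assumptions for illustration only.
import numpy as np

def accel_features(window: np.ndarray) -> np.ndarray:
    """Toy features for one window of 3-axis accelerometer samples (N, 3):
    per-axis mean and standard deviation."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class NearestCentroid:
    """Stand-in for whatever classifier the accelerometer node would run."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        y = np.array(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self
    def predict(self, X):
        # Distance from each sample to each class centroid; pick the nearest.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

# --- Training phase: the camera supplies labels for synchronized windows ---
rng = np.random.default_rng(0)
windows = [rng.normal(0.0, 0.1, (50, 3)) for _ in range(20)] \
        + [rng.normal(0.0, 1.0, (50, 3)) for _ in range(20)]
camera_labels = ["standing"] * 20 + ["walking"] * 20  # would come from vision

X = np.array([accel_features(w) for w in windows])
student = NearestCentroid().fit(X, camera_labels)

# --- Autonomous phase: the camera is no longer available ---
new_window = rng.normal(0.0, 1.0, (50, 3))
print(student.predict(np.array([accel_features(new_window)])))  # expected: ['walking']
```

The point of the two-phase split is that once fit() has run, the student no longer depends on the camera: classification uses accelerometer features alone, matching the autonomous operation described in the abstract.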